Saturday, April 28, 2012

MIT's Glare-Free Glass Doesn't Fog Up, Cleans Itself



A piece of glass with one half a normal flat surface and the other half the new patterned surface demonstrates the anti-fogging properties that the patterning provides. [Photo: Kyoo-Chul Park and Hyungryul Choi]

MIT scientists have found a new way of surfacing glass that virtually eliminates its reflective properties--and that could mean glare-free screens!
Beyond saving the eyes of smartphone and laptop users in the sun, the glass could have a wide range of applications. Its glare-free properties could allow solar panels to absorb more of the sun’s rays that would otherwise have been reflected. This “multifunctional” glass also resists fogging, repels water, and can even clean itself.
Imagine car windows that did not fog up in the winter, or ones that could whisk away rain and dirt all on their own. The scientists imagine that their glass modifications could be applied to all sorts of optical lenses for eyeglasses and cameras, to televisions and smaller screens, and to windows for buildings.


 http://www.pcworld.com/article/254661/mits_glarefree_glass_doesnt_fog_up_cleans_itself.html

Friday, April 27, 2012

GE, Arista to backup solar, wind with battery system

By | April 25, 2012, 9:44 AM PDT
Photo: Wind turbine at Ascension Auxiliary Airfield in the south Atlantic Ocean by Flickr user Lance Cheung, CC 2.0
General Electric has partnered with Arista Power to make and sell systems that can store electricity from on-site solar and wind sources as well as from the grid and release it when demand peaks to help cut power bills.
GE’s Durathon nickel-salt battery, the product of a $100 million investment by the company’s transportation division, will be used in Arista’s Power on Demand system. Arista designed a system that can store energy generated by wind turbines, solar photovoltaic panels and the electric grid. Real-time monitoring technology is used to track and help smooth out power demand on the grid. When demand for energy spikes, the system releases the stored power to reduce “peak demand” charges, prolong battery life and ultimately lower commercial electricity costs. Peak demand charges can account for 30 to 70 percent of a commercial electric bill.
In short, the system combines software and battery tech to give customers more control over their power. The systems aren’t meant for the average homeowner. Instead, they’re designed for large institutions like college campuses and hospitals that have solar or wind onsite and want to get the most out of their renewable energy source, not to mention cut their power bills.
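To make the “peak demand” arithmetic concrete, here is a minimal sketch of peak shaving in Python. All of the numbers and the `monthly_savings` helper are made up for illustration; they are not part of Arista’s product.

```python
# Sketch of "peak shaving": a battery discharges during the monthly
# demand peak, lowering the demand charge billed on the highest draw.
# All numbers below are hypothetical.

def monthly_savings(peak_kw, shaved_kw, demand_charge_per_kw):
    """Dollars saved by cutting the billed monthly peak by shaved_kw."""
    billed_before = peak_kw * demand_charge_per_kw
    billed_after = (peak_kw - shaved_kw) * demand_charge_per_kw
    return billed_before - billed_after

# Hypothetical campus: 2 MW peak, battery shaves 300 kW, $15/kW charge
print(monthly_savings(2000, 300, 15.0))  # 4500.0 dollars per month
```

Savings scale with the demand charge rate, which is why a bill that is 30 to 70 percent demand charges makes this kind of system attractive.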
About the battery
The Durathon is a sodium-metal halide battery made for the telecommunications, uninterruptible power supply and utilities markets. It was designed to store a lot of energy in a small space. GE says the batteries are 50 percent smaller and 25 percent lighter than “traditional lead acid batteries.” The batteries last up to 20 years, operate effectively in extreme temperatures, require no cooling and are recyclable, the company says. The batteries will be made in the company’s new plant in Schenectady, New York.
The no-cooling feature could be the big selling point for facilities. A lot of today’s energy storage uses sodium-sulfur batteries, which can be hard to maintain because they run at high temperatures. As Greentech Media notes, that might be fine for a remote wind farm, but not so much for a data center.

http://www.smartplanet.com/blog/intelligent-energy/ge-arista-to-backup-solar-wind-with-battery-system/15322


Why baseload power is doomed

By | March 28, 2012, 5:00 AM PDT
Photo: Feet of Chinese woman, bound, compared with tea cup and American woman’s shoe, World War 1 era. (otisarchives/Flickr)
A persistent myth about the challenges of integrating renewable power into the grid is that because solar and wind are intermittent, grid operators need to maintain full generation capacity from “baseload” plants powered by coal and nuclear. Recent real-world data and research shows that not only is this not true, but that baseload capacity is fundamentally incompatible with renewables, and that as renewables provide a greater portion of the grid’s power, baseload generation will need to be phased out.
But before we get into the details, some background information is in order.

Types of power plants

“Baseload” power generators are typically large units that operate more or less continuously at 70 to 90 percent of their rated capacity, and do not shut down except for maintenance. These include nuclear, coal, and combined-cycle natural gas plants which capture and recycle the exhaust heat of traditional gas turbines. Coal and nuclear plants can take from one to three days to start up, and take a long time to shut down.
“Load-following” power generators can increase or reduce their output based upon demand, and typically run at 30 to 50 percent of capacity. They are typically traditional gas turbine units, and may be shut down on a daily or weekly basis as needed. Older coal plants, combined-cycle natural gas plants, and some nuclear plants can operate in a load-following mode, but their ability to do so is limited. For example, newer nuclear plants can cut output by as much as 20 percent in an hour, but need as much as eight hours to ramp back up to full capacity.
“Peaking units” typically run for a few hours at a time at low capacity factors when demand reaches unusually high peak levels, like in the middle of a hot summer day. These units are typically simple gas turbines.
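The asymmetry in those ramp rates is worth a toy illustration. The 20-percent and eight-hour figures come from the text above; the linear ramp model here is a simplification, not how an actual plant is dispatched.

```python
# Toy model of baseload ramping asymmetry: a newer nuclear unit can
# shed ~20% of output in an hour but needs ~8 hours to climb back.
# Linear ramp rates are assumed purely for illustration.

RAMP_DOWN_PER_HOUR = 0.20        # fraction of rated capacity, downward
RAMP_UP_PER_HOUR = 0.20 / 8.0    # ~2.5%/hour, i.e. 8 hours to regain 20%

def hours_to_recover(drop_fraction):
    """Hours to ramp back up after shedding drop_fraction of output."""
    return drop_fraction / RAMP_UP_PER_HOUR

print(hours_to_recover(0.20))  # 8.0
```

That eight-to-one gap between how fast a unit can back off and how fast it can return is exactly why such plants make poor partners for a wind front that passes in a few hours.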

The grid today

In the U.S., there are three main grids: one in the east, one in the west, and one in Texas. Some utilities are regulated while others are not, some are publicly owned while others are private, and although they are interconnected within the three main grids, they operate with a certain amount of autonomy. Grid power comes from about 5,800 utility-scale power plants, comprising some 18,000 generating units. A patchwork of agencies with overlapping jurisdictions regulates the grid, including the Federal Energy Regulatory Commission (FERC) and the North American Electric Reliability Corporation (NERC) at the federal level, a range of Regional Transmission Operators (RTOs) and Independent System Operators (ISOs) at the regional level, and Public Utility Commissions (PUCs) at the state level. Ten major RTOs and ISOs serve about two-thirds of consumers in the U.S. and more than half in Canada, with the remainder served by smaller regional operators.

The grid’s architecture developed in a fairly ad-hoc way. As the country was built up, more generation capacity was added, and the grid was extended. Technologically speaking, most of the grid is old and “dumb”: Power gets generated somewhere, and transmitted somewhere else, but there is very little in the way of sensors, storage buffers, switches, or security mechanisms along the way. It’s more like plumbing than an iPhone. This is why it was possible for one overloaded transmission line in Ohio to take down much of the grid in Ontario, the Northeast and the Midwest in the blackout of August 14, 2003.
Grid operators have one overriding, fearsome task: They must maintain enough supply from this very complex system, within a narrow range of frequencies and voltages, to meet constantly fluctuating demand at all times. Therefore they tend to be risk-averse, preferring to stick with what they know to be reliable, and avoiding innovation.

Enter renewables

Before the advent of renewables, generating power was a pretty straightforward task: When demand increased, you just added more fuel to an engine. With renewables, the task is reversed: The engines (wind turbines and solar collectors) ramp up and down of their own accord, and grid operators must adjust to accommodate their output.
The growth of renewables in the U.S. has been driven primarily by state Renewable Portfolio Standards (RPS) requiring a certain percentage of power to be generated from renewables by a certain date. According to an April 2011 MIT report just released this month, 29 states have RPS mandates which typically require 15 to 25 percent renewables by 2015 to 2025. Many of these states mandate that grid operators give the renewably-generated power priority, so when wind generation spikes, for example, they must ramp down other generating units. In other areas of the U.S. and in parts of Europe, operators may instead curtail peak production from renewables to accommodate their baseload generation—for example, forcing a wind farm operator to furl their blades or apply brakes to their turbines.

The baseload fallacy

The notion that renewables cannot provide baseload power is really an artifact of the way the grid and its regulators have evolved. If all generators were able to ramp up and down on demand, and if grid operators were able to predict reliably when and where the sun would be shining and the wind would be blowing, accommodating any amount of power from renewables would be no problem.
A 2010 study called “The Base Load Fallacy” by Australian researcher Dr. Mark Diesendorf, an expert on integrating wind into power grids, fingers the “operational inflexibility of base-load power stations” as the main obstacle to further integration of renewables. “The renewable electricity system could be just as reliable as the dirty, fossil-fuelled system that it replaces,” he observes, if demand were more efficient and intelligent, and supply were made up of a wide variety of renewable sources plus a small amount of gas-fired capacity to cover the peaks. The perpetrators of the baseload fallacy, he argues, are mainly the industries who benefit from the status quo: coal, oil and gas companies, the nuclear industry, power generators, and industries who depend on them like aluminum and cement manufacturers.
Claims that renewables could never generate more than a few percent of grid power without taking down the grid have been given the lie by the real-world experience of areas that deliberately adapted their grids.
The best example in the U.S. is Texas. By virtue of having its own grid (technically, an “interconnection”), it is generally outside the purview of federal regulation by FERC. The entire grid is operated by a single ISO, ERCOT, so it has a lot of control over its generation mix and grid planning. Texas decided long ago to pursue its wind potential vigorously, and now has the largest installed wind capacity in the States at over 10 gigawatts (GW).
On March 7, ERCOT used a record 7,599 MW of wind power, constituting 22 percent of the load and representing over 77 percent of its nameplate wind capacity. The previous day it had met 24 percent of the load with wind. Baseload proponents had said that such levels of integration were flatly impossible. But ERCOT had made it possible with the help of a new modeling tool that analyzes real-time conditions every half-hour, giving grid technicians greater ability to match generation with demand and control transmission more discretely. The National Renewable Energy Laboratory has found that if other grid operators adopted similar tools, over one third of U.S. power could be generated from renewables.
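Those percentages imply the underlying system numbers, which makes for a quick sanity check. This back-of-envelope sketch uses only the figures quoted above:

```python
# Back-of-envelope check on the ERCOT record quoted above.
wind_mw = 7599          # record wind output, MW
share_of_load = 0.22    # wind met 22 percent of load
capacity_factor = 0.77  # 77 percent of nameplate wind capacity

load_mw = wind_mw / share_of_load         # implied total system load
nameplate_mw = wind_mw / capacity_factor  # implied installed wind online

print(round(load_mw))       # 34541 MW of demand at the time
print(round(nameplate_mw))  # 9869 MW of nameplate wind, consistent
                            # with "over 10 GW" installed statewide
```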
All that ERCOT needed to accommodate more wind power was some sensors, a better flow of information, and better modeling tools. As the MIT report notes, the hardware to provide better grid information already exists, but few operators have employed it in their control and dispatch operations. The obstacle is not technology, but “the industry’s culture of resistance to new and experimental projects.”
That’s not a problem for China, however. The MIT report mentions that China is piloting a program that will allow it to monitor the national grid in real time and control it automatically. The system eventually could allow China’s grid to take up a far greater percentage of renewably-generated power than the antiquated U.S. grid can, although China is still the world’s top consumer of coal for power generation.
Another 2010 study, by the German Renewable Energies Agency, turned conventional baseload logic on its head, finding that because of their inflexibility in adjusting to changing demand, “nuclear power plants are incompatible with renewable energies.” To meet forecasted wind production in Germany, conventional baseload operation would have to be cut in half by 2020, assuming renewable generation continues to enjoy priority dispatch. As renewables gradually replace conventional baseload capacity, only more flexible gas generators that can operate at under 50 percent of their capacity will still have a role to play.

The European example

Europe serves as another model of why good grid planning and management are key to integrating renewables into the grid. If baseload proponents were correct, then we would expect the countries with the highest levels of renewable penetration to have the most trouble in managing their grids, but the reality is quite the opposite.
A comprehensive new report on renewables integration by European consultancy eclareon GmbH surveyed the policies and grid functions of the 27 member states of the European Union, and found that “large quantities [of renewable generation] can be effectively managed on the grid.” Countries that planned for adequate grid capacity generally didn’t have a problem with accommodating renewables, and unsurprisingly, those are the same countries that have pushed for more renewable generation.

Solar and wind generation as a percentage of electricity consumption in 27 European Union countries in 2010 (first bar) and 2020 (second bar). Grid integration designated by color: green = positive, yellow = neutral, red = negative. Source: RES Integration Final Report, eclareon GmbH.
Countries where the share of renewable power is greatest—Germany, Denmark, Spain, Ireland, and Portugal—offer “positive conditions for grid operations,” although some barriers to integration were identified, including the potential for curtailment in Germany, challenges to priority dispatching in Ireland, and strict distribution parameters in Portugal. Identified barriers for grid development in those countries revolve around public policy issues, permitting, regulatory regimes, cost distribution, and the obligation (or lack thereof) of grid operators to beef up their grids to accommodate more renewable power.

Ripe for innovation

The real issues around the integration of renewables into the grid have to do with human arrangements, not technology. As the MIT report concluded, “There is a clear need for a statement on national goals for the electricity sector to streamline the US regulatory structure, which currently is complex and fragmented.” We need smart policy, and an intelligent approach to planning the grid of the future that is not simply beholden to the vested interests of the status quo.
This will run directly at odds with the free-market ideologies that have brought us this far. As the EU project THINK observed, “the main shortcomings of the conventional regulatory framework are that grid companies have disincentives to innovate.” A firm regulatory hand, like that in the most renewably-powered countries of Europe, will be necessary to integrate more power from solar and wind onto the grid.
Renewables should be able to meet at least 20 percent of electricity demand without disrupting the grid just about anywhere in the world with good grid planning and management. As geothermal and marine power technologies mature, they will become a much less intermittent, natural substitute for the baseload technologies of the past. A host of other technologies will even out the bumps in renewable generation by adding storage (batteries for distributed storage, and pumped hydro and solar thermal for utility scale); increasing the connections between grids (allowing better transmission between sunny and cloudy, or windy and still areas); and transitioning to on-demand natural gas-fired peaking generators. Over the next decade, the current assumptions about the need for traditional baseload capacity will begin to fade as new storage, interconnection, and smart grid management strategies come into play, and ultimately, a combination of these technologies might raise the limit on renewables to 100 percent.
The attachment to our antiquated architecture of power generation and grid management is simply a failure of imagination and innovation. Those who benefit from its arrangement today hold it up in too-precious reverence, not unlike those who, one hundred years ago, protested the banning of the ancient Chinese practice of foot-binding depicted in the photo at top. It may be beautiful to them, but to those with modern sensibilities, it’s an ugly, even grotesque fetish that should be consigned to the dust bin of history, and one that one hundred years from now will seem unbearably dumb, quaint, and cruel. The problem is not that the feet are too big; it’s that the shoes are too small.

Thursday, April 26, 2012

Researchers claim quantum computer breakthrough

Updated April 26, 2012 18:21:14

Australian and international researchers say they have designed a tiny crystal able to run a quantum computer so powerful it would take a computer the size of the known universe to match it.
Details of the ion crystal, which is made up of just 300 atoms, are published in the journal Nature today by a team from Australia, South Africa and the United States.

"We've surpassed the computational potential of this system relative to classical computers by something like 10 to the [power of] 80, which is 80 orders of magnitude, a really enormous number," the University of Sydney's Dr Michael Biercuk told AM.
"Quantum computing is a kind of information science that is based on the notion that if one performs computations in a fundamentally different way than the way your classical desktop computer works, there's a huge potential to solve a variety of problems that are very, very hard or near impossible for standard computers," he explained.
"If you wanted to think how big a classical computer would need to be in order to solve this problem of roughly 300 interacting quantum particles, it turns out that that computer would need to be the size of the known universe - which is clearly something that's not possible to achieve."

What is quantum computing?

  • Normal computers use data encoded in binary digits (bits)
  • They work by manipulating bits that can exist in only one of two states - 0 or 1 - at any given time
  • Quantum computers instead use the properties of atoms and molecules to perform calculations
  • Quantum computers encode information as 'quantum bits', or 'qubits'
  • Qubits can exist in superposition - they can be both 0 and 1, and all points in between - at the same time.
  • Physicists believe this superposition will allow quantum computers to work on a million calculations at once, while a normal computer can only handle one
  • That gives quantum computers the potential to be millions of times more powerful than conventional machines

(Source: http://computer.howstuffworks.com)
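A rough sketch of where numbers like Dr Biercuk's come from: a full classical description of n entangled two-level particles requires on the order of 2^n complex amplitudes. This is a standard back-of-envelope bound, not the paper's exact calculation, but for n = 300 it yields a number with roughly 90 digits, the same ballpark as the quoted figure.

```python
# Why ~300 entangled particles overwhelm classical simulation: a full
# classical description of n two-level quantum systems needs on the
# order of 2**n complex amplitudes (a standard rough bound).
import math

n = 300
digits = math.log10(2 ** n)  # decimal order of magnitude of 2**300
print(digits)  # about 90.3, i.e. a number with ~90 digits
```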


However, he says there is still plenty of work to do.
"The central element is something like a millimetre in diameter, 300 atoms that are suspended in space," he said.
"But of course everything depends on a huge amount of technical infrastructure around it.
"So there are vacuum chambers and pumps and lasers, and all of that takes up something like a room."
The quantum computer will move to a stage where it is so far ahead, performing such complex tasks, that it will be difficult to check whether it is working accurately.
"They're not easily checked by a classical computer which opens a whole variety of problems," Dr Biercuk said.

http://www.abc.net.au/news/2012-04-26/super-computer/3972832

Wednesday, April 25, 2012

Google bosses back plans to mine asteroids

  Top Google executives and film director James Cameron have backed a plan to mine asteroids for precious minerals and water.
Planetary Resources, the start-up whose backers include Google executives Larry Page and Eric Schmidt, used a space museum in Seattle to launch a bold plan to prospect on resource-rich chunks of rock in space not far from Earth.
"The promise of Planetary Resources is to apply commercial innovation to space exploration," former NASA astronaut Tom Jones, an adviser to the start-up, said.
"They are developing cost-effective, production-line spacecraft that will visit near-Earth asteroids in rapid succession, increasing our scientific knowledge of these bodies and enabling the economic development of the resources they contain."
Planetary Resources said it was "poised to initiate" space mining missions in what it predicted would become a multi-billion-dollar industry.
A single 500-metre platinum-rich asteroid contains the equivalent of all the platinum group metals mined in history, according to the start-up.
"Many of the scarce metals and minerals on Earth are in near-infinite quantities in space," Planetary Resources co-founder Peter Diamandis said.
"As access to these materials increases, not only will the cost of everything from microelectronics to energy storage be reduced, but new applications for these abundant elements will result in important and novel applications."
Photo: Space mining missions are tipped to become a multi-billion-dollar industry. (AFP)

Water-rich near-Earth asteroids (NEAs) could be springboards for deep space exploration by serving as fuelling and supply depots.
"Water is perhaps the most valuable resource in space," Planetary Resources co-founder Eric Anderson said.
"In addition to supporting life, water will also be separated into oxygen and hydrogen for breathable air and rocket propellant."
More than 1,500 of the approximately 9,000 known NEAs are as reachable as the Moon in terms of how much energy it would take for the trip, the start-up says.
"Our mission is not only to expand the world's resource base," said Planetary Resources chief engineer Chris Lewicki.
"We want to increase people's access to, and understanding of our planet and solar system by developing capable and cost-efficient systems."
The company has developed the first in what it said will be a family of prospecting spacecraft dubbed Arkyd-100 Series.

http://www.abc.net.au/news/2012-04-25/us-firm-plans-to-mine-asteroids/3970944

Tuesday, April 24, 2012

Airborne Wind Turbines - Altaeros Energies



At one point, it seemed like wind farms would be the answer to all our alternative power needs. Then came the concerns about health, decreased land values and noise. If only there was a way to get around those objections.
According to Altaeros Energies of the US, there is. If the turbines were housed in helium blimps that could be raised hundreds of metres into the air (and generate twice the energy of your average tower), that would about do it.

Altaeros also says that winds are much stronger at those higher altitudes, and more consistent than they are at ground level.
The floating turbines are tethered to the ground by cables that also transfer power back to ground level, and the devices can be rapidly deployed at remote sites, on- or offshore.

http://www.zdnet.com.au/winds-of-change-339336472.htm

Monday, April 23, 2012

The World’s Least Practical Flying Car Takes Flight

Flying cars are the Jetsons dream that never came true. But wait, some Dutch dudes have cobbled together a weird helicar contraption that combines all the worst, most impractical elements of car and helicopter into one amazingly expensive prototype! Thank goodness.
On the ground, this thing — called the PAL-V One — has three wheels and tilts as you corner; it’s not a car, as such; it’s more a weird enclosed motorcycle. Like almost every other impractical flying car, you have to stop and unfold parts before you can even attempt to fly it. Then, you only get to take it up to 4000 feet (1219m) or less because of the way it’s licensed for flight.
Sigh. On the plus side, it does hit 180km/h, both in the air and on the ground, and you can get away with topping it up with good ol’ gasoline. There’s no word at the moment about when it will be commercially available, but if it follows the path of some of its rivals — like Terrafugia — we’ll be in for a long goddamn wait.
Speaking of which, Terrafugia reportedly hit a “major milestone” itself last week, as its first production prototype took to the air. It suggests that it might even have a model for sale some time this year. I suspect you may see a pig fly first.


Tech Billionaires Plan Audacious Mission to Mine Asteroids



There’s gold in them there hills. You know, those ones floating around in space. Asteroids contain many tons of precious metals, making them irresistible to scientists, aerospace engineers, futurists, fiction writers … and tech billionaires.
A group of wealthy, adventurous entrepreneurs will announce on Apr. 24 a new venture called Planetary Resources, Inc., which plans to send swarms of robots to space to scout asteroids for precious metals and set up mines to bring resources back to Earth, in the process adding trillions of dollars to the global GDP, helping ensure humanity’s prosperity and paving the way for the human settlement of space.
“The resources of Earth pale in comparison to the wealth of the solar system,” said Eric Anderson, who founded the commercial space tourism company Space Adventures and is co-founder of the new company along with Peter Diamandis, who started the X Prize Foundation, which offers prize-based incentives for advanced technology development.

Nearly 9,000 asteroids larger than 150 feet in diameter orbit near the Earth. Some could contain as much platinum as is mined in an entire year on Earth, making them potentially worth several billion dollars each. The right kinds of investment could reap huge rewards for those willing to take the risk.
Outside of NASA, Anderson and Diamandis are among the most likely candidates to realize such a dream. Space Adventures has sent seven private tourists to the International Space Station while the Ansari X Prize led to a spurt of non-governmental manned spaceships.
“We have a long track record of making large-scale space ventures real,” said Diamandis.
Despite the promise of astronomical profits, the long time-scales and uncertain return on asteroid mining has historically driven most investors away from such undertakings. But the new company is also backed by a number of other billionaire luminaries, including Google’s CEO Larry Page and executive chairman Eric Schmidt, former Microsoft chief architect Charles Simonyi, and Ross Perot Jr. The venture also counts on filmmaker James Cameron, former astronaut Tom Jones, former JPL engineer Chris Lewicki, and planetary scientist Sara Seager as advisers.

Still, this new undertaking will be much larger and more ambitious than anything Anderson and Diamandis have attempted before. The hurdles are many and high. While the endeavor is technically feasible, the technology has not yet been developed. And beyond their initial steps, the details of Planetary Resources’ plans remain scarce.
The first hurdle will likely be ensuring that Planetary Resources has covered all its legal bases. While some have argued that governments need to set up specific property rights before investors will make use of space, the majority of space lawyers agree that this isn’t necessary to assure the opportunity for a return on investment, said space policy analyst Henry Hertzfeld at George Washington University in Washington D.C. Mining occurs in international seabeds — even without specific property rights — overseen by a special commission dedicated to the task, he said. A similar arrangement would likely work in space.

In terms of extraction, Planetary Resources hopes to go after the platinum-group metals — which include platinum, palladium, osmium, and iridium — highly valuable commodities used in medical devices, renewable energy products, catalytic converters, and potentially in automotive fuel cells.
Platinum alone is worth around $23,000 a pound — nearly the same as gold. Mining the top few feet of a single modestly sized, half-mile-diameter asteroid could yield around 130 tons of platinum, worth roughly $6 billion.
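The arithmetic behind that figure checks out, assuming short tons (the article doesn't specify which ton it means):

```python
# Checking the quoted platinum arithmetic (short tons assumed).
price_per_lb = 23_000      # dollars per pound, as quoted
tons = 130                 # tons of platinum from the top few feet
pounds = tons * 2_000      # 260,000 lb
value = pounds * price_per_lb
print(value)  # 5980000000, i.e. roughly $6 billion
```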
Within the next 18 to 24 months, Planetary Resources hopes to launch between two and five space-based telescopes, at an estimated cost of a few million dollars each, to identify potentially valuable asteroids. Other than their size and orbit, little detailed information is available about the current catalog of near-Earth asteroids. Planetary Resources’ Arkyd-101 Space Telescopes will figure out whether any are worth the trouble of resource extraction.
Within five to seven years, the company hopes to send out a small swarm of similar spacecraft for a more detailed prospecting mission, mapping out a valuable asteroid in detail and identifying rich resource veins. They estimate such a mission will cost between $25 million and $30 million.
The next step — using robots to remotely mine, possibly refine ore, and return material to Earth safely — is probably the toughest phase, and Planetary Resources is still tight-lipped about its plans here.

This is an unprecedented challenge — the only asteroid material ever returned to Earth comes from the Japanese Space Agency’s Hayabusa spacecraft, which successfully returned a few hundred dust particles from asteroid 25143 Itokawa in 2010.
One possibility might be to find a useful asteroid and push it closer to Earth. A fairly low-power solar-electric ion engine could nudge a hunk of rock into orbit around the Earth, effectively creating a small second moon that could be easily accessed.
A recent white paper written by a team of scientists and engineers for the Keck Institute for Space Studies looked at exactly this proposition in order to use an asteroid for scientific and manned exploration. The team concluded that the technology exists, though such a plan would need at least $2.6 billion in funding. If Planetary Resources went this route, it would rack up a large initial investment, which doesn’t include actually mining and returning material back to Earth, potentially adding many hundreds of millions of dollars more.
“It’s one thing to understand the mining and refining processes and another thing to actually build it,” said JPL engineer John Brophy, who co-authored the paper. “And everything in space tends to be harder than you think it will be.”
Another option to simplify the process might be to bring the ore back to Earth for refining, though that presents its own set of challenges. Say for the sake of argument that you send a 5,500-pound robot (roughly the weight of a small car) to an asteroid and it can mine and carry back 100 times its own weight in asteroid material. On most asteroids, chopping up a one-ton chunk of regolith will generate less than an ounce of platinum. Even asteroids with the highest concentration of platinum yield only about two ounces of platinum per ton.
This means that with the current commodity prices, each of your robot miners will generate about $875,000, even on an asteroid with the highest platinum amounts. Given a mission cost that is at least hundreds of millions of dollars, it wouldn’t be advantageous to refine ore on Earth.
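The $875,000 figure can be roughly reproduced. The troy-ounce conversion and per-robot haul below are assumptions layered on the quoted numbers; the article doesn't state its exact inputs.

```python
# Reproducing the miner-robot arithmetic above. Short tons and troy
# ounces are assumed; the quoted $23,000/lb price is converted to a
# per-ounce price (1 avoirdupois lb ~ 14.583 troy oz).
robot_lb = 5_500
haul_lb = robot_lb * 100            # hauls 100x its own weight
haul_tons = haul_lb / 2_000         # 275 short tons of regolith
oz_per_ton = 2                      # richest asteroids, per the text
platinum_oz = haul_tons * oz_per_ton       # 550 troy oz per robot
price_per_oz = 23_000 / 14.583             # ~$1,577 per troy oz
print(round(platinum_oz * price_per_oz))   # roughly 867,000 dollars,
                                           # near the quoted $875,000
```

Set against a mission cost in the hundreds of millions, a sub-million-dollar haul per robot is exactly why Earth-side refining looks so unattractive.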
There are also unknown financial aspects of a successful asteroid mining operation. The sudden influx of hundreds of tons of platinum into Earth’s economy would certainly drive the commodity’s price down. Looking at historical analogues, the enormous gold and silver reserves the Spanish inherited from their New World conquests led to terrible inflation and possibly the decline of their empire.
But Planetary Resources sees a platinum price drop as one of its potential goals.
“I would be overjoyed as a company if we brought back so much platinum that the price fell by a factor of 20 or 50,” said Anderson.

Aluminum was incredibly expensive in the 1800s, before new technology allowed it to be easily separated from its ore, said Diamandis. Today, aluminum is used in hundreds of applications, something that Anderson and Diamandis would like to see happen to the platinum-group metals.
While mining platinum and other rare metals is Planetary Resources’ way of bringing wealth to Earth, the world still has ample reserves of such material — South African platinum mines alone are expected to produce for another 300 years.
“In my view, it’s questionable how the economics of asteroid retrieval works if you’re going to bring it to the ground,” said Brophy. “It makes more sense if you’re going to use the materials in space.”
Asteroids contain one substance that is of extremely high value for astronauts: water. Water can be used for drinking and it can be broken into its constituents. Oxygen is valuable for life support in space-based habitats, while liquid oxygen and hydrogen are both used to produce rocket fuel.
Rather than having to lug all the fuel for a mission out of Earth’s deep gravity well — an expensive proposition — having a “gas station” in space could help enable missions to Mars and beyond. Such a refueling depot might allow people to permanently live and work in space, another goal of Planetary Resources.
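Why the gravity well dominates the economics follows from the Tsiolkovsky rocket equation. A minimal sketch, using assumed round numbers for the delta-v budgets and engine performance (not figures from the article):

```python
import math

def propellant_fraction(delta_v_ms, isp_s, g0=9.81):
    """Tsiolkovsky rocket equation: the fraction of a rocket's initial
    mass that must be propellant to achieve the given delta-v (m/s)."""
    return 1.0 - math.exp(-delta_v_ms / (isp_s * g0))

ISP = 350  # seconds, a kerosene/LOX-class engine (assumed)

# Climbing out of the gravity well: ~9.4 km/s to reach low Earth orbit.
from_ground = propellant_fraction(9_400, ISP)   # roughly 94% propellant

# Leaving for Mars from an orbital "gas station": ~3.6 km/s injection burn.
from_depot = propellant_fraction(3_600, ISP)    # roughly 65% propellant
```

Starting a deep-space burn from a refueled vehicle already in orbit leaves far more of the vehicle's mass available for payload, which is the case for mining water in space rather than hauling fuel up from the ground.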

Of course, this creates a sort of chicken-and-egg problem. Do you generate tons of resources for your nonexistent space civilization first or do you get your space civilization started and then utilize the available resources?
Wired Science’s resident space historian David S. Portree thinks asteroid mining might make more sense once we have more established space-based habitats, with a different economy and better technology.
“Right now it would be like a big oil tanker dropping anchor off the coast of medieval England,” he said. “The medieval English might identify the oil as a useful commodity, but wouldn’t be able to use enough to profit the tanker crew. Heck, they wouldn’t know how to get it off the tanker, except in wooden pails and rowboats.”

Image: 1) Artist rendition of a robotic mining mission to a near-Earth asteroid. NASA/Denise Watt. 2) A mock-up of the Arkyd-101 Space Telescope. Planetary Resources, Inc. 3) Manned exploration of an asteroid pushed into lunar orbit, from a recent KISS white paper. NASA/AMA, Inc. 4) Prices for various metals. Aluminum is $0.026/oz, barely registering on the chart.

Video: Robot miners rove over the surface of an asteroid, extracting resources. Planetary Resources, Inc.

http://www.wired.com/wiredscience/2012/04/planetary-resources-asteroid-mining/

Wednesday, April 18, 2012

Mars Viking robots 'found life'

Irene Klotz
Discovery News
A boulder-strewn field of red rocks stretches across the horizon in this self-portrait of Viking 2 on Mars’ Utopian Plain. (NASA)
New analysis of 36-year-old data, resuscitated from printouts, shows NASA may have found life on Mars, an international team of mathematicians and scientists concludes in a paper published this week.
Further, NASA doesn't need a human expedition to Mars to nail down the claim, says neuropharmacologist and biologist Joseph Miller of the University of Southern California Keck School of Medicine.
"The ultimate proof is to take a video of Martian bacteria. They should send a microscope - watch the bacteria move," says Miller.
"On the basis of what we've done so far, I'd say I'm 99 per cent sure there's life there."
Miller's confidence stems in part from a new study that re-analysed results from a life-detection experiment conducted by NASA's Viking Mars robots in 1976.
Researchers crunched raw data collected during runs of the Labelled Release experiment, which looked for signs of microbial metabolism in soil samples scooped up and processed by the two Viking landers. The general consensus among scientists has been that the experiment found geological, not biological, activity.
The new study took a different approach. Researchers distilled the Viking Labelled Release data, provided as hard copies by the original researchers, into sets of numbers and analysed the results for complexity.
Since living systems are more complicated than non-biological processes, the idea was to look at the experiment results from a purely numerical perspective.
They found close correlations between the Viking experiment results' complexity and those of terrestrial biological data sets. They say the high degree of order is more characteristic of biological, rather than purely physical processes.
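The published analysis used cluster analysis and complexity measures that are more involved than anything shown here, but the underlying idea, scoring a time series by how ordered it is, can be illustrated with a toy Shannon-entropy comparison. All signals below are invented for illustration; this is not the study's actual method or data.

```python
import math
import random
from collections import Counter

def shannon_entropy(series, bins=8):
    """Entropy (bits) of a series after coarse binning into discrete states."""
    lo, hi = min(series), max(series)
    width = (hi - lo) / bins or 1.0              # guard against a constant series
    labels = [min(int((x - lo) / width), bins - 1) for x in series]
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

random.seed(0)
ordered = [1 - math.exp(-i / 50) for i in range(200)]   # smooth response curve
noise = [random.random() for _ in range(200)]           # disordered signal
# The ordered curve concentrates into a few states, so it scores well below
# the near-maximum entropy of the noise.
```

The study's argument runs along similar lines: the Labelled Release traces showed an ordered, complex structure numerically closer to terrestrial biological reference series than to purely physical ones.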

Not iron-clad

Critics counter that the method has not yet been proven effective for differentiating between biological and non-biological processes on Earth so it's premature to draw any conclusions.
"Ideally to use a technique on data from Mars one would want to show that the technique has been well calibrated and well established on Earth. The need to do so is clear; on Mars we have no way to test the method, while on Earth we can," says planetary scientist and astrobiologist Christopher McKay, with NASA's Ames Research Center in California.
While not iron-clad, Miller says the findings are an additional plank of evidence challenging the popular contention that Viking did not find life.
He also is reanalysing the data to see if there are variations when sunlight was blocked by a weeks-long dust storm on Mars, with the idea being that biological systems would have acted differently to the environmental change than geologic ones. Results of the research are expected to be presented in August.
The research is published online in the International Journal of Aeronautical and Space Sciences.

http://www.abc.net.au/science/articles/2012/04/13/3476970.htm?site=science&topic=space

Tuesday, April 17, 2012

NASA clears SpaceX for space supply run



The SpaceX Dragon capsule is lifted to be placed atop its cargo ring inside a processing hangar at Cape Canaveral Air Force Station in Florida (NASA: Kim Shiflett)
NASA has cleared a privately-made cargo ship for a test flight to the International Space Station that is scheduled to launch in less than two weeks.
The Dragon mission would be the first time a privately owned and operated vessel visits the space station, a US$100 billion research laboratory that orbits about 380 kilometres above Earth.
NASA is counting on Space Exploration Technologies, also known as SpaceX, and a second company, Orbital Sciences Corporation, to keep the space station stocked with supplies and science experiments following the retirement of the space shuttles last year. The companies' combined contracts for cargo deliveries are worth US$3.8 billion.
"In order for space station to be successful, these systems have to be there for us," says space station program manager Mike Suffredini.
"We're really rooting for the teams to come through," adds NASA Associate Administrator Bill Gerstenmaier.
So far, NASA has invested US$381 million in the SpaceX rocket and cargo capsule, with the company and investors contributing about another US$700 million, says SpaceX founder and Chief Executive Elon Musk.
The Falcon 9 rocket and Dragon capsule also are in the running to serve as a space taxi for astronauts. The United States hopes to break Russia's monopoly on flying crews to the station, a service that costs more than US$60 million per person, by 2016 under a related NASA program.
"This is a test flight and we may not succeed on getting all the way to the space station," says Musk. "I think we've got a pretty good shot, but it's important to acknowledge that a lot can go wrong. This is pretty tricky."
If the launch is successful, the Dragon capsule would conduct a series of manoeuvres and tests in orbit before NASA clears it for approach and berthing at the station, which is targeted for 3 May. It would remain attached to the outpost for several weeks before flying back to Earth and splashing down in the Pacific Ocean for recovery.
The capsule will carry 521 kilograms of food and non-critical equipment and supplies to the station. It is expected to return 660 kilograms of cargo back to Earth, a capability that far exceeds what the Russian Soyuz capsules can hold.
The European and Japanese ships that also fly cargo to the station incinerate in the atmosphere after making deliveries and do not return to Earth.
NASA plans a final review of the Dragon mission next week to verify SpaceX flight software. The Falcon 9 rocket is scheduled for launch at 1:22 am AEST (16:22 GMT) on 1 May, with a backup launch opportunity four days later.

Wednesday, April 11, 2012

Domestic Battery Storage

Charge of the battery brigade

Stephen Luntz ABC Environment 26 Mar 2012

A Redflow battery sits outside a house in Newcastle. Credit: Ausgrid
ON A QUIET CUL-DE-SAC in Elermore Vale, Newcastle, where the lawns are clipped and houses humble, Paul Jeffkins surveys a fridge-sized object installed outside his house. But it's no beer fridge. Instead of stubbies, it stores electric charge, grabbing electricity during off-peak times to release when demand is high. The unit is one of 40 installed in houses in this suburb of Newcastle, as part of a trial by electricity infrastructure corporation Ausgrid. Most of the power currently being stored comes from coal fired power stations, but the success of these batteries may also prove important to initiating a future powered by sun and wind.
The new arrivals colonising this deliberately ordinary street are zinc-bromine flow batteries. Each can store 10 kilowatt-hours of electricity, around what a basic 1.5kW set of solar panels generates on a sunny day. Two tanks hold a solution of zinc and bromine that can be pumped past a stack of plastic electrodes. When the battery is charged, zinc is deposited from the solution and coated onto the negative electrode; while at the positive electrode bromine is produced for storage within a tank. Zinc and bromide ions reform during discharging.
Ausgrid hopes the household-sized batteries will reduce the need for them to build new power stations. Gas plants currently sit idle for much of the day in order to be brought online for brief spurts at maximum demand, making peak power enormously expensive, a cost that is worn by consumers. If everyone had a battery, there would not be the same need to fire up a power station at peak demand.
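The peak-shaving mechanism can be sketched in a few lines. The demand profile and threshold below are invented figures; only the 10 kWh battery capacity matches the trial units described in the article.

```python
def shave_peaks(demand_kw, battery_kwh, threshold_kw):
    """Discharge stored energy whenever hourly demand exceeds the threshold,
    so the grid sees demand capped at that level while the charge lasts."""
    stored = battery_kwh
    grid_kw = []
    for d in demand_kw:
        discharge = min(max(0.0, d - threshold_kw), stored)
        stored -= discharge
        grid_kw.append(d - discharge)
    return grid_kw

# Hypothetical household: quiet overnight, a 4-hour evening air-conditioning peak.
demand = [0.5] * 6 + [1.0] * 11 + [3.5] * 4 + [1.0] * 3   # 24 hourly kW values
shaved = shave_peaks(demand, battery_kwh=10.0, threshold_kw=1.5)
# The 8 kWh of excess evening demand fits inside the 10 kWh battery, so the
# grid sees at most 1.5 kW instead of a 3.5 kW peak.
```

Multiplied across a suburb, flattening those evening spikes is what spares the utility from firing up a gas peaking plant.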
Moreover, while there is much talk of "the grid", as though there were only one, Ausgrid's energy efficiency expert Paul Myors says there are actually many subgrids, with some getting close to requiring upgrading in order to handle peak demand. "In the commercial heart of a major city demand peaks during business hours, but in the suburbs it's when people get home on a hot day and start their air conditioners".
Unfortunately this occurs well past peak production for rooftop solar. A better integrated grid could feed power from panels installed across homes to commercial centres during the day. However, not only does this require expensive infrastructure upgrades, but larger substations and more powerlines can also spark local opposition.
Batteries sitting under the eaves of houses are much less likely to attract complaints than increased transmission infrastructure, let alone a suburban gas fired power station.
It is a matter of weeks since Ausgrid began installing the batteries, far too early for results. Jeffkins says he has had no problems with noise or other interference, making the $150 payment he received a good deal for the minor loss of space. He's also untroubled by the possibility of electrolyte leakage that has hampered installation of large battery systems in the past.
The batteries currently charge and discharge to and from the grid, but in the second year of the trial they will charge while the demand is low, and enable Jeffkins to cut his consumption during the top tariff. In combination with his pre-existing 3kW solar panels, Jeffkins foresees substantial savings to his electricity bills, as long as Ausgrid allows him to keep the battery. "If there is some way I can buy it at the end of the trial I'll be interested," he remarks.

Salts and batteries

Alessandro Volta, after whom voltage is named, produced his first crude battery in 1800, and the more efficient Daniell cell preceded widespread electrification by many decades. However, the batteries in your car, laptop or flashlight are expensive ways to store household-sized quantities of power. Moreover, these well developed technologies may have little promise of significant further improvement. Fortunately, however, other technologies are emerging to fill the gap.
Not all of these are batteries in the traditional sense. Solar thermal power plants now run well into the night, powered by molten salts that can store heat to drive turbines. Solid blocks of graphite, run through with heat exchangers, have been proposed for Cloncurry and King Island. Other proposals include compressed air and splitting water to produce hydrogen. Hydroelectric power plants and flywheels represent more traditional forms of mass storage that some inventors hope to give a new lease of life.
In many cases these forms of storage are less expensive than batteries, but are often unsuited to dispersed application - Ausgrid might have had considerably more difficulty persuading Jeffkins and his neighbours to participate if they'd been asked to store high temperature molten salts on their front verandah.
Bruce Ebzery of Redflow, the supplier of the zinc-bromine flow batteries for Elermore Vale, believes his technology holds great promise. "The difference between all types of flow batteries and existing forms, such as lead-acid or nickel-hydride, is that in flow batteries the parts that make the reaction work don't get involved in the reaction so they don't degrade," he says. "In theory you can charge and discharge the battery forever with no loss of performance." Ebzery acknowledges the theory does not always work out, but flow batteries are still capable of far more cycles, at far higher efficiency, than the lead-acid batteries popular in off-grid systems.
Ebzery says there is no theoretical reason for them to be more expensive either. "Zinc and bromine are both common materials that you can buy easily, not rare earths. All the other components are plastics - really just advanced shopping bags," he jokes. Nevertheless, the Redflows installed in Elermore Vale are still impractically expensive for widespread use, costing around $15,000. Asked why, Ebzery invites people to "come and look at our factory in Brisbane. We're making them by hand."
Flow batteries require two electrolytes, rather than the one used in conventional batteries and this greater complexity has hampered commercialisation, but if demand can reach levels suitable for mass production, Redflow hopes to reduce prices by two-thirds. Meanwhile what many consider the brightest prospect amongst flow batteries has been largely neglected in Australia, despite being invented here.
Dr Maria Skyllas-Kazacos came up with the idea of using a little-known metal called vanadium as the basis of flow batteries in the 1980s. Vanadium's advantage is that it has enough 'oxidative states' that both sides of the battery use vanadium as the electrolyte. Therefore the challenge of ensuring the two electrolytes never mix loses its urgency. Skyllas-Kazacos says vanadium batteries also avoid the drawback of zinc-bromine competitors, "Which can form needles that penetrate membranes and cause short circuits. It's very difficult to lay down a uniform zinc layer." On the downside, current vanadium batteries are much heavier for the same energy storage than those Ausgrid is installing.
Although it has been a slow haul from Skyllas-Kazacos' pioneering work at the University of New South Wales, vanadium redox flow batteries are now taking off worldwide. Mass production is starting, a 1.5MW system has been installed at a semi-conductor factory in Japan and China sees vanadium batteries as a big part of its energy future.
The company that was commercialising the technology in Australia, Pinnacle VRB, however, decided to devote itself to coal seam gas instead, leaving the batteries without a local champion. Worse still, when the large prototype installed to store wind-generated energy on King Island ran into trouble there was no local vanadium developer around to fix it.
Rights to the technology have now been returned to UNSW, which is negotiating with companies to develop it locally. Meanwhile Skyllas-Kazacos is working on what she calls second-generation vanadium batteries. These combine bromine's high solubility with her work on vanadium. "Instead of a zinc-bromide solution you have vanadium-bromide. Vanadium ions are formed and everything stays in solution, rather than plating metals over and over again, which is hard," she says.
While there are still technical problems to solve, the highly soluble vanadium-bromide electrolytes offer the possibility of batteries that store plenty of energy for a given weight, release this energy quickly when required, are highly reliable and capable of charging and discharging thousands of times with little loss of performance.
One day these batteries may store the power generated by wind and solar for use when the wind isn't blowing and the sun is not shining. Or they may simply take the edge off peak demand. Either way, all players in the energy industry have much to gain by charging down the battery path.

http://www.abc.net.au/environment/articles/2012/03/26/3462426.htm?WT.svl=featuredSitesScroller

Reshaping the world with 3D printers


It's not quite the new tech bubble, but 3D printing has been quietly exploding in the background of the consumer technology sphere. In 2010, the value of shipped 3D printing products totalled US$1.33bn, and that's set to keep rising even as the price of the technology plummets.
A 3D printer
(Credit: Objet)
Where traditional industrial manufacturing is usually a subtractive process, removing material from an object to create the final product, 3D printing is an additive process, piling material up to build an object from nothing.
The methodology isn't all that different from 2D printing on a page with ink. Your device rolls the paper past the mechanism that applies the ink (an inkjet, laser or pinwheel if you're an early '90s holdout) and a layer of ink is applied in the required pattern, density and colour.
3D printing does the same, but the mechanism that applies the layer of material makes subsequent passes and sets down further layers until it has built up a 3D object. The process can take anywhere from hours to days and can be used to make anything from novelty chess pieces to car body parts.
And given that anybody can buy a 3D printer, like the Makerbot, for example, the concept has put manufacturing in the hands of anyone with a few square feet of free real estate on their desk.
At the moment, 3D printing can't match the quality of industrial manufacturing, but that hasn't stopped people from pushing the envelope. One US entrepreneur has been experimenting with creating synthetic human organs for transplant, and clinical trials on printing the first plastic bladder are starting soon. Some scientists familiar with the technology suggest the same thing can even be done with living cells (instead of plastic) to build organic body tissues.
The process starts with a design — which could be anything from a spaceship hull created using high-end computer aided design (CAD) software to a personalised robot using myrobotnation.com. The design work will result in a file (in a format such as .stl, for stereolithography), which is sent to the printing device in the same way a text document or a photo is sent to an inkjet.
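In its ASCII form, an .stl file is just a list of triangular facets, each with a surface normal and three vertices. A minimal sketch of generating one (the helper names are ours, but the solid/facet/vertex syntax is the standard ASCII STL grammar):

```python
def facet(normal, v1, v2, v3):
    """Render one triangle in ASCII STL syntax."""
    lines = ["  facet normal %g %g %g" % normal, "    outer loop"]
    for v in (v1, v2, v3):
        lines.append("      vertex %g %g %g" % v)
    lines += ["    endloop", "  endfacet"]
    return "\n".join(lines)

def ascii_stl(name, facets):
    """Wrap a list of facets into a complete ASCII STL document."""
    return "solid %s\n%s\nendsolid %s" % (name, "\n".join(facets), name)

# One triangle lying in the x-y plane, facing up (+z).
tri = facet((0, 0, 1), (0, 0, 0), (1, 0, 0), (0, 1, 0))
print(ascii_stl("demo", [tri]))
```

A real model is thousands of such triangles approximating the object's surface; the printer's software then slices that surface into the stack of layers the nozzle will draw.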
A 3D printed object
(Credit: Objet)
The file contains the information needed to render the triangulated surfaces of each layer, and the device "draws" these layers one after the other. 3D printers look different from ordinary printers, and they vary vastly in price (anywhere from $1200 to $20,000), but the principle is the same: the system receives coordinates for each layer, and a series of arms drives the nozzle that applies the material, laying it down on the layer below.
There are a host of choices of printing materials depending on the desired output: transparent, rubberised, medical or rigid. Most are variations on common polypropylenes, polymers or resins and can be bought on a spool just like a roll of cotton. Prices vary accordingly, but not all materials are prohibitively priced. One US website sells one kilogram of ABS (acrylonitrile butadiene styrene, an engineering polymer) for US$79.
What can be done with the final product is limited only by the tensile strength of the material. In one example that sounds like it's straight out of a sci-fi movie, you can use a 3D printer to produce the components required to create a precise copy of itself. Or how about printing up a few discontinued appliance parts? Just download the appropriate file.
But you don't have to own a 3D printer to take advantage of 3D printing — the web has made it easy to offer 3D printing as an online service. For as little as $15, for example, you can upload a design file and have your finished model shipped to you.

Thursday, April 05, 2012

Future looks good for bionic eye prototype

Photo: Prototype ... A camera mounted on glasses will capture vision and transforms it into electrical signals. (ABC TV)
A team of Australian researchers developing a bionic eye that could help restore sight to the blind will test a full prototype later this month.
The Monash Vision Group, a team of 50 scientists, believes it is on track to be the first in the world to implant a microchip into the brain of blind patients.
A patient will wear glasses with a tiny camera, which will act like an eye's retina. A pocket processor will then convert these images into electronic signals to be sent to a microchip implant in the brain.
The procedure will insert 650 hair-thin electrodes into the visual cortex. When fully operational, the patient will see low-resolution black-and-white images.
"It's a huge milestone in our technical development," said the group's general manager, Dr Jeanette Pritchard.
"It enables us to put together all the individual components of our device and test them to make sure that they're all working together and that they're integrated properly with each other.
"It will be implanted by highly skilled neurosurgeons and they incorporate a tiny microchip that's about 4x4 millimetres in size and that microchip then sends signals to hair-thin electrodes that penetrate into the visual cortex of the brain.
"When those electrodes are stimulated they produce sensations of light in the brain in the visual field of the recipient. They're known as phosphenes and they're almost like pixels on a TV screen. And for each electrode we'll get one flash of light or one phosphene."
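The one-phosphene-per-electrode idea in the quote can be sketched as a toy downsampler: map each electrode to the average brightness of the camera pixels it covers. The 25 x 26 layout is an assumed arrangement chosen only because it gives 650 cells; the real electrode geometry is not described in the article.

```python
def to_phosphenes(frame, rows=25, cols=26, threshold=128):
    """Reduce a grayscale camera frame (a list of pixel rows, values 0-255)
    to an on/off grid of 25 x 26 = 650 phosphenes, one per electrode."""
    h, w = len(frame), len(frame[0])
    grid = []
    for r in range(rows):
        row = []
        for c in range(cols):
            # average the block of camera pixels this electrode covers
            block = [frame[y][x]
                     for y in range(r * h // rows, (r + 1) * h // rows)
                     for x in range(c * w // cols, (c + 1) * w // cols)]
            row.append(1 if sum(block) / len(block) >= threshold else 0)
        grid.append(row)
    return grid
```

Each 1 in the output corresponds to stimulating one electrode, i.e. one flash of light in the recipient's visual field.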
At first the team expects to work with patients who have been blinded by trauma, such as car or industrial accidents.
But at this stage, they are not sure how the technology will work for people who have never seen.
"It's important that for our first patient that they have had full adult vision so that we know that their brain can process these kinds of signals because it has done so previously," Dr Pritchard said.
"It will be a lot about how the patient can learn to interpret that information to the optimum level to get the most out of it."

Lucrative potential


Another group, Bionic Vision Australia, is working on a different type of implant that will go directly into the eye.
Their technology will target older patients with eye disease.
Marketing and communications manager Veronika Gouskova says trials will begin next year.
"What we're doing is giving them some visual information via a retinal implant so that we can then take advantage of the natural processing abilities of that network of neurons and nerve cells," she said.
Australia has had a successful history of innovation with medical bionics; first the heart pacemaker and then the cochlear implant for the deaf. Bionic eye technology has grown out of this and the commercial potential is huge.
"The prediction is that by 2020 there'll be over 78,000 people who are clinically blind in Australia and it could be up to nearly 200 million people worldwide," Dr Pritchard said.
Monash Vision Group will start full-scale human trials in 2014 and hopes to have the product rolled out in world markets by 2020.

http://www.abc.net.au/news/2012-04-05/scientists-to-test-bionic-eye-prototype/3936204