Tuesday, February 19, 2008

A Second Earth in Our Solar System

Traveling to another Earth-like world just got a lot easier. It turns out there may be many other dirt-and-water planets lurking at the edges of our solar system, in places like the Oort Cloud. These planets, which could be roughly the size of our own, would contain all the elements we need for life. They're just sitting in a cold, dimly lit part of the solar system, waiting to be defrosted and colonized. Yesterday, NASA scientists announced findings that change the prognosis for nearby livable planets.

NASA's Alan Stern said these planets are so far away from the sun that we haven't seen them yet:

Our old view, that the Solar System had nine planets, will be supplanted by a view that there are hundreds if not thousands of planets in our Solar System. It could be that there are objects of Earth-mass in the Oort cloud (a band of debris surrounding our planetary system) but they would be frozen at these distances. They would look like a frozen Earth.

So all we need to do is haul one of those babies into our orbit, defrost it, and start populating. Earth 2, here I come!

Beyond our solar system, millions more Earth 2s await. University of Arizona astronomer Michael Meyer, co-author of a study about extrasolar dirt-and-water worlds, told reporters:

Our observations suggest that between 20% and 60% of Sun-like stars have evidence for the formation of rocky planets not unlike the processes we think led to planet Earth. That is very exciting.

Original here

Astronomy Picture of the Day

Discover the cosmos! Each day a different image or photograph of our fascinating universe is featured, along with a brief explanation written by a professional astronomer.

Columbus Laboratory Installed on Space Station
Credit: STS-122 Crew, Expedition 16 Crew, ESA, NASA

Explanation: The International Space Station (ISS) has been equipped with a powerful new scientific laboratory. The Space Shuttle Atlantis delivered the Columbus Laboratory to the ISS and installed the seven meter long module over the past week. Columbus has ten racks for experiments that can be controlled from the station or the Columbus Control Center in Germany. The first set of experiments includes the Fluid Science Laboratory that will explore fluid properties in the microgravity of low Earth orbit, and Biolab which supports experiments on microorganisms. Future Columbus experiments include an atomic clock that will test minuscule timing effects including those expected by Einstein's General Theory of Relativity. Pictured above, mission specialist Hans Schlegel works on the outside of Columbus. Scientists from all over the world may propose and carry out experiments to be done on the laboratory during its ten year mission.

Original here

Scientists Would Turn Greenhouse Gas Into Gasoline

SEEING GREEN: Air pollution in downtown Los Angeles.

If two scientists at Los Alamos National Laboratory are correct, people will still be driving gasoline-powered cars 50 years from now, churning out heat-trapping carbon dioxide into the atmosphere — and yet that carbon dioxide will not contribute to global warming.

The scientists, F. Jeffrey Martin and William L. Kubic Jr., are proposing a concept, which they have patriotically named Green Freedom, for removing carbon dioxide from the air and turning it back into gasoline.

The idea is simple. Air would be blown over a liquid solution of potassium carbonate, which would absorb the carbon dioxide. The carbon dioxide would then be extracted and subjected to chemical reactions that would turn it into fuel: methanol, gasoline or jet fuel.

This process could transform carbon dioxide from an unwanted, climate-changing pollutant into a vast resource for renewable fuels. The closed cycle — equal amounts of carbon dioxide emitted and removed — would mean that cars, trucks and airplanes using the synthetic fuels would no longer be contributing to global warming.

Although they have not yet built a synthetic fuel factory, or even a small prototype, the scientists say it is all based on existing technology.

“Everything in the concept has been built, is operating or has a close cousin that is operating,” Dr. Martin said.

The Los Alamos proposal does not violate any laws of physics, and other scientists, like George A. Olah, a Nobel Prize-winning chemist at the University of Southern California, and Klaus Lackner, a professor of geophysics at Columbia University, have independently suggested similar ideas. Dr. Martin said he and Dr. Kubic had worked out their concept in more detail than previous proposals.

There is, however, a major caveat that explains why no one has built a carbon-dioxide-to-gasoline factory: it requires a great deal of energy.

To deal with that problem, the Los Alamos scientists say they have developed a number of innovations, including a new electrochemical process for detaching the carbon dioxide after it has been absorbed into the potassium carbonate solution. The process has been tested in Dr. Kubic’s garage, in a simple apparatus that looks like mutant Tupperware.

Even with those improvements, providing the energy to produce gasoline on a commercial scale — say, 750,000 gallons a day — would require a dedicated power plant, preferably a nuclear one, the scientists say.

According to their analysis, their concept, which would cost about $5 billion to build, could produce gasoline at an operating cost of $1.40 a gallon and would turn economically viable when the price at the pump hits $4.60 a gallon, taking into account construction costs and other expenses in getting the gas to the consumer. With some additional technological advances, the break-even price would drop to $3.40 a gallon, they said.
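For readers who like to see the arithmetic: the article gives the capital cost, operating cost and output, but not the financing assumptions behind the break-even figures, so here is a rough back-of-the-envelope sketch. The capital recovery factor is an illustrative guess on my part, not a number from Drs. Martin and Kubic.

```python
# Back-of-the-envelope version of the article's numbers. The authors have
# not published their financing assumptions, so the capital recovery
# factor here is purely illustrative.

CAPITAL_COST = 5e9             # plant construction cost, dollars (article)
OPERATING_COST = 1.40          # dollars per gallon (article)
GALLONS_PER_DAY = 750_000      # commercial-scale output (article)

annual_gallons = GALLONS_PER_DAY * 365     # ~274 million gal/yr
CAP_RECOVERY = 0.08                        # assumed: ~8% of capital per year

capital_per_gal = CAPITAL_COST * CAP_RECOVERY / annual_gallons
plant_gate_cost = capital_per_gal + OPERATING_COST

print(f"capital charge:  ${capital_per_gal:.2f}/gal")   # ~$1.46/gal
print(f"plant-gate cost: ${plant_gate_cost:.2f}/gal")   # ~$2.86/gal
# Distribution, marketing and taxes would account for the remaining gap
# up to the article's $4.60/gal break-even pump price.
```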

A nuclear reactor is not required technologically. The same chemical processes could also be powered by solar panels, for instance, but the economics become far less favorable.

Dr. Martin and Dr. Kubic will present their Green Freedom concept on Wednesday at the Alternative Energy Now conference in Lake Buena Vista, Fla. They plan a simple demonstration within a year and a larger prototype within a couple of years after that.

A commercial nuclear-powered gasoline factory would have to jump some high hurdles before it could be built, and thousands of them would be needed to fully replace petroleum, but this part of the global warming problem has no easy solutions.

In the efforts to reduce humanity’s emissions of carbon dioxide, now nearing 30 billion metric tons a year, most of the attention so far has focused on large stationary sources, like power plants where, conceptually at least, one could imagine a shift from fuels that emit carbon dioxide — coal and natural gas — to those that do not — nuclear, solar and wind. Another strategy, known as carbon capture and storage, would continue the use of fossil fuels but trap the carbon dioxide and then pipe it underground where it would not affect the climate.

But to stabilize carbon dioxide levels in the atmosphere would require drastic cuts in emissions, and similar solutions do not exist for small, mobile sources of carbon dioxide. Nuclear and solar-powered cars do not seem plausible anytime soon.

Three solutions have been offered: hydrogen-powered fuel cells, electric cars and biofuels. Biofuels like ethanol are gasoline substitutes produced from plants like corn, sugar cane or switch grass, and the underlying idea is the same as Green Freedom. Plants absorb carbon dioxide as they grow, balancing out the carbon dioxide emitted when they are burned. But growing crops for fuel takes up wide swaths of land.

Hydrogen-powered cars emit no carbon dioxide, but producing hydrogen, by splitting water or some other chemical reaction, requires copious energy, and if that energy comes from coal-fired power plants, then the problem has not been solved. Hydrogen is also harder to store and move than gasoline and would require an overhaul of the world’s energy infrastructure.

Electric cars also push the carbon dioxide problem to the power plant. And electric cars have typically been limited to a range of tens of miles as opposed to the hundreds of miles that can be driven on a tank of gas.

Gasoline, it turns out, is an almost ideal fuel (except that it produces 19.4 pounds of carbon dioxide per gallon). It is easily transported, and it generates more energy per volume than most alternatives. If it can be made out of carbon dioxide in the air, the Los Alamos concept may mean there is little reason to switch, after all. The concept can also be adapted for jet fuel; for jetliners, neither hydrogen nor batteries seem plausible alternatives.
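That 19.4-pound figure is easy to sanity-check from basic chemistry. The sketch below uses round numbers for gasoline's weight and carbon content (my assumptions, not the article's): each pound of carbon burned becomes 44/12 pounds of carbon dioxide.

```python
# Sanity check of the 19.4 lb figure. Density and carbon fraction are
# round-number assumptions, not values from the article.

GALLON_LB = 6.3          # assumed weight of a gallon of gasoline, pounds
CARBON_FRACTION = 0.87   # assumed carbon share of that mass
CO2_PER_C = 44 / 12      # molecular weight of CO2 vs. atomic carbon

co2_lb = GALLON_LB * CARBON_FRACTION * CO2_PER_C
print(f"{co2_lb:.1f} lb CO2 per gallon")   # ~20 lb, close to the quoted 19.4
```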

“This is the only one that I have seen that addresses all of the concerns that are out there right now,” Dr. Martin said.

Other scientists said the Los Alamos proposal perhaps looked promising but could not evaluate it fully because the details had not been published.

“It’s definitely worth pursuing,” said Martin I. Hoffert, a professor of physics at New York University. “It’s not that new an idea. It has a couple of pieces to it that are interesting.”

Original here

Report: Alberta Oil Sands Most Destructive Project on Earth

Environmental Defence has released a report calling the Alberta Oil Sands the most destructive project on Earth.

"Few Canadians know that Canada is home to one of the world's largest dams, and it is built to hold toxic waste from just one Tar Sands operation," said Rick Smith, the executive director of Environmental Defence.

And according to the report this is just the beginning. Approvals have already been given that will double the size of existing operations and Canada's leaders have been talking with the US government to grow oil sands operations in a "short time span."

Even a former Premier of Alberta is concerned. Peter Lougheed who served as Premier from 1971 to 1985 was recently quoted on the oil sands as saying:

"... it is just a moonscape. It is wrong in my judgment, a major wrong... So it is a major, major federal and provincial issue."

However, there is a silver lining in all this. A Canadian parliamentary committee recently stated that:

"A business as usual approach to the development of the oil sands is not sustainable. The time has come to begin the transition to a clean energy future."

Here are a few facts about the Alberta Oil Sands:

- Oil sands mining is licensed to use twice the amount of fresh water that the entire city of Calgary uses in a year.

- At least 90% of the fresh water used in the oil sands ends up in tailings ponds so toxic that propane cannons are used to keep ducks from landing in them.

- Processing the oil sands uses enough natural gas in a day to heat 3 million homes in Canada.

- The toxic tailings ponds are considered to be among the largest human-made structures in the world. The ponds span 50 square kilometers and can be seen from space.

- Producing a barrel of oil from the oil sands produces three times more greenhouse gas emissions than a barrel of conventional oil.

- The oil sands operations are the fastest growing source of heat-trapping greenhouse gas in Canada. By 2020 the oil sands will release twice the amount produced currently by all the cars and trucks in Canada.

Original here

Planet-hunters set for big bounty


Rocky planets, possibly with conditions suitable for life, may be more common than previously thought in our galaxy, a study has found.

New evidence suggests more than half the Sun-like stars in the Milky Way could have similar planetary systems.

There may also be hundreds of undiscovered worlds in outer parts of our Solar System, astronomers believe.

Future studies of such worlds will radically alter our understanding of how planets are formed, they say.

New findings about planets were presented at the American Association for the Advancement of Science (AAAS) in Boston.

Nasa telescope

Michael Meyer, an astronomer from the University of Arizona, said he believed Earth-like planets were probably very common around Sun-like stars.

"Our observations suggest that between 20% and 60% of Sun-like stars have evidence for the formation of rocky planets not unlike the processes we think led to planet Earth," he said. "That is very exciting."

Mr Meyer's team used the US space agency's Spitzer space telescope to look at groups of stars with masses similar to the Sun.

They detected discs of cosmic dust around stars in some of the youngest groups surveyed.

The dust is believed to be a by-product of rocky debris colliding and merging to form planets.

Nasa's Kepler mission to search for Earth-sized and smaller planets, due to be launched next year, is expected to reveal more clues about these distant undiscovered worlds.

Frozen worlds

Some astronomers believe there may be hundreds of small rocky bodies in the outer edges of our own Solar System, and perhaps even a handful of frozen Earth-sized worlds.

Speaking at the AAAS meeting, Nasa's Alan Stern said he thought only the tip of the iceberg had been found in terms of planets within our own Solar System.

More than a thousand objects had already been discovered in the Kuiper belt alone, he said, many rivalling the planet Pluto in size.

"Our old view, that the Solar System had nine planets will be supplanted by a view that there are hundreds if not thousands of planets in our Solar System," he told BBC News.

He said many of these planets would be icy, some would be rocky, and there might even be objects with the same mass as Earth.

"It could be that there are objects of Earth-mass in the Oort cloud (a band of debris surrounding our planetary system) but they would be frozen at these distances," Dr Stern added.

"They would look like a frozen Earth."

Goldilocks zone

Excitement about finding other Earth-like planets is driven by the idea that some might contain life or perhaps, centuries from now, allow human colonies to be set up on them.

The key to this search, said Debra Fischer of San Francisco State University, California, was the "Goldilocks zone".

This refers to a region of space in which a planet is "just the right distance" from its parent star so that its surface is neither too hot nor too cold to support liquid water.

"To my mind there are two things we have to go after: we have to find the right mass planet and it has to be at the right distance from the star," she said.

The AAAS meeting concludes on Monday.

Original here

Is science faith-based?

No.

Oh, you want details? OK then.

If you read any antiscience screeds, at some point or another most will claim that science is based on faith just as much as religion is. For example, the horrific Answers in Genesis website has this to say about science:

Much of the problem stems from the different starting points of our divergence with Darwinists. Everyone, scientist or not, must start their quests for knowledge with some unprovable axiom—some a priori belief on which they sort through experience and deduce other truths. This starting point, whatever it is, can only be accepted by faith; eventually, in each belief system, there must be some unprovable, presupposed foundation for reasoning (since an infinite regression is impossible).

This is completely wrong. It shows (unsurprisingly) an utter misunderstanding of how science works. Science is not faith-based, and here’s why.

The scientific method makes one assumption, and one assumption only: the Universe obeys a set of rules. That's it. There is one corollary, and that is that if the Universe follows these rules, then those rules can be deduced by observing the way the Universe behaves. This follows naturally; if it obeys the rules, then the rules must be revealed by that behavior.

A simple example: we see objects going around the Sun. The motion appears to follow some rules: the orbits are conic sections (ellipses, circles, parabolas, hyperbolas), the objects move faster when they are closer to the Sun, if they move too quickly they can escape forever, and so on.

From these observations we can apply mathematical equations to describe those motions, and then use that math to predict where a given object will be at some future date. Guess what? It works. It works so well that we can shoot probes at objects billions of kilometers away and still nail the target to phenomenal accuracy. This supports our conclusion that the math is correct. This in turn strongly implies that the Universe is following its own rules, and that we can figure them out.
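To make the example concrete, here is that orbital "rule" in a dozen lines of Python: Kepler's third law for bodies orbiting the Sun, with the period in years and the orbit size in astronomical units. The semi-major axes are measured values; the predictions fall out of the deduced rule alone.

```python
# Kepler's third law for bodies orbiting the Sun: T^2 = a^3, with the
# period T in years and the semi-major axis a in astronomical units.
import math

def orbital_period_years(a_au):
    """Period predicted purely from the deduced rule."""
    return math.sqrt(a_au ** 3)

# Measured semi-major axes, in AU.
planets = {"Mercury": 0.387, "Earth": 1.000, "Mars": 1.524, "Jupiter": 5.203}

for name, a in planets.items():
    print(f"{name}: predicted period {orbital_period_years(a):.2f} yr")

# Prints 0.24, 1.00, 1.88 and 11.87 years -- all matching the measured
# periods, the same kind of repeatable success that lets us aim probes
# at targets billions of kilometers away.
```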

Now, of course that is a very simple example, and is not meant to be complete, but it gives you an idea of how this works. Now think on this: the computer you are reading this on is entirely due to science. The circuits are the end result of decades, centuries of exploration in how electricity works and how quantum particles behave. The monitor is a triumph of scientific engineering, whether it's a CRT or an LCD flat panel. The mouse might use an LED, or a simple ball-and-wheel. The keyboard uses springs, the wireless uses radio technology, the speakers use electromagnetism.

Look around. Cars, airplanes, buildings. iPods, books, clothing. Agriculture, plumbing, waste disposal. Light bulbs, vacuum cleaners, ovens. These are all the products of scientific research. If your TV breaks, you can pray that it’ll spontaneously start working again, but my money would be on someone who has learned how to actually fix it based on scientific and engineering principles.

All the knowledge we have accumulated over the millennia comes together in a harmonious symphony of science. We’re not guessing here: this stuff was designed using previous knowledge developed in a scientific manner over centuries. And it works. All of this goes to support our underlying assumption that the Universe obeys rules that we can deduce.

Are there holes in this knowledge? Of course. Science doesn’t have all the answers. But science has a tool, a power that its detractors never seem to understand.

Science is not simply a database of knowledge. It’s a method, a way of finding this knowledge. Observe, hypothesize, predict, observe, revise. Science is provisional; it’s always open to improvement. Science is even subject to itself. If the method itself didn’t work, we’d see it. Our computers wouldn’t work (OK, bad example), our space probes wouldn’t get off the ground, our electronics wouldn’t work, our medicine wouldn’t work. Yet, all these things do in fact function, spectacularly well. Science is a check on itself, which is why it is such an astonishingly powerful way of understanding reality.

And that right there is where science and religion part ways. Science is not based on faith. Science is based on evidence. We have evidence it works, vast amounts of it, billions of individual pieces that fit together into a tapestry of reality. That is the critical difference. Faith, as it is interpreted by most religions, is not evidence-based, and is generally held tightly even despite evidence against it. In many cases, faith is even reinforced when evidence is found contrary to it.

To say that we have to take science on faith is such a gross misunderstanding of how science works that it can only be uttered by someone who is wholly ignorant of how reality works.

The next time someone tries to tell you that science is just as faith-based as religion, or that evolution is a religion, point them here. Perhaps the evidence of science may sway them. Perhaps not; it’s difficult to reason someone out of a position they didn’t reason themselves into. But the next time they get on a computer, maybe they’ll take a slightly more critical look at it, and wonder if its workings are a miracle, or the results of brilliant minds over many generations toiling away at the scientific method.

Original here

Scientists point out our flock mentality

This has been talked about forever, and whether we admit it or not, a big mass of people is in fact quite easy to manipulate because of our… flock mentality. Results from a study at the University of Leeds show that it takes a minority of just five per cent to influence a crowd's direction. The other 95% will follow the path of the 5% without even realizing it.

The findings could have major significance for directing the flow of large crowds, especially in disaster situations, when it's crucial to predict how a mass of people will react.

“There are many situations where this information could be used to good effect,” says Professor Jens Krause of the University’s Faculty of Biological Sciences. “At one extreme, it could be used to inform emergency planning strategies and at the other, it could be useful in organising pedestrian flow in busy areas.”

They conducted a series of experiments in which groups of people were asked to walk randomly around a large hall. A few of them received more exact instructions about where they were supposed to go. They were not allowed to talk with each other, but they had to stay within arm's reach of another person. The results were not that surprising when you stop to think about it: the 'informed individuals' were followed by the others in the crowd.
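You can reproduce the flavor of the experiment in a toy simulation. The sketch below is not the Leeds group's actual model; it is a minimal Couzin-style agent model with illustrative parameters of my own choosing, in which every agent aligns with neighbours within "arm's reach," and only the informed 5% also steer toward a destination.

```python
# Toy crowd: every agent matches the average heading of neighbours within
# "arm's reach"; the first 5% also steer toward a fixed destination.
import numpy as np

rng = np.random.default_rng(0)

N = 200                       # crowd size
N_INFORMED = int(0.05 * N)    # the informed 5%
TARGET = np.array([50.0, 0.0])
RADIUS = 3.0                  # interaction range ("arm's reach")
W_GOAL = 0.4                  # how strongly informed agents weight the goal
SPEED, STEPS, NOISE = 0.5, 400, 0.1

pos = rng.uniform(-10.0, 10.0, size=(N, 2))
heading = rng.uniform(-np.pi, np.pi, size=N)

for _ in range(STEPS):
    vel = np.column_stack([np.cos(heading), np.sin(heading)])
    new_heading = np.empty(N)
    for i in range(N):
        near = np.linalg.norm(pos - pos[i], axis=1) < RADIUS
        direction = vel[near].mean(axis=0)       # align with neighbours
        if i < N_INFORMED:                       # informed: blend in the goal
            goal = TARGET - pos[i]
            goal /= np.linalg.norm(goal) + 1e-9
            direction = (1 - W_GOAL) * direction + W_GOAL * goal
        new_heading[i] = np.arctan2(direction[1], direction[0]) \
            + rng.normal(0.0, NOISE)
    heading = new_heading
    pos += SPEED * np.column_stack([np.cos(heading), np.sin(heading)])

# If the minority is steering the crowd, the mean position drifts toward
# TARGET even though 95% of agents never "knew" where they were going.
print("crowd's mean final position:", pos.mean(axis=0).round(1))
```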

“We initially started looking at consensus decision making in humans because we were interested in animal migration, particularly birds, where it can be difficult to identify the leaders of a flock,” says Professor Krause. “But it just goes to show that there are strong parallels between animal grouping behaviour and human crowds.”

Original here

50 Weird Science Tidbits & Oddities

Part 1: Items 1 - 10

In my surfing journeys through the internet’s reefs and shoals, I’ve encountered some really strange stuff. Factoids hardly anybody knows, about pretty much anything that might turn up as subject matter in a rousing championship match of Trivial Pursuit down at the pub on Thursday night. Some of these are real crowd-pleasers sure to draw spontaneous applause, stunned gasps, and plenty of free beers from admirers.

While there will be ten fun, honest-to-scientific facts or odd theories in each of the five posts on this subject this week, they’re not listed in any particular order on my weird-o-meter. If you have favorites among them, please log your votes in the comments. Eventually we should have a Top Ten!

1. Octopus Beats Grinch, Heart for Heart

The Grinch (that green fuzzy guy who stole Christmas) became an official good-guy when his heart grew three sizes one day. The octopus does even better - he has three hearts!

Happy Valentine’s Day!



2. A Naked Tiger Still Has Stripes

If a tiger loses all his hair, he'll still be striped. Tiger stripes are like fingerprints: each individual cat's markings are unique. And they're not just hair; the stripes are in their skin. Seems to be a thing with cats, since your house cat's fur markings are also skin deep!


3. Most Of Your Body’s Cells Aren’t Yours

Strange but true. There are more microbial cells in your body than cells that have your own DNA. As NPR’s Robert Krulwich reported in 2006, the human body has 20 times more microbes than cells! I guess that pretty well justifies the “Imperial We.”


4. Insects Outnumber Us

Perhaps that’s not so surprising since our own body’s microbes outnumber us too, but the scale is pretty humbling. There are more insects in just one square mile of fertile soil than there are human beings on the entire planet! Quite the delicacy in Asia…


5. And We’re All Eating Them…

The average person manages to consume about 430 insects every year of their life, whether they intend to or not! And no, not all "average" people ride motorcycles.


6. Alligators Never Need Dentures

While both humans and alligators depend on their teeth in order to chew food, humans only get two sets of natural teeth to last them a lifetime. Alligators get from 2,000 to 3,000 teeth during the course of their lifetime! Which is no doubt why we never hear about grumpy ‘gators gumming anybody to death.


7. “Salt of the Earth” Is More Than A Title

There is enough salt in the world’s oceans to cover all the land on all the continents to a depth of nearly 500 feet! I’m cutting down on salt anyway, hope you are too.


8. When It Rains, It Croaks

Despite the common weather report that “it’s raining cats and dogs out there,” frogs and fish are the most likely animals to fall from the sky in rain. The most recent rain of frogs occurred in 2005 in Serbia, and it rained frogs in London in 1998. In 2006 it rained fish in India, while Wales got the fish-drop in 2004.


9. Space Resources We Could Put To Good Use

The interstellar gas cloud Sagittarius B2 contains a billion billion billion (yes, that's 10^27) liters of alcohol. This factoid is bound to be a big hit at the pub!


10. Odd Theory Out, But Great Animation!

Comic book artist Neal Adams has popularized an Expanding Earth Theory that challenges standard plate tectonics, though most scientists discount it. In this controversial theory our planet was once just about half as big as it is now, which purports to explain why dinosaurs got so big (less gravity) and other anomalies. Check out Adams' video; it's very cool!

The Entire Series:


1-10 of 50 Weird Science Tidbits & Oddities
11-20 of 50 Weird Science Tidbits & Oddities
21-30 of 50 Weird Science Tidbits & Oddities
31-40 of 50 Weird Science Tidbits & Oddities
41-50 of 50 Weird Science Tidbits & Oddities

Original here

Biofuel: Gene scientists find secret to oil yield from corn


A sheaf of corn remains after the harvest in a field.

Agricultural scientists in the United States have identified a key gene that determines oil yield in corn, a finding that could have repercussions for the fast-expanding biofuels industry.

The gene lies on Chromosome 6 of the maize genome, according to a paper published on Sunday by Nature Genetics.

It encodes a catalysing enzyme called DGAT1-2, which carries out the final step in the plant's oil-making process.

In addition, a tiny amino acid variant within this gene can boost the yield of oil and oleic acid -- the sought-after edible fat in corn -- by up to 41 percent and 107 percent respectively.

The paper, written by a team from the US chemicals and agribusiness giant DuPont, was based on a comparison of 71 strains of maize whose oil content ranged from low to high.

DGAT is "a promising target for increasing oil and oleic-acid contents in other crops," say the authors, led by Bo Shen of DuPont unit Pioneer Hi-Bred International, in Johnston, Iowa.

Present-generation biofuels are derived from food crops such as corn, sugar cane and soybeans.

Initially viewed as an environmentally friendly alternative to dirty fossil fuels, with no geopolitical risk, biofuels are now under attack as some unintended consequences emerge.

The impacts include higher prices in the global food market as more fields are devoted to growing fuel rather than food, and the destruction of forests in Brazil and Indonesia as land is cleared for fuel crops.

Scientists are looking at ways of boosting output from existing biofuel crops with the promise of a higher oil yield. Proposed methods include classic cross-breeding as well as genetic engineering, a technology that remains fiercely opposed in some countries.

Another avenue of exploration for biofuel production is in non-food fibrous plants and cellulose materials, such as switchgrass, wood chips and straw. But these novel sources, hampered by costs and technical complications, are struggling to reach commercial scale.

Global biofuel production tripled from 4.8 billion gallons (18.16 billion litres) in 2000 to about 16 billion gallons (60.56 billion litres) in 2007, but still accounts for less than three percent of the global transport fuel supply, according to US Department of Agriculture figures.

Original here

Rare Egyptian "Warrior" Tomb Found

An unusual, well-preserved burial chamber that may contain the mummy of an ancient warrior has been discovered in a necropolis in Luxor. Scientists opened the tomb—found in Dra Abul Naga, an ancient cemetery on Luxor's west bank—on Wednesday.

Inside the burial shaft—a recess crudely carved from bedrock—experts found a closed wooden coffin inscribed with the name "Iker," which translates to "excellent one" in ancient Egyptian.

Near the coffin they also found five arrows made of reeds, three of them still feathered.

A team of Spanish archaeologists made the surprise find during routine excavations in a courtyard of the tomb of Djehuty, a high-ranking official under Queen Hatshepsut whose burial site was built on top of graves dating to the Middle Kingdom, 2055 to 1650 B.C.

Wealthy Warriors

The coffin dates to Egypt's Middle Kingdom era, though the cemetery is better known for its use during the New Kingdom, 1550 to 1070 B.C.

Based on the coffin's inscriptions and pottery found near it, experts date the burial to the early 11th dynasty, which lasted from 2125 to 1985 B.C. Soldiers played an important role in society during that time, when Egypt was reunified after years of civil war.

Some intact burials from that period had been found in the 1920s, but the leader of the new excavation, Jose Galán of the Spanish National Research Council, said the new find could offer a fresh look into the era's burial customs.

"It's fairly uncommon to find nowadays an 11th-dynasty intact burial. This is really remarkable," Galán said.

"It gives us information about the continuous use of the necropolis and ... about a period that was not so well documented."

Original here

Solar cell directly splits water for hydrogen

Plants, trees and algae do it. Even some bacteria and moss do it, but scientists have had a difficult time developing methods to turn sunlight into useful fuel. Now, Penn State researchers have a proof-of-concept device that can split water and produce recoverable hydrogen.

"This is a proof-of-concept system that is very inefficient. But ultimately, catalytic systems with 10 to 15 percent solar conversion efficiency might be achievable," says Thomas E. Mallouk, the DuPont Professor of Materials Chemistry and Physics. "If this could be realized, water photolysis would provide a clean source of hydrogen fuel from water and sunlight."

Although solar cells can now produce electricity from visible light at efficiencies of greater than 10 percent, solar hydrogen cells – like those developed by Craig Grimes, professor of electrical engineering at Penn State – have been limited by the poor spectral response of the semiconductors used. In principle, molecular light absorbers can use more of the visible spectrum in a process that mimics natural photosynthesis. Photosynthesis uses chlorophyll and other dye molecules to absorb visible light.

So far, experiments with natural and synthetic dye molecules have produced either hydrogen or oxygen using chemicals that are consumed in the process, but they have not yet created an ongoing, continuous cycle. Those processes also generally would cost more than splitting water with electricity. One reason for the difficulty is that once produced, hydrogen and oxygen easily recombine. The catalysts that have been used to study the oxygen and hydrogen half-reactions are also good catalysts for the recombination reaction.

Mallouk and W. Justin Youngblood, postdoctoral fellow in chemistry, together with collaborators at Arizona State University, developed a catalyst system that, combined with a dye, can mimic the electron transfer and water oxidation processes that occur in plants during photosynthesis. They reported the results of their experiments at the annual meeting of the American Association for the Advancement of Science today (Feb. 17) in Boston.

The key to their process is a tiny complex of molecules with a center catalyst of iridium oxide molecules surrounded by orange-red dye molecules. These clusters are about 2 nanometers in diameter with the catalyst and dye components approximately the same size. The researchers chose orange-red dye because it absorbs sunlight in the blue range, which has the most energy. The dye used has also been thoroughly studied in previous artificial photosynthesis experiments.

They space the dye molecules around the central catalyst core, leaving surface area on the catalyst free for the reaction. When visible light strikes the dye, the energy excites electrons in the dye, which, with the help of the catalyst, can split the water molecule, creating free oxygen.

"Each surface iridium atom can cycle through the water oxidation reaction about 50 times per second," says Mallouk. "That is about three orders of magnitude faster than the next best synthetic catalysts, and comparable to the turnover rate of Photosystem II in green plant photosynthesis." Photosystem II is the protein complex in plants that oxidizes water and starts the photosynthetic process.

The researchers impregnated a titanium dioxide electrode with the catalyst complex for the anode and used a platinum cathode. They immersed the electrodes in a salt solution, but separated them from each other to avoid the problem of the hydrogen and oxygen recombining. Light need only shine on the dye-sensitized titanium dioxide anode for the system to work. This type of cell is similar to those that produce electricity, but the addition of the catalyst allows the reaction to split the water into its component gases.

The water splitting requires 1.23 volts, and the current experimental configuration cannot quite achieve that level so the researchers add about 0.3 volts from an outside source. Their current system achieves an efficiency of about 0.3 percent.
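The 1.23-volt figure is the standard thermodynamic minimum for splitting water, and it is worth seeing where it comes from. Here is a quick check using the textbook relation between free energy, charge and voltage (two electrons per hydrogen molecule, F the Faraday constant):

```python
# Free energy corresponding to the 1.23 V minimum: dG = n * F * E.

F = 96485      # Faraday constant, coulombs per mole of electrons
n = 2          # electrons transferred per H2 molecule
E_min = 1.23   # volts (from the article)

dG = n * F * E_min            # joules per mole of H2
print(f"{dG / 1000:.0f} kJ per mole of hydrogen")   # ~237 kJ/mol

# The Penn State cell falls about 0.3 V short of this threshold, which is
# why the researchers top it up from an outside supply; the 0.3 percent
# figure is the fraction of incoming solar energy that ends up as hydrogen.
```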

"Nature is only 1 to 3 percent efficient with photosynthesis," says Mallouk. "Which is why you can not expect the clippings from your lawn to power your house and your car. We would like not to have to use all the land area that is used for agriculture to get the energy we need from solar cells."

The researchers have a variety of approaches to improve the process. They plan to investigate improving the efficiency of the dye, improving the catalyst and adjusting the overall geometry of the system. Rather than spherical dye-catalyst complexes, for example, a geometry that keeps more of the reacting area available to the sun and the reactants might work better.

"At every branch in the process, there is a choice," says Mallouk. "The question is how to get the electrons to stay in the proper path and not, for example, release their energy and go down to ground state without doing any work."

The distance between molecules is important in controlling the rate of electron transfer and getting the electrons where they need to go. By shortening some of the distances and making others longer, more of the electrons would take the proper path and put their energy to work splitting water and producing hydrogen.

Original here

Goldfish memory myth busted

Out of the blue ... it seems our pet fish are smarter than we think

A 15-year-old South Australian school student has busted the myth that goldfish have a three-second memory.

Rory Stokes, from the Australian Science and Mathematics School in Adelaide, conducted an experiment to test the commonly held theory that goldfish have short memory spans.

He was also keen to open people's minds to the cruelty of keeping fish in small tanks.

"We are told that a goldfish has a memory span of less than three seconds and that no matter how small its tank is, it will always discover new places and objects," Rory said.

"I wanted to challenge this theory as I believe it is a myth intended to make us feel less guilty about keeping fish in small tanks."

Rory's experiment involved teaching a small group of fish to swim to a beacon by establishing a memory connection between the beacon and food.

Over a period of three weeks, he placed a beacon in the water at feeding time each day, waited 30 seconds and then sprinkled fish food around the beacon.

The time taken for the fish to swim to the beacon reduced dramatically, from more than one minute for the first few feeds to less than five seconds by the end of the three weeks.

Following the initial three-week period, Rory removed the beacon from the feeding process.

Six days later, he once again placed the beacon in the water and despite not seeing it for almost a week, the fish swam to the beacon in 4.4 seconds, showing they had remembered the association between food and the beacon for at least six days.

"My results strongly showed that goldfish can retain knowledge for at least six days," Rory said.

"They can retain that knowledge indefinitely if they use it regularly."

Rory also conducted a number of other experiments to show goldfish were capable of negotiating a simple maze, by having them move onto a second beacon if they found no food at the first.

"My experiments showed that goldfish have the mental capabilities to learn and remember fairly complex concepts and they can retain that knowledge for at least a number of days," he said.

Australian Science and Mathematics School principal Jim Davies said the series of experiments was an excellent example of science investigation made fun.

Original here

A Solar Grand Plan



  • A massive switch from coal, oil, natural gas and nuclear power plants to solar power plants could supply 69 percent of the U.S.’s electricity and 35 percent of its total energy by 2050.
  • A vast area of photovoltaic cells would have to be erected in the Southwest. Excess daytime energy would be stored as compressed air in underground caverns to be tapped during nighttime hours.
  • Large solar concentrator power plants would be built as well.
  • A new direct-current power transmission backbone would deliver solar electricity across the country.
  • But $420 billion in subsidies from 2011 to 2050 would be required to fund the infrastructure and make it cost-competitive.

—The Editors

High prices for gasoline and home heating oil are here to stay. The U.S. is at war in the Middle East at least in part to protect its foreign oil interests. And as China, India and other nations rapidly increase their demand for fossil fuels, future fighting over energy looms large. In the meantime, power plants that burn coal, oil and natural gas, as well as vehicles everywhere, continue to pour millions of tons of pollutants and greenhouse gases into the atmosphere annually, threatening the planet.

Well-meaning scientists, engineers, economists and politicians have proposed various steps that could slightly reduce fossil-fuel use and emissions. These steps are not enough. The U.S. needs a bold plan to free itself from fossil fuels. Our analysis convinces us that a massive switch to solar power is the logical answer.

Solar energy’s potential is off the chart. The energy in sunlight striking the earth for 40 minutes is equivalent to global energy consumption for a year. The U.S. is lucky to be endowed with a vast resource; at least 250,000 square miles of land in the Southwest alone are suitable for constructing solar power plants, and that land receives more than 4,500 quadrillion British thermal units (Btu) of solar radiation a year. Converting only 2.5 percent of that radiation into electricity would match the nation’s total energy consumption in 2006.
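That last claim checks out from the numbers in the paragraph, plus the 100 quadrillion Btu of 2006 U.S. consumption the authors cite later in the article:

```python
# Numbers from the article: 4,500 quadrillion Btu/yr of solar radiation
# on suitable Southwest land, a 2.5% conversion target, and roughly
# 100 quadrillion Btu of total U.S. energy consumption in 2006.

RADIATION_QUADS = 4500
CONVERSION = 0.025
US_CONSUMPTION_QUADS = 100

captured = RADIATION_QUADS * CONVERSION
print(f"captured: {captured:.0f} quads vs consumed: {US_CONSUMPTION_QUADS} quads")
# 112 quads captured vs ~100 consumed -- the claim holds with room to spare.
```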

To convert the country to solar power, huge tracts of land would have to be covered with photovoltaic panels and solar heating troughs. A direct-current (DC) transmission backbone would also have to be erected to send that energy efficiently across the nation.

The technology is ready. On the following pages we present a grand plan that could provide 69 percent of the U.S.’s electricity and 35 percent of its total energy (which includes transportation) with solar power by 2050. We project that this energy could be sold to consumers at rates equivalent to today’s rates for conventional power sources, about five cents per kilowatt-hour (kWh). If wind, biomass and geothermal sources were also developed, renewable energy could provide 100 percent of the nation’s electricity and 90 percent of its energy by 2100.

The federal government would have to invest more than $400 billion over the next 40 years to complete the 2050 plan. That investment is substantial, but the payoff is greater. Solar plants consume little or no fuel, saving billions of dollars year after year. The infrastructure would displace 300 large coal-fired power plants and 300 more large natural gas plants and all the fuels they consume. The plan would effectively eliminate all imported oil, fundamentally cutting U.S. trade deficits and easing political tension in the Middle East and elsewhere. Because solar technologies are almost pollution-free, the plan would also reduce greenhouse gas emissions from power plants by 1.7 billion tons a year, and another 1.9 billion tons from gasoline vehicles would be displaced by plug-in hybrids refueled by the solar power grid. In 2050 U.S. carbon dioxide emissions would be 62 percent below 2005 levels, putting a major brake on global warming.

Photovoltaic Farms
In the past few years the cost to produce photovoltaic cells and modules has dropped significantly, opening the way for large-scale deployment. Various cell types exist, but the least expensive modules today are thin films made of cadmium telluride. To provide electricity at six cents per kWh by 2020, cadmium telluride modules would have to convert sunlight into electricity with 14 percent efficiency, and systems would have to be installed at $1.20 per watt of capacity. Current modules have 10 percent efficiency and an installed system cost of about $4 per watt. Progress is clearly needed, but the technology is advancing quickly; commercial efficiencies have risen from 9 to 10 percent in the past 12 months. It is worth noting, too, that as modules improve, rooftop photovoltaics will become more cost-competitive for homeowners, reducing daytime electricity demand.
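To see why the $1.20-per-watt target maps onto roughly six cents per kWh, here is a simplified levelized-cost sketch. The installed cost is the article's figure; the capacity factor and financing rate are illustrative assumptions of mine, not the authors':

```python
# Simplified levelized cost for the 2020 photovoltaic target. Installed
# cost is the article's figure; capacity factor and financing rate are
# illustrative assumptions.

INSTALLED_COST = 1.20    # $/W of capacity (article's 2020 target)
CAP_RECOVERY = 0.08      # assumed annual capital recovery factor
CAPACITY_FACTOR = 0.20   # assumed average output vs. nameplate, Southwest sun

annual_cost_per_watt = INSTALLED_COST * CAP_RECOVERY      # $/W per year
annual_kwh_per_watt = CAPACITY_FACTOR * 8760 / 1000       # kWh/W per year

cents = 100 * annual_cost_per_watt / annual_kwh_per_watt
print(f"~{cents:.1f} cents per kWh")   # ~5.5 cents, near the six-cent goal
```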

In our plan, by 2050 photovoltaic technology would provide almost 3,000 gigawatts (GW), or billions of watts, of power. Some 30,000 square miles of photovoltaic arrays would have to be erected. Although this area may sound enormous, installations already in place indicate that the land required for each gigawatt-hour of solar energy produced in the Southwest is less than that needed for a coal-powered plant when factoring in land for coal mining. Studies by the National Renewable Energy Laboratory in Golden, Colo., show that more than enough land in the Southwest is available without requiring use of environmentally sensitive areas, population centers or difficult terrain. Jack Lavelle, a spokesperson for Arizona’s Department of Water Conservation, has noted that more than 80 percent of his state’s land is not privately owned and that Arizona is very interested in developing its solar potential. The benign nature of photovoltaic plants (including no water consumption) should keep environmental concerns to a minimum.

The main progress required, then, is to raise module efficiency to 14 percent. Although the efficiencies of commercial modules will never reach those of solar cells in the laboratory, cadmium telluride cells at the National Renewable Energy Laboratory are now up to 16.5 percent and rising. At least one manufacturer, First Solar in Perrysburg, Ohio, increased module efficiency from 6 to 10 percent from 2005 to 2007 and is reaching for 11.5 percent by 2010.

Pressurized Caverns
The great limiting factor of solar power, of course, is that it generates little electricity when skies are cloudy and none at night. Excess power must therefore be produced during sunny hours and stored for use during dark hours. Most energy storage systems such as batteries are expensive or inefficient.

Compressed-air energy storage has emerged as a successful alternative. Electricity from photovoltaic plants compresses air and pumps it into vacant underground caverns, abandoned mines, aquifers and depleted natural gas wells. The pressurized air is released on demand to turn a turbine that generates electricity, aided by burning small amounts of natural gas. Compressed-air energy storage plants have been operating reliably in Huntorf, Germany, since 1978 and in McIntosh, Ala., since 1991. The turbines burn only 40 percent of the natural gas they would if they were fueled by natural gas alone, and better heat recovery technology would lower that figure to 30 percent.

Studies by the Electric Power Research Institute in Palo Alto, Calif., indicate that the cost of compressed-air energy storage today is about half that of lead-acid batteries. The research indicates that these facilities would add three or four cents per kWh to photovoltaic generation, bringing the total 2020 cost to eight or nine cents per kWh.

Electricity from photovoltaic farms in the Southwest would be sent over high-voltage DC transmission lines to compressed-air storage facilities throughout the country, where turbines would generate electricity year-round. The key is to find adequate sites. Mapping by the natural gas industry and the Electric Power Research Institute shows that suitable geologic formations exist in 75 percent of the country, often close to metropolitan areas. Indeed, a compressed-air energy storage system would look similar to the U.S. natural gas storage system. The industry stores eight trillion cubic feet of gas in 400 underground reservoirs. By 2050 our plan would require 535 billion cubic feet of storage, with air pressurized at 1,100 pounds per square inch. Although development will be a challenge, plenty of reservoirs are available, and it would be reasonable for the natural gas industry to invest in such a network.

Hot Salt
Another technology that would supply perhaps one fifth of the solar energy in our vision is known as concentrated solar power. In this design, long, metallic mirrors focus sunlight onto a pipe filled with fluid, heating the fluid like a huge magnifying glass might. The hot fluid runs through a heat exchanger, producing steam that turns a turbine.

For energy storage, the pipes run into a large, insulated tank filled with molten salt, which retains heat efficiently. Heat is extracted at night, creating steam. The molten salt does slowly cool, however, so the energy stored must be tapped within a day.

Nine concentrated solar power plants with a total capacity of 354 megawatts (MW) have been generating electricity reliably for years in the U.S. A new 64-MW plant in Nevada came online in March 2007. These plants, however, do not have heat storage. The first commercial installation to incorporate it—a 50-MW plant with seven hours of molten salt storage—is being constructed in Spain, and others are being designed around the world. For our plan, 16 hours of storage would be needed so that electricity could be generated 24 hours a day.

Existing plants prove that concentrated solar power is practical, but costs must decrease. Economies of scale and continued research would help. In 2006 a report by the Solar Task Force of the Western Governors’ Association concluded that concentrated solar power could provide electricity at 10 cents per kWh or less by 2015 if 4 GW of plants were constructed. Finding ways to boost the temperature of heat exchanger fluids would raise operating efficiency, too. Engineers are also investigating how to use molten salt itself as the heat-transfer fluid, reducing heat losses as well as capital costs. Salt is corrosive, however, so more resilient piping systems are needed.

Concentrated solar power and photovoltaics represent two different technology paths. Neither is fully developed, so our plan brings them both to large-scale deployment by 2020, giving them time to mature. Various combinations of solar technologies might also evolve to meet demand economically. As installations expand, engineers and accountants can evaluate the pros and cons, and investors may decide to support one technology more than another.

Direct Current, Too
The geography of solar power is obviously different from the nation’s current supply scheme. Today coal, oil, natural gas and nuclear power plants dot the landscape, built relatively close to where power is needed. Most of the country’s solar generation would stand in the Southwest. The existing system of alternating-current (AC) power lines is not robust enough to carry power from these centers to consumers everywhere and would lose too much energy over long hauls. A new high-voltage, direct-current (HVDC) power transmission backbone would have to be built.

Studies by Oak Ridge National Laboratory indicate that long-distance HVDC lines lose far less energy than AC lines do over equivalent spans. The backbone would radiate from the Southwest toward the nation’s borders. The lines would terminate at converter stations where the power would be switched to AC and sent along existing regional transmission lines that supply customers.

The AC system is also simply out of capacity, leading to noted shortages in California and other regions; DC lines are cheaper to build and require less land area than equivalent AC lines. About 500 miles of HVDC lines operate in the U.S. today and have proved reliable and efficient. No major technical advances seem to be needed, but more experience would help refine operations. The Southwest Power Pool of Texas is designing an integrated system of DC and AC transmission to enable development of 10 GW of wind power in western Texas. And TransCanada, Inc., is proposing 2,200 miles of HVDC lines to carry wind energy from Montana and Wyoming south to Las Vegas and beyond.

Stage One: Present to 2020
We have given considerable thought to how the solar grand plan can be deployed. We foresee two distinct stages. The first, from now until 2020, must make solar competitive at the mass-production level. This stage will require the government to guarantee 30-year loans, agree to purchase power and provide price-support subsidies. The annual aid package would rise steadily from 2011 to 2020. At that time, the solar technologies would compete on their own merits. The cumulative subsidy would total $420 billion (we will explain later how to pay this bill).

About 84 GW of photovoltaics and concentrated solar power plants would be built by 2020. In parallel, the DC transmission system would be laid. It would expand via existing rights-of-way along interstate highway corridors, minimizing land-acquisition and regulatory hurdles. This backbone would reach major markets in Phoenix, Las Vegas, Los Angeles and San Diego to the west and San Antonio, Dallas, Houston, New Orleans, Birmingham, Ala., Tampa, Fla., and Atlanta to the east.

Building 1.5 GW of photovoltaics and 1.5 GW of concentrated solar power annually in the first five years would stimulate many manufacturers to scale up. In the next five years, annual construction would rise to 5 GW apiece, helping firms optimize production lines. As a result, solar electricity would fall toward six cents per kWh. This implementation schedule is realistic; more than 5 GW of nuclear power plants were built in the U.S. each year from 1972 to 1987. What is more, solar systems can be manufactured and installed at much faster rates than conventional power plants because of their straightforward design and relative lack of environmental and safety complications.

Stage Two: 2020 to 2050
It is paramount that major market incentives remain in effect through 2020, to set the stage for self-sustained growth thereafter. In extending our model to 2050, we have been conservative. We do not include any technological or cost improvements beyond 2020. We also assume that energy demand will grow nationally by 1 percent a year. In this scenario, by 2050 solar power plants will supply 69 percent of U.S. electricity and 35 percent of total U.S. energy. This quantity includes enough to supply all the electricity consumed by 344 million plug-in hybrid vehicles, which would displace their gasoline counterparts, key to reducing dependence on foreign oil and to mitigating greenhouse gas emissions. Some three million new domestic jobs—notably in manufacturing solar components—would be created, which is several times the number of U.S. jobs that would be lost in the then dwindling fossil-fuel industries.

The huge reduction in imported oil would lower trade balance payments by $300 billion a year, assuming a crude oil price of $60 a barrel (average prices were higher in 2007). Once solar power plants are installed, they must be maintained and repaired, but the price of sunlight is forever free, duplicating those fuel savings year after year. Moreover, the solar investment would enhance national energy security, reduce financial burdens on the military, and greatly decrease the societal costs of pollution and global warming, from human health problems to the ruining of coastlines and farmlands.

Ironically, the solar grand plan would lower energy consumption. Even with 1 percent annual growth in demand, the 100 quadrillion Btu consumed in 2006 would fall to 93 quadrillion Btu by 2050. This unusual offset arises because a good deal of energy is consumed to extract and process fossil fuels, and more is wasted in burning them and controlling their emissions.

To meet the 2050 projection, 46,000 square miles of land would be needed for photovoltaic and concentrated solar power installations. That area is large, and yet it covers just 19 percent of the suitable Southwest land. Most of that land is barren; there is no competing use value. And the land will not be polluted. We have assumed that only 10 percent of the solar capacity in 2050 will come from distributed photovoltaic installations—those on rooftops or commercial lots throughout the country. But as prices drop, these applications could play a bigger role.

2050 and Beyond
Although it is not possible to project with any exactitude 50 or more years into the future, as an exercise to demonstrate the full potential of solar energy we constructed a scenario for 2100. By that time, based on our plan, total energy demand (including transportation) is projected to be 140 quadrillion Btu, with seven times today’s electric generating capacity.

To be conservative, again, we estimated how much solar plant capacity would be needed under the historical worst-case solar radiation conditions for the Southwest, which occurred during the winter of 1982–1983 and in 1992 and 1993 following the Mount Pinatubo eruption, according to National Solar Radiation Data Base records from 1961 to 2005. And again, we did not assume any further technological and cost improvements beyond 2020, even though it is nearly certain that in 80 years ongoing research would improve solar efficiency, cost and storage.

Under these assumptions, U.S. energy demand could be fulfilled with the following capacities: 2.9 terawatts (TW) of photovoltaic power going directly to the grid and another 7.5 TW dedicated to compressed-air storage; 2.3 TW of concentrated solar power plants; and 1.3 TW of distributed photovoltaic installations. Supply would be rounded out with 1 TW of wind farms, 0.2 TW of geothermal power plants and 0.25 TW of biomass-based production for fuels. The model includes 0.5 TW of geothermal heat pumps for direct building heating and cooling. The solar systems would require 165,000 square miles of land, still less than the suitable available area in the Southwest.

In 2100 this renewable portfolio could generate 100 percent of all U.S. electricity and more than 90 percent of total U.S. energy. In the spring and summer, the solar infrastructure would produce enough hydrogen to meet more than 90 percent of all transportation fuel demand and would replace the small natural gas supply used to aid compressed-air turbines. Adding 48 billion gallons of biofuel would cover the rest of transportation energy. Energy-related carbon dioxide emissions would be reduced 92 percent below 2005 levels.

Who Pays?
Our model is not an austerity plan, because it includes a 1 percent annual increase in demand, which would sustain lifestyles similar to those today with expected efficiency improvements in energy generation and use. Perhaps the biggest question is how to pay for a $420-billion overhaul of the nation’s energy infrastructure. One of the most common ideas is a carbon tax. The International Energy Agency suggests that a carbon tax of $40 to $90 per ton of coal will be required to induce electricity generators to adopt carbon capture and storage systems to reduce carbon dioxide emissions. This tax is equivalent to raising the price of electricity by one to two cents per kWh. But our plan is less expensive. The $420 billion could be generated with a carbon tax of 0.5 cent per kWh. Given that electricity today generally sells for six to ten cents per kWh, adding 0.5 cent per kWh seems reasonable.
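
The arithmetic behind that 0.5-cent figure is easy to reproduce. A rough sketch, where the tax rate and $420-billion target come from the article but annual U.S. electricity sales (about 4 trillion kWh) is an outside assumption used only to illustrate the timescale:

    # How long a 0.5 cent/kWh surcharge takes to raise $420 billion.
    tax_per_kwh = 0.005          # dollars, from the article
    annual_sales_kwh = 4e12      # assumed, roughly recent U.S. consumption
    target = 420e9               # dollars, from the article

    annual_revenue = tax_per_kwh * annual_sales_kwh
    print(f"Annual revenue: ${annual_revenue / 1e9:.0f} billion")   # ~$20 billion
    print(f"Years to reach target: {target / annual_revenue:.0f}")  # ~21

That pace fits comfortably within the 2011 to 2050 subsidy window described below.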

Congress could establish the financial incentives by adopting a national renewable energy plan. Consider the U.S. Farm Price Support program, which has been justified in terms of national security. A solar price support program would secure the nation’s energy future, vital to the country’s long-term health. Subsidies would be gradually deployed from 2011 to 2020. With a standard 30-year payoff interval, the subsidies would end from 2041 to 2050. The HVDC transmission companies would not have to be subsidized, because they would finance construction of lines and converter stations just as they now finance AC lines, earning revenues by delivering electricity.

Although $420 billion is substantial, the annual expense would be less than the current U.S. Farm Price Support program. It is also less than the tax subsidies that have been granted to build the country’s high-speed telecommunications infrastructure over the past 35 years. And it frees the U.S. from policy and budget issues driven by international energy conflicts.

Without subsidies, the solar grand plan is impossible. Other countries have reached similar conclusions: Japan is already building a large, subsidized solar infrastructure, and Germany has embarked on a nationwide program. Although the investment is high, it is important to remember that the energy source, sunlight, is free. There are no annual fuel or pollution-control costs like those for coal, oil or nuclear power, and only a slight cost for natural gas in compressed-air systems, although hydrogen or biofuels could displace that, too. When fuel savings are factored in, the cost of solar would be a bargain in coming decades. But we cannot wait until then to begin scaling up.

Critics have raised other concerns, such as whether material constraints could stifle large-scale installation. With rapid deployment, temporary shortages are possible. But several types of cells exist that use different material combinations. Better processing and recycling are also reducing the amount of materials that cells require. And in the long term, old solar cells can largely be recycled into new solar cells, changing our energy supply picture from depletable fuels to recyclable materials.

The greatest obstacle to implementing a renewable U.S. energy system is not technology or money, however. It is the lack of public awareness that solar power is a practical alternative—and one that can fuel transportation as well. Forward-looking thinkers should try to inspire U.S. citizens, and their political and scientific leaders, about solar power’s incredible potential. Once Americans realize that potential, we believe the desire for energy self-sufficiency and the need to reduce carbon dioxide emissions will prompt them to adopt a national solar plan. 

Original here

Lawmakers: Quit flushing into Atlantic

State lawmakers this week will begin reviewing a timetable for Miami-Dade and Broward to end decades of discharging wastewater into the ocean.


In Southeast Florida, a lot of what gets flushed winds up where people fish and sometimes swim.

Every day, six plants in Miami-Dade, Broward and south Palm Beach counties pump about 300 million gallons of sewage into the Atlantic Ocean. The brew is screened of its foulest components but remains nutrient-rich, not even clean enough to sprinkle on a lawn.

State regulators, with support from Gov. Charlie Crist and a key state Senate panel, are stepping up a push to phase out a practice that environmentalists, divers and some scientists believe has tainted reefs, marine life and beaches.

Draft legislation, to be reviewed in a Senate environmental committee hearing in Tallahassee on Tuesday, would give the only three Florida counties that dump sewage into the ocean a decade to upgrade wastewater plants from minimal to advanced treatment. If approved, it would end daily discharges, aside from limited backup use, by 2025.

T.J. Marshall, the Miami Beach-based coordinator for the Florida Coastal and Ocean Coalition, praised the proposal and said environmentalists would support a 10-year wait for cleaner outfall flows if it can end decades of damage.

"In my lifetime, we've pumped enough sewage offshore to fill Lake Okeechobee twice," he said. "It's probably the biggest environmental disaster in Florida, but because nobody sees it, nothing's been done."
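
Marshall's comparison is easy to sanity-check. In the sketch below, the 300-million-gallon daily rate comes from the article, while Lake Okeechobee's volume (taken as roughly a trillion gallons) is an outside assumption used only for scale:

    # Order-of-magnitude check on the "two Lake Okeechobees" claim.
    daily_gal = 300e6                # gallons per day, from the article
    annual_gal = daily_gal * 365
    print(f"Per year: {annual_gal / 1e9:.0f} billion gallons")       # ~110

    lake_gal = 1e12                  # assumed volume of Lake Okeechobee
    print(f"Years to fill one 'lake': {lake_gal / annual_gal:.0f}")  # ~9

At today's rate, two lake-size volumes would accumulate in under two decades, so the claim is plausible over a lifetime even though discharge volumes were smaller in earlier years.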

SHOT AT PASSING

Sen. Burt Saunders, a Naples Republican who chairs the committee, believes the proposal has a solid shot at passing -- despite the counties' continuing concerns about inconclusive studies and a $3 billion price tag to overhaul six plants.

"There is some question as to the amount of environmental degradation caused by this. There is no question there is some," Saunders said. "Regardless of the environmental impact, it seems to me that if you have 300 million gallons a day, when we get into these kinds of droughts that we have now, that resource would be valuable."

Miami-Dade, Broward and Hollywood -- which operates one of the two Broward regional plants -- have balked at initial proposals from the Florida Department of Environmental Protection, opposing a shutdown of pipelines as too costly and demanding more data on the discharges' impact on marine life.

CONCERNS REMAIN

The latest proposal hasn't eased the counties' concerns. It would ban any new pipelines statewide, cap existing outfalls at current levels, and stipulate a 2018 deadline for installing advanced treatment systems -- seven years before closing off the pipes for regular use.

Alan Garcia, director of water and wastewater services for Broward County, said the price tag only seems to have gone up.

"It's really taken our scenario from $900 million to well over a billion, probably $1.2 billion," he said. "Now we're dealing with two separate processes at the same time."

Doug Yoder, deputy director of the Miami-Dade Water and Sewer Department, said the county questioned the costs, timing and limits on ocean discharge that could leave sewage plants overwhelmed during "peak flow" events -- such as during heavy rainstorms, when wastewater volumes can quickly triple.

And neither Miami-Dade nor Broward has figured out how to recycle such vast volumes of wastewater, decisions that could dictate whether expensive advanced treatment systems were even needed. Yoder said Miami-Dade, which already is committed to $1.4 billion in projects to reuse 40 percent of its wastewater over the next 20 years, would need to spend as much as $2 billion more to meet the state's demands, an expense that could double consumer water bills already expected to rise.

"Logically, it seems like you need to make a decision about how you're going to reuse the water before you decide on the treatment," Yoder said. "We don't actually have an identified need for this additional water now."

For Miami-Dade, supplying the 70 million to 90 million gallons a day that Florida Power & Light's two proposed nuclear reactors at Turkey Point would require is one option. But in Broward, which is largely built out, the options for reusing water are limited, Garcia said.

DEEP-WELL INJECTION

Garcia said the state should study potential effects of closing the outfalls. Broward, for example, might have to expand its use of deep-well injection, which pumps treated sewage underground -- an approach that some environmentalists oppose as a threat to drinking-water supplies.

Garcia cited an ongoing federal study that found that only 4 percent of the sewage from the ocean pipes is carried back over shallower reefs near shore. The sewage, he said, is almost instantly diluted and carried north in Gulf Stream currents. The outfalls empty from one to three miles offshore in 90 to 100 feet of water.

But environmental groups and some scientists have produced research showing higher concentrations of ammonia, nitrogen and other pollutants than the state has estimated.

They also point to direct impacts on corals: algae blooms that have decimated some Palm Beach reefs, as well as increased disease and other maladies. Many beachgoers also have blamed the pipes for periodic closures caused by high levels of waste bacteria, although no study has confirmed that link.

'WEIGHT OF EVIDENCE'

The Department of Environmental Protection, in a report on the proposal, doesn't claim a direct link between sewage and reef damage but says the "weight of the evidence ... calls into question the environmental acceptability."

Sarah Williams, a DEP spokeswoman, said "we feel like even without data to show one way or another, there is a need for this water."

If properly treated, the water could be used for recharging groundwater supplies, irrigation, industrial use, preventing salt-water intrusion or even replenishing wetlands, she said. The treatment technology already is operating elsewhere in Florida, she said, and the state will offer counties loan programs to help bankroll projects.

While it sounds expensive, Williams said, the counties would have to expand water and sewage systems anyway as populations grow. And the 300 million gallons of sewage, she said, happens to be exactly the amount of additional regional demand for water predicted by 2025.

Original here