

Tuesday, March 24, 2009

Targeting Bacterial Biofilm


The brown tube sponge Agelas conifera generates a compound that breaks up biofilms.

In the arms race between humans and bacteria, the ability to form "biofilms" — large aggregations of microbes embedded in a slimy matrix — has been one of the weapons the organisms use to defeat the immune system, antibiotic drugs and other threats. But scientists, who only recently recognized the role that biofilms play in antibiotic resistance, may now be closing in on promising prospects for defeating pathogens.

Scientists have learned that bacteria are vulnerable when floating around as individual cells in their "planktonic state." But they are much tougher to combat once they get established in a suitable place — whether the hull of a ship or inside the lungs — and come together in tightly bound biofilms. In that state, they can activate mechanisms like tiny pumps to expel antibiotics, share genes that confer protection against drugs, slow down their metabolism or become dormant, making them harder to kill.

The answer, say researchers, is to find substances that will break up biofilms.

"Since the time of Pasteur, we've been working on trying to kill off and control planktonic bacteria, but we've made very little progress in the control and understanding of biofilm bacteria," said David Davies, a biofilm expert at the State University of New York at Binghamton. "Now we're very good at getting rid of acute bacterial infections, which used to be a real scourge of mankind, but we have this incredible number of chronic debilitating bacterial infections" often linked to biofilms.

Notorious biofilm infections come from the bacterium Pseudomonas aeruginosa, which often affects lungs and can debilitate and kill cystic fibrosis sufferers, and methicillin-resistant Staphylococcus aureus (MRSA), which can spread quickly through prisons, hospitals and even beaches. Acinetobacter baumannii infections, which plague wounded soldiers, are also probably caused by biofilms, as are more mundane afflictions such as sinusitis and ear infections.

A successful means of dispersing biofilms, Davies said, would be a medical breakthrough akin to the discovery of penicillin in 1928.

The March 2009 issue of the Journal of Bacteriology features Davies' research on forcing biofilm dispersion by using bacteria's own chemical signals against them. Biofilm colonies disperse naturally in response to environmental factors or to spread and form new colonies. Davies and his colleagues have discovered a chemical signal, in the form of a fatty acid, that tells bacteria it is time to break up.

He hopes this naturally occurring molecule, cis-2-decenoic acid or CDA, which is approved by the Food and Drug Administration as a food additive, could be used to fight infection. Because it does not kill bacteria, he says, it should not trigger the development of resistant bacteria, which could happen through natural selection if the chemical killed its targets.

At North Carolina State University in Raleigh, two chemistry professors think they have found a potential key to biofilm dispersion in the oceans, which scientists are mining for a variety of new drugs. When John Cavanagh and Christian Melander saw photos of the sea sponge Agelas conifera looking clean and healthy on a coral reef smothered by bacterial biofilms, they had a eureka moment.

"We were looking at that and said this sponge probably has it figured out," Cavanagh said. "It has no immune system, but it's found a way to defend itself against all the biofilms in the ocean, where there is a lot of nasty stuff floating around."

Melander said "a throwaway sentence in an obscure journal" — the Bulletin of the Chemical Society of Japan — gave them another clue. They isolated a compound from the sponge that disperses biofilms and figured out how to synthesize it quickly and cheaply.

The professors said that in laboratory tests, the compound, paired with an antibiotic, has effectively dispersed and killed previously antibiotic-resistant forms of MRSA, A. baumannii and other bacteria, though the scientists do not know how it works.

Though ultimately they hope to pair it with antibiotics to be taken orally, first Melander and Cavanagh plan to impregnate the compound in implanted medical devices that are prone to bacterial contamination, such as catheters, stents and artificial limbs.


Similar projects are in the works at other labs worldwide.


"In the last 15 years or so, we've really seen things take off. There will be lots of novel technologies coming out," said Rodney Donlan, team leader of the biofilm lab at the Centers for Disease Control and Prevention. The agency is experimenting with using phages — viruses that can kill bacterial cells — to prevent biofilm formation on medical devices.

The Canadian company Kane Biotech plans this year to submit an application to the FDA for a wound gel containing DispersinB, a natural enzyme found in human mouths that disperses biofilms. The enzyme is nontoxic but makes biofilms susceptible to antibiotics and immune responses.

"The world is now turning their attention to the fact we just can't keep developing more and more drugs," said Gord Froehlich, Kane Biotech president and chief executive. "We have to look at how the bacteria actually live and survive rather than just shooting more bullets."

University of Florida molecular biologist Tony Romeo describes the research as still in its nascent stages and said that discovering exactly why and how biofilms form is crucial.

"By understanding the factors that are needed for biofilms to develop, we hope to identify chinks in the armor that can lead to novel ways to treat or prevent such kinds of infections," Romeo said.


Brain On A Chip?

How does the human brain run itself without any software? Find that out, say European researchers, and a whole new field of neural computing will open up. A prototype ‘brain on a chip’ is already working.

“We know that the brain has amazing computational capabilities,” remarks Karlheinz Meier, a physicist at Heidelberg University. “Clearly there is something to learn from biology. I believe that the systems we are going to develop could form part of a new revolution in information technology.”

It’s a strong claim, but Meier is coordinating the EU-supported FACETS project which brings together scientists from 15 institutions in seven countries to do just that. Inspired by research in neuroscience, they are building a ‘neural’ computer that will work just like the brain but on a much smaller scale.

The human brain is often likened to a computer, but it differs from everyday computers in three important ways: it consumes very little power, it works well even if components fail, and it seems to work without any software.

How does it do that? Nobody yet knows, but a team within FACETS is completing an exhaustive study of brain cells – neurons – to find out exactly how they work, how they connect to each other and how the network can ‘learn’ to do new things.

Mapping brain cells

“We are now in a situation like molecular biology was a few years ago when people started to map the human genome and make the data available,” Meier says. “Our colleagues are recording data from neural tissues describing the neurons and synapses and their connectivity. This is being done almost on an industrial scale, recording data from many, many neural cells and putting them in databases.”

Meanwhile, another FACETS group is developing simplified mathematical models that will accurately describe the complex behaviour that is being uncovered. Although the neurons could be modelled in detail, they would be far too complicated to implement either in software or hardware.

The goal is to use these models to build a ‘neural computer’ which emulates the brain. The first effort is a network of 300 neurons and half a million synapses on a single chip. The team used analogue electronics to represent the neurons and digital electronics to represent communications between them. It’s a unique combination.

Since the neurons are so small, the system runs 100,000 times faster than the biological equivalent and 10 million times faster than a software simulation. “We can simulate a day in one second,” Meier notes.
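The quoted speed figures are internally consistent, as a quick back-of-the-envelope check shows. A minimal sketch, using only the ratios stated above (100,000x faster than biology, 10 million times faster than software); none of these are hardware specifications beyond what the article reports:

```python
# Rough arithmetic check of the speed-up figures quoted above.
SECONDS_PER_DAY = 24 * 60 * 60        # 86,400 s of biological time

speedup_vs_biology = 100_000          # hardware vs. real neurons (quoted)
speedup_vs_software = 10_000_000      # hardware vs. software simulation (quoted)

# Wall-clock time for the chip to emulate one biological day:
chip_seconds = SECONDS_PER_DAY / speedup_vs_biology

# Implied speed of a software simulation relative to biology:
software_vs_biology = speedup_vs_biology / speedup_vs_software  # 0.01x real time
software_seconds = SECONDS_PER_DAY / software_vs_biology

print(f"chip: {chip_seconds:.2f} s per biological day")      # ~0.86 s
print(f"software: {software_seconds / SECONDS_PER_DAY:.0f} days per biological day")
```

At the quoted ratios, the chip compresses a day of neural activity into under a second, consistent with Meier's "a day in one second", while a pure software simulation of the same day would take on the order of a hundred days of computing.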

The network is already being used by FACETS researchers to do experiments over the internet without needing to travel to Heidelberg.

New type of computing

But this ‘stage 1’ network was designed before the results came in from the mapping and modelling work. Now the team are working on stage 2, a network of 200,000 neurons and 50 million synapses that will incorporate all the neuroscience discoveries made so far.

To build it, the team is creating its network on a single 20cm silicon disk, a ‘wafer’, of the type normally used to mass-produce chips before they are cut out of the wafer and packaged. This approach will make for a more compact device.

So-called ‘wafer-scale integration’ has not been used much before, because such a large circuit will certainly contain manufacturing flaws. “Our chips will have faults but they are each likely to affect only a single synapse or a single connection in the network,” Meier points out. “We can easily live with that. So we exploit the fault tolerance and use the entire wafer as a neural network.”

How could we use a neural computer? Meier stresses that digital computers are built on principles that simply do not apply to devices modelled on the brain. To make them work requires a completely new theory of computing. Yet another FACETS group is already on the case. “Once you understand the basic principles you may hope to develop the hardware further, because biology has not necessarily found the best solution.”

Beyond the brain?

Practical neural computers could be only five years away. “The first step could be a little add-on to your computer at home, a device to handle very complex input data and to provide a simple decision,” Meier says. “A typical thing could be an internet search.”

In the longer term, he sees applications for neural computers wherever there are complex and difficult decisions to be made. Companies could use them, for example, to explore the consequences of critical business decisions before they are taken. In today’s gloomy economic climate, many companies will wish they already had one!

The FACETS project, which is supported by the EU’s Sixth Framework Programme for research, is due to end in August 2009 but the partners have agreed to continue working together for another year. They eventually hope to secure a follow-on project with support from both the European Commission and national agencies.

Meanwhile, the consortium has just obtained funding from the EU’s Marie Curie initiative to set up a four-year Initial Training Network to train PhD students in the interdisciplinary skills needed for research in this area.

Where could this go? Meier points out that neural computing, with its low-power demands and tolerance of faults, may make it possible to reduce components to molecular size. “We may then be able to make computing devices which are radically different and have amazing performance which, at some point, may approach the performance of the human brain – or even go beyond it!”


Water Acts As Catalyst In Explosives

Simulations of the detonation of a high explosive show that 'extreme' water (molecules with one red oxygen atom and two white hydrogen atoms) can act as a chemical catalyst that promotes the transport of oxygen between reactive sites. (Credit: Image courtesy of DOE/Lawrence Livermore National Laboratory)

The most abundant material on Earth exhibits some unusual chemical properties when placed under extreme conditions.

Lawrence Livermore National Laboratory scientists have shown that water, in hot dense environments, plays an unexpected role in catalyzing complex explosive reactions. A catalyst is a compound that speeds chemical reactions without being consumed. Platinum and enzymes are common catalysts. But water rarely, if ever, acts as a catalyst under ordinary conditions.

Detonations of high explosives made up of oxygen and hydrogen produce water at thousands of degrees Kelvin and up to 100,000 atmospheres of pressure, similar to conditions in the interiors of giant planets.

While the properties of pure water at high pressures and temperatures have been studied for years, this extreme water in a reactive environment has never been studied. Until now.

Using first-principle atomistic simulations of the detonation of the high explosive PETN (pentaerythritol tetranitrate), the team discovered that in water, when one hydrogen ion serves as a reducer and the hydroxide (OH) ion serves as an oxidizer, the ions act as a dynamic team that transports oxygen between reaction centers.

"This was news to us," said lead researcher Christine Wu. "This suggests that water also may catalyze reactions in other explosives and in planetary interiors."

This finding is contrary to the current view that water is simply a stable detonation product.

"Under extreme conditions, water is chemically peculiar because of its frequent dissociations," Wu said. "As you compress it to the conditions you'd find in the interior of a planet, the hydrogen of a water molecule starts to move around very fast."

In the molecular dynamic simulations using the Lab's BlueGene L supercomputer, Wu and colleagues Larry Fried, Lin Yang, Nir Goldman and Sorin Bastea found that the hydrogen (H) and hydroxide (OH) ions in water transport oxygen from nitrogen storage to carbon fuel under PETN detonation conditions (temperatures between 3,000 Kelvin and 4,200 Kelvin). Under both temperature conditions, this "extreme water" served both as an end product and as a key chemical catalyst.

For a molecular high explosive that is made up of carbon, nitrogen, oxygen and hydrogen, such as PETN, the three major gaseous products are water, carbon dioxide and molecular nitrogen.

But to date, the chemical processes leading to these stable compounds are not well understood.

The team found that nitrogen loses its oxygen mostly to hydrogen, not to carbon, even after the concentration of water reaches equilibrium. They also found that carbon atoms capture oxygen mostly from hydroxide, rather than directly from nitrogen monoxide (NO) or nitrogen dioxide (NO2). Meanwhile, water frequently dissociates into hydrogen and hydroxide and recombines.

"The water that comes out is part of the energy release mechanism," Wu said. "This catalytic mechanism is completely different from previously proposed decomposition mechanisms for PETN or similar explosives, in which water is just an end product. This new discovery could have implications for scientists studying the interiors of Uranus and Neptune where water is in an extreme form."


Coming Clean on Household Cleaners

By Julie Scelfo

S.C. Johnson, maker of Windex and other cleaning products, announced that it would begin disclosing ingredients. (The New York Times)

Consumers who want to know what’s in the household cleaning products they purchase have long been frustrated: most cleaning product manufacturers do not disclose product ingredients, and no federal law requires them to do so.

So Green Inc. took notice last month when Earthjustice, a nonprofit public interest law firm, asked a New York State judge to force several major manufacturers to file reports listing all ingredients — including those that are potentially toxic — with the New York State Department of Environmental Conservation, in accordance with what they say is a long-forgotten law that has been on the books since the 1970s.

Among the companies named as targets in the suit: Procter & Gamble, Church & Dwight Company, Reckitt Benckiser, and Colgate-Palmolive.

Conspicuously missing among the companies named: S.C. Johnson & Son, the maker of Windex, Glade, Drano and other household products. Nonetheless, the company announced last week that it would begin disclosing all ingredients for home cleaning and air care products sold in the United States.

“Today’s families want to know what’s in the household cleaning and air freshening products they use in their homes,” Jennifer A. Taylor, an S.C. Johnson spokeswoman, said. “Ingredient communication extends the company’s long history of doing what’s right for people and the planet.”

Concurrent with that announcement, S.C. Johnson, which has more than $8 billion in annual sales, said it is in the process of phasing out the use of DEP, a plasticizer found in fragrances and belonging to a class of chemicals known as phthalates, which have been linked to health risks in animals and possibly humans.

Although Procter & Gamble and Church & Dwight did not immediately respond to requests for comment, Colgate-Palmolive indicated that it embraced the Soap and Detergent Association’s voluntary “Consumer Product Ingredient Communication Initiative” that was announced last November.

That initiative calls for providing information on all ingredients except dyes, fragrances and preservatives and anything deemed “incidental” and having “no technical or functional effect in the product.”

Environmental advocates say the difference between what S.C. Johnson is doing and the industry plan is significant.

“Those three categories contain some of the cleaning ingredients that we’re most concerned about in terms of potential risks to human health,” said Sonya Lunder, a senior scientist with the Environmental Working Group. “We support what S.C. Johnson is doing and hope that more companies follow suit.”

Brian Sansoni, a spokesman for the Soap and Detergent Association, said he could not speculate as to whether any individual companies would do so. As for the association’s views on the recent move by S.C. Johnson, one of its members, Mr. Sansoni said this in an e-mail message: “S.D.A. is certainly very pleased that our members continue to provide more information than ever before on the safety of cleaning products and their ingredients.”


Fossil Fragments Reveal 500-million-year-old Monster Predator

Reconstruction of Hurdia victoria. (Credit: Illustration: Marianne Collins)

Hurdia victoria was originally described in 1912 as a crustacean-like animal. Now, researchers from Uppsala University and colleagues reveal it to be just one part of a complex and remarkable new animal that has an important story to tell about the origin of the largest group of living animals, the arthropods.

The fossil fragments pieced together come from the famous 505-million-year-old Burgess Shale, a UNESCO World Heritage Site in British Columbia, Canada. Uppsala researchers Allison Daley and Graham Budd at the Department of Earth Sciences, together with colleagues in Canada and Britain, describe the convoluted history and unique body construction of the newly-reconstructed Hurdia victoria, which would have been a formidable predator in its time.

Although the first fragments were described nearly one hundred years ago, they were assumed to be part of a crustacean-like animal. It was not then realised that other parts of the animal were also in collections, but had been described independently as jellyfish, sea cucumbers and other arthropods. However, collecting expeditions in the 1990s uncovered more complete specimens and hundreds of isolated pieces that led to the first hints that Hurdia was more than it seemed. The last piece of the puzzle was found when the best-preserved specimen turned up in the old collections at the Smithsonian National Museum of Natural History, Washington DC. This specimen was first classified as an arthropod in the 1970s and 80s, and then as an unusual specimen of the famous monster predator Anomalocaris.

The new description of Hurdia shows that it is indeed related to Anomalocaris. Like Anomalocaris, Hurdia had a segmented body with a head bearing a pair of spinous claws and a circular jaw structure with many teeth. But it differs from Anomalocaris by the possession of a huge three-part carapace that projects out from the front of the animal's head.

"This structure is unlike anything seen in other fossil or living arthropods," says Ph.D. student Allison Daley, who has been studying the fossils for three years as part of her doctoral thesis.

"The use of the large carapace extending from the front of its head is a mystery. In many animals, a shell or carapace is used to protect the soft-parts of the body, as you would see in a crab or lobster, but this structure in Hurdia is empty and does not cover or protect the rest of the body. We can only guess at what its function might have been."

Hurdia and Anomalocaris are both early offshoots of the evolutionary lineage that led to the arthropods, the large modern group that contains the insects, crustaceans, spiders, millipedes and centipedes. They reveal details of the origins of important features that define the modern arthropods such as their head structures and limbs. As well as its bizarre frontal carapace, Hurdia reveals exquisite details of the gills associated with the body, some of the best preserved in the fossil record.

"Most of the body is covered in the gills, which were probably necessary to provide oxygen to such a large, actively swimming animal," says Allison Daley.


Beat the traffic: take the flying car

U.S. company tests its $194,000 US 'roadable aircraft'

The Terrafugia Transition seats two, drives regular speeds on roads and can do about 180 km/h in flight. (Courtesy Terrafugia)

A U.S. company has realized the longstanding fantasy of a flying car.

Terrafugia Inc.'s "roadable aircraft," named Transition, completed its first flight at Plattsburgh, N.Y., on Wednesday, the company said.

"It’s what aviation enthusiasts have been striving for since 1918," CEO Carl Dietrich said in a news release.

But with an expected price tag of $194,000 US by the time the first one is delivered to a user in 2011, it's not likely to reach a mass market. The Transition is aimed at giving sport and private pilots convenient ground transportation.

According to the company, the two-seat vehicle can take off and land at smaller airports, and the pilot can convert it to a roadworthy vehicle by retracting the wings from inside the cockpit in less than 30 seconds.

The Transition (Terrafugia is Latin for "escape from land") can drive at regular speeds on the ground and do about 180 km/h in the air. Its tank holds enough unleaded gas for four hours of cruising. It fits in a garage, has front-wheel drive for the surface and a pusher propeller for flight.
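Taken at face value, the quoted cruise figures imply a rough flight range. A minimal back-of-the-envelope sketch, using only the article's numbers and ignoring reserves, wind, and fuel burned on takeoff and landing:

```python
# Implied range from the quoted cruise figures (rough estimate only).
cruise_speed_kmh = 180   # quoted airborne speed
endurance_hours = 4      # "enough unleaded gas for four hours of cruising"

range_km = cruise_speed_kmh * endurance_hours
print(f"implied range: {range_km} km ({range_km / 1.609:.0f} miles)")
```

That works out to roughly 720 km (about 450 miles) of cruising range under these simplifying assumptions.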

The Transition requires a U.S. sport pilot licence to fly, but the company website said that could take as little as 20 hours of flight time in a Transition-specific course.

A "full vehicle parachute" is available.

Users would have to file a flight plan on trips between larger airports.

Terrafugia, based in Woburn, Mass., near Boston, was founded by five pilots who are graduates of Massachusetts Institute of Technology, and is backed by private investors.


Smart People Really Do Think Faster

by Jon Hamilton

An image showing the connectivity between brain neurons.
David Shattuck/Arthur Toga/Paul Thompson/UCLA

This colorful brain image is like a map of mental speed. The bright spaghetti structures represent the pathways connecting different brain cells.

An image of the pathways of the brain.
David Shattuck/Arthur Toga/Paul Thompson/UCLA

This DTI brain scan shows more of the brain's wiring. Thompson says not only are these brain scans beautiful but "these images really give you a picture of the mental speed of the brain."

The smarter the person, the faster information zips around the brain, a UCLA study finds. And this ability to think quickly apparently is inherited.

The study, published in the Journal of Neuroscience, looked at the brains and intelligence of 92 people. All the participants took standard IQ tests. Then the researchers studied their brains using a technique called diffusion tensor imaging, or DTI.

Capturing Mental Speed

DTI is a variant of magnetic resonance imaging (MRI) that can measure the structural integrity of the brain's white matter, which is made up of cells that carry nerve impulses from one part of the brain to another. The greater the structural integrity, the faster nerve impulses travel.

"These images really give you a picture of the mental speed of the brain," says Paul Thompson, Ph.D., a professor of neurology at UCLA School of Medicine.

They're also "the most beautiful images of the brain you could imagine," Thompson says. "My daughter, who's 5, says they look like little flowers at each point in the brain."

Thompson says DTI scans of the 92 participants in the study revealed a clear link between brain speed and intelligence.

"When you say someone is quick-thinking, it's genuinely true," Thompson says. "The impulses are going faster and they are just more efficient at processing information, and then making a decision based on it."

Inherited Ability

Thompson's study also found that genetic factors played a big role in brain speed.

The team was able to figure this out because the 92 people in their study were all twins. Some were identical twins, who share all of their genes. Others were non-identical twins, who share only about half of their genes.

By comparing the groups, the researchers were able to tease out genes associated with the structural integrity of white matter. And it turned out many of these genes were also associated with intelligence.

Richard Haier, Ph.D., emeritus professor at the University of California, Irvine, says this may explain something scientists have been wondering about for a long time.

"We know that intelligence has some genetic component," he says. "And what the Thompson study is showing is that a large part of the genetic aspect of intelligence has to do with the white matter tracts that connect different parts of the brain."

Don't Give Up Just Yet

Haier says the good news is that we're not necessarily stuck with the brain, or the brain speed, we inherit. He says thinking is like running or weightlifting. It helps to have certain genes. But anyone can get stronger or faster by working out.

The brain is like a muscle, Haier says: "The more you work it the more efficient it gets."

So people who practice the violin, or do math problems, or learn a foreign language are constantly strengthening certain pathways in their brains.

And Thompson notes that our brains, unlike our bodies, peak relatively late in life.

"The wires between the brain cells, the connections, are the things that you can modify throughout life," he says. "They change and they improve through your 40s and 50s and 60s."

Thompson says there are practical, as well as academic, reasons to measure brain speed.

The technique can spot problems such as Alzheimer's disease, which slows down the brain. And because the scans are so sensitive, they can show whether new drugs for Alzheimer's are actually working.


Are Tropical Species More Threatened than Arctic Ones from Global Warming?


A team of researchers says that despite all the media attention given to the Arctic region and polar bears, species living in the tropics may face an even greater risk as the world warms up. Shrinking polar ice has led ecologists to worry that polar bears will soon start dying off as their hunting grounds literally melt away.

However, according to a team led by researchers at the University of Washington, while temperature changes will be much more extreme at high latitudes, tropical species face a far greater risk of extinction, since even relatively slight warming of just a degree or two can have a devastating impact. The Daily Galaxy asked Joshua Tewksbury, a biologist at the University of Washington who is studying tropical species, why these warm-weather species are in greater danger. After all, it’s already warm where they live, so how could just a degree or two of warming make much of a difference?

“We’re looking specifically at the intersection between where an organism lives, and how susceptible they are to change. What we found is that organisms in the tropics are much less resilient to heat change,” Tewksbury explained to The Daily Galaxy.

Why? Because tropical species are adapted to living within a much narrower temperature range. Once temperatures rise beyond that comfortable range, many species will likely have a difficult time coping.

Geographically, Earth’s tropical region is a giant belt that stretches from the Tropic of Cancer to the Tropic of Capricorn; or, in actual terms, from just south of Miami to halfway through Australia.

Meteorologists, however, use a more scientific definition based on long-term climate. It is this definition that has revealed the widening of the tropics toward our planet’s poles.

"There's a strong relationship between your physiology and the climate you live in," said Tewksbury, "In the tropics many species appear to be living at or near their thermal optimum, a temperature that lets them thrive. But once temperature gets above the thermal optimum, fitness levels most likely decline quickly and there may not be much they can do about it."

Arctic species, on the other hand, often experience temperatures ranging from subzero to a comparatively warm 60 degrees Fahrenheit. They typically live at temperatures well below their thermal limit, and most will continue to do so even with climate change.

"Many tropical species can only tolerate a narrow range of temperatures because the climate they experience is pretty constant throughout the year," said Curtis Deutsch, an assistant professor of atmospheric and oceanic sciences at the University of California, Los Angeles. "Our calculations show that they will be harmed by rising temperatures more than would species in cold climates.

"Unfortunately, the tropics also hold the large majority of species on the planet," he said.

Tewksbury and Deutsch are lead authors of a paper detailing the research, published in the May 6 print edition of the Proceedings of the National Academy of Sciences. The scientists compared data describing the relationship between temperatures and fitness for a variety of temperate and tropical insect species, as well as frogs, lizards and turtles. Fitness levels were measured by examining population growth rates in combination with physical performance.

"The direct effects of climate change on the organisms we studied appear to depend a lot more on the organisms' flexibility than on the amount of warming predicted for where they live," Tewksbury said. "The tropical species in our data were mostly thermal specialists, meaning that their current climate is nearly ideal and any temperature increases will spell trouble for them."

So does that mean we should turn our focus away from the Arctic? Definitely not, says Tewksbury. Polar bears are in danger, just for different reasons than tropical species. It’s not the temperature itself that will harm the bears, since they already live in a climate that varies wildly throughout the year; what will harm them is the loss of habitat they will face as polar ice disappears. Many tropical species, on the other hand, will likely be harmed directly by rising temperatures, since their physiology cannot handle large temperature swings the way the bears’ can.

“The polar bears are in trouble, but what many don’t know is that even a small amount of change in the tropics could affect a vast number of species,” Tewksbury told The Daily Galaxy. “People won’t see as dramatic a decline, because many of the tropical species that will be affected, like insects and lizards, simply don’t have the appeal that the polar bear has. Yet they are still vitally important to their ecosystems.”

Independent teams analyzed the available atmospheric data and, using four different meteorological measurements, found that the tropical belt has grown by between 2 and 4.8 degrees of latitude since 1979. That translates to a total expansion, north and south, of 140 to 330 miles.
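As a rough sanity check on those figures, a degree of latitude spans about 69 miles (111 km) anywhere on Earth, so the conversion is a single multiplication. A minimal illustrative sketch (the helper name and rounding are my own, not from the study):

```python
# A degree of latitude spans roughly 69 miles (111 km) everywhere on
# Earth, because lines of latitude are (nearly) evenly spaced.
MILES_PER_DEGREE_LATITUDE = 69.0

def belt_widening_miles(degrees_of_latitude):
    """Convert a widening of the tropical belt, given in degrees of
    latitude, to its north-south extent in miles."""
    return degrees_of_latitude * MILES_PER_DEGREE_LATITUDE

low = belt_widening_miles(2.0)    # lower estimate since 1979
high = belt_widening_miles(4.8)   # upper estimate since 1979
print(f"{low:.0f} to {high:.0f} miles")  # -> 138 to 331 miles
```

The result agrees with the article's rounded 140-to-330-mile range.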

Climate scientists have long predicted that the tropical belt would expand by the end of the 21st century. But the growth that has already taken place over the last quarter-century is puzzling, and not part of their theories.

Dian Seidel, a research meteorologist with the National Oceanic and Atmospheric Administration lab in Silver Spring, Md., finds the changes hard to explain. "They are big changes," she said. "It's a little puzzling."

And while one explanation for the expanding tropics is indeed global warming, it is not the only candidate. Depletion of the ozone layer and changes in El Niño could also explain what has happened.

And while much of the tropics is thought to be just that, tropical, the region contains great swathes of desert as well. One only needs to look at Australia to see that played out: its ‘Top End’ is dominated by rain forests and rainy seasons, sitting just above the massive desert center.

According to the experts, it is these desert areas on the edges of the tropics, such as the U.S. Southwest, parts of the Mediterranean and, of course, Australia, that are most at risk.

And while warming is happening much faster at high latitudes, the slower warming in tropical zones will likely have just as severe an impact over time, for different reasons. Tropical species may not be as majestic as polar bears, says Tewksbury, but we can’t forget about the little guys.

Posted by Rebecca Sato.

Northeast warned of new source of rising seas

A study predicting even higher sea level rise in the Northeast due to ocean current behavior suggests that New York's subway system would be at even greater risk to flooding and storm surges.

WASHINGTON - The northeastern U.S. coast is likely to see the world's biggest sea level rise from man-made global warming, a new study predicts.

However much the oceans rise by the end of the century, add an extra 8 inches or so for New York, Boston and other spots along the coast from the mid-Atlantic to New England. That's because of predicted changes in ocean currents, according to a study based on computer models published online Sunday in the journal Nature Geoscience.

An extra 8 inches — on top of a possible 2 or 3 feet of sea rise globally by 2100 — is a big deal, especially when nor'easters and hurricanes hit, experts said.
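The arithmetic behind that "big deal" is simple addition once the units are matched. A small illustrative sketch (the function name and the choice of inches are my own):

```python
INCHES_PER_FOOT = 12

def northeast_total_rise_inches(global_rise_feet, regional_extra_inches=8):
    """Total projected rise for the Northeast: the global-average rise
    plus the regional extra attributed to slowing ocean currents."""
    return global_rise_feet * INCHES_PER_FOOT + regional_extra_inches

# The article's range of global-average projections by 2100
for feet in (2, 3):
    total = northeast_total_rise_inches(feet)
    print(f"{feet} ft global + 8 in regional = {total} inches")
# -> 2 ft global + 8 in regional = 32 inches
# -> 3 ft global + 8 in regional = 44 inches
```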

"It's not just waterfront homes and wetlands that are at stake here," said Donald Boesch, president of the University of Maryland Center for Environmental Science, who wasn't part of the study. "Those kind of rises in sea level when placed on top of the storm surges we see today, put in jeopardy lots of infrastructure, including the New York subway system."

For years, scientists have talked about rising sea levels due to global warming — both from warm water expanding and the melt of ice sheets in Greenland and West Antarctica. Predictions for the average worldwide sea rise keep changing along with the rate of ice melt. Recently, more scientists are saying the situation has worsened so that a 3-foot rise in sea level by 2100 is becoming a common theme.

Boston singled out
But the oceans won't rise at the same rate everywhere, said study author Jianjun Yin of the Center for Ocean-Atmospheric Prediction Studies at Florida State University. It will be "greater and faster" for the Northeast, with Boston one of the worst hit among major cities, he said. So, if it's 3 feet, add another 8 inches for that region.

The explanation involves complicated ocean currents. Computer models forecast that as climate change continues, there will be a slowdown of the great ocean conveyor belt. That system moves heat energy in warm currents from the tropics to the North Atlantic and pushes the cooler, saltier water down, moving it farther south around Africa and into the Pacific. As the conveyor belt slows, so will the Gulf Stream and North Atlantic current. Those two fast-running currents have kept the Northeast's sea level unusually low because of a combination of physics and geography, Yin said.

Slow down the conveyor belt 33 to 43 percent as predicted by computer models, and the Northeast sea level rises faster, Yin said.

So far, the conveyor belt has not noticeably slowed.

A decade ago, scientists worried about the possibility that this current conveyor belt would halt altogether — something that would cause abrupt and catastrophic climate change like that shown in the movie "The Day After Tomorrow." But in recent years, they have concluded that a shutdown is unlikely to happen this century.

Other experts who reviewed Yin's work say it makes sense.

"Our coastlines aren't designed for that extra 8 inches of storm surge you get out of that sea level rise effect," said Jonathan Overpeck, director of an Earth studies institute at the University of Arizona.

Other areas estimated
While Boston and New York are looking at an additional 8 inches, other places wouldn't get that much extra rise. The study suggests Miami and much of the Southeast would get about 2 inches above the global sea rise average of perhaps 3 feet, and San Francisco would get less than an extra inch. Parts of southern Australia, northern Asia and southern and western South America would get less than the global average sea level rise.

This study along with another one last month looking at regional sea level rise from the projected melt of the west Antarctic ice sheet "provide a compelling argument for anticipating and preparing for higher rates of sea level rise," said Virginia Burkett, chief scientist for Global Change Research at the U.S. Geological Survey.

Burkett, who is based in Louisiana, said eventually New Englanders could be in the same "vulnerability situation" to storms and sea level rise as New Orleans.

Copyright 2009 The Associated Press.

Rumor Mill: Is IKEA Entering the Eco-Friendly Car Market?

BY Ariel Schwartz


We already know that IKEA is debuting a line of solar-powered lights; could the Swedish giant actually enter the eco-friendly car market?

The Internet is abuzz about a mysterious yet official-looking French website that appeared today. The site touts the LEKO, an environmentally friendly IKEA-branded concept car. A video on the LEKO site says that the car is a modular design that can act as either a coupe or a convertible. The car apparently also has the full backing of the World Wildlife Fund France, though it's not clear whether that means the WWF is contributing to the LEKO's development or just endorsing it.

There's a distinct possibility that the LEKO video and site are the viral warning shots for someone's April Fools' Day hoax. The LEKO is absent from the IKEA website, and most importantly, the car will be unveiled on April Fools' Day.

But hey, stranger things, right? April 1-7 is France's Sustainable Development Week, and IKEA already offers "kit homes" shipped in flatpacks to customers in Northern England and Scandinavia. I hope we can get a LEKO in Swedish blue and yellow.

Has Sustainability Become a Cliché?

By Robert Pojasek

In one of my previous blogs, I spoke about environmental sustainability as an oxymoron. The so-called "Google recognition index" now registers more than 3,240,000 hits for this term.

Although lagging far behind this "one-third of actual sustainability" term, new terms are being promoted on the Internet: water sustainability (65,100 hits) and energy sustainability (154,000 hits). One has to wonder if the term sustainability is becoming a cliché. It is being applied to so many things that it is in danger of becoming meaningless.

Let's take a look at the use of the term, "energy sustainability."

The Electric Power Research Institute (EPRI) has started an energy sustainability interest group (pdf). "The electric sector faces unique challenges in balancing the social benefits of providing reliable electricity with environmental impact." Four goals are stated:

• Create a forum for a sustainable energy future
• Discover strategic value in sustainable business practices
• Exchange best practices
• Solve sustainable energy challenges collaboratively.

These goals point out the ways the term can be used: sustainability, energy sustainability, sustainable energy and sustainable company. "The group also provides the industry with opportunities to work on a common, industrywide definition of sustainability and a mechanism for reshaping that definition as the concept evolves over time." Is this what we want -- every sector coming up with its own definition of sustainability and allowing the term to evolve over time?

Sustainability is all about perspective. We can take the perspective of the companies that supply us with electricity or we can take the perspective of the organizations that are using that electricity. Energy is only a part of the sustainability footprint of the users.

A recruiter promoting a client called a person a "pioneer in energy sustainability." Is the person part of the EPRI working group, or does the person just practice energy efficiency and conservation? Why do we need to call this energy sustainability? Maybe the recruiter would argue there is a differentiation associated with sustaining the energy sources.

Some might say that I am a bit too sensitive about trying to reconcile the many different ways that sustainability is being used on the Internet. After posting my blog on "defining sustainability," another blogger was wondering why sustainability is so hard to define. I guess the blogger had not read my previous blog. Many of us like to complain about this definition problem without proposing what we can do about it.

It would be easy to get all wrapped up in a sustainability definition campaign. However, too much precious time has already been wasted doing this. It might be interesting to see how the Xerox Corporation has dealt with people xeroxing reports. Does Google get upset with someone "googling" using another search engine? The lack of a clear definition is not the problem. The creation of a sustainability cliché is much more damaging. Energy sustainability and water sustainability will be right up there with paper bag sustainability and disposable diaper sustainability. What sustainability term do you want to launch? Before you know it, you'll be getting thousands of Google hits for that term.

Scientists drill deep into Greenland ice for global warming clues from Eemian Period

Scientists are to dig up ice dating back more than 100,000 years in an attempt to shed light on how global warming will change the world over the next century.

The ice, at the bottom of the Greenland ice sheet, was laid down at a time when temperatures were 3C (5.4F) to 5C warmer than they are today.
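A note on the conversion in parentheses: a temperature *difference* converts between Celsius and Fahrenheit by a factor of 9/5 alone; the +32 offset applies only to absolute temperatures. A quick illustrative check (the function is mine, for demonstration):

```python
def delta_c_to_f(delta_celsius):
    """Convert a temperature *difference* from Celsius to Fahrenheit.
    Differences scale by 9/5; the +32 offset applies only to absolute
    temperatures, not to differences between them."""
    return delta_celsius * 9.0 / 5.0

print(delta_c_to_f(3))  # matches the article's 3C (5.4F) figure
print(delta_c_to_f(5))  # the warm end of the Eemian range, in F
```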

With temperatures forecast to rise by up to 7C in the next 100 years, the ice more than 8,000ft (2,400m) below the surface is thought by researchers to hold valuable clues to how much of the ice sheet will melt.

Drilling will start in northern Greenland during the summer in an international project involving researchers from 18 countries to extract ice cores covering the Eemian Period.

The Eemian began 130,000 years ago, ending 15,000 years later, and is the most recent time in the Earth's past when temperatures resembled those that can be expected if greenhouse gas emissions are not brought under control.

Carbon dioxide, methane and other chemicals trapped in the ice can provide a detailed picture of the atmosphere and the climate thousands of years ago.

Fragments of organic matter can offer details about animals and plants alive when the ice formed, while particles of dirt can indicate forest fires, tundra fires and volcanic activity.

Analysis of the ice should provide the first measurement of CO2 levels over Greenland during the Eemian and the most detailed analysis yet achieved of climate indicators from the period.

Lars Berg Larsen, of the University of Copenhagen, which is leading the project, said: “We are looking into this period to find out what happens to the climate if you get 3 to 5 degrees warmer.

“The Eemian is the nearest time we know that matches temperatures we can expect in the next 100 or 200 years. It will tell us much about what might happen.”

Four researchers from the British Antarctic Survey (BAS) will be taking part in the operation. They are hopeful of seeing ice not only from the whole Eemian but the years preceding it as well, which could hold clues to what prompted the temperature to start rising, or at least could chart the atmospheric changes that accompanied the rise.

Researchers also hope that the chemical traces hidden in the ice up to 8,340ft below the surface will reveal how the Greenland ice sheet responded to the higher temperatures. This will have implications for sea level rises in the coming century. If the ice sheet melts entirely, seas would be expected to rise by 21ft.

Researchers expect to find that much of the ice persisted even when temperatures were 5C higher than today, offering hope that much of it will remain in a world of manmade climate change.

Robert Mulvaney, of BAS, who has spent 24 years drilling for ice in both the Arctic and Antarctic, said: “Our ideal would be to get not only the whole of the Eemian but the last time that we had a collapse in the Greenland ice sheet.”

Obama commits to plugin hybrids, battery manufacturing

By John Timmer

Obama touring an electric vehicle testing center.

Electric vehicles currently face a variety of Catch-22 style challenges. They require vast, cutting-edge battery manufacturing capacity, but building that capacity won't happen until it's clear that electric vehicles will be sold on a large enough scale. Driving electric vehicles will be challenging without a support network of charging and maintenance stations, but it's hard to make an economic case for building the support infrastructure until the vehicles are already on the road. President Obama, armed with stimulus money, has apparently decided that the US government is now ideally placed to break the deadlock.

Obama used a tour of Southern California Edison's electric vehicle testing center to announce that $2.4 billion of stimulus money would be used to help get everything in place for widespread adoption of electric vehicles. The goal: to put a million plugin hybrid vehicles on the road by 2015.

All of this money will be administered by the Department of Energy, where agency head Steve Chu has recently streamlined the funding protocols with the intention of making sure the stimulus money is put to use quickly. The bulk of it, $1.5 billion, will go to the development of battery technology and manufacturing capacity. This was the sort of thing US battery manufacturers were looking for, as they've already formed the National Alliance for Advanced Transportation Battery Cell Manufacture in the hope of spreading some of the risk of investment in expanded battery manufacturing technology.

The DOE has announced that, in addition to manufacturing, the money will be used to develop recycling capacity for lithium batteries, something that will be essential if electric vehicles take off. The money will be distributed in the form of competitive grants. Another $500 million in grants will go to the companies that make components such as electric motors. As this technology is already fairly mature (it's certainly not viewed as the primary roadblock to electric cars), the money will presumably go primarily to expanding manufacturing capacity.

The last $400 million will be spent on infrastructure concepts, with grants funding research on the basic technology, and other money being spent on demonstration projects. According to the DOE, this money will cover everything from evaluating different approaches to plugin hybrid vehicles to training mechanics to service them.

This is actually one case where we may be underspending. Hardly a month goes by without a new public-private partnership for electric vehicle infrastructure being announced (see here for this month's example), and some of these are very clearly based on incompatible concepts, such as battery swaps vs. quick-charge stations. The US risks a balkanization of its roadways if each region winds up doing its own thing, and it seems worthwhile to spend the money up front to figure out which system is likely to scale to a market the size of the US.

In any case, the announcements make it clear that the US government is putting its money behind the plugin hybrid concept, rather than a pure electric vehicle. They also make it clear that the administration is viewing the issue as part of a larger whole—Obama specifically mentioned that this effort shouldn't be seen as separate from the efforts to upgrade the electric grid. "It won't come without cost, nor will it be easy," he said. "We've got 240 million cars already on the road. We've got to upgrade the world's largest energy grid while it's already in use."

Despite the challenges, Obama indicated that the effort was essential, saying, "We'll do this because we know that the nation that leads on energy will be the nation that leads the world in the 21st century."

Many bird populations in trouble, report says

(CNN) -- Bird populations native to several areas of the globe are in decline, with some teetering on the brink of extinction, according to a multi-agency report, the first of its kind, released Thursday.

The Western meadowlark is an endangered bird species, according to a new report.

But some other species are thriving, according to the "State of the Birds" report, which credited conservation programs and waterfowl management initiatives that it said can serve as a model in other areas.

"Every U.S. habitat harbors birds in need of conservation," said the report, issued by the Cornell University Lab of Ornithology in conjunction with federal agencies and other organizations. "Hawaiian birds and ocean birds appear most at risk, with populations in danger of collapse if immediate conservation measures are not implemented."

Of the more than 800 species of birds in the United States, 67 are federally listed as endangered or threatened, the report said. Another 184 are "species of conservation concern" because they have small distribution, are facing high threats or have a declining population.

Hawaiian birds, particularly, are in crisis, the report said. More than one-third of all U.S. bird species are in Hawaii. However, 71 species have gone extinct since the islands were colonized about 300 A.D., and 10 more species have not been seen in the past 40 years, contributing to fears they, too, have died out.

Grassland and arid-land birds are showing the most rapid declines over the last four decades, while forest birds are also declining, the report said.

"Just as they were when Rachel Carson published 'Silent Spring' nearly 50 years ago, birds today are a bellwether of the health of land, water and ecosystems," Interior Secretary Ken Salazar said Thursday in a statement on the report.

"From shorebirds in New England to warblers in Michigan to songbirds in Hawaii, we are seeing disturbing downward population trends that should set off environmental alarm bells. We must work together now to ensure we never hear the deafening silence in our forests, fields and backyards that Rachel Carson warned us about."

The declines can be traced to a variety of factors, depending on a bird's particular habitat. But the causes most frequently cited in the report are agriculture, climate change, development and energy, and invasive species.

For instance, some of the nation's fastest-growing cities -- Las Vegas, Nevada; Phoenix, Arizona; and San Diego, California -- are located in arid lands, the report said. "Unplanned and sprawling urban development is by far the greatest threat to arid lands."

In addition, invasive non-native plants have been introduced into the area, which can fuel wildfires and destroy native plants. Bird species of concern in arid lands include the elf owl, Bendire's and LeConte's thrashers and the gilded flicker, the report said. Some, such as the California condor, are already listed as endangered or threatened.

In the grasslands, intensified agricultural practices have hurt bird populations, according to the report. "Pastures cannot support many birds if overgrazed, burned too frequently or burned at the beginning of the nesting season or the end of the grass-growing season."

Also, public lands and parks are mowed too frequently and the grass kept too short to provide a habitat for birds, the report said. Grassland birds showing marked population declines include the mountain plover, Eastern and Western meadowlarks, short-eared owls and Northern bobwhites.

In the forest, some species are doing well, but roughly one-third of forest-breeding bird species are showing decline. The same is true for arctic and alpine birds, where 38 percent of the 85 species are of conservation concern, the report said.

Game birds are also struggling. Of 19 resident game bird species, 47 percent are species of conservation concern, and two species are federally endangered, according to the report. The greatest hope for long-term management of game birds, however, lies in Farm Bill programs that retire millions of acres of land used for heavy agriculture, the report said.

Along the coast and in the ocean, bird populations have been hurt by overfishing, pollution and climate change, among other factors, the report said.

Each section of the report contains at least one "reason for hope." The California condor, for instance, has been reintroduced to some areas, and the bird's numbers are growing.

Urban birds, such as the American robin, hummingbirds, sparrows and woodpeckers, show a "steady, strong increase" in the last 40 years.

"A surprising number of native birds have adapted to life around humans," the report said. "... The wide variety of native birds that thrive in urban areas underscores the importance of these artificial habitats to the survival of many bird populations."

Wetland bird populations remain below historic levels, but "management and conservation measures have contributed to increases of many wetland birds, including hunted waterfowl."

Research shows that wetland bird species began to increase in the late 1970s, "coinciding with major policy shifts from draining to protecting wetlands," the report said. "Dramatic increases in many wetland generalist species, as well as arctic-nesting geese and cavity-nesting ducks, contribute to this overall trend."

"These results emphasize that investment in wetlands conservation has paid huge dividends," said Kenneth Rosenberg, director of Conservation Science at the Cornell Lab of Ornithology. "Now we need to invest similarly in other neglected habitats where birds are undergoing the steepest declines."

Taking action now -- particularly in the area of habitat loss -- can help reverse the declining trend seen in some species, according to the report. "The number and scope of severe threats to birds is daunting, but implementing solutions immediately and widely will pay off in benefits to society, the economy and the health of our environment."

It calls for measures such as increased monitoring of bird populations, stricter protection laws, more incentives, sustainable fishing practices and widespread education.

"Citizen science plays a critical role in monitoring and understanding the threats to these birds and their habitats, and only citizen involvement can help address them," said Greg Butcher, conservation director for the National Audubon Society.

"Conservation action can only make a real difference when concerned people support the kind of vital habitat restoration and protection measures this report explores."

Gemini 3 Launches: A Photo Essay from 44 Years Ago Today

Posted by Michael Pinto

Astronaut John W. Young, the pilot of the Gemini-Titan 3 prime crew, is shown suited up for GT-3 pre-launch test exercises.

On March 23rd, 1965, Gemini 3 launched into history as the first manned Gemini flight. The ship was crewed by John W. Young (shown above) and Virgil I. Grissom (shown below). Grissom named the spacecraft Molly Brown, in reference to the Broadway show The Unsinkable Molly Brown, as he was hoping not to duplicate his previous experience with the Liberty Bell 7, which sank after splashdown. This mission was very much a test flight, and it was the first time an American spacecraft carried a crew of two. NASA was still in catch-up mode at this point: the USSR had launched Voskhod 1, with a crew of three, in 1964.

Astronaut Virgil I. Grissom, the command pilot of the Gemini-Titan 3 space flight, is shown in the Gemini-3 spacecraft just before the hatches are secured prior to launch.

Prior to flight, astronaut John W. Young, pilot of the Gemini-Titan 3 space flight, checks over his helmet during suiting operations in the suiting trailer at Pad 16.

Astronaut Roger B. Chaffee is shown at the consoles in the Mission Control Center in Houston during the Gemini-Titan 3 flight.

The launching of the Gemini-Titan 3, the first manned Gemini flight: it lifted off Pad 19 at 9:24 a.m. The Gemini 3 spacecraft "Molly Brown" carried astronauts Virgil I. Grissom, command pilot, and John W. Young, pilot, on three orbits of Earth.

Astronauts John W. Young (left) and Virgil I. Grissom stand before microphones at Cape Kennedy’s skid strip during welcome-back ceremonies for the Gemini 3 crew.

Today the spacecraft is on display in the Grissom Memorial at Spring Mill State Park, next to Grissom’s hometown of Mitchell, Indiana.

Breaking: Colbert Wins NASA's Node 3 Naming Contest

posted by: Matt Tobey

When the aliens come to Earth, they're going to get a taste of hardcore patriotism on their way in. "Colbert" was the top vote-getter in NASA's contest to name the new Node 3 space module.

NASA's online contest to name a new room at the international space station went awry. Comedian Stephen Colbert won.

The name "Colbert" beat out NASA's four suggested options in the space agency's effort to have the public help name the addition. The new room will be launched later this year.

NASA's mistake was allowing write-ins. Colbert urged viewers of his Comedy Central show, "The Colbert Report," to write in his name. And they complied, with 230,539 votes. That clobbered Serenity, one of the NASA choices, by more than 40,000 votes. Nearly 1.2 million votes were cast by the time the contest ended Friday.

NASA reserves the right to choose an appropriate name. Agency spokesman John Yembrick said NASA will decide in April, but will give top vote-getters "the most consideration."

Watch Saturn’s shadow dancing

As I mentioned in a recent blog post, Saturn is currently presenting itself to us with its rings and moon orbits nearly edge-on. I knew this would mean we’d see transits of the moons: from our view, the moons seem to pass directly over the face of Saturn.

Cassini animation of the moon Epimetheus’s shadow on the rings

What I didn’t think of is that this also means the moons will cast shadows on the rings themselves! This is starting to happen now, and Cassini, our robot-on-the-spot, is now sending back spectacular pictures (like it ever sends back any other kind, duh) of these events! The animation you see here (click to embiggen) shows the tiny moon Epimetheus — only 113 km (70 miles) across — casting its own shadow on the rings. While it was still a million kilometers from the tiny world, Cassini took a series of images that the ground team strung together into this beautiful and somewhat eerie animation. The shadow moves across the rings because Epimetheus’s orbit isn’t precisely aligned with the rings; it’s tilted by less than a degree, but that’s enough to send its stretched-out shadow drifting across the rings like a ghost as the moon bobs above the ring plane.

If you look at this still frame from the animation, you can see incredible detail in the rings, and even that the shadow is not quite symmetric: probably a reflection (so to speak) of Epimetheus’s irregular shape.

Cassini picture of the moon Epimetheus’s shadow on the rings

Some of the other moons are creating these dances as well; here is the even smaller flying-saucer-shaped Pan (just 20 km (12 miles) across) as it orbits in a gap in the rings, casting its own shadow across the rings:

Cassini picture of the moon Pan’s shadow on the rings

Can you see the moon Pan in the gap in the rings and its shadow? It’s tiny in that version, so take a look at this zoom:

Cassini zoomed picture of the moon Pan’s shadow on the rings

Whoa, cool. Pan is actually orbiting in that ring gap, so the rings have to be almost perfectly edge-on to the Sun to get that shadow. Right now that’s not quite the case; there’s still a bit of a tilt. But as Saturn orbits the Sun that angle will diminish, and in a few months (in August) it’ll be precisely 0. Then we’ll see the shadows stretching out along the rings, lengthened the same way your own shadow is elongated at sunset. As we approach this point in time — what’s really the Equinox on Saturn, the same as the Equinox we just had on Earth — we’ll see this more and more, so expect a ton more devastating animations and images from Cassini in the months to come!

Star Explodes, and So Might Theory

A massive star a million times brighter than our sun exploded way too early in its life, suggesting scientists don't understand stellar evolution as well as they thought.

"This might mean that we are fundamentally wrong about the evolution of massive stars, and that theories need revising," said Avishay Gal-Yam of the Weizmann Institute of Science in Rehovot, Israel.

According to theory, the doomed star, about 100 times our sun's mass, was not mature enough to have evolved a massive iron core of nuclear fusion ash, considered a prerequisite for a core implosion that triggers the sort of supernova blast that was seen.

The new study is based on archival images that have only now been compared. It is one of the rare instances in which the progenitor of an exploded star has been found.

The explosion, called supernova SN 2005gl, was seen at a distance of 215 million light-years in the barred-spiral galaxy NGC 266 on Oct. 5, 2005. Pictures from the Hubble Space Telescope archive, taken in 1997, reveal the star, pre-explosion, as a very luminous one.

The progenitor had been proposed previously, but now has been firmly identified, according to a study published Sunday in the online version of the journal Nature.

The progenitor star was so bright that it probably belonged to a class of stars called Luminous Blue Variables (LBVs), "because no other type of star is as intrinsically brilliant," Gal-Yam said. As an LBV-class star evolves it sheds much of its mass through a violent stellar wind. Only then does it develop a large iron core, which collapses and triggers the supernova explosion.

The unexpected explosion could mean other stars may behave in ways not previously expected, including one relatively close to home, known as Eta Carinae, just 7,500 light-years away and in our own Milky Way galaxy. Extremely massive and luminous stars topping 100 solar masses, such as Eta Carinae, are expected to lose their entire hydrogen envelopes prior to their ultimate explosions as supernovae.

"These observations demonstrate that many details in the evolution and fate of LBVs remain a mystery," said Mario Livio of the Space Telescope Science Institute in Baltimore. "We should continue to keep an eye on Eta Carinae; it may surprise us yet again."

"The progenitor identification shows that, at least in some cases, massive stars explode before losing most of their hydrogen envelope, suggesting that the evolution of the core and the evolution of the envelope are less coupled than previously thought, a finding which may require a revision of stellar evolution theory," said study co-author Douglas Leonard from San Diego State University.

One possibility is that the progenitor to SN 2005gl was really a pair of stars, a binary system that merged. This would have stoked nuclear reactions to brighten the star enormously, making it look more luminous and less evolved than it really is.

"This also leaves open the question that there may be other mechanisms for triggering supernova explosions," says Gal-Yam. "We may be missing something very basic in understanding how a superluminous star goes through mass loss."

Gal-Yam reports that the observation revealed that only a small part of the star's mass was flung off in the explosion. Most of the material, says Gal-Yam, was drawn into the collapsing core that has probably become a black hole estimated to be at least 10 to 15 solar masses.

Original here

Space storm alert: 90 seconds from catastrophe

A fierce solar storm could lead to a global disaster on an unprecedented scale (Image: SOHO Consortium / ESA / NASA)

Related editorial: We must heed the threat of solar storms

IT IS midnight on 22 September 2012 and the skies above Manhattan are filled with a flickering curtain of colourful light. Few New Yorkers have seen the aurora this far south but their fascination is short-lived. Within a few seconds, electric bulbs dim and flicker, then become unusually bright for a fleeting moment. Then all the lights in the state go out. Within 90 seconds, the entire eastern half of the US is without power.

A year later and millions of Americans are dead and the nation's infrastructure lies in tatters. The World Bank declares America a developing nation. Europe, Scandinavia, China and Japan are also struggling to recover from the same fateful event - a violent storm, 150 million kilometres away on the surface of the sun.

It sounds ridiculous. Surely the sun couldn't create so profound a disaster on Earth. Yet an extraordinary report funded by NASA and issued by the US National Academy of Sciences (NAS) in January this year claims it could do just that.

Over the last few decades, western civilisations have busily sown the seeds of their own destruction. Our modern way of life, with its reliance on technology, has unwittingly exposed us to an extraordinary danger: plasma balls spewed from the surface of the sun could wipe out our power grids, with catastrophic consequences.

The projections of just how catastrophic make chilling reading. "We're moving closer and closer to the edge of a possible disaster," says Daniel Baker, a space weather expert based at the University of Colorado in Boulder, and chair of the NAS committee responsible for the report.

It is hard to conceive of the sun wiping out a large amount of our hard-earned progress. Nevertheless, it is possible. The surface of the sun is a roiling mass of plasma - charged high-energy particles - some of which escape the surface and travel through space as the solar wind. From time to time, that wind carries a billion-tonne glob of plasma, a fireball known as a coronal mass ejection (see "When hell comes to Earth"). If one should hit the Earth's magnetic shield, the result could be truly devastating.

The incursion of the plasma into our atmosphere causes rapid changes in the configuration of Earth's magnetic field which, in turn, induce currents in the long wires of the power grids. The grids were not built to handle this sort of direct current electricity. The greatest danger is at the step-up and step-down transformers used to convert power from its transport voltage to domestically useful voltage. The increased DC current creates strong magnetic fields that saturate a transformer's magnetic core. The result is runaway current in the transformer's copper wiring, which rapidly heats up and melts. This is exactly what happened in the Canadian province of Quebec in March 1989, when six million people spent nine hours without electricity. But things could get much, much worse than that.
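The induced-current mechanism described above can be sketched with a back-of-envelope calculation. All the numbers below (geoelectric field strength, line length, circuit resistance) are illustrative assumptions chosen to show the scale of the effect; they are not figures from the NAS report:

```python
# Back-of-envelope sketch of a geomagnetically induced current (GIC).
# Every value here is an illustrative assumption, not a figure from the report.

E_field = 1.0        # induced geoelectric field, V/km (severe-storm ballpark)
line_length = 500.0  # length of a long transmission line, km
resistance = 3.0     # total loop resistance: line + windings + ground, ohms

emf = E_field * line_length  # total quasi-DC voltage induced along the line, V
gic = emf / resistance       # quasi-DC current driven through the transformer, A

print(f"Induced EMF: {emf:.0f} V")
print(f"Quasi-DC current: {gic:.0f} A")
```

Even these modest assumed values give a current of a few hundred amps of quasi-DC through windings designed for alternating current only, which is enough to push a transformer's magnetic core into saturation.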

Worse than Katrina

The most serious space weather event in history happened in 1859. It is known as the Carrington event, after the British amateur astronomer Richard Carrington, who was the first to note its cause: "two patches of intensely bright and white light" emanating from a large group of sunspots. The Carrington event comprised eight days of severe space weather.

There were eyewitness accounts of stunning auroras, even at equatorial latitudes. The world's telegraph networks experienced severe disruptions, and Victorian magnetometers were driven off the scale.

Though a solar outburst could conceivably be more powerful, "we haven't found an example of anything worse than a Carrington event", says James Green, head of NASA's planetary division and an expert on the events of 1859. "From a scientific perspective, that would be the one that we'd want to survive." However, the prognosis from the NAS analysis is that, thanks to our technological prowess, many of us may not.

There are two problems to face. The first is the modern electricity grid, which is designed to operate at ever higher voltages over ever larger areas. Though this provides a more efficient way to run the electricity networks, minimising power losses and wastage through overproduction, it has made them much more vulnerable to space weather. The high-power grids act as particularly efficient antennas, channelling enormous direct currents into the power transformers.

The second problem is the grid's interdependence with the systems that support our lives: water and sewage treatment, supermarket delivery infrastructures, power station controls, financial markets and many others all rely on electricity. Put the two together, and it is clear that a repeat of the Carrington event could produce a catastrophe the likes of which the world has never seen. "It's just the opposite of how we usually think of natural disasters," says John Kappenman, a power industry analyst with the Metatech Corporation of Goleta, California, and an advisor to the NAS committee that produced the report. "Usually the less developed regions of the world are most vulnerable, not the highly sophisticated technological regions."

According to the NAS report, a severe space weather event in the US could induce ground currents that would knock out 300 key transformers within about 90 seconds, cutting off the power for more than 130 million people (see map). From that moment, the clock is ticking for America.

First to go - immediately for some people - is drinkable water. Anyone living in a high-rise apartment, where water has to be pumped to reach them, would be cut off straight away. For the rest, drinking water will still come through the taps for maybe half a day. With no electricity to pump water from reservoirs, there is no more after that.

There is simply no electrically powered transport: no trains, underground or overground. Our just-in-time culture for delivery networks may represent the pinnacle of efficiency, but it means that supermarket shelves would empty very quickly - delivery trucks could only keep running until their tanks ran out of fuel, and there is no electricity to pump any more from the underground tanks at filling stations.

Back-up generators would run at pivotal sites - but only until their fuel ran out. For hospitals, that would mean about 72 hours of running a bare-bones, essential-care-only service. After that, no more modern healthcare.

72 hours of healthcare remaining

The truly shocking finding is that this whole situation would not improve for months, maybe years: melted transformer hubs cannot be repaired, only replaced. "From the surveys I've done, you might have a few spare transformers around, but installing a new one takes a well-trained crew a week or more," says Kappenman. "A major electrical utility might have one suitably trained crew, maybe two."

Within a month, then, the handful of spare transformers would be used up. The rest would have to be built to order, something that can take up to 12 months.

Even when some systems are capable of receiving power again, there is no guarantee there will be any to deliver. Almost all natural gas and fuel pipelines require electricity to operate. Coal-fired power stations usually keep reserves to last 30 days, but with no transport systems running to bring more fuel, there would be no electricity in the second month.

30 days of coal left

Nuclear power stations wouldn't fare much better. They are programmed to shut down in the event of serious grid problems and are not allowed to restart until the power grid is up and running.

With no power for heating, cooling or refrigeration systems, people could begin to die within days. There is immediate danger for those who rely on medication. Lose power to New Jersey, for instance, and you have lost a major centre of production of pharmaceuticals for the entire US. Perishable medications such as insulin will soon be in short supply. "In the US alone there are a million people with diabetes," Kappenman says. "Shut down production, distribution and storage and you put all those lives at risk in very short order."

Help is not coming any time soon, either. If it is dark from the eastern seaboard to Chicago, some affected areas are hundreds, maybe thousands of miles away from anyone who might help. And those willing to help are likely to be ill-equipped to deal with the sheer scale of the disaster. "If a Carrington event happened now, it would be like a hurricane Katrina, but 10 times worse," says Paul Kintner, a plasma physicist at Cornell University in Ithaca, New York.

In reality, it would be much worse than that. Hurricane Katrina's societal and economic impact has been measured at $81 billion to $125 billion. According to the NAS report, the impact of what it terms a "severe geomagnetic storm scenario" could be as high as $2 trillion. And that's just the first year after the storm. The NAS puts the recovery time at four to 10 years. It is questionable whether the US would ever bounce back.

4-10 years to recover

"I don't think the NAS report is scaremongering," says Mike Hapgood, who chairs the European Space Agency's space weather team. Green agrees. "Scientists are conservative by nature and this group is really thoughtful," he says. "This is a fair and balanced report."

Such nightmare scenarios are not restricted to North America. High latitude nations such as Sweden and Norway have been aware for a while that, while regular views of the aurora are pretty, they are also reminders of an ever-present threat to their electricity grids. However, the trend towards installing extremely high voltage grids means that lower latitude countries are also at risk. For example, China is on the way to implementing a 1000-kilovolt electrical grid, twice the voltage of the US grid. This would be a superb conduit for space weather-induced disaster because the grid's efficiency to act as an antenna rises as the voltage between the grid and the ground increases. "China is going to discover at some point that they have a problem," Kappenman says.

Neither is Europe sufficiently prepared. Responsibility for dealing with space weather issues is "very fragmented" in Europe, says Hapgood.

Europe's electricity grids, on the other hand, are highly interconnected and extremely vulnerable to cascading failures. In 2006, the routine switch-off of a small part of Germany's grid - to let a ship pass safely under high-voltage cables - caused a cascade power failure across western Europe. In France alone, five million people were left without electricity for two hours. "These systems are so complicated we don't fully understand the effects of twiddling at one place," Hapgood says. "Most of the time it's alright, but occasionally it will get you."

The good news is that, given enough warning, the utility companies can take precautions, such as adjusting voltages and loads, and restricting transfers of energy so that sudden spikes in current don't cause cascade failures. There is still more bad news, however. Our early warning system is becoming more unreliable by the day.

By far the most important indicator of incoming space weather is NASA's Advanced Composition Explorer (ACE). The probe, launched in 1997, has a solar orbit that keeps it directly between the sun and Earth. Its uninterrupted view of the sun means it gives us continuous reports on the direction and velocity of the solar wind and other streams of charged particles that flow past its sensors. ACE can provide between 15 and 45 minutes' warning of any incoming geomagnetic storms. The power companies need about 15 minutes to prepare their systems for a critical event, so that would seem passable.

15 minutes' warning

However, observations of the sun and magnetometer readings during the Carrington event show that the coronal mass ejection was travelling so fast it took less than 15 minutes to get from where ACE is positioned to Earth. "It arrived faster than we can do anything," Hapgood says.
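The arithmetic behind those warning times is simple to sketch. ACE orbits the L1 point, roughly 1.5 million kilometres sunward of Earth, so the warning interval is just that distance divided by the ejection's speed. The speeds below are standard ballpark values for coronal mass ejections, not figures quoted in the article:

```python
# Sketch: warning time from a monitor at L1 as a function of CME speed.
# The L1 distance and the sample speeds are ballpark assumptions.

L1_DISTANCE_KM = 1.5e6  # ACE sits roughly 1.5 million km sunward of Earth


def warning_minutes(cme_speed_km_s):
    """Minutes between a CME sweeping past L1 and reaching Earth."""
    return L1_DISTANCE_KM / cme_speed_km_s / 60.0


# A moderate storm (~500 km/s) allows ~50 minutes' warning; a typical fast
# one (~1000 km/s) about 25; a Carrington-class ejection (~2000+ km/s)
# arrives in under 13 minutes - less than the grid operators need.
for speed in (500, 1000, 2000):
    print(f"{speed} km/s -> {warning_minutes(speed):.0f} min warning")
```

This is why a Carrington-class event is so dangerous: at those speeds the plasma crosses the entire L1-to-Earth gap in less time than the roughly 15 minutes power companies need to prepare.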

There is another problem. ACE is 11 years old, and operating well beyond its planned lifespan. The onboard detectors are not as sensitive as they used to be, and there is no telling when they will finally give up the ghost. Furthermore, its sensors become saturated in the event of a really powerful solar flare. "It was built to look at average conditions rather than extremes," Baker says.

He was part of a space weather commission that three years ago warned about the problems of relying on ACE. "It's been on my mind for a long time," he says. "To not have a spare, or a strategy to replace it if and when it should fail, is rather foolish."

There is no replacement for ACE due any time soon. Other solar observation satellites, such as the Solar and Heliospheric Observatory (SOHO), can provide some warning, but with less detailed information and - crucially - much later. "It's quite hard to assess what the impact of losing ACE will be," Hapgood says. "We will largely lose the early warning capability."

The world will, most probably, yawn at the prospect of a devastating solar storm until it happens. Kintner says his students show a "deep indifference" when he lectures on the impact of space weather. But if policy-makers show a similar indifference in the face of the latest NAS report, it could cost tens of millions of lives, Kappenman reckons. "It could conceivably be the worst natural disaster possible," he says.

The report outlines the worst case scenario for the US. The "perfect storm" is most likely on a spring or autumn night in a year of heightened solar activity - something like 2012. Around the equinoxes, the orientation of the Earth's field to the sun makes us particularly vulnerable to a plasma strike.

What's more, at these times of year, electricity demand is relatively low because no one needs too much heating or air conditioning. With only a handful of the US grid's power stations running, the system relies on computer algorithms shunting large amounts of power around the grid and this leaves the network highly vulnerable to sudden spikes.

If ACE has failed by then, or a plasma ball flies at us too fast for any warning from ACE to reach us, the consequences could be staggering. "A really large storm could be a planetary disaster," Kappenman says.

So what should be done? No one knows yet - the report is meant to spark that conversation. Baker is worried, though, that the odds are stacked against that conversation really getting started. As the NAS report notes, it is terribly difficult to inspire people to prepare for a potential crisis that has never happened before and may not happen for decades to come. "It takes a lot of effort to educate policy-makers, and that is especially true with these low-frequency events," he says.

We should learn the lessons of hurricane Katrina, though, and realise that "unlikely" doesn't mean "won't happen". Especially when the stakes are so high. The fact is, it could come in the next three or four years - and with devastating effects. "The Carrington event happened during a mediocre, ho-hum solar cycle," Kintner says. "It came out of nowhere, so we just don't know when something like that is going to happen again."

Original here