
Saturday, August 30, 2008

Integral locates origin of high-energy emission from Crab Nebula

Thanks to data from ESA’s Integral gamma-ray observatory, scientists have been able to locate where particles in the vicinity of the rotating neutron star in the Crab Nebula are accelerated to immense energies.

The discovery, resulting from more than 600 individual observations of the nebula, puts in place another piece of the puzzle in understanding how neutron stars work.

Rotating neutron stars, or pulsars, are known to accelerate particles to enormous energies, typically a hundred times higher than those reached by the most powerful accelerators on Earth, but scientists are still uncertain exactly how these systems work and where the particles are accelerated.

A step forward in this understanding has now been made by a team of researchers from the UK and Italy, led by Professor Tony Dean of the University of Southampton, who studied high-energy polarised light emitted by the Crab Nebula – one of the most dramatic sights in deep space.

The Crab Nebula is the result of a supernova explosion which was seen from Earth on 4 July 1054. The explosion left behind a pulsar with a nebula of radiating particles around it. The pulsar contains the mass of the Sun squeezed into a volume of about 10 km radius, rotating very fast – about 30 times a second – thereby generating very powerful magnetic fields and accelerating particles. A highly collimated jet, aligned with the spin axis of the pulsar, and a bright radiating ‘donut’ structure (or torus) around the pulsar itself are also seen.
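Those figures imply staggering physical conditions. A minimal Python sketch, using only the 10 km radius and 30 rotations per second quoted above plus standard constants (the solar mass and speed of light are not from the article), shows the implied mean density and equatorial rotation speed:

```python
import math

# Values quoted in the article: one solar mass in a ~10 km radius, spinning ~30 times/s
M_SUN = 1.989e30      # kg, standard solar mass
R = 10e3              # m, pulsar radius
SPIN_HZ = 30.0        # rotations per second
C = 2.998e8           # m/s, speed of light

volume = (4.0 / 3.0) * math.pi * R**3       # volume of the sphere, m^3
density = M_SUN / volume                    # mean density, kg/m^3
v_equator = 2.0 * math.pi * R * SPIN_HZ     # equatorial rotation speed, m/s

print(f"mean density ~ {density:.1e} kg/m^3")        # ~4.7e17 kg/m^3
print(f"equatorial speed ~ {v_equator:.2e} m/s, "
      f"{v_equator / C:.1%} of light speed")         # ~1.9e6 m/s, ~0.6% of c
```

A mean density near 5 × 10^17 kg/m³ is roughly that of an atomic nucleus, which is why such rapid spin is possible without the star tearing itself apart.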

So the Crab is known to accelerate electrons – and possibly other particles – to extremely high speeds, producing high-energy radiation. But where exactly are these particles accelerated?

Looking into the heart of the pulsar with Integral’s spectrometer (SPI), the researchers made a detailed study of the polarisation – or alignment – of the waves of high-energy radiation originating from the Crab.


They saw that this polarised radiation is highly aligned with the rotation axis of the pulsar. So they concluded that a significant portion of the electrons generating the high-energy radiation must originate from a highly-organised structure located very close to the pulsar, very likely directly from the jets themselves. The discovery allows the researchers to discard other theories that locate the origin of this radiation further away from the pulsar.

Professor Tony Dean of the University’s School of Physics and Astronomy commented that the discovery of such alignment – which also matches the polarisation observed in the visible band – is truly remarkable. “The findings have clear implications on many aspects of high energy accelerators such as the Crab,” he added.

"The detection of polarised radiation in space is very complicated and rare, as it requires dedicated instrumentation and an in-depth analysis of very complex data”, said Chris Winkler, Integral Project Scientist at ESA. “Integral’s ability to detect polarised gamma-radiation and, as a consequence, to obtain important results like this one, confirms it once more as a world-class observatory.”


Notes for editors:

The results are published in the 29 August issue of the scientific journal Science, in a paper titled ‘Polarized gamma-ray emission from the Crab’, by: A.J. Dean, D.J. Clark, V.A. McBride, A.J. Bird, A.B. Hill and S.E. Shaw (University of Southampton’s School of Physics and Astronomy); J.B. Stephen and L. Bassani (INAF-IASF, Bologna); and A. Bazzano and P. Ubertini (INAF-IASF, Roma).

NASA Mars Rover Opportunity Ascends to Level Ground

NASA's Mars Exploration Rover Opportunity has climbed out of the large crater that it had been examining from the inside since last September.

"The rover is back on flat ground," an engineer who drives it, Paolo Bellutta of NASA's Jet Propulsion Laboratory, announced to the mission's international team of scientists and engineers.

Opportunity used its own entry tracks from nearly a year ago as the path for a drive of 6.8 meters (22 feet), bringing the rover out over the top of the inner slope and through a sand ripple at the lip of Victoria Crater. The exit drive, conducted late Thursday, completed a series of drives covering 50 meters (164 feet) since the rover team decided about a month ago that it had completed its scientific investigations inside the crater.

"We're headed to the next adventure out on the plains of Meridiani," said JPL's John Callas, project manager for Opportunity and its twin Mars rover, Spirit. "We safely got into the crater, we completed our exploration there, and we safely got out. We were concerned that any wheel failure on our aging rover could have left us trapped inside the crater."

The Opportunity mission has focused on Victoria Crater for more than half of the 55 months since the rover landed in the Meridiani Planum region of equatorial Mars. The crater spans about 800 meters (half a mile) in diameter and reveals rock layers that hold clues to environmental conditions of the area through an extended period when the rocks were formed and altered.

The team selected Victoria as the next major destination after Opportunity exited smaller Endurance Crater in late 2004. The ensuing 22-month traverse to Victoria included stopping for studies along the route and escaping from a sand trap. The rover first reached the rim of Victoria in September 2006. For nearly a year, it then explored partway around the rim, checking for the best entry route and examining from above the rock layers exposed in a series of promontories that punctuate the crater perimeter.

Now that Opportunity has finished exploring Victoria Crater and returned to the surrounding plain, the rover team plans to use tools on the robotic arm in coming months to examine an assortment of cobbles -- rocks about fist-size and larger -- that may have been thrown from impacts that dug craters too distant for Opportunity to reach.

JPL, a division of the California Institute of Technology, Pasadena, manages the rovers for the NASA Science Mission Directorate, Washington. For images and information about NASA's Opportunity and Spirit Mars rovers, visit http://www.nasa.gov/rovers and http://marsrovers.jpl.nasa.gov.

EXCLUSIVE: NASA to study extending shuttle era to 2015

NASA Administrator Michael Griffin has ordered his subordinates to study how the agency could fly the space shuttle beyond its planned retirement in 2010, according to an internal e-mail obtained by the Orlando Sentinel.

The decision signals what could be a huge change in NASA policy. Griffin repeatedly has rejected the notion of extending the shuttle era beyond its 2010 retirement date, arguing it could cripple the fledgling Constellation program, a system of new rockets and capsules meant to replace the shuttle in 2015.

But Griffin has been under enormous external pressure. Sen. John McCain recently asked the White House to stop dismantling parts of the shuttle program for at least a year. At the same time, eroding relations with Russia have motivated lawmakers to find a way to fill the five-year gap between the shuttle's retirement and the maiden voyage of Constellation in 2015. The current plan calls for NASA to buy Russian spacecraft during the gap.

One NASA official said such "what-if studies" represent "prudent planning," especially in light of suggestions made by McCain, the Republican presidential nominee, who would dictate the agency's future if he captures the White House.

But the email, sent on Wednesday, August 27, by John Coggeshall, manager of "Manifest and Schedules" at Johnson Space Center in Houston, suggested that the analysis was more than just a contingency study.

"We want to focus on helping bridge the gap of US vehicles travelling to the ISS as efficiently as possible," it said.

The upcoming study raised the idea of retiring one of the three remaining orbiters, possibly for spare parts. "(We) don't necessarily need all 3 orbiters either," the email said, adding, "We have been encouraged not to focus on a certain set of assumptions or costs."

But cost has been the exact reason why Griffin has dismissed the idea of extending the shuttle era. To have enough money to build Constellation's Ares 1 rocket and Orion crew capsule, NASA must stop spending money on shuttle flights. At one point last year, he estimated that it would cost as much as $4 billion a year to fly the shuttle beyond 2010.

NASA's current budget is about $17 billion.

"Continuing to fly the Shuttle beyond 2010 does not enhance U.S. human spaceflight capability, but rather delays the time until a new capability exists and increases the total life cycle cost to bring the new capability on line," he told Congress in November.

Another worry: NASA has already begun unplugging parts of the shuttle system. It has terminated many contracts with vendors who make shuttle parts, and NASA facilities have begun converting their systems to handle the new Constellation program.

Wayne Hale, a NASA deputy assistant administrator and until recently the shuttle-program manager, has said that this fall marks the point of no return. That's when NASA is supposed to start ripping out the giant welding equipment and other machinery at Michoud Assembly Facility in New Orleans, which makes the shuttle's giant external fuel tank.

In a blog posted Thursday, Hale said that flying shuttle and building Constellation would strain NASA's budgets and overextend its workforce. "Hey, I am the biggest shuttle hugger there is. I think it is the best spacecraft ever built. But I also deal in the real world," he wrote.

"Where does the money come from? Where do the people -- who should be working on the moon rocket -- where do they come from? We started shutting down the shuttle four years ago. That horse has left the barn," he wrote.


NASA's 'electronic nose' could sniff out cancer

From rocket science to brain surgery: a device designed to sniff out leaks on the space shuttle may soon guide surgeons as they operate on cancer patients.

The ENose was originally developed by NASA's Jet Propulsion Laboratory in Pasadena, California, to detect low-level leaks of ammonia in shuttles. It is based on polymer films whose electrical conductivity varies as they encounter different substances. Now its creators believe the ENose could act as a highly sensitive detector of the characteristic compounds produced by cancer cells.

Such a device could be invaluable for surgeons operating on areas where spotting tumour tissue is particularly tricky. Surgeons currently rely on visual inspection to locate cancerous tissue, referring back to scans taken before surgery. But brain tissue, for example, is hard to distinguish from cancer, and it also changes shape when the skull is opened, so the scans don't match what the surgeon sees. That makes it difficult to cut out all the cancerous tissue while avoiding damage to healthy areas.

Babak Kateb of City of Hope Medical Center in Duarte, California, says the ENose has correctly diagnosed lung cancer and diabetes in patients who have breathed into it. He and his colleagues believe the device could be linked to other brain imaging and mapping devices to create a real-time high-resolution image of the brain that pinpoints cancer hotspots.

The work was presented at the International Brain Mapping & Intraoperative Surgical Planning Society Conference at the University of California, Los Angeles this week.

Now Hear This: Don't Remove Earwax

The gooey, golden stuff that builds up inside your ears should stay there, according to national guidelines on earwax removal released today.

"[Earwax] is not intrinsically evil stuff, and consequently does not have to be removed merely because it's present," said Peter Roland, an ear, nose and throat doctor at the University of Texas Southwestern Medical Center at Dallas. "In fact, it serves a function and so if you don't need to take it out, you should just leave it alone."

Roland chaired a panel of doctors in charge of the new guidelines for earwax removal issued by the American Academy of Otolaryngology - Head and Neck Surgery Foundation (AAO-HNSF). The guidelines are intended to serve two purposes: to determine under what circumstances earwax needs to be removed, and to give doctors the scoop on which removal methods work best.

They hope the guidelines won't fall on deaf ears: About 12 million people a year in the United States seek medical care for impacted or excessive earwax. Impaction, they say, can cause pain, pressure, itching, foul odor, ringing of the ears, ear discharge and, in extreme cases, hearing loss.

Good-for-you goo

So there's a reason for the goo. Earwax is a self-cleaning agent, with protective, lubricating and antibacterial properties, doctors say.

That's why tiny glands in the outer ear canal constantly pump out a watery substance, which gets mixed with bits of dead hair and skin and together is called earwax or cerumen. Excess earwax normally treks slowly out of the ear canal, with an extra boost from chewing and other jaw movements, carrying with it dirt, dust and other small particles from the ear canal. Then, dried-up clumps of the stuff fall out of the ear opening.

When this natural earwax train malfunctions, or when individuals poke around in their ears with cotton swabs or other foreign objects such as bobby pins or matchsticks, earwax can build up and block part of the ear canal.

"Then there are lots of people wearing earplugs for one reason or another, either because they've got hearing aids or they're transcriptionists at work or because they're addicted to their walkman," Roland told LiveScience, "and that can increase the likelihood that the wax doesn’t come out on its own."

Older adults are more prone to earwax buildup than younger individuals.

"The wax gets much thicker and drier, and plus you actually end up with more hair in your ear, when you're older, and so it traps it," Roland said.

He added, "Unfortunately, many people feel the need to manually 'remove' cerumen from the ears. This can result in further impaction and other complications to the ear canal." He said the saying, "Don't put anything smaller than your elbow in your ear," holds true.

Leave your ears alone

For the everyday individual, the new guidelines suggest you leave your ears alone unless you experience symptoms that you think are associated with too much earwax.

"If they're going to do something at home, they should probably use drops of some sort," Roland said. The panel found no evidence that one type of over-the-counter drops works better than another, or better than just plain sterile water or sterile saline, he said.

The drops help to loosen the earwax and then the ear often can do the rest, he added.

The guidelines also state that cotton-tipped swabs or other objects should not be used to remove earwax. Oral jet irrigators and the alternative medicine technique called ear candling are also strongly advised against.

Ear candling involves making a hollow tube from fabric and soaking that in warm beeswax, which is cooled and hardens. Once cooled and hardened, the beeswax cone is stuck into the ear. The outer end of the tube is lit and burns for about 15 minutes, a process that supposedly draws the wax out of the ear.

Studies have shown, however, that the drawn-out stuff is material from the candle itself. Doctors have also reported seeing patients who have burned the outer parts of their ears with this method.

If the drops don't relieve your symptoms, or if you dislike drops but still have symptoms, it's time to see a doctor, Roland said.

The panel found that three common techniques for earwax removal at the doctor's office work best, with no single method outshining the others. These include flushing the ear out with a water solution; manually removing the earwax under a microscope using medical instruments; and sending the patient home with ear drops.

While at the doctor's office, Roland urges patients not to be embarrassed by a little earwax.

"I get a lot of people in here who are horrified when I see a little wax in their ear, and then they start apologizing for being dirty and they're just very upset it's present at all," Roland said. "And I think the big message there is that it has a physiological function, and unless there's a reason to remove it, you should just leave it alone. It's OK."

Scientists find ancient lost settlements in Amazon

A vast region of the Amazon forest in Brazil was home to a complex of ancient towns in which about 50,000 people lived, according to scientists assisted by satellite images of the region.

The scientists, whose findings were published on Thursday in the journal Science, described clusters of towns and smaller villages connected by complex road networks and housing a society doomed by the arrival of Europeans five centuries ago.

European colonists and the diseases they brought with them probably killed most of the inhabitants, the researchers said. The settlements, consisting of networks of walled towns and smaller villages organized around a central plaza, are now almost entirely overgrown by the forest.

"These are not cities, but this is urbanism, built around towns," University of Florida anthropologist Mike Heckenberger said in a statement.

"If we look at your average medieval town or your average Greek polis, most are about the scale of those we find in this part of the Amazon. Only the ones we find are much more complicated in terms of their planning," Heckenberger added.

Helped by satellite imagery, the researchers spent more than a decade uncovering and mapping the lost communities.

Prior to the arrival of Europeans starting in 1492, the Americas were home to many prosperous and impressive societies and large cities. These findings add to the understanding of the various pre-Columbian civilizations.

The existence of the ancient settlements in the Upper Xingu region of the Amazon in north-central Brazil means what many experts had considered virgin tropical forests were in fact heavily affected by past human activity, the scientists said.

The U.S. and Brazilian scientists worked with a member of the Kuikuro, an indigenous Amazonian people descended from the settlements' original inhabitants.


Friday, August 29, 2008

New Milky Way map reveals a complicated outer galaxy

CHICAGO -- The halo of stars that envelops the Milky Way galaxy is like a river delta criss-crossed by stellar streams large and small, according to new data from the Sloan Digital Sky Survey (SDSS-II). While the largest rivers of this delta have been mapped out over the last decade, analysis of the new SDSS-II map shows that smaller streams can be found throughout the stellar halo, said Kevin Schlaufman, a graduate student at the University of California at Santa Cruz.


A theoretical model of a galaxy like the Milky Way, showing trails of stars torn from disrupted satellite galaxies that have merged with the central galaxy. The structures seen in the SDSS-II star maps support this prediction of a complicated outer Galaxy. The region shown is about one million light years on a side; the sun is just 25,000 light years from the center of the Galaxy and would appear close to the center of this picture. Credit: K. Johnston, J. Bullock

Schlaufman reported his results at an international symposium in Chicago, titled "The Sloan Digital Sky Survey: Asteroids to Cosmology." Over the last three years, Schlaufman explained, the SEGUE survey of SDSS-II has measured the motions of nearly a quarter million stars in selected areas of the sky. A careful search for groups of stars at the same velocity turned up 14 distinct structures, 11 of them previously unknown.

"Even with SEGUE, we are still only mapping a small fraction of the Galaxy," said Schlaufman, "so 14 streams in our data implies a huge number when we extrapolate to the rest of the Milky Way." If each velocity structure were a separate stream, Schlaufman explained, there would be close to 1,000 in the inner 75,000 light years of the Galaxy. However, these structures could arise from a smaller number of streams that are seen many times in different places.
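The extrapolation above is simple proportional scaling: if the SEGUE fields sample a fraction f of the inner halo, the implied total is 14/f. The article does not state that fraction, so the value in the sketch below is assumed purely to illustrate how a "close to 1,000" figure would arise:

```python
# Back-of-envelope version of the stream extrapolation.
# The fraction of the inner halo SEGUE sampled is NOT given in the article;
# the value here is an assumption chosen only to illustrate the scaling.

detected_streams = 14          # distinct velocity structures found in SEGUE fields
surveyed_fraction = 0.014      # assumed fraction of the inner halo sampled

estimated_total = detected_streams / surveyed_fraction
print(f"implied total streams: ~{estimated_total:.0f}")  # ~1000
```

As the article notes, this is an upper bound of sorts: the same physical stream can cross several survey fields, so the true number of distinct streams could be considerably smaller.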

"A jumble of pasta" is the way Columbia University researcher Kathryn Johnston described her theoretical models of the Milky Way's stellar halo. In a review talk at the symposium, Johnston explained how dwarf galaxies that pass close to the Milky Way can be stretched by gravitational tides into spaghetti-like strands, which wind around the Galaxy as stars trace out the same orbital paths at different rates.

"In the center of the Galaxy, these stellar strands crowd together and you just see a smooth mix of stars," said Johnston. "But as you look further away you can start to pick out individual strands, as well as features more akin to pasta shells that come from dwarfs that were on more elongated orbits. By looking at faint features, Kevin may be finding some of the 'angel hair' that came from smaller dwarfs, or ones that were destroyed longer ago."

Heidi Newberg of Rensselaer Polytechnic Institute and her thesis student Nathan Cole have been trying to follow some of the larger strands as they weave across the sky. "It's a big challenge to piece things together," said Cole, "because the stream from one dwarf galaxy can wrap around the Galaxy and pass through streams of stars ripped from other dwarf galaxies."

Toward the constellation Virgo, where SDSS images revealed an excess of stars covering a huge area of sky, Newberg finds that there are at least two superposed structures, and possibly three or more. The SEGUE velocity measurements can separate systems that overlap in sky maps, Newberg explained in her symposium talk. "Part of what we see toward Virgo is a tidal arm of the Sagittarius dwarf galaxy, whose main body lies on the opposite side of the Milky Way, but we don't know the origin of the other structures. There really aren't enough pasta varieties to describe all the structures we find."

In addition to stellar streams, astronomers searching the SDSS data have found 14 surviving dwarf companions of the Milky Way, including two new discoveries announced today at the symposium by Gerard Gilmore of Cambridge University. These satellite galaxies are orbiting within the halo of invisible dark matter whose gravity holds the Milky Way itself together. Most of them are much fainter than the ten satellites known prior to the SDSS.

Because even the SDSS can only detect these ultra-faint dwarfs if they are relatively nearby, there could be several hundred more of them further out in the Milky Way's dark halo, according to independent analyses by graduate students Sergey Koposov, of the Max Planck Institute for Astronomy in Heidelberg, Germany, and Eric Tollerud, of the University of California at Irvine. "Even so," said Koposov, "we expect that the number of dark matter clumps is much larger than that, so something must prevent the smaller clumps from gathering gas and forming stars."

The SDSS dwarfs have far fewer stars than the previously known satellites, noted Gilmore, but they have similar spatial extents, and the stars within them move at similar speeds. "I think the internal dynamics of these tiny galaxies may be hard to explain with our conventional ideas about dark matter," said Gilmore.

"The SDSS has taught us a huge amount about the Milky Way and its neighbors," said Johnston, who is pleased to see some of the predictions of her models confirmed by the new data. "But we're still just beginning to map the Galaxy in a comprehensive way, and there's a trove of discoveries out there for the next generation of surveys, including the two new Milky Way surveys that will be carried out in SDSS-III."

The Sloan Digital Sky Survey is the most ambitious survey of the sky ever undertaken, involving more than 300 astronomers and engineers at 25 institutions around the world. SDSS-II, which began in 2005 and finished observations in July 2008, comprises three complementary projects. The Legacy Survey completed the original SDSS map of half the northern sky, determining the positions, brightness, and colors of hundreds of millions of celestial objects and measuring distances to more than a million galaxies and quasars. SEGUE (Sloan Extension for Galactic Understanding and Exploration) mapped the structure and stellar makeup of the Milky Way Galaxy. The Supernova Survey repeatedly scanned a stripe along the celestial equator to discover and measure supernovae and other variable objects, probing the accelerating expansion of the cosmos. All three surveys were carried out with special purpose instruments on the 2.5-meter telescope at Apache Point Observatory, in New Mexico.

Funding for the SDSS and SDSS-II has been provided by the Alfred P. Sloan Foundation, the Participating Institutions, the National Science Foundation, the U.S. Department of Energy, the National Aeronautics and Space Administration, the Japanese Monbukagakusho, the Max Planck Society, and the Higher Education Funding Council for England.

The SDSS is managed by the Astrophysical Research Consortium for the Participating Institutions. The SDSS-II Participating Institutions are the American Museum of Natural History, Astrophysical Institute Potsdam, University of Basel, University of Cambridge, Case Western Reserve University, University of Chicago, Drexel University, Fermilab, the Institute for Advanced Study, the Japan Participation Group, Johns Hopkins University, the Joint Institute for Nuclear Astrophysics, the Kavli Institute for Particle Astrophysics and Cosmology, the Korean Scientist Group, the Chinese Academy of Sciences (LAMOST), Los Alamos National Laboratory, the Max-Planck-Institute for Astronomy (MPIA), the Max-Planck-Institute for Astrophysics (MPA), New Mexico State University, Ohio State University, University of Pittsburgh, University of Portsmouth, Princeton University, the United States Naval Observatory, and the University of Washington.


Scientists discover why flies are so hard to swat

Over the past two decades, Michael Dickinson has been interviewed by reporters hundreds of times about his research on the biomechanics of insect flight. One question from the press has always dogged him: Why are flies so hard to swat?

"Now I can finally answer," says Dickinson, the Esther M. and Abe M. Zarem Professor of Bioengineering at the California Institute of Technology (Caltech).

Using high-resolution, high-speed digital imaging of fruit flies (Drosophila melanogaster) faced with a looming swatter, Dickinson and graduate student Gwyneth Card have determined the secret to a fly's evasive maneuvering. Long before the fly leaps, its tiny brain calculates the location of the impending threat, comes up with an escape plan, and places its legs in an optimal position to hop out of the way in the opposite direction. All of this action takes place within about 100 milliseconds after the fly first spots the swatter.

"This illustrates how rapidly the fly's brain can process sensory information into an appropriate motor response," Dickinson says.

For example, the videos showed that if the descending swatter--actually, a 14-centimeter-diameter black disk, dropping at a 50-degree angle toward a fly standing at the center of a small platform--comes from in front of the fly, the fly moves its middle legs forward and leans back, then raises and extends its legs to push off backward. When the threat comes from the back, however, the fly (which has a nearly 360-degree field of view and can see behind itself) moves its middle legs a tiny bit backwards. With a threat from the side, the fly keeps its middle legs stationary, but leans its whole body in the opposite direction before it jumps.

"We also found that when the fly makes planning movements prior to take-off, it takes into account its body position at the time it first sees the threat," Dickinson says. "When it first notices an approaching threat, a fly's body might be in any sort of posture depending on what it was doing at the time, like grooming, feeding, walking, or courting. Our experiments showed that the fly somehow 'knows' whether it needs to make large or small postural changes to reach the correct preflight posture. This means that the fly must integrate visual information from its eyes, which tell it where the threat is approaching from, with mechanosensory information from its legs, which tells it how to move to reach the proper preflight pose."

The results offer new insight into the fly nervous system, and suggest that within the fly brain there is a map in which the position of the looming threat "is transformed into an appropriate pattern of leg and body motion prior to take off," Dickinson says. "This is a rather sophisticated sensory-to-motor transformation and the search is on to find the place in the brain where this happens," he says.

Dickinson's research also suggests an optimal method for actually swatting a fly. "It is best not to swat at the fly's starting position, but rather to aim a bit forward of that to anticipate where the fly is going to jump when it first sees your swatter," he says.

The paper, "Visually Mediated Motor Planning in the Escape Response of Drosophila," will be published August 28 in the journal Current Biology.


Giant Clams Fed Early Humans

A new species of giant clam, Tridacna costata, found in the Red Sea. Credit: Carin Jantzen

By Charles Q. Choi, Special to LiveScience

Giant clams two feet long might have helped feed prehistoric humans as they first migrated out of Africa, new research reveals.

The species, Tridacna costata, once accounted for more than 80 percent of giant clams in the Red Sea, researchers now say. Today, these mollusks, the first new living species of giant clam found in two decades, represent less than 1 percent of giant clams living there.

This novel clam, whose shell has a distinctive scalloped edge, was discovered while scientists were attempting to develop a breeding program for another giant clam species, Tridacna maxima, which is prized in the aquarium trade. The new species appears to live only in the shallowest waters, which makes it particularly vulnerable to overfishing.

"These are all strong indications that T. costata may be the earliest example of marine overexploitation," said researcher Claudio Richter, a marine ecologist at the Alfred-Wegener-Institute for Polar and Marine Research in Bremerhaven, Germany.

Fossil evidence that the researchers uncovered suggests the stocks of these giant clams began crashing some 125,000 years ago, during the last interval between glacial periods. During that time, scientists think modern humans first emerged out of Africa, Richter said.

These mollusks could have played a key role in feeding people during that crucial era, serving as a prime target due to their large size, the scientists added. Indeed, competition for these clams and other valuable sea resources "may have been an important driver for human expansion," Richter told LiveScience.

Since this new species bore some features in common with two other living species of Red Sea clams, at first the researchers thought the new mollusk might have been a hybrid, but genetic analysis showed otherwise. These results were further corroborated by marked differences in behavior — while the other two clams spawn over a long period in summer, the new species reproduces during a brief spurt in spring.

No one had expected to discover a new giant clam species, "particularly in the Red Sea, one of the best investigated coral reef provinces," Richter said. The fact that it was overlooked for so long "is a testimony as to how little we really know about marine biodiversity."

Underwater surveys carried out in the Gulf of Aqaba (north of the Red Sea, between the Sinai Peninsula and Arabian mainland) and northern Red Sea revealed this long-overlooked clam must be considered critically endangered. Only six out of 1,000 live specimens the scientists observed belonged to the new species. This mollusk could be the earliest victim of human degradation of coral reefs in this region, the researchers added.

The scientists detailed their findings online on Aug. 28 in the journal Current Biology.



Bell Labs Kills Fundamental Physics Research


After six Nobel Prizes, the invention of the transistor, laser and countless contributions to computer science and technology, it is the end of the road for Bell Labs' fundamental physics research lab.

Alcatel-Lucent, the parent company of Bell Labs, is pulling out of basic science, material physics and semiconductor research and will instead be focusing on more immediately marketable areas such as networking, high-speed electronics, wireless, nanotechnology and software.

The idea is to align the Lab's research more closely with the areas the parent company is focusing on, says Peter Benedict, spokesperson for Bell Labs and Alcatel-Lucent Ventures.

"In the new innovation model, research needs to keep addressing the need of the mother company," he says.

That view is shortsighted and may drastically curtail the Labs' ability to come up with truly innovative discoveries, critics respond.

"Fundamental physics is absolutely crucial to computing," says Mike Lubell, director of public affairs for the American Physical Society. "Say in the case of integrated circuits, there were many, many small steps that occurred along the way resulting from decades worth of work in matters of physics."

Bell Labs was one of the last bastions of basic research within the corporate world, which over the past several decades has largely focused its R&D efforts on applied research -- areas of study with more immediate prospects of paying off.

Without internally funded basic research, fundamental science has instead come to rely on academic and government-funded laboratories to take on the kind of long-term projects without immediate, obvious payback that Bell Labs historically pursued, says Lubell.

Most of the scientists working in the company's fundamental physics department have been reassigned, says Benedict. Nature, which first reported the news, says just four scientists are left working in the fundamental physics department in Murray Hill, New Jersey. Benedict wouldn't confirm or deny that.

Computing and wireless technologies owe much to advancements in physics, though the connection may not always be immediately apparent. Take the Global Positioning System (GPS): an integral element of GPS is the atomic clock, which stemmed from the creation of the hydrogen maser.

The hydrogen maser, or hydrogen frequency standard, uses the properties of a hydrogen atom to serve as a precision frequency reference.

"GPS is based on very accurate timing mechanisms," says Lubell. "So the measure of time and the frequency standards that are used to do it date back to research in optical pumping which led to the development of hydrogen maser."
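Lubell's point about accurate timing can be made concrete with a rough back-of-the-envelope sketch (illustrative only, not from the article): a GPS receiver infers distance from signal travel time at the speed of light, so any clock offset maps directly into a range error.

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def range_error_m(clock_error_s):
    """Pseudorange error caused by a clock offset: distance = c * time."""
    return C * clock_error_s

# Even a 1-nanosecond timing error shifts the computed range by ~30 cm,
# which is why GPS depends on atomic frequency standards.
err = range_error_m(1e-9)
```

This is why maser-derived frequency standards, accurate to small fractions of a nanosecond, matter for positioning.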

In the past, Bell Labs was the place where such fundamental research, spanning both computing and physics, could flourish.

Bell Labs was founded in 1925 by Walter Gifford, then president of AT&T. AT&T, a monopoly, established Bell Telephone Laboratories, popularly known as Bell Labs, as a joint venture with Western Electric, AT&T's manufacturing subsidiary.

The Labs became a mecca for researchers in science, computing and mathematics. Deregulation, however, forced AT&T in 1995 to spin off Bell Labs and other parts of the company into Lucent Technologies. The move marked a shift in fortunes for the research arm as research budgets were trimmed and the company faced increasing pressure from stockholders.

"Bell Labs could do the kind of fundamental research it did in the past because it was functioning as part of a monopoly," says Lubell. "With that gone the landscape changed dramatically."

In recent years, Bell Labs' physics unit had its share of controversy when researcher J. Hendrik Schön was found to have manipulated and falsified data on molecular-scale transistors published between 1998 and 2001.

That's a long way from where the Labs once stood with its position as a Nobel Prize magnet.

In 1937, Bell Labs researcher Clinton Davisson shared the Nobel Prize in physics for demonstrating the wave nature of matter.

Nearly twenty years later, in 1956, came the Nobel Prize for inventing the transistor, shared by Bell scientists William Shockley, John Bardeen and Walter Brattain.

In the seventies, Bell Labs won two Nobel prizes in physics back-to-back in the years 1977 and 1978. Philip Anderson shared the Nobel for developing an improved understanding of the electronic structure of glass and magnetic materials. The next year Arno Penzias and Robert Wilson were feted for their discovery of cosmic microwave background radiation.

Former Bell Labs researcher Steven Chu shared the Nobel in 1997 for developing methods to cool and trap atoms with laser light. A year later Horst Stormer, Robert Laughlin, and Daniel Tsui were awarded a Nobel for the discovery and explanation of the fractional quantum Hall effect.

In the last few years, Lucent has sold its semiconductor business, which meant research connected to it had to be scaled back, especially in areas such as integrated circuits and microelectromechanical systems (MEMS).

Meanwhile, Alcatel-Lucent continues to hack away at its jewels. Though Murray Hill in New Jersey, the company's U.S. headquarters and the site of many great scientific discoveries, remains safe, Alcatel-Lucent has sold its Holmdel campus. Holmdel's technological contributions include work on Telstar, the first communications satellite, and Chu's Nobel Prize-winning research.

Still for fundamental physics research there will be life after Bell Labs, though it will be dependent on the whims of the federal government.

Increasingly, long-term research is being carried out in universities and national laboratories with federal grants, says Lubell.

For Bell Labs, yet another chapter in its storied history comes to a close, taking the once-iconic institution closer to being just another research arm of a major corporation.

Photo: William Shockley, John Bardeen and Walter Brattain invented the transistor in 1947. (Alcatel-Lucent/Bell Labs)


How bacteria could help power the future

By Michael Schirber

Hydrogen is the cleanest and most abundant fuel there is, but extracting it from water or organic material is currently not a very efficient process. Scientists are therefore studying certain bacteria that exhale hydrogen as part of their normal metabolism.

"The production of hydrogen by microorganisms is intimately linked to their cellular processes, which must be understood to optimize bioenergy yields," said Amy VanFossen of North Carolina State University.

Of particular interest are microbes that thrive in hot temperatures, near the boiling point of water. VanFossen and her colleagues carried out a detailed DNA study of one of these thermophilic (heat-loving) bacteria called Caldicellulosiruptor saccharolyticus, which was first found in a hot spring in New Zealand.

The results, presented last week at the American Chemical Society meeting in Philadelphia, indicate which genes allow C. saccharolyticus to eat plant material, referred to as biomass, and expel hydrogen in the process.

Fuel cell vehicles are starting to become available for lease in California and the New York area. They run on hydrogen gas and emit only water vapor from the tailpipe.

Hydrogen can be found everywhere: it's the "H" in H2O and a major element in biological processes. The problem is that it takes quite a bit of energy to separate the hydrogen from the molecules it is found in.

However, certain organisms, such as the bacteria in cow stomachs, get energy from food through a chemical reaction that releases hydrogen gas. Often this hydrogen is immediately taken up by other bacteria, called methanogens, that convert it to methane.

One of the challenges, therefore, of producing hydrogen from bacteria is to prevent the methanogens from gobbling up the gas. The advantage of thermophiles is that they operate at temperatures that are typically too hot for methanogens. C. saccharolyticus, for example, prefers a toasty 160 degrees Fahrenheit (70 degrees Celsius).

Furthermore, the chemistry of hydrogen formation is easier at these higher temperatures, said Servé Kengen from Wageningen University in the Netherlands.

"In general, thermophiles have a simpler fermentation pattern compared to [lower temperature] mesophiles, resulting in fewer byproducts," he said.

Bionic microbe
Kengen is part of a European Union project called Hyvolution, which is developing decentralized hydrogen production that can be performed near where biomass is grown.

"Biological hydrogen production is well suited for decentralized energy production," Kengen said. "The process is performed at almost ambient temperature and pressure, and therefore it is expected to be less energy intensive than thermochemical or electrochemical production methods [which are alternative ways to get hydrogen]."

Kengen said that C. saccharolyticus, or what he calls "Caldi," is very attractive for this application. It is unique in that it eats a wide range of plant materials, including cellulose, and can digest different sugars (technically, carbohydrates) at the same time.

"The wide range of carbohydrates it grows on suggests that C. saccharolyticus will yield a plethora of industrially relevant carbohydrate degrading enzymes," VanFossen told LiveScience.

These enzymes — now isolated through VanFossen's genetic analysis — could help get more hydrogen from a given quantity of biomass.

"Once we are able to engineer Caldi (not yet possible) we want to further improve its hydrogen producing capacity," Kengen said.

© 2008 LiveScience.com. All rights reserved.


Swift Enterprises Joins Race for Alternative Jet Fuel


Green Gym Uses Human-Powered Energy


Why US must invest against climate change

Eight scientific organisations have urged the next US president to help protect the country from climate change by pushing for increased funding for research and forecasting. The organisations say about $2 trillion of US economic output could be hurt by storms, floods and droughts.

"We don't think we have the right kind of tools to help decision makers plan for the future," said Jack Fellows, the vice president for corporate affairs of the University Corporation for Atmospheric Research, a consortium of 71 universities.

The groups, including the American Geophysical Union and the American Meteorological Society, urged Democratic presidential candidate Barack Obama and Republican rival John McCain to support $9 billion in investments between 2010 and 2014 to help protect the country from extreme weather, which would nearly double the current US budget for the area.

The UN's science panel says extreme weather events could hit more often as temperatures rise due to climate change.

Each year the United States suffers billions of dollars in weather-related damages ranging from widespread events like Hurricanes Katrina and Rita, and the more recent droughts in the Southeast, to smaller, more frequent glitches like airline delays from storms, they said. More than a quarter of the country's economic output, about $2 trillion, is vulnerable to extreme weather, they added.

The investments would pay for satellite and ground-based instruments that observe the Earth's climate and for computers to help make weather predictions more accurate.

Invest to protect

John Snow, the co-chairman of the Weather Coalition, a business and university group that advocates for better weather prediction, said improved computers would help scientists forecast extreme weather events more locally, which could help cities better prepare for weather disasters.

It could also help businesses that produce virtually no greenhouse emissions, such as wind farms, know where to best locate their operations, he said.

The scientists said cooler temperatures in the first half of this year are making their task more difficult. "One of the challenges we face ... is to make the case that while we are in a period of warming, we should not expect every year to be the warmest year on record," Snow said.

The global mean temperature to the end of July was 0.28 C above the 1961-1990 average, the UK-based Met Office, which conducts climate change research, said on Wednesday. That would make the first half of 2008 the coolest since 2000.

Neither campaign responded immediately to questions about the plea for funding. Obama and McCain, who face off in a November election, both support regulation of greenhouse gases through market mechanisms such as cap-and-trade programs on emissions.


Ford Tests Improve Gas Mileage 24% with EcoDriving

Ford is really throwing down the gauntlet, showing how dedicated it is to the new EcoDriving initiative we talked about the other day. I liked that initiative because it validates a lot of what we're trying to do on the forums in terms of improving fuel economy at the individual level, and it showed that automakers were willing to commit (at least in name) to supporting fuel-efficient driving. Now Ford has really stepped up to the plate, offering ecodriving lessons over the course of several days to see how effective the techniques really are.

Ford takes on ecodriving

Recently, Ford and a group called Pro Formance took on ecodriving in the form of a four-day seminar with 48 different drivers taking part. Using the ecodriving tips taught by Pro Formance, the participants increased their fuel economy by between 6% and 50%, with an average increase of 24%.
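To put a 24% mpg improvement in perspective, here is a small sketch of what such a gain means in fuel and money over a year. The driving figures below are hypothetical, not from Ford's tests:

```python
def annual_fuel_savings(miles_per_year, base_mpg, improvement_pct, price_per_gallon):
    """Return (gallons saved, dollars saved) per year for a given mpg improvement."""
    improved_mpg = base_mpg * (1 + improvement_pct / 100)
    saved_gallons = miles_per_year / base_mpg - miles_per_year / improved_mpg
    return saved_gallons, saved_gallons * price_per_gallon

# Hypothetical driver: 12,000 miles/year at 25 mpg, gas at $4.00/gallon
gallons, dollars = annual_fuel_savings(12_000, 25.0, 24.0, 4.00)
```

For that driver, a 24% improvement works out to roughly 93 gallons and about $370 saved per year.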

With the gas crunch hitting people hard, it's good to see a company like Ford stepping up and showing consumers that there's more to saving fuel than just airing up your tires and cleaning out the trunk. Here's their take on ecodriving:

“By working with Pro Formance to conduct validation testing, Ford is proving that eco-driving techniques are teachable and work across a broad spectrum of vehicles and drivers,” said Drew DeGrassi, president and CEO of Pro Formance Group. “It’s a great initiative for Ford to lead in this country. It’s not the end-all solution for America to obtain energy independence, but it is an important part of it.”

I would love to see what the training program is like, but for the rest of us Ford gives us 10 ecodriving tips. Sure, they pale in comparison to EcoModder's ecodriving tips list, but most drivers aren't interested in getting deeply involved, and Ford's hands-on approach is a good way to get results without asking too much of people.

Evidently, ecodriving programs like this have existed since the 1990s in Germany, where gas mileage has been an issue for longer than it has in the US. Hopefully, this will encourage other manufacturers to bring their most efficient vehicles and programs to a ready-and-willing US market.



Scientists: Save the planet—have fewer kids

As rising populations strain a warming planet, a British journal suggests having smaller families

Chicago Tribune correspondent

LONDON — There are plenty of ways to cut your carbon footprint, whether it's driving less or buying an energy-efficient refrigerator. But the British Medical Journal, in an editorial last month, urged a more controversial one: having fewer children.

With 60 million people already living in one of the most densely populated countries in the world, the journal said, British couples should aim to have no more than two children as part of their contribution to worldwide efforts to reduce carbon emissions, stem climate change and ease demands on the world's resources.

Limiting family size is "the simplest and biggest contribution anyone can make to leaving a habitable planet for our grandchildren," the editorial's authors said.

Family planning as a means to reduce climate change has been little talked about in international climate forums, largely because it is so politically sensitive. China's leaders, however, regularly argue that their country should get emission reduction credits because of their one-child policy, and many environmentalists—and even a growing number of religious and ethics scholars—say the biblical command to "be fruitful and multiply" needs to be balanced against Scripture calling for stewardship of the Earth.

Europe's rates diving

Increasingly, "a casual attitude toward global warming ought to be viewed as a sin," argues James Nash, director of the Churches' Center for Theology and Public Policy, a Washington-based research group that studies the relationship between Christian faith and public policy.

The appeal to have fewer children sounds a bit odd in Europe, where one of the biggest worries these days is plunging birthrates. German women today bear an average of 1.3 children, fewer than women in China, where the one-child policy is fast weakening. Even British women are giving birth to just 1.9 children on average, a level below that needed to produce a stable population.

But each child born in a rich country like Britain or the United States is likely to be responsible for 160 times as much carbon emitted as a child born in Ethiopia, said John Guillebaud, a British family-planning doctor, professor and one of the authors of the British Medical Journal editorial. With efforts to cut emissions likely to go only so far, cutting births may be the best option, he said.

"We're not Big Brother. We're not for pushing people," he insisted in an interview. "We just think deciding how big a family to have should take into consideration our descendants."

At the current projected rates of growth, the world's population, now at 6.7 billion, is expected to reach about 9 billion by 2050. Environmentalists argue that a population that large will dramatically overtax the world's resources and lead to growing conflict as well as potentially crippling climate change, particularly as poorer parts of the world develop and begin using more resources.

Most of the expected growth in population is projected to come in less-developed parts of the world, particularly Asia, where 60 percent of the world's people live, and Africa, where birthrates are the highest in the world.

Worldwide, population growth is declining, and even in much of Asia and Africa "the drop in fertility rate has been quite amazing," said Werner Haug, director of the United Nations Population Fund's technical division. Despite falling international investment in family planning, Thailand today has a European-like birthrate; Kenyan women, who once averaged eight children, are now having five.

Overall, Asia's birthrate, excluding China, is 2.8 children per woman, and Africa's is 5.4—well down from the past, said Carl Haub of the Washington-based Population Reference Bureau, an independent organization that analyzes demographic data.

Asia set for boom

But because a birthrate above 2.1 children per couple — the approximate replacement level, allowing for some untimely deaths—will produce ever-expanding growth, even Asia is still set to "grow like wildfire," Haub said.
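Haub's point can be sketched numerically: when fertility exceeds the replacement level, each generation scales the population by roughly the ratio of the two. The model below is a deliberately crude illustration, ignoring age structure, mortality changes and migration:

```python
def project_population(pop, fertility, replacement=2.1, generations=1):
    """Scale a population by (fertility / replacement) once per generation."""
    for _ in range(generations):
        pop *= fertility / replacement
    return pop

# At Asia's cited 2.8 children per woman, three generations multiply
# the starting population by (2.8 / 2.1)**3, roughly 2.4x.
factor = project_population(1.0, 2.8, generations=3)
```

At exactly the replacement rate the factor is 1.0 and the population holds steady, which is why even modest excesses above 2.1 compound into large growth.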

The problem is worst in places such as northern India, where literacy, education and access to birth control are poor and poverty levels and population numbers are already high. If those conditions continue, runaway growth could push India toward a population of 2 billion people, Haub said. Sub-Saharan Africa, at expected growth rates, is likely to nearly triple its population by 2050, also to about 2 billion people, he said.

Even in the United States, birthrates, which had fallen to around 1.85 children per non-Hispanic white woman, are now about 2.1 children per U.S. couple, thanks to Hispanic migration.

In a nation where Texas' 23 million people account for more greenhouse gas emissions than all 720 million Sub-Saharan Africans, even small rates of U.S. population growth may have a disproportionate impact on global warming, said the UN's Haug.

Experts say the best way to cut the world's birthrate is simply to push ahead with what has worked best in the past: education, access to information about birth-control options, and better health care to give parents confidence that children born will survive to adulthood.


Thursday, August 28, 2008

MythBusters Tackle Moon Conspiracies: Behind the Scenes

On the eve of one of their biggest busts yet, PM contributing editors Jamie Hyneman and Adam Savage explain how they made their own fake photos, built a moon set in an hour—and even went weightless themselves. No, they didn't build a rocket ship and actually go ... yet.

NASA Images Show Gamma Ray Bursts Across Milky Way


NASA researchers yesterday released images collected by a new telescope studying high-energy gamma rays. A combined image from 95 hours of the telescope's initial observations showed bursts of gamma rays glowing across the plane of the Milky Way.

The Gamma-Ray Large Area Space Telescope, renamed Fermi, was launched in June and is off to a promising start, NASA scientists said.

"I like to call it our extreme machine," said Jon Morse, the director of astrophysics for NASA. "It will help us crack the mysteries of these enormously powerful emissions."

Gamma rays are a powerful form of light invisible to the naked eye. Because Earth's atmosphere absorbs gamma rays, they can be studied only from space.

Fermi is gathering data on gamma rays that originate near black holes and high-energy stars called pulsars.

Though much remains unknown, bursts of gamma rays are thought to be emitted from particles coming out of black holes and pulsars, said Peter Michelson, a Stanford physicist and a principal investigator for the mission.

"We don't yet understand the mechanism for how the particles that emit the gamma rays are accelerated," Michelson said. "We're not even sure what the nature of the particles are."

The study is a follow-up on work done by the Energetic Gamma-Ray Experiment Telescope, a mission that studied gamma rays from 1991 to 2000.

Fermi's technology allowed scientists to compile in days what took the first mission one year to do, said Steve Ritz, one of the project's scientists.

The telescope was renamed Fermi yesterday, after Italian physicist Enrico Fermi, because he is "today regarded as one of the top scientists of the 20th century," Ritz said.

The scientists hope that in the five to 10 years that it is in orbit, Fermi will be as remarkable as its namesake.

"This powerful space observatory will explore the most extreme of environments for us," Morse said.


TV's 'Mythbusters' Tackle Moon Landing Hoax Claims

In 2005, Jamie Hyneman and Adam Savage, special effects experts better known by the title of their popular Discovery Channel series, "MythBusters", were asked during an interview about the myth they would most like to test provided an unlimited budget.

"Jamie and I have done the research, and figured that the only way to end the debate about the 'myth' of the Apollo moon landing is to go there," Savage replied to Slashdot, a technology news website, about the belief held by some that the United States faked the lunar landings.

Three years later, the Mythbusters are ready to share the results of their 'trip' as they devote their next show, airing on Wednesday, to the moon landing hoax claims.

"We built a hybrid rocket that was fueled by poo and nitrous oxide — thought we had enough Teflon tape on the seals but the stink got through anyway. Too bad that the footage got lost in transit to the editors," Hyneman told collectSPACE.com, explaining that their limited budget would not cover the cost of regular rocket fuel.

Of course, he was joking.

"Dude, I sooo wished we could have gone there," Savage admitted.

So, with their feet firmly planted on the Earth (at least for most of the time, but more on that later), Hyneman and Savage, along with fellow Mythbusters Tory Belleci, Kari Byron and Grant Imahara, set out to use science to 'bust' or confirm the truth behind the hoaxers' claims.

Low hanging fruit

Hoax believers have had 40 years to devise reasons why the Apollo moon landings must have been filmed in an Earth-based studio. Given their background in special effects, Hyneman felt they were well suited for the subject.

First however, they needed to choose which parts of the myth to test.

"We looked at the ones that for some reason or other, seemed most prevalent," Savage explained in an e-mail interview.

"We took the low hanging fruit," Hyneman added. "The key idea was that the footage that proved we were there was a special effect. Adam and I are experienced effects artists, so it was natural for us to dig into it."

"We wanted to tackle the ones that actually take some experimentation to prove," Savage said.

To narrow the field however, the Mythbusters sought the assistance of someone very familiar with debunking the moon hoax myth, or they would have if he had not come calling first.

"I was actually first involved with the Mythbusters early on, when I was contacted by one of their producers asking if I had any astronomical myths for them to bust," shared Dr. Phil Plait, a.k.a. "The Bad Astronomer", in an interview with collectSPACE.com. An astronomer who worked with the Hubble Telescope, Plait created a website, Bad Astronomy, aimed at dispelling astronomy and science based myths, including the moon hoax, which expanded into books and his recent appointment as president of the James Randi Educational Foundation.

"I made some suggestions but sadly they didn't use any of them," Plait said. "I guess most of them don't make very good TV."

That early interaction, which was followed by meeting the Mythbusters at conferences, led to Plait establishing a relationship with the show. So he was surprised when a fellow astronomer contacted him about the Mythbusters investigating the moon hoax.

"I hadn't heard anything about [this show] so I fired off an e-mail to Adam Savage and said, 'What gives?' and he e-mailed me back and said, 'Oh oh oh, we're going to ask you about this,'" recalled Plait.

"Over the course of a few days, they were on the phone with me and a lot of other people who knew about, for example, the properties of the lunar surface, to try to figure out not just the best way of debunking the moon hoax but the best aspects of it... so they wanted to know which ones that they had found were the ones that I ran into and what were the best ways to tackle them. It was actually a lot of fun."

Ultimately, Hyneman, Savage and the others settled on three major areas of the hoax: how light interacted with the lunar surface, how the astronauts appeared to move in the low gravity of the Moon and how items behaved in the airless void of space.


Strange Clouds at the Edge of Space

August 25, 2008: When in space, keep an eye on the window. You never know what you might see.

Last month, astronauts on board the International Space Station (ISS) witnessed a beautiful display of noctilucent or "night-shining" clouds. The station was located about 340 km over western Mongolia on July 22nd when the crew snapped this picture:


Above: Noctilucent clouds photographed by the crew of the ISS.

Atmospheric scientist Gary Thomas of the University of Colorado has seen thousands of noctilucent cloud (NLC) photos, and he ranks this one among the best. "It's lovely," he says. "And it shows just how high these clouds really are--at the very edge of space."

He estimates the electric-blue band was 83 km above Earth's surface, higher than 99.999% of our planet's atmosphere. The sky at that altitude is space-black. It is the realm of meteors, high-energy auroras and decaying satellites.

What are clouds doing up there? "That's what we're trying to find out," says Thomas.

People first noticed NLCs at the end of the 19th century after the 1883 eruption of Krakatoa. The Indonesian supervolcano hurled plumes of ash more than 50 km high in Earth's atmosphere. This produced spectacular sunsets and, for a while, turned twilight sky watching into a worldwide pastime. One evening in July 1885, Robert Leslie of Southampton, England, saw wispy blue filaments in the darkening sky. He published his observations in the journal Nature and is now credited with the discovery of noctilucent clouds.

Scientists of the 19th century figured the clouds were some curious manifestation of volcanic ash. Yet long after Krakatoa's ash settled, NLCs remained.

"It's a puzzle," says Thomas. "Noctilucent clouds have not only persisted, but also spread." In the beginning, the clouds were confined to latitudes above 50°; you had to go to places like Scandinavia, Siberia and Scotland to see them. In recent years, however, they have been sighted from mid-latitudes such as Washington, Oregon, Turkey and Iran:


Above: Noctilucent clouds over Mt. Sabalan, a 15,784-ft extinct volcano in northwestern Iran. Photo credit: Siamak Sabet.

"This year's apparition over Iran (pictured above) was splendid," says Thomas. The Persian clouds appeared on July 19th, just a few days before the ISS display, and were photographed from latitude 38° N. "That's pretty far south," he says.

The genesis and spread of these clouds is an ongoing mystery. Could they be signs of climate change? "The first sightings do coincide with the Industrial Revolution," notes Thomas. "But the connection is controversial."

NASA is investigating. The AIM satellite, launched in April 2007, is now in polar orbit where it can monitor the size, shape and icy make-up of NLCs. The mission is still in its early stages, but already some things have been learned. Thomas, an AIM co-Investigator, offers these highlights:

1. Noctilucent clouds appear throughout the polar summer, are widespread, and are highly variable on hourly to daily time scales. A movie made from daily AIM snapshots shows the 2007 NLC season unfolding over the north pole.

Right: A daily snapshot of noctilucent cloud activity over the North Pole in 2007. Credit: AIM/Goddard Space Flight Center Scientific Visualization Studio.

2. There is a substantial population of invisible noctilucent clouds. Thomas explains: "NLCs are made of tiny ice crystals 40 to 100 nanometers wide—just the right size to scatter blue wavelengths of sunlight. This was known before AIM. The spacecraft has detected another population of much smaller ice crystals (<>

3. Some of the shapes in noctilucent clouds, resolved for the first time by AIM's cameras, resemble shapes in tropospheric clouds near Earth's surface. AIM science team members have described the similarities as "startling." The dynamics of weather at the edge of space may not be as unEarthly as previously supposed.
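Thomas's "just the right size" remark about 40-to-100-nanometer crystals reflects Rayleigh scattering: particles much smaller than the wavelength of light scatter short (blue) wavelengths far more strongly, with intensity proportional to 1/λ⁴. A quick sketch, using illustrative wavelengths rather than AIM data:

```python
def rayleigh_scattering_ratio(short_nm, long_nm):
    """How much more strongly the shorter wavelength scatters (intensity ~ 1/lambda^4)."""
    return (long_nm / short_nm) ** 4

# Blue (~450 nm) versus red (~650 nm) light: blue scatters ~4.4x more strongly,
# which is one reason NLCs appear electric-blue.
ratio = rayleigh_scattering_ratio(450.0, 650.0)
```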

These findings are new and important, but they don't yet unravel the central mysteries:

Why did NLCs first appear in the 19th century?

Why are they spreading?

What is ice doing in a rarefied layer of Earth's upper atmosphere that is one hundred million times drier than air from the Sahara desert?

AIM has just received a 3-year extension (from 2009 to 2012) to continue its studies. "We believe that more time in orbit and more data are going to help us answer these questions," says Thomas.

Meanwhile, it's a beautiful mystery. Just ask anyone at the edge of space.


Yellowstone's Ancient Supervolcano: Only Lukewarm?

Molten plume of material beneath Yellowstone cooler than expected

Yellowstone National Park and its famous geysers are the remnants of an ancient supervolcano.

The geysers of Yellowstone National Park owe their existence to the "Yellowstone hotspot"--a region of molten rock buried deep beneath Yellowstone, geologists have found.

But how hot is this "hotspot," and what's causing it?

In an effort to find out, Derek Schutt of Colorado State University and Ken Dueker of the University of Wyoming took the hotspot's temperature.

The scientists published the results of their research, funded by the National Science Foundation (NSF)'s Division of Earth Sciences, in the August 2008 issue of the journal Geology.

"Yellowstone is located atop one of the few large volcanic hotspots on Earth," said Schutt. "But though the hot material is a volcanic plume, it's cooler than others of its kind, such as the one in Hawaii."

When the supervolcano at this spot last erupted, more than 600,000 years ago, it blanketed half of today's United States in volcanic ash. What drives the Yellowstone supervolcano's periodic eruptions is still unknown.

Thanks to new seismometers in the Yellowstone area, however, scientists are obtaining new data on the hotspot.

Past research found that in rocks far beneath southern Idaho and northwestern Wyoming, seismic energy from distant earthquakes slows down considerably.

Using the recently deployed seismometers, Schutt and Dueker modeled the effects of temperature and other processes that affect the speed at which seismic energy travels. They then used these models to make an estimate of the Yellowstone hotspot's temperature.

They found that the hotspot is "only" 50 to 200 degrees Celsius hotter than its surroundings.
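A back-of-the-envelope sketch shows how a seismic slowdown translates into an excess-temperature estimate like the one above. Both numbers below are illustrative assumptions, not values from Schutt and Dueker's paper: a linear shear-velocity sensitivity of about -4×10⁻⁴ km/s per °C and a hypothetical 1.5% velocity reduction beneath the hotspot. Their actual models are more sophisticated, accounting for effects beyond temperature alone.

```python
# Back-of-the-envelope: map a shear-wave velocity reduction to an excess
# temperature, using an assumed linear sensitivity. The constants here are
# illustrative ballpark values, NOT numbers from Schutt & Dueker (2008).

DVS_DT = -4.0e-4     # assumed dVs/dT, km/s per degree C (hotter rock = slower)
BACKGROUND_VS = 4.4  # km/s, assumed upper-mantle shear velocity away from the plume

def excess_temperature(observed_vs):
    """Degrees C hotter than surroundings implied by a slow shear velocity."""
    dv = observed_vs - BACKGROUND_VS   # negative for a slow anomaly
    return dv / DVS_DT

# A hypothetical 1.5% slowdown in the rocks beneath the hotspot:
slow_vs = BACKGROUND_VS * 0.985
print(f"Implied excess temperature: {excess_temperature(slow_vs):.0f} C")
```

Under these assumptions a 1.5% slowdown implies roughly 165 °C of excess heat, squarely inside the 50-200 °C range the researchers report; the point of the sketch is only that modest velocity anomalies map to modest, "lukewarm" temperature excesses.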

"Although Yellowstone sits above a plume of hot material coming up from deep within the Earth, it's a remarkably 'lukewarm' plume," said Schutt, comparing Yellowstone to other plumes.

Although the Yellowstone volcano's continued existence is likely due to the upwelling of this hot plume, the plume may have become disconnected from its heat source in Earth's core.

"Disconnected, however, does not mean extinct," said Schutt. "It would be a mistake to write off Yellowstone as a 'dead' volcano. A hot plume, even a slightly cooler one, is still hot."
