Sunday, September 21, 2008

Stardust evidence points to planet collision

Masses of dust floating around a distant binary star system suggest that two Earth-like planets obliterated each other in a violent collision, U.S. researchers reported on Friday.

"It's as if Earth and Venus collided with each other," Benjamin Zuckerman, an astronomer at the University of California Los Angeles, who worked on the study, said in a statement.

"Astronomers have never seen anything like this before; apparently major, catastrophic collisions can take place in a fully mature planetary system."

Writing in the Astrophysical Journal, the team at UCLA, Tennessee State University and the California Institute of Technology said it spotted the dust orbiting a star known as BD +20 307, about 300 light-years from Earth in the constellation Aries.

A light-year is the distance light travels in a year, or about 6 trillion miles. So the observations are, in essence, looking back in time about 300 years.

"If any life was present on either planet, the massive collision would have wiped out everything in a matter of minutes: the ultimate extinction event," said Gregory Henry of Tennessee State University.

BD +20 307 appears to be composed of two stars, both very similar in mass, temperature and size to the Earth's sun. They spin about their common centre of mass every 3 1/2 days or so.
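A 3 1/2-day period for two roughly sun-like stars implies a very tight orbit, which can be estimated with Kepler's third law. The sketch below is our own back-of-the-envelope calculation, not part of the study; it assumes each star has about one solar mass, as the article's comparison to the sun suggests:

```python
def binary_separation_au(period_days, total_mass_msun):
    """Kepler's third law in convenient units: a^3 = (M1 + M2) * P^2,
    with the separation a in AU, the period P in years, and masses in
    solar masses."""
    p_years = period_days / 365.25
    return (total_mass_msun * p_years ** 2) ** (1.0 / 3.0)

# Two sun-like stars orbiting every 3.5 days sit only ~0.06 AU apart,
# far inside the roughly 1 AU orbit of the dust described below.
separation = binary_separation_au(3.5, 2.0)
```

That separation, a few percent of the Earth-sun distance, is why the dust ring at roughly 1 AU can be treated as orbiting the pair as a whole.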

"The planetary collision in BD +20 307 was not observed directly but, rather, was inferred from the extraordinary quantity of dust particles that orbit the binary pair at about the same distance as Earth and Venus are from our sun," Henry said.

"If this dust does indeed point to the presence of terrestrial planets, then this represents the first known example of planets of any mass in orbit around a close binary star."

In July 2005, the team reported it had spotted the system, then believed to consist of a single star. It was surrounded by more warm orbiting dust than any other sun-like star known to astronomers.

"This poses two very interesting questions," said Tennessee State's Francis Fekel. "How do planetary orbits become destabilized in such an old, mature system? Could such a collision happen in our own solar system?"

87-Million-Year-Old Praying Mantis Found

Is the 87-million-year-old praying mantis recently found encased in amber in Japan a “missing link” between mantises from the Cretaceous period and modern-day insects?

It is a rare find indeed, and its true significance has yet to be deciphered.

Discovered in January of this year by Kazuhisa Sasaki, director of the Kuji Amber Museum, the fossil mantis measures 0.5 inch (1.4 centimeters) from its antennae to the tip of its abdomen.

It was found buried more than six feet below the surface of an amber mine in a part of Japan that is famous for producing large amounts of amber, the northeastern Iwate Prefecture.

“I found it in a deposit that had lots of other insects—ancient flies, bees, and cockroaches—but this was the only praying mantis,” said Sasaki.

The fossil mantis is fairly well preserved, although its wings and abdomen are badly crushed.

According to Kyoichiro Ueda, executive curator of the Kitakyushu Museum of Natural History, it is the oldest mantis fossil ever found in Japan and one of only seven in the entire world from the Cretaceous period.

More remarkable still, this mantis differs from the other six: it has two spines protruding from its femur and mysterious, tiny hairs on its forelegs.

No other mantis from this period has ever been found with spines, although modern mantises have five or six on their forelegs, which help them catch prey.

“The years of the late Cretaceous period were a kind of transition phase between the ancient and modern worlds, and this fossil displays many intermediate elements between the two eras,” said Ueda.

Time alone will reveal the significance of this important find.

The Holes in Our Genomes

Scanning DNA for structural changes brings new insight into disease.

Over the past two years, scientists have made a surprising discovery about our DNA. Like a book with torn pages, duplicate chapters, or upside-down paragraphs, everyone's genome is riddled with large mistakes. These "copy number variations" can include deletions, duplications, and rearrangements of stretches of DNA ranging in size from one thousand to one million base pairs. New tools to screen for such mistakes, described this month in Nature Genetics, should generate a more complete picture of the genetic root of common diseases.
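The paragraph above describes copy number variations as gains and losses of DNA relative to the usual two copies. As a toy illustration of how such changes show up in microarray data (this is our simplification, not any group's actual analysis pipeline), the copy number at a probe can be estimated from its log2 intensity ratio against a diploid reference:

```python
def copy_number_from_log2_ratio(log2_ratio):
    """Estimate an integer copy number from a microarray log2 intensity
    ratio against a diploid (two-copy) reference.

    A normal diploid region gives a ratio near 0 (2 copies); a
    heterozygous deletion gives roughly log2(1/2) = -1 (1 copy); a
    single-copy duplication gives roughly log2(3/2) ~ 0.58 (3 copies).
    """
    return round(2 * 2 ** log2_ratio)
```

Real analyses must average many noisy probes and segment the genome before calling a deletion or duplication, but the underlying intensity-to-copy-number mapping is this simple in principle.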

"There has been a shock at the prevalence of this kind of variation and a desire to characterize it more fully and to integrate it into genome-wide studies of disease," says Matthew Hurles, a geneticist at the Wellcome Trust Sanger Institute, in Cambridge, U.K., who was not involved in either study. "Now we have the tools that will enable those discoveries."

Over the past few years, advances in gene microarray technologies, which can quickly survey large volumes of DNA, have allowed scientists to screen more human genomes than ever before, resulting in a flood of information linking specific genes to disease. Most of these studies begin by looking for single-letter changes in the DNA code, called single-nucleotide polymorphisms, or SNPs (pronounced "snips"). SNPs found more often in people with a particular disease point researchers to genetic variations that might play a role in that disease. Scientists have so far identified 200 disease-linked genes using this approach, but even large studies of thousands of patients have uncovered genetic variations that account for only a small proportion of complex diseases. In type 2 diabetes, for example, the 18 disease-linked genes that have been identified explain perhaps 5 percent of the disease's heritability.

Scientists have now adapted these microarrays to identify both small SNP changes and copy number variations, which they hope will help them identify a larger fraction of disease-causing genes. In one of the papers in Nature Genetics, David Altshuler, a physician and scientist at the Broad Institute, in Cambridge, MA, and his collaborators described such a chip, designed in collaboration with genomics instrument maker Affymetrix, which they then used to map this kind of variation.

Altshuler's team assayed the DNA of 270 people whose genomes were already being studied as part of the HapMap project, which is cataloguing common genetic variants. They found that most copy number variations are inherited, as SNPs are, rather than arising anew in individuals. That news is likely to be a relief to geneticists, because it means that they can survey many structural changes by employing the same high-throughput approach used to catalogue SNPs.

The Broad group and others are now using microarrays that look for copy number variations to study a variety of common diseases. "In the next couple of years, we should really start to see new insights into the disease-causing mechanisms that result from this kind of variation," says Hurles. In fact, two such studies have already yielded important insights, identifying rare deletions linked to autism and schizophrenia. "If we can interrogate both kinds of variation in the same patient in the same experiment, we can get an integrated understanding of how variations come together to influence disease," says Steven McCarroll, a geneticist at the Broad and lead author of the paper.

Still, some types of copy number variations may be going undetected. In a second paper published in Nature Genetics, Greg Cooper and his colleagues at the University of Washington compared data collected using microarrays sold by Illumina with a sequencing-based assay published last year. They found that the array missed a number of changes centered on so-called hot spots, where multiple duplications--a string of four or five copies of a gene--make DNA difficult to study. Cooper says that different approaches will likely be needed to study these changes.

"The more detail we can get about our genomes--each peeling of the layer of the onion--teaches us more about disease," says Altshuler. "The technology is moving in parallel, so as we move further, we can investigate each layer in detail."

Problems Stall Action for Collider

A week after subatomic particles began zooming around its underground racetrack to cheers and Champagne toasts, the world’s most powerful particle accelerator, the Large Hadron Collider, at CERN outside Geneva, is still struggling to take its next big step.

The scientists and engineers on the project had hoped that as early as late next week the collider would actually begin to collide subatomic particles, though at energies far below the cataclysmic levels that have had some skeptics worried about the creation of black holes that could eat the world, a possibility that scientists dismiss as science fiction.

The machine is designed to accelerate protons to seven trillion electron volts and then bang them together in search of new forces and particles. The initial attempt at running protons through the collider, on Sept. 10, was so successful that CERN scientists thought they might achieve the initial collisions ahead of the scheduled two weeks after “first beam.” But those hopes were dashed by a series of “teething problems,” as one engineer put it, including the failure of a 30-ton transformer in the system for chilling the helium that, in turn, chills the superconducting magnets that guide the protons. On Friday, CERN announced that a large spill of helium in the collider tunnel would mean a further delay.
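The energies quoted above translate into extreme proton speeds, which follow directly from special relativity. As a rough illustration of our own (using the standard proton rest energy, not figures from the article):

```python
import math

PROTON_MASS_GEV = 0.938272  # proton rest energy in GeV (m c^2)

def lorentz_factor(energy_gev):
    """Relativistic gamma for a proton of total energy E: gamma = E / (m c^2)."""
    return energy_gev / PROTON_MASS_GEV

def beta(energy_gev):
    """Proton speed as a fraction of the speed of light: beta = sqrt(1 - 1/gamma^2)."""
    g = lorentz_factor(energy_gev)
    return math.sqrt(1.0 - 1.0 / g ** 2)

# At the 7 TeV (7000 GeV) design energy, gamma is about 7,500, so each
# proton travels within a few metres per second of the speed of light;
# even at the 450 GeV injection energy the beam is already ultra-relativistic.
gamma_design = lorentz_factor(7000.0)
```

This is why the step from 450 billion to 7 trillion electron volts matters for discovery physics yet barely changes the beam's speed.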

The collisions, when they happen, will be at the relatively modest energy of 450 billion electron volts, a realm well explored by other machines, and will last for only a day or two.

They will allow the scientists to calibrate and begin to understand the mountains of detectors, wires, computers and magnets that have been built to capture and analyze the proton collisions. “The first job is to relearn what we already know,” said Tom LeCompte, from the Argonne National Laboratory in Illinois. He works on a collider detector known as Atlas.

When the machine will begin colliding protons at high energy is a guessing game at best. If all goes well, scientists and engineers at CERN say, it could happen by the middle of October.

Einstein fridge design can help global cooling

An early invention by Albert Einstein has been rebuilt by scientists at Oxford University who are trying to develop an environmentally friendly refrigerator that runs without electricity.

Modern fridges are notoriously damaging to the environment. They work by compressing and expanding man-made greenhouse gases called freons - far more damaging than carbon dioxide - and are being manufactured in increasing numbers. Sales of fridges around the world are rising as demand increases in developing countries.

Now Malcolm McCulloch, an electrical engineer at Oxford who works on green technologies, is leading a three-year project to develop more robust appliances that can be used in places without electricity.

Einstein refrigerator

His team has completed a prototype of a type of fridge patented in 1930 by Einstein and his colleague, the Hungarian physicist Leo Szilard. It had no moving parts and used only pressurised gases to keep things cold. The design was partly used in the first domestic refrigerators, but the technology was abandoned when more efficient compressors became popular in the 1950s. That meant a switch to using freons.

Einstein and Szilard's idea avoids the need for freons. It uses ammonia, butane and water and takes advantage of the fact that liquids boil at lower temperatures when the air pressure around them is lower. 'If you go to the top of Mount Everest, water boils at a much lower temperature than it does when you're at sea level and that's because the pressure is much lower up there,' said McCulloch.

At one side is the evaporator, a flask that contains butane. 'If you introduce a new vapour above the butane, the liquid boiling temperature decreases and, as it boils off, it takes energy from the surroundings to do so,' says McCulloch. 'That's what makes it cold.'
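The pressure effect McCulloch describes can be made quantitative with the Clausius-Clapeyron relation. The sketch below, for water rather than the fridge's butane, is our own rough estimate using textbook values, not a calculation from the researchers:

```python
import math

R = 8.314            # gas constant, J/(mol*K)
DH_VAP = 40.7e3      # enthalpy of vaporisation of water, J/mol (approx.)
T0, P0 = 373.15, 101325.0  # water boils at 100 C at 1 atm

def boiling_point_c(pressure_pa):
    """Clausius-Clapeyron estimate of water's boiling point (in Celsius)
    at a given ambient pressure, assuming DH_VAP is constant."""
    inv_t = 1.0 / T0 - R * math.log(pressure_pa / P0) / DH_VAP
    return 1.0 / inv_t - 273.15

# At roughly a third of an atmosphere, about the pressure on the summit
# of Everest, the boiling point drops to around 70 C.
everest_boiling = boiling_point_c(33_700.0)
```

The Einstein-Szilard design exploits the same physics chemically: introducing ammonia vapour above the butane lowers the butane's partial pressure, so it boils at a lower temperature and draws heat from its surroundings.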

Pressurised gas fridges based around Einstein's design were replaced by freon-compressor fridges partly because Einstein and Szilard's design was not very efficient. But McCulloch thinks that by tweaking the design and replacing the types of gases used it will be possible to quadruple the efficiency. He also wants to take the idea further. The only energy input needed into the fridge is to heat a pump, and McCulloch has been working on powering this with solar energy.

'No moving parts is a real benefit because it can carry on going without maintenance. This could have real applications in rural areas,' he said.

McCulloch's is not the only technology to improve the environmental credentials of fridges. Engineers working at a Cambridge-based start-up company, Camfridge, are using magnetic fields to cool things. 'Our fridge works, from a conceptual point of view, in a similar way [to gas compressor fridges] but instead of using a gas we use a magnetic field and a special metal alloy,' said managing director Neil Wilson.

'When the magnetic field is next to the alloy, it's like compressing the gas, and when the magnetic field leaves, it's like expanding the gas.' He added: 'This effect can be seen in rubber bands - when you stretch the band it gets hot, and when you let the band contract it gets cold.'

Doug Parr, chief scientist at Greenpeace UK, said creating greener fridges was hugely important. 'If you look at developing countries, if they're aspiring to the lifestyles that we lead, they're going to require more cooling - whether that's air conditioning, food cooling or freezing. Putting in place the technologies that are both low greenhouse-gas refrigerants and low energy use is critical.'

McCulloch's fridge is still in its early stages. 'It's very much a prototype; this is nowhere near commercialised,' he said. 'Give us another month and we'll have it working.'