
Monday, July 28, 2008

Young Galaxies Have Surprisingly Strong Magnetic Fields: Contradicts Popular Theories


The magnetic fields of nearby galaxies are comparable in strength to those measured for quasars billions of light years away (large: the "Whirlpool" galaxy; small: quasar OC-65). (Credit: www.mpifr-bonn.mpg.de)

The origin of magnetic fields in galaxies is still a mystery to astronomers. Popular theories suggest continual strengthening over billions of years. The latest results from Simon Lilly’s group, however, contradict this assumption and reveal that young galaxies also have strong magnetic fields.

“There is an astronomer joke that goes ‘to understand the universe, we examine galaxies for radiation, gases, temperatures, chemical composition and much more. Anything we can’t explain after that, we attribute to magnetic fields’,” explains Simon Lilly, Professor at the Institute of Astronomy at ETH Zurich. The creation of magnetic fields in galaxies remains a poorly researched mystery. Until now, it was assumed that galaxies which formed after the Big Bang 13.8 billion years ago had very weak magnetic fields that then grew exponentially in strength over several billion years. At least, that is what the dynamo theory (see box), which is often used to explain the development of magnetic fields, suggests.

Statistical approach for exact proof

In the journal Nature, Martin Bernet, Francesco Miniati and Simon Lilly probed into the topic of magnetic fields in young galaxies. The results are astonishing: contrary to the popular dynamo explanatory model, the team was able to show, on the basis of a statistical analysis of existing and new astronomical data, that even very young galaxies have strong magnetic fields. Determining the strength of magnetic fields many billions of light years from Earth is technically difficult and extremely time-consuming, which is probably one reason the field has hardly been researched.

Using the Faraday Rotation (FR) as a probe, however, the strength of a magnetic field can be deduced from the polarization of light in the radio band. If linearly polarized light passes through a magnetized gas cloud, its plane of polarization rotates, and the rotation is larger the stronger the magnetic field is. This effect was first described by Michael Faraday in 1845. The researchers used quasars as radiation sources to measure the magnetic fields in the galaxies in question. Quasars are extremely luminous objects whose radiation can in all likelihood be explained by supermassive black holes at the hearts of their host galaxies.
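
For readers who want the relationship in symbols: the standard Faraday-rotation formula (not quoted in the article, so take the notation as the usual convention rather than the authors' exact expression) ties the change in the polarization angle to the squared wavelength through a rotation measure that integrates the magnetic field along the line of sight:

    \Delta\chi = \mathrm{RM}\,\lambda^{2},
    \qquad
    \mathrm{RM} \approx 0.81 \int n_{e}\, B_{\parallel}\, \mathrm{d}l \;\; \mathrm{rad\,m^{-2}}

Here n_e is the electron density in cm^-3, B_parallel the line-of-sight field in microgauss, and l the path length in parsecs. Measuring the rotation at several radio wavelengths yields RM; with an estimate of the electron density along the path, the field strength follows.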

Observations in Chile

For their analyses, the scientists around Lilly had recourse to FR quasar measurements conducted by the astronomer Philipp Kronberg from the University of Toronto. Martin Bernet, a PhD student of Lilly’s and Miniati’s, studied the relationship between the Faraday Rotation and the redshift of the quasars’ light for 300 of these FR measurements. In astronomy, the redshift is used to determine the age and distance of galaxies.

The researchers developed a thesis from the statistical distribution of the values obtained: “The stronger observed rotation of the quasar light at higher redshift can be ascribed to the light's longer path and the correspondingly greater probability of passing through other galaxies”. To verify this thesis, the astronomers selected 76 quasars from the original Kronberg sample and, using the Very Large Telescope (VLT) in Chile, observed how many magnesium absorption lines are contained in the quasar spectra. We know from earlier studies that almost every galaxy along a quasar’s line of sight (the path of the light between quasar and telescope) produces magnesium absorption.

The researchers were thus able to determine how many galaxies lie between ours and each quasar, and to ascertain the galaxies’ magnetic fields by comparing the FR values of sightlines with and without magnesium absorption. For the magnetic fields of the galaxies, the calculations yielded a value of approximately 10 μGauss, a field tens of thousands of times weaker than the Earth’s magnetic field at the surface. This more or less corresponds to the value for our own galaxy, the Milky Way. The results enabled the researchers to show that young, distant galaxies also have a strong, large-scale magnetic field.

This was at a time when the universe was only a third as old as it is today. The finding contradicts the popular dynamo theory, according to which magnetic fields build up exponentially over billions of years through constant reinforcement. “A galaxy’s magnetic field has to develop much more rapidly during its evolution than we previously assumed. A lasting equilibrium then appears at a relatively early stage”, explains Lilly.

Magnetic fields slip into the consciousness of astronomers

The scientific community has had doubts about the dynamo theory for some time. Kronberg, too, repeatedly expressed misgivings about the existing model over his thirty years of FR measurements. The proof, however, had been lacking until now. According to Lilly, the article in the latest issue of “Nature” above all demonstrates the high quality of the VLT measurements and gives a clear answer to the original question, something rather rare in astronomy.

“I could imagine that the methodology we recently introduced and the combination of Faraday Rotation measurements and data from telescopes like the VLT might open a new window into the distant universe”, Lilly speculates.

As the authors point out in the conclusion of the “Nature” article, the results should also lead to the reconsideration of the existing astronomical practice, where the magnetic fields are often ignored. Lilly and his team will continue to peer out of the window that has just opened and probe into the secrets of the magnetic fields. The researchers have already been granted additional observation time with the telescope. The next steps towards a more comprehensive understanding of the “mystery of the magnetic field” will be to increase the quasar sample and localize the magnetic fields precisely in the galaxies.

The Dynamo Theory

A dynamo converts mechanical energy into magnetic energy. The dynamo theory is an attempt at explaining the mechanism with which bodies in the sky can develop a magnetic field. In astronomical objects like planets, stars or galaxies, the dynamo effect occurs if there are turbulent currents and a non-uniform (differential) rotation prevails. This so-called alpha-omega dynamo can generate large-scale magnetic fields – even if the initial field was chaotic.

Original here


New Cookware Speeds Microwaving Time

By Robert Roy Britt, LiveScience Managing Editor

This rice cooker, available in Japan, interacts with microwaves to generate heat and cook rice in about half the time of conventional cookware. Credit: Sridhar Komarneni

A new material designed for use in microwaves heats foods and beverages more quickly and saves energy, its inventors say.

A microwave oven bombards food with microwaves, which are absorbed by certain molecules, including water, fats and sugars. The microwaves, powerful enough to kill viruses and bacteria, vibrate those molecules, heating the food.

"Conventional coffee cups are made from ceramic compositions which do not absorb microwaves and hence they do not heat up," explained Sridhar Komarneni, a professor of clay mineralogy at Pennsylvania State University. "When conventional ceramics are used for heating food, only food heats up and then the hot food heats up the ceramic."

Komarneni and colleagues in Japan made plates from a mix of 20 percent magnetite and 80 percent of a naturally occurring petalite mineral containing lithium, aluminum and silicon oxides. The new ceramic interacts with the microwaves and heats up, and "the microwaves heat up the container and hence the food," Komarneni told LiveScience. "Rice cooks in about half or less time."

The research is detailed in the Aug. 26 issue of the American Chemical Society's journal Chemistry of Materials.

Containers made from the material could pop popcorn more quickly, too, the researchers say.

And food stays hot longer.

"These ceramic materials not only heat up with microwaves but also retain heat for about 15 minutes and hence the food stays hot in the container," Komarneni said. "Ceramic plates could be used for pizza delivery as these plates are insulating materials."

A rice cooker and plates made from the material are already being sold by ASAHI Ceramics Research Co. in Japan.

Original here


Diamonds May Have Jumpstarted Life on Earth

By Robert Roy Britt, LiveScience Managing Editor

The Hope Diamond on a mirror at the Smithsonian's Natural History Museum in Washington, D.C., in an Oct. 2, 2003 file photo. Credit: AP Photo/Ron Edmonds

One of the greatest mysteries in science is how life began. Now one group of researchers says diamonds may have been life's best friend.

Scientists have long theorized that life on Earth got going in a primordial soup of precursor chemicals. But nobody knows how these simple amino acids, known to be the building blocks of life, were assembled into complex polymers needed as a platform for genesis.

Diamonds are crystallized forms of carbon that predate the oldest known life on the planet. In lab experiments aimed at confirming work done more than three decades ago, researchers found that when treated with hydrogen, natural diamonds formed crystalline layers of water on their surfaces. Water is essential for life as we know it. The tests also found electrical conductivity that could have been key to driving the chemical reactions needed to generate the first life.

When primitive molecules landed on the surface of these hydrogenated diamonds in the atmosphere of early Earth, a few billion years ago, the resulting reaction may have been sufficient to generate more complex organic molecules that eventually gave rise to life, the researchers say.

The research, by German scientists Andrei Sommer, Dan Zhu, and Hans-Joerg Fecht at the University of Ulm, is detailed in the Aug. 6 issue of the American Chemical Society's journal Crystal Growth & Design. Funding was provided by the Landesstiftung Baden-Wurttemberg Bionics Network.

Another theory, called panspermia, holds that life on Earth arrived from space, as organisms rained down inside tiny meteors or giant comets.

The new research does not conclusively determine how life began, but it lends support to one possible way.

"Hydrogenated diamond advances to the best of all possible origin-of-life platforms," the researchers contend.

Original here


Rip Currents: The Ocean's Deadliest Trick

By LiveScience Staff Writers

The four deaths and three missing people reported this weekend at New York beaches highlight the greatest of ocean dangers.

Year after year, the ocean's most successful killer is not the great white shark. It's not the deadly jellyfish. Not even monster waves or hurricane-force winds. Your worst ocean nightmare during a day at the beach is more likely to be a rip current, experts say.

Every year more than 100 beachgoers drown in these strong rushes of water that pull swimmers away from the shore. And that's just in the United States. Nearly half of all rescues made by lifeguards at ocean beaches are related to rip currents, according to the United States Lifesaving Association. Sharks typically kill about 6 people a year globally.

According to news reports, four swimmers drowned and three were missing Saturday in multiple incidents at Long Island and New York City beaches. At least three other people were rescued.

How they work

A common perception is that rip currents pull you underwater, but in reality they're roughly horizontal currents that gradually suck you further and further from the beach.

Here's how they originate: Waves break differently at different parts of a shore -- in some places the waves are strong and in others they are weak. These differing conditions carve out channels in sand bars that lie just off the beach. When water returns to the ocean, it follows the path of least resistance, which is typically through these channels.

This creates a strong and often very localized current capable of sweeping unsuspecting swimmers out to sea. The currents usually move at one to two feet per second but stronger ones can pull at up to eight feet per second. (On a track, Olympic sprinters cover about 34 feet per second.)

Heavy breaking waves can trigger a sudden rip current, but rip currents are most hazardous around low tide, when water is already pulling away from the beach.

Hurricanes, widely spaced swells, and long periods of onshore wind flow can also drum up stronger than normal currents. These conditions also create larger waves, which sometimes draw more people into the water.

What to do

It is easy to be caught in a rip current. Most often it happens in waist-deep water, experts say. A person will dive under a wave, but when they resurface they find they are much farther from the beach and still being pulled away.

What they do next can decide their fate.

Those who understand the dynamics of rip currents advise remaining calm. Conserve energy. A rip current is like a giant water treadmill that you can't turn off, so it does no good to try to swim against it.

The United States Lifesaving Association (USLA) suggests trying to swim parallel to the shore and out of the current. Once you've gotten out of the current, you can begin swimming back to shore.

However, if it is too difficult to swim sideways out of the current, try floating or treading water and let nature do its thing. You'll wash out of the current at some point and can then make your way back to shore.

If neither of these options seems to be working for you, continue treading water and try to get the attention of someone on shore, hopefully a lifeguard.

The USLA also emphasizes that anyone planning to swim in the ocean should learn to swim well and never swim alone. Pick a beach with a lifeguard if you don't feel comfortable with your swimming abilities but still want to enjoy the surf. And finally, take a look at the water -- if it looks dangerous, don't even try it.

Better warning

The University Corporation for Atmospheric Research (UCAR) in Boulder, CO has recently begun a program to educate meteorologists at the National Weather Service about rip currents.

"Weather forecasters are familiar with the atmosphere, but they often don't have a background in physical oceanography," says UCAR meteorologist Kevin Fuell.

The program was pioneered successfully in Miami, where local media now routinely highlight areas of rip current risk and public awareness is high. Now rip current outlooks are provided for most of the East Coast, Gulf Coast, and Southern California.

Original here

How the Personal Genome Project Could Unlock the Mysteries of Life

By Thomas Goetz


George Church is dyslexic, narcoleptic, and a vegan. He is married with one daughter, weighs about 210 pounds, and has worn a pioneer-style bushy beard for decades. He has elevated levels of creatine kinase in his blood, the consequence of a heart attack. He enjoys waterskiing, photography, rock climbing, and singing in his church choir. His mother's maiden name is Strong. He was born on August 28, 1954.

If this all seems like too much information, well, blame Church himself. As the director of the Lipper Center for Computational Genetics at Harvard Medical School, he has a thing about openness, and this information (and plenty more, down to his signature) is posted online at arep.med.harvard.edu/gmc/pers.html. By putting it out there for everyone to see, Church isn't just baiting identity thieves. He's hoping to demonstrate that all this personal information — even though we consider it private and somehow sacred — is actually fairly meaningless, little more than trivia. "The average person shouldn't be interested in this stuff," he says. "It's a philosophical exercise in what identity is and why we should care about that."

As Church sees it, the only real utility to his personal information is as data that reflects his phenotype — his physical traits and characteristics. If your genome is the blueprint of your genetic potential written across 6 billion base pairs of DNA, your phenome is the resulting edifice, how you actually turn out after the environment has had its say, influencing which genes get expressed and which traits repressed. Imagine that we could collect complete sets of data — genotype and phenotype — for a whole population. You would very quickly begin to see meaningful and powerful correlations between particular genetic sequences and particular physical characteristics, from height and hair color to disease risk and personality.

Church has done more than imagine such an undertaking; he has launched it: The Personal Genome Project, an effort to make those correlations on an unprecedented scale, began last year with 10 volunteers and will soon expand to 100,000 participants. It will generate a massive database of genomes, phenomes, and even some omes in between. The first step is to sequence 1 percent of each volunteer's genome, focusing on the so-called exome — the protein-coding regions that, Church suspects, do 90 percent of the work in our DNA. It's a long way from sequencing all 6 billion nucleotides — the As, Ts, Gs, and Cs — of the human genome, but even so, cataloging 60 million bases multiplied by 100,000 individuals is an audacious goal.
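
A rough back-of-the-envelope check on that scale, using only the figures already quoted in this article (nothing here comes from the PGP itself), would look like this in Python:

    # Scale of the PGP's initial sequencing goal, restating the article's numbers.
    genome_bases = 6_000_000_000      # roughly 6 billion base pairs per human genome
    exome_fraction = 0.01             # first step: sequence about 1 percent of each genome
    volunteers = 100_000

    bases_per_volunteer = int(genome_bases * exome_fraction)   # 60 million bases
    total_bases = bases_per_volunteer * volunteers             # 6 trillion bases overall

    # At 2 bits per base (A, C, G, T), the raw sequence alone runs to terabytes,
    # before quality scores or any phenotype data are counted.
    raw_terabytes = total_bases * 2 / 8 / 1e12
    print(f"{bases_per_volunteer:,} bases per volunteer, {total_bases:,} in total (~{raw_terabytes:.1f} TB raw)")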

The PGP stands as the tent pole of what Church calls his "year of convergence," the moment when his 30 years as a geneticist, a technologist, and a synthetic biologist all come together. The project is a proof of concept for the Polonator G.007, the genetic-sequencing instrument developed in Church's lab that hit the market this spring. And the PGP will also put Church's expertise in synthetic biology to use, reverse engineering volunteers' skin cells into stem cells that could help diagnose and treat disease. If the convergence comes off as planned, the PGP will bring personal genomics to fruition and our genomes will unfold before us like road maps: We will peruse our DNA like we plan a trip, scanning it for possible detours (a predisposition for disease) or historical markers (a compelling ancestry).

Bringing the genome into the light, Church says, is the great project of our day. "We need to inspire our current youth in a way that outer space exploration inspired us in 1960," he says. "We're seeing signs that knowing about our inner space is very compelling."

To Church, who built his first computer at age 9 and taught himself three programming languages by 15, all of this is unfolding according to the same laws of exponential progress that have propelled digital technologies, from computer memory to the Internet itself, over the past 40 years: Moore's law for circuits and Metcalfe's law for networks. These principles are now at play in genetics, he argues, particularly in DNA sequencing and DNA synthesis.

Exponentials don't just happen. In Church's work, they proceed from two axioms. The first is automation, the idea that by automating human tasks, letting a computer or a machine replicate a manual process, technology becomes faster, easier to use, and more popular. The second is openness, the notion that sharing technologies by distributing them as widely as possible with minimal restrictions on use encourages both the adoption and the impact of a technology.

Inside the Personal Genome Project

The project will turn information from 100,000 subjects into a huge database that can reveal the connections between our genes and our physical selves. Here's how. — Thomas Goetz
1. Entrance Exam
Volunteers take a quiz to show genetic literacy. One question: How many chromosomes do unfertilized human egg cells contain? a) 11, b) 22, c) 23, d) 46, or e) 92? (Answer: c.) Only those with a perfect score proceed, but retests are allowed.
2. Data Collection
Volunteers sign an "open consent" form acknowledging that their information, though anonymized, will be accessible by others. They fill out their phenotype traits, listing everything from waist size to diet habits. Suitable respondents go on to the next step.
3. Sample Collection
Volunteers hit the medical center, where they are interviewed by an MD. Then a technician draws some blood, gathers a saliva sample, and takes a punch of skin. Don't worry: It hurts about as much as a bee sting.
4. Lab Work
The tissues are sent to a biobank, where DNA is extracted from the blood. One percent of it — the exome — is sequenced. Meanwhile, bacterial DNA is extracted from the saliva and sequenced to reveal the volunteer's microbiome.
5. Research
Now the fun part: Crunching the numbers. PGP scientists and other researchers start working with the data assembled from 100,000 individuals to investigate potential links between phenotypes and genotypes. The team will look for patterns and statistically significant anomalies.
6. Sharing
The volunteers get access to not only the raw data from their genome, but anything the research team gleans from their information. Insights — a newly discovered cancer risk, for example — are posted in a volunteer's file, which they'll be free to share with other PGP participants.


"I always tell people, your biggest problem in life is not going to be hiding your stuff so nobody steals it," Church says. "It's going to be getting anybody to ever use it. Start hiding it and that decreases the probability to almost zero."

For most of his career, Church has been known as a brilliant technologist, more behind-the-scenes tinkerer than scientific visionary. Though he was part of the group that kicked off the Human Genome Project, he's far less known than scientists like Francis Collins or J. Craig Venter, who took the stage at the end. His obscurity is due partly to his style. He talks about his accomplishments with a certain detachment that one might mistake for ambivalence. "He's not without ego; it's just a different sort of ego," says entrepreneur Esther Dyson, a friend and one of the first 10 PGP volunteers. "Everything is a subject of his intellectual curiosity, including himself."

His low profile may be the result of his tendency to get too far ahead of the curve, working a decade or two ahead of his field — so far that even the experts don't always get what he's talking about. "Lots of George's work is so advanced it's not ready to become standard," says Drew Endy, a professor of bioengineering at Stanford and cofounder with Church of Codon Devices, a synthetic-biology startup. "He's perfectly happy to spin out tons of ideas and see what might stick. It's high-throughput screening for technology and science. That's not the way most people work."

But thanks to the PGP, the Polonator, and the fact that the rest of the world is finally starting to understand what he's been talking about, Church's obscurity is coming to an end. He sits on the advisory board of more than 14 biotech companies, including personal genomics startup 23andMe and genetic testing pioneer DNA Direct. He has also cofounded four companies in the past four years: Codon Devices, Knome, LS9, and Joule Biosciences, which makes biofuels from engineered algae. Newsweek recently tagged him as one of the 10 Hottest Nerds ("whatever that means," Church laughs).

For someone who has spent his whole career ahead of his time, he is suddenly very much a man of the moment.

Most historians would cite Prague or Paris or Berkeley as the intellectual hub of the 1960s, but for people interested in computers, there was no place so significant as Hanover, New Hampshire. There, at Dartmouth College, an experiment in time-share computing was flourishing. Developed by professors John Kemeny and Thomas Kurtz, the Dartmouth Time-Sharing System let students remotely access the power of a mainframe computer to do calculations for mathematics or science assignments or to play a simulated game of college football. It ran on an easy-to-learn, intuitive program that Kemeny and Kurtz called Basic.

In 1967, the DTSS transitioned to a more-powerful GE-635 machine and offered remote terminals to 33 secondary schools and colleges, including Phillips Academy, a prep school in nearby Andover, Massachusetts. The terminal — not much more than a teletype machine, really — sat in the basement of the school's math building, forgotten until the next fall, when a young George Church showed up for his freshman year and began asking whether there was a computer on campus. Someone pointed Church to the basement. "There wasn't even a chair in the room. I had used a typewriter before, but never a teletype. And so I just started pressing keys," Church recalls. "Eventually I hit Return, and it came back with 'What?' And so I started typing in stuff like crazy and hitting Return. And it kept coming back with 'What?' At that point, I was pretty convinced it wasn't a human, but it was actually talking in words. So I just hadn't asked the right question or given the right answer."

Soon, Church found a book on Basic. "I was just sailing," he says. He spent endless hours in that basement — he eventually borrowed a chair — and taught himself the intricacies of coding, learning to program in Basic, Lisp, and Fortran. Indeed, thinking in code came so naturally to Church that he stopped going to his classes (a habit that would later get him kicked out of graduate school at Duke) and taught the computer linear algebra instead.

It turns out that learning how to write code — change it, hit Return, see what it will do — was ideal training for Church's eventual career in computational biology. "That's how we reverse engineer things like E. coli — you change something, and you see how it behaves," he says. "Little did I know that 30 years later, we would use almost exactly the same operations to optimize metabolic networks."

Church first hit on the power of computation to automate biology in the mid-'70s when he was in graduate school at Harvard. At the time, he was working on recombinant DNA, a then-new technique to splice a gene from one organism into another. Identifying a sequence of 80 or so base pairs of genetic code was a slow, tedious process. "You had to literally read off the bases and write them on a piece of paper, one by one," Church says. "So I wrote a sequence-reading program that would crunch it out. When the senior graduate student heard I had automated that, he said, 'What do you want to do that for? That's the only fun part.'"

By 1980, when Church's adviser, Wally Gilbert, won the Nobel Prize for DNA sequencing techniques, the process was still slow and expensive, executing one DNA strand at a time. So Church began working on one of his earlier targets for automation. His idea was to sequence several strands together by combining them into a single sample mixture. He called it multiplexing, drawing an analogy to signal multiplexing in electronics, in which more than one signal flows through a current at the same time. Church thought most of the work could even be integrated into one device rather than numerous machines.
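
Church's multiplexing is a wet-lab technique, but a loose software analogy (this sketch is illustrative only, not his protocol; the four-base tags and sample names are invented) is the way pooled reads are later sorted back to their source samples by a short identifying barcode:

    # Toy demultiplexing: each read in a pooled run starts with a short sample tag,
    # and a single pass assigns every read back to the sample it came from.
    from collections import defaultdict

    barcodes = {"ACGT": "sample_1", "TTAG": "sample_2", "GGCA": "sample_3"}  # hypothetical tags

    pooled_reads = [
        "ACGTTTGACCATG",   # tagged for sample_1
        "TTAGGGCATTACA",   # tagged for sample_2
        "GGCAACGTGGTAC",   # tagged for sample_3
        "CCCCAAAATTTTG",   # unknown tag, left unassigned
    ]

    by_sample = defaultdict(list)
    for read in pooled_reads:
        tag, sequence = read[:4], read[4:]
        by_sample[barcodes.get(tag, "unassigned")].append(sequence)

    for sample, sequences in sorted(by_sample.items()):
        print(sample, sequences)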

It was a provocative idea, not just because he was proposing to replace several human tasks with machine-driven ones, but also because he didn't make the usual false promise that technology would simplify the process. On the contrary, multiplexing would be complicated, Church maintained. But technology was up to the task.

Four years later, Church was invited to present his work on multiplexing at a small meeting in Alta, Utah. The Department of Energy had gathered about 20 scientists to mull over one question for five days: How might recent advances in genetics be used to measure an increase in genetic mutations arising from radiation exposure, as in Hiroshima? The group quickly reached the conclusion that technology circa 1984 couldn't answer that question. Meanwhile, they still had several more days in the mountains. "There were a bunch of us there who could talk about genomics as if it were an engineering exercise. And then we said, well, as a kind of booby prize, we could think of other things you could do," Church recalls, "like, say, sequencing the human genome."

Though Church was almost entirely unknown before the meeting, his presentation on multiplex sequencing methods stole the show. When he fell into a huge snow drift during a break one afternoon, one participant worried that the future of sequencing had disappeared with him.

That Alta brainstorm would become the Human Genome Project — the effort, adopted by the National Institutes of Health, to sequence one human genome for $3 billion within 15 years. However audacious the HGP seemed, Church was disappointed by it almost from the start. "We could have said our goal was to get everybody's genome for some affordable price," he says, "and one genome would be a milestone" on the way toward that goal.

The HGP also played it safe with its choice of technology. Despite the promise of Church's multiplexing system, the HGP instead used a more established instrument manufactured by Applied Biosystems, based on a technique developed by biochemist Frederick Sanger. As Church saw it, this meant that the project had failed to put its $3 billion toward improving the state of the art. Even worse, the HGP consumed so many of the resources available to the field of genetics that it effectively locked that state of the art into 1980s technology.

The result was nearly two decades of inertia. It wasn't until 2005, when the Human Genome Project was complete and new goals were put forth, that Church finally perfected the multiplexing approach he had presented 20 years earlier at Alta. In a paper published in Science, Church demonstrated a technique that could analyze millions of sequences in one run (Sanger's method could handle just 96 strands of DNA at a time). And Church's method not only accelerated the process, it made it far cheaper, too, elegantly demonstrating the power of automation to drive exponential advances and bring down costs. Church's approach, and a competing innovation developed by 454 Life Sciences that same year, inaugurated the second generation of sequencing, now in full swing.

In the past three years, more companies have joined the marketplace with their own instruments, all of them driving toward the same goal: speeding up the process of sequencing DNA and cutting the cost. Most of the second-generation machines are priced at around $500,000. This spring, Church's lab undercut them all with the Polonator G.007 — offered at the low, low price of $150,000. The instrument, designed and fine-tuned by Church and his team, is manufactured and sold by Danaher, an $11 billion scientific-equipment company. The Polonator is already sequencing DNA from the first 10 PGP volunteers. What's more, both the software and hardware in the Polonator are open source. In other words, any competitor is free to buy a Polonator for $150,000 and copy it. The result, Church hopes, will be akin to how IBM's open-architecture approach in the early '80s fueled the PC revolution.

In the sequencing game, though, the cost of the machine is only half the equation. The more telling expense is the operating cost, particularly the cost of sequencing entire human genomes. Executives at 454 estimate that their latest machine can pull off a whole genome sequence for $200,000. Applied Biosystems claims its instrument has completed a genome for just $60,000. Church maintains that, while the Polonator isn't up to whole-genome reads, it is clocking in at about one-third the cost of Applied Biosystems' estimate. A whole sequence from Knome, the retail genomics firm cofounded by Church, goes for $350,000. (It's worth noting that these figures are only roughly comparable, since each company uses slightly different quality measures and specifications.)

As these numbers continue to drop, the mythical $1,000 genome comes ever closer. Sequencing a human genome for $1,000 is the somewhat arbitrary benchmark for true personalized genomics — when the science could become a component of standard medical care. An important catalyst in achieving that point is the Archon X Prize for Genomics, which is offering $10 million to the team that can sequence 100 complete genomes in 10 days for less than $10,000 each. As of June, seven teams, including Church's lab, had entered the competition. Church, who served for a time on the advisory board of the contest, says that the prize will drive costs down further and help publicize the potential of personalized whole-genome sequencing.

That's important because Church hopes the Polonator and other next-generation instruments will inspire a new generation of smaller labs to begin work in personal genomics, as well as other genetic sciences. Already, the onslaught of technology has jump-started new projects, like sequencing part of the Neanderthal genome, examining extremophile microbes in old California iron mines, and studying the regenerative properties of the salamander. In medicine, cheaper sequencing has enabled research into drug-resistant tuberculosis; the genetics of breast, lung, and other cancers; and the DNA architecture of schizophrenics.

But if the Polonator is going to lead that charge, it has to work — and work on a massive scale. And that means passing a major test: successfully sequencing the 100,000 exomes in the PGP.

Photo: Lloyd Ziff

All of us know our height, weight, and eye color. Fewer of us know our arm span or resting blood pressure. But who among us knows the direction of our hair whorls or the Gell-Coombs type of our allergies? This is the level of detail that the PGP requires the 100,000 volunteers to reveal about themselves, a list staggering in its exhaustiveness. The PGP will tally head circumferences, injuries, chin clefts and cheek dimples, whether volunteers can roll their tongues or hyperflex their joints, whether they dislike hot climates or are hot tempered, if they've often been exposed to power lines or wood dust or diesel exhaust or textile fibers. The project questionnaire asks how many meals they eat a day and whether they prefer their food fried, broiled, or barbecued. It even demands to know how much television they watch. And, of course, PGP volunteers will hand over most aspects of their medical history, from vaccines to prescriptions.

This phenotype data will be integrated with a volunteer's genomic information, then combined with statistics from all the other subjects to create a potent database ripe for interrogation. In contrast to the heavy lifting that genetic research requires now — each study starts from scratch with a new hypothesis and a fresh crop of subjects, consent forms, and tissue samples — the PGP will automate the research process. Scientists will simply choose a category of phenotype and a possible genetic correlation, and statistically significant associations should flow out of the data like honey from a hive. A genetic predisposition for colon cancer, for instance, might be found to lead to disease only in connection with a diet high in barbecued foods, or a certain form of heart disease might be associated with a particular gene and exposure to a particular virus. Genomic discovery won't be a research problem anymore. It'll be a search function. (This helps explain why Google, among others, has donated to the project).
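
A minimal sketch of what that "search function" could look like over the assembled records (the field names and the four toy participants below are invented for illustration; the article does not describe the PGP's actual schema):

    # Hypothetical genotype-phenotype lookup: how often a variant co-occurs with a trait.
    participants = [
        {"id": 1, "variant_X": True,  "colon_cancer": True,  "barbecue": "often"},
        {"id": 2, "variant_X": True,  "colon_cancer": False, "barbecue": "rarely"},
        {"id": 3, "variant_X": False, "colon_cancer": False, "barbecue": "often"},
        {"id": 4, "variant_X": True,  "colon_cancer": True,  "barbecue": "often"},
    ]

    def association(records, genotype_key, phenotype_key):
        """Return the phenotype rate among carriers and non-carriers of a variant."""
        carriers = [r for r in records if r[genotype_key]]
        others = [r for r in records if not r[genotype_key]]
        rate = lambda group: sum(r[phenotype_key] for r in group) / len(group) if group else 0.0
        return rate(carriers), rate(others)

    print(association(participants, "variant_X", "colon_cancer"))   # e.g. (0.67, 0.0) on this toy data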

The process began last year, and each of the first 10 volunteers has a background in medicine or genetics. They include John Halamka, CIO of Harvard Medical School and a physician; Rosalynn Gill, chief science officer at Sciona (a personalized genetics nutrition company); and Steven Pinker, the noted psychologist and author. The other 99,990 participants won't be expected to be so elite, though they will have to pass a genetics-literacy quiz to demonstrate informed consent. The general selection process, which starts with registration at personalgenomes.org, is scheduled to begin later this year.

Besides offering up their genomes, subjects will have to part with some spit and a bit of skin. The saliva contains their microbiome — the trillions of microbes that exist, mostly symbiotically, on and in our bodies. If phenotype is a combination of genotype plus environment, the microbiome is the first wash of that environment over our bodies. By measuring some fraction of it, the PGP should offer a first look at how the genome-to-microbiome-to-phenome chain plays out.

The skin sample goes into storage, creating what would be one of the world's largest biobanks. Members of Church's lab have devised a way to automate turning the skin cells into stem cells, and they hope to publish the technique later this year. (Similar work has been done at the University of Wisconsin and Kyoto University.) By reprogramming the skin cells using synthetically engineered adenoviruses, Church's team can transform the skin cells into many sorts of tissue — lungs, liver, heart. These tissues could be used as a diagnostic baseline to detect predisposition for various diseases. What's more, the reprogrammed cells could be used to treat disease, replacing damaged or failing tissue. It's an intriguing hint of how Church's work with synthetic biology complements genomic sequencing.

If the PGP were simply an exercise in breaking down 100,000 individuals into data streams, it would be ambitious enough. But the project takes one further, truly radical step: In accordance with Church's principle of openness, all the material will be accessible to any researcher (or lurker) who wants to plunder thousands of details from people's lives. Even the tissue banks will be largely accessible. After Church's lab transforms the skin into stem cells, those new cell lines — which have been in notoriously short supply despite their scientific promise — will be open to outside researchers. This is a significant divergence from most biobanks, which typically guard their materials like holy relics and severely restrict access.

For the PGP volunteers, this means they will have to sign on to a principle Church calls open consent, which acknowledges that, even though subjects' names will be removed to make the data anonymous, there's no promise of absolute confidentiality. As Church sees it, any guarantee of privacy is false; there is no way to ensure that a bad actor won't tap into a system and, once there, manage to extract bits of personal information. After all, even de-identified data is subject to misuse: Latanya Sweeney, a computer scientist at Carnegie Mellon University, demonstrated the ease of "re-identification" by cross-referencing anonymized health-insurance records with voter registration rolls. (She found former Massachusetts governor William Weld's medical files by cross-referencing his birth date, zip code, and sex.)
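
Sweeney's linkage attack is easy to see in miniature. Every record below is invented, but the join itself, matching "anonymized" health data to a public roll on birth date, zip code, and sex, is the kind of cross-referencing she described:

    # Toy re-identification: no names in the health data, but (dob, zip, sex) is
    # often unique enough to link a record to a named entry in a voter roll.
    health_records = [
        {"dob": "1951-04-09", "zip": "02138", "sex": "M", "diagnosis": "hypertension"},
        {"dob": "1972-03-14", "zip": "02139", "sex": "F", "diagnosis": "asthma"},
    ]
    voter_roll = [
        {"name": "A. Smith", "dob": "1951-04-09", "zip": "02138", "sex": "M"},
        {"name": "B. Jones", "dob": "1980-01-02", "zip": "02139", "sex": "F"},
    ]

    quasi_id = lambda r: (r["dob"], r["zip"], r["sex"])
    names_by_id = {quasi_id(v): v["name"] for v in voter_roll}

    for record in health_records:
        name = names_by_id.get(quasi_id(record), "<no unique match>")
        print(name, "->", record["diagnosis"])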

To Church, open consent isn't just a philosophical consideration; it's also a practical one. If the PGP were locked down, it would be far less valuable as a data source for research — and the pace of research would accordingly be much slower. By making the information open and available, Church hopes to draw curious scientists to the data to pursue their own questions and reach their own insights. The potential fields of inquiry range from medicine to genealogy, forensics, and general biology.

And the openness doesn't serve researchers alone. PGP members will be seen not only as subjects, but as participants. So, for instance, if a researcher uses a volunteer's information to establish a link between some genetic sequence and a risk of disease, the volunteer would have that information communicated to them.

This is precisely what makes the PGP controversial in genetics circles. Though Church talks about it as the logical successor to the Human Genome Project, other geneticists see it as a risky proposition, not for its privacy policy but for its presumption that the emerging science of genomics already has implications for individual cases. The National Human Genome Research Institute, for example, has cautioned that the burgeoning personal-genomics industry, which includes research-oriented projects like the PGP as well as straight-to-consumer companies like Navigenics and 23andMe and whole-genome-sequencing shops like Knome, puts the sales pitch ahead of the science. "A lot of people would like to rapidly capitalize on this science," says Gregory Feero, a senior adviser at the NHGRI. "But for an individual venturing into this now, it's a risk to start making any judgments or decisions based on current knowledge. At some point, we'll cross over into a time when that's more sensible."

Church cautions, however, that keeping clinicians and patients in the dark about specific genetic information — essentially pretending the data or the technology behind it don't exist — is a farce. Even worse, it violates the principle of openness that leads to the fastest progress. "The ground is changing right underneath them," he says of the medical establishment. "Right now, there's a wall between clinical research and clinical practice. The science isn't jumping over. The PGP is what clinical practice would be like if the research actually made it to the patient."

In the not-too-distant future, Church says, hospitals and clinics could be outfitted with a genome sequencer much the way they now have x-ray machines or microscopes. "In the old books," Church says, "almost every scientist was sitting there with a microscope on their table. Whether they're a physical scientist or a biological scientist, they've got that microscope there. And that inspires me."

Original here

Japanese sushi rage threatens iconic Mediterranean tuna

Fishmongers check the quality of meat on large tuna fish at this year's first trading day at Tokyo's Tsukiji fish market, January 2008.

The rage for sushi and sashimi, Japan's raw fish dishes that overtook the West and have now spread to increasingly prosperous China, risks wiping out one of the Mediterranean's most emblematic residents: the bluefin tuna.

Experts say too many of these majestic fish prized since Greek and Roman times -- each one of which can weigh up to 900 kilos (nearly 2,000 pounds) -- are ending up on the platters of restaurants around the globe.

"Japanese consumption was already a threat to bluefin tuna in the Mediterranean. The European craze for sushi bars has added to that," said Roberto Mielgo Bregazzi, a Spanish expert and author of several reports for Greenpeace and the World Wildlife Fund.

And "if the Chinese market continues to grow, that will be the end of the stock," he said.

Eating Japanese-style raw fish in rice packages spread to Europe and the United States in the 1990s and quickly grabbed palates there.

China seems to be next, according to Bregazzi, who said there had been a significant increase in tuna consumption there in the past six years. Even though there are few official figures on Chinese consumption, the trend has also been observed by the International Commission for the Conservation of Atlantic Tunas (ICCAT), the body responsible for managing bluefin tuna fishing.

Japan, however, remains the main consumer of bluefin tuna. "Around 80 to 85 percent of bluefin tuna caught in the Mediterranean is exported to Japan," said Jean-Marc Fromentin, a leading worldwide expert on the subject at the French Research Institute for Exploitation of the Sea (IFREMER).

Sushi consumption took off after World War II, largely using southern bluefin tuna, then found in huge numbers off the coast of Australia.

"This stock has now collapsed thanks to over-fishing, and the Japanese turned their attention to the Atlantic bluefin tuna," said Fromentin, adding that despite its name, Atlantic bluefin comes mainly from the Mediterranean.

Prices began to climb. Fishing fleets were modernised in Europe, and new fishing fleets created in Turkey and northern Africa. The result -- a huge over-capacity in fishing.

Today more than 50,000 tonnes of bluefin tuna are caught every year in the Mediterranean. To prevent stocks from collapsing, that figure should be limited to 15,000 tonnes in the short term, according to ICCAT.

"The bluefin tuna industry is in the process of fishing itself to death," said Greenpeace oceans campaigner Karli Thomas.

The risk now is that the depletion of tuna will wipe out the fishing sector, and cost thousands of jobs in the Mediterranean region.

The big firms push fishermen into over-fishing

In May and June, fishermen from France, Italy, Libya, Malta, Spain, Tunisia and Turkey are under pressure to maximise their catch. Most use a net called a "purse seine", which is weighted to reach the sea floor, with hoops and ropes which allow the fishermen to pull the drawstrings and trap the bluefin tuna within the net.

They work out of ultra-modern trawlers, up to 60 metres (196 feet) long and costing around five million euros (eight million dollars).

At this point in the chain, a kilo of bluefin tuna fetches the fisherman between eight and 10 euros.

The catch is sold to tuna fattening farms, many located in Cypriot, Croatian, Maltese, Sicilian, Spanish and Tunisian waters, and many owned by Japanese companies like Mitsubishi, Maruha or Mitsui, or by the Spanish group Ricardo Fuentes e Hijos.

The tuna are transported to these offshore farms in huge circular cages 50 metres in diameter and 23 metres deep. Once in the tuna farms, or ranches as they are sometimes known, the tuna gorge on tonnes of sardines, mackerel and herring.

This boosts their weight to the demands of the Japanese buyers. Some farms also feed the tuna freeze-dried garlic to stimulate their blood circulation, or prawns to boost their reddish tinge.

It takes between nine and 20 kilos of small fish to put a kilo of weight on to a bluefin tuna, according to a co-owner of the Fish and Fish Farm in Malta, Joseph Caruana.

The fattened tuna are then sold at around 13 euros (20 dollars) a kilo to Japanese buyers, who in turn sell them for a much higher price in Tokyo -- where a good quality, 200-kilo tuna can fetch up to 20,000 euros.
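
Laid side by side, the prices quoted above sketch the value chain (the figures are simply the article's numbers; feed, transport, and other costs are ignored):

    # Euros per kilo of bluefin at each stage, using the figures quoted in the article.
    at_the_boat = (8 + 10) / 2        # paid to the fisherman
    at_the_farm = 13                  # paid by Japanese buyers after fattening
    in_tokyo = 20_000 / 200           # a good 200-kilo fish fetching up to 20,000 euros

    print(f"boat: {at_the_boat:.0f} EUR/kg, farm: {at_the_farm} EUR/kg, Tokyo: {in_tokyo:.0f} EUR/kg")
    print(f"boat-to-Tokyo markup: roughly {in_tokyo / at_the_boat:.0f}x")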

"It is the big firms that push the fishermen into over-fishing," said Bregazzi.

In theory, the bluefin tuna harvest is monitored so that the ICCAT quotas are respected. But there are numerous flaws.

This year, the European Commission -- the EU executive -- has clamped down somewhat. Fishing was even put on hold for 15 days for some countries -- a decision which infuriated French and Italian fishing fleets in particular.

"But the Turkish tuna seiners continued to fish, and there is always the illegal fishing by Japan and Korea," said French fisherman Andre Fortassier.

Fishing and farms have also developed in non-EU countries in recent years, including Libya, Tunisia and Algeria where quota controls tend to be looser, industry sources say.

For Sergi Tudela, a Spanish marine biologist with Worldwide Fund for Nature (WWF), the responsibility lies with ICCAT.

"If important parties in ICCAT such as the US, the European Union and Japan decide to put an end to this unsustainable situation and to adopt real recovery measures, the other countries should accept them," he said.

"Japan is the key market. If there is a real willingness from Japan to ensure that only real sustainable production is being imported, they can implement that," he added.

"The potentiality is there, it only lacks political will."

Original here

Would You Drive 55?

By WILLIAM SCHULTZ / WASHINGTON
Liberals say Iraq is another Vietnam; conservatives say Barack Obama is Jimmy Carter redux. ABBA's a mega-hit and Elton John's going to be performing at Madison Square Garden. Had enough of these '70s flashbacks? Brace yourself for another: the return of the national speed limit, courtesy of one of the country's most venerable politicians.

Senator John Warner (R-VA) - elected in 1978 - recently expressed interest in the idea of a national speed limit to conserve gasoline. Warner, who is not running for re-election this year, wrote to U.S. Secretary of Energy Sam Bodman, asking "at what speed is the typical vehicle traveling on America's highways today most fuel efficient?"

Warner told TIME his concern is for "the many millions and millions [of Americans] of limited means, sitting around their kitchen table trying to figure out how to make ends meet." Unlike long-term alternative energy sources, Warner says, a speed limit would work to bring down gas prices immediately. "Maybe some guy's got a better idea," he says. "But I haven't seen it."

The National Maximum Speed Limit of 55 mph was created in 1974, when Richard Nixon signed the Emergency Highway Energy Conservation Act. Prior to that, states had been free to set their own speed limits, but the new law threatened to strip federal highway funding from any state straying above the national standard. The ostensible purpose of this limit was to keep down gas prices, which had been driven through the roof by an OPEC embargo touched off by the 1973 Arab-Israeli war. And with gas prices once again sky-high, Warner isn't alone in talking up a cap on speeding.

Jackie Speier, a first-term Democratic congresswoman from California, is already on the case. Earlier this month, she introduced a bill that would cap highway speed limits at 60 mph - 65 in rural areas. It's currently awaiting a hearing before the House Committee on Transportation. Warner says he hasn't contacted Speier, but adds that he'd be willing to "stroll out on the floor" in favor of a speed-limit bill. He has yet to propose a similar bill in the Senate.

The thinking behind Warner and Speier's speed-limit proposals is simple. Above a certain speed, a car's gas mileage begins to drop; the faster you go, the more fuel you burn. Ergo, slow down and save gas. According to fueleconomy.gov, a website run by the Department of Energy, "each 5 mph you drive over 60 mph is like paying an additional $0.30 per gallon for gas." Warner approvingly cites a congressional study showing that "the law resulted in reduced consumption of 167,000 barrels of petroleum a day." With millions more cars on the road now than there were in 1974, the volume saved could be even greater.
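
The rule of thumb from fueleconomy.gov quoted above translates directly into a small calculation (the $4.00 base price is just an illustrative 2008-era figure, and the real mileage curve varies by vehicle):

    # Apply the quoted rule of thumb: each 5 mph over 60 mph adds roughly $0.30 per gallon.
    def effective_price_per_gallon(base_price, speed_mph, penalty_per_5mph=0.30):
        """Pump price plus the fuel-economy penalty for driving above 60 mph."""
        excess = max(0.0, speed_mph - 60.0)
        return base_price + (excess / 5.0) * penalty_per_5mph

    for speed in (60, 65, 70, 75):
        print(f"{speed} mph -> effective ${effective_price_per_gallon(4.00, speed):.2f} per gallon")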

Then there's the issue of safety. Tim Castleman, founder of the pro-limit group Drive 55 Conservation Groups, notes: "When they instituted [a national speed limit] in 1974, it was a one-year deal, but after one year they found highway deaths had dropped by 4,000." This unexpected side benefit, Castleman says, led Congress to make what had been a temporary measure permanent.

Some opponents of the speed limit question the numbers tossed around by Warner, Castleman, and others. Indeed, the safety argument looks a bit flimsy on closer examination. Since the 55-mph limit was repealed in 1995, the number of fatal motor vehicle crashes has increased by little more than 1,000, while deaths per 100,000 licensed drivers dipped over the same period.

In a 1999 study for the libertarian Cato Institute, economist Stephen Moore noted that the number of auto crashes actually fell by 66,000 after the 55 mph limit was lifted. Moore also points out that a lower speed limit means more time wasted idling in traffic: "The most valuable resource on this earth is not oil, it's human time."

A law only works when it's obeyed - and it's an open question how many motorists would comply. The 1974 law was considered a joke by many drivers, who violated it with impunity. "Real compliance out on the interstate was somewhere around twenty percent," says Jim Baxter, president of the National Motorists Association. "Eighty percent of the population was exceeding the 55 mile-per-hour speed limit!"

Some groups would meet the return of the speed limit with a yawn rather than a groan. Instead of waiting for the government to step in, they've chosen to self-regulate. A number of trucking companies have mandated that their fleets stay at or below 65 mph. Douglas Stotlar, CEO of Con-way Inc., says his company's decision to lower their limit to 62 was driven both by environmental concerns and because "fuel prices were going to unprecedented levels." Exactly, say foes of a national speed limit; people can be trusted to slow down and conserve gas without the government leaning over their shoulder.

But Warner insists the government has got to do something, and do it now. Though he favors drilling offshore, he also says "That's five, six, seven years out. The pain is tonight, tomorrow night, and the night after that. I'm just sensitive to people's pain. Who's got the courage to do something like this?" Warner, who arrived on the national political scene in the oil-starved '70s, thinks that era might just hold the solution to our current energy crisis. View this article on Time.com

Original here




Tarantulas, fire ants lurk in Texas floodwaters

A tarantula clings to a fence to escape the flood waters from Hurricane Dolly, Thursday, July 24, 2008, in San Benito, Texas. (AP Photo/Matt Slocum)

By CHRISTOPHER SHERMAN

EDINBURG, Texas (AP) — South Texans eager to salvage what they can from waterlogged homes struck by Hurricane Dolly have another problem: The floodwaters they're slogging through are laced with stinging fire ants, snakes and tarantulas.

"You don't want to wade in this water," state Health Services Commissioner David Lakey said during a visit to the Rio Grande Valley Friday. "You don't want to play in this water. You want to stay out of this water."

It was timely advice, but residents in many neighborhoods with waist-deep water had little choice as they sifted through the mess left by the Category 2 storm that hit the eastern Texas and Mexico coasts Wednesday. In eastern Hidalgo County, as much as 12 inches of rain fell in six hours, turning neighborhoods into coffee-colored lakes.

Officials estimated it could take six weeks for the low-lying region to completely dry out and 118,000 people still had no electricity Friday morning. Emergency managers tried to assure people that they would come to help and begged for patience. They said they were beginning to pump water from some of the worst hit areas and working to move water into floodways.

Residents were using backhoes to dig their own drainage canals and clear water off their property. But the water simply flowed into the neighbors' yards. Tempers among longtime neighbors were becoming strained.

Iliana Reyna, 34, was monitoring the floodwater's rise to the second step of her front porch in Edinburg.

Reyna, her husband and three children waded into the water Friday to gather a few belongings and what dry goods they could.

Suddenly, 4-year-old Adolfo, standing on the shoulder of the road in bare feet, screamed and began hopping. The other children scooped up water in their shoes and splashed it on his feet, while his father lifted him and brushed away the attacking ant.

"This is just too much for us," said neighbor Arnold Silva, whose yard was flooded when another neighbor dumped water into it. It rose throughout the night carrying runoff from a cow pasture and "worms, spiders and ants."

Fire ants and tarantulas — hairy spiders sometimes the size of a dinner plate — can deliver stinging, painful bites but are not deadly.

The National Weather Service said the remnants of Dolly could still add a few inches of rain to some areas, and the forecast called for isolated showers in the Rio Grande Valley.

But water wasn't the only danger. Illness also lurked in refrigerators, health officials said.

"If it doesn't look right, doesn't smell right, don't eat it," said Eddie Olivarez, Hidalgo County health administrator. He said inspectors were fanning out to restaurants to make sure they disposed of food properly as well.

Fewer than 200 people remained in shelters in Hidalgo County, down from a peak of nearly 3,300. But rescue crews in boats were still searching flooded neighborhoods and plucking people from homes.

Still, officials were relieved that it wasn't worse and that no one died in the first hurricane of the season to hit the U.S. mainland.

The cleanup will be substantial: President Bush declared 15 counties in south Texas disaster areas to release federal funding to them, and insurance estimators put the losses at $750 million.

The storm brought 100 mph winds and broke all-time July rainfall records in the Lower Rio Grande Valley, dumping a foot of rain in some spots.

Steve McCraw, the state's homeland security director, said about 1,500 workers were on hand to help restore power and seven stations were distributing water, ice, food and hygiene kits.

Gov. Rick Perry, who flew over the area Thursday with U.S. Sen. John Cornyn, cautioned residents not to rest easy just yet.

"It appears that we have handled it as well as it can be handled. But it is far from over," Perry said, noting possible flooding over the next five days from runoff as the storm moves northward.

After crashing ashore on South Padre Island, Dolly meandered north, leaving towns on the northern tip of the Rio Grande Valley with a surprise. Officials had feared the levees would breach, but the storm veered from its predicted path and they held strong.

"We're glad it didn't make a direct hit, but it just refocuses on the issues we have," said Cameron County Judge Carlos Cascos. "The levees are suspect. Nothing's changed in my opinion."

On South Padre Island, which bore the brunt of the winds, officials said no buildings were in danger of collapse, but damage was widespread to hotels and other businesses.

Avi Fima was mourning the damage to "my baby" — his Surf Stop store on Padre Boulevard. Windows were blown out, half the roof was torn away and water bubbled up the carpeting inside.

"This is going to hit us good," Fima said. "We actually started summer really good. ... To rebuild it — the season will be over. We have a month left."

Shell Oil announced that it had restarted production at its natural gas operations in the Valley and was redeploying workers to rigs in the western Gulf of Mexico.

Original here