
Sunday, September 7, 2008

Spy Satellites Chasing Shadows: NASA's Contribution to the War on Terror

Are you having trouble with funding? Is your research unable to attract major media attention? Just add Terror (TM)! That's what Dr Stoica of NASA's Jet Propulsion Laboratory did, and it could work for you too.

Dr Stoica's research is based on gait analysis - the idea that everyone has a distinctive walking pattern, and that no matter how many fake beards or dark glasses you put on, you can't disguise your stride unless you're wearing so many your knees break.

So far so good. Things only go off the rails with his proposal to use this software to scan shadows to find terrorist suspects. From space. Seriously. Satellites have a hard time recognising individuals from the tops of their heads (the lack of Superman in our world means people rarely have reason to look straight up), which is why Stoica has developed software which can reconstruct their profile from their shadow and the time of day.
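To be fair, the one recoverable step in the idea is real geometry: a shadow's length and the sun's elevation (known from the time, date and location) fix the caster's height, and the same scaling stretches a shadow silhouette back into an upright profile. A minimal sketch of just that step, with made-up numbers:

```python
import math

def height_from_shadow(shadow_length_m, sun_elevation_deg):
    """Recover the caster's height from its shadow length and the
    sun's elevation angle above the horizon: h = L * tan(elevation)."""
    return shadow_length_m * math.tan(math.radians(sun_elevation_deg))

# With the sun 30 degrees up, a 3 m shadow implies a ~1.73 m person.
print(round(height_from_shadow(3.0, 30.0), 2))  # -> 1.73
```

Everything past this trigonometry - resolving the shadow from orbit, matching the gait - is where the proposal falls apart, as below.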

That sort of thing might work in CSI: Hawaii Beach (or whatever they're up to now), but here in the real world there are a few problems with the pitch:

1. Satellites simply do not have the required resolution. The highest commercial satellite resolution currently available is half a meter, meaning you can only make out details about that big. If your terror suspect can still be recognized at that scale you could have saved time by checking the International Sumo Federation database. The military often has better gear, of course, but in this case it's not a question of money - it's a question of what's physically possible.

2. Computer reconstruction of images might seem standard to the average movie-goer, but in the real world you can't just say "enhance" and capture a fingerprint from a week-old pizza box. This idea calls for processor-intensive, time-sensitive reconstruction of hundreds of frames of footage for millions of people, all on the off chance of getting something clear enough to identify someone. There are a huge range of amazing data-mining applications of modern surveillance, but this isn't one of them. This is more a "with these new steam-engines, we could build a machine to care for infants" idea, i.e. a really bad one.

3. Walking patterns are distinctive, but they aren't "six billion different types" distinctive - especially not with reduced satellite resolution and when they've all been reprocessed by the same reconstruction algorithms (which will have their own bias).

4. Little thing, tiny point, but Dr Stoica does point out that this system will only catch those terrorists whose walking pattern is already on file. Unfortunately, the world's intelligence agencies don't actually have a catch-and-release policy for major terrorists - getting them, measuring them, then setting them loose to see which clever scientist can find them again - so we can expect this to be what we call a "major issue."

5. As well as the "it's good for Terror!" warning sign, there's also the "at the earliest stages of development" alarm bell in the announcements. Translation: "You will see nothing useful out of this for ten years, if ever, and the only reason we're even mentioning it is that we need the money/attention".
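Two of those objections can be put in rough numbers. The sketch below is illustrative only: the 2.4 m mirror, 500 km orbit, city size and 1-in-1,000 false-positive rate are assumptions chosen to be generous to the proposal, not figures from Stoica's announcement. Objection 1 is the Rayleigh diffraction limit (ground resolution scales as 1.22λh/D); objection 3 compounds with the base-rate problem:

```python
def ground_resolution_m(wavelength_m, altitude_m, aperture_m):
    """Rayleigh diffraction limit projected onto the ground:
    smallest resolvable detail ~ 1.22 * wavelength * altitude / aperture."""
    return 1.22 * wavelength_m * altitude_m / aperture_m

def expected_false_alarms(population, false_positive_rate):
    """Expected number of innocent people flagged in one full scan."""
    return population * false_positive_rate

# Objection 1: visible light (550 nm) from a 500 km orbit through a
# generous 2.4 m mirror still bottoms out around 14 cm per detail --
# a head-sized blob, not a recognisable stride.
print(round(ground_resolution_m(550e-9, 500e3, 2.4), 2))  # -> 0.14

# Objection 3: a matcher wrong about just 1 person in 1,000, sweeping
# a city of 5 million, cries wolf thousands of times per pass.
print(int(expected_false_alarms(5_000_000, 0.001)))  # -> 5000
```

No amount of budget changes the first number; it is set by the physics of apertures and wavelengths.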

You'd be better off finding which soft drink terrorists like best, then tasking satellites to track the delivery trucks. At least you can SEE those from space.

Posted by Luke McKinney.


'Big Bang machine' fires up

THE CHALLENGE
The LHC took 10,000 scientists a total of 14 years to assemble

How do you build a "Big Bang Machine"? That was the challenge which scientists at Cern began to ponder in the early 1980s, when the idea for the Large Hadron Collider was born.

Cern's governing council wanted to build a kind of time machine that could open a window to how the Universe appeared in the first microseconds of its existence.

If it could recreate the fleeting moments 13.73 billion years ago, when the fundamental building blocks of the cosmos took shape, then the world we live in today would be brought into much sharper focus.

It could discover how matter prevailed over antimatter, learn how dark matter was formed, and catch our first glimpse of the elusive Higgs boson - a "missing jigsaw piece" in our model of the universe.

We might even find evidence of the existence of other dimensions. But to conjure up these conditions, the Cern council knew it needed to perform an engineering miracle.

The 12-storey ATLAS detector weighs in at 7,000 tonnes

To generate the necessary high energies, the designers required a particle accelerator more magnificently complex than any machine ever built.

Beams of protons would be hurled together at 99.9999999% of the speed of light, in conditions colder than the space between the stars, each beam carrying as much energy as a car travelling at 1,600km/h.
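That car comparison roughly checks out. Using the LHC's published design parameters (2,808 bunches of 1.15x10^11 protons per beam, each proton at 7 TeV) and an assumed 1,600 kg car - the car's mass is our assumption, not the article's:

```python
EV_TO_J = 1.602e-19  # joules per electronvolt

# LHC design parameters: 2,808 bunches of 1.15e11 protons per beam,
# each proton carrying 7 TeV.
protons_per_beam = 2808 * 1.15e11
beam_energy_j = protons_per_beam * 7e12 * EV_TO_J

# Assumed car: 1,600 kg doing 1,600 km/h.
car_speed_ms = 1600 / 3.6
car_energy_j = 0.5 * 1600 * car_speed_ms**2

print(round(beam_energy_j / 1e6))  # beam energy, MJ -> 362
print(round(car_energy_j / 1e6))   # car kinetic energy, MJ -> 158
```

Hundreds of megajoules either way - the same order of magnitude, which is all a journalistic analogy needs.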

And yet the fruits of these explosions - high-energy particles - would decay and disappear from view in less than a trillionth of a second.

To "photograph" these valuable prizes would require a detector as large as a five storey building, yet so precise, it could pinpoint a particle with an accuracy of 15 microns - 20 times thinner than a human hair.

How on earth do you build a machine like that? The journey took 14 years, more than 10,000 scientists, from 40 countries, and a financial injection anticipated at up to 6.2bn euros - four times the original budget. But it was achieved, on time. Well, almost.

THE LARGE HADRON COLLIDER
The last of the LHC's 1,700 dipole magnets is lowered into place

The plans for the Large Hadron Collider began to gather momentum in the early 1980s, inspired by the success of its predecessor at Cern, a collider known as the Large Electron Positron (LEP).

But it was not until 1994 that the formal proposal for the LHC was ratified by Cern's member states, and the engineering work began.

The accelerator would be housed in a near-circular 27km-long tunnel, buried 50m-175m beneath the Jura mountains, criss-crossing the Swiss-French border. The tunnel was already in place - it was the one once occupied by LEP, which was eventually disassembled in 2000.

Inside the LHC vacuum pipe, two parallel beams of subatomic particles (protons or lead ions) would hurtle in opposite directions at record energies.

Crashing together at specially designated junctions, they would release unstable, high-energy particles - including, perhaps, the elusive Higgs boson.

To generate a magnetic field powerful enough to steer the high-energy particles around the pipe requires 1,740 superconducting magnets, which together required some 40,000 leak-tight welds and 65,000 "splices" of superconducting cables.

If you added all the filaments of these strands together, they would stretch to the Sun and back five times, with enough left over for a few trips to the Moon.

To remain superconducting, the magnets must be cooled to within a couple of degrees of "absolute zero", the theoretical limit for how cold anything can get. This requires a constant supply of liquid helium pumped down from eight over-ground refrigeration plants - about 400,000 litres per year in total.

THE DETECTORS
Engineers excavating the cavern for CMS encountered serious difficulty

At the junctions where particles collide, four enormous detectors have been designed to observe the microscopic wreckage.

Between 1996 and 1998, approval was granted for four giant "experiments" - Alice, Atlas, CMS and LHCb - to be housed in four enormous underground caverns, dug strategically around the collider loop.

Excavating these caverns out of sand, gravel and rock was a considerable feat. In the case of the 7,000 tonne ATLAS detector, it took two years to burrow a cavern large enough to hold a 12-storey building.

But while the Atlas cavern may be the largest, it was CMS - 10km around the ring, below the village of Cessy - which proved the most problematic at the excavation stage.

The cavern shaft had to be bored through a 50m layer of glacial deposits, including fast flowing water, which threatened to flood the shaft. Engineers repelled these underground rivers by piping super-chilled brine down the shaft, allowing a wall of ice 3m thick to form around the circumference.

It took six months to freeze the walls of the two CMS shafts. But while the barrier worked initially, the water eventually broke through, forcing engineers to first pump down liquid nitrogen to turn the area into "Siberian permafrost", in the words of Austin Ball, CMS Technical Coordinator.

MANUFACTURING PARTS
LHC components were transported to Cern from all over the world

Building the components of both the accelerator and the detectors was a truly international effort.

In the case of the 12,500-tonne CMS detector, the coiled strands of its central solenoid magnet - all 50km of them - began their life in Finland, before travelling to factories in Grenoble, Neuchatel and Genoa, to be braided, coated, and welded.

After being shipped to Marseille, they went up the river to Macon, where they were unpacked and driven by lorry under the mountains to Cern.

In fact, the diameter of the magnet was restricted to ensure it was just narrow enough that components could squeeze through the tunnels. The clearance was a matter of centimetres.

The CMS magnet is the most powerful solenoid ever built - conducting a current of 12,000 amperes - to create a magnetic field 100,000 times stronger than the Earth's.
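Those figures are broadly consistent with the textbook formula for a long solenoid, B = mu0 * n * I. The winding density below is an assumption for illustration (the article gives only the current), so treat this as an order-of-magnitude sketch:

```python
import math

def solenoid_field_tesla(turns_per_metre, current_a):
    """Ideal long-solenoid field: B = mu0 * n * I."""
    mu0 = 4 * math.pi * 1e-7
    return mu0 * turns_per_metre * current_a

# Assumed winding density: ~2,170 turns over a ~12.5 m coil length.
# The 12,000 A current is the figure quoted above.
b = solenoid_field_tesla(2170 / 12.5, 12_000)
earth_field = 5e-5  # typical geomagnetic field strength, tesla

print(round(b, 1))             # -> 2.6 (tesla)
print(round(b / earth_field))  # tens of thousands of times Earth's field
```

With these assumed windings the idealised formula lands within a factor of two of the "100,000 times stronger" figure - about as close as a back-of-envelope solenoid model can be expected to get.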

ASSEMBLING THE DETECTORS
The detector units of CMS were squeezed in with centimetres to spare

The next problem, of course, was how to get a 45m-long, 25m-high, 7,000-tonne detector through a shaft 20m wide.

The answer is to do it in bits. ATLAS was lowered piece by piece over several years and assembled almost entirely in the subterranean cavern.

The largest piece - the barrel toroid magnet - fitted down the cavern shaft with only 10cm of clearance on either side.

But the building of the detectors is not all heavy engineering. Layer upon layer of electronic sensors had to be wired and connected by hand, which meant up to 300 people a day working in the cavern cramped against each other.

Squeezing each piece into place was "like solving a wooden puzzle" - there is only one possible way of doing it, according to Professor Andy Parker of Cambridge University, one of the founders of Atlas.

"Everything fits together like Russian dolls. I saw one design for Atlas which fitted together, but you couldn't assemble it, because there was no room to move the pieces past each other. Every single millimetre of space was fought over," he said.

The CMS detector, on the other hand, was largely assembled above ground, in several enormous units.

The largest, at 2,000 tonnes (the weight of five jumbo jets, or one-third of the weight of the Eiffel tower) took 10 hours to lower down a 100m shaft, with a clearance of 20cm either side. The world's largest electromagnet had to be handled with extreme care.

Its cylindrically arranged silicon wafer detectors contain a vast network of micro-circuitry - including 73,000 radiation-hard, low-noise microelectronic chips, almost 40,000 analogue optical links and 1,000 power supply units.

To manufacture these required an entirely new method of auto-assembly.

PROBLEMS DURING TESTING
Failure of a magnet in testing delayed the LHC start-up by almost a year

Though the LHC was originally slated to begin operations in late 2007, the entire project was set back after a failure in one of the quadrupole magnets used to focus the beam, which buckled during testing.

This meant all similar magnets would have to be redesigned and replaced.

Other, less serious problems arose from leaky liquid-helium plumbing, and from copper "fingers" - used to ensure electrical continuity between magnets - that buckled when the magnets were warmed up.

GOING OVERBUDGET

The final tab for the LHC is expected to come in at a colossal 6.2bn euros, four times the original budget set by the Cern Council in 1995.

But that sum still represents good value for money, according to Dr Chris Parkes, of Glasgow University, UK, who works on the LHCb detector.

He said: "Tom Hanks is to appear in the movie of Dan Brown's Angels and Demons, which involves scientists at Cern making anti-matter. But the new experiment at the LHC to understand anti-matter cost less than Tom Hanks will earn from the movie."


For the Brain, Remembering Is Like Reliving

By BENEDICT CAREY

Scientists have for the first time recorded individual brain cells in the act of summoning a spontaneous memory, revealing not only where a remembered experience is registered but also, in part, how the brain is able to recreate it.

The recordings, taken from the brains of epilepsy patients being prepared for surgery, demonstrate that these spontaneous memories reside in some of the same neurons that fired most furiously when the recalled event had been experienced. Researchers had long theorized as much but until now had only indirect evidence.

Experts said the study had all but closed the case: For the brain, remembering is a lot like doing (at least in the short term, as the research says nothing about more distant memories).

The experiment, being reported Friday in the journal Science, is likely to open a new avenue in the investigation of Alzheimer’s disease and other forms of dementia, some experts said, as well as help explain how some memories seemingly come out of nowhere. The researchers were even able to identify specific memories in subjects a second or two before the people themselves reported having them.

“This is what I would call a foundational finding,” said Michael J. Kahana, a professor of psychology at the University of Pennsylvania, who was not involved in the research. “I cannot think of any recent study that’s comparable.

“It’s a really central piece of the memory puzzle and an important step in helping us fill in the detail of what exactly is happening when the brain performs this mental time travel” of summoning past experiences.

The new study moved beyond most previous memory research in that it focused not on recognition or recollection of specific symbols but on free recall — whatever popped into people’s heads when, in this case, they were asked to remember short film clips they had just seen.

This ability to richly reconstitute past experience often quickly deteriorates in people with Alzheimer’s and other forms of dementia, and it is fundamental to so-called episodic memory — the catalog of vignettes that together form our remembered past.

In the study, a team of American and Israeli researchers threaded tiny electrodes into the brains of 13 people with severe epilepsy. The electrode implants are standard procedure in such cases, allowing doctors to pinpoint the location of the mini-storms of brain activity that cause epileptic seizures.

The patients watched a series of 5- to 10-second film clips, some from popular television shows like “Seinfeld” and others depicting animals or landmarks like the Eiffel Tower. The researchers recorded the firing activity of about 100 neurons per person; the recorded neurons were concentrated in and around the hippocampus, a sliver of tissue deep in the brain known to be critical to forming memories.

In each person, the researchers identified single cells that became highly active during some videos and quiet during others. More than half the recorded cells hummed with activity in response to at least one film clip; many of them also responded weakly to others.

After briefly distracting the patients, the researchers then asked them to think about the clips for a minute and to report “what comes to mind.” The patients remembered almost all of the clips. And when they recalled a specific one — say, a clip of Homer Simpson — the same cells that had been active during the Homer clip reignited. In fact, the cells became active a second or two before people were conscious of the memory, which signaled to researchers the memory to come.
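The matching logic can be pictured as comparing population firing-rate vectors: each clip evokes a characteristic pattern of activity across the recorded neurons, and a recall event is identified by the viewing-time pattern it most resembles. A toy sketch with invented numbers (not data from the study):

```python
def cosine(u, v):
    """Cosine similarity between two firing-rate vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = sum(a * a for a in u) ** 0.5
    norm_v = sum(b * b for b in v) ** 0.5
    return dot / (norm_u * norm_v)

# Invented firing rates (spikes/s) for four recorded neurons while
# each clip was being viewed:
viewing_patterns = {
    "homer":  [12, 1, 3, 0],
    "eiffel": [2, 9, 1, 7],
}

# A recall-time pattern resembling the "homer" viewing pattern:
recall_pattern = [10, 2, 4, 1]
best_match = max(viewing_patterns,
                 key=lambda clip: cosine(viewing_patterns[clip], recall_pattern))
print(best_match)  # -> homer
```

The real analysis involved on the order of a hundred neurons per patient, but the principle - a recall reactivating its viewing-time pattern - is the same.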

“It’s astounding to see this in a single trial; the phenomenon is strong, and we were listening in the right place,” said the senior author, Dr. Itzhak Fried, a professor of neurosurgery at the University of California, Los Angeles, and Tel Aviv University.

His co-authors were Hagar Gelbard-Sagiv, Michal Harel and Rafael Malach of the Weizmann Institute of Science in Israel, and Roy Mukamel, of U.C.L.A.

Dr. Fried said in a phone interview that the single neurons recorded firing most furiously during the film clips were not acting on their own; they were, like all such cells, part of a circuit responding to the videos, including thousands, perhaps millions, of other cells.

In studies of rodents, including a paper that will also appear Friday in the journal Science, neuroscientists have shown that special cells in the hippocampus are sensitive to location, activating when the animal passes a certain spot in a maze. The firing pattern of these cells forms the animals’ spatial memory and can predict which way the animal will turn, even if it makes a wrong move.

Some scientists argue that as humans evolved, these same cells adapted to register a longer list of elements — including possibly sounds, smells, time of day and chronology — when an experience occurred in relation to others.

Single-cell recordings cannot capture the entire array of circuitry involved in memory, which may be widely distributed beyond the hippocampus area, experts said. And as time passes, memories are consolidated, submerged, perhaps retooled and often entirely reshaped when retrieved later.

Though it did not address this longer-term process, the new study suggests that at least some of the neurons that fire when a distant memory comes to mind are those that were most active back when it happened, however long ago that was.

“The exciting thing about this,” said Dr. Kahana, the University of Pennsylvania professor, “is that it gives us direct biological evidence of what before was almost entirely theoretical.”


How the Human Got His Thumbs


For decades, people referred to the non-coding bits of DNA between genes as junk DNA. Then, in the eighties, scientists discovered that some of that junk DNA served an important purpose: it attracted or repelled transcription factors and RNA, greatly enhancing or inhibiting the potency of adjacent genes. Now scientists have found that one of those gene enhancers may be what separates humans from chimps.

Researchers from the U.S. Department of Energy's Lawrence Berkeley National Laboratory, publishing online in the most recent issue of the journal Science, placed human, chimpanzee and macaque versions of the enhancer into mouse DNA. The scientists also added a gene that would release a blue dye to show where in the mouse fetus the enhancer was most active. When the mice developed, the researchers saw that the enhancer was active in the hands, feet and throat. Additionally, the mice with the human version showed the most activity, with the chimp version producing some activity, and the macaque version producing very little.

The researchers then showed that the difference between the human and chimp versions of the enhancer comes down to only 13 nucleotides - a far larger number of changes than would be expected had the mutations been the result of drift rather than selection. The location of enhancer activity highlights the importance of the difference. Our hands, with their opposable thumbs; our feet, evolved for bipedal locomotion; and our throats, which allow us to speak, make up three key differences between humans and all other apes. Because of its role enhancing the genes that regulate the development of those regions, the evolution of this gene enhancer must have been a key step in humanity's divergence from the human/chimpanzee common ancestor. Furthermore, by following the presence of this particular gene enhancer, researchers should be able to locate which genes are responsible for our differences from chimps and when they evolved. Mapping out humanity's divergence from apes? Not bad for a bunch of DNA once thought of as junk.
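The drift-versus-selection argument rests on a simple expectation: under neutral drift, substitutions accumulate roughly as length x rate x time, and 13 changes in one short enhancer blows well past that. The figures below (enhancer length, neutral rate, divergence time) are round illustrative assumptions, not the paper's numbers:

```python
import math

def poisson_tail(k, lam):
    """P(X >= k) for X ~ Poisson(lam)."""
    return 1.0 - sum(math.exp(-lam) * lam**i / math.factorial(i)
                     for i in range(k))

# Round illustrative assumptions (not the paper's exact figures):
length_bp = 550   # enhancer length in base pairs
rate = 1e-9       # neutral substitutions per site per year
years = 6e6       # rough human/chimp divergence time

expected_changes = length_bp * rate * years
print(round(expected_changes, 1))                 # -> 3.3 changes under drift
print(poisson_tail(13, expected_changes) < 1e-4)  # -> True: 13 is wildly unlikely
```

When observed changes dwarf the neutral expectation this badly, selection is the natural explanation - which is the paper's point.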

Feds Warn Climate Change Could Harm Giant Sequoias

Federal researchers are warning that warming temperatures could soon cause California's giant sequoia trees to die off more quickly unless forest managers plan with an eye toward climate change and the impact of a longer, harsher wildfire season.

Hot, dry weather over the last two decades already has contributed to the deaths of an unusual number of old-growth pine and fir trees growing in Yosemite and Sequoia National Parks, according to recent research from the U.S. Geological Survey.

In the next decade, climate change also could start interfering with the giant sequoias' ability to sprout new seedlings, said Nathan Stephenson, one of several scientists speaking Thursday at a government agency symposium on how global warming could affect the Sierra Nevada.

"The first effects of climate change that we're likely to see is that the giant sequoias will have trouble reproducing because their root systems don't work as well when temperatures warm," said Stephenson, a research ecologist with the USGS Western Ecological Research Center. "After that, I wouldn't be surprised if in 30 years we see their death rates go up."

Sequoiadendron giganteum, an inland cousin to the tall California coast redwood, can live 2,900 years or more and bulk up to more than 36 feet in diameter, making these trees among the world's most massive living things.

Stephenson was among a team of tree demographers who monitored the health of pines and firs growing in the two southern Sierra Nevada parks from 1982 to 2004.

As both temperatures and summer droughts increased over that period, he found the trees' normal death rate more than doubled, and stands became more vulnerable to attacks from insects or fungus.

While those species have a faster life cycle than the ancient sequoias, scientists say their mortality rates can help predict what may happen to the massive trees as temperatures increase by a predicted average of 3 to 10 degrees Fahrenheit statewide by the end of the century.

"We've got a lot of our most cherished species at stake," said Constance Millar, a senior research scientist with the U.S. Forest Service. "Rather than just managing forests for the plants we see growing there today, we're now having to look forward to think about what might thrive there in 100 years."

Native flora and fauna throughout the 400-mile-long Sierra Nevada mountain range are already under stress from a warming climate, and federal land managers have started monitoring wildlands to understand how they're transforming.

Some officials have already started making changes based on what they see on the ground.

Recently, the Forest Service redrew its decades-old maps for where to place fire breaks along the Sierra Nevada, moving suppression efforts down from the ridge lines to lower regions where scientists now believe habitats are at risk from wildfires, Millar said.

One local species troubled by rising temperatures is the mountain-dwelling American pika, or rock-rabbit. The 6-inch-long rabbit relative thrives in cool, mountaintop climates, but at higher temperatures it can overheat and die within hours.

The population has been dwindling and drifting to ever higher elevations, but biologists fear it eventually could run out of mountain.

Still, because it could take years to understand how different animals and plants are influenced not only by rising temperatures but also by fires, pollutants, forest management practices and other agents of change, park officials said they need to proceed cautiously.

"Right now, we're going to focus our efforts on the big icon for the parks, the giant sequoias," said Craig Axtell, superintendent of Sequoia and Kings Canyon National Parks. "But we may find that other problems come up down the road that we don't even know about."

GM Going 50% Landfill-Free By 2010

Sometimes when big traditional companies announce good news about ways they're going to reduce waste, the question arises: shouldn't they have done this earlier? That's what I wonder about General Motors Corp.'s announcement today that within a year and a half, 50% of its facilities will be recycling virtually all of their waste.

GM says that by the end of 2010, half of its major global manufacturing operations will be landfill-free. A facility achieves landfill-free status when all of its production waste or garbage is recycled or reused. So far, the company says, 43 of its operations have reached that status, 33 of them recently.

At the landfill-free plants, more than 96% of waste materials are recycled or reused, and the remaining 3% is converted to energy at waste-to-energy plants.

Doing good will help the company’s bottom line. In a statement, GM says as a result of its global recycling efforts, recycled metal scraps are approaching $1 billion in annual revenue. In North America alone, selling off its recycled cardboard, wood, oil, plastic and other materials added $16 million in revenue.

This, on top of playing around with solar, getting rid of truck and SUV plants, and investing in ethanol technologies among many other eco-friendly moves, shows that GM has sustainability in mind - at least when it comes to sustaining its business, which works just fine for us.

GM has about 160 manufacturing facilities worldwide, including joint ventures. It plans to make 80 of them landfill-free.

Asia pollution may boost U.S. temperatures

Smog, soot and other particles like the kind often seen hanging over Beijing add to global warming and may raise summer temperatures in the American heartland by three degrees in about 50 years, says a new federal science report released Thursday.
Smog like this seen over Beijing, China, is caused mostly by burning wood and by driving trucks and cars.

These overlooked, shorter-term pollutants -- mostly from burning wood and kerosene and from driving trucks and cars -- cause more localized warming than once thought, the authors of the report say.

They contend there should be a greater effort to attack this type of pollution for faster results.

For decades, scientists have concentrated on carbon dioxide, the most damaging greenhouse gas because it lingers in the atmosphere for decades. Past studies have barely paid attention to global warming pollution that stays in the air merely for days.

The new report, written by scientists with NASA and the National Oceanic and Atmospheric Administration, makes a case for tackling the short-term pollutants, while acknowledging that carbon dioxide is still the chief cause of warming.

That concept is also the official policy of the Bush Administration, said assistant secretary of commerce Bill Brennan.

In the United States, this approach would mean cutting car and truck emissions perhaps before restricting coal-burning power plants. In the developing world, especially Asia, it would mean shifting to cleaner energy sources, more like those used in the Western world. Much of this type of pollution in Asia comes from burning kerosene and biofuels, such as wood and animal dung.

In addition to soot, smog and sulfates, other short-lived pollutants are organic carbon, dust and nitrates. While carbon dioxide is invisible, these are pollutants people can see.

Projected increases in some of these pollutants and decreases in others in Asia will eventually add up to about 20 percent of the already-predicted man-made summer warming in America by 2060, the report said.

"What they do about their pollution can affect our climate," said study co-author Hiram "Chip" Levy, a senior scientist at NOAA's fluid dynamics lab in Princeton, New Jersey.

This pollution will likely create three "hot spots" in the world: the central United States, Europe around the Mediterranean Sea, and Kazakhstan, which borders Russia and China. In the United States it's "a big blob in the middle of the country" stretching from the Rocky Mountains to the Appalachians, Levy said.

The same analysis also shows about an inch less of yearly rain in middle America because of Asian emissions by about 2060.

As for American-produced pollution, smog is the main problem. Reducing diesel emissions and increasing mass transit would prove a more effective and immediate strategy than limiting power plants, said study co-author Drew Shindell, a climate scientist at NASA's Goddard Institute for Space Studies in New York.

The report makes sense, but should also include a strategy for man-made methane, a greenhouse gas which lasts 10 years in the atmosphere, said Michael MacCracken, chief scientist at the Climate Institute in Washington.

Methane mostly comes from landfills, natural gas use, livestock, coal mining and sewage treatment, according to the U.S. Environmental Protection Agency.