Sunday, March 23, 2008

Aurora Borealis, Witnessed by the Crew of the Space Shuttle

Original here

Astronaut tests 'boomerang' in space

Space cadet ... Takao Doi's wacky experiment worked.

In an unprecedented experiment, a Japanese astronaut has thrown a boomerang-like object in space and confirmed it flies back much as it does on Earth.

Astronaut Takao Doi "threw a boomerang and saw it come back" during his free time on March 18 at the International Space Station, a spokeswoman at the Japan Aerospace Exploration Agency said.

Doi threw the boomerang after a request from compatriot Yasuhiro Togai, a world boomerang champion.

"I was very surprised and moved to see that it flew the same way it does on Earth," the Mainichi Shimbun daily quoted the 53-year-old astronaut as telling his wife in a chat from space.

The space agency said a videotape of the experiment would likely be released later.

Mr Doi travelled to the International Space Station aboard the US shuttle Endeavour, which blasted off on March 11, and successfully delivered the first piece of a Japanese laboratory to the station.

Original here


Google Joins MIT in Search for Earth-like Planets

"When starships transporting colonists first depart the solar system, they may well be headed toward a TESS-discovered planet as their new home."

George R. Ricker, senior research scientist at the Kavli Institute for Astrophysics and Space Research at MIT

Google has joined MIT scientists who are designing a satellite-based observatory, the Transiting Exoplanet Survey Satellite (TESS), that they say could for the first time provide a sensitive survey of the entire sky to search for Earth-like planets outside the solar system that appear to cross in front of bright stars. Google will fund development of the wide-field digital cameras needed for the satellite.

"Decades, or even centuries after the TESS survey is completed, the new planetary systems it discovers will continue to be studied because they are both nearby and bright," says George Ricker, leader of the project.

Most of the more than 200 extrasolar planets discovered so far have been much larger than Earth, similar in size to the solar system's giant planets (ranging from Jupiter to Neptune), or even larger. But to search for planets where there's a possibility of finding signs of living organisms, astronomers are much more interested in those that are similar to our own world.

Most searches so far depend on the gravitational tug that planets exert on their stars in order to detect them, and are therefore best at finding large planets that orbit close to their stars. TESS, however, would search for planets whose orbits, as seen from Earth, carry them directly in front of their star, obscuring a tiny amount of starlight. Some ground-based searches have used this method and found about 20 planets so far, but a space-based search could detect much smaller, Earth-sized planets, as well as those with larger orbits.

This transit-detection method, by measuring the exact amount of light obscured by the planet, can pinpoint the planet's size. When combined with spectroscopic follow-up observations, it can determine the planet's temperature, probe the chemistry of its atmosphere, and perhaps even find signs of life, such as the presence of oxygen in the air.
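To make the geometry concrete, here is a minimal Python sketch of the relation the transit method rests on: the fractional dimming equals the squared ratio of planet radius to star radius. The Earth and Sun radii are standard reference values used purely for illustration; they are not figures from the article.

    import math

    R_SUN_KM = 695_700.0   # solar radius (reference value)
    R_EARTH_KM = 6_371.0   # Earth radius (reference value)

    def transit_depth(r_planet_km: float, r_star_km: float) -> float:
        """Fraction of starlight blocked during a transit: (Rp / Rs) ** 2."""
        return (r_planet_km / r_star_km) ** 2

    depth = transit_depth(R_EARTH_KM, R_SUN_KM)
    print(f"Earth transiting the Sun dims it by {depth * 100:.4f}%")

    def planet_radius_km(depth: float, r_star_km: float) -> float:
        """Invert the relation: a measured depth pins down the planet's size."""
        return r_star_km * math.sqrt(depth)

    print(f"Recovered planet radius: {planet_radius_km(depth, R_SUN_KM):,.0f} km")

The dimming works out to roughly 0.008 percent, which is why Earth-sized transits call for space-based photometric precision.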

The satellite will be equipped with six high-resolution, wide-field digital cameras, which are now under development. Two years after launch, the cameras--which have a total resolution of 192 megapixels--will cover the whole sky, getting precise brightness measurements of about two million stars in total.

Statistically, since the orientation of orbits is random, about one star in a thousand will have its planets' orbits oriented edge-on to our line of sight, so that the planets regularly cross in front of their star, an event called a planetary transit. So, out of the two million stars observed, the new observatory should be able to find more than a thousand planetary systems within two years.
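That yield estimate is just the product of the two numbers quoted above; a quick back-of-envelope check using only the article's own figures:

    # Expected transit yield from the figures quoted in the article.
    n_stars = 2_000_000         # stars with precise brightness measurements
    p_edge_on = 1 / 1_000       # fraction with edge-on planetary orbits
    print(n_stars * p_edge_on)  # 2000.0 -> "more than a thousand" systems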

In fact, if a new estimate based on recent observations of dusty disks is confirmed, there might even be up to 10 times as many such planets.

Because the satellite will be repeatedly taking detailed pictures of the entire sky, the amount of data collected will be enormous. As a result, only selected portions will actually be transmitted back to Earth. But the remaining data will be stored on the satellite for about three months, so if astronomers want to check images in response to an unexpected event, such as a gamma-ray burst or supernova explosion, "they can send us the coordinates [of that event] and we could send them the information," Ricker says.

Because of the huge amount of data that will be generated by the satellite, which could be launched as early as 2012, Google has an interest in helping to develop ways to process that data to find useful information.

Regardless of the funding for the satellite, the same wide-field cameras being developed for TESS could also be used for a planned ground-based search for dark matter in the universe--the invisible, unknown material that astronomers believe is more prevalent in space than the ordinary matter that we can see. Some of the unknown dark-matter particles must constantly be striking the Earth, and the plan is to train a bank of cameras inside tanks of fluid deep underground, to detect flashes of light produced by the impacts of these dark particles. Ricker's Kavli group is participating with MIT physics professor Peter Fisher's team in this new physics research initiative.

The electronic detectors for the new cameras are being developed in collaboration with MIT's Lincoln Laboratory. The lab's expertise in building large, highly sensitive detectors is a significant factor in making possible these unique cameras, which have no moving parts at all. If all goes well and funding is secured, the satellite could be launched in 2012 with NASA support, or even earlier with a private sponsor.

Posted by Casey Kazan. Adapted from an MIT release.

Original here

Mars is 'covered in table salt'

False-colour composite image showing possible chloride salts in blue (Image courtesy of Mikki M Osterloo)

Mars appears to be covered in salt crystals from ancient dried-up lakes, new evidence suggests.

A Nasa probe has found signs that the southern hemisphere is dusted with chloride mineral, perhaps "table salt".

US scientists think the mineral formed when water evaporated from salty lakes or soil billions of years ago.

The deposits, similar to salt-pans on Earth, are a good place to search for traces of past life preserved in salt, they report in the journal Science.

The evidence comes from a camera on Mars Odyssey, which has been mapping the Red Planet since early 2002.

The camera images Mars in the visible and infrared parts of the spectrum in order to work out the distribution of minerals on the surface.

It found about 200 places with spectral "fingerprints" consistent with chloride salt deposits.

All were in the middle to low latitudes of the ancient, heavily cratered terrain in the southern highlands.

Team member Professor Philip Christensen, of the School of Earth and Planetary Exploration at Arizona State University, Tempe, said the most likely chloride mineral "would be good old table salt (sodium chloride)".

He said many of the deposits lay in basins with channels leading into them, the kind of feature that is consistent with water flowing in over a long time.

He told BBC News: "Two possible mechanisms would be the evaporation of a large body of water (like a salt lake on Earth), or capillary action in the soil that could draw salt-rich water toward the surface, where the water evaporates and the salt is left behind and accumulates.

"Either case is exciting because it implies a large amount of water near the surface."

Warmer, wetter

The team - which also includes members from the University of Hawaii, University of Arizona and Stony Brook University - thinks the deposits formed about four billion years ago, when Mars was probably much warmer and wetter.

Mars Odyssey: its data suggests water was once plentiful (Image: Nasa)

The locations range in area from one square kilometre to about 25 square kilometres, which approaches the size of some of the largest lakes on Earth.

The discovery suggests that there was once a lot of water near the surface of Mars and a source of energy, namely sunlight, said Dr Christensen.

He added: "Salt is also an excellent means of preserving organic material, so if there was life present in the distant past, the signature might still be there."

The scientists say the areas with chloride minerals should be given a high priority for future rover missions to Mars.

Until now, efforts have largely concentrated on a handful of places that show evidence of clay or sulphate minerals.

Andrew Knoll of Harvard University's Department of Earth and Planetary Sciences, who was not part of the study, said it suggested that water once occurred widely on the Martian surface, as salty transient deposits - playa lakes, not oceans.

But he said the findings carried "a double-edge sword for astrobiology".

"Water is the first sign that an environment might have been habitable, but waters that precipitate table salt on Mars would have been much saltier than any waters known to support microbial populations on Earth," he explained.

Original here

The UFO Phenomenon: “Religion or Science”?

Is the world’s fascination with the possibility of UFOs a religion, or a natural intuitive sense that life is “out there”, based on current scientific research and recent planet-search discoveries?

One of the world’s preeminent astrophysicists, Carl Sagan, believed that “the interest in unidentified flying objects derives, perhaps, not so much from scientific curiosity as from unfulfilled religious needs.”

No one could have foreseen the extent to which the idea of alien visitors would pervade popular culture prior to the publication in 1897 of H.G. Wells’s The War of the Worlds and Kurd Lasswitz’s On Two Planets – both in the vanguard of an enormous number of treatments of the alien theme in science fiction.

The modern UFO era and the birth of the extraterrestrial hypothesis began on June 24, 1947, when Kenneth Arnold, flying his private plane near Mount Rainier in Washington, reported nine disk-shaped objects flying in formation at speeds he estimated to be over 1,000 miles per hour.

Arnold, a respected businessman and deputy U.S. marshal, was taken seriously, and his description of the objects as flying “like a saucer if you skipped it across the water” led newspapers to coin the term “flying saucer.”

The alien hypothesis first officially emerged in 1948 with the Air Force's "Project Sign," which concluded that UFOs were of extraterrestrial origin. The report was later rejected by General Hoyt Vandenberg, and its copies were reportedly ordered burned.

If UFOs exist, how do they traverse the universe? According to conventional wisdom, one can only travel through space no faster than the speed of light. At that rate, it would take millions of years to traverse the galaxy, let alone the universe, and who has time for that? If there’s a way to manipulate the curvature of space and time, then we have all the time we need.

In sync with India’s love affair with UFOs, a recent editorial on a popular Indian news site treats UFOs, singularities, time folding (and just about every other theory ever proposed) as a complete given. After all, one aspect of quantum physics is that in an alternate universe, anything could happen.

While several advanced theories do have some solid ideas to back them up, others seem a bit far-fetched—even for those willing to accept that there may be upwards of twenty-six dimensions, rather than the standard four. So which current theory is the most likely to someday satisfactorily explain the science of UFOs?

Though it may seem like pure fiction, it is commonly accepted that wormholes are possible within the framework of general relativity. Although folding space has yet to be documented, there continues to be a healthy debate in the scientific community about their possible existence. If they do exist, it would explain how something or someone could traverse huge distances very quickly. Stephen Hawking gave a lecture, which discussed the possibility of wormholes in folding space. The implications of human travel through these wormholes could result in “short-cutting” through vast distances and even time itself.

According to this idea, one could even move faster than the speed of light. Professor Hawking puts it this way, “If you can travel from one side of the galaxy, to the other, in a week or two, you could go back through another wormhole, and arrive back before you set out.”

While a bit unfathomable, a similar type of “time travel” has already been demonstrated. Scientists who studied astronauts on space shuttle flights have found that, because of the shuttle’s high speed, time moved more slowly for those on board.
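The size of that effect follows from special relativity's time dilation factor. A minimal sketch, using an assumed orbital speed of about 7.7 km/s (typical for low Earth orbit, not a figure from the article) and ignoring the separate general-relativistic effect of altitude:

    import math

    C = 299_792_458.0   # speed of light, m/s
    v = 7_700.0         # assumed orbital speed, m/s

    # Lorentz factor: moving clocks tick slower by 1/gamma.
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)

    seconds_per_day = 86_400
    lag = (gamma - 1.0) * seconds_per_day
    print(f"onboard clock lags by about {lag * 1e6:.0f} microseconds per day")

Under these assumptions the onboard clock falls behind by roughly 30 microseconds a day: real and measurable, but a very long way from a usable time machine.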

So, what is a wormhole? Simply put, “masses that place pressure on different parts of the universe could eventually come together to form a tunnel.” Wormholes are also referred to as “Einstein-Rosen bridges”, and arise from Einstein’s theory of general relativity and its description of the space-time continuum.

While scientists currently have no realistic method of finding a wormhole (nor proof that they even exist), there is no reason why they couldn’t. In fact, their existence would certainly help make sense of some current paradoxes in the world of physics. While the answers aren’t quite here yet, the questions are being asked. If wormholes are proven to exist, the possibilities will be literally endless.

Original here

Earth, Mars, Moon Have Different Origin, Study Says

A new study is challenging the long-standing notion that the whole solar system formed from the same raw materials. Until now most scientists had believed that the inner solar system bodies—Mercury, Venus, Earth, its moon, and Mars—had the same composition as primitive meteorites called chondrites.

But, problematically, Earth's chemistry doesn't quite match.

Now, Guillaume Caro, a researcher at the Centre de Recherches Pétrographiques et Géochimiques in France, and his colleagues say that the makeup of Mars and the moon doesn't correspond either.

It turns out the three bodies may be more similar to each other than to the chondrite-rich asteroids located between Mars and Jupiter.

Caro and his team say scientists may now have to revisit the idea that chondrites represent the building blocks for the whole solar system.

"What our results suggest is that the sorting of the elements that make up these planets may have happened at a much earlier stage than had been believed," said Alex Halliday, a study co-author from Oxford University.

"The composition of these worlds is inconsistent with them simply forming out of large 'lumps' of stony meteorites like those we see today in the asteroid belt."

The study appears in this week's issue of the journal Nature.

Earth Askew

Chondrites are the most common class of meteorites, and, at an estimated 4.5 billion years old, believed to be the oldest.

Because the objects chemically resemble the sun, it is widely believed that they represent the basic materials for the entire solar system.

One telltale signature of chondrites is an abundance of neodymium 142, a by-product of the decay of the rare earth metal samarium.

In the past several years researchers noticed that Earth's crust contains too great a ratio of neodymium 142 compared to chondrites.

Seeking to show that the Earth isn't an oddball, Caro and his team turned to Mars and reviewed old data from Earth's moon.

"We found that Martian and lunar rocks are also characterized by an excess in neodymium 142 compared with chondrites," he said.

All Shook Up?

Supporters of the idea that the inner planets formed from chondritic materials have long speculated that the Earth's turbulent history could be to blame for its chemical differences.

Earth regularly shakes up its crust and mantle through plate tectonics and convection—which could have buried reservoirs of material that would balance out the elemental ratio, the scientists argue.

Mars and the moon haven't put their surfaces through the same grinders, however, and yet also appear to have excess neodymium 142.

Caro and his team say the difference could come from erosion of planetary crusts in the bodies' formative years. Or the inner planets might have formed long before the rocky bodies of the outer solar system.

Not Quite Settled

Vinciane Debaille, of NASA's Lunar and Planetary Institute in Houston, and her colleagues published a paper in the November 22, 2007, issue of Nature that paints a different picture.

They studied the same class of Mars rocks used in the newer study and agreed that they differ from chondrites.

But her team suggests that early Mars had an insulating atmosphere that kept the planet's interior warm, thereby sustaining a molten magma ocean up to 110 million years after the solar system's formation.

This could have created underground magma reserves rich in the "missing" isotopes.

Richard Carlson, from the Carnegie Institution of Washington, says he hasn't given up on the idea that Earth could be harboring chondrite-like deposits close to its core.

For instance, ancient rocks from Greenland differ from rocks elsewhere on Earth and have different ratios of neodymium isotopes, he points out.

And he hesitates to draw any conclusions about all of Mars from a small sample of its rocks.

"To think that we can get definitive information about the bulk composition of Mars from a handful of meteorites," he said, "all likely from the same area of the Martian crust, is very optimistic."

Original here

British minister defends embryo research bill

LONDON (Reuters) - The British government is right to push through hybrid human-animal embryo legislation after a Roman Catholic cardinal attacked the government for "endorsement of experiments of Frankenstein proportion", Health Minister Ben Bradshaw has told the BBC.

The leader of the Roman Catholic Church in Scotland, Cardinal Keith O'Brien, has called for a proposed new law -- the Human Fertilization and Embryology Bill -- to outlaw the practice, and wants the government to allow a free vote on the legislation.

"I think if it was about the things the cardinal referred to, creating babies for spare parts or raiding dead people's tissue then there would be justification for a free vote," Bradshaw told the BBC Radio 4's "Any Questions" on Friday.

"But it's not about those things. He was wrong in fact, and I think rather intemperate and emotive in the way that he criticized this legislation.

"This is about using pre-embryonic cells to do research that has the potential to ease the suffering of millions of people in this country. The Government has taken a view that this is a good thing.

"We have free votes on issues of conscience like abortion, like the death penalty, where the government does not take a view.

"I think in this case the Government's absolutely right to try to push this through to the potential benefit of many people in this country."

Supporters of hybrid research say it will give scientists the large number of embryos they need to make stem cells to help find cures for a range of diseases.

Researchers create inter-species hybrids by injecting human DNA into a hollowed-out animal egg cell. The resulting embryo is 99.9 percent human and 0.1 percent animal.

Britain is one of the leading states for stem cell research, attracting scientists from around the world with a permissive environment that allows embryo studies within strict guidelines.

Scientists in China, the United States and Canada have carried out similar work using the same technique that created Dolly the sheep, the world's first cloned mammal.

"This Bill represents a monstrous attack on human rights, human dignity and human life," said O'Brien in a speech to be delivered on Sunday.

"We are about to have a public government endorsement of experiments of Frankenstein proportion without many people really being aware of what is going on."

There are three Catholics in Prime Minister Gordon Brown's cabinet -- Welsh Secretary Paul Murphy, Transport Secretary Ruth Kelly and Defense Secretary Des Browne -- with one of them reportedly ready to resign over the proposed law.

The Human Fertilization and Embryology Authority, which regulates the research, gave two groups of UK-based scientists permission in January to use hybrids.

The draft law is making its way through parliament and is due to return to the House of Commons in the coming weeks.

The House of Lords rejected attempts earlier this year to include a ban on hybrid research in the draft legislation.

Original here

Superconductors at room-temp NOT reached, comments from researcher inside

As you may remember, yesterday we did a quick post on a team of international researchers that was being reported as having found a material that could superconduct at room temperature (which is more or less the holy grail of superconductors). As I mentioned in that article, there seemed to be some discrepancies between what was being reported in some places (like nextenergynews, where I originally found the report) and the press release from the university of Dr. John Tse, the lead researcher.

Basically the press release seemed to say that they had achieved superconductivity in a material that could potentially lead to superconductors at room temperature, while other sources were claiming outright that an actual room-temperature superconductor had been made. To get to the bottom of it, I contacted Dr. Tse directly, and here is what he's told me.

He was apparently misquoted and misinterpreted in EETimes (which has since issued a correction), which then led to the other misquotes and the subsequent wrong reporting of a new room-temperature superconductor.

Here is how Dr. Tse explained to me what they DID do in their research:

What they did was follow up on a suggestion made in 2004 by Prof. Ashcroft of Cornell: that if a high enough density of hydrogen could be prepared in a solid, it might exhibit superconducting properties. He suggested using hydrogen-rich compounds, which is exactly what they did (silane). They did indeed achieve this high-density hydrogen state in silane, and subsequently detected superconductivity. The temperature at which they found it to superconduct was actually 16 K (around 280 K would be room temperature), at a pressure of 120 gigapascals, and as Dr. Tse said, “A good understanding of the mechanism may lead to the design of materials with even higher T_c”.

So there you have it! Not a room-temp superconducting material, but it may pave the way for it.

I don’t mean to be link-whoring here, but I would really appreciate a Digg/Reddit vote, which can be done at the top of this page, as both websites did report the misquotes as truth, and it would be nice to have it corrected.

Original here

"A good metaphor is something even the police should keep an eye on." - G.C. Lichtenberg

Although the brain-computer metaphor has served cognitive psychology well, research in cognitive neuroscience has revealed many important differences between brains and computers. Appreciating these differences may be crucial to understanding the mechanisms of neural information processing, and ultimately for the creation of artificial intelligence. Below, I review the most important of these differences (and the consequences to cognitive psychology of failing to recognize them): similar ground is covered in this excellent (though lengthy) lecture.

Difference # 1: Brains are analogue; computers are digital

It's easy to think that neurons are essentially binary, given that they fire an action potential if they reach a certain threshold, and otherwise do not fire. This superficial similarity to digital "1's and 0's" belies a wide variety of continuous and non-linear processes that directly influence neuronal processing.

For example, one of the primary mechanisms of information transmission appears to be the rate at which neurons fire - an essentially continuous variable. Similarly, networks of neurons can fire in relative synchrony or in relative disarray; this coherence affects the strength of the signals received by downstream neurons. Finally, inside each and every neuron is a leaky integrator circuit, composed of a variety of ion channels and continuously fluctuating membrane potentials.
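The "leaky integrator" point lends itself to a few lines of code. Below is a minimal sketch of a leaky integrate-and-fire neuron; the parameter values are illustrative defaults, not physiological claims from the article. The membrane potential is a continuously decaying, analogue quantity, even though the emitted spike looks binary:

    # Leaky integrate-and-fire neuron (illustrative parameters only).
    dt, tau = 0.1, 10.0                               # time step and membrane time constant, ms
    v_rest, v_thresh, v_reset = -65.0, -50.0, -65.0   # potentials, mV
    input_current = 20.0                              # constant drive, arbitrary units
    steps = 1000                                      # simulate 100 ms

    v = v_rest
    spike_times = []
    for step in range(steps):
        # Continuous leaky integration toward rest, plus the injected current.
        v += ((v_rest - v) + input_current) / tau * dt
        if v >= v_thresh:                 # all-or-none spike at threshold...
            spike_times.append(step * dt)
            v = v_reset                   # ...followed by a reset

    rate_hz = 1000 * len(spike_times) / (steps * dt)
    print(f"{len(spike_times)} spikes in 100 ms (~{rate_hz:.0f} Hz)")

Note how the binary-looking output (the spike train) is driven entirely by a continuous state variable, which is exactly the article's point.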

Failure to recognize these important subtleties may have contributed to Minsky & Papert's infamous mischaracterization of perceptrons, neural networks without an intermediate layer between input and output. In linear networks, any function computed by a 3-layer network can also be computed by a suitably rearranged 2-layer network. In other words, combinations of multiple linear functions can be modeled precisely by just a single linear function. Since their simple 2-layer networks could not solve many important problems, Minsky & Papert reasoned that larger networks also could not. In contrast, the computations performed by more realistic (i.e., nonlinear) networks are highly dependent on the number of layers - thus, "perceptrons" grossly underestimate the computational power of neural networks.
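The linear-collapse argument is easy to verify numerically. A small illustrative sketch in NumPy: composing two linear layers gives exactly one linear map, while inserting a nonlinearity between them breaks the collapse:

    import numpy as np

    rng = np.random.default_rng(0)
    W1 = rng.normal(size=(5, 3))    # input -> hidden weights
    W2 = rng.normal(size=(2, 5))    # hidden -> output weights
    x = rng.normal(size=3)

    deep = W2 @ (W1 @ x)            # two linear layers...
    shallow = (W2 @ W1) @ x         # ...equal one collapsed layer
    print(np.allclose(deep, shallow))   # True

    relu = lambda z: np.maximum(z, 0.0)
    deep_nonlinear = W2 @ relu(W1 @ x)  # nonlinearity between layers
    print(np.allclose(deep_nonlinear, shallow))  # generally False

This is why depth only buys computational power once the units are nonlinear, as real neurons are.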

Difference # 2: The brain uses content-addressable memory

In computers, information in memory is accessed by polling its precise memory address. This is known as byte-addressable memory. In contrast, the brain uses content-addressable memory, such that information can be accessed in memory through "spreading activation" from closely related concepts. For example, thinking of the word "fox" may automatically spread activation to memories related to other clever animals, fox-hunting horseback riders, or attractive members of the opposite sex.

The end result is that your brain has a kind of "built-in Google," in which just a few cues (key words) are enough to cause a full memory to be retrieved. Of course, similar things can be done in computers, mostly by building massive indices of stored data, which then also need to be stored and searched through for the relevant information (incidentally, this is pretty much what Google does, with a few twists).
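For contrast with address-based lookup, here is a toy sketch of content-addressable retrieval: a memory is recalled by how well its content overlaps a partial cue, with no address or index involved. The stored strings are invented for illustration:

    # Toy content-addressable memory: retrieve by overlap with a partial cue.
    memories = [
        "the fox jumped over the sleeping dog",
        "riders in red coats hunted across the field",
        "we watched the aurora from the shuttle window",
    ]

    def recall(cue: str) -> str:
        cue_words = set(cue.lower().split())
        # Best match by shared content, not by memory address.
        return max(memories, key=lambda m: len(cue_words & set(m.split())))

    print(recall("fox dog"))   # a two-word cue retrieves the whole memory

A real content-addressable system (or the brain) does this with graded similarity over distributed patterns, but the retrieval-by-content principle is the same.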

Although this may seem like a rather minor difference between computers and brains, it has profound effects on neural computation. For example, a lasting debate in cognitive psychology concerned whether information is lost from memory simply because of decay or because of interference from other information. In retrospect, this debate was partially based on the false assumption that these two possibilities are dissociable, as they can be in computers. Many now realize that the debate represents a false dichotomy.

Difference # 3: The brain is a massively parallel machine; computers are modular and serial

An unfortunate legacy of the brain-computer metaphor is the tendency for cognitive psychologists to seek out modularity in the brain. For example, the idea that computers require memory has led some to look for a "memory area," when in fact these distinctions are far messier. One consequence of this over-simplification is that we are only now learning that "memory" regions (such as the hippocampus) are also important for imagination, the representation of novel goals, spatial navigation, and other diverse functions.

Similarly, one could imagine there being a "language module" in the brain, as there might be in computers with natural language processing programs. Cognitive psychologists even claimed to have found this module, based on patients with damage to a region of the brain known as Broca's area. More recent evidence has shown that language too is computed by widely distributed and domain-general neural circuits, and Broca's area may also be involved in other computations (see here for more on this).

Difference # 4: Processing speed is not fixed in the brain; there is no system clock

The speed of neural information processing is subject to a variety of constraints, including the time for electrochemical signals to traverse axons and dendrites, axonal myelination, the diffusion time of neurotransmitters across the synaptic cleft, differences in synaptic efficacy, the coherence of neural firing, the current availability of neurotransmitters, and the prior history of neuronal firing. Although there are individual differences in something psychometricians call "processing speed," this does not reflect a monolithic or unitary construct, and certainly nothing as concrete as the speed of a microprocessor. Instead, psychometric "processing speed" probably indexes a heterogenous combination of all the speed constraints mentioned above.

Similarly, there does not appear to be any central clock in the brain, and there is debate as to how clock-like the brain's time-keeping devices actually are. To use just one example, the cerebellum is often thought to calculate information involving precise timing, as required for delicate motor movements; however, recent evidence suggests that time-keeping in the brain bears more similarity to ripples on a pond than to a standard digital clock.

Difference # 5: Short-term memory is not like RAM

Although the apparent similarities between RAM and short-term or "working" memory emboldened many early cognitive psychologists, a closer examination reveals strikingly important differences. Although RAM and short-term memory both seem to require power (sustained neuronal firing in the case of short-term memory, and electricity in the case of RAM), short-term memory seems to hold only "pointers" to long term memory whereas RAM holds data that is isomorphic to that being held on the hard disk. (See here for more about "attentional pointers" in short term memory).

Unlike RAM, the capacity limit of short-term memory is not fixed; the capacity of short-term memory seems to fluctuate with differences in "processing speed" (see Difference #4) as well as with expertise and familiarity.

Difference # 6: No hardware/software distinction can be made with respect to the brain or mind

For years it was tempting to imagine that the brain was the hardware on which a "mind program" or "mind software" is executing. This gave rise to a variety of abstract program-like models of cognition, in which the details of how the brain actually executed those programs was considered irrelevant, in the same way that a Java program can accomplish the same function as a C++ program.

Unfortunately, this appealing hardware/software distinction obscures an important fact: the mind emerges directly from the brain, and changes in the mind are always accompanied by changes in the brain. Any abstract information processing account of cognition will always need to specify how neuronal architecture can implement those processes - otherwise, cognitive modeling is grossly underconstrained. Some blame this misunderstanding for the infamous failure of "symbolic AI."

Difference # 7: Synapses are far more complex than electrical logic gates

Another pernicious feature of the brain-computer metaphor is the suggestion that brains might operate on the basis of electrical signals (action potentials) traveling along individual logic gates. Unfortunately, this is only half true. The signals propagated along axons are actually electrochemical in nature, meaning that they travel much more slowly than electrical signals in a computer, and that they can be modulated in myriad ways. For example, signal transmission is dependent not only on the putative "logic gates" of synaptic architecture but also on the presence of a variety of chemicals in the synaptic cleft, the relative distance between synapse and dendrites, and many other factors. This adds to the complexity of the processing taking place at each synapse - and it is therefore profoundly wrong to think that neurons function merely as transistors.

Difference # 8: Unlike in computers, processing and memory are performed by the same components in the brain

Computers process information from memory using CPUs, and then write the results of that processing back to memory. No such distinction exists in the brain. As neurons process information they are also modifying their synapses - which are themselves the substrate of memory. As a result, retrieval from memory always slightly alters those memories (usually making them stronger, but sometimes making them less accurate - see here for more on this).

Difference # 9: The brain is a self-organizing system

This point follows naturally from the previous point - experience profoundly and directly shapes the nature of neural information processing in a way that simply does not happen in traditional microprocessors. For example, the brain is a self-repairing circuit - something known as "trauma-induced plasticity" kicks in after injury. This can lead to a variety of interesting changes, including some that seem to unlock unused potential in the brain (known as acquired savantism), and others that can result in profound cognitive dysfunction (as is unfortunately far more typical in traumatic brain injury and developmental disorders).

One consequence of failing to recognize this difference has been in the field of neuropsychology, where the cognitive performance of brain-damaged patients is examined to determine the computational function of the damaged region. Unfortunately, because of the poorly-understood nature of trauma-induced plasticity, the logic cannot be so straightforward. Similar problems underlie work on developmental disorders and the emerging field of "cognitive genetics", in which the consequences of neural self-organization are frequently neglected.

Difference # 10: Brains have bodies

This is not as trivial as it might seem: it turns out that the brain takes surprising advantage of the fact that it has a body at its disposal. For example, despite your intuitive feeling that you could close your eyes and know the locations of objects around you, a series of experiments in the field of change blindness has shown that our visual memories are actually quite sparse. In this case, the brain is "offloading" its memory requirements to the environment in which it exists: why bother remembering the location of objects when a quick glance will suffice? A surprising set of experiments by Jeremy Wolfe has shown that even after being asked hundreds of times which simple geometrical shapes are displayed on a computer screen, human subjects continue to answer those questions by gaze rather than rote memory. A wide variety of evidence from other domains suggests that we are only beginning to understand the importance of embodiment in information processing.

Bonus Difference: The brain is much, much bigger than any [current] computer

Accurate biological models of the brain would have to include some 225,000,000,000,000,000 (225 million billion) interactions between cell types, neurotransmitters, neuromodulators, axonal branches and dendritic spines, and that doesn't include the influences of dendritic geometry, or the approximately 1 trillion glial cells which may or may not be important for neural information processing. Because the brain is nonlinear, and because it is so much larger than all current computers, it seems likely that it functions in a completely different fashion. (See here for more on this.) The brain-computer metaphor obscures this important, though perhaps obvious, difference in raw computational power.

Original here

"Power Shift" -How to Boost Your Memory

A recent study suggests that merely glancing from left to right (the traditional “shifty look” of spies and sneaks) can boost memory power and help people differentiate between real and imagined memories. Moving the eyes up and down had no such effect. The trick may work because the specific left/right eye movement engages both the left and right hemispheres of the brain at the same time. As little as 30 seconds of the activity could be enough to help you remember where you left your wallet, or the number sequence needed to deactivate that bomb.

Dr Andrew Parker, of Manchester Metropolitan University, explains: “Often, we may be confused over whether a memory is for something real or something we only imagined or thought about.

"For example ‘Did I really lock the door or did I only imagine locking the door?’ Bilateral eye movements may help us to determine accurately the source of our memory.

“This could be important in situations where we feel uncertain, unclear or maybe even just confused about what we may have done or said.

“Our work shows that true memory can be improved and false memory reduced. One reason for this is that bilateral eye movements may improve our ability to monitor the source of our memories.”

“Some research indicates that certain types of memory – for example what one did yesterday, or memory for a word in an experiment – are dependent upon interactions between the cerebral hemispheres.”

This research was published in the science journal Brain and Cognition, and is further evidence that eyes and memory are likely somehow connected.

Original here

Ball lightning bamboozles physicist

Lightning strikes are sometimes associated with intriguing shimmering balls of light that hover above the ground for minutes at a time before disappearing. But what's behind this natural phenomenon? (Source: iStockphoto)

Scientific theories and experiments have failed to convince a physicist what's behind the mysterious natural phenomenon of ball lightning.

Emeritus Professor Bob Crompton of the Australian National University gave a presentation in Canberra this week on the latest scientific investigations into ball lightning, something once considered as likely as UFOs.

"I don't believe there is any satisfactory explanation so far," says Crompton for these small bright lights that appear after a lightning strike.

"[The theories] don't satisfy me and I don't think they satisfy anyone who looks at the evidence objectively."

Crompton, an expert in atomic and molecular physics and electrical discharges in gases, has been interested in the science behind ball lightning for decades.

He's collected 30-40 Australian sightings over a period of about 10 years, with the help of Australian meteorological services.

"In those early days I would have had enough to fill two inches of manila folders," he says.

Crompton says ball lightning is a bright light, anywhere in size from a golf ball to larger than a football.

It hovers above the ground, moving slowly, able to pass through walls, until it vanishes minutes later.

Eyewitness report

Crompton says he first became interested in ball lightning after an eyewitness report in the Canberra Times in 1970.

The eyewitness was the wife of a colleague and someone whom Crompton considers a reliable witness.

The woman awoke in the early hours one morning after a fierce lightning strike on a power pole near her home, he says.

As she went to check on her children she saw a sparkling golden ball of light sitting on the lintel above the doorway to the bathroom.

"It was a ball of about the size of an orange or a bit bigger," says Crompton. "Then in due course it just disappeared. The whole thing lasted about 5-10 seconds."

Scientific explanations

Crompton says two main theories have been put forward to explain ball lightning.

One theory, based on the physics of electrical discharges, says lightning strikes and travels slowly through conductive channels in the ground.

A high electrical field is created in the air as the lightning moves through the ground.

And ball lightning is formed from electricity discharging in this field.

The other theory, which is purely chemical, says lightning hits a surface containing silica and carbon in the ratio of 1:2.

The extreme heat of the lightning converts these chemicals into carbon dioxide and nanoparticles of silicon, which puff out of the surface in the shape of a ball.

The ball shimmers as the silicon oxidises in the air generating heat and light.

Crompton says this second theory was given a boost by an experiment carried out by French scientists that recreated silicon nanoparticles in the laboratory using electricity.

A synchrotron confirmed the presence of the nanoparticles, he says.

Mystery

While Crompton says this second theory is the most likely explanation for ball lightning, he says it doesn't really explain how ball lightning gets into a house.

The first theory does, he says, but doesn't explain other cases such as a report in the journal Nature by a UK scientist travelling in a plane during a thunderstorm over New York City in the 1960s.

Professor Roger Jennison of the University of Kent reported seeing a glowing sphere emerge from one wall, drift down the aisle a metre above the floor, and disappear out of the rear of the aircraft.

"The aircraft one I find the hardest to explain," says Crompton. "[But] I think this is fascinating even though I can't explain it."

Forensic analysis

Forensic lab staff at the Australian Federal Police have also analysed apparent evidence for ball lightning.

Crompton once took them a piece of wood that a reliable witness reported had been marked by ball lightning.

The wood had a circular mark on it, dusty black on the outside and white on the inside.

But the X-ray fluorescence analysis didn't clearly support any of the theories, says Crompton.

Original here

Key to Happiness: Give Away Money

Those incoming federal tax-rebate checks could do more than boost the economy. They might also boost your mood, with one caveat: You must spend the cash on others, not yourself.

New research reveals that when individuals dole out money for gifts for friends or charitable donations, they get a boost in happiness while those who spend on themselves get no such cheery lift.

Scientists have found evidence that income is linked with a person's satisfaction with their life and other measures of happiness, but less is known about the link between how a person spends their money and happiness.

"We wanted to test our theory that how people spend their money is at least as important as how much money they earn," said Elizabeth Dunn, a psychologist at the University of British Columbia.

The findings, to be detailed in the March 21 issue of the journal Science, come as no surprise to some marketing scientists.

"It doesn't surprise me at all that people find giving money away very rewarding," said Aaron Ahuvia, associate professor of marketing at the University of Michigan-Dearborn, who was not involved in the current study.

The research was funded by a Hampton Research Grant.

Spending habits

Dunn and her colleagues surveyed a nationally representative sample of more than 630 Americans, about evenly split between males and females. Participants indicated their general happiness, annual income and a breakdown of monthly spending, including bills, purchases for themselves and for others, and donations to charity.

Despite the benefits of "prosocial spending" on others, participants spent more than 10 times as much on personal items as they did on charitable options. The researchers note personal purchases included paying bills.

Statistical analyses revealed personal spending had no link with a person's happiness, while spending on others and charity was significantly related to a boost in happiness.

"Regardless of how much income each person made," Dunn said, "those who spent money on others reported greater happiness, while those who spent more on themselves did not."

In a separate study of 13 employees at a Boston-based firm, the researchers found that employees who devoted more of their profit-sharing bonus (which ranged from $3,000 to $8,000) to others reported greater overall happiness than those who spent the windfall on their own needs.

Purchase power

A person apparently doesn't need to drop thousands of dollars on others to reap a gleeful reward.

In another experiment, the researchers gave college students a $5 or $20 bill, asking them to spend the money by that evening. Half the participants were instructed to spend the money on themselves, and the remaining students to spend on others.

Participants who spent the windfall on others — which included toys for siblings and meals eaten with friends — reported feeling happier at the end of the day than those who spent the money on themselves.

If as little as $5 spent on others could produce a surge in happiness on a given day, why don't people make these changes? In another study of more than 100 college students, the researchers found that most thought personal spending would make them happier than prosocial spending.

"Often people, at some implicit level, have this idea that 'buying these things is going to make me happier,'" Ahuvia said. "It does make them momentarily happy," he added, but the warm feelings are short-lived.

Buying buzz

Dunn's team puts forth several possible reasons to explain the charity-happiness link.

"I think it's a lot of factors of prosocial spending that are responsible for these happiness boosts," study researcher of UBC Lara Aknin told LiveScience. "I think it could be that people feel good about themselves when they do it; it could be the fact that it strengthens their social relationships; it could just be the act of spending time with other people."

Perhaps the fuzzy feelings associated with giving last longer than selfish buys. "The happiness 'hit' from giving may last a bit longer if the 'warm glow' from donation lasts longer than the hit from own consumption," said Paul Dolan, an economics professor at the Imperial College London in England. Dolan was not involved in Dunn's study.

Another idea is that charitable spending helps a person express a certain identity.

"People spend a lot of money to make their lives feel meaningful, significant and important," Ahuvia said during a telephone interview. "When you give away money you are making that same kind of purchase, only you are doing it in a more effective way."

He added, "What you're really trying to buy is meaning to life – Giving away money to a cause you believe in is a more effective purchase than buying a T-shirt that says "Save a Whale.'"

Original here

Why Overeaters Continue to Overeat

Why do some people keep eating even when they have full tummies? This ScienCentral News video reports that research using water balloons in the stomach may answer that question.


Interviewee: Gene-Jack Wang, Brookhaven National Laboratory (ScienCentral video, 1 min 26 sec; produced by Sunita Reed, edited by Sunita Reed & Chris Bergendorff; © ScienCentral, Inc.)

Image made of chemo drug binding to DNA

INDIANAPOLIS, March 20 (UPI) -- A team of U.S. scientists has created the first three-dimensional image of how a chemotherapy agent targets and binds to DNA.

Researchers from the Indiana University School of Medicine and the Purdue University School of Science said their achievement might lead to the development of better chemotherapy drugs.

Using X-ray crystallography, the scientists produced the first 3-D molecular level images of bleomycin bound to DNA. X-ray crystallography is a widely used analytical technique in which X-rays are directed through crystals and results are deduced from the pattern of diffraction of the X-rays, the scientists said.
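The principle behind such deductions is Bragg's law, n·λ = 2d·sin(θ): X-rays reflecting off parallel planes of atoms in a crystal interfere constructively only at angles tied to the spacing of those planes. A minimal worked example with assumed, illustrative numbers (copper K-alpha radiation; the angle is made up, and none of this is specific to the bleomycin study):

    import math

    # Bragg's law: n * wavelength = 2 * d * sin(theta)
    wavelength_nm = 0.154   # Cu K-alpha X-ray wavelength, a common lab source
    theta_deg = 22.0        # assumed diffraction angle for illustration
    n = 1                   # first-order reflection

    d_spacing = n * wavelength_nm / (2 * math.sin(math.radians(theta_deg)))
    print(f"lattice plane spacing d = {d_spacing:.3f} nm")

Measuring many such reflections, at many angles, is what lets crystallographers back out a full 3-D electron-density map of the molecule.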

"Our 3-D picture of the structure of bleomycin gives us a much better understanding of exactly how the drug interacts with the DNA so we can begin thinking about engineering a better drug, with less toxicity," said Associate Professor Millie Georgiadis, co-senior author of the study with Professor Eric Long. "Since it's a DNA targeting agent, there's no limit to what type of cancers we could target with bleomycin if we can decrease the toxicity."

The study appears in the online early edition of The Proceedings of the National Academy of Sciences.


© 2008 United Press International. All Rights Reserved.
This material may not be reproduced, redistributed, or manipulated in any form.

Original here

63-year-old solves riddle from 1970

Israeli mathematician unravels puzzle that baffled scientists for decades

JERUSALEM - A mathematical puzzle that baffled the top minds in the esoteric field of symbolic dynamics for nearly four decades has been cracked — by a 63-year-old immigrant who once had to work as a security guard.

Avraham Trahtman, a mathematician who also toiled as a laborer after moving to Israel from Russia, succeeded where dozens failed, solving the elusive "Road Coloring Problem."

The conjecture essentially assumed it's possible to create a "universal map": a single set of directions that leads to a certain destination regardless of the starting point. Experts say the proposition could have real-life applications in mapping and computer science.
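A toy example makes the idea concrete. The Python sketch below colors the edges of a four-vertex directed graph red and blue so that one fixed color sequence funnels every starting vertex to the same place; this is the classic Cerny automaton from automata theory, used here purely as an illustration, not Trahtman's construction:

    # Each dict maps a vertex to where its red (or blue) outgoing edge leads.
    RED  = {0: 1, 1: 2, 2: 3, 3: 0}   # red edges form a cycle
    BLUE = {0: 1, 1: 1, 2: 2, 3: 3}   # blue edges merge vertex 0 into 1
    MOVES = {"R": RED, "B": BLUE}

    def follow(start: int, directions: str) -> int:
        vertex = start
        for color in directions:
            vertex = MOVES[color][vertex]
        return vertex

    directions = "BRRRBRRRB"   # one "universal" set of directions
    print({follow(v, directions) for v in range(4)})   # {1}: all arrive at vertex 1

The Road Coloring Problem asked whether such a synchronizing coloring always exists for suitable graphs (strongly connected, aperiodic, constant out-degree); Trahtman proved that it does.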

The "Road Coloring Problem" was first posed in 1970 by Benjamin Weiss, an Israeli-American mathematician, and a colleague, Roy Adler, who worked at IBM at the time.

For eight years, Weiss tried to prove his theory. Over the next 30 years, some 100 other scientists attempted as well. All failed, until Trahtman came along and, in eight short pages, jotted the solution down in pencil last year.

"The solution is not that complicated. It's hard, but it is not that complicated," Trahtman said in heavily accented Hebrew. "Some people think they need to be complicated. I think they need to be nice and simple."

Weiss said it gave him great joy to see someone solve his problem.

Stuart Margolis, a mathematician who recruited Trahtman to teach at Bar Ilan University near Tel Aviv, called the solution one of the "beautiful results." But he said what makes the result especially remarkable is Trahtman's age and background.

"Math is usually a younger person's game, like music and the arts," Margolis said. "Usually you do your better work in your mid 20s and early 30s. He certainly came up with a good one at age 63."

Adding to the excitement is Trahtman's personal triumph in finally finding work as a mathematician after immigrating from Russia. "The first time I met him he was wearing a night watchman's uniform," Margolis said.

Originally from Yekaterinburg, Russia, Trahtman was an accomplished mathematician when he came to Israel in 1992, at age 48. But like many immigrants in the wave that followed the breakup of the Soviet Union, he struggled to find work in the Jewish state and was forced into stints working maintenance and security before landing a teaching position at Bar Ilan in 1995.

The soft-spoken Trahtman declined to talk about his odyssey, calling that the "old days." He said he felt "lucky" to be recognized for his solution, and played down the achievement as a "matter for mathematicians," saying it hasn't changed him a bit.

The puzzle tackled by Trahtman wasn't the longest-standing open problem to be solved recently. In 1994, British mathematician Andrew Wiles solved Fermat's last theorem, which had been open for more than 300 years.

Trahtman's solution is available on the Internet and is to be published soon in the Israel Journal of Mathematics.

Joel Friedman, a math professor at the University of British Columbia, said probably everyone in the field of symbolic dynamics had tried to solve the problem at some point, including himself. He said people in the related disciplines of graph theory, discrete math and theoretical computer science also tried.

"The solution to this problem has definitely generated excitement in the mathematical community," he said in an e-mail.

Margolis said the solution could have many applications.

"Say you've lost an e-mail and you want to get it back — it would be guaranteed," he said. "Let's say you are lost in a town you have never been in before and you have to get to a friend's house and there are no street signs — the directions will work no matter what."

Copyright 2008 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.

Original here

What Does a Plant Sound Like?

Researchers have developed a computer algorithm that can identify some plant species according to their unique sonar echoes. The experiments were meant to help biologists understand how bats find their favorite fruits or insects, but the research might also help engineers design high-speed systems to identify everything from widgets on conveyor belts to faces in crowds.

Bats might be legally blind, but they can fly straight to a desired fruit tree, even one growing amid dense foliage. They do so using a process called echolocation, in which they send out a series of chirps and then listen very carefully to the returning echoes. Inspired by this ability, researchers in Tübingen, Germany, decided to see if they could invent an artificial system that would perform the same task.

First, the team developed data sets called spectrograms by bouncing sonar signals off five kinds of plants, including spruce trees and black thorn bushes. The researchers then characterized the echo response time and frequency of the resulting sound reflection patterns, which varied according to the number and size of the branches and leaves on each plant. The resulting computer program, says biophysicist and lead researcher Yossi Yovel of the University of Tübingen, could distinguish similar plants with "surprisingly high accuracy." Eventually, the team was able to achieve near-100% success in identifying all five plant species used in the tests, as reported today in PLoS Computational Biology.
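To show the shape of such a pipeline (and only that), here is a heavily simplified Python sketch: each "echo" is reduced to an average spectrum via a spectrogram, and new echoes are classified by nearest neighbour. The synthetic signals and every parameter are invented stand-ins; this is not the authors' actual algorithm:

    import numpy as np
    from scipy.signal import spectrogram

    FS = 200_000  # assumed sampling rate for an ultrasonic echo, Hz

    def features(echo: np.ndarray) -> np.ndarray:
        # Spectrogram -> average spectrum over time: a crude echo signature.
        _, _, sxx = spectrogram(echo, fs=FS, nperseg=256)
        return sxx.mean(axis=1)

    rng = np.random.default_rng(1)

    def fake_echo(tone_khz: float) -> np.ndarray:
        # Synthetic stand-in for a sonar echo: a tone buried in noise.
        t = np.arange(2048) / FS
        return np.sin(2 * np.pi * tone_khz * 1000 * t) + 0.3 * rng.normal(size=t.size)

    train = {"spruce": features(fake_echo(40)), "blackthorn": features(fake_echo(70))}

    def classify(echo: np.ndarray) -> str:
        f = features(echo)
        return min(train, key=lambda name: np.linalg.norm(train[name] - f))

    print(classify(fake_echo(40)))   # -> "spruce"

The real system learned from thousands of actual echoes, but the structure (signal, spectrogram, feature vector, classifier) is the part the paragraph above describes.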

The findings will be valuable not only in understanding how bats echolocate, says Yovel, but they should help humans as well. The vast majority of remote-sensing algorithms are based on vision, he says, so if the sonar algorithm can be perfected, one of its advantages will be the ability to function in low light or darkness. (Infrared can't deliver the same degree of resolution.) That could be useful in picking out a crime suspect walking along a dark city street or hiding amid a crowd on a darkened mass-transit platform.

The research could turn out to be "major," says computational biologist Sorin Istrail of Brown University. The idea that a simple algorithm could provide a way to extract a meaningful model for bat echonavigation through tree environments is "remarkable," he says, and could lead the way to practical advances in the machine-learning field. And neuroethologist Steven Phelps of the University of Florida, Gainesville, says the research confirms that subtle differences in the qualities of echoes are enough for a bat to tell a spruce tree from a birch tree. "When we say apples and oranges, we generally assume the differences are obvious," he says, but "I can't imagine having to listen for them."

Original here

Climate facts to warm to

CATASTROPHIC predictions of global warming usually conjure with the notion of a tipping point, a point of no return.

Last Monday - on ABC Radio National, of all places - there was a tipping point of a different kind in the debate on climate change. It was a remarkable interview involving Counterpoint co-host Michael Duffy and Jennifer Marohasy, a biologist and senior fellow of the Melbourne-based think tank the Institute of Public Affairs. Anyone in public life who takes a position on the greenhouse gas hypothesis will ignore it at their peril.

Duffy asked Marohasy: "Is the Earth still warming?"

She replied: "No, actually, there has been cooling, if you take 1998 as your point of reference. If you take 2002 as your point of reference, then temperatures have plateaued. This is certainly not what you'd expect if carbon dioxide is driving temperature because carbon dioxide levels have been increasing but temperatures have actually been coming down over the last 10 years."

Duffy: "Is this a matter of any controversy?"

Marohasy: "Actually, no. The head of the IPCC (Intergovernmental Panel on Climate Change) has actually acknowledged it. He talks about the apparent plateau in temperatures so far this century. So he recognises that in this century, over the past eight years, temperatures have plateaued ... This is not what you'd expect, as I said, because if carbon dioxide is driving temperature then you'd expect that, given carbon dioxide levels have been continuing to increase, temperatures should be going up ... So (it's) very unexpected, not something that's being discussed. It should be being discussed, though, because it's very significant."

Duffy: "It's not only that it's not discussed. We never hear it, do we? Whenever there's any sort of weather event that can be linked into the global warming orthodoxy, it's put on the front page. But a fact like that, which is that global warming stopped a decade ago, is virtually never reported, which is extraordinary."

Duffy then turned to the question of how the proponents of the greenhouse gas hypothesis deal with data that doesn't support their case. "People like Kevin Rudd and Ross Garnaut are speaking as though the Earth is still warming at an alarming rate, but what is the argument from the other side? What would people associated with the IPCC say to explain the (temperature) dip?"

Marohasy: "Well, the head of the IPCC has suggested natural factors are compensating for the increasing carbon dioxide levels and I guess, to some extent, that's what sceptics have been saying for some time: that, yes, carbon dioxide will give you some warming but there are a whole lot of other factors that may compensate or that may augment the warming from elevated levels of carbon dioxide.

"There's been a lot of talk about the impact of the sun and that maybe we're going to go through or are entering a period of less intense solar activity and this could be contributing to the current cooling."

Duffy: "Can you tell us about NASA's Aqua satellite, because I understand some of the data we're now getting is quite important in our understanding of how climate works?"

Marohasy: "That's right. The satellite was only launched in 2002 and it enabled the collection of data, not just on temperature but also on cloud formation and water vapour. What all the climate models suggest is that, when you've got warming from additional carbon dioxide, this will result in increased water vapour, so you're going to get a positive feedback. That's what the models have been indicating. What this great data from the NASA Aqua satellite ... (is) actually showing is just the opposite, that with a little bit of warming, weather processes are compensating, so they're actually limiting the greenhouse effect and you're getting a negative rather than a positive feedback."

Duffy: "The climate is actually, in one way anyway, more robust than was assumed in the climate models?"

Marohasy: "That's right ... These findings actually aren't being disputed by the meteorological community. They're having trouble digesting the findings, they're acknowledging the findings, they're acknowledging that the data from NASA's Aqua satellite is not how the models predict, and I think they're about to recognise that the models really do need to be overhauled and that when they are overhauled they will probably show greatly reduced future warming projected as a consequence of carbon dioxide."
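
For readers new to the jargon in this exchange: a feedback factor f turns a direct warming dT0 into an equilibrium warming dT0 / (1 - f), so positive f amplifies and negative f damps. A minimal numerical illustration follows; this is generic textbook arithmetic, not anything from the Aqua data, and the 1.2 C no-feedback figure for doubled carbon dioxide is the commonly cited approximation:

    def equilibrium_warming(dT0, f):
        """dT0: direct (no-feedback) warming in C; f: net feedback factor, f < 1."""
        return dT0 / (1.0 - f)

    dT0 = 1.2   # approximate no-feedback response to doubled CO2
    for f in (0.5, 0.0, -0.5):
        print(f, round(equilibrium_warming(dT0, f), 2))
    # f = +0.5 amplifies 1.2 C to 2.4 C; f = -0.5 damps it to 0.8 C.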

Duffy: "From what you're saying, it sounds like the implications of this could be considerable ..."

Marohasy: "That's right, very much so. The policy implications are enormous. The meteorological community at the moment is really just coming to terms with the output from this NASA Aqua satellite and (climate scientist) Roy Spencer's interpretation of them. His work is published, his work is accepted, but I think people are still in shock at this point."

If Marohasy is anywhere near right about the impending collapse of the global warming paradigm, life will suddenly become a whole lot more interesting.

A great many founts of authority, from the Royal Society to the UN, most heads of government along with countless captains of industry, learned professors, commentators and journalists will be profoundly embarrassed. Let us hope it is a prolonged and chastening experience.

With catastrophe off the agenda, for most people the fog of millennial gloom will lift, at least until attention turns to the prospect of the next ice age. Among the better educated, the sceptical cast of mind that is the basis of empiricism will once again be in fashion. The delusion that by recycling and catching public transport we can help save the planet will quickly come to be seen for the childish nonsense it was all along.

The poorest Indians and Chinese will be left in peace to work their way towards prosperity, without being badgered about the size of their carbon footprint, a concept that for most of us will soon be one with Nineveh and Tyre, clean forgotten in six months.

The scores of town planners in Australia building empires out of regulating what can and can't be built on low-lying shorelines will have to come to terms with the fact that inundation no longer impends and find something more plausible to do. The same is true of the bureaucrats planning to accommodate "climate refugees".

Penny Wong's climate mega-portfolio will suddenly be as ephemeral as the ministries for the year 2000 that state governments used to entrust to junior ministers. Malcolm Turnbull will have to reinvent himself at vast speed as a climate change sceptic and the Prime Minister will have to kiss goodbye what he likes to call the great moral issue and policy challenge of our times.

It will all be vastly entertaining to watch.

THE Age published an essay with an environmental theme by Ian McEwan on March 8 and its stablemate, The Sydney Morning Herald, also carried a slightly longer version of the same piece.

The Australian's Cut & Paste column two days later reproduced a telling paragraph from the Herald's version, which suggested that McEwan was a climate change sceptic and which The Age had excised. He was expanding on the proposition that "we need not only reliable data but their expression in the rigorous use of statistics".

What The Age decided to spare its readers was the following: "Well-meaning intellectual movements, from communism to post-structuralism, have a poor history of absorbing inconvenient fact or challenges to fundamental precepts. We should not ignore or suppress good indicators on the environment, though they have become extremely rare now. It is tempting to the layman to embrace with enthusiasm the latest bleak scenario because it fits the darkness of our soul, the prevailing cultural pessimism. The imagination, as Wallace Stevens once said, is always at the end of an era. But we should be asking, or expecting others to ask, for the provenance of the data, the assumptions fed into the computer model, the response of the peer review community, and so on. Pessimism is intellectually delicious, even thrilling, but the matter before us is too serious for mere self-pleasuring. It would be self-defeating if the environmental movement degenerated into a religion of gloomy faith. (Faith, ungrounded certainty, is no virtue.)"

The missing sentences do not appear anywhere else in The Age's version of the essay. The attribution reads: "Copyright Ian McEwan 2008" and there is no acknowledgment of editing by The Age.

Why did the paper decide to offer its readers McEwan lite? Was he, I wonder, consulted on the matter? And isn't there a nice irony that The Age chose to delete the line about ideologues not being very good at "absorbing inconvenient fact"?

Original here

New College MPG Challenge Comes with $1 Million Purse


Where will you be from August 14th - August 23rd? For my part, I’ll be doing my best to represent my college at the Great Race MPG Challenge, driving from New York to San Francisco on as little fuel as possible.

Why would I be doing this? Because this year, to mark the 100th anniversary of the 1908 Great Race, the organizers of Hybrid Fest will be running a competition to see if any college teams can break 100 MPG in a cross-country race.

Here’s what the organizers have to say about the purpose of the competition:

Our goal? To encourage students, alumni, faculty and business to combine forces and discover the perfect combination of technology and driving habits to reach the magic number of 100 MPG. Sounds crazy, right? So did driving across the Bering Straits in the winter. Right now, people we call “hypermilers” are doing just that! In fact, one team achieved almost 110 MPG in an unmodified Toyota Prius and another scored 186 MPG with a Honda Insight! These are the very same cars you will find on the road today.

Every team will have a Prius to modify and then race, and will have to make the 4,000 mile journey within the rally time limits. If the winning team gets over 100 MPG over the entire trip they will take home the Great Race Innovation Prize of $1 Million, to be given to their school’s scholarship fund. Other teams will win prizes depending on their performance in the competition.
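
The prize arithmetic is worth a glance: at 100 MPG, the whole 4,000-mile crossing burns only about 40 gallons. A quick sketch; the 46 MPG stock-Prius baseline is my rough figure, and the other numbers come from the article:

    def gallons_used(miles, mpg):
        return miles / mpg

    route = 4000   # miles, New York to San Francisco
    for mpg in (46, 100, 110, 186):   # ~stock Prius, prize threshold, hypermiler marks
        print(mpg, "MPG:", round(gallons_used(route, mpg), 1), "gallons")
    # 100 MPG works out to 40 gallons for the entire rally.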

For a $5,000 entry fee any college can field a team made up of students, faculty, or other representatives of the college. As I have been told by the event organizer, there is already interest at many schools and there will likely be a large field of competitors.

The most exciting thing about this contest is the combination of your standard Toyota Prius and the ability to do many modifications to that platform. Drivetrain mods are out, but it should be very interesting to see how a combination of driving skills, aerodynamics, and weight reduction can improve on what is generally thought of as the Prius “gold standard.”

This competition grew out of a larger plan to reenact the round-the-world trip from the 1908 Great Race, but that portion fell through because the organizers could not find sponsors for such a large event. Unfortunate as that is, it's great that there will still be a competition to help bring some of these issues to light in the public sphere.

For more information (though the site is still rather bare), and to sign up for email updates, check out the MPG Challenge homepage. And of course, more updates to come as a possible EcoModder.com team goes through the works!

Original here

Australia plans carbon storage under ocean

By Michael Perry

SYDNEY (Reuters) - Australia plans to allow greenhouse gas emissions to be stored in the ocean floor around the island continent, with exploration for suitable sites possibly starting in 2008.

Energy Minister Martin Ferguson said the government would amend the Offshore Petroleum Act this year to allow for seabed storage of carbon emissions from coal-fired power stations.

"Australia has significant geological storage potential, particularly in our offshore sedimentary basins," Ferguson told an energy conference in Sydney late on Tuesday.

"I am hoping that amendments to the Offshore Petroleum Act 2006 will be passed in time for the government to release acreage for exploration in 2008, making Australia one of the first countries in the world to establish a regulated carbon capture and storage regime," Ferguson said.

Green groups are critical of the plan to store carbon emissions in the ocean floor, saying they are concerned about the chances of leakage of emissions into the ocean environment.

"The coal and energy corporations are doubtless lobbying hard for the government to carry all liability for any leakages while they continue to profit from their polluting practices," Greens Senator Christine Milne told local media on Wednesday.

Australia's Labor government, elected in November 2007, ratified the Kyoto Protocol the following month, reversing an 11-year policy by the previous conservative government.

Rudd's government has made climate change a priority and has released a "National Clean Coal Initiative" that includes a regulatory regime governing access and tenure to offshore Australia for geological storage.

Australia is the world's largest coal exporter and is reliant on fossil fuel for transport and energy. About 80 percent of electricity is produced by coal-fired power stations.

The country is responsible for about 1.2 percent of global greenhouse gas emissions and is one of the highest polluters per capita.

Its carbon emissions are forecast to continue to grow due to its heavy reliance on coal for electricity, although the government says the country will meet its Kyoto emissions targets by 2012. Emissions are projected to grow to 108 percent of 1990 levels over the 2008 to 2012 period.

"Coal will continue to make a major contribution to Australia's energy needs well into the future and therefore we need to urgently reduce greenhouse gas emissions from coal-fired electricity generation," said Ferguson.

"Clean coal technologies involving carbon capture and storage will play a vital role in meeting future greenhouse constraints. A nationally co-ordinated effort is needed to bring forward the commercial availability of these technologies."

Original here

The Long Emergency

A few weeks ago, the price of oil ratcheted above fifty-five dollars a barrel, which is about twenty dollars a barrel more than a year ago. The next day, the oil story was buried on page six of the New York Times business section. Apparently, the price of oil is not considered significant news, even when it goes up five bucks a barrel in the span of ten days. That same day, the stock market shot up more than a hundred points because, CNN said, government data showed no signs of inflation. Note to clueless nation: Call planet Earth.

Carl Jung, one of the fathers of psychology, famously remarked that "people cannot stand too much reality." What you're about to read may challenge your assumptions about the kind of world we live in, and especially the kind of world into which events are propelling us. We are in for a rough ride through uncharted territory.

It has been very hard for Americans -- lost in dark raptures of nonstop infotainment, recreational shopping and compulsive motoring -- to make sense of the gathering forces that will fundamentally alter the terms of everyday life in our technological society. Even after the terrorist attacks of 9/11, America is still sleepwalking into the future. I call this coming time the Long Emergency.

Most immediately we face the end of the cheap-fossil-fuel era. It is no exaggeration to state that reliable supplies of cheap oil and natural gas underlie everything we identify as the necessities of modern life -- not to mention all of its comforts and luxuries: central heating, air conditioning, cars, airplanes, electric lights, inexpensive clothing, recorded music, movies, hip-replacement surgery, national defense -- you name it.

The few Americans who are even aware that there is a gathering global-energy predicament usually misunderstand the core of the argument. That argument states that we don't have to run out of oil to start having severe problems with industrial civilization and its dependent systems. We only have to slip over the all-time production peak and begin a slide down the arc of steady depletion.

The term "global oil-production peak" means that a turning point will come when the world produces the most oil it will ever produce in a given year and, after that, yearly production will inexorably decline. It is usually represented graphically in a bell curve. The peak is the top of the curve, the halfway point of the world's all-time total endowment, meaning half the world's oil will be left. That seems like a lot of oil, and it is, but there's a big catch: It's the half that is much more difficult to extract, far more costly to get, of much poorer quality and located mostly in places where the people hate us. A substantial amount of it will never be extracted.
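
The bell curve described above is conventionally modeled as the derivative of a logistic depletion curve, the Hubbert curve. Here is a sketch with illustrative parameters only; the 2,000-billion-barrel endowment and the 2005 peak year are assumptions for the demo, not a forecast:

    import numpy as np

    def hubbert(t, urr, k, t_peak):
        """Annual production as the derivative of a logistic depletion curve.
        urr: ultimately recoverable resource; k: steepness; t_peak: peak year."""
        e = np.exp(-k * (t - t_peak))
        return urr * k * e / (1.0 + e) ** 2

    years = np.arange(1900, 2101)
    production = hubbert(years, urr=2000.0, k=0.06, t_peak=2005)  # billion barrels/yr
    print(int(years[np.argmax(production)]))   # 2005: the top of the bell
    print(round(production.sum()))             # ~urr: half extracted before the peak, half after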

The United States passed its own oil peak -- about 11 million barrels a day -- in 1970, and since then production has dropped steadily. In 2004 it ran just above 5 million barrels a day (we get a tad more from natural-gas condensates). Yet we consume roughly 20 million barrels a day now. That means we have to import about two-thirds of our oil, and the ratio will continue to worsen.
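
A quick consistency check on the two-thirds figure; the extra 2 million barrels a day of condensates and other liquids is my assumption, standing in for the "tad more" above:

    consumption = 20.0        # million barrels/day, from the paragraph above
    domestic = 5.0 + 2.0      # crude output plus an assumed ~2 Mbbl/day of other liquids
    imports = consumption - domestic
    print(round(imports / consumption, 2))   # 0.65, i.e., roughly two-thirds imported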

The U.S. peak in 1970 brought on a portentous change in geoeconomic power. Within a few years, foreign producers, chiefly OPEC, were setting the price of oil, and this in turn led to the oil crises of the 1970s. In response, frantic development of non-OPEC oil, especially the North Sea fields of England and Norway, essentially saved the West's ass for about two decades. Since 1999, these fields have entered depletion. Meanwhile, worldwide discovery of new oil has steadily declined to insignificant levels in 2003 and 2004.

Some "cornucopians" claim that the Earth has something like a creamy nougat center of "abiotic" oil that will naturally replenish the great oil fields of the world. The facts speak differently. There has been no replacement whatsoever of oil already extracted from the fields of America or any other place.

Now we are faced with the global oil-production peak. The best estimates of when this will actually happen have been somewhere between now and 2010. In 2004, however, after demand from burgeoning China and India shot up, after revelations that Shell Oil had wildly misstated its reserves, and after Saudi Arabia proved incapable of goosing up its production despite promises to do so, the most knowledgeable experts revised their predictions and now concur that 2005 is apt to be the year of all-time global peak production.

It will change everything about how we live.

To aggravate matters, American natural-gas production is also declining, at five percent a year, despite frenetic new drilling, and with the potential of much steeper declines ahead. Because of the oil crises of the 1970s, the nuclear-plant disasters at Three Mile Island and Chernobyl and the acid-rain problem, the U.S. chose to make gas its first choice for electric-power generation. The result was that just about every power plant built after 1980 has to run on gas. Half the homes in America are heated with gas. To further complicate matters, gas isn't easy to import. Here in North America, it is distributed through a vast pipeline network. Gas imported from overseas would have to be liquefied by chilling it to minus-260 degrees Fahrenheit, carried in cryogenic tanker ships and unloaded (re-gasified) at special terminals, of which few exist in America. Moreover, the first attempts to site new terminals have met furious opposition because they are such ripe targets for terrorism.

Some other things about the global energy predicament are poorly understood by the public and even our leaders. This is going to be a permanent energy crisis, and these energy problems will synergize with the disruptions of climate change, epidemic disease and population overshoot to produce higher orders of trouble.

We will have to accommodate ourselves to fundamentally changed conditions.

No combination of alternative fuels will allow us to run American life the way we have been used to running it, or even a substantial fraction of it. The wonders of steady technological progress achieved through the reign of cheap oil have lulled us into a kind of Jiminy Cricket syndrome, leading many Americans to believe that anything we wish for hard enough will come true. These days, even people who ought to know better are wishing ardently for a seamless transition from fossil fuels to their putative replacements.

The widely touted "hydrogen economy" is a particularly cruel hoax. We are not going to replace the U.S. automobile and truck fleet with vehicles run on fuel cells. For one thing, the current generation of fuel cells is largely designed to run on hydrogen obtained from natural gas. The other way to get hydrogen in the quantities wished for would be electrolysis of water using power from hundreds of nuclear plants. Apart from the dim prospect of our building that many nuclear plants soon enough, there are also numerous severe problems with hydrogen's nature as an element that present forbidding obstacles to its use as a replacement for oil and gas, especially in storage and transport.
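
A back-of-envelope check on the scale involved; all three input figures (mid-2000s U.S. gasoline use, about 50 kWh of electrolysis energy per kilogram of hydrogen, and the rough energy parity between a kilogram of hydrogen and a gallon of gasoline) are my assumptions, not the author's:

    gasoline_gal_per_day = 380e6   # approx. U.S. gasoline use, mid-2000s (~9 Mbbl/day)
    kwh_per_kg_h2 = 50.0           # practical electrolysis energy per kg of hydrogen
    kg_h2_per_day = gasoline_gal_per_day   # 1 kg H2 ~ 1 gallon gasoline, energy-wise

    avg_power_gw = kg_h2_per_day * kwh_per_kg_h2 / 24 / 1e6   # kWh/day -> GW continuous
    print(round(avg_power_gw))   # ~790 GW: on the order of 800 one-gigawatt reactors, before losses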

Wishful notions about rescuing our way of life with "renewables" are also unrealistic. Solar-electric systems and wind turbines face not only the enormous problem of scale but the fact that the components require substantial amounts of energy to manufacture and the probability that they can't be manufactured at all without the underlying support platform of a fossil-fuel economy. We will surely use solar and wind technology to generate some electricity for a period ahead but probably at a very local and small scale.

Virtually all "biomass" schemes for using plants to create liquid fuels cannot be scaled up to even a fraction of the level at which things are currently run. What's more, these schemes are predicated on using oil and gas "inputs" (fertilizers, weed-killers) to grow the biomass crops that would be converted into ethanol or bio-diesel fuels. This is a net energy loser -- you might as well just burn the inputs and not bother with the biomass products. Proposals to distill trash and waste into oil by means of thermal depolymerization depend on the huge waste stream produced by a cheap oil and gas economy in the first place.
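
The "net energy loser" argument reduces to a single ratio. The numbers below are placeholders that encode the author's claim, not measured values:

    energy_out = 1.0    # fuel energy in the ethanol or bio-diesel produced
    energy_in = 1.3     # assumed oil/gas inputs: fertilizer, machinery, distillation
    print(energy_out / energy_in)          # < 1.0
    print(energy_out / energy_in < 1.0)    # True: burning the inputs directly would yield more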

Coal is far less versatile than oil and gas, extant in less abundant supplies than many people assume and fraught with huge ecological drawbacks -- as a contributor to greenhouse "global warming" gases and many health and toxicity issues ranging from widespread mercury poisoning to acid rain. You can make synthetic oil from coal, but the only time this was tried on a large scale was by the Nazis under wartime conditions, using impressive amounts of slave labor.

If we wish to keep the lights on in America after 2020, we may indeed have to resort to nuclear power, with all its practical problems and eco-conundrums. Under optimal conditions, it could take ten years to get a new generation of nuclear power plants into operation, and the price may be beyond our means. Uranium is also a resource in finite supply. We are no closer to the more difficult project of atomic fusion, by the way, than we were in the 1970s.

The upshot of all this is that we are entering a historical period of potentially great instability, turbulence and hardship. Obviously, geopolitical maneuvering around the world's richest energy regions has already led to war and promises more international military conflict. Since the Middle East contains two-thirds of the world's remaining oil supplies, the U.S. has attempted desperately to stabilize the region by, in effect, opening a big police station in Iraq. The intent was not just to secure Iraq's oil but to modify and influence the behavior of neighboring states around the Persian Gulf, especially Iran and Saudi Arabia. The results have been far from entirely positive, and our future prospects in that part of the world are not something we can feel altogether confident about.

And then there is the issue of China, which, in 2004, became the world's second-greatest consumer of oil, surpassing Japan. China's surging industrial growth has made it increasingly dependent on the imports we are counting on. If China wanted to, it could easily walk into some of these places -- the Middle East, former Soviet republics in central Asia -- and extend its hegemony by force. Is America prepared to contest for this oil in an Asian land war with the Chinese army? I doubt it. Nor can the U.S. military occupy regions of the Eastern Hemisphere indefinitely, or hope to secure either the terrain or the oil infrastructure of one distant, unfriendly country after another. A likely scenario is that the U.S. could exhaust and bankrupt itself trying to do this, and be forced to withdraw back into our own hemisphere, having lost access to most of the world's remaining oil in the process.

We know that our national leaders are hardly uninformed about this predicament. President George W. Bush was briefed on the dangers of the oil-peak situation as early as the 2000 election campaign and repeatedly since then. In March, the Department of Energy released a report that officially acknowledges for the first time that peak oil is for real and states plainly that "the world has never faced a problem like this. Without massive mitigation more than a decade before the fact, the problem will be pervasive and will not be temporary."

Most of all, the Long Emergency will require us to make other arrangements for the way we live in the United States. America is in a special predicament due to a set of unfortunate choices we made as a society in the twentieth century. Perhaps the worst was to let our towns and cities rot away and to replace them with suburbia, which had the additional side effect of trashing a lot of the best farmland in America. Suburbia will come to be regarded as the greatest misallocation of resources in the history of the world. It has a tragic destiny. The psychology of previous investment suggests that we will defend our drive-in utopia long after it has become a terrible liability.

Before long, the suburbs will fail us in practical terms. We made the ongoing development of housing subdivisions, highway strips, fried-food shacks and shopping malls the basis of our economy, and when we have to stop making more of those things, the bottom will fall out.

The circumstances of the Long Emergency will require us to downscale and re-scale virtually everything we do and how we do it, from the kind of communities we physically inhabit to the way we grow our food to the way we work and trade the products of our work. Our lives will become profoundly and intensely local. Daily life will be far less about mobility and much more about staying where you are. Anything organized on the large scale, whether it is government or a corporate business enterprise such as Wal-Mart, will wither as the cheap energy props that support bigness fall away. The turbulence of the Long Emergency will produce a lot of economic losers, and many of these will be members of an angry and aggrieved former middle class.

Food production is going to be an enormous problem in the Long Emergency. As industrial agriculture fails due to a scarcity of oil- and gas-based inputs, we will certainly have to grow more of our food closer to where we live, and do it on a smaller scale. The American economy of the mid-twenty-first century may actually center on agriculture, not information, not high tech, not "services" like real estate sales or hawking cheeseburgers to tourists. Farming. This is no doubt a startling, radical idea, and it raises extremely difficult questions about the reallocation of land and the nature of work. The relentless subdividing of land in the late twentieth century has destroyed the contiguity and integrity of the rural landscape in most places. The process of readjustment is apt to be disorderly and improvisational. Food production will necessarily be much more labor-intensive than it has been for decades. We can anticipate the re-formation of a native-born American farm-laboring class. It will be composed largely of the aforementioned economic losers who had to relinquish their grip on the American dream. These masses of disentitled people may enter into quasi-feudal social relations with those who own land in exchange for food and physical security. But their sense of grievance will remain fresh, and if mistreated they may simply seize that land.

The way that commerce is currently organized in America will not survive far into the Long Emergency. Wal-Mart's "warehouse on wheels" won't be such a bargain in a non-cheap-oil economy. The national chain stores' 12,000-mile manufacturing supply lines could easily be interrupted by military contests over oil and by internal conflict in the nations that have been supplying us with ultra-cheap manufactured goods, because they, too, will be struggling with similar issues of energy famine and all the disorders that go with it.

As these things occur, America will have to make other arrangements for the manufacture, distribution and sale of ordinary goods. They will probably be made on a "cottage industry" basis rather than the factory system we once had, since the scale of available energy will be much lower -- and we are not going to replay the twentieth century. Tens of thousands of the common products we enjoy today, from paints to pharmaceuticals, are made out of oil. They will become increasingly scarce or unavailable. The selling of things will have to be reorganized at the local scale. It will have to be based on moving merchandise shorter distances. It is almost certain to result in higher costs for the things we buy and far fewer choices.

The automobile will be a diminished presence in our lives, to say the least. With gasoline in short supply, not to mention tax revenue, our roads will surely suffer. The interstate highway system is more delicate than the public realizes. If the "level of service" (as traffic engineers call it) is not maintained to the highest degree, problems multiply and escalate quickly. The system does not tolerate partial failure. The interstates are either in excellent condition, or they quickly fall apart.

America today has a railroad system that the Bulgarians would be ashamed of. Neither of the two major presidential candidates in 2004 mentioned railroads, but if we don't refurbish our rail system, then there may be no long-range travel or transport of goods at all a few decades from now. The commercial aviation industry, already on its knees financially, is likely to vanish. The sheer cost of maintaining gigantic airports may not justify the operation of a much-reduced air-travel fleet. Railroads are far more energy efficient than cars, trucks or airplanes, and they can be run on anything from wood to electricity. The rail-bed infrastructure is also far more economical to maintain than our highway network.

The successful regions in the twenty-first century will be the ones surrounded by viable farming hinterlands that can reconstitute locally sustainable economies on an armature of civic cohesion. Small towns and smaller cities have better prospects than the big cities, which will probably have to contract substantially. The process will be painful and tumultuous. In many American cities, such as Cleveland, Detroit and St. Louis, that process is already well advanced. Others have further to fall. New York and Chicago face extraordinary difficulties, being oversupplied with gigantic buildings out of scale with the reality of declining energy supplies. Their former agricultural hinterlands have long been paved over. They will be encysted in a surrounding fabric of necrotic suburbia that will only amplify and reinforce the cities' problems. Still, our cities occupy important sites. Some kind of urban entities will exist where they are in the future, but probably not the colossi of twentieth-century industrialism.

Some regions of the country will do better than others in the Long Emergency. The Southwest will suffer in proportion to the degree that it prospered during the cheap-oil blowout of the late twentieth century. I predict that Sunbelt states like Arizona and Nevada will become significantly depopulated, since the region will be short of water as well as gasoline and natural gas. Imagine Phoenix without cheap air conditioning.

I'm not optimistic about the Southeast, either, for different reasons. I think it will be subject to substantial levels of violence as the grievances of the formerly middle class boil over and collide with the delusions of Pentecostal Christian extremism. The latent encoded behavior of Southern culture includes an outsized notion of individualism and the belief that firearms ought to be used in the defense of it. This is a poor recipe for civic cohesion.

The Mountain States and Great Plains will face an array of problems, from poor farming potential to water shortages to population loss. The Pacific Northwest, New England and the Upper Midwest have somewhat better prospects. I regard them as less likely to fall into lawlessness, anarchy or despotism and more likely to salvage the bits and pieces of our best social traditions and keep them in operation at some level.

These are daunting and even dreadful prospects. The Long Emergency is going to be a tremendous trauma for the human race. We will not believe that this is happening to us, that 200 years of modernity can be brought to its knees by a world-wide power shortage. The survivors will have to cultivate a religion of hope -- that is, a deep and comprehensive belief that humanity is worth carrying on. If there is any positive side to stark changes coming our way, it may be in the benefits of close communal relations, of having to really work intimately (and physically) with our neighbors, to be part of an enterprise that really matters and to be fully engaged in meaningful social enactments instead of being merely entertained to avoid boredom. Years from now, when we hear singing at all, we will hear ourselves, and we will sing with our whole hearts.

Adapted from The Long Emergency, 2005, by James Howard Kunstler, and reprinted with permission of the publisher, Grove/Atlantic, Inc.

Original here