

Friday, August 15, 2008

MIT solves puzzle of meteorite-asteroid link

For the last few years, astronomers have faced a puzzle: The vast majority of asteroids that come near the Earth are of a type that matches only a tiny fraction of the meteorites that most frequently hit our planet.

Since meteorites are mostly pieces of asteroids, this discrepancy was hard to explain, but a team from MIT and other institutions has now found what it believes is the answer to the puzzle. The smaller rocks that most often fall to Earth, it seems, come straight in from the main asteroid belt out between Mars and Jupiter, rather than from the near-Earth asteroid (NEA) population.

The puzzle gradually emerged from a long-term study of the properties of asteroids carried out by MIT professor of planetary science Richard Binzel and his students, along with postdoctoral researcher P. Vernazza, who is now with the European Space Agency, and A.T. Tokunaga, director of the University of Hawaii's Institute for Astronomy.

By studying the spectral signatures of near-Earth asteroids, they were able to compare them with spectra obtained on Earth from the thousands of meteorites that have been recovered from falls. But the more they looked, the more they found that most NEAs -- about two-thirds of them -- match a specific type of meteorite called LL chondrites, which represent only about 8 percent of meteorites. How could that be?

"Why do we see a difference between the objects hitting the ground and the big objects whizzing by?" Binzel asks. "It's been a head-scratcher." The effect grew more pronounced as more asteroids were analyzed, until "we finally had a big enough data set that the statistics demanded an answer. It could no longer be just a coincidence."

Way out in the main belt, the population is much more varied, and approximates the mix of types found among meteorites. But why would the things that most frequently hit us match this distant population better than they match the stuff that's right in our neighborhood? That's where the idea emerged of a fast track all the way from the main belt to a "splat!" on Earth's surface.

This fast track, it turns out, is caused by an obscure effect that was discovered long ago, but only recently recognized as a significant factor in moving asteroids around, called the Yarkovsky effect.

The Yarkovsky effect causes asteroids to change their orbits as a result of the way they absorb the sun's heat on one side and radiate it back later as they rotate around. This causes a slight imbalance that slowly, over time, alters the object's path. But the key thing is this: The effect acts much more strongly on the smallest objects, and only weakly on the larger ones.
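The size dependence described above comes down to a simple scaling argument: the thermal recoil force grows with a body's sunlit cross-section (roughly the radius squared), while the mass it must push grows with volume (roughly the radius cubed), so the resulting acceleration falls off as one over the radius. The sketch below is illustrative only and is not from the paper; it assumes equal density and thermal properties for all bodies.

```python
# Illustrative sketch (not from the paper): why the Yarkovsky effect
# favors small bodies. Thermal force ~ R^2 (cross-section), mass ~ R^3
# (volume), so drift acceleration ~ 1/R.

def relative_yarkovsky_acceleration(radius_m, reference_radius_m=1.0):
    """Drift acceleration relative to a 1 m body, under the simplifying
    assumption of equal density and thermal properties."""
    return reference_radius_m / radius_m

boulder = relative_yarkovsky_acceleration(1.0)      # a 1 m meteoroid
asteroid = relative_yarkovsky_acceleration(500.0)   # a 1 km asteroid (R = 500 m)

# The meter-size rock drifts roughly 500 times faster than the
# kilometer-size asteroid -- enough to ferry small debris in from
# anywhere in the belt while barely budging the big ones.
print(boulder / asteroid)
```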

"We think the Yarkovsky effect is so efficient for meter-size objects that it can operate on all regions of the asteroid belt," not just its inner edge, Binzel says.

Thus, for chunks of rock from boulder-size on down -- the kinds of things that end up as typical meteorites -- the Yarkovsky effect plays a major role, moving them with ease from throughout the asteroid belt on to paths that can head toward Earth. For larger asteroids a kilometer or so across, the kind that we worry about as potential threats to the Earth, the effect is so weak it can only move them small amounts.

Binzel's study concludes that the largest near-Earth asteroids mostly come from the asteroid belt's innermost edge, where they are part of a specific "family" thought to all be remnants of a larger asteroid that was broken apart by collisions. With an initial nudge from the Yarkovsky effect, kilometer-sized asteroids from the Flora family can find themselves "over the edge" of the asteroid belt and sent on a path to Earth's vicinity through gravitational perturbations from the planets known as resonances.

The new study is also good news for protecting the planet. One of the biggest problems in figuring out how to deal with an approaching asteroid, if and when one is discovered on a potential collision course, is that they are so varied. The best way of dealing with one kind might not work on another.

But now that this analysis has shown that the majority of near-Earth asteroids are of this specific type -- stony objects, rich in the mineral olivine and poor in iron -- it's possible to concentrate most planning on dealing with that kind of object, Binzel says. "Odds are, an object we might have to deal with would be like an LL chondrite, and thanks to our samples in the laboratory, we can measure its properties in detail," he says. "It's the first step toward 'know thy enemy'."

The research is being reported this week in the journal Nature. In addition to Binzel, Vernazza and Tokunaga, the co-authors are MIT graduate students Christina Thomas and Francesca DeMeo, S.J. Bus of the University of Hawaii, and A.S. Rivkin of Johns Hopkins University. The work was supported by NASA and the NSF.

Original here

Exclusive: A robot with a biological brain

University of Reading scientists have developed a robot controlled by a biological brain formed from cultured neurons. And this is a world first. Other research teams have tried to control robots with ‘brains,’ but there was always a computer in the loop. This new project is the first to examine ‘how memories manifest themselves in the brain, and how a brain stores specific pieces of data.’ As life expectancy increases in most countries, this research could provide insights into how the brain works and help aging populations. In fact, the main goal of this project is to better understand the development of diseases and disorders which affect the brain, such as Alzheimer's or Parkinson's disease. It’s interesting to note that this project is led by Professor Kevin Warwick, who became famous in 1998 when a silicon chip was implanted in his arm to allow a computer to monitor him, in order to assess the latest technology for use with the disabled. But read more…

A robot with a biological brain

You can see on the left a picture of this robot with a biological brain. “The brain consists of a collection of neurons cultured on a Multi Electrode Array (MEA). It communicates and controls the robot via a Bluetooth connection.” (Credit: University of Reading). Here is a link to a larger version of this picture.

These robots are developed at the Cybernetic Intelligence Research Group, part of the School of Systems Engineering at the University of Reading. The team has been led by Kevin Warwick, Professor of Cybernetics (please also check his personal home page). He worked with two lecturers in his group, Dr Victor Becerra and Dr Slawomir Nasuto, as well as with Dr Ben Whalley, another lecturer in the School of Pharmacy.

Now, let’s look at these biological brains for robots. “The robot’s biological brain is made up of cultured neurons which are placed onto a multi electrode array (MEA). The MEA is a dish with approximately 60 electrodes which pick up the electrical signals generated by the cells. This is then used to drive the movement of the robot. Every time the robot nears an object, signals are directed to stimulate the brain by means of the electrodes. In response, the brain’s output is used to drive the wheels of the robot, left and right, so that it moves around in an attempt to avoid hitting objects. The robot has no additional control from a human or a computer, its sole means of control is from its own brain.”
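The loop described in that quote -- sense an obstacle, stimulate the culture through the electrodes, read back its firing, and drive the wheels -- can be sketched as a single control step. Everything below is a hypothetical illustration: the function and variable names, the proximity threshold, and the mapping from firing rates to wheel speeds are all invented, since the article does not describe the actual signal processing.

```python
# Hypothetical sketch of the sense -> stimulate -> act loop described
# in the article. All names and parameters here are invented for
# illustration; the real system uses an MEA and Bluetooth-linked
# hardware whose details are not given.

def control_step(distance_to_object, stimulate, read_response):
    """One iteration of the robot's closed loop."""
    THRESHOLD = 0.3  # metres; assumed proximity trigger
    if distance_to_object < THRESHOLD:
        stimulate()                # pulse the electrodes under the culture
    left, right = read_response()  # firing rates from two electrode groups
    # Assumed differential drive: steer away from whichever side fires harder
    # by swapping the activity onto the opposite wheels.
    return (right, left)           # (left_wheel_speed, right_wheel_speed)

# Tiny demo with stub hardware:
calls = []
def fake_stimulate():
    calls.append("pulse")
def fake_read():
    return (0.2, 0.8)  # right-side electrodes firing harder

wheels = control_step(0.1, fake_stimulate, fake_read)
print(wheels)  # (0.8, 0.2): the robot turns away from the right-side activity
```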

Impressive, isn’t it? The team is now working on “how memories manifest themselves in the brain when the robot revisits familiar territory,” hoping to help people affected by Alzheimer’s disease. Here is a quote from Warwick about this project. “This new research is tremendously exciting as firstly the biological brain controls its own moving robot body, and secondly it will enable us to investigate how the brain learns and memorises its experiences. This research will move our understanding forward of how brains work, and could have a profound effect on many areas of science and medicine.”

And here is another quote from Whalley. “One of the fundamental questions that scientists are facing today is how we link the activity of individual neurons with the complex behaviours that we see in whole organisms. This project gives us a really unique opportunity to look at something which may exhibit complex behaviours, but still remain closely tied to the activity of individual neurons. Hopefully we can use that to go some of the way to answer some of these very fundamental questions.”

This project has been funded by the UK’s Engineering and Physical Sciences Research Council (EPSRC) with a grant of £435,856. The project started on January 1, 2007 and will end on December 31, 2009. Here is a link to the details of the grant awarded to this project called “Investigating the Computational Capacity of Cultured Neuronal Networks Using Machine Learning.”

Here is an excerpt from the project description. “In this project the neural cultures will be cultured locally in the University of Reading’s new Electrophysiological research laboratory allowing real-time access to the recording and stimulation hardware via an intranet link-up. In order to test the abilities of such cultured neural networks we propose using them to control some of our existing mobile robots. This is to be achieved by applying a number of Machine Learning and Artificial Intelligence techniques in order to correctly translate robot sensor inputs into suitable patterns of stimulation and interpret the resulting patterns of neural activity as motor actions. In order to measure the amount of computation the cultured “brain” is performing we will use a surrogate (an artificial neural network that redistributes the input signal to the output) in place of the cultured “brain”. Both the cultured “brain” and the surrogate will be applied to various behavioural tasks (such as obstacle avoidance and wall following); the difference in performance between the cultured “brain” and the surrogate will give us some measure of the processing capabilities of cultured neural networks when used in this way.”
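The evaluation idea in that excerpt -- score both the cultured “brain” and a pass-through surrogate on the same behavioural tasks, and take the performance gap as a rough measure of the computation the culture contributes -- can be sketched as below. The scores are placeholder numbers invented for illustration; the project description gives no actual figures.

```python
# Hedged sketch of the surrogate-comparison methodology from the
# project description. The success rates below are made-up
# placeholders, not results from the project.

def computation_measure(culture_scores, surrogate_scores):
    """Mean performance gap across matched behavioural trials."""
    diffs = [c - s for c, s in zip(culture_scores, surrogate_scores)]
    return sum(diffs) / len(diffs)

# Hypothetical obstacle-avoidance success rates over five trials:
culture = [0.70, 0.65, 0.80, 0.75, 0.72]
surrogate = [0.50, 0.55, 0.52, 0.48, 0.51]

# A positive gap suggests the culture is doing computation beyond
# merely redistributing its inputs to its outputs.
print(round(computation_measure(culture, surrogate), 3))
```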

This project has been recently presented during the European Robotics Symposium 2008 (EUROS 2008) held in Prague, Czech Republic, on March 26-27, 2008. The title of the paper accepted for publication was “Architecture for Living Neuronal Cell Control of a Mobile Robot,” while Warwick’s keynote talk was named “Robots with Biological Brains and Humans with Part Machine Brains.”

Here is an excerpt from the introduction of this keynote talk. “In this presentation a look is taken at how the use of implant and electrode technology can be employed to create biological brains for robots, to enable human enhancement and to diminish the effects of certain neural illnesses. In all cases the end result is to increase the range of abilities of the recipients. […] The area of focus is notably the use of electrode technology, where a connection is made directly with the cerebral cortex and/or nervous system. The presentation will consider the future in which robots have biological, or part-biological, brains and in which neural implants link the human nervous system bi-directionally with technology and the internet.”

If you’ve read this post up to now, you need a reward. Here is a link from where you will be able to download a broadcast-quality video (I’m not sure if you have to register). Anyway, this video is 7 minutes and 22 seconds long and is a 95.2 MB download. This movie is divided into three parts: the evolution of the brainy robot and two interviews with the main researchers. Very instructive…

Original here

Brain will be battlefield of future, warns US intelligence report

Brain scan

Rapid advances in neuroscience could have a dramatic impact on national security and the way in which future wars are fought, US intelligence officials have been told.

In a report commissioned by the Defense Intelligence Agency, leading scientists were asked to examine how a greater understanding of the brain over the next 20 years is likely to drive the development of new medicines and technologies.

They found several areas in which progress could have a profound impact, including behaviour-altering drugs, scanners that can interpret a person's state of mind and devices capable of boosting senses such as hearing and vision.

On the battlefield, bullets may be replaced with "pharmacological land mines" that release drugs to incapacitate soldiers on contact, while scanners and other electronic devices could be developed to identify suspects from their brain activity and even disrupt their ability to tell lies when questioned, the report says.

"The concept of torture could also be altered by products in this market. It is possible that some day there could be a technique developed to extract information from a prisoner that does not have any lasting side effects," the report states.

The report highlights one electronic technique, called transcranial direct current stimulation, which involves using electrical pulses to interfere with the firing of neurons in the brain and has been shown to delay a person's ability to tell a lie.

Drugs could also be used to enhance the performance of military personnel. There is already anecdotal evidence of troops using the narcolepsy drug modafinil, and Ritalin, which is prescribed for attention deficit disorder, to boost their performance. Future drugs, developed to boost the cognitive faculties of people with dementia, are likely to be used in a similar way, the report adds.

Greater understanding of the brain's workings is also expected to usher in new devices that link directly to the brain, either to allow operators to control machinery with their minds, such as flying unmanned reconnaissance drones, or to boost their natural senses.

For example, video from a person's glasses, or audio recorded from a headset, could be processed by a computer to help search for relevant information. "Experiments indicate that the advantages of these devices are such that human operators will be greatly enhanced for things like photo reconnaissance and so on," Kit Green, who chaired the report committee, said.

The report warns that while the US and other western nations might now consider themselves at the forefront of neuroscience, that is likely to change as other countries ramp up their computing capabilities. Unless security services can monitor progress internationally, they risk "major, even catastrophic, intelligence failures in the years ahead", the report warns.

"In the intelligence community, there is an extremely small number of people who understand the science and without that it's going to be impossible to predict surprises. This is a black hole that needs to be filled with light," Green told the Guardian.

The technologies will one day have applications in counter-terrorism and crime-fighting. The report says brain imaging will not improve sufficiently in the next 20 years to read people's intentions from afar and spot criminals before they act, but it might be good enough to help identify people at a checkpoint or counter who are afraid or anxious.

"We're not going to be reading minds at a distance, but that doesn't mean we can't detect gross changes in anxiety or fear, and then subsequently talk to those individuals to see what's upsetting them," Green said.

The development of advanced surveillance techniques, such as cameras that can spot fearful expressions on people's faces, could lead to some inventive ways to fool them, the report adds, such as Botox injections to relax facial muscles.

Report: spies need to stay on top of neuroscience research

Intelligence gathering is neither straightforward nor foolproof. The intelligence community's abject failure when it came to the matter of Iraq and WMDs illustrates that point rather effectively, as does the failure to anticipate the Soviet invasion of Afghanistan, as well as the USSR's wild goose chase over Able Archer 83. When we think of the application of science to the intelligence gathering world, it's usually something like spy satellites or listening devices, but the US intelligence community needs to pay more heed to the world of the neurosciences, according to a new report from the National Research Council.

One of the major findings in "Emerging Cognitive Neuroscience and Related Technologies" is that the intelligence community should invest in research for detecting and measuring psychological states via neurophysiological markers. It's no secret that polygraphs are almost worthless. Their analysis is highly subjective, and it's fairly simple to fool them. But advances in neuroscience, specifically neuroimaging, mean that a lie detector that actually works is much closer to reality. It's fairly obvious why this would be of interest to the intelligence community, but it's just a single example of developments in the neurosciences that are of potential interest.

Other key findings include the need to stay up-to-date with the state of new pharmacological developments, along with other neuroimaging advances, the potential growth in computer modeling of cognition, and distributed human-machine systems. The NRC recommends that the intelligence community pay particular attention to developments overseas. Although not explicitly stated by name, the implication is that China and India, referred to obliquely as "countries where software research and development is relatively inexpensive and where there exists a sizable workforce with the appropriate education and skills" may be in a position to advance past the US into technological superiority.

The report was commissioned by the Defense Intelligence Agency, which is aware of emerging trends in neuroscience but lacks the personnel to truly understand them. This lack of sufficiently trained people is highlighted in the report, which points out that the intelligence community needs to recruit more officers and analysts with advanced science backgrounds, and should expand collaborations and links with academia.

That last point could be a potentially provocative one. Last year, arguments over the complicity of psychologists in the torture of detainees by the US military and intelligence agencies reverberated through the American Psychological Association, and it doesn't take much imagination to think similar things could happen in the neuroscience community. Then there's the specter of programs such as the CIA's MK Ultra, which ran during the 1950s and 1960s, and attempted to use psychoactive drugs to induce mind control.

On the other hand, it's worth bearing in mind that the US has neither a monopoly on intelligence agencies nor neuroscience research, and it would do well to make sure it was aware of developments in competitor nations with national security implications. After all, that's why we have intelligence agencies! Whether all of this means more funding for researchers in neuroscience is not quite clear, but we saw a large spike in funding for biodefense following the anthrax mailings. Then again, with black budgets, perhaps we'll never know.

Original here

Mankind is the 'Earth's biggest threat'

Researchers who analysed 30,000 academic studies dating back to 1970 said man was responsible for changes that ranged from the loss of ice sheets to the collapse in numbers of many species of wildlife.

"Humans are influencing climate through increasing greenhouse gas emissions, and the warming world is causing impacts on physical and biological systems," said Cynthia Rosenzweig, at the Nasa Goddard Institute for Space Studies.

The effects on living things include the earlier appearance of leaves on trees and plants; the movement of animals and birds to more northerly latitudes and to higher altitudes in the northern hemisphere; rapid advances in flowering time and earlier egg-laying in Britain; and changes in bird migrations in Europe, North America and Australia.

On a planetary scale the changes include the melting of glaciers on all continents; earlier spring river run-off; and the warming of oceans, lakes and rivers.

The study's conclusions go further than the most recent report by the Intergovernmental Panel on Climate Change, which concluded last year that man-made climate change was "likely" to have had a discernible effect on the planet.

It says natural climate variations cannot explain the changes to the Earth's natural systems.

In the study, published in the journal Nature, Miss Rosenzweig and researchers from 10 institutions across the world analysed data from published papers on 829 physical systems – such as glaciers and ice sheets – and 28,800 plant and animal systems. They produced a picture of the changes to each continent. The changes were most marked in North America, Asia and Europe but mainly because far more studies had been carried out there.

The authors said there was an urgent need to study environmental systems in South America, Australia and Africa, especially in tropical and subtropical areas.

In North America, the researchers found that 89 species of plants were flowering earlier, such as the American holly and box elder maple; a decline in the population of polar bears; and the rapid melting of Alaskan glaciers.

In Europe, they found evidence of glaciers melting in the Alps; earlier pollen release in the Netherlands; and apple trees producing leaves 35 days earlier in Spain.

In Asia they reported a change in the freeze depth of permafrost in Russia; and the earlier flowering of ginkgo in Japan.

In Antarctica, the population of emperor penguins had declined by 50 per cent. In South America, the melting of the Patagonia ice-fields was contributing to a rise in sea levels.

Prof Barry Brook, of the University of Adelaide, described the evidence that mankind was altering the world as "overwhelming".

He said: "These changes are only a minor portent of what is likely to come."

Original here