Thursday, March 6, 2008

Top 10 songs chosen to orbit Earth

ASTRONAUTS on the International Space Station (ISS) are to get an unusual treat - an MP3 player loaded with a playlist of songs specially chosen for people in orbit.

The top ten was selected by a 14-year-old Norwegian girl, Therese Miljeteig, who won a competition staged by the European Space Agency (ESA).

Her prize is to watch the launch this weekend of ESA's space freighter, the Automated Transfer Vehicle (ATV), at the Kourou space base, French Guiana.

After launch, the cargo ship will dock automatically in low Earth orbit with the ISS, bringing food, water and other essentials to the ISS crew, as well as clothing and other personal items.

This was the winning selection, which beat out 1000 rivals from 10 countries:

- Here Comes The Sun - Beatles

- Come Fly With Me - Frank Sinatra

- Rocket Man - Elton John

- Up Where We Belong - Joe Cocker and Jennifer Warnes

- Imagine - John Lennon

- Flashdance - What A Feeling - Irene Cara

- Walk of Life - Dire Straits

- Fly - Celine Dion

- Rockin' All Over The World - Status Quo

- I Believe I Can Fly - R Kelly

Original here

The Universe is 13.73 ± 0.12 billion years old!

Happy birthday, Universe!

Kinda. It’s not really the Universe’s birthday, but now we do know to high accuracy just how old it is.

How?

NASA’s WMAP is the Wilkinson Microwave Anisotropy Probe (which is a mouthful, and why we just call it WMAP). It was designed to map the Universe with exquisite precision, detecting microwaves coming from the most distant source there is: the cooling fireball of the Big Bang itself.

New results just released from WMAP have nailed down lots of cool stuff — literally — about the Universe.

I am about to explain the early Universe to you. I’ll be brief, but if you want to skip to the results, then go ahead.

Here’s the quick version: the Big Bang was hot. The Universe itself expanded outward from a single point — actually, it’s space itself that expands, not the objects in it — and like any expanding gas it cooled. After about a microsecond, it had cooled enough for protons and neutrons to form. Three minutes later (yes, just three minutes) it had cooled enough for protons and neutrons to stick together. Hydrogen, helium, and just a dash of lithium were created, and these would be the only elements for some time (hundreds of millions of years, in fact). The Universe was a thick soup of matter and energy.

It kept expanding and cooling. At this point, it was opaque to light. A photon couldn’t travel an inch without smacking into an electron and then getting sent off in some other random direction. However, after a few hundred thousand years, an amazing thing happened: neutral hydrogen could form. Before this point, the Universe was still too hot; as soon as an electron bonded with a proton, some ultraviolet photon would come along and whack it off. But at that golden moment the cosmos had cooled off enough that a lasting atomic relationship was in the offing. Neutral hydrogen was born. At that moment — astronomers call it recombination, which is a misnomer, since it was the first time electrons and protons could combine — the Universe became transparent; without all those pesky electrons floating around, photons found themselves free to travel long distances.

It’s those photons WMAP sees. After 13.7 billion years, the expansion of the Universe has cooled the light, stretched its wavelength from ultraviolet to microwave. Another way to think about it is that the temperature associated with each photon went from thousands of Kelvins down to just a few, less than 3, in fact. That’s -270 Celsius, and -454 Fahrenheit.

Brrrr.
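
If you want to check the arithmetic yourself, here's a quick back-of-the-envelope sketch in Python. The ~3000 K figure for the gas at recombination is a standard textbook value, not a number quoted above:

```python
# Rough sanity check of the cooling described above.
# Assumes the standard textbook value of ~3000 K for the gas at recombination.

T_CMB_K = 2.725        # CMB temperature today, in kelvin
T_RECOMB_K = 3000.0    # approximate temperature at recombination (assumed)

celsius = T_CMB_K - 273.15
fahrenheit = celsius * 9 / 5 + 32
stretch = T_RECOMB_K / T_CMB_K   # wavelengths have stretched by roughly this factor

print(f"Today's CMB: {T_CMB_K} K = {celsius:.1f} C = {fahrenheit:.1f} F")
print(f"Wavelengths stretched by a factor of about {stretch:.0f} since recombination")
```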

That light emitted just after recombination tells us a vast amount about the Universe at that time. By carefully mapping the exact wavelength of the light and the direction from where it came, we can tell the density and temperature of the matter at that time. Incredibly we can also tell how much dark energy there was, and even the geometry of the Universe: whether it is flat, open, or closed.

All this, from the dying glow of the Big Bang itself.

WMAP Results

A lot of this information was determined a while back, just a couple of years after WMAP launched. But now they have released the Five Year Data, a comprehensive analysis of what all that data means. Here’s a quick rundown:

1) The age of the Universe is 13.73 billion years, plus or minus 120 million years. Some people might say it doesn’t look a day over 6000 years. They’re wrong.

2) The image above shows the temperature difference between different parts of the sky. Red is hotter, blue is cooler. However, the differences are incredibly small: the spread from the coldest to the hottest spots is only about 0.0004 degrees Celsius. The average temperature is 2.725 Kelvin, so you’re seeing temperatures from 2.7248 to 2.7252 Kelvins.

3) The age of the Universe when recombination occurred was 375,938 years, ± about 3,100 years. Wow.

4) The Universe is flat.

5) The energy budget of the Universe is the total amount of energy and matter in the whole cosmos added up. Together with some other observations, WMAP has been able to determine just how much of that budget is occupied by dark energy, dark matter, and normal matter. What they got was: the Universe is 72.1% dark energy, 23.3% dark matter, and 4.62% normal matter. You read that right: everything you can see, taste, hear, touch, just sense in any way… is less than 5% of the whole Universe.

We occupy a razor thin slice of reality.
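
To make that razor thin slice concrete, here's the arithmetic on the five-year budget figures quoted above (just a quick sanity check in Python):

```python
# Quick sanity check on the WMAP five-year energy budget quoted above.
dark_energy = 72.1    # percent
dark_matter = 23.3    # percent
normal_matter = 4.62  # percent

total = dark_energy + dark_matter + normal_matter
print(f"Budget adds up to {total:.2f}% (the tiny excess is just rounding in the published figures)")
print(f"Everything we can see, taste, hear or touch: {normal_matter}%")
print(f"Everything we cannot directly sense: {100 - normal_matter:.1f}%")
```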

There are other important things that have come from the WMAP data, and if you’re interested, you can read all about them on the WMAP site and in the professional journal papers.

But if you only want to peruse the results I’ve highlighted here, that’s fine too. Either way, remember this, and remember it well: you are living in a unique time. For the first time in all of human history, we can look up at the sky, and when it looks back down on us it reveals its secrets. We are the very first humans to be able to do this… and we have the entire future of the Universe ahead of us.

Original here

Hey Mom, I'm Standing on the Moon!

Original here

Universe submerged in a sea of chilled neutrinos

We are all submerged in a sea of almost undetectable particles left over from the first few seconds of the big bang, according to the latest observations from a NASA satellite. The Wilkinson Microwave Anisotropy Probe (WMAP) has confirmed the theory that the universe is filled with a fluid of cold neutrinos that remain almost entirely aloof from ordinary matter.

Cosmologists think that in the hot, dense, young universe, neutrinos should have been created in high-energy particle collisions. About two seconds after the big bang, the cauldron of colliding particles would have cooled down so much that most would not have had enough energy to interact strongly with neutrinos. The neutrinos would then have "de-coupled" from other matter and radiation.

In theory, they should still be buzzing around, a soup of slippery particles that by today has been chilled to a temperature of only 1.9 degrees Celsius above absolute zero.
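
That 1.9-degree figure follows from a standard result in cosmology that the article doesn't spell out: the relic neutrinos should be cooler than today's 2.725 K photon background by a factor of (4/11)^(1/3), because the photons were later reheated by electron-positron annihilation while the already-decoupled neutrinos were not. A quick check in Python:

```python
# Standard-cosmology estimate of the relic neutrino temperature.
# T_nu = (4/11)**(1/3) * T_photon is a textbook result, added here for context.

T_PHOTON_K = 2.725                       # CMB photon temperature today, in kelvin
T_nu = (4 / 11) ** (1 / 3) * T_PHOTON_K

print(f"Predicted neutrino background temperature: {T_nu:.2f} K above absolute zero")
```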

Now WMAP has found evidence of this cosmic gazpacho. The spacecraft, launched in 2001, has been building up a picture of the cosmic microwave background radiation, which carries a detailed imprint of the state of the universe 380,000 years after the big bang. In particular, it reveals the pattern of density fluctuations in space, the "texture" of the early universe.

Travelling at nearly the speed of light, neutrinos should have discouraged matter from forming tight clumps, and so smoothed out the texture of the universe slightly.

Only detector

The WMAP data clearly show this smoothing effect, implying that those fast-flowing neutrinos formed about 10% of all the energy in the 380,000-year-old universe. "This confirms the theory," says Eiichiro Komatsu of the University of Texas in Austin, US, lead author of a study about the result.

In 2005, another analysis also provided evidence for a cosmic neutrino background, but it relied on combining WMAP data with data from other sources, and on making assumptions about other cosmological parameters, says Komatsu. Now that WMAP has collected five years' worth of data, that data set is enough on its own to show firm evidence of the neutrino background.

The relic neutrinos interact too feebly to be detected individually. "These neutrinos cannot be detected on the ground; you need the CMB to do it," Komatsu told New Scientist.

Other neutrinos, for example those generated in the Sun's core, can be detected on Earth, often in large tanks of water buried deep underground, where an occasional neutrino is unlucky enough to hit an atomic nucleus. But cosmic background neutrinos have only a millionth of the energy of a typical solar neutrino, making them even more ethereal.

To stop a substantial fraction of solar neutrinos, you would already need a lead shield a light year thick, says Komatsu. How about cosmic background neutrinos? "I'd estimate you would need a block of lead that is thicker than the entire universe."

WMAP measures the composition of the universe by observing the cosmic microwave background, radiation that was emitted just 380,000 years after the big bang. Dark matter and atoms have become less dense as the volume of the universe has increased over time. Photons and neutrino particles also lose energy as the universe expands, but dark energy now dominates the universe even though it was a tiny contributor 13.7 billion years ago (Illustration: NASA/WMAP Science Team)

Original here

3D Virus Image Taken At Highest Resolution Ever

Viruses are sub-microscopic infectious agents that need other cells in order to reproduce. In fact, some scientists argue that viruses are not living beings at all, since they lack cells and so fail a common criterion in definitions of life; they do, however, have genes and evolve by natural selection. Either way, they can be very harmful to us, so studying and understanding them is a must.

Now, a team of researchers from Purdue University has captured images of a virus in twice the detail previously achieved. Wen Jiang, an assistant professor of biological sciences at Purdue, led a research team that was able to capture a three-dimensional image of a virus at a resolution of 4.5 angstroms by using the emerging technique of single-particle electron cryomicroscopy. To get a sense of the scale, a pin is a few million angstroms in diameter.
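
To put that 4.5-angstrom figure in perspective, here's the unit arithmetic; the roughly 0.5 mm pin diameter used below is my own ballpark assumption, not a number from the article:

```python
# Putting the 4.5-angstrom resolution in context.
# 1 angstrom = 1e-10 m. The ~0.5 mm pin diameter is an assumed ballpark figure.

ANGSTROM_M = 1e-10
resolution_a = 4.5
pin_diameter_m = 0.5e-3   # assumed ~0.5 mm pin shaft

print(f"4.5 angstroms = {resolution_a * ANGSTROM_M * 1e9:.2f} nanometers")
print(f"A pin is roughly {pin_diameter_m / ANGSTROM_M:,.0f} angstroms across")
print(f"That is about {pin_diameter_m / (resolution_a * ANGSTROM_M):,.0f} resolution elements across a pin")
```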

“This is one of the first projects to refine the technique to the point of near atomic-level resolution,” said Jiang, who also is a member of Purdue’s structural biology group. “This breaks a threshold and allows us to now see a whole new level of detail in the structure. This is the highest resolution ever achieved for a living organism of this size.”

“If we understand the system - how the virus particles assemble and how they infect a host cell - it will greatly improve our ability to design a treatment,” Jiang said. “Structural biologists perform the basic science and provide information to help those working on the clinical aspects.”

The team obtained a three-dimensional map of the protein shell of the epsilon15 bacteriophage, a virus that infects bacteria; bacteriophages are in fact among the most abundant forms of life on Earth. The researchers plan to take things even further, refining the process to improve the capabilities of the technique and to analyze more virus species.

Original here

Study rearranges some branches on animal tree of life

A study led by Brown University biologist Casey Dunn uses new genomics tools to answer old questions about animal evolution. The study is the most comprehensive animal phylogenomic research project to date, involving 40 million base pairs of new DNA data taken from 29 animal species.

The study, which appears in Nature, settles some long-standing debates about the relationships between major groups of animals and offers up a few surprises.

The big shocker: Comb jellyfish – common and extremely fragile jellies with well-developed tissues – appear to have diverged from other animals even before the lowly sponge, which has no tissue to speak of. This finding calls into question the very root of the animal tree of life, which traditionally placed sponges at the base.

“This finding suggests either that comb jellies evolved their complexity independently from other animals, or that sponges have become greatly simplified through the course of evolution. If corroborated by other types of evidence, this would significantly change the way we think about the earliest multicellular animals,” said Dunn, assistant professor of ecology and evolutionary biology at Brown. “Coming up with these surprises, and trying to better understand the relationships between living things, made this project so fascinating.”

Charles Darwin introduced the idea of a “tree of life” in his seminal book Origin of Species. A sketch of the tree was the book’s only illustration. Nearly 150 years after its publication, many relationships between animal groups are still unclear. While enormous strides have been made in genomics, offering up a species’ entire genome for comparison, there are millions of animal species on the planet. There simply isn’t the time to sequence all these genomes.

To get a better grasp of the tree of life – without sequencing the entire genomes of scores of species – Dunn and his team collected data, called expressed sequence tags, from the active genes of 29 poorly understood animals that perch on far-flung branches of the tree of life, including comb jellies, centipedes and mollusks. The scientists analyzed this data in combination with existing genomic data from 48 other animals, such as humans and fruit flies, looking for the most common genes being activated, or expressed.

The aim of this new approach is to analyze a large number of genes from a large number of animals – an improvement over comparative genomics methods which allow for a limited analysis of genes or animals. The new process is not only more comprehensive, it is also more computationally intensive. Dunn’s project demanded the power of more than 120 processors housed in computer clusters located in laboratories around the globe.
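
To give a flavor of the kind of data structure such an analysis works from (this is a minimal conceptual sketch, not the Dunn team's actual pipeline; the species and sequences below are invented placeholders), phylogenomic studies typically concatenate many per-gene alignments into one big "supermatrix", padding with gaps wherever a species lacks data for a gene:

```python
# Minimal illustration of building a concatenated "supermatrix" from per-gene
# alignments. This is a conceptual sketch only; the species names and sequences
# are invented placeholders, not data from the Nature study.

gene_alignments = {
    "gene1": {"comb_jelly": "ATGCC", "sponge": "ATGCA", "human": "ATGCT"},
    "gene2": {"sponge": "GGTTA", "human": "GGTTC"},  # comb jelly missing this gene
}

species = ["comb_jelly", "sponge", "human"]
supermatrix = {sp: "" for sp in species}

for gene, seqs in gene_alignments.items():
    length = len(next(iter(seqs.values())))
    for sp in species:
        # Pad with gap characters when a species has no data for this gene.
        supermatrix[sp] += seqs.get(sp, "-" * length)

for sp, seq in supermatrix.items():
    print(f"{sp:>10}: {seq}")
```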

Dunn and his team:

-- unambiguously confirmed certain animal relationships, including the existence of a group that includes invertebrates that shed their skin, such as arthropods and nematodes;
-- convincingly resolved conflicting evidence surrounding other relationships, such as the close relationship of millipedes and centipedes to spiders rather than insects;
-- established new animal relationships, such as the close ties between nemerteans, or ribbon worms, and brachiopods, or two-shelled invertebrates.

“What is exciting is that this new information changes our basic understanding about the natural world – information found in basic biology books and natural history posters,” Dunn said. “While the picture of the tree of life is far from complete after this study, it is clearer. And these new results show that these new genomic approaches will be able to resolve at least some problems that have been previously intractable.”

Original here

Brain Scanner Can Tell What You're Looking At

Scientists have developed a computer model that predicts the brain patterns elicited by looking at different images -- a possible first step on the path to mind reading.
Image: University of California at Berkeley

Tell me what you see.

On second thought, don't: A computer will soon be able to do it, simply by analyzing the activity of your brain.

That's the promise of a decoding system unveiled this week in Nature by neuroscientists from the University of California at Berkeley.

The scientists used a functional magnetic resonance imaging machine -- a real-time brain scanner -- to record the mental activity of a person looking at thousands of random pictures: people, animals, landscapes, objects, the stuff of everyday visual life. With those recordings the researchers built a computational model for predicting the mental patterns elicited by looking at any other photograph. When tested with neurological readouts generated by a different set of pictures, the decoder passed with flying colors, identifying the images seen with unprecedented accuracy.
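
As a rough illustration of how that kind of identification can work in principle (this is not the Berkeley group's actual model; the image features, brain data and dimensions below are random placeholders), one can fit a linear "encoding" model that predicts each brain voxel's response from image features, then identify a new image by asking whose predicted brain pattern best matches the measured one:

```python
import numpy as np

# Toy sketch of encoding-model-based image identification. Not the Berkeley
# model itself; all the data here are random placeholders.
rng = np.random.default_rng(0)
n_train, n_test, n_features, n_voxels = 200, 20, 50, 100

# Stand-ins for image features (e.g. outputs of a visual filter bank) and a
# hidden linear mapping from features to voxel responses.
X_train = rng.normal(size=(n_train, n_features))
X_test = rng.normal(size=(n_test, n_features))
true_W = rng.normal(size=(n_features, n_voxels))
Y_train = X_train @ true_W + 0.1 * rng.normal(size=(n_train, n_voxels))
Y_test = X_test @ true_W + 0.1 * rng.normal(size=(n_test, n_voxels))

# Fit the encoding model with ridge regression (one weight column per voxel).
lam = 1.0
W = np.linalg.solve(X_train.T @ X_train + lam * np.eye(n_features),
                    X_train.T @ Y_train)

# Identification: for each measured test pattern, pick the image whose
# predicted pattern correlates best with it.
predicted = X_test @ W
correct = 0
for i, measured in enumerate(Y_test):
    corrs = [np.corrcoef(measured, pred)[0, 1] for pred in predicted]
    correct += int(np.argmax(corrs) == i)

print(f"Identified {correct}/{n_test} test images correctly")
```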

"No one that I know would ever have guessed our decoder would do this well," study co-author Jack Gallant said.

As the decoder is refined, it could be used to explore the phenomenon of visual attention -- concentration on one part of a complicated scene -- and then to illuminate the dimly understood intricacies of the mind's eye.

"One day it may even be possible to reconstruct the visual content of dreams," Gallant said.

After that, the decoding model could be harnessed for more visionary purposes: early warning systems for neurological diseases or interfaces that allow paralyzed people to engage with the world.

Other uses may not be so noble, such as marketing campaigns crafted for maximum mental penetration or invasions of mental privacy mounted in the name of fighting terrorism and crime.

Those technologies remain decades away, but researchers say it's not too soon to think about them, especially if research progresses at the pace set by this study.

Earlier decoders could only tell whether someone looked at a general type of image -- at a dog, for example -- but couldn't identify more specific photos, such as a small dog eating a bone. They've also been incapable of predicting what thought patterns an image would provoke.

The Berkeley model overcame both those limitations.

"It's quite tedious to measure every possible thought you might encounter, then measure the brain activity for that," said John-Dylan Haynes, a Max Planck Institute neuroscientist who was not involved in the study. "This is a big step forward."

Future steps involve expanding the decoder beyond its current focus on the brain's primary visual cortex, which represents general forms but doesn't handle the more complicated optical information processed in other parts of the brain.

More detail is also required, as the brain scanners used for the study measure blood flow caused by neural activity at a relatively coarse resolution of two cubic millimeters.

A higher-resolution, fully reconstructive decoder could help researchers chart the incredibly complex processes underlying visual perception. Gallant also hopes it could be used to detect early symptoms of neurological diseases like Alzheimer's and Parkinson's.

Eventually, Haynes said, the Berkeley model could be harnessed for something akin to mind reading.

"We want not only to decode people's perceptions, but also high-level mental states: people's intentions, their plans," Haynes said.

But Gallant warned of technological malfeasance. In the courtroom, mental readouts could have the same problems as eyewitness testimony, which is often unreliable and biased even though witnesses believe they're telling the truth.

The allure of reading minds to prove innocence or guilt, Haynes said, could override concerns about mental privacy -- an ethically ambiguous conflict. More obviously dubious is the possible use of mind-reading machines by marketers.

"There's some things we can do, and some we can't," Haynes said. "Some things are very easy, and others are not. But it's vital to think about the ethics now."

Original here

How sci-fi boffins are making a bug turn into a fly-on-the-wall

A rove beetle

The agency that the Pentagon set up to turn outlandish sci-fi concepts into reality has come closer to creating an army – or air force – of cybugs: cyber-moths and beetles that can spy on the enemy.

Inspired by Thomas Easton’s 1990 novel Sparrowhawk, in which animals enlarged by genetic engineering were fitted with implanted control systems, the Defense Advanced Research Projects Agency (Darpa) set out to insert micro-systems into living insects as they undergo metamorphosis.

The plan is that their organs will grow around the chips and wires that make up the remote-control devices.

Darpa’s goal is to create cyborg insects that can fly at least 100 metres from their controller and land within 5 metres of a target and stay there until commanded to buzz off again.

Although this goal is still in the future, the agency has made remarkable advances. In a series of videoclips shown at a conference in Tucson, Arizona, New Scientist reports, a tobacco hawkmoth with wires connected to its back lifts and lowers one wing, then the other, then both, in response to signals delivered to its flight muscles.

As the Darpa researchers increase the frequency of the muscle stimulation the moth’s wings beat faster, approaching take-off speed. In another clip the moth is flying, tethered from above, when electrical impulses applied to muscles cause it to swerve left or right.

The clips were filmed at the Boyce Thompson Institute in Ithaca, New York, where a team led by Dr David Stern implanted the flexible plastic probes into tobacco hawkmoth pupae seven days before the moths emerged.

A probe is embedded in each set of flight muscles on each side of the moth and a connection protrudes from the moth’s back. This can be hooked up to the tether wires which also deliver control signals and power.

Meanwhile another Darpa-funded group, led by Dr Michel Maharbiz at the University of California, has implanted electrodes into the brains of adult green June beetles, near brain cells that control flight.

When the team delivered pulses of negative voltage to the brain, the beetles’ wing muscles began beating and the bugs took off. A pulse of positive voltage shut the wings down, stopping flight short, and by rapidly switching between these signals, they controlled the insects’ thrust and lift.
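
Here is a minimal sketch of that start/stop control scheme as described: negative pulses start the wings, positive pulses stop them, and switching rapidly between the two modulates thrust. The voltages, timings and the send_pulse interface below are invented placeholders, not Darpa's actual parameters:

```python
import time

# Conceptual sketch of the start/stop stimulation scheme described above.
# send_pulse() is a hypothetical stand-in for whatever hardware actually
# drives the implanted electrodes; all voltages and timings are invented.

def send_pulse(voltage_v: float) -> None:
    print(f"pulse: {voltage_v:+.1f} V")  # placeholder for a real electrode driver

def start_flight() -> None:
    send_pulse(-1.0)   # negative pulse: wing muscles start beating

def stop_flight() -> None:
    send_pulse(+1.0)   # positive pulse: wings shut down

def modulate_thrust(duty_cycle: float, period_s: float = 0.2, cycles: int = 5) -> None:
    """Alternate start/stop pulses; more 'on' time per cycle means more thrust."""
    for _ in range(cycles):
        start_flight()
        time.sleep(duty_cycle * period_s)
        stop_flight()
        time.sleep((1 - duty_cycle) * period_s)

modulate_thrust(duty_cycle=0.7)  # example: mostly-on cycling for higher lift
```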

The challenge now is to shrink the components to hide as many of them as possible inside the insect. They are also looking to harness power from the insects themselves and resolve how the insects will be guided to a target.

“There were a bunch of ideas,” said Dr Charles Higgins at the University of Arizona, who was involved in Darpa’s original brainstorming session for the HI-MEMS (Hybrid Insect Micro-Electro-Mechanical Systems) project.

His colleague, Dr John Hildebrand, an insect neurobiologist, added: “There’s a long history of trying to develop micro-robots that could be sent out as autonomous devices, but I think many engineers have realised they can’t improve on Mother Nature.”

Original here




Lightway - Revolutionary Lighting System

Lightway is a recently launched window and lighting system that lets sunlight into the house during the day and then fills the house with light after dusk. Developed by Breezeway, this innovative design combines the latest OLEDs (organic light-emitting diodes) with transparent photovoltaic nanoscale technology, and that combination is what makes Lightway special.

Lightway - Futuristic Lighting System

Lightway is claimed to be the first system of its kind in the world: it gathers solar energy during the daytime and then uses its built-in electronics to light the surrounding area (a house or a street, for instance) at night. Portability is one of its attractive features, and the makers say it can cut electricity costs, for households as well as for streets and shopping centers, by a whopping 22%. These features also make Lightway eco-friendly.

Lightway - Collects solar energy

Honored with an Australian Design Award, Lightway is a concept that combines two advanced technologies in the simplest possible way to make a useful product. Aiming to revolutionize lighting, it could be used widely in homes, streets, shopping arcades, museums, art galleries and more. Operating Lightway is easy: you simply rotate the louver handle to open and close the system. The system also meets Australian Standards for both construction and voltage, remaining below the high-risk 32 V category.

Lightway can be widely used in homes, streets, shopping arcades, museums, art galleries

Lightway's user-friendly interface is safe to use and brightly colored, and the art and graphics on the unit make it attractive to look at. Functionally, the makers claim it can deliver the equivalent of 60 W of light output while drawing only 50 W.

Lightway - Concept Power Saving Technology

Breezeway sells Lightway through in-store displays, where customers can inspect the product, choose the version that suits their requirements and place an order. If the concept lives up to its claims, it could well shake up the lighting market.

Original here

Subaru STI: Is Diesel the Intersection of Power and Fuel Economy?

We all know that fast cars are fun and fuel-sipping cars are environmentally responsible, but is there a middle ground?

Short of expensive electric sports cars like the Tesla Roadster, there may be a solution to be found in diesel. Not only can diesel cars be fueled with waste vegetable oil, biodiesel, or some mixture of these fuels, but diesel engines produce a lot of torque and get better fuel economy than their gasoline-powered brethren.

Auto Express reports that the Impreza lineup will soon feature a 2.0L diesel engine sporting 148 bhp - but that engine could easily be tuned up to 180 bhp for use in a sportier WRX model. This model could go 0-60 in under 7 seconds and wouldn’t top out until a respectable 140 mph.

Certainly impressive, but what we really care about is the fact that this engine could achieve up to 45 mpg and reduce CO2 emissions by 40% compared to the gasoline-powered STI.
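
Since Auto Express is a UK magazine, that 45 mpg figure is presumably in imperial gallons (an assumption on my part, not something stated in the piece); here's how the numbers convert:

```python
# Unit conversions for the figures quoted above. Assumes the 45 mpg figure is
# in imperial gallons, since the source is a UK magazine.

MILE_KM = 1.609344
IMP_GALLON_L = 4.54609
US_GALLON_L = 3.785411784

mpg_imperial = 45.0
litres_per_100km = 100 * IMP_GALLON_L / (mpg_imperial * MILE_KM)
mpg_us = mpg_imperial * US_GALLON_L / IMP_GALLON_L

print(f"45 mpg (imperial) is about {litres_per_100km:.1f} L/100 km, or {mpg_us:.0f} mpg (US)")

power_gain_pct = (180 - 148) / 148 * 100
print(f"Tuning from 148 to 180 bhp is roughly a {power_gain_pct:.0f}% increase")
```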

Original here