Monday, May 26, 2008

Astronomy Picture of the Day

Discover the cosmos! Each day a different image or photograph of our fascinating universe is featured, along with a brief explanation written by a professional astronomer.

2008 May 26

A New Horizon for Phoenix
Credit: Phoenix Mission Team, NASA, JPL-Caltech, Univ. Arizona

Explanation: This flat horizon stretches across the red planet as seen by the Phoenix spacecraft after yesterday's landing on Mars. Touching down shortly after 7:30 pm Eastern Time, Phoenix made the first successful rocket-controlled soft landing on Mars since the Viking landers in 1976, using its thrusters to control its final descent speed. Launched in August 2007, Phoenix has now made the northernmost landing on Mars and is intended to explore the Martian arctic's potentially ice-rich soil. The lander has returned images and data initially indicating that it is in excellent shape after a nearly flawless descent. News updates will be available throughout the day.

Original here

Japan will allow military use of space

U.S. could join in missile-tracking defense system

TOKYO — The Diet has passed a law allowing space to be used for military purposes. The law permits Japan to possess early warning satellites that could be used as part of a missile-defense system, and to jointly develop satellites for defense purposes with the United States.

The bill cleared the House of Councillors at a plenary session Wednesday morning.

The new law emphasizes the development and utilization of space, based on the U.N. Outer Space Treaty, which prohibits the deployment of nuclear and other arms in space, as well as on the defense-only principles of the Constitution.

The law proposes space development for security purposes based on the following aims:

• Improving people's livelihoods.
• Eliminating threats to human lives.
• Contributing to international peace and Japan's national security.

The law also includes provisions for the establishment of a new strategy headquarters, headed by the prime minister, to deal with space development in order to strengthen Japan's competitiveness in the global space industry.

Once the new office is created, a Cabinet member will be appointed to oversee development, while a space agency also will be established within the Cabinet Office within a year of the law going into force.

The law aims to set up a tax system and other monetary measures that will enhance the technical capacity of the nation's space industry and promote private sector investment in the field.

Since 1969, when the Diet adopted a resolution that restricted the use of space to peaceful purposes, successive governments have limited the nation's space development to "nonmilitary" purposes.

Original here

2012: No Planet X

Will Planet X cause mayhem in 2012? Nope.
Apparently, Planet X (a.k.a. Nibiru) was spotted by astronomers in the early 1980s in the outermost reaches of the Solar System. It has been tracked by infrared observatories, seen lurking around in the Kuiper Belt, and now it is speeding right toward us and will enter the inner Solar System in 2012. So what does this mean for us? Well, the effects of the approach of Planet X on our planet will be biblical, and what's more, the effects are being felt right now. Millions, even billions of people will die; global warming will increase; earthquakes, drought, famine, wars, social collapse, even killer solar flares will be caused by Nibiru blasting through the core of the Solar System. All of this will happen in 2012, and we must begin preparing for our demise right now…

As investigated in my previous article "No Doomsday in 2012", a lot of weight had been placed on the end of an ancient Mayan calendar, the "Long Count". According to this calendar and Mayan myth, something is going to happen on December 21st, 2012. Now the world's Planet X supporters seem to have calculated that this hypothetical, deadly planet will arrive from a highly eccentric orbit to wreak gravitational havoc on Earth, sparking geological, social, economic and environmental damage, killing a high proportion of life… in 2012.

I'm sorry, but the "facts" behind the Planet X/Nibiru myth simply do not add up. Don't worry, Planet X will not be knocking on our door in 2012 and here's why…

Nibiru and Planet X
The planet Neptune - could its orbital deviations reveal Planet X (NASA)

In 1843, John Couch Adams (a British mathematician and astronomer) studied the orbital perturbations of Uranus and deduced that, through gravitational interactions, there must be an eighth planet tugging at the gas giant. This led to the discovery of Neptune, orbiting at a distance of 30 AU from the Sun. There have been numerous occasions where this method has been used to deduce the existence of other bodies in the Solar System before they were directly observed.

Neptune was also experiencing orbital perturbations, and on the discovery of Pluto in 1930, it was thought that the aptly named "Planet X" had been discovered. Alas, Pluto's mass was tiny, and once the orbit of Charon (Pluto's moon) was analysed it was found that the mass of the Pluto-Charon system was far too small to affect the orbit of Neptune. The hunt for Planet X continued…

After years of speculation and historical research, it was believed that the massive body astronomers were looking for was either a large planet or a small star, possibly a companion to our Sun (which would make the Solar System a binary system). The name "Nibiru" was unearthed by the author Zecharia Sitchin while researching the possible intervention of extraterrestrials in the early history of mankind. Nibiru is a hypothetical planet from ancient Sumerian culture (the Sumerians existed from around 6,000 BC to 3,000 BC, predating Babylon, in the present-day geographic location of Iraq). There is very little archaeological evidence to suggest this mythical planet has anything to do with Planet X. But since this dubious connection was made, Planet X and Nibiru are now thought by doomsayers to be the same thing: an ancient astronomical body that has returned after a long orbit beyond the Solar System.

OK, so the Nibiru/Planet X connection might be a bit ropey already, but is there any solid evidence for the modern-day Planet X?

Infrared observations = Planet X
A popular image on Planet X websites. Is this Planet X, or is it simply a young galaxy? (NASA - possible source)
There is much emphasis placed on the 1983 "discovery" of a mysterious heavenly body by NASA's Infrared Astronomical Satellite (IRAS) on the outskirts of the Solar System, some 50 billion miles (540 AU) away. Naturally, the world's media were very excited by such a discovery and began making noises that perhaps this was Planet X (the most popular accessible resource for Planet X advocates is the Washington Post article published on December 31st, 1983, titled "Mystery Heavenly Body Discovered"). In actuality, astronomers weren't sure what the infrared object was (the clue is in the word "mystery"). Initial media reports postulated that it could be a long-period comet, a planet, a far-off young galaxy or a protostar (i.e. a brown dwarf). As soon as the last possibility was mentioned, this suddenly became the "discovery" that Planet X was in fact a brown dwarf orbiting in the outer reaches of our Solar System.
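The quoted distance is easy to sanity-check, assuming the usual value of roughly 93 million miles for one astronomical unit:

```python
# Sanity check: convert the quoted 50 billion miles to astronomical units.
# 1 AU (the mean Earth-Sun distance) is roughly 92.96 million miles.
MILES_PER_AU = 92.96e6

distance_miles = 50e9
distance_au = distance_miles / MILES_PER_AU
print(round(distance_au))  # roughly the 540 AU quoted in the article
```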

"So mysterious is the object that astronomers do not know if it is a planet, a giant comet, a nearby "protostar" that never got hot enough to become a star, a distant galaxy so young that it is still in the process of forming its first stars or a galaxy so shrouded in dust that none of the light cast by its stars ever gets through." - Thomas O'Toole, Washington Post Staff Writer, December 30th 1983 (from text on the Planet X and Pole Shift website)

So where did the Washington Post get its story? The story was published in response to research printed in a paper titled "Unidentified point sources in the IRAS minisurvey" (by Houck et al., published in Astrophysical Journal Letters, 278:L63, 1984). Dr. Gerry Neugebauer, co-investigator in the IRAS project, was interviewed and stated strongly that what IRAS had seen was not "incoming mail" (i.e. the results did not suggest there was an object approaching Earth). On reading this interesting research, I was especially drawn to the paper's conclusion:

"A number of candidate identifications have been considered including near-solar system, galactic, and extragalactic objects. Further observations at infrared and other wavelengths may provide additional information in support of one of these conjectures, or perhaps these objects will require entirely different interpretations." - Houck et al, Astrophysical Journal Letters, 278:L63, 1984.

Although these IRAS observations turned up mysterious objects, at this stage there was no indication that an object (let alone a brown dwarf) was powering its way toward us. But the rumours had already begun to flow. When follow-up papers were published in 1985 ("Unidentified IRAS sources: Ultrahigh-luminosity galaxies", Houck et al., 1985) and 1987 ("The IRAS View of the Extragalactic Sky", Soifer et al., 1987), there was little if any media interest in their findings. According to these publications, most of the IRAS observations in the 1984 paper were distant, ultra-luminous young galaxies, and one was a filamentary structure known as "infrared cirrus" floating in intergalactic space. IRAS never observed any astronomical body in the outer reaches of the Solar System.

Orbital perturbations = Planet X
The bodies in the Kuiper Belt (Don Dixon)
In addition to the 1983 "discovery" of the Planet X brown dwarf, the 1992 Planet X claim goes something like this: "Unexplained deviations in the orbits of Uranus and Neptune point to a large outer solar system body of 4 to 8 Earth masses, on a highly tilted orbit, beyond 7 billion miles from the sun," - text from an un-cited NASA source on the "Planet X Forecast and 2012 Survival Guide" video.

Citing the discovery of planets through orbital perturbation measurements, Planet X advocates point to a NASA announcement that in 1992 there were indirect measurements of a planet some 7 billion miles from Earth. Alas, I cannot find the original source for this claim. The only big discovery NASA announced along these lines was that of the first major trans-Neptunian object (TNO), called 1992 QB1 (full details of the discovery of this "cubewano-class" object can be found in the original announcement transcript). It has a diameter of 200 km and is confined to the Kuiper Belt, a zone of minor planets (where Pluto lives) and asteroids stretching from 30 AU to 55 AU, just outside Neptune's orbit. Some of these bodies (like Pluto) cross the path of Neptune's orbit and are therefore designated as TNOs. These TNOs pose no threat to the Earth (in as much as they won't be leaving the Kuiper Belt to pay us a visit in 2012).

Since then, the Neptune orbital perturbations have been put down to observational error and have not been observed again… so there doesn't appear to be any object out there much bigger than the largest Kuiper Belt objects. Still, to keep an open mind, there could be more large bodies yet to be discovered (that might explain why there is such a steep drop-off of Kuiper Belt objects at the "Kuiper Cliff"; the jury is out on that idea), but there is no evidence for a massive body approaching from the vicinity of the Kuiper Belt. Even the strange Pioneer anomaly that the Pioneer and Voyager probes are experiencing cannot be attributed to Planet X. The anomaly appears to be a Sun-ward acceleration; if there were a massive planet out there, there would be some gravitational effect beyond what is predicted from the other known objects in the Solar System.

4-8 Earth masses = a brown dwarf? It must be Planet X.
Brown dwarfs are 15-80 times the mass of Jupiter (NASA)
Probably the most glaring inconsistency in the Planet X hypothesis is its advocates' assertion that the 1984 IRAS object and the 1992 body are one and the same thing. As announced on many websites and in online videos about Planet X, the 1984 IRAS observation saw Planet X at 50 billion miles from Earth. The 1992 NASA "announcement" put Planet X at a distance of about 7 billion miles from Earth. Therefore, the logic goes, Planet X had travelled 43 billion miles in the course of only eight years (from 1984 to 1992). After some dubious mathematics, Planet X is therefore expected to reach the core of the Solar System in 2012. (Although many believed it should arrive in 2003… they were obviously wrong about that prediction.)
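Taking the advocates' own figures at face value makes the 2012 arrival date look even shakier. A purely illustrative check (both distances are the advocates' claims, not real measurements):

```python
# Illustrative check using the Planet X advocates' own (dubious) figures.
miles_1984 = 50e9   # claimed distance from Earth in 1984
miles_1992 = 7e9    # claimed distance from Earth in 1992
years = 1992 - 1984

speed = (miles_1984 - miles_1992) / years  # implied speed, miles per year
print(speed)  # about 5.4 billion miles per year

# At that speed, the remaining 7 billion miles would be covered in:
years_left = miles_1992 / speed
print(round(years_left, 1))  # about 1.3 years, i.e. arrival around 1993, not 2012
```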

Well, I think we might be clutching at straws here. For starters, for the 1984 object to be the same as the 1992 object, surely they should have the same mass? If Planet X is a brown dwarf (as we are led to believe from the IRAS observations), how can it possibly weigh in at only 4 to 8 Earth masses eight years later? Brown dwarfs have a mass of around 15-80 Jupiter masses. As Jupiter is about 318 Earth masses, surely the object hurtling toward us should have a mass somewhere between 4,770 and 25,440 Earth masses? So I am going to go out on a limb here and say that I reckon the 1984 object and the 1992 object (if either object actually existed, that is) are not the same thing. Not by a very long shot.
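The mass mismatch above is simple multiplication:

```python
# Convert the article's quoted brown-dwarf mass range into Earth masses.
JUPITER_IN_EARTH_MASSES = 318  # Jupiter is roughly 318 Earth masses

low = 15 * JUPITER_IN_EARTH_MASSES   # lightest plausible brown dwarf
high = 80 * JUPITER_IN_EARTH_MASSES  # heaviest plausible brown dwarf
print(low, high)  # 4770 25440 -- versus the 4-8 Earth masses claimed in 1992
```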

If there is no evidence supporting Planet X, it must be a conspiracy
If it can be this easy to cast the fundamental "scientific" theory behind Planet X into doubt, I see little point in discussing the historical reasons (mass extinctions, volcanic activity, earthquakes etc.) as to why the doomsayers believe Planet X should exist. If there is no renegade planet out there of significant mass, how can Nibiru be a threat to us in 2012?

They would have us believe there is a global conspiracy of international governments hiding the facts from us. NASA is involved in the cover-up, hence the lack of evidence. In my opinion, the simple absence of evidence doesn't mean there is a conspiracy to hide the truth from the public. And why would governments want to hide a "discovery" as historic as a doomsday planet approaching the inner Solar System anyway? To avoid mass panic and pursue their own greedy agendas (obviously).

As it turns out, this is the only strength of the Planet X myth. When confronted with scientific facts, the Planet X advocates reply that "…governments are sending out disinformation and covering up the true observations of Nibiru." Although I enjoy a good conspiracy theory, I will not support one in the name of Planet X. If the basic science that we are led to believe underpins the existence of Planet X is wrong, it seems a poor argument to say "the government did it".

Therefore, the story that Planet X will arrive in 2012 is, in my view, total bunkum (but it helps to sell doomsday books and DVDs by scaring people). Nibiru will remain in the realms of Sumerian myth.

Original here

Why Do Astronauts Suffer From Space Sickness?


Researchers are working to understand how 'space sickness' (the nausea and disorientation experienced by many astronauts) develops. (Credit: NASA)

Rotating astronauts for a lengthy period provided researcher Suzanne Nooij with better insight into how 'space sickness' (the nausea and disorientation experienced by many astronauts) develops.

Gravity plays a major role in our spatial orientation. Changes in gravitational forces, such as the transition to weightlessness during a space voyage, influence our spatial orientation and require adaptation by many of the physiological processes in which our balance system plays a part. As long as this adaptation is incomplete, this can be coupled to motion sickness (nausea), visual illusions and disorientation.

This 'space sickness' or Space Adaptation Syndrome (SAS), is experienced by about half of all astronauts during the first few days of their space voyage. Wubbo Ockels, the first Dutchman in space in 1986, also suffered from these symptoms.

Nooij will receive her PhD from TU Delft on this subject on May 20. In his capacity as TU Delft professor, Ockels is PhD supervisor for Suzanne Nooij's research.

Rotation

Interestingly, SAS symptoms can even be experienced after lengthy exposure to high gravitational forces in a human centrifuge, as is used for instance for testing and training fighter pilots. To experience this, people have to spend longer than an hour in a centrifuge and be subjected to gravitational forces three times higher than those on Earth. The rotation is in itself not unpleasant, but after leaving the centrifuge about half of the test subjects experience the same symptoms as space sickness causes. It also turns out that astronauts who suffer from space sickness during space flights experience these same symptoms following lengthy rotation on Earth.

This means that these symptoms are not caused by weightlessness as such, but more generally by adaptation to a different gravitational force.

Suzanne Nooij has studied these effects closely using the human centrifuge at the Centre for Man and Aviation in Soesterberg. Her results confirm the theory that both types of nausea (space sickness and after rotation) are caused by the same mechanism and also provide better insight into why the symptoms arise.

Otoliths

Logically, Nooij focused her research on the organ of balance. This is located in the inner ear and comprises semi-circular canals, which are sensitive to rotation, and otoliths, which are sensitive to linear acceleration. It has previously been suggested that a difference between the functioning of the left and right otolith contributes to susceptibility to sickness among astronauts. If this is the case, this should also apply after lengthy rotation.

Nooij tested this otolith asymmetry hypothesis. The otolith and semicircular canal functions on both sides were measured in fifteen test subjects known to be susceptible to space sickness. Those who suffered from space sickness following rotation proved to have higher otolith asymmetry and more sensitive otolith and canal systems. These people could not be classified as sensitive or non-sensitive on the basis of this asymmetry alone, but they could be on the basis of a combination of various otolith and canal features.

This demonstrates that the entire organ of balance is involved in space sickness and that it probably entails complex interactions between the various parts of the organ of balance.

Original here


Probe lands on Mars, NASA says

(CNN) -- The first pictures from NASA's Mars Phoenix Lander, which successfully touched down near Mars' north pole Sunday, showed a pattern of brown polygons as far as the camera could see.

The Mars Phoenix Lander took this image of the planet's surface at its landing site Sunday.

"It's surprisingly close to what we expected and that's what surprises me most," said Peter Smith, the mission's principal investigator. "I expected a bigger surprise."

The landing on the Red Planet's arctic plains -- which ended a 296-day journey -- was right on target, a feat NASA's Ed Weiler compared to hitting a hole-in-one with a golf ball from 10,000 miles away.

The landing -- dubbed the "seven minutes of terror" -- was a nerve-wracking experience for mission managers, who have witnessed the failure of similar missions.

In mission control at NASA's Jet Propulsion Laboratory in Pasadena, California, they celebrated the lander's much-anticipated entry.

"It was better than we could have imagined," Barry Goldstein, project manager for the Phoenix mission, told CNN.

The Phoenix's 90-day mission is to analyze the soils and permafrost of Mars' arctic tundra for signs of past or present life.

The lander is equipped with a robotic arm capable of scooping up ice and dirt to look for organic evidence that life once existed there, or even exists now.

"We are not going to be able to answer the final question of is there life on Mars," said principal investigator Peter Smith, an optical scientist with the University of Arizona. "We will take the next important step. We'll find out if there's organic material associated with this ice in the polar regions. Ice is a preserver, and if there ever were organics on Mars and they got into that ice, they will still be there today."

The twin of the Mars Polar Lander spacecraft, Phoenix was originally supposed to travel to Mars in 2001 as the Mars Surveyor spacecraft. Both were part of the "better, faster, cheaper" program, formulated by then-NASA Administrator Dan Goldin to beef up planetary exploration on a lean budget.

But Polar malfunctioned during its descent into Mars' atmosphere in 1999 and crashed. An investigation concluded that as many as a dozen design flaws or malfunctions doomed the spacecraft.

The failure of that mission, as well as that of another spacecraft, the Mars Climate Orbiter, the same year, led NASA to put future missions on hold and rethink the "better, faster, cheaper" approach. Mars Surveyor went to the warehouse.

But all was not lost. In 2003, Smith proposed a plan to re-engineer the Mars Surveyor and fly it on a mission to look for signatures of life in the ice and dirt of Mars' far north. Mars Phoenix, literally and figuratively, rose from the ashes of Surveyor.

Engineers set to work, testing and retesting the onboard system to ferret out and fix all the flaws they could find.

"We always have to be scared to death," Goldstein said. "The minute we lose fear is the minute that we stop looking for the next problem."

The team was concerned about the Phoenix landing system. NASA had not successfully landed a probe on Mars using landing legs and stabilizing thrusters since the Viking missions in the late 1970s. The other three successful Mars landings -- Pathfinder in 1997 and the Spirit and Opportunity rovers in 2004 -- used massive airbags that inflated around the landing craft just before landing to cushion the impact.

The Phoenix doesn't have airbags because the lander is too big and heavy for them to work properly.

Its landing site was targeted for the far northern plains of Mars, near the northern polar ice cap. Data from the Mars Odyssey spacecraft indicate large quantities of ice there, likely in the form of permafrost, either on the surface or just barely underground.

"Follow the water" has become the unifying theme of NASA's Mars exploration strategy.

In 2004, the rover Opportunity found evidence that a salty sea once lapped the shores of an area near Mars' equator called Meridiani Planum. Astrobiologists generally agree that it's best to look for life in wet places.

Original here

That’s it. Texas really is doomed.

Well, it’s truly official: Texas is doomed.

Why? I’ve talked before about the guy that’s the head of the State Board of Education. His name is Don McLeroy, and he’s perhaps the least qualified guy on the planet to head a BoE. He’s a creationist. He thinks science is evil. The list of his disqualifications to be in charge of a BoE would be so big… well, it would be Texas-sized big.

I predicted nothing but doom and shame for the BoE this year, and it brings me no joy at all to say I was right. McLeroy’s latest antic — though I would call it the first shot fired in a war, a war on reality — was over, of all things, the English standards. According to an article in the Dallas Morning News, teachers and experts had worked for two and a half to three years on new standards for English. So what did McLeroy do? He ignored all that work entirely, and let "social conservatives" on the board draft a new set overnight.

Overnight? Think that's better than standards that teachers and experts spent nearly three years on?

This new version cobbled together in a few hours was delivered to Board members an hour before the meeting in which they were to vote on it. An hour! In the meeting, McLeroy rammed through the discussion, even dismissing people who claimed he was going too quickly:

“Mr. Chair you’re going so fast … you’re moving so fast we can’t find it in the other document,” [board member Mary Helen] Berlanga said, shortly after the page-by-page explanation began.

After more complaints, McLeroy declared that he would continue at the fast pace.

“The ruling is you’re being dilatory in dragging this out,” McLeroy said.

What a guy! And now guess how this ends…

The board voted to approve the hastily cobbled-together standards, 9-6.

And if you’re not tired of guessing, then guess what discipline comes up next for review? Science!

We know where McLeroy stands there. Texas is actually and seriously looking down a cliff of educational repression that will doom the children there for the next decade. I really can’t be more serious about this. If I were a parent of a young child in Texas right now, I’d move out rather than let her be educated there.

FYI, McLeroy was appointed to his position by Texas Governor Perry, who apparently agrees with many if not all of McLeroy's positions. Mary Helen Berlanga (the board member quoted above) wrote a letter to Perry complaining vociferously and specifically about McLeroy. As described in her letter, incredibly, when McLeroy invited experts to testify before the board on the English standards, he didn't invite anyone with expertise in teaching Hispanic children, even though Hispanic children make up a huge portion (47%) of the state's schoolchildren.

I remind you, science is next on their chopping block, and McLeroy is a vocal and adamant anti-intellectual. He admits on his own page he is not a professional educator… but he is the head of the State Board of Education.

I have no clue whether it's too late to save Texas. I strongly urge anyone reading this who lives in Texas to write to Perry, to McLeroy, and to Berlanga (she could use the support) letting them know what you think. In fact, if I lived in Texas, I would ask for McLeroy's immediate resignation, or demand that Perry remove him.

And I remind you as well that Texas is a major force in determining curricula and textbook sales for the rest of this country. This can affect all of us. All of us.

I certainly hope it’s not too late to reverse this damage being done to the educational system in Texas. If not, then we may all be doomed.

Original here

Cold-fusion demonstration "a success"


On 23 March 1989 Martin Fleischmann of the University of Southampton, UK, and Stanley Pons of the University of Utah, US, announced that they had observed controlled nuclear fusion in a glass jar at room temperature, and, for around a month, the world was under the impression that its energy woes had been remedied. But even as other groups claimed to repeat the pair's results, sceptical reports began to trickle in. An editorial in Nature predicted that cold fusion would prove unfounded. And a US Department of Energy report judged that the experiments did "not provide convincing evidence that useful sources of energy will result from cold fusion."

This hasn't prevented a handful of scientists from persevering with cold-fusion research. They stand on the sidelines, diligently getting on with their experiments and, every so often, they wave their arms frantically when they think they have made some progress.

Nobody notices, though. Why? These days the mainstream science media wouldn't touch cold-fusion experiments with a barge pole. They have learnt their lesson from 1989, and now treat "cold fusion" as a byword for bad science. Most scientists agree, and some even go so far as to brand cold fusion a "pathological science" — science that is plagued by falsehood but practiced nonetheless.

There is a reasonable chance that the naysayers are (to some extent) right and that cold-fusion experiments in their current form will not amount to anything. But it's too easy to be drawn in by the crowd and overlook a genuine breakthrough, which is why I'd like to let you know that one of the handful of diligent cold-fusion practitioners has started waving his arms again. His name is Yoshiaki Arata, a retired (now emeritus) physics professor at Osaka University, Japan. Yesterday, Arata performed a demonstration at Osaka of one of his cold-fusion experiments.

Although I couldn't attend the demonstration (it was in Japanese, anyway), I know that it was based on reports published here and here. Essentially Arata, together with his co-researcher Yue-Chang Zhang, uses pressure to force deuterium (D) gas into an evacuated cell containing a sample of palladium dispersed in zirconium oxide (ZrO2–Pd). He claims the deuterium is absorbed by the sample in large amounts — producing what he calls dense or "pycno" deuterium — so that the deuterium nuclei become close enough together to fuse.

So, did this method work yesterday? Here's an email I received from Akito Takahashi, a colleague of Arata's, this morning:

"Arata's demonstration...was successfully done. There came about 60 people from universities and companies in Japan and few foreign people. Six major newspapers and two TV [stations] (Asahi, Nikkei, Mainichi, NHK, et al.) were there...Demonstrated live data looked just similar to the data they reported in [the] papers...This showed the method highly reproducible. Arata's lecture and Q&A were also attractive and active."

I also received a detailed account from Jed Rothwell, who is editor of the US site LENR (Low Energy Nuclear Reactions) and who has long thought that cold-fusion research shows promise. He said that, after Arata had started the injection of gas, the temperature rose to about 70 °C, which according to Arata was due to both chemical and nuclear reactions. When the gas was shut off, the temperature in the centre of the cell remained significantly warmer than the cell wall for 50 hours. This, according to Arata, was due solely to nuclear fusion.

Rothwell also pointed out that Arata performed three other control experiments: hydrogen with the ZrO2–Pd sample (no lasting heat); deuterium with no ZrO2–Pd sample (no heating at all); and hydrogen with no ZrO2–Pd sample (again, no heating). Nevertheless, Rothwell added that Arata neglected to mention certain details, such as the method of calibration. "His lecture was very difficult to follow, even for native speakers, so I may have overlooked something," he wrote.

It will be interesting to see what other scientists think of Arata's demonstration. Last week I got in touch with Augustin McEvoy, a retired condensed-matter physicist who has studied Arata's previous cold-fusion experiments in detail. He said that he has found "no conclusive evidence of excess heat" before, though he would like to know how this demonstration turned out.

I will update you if and when I get any more information about the demonstration (apparently there might be some videos circulating soon). For now, though, you can form your own opinions about the reliability of cold fusion.

Original here

Mind reading may reveal mother tongue

Scientists decipher language proficiency by analyzing brain activity

Alice Mado Proverbio / AP
This computer-generated image shows an electrical response evoked in a subject's brain while looking at words flashed on a screen during an experiment.

ROME - No one can read our thoughts, for now, but some scientists believe they can at least figure out in what language we do our thinking.

Before we utter a single word, experts can gauge our mother tongue and the level of proficiency in other languages by analyzing our brain activity while we read, scientists working with Italy's National Research Council say.

For more than a year, a team of scientists experimented on 15 interpreters, revealing what they say were surprising differences in brain activity when the subjects were shown words in their native language and in other languages they spoke.

The findings show how differently the brain absorbs and recalls languages learned in early childhood and later in life, said Alice Mado Proverbio, a professor of cognitive electrophysiology at the Milano-Bicocca University in Milan.

Proverbio, who led the study, said such research could help doctors communicate with patients suffering from amnesia or diseases that impair speech. It could also be of use one day in questioning refugee applicants or terror suspects to determine their origin, she said.

Switching languages alters brain waves
The interpreters who took part in the study were all Italians working for the European Union and translating in English and Italian.

"They were extremely fluent in English," Proverbio said in a telephone interview earlier this month. "We didn't expect a big difference in brain activity" when they switched from one language to another.

The subjects were asked to look at a screen that flashed words in Italian, English, and German, as well as nonsensical letter combinations. They were not aware of the purpose of the study and were simply tasked with pressing a button when they spotted a specific symbol, Proverbio said.

Meanwhile, researchers monitored them using an electroencephalograph, or EEG, which measures the brain's electrical activity through electrodes placed on the scalp. The EEG readout was fed into a computer program that pinpointed the time, intensity and location of the responses evoked in the subjects' brains by each word.

About 170 milliseconds after a word was shown, the researchers recorded a peak in electrical activity in the left side of the brain, in an area that recognizes letters as part of words before their meaning is interpreted.

These brain waves had a much higher amplitude when the word was in Italian, the language the interpreters had learned before age five.
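The analysis pipeline described here, averaging many word-locked EEG epochs and then locating the peak response near 170 milliseconds, can be sketched in a few lines. This is an illustrative sketch on synthetic data; the sampling rate, trial count, and waveform below are assumptions for demonstration, not parameters from Proverbio's study.

```python
import numpy as np

def erp_peak(epochs, sfreq, window=(0.12, 0.22)):
    """Average word-locked EEG epochs and find the peak deflection.

    epochs : array (n_trials, n_samples), one row per word shown,
             time-locked so sample 0 is word onset.
    sfreq  : sampling rate in Hz.
    window : search window in seconds, bracketing the ~170 ms component.
    Returns (latency in seconds, amplitude) of the largest deflection.
    """
    erp = epochs.mean(axis=0)                  # averaging cancels random noise
    lo, hi = (int(t * sfreq) for t in window)
    idx = lo + np.argmax(np.abs(erp[lo:hi]))
    return idx / sfreq, erp[idx]

# Synthetic demo: 200 noisy trials with a deflection peaking at 170 ms.
rng = np.random.default_rng(0)
sfreq = 1000                                        # 1 kHz sampling (assumed)
t = np.arange(0.5 * sfreq) / sfreq                  # 500 ms of data per trial
signal = -4e-6 * np.exp(-((t - 0.17) / 0.02) ** 2)  # 4 microvolt bump, 170 ms
epochs = signal + 5e-6 * rng.standard_normal((200, t.size))

latency, amp = erp_peak(epochs, sfreq)
print(f"peak at {latency * 1000:.0f} ms")           # close to 170 ms
```

Comparing the amplitude of this peak across the native-language and second-language conditions is, in essence, what the study's EEG software did at much larger scale.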

"The research suggests the differences between the two languages are at a very fundamental level," said Joseph Dien, a psychology professor at the University of Kansas who was not involved in the study.

Mother tongue sparks more brain activity
Proverbio attributed the differences to the fact that the brain absorbs the mother tongue at a time when it is also storing early visual, acoustic, emotional and other nonlinguistic knowledge. This means that the native language triggers a series of associations within the brain that show up as increased electrical activity.

"Our mother tongue is the language we use to think, dream and feel emotion," Proverbio said.

Offering an example, she said that an English-speaking child would associate the word "knife" with a sharp, cold object that is dangerous and should only be used by adults, while these links would be much weaker and more indirect when that person learned the same word in another language later in life.

The only exception would be for those bilingual individuals who learn an extra language before age five.

The findings by Proverbio's team were published earlier this year in the journal Biological Psychology and have surprised some scientists, particularly because the differences in brain activity show up at a point in the thought process when the brain hasn't yet interpreted the meaning of the words.

Results surprise scientists
"I didn't expect such differences at the very beginning of the process," Dien said in a telephone interview.

"They emerge at a very early level of comprehension," he said. "It will take a lot more work to work out the implications of that."

Dien said further research in the area could help understand and treat learning disabilities like dyslexia.

The Italian study also showed links between brain activity and proficiency in other languages. The differences showed up when the translators were shown words in English and in German, a language they knew at a more basic level, Proverbio said.

In this case, the differences in intensity and duration of the brain's activity were seen some 250 milliseconds after a word was shown, and were traced to areas of the brain used to understand the meaning of words.

This phenomenon had already been documented in previous studies, which, however, had not detected any difference between the mother tongue and other languages spoken with high proficiency. That finding had suggested that with some effort "we could all become perfectly bilingual," Proverbio said. "Unfortunately, that's not true."

© 2008 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.
Original here

Flamboyant archeologist believes he has identified Cleopatra's tomb


Hawass, at work in 2005, now thinks he has found the queen’s tomb

A flamboyant archeologist known worldwide for his trademark Indiana Jones hat believes he has identified the site where Cleopatra is buried.

Now, with a team of 12 archeologists and 70 excavators, Zahi Hawass, 60, the head of Egypt’s Supreme Council of Antiquities, has started searching for the entrance to her tomb.

And after a breakthrough two weeks ago he hopes to find her lover, the Roman general Mark Antony, sharing her last resting place at the site of a temple, the Taposiris Magna, 28 miles west of Alexandria.

Hawass has discovered a 400ft tunnel beneath the temple containing clues that the supposedly beautiful queen may lie beneath. “We’ve found tunnels with statues of Cleopatra and many coins bearing her face, things you wouldn’t expect in a typical temple,” he said.

A fortnight ago Hawass’s team discovered a bust of Mark Antony, the Roman general who became Cleopatra’s lover and had three children with her before their ambitions for an Egyptian empire brought them into conflict with Rome.

They committed suicide – he with his sword, she reputedly by clutching an asp to her breast – after being defeated by Octavian in the battle of Actium in 31 BC. "Our theory is that both Cleopatra and Mark Antony are buried here," said Hawass.

The archeologist, best known in Britain for demanding the return of the Rosetta Stone from the British Museum and for promoting the Tutankhamun exhibition at the O2 dome in London, believes the temple’s location would have made it a perfect place for Cleopatra to hide from Octavian’s army.

Work on the site has been suspended until the summer heat abates and is due to resume in November, when Hawass will use radar to search for hidden chambers.

The queen’s life and death were immortalised in Shakespeare’s Antony and Cleopatra, and the Hollywood movie Cleopatra – starring Elizabeth Taylor and Richard Burton, who fell in love during the filming – but the location of her tomb has remained a mystery.

If Hawass is right, he could make the greatest archeological discovery in Egypt since Tutankhamun’s tomb was uncovered by the British archeologist Howard Carter in 1922.

Other experts are cautious, though. John Baines, professor of Egyptology at Oxford University, warned that searching for royal tombs often proved a “hopeless” task. He also doubted that Antony would be buried alongside his lover.

“It’s unlikely Mark Antony would have a tomb that anyone would be able to discover because he was the enemy at the time he died,” he said.

Hawass, however, remains defiant. “This is our theory. Others may disagree, but we are searching to see if we can prove it,” he said.

Original here

Hot Life-Forms Found a Mile Under Seafloor

Life-forms have been found thriving a mile (1.6 kilometers) beneath the seafloor in hot sediments, a new study says. The finding doubles the maximum known depth for organisms under the ocean bottom—and may be an encouraging sign for the search for life on other planets.

At 140 to 212 degrees Fahrenheit (60 to 100 degrees Celsius), the microscopic life-forms are probably also the hottest life-forms yet found in seafloor sediments, according to study co-author R. John Parkes, a microbiologist at Cardiff University in the United Kingdom.

(Related: "Hottest Life-Form Found: Microbe Thrives When Boiling" [May 21, 2004].)

The scientists examined core samples of sediments in the North Atlantic Ocean and found microbes known as prokaryotes.

Many of the prokaryotes share characteristics with "extremophiles," which live in hot springs, both under the sea and in areas such as Yellowstone National Park.

The microbes appear to make their livings by metabolizing methane and other hydrocarbons created as the Earth's interior heat warms organic material in the sediments, Parkes said.

"That's what we think they're using as an energy source."

The organisms do not appear simply to have been dormant microbes trapped in the sediments, Parkes added, but instead appear to be thriving.

The discovery supports predictions that as much as 70 percent of the Earth's prokaryotes may live in seabed sediments, some of which can be several miles thick.

All told, Parkes said, these prokaryotes could amount to 10 to 30 percent of the world's total living matter.

Life on Mars?

The find is also significant for the search for life on Mars and other planets. Our "surface centric" view of life on Earth, Parkes said, may mean we're looking in the wrong places for life elsewhere.

"There are [nonbiological] sources that can produce methane [and related chemicals]," he said. "Therefore there might be a biosphere on other planets that may not require" the ability to harness sunlight on a planet's surface for energy.

Other scientists agree.

"The more places we look for life, the more places we find it," said Dennis Geist, a University of Idaho geologist who was not involved in the study. "This [new study] furthers the notion that the days of limiting our search for new life to surface conditions are long gone," he continued by email.

"The findings of this work push the limits in terms of both pressure and temperature."

In fact, Geist notes, the drilling for the core samples used in the study began on a seafloor 2.8 miles (4.5 kilometers) beneath the waves, where the water temperature is barely above freezing. The drilling ended at a point beneath the ocean bed where the temperature is nearly equal to that of a boiling pot of water.

"The range of physical conditions is enormous."

Original here

Behind a scientific success, a failed Texas experiment

COLLEGE STATION — Under a leaden sky that mirrored his mood, physicist Peter McIntyre eyed a long submarine-shaped magnet resting on the ground.

Had the Superconducting Super Collider been completed south of Dallas as planned, the magnet would now lie in a 54-mile-long tunnel, accelerating bits of matter to near the speed of light, producing spectacular collisions and re-creating conditions that existed at the universe's beginning.

But after spending billions of dollars, Congress axed the SSC 15 years ago.

The magnet now sits outside McIntyre's lab near the Texas A&M University campus, a weathered reminder of what might have been and an apt metaphor for the state of U.S. high-energy physics.

"It's just a shame," McIntyre said.

Later this year, the Switzerland-based CERN laboratory is scheduled to fire up a new particle accelerator near Geneva. By laying claim to the world's most powerful collider, Europeans will wrest leadership in high-energy physics away from the U.S. after 80 years of American hegemony.

"Europe's now playing in the major leagues, and we're in the minors," said A&M physicist Bhaskar Dutta.

American physicists have dominated the field since the 1930s, when Ernest Lawrence and colleagues at the University of California at Berkeley developed the first cyclotron, an early particle accelerator.

Atomic secrets

The device allowed scientists to discover atomic secrets by accelerating protons to high speeds, slamming them into a target and studying new particles the collisions produced.

The discoveries led, in part, to the development of the atomic bomb. Later spinoffs included the development of cancer therapies and the processing of a host of materials, such as semiconductor chips.

After World War II, the United States built a number of ever more powerful accelerators, culminating with the Tevatron at Fermi National Accelerator Laboratory near Chicago.

The machines seek to create highly energetic collisions between atomic particles. Only at high energies do the smallest, most exotic particles, which existed at the beginning of the universe, briefly appear.

Existing accelerators already drive particles to nearly the speed of light, so the only way to reach higher collision energies is to build larger rings through which the particles travel.

The Tevatron ring measures about 4 miles in circumference. The SSC ring was to have been 54 miles in circumference, producing collisions 20 times more intense than the Tevatron's.

The new European accelerator, called the Large Hadron Collider, will not be as powerful as the mighty SSC would have been. The Large Hadron Collider's ring, about 17 miles in circumference, should be capable of producing collisions about one-third as powerful.

"The SSC would unquestionably have made Texas an exciting center of fundamental research," said Steven Weinberg, a Nobel laureate and physics professor at the University of Texas at Austin. "We terribly regret this wasted opportunity. I'm even sorry for the farmers in Ellis County who had to give up their houses and move away for no reason."

The deserted scene

A drive to the main, 135-acre site just west of Waxahachie yields just such a sensation of waste. Amid a pastoral countryside where corn fields mix with scattered country homes, the SSC's main buildings rise incongruously above the plain. The boxy, forlorn structures could house large commercial airplanes.

The tunnels were filled in long ago. The site, for all practical purposes, is abandoned. So little used are the surrounding roads that truck-driving instructors use them for training.

Before the project's cancellation, about 16,000 acres of land were condemned, including about 90 homes. Afterward, the land and buildings were largely deeded to the state, which in turn transferred the property to Ellis County. Bits and pieces were sold.

However, the main buildings and an accompanying 135 acres of land remained unsold until 2006, when a group that included J.B. Hunt, founder of the billion-dollar trucking empire, purchased the property with the intent of marketing it as a high-tech secure data center. After Hunt's death, his estate abandoned those plans.

Earlier proposals — for prisons, schools, movie studios and a Veterans Administration facility — met a similar fate.

"I'm not happy about it, not one bit," said one of the SSC's original champions, U.S. Rep. Joe Barton, a Republican whose district includes Ellis County.

The despair stands in stark contrast to the mood in Europe.

"The attitude here is one of wild enthusiasm," said Paul Padley, a Rice University physicist in charge of building a $40 million collision detector for the Large Hadron Collider.

"We're motivated by the physics questions we're trying to answer, and we're willing to move heaven and Earth to get the experiment built to answer these fundamental questions about the universe," Padley said.

The United States has contributed hundreds of millions of dollars to the European collider, which may cost as much as $10 billion, giving American scientists a stake in the project.

"Still, it's incredibly hard for Americans to be effective on a European experiment," said David Toback, an A&M physicist who has worked on the Tevatron and now works on one of the Large Hadron Collider experiments.

That's because Europeans will generally run the large experimental collaborations, interpret the results and publish them. They'll get the lion's share of glory.

The SSC's cancellation followed more than a decade of planning and construction. McIntyre, a magnet designer, was among the earliest evangelists for a Texas accelerator, and he helped arrange an early meeting with then-Vice President George H.W. Bush to prod the project along.

Bush helped put it on the fast track. But internal and external forces began working against the project, said Neal Lane, a Rice physicist who served on the SSC's board of overseers and later as President Bill Clinton's science adviser.

From 1987 to 1993, the project's estimated price tag ballooned from about $4.4 billion to as much as $12 billion. Some physicists say this reflected poor management.

But other factors were involved, Lane said. Expected money from external sources, such as Japan, never came. In the early 1990s, budget cutting was in vogue. And the Texas delegation, with Lloyd Bentsen leaving the Senate to become secretary of the Treasury, lost some of its clout.

As the Cold War ended, the SSC lost support to the international space station, which had a comparable cost and offered an opportunity for rapprochement with the Russians.

"Congress had to have some symbol of fiscal restraint, and we were it," said Roy Schwitters, a UT physicist and the SSC's director.

Pulling the plug

So, after spending more than $2 billion and digging 14 miles of tunnels, Congress canceled the project in October 1993.

Science fiction author Bruce Sterling captured the mood among physicists during a visit to Waxahachie.

"To say that morale is low at the SSC Labs does not begin to capture the sentiment there," he wrote in an essay titled "The Dead Collider."

At the time, 2,000 people remained at the project, winding it down. They stayed, Sterling wrote, "because, despite their alleged facility at transforming themselves into neurophysiologists, arms control advocates, et al., there is simply not a whole lot of market demand anywhere for particle physicists, at the moment."

The loss extended to Waxahachie and the state in general. The accelerator would have attracted thousands of physicists and brought a new economic and cultural dimension to the area south of Dallas, leading to spinoff computer and cryogenic companies, Lane said.

The opening of the Large Hadron Collider comes at an especially bleak time for U.S. high-energy physics.

Earlier this month, 20 of America's physics Nobel laureates sent a letter to President Bush, urging him to restore half a billion dollars in fiscal year 2008 science funding. As a result of the cuts, physicists say, hundreds of scientists have been laid off and research grants have been slashed.

One surviving lab

In the last few years, the number of high-energy physics labs in the United States has been reduced from three to one: Fermilab, which is itself facing a diminished budget.

The "brain drain" that brought brilliant European physicists such as Albert Einstein and Enrico Fermi to America in the 1930s appears to be reversing, U.S. physicists say.

"The entirety of U.S. high-energy physics is at very significant risk," said Al McInturff, who helped develop magnets for the Tevatron, SSC and the Large Hadron Collider. "It's just a very, very painful situation."

"That's something of an understatement," McIntyre added.

Original here

Interactive Web sites draw minds, shape public perception

University Park, Pa. -- The interactive look and feel of a corporate website could help shape positive perceptions about the organization if the site includes a likeable design and features that engage the target audience, especially job seekers, according to media researchers.

S. Shyam Sundar, professor of film, video and media studies at Penn State, and Jamie Guillory, formerly an undergraduate student at Penn State, are trying to understand how interactivity in websites influences the public perception of an organization. In previous studies of political candidates' websites, Sundar had found that candidates were rated more positively if their sites had some interactive features, even though the sites had no new content and the candidates held the same policy positions. But too much interactivity tends to turn people off.

"Websites with low to medium levels of interactivity create positive perceptions but for medium to high interactivity, it actually falls down," said Sundar. "In general, too much interactivity is not desirable, and may lead to information overload."

Whatever a site's effects, positive or negative, interactivity acts as a volume knob that boosts them, he explained, noting, "Just through the presence of such features, people attribute meaning to the content or the nature of the site."

The Penn State researchers wanted to see if the same effect holds true even if the people viewing the website are highly engaged, or whether they form their opinions based on bells and whistles on a website only when they do not know enough about a topic.

In the current study, 116 undergraduate students were randomly assigned to one of seven websites representing low, medium, and high levels of interactivity. The students were specifically assigned to review the career section of these organizations because these sites require a higher level of involvement.

Features on these sites ranged from links for job inquiries and information on specific jobs to online application forms and video footage of the company and its employees.

Students then answered a questionnaire on their perceptions of an organization based on their experience with its website. The study results show that there is a significant positive relationship between the level of interactivity on a career website and job seekers' perception of that organization.

"We found that college students looking for a job are more likely to apply to companies that have interactive websites with bells and whistles," said Sundar, who presented his findings today (May 25) at the 58th annual conference of the International Communication Association (ICA) in Montreal. "But the students use these features to make a logical connection."

The work received a Top Paper award from the association’s Public Relations division.

"We found that both liking and involvement are significant mediators such that people who saw a high interactive website liked it more, and they also got involved as a result of liking it more," he added.

The findings may have important implications for organizations. For instance, by simply tweaking the features on the website and without changing any of the content, a company could project a positive image to its targeted demographic.

In other words, the website of an organization could feature an optimal amount of interactivity specifically tailored to its target audience, and thereby control the impressions that people form of that organization.

But Sundar also cautions against being taken in by fancy websites that promise much and deliver little.

"We have uncovered a psychological phenomenon here, that is, the more interactive something is, the more people -- especially college students -- are likely to buy into whatever is being advocated," said Sundar, who is also a founder of the Penn State Media Effects Research Laboratory. "We are trying to warn them against that potential danger."

Researchers say the next step is to figure out all the different meanings people attach when faced with new responsive features.

"Interactivity is multi-faceted in terms of the meanings it communicates. It is not just about interaction alone," added Sundar.

Original here

Scientists image a single HIV particle being born

A mapmaker and a mathematician may seem like an unlikely duo, but together they worked out a way to measure longitude – and kept millions of sailors from getting lost at sea. Now, another unlikely duo, a virologist and a biophysicist at Rockefeller University, is making history of its own. By using a specialized microscope that only illuminates the cell’s surface, they have become the first to see, in real time and in plain view, hundreds of thousands of molecules coming together in a living cell to form a single particle of the virus that has, in less than 25 years, claimed more than 25 million lives: HIV.

This work, published in the May 25 advance online issue of Nature, may not only prove useful in developing treatments for the millions around the globe still living with the lethal virus, but the technique created to image its assembly may also change the way scientists think about and approach their own research.

“The use of this technique is almost unlimited,” says Nolwenn Jouvenet, a postdoc who spearheaded this project under the direction of HIV expert Paul Bieniasz and cellular biophysicist Sandy Simon, who has been developing the imaging technique since 1992. “Now that we can actually see a virus being born, it gives us the opportunity to answer previously unanswered questions, not only in virology but in biology in general.”

Unlike a classical microscope, which shines light through a whole cell, the technique called total internal reflection microscopy only illuminates the cell’s surface where HIV assembles. “The result is that you can see, in exquisite detail, only events at the cell surface. You never even illuminate anything inside of the cell so you can focus on what you are interested in seeing the moment it is happening,” says Simon, professor and head of the Laboratory of Cellular Biophysics.

When a beam of light passes through a piece of glass to a cell’s surface, the energy from the light propagates upward, illuminating the entire cell. But when that beam is brought to a steeper angle, the light’s energy reflects off the cell’s surface, illuminating only the events going on at its outermost membrane. By zeroing in on the cell’s surface, the team became the first to document the time it takes for each HIV particle, or virion, to assemble: five to six minutes. “At first, we had no idea whether it would take milliseconds or hours,” says Jouvenet. “We just didn’t know.”
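The reflection the article describes is total internal reflection, which Snell's law predicts whenever light traveling in glass strikes a lower-index medium at an incidence angle beyond the critical angle, arcsin(n_sample / n_glass). A minimal sketch of that threshold, using typical textbook refractive indices rather than figures from the paper:

```python
import math

def critical_angle(n_glass, n_sample):
    """Minimum incidence angle (degrees, measured from the normal) for
    total internal reflection at a glass/sample interface (Snell's law)."""
    if n_sample >= n_glass:
        raise ValueError("TIR requires light to strike a lower-index medium")
    return math.degrees(math.asin(n_sample / n_glass))

# Typical values: coverslip glass ~1.52, aqueous cell interior ~1.38.
theta_c = critical_angle(1.52, 1.38)
print(f"{theta_c:.1f} degrees")  # roughly 65 degrees
```

Beyond that angle only an evanescent field leaks across the interface, decaying within roughly 100 nanometers, which is why the microscope picks out molecules at the membrane while everything deeper in the cell stays dark.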

“This is the first time anyone has seen a virus particle being born,” says Bieniasz, who is an associate professor and head of the Laboratory of Retrovirology at Rockefeller and a scientist at the Aaron Diamond AIDS Research Center. “Not just HIV,” he clarifies, “any virus.”

To prove that what they were watching was virus particles assembling at the surface (rather than an already assembled virion coming into their field of view from inside the cell), the group tagged a major viral protein, called the Gag protein, with molecules that fluoresce, but whose color would change as they packed closer together. Although many different components gather to form a single virion, the Gag protein is the only one necessary for assembly. It attaches to the inner face of the cell’s outer membrane and when enough Gag molecules flood an area, they coalesce in a way that spontaneously forms a sphere.

Simon, Bieniasz and Jouvenet found that the Gag molecules are recruited from the inside of the cell and travel to the cell’s surface. When enough Gag molecules get close and start bumping into each other, the cell’s outer membrane starts to bulge outward into a budding virion and then pinches off to form an individual, infectious particle. At this point, the researchers showed that the virion is a lone entity, no longer exchanging resources with the cell. By using tricks from optics and physiology, they were able to watch the steps of viral assembly, budding, and even scission off the cell surface. With such a view they can start to describe the entire lifeline in the birth of the virus.

“I think that you can begin to understand events on a different level if you actually watch them happen instead of inferring that they might occur using other techniques,” says Bieniasz. “This technique and this collaboration made that possible.”

Original here

Tracing Humanity's Path


Coming to America.

A new study suggests that humans arrived in North and South America in multiple waves.

Most researchers agree that modern humans got their start in Africa and then spread throughout the world beginning about 50,000 years ago. But scientists are still working out the details of how the planet was peopled, such as who went where, and when. A new study, employing sophisticated modeling techniques, confirms the prevailing Out of Africa model but also comes up with some surprises, including evidence that the Americas' first human inhabitants arrived in multiple waves.

Archaeologists and anthropologists worldwide have dug up plenty of skeletons over the years, but the bones seldom say much about where ancient peoples originally came from. Thus researchers have tried using variations in the genes of living individuals to trace their ancestries back to prehistoric times. In general, the closer two modern populations are genetically, the more likely that they share a common ancestry; yet this ancestral heritage is sometimes obscured by genetic changes that have taken place over thousands of years, as well as by interbreeding between populations. Happily, efforts to get around these complications have been boosted by an ever-growing mound of data about genetic differences between human populations.

A team led by geneticist Daniel Falush of University College Cork in Ireland developed a new mathematical model to compare not just individual genes or short DNA segments, as previous studies have done, but also very long stretches of DNA. Falush and his colleagues analyzed 32 DNA segments, each consisting of more than 300,000 base pairs, from 927 people representing 53 different populations from around the globe. Plugging this huge amount of data into computer simulations, the team worked out which migration scenarios were most likely to have created the genetic variation we see today. The results, reported today in PLoS Genetics, suggest that modern humans peopled the world in nine phases, beginning in Africa, moving on to Europe and Asia, and finally colonizing the Americas and the Pacific islands. (The team illustrates humanity's journey in two movies accompanying the paper.) The team did not try to date the migrations.

The study came up with two unexpected findings. One is that the people of the Orkney Islands, to the north of Scotland, share some ancestry with Siberians, possibly because some ancestors of modern Orcadians ventured to Asia via the Arctic Circle. The team also found that North and South America were colonized independently by at least two different waves of migration from different parts of Asia, although both waves appear to have arrived via the Bering Strait. This conclusion contradicts the conventional view, which postulates just one migratory wave out of Asia.

"I like the paper very much," says Jonathan Pritchard, a human geneticist at the University of Chicago in Illinois. "It's a very novel and creative way of thinking about the data" that "may provide a better representation of human history." Ripan Malhi, a molecular anthropologist at the University of Illinois, Urbana-Champaign, says that the team's approach "holds great potential to give us important and novel insights into the peopling of the Americas." Nevertheless, Malhi cautions that the multiple migrations Falush and his colleagues detect in the Americas might be an artifact of ancient population movements "more complex than the simple models created in this study can accommodate."

Original here

Beavers to return after 400 years

Up to four beaver families will be released at lochs in Argyll

The European beaver is to be reintroduced to Scotland for the first time in more than 400 years, the Scottish Government has announced.

Environment Minister Michael Russell has given the go-ahead for up to four beaver families to be released in Knapdale, Argyll, on a trial basis.

The beavers will be caught in Norway and released in spring 2009.

Mr Russell said: "This is an exciting development for wildlife enthusiasts all over Scotland and beyond."

The beavers, which will be captured in autumn 2008, will be put into quarantine for six months before three to four families are released. Five lochs have been proposed for the release.

This will be the first-ever formal reintroduction of a native mammal into the wild in the UK.


The trial will be run over five years by the Scottish Wildlife Trust and the Royal Zoological Society of Scotland, with Scottish Natural Heritage (SNH) monitoring the project.

Mr Russell added: "The beaver was hunted to extinction in this country in the 16th Century and I am delighted that this wonderful species will be making a comeback.

"They are charismatic, resourceful little mammals and I fully expect their reappearance in Knapdale to draw tourists from around the British Isles and even further afield.

"Other parts of Europe, with a similar landscape to Scotland, have reintroduced beavers and evidence has shown that they can also have positive ecological benefits, such as creating and maintaining a habitat hospitable to other species."

'Historic moment'

Scottish Natural Heritage will closely monitor the progress of the beavers over the next five years to consider the impact on the local environment and economy before any decision is made on a wider reintroduction.

Professor Colin Galbraith, director of policy for SNH, said: "The decision is excellent news. For the first time we will have the opportunity to see how beavers fit into the Scottish countryside in a planned and managed trial.

Knapdale Wildlife Reserve
The Knapdale Wildlife Reserve in Argyll where beavers will be released

"No other beaver reintroduction project in Europe has gone through such a long, and thorough, process of preparation, assessment and examination."

Prof Galbraith added that although beavers had been spotted in the wild in isolated cases, they had usually been caught and returned to zoos.

Allan Bantick, chairman of the Scottish Beaver Trial Steering Group, said it was a "historic moment" for wildlife conservation.

"By bringing these useful creatures back to their native environment we will have the chance to restore a missing part of our wetland ecosystems and re-establish much needed natural processes," he said.

David Windmill, chief executive of the Royal Zoological Society of Scotland, said: "It is a strong and visible sign of the Scottish Government's commitment to carrying out conservation in Scotland and re-building our depleted biodiversity."

Simon Milne, chief executive of the Scottish Wildlife Trust, said the challenge was now for the licence holders to raise funds for the project.

Original here

Tasmanian Devils Decimated by Mystery Cancer

Starting in the late 1990s Tasmanian wildlife authorities began receiving unusual reports: Some of the island's Tasmanian devils were spied with their faces marred by ulcerated sores. Small and furry, the carnivores are known for their unearthly howl and cranky temperament. They are also well loved as a species unique to Tasmania, an island outpost calved from mainland Australia.

But what was initially a bit of a scientific curiosity soon became a potential catastrophe. The results of the first investigative survey revealed that tens of thousands of Tasmanian devils had died from a disease now known as Devil Facial Tumour Disease.

Alistair Scott is a project leader for the devil disorder program at Tasmania's Department of Primary Industries, Water, and Environment. He says that initial survey revealed that the disease was running rampant through the eastern half of Tasmania.

"Mapping and monitoring showed it was across 65 percent of the state," Scott said. "The maximum population estimate [before the disease struck] was between 130,000 and 150,000. So from that, we believe we've lost up to 75,000. This has had a major impact on the devil population."

Devil Facial Tumour Disease causes cancers to appear first in and around the mouth before spreading down the neck and, sometimes, into the rest of the body. Adults of both sexes are more commonly affected than juveniles.

Sick devils also become emaciated, because the tumors interfere with eating, and many mothers lose their young.

The animals can die within six months of the appearance of the first sores, and in some areas whole populations have been wiped out within 18 months.

Mystery Disease

A special Tasmanian devil-dedicated laboratory has been set up in the northern town of Launceston, where scientists are desperately trying to work out what causes the disease and how it is transmitted.

"At the moment the scientists are working on the hypothesis that the disease is physically transferred," Scott explained.

"That has only been recorded once before in the animal world, and that was a venereally transmitted canine cancer. It's a unique disease we're dealing with. Once it's detected—and even then, that is only visually—the animal is on death row. They are dead three to six months later."

The team is working on the disease's chromosome structure and has just identified the karyotype. (Karyotypes are pictures of cellular chromosomes that are used to check for abnormalities.) The researchers say they have also been able to develop tumor cells in the lab, enabling them to run tests without actual tumors.

A diagnostic test and, ultimately, a vaccine are the scientific team's main aims. But while the scientists race to learn more about the disease, the Tasmanian government is trying to stop the disease from spreading farther.

To stop infected animals from moving in, trapping lines are being set in the north of Tasmania, an area that so far appears to be disease free.

The government is also considering whether to list the species as threatened under Australia's threatened-species laws. Threatened status would give the devils protection from other perils, such as habitat destruction caused by water pollution or land development.

The move would also require the government to draw up a plan for recovery of the species. Such a plan would focus on managing the disease where it already exists and on preventing it from affecting more animals.

The disease "is having a very serious impact on the wild population, but at this stage we don't believe that the species is headed for extinction," Scott said.

"In the areas where it has been present but [where] there's only a low-level population, it has had a major impact. One of the problems is that the devil is nocturnal, and many people never see it in the wild. So it's very difficult to pick up changes in the population."

Insurance Colony

The devil's plight has touched mainland Australia, where wildlife sanctuaries and zoos are being contacted about the possibility of establishing reserve populations of Tasmanian devils.

Melbourne's Healesville Sanctuary has been charged with keeping records of their Tasmanian devils' lineages, to ensure genetic diversity.

Adam Battaglia, a keeper at Taronga Zoo in Sydney, is preparing to send one of the zoo's two male devils to Tasmania.

"One male has been recalled to Tasmania because his genes are so valuable," Battaglia explained.

"He is going down to be part of the breeding program, so he will be very busy. All resources are being pulled back to the region. A national breeding program will be put back together because of what's happening in Tasmania."

"This example just validates other breeding programs we have like the platypus. They are common but there is always the fear something might happen."

There are 150 devils in captivity on mainland Australia and Tasmania. Making sure the animals stay healthy is a priority for their keepers.

If the situation among wild Tasmanian devils grows worse, captive devils may be moved to the mainland. But Scott, the Tasmanian environment official, says the island dwellers will remain where they are for the time being until more is known about the disease.

Original here

Re-energising the nuclear industry

Nuclear power has a key part to play in providing the UK with reliable, low carbon electricity in the future, argues Peter Bleasdale. In this week's Green Room, he says a new National Nuclear Laboratory will provide the research and training needed after decades of neglect and decline.


The UK government confirmed in January that it was in the country's long-term interest that nuclear power should play a role in providing Britain with clean, secure and affordable energy.

So why is nuclear power back on the national agenda? While there is no perfect answer and no perfect energy source, each method of generating electricity has advantages and disadvantages.

Like every other country, the UK is faced with the challenge of developing an energy programme that balances environmental issues, such as carbon emission reduction, with energy demand, security of supply and economics.

Secure supplies are especially important, as even short-lived power cuts can cause massive disruption. Nuclear is a traditional base-load supplier with high global reliability.

In the past, nuclear has been seen as a high-cost option when compared with other methods of electricity generation.

But a range of independent studies now show that full nuclear life-cycle costs are competitive with other sources. This competitiveness improves further when factors such as the desirability of meeting policy objectives for cleaner, more secure power sources are taken into account.

Back to school

Bearing all of this in mind, UK ministers now believe it to be in the public interest to allow energy companies the option of investing in new nuclear power stations.

Sizewell B nuclear power station (Image: AFP)
The UK has not built a nuclear power station for more than a decade

These stations are more efficient than those they will replace, and they will have a key role to play as part of the UK's energy mix.

The White Paper, Meeting the Energy Challenge, also restated the intention of establishing a National Nuclear Laboratory (NNL).

It will have the aim of providing the technologies and expertise to ensure the industry operates safely and cost effectively.

Ministers believe that the energy sector faces challenges in meeting the need for skilled workers in the research and development, design, construction and operation of new nuclear power stations.

The NNL will be aligned with national policy on skills and will support and work alongside the newly created National Skills Academy for Nuclear (NSAN). The NNL's role will be vital in building nuclear scientific skills.

In nuclear research and development, personnel numbers declined dramatically from about 9,000 in 1980 to just 1,000 a few years ago.

Nexia Solutions, on which the NNL will be based, has already taken action to address the skills decline by working closely with the academic sector.

The NNL will safeguard and develop key scientific and technical skills and facilities that cannot be reliably supplied by the external marketplace.

RadBall (Image: Nexia)
The "RadBall" is one innovation to improve nuclear plants' operations

It will build a technology skills pipeline back into industry, and will also have national and international influence in marketing and selling skills and technologies overseas.

One example of the NNL's innovative work already playing its part is the "RadBall", developed by Dr Steven Stanley, a research technologist.

In essence, it gathers information on the location, quantity and type of radiation in inaccessible areas, removing the need for people to enter them.

The information gathered through a polymer-based "crystal ball" device is then turned into meaningful data by using new software.

Waste worries

While nuclear power is a tried-and-tested, carbon-free technology, and new stations are better designed and more efficient than those being replaced, it is true that many countries, including most of Western Europe, decided to suspend new development in the recent past.

Sellafield nuclear processing facility (Image: BNFL)

Concerns about the economics of new stations, the possibility of accidents, slow progress in dealing with nuclear waste and the historical link between nuclear energy and weapons resulted in a loss of confidence among politicians and the public.

With the UK challenged to balance its commitment to tackling climate change against meeting rising energy demand and keeping the lights on, nuclear-generated electricity has returned to the energy agenda alongside other low-carbon technologies.

But, before new nuclear stations are given the go ahead, the government will have to be satisfied that effective arrangements are in place to manage and dispose of the waste they produce.

All wastes can have a negative effect if released into the surrounding environment, but the amount of waste produced by nuclear power is very small by industrial standards, and modern nuclear power stations are much more efficient than earlier examples.

Proportionally, the volume of radioactive waste needing to be managed will not increase very much, whether or not there is a future programme of nuclear power stations.


The quantity of nuclear waste is extremely small when compared to overall national volumes of all toxic wastes.

Given the diversity of the nuclear programme, radioactive waste varies in the amount of radiation it gives out, so it is categorised as low-level, intermediate-level or high-level.

A nuclear power station produces around 100 cubic metres of solid radioactive waste each year (about the volume of a lorry) and more than 90% of this is low-level.

Already there are significant volumes of historic wastes safely stored, and a programme of new reactors in the UK will only raise waste volumes by up to 10%.
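The waste-volume figures above lend themselves to a quick back-of-the-envelope check. The sketch below (Python, purely illustrative) uses the article's numbers - roughly 100 cubic metres of solid waste per station per year, more than 90% of it low-level - while the fleet size and operating lifetime are hypothetical assumptions, not figures from the piece.

```python
# Back-of-the-envelope waste-volume sketch using the figures quoted above.
# The per-station rate and low-level fraction come from the article; the
# fleet size and operating lifetime are illustrative assumptions only.

WASTE_PER_STATION_M3_PER_YEAR = 100   # "around 100 cubic metres ... each year"
LOW_LEVEL_FRACTION = 0.90             # "more than 90% of this is low-level"

fleet_size = 10          # assumed number of new reactors (hypothetical)
lifetime_years = 60      # assumed operating life per reactor (hypothetical)

total_m3 = WASTE_PER_STATION_M3_PER_YEAR * fleet_size * lifetime_years
low_level_m3 = total_m3 * LOW_LEVEL_FRACTION
higher_level_m3 = total_m3 - low_level_m3

print(f"Total solid waste over the programme: {total_m3:,} m^3")
print(f"  of which low-level:                 {low_level_m3:,.0f} m^3")
print(f"  intermediate/high-level:            {higher_level_m3:,.0f} m^3")
```

Under these assumptions a ten-reactor fleet produces 60,000 m³ over 60 years, of which only about 6,000 m³ is intermediate- or high-level - small volumes by industrial standards, which is the scale the article is arguing from.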

Unlike fossil fuel waste, which is released into the atmosphere and ground, nuclear waste is carefully managed and contained.

In its role of providing the experts and technologies that help the nuclear industry operate safely and cost-effectively in the short and longer term, the NNL will deliver technological solutions to today's waste issues and plan for the processing and management of waste in the future.

The NNL is closely involved in waste research and will provide close support for future waste management plans and strategies.

Disposal in a geological facility is seen as the most viable long-term solution and the right approach for managing waste from new nuclear power stations as well as legacy waste.

Safety first

The nuclear industry spends millions of pounds each year on safety to meet its own extremely stringent requirements and also those of several external regulating and advisory bodies.

The safety record of nuclear power reactors in the UK and in most of the world is excellent.

In the UK, before a nuclear power station is licensed, its owners must demonstrate that it is safe and prove that the likelihood of uncontrolled radioactivity escaping is, literally, less than one in a million for every year of the reactor's life.

Designers must also assume that human operators can make mistakes, so in traditional nuclear designs all protective systems must be duplicated or trebled.

More recently, the focus has moved towards "passive safety". Instead of relying on valves, pumps and other engineered features, designers are increasingly using the forces of nature - gravity or the fact that materials expand when they get hotter - to ensure safety.

Such an approach reduces the cost of building the reactor and increases the reliability and predictability of how the plant behaves under both normal and abnormal circumstances.

The NNL will also play a key role in supporting the licensing process for new nuclear reactors and other regulatory requirements in the UK.

It will also play an ongoing and crucial technology role in all aspects of nuclear new build and operation.

In supporting the resurgence of the nuclear industry in the UK, the function of the NNL is very clear - to provide the skills and technologies to support all aspects of a successful nuclear industry now and into the future.

Dr Peter Bleasdale is managing director of Nexia Solutions, a wholly owned subsidiary of BNFL Group, which will be developed into the National Nuclear Laboratory

The Green Room is a series of opinion pieces on environmental topics running weekly on the BBC News website


Do you agree with Peter Bleasdale? Does nuclear power play a vital role in providing reliable, low carbon electricity? Will a National Nuclear Laboratory plug the gap in the skills and research deficit? Or has the sector been proven to be an expensive white elephant?

We agree with the need to focus on skills development for the nuclear industry. We believe the most important of these skills is effective safety management. Safety management includes safety-related programs and processes, and the development and maintenance of a strong safety culture. Problems at existing nuclear plants are often accompanied by a safety culture that has gone awry. NNL and NSAN safety management training must recognize that safety and production goals will sometimes be in conflict, and that safety culture will deteriorate over time unless it is actively managed.
Lewis Conner, Powershift LLC, Kensington, California

I laugh when I hear arguments from green parties about the dreadful legacies of nuclear waste. Do they imagine that future technologies will be unable to cope? If all else fails they may bury it under the huge mountains of waste from other processes! The real danger to our heirs will arise from wars fought over diminishing supplies of food, energy, clean air and water. Let us keep a sense of proportion and allow our scientists to deal with these shortages.
Larry, Barry, Wales

Not that I believe OldStone50's assertion below that nuclear power requires a "police state"; but if it is true then only police states will have nuclear power. Obviously nuclear power is part of the solution to the world's energy supply. Energy usage is bound to increase - to suggest otherwise is to say that we will never again use as much energy as we currently use. Two billion people don't even have access to electricity at the moment. It is a certainty that the world's energy usage will increase; and it will obviously have to be policed. Just in the same way that vehicles are licensed and traffic is policed in every developed country. This is not a symptom of a police state; it is a trait of technological civilisation. Humans have always looked for ways to manipulate larger quantities of energy, and this needs to be controlled by society. The suggestion that the solution is to reduce energy consumption reminds me of Miss World wishing for "world peace": it is superficially worthy but hopelessly naïve. Energy rationing is not a long-term solution. Somebody, somewhere will always take the high-energy route and will consequently dominate economically and politically. What we need is sustainable high-energy low-carbon sources; and nuclear fits that bill.
Colin, Glasgow

Can I give you all this link http://www.stormsmith.nl/ which shows that the CO2 emissions from nuclear power stations are the same as those from one powered by fossil fuel. The following quote is part of the summary. Some novel concepts are introduced, to make the results of this study better accessible: the 'energy cliff', the 'CO2 trap', the 'coal ceiling' and the 'energy debt'. Beyond the energy cliff the nuclear system cannot generate net useful energy and will produce more carbon dioxide than a fossil-fueled power station (CO2 trap). Nuclear power may run off the energy cliff within the lifetime of new nuclear build. Beyond the coal ceiling more uranium ore has to be processed each year to feed one nuclear power plant than the annual tonnage of coal consumed by a coal-fired power plant to generate the same amount of electricity. The only people to gain from nuclear power are those involved in the industry, at the planet's expense.
Martin, Scotland

People seem to be forgetting that in 50 years or so there will be little oil left, though much coal that is now becoming economical again. What is going to plug the energy gap? To power your central heating boiler, TV, car, trains, cooking, street lighting etc. Forget about using renewables to power 15% of current needs; worry about what is going to power half of what we currently use sourced from oil/gas - electricity generation, petrol, diesel, natural and bottled gas - before you even discuss other oil uses - plastics and chemicals etc. Nuclear waste, why not shoot it off into the Sun or into deep space. Don't put it on the Moon, I've seen Space 1999 and know what happens there :-)
Neil Postlethwaite, Warwickshire

To all those who complain that uranium mining causes emissions of carbon (dioxide, don't forget that word), they seem to forget that almost all mining and quarrying activities have similar effects. Also, to those who say that we are past 'peak uranium', the reserves of fissile material can be used far more efficiently with breeder reactors, whose waste products can themselves be used as fuel, which will last for 500,000 years or more. Nuclear power is by no means perfect, but seems to be by far the best option for a so-called 'low carbon' economy. One last thing, renewable energy is nothing of the sort! Any student of physics will be able to tell you that energy is transferred, not created or destroyed (aside from matter-antimatter annihilation and electron-positron pair production on an unimaginably small scale). Wind, tidal and solar energy are not limitless: they depend on indirect solar energy (for wind), direct solar energy (for photovoltaic cells), and the kinetic energy and gravitation of the moon (for the tides), and these are not infinite.
Will Griffith, Exeter, Devon

I agree with Dr Bleasdale. Plus take a look at France and Finland to see what can be achieved with consistent nuclear power programmes. The Finns have waste disposal facilities and are building a new nuclear power station and planning another (their sixth unit).
Peter Holt, Berne, Switzerland

A FACT about N-waste. Certain nuclides [atom species] in waste can persist for thousands of human generations; this is a good feature, not a drawback, which is a common misconception. A fire which takes untold ages to die down emits negligible heat here and now, and by analogy the same applies to decaying nuclides.
Leonard Ainsworth, Lytham St.Annes UK

What a one-sided story that trumpets "benefits" without ever mentioning the many drawbacks. Surely the problem lies not in generating enough power to meet the demand, but in reducing the demand to meet reality. Can the whole world use fission? I think not. It's just like John Travolta getting dressed up and re-oiled for the sequel to "Saturday Night Fever".... old, oily and wrong. Designing re-vamped reactors is stupid and selfish. Wear jumpers in the winter, insulate and generate power locally. It's so much simpler than waiting for a Buck Rogers style fusion reactor, or further destroying whatever beauty is left in the world.
Chris Harper, Yonago, Japan and Bristol

Bleasdale's failure is that of the engineer. He focuses narrowly on a micro-problem while assuming ceteris paribus, and he can't resist the attraction of the grand project, the silver bullet solution, the vision of massive concentrated power. In effect, he's a three-year-old boy, standing by the railroad tracks, staring wide-eyed at the train and being totally enthralled by it. It is probably possible to have many thousands of nuclear power stations all over the world cranking out energy. It may be possible to find basic fuels other than uranium, e.g., the realization of practicable fusion reactors. We might even learn to control toxic by-products safely and consistently. It might even be - with a real long stretch of the imagination - all economically efficient. But one thing that cannot be avoided with the use of nuclear power is the establishment and maintenance of strong, rigid, and absolutely centralized control. Nuclear power is a clear prescription for a police state because only through a police state can all the necessary safety measures be achieved and - much more difficult - maintained. Especially over hundreds or thousands of years. Turning off the lights is a much more rational, safe and democratic approach. And for those who have difficulty understanding metaphors, turning off the lights is a metaphor for energy use reduction.
OldStone50, Freiburg, Germany

Is there anyone else who thinks that the problem of leaving nuclear waste to future generations is insanely immoral? What kind of inheritance is that for our children, grandchildren, great grandchildren for 1000s of years? And does this set an example of responsibility? Even if there are no accidents or leaks it is quite a responsibility to be given to future generations because we are unable to live within our means.
Simon Kellett, Darmstadt, Germany

On the subject of radioactive waste, I once read that a large coal-fired power station such as Drax emits more radioactive material to the environment than the average nuclear station. Do we have dual standards?
Doug Elliot, Ormskirk, Lancashire

I believe strongly that nuclear energy is the only realistic answer to the critical state our climate is currently in. We may well have already passed a tipping point in terms of greenhouse gases and positive feedback loops. It would please me greatly if people could stop harking back to Chernobyl, and linking nuclear energy to nuclear weapons. This blinkered view will only lead to global catastrophe. 'Renewable' or 'green' energies simply do not have the capacity to support energy needs. Take the example of biofuels: wonderful in theory but, realistically, the space needed to grow such a vast amount of the crops required would probably do more overall planetary damage than the benefits gained from using biofuels. In addition, there would most likely be food shortages (as was claimed recently in Latin America). The best thing developed nations could probably do right now is offer to share existing nuclear technologies with developing nations. Allow developing nations to skip the step that developed nations went through and put us in this mess in the first place.
Olly Burdekin, Playa Del Carmen, Mexico

It makes me laugh when people assume we have a choice. Renewables will never meet our energy requirements alone, unless we dramatically change our way of life. Now instead of worrying about the long-term problems with nuclear fission I think we should look at it as a means to an end. This will give the nation a chance to develop carbon-free ways to go about their daily lives (hydrogen fuel cell cars, electric heating etc). Nuclear fusion is an amazing way to create energy: a few grams of fuel (which is found in water and the earth) can power London for ages, and its only waste is helium and neutrons. A prototype reactor is expected to open in 2017. Eventually all the fission reactors could be replaced.
Paul Holland, Loughborough

Good point Nic Borough (below). Indeed the most efficient use of the world's remaining Uranium-235 (whose mining operations are rapidly inflating the carbon footprint of nuclear power due to lower and lower grade ores now being mined) would be to build fast breeder reactors, so called because they can produce more fuel (in the form of Plutonium) than they use (in the form of U-235). These would ensure that fission could contribute to emissions targets going forward. The problem, and I think the reason no-one is talking about it, is that Plutonium is relatively easy to refine from spent fuel/waste, thus increasing the risk of proliferation of nuclear weapons should the global nuclear power industry go down the 'fast breeder' route. In my opinion it would be crazy not to develop and build fast breeder reactors, and in so doing squander the remaining economically available Uranium. Commercial fusion is still potentially a long way away (50 years away, and always will be, as the old joke goes). Surely security concerns can be addressed in the way the meltdown/radioactive release threat has been mitigated!
Barry Gallagher, London

The editorial is the fox guarding the chicken house (or BNFL Group looking out for your health vs the profit in building/running new plants). If the amount of nuclear waste is so SMALL compared to other waste then how come people don't store it in their backyards? Texans don't want BNFL here and we don't want the nuclear waste from the world brought here to the pristine desert and aquifers. That open desert actually controls global warming in a similar way to tree cover via desert crusts, and as a society we are getting too big to simply cart our waste over to "that open space" or some poor or disenfranchised neighbor. Britain, keep your nuclear waste in Britain -- and then tell us that nuclear waste is "small": its impact is HUGE and in terms of human-years, forever.
Heather, El Paso near Sierra Blanca, Texas

Nic Brough suggested squeezing more energy from uranium will be extremely costly and that nuclear power is at the end of its improvement cycle. This is not true. Most reactors in service today are so-called thermal-spectrum reactors that burn only a few percent of the uranium (the fissile U-235 component). More advanced fast neutron reactors can also burn the U-238 component, thus giving between 60-100 times the energy for the same amount of ore and waste (depending a little on what reactor you compare with). Some of these designs (like the lead-cooled fast reactor) are likely to be cheaper than current technologies, mainly because molten metals have better thermodynamic properties than does pressurized water. The problem is that they are all at the prototype stage and construction of commercial plants is not expected to be feasible before the existing plants are to be shut down. Nevertheless claiming there is no room for improvement is simply wrong.
Jonatan Ring, Lund, Sweden

Let's dismantle this, shall we? "Life-cycle costs are competitive": even if (and it's a big if) costs were comparable to renewables, why have nuclear when you can have renewables for the same cost? Comparing nuclear waste with other industrial waste is silly - if it were so, why do we guard even moderate nuclear waste and not all industrial waste? If it's so easy to deal with, why hasn't the existing nuclear waste been finally disposed of? As for safety, "one in a million" events do happen. If they happen in a nuclear environment the potential for disaster is far greater than, say, an accident at a coal plant. Remember Chernobyl? The decision for nuclear is a political one, not a rational one. Perhaps the money for a Nuclear Academy should be put into a 'Sustainable Academy'; perhaps then we wouldn't need any nuclear stations?
Kevin, Coventry

Unfortunately renewables simply will not be able fill the gap, at least not in the short term... No matter how much nuclear is villified, it remains the only viable, if expensive, solution to the problem. As for the issue of nuclear waste, some pretty nifty ideas have been bandied about lately, and I'm pretty convinced that, at least for the time it takes for renewables to be sufficiently developed and financially efficient, using nuclear fission as a power source is our only real option.
Jamie Males, Cambridge

There is absolutely no evidence that the developed, and now developing, nations are prepared to give up their high standards of living. We are therefore forced to generate more and more electric power. There is no perfect solution; greener sources like wind power can never supply all our needs and so we are forced to use nuclear, which, although not zero carbon, is a thousand times less so than using fossil fuels.
Mike Pettman, Chichester

Nuclear might not be zero carbon, but compared to other forms of energy production it is extremely small; way down there with renewables. New-build nuclear produces about 8 kg of CO2 per MWh, compared with around 400 for gas and 900-1400 for coal (depending on how dirty the coal is). Those wind turbines don't spring up from nowhere and have about the same carbon footprint per MWh as nuclear, if not more. The only real difference is that you'd have to cover the entire coastline of the UK in wind turbines (about 1 every 2 km) to meet a fraction of UK demand, and then only when the wind's blowing. As for turning off light bulbs, are you serious? The amount of electricity saved is on a par with the ridiculous "let's switch all devices off standby" campaign. Electrical devices on full blast are only 3% of total demand, so on standby they're using a fraction of 3%. The most effective method for reducing energy use is to insulate people's homes. 60% of energy goes on heating, so logically that's where the most improvements can be made. Go out there and buy some loft insulation made from recycled bottles, some cavity wall insulation and some double or even triple glazing, then come back and tell me to switch off some lights.
Anon, England
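[Editor's note: the lifecycle figures quoted above, commonly expressed in kg of CO2 per MWh (equivalently g per kWh), can be turned into a quick ratio check. The numbers below are the commenter's, not authoritative values; published estimates vary by study:]

```python
# Quick ratio check on the lifecycle carbon-intensity figures quoted in
# the comment (kg CO2 per MWh, i.e. g per kWh). These are the commenter's
# numbers, not authoritative values.
intensity_kg_per_mwh = {
    "nuclear": 8,
    "wind": 10,         # assumed: "about the same as nuclear, if not more"
    "gas": 400,
    "coal_low": 900,    # cleaner coal
    "coal_high": 1400,  # dirtier coal
}

nuclear = intensity_kg_per_mwh["nuclear"]
for source, kg in intensity_kg_per_mwh.items():
    print(f"{source:10s}: {kg:5d} kg/MWh ({kg / nuclear:6.1f}x nuclear)")
```

On these figures, gas comes out at 50x and the dirtiest coal at 175x the carbon intensity of nuclear per unit of electricity.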

Nuclear power supporting a thermal hydrogen generation system is absolutely zero carbon. Not a single molecule of CO2 need be released by any stage of the mining and power generation process once a full nuclear and hydrogen economy is implemented. If Chris would like us to believe nuclear power is not carbon neutral I would like him to tell us where he plans to get the steel, silicon, copper, and other materials for his wind and solar plants without mining for them and how he plans to install the new plants without building them. Heck, I'd like him to tell us what he plans to do with hundreds of thousands of tonnes of waste silicon.
Thomas, Birmingham

"... new stations are better designed and more efficient than those being replaced". The only one nearing completion in 2011 (Olkiluoto, Finland), already two years late, is a prototype. So whether it will be more efficient will not be known until around 2012 or 2013 - or later, if it is delayed further.
John Busby, Bury St Edmunds

What a surprise - an employee of a company owned by BNFL talking up nuclear power. The reality is that nuclear power is a joke: a white elephant, a black hole into which we persist in throwing billions of pounds of taxpayers' money. Will we never learn? And as Chris (below) points out, nuclear power is in no way a carbon free source of energy. And its carbon emissions are only going to rise as the good quality uranium ores are worked out and the industry turns to lower grade sources of fuel. If we're serious about tackling climate change, nuclear power is way down the list of things that we should be doing. We should spend the money on something that might do some good instead.
Eddie, Edinburgh

Yes, nuclear power is an option - but it comes with a terrible legacy of waste that remains dangerous for thousands of years! The cost of keeping this waste secure under permanently safe conditions is enormous and ongoing. Closing down older plants is fraught with problems, let alone the huge costs. The government should have encouraged industry to invest in research for more efficient ways of producing energy for homes, cars and air transport years ago - it's not like they didn't know this was coming! The promising technology is there; it just needs a kick-start with financial inducement and legislative targets to be met by certain dates. The upshot is the oil industry and its dependents are hampering progress in this area for its own ends. Some of the profits have to be channelled into research now; the world cannot wait. Green technology will be the end result - power first, then refine it! Ignoring this research will inevitably result in wars and famine - which none of us want or deserve! BCJ
BARRY JOHNSNON, Minster, Sheerness

I agree that nuclear energy needs to be revisited. It has numerous advantages and even the worst case disaster of Chernobyl didn't prove to be anything like as damaging to the environment as first feared, as a Horizon programme a few years ago showed.
Les Howarth, Saffron Walden, UK

> a range of independent studies now show that full nuclear life-cycle costs are competitive with other sources

But only if the costs of waste disposal and security are not taken into account. Also neglected are other facts like:
* Fission power is at the end of its technological improvement cycle - it's going to cost a fortune to squeeze another 0.1% of energy out of the fuel, whereas renewables are mostly at the beginning
* it takes 10+ years to get a plant running, but we need the energy now
* we're already close to "peak uranium", if not past it, and demand is rising - fuel is going to get more expensive and there's no way back
* whilst Europe has a very good record on reactor safety, we have an appalling one on disposal of waste and leakage.
Our current nuclear option is the most expensive waste of time we can possibly chase, and we, the taxpayer and consumer, are going to get burnt again. We need to forget fission reactors and fund research, building and improvements on renewable energy sources - wind, wave, tidal, solar and fusion power. Our only real "nuclear" option is fast-breeder reactors, which no-one seems to mention any more.
Nic Brough, London

Nuclear power is not actually "tried-and-tested carbon free", as stated in the report. A great deal of energy is spent - and carbon used - in the mining and enrichment processes that uranium must go through before it can be considered fuel. It is also hard, if not impossible, to estimate the costs of decommissioning a nuclear power plant, with further carbon costs. The report also states that nuclear waste is "very small by industrial standards". What is necessary to balance this statement is the fact that there is no current technical solution for the very long-term management of high-level waste, which must be considered. Even deep geological storage of waste does not stop leaching from barrels, changes to water tables, and damage from earthquakes. Such sites would therefore need to be managed for thousands of years to ensure they function as intended. Uranium is not a limitless resource either. Current estimates put uranium supplies at 200 years left, at today's uranium consumption. The cost of nuclear power must be considered across the entirety of the nuclear power generation "cycle". The decision to use nuclear power is an economic one, with short-term costs politically hidden away, and the long-term costs the burden of another generation. A useful starting point for anyone interested in the topic is a report from the UK's Sustainable Development Commission.
Chris, Glasgow
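[Editor's note: the "200 years" figure above is a static reserves-to-consumption ratio. A sketch with hypothetical round numbers shows both that calculation and how demand growth shortens the horizon:]

```python
import math

# Static reserves-to-production ratio for uranium. The tonnages are
# hypothetical round numbers chosen to reproduce the quoted 200-year
# horizon; real estimates depend on ore price and extraction technology.
known_reserves_tonnes = 8_000_000     # assumed identified resources
annual_consumption_tonnes = 40_000    # assumed world consumption

years_left = known_reserves_tonnes / annual_consumption_tonnes
print(f"static supply horizon: {years_left:.0f} years")

# Demand growth shortens the horizon: with consumption C growing at rate g
# per year, reserves R are exhausted after t years, where
# C * ((1+g)**t - 1) / g = R, i.e. t = log(1 + g*R/C) / log(1+g).
g = 0.02  # assumed 2% annual demand growth
t = math.log(1 + g * years_left) / math.log(1 + g)
print(f"with 2% annual demand growth: ~{t:.0f} years")
```

Under these assumptions the 200-year static horizon shrinks to roughly 80 years once even modest demand growth is factored in, which is why "years left" figures are so sensitive to consumption trends.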

I see Dr Bleasdale neglects to mention that, whilst the new stations will be more efficient than the previous ones, the high burn rates of the uranium that give greater efficiency will cause a range of knock-on problems. They make the cladding for the fuel rods more brittle, giving clear safety issues that would have to be resolved before any consideration of operation. They also make the spent elements of the high burn-up fuel far more radioactive than the current waste, so, whilst there would be less of it, it would need to be stored much further apart than at present (due to the heat build-up), thus meaning it would actually require more storage space.
Jon, Rainham, Kent

The example of France provides the proof that there is a role for nuclear energy. There is a desperate need, particularly in UK, to take the technology forward - an NNL should achieve that. It would seem reasonable to state that the cost of all the damage done by "conventional" fuels so far and into the future exceeds any done to date by nuclear. The underlying fear is always the potential for mis-use of the knowledge.
Timothy Havard, Fife Scotland

Problem is, it will take so long to build these reactors that we will have run out of capacity long before they are completed. We need to start building them now, and to balance this with large tidal (Severn Barrage and Thames Barrage) and wind. Add to this some biomass, and gas and coal near the places where there are supplies.
Ben Shepherd, Farnham, Surrey

Please, please stop calling nuclear fission "zero carbon" - it is NOT! The new infrastructure and fuel production are both massive sources of emissions during construction, operation and decommissioning phases. As for keeping the lights on - let's turn some off for goodness sake, and there'll be far less of an energy gap which could then be covered by renewables!
Chris, Blewbury, Oxon

Original here