Wednesday, July 23, 2008

Apollo 14 astronaut claims aliens HAVE made contact - but it has been covered up for 60 years

Edgar Mitchell

Edgar Mitchell was the Lunar Module Pilot for Apollo 14

Aliens have contacted humans several times but governments have hidden the truth for 60 years, the sixth man to walk on the moon has claimed.

Apollo 14 astronaut Dr Edgar Mitchell said he was aware of many UFO visits to Earth during his career with NASA but each one was covered up.

Dr Mitchell, 77, said during a radio interview that sources at the space agency who had had contact with aliens described the beings as 'little people who look strange to us.'

He said the supposedly real-life ETs matched the traditional image: small frame, large eyes and head.

Chillingly, he claimed our technology is 'not nearly as sophisticated' as theirs and 'had they been hostile', he warned, 'we would have been gone by now'.

Dr Mitchell, along with Apollo 14 commander Alan Shepard, holds the record for the longest ever moon walk, at nine hours and 17 minutes, following their 1971 mission.

'I happen to have been privileged enough to be in on the fact that we've been visited on this planet and the UFO phenomena is real,' Dr Mitchell said.

'It's been well covered up by all our governments for the last 60 years or so, but slowly it's leaked out and some of us have been privileged to have been briefed on some of it.

UFO theorists believe Roswell in New Mexico was the site of an alien crash in 1947

'I've been in military and intelligence circles, who know that beneath the surface of what has been public knowledge, yes - we have been visited. Reading the papers recently, it's been happening quite a bit.'

Dr Mitchell, who has a Bachelor of Science degree in aeronautical engineering and a Doctor of Science degree in Aeronautics and Astronautics, claimed Roswell was real and that similar alien visits continue to be investigated.

He told the astonished Kerrang! radio host Nick Margerrison: 'This is really starting to open up. I think we're headed for real disclosure and some serious organisations are moving in that direction.'

Mr Margerrison said: 'I thought I'd stumbled on some sort of astronaut humour but he was absolutely serious that aliens are definitely out there and there's no debating it.'

Officials from NASA, however, were quick to play the comments down.

In a statement, a spokesman said: 'NASA does not track UFOs. NASA is not involved in any sort of cover-up about alien life on this planet or anywhere in the universe.

'Dr Mitchell is a great American, but we do not share his opinions on this issue.'

Original here

The Pole star comes to life again

Artist's impression of Polaris and the constellations of the Big and Small Dippers.

(PhysOrg.com) -- The Northern Star, whose vibrations were thought to be dying away, appears to have come to life again.
An international team of astronomers has observed that vibrations in the Pole star, which had been fading away to almost nothing over the last hundred years, have recovered and are now increasing. And the astronomers don't know why.


Plot showing the decline over 100 years in the amplitude of Polaris's four-day light variation, and its increase since 2000. Observations before 2000 are from other work; observations after 2000 are from this work.
The discovery will be announced during the "Cool Stars 15" conference at the University of St Andrews. Dr Alan Penny from the School of Physics and Astronomy will present results of the recovery to around 350 international delegates at the meeting that runs from July 21-25.

The astronomers were watching Polaris in the expectation that they would catch the star switching off its vibrations completely when they made the surprising observation of its revival.

Dr Penny explained, "It was only through an innovative use of two small relatively unknown telescopes in space and a telescope in Arizona that we were able to discover and follow this star's recovery so accurately."

Team leader, Dr Hans Bruntt of the University of Sydney, had been using a small telescope attached to NASA's now defunct infrared space telescope (WIRE) to study the star for a short period of time. He knew Dr Penny was using a device known as the SMEI space camera - predominately employed to watch matter being ejected from the Sun - to do long-term monitoring of stars. When the SMEI data were analysed this recovery of Polaris was seen, and could also be traced in the WIRE data.

Professor Joel Eaton of Tennessee State University was also doing long-term monitoring, and, using the AST automated spectroscopic telescope located in Arizona, was able to confirm the observations of Polaris by watching the change in velocity of the surface of the star as the variation caused it to expand and contract.

Although Shakespeare's Julius Caesar declared, "I am as constant as the Northern star, Of whose true-fix'd and resting quality There is no fellow in the firmament", Polaris is a 'Cepheid' variable star, getting brighter and fainter every four days. Cepheids do play a vital part as 'standard candles' in determining the size of the Universe, but the details of their variations are not well understood.
Dr Penny said, "One hundred years ago Polaris varied by 10%, but over the last century the variations became smaller and smaller until ten years ago it only varied by 2%.

"It was thought the structure of the star was changing to switch off the vibration. Yet the team has found that about ten years ago the vibrations started picking up and are now back up at the 4% level."

The slow decline in variability was in itself unusual, as no other Cepheid is known to do this. Astronomers thought that Polaris was ageing and its structure was changing so that it was no longer unstable. This was being followed to learn about how stars age. Now that Polaris has 'turned on' again, this explanation seems unlikely, and there may instead be a complex process in the outer layers of the star, with more than one mode of variability.

Original here

Chemical Breakthrough Turns Sawdust Into Biofuel

By COLIN BARRAS

A wider range of plant material could be turned into biofuels thanks to a breakthrough that converts plant molecules called lignin into liquid hydrocarbons.

The reaction reliably and efficiently turns the lignin in waste products such as sawdust into the chemical precursors of ethanol and biodiesel.

In recent years, the twin threats of global warming and oil shortages have led to growth in the production of biofuels for the transportation sector.

But as the human digestive system will attest, breaking down complex plant molecules such as cellulose and lignin is a tricky business.

Food Crisis

The biofuels industry has relied instead on starchy food crops such as corn and sugar cane to provide the feedstock for their reactions. But that puts the industry into direct competition with hungry humans, and food prices have risen as a result.

A second generation of biofuels could relieve the pressure on crop production by breaking down larger plant molecules – hundreds of millions of dollars are currently being poured into research to lower the cost of producing ethanol from cellulose.

But cellulose makes up only about a third of all plant matter. Lignin, an essential component of wood, makes up much of the rest, and converting it to liquid transport fuel would increase yields.

However, lignin is a complex molecule and, with current methods, breaks down in an unpredictable way into a wide range of products, only some of which can be used in biofuels.

Balancing Act

Now Yuan Kou at Peking University in Beijing, China, and his team have come up with a lignin breakdown reaction that more reliably produces the alkanes and alcohols needed for biofuels.

Lignin contains carbon-oxygen-carbon bonds that link together smaller hydrocarbon chains. Breaking down those C-O-C bonds is key to unlocking the smaller hydrocarbons, which can then be further treated to produce alkanes and alcohol.

But there are also C-O-C bonds within the smaller hydrocarbons which are essential for alcohol production and must be kept intact. Breaking down the C-O-C bonds between chains, while leaving those within chains undamaged, is a difficult balancing act.

In Hot Water

Kou's team used their previous experience with selectively breaking C-O-C bonds to identify hot, pressurised water – known as near-critical water – as the best solvent for the reaction.

Water becomes near-critical when heated to around 250 to 300 °C and held at high pressures of around 7000 kilopascals. Under those conditions, and in the presence of a suitable catalyst and hydrogen gas, it reliably breaks down lignin into smaller hydrocarbon units called monomers and dimers.

The researchers experimented with different catalysts and organic additives to optimise the reaction. They found that the combination of a platinum-carbon catalyst and organic additives such as dioxane delivered high yields of both monomers and dimers.

Under ideal conditions, it is theoretically possible to produce monomers and dimers in yields of 44 to 56 weight % (wt%) and 28-29 wt% respectively. Weight % is the fraction of the solution's weight that is composed of either monomers or dimers.

Easy Extraction

Impressively, the researchers' practical yields approached those theoretical ideals. They produced monomer yields of 45 wt% and dimer yields of 12 wt% – about twice what has previously been achieved.

Removing the hydrocarbons from the water solvent after the reaction is easy – simply by cooling the water again, the oily hydrocarbons automatically separate from the water.

It is then relatively simple to convert those monomers and dimers into useful products, says Ning Yan at the Ecole Polytechnique Fédérale de Lausanne, Switzerland, and a member of Kou's team.

That results in three components: alkanes with eight or nine carbon atoms suitable for gasoline, alkanes with 12 to 18 carbons for use in diesel, and methanol.

Efficient Process

"For the first time, we have produced alkanes, the main component of gasoline and diesel, from lignin, and biomethanol becomes available," says Yan.

"A large percentage of the starting material is converted into useful products," he adds. "But this work is still in its infancy so other aspects related to economic issue will be evaluated in the near future."

John Ralph at the University of Wisconsin in Madison thinks the work is exciting. He points out that there have been previous attempts to convert lignin into liquid fuels. "That said, the yields of monomers [in the new reaction] are striking," he says.

Richard Murphy at Imperial College London, UK, is also impressed with Kou's work. "I believe that approaches such as this will go a considerable way to help us extract valuable molecules including fuels from all components of lignocellulose," he says.

Provided by NewScientist.com news service © Reed Business Information

Original here

Green Vision: Artificial DNA as Software and Artificial Enzymes as Hardware!


This sounds like a bit of a fantasy, and if you are really blunt about it, many would probably shrug their shoulders and say that this is downright lunacy. It would be hard to convince them otherwise by explaining to them the ins and outs of this new technology, as not much is yet known. But when man first dreamed of stepping onto the moon, that too was considered just plain 'lunacy'. Science has a way of altering our perceptions and blurring the line between the probable and the impossible!

Dutch researchers take flight with three-gram 'dragonfly'

The DelFly Micro is a 'Micro Air Vehicle' (MAV), an exceptionally small remote-controlled aircraft with camera and image recognition software. The Micro, weighing just 3 grams and measuring 10 cm (wingtip to wingtip) is the considerably smaller successor to the successful DelFly I (2005) and DelFly II (2006).

(PhysOrg.com) -- On Wednesday 23 July, TU Delft will be presenting the minute DelFly Micro air vehicle. This successor to the DelFly I and II weighs barely 3 grams, and with its flapping wings is very similar to a dragonfly.

Ultra-small, remote-controlled micro aircraft with cameras, such as this DelFly, may well be used in the future for observation flights in difficult-to-reach or dangerous areas. The DelFly Micro will make a short demonstration flight during the presentation.

Video: DelFly micro first test flight. TU Delft

The DelFly Micro is a 'Micro Air Vehicle' (MAV), an exceptionally small remote-controlled aircraft with camera and image recognition software. The Micro, weighing just 3 grams and measuring 10 cm (wingtip to wingtip) is the considerably smaller successor to the successful DelFly I (2005) and DelFly II (2006). The DelFly Micro, with its minuscule battery weighing just 1 gram, can fly for approximately three minutes and has a maximum speed of 5 m/s.

Ultra-small remote-controlled, camera-equipped aircraft are potentially of great interest because they could eventually be used for observation flights in difficult-to-reach or dangerous areas.

The basic principle of the DelFly is derived from nature. The 'dragonfly' has a tiny camera (about 0.5 grams) on board that transmits its signals to a ground station. Software developed by TU Delft itself can then recognise objects in the images autonomously.

The camera transmits TV-quality images, and therefore allows the DelFly II to be operated from the computer. It can be manoeuvred using a joystick as if the operator were actually in the cockpit of the aircraft. The aim is to be able to do this with the DelFly Micro too.

The development of the DelFly is above all the story of continuing miniaturisation of all the parts, from the DelFly I (23 grams and 50 cm) via the DelFly II (16 grams and 30 cm) to the present DelFly Micro (3 grams and 10 cm).

The DelFly II drew huge attention in 2006 because it could fly horizontally (21 km/hr) as well as hover, just like a hummingbird, and also fly backwards. The DelFly Micro, incidentally, cannot do this just yet.

In a few years' time the project's new objective, the DelFly Nano (5 cm, 1 gram), will have been developed. The Micro is an important intermediate step in this development process. A second objective for the future is for the DelFly to be able to fly entirely independently thanks to image recognition software.

Original here

NASA's Ames, JPL Win NASA Software of Year Award

WASHINGTON -- Computer programs that are used to define safety margins for fiery spacecraft re-entries and help detect planets outside our solar system are co-winners of NASA's 2007 Software of the Year Award.

Software engineers at NASA's Ames Research Center at Moffett Field, Calif., developed the Data-Parallel Line Relaxation, or DPLR, which is used to analyze and predict the extreme environments human and robotic spacecraft experience during super high-speed entries into planetary atmospheres.

At NASA's Jet Propulsion Laboratory in Pasadena, Calif., software engineers developed the Adaptive Modified Gerchberg-Saxton Phase Retrieval program. The software uses a telescope's science camera with innovative and robust algorithms to characterize possible errors that limit its imaging performance. The software has been integrated into calibration control loops to correct those errors, and can achieve orders of magnitude improvement in sensitivity and resolution.

The DPLR simulates the intense heating, shear stresses, and pressures a spacecraft endures as it travels through atmospheres to land on Earth or other planets. It is capable of creating a highly accurate, simulated entry environment that exceeds the capability of any test facility on Earth, allowing engineers to design and apply thermal protection materials suited to withstand such intense heating environments.

The DPLR team members include Michael J. Wright, James Brown, David Hash, Matt MacLean, Ryan McDaniel, David Saunders, Chun Tang and Kerry Trumble.

JPL's software can be applied to other sciences and systems that use light, such as laser communications and extrasolar planet detection.

JPL's Adaptive Modified Gerchberg-Saxton Phase Retrieval software already is in use at the California Institute of Technology's Palomar Observatory, in northern San Diego County. The software played a significant role in designing such next-generation telescopes as NASA's James Webb Space Telescope, scheduled to launch in 2013.

An eight-person team from JPL is responsible for the Adaptive Modified Gerchberg-Saxton Phase Retrieval software: Scott Basinger, Siddarayappa Bikkannavar, David Cohen, Joseph Green, Catherine Ohara, David Redding and Fang Shi.

Early work for the software was based on efforts to correct the vision of NASA's Hubble Space Telescope. After initial images came back blurry, engineers worked for months to determine the problem. Eventually, astronauts traveled to the telescope to install a corrective lens based on telescope-imaging errors.

A NASA Software Advisory Panel reviews entries and recommends winners to NASA's Inventions and Contributions Board for confirmation. Entries are nominated for developing innovative technologies that significantly improve the agency's exploration of space and maximize scientific discovery.

Ames and JPL have each won or shared the award seven times since the NASA Software of the Year Award was initiated in 1994, including three of the past four years.

Original here

Pi or 2 Pi: That Is the Question scientific_blogging

By Robert Olley

In February this year there appeared in Physics World an article entitled Constant Failure by Robert P Crease of Stony Brook University, in which he showed how often 2π, rather than π, turns up in the formulae of physics and mathematics. This article struck a chord with me, since even after many years I remember the feeling of "cognitive dissonance" when being taught that the formula was 2πR rather than πD.

I felt it was a bit much, though, to suggest that Archimedes might have been mistaken in choosing to calculate the ratio of circumference to diameter rather than to radius. In those days, the fundamental dichotomy seems to have been between the geometers, who thought in terms of circumference, diameter and their ratio, and the astronomers, who used the radius in their calculation of chord tables.

Hipparchus used a radius of 3438, which is the nearest integer to the number of minutes in 1 radian, but Ptolemy preferred 3600 as this is easier to calculate with in the sexagesimal system. The work of these astronomers, further developed by Hindu and Arabic mathematicians, gives us our trigonometry of today.
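
As a quick check: a full circle spans 360 × 60 = 21,600 minutes of arc, so a radius measured in the same angular units comes to 21,600 / 2π ≈ 3437.75, which rounds to Hipparchus's 3438.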

In particular, Aryabhata published in 499 A.D. the Aryabhatiya in which he invented the sine function (radius!) as more convenient than the chord, but nevertheless computed the most accurate value of π (diameter!) known in ancient times. However al-Kashi, who was very much an astronomer and trigonometer, set a new record in precision in his Treatise on the Circumference in July 1424, a work in which he calculated 2π to nine sexagesimal places and translated this into sixteen decimal places.

The Greek geometers did not think of their ratio as a number. To them, number, magnitude and ratio were three distinct concepts. Then who first did? As it might say at the beginning of a tale from The Thousand and One Nights, "there were three brothers from Baghdad", namely the Banu Musa in the 9th Century, who are first recorded as having described this ratio as a number.

The first person to use π to represent the ratio of the circumference to the diameter (3.14159...) was the Welshman William Jones in 1706. But the radius fought back, with the word 'radian' first appearing in print in 1873, in examination questions set by James Thomson (brother of Lord Kelvin) at Queen's College, Belfast.

He used the term as early as 1871, while in 1869 Thomas Muir, then of St. Andrew's University in Scotland, hesitated between 'rad', 'radial' and 'radian', adopting 'radian' after consultation with James Thomson. (A Welshman, an Irishman, and a Scotsman – is it a Celtic conspiracy?)

Even the difference between the two versions of Planck's constant ℎ and ℏ (aka the Dirac constant) depends on whether one is thinking physically in terms of frequency ν or mathematically in terms of angular velocity ω. Physics is not Applied Mathematics!
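
The relation itself is a one-liner: with ordinary frequency ν and angular frequency ω = 2πν, a photon's energy can be written either as E = ℎν or as E = ℏω, which forces ℏ = ℎ/2π. The factor of 2π simply migrates between the constant and the variable.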

Original here

CERN's Large Hadron Collider Going Colder than Outer Space

Based beneath the border between France and Switzerland, the Large Hadron Collider has become the center of scientific endeavor for the general public to focus on. First prophesied to bring ruin to the whole universe (or at least that little bit that surrounds us), the LHC has now been deemed safe. Subsequently, knowing that it won't blast us all (or suck us all) into a black hole, the LHC has begun commissioning.

Set to have its first particle beams injected in August of this year, the LHC must first bring its temperature down, so as to obtain the highest possible magnetic fields while consuming the least amount of power.

In other words, the more than 1,600 magnets that make up the 27-kilometer-long tunnel must be brought to low temperatures so that the electrical current channeled along their length experiences zero resistance and very little power loss.

Currently, six of the eight sectors making up the LHC have been brought down to between 4.5 and 1.9 Kelvin, which works out to around -270 C (-454 F). The commissioning cooling will be complete when all eight sectors reach 1.9 Kelvin. For comparison, the temperature in deep space measures in at about 2.7 Kelvin.
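
The unit conversions are easy to check. A minimal sketch in Python (the Kelvin values come from the article; the conversion formulae are standard):

```python
# Unit conversions for the temperatures quoted above.
def kelvin_to_celsius(k: float) -> float:
    return k - 273.15

def kelvin_to_fahrenheit(k: float) -> float:
    return k * 9 / 5 - 459.67

for k in (4.5, 1.9, 2.7):  # LHC sector range and deep space
    print(f"{k} K = {kelvin_to_celsius(k):.1f} C = {kelvin_to_fahrenheit(k):.1f} F")
# 1.9 K is about -271.3 C / -456.3 F, slightly colder than the
# article's rounded "around -270 C (-454 F)".
```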

Needless to say, given the time that it takes to cool these objects down, and the delays that could occur if a mistake is made, the LHC teams are meticulous. "We have a very systematic process for the commissioning of this machine, based on very carefully designed procedures prepared with experience we have gathered on prototypes," said Roberto Saban, the LHC's head of hardware commissioning.

"Our motto is: no short cuts... exchanging a single component which today is cold, is like bringing it back from the Moon. It takes about three to four weeks to warm it up. Then it takes one or two weeks to exchange. Then it needs three to six weeks to cool down again. So, you see, it is three months if we make a mistake."

Obviously one of the most high-profile searches that the LHC will be conducting is for the fabled 'God particle', the Higgs boson. The discovery of this particle would go a long way in the search for a Grand Unified Theory, which seeks to unify three of the four known fundamental forces: electromagnetism, the strong nuclear force and the weak nuclear force, leaving out only gravity.

But there are other discoveries waiting to be made in the whizzing and crashing particles inside the LHC, answering questions such as “are there extra dimensions, as indicated by theoretical gravitons?” and “what is the nature of dark matter and dark energy?”

Original here

U.S. Takes Global Lead in Wind Energy Production

By Clara Moskowitz, LiveScience Staff Writer

Thanks to blow-hard winds, the United States has just become the world's largest generator of wind energy.

Germany previously held this distinction, though since the United States has about 26 times more land than Germany, the milestone isn't a huge surprise. Nonetheless, we weren't expected to reach this point until late 2009.

"Our wind energy capacity is growing faster than anyplace else," said Randall Swisher, the executive director of the American Wind Energy Association, the national trade organization for the wind energy industry. "So it's no longer really alternative energy. This is very mainstream."

During the first half of 2008, the United States, for the first time, generated more wind energy electricity than Germany, despite the fact that the smaller European country still has more turbines than we do.

Germany's turbines have a capacity of about 22,000 to 23,000 megawatts, while the United States has a capacity of about 18,000 megawatts, Swisher said.

"The difference is that because the winds are so much stronger here in the U.S. we are actually providing more wind-generated electricity than Germany," Swisher told LiveScience. "Our turbines are so much more productive that theirs."

Though we are winning the race in terms of volume of wind energy produced, we are far behind when it comes to the proportion of our total energy we get from wind.

While wind currently supplies about 1.2 percent of the United States' power, it accounts for about 7 percent of Germany's total energy consumption. And the even-smaller country of Denmark gets roughly 20 percent of its energy from wind.

Most of America's wind power is being collected in Texas (which provides more than 25 percent of the country's wind-generated electricity), the Midwest, and West Coast, Swisher said.

The main issue with ramping up our use of wind power is not a lack of wind — have you seen how gusty it gets on the plains of Iowa? — but a lack of good ways to transport that energy from where it's collected to homes and offices and factories where it will be used.

"The major constraint is the transmission infrastructure," Swisher said. "To be able to build more turbines we have to build more transmission lines to carry the electricity from where it's generated to major areas where energy is being used."

Though America's wind energy use is certainly ramping up, we still have a ways to go toward harnessing its full potential.

The U.S. Department of Energy (DOE) reported that wind has the capability to provide 20 percent of our country's energy needs by the year 2030.

Since wind is a "green" form of energy, the DOE predicted this change would lead to a 25 percent reduction of carbon dioxide emissions from electricity generation in 2030.

"We need to back away from fossil fuel and embrace renewable energy," Swisher said. "The survival of the world depends on it."

Original here

6 Plants That Will Grow (Almost) Anywhere

I've tried to grow peppers, rosemary, and other greenery in my small New York City one-bedroom apartment, and each has died a quick death.

Is it me? Or is it what I've chosen to grow on my microwave-oven-like window ledge?

In hopes of helping others like me - inexperienced but eager planters who don't live in ideal conditions for growing green - I turned to Leslie Land, blogger for The Daily Green, and lead author of The New York Times 1000 Gardening Questions and Answers.

Land warned me that nothing is completely trouble-free (you still have to water these, guys), and you're still going to need a safe outdoor spot (no fire escapes, people). She added that these choices wouldn't exactly impress a seasoned gardener. But that's OK; they would deliver on what I was looking for - they'd grow almost anywhere.

1. Herbs - While many herbs need sun, Land suggests growing parsley, which tolerates partial shade, and mint, which likes things a bit shadier. Land adds that in addition to being a wonderful fresh herb (don't forget to use those sweet stems!), giant flat leaf parsley also makes an excellent filler for flower arrangements.

2. Cherry Tomatoes - If you have a sunny spot, enough space for a whiskey barrel-sized container, and a 5-foot support, try planting an "indeterminate" cherry tomato plant. This plant will keep getting bigger all summer. Land points out you'll get a lot more yield for your space compared to a regular tomato plant.

3. Dwarf Evergreens - Who wouldn't want a little evergreen forest next to her humble home? Land says you'll have to go to a specialty nursery to find these little treasures, but that they are far less labor intensive than bonsai trees. Dwarf evergreens need to be in a sheltered location and not in direct sun.

4. Coreopsis - This long blooming perennial does very well in window boxes, according to Land. It's a sun-loving flower plant. Sign me up.

5. Coral Bells - Land says these are beautiful even when they're not flowering. They're a great decorative option and they do best in partial shade. Land emphasized these would grow anywhere. "Even Alaska?," I asked. "Well," she answered, "parts of it."

6. Sedum - There are many different types of sedum in all different sizes. Almost all are drought-resistant, and seldom bothered by insects or disease, which is about as trouble-free as it gets.

Original here

The Cheeseburger Footprint

by Jamais Cascio

We're growing accustomed to thinking about the greenhouse gas impact of transportation and energy production, but nearly everything we do leaves a carbon footprint. If it requires energy to make or do, chances are, some carbon was emitted along the way. But these are the early days of the climate awareness era, and it's not yet habit to consider the greenhouse implications of otherwise prosaic actions.

So as an exercise, let's examine the carbon footprint of something commonplace -- a cheeseburger. There's a good chance you've eaten one this week, perhaps even today. What was its greenhouse gas impact? Do you have any idea? This is the kind of question we'll be forced to ask more often as we pay greater attention to our individual greenhouse gas emissions.

Burgers are common food items for most people in the US -- surprisingly common. Estimates for the average American diet range from an average of about one per week, or about 50/year (Fast Food Nation) to as many as three burgers per week, or roughly 150/year (the Economist, among other sources). So what's the global warming impact of all those cheeseburgers? I don't just mean cooking the burger; I mean the gamut of energy costs associated with a hamburger -- including growing the feed for the cattle for beef and cheese, growing the produce, storing and transporting the components, as well as cooking.

The first step in answering this question requires figuring out the life cycle energy of a cheeseburger, and it turns out we're in luck. Energy Use in the Food Sector (PDF), a 2000 report from Stockholm University and the Swiss Federal Institute of Technology, does just that. This highly-detailed report covers the myriad elements going into the production of the components of a burger, from growing and milling the wheat to make bread, to feeding, slaughtering and freezing the cattle for meat -- even the energy costs of pickling cucumbers. The report is fascinating in its own right, but it also gives us exactly what we need to make a relatively decent estimation of the carbon footprint of a burger.

Overall, the researchers conclude that the total energy use going into a single cheeseburger amounts to somewhere between about 7 and 20 megajoules (the range comes from the variety of methods available to the food industry).

The researchers break this down by process, but not by energy type. Here, then, is a first approximation: we can split the food production and transportation uses into a diesel category, and the food processing (milling, cooking, storage) uses into an electricity category. Split this way, the totals add up thusly:

Diesel -- 4.7 to 10.8 MJ per burger
Electricity -- 2.6 to 8.4 MJ per burger

With these ranges in hand, we can then convert the energy use into carbon dioxide emissions, based on fuel. Diesel is straightforward. For electricity, we should calculate the footprint using both natural gas and coal, as their carbon emissions vary considerably. (If you're lucky enough to have your local cattle ranches, farms and burger joints powered by a wind farm, you can drop that part of the footprint entirely.) The results:

Diesel -- 350 to 800 grams of carbon dioxide per burger
Gas -- 416 to 1,340 grams of carbon dioxide per burger
Coal -- 676 to 2,200 grams of carbon dioxide per burger

...for a combined carbon dioxide footprint of a cheeseburger of 766 grams of CO2 (at the low end, with gas) to 3,000 grams of CO2 (at the high end, with coal). Adding in the carbon from operating the restaurant (and driving to the burger shop in the first place), we can reasonably call it somewhere between 1 kilogram and 3.5 kilograms of energy-based carbon dioxide emissions per cheeseburger.
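
For readers who want to replay the arithmetic, here is a small Python sketch. The per-MJ emission factors are back-calculated from the article's own figures (roughly 74, 160 and 260 g CO2 per MJ for diesel, gas-fired electricity and coal-fired electricity), so treat them as approximations rather than official inventory values:

```python
# Reproduces the article's energy-to-CO2 arithmetic. Emission factors
# (g CO2 per MJ) are back-calculated from the article's own numbers,
# not taken from an official inventory.
EMISSIONS_G_PER_MJ = {"diesel": 74, "gas": 160, "coal": 260}

DIESEL_MJ = (4.7, 10.8)    # per-burger diesel energy (low, high)
ELECTRIC_MJ = (2.6, 8.4)   # per-burger electricity (low, high)

for fuel in ("gas", "coal"):
    low, high = (d * EMISSIONS_G_PER_MJ["diesel"] + e * EMISSIONS_G_PER_MJ[fuel]
                 for d, e in zip(DIESEL_MJ, ELECTRIC_MJ))
    print(f"electricity from {fuel}: {low:.0f}-{high:.0f} g CO2 per burger")
# -> roughly 764-2143 g (gas) and 1024-2983 g (coal), matching the
#    article's ~766-3000 g range before restaurant overhead is added.
```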

But that's not the whole story. There's a little thing called methane, a greenhouse gas that is, pound for pound, about 23 times more potent than carbon dioxide. It's also something that cattle produce in abundance.

By regulation, a beef cow must be at least 21 months old before going to the slaughterhouse; let's call it two years. A single cow produces about 110 kilos of methane per year in manure and what the EPA delicately calls "enteric fermentation," so over its likely lifetime, a beef cow produces 220 kilos of methane. Since a single kilo of methane is the equivalent of 23 kilos of carbon dioxide, a single beef cow produces a bit more than 5,000 CO2-equivalent kilograms of methane over its life.

A typical beef cow produces approximately 500 lbs of meat for boneless steaks and ground beef. If we assume that the typical burger is a quarter-pound of pre-cooked meat, that's 2,000 burgers per cow. Dividing the methane total by the number of burgers, then, we get about 2.6 CO2-equivalent kilograms of additional greenhouse gas emissions from methane, per burger, or roughly as much greenhouse gas produced from cow burps (etc.) as from all of the energy used to raise, feed or produce all of the components of a completed cheeseburger!
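
The same methane arithmetic, spelled out as a short sketch using only the figures quoted above:

```python
# The cow-methane arithmetic from the paragraphs above, spelled out.
METHANE_KG_PER_COW_YEAR = 110    # manure plus "enteric fermentation"
COW_LIFETIME_YEARS = 2           # slaughtered at about two years old
GWP_METHANE = 23                 # kg CO2-equivalent per kg of methane
BURGERS_PER_COW = 500 / 0.25     # 500 lb of meat, quarter-pound burgers

co2e_per_cow = METHANE_KG_PER_COW_YEAR * COW_LIFETIME_YEARS * GWP_METHANE
co2e_per_burger = co2e_per_cow / BURGERS_PER_COW
print(f"{co2e_per_cow:.0f} kg CO2e per cow, {co2e_per_burger:.2f} kg per burger")
# -> 5060 kg per cow and ~2.5 kg per burger, which the article
#    rounds to "about 2.6".
```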

That's a total of 3.6-6.1 kg of CO2-equivalent per burger. If we accept the ~3/week number, that's 540-915 kg of greenhouse gas per year for an average American's burger consumption. And for the nation as a whole?

300,000,000 citizens
* 150 burgers/year
* 4.35 kilograms of CO2-equivalent per burger
/ 1000 kilograms per metric ton
= 195,750,000 annual metric tons of CO2-equivalent for all US burgers
That's at a lower-than-average level of kg/burger.

Even with the lower claim of one cheeseburger per week, for an average American, the numbers remain sobering.

300,000,000 citizens
* 50 burgers/year (~Fast Food Nation)
* 4.35 kilograms of CO2-equivalent per burger
/ 1000 kilograms per metric ton
= 65,250,000 annual metric tons of CO2-equivalent for all US burgers
Those numbers are big, impressive, and probably meaningless.

So let's convert that to something more visceral. Let's compare to the output from a more familiar item: an SUV.

A Hummer H3 SUV emits 11.1 US (short) tons of CO2 over a year; this converts to about 10.1 metric tons, so we'll call it 10 to make the math easy.

195,750,000 annual metric tons of CO2-equivalent for all US burgers
/10 metric tons of CO2-equivalent per SUV

=19.6 million SUVs

----or----
65,250,000 annual metric tons of CO2-equivalent for all US burgers
/10 metric tons of CO2-equivalent per SUV

=6.5 million SUVs

To make it clear, then: the greenhouse gas emissions arising every year from the production and consumption of cheeseburgers are roughly the amount emitted by 6.5 million to 19.6 million SUVs. There are approximately 16 million SUVs currently on the road in the US.

Will this information alone make a difference? Probably not; after all, nutrition info panels on packaged foods didn't turn us all into health food consumers. But they will allow us more informed choices, with no appeals to not knowing the consequences of our actions.

This was, ultimately, an attempt to take a remarkably prosaic activity and parse out its carbon aspects. After all, we're all increasingly accustomed to recognizing obvious, direct carbon emissions, but we're still wrapping our heads around the secondary and tertiary sources. Exercises like this one help to reveal the less-obvious ways that our behaviors and choices impact the planet and our civilization.

I doubt that we'll have to go through this process with everything we eat, from now until the end of the world. As our societies become more conscious of the impact of greenhouse gases, and the need for very tight and careful controls on just how much carbon we dump into the air, we'll need to create mechanisms for carbon transparency. Be they labels, icons, color-codes, or arphid, we'll need to be able to see, at a glance, just how much of a hit our personal carbon budgets take with each purchase.

The Cheeseburger Footprint is about much more than raw numbers. It's about how we live our lives, and the recognition that every action we take, even the most prosaic, can have unexpectedly profound consequences. The article was meant to poke us in our collective ribs, waking us up to the effects of our choices.

Original here

BMW Mini Electric Cars Available in U.S. From Summer 2009

Which is the better source for milk?

You've already weighed in on the question of whether veganism or vegetarianism is better for the environment. But I want more specifics: Which is better for the environment, soy milk or cow's milk?

First, a disclosure: The Green Lantern can't really stomach lactose and rarely drinks milk. But he isn't too keen on the taste of soy milk, either, so consider him a neutral arbiter when he concludes that soy is the somewhat more eco-conscious choice. That said, it's not easy to compare the two products: Soy milk may be packaged and marketed as a substitute for dairy, but environmentally speaking, it's a very different product. Start with the basics: The calcium in soy milk has to be artificially added, and you won't get anything remotely resembling milk from soy until you've ground up the beans; removed a fiber the Japanese call okara; and added water, vitamins, minerals, and sugar. Most cow's milk needs to be pasteurized and packaged, of course, but what you buy in the store is much closer to what comes off the farm.

We'll begin with the place where soy milk and cow's milk are most similar: as a source of protein. The Lantern has already discussed some of the environmental costs that come with raising cows—they require an enormous amount of energy to feed, they produce lots of waste, and they're a major emitter of methane. Dairy production is much more energy-efficient than raising cattle for meat, since you get more use out of each cow, but it's not as clean as growing crops. According to research (PDF) by Cornell University scientist David Pimentel, it takes about 14 calories of fossil-fuel energy to produce one calorie of milk protein on a conventional farm. Organically produced milk might require a little less than 10 calories of fossil-fuel energy, under the most optimistic assumptions, and better farming techniques could cut down greenhouse-gas emissions by at least 25 percent.

By comparison, Pimentel's data suggest that it takes about 0.26 calories of fossil fuel to make a calorie of organic soybeans—which are used by most soy milk manufacturers. Soy protein accounts for about 35 percent of those calories, so let's say you'll need to put 0.75 calories of energy into farming soy to produce a calorie of protein. That makes soy protein approximately 13 times more energy-efficient than even organic dairy protein under a best-case scenario. (Producing a kilogram of soybeans also yields significantly less greenhouse-gas emission than producing a kilogram of milk.)
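
The 13x figure falls straight out of those numbers; a quick check, using Pimentel's figures as quoted:

```python
# Energy efficiency of soy protein vs organic dairy protein,
# using the figures quoted in the column.
MILK_FOSSIL_PER_PROTEIN_CAL = 10    # organic dairy, best case
SOY_FOSSIL_PER_CROP_CAL = 0.26      # organic soybeans
SOY_PROTEIN_FRACTION = 0.35         # share of soybean calories from protein

soy_fossil_per_protein_cal = SOY_FOSSIL_PER_CROP_CAL / SOY_PROTEIN_FRACTION
ratio = MILK_FOSSIL_PER_PROTEIN_CAL / soy_fossil_per_protein_cal
print(f"soy protein is ~{ratio:.0f}x more energy-efficient")  # ~13x
```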

Of course, as we've already discussed, you don't drink raw soy beans. Not only do the other ingredients in soy milk need to be shipped from elsewhere; the process of adding them requires energy and produces a significant amount of waste. As one British government report (PDF) put it, manufacturing soy milk is closer to making fruit juice than cow's milk. (And as the Lantern has noted before, producing fruit juice takes quite a bit of electricity.) Depending on how you feel about carbon offsets, the makers of Silk-brand soy milk—which accounts for about two-thirds of America's soy milk—may make the calculus easier since the company says it purchases wind power to compensate for the energy it uses in production.

Indeed, Silk's green marketing efforts offer an interesting case study in how a product's makeup isn't its environmental destiny. Despite animal-rights advocates' love of soy products, soybean producers aren't exactly the darlings of the environmental movement. The vast majority of soybeans are genetically modified (PDF); the Green Lantern may be agnostic about whether that's so bad, but the fact certainly hasn't made them popular among many green activists. In South America, farmers responding to the massive global demand for soy—fueled in large part by China—have been accused of doing lasting damage to the Amazon rainforest.

But because soy milk drinkers are, on the whole, an eco-conscious bunch, the products they buy tend to be made in a more sustainable way. In addition to the wind-power initiative, Silk's product line is almost entirely organic. Silk isn't perfect: The company is a bit evasive about what percentage of their soybeans are imported from abroad—a spokeswoman was unable to give the Lantern specific numbers—and its sale to the nation's largest milk processor in 2002 raised some eyebrows. But, generally speaking, the fact that its consumers are environmentally savvy probably makes for a greener product.

Nonetheless, niche products have their environmental downsides, too. A specialty product like soy milk—despite its growing popularity, its market is about one-twentieth the size of regular milk's—is probably going to have to travel farther to the average consumer simply because fewer people produce it. With more centralized production, that means the soy travels farther to the plant, too. Production probably has a much bigger environmental impact than transportation, but it's worth keeping in mind: Environmentally, it hurts to be in the minority.

Original here

Giant Twist Freedom DX Electric Bike Test Drive: A Flying Good Time—Just Don't Call It a Workout (With Video!)

NEW YORK — In March, we got the first look at the new Giant Twist Freedom DX bicycle, an electric bike that uses a cyborglike mix of muscle and machine power to propel riders as far as 75 miles on a single charge. At $2000, it's a pricey bike, but Giant is touting this ride as a cost-effective alternative for commuters upset about record gas prices. (Hey, it worked for Segway—not to mention the entire cycling industry.) We'll know soon enough if consumers agree: The bike is just now beginning to ship to retailers, and we recently took an exclusive drive. So check out feet-on video here, then read our full review below. —Seth Porges



The Specs

The motor can be set to one of three modes: Eco, Normal and Sport. Eco puts more emphasis on pedaling and less on the motor, while Sport gives your feet a rest and kicks the power up—not that there's a huge difference between the settings. But that may have more to do with Central Park's relatively flat terrain than with anything else.

But the chief selling point here is the bike's power-assist functionality, which offers extra muscle for climbing hills. The key to operation is an integrated torque sensor that measures how much strength you're putting into your pedaling and provides an appropriate amount of supplemental power. When you reach a hill, the sensor detects the change in your pedaling and steps up the assistance accordingly.

And the power assist is easy enough to use: just flip a switch, and it turns on. With two batteries parked above the back wheel, the switch allows you to choose which one to pull power from (a meter shows how much power remains in each). Giant claims that when both batteries are charged the bike can "assist" you for about 75 miles. After that, it's all up to your legs.

The Drive

When the bike is "on," the only indication is a status light. The Twist Freedom DX doesn't perform in any out-of-the-ordinary manner until you hit an incline. But the second you do, the motor adds power to your wheels, and your feet suddenly feel like they're, well, flying. The process is very seamless, and actually quite fun. The motor leaves some slack for your feet to pull and keep pedaling, so they're never circling wildly, and you never feel as though you're losing control. It's not a moped, so unless you're cruising down a hill, your legs will be doing at least some of the work.

On a ride around Central Park here, the bike provided a fun and relaxing ride. Normally, a bike that weighs 50 pounds (with batteries) would be painful to ride after a few sharp inclines, but the Twist's power assist allowed us to conquer hills with ease. The motor engaged without being intrusive, but we never felt that we could remove our feet from the pedals and ride the bike like a scooter. It simply made the overall ride less exhausting. In fact, although it was about 90 degrees and humid during our ride, we barely broke a sweat over the course of a several-mile ride.

The Bottom Line

But a sweat-free bike ride is something of a mixed blessing, since fitness can be as important as transport when it comes to bicycles. It's fair to expect that a large segment of the biking community will shun the new technology. There's just something very un-bike-like about the whole experience.

Overall, the bike doesn't provide enough power to satisfy those used to Ducatis (or even Vespas), but it will make a long-haul or uphill commute easier and leave you less sweaty when you hit the office. Just don't expect a good workout.

Original here

Solar power from Saharan sun could provide Europe's electricity, says EU

Alok Jha, science correspondent

A concentrating solar power (CSP) plant in Spain that uses panels to reflect light on to a central tower to produce electricity. Similar plants are proposed for north Africa. Photograph: AP

A tiny rectangle superimposed on the vast expanse of the Sahara captures the seductive appeal of the audacious plan to cut Europe's carbon emissions by harnessing the fierce power of the desert sun.

Dwarfed by any of the north African nations, it represents an area slightly smaller than Wales but scientists claimed yesterday it could one day generate enough solar energy to supply all of Europe with clean electricity.

Speaking at the Euroscience Open Forum in Barcelona, Arnulf Jaeger-Walden of the European commission's Institute for Energy, said it would require the capture of just 0.3% of the light falling on the Sahara and Middle East deserts to meet all of Europe's energy needs.

The scientists are calling for the creation of a series of huge solar farms - producing electricity either through photovoltaic cells, or by concentrating the sun's heat to boil water and drive turbines - as part of a plan to share Europe's renewable energy resources across the continent.

A new supergrid, transmitting electricity along high-voltage direct current cables, would allow countries such as the UK and Denmark ultimately to export wind energy at times of surplus supply, as well as import from other green sources such as geothermal power in Iceland.

Energy losses on DC lines are far lower than on traditional AC lines, whose losses make transmission of energy over long distances uneconomic.

The grid proposal, which has won political support from both Nicolas Sarkozy and Gordon Brown, answers the perennial criticism that renewable power will never be economic because the weather is not sufficiently predictable. Its supporters argue that even if the wind is not blowing hard enough in the North Sea, it will be blowing somewhere else in Europe, or the sun will be shining on a solar farm somewhere.

Scientists argue that harnessing the Sahara would be particularly effective because the sunlight in this area is more intense: solar photovoltaic (PV) panels in northern Africa could generate up to three times the electricity compared with similar panels in northern Europe.

Much of the cost would come in developing the public grid networks of the connecting countries in the southern Mediterranean, which do not currently have the spare capacity to carry the electricity that the north African solar farms could generate. Even if high-voltage cables were built between north Africa and Italy, or the existing cable between Morocco and Spain were used, the infrastructure of transit countries such as Italy, Spain, Greece and Turkey would also need major restructuring, according to Jaeger-Walden.

Southern European countries including Portugal and Spain have already invested heavily in solar energy, and Algeria has begun work on a vast combined solar and natural gas plant which will begin producing energy in 2010. Algeria aims to export 6,000 megawatts of solar-generated power to Europe by 2020.

Scientists working on the project admit that it would take many years and huge investment to generate enough solar energy from north Africa to power Europe but envisage that by 2050 it could produce 100 GW, more than the combined electricity output from all sources in the UK, with an investment of around €450bn.

Doug Parr, Greenpeace UK's chief scientist, welcomed the proposals: "Assuming it's cost-effective, a large-scale renewable energy grid is just the kind of innovation we need if we're going to beat climate change."

Jaeger-Walden also believes that scaling up solar PV by having large solar farms could help bring its cost down for consumers. "The biggest PV system at the moment is installed in Leipzig and the price of the installation is €3.25 per watt," he said. "If we could realise that in the Mediterranean, for example in southern Italy, this would correspond to electricity prices in the range of 15 cents per kWh, something below what the average consumer is paying."
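
Getting from a price per installed watt to a price per kilowatt-hour requires assumptions the article doesn't spell out. A minimal back-of-envelope sketch, assuming a southern-Italian annual yield and a 20-year system life (both illustrative assumptions, not figures from the article):

```python
# Hypothetical back-of-envelope: installation price per watt to
# electricity price per kWh. Yield and lifetime are assumptions.
PRICE_PER_WATT_EUR = 3.25       # Leipzig installation (from the article)
YIELD_KWH_PER_W_YEAR = 1.4      # assumed annual yield, southern Italy
LIFETIME_YEARS = 20             # assumed system lifetime

cost_per_kwh = PRICE_PER_WATT_EUR / (YIELD_KWH_PER_W_YEAR * LIFETIME_YEARS)
print(f"~{cost_per_kwh * 100:.0f} euro cents per kWh")  # ~12 cents
# Financing and maintenance plausibly push this toward the ~15
# cents/kWh Jaeger-Walden quotes.
```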

The vision for the renewable energy grid comes as the commission's joint research centre (JRC) published its strategic energy technology plan, highlighting solar PV as one of eight technologies that need to be championed for the short- to medium-term future.

"It recognises something extraordinary - if we don't put together resources and findings across Europe and we let go the several sectors of energy, we will never reach these targets," said Giovanni de Santi, director of the JRC, also speaking in Barcelona.

The JRC plan includes fuel cells and hydrogen, clean coal, second generation biofuels, nuclear fusion, wind, nuclear fission and smart grids. De Santi said it was designed to help Europe to meet its commitments to reduce overall energy consumption by 20% by 2020, while reducing CO2 emissions by 20% in the same time and increasing to 20% the proportion of energy generated from renewable sources.

Original here

Half the Amazon Rainforest to be Lost by 2030

(NaturalNews) Due to the effects of global warming and deforestation, more than half of the Amazon rainforest may be destroyed or severely damaged by the year 2030, according to a report released by the World Wildlife Fund (WWF).

The report, "Amazon's Vicious Cycles: Drought and Fire," concludes that 55 percent of the world's largest rainforest stands to be severely damaged from agriculture, drought, fire, logging and livestock ranching in the next 22 years. Another 4 percent may be damaged by reduced rainfall caused by global warming. This is anticipated to destroy up to 80 percent of wildlife habitat in the region.

By 2100, the report adds, global warming may cause rainfall in the Amazon to drop by 20 percent and temperatures to increase by 2 degrees Celsius (3.6 degrees Fahrenheit). This combination will increase the occurrence of forest fires, further accelerating the pace of deforestation.

The Amazon contains more than half of the planet's surviving rainforest and is a key stabilizer of global climate. The report notes that losing 60 percent of it would accelerate the pace of global warming, affecting rainfall as far away as India.

WWF warned that the "point of no return" for the Amazon rainforest, from which ecological recovery will be impossible, is only 15-25 years in the future, much sooner than has previously been supposed.

"The Amazon is on a knife-edge," said WWF-UK forests head Beatrix Richards, "due to the dual threats of deforestation and climate change."

She called for the countries discussing global climate change at an international conference in Bali to take the importance of forests into account.

"At the international negotiations currently underway in Bali, governments must agree a process which results in ambitious global emission reduction targets beyond the current phase of Kyoto," she said. "Crucially, this must include a strategy to reduce emissions from forests and help break the cycle of deforestation."

Original here

Windows that are also Solar Panels

MIT researchers may have figured out a cost-friendly and fashionable alternative to regular solar panels. The secret is dye-colored glass.

The MIT method uses a solar concentrator, which can collect light and send it over longer distances: in this case, across a window to solar cells at the window's edge.

"Once the light is trapped inside, a major loss mechanism is that it can be reabsorbed by another dye molecule on its way out," MIT researcher Jon Mapel -- one of the study's authors -- told TechNewsWorld. "Every time that happens, there's a chance that it can get lost. It ends up going through a loss-absorption-reemission cycle, and eventually you lose too many of them and not enough get to the edges," he explained.

That problem has led to major research, and now a solution.

A mixture of two or more dyes, rather than just one, is painted onto a pane of glass or plastic. Different dyes absorb different wavelengths of light, and also re-emit it at different wavelengths.

The solar concentrator produces 10 times more energy than current systems, so, hypothetically, it could be sold for a fraction of the price. "Since you're using a lot less solar cells, you can potentially reduce the cost of solar electricity," Mapel said.

This new system also holds greater potential for the homeowners' market. No one wants large, unsightly, and expensive solar panels on their home. But if your windows could secretly be solar panels, why not get them? And if you don't mind the regular old solar panels, using both together could increase energy output by a big margin.

This is because solar panels are better at absorbing infrared light than visible light, while these new concentrators get more power from visible light. Combined, however, they could achieve up to 50 percent higher conversion efficiency.

The MIT team estimates the products could become readily available within the next three years.

However, MIT is not the only group to have figured it out. In another part of the world, researchers at the Fraunhofer Institute for Solar Energy Systems ISE have built a prototype that combines nanoparticles and organic dyes to create solar panels that could eventually be any color, or even feature decorative images or text.

Their use of dyes likewise applies to window solar panels rather than rooftop panels, as with the MIT work.

They presented their prototype in February at the Nanotech 2008 conference in Tokyo.

Original here

Bush Cronies Tried To Redefine ‘Carbon Dioxide’ To Save Power Plants From Emissions Regulations

Earlier this month, former EPA official Jason Burnett wrote to Sen. Barbara Boxer (D-CA) with explosive revelations on how the White House has neutered climate change science to protect corporate interests. For example, OMB general counsel Jeffrey Rosen asked for multiple memos on whether carbon dioxide (CO2) from cars and power plants could be regulated differently.

In a Senate hearing today, Burnett further explained that under the Clean Air Act, “after a pollutant is a regulated pollutant, controls are required on a variety of sources.” During the “inter-agency process,” Burnett said, OMB officials looked for ways to define CO2 from power plants as different from CO2 from automobiles, in order to shield industrial power plants from regulation under the landmark Supreme Court decision Massachusetts v. EPA:

BURNETT: There was quite a bit of effort and interest to see whether the Supreme Court case itself and regulation of CO2 and other greenhouse gases from automobiles be restricted to just automobiles. … So there’s an interest to determine whether we could define CO2 from automobiles as somehow different than CO2 from power plants, for example –

SEN. KLOBUCHAR: Do you think that’s possible?

BURNETT: Clearly it wasn’t supportable.

It is common knowledge that carbon dioxide is the same chemical regardless of what source emits it. But for the White House, which unabashedly asserts its anti-environment agenda, the definition of CO2 can change to help big polluters.

“I must say that it was sometimes somewhat embarrassing,” Burnett admitted, “for me to return to EPA and ask for my colleagues to explain yet again that CO2 is a molecule and there is no scientific way of differentiating between CO2 from a car and a power plant.”

Original here

Saharan sun to power European supergrid

Alok Jha, science correspondent

A worker tends to the world's largest solar plant in Germany. Photo: Waltraud Grubitzsch/EPA/Corbis

Vast farms of solar panels in the Sahara desert could provide clean electricity for the whole of Europe, according to EU scientists working on a plan to pool the region's renewable energy.

Harnessing the power of the desert sun is at the centre of an ambitious scheme to build a €45bn (£35.7bn) European supergrid that would allow countries across the continent to share electricity from abundant green sources such as wind energy in the UK and Denmark and geothermal energy from Iceland and Italy.

The idea is gaining political support in Europe, with both Gordon Brown and Nicolas Sarkozy recently giving backing to the north African solar plan.

Speaking today at the Euroscience Open Forum in Barcelona, Arnulf Jaeger-Walden of the European commission's Institute for Energy said that capturing just 0.3% of the light falling on the Sahara and Middle Eastern deserts would be enough to provide all of Europe's energy needs.
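
That 0.3% figure can be sanity-checked with a back-of-envelope calculation. The sketch below uses round assumed numbers for desert area, insolation and panel efficiency; none of them come from Jaeger-Walden:

    # Order-of-magnitude check on the 0.3% claim. Every input is an
    # assumed round number, not a figure from the article.

    desert_area_km2   = 9_000_000   # rough area of the Sahara alone
    captured_fraction = 0.003       # the 0.3% quoted above
    insolation_kwh_m2 = 2_000       # annual solar energy per square metre
    pv_efficiency     = 0.15        # typical 2008-era PV module efficiency

    area_m2    = desert_area_km2 * 1e6 * captured_fraction
    output_twh = area_m2 * insolation_kwh_m2 * pv_efficiency / 1e9  # kWh -> TWh

    print(f"covered area:  {area_m2 / 1e6:,.0f} km2")   # 27,000 km2
    print(f"annual output: {output_twh:,.0f} TWh")      # ~8,100 TWh
    # Europe's annual electricity demand is a few thousand TWh, so the
    # claim is at least the right order of magnitude.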

In addition, because the sunlight in this area is more intense, solar photovoltaic (PV) panels in northern Africa could generate up to three times the electricity compared with similar panels in northern Europe.

Jaeger-Walden explained how electricity produced in solar farms in Africa, each containing power plants generating around 50-200MW of power, could be fed thousands of miles across European countries by using high-voltage direct current transmission lines instead of traditional alternating current lines. Energy losses on DC lines are far lower than on AC lines, whose losses make transmission over long distances uneconomic.

"If you look at solar radiation, then the Mediterranean region is a very favourable one," said Jaeger-Walden.

He said that the proposed grid was a way to balance out the intermittencies of renewable energy: "If you can connect the grid to hydro power, you've got that as a backup battery, and in addition there's wind. It's not a single source that's providing the energy but a combination of the different renewable energies."

Conveniently, the potential to generate solar energy, either from photovoltaic cells or by using the sun to heat water, is at its highest exactly when demand peaks. "Between 11am and 1pm there are a lot of cooking activities going on, people are going home, air conditioners are used," he said.

The idea of developing solar farms in the Mediterranean region and north Africa was given a boost earlier this month when French president Nicolas Sarkozy highlighted them as a key part of the work of his newly formed Mediterranean Union.

Depending on the size of the grid, building the necessary high-voltage lines across Europe could cost up to €1bn a year until 2050, but Jaeger-Walden pointed out that the figure is small when compared to a recent prediction by the International Energy Agency that the world needs to invest more than $45tr (£22.5tr) in energy systems over the next 30 years.

Much of the cost would come in developing the public grid networks of the connecting countries in the southern Mediterranean, which do not currently have the spare capacity to carry the electricity that the north African solar farms could generate.

"Even if high voltage cables between North Africa and Italy would be built or the existing cable between Morocco and Spain would be used, the infrastructure of the transfer countries such as Italy, Spain, Greece and Turkey also needs a major restructuring," said Jaeger-Walden.

Scientists working on the project admit that it would take many years and huge investment to generate enough solar energy from north Africa to power Europe but envisage that by 2050 it could produce 100 GW, more than the combined electricity output from all sources in the UK, with an investment of around €450bn.

Doug Parr, Greenpeace UK's chief scientist, welcomed the proposals. "Assuming it's cost-effective, a large scale renewable energy grid is just the kind of innovation we need if we're going to beat climate change. Europe needs to become a zero-carbon society as soon as possible, and that will only happen with bold new ideas like this one. Tinkering with 20th-century technologies like coal and nuclear simply isn't going to get us there."

Jaeger-Walden also believes that scaling up solar PV by having large solar farms could help bring its cost down for consumers. "The biggest PV system at the moment is installed in Leipzig and the price of the installation is €3.25 per watt. If we could realise that in the Mediterranean, for example in southern Italy, this would correspond to electricity prices in the range of 15 cents per kWh, something below what the average consumer is paying."
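
The jump from an installed price in euros per watt to a price per kilowatt-hour follows from a simple levelised-cost calculation. Here is a minimal sketch; the capacity factor and lifetime are assumed values, and financing, maintenance and degradation are ignored:

    # Naive levelised cost of PV electricity: installed cost divided by
    # lifetime output. Capacity factor and lifetime are assumptions.

    install_cost_eur_per_w = 3.25   # the Leipzig figure quoted above
    capacity_factor        = 0.17   # assumed for southern Italy
    lifetime_years         = 25     # assumed system lifetime
    hours_per_year         = 8760

    lifetime_kwh_per_w = capacity_factor * hours_per_year * lifetime_years / 1000
    cost_per_kwh       = install_cost_eur_per_w / lifetime_kwh_per_w

    print(f"{cost_per_kwh * 100:.1f} euro cents per kWh")  # ~8.7 cents
    # Financing costs, maintenance and panel degradation roughly double
    # this in practice, landing near the quoted 15 cents per kWh.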

The vision for the renewable energy grid comes as the European Commission's Joint Research Centre (JRC) published its strategic energy technology plan, highlighting solar PV as one of eight technologies that need to be championed for the short to medium term future.

"It recognises something extraordinary – if we don't put together resources and findings across Europe and we let go the several sectors of energy, we will never reach these targets. We need a coordination of research applied to different fields," said Giovanni de Santi, director of the JRC, also speaking in Barcelona.

The JRC plan includes fuel cells and hydrogen, clean coal, second-generation biofuels, nuclear fusion, wind, nuclear fission and smart grids. De Santi said it was designed to help Europe to meet its commitments to reduce overall energy consumption by 20% by 2020 while reducing CO2 emissions by 20% in the same time and increasing to 20% the proportion of energy generated from renewable sources.

High-voltage transmission lines

First developed in the 1930s, high-voltage direct current (HVDC) transmission lines are seen as the most efficient way to move electricity over long distances without incurring the losses experienced in normal AC power lines. HVDC cables can carry more power for the same thickness of cable compared with AC lines, but are only suited to long-distance transmission because they require expensive devices called static inverters to convert the electricity, usually generated as AC, into DC. Modern HVDC cables can keep energy losses down to around 3% per 1,000km.
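
Because that 3% figure compounds with distance, the delivered fraction for a Sahara-to-central-Europe run is easy to estimate. A quick sketch, with the distances chosen purely as examples:

    # Compounding HVDC losses at ~3% per 1,000 km, the figure quoted
    # above. The example distances are assumptions.

    def delivered_fraction(distance_km, loss_per_1000km=0.03):
        """Fraction of sent power that arrives after line losses."""
        return (1 - loss_per_1000km) ** (distance_km / 1000)

    for km in (1000, 2000, 3000):
        print(f"{km:>5} km: {delivered_fraction(km):.1%} delivered")
    # 1000 km: 97.0%   2000 km: 94.1%   3000 km: 91.3%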

Another advantage of HVDC is that it can be used as a link to transfer electricity between different countries that might use AC systems at differing frequencies. Alternatively, the HVDC cables could be used to synchronise the AC currents produced by renewable energy sources such as wind turbine farms.

Original here

Wind power: A reality check

Plans are afoot to prod the nation into using much more renewable energy. Can it be done, and what's the cost?

By Steve Hargreaves, CNNMoney.com staff writer

NEW YORK (CNNMoney.com) -- High-profile personalities have been telling the nation to ditch that dirty fossil fuel and turn to renewable energy.

T. Boone Pickens, the billionaire oilman, has been hitting the airwaves, pitching a plan to use wind to replace all the natural gas that's used to produce electricity, then using that saved natural gas to fuel cars.

Pickens' plan is not entirely altruistic, however. Beyond weaning the nation from foreign oil, it stands to benefit him personally: he's investing hundreds of millions of dollars in a giant wind farm in the Texas panhandle, and his hedge fund, BP Capital, is said to own stakes in several companies that equip cars to run on natural gas. If his energy efforts pan out, he could get even richer in the process.

Then there's Al Gore. The former U.S. vice president and Nobel Prize winner said last week that electricity generation should be completely fossil-fuel free in 10 years.

The question is, are these plans realistic or just dreams?

"It's not out of the realm of technical feasibility," said Chris Namovicz, a renewable energy analyst at the government's Energy Information Agency. "But they come with pretty significant price tags."

The order is indeed tall.

The nation currently relies on coal - the dirtiest of all fossil fuels - for 50% of its electricity production. Natural gas makes up about 21%, and nuclear power comprises about 20%. Hydro and oil each contribute a bit as well, while traditional renewables - wind, solar, biomass and geothermal - ring in at only 3% combined, according to the EIA.

Pickens has a loosely detailed plan to replace the natural-gas-produced electricity with wind energy. He says it could be done in 10 years.

"That is extremely aggressive," said Dave Hamilton, director for global warming and energy projects at the Sierra Club. "But it's in the right direction. It's a good thing we have an oilman saying we can't drill our way out of this problem."

Unpredictable wind

One of the big challenges with using wind to replace natural gas is that, unlike the steady flame from natural gas, the wind doesn't blow all the time.

To make sure enough power is available when the wind isn't blowing, backup generators would be needed, said Paul Fremont, an electric-utility analyst at the investment bank Jefferies & Co.

That could mean maintaining those natural gas plants in case of emergency, or implementing even more novel ideas like systems in Europe that use excess wind electricity to pump water uphill when the wind is blowing, then release it through hydro dams when the wind stops.
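
The storage trick in that last idea is plain gravitational potential energy, E = mgh, minus round-trip losses. A minimal sketch with illustrative reservoir numbers, none of them from the article:

    # Energy stored by pumping water uphill: E = m * g * h.
    # Reservoir volume, head and efficiency are illustrative assumptions.

    water_m3       = 1_000_000   # one million cubic metres of water
    head_m         = 300         # height difference between reservoirs
    round_trip_eff = 0.75        # typical pumped-hydro round-trip efficiency
    g              = 9.81        # gravitational acceleration, m/s^2

    energy_j   = water_m3 * 1000 * g * head_m   # 1 m3 of water ~ 1,000 kg
    stored_mwh = energy_j / 3.6e9               # joules -> MWh

    print(f"potential energy: {stored_mwh:,.0f} MWh")                   # ~818 MWh
    print(f"recoverable:      {stored_mwh * round_trip_eff:,.0f} MWh")  # ~613 MWh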

Either way, any type of backup system comes with a price.

"It's very costly, and very inefficient for society as a whole," said Fremont. "Policy makers will have to decide if the benefits are worth it."

The utility industry also has reservations about using wind on a large scale, again pointing to the fact that it doesn't blow all the time.

The Sierra Club's Hamilton downplayed the problem. While it is a challenge now, he said, technological advances will allow wind farms in varying regions of the country to be tied into the same electricity grid; when some are idle, others could make up the difference.

"The more we focus on how to get this done, the quicker we'll solve our problems," he said.

Government regulations

Another impediment to large-scale wind generation is a lack of turbines and infrastructure, said Hamilton. Companies like General Electric (GE, Fortune 500), India's Suzlon and Spain's Gamesa, which make wind turbines, aren't building enough of those turbines to meet demand because government tax credits offered to energy producers expire every two years. These tax credits are a big incentive for people to invest in wind energy - Pickens would net $60 million a year, according to Jefferies' Fremont, and that is likely why he's currently pitching his plan to lawmakers.

Companies fear that, if the tax credits aren't renewed, they will be stuck with unwanted wind turbines if energy producers scale back their demand for wind power.

Also impeding the development of wind power is the fact that the government is unclear about how or whether it will regulate greenhouse gas emissions. If regulations were enacted, investments in wind energy would likely increase as utilities seek cleaner sources of power.

Wind farms also could benefit when companies or people buy carbon offsets - essentially payments to producers of clean energy and others who take steps to reduce greenhouse gases.

Despite these challenges, wind power's ability to produce 21% of the nation's electricity needs isn't out of the question. While wind currently only makes up 0.8% of the country's total electricity production, and would need to grow well over 20 times that to replace gas, it's worth noting that wind capacity has increased twelvefold since 1990, according to the EIA.
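
Those two figures imply very different compound growth rates, which is what makes the timetable so aggressive. A quick calculation, treating "since 1990" as roughly 18 years:

    # Implied compound annual growth rates for US wind generation.

    # Historical: twelvefold growth over roughly 18 years (1990-2008).
    historical = 12 ** (1 / 18) - 1

    # Pickens' plan: from 0.8% to 21% of generation (~26x) in 10 years.
    required = (21 / 0.8) ** (1 / 10) - 1

    print(f"historical growth: {historical:.0%} per year")  # ~15%
    print(f"required growth:   {required:.0%} per year")    # ~39%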

The second part of Pickens' plan - using natural gas to power vehicles - is perhaps easier.

While automakers are betting on electric cars as the vehicle of the future, those electric cars will still need backup engines to recharge the battery on long trips, at least for the foreseeable future.

Those backup engines could run on natural gas, said Julius Pretterebner, a vehicles and alternative-fuels expert at Cambridge Energy Research Associates.

Pretterebner also pointed to a host of other reasons why natural gas in cars is a good idea: It's about half as expensive as gasoline and 30% cleaner; the infrastructure to get it to service stations already exists; it's relatively cheap to convert existing cars ($500 to $2,000 per car, he said); and natural gas can be carbon neutral, if it's made from plants, a process he said requires no new technology.
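
Taken together, those numbers imply a short payback period for a typical driver. A rough sketch; the annual mileage, fuel economy and gasoline price are assumptions rather than figures from the article:

    # Payback time for a natural-gas conversion. Mileage, fuel economy
    # and the gasoline price are assumptions; the conversion cost range
    # and the half-price claim come from the article.

    miles_per_year   = 12_000
    mpg              = 25
    gasoline_per_gal = 4.00   # assumed 2008 US pump price, dollars
    ng_discount      = 0.5    # natural gas at about half the cost

    annual_fuel_cost = miles_per_year / mpg * gasoline_per_gal
    annual_savings   = annual_fuel_cost * ng_discount

    for conversion_cost in (500, 2000):   # range quoted by Pretterebner
        years = conversion_cost / annual_savings
        print(f"${conversion_cost} conversion pays back in {years:.1f} years")
    # $500 in about half a year; $2,000 in roughly two years.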

"It's maybe the best alternative fuel we have, and the quickest way to get off foreign imports," he said.

As for Gore's call, there aren't any specific measures to analyze. But if Pickens' timetable is aggressive, Gore's is like Pickens' gone wild.

"It's completely impractical to imagine that we could totally wean ourselves off fossil fuels," said Jim Owen, a spokesman for the Edison Electric Institute, the utility industry's trade association.

Impractical, maybe. But using more renewables is certainly worth looking into. The EIA estimates that by 2015, wind energy will cost 7 cents per kilowatt-hour to produce, just a half-cent more than coal or natural gas.

The EIA says that if strict greenhouse gas restrictions become law, renewables might go from 3% of the nation's electricity mix to around 25%. Coal, meanwhile, would likely fall from more than half to less than a quarter. The EIA said that under the worst-case scenario for bringing about this shift, electricity prices may double.

Given the dangers global warming may pose - U.N. scientists predict severe droughts and floods unless greenhouse gases are drastically reduced - more-expensive electricity may be a cost Americans are willing to bear.

Original here