

Saturday, February 16, 2008

Smaller Version of the Solar System Is Discovered

Astronomers said Wednesday that they had found a miniature version of our own solar system 5,000 light-years across the galaxy — the first planetary system that really looks like our own, with outer giant planets and room for smaller inner planets.

“It looks like a scale model of our solar system,” said Scott Gaudi, an assistant professor of astronomy at Ohio State University. Dr. Gaudi led an international team of 69 professional and amateur astronomers who announced the discovery at a news conference.

Their results are being published Friday in the journal Science. The discovery, they said, means that our solar system may be more typical of planetary systems across the universe than had been thought.

In the newly discovered system, a planet about two-thirds of the mass of Jupiter and another about 90 percent of the mass of Saturn are orbiting a reddish star at about half the distances that Jupiter and Saturn circle our own Sun. The star is about half the mass of the Sun.

Neither of the two giant planets is a likely abode for life as we know it. But, Dr. Gaudi said, warm rocky planets — suitable for life — could exist undetected in the inner parts of the system.

“This could be a true solar system analogue,” he said.

Sara Seager, a theorist at the Massachusetts Institute of Technology who was not part of the team, said that “right now in exoplanets we are on an inexorable path to finding other Earths.” Dr. Seager praised the discovery as “a big step in finding out if our planetary system is alone.”

Since 1995, around 250 planets outside the solar system, or exoplanets, have been discovered. But few of them are in systems that even faintly resemble our own. In many cases, giant Jupiter-like planets are whizzing around in orbits smaller than that of Mercury. But are these typical of the universe?

Almost all of those planets were discovered by the so-called wobble method, in which astronomers measure the gravitational tug of planets on their parent star as they whir around it. This technique is most sensitive to massive planets close to their stars.

The new discovery was made by a different technique that favors planets more distant from their star. It is based on a trick of Einsteinian gravity called microlensing. If, in the ceaseless shifting of the stars, two of them should become almost perfectly aligned with Earth, the gravity of the nearer star can bend and magnify the light from the more distant one, causing it to get much brighter for a few days.
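
The brightening described here follows a well-known formula for a single point lens (my addition, not part of the article): the magnification depends only on how closely the two stars align on the sky, measured in units of the lens's Einstein ring radius.

```python
import math

def magnification(u):
    """Standard point-source, point-lens microlensing magnification,
    where u is the lens-source separation in Einstein ring radii."""
    return (u * u + 2) / (u * math.sqrt(u * u + 4))

# The closer the alignment, the brighter the background star appears;
# u = 0.1 already gives roughly a tenfold boost.
for u in (1.0, 0.5, 0.1):
    print(u, round(magnification(u), 2))
```

Planets around the lensing star perturb this smooth curve, which is exactly the signature the follow-up networks hunt for.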

If the alignment is perfect, any big planets attending the nearer star will get into the act, adding their own little boosts to the more distant starlight.

That is exactly what started happening on March 28, 2006, when a star 5,000 light-years away in the constellation Scorpius began to pass in front of one 21,000 light-years more distant, causing it to flash. That was picked up by the Optical Gravitational Lensing Experiment, or Ogle, a worldwide collaboration of observers who keep watch for such events.

Ogle in turn immediately issued a worldwide call for continuous observations of what is now officially known as OGLE-2006-BLG-109. The next 10 days, as Andrew P. Gould, a professor of mathematical and physical sciences at Ohio State said, were “extremely frenetic.”

Among those who provided crucial data and appeared as lead authors of the paper in Science were a pair of amateur astronomers from Auckland, New Zealand, Jennie McCormick and Grant Christie, both members of a group called the Microlensing Follow-Up Network, or MicroFUN.

Somewhat to the experimenters’ surprise, by clever manipulation they were able to dig out of the data not just the masses of the interloper star and its two planets, but also rough approximations of their orbits, confirming the similarity to our own system. David P. Bennett, an assistant professor of astrophysics at the University of Notre Dame, said, “This event has taught us that we were able to learn more about these planets than we thought possible.”

As a result, microlensing is poised to become a major new tool in the planet hunter’s arsenal, “a new flavor of the month,” Dr. Seager said.

Only six planets, including the new ones, have been discovered by microlensing so far, and the Scorpius event being reported Friday is the first in which the alignment of the stars was close enough for astronomers to detect more than one planet at once. Their success at doing just that on their first try bodes well for the future, astronomers say.

Alan Boss, a theorist at the Carnegie Institution of Washington, said, “The fact that these are hard to detect by microlensing means there must be a good number of them — solar system analogues are not rare.”


Survival in Space Unprotected Is Possible—Briefly

But don't linger in the interstellar vacuum, or hold your breath

By Anna Gosline

As far as certain death in a science fiction plot line goes, being ejected into the vacuum of space is more than a pretty sure thing. A shove out of the air lock by a mutinous lieutenant or a vicious rip in a space suit, and your average movie victim is guaranteed to die quickly and quietly, though with fewer exploding body parts than screenwriters might have you believe.

In reality, however, animal experiments and human accidents have shown that people can likely survive exposure to vacuum conditions for at least a couple of minutes. Not that you would remain conscious long enough to rescue yourself, but if your predicament was accidental, there could be time for fellow crew members to retrieve and repressurize you with few ill effects.

"In any system, there is always the possibility of equipment failure leading to injury or death. That's just the risk you run when you are in a hostile environment and you depend upon the equipment around you," says Dartmouth Medical School professor and former NASA astronaut Jay Buckey, author of the 2006 book Space Physiology. "But if you can get to someone quickly, that is good. Often spacewalks are done with two spacewalkers and there is continuous communication. So if someone is having a problem, hopefully the other can go get them and bring them in."

Vacuums are indeed lethal: Under extremely low pressure air trapped in the lungs expands, tearing the tender gas-exchange tissues. This is especially grave if you are holding your breath or inhaling deeply when the pressure drops. Water in the soft tissues of your body vaporizes, causing gross swelling, though the tight seal of your skin would prevent you from actually bursting apart. Your eyes, likewise, would refrain from exploding, but continued escape of gas and water vapor leads to rapid cooling of the mouth and airways.

Water and dissolved gas in the blood form bubbles in the major veins, which travel throughout the circulatory system and block blood flow. After about one minute, circulation effectively stops. The lack of oxygen to the brain renders you unconscious in less than 15 seconds, eventually killing you. "When the pressure gets very low there is just not enough oxygen. That is really the first and most important concern," Buckey says.

But death is not instantaneous. For example, one 1965 study by researchers at the Brooks Air Force Base in Texas showed that dogs exposed to near vacuum—one three-hundred-eightieth of atmospheric pressure at sea level—for up to 90 seconds always survived. During their exposure, they were unconscious and paralyzed. Gas expelled from their bowels and stomachs caused simultaneous defecation, projectile vomiting and urination. They suffered massive seizures. Their tongues were often coated in ice and the dogs swelled to resemble "an inflated goatskin bag," the authors wrote. But after slight repressurization the dogs shrank back down, began to breathe, and after 10 to 15 minutes at sea level pressure, they managed to walk, though it took a few more minutes for their apparent blindness to wear off.

However, dogs held at near vacuum for just a little bit longer—two full minutes or more—frequently died. If the heart was not still beating upon recompression, they could not be revived, and the more rapid the decompression, the graver the injuries, no matter how much time had elapsed in the vacuum.

Chimpanzees can withstand even longer exposures. In a pair of papers from NASA in 1965 and 1967, researchers found that chimpanzees could survive up to 3.5 minutes in near-vacuum conditions with no apparent cognitive defects, as measured by complex tasks months later. One chimp that was exposed for three minutes, however, showed lasting behavioral changes. Another died shortly after exposure, likely due to cardiac arrest.

Although the majority of knowledge on the effects of vacuum exposure comes from animal studies, there have also been several informative—and scary—depressurization accidents involving people. For example, in 1965 a technician inside a vacuum chamber at Johnson Space Center in Houston accidentally depressurized his space suit by disrupting a hose. After 12 to 15 seconds he lost consciousness. He regained it at 27 seconds, after his suit was repressurized to about half sea-level pressure. The man reported that his last memory before blacking out was of the moisture on his tongue beginning to boil; he also suffered a loss of taste sensation that lingered for four days following the accident, but he was otherwise unharmed.

When it comes to exposure to the interstellar medium, you might survive it with timely help but it probably won't be to your taste.


Were the First Stars Powered by Dark Matter?

When scientists look at what makes a star and what makes it burn, they turn to fusion. However, according to new research, this may not have been the case for the stars classified as “Population III.”

“The first stars were different in a lot of ways,” said Katherine Freese, a theoretical physicist at the University of Michigan. According to Freese, dark matter annihilation, rather than fusion, was the source of energy for the earliest stars, when the universe was only 100 to 200 million years old.

“Annihilation means that matter goes into something else,” Freese explains. She says that everything has a partner opposite – matter and anti-matter, electrons and positrons. When these opposites meet, their identity is lost and the energy goes elsewhere. “Dark matter particles are their own anti. When they meet, one-third of the energy goes into neutrinos, which escape, one-third goes into photons and the last third goes into electrons and positrons.”

“In order for a star to form, in order for its matter to collapse into a dense object, it has to be able to cool off,” Freese continues. “We noticed that in the first stars something was competing with the cooling. The stars couldn’t collapse down small enough to get fusion going. But they were still giving off energy. They were in a phase we hadn’t discovered before.”

But how did they move from a dark matter phase into the more standard fusion stage that we are familiar with? Freese explains: “The annihilation products getting stuck is what allows the dark matter heating to stay inside the star, and is what prevents the star from collapsing into a fusion-driven one.” Once all the dark matter has been annihilated, the star collapses enough for fusion to take over.

What happens next is another “circle of life” thing that really makes you think. Hydrogen and helium atoms are forced together by the fusion process within the star, forming new elements such as carbon, nitrogen, oxygen and various metals. Once the star becomes dense enough with these new elements, it collapses in on itself and goes supernova. Then all the new elements made within the old star spread out into the universe to help in the creation of new stars.

“This new phase is only true in the first stars,” Freese insists. “The stars we see today are called population one stars. Earlier stars were population two stars. The first stars are referred to as population three stars. Our work is to modify how we believe population three stars developed. At first, they weren’t fusion-driven.”

If Freese and her colleagues are indeed right about this new theory, it will change what we know about how stars form. “It adds a new phase of stellar evolution,” Freese says.

Sadly, though, according to Freese, any further study of this theory will have to wait until at least 2013, when NASA is scheduled to launch the James Webb Space Telescope. “We call them dark stars,” Freese explains, “but they would still shine, looking a little different. They would be cooler than a fusion-driven star. We hope the next phase telescope will be able to tell between the standard stars now, and what we think happened in the first stars.”

Their study appears in Physical Review Letters with the title “Dark Matter and the First Stars: A New Phase of Stellar Evolution.”

Thanks to PhysOrg for permission to work from their article.

Posted by Josh Hill.


Embryo-free way to make cells 'safe'

Evidence that it should be possible to take skin cells and safely turn them into embryonic-like cells to treat a vast range of diseases is published today.

The work takes doctors one step closer to the day when they will no longer need to clone embryos in order to create an unlimited supply of any of a patient's own cells and tissues for novel treatments and transplants, whether to treat diabetes or Parkinson's disease.

Researchers recently showed that adult human and mouse skin cells could be "reprogrammed" to be capable of generating any type of cell in a manner similar to embryonic stem cells, sidestepping many ethical objections to this work.

Now the pioneer of this method, Prof Shinya Yamanaka of Kyoto University, reports today in the journal Science that his team has moved another step closer to understanding how these "induced pluripotent stem cells" or iPS cells might be reprogrammed without causing tumours when they are transplanted into the body, a crucial step if they are used to study disease or developed for human therapies.

Dr Takashi Aoi and colleagues reprogrammed adult mouse liver and stomach-lining cells into iPS cells by genetically altering them, introducing three new genes with a kind of virus called a retrovirus.

There have been concerns that the retrovirus could trigger cancer by introducing the new genes in such a way as to disrupt useful stretches of DNA, for instance genes that suppress tumour growth.

The team was able to confirm in iPS cells made from adult cells that the virus did not insert genes this way. Mice implanted with the rejuvenated cells remained tumour-free six months after receiving the new cells.

But Prof Yamanaka tells the Telegraph that practical uses of the reprogrammed cells are still years away.

"In order to apply this technology to clinics, we still have to study the safety of iPS cells in bigger animals such as monkeys," he says. "It will take years to do this."

"It is very encouraging that the method does not lead to preferential insertion of the retroviruses into tumour suppressor genes," comments Dr Robin Lovell-Badge of the National Institute for Medical Research, London.

"However, even if rare, this may happen occasionally, so some caution is still needed. The insertions could also affect genes required for the iPS cells to form the specific cell types that might be required for cell-based therapies.

"So it would still be better if the reprogramming method could avoid the use of these viruses. This is one reason why it is still desirable to pursue other methods of reprogramming, such as so-called "therapeutic cloning". Once we understand how reprogramming works we can devise the ideal way to achieve it safely."

Prof Yamanaka's work is a remarkable advance that prompted Sir Ian Wilmut, who led the team that cloned Dolly the sheep, to tell The Daily Telegraph that he would adopt the new method rather than the nuclear transfer technique his team used to create Dolly.

The Japanese work has now been confirmed in other laboratories. A few days ago, UCLA stem cell scientists led by Kathrin Plath and William Lowry used genetic alteration to turn back the clock on human skin cells, creating iPS cells that are nearly identical to human embryonic stem cells.

"Our reprogrammed human skin cells were virtually indistinguishable from human embryonic stem cells," says Dr Plath. "We are very excited about the potential implications."


Jumper's Tricked-Out Teleportation: Hollywood Sci-Fi vs. Reality

Director Doug Liman (inset) did some DIY jump-cut teleportation in his office before digitally zapping Hayden Christensen around the world. (Images courtesy of 20th Century Fox)

Teleportation has long been the object of real-world desire—not to mention sci-fi speculation, from Star Trek to The Twilight Zone to Heroes. Though scientists have actually had some success in the lab, the fact remains that you can’t spontaneously disappear and travel wherever you want. And that, according to director Doug Liman, is the sole fallacy of physics he accepted when filming Jumper, featuring Star Wars vet Hayden Christensen as a kid with a genetic anomaly that allows him to skip through wormholes across the world. “What if somebody had this power?” Liman asks. “That’s the only leap of faith that I took in the movie. Everything else is grounded in the real physics of our world.”

The Director’s Take

After guiding Matt Damon through amnesia in The Bourne Identity and now getting set for a modern-day, DIY recreation of the Apollo missions for his next film, Liman is happy to call himself a science geek—honest portrayals are the key to his films, ever since playing it indie cool during his Swingers years. “I wanted to keep the physics everywhere else as honest as possible,” he tells PM. “Other films that have had super powers say, ‘Well, now that there’s a super power—we’re going to ignore every law of physics on the planet.’ That’s why you end up with Spider-Man existing in a [place] that feels nothing like our world, [where] everybody else can do extraordinary things. Gravity doesn’t even seem to apply because there [are] people zipping around on anti-gravity boards. Those things feel like cartoons to me.”

So Liman did his own prep work on the circumstances around a jump, experimenting with its look and its scientific premise at the same time. “[I] basically started with someone sitting in my office and pointing a video camera at them and being like, ‘Okay, I’m going to teleport them out of the chair’—starting with the most rudimentary, which is you pause the camera and they leave the frame and you start the camera again,” he says. Then came the physics: With a character’s weight shifting off the chair, the air vacuum and surrounding environment in a teleportation scene would be more crucial than the jump itself. “The jump is extraordinarily unimpressive,” Liman admits, “because I felt anything that powerful and that violent would happen in a split second, which means you basically can’t see it.” Indeed, we get no wormholes, no slo-mo—just some post-teleport scars, from frost on a windshield to oil floating on the surface of water.

Liman was also preoccupied with what would happen when ordinary nonjumpers were exposed to the wormholes left behind by the likes of Christensen and Samuel L. Jackson. “If you opened a temporary wormhole between two places, before it closed up, that would be something extraordinarily violent and dangerous,” he says. “One of the rules of science is that everything that we do in this world, there’s a price to be paid—nothing comes for free. And one of the prices to be paid is that, if you jump, you leave this wormhole behind. That wormhole is dangerous for people who might stumble across it before it evaporates.”

The wormholes that bad guy Samuel L. Jackson flies through in Jumper are nothing compared to the unstable ones physicists have found in the real world. "If you try to fly through them," says MIT professor Max Tegmark (inset), "the whole thing collapses into a black hole." (Still courtesy of 20th Century Fox)

The Physics Reality

Dr. Max Tegmark, a physics professor at MIT who recently participated in a teleportation panel discussion with Liman, says he was impressed by the filmmaker’s ambition to get the physics right. “It was very good to see a movie director trying to make things as realistic as he could, at least within the confines of the script,” Tegmark says of Jumper, which is based on a Steven Gould sci-fi novel. Some of Liman’s leaps of scientific intuition, such as environmental disruption based on air pockets left by the jumper, “make perfect sense,” Tegmark says.

And make no mistake: While Liman and his team just assumed that humans can’t jump in real life, teleportation does exist—just not the way it’s portrayed on screen. The physical particles don’t move, but their quantum information does—essentially cloning them at a distance. So electrons and ions have “teleported” before, and last year scientists beamed data from photons almost 90 miles between two of the Canary Islands. But even teleporting one particle at a time—say, an electron—is not a simple process. It requires three electrons—the one you want to teleport, and a pair of entangled electrons that split. One electron learns information about the electron to be teleported while the third receives that information at the destination, creating a perfect clone down to the electron’s quantum state; the information about the original is then destroyed. “You don’t want to teleport yourself unless you’re really confident that it’s going to work,” Tegmark says.
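
The three-particle bookkeeping Tegmark describes can be simulated classically with state vectors. Below is a toy NumPy sketch of the standard teleportation protocol, using qubits rather than electrons; the amplitudes, random seed and labels are my own, not from the article.

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
P1 = np.array([[0, 0], [0, 1]], dtype=complex)   # projector onto |1>

def on(gate, qubit):
    """Lift a 1-qubit operator to the 3-qubit space (qubit 0 = leftmost bit)."""
    ops = [I2, I2, I2]
    ops[qubit] = gate
    return np.kron(np.kron(ops[0], ops[1]), ops[2])

def cnot(control, target):
    """3-qubit CNOT built by permuting the 8 basis states."""
    U = np.zeros((8, 8), dtype=complex)
    for i in range(8):
        bits = [(i >> (2 - q)) & 1 for q in range(3)]
        if bits[control]:
            bits[target] ^= 1
        j = (bits[0] << 2) | (bits[1] << 1) | bits[2]
        U[j, i] = 1
    return U

def measure(state, qubit, rng):
    """Projectively measure one qubit; return (outcome, collapsed state)."""
    proj = on(P1, qubit)
    p1 = np.linalg.norm(proj @ state) ** 2
    if rng.random() < p1:
        return 1, (proj @ state) / np.sqrt(p1)
    rest = state - proj @ state
    return 0, rest / np.linalg.norm(rest)

rng = np.random.default_rng(7)
psi = np.array([0.6, 0.8j])                  # state to teleport (on qubit 0)
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)   # entangled pair (qubits 1 and 2)
state = np.kron(psi, bell)

state = cnot(0, 1) @ state                   # entangle the payload with the pair
state = on(H, 0) @ state
m0, state = measure(state, 0, rng)           # these two classical bits must be
m1, state = measure(state, 1, rng)           # sent to the receiving end
if m1:
    state = on(X, 2) @ state                 # receiver's conditional corrections
if m0:
    state = on(Z, 2) @ state

# Read qubit 2's amplitudes out of the collapsed 8-dim state
base = (m0 << 2) | (m1 << 1)
out = np.array([state[base], state[base | 1]])
print(np.allclose(out, psi))                 # True: qubit 2 now carries psi
```

Note that `psi` never travels: only the two measurement outcomes do, and the measurement scrambles the original, which is why nothing is copied, only relocated.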

Physicists, meanwhile, aren’t sure that traversable wormholes are even possible. “I think Liman had in mind that there was supposed to be some kind of wormhole through space-time, and that’s how it was supposed to work,” Tegmark says. “The ones we know of in physics don’t just appear out of nowhere, and they’re very unstable. If you try to fly through them, the whole thing collapses into a black hole. It’s still an open problem in physics—whether all wormholes are unstable or whether by putting dark energy in them you can make them stable, and whether or not traversable wormholes are actually possible.” Regardless, he says, stabilized, traversable wormholes aren’t coming anytime soon.

One real world physics conundrum the script leaves out, Tegmark says, is energy conservation. “As you convert yourself into pure energy, you correspond to many, many megatons of energy,” he explains. “If you unleash that in an uncontrolled way, it would look like a giant nuclear bomb—and you didn’t see anything like that in the movie.”
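
Tegmark's figure is easy to sanity-check with E = mc²; the 70-kilogram body mass below is my assumption, not his.

```python
# Rest-mass energy of a person, expressed in megatons of TNT
c = 299_792_458.0          # speed of light, m/s
m = 70.0                   # assumed body mass, kg
E = m * c * c              # joules
MEGATON = 4.184e15         # joules per megaton of TNT
print(round(E / MEGATON))  # on the order of 1,500 megatons
```

That is tens of thousands of Hiroshimas per jump, which is the "giant nuclear bomb" the movie conveniently omits.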

One thing’s for certain: It will be a long time before we’re teleporting humans, “not because it’s impossible, but because there’s not much incentive to do it,” says Tegmark, who pinpoints information transfer as the most apt use for teleportation. “If you send an encrypted message over the Internet from the White House to the Pentagon, you’re always worried that someone’s going to eavesdrop on you,” he says. “But if you teleport your information through fiberoptic wires, we know from the basic principles of quantum mechanics that no one can eavesdrop on you. No one can get to that message without the message self-destructing.”

Regardless of whether or not Liman has his science right, Tegmark applauds the film. “I think it’s great that Hollywood makes you think about what’s possible in physics and what’s not,” he says. “In my experience as a scientist, it’s not always about finding the right answer but finding the right question.”


Green Basics: Carbon Footprint

In addition to metrics like ecological footprint, each of us (and each of the products and services we use and consume every day) has a carbon footprint; it's a way to measure the relative impact of our actions -- as individuals, as businesses, communities and countries, as we eat, work, travel, play, etc. -- in terms of the contribution made to global climate change. Measured in carbon emissions (usually in pounds, tons or kilograms), it's become an increasingly useful and popular tool to help contextualize global warming in our daily routines and lives.

A carbon footprint is the total amount of carbon dioxide (CO2) and other greenhouse gases emitted over the full life cycle of a product or service, and everything has one, from the computer you used to find this article to the next meal you eat (and the one after that, and after that, and so on...) to the shoe that will leave a physical footprint on the ground the next time you walk outside. But that's only part of the story.

First of all, carbon footprints can be calculated in one of two ways: with a Life Cycle Assessment (LCA) method, which is more accurate and specific, or by restricting the count to the immediately attributable emissions from fossil-fuel energy use, which is more general. To use your car's carbon footprint as an example: the first method would take into account all carbon emissions required to build the car (including all the metal, plastic, glass and other materials), drive the car and dispose of it; the second would count only the emissions from the fuel burned while driving it.

Further, there's more than one way to run the numbers, depending on how they're going to be used. Top-down calculations, like those done in the world map above and the US state map below, that calculate per capita carbon footprints, take total emissions from a country (or other high-level group, organization, etc.) and divide these emissions among the residents or otherwise applicable group. Bottom-up calculations, like with your car's carbon emissions from the example above, sum attributable carbon emissions from individual actions.
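
The two accounting styles can be sketched in a few lines; all figures below are invented for the illustration.

```python
# Top-down: divide a country's total emissions across its residents
national_emissions_tons = 6_000_000_000   # hypothetical country total, tons CO2e/yr
population = 300_000_000
per_capita = national_emissions_tons / population
print(per_capita)            # 20.0 tons CO2e per person per year

# Bottom-up: sum the attributable emissions of individual activities
activities_tons = {"driving": 4.6, "flights": 1.8, "home energy": 5.1, "food": 2.5}
footprint = sum(activities_tons.values())
print(round(footprint, 1))   # 14.0 tons CO2e per year
```

The gap between the two numbers is typical: top-down figures sweep in industry and infrastructure that no individual's activity list captures.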

Okay, so everything has a carbon footprint, and each can be measured a couple of different ways. But it's not just a matter of carbon dioxide, though that is the most common greenhouse gas (GHG) other than water vapor; other GHGs include (but aren't limited to) methane, ozone, nitrous oxide, sulfur hexafluoride, hydrofluorocarbons, perfluorocarbons and chlorofluorocarbons (see the IPCC list of greenhouse gases for a more thorough list). Still, most carbon footprint calculations include all applicable gases, as they all contribute to the greenhouse effect and our persistently warming globe.
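
A detail worth spelling out (common practice, not stated in the article): the non-CO2 gases are usually folded into a single CO2-equivalent number by weighting each by its 100-year global warming potential (GWP). The GWP values below are the IPCC AR4 figures; the emission amounts are made up.

```python
GWP_100 = {"co2": 1, "ch4": 25, "n2o": 298}      # 100-yr GWP relative to CO2

emissions_kg = {"co2": 1000.0, "ch4": 10.0, "n2o": 1.0}
co2e = sum(emissions_kg[g] * GWP_100[g] for g in emissions_kg)
print(co2e)   # 1548.0 kg CO2-equivalent
```

This is why a footprint quoted "in carbon" can still account for methane from food or refrigerant leaks from air conditioning.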

Though a fairly complex calculation, with many variables that are different for each person, carbon footprint calculations generally include energy used to power our homes and transport, including travel by car, airplane, rail and other public transport, as well as all the consumables we use on a regular (and irregular) basis; many of the individual factors above can be calculated separately (e.g. an individual carbon footprint for your home, travel, food, etc.). Once you understand what goes in to your carbon footprint, and, probably more importantly, what your carbon footprint is, you can start reducing it; indeed, for as many ways as there are to create a carbon footprint, there are ways to reduce it.

Increasing the efficiency of our energy use, reducing our energy use and changing a few habits (like eating less meat, eating more local food, not traveling by airplane as much) are some of the quick, easy ways to cut back on the size our individual carbon footprints. After increasing efficiency and reducing use, carbon offsets are also an increasingly popular (and increasingly controversial) way to help mitigate our carbon footprints -- see TreeHugger's How to Green Your Carbon Offsets guide for more on that. But the point remains: there are many, many ways to reduce and even eliminate your carbon footprint; most every article you'll read on TreeHugger will be related to carbon footprints and emissions, though some more directly than others.

Moving forward, we expect to see more and more information about the carbon footprints of the things we encounter and use every day; carbon labeling for produce is catching on in the UK, and we've seen carbon footprint measurements for everything from cheeseburgers to Christmas, and sushi to Shaq. Want more? Type 'carbon footprint' into our search engine and go nuts.

Ready to find out what your carbon footprint is? There are a handful of calculators out there; try The Nature Conservancy's Carbon Footprint Calculator and the calculator at the official site for An Inconvenient Truth for starters.

Quench your thirst for more green knowledge with our Green Basics column, which appears on TreeHugger on Thursdays.


1984 Record for Solar-to-Grid Efficiency Finally Broken

This January, on an exceptionally clear and cold day, scientists from Sandia National Laboratories and Stirling Energy Systems recorded a 31.25 percent solar-to-grid efficiency, nearly two percentage points better than the 1984 record.
The efficiency record, according to the scientists working on the project, brings solar another step closer to competing with coal on price.

Solar-to-grid efficiency is very different from solar panel efficiency, which has already exceeded 40 percent. Unfortunately, getting the power from a solar panel (which produces direct current) onto the grid (which uses alternating current) requires several steps, each of which eats away at efficiency.
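
The compounding the article describes is just multiplication of stage efficiencies; every stage figure below is invented for illustration.

```python
# Chained conversion stages multiply, so each one eats into the total
stages = {
    "panel (sunlight -> DC)": 0.40,
    "inverter (DC -> AC)": 0.95,
    "wiring and transformer": 0.97,
}
overall = 1.0
for eff in stages.values():
    overall *= eff
print(round(overall, 3))   # 0.369: a 40% panel delivers under 37% to the grid
```

Skipping the DC-to-AC stage entirely, as a dish-Stirling system does, removes one of those multipliers.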

These solar collectors, which use concentrators to heat a Stirling engine, produce alternating current directly, so less energy is lost getting the power onto the grid.

The scientists attribute their success to (1) the increased reflectivity of the mirrors, which now approaches 95 percent, and (2) a rather ironic "perfect storm": a perfectly clear day with no moisture and no particulates in the air, which left the day 10 percent brighter than average.

The cold weather helped as well. Stirling engines operate by exploiting the difference in temperature between a hot end and a cold end. Solar energy heats up the hot end, but the only thing cooling the cold end is the ambient air.
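
The cold-end effect can be quantified with the Carnot limit, which bounds the efficiency of any heat engine, Stirling engines included. The receiver temperature below is an assumption for illustration, not a measured figure.

```python
# Ideal (Carnot) efficiency of a heat engine: 1 - T_cold / T_hot
def carnot_efficiency(t_hot_k, t_cold_k):
    return 1.0 - t_cold_k / t_hot_k

t_hot = 990.0                    # assumed receiver temperature, kelvin
for t_cold_c in (30.0, 0.0):     # a warm day vs. a freezing day
    t_cold = t_cold_c + 273.15
    print(t_cold_c, round(carnot_efficiency(t_hot, t_cold), 3))
    # prints 0.694 for the 30 C day and 0.724 for the freezing one
```

Real engines fall well short of this bound, but the trend holds: the same sunshine yields more work when the ambient air is colder.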

Now we just have to hope that these solar concentrators can be scaled up, or made cheap enough, to compete with coal, at least on cold days in New Mexico to start.

Read the full press release after the jump.

Via Metaefficient

Sandia, Stirling Energy Systems set new world record for solar-to-grid conversion efficiency

31.25 percent efficiency rate topples 1984 record

Sandia and Stirling Energy Systems set new world record for solar-to-grid conversion efficiency. The record establishes a new solar-to-grid conversion efficiency of 31.25 percent. The old record, which has stood since 1984, was 29.4 percent.


ALBUQUERQUE, N.M. —On a perfect New Mexico winter day — with the sky almost 10 percent brighter than usual — Sandia National Laboratories and Stirling Energy Systems (SES) set a new solar-to-grid system conversion efficiency record by achieving a 31.25 percent net efficiency rate. The old 1984 record of 29.4 percent was toppled Jan. 31 on SES’s “Serial #3” solar dish Stirling system at Sandia’s National Solar Thermal Test Facility.

The conversion efficiency is calculated by measuring the net energy delivered to the grid and dividing it by the solar energy hitting the dish mirrors. Auxiliary loads, such as water pumps, computers and tracking motors, are accounted for in the net power measurement.
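
That definition translates directly into arithmetic; the kilowatt figures below are made up to reproduce the reported 31.25 percent, not measurements from the test.

```python
# Net energy to the grid (after auxiliary loads) divided by
# the solar energy incident on the dish mirrors
def solar_to_grid_efficiency(gross_kw, auxiliary_kw, insolation_kw):
    net_kw = gross_kw - auxiliary_kw
    return net_kw / insolation_kw

print(solar_to_grid_efficiency(gross_kw=26.0, auxiliary_kw=1.0,
                               insolation_kw=80.0))   # 0.3125
```

Counting pumps, computers and tracking motors against the output is what makes this a system-level figure rather than an engine spec.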

“Gaining two whole points of conversion efficiency in this type of system is phenomenal,” says Bruce Osborn, SES president and CEO. “This is a significant advancement that takes our dish engine systems well beyond the capacities of any other solar dish collectors and one step closer to commercializing an affordable system.”

Serial #3 was erected in May 2005 as part of a prototype six-dish model power plant at the Solar Thermal Test Facility that produces up to 150 kilowatts (kW) of grid-ready electrical power during the day. Each dish unit consists of 82 mirrors formed in a dish shape to focus the light to an intense beam.

The solar dish generates electricity by focusing the sun’s rays onto a receiver, which transmits the heat energy to a Stirling engine. The engine is a sealed system filled with hydrogen. As the gas heats and cools, its pressure rises and falls. The change in pressure drives the pistons inside the engine, producing mechanical power, which in turn drives a generator and makes electricity.

Lead Sandia project engineer Chuck Andraka says that several technical advancements to the systems made jointly by SES and Sandia led to the record-breaking solar-to-grid conversion efficiency. SES owns the dishes and all the hardware. Sandia provides technical and analytical support to SES in a relationship that dates back more than 10 years.

Sandia is a National Nuclear Security Administration laboratory.

Andraka says the first and probably most important advancement was improved optics. The Stirling dishes are made with a low-iron glass with a silver backing that makes them highly reflective, focusing as much as 94 percent of the incident sunlight to the engine package, where prior efforts reflected about 91 percent. The mirror facets, patented by Sandia and Paneltec Corp. of Lafayette, Colo., are highly accurate and have minimal imperfections in shape.

Both improvements allow for the loss-control aperture to be reduced to seven inches in diameter — meaning light is highly concentrated as it enters the receiver.

Other advancements to the solar dish-engine system that helped Sandia and SES beat the energy conversion record were a new, more effective radiator that also costs less to build and a new high-efficiency generator.

While all the enhancements led to a better system, one aspect made it happen on a beautiful New Mexico winter day — the weather.

“It was a ‘perfect storm’ of sorts,” Andraka says. “We set the record on Jan. 31, a very cold and extremely bright day, a day eight percent brighter than normal.”

The temperature, which hovered around freezing, allowed the cold portion of the engine to operate at about 23 degrees C, and the brightness means more energy was produced while most parasitic loads and losses are constant. The test ran for two and a half hours, and a 60-minute running average was used to evaluate the power and efficiency data, in order to eliminate transient effects. During the testing phase, the system produced 26.75 kW net electrical power.
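Those two reported figures imply how much solar power was hitting the mirrors during the test; a quick back-of-envelope check:

```python
net_power_kw = 26.75   # net electrical output reported during the test
efficiency = 0.3125    # the record 31.25 percent solar-to-grid efficiency

# Efficiency = net power / solar input, so solar input = net power / efficiency.
solar_input_kw = net_power_kw / efficiency
print(f"Implied solar power on the mirrors: {solar_input_kw:.1f} kW")  # 85.6 kW
```

Roughly 85.6 kW of sunlight on the dish yielded 26.75 kW to the grid.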

Osborn says that SES is working to commercialize the record-performing system and has signed power purchase agreements with two major Southern California utilities (Southern California Edison and San Diego Gas & Electric) for up to 1,750 megawatts (MW) of power, representing the world’s two largest solar power contracts. Collectively, these contracts require up to 70,000 solar dish engine units.
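The contract figures also imply the rating assumed per dish, which lines up with the measured output above:

```python
contract_mw = 1750   # up to 1,750 MW across the two utility contracts
dish_units = 70000   # up to 70,000 solar dish engine units required

kw_per_dish = contract_mw * 1000 / dish_units
print(f"Implied rating per dish: {kw_per_dish:.0f} kW")  # 25 kW
```

About 25 kW per unit, consistent with the 26.75 kW the record-setting test delivered.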

“This exciting record shows that using these dishes will be a cost-effective and environmentally friendly way of producing power,” Osborn says. “SES is actively engaged in the commercialization of the system, called the ‘SunCatcher,’ including continuing to prepare it for mass production, completing project site development and preconstruction activities, and establishing partnerships with substantial manufacturing and industrial organizations to develop a cost-effective manufacturing process and supply chain. The demonstrated high efficiency means more energy is generated for the given investment, lowering the cost of the energy delivered.”

SES was formed in 1996 to develop and commercialize advanced solar technology. The company maintains its corporate headquarters in Phoenix, Ariz., project and technical development offices in Tustin, Calif., and engineering and test site operations at Sandia National Laboratories in Albuquerque.

Original here

Man's effect on world's oceans revealed

Almost half of the world's oceans have been seriously affected by over-fishing, pollution and climate change, according to a major study of man's impact on marine life.

  • Ocean monitoring system 'vital to mankind'
  • James Lovelock's plan to pump ocean water to stop climate change
  • Acidic oceans threaten marine life
    An international team of 19 scientists have published the first ever comprehensive map showing the combined impact of human activity on the planet's seas and oceans.

    It shows that more than 40 per cent of marine regions have been significantly altered, while just four per cent remains in a pristine state.

    Previous studies have largely focused on the impacts of specific activities such as pesticide runoff or fishing, or have looked at damage in certain areas.

    The North Sea is one of the most heavily affected regions, along with the South and East China Seas, the Caribbean, the east coast of North America, the Mediterranean, the Red Sea and the Persian Gulf. The least affected areas are near the poles.

    Dr Ben Halpern, of the University of California, presented the new findings at the American Association for the Advancement of Science (AAAS) conference in Boston.

    Dr Halpern said: "This project allows us to finally start to see the big picture of how humans are affecting the oceans.

    "The big picture looks much worse than I imagine most people expected. It was certainly a surprise to me."

    Activities and impacts covered by the study include fishing, ocean acidification caused by pollution, temperature change, species extinctions and invasions, and the shipping, oil and gas industries.

    The researchers developed models to quantify and compare how 17 human activities affected marine ecosystems.

    For example, fertiliser runoff has been shown to cause significant damage to coral reefs but has less effect on kelp forests.

    They gathered data from across the world and collated the results to give each area a score for man-made damage and changes.
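The scoring idea can be illustrated with a toy version. The activities, vulnerability weights, and intensities below are invented for illustration; they are not the study's actual 17-activity model:

```python
# Toy cumulative-impact score: for each region, sum each activity's local
# intensity weighted by how vulnerable that ecosystem is to the activity.
vulnerability = {               # hypothetical ecosystem-specific weights
    "coral_reef":  {"fishing": 0.8, "fertiliser_runoff": 0.9, "shipping": 0.3},
    "kelp_forest": {"fishing": 0.7, "fertiliser_runoff": 0.2, "shipping": 0.3},
}

def impact_score(ecosystem, intensities):
    weights = vulnerability[ecosystem]
    return sum(weights[activity] * level for activity, level in intensities.items())

# The same human pressures score differently depending on the ecosystem.
intensities = {"fishing": 0.5, "fertiliser_runoff": 1.0, "shipping": 0.4}
print(f"{impact_score('coral_reef', intensities):.2f}")   # runoff hits reefs hard
print(f"{impact_score('kelp_forest', intensities):.2f}")  # same pressures, smaller effect
```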

    The results, published in the journal Science, show that 41 per cent of the world's oceans and seas have been significantly affected by multiple human activities.

    Coral reefs, seagrass beds, mangroves, rocky reefs and shelves are among the most seriously altered ecosystems.

    The team hope their work will provide information to help policymakers decide on priorities for conservation action.

    Dr Kimberly Selkoe, of the University of Hawaii, said: "Conservation and management groups have to decide where, when, and what to spend their resources on.

    "Whether one is interested in protecting ocean wilderness, assessing which human activities have the greatest impact, or prioritising which ecosystem types need management intervention, our results provide a strong framework for doing so."

    Coastal regions were shown to be particularly badly hit. The North Sea was the 24th most affected region of 232.

    The effects of over-fishing in the North Sea have been well-documented, while the close proximity of heavily populated areas, shipping, oil and gas extraction have all affected a region that is relatively shallow and enclosed, and therefore slower to repair damage.

    Co-author Dr Mark Spalding, a marine scientist at the conservation group Nature Conservancy, said: "What is surprising is the truly global spread of human impact.

    "But it's not all doom and gloom. In some areas, such as the Great Barrier Reef, strong integrated conservation measures are being introduced.

    "The map provides a challenge for us to start to think seriously about conservation and management, and gives us pointers to the priorities and different states of urgency of response required."

    Original here

    World’s Largest Marine Protected Area Created in Pacific Ocean

    Vast Ocean Reserve Conserves Vital Resources for Human Well-Being

    Arlington, Va. — The small Pacific island nation of Kiribati has become a global conservation leader by establishing the world's largest marine protected area – a California-sized ocean wilderness of pristine coral reefs and rich fish populations threatened by over-fishing and climate change.

    The Phoenix Islands Protected Area (PIPA) conserves one of the Earth's last intact oceanic coral archipelago ecosystems, consisting of eight coral atolls and two submerged reef systems in a nearly uninhabited region of abundant marine and bird life. The 410,500-square-kilometer (158,453-square-mile) protected area also includes underwater mountains and other deep-sea habitat.

    Kiribati first declared the creation of PIPA at the 2006 Conference of the Parties to the Convention on Biological Diversity in Brazil. On January 30, 2008, Kiribati adopted formal regulations for PIPA that more than doubled the original size to make it the largest marine protected area on Earth.

    Kiribati and the New England Aquarium (NEAq) developed PIPA over several years of joint scientific research, with funding and technical assistance from Conservation International's (CI) Global Conservation Fund and Pacific Islands Program. The CI support for PIPA is part of the Coral Reef Initiative in the South Pacific (CRISP).

    "Kiribati has taken an inspirational step in increasing the size of PIPA well beyond the original eight atolls and globally important seabird, fish and coral reef communities," said Greg Stone, the NEAq vice-president of global marine programs. "The new boundary includes extensive seamount and deep sea habitat, tuna spawning grounds, and as yet unsurveyed submerged reef systems."

    Located near the equator in the Central Pacific between Hawaii and Fiji, the Phoenix Islands form an archipelago several hundred miles long. They are part of the Republic of Kiribati, which comprises three distinct island groups (Gilbert Islands, Phoenix Islands, and Line Islands) with a total of 33 islands, making it the largest atoll nation in the world.

    "The creation of this amazing marine protected area by a small island nation in the Pacific represents a commitment of historic proportions; and all of this by a country that is under serious threat from sea-level rise attributed to global warming," said CI President Russell A. Mittermeier. "The Republic of Kiribati has now set a standard for other countries in the Pacific and elsewhere in the world. We are proud to be associated with this effort that helps the people of Kiribati, and we call on governments and private conservation groups everywhere to support Kiribati in its efforts and make similar commitments to protect their own natural systems."

    The Phoenix Islands were featured in a major article in National Geographic in February 2004.

    Three NEAq-led research expeditions since 2000 found great marine biodiversity, including more than 120 species of coral and 520 species of fish, some new to science. Some of the most important seabird nesting populations in the Pacific, as well as healthy fish populations and the presence of sea turtles and other species, demonstrated the pristine nature of the area and its importance as a migration route.

    Protecting the Phoenix Islands means restricting commercial fishing in the area, resulting in a loss of revenue that the Kiribati government would normally receive from issuing foreign commercial fishing licenses. NEAq and CI are helping Kiribati design an endowment system that will cover the core recurring management costs of PIPA and compensate the government for the foregone commercial fishing license revenues. The plan allows for subsistence fishing by resident communities and other sustainable economic development in designated zones of the protected area.

    Keeping oceans and marine ecosystems intact and healthy allows them to better resist the impacts of climate change and continue their natural role of sequestering atmospheric carbon that causes global warming.

    Original here