Saturday, August 16, 2008

Experts: Reliance on Russia makes NASA weak

By Lara Farrar

LONDON, England (CNN) -- Experts are growing increasingly concerned that the United States will have to rely entirely upon Russia to take astronauts to and from the international space station for at least half a decade.

NASA astronaut Peggy Whitson endured a chaotic Soyuz capsule landing in April.

Observers say the situation is all the more worrying as earlier in the week, NASA announced a delay in the launch of its next-generation Orion spacecraft.

NASA's dependency upon the Russian Soyuz space capsules and rockets to carry astronauts to the station is the result of a five-year gap between the scheduled retirement of the shuttle in 2010 and the debut of its replacement in 2015.

The agency had hoped to narrow this gap by accelerating the craft's initial launch to 2013, but announced on Monday that, because of inadequate funding and technical issues, the new Constellation program would not be ready for testing until September 2014.

While the new date is still within the March 2015 absolute deadline, many experts say NASA's reliance upon Russia to take astronauts into space has placed the agency in an unnecessarily vulnerable position.

"It is a vulnerability," John Logsdon, director of the space policy institute at George Washington University, told CNN.

"Any time you are relying on a single system to do a critical task, you are vulnerable if that system has problems."

"It is our fault for not having a replacement for the shuttle much earlier than Orion will be available. It puts Russia in a very powerful position," Logsdon said.

Russia will be the only country capable of providing human access to space not only for the Americans but also the rest of the world in the near future, said Howard McCurdy, a space expert at American University in Washington.

"It is like a monopoly position where you are at the mercy of that supplier," said McCurdy. "You don't want to be dependent on a single provider no matter who it is."

McCurdy warned that because the United States has positioned itself to be completely dependent on Russia to get humans into space until 2015, it may be harder for the American government to take diplomatic action against the country, especially in light of recent tensions between Russia and Georgia.

"That is a real concern," said McCurdy. "You are much more reluctant to be nasty with somebody who is a sole provider of an essential service.

"We have other international arrangements with them that could be jeopardized by our reliance on them," McCurdy continued. "Everything from their foreign relations with ex-Soviet states to their role in economic summits."

For its part, NASA says it remains confident that diplomatic affairs between the two countries will not adversely impact the space agency's relationship with Russia.

"While it is possible that government to government issues could potentially have an impact on other aspects of a relationship between nations including cooperative space exploration activities, NASA has no reason to believe that it will be unable to rely upon Roscosmos-provided Soyuz vehicles for future ISS activities," spokesman Michael Curie wrote in an email statement to CNN.

The threat of a breakdown in diplomatic relations is not the only one hanging over NASA's space program.

Legislation passed in 2000 (now called the Iran, North Korea and Syria Nonproliferation Act) could soon bring an abrupt halt to NASA's partnership with the Russian space agency, Democratic Senator Bill Nelson of Florida told CNN.

The law bans the United States from buying space technology from Russia unless the president determines Russia is taking steps to prevent the proliferation of nuclear and missile technology to Iran.

Congress waived the ban in 2005, allowing NASA to enter into a $719 million contract with the Russians for use of the Soyuz through 2011.

NASA says it is currently renegotiating a new long-term contract for use of the Soyuz but, according to Nelson, the success of that contract could depend on whether or not lawmakers decide to approve the waiver again.

Election-year politics combined with increasing concerns about Iran and the ongoing crisis in Georgia all but guarantee that lawmakers will not vote for the exemption, said Nelson.

That means NASA could lose access to the $100 billion space station unless it continues to fly the shuttle or strikes some sort of deal with another space agency willing to put forward money for additional Soyuz seats, the Senator explained to CNN.

"It is a lose-lose situation," said Nelson.

"If our relationship with Russia is strained who knows if Russia will give us rides in the future?" Nelson continued. "Or if they give us rides will they charge such an exorbitant price that it becomes blackmail?"

Questions about the safety and reliability of the Soyuz have also been raised in recent months after two consecutive troublesome landings by space capsules, including one in April with American astronaut Peggy Whitson on board.

NASA has been working with Russian engineers to try to determine the cause of the dangerous descents but has so far failed to come up with any concrete answers.

But NASA officials say the space agency still believes the Soyuz is a reliable transport system for its astronauts.

"We do not have concerns," NASA spokesman Rob Navias told CNN. "The Soyuz, which has been flying for decades now, is extremely reliable and is extremely capable."

"We have been partnering with the Russians for decades now for space flights."

The Russian Federal Space Agency, Roscosmos, could not be reached for comment on the matter.


Phoenix Microscope Takes First Image Of Martian Dust Particle

NASA's Phoenix Mars Lander has taken the first-ever image of a single particle of Mars' ubiquitous dust, using its atomic force microscope.

The particle -- shown at higher magnification than anything ever seen from another world -- is rounded and about one micrometer, or one millionth of a meter, across. It is a speck of the dust that cloaks Mars. Such dust particles color the Martian sky pink, feed storms that regularly envelop the planet and produce Mars' distinctive red soil.

"This is the first picture of a clay-sized particle on Mars, and the size agrees with predictions from the colors seen in sunsets on the Red Planet," said Phoenix co-investigator Urs Staufer of the University of Neuchatel, Switzerland, who leads a Swiss consortium that made the microscope.

"Taking this image required the highest resolution microscope operated off Earth and a specially designed substrate to hold the Martian dust," said Tom Pike, Phoenix science team member from Imperial College London. "We always knew it was going to be technically very challenging to image particles this small."

It took roughly a dozen years to develop the device, which is now operating in a polar region of a planet about 350 million kilometers (220 million miles) away.

The atomic force microscope maps the shape of particles in three dimensions by scanning them with a sharp tip at the end of a spring. During the scan, invisibly fine particles are held by a series of pits etched into a substrate microfabricated from a silicon wafer. Pike's group at Imperial College produced these silicon microdiscs.

The atomic force microscope can detail the shapes of particles as small as about 100 nanometers, about one one-thousandth the width of a human hair. That is about 100 times greater magnification than seen with Phoenix's optical microscope, which made its first images of Martian soil about two months ago. Until now, Phoenix's optical microscope held the record for producing the most highly magnified images to come from another planet.
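
Those scale comparisons are easy to sanity-check. A quick sketch (the ~100-micrometer hair width is a typical textbook figure, assumed here rather than taken from NASA's release):

```python
# Sanity check on the scale comparisons above.
# Assumption: a human hair is roughly 100 micrometers wide.
afm_resolution_nm = 100                    # smallest particle the AFM can detail
hair_width_nm = 100 * 1_000                # 100 micrometers, in nanometers

print(hair_width_nm // afm_resolution_nm)  # -> 1000, i.e. about 1/1000 of a hair
```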

"I'm delighted that this microscope is producing images that will help us understand Mars at the highest detail ever," Staufer said. "This is proof of the microscope's potential. We are now ready to start doing scientific experiments that will add a new dimension to measurements being made by other Phoenix lander instruments."

"After this first success, we're now working on building up a portrait gallery of the dust on Mars," Pike added.

Mars' ultra-fine dust is the medium that actively links gases in the Martian atmosphere to processes in Martian soil, so it is critically important to understanding Mars' environment, the researchers said.

The particle seen in the atomic force microscope image was part of a sample scooped by the robotic arm from the "Snow White" trench and delivered to Phoenix's microscope station in early July. The microscope station includes the optical microscope, the atomic force microscope and the sample delivery wheel. It is part of a suite of tools called Phoenix's Microscopy, Electrochemistry and Conductivity Analyzer.


3 Controversial Maps

by Rob Lammle

1. The One With Only 38 States



If George Etzel Pearcy had his way, Lynyrd Skynyrd’s famous song would have been called “Sweet Home Talladego.” In 1973, the California State University geography professor suggested that the U.S. should redraw its antiquated state boundaries and narrow the overall number of states to a mere thirty-eight.

Pearcy’s proposed state lines were drawn in less-populated areas, isolating large cities and reducing their number within each state. He argued that if there were fewer cities vying for a state’s tax dollars, more money would be available for projects that would benefit all citizens.

Because the current states were being chopped up beyond recognition, part of his plan included renaming the new states by referencing natural geologic features or the region’s cultural history.

While he did have a rather staunch support network—economists, geographers, and even a few politicians argued that Pearcy’s plan might be crazy enough to work—the proposal was defeated in Washington, D.C. Imagine all the work that would have to be done to enact Pearcy’s plan: re-surveying the land, setting up new voter districts, new taxation infrastructure—basically starting the whole country over. It’s easy to see why the government balked.

The map above was published in 1973. Oddly, it doesn’t show any city locations to help illustrate Pearcy’s argument. At this point, I should tell you that I make maps for a living. So I did my best to replicate Pearcy’s map using population data from the 2000 census to show current high population cities and where they would fall within the new states. Here’s what I came up with:



As you can see, many of the new states contain a small number of major metropolitan areas, and the problem of dual-state cities has been solved. While Pearcy’s proposal might have been a logistical nightmare to make a reality, that doesn’t necessarily mean it was a bad idea.

2. The One Where Greenland & Africa are the Same Size


In 1973, Arno Peters, a German filmmaker and journalist, called a press conference to denounce the widely accepted map of the world known as the “Mercator Map” (above). Peters’ position was that the Mercator Projection—a cylindrical projection first developed in 1569 by Flemish cartographer Gerardus Mercator—was not only inaccurate, but downright racist. Peters pointed out that the Mercator map has a distortion in the northern hemisphere, making North American and Eurasian countries appear much larger than they actually are. For example, Greenland and Africa are shown as roughly the same size, although in reality Africa is about fourteen times larger. In contrast, the regions along the equator—Africa, India, and South America, to name a few—appear smaller, especially when seen next to the distorted northern half of the map. It was Peters’ belief that this error led many in the developed world to ignore the struggles of the larger, poorer nations near the equator.
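
The size of that distortion is easy to estimate: Mercator stretches both east-west and north-south distances by a factor of 1/cos(latitude), so apparent area grows as the square of that factor. A rough sketch (the central latitudes and area figures below are approximations I'm assuming for illustration):

```python
import math

def mercator_area_inflation(lat_deg):
    """Mercator stretches both map axes by 1/cos(latitude),
    so apparent area is inflated by 1/cos^2(latitude)."""
    return 1.0 / math.cos(math.radians(lat_deg)) ** 2

# Approximate true areas, in millions of square kilometers
greenland_true, africa_true = 2.2, 30.4

# Rough central latitudes: Greenland ~72 N, Africa ~10 N
greenland_apparent = greenland_true * mercator_area_inflation(72)
africa_apparent = africa_true * mercator_area_inflation(10)

print(f"Africa is really {africa_true / greenland_true:.0f}x larger,")
print(f"but apparent Mercator areas are {greenland_apparent:.0f} vs "
      f"{africa_apparent:.0f} (same units): roughly comparable on the map")
```

At 72 degrees north the inflation factor is more than tenfold, which is why Greenland can look the size of a continent.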

Of course, Peters had a suggestion for fixing this problem: his own map. The Peters Projection claimed to show the world in a more accurate, equal-area fashion.


Because Peters’ map showed the size of developing nations more accurately, charitable organizations that worked in those regions quickly gave him their endorsement. Eventually his map became so well received that some were calling for an all-out ban on the Mercator map, believing it to be an outmoded symbol of colonialism.

The thing is, cartographers agreed that the Mercator map was outdated, inaccurate, and wasn’t the best way to represent the world’s landmasses. They’d been calling for the use of a new projection since the 1940s.

One of the reasons experts wanted to move away from the Mercator was because of the distortion. However, they also understood that it was distorted for good reason. The Mercator map was intended as a navigational tool for European mariners, who could draw a straight line from Point A to Point B and find their bearings with little trouble. Because it was made for European navigators, it was actually helpful to show Europe larger than it really was. It wasn’t a political statement, but a decision made purely for ease-of-use.

However, the biggest insult to cartographers was the Peters projection itself. Peters claimed to have created the projection when, in fact, it was essentially the same one devised in 1855 by a cartographer named James Gall. Many have recognized this similarity, and now you'll often see Peters' map called "The Gall-Peters Projection."
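
For the curious, Gall's projection is a cylindrical equal-area projection with standard parallels at 45 degrees, and its forward equations are simple enough to sketch (the Earth-radius value is an assumption for illustration):

```python
import math

R = 6371.0                      # Earth's mean radius in km (assumed here)
K = math.cos(math.radians(45))  # scale factor at the 45-degree standard parallels

def gall_peters(lon_deg, lat_deg):
    """Forward equations of the Gall (1855) / Peters projection.
    The x-axis is compressed and the y-axis stretched by the same
    factor K, so areas are preserved everywhere on the map."""
    x = R * math.radians(lon_deg) * K
    y = R * math.sin(math.radians(lat_deg)) / K
    return x, y

# The equator maps to y = 0; the poles to y = +/- R / K
print(gall_peters(0, 0))  # -> (0.0, 0.0)
```

The area-preserving trade-off is visible in the equations: shapes near the poles get squashed horizontally and shapes near the equator get stretched vertically, which is why Gall-Peters landmasses look elongated.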

Today, the controversy is mostly dead. Both projections are seen as flawed and have largely fallen out of everyday use as more accurate maps have been developed. In classrooms now, you're more likely to see the Robinson Projection or the Winkel Tripel Projection. The Gall-Peters map is still favored by some organizations, though many map publishers no longer produce it. And despite the controversy, the Mercator projection remains one of the most widely used navigational tools in the world; it's the primary projection for Google Maps.

3. The One that Claims the Chinese Got Here First



It seems everyone wants to ruin Christopher Columbus’ biggest claim to fame. This time it’s a Chinese map that is threatening to rewrite history.

Purchased from a Shanghai dealer in 2001 by Liu Gang for a mere $500, the map shows the world—including a well-developed picture of North and South America. While text on the map indicates it was drawn in 1763, it claims to be a copy of another map drawn in 1418. The original map was cited as belonging to the great Chinese explorer, Zheng He, whose known travels include India and eastern Africa. However, thanks to numerous errors and anachronisms, the map’s authenticity has been called into question.

For example, California is shown as an island, a famous mistake made by European maps of the 17th Century. Furthermore, the detailed representation of river systems would have been difficult to chart from ships as large as those used by Zheng He, whose fleets sometimes carried up to 28,000 men. Finally, the Chinese at that time did not know how to create a map projection, a skill necessary to translate the 3-D globe onto a 2-D map. In short, they did not yet have the techniques needed to make this map when it was supposed to have been drawn.

The annotations on the map also seem to be largely erroneous. A perfect example is a note stating of Eastern Europe: “The people here all worship God (shang-di) and their religion is called ‘Jing.’” However, according to noted professor and map critic, Dr. Geoff Wade, the term “shang-di” for the Christian God was not introduced until the late 16th Century. Perhaps most damaging are the many references to the “Great Qing Ocean” regarding the waters off China. Unfortunately, the Qing Dynasty began in 1644—more than 200 years after the original map was supposed to have been made.

Based upon this evidence, it seems likely that the Chinese map of the New World is the product of a 1763 cartographer using the terminology of his time, combined with data from European maps. Therefore, while Zheng He can definitely claim to be a great explorer, it is doubtful he ever made it to America.

Junk Food In, Junk Food Out

Has the ethanol boom led to the rash of E. coli-contaminated beef recalls?

It's possible. Here's how:

The demand for ethanol, the fuel additive and purported gas substitute, has been higher than ever because Congress started requiring a higher percentage of ethanol in the nation's fuel supply. As has been well documented, the increased corn production has enriched chemical fertilizer and pesticide companies, increased agricultural runoff has fed a near-record dead zone in the Gulf of Mexico, and the diversion of corn from food and feed to fuel has contributed to a worldwide run-up in food commodity prices. (These are only the start of corn-based ethanol's problems; if we planted all U.S. cropland in corn, it would still supply only about 20% of our demand, and would require so much fossil-fuel fertilizer that we'd still be contributing nearly as much to global warming while importing loads of energy.)

The high price of corn has also made it hard on conventional — if that's the right word — beef ranchers, who have for decades converted cheap corn into profits by feeding it to cows on crowded feed lots.


It's already been reported that some ranchers have turned to the waste from ethanol plants to feed their cows, and that the switch makes the cows produce even more E. coli bacteria — the kind that's harmless to cows, but which can and frequently has contaminated the meat supply for humans. Dozens have been made ill from eating beef produced at plants "processing" — that is, assembly line-style slaughtering — these cows.

What's also true is that E. coli only showed up so prolifically in the guts of cows since they've been fed corn in the last 50 years or so. A starchy food the grass-eaters didn't evolve to consume, corn produces an acidic mess in their stomachs that E. coli bacteria apparently loves. But corn has been made cheap by federal policy, and it can be used — often along with artificial hormones — to make cows grow faster and fatter. As Georgia grass-fed beef rancher Will Harris put it, "The best way to sell seven pounds of corn was to sell one pound of beef." (Of course, as Harris knows, all that rapid growth on an unnatural diet in such close proximity to other cows makes it necessary to treat them with antibiotics to prevent other disease outbreaks, which like E. coli would be rare if the animals weren't raised this way.)

Without cheap corn, the economic model for beef ranchers is broken. And there's only so much waste from ethanol plants to go around. Fortunately for these ranchers, there's waste from other sources — namely, any nearby food processing plant. As a Wall Street Journal video recently demonstrated (thanks to Tom Philpott at Grist for bringing it to my attention), some ranchers are feeding their cows potato chips and chocolate. Actually, that's too generous: the cows are eating waste — the potato chip and chocolate waste not fit for the junk food aisle at the grocery store.

"I don't know of any research yet on the impacts of feeding cows potato chips, but it sure isn't what they are built to eat," Patty Lovera, assistant director of Food and Water Watch, a nonprofit watchdog group, wrote in an e-mail.

"And it opens up the whole topic of how little research there really is on the food safety impacts of what cattle are fed. Basically, land grant universities that do most of the research on livestock issues are not very likely to criticize (or even fully examine) 'modern' or more 'efficient' techniques of raising cattle, especially what they are fed."

What research is out there is suggestive. A 1998 USDA and Cornell University study showed that feeding cows grass right before slaughter decreased the E. coli counts in their guts, and would make human infections less likely. (Lovera said the research was "basically shut down after the industry protested it.") A study just this spring showed that E. coli counts were twice as high in the hindguts of cows fed distiller's grains (ethanol waste) as those fed a "traditional" corn diet.

What about potato chips and chocolate?

The short answer is that there is no evidence it wreaks havoc on cows and breeds E. coli. (I've reached out to other experts on this question and will update this thread if I learn more.) It could be that the rash of beef recalls has more to do with slaughterhouses "processing" more cows more quickly (and, the accusation goes, more sloppily) to increase their margins through volume, because the profit per cow is down due to the high cost of grain.

But, as the premise of the movie King Corn taught us, we are what we eat. If cows are fed mostly corn, and we eat a lot of beef, the carbon in corn becomes part of us. Now, Americans may be made up even more of junk food than of corn.


"If corn is high carbohydrate — and relative to grass, it is — what about potato chips and chocolate?" Will Harris, the Georgia grass-fed beef rancher and beef director for the American Grassfed Association, said. "That's real high carbohydrate, isn't it? Do those kinds of practices cause beef to be less safe or less healthy or less humane or less environmentally sustainable? Well? Intuitively, you might think so. I leave it to the scientists to find out."

He added: "What I believe is that any species of animal does best when it eats what it evolved to eat."

He was talking about cows. It makes sense for us, as well.


Star Trek warp drive is a possibility, say scientists

By Roger Highfield, Science Editor

The advance could mean that Star Trek fantasies of interstellar civilisations and voyages powered by warp drive are no longer the exclusive domain of science fiction writers.

The USS Enterprise from the original Star Trek series

In the long running television series created by Gene Roddenberry, the warp drive was invented by Zefram Cochrane, who began his epic project in 2053 in Bozeman, Montana.

Now Dr Gerald Cleaver, associate professor of physics at Baylor, and Richard Obousy have come up with a new twist on an existing idea to produce a warp drive that they believe can travel faster than the speed of light, without breaking the laws of physics.

In their scheme, in the Journal of the British Interplanetary Society, a starship could "warp" space so that it shrinks ahead of the vessel and expands behind it.

By pushing the departure point many light years backwards while simultaneously bringing distant stars and other destinations closer, the warp drive effectively transports the starship from place to place at faster-than-light speeds.

All this extraordinary feat requires, says the new study, is for scientists to harness a mysterious and poorly understood cosmic antigravity force, called dark energy.

Dark energy is thought responsible for speeding up the expansion rate of our universe as time moves on, just like it did after the Big Bang, when the universe expanded much faster than the speed of light for a very brief time.

This may come as a surprise since, according to relativity theory, matter cannot move through space faster than the speed of light, which is almost 300,000,000 metres per second. But that theory applies only to unwarped 'flat' space.

And there is no limit on the speed with which space itself can move: the spaceship can sit at rest in a small bubble of space that flows at "superluminal" - faster than light - velocities through normal space because the fabric of space and time itself (scientists refer to spacetime) is stretching.

In the scheme outlined by Dr Cleaver dark energy would be used to create the bubble: if dark energy can be made negative in front of the ship, then that patch of space would contract in response.

"Think of it like a surfer riding a wave," said Dr Cleaver. "The ship would be pushed by the spatial bubble and the bubble would be travelling faster than the speed of light."

The new warp drive work also draws on "string theory", which suggests the universe is made up of multiple dimensions. We are used to four dimensions (height, width, length and time), but string theorists believe there are a total of 10. By changing the size of the 10th spatial dimension in front of the spaceship, the Baylor researchers believe they could alter the strength of the dark energy in such a way as to propel the ship faster than the speed of light.

They conclude by recommending that it would be "prudent to research this area further."

But hold the dilithium crystals: Dr Chris Van Den Broeck of Cardiff University commented: "The problem with this and previous schemes (including my own) is that part of the exotic matter would have to travel faster than the *local* speed of light (roughly speaking, it would need to go faster than the speed of light with respect to the portion of space it occupies), and that's not allowed by any established physical theory."

And even if this criticism can be met, Richard Obousy has computed the amount of energy required to start up a "warp" process (though not the total energy required to travel a specific distance) around a 10x10x10 metre-cube ship, based on the required change in dark energy in a space equal to the volume of the ship.

The energy needed to kick-start the drive turned out to be equivalent to converting the entire mass of Jupiter into energy, by Einstein's famous equation E = mc², where c is the speed of light. Given that the mass of Jupiter is around 2 x 10^27 kilograms, that is a big number.

"That is an enormous amount of energy," Dr Cleaver said. "We are still a very long ways off before we could create something to harness that type of energy."
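
Just how enormous is easy to check with Einstein's equation; a quick sketch, using roughly 1.9 x 10^27 kg for Jupiter's mass:

```python
# E = m * c^2 for the entire mass of Jupiter
m_jupiter = 1.9e27   # kg, approximate mass of Jupiter
c = 2.998e8          # speed of light in metres per second

energy_joules = m_jupiter * c ** 2
print(f"{energy_joules:.1e} J")  # on the order of 10^44 joules
```

For comparison, the Sun's entire annual energy output is many orders of magnitude smaller than this figure.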


Portal to mythical Mayan underworld found in Mexico

By Miguel Angel Gutierrez

MEXICO CITY (Reuters) - Mexican archeologists have discovered a maze of stone temples in underground caves, some submerged in water and containing human bones, which ancient Mayans believed was a portal where dead souls entered the underworld.

Clad in scuba gear and edging through narrow tunnels, researchers discovered the stone ruins of eleven sacred temples and what could be the remains of human sacrifices at the site in the Yucatan Peninsula.

Archeologists say Mayans believed the underground complex of water-filled caves leading into dry chambers -- including an underground road stretching some 330 feet -- was the path to a mythical underworld, known as Xibalba.

According to an ancient Mayan scripture, the Popol Vuh, the route was filled with obstacles, including rivers filled with scorpions, blood and pus, and houses shrouded in darkness or swarming with shrieking bats, Guillermo de Anda, one of the lead investigators at the site, said on Thursday.

The souls of the dead followed a mythical dog who could see at night, de Anda said.

Excavations over the past five months in the Yucatan caves revealed stone carvings and pottery left for the dead.

"They believed that this place was the entrance to Xibalba. That is why we have found the offerings there," de Anda said.

The Mayans built soaring pyramids and elaborate palaces in Central America and southern Mexico before mysteriously abandoning their cities around 900 A.D.

They described the torturous journey to Xibalba in the Popol Vuh sacred text, originally written in hieroglyphic script on long scrolls and later transcribed by Spanish conquerors.

"It is very likely this area was protected as a sacred depository for the dead or for the passage of their souls," said de Anda, whose team has found ceramic offerings along with bones in some temples.

Different Mayan groups who inhabited southern Mexico, northern Guatemala and Belize had their own entrances to the underworld, which archeologists have discovered at other sites, almost always in cave systems buried deep in the jungle.

At the Yucatan site they have found one 1,900-year-old ceramic vase, but most of the artifacts date back to between 700 and 850 A.D.

"These sacred tunnels and caves were natural temples and annexes to temples on the surface," said de Anda.

(Writing by Mica Rosenberg; editing by Todd Eastham)


Want Solar? Head to Sam's Club

Written by Hank Green

Getting a solar power system installed on your house is decidedly complicated. There is no centralized system, only contractors whom you may or may not be able to trust. Prices for installation vary wildly, as do prices for the modules themselves.

But in California, where things are getting a bit more consolidated and simplified, one retailer is trying to make solar easy. And it's Wal-Mart, or rather Sam's Club, at nine California stores: Corona, Murrieta, Glendora, Ontario, La Habra, Chino, Long Beach, Fountain Valley, and Torrance.

The solar kiosks will hook consumers up with established solar sellers and installers, including Borrego Solar and BP Solar. The kiosks also offer $100 off every kilowatt of installed solar power. Honestly, that's not very much, considering a kilowatt of installed solar can cost up to $10,000, but Sam's Club members expect savings, and savings they will have!

Borrego, however, is intent on selling the future savings of the system. According to them, a $35k system will save homeowners $96k over the life of the panels. I'm not arguing with their data (at least in California), but that's a big pill to swallow for a lot of folks. Still, exposing people to the potential benefits of solar is apparently half the battle.
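
Taking Borrego's figures at face value, the arithmetic is simple; here's a rough sketch (the 25-year panel lifetime is my assumption, not a number from Borrego):

```python
# Rough payback estimate from the figures quoted above.
system_cost = 35_000       # dollars, quoted installed cost
lifetime_savings = 96_000  # dollars saved over the panels' life
lifetime_years = 25        # assumed panel lifetime

annual_savings = lifetime_savings / lifetime_years
payback_years = system_cost / annual_savings

print(f"~${annual_savings:,.0f} saved per year; "
      f"system pays for itself in ~{payback_years:.0f} years")
```

Under those assumptions the system pays for itself in roughly nine years, with the remaining years of savings as profit.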

We imagine that Wal-Mart gets some commission on sales that come through the Sam's Club kiosks, but the image it gives them may be more important in the end. Either way, if these kiosks make it simpler and more common for Californians to buy home solar systems, we certainly aren't going to complain.


Climate Change Caused Widespread Tree Death In California Mountain Range, Study Confirms

White fir trees died in the 2002 drought, while neighboring Jeffrey pines survived at this elevation. (Credit: Image courtesy of University of California - Irvine)

Warmer temperatures and longer dry spells have killed thousands of trees and shrubs in a Southern California mountain range, pushing the plants' habitat an average of 213 feet up the mountain over the past 30 years, a UC Irvine study has determined.

White fir and Jeffrey pine trees died at the lower altitudes of their growth range in the Santa Rosa Mountains, from 6,400 feet to as high as 7,200 feet in elevation, while California lilacs died between 4,000 and 4,800 feet. Almost all of the studied plants crept up the mountain a similar distance, countering the belief that slower-growing trees would shift their ranges more slowly than faster-growing grasses and wildflowers.

This study is the first to directly show the impact of climate change on a mountainous ecosystem by physically surveying the locations of plants, and it shows what could occur globally if the Earth's temperature continues to rise. The finding also has implications for forest management, as it rules out air pollution and fire suppression as the main causes of plant death.

"Plants are dying out at the bottom of their ranges, and at the tops of their ranges they seem to be growing in and doing much better," said Anne Kelly, lead author of the study and a graduate student in the Department of Earth System Science at UCI. "The only thing that could explain this happening across the entire face of the mountain would be a change in the local climate."

The study appears online the week of Aug. 11 in the Proceedings of the National Academy of Sciences.

    Kelly and Michael Goulden, Earth system science professor, studied the north face of the Santa Rosa Mountains, just south of Palm Desert near Idyllwild, Calif. In the past 30 years, the average temperature there rose about 2 degrees Fahrenheit. While overall precipitation increased, the area experienced longer periods of drought, specifically in 1987-1990 and 1999-2002.

    They decided to study the area after learning that people who live and work there were speculating that climate change was causing the plants to die.

    Kelly and Goulden began with a 1977 plant survey by researcher Jan Zabriskie that cataloged all plants along a five-mile vertical stretch through the desert scrub, pinyon-juniper woodland, and chaparral shrubland and conifer forest.

    The UCI scientists went back to the same spot in 2006-07 and did another plant survey, in which they stretched a measuring tape along the route and physically identified and measured plants that covered the tape. Then with a computer, they compared their results with those of the 1977 survey.

    In the UCI study, 141 different species were identified along the tape, but the final analysis focused on 10 that were most abundant at different elevations. Those species included white fir and Jeffrey pine trees; golden cup oak trees; sugar bush, California lilac, Muller scrub oak, creosote bush, ragweed, and brittle bush shrubs; and agave plants.

    The mean elevation of nine of the 10 species rose, with an average gain of 213 feet.

    "I was surprised by how nice the data looked and how unambiguous the signal was," Goulden said. "It is clear that ecosystems can respond rather rapidly to climate change."

    The scientists say air pollution did not kill the trees or cause the shift because the area does not have unusually high carbon dioxide levels, and they did not observe the characteristic speckling on plants caused by ozone damage. Also, if it was pollution, all of the plants would be suffering, not just the ones at the bottom of their range.

    Fire suppression also is not a culprit, they say. The fire regime there is normal, with the last major fire occurring in the 1940s.

    "The plants should still be in a recovery phase where they are growing back in," Kelly said. "But they have stopped recovering and now are dying, which these plants should not be doing."

    A study published recently in the journal Science also found that plant growth ranges are moving upward in a French mountain range, but its conclusions were based on historic databases, not a systematic, repeated measurement of plant cover. The UCI study also found that all types of plants, from pine trees to ragweed, moved up a similar distance, not just small, short-lived plants as found by the French scientists.

    The UCI study was funded by NASA, the U.S. Department of Energy Program for Ecosystem Research and a National Science Foundation CEA-CREST fellowship. Kelly conducted her research as a graduate student in the CEA-CREST program at California State University, Los Angeles.


    New water lily species called proof of evolution

    A new species of water lily discovered this year helps prove the theory of evolution, a Manitoba scientist says.

    "This species isn't just new to Manitoba, it is a new species of plant that has evolved fairly recently," said Diana Robson, the Manitoba Museum's curator of botany. "Evolution isn't just something that occurred in the past; it's happening right now."


    "New species evolve when individuals obtain new genetic material that makes them well adapted to new habitats. Mutation is one way that organisms obtain new genetic information, and hybridization is another."

    The new species is a hybrid. It arose when two fairly common species of water lily, Leiberg's Water-lily (Nymphaea leibergii) and Fragrant Water-lily (Nymphaea odorata) interbred.

    Although plant hybrids form regularly, they are usually sterile and unable to reproduce.

    It is the only documented population of this type of new water lily in Manitoba, Robson said, and is a fertile hybrid that is reproducing. So it's considered to be a brand-new species.

    It's also quite rare. There are fewer than 500 of them in existence, Robson said, and they should probably be protected.

    Robson was alerted to the possibility that a new plant species existed in northern Manitoba by John Wiersema, a biologist and water-lily expert with the United States Department of Agriculture. In an old collection, Wiersema found documentation of a strange specimen of water lily that had been collected in Manitoba 60 years ago.

    Intrigued by its unusual characteristics, Wiersema and another colleague visited the Minago River about 100 kilometres north of Lake Winnipeg in 1996 and 2000 to find the plant, with no luck.

    This year, after a two-hour plane trip and a one-hour boat ride on the same river with local guides, Wiersema and Robson found the rare new lily.

    The new species appears to have evolved within the last 2,000 years, she said, and is unusual for Canada.

    "Part of what was exciting about this was that the flora in Canada is quite well-known. Most scientists don't really expect to find a new species of plant anymore," Robson said.


    2 Large Solar Plants Planned in California, Will Each Be 10 Times Bigger Than Largest Now in Service