Friday, August 29, 2008

New Milky Way map reveals a complicated outer galaxy

CHICAGO -- The halo of stars that envelops the Milky Way galaxy is like a river delta criss-crossed by stellar streams large and small, according to new data from the Sloan Digital Sky Survey (SDSS-II). While the largest rivers of this delta have been mapped out over the last decade, analysis of the new SDSS-II map shows that smaller streams can be found throughout the stellar halo, said Kevin Schlaufman, a graduate student at the University of California at Santa Cruz.

A theoretical model of a galaxy like the Milky Way, showing trails of stars torn from disrupted satellite galaxies that have merged with the central galaxy. The structures seen in the SDSS-II star maps support this prediction of a complicated outer Galaxy. The region shown is about one million light years on a side; the sun is just 25,000 light years from the center of the Galaxy and would appear close to the center of this picture. Credit: K. Johnston, J. Bullock

Schlaufman reported his results at an international symposium in Chicago, titled "The Sloan Digital Sky Survey: Asteroids to Cosmology." Over the last three years, Schlaufman explained, the SEGUE survey of SDSS-II has measured the motions of nearly a quarter million stars in selected areas of the sky. A careful search for groups of stars at the same velocity turned up 14 distinct structures, 11 of them previously unknown.

"Even with SEGUE, we are still only mapping a small fraction of the Galaxy," said Schlaufman, "so 14 streams in our data implies a huge number when we extrapolate to the rest of the Milky Way." If each velocity structure were a separate stream, Schlaufman explained, there would be close to 1,000 in the inner 75,000 light years of the Galaxy. However, these structures could arise from a smaller number of streams that are seen many times in different places.
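The extrapolation Schlaufman describes is simple proportional scaling. A minimal sketch in Python, where the survey-coverage fraction is a hypothetical value implied by the quoted numbers, not a figure from the talk:

```python
# Back-of-envelope extrapolation from streams detected in a partial
# survey to an all-Galaxy estimate. If the survey covers a fraction f
# of the relevant volume and detects N streams, the naive estimate
# for the total is N / f (assuming each detection is a distinct stream).
detected_streams = 14
quoted_total = 1000  # quoted estimate for the inner 75,000 light years

# Coverage fraction implied by the two quoted numbers:
implied_coverage = detected_streams / quoted_total
print(f"Implied survey coverage: {implied_coverage:.1%}")  # 1.4%

# Conversely, scaling up the detections under that assumed coverage:
extrapolated_total = detected_streams / implied_coverage
print(f"Extrapolated stream count: {extrapolated_total:.0f}")  # 1000
```

As the article notes, this is an upper bound: if the same physical stream is detected several times in different fields, the true number of distinct streams is smaller.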

"A jumble of pasta" is the way Columbia University researcher Kathryn Johnston described her theoretical models of the Milky Way's stellar halo. In a review talk at the symposium, Johnston explained how dwarf galaxies that pass close to the Milky Way can be stretched by gravitational tides into spaghetti-like strands, which wind around the Galaxy as stars trace out the same orbital paths at different rates.

"In the center of the Galaxy, these stellar strands crowd together and you just see a smooth mix of stars," said Johnston. "But as you look further away you can start to pick out individual strands, as well as features more akin to pasta shells that come from dwarfs that were on more elongated orbits. By looking at faint features, Kevin may be finding some of the 'angel hair' that came from smaller dwarfs, or ones that were destroyed longer ago."

Heidi Newberg of Rensselaer Polytechnic Institute and her thesis student Nathan Cole have been trying to follow some of the larger strands as they weave across the sky. "It's a big challenge to piece things together," said Cole, "because the stream from one dwarf galaxy can wrap around the Galaxy and pass through streams of stars ripped from other dwarf galaxies."

Toward the constellation Virgo, where SDSS images revealed an excess of stars covering a huge area of sky, Newberg finds that there are at least two superposed structures, and possibly three or more. The SEGUE velocity measurements can separate systems that overlap in sky maps, Newberg explained in her symposium talk. "Part of what we see toward Virgo is a tidal arm of the Sagittarius dwarf galaxy, whose main body lies on the opposite side of the Milky Way, but we don't know the origin of the other structures. There really aren't enough pasta varieties to describe all the structures we find."

In addition to stellar streams, astronomers searching the SDSS data have found 14 surviving dwarf companions of the Milky Way, including two new discoveries announced today at the symposium by Gerard Gilmore of Cambridge University. These satellite galaxies are orbiting within the halo of invisible dark matter whose gravity holds the Milky Way itself together. Most of them are much fainter than the ten satellites known prior to the SDSS.

Because even the SDSS can only detect these ultra-faint dwarfs if they are relatively nearby, there could be several hundred more of them further out in the Milky Way's dark halo, according to independent analyses by graduate students Sergey Koposov, of the Max Planck Institute for Astronomy in Heidelberg, Germany, and Eric Tollerud, of the University of California at Irvine. "Even so," said Koposov, "we expect that the number of dark matter clumps is much larger than that, so something must prevent the smaller clumps from gathering gas and forming stars."

The SDSS dwarfs have far fewer stars than the previously known satellites, noted Gilmore, but they have similar spatial extents, and the stars within them move at similar speeds. "I think the internal dynamics of these tiny galaxies may be hard to explain with our conventional ideas about dark matter," said Gilmore.

"The SDSS has taught us a huge amount about the Milky Way and its neighbors," said Johnston, who is pleased to see some of the predictions of her models confirmed by the new data. "But we're still just beginning to map the Galaxy in a comprehensive way, and there's a trove of discoveries out there for the next generation of surveys, including the two new Milky Way surveys that will be carried out in SDSS-III."

The Sloan Digital Sky Survey is the most ambitious survey of the sky ever undertaken, involving more than 300 astronomers and engineers at 25 institutions around the world. SDSS-II, which began in 2005 and finished observations in July 2008, comprises three complementary projects. The Legacy Survey completed the original SDSS map of half the northern sky, determining the positions, brightness, and colors of hundreds of millions of celestial objects and measuring distances to more than a million galaxies and quasars. SEGUE (the Sloan Extension for Galactic Understanding and Exploration) mapped the structure and stellar makeup of the Milky Way Galaxy. The Supernova Survey repeatedly scanned a stripe along the celestial equator to discover and measure supernovae and other variable objects, probing the accelerating expansion of the cosmos. All three surveys were carried out with special-purpose instruments on the 2.5-meter telescope at Apache Point Observatory in New Mexico.

Funding for the SDSS and SDSS-II has been provided by the Alfred P. Sloan Foundation, the Participating Institutions, the National Science Foundation, the U.S. Department of Energy, the National Aeronautics and Space Administration, the Japanese Monbukagakusho, the Max Planck Society, and the Higher Education Funding Council for England.

The SDSS is managed by the Astrophysical Research Consortium for the Participating Institutions. The SDSS-II Participating Institutions are the American Museum of Natural History, Astrophysical Institute Potsdam, University of Basel, University of Cambridge, Case Western Reserve University, University of Chicago, Drexel University, Fermilab, the Institute for Advanced Study, the Japan Participation Group, Johns Hopkins University, the Joint Institute for Nuclear Astrophysics, the Kavli Institute for Particle Astrophysics and Cosmology, the Korean Scientist Group, the Chinese Academy of Sciences (LAMOST), Los Alamos National Laboratory, the Max-Planck-Institute for Astronomy (MPIA), the Max-Planck-Institute for Astrophysics (MPA), New Mexico State University, Ohio State University, University of Pittsburgh, University of Portsmouth, Princeton University, the United States Naval Observatory, and the University of Washington.


Scientists discover why flies are so hard to swat

Over the past two decades, Michael Dickinson has been interviewed by reporters hundreds of times about his research on the biomechanics of insect flight. One question from the press has always dogged him: Why are flies so hard to swat?

"Now I can finally answer," says Dickinson, the Esther M. and Abe M. Zarem Professor of Bioengineering at the California Institute of Technology (Caltech).

Using high-resolution, high-speed digital imaging of fruit flies (Drosophila melanogaster) faced with a looming swatter, Dickinson and graduate student Gwyneth Card have determined the secret to a fly's evasive maneuvering. Long before the fly leaps, its tiny brain calculates the location of the impending threat, comes up with an escape plan, and places its legs in an optimal position to hop out of the way in the opposite direction. All of this action takes place within about 100 milliseconds after the fly first spots the swatter.

"This illustrates how rapidly the fly's brain can process sensory information into an appropriate motor response," Dickinson says.

For example, the videos showed that if the descending swatter--actually, a 14-centimeter-diameter black disk, dropping at a 50-degree angle toward a fly standing at the center of a small platform--comes from in front of the fly, the fly moves its middle legs forward and leans back, then raises and extends its legs to push off backward. When the threat comes from the back, however, the fly (which has a nearly 360-degree field of view and can see behind itself) moves its middle legs a tiny bit backwards. With a threat from the side, the fly keeps its middle legs stationary, but leans its whole body in the opposite direction before it jumps.

"We also found that when the fly makes planning movements prior to take-off, it takes into account its body position at the time it first sees the threat," Dickinson says. "When it first notices an approaching threat, a fly's body might be in any sort of posture depending on what it was doing at the time, like grooming, feeding, walking, or courting. Our experiments showed that the fly somehow 'knows' whether it needs to make large or small postural changes to reach the correct preflight posture. This means that the fly must integrate visual information from its eyes, which tell it where the threat is approaching from, with mechanosensory information from its legs, which tells it how to move to reach the proper preflight pose."

The results offer new insight into the fly nervous system, and suggest that within the fly brain there is a map in which the position of the looming threat "is transformed into an appropriate pattern of leg and body motion prior to take off," Dickinson says. "This is a rather sophisticated sensory-to-motor transformation and the search is on to find the place in the brain where this happens," he says.

Dickinson's research also suggests an optimal method for actually swatting a fly. "It is best not to swat at the fly's starting position, but rather to aim a bit forward of that to anticipate where the fly is going to jump when it first sees your swatter," he says.

The paper, "Visually Mediated Motor Planning in the Escape Response of Drosophila," will be published August 28 in the journal Current Biology.


Giant Clams Fed Early Humans

A new species of giant clam, Tridacna costata, found in the Red Sea. Credit: Carin Jantzen

By Charles Q. Choi, Special to LiveScience

Giant clams two feet long might have helped feed prehistoric humans as they first migrated out of Africa, new research reveals.

The species, Tridacna costata, once accounted for more than 80 percent of giant clams in the Red Sea, researchers now say. Today, these mollusks, the first new living species of giant clam found in two decades, represent less than 1 percent of giant clams living there.

This novel clam, whose shell has a distinctive scalloped edge, was discovered while scientists were attempting to develop a breeding program for another giant clam species, Tridacna maxima, which is prized in the aquarium trade. The new species appears to live only in the shallowest waters, which makes it particularly vulnerable to overfishing.

"These are all strong indications that T. costata may be the earliest example of marine overexploitation," said researcher Claudio Richter, a marine ecologist at the Alfred-Wegener-Institute for Polar and Marine Research in Bremerhaven, Germany.

Fossil evidence that the researchers uncovered suggests the stocks of these giant clams began crashing some 125,000 years ago, during the last interval between glacial periods. During that time, scientists think modern humans first emerged out of Africa, Richter said.

These mollusks could have played a key role in feeding people during that crucial era, serving as a prime target due to their large size, the scientists added. Indeed, competition for these clams and other valuable sea resources "may have been an important driver for human expansion," Richter told LiveScience.

Since this new species bore some features in common with two other living species of Red Sea clams, at first the researchers thought the new mollusk might have been a hybrid, but genetic analysis showed otherwise. These results were further corroborated by marked differences in behavior — while the other two clams spawn over a long period in summer, the new species reproduces during a brief spurt in spring.

No one had expected to discover a new giant clam species, "particularly in the Red Sea, one of the best investigated coral reef provinces," Richter said. The fact that it was overlooked for so long "is a testimony as to how little we really know about marine biodiversity."

Underwater surveys carried out in the Gulf of Aqaba (north of the Red Sea, between the Sinai Peninsula and Arabian mainland) and northern Red Sea revealed this long-overlooked clam must be considered critically endangered. Only six out of 1,000 live specimens the scientists observed belonged to the new species. This mollusk could be the earliest victim of human degradation of coral reefs in this region, the researchers added.

The scientists detailed their findings online on Aug. 28 in the journal Current Biology.


Bell Labs Kills Fundamental Physics Research

After six Nobel Prizes, the invention of the transistor, laser and countless contributions to computer science and technology, it is the end of the road for Bell Labs' fundamental physics research lab.

Alcatel-Lucent, the parent company of Bell Labs, is pulling out of basic science, material physics and semiconductor research and will instead be focusing on more immediately marketable areas such as networking, high-speed electronics, wireless, nanotechnology and software.

The idea is to align the research work in the Lab closer to areas that the parent company is focusing on, says Peter Benedict, spokesperson for Bell Labs and Alcatel-Lucent Ventures.

"In the new innovation model, research needs to keep addressing the need of the mother company," he says.

That view is shortsighted and may drastically curtail the Labs' ability to come up with truly innovative discoveries, critics respond.

"Fundamental physics is absolutely crucial to computing," says Mike Lubell, director of public affairs for the American Physical Society. "Say in the case of integrated circuits, there were many, many small steps that occurred along the way resulting from decades worth of work in matters of physics."

Bell Labs was one of the last bastions of basic research within the corporate world, which over the past several decades has largely focused its R&D efforts on applied research -- areas of study with more immediate prospects of paying off.

Without internally funded basic research, fundamental science has instead come to rely on academic and government-funded laboratories to carry out the kind of long-term projects with no immediate and obvious payback that Bell Labs historically did, says Lubell.

Most of the scientists working in the company's fundamental physics department have been reassigned, says Benedict. Nature, which first reported the news, says just four scientists are left working in the fundamental physics department in Murray Hill, New Jersey. Benedict wouldn't confirm or deny that.

Computing and wireless technologies owe much to advancements in physics, though the connection may not always be immediately apparent. An example is the Global Positioning System, or GPS.

An integral element of GPS is the atomic clock, which stemmed from the creation of the hydrogen maser.

The hydrogen maser, or hydrogen frequency standard, uses the properties of a hydrogen atom to serve as a precision frequency reference.

"GPS is based on very accurate timing mechanisms," says Lubell. "So the measure of time and the frequency standards that are used to do it date back to research in optical pumping which led to the development of hydrogen maser."

In the past, Bell Labs was the place where fundamental research with an impact on both computing and physics could thrive.

Bell Labs was founded in 1925 by Walter Gifford, then president of AT&T. AT&T, a monopoly, established Bell Telephone Laboratories, popularly known as Bell Labs, as a joint venture with Western Electric, AT&T's manufacturing subsidiary.

The Labs became the Mecca for researchers in science, computers and mathematics. Deregulation, however, forced AT&T in 1995 to spin off Bell Labs and other parts of the company into Lucent Technologies. The move marked a shift in fortunes for the research arm, as research budgets were trimmed and Alcatel-Lucent faced increasing pressure from stockholders.

"Bell Labs could do the kind of fundamental research it did in the past because it was functioning as part of a monopoly," says Lubell. "With that gone the landscape changed dramatically."

In recent years, Bell Labs' physics unit had its share of controversy when researcher J. Hendrik Schön was found to have manipulated and falsified data published between 1998 and 2001 in the area of molecular-scale transistors.

That's a long way from where the Labs once stood with its position as a Nobel Prize magnet.

In 1937, Bell Labs researcher Clinton Davisson shared the Nobel Prize in physics for demonstrating the wave nature of matter.

Nearly twenty years later, in 1956, came the Nobel Prize for the invention of the transistor, shared by William Shockley, John Bardeen and Walter Brattain, who did the work at Bell Labs.

In the seventies, Bell Labs won two Nobel prizes in physics back-to-back in the years 1977 and 1978. Philip Anderson shared the Nobel for developing an improved understanding of the electronic structure of glass and magnetic materials. The next year Arno Penzias and Robert Wilson were feted for their discovery of cosmic microwave background radiation.

Former Bell Labs researcher Steven Chu shared the Nobel in 1997 for developing methods to cool and trap atoms with laser light. A year later Horst Stormer, Robert Laughlin, and Daniel Tsui were awarded a Nobel for the discovery and explanation of the fractional quantum Hall effect.

In the last few years, Lucent has sold its semiconductor business, which meant that related research had to be scaled back, especially in areas such as integrated circuits and Microelectromechanical Systems (MEMS).

Meanwhile, Alcatel-Lucent continues to hack away at its jewels. Though Murray Hill, New Jersey, the company's U.S. headquarters and the site of many great scientific discoveries, remains safe, Alcatel-Lucent has sold its Holmdel campus. Holmdel's technological contributions include work on Telstar, the first active communications satellite, and Chu's Nobel Prize-winning research.

Still, for fundamental physics research there will be life after Bell Labs, though it will depend on the whims of the federal government.

Increasingly, long-term research is being carried out in universities and national laboratories with federal grants, says Lubell.

For Bell Labs, yet another chapter in its storied history comes to a close, taking the once iconic institution closer to being just another research arm of a major corporation.

Photo: William Shockley, John Bardeen and Walter Brattain invented the transistor in 1947. (Alcatel-Lucent/Bell Labs)


How bacteria could help power the future

By Michael Schirber

Hydrogen is the cleanest and most abundant fuel there is, but extracting it from water or organic material is currently not a very efficient process. Scientists are therefore studying certain bacteria that exhale hydrogen as part of their normal metabolism.

"The production of hydrogen by microorganisms is intimately linked to their cellular processes, which must be understood to optimize bioenergy yields," said Amy VanFossen of North Carolina State University.

Of particular interest are microbes that thrive in hot temperatures, near the boiling point of water. VanFossen and her colleagues carried out a detailed DNA study of one of these thermophilic (heat-loving) bacteria called Caldicellulosiruptor saccharolyticus, which was first found in a hot spring in New Zealand.

The results, presented last week at the American Chemical Society meeting in Philadelphia, indicate which genes allow C. saccharolyticus to eat plant material, referred to as biomass, and expel hydrogen in the process.

Fuel cell vehicles are starting to be available for lease in California and the New York area. They run on hydrogen gas and emit only water vapor from the tailpipe.

Hydrogen can be found everywhere: it's the "H" in H2O and a major element in biological processes. The problem is that it takes quite a bit of energy to separate the hydrogen from the molecules it is found in.

However, certain organisms, such as the bacteria in cow stomachs, get energy from food through a chemical reaction that releases hydrogen gas. Often this hydrogen is immediately taken up by other bacteria, called methanogens, that convert it to methane.

One of the challenges, therefore, of producing hydrogen from bacteria is to prevent the methanogens from gobbling up the gas. The advantage of thermophiles is that they operate at temperatures that are typically too hot for methanogens. C. saccharolyticus, for example, prefers a toasty 160 degrees Fahrenheit (70 degrees Celsius).

Furthermore, the chemistry of hydrogen formation is easier at these higher temperatures, said Servé Kengen from Wageningen University in the Netherlands.

"In general, thermophiles have a simpler fermentation pattern compared to [lower temperature] mesophiles, resulting in fewer byproducts," he said.

Bionic microbe
Kengen is part of a European Union project called Hyvolution, which is developing decentralized hydrogen production that can be performed near where biomass is grown.

"Biological hydrogen production is well suited for decentralized energy production," Kengen said. "The process is performed at almost ambient temperature and pressure, and therefore it is expected to be less energy intensive than thermochemical or electrochemical production methods [which are alternative ways to get hydrogen]."

Kengen said that C. saccharolyticus, or what he calls "Caldi," is very attractive for this application. It is unique in that it eats a wide range of plant materials, including cellulose, and can digest different sugars (technically carbohydrates) at the same time.

"The wide range of carbohydrates it grows on suggests that C. saccharolyticus will yield a plethora of industrially relevant carbohydrate degrading enzymes," VanFossen told LiveScience.

These enzymes — now isolated through VanFossen's genetic analysis — could help get more hydrogen from a given quantity of biomass.

"Once we are able to engineer Caldi (not yet possible) we want to further improve its hydrogen producing capacity," Kengen said.



Why US must invest against climate change

Eight scientific organisations have urged the next US president to help protect the country from climate change by pushing for increased funding for research and forecasting. The organisations say about $2 trillion of US economic output could be hurt by storms, floods and droughts.

"We don't think we have the right kind of tools to help decision makers plan for the future," said Jack Fellows, the vice president for corporate affairs of the University Corporation for Atmospheric Research, a consortium of 71 universities.

The groups, including the American Geophysical Union and the American Meteorological Society, urged Democratic presidential candidate Barack Obama and Republican rival John McCain to support $9 billion in investments between 2010 and 2014 to help protect the country from extreme weather, which would nearly double the current US budget for the area.

The UN's science panel says extreme weather events could hit more often as temperatures rise due to climate change.

Each year the United States suffers billions of dollars in weather-related damages ranging from widespread events like Hurricanes Katrina and Rita, and the more recent droughts in the Southeast, to smaller, more frequent glitches like airline delays from storms, they said. More than a quarter of the country's economic output, about $2 trillion, is vulnerable to extreme weather, they added.

The investments would pay for satellite and ground-based instruments that observe the Earth's climate and for computers to help make weather predictions more accurate.

Invest to protect

John Snow, the co-chairman of the Weather Coalition, a business and university group that advocates for better weather prediction, said improved computers would help scientists forecast extreme weather events more locally, which could help cities better prepare for weather disasters.

It could also help businesses that produce virtually no greenhouse emissions, such as wind farms, know where to best locate their operations, he said.

The scientists said cooler temperatures in the first half of this year are making their task more difficult. "One of the challenges we face ... is to make the case that while we are in a period of warming, we should not expect every year to be the warmest year on record," Snow said.

The global mean temperature to the end of July was 0.28 C above the 1961-1990 average, the UK's Met Office, which conducts climate change research, said on Wednesday. That would make the first half of 2008 the coolest since 2000.

Neither campaign responded immediately to questions about the plea for funding. Obama and McCain, who face off in a November election, both support regulation of greenhouse gases through market mechanisms such as cap-and-trade programs on emissions.


Ford Tests Improve Gas Mileage 24% with EcoDriving

Ford is really throwing down the gauntlet by showing how dedicated it is to the new EcoDriving initiative we talked about the other day. I really liked that initiative because it validates a lot of what we're trying to do on the forums in terms of improving fuel economy on an individual level, and it showed that automakers were willing to commit (at least in name) to supporting fuel-efficient driving. Now Ford has really stepped up to the plate by offering ecodriving lessons over the course of several days to see how effective they really are.

Ford takes on ecodriving

Recently, Ford and a group called Pro Formance decided to take on ecodriving in the form of a four-day seminar with 48 different drivers taking part. Using the ecodriving tips taught by Pro Formance, the participants increased their fuel economy by 6 to 50 percent, with an average increase of 24%.
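It's worth working out what a 24% fuel-economy gain actually means in fuel used. A minimal sketch, using assumed annual mileage and MPG figures rather than anything from Ford's test:

```python
# What a 24% fuel-economy improvement means in gallons of fuel, for a
# hypothetical driver. The mileage and baseline MPG are assumptions.
miles_per_year = 12_000
base_mpg = 25.0
improved_mpg = base_mpg * 1.24            # 24% better economy -> 31.0 MPG

base_gallons = miles_per_year / base_mpg          # 480 gallons
improved_gallons = miles_per_year / improved_mpg  # ~387 gallons
saved = base_gallons - improved_gallons
print(f"Gallons saved per year: {saved:.0f}")     # ~93

# Note the asymmetry: a 24% MPG gain cuts fuel *use* by only
# 1 - 1/1.24, about 19%, because MPG is the inverse of consumption.
print(f"Fuel-use reduction: {1 - 1 / 1.24:.1%}")
```

The last line is the useful caveat when reading percentage claims like these: a percentage improvement in miles per gallon is always a somewhat smaller percentage reduction in fuel burned.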

With the gas crunch hitting people hard, it's good to see a company like Ford stepping up and showing consumers that there's more to saving fuel than just airing up your tires and cleaning out the trunk. Here's their take on ecodriving:

“By working with Pro Formance to conduct validation testing, Ford is proving that eco-driving techniques are teachable and work across a broad spectrum of vehicles and drivers,” said Drew DeGrassi, president and CEO of Pro Formance Group. “It’s a great initiative for Ford to lead in this country. It’s not the end-all solution for America to obtain energy independence, but it is an important part of it.”

I would love to see what the training program is like, but for the rest of us, Ford gives 10 ecodriving tips. Sure, they pale in comparison to EcoModder's ecodriving tips list, but most drivers aren't interested in getting really involved, and Ford's hands-on approach is a good way to get results without asking too much of people.

Evidently, ecodriving training has been offered since the 1990s in Germany, where gas mileage has been an issue for longer than it has in the US. Hopefully, this will encourage other manufacturers to bring their most efficient vehicles and programs to a ready-and-willing US market.



Scientists: Save the planet—have fewer kids

As rising populations strain a warming planet, a British journal suggests having smaller families

Chicago Tribune correspondent

LONDON — There are plenty of ways to cut your carbon footprint, whether it's driving less or buying an energy-efficient refrigerator. But the British Medical Journal, in an editorial last month, urged a more controversial one: having fewer children.

With 60 million people already living in one of the most densely populated countries in the world, the journal said, British couples should aim to have no more than two children as part of their contribution to worldwide efforts to reduce carbon emissions, stem climate change and ease demands on the world's resources.

Limiting family size is "the simplest and biggest contribution anyone can make to leaving a habitable planet for our grandchildren," the editorial's authors said.

Family planning as a means to reduce climate change has been little talked about in international climate forums, largely because it is so politically sensitive. China's leaders, however, regularly argue that their country should get emission reduction credits because of their one-child policy, and many environmentalists—and even a growing number of religious and ethics scholars—say the biblical command to "be fruitful and multiply" needs to be balanced against Scripture calling for stewardship of the Earth.

Europe's rates diving

Increasingly, "a casual attitude toward global warming ought to be viewed as a sin," argues James Nash, director of the Churches' Center for Theology and Public Policy, a Washington-based research group that studies the relationship between Christian faith and public policy.

The appeal to have fewer children sounds a bit odd in Europe, where one of the biggest worries these days is plunging birthrates. German women today bear an average of 1.3 children, fewer than women in China, where the one-child policy is fast weakening. Even British women are giving birth to just 1.9 children on average, a level below that needed to produce a stable population.

But each child born in a rich country like Britain or the United States is likely to be responsible for 160 times as much carbon emitted as a child born in Ethiopia, said John Guillebaud, a British family-planning doctor, professor and one of the authors of the British Medical Journal editorial. With efforts to cut emissions likely to go only so far, cutting births may be the best option, he said.

"We're not Big Brother. We're not for pushing people," he insisted in an interview. "We just think deciding how big a family to have should take into consideration our descendants."

At the current projected rates of growth, the world's population, now at 6.7 billion, is expected to reach about 9 billion by 2050. Environmentalists argue that a population that large will dramatically overtax the world's resources and lead to growing conflict as well as potentially crippling climate change, particularly as poorer parts of the world develop and begin using more resources.

Most of the expected growth in population is projected to come in less-developed parts of the world, particularly Asia, where 60 percent of the world's people live, and Africa, where birthrates are the highest in the world.

Worldwide, population growth is declining, and even in much of Asia and Africa "the drop in fertility rate has been quite amazing," said Werner Haug, director of the United Nations Population Fund's technical division. Despite falling international investment in family planning, Thailand today has a European-like birthrate; Kenyan women, who once averaged eight children, are now having five.

Overall, Asia's birthrate, excluding China, is 2.8 children per woman, and Africa's is 5.4—well down from the past, said Carl Haub of the Washington-based Population Reference Bureau, an independent organization that analyzes demographic data.

Asia set for boom

But because a birthrate above 2.1 children per couple — the approximate replacement level, allowing for some untimely deaths—will produce ever-expanding growth, even Asia is still set to "grow like wildfire," Haub said.
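Haub's point about compounding can be illustrated with a toy calculation. The sketch below assumes a fixed generation length and treats the total fertility rate (TFR) divided by the 2.1 replacement level as the per-generation growth factor, a deliberate simplification of real demography (it ignores age structure and changing mortality):

```python
# Toy model: how a fertility rate above the ~2.1 replacement level
# compounds population across generations. All parameters here are
# illustrative assumptions, not demographic projections.
REPLACEMENT_TFR = 2.1

def population_after(generations, start_pop, tfr):
    """Scale population by (tfr / replacement) once per generation."""
    growth_per_generation = tfr / REPLACEMENT_TFR
    return start_pop * growth_per_generation ** generations

# Asia's quoted non-China TFR of 2.8 vs. replacement-level 2.1,
# starting from a nominal one million people:
print(population_after(3, 1_000_000, 2.8))  # ~2.37 million after 3 generations
print(population_after(3, 1_000_000, 2.1))  # flat at 1 million
```

Even a modest-sounding excess over replacement (2.8 vs. 2.1, a factor of 4/3 per generation) more than doubles the population within three generations, which is the "grow like wildfire" dynamic Haub describes.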

The problem is worst in places such as northern India, where literacy, education and access to birth control are poor and poverty levels and population numbers are already high. If those conditions continue, runaway growth could push India toward a population of 2 billion people, Haub said. Sub-Saharan Africa, at expected growth rates, is likely to nearly triple its population by 2050, also to about 2 billion people, he said.

Even in the United States, birthrates, which had fallen to around 1.85 children per non-Hispanic white woman, are now about 2.1 children per U.S. couple, thanks to Hispanic migration.

In a nation where Texas' 23 million people account for more greenhouse gas emissions than all 720 million Sub-Saharan Africans, even small rates of U.S. population growth may have a disproportionate impact on global warming, said the UN's Haug.

Experts say the best way to cut the world's birthrate is simply to push ahead with what has worked best in the past: education, access to information about birth-control options, and better health care to give parents confidence that children born will survive to adulthood.
