Wednesday, April 9, 2008

Rare Quasar Discovered That Produces More X-rays Than Thought Possible

Artist's impression of a rare type of quasar, called a broad absorption line (BAL) quasar. Quasars are vast cosmic engines that pump energy into their surroundings. It is thought an enormous black hole drives each quasar. (Credit: ESA, Image by C. Carreau)

XMM-Newton has been surprised by a rare type of galaxy, from which it has detected a higher number of X-rays than thought possible. The observation gives new insight into the powerful processes shaping galaxies during their formation and evolution.

Scientists working with XMM-Newton were looking into the furthest reaches of the universe, at celestial objects called quasars. These are vast cosmic engines that pump energy into their surroundings. It is thought an enormous black hole drives each quasar.

As matter falls into the black hole, it collects in a swirling reservoir called the accretion disc, which heats up. Computer simulations suggest that powerful radiation and magnetic fields present in the region eject some of the gas from the gravitational clutches of the black hole, throwing it back into space.

This outflow has a profound effect on its surrounding galaxy. It can create turbulence in the gas throughout the galaxy, hampering star formation. Thus, understanding quasars is an important step to understanding the early history of galaxies.

However, the structures surrounding quasars are difficult to see because the quasars are so distant. The light and X-rays from them take thousands of millions of years to reach us.

About 10-20% of quasars are of a special type called BAL quasars. The BAL stands for ‘broad absorption line’ and seems to indicate that a thick cocoon of gas surrounds the quasar.

Most researchers believe that gas flows away from a BAL quasar along the equatorial direction of the accretion disc. These quasars show little X-ray emission, indicating that there is enough gas to absorb most of the X-rays given out from the region near the black hole.

But some BAL quasars appear to be spewing material out along their polar axes, at right angles to the accretion discs.

JunXian Wang, Center for Astrophysics, University of Science and Technology of China, Hefei, and his colleagues including Tinggui Wang and Hongyan Zhou, used XMM-Newton to target four such polar BAL quasars, identified by them previously. They were investigating whether the X-rays were being absorbed strongly.

XMM-Newton observed the quasars at specific times during 2006 and 2007. Two of them emitted more X-rays than the researchers anticipated, indicating that there is no veil of absorbing gas surrounding these particular quasars. “Our results can help refine the computer simulations of how these quasars work,” says Wang.

It may mean that BAL quasars are more complicated than originally thought. “Perhaps there can be both equatorial outflows and polar outflows simultaneously from these objects,” says Wang. Maybe, the outflows are even produced by similar means.

Computer simulations suggest that the polar outflows, like the gas ejected from the accretion disc, are also material falling in, turned away by fierce radiation before it comes near the black hole.

Wang and colleagues are now following this work up. They hope to monitor more BAL quasars over a longer period of time. “We need more data so that we can look into the details of the X-ray emission,” says Wang.

It seems that the more astronomers look into the distant universe, the more complex it becomes.

Original here

Do We Have the 'Right Stuff' to Put an Astronaut on Mars?

"All civilizations become either spacefaring or extinct."

Carl Sagan

It took only eight years for JFK’s dream of landing a man on the moon to be fulfilled, but plans to land a man on Mars are going to take a little bit longer: 24 years, to be exact. At least we know how we’re going to go about getting our astronauts there.

NASA is serious about launching the most difficult mission ever attempted by the human race - putting an astronaut on Mars. The voyage will cover hundreds of millions of miles and take two and half years roundtrip. It sounds like science fiction. To make it scientific fact, the United States needs to first flex its deep space muscles again on familiar terrain - the moon. It's called the Constellation program.

"The accuracy with which you need to target a landing site on the surface is like throwing a basketball from New York to Los Angeles and having it go through without touching the rim," explains Steve Squyres, the principal investigator for the Mars rovers Spirit and Opportunity..

If the astronauts actually make that shot, and if they land on Mars, they will face a deadly environment: radiation from solar flares, dangerous dust and temperatures that average 60 degrees below zero. And they’ll have to endure it for up to 18 months, before the Earth and Mars align properly again for a faster return home. No astronauts have ever spent that amount of time on an alien world. Neil Armstrong was on the moon for less than a day.

"And I think it’s more responsible for us to go to the moon, check out these systems, make sure the life-support systems, the space suits, the little things we need for these long voyages, work properly," explains Dr. Rick Gilbrech is NASA’s exploration chief.

During Apollo, the furthest the astronauts could ever venture out on their lunar rovers was six miles. NASA hopes the new rovers will let the astronauts explore 60 miles from their spacecraft. Technological advancements will help in another way. Think about this: There is more computing power in your average cell phone today than there was on any of the Apollo spacecraft that took the astronauts to the moon.

Another example of how the new missions might be different is the robonaut, which looks like a cousin of C-3PO. It’s an early model of a robot that might assist the astronauts with mundane and sometimes dangerous tasks on the moon.

NASA isn’t using the moon just to train for Mars. Next year, it will launch orbiters around the moon and then essentially blast the lunar surface. In the midst of the debris field, NASA hopes to find evidence of hydrogen, which could one day help fuel trips home for the astronauts. But will there be any missions for the astronauts at all?

The biggest obstacle NASA faces is money. One critic has called the Constellation program "Apollo on food stamps." During the 1960s, 4 percent of the entire national budget was spent on space. Today one-sixth of 1 percent goes to NASA.

Thanks to some new details released by NASA about its Constellation manned Mars mission, we now have an insight into what the missions will look like.

A 400,000kg (880,000lb) spacecraft would be constructed in space. Because the craft must be able to act autonomously of NASA control, make the distance, and provide for its crew over a 900-day mission, the size of the ‘marscraft’ is obviously going to exceed that of the current space shuttles.

It would take three to four Ares V rockets to launch the elements of the spacecraft into low Earth orbit.

A ‘minimal crew’ would make the journey to Mars, taking approximately six to seven months to traverse the distance. A total of 550 days would be spent on the surface of Mars before returning. Sent every 26 months to the red planet, the crews would need to take up to 50,000kg of cargo with them. They’d need an ‘aerodynamic and powered descent’ and autonomy from, or at least asynchronous communication with, NASA control.

However, sending the crew is not the only part of this mission. Where are they going to live? How will all their equipment make the journey? That’s why their mission will be preceded by two separate missions.

With a theoretical launch date of February 2031 for the manned mission to Mars, a cargo lander and surface habitat would be launched in December 2028 and January 2029, respectively, using two Ares V launches. Subsequently, the lander will arrive in October 2029, and the habitat a month later; the crew will arrive in August 2031.

The second set of pre-launches will occur in late 2030/early 2031 and is anticipated to reach Mars at the same time as the first crew. Thus, in the first quarter of 2033, the second mission’s crew will launch, arriving on Mars by December, with the first crew having left in January of that same year, after a 17-month stay.

Will we see human settlements on Mars? Or is it all just a dream? Will the American public even support traveling to places humans can barely imagine?

Posted by Josh Hill with Casey Kazan.

Original here

If ET Calls, Would We Be Told?

If a verified message from aliens is ever received, would the public be told about it? SETI – the Search for Extra Terrestrial Intelligence - does have an international protocol that if an alien signal is ever received, it would be disseminated among the astronomical community and made public. And of course, says Mac Tonnies at the SETI Blog, “international cooperation might be necessary in order to distinguish a legitimate alien signal from any number of phenomena capable of generating false alarms.” But what if the signal is more than just extra-terrestrials saying hello? Tonnies believes SETI's plans for full disclosure only makes sense if the message is fairly benign. If the signal was a notice of impending doom from a black hole, supernova, or alien invasion – something we on Earth had little power to do anything about – Tonnies questions whether governments would choose to make such information public. But could something of this magnitude really be kept under wraps?

Frankly, I hadn’t really considered this scenario. When I think about SETI and the possibility of communication with an alien species, I envision, perhaps naively, what Tonnies calls the “lofty, abstract dialogue immortalized by Carl Sagan.” But of course, we have no idea of what any alien intelligence would like to say to us. If it was bad news, would governments of the world elect to withhold the information from the public?

Intrigued by Tonnies’ blog post, I contacted him to ask that question.

“I think it's a very real possibility that generally goes unspoken,” said Tonnies, an author, essayist and blogger. “In the event of a bona fide signal, the public may only be made privy to part of it. It depends on the content and context of the message.”

Original here

Polar Bears

[Gallery: 32 thumbnail images of polar bears]

How Gunpowder Changed the World

When gunpowder was used to create personal handguns and rifles, a new type of soldier was created: infantry. Photo shows a Revolutionary War reenactment at Fort Ward Historic Site in Alexandria, Virginia. Credit: Igor Vorobyov, Dreamstime

Each Monday, this column turns a page in history to explore the discoveries, events and people that continue to affect the history being made today.

Ironically, it was a quest for immortality that led to the invention of the deadliest weapon before the arrival of the atomic bomb.

Experimenting with life-lengthening elixirs around A.D. 850, Chinese alchemists instead discovered gunpowder. Their explosive invention would become the basis for almost every weapon used in war from that point on, from fiery arrows to rifles, cannons and grenades.

Gunpowder made warfare all over the world very different, affecting the way battles were fought and borders were drawn throughout the Middle Ages.

Flying fire

Chinese scientists had been playing with saltpeter — a common name for the powerful oxidizing agent potassium nitrate — in medical compounds for centuries when one industrious individual thought to mix it with sulfur and charcoal.

The result was a mysterious powder from which, observers remarked in a text dated from the mid-9th century, "smoke and flames result, so that [the scientists'] hands and faces have been burnt, and even the whole house where they were working burned down."

Gunpowder was quickly put to use by the reigning Sung dynasty against the Mongols, whose constant invasions into the country plagued the Chinese throughout the period. The Mongols were the first to be subject to flying fire — an arrow fixed with a tube of gunpowder that ignited and would propel itself across enemy lines. More gunpowder-based weapons were invented by the Chinese and perfected against the Mongols in the next centuries, including the first cannons and grenades.

The psychological effect alone of the mystifying new technology likely helped the Chinese win battles against the Mongols, historians believe.

Explosive trade

Gunpowder somehow remained a monopoly of the Chinese until the 13th century, when the science was passed along the ancient silk trade route to Europe and the Islamic world, where it became a deciding factor in many Middle Age skirmishes.

By 1350, rudimentary gunpowder cannons were commonplace in the English and French militaries, which used the technology against each other during the Hundred Years' War. The Ottoman Turks also employed gunpowder cannons with abandon during their successful siege of Constantinople in 1453. The powerful new weapon essentially rendered the traditional walled fortifications of Europe, impregnable for centuries, weak and defenseless.

The next important step for gunpowder came when it was inserted into the barrel of a handgun, which first appeared in the mid-15th century and was essentially a cannon shrunk down to portable size. Guns literally put weaponry into the hands of the individual, creating a new class of soldier — infantry — and giving birth to the modern army.

Gunpowder is still the basis for many modern weapons, including guns, though it's certainly no longer the most explosive force available to armies.

Need to celebrate a victory in battle, though? Gunpowder is there for you. The powder is also at the heart of the fireworks that make the Fourth of July and other holidays so special. To produce the aerial spray of reds, golds and blues, pyrotechnicians pack a tube with gunpowder, colorizing chemicals and small pellets that create the shape and shimmer of the firework.

Original here

When Roses Won’t Do, E-Mail a Fragrance

After satisfying the senses of sight and sound through video streams and music downloads, NTT Communications aims to tap into the sense of smell with a new system that allows users to send fragrances from their cell phones.

A trial of the service will take place later this month, during which users will be able to select and send certain fragrance recipes to an in-home unit that is responsible for concocting and releasing the various fragrances. Each unit holds 16 cartridges of base fragrances, or essences, which are mixed to produce the various scents, much as a printer mixes inks to produce other colors.

Transforming the mood of a room with a new scent is quite easy with this technology.

The first step is to choose a scent from the multitude of fragrance recipes available through an I-mode site on a cell phone. Once a scent is chosen, the instructions for making it are transmitted from the phone to the fragrance device via infrared, and from there the scent is quickly mixed and emitted.

If distance is an issue, the other option is to send the instructions to the device via an e-mail message. The message is intercepted by a home gateway unit that is latched to the home’s broadband connection and sends the instructions to the fragrance device at home. Using this method users can set the time and date of fragrance emission, so one can come home to the relaxing scent of lavender, for example.

There's even room for creating customized scents, which can be shared with other users through the fragrance "playlist" on the Web site.

The technology is not only limited to creating a pleasant-smelling workplace or home. NTT also sees it as a way to enhance multimedia content. For example, instead of just sending an image of a bouquet of roses to a friend, one can boost the experience by sending the fragrance as well.

NTT hopes the fragrance emitter will cost about ¥20,000 (US$195) when eventually launched commercially. Cartridge refills should cost about ¥1,600 it said.

NTT Communications believes that fragrance is the next important medium for telecommunications, as more value is placed on high sensory information. Through a company sponsored Internet survey, NTT found that 56 percent of people polled use aromatherapy or believe that it has positive benefits.

"Aromatherapy can reduce stress and help you relax, and to be able to control smell implies one has the power to manipulate feelings as well," said Akira Sakaino, from NTT Communications' Net Business Division.

NTT has been developing this technology, which it calls "kaori tsushin," since 2004, and has collaborated with various outfits to test the service.

Applications have ranged from fragrance rooms in hotels in Tokyo and Osaka to aroma advertising through digital signage, where fragrances were made to match audio-visual content, located in pubs, parking lots and railway stations around Tokyo.

The fragrance communication mobile service test will take place from April 10 to 20 and involves 20 monitors who are tasked to give feedback on the service.

Original here

'Yeti' fly lost for 40 years is rediscovered

One of the most exotic and elusive flies known to science has been rediscovered, four decades after it was first found buzzing around a Caribbean crab.

'Yeti' fly is rediscovered: Courting of Drosophila endobranchia flies on their host crab

"To me it was like seeing the Yeti," exclaims Dr Marcus Stensmyr, one of the team that made the find in an expedition to Grand Cayman in the Caribbean, the sole known home of this species, to resolve enduring questions about how the fly fits into the tree of life.

To the untrained eye, it looks like just another fruit fly, one of the Drosophilidae, a family consisting of about 3,000 species.

In fact, most members of the family feed on microbes, not fruit, and one of the stranger choices of habitat is made by this particular fly, Drosophila endobranchia, one of only three fruit flies known to have found a home on (and inside) land crabs.

To the surprise of scientists, though probably not to most of us, these crab flies have been neglected by researchers since their description in a 1966 paper. In fact, D. endobranchia had not even been seen since its initial discovery.

To fill in this glaring gap in knowledge, scientists from Prof Bill Hansson's group at the Max Planck Institute for Chemical Ecology, Germany, set out last year to relocate these elusive flies on Grand Cayman.

Expedition members Dr Marcus Stensmyr and Regina Stieber pored over the original notebooks of Hampton Carson, who first found the crab flies in 1965, but were disappointed to find that all of his sites had been developed.

Soon after, says Dr Stensmyr, came the Eureka moment, which led to the collection of 66 specimens so they could resolve detailed arguments about how to place the flies in the Drosophila family:

"One day driving around we came across a place which to me looked suitable to crab life, but which did not fit the original description of the habitat. We nevertheless decided to revisit the area after darkness. Upon arriving to the area we almost immediately saw in the headlights from our Jeep this enormous crab sitting by the roadside.

This was the first decent sized crab we had seen since arriving on the island. I dashed out of the car, ran up to the crab, and in the beam of the flashlight I saw flies scurrying over the back of the crabs! I could hardly believe it! The flies were real, they were still on the island and they actually lived on land crabs."

"The flies strike a very peculiar sight in real life. They are absolutely reluctant to leave their crab hosts, no matter what," he says. "Even after you have picked up the crabs, something which the crabs obviously are not too happy about, the flies still sit tight."

"There are still many unanswered questions regarding the flies which we hopefully will be able to address in the coming years. A key mystery is what the heck the adult flies are doing on the crabs. When you see them they most of the time just sit there, they don't seem to be doing anything."

Original here

Scientists Construct Model of the World Wide Web

Traffic statistics for the Web page, where the long periods of low activity and short bursts of high activity are similar to other sites that the researchers analyzed. Credit: Simkin and Roychowdhury.
Although the Internet contains well over 100 million Web sites, two electrical engineers think they know what the traffic patterns of the entire Web look like.


Mikhail Simkin and Vwani Roychowdhury, electrical engineers at the University of California, Los Angeles, have constructed a model of the Web using the traffic statistics of just three Web pages. (Traffic patterns from a dozen other Web pages the researchers studied were very similar.) Using several years of data from these three pages, the researchers show how the Internet overall reaches a self-organized critical (SOC) state with long-lasting traffic.

“One of the main implications of our findings is that traffic and [the corresponding] fame is a prolonged phenomenon instead of a one-time fling, and recurs in a spasmodic fashion,” Roychowdhury said.

Most of the time, traffic to any single Web page is relatively low and steady, where visitors come from search engines, Web directories, online encyclopedias, and other constant sources. But these long periods of low traffic are interrupted by bursts of heavy traffic that follow a power law, usually the effect of numerous blog entries linking to those pages.

The researchers use a branching model to describe the probability and extent of these bursts. Basically, there’s a certain probability that a viewer will post a blog entry with a link to that Web page, and then a certain number of viewers who will visit the Web page via the blogger’s link. The product of these two variables determines whether or not a Web page will reach the critical value of 1, which determines if the branch keeps growing or dead-ends.

“A system is in a critical state if a single movement in an individual constituent element leads, on the average, to the movement of precisely one other element in the system,” Roychowdhury explained.

If a system is in a super- or sub-critical state, movement of one element leads, on average, to the movement of either more or less than one other element, respectively. That means that a signal generated in a super-critical system should increase forever, while a signal in a sub-critical system eventually dies out.

“But in a critical system, something very interesting happens,” Roychowdhury said. “Almost all signal cascades will die out, but some of them can last for a long time and can cover a large area. Clearly, sub- and super-critical systems are not that interesting unless we want a system that is either not that responsive or a system that explodes at the slightest provocation. Critical systems, on the other hand, allow for a responsive system to exist without it being blown apart. Many physical systems naturally gravitate towards a critical state, and this phenomenon is termed SOC.”

As the researchers explain, competition for viewers and links is a driving force of the Web, and this competition pushes the entire Web into an SOC state. Based on their data, the researchers determined the values for the two variables above for the “true art or fake art” site that closely produce its traffic patterns: they found that its link probability of 0.01 and referral number of 95 visitors per link results in a slightly sub-critical value of 0.95 for that particular Web page.
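The branching picture can be sketched in a few lines of Python. This is an illustrative toy simulation, not the authors' actual model; the parameter values (link probability 0.01, 95 visitors per link) are the ones quoted in the article:

```python
import random

def cascade_size(p_link, referrals, max_total=100_000):
    """Simulate one traffic burst as a branching process.

    Each current visitor posts a blog link with probability p_link;
    each posted link brings `referrals` new visitors. The branching
    ratio m = p_link * referrals decides the fate of the cascade:
    for m < 1 it dies out, for m > 1 it can grow without bound.
    """
    active, total = 1, 1
    while active and total < max_total:
        # Count how many of the current visitors post a link...
        links = sum(1 for _ in range(active) if random.random() < p_link)
        # ...and each link recruits a fixed number of new visitors.
        active = links * referrals
        total += active
    return total

# Values quoted for the slightly sub-critical page in the article:
m = 0.01 * 95                 # branching ratio of 0.95
mean_burst = 1.0 / (1.0 - m)  # expected burst size for m < 1, about 20
```

For a sub-critical branching process the expected total cascade size is 1/(1 - m), so the quoted ratio of 0.95 implies bursts of roughly 20 visitors on average, though individual cascades vary enormously near the critical point.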

But since some Web pages are more interesting than others, some pages will achieve the critical value of 1 or even surpass it.

“To explain how the Web evolves into the SOC state, we need to use the concept of Darwinian fitness, which is a scientific measure of digital fangs and claws that help the Web page to fight for links with its competitors,” Simkin said. “The success in this competition depends not only on the Web page's own fitness, but also on the average fitness of other pages currently discussed in the blogosphere, with which our Web page competes.”

If this average is low, Simkin explained, then the fittest papers are super-critical. This means that, with time, they increase their share of the blogosphere. But in turn, this leads to the increase of the average fitness. The process continues until the fittest pages become exactly critical.

“One finding that is important for Webmasters is that our work disproves the so-called fifteen minutes of fame paradigm, according to which things can get popular soon after release and quickly become forgotten,” Simkin said. “One, of course, knows that this paradigm is manifestly wrong for immortal classics. However, our work shows it to be wrong not only for great creations, but for anything which is of any intrinsic (not created by advertisement) merit.”

The researchers found that the traffic to a Web page with fixed content can persist for at least several years.

“So one should not hurry to delete old Web pages,” Simkin said. “When there is enough – say a year – of access statistics, our model can be used to infer a page's fitness and predict the average volume and fluctuations of future traffic.”

Roychowdhury is a cofounder, and Simkin a consultant, for a start-up company that focuses on next-generation Internet advertising. The company utilizes similar physics-based modeling of the Web, though not the direct results of the present study.

The researchers add that the Web is just one of many complex systems that exhibit self-organized criticality, with other examples including evolutionary patterns, earthquakes, and citations in research papers. They suggest that their model could also be used to explain the spreading of cultural elements, like movies, books, and fashion styles.

Original here

Fear in the genes

Even though fear is partly genetic, the things that terrify us change as we age. (Image: Getty)

If snakes strike terror in your toddler’s heart, he might still grow to be brave. A tendency toward fearfulness does have genetic underpinnings, but those shift several times as children become adults, a study has found.

The worries of adolescents differ from those of young children — fear of the dark gives way to squeamishness about blood in a well-documented developmental progression. Now, psychiatrist Kenneth Kendler of the Medical College of Virginia in Richmond and his colleagues have found that the genetic factors that leave a person prone to fear also shift during development.

To tease apart the effect of genes and upbringing, the researchers tracked 2,490 Swedish twins as they aged from 8 to 20 years old, asking them to answer questions sent by mail. The twins were quizzed on whether they were afraid of 13 potentially terrifying phenomena, including lightning, dentists, spiders and heights.

At every age, a child was more likely to be fearful if his or her identical twin was too. Fraternal twins also shared a tendency towards fearfulness, but the link was less strong, indicating a genetic component to fearfulness.
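The logic of comparing identical and fraternal twins can be sketched with Falconer's classic heritability estimate. The correlation values below are hypothetical illustrations, not figures from the Kendler study:

```python
def falconer_heritability(r_mz, r_dz):
    """Falconer's estimate of heritability from twin correlations.

    Identical (MZ) twins share essentially all of their genes while
    fraternal (DZ) twins share about half, so the excess MZ
    correlation is doubled to estimate the genetic contribution:
    h^2 = 2 * (r_mz - r_dz).
    """
    return 2.0 * (r_mz - r_dz)

# Hypothetical correlations for fearfulness scores (not from the study):
h2 = falconer_heritability(0.60, 0.35)
print(f"estimated heritability: {h2:.2f}")
```

If identical twins correlate no better than fraternal twins, the estimate is zero and the shared tendency is attributed to environment rather than genes.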

Fear factors

However, despite this evidence for a genetic effect, children weren’t consistently prone to fear as they grew up. Evidence for multiple fear factors comes from comparisons between ages: some twins were similarly fearful at age 8, but not at older ages.

Similarly, young adult respondents who were easily frightened were no more likely to have had a fearful identical twin during early adolescence, the team reports in the Archives of General Psychiatry. “You might have fairly substantial changes in levels of fearfulness over time because different genetic effects are coming online at different ages,” Kendler says.

The genes that contribute to fearfulness at different ages remain unknown, as evidence for the shift lies entirely in the strength of the links between fear levels in identical twins across time.

Work to identify specific genes for such complex traits is in its infancy, says psychiatrist Murray Stein, who studies the biological underpinnings of anxiety at the University of California, San Diego. “We are starting to see findings of specific genes being associated with particular kinds of temperaments,” Stein says, but he notes that variation in any of these genes explains only a very small percentage of variability in human behaviour.

Stein thinks that the study highlights the importance of recognizing that the factors that lead to excessive fears, or phobias, may change over time. “It could be that we’re going to need very different interventions at different stages of people’s growth and development,” he says.

Original here

A Disease That Allowed Torrents of Creativity

Image of a migraine by Anne Adams, who was drawn to structure and repetition. She had a rare disease that changes connections between parts of the brain.

If Rod Serling were alive and writing episodes for “The Twilight Zone,” odds are he would have leaped on the true story of Anne Adams, a Canadian scientist turned artist who died of a rare brain disease last year.

Trained in mathematics, chemistry and biology, Dr. Adams left her career as a teacher and bench scientist in 1986 to take care of a son who had been seriously injured in a car accident and was not expected to live. But the young man made a miraculous recovery. After seven weeks, he threw away his crutches and went back to school.

According to her husband, Robert, Dr. Adams then decided to abandon science and take up art. She had dabbled in drawing when young, he said in a recent telephone interview, but now she had an intense, all-or-nothing drive to paint.

“Anne spent every day from 9 to 5 in her art studio,” said Robert Adams, a retired mathematician. Early on, she painted architectural portraits of houses in the West Vancouver, British Columbia, neighborhood where they lived.

In 1994, Dr. Adams became fascinated with the music of the composer Maurice Ravel, her husband recalled. At age 53, she painted “Unravelling Bolero,” a work that translated the famous musical score into visual form.

Unbeknown to her, Ravel also suffered from a brain disease whose symptoms were identical to those observed in Dr. Adams, said Dr. Bruce Miller, a neurologist and the director of the Memory and Aging Center at the University of California, San Francisco. Ravel composed “Bolero” in 1928, when he was 53 and began showing signs of his illness with spelling errors in musical scores and letters.

“Bolero” alternates between two main melodic themes, repeating the pair eight times over 340 bars with increasing volume and layers of instruments. At the same time, the score holds methodically to two simple, alternating staccato bass lines.

“ ‘Bolero’ is an exercise in compulsivity, structure and perseveration,” Dr. Miller said. It builds without a key change until the 326th bar. Then it accelerates into a collapsing finale.

Dr. Adams, who was also drawn to themes of repetition, painted one upright rectangular figure for each bar of “Bolero.” The figures are arranged in an orderly manner like the music, countered by a zigzag winding scheme, Dr. Miller said. The transformation of sound to visual form is clear and structured. Height corresponds to volume, shape to note quality and color to pitch. The colors remain unified until the surprise key change in bar 326 that is marked with a run of orange and pink figures that herald the conclusion.

Ravel and Dr. Adams were in the early stages of a rare disease called FTD, or frontotemporal dementia, when they were working, Ravel on “Bolero” and Dr. Adams on her painting of “Bolero,” Dr. Miller said. The disease apparently altered circuits in their brains, changing the connections between the front and back parts and resulting in a torrent of creativity.

“We used to think dementias hit the brain diffusely,” Dr. Miller said. “Nothing was anatomically specific. That is wrong. We now realize that when specific, dominant circuits are injured or disintegrate, they may release or disinhibit activity in other areas. In other words, if one part of the brain is compromised, another part can remodel and become stronger.”

Thus some patients with FTD develop artistic abilities when frontal brain areas decline and posterior regions take over, Dr. Miller said.

An article by Dr. Miller and colleagues describing how FTD can release new artistic talents was published online in December 2007 by the journal Brain. FTD refers to a group of diseases often misdiagnosed as Alzheimer’s disease because patients become increasingly demented, Dr. Miller said. But the course and behavioral manifestations of FTD are different.

In the most common variant, patients undergo gradual personality changes. They grow apathetic, become slovenly and typically gain 20 pounds. They behave like 3-year-olds in public, asking embarrassing questions in a loud voice. All along, they deny anything is wrong.

Two other variants of FTD involve loss of language. In one, patients have trouble finding words, Dr. Miller said. When someone says to the patients, “Pass the broccoli,” they might reply, “What is broccoli?”

In another, PPA or primary progressive aphasia, the spoken-language network disintegrates. Patients lose the ability to speak.

All three variants share the same underlying pathology. The disease, which has no cure, can progress quickly or, as in the case of Senator Pete V. Domenici, Republican of New Mexico, who announced his retirement last fall because of an FTD diagnosis, over many years.

Dr. Adams and Ravel had the PPA variant, Dr. Miller said.

From 1997 until her death 10 years later, Dr. Adams underwent periodic brain scans that gave her physicians remarkable insights into the changes in her brain.

“In 2000, she suddenly had a little trouble finding words,” her husband said. “Although she was gifted in mathematics, she could no longer add single digit numbers. She was aware of what was happening to her. She would stamp her foot in frustration.”

By then, the circuits in Dr. Adams’s brain had reorganized. Her left frontal language areas showed atrophy. Meanwhile, areas in the back of her brain on the right side, devoted to visual and spatial processing, appeared to have thickened.

When artists suffer damage to the right posterior brain, they lose the ability to be creative, Dr. Miller said. Dr. Adams’s story is the opposite. Her case and others suggest that artists in general exhibit more right posterior brain dominance. In a healthy brain, these areas help integrate multisensory perception. Colors, sounds, touch and space are intertwined in novel ways. But these posterior regions are usually inhibited by the dominant frontal cortex, he said. When they are released, creativity emerges.

Dr. Miller has witnessed FTD patients become gifted in landscape design, piano playing, painting and other creative arts as their disease progressed.

Dr. Adams continued to paint until 2004, when she could no longer hold a brush. Her art, including “An ABC Book of Invertebrates,” a rendering of the mathematical ratio pi, an image of a migraine aura and other works, is displayed at two Web sites.


Physicist Says Particle Will Be Seen

AP Photo/Salvatore Di Nolfi

GENEVA (AP) -- The "father" of an elusive subatomic particle said Monday he is almost sure it will be discovered in the next year in a race between powerful research equipment in the United States and Europe.

British physicist Peter Higgs, who more than 40 years ago postulated the existence of the particle thought to give matter its mass, said his visit to a new accelerator in Geneva over the weekend encouraged him that the so-called Higgs boson will soon be seen.

The $2 billion Large Hadron Collider, under construction since 2003, is expected to start operating by June at the European Laboratory for Particle Physics, which is known as CERN.

It likely will take several months before the hundreds of scientists from all over the world at the laboratory are ready to start smashing together protons to study their composition.

But Higgs said the particle may already have been created at the rival Fermi National Accelerator Laboratory outside Chicago, where the Tevatron is currently the world's most powerful particle accelerator.

"The Tevatron has plenty of energy to do it," said Higgs. "It's just the difficulty of analyzing the data which prevents you from knowing quickly what's hiding in the data."

The massive new CERN collider, which has been installed in a 17-mile circular tunnel under the Swiss-French border, will be more powerful still and will be better able to show what particles are created in the collisions of beams of protons traveling at nearly the speed of light.

The new Geneva collider will recreate the rapidly changing conditions in the universe a split second after the Big Bang. It will be the closest that scientists have yet come to the event that they theorize was the beginning of the universe. They hope the new equipment will enable them to study particles and forces yet unobserved.

But Fermilab still has time to be first if it can show that it has discovered the Higgs boson.

"It's a possibility," Higgs said. "The race is a very close thing. Fermilab are obviously trying very hard. It could be already in their data and just not found in the analysis yet. That's what they're certainly hoping - that they will at least get the first indication before LHC gets going."

Higgs told reporters that he is hoping to receive confirmation of his theory by the time he turns 80 at the end of May next year.

If not, he added, "I'll just have to ask my GP to keep me alive a bit longer."

Higgs predicted the existence of the boson while working at the University of Edinburgh to explain how atoms - and the objects they make up - have mass.

Without the particle, the basic physics theory - the "standard model" - lacks a crucial element, because it fails to explain how other subatomic particles - such as quarks and electrons - have mass.

The Higgs theory is that the bosons create a field through which the other particles pass.

Particles that encounter difficulty going through the field, as though they were passing through molasses, pick up more inertia and hence more mass. Those that pass through more easily are lighter.

Higgs said he would be "very, very puzzled" if the particle is never found because he cannot imagine what else could explain how particles get mass.

Higgs said initial reaction to his ideas in the early 1960s was skeptical.

"My colleagues thought I was a bit of an idiot," he said, noting that his initial paper explaining how his theory worked was rejected by an editor at CERN.

He said a colleague spent the summer at CERN right after he did his work on the theory.

"He came back and said, 'At CERN they didn't see that what you were talking about had much to do with particle physics.'

"I then added on some additional paragraphs and sent it off across the Atlantic to Physical Review Letters, who accepted it. The mention of what became known as the Higgs boson was part of the extra which was added on."


Computer recognizes attractiveness in women

It’s said that a computer can never even get close to processing data the way a human does, but this can easily freak you out. Beauty lies in the eye of the beholder, right? Right?! The thing is that, according to scientists at Tel Aviv University, the beholder doesn’t have to be human.

Amit Kagian, an M.Sc. graduate from the TAU School of Computer Sciences, has successfully “taught” a computer how to understand and process the factors which contribute to attractiveness in women. Aside from this being the modern equivalent of “mirror mirror on the wall” and all the vanity that could come from it, it’s actually quite important. The importance lies in the fact that this is a significant breakthrough in creating artificial intelligence in computers.

“Until now, computers have been taught how to identify basic facial characteristics, such as the difference between a woman and a man, and even to detect facial expressions,” says Kagian. “But our software lets a computer make an aesthetic judgment. Linked to sentiments and abstract thought processes, humans can make a judgment, but they usually don’t understand how they arrived at their conclusions.”

The computer’s analysis takes into consideration factors such as symmetry, smoothness of the skin and hair color. In the first step, 30 men and women were presented with 100 different faces of Caucasian women, roughly of the same age, and were asked to judge the beauty of each face. But the idea that beauty can be reduced to binary data is not new at all; in fact, it dates back to ancient Greece. Pythagoras reasoned that features of physical objects corresponding to the “golden ratio” were the most attractive.
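The TAU software itself is not described in detail here, but the core idea, learning to predict averaged human ratings from measured facial features, can be sketched with a toy nearest-neighbour regressor. The feature names, scales and data below are invented for illustration, not taken from the study:

```python
import math

def knn_predict(train, query, k=3):
    """Predict a score for `query` (a feature vector) as the mean
    rating of its k nearest neighbours in `train`, a list of
    (feature_vector, human_rating) pairs."""
    ranked = sorted((math.dist(feats, query), rating) for feats, rating in train)
    return sum(rating for _, rating in ranked[:k]) / k

# Invented toy data: each face reduced to (symmetry, skin_smoothness,
# hair_lightness), all scaled 0-1, paired with an averaged human
# rating on a 1-7 scale.
faces = [
    ((0.9, 0.8, 0.3), 6.5),
    ((0.8, 0.9, 0.7), 6.0),
    ((0.4, 0.3, 0.5), 3.0),
    ((0.3, 0.4, 0.2), 2.5),
]

# A new face near the highly rated cluster inherits a high prediction.
print(round(knn_predict(faces, (0.85, 0.85, 0.5)), 2))
```

The real system used far richer geometric features and a more sophisticated learner, but the training signal is the same: averaged human judgments standing in for "ground truth" beauty.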


"Tower Lions" May Help Resurrect Extinct African Breed?

An extinct breed of lion from North Africa was held at the Tower of London in medieval times, a new study shows. A pair of skulls unearthed from the tower's moat in the 1930s belonged to Barbary lions, a subspecies that has since died out in the wild.

The discovery raises the possibility that descendants of Barbary lions may still survive in captivity, which could help efforts to resurrect the dark-maned breed, researchers say.

The lions' North African roots were revealed by analysis of mitochondrial DNA, a genetic marker inherited through the maternal line.

What's more, the DNA reveals that the two animals represent the oldest confirmed Barbary lion remains in the world, the study team said.

The findings are reported in the current issue of the journal Contributions to Zoology.

Exploited Population

Radiocarbon dating of the lion skulls in 2005 indicated that the two male cats first came to the tower in the 13th century, the oldest being dated to between A.D. 1280 and 1385.
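Radiocarbon dating rests on the exponential decay of carbon-14. As a rough back-of-the-envelope sketch, with the standard half-life and an invented 92% figure, not the study's actual measurement:

```python
import math

HALF_LIFE_C14 = 5730.0  # years, the commonly used carbon-14 half-life

def radiocarbon_age(fraction_remaining):
    """Years elapsed, given the fraction of carbon-14 still present,
    from the decay law N(t) = N0 * 2 ** (-t / half_life)."""
    return -HALF_LIFE_C14 * math.log2(fraction_remaining)

# A sample retaining about 92% of its original carbon-14 comes out
# roughly 690 years old, consistent with a specimen from around A.D. 1300.
print(round(radiocarbon_age(0.92)))
```

In practice the laboratory work involves calibration curves and measurement uncertainty, which is why the study quotes a range (A.D. 1280-1385) rather than a single year.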

(Read "Medieval Lion Skulls Reveal Secrets of Tower of London 'Zoo'" [November 3, 2005].)

At that time the palace housed the Royal Menagerie, a diverse collection of exotic animals owned by the reigning monarch.

Carcasses of dead animals from the menagerie were likely thrown into the moat, where they became buried in silt, said study team member Richard Sabin of the Natural History Museum in London.

The environment preserved the lion skulls remarkably well, allowing genetic samples to be taken, Sabin said.


Hybrids: separating hope from the hype

Recent weeks have seen a furore erupt over controversial laws that would allow the creation of human-animal embryos for research, with much talk of the sacredness of human life and the danger of scientists playing God.

But there are some, both professors and priests, who are using the facts in ways that are at best selective and at worst misleading. More importantly, even though proponents say there is a moral imperative to do hybrid embryo research to find cures, some scientists are sceptical that it is worth diverting money from more promising research.

Last week the news emerged that Newcastle University had created the first European hybrids, a blend of cow and human. The coverage made much of how the embryos could yield highly flexible parent cells, or stem cells, that could be used in a dizzying array of treatments, from diabetes to heart disease.

This is precisely the kind of science that the Embryology Bill is designed to address, and that has caused such bitter argument. Opponents of the Bill try to convince the public that embryos are "people" whose "body parts" will be raided by monstrous researchers to make "Frankenstein creations" (in fact these "people" are microscopic balls of cells).

Yet some scientists have also been manipulating the media when it comes to the most divisive part of the proposed legislation, which would allow "admixed human embryos". They have played down their peers' concerns that this work is speculative by minimising the role of animal DNA.

In the Bill, these embryos could result from a human egg and animal sperm, or blends of cells from animal and human embryos (chimaeras). The Newcastle team made another version, a "cybrid".

This is created by inserting human DNA, in the form of a broken human cell, into an empty animal egg, in the same cloning method used to make Dolly the sheep.

This work will be high on the agenda of the first national stem cell research conference being held in Edinburgh this week. There is a need for more science in the public debate because the animal DNA does a special job in these embryos, one which diminishes their direct relevance to cures.

The cells in our bodies contain two kinds of DNA. Most research concerns nuclear DNA, so named because it resides in a compartment in the heart of cells called the nucleus. This DNA comes from our parents, and provides the recipe for the proteins that make and run the body.

Variations in this genetic message have been linked to the inheritance of individual characteristics, and also to hereditary diseases and risk of illnesses.

The animal DNA in the cybrids would be of the second kind, which resides in lozenge-like structures outside the nucleus called mitochondria. These are power packs that we inherit from our mothers.

Scientists say that cybrids of human and animal are "99.9 per cent human", in order to comfort those who think (wrongly) that an embryo would end up with horns or hooves. But this figure misleads the vast majority of people, who know little about human biology.

There probably is about 0.1 per cent animal DNA by weight when there are 500 cells in the embryo. But in the initial embryo, there are 100,000 copies of the mitochondrial DNA, around half animal DNA by weight.

This proportion declines because only the nuclear DNA replicates at first. In addition, some human mitochondria could come along for the ride with the human nuclear DNA and persist as the cells grow, perhaps even take over.

The 99.9 per cent figure probably refers to the fact that there are 37 instructions (genes) in mitochondrial DNA, compared with around 29,000 in nuclear DNA, which works out to roughly 0.1 per cent animal instructions. But this is hardly reassuring.
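The two ways of counting can be made concrete with a little arithmetic. A minimal sketch, using the article's gene and copy-number figures plus rough genome sizes (about 16,500 base pairs per mitochondrial genome, about 3 billion per nuclear copy) that are my assumptions, not the article's:

```python
# By gene count: the animal contribution is a small fraction of the
# embryo's genetic instructions.
MITO_GENES = 37
NUCLEAR_GENES = 29_000
gene_fraction = MITO_GENES / (MITO_GENES + NUCLEAR_GENES)
print(f"animal share of genes: {gene_fraction:.2%}")

# By weight, the one-cell embryo starts out far more "animal": its
# ~100,000 mitochondrial genome copies are all animal, against two
# human nuclear copies. Only the nuclear DNA replicates at first,
# so the animal share by weight falls as the cell count grows.
MITO_COPIES = 100_000          # mitochondrial genome copies in the one-cell embryo
MITO_BP = 16_500               # base pairs per mitochondrial genome (approx.)
NUCLEAR_BP = 3_000_000_000     # base pairs per nuclear genome copy (approx.)

def animal_weight_fraction(cells):
    """Animal share of total DNA by weight: mitochondrial copy number
    stays roughly constant while nuclear DNA doubles with each division."""
    animal = MITO_COPIES * MITO_BP
    human = cells * 2 * NUCLEAR_BP   # two nuclear genome copies per cell
    return animal / (animal + human)

print(f"1 cell:    {animal_weight_fraction(1):.1%}")
print(f"500 cells: {animal_weight_fraction(500):.3%}")
```

With these rough sizes the one-cell share comes out somewhat lower than the article's "around half"; the exact figure depends heavily on the copy-number and genome-size assumptions. The point is the trend: a large initial animal fraction by weight that is diluted to a fraction of a per cent by the 500-cell stage.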

Decades of work has shown that even one genetic spelling mistake in the three billion letters of the nuclear code can be fatal - about 0.00000003 per cent. And mitochondria are important: faults in them are responsible for around 50 metabolic disorders that affect one in every 6,500 people.

This includes fatal liver failure, stroke-like episodes, blindness, mental retardation, muscle weakness, diabetes and deafness.

Some claim that using animal mitochondrial DNA in cybrids would be like changing a battery in a computer, leaving the "hard disk" - the nuclear DNA - unaffected. But Dr Marc Vermulst of the University of Washington in Seattle, who has linked changes in mitochondria to premature ageing, says: "In flies, if you mix the mitochondrial DNA of one strain with the nuclear DNA of another strain, the mitochondria of the mixed strain work less efficiently than they normally would.

"By evolving together, mitochondria and nuclei have become very finely tuned to each other. I am not sure how well human and cow DNA would communicate with each other. That would be very important."

Although a pioneering study in China suggested rabbit and human could be successfully blended, using animal mitochondria in human cells could sometimes be like trying to put AA batteries into an AAA compartment.

For animal and human DNA to work in harmony is "a big ask," says Prof Neil Scolding, a Catholic stem cell researcher at Bristol University. Prof Jun-Ichi Hayashi, of the University of Tsukuba, who in the latest issue of Science shows that mitochondrial mutations can encourage tumours to spread, has also - unsuccessfully - tried to make mouse nuclear DNA work with rat mitochondria.

"It is clear that human embryos with animal mitochondrial DNA may develop in the initial stage," he says, "but [they] could not survive any further."

Prof Scolding points out that, thanks to pioneering work in Japan, there is now an egg and embryo-free alternative source of stem cells, albeit one that might present other ethical issues (such as the ability of men to make eggs).

"That has led scientists all over the place (including Sir Ian Wilmut, the creator of Dolly) to embrace this technology. Which makes it all the more inexplicable why a small minority of UK stem cell scientists wants to pursue the extraordinarily complex and frankly speculative hybrid approach."

For many scientists, such as British stem cell pioneer and Nobel prizewinner Sir Martin Evans, resolving such issues provides a clear scientific rationale for using cybrids to find out more about the basic role of mitochondria in development and in disease.

Research at Newcastle on transplanting healthy human mitochondria to treat serious metabolic diseases, for example, could benefit from such studies. However, the hard sell has been about the medical use of cybrid stem cells - not by using the cells themselves in human bodies, but to test drugs and study disease.

When even human stem cells are poorly understood, it will take a lot of slogging to show whether cybrid stem cells will behave properly. Here, even Sir Martin feels the immediate potential has been hyped and claims about cures "overheated".


Self-Experimenters Step Up for Science

Quick—what's the first thought that pops into your head when you hear the word "experiment"? Odds are that what did not bubble up was the image of a 16th-century Italian nobleman who lived for 30 years on a platform suspended from a large straight-beam balance. But it should have. Historians of medicine consider Santorio Santorio—aka Santorio Santorii, aka Sanctorius of Padua—the first physician to have knowingly submitted his theoretical speculations to the rigor of experimental testing that today is taken for granted. By living on the balance, he was able to weigh himself against his daily intake of food and liquids, and his combined expulsions, leading him to the discovery of the insensible perspiration that wafts from our bodies.

Signore Santorio is far from the only self-experimenter to have left a mark on science. Sir Isaac Newton left a mark on the back of his eyelids, nearly blinding himself at age 22 by staring at the sun for too long in a mirror to study the after-images it left on his retinas. Early chemists were known for tasting their distillations, a habit that may have cut short the life of Carl Wilhelm Scheele, the 18th-century German-Swedish chemist who discovered chlorine and co-discovered nitrogen and oxygen; he died at age 44 from suspected heavy metal poisoning. And in what is probably the most famous case of self-experimentation, Australian physician Barry James Marshall downed the contents of a Petri dish laden with Helicobacter pylori bacteria to demonstrate that the microbe caused ulcers, sharing the 2005 Nobel Prize in Physiology or Medicine with J. Robin Warren for his self-experiment.

The practice is common enough among biomedical researchers that a full accounting would take volumes—a good starting point is the 1987 book by physician–journalist Lawrence Altman, entitled Who Goes First? The Story of Self-Experimentation in Medicine. To showcase the variety of reasons that a researcher (or daughter of a researcher or filmmaker) would opt to self-experiment as well as the problems of ethics and data interpretation that may crop up as a result, Scientific American is presenting an eight-part series on some of the most fascinating modern exemplars of the self-experimental method.
