Wednesday, February 27, 2008

NASA Takes Aim at Moon with Double Sledgehammer

Scientists are priming two spacecraft to slam into the moon's South Pole to see if the lunar double whammy reveals hidden water ice.

The Earth-on-moon violence may raise eyebrows, but NASA's history shows that such missions can yield extremely useful scientific observations.

"I think that people are apprehensive about it because it seems violent or crude, but it's very economical," said Tony Colaprete, the principal investigator for the mission at NASA's Ames Research Center in Moffett Field, Calif.

NASA's previous Lunar Prospector mission detected large amounts of hydrogen at the moon's poles before crashing itself into a crater at the lunar South Pole. Now the much larger Lunar Crater Observation and Sensing Satellite (LCROSS) mission, set for a February 2009 moon crash, will take aim and discover whether some of that hydrogen is locked away in the form of frozen water.

LCROSS will piggyback on the Lunar Reconnaissance Orbiter (LRO) mission for an Oct. 28 launch atop an Atlas 5 rocket equipped with a Centaur upper stage. While the launch will ferry LRO to the moon in about four days, LCROSS is in for a three-month journey to reach its proper moon-smashing position. Once within range, the Centaur upper stage doubles as the main 4,400-pound (2,000 kg) impactor spacecraft for LCROSS.

The smaller Shepherding Spacecraft will guide the Centaur toward its target crater, before dropping back to watch - and later fly through - the plume of moon dust and debris kicked up by the Centaur's impact. The shepherding vehicle is packed with a light photometer, a visible-light camera and four infrared cameras to study the Centaur's lunar plume before it turns itself into a second impactor and strikes a different crater about four minutes later.

"This payload delivery represents a new way of doing business for the center and the agency in general," said Daniel Andrews, LCROSS project manager at Ames, in a statement. "LCROSS primarily is using commercial-off-the-shelf instruments on this mission to meet the mission's accelerated development schedule and cost restraints."

Figuring out the final destinations for the $79 million LCROSS mission is "like trying to drive to San Francisco and not knowing where it is on the map," Colaprete said. He and other mission scientists hope to use observations from LRO and the Japanese Kaguya (Selene) lunar orbiter to map crater locations before LCROSS dives in.

"Nobody has ever been to the poles of the moon, and there are very unique craters - similar to Mercury - where sunlight doesn't reach the bottom," Colaprete said. Earth-based radar has also helped illuminate some permanently shadowed craters. By the time LCROSS arrives, it can zero in on its 19 mile (30 km) wide targets within 328 feet (100 meters).

Scientists want the impactor spacecraft to hit smooth, flat areas away from large rocks, which would ideally allow the impact plume to rise up out of the crater shadows into sunlight. That in turn lets LRO and Earth-based telescopes see the results.

"By understanding what's in these craters, we're examining a fossil record of the early solar system and would occurred at Earth 3 billion years ago," Colaprete said. LCROSS is currently aiming at target craters Faustini and Shoemaker, which Colaprete likened to "fantastic time capsules" at 3 billion and 3.5 billion years old.

LCROSS researchers anticipate more than a 90 percent chance that the impactors will find some form of hydrogen at the poles. The off-chance exists that the impactors will hit a newer crater that lacks water - yet scientists can learn about the distribution of hydrogen either way.

"We take [what we learn] to the next step, whether it's rovers or more impactors," Colaprete said.

LCROSS is only the latest mission to apply brute force to science.

The Deep Impact mission made history in 2005 by sending a probe crashing into comet Tempel 1. Lunar Prospector ended with a grazing strike on the moon in 1999, and the European Space Agency's Smart-1 satellite dove into the lunar surface in 2006.

LCROSS will take a much more head-on approach than either Lunar Prospector or Smart-1, slamming into the moon's craters at a steep angle while traveling with greater mass at 1.6 miles per second (2.5 km/s). The overall energy of the impact will equal 100 times that of Lunar Prospector and kick up 1,102 tons of debris and dust.
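For scale, the impact energy follows directly from the mass and speed quoted above; here is a quick back-of-envelope check (a sketch using only the article's figures):

    # Kinetic energy of the Centaur impactor, from the figures in the article.
    mass_kg = 2000.0        # ~4,400 lb Centaur stage
    speed_m_s = 2500.0      # 1.6 miles per second
    energy_j = 0.5 * mass_kg * speed_m_s ** 2
    print(f"Impact energy: {energy_j / 1e9:.1f} GJ")    # ~6.2 GJ
    # Expressed in tons of TNT (1 ton TNT = 4.184e9 J):
    print(f"~{energy_j / 4.184e9:.1f} tons of TNT")     # ~1.5 tons

That works out to roughly the energy of a ton and a half of TNT, which helps make sense of how a single spent rocket stage can loft a thousand tons of debris.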

"It's a cost-effective, relatively low-risk way of doing initial exploration," Colaprete said, comparing the mission's approach to mountain prospectors who used crude sticks of dynamite to blow up gully walls and sift for gold. Scientists are discussing similar missions for exploring asteroids and planets such as Mars.

Nevertheless, Colaprete said they "may want to touch the moon a bit more softly" after LCROSS has its day.

Original here

Students to save the Earth

Nine students from the Technion-Israel Institute of Technology have developed a model spacecraft for deflecting objects falling from space.

The model was created in response to the asteroid Apophis, which scientists believe could collide with Earth in 2036, and was presented at a competition held by NASA and the American Institute of Aeronautics and Astronautics.

The initial plan would put the craft into space in 2020, where it would approach the asteroid and launch two penetrating devices carrying equipment including a specially adapted camera, transmitter and antenna. Air bags would deliver the equipment safely and would also carry solar panels to power it.

The equipment would collect data on the location and composition of the asteroid and relay it back to Earth. If needed, the spacecraft would approach the asteroid again in 2025 to divert it from its path using the spacecraft's own gravitational pull. The asteroid will pass the Earth in 2029 before returning in 2036, and the team aim to change its path during this pass.

According to Dr Alexander Kogan, who guided the students, the craft will use its ion thrusters to hover 200-300m from the asteroid for four months. Using the mass of the spacecraft, combined with the effect of Earth's gravity, the craft will pull the asteroid out of its previous path.

‘The spacecraft is what will make the difference,’ said student Lior Avital. ‘It will divert the asteroid one kilometre and with the help of the Earth, in seven years - 7,000km.’
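The students' figures can be sanity-checked with standard gravity-tractor arithmetic. A minimal sketch, assuming a spacecraft mass of about one tonne (the article does not state the craft's mass) and the hover distance and durations quoted above:

    # Order-of-magnitude gravity-tractor estimate.
    # Spacecraft mass is an assumed value; the article does not give one.
    G = 6.674e-11                 # gravitational constant, m^3 kg^-1 s^-2
    m_sc = 1000.0                 # assumed spacecraft mass, kg
    r = 250.0                     # hover distance, m (mid-range of 200-300m)
    a = G * m_sc / r ** 2         # pull imparted to the asteroid, m/s^2

    t_tug = 4 * 30 * 86400.0      # ~4 months of hovering, in seconds
    dv = a * t_tug                # velocity change given to the asteroid

    t_drift = 7 * 365.25 * 86400.0    # ~7 years until the 2029 flyby
    print(f"delta-v: {dv:.1e} m/s, deflection: {dv * t_drift / 1e3:.1f} km")

With these assumptions the direct deflection comes out at a couple of kilometres, the same order as the one-kilometre figure quoted by the team; the 7,000 km then comes from Earth's gravity amplifying that small offset during the 2029 flyby.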

Alternatives such as blasting the asteroid with a nuclear bomb were also considered, but the group believed the danger posed by two large asteroids or many small ones would be much greater. Diverting the asteroid by connecting powerful motors to it was also ruled out as the solution was deemed too expensive and complicated.

Original here

Ronald Reagan's remarks on the Challenger Shuttle explosion


Sea reptile is biggest on record

A fossilised "sea monster" unearthed on an Arctic island is the largest marine reptile known to science, Norwegian scientists have announced.

The 150 million-year-old specimen was found on Spitsbergen, in the Arctic island chain of Svalbard, in 2006.

The Jurassic-era leviathan is one of 40 sea reptiles from a fossil "treasure trove" uncovered on the island.

Nicknamed "The Monster", the immense creature would have measured 15m (50ft) from nose to tail.

And during the last field expedition, scientists discovered the remains of another so-called pliosaur which is thought to belong to the same species as The Monster - and may have been just as colossal.

The expedition's director Dr Jorn Hurum, from the University of Oslo Natural History Museum, said the Svalbard specimen is 20% larger than the previous biggest marine reptile - another massive pliosaur from Australia called Kronosaurus.

"We have carried out a search of the literature, so we now know that we have the biggest [pliosaur]. It's not just arm-waving anymore," Dr Hurum told the BBC News website.

"The flipper is 3m long with very few parts missing. On Monday, we assembled all the bones in our basement and we amazed ourselves - we had never seen it together before."

[Image: a young girl beside the pliosaur flipper (J. Hurum). The Monster's flipper alone measures 3m in length.]

Pliosaurs were a short-necked form of plesiosaur, a group of extinct reptiles that lived in the world's oceans during the age of the dinosaurs.

A pliosaur's body was tear drop-shaped with two sets of powerful flippers which it used to propel itself through the water.

"These animals were awesomely powerful predators," said plesiosaur palaeontologist Richard Forrest.

[Image: a second large pliosaur has now been found on the Arctic island.]

"If you compare the skull of a large pliosaur to a crocodile, it is very clear it is much better built for biting... by comparison with a crocodile, you have something like three or four times the cross-sectional space for muscles. So you have much bigger, more powerful muscles and huge, robust jaws.

"A large pliosaur was big enough to pick up a small car in its jaws and bite it in half."

"There are a few isolated bones of huge pliosaurs already known but this is the first find of a significant portion of a whole skeleton of such a giant," said Angela Milner, associate keeper of palaeontology at London's Natural History Museum

"It will undoubtedly add much to our knowledge of these top marine predators. Pliosaurs were reptiles and they were almost certainly not warm-blooded so this discovery is also a good demonstration of plate tectonics and ancient climates.

[Image: Lena Kristiansen prepares specimens at the Natural History Museum, University of Oslo.]

"One hundred and fifty million years ago, Svalbard was not so near the North Pole, there was no ice cap and the climate was much warmer than it is today."

The Monster was excavated in August 2007 and taken to the Natural History Museum in Oslo. Team members had to remove hundreds of tonnes of rock by hand in high winds, fog, rain, freezing temperatures and with the constant threat of attack by polar bears.

They recovered the animal's snout, some teeth, much of the neck and back, the shoulder girdle and a nearly complete flipper.

Unfortunately, there was a small river running through where the head lay, so much of the skull had been washed away.

A preliminary analysis of the bones suggests this beast belongs to a previously unknown species.

Unprecedented haul

The researchers plan to return to Svalbard later this year to excavate the new pliosaur.

A few skull pieces, broken teeth and vertebrae from this second large specimen are already exposed and plenty more may be waiting to be excavated.

"It's a large one, and has the same bone structure as the previous one we found," said Espen Knutsen, from Oslo's Natural History Museum, who is studying the fossils.

[Image: artist's impression of a long-necked plesiosaur (Tor Sponga, BT). Excavations have also yielded long-necked plesiosaurs.]

Dr Hurum and his colleagues have now identified a total of 40 marine reptiles from Svalbard. The haul includes many long-necked plesiosaurs and ichthyosaurs in addition to the two pliosaurs.

Long-necked plesiosaurs are said to fit descriptions of Scotland's mythical Loch Ness monster. Ichthyosaurs bore a passing resemblance to modern dolphins, but they used an upright tail fin to propel themselves through the water.

Richard Forrest commented: "Here in Svalbard you have 40 specimens just lying around, which is like nothing we know.

[Image: excavation at the Monster site. The 2007 fieldwork took place in challenging conditions.]

"Even in classic fossil exposures such as you have in Dorset [in England], there are cliffs eroding over many years and every so often something pops up. But we haven't had 40 plesiosaurs from Dorset in 200 years."

The fossils were found in a fine-grained sedimentary rock called black shale. When the animals died, they sank to the bottom of a cold, shallow Jurassic sea and were covered over by mud. The oxygen-free, alkaline chemistry of the mud may explain the fossils' remarkable preservation, said Dr Hurum.

The discovery of another large pliosaur was announced in 2002. Known as the "Monster of Aramberri" after the site in north-eastern Mexico where it was dug up, the creature could be just as big as the Svalbard specimen, according to the team that found it.

But palaeontologists told the BBC a much more detailed analysis of these fossils was required before a true picture of its size could be obtained.

Original here

Nokia Morph Concept (long)


The Earth's 6th Great Mass Extinction is Occurring as You Read This

"In one sense we know much less about Earth than we do about Mars. The vast majority of life forms on our planet are still undiscovered, and their significance for our own species remains unknown. This gap in our knowledge is a serious matter: we will never completely understand and preserve the living world around us at our present level of ignorance.

"If all mankind were to disappear, the world would regenerate back to the rich state of equilibrium that existed ten thousand years ago. If insects were to vanish, the environment would collapse into chaos."

Edward O. Wilson, the world's leading authority on biodiversity, emeritus professor of biology at Harvard and author of "The Creation: An Appeal to Save Life on Earth."

There is little doubt left in the minds of professional biologists that Earth is currently faced with a mounting loss of species that threatens to rival the five great mass extinctions of the geological past, the most devastating being the third major extinction (c. 245 mya), the Permian, in which 54% of the planet's species families were lost. As long ago as 1993, Harvard biologist E.O. Wilson estimated that Earth is currently losing something on the order of 30,000 species per year -- which breaks down to the even more daunting statistic of some three species per hour. Some biologists have begun to feel that this biodiversity crisis -- this "Sixth Extinction" -- is even more severe, and more imminent, than Wilson had supposed.
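Wilson's per-hour figure is simple unit conversion, shown here as a one-line check:

    # Converting Wilson's 1993 estimate to an hourly rate.
    species_per_year = 30_000
    hours_per_year = 365.25 * 24
    print(f"{species_per_year / hours_per_year:.1f} species per hour")   # ~3.4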

With the human population expected to reach 9-10 billion by the end of the century and the planet in the middle of its sixth mass extinction -- this time due to human activity -- the next few years are critical in conserving Earth's precious biodiversity. Because the cause of the Sixth Extinction is Homo sapiens itself, we can either continue on the path to our own extinction or, preferably, modify our behavior toward the global ecosystem of which we are still very much a part.

At a casual glance, the physically caused extinction events of the past might seem to have little or nothing to tell us about the current Sixth Extinction, which is a human-caused event. There is little doubt that humans are the direct cause of ecosystem stress and species destruction in the modern world through transformation of the landscape, overexploitation of species, pollution, and the introduction of alien species.

The Sixth Extinction can be characterized as the first recorded global extinction event with a biotic, rather than a physical, cause such as a massive asteroid impact or volcanic eruption. Yet, looking deeper, human impact on the planet is similar to the Cretaceous cometary collision. Sixty-five million years ago that extraterrestrial impact -- through its sheer explosive power, followed immediately by its injection of so much debris into the upper reaches of the atmosphere that global temperatures plummeted and, most critically, photosynthesis was severely inhibited -- wreaked havoc on the living systems of Earth. That is precisely what we are doing to the planet right now.

Phase two of the Sixth Extinction began around 10,000 years ago with the invention of agriculture, perhaps first in the Natufian culture of the Middle East. Agriculture appears to have been invented several times in various places and has, in the intervening years, spread around the entire globe.

Agriculture is a major engine driving the Sixth Extinction and represents the single most profound ecological change in the entire 3.5-billion-year history of life. With its invention, humans no longer had to interact with other species for survival and could manipulate them for their own use; nor did they have to adhere to an ecosystem's carrying capacity, and so they could overpopulate.

Homo sapiens became the first species to stop living inside local ecosystems. All other species, including our hominid ancestors, all pre-agricultural humans, and the remnant hunter-gatherer societies still extant, exist as semi-isolated populations playing specific roles (i.e., having "niches") in local ecosystems. This is not so with post-agricultural-revolution humans, who in effect have stepped outside local ecosystems. Indeed, to develop agriculture is essentially to declare war on ecosystems: converting land to produce one or two food crops, with all other native plant species now classified as unwanted "weeds" and all animal species, save a few domesticated ones, considered pests.

"The comparison I make between these big extinction events, prehistoric meteorite-caused or natural event-caused extinctions and the present one," says E.O. Wilson, "is parallel to the difference between a heart attack and cancer. We understand that what we are doing is a slow but insidious, and only can be seen when you lay it out over the whole world over a period of decades. The hopeful thing about it is that this cancer can be treated. A lot of damage has been done, and it can be dangerous to us if we really just go on until half the species of organisms are extinct forever. Or we can halt the hemorrhaging.

"In terms of scale, it’s hard to put a figure on it," Wilson adds: "We’re in a pronounced early stage of an extinction event that would probably be, by the end of this century if human activities continue unabated, right up to the Cretaceous level. We’re part way there. Whether you can say its 10 percent there or 25 percent there, a lot of it depends on the organisms you’re talking about. One estimate has it that, particularly when you throw in the mass extinction of the Pacific Island birds, which are the most vulnerable on Earth, something like 20 percent of bird species has been extinguished by human activities."

Biocide is occurring at an alarming rate. Experts say that at least half of the world’s current species will be completely gone by the end of the century. Wild plant-life is also disappearing. Most biologists say that we are in the midst of an anthropogenic mass extinction. Numerous scientific studies confirm that this phenomenon is real and happening right now. Should anyone really care? Will it impact individuals on a personal level? Scientists say, “Yes!”

Critics argue that species disappear and new ones emerge all the time. That's true, if you're speaking in terms of millennia. Scientists put the natural background rate at roughly one extinction per million species per year, with new species emerging at about the same rate. Humans have recently accelerated the extinction rate to the point where several entire species are annihilated every single day. The death toll artificially caused by humans is mind-boggling: nature will take millions of years to repair what we destroy in just a few decades.
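Putting the two rates side by side shows the scale of that acceleration. A rough sketch, assuming a global total of about 10 million species (a common working figure, not given in the article):

    # Background extinction rate vs. Wilson's current estimate.
    total_species = 10_000_000        # assumed global species count
    background_per_year = total_species / 1_000_000   # 1 per million species/yr
    current_per_year = 30_000                         # Wilson's estimate
    print(f"Background: ~{background_per_year:.0f} species/yr")
    print(f"Acceleration: ~{current_per_year / background_per_year:,.0f}x")

Under those assumptions the natural background rate would claim only about ten species a year, making the present rate roughly three thousand times faster.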

A recent analysis, published in the journal Nature, shows that it takes 10 million years before biological diversity even begins to approach what existed before a die-off. Over 10,000 scientists in the World Conservation Union have compiled data showing that currently 51 per cent of known reptiles, 52 per cent of known insects, and 73 per cent of known flowering plants are in danger along with many mammals, birds and amphibians. It is likely that some species will become extinct before they are even discovered, before any medicinal use or other important features can be assessed. The cliché movie plot where the cure for cancer is about to be annihilated is more real than anyone would like to imagine.

Research done by the American Museum of Natural History found that the vast majority of biologists believe that mass extinction poses a colossal threat to human existence, and is an even more serious environmental problem than one of its contributors, global warming. The research also found that the average person woefully underestimates the dangers of mass extinction. Powerful industrial lobbies would like people to believe that we can survive while other species are quickly and quietly dying off. Irresponsible governments and businesses would have people believe that we don't need a healthy planet to survive, even while human cancer rates are tripling every decade.

Many of us heard about the recent extinction of the Yangtze river dolphin. It was publicized because dolphins are cute and smart, and we like dolphins. We were sort of sad that we humans were single-handedly responsible for destroying an entire millions-of-years-old species in just a few years through rampant pollution. Unfortunately, the real death toll is much higher than what we hear on the news. Only a few endangered "celebrity favorites" get any notice at all.

Since animals and plants exist in symbiotic relationships with one another, the extinction of one species is likely to cause "co-extinctions". Some species directly affect the health of hundreds of other species, so there is always some kind of domino effect, and this compounding process occurs with frightening speed. That makes rampant extinction similar to disease in the way that it spreads. Sooner or later, if left unchecked, humans may catch it too.

Amphibians are a prime example of how tinkering with the environment can cause rapid animal death. For over 300 million years, frogs, salamanders, newts and toads were hardy enough to precede the dinosaurs and outlive them to the present day. Now, within just two decades, many amphibians are disappearing. Scientists are alarmed at how a seemingly robust species of amphibian will suddenly disappear within a few months.

The causes of biocide are a hodge-podge of human environmental "poisons" which often work synergistically: a vast array of pollutants and pesticides, a thinning ozone layer that increases ultraviolet radiation, human-induced climate change, habitat loss from agriculture and urban sprawl, invasions of exotic species introduced by humans, illegal and legal wildlife trade, light pollution, and man-made borders, among many other causes.

Is there a way out? The answer is yes and no. We'll never regain the lost biodiversity, at least not within a fathomable time period, but there are ways to prevent a worldwide bio-collapse, and they all require immediate action. Wilson and other scientists point out that the world needs international cooperation in order to sustain ecosystems, since nature is unaware of artificially drawn borders. Humans love to fence off space they've claimed as their own. Sadly, a border fence often has terrible ecological consequences. One fence between India and Pakistan cuts off bears and leopards from their feeding habitats, causing them to starve to death. Starvation leads to attacks on villagers, and to more slaughtering of the animals.

Some of the most endangered wildlife species live right in the borderland between the US and Mexico. These indigenous animals don't know that they now live between two countries; they were here long before the people came and nations divided, but they will not survive if we cut them off from their habitat. The Sky Islands region is one of many areas smack in the middle of this boundary where some of North America's most threatened wildlife is found. Jaguars, bison, and wolves have to cross international terrain in the course of their lives in order to survive. Unfortunately, illegal border crossings happen here too, and people who know nothing of the wildlife's biological needs want to build a large fence to stop them, regardless of the fact that a fence would devastate these already fragile animal populations.

Wilson says the time has come to start calling the "environmentalist view" the "real-world view". We can’t ignore reality simply because it doesn’t conform nicely within convenient boundaries and moneymaking strategies. What good will all of our money and conveniences do for us, if we collectively destroy the necessities of life?

There is hope, but it requires radical changes, and many organizations are lobbying for that change. One group trying to salvage ecosystems is The Wildlands Project, which is spearheading the drive to reconnect the remaining wildernesses. The immediate goal is to reconnect wild North America through four broad "mega-linkages". Within each mega-linkage, mosaics of public and private lands would connect core areas and provide safe migration routes for wildlife, while broad, vegetated overpasses would link wilderness areas normally split by roads. The project will need cooperation from local landowners and government agencies.

It is a radical vision to many people, and the Wildlands Project expects that it will take at least 100 years to complete. Even so, projects like this, pursued on a worldwide basis, may be humanity's best chance of saving what's left of the planet's ecosystems, and the human race along with them.

Posted by Casey Kazan and Rebecca Sato.

Original here

Gecko Stitches


Supercomputer unleashes virtual 9.0 megaquake in Pacific Northwest

Simulation may help big cities develop early warning systems


On January 26, 1700, at about 9 p.m. local time, the Juan de Fuca plate beneath the ocean in the Pacific Northwest suddenly moved, slipping some 60 feet eastward beneath the North American plate in a monster quake of approximately magnitude 9, setting in motion large tsunamis that struck the coast of North America and traveled to the shores of Japan.

Since then, the earth beneath the region -- which includes the cities of Vancouver, Seattle and Portland -- has been relatively quiet. But scientists believe that earthquakes with magnitudes greater than 8, so-called "megathrust events," occur along this fault on average every 400 to 500 years.

To help prepare for the next megathrust earthquake, a team of researchers led by seismologist Kim Olsen of San Diego State University (SDSU) used a supercomputer-powered “virtual earthquake” program to calculate for the first time realistic three-dimensional simulations that describe the possible impacts of megathrust quakes on the Pacific Northwest region. Also participating in the study were researchers from the San Diego Supercomputer Center at UC San Diego and the U.S. Geological Survey.

As reported in the Journal of Seismology, what the scientists learned from this simulation is not reassuring, particularly for residents of downtown Seattle.

With a rupture scenario beginning in the north and propagating toward the south along the 600-mile-long Cascadia Subduction Zone, the ground moved about 1.5 feet per second in Seattle; nearly 6 inches per second in Tacoma, Olympia and Vancouver; and 3 inches per second in Portland, Oregon. Additional simulations, especially of earthquakes that begin in the southern part of the rupture zone, suggest that the ground motion under some conditions can be up to twice as large.

“We also found that these high ground velocities were accompanied by significant low-frequency shaking, like what you feel in a roller coaster, that lasted as long as five minutes – and that’s a long time,” said Olsen.

The long-duration shaking, combined with high ground velocities, raises the possibility that such an earthquake could inflict major damage on metropolitan areas -- especially on high-rise buildings in downtown Seattle. Compounding the risks, like Los Angeles to the south, Seattle, Tacoma, and Olympia sit on top of sediment-filled geological basins that are prone to greatly amplifying the waves generated by major earthquakes.

“One thing these studies will hopefully do is to raise awareness of the possibility of megathrust earthquakes happening at any given time in the Pacific Northwest,” said Olsen. “Because these events will tend to occur several hundred kilometers from major cities, the study also implies that the region could benefit from an early warning system that can allow time for protective actions before the brunt of the shaking starts.” Depending on how far the earthquake is from a city, early warning systems could give from a few seconds to a few tens of seconds to implement measures, such as automatically stopping trains and elevators.
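The warning time comes from the gap between the fast but weak P-waves, which can trigger an alert, and the slower S-waves that carry most of the damaging shaking. A minimal sketch, assuming typical crustal wave speeds (the article gives none):

    # Rough early-warning budget; wave speeds are assumed typical values.
    p_wave_km_s = 6.5      # assumed P-wave speed
    s_wave_km_s = 3.5      # assumed S-wave speed (damaging shaking)

    for distance_km in (100, 200, 300):
        warning_s = distance_km / s_wave_km_s - distance_km / p_wave_km_s
        print(f"{distance_km} km from rupture: ~{warning_s:.0f} s of warning")

At a few hundred kilometres, that gap amounts to a few tens of seconds, matching the range Olsen describes.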

Added Olsen, "The information from these simulations can also play a role in research into the hazards posed by large tsunamis, which can originate from megathrust earthquakes like the 2004 Sumatra-Andaman event in Indonesia." One of the largest earthquakes ever recorded, the magnitude 9.2 Sumatra-Andaman earthquake was felt as far away as Bangladesh, India, and Malaysia, and triggered devastating tsunamis that killed more than 200,000 people.

In addition to increasing scientific understanding of these massive earthquakes, the results of the simulations can be used to guide emergency planners, improve building codes, and help engineers design safer structures -- potentially saving lives and property in this region of some 9 million people.

Even with the large supercomputing and data resources at SDSC, creating “virtual earthquakes” is a daunting task. The computations to prepare initial conditions were carried out on SDSC’s DataStar supercomputer, and then the resulting information was transferred for the main simulations to the center’s Blue Gene Data supercomputer via SDSC’s advanced virtual file system or GPFS-WAN, which makes data seamlessly available on different – sometimes distant – supercomputers.

Coordinating the simulations required a complex choreography of moving information into and out of the supercomputer as Olsen’s sophisticated “Anelastic Wave Model” simulation code was running. Completing just one of several simulations, running on 2,000 supercomputer processors, required some 80,000 processor hours – equal to running one program continuously on your PC for more than 9 years!
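The nine-year comparison is a direct unit conversion:

    # 80,000 processor-hours expressed as single-processor wall-clock time.
    processor_hours = 80_000
    hours_per_year = 365.25 * 24
    print(f"~{processor_hours / hours_per_year:.1f} years on one processor")  # ~9.1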

“To solve the new challenges that arise when researchers need to run their codes at the largest scales, and data sets grow to great size, we worked closely with the earthquake scientists through several years of code optimization and modifications,” said SDSC computational scientist Yifeng Cui, who contributed numerous refinements to allow the computer model to “scale up” to capture a magnitude 9 earthquake over such a vast area.

In order to run the simulations, the scientists must recreate in their model the components that encompass all the important aspects of the earthquake. One component is an accurate representation of the earth’s subsurface layering, and how its structure will bend, reflect, and change the size and direction of the traveling earthquake waves. Co-author William Stephenson of the USGS worked with Olsen and Andreas Geisselmeyer, from Ulm University in Germany, to create the first unified “velocity model” of the layering for this entire region, extending from British Columbia to Northern California.

Another component is a model of the earthquake source from the slipping of the Juan de Fuca plate underneath the North American plate. Making use of the extensive measurements of the massive 2004 Sumatra-Andaman earthquake in Indonesia, the scientists developed a model of the earthquake source for similar megathrust earthquakes in the Pacific Northwest.

The sheer physical size of the region in the study was also challenging. The scientists included in their virtual model an immense slab of the earth more than 650 miles long by 340 miles wide by 30 miles deep -- nearly 7 million cubic miles -- and used a computer mesh spacing of 250 meters to divide the volume into some 2 billion cubes. This mesh size allows the simulations to model frequencies up to 0.5 Hertz, which especially affect tall buildings.
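Both the cube count and the frequency ceiling can be reproduced from the stated dimensions. A sketch, where the points-per-wavelength count and minimum shear-wave speed are assumed values the article does not give:

    # Grid size and resolvable frequency for the simulation volume.
    MILE_M = 1609.34
    dx = 250.0                          # mesh spacing, m
    nx = 650 * MILE_M / dx              # cells along the length
    ny = 340 * MILE_M / dx              # cells across the width
    nz = 30 * MILE_M / dx               # cells through the depth
    print(f"~{nx * ny * nz / 1e9:.1f} billion cubes")    # ~1.8 billion

    # Rule of thumb: f_max ~ v_min / (points_per_wavelength * dx).
    v_min = 625.0       # assumed minimum shear-wave speed, m/s
    ppw = 5             # assumed points per wavelength
    print(f"f_max ~ {v_min / (ppw * dx):.2f} Hz")        # ~0.5 Hz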

“One of the strengths of an earthquake simulation model is that it lets us run scenarios of different earthquakes to explore how they may affect ground motion,” said Olsen. Because the accumulated stresses or “slip deficit” can be released in either one large event or several smaller events, the scientists ran scenarios for earthquakes of different sizes.

“We found that the magnitude 9 scenarios generate peak ground velocities five to 10 times larger than those from the smaller magnitude 8.5 quakes.”

The researchers are planning to conduct additional simulations to explore the range of impacts that depend on where the earthquake starts, the direction of travel of the rupture along the fault, and other factors that can vary.

Original here