Wednesday, October 21, 2009

Dinosaurs may have been wiped out by a massive 25-mile-wide meteor - four times bigger than the asteroid previously thought to be behind their extinction.



The deepest part of the Shiva basin, in red in this elevation diagram, is three miles below the surface of the Indian Ocean Photo: KYLE MCQUILKIN

Researchers believe they have discovered the world's biggest crater off the coast of India which they think may be responsible for the extinction of dinosaurs 65 million years ago.

The mysterious Shiva basin, named after the Hindu god, has a diameter of 310.7 miles along the seafloor and a central peak some three miles tall, comparable to Mount McKinley, the highest mountain in North America.

This dwarfs the asteroid previously thought to have killed off the dinosaurs, which measured between five and six miles across and struck the Yucatan Peninsula in Mexico.

That impact left a crater with a diameter of 180 kilometres (about 110 miles).
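The size comparisons quoted above can be sanity-checked with a quick unit conversion. This is an illustrative sketch (the figures come from the article; the midpoint of the five-to-six-mile impactor estimate is my own choice):

```python
# Sanity-check the Shiva vs. Yucatan (Chicxulub) size comparisons.
MILES_PER_KM = 0.621371

shiva_crater_mi = 310.7            # Shiva basin diameter along the seafloor
yucatan_crater_km = 180            # Yucatan crater diameter
yucatan_crater_mi = yucatan_crater_km * MILES_PER_KM

shiva_impactor_mi = 25             # proposed Shiva impactor
yucatan_impactor_mi = 5.5          # midpoint of the five-to-six-mile estimate

print(f"Yucatan crater:  ~{yucatan_crater_mi:.0f} miles across")
print(f"Crater ratio:    ~{shiva_crater_mi / yucatan_crater_mi:.1f}x")
print(f"Impactor ratio:  ~{shiva_impactor_mi / yucatan_impactor_mi:.1f}x")
```

The impactor ratio comes out around 4.5, consistent with the article's "four times bigger" claim.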

Sankar Chatterjee of Texas Tech University, who led the research, said: "If we are right, this is the largest crater known on our planet.

"Rocks from the bottom of the crater will tell us the telltale sign of the impact event from shattered and melted target rocks. And we want to see if there are breccias, shocked quartz, and an iridium anomaly."

Asteroids are rich in iridium, and such anomalies are thought of as the fingerprints of an impact.

Mr Chatterjee believes the impact of an asteroid or comet of this size would have vaporized the Earth's crust on collision, killing most life and leaving ultra-hot mantle material to well up in its place.

The force of the impact broke the Seychelles islands off the Indian tectonic plate and sent them drifting towards Africa. Much of the 30-mile-thick granite layer on the western coast of India was also destroyed.

Most of the crater lies submerged on India's continental shelf, but some tall cliffs rise above the sea, and the structure hosts active faults and hot springs. The area is a rich source of oil and gas reserves.

The team plans to visit India again to drill into the centre of the crater for clues to prove the basin was formed by a gigantic impact.

Mr Chatterjee will present his research this month at the Annual Meeting of the Geological Society of America.

Tuesday, October 20, 2009

A particle God doesn’t want us to discover

Could the Large Hadron Collider be sabotaging itself from the future, as some physicists say?

Explosions, scientists arrested for alleged terrorism, mysterious breakdowns — recently Cern’s Large Hadron Collider (LHC) has begun to look like the world’s most ill-fated experiment.

Is it really nothing more than bad luck or is there something weirder at work? Such speculation generally belongs to the lunatic fringe, but serious scientists have begun to suggest that the frequency of Cern’s accidents and problems is far more than a coincidence.

The LHC, they suggest, may be sabotaging itself from the future — twisting time to generate a series of scientific setbacks that will prevent the machine fulfilling its destiny.

At first sight, this theory fits comfortably into the crackpot tradition linking the start-up of the LHC with terrible disasters. The best known is that the £3 billion particle accelerator might trigger a black hole capable of swallowing the Earth when it gets going. Scientists enjoy laughing at this one.

This time, however, their ridicule has been rather muted — because the time travel idea has come from two distinguished physicists who have backed it with rigorous mathematics.

What Holger Bech Nielsen, of the Niels Bohr Institute in Copenhagen, and Masao Ninomiya of the Yukawa Institute for Theoretical Physics in Kyoto, are suggesting is that the Higgs boson, the particle that physicists hope to produce with the collider, might be “abhorrent to nature”.

What does that mean? According to Nielsen, it means that the creation of the boson at some point in the future would then ripple backwards through time to put a stop to whatever it was that had created it in the first place.

This, says Nielsen, could explain why the LHC has been hit by mishaps ranging from an explosion during construction to a second big bang that followed its start-up. Whether the recent arrest of a leading physicist for alleged links with Al-Qaeda also counts is uncertain.

Nielsen’s idea has been likened to that of a man travelling back through time and killing his own grandfather. “Our theory suggests that any machine trying to make the Higgs shall have bad luck,” he said.

“It is based on mathematics, but you could explain it by saying that God rather hates Higgs particles and attempts to avoid them.”

His warnings come at a sensitive time for Cern, which is about to make its second attempt to fire up the LHC. The idea is to accelerate protons to almost the speed of light around the machine’s 17-mile underground circular racetrack and then smash them together.

In theory the machine will create tiny replicas of the primordial “big bang” fireball thought to have marked the creation of the universe. But if Nielsen and Ninomiya are right, this latest build-up will inevitably get nowhere, as will those that come after — until eventually Cern abandons the idea altogether.

This is, of course, far from being the first science scare linked to the LHC. Over the years it has been the target of protests, wild speculation and court injunctions.

Fiction writers have naturally seized on the subject. In Angels and Demons, Dan Brown sets out a diabolical plot in which the Vatican City is threatened with annihilation from a bomb based on antimatter stolen from Cern.

Blasphemy, a novel from Douglas Preston, the bestselling science-fiction author, draws on similar themes, with a story about a mad physicist who wants to use a particle accelerator to communicate with God. The physicist may be American and the machine located in America, rather than Switzerland, but the links are clear.

Even Five, the TV channel, has got in on the act by screening FlashForward, an American series based on Robert Sawyer’s novel of the same name in which the start-up of the LHC causes the Earth’s population to black out for two minutes when they experience visions of their personal futures 21 years hence. This gives them a chance to change that future.

Scientists normally hate to see their ideas perverted and twisted by the ignorant, but in recent years many physicists have learnt to welcome the way the LHC has become a part of popular culture. Cern even encourages film-makers to use the machine as a backdrop for their productions, often without charging them.

Nielsen presents them with a dilemma. Should they treat his suggestions as fact or fiction? Most would like to dismiss him, but his status means they have to offer some kind of science-based rebuttal.

James Gillies, a trained physicist who heads Cern’s communications department, said Nielsen’s idea was an interesting theory “but we know it doesn’t happen in reality”.

He explained that if Nielsen’s predictions were correct then whatever was stopping the LHC would also be stopping high-energy rays hitting the atmosphere. Since scientists can directly detect many such rays, “Nielsen must be wrong”, said Gillies.

He and others also believe that although such ideas have an element of fun, they risk distracting attention from the far more amazing ideas that the LHC will tackle once it gets going.

The Higgs boson, for example, is thought to give all other matter its mass, without which gravity could not work. If the LHC found the Higgs, it would open the door to solving all kinds of other mysteries about the origins and nature of matter. Another line of research aims to detect dark matter, which is thought to comprise about a quarter of the universe’s mass, but made out of a kind of particle that has so far proven impossible to detect.

However, perhaps the weirdest of all Cern’s aspirations for the LHC is to investigate extra dimensions of space. This idea, known as string theory, suggests there are many more dimensions to space than the four we can perceive.

At present these other dimensions are hidden, but smashing protons together in the LHC could produce gravitational anomalies, effectively tiny black holes, that would reveal their existence.

Some physicists suggest that when billions of pounds have been spent on the kit to probe such ideas, there is little need to invent new ones about time travel and self-sabotage.

History shows, however, it is unwise to dismiss too quickly ideas that are initially seen as science fiction. Peter Smith, a science historian and author of Doomsday Men, which looks at the links between science and popular culture, points out that what started as science fiction has often become the inspiration for big discoveries.

“Even the original idea of the ‘atomic bomb’ actually came not from scientists but from H G Wells in his 1914 novel The World Set Free,” he said.

“A scientist named Leo Szilard read it in 1932 and it gave him the inspiration to work out how to start the nuclear chain reaction needed to build a bomb. So the atom bomb has some of its origins in literature, as well as research.”

Some of Cern’s leading researchers also take Nielsen at least a little seriously. Brian Cox, professor of particle physics at Manchester University, said: “His ideas are theoretically valid. What he is doing is playing around at the edge of our knowledge, which is a good thing.

“He is pointing out that we don’t yet have a quantum theory of gravity, so we haven’t yet proved rigorously that sending information into the past isn’t possible.

“However, if time travellers do break into the LHC control room and pull the plug out of the wall, then I’ll refer you to my article supporting Nielsen’s theory that I wrote in 2025.”

This weekend, as the interest in his theories continued to grow, Nielsen was sounding more cautious. “We are seriously proposing the idea, but it is an ambitious theory, that’s all,” he said. “We already know it is not very likely to be true. If the LHC actually succeeds in discovering the Higgs boson, I guess we will have to think again.”


Friday, June 26, 2009

End of the Big Beasts

by Peter Tyson

Who or what killed off North America's mammoths
and other megafauna 13,000 years ago?

It takes a certain kind of person to tackle this question in earnest. You have to be itching to know the answer yet patient as a Buddha, for the answer is frustratingly elusive. I know I'm not the type. I'm intrigued by the question but far too anxious to calmly accept, as some experts suggest, that it might be years or decades, if ever, before a definitive, widely accepted solution comes. (To follow the long-running wrangle over this question, see The Extinction Debate.)

The four people I spoke to about the megafaunal or "large-animal" extinctions possess this sort of edgy sangfroid. While keeping an open mind, they also stand in four decidedly different camps regarding why America's rich complement of big beasts went extinct quite suddenly at the end of the Ice Age. The four camps are known tongue-in-cheek as "overkill," "overchill," "overill," and "overgrill"*:

  • Archeologist Gary Haynes, University of Nevada Reno, and others think that the continent's first human hunters, fresh from Siberia, killed the megafauna off as they colonized the newly discovered land.

  • Donald Grayson, an archeologist at the University of Washington, Seattle, along with colleague David Meltzer of Southern Methodist University, believes that climate changes at the end of the Pleistocene epoch triggered the collapse.

  • Mammalogist Ross MacPhee of the American Museum of Natural History has advanced the idea, with virologist Preston Marx, that a virulent "hyperdisease" brought by the first Americans might have raced through species with no natural immunity, bringing about their demise.

  • And, in the newest hypothesis advanced, geologist James Kennett, U.C. Santa Barbara, and colleagues propose that a comet impact or airburst over North America did it.

So why is the answer so elusive? As often happens in the paleosciences, it largely comes down to lack of empirical evidence, something all four hypotheses arguably suffer from. (There's a fifth hypothesis, actually—that a combination of overkill and overchill did it.)

Overkill

In the early 1960s, ecologist Paul Martin of the University of Arizona postulated that the first Americans, after crossing into the Americas over the Bering Land Bridge, hunted the megafauna to extinction. For many years, "overkill" became the leading contender. The timing seemed more than coincidental: Humans were thought to have arrived no earlier than about 14,000 years ago, and by roughly 13,000 years ago, most of the megafaunal species abruptly vanish from the fossil record. (See a list of all 35 extinct genera of North American Ice Age mammals.)

But skeptics have asked, Where's the evidence? Grayson and Meltzer (overchill) have noted that late-Ice Age sites bearing megafaunal remains that show unequivocal signs of slaughter by humans number just 14. Moreover, they stress, only two types of giants were killed at those 14 sites, mammoth and mastodon. There's no sign that early hunters preyed on giant ground sloths, short-faced bears, or the massive, armadillo-like glyptodonts, for instance. (Forensic studies of a cache of Clovis tools found in 2008 suggest the Clovis people did hunt now-extinct camels and horses.) That's hardly enough evidence, Grayson and Meltzer argue, to lay blame for a continent's worth of lost megafauna at the foot of the first Americans.

Gary Haynes (overkill) begs to differ. "I don't care what anybody else says, 14 kill sites of mammoth and mastodon in a very short time period is extraordinary," he told me. It's one thing to find a campsite with some animal bones in it, he says, quite another to find the actual spot where an ancient hunter felled and butchered an animal—where, say, a spearpoint turns up still sticking in bone. "It's very, very rare to find a kill site anywhere in the world," he says. And absence of other megafauna in kill sites doesn't mean they weren't hunted. "There is no doubt Native Americans were eating deer and bear and elk," Haynes says, citing several large mammals that pulled through. "But you cannot find a single kill site of them across 10,000 years."
Could what scholars agree must have been a relatively modest initial population of hunters have emptied an entire continent of its megafauna virtually overnight, geologically speaking? (In fact, it's three continents: South America and, to a lesser extent, Northern Eurasia also lost many large species at the end of the Ice Age.) For his part, Ross MacPhee (overill) finds it hard to swallow. "I just don't think it's plausible, especially if we're also talking about collapses for megafauna that didn't actually go extinct." Certain populations of surviving big beasts, including bison in North America and musk oxen in Asia, are known to have fallen precipitously at the end of the Ice Age. "It gets a little bit beyond probability in my view that people could have been so active as to hunt every animal of any body size, in every context, in every possible environment, over three continents."

Overchill

Could climate change have done it? Scholars generally agree that North America witnessed some rapid climate adjustments as it shook off the Ice Age beginning about 17,000 years ago. The most significant swing was a cold snap between about 12,900 and 11,500 years ago. Known as the Younger Dryas, this partial return to ice-age conditions may have stressed the megafauna and their habitats sufficiently to cause widespread die-offs, Grayson and others believe.

Detractors, again, point to the lack of evidence. "There aren't any deposits of starved or frozen or somehow naturally killed animals that are clearly non-cultural in origin that you would expect if there was an unusual climate swing," says Haynes. "I don't think that evidence exists." Another question dissenters have is how the megafauna survived many abrupt glacial and deglacial shifts during the past two million years only to succumb to the one that closed the Pleistocene. "It just doesn't hold water," Jim Kennett (overgrill) told me.

Grayson admits that overchill advocates have failed to develop the kind of records needed to test climate hypotheses in detail. But he focuses on climate change, he says, because he sees absolutely no sign that people were involved. "You can't look at climate and say climate didn't do it for the simple reason that we don't really know what to look for," Grayson told me. "But what you can do fairly easily is look at the evidence that exists for the overkill position. That position would seem to make fairly straightforward predictions about what the past should have been like, and when you look to see if it was that way, you don't find it."

Overill

A lack of data has particularly plagued the "overill" hypothesis. This is the notion that diseases brought unwittingly by newly arriving people, either in their own bodies or in those of their dogs or perhaps rats, could have killed off native species that had no natural immunity. MacPhee devised this hypothesis with Preston Marx after realizing that the link between initial human arrival and subsequent large-animal extinctions was strong not just in North America but in many other parts of the world (see map in sidebar), but that in his opinion, convincing evidence for hunting as the culprit simply did not exist.

Despite what he calls "prodigious effort" using DNA techniques and immunological probes, however, MacPhee and his colleagues have failed to detect clues to any pathogens in megafaunal bones, much less nail down a specific disease, like rabies or rinderpest, that could have jumped from one type of animal to another and wiped out all the big beasts. "There's no evidence, and there's virtually no possibility of getting any evidence," Kennett told me.

"[Overill] doesn't even have circumstantial evidence, because we can't prove there was hyperdisease," Haynes says. "We can prove people were here, and we can prove climates were changing." Fair enough, says MacPhee, though he points out that the burgeoning ability of Asian bird flu to infect across species boundaries seems to suggest that some diseases are ecologically and genetically preordained to, as he puts it, "go hyper."

Overgrill

The most recent hypothesis, advanced by Kennett and 25 other scientists in a 2007 Proceedings of the National Academy of Sciences paper, concerns the proposed cosmic impact. Right about the time the Younger Dryas began, roughly 12,900 years ago, at least 15 of those 35 extinct mammal genera, and arguably the Clovis culture itself, appear to vanish abruptly from the fossil record, and it is at this horizon that Kennett et al see markers of a major catastrophe. The markers lie in a thin layer at the base of a "black mat" of soil that archeologists have identified at over 50 Clovis sites across North America.

According to Kennett, fieldworkers have uncovered fossils of the 15 genera of mammals that survived right up to Younger Dryas times just beneath—but neither within nor above—this black mat. (Some fossil bones butt up against this layer so closely that the mat has blackened them, Kennett told me.) Stone-tool remains of the Clovis culture also end just beneath the mat, he says. Moreover, Kennett and the team he works with have identified charcoal, soot, microscopic diamonds, and other trace materials at the base of the mat. These materials indicate, he says, that a comet (not an asteroid—different constituents) exploded in the atmosphere or struck the surface, likely in pieces. This triggered widespread wildfires and extinctions, changed ocean circulation, and coughed up sun-blocking ash and dust, all of which helped unleash the Younger Dryas. Tokens of this cosmic cataclysm have shown up in the Greenland ice sheet as well, Kennett says.
Where then, skeptics ask, is the crater? Unlike the asteroid strike at the end of the Cretaceous, the one thought to have ended the reign of the dinosaurs, this 12,900-year-old event currently has no hole or holes definitively linked to it. Kennett says it's still early, noting that it took nearly a decade for scientists to discover the dinosaur-ending impact crater after evidence for a cosmic collision 65 million years ago first turned up in sedimentary layers around the world. Then again, there may be no crater, Kennett says. He cites Tunguska: In 1908, an object that scholars believe was a meteor or comet exploded high above the Tunguska River in Siberia, leveling trees over 800 square miles but leaving no crater.

Critics also take issue with the black-mat evidence. Haynes (overkill) argues that the mat's charcoal-rich layer could as likely be from human-caused fires as from comet-caused wildfires, while Grayson (overchill) questions the purported collapse of Clovis populations, for which he and many other archeologists see very little evidence.

Finally, there are the extinctions themselves. Of the 35 extinct genera, 20 or so cannot be shown to have survived up to the Younger Dryas. The youngest date, for example, for fossils of Eremotherium, a giant ground sloth, is 28,000 years ago. "So the idea that this impact could have caused the extinctions of all these animals just does not make sense," Grayson says. In response, Kennett points out that the fossil record is imperfect, and one would not expect the most recent occurrences of rare forms like Eremotherium to extend right up to the Younger Dryas, as the remains of more common animals like mammoths, horses, and camels do.

Soldiering on

If there's one thing all scholars involved in this famously contentious debate would welcome, it's more data. For in science, as Kennett put it to me, "data eventually rules." Grayson, for one, feels the field would benefit from a better understanding of just when each of those 20 rarer genera of big beasts went extinct. "Until we know when these extinctions occurred, I think we're wasting our time in trying to explain them," he says.

In the meantime, the dearth of widely convincing evidence only serves as a spur. MacPhee may be speaking for all researchers working on this mystery when he says: "What's of interest here for me personally is that these Pleistocene extinctions have occupied the minds of some very able thinkers over the last half century or so, and nobody's come up with anything that's drop-dead decisive. So it's attractive as an intellectual problem."

Granted. But hey, aren't you just dying to know what happened?


*Gary Haynes offered this sobriquet when I asked him if a playful term for the comet hypothesis had caught on yet.

Monday, March 23, 2009

The “Ultimate Jurassic Predator” Could Crush a Hummer in Its Jaws

On a Norwegian island within the Arctic Circle, researchers have unearthed the fossilized remains of a marine monster they call “Predator X.” The 50-foot beast is a new species of pliosaur, and researchers say the enormous reptile ruled the Jurassic seas some 147 million years ago…. “Its anatomy, physiology and hunting strategy all point to it being the ultimate predator – the most dangerous creature to patrol the Earth’s oceans” [New Scientist], the Natural History Museum at the University of Oslo said in a breathless press release.

Predator X swept through the seas some 147 million years ago during the Jurassic Period, when dinosaurs walked the land. The creature swam with its four flippers, and relied on its crushing jaw power to bring down its prey–lead researcher Joern Hurum estimates that it had a bite force of 33,000 pounds per square inch. Says Hurum: “With a skull that’s more than 10 feet long you’d expect the bite to be powerful but this is off the scale…. It’s much more powerful than T-Rex” [Reuters]. Hurum has said that a previously discovered fossil pliosaur was big enough to chomp on a small car. He said the bite estimates for the latest fossil forced a rethink. “This one is more like it could crush a Hummer,” he said [Reuters]. Hurum theorizes that the 45-ton predator feasted on fish and marine reptiles, including ichthyosaurs and long-necked plesiosaurs.
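For readers more used to metric units, the quoted bite-force figure converts straightforwardly. A small illustrative sketch (the 33,000 psi number is from the article; the conversion factor is standard):

```python
# Convert the quoted bite-force pressure from psi to megapascals.
PSI_TO_PASCAL = 6894.757   # standard conversion: 1 psi in pascals

bite_psi = 33_000
bite_mpa = bite_psi * PSI_TO_PASCAL / 1e6   # pascals -> megapascals

print(f"{bite_psi} psi is roughly {bite_mpa:.0f} MPa")
```

That works out to roughly 228 MPa at the tooth, which is why Hurum calls the figure "off the scale."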

Paleontologists dug up the partial skull and the fragmented skeleton of a giant pliosaur last summer on the island of Spitsbergen. Fossil hunters get used to working in the heat and cold, the dry and wet, but even without counting the polar bears nosing around their dig, Spitsbergen posed unusual challenges. It has only a three-week window for excavating, from the end of July through much of August. That is after the warmth of a brief summer has thawed upper layers of the ground and before the onset of the round-the-clock darkness of Arctic winter [The New York Times]. A documentary about the expedition will be shown on the History Channel later this month.

The researchers haven’t yet given the new species a scientific name, and although they’ve described their findings at scientific conferences, they have yet to publish their work in a peer-reviewed journal–they say that will happen later this year.

Wednesday, March 11, 2009

Trio of Galaxies Mix It Up

Release No. STScI-2009-10
Image Credit: NASA, ESA, and R. Sharples (University of Durham)

Though they are the largest and most widely scattered objects in the universe, galaxies do go bump in the night. The Hubble Space Telescope has photographed many pairs of galaxies colliding. Like snowflakes, no two examples look exactly alike. This is one of the most arresting galaxy smash-up images to date.

At first glance, it looks as if a smaller galaxy has been caught in a tug-of-war between a Sumo-wrestler pair of elliptical galaxies. The hapless, mangled galaxy may have once looked more like our Milky Way, a pinwheel-shaped galaxy. But now that it's caught in a cosmic Cuisinart, its dust lanes are being stretched and warped by the tug of gravity. Unlike the elliptical galaxies, the spiral is rich in dust and gas for the formation of new stars. It is the fate of the spiral galaxy to be pulled like taffy and then swallowed by the pair of elliptical galaxies. This will trigger a firestorm of new stellar creation.

If there are astronomers on any planets in this galaxy group, they will have a ringside seat to a flurry of starbirth unfolding over many millions of years to come. Eventually the ellipticals should merge too, creating one single super-galaxy many times larger than our Milky Way. This trio is part of a tight cluster of 16 galaxies, many of them dwarf galaxies. The galaxy cluster is called the Hickson Compact Group 90 and lies about 100 million light-years away in the direction of the constellation Piscis Austrinus, the Southern Fish.

Hubble imaged these galaxies with the Advanced Camera for Surveys in May 2006.

The Hubble Space Telescope is a project of international cooperation between NASA and the European Space Agency (ESA) and is managed by NASA's Goddard Space Flight Center (GSFC) in Greenbelt, Md. The Space Telescope Science Institute (STScI) conducts Hubble science operations. The institute is operated for NASA by the Association of Universities for Research in Astronomy, Inc., Washington, D.C.

STScI is an International Year of Astronomy 2009 (IYA 2009) program partner.


Thursday, February 05, 2009

FORECAST: EARTH QUAKE

Fascinated by the implications of what were apparently man-made quakes, USGS scientists in 1969 set up their instruments at the Rangely oilfield in northwestern Colorado. There, Chevron was recovering oil from less productive wells by injecting water into them under great pressure. The recovery technique was setting off small quakes, the strongest near wells subjected to the greatest water pressure. If water was pumped out of the earth, the survey scientists wondered, would the quakes stop? In November 1972, they forced water into four of the Chevron wells. A series of minor quakes soon began, and did not stop until March 1973. Then the scientists pumped water out of the wells, reducing fluid pressure in the rock below. Almost immediately, earthquake activity ended. In a limited way, they had controlled an earthquake.

The results of the Rangely experiments led USGS Geophysicists Raleigh and James Dietrich to propose an ingenious scheme. They suggested drilling a row of three deep holes about 500 yds. apart, along a potentially dangerous fault. By pumping water out of the outer holes, they figured they could effectively strengthen the surrounding rock and lock the fault at each of those places. Then they would inject water into the middle hole, increasing fluid pressure in the nearby rocks and weakening them to the point of failure. A minor quake—contained between the locked areas—should result, relieving the dangerous stresses in the immediate vicinity. By repeating the procedure, the scientists could eventually relieve strains over a wide area. Other scientists feel that such experiments should be undertaken with caution, lest they trigger a large quake. Raleigh is more hopeful. In theory, he says, relatively continuous movement over the entire length of the San Andreas Fault could be maintained—and major earthquakes prevented—with a system of some 500 three-mile-deep holes evenly spaced along the fault. Estimated cost of the gigantic project: $1-$2 billion.
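Raleigh's numbers invite some back-of-the-envelope arithmetic. This is my own illustrative sketch: the hole count and cost range are from the article, while the roughly 800-mile length of the San Andreas Fault is an outside assumption, not stated in the text:

```python
# Rough arithmetic on the proposed fault-control scheme:
# 500 three-mile-deep holes spaced along the San Andreas Fault,
# at an estimated total cost of $1-2 billion.
holes = 500
fault_length_mi = 800              # assumed fault length, not from the article
cost_low, cost_high = 1e9, 2e9     # quoted total cost range in dollars

spacing_mi = fault_length_mi / holes
print(f"Average spacing: ~{spacing_mi:.1f} miles between holes")
print(f"Cost per hole: ${cost_low / holes / 1e6:.0f}M to "
      f"${cost_high / holes / 1e6:.0f}M")
```

On these assumptions the holes would sit a bit over a mile and a half apart, at $2 million to $4 million apiece in 1970s dollars.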

In a time of austerity, the possibility of such lavish financing is remote. As M.I.T.'s Press puts it: "How does one sell preventive medicine for a future affliction to Government agencies beleaguered with current illness?" Ironically, the one event that would release money for the study of earthquake prediction and control is the very disaster that scientists are trying to avert: a major quake striking a highly populated area without any warning. Tens of thousands of people living in the flood plain of the Van Norman Dam had a close call four years ago in the San Fernando Valley quake; had the tremor lasted a few more seconds, the dam might have given way. When the San Andreas Fault convulses again—as it surely must—or when another, less notorious fault elsewhere in the U.S. suddenly gives way, thousands of other Americans may not be so lucky.

The last large-scale killers occurred in Asia. One, last December in northern Pakistan, ravaged nine towns and took nearly 5,000 lives. The other, a February tremor in China, is believed to have killed hundreds. Indeed, not a day passes without earth tremors somewhere on the globe. Some of those quakes are too weak to be felt by humans; they can be detected only by sensitive seismographs. Others are more violent but occur on the ocean floor or in remote areas and do no harm. Some add to the long catalogue of destruction. Last week, for example, a 4.7 earthquake rocked lightly populated Kodiak Island, off the coast of Alaska. In July, a 6.8 quake struck Pagan, Burma, destroying or damaging half of the city's historic temples. Within the past several weeks, strong earthquakes struck Oroville, Calif., Mindanao in the Philippines, the Kamchatka Peninsula in Siberia and the southwest Pacific island of Bougainville.

With good reason, many primitive peoples regarded the terrible quakes they could not understand as the acts of a vengeful deity. As late as 1750, Thomas Sherlock, the Bishop of London, told his flock that two recent earthquakes were warnings that Londoners should atone for their sins. John Wesley agreed. In a 1777 letter to a friend, he wrote:

"There is no divine visitation which is likely to have so general an influence upon sinners as an earthquake." The ancient Japanese believed that the hundreds of quakes that shook (and still shake) their islands every year were caused by the casual movements of a great spider that carried the earth on its back. Natives of Siberia's quake-prone Kamchatka Peninsula blamed the tremors on a giant dog named Kosei tossing snow off his fur. Pythagoras, the Greek philosopher and mathematician, believed that earthquakes were caused by the dead fighting among themselves. Another ancient Greek, Aristotle, had a more scientific explanation. He contended that the earth's rumblings were the result of hot air masses trying to escape from the earth's interior.

In the past decade, the development of a bold new geological theory called plate tectonics—which offers an elegant, comprehensive explanation for continental drift, mountain building and volcanism—seems finally to have clarified the underlying cause of earthquakes. It holds that the surface of the earth consists of about a dozen giant, 70-mile-thick rock plates. Floating on the earth's semimolten mantle and propelled by as yet undetermined forces, the plates are in constant motion. Where they meet, friction sometimes temporarily locks them in place, causing stresses to build up near their edges. Eventually the rock fractures, allowing the plates to resume their motion. It is that sudden release of pent-up energy that causes earthquakes. Off Japan, for instance, the Pacific plate is thrusting under the Eurasian plate, causing the deep-seated quakes characteristic of the Japanese archipelago. In California, along the San Andreas Fault, two great plates are sliding past each other. The sliver west of the fault, which is located on the Pacific plate, is moving toward the northwest. The rest of the state is resting on the North American plate, which is moving westward. The sudden movement of a portion of the fault that had been locked in place for many years is thought to have caused the great San Francisco earthquake of 1906.
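The footnoted claim that Los Angeles will one day lie west of San Francisco invites a quick sanity check. The sketch below uses rough textbook assumptions (a slip rate of about 5.5 cm a year across the San Andreas and a 560-km along-fault separation between the two cities), not figures from the article:

```python
# Sanity check on the footnoted claim that Los Angeles will lie west of
# San Francisco in about 10 million years. Both figures below are rough
# textbook assumptions, not numbers from the article.
SLIP_CM_PER_YEAR = 5.5      # assumed relative motion across the San Andreas
LA_TO_SF_KM = 560.0         # assumed along-fault separation of the two cities

years = (LA_TO_SF_KM * 1e5) / SLIP_CM_PER_YEAR   # convert km to cm, then divide
print(round(years / 1e6, 1), "million years")    # roughly 10 million years
```

At an inch or two a year, the Pacific-plate sliver carrying Los Angeles does indeed draw abreast of San Francisco on roughly the ten-million-year timescale the footnote cites.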

When quake centers are marked on a map of the world, it becomes clear that many earthquakes do indeed occur along plate boundaries. The earthquake-marked "ring of fire" around the Pacific Ocean, for example, neatly outlines the Pacific plate. But earthquakes can also occur well within a plate, possibly because the plate structure has been weakened in those places during periods of ancient volcanism. Charleston, S.C., for instance, is more than 1,000 miles away from the edge of the North American plate; yet it lies in a seismically active area (see map page 39) and was hit by a major quake that killed 27 people in 1886. New Madrid, Mo., near the middle of the plate, was the site of three huge quakes in 1811 and 1812. Wrote one resident of the then sparsely populated area: "The whole land was moved and waved like the waves of the sea. With the explosions and bursting of the ground, large fissures were formed, some of which closed immediately, while others were of varying widths, as much as 30 ft."

Long before the plate-tectonics theory was conceived, scientists were aware that rocks fracture only under extreme stress. As early as 1910, Johns Hopkins Geologist Harry Reid suggested that it should be possible to tell when and where quakes were likely to occur by keeping close tabs on the buildup of stresses along a fault. But the knowledge, instruments and funds necessary to monitor many miles of fault line and interpret any findings simply did not exist. Earthquake prediction did not draw much attention until 1949, when a devastating quake struck the Garm region of Soviet Tadzhikistan, causing an avalanche that buried the village of Khait and killed 12,000 people. Stunned by the disaster, the Soviets organized a scientific expedition and sent it into the quake-prone area. Its mission: to discover any geologic changes—in effect, early warning signals—that might occur before future quakes. The expedition remained in the field far longer than anyone had expected. But it was time well spent. In 1971, at an international scientific meeting in Moscow, the Soviet scientists announced that they had achieved their goal: they had learned how to recognize the signs of impending quakes.

The most important signal, they said, was a change in the velocity of vibrations that pass through the earth's crust as a result of such disturbances as quakes, mining blasts or underground nuclear tests. Earth scientists have long known that tremors spread outward in two different types of seismic waves. P waves cause any rock in their path to compress and then expand in the same direction as the waves are traveling. S waves move the rock in a direction that is perpendicular to their path. Because P waves travel faster than S waves, they reach seismographs first. The Russian scientists found that the difference in the arrival times of P and S waves began to decrease markedly for days, weeks and even months before a quake. Then, shortly before the quake struck, the lead time mysteriously returned to normal. The Russians also learned that the longer the period of abnormal wave velocity before a quake, the larger the eventual tremor was likely to be.*
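The P-versus-S timing the Russians exploited can be sketched in a few lines of Python. The velocities below are generic crustal values assumed for illustration, not measurements from Garm:

```python
# Sketch of the P/S-wave timing relationship described above. The
# velocities are illustrative textbook values for crustal rock.
VP = 6.0          # assumed P-wave speed, km/s
VS = VP / 1.73    # S-wave speed; P waves run roughly 1.73x faster

def s_minus_p_lag(distance_km):
    """Seconds by which the P wave leads the S wave at a seismograph."""
    return distance_km / VS - distance_km / VP

def distance_from_lag(lag_s):
    """Invert the observed lead time to estimate distance to the tremor."""
    return lag_s * (VP * VS) / (VP - VS)

lag = s_minus_p_lag(100.0)               # a tremor 100 km away
print(round(lag, 1))                     # ~12.2 seconds of lead time
print(round(distance_from_lag(lag), 1))  # recovers 100.0 km
```

It is precisely this normally fixed lead time that the Soviet teams watched: a shrinking lag meant the P waves were slowing in dilated, cracked rock, and its return to normal meant a quake was near.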

The implication of that information was not lost on visiting Westerners. As soon as he returned home from Moscow, Lynn Sykes, head of the seismology group of Columbia University's Lamont-Doherty Geological Observatory, urged one of his students, a young Indian doctoral candidate named Yash Aggarwal, to look for similar velocity shifts in records from Lamont-Doherty's network of seismographs in the Blue Mountain Lake region of the Adirondacks, in upper New York State, where tiny tremors occur frequently.

As it happens, a swarm of small earthquakes had taken place at approximately the time of the Moscow meeting. Aggarwal's subsequent analysis bore out the Russian claims: before each quake, there had been a distinct drop in the lead time of the P waves.

As significant as those changes seemed, U.S. seismologists felt that they could not be really dependable as a quake-prediction signal without a more fundamental understanding of what was causing them. That explanation was already available. In the 1960s, while studying the reaction of materials to great mechanical strains, a team of researchers under M.I.T. Geologist William Brace had discovered that as rock approaches its breaking point, there are unexpected changes in its properties. For one thing, its resistance to electricity increases; for another, the seismic waves passing through it slow down.

Both effects seemed related to a phenomenon called dilatancy—the opening of a myriad of tiny, often microscopic cracks in rock subjected to great pressure. Brace even suggested at the time that the physical changes associated with dilatancy might provide warning of an impending earthquake, but neither he nor anyone else was quite sure how to proceed with his proposal. Dilatancy was, in effect, put on the shelf.

The Russian discoveries reawakened interest in the subject. Geophysicist Christopher Scholz of Lamont-Doherty and Amos Nur at Stanford, both of whom had studied under Brace at M.I.T., independently published papers that used dilatancy to explain the Russian findings. Both reports pointed out an apparent paradox: when the cracks first open in the crustal rock, its strength increases. Temporarily, the rock resists fracturing and the quake is delayed. At the same time, seismic waves slow down because they do not travel as fast through the open spaces as they do through solid rock. Eventually ground water begins to seep into the new openings in the dilated rock. Then the seismic-wave velocity quickly returns to normal. The water also has another effect: it weakens the rock until it suddenly gives way, causing the quake.

Soon California Institute of Technology's James Whitcomb, Jan Garmany and Don Anderson weighed in with more evidence. In a search of past records, they found a distinct drop in the speed of P waves 3½ years before the 1971 San Fernando quake (58 deaths), the largest in California in recent years. The P waves had returned to their normal velocity a few months before the tremor. Besides providing what amounted to a retroactive prediction of that powerful quake, the Caltech researchers demonstrated that it was primarily the velocity of the P waves, not the S waves, that changed. Their figures were significant for another reason: the P-wave velocity change was not caused by a quirk of geology in the Garm region or even in the Adirondacks, but was apparently a common symptom of the buildup of dangerous stresses in the earth.

In fact, dilatancy seems to explain virtually all the strange effects observed prior to earthquakes. As cracks open in rock, the rock's electrical resistance rises because air is not a good conductor of electricity. The cracks also increase the surface area of rock exposed to water; the water thus comes in contact with more radioactive material and absorbs more radon—a radioactive gas that the Soviet scientists had noticed in increased quantities in Garm-area wells. In addition, because the cracking of the rock increases its volume, dilatancy can account for the crustal uplift and tilting that precede some quakes. The Japanese, for instance, noticed a 2-in. rise in the ground as long as five years before the major quake that rocked Niigata in 1964. Scientists are less certain about how dilatancy accounts for variations in the local magnetic field but think that the effect is related to changes in the rock's electrical resistance.

With their new knowledge, U.S. and Russian scientists cautiously began making private predictions of impending earthquakes. In 1973, after he had studied data from seven portable seismographs at the Blue Mountain Lake encampment, Columbia University's Aggarwal excitedly telephoned Lynn Sykes back at the laboratory. All signs, said Aggarwal, pointed to an imminent earthquake of magnitude 2.5 to 3. As Aggarwal was sitting down to dinner two days later, the earth rumbled under his feet. "I could feel the waves passing by," he recalls, "and I was jubilant." In November 1973, after observing changes in P-wave velocity, Caltech's Whitcomb predicted that there would be a shock near Riverside, Calif., within three months. Sure enough, a tremor did hit before his deadline—on Jan. 30. Whitcomb's successful prediction was particularly important. All previous forecasts had involved quakes along thrust faults, where rock on one side of a fault is pushing against rock on the other. The Riverside quake took place on a strike-slip fault, along which the adjoining sides are sliding past each other. Because most upheavals along the San Andreas Fault involve strike-slip quakes, Whitcomb's forecast raised hopes that seismologists could use their new techniques to predict the major earthquakes that are bound to occur along the San Andreas.

The Chinese, too, were making rapid progress in their earthquake-forecast studies. When a delegation of U.S. scientists headed by M.I.T. Geologist Frank Press toured Chinese earthquake-research centers in October 1974, they were astonished to learn that the country had some 10,000 trained earthquake specialists (more than ten times the American total). These specialists were operating 17 major observation centers, which in turn received data from 250 seismic stations and 5,000 observation points (some of them simply wells where the radon content of water was measured). In addition, thousands of dedicated amateurs, mainly high school students, regularly collect earthquake data.

The Chinese have good reason to be vigilant. Many of their people live in vulnerable adobe-type, tile-roofed homes that collapse easily during tremors. And the country shudders through a great number of earthquakes, apparently because of the northward push of the Indian plate against the Eurasian plate. Says Press: "It is probably the one country that could suffer a million dead in a single earthquake."

Chinese scientists read every scientific paper published by foreign earthquake researchers. They also pay close attention to exotic prequake signals—including oddities of animal behavior—so far largely overlooked by other nations. Before a quake in the summer of 1969, the Chinese observed that in the Tientsin zoo, the swans abruptly left the water, a Manchurian tiger stopped pacing in his cage, a Tibetan yak collapsed, and a panda held its head in its paws and moaned. On his return from the China tour, USGS's Barry Raleigh learned that horses had behaved skittishly in the Hollister area before the Thanksgiving Day quake. "We were very skeptical when we arrived in China regarding animal behavior," he says. "But there may be something in it."

Though the U.S. does not have the national commitment of the Chinese, there is no lack of urgency among American scientists. California has not had a great earthquake since the San Francisco disaster in 1906, and seismologists are warily eyeing at least two stretches of the San Andreas Fault that seem to be "locked." One segment, near Los Angeles, has apparently not budged, while other parts of the Pacific and North American plates have slid some 30 ft. past each other. Near San Francisco, there is another locked section. Sooner or later, such segments will have to catch up with the inexorable movement of the opposing plates. If they do so in one sudden jolt, the resulting earthquakes, probably in the 7- to 8-point Richter range and packing the energy of multimegaton hydrogen bombs, will cause widespread destruction in the surrounding areas.
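The Richter figures just cited are logarithmic, and the arithmetic behind them is worth a quick sketch. The factor of 10^1.5 per magnitude unit for radiated energy, roughly the thirtyfold rule of thumb cited in the footnote, is a standard approximation assumed here for illustration:

```python
# Comparing two quake magnitudes on the logarithmic Richter scale:
# recorded amplitude grows tenfold per point, and radiated energy by
# about 10^1.5 (roughly 31.6-fold, the "30-fold" rule of thumb).
def amplitude_ratio(m1, m2):
    return 10 ** (m2 - m1)

def energy_ratio(m1, m2):
    return 10 ** (1.5 * (m2 - m1))

print(amplitude_ratio(7, 8))         # 10
print(round(energy_ratio(7, 8), 1))  # 31.6
print(round(energy_ratio(5, 8), 0))  # a magnitude 8 dwarfs a damaging 5
```

The jump from a "severe" 7 to a "violent" 8, in other words, is not one notch but some thirtyfold more energy released.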

If one of those quakes occurs in the San Francisco area, the results will be far more calamitous than in 1906 (see box page 40). A comparable earthquake near Los Angeles could kill as many as 20,000 and injure nearly 600,000.

As a practical start toward earthquake prediction, USGS is constructing a prototype network of automated sensing stations equipped with magnetometers, tiltmeters and seismographs in California's Bear Valley. They are also beginning to make measurements of radon in wells and electrical resistance in rock. Some of the data are already being fed into the USGS's central station at Menlo Park. But analysis is still being delayed by lack of adequate computer facilities.

Other seismic monitoring grids in the U.S. include a 45-station network in the Los Angeles area, operated jointly by the USGS and Caltech; smaller networks in the New York region under the Lamont-Doherty scientists; and those in the Charleston, S.C., area, operated by the University of South Carolina. When completed and computerized, these networks will provide two warnings of impending quakes. If scientists detect changes in P-wave velocities, magnetic field and other dilatancy effects that persist over a wide area, a large quake can be expected—but not for many months. If the dilatancy effects occur in a small area, the quake will be minor but will occur soon. The return to normal of the dilatancy effects provides the second warning. It indicates that the quake will occur in about one-tenth the time during which the changes were measured. If dilatancy changes have been recorded for 70 days and then suddenly return to normal, the quake should occur in about a week.
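The one-tenth rule described above reduces to trivial arithmetic; a minimal Python sketch, purely illustrative:

```python
# Crude encoding of the two warnings: a wide-area anomaly points to a
# large quake many months off, a local anomaly to a minor quake soon.
# Once the dilatancy signals return to normal, the quake is expected in
# about one-tenth the time the anomaly lasted.
def quake_outlook(anomaly_days, wide_area):
    size = "large" if wide_area else "minor"
    return size, anomaly_days / 10

print(quake_outlook(70, wide_area=False))   # ('minor', 7.0) -- the 70-day example
print(quake_outlook(365, wide_area=True))   # a year-long, wide-area anomaly
```

A year of anomalous readings over a broad region would thus translate into roughly five weeks of warning once the signals snapped back to normal.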

The networks are far from complete, progress in general has been slow, and seismologists blame inadequate Government funding. The USGS's annual quake budget has remained at about $11 million for the past few years, only about $3 million of it for research in the art of forecasting.

Once in operation, an earthquake warning system will bring with it a new set of problems. If a major quake is forecast for San Francisco, for example, should the Government shut down businesses and evacuate the populace? Where would evacuees be housed? If the quake does not occur, who will be responsible for the financial loss caused by the evacuation? Answers come more easily in totalitarian China. There, says Press, "if an actual quake does not take place, it is felt that the people will understand that the state is acting on their behalf and accept a momentary disruption in their normal lives."

Just such a disruption took place in many Chinese communities on Feb. 4, the day that an earthquake struck an industrialized area in Liaoning province. According to the Chinese publication Earthquake Frontiers, at 6 p.m. that day an announcement was made over the public-address system in the Kuan-t'un commune: "According to a prediction by the superior command, a strong earthquake will probably occur tonight. We insist that all people leave their homes and all animals leave their stables." As an added incentive for people to go outside, the commune leaders also announced that movies would be shown in an outdoor location.

"As soon as the announcement was finished," the article says, "many men and women members with their whole families gathered in the square in front of the detachment gate. The first film was barely finished when a strong earthquake, 7.3 on the magnitude scale, occurred. Lightning flashed and a great noise like thunder came from the earth. Many houses were destroyed at once. Of the 2,000 people in the commune, only the 'stubborn ones,' who ignored the mobilization order, were wounded or killed by the earthquake. All the others were safe and uninjured; not even one head of livestock was lost."

Convinced that "earthquake prediction is a fact at the present time," and worried about the effect of such forecasts, particularly in U.S. cities, the National Academy of Sciences this week released a massive study entitled "Earthquake Prediction and Public Policy." Prepared by a panel of experts headed by U.C.L.A. Sociologist Ralph Turner, the study takes strong issue with the politicians and the few scientists who believe that earthquake predictions and warnings would cause panic and economic paralysis, thus resulting in more harm than the tremors themselves. Forecasting would clearly save lives, the panel states, and that is the "highest priority." Because most casualties during a quake are caused by collapsing buildings, the report recommends stronger building codes in areas where earthquakes occur frequently, the allocation of funds for strengthening existing structures in areas where earthquakes have been forecast, and even a requirement that some of the population live in mobile homes and tents when a quake is imminent. Fearful that forecasting could become a political football and that some officials might try to suppress news of an impending quake, the panel recommends that warnings, which would cause disruption of daily routine when an earthquake threatens, should be issued by elected officials—but only after a public prediction has been made by a panel of scientists set up by a federal agency.

Other scientists are already looking ahead toward an even more remarkable goal than forecasting: earthquake control. What may become the basic technique for taming quakes was discovered accidentally in 1966 by earth scientists in the Denver area. They noted that the forced pumping of lethal wastes from the manufacture of nerve gases into deep wells at the Army's Rocky Mountain arsenal coincided with the occurrence of small quakes. After the Army suspended the waste-disposal program, the number of quakes declined sharply.

Fascinated by the implications of what were apparently man-made quakes, USGS scientists in 1969 set up their instruments at the Rangely oilfield in northwestern Colorado. There, Chevron was recovering oil from less productive wells by injecting water into them under great pressure. The recovery technique was setting off small quakes, the strongest near wells subjected to the greatest water pressure. If water was pumped out of the earth, the survey scientists wondered, would the quakes stop? In November 1972, they forced water into four of the Chevron wells. A series of minor quakes soon began, and did not stop until March 1973. Then the scientists pumped water out of the wells, reducing fluid pressure in the rock below. Almost immediately, earthquake activity ended. In a limited way, they had controlled an earthquake.

The results of the Rangely experiments led USGS Geophysicists Raleigh and James Dietrich to propose an ingenious scheme. They suggested drilling a row of three deep holes about 500 yds. apart, along a potentially dangerous fault. By pumping water out of the outer holes, they figured they could effectively strengthen the surrounding rock and lock the fault at each of those places. Then they would inject water into the middle hole, increasing fluid pressure in the nearby rocks and weakening them to the point of failure. A minor quake—contained between the locked areas—should result, relieving the dangerous stresses in the immediate vicinity. By repeating the procedure, the scientists could eventually relieve strains over a wide area. Other scientists feel that such experiments should be undertaken with caution, lest they trigger a large quake. Raleigh is more hopeful. In theory, he says, relatively continuous movement over the entire length of the San Andreas Fault could be maintained—and major earthquakes prevented—with a system of some 500 three-mile-deep holes evenly spaced along the fault. Estimated cost of the gigantic project: $1-$2 billion.
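Raleigh's scheme also lends itself to back-of-envelope arithmetic. The fault length below is an assumed round figure; only the 500 holes and the $1-2 billion estimate come from the article:

```python
# Back-of-envelope figures for the proposed network of stress-relief
# holes along the San Andreas. The fault length is an assumed round
# figure, not a number from the article.
HOLES = 500
FAULT_MILES = 700.0              # assumed length of the San Andreas Fault
COST_LOW, COST_HIGH = 1e9, 2e9   # the article's $1-2 billion estimate

spacing_miles = FAULT_MILES / HOLES
cost_per_hole = (COST_LOW / HOLES, COST_HIGH / HOLES)
print(spacing_miles)   # 1.4 miles between holes
print(cost_per_hole)   # $2-4 million per three-mile-deep hole
```

Spread along the fault, the plan works out to a deep borehole every mile or two, each costing a few million dollars, which helps explain why the total runs into the billions.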

In a time of austerity, the possibility of such lavish financing is remote. As M.I.T.'s Press puts it: "How does one sell preventive medicine for a future affliction to Government agencies beleaguered with current illness?" Ironically, the one event that would release money for the study of earthquake prediction and control is the very disaster that scientists are trying to avert: a major quake striking a highly populated area without any warning. Tens of thousands of people living in the flood plain of the Van Norman Dam had a close call four years ago in the San Fernando Valley quake; had the tremor lasted a few more seconds, the dam might have given way. When the San Andreas Fault convulses again—as it surely must—or when another, less notorious fault elsewhere in the U.S. suddenly gives way, thousands of other Americans may not be so lucky.

* At the current rate of plate movement, Los Angeles will lie directly west of San Francisco in 10 million years.

* Used to measure the strength of earthquakes. Because the scale is logarithmic, each higher number represents a tenfold increase in the measured amplitude of the tremors, and about a 30-fold increase in the energy released. Thus a 2-point quake is barely perceptible, a 5 may cause minor damage, a 7 is severe, and an 8 is a violent quake.

* U.S. scientists now estimate that the change can occur as long as ten years before a magnitude 8 quake, a year before a 7-pointer and a few months before a magnitude 6.

Wednesday, January 28, 2009

Dinosaur Fossil Record Compiled, Analyzed; 500 Or More Dinosaurs Possible Yet To Be Discovered

ScienceDaily (Feb. 10, 2004) — A graduate student in earth and planetary sciences in Arts & Sciences at Washington University in St. Louis has combed the dinosaur fossil record from T. rex to songbirds and has compiled the first quantitative analysis of the quality and congruence of that record.

Julia Heathcote, whose advisor is Josh Smith, Ph.D., Washington University assistant professor of earth and planetary sciences, examined data on more than 250 dinosaur genera, as well as various clades, or branches, of the dinosaur family tree.

Heathcote found that the existing dinosaur data are between one-half and two-thirds complete, or of high quality. As a template, she used two published whole-dinosaur studies and compared them with smaller family trees within the context of the whole dinosaur data, commonly known as the Dinosauria. She also analyzed for congruence – the correlation between the fossil record and family tree relationships. Heathcote found some of the clades both complete and congruent, while others are poor in both respects.

"The whole Dinosauria fossil record I would say is moderately good, which was a surprise, because I thought it would be much worse," Heathcote said. "It generally shows a low degree of completeness but a high degree of congruence with the existing phylogenies, or family trees, studied."

Her results are important for paleontologists who are especially focused on the evolution of dinosaurs. It is to the paleontologist what Beckett's Baseball Card Price Guide is to the baseball card collector, and more – Heathcote's analysis provides information on the relationships between classes and groups, whereas the Beckett guide draws no lineages for players, for instance.

Heathcote said that there have been many attempts to analyze evolutionary patterns using the fossil record, but that the patterns can only be interpreted usefully in the context of stratigraphy -- essentially how old the fossils are. It's important to know the quality of the fossil record to better assess whether an apparently large number of genera at any one time – say, the late Jurassic period – is due to genuine species diversity or just exceptionally good preservation. Congruence matters, too, to provide information on the adequacy of data and confidence to construct evolutionary relationships.

Heathcote presented her results at the annual meeting of the Geological Society of America, held Nov. 2-5, 2003, in Seattle.

Heathcote used three different calculations to achieve her results: the Stratigraphic Consistency Index, the Relative Completeness Index and the Gap Excess Ratio. The first is a measure of how well the relationships that have been proposed for dinosaurs fit the stratigraphic data, which contributes to a timeline for evolution. The Relative Completeness Index measures the implied missing data as a percentage of the data researchers actually have. And the Gap Excess Ratio compares the implied missing data with the minimum missing data possible if the genera were arranged in the family tree in order of age.
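Of the three indices, the Relative Completeness Index is the simplest to sketch. A minimal Python illustration with invented numbers, not Heathcote's data: the index weighs the "ghost" gaps a family tree implies against the fossil ranges actually observed.

```python
# Sketch of a Relative Completeness Index in the spirit described above.
# Observed ranges and tree-implied ghost gaps (in millions of years)
# are invented for illustration.
def relative_completeness_index(observed_ranges_myr, ghost_gaps_myr):
    """RCI = (1 - total implied gap / total observed range) * 100."""
    total_observed = sum(observed_ranges_myr)
    total_gaps = sum(ghost_gaps_myr)
    return (1 - total_gaps / total_observed) * 100

observed = [12.0, 8.0, 20.0, 15.0]   # known fossil ranges for four genera
ghosts = [3.0, 5.0, 2.0, 6.0]        # gaps the family tree implies
print(round(relative_completeness_index(observed, ghosts), 1))  # 70.9
```

A score near 100 would mean the tree implies almost no missing fossils; the middling figure here mirrors the "one-half to two-thirds complete" verdict in the text.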

Heathcote said that the known number of dinosaur genera now stands at slightly more than 250. But because her results give a maximum possible completeness value, there might be 500 or more yet to be discovered, and she hopes that with each discovery, the researchers will enter their data into her program so that all paleontologists can benefit by seeing how the new discovery relates to previous ones.

She called the work "a new tool that draws together all of the data of the past 150 years, all plotted out accurately for the first time. You can see how far back these dinosaurs go, see their relationships with each other."


Adapted from materials provided by Washington University in St. Louis.

Dinosaur Fossils Fit Perfectly Into The Evolutionary Tree Of Life

ScienceDaily (Jan. 26, 2009) — A recent study by researchers at the University of Bath and London’s Natural History Museum has found that scientists’ knowledge of the evolution of dinosaurs is remarkably complete.

Evolutionary biologists have two main ways to study the evolution of prehistoric plants and animals: first, they use radioactive dating techniques to put fossils in chronological order according to the age of the rocks in which they are found (stratigraphy); second, they observe and classify the characteristics of fossilised remains according to their relatedness (morphology).

Dr Matthew Wills from the University of Bath’s Department of Biology & Biochemistry worked with Dr Paul Barrett from the Natural History Museum and Julia Heathcote at Birkbeck College (London) to analyse statistical data from fossils of the four major groups of dinosaur to see how closely they matched their trees of evolutionary relatedness.

The researchers found that the fossil record for the dinosaurs studied, ranging from gigantic sauropods to two-legged meat eaters such as T. rex, matched very well with the evolutionary tree, meaning that the current view of evolution of these creatures is very accurate.

Dr Matthew Wills explained: “We have two independent lines of evidence on the history of life: the chronological order of fossils in the rocks, and ‘trees’ of evolutionary relatedness.

“When the two tell the same story, the most likely explanation is that both reflect the truth. When they disagree, and the order of animals on the tree is out of whack with the order in the rocks, you either have a dodgy tree, lots of missing fossils, or both.

“What we’ve shown in this study is that the agreement for dinosaurs is remarkably good, meaning that we can have faith in both our understanding of their evolution, and the relative completeness of their fossil record.

“In other words, our knowledge of dinosaurs is very, very good.”

The researchers studied gaps in the fossil record, so-called ‘ghost ranges’, where the evolutionary tree indicates there should be fossils but where none have yet been found. They mapped these gaps onto the evolutionary tree and calculated statistical probabilities to find the closeness of the match.

Dr Wills said: “Gaps in the fossil record can occur for a number of reasons. Only a tiny minority of animals are preserved as fossils because exceptional geological conditions are needed. Other fossils may be difficult to classify because they are incomplete; others just haven’t been found yet.

“Pinning down an accurate date for some fossils can also prove difficult. For example, the oldest fossil may be so incomplete that it becomes uncertain as to which group it belongs. This is particularly true with fragments of bones. Our study made allowances for this uncertainty.

“We are excited that our data show an almost perfect agreement between the evolutionary tree and the ages of fossils in the rocks. This is because it confirms that the fossil record offers an extremely accurate account of how these amazing animals evolved over time and gives clues as to how mammals and birds evolved from them.”

The study, published in the peer-reviewed journal Systematic Biology, was part of a project funded by the Biotechnology & Biological Sciences Research Council (BBSRC) that aimed to combine different forms of evolutionary evidence to produce more accurate evolutionary trees.

----------------------------------------------------------

Journal reference:

  1. Wills et al. The Modified Gap Excess Ratio (GER*) and the Stratigraphic Congruence of Dinosaur Phylogenies. Systematic Biology, 2008; 57 (6): 891 DOI: 10.1080/10635150802570809
Adapted from materials provided by the University of Bath.