23 February 2007

5 Minutes To Doomsday

By Rusty Rockets

The failure to rein in biological, chemical, and nuclear weapons development, compounded by climate change and a period of global turmoil, has been reason enough for the Bulletin of the Atomic Scientists (BAS) to reset their Doomsday Clock. They moved the clock's hand forward 2 minutes, so it now reads 5 minutes to midnight, just to prove to everyone that we are all on the brink of catastrophe once again (as if we'd have it any other way). But what exactly does the Doomsday Clock achieve, and why should we take any notice of the time-setting habits of a bunch of boffins anyway?

The BAS was formed in 1945, and the Doomsday Clock made its debut in 1947, when the newly formed group of scientists sought a simple way to convey the perils of nuclear brinkmanship. The group considered the clock idea a winner, and by 1949 the BAS founders had agreed that the position of its minute hand would represent their most recent assessment of world affairs. The closest the clock has ever been to midnight (the image chosen to represent apocalypse) is 2 minutes, in 1953, when the United States, closely followed by the Soviets, developed the H-bomb. "The hands of the Clock of Doom have moved again," said the BAS at the time. "Only a few more swings of the pendulum and, from Moscow to Chicago, atomic explosions will strike midnight for Western civilization." Ahem, so, relatively speaking, 5 minutes to midnight doesn't sound too bad.

The solemn duty of moving the hand forward or backward is left to the BAS's Board of Directors after consultation with the Board of Sponsors, which currently includes 18 Nobel Laureates. The board moves the minute hand based on nuclear developments, environmental factors, and new and emerging technologies that they consider pose a threat to human life on a grand scale. This time around, the board justified the 2-minute change by drawing attention to the world's stash of some 27,000 nuclear weapons (with 2,000 of them primed to launch within minutes), and the likely destruction of human habitats from climate change. "We stand at the brink of a second nuclear age. Not since the first atomic bombs were dropped on Hiroshima and Nagasaki has the world faced such perilous choices," said the board in a recent statement, adding: "We have concluded that the dangers posed by climate change are nearly as dire as those posed by nuclear weapons."

But why, you may ask, do scientists have to resort to gimmicky tactics like a big, scary clock to get their message across? Isn't the public hanging on every word that scientists utter, and don't scientists have the ear of every political leader around the globe? Apparently not; if they did, we might not be facing such a perilous future. The question begging to be asked is: if this clock has been around since 1947, why does it seem that nobody has taken any notice of it? I mean, which bit don't the public and politicians understand when teams of highly qualified scientists say that nuclear weapons are dangerous and that climate change is a real phenomenon? To this end, we might ask what input scientists really have in public policy debates.

Stanford University's Rebecca Slayton, a lecturer in Science, Technology, and Society, recently spoke as a member of a panel entitled "Who Speaks for Science? Scientific Authority in the 21st Century," hosted by the American Association for the Advancement of Science (AAAS). Using the computing behind missile defense systems as a case study, Slayton showed that while their influence has been small, scientists have had some sway in public debates over weapons policy. Missile defense systems, says Slayton, rely on extremely complex computers and software that cannot be deemed reliable until tested under realistic conditions. By implication, unless someone decides to start a full-blown nuclear exchange as a test, such systems can never be considered bug-free.

It's taken many years for computer scientists to get to a point where their advice on such matters is taken seriously, explains Slayton. Looking at two missile defense debates, from the 1960s and 1980s, Slayton found that computer professionals were among those most opposed to the idea of using such systems to protect the US from Soviet nuclear attack. But it wasn't until the 80s that computer professionals had gained enough credibility as scientists in their own right to become integral to Defense Department expert panels. In fact, it was software engineer David Lorge Parnas' claim that reliable software couldn't be written for the "Star Wars" missile defense system, and his subsequent resignation, that led to the whole affair being made public via the New York Times.

While it looks as though science, and reason, may have had a rare win over politics during these episodes, Slayton says that people still argued (and continue to argue) against the expert opinions provided by software engineers. It makes you wonder whether they treat their electrician or motor mechanic the same way. "How could you expect a computer system more complex than anything we use these days to work correctly the very first time it's used?" asks Slayton, incredulously. "And yet, people argued about it. What kind of knowledge do you need to bring to the table to close this argument down?" It's a pointed question, and one that echoes a great many debates currently on the public agenda, including climate change and arms control.

As Slayton says, it took software engineers 20 years before their opinions were even remotely respected, and climatologists have faced a similar struggle to get their message heard. When experts in their field present us with apocalyptic scenarios, why is our first response to doubt or ignore them? Who else should speak for science other than scientists? The clock is ticking!
