
Instabilities in the Control of Nuclear Forces*

Paul Bracken

Professor of Public Policy and Political Science, Yale University. Author of the highly popular book, The Command and Control of Nuclear Forces, Dr. Bracken specializes in international security issues. He is a member of the editorial boards of the Journal of Conflict Resolution, Orbis, and Defense Analysis.

 


Introduction

Warning System Reliability

Multiple Errors

Tight Coupling

The Global Warning System

Reactions to Compound Stimuli

Conclusions

References

 


 

No single dictator, no single event pushed Europe into war in 1914. But during the preceding decade, motivated by various political and economic self-interests, the nations of Europe had institutionalized the potential for catastrophe. They had built interlocking alerts and mobilization plans that, once triggered, swamped and outran the political control process. It was a disaster waiting to happen.

The lesson from the outbreak of World War I is that a nation's actions in a crisis are profoundly influenced by the defense institutions built years before the crisis occurs. The construction of fantastically complex nuclear command organizations in the US and the USSR has created a similarly volatile situation, but on a far more spectacular and quick-reacting scale. A review of today's nuclear command organizations, and their governance, is clearly in order.

 

Warning System Reliability

Warning systems are an important part of the command and control of nuclear forces. They help protect vulnerable strategic weapons, such as bombers and missiles, against surprise attack. If one country knows that the other has an effective warning system, it is less likely to attack in the first place and the world is more stable as a result.

More sophisticated warning may therefore mean better security - but not always. During the past twenty-five years, both the US and the USSR have made immense investments to build highly complex warning systems. The sophistication of these systems, and their interconnection, have advanced in a manner that defies comprehension. And that may be the heart of the problem. With these systems tightly coupling the nuclear arsenals of each side, the effect of small perturbations is amplified throughout the entire nuclear force system.

 

"During the past twenty-five years, both the US and the USSR have made immense investments to build highly complex warning systems. The sophistication of these systems, and their interconnection, have advanced in a manner that defies comprehension."

 

The average person seems to realize, or at least intuit, the possible danger. Since the early 1950s, the specter of nuclear war by technical accident has been a pervasive theme of popular novels and movies. The story from the 1950s of a flock of Canadian geese that the Distant Early Warning Line radars mistakenly interpreted as an attack by Soviet bombers has been enshrined in the lore of the nuclear age. As warning systems became more sophisticated, variants of the episode inevitably followed. In 1960, meteor showers and lunar radar reflections, rather than Canadian geese, excited the new Ballistic Missile Early Warning System (BMEWS) radar, temporarily leading the North American Aerospace Defense Command (NORAD) to believe that a Soviet missile attack was en route. In 1980, a 46¢ computer chip failed in the warning system's computers, producing an image of a Soviet submarine-launched ballistic missile (SLBM) attack on the US. While information is not available on Soviet false alarms, it is reasonable to assume that they have had similar experiences.

Official reaction to these false alarms tends to be defensive: Corrective actions are taken to prevent repeated accidents; nobody, including the military, wants accidental war; the system has been designed to make sure that the decision to go to war is not driven by a flock of geese or a defective computer chip. These arguments seem persuasive. Man is always in the decision loop; positive control is exercised at every point. I am convinced of the validity of these propositions at the intellectual level at which they are offered.

Yet, there is a latent fear. Intuition and common sense tell us that all is not well. Broadly speaking, people believe in Murphy's law: "If anything can go wrong it will." They believe it because it applies to the world of experience, and it applies with special force to large, technically complex systems. In the world in which people live, power grids fail, trains derail, bridges and dams fall down, DC-10 engines fall off, and nuclear power plants come close to meltdown. These things don't happen often, but they do occur.

A 1965 power failure in the American Northeast was traced to a single inexpensive switch. It was said repeatedly after 1965 that such a cascading power blackout could never occur again, since the freak accident had been carefully considered in new designs based on the lessons of 1965. But it did happen again, in 1977, in New York.

Engines fell off an inspected DC-10 airplane, leading to public outcry, high-level attention, and lawsuits. Even after repeated warnings, the same type of engine fell off the same type of plane two months later. Similarly, the cargo doors of the DC-10 blew out, not once but three times. Ultimately, the blown-out cargo doors caused a plane crash with major loss of life.

The nuclear power plant failures at Three Mile Island in 1979 and Chernobyl in 1986 came after innumerable engineering studies had been made on the safety of these plants. Nuclear power experts had claimed that getting hit by a meteor was far more likely than a major nuclear plant accident - in retrospect, clearly an invalid comparison.

 

"In the world in which people live, power grids fail, trains derail, bridges and dams fall down, DC-10 engines fall off, and nuclear power plants come close to meltdown. These things don't happen often, but they do occur."

 

When an expert states that a flock of geese or a lunar radar reflection will not trigger the automatic launch of a nuclear weapon, he or she is making a particular remark about a single system, a particular possibility. Our intuition, on the other hand, takes the flock of geese triggering World War III as an example of a wider concern. In the world of experience, we feel complex systems are bound to go awry precisely because they are complex.

Power blackouts, DC-10 failures, and nuclear power station accidents reinforce our intuitive concerns. In each of these examples, it was not an isolated accident that led to trouble, but a series of compound, highly correlated events that triggered a sequence of human, bureaucratic, and technical reactions. These reactions resulted in incorrect diagnoses of what was going wrong, which led to the initiation of actions that either had nothing to do with the problem or, even worse, exacerbated it.

 

Multiple Errors

Discrete accidents are easy to design against. The flight of geese, the lunar radar reflection, and the imperfect computer chip are all isolated events. With so many checks and balances overlaid onto the control system for strategic weapons, the likelihood of accidental or inadvertent war from a single failure is very, very low in peacetime. Each layer of the warning and intelligence system inspires new checks, new balances, and new authentication procedures. Against the discrete accident, malfunction, or operator error, the total system is massively redundant. I believe the likelihood of nuclear war due to a single failure is much lower today than it was twenty-five years ago precisely because of today's more complex warning and control system.

Multiple errors or malfunctions are a different matter altogether. The problem with compound accidents, especially those involving human behavior, is that the number of possible reactions is enormous and no design can protect against all of them. The likelihood that multiple events will lead to trouble increases when there is increased military activity. Thus, when forces are placed on alert, the complexity of the warning system may not only cease to provide redundancy; it may also amplify the mistakes.
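The arithmetic behind this distinction can be sketched in a few lines of code. The fragment below is purely illustrative - the per-check failure rate, the number of independent checks, and the simple common-cause model are invented for the example, not taken from the article - but it shows why layered checks make an isolated failure vanishingly unlikely while a single shared stress can defeat all the layers at once.

```python
# Illustrative only: why redundancy defeats isolated failures but not
# compound, correlated ones. All rates are hypothetical.

def p_false_alert_independent(p_fail: float, n_checks: int) -> float:
    """Probability that n independent checks all fail at the same time."""
    return p_fail ** n_checks

def p_false_alert_correlated(p_fail: float, n_checks: int, p_common: float) -> float:
    """Crude common-cause model: with probability p_common a shared stress
    (crisis tempo, shared software, a common sensor) defeats every check
    together; otherwise the checks fail independently."""
    return p_common + (1 - p_common) * p_fail ** n_checks

p, n = 0.01, 4   # hypothetical per-check failure rate and number of checks
print(f"independent failures      : {p_false_alert_independent(p, n):.1e}")
print(f"with common cause (0.001) : {p_false_alert_correlated(p, n, 0.001):.1e}")
```

Under these made-up numbers the correlated case is roughly five orders of magnitude more likely than the independent one, which is the intuition behind the danger of compound accidents.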

What set off the interlocking alerts of the European armies in 1914 was not the isolated assassination of the archduke in Sarajevo but the decision to mobilize. The effect of the thousands of orders issued was to create an unstoppable chain reaction of reinforcing alerts. The alerts acted like ratchets, step-by-step moving Europe into war but unable to function in reverse toward peace.

 

"In the world of experience, we feel complex systems are bound to go awry precisely because they are complex."

 

In the summer of 1914, everything functioned the way it was supposed to. There were no accidents in the usual sense of the term. Political leaders lost control of the tremendous momentum built up when their armies went on alert. The institutions designed to protect the peace moved the nations of Europe into war. It pays to examine some implications of this theme for the nuclear forces of today.

 

Tight Coupling

A major element in the evolution of both American and Soviet warning systems has been their thoroughgoing integration with the command and control of nuclear weapons themselves. The result is a tightly coupled system in which a perturbation in one part can, in short order, be amplified throughout the entire system. The greatest single change in nuclear forces during the past twenty-five years is this shift from loose to tight coupling. (See Raushenbakh's paper in this volume for an analysis of the danger from a control theory point of view.)

Two false alerts, in 1979 and 1980, illustrate the strong interconnectedness between warning and weapons systems. In the first, an operator mistake led to the transmission of an erroneous message that the US was under nuclear attack. This information was sent to NORAD fighter bases, and ultimately ten fighters from three separate bases in the US and Canada were scrambled and sent airborne. American missile and submarine bases across the nation automatically switched to a higher level of alert.

 

"In the summer of 1914, everything functioned the way it was supposed to. There were no accidents in the usual sense of the term."

 

Several months later, in 1980, a failed chip in a minicomputer led to the transmission of a similar message to American forces. This time about a hundred B-52 bombers were readied for takeoff, as was the president's emergency aircraft. The airborne command post of the American commander in the Pacific took off from its base in Hawaii.

These incidents suggest some of the problems of a tightly coupled nuclear force and also illustrate how different nuclear forces are from conventional armies, navies, and air forces. For conventional armies, the key to survival was loose coupling. A part of the force could be sacrificed to save the whole. For nuclear forces, however, everything affects everything else. A seemingly small threat in one area, say one submarine, could wipe out much of the opponent's bomber force, or it could try to totally paralyze the opponent by destroying his national leadership and command centers - a "decapitation" strike. To protect itself, a nuclear force does the opposite of what a conventional army does. It tries to "manage" every small threat in detail by centralized direction, reliance on immediate warning, and dependence on prearranged reactions. The result is a system in which relatively small stimuli in one part produce vast reverberations throughout the rest of the system.
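A toy numerical model can make the contrast between loose and tight coupling concrete. The chain of ten nodes, the coupling constants, and the number of steps below are all invented for illustration; the sketch shows only the qualitative point that the same unit disturbance dies out when coupling is weak and reverberates and grows when coupling is strong.

```python
# Toy model, hypothetical throughout: a disturbance injected at one node of
# a chain of command/warning nodes, each of which passes `coupling` times
# its neighbours' activity along at every step.

def total_reverberation(coupling: float, nodes: int = 10, steps: int = 20) -> float:
    """Inject a unit disturbance at node 0 and return the summed activity
    across all nodes over the whole run."""
    activity = [0.0] * nodes
    activity[0] = 1.0                      # e.g. one faulty component
    total = sum(activity)
    for _ in range(steps):
        nxt = [0.0] * nodes
        for i in range(nodes):
            left = activity[i - 1] if i > 0 else 0.0
            right = activity[i + 1] if i < nodes - 1 else 0.0
            nxt[i] = coupling * (left + right)
        activity = nxt
        total += sum(activity)
    return total

print(f"loosely coupled (0.2): {total_reverberation(0.2):12.1f}")
print(f"tightly coupled (0.8): {total_reverberation(0.8):12.1f}")
```

The point is not the particular numbers but the regime change: below a threshold of coupling the disturbance damps out; above it the whole network responds to what began as a local fault.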

Such tightly coupled systems are notorious for producing overcompensation effects. A malfunctioning 46¢ computer chip initiated a chain of events thousands of miles away in Washington and Hawaii. Had the accident proceeded a bit longer, the president of the US would have had to be awakened to be told he had fourteen minutes to get out of the White House and to decide on a retaliatory plan in the event that the attack was real, and even less time to get on the Hot Line to Moscow. Nearly a hundred B-52s would have been launched to airborne positions over the Arctic, alert messages sent to ICBM crews, and warning messages sent to American military units from Korea to Germany.

The missile alert in question did not lead to such actions. But to argue that the major lesson of the NORAD missile alerts of 1979 and 1980 is that the warning system proved successful is to miss the point. They revealed a deeper, more fundamental truth about nuclear forces: They have developed into highly interdependent systems. Under peacetime conditions, the system's massive complexity does prevent isolated accidents from leading to catastrophe. This is why NORAD and other commands were able to deal safely with some fifteen hundred false alarms from 1979 through 1982. But during heightened military activity, the system is likely to become even more tightly coupled than it ordinarily is.

On a full alert, with worldwide warning and intelligence sensors flooding the headquarters with information, it is safe to say that much stronger reactive dynamics would drive the system this way and that. The institutional checks and balances that ordinarily dampen the internal overcompensation dynamics would be removed, either totally or partially, depending on the level of the alert. That, after all, is what it means to go on alert. At the highest levels of alert, the coupling might become so tight, and the checks and balances so removed, that the stability of the command system itself would be in doubt.

 

The Global Warning System

Sophisticated warning and intelligence systems have produced a tight, interactive coupling of American with Soviet forces. In certain respects, American and Soviet strategic forces have combined into a single gigantic nuclear system. A threatening military action or alert is detected almost immediately by the other side's warning and intelligence systems and conveyed to force commanders. The detected action may not have a clear meaning, but because of its possible dire consequences, protective measures must be taken against it. The action-reaction process can spiral, extending from sea-based forces to air- and land-based forces.

In addition to observing opposing forces, the American and Soviet intelligence systems now have the ability to monitor the other side's warning and intelligence systems themselves. The possibility exists that each side's warning and intelligence system could interact with the other's in unanticipated and complicated ways to produce a mutually reinforcing alert. This last possibility is not a new phenomenon; it is precisely what happened in Europe in 1914. What is new is the technology and the speed with which it could happen.

An example of mutually interacting strategic moves occurred in April 1978 when two Soviet submarines moved unusually close to the eastern coastline of the US. In such close-in positions these nuclear-missile-equipped submarines had the capability of launching attacks with minimal warning on bomber bases, command and control centers, submarine bases - and on Washington itself. Their movements were tracked by the underwater acoustic detection network operated by the US Navy.

 

"On a full alert ... the institutional checks and balances ... would be removed ... That, after all, is what it means to go on alert ... the stability of the command system itself would be in doubt."

 

The American response was to "let the Soviets know that we know" how close in they had moved. This was done by raising the alert level at several SAC bomber bases and ultimately by dispersing the aircraft to other bases. Such an action in a crisis might suggest that the bomber force was preparing to launch against the USSR. These actions were apparently detected almost immediately by Soviet electronic reconnaissance satellites or by other technical means. The Soviet submarines soon moved from their close-in positions to their usual deployments farther out in the Atlantic.

In peacetime nonalert conditions, the response to a single discrete threat can be to take a small number of precautionary moves. If Soviet nuclear submarines move unusually close to the eastern coast, then SAC bombers can be dispersed to different airfields. Similarly, the Soviets can observe that only the American bombers have become active and that other forces, such as American nuclear submarines in port, remain inactive.

But once warning and intelligence systems are stimulated beyond a certain threshold, or once a certain level of alert has been ordered by political or military authorities, the situation may alter dramatically. Tight coupling of the forces increases, information begins to inundate headquarters, and human, preprogrammed-computer, and organizational responses are invoked. Although each side might well believe it was taking necessary precautionary moves, the other side might see a precaution as a threat. This would in turn ratchet the alert level upward another notch.
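The ratchet described here can be written out as a minimal two-actor sketch. The five-step alert scale, the reaction rule, and the "crisis bias" - the number of extra levels of threat each side reads into the other's precautions - are all hypothetical; the fragment is meant only to show how the same matching behavior that settles down in calm conditions climbs to maximum alert once each side over-reads the other.

```python
# Minimal sketch of a mutually reinforcing alert spiral between two sides.
# The alert scale, the misperception bias, and the reaction rule are
# hypothetical; the point is the ratchet itself.

MAX_ALERT = 5  # hypothetical scale: 0 = normal ... 5 = maximum readiness

def escalate(alert_a: int, alert_b: int, crisis_bias: int, rounds: int = 6):
    """Each round, every side raises its alert to the level it perceives on
    the other side; `crisis_bias` is how many extra levels of threat it
    reads into the other's precautionary moves."""
    history = [(alert_a, alert_b)]
    for _ in range(rounds):
        perceived_b = min(MAX_ALERT, alert_b + crisis_bias)
        perceived_a = min(MAX_ALERT, alert_a + crisis_bias)
        alert_a = max(alert_a, perceived_b)   # ratchet: alerts never step back down
        alert_b = max(alert_b, perceived_a)
        history.append((alert_a, alert_b))
    return history

print("calm   (bias 0):", escalate(1, 0, crisis_bias=0))
print("crisis (bias 1):", escalate(1, 0, crisis_bias=1))
```

With zero bias the two sides equalize at a low level and stop; with a bias of a single level the alerts ratchet step by step to the top of the scale and stay there.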

Stated in these terms, it is difficult to imagine such a chain-reaction alert leading to nuclear war. Unfortunately, it is not that difficult to envision a political crisis leading to an alert, and the alerting process escalating until one side felt forced to disperse its nuclear weapons from their storage positions, or until conventional attacks were authorized against Soviet or American submarines patrolling near each other's coasts. It is also possible to imagine a mutual alerting process reaching the point where interference with or direct attack on satellites was undertaken, or where spontaneous evacuation of Soviet and American cities would occur for civil defense reasons.

 

"Instead of war versus peace, the decision would be seen as either striking first or striking second - precisely the dilemma faced at the outbreak of World War I."

 

Few people would disagree that nuclear forces operating at such high states of alert in this environment could easily tip over into preemptive attacks and all-out war. Each nation might not want war but might feel driven to hit first rather than second. Instead of war versus peace, the decision would be seen as either striking first or striking second - precisely the dilemma faced at the outbreak of World War I.

 

Reactions to Compound Stimuli

A 1956 example illustrates how compound warning stimuli can contribute to the false perception of danger. In early November, at the same time as the British and French attack on Suez, the Hungarian uprising was taking place. TASS, the Soviet press agency, was describing fears of worldwide nuclear war. Moscow issued a strong warning to London and Paris, and suggested to Washington that joint American-Soviet military action should be taken in Suez. This last message was received at the White House in the late afternoon of November 5.

Against this context, on the same night, the following fourfold coincidence took place. The headquarters of the US military command in Europe received a flash message that unidentified jet aircraft were flying over Turkey and that the Turkish air force had gone on alert in response. There were additional reports of a hundred Soviet MiG-15s over Syria and further reports that a British Canberra bomber had been shot down, also over Syria. (In the mid-1950s, only the Soviet MiGs had the ability to shoot down the high-flying Canberras.) Finally, there were reports that a Russian fleet was moving through the Dardanelles. This has long been considered an indicator of hostilities, because of the Soviet need to get its fleet out of the Black Sea, where it was bottled up in both world wars. The White House reaction to these events is not fully known, but reportedly General Andrew Goodpaster was afraid that the events "might trigger off all the NATO operations plan." At this time, the NATO operations plan called for all-out nuclear strikes on the USSR.

As it turned out, the "jets" over Turkey were actually a flock of swans picked up on radar and incorrectly identified, and the hundred Soviet MiGs over Syria were really a much smaller routine escort returning the president of Syria from a state visit to Moscow. The British Canberra bomber was downed by mechanical difficulty, and the Soviet fleet was engaging in long-scheduled exercises. The detection and misinterpretation of these events, against the context of world tensions from Hungary and Suez, was the first major example of how the size and complexity of worldwide electronic warning systems could, at certain critical times, create a momentum of their own.

While the fourfold compound events in the Suez incident did not lead to war, they demonstrate a dangerous feature of warning systems that cover a multiplicity of phenomena over a widespread geographic area. Turkish radars, a listening post in the Dardanelles, and communications intelligence from Syria and the USSR each contributed to a false overall picture. The simultaneity of the events, an arbitrary accident, was interpreted as evidence that they were all related.

 

"In the broadest terms, the danger facing the world is that the superpowers have institutionalized a major nuclear showdown."

 

Once again, in retrospect, it is easy to see that each warning was not a sign of attack. But in November 1956, at the time they were happening, the compound events did not seem benign. There has been a tendency for the US and the USSR to be suspicious of each other and expect the worst. When warning incidents appear simultaneously, the simultaneity itself will contribute to the belief that the situation really might be dangerous.

The warning and intelligence systems of 1956 were primitive compared with those built over the next thirty years. The warning systems improved technically. More important, both in the number of phenomena covered and in their geographic spread, the American coverage of the USSR - and the Soviet coverage of the US - has increased immensely. This trend would seem to make it more likely that simultaneous events will be picked up by warning and intelligence sensors and will, by virtue of their very simultaneity, be interpreted at headquarters as related.
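The effect of broader coverage on coincidental simultaneity can be given a rough quantitative form. The sketch below assumes, purely for illustration, that each monitored channel produces unrelated anomalies at a low constant rate, and uses a Poisson approximation to ask how likely it is that two or more anomalies land in the same one-hour window; the rates, the per-channel anomaly count, and the window are invented for the example.

```python
# Rough illustration: as the number of monitored phenomena grows, so does
# the chance that two or more unrelated anomalies fall in the same short
# window and look like a pattern. All rates are hypothetical.

import math

def p_two_or_more(channels: int, anomalies_per_channel_per_year: float,
                  window_minutes: float) -> float:
    """Poisson approximation: probability that a window of `window_minutes`
    contains at least two anomalies, given the total rate over all channels."""
    minutes_per_year = 365 * 24 * 60
    lam = channels * anomalies_per_channel_per_year * window_minutes / minutes_per_year
    return 1 - math.exp(-lam) * (1 + lam)

for channels in (10, 100, 1000):
    p = p_two_or_more(channels, anomalies_per_channel_per_year=12, window_minutes=60)
    print(f"{channels:5d} channels: P(>=2 anomalies in one hour) = {p:.2e}")
```

Under these made-up numbers, the chance of a seemingly meaningful coincidence in any given hour rises from roughly one in ten thousand with ten channels, to close to one in a hundred with a hundred channels, to roughly two in five with a thousand - the growth in coverage, not in hostility, does the work.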

 

Conclusions

The massive redundancy inherent in a system as complex as the world's nuclear forces reduces the danger of war resulting from a single technical accident. It very likely mitigates the danger of war from even a handful of such isolated stresses. When the stresses occur close together in time, the situation is a bit more dangerous. The situation becomes very dangerous, however, when the stresses occur in the midst of an international crisis. The real danger during Suez occurred because the simultaneous incidents took place during a political crisis. In a future crisis, one in which nuclear forces are placed on increased alert as a demonstration of political resolve, the warning system may have to contend with a strong random input of simultaneously threatening events. Some of the events will be part of the directed alert and some won't, but the system will not be able to discern the difference. In such a future crisis, going to high levels of alert could be a much more dangerous game than it was in the 1950s or 1960s.

In broadest terms, the danger facing the world is that the superpowers have institutionalized a major nuclear showdown. Today's complex nuclear defense system is strongly reminiscent of the institutionalized conflict mechanisms of the early twentieth century. World War I was a war waiting to happen at any time in the decade before 1914. Remarkably enough, during the very time when the general staffs of Europe were working out the interlocking mobilization programs, a feeling of security and complacency dominated popular and elite opinion. Although the war was waiting to happen, the fact that it hadn't happened was taken as a sign that all was well. Bertrand Russell tells how the absence of conflict during the Victorian era lulled people into confidently projecting peace into the indefinite future. Skirmish wars aside, they felt that no one would be so irrational as to initiate a major war.

The abrupt suddenness of World War I surprised everyone. Yet, in retrospect, almost nothing else could have occurred, given the institutionalized mobilization plans and firepower developed in the preceding decade. The same is true today.

 

 

 

 

References

* This article is adapted from Paul Bracken's book, The Command and Control of Nuclear Forces, copyright 1983 by Yale University, New Haven. The reader is referred to the book for further reading and for documentation on specific facts. Reprinted by permission.

 
