PRINTED FROM OXFORD REFERENCE. © Oxford University Press, 2013.

military, the, and technology

Science, Technology, and Society

Chris Hables Gray

military, the, and technology.

The relationship between technology and war is particularly intimate. Humans need weapons to be efficient killers. It is hard to imagine a battle without at least stones and clubs—crude technologies, but effective at close quarters. Some scholars postulate that the demand for better war weapons drove the development of technologies such as metallurgy and shipbuilding and also led to the invention of bureaucracy to manage war. Lewis Mumford (1970) argued that ancient armies were the first machines, bringing masses of tools and complicated processes together in regimented action. Perhaps cities, irrigation systems, or large rituals were actually the first machines, but the evidence is compelling that even during ritual wars before the founding of cities (civilization), the cult of the weapon was a powerful technological impetus and that the better weapons (including protective armor) were often decisive in victory. For all this, ritual and even ancient war were basically conservative technologically. The blades of weapons name the slow transitions—Stone Age to Bronze Age to Iron Age to age of steel.

The rise of city states and their organization of battle transformed war from semi-religious skirmish and kidnap to the most horrible political instrument humans have invented. Despite sophisticated early theories about war such as the unsurpassed Chinese collection The Art of War, ascribed to Sun Tzu, most wars were about brute strength in numbers, well-made weapons, and the skill of certain elite aristocratic troops such as chariots and cavalry. Later, the lethal value of slingers, archers, and heavy infantry, often citizens, was crucial. In any event, almost all battles ended with butchery, including sieges.

Butchery is a precise way of describing war, as the historian John Keegan revealed in his masterful 1976 history of the experience of soldiers: The Face of Battle. Cut down by the blades on chariot wheels, stabbed and trampled by phalanx or horse, shattered by explosion, ripped by gunfire, seared by napalm—war is always about mutilating and destroying human bodies.

This explains the persistence of magical and emotional thinking around war. Honor, élan, fate, courage, and divine intervention were important factors in both ritual and ancient war, all the more so since technological change was slow and intermittent. The armies of ancient Egypt could have fought on roughly equal technological terms with any army of the next 2,500 years. In the 1500s, however, reason and experiment were fostered, and soon they were turned to the ballistics and navigation problems of war. War began to be rationalized; it became “modern,” and 500 years later industrialized slaughter (as in the assembly-line deaths of the Western Front, submarines, the Holocaust, and strategic bombing) had reached epic proportions. War was still emotional, but the emotion was sublimated through an instrumentalist rationality into constant improvement in military technology.

Modern War

In many respects modern war not only was made possible by the technological breakthroughs of the Renaissance and after; in turn it largely shaped the European nation-state system that politically structures the world today. The influential John Nef (1963) even argued that human technological progress was almost completely driven by war, but most historians argue that the relationship is more complex. Sometimes war or the fear of war produces new technologies; sometimes it adapts old ones. New technical developments sometimes lead to military applications and even new doctrines; sometimes not. Often ideology about the place of technology in war precedes the changes in war itself.

In the 1500s the Italian political operator and philosopher Niccolò Machiavelli called for the application of rationality to make war more politically useful through total battle. This is exactly what came to pass. Aristocracy gave way to meritocracy, and the pace of weapons development accelerated remarkably. The historian Geoffrey Parker (1988) and others rightly ascribe the spread of European dominance around the world to military innovation, although it was often not the weapons but the way they were utilized that was decisive.

As modern war developed and matured, so did the modern nation-state, industrial society and science, and European and North American colonialism. The modern world system coevolved with modern war. Military innovation was institutionalized, culminating during World War II in a plethora of technological innovations including radar, sonar, computers, missiles, jet airplanes, and nuclear bombs.

Whole strategies were built on new technologies (as in the case of blitzkrieg) or were influenced by them. In particular, the breaking of the Japanese and German codes using new systems, including computing machines, was decisive.

Modern war, however, fell victim to its own success. The nightmare of World War I did not merely end the Romanov and Ottoman empires in Russia and the Middle East; it also shattered the near consensus in Western culture that war was natural. At the heart of this new view that war was not inevitable was the argument that, whatever its value had been in the past, contemporary war had been so transformed by technology that heroism and any other martial virtues were now rendered impossible or moot. For many, the technological efficiency of it—its deadliness beyond human scale—meant that war was now politically and morally unacceptable.

This view has continued to gain strength over the intervening decades, as military technologies have far surpassed those of World War I. After the Vietnam War it came to be held by a surprising number of veterans, officers, and enlisted men as well as military historians and experts.

But another stance toward technology and war has also thrived under the rapid changes of the twentieth century: a love for the technology. A professor of American literature, Bruce Franklin (1988), has described how, over the years, many U.S. policy makers have been attracted to superweapons as a solution to the problem of war. Robert Fulton, Thomas Edison, and other great inventors were deeply involved in military developments, seeking technological solutions to the problems of battle. Even though President Lincoln had to intervene personally to get the Spencer repeating rifle adopted during the Civil War, over time the U.S. military became more and more proactive technologically, fostering the combat development of trains, balloons, telegraph, machine guns, accurate artillery, steel battleships, and airplanes.

After World War I, the majority of military leaders agreed that new technology was profoundly important. Reflecting on how machine guns and artillery had cut down a generation of European manhood, generals and politicians alike vowed never to be “technologically surprised” again. War became technophilic, in love with any new or potential weapon.

World War I introduced tanks, planes, radios, trucks, and submarines to total war. These technologies proved to be incredibly important in World War II, along with such new “wizard weapons” as radar, missiles, and computers. Finally, at the height of World War II, weapons of mass destruction (WMD) were developed, destroying modern war's core logic. Total war was no longer possible. Some weapons, at least in some forms, became for most people too horrible to use.

The Cold War that followed was replete with paradoxes. Its stability depended on mutually assured destruction (MAD), the promise that a superpower would use these horrible weapons in retaliation. It apparently was enough to keep conflict to bloody but “limited” wars such as Korea and Vietnam. But these were strange wars of stalemate and even defeat for the formerly dominant European and North American powers. War was no longer modern.

Postmodern War

So what should war be called now? There is no real agreement. Over fifty terms have been used to relabel war since the end of World War II. Among the most interesting are “permanent war,” “pure war,” “perfect war,” “postmodern war,” “high-technology war,” “technological war,” “technowar,” “cyberwar,” “computer war,” “high modern war,” “hypermodern war,” “third-wave war,” “net war,” “information war,” “info war,” “iwar,” “hyper-real war,” “neocortical warfare,” “sixth-generation war,” “fourth-epoch war,” and “nonheroic war.” Some of these are official terms for war, but most are from war theory. What links them all is that technology is central to their definitions. It has become impossible to think of war without the relentless perpetual changes in military technology.

Historians of war have noted that for the last six hundred years there have been a series of war-changing technological jumps, termed “revolutions in military affairs” (RMA). They include the infantry revolution (fourteenth century), the artillery revolution (fifteenth century), the revolution of sail and shot at sea (fourteenth to seventeenth century, a slow one), the fortress revolution (sixteenth century), the gunpowder revolution (sixteenth and seventeenth centuries), the Napoleonic Revolution (eighteenth century), the land war revolution (nineteenth century), the naval revolution (nineteenth and twentieth centuries), and then a series of revolutions during and between the World Wars (mechanization, aviation, information) and the nuclear and electronics revolutions afterward.

The earlier technological revolutions were separated by large blocks of time, and often took place over centuries. Since the start of World War I in 1914, RMAs have come often and suddenly. It makes more sense now to talk of a permanent revolution in military affairs. Innovation is constant, especially in the realm of information technology and such specialties as remote weapons and battlespace informatics.

To produce these new generations of weapons and other military technology, the industrial economies were militarized in constant preparation for apocalyptic and more limited conflicts. In both the West and the Communist countries the military parts of the economy were favored, such that President Dwight Eisenhower even warned of the dangers posed by the “military-industrial complex” to American democracy.

This system has helped shape the world economy. The first transistors, integrated circuits, and computers were all developed in the favorable climate of massive U.S. military purchases and cost-plus contracts. The Internet itself began as a military research project to improve communication between military and civilian researchers working on Department of Defense contracts and to test the usability of a distributed nonhierarchical network. Such networks were deemed necessary to control nuclear weapons in the case of total war, when large parts of any command and control system were expected to be obliterated.

Fortunately, the command and control system for nuclear conflict was not tested in war. But some analysts credit the technological arms race of the Cold War, along with other factors, for eventually forcing the Soviet Union into collapse.

As reason and technology justified the MAD nuclear policy, it was assumed that they could also solve the problems of limited war. The real test was Vietnam. Financial support by the United States for France's attempt to reassert its colonial domination of Indochina turned into direct military intervention to prevent a Communist victory. Confident of its technological superiority, the United States set out to wage the “perfect war” (Gibson 1986). Lip service was paid to winning the hearts and minds of the Vietnamese, but faith was put into “vertical envelopment” (helicopter attacks), the electronic battlefield, precision bombing (jets and drones), superior firepower, and “McNamara's Wall” (named after Secretary of Defense Robert Strange McNamara), a high-tech death zone intended to stop the flow of weapons along the Ho Chi Minh Trail between North and South Vietnam.

A Harvard Business School professor with expertise in bargaining theory, McNamara joined the U.S. Army Air Forces strategic bombing campaign during World War II to do systems analysis (also called operations analysis and operations research), calculating casualty rates, pilot tours of duty, and plane replacements, and solving other problems. After the war, he marketed himself and a team of such experts to Ford Motor Company, where they “rationalized” its procedures. Later, when President Kennedy appointed McNamara as Secretary of Defense, most of his old team came with him to the Pentagon.

As in business, the theory of systems analysis is that institutions today are combinations of humans and machines and that the flows of information, material, and energy between them can be closely controlled, usually through information technology. Operations can also be successfully managed this way. However, it turned out that systems and technologies could not bring the United States victory in Vietnam. Instead, the postmodern era's theories of asymmetric war seemed to apply. Because they used appropriate technology and kept the political aspects of the war (popular support, especially in Vietnam and the United States) at the center, the North Vietnamese outlasted the United States.

Even though the Vietnam War was lost, the commitment of the U.S. military to systems and to new technologies as a way of dominating the battlespace has, if anything, grown stronger. It is still producing technological innovation in every aspect of the military. Logistics, uniforms, training, weapons, and weapon platforms are all in never-ending cycles of improvement. Everything possible is being done to improve both the automation of weapons and the efficiency of human-machine systems.

Information technology represents a particularly important part of this dynamic. In World War II it was generally accepted that “force of fire,” the amount of metal and explosives you could hurl at your enemy, was the most important military advantage. But now many of the most forceful weapons cannot even be used. So there has been a growing emphasis on collecting information and using it. Officially, information is now the main force multiplier for NATO, and this is a doctrine that is accepted in Russia, China, and Israel as well.

The importance of information processing (collecting, evaluating, using) had become so evident that in the 1990s there was a flurry of wild claims of a special kind of information (or info or cyber or net) war. But such pure information war is a myth. In reality war continues to be about killing and the terror it evokes.

War in the Twenty-first Century

Technology will certainly continue to shape war profoundly. It is erasing old distinctions between nonstate and state combatants, because one does not need a government now to kill thousands. Terror and war also are bleeding together; human-weapon suicide bombers, nuclear deterrence, and cruise missiles all involve terror. The WMDs that ended modern war are particularly important. Continued improvements in biotechnology, materials science, rocketry, and other areas mean that the proliferation of nuclear, biological, and chemical weapons is probably inevitable. This has already contributed to fundamental changes in the international system, including regional alliances, international legal frameworks, and the concept of pre-emptive war. Constraining WMDs will be very difficult.

The history of limiting military technological developments is not encouraging. In the Middle Ages there were attempts by the Catholic Church to outlaw the use of crossbows against Christian enemies (infidel or heretical enemies not being expected to have any reason to respect the ban), but it did not last. More successful was samurai Japan, which limited the use of firearms for over 300 years (1543–1879). Both of these cases involved attempts to stabilize a cultural order threatened by new military technologies, and Japan was an exception in its isolation. In general, military innovation has been difficult to suppress because, in a multipolar world, it only takes one adopter to force everyone to the next level—hence, arms races.

Sometimes such competitions are disguised, such as the race to put a man on the moon. It is clear now, especially in light of the Strategic Defense Initiative (SDI) missile defense system of President Reagan and the limited missile defense system of President George W. Bush, that the overall goal of the Soviet, U.S., European, Chinese, and Israeli space programs has been to use, or even dominate, space militarily. Space assets for surveillance and communications have become increasingly important in war. Plans for space-based weapons platforms are in development, and in the United States, for example, the Space Forces are expected to become the latest branch of the military.

The debates about SDI did have a very important unforeseen effect. Many scientists and engineers came forward with their opinions about the unworkability of the designs for the gigantic, automatic system meant to stop a Soviet missile attack. While there were problems with the physics and general engineering, its computer problems were particularly notable. SDI would have been the largest computer system in history, and even simulating it adequately for testing would have required a comparably large system. In debates on SDI's technical aspects it became clear that there were severe limits on what military systems dependent on information technology could accomplish. These limits rest on results such as Gödel's incompleteness theorems (any sufficiently powerful consistent formal system contains true statements it cannot prove) and Turing's proof of the undecidability of the halting problem, which shows that such limits hold even for idealized computers with unlimited resources.

When one puts this into the context of war itself, that complex, violent, unfathomable experience that involves triumphing not just over nature but over a human opponent, it becomes clear that managing war is impossible. Battle systems that purport to dominate war without casualties (for at least one side) will always fail unless the opponent is incredibly weak.

This is a key feature of the strange geography of the twenty-first century international system. Supposedly there is one superpower, the United States. But because it is so dependent on high-technology approaches to combat that severely limit U.S. casualties, the United States is actually quite constrained in its potential military actions, just at a time when it seems more threatened than ever by nuclear, biological, or chemical attacks, by conventional military weapons, or simply by hijacked civilian high technology. So the United States has made a massive commitment to new weapons, continual innovation, and carefully chosen interventions.

The Cold War system seems to be continuing with somewhat new alignments and with much less stability than the Soviet-West stalemate. Meanwhile, the military establishments of all the industrial powers, from China to NATO, rush the latest technologies and their associated doctrines into action while researchers explore the military potential of dozens of new areas, including nanotechnology, biotechnology, lasers, small nuclear munitions, anti-gravity systems, sound, noxious smells, intimate human-machine weapon systems (cyborgs), intelligent autonomous weapons, and psychological operations. But the history of technology and war strongly suggests that any technological solutions will be short-lived. Political problems, in the long run, are not solved by new weapons, and the problems of the twenty-first century cannot be solved by war. In fact, the high-technology war/terror that has developed may be the most important problem of all.

See also Biological Terrorism


Bibliography

Brodie, Bernard, and Fawn Brodie. From Crossbow to H-Bomb. Bloomington: Indiana University Press, 1973. A fine, readable account of the long history of technology and war.

Dyer, Gwynne. War. New York: Crown Publishers, 1985. Graphically rich and historically clear history of war with due emphasis on the growing importance of technology. Linked to a fine television documentary available on video.

Edwards, Paul. The Closed World: Computers and the Politics of Discourse in Cold War America. Cambridge, MA: MIT Press, 1996. Charts in detail the origins of military computerization and the influence it has had on American science, especially psychology, and culture.

Eksteins, Modris. Rites of Spring: The Great War and the Birth of the Modern Age. Boston: Houghton Mifflin, 1989. A beautiful and horrific description of how the technologically driven carnage of the Western Front transformed twentieth-century culture.

Ellis, John. The Social History of the Machine Gun. New York: Pantheon Books, 1973. Gripping account of the macabre dance between machine gun technology, modern war (including colonial conquest and class conflict), and Western civilization.

Franklin, Bruce. War Stars: The Superweapon and the American Imagination. Oxford: Oxford University Press, 1988.

Gibson, James. The Perfect War: Technowar in Vietnam. Boston: Atlantic Monthly Press, 1986.

Gray, Chris Hables. Postmodern War: The New Politics of Conflict. New York/London: Guilford Press, 1997. An extended analysis of contemporary war with an emphasis on how technology has shaped its major aspects.

Keegan, John. The Face of Battle. London: Penguin, 1976. This book revolutionized the history of war by focusing on the actual experience of combat.

Keegan, John. The Mask of Command. New York: Viking, 1987. Argues convincingly that changes in war technology have fundamentally changed the nature of military leadership.

Krepinevich, Andrew. “Cavalry to Computer: The Pattern of Military Revolutions.” National Interest (Fall 1994). Reprinted in Foreign Policy (Winter 1998): 82–83.

Machiavelli, Niccolò. The Art of War (1520). Translated by Ellis Farnsworth. New York: Da Capo Press, 1990.

Mumford, Lewis. The Myth of the Machine. Vol. 1: Technics and Human Development. New York: Harcourt, Brace & World, 1967.

Mumford, Lewis. The Myth of the Machine. Vol. 2: The Pentagon of Power. New York: Harcourt Brace Jovanovich, 1970.

Nef, John. War and Human Progress. New York: W. W. Norton, 1963.

Parker, Geoffrey. The Military Revolution: Military Innovation and the Rise of the West, 1500–1800. Cambridge, U.K.: Cambridge University Press, 1988.

Scarry, Elaine. The Body in Pain. Oxford: Oxford University Press, 1985.

Sherry, Michael. The Rise of American Air Power: The Creation of Armageddon. New Haven, CT: Yale University Press, 1987. Sobering analysis of the Allied strategic bombing campaign of World War II.

Sun Tzu. The Art of War. Translated by Samuel B. Griffith. Oxford: Oxford University Press, 1962.

Van Creveld, Martin. Technology and War: From 2000 B.C. to the Present. New York: Free Press, 1989.

Zulaika, Joseba, and William A. Douglass. Terror and Taboo: The Follies, Labels, and Faces of Terrorism. New York: Routledge, 1997. Examines the many different definitions of terrorism and the politics behind them.

                                            Chris Hables Gray