
computer science

Source: The Oxford Companion to the History of Modern Science
Author: Jonathan P. Bowen

computer science. 

Computer science is the study of the principles and use of devices that process and store data, usually digital, according to instructions in the form of a program.

Before the existence of modern computers, people who performed calculations manually were known as “computers.” The term “computer science,” signifying a particular combination of applied mathematics (particularly logic and set theory) and engineering (normally electronic), first occurred as the name of a university department at Purdue (U.S.) in 1962. The two key areas of development have been hardware (the computers themselves) and, more recently, software (the intangible programs that run on them).

Mechanical computing devices long preceded electronic ones. The earliest known mechanical adding machine, the creation of the German inventor Wilhelm Schickard, dates from 1621. The mechanical calculators created by Blaise Pascal (1623–1662) and Gottfried Leibniz (1646–1716) received wider attention than Schickard's machine, which fell into oblivion, and formed part of the intellectual inheritance of Charles Babbage (1792–1871), often celebrated as a pioneer of computing. Alarmed by the number of mistakes in hand-computed mathematical tables, Babbage invented the Difference Engine and subsequently the Analytical Engine, which had many of the features of a modern computer. Its “mill” (the gears and wheels that performed the arithmetical operations) corresponded to a modern central processing unit (CPU), and its “store” was a mechanical memory for reading and writing numerical values. Ada Lovelace (1815–1852), the daughter of Lord Byron, provided the earliest comprehensive description of this first programmable computer, based in part on a paper by the Italian Luigi Menabrea (1809–1896). Babbage never completed the Analytical Engine, which would have stretched the cogwheel machinery of the time to its limits.

Leibniz was the first mathematician thoroughly to study the binary system, upon which all modern digital computers are based. George Boole presented what became known as Boolean algebra or logic in his masterwork of 1854, An Investigation of the Laws of Thought. Boole's laws can be used to formalize binary computer circuits. Later, David Hilbert (1862–1943) argued that in an axiomatic logical system all propositions could be proved or disproved, but Kurt Gödel (1906–1978) demonstrated otherwise, with important implications for the theory of computability. Propositional and predicate logic, together with set theory, as formulated by Ernst Zermelo (1871–1953) and Adolf Fraenkel (1891–1965) among others, provide important underpinnings for computer science.
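
By way of illustration, the following minimal Python sketch (the function name half_adder is illustrative only) expresses a half-adder, the simplest binary adding circuit, in terms of the Boolean operations that Boole's laws formalize.

```python
# A half-adder built from Boolean operations: XOR gives the sum bit,
# AND gives the carry bit.
def half_adder(a, b):
    """Add two single bits, returning (sum bit, carry bit)."""
    return a ^ b, a & b

for a in (0, 1):
    for b in (0, 1):
        print(a, "+", b, "->", half_adder(a, b))
```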

Analog computers use continuous rather than discrete digital values. They enjoyed some success, for systems of related variables expressed in equational form, before digital technology became established. Vannevar Bush devised the successful Differential Analyzer for solving differential equations at the Massachusetts Institute of Technology during the 1930s. Later he wrote a seminal article, As We May Think (1945), that predicted some of the features of the World Wide Web, illustrating a very broad appreciation of computer science.

In 1936, the English mathematician Alan Turing, influenced by Gödel, devised a theoretical “universal machine,” later known as a Turing machine, that helped to define the limits of possible computation on any computing machine. Turing had a practical as well as a theoretical bent. His wartime work at Bletchley Park contributed to the code-breaking effort that produced Colossus, which made possible the breaking of German codes during World War II (see Cryptography). Although Colossus has a claim to be the first modern digital computer, it had little influence since it remained secret for several decades. Turing subsequently worked on the design of the Pilot ACE computer at the National Physical Laboratory and on the programming of the Manchester Mark I at the University of Manchester, both successful postwar British computers. Turing's standing in modern computing is indicated by the Turing Award, computer science's highest distinction, given annually since 1966 by the Association for Computing Machinery (ACM), the subject's foremost professional body, founded in 1947.
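
The idea can be made concrete with a minimal sketch of a Turing machine in Python: a table of rules reads and writes symbols on an unbounded tape, here simply flipping every bit of its input before halting. The rule format and names are illustrative, not Turing's own notation.

```python
# A tiny Turing machine simulator: each rule maps (state, symbol read)
# to (next state, symbol written, head movement).
def run(tape, rules, state="flip", blank="_"):
    tape = dict(enumerate(tape))          # sparse tape indexed by position
    pos = 0
    while state != "halt":
        symbol = tape.get(pos, blank)
        state, write, move = rules[(state, symbol)]
        tape[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape))

rules = {
    ("flip", "0"): ("flip", "1", "R"),
    ("flip", "1"): ("flip", "0", "R"),
    ("flip", "_"): ("halt", "_", "R"),    # halt on reaching the blank
}
print(run("01011", rules))                # prints 10100_
```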

John Atanasoff built what may have been the first electronic digital computer, the Atanasoff–Berry Computer, prototyped in 1939 and made functional in 1940. It influenced John Mauchly, who with J. Presper Eckert constructed the famous ENIAC at the Moore School of Electrical Engineering in Philadelphia between 1943 and 1945. EDVAC, the first U.S.-built stored-program computer, followed in 1951. Maurice Wilkes attended a summer program at the Moore School in 1946, returned to the University of Cambridge in England, and completed the EDSAC in 1949. Its first run, on 6 May 1949, made it the world's first practical electronic stored-program computer. The Lyons company copied much of EDSAC to produce the first commercial data-processing computer, the LEO (Lyons Electronic Office), in 1951. Other important early computer pioneers include Konrad Zuse, who worked independently on mechanical and relay-based machines incorporating floating-point numbers, producing the Z1–Z4 models in Berlin between 1936 and 1945. Significant U.S. engineers include Howard Aiken, who developed the electromechanical calculator Harvard Mark I, launched in August 1944, and George Stibitz, who demonstrated remote job entry in September 1940 by communicating between Dartmouth College in New Hampshire and his Model 1, first operational in 1939 and located in New York. Aiken established programming courses at Harvard long before the university computer science courses of the 1960s.

Programming facilities for early computers initially operated at the binary level of zeros and ones. Assemblers allowed instructions to be entered in mnemonic form, though still matching the machine code very closely. Higher-level programming, less dependent on the machine, requires a compiler, itself a program run on a computer, to translate programs into machine code. Noam Chomsky provided influential formal characterizations of grammars for languages, including programming languages, in the late 1950s and early 1960s.
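
What an assembler does can be suggested by a minimal Python sketch that translates mnemonics into numeric opcodes; the instruction set here is purely hypothetical.

```python
# A toy assembler: each mnemonic line becomes an (opcode, operand) pair.
OPCODES = {"LOAD": 1, "ADD": 2, "STORE": 3, "HALT": 4}

def assemble(source):
    program = []
    for line in source.strip().splitlines():
        parts = line.split()
        mnemonic = parts[0]
        operand = int(parts[1]) if len(parts) > 1 else 0
        program.append((OPCODES[mnemonic], operand))
    return program

print(assemble("LOAD 7\nADD 3\nSTORE 12\nHALT"))
# [(1, 7), (2, 3), (3, 12), (4, 0)]
```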

Early high-level programming languages for scientific and engineering applications included Fortran, developed between 1954 and 1957 by John Backus and others at IBM in New York City. Backus also devised Backus Normal Form (BNF), later known as Backus–Naur Form, for the formal description of the syntax of programming languages. Successive versions of Fortran have kept it in use. COBOL was another important programming language, developed for business applications in the late 1950s. U.S. Navy Captain Grace Hopper played a key part in its creation. ALGOL, the first programming language described using BNF, included in its 1960 version important new features such as block structuring, parameter passing by name or value, and recursive procedures that greatly influenced subsequent programming languages.
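
A flavor of how a BNF grammar guides the processing of programs is given by the following minimal Python sketch, in which each rule of a small, illustrative expression grammar becomes one recursive function (recursive-descent parsing).

```python
# Illustrative grammar (single-digit operands):
#   <expr>   ::= <term> "+" <expr> | <term>
#   <term>   ::= <factor> "*" <term> | <factor>
#   <factor> ::= a single digit
def parse(text):
    pos = 0

    def expr():
        nonlocal pos
        value = term()
        if pos < len(text) and text[pos] == "+":
            pos += 1
            value += expr()
        return value

    def term():
        nonlocal pos
        value = factor()
        if pos < len(text) and text[pos] == "*":
            pos += 1
            value *= term()
        return value

    def factor():
        nonlocal pos
        digit = int(text[pos])
        pos += 1
        return digit

    return expr()

print(parse("2+3*4"))   # 14: "*" binds tighter than "+", as the grammar dictates
```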

Pascal, designed by Niklaus Wirth in Zurich between 1968 and 1970, embodied the concepts of structured programming espoused by Edsger Dijkstra and C. A. R. Hoare. Its simplicity suited it for educational purposes as well as for practical commercial use. Wirth went on to develop Modula-2 and Oberon and is widely considered the world's foremost designer of programming languages. Ada was developed in the 1970s for U.S. military applications; it lies at the opposite end of the scale of complexity from Pascal.

Dennis Ritchie created C as a general-purpose procedural language. It served as the basis for the highly influential Unix operating system, developed by Ritchie and Kenneth Thompson at Bell Laboratories in New Jersey; C itself was later extended there into C++. C++ encourages information hiding, as suggested by David Parnas, or encapsulation within “objects” considered as instances of classes, a technique first used in the SIMULA language, produced by Ole-Johan Dahl and others in 1967. The highly successful language Java, designed as a portable object-oriented programming language for distributed applications, dates from the early 1990s.
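
The idea of encapsulation can be suggested by a minimal Python sketch (the BankAccount class is illustrative only): data is held inside an object and reached only through the methods of its class, in the style pioneered by SIMULA and adopted by C++ and Java.

```python
class BankAccount:
    def __init__(self, balance=0):
        self._balance = balance          # internal state, hidden by convention

    def deposit(self, amount):
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self._balance += amount

    def balance(self):
        return self._balance             # controlled, read-only access

account = BankAccount()
account.deposit(100)
print(account.balance())                 # 100
```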

The languages considered so far are imperative, executing instructions in a prescribed order. Some higher-level languages, such as LISP (developed at MIT in the late 1950s and early 1960s), widely used in artificial intelligence, express computations in the form of mathematical functions and so reduce the importance of the ordering of execution. Logic programming, as in Prolog (1970s), is a relational approach that admits nondeterministic answers. An extension, constraint logic programming, allows the convenient inclusion of extra conditions on variables.
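
The contrast between ordered instructions and function-based computation can be suggested by a minimal Python sketch computing the same sum of squares in both styles.

```python
from functools import reduce

numbers = [1, 2, 3, 4, 5]

# Imperative: explicit ordering and a mutable accumulator.
total = 0
for n in numbers:
    total += n * n

# Functional: the same value as a composition of functions on values,
# with no mutable state and no dependence on statement order.
total_fn = reduce(lambda acc, x: acc + x, map(lambda n: n * n, numbers), 0)

print(total, total_fn)   # 55 55
```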

A von Neumann machine, related to but not identical with the theoretical Turing machine, is the standard arrangement of the early sequential computers, with a central processing unit and a memory holding both program and data; it is still widely used. However, parallel architectures have become increasingly important as computer-processing power presses against physical limits. Hardware has evolved through valves (vacuum tubes), solid-state transistors, and integrated circuits of increasing complexity.
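
The von Neumann arrangement can be suggested by a minimal Python sketch with a hypothetical instruction set: program and data share one memory, and the processor repeatedly fetches, decodes, and executes the instruction addressed by a program counter.

```python
def run(memory):
    acc, pc = 0, 0                       # accumulator and program counter
    while True:
        op, arg = memory[pc]             # fetch and decode
        pc += 1
        if op == "LOAD":
            acc = memory[arg]            # read data from shared memory
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc            # write result back to memory
        elif op == "HALT":
            return memory

memory = [("LOAD", 5), ("ADD", 6), ("STORE", 7), ("HALT", 0),  # program
          None, 2, 3, None]                                    # data
print(run(memory)[7])                    # 5
```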

Theoretical underpinnings for computer science include the definition of computability embodied in the Turing machine, the λ-calculus of Alonzo Church (1903–1995), and recursive functions. Complexity theory aids reasoning about the efficiency of computation. Other theoretical computer science subdisciplines include automata theory, computational geometry (important for computer graphics), graph theory, and formal languages.
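
Church's central insight, that numbers and arithmetic can be encoded purely as functions, can be suggested by a minimal sketch using Python's lambda expressions (a loose rendering of the λ-calculus, not Church's own notation).

```python
# Church numerals: the numeral n is "apply a function n times".
zero = lambda f: lambda x: x
successor = lambda n: lambda f: lambda x: f(n(f)(x))
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Convert a Church numeral to an ordinary integer."""
    return n(lambda k: k + 1)(0)

one = successor(zero)
two = successor(one)
print(to_int(add(two)(two)))   # 4
```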

Software engineering encompasses the process of producing programs from requirements and specifications via a design process. Dijkstra, from the Netherlands, has been a major contributor to the field. His influential 1968 paper, Go To Statement Considered Harmful, led to the acceptance in the 1970s of structured programming, a term he coined, in which abstraction is encouraged in the design process and program constructs are limited to make reasoning about programs easier. Dijkstra, Dahl, and Hoare wrote the widely read Structured Programming (1972). Hoare has also made important contributions to formal reasoning about programs using assertions, to sorting algorithms, and to the formalization of concurrency. Donald Knuth of Stanford University has been a major innovator in computer algorithms. His multivolume and still unfinished magnum opus, The Art of Computer Programming, is one of the best-known and most influential books in computer science. He has contributed especially to parsing, sorting, and searching algorithms, all important computer science techniques. Like all good computer scientists, he has expertise in both theory and practice. As well as making major theoretical contributions, he produced the TeX document preparation system, still widely used by computer scientists internationally for the production of books and papers.
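
As one example of the algorithms mentioned above, Hoare's quicksort can be sketched minimally in Python: choose a pivot, partition the remaining items around it, and sort the two parts recursively (written for clarity rather than efficiency).

```python
def quicksort(items):
    if len(items) <= 1:
        return items
    pivot, rest = items[0], items[1:]
    smaller = [x for x in rest if x < pivot]
    larger = [x for x in rest if x >= pivot]
    return quicksort(smaller) + [pivot] + quicksort(larger)

print(quicksort([5, 3, 8, 1, 9, 2]))   # [1, 2, 3, 5, 8, 9]
```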

Artificial intelligence (AI) has held out huge promises that have been slower to mature than expected. Major contributions have been made by John McCarthy, latterly at Stanford University, and Marvin Minsky at MIT. Important aspects of AI include automated reasoning, computer vision, decision making, expert systems, machine learning, natural language processing, pattern recognition, planning, problem solving, and robot control. A successful outcome of the Turing test, in which the responses of a computer are essentially indistinguishable from those of a human, has proved elusive in practice unless the knowledge domain is very limited. Connectionism, using massively parallel systems, has opened up newer areas for machine learning, such as neural networks (loosely modeled on the workings of the brain); genetic algorithms (inspired by Darwin's theory of evolution) provide another biologically inspired approach.

Databases are an important means of storing, organizing, and retrieving information. The Briton Edgar Codd created the relational model for databases in the late 1960s and early 1970s at the IBM Research Laboratory in San Jose, California. The two important categories of database objects are “entities” (items to be modeled) and “relationships” (connections between the entities), for which a good underlying theory has been established.

Communication has become as significant as computation in computing. Claude Shannon provided an important theoretical foundation in his paper of 1948, A Mathematical Theory of Communication, and contributed to both network theory and data compression. Donald Davies of the National Physical Laboratory and others developed packet switching in the 1960s, the basis of the Internet, which originated in 1969 and was known as the ARPAnet for many years. More recently, the expansion of the Internet has made possible the proliferation of the World Wide Web (WWW), a distributed information system devised in the early 1990s by the British scientist Tim Berners-Lee at CERN in Switzerland. His insight combined a number of key principles: a standard network-wide naming convention for use by hyperlinks in traversing information; a simple but extensible markup language to record the information; and an efficient transfer protocol for the transmission of this information between a server and a client.
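
The relational model can be suggested by a minimal sketch using Python's built-in sqlite3 module, in which entities become rows in tables and a relationship is expressed by a key joining them; the table and column names and the sample data are illustrative only.

```python
import sqlite3

db = sqlite3.connect(":memory:")
# Two entity tables; the author_id column expresses the relationship.
db.execute("CREATE TABLE author (id INTEGER PRIMARY KEY, name TEXT)")
db.execute("CREATE TABLE paper (title TEXT, author_id INTEGER REFERENCES author(id))")
db.execute("INSERT INTO author VALUES (1, 'E. F. Codd')")
db.execute("INSERT INTO paper VALUES ('A Relational Model of Data', 1)")

# A join recovers the relationship between the two entities.
for name, title in db.execute(
        "SELECT author.name, paper.title FROM author JOIN paper "
        "ON paper.author_id = author.id"):
    print(name, "-", title)
```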

Computer science has had an enormous social impact in recent years. Yet it is still a relatively young and perhaps immature science. Quantum computers offer the possibility of removing some of the stumbling blocks encountered today and could in theory render useless many of the data security mechanisms currently in place. The future of computer science looks even more interesting than its past.

Bibliography

J. A. N. Lee, ed., International Biographical Dictionary of Computer Pioneers (1995).

Valerie Illingworth, ed., A Dictionary of Computing, 4th ed. (1996).

Martin Campbell-Kelly and William Aspray, Computer: A History of the Information Machine (1997).

David Harel, Computers Ltd.: What They Really Can't Do (2000).

Mark W. Greenia, History of Computing: An Encyclopedia of the People and Machines that Made Computer History, CD-ROM (2001).

Raúl Rojas, ed., Encyclopedia of Computers and Computer History, 2 vols. (2001).

IEEE Annals of the History of Computing, available online at http://www.computer.org/annals.

The Virtual Museum of Computing, available online at vmoc.museophile.com.

                Jonathan P. Bowen