
Calculation and Computation

Contemporary Period



The demand for ballistics tables beyond what available human computers could supply during World War II resulted in the construction of ENIAC (Electronic Numerical Integrator and Computer), considered by many to have been the first digital electronic computer. Employing some 18,000 vacuum tubes, it was constructed between 1943 and 1945 at the Moore School of Electrical Engineering of the University of Pennsylvania by a team led by John Mauchly (1907–1980) and J. Presper Eckert (1919–1995), with funds from the U.S. Army's Ballistics Research Laboratory. Under the direction of Howard Aiken (1900–1973), a physics instructor at Harvard University, IBM completed the construction of the Automatic Sequence Controlled Calculator (Harvard Mark I) in 1944; it was used to produce ballistics tables for the U.S. Navy. A wealth of detail exists about the 1940s and 1950s, the period of the heroes of electronic computation, which has attracted the disproportionate attention of historians. At the very least, this attention has made it clear that the electronic computer was a "multiple invention," brought about by social causes rather than by individual genius.



The construction of ENIAC was roughly contemporaneous with several other projects. Aiken's efforts at reconfiguring IBM's accounting machines for the purpose of solving nonlinear differential equations went back to 1937. The Bell Labs employee George Stibitz constructed a series of machines based on relays and other telephonic equipment, including the Complex Number Calculator, which was used successfully by the military between 1940 and 1949. In Germany, starting in 1938 and continuing through the war, Konrad Zuse (1910–1995) constructed a series of machines that were also based on electromechanical relays. Iowa State College physics professor John Atanasoff (1903–1995) and his graduate student Clifford Berry (1918–1963) built a special-purpose electronic computer, the ABC (Atanasoff-Berry Computer), between 1939 and 1942. Ranging from the big to the gigantic, these machines inaugurated experimentation with the correspondence between circuitry and numerical computation, in either the decimal or the binary system. Claude Shannon is the most recognizable of those who argued for such a correspondence theoretically, by showing that the design of switching circuits and the reduction of reasoning to a series of simple algebraic operations (as proposed by Boolean algebra) could be used to push each other forward.
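
Shannon's correspondence can be made concrete with a minimal sketch, not drawn from the source: a one-bit "half adder" in which two Boolean operations stand in for two simple switching circuits, showing how arithmetic on digits reduces to circuit logic.

```python
# Illustrative sketch (hypothetical, not from the source): Shannon's
# correspondence between Boolean algebra and switching circuits, shown as a
# one-bit "half adder". XOR and AND play the roles of two relay/switch circuits.

def half_adder(a: int, b: int):
    """Add two one-bit numbers; return (sum_bit, carry_bit)."""
    sum_bit = a ^ b    # XOR: 1 when exactly one input is 1
    carry_bit = a & b  # AND: 1 when both inputs are 1
    return sum_bit, carry_bit

# Enumerate the truth table: binary arithmetic reduces to Boolean operations.
for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} = carry {c}, sum {s}")
```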

Having participated in the work of the Moore School team, the Princeton mathematician John von Neumann (1903–1957) shaped the following generation of electronic computers by setting the standard for a computer architecture based on an internal division of labor: a control unit interacted with the memory to direct the flow of data between the memory and the arithmetic unit, while also controlling the input and output. His division between an arithmetic unit, where any future calculation was to take place (by taking into account selected past computations from a stock accumulated in the memory), and the memory itself imported into the workings of the machine the dynamic balance between living and dead labor of the capitalist economy as a whole; the balance during the dynamic self-accumulation of past computations in the memory was to be provided by the control unit. The accumulation of data and instructions in the memory unit became known as the "stored-program technique." It represented an economy of flexible allocation of resources, the opposite of the brute-force approach of the previous generation of electronic computers. Von Neumann presented his architecture in a 1945 report on EDVAC, a computer intended to follow ENIAC. The architecture was rehearsed in the construction of the IAS computer (named after Princeton's Institute for Advanced Study), completed in 1952 by a team led by von Neumann at Princeton. Like most experimental computers of the period, the IAS was funded by sources such as the military and the Atomic Energy Commission. Similar machines were constructed at seventeen other research centers in the United States and at several more in various other countries.
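
The stored-program idea can itself be sketched in a few lines of code. The three-part instruction set below is hypothetical, not that of EDVAC or the IAS machine: instructions and data share a single memory, a control loop fetches and decodes each instruction, and arithmetic is carried out on an accumulator.

```python
# Minimal sketch of the stored-program (von Neumann) idea. The opcodes and
# memory layout are hypothetical, not any historical machine's. Instructions
# and data live in the same memory; the control loop fetches and decodes,
# handing arithmetic off to an "arithmetic unit" (plain Python operations).

memory = [
    ("LOAD", 8),    # 0: accumulator <- memory[8]
    ("ADD", 9),     # 1: accumulator += memory[9]
    ("STORE", 10),  # 2: memory[10] <- accumulator
    ("HALT", 0),    # 3: stop
    0, 0, 0, 0,     # 4-7: unused cells
    2, 3,           # 8-9: data operands
    0,              # 10: result cell
]

acc, pc = 0, 0               # accumulator and program counter (control unit state)
while True:
    op, addr = memory[pc]    # fetch and decode the next instruction
    pc += 1
    if op == "LOAD":
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]  # the arithmetic unit at work
    elif op == "STORE":
        memory[addr] = acc   # past computations accumulate in the shared memory
    elif op == "HALT":
        break

print(memory[10])  # -> 5
```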

Interested in commercial rather than scientific computers, Eckert and Mauchly tried a series of intermediate computer configurations and business schemes before the production of the Remington Rand UNIVAC (Universal Automatic Computer), the first of which was delivered to the Census Bureau in 1951. The first UNIVAC for business use was installed at the General Electric Appliance Division in 1954, for payroll computations. By 1957, forty-six UNIVACs had been sold. Along with several other manufacturers, IBM entered the electronic computer business in the early 1950s. It started with the 1951 IAS-type Defense Calculator, which was renamed the IBM 701; IBM constructed and rented nineteen such machines. By 1960, IBM dominated the market with machines such as the IBM 650, usually rented to universities on attractive terms, which subsequently tied university computation to IBM. IBM's dominance was solidified by its introduction, in the mid-1960s, of the standard-setting System/360 family of computers.

By then the analog-digital debate, which had started in the late 1940s and escalated in the early 1950s, was practically over. The evolution of MIT's Project Whirlwind between 1945 and 1953, under the direction of Jay Forrester, captured the emergence of the analog-digital demarcation. Intended for use in real-time aircraft flight simulation, Whirlwind started as an analog machine. Upon learning about the EDVAC, Forrester decided to attempt to construct a digital computer instead. His costly change of course was supported, initially, by the U.S. Office of Naval Research with approximately one million dollars per year. When the Navy gave up, the Air Force stepped into the void, hoping that the digital Whirlwind could lead to a machine suitable to the needs of SAGE (Semi-Automatic Ground Environment), a system to coordinate the detection of and response to the Soviet Union's strategic bombers. The pursuit of SAGE brought about enormous demands for programming, thereby revealing the dependence of such systems on computer software. It started to become apparent that the analog-digital contrast was being succeeded by a contrast between hardware and software.

In the 1940s, there is no record of anyone who foresaw a market for more than a few mainframes. Nor, during the 1960s, when the time-sharing of mainframes suggested a future of computer utilities, is there a record of anyone who predicted that the future of computation lay elsewhere. Patented in 1959 by Jack Kilby of Texas Instruments, the integrated circuit contained all the elements of an electronic circuit on a chip of silicon. The microprocessor appeared a decade later as a general-purpose programmable integrated circuit. The cheapening of hardware and the miniaturization of electronic components made it possible to shrink computers from the size of mainframes to that of minicomputers. The potential for reducing the computer further, to the size of a home appliance, was realized in the subsequent decades, resulting in the mass production and use of personal computers during the 1980s and in the mass interconnection of personal computers that led to the formation of the Internet and the World Wide Web during the 1990s. In the meantime, microprocessors have been installed everywhere from home appliances to automobiles.

The decrease in the value of hardware accentuated the increase in the value of software. What ended as a "software crisis" started as a "programmer shortage." Generations of general- and special-purpose programming languages, and, by now, of operating systems, have yet to provide a stable solution. Attempts at computer-aided programming (machine self-programming) and software engineering (the mass production of software) have met with limited success, if not complete failure. From eliminating the programming "bugs" that clogged the early electronic computers to blocking the so-called spam that clogs contemporary e-mail, computation seems to have increased rather than decreased the dependence on skilled labor. In the absence of such labor, all sorts of computer viruses threaten the contemporary world with catastrophe. Many of the world's inhabitants anxiously anticipated living through the completion of a millennium in the transition from 1999 to 2000. The anxiety surrounding this event, which became known as Y2K, stemmed from decades of the labor-saving yet short-sighted practice of storing years as two digits in order to make electronic computations economical.
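
A minimal sketch, not taken from any particular legacy system, of why two-digit years caused trouble: once years are stored as two digits, the rollover from "99" to "00" makes simple date arithmetic go wrong.

```python
# Illustrative sketch of the Y2K problem (hypothetical code, not from any
# actual legacy system): with two-digit years, 2000 is indistinguishable
# from 1900, so subtraction-based date logic breaks at the rollover.

def age_in_years(birth_yy: int, current_yy: int) -> int:
    """Two-digit-year arithmetic, as many legacy programs did it."""
    return current_yy - birth_yy

# A person born in 1970, computed in 1999 ("99") and then in 2000 ("00"):
print(age_in_years(70, 99))  # 29  -- correct
print(age_in_years(70, 0))   # -70 -- the 99 -> 00 rollover breaks the logic
```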

BIBLIOGRAPHY

Abbate, Janet. Inventing the Internet. Cambridge, Mass.: MIT Press, 1999.

Aspray, William, ed. Computing before Computers. Ames: Iowa State University Press, 1990.

Beniger, James R. The Control Revolution: Technological and Economic Origins of the Information Society. Cambridge, Mass.: Harvard University Press, 1986.

Black, Edwin. IBM and the Holocaust: The Strategic Alliance between Nazi Germany and America's Most Powerful Corporation. New York: Crown, 2001.

Blok, Aad, and Greg Downey, eds. Uncovering Labour in Information Revolutions, 1750–2000. Cambridge, U.K., and New York: Cambridge University Press, 2003.

Borst, Arno. The Ordering of Time: From the Ancient Computus to the Modern Computer. Translated by Andrew Winnard. Chicago: University of Chicago Press, 1994.

Campbell-Kelly, Martin. From Airline Reservations to Sonic the Hedgehog: A History of the Software Industry. Cambridge, Mass.: MIT Press, 2003.

Campbell-Kelly, Martin, et al. The History of Mathematical Tables: From Sumer to Spreadsheets. Oxford and New York: Oxford University Press, 2003.

Campbell-Kelly, Martin, and William Aspray. Computer: A History of the Information Machine. New York: Basic Books, 1996.

Ceruzzi, Paul. A History of Modern Computing. 2nd ed. London and Cambridge, Mass.: MIT Press, 2003.

Cortada, James. Before the Computer: IBM, NCR, Burroughs, and Remington Rand and the Industry They Created, 1865–1956. Princeton, N.J.: Princeton University Press, 1993.

Edwards, Paul. The Closed World: Computers and the Politics of Discourse in Cold War America. Cambridge, Mass.: MIT Press, 1996.

Hopp, Peter M. Slide Rules: Their History, Models and Makers. Mendham, N.J.: Astragal Press, 1999.

Jezierski, Dieter von. Slide Rules: A Journey through Three Centuries. Translated by Rodger Shepherd. Mendham, N.J.: Astragal Press, 2000.

Kline, Ronald. Steinmetz: Engineer and Socialist. Baltimore: Johns Hopkins University Press, 1992.

Lubar, Steven. InfoCulture: The Smithsonian Book of Information Age Inventions. Boston, Mass.: Houghton Mifflin, 1993.

MacKenzie, Donald. Knowing Machines: Essays on Technical Change. Cambridge, Mass.: MIT Press, 1996.

McFarland, Stephen L. America's Pursuit of Precision Bombing, 1910–1945. Washington, D.C.: Smithsonian Institution Press, 1995.

Menninger, Karl. Number Words and Number Symbols: A Cultural History of Numbers. Translated by Paul Broneer. Cambridge, Mass.: MIT Press, 1969.

Mindell, David A. Between Human and Machine: Feedback, Control, and Computing Before Cybernetics. Baltimore: Johns Hopkins University Press, 2002.

Nebeker, Frederik. Calculating the Weather: Meteorology in the 20th Century. San Diego, Calif.: Academic Press, 1995.

Small, James S. The Analogue Alternative: The Electronic Analogue Computer in Britain and the USA, 1930–1975. London and New York: Routledge, 2001.

Spufford, Francis, and Jenny Uglow, eds. Cultural Babbage: Technology, Time and Invention. London: Faber, 1996.

Williams, Michael R. A History of Computing Technology. 2nd ed. Los Alamitos, Calif.: IEEE Computer Society, 1997.

Yates, JoAnne. Control through Communication: The Rise of System in American Management. Baltimore: Johns Hopkins University Press, 1989.

Zachary, G. Pascal. Endless Frontier: Vannevar Bush, Engineer of the American Century. New York: Free Press, 1997.

Aristotle Tympas
