
Digital Computer



A digital computer is a programmable device that processes information by manipulating symbols according to logical rules. Digital computers come in a wide variety of types, ranging from tiny, special-purpose devices embedded in cars and other machines to the familiar desktop computer, the minicomputer, the mainframe, and the supercomputer. The fastest supercomputer, as of early 2003, can execute up to 36 trillion instructions (elementary computational operations) per second; this record is certain to be broken. The impact of the digital computer on society has been tremendous. It is used to run everything from spacecraft to factories, healthcare systems to telecommunications, banks to household budgets. Since its invention during World War II, the electronic digital computer has become essential to the economies of the developed world.



The story of how the digital computer evolved goes back beyond the calculating machines of the 1600s to the pebbles (in Latin, calculi) that the merchants of imperial Rome used for counting, and to the abacus of the fifth century B.C. Although the earliest devices could not perform calculations automatically, they were useful in a world where mathematical calculations, laboriously performed by human beings in their heads or on paper, tended to be riddled with errors. Like writing itself, mechanical aids to calculation such as the abacus may have first developed to make business easier and more profitable to transact.

By the early 1800s, with the Industrial Revolution well under way, errors in mathematical data had assumed new importance; faulty navigational tables, for example, were the cause of frequent shipwrecks. Such errors were a source of irritation to Charles Babbage (1792–1871), a young English mathematician. Convinced that a machine could do mathematical calculations faster and more accurately than humans, Babbage, in 1822, produced a small working model of what he called his "difference engine." The difference engine's arithmetic was limited, but it could compile and print mathematical tables with no more human intervention than a hand to turn the handles at the top of the device. Although the British government was impressed enough to invest £17,000 in the construction of a full-scale difference engine—a sum equivalent to millions of dollars in today's money—it was never built. The project came to a halt in 1833 in a dispute over payments between Babbage and his workmen.
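The name refers to the mathematical method of finite differences, which reduces the tabulation of polynomial values to repeated addition, the one operation a machine of gears and wheels could perform reliably. The short sketch below, written in Python purely for illustration (the function names and the sample polynomial are illustrative choices, not Babbage's), shows the principle on a modern machine:

```python
# Minimal sketch of the method of finite differences, the principle behind
# Babbage's difference engine: after a few exact values seed the table,
# every further entry of a polynomial needs only repeated addition.
# The polynomial and function names below are illustrative, not Babbage's.

def leading_differences(values):
    """Return f(x0) together with its successive leading differences."""
    row = list(values)
    diffs = [row[0]]
    while len(row) > 1:
        row = [b - a for a, b in zip(row, row[1:])]
        diffs.append(row[0])
    return diffs

def tabulate(diffs, count):
    """Produce `count` table entries using additions only."""
    diffs = list(diffs)
    out = []
    for _ in range(count):
        out.append(diffs[0])
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]   # carry each difference upward by addition
    return out

if __name__ == "__main__":
    def f(x):
        return x * x + x + 41              # a sample degree-2 polynomial
    seed = [f(x) for x in range(3)]        # degree + 1 exact starting values
    print(tabulate(leading_differences(seed), 8))
    # [41, 43, 47, 53, 61, 71, 83, 97] == [f(0), ..., f(7)]
```

Once the first few values seed the difference table, every later entry falls out of additions alone, which is what allowed the engine to print long tables with no more human intervention than a hand turning its handles.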

By that time, Babbage had already started to work on an improved version—the analytical engine, a programmable machine that could perform all types of arithmetic functions. The analytical engine had all the essential parts of the modern computer: a means of entering a program of instructions, a memory, a central processing unit, and a means of outputting results. For input and programming, Babbage used punched cards, an idea borrowed from French inventor Joseph Jacquard (1757–1834), who had used them in his revolutionary weaving loom in 1801.

Although the analytical engine has gone down in history as the prototype of the modern computer, a full-scale version was never built. Among the obstacles were lack of funding and manufacturing methods that lagged well behind Babbage's vision.

Less than 20 years after Babbage's death, an American by the name of Herman Hollerith (1860–1929) was able to make use of a new technology, electricity, when he submitted to the United States government a plan for a machine that could compute census data. Hollerith's electromechanical device tabulated the results of the 1890 U.S. census in less than six weeks, a dramatic improvement over the seven years it had taken to tabulate the results of the 1880 census. Hollerith went on to found the company that ultimately emerged as International Business Machines Corporation (IBM).

World War II was the driving force behind the next significant stage in the evolution of the digital computer: greater complexity, greater programmability, and greater speed through the replacement of moving parts by electronic devices. These advances were made in designing the Colossus, a special-purpose electronic computer built by the British to decipher German codes; the Mark I, a gigantic electromechanical device constructed at Harvard University under the direction of U.S. mathematician Howard Aiken (1900–1973); and the ENIAC, a large, fully electronic machine that was faster than the Mark I. Built at the University of Pennsylvania under the direction of U.S. engineers John Mauchly (1907–1980) and J. Presper Eckert (1919–1995), the ENIAC employed some 18,000 vacuum tubes.

[Image caption: A possible future direction for computer technology is the optical computer. The Bit-Serial Optical Computer (BSOC) shown here is the first computer that both stores and manipulates data and instructions as pulses of light. To enable this, the designers developed a bit-serial architecture: each binary digit is represented by a pulse of infrared laser light 13 ft (4 m) long, and the pulses circulate sequentially through a tightly wound, 2.5-mile-long (4-km-long) loop of optical fiber some 50,000 times per second. Other laser beams operate lithium niobate optical switches, which perform the data processing. This computer was developed by Harry Jordan and Vincent Heuring at the University of Colorado and was unveiled on January 12, 1993.]

The ENIAC was general-purpose in principle, but to switch from one program to another meant that a part of the machine had to be disassembled and rewired. To avoid this tedious process, John von Neumann (1903–1957), a Hungarian-born American mathematician, proposed the concept of the stored program—that is, the technique of coding the program in the same way as the stored data and keeping it in the computer's own memory for as long as needed. The computer could then be instructed to change programs, and programs could even be written to interact with each other. For coding, von Neumann proposed using the binary numbering system, which uses only 0 and 1, as opposed to the decimal system, which uses the ten digits 0 through 9. Because 0 and 1 can readily be symbolized by the "on" or "off" states of a switched electric current, computer design was greatly simplified.
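To make the coding scheme concrete, the short sketch below (written in Python purely for illustration; the function name and sample value are not from the source) rewrites a decimal number as the string of 1s and 0s that would correspond to "on" and "off" switch states:

```python
# Illustrative sketch: rewriting a decimal number in binary, the notation
# von Neumann proposed, where each digit maps to an "on" (1) or "off" (0)
# switch state. The function name and example value are illustrative only.

def to_binary(n: int) -> str:
    """Convert a non-negative decimal integer to its binary digit string."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))   # remainder 0 or 1 becomes the next bit
        n //= 2                   # repeated halving walks up the powers of two
    return "".join(reversed(bits))

print(to_binary(19))   # "10011": 16 + 2 + 1
print(bin(19))         # Python's built-in check gives "0b10011"
```

Reading the result from right to left, each digit records whether a particular power of two, and hence a particular switch, is on or off.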

Von Neumann's concepts were incorporated in the first generation of large computers that followed in the late 1940s and 1950s. All these machines were dinosaurs by today's standards, but it was in them that all the essential design principles on which today's billions of digital devices operate were worked out.

The digital computer is termed "digital" to distinguish it from the analog computer. Digital computers manipulate symbols—not necessarily digits, despite the name—while analog computers manipulate electronic signals or other physical phenomena that act as models or analogs of various other phenomena (or mathematical variables). Today, the word "computer" has come to be effectively synonymous with "digital computer," due to the rarity of analog computation.

Although all practical computer development to date has obeyed the principles of binary logic laid down by von Neumann and the other pioneers, and these principles are sure to remain standard in digital devices for the near future, much research has focused in recent years on quantum computers. Such devices will exploit properties of matter that differ fundamentally from the on-off, yes-no logic of conventional digital computers.

Resources

Books

Lee, Sunggu. Design of Computers and Other Complex Digital Devices. Upper Saddle River, NJ: Prentice Hall, 2000.

White, Ron, and Timothy Downs. How Computers Work. 6th ed. Indianapolis, IN: Que Publishers, 2001.

Other

Associated Press. "Study: Japan Has Fastest Supercomputer." December 2002 (cited January 5, 2003). <http://www.govtech.net/news/news.phtml?docid=2002.11.15-30715>.
