
Calculation and Computation

Words containing the roots calcul- and comput- have existed since antiquity. The study of concepts used to indicate actions, professions, and (mental and material) artifacts suggests that calculation and computation have not been, as canonically assumed, an exclusive concern of modern times. The mere existence of both word clusters throughout the decades (and centuries) prior to World War II also suggests that it may be problematic to assume that the relationship between calculation and computation has been simple; that is, that computation was an exclusively postwar phenomenon brought about by a technical revolution that left calculation behind. For Charles Steinmetz (1865–1923) and Vannevar Bush (1890–1974), celebrated pioneers of the prewar and interwar generations of electrical engineering, respectively, calculation and computation were both of foundational importance to all their technical work. Their writings indicate a belief that the computing revolution started long before the 1940s. Steinmetz and Bush employed both concepts contemporaneously in their widely used engineering textbooks in order to differentiate between high-skill analysis and low-skill application, between creative mental design and routine manual implementation, and between that which was least and that which was most subject to mechanization.



Steinmetz and Bush perceived themselves as analysts, in the tradition of Gottfried Wilhelm Leibniz (1646–1716) and Isaac Newton (1642–1727), the early modern founders of calculus. With the dynamic expansion of the division of labor that has been part and parcel of the expanding capitalist mode of production, the progress of calculation was a prerequisite for the advance of computation. Successful calculation from the top by the well-paid few would, first, routinize the work performed by the multitudes at the base of the pyramid; second, minimize the skill required of them; third, subject their work to mechanization; and fourth, lower their salary expectations. Computation was the job of a low-paid clerk known as the "computer" or "computor." These human computers were usually women, who produced computations for the state and the military, for insurance companies and other businesses, and for analysts within the engineering and scientific community. While human computers worked with a rich variety of artifacts, it was the mass employment of mechanical desktop "calculating machines" that determined their history. By contrast, the engineering graduate, almost exclusively male, sought to distance himself from the ranks of the human computers by passionately defending the accuracy of his inexpensive slide rule, which he could own individually and use skillfully to "calculate."

After the 1950s, amid popular expectation that full mechanization had finally arrived, the concept of "computer" came to connote a machine rather than a human, thereby signifying the ideological hegemony that pointed to a full separation of production from skilled labor, of accumulated (or dead) labor from living labor, and of fixed from variable capital. Instead of disappearing, the dated engineering differentiation between "analysts" and "computors" resurfaced in the struggle between systems analysts and coders that marked the emergence of programmers, computation's new professionals. The difference between computation and calculation resonates throughout the fierce competition between the digital and the analog computer (1940s–1950s), followed by the juxtaposition of digital computer hardware with software (1960s–1970s), which, in turn, was succeeded by the contrast between operating systems and special-purpose software (i.e., application software) in the 1980s and 1990s.
