Language and Linguistics

Other Voices

Although structuralism and TGL were respectively paramount in the first and second halves of the twentieth century, this is by no means to say that competing approaches were lacking. During the period when structuralism dominated linguistics, other interesting approaches to language proliferated. One that caught the popular imagination was that of general semantics, a philosophical movement originated by the Polish-American philosopher and mathematician Alfred Korzybski (1879–1950). Korzybski, who once famously declared that "The map is not the territory," called for a heightened awareness of the conventional relationship between words and the things to which they refer. It was his intention to promote clear thought (to free human beings from the "tyranny of words," as one enthusiast put it) and thereby to improve systems of communication. In his view, language does not directly reflect reality; indeed, the structure of language may be said to distort our perception of reality. This deficiency can be remedied by insight into the nature of mundane language and, further, by the creation of more refined language that is structured in the same way reality is. Korzybski's fundamental ideas are spelled out in Science and Sanity: An Introduction to Non-Aristotelian Systems and General Semantics (1933). General semantics was further popularized by S. I. Hayakawa (1906–1992), whose Language in Thought and Action (1949) has been a bestseller for decades. After serving in the United States Senate from 1977 to 1983, Hayakawa founded U.S. English, Inc., which is dedicated to making English the official language of the United States.

Also showing how politics and linguistics can become tightly intertwined is a nonstructuralist school of a very different sort. Marrism, founded in the 1920s by the Soviet archeologist and linguist Nikolai Y. Marr (1865–1934), was quintessentially Marxist in holding that all linguistic phenomena are purely a reflection of economic functions and social forces (superstructure). Marr considered Caucasian the proto-language of Europe (the so-called Japhetic theory), which oddly coincided with the German racialist theories of Johann Friedrich Blumenbach (1752–1840). Joseph Stalin (1879–1953), who incidentally had much to say about language, put an end to Marr's influence on Soviet linguistics when in 1950 he repudiated the superstructure theory of language, declaring that language belongs neither to the economic base nor to the superstructure.

Politics aside, there were plenty of other nonaligned linguistic practitioners during this period, both in Europe and in America. Bringing together the philological exactitude of Antoine Meillet (1866–1936) with the conceptual grandeur of Georges Dumézil (1898–1986), the French scholar Émile Benveniste (1902–1976) was the author of the redoubtable Le vocabulaire des institutions indo-européennes (1969; English trans. Indo-European Language and Society, 1973). In it, employing what has been referred to as ethnosemantics (or ethnographic semantics), he strove to "elucidate the genesis" of the vocabulary of Indo-European institutions in six fundamental realms: economy, kinship, society (status), authority (especially royalty and its prerogatives), law, and religion. Benveniste was particularly interested in the religious doctrines of the Indo-Europeans.

J. R. Firth (1890–1960) was one of the chief founders of linguistics in Great Britain. He held the first chair in general linguistics in England, which was established at the School of Oriental and African Studies of the University of London in 1944. Firth is noted for his development of prosodic phonology and for his insistence on analyzing both sound and meaning in context. Furthermore, Firth held that no single system of analytical principles and categorization could adequately account for language; different systems are required for different situations. Unlike most theorists, but very much like Benveniste, Firth recognized the importance of religion for the history of linguistics. Himself an Orientalist, Firth acknowledged the great merit of early Indian grammars (the first in the world) and the tremendous significance of Sanskrit for understanding the development of Indo-European. At the same time, he admitted that the classical grammarians (Panini for Sanskrit, Dionysius Thrax for Greek, Priscian for Latin) were not concerned with the vernacular. Firth is thus also one of the few major linguistic theoreticians aware of the great gulf between classical and vernacular languages, a subject that awaits future research.

One of Firth's outstanding students, M. A. K. Halliday (1925–2018), countered Chomsky on many points, including the central concept of competence, against which Halliday adduces the notion of "meaning potential." This is defined in terms of culture, to which Halliday is unusually sensitive, not mind. He possesses a keen sense of the social functions of language and of its acquisition in the lived experience of childhood. Halliday developed a grammatical theory according to which language is viewed as an intersecting set of categories and scales operating at different levels. He is also responsible for creating systemic functional grammar, which is particularly well adapted to non-Indo-European languages.

In the United States, one of the most estimable twentieth-century linguists whose work lay outside of both structuralism and TGL is Kenneth Lee Pike (1912–2000). His Language in Relation to a Unified Theory of the Structure of Human Behavior (1967) is a massive, ambitious tome. In keeping with the combative atmosphere pervading the discipline, Pike speaks of the "battle ground" of language study and his determined efforts to promote thoroughgoing changes in language theory. Pike's theoretical work (he is also celebrated for his achievements in applied linguistics) derives from an attempt to describe empirical data drawn from a literally worldwide range of languages in the absence of a satisfactory grounding in contemporary linguistic theory. His search for a theoretical basis that would permit him to analyze and make sense of this vast body of empirical data led him to develop his brand of tagmemics. (A tagmeme is normally defined as the smallest functional grammatical element of a language. It is parallel in usage to the morpheme [the smallest functional lexical unit of a language] and the phoneme [the smallest functional phonological unit of a language].)

Pike's tagmemic approach differs from mainstream American linguistics in various technical respects, but above all in its complexity. A key feature of Pike's thinking about language is that he abandons the Saussurian distinction between langue and parole. He did so because the large amounts of material that he and his collaborators collected showed that speech itself is highly patterned and regular, a property that mainstream theory normally reserved for formal, written language. As elaborated by Pike, tagmemics remained an important branch of American structuralism, but he distanced himself from other leading linguists of his day in striving to describe linguistic regularities in accord with sociocultural behavior instead of abstract models. Part of the methodology of tagmemics was determined by the sheer necessity of the chief task that its practitioners faced: translating the Bible into previously unwritten, unresearched, "esoteric" languages. Pike went further, combining tagmemes to form syntagmemes and thus enabling advanced syntactic analysis. Pike's most profound and far-reaching contribution to the history of ideas, however, is his application of etic and emic analysis to linguistic research. This distinction between the material and functional study of language had an enormous impact upon anthropology and other fields, albeit often in poorly understood and badly distorted guises.

After a couple of decades in which theoretical research reigned supreme, the restoration of empirical studies of language was furthered in the 1970s and 1980s with the inauguration of discourse studies. A landmark in this development is Strategies of Discourse Comprehension (1983), co-authored by Teun van Dijk, a linguist, and Walter Kintsch, a psychologist. A salient feature of their work is its interdisciplinary quality, requiring linguistic and computer analysis of texts, experiments in psychology laboratories, sociological field studies, and so forth. They also relied on literary scholarship, classical poetics and rhetoric, Russian formalism, and Czech structuralism, as well as sociolinguistics, ethnography, and folklore studies. All of these approaches were integrated under the umbrella of "the wide new field of cognitive science."

A linguistic loner who has had a remarkable impact on the classification of languages is Joseph H. Greenberg (1915–2001). Greenberg started out as a language typologist. Language typology identifies ideal types (for example, agglutinative, inflectional, and isolating, though many other characteristics must also be taken into account) and proceeds to group individual languages under these categories. Greenberg's fame rests in part on his seminal contributions to synchronic linguistics and his indefatigable quest to identify language universals. His typological approach contrasts with that of genetic classification, which is premised on delineating the development of languages from older precursors. Greenberg was always collecting data, which he copied down in countless notebooks. Known as a "lumper" (as opposed to a "splitter") par excellence, in 1955 Greenberg reduced more than 1,500 African languages to just four supergroups. Later he would ascribe all of the indigenous languages of the Americas to just three main waves of migrants, whereas they had formerly been grouped into hundreds of families. Greenberg achieved these nearly miraculous feats through the application of what he styled mass lexical comparison or multilateral comparison. Mainstream linguists were outraged, with one of the most distinguished among them publicly calling for Greenberg to be "shouted down." At stake were sacrosanct issues of methodology relating to phonology, etymology, and other vital components of linguistics. Undaunted, Greenberg dedicated the last years of his life to the study of Eurasiatic, a proposed macrofamily embracing a great many of the languages of Europe and northern Asia (though not certain isolates); it resembled earlier proposals for Nostratic, minus that grouping's African members.

In linguistics and language studies, writing is often overlooked. When attention is devoted to writing, it is usually minimized as secondary to speech. The Akkadian specialist I. J. Gelb (1907–1985) aimed to lay the foundations for a new science of writing that he called grammatology. This approach would not be merely descriptive, as were earlier histories of writing. In his classic work, entitled humbly and plainly A Study of Writing (1952), Gelb attempted to establish general principles governing the use and evolution of written forms of language through comparative and typological analysis. His is the first, and still the only, work to present a universal theory of all known writing systems. Gelb was able to achieve this considerable synthesis by distinguishing clearly between forerunners of writing and writing proper, and by distinguishing further between word-syllabic systems, syllabaries, and alphabets.

In the opinion of those who are involved in computational linguistics, the most important development since the 1980s has been to resurrect the use of statistical methods for analyzing distributional evidence. This general approach was pioneered by Zellig Harris in the early 1950s, but starting around 1955, his student Chomsky simultaneously cast doubt on the viability of such methods and presented a different vision of how to proceed, employing a more axiomatic approach based on explorations in formal language theory. The "cybernetic underground" began skirmishes in the engineering hinterlands during the 1980s and took over computational linguistics entirely by the 1990s. By the early twenty-first century, psycholinguistics had largely succumbed, though there are pockets of resistance. Plain or unhyphenated linguistics is increasingly influenced by statistical methods, in both its methodology and its empirical techniques.
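The flavor of such distributional analysis can be conveyed with a minimal sketch: counting which words co-occur in a corpus and measuring, via pointwise mutual information, how much more often two words appear together than chance would predict. The toy corpus, the sentence-level notion of co-occurrence, and the function names below are illustrative inventions, not drawn from any particular system described above.

```python
import math
from collections import Counter
from itertools import combinations

# A toy corpus; real distributional work uses millions of words.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "a cat and a dog played",
]

tokens = [sentence.split() for sentence in corpus]

# Unigram counts and sentence-level co-occurrence counts.
word_counts = Counter(w for sent in tokens for w in sent)
pair_counts = Counter()
for sent in tokens:
    for w1, w2 in combinations(sorted(set(sent)), 2):
        pair_counts[(w1, w2)] += 1

total_words = sum(word_counts.values())
total_pairs = sum(pair_counts.values())

def pmi(w1, w2):
    """Pointwise mutual information: log of how much more often
    w1 and w2 co-occur than their independent frequencies predict."""
    key = tuple(sorted((w1, w2)))
    if pair_counts[key] == 0:
        return float("-inf")  # never observed together
    p_pair = pair_counts[key] / total_pairs
    p1 = word_counts[w1] / total_words
    p2 = word_counts[w2] / total_words
    return math.log2(p_pair / (p1 * p2))

print(pmi("cat", "sat"))  # positive: the pair co-occurs above chance
print(pmi("cat", "rug"))  # -inf: never observed in the same sentence
```

Even this crude statistic recovers distributional regularities without any hand-written grammar, which is precisely the appeal of the corpus-based methods described above.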

The phonetician Mark Liberman is responsible for building gigantic corpora of data that are used to address both theoretical issues and practical problems of great importance, such as automatic speech recognition. Many of the brightest minds in linguistics are now laboring quietly at the task of figuring out how to enable human beings and machines to talk to each other. One of the leading theoreticians engaged in this area of research is Roland Hausser, whose Foundations of Computational Linguistics: Man-Machine Communication in Natural Language (1999) offers a prescient look at what the future holds in store with regard to the human-machine interface.

One of the most exciting new realms of investigation in historical linguistics is the application of genetics. According to Luigi Luca Cavalli-Sforza, one of the leading researchers in this field, the genes of modern populations contain a record of the human species stretching back 100,000 years. What is more, conclusions drawn from the study of modern genetic material are now being corroborated by direct recourse to ancient DNA. It is striking that genetic and linguistic trees match each other closely, and archeological data provide further confirmation of the movements and intricate interrelationships of ancient peoples.
