Noam Chomsky: Life and Basic Ideas


Noam Chomsky

American linguist
WRITTEN BY
James A. McGilvray
Emeritus Professor of Philosophy, McGill University. Editor
of The Cambridge Companion to Chomsky; author of Chomsky:
Language, Mind, and Politics.
Alternative Title: Avram Noam Chomsky
Noam Chomsky, in full Avram Noam Chomsky, (born
December 7, 1928, Philadelphia, Pennsylvania, U.S.), American
theoretical linguist whose work from the 1950s revolutionized the
field of linguistics by treating language as a uniquely human,
biologically based cognitive capacity. Through his contributions to
linguistics and related fields, including cognitive psychology and
the philosophies of mind and language, Chomsky helped to initiate
and sustain what came to be known as the “cognitive revolution.”
Chomsky also gained a worldwide following as a political dissident
for his analyses of the pernicious influence of economic elites on
U.S. domestic politics, foreign policy, and intellectual culture.


Life And Basic Ideas


Born into a middle-class Jewish family, Chomsky attended an
experimental elementary school in which he was encouraged to
develop his own interests and talents through self-directed
learning. When he was 10 years old, he wrote an editorial for his
school newspaper lamenting the fall of Barcelona in the Spanish
Civil War and the rise of fascism in Europe. His research then and
during the next few years was thorough enough to serve decades
later as the basis of “Objectivity and Liberal Scholarship” (1969),
Chomsky’s critical review of a study of the period by the historian
Gabriel Jackson.

When he was 13 years old, Chomsky began taking trips by himself
to New York City, where he found books for his voracious reading
habit and made contact with a thriving working-class Jewish
intellectual community. Discussion enriched and confirmed the
beliefs that would underlie his political views throughout his life:
that all people are capable of comprehending political and
economic issues and making their own decisions on that basis;
that all people need and derive satisfaction from acting freely and
creatively and from associating with others; and that authority—
whether political, economic, or religious—that cannot meet a
strong test of rational justification is illegitimate. According to
Chomsky’s anarchosyndicalism, or libertarian socialism, the best
form of political organization is one in which all people have a
maximal opportunity to engage in cooperative activity with others
and to take part in all decisions of the community that affect them.

In 1945, at the age of 16, Chomsky entered the University of
Pennsylvania but found little to interest him. After two years he
considered leaving the university to pursue his political interests,
perhaps by living on a kibbutz. He changed his mind, however,
after meeting the linguist Zellig S. Harris, one of the American
founders of structural linguistics, whose political convictions were
similar to Chomsky’s. Chomsky took graduate courses with Harris
and, at Harris’s recommendation, studied philosophy with Nelson
Goodman and Morton White and mathematics with Nathan
Fine, who was then teaching at Harvard University. In his 1951
master’s thesis, The Morphophonemics of Modern Hebrew, and
especially in The Logical Structure of Linguistic Theory (LSLT),
written while he was a junior fellow at Harvard (1951–55) and
published in part in 1975, Chomsky adopted aspects of Harris’s
approach to the study of language and of Goodman’s views
on formal systems and the philosophy of science and transformed
them into something novel.

Whereas Goodman assumed that the mind at birth is largely
a tabula rasa (blank slate) and that language learning in children is
essentially a conditioned response to linguistic stimuli, Chomsky
held that the basic principles of all languages, as well as the basic
range of concepts they are used to express, are innately
represented in the human mind and that language learning
consists of the unconscious construction of a grammar from these
principles in accordance with cues drawn from the child’s
linguistic environment. Whereas Harris thought of the study of
language as the taxonomic classification of “data,” Chomsky held
that it is the discovery, through the application of formal systems,
of the innate principles that make possible the swift acquisition of
language by children and the ordinary use of language by children
and adults alike. And whereas Goodman believed that linguistic
behaviour is regular and caused (in the sense of being a specific
response to specific stimuli), Chomsky argued that it is incited by
social context and discourse context but essentially uncaused—
enabled by a distinct set of innate principles but innovative, or
“creative.” It is for this reason that Chomsky believed that it is
unlikely that there will ever be a full-fledged science of linguistic
behaviour. As in the view of the 17th-century French
philosopher René Descartes, according to Chomsky, the use of
language is due to a “creative principle,” not a causal one.

Harris ignored Chomsky’s work, and Goodman—when he realized
that Chomsky would not accept his behaviourism—denounced it.
Their reactions, with some variations, were shared by a large
majority of linguists, philosophers, and psychologists. Although
some linguists and psychologists eventually came to accept
Chomsky’s basic assumptions regarding language and the mind,
most philosophers continued to resist them.
Chomsky received a Ph.D. in linguistics from the University of
Pennsylvania in 1955 after submitting one chapter of LSLT as a
doctoral dissertation (Transformational Analysis). In 1956 he was
appointed by the Massachusetts Institute of Technology (MIT) to a
teaching position that required him to spend half his time on a
machine translation project, though he was openly skeptical of its
prospects for success (he told the director of the translation
laboratory that the project was of “no intellectual interest and was
also pointless”). Impressed with his book Syntactic
Structures (1957), a revised version of a series of lectures he gave
to MIT undergraduates, the university asked Chomsky and his
colleague Morris Halle to establish a new graduate program in
linguistics, which soon attracted several outstanding scholars,
including Robert Lees, Jerry Fodor, Jerrold Katz, and Paul Postal.

Chomsky’s 1959 review of Verbal Behavior, by B.F. Skinner, the
dean of American behaviourism, came to be regarded as the
definitive refutation of behaviourist accounts of language learning.
Starting in the mid-1960s, with the publication of Aspects of the
Theory of Syntax (1965) and Cartesian Linguistics (1966),
Chomsky’s approach to the study of language and mind gained
wider acceptance within linguistics, though there were many
theoretical variations within the paradigm. Chomsky was
appointed full professor at MIT in 1961, Ferrari P. Ward Professor
of Modern Languages and Linguistics in 1966, and Institute
Professor in 1976. He retired as professor emeritus in 2002.


Linguistics
“Plato’s problem”
A fundamental insight of philosophical rationalism is that human
creativity crucially depends on an innate system of concept
generation and combination. According to Chomsky, children
display “ordinary” creativity—appropriate and innovative use of
complexes of concepts—from virtually their first words. With
language, they bring to bear thousands of rich
and articulate concepts when they play, invent, and speak to and
understand each other. They seem to know much more than they
have been taught—or even could be taught. Such knowledge,
therefore, must be innate in some sense. To say it is innate,
however, is not to say that the child is conscious of it or even that it
exists, fully formed, at birth. It is only to say that it is produced by
the child’s system of concept generation and combination, in
accordance with the system’s courses of biological and physical
development, upon exposure to certain kinds of
environmental input.

It has frequently been observed that children acquire both
concepts and language with amazing facility and speed, despite the
paucity or even absence of meaningful evidence and instruction in
their early years. The inference to the conclusion that much of
what they acquire must be innate is known as the argument from
the “poverty of the stimulus.” Specifying precisely what children
acquire and how they acquire it are aspects of what Chomsky
called in LSLT the “fundamental problem” of linguistics. In later
work he referred to this as “Plato’s problem,” a reference to Plato’s
attempt (in his dialogue the Meno) to explain how it is possible for
an uneducated child to solve geometrical problems with
appropriate prompting but without any specific training or
background in mathematics. Unlike Plato, however, Chomsky held
that solving Plato’s problem is a task for natural science,
specifically cognitive science and linguistics.

Principles and parameters


Chomsky’s early attempts to solve the linguistic version of Plato’s
problem were presented in the “standard theory” of Aspects of the
Theory of Syntax and the subsequent “extended standard theory,”
which was developed and revised through the late 1970s. These
theories proposed that the mind of the human infant is endowed
with a “format” of a possible grammar (a theory of linguistic data),
a method of constructing grammars based on the linguistic data to
which the child is exposed, and a device that evaluates the relative
simplicity of constructed grammars. The child’s mind constructs a
number of possible grammars that are consistent with the
linguistic data and then selects the grammar with the fewest rules
or primitives. Although ingenious, this approach was cumbersome
in comparison with later theories, in part because it was not clear
exactly what procedures would have to be involved in the
construction and evaluation of grammars.

In the late 1970s and early 1980s Chomsky and others developed a
better solution using a theoretical framework known as “principles
and parameters” (P&P), which Chomsky introduced in Lectures
on Government and Binding (1981) and elaborated in Knowledge
of Language (1986). Principles are linguistic universals, or
structural features that are common to all natural languages;
hence, they are part of the child’s native endowment. Parameters,
also native (though not necessarily specific to language, perhaps
figuring elsewhere too), are options that allow for variation in
linguistic structure. The P&P approach assumed that these options
are readily set upon the child’s exposure to a minimal amount of
linguistic data, a hypothesis that has been supported
by empirical evidence. One proposed principle, for example, is that
phrase structure must consist of a head, such as a noun or a verb,
and a complement, which can be a phrase of any form. The order
of head and complement, however, is not fixed: languages may
have a head-initial structure, as in the English verb phrase (VP)
“wash the clothes,” or a “head-final” structure, as in the
corresponding Japanese VP “the clothes wash.” Thus,
one parameter that is set through the child’s exposure to linguistic
data is “head-initial/head-final.” The setting of what was thought,
during the early development of P&P, to be a small number of
parametric options within the constraints provided by a
sufficiently rich set of linguistic principles would, according to this
approach, yield a grammar of the specific language to which the
child is exposed. Later the introduction of “microparameters” and
certain nonlinguistic constraints on development complicated this
simple story, but the basic P&P approach remained in place,
offering what appears to be the best solution to Plato’s problem yet
proposed.
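
The role of a single parameter can be made concrete with a small sketch. The following Python fragment is purely illustrative, not Chomsky’s formalism: a boolean setting, fixed by exposure to data, determines whether a head precedes or follows its complement, yielding the English and Japanese orders mentioned above.

```python
# A minimal sketch of the head-initial/head-final parameter (the function
# names and data structures here are hypothetical, not Chomsky's formalism).

def linearize(head, complement, head_initial=True):
    """Order a head and its complement according to one parameter setting."""
    return f"{head} {complement}" if head_initial else f"{complement} {head}"

# English-like setting (head-initial): the verb precedes its complement.
print(linearize("wash", "the clothes", head_initial=True))   # wash the clothes

# Japanese-like setting (head-final): the complement precedes the verb.
print(linearize("wash", "the clothes", head_initial=False))  # the clothes wash
```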

The phonological, or sound-yielding, features of languages are also
parameterized, according to the P&P approach. They are usually
set early in development—apparently within a few days—and they
must be set before the child becomes too old if he is to be able to
pronounce the language without an accent. This time limit on
phonological parameter setting would explain why second-
language learners rarely, if ever, sound like native speakers. In
contrast, young children exposed to any number of additional
languages before the time limit is reached have no trouble
producing the relevant sounds.

In contrast to the syntactic and phonological features of language,
the basic features out of which lexically expressed concepts (and
larger units of linguistic meaning) are constructed do not appear
to be parameterized: different natural languages seem to rely on
the same set. Even if semantic features were parameterized,
however, a set of features detailed enough to provide (in principle)
for hundreds of thousands of root, or basic, concepts would have
to be a part of the child’s innate, specifically linguistic endowment
—what Chomsky calls Universal Grammar, or UG—or of his
nonlinguistic endowment—the innate controls on growth,
development, and the final states of other systems in the mind or
brain. This is indicated, as noted above, by the extraordinary rate
at which children acquire lexical concepts (about one per waking
hour between the ages of two and eight) and the rich knowledge
that each concept and its verbal, nominal, adverbial, and other
variants provide. No training or conscious intervention plays a
role; lexical acquisition seems to be as automatic as parameter
setting.
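
For a rough sense of scale, the rate cited above can be multiplied out. The sketch below assumes about 14 waking hours per day, a figure not given in the article, and simply counts the concepts implied between the ages of two and eight.

```python
# Back-of-the-envelope arithmetic for the acquisition rate cited above
# (about one lexical concept per waking hour, ages two to eight).
# The 14-hour waking day is an assumed figure, not one from the article.

waking_hours_per_day = 14          # assumption for illustration
days_per_year = 365
years = 8 - 2                      # ages two through eight

concepts = waking_hours_per_day * days_per_year * years
print(concepts)                    # 30660 -- on the order of tens of thousands
```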

Of course, people differ in the words contained in their
vocabularies and in the particular sounds they happen to associate
with different concepts. Early in the 20th century, the Swiss
linguist Ferdinand de Saussure noted that there is nothing natural
or necessary about the specific sounds with which a concept may
be associated in a given language. According to Chomsky, this
“Saussurean arbitrariness” is of no interest to the natural scientist
of language, because sound-concept associations in this sense are
not a part of UG or of other nonlinguistic systems that contribute
to concept (and sound) development.

A developed theory of UG and of relevant nonlinguistic systems
would in principle account for all possible linguistic sounds and all
possible lexical concepts and linguistic meanings, for it would
contain all possible phonological and semantic features and all the
rules and constraints for combining phonological and semantic
features into words and for combining words into a
potentially infinite number of phrases and sentences. Of course,
such a complete theory may never be fully achieved, but in this
respect linguistics is no worse off than physics, chemistry, or any
other science. They too are incomplete.

It is important to notice that the semantic features
that constitute lexical concepts, and the rules and constraints
governing their combination, seem to be virtually designed for use
by human beings—i.e., designed to serve human interests and to
solve human problems. For example, concepts such as “give” and
“village” have features that reflect human actions and interests:
transfer of ownership (and much more) is part of
the meaning of give, and polity (both abstract and concrete) is part
of the meaning of village. Linguists and philosophers sympathetic
to empiricism will object that these features are created when
a community “invents” a language to do the jobs it needs to do—no
wonder, then, that linguistic meanings reflect human interests and
problems. The rationalist, in contrast, argues that humans could
not even conceive of these interests and problems unless the
necessary conceptual machinery were available beforehand. In
Chomsky’s view, the speed and facility with which children learn
“give” and “village” and many thousands of other concepts show
that the empiricist approach is incorrect—though it may be correct
in the case of scientific concepts, such as “muon,” which
apparently are not innate and do not reflect human concerns.
The overall architecture of the language faculty also helps to
explain how conceptual and linguistic creativity is possible. In the
P&P framework in its later “minimalist” forms (see below Rule
systems in Chomskyan theories of language), the language faculty
has “interfaces” that allow it to communicate with other parts of
the mind. The information it provides through “sensorimotor”
interfaces enables humans to produce and
perceive speech and sign language, and the information it provides
through “conceptual-intentional” interfaces enables humans to
perform numerous cognitive tasks, ranging from categorization
(“that’s a lynx”) to understanding and producing stories and
poetry.


Rule systems in Chomskyan theories of language
Chomsky’s theories of grammar and language are often referred to
as “generative,” “transformational,” or “transformational-
generative.” In a mathematical sense, “generative” simply means
“formally explicit.” In the case of language, however,
the meaning of the term typically also includes the notion of
“productivity”—i.e., the capacity to produce an infinite number of
grammatical phrases and sentences using only finite means (e.g., a
finite number of principles and parameters and a finite
vocabulary). In order for a theory of language to be productive in
this sense, at least some of its principles or rules must
be recursive. A rule or series of rules is recursive if it is such that it
can be applied to its own output an indefinite number of times,
yielding a total output that is potentially infinite. A simple example
of a recursive rule is the successor function in mathematics, which
takes a number as input and yields that number plus 1 as output. If
one were to start at 0 and apply the successor function
indefinitely, the result would be the infinite set of natural
numbers. In grammars of natural languages, recursion appears in
various forms, including in rules that allow for concatenation,
relativization, and complementization, among other operations.
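
The successor function, and the way a syntactic rule such as complementization can reapply to its own output, can be sketched in a few lines of Python. The example is illustrative only; the embedding rule and its vocabulary are invented, not drawn from any actual grammar.

```python
# Two toy illustrations of recursion (sketches only, not a formal grammar).

def successor(n):
    """The successor function: take a number and yield that number plus 1."""
    return n + 1

# Applying the rule to its own output indefinitely generates 0, 1, 2, 3, ...
n = 0
for _ in range(5):
    n = successor(n)               # n becomes 1, 2, 3, 4, 5

def embed(sentence, depth):
    """A toy complementization rule: a sentence may contain a sentence."""
    if depth == 0:
        return sentence
    return "Mary thinks that " + embed(sentence, depth - 1)

print(embed("it is raining", 3))
# Mary thinks that Mary thinks that Mary thinks that it is raining
```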

Chomsky’s theories are “transformational” in the sense that they
account for the syntactic and semantic properties of sentences by
means of modifications of the structure of a phrase in the course of
its generation. The standard theory of Syntactic Structures and
especially of Aspects of the Theory of Syntax employed a phrase-
structure grammar—a grammar in which the syntactic elements of
a language are defined by means of rewrite rules that specify their
smaller constituents (e.g., “S → NP + VP,” or “a sentence may be
rewritten as a noun phrase and a verb phrase”)—a large number of
“obligatory” and “optional” transformations, and two levels of
structure: a “deep structure,” where semantic interpretation takes
place, and a “surface structure,” where phonetic interpretation
takes place. These early grammars were difficult to contrive, and
their complexity and language-specificity made it very difficult to
see how they could constitute a solution to Plato’s problem.
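
A phrase-structure grammar of this kind can be sketched as a small set of rewrite rules. In the toy fragment below, only the rule “S → NP + VP” comes from the text; the remaining rules and the vocabulary are hypothetical, chosen simply to show how rewriting proceeds until only words remain.

```python
import random

# A toy phrase-structure grammar: each nonterminal is rewritten by one of its
# right-hand sides until only words remain. The rules beyond "S -> NP + VP"
# and the vocabulary are invented for illustration.

rules = {
    "S":  [["NP", "VP"]],          # S -> NP + VP (from the text)
    "NP": [["the", "N"]],          # NP -> the + N (hypothetical)
    "VP": [["V", "NP"]],           # VP -> V + NP (hypothetical)
    "N":  [["child"], ["clothes"]],
    "V":  [["washes"], ["folds"]],
}

def generate(symbol):
    """Recursively rewrite a symbol until only terminal words are left."""
    if symbol not in rules:        # a terminal word: nothing left to rewrite
        return [symbol]
    expansion = random.choice(rules[symbol])
    return [word for part in expansion for word in generate(part)]

print(" ".join(generate("S")))     # e.g. "the child washes the clothes"
```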

In Chomsky’s later theories, deep structure ceased to be the locus
of semantic interpretation. Phrase-structure grammars too were
virtually eliminated by the end of the 1970s; the task they
performed was taken over by the operation of “projecting”
individual lexical items and their properties into more complex
structures by means of “X-bar theory.” Transformations during
this transitional period were reduced to a single operation, “Move
α” (“Move alpha”), which amounted to “move any element in a
derivation anywhere”—albeit within a system
of robust constraints. Following the introduction of the
“minimalist program” (MP) in the early 1990s, deep structure (and
surface structure) disappeared altogether. Move α, and thus
modification of structure from one derivational step to another,
was replaced by “Move” and later by “internal Merge,” a variant of
“external Merge,” itself a crucial basic operation that takes two
elements (such as words) and makes of them a set. In the early
21st century, internal and external Merge, along with parameters
and microparameters, remained at the core of Chomsky’s efforts to
construct grammars.
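
The set-forming character of Merge described here can be rendered schematically in code. The sketch below is a loose illustration under that description, not Chomsky’s formal definition; the function names and the use of sets are choices made for the example.

```python
# A schematic rendering of Merge as a set-forming operation (illustrative only;
# not Chomsky's formal definition). Frozensets stand in for unordered
# syntactic objects.

def external_merge(a, b):
    """External Merge: take two syntactic objects and form the set {a, b}."""
    return frozenset([a, b])

def internal_merge(syntactic_object, subpart):
    """Internal Merge: re-merge an element already contained in the object
    with the object itself (the rough idea behind displacement)."""
    return frozenset([subpart, syntactic_object])

vp = external_merge("wash", external_merge("the", "clothes"))  # {wash, {the, clothes}}
displaced = internal_merge(vp, "wash")                         # "wash" merged again
print(vp)
print(displaced)
```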

Throughout the development of these approaches to the science of
language, there were continual improvements in simplicity and
formal elegance in the theories on offer; the early phrase-structure
components, transformational components, and deep and surface
structures were all eliminated, replaced by much simpler systems.
Indeed, an MP grammar for a specific language could in principle
consist entirely of Merge (internal and external) together with
some parametric settings. MP aims to achieve both of the major
original goals that Chomsky set for a theory of language in Aspects
of the Theory of Syntax: that it be descriptively adequate, in the
sense that the grammars it provides generate all and only the
grammatical expressions of the language in question, and that it
be explanatorily adequate, in the sense that it provides a
descriptively adequate grammar for any natural language as
represented in the mind of a given individual. MP grammars thus
provide a solution to Plato’s problem, explaining how any
individual readily acquires what Chomsky calls an “I-
language”—“I” for internal, individual, and intensional (that is,
described by a grammar). But they also speak to other desiderata
of a natural science: they are much simpler, and they are much
more easily accommodated to another science, namely biology.

Philosophy of Mind and Human Nature
Human conceptual and linguistic creativity involves several
mental faculties and entails the existence of some kind of mental
organization. It depends on perceptual-articulatory systems and
conceptual-intentional systems, of course, but on many others too,
such as vision. According to Chomsky, the mind comprises an
extensive cluster of innate “modules,” one of which is language.
Each module operates automatically, independently of individual
control, on the basis of a distinct, domain-specific set of rules that
take determinate inputs from some modules and yield determinate
outputs for others. In earlier work these operations were called
“derivations”; more recently they have been called
“computations.” The various modules interact in complex ways to
yield perception, thought, and a large number of
other cognitive products.

The language module seems to play a role in coordinating the
products of other modules. The generative—specifically, recursive
—properties of language enable humans to combine arbitrary
concepts together in indefinitely many ways, thereby making the
range of human thought virtually unlimited. When concepts are
paired with sounds in lexical items (words), humans can say
virtually anything and cooperate and make plans with each other.
The fact that the language faculty yields this kind of flexibility
suggests that the emergence of language in human evolutionary
history coincided with the appearance of other cognitive capacities
based on recursion, including quantification.

In a 2002 article, “The Faculty of Language,” Chomsky and his
coauthors Marc Hauser and W. Tecumseh Fitch divided the
language faculty in a way that reflected what had been Chomsky’s
earlier distinction between competence and performance.
The faculty of language in the “narrow” sense (FLN) amounts to
the recursive computational system alone, whereas the faculty in
the broad sense (FLB) includes perceptual-articulatory systems
(for sound and sign) and conceptual-intentional systems
(for meaning). These are the systems with which the
computational system interacts at its interfaces. Regarding
evolution, the authors point out that, although there are
homologues and analogs in other species for the perceptual-
articulatory and conceptual-intentional systems, there are none
for the computational system, or FLN. Conceivably, some
cognitive systems of animals, such as the navigational systems of
birds, might involve recursion, but there is no computational
system comparable to FLN, in particular none that links sound
and meaning and yields unbounded sentential “output.” FLN is
arguably what makes human beings cognitively distinct from other
creatures.

As suggested earlier, UG, or the language faculty narrowly
understood (FLN), may consist entirely of Merge and perhaps
some parameters specific to language. This raises the question of
what the biological basis of FLN must be. What distinctive fact of
human biology, or the human genome, makes FLN unique to
humans? In a 2005 article, “Three Factors in Language Design,”
Chomsky pointed out that there is more to organic development
and growth than biological (genomic) specification and
environmental input. A third factor is general conditions on
growth resulting from restrictions on possible physical structures
and restrictions on data analysis, including those that might figure
in computational systems (such as language). For example, a bee’s
genome does not have to direct it to build hives in a hexagonal
lattice. The lattice is a requirement imposed by physics, since this
structure is the most stable and efficient of the relevant
sort. Analogous points can be made about the growth, structure,
and operation of the human brain. If the parameters of UG are not
specified by the language-specific parts of the human genome but
are instead the result of third factors, the only language-specific
information that the genome would need to carry is an instruction
set for producing a single principle, Merge (which takes external
and internal forms). And if this is the case, then the appearance of
language could have been brought about by a single genetic
mutation in a single individual, so long as that mutation were
transmissible to progeny. Obviously, the relevant genes would
provide great advantages to any human who possessed them. A
saltational account such as this has some evidence behind it:
50,000 to 100,000 years ago, humans began to observe the
heavens, to draw and paint, to wonder, and to develop
explanations of natural phenomena—and the migration from
Africa began. Plausibly, the introduction of the computational
system of language led to this remarkable cognitive awakening.

Politics
Chomsky’s political views seem to be supported to some extent by
his approach to the study of language and mind, which implies
that the capacity for creativity is an important element of human
nature. Chomsky often notes, however, that there is only an
“abstract” connection between his theories of language and his
politics. A close connection would have to be based on a fully
developed science of human nature, through which fundamental
human needs could be identified or deduced. But there is nothing
like such a science. Even if there were, the connection would
additionally depend on the assumption that the best form of
political organization is one that maximizes the satisfaction of
human needs. And then there would remain the question of what
practical measures should be implemented to satisfy those needs.
Clearly, questions such as this cannot be settled by scientific
means.

Although Chomsky was always interested in politics, he did not
become publicly involved in it until 1964, when he felt compelled
to lend his voice to protests against the U.S. role in the Vietnam
War (or, as he prefers to say, the U.S. invasion of Vietnam), at no
small risk to his career and his personal safety. He has argued that
the Vietnam War was only one in a series of cases in which the
United States used its military power to gain or consolidate
economic control over increasingly larger areas of the developing
world. In the same vein, he regards the domestic political scene of
the United States and other major capitalist countries as theatres
in which major corporations and their elite managers strive to
protect and enhance their economic privileges and political power.

In democracies like the United States, in which the compliance of
ordinary citizens cannot be guaranteed by force, this effort
requires a form of “propaganda”: the powerful must make
ordinary citizens believe that vesting economic control of society
in the hands of a tiny minority of the population is to their benefit.
Part of this project involves enlisting the help of “intellectuals”—
the class of individuals (primarily journalists and academics) who
collect, disseminate, and interpret political and economic
information for the public. Regrettably, Chomsky argues, this task
has proved remarkably easy.

As a responsible (rather than mercenary) member of
the intellectual class, Chomsky believes that it is his obligation to
provide ordinary citizens with the information they need to
draw their own conclusions and to make their own decisions about
vital political and economic issues. As he wrote in Powers and
Prospects (1996),
The responsibility of the writer as a moral agent is to try to bring the
truth about matters of human significance to an audience that can do
something about them.

In one of his first political essays, “The Responsibility of
Intellectuals” (1967), Chomsky presented case after case in
which intellectuals in positions of power, including prominent
journalists, failed to tell the truth or deliberately lied to the public
in order to conceal the aims and consequences of the United
States’ involvement in the Vietnam War. In their two-volume
work The Political Economy of Human Rights (1979) and later
in Manufacturing Consent: The Political Economy of the Mass
Media (1988), Chomsky and the economist Edward Herman
analyzed the reporting of journalists in the mainstream (i.e.,
corporate-owned) media on the basis of statistically careful studies
of historical and contemporary examples. Their work provided
striking evidence of selection, skewing of data, filtering of
information, and outright invention in support of assumptions
that helped to justify the controlling influence of corporations in
U.S. foreign policy and domestic politics.

The studies in these and other works made use of paired examples
to show how very similar events can be reported in very different
ways, depending upon whether and how state and corporate
interests may be affected. In The Political Economy of Human
Rights, for example, Chomsky and Herman compared reporting
on Indonesia’s military invasion and occupation of East
Timor with reporting on the behaviour of the communist Khmer
Rouge regime in Cambodia. The events in the two cases took place
in approximately the same part of the world and at approximately
the same time (the mid- to late 1970s). As a proportion of
population, the number of East Timorese tortured and murdered
by the Indonesian military was approximately the same as the
number of Cambodians tortured and murdered by the Khmer
Rouge. And yet the mainstream media in the United States devoted
much more attention to the second case (more than 1,000 column
inches in the New York Times) than to the first (about 70 column
inches). Moreover, reporting on the actions of the Khmer Rouge
contained many clear cases of exaggeration and fabrication,
whereas reporting on the actions of Indonesia portrayed them as
essentially benign. In the case of the Khmer Rouge, however,
exaggerated reports of atrocities aided efforts by the United States
to maintain the Cold War and to protect and expand its access to
the region’s natural resources (including East Timorese oil
deposits) through client states. Indonesia, on the other hand, was
just such a state, heavily supported by U.S. military and economic
aid. Although ordinary Americans were not in a position to do
anything about the Khmer Rouge, they were capable of doing
something about their country’s support for Indonesia, in
particular by voting their government out of office. But the media’s
benign treatment of the invasion made it extremely unlikely that
they would be motivated to do so. According to Chomsky, this and
many other examples demonstrate that prominent journalists and
other intellectuals in the United States function essentially as
“commissars” on behalf of elite interests. As he wrote
in Necessary Illusions (1988):
The media serve the interests of state and corporate power, which are
closely interlinked, framing their reporting and analysis in a manner
supportive of established privilege and limiting debate and discussion
accordingly.
Some of Chomsky’s critics have claimed that his political and
media studies portray journalists as actively engaged in a kind
of conspiracy—an extremely unlikely conspiracy, of course, given
the degree of coordination and control it would require.
Chomsky’s response is simply that the assumption of conspiracy is
unnecessary. The behaviour of journalists in the mainstream
media is exactly what one would expect, on average, given the
power structure of the institutions in which they are employed,
and it is predictable in the same sense and for the same reasons
that the behaviour of the president of General Motors is
predictable. In order to succeed—in order to be hired and
promoted—media personnel must avoid questioning the interests
of the corporations they work for or the interests of the elite
minority who run those corporations. Because journalists
naturally do not wish to think of themselves as mercenaries (no
one does), they engage in what amounts to a form of self-
deception. They typically think of themselves as stalwart defenders
of the truth (as suggested by the slogan of the New York Times,
“All the news that’s fit to print”), but when state or corporate
interests are at stake they act otherwise, in crucially important
ways. In short, very few of them are willing or even able to live up
to their responsibility as intellectuals to bring the truth about
matters of human significance to an audience that can do
something about them.
