Transformational Generative Grammar
A. Background
Structuralism emphasizes the process of segmenting and classifying the physical features of utterances, with little reference to the abstract underlying structures of language or to meaning. Structuralism is based on a body of collected language data known as a corpus, which represents native spoken or written language. The corpus is broken down by grammarians, who describe and classify it in terms of form, position and function. However, something very important seemed to be overlooked in this approach: linguistic creativity, the ability to produce the utterances that formed the corpus. Since speakers know how to produce an infinite number of sentences, many of which are novel, never having been produced before, a linguist must study this ability and make explicit the ways in which utterances are produced.
The person who noticed and paid attention to this oversight was Avram Noam Chomsky. He laid the foundations for this theory with the publication of his book Syntactic Structures in 1957. By 1955, however, Chomsky was already arguing for an approach to grammar which was independent of semantics. He rejected both the description of a corpus and the search for discovery procedures as the aim of linguistic science. For him, the goal of a grammar is to account for the native speaker's competence, defined as what a native speaker knows, tacitly, of his or her language. Since the number of grammatical sentences possible in a language is infinite, he considered a grammar to be a set of rules which, when followed, can generate all the grammatical sentences of the language.
In order to accomplish his newly formulated goal of linguistics, to characterize competence, Chomsky decided to focus on the aspect of language that had been neglected by the Structuralists: syntax. As we all know, syntax is the branch of linguistics devoted to the study of structures, or the way in which words combine to express meaning through sentences. In syntax, hypotheses are called rules, and the group of hypotheses that describe a language's syntax is called a grammar. From Chomsky's perspective, however, syntax is a natural system, somehow rooted in human psychology and biology. Hence, for him it is syntax which will address the mystery of linguistic creativity and fulfill his stated goal for linguistics. So, focusing on syntax, he laid the foundation for explaining linguistic creativity. Alongside the traditional definition of syntax as the study of the rules governing the way words are combined to form sentences in a language, an alternative definition has been formed: syntax is the study of the interrelationships between elements of sentence structure, and of the rules governing the arrangement of sentences in sequences.
According to Chomsky, the primary task of the linguist should not be to discover the structure of the language from a body of data but rather to describe and explain the knowledge of the structure of the language which the native speaker has. Rejecting the approach of his structural predecessors, he introduced and developed a radical approach that came to be known as generative grammar or transformational-generative grammar. Generative grammar, or transformational generative grammar, is a form of language analysis that establishes relationships among the different elements in the sentences of a language and makes use of rules, or transformations, to express these relationships. It is called generative grammar because it holds that language is a rule-governed system which allows us to generate an infinite number of sentences. Chomsky borrowed the term from mathematics to refer to the capacity of a grammar to define (i.e. specify the membership of) the set of grammatical sentences in a language. Technically, a generative grammar is a set of formal rules which projects a finite set of sentences upon the potentially infinite set of sentences that constitute the language as a whole, and it does this in an explicit manner, assigning to each a set of structural descriptions. Here 'generate' is a technical term for 'describe' or 'specify'.
The term 'generative' means that the grammar is formal and explicit: 'when we speak of the linguist's grammar as a "generative grammar" we mean only that it is sufficiently explicit to determine how sentences of the language are in fact characterized by the grammar' (Chomsky, 1980). Hence generative grammars contrast with traditional grammars, which left many rules of the grammar to the interpretation of the reader; such grammars dealt primarily with the idiosyncratic forms that were not 'obvious' and thus left it to the reader to know what a 'noun' was or what the basic word order of the sentence was. A generative grammar, by contrast, tries to specify everything that is necessary on the printed page rather than leaving it to the imagination.
Sentences are generated by a (subconscious) set of procedures that is part of the mind of a native speaker. The goal is to figure out what we (subconsciously) know: a theory of the competence (unconscious knowledge) of a native speaker. Since Chomsky intended to treat grammar with the tools of logic, he borrowed from the axiomatic-deductive method of mathematical logic, especially in its earlier computational formulation. An important influence on his mathematical approach was the Polish-American mathematician and logician Emil Leon Post, who started the practice of mathematicizing logical proof. By and large, though, the notation of generative grammar was an original invention.
At first, Chomsky's theory was simply a continuation of the general concerns of American Structuralist syntax, which found an extreme in Zellig Harris's (1951) advocacy of a formal and meaning-free linguistics. Harris had been Chomsky's professor at the University of Pennsylvania. In a paper entitled 'From Morpheme to Utterance', Harris (1946) had proposed the use of equations to indicate substitutability. The expressions on the left in (1) - (6) are taken from Harris (1946:166ff.); the expressions on the right are their rough equivalents in TGG phrase structure (cf. (7) - (12)).
Harris (1946)                        TGG phrase structure
(1) N4V4 = N4V4                      S > NP + VP
(2) I = N4                           NP > N
(3) V3-Vv = V4                       VP > Aux + Verb
(3) V = V1 = V2 = V3                 Aux > C (M) (have + en) (be + ing)
(4) I = she                          N > she
(5) Vv = -ed                         C > Past
(6) V = gain                         Verb > gain
This selection of equations is presented in an order reverse to the one in which Harris lists them, i.e., from the top down instead of from the bottom up. Harris works from morpheme to utterance and therefore begins with she, -ed, gain, etc. The reversal is made to match the sequence used in TGG, which is thus more akin to IC-analysis.
The equations of Harris can then be adapted in the following way (Chomsky 1957:26):
(7) Sentence > NP + VP
(8) NP > T + N
(9) VP > Verb + NP
(10) T > the
(11) N > man, ball, etc.
(12) Verb > hit, took, etc.
Because the statements in (7) - (12) produce such structures as those in Figure 1, they are called phrase structure rules, and any grammar of this sort is a phrase structure grammar. And that is the syntax of American Structuralism: a phrase structure grammar. The shared view of grammar is another continuation from American Structuralism.
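To see how such a rule system generates sentences mechanically, here is a minimal sketch in Python (not part of the original discussion) that encodes rules (7) - (12) and rewrites the initial symbol until only words remain; the random choice among alternative expansions is an assumption made purely for illustration.

import random

# Rules (7)-(12) encoded as a rewriting system: each left-hand symbol maps to
# a list of possible right-hand expansions.
RULES = {
    "Sentence": [["NP", "VP"]],
    "NP": [["T", "N"]],
    "VP": [["Verb", "NP"]],
    "T": [["the"]],
    "N": [["man"], ["ball"]],
    "Verb": [["hit"], ["took"]],
}

def generate(symbol="Sentence"):
    """Rewrite a symbol until only terminal words remain."""
    if symbol not in RULES:              # terminal symbol: an actual word
        return [symbol]
    expansion = random.choice(RULES[symbol])
    words = []
    for part in expansion:
        words.extend(generate(part))
    return words

print(" ".join(generate()))              # e.g. "the man hit the ball"

Run repeatedly, the sketch produces every sentence these six rules license (the man hit the ball, the ball took the man, and so on), which is what it means for a grammar of this sort to 'generate' a set of sentences.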
Hence, the first step of TG grammar emerges from American Structuralism. The focus on the relation between sentences arose first in discourse study. Neither the emphasis on such relations nor the notion of transformation for their expression is Chomsky's, just as (more generally in American Structuralism) the notion of the phrase structure rule (equation) is not Chomsky's contribution. It all originates in Harris's work and was adapted by Chomsky. The motivation for transformation was originally one of textual equivalence and was restricted to one specific text. The idea of the technique of experimental variation (again Harris's) made it possible to generalize (abstract) the relation beyond the text and to retreat from discourse and from parole to the safer confines of the sentence and langue. That is Chomsky's contribution. Nevertheless, the mainstream of linguistics since 1957 has been dominated by Noam Chomsky (b. 1928).
B. How Transformational Grammar Generates Sentences
The predecessor to phrase structure rules was an approach to syntax used by structural linguists called immediate constituent analysis, which accounted for the linear order of words on the surface and the hierarchical structure of the sentence; it was based on the principle that words enter into relationships with one another within coherent units known as constituents, the proper subparts of sentences. (An even older form of phrase structure grammar is the sentence diagramming of traditional grammar.) In a book entitled Syntactic Structures, published in 1957, however, Noam Chomsky argued that immediate constituent analysis, though not wrong, was insufficient since it dealt only with the surface order.
It has been assumed that all syntactic constituents are organized around a head, X. X can be any word
or morpheme category. X is expanded by the addition of a complement to form a larger unit.
In GG literature, there are four major kinds of words, which are the skeleton of syntax, with verbs chief among them:
1. Verbs (V) = VP
2. Prepositions (P) = PP
3. Nouns (N) = NP
4. Adjectives (A) = AP

[Tree diagram: a node A branching into daughters B and C, with C dominating D.]
Two metaphors are used here. In the family tree metaphor, B and C are daughters of A and they are
sisters of each other; less often, A is referred to as the mother or parent of B and C. Also, in the tree
metaphor, A is seen as a branching node, as opposed to C, which is a nonbranching node. In the
domination metaphor, a distinction is made between immediate domination and domination: a node
dominates everything below it (hence, A dominates B, C, and D); a node immediately dominates those
nodes for which there are no intervening nodes (hence, A immediately dominates B and C, but not
D). Finally, B and C form a constituent: a constituent is all and only the nodes dominated by a single node, in this case, A.
Nodes which contain specific lexical items such as on, the, and beach will never themselves have
daughters; they mark the bottom end of the tree structure. Nodes of this type, which do not dominate
any other node, are called terminal nodes. Lexical items such as on, the, and beach are terminal
elements, and the sequence of terminal elements at the bottom of a tree (e.g. on the beach) is called
the terminal string. We say that a non-terminal node dominates all of its daughter nodes, the daughters
of its daughters, daughters of its grand-daughters, etc. A mother immediately dominates its own
daughters.
The phrase structure rules also allow for choices. The optional choices are indicated with parentheses:
A > (B) C
This rule reads that A is expanded as optionally B and obligatorily C. In every rewrite rule, at least one
element must be obligatory. There may also be mutually exclusive choices of elements in a string;
these are indicated with curly braces:
A > {B, C}
This rule states that if you choose B, you cannot choose C; you must choose either B or C, but not both. Whether the mutually exclusive items are written on one line separated by commas or stacked on separate lines does not matter, as long as they occur within braces. These two types of choices can be combined:
A > ({B, C}) D
In every phrase structure grammar, there must be an initial symbol, a first left-hand symbol, such as A above. Thereafter, every symbol appearing on the left has already been introduced on the right-hand side. The symbols that occur on the right, but never on the left, are the terminal symbols; another way of defining them is that they occur at the bottom of a tree diagram. In our brief grammar above, B, C and D are terminal symbols. They immediately dominate lexical items, or words. Phrase structure rules account for the linear order of elements in a sentence in deep structure, as well as for the hierarchical arrangement of sentence structure. They can also account for the infinite generating capacity of language.
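As an illustration only, the following Python sketch enumerates the strings licensed by the toy rule A > ({B, C}) D above, treating B, C and D as terminal symbols; the encoding of optional and mutually exclusive slots is an assumption for demonstration, not part of the original notation.

from itertools import product

# A > ({B, C}) D: the first slot is optional (None = omitted) and offers the
# mutually exclusive choices B or C; the second slot is the obligatory D.
slots = [
    [None, "B", "C"],    # ({B, C})
    ["D"],               # D
]

expansions = [" ".join(sym for sym in combo if sym is not None)
              for combo in product(*slots)]
print(expansions)        # ['D', 'B D', 'C D']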
S: The man takes the books.
(3) {Sentence, NP + VP, NP + Verb + NP, NP + Aux + V + NP, NPsingular + Aux + V + NP, NPsingular + Aux + V + NPplural, T + N + Ø + Aux + V + NPplural, T + N + Ø + Aux + V + T + N + S, T + N + Ø + C + V + T + N + S, . . . , the + man + Ø + C + take + the + book + S}
This led the theory, in its own defense, to assign sentences complex syntactic representations, which are mediated by transformations.
2. Transformational Rules
Phrase structure rules account for much of our syntactic knowledge, but they do not account for the
fact that certain sentence types in the language relate systematically to other sentence types. The
standard way of describing these relationships is to say that the related sentences come from a
common underlying structure. Yes-no questions are a case in point, and they bring us back to a
discussion of auxiliaries. Auxiliaries are central to the formation of yes-no questions as well as certain
other types of sentences in English.
The very small set of phrase structure rules just described is a sample of what a more complex phrase
structure grammar of English, with many more parts, would look like. These rules can be treated as a
representation of the underlying or deep structures of sentences in English. One feature of these
underlying structures is that they will generate sentences with a fixed word order. That is convenient for
creating declarative forms (You will help Mary), but not for making interrogative forms, as used in
questions (Will you help Mary?). In making the question, we move one part of the structure to a
different position. This process is based on a movement rule.
In order to talk about this process, we need to expand our phrase structure rules to include an auxiliary
verb (Aux) as part of the sentence. This is illustrated in the first rewrite rule below. Auxiliary verbs
(sometimes described as helping verbs) take different forms in English, but one well-known set can
be included in the rudimentary lexical rule for Aux below. We also need a lexical rule that specifies the
basic forms of the verbs, shown as the third rewrite rule below.
S > NP Aux VP
Aux > {can, could, should, will, would}
V > {follow, help, see}
With these components, we can specify a simple movement rule that is involved in the creation of one
basic type of question in English.
NP Aux VP ⇒ Aux NP VP
This type of rule has a special symbol and can be illustrated in the process of one tree, on the right,
being derived from the tree on the left.
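A minimal sketch of this movement rule in Python, assuming the sentence has already been analysed into a flat (NP, Aux, VP) triple (the triple representation and the capitalisation step are simplifications for illustration, not part of the theory itself):

# Deep structure of "You will help Mary" as a flat (NP, Aux, VP) triple.
deep_structure = ("you", "will", "help Mary")

def yes_no_question(np, aux, vp):
    """Apply the movement rule NP Aux VP => Aux NP VP."""
    return (aux, np, vp)

surface = " ".join(yes_no_question(*deep_structure))
print(surface[0].upper() + surface[1:] + "?")   # Will you help Mary?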
Transformations were essentially rules for relating one syntactic structure to another. Transformations in Chomsky's model of transformational grammar are the formal operations which mediate between the deep structure and the surface structure of sentences. Transformations map the tree diagrams generated by phrase structure rules at deep structure onto derived tree diagrams at surface structure. Stated in technical terms: transformations are operations that map phrase markers onto phrase markers. Transformational rules differ from phrase structure rules in that their operational domain is not restricted to individual nodes, but extends to the whole phrase structure tree, which they modify according to precise conditions.
Functions of Transformations
1. They can move elements from one position to another in a sentence. (declarative-interrogative
transformation)
2. Transformations can insert new elements into a structure (for example, by in the active-passive transformation).
3. Transformations can substitute one element for another in a structure (see the sketch after this list):
Sam loves Sam.
Sam loves himself.
The second Sam in the first sentence is replaced by himself in the second. We call this the reflexivization transformation.
4. Transformations can invert the order of elements in a structure.
This is evident when a declarative sentence is changed into a Yes-No question.
5. Transformations can also delete elements in a structure. This is also called gapping.
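As a companion to the sketches above, the substitution case (function 3) can be illustrated in the same toy Python style; treating identical names as coreferential and supplying only the masculine singular reflexive are simplifying assumptions, not part of the original rule.

def reflexivize(subject, verb, obj):
    """Substitute a coreferential object with a reflexive pronoun (toy rule)."""
    if obj == subject:            # coreference assumed when the names match
        obj = "himself"           # toy lexicon: masculine singular only
    return f"{subject} {verb} {obj}."

print(reflexivize("Sam", "loves", "Sam"))   # Sam loves himself.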
Two Components of Transformation
Ambiguity

One surface structure may relate to several deep structures:

Flying planes can be dangerous.
A) To fly planes can be dangerous.
B) Planes that are flying can be dangerous.
Ambiguity refers to a word or sentence which expresses more than one meaning. The meaning of a
sentence can be predicted from the meaning of the words it contains. It may be ambiguous in the
sense that the listener may not be clear about what exactly the speaker is trying to do by uttering that
sentence. Hurford and Heasley state that a word or sentence is ambiguous when it has more than one
sense. A sentence is ambiguous if it has two (or more) paraphrases which are not themselves
paraphrases of each other.
It is very important for a reader or a listener to understand ambiguity because it affects his
comprehension. For example, when reading:
They are washing clothes.
The reader or listener cannot decide if it means:
1- They (people) are doing the action of washing.
or 2- They (clothes) are for washing.
Such ambiguous sentences can be accounted for and understood on the basis of basic sentence patterns. The above-mentioned sentence can be of two patterns:
S + V + O (first meaning).
S + Be + adj + N (second meaning).
There are still other types of ambiguous sentences which cannot be understood and accounted for by
structural grammar. For example:
John teaches the group singing.
According to the structuralists, the above-mentioned sentence may have one of two basic patterns:
N1 + V + N2 + N3 (John teaches singing to the group).
or N1 + V + N2 (John teaches the singing of the group).
In addition to these two meanings, El-Natoor states that the transformationalists account for two different
deep structures (meanings) which the structuralists have failed to see:
1- John teaches the group. John is singing.
2- John teaches the group. The group is singing.
In the first sentence singing is an adjective modifying John, while in the second sentence singing is an adjective modifying the group.
An ambiguous sentence is one which has one surface structure and two or more deep structures. For example, Time flies is one surface structure, but it has two deep structures: (a) The time flies (N + V); (b) Time the flies (V + N).
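As a rough illustration, the two deep structures can be written as distinct labelled bracketings over the same surface string; the nested-tuple encoding below is an assumption for demonstration only.

# One surface string, two toy deep-structure analyses (cf. (a) N + V and (b) V + N).
surface = "Time flies"
deep_structures = [
    ("S", ("NP", ("N", "time")), ("VP", ("V", "flies"))),   # (a) the time flies
    ("S", ("VP", ("V", "time"), ("NP", ("N", "flies")))),   # (b) time the flies
]
for analysis in deep_structures:
    print(surface, "<-", analysis)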
The notion of the surface structure and the deep structure which the transformationalists use is very
useful for understanding the message which the speaker or the writer wants to convey.
Unlike structuralists, transformationalists are able to distinguish between sentences which have the same surface structure but different deep structures. Structural linguistics was mainly concerned with near-surface identity for the purpose of interpreting the sentence, whereas transformational linguistics analyzes surface identity in order to uncover underlying differences. Chomsky gives the following examples:
1- John is eager to please.
2- John is easy to please.
The structuralists classify such sentences under the same basic pattern:
S. + Be + adj + infinitive
because the surface structures of both sentences are obviously identical.
Transformational Generative Grammar has an advantage over Structural Grammar in interpreting such sentences of the same structure but of different meanings (different deep structures).
Transformationalists say that although these sentences have the same surface structure, they have
different deep structures or meanings:
The first means: John is eager to please others (people).
The second means: It is easy for people to please John.
The two sentences cannot be transformed in the same way:
It is easy to please John.
*It is eager to please John. (not acceptable in the intended sense)
Native speakers can normally understand the ambiguity of surface structures. This seems to imply that
they interpret surface structures by a process that relates them to the underlying deep structure. The
concept of basic patterns as defined by structural linguists is too limited to explain some very basic aspects of human language. Transformational Grammar distinguishes between two aspects of a speaker's language production. One aspect is the speaker's subconscious knowledge of a set of internalized rules; this is called his competence. The other is the speaker's use of these rules when he speaks; this is called his performance. Transformationalists believe that the language user is able to:
Make infinite use of finite means.
Distinguish grammatical, ungrammatical and nonsensical sentences.
Perceive ambiguity.