Transformational Generative Grammar

1) Generative grammar, developed by Noam Chomsky, focuses on explaining linguistic creativity by studying the underlying rules that allow speakers to produce an infinite number of sentences. 2) Chomsky argued that syntax is the key to understanding linguistic creativity. He developed transformational grammar, which uses phrase structure rules and transformations to relate sentence elements and account for linguistic intuitions. 3) Transformational grammar generates sentences using phrase structure rules to define syntactic categories and their arrangements, and transformations to relate surface structures to deeper structures.

Uploaded by

Francis B. Tatel

Generative Grammar

A. Background
Structuralism emphasizes the segmentation and classification of the physical features
of utterances, with little reference to the abstract underlying structures of language or to their meaning.
Structuralism is based on a body of collected language data known as a corpus, which represents
native spoken or written language. Grammarians break the corpus down, describing and
classifying it in terms of form, position, and function. Something very important, however, seemed to be
overlooked in this approach: linguistic creativity, the ability that produced the utterances which form the
corpus. Since speakers know how to produce an infinite number of sentences, many of them novel,
never having been produced before, linguists must study this ability and make explicit the means
by which utterances are produced.
The person who noticed and paid attention to this oversight was Avram Noam Chomsky. He laid the
foundations for his theory with the publication of his book Syntactic Structures in 1957. By 1955,
however, Chomsky was already arguing for an approach to grammar which was independent of
semantics. He rejected the description of a corpus, and the discovery procedures which the
Structuralists considered the aim of linguistic science. For him, the goal of a grammar is to account for the
native speaker's competence, defined as what a native speaker tacitly knows of his or her language.
Since the number of grammatical sentences possible in a language is infinite, he considered
a grammar to be a set of rules which, when followed, can generate all the grammatical sentences of
that language.
In order to accomplish his newly formulated goal for linguistics, to characterize competence,
Chomsky decided to focus on the aspect of language that the Structuralists had neglected:
syntax. As we all know, syntax is the branch of linguistics devoted to the study of structure, the
way in which words combine to express meaning through sentences. In syntax, hypotheses are called
rules, and the group of hypotheses that describes a language's syntax is called a grammar. From
Chomsky's perspective, however, syntax is a natural system, somehow rooted in human psychology
and biology. Hence, for him it is syntax which will address the mystery of linguistic creativity and will
fulfill his goal for linguistics. So, focusing on syntax, he laid the foundation for explaining linguistic
creativity. Alongside the traditional definition of syntax as the study of the rules governing the way words are
combined to form sentences in a language, an alternative definition has been formed: syntax is the study
of the interrelationships between elements of sentence structure, and of the rules governing
the arrangement of sentences in sequences.
According to Chomsky, the primary task of the linguist should not be to discover the structure of
the language from a body of data, but rather to describe and explain the knowledge of the structure of
the language which the native speaker has. Rejecting the approach of his structural predecessors, he
introduced and developed a radical approach that came to be known as generative grammar or
transformational-generative grammar. Generative grammar, or transformational generative
grammar, is a form of language analysis that establishes relationships among the different elements in
the sentences of a language and makes use of rules, or transformations, to capture these relationships.
It is called generative grammar because it states that language is a rule-governed system which allows
us to generate an infinite number of sentences. Chomsky borrowed the term from
mathematics to refer to the capacity of a grammar to define (i.e. specify the membership of) the set of
grammatical sentences in a language; generate is thus a technical term for describe or specify.
Technically, a generative grammar is a set of formal rules
which projects a finite set of sentences upon the potentially infinite set of sentences that constitutes the
language as a whole, and it does so in an explicit manner, assigning to each sentence a set of structural
descriptions.
The term 'generative' means that the grammar is formal and explicit: 'when we speak of the
linguist's grammar as a "generative grammar" we mean only that it is sufficiently explicit to determine
how sentences of the language are in fact characterized by the grammar' (Chomsky, 1980). Generative
grammars thus contrast with traditional grammars, which left many rules of the grammar to the
interpretation of the reader; such grammars dealt primarily with the idiosyncratic forms that were not
'obvious' and left it to the reader to know what a 'noun' was or what the basic word order of the
sentence was. A generative grammar, by contrast, tries to specify everything that is necessary on the
printed page rather than leaving it to the imagination.
Sentences are generated by a (subconscious) set of procedures which is part of our cognitive ability.
The goal is to figure out what we (subconsciously) know: a theory of the linguistic intuition
(unconscious knowledge) of a native speaker. Yet since Chomsky would be dealing with something
abstract, like logic, he borrowed from the axiomatic-deductive method in mathematical logic, developed
a generation earlier in its computational formulation. An important influence on his mathematical approach was the
Polish-American mathematician and logician Emil Leon Post, who started the practice of mathematicizing
logical proof. By and large, though, the notation of generative grammar was invented.
At first, Chomsky's theory was just a continuation of the general concerns of American Structuralist syntax,
which found an extreme in Zellig Harris's (1951) advocacy of a formal and meaningless linguistics.
Harris had been Chomsky's professor at the University of Pennsylvania. In a paper entitled 'From
Morpheme to Utterance', Harris (1946) had proposed the use of equations to indicate
substitutability. The expressions on the left in (1) - (7) are taken from Harris (1946:166ff.); the
expressions on the right are their rough equivalences in TGG phrase structure.
(1) N4V4 = N4V4          S → NP + VP
(2) I = N4               NP → N
(3) V3-Vv = V4           VP → Aux + Verb
(4) V = V1 = V2 = V3     Aux → C (M) (have + en) (be + ing)
(5) I = she              N → she
(6) Vv = -ed             C → Past
(7) V = gain             Verb → gain
This selection of equations is presented in an order reverse from the one in which Harris lists them,
i.e. from the top down rather than from the bottom up. Harris is working from morpheme to utterance
and therefore begins with she, -ed, gain, etc. The reversal is made to match the sequence used in TGG,
which is thus more akin to IC-analysis.
The equations of Harris can then be adapted in the following way (Chomsky 1957:26):
(7) Sentence → NP + VP
(8) NP → T + N
(9) VP → Verb + NP
(10) T → the
(11) N → man, ball, etc.
(12) Verb → hit, took, etc.
Because the statements in (7) - (12) produce such structures as those in Figure 1, they are called
phrase structure rules, and any grammar of this sort is a phrase structure grammar.
And that is the syntax of American Structuralism: a phrase structure grammar. The shared view of
grammar is another continuation from American Structuralism.
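The rewriting in rules (7) - (12) is mechanical enough to be sketched in code. The fragment below is an illustrative sketch, not part of the original text: the dictionary encoding of the grammar and the function name `generate` are my own, but the rules themselves are Chomsky's (1957) toy grammar as listed above.

```python
import random

# Chomsky's (1957) toy grammar, rules (7)-(12), as a Python dict.
# Each non-terminal maps to a list of possible right-hand sides.
GRAMMAR = {
    "Sentence": [["NP", "VP"]],
    "NP": [["T", "N"]],
    "VP": [["Verb", "NP"]],
    "T": [["the"]],
    "N": [["man"], ["ball"]],
    "Verb": [["hit"], ["took"]],
}

def generate(symbol="Sentence"):
    """Rewrite `symbol` top-down until only terminal words remain."""
    if symbol not in GRAMMAR:          # terminal: a word, not a category
        return [symbol]
    expansion = random.choice(GRAMMAR[symbol])
    words = []
    for part in expansion:
        words.extend(generate(part))
    return words

print(" ".join(generate()))  # e.g. "the man hit the ball"
```

Every string this sketch produces is one of the sentences the grammar defines, which is exactly the sense in which a grammar is said to generate a language.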
Hence, the first step of TG grammar emerges from American Structuralism. The focus on the
relation between sentences arises first in discourse study. Neither the emphasis on such relations nor
the notion of transformation for their expression is Chomsky's, just as (more generally in American
Structuralism) the notion of the phrase structure rule (equation) is not Chomsky's contribution. It all originates
in Harris's work and was adapted by Chomsky. The motivation for transformation was originally one of
textual equivalence, restricted to one specific text. The technique of experimental
variation (again Harris's) allowed the possibility of generalizing (abstracting) the relation beyond the
text, and of retreating from discourse and from parole to the safer confines of the sentence and
language. That is Chomsky's contribution. Nevertheless, the mainstream of linguistics since 1957 has
been dominated by Noam Chomsky (1928-).
B. How Transformational Grammar Generates Sentences
The predecessor of phrase structure rules was an approach to syntax used by structural linguists called
immediate constituent analysis, which accounted for the linear order of words on the surface and the
hierarchical structure of the sentence; it was based on the principle that words enter into relationships
with one another within coherent units known as constituents, the proper subparts of sentences. (An
even older form of phrase structure grammar is the sentence diagramming of traditional grammar.) In
Syntactic Structures (1957), however, Noam Chomsky argued that
immediate constituent analysis, though not wrong, was insufficient, since it dealt only with the surface
order.
It has been assumed that all syntactic constituents are organized around a head, X, where X can be any word
or morpheme category. X is expanded by the addition of a complement to form a larger unit.

In the GG literature, there are four major kinds of words, which form the skeleton of syntax, each
serving as the head of a phrase of the corresponding type:
1. Verbs (V) = VP
2. Prepositions (P) = PP
3. Nouns (N) = NP
4. Adjectives (A) = AP

(For a diagrammatic example use Figure 0.B.)


As outlined in Syntactic Structures, Transformational Grammar comprised three sections or
components: the phrase-structure component, the transformational component, and the
morphophonemic component. Each of these components consists of a set of rules operating upon a
certain input to yield a certain output.
To generate sentences, Transformational Grammar employs the following:
1. Phrase Structure Grammar (PSG) = Phrase structure grammars describe the syntactic structure of
sentences as constituent structures, i.e. as hierarchies of ordered constituents, through the use of rules
(PS-rules) which are capable not only of generating strings of linguistic elements, but also of providing
a constituent analysis of the strings. The rules in PSG are called phrase structure rules and are a set
of rewriting rules.
The Form of Phrase Structure Rules
A phrase structure grammar consists of a set of ordered rules known as rewrite rules, which are applied
stepwise. A rewrite rule has a single symbol on the left and one or more symbols on the right:
A → B + C
C → D
More than one symbol on the right constitutes a string. The arrow is read as 'is rewritten as', 'has as its
constituents', 'consists of', or 'is expanded as'. The plus sign is read as 'followed by', but it is often
omitted.
For example, the phrase structure rules:
1. S → NP + VP (Sentence → Noun Phrase + Verb Phrase)
2. NP → Art/Det + N (Noun Phrase → Determiner + Noun)
3. VP → V + NP (Verb Phrase → Verb + Noun Phrase)
4. VP → V
5. VP → V + PP
6. PP → P + NP
7. VP → V + CP
8. CP → C + S
These static, analytically descriptive rules can be interpreted as rewrite rules, in which the arrow
(→) is an instruction to rewrite whatever symbol appears to its left as the symbol or string of
symbols that appears to its right. The second appendix of
Syntactic Structures provides the following simplified example (p. 111):
(i) Sentence → NP + VP
(ii) VP → Verb + NP
(iii) Verb → Aux + V
(iv) NP → {NPsingular, NPplural}
(v) NPsingular → T + N + Ø
(vi) NPplural → T + N + S
(vii) Aux → C (M) (have + en) (be + ing)
(viii) T → the
(ix) N → man, ball, etc.
(x) V → hit, take, walk, read, etc.
(xi) M → will, can, may, shall, must
For example, the phrase structure rule S → NP + VP means that a sentence can be rewritten as a
noun phrase followed by a verb phrase. Phrase structure rules can also be represented by a tree diagram:
    A
   / \
  B   C
      |
      D
Two metaphors are used here. In the family-tree metaphor, B and C are daughters of A, and they are
sisters of each other; less often, A is referred to as the mother or parent of B and C. Also, in the tree
metaphor, A is seen as a branching node, as opposed to C, which is a non-branching node. In the
domination metaphor, a distinction is made between immediate domination and domination: a node
dominates everything below it (hence, A dominates B, C, and D); a node immediately dominates those
nodes for which there are no intervening nodes (hence, A immediately dominates B and C, but not
D). Finally, B and C form a constituent: a constituent is all and only the nodes dominated by a single
node, in this case, A.
Nodes which contain specific lexical items such as on, the, and beach will never themselves have
daughters; they mark the bottom end of the tree structure. Nodes of this type, which do not dominate
any other node, are called terminal nodes. Lexical items such as on, the, and beach are terminal
elements, and the sequence of terminal elements at the bottom of a tree (e.g. on the beach) is called
the terminal string. We say that a non-terminal node dominates all of its daughter nodes, the daughters
of its daughters, the daughters of its grand-daughters, etc. A mother immediately dominates its own
daughters.
Phrase structure rules also allow for choices. Optional choices are indicated with parentheses:
A → (B) C
This rule reads: A is expanded as optionally B and obligatorily C. In every rewrite rule, at least one
element must be obligatory. There may also be mutually exclusive choices of elements in a string;
these are indicated with curly braces:
A → {B, C}
This rule states that if you choose B, you can't choose C: you must choose either B or C, but
not both. Whether the mutually exclusive items are written on one line separated by commas or on
separate lines does not matter, as long as they occur within braces. The two types of choice can be
combined:
A → ({B, C}) D
In every phrase structure grammar, there must be an initial symbol, a first left-hand symbol, such as A
above. Thereafter, every symbol appearing on the left has already been introduced on the right-hand
side. The symbols that occur on the right, but never on the left, are the terminal symbols; another way
of defining them is that they occur at the bottom of a tree diagram. In our brief grammar above, B and
D are terminal symbols. They immediately dominate lexical items, or words. Phrase structure rules
account for the linear order of elements in a sentence in deep structure, as well as for the hierarchical
arrangement of sentence structure. They can also account for the infinite generating capacity of
language.
S: The man takes the books.
{Sentence, NP+VP, NP+Verb+NP, NP+Aux+V+NP, NPsingular+Aux+V+NP, NPsingular+Aux+V+NPplural,
T+N+Ø+Aux+V+NPplural, T+N+Ø+Aux+V+T+N+S, T+N+Ø+C+V+T+N+S, . . . ,
the+man+Ø+C+take+the+book+S}

(For diagrammatic example use Figure F and A.)


Branching = the descending linear connections which constitute the identity of a tree diagram.
Phrase structure rules which generate such trees are sometimes called branching rules. The S, the
first NP, and the VP in the diagram are branching nodes; the other nodes are non-branching. It has
sometimes been suggested that binary branching is the norm in a phrase marker. Since tree diagrams
indicate the phrases functioning as constituents, they are also called phrase markers; a tree diagram
is thus also called a phrase marker.
Lexical insertion (or the lexicon rule) is the substitution of the preterminal symbols (N, Adj, V, etc.)
in the deep structure with lexical formatives (i.e. words) from the lexicon.
PS trees represent three aspects of a speakers syntactic knowledge:
1. The linear order of the words in the sentence
2. The identification of the syntactic categories of words and groups of words
3. The hierarchical structure of the syntactic categories (e.g., an S is composed of an NP followed by a
VP, a VP is composed of a V that may be followed by an NP, and so on)
Two Main Divisions of Phrase Structure Grammar
1. A context-free grammar consists wholly of context-free rules, i.e. rules of the type
'Rewrite X as Y, regardless of context'.
2. A context-sensitive grammar contains some rules of the type 'Rewrite X as Y in the context of Z',
written A → B / C _ D, where the forward slash means 'in the context of' and the horizontal line indicates
the place in the structure where A (a single non-terminal symbol) is rewritten as B (a non-empty string of
symbols): in this case, between C and D (any strings of symbols).
**A grammar is said to be recursive if a category occurring on the left-hand side of a production also
appears on the right-hand side of a production.
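That definition can be checked mechanically. The sketch below is my own illustration (the rule encoding and the helper name `is_recursive` are assumptions, not from the text); it applies the definition to the PS rules listed earlier:

```python
# The PS rules from the list above, as (left-hand side, right-hand side) pairs.
RULES = [
    ("S", ["NP", "VP"]),
    ("NP", ["Det", "N"]),
    ("VP", ["V", "NP"]),
    ("VP", ["V", "CP"]),
    ("PP", ["P", "NP"]),
    ("CP", ["C", "S"]),
]

def is_recursive(rules):
    """A grammar is recursive if some category on the left-hand side of a
    production also occurs on the right-hand side of a production."""
    left_categories = {lhs for lhs, _ in rules}
    return any(sym in left_categories
               for _, rhs in rules
               for sym in rhs)

print(is_recursive(RULES))  # True
```

Here NP, VP, and CP all recur on right-hand sides, and CP → C S even reintroduces the start symbol S itself; it is this recursion that gives a finite rule set its infinite generating capacity.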
Remark: PSG with context-free rules is much less powerful than PSG containing context-sensitive
rules. Chomsky (1957), however, claimed that phrase structure was inadequate for human
languages without transformational rules: operations that can be performed on the elements of a
sentence other than expansion.
In addition to tree diagrams, there is a notational variant known as labeled bracketing. In this
system, the terminal symbols are placed on the line and the nodes dominating them are written as
subscripted labels on square brackets, which indicate constituents. Our brief grammar immediately
above would permit expansions such as the following with labeled bracketings:
A[B[D]C] or A[B[A[BC]D]C]
Note that there must be as many left-facing as right-facing brackets. While you are free to use this
notation if you wish, most people find tree diagrams much clearer.
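The equivalence between trees and labeled bracketings can be made concrete. The sketch below is my own illustration: it represents a tree as a (label, children) pair and uses the common [Label ...] bracket style rather than the subscript notation of the text.

```python
def bracket(node):
    """Convert a (label, children) tree into a labeled bracketing.
    A leaf is represented as a plain string."""
    if isinstance(node, str):
        return node
    label, children = node
    return "[" + label + " " + " ".join(bracket(c) for c in children) + "]"

# The PP "on the beach": PP -> P NP, NP -> Det N
tree = ("PP", [("P", ["on"]),
               ("NP", [("Det", ["the"]), ("N", ["beach"])])])

print(bracket(tree))  # [PP [P on] [NP [Det the] [N beach]]]
```

Reading the output from the outside in recovers exactly the domination relations of the tree, which is why the two notations are interchangeable.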
When we use a tree diagram format, we can think of it in two different ways. In one way, we can
simply treat it as a static representation of the structure of the sentence shown at the bottom of the
diagram. We could then propose that, for every single sentence in English, a tree diagram of this type
could be drawn. An alternative view is to treat the tree diagram as a dynamic format, in the sense that
it represents a way of generating not only that one sentence, but a very large number of other
sentences with similar structures.
This second approach is very appealing because it would enable us to generate a very large number of
sentences with what looks like a very small number of rules. These rules are called phrase structure
rules. As the name suggests, these rules state that the structure of a phrase of a specific type will
consist of one or more constituents in a particular order. We can use phrase structure rules to present
the information of the tree diagram in another format.
Problem: A phrase structure grammar which operates strictly at surface structure cannot
adequately capture a string of syntactic-semantic problems, e.g. discontinuous elements (Philip
called his brother up); ambiguity (the discovery of the student: the student was discovered, or the
student discovered something); or the paraphrase relationship between sentences, e.g. between
active and passive sentences. Generative grammar uses these difficulties in its own defense,
assigning sentences complex syntactic representations which are mediated by transformations.
2. Transformational Rules
Phrase structure rules account for much of our syntactic knowledge, but they do not account for the
fact that certain sentence types in the language relate systematically to other sentence types. The
standard way of describing these relationships is to say that the related sentences come from a
common underlying structure. Yes-no questions are a case in point, and they bring us back to a
discussion of auxiliaries. Auxiliaries are central to the formation of yes-no questions as well as certain
other types of sentences in English.
The very small set of phrase structure rules just described is a sample of what a more complex phrase
structure grammar of English, with many more parts, would look like. These rules can be treated as a
representation of the underlying or deep structures of sentences in English. One feature of these
underlying structures is that they will generate sentences with a fixed word order. That is convenient for
creating declarative forms (You will help Mary), but not for making interrogative forms, as used in
questions (Will you help Mary?). In making the question, we move one part of the structure to a
different position. This process is based on a movement rule.
In order to talk about this process, we need to expand our phrase structure rules to include an auxiliary
verb (Aux) as part of the sentence. This is illustrated in the first rewrite rule below. Auxiliary verbs
(sometimes described as helping verbs) take different forms in English, but one well-known set can
be included in the rudimentary lexical rule for Aux below. We also need a lexical rule that specifies the
basic forms of the verbs, shown as the third rewrite rule below.
S → NP Aux VP
Aux → {can, could, should, will, would}
V → {follow, help, see}
With these components, we can specify a simple movement rule that is involved in the creation of one
basic type of question in English:
NP Aux VP ⇒ Aux NP VP
This type of rule has a special symbol, the double arrow, and can be illustrated in the process of one
tree, on the right, being derived from the tree on the left.
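As a sketch (the list-of-constituents representation and the function name are my own assumptions), the movement rule NP Aux VP ⇒ Aux NP VP amounts to swapping the first two constituents:

```python
def yes_no_question(constituents):
    """Apply the movement rule NP Aux VP => Aux NP VP:
    front the auxiliary by swapping it with the subject NP."""
    np, aux, vp = constituents
    return [aux, np, vp]

declarative = ["you", "will", "help Mary"]   # NP  Aux  VP
question = yes_no_question(declarative)      # Aux NP   VP

words = " ".join(question)
print(words[0].upper() + words[1:] + "?")    # Will you help Mary?
```

The point of the rule is that the same three constituents appear in both structures; only their order differs, which is exactly what a movement transformation expresses.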
Transformations are essentially rules for relating one syntactic structure to another. A
transformation in Chomsky's model of transformational grammar is a formal operation which
mediates between the deep structure and the surface structure of sentences. Transformations
map the tree diagrams generated by phrase structure rules at deep structure onto derived tree
diagrams at surface structure. Stated in technical terms: transformations are operations from phrase
markers to phrase markers. Transformational rules differ from phrase structure rules in that
their operational domain is not restricted to individual nodes, but extends to the whole phrase structure
tree, which they modify according to precise conditions.
Functions of Transformations
1. Transformations can move elements from one position to another in a sentence (the
declarative-interrogative transformation).
2. Transformations can insert new elements into a structure (e.g. by in the active-passive
transformation).
3. Transformations can substitute one element in a structure for another:
Sam loves Sam.
Sam loves himself.
The second Sam in the first sentence is replaced by himself in the second. We call this the
reflexivization transformation.
4. Transformations can invert the order of elements in a structure. This is evident when a
declarative sentence is changed into a yes-no question.
5. Transformations can also delete elements in a structure. This is also called gapping.
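The substitution function (3) can be sketched in code. The fragment below is a toy illustration of my own: it handles only a flat subject-verb-object string, and the reflexive lexicon is an assumed mapping, not part of the theory.

```python
REFLEXIVES = {"Sam": "himself"}  # toy lexicon (assumed mapping)

def reflexivize(words):
    """Reflexivization sketch: if the object NP repeats the subject NP
    within the same clause, replace it with its reflexive pronoun,
    as in 'Sam loves Sam' -> 'Sam loves himself'."""
    subject, verb, obj = words
    if obj == subject and obj in REFLEXIVES:
        obj = REFLEXIVES[obj]
    return [subject, verb, obj]

print(" ".join(reflexivize(["Sam", "loves", "Sam"])))  # Sam loves himself
```

A string with a non-identical object, such as "Sam loves Pat", passes through unchanged, since the structural description of the rule is not met.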
Two Components of a Transformation
A. Structural analysis/description (SA/SD) = an analysis of a terminal string in terms of a
labelled bracketing. In transformational analysis, the SD identifies the input to a transformational rule:
it specifies which phrase-markers are to be affected by the rule, i.e. which will satisfy or meet the
conditions of the rule.
B. Structural change (SC) = the operations involved in applying a transformational rule, i.e.
the changes between the input and the output phrase-markers.
Two Types of Transformation
A. Obligatory transformations = rules that must apply at a given stage in a derivation, when their
structural description is met, if a well-formed sentence is to result, e.g. the rule which attaches affixes
to their base forms.
B. Optional transformations = rules that may apply at a certain stage in a derivation, but whose
application is not essential for the well-formedness of the sentence, e.g. the transformations from
positive to negative, active to passive, or declarative to interrogative.
The transformation of active into passive sentences involves reordering of the two noun phrases,
the insertion of new forms of the verb, and the agent marker by.
Passive (optional transformation):
S: The dog chased the cat.
Structural analysis: NP - Aux - V - NP (the dog - past - chase - the cat)
Structural change:
The dog - past - chase - the cat.
The cat - was - chased - by the dog.
X1 - X2 - X3 - X4 ⇒ X4 - X2+be+en - X3 - by+X1
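The structural change above can be applied directly, since it is just a rearrangement of the four factors X1-X4. The sketch below (the function name and the string encoding of morphemes are my own) leaves the morphophonemic spell-out aside:

```python
def passivize(x1, x2, x3, x4):
    """Passive transformation:
    X1 - X2 - X3 - X4  =>  X4 - X2+be+en - X3 - by+X1."""
    return [x4, x2 + "+be+en", x3, "by+" + x1]

# Structural analysis of 'The dog chased the cat': NP - Aux - V - NP
print(passivize("the dog", "past", "chase", "the cat"))
# ['the cat', 'past+be+en', 'chase', 'by+the dog']
```

Morphophonemic rules would then spell past+be+en+chase out as "was chased", giving: The cat was chased by the dog.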
Auxiliary Transformation (obligatory transformation):
S: Klaus opened the window.
Structural analysis: X - Af - v - Y (where Af is any C, or -en or -ing; v is any M or V, or have or be)
Structural change:
Klaus - ed - open - the window.
Klaus - opened - the window.
X1 - X2 - X3 - X4 ⇒ X1 - X3 - X2# - X4
(For diagrammatic example use Figure E.)
Restatement: Transformations are a set of rules applied to the deep structure to generate the surface
structure as a string of morphemes. They are indicated by a double arrow from left to right (⇒), meaning
that the deep structure on the left of the arrow can be transformed into the surface structure on the right
of the arrow. For example, the affix rule af + v ⇒ v + af means that an affix preceding a verb in the deep
structure is suffixed to that verb in the surface structure, e.g. ed + open ⇒ open + ed (opened). All
transformations are based on the deletion and insertion of constituents. Operations derived from
these are substitution (the deletion and insertion of different elements in the same place) and
permutation (the deletion of an element from one place and its insertion in another). There must be
more to TGG than this, or it would remain a notational variant of the syntax from which it arose.
One of the major conceptual innovations in the entire theory is the proposal that a sentence has not
just one structure, closely related to the way it is pronounced, but an additional abstract structure
(potentially very different from the superficial one), and intermediate structures between these two.
This is fundamental to all the analyses in the Chomskyan system.
3. Morphophonemic Rules
These are rules that guarantee the transfer of the abstract morphophonological structure (deep
structure) into the concrete phonetic realization of the surface structure: a body of formal rules
enabling the conversion of deep structures into surface structures.
Deep Structure and Surface Structure
According to Chomskyan theory, transformational-generative grammar involves two levels of
representation of the structure of sentences, known as the deep structure and the surface structure. The
deep structure is a more abstract form underlying the structure of a sentence and determines
its underlying meaning. It is represented by a phrase structure tree in which the
abstract relations between the words and phrases of a sentence are depicted in a hierarchical tree
diagram. The surface structure is the actual form of the sentence as used, which
determines the pronunciation of the individual lexical items and their sequence in the sentence.

In some theories, morphophonemics is seen as a separate level of linguistic structure intermediate


between grammar and phonology. In early versions of generative grammar, morphophonemic rules
were distinguished as a separate component in the derivation of sentences, whereby a terminal string
of morphemes would be converted into their correct phonological form.
When applied to a sentence, the transformational rules change the deep structure into a series of
morphemes, which can be grouped into units that belong together and spelled out from the
lexicon to generate the surface structure of the sentence, i.e. the form in which we say it. For example:
N + pres + have + en + arrive ⇒ N + have + pres + arrive + en
It is the morphophonemic rules which tell us how to group related units together and spell them out.
Thus N + (have + pres) + (arrive + en) is spelled out as: Ali has arrived.
Suppose we want to generate the following sentence: Has Ali written the letter?
The deep structure of this sentence is: Ali has written the letter.
The phrase structure rules give us the tree diagram in Figure G, and thus the string:
N + pres + have + en + V + art + N
By applying the affix transformational rule, af + V ⇒ V + af, we get:
N + pres + have + en + V + art + N ⇒ N + have + pres + V + en + art + N
By applying the question transformational rule, NP + Aux ⇒ Aux + NP, we get:
N + have + pres + V + en + art + N ⇒ have + pres + N + V + en + art + N
To get the surface structure of the sentence, we apply the morphophonemic rules to get:
(have + pres) + N + (V + en) + art + N
spelled as: Has Ali written the letter?
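The whole derivation can be walked through mechanically. The sketch below is my own toy implementation: the spell-out table is an assumed miniature lexicon, and the question rule assumes the subject is a single N with the auxiliary (and its affix) in the next two slots, which holds for this example but not in general.

```python
def affix_rule(morphs):
    """af + V => V + af: hop each affix (pres, past, en, ing)
    over the verbal element that follows it."""
    out = list(morphs)
    i = 0
    while i < len(out) - 1:
        if out[i] in {"pres", "past", "en", "ing"}:
            out[i], out[i + 1] = out[i + 1], out[i]
            i += 2          # skip past the hopped affix
        else:
            i += 1
    return out

def question_rule(morphs):
    """NP + Aux => Aux + NP: front the auxiliary with its affix
    over the subject (toy version: subject is morphs[0])."""
    n, aux, af = morphs[0], morphs[1], morphs[2]
    return [aux, af, n] + morphs[3:]

SPELL = {("have", "pres"): "has", ("write", "en"): "written"}

def spell_out(morphs):
    """Morphophonemic rules: group (stem, affix) pairs and spell them."""
    out, i = [], 0
    while i < len(morphs):
        pair = tuple(morphs[i:i + 2])
        if pair in SPELL:
            out.append(SPELL[pair]); i += 2
        else:
            out.append(morphs[i]); i += 1
    return out

deep = ["Ali", "pres", "have", "en", "write", "the", "letter"]
s1 = affix_rule(deep)           # Ali have pres write en the letter
s2 = question_rule(s1)          # have pres Ali write en the letter
print(" ".join(spell_out(s2)))  # has Ali written the letter
```

Each function corresponds to one component of the grammar: the affix and question rules are transformations on the morpheme string, and `spell_out` plays the role of the morphophonemic component.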
How Transformational Grammar and Structural Grammar Account for Ambiguous Sentences:
Paraphrase and Ambiguity

Paraphrase: several surface structures relate to one deep structure.
*John bought the book from Mary.
A) Mary sold the book to John.
B) The book was sold to John by Mary.

Ambiguity: one surface structure relates to several deep structures.
*Flying planes can be dangerous.
A) To fly planes.
B) Planes which are flying.
Ambiguity refers to a word or sentence which expresses more than one meaning. The meaning of a
sentence can usually be predicted from the meanings of the words it contains, yet it may be ambiguous
in the sense that the listener is not clear about what exactly the speaker is trying to do by uttering that
sentence. Hurford and Heasley state that a word or sentence is ambiguous when it has more than one
sense, and that a sentence is ambiguous if it has two (or more) paraphrases which are not themselves
paraphrases of each other.
It is very important for a reader or listener to recognize ambiguity because it affects comprehension.
For example, when reading:
They are washing clothes.
the reader or listener cannot decide whether it means:
1. They (people) are doing the action of washing.
2. They (the clothes) are clothes for washing.
Such ambiguous sentences can be accounted for and understood on the basis of basic sentence
patterns. The sentence above can be of two patterns:
S + V + O (first meaning)
S + Be + adj + N (second meaning)
There are still other types of ambiguous sentences which cannot be understood and accounted for by
structural grammar. For example:
John teaches the group singing.
According to the structuralists, this sentence may have one of two basic patterns:
N1 + V + N2 + N3 (John teaches singing to the group).
N1 + V + N2 (John teaches the singing of the group).
In addition to these two meanings, El-Natoor states that the transformationalists account for two
further deep structures (meanings) which the structuralists have failed to see:
1. John teaches the group. John is singing.
2. John teaches the group. The group is singing.
In the first deep structure singing modifies John, while in the second it modifies the group.
An ambiguous sentence is one which has a single surface structure but two or more deep
structures. For example, Time flies is one surface structure, but it has two deep
structures: (a) The time flies (N + V); (b) Time the flies! (V + N).
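The "Time flies" case can be pictured with a toy sketch. This is a hypothetical illustration, not standard parser output: the tuple format `("label", child, ...)` is an ad-hoc choice for representing the two deep-structure trees.

```python
# One surface string, "time flies", mapping onto two deep-structure trees.
surface = "time flies"

deep_structures = [
    # (a) N + V: "the time flies" -- time is the subject noun
    ("S", ("NP", ("N", "time")), ("VP", ("V", "flies"))),
    # (b) V + N: "time the flies!" -- time is an imperative verb
    ("S", ("VP", ("V", "time"), ("NP", ("N", "flies")))),
]

def leaves(tree):
    """Collect the words at the bottom of a tuple tree, left to right."""
    if isinstance(tree, str):
        return [tree]
    words = []
    for child in tree[1:]:
        words.extend(leaves(child))
    return words

# Both deep structures spell out as the same surface string:
for ds in deep_structures:
    assert " ".join(leaves(ds)) == surface
print("one surface structure, two deep structures")
```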
The notion of the surface structure and the deep structure which the transformationalists use is very
useful for understanding the message which the speaker or the writer wants to convey.
Unlike structuralists, transformationalists are able to distinguish between sentences which have
the same surface structure but different deep structures. Structural linguistics was mainly concerned
with near-surface identity for the purpose of interpreting the sentence, whereas transformational
linguistics analyzes surface identity in order to reveal underlying differences. Chomsky gives the
following examples:
1- John is eager to please.
2- John is easy to please.
The structuralists classify both sentences under the same basic pattern
S + Be + Adj + infinitive
because the surface structures of the two sentences are obviously identical.
Transformational Generative Grammar has an advantage over Structural Grammar in interpreting
such sentences, which share the same structure but differ in meaning (different deep structures).
Transformationalists say that although these sentences have the same surface structure, they have
different deep structures or meanings:
The first means: John is eager to please others (people).
The second means: It is easy for people to please John.
The two sentences cannot be transformed in the same way:
It is easy to please John.
*It is eager to please John.
Native speakers can normally detect the ambiguity of surface structures. This seems to imply that
they interpret surface structures by a process that relates them to the underlying deep structure. The
concept of basic patterns as defined by structural linguists is too limited to explain some very basic
aspects of human language. Transformational Grammar distinguishes between two aspects of a
speaker's language production. One aspect is the speaker's subconscious knowledge of a set of
internalized rules; this is called his competence. The other is the speaker's use of these rules when he
speaks; this is called his performance. Transformationalists believe that the language user is able to:
Make infinite use of finite means.
Distinguish grammatical, ungrammatical and nonsensical sentences.
Perceive ambiguity.
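The first of these abilities, making infinite use of finite means, can be illustrated with a toy recursive rule. The vocabulary here (S -> "Mary thinks that" + S) is an arbitrary choice for the sketch, not an example from the text.

```python
# A sketch of "infinite use of finite means": a single finite, recursive
# rule, S -> "Mary thinks that" + S, licenses unboundedly many sentences.
def embed(base, depth):
    """Embed a base sentence under 'Mary thinks that' depth times."""
    sentence = base
    for _ in range(depth):
        sentence = "Mary thinks that " + sentence
    return sentence

print(embed("John left", 0))  # John left
print(embed("John left", 2))  # Mary thinks that Mary thinks that John left
# No matter how large depth gets, the result is still a sentence:
# one finite rule, infinitely many products.
```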

How Transformational Grammar Avoids Generating Nonsensical Sentences

Transformational Grammar avoids generating the nonsensical sentences that can come out of the
phrase structure rules.
Example: S => N + V + N
This rule may wrongly generate the following sentence:
*The stone eats an apple.
The rule stated above allows us to substitute any noun for (N) and any verb for (V). Consequently, nonsensical sentences might be generated.
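Overgeneration by such a bare rule can be made concrete with a toy generator; the word lists below are illustrative. With no restrictions, any noun can fill either N slot, so the rule produces the nonsensical string as readily as the sensible one.

```python
import itertools

# Toy phrase structure rule S -> N + V + N with no selectional restrictions:
nouns = ["the child", "the stone"]
verbs = ["eats"]
objects = ["an apple"]

sentences = [f"{n} {v} {o}"
             for n, v, o in itertools.product(nouns, verbs, objects)]
for s in sentences:
    print(s)
# "the stone eats an apple" is generated just as freely as
# "the child eats an apple"
```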
To avoid generating such sentences, Transformational Grammar uses the following procedure:
Selectional Feature/Restriction/Rule
A term in generative grammar for a type of contextual feature, i.e. a syntactic feature which specifies
the conditions relating to where in a deep structure a lexical item can occur. Selectional features specify
the restrictions on the permitted combinations of lexical items within a given grammatical context.
These restrictions are stated with reference to the relevant inherent features in an adjacent or nearby
complex symbol (within the same structural unit, i.e. they must be clause-mates). In simple words,
these are rules that tell us which words we can use with which.
Two Types of Selectional Rules
A. Syntactic Selectional Rules: these focus on the syntactic features of words, for example, in the
verb category, features such as [+/- transitive] or [+/- reflexive].
B. Semantic Selectional Rules: these focus on the semantic features of words, for example, in the
noun category, [+/- animate], [+/- adult], [+/- male]:
man = [+human], [+male], [+adult]
child = [+human], [-adult]
For example, a verb which requires an animate subject noun phrase (cf. *the stone slept) would have
the restriction stated as part of its feature specification, e.g. as [+[+animate] ___].
Accordingly, we can say The child eats an apple but not *The stone eats an apple. As mentioned
above, the verb eat carries the selectional restriction that its subject must be [+animate].
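The feature check can be sketched as a small lookup. The feature dictionaries and the restriction table below are illustrative assumptions for the sketch, not a lexicon taken from the text.

```python
# Semantic selectional restriction sketch: the verb "eat" requires a
# [+animate] subject. Entries and feature names are illustrative.
LEXICON = {
    "child": {"human": True, "adult": False, "animate": True},
    "stone": {"animate": False},
}

SUBJECT_RESTRICTIONS = {"eat": {"animate": True}}

def licenses(verb, subject):
    """True if the subject's features satisfy the verb's subject restriction."""
    required = SUBJECT_RESTRICTIONS.get(verb, {})
    features = LEXICON.get(subject, {})
    return all(features.get(f) == v for f, v in required.items())

print(licenses("eat", "child"))  # True:  The child eats an apple.
print(licenses("eat", "stone"))  # False: *The stone eats an apple.
```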
