Generative grammar is a theory of language generally said to have made its appearance in 1957, with Noam Chomsky’s Syntactic Structures. It is often contrasted with structural or taxonomic grammars. While the aim of structural grammar was to account for all the data in a given corpus, the aim of generative grammar was to generate any sentence in a given language, including those not yet uttered. Thus, the goal shifted from accounting for a body of data to modeling the mental functioning that allows humans to use language creatively. Put in Chomsky’s own terms, the goal shifted from the description of observed language behavior to its explanation in terms of the mental states of language users.
In its early phase, generative grammar used a large number of transformational rules to derive different surface sentences from the same underlying sentence. For example, the basic sentence Mary saw the cat can be transformed by different rules into a passive (The cat was seen by Mary), a yes–no question (Did Mary see the cat?), and various wh-questions (Who saw the cat? What did Mary see?). The grammars linguists tried to write with these rules were enormous and yet still incomplete, and the proliferation of rules made it difficult to explain how rapidly and effortlessly children, by about the age of 3, acquire the grammar of their language. As a corrective, generativists reduced the number of rules by developing general ones, such as merge, which combines smaller elements into larger ones, and move, which accounts for the position of what in What did Mary see?
For a simple illustration of the power of generative grammar, take the noun phrase (NP) the gray cat. This phrase is generated by first merging the adjective gray with the noun cat to yield gray cat, which is then merged with the determiner the. Written as traditional rewrite rules:
NP → Det N′
N′ → Adj N
Det → the
Adj → gray
N → cat
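The rewrite rules above can be sketched as a tiny recursive generator. This is a minimal illustration of rule expansion, not any particular linguist's implementation:

```python
# Minimal sketch of the rewrite rules NP -> Det N', N' -> Adj N,
# with one word per lexical category.
grammar = {
    "NP": [["Det", "N'"]],
    "N'": [["Adj", "N"]],
    "Det": [["the"]],
    "Adj": [["gray"]],
    "N":   [["cat"]],
}

def expand(symbol):
    """Recursively rewrite a symbol until only words remain."""
    if symbol not in grammar:          # terminal: an actual word
        return [[symbol]]
    results = []
    for production in grammar[symbol]:
        # expand each daughter, then combine the alternatives
        combos = [[]]
        for daughter in production:
            combos = [left + right
                      for left in combos
                      for right in expand(daughter)]
        results.extend(combos)
    return results

print([" ".join(words) for words in expand("NP")])  # ['the gray cat']
```

With one word per category the grammar generates exactly one NP; the payoff comes when the lexical lists grow.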
Adding lexical items to the N, Adj, and Det lists very rapidly increases the number of NPs that can be generated. For example, adding the indefinite determiner a(n), the nouns dog and goat, and the adjectives black and white increases the number of potential NPs from one to 18.
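The arithmetic behind the jump from one NP to 18 is simple combinatorics: 2 determiners × 3 adjectives × 3 nouns. A short sketch, using the lexical items named in the text:

```python
# The enlarged lexicon from the text: 2 determiners, 3 adjectives, 3 nouns.
determiners = ["the", "a"]
adjectives  = ["gray", "black", "white"]
nouns       = ["cat", "dog", "goat"]

# Every NP is built by merging Adj + N, then Det + (Adj N).
nps = [f"{det} {adj} {noun}"
       for det in determiners
       for adj in adjectives
       for noun in nouns]

print(len(nps))   # 18
print(nps[0])     # 'the gray cat'
```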
More recently, generative linguists have incorporated principles and parameters. Principles are invariant; the projection principle, for example, requires that any properties of the lexical head of a phrase apply to the whole phrase. So, if the noun woman is [+human], the NP the first woman in space is also [+human].
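The projection principle can be sketched as feature inheritance from the lexical head to the phrase. The feature encoding below is a hypothetical illustration, not a standard formalism:

```python
# Hedged sketch of the projection principle: properties of the lexical
# head (here a [+human] feature on the noun) project to the whole phrase.
def project(head_features, phrase_words):
    """Pair a phrase with a copy of its head's features."""
    return {"phrase": " ".join(phrase_words), "features": dict(head_features)}

woman = {"human": True}   # lexicon entry: woman is [+human]
np = project(woman, ["the", "first", "woman", "in", "space"])
print(np["features"]["human"])   # True: the whole NP is [+human]
```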
Parameters require language learners to make a choice. For example, in English, verbs precede their objects: eat + potatoes. In some languages, like Aymara, objects come first: chhuq (potatoes) + man-q’aha (eat). This illustrates the head parameter: the heads of phrases may either precede or follow their complements. Once children “set” this parameter, they know other facts about their language:
- Verb phrase: eat potatoes
- Prepositional phrase: in the house
- Adjective phrase: afraid of spiders
- Noun phrase: president of the United States
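The head parameter amounts to a single binary switch governing word order across all these phrase types. A sketch, with the phrase data taken from the examples above:

```python
# Sketch of the head parameter: one setting (head-initial vs head-final)
# fixes the order of head and complement across phrase types.
def linearize(head, complement, head_initial=True):
    """Order head and complement according to the parameter setting."""
    return f"{head} {complement}" if head_initial else f"{complement} {head}"

phrases = [("eat", "potatoes"),                    # verb phrase
           ("in", "the house"),                    # prepositional phrase
           ("afraid", "of spiders"),               # adjective phrase
           ("president", "of the United States")]  # noun phrase

# English is head-initial: the head precedes its complement.
print([linearize(h, c, head_initial=True) for h, c in phrases])
# A head-final setting (as in Aymara) puts the complement first:
print(linearize("eat", "potatoes", head_initial=False))  # 'potatoes eat'
```

One setting thus predicts the order in all four phrase types at once, which is what lets a child who has "set" the parameter know facts about phrases she has never heard.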
One of the claims of generative grammarians is that because language is innate, the forms that human languages can take are sharply constrained. This claim created some tension between these linguists and followers of the earlier tradition in linguistic anthropology that took languages to be infinitely variable and entirely determined by culture.
- Chomsky, N. (1957). Syntactic structures. The Hague: Mouton.
- Chomsky, N. (1995). The minimalist program. Cambridge: MIT Press.
- Cook, V., & Newson, M. (1996). Chomsky’s universal grammar: An introduction (2nd ed.). Oxford: Blackwell.
- Lightfoot, D. (1982). The language lottery: Toward a biology of grammars. Cambridge: MIT Press.