| Generative grammar | |
|---|---|
| Name | Generative Grammar |
| Field | Linguistics |
| Founded | Mid-1950s |
| Founder | Noam Chomsky |
| Key works | Syntactic Structures, Aspects of the Theory of Syntax |
| Influenced | Cognitive science, Philosophy of language, Computer science |
Generative grammar is a linguistic theory that aims to model the implicit knowledge, or competence, that enables a speaker to produce and understand a potentially infinite number of grammatical sentences in their language. Pioneered by Noam Chomsky in the 1950s, it posits that this knowledge is rooted in an innate, biologically determined language faculty with a universal core structure. The theory uses formal, explicit rules to generate all and only the grammatical sentences of a language, shifting linguistics toward a mentalistic and cognitive-science framework.
The central goal is to construct a formal system of rules that can generate the set of well-formed sentences in a given language, such as English or Japanese. This approach treats linguistics as a branch of cognitive psychology, seeking to uncover the mental representations underlying language use. It stands in contrast to the behaviorism prevalent in mid-20th-century psychology, as championed by figures like B.F. Skinner, and to earlier structuralist approaches in linguistics associated with Leonard Bloomfield. Key foundational texts include Chomsky's Syntactic Structures and Aspects of the Theory of Syntax, which established the field's core agenda.
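The idea of a rule system that generates exactly the well-formed sentences of a language can be made concrete with a context-free grammar. The following sketch uses a deliberately tiny, invented fragment of English (the grammar, lexicon, and function names here are illustrative, not drawn from any published analysis); because the toy grammar has no recursion, the generated set is finite and can be enumerated.

```python
import itertools

# A toy context-free phrase structure grammar for a tiny English fragment.
# Nonterminals map to lists of alternative expansions; anything not listed
# as a key is treated as a terminal word.
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"], ["V"]],
    "Det": [["the"], ["a"]],
    "N":   [["dog"], ["cat"]],
    "V":   [["sees"], ["sleeps"]],
}

def generate(symbol="S"):
    """Yield every terminal string (as a word list) derivable from `symbol`."""
    if symbol not in GRAMMAR:          # terminal: yield the word itself
        yield [symbol]
        return
    for expansion in GRAMMAR[symbol]:
        # Combine the yields of each daughter symbol in order.
        for parts in itertools.product(*(list(generate(s)) for s in expansion)):
            yield [word for part in parts for word in part]

sentences = sorted(" ".join(words) for words in generate())
```

The grammar "generates all and only" its grammatical strings in the sense used in the article: `"the dog sees a cat"` is derivable, while a string like `"dog the sleeps"` is not.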
The framework is built upon the distinction between competence and performance, where competence is the idealized speaker-hearer's knowledge. A primary hypothesis is the existence of Universal Grammar, an innate set of principles and parameters that constrain possible human languages. The theory models language through a series of interrelated components, including a syntactic component that builds phrase structures, often visualized through tree diagrams. Early models, like the Standard Theory, included transformational rules that operated on deep structures to produce surface structures, a concept further developed in the Extended Standard Theory and later the Principles and Parameters approach.
Fundamental mechanisms include phrase structure rules, which generate initial hierarchical structures, and transformations like wh-movement and passivization. The X-bar theory provided a more constrained template for phrase structure across all categories. The Government and Binding Theory introduced modules like theta theory, case theory, and binding theory to explain syntactic phenomena. Later, the Minimalist Program sought to reduce the apparatus to the most computationally efficient operations, such as Merge and Agree, driven by interface conditions with the conceptual-intentional system and the sensory-motor system.
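The Minimalist operation Merge mentioned above combines two syntactic objects into a single labeled constituent. The sketch below is a simplification under stated assumptions: Merge is binary, the first argument is taken to be the head and projects its label, and the determiner heads the nominal (the DP analysis); the data representation and function names are my own illustrative choices, not a standard implementation.

```python
# Lexical items are (category, word) pairs; merged objects are
# (label, head, complement) triples. Under bare phrase structure,
# the head projects its category as the label of the new object.

def merge(head, complement):
    """Merge two syntactic objects; the head's category projects."""
    return (head[0], head, complement)

def bracketed(obj):
    """Render a syntactic object as a labeled bracketing."""
    if len(obj) == 2:                  # lexical item: (category, word)
        category, word = obj
        return f"[{category} {word}]"
    label, head, complement = obj      # merged object
    return f"[{label} {bracketed(head)} {bracketed(complement)}]"

# Build "sees the cat" bottom-up: Merge(the, cat), then Merge(sees, DP).
dp = merge(("D", "the"), ("N", "cat"))
vp = merge(("V", "sees"), dp)
```

Repeated application of this one operation yields the hierarchical, binary-branching structures that phrase structure rules and X-bar templates describe, which is the sense in which Minimalism reduces the apparatus to a single structure-building step.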
The field originated with Chomsky's critique of B.F. Skinner's Verbal Behavior and the publication of Syntactic Structures in 1957. The Standard Theory (1960s) was followed by the Extended Standard Theory and Revised Extended Standard Theory, which incorporated semantic interpretation. The 1980s saw the rise of the highly influential Principles and Parameters framework, exemplified by Government and Binding Theory. This period also spawned rival generative models, including Generalized Phrase Structure Grammar, developed by Gerald Gazdar, and Head-Driven Phrase Structure Grammar, associated with Carl Pollard and Ivan Sag. The current dominant paradigm is Chomsky's Minimalist Program, initiated in the 1990s.
Its influence extends far beyond theoretical linguistics, fundamentally shaping modern cognitive science and the philosophy of mind, impacting thinkers like Jerry Fodor. In computer science, it informed early work in natural language processing and the design of programming languages. It has provided a formal framework for the study of language acquisition, supporting the poverty of the stimulus argument. The theory has also been applied in comparative linguistics to analyze diverse languages from Navajo to Arabic, and in neurolinguistics, where researchers like Angela Friederici investigate its biological correlates.
Major criticisms come from proponents of functional linguistics, such as Michael Halliday, and cognitive linguistics, including figures like George Lakoff and Ronald Langacker, who argue it neglects language use and semantic motivation. Some philosophers, including Hilary Putnam and W.V.O. Quine, have challenged its nativist assumptions. Within the generative tradition, debates have centered on the adequacy of the Principles and Parameters model, the direction of the Minimalist Program, and the empirical status of Universal Grammar. Alternative generative frameworks, like Lexical Functional Grammar developed by Joan Bresnan and Ronald Kaplan, offer competing formal accounts of syntactic phenomena.
Category:Linguistics Category:Generative linguistics Category:Noam Chomsky