Universal grammar (UG) is a theory in linguistics, usually credited to Noam Chomsky, proposing that the ability to learn grammar is hard-wired into the brain. It is sometimes known as 'mental grammar', as opposed to other kinds of 'grammar', e.g. prescriptive, descriptive, and pedagogical. The theory suggests that linguistic ability manifests itself without being taught (see the poverty of the stimulus argument), and that there are properties that all natural human languages share. It is a matter of observation and experimentation to determine precisely which abilities are innate and which properties are shared by all languages.
The theory of Universal Grammar proposes that if human beings are brought up under normal conditions (not conditions of extreme sensory deprivation), then they will always develop language with a certain property X (e.g., distinguishing nouns from verbs, or distinguishing function words from lexical words). As a result, property X is considered to be a property of universal grammar in the most general sense (here not capitalized).
There are theoretical senses of the term Universal Grammar as well (here capitalized). The most general of these would be that Universal Grammar is whatever properties of a normally developing human brain cause it to learn languages that conform to universal grammar (the non-capitalized, pretheoretical sense). Using the above examples, Universal Grammar would be the innate property of the human brain that causes it to posit a difference between nouns and verbs whenever presented with linguistic data.
As Chomsky puts it, "Evidently, development of language in the individual must involve three factors: (1) genetic endowment, which sets limits on the attainable languages, thereby making language acquisition possible; (2) external data, converted to the experience that selects one or another language within a narrow range; (3) principles not specific to FL." [FL is the faculty of language, whatever properties of the brain cause it to learn language.] So (1) is Universal Grammar in the first theoretical sense, (2) is the linguistic data to which the child is exposed, and (3) comprises general principles not specific to the faculty of language.
Occasionally, aspects of Universal Grammar seem describable in terms of general facts about cognition. For example, if a predisposition to categorize events and objects as different classes of things is part of human cognition generally, and directly results in nouns and verbs showing up in all languages, then this aspect of Universal Grammar is not specific to language but part of cognition at large. To distinguish properties of languages that can be traced to other facts about cognition from properties that cannot, the abbreviation UG* can be used. Chomsky often uses "UG" for whatever aspects of the human brain cause language to be the way it is (i.e., Universal Grammar in the sense used here); in this discussion, UG* denotes those aspects that are furthermore specific to language. Thus UG, as Chomsky uses it, is simply an abbreviation for Universal Grammar, whereas UG* as used here is a subset of Universal Grammar.
In the same article, Chomsky casts the theme of a larger research program in terms of the following question: "How little can be attributed to UG while still accounting for the variety of I-languages attained, relying on third factor principles?" (I-languages meaning internal languages, the brain states that correspond to knowing how to speak and understand a particular language, and third factor principles meaning (3) in the previous quote).
Chomsky has speculated that UG might be extremely simple and abstract, for example only a mechanism for combining symbols in a particular way, which he calls Merge. To see that Chomsky does not use the term "UG" in the narrow sense UG* suggested above, consider the following quote from the same article:
"The conclusion that Merge falls within UG holds whether such recursive generation is unique to FL or is appropriated from other systems."
That is, Merge is part of UG because it causes language to be the way it is, is universal, and is not part of (2) (the environment) or (3) (general properties independent of genetics and environment). Merge is part of Universal Grammar whether it is specific to language or whether, as Chomsky suggests, it is also used for example in mathematical thinking.
The distinction is important because there is a long history of argument about UG*, whereas most people working on language agree that there is Universal Grammar. Many people assume that Chomsky means UG* when he writes UG (and in some cases he might actually mean UG*, though not in the passage quoted above).
Some students of universal grammar study a variety of grammars to abstract generalizations called linguistic universals, often in the form of "If X holds true, then Y occurs." These have been extended to a variety of traits, such as the phonemes found in languages, what word orders languages choose, and why children exhibit certain linguistic behaviors.
Later linguists who have influenced this theory include Noam Chomsky and Richard Montague, who developed their versions of the theory as they considered the argument from the poverty of the stimulus, which they took to pose a problem for constructivist approaches to linguistic theory. The application of the idea of Universal Grammar to the area of second language acquisition (SLA) is represented mainly by the McGill linguist Lydia White.
Most syntacticians generally concede that there are parametric points of variation between languages, although heated debate occurs over whether UG constraints are essentially universal due to being "hard-wired" (Chomsky's Principles and Parameters approach), a logical consequence of a specific syntactic architecture (the Generalized Phrase Structure approach) or the result of functional constraints on communication (the functionalist approach).
In an article titled "The Faculty of Language: What Is It, Who Has It, and How Did It Evolve?", Hauser, Chomsky, and Fitch present the three leading hypotheses for how language evolved and brought humans to the point where we have a Universal Grammar.
Hypothesis 1 states that FLB (the Faculty of Language in the broad sense) is strictly homologous to animal communication. This means that homologous aspects of the Faculty of Language exist in non-human animals.
Hypothesis 2 states that FLB "is a derived, uniquely human adaptation for language". On this hypothesis, individual traits were subject to natural selection and came to be highly specialized in humans.
Hypothesis 3 states that only FLN (the Faculty of Language in the narrow sense) is unique to humans. On this hypothesis, while the mechanisms of FLB are present in both humans and non-human animals, the computational mechanism of recursion evolved recently and solely in humans. This is the hypothesis that most closely aligns with the typical theory of Universal Grammar championed by Chomsky.
The idea of a universal grammar can be traced back to Roger Bacon's observation that all languages are built upon a common grammar, even though it may undergo accidental variations, and the 13th century speculative grammarians who, following Bacon, postulated universal rules underlying all grammars. The concept of a universal grammar or language was at the core of the 17th century projects for philosophical languages. There is a Scottish school of universal grammarians from the 18th century, to be distinguished from the philosophical language project, which includes authors such as James Beattie, Hugh Blair, James Burnett, James Harris, and Adam Smith. The article on "Grammar" in the first edition of the Encyclopædia Britannica (1771) contains an extensive section titled "Of Universal Grammar".
During the early 20th century, in contrast, language was usually understood from a behaviourist perspective, suggesting that language learning, like any other kind of learning, could be explained by a succession of trials, errors, and rewards for success. In other words, children learned their mother tongue by simple imitation, listening to and repeating what adults said.
For example, when a child says "milk" and the mother smiles and gives her some as a result, the child finds this outcome rewarding, which enhances the child's language development.
Chomsky argued that the human brain contains a limited set of rules for organizing language. This implies in turn that all languages have a common structural basis; the set of rules is what is known as universal grammar.
Chomsky has stated, "I think, yet the world thinks in me", exemplifying his belief that, since humans are natural beings who have undergone evolution, Universal Grammar is a biological evolutionary trait common to all humans.
Speakers proficient in a language know which expressions are acceptable in their language and which are unacceptable. The key puzzle is how speakers come to know these restrictions, since expressions that violate them are not present in the input, let alone marked as such. Chomsky argued that this poverty of stimulus means Skinner's behaviorist perspective cannot explain language acquisition. The absence of negative evidence (evidence that an expression is part of a class of ungrammatical sentences in one's language) is the core of his argument. For example, in English one cannot relate a question word like what to a predicate within a relative clause (e.g., *"What did John meet a man who sold?").
Such expressions are not available to language learners: they are, by hypothesis, ungrammatical. Speakers of the local language do not use them, nor do they point them out to language learners as unacceptable. Universal grammar offers a solution to the poverty of the stimulus problem by making certain restrictions universal characteristics of human languages. Language learners are consequently never tempted to generalize in an illicit fashion.
The presence of creole languages is sometimes cited as further support for this theory, especially by Bickerton's controversial language bioprogram theory. Creoles are languages that develop when different societies come together and are forced to devise their own system of communication. The system used by the original speakers is typically an inconsistent mix of vocabulary items known as a pidgin. As these speakers' children begin to acquire their first language, they use the pidgin input to effectively create their own original language, known as a creole. Unlike pidgins, creoles have native speakers and make use of a full grammar.
According to Bickerton, the idea of universal grammar is supported by creole languages because certain features are shared by virtually all of these languages. For example, their default point of reference in time (expressed by bare verb stems) is not the present moment, but the past. Using pre-verbal auxiliaries, they uniformly express tense, aspect, and mood. Negative concord occurs, but it affects the verbal subject (as opposed to the object, as it does in languages like Spanish). Another similarity among creoles is that questions are created simply by changing a declarative sentence's intonation, not its word order or content.
However, extensive work by Carla Hudson-Kam and Elissa Newport suggests that creole languages may not support a universal grammar, as has sometimes been supposed. In a series of experiments, Hudson-Kam and Newport looked at how children and adults learn artificial grammars. Notably, they found that children tend to ignore minor variations in the input when those variations are infrequent, and reproduce only the most frequent forms. In doing so, they tend to standardize the language that they hear around them. Hudson-Kam and Newport hypothesize that in a pidgin situation (and in the real-life situation of a deaf child whose parents were disfluent signers), children systematize the language they hear based on the probability and frequency of forms, and not on the basis of a universal grammar, as has been suggested. Further, it seems unsurprising that creoles would share features with the languages they are derived from and thus look similar "grammatically".
Many adherents of Universal Grammar argue against the concept of relexification, which holds that a language replaces its lexicon almost entirely with that of another. Such an account runs counter to the universalist idea of a Universal Grammar, which posits an innate grammar.
While the majority of linguists accept universal grammar, a few reject the theory.
Geoffrey Sampson maintains that universal grammar theories are not falsifiable and are therefore pseudoscientific. He argues that the grammatical "rules" linguists posit are simply post-hoc observations about existing languages rather than predictions about what is possible in a language. Similarly, Jeffrey Elman argues that the unlearnability of languages assumed by Universal Grammar rests on a too-strict, "worst-case" model of grammar that is not in keeping with any actual grammar. In keeping with these points, James Hurford argues that the postulate of a language acquisition device (LAD) essentially amounts to the trivial claim that languages are learnt by humans, and thus that the LAD is less a theory than an explanandum looking for theories.
Morten Christiansen and Nick Chater have argued that the relatively fast-changing nature of language would prevent the slower-changing genetic structures from ever catching up, undermining the possibility of a genetically hard-wired universal grammar. Instead of an innate Universal Grammar, they claim, "apparently arbitrary aspects of linguistic structure may result from general learning and processing biases deriving from the structure of thought processes, perceptuo-motor factors, cognitive limitations, and pragmatics."
Hinzen has summarized the most common criticisms of Universal Grammar.
Other researchers have come to some of the same conclusions as Hinzen. Christiansen and Chater note that there was no stable environment across populations, cultures, and languages in which a language acquisition gene could have adapted. Instead, they focus on the relationship between language and the learner, claiming that language has been shaped to fit the human brain, attributing apparently arbitrary aspects of linguistic structure to general learning and processing biases (p. 489).
In addition, it has been suggested that people learn probabilistic patterns of word distributions in their language, rather than hard-and-fast rules (see Distributional hypothesis). For example, children overgeneralize the past-tense marker -ed to irregular verbs, producing forms like goed and eated, and correct these errors over time. It has also been proposed that the poverty of the stimulus problem can be largely avoided if we assume that children employ similarity-based generalization strategies in language learning, generalizing about the usage of new words from similar words that they already know how to use.
Language acquisition researcher Michael Ramscar has suggested that when children erroneously expect an ungrammatical form that then never occurs, the repeated failure of the expectation serves as a form of implicit negative feedback that allows them to correct overgeneralizations like goed to went over time. This implies that word learning is a probabilistic, error-driven process, rather than a process of fast mapping, as many nativists assume.
In the domain of field research, the Pirahã language is claimed to be a counterexample to the basic tenets of Universal Grammar. This research has been led by Daniel Everett. Among other things, the language is alleged to lack all evidence for recursion, including embedded clauses, as well as quantifiers and color terms. According to Everett, the Pirahã exhibit these linguistic gaps not because they are simple-minded, but because their culture, which emphasizes concrete matters in the present and lacks creation myths and traditions of art-making, does not require them. Some other linguists have argued, however, that some of these properties have been misanalyzed, and that others are actually expected under current theories of Universal Grammar. Other linguists have attempted to reassess Pirahã to determine whether it does indeed use recursion. In a corpus analysis of the Pirahã language, linguists failed to disprove Everett's arguments against Universal Grammar and the lack of recursion in Pirahã, but they also stated that there was "no strong evidence for the lack of recursion either" and that there may be "evidence of recursive structure".
Daniel Everett has gone so far as to claim that universal grammar does not exist. In his words, "universal grammar doesn't seem to work, there doesn't seem to be much evidence for [it]. And what can we put in its place? A complex interplay of factors, of which culture, the values human beings share, plays a major role in structuring the way that we talk and the things that we talk about." Michael Tomasello, a developmental psychologist, also supports this claim, arguing that "although many aspects of human linguistic competence have indeed evolved biologically, specific grammatical principles and constructions have not. And universals in the grammatical structure of different languages have come from more general processes and constraints of human cognition, communication, and vocal-auditory processing, operating during the conventionalization and transmission of the particular grammatical constructions of particular linguistic communities."