Minimalist program
In linguistics, the minimalist program is a major line of inquiry that has been developing inside generative grammar since the early 1990s, starting with a 1993 paper by Noam Chomsky.
Following Imre Lakatos's distinction, Chomsky presents minimalism as a program, understood as a mode of inquiry that provides a conceptual framework which guides the development of linguistic theory. As such, it is characterized by a broad and diverse range of research directions. For Chomsky, there are two basic minimalist questions—What is language? and Why does it have the properties it has?—but the answers to these two questions can be framed in any theory.
Conceptual framework
Goals and assumptions
Minimalism is an approach developed with the goal of understanding the nature of language. It models a speaker's knowledge of language as a computational system with one basic operation, namely Merge. Merge combines expressions taken from the lexicon in a successive fashion to generate representations that characterize I-language, understood to be the internalized, intensional knowledge state as represented in individual speakers. By hypothesis, I-language—also called universal grammar—corresponds to the initial state of the human language faculty in individual human development.
Minimalism is reductive in that it aims to identify which aspects of human language—as well as the computational system that underlies it—are conceptually necessary. This is sometimes framed as questions relating to perfect design and optimal computation. According to Chomsky, a human natural language is not optimal when judged on how it functions, since it often contains ambiguities, garden paths, and the like. However, it may be optimal for interaction with the systems that are internal to the mind.
Such questions are informed by a set of background assumptions, some of which date back to the earliest stages of generative grammar:
- Language is a form of cognition. There is a language faculty that interacts with other cognitive systems; this accounts for why humans acquire language.
- Language is a computational system. The language faculty consists of a computational system whose initial state contains invariant principles and parameters.
- Language acquisition consists of acquiring a lexicon and fixing the parameter values of the target language.
- Language generates an infinite set of expressions, each given as a sound-meaning pair.
- Syntactic computation interfaces with phonology: π corresponds to phonetic form, the interface with the articulatory-perceptual performance system, which includes articulatory speech production and acoustic speech perception.
- Syntactic computation interfaces with semantics: λ corresponds to logical form, the interface with the conceptual-intentional performance system, which includes conceptual structure and intentionality.
- Syntactic computations are fully interpreted at the relevant interface: π and λ are interpreted at the PF and LF interfaces as instructions to the A-P and C-I performance systems.
- Some aspects of language are invariant. In particular, the computational system and LF are invariant.
- Some aspects of language show variation. In particular, variation reduces to Saussurean arbitrariness, parameters and the mapping to PF.
- The theory of grammar meets the criterion of conceptual necessity; this is the Strong Minimalist Thesis introduced by Chomsky. Consequently, language is an optimal association of sound with meaning; the language faculty satisfies only the interface conditions imposed by the A-P and C-I performance systems; PF and LF are the only linguistic levels.
Strong minimalist thesis
Within minimalism, economy—recast in terms of the strong minimalist thesis (SMT)—has acquired increased importance. The 2016 book entitled Why Only Us—co-authored by Noam Chomsky and Robert Berwick—defines the strong minimalist thesis as follows:
Under the strong minimalist thesis, language is a product of inherited traits as developmentally enhanced through intersubjective communication and social exposure to individual languages. This reduces to a minimum the "innate" component of the language faculty, which has been criticized over many decades and is separate from the developmental psychology component.
Intrinsic to the syntactic model is the fact that social and other factors play no role in the computation that takes place in narrow syntax, which Chomsky, Hauser and Fitch call the faculty of language in the narrow sense, as distinct from the faculty of language in the broad sense. Thus, narrow syntax concerns itself only with interface requirements, also called legibility conditions. The SMT can be restated as follows: syntax, narrowly defined, is a product of the requirements of the interfaces and nothing else. This is what is meant by the claim that language is an optimal solution to legibility conditions.
Interface requirements force deletion of features that are uninterpretable at a particular interface, a necessary consequence of Full Interpretation. A PF object must consist only of features that are interpretable at the articulatory-perceptual interface; likewise, an LF object must consist only of features that are interpretable at the conceptual-intentional interface. The presence of an uninterpretable feature at either interface will cause the derivation to crash.
Narrow syntax proceeds as a set of operations—Merge, Move and Agree—carried out upon a numeration with the sole aim of removing all uninterpretable features before being sent via Spell-Out to the A-P and C-I interfaces. The result of these operations is a hierarchical syntactic structure that captures the relationships between the component features.
Technical innovations
The exploration of minimalist questions has led to several radical changes in the technical apparatus of transformational generative grammatical theory. Some of the most important are:
- the elimination of the distinction between deep structure and surface structure in favour of a derivational approach
- the elimination of X-bar theory in favour of bare phrase structure
- the elimination of indexation in favour of Move or Agree
- the elimination of the notion of government in favour of feature-checking
- the idea that feature-checking—which matches interpretable and uninterpretable features, and subsequently deletes the latter—might be responsible for all structure-building operations, including Merge, Move, and Agree
- the idea that syntactic derivations proceed by clearly delineated stages called "phases"
- the specification that there are exactly two points where syntax interacts with other components: a "spell-out" point between syntax and the interface with phonetic form, and an additional point of interaction with logical form
Basic operations
Operations and the Y-model
Minimalist theorizing assumes a derivational model of syntax. While it inherits and adapts concepts from some of its predecessors, it departs from them in significant ways. Minimalism rejects earlier proposals that syntactic structure is generated through successive levels of representation and instead proposes that it is built piecemeal by recursive, combinatorial operations, culminating in distinct interface levels.
Lexical items are drawn from the lexicon and placed in a numeration, the set of lexical items to be used in the derivation. These are assembled by the operations Merge and Move. Recent theorizing proposes that agreement and movement are mediated by the operation Agree. At some point, the derivation is spelled out to the interface levels, Logical Form (LF) and Phonetic Form (PF), splitting the derivation into two branches. LF and PF feed the cognitive modules responsible for determining meaning and externalizing speech, respectively. This model of the grammar is referred to as the Y-model or T-model because the divergence of the derivation after spell-out resembles an inverted letter Y in diagrams of the model.
Operations on the LF and PF branches are fed by Spell Out. Operations on the LF branch are not "visible" on the PF branch, and vice versa. This means that any movement that occurs on the LF branch will be covert and not be phonologically represented, and operations on the PF branch will not directly affect the meaning of an utterance. By hypothesis, the operations on the LF branch are distinct from those on the PF branch. Proposals about what operations are available on each branch are a matter of debate. For example, Distributed Morphology proposes a wide array of operations on the PF branch that are responsible for deriving morphological structure from syntactic structures.
Merge
In its original formulation, Merge is a function that takes two objects and merges them into an unordered set with a label, either α or β. In more recent treatments, the possibility of the derived syntactic object being un-labelled is also considered; this is called "simple Merge".
In the version of Merge which generates a label, the label identifies the properties of the phrase. Merge always combines two syntactic objects: a head and a non-head. For example, Merge can combine the two lexical items drink and water to generate drink water. In the Minimalist Program, the phrase is identified with a label. In the case of drink water, the label is drink, since the phrase acts as a verb. This can be represented in a typical syntax tree, with the name of the derived syntactic object determined either by the lexical item itself or by the category label of the LI.
Merge can operate on already-built structures; in other words, it is a recursive operation. If Merge were not recursive, then only two-word utterances would be predicted to be grammatical. If a new head γ is merged with a previously formed syntactic object {α, β}, the function takes the form Merge(γ, {α, β}) → {γ, {γ, {α, β}}}, with γ projecting as the label.
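The set-forming character of labeled Merge can be sketched as a small program. This is only an illustrative formalization, not an official notation of the theory; the class names and the rule that the head's label projects are assumptions of the sketch.

```python
# A minimal, illustrative model of labeled Merge: two syntactic objects
# combine into an unordered set, and the head's label projects.
from dataclasses import dataclass

@dataclass(frozen=True)
class SyntacticObject:
    label: str                      # projected category/lexical label
    parts: frozenset = frozenset()  # unordered: Merge forms a set, not a pair

def merge(head, non_head):
    """Merge(alpha, beta) = {alpha, {alpha, beta}}: the head projects its label."""
    return SyntacticObject(head.label, frozenset({head, non_head}))

# Lexical items are syntactic objects with no internal parts.
drink = SyntacticObject("drink")
water = SyntacticObject("water")

vp = merge(drink, water)   # the phrase "drink water" is labeled by its head
print(vp.label)            # drink

# Merge is recursive: it can take an already-built object as input.
will = SyntacticObject("will")   # hypothetical further head, for illustration
tp = merge(will, vp)
print(tp.label)                  # will
```

Because `parts` is a `frozenset`, the two merged objects are genuinely unordered, mirroring the claim that Merge produces a set rather than an ordered pair.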
Chomsky's earlier work defines each lexical item as a syntactic object that is associated with both categorical features and selectional features. Features—more precisely formal features—participate in feature-checking, which takes as input two expressions that share the same feature, and checks them off against each other in a certain domain. In some but not all versions of minimalism, projection of selectional features proceeds via feature-checking, as required by locality of selection:
Selection as projection: The bare phrase structure tree for the sentence The girl ate the food illustrates this; a notable feature is the absence of distinct labels. Relative to Merge, the selectional features of a lexical item determine how it participates in Merge:
- eat: The lexical item eat is a transitive verb, and so assigns two theta-roles. Theta-roles can be represented as D-features on V, notated V[D,D], and these D-features force the verb to merge with two DPs. As illustrated in the tree, the first application of Merge generates the Verb-Complement sequence, with the DP the food in complement position. The second application of Merge generates the equivalent of a Specifier-VP sequence, with the DP the girl in specifier position.
- PAST: The lexical item for past tense is represented as the feature PAST. Tense requires the presence of a DP subject and a verb; this is notated as T[D,V]. Tense first merges with a V-projection, and the output then combines with the DP subject the girl, which, in some sense, merges twice: once within the V-projection, and once within the T-projection.
- C∅: The lexical item for clause-typing is a phonologically null C∅. By hypothesis, all sentences are clauses, so the root clause The girl ate the food is analyzed as a CP. Given the assumption that all phrases are headed, CP must be headed by C. C selects TP, notated as C[T].
- Merge checks off one of the D-features of V: at the intermediate V-projection, the complement position is realized by the DP the food, and one D-feature is checked and removed. Merge applies a second time, and the maximal V-projection in the tree has no D-features, because at this stage of the derivation both have been checked: the second D-feature is checked by the DP the girl in the specifier position of V.
- Merge checks off the V-feature of T; Merge checks off the D-feature of T.
- Merge checks off the T-feature of C.
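The checking steps above can be made concrete with a short sketch. The encoding of features as lists and the `check` helper are hypothetical conveniences; the feature inventories V[D,D], T[D,V], and C[T] follow the text.

```python
# Illustrative sketch of selection-as-feature-checking: each application
# of Merge checks off one selectional feature of the selecting head.
def check(features, needed):
    """Return a copy of `features` with one instance of `needed` removed.
    Raises if the feature is absent (the derivation would crash)."""
    if needed not in features:
        raise ValueError(f"no {needed} feature to check: derivation crashes")
    out = list(features)
    out.remove(needed)
    return out

# V 'eat' selects two DPs: V[D,D], as in the text.
v_feats = ["D", "D"]
v_feats = check(v_feats, "D")   # Merge with DP 'the food' (complement)
v_feats = check(v_feats, "D")   # Merge with DP 'the girl' (specifier)

# T[D,V]: Tense selects a V-projection and a DP subject.
t_feats = ["D", "V"]
t_feats = check(t_feats, "V")   # Merge with the V-projection
t_feats = check(t_feats, "D")   # Merge with the DP subject 'the girl'

# C[T]: C selects TP.
c_feats = check(["T"], "T")

print(v_feats, t_feats, c_feats)   # all empty: every selectional feature checked
```

When every head's feature list is empty, the derivation converges; a leftover feature at the end would model a crash at the interfaces.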
[Figures: bare phrase structure; projection as feature-checking]
Move
The operation Move in Minimalist theory descended initially from the operation Move α, which could move syntactic elements freely. The Minimalist approach to movement is more constrained, and proposes that movement occurs in order to satisfy interface conditions by checking and eliminating features that are not legible at the interfaces. Later instantiations propose that Move is not a distinct operation but a subtype of Merge.
Features and motivation
The precise conceptualization of Move has varied since the Minimalist Program was introduced and depends on the assumptions one makes about the nature of features on lexical items. The general account, however, is that Move is a last-resort operation that occurs to check features that are introduced on lexical items, removing or neutralizing those features before they are sent to the interfaces. This stands in contrast to earlier work in Transformational Grammar and Government and Binding Theory.
[Figure: A schematic tree showing movement driven by strong feature checking. The strong feature is checked by movement of another element bearing a matching feature. Checking is indicated with strikethrough; unpronounced copies are indicated with angle brackets.]
In one formulation common in early versions of Minimalist theory, a distinction is made between so-called strong and weak features. Strong features are not legitimate objects at the PF interface and so must be checked before Spell Out. Weak features, however, are not visible at PF and need not be checked before Spell Out.
In this original formulation, feature checking occurs locally. A feature on a head X must be checked by merging another element bearing a matching feature as a specifier of X. This mechanism can be used to explain subject movement. If one assumes that nominals are introduced from the lexicon bearing Case features, then those features must be checked against heads bearing matching Case features. If T bears a strong nominative case feature, then a D bearing a case feature must move to the specifier of T to check this feature.
[Figure: A simplified tree showing feature-driven wh-movement. The strong wh-feature on the head C must be checked against another element bearing a wh-feature; moving the word what into the specifier of C accomplishes this. Irrelevant feature checking has been omitted.]
A similar approach can be taken for wh-movement. In this case, one assumes there is a strong wh-feature on C. Another element bearing a wh-feature must move to C in order to check the strong feature on C before Spell Out.
Under this view weak features must still be checked before arriving at the LF interface. It is assumed that the LF branch of the derivation still has access to syntactic operations and that, therefore, movement can occur after Spell Out. Movement that is not necessary to satisfy PF interface conditions does not occur before Spell Out due to an economy condition called Procrastinate, which requires operations to wait as long as possible before they occur.
Later approaches to syntactic features introduce the notion of feature interpretability and integrate feature valuation with the operation Agree, discussed below. The introduction of Agree obviates the need to check features via movement. Instead, a generalized EPP feature or diacritic requires movement of elements into the specifiers of various heads. Nonetheless, the core idea that movement must occur to satisfy interface conditions remains at the core of movement under Minimalist theorizing.
The nature of Move
Early Minimalist theorizing assumes that Move is a distinct operation. However, later revisions propose that Move is actually the result of Merge. As discussed above, Merge takes two syntactic objects as its arguments and returns a new syntactic object. Movement can be derived if Merge takes as its arguments a syntactic object α and some element β that is a proper subpart of α; this is Internal Merge. It can be distinguished from External Merge, where β is a syntactic object external to α.
There are different ideas about how to cash this out. One variation is the Copy Theory of Movement, under which an element targeted for Internal Merge is copied, and the copy is merged in a new, higher position. On the PF branch, lower copies are typically deleted after Spell Out when the structure is linearized. As an alternative to the Copy Theory of Movement, syntactic structure may be multidominant. On this view, individual nodes in a syntactic structure can have multiple mothers. When Internal Merge occurs, no copies are made; instead, the internal node is merged again at the top of the structure. Such nodes never truly move from their initial structural position; rather, their position in the linear order is determined on the PF branch.
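The contrast between External and Internal Merge can be sketched as follows. The tuple encoding, the labels, and the example words are illustrative assumptions; the point is only that Internal Merge re-merges a proper subpart of an existing object, leaving it in two positions, as under the Copy Theory.

```python
# Illustrative contrast between External and Internal Merge.
# Structures are encoded as nested tuples: (label, left, right).

def external_merge(label, alpha, beta):
    """Merge two independent syntactic objects."""
    return (label, alpha, beta)

def contains(tree, target):
    """True if `target` occurs as a proper subpart of `tree`."""
    if not isinstance(tree, tuple):
        return tree == target
    return any(contains(part, target) for part in tree[1:])

def internal_merge(label, alpha, beta):
    """Re-merge beta, a proper subpart of alpha, at the top of alpha.
    Under the Copy Theory, beta now occupies two positions (two copies)."""
    assert contains(alpha, beta), "Internal Merge requires beta inside alpha"
    return (label, beta, alpha)

vp = external_merge("V", "see", "what")   # 'what' is external-merged low
cp = internal_merge("C", vp, "what")      # 'what' re-merges at the clause edge

# The moved element now occurs twice (a higher and a lower copy);
# on the PF branch the lower copy would typically be deleted.
print(cp)   # ('C', 'what', ('V', 'see', 'what'))
```

A multidominant alternative would instead give the single `"what"` node two mothers rather than duplicating it; that bookkeeping is harder to show with plain tuples.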
Beyond basic operations
Label
A substantial body of literature in the minimalist tradition focuses on how a phrase receives a proper label. The debate about labeling reflects the deeper aspiration of the minimalist program, which is to remove all redundant elements in favour of the simplest analysis possible. While earlier proposals focus on how to distinguish adjunction from substitution via labeling, more recent proposals attempt to eliminate labeling altogether, but they have not been universally accepted.
Adjunction and substitution: Chomsky's 1995 monograph The Minimalist Program outlines two methods of forming structure: adjunction and substitution. The standard properties of segments, categories, adjuncts, and specifiers are easily constructed. In the general form of a structured tree for adjunction and substitution, α is an adjunct to X, or α is substituted into the SPEC,X position. α can raise to target the Xmax position, building a new position that is either adjoined to Xmax or is SPEC,X, in which case Xmax is termed the 'target'. At the bottom of the tree, the minimal domain includes SPEC,Y and Z, along with a new position formed by the raising of α, which is either contained within Z or is Z.
Adjunction: Before the introduction of bare phrase structure, adjuncts did not alter information about bar-level, category information, or the target's head. An example of adjunction using X-bar-theoretic notation is given below for the sentence Luna bought the purse yesterday. Observe that the adverbial modifier yesterday is sister to VP and dominated by VP. Thus, the addition of the modifier does not change information about the bar-level: in this case, the maximal projection VP. In the minimalist program, adjuncts are argued to exhibit a different, perhaps more simplified, structure. Chomsky proposes that adjunction forms a two-segment object/category whose label is derived from, but not identical to, the head.
The label L is not considered a term in the structure that is formed, because it is not identical to the head S, but is derived from it in an irrelevant way. If α adjoins to S, and S projects, then the structure that results is L = {⟨H(S), H(S)⟩, {α, S}}, where ⟨H(S), H(S)⟩ is the two-segment label. The head is what projects, so it can itself be the label or can determine the label irrelevantly. In the new account developed in bare phrase structure, the properties of the head are no longer preserved in adjunction structures: the XP to which an adjunct attaches is non-maximal following adjunction, as shown in the figure illustrating adjunction in BPS. Such an account is applicable to XPs that host multiple adjunction.
Substitution forms a new category consisting of a head, which is the label, and an element being projected. Some ambiguities may arise if the raised element, in this case α, contains the entire head and the head is also Xmax.
Agree
Starting in the early 2000s, attention turned from feature-checking as a condition on movement to feature-checking as a condition on agreement. This line of inquiry was initiated by Chomsky. Many recent analyses assume that Agree is a basic operation, on a par with Merge and Move. This is currently a very active area of research, and there remain numerous open questions:
- Is Agree a primitive operation?
- What is the "direction" of the Agree relation: does it apply top-down, bottom-up, or both?
- Is Agree a syntactic operation, a post-syntactic operation that applies at PF, or both?
- Is the Agree relation restricted to certain feature types?
- Is the Agree relation subject to locality restrictions?
- Which phenomena are best modelled by the Agree relation?
- Is the Agree relation conditioned by other factors, or does it apply freely?
- How does Agree interact with other operations such as Merge and Label?
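Because these questions remain open, any formalization of Agree embodies particular answers to them. The sketch below assumes one common conception: a probe with unvalued φ-features searches downward for the closest goal bearing matching valued features and copies those values. The search direction, the feature inventory, and the dictionary encoding are all assumptions of the sketch, not settled theory.

```python
# One possible formalization of Agree as probe-goal feature valuation.
# Unvalued features are represented as None; goals are listed closest-first.

def agree(probe, goals):
    """Value the probe's unvalued features from the closest matching goal.
    Returns the goal that valued the probe, or None if valuation fails."""
    for goal in goals:  # closest goal first, within the probe's domain
        matching = {f: v for f, v in goal.items()
                    if f in probe and probe[f] is None}
        if matching:
            probe.update(matching)  # copy the goal's valued features
            return goal
    return None  # unvalued features remain: the derivation would crash

# T probes for person/number; the subject DP is the closest goal.
t_head = {"person": None, "number": None}
subject = {"person": 3, "number": "sg"}

goal = agree(t_head, [subject])
print(t_head)   # {'person': 3, 'number': 'sg'}
```

Choosing a bottom-up probe, restricting the relevant feature types, or letting Agree apply at PF would each change this function, which is exactly what the open questions above are about.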
Derivation by phase
A phase is a syntactic domain first hypothesized by Noam Chomsky in 1998. It is a domain where all derivational processes operate and where all features are checked. A phase consists of a phase head and a phase domain. Once a derivation reaches a phase and all the features are checked, the phase domain is sent to transfer and becomes invisible to further computations. The literature shows three trends relative to what is generally considered to be a phase:
- All CPs and some vPs are phases: Chomsky originally proposed that CP, as well as vP in transitives and unergatives, constitute phases. This was proposed on the basis of these phrases showing the strong phase effects discussed below.
- A specified set of phrases are phases: CP, DP, all vPs, TP
- Every phrase is a phase, with moved constituents cycling through all intermediate phrase edges.
Strong phases: CP and vP
Propositional content: CP and vP are both propositional units, but for different reasons. CP is considered a propositional unit because it is a full clause that has tense and force: in the example below, the complementizer that in the CP phase conditions the finiteness and force of the subordinate clause. vP is considered a propositional unit because all the theta-roles are assigned in vP: the verb ate in the vP phase assigns the Theme theta-role to the DP the cake and the Agent theta-role to the DP Mary.
John said [CP that [vP Mary ate the cake]].
Movement: CP and vP can be the focus of pseudo-cleft movement, showing that CP and vP form syntactic units: this is shown below for the CP constituent that John is bringing the dessert, and for the vP constituent arrive tomorrow.
a. Mary said [CP that John is bringing the dessert].
b. What Mary said was [CP that John is bringing the dessert].
a. Alice will [vP arrive tomorrow].
b. What Alice will do is [vP arrive tomorrow].
Reconstruction: When a moved constituent is interpreted in its original position to satisfy binding principles, this is called reconstruction. Evidence from reconstruction is consistent with the claim that a moved phrase stops at the left edge of CP and vP phases.
- Reconstruction at the left edge of a CP phase: In the example below, the reflexive himself can be understood as co-referential with either John or Fred, where co-reference is indicated by co-indexation. However, the constituent that contains himself, namely the sentence-initial phrase which picture of himself, is not c-commanded by either John or Fred, as required by Principle A of the Binding Theory. The fact that co-indexation of himself with either John or Fred is possible is taken as evidence that the constituent containing the reflexive has moved through a reconstruction site—here the left edge of the lower CP phase—from where it can satisfy Principle A of the Binding Theory relative to the DP John.
- Reconstruction at the left edge of a vP phase: In the example below, bound variable anaphora requires that the pronoun he be c-commanded by every student, while Condition C of the Binding Theory requires that the R-expression Mary be free. However, these requirements cannot be satisfied in the surface position of the sentence-initial constituent that contains both he and Mary. The fact that the sentence is nevertheless well-formed is taken to indicate that this phrase has moved through a reconstruction site, from where it is interpreted. The left edge of the vP phase is the only position where both binding requirements are satisfied: every student c-commands the pronoun he, and Mary is free from any c-commanding DP.
Which picture of himselfj/k did Johnk think __ Fredj liked __?
Which of the papers that hek gave Maryj did every studentk __ ask herj to read __ carefully?
Phase edge
Chomsky theorized that syntactic operations must obey the phase impenetrability condition (PIC), which essentially requires that movement proceed from the left edge of a phase. The PIC has been variously formulated in the literature. An extended projection principle (EPP) feature on the heads of phases triggers the intermediate movement steps to phase edges.
Phase impenetrability condition (PIC)
Movement of a constituent out of a phase is only permitted if the constituent has first moved to the left edge of the phase. The edge of a head X is defined as the residue outside of X': that is, either specifiers of X or adjuncts to XP.
English successive-cyclic wh-movement obeys the PIC. A sentence such as Who did Mary see? contains two phases: vP and CP. Relative to the application of movement, who moves from the vP phase to the CP phase in two steps:
- Step 1: who moves from the complement position of VP to the left edge of vP, forced by the EPP feature on the verb.
- Step 2: who moves from the left edge of the lower vP phase to the specifier of the CP phase.
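The two steps above can be sketched as a loop over phases. The list-based encoding and position names are illustrative assumptions; the phase inventory (vP, then CP) follows the text.

```python
# Illustrative sketch of PIC-obeying successive-cyclic wh-movement:
# the wh-phrase must step to each phase edge before that phase's
# domain is transferred and becomes invisible.

def extract(wh, phase_stack):
    """Record the landing sites of `wh` as it exits nested phases.
    Phases are listed innermost-first (e.g. vP before CP)."""
    path = ["complement of V"]            # base position of the wh-phrase
    for phase in phase_stack:
        path.append(f"edge of {phase}")   # EPP-driven step to the phase edge
        # Transfer: the domain below this edge is now inaccessible, but
        # the wh-phrase, sitting at the edge, remains visible to later steps.
    return path

steps = extract("who", ["vP", "CP"])
for step in steps:
    print(step)
# complement of V
# edge of vP
# edge of CP
```

Skipping a phase edge in this model would leave the wh-phrase inside a transferred domain, frozen in place, which is the configuration the PIC rules out.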
Medumba wh-movement
Another example of the PIC can be observed in A'-agreement in Medumba. A'-agreement is a term for the morphological reflex of A'-movement of an XP. In Medumba, when a moved phrase reaches a phase edge, a high-low tonal melody is added to the head of the complement of the phase head. Since A'-agreement in Medumba requires movement, the presence of agreement on the complements of phase heads shows that the wh-word moves through the edges of phases, obeying the PIC. In the example below, the sentence has a high-low tone on the verb nɔ́ʔ and the tense marker ʤʉ̀n, and is therefore grammatical.
'The child gave the bag to who?'
Cycle
The spell-out of a string is assumed to be cyclic, but there is no consensus about how to implement this. Some analyses adopt an iterative spell-out algorithm, with spell-out applying after each application of Merge. Other analyses adopt an opportunistic algorithm, where spell-out applies only when it must. Yet others adopt a wait-till-the-end algorithm, with spell-out occurring only at the end of the derivation.
There is no consensus about the cyclicity of the Agree relation: it is sometimes treated as cyclic, sometimes as acyclic, and sometimes as counter-cyclic.
Implications
Connections to other models
Principles and parameters
From a theoretical standpoint, and in the context of generative grammar, the Minimalist Program is an outgrowth of the principles and parameters model, considered to be the ultimate standard theoretical model that generative linguistics developed from the early 1980s through to the early 1990s. The Principles and Parameters model posits a fixed set of principles which, when combined with settings for a finite set of parameters, can describe the properties that characterize the language competence a child eventually attains. One aim of the Minimalist Program is to ascertain how much of the Principles and Parameters model can be taken to result from the hypothesized optimal and computationally efficient design of the human language faculty. In turn, some aspects of the Principles and Parameters model provide technical tools and foundational concepts that inform the broad outlines of the Minimalist Program.
X-bar theory
X-bar theory—first introduced by Chomsky and elaborated by Jackendoff, among other works—was a major milestone in the history of the development of generative grammar. It contains the following postulates:
- Each phrase has a head, and the head projects to a larger phrase.
- Heads are feature complexes that consist of a primitive feature.
- The general X-bar schema is a property of universal grammar (UG): XP → (Specifier) X′, and X′ → X (Complement).
In the 1980s, the principles and parameters approach emerged, marking a shift away from rule-based grammars toward multiple sub-theories of UG such as X-bar theory, case theory, and so forth. During this time, PS rules disappeared because they proved to be redundant, recapitulating what is in the lexicon. Transformational rules survived, with a few amendments to how they are expressed: complex traditional rules need not be defined individually and can be reduced to a general schema called Move-α, which allows elements to be moved anywhere. The two sub-theories that withstood the test of time within P&P are X-bar theory and Move-α. Of the fundamental properties mentioned above, X-bar theory accounts for hierarchical structure and endocentricity, while Move-α accounts for unboundedness and non-local dependencies. A few years later, an effort was made to merge X-bar theory with Move-α by suggesting that structures are built from the bottom up:
- Features are discharged as soon as a head projects. This follows from the idea that phrases are endocentric : the head is the obligatory component of a phrasal constituent and projects its essential features.
- There is no X-bar schema, and no requirement for maximal projections to be specified as bar levels. This is a consequence of the claim that features are discharged by projection of the head.
- At any given bar level, iteration is possible. This is based on the idea that phrase structure composition is infinite.
- Adjunction is responsible for movement and structure-building. This is based on the idea that transformational operations are fundamental.
- Projections are closed by agreement. This is based on the idea that in some languages, phrases do not close, and elements can be added to keep expanding them.
- BPS is explicitly derivational. That is, it is built from the bottom up, bit by bit. In contrast, X-bar theory is representational—a structure for a given construction is built in one fell swoop, and lexical items are inserted into the structure.
- BPS does not have a preconceived phrasal structure, while in X-bar theory every phrase has a specifier, a head, and a complement.
- BPS permits only binary branching, while X-bar theory permits both binary and unary branching.
- BPS does not distinguish between a "head" and a "terminal", while some versions of X-bar theory require such a distinction.
- BPS encodes projection status through features such as Xmax and Xmin, while X-bar theory uses bar levels such as XP, X′, and X.
- BPS accommodates cross-linguistic variation, since a maximal projection can correspond to either an XP or an X′ level, whereas X-bar theory recognizes only XP as the maximal projection.
- BPS eliminates the notion of non-branching domination.
- BPS eliminates the necessity of bar-level projections.
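The derivational, bottom-up character of BPS can be illustrated with a small sketch (a hypothetical toy model for exposition; the function names and example words are not drawn from the linguistic literature): structures are assembled by repeatedly merging two objects, with strictly binary branching and no bar levels, and the label of a constituent is simply its head.

```python
# Toy sketch of bare phrase structure: bottom-up, strictly binary Merge.
# Lexical items are strings; a merged object is a 2-tuple whose first
# member is, by convention here, the projecting head.

def merge(head, comp):
    """Combine two syntactic objects into a binary constituent.

    There are no bar levels and no unary branching: every non-terminal
    node is exactly binary, and the head projects its features.
    """
    return (head, comp)

def label(obj):
    """The label of a constituent is just its head (no XP/X' levels)."""
    return obj if isinstance(obj, str) else label(obj[0])

# Derivation for "eat the cake", built bit by bit from the bottom up:
dp = merge("the", "cake")   # D projects: the label is "the"
vp = merge("eat", dp)       # V projects: the label is "eat"

print(label(dp))  # the
print(label(vp))  # eat
```

Note that, unlike an X-bar template, nothing here presupposes a specifier slot or intermediate projections; each step only combines two objects and lets the head project.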
Functionalism
In linguistics, there are differing approaches to exploring the basis of language: two of these are formalism and functionalism. It has been argued that the formalist approach is characterized by the belief that the rules governing syntax can be analyzed independently of things such as meaning and discourse; in other words, for formalists, syntax is an independent system. By contrast, functionalists believe that syntax is determined largely by the communicative function it serves, so syntax is not kept separate from meaning and discourse.

Under functionalism, there is a belief that language evolved alongside other cognitive abilities, and that these abilities must be understood in order to understand language. In his theories prior to MP, Chomsky was interested exclusively in formalism and held that language could be isolated from other cognitive abilities. With the introduction of MP, however, Chomsky considers aspects of the cognitive and sensorimotor systems to be linked to language. Rather than treating syntax as a specialized module that excludes other systems, under MP Chomsky considers the roles of cognition, production, and articulation in formulating language. Because these cognitive systems figure in MP's account of language, it has been argued that, in contrast to Chomsky's previous theories, MP is consistent with functionalism.
Dependency grammar
There is a trend in minimalism that shifts from constituency-based to dependency-based structures. Minimalism falls under the dependency grammar umbrella by virtue of adopting bare phrase structure, label-less trees, and specifier-less syntax.
- bare phrase structure: Merge does away with non-branching nodes and bar levels, which are replaced by minimal projections and maximal projections:
  - a minimal projection does not dominate other lexical items or categories;
  - a maximal projection is unable to project any higher.
- label-less trees: to simplify phrase structures, Noam Chomsky argues that category labels are unnecessary and need not be included, leading to what are now known as label-less trees. In lieu of a specific category label in the projection, the lexical item that serves as the head becomes its own label.
- specifier-less syntax: the generalization of Abney's DP hypothesis gives rise to specifier-less syntax. Lexical items that would have been analyzed as specifiers in earlier versions of X-bar theory (e.g. determiners and auxiliaries) become the heads of their own phrases. For example, D introduces NP as a complement, and T introduces VP as a complement.
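To make the specifier-less analysis concrete, here is a minimal sketch (a hypothetical illustration; the representation and example items are my own assumptions): each functional item heads its own phrase and takes the next projection down as its complement, so D selects NP and T selects VP.

```python
# Toy sketch of specifier-less syntax: functional items are heads that
# take the next projection as their complement, not specifiers.

def merge(head, comp):
    """Binary Merge; the first argument is the projecting head."""
    return {"head": head, "comp": comp}

# D introduces NP as a complement: "the" heads its own phrase.
dp = merge("the", "dog")
# T introduces VP as a complement: "will" heads its own phrase.
tp = merge("will", "bark")

print(dp["head"], tp["head"])  # the will
```

On this view, a determiner is not a satellite sitting in the specifier of NP; it is itself the head of the phrase, with the noun phrase as its complement.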
First language (L1) acquisition
- Two-word stage: Merge is the operation where two syntactic elements are brought together and combined to form a constituent. The head of the pair determines the constituent's label, but the element that becomes the head depends on the language. English is a left-headed language, such that the element on the left is the head; Japanese is a right-headed language, such that the element on the right is the head. Merge can account for the patterns of word-combination, and more specifically word-order, observed in children's first language acquisition. In first language acquisition, it has been observed that young children combine two words in ways that are consistent with either the head-initial or head-final pattern of the language they are learning. Children learning English produce "pivot" words before "open" words, which is consistent with the head-initial pattern of English, whereas children learning Japanese produce "open" words before "pivot" words.
- Emergence of headed combinations: Within the minimalist program, bare phrase structure, described in detail above, accounts for children's first language acquisition better than earlier theories of phrase structure building, such as X-bar theory. This is because, under bare phrase structure, children do not need to account for the intermediate layers of structure that appear in X-bar theory. The account of first language acquisition provided under bare phrase structure is simpler than that provided under X-bar theory. In particular, children typically progress from conjunctions to headed combinations. This trajectory can be modelled as a progression from symmetric Merge to asymmetric Merge.
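The head-directionality pattern described in the two-word stage can be modelled as a single parameter on Merge (a toy sketch under my own assumptions; the parameter name and example words are hypothetical):

```python
# Toy sketch: Merge with a head-directionality parameter.
# Head-initial languages (e.g. English) project the left element;
# head-final languages (e.g. Japanese) project the right element.

def merge(left, right, head_initial=True):
    """Combine two words; the head determines the constituent's label."""
    head = left if head_initial else right
    return {"label": head, "children": (left, right)}

# English two-word utterance: the left ("pivot") element is the head.
eng = merge("want", "cookie", head_initial=True)
# Japanese two-word utterance: the right element is the head.
jpn = merge("kukkii", "hoshii", head_initial=False)

print(eng["label"])  # want
print(jpn["label"])  # hoshii
```

The point of the sketch is that a single binary setting, fixed early from the input, suffices to derive the two mirror-image word-order patterns observed in acquisition.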
Criticisms
Lappin et al. argue that the minimalist program is a radical departure from earlier Chomskyan linguistic practice that is not motivated by any new empirical discoveries, but rather by a general appeal to perfection, which is both empirically unmotivated and so vague as to be unfalsifiable. They compare the adoption of this paradigm by linguistic researchers to other historical paradigm shifts in the natural sciences and conclude that the adoption of the minimalist program has been an "unscientific revolution", driven primarily by Chomsky's authority in linguistics. The several replies to the article in Natural Language and Linguistic Theory Volume 18 number 4 offer a number of different defenses of the minimalist program. Some claim that it is not in fact revolutionary or not in fact widely adopted, while others agree with Lappin and Johnson on these points, but defend the vagueness of its formulation as unproblematic in light of its status as a research program rather than a theory.
Prakash Mondal has published a book-length critique of the minimalist model of grammar, arguing that there are a number of contradictions, inconsistencies and paradoxes within the formal structure of the system. In particular, his critique examines the consequences of some seemingly innocuous and widespread assumptions about the nature of language that the minimalist model of the language faculty adopts.
Developments in the minimalist program have also been critiqued by Hubert Haider, who has argued that minimalist studies routinely fail to follow scientific rigour. In particular, data compatible with hypotheses are filed under confirmation, whereas crucial counter-evidence is largely ignored or shielded by ad hoc auxiliary assumptions. Moreover, the supporting data are biased towards SVO languages and are often based on the linguist's introspection rather than on attempts to gather data in an unbiased manner by experimental means. Haider further points to the appeal to an authority figure in the field, with dedicated followers taking the core premises of minimalism for granted as if they were established facts.
Works by Noam Chomsky
- Chomsky, Noam. 2013. Problems of Projection. Lingua 130: 33–49.
- Chomsky, Noam. 2008. On Phases. In Foundational Issues in Linguistic Theory. Essays in Honor of Jean-Roger Vergnaud, eds. Robert Freidin, Carlos Peregrín Otero and Maria Luisa Zubizarreta, 133–166. Cambridge, Massachusetts: MIT Press.
- Chomsky, Noam. 2007. Approaching UG From Below. In Interfaces + Recursion = Language?, eds. Uli Sauerland and Hans Martin Gärtner, 1–29. New York: Mouton de Gruyter.
- Chomsky, Noam. 2005. Three Factors in Language Design. Linguistic Inquiry 36: 1–22.
- Chomsky, Noam. 2004. . In Structures and Beyond. The Cartography of Syntactic Structures, ed. Adriana Belletti, 104–131. Oxford: Oxford University Press.
- Chomsky, Noam. 2001. Derivation by Phase. In Ken Hale: A Life in Language, ed. Michael Kenstowicz, 1–52. Cambridge, Massachusetts: MIT Press.
- Chomsky, Noam. 2000. . Cambridge, UK; New York: Cambridge University Press.
- Chomsky, Noam. 2000. Minimalist inquiries: the framework. In Step by Step: Essays on Minimalist Syntax in Honor of Howard Lasnik, eds. Roger Martin, David Michaels and Juan Uriagereka, 89–155. Cambridge, Massachusetts: MIT Press.
- Chomsky, Noam. 1995. The Minimalist Program. Cambridge, Massachusetts: The MIT Press.
- Chomsky, Noam. 1993. "A minimalist program for linguistic theory". In Hale, Kenneth L. and S. Jay Keyser, eds. The view from Building 20: Essays in linguistics in honor of Sylvain Bromberger. Cambridge, Massachusetts: MIT Press. 1–52.
Works on minimalism and its applications
- Citko, Barbara and Martina Gračanin-Yuksek. 2020. Merge: Binarity in Syntax. Cambridge, Massachusetts: MIT Press.
- Smith, Peter W., Johannes Mursell, and Katharina Hartmann. 2020. Agree to Agree: Agreement in the Minimalist Programme. Berlin: Language Science Press.
- Cipriani, Enrico. 2019. Semantics in Generative Grammar. A Critical Survey. Lingvisticae Investigationes, 42, 2, pp. 134–85
- Stroik, Thomas. 2009. Locality in Minimalist Syntax. Cambridge, Massachusetts: MIT Press.
- Boeckx, Cedric. 2006. Minimalist Essays. Amsterdam: John Benjamins.
- Epstein, Samuel David, and Seely, T. Daniel. 2002. Derivation and Explanation in the Minimalist Program. Malden, MA: Blackwell.
- Richards, Norvin. 2001. Movement in Language. Oxford: Oxford University Press.
- Pesetsky, David. 2001. Phrasal Movement and its Kin. Cambridge, Massachusetts: MIT Press.
- Martin, Roger, David Michaels and Juan Uriagereka. 2000. Step by Step: Essays on Minimalist Syntax in Honor of Howard Lasnik. Cambridge, Massachusetts: MIT Press.
- Epstein, Samuel David, and Hornstein, Norbert. 1999. Working Minimalism. Cambridge, Massachusetts: MIT Press.
- Fox, Danny. 1999. Economy and Semantic Interpretation. Cambridge, Massachusetts: MIT Press.
- Bošković, Željko. 1997. . Cambridge, Massachusetts: MIT Press.
- Collins, Chris. 1997. Local Economy. Cambridge, Massachusetts: MIT Press.
- Brody, Michael. 1995. Lexico-Logical Form: a Radically Minimalist Theory. Cambridge, Massachusetts: MIT Press.
Textbooks on minimalism
- Adger, David. 2003. Core Syntax. A Minimalist Approach. Oxford: Oxford University Press
- Boeckx, Cedric. 2006. Linguistic Minimalism. Origins, Concepts, Methods and Aims. Oxford: Oxford University Press.
- Bošković, Željko and Howard Lasnik. 2006. Minimalist Syntax: The Essential Readings. Malden, MA: Blackwell.
- Cook, Vivian J. and Newson, Mark. 2007. Chomsky's Universal Grammar: An Introduction. Third Edition. Malden, MA: Blackwell.
- Hornstein, Norbert, Jairo Nunes and Kleanthes K. Grohmann. 2005. Understanding Minimalism. Cambridge: Cambridge University Press
- Lasnik, Howard, Juan Uriagereka, Cedric Boeckx. 2005. A Course in Minimalist Syntax. Malden, MA: Blackwell
- Radford, Andrew. 2004. . Cambridge: Cambridge University Press.
- Uriagereka, Juan. 1998. . Cambridge, Massachusetts: MIT Press.
- Webelhuth, Gert. 1995. Government and Binding Theory and the Minimalist Program: Principles and Parameters in Syntactic Theory. Wiley-Blackwell