Context-free grammar
In formal language theory, a context-free grammar (CFG) is a formal grammar whose production rules
can be applied to a nonterminal symbol regardless of its context.
In particular, in a context-free grammar, each production rule is of the form A → α, with A a single nonterminal symbol and α a string of terminals and/or nonterminals (α can be empty). Regardless of which symbols surround it, the single nonterminal A on the left-hand side can always be replaced by α on the right-hand side. This distinguishes it from a context-sensitive grammar, which can have production rules of the form αAβ → αγβ, with A a nonterminal symbol and α, β, and γ strings of terminal and/or nonterminal symbols.
A formal grammar is essentially a set of production rules that describe all possible strings in a given formal language. Production rules are simple replacements: for example, a rule A → α replaces a single occurrence of the nonterminal symbol A with the string α. There can be multiple replacement rules for a given nonterminal symbol. The language generated by a grammar is the set of all strings of terminal symbols that can be derived, by repeated rule applications, from some particular nonterminal symbol.
Nonterminal symbols are used during the derivation process, but do not appear in its final result string.
Languages generated by context-free grammars are known as context-free languages. Different context-free grammars can generate the same context-free language. It is important to distinguish the properties of the language from the properties of a particular grammar. The language equality question (whether two given context-free grammars generate the same language) is undecidable.
Context-free grammars arise in linguistics, where they are used to describe the structure of sentences and words in a natural language, and they were invented by the linguist Noam Chomsky for this purpose. By contrast, in computer science, as the use of recursively defined concepts increased, context-free grammars were used more and more. In an early application, grammars were used to describe the structure of programming languages. In a newer application, they are used in an essential part of the Extensible Markup Language (XML) called the document type definition.
In linguistics, some authors use the term phrase structure grammar to refer to context-free grammars, whereby phrase-structure grammars are distinct from dependency grammars. In computer science, a popular notation for context-free grammars is Backus–Naur form, or BNF.
Background
Since at least the time of the ancient Indian scholar Pāṇini, linguists have described the grammars of languages in terms of their block structure, and described how sentences are recursively built up from smaller phrases, and eventually individual words or word elements. An essential property of these block structures is that logical units never overlap. For example, the sentence:

John, whose blue car was in the garage, walked to the grocery store.

can be logically parenthesized (with the logical metasymbols [ ]) as follows:

[John[, [whose [blue car]] [was [in [the garage]]],]] [walked [to [the [grocery store]]]].
A context-free grammar provides a simple and mathematically precise mechanism for describing the methods by which phrases in some natural language are built from smaller blocks, capturing the "block structure" of sentences in a natural way. Its simplicity makes the formalism amenable to rigorous mathematical study. Important features of natural language syntax such as agreement and reference are not part of the context-free grammar, but the basic recursive structure of sentences, the way in which clauses nest inside other clauses, and the way in which lists of adjectives and adverbs are swallowed by nouns and verbs, are described exactly.
Context-free grammars are a special form of semi-Thue systems that in their general form date back to the work of Axel Thue.
The formalism of context-free grammars was developed in the mid-1950s by Noam Chomsky, and also their classification as a special type of formal grammar (which he called phrase structure grammars). Some authors, however, reserve the term phrase structure grammar for more restricted grammars in the Chomsky hierarchy: context-sensitive grammars or context-free grammars. In a broader sense, phrase structure grammars are also known as constituency grammars. The defining trait of phrase structure grammars is thus their adherence to the constituency relation, as opposed to the dependency relation of dependency grammars. In Chomsky's generative grammar framework, the syntax of natural language was described by context-free rules combined with transformation rules.
Block structure was introduced into computer programming languages by the Algol project, which, as a consequence, also featured a context-free grammar to describe the resulting Algol syntax. This became a standard feature of computer languages, and the notation for grammars used in concrete descriptions of computer languages came to be known as Backus–Naur form, after two members of the Algol language design committee. The "block structure" aspect that context-free grammars capture is so fundamental to grammar that the terms syntax and grammar are often identified with context-free grammar rules, especially in computer science. Formal constraints not captured by the grammar are then considered to be part of the "semantics" of the language.
Context-free grammars are simple enough to allow the construction of efficient parsing algorithms that, for a given string, determine whether and how it can be generated from the grammar. An Earley parser is an example of such an algorithm, while the widely used LR and LL parsers are simpler algorithms that deal only with more restrictive subsets of context-free grammars.
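As an illustration of what such parsers decide, the following Python sketch is a naive membership test, not the Earley, LR or LL algorithms just mentioned: it tries every way of splitting the input among the symbols of a right-hand side, with memoization. The grammar encoding (a set of pairs forming the relation R, as in the formal definition below) and the helper names are choices made here for illustration, and the sketch recurses without bound on left-recursive rules such as S → SS, so it is only suitable for small, well-behaved grammars.

```python
from functools import lru_cache

def make_recognizer(rules):
    """rules: the relation R, given as a set of (nonterminal, right-hand-side tuple) pairs."""
    nonterminals = {left for left, _ in rules}

    @lru_cache(maxsize=None)
    def symbol_derives(symbol, text):
        if symbol not in nonterminals:          # terminal symbol: must match exactly
            return text == symbol
        return any(seq_derives(rhs, text) for left, rhs in rules if left == symbol)

    @lru_cache(maxsize=None)
    def seq_derives(symbols, text):
        if not symbols:                         # an empty right-hand side derives only the empty string
            return text == ""
        head, rest = symbols[0], symbols[1:]
        # try every way of splitting `text` between the first symbol and the remainder
        return any(symbol_derives(head, text[:i]) and seq_derives(rest, text[i:])
                   for i in range(len(text) + 1))

    return symbol_derives

# The grammar S -> aSb | ab from the "Matching pairs" example below:
recognize = make_recognizer({("S", ("a", "S", "b")), ("S", ("a", "b"))})
assert recognize("S", "aaabbb") and not recognize("S", "aabbb")
```

General-purpose algorithms such as Earley's avoid this blow-up and handle every context-free grammar in at most cubic time in the length of the input.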
Formal definitions
A context-free grammar G is defined by the 4-tuple G = (V, Σ, R, S), where
- V is a finite set; each element v ∈ V is called a nonterminal character or a variable. Each variable represents a different type of phrase or clause in the sentence. Variables are also sometimes called syntactic categories. Each variable defines a sub-language of the language defined by G.
- Σ is a finite set of terminals, disjoint from V, which make up the actual content of the sentence. The set of terminals is the alphabet of the language defined by the grammar.
- R is a finite relation in V × (V ∪ Σ)*, where the asterisk represents the Kleene star operation. The members of R are called the rules or productions of the grammar.
- S is the start variable, used to represent the whole sentence. It must be an element of V.
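For illustration, such a 4-tuple can be written down directly in code. The sketch below uses container types and names chosen here (they are not part of the definition) and encodes the grammar S → aSb | ab discussed under "Matching pairs" in the Examples section.

```python
# One possible encoding of a 4-tuple (V, Σ, R, S); the grammar shown is S -> aSb | ab.
V = {"S"}                                    # nonterminal symbols (variables)
Sigma = {"a", "b"}                           # terminal symbols, disjoint from V
R = {("S", ("a", "S", "b")),                 # productions: a finite subset of
     ("S", ("a", "b"))}                      #   V × (V ∪ Σ)*
S = "S"                                      # start symbol, an element of V
G = (V, Sigma, R, S)

assert S in V and V.isdisjoint(Sigma)
assert all(left in V and all(sym in V | Sigma for sym in right) for left, right in R)
```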
Production rule notation
A production rule in R is formalized mathematically as a pair (A, α) ∈ R, where A ∈ V is a nonterminal and α ∈ (V ∪ Σ)* is a string of variables and/or terminals; such a rule is usually written using an arrow operator with A as its left-hand side and α as its right-hand side: A → α. It is allowed for α to be the empty string, and in this case it is customary to denote it by ε. The form A → ε is called an ε-production.
It is common to list all right-hand sides for the same left-hand side on the same line, using | to separate them. Rules A → α₁ and A → α₂ can hence be written as A → α₁ | α₂. In this case, α₁ and α₂ are called the first and second alternative, respectively.
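This compact notation is easy to translate back into the underlying relation R. The small Python sketch below does so; the function name and the textual rule format ("A -> x y | z") are conventions adopted here purely for illustration.

```python
def parse_rules(lines):
    """Turn rules like "S -> a S b | a b" into explicit (left, right) pairs, i.e. elements of R."""
    rules = set()
    for line in lines:
        left, rights = line.split("->")
        for alternative in rights.split("|"):          # each alternative is one production
            rules.add((left.strip(), tuple(alternative.split())))
    return rules

assert parse_rules(["S -> a S b | a b"]) == {("S", ("a", "S", "b")), ("S", ("a", "b"))}
```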
Rule application
For any strings u, v ∈ (V ∪ Σ)*, we say u directly yields v, written as u ⇒ v, if there exist a rule (A, α) ∈ R and strings u₁, u₂ ∈ (V ∪ Σ)* such that u = u₁Au₂ and v = u₁αu₂. Thus, v is a result of applying the rule A → α to u.
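The "directly yields" relation is mechanical to compute: find an occurrence of a left-hand side in u and rewrite it. The following sketch (using the same illustrative set-of-pairs encoding of R as in the earlier sketches) enumerates every string directly derivable from a given sentential form.

```python
def direct_yields(rules, u):
    """Yield every v with u => v, for u a tuple of symbols and rules the relation R."""
    for i, symbol in enumerate(u):
        for left, right in rules:
            if symbol == left:                           # u = u1 A u2 with A = left
                yield u[:i] + right + u[i + 1:]          # v = u1 α u2

# With the rules S -> aSb and S -> ab, the form aSb directly yields aaSbb and aabb:
R = {("S", ("a", "S", "b")), ("S", ("a", "b"))}
assert set(direct_yields(R, ("a", "S", "b"))) == {("a", "a", "S", "b", "b"), ("a", "a", "b", "b")}
```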
Repetitive rule application

For any strings u, v ∈ (V ∪ Σ)*, we say u yields v, or v is derived from u, if there is a positive integer k and strings u₁, ..., uₖ ∈ (V ∪ Σ)* such that u = u₁ ⇒ u₂ ⇒ ... ⇒ uₖ = v. This relation is denoted u ⇒* v, or u ⇒⇒ v in some textbooks. If k ≥ 2, the relation u ⇒⁺ v holds. In other words, (⇒*) and (⇒⁺) are the reflexive transitive closure and the transitive closure of (⇒), respectively.
Context-free language

The language of a grammar G = (V, Σ, R, S) is the set L(G) = { w ∈ Σ* : S ⇒* w } of all terminal-symbol strings derivable from the start symbol.
A language L is said to be a context-free language (CFL) if there exists a CFG G such that L = L(G).
Non-deterministic pushdown automata recognize exactly the context-free languages.
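Taken together, these definitions suggest a brute-force way to list members of L(G): apply rules breadth-first starting from S and collect the sentential forms that consist only of terminals. The sketch below does this with a simple length bound; the pruning is only valid for grammars whose rules never shrink a sentential form (in particular, no ε-productions), as is the case for the example grammar S → aSb | ab, and the approach is no substitute for a parser or a pushdown automaton.

```python
from collections import deque

def language_up_to(rules, start, max_len):
    """All words of L(G) of length <= max_len, assuming no rule shortens a sentential form."""
    nonterminals = {left for left, _ in rules}
    start_form = (start,)
    words, seen, queue = set(), {start_form}, deque([start_form])
    while queue:
        form = queue.popleft()
        if all(symbol not in nonterminals for symbol in form):
            words.add("".join(form))                 # an all-terminal form is a word of L(G)
            continue
        for i, symbol in enumerate(form):
            for left, right in rules:
                if symbol == left:
                    new = form[:i] + right + form[i + 1:]
                    if len(new) <= max_len and new not in seen:
                        seen.add(new)
                        queue.append(new)
    return words

R = {("S", ("a", "S", "b")), ("S", ("a", "b"))}
assert language_up_to(R, "S", 6) == {"ab", "aabb", "aaabbb"}
```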
Examples
Words concatenated with their reverse
The grammar G = ({S}, {a, b}, R, S), with productions S → aSa, S → bSb and S → ε, is context-free. It is not proper since it includes an ε-production. A typical derivation in this grammar is

S ⇒ aSa ⇒ aaSaa ⇒ aabSbaa ⇒ aabbaa.
This makes it clear that L(G) = { ww^R : w ∈ {a, b}* }, where w^R denotes the reverse of w.
The language is context-free; however, it can be proved that it is not regular.
If the productions S → a and S → b are added, a context-free grammar for the set of all palindromes over the alphabet {a, b} is obtained.
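The typical derivation S ⇒ aSa ⇒ aaSaa ⇒ aabSbaa ⇒ aabbaa shown above generalizes to any w ∈ {a, b}*: applying one rule of the original grammar S → aSa | bSb | ε per character of w and then the ε-production yields w followed by its reverse. The following sketch (names chosen here) replays such a derivation mechanically.

```python
def derive_w_wR(w):
    """Build w·reverse(w) as the grammar does: one S -> aSa or S -> bSb step per character, then S -> ε."""
    sentential = ["S"]
    for ch in w:                              # apply S -> aSa or S -> bSb
        i = sentential.index("S")
        sentential[i:i + 1] = [ch, "S", ch]
    i = sentential.index("S")                 # finally apply S -> ε
    sentential[i:i + 1] = []
    return "".join(sentential)

assert derive_w_wR("aab") == "aabbaa" == "aab" + "aab"[::-1]
```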
Well-formed parentheses
The canonical example of a context-free grammar is parenthesis matching, which is representative of the general case. There are two terminal symbols "(" and ")" and one nonterminal symbol S. The production rules are

S → SS
S → (S)
S → ()

The first rule allows the S symbol to multiply; the second rule allows the S symbol to become enclosed by matching parentheses; and the third rule terminates the recursion.
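The language these three rules generate is exactly the set of nonempty, well-balanced parenthesis strings, which admits a much more direct membership test than parsing: no prefix may close more parentheses than it opens, and the totals must be equal. The check below is that equivalent characterization, written here for illustration; it is not a parser for the grammar.

```python
def well_formed(text):
    """True for nonempty, well-formed parenthesis strings over the terminals '(' and ')'."""
    depth = 0
    for ch in text:
        if ch == "(":
            depth += 1
        elif ch == ")":
            depth -= 1
            if depth < 0:              # a ')' with no matching '(' before it
                return False
        else:
            return False               # only '(' and ')' are terminal symbols here
    return depth == 0 and len(text) > 0

assert well_formed("(()())") and not well_formed("())(") and not well_formed("")
```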
Well-formed nested parentheses and square brackets
A second canonical example is two different kinds of matching nested parentheses, described by the productions

S → SS
S → ()
S → (S)
S → []
S → [S]

with terminal symbols [, ], (, ) and nonterminal S.
The following sequence can be derived in that grammar:

([ [ [ ()() [ ][ ] ] ]( [ ]) ])
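With two kinds of brackets a single counter no longer suffices, since each closing bracket must match the kind of the most recently opened one; the usual check keeps a stack of expected closers. Again, this is an equivalent characterization written here for illustration rather than a parser for the grammar (and, unlike the grammar, it also accepts the empty string):

```python
def nested_ok(text):
    """True if a string over the four terminals (, ), [, ] is properly nested and matched."""
    closers = {"(": ")", "[": "]"}
    stack = []
    for ch in text:
        if ch in closers:                          # an opening bracket: remember its closer
            stack.append(closers[ch])
        elif not stack or ch != stack.pop():       # a closer must match the most recent opener
            return False
    return not stack                               # nothing may remain open

assert nested_ok("([()])[()]") and not nested_ok("([)]")
```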
Matching pairs
In a context-free grammar, we can pair up characters the way we do with brackets. The simplest example:

S → aSb
S → ab

This grammar generates the language { a^n b^n : n ≥ 1 }, which is not regular.
The special character ε stands for the empty string. By changing the above grammar to

S → aSb
S → ε

we obtain a grammar generating the language { a^n b^n : n ≥ 0 } instead. This differs only in that it contains the empty string while the original grammar did not.
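As noted under "Context-free language" above, context-free languages are exactly those recognized by non-deterministic pushdown automata. This particular language happens to need no non-determinism: the sketch below simulates a simple deterministic pushdown behaviour, pushing one stack symbol per a, popping one per b, and accepting when the input ends with an empty stack. The encoding is, as before, an illustration chosen here.

```python
def pda_anbn(text):
    """Recognize {a^n b^n : n >= 0} by simulating a simple deterministic pushdown automaton."""
    stack, seen_b = [], False
    for ch in text:
        if ch == "a" and not seen_b:      # phase 1: push one marker per 'a'
            stack.append("A")
        elif ch == "b" and stack:         # phase 2: pop one marker per 'b'
            seen_b = True
            stack.pop()
        else:                             # out-of-order or unmatched symbol
            return False
    return not stack                      # accept iff every 'a' was matched

assert pda_anbn("") and pda_anbn("aabb") and not pda_anbn("aabbb") and not pda_anbn("abab")
```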
Distinct number of a's and b's
A context-free grammar for the language consisting of all strings over {a, b} containing an unequal number of a's and b's:

S → T | U
T → VaT | VaV | TaV
U → VbU | VbV | UbV
V → aVbV | bVaV | ε

Here, the nonterminal T can generate all strings with more a's than b's, the nonterminal U generates all strings with more b's than a's, and the nonterminal V generates all strings with an equal number of a's and b's. Omitting the third alternative in the rules for T and U does not restrict the grammar's language.
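The intended roles of T, U and V can be spot-checked mechanically, in the spirit of the enumeration sketched under "Context-free language" above but with a cap on the length of sentential forms so that the ε-production for V cannot make the search run forever. The bounded search below (encoding, cap and names chosen here) only inspects short words and is a sanity check, not a proof of the grammar's correctness.

```python
from collections import deque

RULES = {
    ("S", ("T",)), ("S", ("U",)),
    ("T", ("V", "a", "T")), ("T", ("V", "a", "V")), ("T", ("T", "a", "V")),
    ("U", ("V", "b", "U")), ("U", ("V", "b", "V")), ("U", ("U", "b", "V")),
    ("V", ("a", "V", "b", "V")), ("V", ("b", "V", "a", "V")), ("V", ()),
}
NONTERMINALS = {left for left, _ in RULES}

def words_up_to(start, cap):
    """Terminal words reachable from `start` via sentential forms of length <= cap."""
    start_form = (start,)
    words, seen, queue = set(), {start_form}, deque([start_form])
    while queue:
        form = queue.popleft()
        if all(symbol not in NONTERMINALS for symbol in form):
            words.add("".join(form))
            continue
        for i, symbol in enumerate(form):
            for left, right in RULES:
                if symbol == left:
                    new = form[:i] + right + form[i + 1:]
                    if len(new) <= cap and new not in seen:
                        seen.add(new)
                        queue.append(new)
    return words

assert all(w.count("a") > w.count("b") for w in words_up_to("T", 6))
assert all(w.count("a") < w.count("b") for w in words_up_to("U", 6))
assert all(w.count("a") == w.count("b") for w in words_up_to("V", 6))
```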