
Context-Sensitive Languages
Context-sensitive languages (CSLs) bridge the gap between the well-studied context-free languages and the more complex recursively enumerable languages. In this chapter, we will cover the concepts of context-sensitive languages, exploring their definitions, properties, and relationships to other language classes in the Chomsky hierarchy.
Context-Free Languages: Foundation for Understanding CSLs
Before diving into context-sensitive languages, it's essential to understand context-free languages (CFLs), as they form the foundation for our discussion.
What is a Context-Free Grammar?
A context-free language is generated by a context-free grammar (CFG). In a CFG, every production rule has the form A → X, where −
- A is a variable (non-terminal)
- X is any string of terminals and/or variables (possibly empty)
Properties of Context-Free Languages
The key characteristic of CFLs is that the replacement of A with X is independent of the surrounding context. This property gives CFLs their name: they are "free" of context constraints.
In the Chomsky hierarchy, CFLs are exactly the languages accepted by pushdown automata (PDAs), which are more powerful than finite automata but less powerful than linear-bounded automata.
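As a concrete illustration, the simplest nontrivial CFL is {aⁿbⁿ | n ≥ 0}, generated by the CFG S → aSb | ε. A minimal Python sketch (the function names are illustrative, not from the chapter) derives and tests members of this language:

```python
# The CFG  S -> aSb | ε  generates L = {a^n b^n | n >= 0}.

def derive(n):
    """Apply S -> aSb exactly n times, then S -> ε."""
    return "a" * n + "b" * n

def in_anbn(s):
    """Direct membership test for {a^n b^n}."""
    n = len(s) // 2
    return len(s) % 2 == 0 and s == "a" * n + "b" * n

print(derive(3))          # aaabbb
print(in_anbn("aabb"))    # True
print(in_anbn("abab"))    # False
```

The later, context-sensitive example {aⁿbⁿcⁿ} adds a third equal block, which is exactly what a single stack (a PDA) can no longer track.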
Context-Sensitive Languages: More Powerful CFLs
The context-sensitive languages extend the concept of CFLs by allowing production rules to depend on the context in which variables appear. This seemingly small change leads to a significant increase in expressive power. Let us understand CSG in greater detail.
What are Context-Sensitive Grammars (CSGs)?
A context-sensitive grammar has production rules of the form: αAβ → αXβ, where −
- α, β are strings of terminals and/or variables (can be empty)
- A is a variable
- X is a non-empty string of terminals or variables
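The defining restriction, that A may be rewritten only when the surrounding context α and β is actually present, can be sketched in a few lines of Python (the helper name and the sample rule are illustrative):

```python
def apply_rule(s, alpha, A, X, beta):
    """Apply the context-sensitive rule  alpha A beta -> alpha X beta
    at the first position where the full context matches.
    Return the rewritten string, or None if the context never occurs."""
    lhs, rhs = alpha + A + beta, alpha + X + beta
    i = s.find(lhs)
    return None if i == -1 else s[:i] + rhs + s[i + len(lhs):]

# A is rewritten only when it appears between 'a' and 'c':
print(apply_rule("aAc", "a", "A", "bb", "c"))   # abbc
print(apply_rule("Ac",  "a", "A", "bb", "c"))   # None: context 'a' is missing
```

Note that the context α and β is copied unchanged to the right-hand side; only the variable A between them is replaced.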
Properties of Context-Sensitive Languages
Listed below are some of the important properties of context-sensitive languages −
- Context Preservation − The production process maintains the same context (α and β) on both sides, ensuring that the replacement of A with X only occurs within the defined context.
- Non-Contracting − The replacement string X cannot be empty, so no derivation step ever shortens the string. The only exception is that the start variable S may derive ε when the empty string belongs to the language, in which case S must not appear on the right side of any rule.
- Increased Expressive Power − CSLs can describe patterns that CFLs cannot, such as matching multiple repeated substrings.
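The non-contracting property is easy to check mechanically. A small sketch (names are illustrative, and a possible S → ε rule is treated as the allowed special case):

```python
def is_non_contracting(rules):
    """Check the non-contracting property: every rule's right side is at
    least as long as its left side. Rules with an empty right side
    (the permitted S -> ε exception) are skipped."""
    return all(len(rhs) >= len(lhs) for lhs, rhs in rules if rhs != "")

print(is_non_contracting([("CB", "BC"), ("aB", "ab")]))   # True
print(is_non_contracting([("AB", "A")]))                  # False: AB -> A shrinks
print(is_non_contracting([("S", ""), ("CB", "BC")]))      # True: S -> ε is exempt
```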
The Chomsky Hierarchy and CSLs
Context-sensitive languages occupy the Type-1 level of the Chomsky hierarchy, between the context-free (Type-2) and recursively enumerable (Type-0) languages.
The relationship between different classes of languages is as follows −
- Regular Languages ⊂ Context-Free Languages ⊂ Context-Sensitive Languages ⊂ Recursively Enumerable Languages.
- CSLs have the added power to describe patterns that CFLs cannot.
Now let us understand CSLs through an example.
Example of a Language That is Context-Sensitive but Not Context-Free
A classic example of a language that is context-sensitive but not context-free is: L = {aⁿbⁿcⁿ | n ≥ 0}
The language consists of all strings with equal numbers of a's, b's, and c's, in that order. No context-free grammar can generate it, because a pushdown automaton's single stack cannot keep three counts equal at the same time.
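A direct way to see what membership in L means is a simple recognizer. The sketch below (function name illustrative) checks whether a string consists of equal blocks of a's, b's, and c's in that order:

```python
def in_anbncn(s):
    """Membership test for L = {a^n b^n c^n | n >= 0}."""
    n = len(s) // 3
    return len(s) % 3 == 0 and s == "a" * n + "b" * n + "c" * n

print(in_anbncn("aabbcc"))   # True
print(in_anbncn("aabbc"))    # False: counts differ
print(in_anbncn("abcabc"))   # False: equal counts, wrong order
```

Checking membership is trivial; the interesting question is how a grammar can *generate* exactly these strings.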
To illustrate the power of context-sensitive grammars, let's construct a CSG that generates the language L = {aⁿbⁿcⁿ | n ≥ 0}.
The production rules are as follows −
- S → ε (to handle the case n = 0)
- S → S'
- S' → aS'BC
- S' → aBC
- CB → BC
- aB → ab
- bB → bb
- bC → bc
- cC → cc
This grammar works through a series of transformations −
- Rule 1 handles the empty string case (n = 0).
- Rule 2 hands the derivation off to S', so that the start variable S never appears on the right side of any rule, as the ε-rule requires.
- Rules 3-4 generate the a's while producing one B and one C for each a, yielding sentential forms of the shape aⁿ(BC)ⁿ. The three counts stay equal by construction.
- Rule 5 rearranges the variables: each application moves a B one position to the left past a C, effectively "bubbling" the B's to the left and the C's to the right until the form is aⁿBⁿCⁿ.
- Rule 6 is crucial: it converts a B to a lowercase b only when it stands immediately to the right of an a, so conversion can begin only at the boundary between the a's and the B's.
- Rules 7-9 continue the conversion, turning each remaining B or C into a lowercase letter only when its left neighbor is already lowercase. This prevents premature conversion and forces the terminals into the order aⁿbⁿcⁿ.
The grammar maintains equal numbers of a's, B's, and C's while rearranging them into the correct order. The context-sensitive nature of the rules allows this precise control over the string's structure. Strictly speaking, rules such as CB → BC are written in the non-contracting (monotonic) form rather than the literal αAβ → αXβ form; the two formulations are known to generate exactly the same class of languages.
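The derivation process can be simulated mechanically. The sketch below (a hypothetical helper, with the variable S' written as T for simplicity) applies the non-contracting rules S' → aS'BC | aBC, CB → BC, aB → ab, bB → bb, bC → bc, cC → cc by breadth-first search over sentential forms and collects every terminal string up to a length bound:

```python
from collections import deque

# Rewrite rules of a standard non-contracting grammar for {a^n b^n c^n | n >= 1}.
# The variable S' is written as T so each symbol is a single character.
RULES = [
    ("T", "aTBC"), ("T", "aBC"),
    ("CB", "BC"),
    ("aB", "ab"), ("bB", "bb"), ("bC", "bc"), ("cC", "cc"),
]

def derive_terminals(max_len=9):
    """Breadth-first search over sentential forms reachable from T,
    returning all fully terminal strings of length <= max_len."""
    seen, results = {"T"}, set()
    queue = deque(["T"])
    while queue:
        form = queue.popleft()
        for lhs, rhs in RULES:
            i = form.find(lhs)
            while i != -1:                       # try every occurrence of lhs
                new = form[:i] + rhs + form[i + len(lhs):]
                if len(new) <= max_len and new not in seen:
                    seen.add(new)
                    if new.islower():            # no variables left: terminal string
                        results.add(new)
                    queue.append(new)
                i = form.find(lhs, i + 1)
    return sorted(results)

print(derive_terminals())   # ['aaabbbccc', 'aabbcc', 'abc']
```

Up to length 9, the only terminal strings the rules can produce are abc, aabbcc, and aaabbbccc, which is exactly L restricted to n ≤ 3; derivations that convert variables "too early" simply get stuck with variables remaining and contribute no terminal string.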
Conclusion
Context-sensitive languages offer a significant increase in expressive power over context-free languages by allowing production rules to depend on the surrounding context. This enables the description of complex patterns beyond the reach of CFLs.
In this chapter, we first reviewed context-free languages and their limitations, then presented the definition and properties of context-sensitive languages. Finally, we worked through an example demonstrating that context-sensitive grammars are strictly more expressive than context-free grammars.