LLMpedia: the first transparent, open encyclopedia generated by LLMs

Sequent calculus

Generated by GPT-5-mini
Note: This article was automatically generated by a large language model (LLM) from purely parametric knowledge (no retrieval). It may contain inaccuracies or hallucinations. This encyclopedia is part of a research project currently under review.
Article Genealogy
Parent: Jean-Yves Girard (hop 5)
Expansion Funnel Raw 1 → Dedup 0 → NER 0 → Enqueued 0
Sequent calculus
Name: Sequent calculus
Caption: Gentzen's proof calculus for formal derivations, introduced alongside natural deduction
Introduced: 1934–1935
Inventor: Gerhard Gentzen
Field: Mathematical logic
Notable works: Untersuchungen über das logische Schließen

Sequent calculus is a formal system for representing proofs and deductions in logic, introduced to analyze the structure of proofs and to establish meta-theoretical results such as consistency and normalization. It provides a symmetric, rule-based framework that treats premises and conclusions uniformly and has been influential in proof theory, automated reasoning, and the study of computation via logic. The calculus shaped developments in structural proof theory, influenced type theory, and connected to algebraic and categorical approaches in the mid-20th and early-21st centuries.

History

Gerhard Gentzen introduced the sequent calculus in 1934–1935 in his paper Untersuchungen über das logische Schließen, working within the foundational program of David Hilbert and Paul Bernays. His analysis built on the logical tradition begun by Gottlob Frege and was carried out in the shadow of Kurt Gödel's incompleteness theorems, which constrained what finitistic consistency proofs could achieve. In the same period, Stanisław Jaśkowski independently developed a system of natural deduction. Later logicians, including Dag Prawitz, William Tait, Michael Dummett, and Jean-Yves Girard, extended Gentzen's structural analysis through normalization, proof-theoretic semantics, and substructural refinements, while categorical perspectives associated with Saunders Mac Lane and Joachim Lambek connected sequent calculi to category theory.

Syntax and sequents

A sequent relates a finite sequence of antecedent formulas Γ to a finite sequence of succedent formulas Δ, written Γ ⊢ Δ (Gentzen originally used an arrow rather than a turnstile; modern presentations often take Γ and Δ to be multisets or sets). Informally, the sequent asserts that the conjunction of the formulas in Γ entails the disjunction of the formulas in Δ. Unlike Hilbert-style axiom systems in the tradition of David Hilbert and Paul Bernays, sequents make hypotheses and conclusions explicit objects that rules can manipulate directly; in this they resemble the judgments of Per Martin-Löf's type theory and the morphisms studied by William Lawvere in categorical logic.
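The two-sided structure of a sequent can be captured directly in code. The following sketch (in Python; the `Sequent` class, the tuple encoding of formulas, and the printer are illustrative choices made for this article, not a standard representation) models a sequent as a pair of formula tuples:

```python
from dataclasses import dataclass

# Formulas are nested tuples such as ("and", "p", "q"); atoms are plain
# strings.  These names are illustrative, not a standard API.

def fmt(f):
    """Pretty-print a formula tuple."""
    if isinstance(f, str):
        return f
    op = f[0]
    if op == "not":
        return "~" + fmt(f[1])
    sym = {"and": "&", "or": "|", "imp": "->"}[op]
    return f"({fmt(f[1])} {sym} {fmt(f[2])})"

@dataclass(frozen=True)
class Sequent:
    """Gamma |- Delta: finite tuples of antecedent and succedent formulas."""
    antecedent: tuple
    succedent: tuple

    def __str__(self):
        left = ", ".join(fmt(f) for f in self.antecedent)
        right = ", ".join(fmt(f) for f in self.succedent)
        return f"{left} |- {right}"

s = Sequent(antecedent=("p", ("imp", "p", "q")), succedent=("q",))
print(s)  # p, (p -> q) |- q
```

Using tuples rather than lists keeps sequents hashable and immutable, which matches their role as static proof objects.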

Inference rules

Inference rules in the sequent calculus are local transformations of sequents, presented as left and right introduction rules: a connective may be introduced in the antecedent (left of the turnstile) or in the succedent (right). Gentzen formulated paired left and right rules for conjunction, disjunction, implication, and negation, together with quantifier rules. Closely related rule-by-rule decompositions of formulas reappear in the semantic tableaux of Evert Beth and Jaakko Hintikka, in automated theorem proving in the tradition leading to Alan Robinson's resolution method, and in the proof-complexity analyses initiated by Stephen Cook.
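Read bottom-up, each left or right rule replaces a goal sequent by the premise sequents from which it would follow. A minimal sketch of two such rules, assuming sequents are encoded as a pair `(antecedent, succedent)` of formula tuples and compound formulas are tuples like `("imp", A, B)` (an illustrative encoding chosen for this article, not a standard API):

```python
# Two representative rules, read backward (from goal to premises).

def and_right(seq, i):
    """R-conjunction: to prove Gamma |- Delta, A & B,
    prove Gamma |- Delta, A and Gamma |- Delta, B."""
    ant, suc = seq
    op, a, b = suc[i]
    assert op == "and"
    rest = suc[:i] + suc[i + 1:]
    return [(ant, rest + (a,)), (ant, rest + (b,))]

def imp_left(seq, i):
    """L-implication: to prove Gamma, A -> B |- Delta,
    prove Gamma |- Delta, A and Gamma, B |- Delta."""
    ant, suc = seq
    op, a, b = ant[i]
    assert op == "imp"
    rest = ant[:i] + ant[i + 1:]
    return [(rest, suc + (a,)), (rest + (b,), suc)]

goal = (("p", ("imp", "p", "q")), ("q",))   # p, p -> q |- q
premises = imp_left(goal, 1)
print(premises)  # [(('p',), ('q', 'p')), (('p', 'q'), ('q',))]
```

Both premises returned for the modus-ponens goal close immediately by the axiom (a shared formula on both sides), illustrating how rule application drives backward proof search.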

Structural rules

Structural rules govern the manipulation of premises and conclusions independently of the logical connectives; the principal examples are weakening (adding a formula), contraction (merging duplicate occurrences of a formula), and exchange (permuting formulas). Gentzen used these rules to control duplication and permutation of formulas in proofs, ideas later made resource-sensitive in Jean-Yves Girard's linear logic and in linear type systems. Restricting or removing structural rules is what separates classical and intuitionistic logic from the substructural logics studied by Nuel Belnap, Graham Priest, and Greg Restall.
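Because structural rules ignore the internal shape of formulas, they amount to plain sequence manipulations. A sketch of the three left-side rules (the right-side versions are symmetric), again over an illustrative `(antecedent, succedent)` tuple encoding:

```python
# Left-side structural rules; the encoding and names are illustrative.

def weaken_left(seq, a):
    """Weakening: from Gamma |- Delta infer Gamma, A |- Delta."""
    ant, suc = seq
    return (ant + (a,), suc)

def contract_left(seq, i, j):
    """Contraction: collapse two occurrences of the same antecedent formula."""
    ant, suc = seq
    assert i != j and ant[i] == ant[j]
    return (tuple(f for k, f in enumerate(ant) if k != j), suc)

def exchange_left(seq, i, j):
    """Exchange: permute two antecedent formulas."""
    ant, suc = seq
    new = list(ant)
    new[i], new[j] = new[j], new[i]
    return (tuple(new), suc)

seq = (("p", "q", "p"), ("r",))
print(contract_left(seq, 0, 2))  # (('p', 'q'), ('r',))
print(weaken_left(seq, "s"))     # (('p', 'q', 'p', 's'), ('r',))
```

Dropping `contract_left` and `weaken_left` from such a system while keeping the logical rules is exactly the move that yields a linear (resource-sensitive) calculus.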

Cut elimination and admissibility

The cut rule composes proofs through an intermediate formula, allowing a lemma proved in one derivation to be used in another. Gentzen's Hauptsatz, the cut-elimination theorem, establishes that any derivation using cut can be transformed into a cut-free derivation. Cut-free derivations enjoy the subformula property (every formula occurring in the proof is a subformula of the end-sequent), which yields consistency proofs in the style pursued by Hilbert and Bernays. Cut elimination is closely linked to normalization in natural deduction, studied by Dag Prawitz and William Tait, and to strong normalization in the lambda calculus, explored by Henk Barendregt and Jean-Louis Krivine. Admissibility of cut also underlies William Craig's interpolation theorem, which is standardly proved from cut-free derivations, and decidability analyses in the tradition of Emil Post and Alan Turing.
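One practical consequence of cut elimination is that, for classical propositional logic, cut-free backward proof search is complete and terminates: each rule, read bottom-up, removes one connective. The sketch below decides validity of a sequent under an illustrative encoding (atoms as strings, compound formulas as tuples such as `("imp", A, B)`); it uses invertible G3-style rules, so no structural rules or backtracking over rule choice are needed:

```python
def provable(ant, suc):
    """Cut-free backward proof search for classical propositional sequents.

    ant, suc: tuples of formulas.  Atoms are strings; compounds are
    ("not", A), ("and", A, B), ("or", A, B), or ("imp", A, B).
    Every rule removes one connective, so the recursion terminates.
    """
    for i, f in enumerate(ant):           # decompose a compound on the left
        if not isinstance(f, str):
            rest = ant[:i] + ant[i + 1:]
            if f[0] == "not":
                return provable(rest, suc + (f[1],))
            if f[0] == "and":
                return provable(rest + (f[1], f[2]), suc)
            if f[0] == "or":
                return (provable(rest + (f[1],), suc)
                        and provable(rest + (f[2],), suc))
            if f[0] == "imp":
                return (provable(rest, suc + (f[1],))
                        and provable(rest + (f[2],), suc))
    for i, f in enumerate(suc):           # decompose a compound on the right
        if not isinstance(f, str):
            rest = suc[:i] + suc[i + 1:]
            if f[0] == "not":
                return provable(ant + (f[1],), rest)
            if f[0] == "and":
                return (provable(ant, rest + (f[1],))
                        and provable(ant, rest + (f[2],)))
            if f[0] == "or":
                return provable(ant, rest + (f[1], f[2]))
            if f[0] == "imp":
                return provable(ant + (f[1],), rest + (f[2],))
    return any(a in suc for a in ant)     # axiom: shared atom on both sides

print(provable((), (("or", "p", ("not", "p")),)))   # True  (excluded middle)
print(provable(("p", ("imp", "p", "q")), ("q",)))   # True  (modus ponens)
print(provable((), (("imp", "p", "q"),)))           # False
```

The axiom check only fires once every formula is atomic, which is sound here because all the rules used are invertible; the absence of any cut case in the search is exactly what the subformula property licenses.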

Variants and extensions

Many variants of the sequent calculus have been introduced to capture different logics and computational behaviors: multi-conclusion systems for classical logic, single-conclusion systems for intuitionistic logic, and focused calculi developed by Jean-Marc Andreoli. Substructural variants include linear logic by Jean-Yves Girard, relevant logic pursued by A. R. Anderson and N. D. Belnap, and Lambek calculus influenced by Joachim Lambek’s syntactic studies. Extensions connect to categorical proof theory as in the work of Saunders Mac Lane and William Lawvere, and to modal and temporal logics studied by Saul Kripke and Arthur Prior, with applications in concurrency theories advanced by Robin Milner and Tony Hoare.

Applications and significance

Sequent calculus underlies many modern proof assistants and automated deduction systems influenced by Robin Milner's LCF tradition and by Isabelle, developed by Lawrence Paulson and Tobias Nipkow, and it informs type-theoretic frameworks such as the Curry–Howard correspondence explored by Henk Barendregt and Philip Wadler. In theoretical computer science, it supports proof-complexity analyses initiated by Stephen Cook and descriptive complexity in the sense of Neil Immerman, and it provides semantics for programming languages via the category-theoretic bridges built by Joachim Lambek and William Lawvere. In mathematics and philosophy, sequent methods inform studies by Michael Dummett and Saul Kripke on meaning, inference, and modality, and they continue to guide research on proof identity, interpolation, and constructive content.

Category:Proof theory