On completeness of reducibility candidates as a semantics of strong normalization
This paper defines a sound and complete semantic criterion, based on reducibility candidates, for strong normalization of theories expressed in minimal deduction modulo à la Curry. The use of Curry-style proof-terms allows this criterion to be built on the classic notion of pre-Heyting algebras and makes it apply to all theories expressed in minimal deduction modulo. Compared to using Church-style proof-terms, this method provides both a simpler definition of the criterion and a simpler proof of its completeness.
💡 Research Summary
The paper presents a semantic criterion based on reducibility candidates that is both sound and complete for proving strong normalization of theories formulated in minimal deduction modulo using Curry‑style proof terms. The authors begin by recalling the syntax of minimal deduction modulo and the distinction between Curry‑style and Church‑style proof terms. In Curry‑style, proof terms are untyped syntactic objects, which allows the authors to avoid the additional complexity introduced by type annotations in Church‑style presentations.
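To make the contrast concrete, here is a minimal sketch (in Python, with illustrative names not taken from the paper) of the Curry-style term representation: binders carry no type annotations, whereas a Church-style binder would additionally record the type of its bound variable.

```python
from dataclasses import dataclass
from typing import Union

# Curry-style proof terms: plain untyped lambda-terms.
@dataclass(frozen=True)
class Var:
    name: str

@dataclass(frozen=True)
class Lam:
    param: str          # no type annotation on the binder
    body: "Term"

@dataclass(frozen=True)
class App:
    fn: "Term"
    arg: "Term"

Term = Union[Var, Lam, App]

# A Church-style binder would instead look like Lam(param, param_type, body),
# so every definition and proof over terms would also have to track
# type annotations and their conversions.

identity = Lam("x", Var("x"))                                      # \x. x
apply_twice = Lam("f", Lam("x", App(Var("f"), App(Var("f"), Var("x")))))
```

This is why the Curry-style presentation lets the candidate sets be defined over bare terms, independently of any typing discipline.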
The core construction is a family of sets C(φ) of proof terms, one for each formula φ. Each set satisfies three classic conditions: (i) every term in C(φ) is strongly normalizing; (ii) the sets are closed under β-reduction (if t ∈ C(φ) and t reduces to t′, then t′ ∈ C(φ)); and (iii) they are compatible with application (if t ∈ C(φ → ψ) and u ∈ C(φ), then the application t u ∈ C(ψ)). These are precisely the Tait–Girard reducibility-candidate conditions, but the authors embed them into a pre-Heyting algebraic semantics. A pre-Heyting algebra provides an algebraic interpretation of the logical connectives (∧, ∨, →, ⊥) without requiring the full intuitionistic Heyting structure, which matches the minimal nature of the deduction modulo framework.
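In standard notation (ours, with SN the set of strongly normalizing terms and →β one-step β-reduction), the three conditions read:

```latex
\begin{aligned}
\text{(i)}\quad   & C(\phi) \subseteq \mathrm{SN} \\
\text{(ii)}\quad  & t \in C(\phi),\ t \to_\beta t' \ \Longrightarrow\ t' \in C(\phi) \\
\text{(iii)}\quad & t \in C(\phi \to \psi),\ u \in C(\phi) \ \Longrightarrow\ t\,u \in C(\psi)
\end{aligned}
```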
The soundness proof shows that any term belonging to a candidate set is indeed strongly normalizing. This is achieved by an induction on the structure of terms, using the closure properties of the candidate sets and the fact that the algebraic operations preserve normalization. The authors carefully verify that each inference rule of minimal deduction modulo (introduction and elimination rules for the connectives, as well as the congruence rules of the modulo part) respects the candidate conditions.
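The property the soundness proof delivers, strong normalization, can be illustrated with a small bounded checker (a sketch in Python; all names and the fuel bound are ours, and since strong normalization is undecidable in general, exhausting the bound is conservatively reported as a failure).

```python
import itertools

# Terms are tuples: ('var', x) | ('lam', x, body) | ('app', f, a).
_fresh = itertools.count()

def subst(t, x, s):
    """Capture-avoiding substitution t[x := s]."""
    if t[0] == 'var':
        return s if t[1] == x else t
    if t[0] == 'app':
        return ('app', subst(t[1], x, s), subst(t[2], x, s))
    y, body = t[1], t[2]
    if y == x:
        return t
    z = f"{y}_{next(_fresh)}"      # rename the binder to avoid capture
    return ('lam', z, subst(subst(body, y, ('var', z)), x, s))

def step(t):
    """All one-step beta-reducts of t."""
    out = []
    if t[0] == 'app':
        f, a = t[1], t[2]
        if f[0] == 'lam':          # beta redex: (\y. body) a
            out.append(subst(f[2], f[1], a))
        out += [('app', f2, a) for f2 in step(f)]
        out += [('app', f, a2) for a2 in step(a)]
    elif t[0] == 'lam':
        out += [('lam', t[1], b2) for b2 in step(t[2])]
    return out

def strongly_normalizing(t, fuel=100):
    """True if every reduction sequence from t terminates within
    `fuel` steps; running out of fuel is reported as False."""
    if fuel == 0:
        return False
    return all(strongly_normalizing(r, fuel - 1) for r in step(t))

I = ('lam', 'x', ('var', 'x'))
W = ('lam', 'x', ('app', ('var', 'x'), ('var', 'x')))   # \x. x x
OMEGA = ('app', W, W)                                    # reduces to itself

print(strongly_normalizing(('app', I, I)))   # True
print(strongly_normalizing(OMEGA))           # False
```

The candidate method avoids any such exhaustive search: membership of a proof term in C(φ) certifies termination of every reduction sequence at once.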
The completeness proof is more subtle. Assuming a theory T is strongly normalizing, the authors construct a model of T inside a suitable pre‑Heyting algebra. They then demonstrate that for every provable formula φ there exists at least one proof term that maps into the corresponding candidate set C(φ). The construction relies on the completeness of pre‑Heyting algebras for the fragment of logic considered, and on the ability to interpret Curry‑style terms directly as algebraic elements without type constraints. This yields a uniform argument that any strongly normalizing theory admits a full family of reducibility candidates, establishing completeness.
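Writing SN(T) for "every proof term of the theory T is strongly normalizing", the two halves of the result can be summarized as follows (our paraphrase of the statements above, with (i)–(iii) the candidate conditions):

```latex
\begin{aligned}
\textbf{Soundness:}\quad    & \exists\, \{C(\phi)\}_\phi \text{ satisfying (i)--(iii)}
  \ \Longrightarrow\ \mathrm{SN}(T) \\
\textbf{Completeness:}\quad & \mathrm{SN}(T)
  \ \Longrightarrow\ \exists\, \{C(\phi)\}_\phi \text{ satisfying (i)--(iii)}
\end{aligned}
```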
A significant contribution of the paper is the comparison with the Church‑style approach. In the Church‑style setting, proof terms carry type information, and defining reducibility candidates requires a simultaneous treatment of term reduction and type conversion, which complicates both the definition and the completeness argument. By contrast, the Curry‑style presentation eliminates the need for a separate typing discipline, allowing the authors to define candidates purely in terms of term reduction and algebraic interpretation. Consequently, the definitions become shorter, the proofs more transparent, and the overall framework more modular.
Finally, the authors discuss potential applications. The reducibility‑candidate criterion can be incorporated into proof assistants that support deduction modulo, providing an automated check for strong normalization of user‑defined rewrite rules. Moreover, because the criterion is expressed entirely in algebraic terms, it may be adapted to other logical frameworks that employ similar congruence‑based reasoning, such as rewriting‑based type theories or logical frameworks based on higher‑order abstract syntax.
In summary, the paper delivers a clean, algebraic semantics for strong normalization in minimal deduction modulo, leveraging Curry‑style proof terms to simplify both the definition of reducibility candidates and the proof of their completeness. This work advances the theoretical foundations of normalization proofs and opens avenues for practical implementation in automated reasoning tools.