A direct proof of the confluence of combinatory strong reduction
I give a proof of the confluence of combinatory strong reduction that does not rely on the corresponding result for the λ-calculus. I also give simple and direct proofs of a standardization theorem for this reduction and of the strong normalization of simply typed terms.
💡 Research Summary
The paper tackles three fundamental meta‑theoretical properties of combinatory logic’s strong reduction: confluence, standardization, and strong normalization, and it does so without resorting to the usual detour through the λ‑calculus. After a concise introduction that motivates a direct treatment—pointing out that the absence of variable binding in combinatory logic makes the rewrite system intrinsically syntactic—the author sets up the formal framework. The syntax consists of the classic combinators K, S, and I (with I defined as S K K) together with application. The contraction rules are the familiar ones: K x y → x, S x y z → x z (y z), and I x → x; strong reduction extends the weak reduction they generate by additionally closing it under the bracket‑abstraction operator [x], which is precisely what makes its meta‑theory harder than that of weak reduction. Because no α‑conversion is needed, the rewrite relation is purely structural, which opens the possibility of a self‑contained confluence proof.
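To make the contraction rules concrete, here is a minimal sketch in Python (not the paper's formalization): terms are nested pairs, and `contract` fires the K-, S-, or I-rule at the head of a term. It implements only weak head contraction, not the closure under bracket abstraction that full strong reduction requires; the names `app` and `contract` are illustrative.

```python
# Combinators are the strings "K", "S", "I"; application is a pair (f, x).
K, S, I = "K", "S", "I"

def app(*ts):
    """Left-associated application: app(S, x, y, z) == (((S, x), y), z)."""
    t = ts[0]
    for u in ts[1:]:
        t = (t, u)
    return t

def contract(t):
    """Contract the head redex of t, or return None if the head is not a redex."""
    head, args = t, []
    while isinstance(head, tuple):      # unwind the application spine
        head, arg = head
        args.insert(0, arg)
    if head == K and len(args) >= 2:    # K x y -> x
        return app(args[0], *args[2:])
    if head == S and len(args) >= 3:    # S x y z -> x z (y z)
        x, y, z = args[:3]
        return app((x, z), (y, z), *args[3:])
    if head == I and len(args) >= 1:    # I x -> x
        return app(args[0], *args[1:])
    return None
```

For instance, `contract(app(K, "a", "b"))` yields `"a"`, and `contract(app(S, K, K, "a"))` yields the term `(K a)(K a)`, i.e. `((K, "a"), (K, "a"))`.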
The core of the confluence argument is a parallel‑moves lemma adapted to combinatory strong reduction. The author distinguishes three interaction patterns: (a) disjoint redexes, (b) a redex that contains another, and (c) the nested S‑redex situation that is characteristic of combinatory logic. For each case a diagram is exhibited showing that the two one‑step reductions can be performed in parallel and then merged into a common successor. Iterating this lemma yields the diamond property for the parallel‑reduction relation, from which the Church‑Rosser property of ordinary reduction follows: any two reduction sequences from the same term can be joined. Importantly, the proof never translates terms into λ‑terms; all reasoning stays inside the combinatory system.
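The easy case (a) of disjoint redexes can be checked concretely: contracting two non-overlapping redexes in either order reaches the same term. A toy illustration (terms as pairs, with a hypothetical `contract` restricted to the two redex shapes involved; the harder cases (b) and (c) are exactly what the parallel-moves lemma handles):

```python
K, I = "K", "I"

def contract(t):
    """Contract t if it is a K- or I-redex at the root; else return t unchanged."""
    if isinstance(t, tuple):
        if isinstance(t[0], tuple) and t[0][0] == K:   # (K x) y -> x
            return t[0][1]
        if t[0] == I:                                  # I x -> x
            return t[1]
    return t

t = (((K, "a"), "b"), (I, "c"))   # (K a b) (I c): two disjoint redexes

# Contract the K-redex first, then the I-redex...
left_first = (contract(t[0]), t[1])
left_first = (left_first[0], contract(left_first[1]))

# ...or the I-redex first, then the K-redex.
right_first = (t[0], contract(t[1]))
right_first = (contract(right_first[0]), right_first[1])

# Both orders join at the common successor (a, c).
assert left_first == right_first == ("a", "c")
```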
Having secured confluence, the paper proceeds to a standardization theorem. A “standard” reduction sequence is defined by a priority rule: whenever a term of the form S x y z appears, the S‑reduction must be taken before any K‑reduction that could be applied elsewhere. The author shows that any arbitrary strong reduction can be rearranged, via a finite series of local transformations, into a standard one without changing its endpoint. The proof relies on the previously established confluence and on a careful analysis of how S‑ and K‑redexes interact. This result guarantees that a deterministic reduction strategy—always reducing the leftmost‑outermost S‑redex—will reach the same normal form as any other strategy, which is valuable for implementation.
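The kind of deterministic strategy the theorem justifies can be sketched as a leftmost-outermost normalizer. This is an illustrative implementation, not the paper's, and it uses a plain leftmost-outermost order rather than the S-prioritizing order described above; by confluence, any strategy that reaches a normal form reaches the same one. The fuel bound guards against untyped terms with no normal form.

```python
# Toy combinatory terms: combinators are strings, application is a pair.
K, S, I = "K", "S", "I"

def app(*ts):
    """Left-associated application."""
    t = ts[0]
    for u in ts[1:]:
        t = (t, u)
    return t

def contract(t):
    """Head contraction: reduce t if its spine forms a K-, S-, or I-redex."""
    head, args = t, []
    while isinstance(head, tuple):
        head, arg = head
        args.insert(0, arg)
    if head == K and len(args) >= 2:
        return app(args[0], *args[2:])
    if head == S and len(args) >= 3:
        x, y, z = args[:3]
        return app((x, z), (y, z), *args[3:])
    if head == I and len(args) >= 1:
        return app(args[0], *args[1:])
    return None

def step(t):
    """One leftmost-outermost reduction step, or None if t is in normal form."""
    r = contract(t)
    if r is not None:
        return r
    if isinstance(t, tuple):
        left = step(t[0])
        if left is not None:
            return (left, t[1])
        right = step(t[1])
        if right is not None:
            return (t[0], right)
    return None

def normalize(t, fuel=10_000):
    """Iterate `step` to a normal form (fuel-bounded: untyped terms may diverge)."""
    while fuel > 0:
        r = step(t)
        if r is None:
            return t
        t, fuel = r, fuel - 1
    raise RuntimeError("no normal form within fuel bound")
```

For example, `normalize(app(S, K, K, "a"))` returns `"a"`, witnessing the defining equation I = S K K.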
The final technical contribution is a strong normalization proof for simply‑typed combinatory terms. Types are assigned exactly as in the simply‑typed λ‑calculus, with K : A → B → A and S : (A → B → C) → (A → B) → A → C, and I inherits the type of its definition. The author adopts the reducibility‑candidates (or logical‑relations) method. For each type a set of “strongly normalizing” terms is defined inductively; the key lemmas prove that (i) the reduction rules preserve typing, (ii) every term of a given type belongs to its candidate set, and (iii) candidates are closed under the strong reduction steps. From these properties it follows that every well‑typed term has no infinite strong‑reduction chain, i.e., the system is strongly normalizing. This proof mirrors the classic one for λ‑calculus but is carried out entirely within the combinatory framework, illustrating that the same meta‑theoretic power is available without the λ‑translation overhead.
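The type assignment can be made concrete with a small sketch (illustrative names only; types as nested pairs): instantiating the schemes for K and S and typing the two applications shows that I = S K K indeed inherits the type A → A from its definition.

```python
# Simple types: base types are strings; ("->", a, b) is the arrow a -> b.
def arrow(*ts):
    """Right-associated arrows: arrow(a, b, c) == a -> (b -> c)."""
    t = ts[-1]
    for u in reversed(ts[:-1]):
        t = ("->", u, t)
    return t

def k_type(a, b):
    return arrow(a, b, a)                               # K : A -> B -> A

def s_type(a, b, c):
    return arrow(arrow(a, b, c), arrow(a, b), a, c)     # S : (A->B->C) -> (A->B) -> A -> C

def app_type(f_ty, x_ty):
    """Type of an application (f x), or None if it is ill-typed."""
    if isinstance(f_ty, tuple) and f_ty[0] == "->" and f_ty[1] == x_ty:
        return f_ty[2]
    return None

# I = S K K: instantiate S at A, B -> A, A, then type the two K arguments.
A, B = "A", "B"
s_inst = s_type(A, arrow(B, A), A)
i_ty = app_type(app_type(s_inst, k_type(A, arrow(B, A))), k_type(A, B))
assert i_ty == arrow(A, A)                              # I : A -> A
```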
In the concluding discussion the author emphasizes the significance of a direct confluence proof: it eliminates the need for a λ‑calculus intermediary, thereby simplifying the meta‑theory of combinatory logic and making the results more transparent for applications such as combinator‑based functional languages or proof assistants that work directly with combinators. The paper also suggests future extensions, such as handling richer type systems (polymorphism, dependent types) or additional combinators, and exploring how the established techniques scale to those settings. Overall, the work provides a clean, self‑contained foundation for reasoning about strong reduction in combinatory logic, aligning its meta‑properties with those long known for the λ‑calculus while staying strictly within the combinatory realm.