Quasipolynomial Normalisation in Deep Inference via Atomic Flows and Threshold Formulae

Reading time: 6 minutes

📝 Original Info

  • Title: Quasipolynomial Normalisation in Deep Inference via Atomic Flows and Threshold Formulae
  • ArXiv ID: 0903.5392
  • Date: 2017-01-11
  • Authors: Paola Bruscoli, Alessio Guglielmi, Tom Gundersen, Michel Parigot

📝 Abstract

Jeřábek showed that cuts in classical propositional logic proofs in deep inference can be eliminated in quasipolynomial time. The proof is indirect and it relies on a result of Atserias, Galesi and Pudlák about monotone sequent calculus and a correspondence between that system and cut-free deep-inference proofs. In this paper we give a direct proof of Jeřábek's result: we give a quasipolynomial-time cut-elimination procedure for classical propositional logic in deep inference. The main new ingredient is the use of a computational trace of deep-inference proofs called atomic flows, which are both very simple (they only trace structural rules and forget logical rules) and strong enough to faithfully represent the cut-elimination procedure.


📄 Full Content

Deep inference is a proof-theoretic methodology where proofs can be freely composed by the logical operators, instead of having a rigid formula-directed tree structure, as in Gentzen proof theory [Gug07,BT01,Brü04,GGP10]. As a result, inference rules apply arbitrarily deep inside formulae, contrary to traditional proof systems such as natural deduction and the sequent calculus, where inference rules only deal with the outermost structure of formulae.
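To make the contrast concrete, here is a minimal sketch (not from the paper) of what "applying a rule arbitrarily deep inside a formula" means. Formulae are nested tuples, and a rewrite step is tried at the root and then recursively inside subformulae, in contrast to sequent-style systems that only act on the outermost connective. The representation and rule name are illustrative assumptions.

```python
# Formulae as nested tuples: 't', 'f', ('atom', name),
# ('or', A, B), ('and', A, B).

def apply_deep(formula, rule):
    """Try `rule` at the root; otherwise recurse into subformulae."""
    rewritten = rule(formula)
    if rewritten is not None:
        return rewritten, True
    if isinstance(formula, tuple) and formula[0] in ('and', 'or'):
        op, left, right = formula
        new_left, done = apply_deep(left, rule)
        if done:
            return (op, new_left, right), True
        new_right, done = apply_deep(right, rule)
        if done:
            return (op, left, new_right), True
    return formula, False

# Illustrative rule in the spirit of SKS's atomic identity, "t -> a v not-a",
# instantiated here for the atom 'a' only.
def ai_down(f):
    if f == 't':
        return ('or', ('atom', 'a'), ('atom', 'not-a'))
    return None

# The unit 't' sits deep inside the formula; the rule still fires there.
deep = ('and', ('atom', 'b'), ('or', 't', ('atom', 'c')))
result, changed = apply_deep(deep, ai_down)
```

The point of the sketch is only the traversal: the rewrite happens at an inner position of the formula tree, something a sequent-calculus rule, which decomposes the main connective, cannot do directly.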

The techniques presented here contribute to the overall clarification of this highly technical matter, by reducing our dependency on syntax. The techniques developed via atomic flows tolerate variations in the proof-system specification. In fact, their geometric nature makes them largely independent of syntax, provided that certain linearity conditions are respected (and this is usually achievable in deep inference).

The paper is self-contained. Sections 2 and 3 are devoted, respectively, to the necessary background on deep inference and atomic flows. Threshold functions and formulae are introduced in Section 5.
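Threshold formulae are the source of the quasipolynomial bound, so a brief sketch may help. A threshold formula asserts that at least k of n propositional variables are true; the classic divide-and-conquer construction, which this line of work builds on, splits the variables in half and takes a disjunction over the ways the count can distribute. The representation and function names below are mine, not the paper's; the size recurrence of this construction yields the quasipolynomial n^O(log n) bound.

```python
def th(k, xs):
    """Monotone formula (nested tuples) asserting that >= k of the
    variables in xs are true, built by divide and conquer."""
    if k <= 0:
        return 't'                       # trivially satisfied
    if k > len(xs):
        return 'f'                       # unsatisfiable
    if len(xs) == 1:
        return ('atom', xs[0])           # here k == 1
    mid = len(xs) // 2
    left, right = xs[:mid], xs[mid:]
    # at least k overall  <=>  at least i on the left and k - i on the right
    disjuncts = [('and', th(i, left), th(k - i, right)) for i in range(k + 1)]
    formula = disjuncts[0]
    for d in disjuncts[1:]:
        formula = ('or', formula, d)
    return formula

def evaluate(f, assignment):
    """Evaluate a nested-tuple formula under a dict of truth values."""
    if f == 't':
        return True
    if f == 'f':
        return False
    if f[0] == 'atom':
        return assignment[f[1]]
    a, b = evaluate(f[1], assignment), evaluate(f[2], assignment)
    return (a and b) if f[0] == 'and' else (a or b)
```

Note that the construction uses only conjunction and disjunction, so the resulting formulae are monotone, which is exactly what the connection to monotone sequent calculus mentioned in the abstract exploits.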

We normalise proofs in two steps, each of which has a dedicated section in the paper: (1) we transform any given proof into what we call its 'simple form'; no use is made of threshold formulae and no significant proof complexity is introduced. This is presented in Section 4 and is mostly an exercise in deep inference and atomic flows. (2) In Section 6, we show the cut-elimination step, starting from proofs in simple form; here, threshold formulae play a major role.

Normalisation can be taken one step further, by removing the instances of the only remaining inference rule that is not analytic in the deep-inference sense, viz. coweakening. This is performed by a simple and standard deep-inference procedure in Section 7.

Section 8 concludes the paper with comments on future research directions. Parts of this paper were presented at LPAR 16 [BGGP10] and some appear in [Gun09]. Recently, threshold functions have been used in [Das14] to build quasipolynomial size cut-free deep-inference proofs of the propositional pigeonhole principle that, crucially, do not use cocontraction, which is a form of dagness.

Inside the deep-inference methodology we can define several formalisms, i.e. general prescriptions on how to design proof systems, in the same sense as the sequent calculus and natural deduction are formalisms in Gentzen-style proof theory (where the structure of proofs is determined by the tree structure of the formulae they prove).

The first, and conceptually simpler, formalism that has been defined in deep inference is called the calculus of structures, or CoS [Gug07]. Another deep-inference formalism has later been introduced in [GGP10], called open deduction. Open deduction is more general than CoS, in the sense that every CoS derivation is also an open-deduction derivation. On the other hand, every open-deduction derivation can be transformed into a CoS derivation by a straightforward transformation that essentially amounts to interleaving derivations. The cost of this transformation is at most quadratic in the size of the original open-deduction derivation; therefore, from the point of view of complexity, CoS and open deduction are equivalent.

CoS and open deduction are equivalent also from the point of view of proof theory, because the two formalisms are just two different notations for derivations of the same nature, and so every derivation transformation that can be performed in one formalism can also be performed in the other. In this paper we will adopt the open-deduction notation, especially because it is more efficient for the reader. However, given that most of the literature in deep inference adopts the CoS notation, which is more similar to the traditional Gentzen syntax, we will present both styles in this section.

The standard proof system of propositional logic in deep inference is called SKS. The basic proof-complexity properties of SKS, and so of propositional logic in deep inference, have been studied in [BG09] (which can also be used as an introduction to SKS). Those properties are:

• SKS is polynomially equivalent to Frege proof systems.

• SKS can be extended with Tseitin's extension and substitution, and the proof systems so obtained are polynomially equivalent to Frege proof systems augmented with extension and substitution.

• Cut-free SKS polynomially simulates cut-free Gentzen proof systems for propositional logic, but the converse does not hold: in fact, Statman's tautologies admit polynomial proofs in cut-free SKS but only exponential ones in cut-free Gentzen [Sta78].

We now quickly introduce all the necessary notions. An excellent and more relaxed introduction to SKS in CoS and its basic properties is [Brü04].

Formulae, denoted by A, B, C and D, are freely built from: units, f (false) and t (true); atoms, denoted by a, b, c, d and e; disjunction and conjunction, [A ∨ B] and (A ∧ B). The different brackets have the only purpose of improving legibility; we usually omit external brackets.
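The grammar just described is small enough to transcribe directly. Below is a sketch of it as a datatype, with a printer following the stated bracketing convention: square brackets for disjunction, round brackets for conjunction. The class names are my own choices, not notation from the paper.

```python
from dataclasses import dataclass

class Formula:
    pass

@dataclass
class Unit(Formula):
    name: str                            # 'f' (false) or 't' (true)
    def __str__(self):
        return self.name

@dataclass
class Atom(Formula):
    name: str                            # a, b, c, d, e, ...
    def __str__(self):
        return self.name

@dataclass
class Or(Formula):
    left: Formula
    right: Formula
    def __str__(self):
        return f'[{self.left} ∨ {self.right}]'   # square brackets

@dataclass
class And(Formula):
    left: Formula
    right: Formula
    def __str__(self):
        return f'({self.left} ∧ {self.right})'   # round brackets

example = And(Or(Atom('a'), Unit('f')), Atom('b'))
```

Printing `example` yields `([a ∨ f] ∧ b)`, showing how the two bracket shapes disambiguate the connectives purely for legibility, as the text notes.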

…(Full text truncated)…

Reference

This content is AI-processed based on ArXiv data.
