Complexity of Non-Monotonic Logics

Notice: This research summary and analysis were automatically generated using AI technology. For absolute accuracy, please refer to the [Original Paper Viewer] below or the Original ArXiv Source.

Over the past few decades, non-monotonic reasoning has developed into one of the most important topics in computational logic and artificial intelligence. Different ways to introduce non-monotonic aspects into classical logic have been considered, e.g., extension with default rules, extension with modal belief operators, or modification of the semantics. In this survey we consider one logical formalism for each of these possibilities, namely Reiter’s default logic, Moore’s autoepistemic logic and McCarthy’s circumscription. Additionally, we consider abduction, where one is not interested in inferences from a given knowledge base but in computing possible explanations for an observation with respect to a given knowledge base. Complexity results for different reasoning tasks for propositional variants of these logics were already studied in the nineties. In recent years, however, a renewed interest in complexity issues can be observed. One current approach is to consider parameterized problems and identify reasonable parameters that allow for FPT algorithms. In another approach, the emphasis lies on identifying fragments, i.e., restrictions of the logical language, that allow more efficient algorithms for the most important reasoning tasks. In this survey we focus on this second aspect. We describe complexity results for fragments of logical languages obtained either by restricting the allowed set of operators (e.g., by forbidding negations one obtains only monotone formulae) or by considering only formulae in conjunctive normal form but with generalized clause types. The algorithmic problems we consider are suitable variants of satisfiability and implication in each of the logics, but also counting problems, where one is not only interested in the existence of certain objects (e.g., models of a formula) but asks for their number.


💡 Research Summary

The surveyed paper provides a comprehensive overview of the computational complexity landscape for several prominent non‑monotonic logics, focusing on propositional fragments that admit more efficient reasoning algorithms. The authors examine three classic formalisms—Reiter’s default logic, Moore’s autoepistemic logic, and McCarthy’s circumscription—together with abductive reasoning, which is treated as a non‑monotonic inference problem where explanations for observations are sought rather than direct entailments.

In the introductory section the authors motivate the study by recalling that non‑monotonic reasoning has become central to artificial intelligence and computational logic, yet most early complexity results concentrated on the full propositional versions of these logics, yielding high levels of the polynomial hierarchy (typically Σ₂^P‑ or Π₂^P‑complete). Recent work has shifted toward two complementary strategies: parameterized complexity and the identification of syntactic fragments that lower the worst‑case complexity. This survey deliberately concentrates on the latter, systematically cataloguing results obtained by restricting either the set of logical operators (e.g., forbidding negation) or the shape of clauses (e.g., Horn‑type, monotone CNF).

The first major chapter deals with default logic. The authors recall that the basic decision problems—extension existence, credulous and skeptical entailment—are Σ₂^P‑complete in the unrestricted setting. They then discuss a hierarchy of fragments. When negation is disallowed, yielding purely monotone formulas, the extension existence problem drops to NP‑complete. If all defaults are Horn, the problem becomes solvable in polynomial time. The chapter also treats counting extensions, showing #P‑completeness in the general case but tractability for the Horn fragment.
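To make the Horn-fragment tractability concrete, the following is a minimal sketch of extension computation for a deliberately restricted case: a definite Horn knowledge base over atoms, with normal defaults given as (prerequisite atom, conclusion atom) pairs. Since no negation occurs, justifications can never be contradicted, so the unique extension is a simple fixed point reachable by forward chaining. The representation and function names here are illustrative choices, not the paper's.

```python
def horn_closure(facts, rules):
    """Forward chaining over definite Horn rules (body_set, head_atom)."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if head not in derived and set(body) <= derived:
                derived.add(head)
                changed = True
    return derived

def extension_atoms(facts, rules, defaults):
    """Unique extension for atomic normal defaults over a definite Horn
    theory: repeatedly apply defaults whose prerequisite is derivable.
    Without negation, consistency checks are vacuous, so this runs in
    polynomial time."""
    ext = horn_closure(facts, rules)
    changed = True
    while changed:
        changed = False
        for prereq, concl in defaults:
            if prereq in ext and concl not in ext:
                ext = horn_closure(ext | {concl}, rules)
                changed = True
    return ext
```

For instance, with the fact `penguin`, the rule "penguin implies bird", and the default "birds normally fly", the computed extension contains `penguin`, `bird`, and `flies`.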

The second chapter focuses on autoepistemic logic. The central reasoning tasks are the existence of a stable expansion and the inclusion relationship between expansions. In full propositional autoepistemic logic these tasks are Π₂^P‑complete. By limiting the language to positive literals only, or by requiring the underlying propositional matrix to be Horn‑CNF, the complexity collapses to P. The authors also examine counting stable expansions, noting that the general problem is #P‑complete, yet becomes tractable under the same syntactic restrictions.
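The guess-and-verify character of stable expansions can be illustrated with a brute-force sketch for a toy fragment in which the belief operator L is applied only to atoms and every formula is a rule with modal premises and an atomic conclusion. A candidate belief set B fixes the truth of all modal atoms; B induces a stable expansion exactly when the atoms derivable under that guess are precisely B. This encoding is an assumption made for illustration, not the paper's formalization.

```python
from itertools import chain, combinations

def expansions(atoms, rules):
    """Enumerate stable belief-set kernels by brute force.
    Each rule is (believed, not_believed, head), read as:
    'if L a holds for all a in believed, and L b fails for all b in
    not_believed, then head'. Exponential in |atoms|; for illustration."""
    found = []
    candidates = chain.from_iterable(
        combinations(atoms, r) for r in range(len(atoms) + 1))
    for cand in candidates:
        B = set(cand)
        derived = set()
        changed = True
        while changed:
            changed = False
            for believed, not_believed, head in rules:
                if (set(believed) <= B and not (set(not_believed) & B)
                        and head not in derived):
                    derived.add(head)
                    changed = True
        if derived == B:       # guess is self-supporting: stable
            found.append(B)
    return found
```

The classic example "if p is not believed, then q" yields the single stable belief set containing just `q`: believing nothing is unstable (q becomes derivable), and believing `p` is unsupported.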

The third chapter examines circumscription. The minimal‑model semantics leads to decision problems that sit at the Δ₂^P level when unrestricted. The authors show that if the circumscribed predicates are limited to a fixed set, or if the circumscription is applied only to Horn clauses, the decision problems become solvable in polynomial time. Moreover, the counting version (how many minimal models exist) is #P‑complete in general but polynomial‑time computable for Horn‑type circumscription.
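The minimal-model semantics underlying circumscription can be sketched directly: enumerate the models of a CNF formula and keep those with no strictly smaller model (circumscribing every atom). This brute-force version is exponential and purely illustrative; the clause encoding below is our own. For a definite Horn theory the unique minimal model is instead the forward-chaining closure, which explains the polynomial-time cases discussed above.

```python
from itertools import product

def satisfies(clauses, model):
    """A clause is (pos, neg): sets of atoms occurring positively and
    negatively. It holds iff some positive atom is true or some
    negative atom is false in the model (a set of true atoms)."""
    return all((pos & model) or (neg - model) for pos, neg in clauses)

def minimal_models(atoms, clauses):
    """All subset-minimal models, by exhaustive search over 2^|atoms|
    candidate assignments."""
    models = [m for m in
              (set(a for a, v in zip(atoms, bits) if v)
               for bits in product([0, 1], repeat=len(atoms)))
              if satisfies(clauses, m)]
    return [m for m in models if not any(m2 < m for m2 in models)]
```

For the single clause p ∨ q, the models are {p}, {q}, and {p, q}; circumscription discards {p, q}, leaving the two minimal models, so p ∨ q is a skeptical minimal-model consequence while p alone is not.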

The fourth chapter treats abductive reasoning. The authors formalize the abductive explanation problem as “given a propositional theory T and an observation O, does there exist a set of hypotheses H such that T ∪ H entails O and H is minimal?” In the unrestricted propositional case this problem is Σ₂^P‑complete. When hypotheses are restricted to atomic propositions or when the background theory is Horn‑CNF, the problem reduces to NP‑complete or even P, depending on the exact constraints. Counting minimal explanations follows the same pattern: #P‑complete in general, tractable for Horn fragments.
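For the Horn case, the tractability pattern can be sketched as follows: with a definite Horn background theory, atomic hypotheses, and an atomic observation, entailment is polynomial-time forward chaining, and a subset-minimal explanation is found by starting from all hypotheses and greedily discarding any that is not needed. The encoding and the greedy strategy are illustrative assumptions, not the paper's exact algorithm.

```python
def entails(facts, rules, goal):
    """Definite-Horn entailment via forward chaining.
    rules: iterable of (body_atoms, head_atom) pairs."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if head not in derived and set(body) <= derived:
                derived.add(head)
                changed = True
    return goal in derived

def minimal_explanation(rules, hypotheses, observation):
    """Greedy minimization: sound here because definite-Horn entailment
    is monotone, so dropping a hypothesis never adds consequences."""
    if not entails(hypotheses, rules, observation):
        return None                     # no explanation exists at all
    expl = set(hypotheses)
    for h in sorted(hypotheses):        # fixed order for determinism
        if entails(expl - {h}, rules, observation):
            expl.discard(h)
    return expl
```

With rules "rain implies wet" and "sprinkler implies wet" and the observation `wet`, the greedy pass keeps a single hypothesis, returning one subset-minimal explanation; which one depends only on the scan order.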

A comparative table summarises the complexity results across all four logics, highlighting two recurring patterns: (1) forbidding negation or limiting to monotone operators typically lowers the complexity by one level of the polynomial hierarchy, and (2) restricting clauses to Horn or other well‑studied forms often yields polynomial‑time algorithms for both decision and counting variants. The authors argue that these fragments are not merely theoretical curiosities; they correspond to natural subclasses of knowledge bases used in practice, such as rule‑based expert systems, configuration problems, and diagnostic reasoning.

In the concluding discussion the paper emphasizes that while parameterized complexity offers fine‑grained insights for specific structural parameters (treewidth, backdoor size, etc.), syntactic fragment analysis provides a complementary, often more accessible route to efficient algorithms. The authors suggest several avenues for future work: (i) discovering new fragments that balance expressive power with tractability, (ii) developing practical solvers that automatically detect and exploit fragment membership, and (iii) extending the fragment‑based approach to richer logics (e.g., description logics with non‑monotonic extensions) and to combined reasoning tasks that integrate multiple non‑monotonic formalisms. Overall, the survey maps a detailed terrain of where non‑monotonic reasoning becomes computationally feasible, guiding both theoreticians and system designers toward promising algorithmic strategies.

