📝 Original Info
- Title: On Decidable Growth-Rate Properties of Imperative Programs
- ArXiv ID: 1005.0518
- Date: 2010-05-20
- Authors: Amir M. Ben-Amram (per the paper's reference to "Jones, Kristiansen and the author [BJK08]")
📝 Abstract
In 2008, Ben-Amram, Jones and Kristiansen showed that for a simple "core" programming language - an imperative language with bounded loops, and arithmetics limited to addition and multiplication - it was possible to decide precisely whether a program had certain growth-rate properties, namely polynomial (or linear) bounds on computed values, or on the running time. This work emphasized the role of the core language in mitigating the notorious undecidability of program properties, so that one deals with decidable problems. A natural and intriguing problem was whether more elements can be added to the core language, improving its utility, while keeping the growth-rate properties decidable. In particular, the method presented could not handle a command that resets a variable to zero. This paper shows how to handle resets. The analysis is given in a logical style (proof rules), and its complexity is shown to be PSPACE-complete (in contrast, without resets, the problem was PTIME). The analysis algorithm evolved from the previous solution in an interesting way: focus was shifted from proving a bound to disproving it, and the algorithm works top-down rather than bottom-up.
📄 Full Content
Central to the field of Implicit Computational Complexity (ICC) is the following observation: it is possible to restrict a programming language syntactically so that the admitted programs will possess a certain complexity, say polynomial time, or polynomial output size. Since programmers want their programs to have well-behaved complexity, this appears at first to be a useful approach. However, languages designed to capture a complexity class tend to be too restrictive or cumbersome for practical programming. The programmer would prefer to program normally-which means that algorithms of undesirable complexity can be written-and would be happy to have them detected at compile time, just as type-related errors are. Thus, we move into the realm of static program analysis.
Automated complexity analysis (or “cost analysis”) as a kind of static analysis has a long history, with classic contributions including Wegbreit [Weg75], Rosendahl [Ros89] and Le Métayer [LM88]. Today it enjoys a flurry of research.
Static analysis targets program properties that are typically uncomputable in any Turing-complete language. Complexity properties, such as having a polynomial running time, are no different (in fact, they remain undecidable even when termination is guaranteed). The common approach in static analysis is to give up on a complete solution: usually, one takes a conservative approach, meaning that an algorithm meant to certify programs as "good" may only err by rejecting a good program. In other words, it may be sound but incomplete.
The downside of the conservative approach is that it gives up one of the hallmarks of algorithm theory: studying well-defined problems and developing algorithms that actually solve them. Besides losing the satisfaction of proving that a goal has been achieved, one loses the ability to precisely state what has been achieved by a new algorithm, other than by anecdotal evidence such as examples that it succeeds on.
The approach taken in this paper establishes a middle path. It consists of breaking the analysis of programs into two stages: the first is abstraction, in which the concrete program is replaced with an abstract one, a simplified model of the original; the second stage is analysis of the abstract program. Abstract programs are a weak model of computation where the properties of interest are, hopefully, decidable. Their relation to concrete programs may be specified precisely by first assigning an approximate semantics to the source program-more precisely, one which over-approximates the behaviour of the concrete program-then translating the program to a simplified "core" language, whose semantics is equal to the approximate semantics of the original. The over-approximation ensures that the conclusions drawn for the concrete programs stay in the conservative zone. The analysis of core-language programs becomes a new, well-defined problem that may well be solvable. Another benefit of the approach is that abstract (core) programs may be largely independent of the concrete programming language, making their analysis more widely applicable-one only needs "front ends" for the concrete languages of interest.
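As a hedged illustration of this two-stage scheme (the function names and toy programs below are invented for this sketch, not taken from the paper), a data-dependent branch in a concrete program can be over-approximated by an abstract program that replaces the branch condition with a nondeterministic choice, analyzed by taking the worst case over both branches:

```python
# Sketch of "abstract, then analyze" (invented illustration).
# The concrete branch condition is discarded; the abstract semantics
# over-approximates by taking the max over both branches, so every
# concrete run is bounded by the abstract one.

def concrete(x, y):
    """Concrete program: the branch taken depends on data."""
    for i in range(x):
        if i % 2 == 0:
            y = y + 1
        else:
            y = y + 2
    return y

def abstract_bound(x, y):
    """Abstract "core" program: loop x { choose y:=y+1 | y:=y+2 },
    evaluated for an upper bound by taking the max of the choices."""
    for _ in range(x):
        y = max(y + 1, y + 2)
    return y

# Soundness of the over-approximation on sample inputs:
print(all(concrete(x, 0) <= abstract_bound(x, 0) for x in range(50)))  # True
```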
In areas other than complexity analysis, this approach is well known. Probably the best-known example is the abstraction of programs into finite automata, central to software model checking. Closer to complexity analysis is termination analysis. The size-change abstraction reduces a program to a transition system specified by order-based constraints [LJBA01, BA09], whose termination is decidable.
The current paper is part of a plan to apply the "abstract and conquer" approach to complexity analysis, based on previous work by Jones, Kristiansen and the author [BJK08]. This work defined an imperative-style core language with restricted arithmetics and bounded loops. The loop bounds may be computed values (this is the main source of difficulty in analysis). It was shown that certain growth-rate questions are decidable for this language. Specifically, algorithms were given to decide whether the running time is linear, polynomial or otherwise (as a function of input values); and which computed values are polynomially (or linearly) bounded. Surprisingly, the analysis itself takes polynomial time.
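The core language itself is not reproduced in this excerpt, but a minimal sketch can convey its flavor: bounded loops whose bounds are computed values, and arithmetic limited to addition and multiplication. The encoding and interpreter below are illustrative assumptions, not the paper's formalism; the two sample programs show why the growth-rate question is non-trivial, since syntactically similar programs can be polynomially bounded or not.

```python
# Toy interpreter for a core-language fragment (invented encoding):
# "loop X body" repeats body env[X] times, with the bound read once
# at loop entry; assignments use only + and *.

def run(prog, env):
    """Interpret a core-language program over non-negative integers."""
    for stmt in prog:
        if stmt[0] == "asgn":                 # ("asgn", target, (a, op, b))
            _, tgt, (a, op, b) = stmt
            va = env[a] if isinstance(a, str) else a
            vb = env[b] if isinstance(b, str) else b
            env[tgt] = va + vb if op == "+" else va * vb
        elif stmt[0] == "loop":               # ("loop", bound_var, body)
            _, bvar, body = stmt
            for _ in range(env[bvar]):        # bound fixed at loop entry
                run(body, env)
    return env

# Polynomially bounded: Y grows to Y0 + X*Z, linear in the inputs.
poly = [("loop", "X", [("asgn", "Y", ("Y", "+", "Z"))])]

# Not polynomially bounded: repeated squaring gives Y0**(2**X).
expo = [("loop", "X", [("asgn", "Y", ("Y", "*", "Y"))])]

print(run(poly, {"X": 5, "Y": 1, "Z": 3})["Y"])   # 1 + 5*3 = 16
print(run(expo, {"X": 5, "Y": 2, "Z": 0})["Y"])   # 2**(2**5) = 2**32
```

The decision procedures of [BJK08] classify such programs without running them, which is what makes the polynomial-time analysis result notable.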
Once a result of this kind is established, a program of further research arises automatically: investigate the tradeoff between the strength of the core language (the abstraction) and the decidability of the properties of interest. A stronger core language can model more closely the concrete semantics of real programs; in other words, it constitutes a finer abstraction. Make it too fine, and decidability will be lost. It is interesting to find out how far we can venture while maintaining decidability: what language features are the real impediments? What extensions will increase the difficulty of the analysis? Note that since we are dealing with decidable problems, we can classify their complexity
…(Full text truncated)…
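Although the excerpt is cut off here, the abstract explains that the extension studied is a reset command, and that deciding polynomial bounds then becomes PSPACE-complete (versus PTIME without resets). A hedged sketch of why resets matter for bound analysis: a reset cuts accumulation short, so an analysis that ignores it derives a bound that is loose by a factor of the loop bound. The Python below is an invented illustration, not the paper's algorithm.

```python
# Invented illustration of the effect of a reset on value growth.

def with_reset(x, y, z):
    # Core-language shape: loop X { Y := Y + Z; Z := 0 }
    for _ in range(x):
        y = y + z
        z = 0                 # reset: Z contributes at most once
    return y

def reset_blind_bound(x, y, z):
    # Ignoring the reset over-approximates: Y <= Y0 + X*Z0.
    return y + x * z

# The true bound is Y0 + Z0, independent of X; the reset-blind bound
# grows linearly with X. A precise analysis must track resets.
print(with_reset(10, 1, 3))         # 4  (= 1 + 3)
print(reset_blind_bound(10, 1, 3))  # 31 (= 1 + 10*3)
```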
This content is AI-processed based on ArXiv data.