Computational Complexity of Interactive Behaviors
The theory of computational complexity focuses on functions and, hence, studies programs whose interactive behavior is reduced to a simple question/answer pattern. We propose a broader theory whose ultimate goal is expressing and analyzing the intrinsic difficulty of fully general interactive behaviors. To this end, we use standard tools from concurrency theory, including labelled transition systems (formalizing behaviors) and their asynchronous extension (providing causality information). Behaviors are implemented by means of a multiprocessor machine executing CCS-like processes. The resulting theory is shown to be consistent with the classical definitions: when we restrict to functional behaviors (i.e., question/answer patterns), we recover several standard computational complexity classes.
💡 Research Summary
The paper begins by critiquing the traditional focus of computational complexity theory on functions—programs that map a single input to a single output. This functional view ignores the rich, ongoing interactions that characterize modern software such as network protocols, real‑time controllers, and human‑computer interfaces. To capture these phenomena, the authors adopt two well‑established tools from concurrency theory. First, labelled transition systems (LTS) provide a state‑based representation of observable actions. Second, they extend LTS to asynchronous transition systems (ATS), which encode causal relationships and allow for non‑instantaneous message delivery, thereby modelling true asynchrony and race conditions.
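The two ingredients can be sketched concretely. Below is a minimal, illustrative encoding of an LTS as a successor map, plus the extra independence relation that the asynchronous extension adds to expose causality; the state names, labels, and the `independent` relation are our own toy examples, not the paper's formal definitions.

```python
# Minimal sketch of a labelled transition system (LTS): each state maps
# to the set of (label, successor-state) pairs it can fire.
lts = {
    "idle": {("question?", "busy")},
    "busy": {("compute", "busy"), ("answer!", "idle")},
}

def successors(state):
    """Observable actions enabled in `state`."""
    return lts.get(state, set())

# The asynchronous extension (ATS) additionally records which labels are
# independent: two independent actions are causally unrelated and may be
# reordered or overlap in time, which is how races become visible.
independent = {frozenset({"compute", "question?"})}

def commute(a, b):
    """True if the two actions are causally independent."""
    return frozenset({a, b}) in independent
```

The independence relation is the key addition: two traces that differ only by swapping independent actions describe the same asynchronous behavior.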
Implementation is realized on a multiprocessor machine (MPM) model. The MPM consists of a finite collection of processors and shared memory cells. Each processor runs a process described in a CCS‑like syntax, with primitive operations for labelled transitions, synchronisation, and message send/receive. Communication is mediated by labels; the ATS layer records the possible delays between send and receive events, making the model faithful to real multi‑core architectures while remaining mathematically tractable.
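As a rough illustration of the process layer, here is a toy interpreter for a prefix/parallel fragment of a CCS-like calculus, where complementary labels synchronise into an internal `tau` step; the constructors and the `"a"`/`"a'"` complementation convention are ours, chosen only to make the sketch runnable.

```python
# Toy CCS-like fragment: nil, action prefix, and parallel composition.
NIL = ("nil",)

def prefix(label, cont):          # label.P
    return ("prefix", label, cont)

def par(p, q):                    # P | Q
    return ("par", p, q)

def co(label):
    """Complementary label: "a" pairs with "a'" for synchronisation."""
    return label[:-1] if label.endswith("'") else label + "'"

def steps(p):
    """All (action, residual-process) pairs; 'tau' marks a synchronisation."""
    if p[0] == "prefix":
        return [(p[1], p[2])]
    if p[0] == "par":
        moves = [(a, par(r, p[2])) for a, r in steps(p[1])]
        moves += [(a, par(p[1], r)) for a, r in steps(p[2])]
        # two complementary actions on parallel components synchronise
        moves += [("tau", par(r1, r2))
                  for a1, r1 in steps(p[1])
                  for a2, r2 in steps(p[2]) if a1 == co(a2)]
        return moves
    return []
```

In the machine model, each parallel component would run on its own processor, and the ATS layer would record which of these moves are causally independent.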
Complexity measures are defined in two dimensions. Time complexity counts the total number of labelled transitions performed during an execution, analogous to step counts on a Turing machine but allowing for concurrent steps. Space complexity aggregates the number of processors and memory cells that are ever allocated. Both measures admit natural upper and lower bounds, which the authors map to complexity classes.
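The two cost dimensions can be phrased over a recorded execution trace. The sketch below assumes a trace is a list of event records with a processor id and the memory cells touched; the field names and the event encoding are hypothetical, chosen only to make the counting concrete.

```python
# Hypothetical execution trace: one record per labelled transition.
trace = [
    {"action": "question?", "proc": 0, "cells": {"x"}},
    {"action": "compute",   "proc": 1, "cells": {"x", "y"}},
    {"action": "compute",   "proc": 2, "cells": {"y"}},
    {"action": "answer!",   "proc": 0, "cells": set()},
]

def time_cost(trace):
    # Total number of labelled transitions, analogous to Turing-machine
    # step counts, except transitions on distinct processors may overlap.
    return len(trace)

def space_cost(trace):
    # Processors plus memory cells that are ever allocated during the run.
    procs = {e["proc"] for e in trace}
    cells = set().union(*(e["cells"] for e in trace)) if trace else set()
    return len(procs) + len(cells)
```

On this example the time cost is 4 transitions and the space cost is 3 processors plus 2 cells.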
The central theoretical contributions are threefold. (1) When the behaviour is restricted to functional patterns—single question, single answer—the new definitions coincide exactly with classical classes such as TIME, SPACE, and NEXP. This shows consistency with the established theory. (2) For genuinely interactive behaviours, a new hierarchy emerges. For instance, an infinite stream processor cannot be classified by traditional time bounds, since its execution never terminates; instead the authors introduce a “stream complexity” metric based on the rate of labelled transitions per unit of input. (3) By explicitly modelling causality via ATS, the framework distinguishes behaviours that have the same raw transition count but differ in their concurrency structure, thereby quantifying the impact of scheduling on complexity.
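A rate-style measure of the kind point (2) describes can be illustrated as follows; the event encoding and the name `stream_rate` are ours, not the paper's notation.

```python
# Sketch of a stream-complexity measure: internal transitions performed
# per unit of consumed input, for a non-terminating stream processor.

def stream_rate(events):
    """events: sequence of ('in', item) or ('step', None) records."""
    inputs = sum(1 for kind, _ in events if kind == "in")
    steps = sum(1 for kind, _ in events if kind == "step")
    return steps / inputs if inputs else float("inf")

# A hypothetical filter that performs two internal steps per consumed item:
run = [("in", 1), ("step", None), ("step", None),
       ("in", 2), ("step", None), ("step", None)]
```

Unlike a total step count, this ratio stays meaningful over an infinite execution: it is a bound on a (finite) prefix-by-prefix measure rather than on the whole run.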
The paper illustrates the framework with three case studies. The first examines a client‑server negotiation protocol that requires multiple rounds of message exchange, each round creating intricate causal dependencies. The second analyses a real‑time control loop that continuously reads sensor data, processes it, and immediately issues actuator commands; the authors derive bounds on both transition rate and memory usage. The third case studies an infinite data‑stream filtering pipeline, highlighting trade‑offs between stream‑throughput complexity and space consumption. In each scenario, the authors compute upper and lower bounds, compare them with traditional algorithmic analyses, and demonstrate the added expressive power of the interactive complexity model.
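To make the first case study concrete, here is a toy rendering of a round-based negotiation in which each round contributes one request/reply pair to the trace; the message names, the offer sequence, and the acceptance predicate are invented for illustration and do not come from the paper.

```python
# Toy client-server negotiation: the client sends offers until one is
# accepted; each round appends two causally ordered labelled transitions.

def negotiate(offers, accept):
    """Return the trace of labelled transitions until agreement."""
    trace = []
    for offer in offers:
        trace += [("req!", offer), ("rep?", accept(offer))]
        if accept(offer):
            break
    return trace

trace = negotiate([10, 20, 30], lambda offer: offer >= 30)
# Every reply causally depends on its request, and every new request on
# the previous reply, so the causal chain grows linearly with the rounds:
# this chain length is what lower-bounds any schedule, sequential or not.
```

The point of the case study is exactly this distinction: two protocols with the same transition count can have very different causal-chain lengths, and only the latter constrains how much concurrency can help.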
Finally, the authors outline future research directions. They propose extending the model to probabilistic interactive behaviours, establishing completeness results for the newly defined complexity hierarchies, and developing automated analysis tools that can extract ATS representations from real code bases and compute the associated complexity measures. In sum, the work provides a rigorous formal foundation for studying the intrinsic difficulty of fully general interactive behaviours, bridging the gap between classical complexity theory and concurrency theory, and opening a pathway for systematic analysis of modern interactive systems.