Nash Codes for Noisy Channels
This paper studies the stability of communication protocols that deal with transmission errors. We consider a coordination game between an informed sender and an uninformed decision maker, the receiver, who communicate over a noisy channel. The sender’s strategy, called a code, maps states of nature to signals. The receiver’s best response is to decode the received channel output as the state with highest expected receiver payoff. Given this decoding, an equilibrium or “Nash code” results if the sender encodes every state as prescribed. We show two theorems that give sufficient conditions for Nash codes. First, a receiver-optimal code defines a Nash code. A second, more surprising observation holds for communication over a binary channel which is used independently a number of times, a basic model of information transmission: Under a minimal “monotonicity” requirement for breaking ties when decoding, which holds generically, EVERY code is a Nash code.
💡 Research Summary
The paper investigates the strategic stability of communication protocols that operate over noisy channels by modeling the interaction between an informed sender and an uninformed receiver as a coordination game. The sender observes the true state of nature and maps it to a signal (a codeword) according to a coding strategy, while the receiver observes the noisy output of the channel and must decode it into a guessed state. The receiver’s best‑response rule is defined as the Bayesian decision that maximizes his expected payoff: for each possible channel output, he selects the state that yields the highest expected receiver utility given the posterior distribution over states.
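To make the decoding rule concrete, here is a minimal sketch in Python of such a Bayesian best response, written with hypothetical names and a single channel use per state; it illustrates the rule described above and is not code from the paper.

```python
def best_response_decode(y, priors, codewords, channel, payoff_receiver):
    """Bayesian best response (hypothetical helper, not taken from the paper):
    decode channel output y as the state with highest expected receiver payoff.

    priors[i]              -- prior probability of state i
    codewords[i]           -- channel input (index) that the code assigns to state i
    channel[x][y]          -- probability that input x produces output y
    payoff_receiver[i][j]  -- receiver payoff when the true state is i and he decodes j
    """
    n_states = len(priors)
    # Unnormalized posterior weight of each state given the observed output y;
    # dividing by Pr(y) would not change the argmax, so it is skipped.
    weight = [priors[i] * channel[codewords[i]][y] for i in range(n_states)]
    # Expected receiver payoff of announcing state j, for each candidate j.
    expected = [sum(weight[i] * payoff_receiver[i][j] for i in range(n_states))
                for j in range(n_states)]
    return max(range(n_states), key=lambda j: expected[j])  # ties -> lowest index
```

For codes that map states to words of several channel symbols, the same rule applies with the likelihood of the whole received word in place of `channel[x][y]`.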
A pair consisting of a coding strategy (the sender’s “code”) and a decoding rule (the receiver’s “best response”) constitutes a Nash equilibrium—called a “Nash code”—if neither player can improve his expected payoff by unilaterally deviating. In this setting the sender’s incentive to follow the prescribed code is tied to the receiver’s decoding rule, and the receiver’s incentive to decode according to the Bayesian rule is tied to the sender’s coding.
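The equilibrium condition itself can then be checked state by state. The sketch below (again with hypothetical names, reusing the conventions of the previous snippet) tests the sender's side: no single codeword may be replaced by a more profitable one against the fixed decoder; the receiver's side holds by construction whenever `decode` is a Bayesian best response to the code.

```python
def is_nash_code(priors, codewords, channel, payoff_sender, decode):
    """Sender's side of the Nash condition (hypothetical helper): for every
    state i, the prescribed codeword must maximize the sender's expected
    payoff against the fixed decoding rule decode(y) -> state."""
    n_states = len(priors)
    n_inputs = len(channel)          # channel[x][y] = Pr(output y | input x)
    n_outputs = len(channel[0])

    def sender_value(i, x):
        # Expected sender payoff in state i when the sender transmits input x.
        return sum(channel[x][y] * payoff_sender[i][decode(y)] for y in range(n_outputs))

    for i in range(n_states):
        prescribed = sender_value(i, codewords[i])
        if any(sender_value(i, x) > prescribed + 1e-12 for x in range(n_inputs)):
            return False             # a profitable unilateral deviation exists
    return True
```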
The first main result establishes a sufficient condition that is both natural and powerful: any code that is optimal for the receiver (i.e., maximizes the receiver's expected payoff over all possible codes, given best-response decoding) automatically forms a Nash code. The proof exploits the fact that a receiver-optimal code already pairs the sender's encoding with the receiver's own best response; consequently, any unilateral change of a codeword can only leave the receiver's expected payoff unchanged or reduce it. Because the deviation affects only the deviating state, and in the coordination game both players are rewarded exactly when that state is decoded correctly, the sender's payoff can only move in the same direction, so the sender cannot gain by deviating. Thus, designing a code that is best from the receiver's perspective guarantees strategic stability for both parties.
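This chain of comparisons can be written out with a little notation, assumed here purely for illustration: let $V_R(c, d)$ denote the receiver's expected payoff under code $c$ and decoding rule $d$, and let $d^*(c)$ be a best response to $c$.

```latex
% Receiver-optimality of the pair (c, d) with d = d^*(c) bounds the receiver's
% payoff after any unilateral sender deviation to a code c' (assumed notation):
\[
  V_R(c', d) \;\le\; V_R\bigl(c', d^*(c')\bigr)
             \;\le\; V_R\bigl(c, d^*(c)\bigr)
             \;=\;   V_R(c, d).
\]
% Only the deviating state's term of V_R(., d) differs between c and c', so that
% state's probability of correct decoding cannot have increased; under the
% coordination payoffs the sender's payoff therefore cannot increase either.
```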
The second, more surprising theorem concerns a binary channel used independently a fixed number of times, a canonical model of digital communication. The authors introduce a minimal “monotonicity” requirement for tie‑breaking in the receiver's decoding rule: when several states yield the same expected utility for a given channel output, ties must be broken so that a state's decoding probability does not decrease when its codeword becomes more likely to have produced that output. This condition holds generically, since for generic payoffs and channel parameters no ties arise at all, and it is satisfied by natural deterministic rules such as breaking ties according to a fixed priority ordering over the states.
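One way to formalize such a requirement, with notation assumed here for illustration rather than taken from the paper: write $d_y(i)$ for the probability that output $y$ is decoded as state $i$, and $q_i(y) = p_i \Pr(y \mid c_i)$ for the unnormalized posterior weight of state $i$ at output $y$.

```latex
% A decoding rule is monotone (in this assumed formalization) if increasing one
% state's weight at an output, holding the other weights fixed, never lowers the
% probability of decoding that state:
\[
  q_i(y) \le \tilde q_i(y)
  \quad\text{and}\quad
  q_j(y) = \tilde q_j(y) \ \ \forall j \ne i
  \qquad\Longrightarrow\qquad
  d_y(i) \le \tilde d_y(i),
\]
% where the tilde marks the quantities after the sender changes the codeword
% for state i.
```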
Under this monotonicity assumption, the authors prove that every possible coding strategy is a Nash code. The key observation is that, because the channel's errors are independent across its uses, the likelihood of each codeword producing a received word factorizes into a product of per‑bit transition probabilities, and the posterior weight of each state is simply its prior times this product. With a monotone decoding rule, changing the codeword for a single state can never raise the probability that the receiver decodes that very state, so no unilateral deviation is profitable for the sender, and the receiver's decoding remains a best response. Strategic stability is thus essentially automatic in this canonical setting.
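The claim can be probed numerically. The following self-contained sketch, with illustrative parameters chosen here rather than taken from the paper, builds a binary channel used three times independently, picks an arbitrary code, decodes with a fixed-priority (hence monotone) Bayesian rule, and confirms that no unilateral deviation helps the sender under coordination payoffs.

```python
import itertools

# Per-bit transition probabilities of a (possibly asymmetric) binary channel:
# eps0 = Pr(receive 1 | send 0), eps1 = Pr(receive 0 | send 1).  Illustrative values.
eps0, eps1 = 0.1, 0.2
n = 3                                            # independent channel uses per codeword
words = list(itertools.product([0, 1], repeat=n))

def likelihood(x, y):
    """Pr(output word y | input word x): a product over the n uses,
    because the channel is memoryless and used independently."""
    p = 1.0
    for xb, yb in zip(x, y):
        if xb == 0:
            p *= (1 - eps0) if yb == 0 else eps0
        else:
            p *= eps1 if yb == 0 else (1 - eps1)
    return p

# An arbitrary (deliberately non-optimized) code for 3 states, uniform prior,
# and coordination payoffs: both players get 1 on correct decoding, 0 otherwise.
codewords = [(0, 0, 0), (0, 0, 1), (1, 1, 0)]
priors = [1 / 3] * 3

def decode(y):
    """Bayesian best response with a fixed-priority (lowest-index) tie-break,
    a monotone rule in the sense sketched above."""
    scores = [priors[i] * likelihood(codewords[i], y) for i in range(3)]
    return max(range(3), key=lambda i: scores[i])

def prob_decoded_as(i, x):
    """Probability that the receiver announces state i when word x is sent
    (equals the sender's expected payoff in state i under coordination payoffs)."""
    return sum(likelihood(x, y) for y in words if decode(y) == i)

# Nash check: in no state does some other word beat the prescribed codeword.
is_nash = all(prob_decoded_as(i, codewords[i]) >= prob_decoded_as(i, x) - 1e-12
              for i in range(3) for x in words)
print(is_nash)   # expected: True, in line with the second theorem
```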
The paper’s contributions have several important implications:
- Design Freedom – In binary channels used independently, engineers can focus on traditional coding criteria such as error‑correction capability, rate, or complexity, without worrying about strategic incentives. The Nash‑code property is guaranteed as long as the receiver's decoder respects a simple monotone tie‑breaking rule.
- Unified View of Efficiency and Stability – The first theorem links the classic notion of “optimal code” (maximizing a performance metric) with game‑theoretic equilibrium, suggesting that efficiency and stability need not be at odds. A code that is optimal for the receiver automatically stabilizes the interaction.
- Robustness to Information Asymmetry – The model captures the realistic situation where the sender knows the state but the receiver does not. The results show that even with this asymmetry, equilibrium can be achieved without costly signaling or contract mechanisms.
- Potential Extensions – The framework can be adapted to multi‑sender or multi‑receiver settings, to channels with memory, or to scenarios where sender and receiver utilities diverge more sharply. The monotonicity condition may inspire new decoding algorithms that are both performance‑optimal and strategically robust.
- Practical Applications – In low‑power wireless sensor networks, Internet‑of‑Things devices, or any system where communication is noisy and devices have limited computational resources, the findings justify the use of simple, well‑understood coding schemes without additional incentive‑compatible layers.
In summary, the paper demonstrates that (i) a receiver‑optimal code always yields a Nash equilibrium, and (ii) for a binary channel used independently a number of times, the Nash‑code property is universal under a mild monotonicity requirement. These results bridge information theory and game theory, providing a clear pathway for designing communication protocols that are both technically efficient and strategically stable.