Data Sharing with Endogenous Choices over Differential Privacy Levels
We study coalition formation for data sharing under differential privacy when agents have heterogeneous privacy costs. Each agent holds a sensitive data point and decides whether to participate in a data-sharing coalition and how much noise to add to their data. Privacy choices induce a fundamental trade-off: higher privacy reduces individual data-sharing costs but degrades data utility and statistical accuracy for the coalition. These choices generate externalities across agents, making both participation and privacy levels strategic. Our goal is to understand which coalitions are stable, how privacy choices shape equilibrium outcomes, and how decentralized data sharing compares to a centralized, socially optimal benchmark. We provide a comprehensive equilibrium analysis across a broad range of privacy-cost regimes, from decreasing costs (e.g., privacy amplification from pooling data) to increasing costs (e.g., greater exposure to privacy attacks in larger coalitions). We first characterize Nash equilibrium coalitions with endogenous privacy levels and show that equilibria may fail to exist and can be non-monotonic in problem parameters. We then introduce a weaker notion, the robust equilibrium, which admits more widespread existence by equipping existing coalition members with the power to veto external players from joining, and we fully characterize such equilibria. Finally, we analyze, for both Nash and robust equilibria, the efficiency relative to the social optimum in terms of social welfare and estimator accuracy, deriving bounds that depend sharply on the number of players, the properties of the cost profile, and how privacy costs scale with coalition size.
💡 Research Summary
The paper investigates how autonomous agents form data‑sharing coalitions when each agent can choose both whether to join and how much differential‑privacy (DP) noise to add to their data. Each agent i holds a private datum x_i and incurs a privacy‑cost function c_i(ε_i, |S|) that depends on the chosen privacy parameter ε_i (which is inversely related to the noise level) and on the size of the coalition S. Participation yields a benefit that grows with the statistical accuracy of the pooled estimate, while a smaller ε_i (stronger privacy, hence more noise) reduces that benefit by increasing the variance of the final estimator.
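As a concrete illustration of this trade-off, here is a minimal sketch using the standard Laplace mechanism with unit sensitivity; the mechanism choice, the sensitivity, and the variance bookkeeping are assumptions made for illustration, not details taken from the paper.

```python
import numpy as np

def dp_report(x_i, eps_i, sensitivity=1.0, rng=None):
    """Laplace mechanism: release x_i + Lap(sensitivity / eps_i).

    Smaller eps_i => stronger privacy => larger noise scale.
    """
    rng = rng or np.random.default_rng()
    return x_i + rng.laplace(scale=sensitivity / eps_i)

def pooled_noise_variance(eps, sensitivity=1.0):
    """Noise variance of the coalition's mean estimator.

    A Lap(b) draw has variance 2*b**2, so averaging |S| reports adds
    noise variance (1/|S|**2) * sum_i 2*(sensitivity/eps_i)**2.
    """
    eps = np.asarray(eps, dtype=float)
    return (2.0 * (sensitivity / eps) ** 2).sum() / len(eps) ** 2

# Three agents with heterogeneous privacy choices: the most private
# agent (eps = 0.5) contributes the bulk of the estimator's noise.
print(pooled_noise_variance([0.5, 1.0, 2.0]))
```

This makes the externality explicit: a single low-ε member inflates the variance of the pooled estimate that every member receives.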
The authors model this situation as a non‑cooperative game with two strategic dimensions: (i) binary participation decisions and (ii) continuous privacy‑level choices. They introduce two equilibrium concepts. The first is a standard Nash equilibrium, where no single agent can improve its utility by unilaterally changing its participation status or ε_i. The second, called a robust equilibrium, equips the existing members of a coalition with veto power over any external entrant, thereby capturing realistic "entry‑blocking" mechanisms that appear in many consortium settings.
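To make the Nash condition operational, the sketch below checks every kind of unilateral deviation — leaving, adjusting ε_i, or joining at some privacy level — for a stylized utility. The benefit and cost forms (including the fixed access benefit) are hypothetical, not the paper's:

```python
def utility(i, members, eps, alpha, benefit=1.0, gamma=1.0):
    """Illustrative utility (hypothetical forms): members get a fixed
    access benefit minus the noise variance of the pooled mean, minus a
    privacy cost alpha_i * eps_i * |S|**gamma; non-members get zero.
    """
    if i not in members:
        return 0.0
    k = len(members)
    noise_var = sum(2.0 / eps[j] ** 2 for j in members) / k ** 2
    return benefit - noise_var - alpha[i] * eps[i] * k ** gamma

def is_nash(members, eps, alpha, eps_grid, tol=1e-12, **kw):
    """No agent gains from any unilateral deviation: leaving, staying
    with a different eps_i, or (if outside) joining with any eps_i."""
    for i in range(len(eps)):
        u_i = utility(i, members, eps, alpha, **kw)
        if i in members:
            if utility(i, members - {i}, eps, alpha, **kw) > u_i + tol:
                return False  # better off leaving
            for e in eps_grid:  # grid stands in for the continuous choice
                dev = list(eps); dev[i] = e
                if utility(i, members, dev, alpha, **kw) > u_i + tol:
                    return False  # better off at another privacy level
        else:
            for e in eps_grid:
                dev = list(eps); dev[i] = e
                if utility(i, members | {i}, dev, alpha, **kw) > u_i + tol:
                    return False  # better off joining at some privacy level
    return True

print(is_nash({0, 1, 2}, eps=[1.5, 1.5, 1.5, 1.5],
              alpha=[0.1, 0.2, 0.4, 0.8],
              eps_grid=[0.25 * k for k in range(1, 21)]))
```

A robust-equilibrium check would weaken only the last case: a profitable outside entry no longer breaks stability if some incumbent is made worse off by it and (presumably) vetoes the entry.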
A central contribution is the systematic classification of privacy‑cost regimes. In the decreasing‑cost regime, the marginal privacy cost falls as the coalition grows (capturing privacy amplification from pooling). In the increasing‑cost regime, costs rise with coalition size (reflecting larger attack surfaces). For decreasing‑cost functions the authors prove that at least one Nash equilibrium always exists, and coalition size grows monotonically with the number of agents. For increasing‑cost functions, Nash equilibria may fail to exist, and when they do exist, coalition size can be non‑monotonic in model parameters. By contrast, robust equilibria are shown to exist for all cost profiles; the maximal stable coalition size is determined by a critical point of the cost function where the marginal benefit of adding another member is offset by the marginal privacy loss.
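For intuition, the two regimes can be instantiated with simple parametric families; these forms are illustrative choices, not taken from the paper:

```latex
% Illustrative cost families (hypothetical):
\underbrace{c_i(\varepsilon_i,|S|)=\frac{\alpha_i\,\varepsilon_i}{|S|}}_{\text{decreasing regime}}
\qquad
\underbrace{c_i(\varepsilon_i,|S|)=\alpha_i\,\varepsilon_i\,|S|^{\gamma},\;\gamma>0}_{\text{increasing regime}}
```

In the first family, any fixed privacy level gets cheaper as the coalition grows; in the second it gets more expensive, which is the force behind the non-existence and non-monotonicity results.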
The paper also analyzes a centralized benchmark in which a social planner simultaneously chooses the participating set and a uniform privacy level ε to maximize total welfare (benefit minus privacy costs) and to minimize estimator variance. In the decreasing‑cost regime the planner’s optimal solution is full participation, whereas in the increasing‑cost regime the optimal solution may involve partial participation or even no sharing, depending on how steeply costs rise.
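A brute-force sketch of this benchmark, reusing the same illustrative welfare forms as above (again an assumption, not the paper's specification):

```python
import itertools
import numpy as np

def planner_optimum(alpha, eps_grid, benefit=1.0, gamma=1.0):
    """Centralized benchmark: choose a coalition S and one uniform eps
    maximizing total welfare (illustrative forms, as in the Nash sketch):
        W(S, eps) = |S|*benefit - 2/eps**2
                    - (sum_{i in S} alpha_i) * eps * |S|**gamma
    Exhaustive over subsets, so only suitable for small n.
    """
    n = len(alpha)
    best = (0.0, frozenset(), None)  # baseline: empty coalition, no sharing
    for k in range(1, n + 1):
        for S in itertools.combinations(range(n), k):
            a = sum(alpha[i] for i in S)
            for eps in eps_grid:
                w = k * benefit - 2.0 / eps ** 2 - a * eps * k ** gamma
                if w > best[0]:
                    best = (w, frozenset(S), eps)
    return best

# Example: steep (gamma = 1) increasing costs can make partial
# participation optimal, excluding the highest-cost agents.
welfare, S, eps = planner_optimum(alpha=[0.1, 0.2, 0.4, 0.8],
                                  eps_grid=np.linspace(0.1, 5.0, 50))
print(welfare, sorted(S), eps)
```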
To assess the efficiency loss due to decentralization, the authors define a Price of Stability (PoS) for both social welfare and estimator variance. They derive bounds that depend sharply on the number of agents n and on the curvature of the cost functions. When costs are modestly decreasing or increasing, PoS remains bounded by a constant factor. However, under steeply increasing costs, PoS can grow linearly with n, indicating that decentralized coalitions can be dramatically less efficient than the planner’s optimum.
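Concretely, writing SW for social welfare, Var for estimator variance, and EQ for the set of equilibria under consideration, one standard convention for the two ratios is the following (the paper's exact normalization may differ):

```latex
\mathrm{PoS}_{\mathrm{SW}}
   \;=\; \frac{\mathrm{SW}(\text{social optimum})}{\displaystyle\max_{\sigma\in\mathrm{EQ}}\mathrm{SW}(\sigma)}
   \;\ge\; 1,
\qquad
\mathrm{PoS}_{\mathrm{Var}}
   \;=\; \frac{\displaystyle\min_{\sigma\in\mathrm{EQ}}\mathrm{Var}(\sigma)}{\mathrm{Var}(\text{social optimum})}
   \;\ge\; 1 .
```

Both ratios equal 1 when the best equilibrium matches the planner's outcome; linear-in-n growth under steeply increasing costs means that even the best equilibrium falls far short of the centralized optimum.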
Key technical insights include: (1) a novel game‑theoretic model that integrates DP‑induced externalities; (2) rigorous existence and non‑existence results for Nash equilibria across cost regimes; (3) the introduction of robust equilibrium as a realistic stability notion that guarantees existence; (4) explicit characterizations of the socially optimal coalition and privacy level; and (5) quantitative bounds on the welfare and accuracy gaps between decentralized and centralized outcomes.
The findings have practical implications for the design of data‑sharing consortia in health, finance, education, and other domains where privacy concerns are heterogeneous. Policymakers can influence the shape of privacy‑cost functions (e.g., through subsidies for privacy‑preserving technologies or regulations that limit attack surfaces) to ensure that decentralized coalitions remain efficient. Moreover, incorporating entry‑blocking mechanisms or a modest central coordinator can dramatically improve stability and reduce the efficiency loss identified by the PoS analysis. Overall, the paper provides a comprehensive theoretical foundation for understanding and guiding decentralized data sharing under differential privacy.
Comments & Academic Discussion
Loading comments...
Leave a Comment