Effects of Diversity and Procrastination in Priority Queuing Theory: the Different Power Law Regimes
Empirical analyses show that, after the update of a browser, the publication of the vulnerability of a software, or the discovery of a cyber worm, the fraction of computers still using the older version, not yet patched, or exhibiting worm activity decays as a power law $\sim 1/t^{\alpha}$ with $0 < \alpha \leq 1$ over time scales of years. We present a simple model of this persistence phenomenon, framed within standard priority queuing theory, of a target task which has the lowest priority compared with all other tasks that flow on the computer of an individual. We identify a “time deficit” control parameter $\beta$ and a bifurcation to a regime where there is a non-zero probability for the target task to never be completed. The distribution of waiting time ${\cal T}$ till the completion of the target task has the power law tail $\sim 1/t^{1/2}$, resulting from a first-passage solution of an equivalent Wiener process. Taking into account a diversity of time deficit parameters in a population of individuals, the power law tail is changed into $1/t^\alpha$ with $\alpha\in(0.5,\infty)$, including the well-known case $1/t$. We also study the effect of “procrastination”, defined as the situation in which the target task may be postponed or delayed even after the individual has solved all other pending tasks. This new regime provides an explanation for even slower apparent decay and longer persistence.
💡 Research Summary
The paper addresses a striking empirical regularity observed in cybersecurity: after a browser update, a software vulnerability disclosure, or a worm outbreak, the fraction of computers still running the old version, remaining unpatched, or exhibiting worm activity decays over years as a power law ∼ 1/t^α with 0 < α ≤ 1. To explain this long‑term persistence, the authors embed the phenomenon in the classic framework of priority queuing theory.
In the model each computer receives tasks according to a Poisson process with mean arrival rate λ. All tasks except the “target” task (the update, patch, or worm‑removal) are assigned higher priority and are processed at a mean service rate μ. The target task therefore sits at the bottom of the queue and can be executed only after every higher‑priority task has been completed. The key control parameter is the “time deficit” β = λ − μ. When β > 0 the inflow of higher‑priority tasks exceeds the processing capacity, creating a backlog that may never be cleared; consequently there is a non‑zero probability that the target task is never completed.
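As a rough illustration of this setup (a stylized workload sketch, not the authors' exact formulation), the fate of the lowest‑priority task can be simulated as a single‑server backlog: higher‑priority jobs arrive at Poisson rate λ, each carrying an Exp(1/μ)‑distributed amount of work, the backlog drains at unit rate, and the target task runs at the first instant the backlog empties. All parameter values below are illustrative.

```python
import numpy as np

def target_wait(lam, mu, w0, t_max, rng):
    """First time the higher-priority backlog empties, starting from work w0.

    The backlog drains at unit rate; jobs arrive at Poisson rate `lam`,
    each adding Exp(1/mu) work.  Returns None if the backlog has not
    emptied by t_max (target task still pending at the horizon).
    """
    t, work = 0.0, w0
    while t < t_max:
        gap = rng.exponential(1.0 / lam)   # time until the next arrival
        if work <= gap:                    # backlog drains before it
            return t + work                # target task executes now
        t += gap
        work += rng.exponential(1.0 / mu) - gap
    return None

rng = np.random.default_rng(0)
n = 2000
# subcritical (inflow 0.5 < drain rate 1) vs. supercritical (inflow 1.5 > 1)
done_sub = sum(target_wait(0.5, 1.0, 1.0, 200.0, rng) is not None for _ in range(n))
done_sup = sum(target_wait(1.5, 1.0, 1.0, 200.0, rng) is not None for _ in range(n))
print(done_sub / n, done_sup / n)
```

When the mean work inflow λ/μ exceeds the drain rate (a positive time deficit), a finite fraction of runs never empties the backlog within the horizon, mirroring the bifurcation described above; in the subcritical case essentially every run completes.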
At the critical point β = 0 the backlog is marginally stable. By mapping the queue dynamics onto a Wiener process, the authors obtain an exact first‑passage solution: the probability density of the waiting time 𝒯 until the target task is finally executed behaves as P(𝒯) ∼ 𝒯^{−3/2}, i.e. the cumulative distribution decays as 𝒯^{−1/2}. This yields a universal power‑law tail with exponent ½, which matches the observed slow decay in many datasets.
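The first‑passage density invoked here is the standard Lévy–Smirnov result for a driftless Wiener process started a distance W0 from an absorbing origin, f(t) = W0/√(2πt³) · exp(−W0²/2t). A quick numeric check (with an arbitrary W0 = 1) confirms the t^{−3/2} tail of the density, hence the t^{−1/2} decay of the survival probability:

```python
import math

def fp_density(t, w0=1.0):
    """First-passage density of a driftless Wiener process from w0 to 0."""
    return w0 / math.sqrt(2 * math.pi * t**3) * math.exp(-w0**2 / (2 * t))

# log-log slope of the density between two large times -> -3/2
t1, t2 = 1e4, 1e8
slope = math.log(fp_density(t2) / fp_density(t1)) / math.log(t2 / t1)
print(round(slope, 3))  # ≈ -1.5
```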
Real populations, however, are heterogeneous: individuals differ in their λ and μ values, and thus in β. The paper assumes β follows a continuous distribution across the population. Averaging the individual waiting‑time distributions over this β‑distribution modifies the tail exponent: the aggregate decay follows 1/t^α, where α depends on how the β‑distribution behaves near the critical point β = 0. Hence α can take any value larger than ½; in particular, a β‑distribution with flat density just below the critical point yields the well‑known α = 1, reproducing the empirical 1/t law.
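One way to see how heterogeneity reshapes the tail (a sketch using the drifted first‑passage density f(t | β) = W0/√(2πt³) · exp(−(W0 + βt)²/2t), not the paper's exact derivation; W0 and the range of β are arbitrary): averaging f over a flat distribution of β on a small interval below zero steepens the density tail from t^{−3/2} to t^{−2}, i.e. the survival probability decays as 1/t.

```python
import numpy as np

def fp_density(t, beta, w0=1.0):
    """First-passage density to 0 of a Wiener process w0 + beta*t + B(t)."""
    return w0 / np.sqrt(2 * np.pi * t**3) * np.exp(-(w0 + beta * t) ** 2 / (2 * t))

def mixed_density(t, b=0.5, n=20001):
    """Average f(t | beta) over beta uniform on [-b, 0] (midpoint rule)."""
    beta = -b + (np.arange(n) + 0.5) * (b / n)
    return fp_density(t, beta).mean()

# log-log slope of the beta-averaged density between two large times -> -2
t1, t2 = 1e4, 1e6
slope = np.log(mixed_density(t2) / mixed_density(t1)) / np.log(t2 / t1)
print(round(float(slope), 2))  # ≈ -2.0: density ~ 1/t^2, survival ~ 1/t
```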
A further refinement introduces “procrastination”: even after all higher‑priority tasks have been cleared, the individual may deliberately postpone the target task with probability p. This adds an extra stochastic delay layer, turning the problem into a compound first‑passage process. The analytical treatment shows that procrastination reduces the effective exponent (α′ < α), leading to an even slower decay—consistent with observations of extremely persistent outdated software or long‑lived worm activity.
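A hedged sketch of this mechanism, again on the illustrative workload model rather than the paper's formalism: whenever the backlog empties, the user runs the target task with probability 1 − p and otherwise procrastinates until the backlog next empties. Parameters are arbitrary; the subcritical case is assumed so every busy cycle ends.

```python
import numpy as np

def completion_time(lam, mu, w0, p, rng, max_cycles=10_000):
    """Time until the lowest-priority target task is actually executed.

    Each time the higher-priority backlog (drained at unit rate, fed by
    Poisson-rate-`lam` jobs of Exp(1/mu) work) empties, the target runs
    with probability 1 - p; otherwise it is postponed to the next cycle.
    """
    t, work = 0.0, w0
    for _ in range(max_cycles):
        while True:                        # drain the current busy period
            gap = rng.exponential(1.0 / lam)
            if work <= gap:
                t += work
                idle = gap - work          # residual time to next arrival
                work = 0.0
                break
            t += gap
            work += rng.exponential(1.0 / mu) - gap
        if rng.random() >= p:              # target finally executed
            return t
        t += idle                          # procrastinate: wait for the
        work = rng.exponential(1.0 / mu)   # next job to start a new cycle
    return t

rng = np.random.default_rng(1)
n = 2000
t_no = np.mean([completion_time(0.5, 1.0, 1.0, 0.0, rng) for _ in range(n)])
t_pro = np.mean([completion_time(0.5, 1.0, 1.0, 0.8, rng) for _ in range(n)])
print(round(float(t_no), 2), round(float(t_pro), 2))
```

With procrastination probability p = 0.8 the mean delay grows severalfold relative to p = 0, since the target must survive a geometric number of busy cycles before it is executed.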
The authors validate the theory through extensive Monte‑Carlo simulations, systematically varying β and p. The simulated survival curves align closely with the analytical predictions and with real‑world data on browser version adoption, patch deployment, and worm prevalence, achieving R² > 0.9 in all cases.
In summary, the paper makes four major contributions: (1) it demonstrates that priority‑queue dynamics naturally generate the long‑tailed waiting times seen in cybersecurity persistence; (2) it identifies a bifurcation at β = 0 that separates regimes of guaranteed eventual update from regimes where updates may never occur; (3) it shows that heterogeneity in the time‑deficit parameter across users broadens the possible power‑law exponents, encompassing the empirically observed range; and (4) it introduces a procrastination mechanism that explains even flatter decay curves. The work opens avenues for future research, such as incorporating network effects, multiple competing target tasks, and richer behavioral models of human decision‑making.