Constraint Migration: A Formal Theory of Throughput in AI Cybersecurity Pipelines
Authors: Surasak Phetmanee
Constraint Migration: A Formal Theory of Throughput in AI Cybersecurity Pipelines

Surasak Phetmanee
Department of Electrical and Computer Engineering, Faculty of Engineering, Thammasat School of Engineering, Thammasat University, Thailand
psurasak@engr.tu.ac.th

Abstract. We develop a formal theory of throughput in finite serial pipeline systems subject to stage-multiplicative capacity perturbations, motivated by the deployment of AI tools in cybersecurity operations. A pipeline is a finite totally ordered set of stages, each with a positive capacity; throughput is the minimum stage capacity. An admissible multiplier assigns to each stage an improvement factor of at least one. We prove five theorems and a repairing proposition. Theorems 1–2 give exact necessary and sufficient conditions: throughput is unchanged if and only if at least one bottleneck retains multiplier 1, and throughput strictly increases if and only if every bottleneck has multiplier strictly greater than 1. Theorem 3 establishes that when a nonempty subset of stages is constrained to multiplier 1 (the human-authority constraint), throughput is bounded above by the smallest capacity among those stages, and this bound is tight under unbounded non-human acceleration. Theorem 4 proves that in a pair of independent attacker and defender pipelines, the attacker-defender throughput ratio worsens for the defender if and only if the attacker's relative throughput gain exceeds the defender's. Theorem 5 proves that under a fixed false-positive-fraction model, useful throughput is constant (not decreasing) above the investigation capacity, establishing that a commonly asserted paradoxical decline is impossible in that model. Proposition 4 shows that replacing the fixed fraction with a rate-dependent precision function that is strictly decreasing suffices to recover the intended decline.
All proofs are elementary, using only finite minima, order properties of the reals, and pointwise multiplicative structure.

1 Introduction

Informal arguments about AI in cybersecurity ("AI speeds up both attackers and defenders", "AI cannot replace human judgement", "more alerts means more noise") are circulated but not formally grounded. They are used both to justify and to refute contrary policy positions. The conditions under which these slogans are true or false remain invisible. This paper provides formal ground. We model a cybersecurity pipeline as a finite totally ordered set of stages, each with a positive processing capacity. System throughput is the minimum stage capacity, the deterministic serial-bottleneck abstraction familiar from the informal Theory of Constraints [1]. AI is modelled as a stage-multiplicative improvement: each stage's capacity is multiplied by a factor of at least one. Within this framework, we prove exact conditions for when throughput changes, what ceiling human-authority stages impose, how attacker and defender throughputs interact, and why a naïve false-positive model cannot produce the paradoxical decline that informal arguments predict.

Informal statement of main results. Throughput invariance (Theorems 1–2): throughput is unchanged if and only if at least one original bottleneck remains unimproved; throughput strictly increases if and only if every original bottleneck is strictly improved. Human-authority ceiling (Theorem 3): if some stages cannot be accelerated, throughput cannot exceed the smallest capacity among those stages, and this bound is exactly achievable. Adversarial acceleration (Theorem 4): the attacker-defender throughput ratio worsens for the defender precisely when the attacker's relative gain exceeds the defender's.
False-positive impossibility and repair (Theorem 5, Proposition 4): under a fixed false-positive fraction, useful throughput plateaus rather than declines above saturation; strict decline requires a rate-dependent precision function.

Contributions.
– A formal pipeline model with explicit axioms, definitions, and notation for throughput, bottleneck sets, multipliers, and perturbed pipelines (§3).
– Exact biconditional characterisations of throughput invariance and strict improvement under multiplicative perturbation (Theorems 1–2, §4).
– A tight upper-bound theorem for throughput under authority constraints, with an explicit tightness construction (Theorem 3, §4).
– An algebraic equivalence characterising when AI favours the attacker in a two-pipeline adversarial model (Theorem 4, §4).
– An impossibility result for the fixed false-positive model and a repaired theorem under a rate-dependent precision function (Theorem 5, Proposition 4, §4).
– A scope-and-limitations analysis identifying what is and is not proved (§6).

Organisation. Section 2 surveys related work. Section 3 presents the formal model, including domains, definitions, and assumptions. Section 4 states and proves all results in dependency order. Section 5 interprets the results. Section 6 discusses limitations and future work. Section 7 concludes.

2 Related Work

Theory of Constraints. The informal Theory of Constraints (TOC) was introduced by Goldratt [1] in the context of manufacturing. Its central claim, that system throughput is determined by the bottleneck and that improving non-bottlenecks does not improve throughput, has been adopted in operations management. However, TOC has not been formalised as a mathematical theory with axioms, definitions, and proved theorems. This paper provides such a formalisation for the pipeline case, proving exact necessary and sufficient conditions (Theorems 1–2) rather than informal claims.
Serial production lines. The operations-research literature on serial production systems [2] analyses throughput in stochastic settings with buffers and variability. Our model is deterministic and bufferless, a deliberate simplification that isolates the bottleneck structure from queueing effects. The throughput formula T(Π) = min_v c(v) is the infinite-buffer (fluid-limit) case of serial-line models. We do not extend to stochastic settings; see Section 6.

Flow networks. The max-flow min-cut theorem [3] provides a characterisation of throughput in general networks. A pipeline is a degenerate case in which the minimum cut equals the minimum stage capacity. Our contribution is not the throughput formula itself but the perturbation theory built on it: characterising invariance, migration, ceilings, and adversarial comparison under multiplicative stage improvements.

Security economics. Anderson and Moore [4] and Gordon and Loeb [5] study security investment through economic models focused on incentives and optimal spending. Their models do not address pipeline throughput or bottleneck structure. Our Theorem 4 provides a throughput-ratio criterion for the attacker-defender comparison, which complements economic models by identifying when throughput asymmetry favours one side.

Formal methods in security. Formal methods have been applied to security-protocol verification via the Dolev-Yao model [6], BAN logic [7], and the applied pi calculus [8]. These formalisms address secrecy and authentication properties, not pipeline throughput. We are not aware of prior formal-methods work addressing cybersecurity pipeline bottleneck analysis.

AI in cybersecurity. The empirical literature on AI in cybersecurity [9] documents AI applications in intrusion detection, malware analysis, and threat intelligence.
Alert fatigue and false-positive burden are well documented operationally [10]. This literature motivates our formal model but does not provide formal theorems. Our Theorem 5 and Proposition 4 formalise the false-positive-burden argument that this literature discusses informally.

Human oversight in security operations. The human-authority constraint (α(h) = 1 for human stages) formalises concerns from the human supervisory control literature [11, 12]. These works argue that certain decision stages cannot be fully automated. Our Theorem 3 gives the exact throughput ceiling that follows from this constraint.

3 Formal Preliminaries

3.1 Domains

We write R_{>0} := {x ∈ R : x > 0} for the strictly positive reals and R_{≥1} := {x ∈ R : x ≥ 1} for the admissible multiplicative factors. Throughout, V denotes a finite nonempty set of stages, and P(V) denotes its power set.

3.2 Definitions

Definition 1 (Pipeline). A pipeline is a tuple Π = (V_Π, ≺_Π, c_Π) where
1. V_Π is a finite nonempty set (the stage set),
2. ≺_Π is a strict total order on V_Π (the stage ordering),
3. c_Π : V_Π → R_{>0} is a function (the stage capacity function).
The class of all pipelines is denoted Pipe.

Remark 1. The stage ordering ≺_Π models the serial structure of the pipeline. However, as we note in Section 5, none of the proofs in this paper invoke ≺_Π; the theorems depend only on V_Π and c_Π. The ordering is retained for interpretive fidelity to the pipeline metaphor.

Definition 2 (Throughput). For Π ∈ Pipe, the throughput of Π is T(Π) := min_{v ∈ V_Π} c_Π(v).

Definition 3 (Bottleneck set). For Π ∈ Pipe, the bottleneck set is B(Π) := arg min_{v ∈ V_Π} c_Π(v) = {v ∈ V_Π : c_Π(v) = T(Π)}. The non-bottleneck set is NB(Π) := V_Π \ B(Π).

Example 1. Let V = {a, b, c} with a ≺ b ≺ c and c_Π(a) = 3, c_Π(b) = 1, c_Π(c) = 4.
Then T(Π) = 1, B(Π) = {b}, and NB(Π) = {a, c}.

Definition 4 (Admissible multiplier). For a finite nonempty set V, an admissible multiplier on V is a function α : V → R_{≥1}. The set of all admissible multipliers on V is denoted Mult(V). The identity multiplier 1 ∈ Mult(V) is defined by 1(v) = 1 for all v ∈ V.

Definition 5 (Perturbed pipeline). Let Π ∈ Pipe and α ∈ Mult(V_Π). The perturbed capacity function is c_Π^α(v) := α(v) · c_Π(v) for all v ∈ V_Π. The perturbed pipeline is Π^α := (V_Π, ≺_Π, c_Π^α).

Remark 2. The superscript in Π^α denotes perturbation, not exponentiation. Since α(v) ≥ 1 and c_Π(v) > 0, we have c_Π^α(v) > 0, so Π^α ∈ Pipe.

Definition 6 (Constraint migration). Let Π ∈ Pipe and α ∈ Mult(V_Π). Constraint migration occurs under α if B(Π^α) ≠ B(Π). We write Migr(Π, α) :⟺ B(Π^α) ≠ B(Π).

3.3 Human Authority Extension

Definition 7 (Human authority structure). Let Π ∈ Pipe. A human-authority stage set is any subset H ⊆ V_Π. The human-authority admissible multiplier set is Mult_H(V_Π) := {α ∈ Mult(V_Π) : ∀h ∈ H, α(h) = 1}. The human ceiling value is C_H(Π) := min_{h ∈ H} c_Π(h), defined when H ≠ ∅. The machine-stage set is M(Π, H) := V_Π \ H.

3.4 Adversarial Extension

Definition 8 (Attacker-defender pipeline pair). An attacker-defender pipeline pair is (Π_A, Π_D) ∈ PipePair := Pipe × Pipe. We write V_A := V_{Π_A}, V_D := V_{Π_D}, c_A := c_{Π_A}, c_D := c_{Π_D}. For α_A ∈ Mult(V_A) and α_D ∈ Mult(V_D), the comparative throughput ratio is R(Π_A, Π_D, α_A, α_D) := T(Π_A^{α_A}) / T(Π_D^{α_D}), and the baseline ratio is R_0(Π_A, Π_D) := T(Π_A) / T(Π_D).

3.5 False Positive Model

This section introduces a scalar model that is formally independent of the pipeline theory above.

Definition 9 (Simple useful throughput).
For λ ∈ R_{>0}, f ∈ [0, 1), and c_inv ∈ R_{>0}, the simple useful-throughput function is U(λ, f, c_inv) := (1 − f) · min(λ, c_inv).

Definition 10 (Repaired useful throughput). Let p : R_{>0} → [0, 1] be a precision function and c_inv ∈ R_{>0}. The repaired useful-throughput function is U_p(λ, c_inv) := p(λ) · min(λ, c_inv).

3.6 Assumptions

We collect the assumptions used across the paper. Each theorem statement names exactly those assumptions it requires.

Assumption 1 (Finite nonempty stage set). For every Π ∈ Pipe, the set V_Π is finite and nonempty.

Assumption 2 (Strict positivity). For every Π ∈ Pipe and every v ∈ V_Π, c_Π(v) > 0.

Assumption 3 (Throughput is the stagewise minimum). For every Π ∈ Pipe, T(Π) = min_{v ∈ V_Π} c_Π(v).

Assumption 4 (Stage-multiplicative perturbation). For every Π ∈ Pipe and α ∈ Mult(V_Π), c_Π^α(v) = α(v) · c_Π(v) for all v ∈ V_Π.

Assumption 5 (Admissible multipliers are at least one). For every Π ∈ Pipe and α ∈ Mult(V_Π), α(v) ≥ 1 for all v ∈ V_Π.

Assumption 6 (Human-authority stages are unaccelerable). In the human-ceiling model, α(h) = 1 for all h ∈ H, i.e., α ∈ Mult_H(V_Π).

Assumption 7 (Unbounded non-human acceleration). For every M > 1, there exists α ∈ Mult_H(V_Π) with α(m) ≥ M for all m ∈ V_Π \ H.

Assumption 8 (Independent attacker-defender pipelines). The perturbed throughputs T(Π_A^{α_A}) and T(Π_D^{α_D}) are computed independently; no coupling exists between the two pipelines.

Assumption 9 (Strictly decreasing precision). The precision function p is strictly decreasing on (c_inv, ∞): for all λ_1, λ_2 with c_inv < λ_1 < λ_2, p(λ_1) > p(λ_2).

4 Results

We present all results in the order dictated by the proof dependency graph: foundational lemmas first, then core throughput theorems, then extensions.

4.1 Lemmas

Lemma 1 (Bottleneck existence). Let Π ∈ Pipe.
Then T(Π) is well defined, B(Π) ≠ ∅, and c_Π(v) = T(Π) for every v ∈ B(Π).

Proof. By Assumption 1, V_Π is finite and nonempty. By Assumption 2, the image {c_Π(v) : v ∈ V_Π} is a finite nonempty subset of R_{>0}. Every finite nonempty subset of R has a minimum, so min_{v ∈ V_Π} c_Π(v) exists; by Assumption 3 this value is T(Π). Since the minimum is attained, there exists v_0 ∈ V_Π with c_Π(v_0) = T(Π), so v_0 ∈ B(Π) by Definition 3, giving B(Π) ≠ ∅. For any v ∈ B(Π), c_Π(v) = T(Π) holds by the definition of B(Π).

Lemma 2 (Perturbed pipeline membership). Let Π ∈ Pipe and α ∈ Mult(V_Π). Then Π^α ∈ Pipe.

Proof. We verify the three conditions of Definition 1 for Π^α = (V_Π, ≺_Π, c_Π^α). Conditions (1) and (2) hold because Π^α inherits V_Π and ≺_Π from Π ∈ Pipe. For condition (3), let v ∈ V_Π. By Assumption 4, c_Π^α(v) = α(v) · c_Π(v). By Assumption 5, α(v) ≥ 1 > 0, and by Assumption 2, c_Π(v) > 0. The product of two positive reals is positive, so c_Π^α(v) > 0. Therefore c_Π^α : V_Π → R_{>0}, and Π^α ∈ Pipe.

Lemma 3 (Perturbed throughput normal form). Let Π ∈ Pipe and α ∈ Mult(V_Π). Then T(Π^α) = min_{v ∈ V_Π} α(v) · c_Π(v).

Proof. By Lemma 2, Π^α ∈ Pipe. By Assumption 3 applied to Π^α, T(Π^α) = min_{v ∈ V_{Π^α}} c_{Π^α}(v). By Definition 5, V_{Π^α} = V_Π. By Assumption 4, c_{Π^α}(v) = α(v) · c_Π(v). Substituting, T(Π^α) = min_{v ∈ V_Π} α(v) · c_Π(v).

Micro-Lemma 1 (Monotonicity of minimum under domination). Let V be a finite nonempty set and f, g : V → R with f(v) ≤ g(v) for all v ∈ V. Then min_{v ∈ V} f(v) ≤ min_{v ∈ V} g(v).

Proof. Both minima exist because V is finite and nonempty. Let w ∈ V achieve min_{v ∈ V} g(v). Then min_{v ∈ V} f(v) ≤ f(w) ≤ g(w) = min_{v ∈ V} g(v).
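The objects defined so far are small enough to state executably. The following sketch (our own dict-based representation; the names `throughput`, `bottlenecks`, and `perturb` are illustrative and not from the paper) mirrors Definitions 2–5 and checks Example 1:

```python
# Minimal executable sketch of Definitions 2-5. A pipeline's capacity
# function is a stage -> capacity map; the stage ordering plays no role
# in any proof, so it is omitted here.

def throughput(c):
    """T(Pi): the minimum stage capacity (Definition 2)."""
    return min(c.values())

def bottlenecks(c):
    """B(Pi): the arg-min set of the capacity map (Definition 3)."""
    t = throughput(c)
    return {v for v, cap in c.items() if cap == t}

def perturb(c, alpha):
    """c^alpha(v) = alpha(v) * c(v), with alpha(v) >= 1 (Definitions 4-5).

    Stages absent from alpha get the identity multiplier 1.
    """
    assert all(alpha.get(v, 1.0) >= 1.0 for v in c), "multipliers must be >= 1"
    return {v: alpha.get(v, 1.0) * cap for v, cap in c.items()}

# Example 1 from the paper: capacities 3, 1, 4.
c = {"a": 3.0, "b": 1.0, "c": 4.0}
assert throughput(c) == 1.0
assert bottlenecks(c) == {"b"}

# Improving only non-bottleneck stages leaves T unchanged (Corollary 2).
assert throughput(perturb(c, {"a": 2.0, "c": 10.0})) == 1.0
```

The final assertion is exactly the phenomenon Lemma 3 and the upcoming Theorem 1 make precise: the untouched bottleneck pins the minimum.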
Proposition 1 (Monotonicity of throughput). Let Π ∈ Pipe and α, β ∈ Mult(V_Π) with α(v) ≤ β(v) for all v ∈ V_Π. Then T(Π^α) ≤ T(Π^β).

Proof. For every v ∈ V_Π, since c_Π(v) > 0 (Assumption 2) and α(v) ≤ β(v), we have α(v) · c_Π(v) ≤ β(v) · c_Π(v). By Micro-Lemma 1, min_v α(v) · c_Π(v) ≤ min_v β(v) · c_Π(v). By Lemma 3, the left side is T(Π^α) and the right side is T(Π^β).

Corollary 1 (Non-decrease under perturbation). Let Π ∈ Pipe and α ∈ Mult(V_Π). Then T(Π^α) ≥ T(Π).

Proof. By Assumption 5, α(v) ≥ 1 = 1(v) for all v ∈ V_Π. By Proposition 1, T(Π^1) ≤ T(Π^α). By Lemma 3, T(Π^1) = min_v 1 · c_Π(v) = min_v c_Π(v) = T(Π). Therefore T(Π) ≤ T(Π^α).

4.2 Throughput Theorems

Micro-Lemma 2 (Strict lower bound on finite minimum). Let V be a finite nonempty set, f : V → R, and c ∈ R. If f(v) > c for every v ∈ V, then min_{v ∈ V} f(v) > c.

Proof. Since V is finite and nonempty, let w ∈ V achieve min_{v ∈ V} f(v). Then min_{v ∈ V} f(v) = f(w) > c.

Theorem 1 (Throughput invariance characterisation). Let Π ∈ Pipe and α ∈ Mult(V_Π). Then T(Π^α) = T(Π) ⟺ ∃v ∈ B(Π) such that α(v) = 1. Equivalently, T(Π^α) = T(Π) if and only if min_{v ∈ B(Π)} α(v) = 1.

Proof. (⇐) Suppose there exists v_0 ∈ B(Π) with α(v_0) = 1. By Lemma 1, c_Π(v_0) = T(Π). By Assumption 4, c_Π^α(v_0) = 1 · T(Π) = T(Π). By Lemma 3, T(Π^α) = min_v α(v) · c_Π(v) ≤ α(v_0) · c_Π(v_0) = T(Π). By Corollary 1, T(Π^α) ≥ T(Π). Therefore T(Π^α) = T(Π).

(⇒) We prove the contrapositive: if α(v) > 1 for every v ∈ B(Π), then T(Π^α) > T(Π). Assume α(v) > 1 for all v ∈ B(Π). By Lemma 1, B(Π) is a finite nonempty subset of V_Π. Define δ := min_{v ∈ B(Π)} α(v).
By Micro-Lemma 2 applied with V = B(Π), f = α|_{B(Π)}, and c = 1, we have δ > 1. We show α(v) · c_Π(v) > T(Π) for every v ∈ V_Π by considering two exhaustive cases.

Case 1: v ∈ B(Π). By Lemma 1, c_Π(v) = T(Π). Since α(v) ≥ δ and T(Π) > 0 (because c_Π(v) > 0 for all v by Assumption 2, and T(Π) is the minimum of finitely many positive values), we have α(v) · c_Π(v) ≥ δ · T(Π) > T(Π).

Case 2: v ∈ V_Π \ B(Π). Since v ∉ B(Π), by Definition 3, c_Π(v) ≠ T(Π). Since c_Π(v) ≥ T(Π) (as T(Π) is the minimum) and c_Π(v) ≠ T(Π), we have c_Π(v) > T(Π). By Assumption 5, α(v) ≥ 1, so α(v) · c_Π(v) ≥ c_Π(v) > T(Π).

Since V_Π = B(Π) ∪ (V_Π \ B(Π)) and every v ∈ V_Π satisfies α(v) · c_Π(v) > T(Π), by Micro-Lemma 2 applied with V = V_Π (finite and nonempty by Assumption 1), T(Π^α) = min_{v ∈ V_Π} α(v) · c_Π(v) > T(Π). This completes the contrapositive.

Equivalence of formulations. Since α ∈ Mult(V_Π), we have α(v) ≥ 1 for all v ∈ B(Π) by Assumption 5. Hence min_{v ∈ B(Π)} α(v) ≥ 1. The condition "∃v ∈ B(Π), α(v) = 1" holds iff min_{v ∈ B(Π)} α(v) ≤ 1, which combined with ≥ 1 gives min_{v ∈ B(Π)} α(v) = 1.

Throughput stays the same precisely when at least one original bottleneck retains its original capacity, because that stage pins the minimum at its original level while admissibility (α ≥ 1) prevents any stage from worsening.

Corollary 2 (Non-bottleneck improvement). Let Π ∈ Pipe and α ∈ Mult(V_Π). If α(v) = 1 for all v ∈ B(Π), then T(Π^α) = T(Π).

Proof. By Lemma 1, B(Π) ≠ ∅, so some v_0 ∈ B(Π) satisfies α(v_0) = 1. By Theorem 1 (⇐), T(Π^α) = T(Π).

Theorem 2 (Strict throughput improvement characterisation). Let Π ∈ Pipe and α ∈ Mult(V_Π). Then T(Π^α) > T(Π) ⟺ ∀v ∈ B(Π), α(v) > 1.

Proof.
By Corollary 1, T(Π^α) ≥ T(Π), so exactly one of T(Π^α) = T(Π) or T(Π^α) > T(Π) holds. By Theorem 1, T(Π^α) = T(Π) ⟺ ∃v ∈ B(Π), α(v) = 1. Negating: T(Π^α) > T(Π) ⟺ ∀v ∈ B(Π), α(v) ≠ 1. By Assumption 5, α(v) ≥ 1, so α(v) ≠ 1 is equivalent to α(v) > 1.

Throughput strictly increases if and only if every original bottleneck is strictly improved. In a system with tied bottlenecks, all must be accelerated: improving all but one leaves throughput unchanged.

Proposition 2 (Bottleneck preservation). Let Π ∈ Pipe and α ∈ Mult(V_Π). Then B(Π^α) = B(Π) if and only if: (i) α(u) = α(u′) for all u, u′ ∈ B(Π); and (ii) α(u) · c_Π(u) < α(w) · c_Π(w) for all u ∈ B(Π) and w ∈ V_Π \ B(Π). When B(Π) = V_Π, condition (ii) is vacuously true.

Proof. Let T := T(Π) and T′ := T(Π^α) = min_v α(v) · c_Π(v) (Lemma 3).

(⇐) Let α_0 be the common value of α on B(Π). For u ∈ B(Π): α(u) · c_Π(u) = α_0 T. For w ∉ B(Π): α_0 T < α(w) · c_Π(w) by (ii). Hence T′ = α_0 T and the minimisers are exactly B(Π), so B(Π^α) = B(Π).

(⇒) If B(Π^α) = B(Π), then for every u ∈ B(Π), α(u) · c_Π(u) = T′. Since c_Π(u) = T > 0 for u ∈ B(Π) (Lemma 1), α(u) = T′/T is the same for all u, giving (i). For w ∉ B(Π), since w ∉ B(Π^α), α(w) · c_Π(w) > T′ = α(u) · c_Π(u), giving (ii).

Proposition 3 (Migration). Let Π ∈ Pipe and α ∈ Mult(V_Π). Then Migr(Π, α) ⟺ (∃v ∈ B(Π) : v ∉ B(Π^α)) ∨ (∃w ∈ V_Π \ B(Π) : w ∈ B(Π^α)).

Proof. By Definition 6, Migr(Π, α) ⟺ B(Π^α) ≠ B(Π). For any two sets A, B: A ≠ B ⟺ (∃x ∈ A \ B) ∨ (∃x ∈ B \ A). Applying this with A = B(Π) and B = B(Π^α), and noting B(Π^α) ⊆ V_Π, gives the result.
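The biconditionals above are easy to stress-test by brute force over small random pipelines. A sketch (the helper names are ours; integer-valued capacities and multipliers keep floating-point equality exact):

```python
import random

def throughput(c):
    """T(Pi): minimum stage capacity."""
    return min(c.values())

def bottlenecks(c):
    """B(Pi): arg-min set of the capacity map."""
    t = throughput(c)
    return {v for v, cap in c.items() if cap == t}

random.seed(0)
for _ in range(1000):
    stages = ["s%d" % i for i in range(random.randint(1, 4))]
    c = {v: float(random.randint(1, 5)) for v in stages}
    alpha = {v: random.choice([1.0, 2.0, 3.0]) for v in stages}
    c2 = {v: alpha[v] * c[v] for v in stages}   # perturbed capacities
    B = bottlenecks(c)

    # Theorem 1: invariance iff some bottleneck keeps multiplier 1.
    assert (throughput(c2) == throughput(c)) == any(alpha[v] == 1.0 for v in B)
    # Theorem 2: strict increase iff every bottleneck is strictly improved.
    assert (throughput(c2) > throughput(c)) == all(alpha[v] > 1.0 for v in B)
    # Proposition 2: B preserved iff alpha is constant on B and every
    # perturbed bottleneck value stays below every perturbed non-bottleneck.
    same_mult = len({alpha[v] for v in B}) == 1
    below = all(alpha[u] * c[u] < alpha[w] * c[w]
                for u in B for w in c if w not in B)
    assert (bottlenecks(c2) == B) == (same_mult and below)
```

Such a check is no substitute for the proofs, but it exercises the tied-bottleneck edge cases (|B(Π)| > 1 and B(Π) = V_Π) that informal reasoning tends to skip.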
4.3 Human Authority Ceiling

Theorem 3 (Human authority ceiling and tightness). Let Π ∈ Pipe and H ⊆ V_Π with H ≠ ∅.
(a) For every α ∈ Mult_H(V_Π), T(Π^α) ≤ C_H(Π) = min_{h ∈ H} c_Π(h).
(b) Under Assumption 7, sup_{α ∈ Mult_H(V_Π)} T(Π^α) = C_H(Π).

Proof. Part (a). By Lemma 3, T(Π^α) = min_{v ∈ V_Π} α(v) · c_Π(v). Since H ⊆ V_Π and H ≠ ∅, min_{v ∈ V_Π} α(v) · c_Π(v) ≤ min_{h ∈ H} α(h) · c_Π(h). By Assumption 6, α(h) = 1 for h ∈ H, so min_{h ∈ H} α(h) · c_Π(h) = min_{h ∈ H} c_Π(h) = C_H(Π).

Part (b). Case 1: H = V_Π. Then Mult_H(V_Π) = {1}, so sup T(Π^α) = T(Π) = min_{v ∈ V_Π} c_Π(v) = C_H(Π). Case 2: M := V_Π \ H ≠ ∅. Define c_min^M := min_{m ∈ M} c_Π(m) > 0 (this exists by Assumptions 1 and 2). Let N := ⌈C_H(Π)/c_min^M⌉ + 1, so N · c_min^M > C_H(Π). By Assumption 7, there exists α* ∈ Mult_H(V_Π) with α*(m) ≥ N for all m ∈ M. By Lemma 3,

T(Π^{α*}) = min( min_{h ∈ H} c_Π(h), min_{m ∈ M} α*(m) · c_Π(m) ) = min( C_H(Π), min_{m ∈ M} α*(m) · c_Π(m) ).

For every m ∈ M: α*(m) · c_Π(m) ≥ N · c_min^M > C_H(Π). By Micro-Lemma 2, min_{m ∈ M} α*(m) · c_Π(m) > C_H(Π). Therefore T(Π^{α*}) = C_H(Π), giving sup T(Π^α) ≥ C_H(Π). Combined with Part (a), sup T(Π^α) = C_H(Π).

4.4 Adversarial Comparative Statics

Theorem 4 (Adversarial relative acceleration). Let (Π_A, Π_D) ∈ PipePair, α_A ∈ Mult(V_A), α_D ∈ Mult(V_D). Then

R(Π_A, Π_D, α_A, α_D) > R_0(Π_A, Π_D) ⟺ T(Π_A^{α_A})/T(Π_A) > T(Π_D^{α_D})/T(Π_D).

Proof. All four throughputs T(Π_A), T(Π_D), T(Π_A^{α_A}), T(Π_D^{α_D}) are strictly positive (each is the minimum of finitely many products of positive reals, by Assumptions 2 and 5). By the definitions of R and R_0,

R > R_0 ⟺ T(Π_A^{α_A})/T(Π_D^{α_D}) > T(Π_A)/T(Π_D).
Multiplying both sides by T(Π_D^{α_D})/T(Π_A) > 0 gives T(Π_A^{α_A})/T(Π_A) > T(Π_D^{α_D})/T(Π_D).

The attacker-defender ratio worsens for the defender precisely when the attacker's relative throughput gain exceeds the defender's.

Corollary 3 (Defender misses the bottleneck). If α_A(v) > 1 for every v ∈ B(Π_A) and there exists w ∈ B(Π_D) with α_D(w) = 1, then R(Π_A, Π_D, α_A, α_D) > R_0(Π_A, Π_D).

Proof. By Theorem 2, T(Π_A^{α_A}) > T(Π_A), so T(Π_A^{α_A})/T(Π_A) > 1. By Theorem 1, T(Π_D^{α_D}) = T(Π_D), so T(Π_D^{α_D})/T(Π_D) = 1. By Theorem 4, R > R_0.

4.5 False Positive Model: Audit and Repair

Theorem 5 (Impossibility of post-saturation decline). Let f ∈ [0, 1) and c_inv ∈ R_{>0}. For every λ_1, λ_2 > c_inv, U(λ_1, f, c_inv) = U(λ_2, f, c_inv) = (1 − f) · c_inv. Consequently, λ ↦ U(λ, f, c_inv) is constant on (c_inv, ∞).

Proof. Let λ > c_inv. Since λ > c_inv, min(λ, c_inv) = c_inv. By Definition 9, U(λ, f, c_inv) = (1 − f) · c_inv. This value does not depend on λ.

Under the fixed false-positive-fraction model, useful throughput saturates at (1 − f) c_inv once the alert rate exceeds investigation capacity. There is no decline, only a plateau. Any argument asserting strict decline in this model is incorrect.

Proposition 4 (Repaired false-positive burden). Let p : R_{>0} → [0, 1] satisfy Assumption 9, and let c_inv ∈ R_{>0}. Then U_p is strictly decreasing on (c_inv, ∞): for all λ_1, λ_2 with c_inv < λ_1 < λ_2, U_p(λ_1, c_inv) > U_p(λ_2, c_inv).

Proof. Let c_inv < λ_1 < λ_2. Since both exceed c_inv, min(λ_i, c_inv) = c_inv for i = 1, 2. By Definition 10, U_p(λ_i, c_inv) = p(λ_i) · c_inv. By Assumption 9, p(λ_1) > p(λ_2). Since c_inv > 0 (by hypothesis), p(λ_1) · c_inv > p(λ_2) · c_inv.
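The plateau/decline contrast can be made concrete numerically. In the sketch below, `precision` is a hypothetical strictly decreasing function chosen only for illustration (the paper fixes no particular p), and the capacity and fraction values are likewise illustrative:

```python
# U(lam, f, c_inv)  = (1 - f) * min(lam, c_inv)      (Definition 9)
# U_p(lam, c_inv)   = p(lam) * min(lam, c_inv)       (Definition 10)

C_INV = 100.0   # investigation capacity (illustrative value)
F = 0.25        # fixed false-positive fraction (illustrative value)

def useful_fixed(lam):
    """Simple useful throughput with a fixed false-positive fraction."""
    return (1.0 - F) * min(lam, C_INV)

def precision(lam):
    """Hypothetical strictly decreasing precision p : R>0 -> [0, 1]."""
    return 1.0 / (1.0 + lam / C_INV)

def useful_repaired(lam):
    """Repaired useful throughput with rate-dependent precision."""
    return precision(lam) * min(lam, C_INV)

rates = [150.0, 200.0, 400.0, 800.0]   # all above saturation (lam > c_inv)

# Theorem 5: the fixed-fraction model plateaus at (1 - f) * c_inv.
assert all(useful_fixed(lam) == 0.75 * C_INV for lam in rates)

# Proposition 4: with strictly decreasing precision, useful throughput
# strictly declines beyond saturation.
vals = [useful_repaired(lam) for lam in rates]
assert all(a > b for a, b in zip(vals, vals[1:]))
```

Any other strictly decreasing p on (c_inv, ∞) would pass the second assertion equally well; only the constancy of f is essential to the first.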
When precision degrades with increasing alert rate, useful throughput genuinely declines beyond saturation. The declining precision factor overwhelms the constant capacity.

Corollary 4 (Constant precision collapses to a plateau). If p(λ) = f_0 for all λ (constant precision), then U_p(λ, c_inv) = f_0 · c_inv for all λ > c_inv. In particular, U_p is constant, not decreasing, on (c_inv, ∞).

Proof. For λ > c_inv: U_p(λ, c_inv) = p(λ) · c_inv = f_0 · c_inv.

5 Discussion

Theorems 1 and 2 together provide the complete answer to the question: when does throughput change under multiplicative stage-local improvement? The answer, that at least one bottleneck must be improved for any change and every bottleneck must be improved for a strict increase, sharpens the informal Theory of Constraints claim into an exact biconditional. In particular, it handles tied bottlenecks correctly, which informal reasoning typically suppresses.

Theorem 3 makes precise the common intuition that AI cannot replace human judgement. The bound T(Π^α) ≤ min_{h ∈ H} c_Π(h) says that if some stages are designated as requiring human authority, then no amount of machine acceleration can push throughput past the slowest human stage. The tightness result says this ceiling is not loose: it is exactly achievable. This has a direct implication for security-operations staffing: the human bottleneck is the ultimate constraint.

Theorem 4 identifies the correct quantity for comparing AI's impact on attackers versus defenders: not raw stage speedup, but relative throughput gain. Corollary 3 gives the concrete consequence: if the defender invests in non-bottleneck stages while the attacker improves its bottleneck, the defender's investment has zero throughput effect and the ratio worsens.
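The Corollary 3 scenario can be replayed numerically. The stage names and capacities below are illustrative only, not from the paper:

```python
def throughput(c):
    """T(Pi): minimum stage capacity."""
    return min(c.values())

# Illustrative capacities (events/hour); not from the paper.
attacker = {"recon": 50.0, "exploit": 10.0, "exfil": 40.0}   # bottleneck: exploit
defender = {"detect": 80.0, "triage": 5.0, "respond": 30.0}  # bottleneck: triage

# Attacker triples its bottleneck; defender triples a non-bottleneck stage.
attacker_after = dict(attacker, exploit=3.0 * attacker["exploit"])
defender_after = dict(defender, detect=3.0 * defender["detect"])

r0 = throughput(attacker) / throughput(defender)             # baseline ratio R_0
r = throughput(attacker_after) / throughput(defender_after)  # perturbed ratio R

# Corollary 3: defender gain is zero (Theorem 1), attacker gain is strict
# (Theorem 2), so the ratio worsens for the defender.
assert throughput(defender_after) == throughput(defender)
assert throughput(attacker_after) > throughput(attacker)
assert r > r0
```

Here the defender spent effort with zero throughput effect while the ratio moved from 2.0 to 6.0 in the attacker's favour, exactly the mechanism the corollary isolates.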
Theorem 5 is the most unusual result: it proves that a commonly discussed effect (useful throughput declining with increased detection sensitivity) cannot occur under the standard fixed-f model. The intended behaviour requires a model in which precision degrades with rate (Proposition 4). We view this as a contribution to modelling methodology: the negative theorem prevents a false claim, and the repair identifies the minimal additional assumption needed.

No proof in this paper invokes the stage ordering ≺_Π. The results hold for any finite set of positive capacities under multiplicative perturbation. We retain the ordering because the motivating application, cybersecurity pipelines, has a natural sequential structure (detection, triage, investigation, response), and because "pipeline" without ordering would be a misleading metaphor.

Which assumptions are essential. The most critical assumption is that throughput equals the stagewise minimum (Assumption 3). Under any other aggregation rule (sum, harmonic mean, max flow), the specific invariance and migration characterisations fail. The second critical assumption is stage-local multiplicative perturbation (Assumption 4): if improving one stage can degrade another through coupling, the invariance theorem is false. The admissibility constraint α ≥ 1 (Assumption 5) is essential for Theorem 2 but could be weakened for Theorem 1; the (⇐) direction holds even if α(v) < 1 is permitted, provided the stage with α(v_0) = 1 still pins the minimum.

6 Limitations and Future Work

The theory characterises when throughput changes but not by how much: no closed-form expression for T(Π^α) − T(Π) as a function of the multiplier profile is provided beyond the normal form T(Π^α) = min_v α(v) c_Π(v). No optimal multiplier allocation under budget constraints is derived. No stochastic, queueing, or game-theoretic extension is attempted.
No empirical validation is provided. The model is deterministic, bufferless, and serial. Real cybersecurity pipelines have parallel branches, rework loops, shared resources, and stochastic workloads. The minimum-throughput axiom captures the bottleneck phenomenon but misses every interaction effect. The theory should be read as what follows from the bottleneck assumption, not as how real SOCs work.

Assumption 6 (α(h) = 1) is an idealisation. AI may partially assist human stages, yielding α(h) ∈ (1, β_h]. The ceiling theorem generalises cleanly to this setting, giving T(Π^α) ≤ min_{h ∈ H} β_h · c_Π(h) with the same proof structure, but this extension is not developed here.

Theorem 4 treats attacker and defender as independent. It does not model strategic interaction, arms-race dynamics, or information asymmetry. A game-theoretic formulation remains future work.

The false-positive section (§4.5) is formally independent of the pipeline theory. The connection, that c_inv corresponds to a pipeline stage capacity and λ to upstream output, is interpretive, not proved. Integrating the two models is a natural extension.

Open problems. The most important open problem is extending the theory to stochastic or queueing throughput functionals, where the minimum is replaced by a long-run average rate. Almost none of the current proofs survive in that setting, and new techniques would be required. A second open problem is multiplier-allocation optimisation: given a cost function on multipliers, which allocation maximises throughput? This is trivially answered for the deterministic model (invest everything in the bottleneck), but becomes nontrivial under stochastic variation, risk aversion, or multiple constraints.
7 Conclusion

We have presented a formal theory of throughput in finite serial pipelines under multiplicative stage perturbations, proving exact characterisations of throughput invariance (Theorem 1), strict improvement (Theorem 2), human-authority ceilings (Theorem 3), adversarial throughput comparison (Theorem 4), and false-positive-model limitations (Theorem 5, Proposition 4). The contribution is not mathematical depth but formal precision applied to a domain where only informal arguments currently exist. The theory provides a foundation for evidence-based analysis of AI deployment in cybersecurity operations. The most important open question is whether the core invariance and ceiling results can be extended to stochastic throughput functionals that model the queueing and variability present in real systems. Such an extension would require fundamentally new proof techniques and constitutes a substantial research programme.

References

1. E. M. Goldratt and J. Cox, The Goal: A Process of Ongoing Improvement, North River Press, 1984.
2. J. A. Buzacott and J. G. Shanthikumar, Stochastic Models of Manufacturing Systems, Prentice Hall, 1993.
3. L. R. Ford and D. R. Fulkerson, "Maximal flow through a network," Canadian Journal of Mathematics, vol. 8, pp. 399–404, 1956.
4. R. Anderson, "Why information security is hard: an economic perspective," in Proc. 17th Annual Computer Security Applications Conference (ACSAC), 2001.
5. L. A. Gordon and M. P. Loeb, "The economics of information security investment," ACM Transactions on Information and System Security, vol. 5, no. 4, pp. 438–457, 2002.
6. D. Dolev and A. Yao, "On the security of public key protocols," IEEE Transactions on Information Theory, vol. 29, no. 2, pp. 198–208, 1983.
7. M. Burrows, M. Abadi, and R.
Needham, "A logic of authentication," ACM Transactions on Computer Systems, vol. 8, no. 1, pp. 18–36, 1990.
8. M. Abadi and C. Fournet, "Mobile values, new names, and secure communication," in Proc. 28th ACM Symposium on Principles of Programming Languages (POPL), pp. 104–115, 2001.
9. A. L. Buczak and E. Guven, "A survey of data mining and machine learning methods for cyber security intrusion detection," IEEE Communications Surveys and Tutorials, vol. 18, no. 2, pp. 1153–1176, 2016.
10. B. A. Alahmadi, L. Axon, and I. Martinovic, "99% false positives: A qualitative study of SOC analysts' perspectives on security alarms," in Proc. 31st USENIX Security Symposium, 2022.
11. T. B. Sheridan, Humans and Automation: System Design and Research Issues, Wiley, 2002.
12. R. Parasuraman and V. Riley, "Humans and automation: Use, misuse, disuse, abuse," Human Factors, vol. 39, no. 2, pp. 230–253, 1997.