Quantifying EFT Uncertainties in LHC Searches
Effective Field Theory (EFT) is a general framework, widely used in model-independent searches for new physics, for parametrizing the low-energy approximation to a UV model. The use of EFTs at the LHC can suffer from a ‘validity’ issue, since new physics amplitudes often grow with energy and the kinematic regions with the most sensitivity to new physics have the largest theoretical uncertainties. We propose a method to account for these uncertainties with the aim of producing robust model-independent results with a well-defined statistical interpretation. In this approach, one must specify the new operators being studied as well as the new physics cutoff $M$, the energy scale where the EFT approximation breaks down. At energies below $M$, the EFT uncertainties are accounted for by adding higher-dimensional operators with coefficients that are treated as nuisance parameters. The sizes of the nuisance parameters are governed by a prior likelihood function that incorporates information about dimensional analysis, naturalness, and the scale $M$. At energies above $M$, our method incorporates the lack of predictivity of the EFT, and we show that this is crucial to obtain consistent results. We perform a number of tests of this method in a simple toy model, illustrating its performance in analyses aimed at both exclusion and discovery of new physics. The method is conveniently implemented by the technique of event reweighting and is easily ported to realistic LHC analyses. We find that the procedure converges quickly with the number of nuisance parameters and is conservative when compared to UV models. The paper gives a precise meaning to, and offers a principled and practical solution for, the widely debated ‘EFT validity issue’.
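To make the event-reweighting implementation mentioned in the abstract concrete, here is a minimal, self-contained sketch, not the paper's code: it assumes a single dimension‑6 operator whose amplitude interferes linearly with the SM one, so each Monte Carlo event can be carried to any value of the Wilson coefficient by a multiplicative weight. All names here (`eft_weight`, `c6`, the toy energy spectrum) are illustrative assumptions.

```python
import numpy as np

def eft_weight(w_sm, E, c6, M):
    """Per-event weight carrying SM Monte Carlo to an EFT hypothesis.

    Illustrative assumption: one dimension-6 operator interfering
    linearly with the SM amplitude, so the weight is the ratio of
    squared matrix elements |1 + c6 (E/M)^2|^2 / |1|^2.
    """
    return w_sm * (1.0 + c6 * (E / M) ** 2) ** 2

# Usage: scan the Wilson coefficient without regenerating any events
rng = np.random.default_rng(0)
E = rng.uniform(0.2, 2.0, size=10_000)   # toy per-event energies (TeV)
w_sm = np.full_like(E, 1e-3)             # nominal SM event weights
for c6 in (0.0, 0.5, 1.0):
    print(f"c6={c6}: expected yield = {eft_weight(w_sm, E, c6, M=2.0).sum():.2f}")
```

Because the weight is an analytic function of `c6` (and, in the full method, of the nuisance coefficients as well), a single simulated sample suffices for the entire likelihood scan.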
💡 Research Summary
The paper addresses a long‑standing problem in LHC new‑physics searches that employ Effective Field Theory (EFT): the “validity issue”, i.e. the fact that EFT predictions become unreliable when the characteristic energy of a process approaches the cutoff scale M of the underlying UV theory. In many analyses this problem is either ignored or mitigated by imposing hard cuts on an observable assumed to correlate with the partonic center‑of‑mass energy (“data clipping”). Such cuts are ad hoc and process dependent, and they can lead to overly aggressive limits that are not representative of any realistic UV completion.
The authors propose a systematic, statistically well‑defined framework that incorporates EFT uncertainties directly into the likelihood used for inference. The method consists of two complementary ingredients:
- Below the cutoff (E < M) – The truncated EFT (typically containing dimension‑6 operators) is supplemented with higher‑dimensional operators (dimension‑8, ‑10, …). Their Wilson coefficients are treated as nuisance parameters θ_i. A prior probability density for each θ_i is constructed from dimensional analysis and naturalness arguments, typically a Gaussian with zero mean and unit variance (or a Laplace distribution). This encodes the expectation that the omitted terms are of order (E/M)^n with O(1) prefactors (see the first sketch after this list).
- Above the cutoff (E ≥ M) – Since the EFT no longer provides a predictive expansion, the authors replace the naive EFT amplitude by a flexible parametrization that damps the unphysical energy growth: a form factor F(E; M, α) multiplying the EFT contribution, so that the analysis does not rely on EFT predictions in the regime where the expansion has broken down (see the second sketch after this list).
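A minimal sketch of the below‑cutoff ingredient, under our illustrative assumption that the neglected operators contribute an even tower of (E/M) powers on top of the dimension‑6 term; the function names and the exact parametrization are ours, not the paper's.

```python
import numpy as np
from scipy.stats import norm

def eft_deformation(E, c6, thetas, M):
    """Relative amplitude deformation below the cutoff (E < M).

    c6     : dimension-6 coefficient, the parameter of interest
    thetas : nuisance coefficients standing in for dim-8, dim-10, ...
    Each successive term carries one more power of (E/M)^2, the
    dimensional-analysis scaling the priors are built around.
    """
    d = c6 * (E / M) ** 2
    for n, theta in enumerate(thetas, start=2):
        d += theta * (E / M) ** (2 * n)
    return d

def log_prior(thetas):
    """Zero-mean, unit-variance Gaussian prior for each nuisance,
    encoding the naturalness expectation of O(1) coefficients."""
    return float(norm.logpdf(np.asarray(thetas, dtype=float)).sum())
```

Since each extra term is suppressed by a further factor of (E/M)^2 below the cutoff, truncating the tower after a few θ_i barely moves the inference, consistent with the quick convergence in the number of nuisance parameters reported in the abstract.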
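The summary breaks off before giving the explicit form of F(E; M, α), so the stand‑in below is purely hypothetical: any function that tends to 1 well below the cutoff and damps the EFT contribution above it would illustrate the same qualitative behaviour.

```python
def form_factor(E, M, alpha):
    """Hypothetical damping factor multiplying the EFT deformation.

    Not the paper's F(E; M, alpha): a dipole-like stand-in with
    F -> 1 for E << M and F ~ (M/E)^(2*alpha) for E >> M, taming
    the energy growth where the EFT expansion has broken down.
    """
    return 1.0 / (1.0 + (E / M) ** 2) ** alpha
```

In the spirit of the method, α could itself be treated as an additional nuisance parameter, expressing ignorance about how abruptly the EFT description fails above M.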