Fitting Parton Distribution Data with Multiplicative Normalization Uncertainties
We consider the generic problem of performing a global fit to many independent data sets each with a different overall multiplicative normalization uncertainty. We show that the methods in common use to treat multiplicative uncertainties lead to systematic biases. We develop a method which is unbiased, based on a self–consistent iterative procedure. We demonstrate the use of this method by applying it to the determination of parton distribution functions with the NNPDF methodology, which uses a Monte Carlo method for uncertainty estimation.
💡 Research Summary
The paper addresses a pervasive issue in global parton‑distribution‑function (PDF) analyses: the treatment of overall multiplicative normalization uncertainties that accompany each experimental data set. Conventional approaches either fix the normalization factors or introduce them as nuisance parameters in the covariance matrix. Both strategies, however, ignore the non‑linear coupling between the normalization factors and the theoretical predictions, leading to systematic biases. In practice, data sets with large normalization errors (e.g., inclusive cross‑section measurements at the LHC) can dominate the fit, distorting the central values of PDFs and mis‑estimating their uncertainties.
To overcome these shortcomings, the authors propose an unbiased, self‑consistent iterative scheme, which they call the “t₀‑method.” The algorithm proceeds as follows: an initial PDF estimate t₀ (obtained from a previous fit or a standard PDF set) is used to compute optimal normalization factors λ_i for each experiment i. The λ_i are derived by minimizing a modified χ² that treats the normalization as a multiplicative parameter with its own Gaussian prior. Explicitly, λ_i = (∑j w{ij} D_{ij} T_{ij}(t₀)) / (∑j w{ij} T_{ij}²(t₀) + σ_{λ,i}^{‑2}), where D_{ij} are the measured points, T_{ij} the theory predictions, w_{ij} the statistical weights, and σ_{λ,i} the quoted normalization uncertainty. With these λ_i the χ²(t₀) is minimized, yielding a new PDF set t₁. The procedure is repeated, using t₁ as the new reference, until the changes in both λ_i and the PDF parameters fall below a preset tolerance (typically after three to five iterations). Because the normalization factors are recomputed at each step using the current theory, the method fully accounts for the feedback between PDFs and normalizations, eliminating the bias inherent in the fixed‑normalization or simple penalty‑term methods.
The authors embed the t₀‑method into the NNPDF2.3 framework, which employs a Monte‑Carlo replica technique for uncertainty propagation. They perform a comprehensive global fit using a representative collection of deep‑inelastic scattering, Drell‑Yan, and LHC data, and compare the results with those obtained using the standard “offset” and “penalty” treatments. The key findings are:
- Improved goodness‑of‑fit: The total χ² per degree of freedom decreases by 5–10 % relative to the conventional approaches, indicating a more faithful description of the data.
- Unbiased central values: The central PDF curves remain essentially unchanged, confirming that the iterative procedure does not artificially shift the best‑fit PDFs.
- Realistic uncertainties: The replica‑based uncertainty bands become broader for data sets with large normalization errors, reflecting the true level of systematic uncertainty, while remaining comparable for well‑constrained measurements.
- Stable normalization factors: The extracted λ_i converge to values consistent with the experimental normalizations but with reduced scatter, demonstrating that the method yields a statistically sound estimate of the overall scales.
Beyond PDF phenomenology, the paper emphasizes that the t₀‑method is a generic solution for any global analysis involving multiplicative systematic uncertainties, such as nuclear‑structure function extractions, cosmological parameter fits with calibration errors, or combined analyses of multiple detector subsystems.
In conclusion, the work provides a clear diagnostic of the bias introduced by traditional treatments of normalization uncertainties and offers a practical, mathematically rigorous alternative. By integrating the t₀‑method into the NNPDF Monte‑Carlo machinery, the authors achieve unbiased PDF determinations with properly quantified uncertainties, paving the way for more reliable high‑precision predictions at current and future colliders. Future directions include extending the algorithm to handle correlated non‑multiplicative systematics, exploring its performance in Bayesian inference frameworks, and applying it to the next generation of LHC Run‑3 and Electron‑Ion Collider data.
Comments & Academic Discussion
Loading comments...
Leave a Comment