Off-line data processing and analysis for the GERDA experiment


GERDA is an experiment designed to search for the neutrinoless double beta decay of Ge-76. The experiment uses an array of high-purity germanium detectors, enriched in Ge-76, directly immersed in liquid argon. GERDA is presently operating eight enriched coaxial detectors (approximately 15 kg of Ge-76), and about 30 new custom-made enriched BEGe detectors (an additional 20 kg of Ge-76) will be deployed in the next phase. The paper describes the GERDA off-line analysis of the high-purity germanium detector data. First we present the signal-processing flow, focusing on the digital filters and the algorithms used. Second we discuss the rejection of non-physical events and the data-quality monitoring. The analysis is performed entirely within the GERDA software framework (GELATIO), designed to support multi-channel processing and modular analysis of digital signals.


💡 Research Summary

The GERDA (Germanium Detector Array) experiment searches for neutrinoless double‑beta decay of ⁷⁶Ge using an array of high‑purity germanium (HPGe) detectors that are directly immersed in liquid argon. In Phase I the experiment operates eight enriched coaxial detectors (≈15 kg of ⁷⁶Ge) and plans to add about thirty custom‑made BEGe detectors (≈20 kg) for Phase II. Because the detectors are “bare” and sit in a cryogenic liquid, the intrinsic background is already strongly reduced, but further background suppression relies on sophisticated off‑line analysis of the digitized detector signals.

This paper describes the complete off‑line data‑processing chain implemented in the GERDA software framework GELATIO. The framework is written in C++, built on the MGDO library, and follows a modular architecture: each processing step is encapsulated in a dedicated class (module) and modules can be linked into user‑defined chains. The raw data are produced by 14‑bit flash‑ADCs (FADCs) that sample each detector pulse at two rates simultaneously: a high‑frequency short (HFS) trace at 100 MHz, 4 µs long, and a low‑frequency long (LFL) trace at 25 MHz, 160 µs long. The HFS trace contains the leading edge and is intended for pulse‑shape discrimination; the LFL trace contains the full charge collection and is used for energy reconstruction.
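The dual-trace readout can be pictured with a minimal event container; the class and field names below are illustrative (not GELATIO's), but the sample counts follow directly from the stated sampling rates and trace lengths:

```python
from dataclasses import dataclass
from typing import List

# Sample counts implied by the paper's readout parameters:
#   HFS trace: 100 MHz for 4 us   -> 400 samples
#   LFL trace:  25 MHz for 160 us -> 4000 samples
HFS_SAMPLES = int(100e6 * 4e-6)    # 400
LFL_SAMPLES = int(25e6 * 160e-6)   # 4000

@dataclass
class DualTraceEvent:
    """Hypothetical container for one digitized event (names are ours)."""
    hfs: List[int]  # short, fast trace: leading edge, pulse-shape analysis
    lfl: List[int]  # long, slow trace: full charge collection, energy
```

This makes the division of labor explicit: the 400-sample HFS trace feeds Chain 2 (pulse-shape work), while the 4000-sample LFL trace feeds Chain 1 (energy reconstruction).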

Two parallel processing chains are defined (see Fig. 1 of the paper). Chain 1 processes the LFL trace. The first module, GEMDBaseline, evaluates the pre‑trigger baseline (average, RMS, linear slope) and subtracts the average to restore a zero baseline. GEMDTrigger then applies a leading‑edge discriminator with a dynamic threshold set to three times the baseline RMS. To qualify as a trigger, the signal must stay above threshold for at least 40 µs, which suppresses spurious noise spikes while preserving true events. A complementary module, GEMDFTTrigger, is optimized for identifying multiple pulses within a single trace. It differentiates the signal with a 1.5 µs moving‑difference filter, smooths it with a 1 µs moving average, and then searches for peaks using a threshold of four times the baseline RMS. The peak width matches the differentiation window, maximizing pile‑up detection efficiency while avoiding false identification of multi‑site events.
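The baseline restoration and leading-edge trigger can be sketched in a few lines; this is a simplified stand-in for GEMDBaseline and GEMDTrigger (function names and the synthetic-noise test are ours), keeping only the two ideas stated above: subtract the pre-trigger average, and trigger on the first sample above a dynamic threshold of k times the baseline RMS:

```python
import statistics

def baseline_restore(trace, n_pretrigger):
    """Estimate baseline average and RMS from the pre-trigger samples and
    subtract the average, restoring a zero baseline (per GEMDBaseline)."""
    pre = trace[:n_pretrigger]
    avg = statistics.fmean(pre)
    rms = statistics.pstdev(pre)
    return [s - avg for s in trace], avg, rms

def leading_edge_trigger(trace, rms, k=3.0):
    """Leading-edge discriminator with dynamic threshold k * baseline RMS
    (the paper uses k = 3); returns the index of the first sample above
    threshold, or None if the trace never crosses it."""
    threshold = k * rms
    for i, sample in enumerate(trace):
        if sample > threshold:
            return i
    return None
```

The paper's additional requirement that the signal stay above threshold for a minimum duration (to suppress noise spikes), and the GEMDFTTrigger pile-up search (moving-difference derivative, smoothing, 4×RMS peak search), are omitted here for brevity.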

Energy reconstruction is performed by GEMDEnergyGauss. The algorithm approximates a Gaussian shaping filter by first differentiating the pulse and then applying a moving‑average integrator fifteen times. The moving‑average window is 10 µs, chosen to mitigate ballistic deficit. The maximum amplitude of the resulting quasi‑Gaussian pulse is taken as the reconstructed energy. GEMDRiseTime computes the 10 %–90 % rise time by locating the points on the leading edge that correspond to those fractions of the full amplitude (baseline‑corrected).
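The quasi-Gaussian shaping can be reproduced with elementary operations: differentiate once, then apply a boxcar (moving-average) integrator repeatedly. This sketch follows the GEMDEnergyGauss description above but is not its implementation; note that the shaped maximum is *proportional* to the step amplitude (up to a window-dependent normalization), which is all an energy estimator needs before calibration:

```python
def differentiate(x):
    """First difference of the trace (one sample shorter than the input)."""
    return [b - a for a, b in zip(x, x[1:])]

def moving_average(x, window):
    """Boxcar average with a running sum; early samples use shorter windows."""
    out, acc = [], 0.0
    for i, v in enumerate(x):
        acc += v
        if i >= window:
            acc -= x[i - window]
        out.append(acc / min(i + 1, window))
    return out

def gauss_energy(trace, window, n_passes=15):
    """Approximate Gaussian shaping: differentiate, then integrate with a
    moving average n_passes times (the paper uses 15 passes of a 10 us
    window); the maximum of the shaped pulse is the energy estimate."""
    shaped = differentiate(trace)
    for _ in range(n_passes):
        shaped = moving_average(shaped, window)
    return max(shaped)
```

Because every step is linear, doubling the pulse amplitude doubles the estimate, which is the property the test below checks; the long averaging window plays the role the paper assigns to it, reducing sensitivity to ballistic deficit.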

Chain 2 processes the HFS trace. At present it contains a single module, GEMDCurrentPSA, which computes the derivative of the signal to obtain a current pulse and extracts basic features (rise time, width, area) of the current peak. This module will be expanded in future work to implement more advanced pulse‑shape discrimination (PSD) techniques that exploit the fast component of the signal.
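A minimal sketch of what such a module computes: the current pulse is the numerical derivative of the charge signal, from which simple peak features follow. The feature definitions below (peak amplitude, peak position, full width at half maximum in samples) are illustrative choices, not GEMDCurrentPSA's exact outputs:

```python
def current_pulse(charge_trace, dt):
    """Numerical derivative of the charge signal: i(t) ~ dQ/dt."""
    return [(b - a) / dt for a, b in zip(charge_trace, charge_trace[1:])]

def current_features(current):
    """Basic features of the current peak: maximum amplitude, its sample
    index, and the full width at half maximum in samples."""
    peak = max(current)
    peak_index = current.index(peak)
    above_half = [i for i, v in enumerate(current) if v >= peak / 2.0]
    width = (above_half[-1] - above_half[0] + 1) if above_half else 0
    return peak, peak_index, width
```

For a charge trace that ramps linearly and then flattens, the current pulse is a rectangle whose height and width are recovered exactly, as the test below verifies.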

Beyond the basic reconstruction, the paper details a systematic procedure for rejecting non‑physical or badly processed events. Four criteria are applied: (i) consistency between the trigger position (from GEMDTrigger) and the time of the Gaussian maximum (maxAmpTime from GEMDEnergyGauss); (ii) a reasonable 10 %–90 % rise time; (iii) absence of saturation (samples exceeding the ADC dynamic range); and (iv) verification that the baseline slope is flat. Events that fail any of these cuts are flagged as non‑physical (e.g., discharge, cross‑talk, pickup noise). A second class of problematic events consists of pile‑up or accidental coincidences, identified by an abnormal baseline slope, multiple peaks detected by GEMDFTTrigger, or a trigger position far from the centre of the trace. In calibration runs these pile‑up events can reach up to 15 % of the total, whereas they are negligible in physics data.
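The four rejection criteria translate naturally into a boolean filter. The sketch below mirrors criteria (i)-(iv); the 14-bit full scale comes from the FADC description, but every numeric threshold (time tolerance, rise-time window, slope tolerance) is a placeholder, not the paper's tuned value:

```python
ADC_FULL_SCALE = 2**14 - 1  # 14-bit FADC dynamic range

def passes_quality_cuts(trigger_pos, max_amp_time, rise_time, trace,
                        baseline_slope, *, time_tol=100,
                        rise_lo=10, rise_hi=1000, slope_tol=0.01):
    """Apply the four non-physical-event cuts; thresholds are illustrative."""
    if abs(trigger_pos - max_amp_time) > time_tol:  # (i) trigger vs. Gaussian maximum
        return False
    if not (rise_lo <= rise_time <= rise_hi):       # (ii) plausible 10%-90% rise time
        return False
    if max(trace) >= ADC_FULL_SCALE:                # (iii) no ADC saturation
        return False
    if abs(baseline_slope) > slope_tol:             # (iv) flat baseline slope
        return False
    return True
```

An event failing any single cut is flagged, matching the paper's logic of tagging discharges, cross-talk, and pickup noise without touching well-reconstructed pulses.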

Data‑quality monitoring is achieved by continuously tracking the baseline average and RMS for each detector. Figure 6 of the paper shows these parameters over a ten‑day commissioning run; they remain stable except for a few hours of hardware interventions, during which the affected data are removed by applying cuts on the baseline parameters.

The impact of the selection cuts is illustrated with a ²²⁸Th calibration spectrum (Fig. 7). After applying the cuts, approximately 15 % of the total events (≈10 % of the γ‑line counts) are removed, leading to sharper γ‑ray peaks and a reduced low‑energy tail caused by pile‑up. The resulting spectra are well described by standard analytical peak models, confirming the effectiveness of the processing chain.

In conclusion, the GELATIO framework provides a robust, modular, and fully configurable environment for the off‑line analysis of GERDA HPGe detector data. The reference pipeline described in the paper has been validated during the commissioning phase, showing stable performance and readiness for Phase I physics data. The combination of optimized digital filters, precise trigger algorithms, and comprehensive data‑quality monitoring ensures that GERDA can achieve the low background levels required for a competitive search for neutrinoless double‑beta decay.

