Delay Time Characterization on FPGA: A Low Nonlinearity, Picosecond Resolution Time-to-Digital Converter on 16-nm FPGA using Bin Sequence Calibration
Reading time: 2 minutes
📝 Original Info
- Title: Delay Time Characterization on FPGA: A Low Nonlinearity, Picosecond Resolution Time-to-Digital Converter on 16-nm FPGA using Bin Sequence Calibration
- ArXiv ID: 2511.05583
- Date: 2025-11-05
- Authors: Not specified (author information not provided in the paper)
📝 Abstract
We present a Time-to-Digital Converter (TDC) implemented on a 16 nm Xilinx UltraScale+ FPGA that achieves a resolution of 1.15 ps, RMS precision of 3.38 ps, a differential nonlinearity (DNL) of [-0.43, 0.24] LSB, and an integral nonlinearity (INL) of [-2.67, 0.15] LSB. This work introduces two novel hardware-independent post-processing techniques - Partial Order Reconstruction (POR) and Iterative Time-bin Interleaving (ITI) - that significantly enhance the performance of FPGA-based TDCs. POR addresses the missing code problem by inferring the partial order of each time bin through code density test data and directed acyclic graph (DAG) analysis, enabling near-complete recovery of usable bins. ITI further improves fine time resolution by merging multiple calibrated tapped delay lines (TDLs) into a single unified delay chain, achieving scalable resolution without resorting to averaging. Compared to state-of-the-art FPGA-based TDC architectures, the proposed methods deliver competitive or superior performance with reduced hardware overhead. These techniques are broadly applicable to high-resolution time measurement and precise delay calibration in programmable logic platforms.
💡 Deep Analysis
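Both techniques in the abstract start from code density test data: POR infers bin ordering from it, and ITI merges delay lines calibrated with it. As a rough illustration of that shared starting point, the sketch below runs a generic code density test in Python on a simulated tapped delay line. It is not the paper's POR or ITI algorithm; the bin count, clock period, width distribution, and forced missing bins are arbitrary assumptions chosen only to show how hit counts translate into estimated bin widths, DNL/INL, and missing codes.

```python
import numpy as np

# Sketch of a code density test on a simulated tapped delay line (TDL).
# All parameters below (bin count, clock period, width distribution,
# number of forced zero-width bins) are hypothetical illustration values.
rng = np.random.default_rng(0)
CLOCK_PERIOD_PS = 2000.0      # simulated coarse clock period (500 MHz)
N_BINS = 128                  # simulated number of TDL taps
N_HITS = 200_000              # uniformly distributed asynchronous hits

# Random positive bin widths; force a few zero-width bins to mimic missing codes.
widths = rng.gamma(shape=1.0, scale=1.0, size=N_BINS)
widths[rng.choice(N_BINS, size=4, replace=False)] = 0.0
widths *= CLOCK_PERIOD_PS / widths.sum()
edges = np.concatenate(([0.0], np.cumsum(widths)))

# Each hit lands in the bin whose edges bracket its (uniform) arrival phase.
hit_times = rng.uniform(0.0, edges[-1], N_HITS)
codes = np.clip(np.searchsorted(edges, hit_times, side="right") - 1, 0, N_BINS - 1)

# Code density test: a bin's estimated width is proportional to its hit count.
counts = np.bincount(codes, minlength=N_BINS)
est_widths = counts / counts.sum() * CLOCK_PERIOD_PS
est_centers = np.cumsum(est_widths) - est_widths / 2.0   # per-code time stamps (ps)

# DNL/INL in LSB relative to the ideal uniform bin width.
lsb = CLOCK_PERIOD_PS / N_BINS
dnl = est_widths / lsb - 1.0
inl = np.cumsum(dnl)

print(f"DNL range: [{dnl.min():.2f}, {dnl.max():.2f}] LSB")
print(f"INL range: [{inl.min():.2f}, {inl.max():.2f}] LSB")
print(f"Mean width of populated bins: {est_widths[counts > 0].mean():.2f} ps")
print(f"Missing codes (zero hits): {int(np.sum(counts == 0))}")
```

In hardware, the codes would come from the FPGA TDL with hits asynchronous to the system clock, and the histogram-derived bin widths would serve as the per-code calibration table; the DNL/INL quantities printed here are the standard LSB-relative metrics to which the abstract's reported [-0.43, 0.24] LSB DNL and [-2.67, 0.15] LSB INL presumably correspond.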
📄 Full Content
Reference
This content is AI-processed based on open access ArXiv data.