Strong lensing cosmography using binary-black-hole mergers: Prospects for the near future
A small fraction of gravitational-wave (GW) signals from binary black holes (BBHs) will be gravitationally lensed by intervening galaxies and galaxy clusters. Strong lensing will produce multiple identical copies of the GW signal arriving at different times. Jana et al.~\cite{Jana_2023} recently proposed a method to constrain cosmological parameters using strongly lensed GW events detected by next-generation (XG) detectors. The idea is that the number of strongly lensed GW events and the distribution of their lensing time delays encode imprints of the cosmological parameters. From the observed number of lensed GW events (tens of thousands) and their time delay distribution, this method can provide a new probe of cosmology, obtaining information at intermediate redshifts. In this work, we explore the possibility of doing lensing cosmography using upcoming observations of the upgraded LIGO-Virgo-KAGRA (LVK) network. This requires incorporating the detector network selection effects in the analysis, which was neglected earlier. We expect dozens of lensed GW events to be detected by upgraded LVK detectors, potentially enabling modest constraints on cosmological parameters. Even with relatively modest numbers of lensed detections, we demonstrate the potential of lensing cosmography. For XG detectors, our revised forecasts are consistent with the earlier forecasts that neglected the selection effects.
💡 Research Summary
This paper investigates the feasibility of using strongly‑lensed binary‑black‑hole (BBH) gravitational‑wave (GW) events as a cosmological probe with both the upcoming upgraded LIGO‑Virgo‑KAGRA (LVK) network and future third‑generation (X‑ray) detectors such as Cosmic Explorer (CE) and the Einstein Telescope (ET). Building on the method proposed by Jana et al. (2023), which showed that the number of lensed GW detections and the distribution of their time‑delay measurements encode information about the Hubble constant (H₀), matter density (Ωₘ), and the clustering amplitude (σ₈), the authors incorporate realistic detector selection effects that were previously neglected.
The analysis proceeds in several steps. First, the intrinsic BBH merger rate density as a function of redshift is modeled using two astrophysical prescriptions: the “Dominik” population‑synthesis model and a Madau‑Dickinson (MD) star‑formation‑rate model without delay. Both are calibrated to low‑redshift merger‑rate constraints from GWTC‑3. Second, the sensitivity curves, duty cycles, and signal‑to‑noise‑ratio (SNR) thresholds for each observing run (O4, O5, O6, Voyager, and X‑ray configurations) are used to construct a redshift‑dependent selection function S(z). This function quantifies the probability that a BBH at redshift z will be detected given its intrinsic parameters and the network’s instantaneous antenna pattern.
The expected number of detectable BBH events Λ is obtained by integrating the product of the intrinsic rate, the comoving volume element, the time‑dilation factor, and S(z) over redshift. Strong‑lensing probabilities are then applied, using lens‑galaxy mass functions and cross‑sections, to estimate the number of lensed events Nₗ for each scenario. The authors find that the upgraded LVK network (particularly O5 and O6) should yield of order 10–30 strongly‑lensed BBH detections, while third‑generation detectors are expected to observe 10⁴–10⁵ such events.
For cosmological inference, a Bayesian framework is adopted. The likelihood combines a Poisson term for the total number of lensed detections and a product of individual time‑delay likelihoods, the latter derived from simulated delay distributions that depend on the lens model (typically singular isothermal sphere or NFW) and the cosmological parameters. Assuming negligible measurement error on the delays (they are measured to millisecond precision), the posterior over θ = {H₀, Ωₘ, σ₈} is sampled via Markov Chain Monte Carlo.
Results indicate that with ~20 lensed events from the upgraded LVK network, H₀ can be constrained to roughly 5–7 km s⁻¹ Mpc⁻¹ (≈8 % precision), Ωₘ to ~0.03, and σ₈ to ~0.08 (1σ). Although modest compared to current standard‑siren or CMB constraints, these measurements probe an intermediate redshift range (z ≈ 1–3) that is largely inaccessible to other probes, offering an independent check on the Hubble tension. In the X‑ray era, the vastly larger sample reduces uncertainties dramatically: H₀ could be measured to 1–2 km s⁻¹ Mpc⁻¹, Ωₘ to <0.01, and σ₈ to <0.02, rivaling or surpassing Planck precision.
The paper also discusses systematic uncertainties: (1) imperfect knowledge of the BBH mass‑spin distribution, (2) uncertainties in the lens‑galaxy mass profile and external convergence, and (3) contamination from mis‑identified lensed pairs (false positives). The authors propose mitigating strategies, including joint electromagnetic follow‑up, improved population modeling using the huge unlensed BBH catalog, and Bayesian model selection to down‑weight suspect events.
Figures illustrate the predicted detection rates across observing runs, the selection functions as a function of redshift, and comparative constraints on H₀ from lensing cosmography versus standard sirens, BAO, and supernovae. The authors conclude that even with the modest number of lensed events expected from near‑future LVK upgrades, meaningful cosmological information can be extracted, and that third‑generation detectors will turn GW lensing into a competitive, high‑precision cosmological tool. Future work will focus on refining lens‑population models, automating lensed‑event identification, and integrating multi‑messenger data to control systematics.
Comments & Academic Discussion
Loading comments...
Leave a Comment