Turbulence induced additional deceleration in relativistic shock wave propagation: implications for gamma-ray bursts

The late afterglow of a gamma-ray burst is believed to arise from the progressive deceleration of the forward shock wave driven by the burst ejecta propagating into the interstellar medium. We study the dynamical effect of interstellar turbulence on shock wave propagation. It is shown that the shock decelerates more quickly than in the standard treatment, which neglects turbulence. As an observational consequence, an earlier jet break appears in the light curve of the forward shock emission. The scatter of the jet-corrected energy release of gamma-ray bursts, inferred from the jet break, may be partly due to physical uncertainties in the turbulence/shock wave interaction. The same uncertainties arise in two-shell collisions in the well-known internal shock model proposed for gamma-ray burst prompt emission. The large scatter of the known luminosity relations of gamma-ray bursts may therefore be intrinsic, in which case gamma-ray bursts are not good standard candles. We also discuss other implications.


💡 Research Summary

The paper revisits the standard picture of gamma‑ray burst (GRB) afterglows, in which a relativistic forward shock decelerates in a uniform interstellar medium according to the Blandford–McKee (BM) solution, and a jet break appears when the beaming angle 1/γ exceeds the jet opening angle θ_0. The authors argue that interstellar turbulence, which is ubiquitous in galactic environments, modifies the shock dynamics in a fundamental way. By adding a turbulence‑induced energy‑loss term to the shock's energy‑conservation equation, they derive a modified deceleration law γ∝t^{-g} with g = 3/2 + Δg(ε, δ). Here ε quantifies the turbulent energy density relative to the ambient medium, and δ measures the efficiency of turbulent‑to‑thermal energy conversion at the shock front. Because g exceeds the canonical 3/2, the Lorentz factor drops more rapidly, so the jet‑break condition γ = 1/θ_0 is met at an earlier observer time: in observer time γ∝t_obs^{-g/(1+2g)}, giving t_j∝θ_0^{(1+2g)/g}, which reduces to the familiar θ_0^{8/3} scaling when g = 3/2. This provides a natural explanation for the "early jet breaks" sometimes observed in afterglow light curves.
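The effect of a steeper deceleration index on the jet-break epoch can be sketched numerically. This is a toy normalization, not the paper's actual calculation: the values of gamma0, t0, theta0, and the sample g = 1.8 are illustrative assumptions, and the mapping to observer time assumes the simple scaling t_obs ∝ t/γ².

```python
# Toy sketch (not the paper's calculation) of how a steeper deceleration
# index g shifts the jet break earlier. Assumes the lab-frame law
# gamma ∝ t^{-g} maps to observer time as gamma ∝ t_obs^{-g/(1+2g)}.

def gamma_obs(t_obs, g, gamma0=300.0, t0=1.0):
    """Observer-frame Lorentz factor, normalized so gamma(t0) = gamma0."""
    return gamma0 * (t_obs / t0) ** (-g / (1.0 + 2.0 * g))

def jet_break_time(theta0, g, gamma0=300.0, t0=1.0):
    """Observer time at which gamma falls to 1/theta0 (the jet break)."""
    return t0 * (gamma0 * theta0) ** ((1.0 + 2.0 * g) / g)

theta0 = 0.1                            # jet opening angle (rad), illustrative
t_bm = jet_break_time(theta0, g=1.5)    # canonical BM case: t_j ∝ theta0^{8/3}
t_turb = jet_break_time(theta0, g=1.8)  # turbulence-steepened g (hypothetical)
print(t_turb / t_bm)                    # ratio < 1: the break occurs earlier
```

With these (arbitrary) normalizations, raising g from 1.5 to 1.8 pulls the break to roughly two-thirds of the canonical time, illustrating why a modest turbulent correction to g is enough to produce a noticeably early jet break.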

The authors extend the turbulence argument to the internal‑shock model of the prompt emission. When two relativistic shells collide in a turbulent environment, the fluctuating turbulent pressure alters the collision efficiency and the resulting peak energy E_p. Consequently, the empirical luminosity–energy relations (e.g., Amati, Yonetoku) acquire an intrinsic scatter that is not merely observational but rooted in physical variability of the turbulence–shock interaction. This intrinsic dispersion undermines the use of GRBs as standard candles, because the jet‑corrected energy E_γ and radiative efficiency η become strongly dependent on the poorly constrained turbulence parameters.
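The sensitivity the authors invoke can be illustrated with the standard (non-turbulent) two-shell internal-shock efficiency formula; the function names and the sample masses and Lorentz factors below are illustrative, not taken from the paper.

```python
import math

# Minimal sketch of the standard two-shell internal-shock efficiency
# (not the paper's turbulence-modified version). A slow shell (m1, G1)
# is caught by a fast one (m2, G2); in the ultrarelativistic limit,
# momentum and energy conservation give the merged Lorentz factor, and
# the bulk kinetic energy lost in the merger sets the efficiency.

def merged_lorentz(m1, G1, m2, G2):
    """Merged Lorentz factor of an inelastic two-shell collision (G >> 1)."""
    return math.sqrt((m1 * G1 + m2 * G2) / (m1 / G1 + m2 / G2))

def collision_efficiency(m1, G1, m2, G2):
    """Fraction of bulk kinetic energy converted to internal energy."""
    Gm = merged_lorentz(m1, G1, m2, G2)
    return 1.0 - (m1 + m2) * Gm / (m1 * G1 + m2 * G2)

# Equal masses, Lorentz-factor contrast of 10: efficiency ~ 43 percent.
print(collision_efficiency(1.0, 100.0, 1.0, 1000.0))
# Equal Lorentz factors: no collision energy is dissipated.
print(collision_efficiency(1.0, 300.0, 1.0, 300.0))
```

Because the efficiency depends steeply on the Lorentz-factor contrast, turbulent perturbations to the shell dynamics translate directly into burst-to-burst variations in dissipated energy and E_p, which is the mechanism behind the intrinsic scatter argued for above.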

Observational implications are threefold. First, an earlier jet break leads to systematic underestimates of the true jet‑corrected energy if one assumes the standard BM dynamics. Second, the large scatter in luminosity‑time and luminosity‑spectral correlations can be interpreted as a signature of variable turbulent strength across bursts, rather than measurement error alone. Third, any attempt to calibrate GRBs for cosmology must incorporate a model for turbulence‑induced deceleration, or else risk biased distance estimates.

The paper concludes by urging high‑resolution numerical simulations that explicitly resolve turbulent eddies interacting with relativistic shocks, and by recommending multi‑wavelength monitoring of afterglows to identify the predicted early jet breaks. Incorporating turbulence into GRB dynamics, the authors claim, will refine our understanding of shock physics, improve energy budget estimates, and clarify why GRBs have so far resisted being turned into reliable standard candles.