Theoretical models suggest that the first stars in the universe could have been very massive, with typical masses $\gtrsim$ 100 \Msun. Many of them might have died as energetic thermonuclear explosions known as pair-instability supernovae (PSNe). We present multidimensional numerical simulations of PSNe with the new radiation-hydrodynamics code CASTRO. Our models capture all explosive burning and follow the explosion until the shock breaks out from the stellar surface. We find that fluid instabilities driven by oxygen and helium burning arise at the upper and lower boundaries of the oxygen shell $\sim$ 20 - 100 sec after the explosion begins. Later, when the shock reaches the hydrogen envelope, a strong reverse shock forms that rapidly develops additional Rayleigh-Taylor instabilities. In red supergiant progenitors, the amplitudes of these instabilities are sufficient to mix the supernova ejecta and alter its observational signature. Our results provide useful predictions for the detection of PSNe by forthcoming telescopes.
The evolution of the first stars in the universe is one of the frontiers of modern cosmology. Primordial stars synthesized the first heavy elements in the universe, and their energetic feedback influenced the formation of later generations of stars and the first galaxies (Whalen et al. 2008a,b; Greif et al. 2010). Early numerical models predicted that Pop III stars formed with masses of 100 - 1000 \Msun (Bromm et al. 2009; Abel et al. 2002). New studies have found that $\sim$ 20\% of Pop III stars form in binaries or multiples (Turk et al. 2009; Stacy et al. 2010), so the first stars could be less massive than originally thought. However, even today observations support the existence of stars with initial masses over 150 \Msun (Crowther et al. 2010). Stellar evolution models predict that Pop III stars with initial masses of 140 - 260 \Msun develop oxygen cores of 50 \Msun after central carbon burning (Heger & Woosley 2002). At this point the core reaches sufficiently high temperatures ($\sim 10^{9}$ K) and low densities ($\sim 10^{6}$ g/cc) that the creation of electron-positron pairs is favored. Radiation pressure support then quickly decreases, triggering a rapid contraction of the core. During contraction, core temperatures and densities rise sharply, and oxygen and silicon begin to burn explosively. The resulting thermonuclear explosion, known as a pair-instability supernova (PSN), reverses the contraction and completely unbinds the star, leaving no compact remnant and forming up to 50 \Msun of $^{56}$Ni. One possible PSN candidate, SN 2007bi, was recently reported by Gal-Yam et al. (2009).
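This behavior follows the standard stability criterion for self-gravitating stars (a textbook relation, not a result specific to this work): pair creation converts thermal energy into the rest mass of $e^{+}e^{-}$ pairs and lowers the adiabatic index, and once the pressure-weighted average falls below the critical value of 4/3 the core can no longer maintain hydrostatic equilibrium,
\[
  \langle \Gamma_1 \rangle \;\equiv\; \frac{\int \Gamma_1\, P \, dV}{\int P \, dV} \;<\; \frac{4}{3},
  \qquad
  \Gamma_1 = \left(\frac{\partial \ln P}{\partial \ln \rho}\right)_{S},
\]
where $P$ is the pressure and $S$ the specific entropy.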
Most current theoretical models of PSNe are based on one-dimensional calculations (Heger & Woosley 2002). However, in the initial stages of a supernova, spherical symmetry is broken by fluid instabilities generated by burning, which cannot be captured in 1D. Two-dimensional simulations of Pop III PSNe have recently been performed by Joggerst & Whalen (2011), in which only mild dynamical instabilities were found to form; however, those simulations proceeded from 1D KEPLER models in which explosive burning had already occurred and thus excluded instabilities driven by burning. Such instabilities, if they form, may alter the energetics and nucleosynthesis of the SN by vigorously mixing its fuel and must be included in simulations to understand the true evolution of PSNe. We have performed 2D simulations of Pop III PSNe that follow the star from the initial contraction of the core until most of the energy due to explosive burning has been released, in contrast to Joggerst & Whalen (2011), who only follow the post-nucleosynthesis hydrodynamics. Our goal is to study any fluid instabilities that arise and to determine how mixing alters the nucleosynthesis and energetics of the explosion.
We evolve zero-metallicity stars in KEPLER (Weaver et al. 1978), a one-dimensional Lagrangian stellar evolution code. In KEPLER we solve evolution equations for mass, momentum, and energy and include physics relevant to stellar evolution such as nuclear burning and artificial mixing. When the star comes to the end of central oxygen burning, we map its profile onto a 2D Cartesian grid in CASTRO. The procedure for mapping and seeding initial perturbations in these profiles is discussed in detail in Chen et al. (2011a). We evolve the star in CASTRO through the end of explosive burning.
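As an illustration of the mapping step, the sketch below shows one simple way a 1D radial profile could be transferred to a 2D grid; it is not the actual KEPLER-to-CASTRO remap, all names are hypothetical, and the conservative remapping and perturbation seeding described in Chen et al. (2011a) are omitted.
\begin{verbatim}
# Minimal sketch, not the actual KEPLER-to-CASTRO remap: each 2D cell is
# assigned the value of the 1D profile at the radius of its cell center.
# All names (map_profile_to_2d, etc.) are hypothetical.
import numpy as np

def map_profile_to_2d(r_1d, q_1d, x_edges, y_edges):
    """Interpolate a 1D radial profile q(r) onto a 2D grid."""
    # Cell-center coordinates of the 2D grid
    xc = 0.5 * (x_edges[:-1] + x_edges[1:])
    yc = 0.5 * (y_edges[:-1] + y_edges[1:])
    X, Y = np.meshgrid(xc, yc, indexing="ij")
    radius = np.sqrt(X**2 + Y**2)
    # Linear interpolation in radius; cells beyond the stellar surface
    # keep the outermost value of the profile.
    return np.interp(radius, r_1d, q_1d)
\end{verbatim}
In practice the remap must also preserve integrated quantities such as total mass and internal energy, which pointwise interpolation alone does not guarantee; the sketch only illustrates the geometry of the mapping.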
CASTRO (Almgren et al. 2010;Zhang et al. 2011) is a massively parallel, multidimensional Eulerian adaptive mesh refinement (AMR) radiation-hydrodynamics code for astrophysical applications. Its time integration of the hydrodynamics equations is based on a higher-order, unsplit Godunov scheme. Block-structured AMR with subcycling in time enables the use of high spatial resolution where it is most needed. We use the Helmholtz equation of state (EOS) (Timmes & Swesty 2000) with density, temperature, and species mass fractions as inputs. The gravitational field is calculated using a monopole approximation constructed from a radial average of the 2D density field on the grid.
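The monopole approximation amounts to binning the 2D density into spherical shells, accumulating the enclosed mass $M(<r)$, and evaluating $g(r) = -G\,M(<r)/r^{2}$ at every cell. The following sketch illustrates that procedure; it is our own simplified illustration with hypothetical names, not CASTRO's implementation.
\begin{verbatim}
# Minimal sketch of a monopole gravity approximation (illustrative only,
# not CASTRO's implementation): bin the 2D density into spherical shells,
# accumulate the enclosed mass M(<r), and evaluate g = -G M(<r) / r^2.
import numpy as np

G = 6.674e-8  # Newton's constant in cgs units

def monopole_gravity(rho, radius, volume, n_bins=256):
    """Radial gravitational acceleration at each 2D cell (cgs units)."""
    edges = np.linspace(0.0, radius.max(), n_bins + 1)
    shell = np.clip(np.digitize(radius.ravel(), edges) - 1, 0, n_bins - 1)
    # Mass contributed to each radial shell by the 2D cells inside it
    shell_mass = np.bincount(shell, weights=(rho * volume).ravel(),
                             minlength=n_bins)
    enclosed = np.cumsum(shell_mass)      # mass enclosed within each shell
    m_cell = enclosed[shell].reshape(radius.shape)
    # Avoid dividing by zero at the coordinate origin
    return -G * m_cell / np.maximum(radius, edges[1])**2
\end{verbatim}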
In Fig. 1 we show the formation of dynamical instabilities at the base of the oxygen burning shell during the contraction of the core (Chen et al. 2011b). They are relatively mild and do not penetrate the central $^{56}$Ni region, so no $^{56}$Ni is mixed into the upper layers of the star at this stage. After explosive burning reverses the contraction of the core, fluid instabilities driven by helium burning also appear in the outer layers of the oxygen shell. Minor mixing caused by these instabilities begins about 100 sec after the reversal of the contraction.
As we show in Fig. 2, when the shock propagates into the hydrogen envelope, the formation of a strong reverse shock creates additional Rayleigh-Taylor instabilities (RTI). Their amplitudes are sufficient to mix oxygen with the surrounding shells: H, He, and Si. Some mixing also occurs at the outer edge of the $^{56}$Ni core. Our results demonstrate that dynamical instabilities form at several stages of the explosion and that the resulting mixing can alter the observational signatures of PSNe.