A closer look at supernovae as seeds for galactic magnetization by Evangelia Ntormousi et al. on Thursday 24 November
Explaining the magnetic fields currently observed in galaxies requires
relatively strong seeding in the early Universe. One theory proposes that
magnetic fields of the order of $\mu$G were expelled by supernova (SN)
explosions after primordial fields of nG strength or weaker were amplified in
stellar interiors. In this work, we calculate the maximum magnetic energy that
can be injected into the interstellar medium by a stellar cluster of mass
$M_{\rm cl}$, based on what is currently known about stellar magnetism. We
consider early-type stars and adopt either a Salpeter or a top-heavy IMF. For
their magnetic fields, we adopt either a Gaussian or a bimodal distribution.
The Gaussian model assumes that all massive stars are magnetized with
$10^3 < \langle B_* \rangle < 10^4$ G, while the bimodal model, consistent
with observations of Milky Way stars, assumes that only 5-10 per cent of OB
stars have $10^3 < \langle B_* \rangle < 10^4$ G and the rest have
$10 < \langle B_* \rangle < 10^2$ G.
We find that the maximum magnetic energy that a stellar population can inject
is between $10^{-10}$ and $10^{-7}$ times the total SN energy. The upper end
of these estimates is about five orders of magnitude lower than what is
usually employed in cosmological simulations, where about $10^{-2}$ of the SN
energy is injected as magnetic energy. Pure advection of the stellar magnetic field by SN
explosions is a good candidate for seeding a dynamo, but not enough to
magnetize galaxies. If SNe are the main mechanism for galactic magnetization,
the magnetic field cannot exceed an intensity of $10^{-7}$ G in the best-case
scenario for a population of $10^{5}$ solar masses in a superbubble of 300 pc
radius, while more typical values are between $10^{-10}$ and $10^{-9}$ G.
Therefore, other scenarios for galactic magnetization at high redshift need to
be explored.
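For scale, the back-of-the-envelope sketch below spreads the injected magnetic energy uniformly through a superbubble of 300 pc radius and converts the energy density to a field strength via $u = B^2/8\pi$. The SN budget (one explosion per $\sim$100 solar masses formed) is again our own assumption, not a number from the paper.

```python
import numpy as np

PC_CM = 3.086e18      # parsec [cm]
E_SN_ERG = 1e51       # canonical SN energy [erg]

n_sn = int(1e5 / 100.0)          # ~1 SN per 100 M_sun formed (assumption)
e_sn_total = n_sn * E_SN_ERG     # total SN energy of the population

r_bubble = 300.0 * PC_CM
v_bubble = (4.0 / 3.0) * np.pi * r_bubble**3

# Bracket the magnetic-to-SN energy ratios found above.
for ratio in (1e-10, 1e-7):
    e_mag = ratio * e_sn_total
    # Uniform field filling the bubble: u = B^2 / 8pi  =>  B = sqrt(8 pi E / V)
    b_gauss = np.sqrt(8.0 * np.pi * e_mag / v_bubble)
    print(f"E_mag/E_SN = {ratio:.0e}  ->  B ~ {b_gauss:.1e} G")
```

With these inputs the two bracketing ratios give $B \sim 10^{-9}$ G and $\sim 3\times10^{-8}$ G, within an order of magnitude of the values quoted above; the quoted best case of $10^{-7}$ G requires more favorable assumptions than this toy version.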
arXiv: http://arxiv.org/abs/2211.12355v1