Environment and energy injection effects in GRB afterglows

June 1999
20 pages
Published in:
  • Astrophys.J. 537 (2000) 803
e-Print:

Abstract: (arXiv)
In a recent paper (Dai & Lu 1999), we proposed a simple model in which the steepening in the light curve of the R-band afterglow of gamma-ray burst (GRB) 990123 is caused by an adiabatic shock that has evolved from an ultrarelativistic to a nonrelativistic phase in a dense medium. We find that such a model is quite consistent with observations if the medium density is about $3\times 10^6\,{\rm cm}^{-3}$. Here we discuss this model in more detail. In particular, we investigate the effects of synchrotron self-absorption and energy injection. A shock in a dense medium becomes nonrelativistic rapidly after a short relativistic phase, and the afterglow from the shock decays more rapidly at the nonrelativistic stage than at the relativistic stage. Since some models for GRB energy sources predict that a strongly magnetized millisecond pulsar may be born during the formation of a GRB, we discuss the effect of such a pulsar on the evolution of the nonrelativistic shock through its magnetic dipole radiation. We find that once the energy the shock obtains from the pulsar greatly exceeds the initial energy of the shock, the afterglow decay flattens significantly; when the pulsar energy input ceases to be important, the decay steepens again. These features are in excellent agreement with the afterglows of GRB 980519, GRB 990510, and GRB 980326. Furthermore, our model fits all the observational data of GRB 980519 very well, including the last two detections.
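
For reference, the pulsar energy injection described in the abstract is commonly modeled with the standard magnetic dipole spin-down law. The following is a minimal sketch in that standard notation; the symbols $B_p$, $R$, $\Omega_0$, $I$, and the characteristic time $T_{\rm em}$ do not appear in the abstract and are assumed here:

$$
L_{\rm em}(t) = \frac{B_p^2 R^6 \Omega_0^4}{6c^3}\,\frac{1}{\left(1 + t/T_{\rm em}\right)^2},
\qquad
T_{\rm em} = \frac{3c^3 I}{B_p^2 R^6 \Omega_0^2},
$$

where $B_p$ is the surface dipole magnetic field, $R$ the neutron-star radius, $\Omega_0$ the initial angular frequency, and $I$ the moment of inertia. For $t \ll T_{\rm em}$ the injected luminosity is roughly constant, so the afterglow decay flattens once the energy supplied to the shock exceeds its initial energy; for $t \gg T_{\rm em}$ the luminosity falls off as $t^{-2}$, the injection becomes unimportant, and the decay steepens again, consistent with the behavior described above.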