Abstract
Gamma-ray bursts (GRBs) are among the most energetic explosions known in the Universe and harbor the most relativistic jets known, with initial expansion Lorentz factors of $100< \Gamma_i <1000$ \cite{KP91, Fenimore+93, WL95, LS01, ZLB11, Zou+11, Racusin+11}. Many of these objects show a plateau in their early X-ray light curves, lasting up to thousands of seconds \cite{Nousek+06, OBrien+06, Zhang+06, Liang+07, Srinivasaragavan+20}. During this phase, the X-ray flux decreases far more slowly than theoretically expected \cite{MR93}, which has puzzled the community for many years. Here, we show that the observed signal during this phase, in both the X-ray and the optical bands, is naturally obtained within the classical GRB “fireball” model, provided that (i) the initial Lorentz factor of the relativistically expanding jet is of the order of a few tens, rather than a few hundreds, as is often cited in the literature, and (ii) the expansion occurs into a low-density “wind” medium, with a density typically 3--4 orders of magnitude below that expected from a Wolf-Rayet star \cite{CL99}. Within this framework, the end of the “plateau” phase (the beginning of the regular afterglow) marks the transition from the coasting phase to the self-similar expansion phase, which follows the scaling laws first derived by Blandford \& McKee \cite{BM76}. This result therefore implies that long-GRB progenitors either (i) are not Wolf-Rayet stars, or (ii) eject a wind just prior to their final explosion whose properties are very different from those of the wind ejected at earlier times. This result shows that the range of Lorentz factors in GRB jets is much wider than previously thought, bridging an observational ‘gap’ between the mildly relativistic jets, $\Gamma_i\lesssim 20$, inferred in active galactic nuclei \cite{Ghisellini1993} and the much higher Lorentz factors, $\Gamma_i\lesssim 1000$, inferred in a few extreme GRBs \cite{Racusin+11}.
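The claim that a low Lorentz factor and a dilute wind extend the coasting (plateau) phase can be made quantitative with a standard order-of-magnitude sketch from the fireball model. The symbols below ($E$ for the isotropic-equivalent jet energy, $A$ for the wind density normalization) are not defined in the abstract and are introduced here only for illustration; numerical prefactors are schematic.

\[
\rho(r) = A\,r^{-2}
\quad\Longrightarrow\quad
M_{\rm sw}(r) = \int_0^{r} 4\pi r'^2 \rho(r')\,dr' = 4\pi A\,r .
\]
Deceleration (the end of coasting) sets in once the swept-up mass satisfies
$\Gamma_i^2 M_{\rm sw} c^2 \approx E$, giving
\[
r_{\rm dec} \approx \frac{E}{4\pi A\,\Gamma_i^2 c^2},
\qquad
t_{\rm dec} \approx \frac{r_{\rm dec}}{2\Gamma_i^2 c} \approx \frac{E}{8\pi A\,\Gamma_i^4 c^3}
\;\;\propto\;\; \frac{E}{A\,\Gamma_i^4}.
\]
The steep $\Gamma_i^{-4}$ dependence means that reducing $\Gamma_i$ from a few hundreds to a few tens lengthens the coasting phase by a factor $\sim 10^4$, and reducing $A$ by the quoted 3--4 orders of magnitude lengthens it further, naturally producing plateaus lasting thousands of seconds before the Blandford--McKee self-similar phase begins.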