The connection between Gamma-Ray Bursts (GRBs) and their afterglows is currently not well understood. Afterglow models of synchrotron emission generated by external shocks in the GRB fireball model predict emission detectable in the gamma-ray regime ($\gtrsim 25$ keV). In this paper, we present a temporal and spectral analysis of a subset of BATSE GRBs with smooth, extended emission tails to search for signatures of the “early high-energy afterglow”, i.e., afterglow emission that begins in the gamma-ray band and subsequently evolves into X-ray, UV, optical, and radio emission as the blast wave is decelerated by the ambient medium. From a sample of 40 GRBs, we find that the temporal decays are better described by a power law, $\sim t^{\beta}$, than by an exponential, with a mean index $\langle \beta \rangle \approx -2$. Spectral analysis shows that $\sim 20\%$ of these events are consistent with fast-cooling synchrotron emission from an adiabatic blast wave; three of these are also consistent with the blast-wave evolution of a jet, with $F_{\nu} \sim t^{-p}$. This behavior suggests that, in some cases, the emission may originate from a narrow jet, possibly consisting of “nuggets” whose angular sizes are less than $1/\Gamma$, where $\Gamma$ is the bulk Lorentz factor.
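For context, a minimal sketch of the standard fast-cooling synchrotron scalings for an adiabatic blast wave (following Sari, Piran \& Narayan 1998 and Sari, Piran \& Halpern 1999) is given below; these are assumed textbook closure relations quoted for illustration, not results derived in this paper.
% Assumed standard relations for fast-cooling synchrotron emission,
% observing frequency above the injection and cooling breaks (\nu > \nu_m > \nu_c):
\begin{align}
  F_{\nu} &\propto t^{(2-3p)/4}\,\nu^{-p/2} \quad \text{(spherical adiabatic blast wave),}\\
  F_{\nu} &\propto t^{-p}\,\nu^{-p/2} \quad \text{(after a jet break).}
\end{align}
% Illustrative check: for an electron index p \approx 2 the spherical decay
% index is (2-3p)/4 = -1, while the post-jet-break index is -p = -2,
% comparable to the mean temporal index \langle\beta\rangle \approx -2 quoted above.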