We study the emission observed at energies >100 MeV of 11 gamma-ray bursts (GRBs) detected by the Fermi Large Area Telescope (LAT) up to 2009 October. The GeV emission has three main properties: (i) its duration is often longer than that of the softer emission detected by the Gamma-ray Burst Monitor (GBM) onboard Fermi (this confirms earlier results from the Energetic Gamma-Ray Experiment Telescope, EGRET); (ii) its spectrum is consistent with F_ν ∝ ν^{-1} and does not show strong spectral evolution; and (iii) for the brightest bursts, the flux detected by the LAT decays as a power law with a typical slope t^{-1.5}. We argue that the observed >0.1 GeV flux can be interpreted as afterglow emission beginning shortly after the onset of the prompt emission seen at lower frequencies. The decay slope is that expected if the fireball emission is produced in the radiative regime, i.e. if all the dissipated energy is radiated away. We also argue that detectability in the GeV energy range depends on the bulk Lorentz factor Γ of the burst, being strongly favoured for large Γ. This implies that the fraction of bursts detected at high energies corresponds to the fraction of bursts having the largest Γ. The radiative interpretation can help to explain why the observed X-ray and optical afterglow energetics are much smaller than those emitted during the prompt phase, despite the fact that the collision with the external medium should be more efficient than internal shocks in producing the radiation we see.