The geolocation accuracy of satellite-borne time-difference-of-arrival systems is limited by the uncompensated differential delay experienced by each pair of satellite-to-ground paths. We have measured the ionospheric component of this differential delay using two-frequency GPS receivers at seven ground sites: one in the high-latitude region, four at midlatitudes, and two in the equatorial region. The measurements were expressed as differential total electron content (ΔTEC) so that they could be applied at frequencies other than the 1.6 and 1.2 GHz GPS values and readily compared with ionospheric models. Three models were used to account for the measured ΔTEC values: a climatological model (RIBG) and two modified versions of the Jet Propulsion Laboratory's GIM model (a postanalysis implementation and a near-real-time implementation using a subset of reporting stations). All three models reduced the ΔTEC differences significantly. The postanalysis GIM model outperformed the other two in the tails of the residual error distribution, but there was no clear difference among the three in about 75% of the cases. However, when the data were binned by local time and by the angle between the two rays, both GIM models were more accurate than RIBG between about 0600 and 1900 LT at angles greater than about 50°.
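The frequency scaling implied above follows from the standard first-order ionospheric group delay relation, excess path ≈ 40.3·TEC/f², with TEC in electrons/m² (1 TECU = 10¹⁶ el/m²) and f in Hz. The sketch below, with illustrative numbers not taken from the paper, shows how a ΔTEC value would be converted to a differential delay (and hence a TDOA error) at an arbitrary carrier frequency:

```python
C = 299792458.0  # speed of light, m/s


def iono_delay_m(tec_tecu, freq_hz):
    """First-order ionospheric group delay in meters of excess path.

    Uses the standard approximation delay = 40.3 * TEC / f**2, with
    TEC given in TECU (1 TECU = 1e16 electrons/m**2) and f in Hz.
    """
    return 40.3 * tec_tecu * 1e16 / freq_hz ** 2


def tdoa_error_s(delta_tec_tecu, freq_hz):
    """Differential delay (seconds) for a path-pair ΔTEC at frequency f."""
    return iono_delay_m(delta_tec_tecu, freq_hz) / C


# Illustrative: a 10 TECU ΔTEC at the GPS L2 frequency (1227.6 MHz)
# produces a few meters of differential excess path, i.e. ~10 ns of
# uncompensated TDOA error.
print(tdoa_error_s(10.0, 1.2276e9))
```

Because the delay scales as 1/f², the same ΔTEC measurement made at the GPS frequencies can be rescaled to any other downlink frequency, which is why the results are reported in TEC units rather than nanoseconds.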