Modeling the effects of ionospheric scintillation on GPS/Satellite-Based Augmentation System availability



[1] Ionospheric scintillation is a rapid change in the phase and/or amplitude of a radio signal as it passes through small-scale plasma density irregularities in the ionosphere. Scintillation not only can reduce the accuracy of GPS/Satellite-Based Augmentation System (SBAS) receiver pseudorange and carrier phase measurements but can also result in a complete loss of lock on a satellite. In a worst-case scenario, loss of lock on enough satellites could result in lost positioning service. Scintillation has not had a major effect on midlatitude regions (e.g., the continental United States), since the most severe scintillation occurs in a band approximately 20° on either side of the magnetic equator and, to a lesser extent, in the polar and auroral regions. Most scintillation occurs for a few hours after local sunset during the peak years of the solar cycle. Typical delay-locked loop/phase-locked loop designs of GPS/SBAS receivers enable them to handle moderate amounts of scintillation. Consequently, any attempt to determine the effects of scintillation on GPS/SBAS must consider both predictions of scintillation activity in the ionosphere and the residual effect of this activity after processing by a receiver. This paper estimates the effects of scintillation on the availability of GPS and SBAS for L1 C/A and L2 semicodeless receivers. These effects are described in terms of loss of lock and degradation of accuracy and are related to different times, ionospheric conditions, and positions on the Earth. Sample results are presented using the Wide Area Augmentation System (WAAS) in the western hemisphere.
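The geographic and temporal pattern described above (a band roughly 20° on either side of the magnetic equator, the first few hours after local sunset, solar-maximum years) can be sketched as a crude screening function. This is an illustrative sketch only, not the paper's availability model: the function name, the 3-hour post-sunset window, and the assumption that the caller supplies magnetic latitude directly are all assumptions introduced here.

```python
def scintillation_risk(mag_lat_deg, hours_after_sunset, solar_max=True):
    """Crude screen for equatorial scintillation risk.

    Flags 'high' only when all three conditions from the abstract hold:
    within ~20 deg of the magnetic equator, in the first few hours after
    local sunset, and during solar-maximum years. The 3-hour window is an
    illustrative assumption, not a value from the paper, and polar/auroral
    scintillation is ignored in this sketch.
    """
    equatorial = abs(mag_lat_deg) <= 20.0            # band on either side of magnetic equator
    post_sunset = 0.0 <= hours_after_sunset <= 3.0   # "a few hours after sunset" (assumed 3 h)
    return "high" if (equatorial and post_sunset and solar_max) else "low"
```

For example, a hypothetical receiver at 5° magnetic latitude, 1.5 hours after sunset at solar maximum would be screened as high risk, while a midlatitude site at 45° would not. A real availability analysis would replace this screen with a scintillation climatology and a receiver tracking-loop model, as the paper does.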