Satellite radio systems suffer loss of information over a wide band of frequencies during periods of intense ionospheric scintillation activity, when the received signal undergoes rapid and deep fading. To assess the problem and to determine a proper fade margin for an Earth-space link, system engineers require information on signal statistics as well as on the morphological aspects of scintillations. Our observations near the northern boundary of the equatorial scintillation belt (18.9° N geomagnetic latitude) within the Indian zone show that the signal at 4/1.5 GHz often faded beyond 10 dB peak-to-peak, and at times beyond 24 dB peak-to-peak at 4 GHz, during equinoctial months of high solar activity in 1989–1990. In addition to the morphology at 4 GHz, signal statistics, such as the cumulative amplitude distribution function, the fade-rate distribution, and signal reliability for different message lengths, are presented for several scintillation events at both C and L band. The theoretical Nakagami-m distribution has been found to describe the various fade levels best. Autocorrelation and power-spectrum analyses have been used to estimate average fade rates and ground correlation distances. A performance evaluation of satellite Earth terminals using small antennas shows the vulnerability of such systems in this hostile ionospheric environment, notwithstanding the advanced modulation schemes employed.
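The amplitude statistics mentioned above (Nakagami-m fitting and the cumulative fade distribution) can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' processing chain: it estimates the Nakagami parameters m and Ω by the standard method of moments (Ω = E[R²], m = Ω²/Var(R²)) and tabulates the fraction of time the amplitude falls a given number of dB below its median; the function names and the synthetic Rayleigh test signal are assumptions for the example.

```python
import numpy as np

def nakagami_moment_fit(amplitude):
    """Method-of-moments estimate of Nakagami-m parameters.

    Omega = E[R^2] (mean signal power), m = Omega^2 / Var(R^2).
    """
    r2 = np.asarray(amplitude, dtype=float) ** 2
    omega = r2.mean()
    m = omega ** 2 / r2.var()
    return m, omega

def cumulative_fade_distribution(amplitude, fade_db):
    """Fraction of samples that fade below each threshold.

    Thresholds are expressed in dB below the median amplitude.
    """
    a = np.asarray(amplitude, dtype=float)
    ref = np.median(a)
    levels = ref * 10.0 ** (-np.asarray(fade_db, dtype=float) / 20.0)
    return np.array([(a < lvl).mean() for lvl in levels])

# Synthetic Rayleigh-faded amplitudes: R^2 is exponential, which is the
# Nakagami case m = 1, so the fit should recover m close to 1.
rng = np.random.default_rng(0)
amp = np.sqrt(rng.exponential(scale=1.0, size=100_000))
m, omega = nakagami_moment_fit(amp)
p_fade = cumulative_fade_distribution(amp, [3.0, 10.0])
```

Deeper fade thresholds necessarily yield smaller exceedance fractions, so `p_fade` is monotonically decreasing; comparing the empirical curve against the fitted Nakagami-m cumulative distribution is the usual goodness-of-fit check for deciding which theoretical model describes the observed fading best.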