## 1. Introduction

[2] Terrestrial, line-of-sight, microwave telecommunications links experience attenuation due to rain. At frequencies above 10 GHz this is the dominant fade mechanism and (with mechanical failure) is almost exclusively the cause of outage. Outage has a complex definition stemming from ITU-R recommendations G.826 (1999) and G.828 (2000) and the F recommendations derived from them (e.g., F.1491). Modern digital radio systems broadcast a bit stream divided into blocks; e.g., a typical SDH STM-1 system might have 810 bits per block and transmit 192,000 blocks per second. If any bit within a block is transmitted incorrectly, then the block is termed “errored”. A Severely Errored Second (SES) occurs in any second where 30% or more of the blocks were errored. An outage is defined as the period from the first of ten consecutive SES until the first of ten consecutive non-SES. Traditionally, links are specified to have an outage period caused by rain fading not exceeding some small percentage of an average year, usually 0.01% or 0.1% of time, and the rain fade margin is built into the link budget by estimating the rain attenuation exceeded for this time. Many models exist to calculate the fade margin [e.g., *COST 210 Management Committee*, 1991; *COST 235 Management Committee*, 1996; *ITU-R*, 2007a, 2007b]. However, these models are based on available statistics of rain rate measured with a 1-min integration time. They are adequate for fade-margin calculations for individual long links but probably underestimate the incidence of outage on links shorter than 1 km. They provide only limited guidance on the performance of networks; e.g., *ITU-R* [2007a] provides some guidance for more complex configurations such as multihop links and links utilizing route diversity, and *ITU-R* [2003b] provides some guidance for point-to-multipoint cellular systems.
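The SES and outage definitions above are procedural, so they can be made concrete with a short sketch. The function below is illustrative only (its name and structure are ours, not part of the recommendations): given the per-second fraction of errored blocks on a link, it marks each second as inside or outside an outage.

```python
def classify_outage(errored_fraction):
    """Given per-second fractions of errored blocks, return a boolean
    list marking the seconds that fall within an outage.

    A Severely Errored Second (SES) is a second with >= 30% errored
    blocks. An outage runs from the first of ten consecutive SES until
    the first of ten consecutive non-SES.
    """
    ses = [f >= 0.30 for f in errored_fraction]
    in_outage = False
    out = []
    for i in range(len(ses)):
        window = ses[i:i + 10]
        if not in_outage and len(window) == 10 and all(window):
            # First of ten consecutive SES: outage begins here.
            in_outage = True
        elif in_outage and len(window) == 10 and not any(window):
            # First of ten consecutive non-SES: outage ends here.
            in_outage = False
        out.append(in_outage)
    return out
```

For example, ten consecutive seconds with half the blocks errored open an outage, which closes at the first of ten consecutive clean seconds.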

[3] The Quality of Service (QoS) experienced by a node in a heterogeneous network of microwave links, at a minimum defined by the average annual outage, is currently impossible to predict, as it depends upon joint time series of rain fade with a 1-s integration time. Rudimentary models of fade duration exist for individual links, but not for more complicated networks. The engineering of fade mitigation techniques such as route diversity or adaptive modulation similarly requires knowledge of typical time series of rain fade on heterogeneous networks of links [see *COST 280 Management Committee*, 2005].

[4] It has been demonstrated that joint rain fade on arbitrary networks of microwave links can be calculated from radar-derived rain rate fields [e.g., *Goddard and Thurai*, 1996, 1997; *Tan and Pedersen*, 2000; *Paulson*, 2003; *Usman*, 2005]. These studies and others have derived 10-s integrated rain fade on microwave links from radar-measured rain rate maps with 300-m resolution. When meteorological radars scan across near-horizontal planes (PPI scans), they provide near-instantaneous measurements of radar reflectivity over large areas. Each reflectivity value is a weighted average across a voxel defined by the radar antenna pattern, the angular scan of the radar, and the range gates. Measured single- and dual-polarization radar reflectivities may be used to estimate rain rate, either using empirical relationships or theoretical relationships based on parameterized drop size distributions and assumptions about drop shape and terminal velocity [*Goddard and Cherry*, 1984]. These radar-derived rain rates are averaged over the same voxel as the reflectivity measurement. Large radars, such as CAMRa, the Chilbolton Advanced Meteorological Radar, can produce near-instantaneous rain rates averaged over voxels with linear dimensions of a few hundred meters within a range of up to 100 km [*Goddard et al.*, 1994].
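As an illustration of the empirical route from reflectivity to rain rate, the sketch below applies a generic power law Z = aR^b. The default coefficients are the classic Marshall-Palmer values, offered purely as an example; operational coefficients (including those used with CAMRa) vary with climate and drop size distribution, and the function name is ours.

```python
import math


def rain_rate_from_reflectivity(dbz, a=200.0, b=1.6):
    """Estimate rain rate R (mm/h) from radar reflectivity factor,
    given in dBZ, using an empirical power law Z = a * R**b.

    Defaults are the classic Marshall-Palmer coefficients; these are
    illustrative, not the relationship used in the studies cited above.
    """
    z = 10.0 ** (dbz / 10.0)  # dBZ -> linear reflectivity (mm^6 m^-3)
    return (z / a) ** (1.0 / b)
```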

[5] Instantaneous joint rain fade is calculated by superimposing the measured rain field on a network of links. The rain fade experienced by each link is calculated by numerical integration of the specific attenuation along the link path. The specific attenuation is derived from the rain rate using a power law *γ*-*R* relationship [e.g., *ITU-R*, 2003a]. Specific attenuations derived from rain rates are known to include relative errors of 50% or more, depending upon variation in the drop size distribution (DSD). Unlike rain rate, both radar reflectivity and microwave scattering depend, in the Rayleigh scattering regime, upon the sixth power of drop diameter. Radar-derived specific attenuations are therefore effectively a frequency scaling of the reflectivity measurement and are likely to be more accurate than the intermediate rain rates.
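The per-link fade calculation described above reduces to a discretized path integral of the specific attenuation γ = kR^α. A minimal sketch follows; the default k and α are illustrative placeholders (the tabulated coefficients depend on frequency and polarization), and the function name is ours.

```python
def link_fade_db(rain_rates_mm_h, segment_km, k=0.0188, alpha=1.217):
    """Approximate the rain fade (dB) on a link by summing specific
    attenuation gamma = k * R**alpha (dB/km) over equal path segments.

    rain_rates_mm_h : rain rates sampled at points along the link path
    segment_km      : length of each path segment (km)
    k, alpha        : power-law coefficients; the values here are
                      illustrative, not tabulated ITU-R coefficients.
    """
    return sum(k * r ** alpha * segment_km for r in rain_rates_mm_h)
```

In a network simulation, each link would draw its rain rate samples from the radar-derived field at the points the path crosses, and the joint fade is simply this calculation repeated per link against the same field.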

[6] The process described above yields instantaneous measurements of joint rain fade. Time series of radar-derived rain rate fields can be used to generate joint-fade time series. However, the method is limited by the long temporal sampling period: radars need to be physically rotated to scan across the area being mapped, so the period between consecutive radar images is usually several minutes and can be as long as 15 min. This low temporal sampling rate severely limits the application of the derived rain fade time series.

[7] It has been proposed that joint rain fade and rain scatter interference on heterogeneous link networks of arbitrary geometry and radio parameters could be calculated from simulated rain fields derived from stochastic models of rainfall [*Callaghan*, 2004, 2007; *Callaghan and Vilar*, 2007]. This approach is currently being considered for adoption into an ITU-R recommendation. The development of stochastic rain models spanning the wide range of scales necessary for radio system simulation is still in its infancy. At larger scales, stochastic, spatial-temporal rainfall models for hydrological modeling have existed for decades [*Wheater et al.*, 2000]. At the fine scales necessary for radio system simulation, multiscaling and multiplicative cascade models exist; however, their parameters are based on statistics averaged over a large number of event types and so do not reproduce the rich diversity of rain events, particularly anisotropic events such as squall lines and fronts. Furthermore, the lack of very fine scale rain data, e.g., from integration periods less than 10 s, means these models have not been verified down to the smallest scales.

[8] An alternative method is proposed in this paper, where long time series of spatial rain fields, derived from meteorological radar, are numerically interpolated and downscaled to yield the range of scales necessary for network simulation. The effect is to numerically generate the fine-scale rain fields that might have been measured at times between radar scans. After interpolation and downscaling, the rain field time series considered in this paper has a sampling period of 10 s or less and can be used as the input to a microwave network simulator. The proposed method has the advantage over purely numerical rain field generation of matching measured rain fields at the radar sample times: the more anisotropic, large-scale rain variation is conserved from the radar data, and other parameters, such as advection, are also retained.

[9] Section 2 describes the database of radar measurements used to develop and verify the proposed method. Section 3 develops the numerical interpolation and downscaling, while Section 4 presents evidence verifying the interpolated rain fields and derived rain fade statistics.