Several research groups in Europe are developing joint channel simulators for arbitrarily complex networks of terrestrial and slant path microwave telecommunications links. Currently, the Hull Rain Fade Network Simulator (HRFNS), developed at the University of Hull, can simulate rain fade on arbitrary terrestrial networks in the southern United Kingdom, producing joint rain fade time series with a 10 s integration time. This paper reports on work to broaden the function of the existing HRFNS to include slant paths such as Earth-space links and communications to high altitude platforms and unmanned airborne systems. The area of application of the new simulation tool is being extended to the whole of Europe, and other fade mechanisms are being included. Nimrod/OPERA has been chosen as the meteorological input data set for the new system to simulate rain fade. Zero-degree isotherm heights taken from NCEP/NCAR Reanalysis Data are used in conjunction with the Eden-Bacon sleet (wet snow) model to introduce melting layer effects. Other fading mechanisms, including cloud fade, scintillation and absorption losses by atmospheric gases, can be added to the simulator. The simulator is tested against ITU-R models for the rain fade distribution experienced by terrestrial and Earth-space links in the southern United Kingdom. Statistics of fade dynamics, i.e., fade slope and fade duration, for a simulated Earth-space link are compared to International Telecommunication Union models.
1. Introduction
 The International Telecommunication Union Radiocommunication Sector (ITU-R) maintains a set of models for predicting average annual distributions of fade, with a 1 min integration time, on individual terrestrial and Earth-space links. These models are adequate for the regulation and coordination of terrestrial links and are used to estimate link budgets for Earth-space links. The ITU-R also provides models of fade duration and fade slope for a 10 s integration time, principally used in the design of fade mitigation techniques (FMTs). However, the design and optimization of proposed networks requires joint channel models on arbitrarily complex networks of links, with a temporal resolution of 1 s or shorter.
 The recently completed SATNEx II (IST 027393) European Network of Excellence, and the new IC0802 COST action, have identified the urgent need for joint channel models for Global Integrated Networks (GINs) utilizing terrestrial, satellite, unmanned airborne system (UAS) and high altitude platform (HAP) radio links in the VHF to W bands. In particular, these models are necessary for the design and optimization of proposed Land Mobile Satellite (LMS) systems at Ka and Ku bands and higher EHF frequencies (current LMS systems, at S and C band, are quickly reaching capacity). Recent projects by the French space agency CNES (SWIMAX and SDMB) have identified these as the future for broadband and broadcast systems to mobile terminals in trains, planes and road vehicles. A draft report by the Electronic Communications Committee of CEPT stresses the importance of proposed consumer systems delivering broadband, IPTV and multimedia applications to users, direct from satellite at Ka and Ku bands. These services will need to coexist with terrestrial services. Of particular concern is the performance of the terrestrial networks linking ground stations to content providers, in conjunction with the satellite uplinks. These systems require adaptive FMTs and Dynamic Resource Management (DRM). Based on work reported in COST 255 and 280, various adaptation techniques have been included in the DVB-S2 and DVB-RCS standards, such as uplink power control, reconfigurable antenna systems, and adaptive coding and modulation (ACM). All these dynamic adaptation techniques require simulation on channel models during their development and evaluation. The most important fade mechanisms at EHF on terrestrial or slant paths are scattering by rain and cloud droplets, absorption by atmospheric gases and scintillation. In some cases clear-air mechanisms, such as ducting and multipath, can cause fading at outage levels.
These fade mechanisms exhibit complex spatial and temporal correlations, due to their dependence upon the weather. Furthermore, it is necessary to model a very wide range of scales, i.e., from a radio beam's Fresnel zone diameter to the width of a continent, and from 1 s fade variations to the lifetime of a large weather system.
 Joint channel simulators are vital for the design, evaluation and optimization of these proposed systems. It is essential that sufficient diversity exists in terrestrial networks to ensure ground stations have access to content and that ground station diversity ensures content can be communicated to satellites. This will require meteorological nowcasting to predict link outage and to dynamically reconfigure networks. The development of 4G direct broadcast from satellite to mobile, using techniques such as MIMO OFDM, will require adaptation to differential fade along alternate paths [see Liolis et al., 2007]. There is likely to be considerable advantage in the application of ACM techniques developed for satellite communications to terrestrial networks. However, joint channel models are required to quantify the benefits on terrestrial networks.
 The only known method to generate simultaneous fade time series for heterogeneous networks of EHF radio links is to generate fine-scale, spatial-temporal weather fields and then to simulate the effects on each radio link. The weather fields have come from a range of sources, i.e., measured radar data, numerical weather prediction (NWP) or stochastic simulations constrained by measured statistics. One of the earliest systems was developed in Italy and is known as EX-CELL [see Bosisio and Riva, 1998; Paraboni et al., 2002]. However, the EX-CELL system assumes unrealistically smooth spatial-temporal variation of rain rate, and its scope and resolution are poorly defined. The Hull Rain Fade Network Simulator (HRFNS) [Paulson and Zhang, 2009] is based on the downscaling of radar data but is only applicable over squares of size 50 km in the southern United Kingdom. The University of Bath EHF SATCOM system [Hodges et al., 2003] combines numerical weather models with radar data to model satellite channels and is valid across the United Kingdom. It includes rain and cloud fade mechanisms as well as scintillation and absorption by atmospheric gases. At Rutherford Appleton Laboratory (RAL), fully numerical models have been used to simulate terrestrial and satellite links [e.g., Callaghan, 2004]. The RAL method is valid over regions of a few tens of kilometers diameter. A new simulation tool is currently being developed by ONERA in France. The ONERA system utilizes ERA-40 historical NWP data from the European Centre for Medium-Range Weather Forecasts (ECMWF) and is applicable to satellite links [see Jeannin et al., 2008, 2009]. However, the ERA-40 data are very coarse, and the downscaling techniques employed have not been verified.
 The simulator under development is based on the application of stochastic numerical methods, developed as part of the HRFNS, to downscale measured meteorological data. Section 2 discusses the meteorological inputs chosen for the European simulator. Section 3 describes the numerical processes required to derive joint fade time series from coarse-scale meteorological input data. The downscaled meteorological fields are used to simulate the performance of a range of links and the simulated results are compared to ITU-R models in section 4.
2. Meteorological Input Data
 A range of meteorological input data is necessary for the simulator. At millimeter wave frequencies terrestrial links are short, from 10 km down to a few hundred meters, but rain specific attenuation is high, and so fine-scale spatial-temporal rain fields are vital. For the simulation of slant paths it is necessary to produce fields of rain and cloud density at a range of altitudes. An initial feasibility study has identified data from the Nimrod/OPERA terrestrial rain radar systems and EUMETSAT satellite systems as yielding appropriate input data for the proposed system. The UK Met Office (UKMO) Nimrod (UK Met Office, Rain radar products (Nimrod), NCAS British Atmospheric Data Centre, 2003, available at http://badc.nerc.ac.uk/view/badc.nerc.ac.uk__ATOM__dataent_nimrod) rain radar network provides composite measurements of instantaneous rain rate over 1 km voxels, spanning the United Kingdom and parts of Scandinavia, with a 5 min sample period. These data span 1999 to the present, with the higher resolutions available since 2004. The OPERA project aims to integrate Nimrod with a large number of continental European radars into a network that will provide these data spanning Europe. OPERA data should be generally available from 2011. As part of this project, 3 months of beta release OPERA composite data were analyzed, producing results essentially the same as those from Nimrod. The work described in this paper uses Nimrod data.
 Although Nimrod and OPERA data will provide a basis for UK and European network simulation, the development of a global simulation tool requires consistent data with a global span. This is far more likely to be achieved by satellite Earth observation systems than by combining national radar networks. EUMETSAT operates a constellation of Meteosat Earth observation satellites and publishes a wide range of derived data sets. The MultiSensor Precipitation Estimate data set yields global rain rates integrated across 3 km pixels with a 5 min sample period. Cloud information is provided with 9 km pixels and hourly sampling.
 Slant paths experience rain fade from the ground station up to a level known as the rain height, assumed by Recommendation ITU-R P.839-3 [ITU Radiocommunication Sector (ITU-R), 2001] to be 360 m above the zero-degree isotherm (ZDI). This level can be extracted from the NCEP/NCAR Reanalysis Data composites [see Kalnay et al., 1996] produced by NOAA (NCEP reanalysis data, NOAA, OAR, ESRL, PSD, Boulder, Colorado, available at http://www.esrl.noaa.gov/psd/). Among other parameters, the data set provides air temperature and altitude over a global 2.5° grid at 6 h intervals from 1948 to the present, calculated at 17 pressure levels. The global grid has 73 latitudes and 144 longitudes. The ZDI height is calculated by linear interpolation between the lowest pair of temperature-height points where the temperature decreases through zero Celsius with increasing altitude. Diaz et al. conclude that the Reanalysis data can reliably predict ZDI height, even over mountainous areas, over the period 1958 to the present.
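The interpolation step can be sketched in a few lines; `zdi_height` below is a hypothetical helper, assuming temperature and height arrays ordered from the lowest to the highest pressure level:

```python
def zdi_height(temps_c, heights_m):
    """Zero-degree isotherm height (m) by linear interpolation between
    the lowest pair of levels where the temperature falls through
    0 Celsius with increasing altitude; returns None if no crossing."""
    for i in range(len(temps_c) - 1):
        t0, t1 = temps_c[i], temps_c[i + 1]
        if t0 > 0.0 >= t1:  # crossing from above to below freezing
            h0, h1 = heights_m[i], heights_m[i + 1]
            return h0 + (h1 - h0) * t0 / (t0 - t1)
    return None
```

For example, for a profile of 10, 4, -2 and -8 Celsius at 0, 1000, 2000 and 3000 m, the crossing lies two thirds of the way between 1000 m and 2000 m.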
 The zero-degree isotherm is used to determine rain height and the Eden-Bacon sleet model [Tjelta et al., 2005] is used to introduce the enhanced specific attenuation of wet snow within the melting layer. These downscaled rain and cloud fields are then converted into specific attenuation using standard models (e.g., Recommendation ITU-R P.838-3 [ITU-R, 2005]) and the joint, instantaneous fade due to rain, cloud and the melting layer is calculated by numerical integration along the link paths.
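As an illustration of this step, the sketch below applies the P.838-3 power law, γ = kR^α, to a sampled rain rate profile and sums piecewise along the path. The function names and the optional per-segment amplification factors (unity outside the melting layer, greater than unity within it, zero above the rain height) are illustrative, and k and α must be taken from the Recommendation for the frequency and polarization of interest:

```python
def specific_attenuation(rain_rate, k, alpha):
    # ITU-R P.838-3 power law: gamma = k * R**alpha (dB/km),
    # with R the rain rate in mm/h
    return k * rain_rate ** alpha

def path_fade_db(rain_rates, seg_len_km, k, alpha, amp=None):
    # Piecewise-constant integration of specific attenuation along the
    # link path; `amp` holds per-segment melting-layer amplification
    # factors (an illustrative placement of the sleet model)
    if amp is None:
        amp = [1.0] * len(rain_rates)
    return sum(a * specific_attenuation(r, k, alpha) * seg_len_km
               for r, a in zip(rain_rates, amp))
```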
3. Network Fade Simulation
 The fade experienced by a radio link is determined by the scattering and absorption within a volume roughly defined by the first Fresnel zone. Meteorological scatterers, such as hydrometeors (raindrops, ice pellets, hail, and complex mixed phase particles), exhibit considerable spatial and temporal variation. Typically, smaller integration volumes yield more extreme parameter variation; that is, 1 min rain rates are typically more extreme than hourly rain rates. For channel fade simulators to yield variation with the correct distribution and dynamics, they need to be based on parameter fields with integration volumes similar in size to Fresnel zone diameters. These scales are typically tens to hundreds of meters, depending upon the frequency and path length. Meteorological data at this resolution spanning the network are generally not available, and so numerical methods are used to stochastically introduce finer-scale variation into coarse-scale data sets. These downscaling processes rely upon statistical models for parameter variation valid over the range from coarse to fine scale. Generally, models assume multiscaling behavior [see Lovejoy and Schertzer, 1995].
 The downscaling techniques developed for the HRFNS are based on numerically enhanced time series of rain radar images from the Chilbolton Radar Interference Experiment (CRIE), measured as part of the European COST 210 project [Ballabio, 1991]. Techniques that can preserve a mixture of measured and a priori known statistics have been implemented for downscaling and interpolating measured rain fields such as the CRIE.
 Finer-scale spatial variation is introduced into the Nimrod data using multiscaling exponents measured on CRIE data and extrapolated from the smallest measured scale, i.e., from voxels of diameter 300 m down to 30 m. Lilley et al. have proposed that the multiscaling exponents are applicable down to submeter scales. The temporal interpolation algorithms are based on an underlying statistical rain model that assumes spatial-temporal log rain rate fields, where raining, are well approximated by homogeneous, isotropic, fractional Brownian fields with a Hurst coefficient of 1/3 [Paulson, 2002; Callaghan, 2004]. This model has been derived and verified over the scales of interest by analysis of Chilbolton rapid response rain gauge and radar data. The model has been used for the interpolation of rain gauge data [Paulson, 2004]. In the HRFNS, interpolation (i.e., the introduction of new samples with the same spatial-temporal averaging at new points in space-time) is achieved by ARMD, a variant of the Local Average Subdivision (LAS) algorithm of Fenton and Vanmarcke [1990]. Disaggregation (i.e., the refinement of existing samples to smaller integration volumes) is achieved using the log-Poisson multiplicative cascade algorithm of Deidda [2000], designed to reproduce measured multiscaling statistics [Paulson and Zhang, 2007]. Full details are given by Paulson and Zhang [2009].
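A generic mean-one log-Poisson cascade can be sketched as below; this is not the HRFNS implementation, and the parameters `c` and `beta` are illustrative rather than the values fitted to CRIE data:

```python
import numpy as np

def log_poisson_cascade(field, levels, c=1.0, beta=0.5, seed=None):
    """Disaggregate a 2-D field by a log-Poisson multiplicative
    cascade: each level splits every pixel into 2x2 children and
    multiplies by i.i.d. mean-one weights W = beta**Y * exp(c*(1 -
    beta)) with Y ~ Poisson(c), preserving the mean on average."""
    rng = np.random.default_rng(seed)
    out = np.asarray(field, dtype=float)
    for _ in range(levels):
        out = np.kron(out, np.ones((2, 2)))   # refine the grid 2x
        y = rng.poisson(c, size=out.shape)
        out = out * beta ** y * np.exp(c * (1.0 - beta))
    return out
```

Three cascade levels take 1 km pixels down to 125 m, the downscaling factor used in the verification simulations below.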
Paulson has suggested that the downscaling techniques from the HRFNS can be applied to input data sets with sample times as long as 20 min and spatial voxels up to 30 km. Nimrod and OPERA derived rain fields have spatial and temporal resolutions within this range and so are appropriate input data sets for a network simulator (EIG EUMETNET programme OPERA, information and data available at http://www.knmi.nl/opera/index.html). A simulator based on these data sets would be applicable anywhere in the United Kingdom or Europe, respectively.
 The simulation of a network involves several steps. First, the area spanned by the network of interest is identified and a coarse rain map time series for this area is extracted from Nimrod or OPERA data. Each 2-D spatial rain map is disaggregated to smaller integration volumes. Then the 3-D spatial-temporal data volume between each consecutive pair of scans is interpolated to finer temporal sampling. Once the fine spatial-temporal rain field data are calculated, the microwave network can be overlaid and the joint fade time series calculated by numerical integration of the specific attenuation along each link path. The specific attenuation is calculated using the power law of Recommendation ITU-R P.838-3 [ITU-R, 2005] and the Eden-Bacon sleet model that has been incorporated into Recommendation ITU-R P.530-13 [ITU-R, 2009b]. The sleet model introduces a specific attenuation amplification factor which allows for the mixed phase hydrometeor types in the melting layer. For slant paths, the altitude of each segment of the path is calculated from the link geometry. The sleet model reduces the specific attenuation to zero when the link altitude is more than 360 m above the zero-degree isotherm. This altitude is obtained from NOAA NCEP/NCAR Reanalysis data. The fade calculation using this method is equally applicable to terrestrial, Earth-space and shorter slant paths such as those to HAPs or UASs.
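The slant path bookkeeping described above can be sketched as follows; the helper is hypothetical, omits the within-layer amplification for brevity, and simply zeroes the rain specific attenuation more than 360 m above the zero-degree isotherm:

```python
import math

def slant_path_fade_db(rain_rates, seg_len_km, elev_deg,
                       h_station_km, h_zdi_km, k, alpha):
    # Each segment's altitude follows from the link geometry; rain
    # specific attenuation (P.838-3 power law) is accumulated only up
    # to the rain height, taken as 0.36 km above the zero-degree
    # isotherm, and zeroed above it
    rain_height_km = h_zdi_km + 0.36
    sin_el = math.sin(math.radians(elev_deg))
    fade = 0.0
    for i, r in enumerate(rain_rates):
        h = h_station_km + (i + 0.5) * seg_len_km * sin_el
        if h <= rain_height_km:
            fade += k * r ** alpha * seg_len_km
    return fade
```

Setting the elevation to zero recovers the terrestrial case, so the same routine serves Earth-space, HAP and terrestrial links.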
 The simulation tool can be subjected to a wide range of verification tests. These can be divided into tests of the statistics of the downscaled rain fields and of the derived rain fade time series. As the downscaling process does not aim to reproduce the fine-scale fields that actually existed during the simulation interval, only the statistics of rain fields and fade time series can be compared. The ITU-R provides prediction methods for first-order statistics, i.e., distributions of average annual parameters. The ITU-R also provides limited models of second-order statistics, i.e., fade duration and fade slope.
4. Verification
 The following comparisons are based on 3 calendar years of Nimrod data, 2004 to 2006, spanning a 36 km square region centered on Chilbolton Observatory (51°8.7′N, 1°26.2′W). These data have been downscaled from a spatial integration area diameter of 1 km to 125 m. The original 5 min sampling interval has been interpolated to 18 s. The downscaled data set was used to simulate a range of terrestrial and Earth-space links.
4.1. Verification of Hydrometeor Fade Distributions
Figure 1 compares the rain intensity distributions derived from the Nimrod data, from the downscaled data, from Recommendation ITU-R P.837-5 [ITU-R, 2007], and from Rapid Response Drop Counting rain gauges (collected over the same years as the Nimrod data) sited at Chilbolton Observatory and at a nearby site at Sparsholt. The Recommendation ITU-R P.837-5 distribution is calculated using the 0.01% exceeded rain rate of 29.0 mm/h derived from the downscaled Nimrod data. This result demonstrates that the downscaling process produces rain fields with a plausible distribution. Disaggregation has increased the probability of higher rain rates, as is expected for smaller integration volumes. Averaging over the 3 years of data obscures the year-to-year variation. Figure 2 shows the 3 years separately and reveals a factor of 2 variation in the annual 0.01% exceeded rain rate.
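Statistics such as the 0.01% exceeded rain rate follow directly from the empirical distribution of the downscaled samples; a minimal sketch, with a hypothetical helper name:

```python
import numpy as np

def exceeded_value(samples, percent):
    """Value exceeded for `percent` of the time, i.e. the
    (100 - percent)th percentile of the sample distribution
    (zero-rain samples must be included for time percentages)."""
    return float(np.percentile(samples, 100.0 - percent))
```

For example, `exceeded_value(rain_rates, 0.01)` yields the 0.01% exceeded rain rate used as input to the ITU-R models.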
 Vertically polarized, 38 GHz, zero elevation terrestrial links, of length 5 km and 8 km, have been simulated. For each link length, all possible positions within the 36 km square, orientated north-south or east-west, are simulated, and each yields a fade time series. Effectively, over half a million links are simulated. The links are assumed to be at an altitude of 600 m, to illustrate the effects of the melting layer, despite the land surface being at an altitude of 100 m. The hydrometeor fade distributions illustrated in Figure 3 are calculated by averaging over all these link time series. Each fade time series can be simulated with and without the Recommendation ITU-R P.530-13 [ITU-R, 2009b] sleet model. When the sleet model is not used, all hydrometeors are interpreted as liquid rain. Alternatively, the zero-degree isotherm height can be assimilated from NOAA data and the effects of mixed phase hydrometeors included. Figure 3 illustrates the distributions of fade, with and without allowing for sleet, calculated by simulation over the 3 years. These are compared with the average annual distributions provided by Recommendation ITU-R P.530-13 [ITU-R, 2009b], with and without the sleet correction. The rain rate exceeded 0.01% of the time used in the P.530-13 model is the 29.0 mm/h extracted from the downscaled Nimrod data set. Figure 3 shows very close agreement between the simulated distributions and the ITU-R models, both with and without sleet. The deviation is well within that expected given the year-to-year variation illustrated in Figure 2.
 The same simulation process was followed to calculate the hydrometeor fade distribution for a Ka band uplink to a geostationary satellite. The link uses circular polarization at 27.5 GHz and operates at an elevation angle of 28°. Figure 4 illustrates the fade distribution calculated over the 3 year simulation, compared to the hydrometeor fade prediction of Recommendation ITU-R P.618-10 [ITU-R, 2009a]. Given the year-to-year variation observed over the 3 years, the differences between the distributions at time percentages above 0.01% are within expected bounds. The observed 0.01% exceeded rain rate is an input parameter to the P.618-10 model, and so the good agreement at 0.01% is expected. The deviation at lower time percentages could be an inconsistency between the terrestrial and Earth-space models P.530-13 and P.618-10, as the terrestrial link distributions were much closer at these time percentages. Alternatively, the deviation could be due to a feature of the small number of extreme events that determine the fade at these low time percentages.
4.2. Verification of Second-Order Statistics
 Optimization of system capacity, quality and reliability requires second-order statistics such as fade duration and fade slope. These are essential inputs for the design and optimization of FMTs. The proposed simulator can produce joint time series of fade for arbitrary networks and so can provide a wide variety of statistics.
4.2.1. Fade Duration
 Fade duration is defined as the time period that fade exceeds a given attenuation threshold. The distribution of fade durations yields important information on system outage and unavailability and is one of the vital factors in choosing forward error correction codes and modulation schemes for satellite communication systems.
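From a simulated fade time series, the fade duration statistic reduces to run-length analysis against the threshold; a minimal sketch with a hypothetical helper:

```python
def fade_durations(fade_db, threshold_db, dt_s):
    """Durations (s) of the contiguous runs for which the fade exceeds
    the threshold, from a regularly sampled fade time series."""
    durations, run = [], 0
    for f in fade_db:
        if f > threshold_db:
            run += 1
        elif run:
            durations.append(run * dt_s)
            run = 0
    if run:                       # run still open at end of series
        durations.append(run * dt_s)
    return durations
```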
 Recommendation ITU-R P.1623-1 [ITU-R, 2003] provides a prediction model for the average annual distribution of fade duration for Earth-space links. The fade duration model consists of a lognormal distribution for long fades and a power law function for short fades and is valid for durations longer than 1 s. Figure 5 illustrates the fade duration distributions for an 11 dB threshold level for the 3 individual years (2004, 2005 and 2006). The large year-to-year variation suggests that the differences between the 3 year simulation results and the model are not significant.
4.2.2. Fade Slope
 Fade slope is defined as the rate of change of attenuation with time. Fade slope is another important parameter when designing FMTs as it constrains the time in which a system needs to react to increasing fade.
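As one illustrative estimator (Recommendation ITU-R P.1623 defines fade slope over a moving time interval; the sketch below simply uses a central difference over adjacent samples):

```python
def fade_slopes(fade_db, dt_s):
    # Central-difference fade slope estimate (dB/s) at each interior
    # sample of a regularly sampled fade time series
    return [(fade_db[i + 1] - fade_db[i - 1]) / (2.0 * dt_s)
            for i in range(1, len(fade_db) - 1)]
```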
Figure 6 illustrates the fade slope probability density function, conditional upon the fade level. The predicted distribution provided by P.1623-1, the distributions measured over the 3 individual years, and the average distribution are compared. A threshold at the 0.01% exceedance level has been used. The ITU-R model is broader than the simulated distribution and appears to be outside the deviation indicated by year-to-year variation. This deviation could be due to the relatively small sample size: the number of 18 s intervals that span the 0.01% exceeded fade is very small. It is also possible that the 18 s simulation sample interval has contributed to the deviation from the 10 s integration time model.
5. Conclusions
 A system has been developed capable of simulating heterogeneous networks of terrestrial and slant path microwave links. The system has been verified by simulating a variety of notional terrestrial and Earth-space links, situated in the southern United Kingdom. The simulated distributions of fade and fade dynamics have been compared to ITU-R models and agreement has been demonstrated, within the limits of the 3 year simulation period and the expected accuracy of the prediction models.
 The system requires further validation. In particular, it needs to be tested in other locations across Europe to determine the effects of topography. Further fade processes will be added to refine the channel models. Absorption by atmospheric gases is a process that varies slowly in time and space, and the fade can be calculated from local meteorological parameters provided by standard sources. Scintillation depends upon atmospheric turbulence and on system parameters such as antenna apertures. Once turbulence has been estimated, scintillation time series can be added using the model of Tatarski [1961].
 Ultimately, simulation tools such as this could augment or replace some ITU-R recommendations. The generation of joint fade time series allows a wide range of questions to be posed and answered. Joint time series allow dynamic network FMTs to be designed, optimized and tested for arbitrary heterogeneous networks.
Acknowledgments
 We would like to acknowledge the support of the British Atmospheric Data Centre and the NOAA Earth System Research Laboratory Physical Sciences Division for providing the data sets for this research.