Adaptive transmit power control (ATPC) can be used to improve the spectrum efficiency of terrestrial point-to-point fixed links by limiting the transmit power to that required to maintain a constant bit error ratio regardless of the propagation conditions. This results in a reduced transmit power being used during clear-sky conditions, lowering the interference resulting from the ATPC link. This improves the frequency reuse factor associated with a given band and geographic area, providing a spectrum efficiency gain. The project described in this paper found that implementing ATPC in the 38 GHz terrestrial fixed links band gives significant improvements in spectrum efficiency as measured by the increase in the number of links assigned to channel 1 (from ∼50% to ∼70%) and the decrease in the maximum bandwidth used (from ∼300 MHz to ∼180 MHz). However, a model plan exposed to an exceptionally intense frontal rain event showed a number of additional outages caused by ATPC, amounting to approximately 12% of the number of outages caused directly by rain. In comparison, when exposed to an annualized simulated rain database the number of extra outages in this case falls to 2.6%.
 There is increasing pressure on radio spectrum managers to improve the utilization and efficiency of the spectrum in order to cater for the increased bandwidths needed by new services such as mobile internet. For this reason, for systems operating above about 10 GHz, fade mitigation techniques such as adaptive transmit power control (ATPC) can be used to improve spectrum efficiency.
 ATPC can be used to improve the spectrum efficiency of fixed links by limiting the transmit power to that required to maintain a constant bit error ratio (BER) regardless of the propagation conditions. This results in a reduced transmit power being used during clear-sky conditions, meaning that the interference resulting from the ATPC link is correspondingly lower. This improves the frequency reuse factor associated with a given band and geographic area, providing a spectrum efficiency gain.
 For systems operating at frequencies of above 10 GHz, the primary propagation impairment is rain. The spatio-temporal distribution of rain fields determines whether interfering links are attenuated in similar proportion to wanted links, thereby indicating if the implementation of ATPC will result in increased levels of interference.
 A previous study funded by the UK's Ofcom [Richardson et al., 2004] indicated that there are spectrum efficiency gains to be made as a result of the introduction of ATPC in point-to-point fixed service bands, especially those operating at frequencies where rain is a significant attenuator. However, that study was limited in scope, dealing only with a simplified scenario of two parallel links, separated by distances of ∼1–10 km. Ofcom therefore funded another project to investigate the impact of implementing ATPC in more detail, concentrating on the following questions identified from the previous study:
 1. During rain events, does the use of ATPC increase harmful interference to neighboring point-to-point systems, exceeding the frequency assignment criteria? And if so, how often and to what extent will the criteria be exceeded?
 2. What is the most efficient way of maximizing the increase in packing density and spectrum utilization through the use of ATPC, without exceeding the assignment criteria?
 The technical frequency assignment criteria that Ofcom employs when selecting frequencies for fixed terrestrial (point-to-point) digital radio services operating in the 38 GHz band can be found in the work of Ofcom.
 The potential advantages of ATPC reported in the literature include [Richardson et al., 2004; National Spectrum Managers Association (NSMA), 1992]: (1) Reduced average power consumption. (2) Extended equipment mean time between failure (MTBF). (3) Elimination of the ‘upfade’ problem in receivers. (4) Improved outage performance due to the reduced influence of adjacent channel interference (ACI). (5) Easier frequency coordination in areas of high radio-relay station density.
 It is this last point that was fundamental to the objectives of this study, since an increase in spectrum utilization is dependent on the ability to reduce the coordination distance for systems employing ATPC without compromising the quality of service of neighboring links through excessive interference.
 However, it is vital to emphasize that ATPC should only be used to combat temporary fading of the wanted link rather than interference from the unwanted link(s). Otherwise, a situation could arise where two ATPC systems repeatedly increase their transmitter power in response to each other's interference until both are transmitting at their maximum transmitter power. This situation would reduce to the non-ATPC case, completely negating any spectrum efficiency benefits gained as a result of employing ATPC in the first place. To avoid this situation, ATPC links must be designed and deployed correctly to take into account the interference generated by neighboring links so that ATPC is used to combat rain fading, rather than interference.
2. ATPC System Definition
 The basic operational principle for ATPC is quite simple. In cases where rain fading occurs on the radio path, it involves increasing the transmit power to compensate for the fade on a dB by dB basis. Given a reliable power control system, it is possible to reduce the fixed fade margin during clear-sky conditions (i.e., no fading), thereby improving the rate of frequency reuse and link packing density in the geographical area of the link. This is because lower fade margins use less transmit power, which lessens the interference on adjacent links.
 Long-term point rainfall rate statistics [Ventouras et al., 2006] show that for an average year in the south of England, it only rains for ∼3% of the time (equivalent to a total time of 11 days, broken up into many events lasting for minutes/hours). For the remaining 97% of the time the fade margin needed to achieve 99.99% availability is unnecessary and hence advantage can be taken of clear-sky conditions to improve spectral efficiency.
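These figures can be checked with a line of arithmetic (a sketch using a 365.25-day year; the 3% rain fraction and 99.99% availability target are from the text):

```python
MIN_PER_YEAR = 365.25 * 24 * 60           # minutes in an average year

# Fraction of an average year with rain (~3%), expressed in days
rain_days = 0.03 * 365.25                 # ~11 days

# Annual outage allowance for a link planned at 99.99% availability
outage_min = (1 - 0.9999) * MIN_PER_YEAR  # ~53 minutes

print(rain_days, outage_min)
```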
2.1. ATPC Parameters
 The operating capability of the transmit terminal is defined by the EIRP levels Pmin and Pmax where: (1) Pmax is effectively determined by the maximum EIRP specified by the regulator in order to satisfy a given availability requirement. (2) Pmin is determined by the performance of the equipment. Pmax − Pmin is the ATPC range.
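The dB-by-dB compensation within these EIRP limits can be sketched as a simple control law (a minimal illustration; all power values below are hypothetical):

```python
def atpc_eirp(p_nominal_dbm, fade_db, p_min_dbm, p_max_dbm):
    """Ideal ATPC: raise the EIRP dB-for-dB with the rain fade,
    clamped to the allowed range [Pmin, Pmax]."""
    return min(max(p_nominal_dbm + fade_db, p_min_dbm), p_max_dbm)

# Hypothetical terminal: Pmin = 5 dBm, Pmax = 25 dBm, nominal EIRP 10 dBm
print(atpc_eirp(10, 0, 5, 25))   # clear sky: stays at the nominal 10 dBm
print(atpc_eirp(10, 8, 5, 25))   # 8 dB fade: uplift to 18 dBm
print(atpc_eirp(10, 20, 5, 25))  # deep fade: capped at Pmax = 25 dBm
```

Once the fade exceeds the remaining ATPC range, the link fades toward outage exactly as a non-ATPC link would.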
 The assignment criteria used by Ofcom to determine whether a new frequency assignment can be made to a point-to-point link without receiving or generating unacceptable interference address two different situations:
 1. The wanted path is faded, as modeled using ITU-R [2007a], and the interfering path is at its median level.
 2. The wanted path is in an unfaded state, as represented by the median received signal level, and the interfering path is enhanced, once again as modeled using ITU-R [2007a].
 In both of these situations it is necessary to satisfy a given wanted-to-unwanted signal ratio (W/U). Cochannel W/U values may be calculated from first principles and are based on a noise-limited frequency assignment methodology in which aggregate interference and individual sources of interference are limited to specified levels below an allowance for receiver noise. In practice, at the present time, cochannel and first adjacent channel values are taken from European Telecommunications Standards Institute (ETSI) standards and modified in order to take account of multiple interferers. (Note that this approach is under review and a return to calculation from first principles is envisaged.) Offset W/U values, beyond the first adjacent channel, are based on the cochannel value, the Net Filter Discrimination (NFD) associated with the relative bandwidths of the wanted and unwanted signals, the out-of-band emissions of the interfering signal (transmit mask), the out-of-band discrimination of the receiver (receive mask), and the frequency offset of the two signals. The W/U ratios used here are identical to those used by Ofcom.
 Figure 1 gives schematic examples of the power against time for the transmitter and receiver in a non-ATPC system, an ideal ATPC system and a more realistic ATPC system.
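The core of the assignment check reduces to comparing the actual W/U against the required value for the channel offset in question. A sketch of that comparison, with a simplified offset-channel derivation (the function names and the numerical levels are hypothetical):

```python
def passes_clash_test(wanted_dbm, unwanted_dbm, wu_required_db):
    """True if the actual wanted-to-unwanted ratio meets the required
    W/U for the channel offset in question."""
    return (wanted_dbm - unwanted_dbm) >= wu_required_db

def offset_wu(cochannel_wu_db, nfd_db):
    """Offset-channel W/U derived from the cochannel value less the
    Net Filter Discrimination (a simplification of the full method,
    which also accounts for transmit/receive masks)."""
    return cochannel_wu_db - nfd_db

# Hypothetical levels: wanted -40 dBm, interferer -75 dBm, required W/U 30 dB
print(passes_clash_test(-40, -75, 30))   # True (actual W/U is 35 dB)
print(offset_wu(30, 25))                 # 5 dB required for a 25 dB NFD
```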
2.2. Setting up the Link
 The nominal operating condition (i.e., the condition that the ATPC seeks to preserve) is set at a receive signal level a number of dB greater (e.g., 3 to 7 dB) than the 10−6 BER sensitivity level, referred to by Ofcom as the RSL. The 3 to 7 dB margin above the RSL is called either the offset or the remote fade margin (RFM). The nominal operating condition is set to provide a BER of the order of 10−11 or 10−12 in order to ensure that the Background Block Error Ratio (BBER) is no more than would be the case were ATPC not to be used [ETSI, 2002].
 In setting up the link it is necessary to specify Pmax, in accordance with the maximum EIRP permitted, and to specify the receive signal level associated with the nominal operating condition. When initially setting up the link under nominal propagation conditions the ATPC immediately adjusts the power by way of a transmitter/receiver closed loop to establish the nominal operating condition.
Figure 2 graphically represents the issues raised in the following sections. In the ideal case the total fade margin associated with the required availability for the link is the sum of the ATPC range and the remote fade margin. The nominal operating power of the transmitter as established by the ATPC loop would be exactly Pmin.
2.2.1. ATPC Range Greater Than the Matched Situation (Green Line)
 In the case where the ATPC range is greater than that in the matched case above, Pmin of the equipment is less than the transmitter power required to establish the nominal operating condition. On setup the ATPC loop automatically adjusts the transmitter power to a level higher than the equipment's Pmin and sufficient to establish the nominal operating condition. The nominal operating power of the transmitter ends up at a level the same as the matched situation above but higher than the equipment's Pmin.
 In enhanced propagation conditions it can be expected that the ATPC loop will reduce the transmitter power as appropriate, but only down to Pmin of the equipment as the limit, in order to maintain the nominal operating condition.
2.2.2. ATPC Range Less Than the Matched Situation (Red Line)
 In the case where the ATPC range is less than that in the matched case above, Pmin of the equipment is higher than the transmitter power required to establish the nominal operating condition. On setup the ATPC loop can therefore only adjust the transmitter power down to Pmin.
 As a fade occurs the ATPC loop will not start to operate (i.e., increase the transmitter power) until the received signal level has fallen (due to the fade) to the specified nominal operating condition. The nominal operating power of the transmitter is at Pmin of the equipment which is higher than the matched situation above and which under nominal propagation conditions will provide a received signal level higher than absolutely necessary.
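The three setup cases can be illustrated numerically; the power levels below are hypothetical:

```python
def nominal_tx_power(required_dbm, p_min_dbm):
    """On setup the ATPC loop adjusts the transmitter power toward the
    level needed for the nominal operating condition, but can never go
    below the equipment's Pmin."""
    return max(required_dbm, p_min_dbm)

required = 15.0   # dBm needed to establish the nominal operating condition

# Matched case: the equipment's Pmin equals the required power
print(nominal_tx_power(required, p_min_dbm=15.0))  # 15.0

# ATPC range greater than matched: Pmin is below the required power;
# the loop settles at the same nominal power, above Pmin
print(nominal_tx_power(required, p_min_dbm=10.0))  # 15.0

# ATPC range less than matched: Pmin is above the required power;
# the transmitter is stuck at Pmin, giving a received level higher
# than strictly necessary
print(nominal_tx_power(required, p_min_dbm=18.0))  # 18.0
```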
3. Planning and Analysis Software
 In order to investigate the effects of implementing ATPC, new software based on Ofcom's existing band planning tool was created. The project software has two parts: a planning tool, which plans a set of links using standard planning assumptions, and an analysis tool, which takes a plan produced by the first tool and examines the response of the links to a sequence of rain fields.
 The planning tool takes an existing plan (for example, the plan of all the fixed terrestrial links in the UK operating in the 38 GHz band) and replans it, subject to a number of assumptions: (1) the mix of ATPC and non-ATPC links and (2) the type of ATPC in use. The statistics of the new plan are then calculated to estimate changes in band efficiency.
 The initial plan was based on the existing 38 GHz band plan supplied by Ofcom. The 13,949 links in the initial plan were filtered to remove links for which the data appeared to be incorrect (76 links), for which antenna patterns could not be found (165 links) or which failed the Fresnel zone test (52 links), leaving 13,656 links, located throughout the UK; one link is one way, the remainder are two way; all links are vertically polarized.
 The planning process follows Ofcom, with some exceptions: (1) the links are not checked against the ‘minimum path length policy,’ (2) there is no 6 dB EIRP uplift for obstructed paths, and (3) antenna pointing is calculated by the application: the plan value is discarded.
Figure 3 shows the current existing assignments in the 38 GHz band in the UK. As can be seen, the band appears to be very congested, with large numbers of links in legacy subbands. Replanning the band with the automated planning application (using a lowest first frequency assignment algorithm) results in a contraction of the assignments to the lower end of the band (see Figure 4).
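A lowest first assignment can be sketched as a greedy packing over channel numbers (a simplified illustration: the W/U clash testing is abstracted into a compatibility predicate, and the toy data are hypothetical):

```python
def lowest_first_assign(links, compatible):
    """Assign each link the lowest-numbered channel on which it is
    compatible (i.e., passes the clash tests) with every link already
    assigned to that channel."""
    channels = []       # channels[i] holds the links on channel i + 1
    assignment = {}
    for link in links:
        for ch, members in enumerate(channels):
            if all(compatible(link, other) for other in members):
                members.append(link)
                assignment[link] = ch + 1
                break
        else:
            channels.append([link])          # open a new channel
            assignment[link] = len(channels)
    return assignment

# Toy model: links clash when their (hypothetical) identifiers are adjacent
links = [1, 2, 3, 4]
print(lowest_first_assign(links, lambda a, b: abs(a - b) > 1))
# {1: 1, 2: 2, 3: 1, 4: 2}
```

Packing links toward channel 1 in this way is what produces the contraction of assignments toward the lower end of the band.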
 Plans have been run using the two ATPC methods described earlier: (1) Assigned EIRP is offset by a constant positive amount from EIRPnon-ATPC − FM. (The offset is variously called ‘remote fade margin’ or ‘operating margin.’ This method assumes the ATPC equipment is capable of covering the difference between the remote fade margin and the fade margin.) (2) Assigned EIRP is offset by a variable negative amount from EIRPnon-ATPC. (The offset is normally the ATPC range: however, the reduced EIRP is constrained to provide the required remote fade margin.) Plans have also been run for two orderings of the link data, forward and reverse, which tests the stability of the results against assumptions about link geometry.
 The effect of introducing ‘ideal’ ATPC on all links is apparent when comparing Figures 4 and 5 (RFM is 5 dB): both the number of assignments in the first channel and the maximum bandwidth are significantly improved. The number of links assigned to the first channel rises from 51% to 75%; the maximum bandwidth decreases from 280 MHz to 168 MHz.
 A more realistic method of modeling ATPC was also considered, in which the non-ATPC EIRP was backed off by a constant offset (i.e., the ATPC range), subject to satisfying the RFM. The following plan (Figure 6) was produced for an assumed ATPC range of 10 dB and an RFM of 5 dB: the result of imposing the constraining effect of a limited ATPC range is to reduce the plan efficiency as compared with the ‘ideal’ case (Figure 5). The statistics of these plans are shown in Tables 1–3.
Table 1. Plan Statistics for Non-ATPC
Table 2. Plan Statistics for All ATPC With RFM Equal to 5 dB
Table 3. Plan Statistics for All ATPC With ATPC Range Equal to 10 dB and RFM Equal to 5 dB
(Each table reports the channel spacing (MHz), maximum channel number, number of channels used, total number of assignments, and number of assignments in channel 1.)
4. ATPC Behavior in the Presence of Rain
 The rainfall rate fields used in this research were obtained by means of the Chilbolton Advanced Meteorological Radar (CAMRa), which is located in Hampshire in the south of England, at latitude 51°9′N and longitude 1°26′W. The climate is temperate maritime, with an average annual rain rate exceeded for 0.01% of the time of approximately 22.5 mm/h. The radar comprises a 25 m steerable antenna equipped with a 3 GHz Doppler-polarization radar, and has an operational range of 100 km and a beam width of 0.25°. To avoid reflections from ground clutter, maps of the rain rate field near the ground were produced by scanning at an elevation of 1.2°. These maps are produced on a polar grid, with a range resolution of 300 m and an angular resolution of 0.3°. The number of maps produced in a given time period depends on the total angle scanned; the radar has a maximum angular velocity of 1°/s.
 The radar scans were interpolated onto a square Cartesian grid, with a grid spacing of 300 m and a side length of 56.4 km (188 pixels square). The grids are separated in time by approximately 2 min.
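The polar-to-Cartesian step might be implemented along the following lines (a nearest-neighbour sketch only; the project's actual interpolation scheme is not specified in the text, and the grid parameters other than the 300 m spacing and 188-pixel side are taken from the radar description above):

```python
import math

def polar_to_cartesian(polar, r_res_m=300.0, az_res_deg=0.3,
                       grid_m=300.0, n=188):
    """Resample a polar rain map (polar[range_bin][azimuth_bin], mm/h)
    onto an n-by-n Cartesian grid centred on the radar, using
    nearest-neighbour lookup."""
    half = n * grid_m / 2.0
    grid = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            x = -half + (j + 0.5) * grid_m   # pixel centre, metres
            y = -half + (i + 0.5) * grid_m
            r = math.hypot(x, y)
            az = math.degrees(math.atan2(y, x)) % 360.0
            rb, ab = int(r / r_res_m), int(az / az_res_deg)
            if rb < len(polar) and ab < len(polar[rb]):
                grid[i][j] = polar[rb][ab]
    return grid

# A uniform 5 mm/h polar field, resampled onto a small 4 x 4 test grid
polar = [[5.0] * 1200 for _ in range(200)]
grid = polar_to_cartesian(polar, n=4)
```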
 The analysis tool takes a plan generated by the planning tool (or by another process) and applies a sequence of rain fields, evaluating system performance as measured by outage probabilities. For each rain field, the fade on each link is calculated, which then allows the EIRP uplift to be determined for each ATPC link. Every link is then tested in turn against all interfering paths, for all rain fields, and the number of outages recorded (distinguishing between those outages directly caused by a rain fade and those outages caused by ATPC-enhanced interference); the ATPC-induced outage counts reported here are ‘extra’ outages (i.e., those outages occurring in a link that is not also in outage because of a rain fade that exceeds its fade margin).
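The outage classification step can be sketched as follows (a minimal illustration with a hypothetical per-link record structure; the real tool evaluates every interfering path for every rain field):

```python
def classify_outages(links):
    """Count outages for one rain field. Each link record (hypothetical
    structure) carries: fade_db, fade_margin_db, wu_actual_db,
    wu_required_db."""
    rain, atpc_extra = 0, 0
    for lk in links:
        if lk["fade_db"] > lk["fade_margin_db"]:
            rain += 1          # direct rain outage: fade exceeds margin
        elif lk["wu_actual_db"] < lk["wu_required_db"]:
            atpc_extra += 1    # 'extra' outage: interference only
    return rain, atpc_extra

field = [
    {"fade_db": 12, "fade_margin_db": 10, "wu_actual_db": 35, "wu_required_db": 30},
    {"fade_db": 3,  "fade_margin_db": 10, "wu_actual_db": 25, "wu_required_db": 30},
    {"fade_db": 1,  "fade_margin_db": 10, "wu_actual_db": 40, "wu_required_db": 30},
]
print(classify_outages(field))   # (1, 1)
```

The `elif` reflects the definition above: a link already in outage because of a rain fade is not also counted as an ATPC-induced 'extra' outage.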
 The rain fields with measured rainfall rates are 56.4 km square. In order to avoid edge effects, the analysis of link performance is not performed on the whole area, but on a smaller ‘test’ area; interference, however, is considered from links throughout the entire area (the ‘background’). The results presented here are for a test area of 35 km.
 The number of detected outages will depend upon the severity and distribution of the rain. Three types of measured rain data were used (convective, stratiform and frontal). The maximum rainfall rates for the measured convective, stratiform and frontal rain data were 52.5, 45.7 and 95.5 mm/h, respectively. (Please note that the maximum rainfall rate for the measured stratiform event is somewhat misleading: the effective maximum is ∼30 mm/h.)
 Ideally, a measured rain database consisting of an entire year's worth of rain events should be used in the analysis software instead of a small number of (admittedly intense) events. However, it was not possible to do this, as such a rain database, with the required small scale spatial and temporal resolution, does not exist in the UK. For this reason, a simulated annual rain field database was used in order to validate the results found using the measured data.
 The effect of introducing ATPC is apparent in Figure 7a, in which the number of extra, ATPC-induced outages rises as the proportion of ATPC links increases. Figure 7a plots the number of outages, rather than the percentage, to avoid mistakenly interpreting the results as annual statistics. These results were produced using the measured rain data, hence the probability of each measured rain event is not known and the results cannot therefore be compared directly with planned unavailability.
 In the example shown, the number of extra outages is 12% of the total. The number of outages directly caused by rain also increases with ATPC penetration, even though ATPC does not, in itself, reduce the protection a link has against rain fading: this rise is caused by the progressive withdrawal of ‘excessive’ fade margin as non-ATPC links with the 10 dB minimum fade margin are replaced by ATPC links with a lower fade margin (e.g., an RFM of 5 dB). The results also show the trade-off introduced by assumptions about ATPC equipment capability: a larger ATPC range results in a more efficient plan because EIRPs are minimized; however, if the ATPC range is smaller than typical FM–RFM values then some links will have ‘excessive’ RFM, and will be better protected against interference (e.g., if the fade margin for a link is 25 dB and the ATPC range is 10 dB, then the link will operate at 25 dB − 10 dB = 15 dB above RSL, even if the required RFM is only 5 dB). Matching the ATPC range and remote fade margin appears to be a very effective method of reducing ATPC-induced outages (see the curve for an ATPC range of 10 dB and an RFM of 10 dB).
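The margin arithmetic in this trade-off can be checked directly (the 25 dB/10 dB/5 dB figures are from the worked example in the text; the function itself is a sketch):

```python
def margin_above_rsl(fade_margin_db, atpc_range_db, rfm_db):
    """Clear-sky margin above the RSL for an ATPC link: the link backs
    off by its ATPC range, but never below the required RFM."""
    return max(fade_margin_db - atpc_range_db, rfm_db)

# FM = 25 dB, range = 10 dB, RFM = 5 dB: operates 15 dB above RSL,
# i.e., 10 dB of 'excessive' RFM protecting it against interference
print(margin_above_rsl(25, 10, 5))    # 15

# Matching the range and RFM (both 10 dB here) removes the excess
print(margin_above_rsl(20, 10, 10))   # 10
```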
 It should be noted that the measured frontal event used here is an extremely rare event. On average, a rain rate of 25 mm/h is exceeded for 0.01% of the year (or 52 min), whereas in the frontal event this rain rate is exceeded for approximately 0.2% of the event duration. A 60 mm/h rain rate is exceeded for approximately 0.001% of the time in both the annual curve and the frontal event curve.
 It has been shown earlier that improvements in band efficiency result from the introduction of ATPC, but that additional outages then occur (during intense rain). The question then arises of whether changes to the planning process could be made that retain the efficiency gains but reduce the number of ATPC-induced outages. Three such experimental adjustments have been investigated: (1) Increasing the fade margin for all links. (2) Increasing (or decreasing) the required W/U ratios for all links. (3) Increasing the interference margin.
 Increasing the fade margin (i.e., EIRP) of all links is not effective: as expected, the band efficiency is reduced somewhat, but without reducing the number of ATPC-induced outages. As the fade margins increase, the total number of outages decreases because of the extra protection against direct rain outages, but those links affected by a nearby ATPC link receive no specific protection. An increase of 1 dB in fade margin ‘cancels’ the extra outages caused by ATPC during the frontal rain event (for an ATPC range of 10 dB and an RFM of 5 dB).
 The second mitigation approach was to vary the W/U ratios used in planning, in the expectation that this would specifically provide extra protection against interference. Adjusting the W/U ratios in the planning process and then using the same adjusted values to judge whether an outage occurs in the rain analysis produces the paradoxical result that increasing the W/U ratios actually increases, not decreases, the number of outages. In a relatively efficient plan, there will always be a large number of interfering paths with small clash test excesses, ‘ready’ to cause interference. Band efficiency, however, behaves as expected: increasing the required W/U results in a relatively less efficient plan; decreasing the required W/U results in a relatively more efficient plan.
 The final approach is to adjust the interference margin. As expected, increasing the interference margin increases efficiency (though with diminishing effect), whereas there is effectively no change in the number of ATPC-induced outages.
 In summary, adjusting W/U in the planning process is a more effective technique for reducing ATPC-induced outages than adjusting the fade margins or interference margin. However, it is evident that none of these band-wide mitigation techniques targets the ATPC-induced outages very effectively.
Figure 8 shows a series of simulated outages, two caused by ATPC-induced interference and one caused by excessive fading of the wanted link. The ‘signal excess’ shows when a link outage is caused by excessive fading of the wanted link (i.e., fade > fade margin); a curve with ATPC applied is also shown (i.e., with remote fade margin replacing fade margin). The ‘clash test excess’ shows when a link outage is caused by interference (i.e., W/Uactual < W/Urequired).
Callaghan and Vilar and Callaghan have described the generation of a database of rain fields which can be used to expose fixed link plans to simulated annualized rain. An important step in gaining confidence in the annualized outage percentages obtained from runs of the fixed link planning simulator is to show that the simulator does indeed produce outages at the planned rate for a plan constructed under standard planning assumptions (i.e., no novel technologies). To examine the performance of the simulator, a standard plan was constructed and the outage rate measured in response to a sample of annualized rain fields. The results show that a plan constructed with the objective of achieving a 0.01% unavailability has, when exposed to simulated annualized rain, a measured unavailability of 0.008%. This is close enough to demonstrate the general method. However, it is probable that further improvements to the simulated rain might improve the ability of that rain to provoke the expected unavailability.
Figure 7b shows the outage rates of a series of plans using the annualized database. The trends are the same as for the intense frontal event. However, the outage scale can now be expressed in units that can be compared directly with planned link unavailabilities (e.g., 0.01%), which is a significant advance. The authors believe that matching the ATPC range and the remote fade margin would reduce the percentage of extra ATPC-induced outages even further.
5. Conclusions and Further Work
 The following conclusions regarding the spectrum efficiency gains resulting from the implementation of ATPC have been identified by this study:
 1. The implementation of ATPC in the 38 GHz band gives significant improvements in spectrum efficiency as measured by the increase in the number of links assigned to channel 1 (from ∼50% to ∼70%) and the decrease in the maximum bandwidth used (from ∼300 MHz to ∼180 MHz). The introduction of ATPC does give rise to a number of additional outages in the presence of intense rain (a ∼12% increase during the frontal rain event). These additional outages can be mitigated by matching the ATPC range with the remote fade margin, but they cannot be wholly eliminated by the methods examined here. When exposed to an annualized simulated rain database, however, the number of extra outages falls to 2.6%.
 2. Adjusting W/U in the planning process is a more effective technique for reducing ATPC-induced outages than adjusting the fade margins or interference margin. However, it is evident that none of these band-wide mitigation techniques targets the ATPC-induced outages very effectively.
 3. On the basis of the similarity of average fade margins between the 38 GHz band and other high-frequency fixed link bands, gains in spectrum efficiency should equally be possible in those other bands.
 Previous work attempting to assess the impact of new assignment methods or technologies on planned link availabilities was restricted to relative comparisons because the use of a limited database of measured rain rate fields meant that annual outage rates could not be determined. This study therefore sought to remove this issue by using sequences of simulated rain fields that, together, represent annual rain statistics. This allows simulated link availabilities to be compared directly with planned availabilities. The method relies on the generation of simulated stratiform and convective rain fields, which are then scaled to fit the tail of the ITU-R rain rate distribution and combined in proportions appropriate to annual rain statistics. The method has been proved by generating a plan with known link availabilities and then testing to see whether the links respond in the appropriate way to the scaled, mixed, annualized collection of rain fields.
 The results from this test show that a plan constructed with the objective of achieving a 0.01% unavailability has, when exposed to simulated annualized rain, a measured unavailability of 0.008%. This is close enough to prove the concept, and it is anticipated that further developments in the rain field model will improve the ability of the simulated rain to provoke the expected unavailability.
 Replanning a band can be done quickly and easily from the spectrum manager's point of view, but requires considerable expense and effort by the link operators in order to retune all their links. As there are significant numbers of legacy links in the 38 GHz band, future work should be to investigate how effective ATPC would be at increasing the number of links in an already congested band, where preexisting links may not have their frequencies reassigned. The final report of this project [Callaghan et al., 2005] is available online at http://www.ofcom.org.uk/research/technology/overview/ese/atpc/atpcfinal2.pdf.
 The research presented in this paper was funded by the UK's Ofcom as part of the Spectrum Efficiency Scheme.