Geomagnetically Induced Currents and Harmonic Distortion: High Time Resolution Case Studies

High time resolution (1–5 s) magnetometer, geomagnetically induced current (GIC), and mains harmonic distortion data from the Halfway Bush substation in Dunedin, New Zealand, are analyzed. A recently developed technique using very low frequency (VLF) radio wave data provides high‐resolution measurements of mains harmonic distortion levels. Three case studies are investigated, each involving high rates of change of local geomagnetic field, but with different timescales of magnetospheric driver mechanisms, and different substation transformer configurations. Two cases of enhanced GIC during substorm events are analyzed, and one case of a storm sudden commencement. Time delays between magnetic field fluctuations and induced transformer currents are found to be ~100 s for substorm events, but only ~20 s for the storm sudden commencement containing higher‐frequency variations. Boxcar averaging of the magnetic field fluctuations using running windows of ±2 min leads to spectral power profiles similar to those of GIC profiles, with reduced power at frequencies >0.003 Hz (periods <5 min). Substantially lower mains harmonic distortion levels were observed after the removal of the single phase bank transformer, HWB T4, from the high‐voltage configuration at Halfway Bush. No systematic time delay was found between GIC variations and mains harmonic distortion levels. The power spectra of magnetic field fluctuations and GIC variations during the storm sudden commencement with no harmonic distortion showed low levels of low‐frequency power (<0.003 Hz, periods >5 min). This low‐frequency component of the magnetic field power spectrum appears necessary for mains harmonic distortion to occur.


Introduction
Disturbances of the geomagnetic field can lead to rapid fluctuations in ionospheric currents (Birkeland, 1908; Cummings & Dessler, 1967), which result in quasi-direct currents (DCs) being induced in the surface of the Earth (known as geomagnetically induced currents, GICs). These quasi-DCs have the potential to appear in high-voltage power systems (Molinski, 2002; Pirjola & Boteler, 2017) and are known to have caused disruptions to equipment connected to long transmission lines since the 1840s (Boteler et al., 1998). High levels of GIC can affect high-voltage transformers in a number of ways, including increased reactive power consumption, waveform harmonic distortion, and even damage or destruction (Boteler, 2015, 2019; Rodger et al., 2020; Samuelsson, 2013).
The creation of harmonic distortion components, as a result of a high-voltage transformer being driven into half-cycle saturation by GIC, was investigated by Clilverd et al. (2018), building on earlier observations by Boteler et al. (1989), Girgis and Vedante (2015), and Hayashi et al. (1978). Clilverd et al. (2018) used colocated measurements of geomagnetic field perturbations, GIC levels, and very low frequency (VLF) radio waves made at a substation in Dunedin, New Zealand, to show that remote monitoring of transformer saturation effects was possible. Even order harmonics up to 1,500 Hz were observed on 8 September 2017, generated during a large geomagnetic storm. Time-dependent variations in the intensity of the VLF harmonics, measured at 5-s resolution, were shown to be similar to the variations in GIC levels. This suggested that harmonic measurements could potentially be used as a proxy for GIC flowing in high-voltage power systems.
Subsequently, Rodger et al. (2020) provided a comprehensive analysis of even order harmonic distortion measurements throughout the whole of New Zealand. Low time resolution (10-min) distortion measurements logged by the network owner, Transpower Ltd, were analyzed to provide an extensive set of observations that far exceeded the set of GIC measurements. Evidence of transformer saturation effects, using the distortion measurements, was found at many locations during two large geomagnetic disturbances (March 2015 and September 2017). Some locations that were not previously expected to be at risk, that is, in the comparatively low latitude North Island, showed evidence of transformer saturation, and as such, those sites did not have GIC monitoring in place. However, the low time resolution (10-min) sampling of harmonic distortion available in that study would not be fast enough to fully represent all of the variations in GIC. This is particularly true for large short-lived bursts of GIC that have been associated with transformer failures in the past (Béland & Small, 2004; Guillon et al., 2016; Marshall et al., 2012; Rodger et al., 2017, and references therein), where timescales of 1 or 2 min were found to be critical in transformer damage.
The magnitude of GIC in power systems is initially dependent on complex magnetospheric processes driving geomagnetic field variations. This is followed by underlying surface conductivity structures, which can act as a low-pass filter for the fluctuations, along with the high inductance of the current path through the power system due to the transformer windings (e.g., Divett et al., 2017). In order to understand the response of high-voltage transformers to geomagnetic disturbances, high time resolution measurements are necessary, especially at middle-low latitudes where large GIC have been shown to be driven by storm sudden commencements and sudden impulses (e.g., Carter et al., 2015; Fiori et al., 2014; Marshall et al., 2013; Rodger et al., 2017; Watari et al., 2009). Rodger et al. (2017) studied 25 geomagnetic storms that generated large GIC in New Zealand and found that many of the peak current values occurred at the start of the storm, particularly during storm sudden commencements. The study concluded that high time resolution measurements during GIC events, that is, 5-s sampling, would be of value to power grid operators.
However, long-lasting GIC events, potentially caused by phenomena such as substorms, also have the potential to generate long-lasting impacts on transformer performance. For example, they result in more transformer heating than shorter-lived events, particularly for transformer components with long timescale responses, that is, the protective oil (Viljanen et al., 2001). Clilverd et al. (2018) also showed that long-lived GIC events of 30-60 min were capable of generating signatures of harmonic distortion associated with transformer DC saturation. In an analysis of over 100 years of midlatitude magnetometer data from the British Geological Survey observatories in the United Kingdom, Freeman et al. (2019) showed that substorms were the primary source of the highest rates of change of magnetic field components. Whatever the source and duration of the event, it is useful to understand the key magnetic field variation frequencies involved in generating the GIC observed. Typically, a geomagnetic disturbance produces magnetic field fluctuations covering a wide range of frequencies; however, localized surface conductivity structures have the potential to low-pass filter the time-varying field components, leading to GIC variations with less short-timescale variability than the magnetic field. Ingham et al. (2017) used transfer function analysis to identify that fluctuations with periods less than 2 min make a significant contribution to overall GIC levels. These results are in contrast to the typical 10-min period used in reproducing GIC observations with the thin-sheet model developed by Richardson and Beggan (2017), based on earlier work of McKay (2003), and adapted for New Zealand by Divett et al. (2017).
In this study we analyze magnetometer time series, GIC time series, and VLF radio wave time series from instrumentation focused on the Halfway Bush (HWB) substation in Dunedin, New Zealand. The time series sampling rates range from 1 to 5 s (section 2). Three case studies are investigated, each driven by periods of high rates of change of local geomagnetic field, but with different timescales of magnetospheric driver mechanisms, and with different substation transformer configurations (section 3). We compare each time series to determine response lag times, frequency sensitivity, and correlation levels between the rate of change of the local magnetic field and the resultant induced currents in the HWB T6 transformer (section 4.1). Further comparisons between the induced currents and VLF harmonic time series are analyzed to identify GIC saturation of a transformer leading to harmonic distortion components, determine transformer response lag times, and identify any induced current level threshold required for distortion to occur under different case scenarios (section 4.2). Conclusions are summarized in section 5.

Two of the case studies come from a severely disturbed period on 7-8 September 2017 driven by a large coronal mass ejection (CME) that occurred on 6 September 2017 (Clilverd et al., 2018). The first case is a large substorm event ~12 hr after the onset of the geomagnetic storm. Another substorm event (Case 2) comes from a geomagnetic storm period on 26 August 2018. The event occurred ~14 hr after the onset of a geomagnetic storm with no storm sudden commencement, and relatively low solar wind speeds (~500 km s−1). However, the geomagnetic disturbance index, Dst, reached its minimum of −174 nT at the time of the GIC event. The third study event (Case 3) is from the disturbance caused by the storm sudden commencement following a solar wind shock arrival on 7 September 2017, with solar wind speeds reaching >700 km s−1.

New Zealand GIC Observations
Here we use the Transpower New Zealand Limited HWB T6 transformer DC neutral current measurements from the Halfway Bush substation in Dunedin (see the maps in Figures 2 and 3 of Clilverd et al., 2018, and Figure 1 of Rodger et al., 2020, for locational context). Details of the LEM Hall effect measurement instrument and the identification of the GIC component in the neutral Earth current data set are described in detail in Mac Manus et al. (2017). An important expansion of the HWB T6 data for this study is the use of the data set at its nominal 4-s resolution. The LEM device that performs the actual measurement is analogue and provides a real-time current signal to the local station remote terminal unit (RTU). The RTU then samples the real-time signal from the LEM (at 4-s resolution) for real-time indication, and alarms, provided to Transpower operators. However, a new measurement is only logged if it deviates by more than 0.2 A from the last logged value. Typically, this means that data gaps are 4, 8, or 12 s during events when current levels are changing, but can stretch to 1 or 2 min during quiet times. Uncompressing the data assumes the same current value for successive 4-s periods until a new, different value is recorded. In the events analyzed here the average data sample time gap was found to be ~8 s, so data sets were typically doubled in size as a result of the decompression process.
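The decompression step described above amounts to a zero-order hold onto the uniform 4-s grid. The following is an illustrative reconstruction of that procedure, not Transpower's code; the function and variable names are invented for this sketch:

```python
# Sketch of the decompression described above: the RTU logs a new 4-s sample
# only when it differs by more than 0.2 A from the last logged value, so the
# gaps are filled by holding the last logged value (zero-order hold).

def decompress(logged, t_start, t_end, step=4):
    """Expand deadband-compressed (time, amps) pairs onto a uniform grid.

    `logged` is a list of (timestamp_s, current_a) tuples, sorted by time.
    Between logged points the current is assumed unchanged.
    """
    out_t, out_i = [], []
    idx = 0
    current = logged[0][1]
    for t in range(t_start, t_end + 1, step):
        # advance to the most recent logged sample at or before time t
        while idx < len(logged) and logged[idx][0] <= t:
            current = logged[idx][1]
            idx += 1
        out_t.append(t)
        out_i.append(current)
    return out_t, out_i

# Example: samples logged at 0, 8, and 20 s (i.e., 8- and 12-s gaps)
t, i = decompress([(0, 1.0), (8, 1.3), (20, 0.9)], 0, 20)
# t -> [0, 4, 8, 12, 16, 20]; i -> [1.0, 1.0, 1.3, 1.3, 1.3, 0.9]
```

With an average logged gap of ~8 s, this hold-last-value expansion roughly doubles the number of samples, consistent with the data set sizes quoted above.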
Previous studies of GIC from the Halfway Bush substation have concentrated on the HWB T4 transformer there (e.g., Clilverd et al., 2018). This was primarily because of the historical importance of that particular transformer, which suffered a failure in November 2001 (Béland & Small, 2004; Marshall et al., 2012) caused by high levels of GIC. In this analysis we concentrate on measurements from the HWB T6 transformer because, in the 11-month interval between the two geomagnetic storms analyzed here, the HWB T4 transformer was decommissioned, that is, in December 2017. We note, however, that during the GIC events in September 2017, the HWB T4 and HWB T6 transformers experienced very similar GIC levels, both in magnitude and time variation. Figure 1 shows two single line diagrams of the Halfway Bush substation high-voltage connections, with and without the HWB T4 transformer in the configuration. One of the questions addressed here is as follows: Did the removal of the HWB T4 transformer from the substation result in significant reductions in measured harmonic distortion during subsequent GIC events?

Figure 2 provides an overview of the GIC levels in the three cases analyzed. Panels (a)-(c) show the cases in the order that they are described in the rest of this manuscript. Each panel has the same x axis period (just over an hour) and the same y axis range of induced current. An observed start time of enhanced GIC is given by a vertical red dash-dotted line in each panel, arranged such that the start times of each case are aligned within the plot window. Start times are indicated where the GIC exceeds ±5 A, or when rapid fluctuations of the local magnetic field are observed. The induced currents exhibit distinct differences from case to case, in terms of time variations and the sign of the currents.
Panels (a) and (b) show long-lasting events that occurred during the main/recovery phase of their respective storms, while the event in panel (c) was caused by a storm sudden commencement and shows large induced currents predominantly at the very start of the event.

Magnetometer and VLF Harmonic Time Series
Local magnetic field measurements were made at Swampy Summit close to Dunedin, 7 km distant from the Halfway Bush substation. Instrument details are given in Clilverd et al. (2018). Following previous analysis we use the rate of change of the horizontal magnetic field component, dH/dt, which has been shown to be highly correlated with GIC magnitude (Mac Manus et al., 2017). The magnetic field data were initially sampled at ~1-s resolution, although for this study we have subsampled to 4 s in order to compare with the 4-s sampling of the GIC.
Broadband VLF measurements with 46-Hz bins were made at the security perimeter fence of the Halfway Bush substation site. The frequency resolution of the bins is small enough to resolve individual harmonic components of the 50-Hz mains frequency. The time sampling of the measurements was 5 s. Analysis here is restricted to an upper limit of the 24th harmonic of the fundamental 50 Hz, that is, up to 1,200 Hz. Details of the VLF instrument design and capabilities are given in Clilverd et al. (2009, 2018).
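The reason the 46-Hz bins can resolve individual 50-Hz harmonics is that the bin width is narrower than the harmonic spacing, so no bin can contain two harmonics. A minimal numerical check of this, assuming the bins start at 0 Hz (the exact bin alignment of the receiver is an assumption here, and `bin_for_harmonic` is a hypothetical helper):

```python
BIN_WIDTH = 46.0   # Hz, spectral bin width of the VLF receiver
MAINS = 50.0       # Hz, fundamental mains frequency

def bin_for_harmonic(n):
    """Index of the 46-Hz bin containing the nth 50-Hz harmonic,
    assuming bins are contiguous starting at 0 Hz."""
    return int(n * MAINS // BIN_WIDTH)

# Because 46 Hz < 50 Hz, every harmonic up to the 24th (1,200 Hz)
# lands in its own distinct bin:
bins = [bin_for_harmonic(n) for n in range(1, 25)]
```

Note that some bin indices are skipped (e.g., no harmonic falls between 550 and 600 Hz in the same bin), which is why each line in Figure 4 can be labeled unambiguously by its harmonic.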

Case 1: 8 September 2017
The first case study period is taken from the recovery phase of a large geomagnetic disturbance in September 2017. Clilverd et al. (2018) described the source driving mechanism as a magnetospheric substorm. Figure 3 provides a comparison between the local horizontal magnetic field variation (panel a) and the GIC recorded on HWB T6 (panel b). The time sampling for both panels is 4 s, as previously discussed in section 2. The vertical dash-dotted line indicates the event onset time. Prior to this time the dH/dt fluctuations are near zero, which is consistent with the near-zero GIC levels. After the event onset, rapid fluctuations in the magnetic field H component occur, with both positive and negative gradients. By the end of the plotting period the dH/dt is oscillating around zero again with no significant offset, but still exhibiting high-frequency variations. The HWB T6 GIC panel shows fewer high-frequency variations, with large positive currents occurring during the first hour of the event, followed by some negative current swings before the values return to near-zero levels at the end of the plotting window. Panel (c) shows the dH/dt fluctuations again, but this time the plotted data have been smoothed with a running 40-point boxcar average, which has the effect of reducing the power of the high-frequency variations above 0.002 Hz (periods <500 s) by a factor of ~10 (see discussion in section 4.1). The similarity of the time variations of the GIC in panel (b) and the low-pass-filtered dH/dt in panel (c) is notable, particularly regarding the peaks and the negative-going swings.

The 40-point boxcar average was determined through an analysis of the time delay between the magnetic field variation and the consequent GIC response, as well as the dH/dt smoothing level required. All combinations of the two parameters (i.e., time delay and smoothing) were varied to determine the largest Pearson correlation coefficient between the two data sets. A time delay of 76 s, combined with a 40-point boxcar average, gave a Pearson correlation coefficient which peaked at 0.74 (significance <10−44 using 1,260 data points). For comparison, no time shift and no smoothing gave a correlation coefficient of 0.37 for the same data set. No time shift, combined with a 40-point boxcar average, gave a correlation coefficient of 0.64. A time shift of 76 s with no boxcar averaging gave a correlation coefficient of 0.42.
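The two-parameter search described above can be sketched as follows. This is an illustrative reimplementation, not the authors' analysis code; the function name, parameter grids, and synthetic test signals are all assumptions, with 4-s samples so that a lag of 19 samples corresponds to 76 s:

```python
import numpy as np

def best_lag_and_smooth(dhdt, gic, lags, widths):
    """Grid-search the time shift (in samples) and boxcar width (in points)
    that maximize the Pearson correlation between smoothed dH/dt and GIC."""
    best = (-1.0, None, None)
    for w in widths:
        kernel = np.ones(w) / w
        smooth = np.convolve(dhdt, kernel, mode="same")  # running boxcar average
        for lag in lags:                                 # dH/dt leads GIC by `lag`
            if lag > 0:
                x, y = smooth[:-lag], gic[lag:]
            else:
                x, y = smooth, gic
            r = np.corrcoef(x, y)[0, 1]
            if r > best[0]:
                best = (r, lag, w)
    return best  # (max correlation, lag in samples, boxcar points)

# Synthetic check: make "GIC" a 40-point-smoothed, 19-sample-delayed
# copy of a random "dH/dt" series, and confirm the search recovers both.
rng = np.random.default_rng(0)
dhdt = rng.standard_normal(2000)
truth = np.convolve(dhdt, np.ones(40) / 40, mode="same")
gic = np.roll(truth, 19)
r, lag, w = best_lag_and_smooth(dhdt, gic, lags=range(0, 40), widths=(1, 10, 40, 78))
```

On real data the peak is of course broader and lower (0.74 for Case 1), but the search procedure is the same: every (lag, width) pair is evaluated and the maximum coefficient kept.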
Large GIC levels occurring in the Halfway Bush substation have the potential to generate harmonic distortion through half-wave saturation within a transformer (see the description in Rodger et al., 2020). The wideband VLF radio wave instrument located at Halfway Bush can observe enhancements in the intensity of mains frequency harmonics as a result of harmonic distortion, although it is not possible to distinguish which of the nearby transformers is generating the distortion signatures. Given transformer HWB T4's history of a previous failure during large GIC (Béland & Small, 2004; Marshall et al., 2012; Rodger et al., 2017), it would seem reasonable to assume that it may be the cause of the harmonics, but the perimeter VLF observations cannot resolve that question alone. Figure 4 shows the time variation of a selection of the frequency bins associated with mains harmonics, plotted during the time period covered by Case 1. The frequency variation of the harmonics during the disturbed period can be contrasted with a nearby quiet time period, shown in Figure S1 in the supporting information. Panel (a) shows the amplitude of even harmonics from 100 to 1,100 Hz (red and blue lines), with the harmonic order labeled in black. The panel also includes three odd harmonics that will be shown consistently in all cases studied here (plotted as black lines representing 250, 550, and 1,150 Hz). Each line is identified by the 50-Hz mains frequency harmonic that occurs within the 46-Hz-wide bin. The signal amplitude (vertical) scale in dB is the same for every harmonic, and the vertical offset between each pair of adjacent harmonics has been chosen to minimize overlap (while trying not to make the figure impracticably tall). This leads to a typical offset of 10 dB between each harmonic, which is indicated on the figure by a vertical bar.
Red lines indicate even harmonics (second to twelfth) that show variations with time that are consistent with the GIC variations shown in panel (b) of Figure 3. Blue lines indicate even harmonics which do not clearly show the same variation pattern (14th to 22nd). We note that this distinction is somewhat subjective, but the aim is to help simplify the panel. The black vertical dash-dotted line indicates the start of the event as identified in Figures 2 and 3, while red dash-dotted vertical lines are plotted to facilitate easier comparison of the timing of amplitude peaks in different frequency bins; they represent times of peaks in GIC, as described for the next panel, below.
Building on panel (a) of Figure 4, panel (b) shows the variation of the average signal intensity in all bins from 100 to 600 Hz, including all even and odd harmonic channels. A horizontal black dashed line is plotted to indicate the preevent intensity levels. The same levels occur at the end of the event at about 13 UT, followed by a sharp decrease below the line. It is unclear if this is a result of the GIC event; it is more likely to be associated with an occasionally observed dip in noise levels coming from the substation, which occurs approximately every 2 hr, is centered on the twelfth harmonic, and lasts about 5 min. The variation in the average signal between 100 and 600 Hz is compared with the absolute T6 GIC variation in panel (c). A common time sample interval of 20 s was obtained by subsampling each data set in order to make the figures and undertake correlation analysis. The Pearson correlation coefficient was calculated, varying the time delay between the harmonic distortion and GIC data sets. The maximum correlation coefficient was 0.94 (significance <10−44, for 252 data points), which was achieved when there was no time shift between the two data sets, that is, a shift of less than 20 s. The best fit is shown as a red line in panel (c), with a gradient of 0.33 dB/A.
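The comparison in panel (c) can be sketched as follows: resample both series onto a common 20-s grid, then fit the distortion level (dB) against the absolute GIC (A) to obtain the gradient. This is an illustrative sketch of the comparison, not the authors' code; the function name and the synthetic inputs are assumptions:

```python
import numpy as np

def distortion_gradient(t_harm, harm_db, t_gic, gic_a, step=20):
    """Resample both series onto a common `step`-second grid, then
    least-squares fit harmonic distortion (dB) against |GIC| (A)."""
    t0 = max(t_harm[0], t_gic[0])
    t1 = min(t_harm[-1], t_gic[-1])
    grid = np.arange(t0, t1 + 1, step)
    h = np.interp(grid, t_harm, harm_db)   # distortion on the common grid
    g = np.abs(np.interp(grid, t_gic, gic_a))
    slope, intercept = np.polyfit(g, h, 1)  # gradient in dB per ampere
    r = np.corrcoef(g, h)[0, 1]
    return slope, intercept, r

# Synthetic check: distortion rising at 0.33 dB/A above a 5-dB background
t = np.arange(0, 3600, 4)
gic = 35 * np.abs(np.sin(t / 600))
db = 5 + 0.33 * gic
slope, b, r = distortion_gradient(t, db, t, gic)
```

For the real Case 1 data the scatter is larger (r = 0.94 rather than ~1), but the fitted gradient plays the same role as the 0.33 dB/A red line in panel (c).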

Case 2: 26 August 2018
Previously, we have noted that the GIC levels measured at the HWB T4 and HWB T6 transformers were very similar during the Case 1 event. It is possible that HWB T4, rather than HWB T6, was responsible for the enhanced levels of harmonic distortion observed to be emanating from the substation, as HWB T4 is a single phase bank transformer, while HWB T6 has a three phase design. Therefore, in Case 2 we investigate the response of only HWB T6 and substation harmonic distortion levels to a GIC event which occurred after HWB T4 had been removed from operation. On 26 August 2018 there was a geomagnetic storm with moderately enhanced solar wind speed (~500 km s−1) but with large Dst (−174 nT).

Space Weather
CLILVERD ET AL.
The 78-point boxcar average of dH/dt was determined through the same analysis approach that was used for Case 1. All combinations of time shift and boxcar filtering factor between the dH/dt and GIC data sets were analyzed to find the largest Pearson correlation coefficient. A time delay of 128 s, combined with a 78-point boxcar average, gave a Pearson correlation coefficient which peaked at 0.83 (significance <10−44 using 900 data points).
The substorm-driven event in Case 2 exhibits similar ranges of dH/dt and HWB T6 GIC values compared with the substorm-driven event in Case 1, and also lasts for about the same length of time. However, the principal difference is that the Halfway Bush transformer HWB T4 was removed from operation between the dates of Case 1 and Case 2. Figure 6a shows the same individual 50-Hz mains harmonic components as in Figure 4. The start time of the event and the main two peaks of enhanced GIC are indicated by vertical dash-dotted lines. Peaks are visible at these times for some of the harmonics, especially the fifth, sixth, eighth, and eleventh. Notably, enhancements in amplitude were not observed in the 2nd and the 23rd harmonics. The frequency variation of the harmonics during the disturbed period can be contrasted with a nearby quiet time period, shown in Figure S2 in the supporting information. Panel (b) shows the average variation of all channels between 100 and 600 Hz. As before, panel (c) shows the relationship between the average 100- to 600-Hz harmonic distortion levels and the absolute GIC data. A common time sample interval of 20 s was used for each data set in order to make the plot and undertake a Pearson correlation, as in Figure 4 (for Case 1). The maximum correlation coefficient was 0.68 (significance <10−16 using 180 data points), with a boxcar averaging of six points, which was achieved when there was no time shift between the two data sets, that is, a shift of less than 20 s. The correlation is lower for Case 2 than Case 1, probably because of the lower amplitude harmonic distortion present. The best fit is shown as a red line in panel (c), with a gradient of 0.12 dB/A. This suggests that the amplitude harmonic distortion levels in Case 2 are about one third of those seen in Case 1.

Case 3: 7 September 2017
The third case study event comes from 7 September 2017 and is associated with the arrival of a solar wind pressure pulse just after 23 UT. The shock was identified by the SOHO satellite at L1 at 2238 UT [umtof.umd.edu/pm/FIGS.HTML], with the solar wind speed suddenly increasing from ~450 to ~700 km s−1. This particular case study event has been discussed by Clilverd et al. (2018) as "Event 3" and Rodger et al. (2020) as "Event 3a", where the apparent absence of harmonic distortion even in the presence of large induced currents was noted. During this event both HWB T4 and HWB T6 were operational at Halfway Bush substation. Figure 2c showed more than an hour of HWB T6 GIC data during this event. However, in Case 3 we concentrate on only the initial short period following the arrival of the pressure pulse, where large negative currents were measured as part of a rapid response to the solar wind shock arrival. Figures 7a and 7b show the magnetometer dH/dt and HWB T6 GIC, respectively, during the short period following the start of the event, indicated by a dash-dotted vertical line. Large negative rates of change of magnetic field correspond to large negative induced currents in HWB T6. The panels cover a period of 8 min, although the duration of large GIC values only lasts 3 min. Panel (c) shows dH/dt filtered by a 10-point boxcar average. As in the previous analysis, the maximum Pearson correlation between the dH/dt and the HWB T6 GIC data over this short period was determined. A maximum correlation coefficient of 0.89 (significance 4 × 10−25 using 80 data points) was found, associated with a delay of 20 s between the magnetic field fluctuations and the resultant GIC level, and filtering provided by 10-point boxcar averaging (40 s). The similarities between the filtered dH/dt and the GIC variations are notable.
Harmonic distortion levels are shown in Figure 8, following the same format as Figures 4 and 6. In panel (a) it is difficult to identify any clear amplitude enhancements in the individual channels. The frequency variation of the harmonics during the disturbed period can be contrasted with a nearby quiet time period, shown in Figure S3 in the supporting information. The average harmonic distortion in panel (b) shows a small enhancement in amplitude above the indicated baseline, although the largest peak occurs prior to the onset of the event. Panel (c) shows that there is little harmonic response to a relatively large range of current values. The Pearson correlation coefficient of 0.63 is the lowest of all cases, and the significance, p = 0.002, is the largest. The best fit is shown as a red line in panel (c), with a gradient of 0.04 dB/A, that is, about eight times lower than seen for Case 1, which occurred only 13 hr later.


Discussion
Three cases of rapid magnetic field fluctuations, with enhanced levels of GIC in transformer HWB T6 at Halfway Bush substation in Dunedin, New Zealand, and resultant mains harmonic distortion levels at the substation, have been investigated. Cases 1 and 2 describe the transformer/substation responses to the magnetic field disturbance caused by two similar substorms. Large induced current peaks between 45 and 50 A were seen in both cases, and the enhanced GIC lasted for about an hour each time. However, for Case 1 the substation had both transformers HWB T4 and HWB T6 operating, while for Case 2 HWB T4 had been withdrawn from service, and only HWB T6 remained. Case 3 describes the transformer/substation response to a storm sudden commencement which generated large GIC that lasted only 3 min. No obvious harmonic distortion was generated despite large induced current peaks of ~35 A being present. The results from each case study are summarized in Table 1 and discussed below.

During the two substorms, GIC variations were observed to occur in response to the fluctuating local magnetic field. Maximum correlation came from taking into account a time delay of ~100 ± 30 s, with dH/dt leading GIC. Maximum correlation also required some high-frequency filtering of the magnetic field fluctuations, achieved in this study through boxcar averaging of 240 ± 80 s. As a result of this treatment of the dH/dt time series, the typical correlation found was ~0.8, with high significance, for the ~1,000-point comparisons. Mac Manus et al. (2017) undertook correlation analysis between various GIC data sets throughout the South Island of New Zealand and magnetic field fluctuations recorded at Eyrewell near Christchurch, South Island. For a storm period on 2 October 2013, 1-min resolution data over an interval of 6 hr (i.e., 960 data points) showed correlation coefficients of ~0.75 for dH/dt. Those correlation coefficient values are comparable with the values found here.
Note that the use of 1-min averaging in the Mac Manus et al. (2017) study has the effect of filtering out high-frequency dH/dt fluctuations in a similar manner to the 240-s boxcar averaging used in this study. The 1-min averaging also reduces the influence of the ~100-s time delay between dH/dt and GIC.
The effect of boxcar filtering on the power spectra is shown in Figure 9. Panel (a) shows the normalized power spectrum for dH/dt (black line) and filtered dH/dt (blue line) for Case 1. Also shown is the HWB T6 GIC normalized power spectrum (red line) for comparison. Panel (b) shows the same format but for Case 2. The highest frequency of 0.125 Hz is determined by the 4-s sampling period, while the lowest frequency of 0.0002 Hz is determined by the ~1-hr duration of the events. For Case 1 the filtering is achieved using the 160-s boxcar average found in section 3.1, while Case 2 uses the 312-s boxcar average found in section 3.2. Both panels show that the effect of the filtering is to reduce the power in the high-frequency dH/dt components. For Case 1 reductions in spectral power by a factor of 10 occur at about 0.005 Hz (a period of ~3 min), while for Case 2 the reductions occur at 0.003 Hz (~6 min). This analysis suggests that dH/dt variations with periods longer than 3-6 min are likely to dominate the GIC response, and that power system modeling will be more accurate if the high-frequency dH/dt components are excluded. Case 2 is notable in that the filtered dH/dt has a very similar spectral variation to the power spectrum of the GIC during the event.
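The factor-of-10 reduction frequencies quoted above follow directly from the boxcar's frequency response. A minimal numerical sketch, using the continuous-time sinc approximation to an N-point boxcar's response (an approximation valid well below the Nyquist frequency; the function name is illustrative):

```python
import numpy as np

def power_reduction_freq(n_points, dt, factor=10.0):
    """Lowest frequency (Hz) at which an n-point boxcar (sample period dt)
    reduces spectral power by `factor`, using the |sinc| amplitude response
    of an averaging window of length n_points * dt seconds."""
    f = np.linspace(1e-5, 1.0 / (2 * dt), 200_000)  # up to Nyquist
    h = np.sinc(f * n_points * dt)                  # np.sinc(x) = sin(pi x)/(pi x)
    idx = np.argmax(h**2 <= 1.0 / factor)           # first crossing below 1/factor
    return f[idx]

f1 = power_reduction_freq(40, 4.0)  # Case 1: 160-s boxcar
f2 = power_reduction_freq(78, 4.0)  # Case 2: 312-s boxcar
```

For the 160-s window this gives a factor-of-10 power reduction near 0.005 Hz, and for the 312-s window near 0.002-0.003 Hz, consistent with the reductions read off Figure 9.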
The frequency smoothing and time delay found in these cases are likely to be due to the local ground conductivity structure in the region around the substation, and to the network inductance. The time-varying magnetic field driver is related to the geoelectric field variations through the ground impedance (e.g., Trichtchenko & Boteler, 2006). Ground conductivity profiles with lower conductivity at greater depth would reduce the influence of high-frequency components of the driving magnetic field compared to the low-frequency parts, resulting in the time domain smoothing found here. High network inductance is likely to introduce a time lag of the GIC compared to the magnetic field driver, with the lag duration dependent on the frequency components present (e.g., Ingham et al., 2017).

Both the Case 1 and Case 2 substorms generated similar peak GIC levels, that is, 45 to 50 A. During both of these cases, harmonic distortion in the range 100 to 600 Hz was observed at the Halfway Bush substation. Even order harmonic enhancements were observed, as well as odd order harmonics. However, the intensity of the harmonics (dB/A) was a factor of 2.75 lower in Case 2 than in Case 1. Also, the second harmonic did not show enhanced levels in Case 2, whereas it did in Case 1. High levels of the lower order harmonics (i.e., second or third) are thought to play a role in the incorrect operation of protective switching during GIC events, which can lead to power blackouts and the potential destabilization of the local power grid (Bolduc, 2002; Wik et al., 2009). The reduction of the harmonic distortion levels between Case 1 and Case 2 appears to be associated with the decommissioning of the HWB T4 transformer. The reduction in harmonic distortion suggests that HWB T6 experiences less half-cycle saturation than HWB T4 did, despite experiencing similar levels of GIC. The reduction in distortion by a factor of almost 3 is consistent with the three phase design of HWB T6 compared with the single phase bank design of HWB T4.
Pearson correlation coefficients were found between the observed GIC and harmonic distortion (100- to 600-Hz average) time series. Both data sets were subsampled to a common 20-s sample rate before investigating time delays and boxcar smoothing levels. Case 1 gave a maximum coefficient of 0.94 with no time delay and no smoothing required. Case 2 gave a maximum coefficient of 0.68, again with no time delay, but this time with 80-s boxcar averaging required (four points). The lower coefficient in Case 2 compared with Case 1 is consistent with the lower intensities of the distortion harmonics, and with background noise levels having more influence on the correlation. The lack of any detectable time delay (±10 s) between GIC levels and distortion intensity is somewhat surprising given the potential inductive latency of the transformers. However, this effect might be more obvious in the apparent threshold effect between rising GIC levels and the onset of observable harmonic distortion (reported by Clilverd et al., 2018), rather than in the well-correlated variations of the two time series once the threshold current levels have been surpassed. This idea is discussed in further detail at the end of the next subsection.
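A minimal sketch of this kind of lag-and-smoothing grid search follows. Only the 20-s common sample rate and the boxcar/lag idea come from the text; the function name, search ranges, and use of `numpy.corrcoef` are our own assumptions:

```python
import numpy as np

def best_lag_and_smoothing(gic, thd, max_lag=6, windows=(1, 2, 3, 4, 5)):
    """Grid search over integer lags (in 20-s samples) and boxcar widths
    (in samples) for the pairing that maximizes the Pearson correlation
    between GIC and harmonic-distortion series. Ranges are illustrative."""
    best = (-1.0, 0, 1)                      # (r, lag, window)
    for w in windows:
        kernel = np.ones(w) / w
        sm = np.convolve(thd, kernel, mode="same")   # boxcar-smoothed THD
        for lag in range(-max_lag, max_lag + 1):
            if lag > 0:                      # GIC leads the distortion
                a, b = gic[lag:], sm[:len(sm) - lag]
            elif lag < 0:                    # distortion leads the GIC
                a, b = gic[:lag], sm[-lag:]
            else:
                a, b = gic, sm
            r = np.corrcoef(a, b)[0, 1]
            if r > best[0]:
                best = (r, lag, w)
    return best

# Sanity check on synthetic data: identical series should give lag 0, r ~ 1
gic = np.sin(np.linspace(0.0, 20.0, 500))
r, lag, w = best_lag_and_smoothing(gic, gic)
```

Applied to the real series, the reported outcome is a maximum of 0.94 (no lag, no smoothing) for Case 1 and 0.68 (no lag, four-point smoothing) for Case 2.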
Low time resolution percentage total harmonic distortion (THD) levels during Case 1 were shown for Even and Odd harmonics by Clilverd et al. (2018). As a result of the substorm the Even THD reached ~0.8% while the Odd THD was ~0.4%. The steady-state upper THD recommendation for 69-161 kV systems is 2.5% (IEEE Std 519-2014). We can roughly estimate what harmonic amplitude level is likely to be seen when 2.5% distortion is present. From Figure 4c there is a 12-dB increase in the 100- to 600-Hz average amplitude for ~35 A GIC. The average (even and odd) THD percentage of (0.8 + 0.4)/2 = 0.6% therefore generates 12 dB of signal increase. Thus, to achieve 2.5% distortion levels, an extra 12-dB increase (20 × log(2.5/0.6)) in signal amplitude would be required. During Case 1, when HWB T4 was operating, the 0.33-dB/A scale factor for harmonic distortion due to GIC suggests that 24-dB distortion levels (i.e., 2.5% THD) would have been reached with 24/0.33 ≈ 70 A. This level is similar to the ~100 A estimated by Rodger et al. (2017) for the HWB T4 failure in 2001. During Case 2, when HWB T4 had been withdrawn from operation, the 0.12-dB/A scale factor leads to 24-dB (2.5% THD) distortion levels at ~200 A, suggesting that a much larger DC current would be required to generate significant distortion.
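The scaling argument above can be checked numerically. The sketch below simply reproduces the arithmetic in the text (0.6% THD corresponding to 12 dB, a 20 × log10 extrapolation, and the quoted dB/A scale factors); the function name is ours:

```python
import math

def gic_for_thd(target_thd_pct, ref_thd_pct=0.6, ref_db=12.0, db_per_amp=0.33):
    """Estimate the GIC needed to reach a target THD level, following
    the text: the reference case (0.6% THD at 12 dB above background)
    is extrapolated logarithmically, then divided by the dB/A factor."""
    extra_db = 20 * math.log10(target_thd_pct / ref_thd_pct)  # ~12.4 dB
    total_db = ref_db + extra_db                              # ~24 dB
    return total_db / db_per_amp                              # amps

case1 = gic_for_thd(2.5)                    # T4 in service: ~74 A
case2 = gic_for_thd(2.5, db_per_amp=0.12)   # T4 withdrawn: ~203 A
```

The unrounded values (~74 A and ~203 A) match the ~70 A and ~200 A quoted in the text.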

Response to a Storm Sudden Commencement: Case 3
Case 3 showed rapid variations in the local magnetic field, and peak GIC levels high enough to be expected to generate harmonic distortion as with the other events studied here. However, no harmonic distortion was observed during the sudden, short-lived, impulsive event. During the first few minutes of the event, when the peak GIC values of 35 A occurred, there was only a short delay between dH/dt and GIC variations (~20 s). Additionally, little boxcar averaging was required to maximize the Pearson correlation coefficient (0.89). This suggests a strong coupling between the local magnetic field perturbations and induced currents, but little impact of those large currents on the levels of harmonic distortion.
In the previous substorm analysis we found that dH/dt variations with periods longer than 3-6 min (<0.003-0.005 Hz) are likely to dominate the GIC response. This is investigated for the Case 3 storm sudden commencement event via Figure 10. The upper panel shows the spectral power for dH/dt during the whole event period (as shown in Figure 2c) plotted in comparison to the substorm, Case 1. The lower panel of Figure 10 shows the GIC power spectra for Case 3 compared with Case 1, in the same format. Comparisons are made with Case 1 rather than Case 2 (or both) because these events occurred within hours of each other and under the same substation configuration. The power values are normalized to the maximum power observed in Case 1. Both parameters show broadly similar spectra for Case 3 and Case 1. Vertical dash-dotted lines indicate two key frequencies. The higher frequency identified in the panels is 0.03 Hz (33-s period); above this frequency the power spectra for Case 1 and Case 3 are similar, in both the dH/dt and GIC panels. The lower frequency indicated is 0.003 Hz (333-s period). Between 0.003 and 0.03 Hz the Case 3 spectral power is noticeably higher than for the substorm Case 1, suggesting that this is a significant characteristic of the storm sudden commencement. Below 0.003 Hz the power in Case 3 is lower than for Case 1, suggesting that the sudden commencement does not generate low-frequency fluctuations in the same way as substorms.
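A simple normalized-spectrum comparison of this kind might look as follows. A plain FFT periodogram and the helper names are our assumptions; the paper does not state which spectral estimator was used:

```python
import numpy as np

def normalized_power(x, dt):
    """One-sided periodogram of a detrended series, normalized to its
    own maximum (stand-in for the estimator behind Figure 10)."""
    x = np.asarray(x) - np.mean(x)
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=dt)
    return freqs, power / power.max()

def band_power(freqs, power, f_lo, f_hi):
    """Summed power between two key frequencies (e.g., 0.003-0.03 Hz)."""
    mask = (freqs >= f_lo) & (freqs < f_hi)
    return power[mask].sum()

# Demo: a 0.01 Hz oscillation (100-s period) concentrates its power in
# the 0.003-0.03 Hz band, the band enhanced in Case 3.
t = np.arange(2000.0)                       # ~33 min at 1 s cadence
f, p = normalized_power(np.sin(2 * np.pi * 0.01 * t), dt=1.0)
```

Comparing `band_power` below 0.003 Hz and between 0.003 and 0.03 Hz for two events would reproduce the qualitative contrast drawn between Case 1 and Case 3.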
Additionally, our earlier analysis picked out frequencies below 0.003 Hz as being particularly important in the coupling between magnetic field perturbations and GIC that resulted in harmonic distortion. This suggests that the lack of a harmonic distortion response to the storm sudden commencement is due to less excitation of longer period perturbations (>5 min), both in the local magnetic field variability and in the resultant GIC variations. Five-minute periods are likely to represent the characteristic response time of the transformer, where the slow variation allows magnetic flux to build up, pushing the core into saturation and thus generating harmonic distortion (Albertson et al., 1992; Viljanen et al., 2001).
Other factors to consider in the generation of harmonic distortion are transformer power loading and AC system voltage: a transformer loaded to capacity will require more DC current to saturate than a lightly loaded one, and a higher system voltage will cause saturation to occur at lower DC currents. For Cases 1 and 3 the loading was 5.9 and 4 MVA, respectively (i.e., lightly loaded, as HWB T6 is rated for 150 MVA), and the system voltage was approximately 228 kV for both events. These effects can therefore be ruled out as explanations for the lack of harmonic distortion in Case 3.
The lack of harmonic distortion response to high-frequency perturbations of GIC allows us to make an estimate of the apparent threshold in current that is needed to generate any distortion. In Case 1, there is a steady increase in current from 0 to 30 A in 0.4 hr, or 75 A/hr. If harmonic distortion is driven by periods >5 min, this would suggest that a response is unlikely to be observed before this time, that is, before currents have reached ~7 A (75 × 5/60). For Case 2, the rate of increase was ~160 A/hr, suggesting a "threshold" level of ~15 A. These "threshold" levels are consistent with the level of 15 A observed by Clilverd et al. (2018). For Case 3, the GIC levels had returned to near 0 A well before 5 min had elapsed from the onset time (i.e., within 3 min), despite reaching substantial levels of ~35 A at one stage. Therefore, little harmonic distortion would be expected in this short-lived event, as was confirmed by the observations. Two rapid transformer failures have been well documented. The failure of the HWB T4 transformer at Halfway Bush in 2001 occurred in the same minute as the sudden storm commencement of a large geomagnetic storm (Rodger et al., 2017, 2020). The collapse of the Hydro-Québec power system in 1989 was triggered within 90 s of the start of a magnetic disturbance caused by a combined sudden storm commencement with a coincident substorm (Boteler, 2019). The collapse was initiated by harmonics triggering Static VAR Compensators to trip out. In the case study of the sudden storm commencement undertaken here (i.e., Case 3) no obvious harmonic distortion occurred within the first few minutes, which goes against the idea that harmonics may have been involved in these two failures. However, the collapse of the Hydro-Québec power system is known to have been triggered by the presence of harmonic distortion (Boteler, 2019).
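The "threshold" estimate is straightforward arithmetic; the sketch below uses the ramp rates quoted in the text (the function name is ours):

```python
def distortion_threshold(ramp_rate_a_per_hr, response_min=5.0):
    """Back-of-envelope 'threshold' GIC from the argument in the text:
    if distortion requires periods >5 min, no response is expected
    until the ramp has run for ~5 min, i.e., roughly rate * 5/60 amps."""
    return ramp_rate_a_per_hr * response_min / 60.0

case1_threshold = distortion_threshold(75.0)    # ~6.3 A (text rounds to ~7 A)
case2_threshold = distortion_threshold(160.0)   # ~13.3 A (text rounds to ~15 A)
```

For Case 3 the same logic gives no threshold crossing at all, because the GIC decayed to near zero within 3 min, before a 5-min response could develop.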
This suggests that unusually high power, low-frequency components were present in the magnetic field fluctuations, possibly as a consequence of coincident substorm and sudden commencement activity (Boteler, 2019). The 2001 failure of the T4 transformer at Halfway Bush is not thought to have occurred as a result of harmonic distortion. Of the three pathways to failure, heating, harmonics, or insulation breakdown (Boteler, 2015; Samuelsson, 2013), insulation breakdown through the cumulative effect of multiple overloads appears most likely in the case of HWB T4 (Béland & Small, 2004; Marshall et al., 2012).

Summary
In this study three different periods of enhanced GIC have been investigated using data collected from within and close to the Halfway Bush substation in Dunedin, New Zealand. Two cases of enhanced GIC during substorm events are analyzed, as well as one event driven by a storm sudden commencement. Three different data sets have been compared: local magnetic field perturbations, particularly dH/dt; GIC levels from transformer HWB T6 at Halfway Bush; and harmonic distortion levels determined from the substation perimeter using a VLF radiowave receiver. The time series involved were initially sampled at 1, 4, and 5 s, respectively, but the dH/dt were subsampled to 4 s for detailed magnetic field and GIC comparisons. Between the case events the historic transformer, HWB T4, was withdrawn from operation at Halfway Bush, allowing changes in local circuit configuration to be investigated. The following conclusions can be made:
1. Time delays between magnetic field fluctuations and induced transformer currents are found to be of the order of 100 s for the pair of substorm events studied here, but only 20 s for the storm sudden commencement case, which contained significantly higher frequency components.
2. Boxcar averaging of the magnetic field fluctuations using a running window of ~±2 min leads to spectral power profiles that are similar to GIC profiles, with reduced power at frequencies higher than 0.003 Hz (periods <5 min).
3. Pearson correlation coefficients between magnetic field fluctuations and induced transformer currents maximize at ~0.8 (p < 10^−44) when taking Points (1) and (2) into account. Analysis using 1-min average values effectively compensates for these time delays and spectral filtering requirements, consistent with the earlier results of Mac Manus et al.
4. Even and odd harmonics between 100 and 600 Hz were observed in both substorm events. However, substantially lower harmonic distortion levels were observed after the removal of the single phase bank transformer, HWB T4, from the high-voltage configuration at Halfway Bush.
5. Correlation coefficients between GIC variations and harmonic distortion levels were high during the substorm events, exhibiting a maximum of 0.94 (p < 10^−44) when HWB T4 was operating.
6. The power spectra of magnetic field fluctuations and GIC variations during the storm sudden commencement showed comparatively low levels of low-frequency power (<0.003 Hz, >5-min period). This component of the magnetic field power spectrum appears necessary for harmonic distortion to occur.
7. GIC levels of >7-15 A were observed to generate coincident harmonic distortion, but only when the GIC contained significant spectral power at periods >5 min. A storm sudden commencement with a peak current of 35 A, which faded after only 3 min, did not generate any obvious harmonic distortion.
The time delay and the boxcar averaging of magnetic field fluctuations determined here from Conclusions (1) and (2) suggest typical values for interpreting local magnetometer data in a GIC context. These findings show that local magnetic field and VLF measurements alone could be used to establish a method that would signal potential damage to power transformers caused by GIC.