Improved operational reliability and contaminant removal in water reuse through filter upgrades

The redesign, construction, and replacement of the filters used for water reuse at an 11 million gallon per day Florida municipal water reclamation facility provided a prime opportunity to evaluate the impact of filtration technologies on water quality, trace organic contaminant (TOrC) removal, and pathogen removal. This study was designed to capture operational data, pathogen removal (Cryptosporidium and Giardia), and TOrC removal before the replacement of the filters, using the existing synthetic media filters (SMF) and traveling bridge filters (TBF), followed by an evaluation of the deep bed filters (DBF) after construction and commissioning. The new DBF units provided substantially improved control of turbidity and total suspended solids, significantly improved removal of Giardia cysts (an increase of >2 log removal) compared to removal across the older SMF and TBF units, and minor improvements to Cryptosporidium oocyst removal. TOrC removal was not significantly changed when comparing removal across the SMF, TBF, and DBF units, nor across post-filtration chlorination (chloramination).


| INTRODUCTION
Water reclamation and water reuse provide vital environmental and community-level services through replenishment of local wetlands, lakes, rivers, and aquifers, and by providing water for agricultural, industrial, and recreational use that offsets the need for potable water. There are regional differences that drive the amount and types of applications of water reuse (USEPA, 2012), but in general nonpotable reuse is governed at the state level and is designed to meet specific treatment goals, typically focused on filtration and disinfection requirements for removal of pathogens, turbidity, and/or nutrients such as nitrogen and phosphorus. For example, in Florida nonpotable reuse for landscape irrigation with unrestricted access requires secondary wastewater treatment with "high-level disinfection" (per section 62-600.440 of the Florida Administrative Code [FAC]) using site-specific requirements for total chlorine residual (generally 1-1.5 mg/L) and total microorganism concentrations (generally 75% of samples below the method reporting level and no single sample exceeding 25 fecal coliforms per 100 ml). The FAC also requires a total suspended solids (TSS) load of not more than 5 mg/L. Typical nonpotable water reuse treatment processes (e.g., secondary wastewater treatment plus filtration and chlorination, ultraviolet light (UV) disinfection, or chloramination) are designed to minimize risks from microbial contaminants (Taran-Benshoshan et al., 2015) and provide a stable water for distribution to nonpotable customers, but they are not designed specifically to manage trace organic contaminants (TOrCs) that may be present in municipal wastewaters and their effluents (Huang et al., 2001; Kummerer, 2001; Snyder et al., 1999; Stumm-Zollinger & Fair, 1965).
The design and operation of granular media filters for water reclamation and reuse applications differ greatly from those for drinking water applications. Granular media filtration, sometimes called depth filtration, was originally developed for drinking water applications and then adapted for wastewater treatment, where it is usually referred to as tertiary treatment. Modern drinking water filters are designed and operated to produce a finished water turbidity of 0.1-0.5 NTU when used with a coagulation process ahead of the filters (Huck et al., 2000). While there is a range of designs, most filters consist of dual media (anthracite on top of sand), with the anthracite depth typically about 24 inches for conventional dual media filters and the sand depth around 12 inches, though this may vary widely (Edzwald, 2011). Deeper bed filters are also used in practice that contain up to 60 inches of anthracite over 12 inches of sand. The effective size of the anthracite typically ranges from 0.8-0.9 mm for conventional depth filters to 1.1-1.4 mm for deep bed filters (DBF). The sand is normally around 0.5 mm in drinking water applications but may be much larger in wastewater applications. Filtration design rates range from 4-6 gpm/ft² for typical dual media filters up to 8-10 gpm/ft² or higher for DBF, though actual operating rates may be much lower. To produce a finished water that meets regulatory standards and treatment goals, a coagulant is added to destabilize particles and improve the removal of natural organic matter (Huck et al., 2000). The filters are designed to remove particles in the raw water as well as any added or formed during treatment (from coagulant addition, the oxidation of dissolved metals, and the precipitation of natural organic carbon) that were not removed by clarification.
Wastewater filters are typically designed to produce an effluent with TSS concentrations below 10 mg/L and a turbidity range of 2-8 NTU, much higher than drinking water plants, and many do not use a coagulant unless they are targeting phosphorus reduction (Tchobanoglous et al., 2003). There are a wide variety of filter designs used in wastewater treatment plants to achieve these goals. Some are similar to those in drinking water plants (dual media that operate at filtration rates of 2-6 gpm/ft²), but most are monomedia designs using slightly larger media. Filtration in wastewater typically provides enhanced removal of Cryptosporidium oocysts and Giardia cysts as filter depth increases (Rose et al., 2004). The use of continuous backwashing traveling bridge filters (TBF) is also common in wastewater treatment plants, as is the use of synthetic or cloth media instead of sand, gravel, and anthracite (Tchobanoglous et al., 2003). One of the most important differences between wastewater treatment plant filters and drinking water filters is that coagulation, which is essential for meeting low effluent turbidity levels and optimizing pathogen removal, is not typically practiced in wastewater treatment plants. In wastewater treatment facilities, filtration is the primary mechanism for removal of (oo)cysts (e.g., Cryptosporidium, Giardia), and filtration increases the effectiveness of the disinfection process.
While significant information is available on the performance of granular media filters for removing Giardia and Cryptosporidium (oo)cysts in drinking water plants (Edzwald et al., 2003; Huck et al., 2002), there is relatively little published information on the impact of filter technology, media, design, and operation on the removal of TOrCs or Cryptosporidium and Giardia (oo)cysts at wastewater treatment facilities. However, results generally agree that media filtration provides improved pathogen removal versus cloth filters (Rose et al., 2004; Vaughn et al., 2005). While limited information exists regarding TOrC removal by filtration in general (Matamoros et al., 2007; Nakada et al., 2007; Yang et al., 2011), there have been no comprehensive studies that evaluate the effluent water quality impacts, in terms of pathogens or chemical contaminants, when changing from an older technology such as TBF to DBF at the same facility. The Loxahatchee River District's (LRD) filter replacement provided a prime opportunity to evaluate the impact of the filtration technologies on TOrC removal and disinfection while also evaluating overall improvements in water quality and treatment. This study was designed to leverage the opportunity to capture operational/water quality data, pathogen data, and TOrC data before the replacement of the filters using the existing synthetic media filters (SMF) and TBF, followed by evaluation of the same parameters through DBF after construction and commissioning, as shown in Figure 1.

| Description of LRD's water reclamation and reuse facilities
The LRD owns and operates a regional wastewater treatment facility that serves northeastern Palm Beach County and southeastern Martin County, Florida, USA. The LRD wastewater treatment facility provides secondary treatment in accordance with Florida Administrative Code FAC-62-610, Part III, including mechanical filtration, flow equalization, diffused aeration, secondary clarification, filtration, and "high-level disinfection" (as specifically defined in FAC-62-610.410) by chlorination, with a permitted treatment capacity of 11.0 million gallons per day. Note that chlorine gas is used for final disinfection at the plant, but because of the high ammonia content, the free chlorine is rapidly converted to chloramines, measured as total chlorine. The typical chlorine dosage applied to the filter effluent ranges from 8.4 to 11.4 mg/L with an average dose of 9.9 mg/L. This dosage yields a combined chlorine (chloramine) residual in the range of 3-5 mg/L with a typical residual value of approximately 4 mg/L. During wet weather periods, approximately 30% of the time, treated effluent is disposed of by deep well injection. Otherwise, treated effluent, termed Irrigation Quality Water (IQ Water), is stored in on-site lakes (61.4 acres) and distributed to reclaimed water customers to meet public-access, landscape irrigation needs. A summary of influent and effluent water quality for the facility is provided in Table S1.

| Basic design and operational parameters of the filtration systems
Four TBF units were constructed in the mid-1980s, and two additional units were added in 1994. TBF are low-head, down-flow, granular media filters with influent flow entering from the top and passing down through the media bed. Flow to each filter unit is provided by Filter Pump Station 1 and controlled independently. Each unit had a media bed surface area of 550 ft² with an 18-inch depth of sand (0.8-0.9 mm in diameter) compartmentalized into 44 12-inch-wide cells to facilitate backwashing. The average effective filter depth to media diameter ratio (L/D) was 540, and individual cells were 12.5 ft². Cells were individually backwashed on an hourly basis for 40-45 s by an overhead, traveling bridge assembly, while all other cells remained in service. Because of the nearly continuous backwash, at least one cell was always in a ripening phase, where higher levels of particulates (including pathogens) could pass into the effluent. Design loading rate was 2.0 gpm/ft², average loading rate was 1.2 gpm/ft², and peak loading rate was 3.5 gpm/ft². Filtered effluent flowed through the underdrain, into a common effluent channel, and ultimately to a chlorine contact chamber for disinfection. Data for the TBF, as well as the SMF and DBF units described subsequently, are provided in Table 1.
There are four SMF units ("Fuzzy Filter™" by Schreiber Water, Trussville, AL), all placed into operation in 2007 and still available for standby operation. SMF are up-flow filters with influent flow entering from the bottom and passing up through compressible media. Flow to each filter unit is provided by a filter pump station and is controlled independently from other filter pumps. Each unit has a media bed surface area of 49 ft² with a 60-inch depth of compressible, synthetic media made of 1.25-inch to 1.5-inch diameter spheres. Operators can alter filter bed porosity to optimize solids capture based upon influent water quality. Design loading rate is 30 gpm/ft², average loading rate is 28 gpm/ft², and peak loading rate is 42 gpm/ft². Filtered effluent flows through the compressible media, into a common effluent header pipe, and ultimately to a chlorine contact chamber for disinfection.
There are six DBF units, all placed into operation in 2017. DBF are down-flow filters with influent flow entering from the top and passing down through 72 inches of sand (2.0-3.0 mm diameter) and 18 inches of sorted gravel (6.35-31.75 mm diameter) before reaching the underdrain, providing an average effective L/D of 760. Flow to each filter unit is provided by Filter Pump Station 1 and controlled independently. Each unit has a media bed surface area of 380 ft². Design loading rate is 4.0 gpm/ft², average loading rate is 3.4 gpm/ft², and peak loading rate is 5.4 gpm/ft². Filtered effluent flows through the underdrain, into a backwash water supply tank, and ultimately to a chlorine contact chamber for disinfection. SMF units were not decommissioned after the addition of the DBF units and are available for backup treatment should the need arise. SMF units were not used during the DBF sampling campaign, as indicated in Figure 1.
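The effective L/D ratios quoted for the TBF and DBF beds can be checked from the stated bed depths and media sizes. The sketch below is illustrative only; the exact effective sizes used in the design calculations are not stated, so a midpoint of 0.85 mm is assumed for the TBF sand.

```python
MM_PER_INCH = 25.4

def l_over_d(bed_depth_inches: float, effective_size_mm: float) -> float:
    """Effective filter depth to media diameter ratio (L/D)."""
    return (bed_depth_inches * MM_PER_INCH) / effective_size_mm

# TBF: 18-inch sand bed, 0.8-0.9 mm sand (0.85 mm midpoint assumed)
tbf_ld = l_over_d(18, 0.85)   # ~538, consistent with the reported ~540

# DBF: 72-inch sand bed; the reported L/D of 760 implies an effective
# size near 2.4 mm, within the stated 2.0-3.0 mm range
dbf_ld = l_over_d(72, 2.4)    # 762
```
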

| Study design
The LRD was planning to upgrade the water reclamation and water reuse facilities by replacing the existing TBF units (described previously) with DBF units while retaining the SMF units. The TBF units had reached the end of their useful life, and maintenance costs had become excessive. While the facility could have operated on only the SMF, LRD decided to replace the TBF with DBF to maintain operational flexibility and resilience to process fluctuations at the plant. This filter upgrade project did not alter the existing final chlorination (chloramine) disinfection process, which remained in place and functional. With this opportunity to provide the wastewater community with additional information regarding the performance of full-scale SMF and TBF systems relative to the new DBF system in the same water reclamation facility, the authors developed a study design and sampling plan to examine TOrC removal and operational performance (i.e., removal of TSS and turbidity) through the filtration and chloramination steps at the facility, along with limited Cryptosporidium and Giardia testing across the filtration processes.
The operational evaluation of filter performance included parameters associated with water reuse permits (TSS, turbidity, Cryptosporidium, and Giardia). The working hypothesis was that the newly installed DBF system would lower average filter effluent turbidity and TSS compared to the TBF due to the increased filter depth and increased empty bed contact time (Rose et al., 2004).
The new filters contain more filter media grains which provide greater opportunity for particle-media contact and removal. With an expected decrease in filter effluent turbidity and TSS, a corresponding decrease in Cryptosporidium and Giardia (oo)cyst concentration was hypothesized.
Analytes, that is, TOrCs, were a priori categorized as "good" or "poor" regarding biodegradability, sorption to filter media, and chloramine oxidation rates, and statistical tests assessed TOrC removal among these a priori defined groups and across filter types. In this case, "poor" reflects no appreciable sorption observed in past studies, while "good" reflects observable sorption of at least 50% in past studies (e.g., Snyder et al., 2007). The analyte list was selected to focus on compounds with various levels of biodegradation, adsorption to filter media or biofilms, and oxidation by combined chlorine (note, due to the presence of ammonia in the filter effluent, the free chlorine is rapidly converted to chloramines) in order to determine if any particular mechanism seemed to dominate, or at least influence, TOrC removal. A priori compound selection and grouping for oxidizability, sorption, and biodegradation was modeled after previously published work (Snyder et al., 2007). While it is understood that sorption was not expected to be a dominant mechanism because no activated carbon is present in the filter media (Yang et al., 2011), minimal sorption could be observed from increased biofilm in the DBF configuration relative to the SMF and TBF designs (Le-Minh et al., 2010; Nakada et al., 2007; Nichols Jr. & Abboud, 1995). As such, the newer DBF units, with greater hydraulic detention time, filter media, surface area, and biofilm surface, were hypothesized to provide better removal of analytes categorized as having good sorption, good biodegradation, or both, as compared to the TBF or SMF units. Likewise, it was hypothesized that acetaminophen and triclosan would show some removal across the chloramination step (Greyshock & Vikesland, 2006; Wu et al., 2012), but that the remaining analytes would not be significantly degraded during the disinfection process, that is, the addition of chlorine/chloramine.
Tables 2 and 3 show the list of key compounds, as categorized a priori, which was used in subsequent sections to examine removal performance across filter types. The list of analytes in Tables 2 and 3 was used predominantly in the analysis of filter performance; however, the analytical laboratory employed two separate methods that incorporate an additional 82 analytes, which were also included in the analysis and are listed and discussed in the Supplemental Information. The sampling plan for this study (Table S2) included evaluation of the old and new filter performance during two separate monitoring periods to capture both high and low filter loading: the SMF and TBF systems were monitored twice monthly for a period of 2 months for a total of eight discrete sampling events. The DBF system was monitored twice monthly for a period of 7 months, starting at the second month of operation, for a total of 14 sampling events. Samples were collected from the common tertiary filtration influent (i.e., secondary effluent) and the individual effluent for each filter type (SMF, TBF, and DBF). Samples were collected by utility staff with bottles provided by Eurofins Eaton Analytical and were shipped on ice to the Eurofins laboratories for extraction and analysis per their TOrC, Cryptosporidium, and Giardia methods. Specifically, Cryptosporidium and Giardia were analyzed by EPA Method 1623, with samples collected in 10 L cubitainers with no preservative, stored and shipped at less than 20°C, and analyzed within 96 h. TOrCs were analyzed per the Eurofins Eaton Analytical Monrovia (CA) laboratory method 9609-2017, with samples collected in 4 × 40 ml amber glass vials: 2 × 40 ml vials containing 0.5 ml of preservative (1.04 g/L sodium omadine + 4 g/L ascorbic acid) and 2 × 40 ml vials containing 0.5 ml of preservative (100 g/L sodium azide + 4 g/L ascorbic acid). Samples were stored and shipped at 4°C with a maximum hold time of 20 days prior to analysis.
Triplicate samples were collected during one sampling event in the first phase of the study (SMF and TBF samples) and once during the second phase of the study (DBF) to examine intra-sample variability during analysis. Triplicate analysis demonstrated replicability of sampling results: the coefficient of variation was <25% in over 90% of samples, which provided confidence for subsequent singlet sample collection events. Field and laboratory blanks were included in analysis per Eurofins standard procedures. The analysis of TOrCs by the Eurofins Eaton Analytical laboratory methods was evaluated in previously published literature and compared against other EPA and accepted laboratory methods (Vanderford et al., 2014).
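The coefficient-of-variation screen described above can be sketched as follows; the replicate values are hypothetical and the helper name is illustrative, not the study's code.

```python
import numpy as np

def cv_percent(replicates) -> float:
    """Coefficient of variation (sample standard deviation / mean), in percent."""
    r = np.asarray(replicates, dtype=float)
    return 100.0 * r.std(ddof=1) / r.mean()

# Hypothetical triplicate result for one analyte, ng/L
cv = cv_percent([105.0, 98.0, 110.0])   # ~5.8%, well under the 25% threshold
```
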

| Statistical analyses
Two statistical analyses were used to investigate the impact of filter upgrades to operations and to TOrC and pathogen removal. To examine whether there was a statistically significant change in effluent TSS and turbidity once DBF was installed, a Welch's Two Sample t-test was applied (Welch, 1947). To analyze TOrC removal by filtration and chloramine oxidation, Wilcoxon signed-rank tests (Wilcoxon, 1947) were conducted on the influent and effluent of the filters and contact tanks. Details about these statistical tests are described in the Supplemental Information.
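Both tests are available in SciPy; the sketch below illustrates how they apply to this study's two data shapes (unpaired before/after operational series for Welch's test, paired influent/effluent concentrations for the signed-rank test). The data here are synthetic, not the study's measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic (illustrative) effluent TSS series, mg/L
tss_smf_tbf = rng.normal(1.5, 0.4, 60)   # before the filter upgrade
tss_dbf = rng.normal(0.9, 0.2, 60)       # after the filter upgrade

# Welch's Two Sample t-test: unpaired samples, unequal variances allowed
t_stat, p_welch = stats.ttest_ind(tss_smf_tbf, tss_dbf, equal_var=False)

# Wilcoxon signed-rank test: paired influent/effluent TOrC concentrations
influent = rng.lognormal(mean=3.0, sigma=0.5, size=14)   # ng/L, hypothetical
effluent = influent * rng.uniform(0.7, 1.05, size=14)    # modest removal
w_stat, p_wilcoxon = stats.wilcoxon(influent, effluent)
```
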

| Assumption for samples less than the method reporting limit
For most TOrCs that were analyzed, detectable concentrations were observed. However, for certain compounds, over half of the observations were less than the method reporting limit (MRL). To include these data in statistical analyses and percent removal calculations, an assumption was required. Conservatively, it was assumed that concentrations below the MRL were equal to the MRL. To allow readers to examine the data and the implications of this assumption, the raw data and reproducible code are provided per the Data Availability Statement.
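A minimal sketch of this substitution and its effect on percent-removal calculations is shown below (hypothetical helper names and concentrations, not the study's published code):

```python
import numpy as np

def substitute_mrl(values, mrl):
    """Apply the conservative censoring assumption: any non-detect
    (NaN) or value below the MRL is set equal to the MRL."""
    v = np.asarray(values, dtype=float)
    return np.where(np.isnan(v) | (v < mrl), mrl, v)

def percent_removal(influent, effluent):
    """Percent removal across a treatment step, element-wise."""
    return 100.0 * (1.0 - np.asarray(effluent) / np.asarray(influent))

# Hypothetical concentrations (ng/L) with an MRL of 10 ng/L
inf = substitute_mrl([120.0, np.nan, 45.0], mrl=10.0)
eff = substitute_mrl([np.nan, np.nan, 30.0], mrl=10.0)
removal = percent_removal(inf, eff)
# A non-detect effluent paired with a detected influent caps the apparent
# removal (first pair: ~91.7%); two non-detects yield 0% removal.
```
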

| Operational impacts
Data from LRD's supervisory control and data acquisition (SCADA) system were extracted to analyze the impact of filter performance on filter effluent turbidity and TSS as measured by online analyzers (the turbidity meter is an HF Scientific MicroTOL; the TSS analyzer is a Cerlic BB2 with a CTX 2025 LC flow-through low suspended solids sensor). Considering the combined TSS and turbidity for the SMF and TBF systems, a significant difference was observed (p < .01 based on Welch's Two Sample t-test) for both the mean TSS concentration and mean turbidity when the new DBF system was brought online (Figure 2). On average, effluent TSS declined by 40% and effluent turbidity declined by 25%, and these performance improvements have been sustained to the present day (from August 2019 to September 2020, average effluent TSS was 0.9 mg/L; data from Arrington, personal observation). LRD wastewater treatment plant operators reported improved control and stability of effluent quality with fewer process adjustments required (e.g., adjusting dissolved oxygen levels in the aeration basin, revising the return activated sludge set point). Thus, from an operational perspective of providing a solid margin of safety below the 5 mg/L TSS limit for recycled water, the DBF upgrade confirmed the operational hypothesis described in the methods section.

| TOrC removal
To test whether biodegradation and sorption characteristics impacted TOrC removal during filtration, the TOrCs were split into the four categories outlined in Table 2. The boxplots in Figure 3 depict percent removal of TOrCs for DBF, with median values near zero across each category. This result is representative of removal results across the other filter types (i.e., TBF and SMF), although variability in removal data differed slightly across filter types. For instance, TOrCs with both poor sorption and poor biodegradation had the largest interquartile range for DBF, as represented by the extents of the purple box; however, this pattern was not observed in either TBF or SMF removal data. For each category and for each filter type, influent and effluent concentrations were compared with the Wilcoxon signed-rank test. Most tests failed to reject the null hypothesis (Table 4), meaning that TOrC removal in each case was approximately zero for each of the three filter types. Even in the cases where a statistically significant difference was found between the influent and effluent data (i.e., DBF for TOrCs with poor sorption), removal of TOrCs was similar to the other cases (see purple and green boxplots in Figure 3). Thus, the increase in hydraulic detention time within the DBF did not have a meaningful positive impact on TOrC removal relative to the TBF and SMF units.
FIGURE 2 Impact of filter upgrades from synthetic media filters (SMF) and traveling bridge filters (TBF) to deep bed filters (DBF) on TSS (a) and turbidity (b)

While it was not expected that the change in filtration technology would impact oxidation of TOrCs during disinfection, the removal of the TOrCs across disinfection was analyzed nonetheless with respect to their chloramine oxidation categories (i.e., poor or good, Table 2). Due to the study design, the data are presented based on both filter type and oxidation category. Because SMF and TBF effluent was blended before disinfection, these filters were considered together in the post-disinfection analysis. This subset of the data is referred to as "SMF + TBF". In both cases (DBF and SMF + TBF effluent), the median chlorine demand (calculated as the difference between the amount dosed on a mass/flow basis and the amount measured in the effluent) was between 2.3 and 3.9 mg/L, but could increase to more than 15 mg/L due to fluctuating ammonia levels. Residual chlorine had a median value of 3.6 mg/L (as combined chlorine) and ranged from 1.6 to 5.0 mg/L. Among the four test groups for the two filter types and two chloramine oxidation categories, three groups showed significant removal (noted with a * in Table 5) between influent and effluent TOrC concentrations (Table 5).
Moreover, in each case, there was positive removal for these three cases, with most values between 0% and 25% ( Figure 4). TOrCs with good chloramine oxidation characteristics tended to have higher removal than those with poor oxidation (Figure 4). These oxidation results contrast with filtration results which showed little to no removal of TOrCs.
Investigating the individual TOrCs with good chloramine oxidation characteristics (acetaminophen and triclosan), it was found that triclosan removal was statistically significant, whereas acetaminophen removal was not (Table S8). After disinfection, triclosan concentrations tended to decrease by 25% or more (Figure 5a), whereas acetaminophen tended to show little to no removal (Figure 5b). It has been shown that triclosan reacts rapidly with free chlorine (Vanderford et al., 2008) and more slowly with monochloramine (Greyshock & Vikesland, 2006; Wu et al., 2012); as such, triclosan could have been partially oxidized before the free chlorine was converted to chloramine, or chloramine may have exerted some oxidation on the available triclosan.

| Pathogen removal
Based on the increase in filter media depth, the DBF units were expected to provide better particle storage than the shallower SMF and TBF units and, as such, better removal of pathogens such as Cryptosporidium and Giardia. As observed with turbidity and solids removal, both Cryptosporidium and Giardia (oo)cysts showed improved removal across the DBF units (Figure 6). DBF demonstrated an increase in Cryptosporidium oocyst removal of approximately 0.2 log when compared to SMF and TBF (Figure 6a). DBF also showed an increase in Giardia cyst removal of approximately 1.5 log when compared to SMF and TBF (Figure 6b). However, because the sample sizes were small (6-7 samples per filter per pathogen), no statistical tests were performed for this analysis.

FIGURE 5 Percent removal of triclosan (a) and acetaminophen (b) by chloramine oxidation
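The log-removal comparisons above follow the standard log10 reduction calculation across a treatment step. The sketch below is a minimal illustration using hypothetical (oo)cyst counts, not the study's data:

```python
import math

def log_removal(influent_count: float, effluent_count: float) -> float:
    """Log10 removal value (LRV) across a treatment step."""
    return math.log10(influent_count / effluent_count)

# Hypothetical (oo)cyst counts per sample volume
lrv_old = log_removal(1000, 100)    # 1.0 log across the older filters
lrv_new = log_removal(1000, 3.2)    # ~2.5 log across DBF
improvement = lrv_new - lrv_old     # ~1.5 log, as reported here for Giardia
```
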