Insect outbreaks are major ecosystem disturbances, affecting a similar area as forest fires annually across North America. Tree mortality caused by bark beetle outbreaks alters carbon cycling in the first several years following the disturbance by reducing stand-level primary production and by increasing the amount of dead organic matter available for decomposition. The few studies of biogeochemical cycling following outbreaks have shown a range of impacts from small responses of net carbon fluxes in the first several years after a severe outbreak to large forest areas that are sources of carbon to the atmosphere for decades. To gain more understanding about causes of this range of responses, we used an ecosystem model to assess impacts of different bark beetle outbreak conditions on coupled carbon and nitrogen cycling. We modified the Community Land Model with prognostic carbon and nitrogen to include prescribed bark beetle outbreaks. We then compared control simulations (without a bark beetle outbreak) to simulations with various levels of mortality severity, durations of outbreak, and snagfall dynamics to quantify the range of carbon flux responses and recovery rates of net ecosystem productivity to a range of realistic outbreak conditions. Our simulations illustrate that, given the large variability in bark beetle outbreak conditions, a wide range of responses in carbon and nitrogen dynamics can occur. The fraction of trees killed, delay in snagfall, snagfall rate, and management decisions about harvesting killed trees will have major impacts on postoutbreak carbon fluxes for several decades and postoutbreak carbon stocks up to 100 years.
 Only a few studies have measured impacts on biogeochemical cycling following a bark beetle outbreak. At smaller scales, Brown et al.  conducted an eddy covariance study in British Columbia in two stands with high-severity outbreaks. They found that net ecosystem productivity was slightly negative or nearly zero 1–5 years after an outbreak, which the authors attributed to the response of surviving trees and understory to increased resources (light, water, nutrients). Similarly, Romme et al.  reported an increase in net primary productivity (NPP) of surviving trees owing to increased resources and reported a recovery in NPP between 5 and 15 years following the disturbance. However, few studies have directly measured soil nutrients following a bark beetle outbreak. Goodman and Hungate  reported no increase in soil nitrogen (N) between 4 and 6 years following a spruce beetle outbreak, although Huber et al.  and Huber  reported elevated soil nitrate and N leaching within the first 5 years following bark beetle attack in a spruce forest.
 There have been relatively few modeling studies of the impact of insect outbreaks on biogeochemical cycling. Kurz et al.  used the Carbon Budget Model of the Canadian Forest Sector (CBM-CFS3) to quantify the impact of a large mountain pine beetle outbreak in British Columbia. They found that an outbreak occurring over 130,000 km2 resulted in a loss of 270 Tg of ecosystem carbon, equivalent to 5 years of carbon emissions from Canada's transportation sector. Similarly, Kurz and Apps  used CBM-CFS2 to simulate insect outbreaks and fire between 1929 and 1989 and concluded that these disturbances changed Canadian forests from a net carbon sink to a net carbon source by the 1990s.
 Ecosystem models are useful tools for studying insect outbreaks because observations to date are limited in spatial extent and duration, preventing an accurate quantification of the impact of a beetle outbreak. Modeling studies increase understanding of complex phenomena, examine processes not captured by observations, provide the capability of filling gaps in space and time, and can be used to generate hypotheses that can be tested in future field experiments.
 In this study, we analyzed simulations of the temporal variability of carbon and nitrogen stocks and fluxes following hypothetical, realistic bark beetle outbreaks and quantified the variability of responses. Our research questions were: How do outbreak severity and outbreak duration alter carbon fluxes following mountain pine beetle outbreaks? Does snagfall significantly alter carbon fluxes following epidemics? We expect the severity and duration of an outbreak to alter the recovery rate following an outbreak, with severe outbreaks that last over 3 years having the longest time to recovery. We also expect that snagfall will significantly alter carbon fluxes by modifying the timing and magnitude of decomposition.
 To answer our questions, we added a bark beetle outbreak subroutine to the Community Land Model (CLM) with prognostic carbon and nitrogen cycling version 4 (CLM4) [Lawrence et al., 2011]. With prescribed outbreak conditions, we used CLM4 to assess the effects of bark beetle-caused tree mortality severity, outbreak duration, and snagfall rate on carbon and nitrogen dynamics. Our study is particularly important given the severity and extent of current bark beetle outbreaks in North America and the potential for effects on atmospheric CO2 concentrations.
2.1. Numerical Model
 We used CLM4 [Lawrence et al., 2011] with prognostic carbon and nitrogen (CN) for our simulations. CLM4 is the land model of the Community Climate System Model [Gent et al., 2011] and is a fully prognostic ecosystem model that contains a process-based representation of many biophysical and biogeochemical processes such as momentum absorption by canopy elements, sensible and latent heat fluxes, emission of longwave radiation, absorption and reflection of shortwave radiation, photosynthesis, allocation of assimilated C to live, dead, and storage C pools, decomposition, and soil N dynamics. Land surface heterogeneity is represented through a tiling approach whereby fractional coverage of lakes, wetlands, glaciers, bare soil and plant functional types (PFTs) are specified for each grid cell. Surface energy, water, and carbon and nitrogen fluxes are calculated separately on each tile. Inputs to the model include a meteorological forcing data set [Qian et al., 2006], CO2 concentrations, nitrogen deposition and aerosol deposition. The model time step is 30 min. A detailed technical description of the model is provided by Lawrence et al.  with additional details on the carbon and nitrogen cycling by Thornton and Rosenbloom  and Thornton and Zimmerman .
2.2. Modifications to CLM4
 We modified CLM4 to include a mechanistic representation of bark beetle outbreaks on the basis of similar work by Thornton et al.  for harvest. A conceptual diagram of this process is shown in Figure 1. Upon an outbreak, C and N in live tree pools were moved to dead pools. Because bark beetles attack trees in late summer and needles fade from green to red at the beginning of the following summer, we prescribed mortality (the C and N transfer) on 1 January when a bark beetle outbreak occurred in the previous summer. We transferred C and N in six main pools: (1) leaf, (2) live stem, (3) dead stem, (4) live coarse root, (5) dead coarse root, and (6) fine root. We moved leaf C and N into dead foliage pools to represent dead needles that remain on the tree for a period of 3–5 years [Wulder et al., 2006]. After this prescribed period, we transferred C and N from the dead foliage to the litter pools, which then cascade into slower-decomposing soil organic matter pools. We transferred C and N in both live and dead stem pools into newly added snag pools. C and N remained in the snag pools for a prescribed time (the snag delay period) before we began moving snag C and N into the coarse woody debris (CWD) pool at a prescribed exponential decay (half-life) rate. We transferred C and N in coarse roots to CWD and in fine roots to the labile litter pools immediately after an outbreak. C and N then moved from CWD into the cellulose/hemicellulose and lignin pools before entering the converging soil organic matter cascade [Thornton and Rosenbloom, 2005]. Decomposition occurs at different rates based on litter/soil quality for each of three litter pools and each of four soil pools and as a function of soil temperature and moisture. A potential C flux is first calculated from the decomposition rate constants and the C:N ratios of each pool. The potential decomposition flux can be either positive or negative.
If positive, N is mineralized; if negative, N is immobilized. In the case of mineralization, the flux remains as calculated and contributes to the soil mineral N pool. In the case of immobilization, plant uptake and immobilization compete for available N. If available N is insufficient to meet both plant and immobilization requirements, both proceed at lower rates, reducing potential GPP and heterotrophic respiration (see Thornton and Rosenbloom 2005 for additional details). Reductions in plant N uptake feed back on the ecosystem by reducing net carbon assimilation (i.e., GPP) and allocation to new leaf carbon (i.e., LAI), thereby causing further reductions in GPP.
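The competition for mineral N described above amounts to a proportional downregulation of both demands when N is scarce. The following is an illustrative sketch of that logic, not CLM4 source code; the function and variable names are hypothetical.

```python
# Sketch (not CLM4 code) of the N competition described above: when
# potential immobilization plus plant demand exceeds available mineral N,
# both are downregulated proportionally, which in turn reduces potential
# GPP and heterotrophic respiration. All names are illustrative.

def allocate_mineral_n(available_n, plant_demand, immob_demand):
    """Return (actual_plant_uptake, actual_immobilization)."""
    total_demand = plant_demand + immob_demand
    if total_demand == 0.0 or total_demand <= available_n:
        # Sufficient N: both demands are fully met.
        return plant_demand, immob_demand
    # Insufficient N: both proceed at proportionally reduced rates.
    frac = available_n / total_demand
    return plant_demand * frac, immob_demand * frac

# Example: 2.0 units of mineral N, demands of 1.5 (plant) and 1.0 (immob);
# both are scaled by 2.0/2.5.
uptake, immob = allocate_mineral_n(2.0, 1.5, 1.0)
# uptake = 1.2, immob = 0.8
```

A more faithful implementation would resolve this competition at every 30 min model time step, but the proportional scaling is the essential idea.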
 Our modifications to CLM4 prescribe reductions in carbon and nitrogen stocks. Because CLM4 is a fully prognostic model, these prescribed reductions directly modify calculations of LAI and canopy height and thus biophysics such as energy and momentum absorption and precipitation interception. However, the additional snag and dead foliage pools were not incorporated into the biophysics of the model; rather, they alter only biogeochemical components such as decomposition and N dynamics.
2.3. Numerical Experiments
 Our study includes three classes of simulations: (1) a base case simulation of bark beetle outbreaks that represents typical severe outbreak conditions; (2) simulations that include realistic variations in the severity and duration of an outbreak; and (3) simulations that assess impacts of the treatment of snags and dead foliage. Each class of simulation is described below and listed in Table 1. All simulations use the needleleaf evergreen tree-temperate (NET) PFT in CLM, details of which can be found in the work of Oleson et al. . Leaf area and canopy height are prognostic in the model, and were 3.8 m2 m−2 and 18.6 m at steady state conditions for our simulated location.
Table 1. Description of Parameters Selected for Snagfall Treatment, Snagfall Timing Period, Outbreak Severity, and Outbreak Duration for Each Simulation. Footnotes: (a) base case simulation; (b) stems removed from the ecosystem in the harvest simulation.
2.3.1. Base Case Simulation
 In our base case, the duration of the outbreak was 3 years and the total severity (percentage of C and N moved from live to dead pools) was 95%, representing a high-severity attack [Brown et al., 2010]. The mortality was structured so that the same amount of C and N was moved from live to dead pools each year. The snagfall delay period was 5 years, and the half-life of the exponential transfer into CWD was 10 years [Mitchell and Preisler, 1998; Page and Jenkins, 2007; Angers et al., 2010]. Dead foliage remained on the trees for 3 years and then entered the litter pools at a 2 year half-life rate [Wulder et al., 2006].
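The half-life parameters above imply first-order transfer rates. As an illustrative sketch (not CLM4 code, and with hypothetical function names), the rate conversion and the resulting snag pool trajectory after the delay period can be written as:

```python
import math

# Illustrative first-order (exponential) snagfall, parameterized by
# half-life as in the simulations described above. Not CLM4 source code.

def decay_rate(half_life_years):
    """Convert a half-life to a first-order decay rate k = ln(2)/t_half."""
    return math.log(2.0) / half_life_years

def snag_remaining(initial_c, half_life_years, years_since_delay):
    """Snag C remaining after the snag delay period ends."""
    k = decay_rate(half_life_years)
    return initial_c * math.exp(-k * years_since_delay)

# Base case (10 year half-life): half of the snag C has moved to CWD
# 10 years after snagfall begins, and one quarter remains after 20 years.
c10 = snag_remaining(100.0, 10.0, 10.0)  # ≈ 50.0
c20 = snag_remaining(100.0, 10.0, 20.0)  # ≈ 25.0
```

The C removed from the snag pool at each step would be added to CWD, so the 5 and 25 year half-life cases in section 2.3.3 simply shift how quickly that pulse reaches the decomposition cascade.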
2.3.2. Severity and Duration of an Outbreak
 Both the severity and duration of outbreaks exhibit large variability within an outbreak and among outbreaks [U.S. Department of Agriculture Forest Service, 2004]. To investigate the impact of variation in severity and duration, we conducted simulations that were the same as the base case but with 25, 50, and 75% mortality. We conducted additional simulations with outbreak durations of 1, 5, and 15 years to determine the impact of outbreak duration on C and N cycling.
2.3.3. Snag Treatment Simulations
 To investigate the effect of different treatments of snagfall, we conducted simulations with (1) a range of snagfall rates (transfer of snags to CWD) and (2) a range of snagfall timing (delay before snagfall initiation). The snagfall rate simulations were identical to the base case except that we transferred C and N in the snag pools to CWD at exponential decay rates with half-lives of 5 and 25 years (instead of 10 years). These rates represent lower and upper bounds of snagfall that have been reported for conifer forests [Mielke, 1950; Mitchell and Preisler, 1998; Wilhere, 2003; Lewis and Hartley, 2005; Garber et al., 2005; Page and Jenkins, 2007; Angers et al., 2010]. We also varied the initial delay of snagfall (snagfall timing), prescribing 10 and 25 year delays in addition to 5 years. Finally, we prescribed an immediate transfer of C and N into CWD instead of an exponential decay to represent a windthrow event.
 For comparison with the base case outbreak simulation, we conducted additional sensitivity simulations. One simulation with an immediate transfer of C and N from stems to CWD represented an upper bound of the rate and amount of C and N entering CWD. We also conducted a simulation in which we assumed that the killed trees were harvested. In this simulation, C in live stems was immediately moved from live tree pools to 10 and 100 year wood product pools (i.e., removed from the ecosystem). This simulation represented the lower bound of the amount of C and N entering CWD.
 Each numerical experiment was conducted by running CLM4 offline in single-point mode, in which soil properties and climate forcing were specified for one location in Idaho. The meteorological forcing was repeated for years 1948 to 1972 from the data set provided with CLM4 [Qian et al., 2006]. Spun-up carbon and nitrogen pools (i.e., pools that were in equilibrium and thus stable in time) were achieved by running CLM4 for ∼3000 years using the accelerated spin-up procedure described in the work of Thornton and Rosenbloom . All experiments were initialized with the spun-up control simulation and then run for 100 years after the bark beetle-caused tree mortality. Fluxes were postprocessed into yearly averages to remove intraseasonal variations. Anomalies were calculated by subtracting the control run from the experimental runs.
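The postprocessing described above (yearly averaging followed by subtraction of the control run) can be sketched as below. The names are illustrative, and the 30 min model time step is replaced by a toy steps-per-year value for brevity.

```python
# Minimal sketch of the postprocessing: subdaily model output is averaged
# into yearly means, and anomalies are formed by subtracting the control
# run year by year. Illustrative only; not part of CLM4.

def yearly_means(flux, steps_per_year):
    """Average a flat time series of model time steps into yearly means."""
    n_years = len(flux) // steps_per_year
    return [sum(flux[i * steps_per_year:(i + 1) * steps_per_year]) / steps_per_year
            for i in range(n_years)]

def anomalies(experiment, control):
    """Experiment minus control, year by year."""
    return [e - c for e, c in zip(experiment, control)]

# Toy example with 2 "time steps" per year:
exp_yearly = yearly_means([1.0, 3.0, 2.0, 2.0], steps_per_year=2)  # [2.0, 2.0]
ctl_yearly = yearly_means([1.0, 1.0, 1.0, 1.0], steps_per_year=2)  # [1.0, 1.0]
anom = anomalies(exp_yearly, ctl_yearly)                           # [1.0, 1.0]
```

For the actual 30 min output, `steps_per_year` would be 48 × 365; the logic is otherwise unchanged.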
2.3.4. Dead Foliage Treatment
 We conducted a simulation without a dead foliage pool to determine the impact of needlefall on carbon and nitrogen dynamics following a bark beetle outbreak. In this simulation, C and N in needles were immediately moved to the litter pool (instead of including a 3 year delay period before needlefall).
3. Results and Discussion
3.1. Carbon and Nitrogen Dynamics Following a Bark Beetle Outbreak
 We begin by describing the results of the base case outbreak, discussing causes of the response, and comparing the dynamics with published studies. Before the outbreak, gross primary productivity (GPP) was 1240 g C m−2 yr−1, net primary productivity (NPP) was 385 g C m−2 yr−1, and heterotrophic respiration (Rh) was 344 g C m−2 yr−1. NEP (NPP − Rh) was small but positive, a result of the spin-up procedures. The NPP/GPP ratio of 0.31 is lower than estimates of Waring et al. , but is within reason [Goulden et al., 2011]. A low NPP/GPP ratio may reduce recovery rates following an outbreak because of larger respiration losses. The preoutbreak total vegetative C stock (above and below ground) was 55 Mg C ha−1.
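As a quick arithmetic check of the quoted preoutbreak fluxes, NEP and the NPP/GPP ratio follow directly from the definitions above:

```python
# Preoutbreak fluxes quoted in the text (g C m-2 yr-1).
gpp, npp, rh = 1240.0, 385.0, 344.0

nep = npp - rh     # 41.0: small but positive, as stated
ratio = npp / gpp  # ≈ 0.31, the NPP/GPP ratio discussed above
```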
3.1.1. Initial Ecosystem Response (0–5 Years Presnagfall)
 After the prescribed bark beetle outbreak, ecosystem leaf area index (LAI) initially decreased by nearly 80% compared to the control run because of the transfer of live leaf C to the dead foliage pool (Figure 2). Snag and dead foliage C increased from the transfer of stems and foliage of killed trees. C in CWD increased over the first 3 years of the outbreak owing to input by coarse roots, and then decreased beginning 4 years after the disturbance because of decomposition. Immobilization of soil mineral N increased as decomposition of C from roots started. The increase in immobilization peaked between years two and three and then decreased to below control levels at year five. Once immobilization decreased, mineral N increased to a peak at year five. N uptake by plants responded to the increase in soil mineral N, and uptake was enhanced between years five and eight. The soil N dynamics can be summarized as an increase in microbial activity (immobilization) following inputs of C and N to the soil from the roots of the killed trees, followed by decomposition of the root-derived C and release of immobilized N that stimulated increased plant uptake. The increase in soil mineral N following an outbreak was consistent with observations of increased nitrate concentrations and N leaching [Huber et al., 2004].
 These effects on LAI and N had significant impacts on C fluxes (Figure 3). GPP and autotrophic respiration (Ra = growth + maintenance respiration) both rapidly decreased because of the reduction in LAI. Although LAI decreased over 80%, reductions in GPP were less than 60% because of the nonlinear relationship between GPP and LAI. The net effect reduced NPP (NPP = GPP − Ra) by over 30% at the end of the outbreak. The increase in pool sizes resulting from the transfer of coarse roots and fine roots to CWD and litter, respectively, caused an initial increase in decomposition. However, soil respiration decreased, as reductions in belowground autotrophic respiration were larger than increases in heterotrophic respiration.
 The simulated trajectory of an initial reduction in NPP agrees with the observations of Romme et al.  and simulations of individual stands by Pfeifer et al. . Increases in Rh have not been reported but are anticipated [Brown et al., 2010]. Similarly, Morehouse et al.  hypothesized changes in soil respiration following an outbreak, but measurements did not support this expectation. Short-term reductions of NEP have been observed by Brown et al.  and simulated by Kurz et al. .
3.1.2. Subsequent Dynamics (After Initiation of Snagfall)
 Forest recovery began with a rapid increase of LAI to within 20% of the control run in just over 5 years (Figure 2). A fast recovery of LAI was also reported by Misson et al. , who found an initial decrease of 35% after a thinning treatment followed by recovery within 1 year in a ponderosa pine plantation in the Sierra Nevada Mountains of California. However, our simulated LAI recovery was faster than that following stand-replacing fires in boreal forests [Cuevas-Gonzalez et al., 2009; Goulden et al., 2011]. It is important to note that we simulated a 100% needleleaf evergreen tree stand, and these LAI dynamics do not include an understory or broadleaf succession following disturbance. The fast recovery of LAI is solely due to the productivity of surviving trees. Including an understory or succession would likely increase the recovery rate of LAI [e.g., Brown et al., 2011].
 Initial soil mineral N alterations were short-lived, and mineral N decreased to levels below the control run within 7 years. Immobilization decreased to levels lower than the control run and then increased again as C and N were added to the soil from the snag pool. Increased soil N following an outbreak has been reported by observational [Morehouse et al., 2008] and laboratory [Lovett et al., 2002] studies. Many studies inferred increased resource availability (light, water, nutrients) following an insect outbreak that caused increased rates of productivity of both surviving trees [Romme et al., 1986] and surviving trees and understory [Brown et al., 2010]. However, Goodman and Hungate  directly measured soil mineral N and found no increase 4–6 years after a spruce beetle attack. They hypothesized that either the soil mineral N dynamics were fast and short-lived, or that soil mineral N was immobilized. The fast soil mineral N dynamics in our simulations, in which mineral N was initially immobilized following large inputs of C to the soil, generally agree with these studies. We also found that increases in soil moisture following the outbreak (owing to reduced canopy interception and transpiration) aided short-term (less than 5 years) increases of GPP. However, C and N dynamics controlled the long-term recovery of GPP.
 GPP recovered to within 20% of the control run in less than 10 years (Figure 3). Nearly complete recovery to the unattacked simulation occurred by year 40 for all C and N fluxes. Because productivity is a function of LAI, the recovery of NPP was expected given the simulated recovery of LAI. This result agrees with the observational study by Romme et al.  in which primary productivity recovered within 5 to 15 years. It is important to note, however, that our simulations did not include a shrub and/or herbaceous understory response, which may lead to a faster recovery of NPP if included in our simulations and if present in reality.
 The reduction in Rh in years six to ten relative to the control run was a result of reduced C inputs to the litter and soil, but lasted only a few years until C from snags entered the CWD pools. In our simulations, a large initial input of C from fine and coarse roots and foliage resulted in a large increase of Rh. However, this increase in Rh was short-lived (less than 5 years) compared to the extended increase owing to inputs of snags into CWD, because fine roots and foliage decayed rapidly as labile litter whereas coarse roots and snags decayed more slowly in the cellulose and lignin pools.
 It is difficult to compare the timing of Rh to observational studies because such studies do not span long time periods, and also typically do not directly measure Rh. Morehouse et al.  hypothesized that soil respiration (i.e., both autotrophic and heterotrophic respiration) rates would be higher in infested plots. However, their measurements within 2 years following outbreak did not support this hypothesis. Similarly, Brown et al.  anticipated higher rates of ecosystem respiration following an attack, but measurements within 1 year postoutbreak as well as 4–5 years postoutbreak did not support this. Our results suggest that Rh is very dynamic in the years following an outbreak, implying difficulty in generalizing measurements taken at a single time. Decreases in belowground autotrophic respiration were larger than increases of Rh, resulting in a reduction in soil respiration in our simulations.
 Net ecosystem productivity recovered to control levels within 40 years, with 80% of this recovery occurring in the first 5 years. The ecosystem became nearly carbon neutral 7 years after the outbreak but remained a source of carbon until year 15, when it was a carbon sink for a single year; it then returned to a carbon source and fully recovered to a net sink of carbon at year 23. The fast recovery of NEP to a near neutral sink (7 years) agrees with eddy covariance data following a beetle outbreak [Brown et al., 2010] and NEP recovery after thinning [Howard et al., 2004; Amiro et al., 2010]. The timing of NEP recovery is also consistent with the modeled ecosystem recovery following harvest [Thornton et al., 2002]. In our simulations, the reduction and recovery of NEP was dominated by GPP, as also reported by Amiro et al. . Similar C dynamics have also been seen for wildfire [Amiro et al., 2006, 2010].
 In our simulations, the long-term recovery of GPP and NPP was influenced by soil N dynamics, particularly the balance between immobilization and plant uptake of N following the outbreak. The initial increase in immobilization limited productivity (within the first 5 years postoutbreak). However, after immobilization decreased below control levels ∼40 years postoutbreak, the rate of uptake by plants was larger than that of the control simulations, leading to a slight overshoot of NPP in the base outbreak case relative to NPP in the control run.
 Carbon stocks recovered more slowly than fluxes. Total vegetative C recovered to preoutbreak levels by year 100. This timing agrees well with a modeling study conducted by Pfeifer et al.  as well as observations of C stock recovery as a function of stand age [Law et al., 2003; Gough et al., 2008].
3.2. Sensitivity to Outbreak Conditions
3.2.1. Severity and Duration of an Outbreak
 The severity of an outbreak had a large effect on the dynamics of C and N fluxes and stocks (see Figures 4 and 5). The initial reductions in LAI and vegetation C were linearly related to outbreak severity, as were GPP, Ra, and NPP. However, mineral N, immobilization, Rh, and NEP all responded nonlinearly to increasing outbreak severity, with the 75% and 95% mortality runs exhibiting much larger differences than those between lower severities (e.g., between 25% and 50%). The 25% mortality case never became a strong source of carbon after the outbreak, while the 50% and 75% mortality cases became sources of carbon for the first 7 and 14 years, respectively, after the outbreak. These nonlinear responses occurred because in the high-severity case (95%), available soil mineral N exceeded the combined plant and immobilization demands for an extended period throughout the year, and soil mineral N therefore increased. Conversely, plant growth and immobilization were limited by N availability for a longer period of time in the lower-severity cases. Since plant growth and immobilization are both functions of N availability, both GPP and immobilization increased at a greater rate in the high-severity case than in the lower-severity cases, resulting in a nonlinear response to severity driven by the availability (or lack) of N. These nonlinear effects occurred only during the first 5 years after the outbreak and thus were associated with the transfer of roots to CWD. Because all requirements for N were met, GPP (and thus NEP) recovered quickly in the first 5–10 years for the high-severity cases.
 The recovery rates of NEP following our severe outbreaks (75% and 95%) are slower than those reported by Brown et al. . However, similar to Brown et al. , our simulations show a strong effect of climate on annual NEP (Figure 5f): a less severe (50%) outbreak first became a sink of carbon at year seven but alternated between a sink and a source until 30 years following the outbreak. Our simulations show a strong influence of severity on soil N dynamics, especially immediately after the outbreak (<5 years). Similar short-term N dynamics have been reported by Huber  and Griffin et al. . Our results also suggest that high-severity outbreaks will result in the largest impacts to N cycling, in agreement with Griffin et al. , who reported minor impacts to N cycling following a lower-severity MPB outbreak as compared to a higher-severity fire disturbance.
 The duration of an outbreak had a large impact on ecosystem fluxes only during the outbreak period and shortly thereafter (see Figure 10). The longer the outbreak duration, the more dampened the response in biogeochemical fluxes and stocks because the same amount of mortality was spread out over more years. The duration of an outbreak does not, however, alter long-term (>25 year) fluxes and stocks. Thus, outbreak severity appears to be important for long-term impacts, whereas duration is important for short-term impacts to C and N cycling.
3.2.2. Snag Delay Period
 Snag delay period had a larger impact on C and N dynamics than any other outbreak characteristic (see Figures 6 and 7). NEP recovery following the initial reduction was similar to the harvest simulation (shown in Figure 9), but a secondary decrease in NEP occurred when the snags entered CWD, owing to increases in decomposition and reductions in GPP through the aforementioned N dynamics. Snagfall delay had major implications for the recovery of the ecosystem to a carbon sink. NEP in the 10 and 25 year snagfall delay period cases recovered to positive values (carbon sinks) 8 years after the outbreak; however, NEP became negative (carbon sources) after the snags fell. NEP in the 10 year snagfall delay period case indicated a carbon source between years 12 and 27, and NEP in the 25 year snagfall delay period case indicated a carbon source between years 28 and 47. Conversely, the 5 year snagfall delay period simulation recovered to a carbon sink at year 23 and remained a sink of carbon because snagfall occurred quickly after the outbreak.
 The secondary increase in Rh and its magnitude at first glance appear to have major consequences for studies of postoutbreak biogeochemical cycling. For example, measurements of C and N conducted before the snags begin to fall may miss a key component if the snagfall dynamics are similar to the 10 or 25 year snagfall delay period cases. Amiro et al.  reported similar increases in ecosystem respiration (Ra + Rh) following wildfires due to large amounts of CWD. This secondary increase has not been observed in the field following mountain pine beetle outbreaks, possibly because decreases in belowground Ra offset increases in Rh, resulting in lower rates of total soil respiration, or because of the lack of observations of long-term C dynamics following outbreaks.
3.2.3. Snagfall Rate
 For the simulation in which C and N were immediately transferred to CWD ("immediate"), there was an initial pulse of C and N to CWD from foliage, fine and coarse roots, and stems (Figure 8) that was much larger than in the other outbreak simulations, in which stem C was transferred into a snag pool before entering CWD. The initial pulse of C decayed over 20 to 25 years, at which time the CWD pool was smaller than in the control run because of reduced inputs. CWD then slowly increased to control run levels over the next 55 to 60 years.
 There was no pulse of C to CWD from stems in the harvest simulation because stem C was transferred into product pools instead of CWD (Figure 8). The initial pulse of C to CWD from roots and dead foliage in the outbreak simulations with a snag pool was similar to the harvest case, but in these outbreak simulations, CWD received a secondary pulse from snagfall and dead foliage additions. The fastest snagfall transfer rate (5 year half-life) exhibited a secondary pulse that lasted 35 years, whereas the slowest snagfall transfer rate (25 year half-life) did not have a secondary pulse. In the slowest snagfall case, C from snags entered CWD at a slightly faster rate than C was removed through decomposition.
 The response of heterotrophic respiration was a function of snagfall treatment (Figure 9). The highest rates of Rh were simulated for the immediate transfer case (no snag pool), and elevated Rh rates lasted nearly 20 years. All snagfall rate cases had identical Rh rates for the first 5 years because of the lack of snagfall during this period. Once snags began to fall, faster transfer rates resulted in larger increases in Rh by increasing the amount of C available for decomposition. The harvest simulation had higher heterotrophic respiration rates than the snagfall simulations because of additions of dead foliage to litter pools.
 Snagfall rates altered soil N dynamics and therefore productivity (Figures 8 and 9). When we immediately transferred C in snags to CWD, there was a large pulse of C to the soil, which initially stimulated microbial activity and immobilization, thereby reducing mineral N uptake by plants and GPP. Once this material decomposed (after 20 years when Rh is below control run rates), mineral N uptake by plants was greater than the control run and GPP and NPP were higher than in the control run. Faster snagfall rates corresponded to higher rates of immobilization and lower NPP after the initial dynamics in the base case.
 In the harvest simulation, immobilization did not increase during the snagfall period (after the 5 year delay period) because there were no additional inputs of C from snags. As such, N uptake by plants was higher than the snagfall simulations, resulting in the fastest recovery of productivity. In almost all simulations, once immobilization decreased after snagfall, mineral N uptake by plants increased for the next 20+ years causing elevated rates of GPP and NPP compared with the control run. This overshoot of GPP and NPP occurred within the first 15 years for the harvest case, 20 years for the immediate transfer case, and around 30–40 years for the snagfall simulations.
 The combined effects of snag treatment on GPP, NPP, and Rh controlled the NEP response after tree mortality (Figure 9). The immediate transfer of C and N in killed trees to CWD caused the largest reduction in NEP, occurring around year five. NEP recovery to control run levels was similar for all runs that included a snag pool (35 years). In the 5 year snagfall half-life case, the transfer of C and N from snags to CWD caused a secondary reduction in NEP after snagfall began. Generally, faster and larger pulses of material into CWD, and thus into decomposition pools, resulted in larger perturbations to the ecosystem.
 Recovery to a carbon sink was also a function of snag treatment and snagfall half-life. The harvest case recovered to a carbon sink at year 7, whereas the immediate transfer of snags to CWD recovered at year 19. The snagfall rate simulations recovered to carbon sinks at years 12, 23, and 27 for the 25, 10, and 5 year half-life simulations, respectively. Thus, the ecosystem may recover to a carbon sink in as few as 7 years when killed trees are harvested or as many as 27 years when snags fall quickly.
Griffin et al.  used a chronosequence approach spanning 30 years to examine N dynamics following an MPB outbreak in a lodgepole pine forest in northwestern Wyoming. The authors found that changes to soil N cycling were substantially smaller than those following fire across the chronosequence and that dynamics were greater immediately after the outbreak than in subsequent years. Similarly, Huber  reported, and Goodman and Hungate  inferred, short-term N dynamics following bark beetle outbreaks. Our snagfall and snag delay simulations agree with these findings by predicting short-term N dynamics following beetle attack but also demonstrate the potential for long-term N dynamics.
3.2.4. Dead Foliage Treatment
 Including a dead foliage pool did not have a large impact on carbon and nitrogen dynamics following bark beetle–caused tree mortality (data not shown). This response occurred because the amounts of C and N in needles were not large enough to significantly perturb the soil dynamics in CLM4 relative to the changes in soil dynamics from root and snag decomposition.
3.2.5. Summary of the Impact of Outbreak Conditions on NEP Responses
 To compare net impacts to C fluxes in different time periods among the different outbreak conditions, we computed cumulative NEP as the sum of NEP beginning in year zero (Figure 10). Differences in cumulative NEP illustrate the sensitivity of the simulated biogeochemical response to different outbreak conditions. Cumulative NEP for the control case was positive (C sequestration by the forest) in all time periods. For the outbreak simulation that immediately transferred C and N to CWD pools, the forest was a C source for over 50 years, only switching to a C sink in the 100 year time period. Conversely, harvesting the snags resulted in a small C source that switched to a C sink before 25 years, and this simulation had the greatest cumulative NEP at 100 years. The magnitude and sign of cumulative NEP for simulations that included snag pools depended on outbreak conditions, as discussed below.
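Cumulative NEP and the year of recovery to a net C sink are simple derived quantities of the annual NEP series; a short sketch with illustrative numbers only (not model output):

```python
def cumulative_nep(annual_nep):
    """Running sum of annual NEP (gC/m^2/yr) beginning in year zero."""
    totals, running = [], 0.0
    for nep in annual_nep:
        running += nep
        totals.append(running)
    return totals

def sink_recovery_year(annual_nep):
    """First year in which annual NEP is again positive (net C sink)."""
    for year, nep in enumerate(annual_nep):
        if nep > 0:
            return year
    return None  # still a C source over the whole record

# Illustrative postoutbreak trajectory: a C source for 6 years, then recovery.
nep = [-2.0, -1.5, -1.0, -0.6, -0.3, -0.1, 0.2, 0.5, 0.8]
```

Note the distinction these two quantities capture: annual NEP can turn positive (the forest is again a sink) years before cumulative NEP recovers, because the accumulated C deficit from the source period must first be repaid.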
 Cumulative NEP was sensitive to the severity of an outbreak in all time periods. Lower-severity outbreaks left the forest a C sink (positive cumulative NEP) in all time periods. In contrast, for higher-severity outbreaks, the forest was a source of C for decades and had much reduced cumulative NEP at 100 years compared with the control run. Other outbreak conditions had different consequences for NEP: the snagfall transfer rate was important for the 25 and 50 year cumulative NEP fluxes but not at times <10 years or at 100 years; snagfall delay influenced the 10 and 25 year periods; and outbreak duration influenced only the 5 and 10 year periods.
 This analysis demonstrates that differences in outbreak conditions and management decisions following an outbreak (such as harvesting) can have long-term C cycling consequences. In contrast, other factors such as windthrow events or outbreak duration are important over shorter time periods but are less important over the long term.
3.3. Uncertainties in Simulations
 The three main uncertainties in these simulations are (1) the sensitivity of CLM4 to N, (2) the lack of a forest age-class structure representing cohorts, and (3) the treatment of snag transfer into CWD. Our results suggest that soil mineral N dynamics play a key role in ecosystem recovery (through their impact on productivity) following an insect outbreak. However, there are some indications that CLM4 may be too sensitive to N limitations [Bonan and Levis, 2010]. To evaluate whether the results from the different snag treatments are highly dependent on N dynamics, we conducted simulations in which snags only altered decomposition rates and did not affect N availability to plants. In these simulations, the snag treatment (snagfall, harvest, delay, and immediate transfer) affected NEP through decomposition alone. Thus, our hypothesis that snag treatment following an insect outbreak is important for NEP holds with and without N feedbacks to productivity.
 Second, CLM4 does not explicitly contain an age-class distribution for trees and thus does not capture different rates of productivity as a function of age as described by past studies [Law et al., 2003]. Including these dynamics in CLM4 would likely result in faster initial recovery rates and slower long-term recovery rates. Incorporating a cohort structure would also alter the distribution of tree mortality because bark beetles preferentially attack large trees. However, we would not expect an age-class structure to alter our general conclusions that snagfall is important through its effect on soil mineral N dynamics and that a range of outbreak conditions caused a range of biogeochemical responses.
 Finally, we assumed that snags are transferred into CWD uniformly over the entire grid cell. Under this assumption, so-called "hot spots," in which increases in decomposition and immobilization are localized to the immediate area around a particular fallen snag, were not captured. Because we in effect spread the C evenly over the entire grid cell, the model was likely too responsive to large additions of C to CWD, and our windthrow simulations (Figures 8 and 9) may therefore have exaggerated the timing and magnitude of the perturbations to C and N cycling. In the other simulations, because snags are transferred slowly into CWD, we do not believe this assumption had a large impact. Furthermore, because these simulations represent stand-level dynamics, this assumption is more defensible than it would be for large landscape-level simulations.
4. Summary and Conclusions
 Forest responses following attack by mountain pine beetles vary as a function of different outbreak and stand characteristics. Using the process-based ecosystem model CLM4, we demonstrated that several bark beetle outbreak characteristics significantly affected postoutbreak C and N cycling. The severity of an outbreak (number of killed trees) was a key factor in both the initial decrease in NEP and vegetative C and their long-term recovery over 80 to 100 years following the disturbance. Outbreak duration had important consequences for short-term C fluxes. These model results suggest that both severity and duration of an outbreak must be well characterized in observational studies of insect outbreaks for accurate analysis of these important forest disturbances.
 The fate of killed trees in the years after an outbreak had significant impacts on C fluxes. Our results indicate that if the trees were immediately cut after an outbreak and left on site, the large input of C, coupled with soil N dynamics and related increases in decomposition, resulted in a large C source (an upper bound on sensitivity to the rate of C and N entering CWD). Conversely, if the killed trees were removed from the site (harvested), the C source was small. We also found that a windthrow event (snag delay period simulations) resulted in a secondary reduction in NEP from decomposition of snags, following the initial reduction caused by decreased GPP. This secondary reduction may not be anticipated unless snags are explicitly considered.
 Finally, we note that our base case outbreak with default outbreak characteristics reasonably captured observed dynamics of C and N following a bark beetle outbreak. However, more experimental and observational studies are needed to develop an improved understanding of coupled C and N dynamics following a bark beetle outbreak and to evaluate the hypotheses presented here. Our hypotheses resulting from this study are as follows: (1) inputs of C to the soil from snags and roots increase microbial decomposition and immobilization; (2) N uptake by plants is reduced during periods of increased decomposition, causing reductions in productivity; and (3) a secondary reduction in NEP occurs when snags begin decomposing, caused by both a reduction in GPP and an increase in Rh. Observations to test these hypotheses will require a long-term measurement period or a chronosequence approach similar to that of Amiro et al. . Such campaigns need many measurements constraining the important component fluxes of the carbon and nitrogen cycles.
Acknowledgments
 We thank Erik Kluzek, Nan Rosenbloom, and Sean Swenson, NCAR, for computational support and postprocessing algorithms, and Katy Kavanagh, University of Idaho, for assistance with interpretation of results. This work was funded by the National Institute for Climate Change Research, USDA Forest Service Western Wildland Environmental Threat Assessment Center, Los Alamos National Laboratory, and U.S. Department of Energy, Office of Science, Biological and Environmental Research (BER). Travel support was provided by the Climate and Global Dynamics Division of the National Center for Atmospheric Research (NCAR). Computing resources were provided by the Climate Simulation Laboratory at NCAR's Computational and Information Systems Laboratory, sponsored by the National Science Foundation and other agencies. The CESM project is supported by the National Science Foundation and the Office of Science (BER) of the U.S. Department of Energy. Oak Ridge National Laboratory is managed by UT-Battelle, LLC, for the U.S. Department of Energy under contract DE-AC05-00OR22725.