
Keywords:

  • Host-parasite interaction;
  • Insects;
  • Life history evolution;
  • Natural selection;
  • Trade-offs

Abstract

Individual fitness is expected to benefit from earlier maturation at a larger body size and higher body condition. However, poor nutritional quality or a high prevalence of disease makes this difficult because individuals either cannot acquire sufficient resources or must divert resources to other fitness-related traits such as immunity. Under such conditions, individuals are expected to mature later at a smaller body size and in poorer body condition. Moreover, the juvenile environment can produce longer-term effects on adult fitness by causing shifts in resource allocation strategies that could alter investment in immune function and affect adult lifespan. We manipulated diet quality and immune status of juvenile Texas field crickets, Gryllus texensis, to investigate how poor developmental conditions affect sex-specific investment in fitness-related traits. As predicted, a poor juvenile diet was related to smaller mass and body size at eclosion in both sexes. However, our results also reveal sexually dimorphic responses to different facets of the rearing environment: female life history decisions are affected more by diet quality, whereas males are affected more by immune status. We suggest that females respond to decreased nutritional income because this threatens their ability to achieve a large adult body size, whereas male fitness is more dependent on reaching adulthood and so they invest in immunity and survival to eclosion.


Introduction

Many animals have complex life cycles comprising distinct juvenile and adult stages. The fitness of these animals generally benefits from an earlier transition to adulthood at a large body size (Nylin & Gotthard, 1998; Stearns, 2000; Morbey & Ydenberg, 2001). An earlier transition, however, shortens the time available for growth and therefore might result in a reduced size at transition (Stearns, 2000; Roff, 2002). The optimal balance between development time and size at maturity should depend on the ecological factors experienced by the individual during the juvenile stage (Roff, 1992; Nylin & Gotthard, 1998). For example, if size at maturity is essential to adult fitness and nutritional availability is poor, then development time might be prolonged in order to maximize size (Roff, 1992; Nylin & Gotthard, 1998). Many studies in a variety of taxa support the prediction of life history models that development time should be prolonged when the resources required for growth and development are limited (Roff, 1992; Nylin & Gotthard, 1998). On the other hand, if the individual is time-constrained (e.g. the end of the reproductive season is approaching), then an earlier transition will probably be more important to fitness than the size at transition. Another factor that might mediate the age–size trade-off is the prevalence of parasites and pathogens to which a juvenile is exposed (Rantala & Roff, 2005). Immune challenges might negatively affect development if some portion of an individual's limited nutritional resources is allocated to immunity rather than to growth (Lochmiller & Deerenberg, 2000; Zuk & Stoehr, 2002; see also Valtonen et al., 2010).

Life history models have been instrumental in predicting how ecological constraints experienced during the juvenile stage affect the age and size at which individuals should transition to adulthood (Rowe & Ludwig, 1991; Abrams et al., 1996; Day & Rowe, 2002). However, as de Block & Stoks (2005; see also Rolff et al., 2004) point out, these models have neglected to include other potentially important variables, such as body condition, that might also be optimized during development. Body condition is a likely candidate for optimization because the performance and quality of fitness-related traits in adults, such as sexually selected traits (Hunt et al., 2004) and immune function (Jacot et al., 2004; Kelly, 2011), are condition dependent. Some support for this hypothesis is found in studies of insects in which food-restricted individuals have a greater proportion of their body mass dedicated to fat, for a given body size (Stoks et al., 2006; Barrett et al., 2009; Dmitriew et al., 2009), compared with less-restricted individuals. These studies suggest that, in at least some species, individuals have the ability to adjust their allocation of nutritional resources to storage (i.e. body fat, mass) and structural growth independently depending on the prevailing ecological conditions.

Mathematical treatments that describe how a poor environment during ontogeny affects life history and adult phenotype also lack sex-specific predictions (de Block & Stoks, 2005). This is likely a critical omission given that the role of body size and reproductive lifespan in individual fitness can differ between the sexes (Fairbairn et al., 2007). So, for example, if body size is more important to female than to male fitness, then females might prolong development in order to accrue body size, whereas males might benefit by speeding through the juvenile phase in order to start mating as soon as possible (Nylin & Gotthard, 1998).

The juvenile environment not only affects an individual's development time and adult phenotype but it can also have profound and complicated long-term effects on resource allocation to other fitness-related life history traits and associated trade-offs in the adult (Lindstrom, 1999; Metcalfe & Monaghan, 2001; Boggs, 2009). Many empirical studies show that poor nutrition during development negatively affects adult lifespan or survival (Judge et al., 2008; Barrett et al., 2009), some components of immune function (Jacot et al., 2005b; de Block & Stoks, 2008; Muturi et al., 2011; Simmons, 2012) and the performance of traits related to reproduction (Hunt et al., 2004, 2005; Jacot et al., 2005a; Hebets et al., 2008; Judge et al., 2008; Valtonen et al., 2010; Woodgate et al., 2010). However, other studies suggest that a poor juvenile diet has null or beneficial effects on adult lifespan (Hunt et al., 2004; Joy et al., 2010; Lyn et al., 2010; Dmitriew & Rowe, 2011), immunity (Jacot et al., 2005b; Dmitriew et al., 2007; de Block & Stoks, 2008; Kelly & Tawes, 2013) and reproduction (Barrett et al., 2009; Adler et al., 2013). In addition to malnutrition, immune challenges during ontogeny could negatively affect fitness-related traits in the adult given that fighting disease, pathogens and parasites consumes limited nutritional resources that would otherwise be allocated to growth, somatic maintenance or the development of reproductive tissues (Tu & Tatar, 2003; Jacot et al., 2005a; Bascuñán-García et al., 2010; McNamara et al., 2013). Similar to diet-restricted juveniles, immune-challenged juveniles can also experience improved performance of some life history traits as adults (e.g. immune function: Jacot et al., 2005b). The effects of ontogenetic immune challenges on adult fitness are poorly studied compared with the effects of diet quality. 
To complicate matters even further, the costs to adult fitness of a poor diet or immune challenges during development might manifest only when individuals are exposed to both of these stressors simultaneously (Valtonen et al., 2010; Simmons, 2012).

In this study, we use the Texas field cricket (Gryllus texensis) to test the hypothesis that a poor environment (i.e. poor diet and repeated immune challenges) during ontogeny will have negative effects on growth, development and adult lifespan and that these effects will differ between the sexes. Because a poor diet does not provide the nutritional resources that are essential for optimal growth and development, and because repeated immune challenges consume those limited nutritional resources, we predict that poorly fed and immune-challenged individuals will experience a protracted time to adulthood and have a smaller body size, mass and condition as adults compared with crickets in a good environment (i.e. well fed and nonchallenged). A poor rearing environment should also have sex-specific consequences for life history traits and measures of fitness. Because fitness is likely more strongly related to body size in female than in male crickets, we predict that males reared in a poor environment will have a smaller body size and shorter development time compared with females. We also predict that a poor rearing environment will be costly to female fecundity. The predicted outcome of a poor rearing environment on adult lifespan is less clear. On the one hand, a poor environment is expected to reduce adult lifespan because these individuals should be malnourished and in poorer condition; on the other, a poor environment might cause individuals to allocate resources to immunity (Fellous & Lazzaro, 2010), alter metabolic processes (Sohal & Weindruch, 1996; Lin et al., 2002) or, in the case of males, invest less in sexually selected traits (Hunt et al., 2004), which could prolong lifespan relative to crickets raised in a good environment.

Materials and methods

Experimental crickets were laboratory-reared descendants of individuals field-collected in Austin, TX (USA), and were raised communally for their first week in large bins (64 L) with water and dry cat food (Special Kitty: 34% protein, 13% fat) provided ad libitum. Crickets were then housed in individual containers (300 mL) and provided with water and food ad libitum. When 47 ± 2 days old (mean ± SD), crickets were haphazardly assigned to either a low-protein (90% bran, 10% cat food) or a high-protein (10% bran, 90% cat food) diet (n = 127 and 126, respectively) for the duration of the experiment. The bran and cat food were ground to a fine powder using a coffee grinder and thoroughly homogenized. Also at this time, crickets were haphazardly assigned to an immune status treatment, which consisted of either a control injection of 2 μL of phosphate-buffered saline (PBS; Sigma-Aldrich) (n = 125) or an experimental injection of 2 μg of lipopolysaccharide (LPS; Sigma-Aldrich) derived from the bacterium Serratia marcescens and dissolved in 2 μL of PBS (n = 128). LPS is a nonpathogenic and nonliving elicitor that stimulates several pathways in the immune system of insects (Moret & Schmid-Hempel, 2000; Ahmed et al., 2002), including orthopterans (e.g. Adamo, 1999; Jacot et al., 2004; Fedorka & Mousseau, 2007; Shoemaker & Adamo, 2007; Leman et al., 2009; Kelly, 2011). All injections were given into the haemocoel, through the membrane between the sixth and seventh abdominal sternites, using a 10-μL Hamilton syringe equipped with a 26s-gauge needle. Juvenile crickets were each injected three times, with injections given every 7th day. The first injection was given at a standardized pronotum length of 1.68 ± 0.557 mm (mean ± SD), and a two-way ANOVA confirmed that body size was similar between the experimental treatments at the time of the first injection (diet × status: F1,248 = 0.009, P = 0.924; status: F1,248 = 0.013, P = 0.908; diet: F1,248 = 1.42, P = 0.234).
Pronotum length (a proxy for body size) was defined as the distance between the anterior and posterior edges of the pronotum and was measured under a stereomicroscope using Leica LAS image analysis software (Leica Microsystems Inc., Buffalo Grove, IL, USA).

Experimental crickets were fed ad libitum, and food, water and containers were replaced weekly. Crickets were reared and maintained at 27 ± 1 °C on a 12 : 12-h light–dark cycle and were checked daily for death and eclosion to adulthood. At eclosion, the time to eclose (days), body mass (g) and pronotum length (mm) were recorded. A total of 11 crickets were removed from the data set because they eclosed after receiving only two injections of either saline or LPS.

Seven days after eclosion, experimental males and females were paired with an opposite-sex stimulus cricket to assess the effect of our experimental treatments on reproductive investment. Experimental females were placed in a deli cup with a haphazardly chosen adult male and a small dish of sand for oviposition. The stimulus male was replaced after 3 days with another haphazardly chosen male. After 3 days with the second stimulus male, the oviposition cup was collected and all eggs laid over the 6 days were counted. Thereafter, stimulus males and oviposition cups were changed weekly (eggs not counted) until the death of the experimental female. Experimental males were each placed with a stimulus adult female. Stimulus females were replaced weekly until the death of the experimental male. We attempted to sample mate-calling effort of each experimental male in order to measure reproductive investment; however, too few males were successfully recorded to permit analysis.

Body condition at eclosion was calculated for each cricket using Peig & Green's (2009, 2010) scaled mass index (SMI). This index uses the equation SMI = Mi[L0/Li]^bSMA to standardize individual mass (Mi) to a specific fixed body size (L0 = mean size of data set; Li = individual size). The scaled mass index is superior to other methods of determining body condition from mass and length estimates because its use of model II linear regression (i.e. standardized major axis regression) incorporates the likelihood that both variables have some underlying error associated with their measurement. We first used model II regression to calculate the slope (bSMA) of the best-fit line from a standardized major axis regression of fresh body mass on pronotum length (both variables log-transformed). The model II slopes for the two sexes did not differ (likelihood ratio = 0.0004, λ = 7.64, P = 0.98), so we combined them and used the common slope (bSMA = 2.76) in our analysis. L0 is the mean pronotum length from the entire data set (2.938 mm). We calculated each cricket's SMI by substituting its fresh body mass (Mi) and pronotum length (Li) into the equation SMI = Mi[2.938/Li]^2.76. We note that using residual body mass from a regression of log(body mass) on log(pronotum length) instead of the scaled mass index gives identical results in this case; however, we elected to present the scaled mass index because it has the advantage of retaining the original units of Y and it produces standardized Y-values that can be compared across studies of the same species.
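The SMI calculation can be sketched in code. The following Python sketch is an illustrative reimplementation, not the authors' script (their analyses were run in R with smatr), and the mass and length values below are invented for demonstration: it estimates the SMA slope of log(mass) on log(length) and scales each individual's mass to the reference size L0.

```python
import math

# Hypothetical data: fresh body mass (g) and pronotum length (mm)
mass = [0.31, 0.25, 0.34, 0.28, 0.30, 0.22, 0.36, 0.27]
length = [3.0, 2.7, 3.1, 2.9, 3.0, 2.6, 3.2, 2.8]

def sma_slope(x, y):
    """Standardized major axis slope: sign(covariance) * sd(y)/sd(x).

    Unlike ordinary least squares, SMA treats both variables as
    measured with error, which is why Peig & Green recommend it.
    """
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sign = 1.0 if sxy >= 0 else -1.0
    return sign * math.sqrt(syy / sxx)

# Slope of log(mass) on log(length) gives the allometric exponent b_SMA
log_l = [math.log(v) for v in length]
log_m = [math.log(v) for v in mass]
b_sma = sma_slope(log_l, log_m)

# Reference size L0: mean pronotum length of the data set
L0 = sum(length) / len(length)

# SMI_i = M_i * (L0 / L_i) ** b_SMA  -- each cricket's mass standardized
# to the mass it would have at the common body size L0
smi = [m * (L0 / l) ** b_sma for m, l in zip(mass, length)]
```

With real data, b_sma and L0 would be replaced by the values reported above (2.76 and 2.938 mm).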

The effects of diet treatment, immune status and sex on body mass, pronotum length and body condition at eclosion were tested using generalized linear models. We used Cox regression (our data met the assumption of proportional hazards) to test the effects of diet, immune status and sex on the time to eclosion, adult lifespan and total lifespan. The effect of diet and immune status on female fecundity was examined using a generalized linear model (GLM) ANCOVA with a Poisson error distribution; female body size was entered as a covariate. We used a model selection approach based on Akaike's information criterion (AIC) to select a minimal adequate model for each analysis. Statistical analyses were conducted using the statistical environment R 3.0.1 (R Development Core Team, 2013), within which model II regressions were conducted using smatr (Warton et al., 2011), the minimal adequate generalized linear model was selected using MuMIn (Bartoń, 2013) and data were visualized using ggplot2 (Wickham, 2009). Means are given ±1 SEM unless otherwise stated.
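AIC-based selection of a minimal adequate model reduces to computing AIC = 2k − 2 ln L for each candidate model and retaining the one with the lowest value. A minimal Python illustration follows; the model names and log-likelihoods are invented for demonstration and are not values from this study (the authors used MuMIn in R).

```python
def aic(log_likelihood, n_params):
    """Akaike's information criterion: 2k - 2*lnL (lower is better)."""
    return 2 * n_params - 2 * log_likelihood

# Two hypothetical candidate models for, e.g., body mass at eclosion,
# with made-up maximized log-likelihoods and parameter counts
candidates = {
    "diet + immune status + sex": aic(-412.3, 4),
    "diet + sex":                 aic(-414.0, 3),
}

# The minimal adequate model is the candidate with the lowest AIC
best = min(candidates, key=candidates.get)
```

In practice MuMIn automates this comparison over all subsets of a global model; the sketch only shows the criterion itself.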

Results

Effect of diet quality and immune status on juvenile survival and time to adult eclosion

Immune status was not retained in the minimal adequate model that explored the effect of our experimental factors on eclosion success. The best-fit generalized linear model showed that significantly fewer crickets eclosed on the poor vs. the good diet (diet treatment: z = −3.86, P = 0.00014; Fig. 1).

Figure 1. Eclosion success of Gryllus texensis crickets reared on poor and good diets (black bars = did not eclose; white bars = eclosed).

For those crickets that survived to eclosion, the minimal adequate model showed a significant sex × diet quality interaction (z = 2.973, P = 0.0029) and a marginal sex × immune status interaction (z = 1.865, P = 0.062) on the time to eclose; therefore, we analysed the data for each sex separately. The minimal adequate model showed that diet quality significantly affected the time to eclose in females (z = 2.95, P = 0.0031), with those on the poor diet (median = 101 days, n = 13) requiring more time than females fed the good diet (median = 76 days, n = 30) (Fig. 2a). In contrast, the minimal adequate model showed that immune status affected the time to eclose in males (z = 1.98, P = 0.048), as LPS-injected males (median = 83 days, n = 15) required more time than saline-injected males (median = 73 days, n = 25) (Fig. 2b).

Figure 2. Sex-specific factors affecting nymphal development time. Female Gryllus texensis (a) development was significantly delayed on a poor (dashed line) rather than on a good (solid line) diet, whereas male (b) development was significantly delayed after repeated immune challenges with LPS (solid line) rather than after repeated injections with saline (dashed line).

Effect of diet quality and immune status on body size, mass and condition

As expected, the minimal adequate model showed that crickets raised on a good diet eclosed with a significantly longer pronotum (i.e. body size) than those on a poor diet (diet: t80 = 4.46, P < 0.0001), with males being significantly smaller at eclosion than females (sex: t80 = 2.48, P = 0.015) (Fig. 3).

Figure 3. Gryllus texensis reared on a good diet eclosed at a significantly larger mean (±SE) body size (i.e. pronotum length) than those reared on a poor diet, and females (triangles) eclosed at a larger size than males (circles).

As expected, the minimal adequate model showed that crickets fed a poor diet had significantly lower body mass than those fed a good diet (good diet: 0.328 ± 0.011 g, n = 55; poor diet: 0.252 ± 0.011 g, n = 28; diet treatment: t81 = 4.287, P < 0.0001).

Contrary to expectation, the minimal adequate model showed no significant sex difference in body condition (scaled mass index) of crickets at eclosion (males: 0.312 ± 0.010 g, n = 40; females: 0.291 ± 0.009 g, n = 43; sex: t81 = 1.568, P = 0.121).

Effect of diet quality and immune status on adult survival

The minimal adequate model revealed a significant interaction between diet treatment and sex (z = 3.109, P = 0.0019); therefore, we analysed each sex separately. Separate minimal adequate models for each sex showed that female adult longevity was not significantly affected by our experimental treatments (diet treatment: z = 0.534, P = 0.593; immune status: z = 0.964, P = 0.335; diet × immune status: z = 0.815, P = 0.415) (Fig. 4a). Male adult longevity, however, was significantly affected by diet (z = 2.96, P = 0.0031), with those fed a poor diet (median = 32 days, n = 25) having a longer adult lifespan than those fed a good diet (median = 23 days, n = 15) (Fig. 4b).

Figure 4. Sex-specific factors affecting adult longevity. The adult longevity of female Gryllus texensis (a) was not significantly affected by either diet (dashed line = poor; solid line = good) or juvenile immune challenges (grey = saline-injected; black = LPS-injected). The adult longevity of males (b) was significantly reduced on a good (solid line) vs. a poor diet (dashed line).

Effect of diet quality and immune status on total lifespan

The minimal adequate model showed a marginal sex × diet quality interaction (z = 1.877, P = 0.061) and a significant sex × immune status interaction (z = 2.608, P = 0.0091) on the total lifespan of crickets that survived to adulthood; therefore, we analysed the data for each sex separately. The minimal adequate model showed that diet quality significantly affected total lifespan in females (diet treatment: z = 2.86, P = 0.0042; immune status: z = 1.13, P = 0.26; diet × immune status: z = 1.45, P = 0.15), as those fed a poor diet (median = 109 days, n = 28) lived longer than females fed the good diet (median = 100 days, n = 55) (Fig. 5a). In contrast, the minimal adequate model showed that immune status significantly affected total lifespan in males (z = 2.58, P = 0.0099), with LPS-injected males (median = 114 days, n = 15) living longer than saline-injected males (median = 98 days, n = 25) (Fig. 5b).

Figure 5. Sex-specific factors affecting total lifespan. The total lifespan of female Gryllus texensis (a) was significantly longer on a poor (dashed line) vs. a good diet (solid line). Male (b) total lifespan was significantly longer in those that were repeatedly immune-challenged as juveniles with LPS (solid line) vs. saline (dashed line).

Female fecundity

A female that was LPS-injected and fed a good-quality diet was dropped from the final data set because the number of eggs she laid (n = 181) was more than three standard deviations from the mean. The minimal adequate model retained the two main effects (diet treatment: z = 7.233, P < 0.0001; immune status: z = 6.341, P < 0.0001) as well as the covariate pronotum length (z = 6.508, P < 0.0001). This model showed that, as predicted, a poor diet (Fig. 6a) and repeated immune challenges (Fig. 6b) significantly reduced female fecundity. The number of eggs laid was also significantly positively related to female body size (Fig. 6a,b).

Figure 6. Diet (poor = triangles and dashed line; good = circles and solid line) and immune status (saline = triangles and dashed line; LPS = circles and solid line) significantly affected female fecundity. The number of eggs laid was also significantly positively related to female body size in both treatments.

Discussion

Evidence is accumulating across a wide array of animal taxa suggesting that adult fitness is critically dependent upon the conditions experienced as a juvenile (Nylin & Gotthard, 1998; Lindstrom, 1999; Metcalfe & Monaghan, 2001; Monaghan, 2008; Dmitriew, 2011). Poor conditions during ontogeny are known to increase age at maturity (i.e. prolonged development time) and to decrease body size and mass at maturity (Nylin & Gotthard, 1998), all of which can negatively affect reproductive success and adult survival. In line with our prediction, poor rearing conditions had a significant negative effect on the body size and mass at eclosion in both sexes of G. texensis crickets while, surprisingly, having little effect on body condition at eclosion. We also found that poor rearing conditions significantly extended juvenile development time compared with good conditions. Consequently, because poor rearing conditions had a profound effect on juvenile development compared with adult lifespan, crickets in the poor environment had a significantly longer total lifespan than those reared in a good environment. Importantly, the sex-specific responses to different aspects of the rearing environment observed in our study suggest that there is no universally ‘poor environment’ in G. texensis crickets.

We predicted that a poor diet and repeated immune challenges would delay the age at maturity because individuals would be forced to allocate their already limited nutritional resources to immune function rather than to growth and development. Although the data failed to support this prediction, we did find sex-specific effects of diet and immune status on nymph development. Our experiment showed that female development is more affected by diet quality than by immune status with those fed a poor-quality diet requiring significantly more time to eclose than those fed a good-quality diet. By delaying maturity under poor nutritional conditions, females might compensate for their suboptimal nutritional income and increase their body size at eclosion (Roff, 1992). Increasing body size should benefit female fitness because larger female G. texensis are generally more fecund (this study).

The lack of a significant effect of repeated immune challenges on female development time suggests that perhaps females simply allocate the bulk of their nutritional resources to growth and development at the expense of immunity. This seems unlikely, however, as survival to adulthood at any body size would surely be more beneficial to female fitness than failing to eclose due to a lethal dose of disease or parasites. Alternatively, perhaps our LPS dosage was too weak to elicit a significant investment into an immune response by females, which might suggest that female G. texensis have a higher tolerance for parasites, pathogens and disease than males. Thus, setting a higher tolerance threshold might permit females to allocate more resources to growth than males. We suggest that future experiments increase the dosage of the immune elicitor to determine whether immune challenges during ontogeny do indeed significantly affect female life history decisions.

In contrast, male development was more negatively affected by repeated immune challenges during ontogeny than by diet quality, a finding similar to Simmons' (2012) study on the cricket Teleogryllus commodus (see also McNamara et al., 2013). This response by males might be adaptive for two reasons. First, body size and mass might not be as important to fitness in male crickets as reducing the age of maturity and entering the mating arena as quickly as possible (reviewed in Nylin & Gotthard, 1998; Morbey & Ydenberg, 2001). Despite some evidence suggesting that adult body size is important to male fitness in crickets (e.g. Simmons, 1988, 1995; Zuk, 1988; Simmons & Zuk, 1992; Harrison et al., 2013), a growing number of studies point to adult diet being a greater fitness determinant (Gray & Eckhardt, 2001; Holzer et al., 2003; Scheuber et al., 2003; Hunt et al., 2004; Hedrick, 2005; Judge et al., 2008; Maklakov et al., 2008; Zajitschek et al., 2009; Kelly & Tawes, 2013). That males fed either a poor or good diet eclosed at the same time, on average, but at different sizes and masses (i.e. well-fed males were larger and heavier), supports the hypothesis that minimizing development time might be more important than maximizing size. Second, allocating resources to ensure survival to the reproductive stage of life should be adaptive, despite increasing the age of maturity, because dying before eclosion guarantees a fitness of zero.

Poor rearing conditions did not negatively affect body condition at eclosion in our study, a finding that differs from other work on crickets (Gray & Eckhardt, 2001; Scheuber et al., 2003; Jacot et al., 2005a,b; Kelly & Tawes, 2013). This apparent difference in outcome might be due to other studies using residual body mass as an index of body condition, an index that is considered inferior to the scaled mass index (Peig & Green, 2009, 2010). That said, however, both methods gave similar results in the present study. Similar body condition across treatments in our study might be explained by males and females adjusting their allocation of nutritional resources to both size and mass simultaneously during ontogeny in a manner that maximizes body condition at eclosion. Maximizing condition at eclosion should be adaptive given that several components of fitness in crickets and their kin are condition dependent (e.g. mate calling: Holzer et al., 2003; Hunt et al., 2004; Jacot et al., 2004; Judge et al., 2008; immune function: Jacot et al., 2004; Kelly, 2011; female fecundity: Adler et al., 2013; Stahlschmidt et al., 2013) rather than being strictly size based (Gray & Eckhardt, 2001; Scheuber et al., 2003; Zajitschek et al., 2009).

Adult longevity was significantly affected by nutritional intake only in males, with those fed a poor diet having significantly greater longevity than those fed a good diet. This result counters the general pattern across animals (Nakagawa et al., 2012), but is in line with Hunt et al.'s (2004) study on the field cricket Teleogryllus commodus, in which they argued that well-fed males die earlier because they invest heavily in energetically expensive mate calling early in life. The earlier death of well-fed adult males in our study is perhaps also due to greater investment in mate-calling effort compared with poorly fed males. This does not appear to be a universal effect of diet quality on male crickets, however, as Judge et al. (2008) found that well-fed male G. pennsylvanicus lived longer and called more than poorly fed males. In contrast to males, we found no effect of diet on the adult lifespan of females, a finding contradictory to other studies on crickets (Wagner & Harper, 2003; Hunt et al., 2004; Judge et al., 2008). That poorly fed females in our study had the same adult lifespan as well-fed individuals might be due to a reallocation of resources by poorly fed females to somatic maintenance and away from reproduction, given that poorly fed females laid fewer eggs than well-fed females (see below). The common observation across animal taxa of poorly fed females having reduced fecundity without suffering a cost to their adult lifespan (e.g. Klass, 1977; Chapman & Partridge, 1996; Selesniemi et al., 2008) has long been thought to reflect an adaptive resource allocation strategy to maximize an individual's chances of surviving periods of food shortage (Holliday, 1989; Shanley & Kirkwood, 2000; Partridge et al., 2005). Recent studies, however, have called this hypothesis into question (Mair et al., 2004; Grandison et al., 2009; Fanson et al., 2012; Adler et al., 2013).

Factors comprising our poor rearing environment affected the total lifespan of each sex differently but, within each sex, a longer total lifespan was due to a longer juvenile phase. Female total lifespan was significantly influenced by diet quality, with individuals fed a poor-quality diet having a longer lifespan than those fed a high-quality diet. Because none of our experimental treatments affected the longevity of adult females, the observed difference in total lifespan is attributed to the prolonged developmental period of poorly fed females. In contrast, Barrett et al. (2009) found that female cockroaches (Nauphoeta cinerea) reared on poor and good diets had similar total lifespans, but achieved them through different processes: poorly fed females had a long juvenile phase and short adult lifespan, whereas the opposite was true for well-fed females. The total lifespan of males, on the other hand, was significantly influenced by immune status, with immune-challenged individuals having a longer total lifespan than saline-injected individuals. Although poorly fed males had a longer adult lifespan than their well-fed counterparts, adult lifespan seems to have contributed little to their total lifespan. Thus, the prolonged developmental period of LPS-injected males appears to be the main factor responsible for the observed extended total lifespan in males.

We found that repeated immune challenges with LPS during the juvenile stage did not significantly affect adult longevity. Jacot et al. (2005b) found that male crickets that were injected with LPS as juveniles had some immune parameters that functioned better than others as adults, and Kelly & Tawes (2013) found that female crickets reared on a poor diet had significantly greater disease resistance than females reared on a good diet. Thus, if adult longevity is tied to immune function, then we would likely only see the effects of our experimental treatments when individuals are exposed to disease, pathogens or parasites. However, because our experimental crickets were maintained as adults under relatively benign conditions, greater immunological activity would likely have had little opportunity to impact longevity.

As predicted, a poor juvenile environment significantly affected female reproductive success with poorly fed and immune-challenged juvenile females laying significantly fewer eggs as adults compared with their well-fed and unchallenged counterparts. This is not surprising given the significant positive relationship between body size and fecundity in our females and that our poorly fed females had smaller adult body size. However, that both experimental treatments negatively affected fecundity, despite our statistically controlling for body size, suggests that diet and immune status have effects on female fecundity that are above and beyond the simple body size–fecundity relationship. Because females were maintained on their same diets as adults, poorly fed females might simply have had fewer nutritional resources to allocate to egg production compared with females fed a good diet. Other studies on insects have also demonstrated that females that are well fed as juveniles (Adler et al., 2013) or adults (Stahlschmidt et al., 2013) are more fecund than poorly fed individuals. Less straightforward, however, is our observed negative effect of juvenile immune challenges on adult fecundity. One possible explanation is that, similar to Jacot et al.'s (2005a,b) finding in male G. campestris crickets, perhaps immune challenges during ontogeny caused females to up-regulate aspects of their adult immune system that are in competition with egg production for limited resources. Alternatively, perhaps ovarian development was hampered by the repeated immune challenges if this structure's development was in competition with the immune system for limited resources prior to eclosion (see also Barrett et al., 2009). In contrast to our finding, McNamara et al. (2013) did not find any effect of a juvenile immune challenge on adult female reproduction in the bollworm Helicoverpa armigera.

Taken together, our results suggest that the life history decisions of male and female G. texensis crickets during ontogeny are affected by different facets of the rearing environment; females are affected more by diet quality, whereas males are affected more by immune status. Moreover, the particular facet to which the sexes respond appears to be directly linked to how each sex maximizes its fitness. Our data suggest that juvenile females respond to decreased nutritional income because this likely threatens their ability to achieve a large adult body size, whereas male fitness is less affected by body size than it is by reaching adulthood, and so males invest more in their immunity and survival to eclosion. Also, our finding that both sexes might maximize body condition at eclosion supports de Block & Stoks' (2005) cautionary point that phenotypic traits other than size or mass might be maximized during development in a poor environment.

Acknowledgments

Funding was provided by an Iowa State University Faculty Start-up Award to CDK. We thank two anonymous referees whose comments improved the article.

References
