To investigate age-related trends in bulimic symptoms and associated putative risk factors among Norwegian youth.
A sample of 3,150 participants, 1,421 (45.1%) males and 1,729 (54.9%) females, was prospectively followed for 11 years at three time points from adolescence to adulthood. Linear random coefficient models were applied.
For females, bulimic symptoms increased from age 14 to 16 and declined slowly thereafter. For males, the symptoms decreased between ages 14 and 16 and returned in the early 20s. Females had higher levels of symptoms than males at every age. Age-associated trends in body mass index, appearance satisfaction, and symptoms of anxiety and depression were associated with some of the trends for both genders. For females, changes in alcohol consumption and cohabitation status functioned as predictors as well.
Males and females show distinct developmental trajectories of bulimic symptoms during adolescence and in the transition to adulthood. Prevention interventions should focus on putative risk factors in mid-adolescence for females and in the early 20s for males. © 2011 by Wiley Periodicals, Inc. (Int J Eat Disord 2012; 45:737–745)
Bulimia is one of the most commonly reported types of eating disorders in both clinical1 and nonclinical populations.2 For girls, adolescence is regarded as a developmental period marked by elevated risks of eating disorders, including bulimia.3 Moreover, it has been stated that the prevalence of bulimia gradually declines from the early 20s.4 Relatively little is known, however, about developmental trends of bulimic behavior across adolescence and in the transition to adulthood in nonclinical populations, and how such trends might be explained. Furthermore, there is a notable lack of population-based studies that include both male and female participants and systematically examine gender differences in developmental patterns of symptoms of bulimia. By using data from a large population-based longitudinal cohort study, this study aims to examine age trends in bulimia symptomatology for both males and females and determine how these trends might be related to putative risk factors.
To date, few prospective longitudinal studies have addressed the course of eating problems from adolescence to adulthood. Keel et al.5 located only six studies that bridged this transition period, most of them with small samples and short follow-ups that could preclude the detection of development-related changes in eating problems. Many studies have focused on females attending selective colleges or private schools, so those results may not be representative of the general population. Hence, knowledge regarding developmental pathways and risk factors associated with bulimia is still incomplete.3, 6
In general, adolescence is considered the critical developmental stage for bulimic symptoms, and the symptoms increase during this period.3, 7 The peak risk for abnormal eating behavior seems to be mid-adolescence (15–16 years),8 while binge eating and purging appear somewhat later.9 A recent community-based longitudinal study also reported that the peak period of risk for onset of bulimia and binge eating was between ages 17 and 18.10 The risk of developing bulimia may decline during the transition into adulthood, but thus far only one longitudinal study following adolescents into adulthood supports this pattern.5, 11 Other studies, however, have found no significant changes in the prevalence of bulimia and disordered eating in the transition from adolescence to young adulthood.12, 13 One possible explanation for the finding of no change in prevalence is that participants in these studies were followed up only until their early 20s, potentially missing developmental changes affecting eating problems during the complete transition to adulthood.12, 13 It is also possible that the different findings of these studies are the result of differences related to study designs, sample populations, and definitions and measurements of eating problems.
As for potential differences between males and females in the developmental trends of bulimia, again only the aforementioned study by Keel and colleagues explored this issue.5, 11 Their study found an increased risk of bulimia during adolescence for both females and males, followed by a significant decrease during the transition into adulthood.5, 11 The decreases reported for females, however, were more dramatic than those for males; and females had higher bulimia scores than males at each survey time.5
The increases in bulimic symptoms from late adolescence to young adulthood were associated with changes in dieting behavior, body dissatisfaction, and body weight.11 Moreover, decreases in the bulimia scores of females from late adolescence to midlife (≈ 40 years) were related to increased body satisfaction and the assumption of adult roles such as marriage and motherhood. For males, only decreased dieting frequency was associated with the decrease in bulimic symptoms.5 However, the role of other potential risk factors in explaining developmental trends in bulimic symptoms from adolescence to adulthood has not yet been longitudinally examined. Such potential risk factors include depressive symptoms,14 self-esteem,15 and alcohol consumption.16
This study is therefore designed to examine the development of symptoms of bulimia in a large population-based sample of males and females studied longitudinally from adolescence into young adulthood. The aim is to investigate how the level of symptoms changes with age and how those changes can be associated with putative risk factors during the same developmental period. In addition, as bulimic symptoms appear to develop differently for females and males, developmental trends are examined separately for each gender. Body mass index (BMI), appearance satisfaction, symptoms of anxiety and depression, self-worth, alcohol consumption, and cohabitation status are regarded as the putative risk factors in this article.
Data from the longitudinal study “Young in Norway” were analyzed.17, 18 This national representative study was conducted at four time points: 1992 (T0), 1994 (T1), 1999 (T2), and 2005 (T3). The initial sample at T0 was composed of 12,287 students in grades 7–12 (12–20 years of age) from 67 representative schools in Norway. Each grade was equally represented. Every school in the country was included in the register from which the schools were selected. The sample was stratified according to geographical region and school size, which in Norway is closely related to degree of urbanization. Each school's sampling probability was proportional to the number of students at the school. This procedure provided an equal probability of selection for each student. Detailed information about the sampling procedure is presented elsewhere.17, 18 The response rate at T0 was 97%.
In 1994, three of the participating schools at T0 were not part of the follow-up study (T1; 14–22 years of age). At another school, a burglary in the school's archives resulted in the loss of the project's identification records. In all, then, 9,679 students from 63 schools were eligible to complete the T1 questionnaire. As a considerable proportion of the students had completed their 3-year track at the junior or senior high school they were attending at T0, subjects who were no longer at the same school at T1 received the questionnaire by mail. The overall response rate at T1 was 79%.
Only students who completed the questionnaire in school at T1 (n = 3,844) were followed up at T2 due to a comparatively lower response rate among those receiving the questionnaire by mail. Because the survey was originally planned as a two-wave study, informed consent had to be obtained again at T1. Out of the total number of consenting individuals at T1 (n = 3,507, 91.2%), 2,923 (83.8%) responded to the questionnaire they received by mail at T2 (19–28 years of age). This represented an overall response rate of 68%.
In 2005 (T3), all those who had consented at T1 to the follow-up were again invited to participate (25–34 years of age). In all, 2,890 of 3,507 potential participants, or 82.4%, completed the questionnaire, resulting in an overall response rate of 67%. For purposes of this study, data from T1–T3 were used, as bulimic symptoms were not measured at T0. Moreover, only those who had responded to at least two of the three questionnaires from T1 to T3 were included in the analyses.
The following variables were measured from T1 to T3.
Bulimic symptoms were measured by the 30-item symptom scale of the Bulimic Investigatory Test, Edinburgh (BITE).19 The BITE assesses not only bulimic symptoms but also attitudinal and cognitive aspects of bulimia.19 The scale has been validated, showing high sensitivity and specificity in identifying female bulimic patients.19, 20 One-week test-retest coefficients were 0.86 and 0.68 in nonclinical populations and bulimic women, respectively.19 However, the validity for males has been questioned: a study conducted among Australian boys and girls aged 12 to 17 showed that the BITE revealed a one-factor structure (“Bulimia”) for girls, whereas for boys it revealed a two-factor structure, with factors labeled “Emotional and Rigid/Disruptive Eating Style” and “Food Preoccupation and Binging.”21 All items were rated on a four-point scale, ranging from 1 to 4. A mean BITE score was calculated, with high scores indicating high levels of symptoms. The internal consistency (Cronbach's α) was 0.83, 0.83, and 0.82 at T1, T2, and T3, respectively.
BMI (kg/m2) was computed from self-reported height and weight. Self-reported height and weight have been shown to yield a valid measure of BMI.22
Appearance satisfaction was assessed by the Body Areas Satisfaction Scale (BASS).23 The BASS scale showed acceptable validity and high test-retest reliability; 1-month test-retest coefficients were 0.86 for males and 0.74 for females.24 The scale consists of seven items rating the individual's level of satisfaction with the following seven body areas: face, lower torso, mid-torso, upper torso, muscle tone, weight, and height. Response options varied from 1 (“very dissatisfied”) to 5 (“very satisfied”). A mean score was computed, with high scores indicating a high level of satisfaction. The scale showed good internal consistency at each survey point: 0.81, 0.81, and 0.82 at T1, T2, and T3, respectively.
Depressive symptoms were measured by the six-item Depressive Mood Inventory, constructed by Kandel and Davies.25 The instrument has shown good validity and acceptable psychometric properties.25 It also showed high test-retest reliability (0.76) over a 5- to 6-month interval.25 Using a four-point response scale ranging from 1 to 4, participants were asked to restrict their ratings to the preceding week. Mean scores were calculated, with high scores indicating high levels of depressive symptoms. The internal consistency of the scale was 0.78, 0.80, and 0.79 at T1, T2, and T3, respectively.
Symptoms of anxiety were measured by six items derived from the Hopkins Symptom Checklist.26 Confirmatory factor analyses supported the construct validity of the six anxiety-symptom items.27 The items have a four-point response scale, ranging from 1 to 4. The rating was restricted to the preceding week. A mean score was computed, with high scores indicating strong symptoms of anxiety. The scale showed satisfactory internal consistency on all occasions: 0.81, 0.84, and 0.85 at T1, T2, and T3, respectively.
General self-worth was measured using the Global Self-Worth subscale of a revised version of Harter's Self-Perception Profile for Adolescents.28, 29 A validation study of the scale among Norwegian subjects has shown good reliability and validity.28 A four-point response scale was applied, ranging from 1 (“corresponds very poorly”) to 4 (“corresponds very well”). A mean score was computed, with high scores indicating high self-worth. The five-item scale showed good internal consistency: 0.81, 0.80, and 0.82 at T1, T2, and T3, respectively.
Alcohol consumption was measured by asking participants to indicate how often they had “drunk so much that you felt clearly intoxicated” during the preceding 12 months; the six-point response scale ranged from 1 (“never”) to 6 (“more than 50 times”). A cohabitation status variable was constructed, with 1 indicating no cohabitation (single) and 2 indicating that the participant was living with a partner (whether married or not). Age was recorded at the time of each survey. Gender was coded as 1 for male and 2 for female. Parental socio-economic status (SES) was determined on the basis of the occupations of the participant's mother and father. Reported occupations were categorized according to the ISCO-88 classification,30 and all parents were assigned to one of five categories, ranging from 1 (“manual workers”) to 5 (“professional leaders”). The parent with the highest occupational classification was used to indicate parental SES. Survey times (T1, T2, and T3) were coded as 0, 5, and 11 to represent the unequally spaced intervals (years) between the measurements. Parental SES and time variables were used only as controlling covariates.
We aimed to investigate the changes in BITE scores with age and the effects of covariates on the overall level of BITE scores. Accordingly, we used linear mixed effects models, more specifically random coefficient models. These models are also often called growth curve models and are recommended for use when modeling individual trajectories over time in longitudinal studies.31, 32 Such models are particularly suited for analyzing inherently unbalanced longitudinal data.32 Age was used as the temporal predictor for an individual growth model because participants varied in age at the outset of the study, and age, not time points, is usually the appropriate metric for analysis in an accelerated cohort design.33 The mean BITE score was modeled as a combination of population characteristics “β” assumed to be shared by all individuals (fixed effects) and subject-specific characteristics (random effects). The random effects included both a random intercept and a random slope. For purposes of this study, the fixed part of the model explaining changes in the mean BITE scores over age was presented and discussed. To simplify model fitting, we specified only a random slope for age in the analysis. We conducted a step-by-step generation of a basic model using a model contrasting approach illustrated by log likelihood values and the Akaike information criterion (AIC). Maximum likelihood estimates were applied. Statistical analysis was performed using Stata SE/10 for Windows.
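The analyses were conducted in Stata, but the same kind of linear random coefficient model can be illustrated in Python with statsmodels. The sketch below is purely illustrative: the simulated accelerated-cohort data, variable names, and parameter values are invented for the example, not taken from the study. It fits a linear age model and a quadratic extension by maximum likelihood and compares them with a likelihood-ratio statistic, mirroring the model-contrasting approach described above.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)

# Simulate an accelerated-cohort design: 300 individuals, 3 waves
# spaced 0, 5, and 11 years apart, with varying ages at baseline.
n, waves = 300, 3
ids = np.repeat(np.arange(n), waves)
base_age = rng.integers(14, 23, size=n)              # age at first wave
age = np.repeat(base_age, waves) + np.tile([0.0, 5.0, 11.0], n)

# True model: population slope of -0.01 per year plus subject-specific
# random intercepts and random slopes (all values are illustrative).
rand_int = rng.normal(0, 0.2, size=n)
rand_slope = rng.normal(0, 0.02, size=n)
bite = (2.0 - 0.01 * age + rand_int[ids]
        + rand_slope[ids] * (age - 20) + rng.normal(0, 0.2, size=n * waves))
df = pd.DataFrame({"id": ids, "age": age, "bite": bite})

# Model 1: linear age trend; Model 2 adds a quadratic term.
# reml=False gives ML fits, so log-likelihoods are comparable
# across models that differ in their fixed effects.
m1 = smf.mixedlm("bite ~ age", df, groups=df["id"], re_formula="~age").fit(reml=False)
m2 = smf.mixedlm("bite ~ age + I(age**2)", df, groups=df["id"],
                 re_formula="~age").fit(reml=False)

lr = 2 * (m2.llf - m1.llf)   # likelihood-ratio statistic, 1 df
print(round(m1.params["age"], 3), round(lr, 2))
```

With the simulated quadratic effect set to zero, the likelihood-ratio statistic should be small, which is the pattern the authors report when contrasting their linear and quadratic models.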
Since a cluster sampling strategy was used where students were recruited at selected schools, the intraclass correlation coefficient (ICC) was computed, indicating that 5% of the variance of bulimic symptom scores was attributable to the school level. Since we found such low ICC for the school effect, we did not include schools as a higher level in the random coefficient models.
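As a minimal sketch of the ICC computation just described, the one-way random-effects ICC for a balanced design can be derived from between- and within-school mean squares. The simulated numbers below are invented for illustration; only the formula is standard.

```python
import numpy as np

def icc_oneway(scores_by_school):
    """ICC(1) from a one-way random-effects ANOVA on a balanced design:
    the share of total variance attributable to the grouping (school) level.
    ICC = (MSB - MSW) / (MSB + (n - 1) * MSW)."""
    groups = [np.asarray(g, dtype=float) for g in scores_by_school]
    k = len(groups)                      # number of schools
    n = len(groups[0])                   # students per school (balanced)
    grand = np.mean(np.concatenate(groups))
    msb = n * sum((g.mean() - grand) ** 2 for g in groups) / (k - 1)
    msw = sum(((g - g.mean()) ** 2).sum() for g in groups) / (k * (n - 1))
    return (msb - msw) / (msb + (n - 1) * msw)

rng = np.random.default_rng(0)
# Illustrative data: 20 schools x 50 students, with a school-level
# variance component of roughly 5% of the total, as reported above.
school_effect = rng.normal(0, np.sqrt(0.05), size=20)
data = [rng.normal(1.7 + e, np.sqrt(0.95), size=50) for e in school_effect]
print(round(icc_oneway(data), 3))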
To increase homogeneity, 72 participants who were older than 22 (n = 69) or younger than 13 (n = 3) at T1 were excluded. In all, 3,150 participants were included in the analysis; 1,421 (45.1%) were males and 1,729 (54.9%) were females. Summary statistics and mean changes over time for all variables are summarized in Table 1. The participants were prospectively followed for 11 years covering the age span from 14 to 33 years. The mean ages were 16.3, 21.8, and 28.3 years at T1, T2, and T3, respectively. By comparing mean BITE scores at each time point, we observed a significant reduction in BITE scores for females over time (β = −0.3) but saw no change in BITE scores over time for males. Females reported significantly higher BITE scores compared with males at each assessment. Both BMI and alcohol consumption increased significantly in males and females over time. Females reported higher levels of depressive and anxiety symptoms compared with males at each assessment. Symptoms of both depression and anxiety declined significantly for females over time, but for males only the anxiety symptoms decreased significantly. Females reported a significant increase in appearance satisfaction over time, while males reported a significant decrease. Still, males demonstrated higher appearance satisfaction than females at each time point (see Table 1).
Table 1. Summary statistics, mean (SD), and linear change over time for males (n = 1,421, 45.1%) and females (n = 1,729, 54.9%)

| Variables | Males T1 | Males T2 | Males T3 | Males β (SE) | Females T1 | Females T2 | Females T3 | Females β (SE) |
|---|---|---|---|---|---|---|---|---|
| BITE score | 1.63 (0.3) | 1.64 (0.2) | 1.64 (0.2) | 0.004 (0.01)ns | 1.83 (0.3) | 1.80 (0.4) | 1.75 (0.3) | −0.3 (0.01)*** |
| Age (years) | 16.3 (1.7) | 21.8 (1.7) | 28.3 (1.7) | 5.99 (0.1)*** | 16.5 (1.7) | 21.8 (1.9) | 28.4 (1.8) | 5.94 (0.1)*** |
| BMI | 21.6 (5.3) | 24.1 (4.7) | 25.3 (3.7) | 1.90 (0.8)*** | 20.7 (4.6) | 22.6 (4.2) | 23.6 (4.1) | 1.45 (0.5)*** |
| Appearance satisfaction | 3.47 (0.6) | 3.71 (0.6) | 3.64 (0.6) | −0.05 (0.01)*** | 3.24 (0.6) | 3.32 (0.6) | 3.38 (0.6) | 0.07 (0.01)*** |
| Self-worth | 2.60 (0.3) | 2.64 (0.3) | 2.58 (0.3) | −0.01 (0.01)ns | 2.51 (0.3) | 2.52 (0.3) | 2.50 (0.3) | −0.01 (0.01)ns |
| Anxiety symptoms | 1.29 (0.3) | 1.29 (0.4) | 1.26 (0.4) | −0.01 (0.01)* | 1.62 (0.5) | 1.53 (0.5) | 1.41 (0.4) | −0.10 (0.01)*** |
| Depressive symptoms | 1.58 (0.5) | 1.61 (0.5) | 1.55 (0.5) | −0.01 (0.01)ns | 1.91 (0.6) | 1.80 (0.6) | 1.63 (0.6) | −0.13 (0.01)*** |
| Alcohol consumption | 2.48 (1.6) | 4.07 (1.5) | 3.91 (1.4) | 0.74 (0.03)*** | 2.48 (1.6) | 3.51 (1.5) | 3.12 (1.5) | 0.33 (0.02)*** |
| Parental SES at T1 | 3.49 (1.2) | – | – | – | 3.41 (1.2) | – | – | – |
| Cohabitation status (%) | 1.83 | 16.6 | 64.1 | – | 5.0 | 36.2 | 78.1 | – |
We then graphically examined how changes in mean BITE scores were related to the age of participants. As shown in Figure 1, females had higher levels of bulimic symptoms at every age, when compared with males. The figure also indicates gender differences in the shape of the developmental trends of bulimic symptoms: While the mean BITE score for females increased until age 16 and then followed a downward trend, males' mean score decreased until age 17, rose thereafter until age 20, and then remained relatively stable.
Our next step was to fit a model to determine how BITE scores change with age from adolescence to young adulthood and to examine the shape of this change. For this purpose, a series of random effect models was estimated, with age and gender as independent variables and time as a control covariate (Table 2). In Model 1 in the table, age was included as the sole predictor of BITE scores. The significant regression coefficient for age indicates that the mean BITE score decreased linearly by 0.01 per year. In Model 2, the quadratic effect of age was added as a predictor to look for nonlinear age trends. No such quadratic trend was found, as shown by the nonsignificant regression coefficient. Moreover, the quadratic trend model did not improve the overall fit of the model, when compared with a linear trend model, χ2(1) = 0.7, p > .05. The quadratic term was therefore dropped from further analysis.
Table 2. Males and females combined (n = 3,150)

| | Model 1 β (SE) | Model 2 β (SE) | Model 3 β (SE) | Model 4 β (SE) |
|---|---|---|---|---|
| Fixed effects | | | | |
| Intercept | 1.888 (0.043)*** | 1.949 (0.066)*** | 1.653 (0.043)*** | 1.441 (0.056)*** |
| Age | −0.009 (0.002)** | −0.015 (0.005)* | −0.010 (0.002)*** | −0.001 (0.003)ns |
| Gender | – | – | 0.167 (0.009)*** | 0.306 (0.025)*** |
| Random effects | | | | |
| Intercept | 0.397 (0.020)*** | 0.397 (0.020)*** | 0.378 (0.020)*** | 0.371 (0.020)*** |
| Age | 0.013 (0.001)*** | 0.013 (0.001)*** | 0.013 (0.001)*** | 0.013 (0.001)*** |
| Residual | 0.211 (0.003)*** | 0.211 (0.003)*** | 0.210 (0.003)*** | 0.211 (0.003)*** |
In Model 3, gender differences in the mean BITE score were examined by including gender as a predictor. The results show a significant gender difference in BITE scores, with females reporting higher symptoms on average than males. To assess gender differences in the shape of the developmental trend, an age-by-gender interaction term was included in Model 4. The coefficient of this interaction suggests that males and females differed in how bulimic symptoms developed over age. More specifically, the BITE scores declined more markedly for females than for males. The addition of gender and its interaction with age also resulted in considerable improvements of the log likelihood and AIC values, reflecting the superior fit of these models compared with the model where only age was included as a predictor.
Because a significant interaction effect for gender was found, separate analyses were conducted to examine specific age trends for females and for males. To capture the shape of the developmental trends adequately, piecewise spline growth curve models were constructed.34 Based on the observed trends in Figure 1, we specified knots at ages 16 and 20, yielding three segments with separate slopes for the changes in BITE scores at ages up to 16, between 16 and 20, and from 20 onward. This approach accommodates the nonlinear development of bulimic symptoms. The results are presented in Table 3.
Table 3. Males (n = 1,421, 45.1%) and females (n = 1,729, 54.9%)

| | Males β (SE) | Males β (SE)+ | Females β (SE) | Females β (SE)+ |
|---|---|---|---|---|
| Fixed effects | | | | |
| Intercept | 2.515 (0.162)*** | 2.245 (0.176)*** | 1.037 (0.186)** | 1.599 (0.198)*** |
| 14 to 16 years | −0.057 (0.010)*** | −0.047 (0.011)*** | 0.053 (0.012)*** | 0.022 (0.012)ns |
| 16 to 20 years | 0.008 (0.003)* | 0.005 (0.001)ns | −0.019 (0.004)*** | −0.022 (0.005)* |
| ≥ 20 years | 0.001 (0.001)ns | 0.002 (0.003)ns | −0.005 (0.001)*** | −0.004 (0.004)ns |
| BMI | – | 0.007 (0.001)*** | – | 0.006 (0.001)*** |
| Appearance satisfaction | – | −0.063 (0.007)*** | – | −0.181 (0.009)*** |
| Self-worth | – | 0.035 (0.013)* | – | 0.019 (0.016)ns |
| Anxiety symptoms | – | 0.084 (0.015)*** | – | 0.108 (0.013)*** |
| Depressive symptoms | – | 0.044 (0.011)*** | – | 0.057 (0.011)*** |
| Alcohol consumption | – | −0.001 (0.002)ns | – | 0.013 (0.003)*** |
| Cohabitation status | – | −0.023 (0.011)ns | – | −0.042 (0.012)*** |
| Random effects | | | | |
| Intercept | 0.407 (0.020)*** | 0.347 (0.024)*** | 0.346 (0.036)*** | 0.289 (0.040)*** |
| Age | 0.155 (0.001)*** | 0.013 (0.001)*** | 0.011 (0.002)*** | 0.008 (0.002)*** |
| Residual | 0.162 (0.003)*** | 0.163 (0.004)*** | 0.239 (0.005)*** | 0.221 (0.004)*** |

+ Adjusted for the putative risk factors.
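The piecewise coding behind these spline models can be sketched as follows. This is a minimal illustration on simulated data, not the study's code: the knot placement (16 and 20) follows the text, but the slope values, sample size, and noise level below are invented, and for simplicity the segment slopes are recovered by ordinary least squares rather than by a full random coefficient model.

```python
import numpy as np

def spline_basis(age, knots=(16.0, 20.0)):
    """Linear-spline design matrix: intercept, age, and one (age - knot)+
    column per knot, whose coefficient is the *change* in slope at that knot."""
    age = np.asarray(age, dtype=float)
    cols = [np.ones_like(age), age]
    cols += [np.clip(age - k, 0.0, None) for k in knots]
    return np.column_stack(cols)

# Simulate a female-like pattern: rise to age 16, steeper decline
# between 16 and 20, then a slower decline (illustrative values).
rng = np.random.default_rng(1)
age = rng.uniform(14, 33, size=2000)
y = (1.0 + 0.05 * age                        # slope before age 16
     - 0.07 * np.clip(age - 16, 0, None)    # slope change at 16
     + 0.015 * np.clip(age - 20, 0, None)   # slope change at 20
     + rng.normal(0, 0.1, size=age.size))

beta, *_ = np.linalg.lstsq(spline_basis(age), y, rcond=None)
# Per-segment slopes: cumulative sums of the slope-change coefficients.
segment_slopes = (beta[1], beta[1] + beta[2], beta[1] + beta[2] + beta[3])
print(np.round(segment_slopes, 3))
```

Reporting cumulative per-segment slopes rather than the raw slope-change coefficients matches how Table 3 presents a separate slope for each age range.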
As Table 3 shows, the changes in BITE scores for females were significant at each age knot; the scores increased until age 16 and then followed a declining trend. For males, the changes in BITE scores revealed a different trend: their scores decreased significantly until age 16, rose afterward until age 20, and remained stable thereafter. After adjusting for putative risk factors, the regression coefficients for females at ages up to 16 and from 20 onward became nonsignificant. Likewise, no significant trend was found for males aged 16 to 20 when we controlled for all putative risk factors. However, the declining trends in BITE scores, up to age 16 in males and between ages 16 and 20 in females, although attenuated, remained significant.
Furthermore, Table 3 shows how the putative risk factors were related to symptoms of bulimia for both males and females. BMI was positively associated with bulimic symptoms in both genders, indicating that participants with high BMI experienced significantly more symptoms. In contrast, appearance satisfaction was negatively related to bulimic symptoms; that is, high satisfaction with body appearance was associated with low symptom scores. Although self-worth showed a significant effect in the univariate analysis (β = −0.08, p < .001), its adjusted value was nonsignificant for females. Symptoms of both anxiety and depression showed a significant positive association with the development of bulimic symptoms in males and females; experiencing a high level of these symptoms was significantly related to higher BITE scores. For females, greater frequency of alcohol consumption was significantly related to an increase in symptoms. Cohabitation status showed a protective effect in females, indicating that cohabitating individuals reported significantly lower BITE scores. The larger regression coefficients for predictors in the model for females suggest that the putative risk factors had a stronger impact on changes in BITE scores for females than for males.
This study investigates age-related changes in the level of bulimic symptoms from adolescence to young adulthood in a population-based sample of males and females. The main finding is that males and females show distinct developmental trajectories of bulimic symptoms during adolescence and in the transition to adulthood. For females, mid-adolescence appears to represent a high-risk time for bulimic behaviors; thereafter, their level of symptoms decreases significantly with age. For males, bulimic symptoms decline from age 14 to mid-adolescence and then increase somewhat, leveling off in the early 20s. Females demonstrate higher levels of bulimic symptoms than males at all ages. BMI, appearance satisfaction, and symptoms of anxiety and depression are significant predictors of bulimic symptoms for both genders; alcohol consumption and cohabitation, however, are significant predictors only for females. The inclusion of these variables not only reduces the adolescent increase in females' bulimic symptoms to nonsignificance but also accounts in large part for the decrease in their symptoms in the early 20s and beyond. In contrast, the decrease in bulimic symptoms for males up to age 16 is not statistically explained by the covariates, whereas the subsequent increase in their symptoms becomes nonsignificant when the covariates are included in the analyses.
Consistent with previous research findings,3, 6 we find females to be at greater risk than males for developing bulimia, particularly during adolescence. We associate this higher risk with greater levels of body dissatisfaction, anxiety, and depressive symptoms. The gender difference may also stem from adolescent females' increasing concern about and dissatisfaction with their body appearance and a tendency to overestimate their actual body size even when they have low body weight, factors that can lead to negative affectivity and unhealthy eating practices.3, 7
Specifically, our study shows that the period of greatest risk for females is mid-adolescence. This finding is in accordance with prior studies reporting this age range as the peak risk period for bulimic behaviors8, 10, 35: Stice et al.10 found 17- to 18-year-old females at greatest risk for the onset of bulimia and binge eating, and Calam and Waller35 reported a higher level of bulimic symptoms at 19 years than at 12 years. Unlike the aforementioned studies, however, this study specifically explores the trend of bulimic pathology in females from a developmental perspective, following a large number of participants through important stages of their early development to young adulthood.
On the other hand, males experienced a lower level of bulimic symptoms in mid-adolescence than in early adolescence (age 14). This reduction in symptom level may be the result of less concern with or a general acceptance of their body size; they may see pubertal development as bringing them closer to the masculine ideal, which, in turn, could lead to less body dissatisfaction and psychological distress and, thus, to a reduced risk for bulimia.3, 21 However, such explanations could not be tested with the data at hand. Thus, the mechanism behind the development of bulimic symptoms in adolescent males requires further investigation.
In parallel with findings of Keel et al.,5 the transition into adulthood is marked by a considerable decline of bulimic symptoms in females. This transition is accompanied by reduced levels of anxiety and depressive symptoms, a change from living alone to cohabitating, and increased satisfaction with body appearance despite a rising trend in BMI. Thus, psychosocial processes involved in females' transition to adulthood, including greater acceptance of personal appearance, a decline in emotional problems, and the formation of close and stable relationships, may be related to the reduced risk of bulimic pathology during adulthood for females.
Unlike prior longitudinal research indicating a significant decrease of bulimia in the transition into adulthood for males,5, 11 our data do not support this pattern. Rather, we observed that bulimic symptoms in males returned in the early 20s. Our findings indicate that increased levels of BMI, anxiety, and depressive symptoms may be associated with the rise of bulimic symptoms during this period. Alternatively, males' reported increase in bulimic symptoms in the early 20s may be due mainly to increased scores on BITE items related to binging behaviors. Previous studies have suggested that binge eating behavior may increase at a later age than other forms of eating disorders36 and affects males as much as females.3 However, this phenomenon needs to be investigated in future studies.
To our knowledge, this study is the first to show that age trends in the development of bulimic symptoms are related to changes in putative risk factors such as BMI, appearance satisfaction, and anxiety and depressive symptoms in both males and females. For females, cohabitation and alcohol consumption are also predictors of the changes in the level of bulimic symptoms from adolescence to adulthood. Of particular interest, the protective effect of living with a partner (cohabitating) suggests that taking on adult roles may influence the development of bulimic symptoms in females. It also points to a mechanism specific to the reduction of bulimic symptoms in adult women. This finding confirms and extends findings of previous longitudinal studies, providing support for a mechanism linking adult roles to the development of symptoms of bulimia in females.5, 17
A main strength of the study is its longitudinal nature, which allowed us to examine changes in bulimic symptoms and identify predictors of those changes in a cohort of males and females followed prospectively from adolescence to young adulthood, the period of greatest risk for the emergence of and growth in bulimic pathology. The study provides information about developmental trends and putative risk factors that may have important implications for understanding the preventive management of bulimic disorders.
The study also has some limitations. First, even though the study design is longitudinal, we could not delineate the temporal order between bulimic symptoms and the putative risk factors. The research therefore provides no information about the causal relationship between bulimic symptoms and any of the putative risk factors. In other words, age trends in putative risk factors might be the effect of trends in bulimic symptoms, not the cause of them. It has been noted in the scientific literature that few well-documented risk factors of eating symptomatology—including bulimic behavior—have been established to date.6, 37 So although this study is an important step toward explaining age trends in the development of bulimic symptoms, current knowledge of associated risk factors does not permit us to draw strong conclusions about causality. Second, our study measures bulimic symptoms by survey measures only, thus providing no information about diagnoses of bulimic disorder or clinical levels of impairment. Moreover, the BITE scale used in the study assesses not only core symptoms of bulimia but also cognitive and behavioral aspects related to it. The scale is thus particularly well suited to provide information about dysfunctional eating behaviors and cognitions at a subclinical level of bulimia. Even though it has been argued that full-syndrome bulimia may not differ qualitatively from subthreshold levels of bulimic symptoms,6, 38 it remains to be examined whether the age trends found in this study will also be found in studies using diagnostic interviews. Third, it is possible that factors not included in the study may be associated with changes in bulimic symptoms. Specifically, the developmental trends of bulimic symptoms up to age 16 in males and between ages 16 and 20 in females remained significant even after adjusting for putative risk factors, suggesting that these trends could be a result of uncontrolled covariates.
Potentially important risk factors for eating pathology that were not included in the study include factors related to family relationships, adverse life events, perfectionism, drive for thinness, and thin-ideal internalization.3, 6, 37 Fourth, the potency of putative risk factors may vary with age; age-by-risk-factor interactions should therefore be examined in future studies. Finally, as we followed an age-heterogeneous cohort of individuals at different time points, the developmental trends in bulimic symptoms may be confounded by time-of-measurement effects. The advantage of the accelerated cohort design, however, is that we were able to model changes in the level of bulimic symptoms over a broad range of ages using three waves of data.
Overall, the findings suggest that mid-adolescence for females and early 20s for males represent high-risk periods for developing bulimic symptoms. The risk for developing symptoms from adolescence to young adulthood is related to changes related to BMI, appearance satisfaction, alcohol consumption, symptoms of anxiety and depression, and cohabitation status. Prevention interventions should therefore focus on the mid-teenage period for females and the early 20s for males, and also on the aforementioned putative risk factors. More studies are needed to advance our understanding about causal relationships and developmental trends of both bulimia and other types of eating problems.