
Keywords:

  • trophic structure;
  • source partitioning;
  • convex polygon;
  • mixing region

Summary

  1. Stable isotope analysis is often used to identify the relative contributions of various food resources to a consumer's diet. Some Bayesian isotopic mixing models now incorporate uncertainty in the isotopic signatures of consumers, sources and trophic enrichment factors (e.g. SIAR, MixSIR). This has made model outputs more comprehensive, but at the expense of simple model evaluation, and there is no quantitative method for determining whether a proposed mixing model is likely to explain the isotopic signatures of all consumers, before the model is run.
  2. Earlier linear mixing models (e.g. IsoSource) are easier to evaluate, such that if a consumer's isotopic signature is outside the mixing polygon bounding the proposed dietary sources, then mass balance cannot be established and there is no logical solution. This can be used to identify consumers for exclusion or to reject a model outright. This point-in-polygon assumption is not inherent in the Bayesian mixing models, because the source data are distributions, not average values, and these models will quantify source contributions even when the solution is very unlikely.
  3. We use a Monte Carlo simulation of mixing polygons to apply the point-in-polygon assumption to these models. Convex hulls (‘mixing polygons’) are iterated using the distributions of the proposed dietary sources and trophic enrichment factors, and the proportion of polygons that have a solution (i.e. that satisfy point-in-polygon) is calculated. This proportion can be interpreted as the frequentist probability that the proposed mixing model can calculate source contributions to explain a consumer's isotopic signature. The mixing polygon simulation is visualised with a mixing region, which is calculated by testing a grid of values for point-in-polygon.
  4. The simulation method enables users to quantitatively explore assumptions of stable isotope analysis in mixing models incorporating uncertainty, for both two- and three-isotope systems. It provides a quantitative basis for model rejection, for consumer exclusion (those outside the 95% mixing region) and for the correction of trophic enrichment factors. The simulation is demonstrated using a two-isotope study (15N, 13C) of an Australian freshwater food web.

Introduction


Stable isotope analysis (SIA) is a popular tool for analysing food webs, and mixing models are used to quantify the links between consumers and their dietary sources based on their stable isotopic signatures (Fry 2006; Layman et al. 2012). The numerous assumptions and methods of SIA necessitate an evaluation of a proposed mixing model, before that model is run. It is assumed, for example, that every source in a mixing model contributes to the consumer's diet and that the model adequately explains the isotopic signature of every consumer. It is further assumed that an analysis has correctly estimated or applied: trophic enrichment factors, isotopic turnover rates, variance in source signatures and the aggregation of dietary sources to constrain model outputs (Cabana & Rasmussen 1996; Phillips, Newsome & Gregg 2005; Boecklen et al. 2011; Layman et al. 2012). Violations of these assumptions (such as a missing dietary source) have often been assessed using the ‘point-in-polygon’ approach (e.g. Benstead et al. 2006); that is, for mass balance to be established in a linear mixing model, a consumer's isotopic signature must be within a polygon bounding the signatures of the sources (Phillips & Gregg 2003). If a consumer's signature is outside this polygon, then no solution exists for that consumer, and one or more assumptions of mixing models have been violated.

A recent development is the family of Bayesian mixing models that formally incorporate the uncertainty of trophic enrichment factors (the consistent difference in an isotopic ratio between a predator and its prey) and of isotopic signatures (e.g. SIAR, Parnell et al. 2010; MixSIR, Moore & Semmens 2008). Modelling this uncertainty has created more powerful and realistic models (Layman et al. 2012), but it has also made model evaluation more difficult. This is because the source data are distributions, not average values, and there is no longer a discrete mixing polygon to assess for point-in-polygon (as exists in IsoSource, Phillips & Gregg 2003). The Bayesian mixing models will calculate source contributions even when a model is very unlikely to satisfy point-in-polygon for every consumer (Parnell et al. 2010). Points can be visually inspected in reference to confidence intervals or to the area enclosed by lines joining confidence ellipses (Hopkins & Ferguson 2012), but this assessment is largely qualitative, is not practical in three dimensions and may not accurately represent the mixing space predicted by frequentist sampling methods. A quantitative tool for point-in-polygon is needed to allow an a priori evaluation of these mixing models, by indicating when the data are unlikely to create the mixing geometry needed for a logical model.

Our method for the evaluation of these models is to generate a large number of possible mixing polygons with a Monte Carlo simulation, using the same uncertainty incorporated in the Bayesian mixing models and testing these polygons for point-in-polygon (i.e. the ability to establish mass balance). The proportion of iterated polygons that satisfy the point-in-polygon assumption is calculated for each consumer and is interpreted as the frequentist probability that a consumer's isotopic signature can be explained by the proposed model. This probability provides a quantitative basis for consumer exclusion (any consumer outside the 95% mixing region, for example), for the correction of trophic enrichment factors (to ensure all consumers are within the 95% mixing region) or for outright model rejection. The mixing polygon simulation is demonstrated using an SIA of an Australian freshwater food web.

Mixing polygon simulation


Necessary inputs for the simulation are the isotopic signatures of consumers and the mean and standard deviation of dietary source isotopic signatures and trophic enrichment factors. For each iteration of a Monte Carlo simulation, isotopic signatures for sources are generated by sampling a univariate normal distribution for each isotope. Each signature is then corrected using trophic enrichment factors sampled from a univariate normal distribution for each isotope (the source is corrected for enrichment, rather than the consumer, to allow for source-specific trophic enrichment factors). A convex hull (or ‘mixing polygon’) is calculated for each iteration of source data, and a point-in-polygon algorithm is used to determine whether consumers are within, or on the edge of, the mixing polygon (the point-in-polygon approach). Iterations continue until the variance of the mixing polygon's area stabilises (1500 iterations are usually sufficient). The proportion of iterations for which each consumer is inside the mixing polygon is calculated. This is interpreted as the frequentist probability that the current mixing model can explain an individual consumer's isotopic signature.
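The R sketch below is a minimal, illustrative implementation of the procedure just described for a two-isotope system; it is not the authors' released script (see Data S1), and the helper names in_hull() and simulate_polygons() are hypothetical. Sources and trophic enrichment factors are supplied as source-by-isotope matrices of means and standard deviations.

```r
in_hull <- function(px, py, hx, hy) {
  # Ray-casting point-in-polygon test; hull vertices (hx, hy) must be in order
  # around the polygon, e.g. as returned by chull(). Points exactly on an edge
  # may fall either way in this simplified version.
  n <- length(hx)
  inside <- FALSE
  j <- n
  for (i in seq_len(n)) {
    if (((hy[i] > py) != (hy[j] > py)) &&
        (px < (hx[j] - hx[i]) * (py - hy[i]) / (hy[j] - hy[i]) + hx[i])) {
      inside <- !inside
    }
    j <- i
  }
  inside
}

simulate_polygons <- function(consumers, src_mean, src_sd, tef_mean, tef_sd,
                              n_iter = 1500) {
  # consumers: n x 2 matrix of signatures (column 1 = d13C, column 2 = d15N)
  # src_mean, src_sd, tef_mean, tef_sd: source x 2 matrices of means and SDs
  n_src <- nrow(src_mean)
  inside_count <- numeric(nrow(consumers))
  for (k in seq_len(n_iter)) {
    # Sample each source signature and trophic enrichment factor, isotope by
    # isotope, from univariate normal distributions, then correct the sources
    src <- matrix(rnorm(n_src * 2, mean = src_mean, sd = src_sd), ncol = 2)
    tef <- matrix(rnorm(n_src * 2, mean = tef_mean, sd = tef_sd), ncol = 2)
    corrected <- src + tef
    # Convex hull of the corrected sources = this iteration's mixing polygon
    hull <- chull(corrected)
    hx <- corrected[hull, 1]
    hy <- corrected[hull, 2]
    ok <- mapply(in_hull, consumers[, 1], consumers[, 2],
                 MoreArgs = list(hx = hx, hy = hy))
    inside_count <- inside_count + ok
  }
  inside_count / n_iter  # per-consumer proportion satisfying point-in-polygon
}
```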

The mixing polygon simulation is possible for two- and three-isotope systems. The approach is identical for these systems, although the functions needed for calculating convex hulls and point-in-polygon (point-in-polyhedron in 3 dimensions) differ, and the number of model iterations in the three-isotope model is evaluated using mixing polyhedron volume as opposed to polygon area. For two-isotope systems, a mixing region is visualised by testing a grid of values for point-in-polygon, at a resolution determined by the user (currently 250 × 250 values). Contours are used to illustrate probabilities within the mixing region at 5% (the outermost contour) and every 10% interval. The 95% mixing region is the area within the 5% contour. This visualisation of the mixing region is too computationally expensive for three-isotope systems. The mixing polygon simulation has been written for both R (www.r-project.org) and MATLAB® (Mathworks, Natick, MA, USA) environments, with separate scripts for two- and three-isotope systems. These scripts are available as supplementary material and at http://www.famer.unsw.edu.au/downloads.html.
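Under the same assumptions, a naive (and slow) sketch of the two-isotope mixing-region visualisation might rasterise the biplot into a grid, reuse the hypothetical simulate_polygons() above to score every grid point, and draw the 5% and 10%-interval contours; the authors' released scripts remain the definitive implementation.

```r
mixing_region <- function(src_mean, src_sd, tef_mean, tef_sd,
                          xlim, ylim, res = 250, n_iter = 1500) {
  gx   <- seq(xlim[1], xlim[2], length.out = res)
  gy   <- seq(ylim[1], ylim[2], length.out = res)
  grid <- as.matrix(expand.grid(x = gx, y = gy))  # every grid point is tested for point-in-polygon
  prob <- simulate_polygons(grid, src_mean, src_sd, tef_mean, tef_sd, n_iter)
  z    <- matrix(prob, nrow = res)                # probability surface over the isotope biplot
  contour(gx, gy, z, levels = c(0.05, seq(0.1, 0.9, by = 0.1)),
          xlab = expression(delta^13 * C), ylab = expression(delta^15 * N))
  invisible(z)
}
```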

Case study sampling and analysis


The mixing polygon simulation is demonstrated with a two-isotope case study from an Australian freshwater food web. Two source–consumer mixing models are presented: one with Australian bass (Macquaria novemaculeata) as the consumer and one with aquatic insect larvae as the consumer. Australian bass, insect larvae and potential sources of energy for both consumers were collected from three sites and subjected to analysis of the isotopes 13C and 15N (Table 1). The sites were impoundments routinely stocked with Australian bass: Danjera Dam (34·920°S, 150·385°E); Brogo Dam (36·491°S, 149·740°E); and Flat Rock Dam (34·888°S, 150·575°E); all located in New South Wales, Australia. A gut content analysis for Australian bass (Smith et al. 2011) was used to identify dietary sources, and sources for insect larvae consumers were identified from the literature. Sources were aggregated into coarse groups before isotopic analysis (Table 1). Muscle tissue was analysed for Australian bass and prey fish, and other groups were analysed whole. Samples were dried at 65°C for 72 h and ground to a fine powder for isotopic analysis.

Table 1. The consumers and dietary sources used in the case study. Two consumer–source relationships were used for the simulations: one with an aggregated aquatic insects group as consumer and five dietary sources (A) and one with Australian bass (Macquaria novemaculeata) as the consumer and three dietary sources (B). Most organisms consisted of 1–3 taxa and were aggregated into groups before isotopic analysis, with the exception of Chaoboridae, which have a largely pelagic lifestyle, and were aggregated with the other ‘zooplankton’ after analysis due to their similar isotopic signatures. Australian bass were analysed as individuals. All aquatic insects were analysed as larvae
Organism                          Group                 Tissue
(A)
Aquatic insects                   Consumer              Whole
Daphnia sp.                       1. Zooplankton        Whole
Copepoda                          1. Zooplankton        Whole
Biofilm–Chlorophyta matrix        2. Periphyton         Whole
Macrophyte species 1              3. Macrophyte         Whole
Macrophyte species 2              3. Macrophyte         Whole
Macrophyte species 3              3. Macrophyte         Whole
1- to 200-μm pelagic particles    4. POM                Whole
Benthic leaf litter               5. Detritus           Whole
(B)
Macquaria novemaculeata           Consumer              Caudal muscle
Retropinna semoni                 1. Prey fish          Caudal muscle
Gambusia holbrooki                1. Prey fish          Caudal muscle
Daphnia sp.                       2. Zooplankton        Whole
Copepoda                          2. Zooplankton        Whole
Atyidae                           2. Zooplankton        Whole
Chaoboridae                       2. Zooplankton        Whole
Odonata                           3. Aquatic insects    Whole
Ephemeroptera                     3. Aquatic insects    Whole
Dytiscidae                        3. Aquatic insects    Whole
Corixidae                         3. Aquatic insects    Whole
Chironomidae                      3. Aquatic insects    Whole
Culicidae                         3. Aquatic insects    Whole

Isotopic analysis was done at the Australian Nuclear Science and Technology Organisation (ANSTO) in Sydney, Australia. Powdered and homogenised animal and plant tissue samples were loaded into tin capsules and were analysed with a continuous flow isotope ratio mass spectrometer (CF-IRMS), model Delta V Plus (Thermo Scientific Corporation, Waltham, MA, USA), interfaced with an elemental analyser (Thermo Fisher Flash 2000 HT EA, Thermo Electron Corporation, USA). Stable isotope values were reported in delta (δ) units in parts per thousand (‰) relative to the international standard and determined as follows:

\[ \delta X = \left( \frac{R_{\mathrm{sample}}}{R_{\mathrm{standard}}} - 1 \right) \times 10^{3} \]

where X = δ13C or δ15N and R = 13C/12C or 15N/14N, respectively. The average ± SD trophic enrichment factors used in the case study were 3·4 ± 0·6 for δ15N and 1·0 ± 0·3 for δ13C and were the same for every dietary source.
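As a quick, purely illustrative check of the delta notation, the ratios below are made-up values rather than certified standard ratios:

```r
## Illustrative delta calculation; the ratios are hypothetical, not the
## certified values of any international standard.
R_standard <- 0.0112                         # assumed 13C/12C standard ratio
R_sample   <- 0.0109                         # assumed sample ratio
d13C <- (R_sample / R_standard - 1) * 1000   # approximately -26.8 per mil
```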

Case study results


The mixing polygon simulation approach is demonstrated using data for an aggregated aquatic insect consumer and five dietary sources (Table 1A), sampled as described above. Five aggregated consumer replicates, and three replicates for each source, were analysed for δ13C and δ15N (Fig. 1a). For each consumer, the simulation calculates the proportion of iterations for which it was within the mixing polygon (Table 2). This can be interpreted as the probability that the consumer's isotopic signature can be explained by the proposed mixing model (i.e. the specified source averages and standard deviations). In this example, all consumers are within more than 90% of possible mixing polygons, as shown by the mixing region of possible polygons (Fig. 1b). Inspection of the mixing region indicates that all consumers lie within the 5% contour (the outermost contour; Fig. 1b), so the proposed mixing model can be used.

Table 2. The proportion of iterations for which the consumer was within a mixing polygon. This can be interpreted as the probability that a consumer can be explained by the proposed mixing model. If an individual is within fewer than 5% of polygons, we consider the model unsuitable for that consumer (values marked with an asterisk)

Consumer    Danjera Dam: aquatic insect          Brogo Dam: Australian bass           Flat Rock Dam: Australian bass
            consumer (Fig. 1a)                   consumer (Fig. 2a)                   consumer (Fig. 4a)
1           1·00                                 0·183                                0·069
2           0·944                                0·076                                0·009*
3           0·961                                0·667                                0·203
4           0·994                                0·215                                0·031*
5           0·985                                0*                                   0·158
6                                                0·959                                0·007*
7                                                0·083                                0·058
8                                                0*                                   0·012*
9                                                0·366                                0·043*
10                                               0·611                                0·044*

Figure 1. (a) A biplot of stable isotopic signatures for consumers (aquatic insect larvae) and sources (zooplankton, periphyton, macrophytes, detritus and particulate organic matter, POM) from Danjera Dam. This biplot was created using the Bayesian mixing model SIAR (Parnell et al. 2010). Error bars represent 95% confidence intervals and incorporate the error in the source isotopic signatures and in trophic enrichment factors (δ15N 3·4 ± 0·6 SD; δ13C 1·0 ± 0·3 SD). (b) The simulated mixing region for the biplot in (a). The positions of the aquatic insect consumers (black dots) and the average source signatures (white crosses) are shown. Probability contours are at the 5% level (outermost contour) and at every 10% level.

Errors in a stable isotope analysis, such as in identifying, sampling or aggregating dietary sources, can result in unlikely mixing models where consumers lie outside the average mixing polygon. To demonstrate how such a problem can be identified and addressed using the simulated mixing polygon approach, data with an Australian bass consumer and three dietary sources (Table 1B) from Brogo Dam are presented. In this example, each dietary source represents an aggregate of the isotopic signatures of numerous individuals and taxa, greatly simplifying the food web. Prey aggregation is especially problematic in this example, because the consumers have a ‘population generalist, individual specialist’ niche (Smith et al. 2011), which results in large variation in isotopic signatures. Applying the mixing polygon simulation approach gives an indication of the suitability of the proposed mixing model (Fig. 2a), with two of the ten consumers occurring outside all possible mixing polygons (Table 2, Fig. 2b). These individuals must be excluded if a logical mixing model is to be calculated. The implications of this are observed in Bayesian mixing models generated using SIAR (Parnell et al. 2010), which show a shift in the importance of the aquatic insects and zooplankton groups between the full data set (Fig. 3a) and with the two consumers excluded (Fig. 3b).

Figure 2. (a) A biplot of stable isotopic signatures for consumers (Australian bass) and sources (prey fish, aquatic insects and zooplankton) from Brogo Dam. This biplot was created using the Bayesian mixing model SIAR. Error bars represent 95% confidence intervals and incorporate the error in the source isotopic signatures and in trophic enrichment factors (δ15N 3·4 ± 0·6 SD; δ13C 1·0 ± 0·3 SD). Some consumers appear to be outside the boundaries of possible source values, and the mixing polygon simulation can quantitatively determine if this is the case. (b) The simulated mixing region for the biplot shown in (a). The positions of the Australian bass consumers (black dots) and the average source signatures (white crosses) are shown. Two consumers lie outside the 95% mixing region (the outermost contour), which means an alternative model is needed to explain their isotopic signatures.

Figure 3. Probability histograms for the source contributions for the proposed mixing model for the Australian bass consumers in Fig. 2 (prey fish, black; aquatic insects, red; zooplankton, green). The Bayesian mixing model SIAR was used with all ten consumers (a) and again without the two consumers identified for exclusion using the mixing polygon simulation (b). By removing these two consumers, we see a shift in the importance of zooplankton and aquatic insects in the diet of Australian bass.

Whilst the mixing polygon simulation can improve some mixing models via consumer exclusion, the simulation approach can also be used as a basis for outright rejection of a proposed mixing model. Model rejection is prudent when too many consumers violate the point-in-polygon assumption. This is demonstrated using data from Flat Rock Dam for the Australian bass mixing model (Table 1B). There is again considerable variation in the consumers’ isotopic signatures, and, because of the large uncertainty in the source values, the biplot does not give a clear indication of whether these points lie outside the possible mixing space (Fig. 4a). The mixing polygon simulation reveals that six of the ten consumers occur in fewer than 5% of possible polygons (Table 2), which implies that the proposed mixing model should not be fitted to these data. The threshold for model rejection is one of user preference, but it would seem unwise to use a mixing model in which more consumers than would be expected as statistical outliers violate the point-in-polygon assumption. Owing to the substantial uncertainty in this model, there is much less overlap between the simulated polygons than in the previous examples. This results in a diffuse mixing region and a low probability that any consumer can be explained by the proposed mixing model (Fig. 4b).

Figure 4. (a) A biplot of stable isotopic signatures for consumers (Australian bass) and sources (prey fish, aquatic insects and zooplankton), for Flat Rock Dam. This biplot was created using the Bayesian mixing model SIAR. Error bars represent 95% confidence intervals and incorporate the error in the source isotopic signatures and in trophic enrichment factors (δ15N 3·4 ± 0·6 SD; δ13C 1·0 ± 0·3 SD). (b) The simulated mixing region for the biplot shown in (a). The positions of the Australian bass consumers (black dots) and the average source signatures (white crosses) are shown. Six consumers lie outside the 95% mixing region (the outermost contour), which indicates that an alternative model is needed to explain the diet of Australian bass on this occasion.

Discussion

The advent of complex Bayesian mixing models (SIAR, MixSIR) that formally incorporate uncertainty has meant a more comprehensive analysis of isotopic data, but the models can give compelling results even when a solution is unlikely. The simulation of mixing polygons determines whether a result is possible (i.e. likely) before the result is calculated. The simulation makes ‘point-in-polygon’ a quantitative tool for Bayesian mixing models, as it is for previous mixing models such as IsoSource (Phillips & Gregg 2003; Benstead et al. 2006). Point-in-polygon is an appropriate assumption for these models because of their reliance on the mass balance procedure, and the geometry of the mixing space remains essential to their interpretation (Moore & Semmens 2008; Parnell et al. 2010; Tarroux et al. 2010; Galván, Sweeting & Polunin 2012).

Applying the simulation method

The mixing polygon simulation evaluates an isotopic mixing model and either validates the model or identifies consumers that are unlikely to be explained by the model. Model validation is straightforward and accomplished when all consumers are within the 95% mixing region. If consumers are outside this region, then there are three options: consumer exclusion; parameter correction; and model rejection.

Consumer exclusion is based on the frequentist probability that a given consumer can be explained by the source data. Consumers with a very low probability (e.g. <5%) must be removed if the mixing model is to proceed. If these consumers are deemed outliers, that is, arose by chance according to a predetermined statistical distribution, then the mixing model can be considered complete and solved once the outliers have been excluded. If there are more consumers outside the mixing region than predicted by chance, the model must be considered incomplete. It is at the discretion of the user to decide whether useful data can be derived from an ecologically incomplete (but mathematically logical) mixing model. It is essential, in any case, that excluded consumers are reported along with the mixing model output. The literature provides numerous examples of studies using Bayesian mixing models when consumer exclusion may have been necessary, in both terrestrial (Moreno et al. 2010; Giroux et al. 2012) and marine systems (Cardona et al. 2012; Witteveen et al. 2012). These studies show consumers outside or near the edge of likely mixing polygons and would benefit from model evaluation. In the case of Giroux et al. (2012), two arctic fox individuals represented the entire 2003 year class, but were enriched in δ15N and probably outside the 95% mixing region. The diet composition results calculated using SIAR for this year class may be spurious, given the likelihood of an error (such as a missing dietary source) in the mixing model.
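A minimal sketch of this exclusion step, assuming the hypothetical simulate_polygons() from the earlier sketch returns the per-consumer probabilities, might look as follows; the excluded individuals should then be reported alongside the mixing model output:

```r
## Identify and exclude consumers outside the 95% mixing region before fitting
## the Bayesian mixing model (simulate_polygons() is the hypothetical helper
## sketched earlier, not the authors' released script).
p <- simulate_polygons(consumers, src_mean, src_sd, tef_mean, tef_sd)
excluded       <- which(p < 0.05)                       # consumers outside the 95% mixing region
consumers_kept <- consumers[p >= 0.05, , drop = FALSE]  # passed on to SIAR/MixSIR
```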

Parameter correction could be used to move some consumers within the 95% mixing region. The likely parameters for correction are the trophic enrichment factors, although the ‘optimisation’ must be done by trial and error. This should be done with caution, as trophic enrichment factors can have profound effects on a model's output (Moore & Semmens 2008; Bond & Diamond 2011), and the procedure should be reported along with the results. It should also be done only if system-specific trophic enrichment factors are not available, and the values remain within the limits of established meta-analyses (e.g. Vander Zanden & Rasmussen 2001; Vanderklift & Ponsard 2003). It is counterproductive to increase the parameter uncertainty, because any improvement in a simulation due to increased uncertainty would be counteracted by the same uncertainty being propagated in the mixing model's output.
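A hedged sketch of such a trial-and-error correction, again assuming the hypothetical simulate_polygons() helper, could loop over candidate δ15N enrichment values (illustrative numbers only) and count how many consumers remain outside the 95% mixing region:

```r
## Trial-and-error TEF correction sketch; candidate d15N values are illustrative
## and should stay within the ranges reported by published meta-analyses.
for (tef_n15 in c(2.8, 3.1, 3.4)) {
  tef_mean_try <- cbind(rep(1.0, nrow(src_mean)),      # column 1: d13C TEF held at 1.0
                        rep(tef_n15, nrow(src_mean)))  # column 2: candidate d15N TEF
  p <- simulate_polygons(consumers, src_mean, src_sd, tef_mean_try, tef_sd)
  cat("d15N TEF =", tef_n15, "-> consumers outside the 95% mixing region:",
      sum(p < 0.05), "\n")
}
```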

In some cases, model rejection may be the only reasonable course (e.g. Fig. 4). The simulation may be indicating a procedural (such as prey aggregation) or numerical issue (such as poorly selected trophic enrichment factors) that can be corrected and a successful mixing model found. It may be the case, however, that the source isotopic signatures are unlikely to explain the signatures of the consumers, and a quantitative basis for rejection is required. A likely problem is one or more missing sources, which can be difficult to remedy. It may be that some food webs are poorly suited to a stable isotopic analysis, such as some generalist consumers (Galván, Sweeting & Polunin 2012) and those with large amounts of interindividual variation (such as the Australian bass, Smith et al. 2011). The mixing polygon simulation can help to determine whether a food web or design is appropriate for a mixing model.

Simulation considerations and caveats

The mixing polygon simulation reveals some intriguing aspects of the geometry of mixing models. If source error bars overlap (Fig. 4a), a diffuse mixing region can result (Fig. 4b). Because of the large variation between iterated mixing polygons, the mixing region has a low probability of explaining any consumer. This result highlights the impact that source uncertainty can have on mixing geometry; that is, rather than increasing the logical mixing space, increased uncertainty can create geometries inappropriate for mixing models. A second intriguing aspect arises when the mixing polygon includes isolated vertices (sources distant from the polygon's centroid), as this reduces the probability near that vertex. This is again observed in Fig. 4b, in which the zooplankton source is surrounded by unlikely mixing space due to reduced overlap of iterated mixing polygons. A consumer with an isotopic signature similar to that of zooplankton would be within fewer than 5% of mixing polygons and would need to be excluded from the mixing model. This means that a source's average isotopic signature is not a good indicator of the 50% probability contour, and the more isolated the vertex, the lower the probability of establishing mass balance in that part of the mixing region. The impact of overlapping error bars or isolated vertices on mixing geometry is not revealed by visual methods, such as joining ellipses (e.g. Hopkins & Ferguson 2012). This highlights the importance of incorporating a simulation approach in the evaluation of uncertainty in isotopic mixing models.

The simulation method has caveats that should be understood. The simulation is only one evaluation method and should be used only to support the validity of dietary composition data, rather than to interpret them. The simulation does not guarantee that a mixing model is correct – only that a mathematical solution that satisfies the geometry of mixing models can be found. Another caveat is that the simulated mixing polygons exist as convex hulls. A convex hull is the smallest possible polygon that encompasses all points, with no concave vertices. This means that if a particular dietary source is within a polygon of other sources, it need not contribute to the mixing model (Phillips & Gregg 2003) and will usually not be important in the mixing polygon simulation. If prior information is used in a Bayesian mixing model, and great importance is placed upon one or more of these ‘interior’ sources, then the mixing space generated in the simulation may not reflect the mixing space weighted in the mathematical model. Caution should be used on these occasions, because the simulation may not accurately evaluate the suitability of the mixing model.
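A tiny, made-up illustration of the convex-hull point: base R's chull() drops a source that lies inside the polygon formed by the other sources.

```r
## Four hypothetical source signatures (d13C, d15N); the fourth lies inside the
## triangle formed by the first three and so is not a hull vertex.
src <- rbind(c(-30, 2),   # source 1
             c(-24, 2),   # source 2
             c(-27, 8),   # source 3
             c(-27, 4))   # source 4: interior to the triangle of sources 1-3
chull(src)                # returns only the indices of sources 1-3 (in hull order)
```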

The mixing polygon simulation assumes constant elemental concentration between consumers and independence between isotopes. Thus, caution should be applied if concentration dependence (Phillips & Koch 2002) or isotopic correlation (Hopkins & Ferguson 2012) is included in the mixing model being evaluated. Isotopic correlation could be accounted for in the simulation by sampling isotopic values from a single multivariate distribution with a covariance structure for each source, rather than separate univariate distributions. Univariate distributions are likely to overestimate the mixing region, compared to a multivariate distribution of strongly correlated isotopes.
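A short sketch of how correlated isotopes could be sampled for a single source, assuming the MASS package is available; the mean vector, standard deviations and correlation below are illustrative values, not case-study data:

```r
library(MASS)

## Multivariate normal draw for one source, replacing the two separate rnorm()
## draws used in the univariate sketch above. Values are illustrative only.
mu    <- c(-28.0, 4.5)                       # assumed mean d13C and d15N for one source
sd_c  <- 0.6; sd_n <- 0.4; rho <- 0.7        # assumed SDs and isotopic correlation
Sigma <- matrix(c(sd_c^2,           rho * sd_c * sd_n,
                  rho * sd_c * sd_n, sd_n^2), nrow = 2)
src_sample <- mvrnorm(1, mu = mu, Sigma = Sigma)  # one correlated (d13C, d15N) draw
```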

The mixing polygon simulation is a frequentist method of calculating probability. It assumes that there is one true mixing polygon and that our simulation encompasses the range in which it lies. A Bayesian approach assumes the mixing polygons come from a probability function, and all mixing polygons can be considered possible, though with different degrees of probability. This is a fundamental difference in the interpretation of probability by the simulation and the Bayesian mixing models and could extend to the interpretation of the mixing geometry. These two interpretations are compatible if the Monte Carlo simulation evaluates only whether a mixing model has a solution (for each consumer, with 95% probability), and the subsequent Bayesian models find that solution.

Concluding remarks

Many isotopic mixing models contain consumers near the boundary of the mixing space, and these require evaluation to ensure the proposed model is logical. A quantitative model evaluation is prudent even when the numerous assumptions of a stable isotopic analysis have been carefully addressed. Assessing the geometry of the mixing space is a common way of doing so, and the mixing polygon simulation enables the point-in-polygon method to quantitatively evaluate two- or three-isotope mixing models that incorporate uncertainty in trophic enrichment factors and isotopic signatures. The simulation method is a valuable addition to isotopic analysis, as it identifies whether a proposed mixing model can establish mass balance for all consumers considered (with 95% confidence) and whether the trophic enrichment factors and source uncertainty create a suitable mixing geometry. It complements existing tools of model validation (such as examining model residual errors; Parnell et al. 2010), which assess how well a model fits, by evaluating whether a model should be fitted at all.

Acknowledgements

Thanks to Rene Diocares and Linda Barry for assistance with the isotopic analysis, to Jason Everett who helped refine the script and to Alan Smith for assistance in collecting samples. We acknowledge funding from the Australian Institute of Nuclear Science and Engineering (AINSE) scheme (award no. ALNGRA10040).

References

Supporting Information

Filename                          Format                 Size    Description
mee312048-sup-0001-DataS1.r       text/r                 5K      Data S1. R script - 2 isotopes.
mee312048-sup-0002-DataS2.r       text/r                 3K      Data S2. R script - 3 isotopes.
mee312048-sup-0003-DataS3.m       plain text document    4K      Data S3. Matlab script - 2 isotopes.
mee312048-sup-0004-DataS4.m       plain text document    3K      Data S4. Matlab script - 3 isotopes.
mee312048-sup-0005-DataS5.csv     CSV document           0K      Data S5. Example mixture data - 2 isotopes.
mee312048-sup-0006-DataS6.csv     CSV document           0K      Data S6. Example mixture data - 3 isotopes.
mee312048-sup-0007-DataS7.csv     CSV document           0K      Data S7. Example source data - 2 isotopes.
mee312048-sup-0008-DataS8.csv     CSV document           0K      Data S8. Example source data - 3 isotopes.
mee312048-sup-0009-DataS9.csv     CSV document           0K      Data S9. Example TEF data - 2 isotopes.
mee312048-sup-0010-DataS10.csv    CSV document           0K      Data S10. Example TEF data - 3 isotopes.
