If we want science to be credible and useful for citizens, the effectiveness of our actions for the conservation of biodiversity needs to be evaluated. As Possingham (2012) points out, there is a critical lack of such empirical evaluations, and one of the reasons for this is that the work involved is essentially computer- and desk-based and much less ‘sexy’ than fieldwork. As a consequence, we fail to attract young researchers to do the job, and even when we manage to do so, it is difficult to keep them in this area of research.

We warmly welcome analytic activities, as we are analysts ourselves. As Possingham (2012) points out, there are numerous large existing datasets that have been analyzed improperly or never analyzed at all. Tremendous methodological developments have been made over the past few years (e.g. allowing diverse and sparse sources of information to be combined; Schaub et al., 2007; Ovaskainen & Soininen, 2011), which can help us improve our conservation decisions. We should therefore seriously consider (re)analyzing existing data.

We generally find ourselves more optimistic than Possingham (2012), as ecology and conservation biology have become more quantitative over recent years. There is an increasing number of workshops in quantitative ecology, conferences in mathematical and statistical ecology, working groups [e.g. under the positive influence of the National Center for Ecological Analysis & Synthesis, and its recently born French baby, the Centre de Synthèse et d'Analyse sur la Biodiversité], methodological journals (such as Methods in Ecology and Evolution) and ecological journals willing to publish methodological papers. Importantly, young scientists are becoming more and more involved in these activities. Despite this trend, Possingham (2012) asks ‘why are there not a hundred more analyses (of existing datasets to inform future actions) every year?’ We suspect that this is due to too little dialog between field practitioners and quantitative ecologists. Our efforts should be devoted to filling this gap, and the involvement of field practitioners in scientific projects should be promoted. The social sciences have a role to play in that respect, by helping to improve interdisciplinary practices.

This being said, the question remains: how do we attract young scientists to quantitative ecology? Our point here is to share a practice we have encouraged in our group that might help young scientists enjoy computer and desk work. We actually take the opposite view to Possingham's (2012) challenging ‘proclamation that conservation needs more analysts, not more field data’ (which, he says, ‘invariably elicits a hostile reception among field ecologists’): let's collect data, and try to balance quantitative ecology with fieldwork.

We will not reiterate the reasons why collecting data is so important, as this has been done elsewhere (e.g. Clutton-Brock & Sheldon, 2010; Magurran et al., 2010). Rather, we would like to share the viewpoint of young scientists (sadly excluding the first author) working in quantitative ecology, with training in methodology or biology and with occasional or regular fieldwork practice.

Collecting data and spending time in the field are essential to better understand our study systems. The more we understand the ins and outs of a project and what is at stake, the keener we become to invest time and get involved in analyses. Fieldwork experience is crucial to remaining biologically relevant and building realistic models, as it is the best way to grasp important features of a species' biology. On the other hand, data do not speak for themselves, and it is well known within the scientific community that some are terrified when it comes to analyzing them (e.g. Van Emden, 2008). Being out in the field allows us to ‘feel’ our study animals or plants as well as to ‘see’ the data. We can then assess our assumptions and share our knowledge in a more objective, and therefore convincing, way. Among existing conservation programs, those adopting the adaptive management framework, in which iterative decisions are made in the presence of uncertainty (e.g. Walters, 1986; McCarthy & Possingham, 2007; Runge, 2011), are perfect case studies for young scientists to be involved in the whole process of monitoring, modeling and evaluating. Even if the entire adaptive management process cannot be implemented, the underlying conceptual framework of structured decision making (Gregory et al., 2012) remains motivating: information has a value, and in most cases, the optimal allocation of resources includes both continued monitoring and conservation actions (Nichols & Williams, 2006).

If we want students to become analysts, we need to train them in an adequate and motivating way. We therefore call for a revision of quantitative ecology teaching, through the development of more interdisciplinary programs at the undergraduate and graduate levels that mix modeling, ecology and field practice. In that spirit, a relevant approach has recently been advocated for research programs in which biologists and modelers interact at all stages of a study, from initial model formulation and field study design to data collection and analysis (Restif et al., 2012). Such a framework has the potential to help address the ‘serious disconnections between the quantitative nature of ecology, the quantitative skills we expect of ourselves and our students, and how we teach and learn quantitative methods’ pointed out by Ellison & Dennis (2010). In Switzerland, for example, undergraduate students take a one-semester population dynamics course with a summer practical, during which they are expected to collect and analyze data. Another example is Québec, where students have typically completed two seasons of fieldwork by the time they enter a PhD program. They are therefore highly motivated to use and develop analytical tools to make the best use of the valuable data they have helped collect in the field. Knowing the value of data, they seem keener to spend less time in the field and more behind a computer. This integrated training might prove difficult to set up in some countries (e.g. France) where there is pressure to limit the duration of Master's internships and PhDs, thereby decreasing the opportunities for fieldwork experience in favor of quantitative topics based on existing data. For our proposal to be most effective, all students should be given the opportunity to work in quantitative ecology.

Quantitative conservation biology involves some exciting interconnected aspects of a scientist's job, from designing field protocols and experiments, through collecting and analyzing data and conceiving and building models, to assessing responses to management and conservation actions. To encourage young scientists to embrace this richness, let's send them into the field!


This is a contribution of the ‘boulet’ team.

