We start the issue with a letter from Sarah Weinstein of the Massachusetts Department of Environmental Protection, who comments in response to an “Editor's Choice” essay by Michael Greenberg.(1) Representing the Northeast Waste Management Officials Association, Weinstein shares information about a state government strategy to meet pollution reduction goals in an era of decreasing resources.
Next we present another article in our series of biographical profiles. This time we feature Prof. Daniel Kahneman, winner of the Nobel Prize in Economics, professor of psychology, and author of Thinking, Fast and Slow, the new book reviewed in this issue. We hope you enjoy reading about the highlights of his life and career and his contributions to the field of risk.
We begin our set of research articles with six that deal with prevention. Preventing consequential hazard events is the first directive of risk management. Changing the path and reducing the strength of hurricanes is a long-standing idea based on the assumption that controlling hurricanes would reduce their impact. Kelly Klima, Wandi Bruine de Bruin, Granger Morgan, and Iris Grossman surveyed 157 Florida residents and found that respondents did not perceive the idea to be very effective; for example, they rated it less effective than hardening structures against storms. Many respondents were concerned that a diverted hurricane could cause even more damage, a prospect that made them angry.
Placing human activity in vulnerable locations leads to serious hazard events, including infrastructure destruction and failure. Shital Thekdi and James Lambert built multiple layered decision support models to assess the relative impacts of development choices on risk, cost, and potential mitigative remedies, and they tested their approach in Virginia, which has seen more than its share of these events.
Waterborne illnesses are a major threat to public health across the globe. But how much of a threat are they in a moderate-sized North American city? Funded by the Canadian government, K. Pintar, A. Fazil, and F. Pollari et al. used quantitative microbial risk assessment with local stakeholder input to assess the risk of Cryptosporidium infection via municipally treated drinking water in a city in Ontario, Canada. They found that during routine operations the risk is very low.
Preventing successful terrorist attacks has been a major focus of our journal. Seth Guikema and colleagues present two papers about intelligent adversary modeling of terrorism risks. Dr. Guikema proposes that intelligent adversary models should include as much behavioral predictive accuracy as possible, sufficient computational capacity to solve the problem, explicit consideration of uncertainty, and accessibility to the users who decide whether to adopt decision support models. Supported by the US Department of Homeland Security, Casey Rothschild, Laura McLay, and Seth Guikema present and demonstrate a level-k game theory model that offers a great deal of capacity and flexibility.
Extended blackouts are not uncommon, and their consequences can be mitigated by strategic investments in backup systems. Anu Narayanan and Granger Morgan describe small distributed electricity generation systems that would provide energy during a prolonged blackout. The authors estimate the costs of such systems and suggest that the investment makes sense for some regions.
Our final prevention paper is about willingness to pay for preventing cancer. Stefania Tonin, Anna Alberini, and Margherita Turvani polled people who lived near the Marghera hazardous waste site, located on the mainland side of Venice. The authors sampled at distances out to 10 km, gathered data about income, age, preferences, and personality, and found a central tendency value of 2.6 million euros ($3.56 million in 2007 US$).
The remaining four papers explore standard tools and several novel ones in risk analysis. Graeme Hickey and Peter Craig review statistical methods for estimating interspecies variation in sensitivity to toxic substances as part of risk assessments. The authors focus on the log-normal distribution and conclude that there is confusion in the literature that needs to be addressed to avoid misleading results.
Using the Delphi method, another standard tool, and sponsored by the US Department of Agriculture, Karin Hoelzer, Haley Oliver, and Larry Kohl et al. compared expert opinions about the sources of Listeria monocytogenes contamination in delicatessens. The exercise found notable differences among the 41 experts and important data gaps.
Louis Anthony Cox compares the results produced by what he calls “risk indices” in software packages with those of more complex formulations of risk for allocating resources. Using a relatively simple example, he finds that the simple indices are useful, but the more complex formulations are far more effective at allocating scarce resources.
Our final paper is a fascinating effort by Matthew Wheeler and John Bailer to demonstrate a semiparametric model for estimating benchmark doses. The model fits a variety of circumstances, but the authors note that it is a challenge to solve with standard computer packages, and the interpretation of the results is not as straightforward as it is with standard models.
Finally, Area Editor Warner North provides a useful “mega review” of three books related to the topic of decision analysis.