Ten Most Important Accomplishments in Risk Analysis, 1980–2010


Michael Greenberg, Charles Haas, Anthony Cox Jr., Karen Lowrie, Katherine McComas, and Warner North

Address correspondence to klowrie@rci.rutgers.edu.

As part of the celebration of the 30th anniversary of the Society for Risk Analysis and Risk Analysis, An International Journal, a group of your editors engaged in a process to select the 10 most important accomplishments in risk analysis. The article that follows is the product of this process.

Some preliminary decisions were that we would reach out to the full membership for nominations, focus on the period 1980 to 2010, and accept nominations for contributions to theory, methods, and applications. Also, we focused on accomplishments that address health, safety, and the environment, which has been our tradition.(1) All the accomplishments have contributed to answering at least one of the six following risk analysis questions:(2-5)

  1. What can go wrong?
  2. What are the chances that something with serious consequences will go wrong?
  3. What are the consequences if something does go wrong?
  4. How can consequences be prevented or reduced?
  5. How can recovery be enhanced, if the scenario occurs?
  6. How can key local officials, expert staff, and the public be informed to reduce concern and increase trust and confidence?

Four caveats are in order. First, the list does not imply a ranking of importance. Second, we acknowledge that there were many other meritorious contributions. Our list of 10 is based on what we received from the SRA membership, with some revision in the descriptions of the contributions, based on the judgment of the editors. After listing the 10, we present a synopsis of each, state its contribution and its significance, and note some of the key studies. Third, we acknowledge that we have deliberately and disproportionately cited articles from Risk Analysis to illustrate the contributions. Fourth, these short presentations are not full recitations of the idea, nor do we pretend to provide all the key citations. Indeed, we expect to publish perspectives from SRA members about these and other important contributions during the past 30 years and prospects for the next decade.



  1. Understanding how affect and trust influence risk perception and behavior
  2. Recognizing that personal decisions reflect different processes for valuing and combining anticipated and actual losses, gains, delays, and surprises
  3. Developing an environmental justice ethic and frameworks

  4. Using formal uncertainty analysis in risk assessment
  5. Building the capacity and frameworks to apply multiobjective decision making to complex decisions that trade off risk and other criteria
  6. Modifying existing regional economic impact assessment tools

  7. Estimating likelihoods of events across a broad spectrum of hazard events
  8. Applying intelligent agent models to terrorism
  9. Building an applied field of risk communications
  10. Making risk-informed legal and regulatory decisions


1. Understanding How Affect and Trust Influence Risk Perception and Behavior

Psychologists have contributed to our understanding of risk perceptions and beliefs. The “affect” heuristic—whereby a fast, intuitive, and emotional response anchors initial evaluations of risky prospects, which slower cognitive pathways may then adjust—and trust are at the heart of their efforts. About half of the 20 most cited articles in the journal are the product of these endeavors.

Affect refers to negative or positive feelings and emotions about a hazard or, more generally, about any activity, object, or stimulus. Some assert that we should base our risk perceptions on rational deliberation. But research shows that our perceptions are heavily influenced by what we feel about a hazard and the people who manage it.(6)

The literature on applications of the affect heuristic now includes studies of smoking, gambling, investing, drug use, and many other activities. It shows that affect often leads people to hard-to-shake perceptions that do not follow from rational deliberation, and indeed are sometimes contrary to personal self-interest. In other words, there are two types of thinking, one “analytical” that is grounded in deliberative analysis of risk and a second that is “experiential” based on intuitive, fast, image-based affective reaction to danger.(7-10) Mixing affect with analytical deliberations has had a profound impact on our ability to understand personal risk analyses, choices, and behaviors.

No less influential has been the probing of the relationship between trust and perception. Many people lack information and do not connect emotionally to an issue, yet they may connect through what they believe about authorities who manage the risk and issue warnings. If risk managers are perceived to be competent at assessing and managing risk and to be honest, caring, and accessible to the public, then less concern is felt and more benefits are perceived, leading to greater acceptance of a hazard.(11-16) How heavily individuals weight the messages and management of authorities in reducing their concerns about hazards varies predictably with political ideology.(17) Trust can be built and it can be lost.(14) From food-related risks to nuclear waste management, assessing trust in specific decisionmakers and purveyors of information has been shown to be important.(16) An ongoing important endeavor is to understand the relative and combined contributions of affect and trust.

2. Recognizing that Personal Decisions Reflect Different Processes for Valuing and Combining Anticipated and Actual Losses, Gains, Delays, and Surprises

Utility theory assumes that people seek personally optimal solutions, are self-interested and forward looking, and rely on consistent and rational decision-making processes. If that were the case, and if adequate information were available at little or no cost, then far more than 46% of the U.S. adult population would have a last will and testament, including the many people with considerable resources who currently do not;(18) every driver would use seatbelts; no one would smoke (unless strong biochemical and genetic evidence indicated smoke tolerance); people living in areas prone to natural hazard events would carry insurance; and so on. Amos Tversky and Daniel Kahneman's prospect theory posits that, in the face of uncertain risks, people group and order options and then create individual value functions for each option.(19,20) These functions are nonlinear, and steeper for losses than for gains, because we mind losing what we already have more than we mind not winning what we do not yet have. Instead of focusing on final outcomes such as accumulated wealth, people focus on changes around a reference point—gains and losses relative to their personal status quo or some other aspiration level. Furthermore, low-probability outcomes are overweighted and moderate- to high-probability ones are underweighted, although certainty is overweighted compared to near-certainty. Prospect theory has been featured primarily in the economics literature, and indeed Kahneman and Tversky's 1979 paper is the most cited paper in Econometrica.(19)
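The value and probability-weighting functions described above can be sketched with the functional forms Tversky and Kahneman later estimated empirically; the parameter values below (α = 0.88, λ = 2.25, γ = 0.61) are their reported median estimates, used here purely for illustration:

```python
def value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value function: concave for gains,
    convex and steeper for losses (loss aversion)."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

def weight(p, gamma=0.61):
    """Inverse-S probability weighting: small probabilities are
    overweighted, moderate-to-large ones underweighted."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

# Losses loom larger than equivalent gains:
assert abs(value(-100)) > value(100)
# A 1% chance is overweighted; a 90% chance is underweighted:
assert weight(0.01) > 0.01 and weight(0.90) < 0.90
```

With these parameters, a $100 loss is felt roughly 2.25 times as strongly as a $100 gain, reproducing the asymmetry described in the text.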

During the last 30 years, almost 200 articles in Risk Analysis have used prospect theory to better understand environmental and public health risks. The first major article about prospect theory in the journal in 1983 derived hypothetical weights and functions to predict reactions to rare events with a focus on possible problems at nuclear energy facilities.(21) A 2011 publication in Risk Analysis reports that employees of the USDA Forest Service who have the authority to choose how to manage wildfire events chose options that avoid loss, selecting the safe option more often when the consequences of the choice were framed as potential gains.(22) The subjects also exhibited discounting, choosing to minimize short-term over long-term risk due to a belief that future risk could be controlled. There is a massive and growing literature on applications of prospect theory because it resonates so well with behavior.(23-28)

Prospect theory, however, should not be regarded as a guide for normative decision making, as noted in Kahneman's 2011 book.(29) Rather, decisionmakers should reflect carefully on the extent that the zero point and regret should enter the evaluation of gains and losses. While prospect theory may be better than traditional (e.g., Bernoulli's) utility theory in describing how people actually make choices, individuals and organizations should be urged to evaluate outcomes carefully in terms of long-term rather than short-term goals. Normative valuation across a multiplicity of criteria remains a challenge, especially in a social or group decision context. As Kenneth Arrow showed with his Impossibility Theorem,(30) a group preference function reflecting individual preferences may not exist.

3. Developing an Environmental Justice Ethic and Frameworks

In 1994, U.S. President Clinton issued Executive Order 12898 requiring federal agencies and departments to develop environmental justice (EJ) strategies for administration of environmental rules and guidelines, including hiring. Every environmental impact statement considers EJ, and prudent private and public managers routinely consider EJ before they pick a location for a new facility or make major operational changes at existing sites.

Executive Order 12898 had its roots in the Civil Rights Act of 1964. Similar efforts occurred across the globe from the 1950s through the 1980s. In the United States, the strongest arguments for EJ were made by the United Church of Christ (UCC), the only religious organization that still had a civil rights thrust in the late 1980s. Benjamin Chavis, Jr. coined the expression “environmental racism” and Charles Lee of the UCC built the arguments and a database to support the theory. The report Toxic Waste and Race(31) showed that the concentration of hazardous waste sites was strongly associated with ethnicity/race and socioeconomic status, and was the basis upon which to assert inequity in the distribution of environmental risks and benefits.

The EJ literature is massive, including nearly 200 papers in Risk Analysis that directly focus on EJ or invoke the theory to make an argument. Notably, the literature is equivocal about the empirical evidence.(32-37) For example, Zimmerman(32) studied Superfund (NPL) sites and observed that these disproportionately were located in minority areas, but that this same finding did not apply for socioeconomic status. Cutter et al.(37) used the Toxic Release Inventory and other data to examine the distribution of polluting industrial facilities in South Carolina, and did not observe a disproportionate burden on racial minorities and the poor. Burger et al.(38) argue that all populations should have access to the decision-making process and provide suggestions for ecological information. Greenberg(39) argued and demonstrated that markedly different results could be produced by manipulating the geographical scale of the data, time period, definitions of populations, and statistics used to measure inequity. Nevertheless, the explication of the theory in the context of civil rights has been a powerful device to challenge policies about transportation, energy, and cleanup priorities after environmental disasters (e.g., Katrina). EJ theory may be the most important theoretical addition to environmental policy over the last two decades because it forces decisionmakers to consider distributive impacts and nonquantifiable costs and benefits.(40-42)


4. Using Formal Uncertainty Analysis in Risk Assessment

Rules, regulations and guidelines about human and ecological risks historically have been based on deterministic analyses and conservative safety factors. However, decisionmakers might make different decisions if informed about uncertainty in the underlying data. Formal Bayesian methods allow users to incorporate both prior knowledge and experimental data into risk assessments.(43-45) Uncertainty in multiple inputs to a deterministic analysis can be described by a set of probability distributions on these inputs, which may need a conditional structure because some inputs depend upon others. Monte Carlo simulation has become a widely used practice for calculating a probability distribution on model output from the set of probability distributions on model inputs. Decision trees and influence diagrams (or Bayesian networks(46)) are also used for such calculations, and have the advantage that the conditional structure is made explicit. These types of formal uncertainty analyses have become widely used for understanding plausible risks and identifying critical uncertainties associated with them for U.S. federal agencies, and in the European Union and Japan, among other nations.(46-51) Some subnational government agencies now mandate the use of probabilistic risk assessment (PRA) and provide guidance on its use.
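The Monte Carlo propagation step described above can be sketched with a toy model; the three input distributions and the multiplicative exposure model below are invented for illustration and are not drawn from any cited assessment:

```python
import random

random.seed(1)  # fixed seed so the sketch is reproducible

def simulate_risk(n=100_000):
    """Propagate uncertainty in three inputs through a toy model:
    risk = intake * concentration * potency. All distributions
    here are illustrative assumptions."""
    risks = []
    for _ in range(n):
        intake = random.lognormvariate(0.0, 0.3)          # L/day
        concentration = random.lognormvariate(-2.0, 0.5)  # mg/L
        potency = random.triangular(0.01, 0.05, 0.02)     # risk per mg/day
        risks.append(intake * concentration * potency)
    return sorted(risks)

risks = simulate_risk()
median = risks[len(risks) // 2]
p95 = risks[int(0.95 * len(risks))]
# The output is a distribution, not a point estimate; a decisionmaker
# can weigh the median against the conservative upper tail.
assert 0 < median < p95
```

Replacing the point estimates of a deterministic analysis with this kind of output distribution is what lets decisionmakers see which inputs drive the critical uncertainties.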

Thousands of government documents and journal articles, including over 500 articles in Risk Analysis, have documented the evolution of PRA and excellent summary papers have been written.(52-54) Guidance and practice suggest that this method is used after simpler approaches indicate there is a need to be concerned. Important refinements have been suggested, such as separating analyses into components that the decisionmaker can control and those that it cannot.

Examples abound, including assessments of contaminated sediments in water bodies, cancer risk assessment, and many others.(55,56) In Risk Analysis, beginning with the very first article in which Kaplan and Garrick(2,57) defined the risk assessment trilogy, the major focus was nuclear energy and waste management. Individual articles have focused on reactor safety, loss of off-site power, the modeling of a variety of initiating event issues associated with radioactive waste repositories, fires, precursor events, and health effects from Chernobyl.(58-67) All U.S. nuclear power plants have been required to prepare risk assessments that rely on probabilistic simulations and this practice has spread.

Probabilistic risk analysis has limitations. Data requirements are substantial. Decisionmakers and/or their staff need some background in these methods in order to understand that options are clarified and not obfuscated, and some distrust the method because they assume that values are buried in the numbers and that these drive the decisions. Nevertheless, many government bodies and private organizations have become more committed to the method and have issued guidelines that eliminate some of the concerns of the early years. It appears to us that with the emergence of guidance and off-the-shelf software packages, the use of the method will continue to diffuse.

5. Building the Capacity and Frameworks to Apply Multiobjective Decision Making to Complex Decisions that Trade Off Risk and Other Criteria

The ability to assess the influence of multiple criteria as part of an analytical decision-making process has appealed to analysts and some decisionmakers for decades.(68) Globalization, population growth, and increasing pressure on resources and the environment have increased the potential value of multiattribute decision support tools and models. Analysts have responded with multiple methods, such as the analytic hierarchy process, data envelopment, goal programming, multiple-attribute utility theory, value engineering, and a host of others.(69,70) Yet there has been skepticism about this class of models, some asserting that they are too expensive to build, use, and maintain, inflexible in the face of multiple agency needs, and difficult to understand. For example, a widely cited paper in urban planning characterized these models as dinosaurs that collapsed rather than evolved.(71) One reason for this label was the models’ inability to quickly solve the problems. This remains an issue but considerable progress has been made.

The key challenge has been overcoming objections that the models try to be too comprehensive and thereby become incomprehensible. We have clear evidence that this issue can be addressed. Risk Analysis has published about 100 articles that apply one or more multiobjective decision-making tools to topics such as facility siting, waste management, and homeland security.(72-80) For example, Linkov et al. used multicriteria decision analysis (MCDA) to examine alternative ways of managing contaminated sediments.(72) The authors built an MCDA approach that explicitly wove together expert judgment and stakeholder views.

A goal for developers of homeland security applications has been to make them both comprehensive and comprehensible. For example, Leung et al.(77) developed a two-stage approach aimed at determining how to safeguard bridges against terrorist attacks. Building on the hierarchical holographic modeling of Haimes, they examined and prioritized risks and then considered risk management responses. Li et al.(80) built a simulation model of the MIT campus using value trees based on multiattribute utility theory. In both of these and other cases, the models provide entry for decisionmakers’ preferences about the value and potential impacts of options. Far from the black box models that repulsed potential users, these user-friendly efforts do not require advanced degrees in mathematics to follow and apply. They signal the possibility of many more directly useful public policy applications.

6. Modifying Existing Regional Economic Impact Assessment Tools

Multiplier, econometric, and input-output models have been used to estimate the impacts of new steel mills, airports, road projects, and other large projects on regional economic product, jobs, taxes, and population migration. These tools depend on historical relationships embedded in government-collected economic data. However, in the case of hazard events, their inability to take into account a precipitous loss of economic system capacity can lead to flawed economic consequence estimates and inaccurate assessments of the benefits of investments to reduce consequences.

Inoperability input-output models (IIM) and computable general equilibrium (CGE) models are two simulation approaches that consciously address these limitations. IIM models focus on the reduction of demand sectors most directly impacted by the event. Suppliers and customers of these sectors are affected and the impacts diffuse through the entire economy. So, for example, if a tornado demolished a major port, the immediate impacts would be on the shipping, rail, and auto traffic that depends upon that port. Then all of the businesses that receive materials from the port or ship their products through the port would be impacted. Their workers might lose their jobs, leading to an induced income impact. After the event, if there are no more disruptions, the economy eventually recovers and finds a new equilibrium point.(81,82)
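The ripple logic described in the port example can be sketched as a fixed-point calculation on an interdependency matrix; the three-sector matrix and the 60% port shock below are hypothetical numbers chosen only to show the mechanics:

```python
# Hypothetical 3-sector interdependency matrix: A[i][j] is the fraction
# of sector i's inoperability induced by full inoperability of sector j
# (sector 0 = port/shipping, 1 = manufacturing, 2 = services).
A = [[0.0, 0.3, 0.1],
     [0.2, 0.0, 0.2],
     [0.1, 0.2, 0.0]]
# Initial perturbation: the port sector loses 60% of its capacity.
c = [0.6, 0.0, 0.0]

def equilibrium_inoperability(A, c, iterations=200):
    """Solve q = A q + c by fixed-point iteration; this converges when
    the spectral radius of A is below 1, as in a viable economy."""
    q = c[:]
    for _ in range(iterations):
        q = [c[i] + sum(A[i][j] * q[j] for j in range(len(q)))
             for i in range(len(q))]
    return q

q = equilibrium_inoperability(A, c)
# The directly hit sector suffers most, but losses diffuse everywhere:
assert q[0] > 0.6 and all(0 < x < 1 for x in q)
```

The equilibrium vector q quantifies how a shock confined to one sector propagates through suppliers and customers, which is precisely the capacity-loss effect that conventional multiplier models miss.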

CGE models assume a general economic equilibrium(83) and use simulation models to estimate levels of supply, demand, and prices for an economy. If the economy suffers an economic blow, such as a terrorist attack on a rail system or a tornado disabling the water system, the model adjusts to changes, deriving equations with the most efficient solutions to make the economy function. The models can explicitly incorporate resilience in the equation structure, which is a major advantage. For example, Rose(84) built a CGE model to test the impact of an earthquake on the water supply and in turn on the economy of the Portland, Oregon region. The model incorporated improving the pipes, pumps, and remainder of the water system, as well as changing water use schedules. These substantially reduced the estimated negative regional economic impact.

Both IIM and CGE models are conceptual improvements for estimating the economic consequences of hazard events, yet both have limitations requiring use of assumptions about demand elasticities, among others. Also, other approaches are being developed and applications are spreading.(85-88) Stay tuned, because new methods will continue to be developed in this area.


7. Estimating Likelihoods of Events Across a Broad Spectrum of Hazard Events

Traditionally, certain kinds of risks have been very difficult to quantify due to a lack of adequate scientific understanding and valid modeling methods. For example, risks of exposure-related cancers used to be based largely on default assumptions (e.g., that cancer risk is proportional to exposure) and safety factors of uncertain and questionable validity for specific agents. Microbial and antimicrobial hazards used to be assessed using simplistic models and factors that did not adequately distinguish between individual dose-response probabilities and population distributions of individual dose-response functions. This situation has improved greatly over the past 30 years. For cancer risk assessment, the introduction of the two-stage stochastic clonal expansion (TSCE) model by Moolgavkar et al., in a series of Risk Analysis articles and related papers that began in the early 1980s, opened the door to a variety of “biologically-based” risk models that were simultaneously biologically plausible and practical to estimate from experimental and epidemiological data.(89-93) Applications to radiation-related risks (e.g., from uranium mines), chemical carcinogens and leukemogens (such as benzene), interactions between smoking and occupational exposures, and a host of other practical applications swiftly followed, but these applications were not confined to cancer risk and included, for example, contamination of drinking water supplies by viruses.(94)

During the last decade, microbial risk assessment applications have been rapidly increasing. For example, Risk Analysis has published more than 50 papers over this period that rest on the substantial validation of the beta-Poisson model and its application to biological hazards such as Campylobacter, Salmonella, Legionnaires' disease, SARS coronavirus, drinking water contaminants, and anthrax.(95) These findings have been applied to standard public health problems such as the spread of influenza, food contamination, and the spread of diseases during transportation. For example, Chen et al. examined dose response for Campylobacter jejuni in chickens,(96) Loge et al. looked at the importance of bathers in spreading pathogens,(97) and Brienen et al. used these tools to examine the effect of mask use on controlling the spread of influenza.(98) Many recent applications relate to deliberate efforts to affect populations. For example, Tamrakar and Haas examined the use of Rickettsia rickettsii, the causative agent of Rocky Mountain spotted fever, and Coxiella burnetii, which causes Q fever.(99,100) The clear direction of these applications is to provide public health and homeland security officials with more realistic assessment of the risks under different environmental conditions and with a sense of likelihood.
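The beta-Poisson dose-response model mentioned above can be sketched in a few lines; the Campylobacter jejuni parameters (α ≈ 0.145, N50 ≈ 896 organisms) are widely cited published estimates, used here only as an illustration:

```python
def beta_poisson(dose, alpha, n50):
    """Approximate beta-Poisson dose-response model:
    P(infection) = 1 - (1 + dose * (2**(1/alpha) - 1) / N50) ** (-alpha),
    where N50 is the median infectious dose."""
    return 1.0 - (1.0 + dose * (2.0 ** (1.0 / alpha) - 1.0) / n50) ** (-alpha)

# Illustrative parameters often cited for Campylobacter jejuni:
alpha, n50 = 0.145, 896.0

p_low = beta_poisson(10, alpha, n50)      # low ingested dose
p_median = beta_poisson(n50, alpha, n50)  # dose equal to N50

# Risk rises with dose, and N50 yields a 50% infection probability:
assert 0 < p_low < p_median < 1
assert abs(p_median - 0.5) < 1e-9
```

Unlike a simple exponential model, the α parameter captures heterogeneity in host-pathogen response across a population, which is why the model distinguishes individual dose-response probabilities from population distributions.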

Many applications have also been made to natural disasters, and Risk Analysis has published many papers on this topic. As a start we suggest John Garrick's 2008 book(101) and a paper by Budnitz et al.(102) on using a panel of experts for seismic risk analysis.

8. Applying Intelligent Agent Models to Terrorism

In the decade after September 11, 2001, PRA was extended and applied to risks of deliberate attacks on infrastructure, population centers, food supply chains, and other vulnerable targets by intelligent agents with motives and capabilities that may not be fully understood by defenders. Much of the work featured proposed methods and models for assessing and comparing threats, vulnerabilities, and consequences of attacks to help allocate defensive priorities, often using the suggested definition that “Risk = Threat × Vulnerability × Consequence,” with expert judgment and PRA being used to help provide the three admittedly somewhat ambiguous input components. While the PRA model is still applied, it has limitations for some applications and hence other tools from operations research are also being applied to this important policy area, including multilevel optimization (e.g., finding a defender's best decision, anticipating both the attacker's best response to it, and the defender's own best response to the attacker's response), multicriteria optimization, value modeling (for attackers and defenders), decision analytic techniques (such as influence diagrams, decision trees, and game trees), game theory, and empirically and psychologically motivated risk models. Our journal has published articles that have led not only the application of PRA methods to terrorist threats, but also critiques and possible improvements using operations research models and methods.(103-111)
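The multiplicative definition can be illustrated with hypothetical expert-elicited scores; since, as noted above, the three inputs are admittedly somewhat ambiguous, this is a sketch of the arithmetic used to prioritize targets, not an endorsement of the formula:

```python
# Hypothetical scores: threat = annual attack probability,
# vulnerability = conditional probability of attack success,
# consequence = expected loss given success (arbitrary units).
targets = {
    "port":       {"threat": 0.20, "vulnerability": 0.7, "consequence": 500},
    "power_grid": {"threat": 0.05, "vulnerability": 0.9, "consequence": 2000},
    "stadium":    {"threat": 0.40, "vulnerability": 0.3, "consequence": 300},
}

def risk(t):
    """Risk = Threat x Vulnerability x Consequence."""
    return t["threat"] * t["vulnerability"] * t["consequence"]

# Rank targets for defensive priority by descending risk score:
ranked = sorted(targets, key=lambda name: risk(targets[name]), reverse=True)
assert ranked == ["power_grid", "port", "stadium"]
```

One published critique of this ranking approach is that the three components are not independent for an intelligent adversary, who adapts to observed defenses; this is what motivates the game-theoretic and multilevel optimization alternatives listed above.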

Yet, much remains to be done, and the most effective methods for risk assessment and risk management for deliberate attacks are still debated. In addition to articles seeking better methods and models to understand, predict, and prevent damaging attacks—or, more generally, to allocate defensive resources effectively to reduce attack risks and to mitigate their consequences, for example, by optimizing evacuation versus shelter-in-place decisions—our journal has also published articles modeling attacks via specific routes (e.g., dirty bomb attacks on ports or cities; attacks using trucks, trains, and planes; attacks on food supplies) and on various types of targets (e.g., power grids, transportation infrastructure, and cybersecurity).(112-117) Developing more practical methods and effective strategies to improve individual, organizational, community, and national responses to attack risks—but without overpreparing, or creating additional incentives to attack via excessive concern and overreaction to threats—remains as a grand challenge for further work in risk analysis, drawing together all of the elements of risk perception, assessment, management, and communication and combining them with fundamental questions about how decisions should be made when intelligent adversaries are the source of risks.

9. Building an Applied Field of Risk Communications

In 1980, we commonly heard that risk communications meant scientists explaining the science to decisionmakers, reporters, and the public, for example, “the public needs to be educated.” Fast forward 30 years to find a large body of theory and empirical evidence demonstrating that successful risk communication means providing reliable and useful information to all interested parties, including scientists and managers who too often think that they already know whatever they need to know.(118) Sandman(119) argues that recognizing the role of personal feelings of control, voluntariness, equity, morality, and other factors is critical and that risk communicators need to formulate their messages to address these. Poorly written and ineptly delivered risk information is not only boring but can be confusing, leading to counterproductive public perceptions and behaviors.

Our journal has contributed to the theory, methods, and empirical evidence in over 1,200 articles in this field. In his first editorial in 1981, editor Robert B. Cumming(120) pointed to the complexity of the science and communicating information as challenges. The field slowly, and in some ways painfully, has learned that risk communication requires training the people who go before the public and the media, and persuading organizations that risk communication is only as good as the underlying risk management and requires a genuine commitment of resources for planning, implementation, and evaluation.(121-128) Researchers have produced seven cardinal rules of risk communication,(129) ten commandments for communicating with journalists,(130) and the equivalent for organizations.(131,132)

Reasons for discomfort with the generic “how to” advice developed in the 1980s and 1990s have grown among scholars. These include that there are multiple audiences, each requiring its own communication program; that the effectiveness of communications is strongly influenced by affect and trust; and that rigorous pretesting and posttesting evaluations are not often conducted. Furthermore, we know that the media sometimes amplify risk (especially when locked into feedback loops with concerned citizens and activists)(133) and other times that risk is attenuated by the media and others. Also, we have learned that we need to be cautious about using risk comparisons, and that citizen advisory boards and public participation represent good democratic practices, but need careful management.

Applications started with communicating about cancer risk, and rapidly expanded to hazardous waste management, facility siting, air pollution, water quality, radon, tobacco, alcohol and other drug abuse, the Challenger, anthrax, Three Mile Island and Chernobyl, hurricanes, tornadoes, biological hazards, electromagnetic fields, terrorism, and a host of others. Risk communication is a dynamic field, both an art and increasingly a science, and a great deal more theoretical and empirical work is required before our comfort level increases.

10. Making Risk-Informed Legal and Regulatory Decisions

Risk analysis creates information for decisionmakers. What happens to the information depends upon who receives it and their institutional objectives and environment. Here we recognize contested risk-related decisions that overturned or substantially changed established policy. Some of these are well-known and others are not. For example, the phaseout of chlorofluorocarbons and tetraethyl lead from gasoline, the adoption of airbags in cars, efforts to reduce small particle air pollutants, and the requirement that children sit in restraints in the back seat are well-known changes.(134-139) The system of regulatory impact assessment, the “prompt” letter and FDA decision to list trans-fat content on food labels, and the destruction of the U.S. chemical weapon stockpile are less well-known illustrations.(140-142)

The complexity of risk-informed decisions is illustrated by oxygenating agents in motor vehicles. Before 1990, Austria, Brazil, and Japan had legally phased out tetraethyl lead in gasoline. During the next decade, Canada, Denmark, Hong Kong, Germany, New Zealand, Singapore, Slovakia, South Korea, Sri Lanka, Sweden, and the United States joined the group, and many more have followed. The risk-based evidence focused on neurological damage to children. None of the nations made this decision without opposition.(143,144)

We think, however, that the most drawn-out process was in the United States. It began with the Clean Air Act in 1970 that created a mandate for unleaded gas and then required a catalytic converter, which tetraethyl lead poisons. After a major debate in 1985 that almost scrapped the program, a great deal of risk-related information was provided to senior federal officials that showed a clear advantage to phasing out lead. Finally, Congress chose to ban it effective in 1996. The result has been a marked decrease in atmospheric lead. This decision, however, has precipitated an ongoing lengthy debate over what should replace lead as the anti-knock compound in gasoline. Methyl tertiary butyl ether (MTBE) was chosen over the objections of some risk analysts who noted the likelihood of serious groundwater pollution.(143,144) MTBE has now been replaced by various alcohols, aromatic hydrocarbons, and ethanol. None of these compounds is without risk, and this debate has not ended but is a good example of the need for a risk-based decision-making protocol that keeps track of the immediate and life-cycle consequences of decisions. The need for risk-informed decisions is paramount across the globe and represents an ongoing challenge for every member of the risk analysis community.(145,146)


Researching, thinking about, and writing this article has been both an enjoyable and challenging exercise. The challenges were in choosing among so many important contributions and then summarizing each in about 400 words. We acknowledge that you may have alternative nominations and additional thoughts about those we chose. We invite you to send us letters to the editor. And note that we have invited experts in each of the 10 areas to write perspectives to appear in future issues on new directions in these fields.


We thank members who suggested topics to us, and we note that the contents do not represent an official position of the Society for Risk Analysis.