Principles of Good Practice for Decision Analytic Modeling in Health-Care Evaluation: Report of the ISPOR Task Force on Good Research Practices—Modeling Studies
Article first published online: 22 JAN 2003
Value in Health
Volume 6, Issue 1, pages 9–17, January 2003
How to Cite
Weinstein, M. C., O'Brien, B., Hornberger, J., Jackson, J., Johannesson, M., McCabe, C. and Luce, B. R. (2003), Principles of Good Practice for Decision Analytic Modeling in Health-Care Evaluation: Report of the ISPOR Task Force on Good Research Practices—Modeling Studies. Value in Health, 6: 9–17. doi: 10.1046/j.1524-4733.2003.00234.x
- Issue published online: 22 JAN 2003
Objectives: Mathematical modeling is used widely in economic evaluations of pharmaceuticals and other health-care technologies. Users of models in government and the private sector need to be able to evaluate model quality against scientific criteria of good practice. This report describes the consensus of a task force convened to provide modelers with guidelines for conducting and reporting modeling studies.
Methods: The task force was appointed with the advice and consent of the Board of Directors of ISPOR. Members were experienced developers or users of models, worked in academia and industry, and came from several countries in North America and Europe. The task force met on three occasions, conducted frequent correspondence and exchanges of drafts by electronic mail, and solicited comments on three drafts from a core group of external reviewers and more broadly from the membership of ISPOR.
Results: Criteria for assessing the quality of models fell into three areas: model structure, data used as inputs to models, and model validation. Several major themes cut across these areas. Models and their results should be represented as aids to decision making, not as statements of scientific fact; it is therefore inappropriate to demand that models be validated prospectively before use. However, model assumptions regarding causal structure and parameter estimates should be continually assessed against data, and models should be revised accordingly. Structural assumptions and parameter estimates should be reported clearly and explicitly, and sensitivity analyses should give users the opportunity to appreciate the conditional relationship between model inputs and outputs.
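To make the final point concrete, a one-way sensitivity analysis exposes how a model's output responds to a single input varied over a plausible range. The sketch below is purely illustrative and not drawn from the article: it computes the incremental cost-effectiveness ratio (ICER) for a hypothetical two-strategy comparison, with all cost and QALY values assumed for demonstration.

```python
# Minimal sketch of a one-way sensitivity analysis for a hypothetical
# cost-effectiveness comparison of a new therapy vs. standard care.
# All parameter values are illustrative assumptions.

def icer(cost_new, cost_std, qaly_new, qaly_std):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY."""
    return (cost_new - cost_std) / (qaly_new - qaly_std)

# Base-case assumptions (hypothetical).
cost_std, qaly_std = 10_000.0, 5.0
cost_new = 14_000.0

# Vary the new therapy's effectiveness across a plausible range and
# report how the ICER responds, making the input-output relationship
# explicit for the decision maker.
for qaly_new in (5.2, 5.5, 6.0):
    gain = qaly_new - qaly_std
    ratio = icer(cost_new, cost_std, qaly_new, qaly_std)
    print(f"QALYs gained = {gain:.1f}  ICER = {ratio:,.0f} per QALY")
```

Reporting such a table alongside base-case results lets readers see that the conclusion (e.g., whether the ICER falls below a willingness-to-pay threshold) is conditional on the assumed effectiveness gain.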
Conclusions: Model-based evaluations are a valuable resource for health-care decision makers. It is the responsibility of model developers to conduct modeling studies according to the best practicable standards of quality and to communicate results with adequate disclosure of assumptions and with the caveat that conclusions are conditional upon the assumptions and data on which the model is built.