References

  • Abbott, A., Cyranoski, D., Jones, N., Maher, B., Schiermeier, Q., & van Noorden, R. (2010). Metrics: Do metrics matter? Nature, 465, 860–862.
  • Appleseed Inc. (2003). Engines of economic growth: The economic impact of Boston's eight research universities on the metropolitan Boston area. Boston, MA: Author.
  • Australian Research Council. (2011). ERA 2012 submission guidelines, Excellence in Research for Australia. Canberra, Australia: Author.
  • Australian Research Council. (2012). Linkage Projects funding rules for funding commencing in 2012 variation (No. 1). Canberra, Australia: Author.
  • Austrian Science Fund. (2007). Rethinking the impact of basic research on society and the economy. Vienna, Austria: Author.
  • Barré, R. (2005). S&T indicators for policy making in a changing science–society relationship. In H. Moed, W. Glänzel, & U. Schmoch (Eds.), Handbook of quantitative science and technology research (pp. 115–131). Dordrecht, The Netherlands: Springer.
  • Beise, M., & Stahl, H. (1999). Public research and industrial innovations in Germany. Research Policy, 28(4), 397–422. doi:10.1016/s0048-7333(98)00126-7
  • Bell, S., Shaw, B., & Boaz, A. (2011). Real-world approaches to assessing the impact of environmental research on policy. Research Evaluation, 20(3), 227–237.
  • Bensing, J.M., Caris-Verhallen, W.M.C.M., Dekker, J., Delnoij, D.M.J., & Groenewegen, P.P. (2003). Doing the right thing and doing it right: Toward a framework for assessing the policy relevance of health services research. International Journal of Technology Assessment in Health Care, 19(4), 604–612. doi:10.1017/S0266462303000564
  • Biotechnology and Biological Sciences Research Council. (2012). BBSRC policy on maximising the impact of research. Swindon, United Kingdom: Author.
  • Bornmann, L. (2010). Mimicry in science? Scientometrics, 86(1), 173–177.
  • Bornmann, L. (2011). Scientific peer review. Annual Review of Information Science and Technology, 45, 199–245.
  • Bornmann, L. (2012). Measuring the societal impact of research. EMBO Reports, 13(8), 673–676.
  • Bornmann, L., Mutz, R., Neuhaus, C., & Daniel, H.-D. (2008). Use of citation counts for research evaluation: Standards of good practice for analyzing bibliometric data and presenting and interpreting results. Ethics in Science and Environmental Politics, 8, 93–102. doi:10.3354/esep00084
  • Bozeman, B., & Sarewitz, D. (2011). Public value mapping and science policy evaluation. Minerva, 49(1), 1–23. doi:10.1007/s11024-011-9161-7
  • Burke, J., Bergman, J., & Asimov, I. (1985). The impact of science on society. Washington, DC: National Aeronautics and Space Administration.
  • Bush, V. (1945). Science: The endless frontier. [A report to President Truman outlining his proposal for post-war U.S. science and technology policy.] Washington, DC: United States Government Printing Office.
  • Buxton, M. (2011). The payback of “Payback”: Challenges in assessing research impact. Research Evaluation, 20(3), 259–260.
  • Buxton, M., & Hanney, S. (1994). Assessing payback from Department of Health Research and Development: Preliminary report: Vol. 1. The main report (Research Report, No. 19). Uxbridge, United Kingdom: HERG, Brunel University.
  • Buxton, M., & Hanney, S. (1996). How can payback from health services research be assessed? Journal of Health Services Research & Policy, 1(1), 35–43.
  • Centre for Quality and Change Management. (2011). Final report of Delphi study. E3M project—European indicators and ranking methodology for university third mission. Valencia and León, Spain: Universidad Politécnica de Valencia and the Universidad de León.
  • Committee on Prospering in the Global Economy of the 21st Century. (2007). Rising above the gathering storm: Energizing and employing America for a brighter economic future. National Academy of Sciences, National Academy of Engineering, Institute of Medicine. Washington, DC: National Academies Press.
  • Danish Council. (2006). A tool for assessing research quality and relevance. Copenhagen, Denmark: Danish Council for Research Policy.
  • Department of Education, Science and Training. (2005). Research quality framework: Assessing the quality and impact of research in Australia (Issue paper). Canberra: Commonwealth of Australia.
  • Department of Education, Science and Training. (2006). Research quality framework: Assessing the quality and impact of research in Australia. Research Impact (Report by the RQF Development Advisory Group). Canberra: Commonwealth of Australia.
  • Donovan, C. (2007). The qualitative future of research evaluation. Science and Public Policy, 34(8), 585–597. doi:10.3152/030234207X256538
  • Donovan, C. (2008). The Australian Research Quality Framework: A live experiment in capturing the social, economic, environmental, and cultural returns of publicly funded research. New Directions for Evaluation, 2008(118), 47–60. doi:10.1002/ev.260
  • Donovan, C. (2011). State of the art in assessing research impact: Introduction to a special issue. Research Evaluation, 20(3), 175–179.
  • Donovan, C., & Hanney, S. (2011). The “Payback Framework” explained. Research Evaluation, 20(3), 181–183.
  • Evaluating Research in Context (ERiC). (2010). Evaluating the societal relevance of academic research: A guide. Delft, The Netherlands: Delft University of Technology.
  • Ernø-Kjølhede, E., & Hansson, F. (2011). Measuring research performance during a changing relationship between science and society. Research Evaluation, 20(2), 131–143. doi:10.3152/095820211x12941371876544
  • European Commission. (2010). Assessing Europe's university-based research. Expert group on assessment of university-based research. Brussels, Belgium: Publications Office of the European Union.
  • Frank, C., & Nason, E. (2009). Health research: Measuring the social, health and economic benefits. Canadian Medical Association Journal, 180(5), 528–534. doi:10.1503/cmaj.090016
  • Frodeman, R., & Briggle, A. (2012). The dedisciplining of peer review. Minerva, 50(1), 3–19. doi:10.1007/s11024-012-9192-8
  • Gibbons, M., Limoges, C., Nowotny, H., Schwartzman, S., Scott, P., & Trow, M. (1994). The new production of knowledge: The dynamics of science and research in contemporary societies. London, United Kingdom: Sage.
  • Godin, B., & Doré, C. (2005). Measuring the impacts of science: Beyond the economic dimension (INRS Urbanisation, Culture et Société). Paper presented at the HIST Lecture, Helsinki Institute for Science and Technology Studies, Helsinki, Finland. Available at: http://www.csiic.ca/PDF/Godin_Dore_Impacts.pdf
  • Göransson, B., Maharajh, R., & Schmoch, U. (2009). New activities of universities in transfer and extension: Multiple requirements and manifold solutions. Science and Public Policy, 36(2), 157–164. doi:10.3152/030234209x406863
  • Grant, J. (1999). Evaluating the outcomes of biomedical research on healthcare. Research Evaluation, 8(1), 33–38.
  • Grant, J., Brutscher, P.-B., Kirk, S., Butler, L., & Wooding, S. (2009). Capturing research impacts. Cambridge, United Kingdom: RAND Europe.
  • Gregersen, B., Linde, L.T., & Rasmussen, J.G. (2009). Linking between Danish universities and society. Science and Public Policy, 36(2), 151–156. doi:10.3152/030234209x406818
  • Hanney, S., Packwood, T., & Buxton, M. (2000). Evaluating the benefits from health research and development centres: A categorization, a model and examples of application. Evaluation, 6(2), 137–160.
  • Health Economics Research Group, Office of Health Economics, RAND Europe. (2008). Medical research: What's it worth? Estimating the economic benefits from medical research in the UK. London, United Kingdom: Evaluation Forum.
  • Higher Education Funding Council for England. (2009). Research Excellence Framework. Second consultation on the assessment and funding of research. September 2009/38. Bristol: Higher Education Funding Council for England.
  • Higher Education Funding Council for England. (2011). Decisions on assessing research impact. Bristol: Higher Education Funding Council for England.
  • Higher Education Funding Council for England. (2012). Panel criteria and working methods. Bristol: Higher Education Funding Council for England.
  • Holbrook, J.B. (2010). The use of societal impacts considerations in grant proposal peer review: A comparison of five models. Technology & Innovation, 12(3), 213–224. doi:10.3727/194982410x12895770314078
  • Holbrook, J. (2012). Re-assessing the science–society relation: The case of the US National Science Foundation's broader impacts merit review criterion (1997–2011). Retrieved from http://www.scienceofsciencepolicy.net/system/files/attachments/Holbrook_BIC_2.0_final.pdf
  • Holbrook, J.B., & Frodeman, R. (2010). Comparative Assessment of Peer Review (CAPR). EU/US Workshop on Peer Review: Assessing “broader impact” in research grant applications. Brussels, Belgium: European Commission, Directorate-General for Research and Innovation.
  • Holbrook, J.B., & Frodeman, R. (2011). Peer review and the ex ante assessment of societal impacts. Research Evaluation, 20(3), 239–246.
  • Kamenetzky, J.R. (in press). Opportunities for impact: Statistical analysis of the National Science Foundation's broader impacts criterion. Science and Public Policy. doi:10.1093/scipol/scs059
  • Klautzer, L., Hanney, S., Nason, E., Rubin, J., Grant, J., & Wooding, S. (2011). Assessing policy and practice impacts of social science research: The application of the Payback Framework to assess the Future of Work programme. Research Evaluation, 20(3), 201–209.
  • Kuruvilla, S., Mays, N., Pleasant, A., & Walt, G. (2006). Describing the impact of health research: A Research Impact Framework. BMC Health Services Research, 6.
  • Lähteenmäki-Smith, K., Hyytinen, K., Kutinlahti, P., & Konttinen, J. (2006). Research with an impact: Evaluation practises in public research organisations. Kemistintie, Finland: VTT Technical Research Centre of Finland.
  • Lamm, G.M. (2006). Innovation works. A case study of an integrated pan-European technology transfer model. B.I.F. Futura, 21(2), 86–90.
  • Lewison, G., & Sullivan, R. (2008). The impact of cancer research: How publications influence UK cancer clinical guidelines. British Journal of Cancer, 98(12), 1944–1950.
  • Leydesdorff, L. (2012). The Triple Helix of university–industry–government relations. Amsterdam, The Netherlands: Amsterdam School of Communications Research.
  • Link, A.N., & Scott, J.T. (2011). The theory and practice of public-sector R&D economic impact analysis. Gaithersburg, MD: National Institute of Standards and Technology.
  • Macilwain, C. (2010). What science is really worth. Nature, 465(7299), 682–684.
  • Mansfield, E. (1991). Academic research and industrial innovation. Research Policy, 20(1), 1–12. doi:10.1016/0048-7333(91)90080-a
  • Mansfield, E. (1998). Academic research and industrial innovation: An update of empirical findings. Research Policy, 26(7–8), 773–776. doi:10.1016/s0048-7333(97)00043-7
  • Mardis, M.A., Hoffman, E.S., & McMartin, F.P. (2012). Toward broader impacts: Making sense of NSF's merit review criteria in the context of the National Science Digital Library. Journal of the American Society for Information Science and Technology, 63(9), 1758–1772. doi:10.1002/asi.22693
  • Martin, B.R. (2007). Assessing the impact of basic research on society and the economy. Paper presented at the Rethinking the impact of basic research on society and the economy (WF-EST International Conference, 11 May 2007), Vienna, Austria.
  • Martin, B.R. (2011). The Research Excellence Framework and the “impact agenda”: Are we creating a Frankenstein monster? Research Evaluation, 20(3), 247–254.
  • May, R.M. (1998). The scientific investments of nations. Science, 281(5373), 49–51.
  • Mervis, J. (2011). Beyond the data. Science, 334(6053), 169–171. doi:10.1126/science.334.6053.169
  • Miranda, L.C.M., & Lima, C.A.S. (2010). On trends and rhythms in scientific and technological knowledge evolution: A quantitative analysis. International Journal of Technology Intelligence and Planning, 6(1), 76–109.
  • Molas-Gallart, J., Salter, A., Patel, P., Scott, A., & Duran, X. (2002). Measuring third stream activities. Final report to the Russell Group of universities. Brighton, United Kingdom: Science and Technology Policy Research Unit, University of Sussex.
  • Molas-Gallart, J., & Tang, P. (2011). Tracing “productive interactions” to identify social impacts: An example from the social sciences. Research Evaluation, 20(3), 219–226.
  • Montada, L., Krampen, G., & Burkard, P. (1999). Personal and social orientations of psychology college teachers on evaluative criteria for own job performances: Results of an expert survey in German graduate psychology college teachers. Psychologische Rundschau, 50(2), 69–89.
  • Mostert, S., Ellenbroek, S., Meijer, I., van Ark, G., & Klasen, E. (2010). Societal output and use of research performed by health research groups. Health Research Policy and Systems, 8(1), 30.
  • Narin, F., Hamilton, K.S., & Olivastro, D. (1997). The increasing linkage between US technology and public science. Research Policy, 26(3), 317–330. doi:10.1016/s0048-7333(97)00013-9
  • Nason, E., Curran, B., Hanney, S., Janta, B., Hastings, G., O'Driscoll, M., & Wooding, S. (2011). Evaluating health research funding in Ireland: Assessing the impacts of the Health Research Board of Ireland's funding activities. Research Evaluation, 20(3), 193–200.
  • Niederkrotenthaler, T., Dorner, T.E., & Maier, M. (2011). Development of a practical tool to measure the impact of publications on the society based on focus group discussions with scientists. BMC Public Health, 11, 588.
  • Nightingale, P., & Scott, A. (2007). Peer review and the relevance gap: Ten suggestions for policy-makers. Science and Public Policy, 34(8), 543–553. doi:10.3152/030234207x254396
  • Pålsson, C.M., Göransson, B., & Brundenius, C. (2009). Vitalizing the Swedish university system: Implementation of the “third mission.” Science and Public Policy, 36(2), 145–150.
  • Petit, J.C. (2004). Why do we need fundamental research? European Review, 12(2), 191–207.
  • Roberts, M.R. (2009). Realizing societal benefit from academic research: Analysis of the National Science Foundation's broader impacts criterion. Social Epistemology, 23(3–4), 199–219. doi:10.1080/02691720903364035
  • Rowe, G., & Wright, G. (2001). Expert opinions in forecasting: The role of the Delphi technique. In J.S. Armstrong (Ed.), Principles of forecasting: A handbook for researchers and practitioners (pp. 125–144). Norwell, MA: Kluwer Academic.
  • Royal Netherlands Academy of Arts and Sciences, Association of Universities in the Netherlands. (2010). Standard Evaluation Protocol 2009–2015. Protocol for research assessment in the Netherlands. Amsterdam, The Netherlands: Author.
  • Royal Society. (2011). Knowledge, networks and nations: Global scientific collaboration in the 21st century. London, United Kingdom: Author.
  • Ruegg, R., & Feller, I. (2003). A toolkit for evaluating public R&D investment: Models, methods, and findings from ATP's first decade. Gaithersburg, MD: National Institute of Standards and Technology.
  • Rymer, L. (2011). Measuring the impact of research—The context for metric development. Turner, Australia: The Group of Eight.
  • Salter, A.J., & Martin, B.R. (2001). The economic benefits of publicly funded basic research: A critical review. Research Policy, 30(3), 509–532.
  • Schapper, C.C., Dwyer, T., Tregear, G.W., Aitken, M., & Clay, M.A. (2012). Research performance evaluation: The experience of an independent medical research institute. Australian Health Review, 36, 218–223. doi:10.1071/AH11057
  • Scott, J.E., Blasinsky, M., Dufour, M., Mandal, R.J., & Philogene, G.S. (2011). An evaluation of the Mind–Body Interactions and Health Program: Assessing the impact of an NIH program using the Payback Framework. Research Evaluation, 20(3), 185–192.
  • SISOP. (2011). Conceptualizing the social impact of science. Retrieved from http://sisobproject.wordpress.com/2011/07/20/conceptualizing-the-social-impact-of-science-2/
  • Smith, C.H.L. (1997). What's the use of basic science? Retrieved from http://wwwnew.jinr.ru/section.asp?sd_id=94
  • Smith, R. (2001). Measuring the social impact of research. British Medical Journal, 323(7312), 528. doi:10.1136/bmj.323.7312.528
  • Social Impact Assessment Methods for Research and Funding Instruments Through the Study of Productive Interactions (SIAMPI). (2010, December). SIAMPI Workshop, Brussels. Retrieved from http://www.siampi.eu/Content/SIAMPI/Report%20SIAMPI%20workshop.pdf
  • Social Impact Assessment Methods for Research and Funding Instruments Through the Study of Productive Interactions (SIAMPI). (2011). Final report on social impacts of research. Retrieved from http://www.siampi.eu/
  • Spaapen, J., & van Drooge, L. (2011). Introducing “productive interactions” in social impact assessment. Research Evaluation, 20(3), 211–218.
  • Spaapen, J.B., Dijstelbloem, H., & Wamelink, F. (2007). Evaluating research in context: A method for comprehensive assessment. The Hague, The Netherlands: Consultative Committee of Sector Councils for Research and Development.
  • Stephan, P. (2012). How economics shapes science. Cambridge, MA: Harvard University Press.
  • Toole, A.A. (2012). The impact of public basic research on industrial innovation: Evidence from the pharmaceutical industry. Research Policy, 41(1), 1–12. doi:10.1016/j.respol.2011.06.004
  • United Nations Development Programme. (2010). The millennium development goals. New York, NY: Author.
  • United States Government Accountability Office. (2012). Designing evaluations. Washington, DC: Author.
  • van der Meulen, B., & Rip, A. (2000). Evaluation of societal quality of public sector research in the Netherlands. Research Evaluation, 9(1), 11–25.
  • van der Weijden, I., Verbree, M., & van den Besselaar, P. (2012). From bench to bedside: The societal orientation of research leaders: The case of biomedical and health research in the Netherlands. Science and Public Policy, 39(3), 285–303. doi:10.1093/scipol/scr003
  • van Vught, F., & Ziegele, F. (Eds.). (2011). Design and testing the feasibility of a multidimensional global university ranking. Final report: Consortium for Higher Education and Research Performance Assessment, CHERPA-Network.
  • Walter, A.I., Helgenberger, S., Wiek, A., & Scholz, R.W. (2007). Measuring societal effects of transdisciplinary research projects: Design and application of an evaluation method. Evaluation and Program Planning, 30(4), 325–338. doi:10.1016/j.evalprogplan.2007.08.002
  • Ziman, J. (2000). Real science: What it is, and what it means. Cambridge, United Kingdom: Cambridge University Press.