REFERENCES

  • Adams, R., & Wu, M. (2002). PISA 2000 technical report. Paris: OECD.
  • Ajzen, I. (2001). Nature and operation of attitudes. Annual Review of Psychology, 52, 27–58.
  • Aldridge, B. G. (1989). Essential changes in secondary school science: Scope, sequence and coordination. Washington, DC: National Science Teachers Association.
  • Allchin, D. (2011). Evaluating knowledge of the nature of (whole) science. Science Education, 95, 518–542.
  • American Association for the Advancement of Science. (1989). Science for all Americans: A Project 2061 report on literacy goals in science, mathematics, and technology. Washington, DC: Author.
  • Anastasi, A., & Urbina, S. (1997). Psychological testing (7th ed.). Upper Saddle River, NJ: Prentice Hall.
  • Anderson, L. W., Krathwohl, D. R., Airasian, P. W., Cruikshank, K. A., Mayer, R. E., Pintrich, P. R., et al. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives. New York: Longman.
  • Angoff, W. H. (1971). Scales, norms and equivalent scores. In R. L. Thorndike (Ed.), Educational measurement (2nd ed., pp. 508–600). Washington, DC: American Council on Education.
  • Au, W. (2007). High stakes testing and curricular control: A qualitative metasynthesis. Educational Researcher, 36(5), 258–267.
  • Bakhtin, M. M. (1981). The dialogic imagination: Four essays (C. Emerson & M. Holquist, Transl.). Austin: University of Texas Press.
  • Berland, L., & McNeill, K. (2010). A learning progression for scientific argumentation: Understanding student work and designing supportive instructional contexts. Science Education, 94, 765–793.
  • Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: the classification of educational goals; Handbook I: Cognitive domain. New York: David McKay.
  • Bloom, B. S., Hastings, J. T., & Madaus, G. F. (1971). Handbook on formative and summative evaluation of student learning. New York: McGraw-Hill.
  • Brown, G., & Desforges, C. (1977). Piagetian psychology and education: Time for revision. British Journal of Educational Psychology, 47, 7–17.
  • Bryce, T. G., McCall, J., MacGregor, J., Robertson, I. J., & Weston, R. J. (1988). Techniques for assessing process skills in practical science (TAPS 2). London: Heinemann.
  • Champagne, A., Bergin, K., Bybee, R., Duschl, R. A., & Gallagher, J. (2004). NAEP 2009 science framework development: Issues and recommendations. Paper prepared for the National Assessment Governing Board, Washington, DC.
  • Comber, L. C., & Keeves, J. P. (1973). Science education in nineteen countries. New York: Wiley.
  • DeBoer, G. E. (2011). The globalization of science education. Journal of Research in Science Teaching, 48(6), 567–591.
  • Donaldson, M. (1984). Children's minds. London: Fontana.
  • Driver, R., & Easley, J. (1978). Pupils and paradigms: a review of literature related to concept development in adolescent science students. Studies in Science Education, 5, 61–84.
  • Driver, R., Newton, P., & Osborne, J. (2000). Establishing the norms of scientific argumentation in classrooms. Science Education, 84(3), 287–312.
  • Duschl, R. A., & Grandy, R. E. (2008). Reconsidering the character and role of inquiry in school science: framing the debates. In R. A. Duschl & R. E. Grandy (Eds.), Teaching scientific inquiry: Recommendations for research and implementation. Rotterdam, The Netherlands: Sense Publishers.
  • Finley, F. (1983). Science processes. Journal of Research in Science Teaching, 20, 47–54.
  • Ford, M., & Forman, E. A. (2006). Redefining disciplinary learning in classroom contexts. Review of Research in Education, 30, 1–32.
  • Gagne, R. M. (1965). The psychological basis of science—A process approach. AAAS Miscellaneous Publication, 65–68.
  • Giere, R. N., Bickle, J., & Mauldin, R. F. (2006). Understanding scientific reasoning. Belmont, CA: Thomson Wadsworth.
  • Gott, R., & Murphy, P. (1987). Assessing investigation at ages 13 and 15. Assessment of Performance Unit Science Report for Teachers: 9. London: Department of Education and Science, Welsh Office, Department of Education for Northern Ireland.
  • Gruber, T. R. (1993). A translation approach to portable ontology specifications. Knowledge Acquisition, 5, 199–220.
  • Haertel, E., & Calfee, R. (1983). School achievement: Thinking about what to test. Journal of Educational Measurement, 20(2), 119–132.
  • Harlen, W. (1999). Purposes and procedures for assessing science process skills. Assessment in Education: Principles, Policy & Practice, 6(1), 129–144.
  • Hewson, P. W. (1981). A conceptual change approach to learning science. European Journal of Science Education, 3(4), 383–396.
  • Hodson, D. (1998). Is this really what scientists do? Seeking a more authentic science in and beyond the school laboratory. In J. Wellington (Ed.), Practical work in school science. Which way now? (pp. 93–108). London: Routledge.
  • Hueftle, S. J., Rakow, S. J., & Welch, W. W. (1983). Images of science: A summary of results from the 1981–1982 National Assessment in Science. Minneapolis: Minnesota Research and Evaluation Center.
  • Hughes, J., Jewson, N., & Unwin, L. (2007). Introduction: Communities of practice: A contested concept in flux. In J. Hughes, N. Jewson & L. Unwin (Eds.), Communities of practice: Critical perspectives (pp. 1–16). Abingdon, England: Routledge.
  • Husén, T., & Postlethwaite, T. N. (1996). A brief history of the International Association for the Evaluation of Educational Achievement (IEA). Assessment in Education: Principles, Policy & Practice, 3(2), 129–141.
  • Inhelder, B., & Piaget, J. (1958). The growth of logical thinking. London: Routledge & Kegan Paul.
  • Jenkins, E. (2007). School science: a questionable construct? Journal of Curriculum Studies, 39(3), 265–282.
  • Johnson, S. (1989). National assessment: The APU science approach. London: HMSO.
  • Kane, M. T. (1992). The assessment of professional competence. Evaluation & the Health Professions, 15(2), 163–182.
  • Kane, M. T. (2006). Validation. In R. L. Brennan (Ed.), Educational measurement (4th ed.). Washington, DC: National Council on Measurement in Education and American Council on Education.
  • Kind, P. M. (2013). Establishing assessment scales using a novel knowledge-based rationale for scientific reasoning. Journal of Research in Science Teaching, 50(5), 530–560.
  • Kind, P.M., Osborne, J., & Szu, E. (In review). Towards a model of scientific reasoning for science education. Science Education.
  • Klahr, D., Fay, A. L., & Dunbar, K. (1993). Heuristics for scientific experimentation: A developmental study. Cognitive Psychology, 25(1), 111–146.
  • Klahr, D., & Li, J. (2005). Cognitive research and elementary science instruction: From the laboratory, to the classroom, and back. Journal of Science Education and Technology, 14(2), 217–238.
  • Klopfer, L. E. (1971). Evaluation of learning in science. In B. S. Bloom, J. T. Hastings, & G. F. Madaus (Eds.), Handbook on formative and summative evaluation of student learning. New York: McGraw-Hill.
  • Koretz, D. (2008). Measuring up: What educational testing really tells us. Cambridge, MA: Harvard University Press.
  • Koslowski, B. (1996). Theory and evidence: The development of scientific reasoning. Cambridge, MA: MIT Press.
  • Kuhn, T. S. (1962). The structure of scientific revolutions. Chicago: University of Chicago Press.
  • Latour, B. (1987). Science in action. Milton Keynes, England: Open University Press.
  • Lave, J., & Wenger, E. (1991). Situated learning: legitimate peripheral participation. Cambridge, England: Cambridge University Press.
  • Lehrer, R., & Schauble, L. (2006). Scientific thinking and science literacy: Supporting development in learning in contexts. In W. Damon, R. M. Lerner, K. A. Renninger, & I. E. Sigel (Eds.), Handbook of child psychology (6th ed., Vol. 4). Hoboken, NJ: Wiley.
  • Li, M., & Shavelson, R. J. (2001). Examining the links between science achievement and assessment. Paper presented at the annual meeting of the American Educational Research Association.
  • Linn, R. L., & Baker, E. L. (1995). What do international assessments imply for world-class standards? Educational Evaluation and Policy Analysis, 17(4), 405–418.
  • McCloskey, M. (1983). Naive theories of motion. In D. Gentner & A. L. Stevens (Eds.), Mental models. Hillsdale, NJ: Erlbaum.
  • Millar, R., & Driver, R. (1987). Beyond process. Studies in Science Education, 14, 33–62.
  • Miller, G. A. (2003). The cognitive revolution: A historical perspective. Trends in Cognitive Sciences, 7(3), 141–144.
  • Moseley, D., Baumfield, V., Elliot, J., Gregson, M., Higgins, S., Miller, J., et al. (2005). Frameworks for thinking. A handbook for teaching and learning. Cambridge, England: Cambridge University Press.
  • Mullis, I. V. S. (1992). Developing the NAEP content-area frameworks and innovative assessment methods in the 1992 assessments of mathematics, reading, and writing. Journal of Educational Measurement, 29(2), 111–131.
  • Mullis, I. V. S., Martin, M. O., Ruddock, G. J., O'Sullivan, C. Y., Arora, A., & Erberber, E. (2005). TIMSS 2007 assessment frameworks. Amsterdam: International Association for the Evaluation of Educational Achievement.
  • Mullis, I. V. S., Martin, M. O., Ruddock, G. J., O'Sullivan, C. Y., & Preuschoff, C. (2009). TIMSS 2011 assessment frameworks. Boston: TIMSS & PIRLS International Study Center, Lynch School of Education, Boston College.
  • Mullis, I. V. S., Martin, M. O., Smith, T. A., Garden, R. A., Gregory, K. D., Gonzalez, E. J., et al. (2001). TIMSS assessment frameworks and specifications 2003. Chestnut Hill, MA: Boston College, International Study Center.
  • Mullis, I. V. S., Martin, M. O., Smith, T. A., Garden, R. A., Gregory, K. D., Gonzalez, E. J., et al. (2003). TIMSS assessment frameworks and specifications 2003 (2nd ed.). Boston: International Study Center, Boston College.
  • Murphy, P., & Gott, R. (1984). The assessment framework for science at age 13 and 15. APU science report for teachers: 2. London: DES.
  • NAEP. (1979). Three assessments of science, 1969–77: Technical summary. Report no. 08-S-21. Washington, DC: Education Commission of the States.
  • NAGB. (2004). Science framework for the 2005 National Assessment of Educational Progress. Washington, DC: U.S. Department of Education.
  • NAGB. (2008). Science framework for the 2009 National Assessment of Educational Progress. Washington, DC: Author.
  • Neidorf, T. S., Binkley, M., & Stephens, M. (2006). Comparing science content in the National Assessment of Educational Progress (NAEP) 2000 and Trends in International Mathematics and Science Study (TIMSS) 2003 assessments. Technical Report. Washington, DC: U.S. Department of Education, National Center for Education Statistics.
  • Nohara, D., & Goldstein, A. A. (2001). A comparison of the National Assessment of Educational Progress (NAEP), the Third International Mathematics and Science Study Repeat (TIMSS-R), and the Programme for International Student Assessment (PISA). Working paper no. 2001-07. Washington, DC: U.S. Department of Education, National Center for Education Statistics.
  • OECD. (1999). Measuring student knowledge and skills. A new framework for assessment. Paris: Author.
  • OECD. (2003a). Definition and selection of competencies: Theoretical and conceptual foundations (DeSeCo), summary of the final report “Key competencies for a successful life and a well-functioning society.” Paris: Author.
  • OECD. (2003b). The PISA 2003 assessment framework—Mathematics, reading, science and problem solving knowledge and skills. Paris: Author.
  • OECD. (2006). Assessing scientific, reading and mathematical literacy: A framework for PISA 2006. Paris: Author.
  • Osborne, R. J. (1982). Conceptual change—For pupils and teachers. Research in Science Education, 12, 25–31.
  • Osborne, R. J., & Wittrock, M. C. (1985). The generative learning model and its implications for science education. Studies in Science Education, 12, 59–87.
  • Pickering, A. (Ed.). (1992). Science as practice and culture. Chicago: University of Chicago Press.
  • Prahalad, C. K., & Hamel, G. (1990). The core competence of the corporation. Harvard Business Review, 68(3), 79–91.
  • Quine, W. V. (1969). Natural kinds. In W. V. Quine (Ed.), Ontological relativity and other essays. New York: Columbia University Press.
  • Robitaille, D. F., Schmidt, W. H., Raizen, S., McKnight, C., Britton, E., & Nicol, C. (1993). Curriculum frameworks for mathematics and science. TIMSS Monograph No. 1. Vancouver, British Columbia, Canada: Pacific Educational Press.
  • Rosier, M. J. (1987). The Second International Science Study. Comparative Education Review, 31(1), 106–128.
  • Shippmann, J. S., Ash, R. A., Battista, M., Carr, L., Eyde, L. D., Hesketh, B., et al. (2000). The practice of competency modeling. Personnel Psychology, 53, 703–740.
  • Siegel, H. (1989). The rationality of science, critical thinking, and science education. Synthese, 80, 9–41.
  • Simon, H. A. (1966). Scientific discovery and the psychology of problem solving. In R. Colodny (Ed.), Mind and cosmos (pp. 22–40). Pittsburgh, PA: University of Pittsburgh Press.
  • Tyler, R. W. (1950). Basic principles of curriculum and instruction. Chicago: University of Chicago Press.
  • Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.
  • Wiley, D. E. (2001). Validity of constructs versus construct validity. In H. Braun, D. N. Jackson, & D. E. Wiley (Eds.), The role of constructs in psychological and educational measurement (pp. 207–227). Mahwah, NJ: Erlbaum.
  • Wu, M. (2010). Comparing the similarities and differences of PISA 2003 and TIMSS. OECD Education Working Papers, no. 32. Paris: OECD.