Abstract

Learning progressions are ordered descriptions of students' understanding of a given concept. In this paper, we describe the iterative process of developing a force and motion learning progression and associated assessment items. We report on a pair of studies designed to explore the diagnosis of students' learning progression levels. First, we compare the use of ordered multiple-choice (OMC) and open-ended (OE) items for assessing students relative to the learning progression. OMC items appear to provide more precise diagnoses of students' learning progression levels and to be more valid, eliciting students' conceptions more similarly to cognitive interviews. Second, we explore evidence bearing on two challenges concerning reliability and validity of level diagnoses: the consistency with which students respond to items set in different contexts and the ways in which students interpret and use language in responding to items. As predicted, students do not respond consistently to similar problems set in different contexts. Although the language used in OMC items generally seems to reflect student thinking, misinterpretation of the language in items may lead to inaccurate diagnoses for a subset of students. Both issues are less problematic for classroom applications than for use of learning progressions in large-scale testing. © 2008 Wiley Periodicals, Inc. Sci Ed 93: 389–421, 2009