Rating Reading Passages According to the ACTFL Reading Proficiency Standard: Can It Be Learned?¹


  • Dale L. Lange, University of Minnesota

    Dale L. Lange (Ph.D., University of Minnesota) is Professor of Second Languages and Cultures Education, Department of Curriculum and Instruction, College of Education, at the University of Minnesota, Twin Cities Campus.

  • Pardee Lowe, Jr., The Central Intelligence Agency

    Pardee Lowe, Jr. (Ph.D., University of California, Berkeley) is a Testing Specialist in foreign languages, Office of Training and Education, at The Central Intelligence Agency, Washington, D.C.

  • 1

    This article is a revised version of a paper presented at the Language Testing Research Colloquium held February 28–March 1, 1987, at the Defense Language Institute (Monterey, CA). We wish to thank Martha Herzog, John Lett, and Ray T. Clifford at the Defense Language Institute for reading an earlier draft of the paper; their comments have been invaluable. We are also grateful to Martha Herzog and her colleagues at the Defense Language Institute for their excellent selection of texts and for their insightful commentary on the texts' use in face-to-face reading proficiency interviews.


ABSTRACT  This paper describes a short study investigating how fully potential users of the ACTFL (1, 2) and ILR Proficiency Scales (8) acquire the standard and can thus accurately grade reading passages according to the scales. Because the study's participants came from three different languages, French, German, and Spanish, English passages served for the training. The passages had been previously rated by the ILR Testing Committee and were subsequently rated blind by the participants in the study. The extent to which the standard was correctly applied was checked by two tasks: ranking passages for difficulty and rating them according to the scales. We hypothesized that if potential users of the scale could accomplish these tasks with suitable accuracy, then a major criticism of the reading proficiency scales and a significant impediment to their use could be overcome. In the present study, twenty-five participants in a five-day workshop at the University of Minnesota designed items for testing proficiency in listening, reading, writing, and speaking. On the first and final days of the workshop, participants attempted the ranking and rating tasks for reading texts. The success achieved on the two tasks strongly suggests that the reading proficiency standard can indeed be learned and that passages can be ranked and rated accordingly.