Evidence-based information literacy instruction: Curriculum planning from the ground up



The purpose of this longitudinal research study is to assess the information literacy (IL) skills of grade 12 students as they transition to university in order to determine their preparedness for academic work in the digital age. This poster reports the results of the first phase of the study, which included a university-wide information literacy instruction (ILI) audit and the administration of the quantitative Information Literacy Test (ILT) to 103 grade 12 students. Results indicate a gap between the skills students bring from secondary education and the expectations of post-secondary education. The study contributes new knowledge to the research literature on IL by providing a unique understanding of the information literacy skills grade 12 students possess as they transition to university. It will also be important for professional practice, providing librarians tasked with ILI with evidence to build curricula tailored to the specific IL deficits shown by incoming students.


Practices grounded in undergraduates' perspectives must explore their experiences with a view to building curriculum and designing pedagogy that have the potential to increase the relevance of students' learning for the digital context in which they live and work. It is with this goal in mind that a collaborative, longitudinal study of students in writing-intensive disciplines (i.e., humanities and social sciences) is being conducted at a major Canadian university. This poster reports the results of the first phase of this longitudinal study, on the IL skills of students as they transition from high school to university. The study targeted grade 12 students preparing to graduate and enter a new phase of their lives. Students transitioning from high school to university experience a dramatic shift from highly structured, guided tasks to self-managed learning.

Information literacy instruction in academic libraries tends to focus on the student's entry into the postsecondary environment. Individual knowledge is generally assessed with little consideration of previous instruction or experience. It is often assumed that high school students have been exposed to basic information literacy skills; however, recent research suggests this expectation may not be met (Julien & Barker, 2009).


The first phase of the study focused on information literacy programs and practices at the University of Alberta as well as high school students' information literacy skills. To audit ILI at the University, a systematic content analysis was conducted of the LibQUAL+ 2007 survey, the InfoLit Survey administered in 2009, the content of the University Calendar, and documents produced by the University and its librarians related to information literacy. This content analysis drew upon student and librarian perceptions and experiences to evaluate the quantity and quality of support available for information literacy instruction.

The Information Literacy Test (ILT) was administered to 103 grade 12 students in 3 high schools. The ILT was created by James Madison University to assess students' abilities to “locate, evaluate, and effectively use information when it is required” (ACRL, 2003). The Association of College and Research Libraries (ACRL) Information Literacy Competency Standards for Higher Education have been widely accepted as IL standards for academic institutions. Attainment of 4 of the 5 standards was evaluated by the 65-item multiple-choice test, administered in computer labs on school premises. Participation in the study was strictly voluntary and no rewards were offered. These data will set a baseline for understanding entering university students' familiarity and comfort with core skills and expectations.


Audit summary

Analysis of ILI at the University demonstrated that students recognize the importance of information literacy skills but are unaware of resources available to support or enhance those skills. Instructional practices at the University were largely traditional and widely distributed, with little central coordination or direction. In addition, there is currently little evaluation of IL instructional practices or outcomes.

ILT result summary

The results of the ILT show that high school students are not proficient in information literacy skills. Scores were poor, with a mean of 50.7%. No students demonstrated “advanced” information literacy skills (a score of 90% or higher), 19% of participants achieved “proficiency” (scores of 65%–89%), and 80% of participants were considered “non-proficient” (scores below 65%) (see Figure 1 for the complete distribution of scores). Cronbach's coefficient alpha for this administration of the test was 0.88. Achievement on the individual ACRL standards was in the mid-50% range, with the exception of Standard 2, at 41%.
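As an illustration, the proficiency bands and the reliability coefficient reported above can be computed as follows. This is a minimal sketch: the band cutoffs come from the text, while the function names and the 0/1 response matrix format are assumptions for demonstration only.

```python
from statistics import variance  # sample variance (ddof = 1)

# Cutoffs taken from the proficiency bands reported in the results.
PROFICIENCY_BANDS = [(90.0, "advanced"), (65.0, "proficient")]

def classify(score_pct):
    """Map an ILT percentage score to its proficiency band."""
    for cutoff, label in PROFICIENCY_BANDS:
        if score_pct >= cutoff:
            return label
    return "non-proficient"

def cronbach_alpha(responses):
    """Cronbach's coefficient alpha for a respondents-by-items matrix
    of dichotomous (0/1) item scores."""
    k = len(responses[0])                              # number of items
    item_vars = [variance(col) for col in zip(*responses)]
    total_var = variance([sum(row) for row in responses])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)
```

For example, `classify(50.7)` (the reported mean) returns `"non-proficient"`, consistent with the finding that most participants fell below the 65% cutoff.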

Figure 1.

Distribution of ILT Scores

Item Analysis

A preliminary item content analysis was performed to identify trends in the knowledge, skills, and attributes in which participants were strong or weak. Keywords reflecting the skills or content knowledge required to answer each question correctly were assigned to each item. Major weaknesses included developing effective and efficient search strategies, using Boolean operators, and understanding academic journals, databases, and the publication process. Strengths included the use of traditional print resources and an understanding of the ethical and legal issues surrounding the access and use of information.

Rapid Guessing Analysis

Examinee motivation and effort presented a validity concern because the ILT is a low-stakes test: its outcome had no bearing on the students, and it was not completed as part of a course. The research team was therefore concerned that student motivation would be lacking, that invested effort would be low, and that the proficiency results would not necessarily reflect participants' abilities. Initially, the intent was to apply motivation or rapid-guess filtering to the data to ensure a valid assessment of proficiency. Theoretically, students who rapid guess demonstrate neither proficiency nor a lack thereof; they demonstrate only a lack of effort. In removing these responses from the data set, researchers assume that rapid guesses are unrelated to proficiency. There is, however, a possibility that the effort reflected in response times is itself indicative of proficiency, in which case removing the related scores would positively bias proficiency estimates (Wise, Pastor, & Kong, 2009). Previous studies have found that motivation filtering (Wise & Kong, 2005), rapid-response filtering (Wise, 2006), and the effort-moderated IRT model (Wise & DeMars, 2006) have improved test validity. These procedures were replicated by following the steps outlined by the researchers above. The filtered and unfiltered results are currently being compared to assess validity.

Overall, 30 students rapid guessed on at least one question; 7 rapid guessed on at least 50% of the items; and 73 participants had response-time effort indices between 0.91 and 1.00, indicating that they rarely rapid guessed. Removing instances of rapid guessing from the data did not significantly alter the results, indicating that the ILT results are reliable.
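The filtering procedure described above can be sketched as follows. This is an illustrative sketch only, in the spirit of Wise & Kong's (2005) response-time effort (RTE) index: the 3-second cutoff, the 0.90 filtering threshold, and the record format are hypothetical assumptions, and Wise & Kong in fact derive item-specific thresholds from observed response-time distributions rather than using a single fixed value.

```python
# Hypothetical per-item cutoff below which a response counts as a rapid guess.
RAPID_GUESS_CUTOFF = 3.0  # seconds (assumed; not from the study)

def response_time_effort(times):
    """RTE index: the proportion of items on which the examinee spent
    at least the rapid-guess cutoff (i.e., showed solution behavior)."""
    return sum(t >= RAPID_GUESS_CUTOFF for t in times) / len(times)

def motivation_filter(examinees, min_rte=0.90):
    """Motivation filtering: keep only examinees whose RTE meets the
    cutoff, so low-effort records do not bias proficiency estimates."""
    return [e for e in examinees if response_time_effort(e["times"]) >= min_rte]
```

For example, an examinee who answered most items in under a second would receive a low RTE and be excluded before proficiency estimates are recomputed; comparing filtered and unfiltered estimates then indicates whether rapid guessing distorted the results.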


This research is generously supported by a grant from the Teaching and Learning Enhancement Fund, University of Alberta.