Using a Wiki as a platform for formative evaluation



Expert reviewers used a Wiki to provide evaluative comments on a set of educational modules developed by the Digital Library (DL) Curriculum Development project, a joint effort of researchers at the University of North Carolina and Virginia Tech, with funding from the National Science Foundation. A total of 32 experts from 23 universities in 6 countries provided comments on 10 draft educational modules. The Wiki was an effective tool for the reviewers to identify and discuss the strengths and weaknesses of the modules, to share their experiences of teaching DL courses, and to communicate with the project team. The Wiki played an important role as a venue for an evaluation process that was available to the DL community at large.

1. Introduction

To support learning and teaching about digital libraries (DLs), a joint research team from Virginia Tech (VT CS) and the University of North Carolina (UNC SILS) has been developing a DL curriculum framework and educational materials applicable to both computer science (CS) and information and library science (ILS) programs. After a series of analyses of the topics and reading materials covered by existing DL courses, and group discussions with the project's advisory board members, the current version of the DL curriculum framework was established (see Figure 1). It is composed of 10 core areas: Overview, Digital Objects, Collection Development, Information/Knowledge Organization, Architecture, User Behavior/Interactions, Services, Preservation, Management and Evaluation, and DL Education and Research. Each core area comprises 2-7 educational modules.

The educational modules are lesson plans to be implemented by instructors. Each module includes the module scope, learning objectives, the related attributes of the 5S theory (Gonçalves et al., 2004), the level of effort required in terms of in-class and out-of-class time, relationships with other modules, prerequisite knowledge and skills required, the body of knowledge to be covered, required and recommended readings, learning activities and exercises, evaluation guidelines for assignments, a glossary, and additional useful links.

Prior to field testing of the modules, experts evaluated the modules using a Wiki as a communication platform. That process is described in detail here, and our assessment of the usefulness of this evaluation method is discussed.

2. A Wiki as a Collaboration Tool

A Wiki is a Web-based application designed to promote collaboration in building content online. Those who have access to a Wiki can add, delete, and modify its content. These activities are tracked and the content can be restored to its previous state easily, if needed.

Figure 1.

Framework for a DL Curriculum (Last updated 2008/08/23) (Titles in blue indicate that the module was reviewed during 2008)

We decided to use a Wiki to support module evaluation in this project for several reasons. First, a Wiki is accessible to reviewers around the world. Because the project's expert reviewers were recruited from a worldwide community of DL scholars, this capability was essential. Second, the simplified markup language helps reviewers to add and modify comments easily. While this markup language is not as simple to use as a word processor, it was well within the capabilities of the reviewers. Third, a Wiki enabled us to assign different levels of access and control. The contents are readable by anyone, but write access was limited to the invited reviewers and project team members. Most importantly, because it is intended to be a collaborative tool, a Wiki allows its users not only to post their own comments, but also to refer to (or change) others' comments. It was hoped that reviewers would interact with each other to discuss issues and problems, leading to a meaningful collaboration to improve the modules.

MediaWiki is the software selected for use in this project. It was originally developed for Wikipedia, but it has been widely used to support various types of online activities, such as teaching and learning online (Gao et al., 2008; Florea et al., 2008) and providing library services (Lombardo et al., 2008; Frumkin, 2005). See Figure 2 for the main page of the project Wiki.
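The tiered access described above (world-readable, writable only by invited accounts) is expressed in MediaWiki through group permissions in its `LocalSettings.php` site configuration file. The fragment below is a sketch of one way such a policy can be configured; it is illustrative only, not the project's actual settings:

```php
<?php
# Excerpt from LocalSettings.php (MediaWiki site configuration).
# '*' is the group of all visitors, including anonymous ones;
# 'user' is the group of logged-in accounts.

$wgGroupPermissions['*']['read']          = true;   // anyone may read pages
$wgGroupPermissions['*']['edit']          = false;  // no anonymous editing
$wgGroupPermissions['*']['createaccount'] = false;  // accounts by invitation only
$wgGroupPermissions['user']['edit']       = true;   // invited reviewers and
                                                    // project team may edit
```

With account creation disabled for visitors, new reviewer accounts would be created by an administrator, matching the project's practice of issuing individual Wiki accounts to invited reviewers.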

Figure 2.

DL Curriculum Development Project Wiki Homepage

3. The Formative Evaluation Process

Among the 47 modules in the curriculum framework, drafts of 10 modules were reviewed in 2008. They are:

  • Module 1-b: History of digital libraries and library automation

  • Module 3-b: Digitization

  • Module 4-b: Metadata

  • Module 5-a: Architecture overviews

  • Module 5-b: Application software

  • Module 6-a: Information needs/relevance

  • Module 6-b: Online information seeking behaviors and search strategies

  • Module 6-d: Interaction design and usability assessment

  • Module 7-b: Reference services

  • Module 9-c: Digital library evaluation, user studies

Each module evaluation was supported by its own Wiki page, accessible from the main Wiki page (Figure 2). A PDF version of the module was linked from the module page, for easy viewing. For each module, 3-4 expert reviewers were invited to conduct the evaluation; they were aware of each other's identities throughout the process. The reviewers received individual Wiki accounts, and were asked to “sign” each of their comments (see Figure 3). Unlike other applications of Wikis, the reviewers were NOT asked to edit others' comments. The goal was to encourage discussion among the reviewers, rather than to develop a shared evaluation report.
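“Signing” a comment follows MediaWiki's standard convention: the author appends four tildes, which the software expands into the author's username and a timestamp when the page is saved. The comment text below is hypothetical, shown only to illustrate the markup:

```wikitext
The learning objectives are clear, but the reading list could be
updated with more recent work. --~~~~
```

Because the expanded signature carries a timestamp, this convention also produced the date and time stamps that later made it possible to reconstruct the sequence of the reviewers' discussion (see Section 4).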

Figure 3.

An Example of the Wiki Evaluation Comments Page

The reviewers were asked to critique each module based on evaluation criteria related to the module's learning objectives, the body of knowledge, readings, learning activities, logistics and practical aspects of teaching the module, and its overall structure (see the contents list at the top of Figure 2). Finally, the reviewers were invited to leave additional comments.

4. Results and Discussion

A total of 32 experts from 23 universities in 6 countries (England, Finland, Japan, New Zealand, Vietnam, and USA) provided evaluative comments on the 10 available modules. They identified the strengths and weaknesses of each module and suggested ways in which it could be improved. The project team has revised the modules based on this feedback, and is now field-testing the revised modules in classrooms.

The Wiki was an effective tool with which to conduct this formative evaluation and promote collaboration among the reviewers. Our findings related to reviewers' use of this tool will be presented in terms of the content of the evaluations and the form/structure of the reviewers' discussions.

Content of the evaluative comments

In general, reviewers provided their overall impressions of the modules. At the same time, they identified specific problems and issues in the modules. The line and page numbers in the module PDFs helped reviewers to point to particular sections in the modules. For example, one reviewer provided detailed comments on module 7-b, Reference services: “I have a few specific comments: page 3 line 31; also allows the user not to have to wait around for a response page 3 line 34; worth mentioning environments such as Second Life?”

Reviewers often shared their experiences of teaching DL courses on the module topics, describing the in-class activities, assignments, and readings they had used. For example, one reviewer noted: “In my class on “Introduction to Digital Libraries” (face-to-face), I teach “digitization” like a hands-on lab.” Another said: “I have taught digital libraries at least 4 times. I have used some of the readings you list…”

Explicit agreement or disagreement with others' comments was often posted. For example, in the discussion of module 7-b, Reference services, Vondracekr said: “I agree with Lili and Joe's comments, and think Joe's comment about evidence on whether or not knowledge-based systems work is a critical one.”

These examples demonstrate that the reviewers were able to provide a variety of evaluative comments. Often, the comments were based on the reviewers' personal experiences. Some comments were general, while others were very detailed. Reviewers often agreed with each other, but were also able to express differences of opinion.

Form/structure of the interaction

Reviewers were able to access the Wiki whenever they had time. They were able to revisit the Wiki to modify their own comments and to respond to other reviewers' comments. Figure 4 shows an example of the discussion among reviewers.

Figure 4.

An example of discussion between reviewers (1-b: History of DLs and Library Automation Module)

In this example, three reviewers (Gregoryv, Hahnt, and Leskm) commented on the learning objectives of the module. An examination of the date and time stamps attached to each comment reveals an ongoing discussion about the possibility of using the module in online classes.

  • Section (a): Gregoryv first expressed her doubts about the use of the module in online classes.

  • Section (b): Hahnt disagreed with her and explained her reasons.

  • Section (c): Gregoryv came back to the Wiki a few days later and added more comments on the issue.

  • Section (e): Leskm accessed the Wiki a few days later. He responded to the discussion saying, “I agree with the statements that people in online discussions tend to repeat each other”, and added more comments on the module.

This example illustrates a conversation among the three reviewers that occurred over a three-week period. The Wiki platform allowed the reviewers to insert their comments to create a semantic thread (rather than the sequential structure of a blog). This capability was less structured and more spontaneous in form than a threaded discussion forum.

The Wiki also promoted communication between the reviewers and the project team. Reviewers asked questions in order to clarify concepts in the modules, and the module developers provided responses. For example, one reviewer asked for clarification: “I only have a question about the terminology – does the phrase “question answering (QA) services” in the second objective mean the same thing as QA systems (the automated QA systems discussed in page 4)? In page 9, services like Yahoo! Answers are also referred to as “QA systems” with no associated DL…” The module developer responded: “By QA services I meant TREC-style automated QA. But this is a good point. This module should differentiate between library-based human-intermediated reference services and community-based services. I will make changes to the module to include services like Yahoo Answers.”

The whole process of module evaluation, discussion, and questioning and answering was transparent to all the participants, including the development team. Reviewers were able to interact with each other. While the development team did not initiate comments (other than the initial invitation to the reviewers to participate), they could easily respond to questions of clarification raised by reviewers. In addition, the entire interaction was visible to the DL community at large.

5. Conclusion

Among various online communication tools (e.g., discussion forums, blogs, emails, etc.), a Wiki was chosen for evaluating a set of DL educational modules. It served as an effective platform for this purpose. It enabled DL experts from around the world to collaborate online in evaluating a set of DL curriculum modules, interacting with each other in a relatively straightforward and fluid way while keeping a record of those interactions. In addition, the entire evaluation/review process was transparent to the DL community at large, enabling others to view the draft modules as well as expert reviewers' comments about them.

As new modules are developed, this form of formative evaluation will continue. Thus, it is expected that the project Wiki will continue to play an important role as a venue to gather DL experts and to build the DL community based on collaborative review activities.


This project builds upon a collaboration between Virginia Tech and the University of North Carolina, Chapel Hill, funded by the National Science Foundation through grants NSF IIS-0535057 and IIS-0535060, respectively.