This paper examines a key question in Information Seeking and Retrieval: how do people assess the usefulness of documents? In an experimental study, we presented 25 participants with five task-based search scenarios and asked them to assess and comment on the usefulness of Web documents from the Canadian government domain. Data were analyzed to test for the effect of five information task types: fact-finding, deciding, doing, learning and problem-solving. Participant assessments show a low level of agreement on usefulness scores overall, but consistency varied by task type. The criteria used to assess usefulness also varied by level of usefulness, by task type and by participant. These findings contribute to our understanding of consistency in relevance assessments and of the impact of tasks on information behaviour.