Rationale and objective Evidence-based sources of information do not integrate self-assessment tools to assess the impact of a user's search for clinical information. We present a method to evaluate evidence-based sources of information by systematically assessing the impact of searches for clinical information in everyday practice.
Methods We integrated an information management tool (InfoRetriever 2003) with an educational intervention in a cohort of 26 family medicine residents. These doctors used an electronic impact assessment scale to report the perceived impact of each item of information (each hit) retrieved on a hand-held computer. We compared the types of impact associated with hits in two distinct categories: clinical decision support systems (CDSS) vs. clinical information-retrieval technology (CIRT). CDSS information hits were defined as any hit in the following InfoRetriever databases: Clinical Prediction Rules, the History and Physical Exam diagnostic calculator and the Diagnostic Test calculator. CIRT information hits were defined as any hit in: Abstracts of Cochrane Reviews, InfoPOEMs, evidence-based practice guideline summaries and Griffith's 5 Minute Clinical Consult.
Results The impact assessment questionnaire was linked to 5160 information hits. Of these, 4946 questionnaires were answered (95.9%), and 2495 contained reports of impact (48.4%). Reports of positive impact on doctors occurred most frequently in the areas of learning and practice improvement. Compared with CDSS hits, CIRT hits were more frequently associated with learning and recall, whereas CDSS hits were more frequently associated with reports of practice improvement.
Conclusions Our new method permits systematic and comparative assessment of the impact associated with distinct categories of information.