Abstract

The objective was to assess the effect of one to eight hours of literature search and retrieval training, using electronic health databases, on health professionals’ skills. We searched the Cochrane Library (2002, Issue 3), medline (1977–2002/5), embase (1980–2002/7), cinahl (1982–2002/5), assia (1982–2002/7), bni (1994–2002/5), eric (1985–2002/6), lisa (1969–current), the National Research Register (2002, Issue 2), the world-wide web and reference lists. The selection criteria were randomised controlled trials, controlled before-and-after studies and controlled cohort studies with a no-training comparison. The intervention had to be one to eight hours of training in literature search and retrieval skills for health professionals. The outcome was the effect on health professionals’ literature search and retrieval skill levels, measured through reliable instruments. For data collection and analysis, one reviewer extracted data and assessed the quality of the studies, and the second reviewer checked this work. The results indicate some evidence of a positive impact on health professionals’ literature searching skill levels, and that trainees find the training useful. In conclusion, the size of the positive effect is debatable, as only three small and methodologically weak studies met the inclusion criteria, and only two of those showed a positive effect.


Background

Clinical governance is a UK government initiative to improve the quality of service within the National Health Service.1 Evidence-based practice, continuous professional development (CPD) and lifelong learning have all been identified as instruments of clinical governance or quality improvement.2

Accessing and retrieving relevant and up-to-date literature is a major part of evidence-based practice. It is also crucial for the lifelong learning and CPD of health professionals. The provision of online access to world-wide knowledge bases, to support decision making and professional development, has been recognized as an information strategy objective by the UK government.3 The National Health Service’s research and development strategy also places great emphasis on electronic means of disseminating research findings.4

The national policy of disseminating information electronically assumes that health professionals have access to appropriate databases and the skills to search them in order to access and retrieve information.

This paper focuses on the local level, where health professionals working for Chorley and South Ribble NHS Trust, Preston Acute Hospitals NHS Trust, Preston Primary Care Group (PCG) and Chorley Primary Care Trust use the library services situated at Preston Acute Hospital, Chorley and South Ribble District General Hospital and Sharoe Green Hospital. (Since this systematic review was written, Chorley and South Ribble NHS Trust and Preston Acute Hospitals NHS Trust, mentioned above, have merged to become Lancashire Teaching Hospitals NHS Trust.)

Training in literature search skills using online databases is offered either as a scheduled one-hour session or on demand. During these sessions, librarians are frequently asked questions regarding searching and computer literacy.

The top eight questions asked, and the training needs of the local health professionals inferred from them, are listed below.

  1. I am not very good with computers. What do I do? (Computer literacy.)
  2. Can you show me how to use the computer for finding research? (Sources of information and experience with the search interface.)
  3. How can I get the research on this topic? (Sources of information, knowledge about the databases and principles of searching.)
  4. There are thousands of articles on this topic. How do I get fewer? (Use of Boolean operators and the Limit function.)
  5. What do these stand for? (Understanding of field labels.)
  6. Can I save this search? (Experience with the database search interface and computer literacy.)
  7. How do I print out these references? (Computer literacy and experience with the search interface.)
  8. Why can’t I see the full article? (Assumed availability of full text of articles.)

These questions demonstrate a clear need for training. The one-hour sessions currently provided are inadequate in terms of the content covered and the trainee skills developed, and the librarians are dissatisfied with the quality of training they are able to provide in an hour.

The need for training in electronic literature search and retrieval among Chorley and Preston health professionals is not unique. Pyne et al.’s5 survey of London acute and community service health professionals, and Urquhart et al.’s6 study of trainee general practitioners and hospital doctors from 13 hospital sites in England and Wales, reveal similar needs, and it would be logical to assume that there is a UK-wide need for training in literature search skills among health professionals.

The librarians would like to provide a more co-ordinated and comprehensive literature search and retrieval training programme with a built-in element of computer literacy for the local health professionals. Before starting such a training programme, a review of the literature was carried out in order to establish what the optimum period of training was. The timescales considered were between one hour and eight hours.

Eight hours is an arbitrary upper limit, chosen because we believe it is the maximum the library professionals can provide, given time constraints.

Objective of the review

To assess what effect training (one to eight hours) has on health professionals’ search and retrieval skills, when they are using electronic health databases.

Criteria for considering studies for the review

Type of intervention. Educational interventions comprising a co-ordinated online literature search and retrieval education programme (1–8 h), offered either as a single course or as part of another course, provided that data demonstrating improvement in literature search and retrieval skills are included.

Types of outcome measures. Health professionals’ skills in literature search and retrieval from electronic databases were assessed by measuring pre- and post-training competence.

Types of studies. Randomised controlled trials (RCTs), controlled before-and-after studies (CBAs) and controlled cohort studies. The minimum requirement was a comparison with no training.

Types of participants. Qualified health professionals and student health professionals in any setting.

Search strategy for the identification of studies. Subject-specific search strategies were formulated for each database using free-text words (ft) and thesaurus or index terms (ME). Both print and electronic sources were used.

Sources
  • Cochrane Library (2002, Issue 3) using: Health-Personnel* (ME), PAM* (ft), Students-Health-Occupations* (ME), Evidence-Based-Medicine* (ME), evidence-based practice (ft), literature search* (ft), evidence search* (ft), database search* (ft), Education* (ME).
  • medline (1977–2002/5) using: Health-Personnel (ME explode), PAM* (ft), Students-Health-Occupations (ME explode), Databases-Bibliographic (ME explode), Information-Storage-and-Retrieval (ME explode), literature search* (ft), database search* (ft), evidence search* (ft), evidence retriev* (ft), Evidence-Based-Medicine (ME explode), evidence-based practice (ft), Education (ME explode).
  • embase (1980–2002/7) using: health-care-personnel (ME explode), student (ME explode), PAM* (ft), bibliographic-database (ME explode), information-retrieval (ME explode), literature search* (ft), evidence search* (ft), evidence retriev* (ft), database search* (ft), evidence-based-medicine (ME explode), evidence-based practice (ft), education (ME explode).
  • cinahl (1982–2002/5) using: Health-Personnel (ME explode), PAM* (ft), Students-Health-Occupations (ME explode), Reference-Databases-Health (ME explode), Computerized-Literature-Searching (ME explode), Professional-Practice-Evidence-Based (ME explode), database search* (ft), Education (ME explode).
  • assia (1982–2002/7) using: Personnel or Personnels (ME), Students (ME), PAM* (ft), database search* (ft), literature search* (ft), Evidence-Based (ME), Teaching (ME), Training (ME).
  • bni (1994–2002/5) using: Paramedical-Professions (ME explode), Medical-Profession (ME), nurse* (ft), student* (ft), Literature-Searching (ME), Evidence-Based-Practice (ME), education (ft), train* (ft), teach* (ft).
  • eric (1985–2002/6) using: health-personnel (ME explode), PAM* (ft), allied health occupations education (ME explode), nursing education (ME explode), medical students (ME explode), Literature search* (ft), information retrieval (ME explode), information seeking (ME explode), Evidence-Based Practice (ft), training (ME explode), education (ME explode).
  • lisa (1969–current) using: health professionals (ME), students (ME), allied health profession* (ft), PAM* (ft), information seeking behaviour (ME), evidence-based practice (ft), evidence-based medicine (ft), training (ME), education (ME).
  • National Research Register (2002, Issue 2) using terms as for the Cochrane Library.
  • Sheffield School of Health and Related Research (UK) site was also searched for conference proceedings, as the school has a special interest in this area.
  • Internet. AltaVista search engine using the search terms: ‘literature search training’ and (health professional* or student*).

Citations were also tracked from the available articles and books.
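
As an illustration of how these free-text and thesaurus terms were combined (an illustrative reconstruction only; the exact search histories run on each database are not reproduced here), a medline strategy of this kind links the population, searching-topic and education facets with Boolean operators:

  #1 Health-Personnel (ME explode) OR Students-Health-Occupations (ME explode) OR PAM* (ft)
  #2 Information-Storage-and-Retrieval (ME explode) OR literature search* (ft) OR database search* (ft) OR evidence search* (ft) OR evidence retriev* (ft)
  #3 Education (ME explode) OR Evidence-Based-Medicine (ME explode) OR evidence-based practice (ft)
  #4 #1 AND #2 AND #3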

Methods of the review

Results of the literature search were initially screened by one reviewer (AG).

The following were excluded in the first instance: studies that described teaching methods and curricula; trainee needs analyses; and evaluation studies that assessed the training provision only in terms of trainee satisfaction, pre- and post-training database usage levels, and/or an increase in knowledge of literature search and retrieval terminology.

Only studies that evaluated the impact of 1–8 h of literature search and retrieval training on health professionals’ literature search skills were considered. If information regarding the number of hours of training provided was unavailable and could not be obtained, the study was excluded. Studies were included only if a full article could be obtained. No disagreements about the inclusion of studies arose.

Three studies met the inclusion criteria. Of these, two, Rosenberg et al. (1998)7 and Erickson and Warner (1998),8 were randomised controlled trials (RCTs). These were appraised using the questions suggested by the Critical Appraisal Skills Programme, Oxford9 and the NHS Centre for Reviews and Dissemination, York.10 These questions help to judge the methodological quality of trials and relate to the randomization process, the level of blinding to the intervention, the baseline equality of the control and intervention groups, the validity and reliability of the measuring instrument used, and the reliability of the results. The third study, Ghali et al. (2000),11 was a controlled before-and-after (CBA) study. Its appraisal was based on the questions produced by Crombie.12

Assessments of methodological quality were undertaken by one reviewer (AG) and checked by the second reviewer (KT). All disagreements were resolved by discussion. The studies (nature, results and methodological quality assessments) are summarized in Table 1.

Table 1. Summary of studies: nature, results and reviewers’ comments on the methodological quality and results of the studies (nc, number of participants in the control group; ni, number of participants in the intervention group).

  • Rosenberg et al. (1998).7 Type and size: RCT, nc = 54, ni = 54. Methodological quality: satisfactory. Impact on skills: effective; P = 0.0001 for the training effect on the increase in median search score in comparison with control; increase in yield score P < 0.0001 (ulcer problem) and P = 0.5 (cardiac problem) in comparison with control. Reliability and validity of results: only the ulcer problem results are reliable and valid.
  • Erickson and Warner (1998).8 Type and size: RCT, nc = 8, ni = 23 (11 + 12). Methodological quality: weak. Impact on skills: not effective; no difference in search recall and precision between control and intervention groups. Reliability and validity of results: unreliable and invalid, owing to a major systems change during the trial and a large dropout rate from the trial.
  • Ghali et al. (2000).11 Type and size: CBA (quasi-experimental design), nc = 26, ni = 34. Methodological quality: weak. Impact on skills: effective; P = 0.002 for the significance of between-group differences in change from baseline for the increase in medline search skills, and P = 0.002 for the tendency to use computerized searches. Reliability and validity of results: unreliable.

Quantitative synthesis can only be applied to systematic reviews where the interventions, participants, outcomes and study designs are similar enough to suggest the results can be pooled. In this review all the studies had different pre- and post-teaching measurement instruments and methods, so quantitative synthesis was not applicable.

Description, results and reviewers’ comments on the methodological quality of the included studies

Eighteen potentially useful studies were found, but only three met the inclusion criteria and only these were formally assessed. A brief description, the results and the methodological appraisal of each of the included studies are given below.

Rosenberg et al. (1998)7

In this single-blind study, one hundred and eight first clinical year students at Oxford University Medical School, UK, were randomised to equal-sized control and intervention groups. The intervention group received three hours of interactive training, in small groups, on medline via WinSPIRS (SilverPlatter’s Windows medline searching software). Their pre- and post-training searches (conducted within three months of the completion of training) were assigned a search score (maximum 18) based on the technical aspects of search quality, such as the use of free-text searching, truncation, MeSH searching, Boolean operators and limits. The quality of evidence retrieved was assessed and scored (0, worst; 4, best) by clinicians. Students also rated their satisfaction with the training on a scale of 1 (not very useful) to 6 (extremely useful).

The control group did not receive any training and they were assessed once, simultaneously with the intervention group's post-training evaluation.

Results. Medians, interquartile ranges and the related P-values for the ulcer and cardiac problems are shown in Tables 2 and 3, respectively. Of the 67% of trainees who responded, 96% found the training very useful.

Table 2. Medians, interquartile ranges and the related P-values for the ulcer problem.

  • Before training, intervention group: search score (0–18) 4 (3–6); yield score (0–4) 1 (0–2.5)
  • After training, intervention group: search score 9.5 (9–11); yield score 4 (3–4)
  • Control group: search score 4 (3–7); yield score 0 (0–2)
  • Training effect in comparison to control: search score 5 (4–6); yield score 3 (1–4)
For the training effect on the median search score in comparison to control, P < 0.0001; for the training effect on the median yield score in comparison to control, P < 0.0001.
Table 3. Medians, interquartile ranges and the related P-values for the cardiac problem.

  • Before training, intervention group: search score (0–18) 5 (3–6); yield score (0–4) 4 (4–4)
  • After training, intervention group: search score 6 (7–9.5); yield score 4 (4–4)
  • Control group: search score 4 (4–6); yield score 4 (3–4)
  • Training effect in comparison to control: search score 3 (2–3); yield score 0 (0–0)
For the training effect on the median search score in comparison to control, P < 0.0001; for the training effect on the median yield score in comparison to control, P = 0.5.

Methodological quality
Randomization.

A standard random-numbers chart was used for randomization, and it was blocked to ensure equal numbers of participants in the intervention and control groups.

Level of blinding.

This was a single-blind study: the markers were blind to the identity of the students and to whether they were marking pre- or post-training search strategies.

Baseline equality of control and intervention groups.

Students in both groups had previous search experience, and one student in each group had previously received formal training in literature search and retrieval. No other information on the baseline equality of the groups was available.

Equality of treatment for control and intervention groups.

Apart from the training provided to the intervention group, both groups were treated in the same way.

Were all the participants who entered the trial properly accounted for at its conclusion?

In the intervention group, only 45 of the allocated 54 participated in the training, and of these only 38 were given the post-training test. The reasons why nine participants withdrew from the training and seven were not given the post-training test are not discussed. All pre-test results (45 participants) were included in the statistical analysis.

Reliability and validity of the measuring instrument used.

The search score assessment criteria used in the study have face validity, but their reliability is limited by the fact that the search score achieved by a trainee partially depends on the complexity of the search problem on which they are tested. If the search problem requires the formulation of a very detailed search strategy, i.e. one using all the techniques taught, to retrieve relevant articles, the student’s search score will be high; if it does not, the search score will be low. This is evident from the results of the students who searched on the ‘cardiac problem’: their search scores showed less improvement after training because the assigned problem did not require the formulation of a detailed search strategy to retrieve relevant articles in the first instance. This problem extends to the yield score measurement as well. Contrary to expectations, there is no direct relationship between the search score and the yield score, i.e. for a particular search strategy the search score may be low but the yield score high, because a simple search strategy can sometimes retrieve the relevant articles on which the yield score depends.

How precise are the results?

Only the results related to the ‘ulcer problem’ are considered valid and reliable.

Erickson and Warner (1998)8

This single-blind, small, randomised controlled trial was conducted at Thomas Jefferson University Hospital, Philadelphia, USA. The participants (31) were obstetrics and gynaecology trainee residents. They were assigned to three groups using opaque sealed envelopes. The first intervention group (11 students) received one-hour individual tutorials and performed searches, the second intervention group (12) attended a session in which all searching was conducted by the instructor, and the control group (8) did not receive any training. Both intervention groups were trained in searching medline through CD Plus. Two searches were conducted prior to teaching and two after it. The searches were rated for relevance by faculty members who were blinded to the study, using the seven-point relevance scale developed by Haynes et al. (1990)13 at McMaster University. Relative search recall (the number of relevant articles retrieved by one searcher divided by the number of relevant articles retrieved by the group) and precision rates (the number of relevant articles retrieved by one searcher divided by the total number of articles retrieved by that searcher) were calculated by the students.
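
Expressed as formulas (restating the definitions above, with purely hypothetical numbers for illustration):

\[ \text{relative recall}_i = \frac{r_i}{R}, \qquad \text{precision}_i = \frac{r_i}{n_i} \]

where r_i is the number of relevant articles retrieved by searcher i, n_i is the total number of articles retrieved by that searcher, and R is the number of relevant articles retrieved by the group as a whole. For example, a resident who retrieved 12 articles, 6 of them judged relevant, while the group as a whole retrieved 10 relevant articles, would have a relative recall of 6/10 = 0.6 and a precision of 6/12 = 0.5.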

The residents also rated their search satisfaction on a scale of 1 (very dissatisfied) to 5 (very satisfied). Midway through the study, the university’s search system was changed to the Ovid search interface, so the students were trained and conducted their pre-training searches using CD Plus but did their post-training searches through the Ovid interface.

Results. No differences in search recall, precision or satisfaction were found between the control and intervention groups.

Methodological quality

Randomization. Students were first stratified according to their postgraduate year. Opaque sealed envelopes were then distributed by the researchers to randomise the students into a control group and two intervention groups. This is not a standard method of randomization, because it leaves open the possibility of the researchers introducing bias into the study.

Level of blinding. Markers were unaware of the identity of the students.

Baseline equality of control and intervention groups. Prior to the intervention, the study participants completed a questionnaire indicating their computer experience and their medline usage and proficiency. The tutorial group was disproportionately female (75%) and 33% had medline access at home, so the groups were not equal at baseline in important ways.

Equality of treatment for control and intervention groups. No conclusions about the equality of treatment for control and intervention group participants can be made on the basis of the written report of the trial.

Were all the participants who entered the trial properly accounted for at its conclusion? Ninety per cent and 84% of the participants did the first and second pre-teaching searches, respectively, while only 71% did the third and 58% the fourth (post-teaching) searches. Only 29% of the participants submitted their citation lists for the fourth search. This declining participation is noted in the study.

Reliability and validity of the measuring instrument used. The study used a pre-validated instrument developed by Haynes et al. (1990)13 for determining the relevance of searches. Its intra-rater reliability was weighted kappa = 0.79, 95% CI 0.74–0.84 (an excellent level of agreement), and its inter-rater reliability was weighted kappa = 0.49, 95% CI 0.41–0.49 (a fair level of agreement).
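
For general background (the particular weighting scheme used in the original validation is not reported here), the weighted kappa statistic quoted above is usually defined as

\[ \kappa_w = 1 - \frac{\sum_{i,j} w_{ij}\, o_{ij}}{\sum_{i,j} w_{ij}\, e_{ij}} \]

where o_ij and e_ij are the observed and chance-expected proportions of rating pairs falling in cell (i, j) of the agreement table, and w_ij is a disagreement weight (for example, w_ij = (i − j)² for quadratic weighting).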

How precise are the results? The results of this study are totally unreliable, owing to the major systems change during the trial (from CD Plus to the Ovid interface) and the large participant dropout rate.

Ghali et al. (2000)11

This controlled before-and-after trial was conducted at two mobile sites of Boston University Medical School with 60 third-year medical students (ni = 34, nc = 26). The intervention group received training in how to harness medline as part of an evidence-based medicine (EBM) training programme. They participated in four 90-min sessions over a period of four weeks. Instruction on harnessing medline was given only during the first session; during the remaining sessions, students conducted searches related to the clinical questions set. The control group received no database search and retrieval training.

To determine the efficacy of the intervention, students in the control and intervention groups were surveyed immediately before and after the course using a self-reporting questionnaire. One of the questions related to the use of medline, with students rating their own skills on a six-point ordinal scale from 1 (extremely effective) to 6 (extremely ineffective).

The mean change in score from baseline was calculated for both groups, and P-values for the between-group differences were calculated using the Wilcoxon rank sum test.
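
As a minimal sketch of this kind of between-group analysis (the change-from-baseline scores below are invented for illustration and are not the study’s data), the comparison could be run in Python as follows:

    # Between-group comparison of change-from-baseline scores using the
    # Wilcoxon rank sum test (scipy.stats.ranksums).
    # The scores are hypothetical, NOT the data from Ghali et al.
    from scipy.stats import ranksums

    intervention_change = [1, 1, 0, 2, 1, 0, 1, 1]   # hypothetical changes in self-rated skill
    control_change = [0, -1, 0, 0, -1, 0, 0, 1]      # hypothetical changes in self-rated skill

    statistic, p_value = ranksums(intervention_change, control_change)
    print(f"rank-sum statistic = {statistic:.2f}, P = {p_value:.3f}")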

Results relevant to this review. The results are shown in Table 4.

Table 4. Results relevant to this review.

  • Skill in using medline: intervention group change in score from baseline +0.7; control group change in score from baseline −0.3; P = 0.002.

Methodological quality

Level of blinding. This was an unblinded study, as the students reported on their own performance through a self-reporting questionnaire.

Baseline equality of control and intervention groups. The students were not compared on important baseline characteristics, such as their level of computer experience.

Equality of treatment for control and intervention groups. The two groups were treated equally.

Reliability and validity of the measuring instrument used. The study does not report whether the instrument’s reliability and validity were checked.

Were all the participants who entered the trial properly accounted for at its conclusion? All participants undertook the training and filled in the pre- and post-training questionnaires.

How precise are the results? The post-course reduction in the control group’s medline skill level could be attributed purely to psychological factors: it could be argued that the control group students felt that they had become worse, or were worse than before, because the other group had received training and they had not. A similar argument could be made for the post-training increase in medline skill level reported by the intervention group. This has negative implications for the validity of the increase in medline skill level reported by the researchers.

Results of the review

The combined results of the primary studies (Table 1), which were conducted in teaching hospitals, mainly with student doctors, show some improvement in trainees’ searching skills and that trainees find the training useful. These results are similar to the conclusions reached by a rapid review of information skills training presented at the Evidence Based Librarianship Conference in Sheffield, 2001.14

Discussion

Although the studies show a positive impact of training on trainees’ searching skills, no consensus on the size of the skills improvement can be reached, owing to the small number of studies and the poor methodological quality of most of them.

There is also no agreed definition of the optimal level of literature searching skill that must be reached for efficient practice. The generalisability of, and the conclusions that can be drawn from, the small number of trials (of mixed quality) found are limited.

One explanation for the shortage of research in this area could be that educational research neither attracts a high level of funding nor carries the prestige associated with research in the medical sciences.15

None of the studies assesses what the optimum period of training might be.

The scope of this review was limited to determining the impact of training on health professionals’ literature search and retrieval skills using electronic health databases, because of the difficulties of measuring the effect of such interventions on clinical practice.6,16 No attempt was made to measure the effect of such interventions on clinical practice.

The review is also based on the analysis of published studies only. One unpublished study17 was found, and efforts to obtain it were unsuccessful. This has negative implications for the overall precision of the review’s findings.18 We accept that the review’s results may not be precise, but they still give a clear indication of the need for further good-quality research in this area.

The scope of the review could be widened by including studies whose evaluation was limited to measuring more basic learning outcomes, such as an increase in knowledge about information sources, the literature search and retrieval process and its associated terminology, and an increase in the usage of electronic databases.19,20 We believe these learning outcomes are the basis for developing a good level of skill in literature search and retrieval. The reviewers consider that a good level of skill in this area can be defined as the ability to execute searches and retrieve information independently.

Reviewers’ conclusions

Implications for practice

Evidence from research conducted mainly in teaching hospitals and medical schools with medical students shows some effectiveness in improving health professionals’ searching skills. There is no clear evidence about the size of the effect, whether the effect lasts, or what the effects of teaching are on fully qualified doctors, nurses and members of the allied medical professions.

Because there is a lack of good-quality research evidence, the development and implementation of any new literature-search training programme should be considered on an experimental basis only, with built-in rigorous evaluation strategies to demonstrate local effectiveness.

Implications for research

There is an urgent need for good-quality, methodologically rigorous research, especially controlled before-and-after studies, that measures the impact of training on skills through objective, valid and reliable instruments and is conducted in non-teaching hospitals and primary care settings with a variety of health professionals.

Acknowledgements

The reviewers would like to thank Ann Green and Ann Chadwick, Assistant Librarians, at Chorley and South Ribble District General Hospital library for their prompt and efficient service in providing the requested articles.

References

  • 1
Department of Health. The New NHS: Modern, Dependable. London: The Stationery Office, 1997.
  • 2
    Department of Health. A First Class Service. Quality in the New NHS. London: Department of Health, 1998.
  • 3
    Department of Health. Information for Health. An Information Strategy for the Modern NHS 1998–2005. London: Department of Health, 1998.
  • 4
    Department of Health. Research and Development. Towards an Evidence-Base for Health Services. Public Health and Social Care. Information Pack. London: Department of Health, 1998.
  • 5
Pyne, T., Newman, K., Leigh, S., Cowling, A. & Rounce, K. Meeting the information needs of clinicians for the practice of evidence-based healthcare. Health Libraries Review 1999, 16, 3–14.
  • 6
    Urquhart, C., Massiter, C., Thomas, R., Sharp, S. & Smith, J. Getting information to vocational trainees: report of the GIVTS project. Library and Information Commission Research Report 26. London: Library and Information Commission, 1999.
  • 7
Rosenberg, W. M. C., Deeks, J., Lusher, A., Snowball, R., Dooley, G. & Sackett, D. Searching skills and evidence retrieval. Journal of the Royal College of Physicians of London 1998, 32(6), 557–63.
  • 8
Erickson, S. & Warner, E. R. The impact of an individual tutorial session on medline use among obstetrics and gynaecology residents in an academic training programme: a randomised trial. Medical Education 1998, 32, 269–73.
  • 9
Critical Appraisal Skills Programme (CASP). Learning and Development Division, Public Health Resource Unit. Headington, Oxford: Institute of Health Sciences. Available at: http://www.phru.org.uk/-casp/resources/rct-tool.htm.
  • 10
    NHS Centre for Reviews and Dissemination. Undertaking systematic reviews of research on effectiveness. CRD Guidelines for Those Carrying Out or Commissioning Reviews. CRD Report 4. York: NHS Centre for Reviews and Dissemination, University of York, 1996: 75.
  • 11
Ghali, W. A., Saitz, R., Eskew, A. H., Gupta, M., Quan, H. & Hershman, W. Y. Successful teaching in evidence-based medicine. Medical Education 2000, 34, 18–22.
  • 12
Crombie, I. K. The Pocket Guide to Critical Appraisal. London: BMJ Publishing, 1996: 49.
  • 13
Haynes, R. B., McKibbon, K. A., Walker, C. J., Ryan, N., Fitzgerald, D. & Ramsden, M. F. Online access to medline in clinical settings. A study of use and usefulness. Annals of Internal Medicine 1990, 112, 78–84.
  • 14
Brettle, A. Information skills training: a rapid review of the literature. Evidence Based Librarianship Conference, 3–4 September 2001, Sheffield, UK.
  • 15
Hutchinson, L. Evaluating and researching the effectiveness of educational interventions. British Medical Journal 1999, 318(7193), 1267–9.
  • 16
    Booth, A. & Falzon, L. Evaluating information service innovations in the health service: ‘If I was planning on going there I wouldn’t start from here’. Health Informatics Journal 2001, 7, 139.
  • 17
    Clancy, M. Evaluation and appraisal of teaching evidence-based practice to PAMs. A completed (1998) but unpublished study. Publication ID: N0231084751. Source: The National Research Register at http://www.update-software.com/nrr/CLIBINET.EXE?A=1&U=1001&P=10001.
  • 18
NHS Centre for Reviews and Dissemination. Literature searching and study retrieval. In: Undertaking Systematic Reviews of Research on Effectiveness. CRD guidelines for those carrying out or commissioning reviews. CRD Report 4. York: NHS Centre for Reviews and Dissemination, 1996, 19–25.
  • 19
Griffin, N. L. & Schumm, R. W. Instructing occupational therapy students in information retrieval. The American Journal of Occupational Therapy 1992, 46(2), 158–61.
  • 20
Wallace, M. C., Shorten, A. & Crookes, P. A. Teaching information literacy skills: an evaluation. Nurse Education Today 2000, 20, 485–9.