Assessing the usability of methods of public reporting of adverse drug reactions to the UK Yellow Card Scheme
Division of Social Research in Medicines and Health
School of Pharmacy
University of Nottingham
Nottingham NG7 2RD
Objectives The aim of this study, which was part of the first independent evaluation of patient reporting of adverse drug reactions (ADRs) to the Yellow Card Scheme, was to observe the three reporting systems (paper, internet and telephone) ‘in use’ in a simulated setting to identify aspects which facilitated or hindered reporting.
Methods Forty adult participants were recruited from the general public using posters in pharmacies and a press article, and from a pool of volunteer simulated patients maintained by University of Nottingham medical and pharmacy schools. The participants, in seven groups that met at different times, were asked to ‘think aloud’ as they were individually observed completing the reporting process for the paper and internet systems, highlighting their thoughts and any issues encountered. They were asked to talk about their experience of reporting immediately after they had reported by telephone. Data from the field notes were analysed thematically and supplemented with relevant information from digital audio recordings.
Conclusions Usability testing using the ‘think aloud’ approach worked well and identified areas of the Yellow Card reporting system which could be improved. Whilst the three methods of reporting available to the public are all reasonably ‘fit for purpose’, there were many suggestions identified for improving ease of completion and data quality, especially for the internet system. When systems for reporting of ADRs are designed, they should be tested by potential users before they are launched, so that potential problems are identified in advance.
Spontaneous reporting of adverse drug reactions (ADRs) is one method of pharmacovigilance. In the United Kingdom, this is undertaken through the Yellow Card Scheme that was established in 1964 as a consequence of the thalidomide tragedy.1 Yellow Card reports are submitted to the Medicines and Healthcare products Regulatory Agency (MHRA) by post, telephone or via the internet. The MHRA electronically records and reviews information submitted so that major safety issues can be detected through signal generation.2 For over 45 years, health-care professionals have used this system. The potential benefits of patient reporting were summarized at the First International Conference on Consumer Reports on Medicines in 2000 and included the following: the promotion of consumer rights and equity; acknowledging that consumers have unique perspectives and experiences; and, that health-care organizations would benefit from consumer involvement.3 In 2004, the Committee on Safety of Medicines (CSM) Independent Review of Access to the Yellow Card Scheme1 identified, as a counterpart to its main remit of making recommendations on access to Yellow Card data, that patient reporting should be introduced. A subsequent working party on patient reporting of ADRs recommended that a patient reporting system had to be accessible to patients and the public, easy to use, available in different reporting systems and able to provide information needed to monitor medicines effectively. The first Yellow Card for use by the public was designed to include questions from the professionals’ Yellow Card and retain a similar format to the professional card although exact wording was not always retained. Consultations were held with patients and patient organizations, and the first pilot of the patient Yellow Card was made available on the MHRA website in January 2005. 
Following the pilot, the forms were further adapted and a nationwide direct patient reporting scheme was officially launched along with a media campaign in September 2005. Following the redesign of the paper-based reporting form and web-interface for reporting, the scheme was launched again in February 2008, with another media campaign. This study was part of the first major evaluation of patient reporting to the Yellow Card Scheme; the overall aims of the study were to evaluate patient reporting of suspected ADRs to the Yellow Card Scheme in the United Kingdom by assessing the pharmacovigilance contribution of patient reports compared with those of health professionals, exploring the views of patient reporters and members of the public and comparing study findings with those from existing schemes worldwide.
Usability testing is one way of ensuring that systems are adapted to their users and their tasks, and that their use has no negative outcomes; it measures whether a system has the capacity to fulfil its purpose. The goal of usability testing is to assess the degree to which a system is effective and efficient and favours positive attitudes and responses from the intended users.4 Participants are invited to undertake typical tasks, which may or may not be simulated, whilst their behaviours are observed and recorded. Should design problems be identified, recommendations are proposed that may improve the quality of the product and tailor it to the users’ preferred ways of working. Until recently, the user interface had received little attention as a research area in health care; such research has usually involved health professionals’ use of health informatics systems, although patients have previously been involved in usability testing of web-based patient education information.5
The ‘think aloud’ method
The ‘think aloud’ method of usability testing originates in psychology and has been applied in psychological and educational research on cognitive processes, but has also been used for knowledge acquisition in the context of building knowledge-based computer systems. The ‘think aloud’ method has been criticized with respect to the validity and completeness of the reports it generates, as a result of the researcher disturbing the cognitive processes.6–8 It has been shown, however, that if the researcher minimizes interventions in the process of verbalizing and reminds participants to keep talking when they stop verbalizing their thoughts, the ongoing cognitive processes are no more disturbed than by other knowledge acquisition techniques.9,10 Some researchers have stated that it is likely to reveal more issues than simply exploring people’s perceptions of a system.11 The method has been widely adopted in information and communications technology and engineering as a way of assessing the usability of computer programmes and expert systems.12 Participants are asked to ‘think aloud’ whilst solving a problem or performing a task.
The aim of this study was to observe the three reporting systems (paper, internet and telephone), ‘in use’, to identify aspects of the systems which facilitated or hindered reporting.
Participants were recruited from the general public by posters in pharmacies and a press article, and from a pool of volunteer simulated patients maintained by University of Nottingham medical and pharmacy schools. Those interested in participating were asked to contact the researchers and were invited to a session convenient to them. Seven sessions were conducted at different times and on different days of the week, each preceded by a focus group where patient reporting of ADRs was discussed (reported elsewhere). Participants gave informed signed consent and were offered a £25 inconvenience allowance. Ethical approval was received from Warwickshire Research Ethics Committee and the Independent Scientific Advisory Committee of the MHRA.
Simulating the Yellow Card Scheme reporting systems
Blank paper Yellow Card Scheme forms were provided by the MHRA. The telephone reporting system at the MHRA is staffed by science graduates, including pharmacists, from 10 am to 2 pm Monday to Friday. As not all the usability groups were conducted during these times, and to avoid problems reporting fictitious ADRs to the MHRA telephone reporting system (e.g. not wanting fictitious reports to be recorded on the system or people taking the calls knowing/suspecting they were being contacted by a simulated patient), the decision was made to use a pharmacist researcher to reproduce the reporting system throughout the usability testing process. To ensure that the participants’ experience of reporting using the telephone method was as realistic as possible, the MHRA training materials were used to inform the interaction with the reporter. For the Internet reporting system, the MHRA provided a dummy system, identical to the real system.
Six scenarios describing a patient experiencing an ADR were developed (see example scenario in Box 1). Each participant then chose two scenarios and two reporting methods, ensuring that only those familiar with the internet or able to use a telephone were given those options. This avoided the elicitation of confidential information from the participants, allowed those who had not experienced a side effect to participate and gave consistency to data collection. Scenarios were provided in large print for participants who required them.
Participants were asked to ‘think aloud’ as they completed the reporting process for the paper and internet systems, highlighting their thoughts and any issues encountered. They were asked to talk about their experience of reporting immediately after they had reported by telephone. There was no time limit placed on completion of the reports, but participants were told that if they felt they would have given up completing the report at home, they should do the same during the usability tests. Participants were individually observed, and their comments were digitally audio recorded. Observers made field notes, asked questions for clarification and reminded the participants to keep talking if they stopped verbalizing their thoughts. We continued running the usability sessions until no new themes were being identified. The data from the field notes were analysed thematically and supplemented with relevant additional information from the audio recordings (these were not transcribed).
Box 1 Example scenario
Here are the details for the patient Scenario. Please note that this is not a real patient and all the contact details are made up. In this scenario, you are the patient.
Mr John Smith is a 56-year-old gentleman. His address is 42 Long Acre, Beeston, Nottingham NG9 4EG. Tel.: 0115 9286695. His email at work is email@example.com
The doctor started him on a new medication for his cholesterol in early May 2008. The drug was simvastatin 40 mg at night. He started to get pains in the muscles in his legs and arms a few weeks after taking the newly prescribed medicine. He noticed an aching in his muscles most of the time and he was not able to get around quite as well as normal. The problem went away about a week after stopping taking the medicine. He tried taking paracetamol for the pains but this was not very helpful. He stopped taking the medicine after discussing the pains with his doctor. He also uses a salbutamol inhaler for his asthma and takes aspirin for his heart and St John’s wort for low mood. He has had angina since 2007 and has been an asthmatic since the age of five. He is allergic to penicillin.
Mr Smith weighs 13 stone and is 5 feet 9 inches tall.
His doctor is Dr Mayberry, Middle Vale Surgery, Beeston, Nottingham, NG9 7ET.
Response rates and demography
Forty participants took part in seven groups (Table 1); 67.5% (27) were women, and 72.5% (29) were aged 50 or over. Two individuals were hearing impaired, and a third had to use a magnifying glass with the large print materials. Thirty-seven people completed internet reports, 36 completed written forms and eight completed telephone reports (81 reports in total). Fewer participants completed telephone reports after the first two groups, when it became clear that there were very few issues associated with telephone reporting.
Table 1. Demographic data for usability testing participants
| Characteristic | n | % |
|---|---|---|
| Age (in years) | | |
| Educational level achieved | | |
| Left school at 16 | 4 | 10.0 |
| Left school at 18 | 2 | 5.0 |
| Further education | 17 | 42.5 |
| Higher education | 14 | 35.0 |
| Postgraduate degree | 2 | 5.0 |
| Not stated | 2 | 5.0 |
The form was generally well received with many of the participants commenting positively:
This is an excellent form and with my own information I could er… fill it in easily. (Female, 60–69 years)
Two participants with visual impairment found the font size too small. Several participants commented that they would not bother reading the information about the Yellow Card Scheme provided within the integrated information leaflet and reporting form; they would just start filling in the form. It should be noted, however, that the scheme had been explained to participants in the focus group directly before they undertook the usability tests.
In some of the scenarios, details were provided about a number of drugs that were being taken by the fictional individual, and some of the participants found it difficult to fit all the required information onto the form. Whilst the form suggests that further information can be provided on a separate sheet, participants were generally not keen on this:
Having to attach extra sheets to give all the information would annoy me, so I wouldn’t bother! (Male, 70–79 years)
I’m frustrated to have to add additional sheets. (Female, 70–79 years)
And some felt that this part of the form was tedious and long winded:
I have lost the will to live (comment made when entering all details of the drugs). (Female, 50–59 years).
Some individuals commented that they would struggle to remember the dates on which their various drugs had been started and stopped, and that they would need to look up this information.
The layout of the form was commented on by some participants. Details about the reporter are requested at the end of the report, after details of the ADR and the severity of the side effect. This did not feel natural to some of the participants, who felt that detailed ADR information should be requested after more basic information has been collected.
Whilst all the scenarios included the height and weight of the fictional character experiencing the side effect, some participants commented that they would not necessarily be able to provide these details, particularly their weight.
When participants tried to insert their completed form into the reply-paid envelope, which is also part of the leaflet, they were disappointed to discover that the form did not slide neatly into it; instead, the form had to be folded:
The envelope is the same width as the form, so it doesn’t go in without folding. And the adhesive is not very good. (Male, 18–29 years)
One participant commented that the use of a first class reply-paid envelope made her feel that the MHRA was taking the whole system very seriously.
It was suggested that access to the system would be improved by making the paper-based and online forms available in languages other than English:
It might be useful to er you know, provide the service in other languages, er not only English. (Male, 60–69 years)
The telephone reporting system was popular with all those who used it. The interactive nature of the conversation with the person manning the telephone was highlighted as a factor that made the process much easier for the reporter. Participants expressed concern that such a convenient method of reporting was only available for a limited number of hours, thereby limiting access. They were pleased to be told that the reporting line has a direct dial number which does not require reporters to enter a queuing system with an automated response menu:
I might be put off by having to pay for the phone call…the time restriction of the service might put me off too. (Female, 40–49 years)
The only issue with the telephone system was encountered by one participant who had impaired hearing. He was unable to hear the person taking the report because of background noise, and the test had to be abandoned. The participant stated that he would not have encountered this problem at home and did not consider it an inherent problem with the system.
The online reporting system received mixed comments from participants. There were a few who found the system easy to navigate, but many found it very challenging, even though they used the internet regularly and completed forms online.
When reporting the nature of the suspected ADR, there are a number of drop-down menus (dictionaries) that provide fixed response options. However, the complexity of the terms used within these menus confused a number of participants. For example, entering an ADR term of ‘rash’ produced over 70 medical terms for rash (such as ‘rash desquamating’ and ‘rash morbilliform’). Participants were often unable to understand these terms, and many felt forced to choose the first term on the list (which is always a general term) or a term at random.
A number of participants did not see the drop-down menus appear, as they were concentrating on the next part of the form to be completed. This resulted in a lack of further detail being provided.
There appeared to be no facility to save data at each stage of completing the online form. At least two participants had problems with losing information after taking some considerable time completing sections of the form. Having wasted their time, these participants told us that they would not have finished completing the form in real life.
I would have given up… too frustrating… (Female 45 years)
One comment made by a number of participants related to the different ordering of questions on the online and paper forms. A consistent approach would make it easier for users who may use the paper form to collate their information, but then choose to submit the data using the online system.
Usability testing using the ‘think aloud’ approach worked well and provided a number of ideas for improving the Yellow Card reporting system. Whilst the three methods of reporting are all reasonably ‘fit for purpose’, there were many suggestions identified for improving ease of completion and data quality. The most fundamental issues related to the internet form. Our findings suggest that the frustration that occurs when data are lost by navigating backwards when trying to complete a report online may result in the reporter abandoning their attempt at reporting. Changes to the internet report forms could be made relatively easily. The online system also asks for information additional to that requested on the paper form. It is important to consider whether this additional information is required and, if so, whether it should also be collected on the paper form. Levels of reporting to the Yellow Card Scheme may be lower than expected because some patients may give up having tried unsuccessfully to make a report. The adaptation of a system designed for health professionals, rather than designing a system specifically for patients, has led to a number of problems, including the issue of the drop-down menus. If a system were to be designed for patients, it should start with two questions: (i) what do we need to know from patients for the information to be useful? and (ii) what information could we reasonably expect patients to have and then provide? Modifying the internet form to fit more with patient requirements, whilst capturing similar data from both patients and health-care professionals, would be the optimal approach to allow for collection of comparable data, but in a way that patients find straightforward. This ‘think aloud’ approach probably revealed more issues than would have been found by assessing participants’ reported perceptions alone.
Our findings indicate that when systems for reporting of ADRs are designed, it is important to test them with potential users before they are launched, so that possible problems are identified in advance. The ‘think aloud’ approach proved to be a very useful method and could be used in the future in the design of both patient and health professional reporting systems and expert systems.
Limitations of the study
The sample of 40 participants was small, but no new issues were identified in the later usability sessions and saturation of data occurred. The scenarios for the usability tests were ‘hypothetical’, and this may have introduced some artificiality. It might also have lengthened the process of completing the forms by any of the three methods because of lack of personal association with the data. Conversely, the process may have been shortened as dates for starting and finishing the medicines and height and weight were included in the scenarios, whereas in real life participants would have had to recall this information, look it up or ask someone else. We could have moved the gentleman who had found it hard to hear the telephone conversation owing to background noise to a quieter area in the university, but this was not done at the time.
Thinking aloud takes place concurrently with cognitive processes, which could take longer when the ‘think aloud’ method is used. Alternatively, participants’ verbalization may not keep up with their cognitive processes, and the evaluation may then be incomplete. Participants might also have had different verbal skills, which could influence their contribution and the subsequent results.7,8 We did not, as some suggest,6 give the participants a chance to think aloud with a similar task before we asked them to simulate completing a Yellow Card.
For practical reasons, we did not restrict participation in the usability testing to adults who had experienced ADRs. Even so, it was a challenge to recruit 40 participants, and it is unlikely that we would have achieved this number if we had restricted entry criteria to people who had experienced ADRs. However, whilst some of our participants said that they would have ‘given up’ trying to report because of difficulties they experienced with the internet reporting system, it is possible that people who had suffered a suspected ADR would have been more persistent. We do not know how many of our participants had experienced an ADR.
We successfully simulated the three methods of reporting to the Yellow Card Scheme and were able to produce recommendations for improvement. Usability testing with the 40 participants indicated that telephone reporting worked well, but identified specific suggestions for enhancing internet and paper reports. Suggested changes to internet reports include the following: making it easier for users to navigate through the web pages, reducing the complexity of the drop-down menu options for ADRs and allowing users to save the report as they go through it. Suggested changes to paper reports include the following: allowing more space for the recording of multiple medications, providing a larger font size for people with visual impairment and redesigning the envelope so that the report fits within it more easily. When systems for reporting of ADRs are designed, they should be tested by potential users before they are launched, so that potential problems are identified in advance. These findings will be published in a forthcoming HTA report and have been shared with the MHRA. We hope the findings will be used to revise the scheme to encourage more reporting and to ensure people are not deterred by practical difficulties.
This research was funded by the NIHR Health Technology Assessment Programme (project number RM05/JH30). The views and opinions expressed herein are those of the authors and do not necessarily reflect those of the Department of Health. We thank the MHRA for their assistance and members of the study advisory group for comments on our study report, which contained the findings presented in this paper. We also thank all the participants who gave us their valuable opinions and Randy Barber for his help with administration.