Paradigm shift: Beyond the COVID‐19 era, is YouTube the future of education for CABG patients?

Abstract

Introduction: Patients commonly use YouTube for education, and this may have increased due to COVID‐19 related restrictions on access to healthcare professionals. However, YouTube videos lack peer review and regulation. To assess patient education in the COVID‐19 era, we analyzed the quality of YouTube videos on coronary artery bypass graft (CABG) surgery.

Methods: We searched YouTube using the phrase "coronary artery bypass graft." Two authors individually rated the first 50 videos retrieved using the Journal of the American Medical Association (JAMA), DISCERN, and Health on the Net (HON) systems. Data collected for each video included: number of views, duration since upload, percentage positivity (proportion of likes relative to total likes plus dislikes), number of comments, and video author. Interobserver reliability was assessed using an intraclass correlation coefficient (ICC). Associations between video characteristics and quality were tested using linear regression or t‐tests.

Results: The average number of views was 575,571. Average quality was poor, with mean scores of 1.93/4 (ICC 0.54) for the JAMA criteria, 2.52/5 (ICC 0.78) for the DISCERN criteria, and 4.04/8 (ICC 0.66) for the HON criteria. Videos uploaded by surgeons scored highest overall (p < .05). No other factors demonstrated a significant association with video quality.

Conclusion: YouTube videos on CABG surgery are of poor quality and may be inadequate for patient education. Given the complexity of the procedure, and given that beyond the COVID‐19 era patients are more likely to seek education from digital sources, treating surgeons should advise patients of YouTube's limitations and direct them to reliable sources of information.


| METHODS
We searched YouTube using the phrase "coronary artery bypass graft" (CABG) [5] on December 29, 2020, and included the first 50 videos retrieved in our study. The search was performed in the English (United States) language and no filters were applied. We collected the following data for each video: number of views, number of comments, time (years) elapsed since the video was posted, percentage positivity (proportion of likes relative to total likes plus dislikes), and author category. Authors were categorized as surgeons, media, or other (e.g., allied health professionals).
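The percentage-positivity metric defined above is simple arithmetic; a minimal sketch in Python (the function name and the example figures are illustrative, not taken from the study):

```python
def percentage_positivity(likes: int, dislikes: int) -> float:
    """Proportion of likes relative to total likes plus dislikes, as a percentage."""
    total = likes + dislikes
    if total == 0:
        return 0.0  # assumption: positivity treated as 0 when there are no ratings
    return 100.0 * likes / total

# Hypothetical video with 90 likes and 10 dislikes
print(percentage_positivity(90, 10))  # → 90.0
```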
Retrieved videos were viewed and assessed independently by two authors (Aashray K. Gupta and Joshua G. Kovoor) using the ranking systems from the Journal of the American Medical Association (JAMA), [24] DISCERN, [25] and Health on the Net (HON). [26] The JAMA ranking system is scored on a four-point scale, with one point each for authorship, attribution, currency, and disclosure. The DISCERN tool is a 15-part questionnaire assessing the quality and reliability of a publication. [25] Each question is scored on a scale of 1-5 points, and the mean score across the 15 questions is reported as the video's final score. The HON ranking system scores eight distinct criteria, each worth one point, including financial disclosure, justifiability, and transparency. [26] Each investigator repeated this process three separate times, and an average score for each video was obtained per investigator. For subsequent analysis, we used the mean of the two authors' scores. For each ranking system, we assessed interobserver reliability using intraclass correlation coefficient (ICC) analysis, with values >0.7 considered to indicate good correlation. Associations between the assessors' scores and the number of views, number of comments, video length, percentage positivity, and age of the video were analyzed using linear regression. Relationships between scores and the numbers of views and comments, after controlling for the age of the video, were also analyzed using linear regression. Analysis of variance was used to assess the relationship between author category and video ratings. All statistical analyses were performed using IBM SPSS version 27.0 (IBM Corp.). [27]
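The text does not state which ICC model was specified in SPSS; purely as an illustration of the reliability calculation, a one-way single-rater ICC(1,1) for two raters can be sketched as follows (the rating data shown are hypothetical):

```python
from statistics import mean

def icc_one_way(scores):
    """One-way single-rater intraclass correlation, ICC(1,1).

    `scores` is a list of rows, one row per video, one column per rater.
    """
    n = len(scores)                      # number of videos
    k = len(scores[0])                   # raters per video (two here)
    grand = mean(x for row in scores for x in row)
    row_means = [mean(row) for row in scores]
    # Between-video and within-video mean squares
    msb = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    msw = sum((x - m) ** 2
              for row, m in zip(scores, row_means)
              for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Perfect agreement between the two raters gives an ICC of 1.0
print(icc_one_way([[1, 1], [2, 2], [3, 3]]))  # → 1.0
```

In practice, two-way ICC models are also common for fixed rater panels; the choice of model affects the reported coefficient, which is why reliability software such as SPSS asks for it explicitly.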
This study involved no human or animal subjects, as we used only publicly available data on YouTube, and as such did not require ethics approval. However, using the DISCERN criteria (Figure 4), surgeons did not score higher than other authors.

| DISCUSSION
To our knowledge, this study provides the first assessment of the educational quality of YouTube videos in the field of cardiac surgery, and is particularly relevant given that CABG is the most commonly performed cardiac surgical operation.
Videos were mostly authored by surgeons and media companies.
When assessed using three validated scoring systems, the average quality of the videos was consistently poor. Videos uploaded by surgeons scored highest overall using the JAMA and HON criteria, but not when assessed with the DISCERN criteria.

In response to the COVID-19 pandemic, most governments instituted physical distancing policies to reduce transmission. [35] This has resulted in less face-to-face medical care and poses a challenge for delivering healthcare. [36] For surgical systems worldwide, preoperative screening, triage, and intraoperative practice have been affected by the risk of COVID-19 transmission in the community. [37][38][39][40] Modern technology allows healthcare to be delivered and consumed through the internet, while also limiting the physical movement of people and, in effect, lowering the density of people at healthcare centers. The combined effect has been to reduce the risk of COVID-19 transmission to patients and healthcare staff. [41] One adaptation to this environment has been the acceleration of telehealth, [42] which has been utilized to keep surgical departments safe during the pandemic. [43]

Our study has a few limitations.

AUTHOR CONTRIBUTIONS
Data extraction, initial analysis, and manuscript preparation were

ETHICS STATEMENT
Ethical approval was not required, as this study used only publicly available videos found on YouTube.