The Quality and Content of Internet-Based Information on Orthopaedic Sports Medicine Requires Improvement: A Systematic Review

Purpose: To evaluate the quality and content of internet-based information available for some of the most common orthopaedic sports medicine terms.

Methods: A search of the PubMed, Embase, and Cochrane databases following PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-analyses) guidelines was performed. All English-language literature published from 2010 to 2020 discussing information quality pertaining to orthopaedic sports medicine terms was included. Outcomes included the search engines used, number and type of websites evaluated, platform, and quality scoring metrics. Descriptive statistics are presented.

Results: This review includes 21 studies. Of these, 3 evaluated both the upper and lower extremity. Twelve focused on either the upper or lower extremity, most commonly rotator cuff tears (3 of 12) and/or anterior cruciate ligament pathologies (7 of 12). The most common engines were Google (18 of 21), Bing (16 of 21), Yahoo (16 of 21), YouTube (3 of 21), Ask (3 of 21), and AOL (2 of 21). The average number of media files assessed per study was 87 ± 55. Website quality was assessed with DISCERN (7 of 21), Flesch-Kincaid (9 of 21), Health on the Net (7 of 21), and/or Journal of the American Medical Association Benchmark (7 of 21) scores. YouTube was evaluated with Journal of the American Medical Association Benchmark scores (1.74 ± 1.00). Image quality was reported in 2 studies and varied with search terminology.

Conclusions: The results of this systematic review suggest that physicians should improve the quality of online information and encourage patients to access credible sources when conducting their own research.

Clinical Relevance: Doctors can and should play an active role in closing the gap between the level of health literacy of their patients and that of most common online resources.

Patients have immediate access to powerful search engines and often use the internet to obtain inexpensive, quick medical advice. Previous studies have evaluated the reliability of public-access websites and have reported that many lack high-quality, accurate information. 1 A unique subset of patients who have yet to be investigated in this context is orthopaedic athletes. Surgical interventions often have recovery periods that impact quality of life, especially in an active population in which an injury results in a significant decrease in daily activity. It is common for the surgeon to encourage limited use of an injured area or even complete immobilization to promote healing. Many active individuals facing such downtime turn to the internet because it offers a wealth of easily accessible information.
The purpose of this study was to evaluate the quality and content of internet-based information available for some of the most common orthopaedic sports medicine terms. 2 We hypothesized that websites with a Health on the Net (HON) seal or those authored by academic institutions would provide the most medically accurate, safe, and pertinent information whereas websites published by individuals or for-profit businesses would provide the least.

Methods
Two independent reviewers (D.A.H. and J.W.B.) searched PubMed, Embase, and the Cochrane Library up to June 12, 2020. The following search terms were used: (internet information quality) AND ((anterior cruciate ligament) OR (meniscal) OR (shoulder instability) OR (Bankart) OR (rotator cuff) OR (shoulder) OR (tennis elbow) OR (lateral epicondylitis) OR (medial collateral ligament) OR (posterior cruciate ligament) OR (osteochondral defect) OR (cartilage defect) OR (clavicle) OR (knee)). A total of 324 records were identified through the search of the 3 databases.
Preliminary search results were screened by title and/or abstract to determine study eligibility based on the inclusion criteria: studies discussing internet information quality pertaining to common orthopaedic sports medicine topics, including anterior cruciate ligament (ACL) rupture, medial collateral ligament (MCL) tear, posterior cruciate ligament tear, meniscal tear, osteochondral defect of the knee (cartilage defect of the knee), shoulder labral tear (Bankart tear), rotator cuff tear, shoulder arthritis, clavicle fracture, and/or lateral epicondylitis (tennis elbow); full-text studies published in the English language; studies of Level I to IV evidence; and studies published from 2010 to 2020.
Studies were included if they discussed searching at least one of the following databases: Google (Google LLC, Mountain View, CA), Yahoo (Verizon Media, New York City, NY), YouTube (Google LLC), Ask (IAC Search and Media, Oakland, CA), AOL (Verizon Media, New York City, NY), and/or Bing (Microsoft Corporation, Redmond, WA). Non-English-language studies, studies for which the full text was not available, cadaveric studies, basic science articles, case reports, personal correspondence, studies that did not evaluate search engines or consider a medical problem, and studies that were not related to orthopaedic sports medicine were excluded. Twenty-one studies met the inclusion and exclusion criteria (Fig 1). Data extraction from each study was performed independently (I.S.). Disclosure of funding and third-party involvement were not required to obtain any of the collected data.

Reporting Outcomes
The outcomes extracted included the primary search engines used, the number of websites evaluated by each study, the type or classification of the websites, the primary platform of the search (websites/Web pages, videos, or images), and the metrics used to score the websites. Scoring systems included the following: DISCERN instrument, 4 Flesch-Kincaid (FK) tool, 5-7 Journal of the American Medical Association (JAMA) Benchmark scores, and/or HON foundational principles. 8-10 The DISCERN instrument consists of 15 questions, each aimed at a specific quality criterion, plus an overall quality rating. 4,11 The DISCERN categories include reliability, treatment choices, and overall quality.
The FK tool is the most widely used measure of reading ease. The tool has 2 parts: reading ease and grade level. The first number in a score indicates reading ease (0-100). The second number indicates the average reading grade level. The national average reading level is an eighth-grade level. The recommended published reading level for the layperson is a sixth-grade level. 5 Both reading ease and grade level are calculated from the same set of metrics: word length (in syllables) and sentence length (in words). Reading ease and grade level are inversely related: a higher reading ease score corresponds to a lower grade level. (Formulas are available in Appendix 1.) The HON seal is granted based on 6 core principles: quality, confidentiality, neutrality, transparency, community, and visibility. 8-10 The JAMA Benchmark score ranges from 0 to 4 points. 9 The 4 criteria include author description, references, dating, and disclosures. One point is given for each of the aforementioned aspects; a score of 3 or greater is considered "high quality."
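For readers who wish to reproduce these scores, the two FK formulas and the JAMA Benchmark tally described above can be sketched as follows. This is an illustrative sketch, not code from any included study; the function names are our own, and the word, sentence, and syllable counts are assumed to be supplied by the reader's own text-analysis step:

```python
def flesch_reading_ease(words: int, sentences: int, syllables: int) -> float:
    """Flesch reading ease, roughly 0-100; higher scores mean easier text."""
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

def flesch_kincaid_grade(words: int, sentences: int, syllables: int) -> float:
    """Flesch-Kincaid grade level, an approximate U.S. school grade."""
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

def jama_benchmark(authorship: bool, references: bool,
                   dating: bool, disclosures: bool) -> tuple[int, bool]:
    """JAMA Benchmark: 1 point per criterion met; >= 3 is 'high quality'."""
    score = sum([authorship, references, dating, disclosures])
    return score, score >= 3
```

For example, a passage of 100 words in 10 sentences with 130 syllables scores a reading ease of about 86.7 and a grade level of about 3.7, consistent with the inverse relation noted above.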

Statistical Analyses
Descriptive data are presented. Owing to the heterogeneity among studies, no pooled data or meta-analyses are presented in this review.

Results
This systematic review was conducted based on the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-analyses) checklist and guidelines. 3

Website Media: Images and Video
Video. YouTube, the second most popular social media network, was the only platform used to assess videos. 13 Only 2 studies discussed the video medium: those of Akpolat and Kurdal 13 and Cassidy et al. 15 Cassidy et al. reported no correlation between the number of views and video quality or accuracy based on any scoring system.
Images. DeFroda et al. 18,19 discussed the image medium. In their analysis of internet images based on knee ligament search terms, they found that the inter-rater reliability was high (Cronbach α = 0.89) for "PCL tear" (posterior cruciate ligament tear) searched on Bing and nearly equivalent (Cronbach α > 0.9) for the remainder of the search queries (ACL tear, MCL tear, and LCL [lateral collateral ligament] tear). When they compared Google with Bing, the only significant difference was in the ACL group: Bing returned a significantly greater proportion of correct images, 60% compared with Google's 45% (P = .034). For MCL and LCL tear searches, Google and Bing did not differ significantly. In their study assessing meniscal images, DeFroda et al. 19 found that search engines displayed meniscal tears with greater than 80% accuracy but that many of the images were technical and required additional education in anatomy and physiology to understand and interpret.

Website Affiliation
Most of the media files assessed were physician affiliated (25%), followed by news or other (15%) and industry or commercial (15%) (Appendix Table 2). Somerson et al. 28 specifically considered source accuracy based on website type and found that commercial websites had the most errors: compared with academic sources, commercial sources were 5 times more likely to publish false information. Nonprofit websites had the highest percentage of HON seals. Academic websites had the highest completeness score (19.2 ± 6.7; maximum, 49) when compared with commercial (15.2 ± 2.9), nonprofit (18.7 ± 6.8), and physician (16.6 ± 6.3) websites, indicating that even though a source may be factually correct, it could still be incomplete. This key point was highlighted by Wang et al., 29 who found that most websites, even if considered "high quality," failed to distinguish between focal chondral defects and diffuse osteoarthritis, an important clinical factor in an orthopaedic setting. Another study 25 reported an average content-specific DISCERN score across Bing, Google, and Yahoo of 3.4 ± 0.59.

Discussion
In this systematic review evaluating internet-based guidance for common orthopaedic sports medicine diagnoses, most search engines preferentially populated media that lacked appropriate scientific and medical content. As one included study 30 reported, the use of more complex search terms returned websites with information written at a higher reading grade level but not of higher quality.
Most of the websites that populate the internet when searching frequently used orthopaedic terms and diagnoses are not associated with an HON seal, meaning they have not been vetted for accuracy, completeness, or reliability. Many of the images that appear when searching clinical diagnoses do not match the actual term used in the search. Finally, most available videos are non-educational and omit key clinical information. This inconsistency highlights the great variability among the major search engines.

In support of the findings of Bruce-Brand et al., 14 many of the studies in this review noted that online health care information frequently omits treatment options, such as doing nothing (a key element of the DISCERN scoring criteria), as well as risks and prognosis. Nonetheless, website accuracy, reading level, and the presence of an HON seal were positively correlated, 14,22 and websites with a seal had higher overall DISCERN and JAMA Benchmark scores. 14

This review emphasizes that there are very few checkpoints ensuring that medical information on the internet is vetted for safety and correctness. The 21 studies in this review stressed that use of search engines for health purposes is growing in popularity but that the general public lacks literacy regarding source credibility, which could lead to adverse health outcomes, delayed treatment, and potential exacerbation of a condition or injury. In summary, the findings of this systematic review suggest that physicians can mitigate the discrepancy between patients' health literacy and internet information by taking an active role in guiding patients. Health care providers are in a unique position to encourage the use of websites with HON seals and to encourage patients to refrain from self-diagnosis and self-treatment based on guidance from the internet.

Future Directions
The problem of a physician having to prove or disprove a patient's online diagnosis and presumed treatment merits continued analysis. Future studies should consider patient interaction with the internet and its impact on clinic visits, the added burden encountered by physicians, and potential correlations between internet use and physician visits.

Limitations
In this study, only complete data available on the day of the search were analyzed. Therefore, variables outside the scope of the initial search, such as standardized methodologies (several studies used their own scoring tools to evaluate website content), 14,23,31 direct implications for patients, and clinical care correlations, do not have data available for comparison. Only 2 studies examined images, and both focused solely on the knee, meaning there is a lack of information on the shoulder and clavicle, other commonly injured areas. 18,19 The only video streaming medium used was YouTube, which carries additional commercial bias given that it is a social media platform. In addition, the specifications of the algorithm used by each search engine are not available and could significantly affect the results that appear. Furthermore, we cannot definitively know all search users' characteristics, intentions, and biases. The generalizability of this study is limited because the major search engines analyzed (Google, Bing, Yahoo, AOL, and Ask) constantly undergo updates and changes to how they search, to their advertisements and sponsors, and to what is deemed relevant based on user and computer data; these changes over time are not well documented and could affect search results in every domain. Finally, there are no well-established tools for ranking health-based information that translate across all media forms (text, images, and videos); the closest certification for information vetting is an HON seal.

Conclusions
The results of this systematic review suggest that physicians should improve the quality of online information and encourage patients to access credible sources when conducting their own research. Doctors can and should play an active role in closing the gap between the level of health literacy of their patients and that of most common online resources.