Online Patient Education Materials for Common Sports Injuries Are Written at Too-High of a Reading Level: A Systematic Review

Purpose To determine the readability of online patient information for common sports injuries. Methods A systematic search of the literature using the PubMed/MEDLINE, Embase, and CINAHL databases was performed according to Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. Studies were included if they (1) were published between 2000 and September 2020, (2) were English-language publications and complete studies from peer-reviewed journals, and (3) evaluated online information directed toward patients with common sports injuries. Results Eleven studies met the inclusion criteria. The mean Flesch-Kincaid Grade Level for online education information was 10.5, and the mean Flesch Reading Ease was 51.2, indicating that existing health resources are written above the recommended readability level (no greater than a sixth-grade reading level). The mean DISCERN score was 41.5, indicating that the quality of information accessible to patients was fair. The accuracy of health content, determined by the ACL-Specific Score, was moderate (mean 8.85). Conclusions This study demonstrates that online patient information regarding common sports injuries does not match the readability recommendations of the American Medical Association and National Institutes of Health. Clinical Relevance Future health-related information should be written by qualified experts at a level that can be easily understood by patients of all health literacy levels. Surgeons should be more attentive to where patients get their information and how they interpret it. Accurate, easy-to-understand educational tools can improve efforts to help patients identify misconceptions about treatment options and to guide patients to choices that are consistent with their values.

The value of patient education materials relies on users' ability to access and understand the presented information. Within the last several years, the Internet has become the primary source of health information for many people. 1 More than 345 million Americans, representing 95.0% of the population, have Internet access, and more than one-half use the Internet to seek health information. 2 Moreover, there is an emerging body of literature across multiple specialties supporting the importance of accurate and accessible health information for patients. 3 The quality of information provided to patients regarding their care may substantially influence their understanding of their condition or injury. 4 Further, patient education may influence treatment choice and outcome expectations. 5 In the orthopaedic setting, effective patient education may contribute to a favorable postoperative course. Johansson et al. 6 reported that preoperative orthopaedic patient education improved pain, length of hospital stay, self-efficacy, and motivation to complete exercises. It is therefore imperative to assess the quality, readability, and accuracy of online patient education materials. Furthermore, patient education tools are now a major focus in management and are counted among the factors considered in health care quality assessment. 7 Attention to where patients obtain their information and how they interpret it represents an important step in patient management: a correctly informed patient plays a substantial role in discussing treatment options and subsequent surgical procedures. 8,9 Without quality information, the patient is less able to accurately weigh tests and treatment options against their goals, values, and preferences. 9 The purpose of this study was to determine the readability of online patient information for common sports injuries.
We hypothesized that the readability of online patient information for common sports injuries would not meet recommended levels.

Methods
The systematic review was performed in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. 10 No meta-analysis was undertaken for the included studies, given the heterogeneity of the patient education materials assessed.

Information Sources and Search Strategy
The literature search was conducted, with the assistance of a research support librarian, using the PICO framework. A comprehensive search was conducted using the PubMed/MEDLINE, Embase, and CINAHL databases. All databases were searched from inception to September 2020. Each database was searched for the following Medical Subject Headings (MeSH) and key words: "athletic injuries," "education delivery," "patient engagement," "shared decision-making," "preoperative," and "postoperative." The search terms and queries, used in combination with Boolean operators, are available as Appendix Tables 1-3 at www.arthroscopyjournal.org. Each included study's reference list was also reviewed.

Eligibility Criteria
Studies were included if they (1) were published between September 2000 and September 2020, a window chosen to capture current trends in the study topic while excluding obsolete material; (2) were English-language publications and complete studies from peer-reviewed journals; and (3) evaluated online information directed toward patients with common sports injuries. Exclusion criteria were publication types other than peer-reviewed studies, such as protocols, reviews, or case series.

Selection Process and Data Collection
The query yielded 722 studies from PubMed/MEDLINE, 2868 from Embase, and 3652 from CINAHL after duplicates were removed. Data were independently extracted by 2 of the coauthors (Y.A. and A.A.) using standard data extraction forms for all studies. These reviewers screened full-text studies using the same procedure, with acceptable reproducibility for all decisions. Disagreements were resolved by consensus. The following data items were collected: condition or injury, information source, number of webpages analyzed, authorship, methods of acquiring information, and key study results regarding quality, readability, and accuracy (Table 1). 11-21

Outcome Measures

Measures of Readability
Three scores were used to calculate readability: Flesch-Kincaid Grade Level (FKGL), Flesch Reading Ease Score (FRES), and Gunning Fog Index (GFI). FKGL estimates the U.S. grade level that one must complete to comprehend a given text, whereas the FRES measures the readability of a text on a 0-to-100 scale. 22 FRES scores are interpreted as follows: 0 to 29, very difficult to read (postgraduate reading level); 30 to 49, difficult to read (college reading level); 50 to 59, fairly difficult to read (high school reading level); 60 to 69, standard difficulty (8th- to 9th-grade reading level); 70 to 79, fairly easy to read (7th-grade reading level); 80 to 89, easy to read (5th- to 6th-grade reading level); and 90 to 100, very easy to read (4th- to 5th-grade reading level). GFI estimates the years of formal education a person needs to understand the text on first reading. 23-25
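As a worked illustration of these three measures, the following Python sketch computes FRES, FKGL, and GFI from the standard published coefficients. The vowel-group syllable counter is a simplifying assumption made here for illustration, so its scores will only approximate those of the validated readability software used in the included studies.

```python
import re

def count_syllables(word):
    # Very rough syllable estimate: count vowel groups, subtracting a
    # trailing silent "e". A heuristic for illustration only.
    word = word.lower()
    n = len(re.findall(r"[aeiouy]+", word))
    if n > 1 and word.endswith("e") and not word.endswith("le"):
        n -= 1
    return max(n, 1)

def readability(text):
    # Apply the standard published formulas; assumes non-empty prose
    # with sentence-ending punctuation.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    complex_words = sum(1 for w in words if count_syllables(w) >= 3)
    wps = len(words) / len(sentences)   # words per sentence
    spw = syllables / len(words)        # syllables per word
    return {
        "FRES": round(206.835 - 1.015 * wps - 84.6 * spw, 1),
        "FKGL": round(0.39 * wps + 11.8 * spw - 15.59, 1),
        "GFI": round(0.4 * (wps + 100 * complex_words / len(words)), 1),
    }
```

Note that longer sentences and more syllables per word drive FRES down and FKGL up, which is why dense medical prose scores poorly on both.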

Measures of Quality and Accuracy
Six measures were used to assess quality and accuracy: the DISCERN questionnaire, the Journal of the American Medical Association (JAMA) benchmark criteria, the ACL-Specific Score (ASS), the Global Quality Score (GQS), unique quality and accuracy scores, and Health On the Net Code (HONcode) certification.
The DISCERN questionnaire is a standardized quality index of consumer health information that determines publication quality based on 16 questions pertaining to the reliability of the publication, its content, and an overall quality rating. 26 Total DISCERN scores range from 16 to 80, with a greater score indicating greater quality.
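To illustrate how DISCERN totals translate into quality labels, the sketch below applies the score bands commonly used in the consumer-health-information literature. These cut-offs are an assumption for illustration; the DISCERN instrument itself does not mandate them.

```python
def discern_band(total):
    # Map a DISCERN total (16 items, each scored 1-5) to a quality label.
    # Bands are the ones commonly used in the literature (assumed here):
    # 63-80 excellent, 51-62 good, 39-50 fair, 27-38 poor, 16-26 very poor.
    if not 16 <= total <= 80:
        raise ValueError("DISCERN totals range from 16 to 80")
    if total >= 63:
        return "excellent"
    if total >= 51:
        return "good"
    if total >= 39:
        return "fair"
    if total >= 27:
        return "poor"
    return "very poor"
```

Under this banding, the mean score of 41.5 reported for the included studies falls squarely in the "fair" range.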
The GQS was assigned by the reviewer after evaluating the pertinent websites. It rates overall quality on a 5-point scale, with a greater score indicating greater quality.
Unique quality and accuracy scores are based on guidelines written by the American Academy of Orthopaedic Surgeons (greater score = greater quality or accuracy). 12-14,16,17,27,28 Finally, the presence of HONcode certification identifies websites that agree to comply with a code of ethics requiring quality, objective, and transparent medical information. 29

Assessment of Study Quality
Study quality was evaluated through the following variables recommended in Crombie's items for assessing the quality of cross-sectional studies 30 : (1) appropriateness of design to meet the aims, (2) justification of sample size, (3) adequate description of the data, (4) report number of excluded studies, (5) adequate representativeness of the sample to the total, (6) clearly stated aims and likelihood of reliable and valid measurements, and (7) adequate description of statistical methods. Each parameter received a score of 0, 0.5, or 1 point for not reporting, unclearly reporting, or clearly reporting, respectively. Studies were denoted as high quality if more than 5 of the 7 criteria were described and considered. Studies were denoted as moderate quality if 4-5 of the criteria were described and considered. Quality scores less than 4 were deemed low quality.
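The scoring and classification rule described above can be sketched as a small helper function. This is a hypothetical illustration; the review applied these criteria manually.

```python
def classify_study_quality(criteria_scores):
    # criteria_scores: ratings for the 7 Crombie items, each scored
    # 0 (not reported), 0.5 (unclearly reported), or 1 (clearly reported).
    # Thresholds follow the rule described above:
    # total > 5 -> high; 4 to 5 -> moderate; < 4 -> low.
    if len(criteria_scores) != 7:
        raise ValueError("expected scores for 7 criteria")
    total = sum(criteria_scores)
    if total > 5:
        return "high"
    if total >= 4:
        return "moderate"
    return "low"
```

For example, a study clearly reporting 5 criteria and unclearly reporting 1 (total 5.5) would be classed as high quality, matching the 4.5-to-7 range reported for the included studies.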
Results

Readability, Quality, and Accuracy of Information
Table 2 11-21 reports the readability, quality, and accuracy of online patient information for sports medicine-related injuries. Six of 11 (54.5%) studies evaluated components of readability (Table 3). 11,14,16-19 The mean FKGL was 10.5 (range 8.1-13.4), corresponding to a 10th- to 11th-grade reading level, well above the recommended sixth-grade threshold. The mean FRES was 51.18 (range 50.17-52.14), which is defined as "fairly difficult to read," or a high school reading level. Only one study reported a mean GFI, 9.02, which exceeds the threshold (index less than 8) for universal understanding. 19 Ten of 11 (90.9%) studies evaluated components of quality (Table 4). 12-21 Overall, the quality of information accessible to patients was classified as fair, with a mean DISCERN score of 41.5 (range 39.47-44). The mean JAMA benchmark score for websites was 1.8 (range 1.32-2.4). Only one study reported an ASS, which was poor at 5.5. 18 Bruce-Brand et al. 12 and Dalton et al. 19 demonstrated that HONcode-certified sites (2 studies in total) were significantly more difficult to read (P = .004).

Assessment of Study Quality
Study quality of the articles included in the review ranged from 4.5 to 7, indicating moderate to high quality. Fifteen of 17 studies (88.2%) were high quality based on their quality assessment scores, whereas 2 of 17 (11.8%) were moderate quality. No studies were deemed low quality (Table 6). 11-21

Discussion
Our analysis shows that online patient education material for the most common sports injuries is written at a high reading level. Readability in the included studies was calculated as difficult, with no study reporting a FKGL below the recommended threshold (no greater than a sixth-grade reading level) for readable patient education material. 35 This corroborates previous studies that analyzed online patient education material and demonstrated poor readability. 19,27 Taken together, the data suggest that many patients may not fully comprehend the continuous stream of online information about a wide range of sports injuries. This may lead to increased hospitalization rates, poor compliance, increased costs, and poor health status. 1,36,37 Although decision aids are increasingly used in orthopaedic practice, aids written beyond the recommended reading level diminish shared decision-making and a patient's ability to grasp all attributes of care. Future health-related information should be written by qualified experts at a level that can be easily understood by patients of all health literacy levels. Surgeons should be more attentive to where patients get their information and how they interpret it. Accurate, easy-to-understand educational tools can improve efforts to help patients identify misconceptions about treatment options and to guide patients to choices that are consistent with their values.
These barriers may be compounded for patients who speak English as a second language. Previous studies have found that websites using medical terminology and those written at an advanced reading level are also more accurate. 33,38 This reflects a bias favoring patients with greater levels of education and health literacy. 11,12,38 Although many patients access this information online, it may fall short of its purpose of explaining sports injuries and treatment choices. To adequately use the Internet as a resource for health information, clinicians should guide patients to websites whose descriptions of injuries and treatment options match their reading level; for reference, the average Medicare beneficiary reads at a fifth-grade level, and the average U.S. resident at an eighth-grade level. 39 Information shared on the Internet can influence patients' decisions, beliefs, and attitudes toward their care. In medicine, qualified experts provide clinical advice; however, most online information is written by people who may not have such qualifications. We found that less than 40% of the evaluated material was physician authored. Most patients do not have the right tools to evaluate health literature for biases, unreliability, and inaccurate information; such content can leave patients vulnerable to misinformation and poor health care decisions. 40 Future research may provide updates and more comprehensive insights regarding the characteristics of available patient information. Further, additional work on online patient education for sports injuries should focus on more in-depth assessment of cost utility, impact on total office visit time, and influence on postoperative outcomes and patient expectations.

Limitations
There are several limitations to this study. Heterogeneity of the outcome measures and variation in diagnoses and patient characteristics made it difficult to evaluate and compare studies. Furthermore, studies published several years ago or more may be out of date with respect to currently available online patient resources, particularly because the Internet is such a massive and constantly changing source of information.

Conclusions
This study demonstrates that online patient information regarding common sports injuries does not match the readability recommendations of the American Medical Association and National Institutes of Health.