Address correspondence to Austin V. Stone, M.D., Ph.D., Department of Orthopaedic Surgery and Sports Medicine, University of Kentucky, 2195 Harrodsburg Rd., Lexington, KY 40504.
Purpose
To evaluate the content and quality of YouTube videos concerning patellar dislocations.
Methods
“Patellar dislocation” and “kneecap dislocation” were searched in the YouTube library. The Uniform Resource Locators of the first 25 suggested videos for each term were extracted, for a total of 50 videos. The following variables were collected for each video: number of views, duration in minutes, video source/uploader, content type, days since upload, view ratio (views/day), and number of likes. Video source/uploader was categorized as academic, physician, nonphysician, medical source, patient, commercial, and other. The Journal of the American Medical Association (JAMA) benchmark, Global Quality Scale (GQS), Patellar Dislocation Specific Score (PDSS), and DISCERN scores were used to assess each video. A series of linear regression models was used to explore relationships between each of these scores and the aforementioned variables.
Results
The median video length was 4.11 minutes (interquartile range 2.07-6.03, range 0.31-53.56), and the 50 videos together garnered 3,697,587 views. The mean scores ± standard deviation were: JAMA benchmark 2.56 ± 0.64, GQS 3.54 ± 1.05, and total PDSS 5.76 ± 3.42. Physicians were the most common video source/uploader (42%). Academic sources had the greatest mean JAMA benchmark score (3.20), whereas nonphysician and physician sources had the greatest mean GQS scores (4.09 and 3.95, respectively). Videos uploaded by physicians had the greatest mean PDSS score (7.5).
Conclusions
The overall transparency and reliability (JAMA benchmark score) and content quality (PDSS) of YouTube videos on patellar dislocation are poor. Additionally, the overall educational and video quality, as assessed by the GQS, was intermediate.
Clinical Relevance
It is important to understand the quality of information patients receive on YouTube so that providers can guide patients toward higher-quality sources.
As technology has become more accessible, patients have turned to the internet as a resource for health information. One study of internet use in an outpatient elective spinal surgery patient population found that of the patients with internet access, 30% used the internet to research their condition.
Another study found that 49% of outpatient orthopaedic patients search their condition online before their appointment with a physician, and 42% search the internet after their visit.
YouTube has become a popular site for videos containing health-related information. However, the quality of this information is sometimes questionable. Because YouTube currently has no process for guaranteeing the accuracy of posted content, many patients may encounter unreliable information and become misinformed about their conditions. Previous studies examining the quality of YouTube information on orthopaedic conditions and treatments, including shoulder instability, hip arthritis, and cervical fusion, found that quality and reliability were low.
The educational quality and accuracy of information on patellar dislocations and subluxations, however, have not been evaluated. A 2020 study of internet usage found that 77% of U.S. internet users aged 15 to 25 years accessed YouTube, making this age group one of YouTube’s largest audiences.
Patellar subluxations are differentiated from dislocations: subluxations can be symptomatic and occur when the patella shifts partially, rather than completely, out of the trochlear groove.
Because this patient population also comprises those most often accessing YouTube, it is essential that the online content they encounter concerning patellar dislocations be of high quality. Patients using YouTube to research their condition may be exposed to lower-quality information, which can affect outcomes. The purpose of this study was to evaluate the content and quality of YouTube videos concerning patellar dislocations. We hypothesized that most of the videos evaluated would be of low quality.
Methods
This project did not require review by the institutional review board.
YouTube Search and Video Characteristics
The terms “patellar dislocation” and “kneecap dislocation” were searched separately on the YouTube online library.
For each search term, the Uniform Resource Locators (URLs) of the first 25 suggested videos were extracted to Microsoft Excel (Microsoft, Redmond, WA), for a total of 50 videos. To maintain the desired sample size, additional videos were reviewed and included to replace any exclusions. YouTube was accessed through a Google Chrome browser in incognito mode so that previous search queries could not influence the search results.
Videos that required a sign-in to view, were longer than 60 minutes, were advertisements, or concerned knee dislocation rather than patellar dislocation were excluded.
The following variables were collected for each video: number of views, duration in minutes, video source/uploader, content type, days since upload, view ratio (views/day), and number of likes. Previous studies also collected the number of dislikes, like ratio (likes × 100/[likes + dislikes]), and video power index (like ratio × view ratio/100). However, this information was not collected in the present study because YouTube recently made the dislike counts of videos private, although viewers can still see and use the dislike button.
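The engagement metrics above reduce to simple arithmetic, which can be sketched as follows. This is an illustrative outline only; the function names and example values are hypothetical and do not come from the study data.

```python
def view_ratio(views, days_since_upload):
    """View ratio: views per day since upload."""
    return views / days_since_upload


def like_ratio(likes, dislikes):
    """Like ratio used in previous studies: likes * 100 / (likes + dislikes)."""
    return likes * 100 / (likes + dislikes)


def video_power_index(likes, dislikes, views, days_since_upload):
    """Video power index: like ratio * view ratio / 100."""
    return like_ratio(likes, dislikes) * view_ratio(views, days_since_upload) / 100


# Hypothetical video: 10,000 views over 400 days, 80 likes, 20 dislikes
print(video_power_index(80, 20, 10_000, 400))  # like ratio 80.0 * view ratio 25.0 / 100 = 20.0
```

Note that the video power index is undefined for videos with zero total reactions, one practical reason the dislike-dependent metrics could not be replicated once dislike counts became private.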
Video source/uploader was categorized as academic, physician, nonphysician, medical source, patient, commercial, and other. The definitions of these categories can be found in Table 1. Content type was categorized as disease-specific information, patient experience, surgical technique or approach, nonsurgical management, and other.
Table 1. Definitions of Video Source/Uploader Categories
Academic: Uploaders affiliated with research groups, colleges, or universities.
Physician: Individual physician or physician group not affiliated with an academic institution.
Nonphysician: Health professional other than a medical doctor, for example, a physical therapist or athletic trainer.
Medical source: Content or animations from health websites.
Patient: Patient uploader sharing information on their experience with patellar dislocation.
Commercial: For-profit entity not involved in direct patient care that positions itself as a source of health information.
Other: Uploaders not clearly fitting the other categories, for example, news or sports sources.
Two trained reviewers (B.S. and M.S.) graded videos based on the Journal of the American Medical Association (JAMA) benchmark score. Any disputes were resolved by a third, senior author (A.V.S.). The JAMA benchmark score was collected as a measure of video transparency and reliability. It consists of 4 items: authorship, attribution, currency (dates when content was posted and updated), and disclosure; 1 point is awarded for each item present in the video. A greater JAMA benchmark score represents greater video transparency and reliability.
The Global Quality Scale (GQS), which is a tool designed to evaluate online resources, was collected as a measure of the educational and video quality of videos.14,18 The score ranges from 1 to 5 and is based on the flow, ease of use, and overall quality of the video. Scores ranging from 1 to 2 are considered low quality, 3 is considered intermediate, and 4 to 5 represent high quality.
To better assess the video content quality specific to patellar dislocations, a Patellar Dislocation Specific Score (PDSS) was developed based on information published by the American Academy of Orthopaedic Surgeons.
This score is composed of 18 items divided into 5 domains: patient presentation, general patellar dislocation information, diagnosis and evaluation, treatment, and postoperative course. One point is awarded for each item addressed in a video, with a maximum score of 18; a greater score represents greater patellar-specific content quality. Quality scores were grouped into categories of Very Poor (0-3), Poor (4-7), Fair (8-11), Good (12-15), and Excellent (16-18). Although this tool is not validated, multiple peer-reviewed studies have used similar orthopaedic topic-based scoring systems (Table 2).
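As a minimal sketch of the grouping step, a hypothetical helper like the following maps a total PDSS score to the quality bins stated above (the function name is ours, not part of the study's methods):

```python
def pdss_category(score):
    """Map a total PDSS score (0-18) to the quality bins given in the text:
    Very Poor 0-3, Poor 4-7, Fair 8-11, Good 12-15, Excellent 16-18."""
    if not 0 <= score <= 18:
        raise ValueError("PDSS total must be between 0 and 18")
    for upper_bound, label in [(3, "Very Poor"), (7, "Poor"), (11, "Fair"),
                               (15, "Good"), (18, "Excellent")]:
        if score <= upper_bound:
            return label


print(pdss_category(6))   # "Poor" -- near the study's mean PDSS of 5.76
print(pdss_category(13))  # "Good" -- the highest total PDSS observed
```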
To determine whether videos met their educational goals overall, videos also were evaluated based on 2 questions from the DISCERN score, a grading tool designed to assess the quality of online information.
These questions (and potential answers) were as follows: (1) Does the video state its aims/purpose? (2) Does the video achieve its aims/purpose? (Yes—clearly states aims, Partially—alludes to aims but does not clearly state them, No—no mention of video aims/purpose).
Statistical Analysis
Descriptive statistics were used to summarize video characteristics, many of which were highly right-skewed. For each of the 3 aforementioned quality measures, summary statistics were calculated overall and broken out by video source and content type. Separate one-way analysis of variance models were used to analyze differences in these scores across video sources and across content types. For measures in which the overall model was significant, relevant pairwise differences were calculated and assessed for significance. In addition, for each of the 3 quality measures, a multiple linear regression model was used to examine the relationship between that quality measure and statistically important video characteristics, after adjusting for the effects of video source and content type. Diagnostic measures and residual analysis were used to check model assumptions in each case. Throughout the study, a P value of less than .05 was considered statistically significant. All analyses were completed in R, version 4.1.2 (R Foundation for Statistical Computing, Vienna, Austria).
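The study's analyses were run in R, but the core of the one-way ANOVA step (comparing mean quality scores across groups such as uploader categories) can be illustrated with a self-contained sketch. The grouped scores below are made-up values, not study data.

```python
def one_way_anova_f(groups):
    """Return the F statistic for a one-way ANOVA over lists of scores.

    F = (between-group mean square) / (within-group mean square);
    an F well above 1 suggests group means differ more than chance alone predicts.
    """
    all_scores = [score for group in groups for score in group]
    grand_mean = sum(all_scores) / len(all_scores)
    group_means = [sum(g) / len(g) for g in groups]

    # Between-group sum of squares, df = k - 1 (k groups)
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, group_means))
    df_between = len(groups) - 1

    # Within-group sum of squares, df = N - k (N total observations)
    ss_within = sum((score - m) ** 2
                    for g, m in zip(groups, group_means) for score in g)
    df_within = len(all_scores) - len(groups)

    return (ss_between / df_between) / (ss_within / df_within)


# Hypothetical GQS scores (1-5) for three uploader groups
print(one_way_anova_f([[4, 5, 4], [3, 3, 4], [1, 2, 2]]))
```

In the study itself, significant overall F tests were followed by pairwise comparisons, and the P value for F is obtained from the F distribution with (k − 1, N − k) degrees of freedom.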
Results
Of the initial 50 videos generated from the search, 5 were excluded. Thus, 5 additional videos were evaluated. The overall median video length was 4.11 minutes (interquartile range [IQR] 2.07-6.03, range 0.31-53.56). The median number of views was 14,413 (IQR 3,287-38,724, range 333-1,225,334), and the total number of views for all 50 videos was 3,697,587 views. The median view ratio was 11.20 views/day (IQR 3.35-41.49, range 0.24-1,324.69). The median number of video likes was 144 (IQR 28-398, range 1-5,400).
The mean scores ± standard deviation were: JAMA benchmark 2.56 ± 0.64, GQS 3.54 ± 1.05, and total PDSS 5.76 ± 3.42 (poor). PDSS scores ranged from 0 to 13 (very poor to good). The most common PDSS scores were between 6 and 8 (n = 15) or between 0 and 2 (n = 12) (Fig 1). Physicians were the most common video source/uploader (42%), and nonphysician health professionals were second most common (22%). The least common sources were “other” (2%; a single rugby sports channel) and commercial (4%). The most common content type was disease-specific information (40%), and the least common was surgical technique (6%).
Fig 1. Histogram of total PDSS score. (PDSS, Patellar Dislocation Specific Score.)
Video Transparency and Reliability: JAMA Benchmark Score
Evaluating the videos by source/uploader, we found that academic sources had the greatest mean JAMA benchmark score (3.20), although there were only 5 academic sources. Physician uploaders had the second greatest mean JAMA benchmark score (2.67). The “other” uploader had the lowest JAMA score (1), and patient uploaders had the second lowest score (1.75) (Table 3). However, it is worth noting that these categories contained few videos (other: n = 1; patient: n = 4). The mean JAMA benchmark score of academic sources was significantly greater than that of patient (P ≤ .001), nonphysician (P = .032), and medical sources (P = .041). The mean JAMA benchmark score of physician sources was significantly greater than that of patient sources (P = .004).
Table 3. Quality Scores by Video Source and Content Type
Uploader/Source (n)                  Mean JAMA Benchmark   Mean GQS   Mean PDSS
Academic (5)                         3.20                  3.00       6.40
Physician (21)                       2.67                  3.95       7.50
Nonphysician (11)                    2.55                  4.09       3.90
Medical source (6)                   2.50                  3.00       5.50
Patient (4)                          1.75                  2.00       3.50
Commercial (2)                       2.50                  3.50       4.00
Other (1)                            1.00                  1.00       0.00
Content Type (n)
Disease-specific information (20)    2.75                  3.80       8.60
Patient experience (9)               2.11                  2.33       3.80
Surgical technique (3)               3.00                  4.00       7.30
Nonsurgical management (12)          2.42                  3.83       3.10
Other (6)                            2.67                  3.67       3.80
GQS, Global Quality Scale; JAMA, Journal of the American Medical Association; PDSS, Patellar Dislocation Specific Score.
Educational and Video Quality: GQS
Nonphysician (4.09) and physician sources (3.95) had the greatest mean GQS scores, whereas other (1.00) and patient sources (2.00) had the lowest (Table 3). The mean GQS scores of both nonphysician and physician sources were significantly greater than that of patient sources (P = .0001). The mean GQS score of commercial sources (3.50) also was significantly greater than that of patient sources (P = .039). The mean GQS score of videos that clearly stated their aims was 4.00, whereas videos that did not state aims had a mean GQS score of 2.67 (P = .034).
Video Content Quality: PDSS
The greatest mean PDSS scores were found in videos uploaded by physician (7.5) and academic (6.4) sources, whereas other (0) and patient sources (3.5) had the lowest PDSS scores. The mean PDSS score of videos from physician sources was significantly greater than that of nonphysician (P = .003) and patient sources (P = .021). By video content type, videos on disease-specific information (8.6) and surgical techniques (7.3) had the greatest mean PDSS scores, whereas nonsurgical management (3.1), patient experience (3.8), and other (3.8) content were lowest (Table 3). The mean PDSS score of videos focusing on disease-specific information was significantly greater than that of nonsurgical management (P < .001) and patient experience (P < .001). In addition, video duration was significantly associated with total PDSS (P = .002) after adjusting for video source and content type; for each additional minute of video length, the estimated mean PDSS score increased by 0.149 points.
Discussion
The main findings of this study are that, overall, the transparency/reliability and content quality of YouTube videos on patellar dislocation, measured by the JAMA benchmark score and PDSS, respectively, are poor. In addition, the overall educational and video quality, as assessed by the GQS, was intermediate. These results agree with our hypothesis and with previously reported findings from YouTube studies of other orthopaedic topics, including anterior cruciate ligament (ACL), posterior cruciate ligament, and femoroacetabular impingement (FAI).
By conducting our search with an incognito browser, we believe our sample of 50 videos is a fair representation of what users may see when searching about patellar dislocations.
The mechanisms of injury that typically lead to patellar dislocations are noncontact twisting injuries or direct impact to the medial aspect of the knee.
Patients with generalized ligament laxity, which is more common in female patients, are at an increased risk as well, but these patients usually have recurrent subluxations instead of dislocations.
Dislocations usually occur in the lateral direction because the line of pull of the quadriceps muscle is lateral to the mechanical axis of the leg. Medial dislocations are far rarer and usually result from congenital conditions, quadriceps atrophy, or iatrogenic causes.
The medial patellofemoral ligament is usually torn with patellar dislocations, as it provides static resistance to lateral patellar instability in the first 20° of knee flexion.
Interestingly, YouTube videos on patellar dislocation appear to be less popular than those on other orthopaedic topics. The total number of views of the 50 patellar dislocation videos evaluated was 3,697,587, whereas a study of the quality of 50 meniscus YouTube videos garnered a total of 14,141,285 views, substantially more than the patellar dislocation videos. This discrepancy in the number of views could be explained by differences in injury rates; for example, ACL injuries account for 50% of all knee injuries. Because the rate of ACL injuries is greater than that of patellar dislocations, it makes sense that videos pertaining to the ACL have more views than the narrower search topic of patellar dislocation. Regardless of the popularity of videos on patellar dislocation, it remains a common injury, especially in young athletes. Because young patients are frequent YouTube users and may be easily influenced by the information they view online, it is important to evaluate the quality of information found on YouTube on this topic.
Evaluating videos by uploader/source, we found that physician and academic uploaders had the greatest quality. An example of a physician upload would be a video from a private-practice physician not involved in academia, whereas academic uploaders may also be physicians but are affiliated with research groups, colleges, or universities. Videos published by physician uploaders had the second greatest mean JAMA and GQS scores and the greatest mean PDSS. Academic uploaders had the greatest mean JAMA and second greatest mean PDSS. Excluding the single “other” source, videos uploaded by patients had the worst scores in all 3 quality domains: JAMA, GQS, and PDSS. Mean JAMA, GQS, and PDSS scores were significantly greater for physician-uploaded videos compared with patient sources. Researchers conducting a study on the quality of meniscus YouTube videos found similar results: physician uploaders had the greatest JAMA, GQS, and MSS scores (which measured meniscus education content), and patient-sourced videos had the lowest JAMA and MSS and the second lowest GQS, similar to the present study. Another study assessing the quality of YouTube videos on cervical fusion found that videos uploaded by physicians had the greatest mean JAMA and Cervical Fusion Content Score, a novel score (similar to the MSS and PDSS) developed by the authors to assess content quality specific to cervical fusion.
These results highlight that YouTube videos uploaded by academic or physician sources often have the greatest content quality and reliability, especially compared with patient-published videos. Physicians and academic institutions are likely uploading evidence-based information, whereas patients are uploading more anecdotal information based on personal experience. Nonphysicians had a slightly greater mean GQS score than physicians (4.09 vs 3.95), but this difference was not significant. Because both groups comprise medical professionals, similar content quality can be expected.
Although similar methods for evaluating the educational quality of YouTube videos have been used in peer-reviewed studies,
the finding that the patellar dislocation content quality on YouTube is low overall, as measured by the PDSS, should be interpreted cautiously. The PDSS is limited because it includes information on all aspects of patellar dislocations from patient presentation to the postoperative course. Thus, one video whose scope is limited may score poorly on the PDSS even though the content may be accurate and reliable. For this reason, the 2 DISCERN questions regarding aims were included in the data collection as another measure of video content quality. Of the 50 videos assessed, 76% stated their aims (38 of 50). Previous YouTube studies have employed modified DISCERN tools to score quality, though they have not included questions about aims.
Interestingly, the mean GQS of videos that clearly stated aims was significantly greater than that of videos that did not, suggesting that assessing aims may be an additional way to measure educational and video quality, especially for videos with limited scope that may otherwise score poorly on a more comprehensive tool such as the PDSS. Nine of the 50 videos (18%) had GQS scores of 1 or 2, placing them in the “low quality” range with regard to video flow, ease of use, and overall quality.
Further highlighting the relationship between PDSS scores and video scope, video duration significantly predicted total PDSS, indicating that longer videos have greater PDSS scores. By the nature of the PDSS scale, this makes sense: a longer video has more time to address more of the patellar dislocation information included in the scoring tool. A study of YouTube quality for FAI found that videos with greater quality scores for FAI diagnosis and treatment were significantly longer than videos with lower-quality, less useful information (P < .001).
This finding reflects that more time is often necessary to fully explain a topic in a YouTube video, conferring greater quality. The tradeoff, however, is that patients may not have the time or patience to watch longer videos, as reflected in the lower view counts of the longer, higher-quality videos on FAI.
This study was not without limitations. In addition to the limitations of the PDSS tool previously discussed, the YouTube video search was conducted on a single day; someone searching on a different day or using different search terms may encounter different suggested videos. Although a seemingly unlimited number of videos is suggested for each search term on YouTube, only 50 videos were evaluated, so the results reflect the content of a subset of patellar dislocation YouTube videos. Also, using the same scoring systems for videos uploaded by physician/academic sources and commercial/patient sources can result in heterogeneous comparisons.
Conclusions
The overall transparency and reliability (JAMA benchmark score) and content quality (PDSS) of YouTube videos on patellar dislocation are poor. In addition, the overall educational and video quality, as assessed by the GQS, was intermediate.
The authors report that they have no conflicts of interest in the authorship and publication of this article. Full ICMJE author disclosure forms are available for this article online, as supplementary material.