From quality to clarity: evaluating the effectiveness of online information related to septic arthritis

Abstract

Background

The aim of this study was to assess the content, readability, and quality of online resources on septic arthritis, a crucial orthopedic condition necessitating immediate diagnosis and treatment to avert serious complications, with a particular focus on the relevance to individuals from the general public.

Methods

Two search terms (“septic arthritis” and “joint infection”) were entered into three Internet search engines (Google, Yahoo, and Bing), and the top 20 results from each search engine were evaluated, for a total of 60 websites. The websites were categorized by type, and their content and quality were assessed using the DISCERN score, the Journal of the American Medical Association (JAMA) benchmark, the Global Quality Score (GQS), and the Information Value Score (IVS). Readability was assessed using the Flesch-Kincaid Grade Level (FKGL) and the Flesch Reading Ease Score (FKRS). The presence or absence of the Health On the Net (HON) code was evaluated for each website.

Results

The DISCERN, JAMA, GQS, FKGL, and IVS scores of the academic category were substantially greater than those of the physician, medical, and commercial categories. At the same time, however, the high FKGL scores of academic sites indicate that they demand a higher reading level. Websites with a HON code had significantly higher average FKGL, DISCERN, JAMA, GQS, and IVS scores, but lower FKRS scores, than those without.

Conclusion

The quality of websites giving information on septic arthritis was variable and not optimal. Although the content of the academic group was of higher quality, it could be difficult to understand. One of the key responsibilities of healthcare professionals should be to provide high-quality, comprehensible information concerning joint infections on reputable academic platforms, thereby helping patients attain a fundamental level of health literacy.

Introduction

Septic arthritis is a critical orthopedic condition requiring prompt diagnosis and treatment to prevent severe complications [1, 2]. This condition is usually characterized by a bacterial infection that affects the joints, leading to inflammation, pain, and stiffness [3]. In the general population, the annual incidence of septic arthritis is estimated to be between 2 and 10 per 100,000 people [4]. Septic arthritis can occur in any joint in the body, but it most commonly affects the knee, hip, and shoulder [5]. Patients at higher risk of developing this condition include those with compromised immune systems, preexisting joint disease, or recent joint surgery [6, 7]. If left untreated, septic arthritis can cause significant damage to the joint and surrounding tissues, resulting in long-term disability and chronic pain. In advanced cases, patients may require joint replacement surgery, which can be costly and associated with a long recovery time.

Patients now have increased access to information due to the widespread use of the Internet as a source of data [8, 9]. In order to learn more about the diagnosis, management, and treatment of septic arthritis, patients, families, and healthcare professionals are turning more frequently to the Internet.

The accessibility and convenience of the Internet have changed the traditional dynamic of the doctor-patient relationship. Patients today have access to an unprecedented amount of medical information and are increasingly involved in the decision-making process regarding their care [10, 11]. While this has many positive implications, it can also lead to potential pitfalls in the diagnosis and treatment of septic arthritis. In some cases, self-diagnosis or reliance on unverified information can lead to a delay in seeking proper medical attention. This delay can have serious consequences for patients with septic arthritis, as timely diagnosis and treatment are essential for good outcomes. Moreover, the Internet is rife with inaccurate and misleading information, which can negatively impact patient psychology and cause further complications [12].

Therefore, it is crucial for patients and healthcare providers to critically evaluate the reliability and quality of the information found on the Internet. This involves assessing the source of the information, seeking evidence-based materials from reputable sources, and being wary of misleading information. By taking these measures, patients and healthcare providers can make informed decisions about the management of septic arthritis, ultimately leading to better patient outcomes. Given the importance of accurate and reliable information for treating septic arthritis, there is a need to evaluate the validity and usefulness of the material available on the Internet. Limited research exists about online patient resources about septic arthritis. The purpose of this research was, thus, to assess the content, readability, and quality of online resources about septic arthritis. By addressing these issues, this study aims to enhance our understanding of the role played by online resources in supporting patient education, self-management, and decision-making regarding septic arthritis. Ultimately, it seeks to contribute to the improvement of patient outcomes by promoting the critical evaluation of online information, fostering health literacy, and empowering individuals to make informed decisions about their health care.

Materials and methods

This investigation followed the methodology described in a recent paper by Agar et al. [13] to ensure scientific reliability. The websites were chosen using search terms that patients are most likely to use when seeking information about the condition: “septic arthritis” and “joint infection.” As of February 2023, Google had an 80% market share, followed by Bing (15%) and Yahoo! (3%) [14]. Both search terms were entered into these three popular search engines, resulting in six distinct searches (2 search terms × 3 search engines). The top 10 results from each search were recorded, for a total of 60 websites. All searches were performed on the same day, February 19, 2023, and all browser cookies were deleted before searching. Websites were tracked separately by keyword and search engine. When a duplicate website was identified, it was attributed to the search term and engine combination that returned it at the highest rank (i.e., earliest). Video content was excluded from the study. Two orthopedic and traumatology specialists independently assessed each website's content and evaluation scores. Before analysis, a third orthopedic and traumatology specialist resolved any scoring or content disagreements.

The websites were classified into four distinct categories: academic, physician, medical (health-related websites), and commercial. Specific criteria, encompassing content, domain, source, and publisher, were established to define these categories. Academic resources were identified as websites affiliated with recognized educational institutions, research organizations, or scholarly journals, focusing on disseminating peer-reviewed scientific knowledge.
Physician resources referred to websites associated with healthcare professionals, such as individual practitioners, medical clinics, or hospitals, which provide medical information, treatment guidelines, or clinical expertise. Medical resources encompassed health-related websites that aim to educate patients, provide general medical information, or address health concerns. Commercial resources were classified as websites associated with pharmaceutical companies, medical device manufacturers, or healthcare-related businesses that may offer products, services, or advertisements.

Methods of assessment

All relevant websites were examined using the DISCERN tool [15] for the reliability of written health information, the Journal of the American Medical Association (JAMA) benchmark, the Global Quality Score (GQS), the Flesch-Kincaid readability tests (FK), the presence or absence of the Health On the Net (HON) Foundation seal, and the Information Value Score (IVS).

The DISCERN questionnaire provides consumers with a widely accepted measure of the quality of written health information. The instrument was validated in 1998 as part of a nationwide effort in Britain to set minimum standards for the quality of written information on treatment options made available to the public by nonprofits, the National Health Service, self-help groups, the pharmaceutical sector, and other industries. The DISCERN instrument consists of 16 questions, each rated on a 5-point scale: a score of 1 indicates that a criterion is not met at all, 2–4 that it is met to some extent, and 5 that it is met completely. Because each of the 16 questions is scored from 1 to 5, the lowest and highest possible total scores are 16 and 80, respectively. Websites were ranked as “excellent” (63–80), “good” (52–61), “medium” (39–50), “poor” (28–38), or “very poor” (16–27) based on their aggregated scores [15].
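As an illustration, the totalling and banding described above can be sketched as a short tally. This is a hypothetical helper for exposition, not part of the DISCERN instrument; totals falling between the published band ranges are assigned to the band whose lower cutoff they exceed.

```python
# Hypothetical sketch of DISCERN totalling and banding as described above.
BANDS = [(63, "excellent"), (52, "good"), (39, "medium"), (28, "poor"), (16, "very poor")]

def discern_band(item_scores):
    """Sum 16 item scores (1-5 each) and map the total to a quality band."""
    assert len(item_scores) == 16 and all(1 <= s <= 5 for s in item_scores)
    total = sum(item_scores)
    for cutoff, label in BANDS:
        if total >= cutoff:
            return total, label
```

For example, a site rated 3 on every item totals 48 and falls in the “medium” band.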

Four factors are used in the JAMA benchmark evaluation of the quality of online content: authorship, attribution, disclosure, and currency [16]. Authorship requires that a website provide the names, affiliations, and credentials of all authors and contributors. Attribution requires that references and sources used in creating content be acknowledged and that copyright information be included. Disclosure requires that website ownership information, including any commercial, financial, or other potential conflicts of interest, be made available to the public. Finally, currency requires that the website's material be dated at the original upload and at each subsequent revision. Scores ranged from 0 (lowest quality) to 4 (highest quality) according to how many criteria were met.

In addition, the Global Quality Score (GQS), a 5-point scale that rates the overall quality of a website, was determined for each website included in our analysis [17]. The rating reflected both the quality of the information provided and its potential benefit to patients.

The Flesch-Kincaid Grade Level (FKGL) is used to determine the level of education necessary to comprehend a given text; the score corresponds approximately to US school grade levels, with higher scores indicating text that requires more years of education to understand. Readability was also measured with the Flesch-Kincaid Reading Ease (FKRS) test, which assigns a text a score between 0 and 100: scores at or near 100 indicate a document that is easy to read, while scores at or near 0 indicate a document that is difficult to comprehend. To obtain the FK scores, the text of each website was copied into a Microsoft Word (Redmond, Washington) document, a method used in prior studies [18, 19].
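Both measures are computed from average sentence length and syllable counts using the standard published Flesch-Kincaid formulas. The sketch below uses a naive vowel-group syllable heuristic; Word and dedicated readability tools count syllables more carefully, so exact values will differ.

```python
import re

def count_syllables(word):
    # Naive heuristic: count vowel groups, dropping a trailing silent "e".
    groups = re.findall(r"[aeiouy]+", word.lower())
    n = len(groups)
    if word.lower().endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def flesch_scores(text):
    """Return (FKGL, FKRS) using the standard Flesch-Kincaid formulas."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z]+", text)
    wps = len(words) / len(sentences)                           # words per sentence
    spw = sum(count_syllables(w) for w in words) / len(words)   # syllables per word
    fkgl = 0.39 * wps + 11.8 * spw - 15.59
    fkrs = 206.835 - 1.015 * wps - 84.6 * spw
    return fkgl, fkrs
```

Short sentences of one-syllable words score near the top of the FKRS scale, while long, polysyllabic clinical prose drives the FKGL up and the FKRS down.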

We also investigated whether websites complied with the Health On the Net code (HON code). The recommendations of the HON Foundation are the most frequently used credibility standard for online medical information. The foundation, a non-profit organization based in Switzerland, was established in 1995 with the goal of raising the standard of health information on the Internet. It offers a code of conduct for websites that acknowledge its principles and adhere to its standards; compliance is audited at random to verify that the code is being respected [20].

We also developed our own scoring system based on the information value of the data presented on the sites, to make our assessment comparable to previous studies [21, 22]. The Information Value Score (IVS) is a 100-point rating scale comprising four major components: disease summary (maximum 40 points), diagnosis and treatment (maximum 20 points), pathogenesis and risk factors (maximum 20 points), and complications (maximum 20 points). In the disease summary, mention of the urgency of the condition earned 20 points, and mention of pain, swelling, redness, and limited mobility earned 5 points each, for a total of 40 points. In diagnosis and treatment, mention of joint fluid analysis, blood laboratory analysis, joint debridement, and antibiotics earned 5 points each, for a total of 20 points. Pathogenesis and risk factors totaled 20 points: mention of bacteria and of other pathogens earned 5 points each (10 points), and each of the following risk factors earned 2 points (10 points): suspected sexual activity, immunosuppressive disorders, tick bite, recent joint surgery, and joint trauma. Finally, each of the following complications earned 5 points, for a total of 20 points: cartilage damage, joint dysfunction, osteomyelitis, and septicemia.
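Because the IVS is a simple additive checklist, the rubric above can be captured as an item-to-points table. The item names below are our shorthand for illustration, not a published coding scheme.

```python
# Hypothetical encoding of the IVS rubric described above (max 100 points).
IVS_ITEMS = {
    # Disease summary (40)
    "urgency": 20, "pain": 5, "swelling": 5, "redness": 5, "limited mobility": 5,
    # Diagnosis and treatment (20)
    "joint fluid analysis": 5, "blood laboratory analysis": 5,
    "joint debridement": 5, "antibiotics": 5,
    # Pathogenesis and risk factors (20)
    "bacteria": 5, "other pathogens": 5,
    "sexual activity": 2, "immunosuppression": 2, "tick bite": 2,
    "recent joint surgery": 2, "joint trauma": 2,
    # Complications (20)
    "cartilage damage": 5, "joint dysfunction": 5,
    "osteomyelitis": 5, "septicemia": 5,
}

def ivs(mentioned_items):
    """Sum the points for rubric items a website mentions."""
    return sum(pts for item, pts in IVS_ITEMS.items() if item in mentioned_items)
```

A website mentioning every item scores the full 100 points; one mentioning only pain and antibiotics scores 10.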

Statistics

The study findings were analyzed using IBM SPSS Statistics 22 (IBM, Turkey). The Kolmogorov–Smirnov and Shapiro–Wilk tests were used to assess conformity to the normal distribution. In addition to descriptive statistics (mean, standard deviation, median, and frequency), the Kruskal–Wallis test was used to compare categories, with Dunn's test used to identify which groups accounted for significant differences. Non-normally distributed variables were compared between two groups using the Mann–Whitney U test, and Spearman's rho correlation analysis was performed to analyze correlations between non-normally distributed variables. To establish the degree of agreement among observers, the intra-class correlation coefficient, together with its minimum and maximum values, was determined. The significance threshold was set at p < 0.05.
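As a reminder of what Spearman's rho measures here, the sketch below computes it from rank differences for tie-free samples; statistical packages such as SPSS additionally handle tied ranks and report p values.

```python
def spearman_rho(x, y):
    """Spearman rank correlation, rho = 1 - 6*sum(d^2) / (n*(n^2 - 1)), assuming no ties."""
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))
```

Perfectly concordant rankings give rho = 1 and perfectly discordant rankings give rho = -1; the quality scores in the Results section correlate in the 0.8–0.9 range.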

Results

Initially, 60 websites were categorized based on their sources: 56.7% were academic, 20% were physician, 11.7% were medical, and 11.7% were commercial (Fig. 1).

Fig. 1 Website distribution according to sources

The mean DISCERN score was 55.02 ± 18.62, the mean JAMA benchmark score was 3.18 ± 1.02, and the mean FKRS and FKGL scores were 37.6 ± 18.75 and 10.06 ± 2.06, respectively (Table 1).

Table 1 Minimum, maximum, mean, standard deviation and median values of scores

The study revealed that academic websites exhibited significantly higher average scores across multiple evaluation metrics compared to physician, medical, and commercial websites. Specifically, academic websites demonstrated superior performance in DISCERN (69.51 ± 7.19), GQS (4.85 ± 0.31), JAMA (3.93 ± 0.18), FKGL (11.35 ± 0.9), and IVS (96.47 ± 4.06). Conversely, commercial websites exhibited the lowest average scores in DISCERN (24.5 ± 3.52), GQS (1 ± 0), JAMA (1 ± 0), FKGL (6.51 ± 1.45), and IVS (47.14 ± 16.71). Further post hoc analyses were conducted to identify the categories that contributed significantly to these findings (Table 2), and it was found that the academic category consistently outperformed the physician, medical, and commercial categories in terms of DISCERN, JAMA, GQS, FKGL, and IVS scores.

Table 2 Scores from each category

There was a strong positive correlation between the first and second observers' DISCERN, JAMA, and GQS scores (p < 0.05). The degree of correlation between the two observers was 0.988 for the DISCERN score, 0.915 for the JAMA score, and 0.952 for the GQS score (Table 3).

Table 3 Level of inter-observer JAMA, DISCERN, and GQS scores' compatibility

The DISCERN and JAMA scores showed a strong positive correlation of 0.877 (p < 0.05). Strong positive correlations were also found between the DISCERN and IVS scores (0.849), between the JAMA and IVS scores (0.865), and between the GQS and IVS scores (0.818) (p < 0.05) (Table 4).

Table 4 Correlation between scores

Moreover, a HON code was present on 76.7% of the websites. Websites with a HON code had significantly higher DISCERN scores than websites without one (p < 0.05). The JAMA, GQS, FKGL, and IVS scores of websites with a HON code were also significantly higher than those of websites without one (p < 0.05). However, websites with a HON code had significantly lower FKRS scores than websites without one (p < 0.05) (Table 5).

Table 5 Evaluation of scores dependent on the presence of the HON code

Discussion

The most important finding of the study was that the quality of the websites providing information on septic arthritis was variable and suboptimal. The second most important finding was that, although academic websites contain higher-quality information than others, they demand a high reading level to be easily understood. The information available to patients online should be easily readable, reliable, and of high quality, because the Internet is both an information resource and a potential health education tool for disease management. The patient and physician need to work together to treat this disease effectively and promptly, which is necessary to avoid complications [23]. In addition, several studies have demonstrated that patient education and appropriate information improve treatment outcomes [24, 25]. This is the first systematic evaluation of the quality, readability, and content of common websites addressing septic arthritis. Many studies have shown that the quality of Internet-based information for a range of medical conditions is low or suboptimal [19, 26, 27].

Despite the low or suboptimal quality of the information available online, patients continue to depend on online resources. In this context, patients' access to the Internet can increase resistance to their doctors' advice and strain the collaborative decision-making process between doctors and self-informed patients, which may in turn delay the treatment of septic arthritis. It is therefore crucial to review Internet resources and help patients locate high-quality, complete, and easily accessible websites, as low-quality information may negatively impact the relationship between patients and their doctors.

Anyone can construct a website without the appropriate experience or education. From the patient's perspective, it is challenging to evaluate which information is reliable and accurate. Managing patients who read websites offering inaccurate or low-quality information is one of the most serious difficulties facing medicine today, since it can change the dynamic between patient and doctor when patients arrive with different expectations for diagnosis and treatment. Physicians must therefore be aware of the information available to patients and of what patients are likely to find when researching a given issue online.

In this study, the academic group's DISCERN, JAMA, GQS, and IVS scores were all considerably higher than those of the other categories. Consistent with previous studies [10, 28], we found that information from academic sources was of the highest quality and most relevant. In contrast, some research has demonstrated no relationship between website category and quality scores [13]. These results demonstrate that the quality and content of material available on the Internet, including academic material, can vary.

The mean DISCERN score for the sample was 55.02 ± 18.62 (possible range 16–80). This finding is consistent with other studies that have found the quality of information provided on websites to be moderate [28, 29]. Key et al., on the other hand, reported low quality scores [30]; this may be because the proportion of academic sites in our study is greater than in theirs. Moreover, websites in the academic group exhibited higher DISCERN scores than the other websites, suggesting that academic websites offer more reliable, higher-quality information as measured by DISCERN. Overall, the DISCERN scores yielded a moderate rating, which suggests that the websites are a valuable source of information but could be further improved with additional content.

The mean JAMA benchmark score was 3.18 ± 1.02 (possible range 0–4), a relatively good result, similar to those of other studies [26, 29]. Including the names and qualifications of authors is a crucial aspect of improving the credibility of a website and instilling confidence in patients. Both patients and healthcare professionals should exercise caution when considering online information, particularly where authorship is unclear and timeliness is uncertain. Although these scores were lower in some other studies, this may be explained by the higher proportion of academic content in our study [10, 19]. We believe the low JAMA scores among the non-academic groups reflect the fact that the majority of those websites lack citations or sources. In addition, we observed a positive correlation between the JAMA benchmark criteria and the FKGL and IVS scores (p < 0.05) (Fig. 2).

Fig. 2 Relationship of JAMA score to other scores

This correlation suggests that websites that include authorship information and attribution tend to offer higher-quality content.

The mean IVS was 83.18 ± 19.8, a content score consistent with previous studies [31, 32]. Academic content with high IVS scores was likely written by experts on the subject. Nevertheless, not all studies have shown the same results [33, 34]; these are relatively old studies, and online information may have improved since then. In addition, because we created this scoring system ourselves, our results may not be directly comparable with those of previous studies.

The mean FKGL and FKRS scores in this study were 10.06 ± 2.06 and 37.6 ± 18.75, respectively. Web pages providing information about septic arthritis had readability ratings similar to those examining arthroplasty, the foot and ankle, the hand, cancer, and the spine [35,36,37,38]. According to these findings, the FKRS score was at the “difficult to read” level, while the FKGL score was almost 4 grade levels above the sixth-grade reading level recommended by the American Medical Association (AMA) and the National Institutes of Health (NIH). This indicates that a patient seeking information on septic arthritis online would need English reading skills at or above the high school level, which makes it more difficult for many individuals to access the information they need.

We also found a positive, statistically significant correlation between GQS and IVS across all sources examined, and notably between FKGL, DISCERN, and JAMA. For septic arthritis, this suggests that the reading level of a text and its quality are associated.

The majority of websites (76.7%) in this survey had a HON code. Websites with a HON code had significantly higher DISCERN, JAMA, GQS, and IVS evaluation score values compared to those without a HON code (p < 0.05). The findings of our research demonstrate that the HON code certification is a dependable indicator for identifying websites with superior quality and content scores in the context of Internet searches related to septic arthritis. These results are consistent with previous research examining the caliber of various healthcare subjects [39,40,41]. The aforementioned information is reassuring, and in our capacity as medical professionals, we can advise our patients to seek out this quality assurance marker during their online searches. However, while our study identified that websites with the HON code had significantly higher DISCERN, JAMA, GQS, and IVS scores, it is notable that these sites exhibited lower FKRS scores. This apparent contradiction could be attributed to several factors that deserve exploration. One possible explanation is that websites adhering to the HON code might feel compelled to maintain a certain level of technicality to uphold their credibility, inadvertently resulting in content that is difficult for laypeople to grasp. This trade-off between maintaining clinical accuracy and ensuring readability could contribute to the lower FKRS scores observed in HON-certified websites. Therefore, while the HON code contributes to the trustworthiness of the information, it might not necessarily address the challenge of making the content comprehensible to a wider audience.

Despite having high-quality content, academic websites had the highest degree of reading difficulty among all websites examined. The readability scores of the academic group showed a consistent trend of source material pitched at a reading level far beyond what patients can easily access. One possible consequence is that readers who cannot comprehend the academic content may turn to websites with lower-quality content.

Although our findings highlight gaps in online patient education resources about septic arthritis, they also present an opportunity for improvement. The Agency for Healthcare Research and Quality (AHRQ) has published recommendations to improve health literacy and recommends taking "universal precautions" when disseminating health information: the healthcare provider should presume that everyone may have difficulty with comprehension and should communicate in a manner that is universally understood [42]. Several institutions have published guidelines on how to accomplish this, including the use of simple words and phrases, careful language at a lower reading level, consistent terminology, and the avoidance of excessive medical jargon [43].

To enhance the quality and accessibility of patient education resources, it is crucial for healthcare providers to actively engage in guiding patients towards reliable and easily readable websites. One effective approach is the implementation of search engine optimization (SEO) strategies [44] by academic groups and healthcare organizations that aim to provide accurate and trustworthy information to patients at appropriate readability levels. By optimizing website content and structure, utilizing clear and concise language, and employing user-friendly interfaces, these organizations can ensure that their resources rank higher in search engine results and are more easily discovered by patients seeking information on specific medical conditions. Furthermore, healthcare providers can play an active role by compiling a list of reputable websites and resources, which they can provide to patients during consultations or make available on their personal websites. This proactive approach not only empowers patients with reliable information but also fosters a stronger patient-physician dynamic, reducing the likelihood of patients relying on inaccurate or misleading information about their conditions. By embracing these strategies, healthcare providers can contribute to an improved online landscape for patient education where accurate and easily accessible resources are readily available to those in need.

Healthcare providers should actively guide patients during consultations, recommending reputable online resources and emphasizing the importance of verifying information. Encouraging patients to question sources fosters skepticism and empowers them to make informed decisions. Healthcare providers must stay updated on digital health literacy through ongoing education. Training should focus on critical appraisal of online information and evidence-based practice in the digital age.

Future studies should assess the impact of patient education interventions, explore the correlation between patients' digital health literacy and comprehension, and investigate the role of social media in shaping patient understanding of septic arthritis.

Limitations

This study has a number of limitations. Because Internet resources are updated frequently, the results of our study may have changed since the initial search. The number of websites a user visits decreases significantly after the first page of search engine results [45]. Additionally, our research examined only written material found online; patients may also use audio-visual content to learn about their conditions, and this was not evaluated. Owing to the dynamic nature of the Internet, search engines and their ranking algorithms may frequently alter the results or their order, and search results may still differ from person to person even after cookies are removed. Furthermore, the quality of information reached through search engines other than the three most popular ones was not evaluated. Because the study relies on online data, the websites analyzed may change rapidly as sites are created or updated and their rankings shift. To the best of our knowledge, however, this is the first study of online patient information on septic arthritis in the literature.

Conclusion

We determined that, like previous findings on other orthopedic conditions, the quality of websites providing information on septic arthritis was variable. Some online resources, particularly academic ones, have higher-quality content than others, but they are difficult for the general public to read. This makes it harder for patients to find the information they are looking for about septic arthritis and can lead them to inaccurate websites. With advancements in technology and the growing popularity of the Internet as a source of health information, physicians should refer patients to reliable websites and encourage the creation of content that is beneficial to patients and presented at a reading level they can easily understand.

Availability of data and materials

The datasets analyzed during the current study are available from the corresponding author on reasonable request.

References

  1. Ross JJ. Septic arthritis of native joints. Infect Dis Clin North Am. 2017;31(2):203–18. https://doi.org/10.1016/j.idc.2017.01.001.

  2. Kerbel YE, Lieber AM, Kirchner GJ, Stump NN, Prodromo JP, Petrucelli PM, et al. In-hospital complications following arthrotomy versus arthroscopy for septic knee arthritis: a cohort-matched comparison. J Knee Surg. 2021;34(1):74–9. https://doi.org/10.1055/s-0039-1693450.

  3. Earwood JS, Walker TR, Sue GJC. Septic arthritis: diagnosis and treatment. Am Fam Physician. 2021;104(6):589–97.

  4. Clerc O, Prod’hom G, Greub G, Zanetti G, Senn L. Adult native septic arthritis: a review of 10 years of experience and lessons for empirical antibiotic therapy. J Antimicrob Chemother. 2011;66(5):1168–73. https://doi.org/10.1093/jac/dkr047.

  5. Kaandorp CJ, Dinant HJ, van de Laar MA, Moens HJ, Prins AP, Dijkmans BA. Incidence and sources of native and prosthetic joint infection: a community based prospective survey. Ann Rheum Dis. 1997;56(8):470–5. https://doi.org/10.1136/ard.56.8.470.

  6. Elsissy JG, Liu JN, Wilton PJ, Nwachuku I, Gowd AK, Amin NH. Bacterial septic arthritis of the adult native knee joint: a review. JBJS Rev. 2020;8(1):e0059. https://doi.org/10.2106/JBJS.RVW.19.00059.

  7. Mathews CJ, Weston VC, Jones A, Field M, Coakley G. Bacterial septic arthritis in adults. Lancet. 2010;375(9717):846–55. https://doi.org/10.1016/S0140-6736(09)61595-6.

  8. Trotter MI, Morgan DW. Patients’ use of the Internet for health related matters: a study of Internet usage in 2000 and 2006. Health Inform J. 2008;14(3):175–81. https://doi.org/10.1177/1081180X08092828.

  9. Gupte CM, Hassan AN, McDermott ID, Thomas RD. The internet–friend or foe? A questionnaire study of orthopaedic out-patients. Ann R Coll Surg Engl. 2002;84(3):187–92.

  10. Morr S, Shanti N, Carrer A, Kubeck J, Gerling MC. Quality of information concerning cervical disc herniation on the Internet. Spine J. 2010;10(4):350–4. https://doi.org/10.1016/j.spinee.2010.02.009.

  11. Stellefson M, Hanik B, Chaney B, Chaney D, Tennant B, Chavarria EA. eHealth literacy among college students: a systematic review with implications for eHealth education. J Med Internet Res. 2011;13(4):e102. https://doi.org/10.2196/jmir.1703.

  12. McMullan M. Patients using the Internet to obtain health information: how this affects the patient-health professional relationship. Patient Educ Couns. 2006;63(1–2):24–8. https://doi.org/10.1016/j.pec.2005.10.006.

  13. Agar A, Sahin A. Kyphosis-related information on the Internet: is the quality, content and readability sufficient for the patients? Glob Spine J. 2022;12(3):476–82. https://doi.org/10.1177/21925682211015955.

  14. The Top 11 Search Engines, Ranked by Popularity. 2023. https://blog.hubspot.com/marketing/top-search-engines.

  15. Charnock D, Shepperd S, Needham G, Gann R. DISCERN: an instrument for judging the quality of written consumer health information on treatment choices. J Epidemiol Community Health. 1999;53(2):105–11. https://doi.org/10.1136/jech.53.2.105.

  16. Silberg WM, Lundberg GD, Musacchio RA. Assessing, controlling, and assuring the quality of medical information on the Internet: Caveant lector et viewor—Let the reader and viewer beware. JAMA. 1997;277(15):1244–5.

  17. Bernard A, Langille M, Hughes S, Rose C, Leddin D, van Zanten SV. A systematic review of patient inflammatory bowel disease information resources on the World Wide Web. Am J Gastroenterol. 2007;102(9):2070–7. https://doi.org/10.1111/j.1572-0241.2007.01325.x.

  18. Kaya E, Görmez S. Quality and readability of online information on plantar fasciitis and calcaneal spur. Rheumatol Int. 2022;42(11):1965–72. https://doi.org/10.1007/s00296-022-05165-6.

  19. Noback PC, Trofa DP, Dziesinski LK, Trupia EP, Galle S, Rosenwasser MP. Kienböck disease: quality, accuracy, and readability of online information. Hand (N Y). 2020;15(4):563–72. https://doi.org/10.1177/1558944718813631.

  20. HON code of conduct for medical and health Web sites. Am J Health Syst Pharm. 2000;57(13):1283. https://doi.org/10.1093/ajhp/57.13.1283a.

  21. Plusch K, Carfagno J, Givner D, Fletcher D, Aita D, Gallant GG, et al. An evaluation of the source and content of Dupuytren’s disease information available on the internet. Cureus. 2021;13(11):e19356. https://doi.org/10.7759/cureus.19356.

  22. Tang K, Azhar U, Babar M, Ahmed A, Oh A, Day W, et al. Assessing the quality of Youtube videos on adhesive capsulitis. Cureus. 2022;14(7):e27406. https://doi.org/10.7759/cureus.27406.

  23. Nwosu CR, Cox BM. The impact of the Internet on the doctor-patient relationship. Health Inform J. 2000;6(3):156–61.

  24. Garcia GH, Taylor SA, Dy CJ, Christ A, Patel RM, Dines JS. Online resources for shoulder instability: what are patients reading? J Bone Joint Surg Am. 2014;96(20):e177. https://doi.org/10.2106/JBJS.M.01228.

  25. Majid N, Lee S, Plummer V. The effectiveness of orthopedic patient education in improving patient outcomes: a systematic review protocol. JBI Database System Rev Implement Rep. 2015;13(1):122–33. https://doi.org/10.11124/jbisrir-2015-1950.

  26. Hartnett DA, Philips AP, Daniels AH, Blankenhorn BD. Readability and quality of online information on total ankle arthroplasty. Foot (Edinb). 2023;54:101985. https://doi.org/10.1016/j.foot.2023.101985.

  27. Murphy B, Irwin S, Condon F, Kennedy C. Readability and quality of online information for patients pertaining to revision knee arthroplasty: an objective analysis. Surgeon. 2022;20(6):e366–70. https://doi.org/10.1016/j.surge.2021.12.009.

  28. Ghodasra JH, Wang D, Jayakar RG, Jensen AR, Yamaguchi KT, Hegde VV, et al. The assessment of quality, accuracy, and readability of online educational resources for platelet-rich plasma. Arthroscopy. 2018;34(1):272–8. https://doi.org/10.1016/j.arthro.2017.06.023.

  29. Shazil Jamal M, Hurley ET, Davey MS, Asad H, Gaafar M, Mullett H. Evaluation of the quality of information online for arthroscopic Bankart repair. Ann R Coll Surg Engl. 2022. https://doi.org/10.1308/rcsann.2022.0024.

  30. Key S, Yalın M, Erten M. Growing taller without hormones? Dr. Consult Google—an evaluation of online information related to limb lengthening. Healthcare (Basel). 2023;11(2):172. https://doi.org/10.3390/healthcare11020172.

  31. Özduran E, Hanci V. Evaluating the readability, quality and reliability of online information on Behçet’s disease. Reumatismo. 2022. https://doi.org/10.4081/reumatismo.2022.1495.

  32. White MD, Latour K, Giordano M, Taylor T, Agarwal N. Reliability and quality of online patient education videos for lateral lumbar interbody fusion. J Neurosurg Spine. 2020. https://doi.org/10.3171/2020.4.SPINE191539.

  33. Winship B, Grisell M, Yang CB, Chen RX, Bauer AS. The quality of pediatric orthopaedic information on the internet. J Pediatr Orthop. 2014;34(4):474–7. https://doi.org/10.1097/BPO.0000000000000125.

  34. Nassiri M, Bruce-Brand RA, O’Neill F, Chenouri S, Curtin P. Perthes disease: the quality and reliability of information on the Internet. J Pediatr Orthop. 2015;35(5):530–5. https://doi.org/10.1097/BPO.0000000000000312.

  35. Shah AK, Yi PH, Stein A. Readability of orthopaedic oncology-related patient education materials available on the internet. J Am Acad Orthop Surg. 2015;23(12):783–8. https://doi.org/10.5435/JAAOS-D-15-00324.

  36. Polishchuk DL, Hashem J, Sabharwal S. Readability of online patient education materials on adult reconstruction Web sites. J Arthroplasty. 2012;27(5):716–9. https://doi.org/10.1016/j.arth.2011.08.020.

  37. Badarudeen S, Sabharwal S. Assessing readability of patient education materials: current role in orthopaedics. Clin Orthop Relat Res. 2010;468(10):2572–80. https://doi.org/10.1007/s11999-010-1380-y.

  38. Badarudeen S, Sabharwal S. Readability of patient education materials from the American Academy of Orthopaedic Surgeons and Pediatric Orthopaedic Society of North America web sites. J Bone Joint Surg Am. 2008;90(1):199–204. https://doi.org/10.2106/JBJS.G.00347.

  39. Elhassan Y, Sheridan G, Nassiri M, Osman M, Kiely P, Noel J. Discectomy-related information on the internet: Does the quality follow the surge? Spine (Phila Pa 1976). 2015;40(2):121–5. https://doi.org/10.1097/BRS.0000000000000689.

  40. Nason GJ, Baker JF, Byrne DP, Noel J, Moore D, Kiely PJ. Scoliosis-specific information on the internet: has the “information highway” led to better information provision? Spine (Phila Pa 1976). 2012;37(21):E1364–9. https://doi.org/10.1097/BRS.0b013e31826619b5.

  41. Bruce-Brand RA, Baker JF, Byrne DP, Hogan NA, McCarthy T. Assessment of the quality and content of information on anterior cruciate ligament reconstruction on the internet. Arthroscopy. 2013;29(6):1095–100. https://doi.org/10.1016/j.arthro.2013.02.007.

  42. Kružliaková N, Porter K, Ray PA, Hedrick V, Brock DJ, Zoellner J. Understanding and advancing organizational health literacy within a public health setting. HLRP Health Literacy Res Pract. 2021;5(1):e35–48. https://doi.org/10.3928/24748307-20210114-01.

  43. Krempec J, Hall J, Biermann JS. Internet use by patients in orthopaedic surgery. Iowa Orthop J. 2003;23:80.

  44. Brouwer WP, Hollenbach M. Search engine optimization for scientific publications: How one can find your needle in the haystack. United Eur Gastroenterol J. 2022;10(8):906–8. https://doi.org/10.1002/ueg2.12311.

  45. Lawson KA, Codella S, Ciccotti MG, Kane PW, Duncan IC, Cohen SB. Evaluation of internet information about rotator cuff repair. Am J Orthop (Belle Mead NJ). 2016;45(3):E136–42.

Funding

No funding was received for conducting this study.

Author information

Contributions

FG and SBC contributed substantially to the research design or the acquisition, analysis, or interpretation of data. FG and SBC contributed to drafting the paper or revising it critically. FG and SBC are involved in the approval of the submitted and final versions. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Fatih Golgelioglu.

Ethics declarations

Ethics approval and consent to participate

This article does not contain any studies with human participants or animals performed by any of the authors.

Consent for publication

Not applicable.

Competing interests

The authors have no conflicts of interest to declare that are relevant to the content of this article.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Golgelioglu, F., Canbaz, S.B. From quality to clarity: evaluating the effectiveness of online ınformation related to septic arthritis. J Orthop Surg Res 18, 689 (2023). https://doi.org/10.1186/s13018-023-04181-x
