
Ahead of print publication  

Evaluating the readability, quality, and reliability of online information on Sjogren's syndrome

1 Department of Physical Medicine and Rehabilitation, Dokuz Eylul University, Izmir, Turkey
2 Department of Anesthesiology and Reanimation, Subdivision of Critical Care Medicine, Dokuz Eylul University, Izmir, Turkey

Date of Submission08-Mar-2022
Date of Acceptance21-Apr-2022
Date of Web Publication13-Jul-2022

Correspondence Address:
Erkan Ozduran,
Department of Physical Medicine and Rehabilitation/Pain Medicine, Dokuz Eylul University School of Medicine, University Hospital, Inciralti Mahallesi Mithatpasa Caddesi No: 1606 Balcova, Izmir

Source of Support: None, Conflict of Interest: None

DOI: 10.4103/injr.injr_56_22


Background: There are concerns over the reliability and comprehensibility of health-related information on the Internet. The goal of our research was to analyze the readability, reliability, and quality of information obtained from websites associated with Sjogren's syndrome (SS).
Methods: In this cross-sectional study, the term “Sjogren's Syndrome” was used to perform a Google search, and 75 eligible websites were identified on September 15, 2021. The Flesch Reading Ease Score (FRES), Flesch-Kincaid Grade Level (FKGL), and Gunning Fog (GFOG) formulas were used to evaluate the readability of the websites. The Journal of American Medical Association (JAMA) score was used to assess the websites' reliability, the DISCERN score and the Health on the Net Foundation code of conduct (HONcode) were used to assess their quality, and Alexa was used to analyze their popularity.
Results: The results revealed that the mean FRES was 42.25 ± 17.11 (difficult), and the mean GFOG was 14.80 ± 3.56 years (very difficult). According to the JAMA scores, 24% of the websites had a high-reliability rating and 26.7% adhered to the HONcode. The readability was found to significantly differ from the reliability of the websites (P < 0.001). Moreover, websites with scientific content were found to have higher readability and reliability scores (P < 0.001).
Conclusion: The readability level of SS-related information on the Internet was found to be considerably higher than the Grade 6 level recommended by the National Health Institute, with moderate reliability and fair quality. We believe that online information should meet a basic standard of readability and must have reliable content appropriate for educating the public, particularly on websites that provide patient education material.

Keywords: Health information, Internet, readability, rheumatology, Sjogren's syndrome

How to cite this URL:
Ozduran E, Hanci V. Evaluating the readability, quality, and reliability of online information on sjogren's syndrome. Indian J Rheumatol [Epub ahead of print] [cited 2022 Dec 5]. Available from:

  Introduction

Sjogren's syndrome (SS) is a slowly progressive autoimmune disease characterized by lymphocytic infiltration of the exocrine glands and significant loss of oral and secretory function.[1] This disease predominantly affects middle-aged females but can also be observed in children, males, and elderly individuals.[2] The clinical presentation of SS is heterogeneous and can range from sicca symptoms to systemic disease and lymphoma.[2] The treatment of SS ranges from local and symptomatic treatments aimed at controlling dryness to disease-modifying agents and biologic medications.[3]

The Internet is one of the most essential and accessible tools for receiving information and raising awareness about health-related problems. Thanks to its increased usage, it has become easier to access information on diseases, as well as medications, treatment alternatives, and surgical protocols. The need for reliable health-related websites increases with the demand for accessing health information through the Internet.[4],[5] Many people look up information about their health conditions on the Internet before going to a doctor. Half of the population of the United States has access to health-related information on the Internet. Furthermore, over 70% of people said that they first utilized the Internet to get health information, according to the 2018 Health Information National Trends Survey.[6],[7]

The competition among health websites to attract patients is becoming increasingly intense. This has raised concerns over site content quality and timeliness, the reliability of information offered to individuals, and advertising and sponsorship relationships.[8] These websites contain a wide spectrum of health-related information ranging from highly reliable to deceptive. This information is not peer-reviewed, its quality differs, and the reading level of online information is often not appropriate for the public. Moreover, there is no mechanism in place to control this information. According to the National Institutes of Health, the US Department of Health and Human Services, and the American Medical Association, patient education materials on the Internet should be written at a sixth-grade level.[9] If the readability of online material on a website exceeds this threshold, it is likely to be difficult for the typical reader to read and comprehend. As a result, it is critical that health-related material on websites be appropriate for the reader and thoroughly assessed before being used. Numerous studies have been published in the literature on the readability, reliability, and quality of online information on various disorders.[10],[11],[12],[13]

Individuals with rheumatological disorders commonly use the Internet to learn about alternate treatment options, risk factors, and disease complications. There is no study in the literature that analyzes information on SS found online. The goal of our research was to assess the readability, quality, and reliability of SS-related websites. Furthermore, website typologies that provide highly reliable information on SS were investigated.

  Methods

This study was planned as a cross-sectional study and was conducted with the approval of the Non-Interventional Research Ethics Committee (6493-GOA 2021/20-11). Two independent authors searched the keyword “Sjogren's Syndrome” on Google on September 15, 2021. We selected Google because, based on June 2020 data, it is the most popular search engine, with an 83.75% market share.[14]

During the website search, cookies and the computer's browser history were cleared to ensure that the search results were unaffected (for reasons such as Google Ads), and the search was conducted while logged out of any Google account. After completing the search, the first 200 websites' uniform resource locators (URLs) were recorded, following the methodology of similar research in the literature.[15],[16] The top 10 websites on the first page were ranked as the most viewed websites.[17] The study excluded websites with non-English content, websites without information about SS, websites that demand registration or subscription, duplicate websites, websites with video or audio content but no written content, and websites with <300 words. In addition, to avoid erroneous results, graphics, pictures, videos, tables, figures, list formats, punctuation marks, URLs, author information, addresses, phone numbers, and references were not included in the evaluation.[18]
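The exclusion rules above can be sketched as a simple eligibility filter. This is a minimal illustration, not the authors' actual screening procedure, and all field names are hypothetical:

```python
def eligible(site: dict) -> bool:
    """Apply the study's exclusion criteria to one website record.

    A site is kept only if it has English, SS-related written content of
    at least 300 words, needs no registration, and is not a duplicate.
    Field names are hypothetical, chosen for illustration.
    """
    return (site["language"] == "en"
            and site["about_ss"]
            and not site["requires_registration"]
            and not site["duplicate"]
            and site["written_word_count"] >= 300)
```

For example, a duplicate-free English page about SS with 500 words of text passes, while an otherwise identical page with only 250 words is excluded.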

During the website evaluation, if an evaluation criterion could not be identified on the homepage, the three-click rule was used, which states that a website user should be able to find any information in three mouse clicks or less.[19] Although this is not an official rule, it is believed that if information cannot be found in three clicks, the users will be unable to complete their task and will leave the site.

Website typology

Two independent authors classified websites into seven categories based on their typology. If there were any discrepancies between the authors, the website typology was re-evaluated by both scientists, and a final verdict was reached.

Typologies were professional (websites created by organizations or individuals with professional medical qualifications), commercial (websites that sell products for profit), nonprofit (non-profit educational/charitable/support sites), health portal (websites that provide information about health issues), news (magazine or newspaper websites created to provide news and information), government (websites created, regulated, or administered by an official government agency), and scientific journal (accessible academic publications or scientific articles).

Journal of American Medical Association Benchmark criteria

The Journal of American Medical Association (JAMA) benchmark analyzes online information and resources under four criteria: authorship (1 point; authors and contributors, their affiliations, and relevant credentials should be provided), attribution (1 point; references and sources for all content should be listed), disclosure (1 point; conflicts of interest, funding, sponsorship, advertising, support, and ownership should be fully disclosed), and currency (1 point; the dates on which the content was posted and updated should be indicated). The JAMA benchmark is used to evaluate the accuracy and reliability of information. The scorer awards 1 point for each criterion met, so the final score ranges from 0 to 4, with 4 representing the highest reliability and quality.[8]
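The one-point-per-criterion scoring can be expressed as a short function; this is a sketch of the scheme described above, with a hypothetical record format:

```python
# The four JAMA benchmark criteria, each worth 1 point.
JAMA_CRITERIA = ("authorship", "attribution", "disclosure", "currency")

def jama_score(site: dict) -> int:
    """Return the JAMA benchmark score (0-4) for a website record.

    `site` maps each criterion name to True if the website satisfies it;
    missing criteria count as unmet.
    """
    return sum(1 for criterion in JAMA_CRITERIA if site.get(criterion, False))
```

A website that names its authors and cites its sources but discloses neither funding nor update dates would score 2 of 4.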

DISCERN criteria

The DISCERN instrument, a technique for assessing the quality of websites, consists of 16 questions with scores ranging from 1 to 5.[20] The first eight questions ask about the website's basic content, such as “are the aims clear?” and “were citations used?” The last eight questions address treatment information, such as “is it clear that there is more than one treatment option?” Two authors independently examined the websites using the DISCERN criteria, and averaging their ratings yielded the final DISCERN score for each website. The final DISCERN score ranges from 16 to 80: 63–80 represents “excellent,” 51–62 “good,” 39–50 “fair,” 28–38 “poor,” and 16–27 “very poor.”[21]
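The two-rater averaging and the quality bands above can be sketched as follows (a minimal illustration of the scoring arithmetic, not the DISCERN instrument itself):

```python
def discern_total(rater_a: list, rater_b: list) -> float:
    """Average two raters' 16-item DISCERN scores (each item scored 1-5)."""
    assert len(rater_a) == len(rater_b) == 16
    return sum((a + b) / 2 for a, b in zip(rater_a, rater_b))

def discern_category(total: float) -> str:
    """Map a total DISCERN score (16-80) to its quality label."""
    if total >= 63:
        return "excellent"
    if total >= 51:
        return "good"
    if total >= 39:
        return "fair"
    if total >= 28:
        return "poor"
    return "very poor"
```

The study's mean score of 46.58, for instance, falls in the 39–50 band and is therefore rated “fair.”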

Health on the Net Foundation code of conduct certification

The Health on the Net Foundation (HON) was founded to promote the efficient transmission and use of reliable and useful health information through the Internet. The HONcode was created by HON to help standardize the accuracy of health-related information on the Internet.[22] To meet the HONcode criteria, the content's date and source should be disclosed, the authors' qualifications should be specified, the privacy policy should be explained, the patient–physician relationship should be supported rather than replaced, the website's financing and advertising policy should be specified, and contact information should be provided.[23] HON grants HONcode certification to websites on an optional basis; the certificate is subject to a fee, and its use is restricted. In our research, we investigated whether the main page or a connected URL carried a HONcode stamp.


Readability analysis

The following readability formulas were used to assess website readability: Flesch Reading Ease Score (FRES), Flesch-Kincaid Grade Level (FKGL), Simple Measure of Gobbledygook (SMOG), Gunning Fog (GFOG), Coleman-Liau (CL), Automated Readability Index (ARI), and Linsear Write (LW).[24],[25],[26],[27],[28]

A total of 300 words from the beginning, middle, and end of each text were examined in this study. All websites' ranking values were calculated and recorded. Microsoft Office Word 2007 (Microsoft Corporation, Redmond, WA) was used to copy and save the texts. The average readability level according to each readability formula was compared with the sixth-grade level specified by the American Medical Association and the National Institutes of Health.
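Two of the formulas used, FRES and FKGL, have published closed forms based on words per sentence and syllables per word. The sketch below implements them with a crude vowel-group syllable heuristic (real readability tools use pronunciation dictionaries, so exact scores will differ):

```python
import re

def _syllables(word: str) -> int:
    # Crude heuristic: count groups of consecutive vowels; minimum one syllable.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fres(text: str) -> float:
    """Flesch Reading Ease: higher means easier (60-70 is plain English)."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    syllables = sum(_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syllables / len(words))

def fkgl(text: str) -> float:
    """Flesch-Kincaid Grade Level: US school grade needed to follow the text."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    syllables = sum(_syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59
```

A six-word monosyllabic sentence such as "The cat sat on the mat." scores near the FRES maximum and well below a sixth-grade FKGL, whereas the SS websites in this study averaged an FKGL above 12.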

Popularity and visibility analysis

Alexa is a popular traffic analysis engine, frequently used to assess site visibility and popularity.[29] It compares the number of times a website has been visited in the last 3 months to the number of times other websites have been visited; the higher the score, the more popular the site, reflecting more clicks.

Compete Rank is a traffic analysis and ranking service from Compete, Inc. Every website that Compete Rank crawls and indexes is assigned a number and ranked based on its traffic popularity.

WebRank is a toolbar that automatically rates websites and pages across multiple search engines.

Content analysis

Websites were assessed based on their typologies to see if they contained any SS-related content (diagnosis, pathophysiology, symptoms, lymphoma, risk factors, complications, and treatment).

Statistical analysis

For statistical analysis, data were uploaded to SPSS for Windows 25.0 software (SPSS Inc., Chicago, IL, USA). Continuous values are indicated as mean ± standard deviation, while frequency variables are given as number (n) and percentage. Whether the continuous data conformed to a normal distribution was determined with the Kolmogorov–Smirnov and Shapiro–Wilk normality tests. According to the results of the normality tests, the Mann–Whitney U test was used to compare groups on continuous values, such as the readability indices versus the sixth-grade level. For comparison of frequency variables, the Pearson Chi-square or Fisher's exact test was used. The Pearson correlation test was used for correlation analysis. P < 0.05 was accepted as statistically significant. Bonferroni correction was applied in multiple comparisons of website typologies: with seven groups, the significance threshold was P < 0.0071 (0.05/7) in analyses comparing website typologies.
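The Bonferroni step amounts to dividing the nominal alpha by the number of comparisons, as this small sketch shows:

```python
def bonferroni_significant(p: float, comparisons: int = 7, alpha: float = 0.05) -> bool:
    """A P value counts as significant only below alpha / number of comparisons."""
    return p < alpha / comparisons
```

With the study's seven typology groups, the corrected threshold is 0.05/7 ≈ 0.0071, so a raw P of 0.03 that would pass an uncorrected 0.05 cutoff is no longer significant.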

  Results

The study comprised 200 websites; 125 were eliminated because they did not match the inclusion requirements, and the remaining 75 were evaluated. Commercial (n = 18, 24%) and scientific journal (n = 17, 22.7%) websites were found to be the most common when 75 websites were compared according to their typologies [Figure 1].
Figure 1: Types of websites in the whole search


Previous research has indicated that visitors place a high value on the first page of a search engine's results. On Google's first page, there are ten search results. There was no statistically significant difference between the first 10 search results and the remaining search results when they were analyzed according to their typologies (P = 0.050). The readability values of the top 10 websites differed significantly from those of the remaining websites, indicating that the top 10 websites were more readable (FRES P = 0.025, GFOG P = 0.029, CL P = 0.017, SMOG P = 0.035). A significant result (P = 0.026) was also obtained when the Alexa values of the first 10 sites were compared to those of the remaining sites. As one might assume, the top ten websites on the first page were more popular in terms of search, viewing, and traffic. There was no statistically significant difference between the top 10 websites and the remaining websites in terms of JAMA reliability (P = 0.685), DISCERN quality (P = 0.631), or HONcode presence (P = 0.306) [Table 1].
Table 1: All group of websites' mean results and statistical comparison of text content to 6th grade reading level


These 75 websites had an average JAMA score of 2.16 ± 1.39, a DISCERN score of 46.58 ± 22.96, a WebRank of 6.21 ± 1.73, an Alexa score of 235102.35 ± 536838.27, and a Compete Rank of 21321.51 ± 38407.56. With these results, the websites included in the study were assessed to be moderately reliable and of fair quality. In the examination of the texts of the 75 evaluated websites, the mean FRES was 42.25 ± 17.11 (difficult), and the mean GFOG was 14.80 ± 3.56 (very difficult). The mean FKGL and SMOG were 12.21 ± 3.19 and 10.78 ± 2.56 years of education, respectively, while the CL index was 12.83 ± 1.97 years and the ARI was 12.93 ± 3.11 years of education. When the site typologies and all readability indices were compared, the results indicated a significant difference (P < 0.001). A statistically significant difference was also found when the readability index averages of the 75 websites were compared to the grade 6 reading level (P < 0.001) [Table 1].

When the top 10 websites were compared to the remaining 65 websites using content analysis, there was no statistically significant difference identified (P > 0.05).

When all 75 websites were assessed by typology, only lymphoma (P = 0.001) and risk factor (P = 0.004) content showed a statistically significant difference; the remaining content types did not (P > 0.007) [Table 2]. Health portals frequently mentioned risk factors, and scientific journal websites frequently mentioned lymphoma [Table 2].
Table 2: Content analysis by typology


There was a significant difference between the typologies of the 75 websites and the JAMA reliability scores (P < 0.001), DISCERN quality scores (P < 0.001), and HONcode presence (P = 0.006). This statistical difference can be explained by the higher JAMA reliability and DISCERN quality scores of scientific journals; these scores were lower on commercial websites. Only 20 (26.7%) of all sites had the HONcode, and the highest number (10) was found on health portals. It was also shown that websites with scientific content were more difficult to read, while government websites were easier to read (P < 0.007) [Table 3].
Table 3: Comparison of Journal of American Medical Association, DISCERN scores, the Health on the Net Foundation code of conduct presences, and reading levels according to the typologies of the websites


The FRES, FKGL, SMOG, GFOG, CL, ARI, and LW readability formula averages, the JAMA and DISCERN scores, and HONcode presence were analyzed with respect to the site rankings. There was a weak positive correlation between the JAMA reliability scores of the texts and the mean FKGL (r = 0.521, P < 0.001), SMOG (r = 0.505, P < 0.001), GFOG (r = 0.521, P < 0.001), CL (r = 0.380, P = 0.001), ARI (r = 0.452, P < 0.001), and LW (r = 0.456, P < 0.001). Similarly, the DISCERN quality scores had a weak positive correlation with the mean GFOG (r = 0.524, P < 0.001), FKGL (r = 0.528, P < 0.001), and SMOG (r = 0.521, P < 0.001) [Table 4]. There was a weak negative correlation between the readability score averages and the presence of HONcode. As a result, it is reasonable to conclude that websites with higher grade-level readability scores (i.e., more difficult texts) provide more reliable and higher-quality content. The readability indices correlated with one another; however, there was no correlation between the readability scores and the popularity and visibility indices (Alexa, Compete Rank, WebRank).
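The Pearson r values reported above can be reproduced with the standard formula; the sketch below uses hypothetical (FKGL, JAMA) pairs, not the study's data, purely to illustrate a positive readability-reliability correlation:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical (FKGL, JAMA) pairs for illustration only -- not study data:
# harder texts (higher FKGL) tend to earn higher JAMA reliability scores.
r = pearson_r([8.0, 10.5, 12.2, 14.8, 16.1], [1, 2, 2, 3, 4])
```

In this toy example r is strongly positive, mirroring the direction (though not the magnitude) of the correlations reported in Table 4.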
Table 4: Correlation relationships between rank and readability formulas, Journal of American Medical Association, DISCERN scores, and Health on the Net Foundation code of conduct presences


There was a statistically significant difference between typologies in the reading level of the 300-word samples chosen from the texts (P = 0.001). This statistical difference can be attributed to scientific journal websites. [Figure 2] shows the reading level and reading age based on the contents of the text sections.
Figure 2: Analysis of reading level and reading age results (%) according to all text and text sections


  Discussion

According to our study results, websites with commercial, scientific journal, and health portal typologies were the most frequently found sites in all search results. There was a significant difference between website typologies and reliability scores, driven by the higher JAMA scores of scientific journals. A significant difference was also found when the readability indices and rankings of all sites were assessed according to their typologies. Scientific journals are thought to be more difficult to read, but commercial and nonprofit websites rank higher and offer more interaction.

There is a rapid increase in the use of Internet-based health information, a phenomenon known as the e-patient revolution.[30] Although offering quick access to health information benefits patients, the lack of control mechanisms for the distribution of health-care information on the Internet creates plenty of issues.[31] Some of these issues include illegal product sales, unlicensed health product promotion, and poor health information management.[32] Furthermore, previous research has often indicated that there is no standard for the readability and reliability of information available on the Internet.[5] Individuals can fully benefit from this information only if they can read and understand it, develop health literacy, and apply it to sound decision-making. Health literacy is described as “the level of accessing, processing, and understanding basic health information that individuals require to make health decisions,”[28] and it is what enables people to access and read correct health information. According to the National Institute of Literacy of the United States Department of Education, roughly 32 million American adults are unable to read, and 68 million have literacy skills below the fifth-grade level.[28],[33],[34] Given that Google receives millions of health-related queries every day from around the world and that nearly 80% of American Internet users get health-related information online, the need for creating readable websites becomes clear once more.[28],[33],[34] Even if an Internet site provides reliable and high-quality content, poor readability might make it tough for users to comprehend.

Readability is an important factor in understanding patient education materials. Complex sentences with long words can undermine readers' confidence in learning about a medical condition. The readability of a text is determined by formulas that take into account the number of words and sentences, syllable lengths, and the number of polysyllabic words in that text.

Recent studies have stated that the readability levels demanded by general rheumatological information and rheumatology medicine information sheets are very high.[35],[36] In our study, similar to the literature, it was revealed that Internet-based materials related to SS were quite difficult to read.

Basavakumar et al.[15] reported that FKGL readability scores were lower in the top 10 websites. Similarly, we found a significant difference between the top 10 websites and the other websites in terms of readability scores (FRES, GFOG, FKGL, and SMOG), with the top 10 websites being more readable. Kocyigit et al.[37] and Bagcier et al.,[38] by contrast, found no significant difference between the two groups. Considering that the top 10 websites are the most visited, better readability will help users understand the information.

When all website typologies were compared in terms of readability, statistically significant differences were found in all readability formulas: scientific journals were more difficult to read, while government websites were easier to read. The average readability scores obtained in the present study were well above the sixth-grade reading level recommended by the National Institutes of Health. Willen et al.[35] for websites about rheumatological conditions and Reynold et al.[39] for websites about lupus reported texts with readability levels above the recommended level, which were difficult for Internet users to understand.

Similar to our study, Lee et al.[40] also stated that scientific journal websites are more difficult to read. Consistent with the literature, we found that scientific journals are difficult to read, while government websites are easy to read. This can be explained by the fact that government sources try to prepare websites that are easier to read during periods when the public needs to receive information accurately and quickly, such as pandemics.

In the present study, the predominance of commercial websites among the top 10 results is similar to that reported in the literature.[38],[41] Considering that users preferentially visit the top 10 websites, this can be explained by Google ranking websites in a way that prioritizes financial targets. For users to reach reliable health information, we think that commercial websites should not let their financial goals take precedence over content quality.

In our study, two important factors apart from readability were also evaluated: the reliability of the websites with the JAMA score and their quality with DISCERN and HONcode.

In the present study, the HONcode was found on 20 (26.7%) of the 75 websites. In their study investigating the readability of osteoporosis information, Yurdakul et al.[41] found the HONcode on 12.6% of the websites investigated. Reynold et al.[39] found the HONcode on 30.8% of websites about systemic lupus erythematosus, and Arif et al.[16] found it on 17.9% of the websites they investigated. The results of the present study are consistent with the literature in this respect. In the present study, the presence of the HONcode was found to differ significantly according to typology, a difference driven by health portals. Similar to our results, Chumber et al.[42] reported that HONcodes were more frequently used on health portals. On the other hand, some studies report that HONcodes are more frequently used by scientific journals.[16],[37] In the present study, only 2 of the 17 scientific journal websites had the HONcode stamp. HONcode stamps for scientific publications are a matter of debate. However, it is obvious that there should be appropriate certification methods for online health information. Evaluation of health-related information by a council or an institution before it reaches the public may be considered to ensure the presentation of better-quality information.

In our study, the mean DISCERN score was 46.58 ± 22.96, corresponding to “fair” quality. Reynold et al.[39] found it to be 47.7 ± 13.2 for lupus, and Willen et al.[35] found it to be 53.7 ± 10.3 for rheumatological conditions. In the present study, a significant difference was found between website typologies and DISCERN quality scores: scientific journals had the highest DISCERN scores, while commercial sources had the lowest average. In this respect, our results are consistent with other studies in the literature.[40] In addition, a significant relationship was found between readability scores (FRES, GFOG, FKGL, and SMOG) and DISCERN quality scores. Accordingly, it can be said that websites with lower readability offer higher-quality content. Willen et al.[35] also reported this inverse relationship in their study. This can be explained by the fact that it may sometimes be necessary to compromise on quality to present easy-to-read texts.

Arif et al.[16] and Basavakumar et al.[15] found no significant difference between the top 10 websites and other websites in terms of JAMA score. Our results are consistent with the literature in this regard. In addition, a significant difference was found between the JAMA scores according to the typology of the websites. Scientific journals had higher JAMA reliability scores.

In our study, the content analyses of the websites were also evaluated. Content analysis revealed that websites with symptom and treatment content were the most common. No statistically significant difference was found between website typologies and most topics. Risk factor content was found mostly on health portals and lymphoma content on scientific journal websites, which shows that the risk status of individuals regarding SS is often expressed by health portals and offered to users. Consistent with the present study, Basavakumar et al.[15] stated that websites about fibromyalgia mostly contain symptom-related content. Bagcier et al.,[38] on the other hand, found that websites about myofascial pain mostly address therapeutic issues. According to these results, it can be said that websites of different typologies prefer the popular content of each disease and topics suitable for the website typology when presenting content.

The study has its own set of limitations: we searched only for websites in English, used a single search engine, used only “Sjogren's Syndrome” as the search keyword, and detected only Internet sites reached through a single country's data network. Although there is no complete consensus on which index is ideal for assessing the readability of Internet-based patient education materials, the indices we employed are among the most widely used formulas, and according to all of the metrics we analyzed, the websites targeted an education level considerably above the appropriate level.

  Conclusion

In our study, we found that the readability level of online information regarding SS was considerably above the National Health Institute's Grade 6 recommendation. The contents of the websites were assessed to be moderately reliable and of fair quality, and our results showed that the majority of highly reliable information may be found on academic websites. Early diagnosis and awareness are very important in a rheumatological disease such as SS, which can cause serious morbidity and mortality. As a result, while creating health-related websites for the general audience, it is essential to examine the language used according to readability indices. Furthermore, we believe that the information should be presented at a level of readability appropriate for the country or countries to which it is directed.


Acknowledgment

The authors thank University Hospital for their valuable technical assistance.

Financial support and sponsorship

Nil.
Conflicts of interest

There are no conflicts of interest.

  References

1. Qin B, Wang J, Yang Z, Yang M, Ma N, Huang F, et al. Epidemiology of primary Sjögren's syndrome: A systematic review and meta-analysis. Ann Rheum Dis 2015;74:1983-9.
2. Brito-Zerón P, Baldini C, Bootsma H, Bowman SJ, Jonsson R, Mariette X, et al. Sjögren syndrome. Nat Rev Dis Primers 2016;2:16047.
3. Negrini S, Emmi G, Greco M, Borro M, Sardanelli F, Murdaca G, et al. Sjögren's syndrome: A systemic autoimmune disease. Clin Exp Med 2022;22:9-25.
4. Yeh TK, Yeh J. Chest pain in pediatrics. Pediatr Ann 2015;44:e274-8.
5. Murray KE, Murray TE, O'Rourke AC, Low C, Veale DJ. Readability and quality of online information on osteoarthritis: An objective analysis with historic comparison. Interact J Med Res 2019;8:e12855.
6. Amante DJ, Hogan TP, Pagoto SL, English TM, Lapane KL. Access to care and use of the Internet to search for health information: Results from the US National Health Interview Survey. J Med Internet Res 2015;17:e106.
7. Scott BB, Johnson AR, Doval AF, Tran BN, Lee BT. Readability and understandability analysis of online materials related to abdominal aortic aneurysm repair. Vasc Endovascular Surg 2020;54:111-7.
8. Silberg WM, Lundberg GD, Musacchio RA. Assessing, controlling, and assuring the quality of medical information on the internet: Caveant lector et viewor-let the reader and viewer beware. JAMA 1997;277:1244-5.
9. AlKhalili R, Shukla PA, Patel RH, Sanghvi S, Hubbi B. Readability assessment of internet-based patient education materials related to mammography for breast cancer screening. Acad Radiol 2015;22:290-5.
10. Crawford-Manning F, Greenall C, Hawarden A, Bullock L, Leyland S, Jinks C, et al. Evaluation of quality and readability of online patient information on osteoporosis and osteoporosis drug treatment and recommendations for improvement. Osteoporos Int 2021;32:1567-84.
11. Wang Q, Xie L, Wang L, Li X, Xu L, Chen P. Readability in printed education materials for Chinese patients with systemic lupus erythematosus: A mixed-method design. BMJ Open 2020;10:e038091.
12. Siddhanamatha HR, Heung E, Lopez-Olivo MLA, Abdel-Wahab N, Ojeda-Prias A, Willcockson I, et al. Quality assessment of websites providing educational content for patients with rheumatoid arthritis. Semin Arthritis Rheum 2017;46:715-23.
13. Vivekanantham A, Protheroe J, Muller S, Hider S. Evaluating on-line health information for patients with polymyalgia rheumatica: A descriptive study. BMC Musculoskelet Disord 2017;18:43.
14. Statista. Available from: [Last accessed on 2022 Jan 15].
15. Basavakumar D, Flegg M, Eccles J, Ghezzi P. Accuracy, completeness and accessibility of online information on fibromyalgia. Rheumatol Int 2019;39:735-42.
16. Arif N, Ghezzi P. Quality of online information on breast cancer treatment options. Breast 2018;37:6-12.
17. Eysenbach G, Köhler C. How do consumers search for and appraise health information on the world wide web? Qualitative study using focus groups, usability tests, and in-depth interviews. BMJ 2002;324:573-7.
18. Boztas N, Omur D, Ozbılgın S, Altuntas G, Piskin E, Ozkardesler S, et al. Readability of internet-sourced patient education material related to “labour analgesia”. Medicine (Baltimore) 2017;96:e8526.
Zeldman J. Taking Your Talent to the Web: A Guide for the Transitioning Designer. Indianapolis: New Riders; 2001.  Back to cited text no. 19
Charnock D, Shepperd S, Needham G, Gann R. DISCERN: An instrument for judging the quality of written consumer health information on treatment choices. J Epidemiol Community Health 1999;53:105-11.  Back to cited text no. 20
Weil AG, Bojanowski MW, Jamart J, Gustin T, Lévêque M. Evaluation of the quality of information on the Internet available to patients undergoing cervical spine surgery. World Neurosurg 2014;82:e31-9.  Back to cited text no. 21
Boyer C, Selby M, Appel RD. The health on the net code of conduct for medical and health web sites. Stud Health Technol Inform 1998;52 Pt 2:1163-6.  Back to cited text no. 22
Boyer C, Baujard V, Geissbuhler A. Evolution of health web certification through the HONcode experience. Stud Health Technol Inform 2011;169:53-7.  Back to cited text no. 23
Walsh TM, Volsko TA. Readability assessment of internet-based consumer health information. Respir Care 2008;53:1310-5.  Back to cited text no. 24
Garfinkle R, Wong-Chong N, Petrucci A, Sylla P, Wexner SD, Bhatnagar S, et al. Assessing the readability, quality and accuracy of online health information for patients with low anterior resection syndrome following surgery for rectal cancer. Colorectal Dis 2019;21:523-31.  Back to cited text no. 25
Calo WA, Gilkey MB, Malo TL, Robichaud M, Brewer NT. A content analysis of HPV vaccination messages available online. Vaccine 2018;36:7525-9.  Back to cited text no. 26
Sheats MK, Royal K, Kedrowicz A. Using readability software to enhance the health literacy of equine veterinary clients: An analysis of 17 American Association of Equine Practitioners' newsletter and website articles. Equine Vet J 2019;51:552-5.  Back to cited text no. 27
Huang G, Fang CH, Agarwal N, Bhagat N, Eloy JA, Langer PD. Assessment of online patient education materials from major ophthalmologic associations. JAMA Ophthalmol 2015;133:449-54.  Back to cited text no. 28
Yılmaz FH, Tutar MS, Arslan D, Çeri A. Readability, understandability, and quality of retinopathy of prematurity information on the web. Birth Defects Res 2021;113:901-10.  Back to cited text no. 29
Wald HS, Dube CE, Anthony DC. Untangling the web-the impact of Internet use on health care and the physician-patient relationship. Patient Educ Couns 2007;68:218-24.  Back to cited text no. 30
Barajas-Gamboa JS, Klingler M, Landreneau J, Strong A, Al Zubaidi A, Sharadgah H, et al. Quality of information about bariatric surgery on the internet: A two-continent comparison of website content. Obes Surg 2020;30:1736-44.  Back to cited text no. 31
Washington TA, Fanciullo GJ, Sorensen JA, Baird JC. Quality of chronic pain websites. Pain Med 2008;9:994-1000.  Back to cited text no. 32
Daraz L, Morrow AS, Oscar J, Ponce OJ, Farah W, Katabi A, et al. Readability of online health information: A meta-narrative systematic review. Am J Med Qual 2018;33:487-92.  Back to cited text no. 33
Eysenbach G, Kohler Ch. What is the prevalence of health-related searches on the World Wide Web? Qualitative and quantitative analysis of search engine queries on the internet. AMIA Annu Symp Proc. 2003;2003:225-9.  Back to cited text no. 34
Willen RD, Pipitone O, Daudfar S, Jones JD. Comparing quality and readability of online English language information to patient use and perspectives for common rheumatologic conditions. Rheumatol Int 2020;40:2097-103.  Back to cited text no. 35
Oliffe M, Thompson E, Johnston J, Freeman D, Bagga H, Wong PKK. Assessing the readability and patient comprehension of rheumatology medicine information sheets: A cross-sectional Health Literacy Study. BMJ Open 2019;9:e024582.  Back to cited text no. 36
Kocyigit BF, Koca TT, Akaltun MS. Quality and readability of online information on ankylosing spondylitis. Clin Rheumatol 2019;38:3269-74.  Back to cited text no. 37
Bagcier F, Yurdakul OV, Temel MH. Quality and readability of online information on myofascial pain syndrome. J Bodyw Mov Ther 2021;25:61-6.  Back to cited text no. 38
Reynolds M, Hoi A, Buchanan RRC. Assessing the quality, reliability and readability of online health information regarding systemic lupus erythematosus. Lupus 2018;27:1911-7.  Back to cited text no. 39
Lee RJ, O'Neill DC, Brassil M, Alderson J, Lee MJ. Pelvic vein embolization: An assessment of the readability and quality of online information for patients. CVIR Endovasc 2020;3:52.  Back to cited text no. 40
Yurdakul OV, Kilicoglu MS, Bagcier F. Evaluating the reliability and readability of online information on osteoporosis. Arch Endocrinol Metab 2021;65:85-92.  Back to cited text no. 41
Chumber S, Huber J, Ghezzi P. A methodology to analyze the quality of health information on the internet: The example of diabetic neuropathy. Diabetes Educ 2015;41:95-105.  Back to cited text no. 42


  [Figure 1], [Figure 2]

  [Table 1], [Table 2], [Table 3], [Table 4]

