Readability Analysis of Otolaryngology Consent Documents on the iMed Consent Platform

2022 ◽  
Author(s):  
Kyle Hannabass ◽  
Jivianne Lee

ABSTRACT Introduction The American Medical Association (AMA) and National Institutes of Health (NIH) recommend that all patient information and consent materials be provided at the fourth- to sixth-grade level. The iMed Consent platform is used nationally by the Veterans Health Administration and private hospitals. We aimed to assess the readability of otolaryngology consents at the West Los Angeles Veterans Affairs (WLA-VA) hospital to determine whether they conform with AMA/NIH guidelines. Materials and Methods A readability analysis of 27 otolaryngology iMed consent documents was performed. The main outcome measure was the Flesch–Kincaid Grade Level (FKGL). The setting of the study was an otolaryngology clinic at a major VA hospital. All consents used in the WLA-VA otolaryngology clinic for the month of October 2018 were analyzed using readability metrics: the Flesch Reading Ease (FRE) score, the FKGL, the Gunning Fog Index (GFI), the Simple Measure of Gobbledygook (SMOG), and the Coleman–Liau Index (CLI). Results The following means across all consents were calculated for each readability metric: FRE 56.3, FKGL 8.3, GFI 14.5, SMOG 11.3, and CLI 11.2. The standardized anesthesia and blood consents were analyzed separately, with the following scores: FRE 45.1, FKGL 11.7, GFI 15.5, SMOG 14.6, and CLI 12.6. The average FKGL of the consents was significantly above the sixth-grade level (P = .0013). Conclusion The average grade level of the otolaryngology iMed consents reviewed was above the AMA/NIH recommended reading level. This objective measure should be taken into consideration when revising the iMed system and when creating future standardized consents. Readability analysis does not, however, account for the substantial variability inherent in the verbal consent process between patient and provider.
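For readers unfamiliar with the metrics cited above, all five reduce to simple counts of sentences, words, letters, and syllables, and their standard published formulas are easy to script. The sketch below is illustrative only: the syllable counter is a crude vowel-group heuristic, and the sample text is a placeholder rather than an actual iMed consent document.

```python
import re
import math

def count_syllables(word: str) -> int:
    """Rough heuristic: count vowel groups, with a minimum of one syllable."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def readability(text: str) -> dict:
    """Compute FRE, FKGL, GFI, SMOG, and CLI from their standard published formulas."""
    n_sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    n_letters = sum(len(w) for w in words)
    syllables = [count_syllables(w) for w in words]
    n_syllables = sum(syllables)
    n_polysyllabic = sum(1 for s in syllables if s >= 3)  # "complex" words

    wps = n_words / n_sentences        # words per sentence
    spw = n_syllables / n_words        # syllables per word
    L = n_letters / n_words * 100      # letters per 100 words
    S = n_sentences / n_words * 100    # sentences per 100 words

    return {
        "FRE":  206.835 - 1.015 * wps - 84.6 * spw,
        "FKGL": 0.39 * wps + 11.8 * spw - 15.59,
        "GFI":  0.4 * (wps + 100 * n_polysyllabic / n_words),
        "SMOG": 1.0430 * math.sqrt(n_polysyllabic * 30 / n_sentences) + 3.1291,
        "CLI":  0.0588 * L - 0.296 * S - 15.8,
    }

# Example: score a placeholder consent paragraph (not an iMed document).
sample = ("The procedure will be performed under general anesthesia. "
          "Risks include bleeding, infection, and injury to surrounding structures.")
print(readability(sample))
```

Per-document FKGL values produced this way could then be tested against the sixth-grade benchmark with a one-sample t-test (e.g., scipy.stats.ttest_1samp), which is presumably the kind of comparison behind the reported P = .0013.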

2021 ◽  
Vol 8 ◽  
pp. 237437352110564
Author(s):  
Shayan Hosseinzadeh ◽  
Philip Blazar ◽  
Brandon E Earp ◽  
Dafang Zhang

Dupuytren's contracture is a common hand pathology for which consultation and treatment are largely at the patient's discretion. The objective of this study was to evaluate the readability of current online patient information regarding Dupuytren's contracture. The largest public search engines (Google, Yahoo, and Bing) were queried using the search terms “Dupuytren's contracture,” “Dupuytren's disease,” “Viking's disease,” and “bent finger.” The first 30 unique websites from each search were analyzed, and readability was assessed using five established algorithms: Flesch Reading Ease, Gunning-Fog Index, Flesch–Kincaid Grade Level, Coleman–Liau Index, and Simple Measure of Gobbledygook grade level. Analysis of 73 websites demonstrated an average Flesch Reading Ease score of 48.6 ± 8.0, which corresponds to a college reading level. The readability of the websites ranged from a 10.5 to a 13.3 reading grade level. No article was written at or below the recommended sixth-grade reading level. Information on the internet about Dupuytren's contracture is written at a higher than recommended reading grade level. There is a need for high-quality patient information on Dupuytren's contracture at appropriate reading grade levels for patients of various health literacy backgrounds. Hospitals, universities, and academic organizations focused on the development of readable online information should consider patients’ input and preferences.


Author(s):  
Naudia Falconer ◽  
E. Reicherter ◽  
Barbara Billek-Sawhney ◽  
Steven Chesbro

The readability level of many patient education materials is too high for patients to comprehend, placing the patient’s health at risk. Since health professionals often recommend Internet-based patient education resources, they must ensure that the readability of the information provided to consumers is at an appropriate level. Purpose: The purpose of this study was to determine the readability of educational brochures found on the American Physical Therapy Association (APTA) consumer website. Methods: Fourteen educational brochures on the APTA website in March 2008 were analyzed using the following assessments: Flesch-Kincaid Grade Level, Flesch Reading Ease, Fry Readability Formula, Simple Measure of Gobbledygook (SMOG), Checklist for Patient Education Materials, and Consumer Health Web Site Evaluation Checklist. Results: According to the Flesch-Kincaid and Flesch Reading Ease measures, over 90% of the brochures were written at greater than a sixth-grade level. The mean reading level was grade 10.2 (range = 3.1 to 12), with Reading Ease scores between 31.5 and 79.9. Using the SMOG formula, the brochures had a mean reading level of grade 11.5 (range = 9 to 13). The Fry Readability Formula showed that 85% of the brochures were written at higher than a sixth-grade level, with a mean reading level of grade 9.5 (range = 6 to 14). Conclusion: Findings suggest that most of the consumer education information available on the website of this health professional organization had readability scores that were too high for average consumers to read.


2019 ◽  
Vol 30 (3) ◽  
pp. 328-336
Author(s):  
Derya Arslan ◽  
Mahmut Sami Tutar ◽  
Betul Kozanhan ◽  
Zafer Bagci

Abstract Objective: Murmurs are abnormal audible heart sounds produced by turbulent blood flow. Therefore, murmurs in a child may be a source of anxiety for family members. Families often use online materials to explore possible reasons for these murmurs, given the accessibility of information on the Internet. In this study, we evaluated the quality, understandability, readability, and popularity of online materials about heart murmur. Methods: An Internet search was performed for “heart murmur” using the Google search engine. The global quality score (on a scale of 1 to 5, corresponding to poor to excellent quality) and the Health on the Net code were used to measure the quality of the information presented. The understandability of the web pages identified was measured using the Patient Education Materials Assessment Tool (score range from 0 to 100%; scores below 70% reflect poor performance). The readability of each web page was assessed using four validated indices: the Flesch Reading Ease Score, the Flesch–Kincaid Grade Level, the Gunning Frequency of Gobbledygook, and the Simple Measure of Gobbledygook. The ALEXA traffic tool was used to reference domains’ popularity and visibility. Results: We identified 230 English-language patient educational materials that discussed heart murmur. After exclusion, a total of 86 web pages were evaluated for this study. The average global quality score was 4.34 (SD = 0.71; range from 3 to 5), indicating that the quality of information on most websites was good. Only 14 (16.3%) websites had Health on the Net certification. The mean understandability score for all Internet-based patient educational materials was 74.6% (SD = 12.8%; range from 31.2 to 93.7%), suggesting that these materials were “easy to understand”. The mean readability levels of all patient educational materials were higher than the recommended sixth-grade reading level according to all indices applied, indicating that the materials are difficult to read. The average grade level for all web pages was 10.4 ± 1.65 (range from 7.53 to 14.13). The Flesch–Kincaid Grade Level was 10 ± 1.81, the Gunning Frequency of Gobbledygook level was 12.1 ± 1.85, and the Simple Measure of Gobbledygook level was 9.1 ± 1.38. The average Flesch Reading Ease Score was 55 ± 9.1 (range from 32.4 to 72.9). Conclusion: We demonstrated that web pages describing heart murmurs were understandable and of high quality. However, the readability level of the websites was above the recommended sixth-grade reading level. The readability of written materials from online sources needs to be improved, though care must be taken to ensure that the information on web pages remains high quality and understandable.


Author(s):  
A Habeeb

Abstract Objective This study aimed to assess the quality and readability of websites on chronic rhinosinusitis. Methods A total of 180 results from 3 different search engines regarding ‘chronic rhinosinusitis’, ‘sinusitis’ and ‘sinus infections’ were analysed for readability using the Flesch–Kincaid Grade Level, Flesch Reading Ease Score and Gunning Fog Index. The Discern tool was used to approximate information quality. Results From the 180 total results, 69 unique websites were identified. These had an average Flesch–Kincaid Grade Level of 9.75 (95 per cent confidence interval = 9.12–10.4), a Flesch Reading Ease Score of 45.0 (41.0–49.0) and a Gunning Fog Index of 13.7 (12.9–14.4), which equates to the average reading level of a college or university student. Discern scores were variable but consistently indicated good-quality information. Conclusion Chronic rhinosinusitis information is of high quality but is written at a reading level higher than that of the average adult. Standardising patient information should ensure adequate comprehension and improve patient compliance.
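The 95 per cent confidence intervals reported for these means follow from the usual t-based interval on a sample mean. The sketch below is illustrative only and uses placeholder per-website grade-level scores, not the study's data.

```python
import math
from statistics import mean, stdev
from scipy import stats

# Placeholder per-website Flesch-Kincaid Grade Level scores (not the study's data).
fkgl = [9.1, 10.3, 8.8, 11.2, 9.9, 10.7, 9.4, 10.1]

n = len(fkgl)
m = mean(fkgl)
sem = stdev(fkgl) / math.sqrt(n)          # standard error of the mean
t_crit = stats.t.ppf(0.975, df=n - 1)     # two-sided 95% critical value

print(f"mean FKGL = {m:.2f}, "
      f"95% CI = ({m - t_crit * sem:.2f}, {m + t_crit * sem:.2f})")
```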


2018 ◽  
Vol 28 (Supp) ◽  
pp. 475-484
Author(s):  
Adriana Izquierdo ◽  
Michael Ong ◽  
Felica Jones ◽  
Loretta Jones ◽  
David Ganz ◽  
...  

Background: Little has been written about engaging potentially eligible members of a health care system who are not accessing the care to which they are entitled. Knowing more about the experiences of African American Veterans who regularly experience health care access challenges may be an important step toward equitable, coordinated Veterans Health Administration (VHA) care. This article explores the experiences of African American Veterans who are at risk of experiencing poor care coordination. Design: We partnered with a community organization to recruit and engage Veterans in three exploratory engagement workshops between October 2015 and February 2016. Participants and Setting: Veterans living in South Los Angeles, California. Main Outcome Measures: Veterans were asked to describe their experiences with community care and the VHA, a division of the US Department of Veterans Affairs (VA). Field notes taken during the workshops were analyzed by community and academic partners using grounded theory methodology to identify emergent themes. Results: 12 Veterans and 3 family members of Veterans participated in one or more engagement workshops. Their trust in the VA was generally low. Positive themes included: Veterans have knowledge to share and want to help other Veterans; and connecting to VA services can result in positive experiences. Negative themes included: functional barriers to accessing VA health care services; insensitive VA health care environment; lack of trust in the VA health care system; and Veteran status as disadvantageous for accessing non-VA community services. Conclusions: Veterans living in underserved areas who have had difficulty accessing VA care have unique perspectives on VA services. Partnering with trusted local community organizations to engage Veterans in their home communities is a promising strategy to inform efforts to improve care access and coordination for vulnerable Veterans. Ethn Dis. 2018;28(Suppl 2):475-484; doi:10.18865/ed.28.S2.475.


2018 ◽  
Vol 33 (5) ◽  
pp. 487-492 ◽  
Author(s):  
Lubna Daraz ◽  
Allison S. Morrow ◽  
Oscar J. Ponce ◽  
Wigdan Farah ◽  
Abdulrahman Katabi ◽  
...  

Online health information should meet the reading level for the general public (set at sixth-grade level). Readability is a key requirement for information to be helpful and improve quality of care. The authors conducted a systematic review to evaluate the readability of online health information in the United States and Canada. Out of 3743 references, the authors included 157 cross-sectional studies evaluating 7891 websites using 13 readability scales. The mean readability grade level across websites ranged from grade 10 to 15 based on the different scales. Stratification by specialty, health condition, and type of organization producing information revealed the same findings. In conclusion, online health information in the United States and Canada has a readability level that is inappropriate for general public use. Poor readability can lead to misinformation and may have a detrimental effect on health. Efforts are needed to improve readability and the content of online health information.


2020 ◽  
Vol 40 (11) ◽  
pp. NP636-NP642 ◽  
Author(s):  
Eric Barbarite ◽  
David Shaye ◽  
Samuel Oyer ◽  
Linda N Lee

Abstract Background In an era of widespread Internet access, patients increasingly look online for health information. Given the frequency with which cosmetic botulinum toxin injection is performed, there is a need to provide patients with high-quality information about this procedure. Objectives The aim of this study was to examine the quality of printed online education materials (POEMs) about cosmetic botulinum toxin. Methods An Internet search was performed to identify 32 websites of various authorship types. Materials were evaluated for accuracy and inclusion of key content points. Readability was measured by Flesch Reading Ease and Flesch-Kincaid Grade Level. Understandability and actionability were assessed with the Patient Education Materials Assessment Tool for Printed Materials. The effect of authorship was assessed using analysis of variance between groups. Results The mean [standard deviation] accuracy score among all POEMs was 4.2 [0.7], which represents an accuracy of 76% to 99%. Mean comprehensiveness was 47.0% [16.4%]. Mean Flesch-Kincaid Grade Level and Flesch Reading Ease scores were 10.7 [2.1] and 47.9 [10.0], respectively. Mean understandability and actionability were 62.8% [18.8%] and 36.2% [26.5%], respectively. There were no significant differences in accuracy (P > 0.2), comprehensiveness (P > 0.5), readability (P > 0.1), understandability (P > 0.3), or actionability (P > 0.2) by authorship. Conclusions There is wide variability in the quality of cosmetic botulinum toxin POEMs regardless of authorship type. The majority of materials are written above the recommended reading level and fail to include important content points. It is critical that providers take an active role in the evaluation and endorsement of online patient education materials.


2012 ◽  
Vol 147 (5) ◽  
pp. 848-854 ◽  
Author(s):  
Jean Anderson Eloy ◽  
Shawn Li ◽  
Khushabu Kasabwala ◽  
Nitin Agarwal ◽  
David R. Hansberry ◽  
...  

Objective Various otolaryngology associations provide Internet-based patient education material (IPEM) to the general public. However, this information may be written above the fourth- to sixth-grade reading level recommended by the American Medical Association (AMA) and National Institutes of Health (NIH). The purpose of this study was to assess the readability of otolaryngology-related IPEMs on various otolaryngology association websites and to determine whether they are above the recommended reading level for patient education materials. Study Design and Setting Analysis of patient education materials from 9 major otolaryngology association websites. Methods The readability of 262 otolaryngology-related IPEMs was assessed with 8 numerical and 2 graphical readability tools. Averages were evaluated against national recommendations and between each source using analysis of variance (ANOVA) with post hoc Tukey’s honestly significant difference (HSD) analysis. Mean readability scores for each otolaryngology association website were compared. Results Mean website readability scores using Flesch Reading Ease test, Flesch-Kincaid Grade Level, Coleman-Liau Index, SMOG grading, Gunning Fog Index, New Dale-Chall Readability Formula, FORCAST Formula, New Fog Count Test, Raygor Readability Estimate, and the Fry Readability Graph ranged from 20.0 to 57.8, 9.7 to 17.1, 10.7 to 15.9, 11.6 to 18.2, 10.9 to 15.0, 8.6 to 16.0, 10.4 to 12.1, 8.5 to 11.8, 10.5 to 17.0, and 10.0 to 17.0, respectively. ANOVA results indicate a significant difference (P < .05) between the websites for each individual assessment. Conclusion The IPEMs found on all otolaryngology association websites exceed the recommended fourth- to sixth-grade reading level.
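The between-source comparison described here, a one-way ANOVA followed by post hoc Tukey's HSD, is straightforward to reproduce with standard statistical tooling. The sketch below uses placeholder grade-level scores for three hypothetical association websites, not the study's data.

```python
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Placeholder FKGL scores grouped by (hypothetical) association website.
scores = {
    "site_a": [9.7, 10.2, 11.1, 10.8, 9.9],
    "site_b": [12.4, 13.0, 12.1, 13.5, 12.8],
    "site_c": [15.9, 16.4, 17.1, 16.0, 16.8],
}

# One-way ANOVA across the three groups.
f_stat, p_value = f_oneway(*scores.values())
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

# Post hoc Tukey HSD to identify which pairs of sites differ.
values = np.concatenate(list(scores.values()))
labels = np.repeat(list(scores.keys()), [len(v) for v in scores.values()])
print(pairwise_tukeyhsd(values, labels, alpha=0.05))
```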


2009 ◽  
Vol 141 (5) ◽  
pp. 555-558 ◽  
Author(s):  
Jewel Greywoode ◽  
Eric Bluman ◽  
Joseph Spiegel ◽  
Maurits Boon

Objective: To evaluate the readability of patient-oriented online health information (OHI) presented on the American Academy of Otolaryngology–Head and Neck Surgery (AAO–HNS) website. Study Design: Review of the Flesch-Kincaid (FK) grade level for 104 articles on the AAO–HNS website. Methods: The FK grade level for 104 articles was determined using the readability calculator available within Microsoft Office Word 2003. The interobserver reliability for the FK grade level was determined by calculating the intraclass correlation coefficient (ICC) for 52 entries. Results: The average FK grade reading level of the articles was 10.8 (range 6.3-16.7; 95% CI, 10.4-11.2). Eighty-one percent of the articles were written at a ninth grade level or higher. The intraclass correlation was good (r = 0.83) for the 52 articles that were independently reviewed. Conclusions: This analysis has shown that the average reading level for each article on the AAO–HNS site was higher than the recommended sixth grade reading level. Although the AAO–HNS site is written at a higher level than that suggested for the general public, it is important to realize that readability is just one consideration in the evaluation of OHI comprehension. Physicians need to be cognizant of their patients' ability to read and comprehend written information and tailor their educational material appropriately.
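The interobserver-reliability check reported here rests on the intraclass correlation coefficient, which can be computed from a long-format table of (article, rater, score) rows. The sketch below uses the pingouin package and placeholder ratings; the study itself used Microsoft Word's built-in calculator for FK grades and does not specify its ICC software.

```python
import pandas as pd
import pingouin as pg

# Placeholder FK grade levels assigned by two independent raters to the same articles.
df = pd.DataFrame({
    "article": list(range(1, 6)) * 2,
    "rater":   ["A"] * 5 + ["B"] * 5,
    "fk_grade": [10.1, 8.7, 12.3, 9.5, 11.0,
                 10.4, 8.9, 12.0, 9.8, 11.3],
})

# Intraclass correlation coefficients (single- and average-measure forms).
icc = pg.intraclass_corr(data=df, targets="article", raters="rater", ratings="fk_grade")
print(icc[["Type", "ICC", "CI95%"]])
```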


2020 ◽  
Vol 3 ◽  
Author(s):  
Jason Kabir ◽  
Jennifer Maratt

Background: Colonoscopies and esophagogastroduodenoscopies (EGDs) are commonly performed to screen for polyps and Barrett’s esophagus (BE), respectively. Findings from screening exams determine if, and when, surveillance is needed. Within the Veterans Health Administration (VHA), communication of test results is mandated; however, there is no clear guidance on how to communicate these results. The aim of this study was to determine the content and readability of endoscopy pathology letters that are used to relay results to patients within the VHA. Methods: We used the Corporate Data Warehouse to identify patients who had a colonoscopy for colorectal cancer screening or post-polypectomy surveillance, or an EGD for BE screening or surveillance, between 2010 and 2018. We then identified patients who had either low-risk colon adenomas (LRA), high-risk colon adenomas (HRA), non-dysplastic BE (NDBE), BE with low-grade dysplasia (BE-LGD), or BE with high-grade dysplasia (BE-HGD). Pathology letters for each of these findings were obtained and reviewed independently by two reviewers to categorize their terminology as ‘alarming,’ ‘not alarming,’ or ‘balanced.’ The readability of each letter was determined by using Microsoft Word to obtain the Flesch-Kincaid reading ease and grade-level equivalency scores. Results: Pathology letters from the Richard L. Roudebush VA Medical Center were found to be non-alarming for LRAs, HRAs, and NDBE; balanced for BE-LGD; and alarming for BE-HGD. The average Flesch-Kincaid reading ease and grade-level equivalency scores for the letters were 41.44 and 10.28, respectively. Conclusion and Potential Impact: While pathology letters may contain risk-appropriate terms to describe lesions, readability measures indicate that the content is above the recommended reading ease and grade level for an average adult in the U.S. Improving the readability of pathology letters could improve patients’ understanding of their risk status, thus leading to increased adherence to surveillance endoscopy recommendations.

