Background
Patient-centered medicine is a cornerstone of 21st century health care in the United States. At the center of this ideal lies shared decision-making between physician and patient. This process is essential in oncology, where patients are often confronted with an array of treatment options described by multiple specialists. These options are associated with individualized risks and benefits and are judged according to the unique values and interpretations of each patient. Patients often come to physician appointments with information gleaned from Web sites, the lay press, and online networking portals. There is clear evidence that patients are gathering treatment information on the Internet and are using that information to help guide their treatment decisions even before meeting with an oncologist.1 In order to make discerning oncologic treatment choices, it is imperative that patients find health care information from trusted and understandable sources. Poor patient comprehension of health care information correlates with lower patient satisfaction and compromises health outcomes.2,3 Patients place more trust in information and are more likely to follow recommendations that they understand.2,4 Effective communication plays an important role in overcoming health care disparities.5,6
Comprehension of online health care information depends on a patient's health literacy; that is, the ability to read and process health information and translate it into health care decisions. More than one-third of US adults have health literacy at or below the basic level.7 Only 12% have proficient (ie, the highest level) health literacy.8 To support comprehension, health care information should be written with average US health literacy in mind. National guidelines, including those from the Department of Health & Human Services (HHS), recommend that health information be written at or below the sixth grade level based on the current US literacy rate.9 We set out to determine whether patient materials found on the Web sites of NCI-Designated Cancer Centers (NCIDCC) were written at an appropriate level to facilitate patient comprehension. Other groups have demonstrated significant gaps between the recommended and actual complexity of written patient information in a variety of non-oncology fields.10–14 We gathered patient-targeted information found on NCIDCC Web sites and analyzed it with 10 distinct tests of readability, testing the hypothesis that online patient information (OPI) from NCIDCC Web sites would be written at the recommended sixth grade level.
We performed secondary exploratory analyses to investigate potential differences in readability between geographic regions or between comprehensive and noncomprehensive cancer centers. Results from the freely available American Cancer Society (ACS) OPI were used as a comparison. If OPI from NCIDCCs were not written at an appropriate level for optimal patient comprehension, this finding would have significant implications for shared decision-making and health care disparities.
Methods
Text Extraction
We identified NCIDCC Web sites using the online list available at http://cancercenters.cancer.gov/Center/CancerCenters as of October 2014. Web sites were individually viewed by 1 of 2 study authors (D.F., M.M.F.), and patient-targeted information related to general information, treatment options, and side effects for breast, prostate, lung, and colon cancer was extracted. Our analysis included information about general descriptions of each cancer, screening, treatment options (eg, surgery, chemotherapy, radiation), benefits, side effects, risks, survivorship, and other issues surrounding cancer care. Links to scientific protocols, citations, references, patient accounts, physician profiles, or outside institutions were explicitly excluded from this analysis. Every attempt was made to capture online information from NCIDCC Web sites as patients would have encountered it when reading through each Web site. In addition, any links leading outside each individual cancer center's domain (eg, to the NCI or ACS) were also excluded from analysis.
For comparison, text from OPI relevant to breast, prostate, lung, and colon cancers available on the ACS Web site as of October 2014 was also collected. The ACS OPI source material includes information that is often encountered through each NCIDCC Web site regarding description of cancer, screening, diagnosis, treatment options (chemotherapy, surgery, radiation), and side effects. These documents cover the aforementioned topics comprehensively and thoroughly. Because they are similar in the type and scope of information presented on NCIDCC Web sites, they were chosen as the comparison group.
Assessment of Readability
Extracted text was uploaded into Readability Studio version 2012.0 for analysis (Oleander Software, Hadapsar, India). We chose 10 commonly used readability tests to assess the readability of this material and avoid potential biases present within each individual test: the New Dale-Chall Readability Formula, Flesch Reading Ease scale, Flesch-Kincaid Grade Level, FORCAST scale, Fry Readability Graph, Simple Measure of Gobbledygook (SMOG) test, Gunning Frequency of Gobbledygook index, New Fog Count, Raygor Readability Estimate, and Coleman-Liau Index. These tests are used in both the public and private sector and have been well validated as measures of readability.15–22 Each test reports a score or score range, which was used for all analyses.
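To illustrate how the panel's tests differ in their inputs, the sketch below implements two of them, the SMOG test and the Coleman-Liau Index, directly from their published formulas (Python is used here for illustration and was not part of the study's toolchain). The vowel-group syllable counter is a naive simplification, so its output will deviate somewhat from Readability Studio's scores.

```python
import math
import re

def _sentences(text):
    # Crude sentence count: runs of terminal punctuation
    return max(1, len(re.findall(r"[.!?]+", text)))

def _words(text):
    return re.findall(r"[A-Za-z']+", text)

def _syllables(word):
    # Naive heuristic: count groups of consecutive vowels
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def smog_grade(text):
    """SMOG grade = 1.0430 * sqrt(polysyllables * 30 / sentences) + 3.1291,
    where polysyllables are words of 3 or more syllables."""
    polys = sum(1 for w in _words(text) if _syllables(w) >= 3)
    return 1.0430 * math.sqrt(polys * 30 / _sentences(text)) + 3.1291

def coleman_liau_index(text):
    """Coleman-Liau index = 0.0588*L - 0.296*S - 15.8, where L is letters
    per 100 words and S is sentences per 100 words; unlike SMOG, it uses
    letter counts rather than syllable counts."""
    words = _words(text)
    letters_per_100 = 100 * sum(len(w) for w in words) / len(words)
    sentences_per_100 = 100 * _sentences(text) / len(words)
    return 0.0588 * letters_per_100 - 0.296 * sentences_per_100 - 15.8
```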
Statistical Analysis
Statistical analyses were performed using IBM SPSS for Macintosh, version 22.0 (IBM Corporation, Armonk, NY). We compared the readability of OPI from NCIDCCs against the sixth grade level, chosen as the standard because it has been established as the target grade level by HHS.9 Given a set standard for comparison, single-sample t-tests were used to determine the significance of the difference between our texts and the ideal reading level. Additional analyses included a comparison of comprehensive versus noncomprehensive NCIDCCs using independent samples t-tests. We also assessed whether readability varied systematically according to geographic region, using ANOVA to determine significant differences. We used the geographic regions as defined by the National Adult Literacy Survey (NALS).23 This assigned each cancer center to 1 of the 4 census-defined regions: Northeast, Midwest, South, and West. States for each region are listed in supplemental eTable 1 (available with this article at JNCCN.org).
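As an illustration of the primary comparison, the sketch below runs a single-sample t-test against the fixed sixth grade standard using SciPy rather than SPSS; the grade-level values are hypothetical placeholders, not data from this study.

```python
from scipy import stats

# Hypothetical per-center composite grade levels (illustrative only;
# not the values measured in this study)
composite_grades = [12.4, 13.1, 11.8, 12.9, 12.2, 13.5, 11.6, 12.7]

TARGET_GRADE = 6.0  # HHS-recommended sixth grade reading level

# Single-sample t-test: does mean readability differ from grade 6?
result = stats.ttest_1samp(composite_grades, popmean=TARGET_GRADE)
print(f"t = {result.statistic:.2f}, P = {result.pvalue:.2g}")
```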
Results
OPI was collected from 58 NCIDCCs. Only nonclinical centers (n=7) and those without OPI (n=3) were excluded (Figure 1). A mean of 30,507 words (range, 5,732–147,425), 1,639 sentences (range, 319–7,094), and 762 paragraphs (range, 155–2,652) per Web site were extracted.
Figure 1. CONSORT diagram of study design and flow. A total of 68 NCI-Designated Cancer Centers were identified. Nonclinical centers (n=7) and centers that lacked patient information (n=3) were excluded from the analysis.
Two of the most commonly reported readability tests are the Flesch Reading Ease scale and the Raygor Readability Estimate. The Flesch Reading Ease scale generates a score ranging from 100 (very easy) to 0 (very difficult), with "plain English" scoring 60 to 70 (understood by most 13- to 15-year-olds). This test focuses on words per sentence and syllables per word and is a standard measurement of readability often used by US government agencies.18 The mean score on this test was 43.33 (SD, 7.46; range, 27–57) for comprehensive and 44.78 (SD, 6.63; range, 31–55) for noncomprehensive NCIDCCs (supplemental eFigure 1A, B). In these analyses, OPI at all NCIDCCs was at least 2 standard deviations away from the target for an appropriate reading level based on the Flesch Reading Ease scale. The Raygor Readability Estimate Graph uses the number of sentences and letters per 100 words and provides a grade-level estimate. Using this test, the mean grade level of OPI across all NCIDCCs was 14.1 (ie, college level; SD, 2.3; Figure 2). Again, this is significantly higher than the target of a sixth grade level (P<.001).
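For reference, the Flesch Reading Ease calculation described above can be sketched as follows; the syllable heuristic is again a naive approximation, so scores will differ slightly from Readability Studio's output.

```python
import re

def flesch_reading_ease(text):
    """Flesch Reading Ease = 206.835 - 1.015*(words/sentences)
    - 84.6*(syllables/words). Higher is easier: 60-70 reads as
    'plain English', whereas the low-40s scores observed for
    NCIDCC OPI read as 'difficult'."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(max(1, len(re.findall(r"[aeiouy]+", w.lower())))
                    for w in words)
    return (206.835
            - 1.015 * (len(words) / sentences)
            - 84.6 * (syllables / len(words)))
```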
To bolster our analysis, and because there is no single validated health care–specific readability test, we analyzed OPI using a panel of 10 different tests. Across all 10 readability tests, OPI found at NCIDCCs (whether comprehensive or noncomprehensive) was written significantly above the sixth grade target (Table 1).
Figure 2. Raygor Readability Estimate. Online patient information (OPI) from NCI-Designated Cancer Center (NCIDCC) Web sites (red) and American Cancer Society (ACS) Web sites (blue) underwent Raygor readability analysis. The mean grade level of OPI across all NCIDCCs was 14.1 (college level; SD, 2.3). The ACS OPI provides easier language (seventh to ninth grade) compared with NCIDCC Web sites (P<.01).
An additional set of analyses was performed to determine whether the reading levels of OPI from NCIDCCs differed significantly from those of the ACS patient handouts. Across all metrics, ACS Web sites provide easier language (Table 1 and Figure 2; P<.01 in all cases). Post hoc comparisons of the subgroup means using Bonferroni corrections indicated that the ACS mean was significantly different from both comprehensive and noncomprehensive cancer centers for each readability metric (P<.05 for each).
Across individual readability tests, there were no differences in readability between comprehensive and noncomprehensive cancer centers (Table 1 and supplemental eFigure 1A, B). Similarly, no difference between the mean readability for comprehensive (12.56; SD, 1.33) versus noncomprehensive cancer centers (12.24; SD, 1.19) was found using the composite scale of 8 measures of readability (t(56)=0.90; P=.37).
Finally, as there are documented differences in literacy across geographic regions, we assessed regional differences in readability as an exploratory analysis. When comparing individual readability measures, significant regional differences were identified in 4 of the 10 metrics (P<.05; see supplemental eTable 2). Web sites from the Midwest tended to provide information that was easiest to read, whereas those from the Northeast and West were the most difficult (Figure 3).
Using the previously described composite measure of readability, we observed only a trend toward regional variation (P=.08).
Discussion
The past 20 years have witnessed a paradigm shift in how patients obtain health care information. What was once the domain of physicians and other health care providers is now often supplemented or replaced by keyword or symptom searches on the Internet.1 Patients searching for online health information can find helpful, balanced, and appropriate counseling, but may also encounter misinformation, scare tactics, and highly biased reports. Patients seek information they can comprehend and often regard such information as a trusted source.24,25
NCIDCCs are in a position to be at the forefront of providing patients with access to appropriate cancer-related information. In this analysis, we determined how well patient information is presented relative to recommended target readability levels. National guidelines recommend that health information be presented at a sixth grade reading level. We found that OPI from NCIDCCs was written well above this level, at a 12th to 13th grade reading level.
Readability of NCIDCC and ACS Online Patient Information
We focused our analyses on general patient information and educational material related to 4 of the most common cancers in the United States: breast, colon, prostate, and lung. We chose to use a panel of readability analyses rather than rely on a single measure of readability. These analyses have been well validated in a number of different settings. Regardless of the test used, information from NCIDCCs was too complex based on the target level.
In discussing our preliminary findings with colleagues, it was commonly suggested that cancer care information is too complex to be written at a sixth grade level. Analysis of OPI provided by the ACS shows that such information can be written closer to the target grade level. ACS documents include information about chemotherapy, radiotherapy, and surgery and are written at a seventh to ninth grade level. Although still above the sixth grade level, this information is presented at a more appropriate level than that seen on NCIDCC Web sites.
We performed a number of secondary analyses with our data. We compared readability of OPI between comprehensive and noncomprehensive cancer centers and found no statistically significant difference between these groups. The designation of a comprehensive cancer center is determined by the NCI and is dependent on the services provided by the center to patients and on ongoing research. The quality of OPI is not considered for NCI designation. NALS has documented regional differences related to educational level, immigration, poverty, and other factors.23 We placed cancer centers into regions as defined by the NALS and found regional differences in 4 of 10 tests. On average, the grade level of OPI is highest in the Northeast (12.9 grade level) and lowest in the Midwest (11.8) (Figure 3). This contrasts with the results of the NALS where adults in the Midwest outperformed those in the Northeast in regard to literacy.23 This suggests that a larger gap between readability of information and literacy is seen in the Northeast than in other regions.
Most of the readability measures we used derive their metric (ie, grade output) from the number of syllables per word and the number of words per sentence. This implies that using simpler words and/or shorter sentences would decrease the reading level of OPI. For example, "immunologic modulation via pharmacologic targeting of the PD-1 receptor" could instead be written as "drugs that help your own immune system fight cancer." This changes the grade level from doctorate to approximately seventh to ninth grade.
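To make the arithmetic concrete, the sketch below scores both phrasings with the Flesch-Kincaid Grade Level formula. With a naive vowel-group syllable counter the phrases score at roughly grade 19 and grade 5; the levels quoted above (doctorate vs seventh to ninth grade) come from the full test panel rather than this single formula.

```python
import re

def fk_grade(text):
    """Flesch-Kincaid Grade Level = 0.39*(words/sentences)
    + 11.8*(syllables/words) - 15.59."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z0-9'-]+", text)
    syllables = sum(max(1, len(re.findall(r"[aeiouy]+", w.lower())))
                    for w in words)
    return (0.39 * (len(words) / sentences)
            + 11.8 * (syllables / len(words)) - 15.59)

complex_phrase = ("immunologic modulation via pharmacologic "
                  "targeting of the PD-1 receptor")
plain_phrase = "drugs that help your own immune system fight cancer"

print(round(fk_grade(complex_phrase), 1))  # ~19: post-graduate
print(round(fk_grade(plain_phrase), 1))    # ~5: elementary level
```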
Improving the information found on NCIDCC Web sites requires a multidisciplinary approach involving physicians, nurses, health care educators, and patients. Experts in health care communication also should be included when developing OPI. An open dialogue focused on improving the accuracy of and access to this information has the potential to improve outcomes and benefit patients.26 With this in mind, there must be a push to improve how information is presented and disseminated to patients.27 Beyond improving the content itself, appropriate separation of information for patients, clinicians, and researchers could greatly improve online communication and readability. A few NCIDCC Web sites have now created specific "subsections" explicitly targeted to one of the abovementioned groups. This could help ensure that complex clinical or scientific information is not intermingled with educational material intended for patients.
Figure 3. Regional variations in readability of online patient information. Cancer centers were assigned to regions based on the National Adult Literacy Survey (West = yellow, Midwest = blue, South = green, and Northeast = orange). Individual NCI-Designated Cancer Centers are shown with circle size representing the composite grade level measure (red). For comparison, results of the American Cancer Society analysis are provided (legend, blue circle). Regional differences were identified in 4 of the 10 readability metrics (P<.05), with higher levels in the Northeast (12.9) and lower levels in the Midwest (11.8).
We acknowledge several limitations of this work. We analyzed data from a single time point, whereas cancer center Web sites can change daily, weekly, or monthly. Although information on research findings or clinical trials may be updated regularly, educational material provided to patients is less likely to be revised frequently. This limitation is partly offset by our inclusion of every NCIDCC in the analysis. In addition, we did not attempt to assess the accuracy of content, only the readability of information. In viewing the Web sites, it becomes clear that many mix patient information with information for clinicians and researchers. A number of centers have begun to provide information specific to each group of potential users; such an approach should allow material to be presented at a reading level appropriate for its target audience. Finally, we did not attempt to analyze the printed educational materials that are often provided to patients visiting these centers. It is possible that this information is written at a more appropriate level.
NCIDCCs are identified as the local and national centers of excellence where patient care, research, and innovation occur. Many patients look to their local NCIDCC as a source of information and treatment. It is imperative that patient information from these centers be well written, accurate, and understandable for most Americans. Failure to provide appropriate information can result in patients seeking alternative sources of information, often of variable quality. NCIDCCs should be leaders in disseminating accurate and appropriate written information on cancer care.
Conclusions
In this study, we examined the readability of OPI found on NCIDCC Web sites. We found that this information is written at the 12th to 13th grade level (college freshman level), significantly above the nationally recommended sixth grade level. Improving the readability of OPI will require the multidisciplinary involvement of physicians, nurses, educators, and patients.
See JNCCN.org for supplemental online content.
This work was supported in part by NIH/NCI CA160639 (R.J.K.). Dr. Bradley has disclosed that she is an Intellectual Property Contributor to UpToDate. Dr. Anderson has disclosed that she has received research funding from Elekta. Dr. Kimple has disclosed that he has received research funding from Threshold Pharmaceuticals. The remaining authors have disclosed that they have no financial interests, arrangements, affiliations, or commercial interests with the manufacturers of any products discussed in this article or their competitors.
References
1. Koch-Weser S, Bradshaw YS, Gualtieri L, Gallagher SS. The Internet as a health information source: findings from the 2007 Health Information National Trends Survey and implications for health communication. J Health Commun 2010;15(Suppl 3):279–293.
2. Bains SS, Bains SN. Health literacy influences self-management behavior in asthma. Chest 2012;142:1687; author reply 1687–1688.
3. Health literacy: report of the Council on Scientific Affairs. Ad Hoc Committee on Health Literacy for the Council on Scientific Affairs, American Medical Association. JAMA 1999;281:552–557.
4. Rosas-Salazar C, Apter AJ, Canino G, Celedon JC. Health literacy and asthma. J Allergy Clin Immunol 2012;129:935–942.
5. Berkman ND, Sheridan SL, Donahue KE, et al. Low health literacy and health outcomes: an updated systematic review. Ann Intern Med 2011;155:97–107.
6. Hirschberg I, Seidel G, Strech D, et al. Evidence-based health information from the users' perspective—a qualitative analysis. BMC Health Serv Res 2013;13:405.
7. White S. Assessing the Nation's Health Literacy: Key Concepts and Findings of the National Assessment of Adult Literacy. Atlanta, GA: American Medical Association; 2008.
8. Cutilli CC, Bennett IM. Understanding the health literacy of America: results of the National Assessment of Adult Literacy. Orthop Nurs 2009;28:27–32; quiz 33–34.
9. How to Write Easy-to-Read Health Materials. Available at: https://www.nlm.nih.gov/medlineplus/etr.html. Accessed May 2, 2016.
10. Colaco M, Svider PF, Agarwal N, et al. Readability assessment of online urology patient education materials. J Urol 2013;189:1048–1052.
11. Svider PF, Agarwal N, Choudhry OJ, et al. Readability assessment of online patient education materials from academic otolaryngology-head and neck surgery departments. Am J Otolaryngol 2013;34:31–35.
12. Shukla P, Sanghvi SP, Lelkes VM, et al. Readability assessment of internet-based patient education materials related to uterine artery embolization. J Vasc Interv Radiol 2013;24:469–474.
13. Misra P, Kasabwala K, Agarwal N, et al. Readability analysis of internet-based patient information regarding skull base tumors. J Neurooncol 2012;109:573–580.
14. Eloy JA, Li S, Kasabwala K, et al. Readability assessment of patient education materials on major otolaryngology association websites. Otolaryngol Head Neck Surg 2012;147:848–854.
15. Walsh TM, Volsko TA. Readability assessment of internet-based consumer health information. Respir Care 2008;53:1310–1315.
16. Albright J, de Guzman C, Acebo P, et al. Readability of patient education materials: implications for clinical practice. Appl Nurs Res 1996;9:139–143.
17. Cooley ME, Moriarty H, Berger MS, et al. Patient literacy and the readability of written cancer educational materials. Oncol Nurs Forum 1995;22:1345–1351.
19. McLaughlin GH. SMOG grading: a new readability formula. J Reading 1969;12:639–646.
20. Coleman M, Liau TL. A computer readability formula designed for machine scoring. J Appl Psychol 1975;60:283–284.
21. Raygor AL. The Raygor readability estimate: a quick and easy way to determine difficulty. In: Pearson PD, ed. Reading: Theory, Research, and Practice (26th Yearbook of the National Reading Conference). Clemson, SC: National Reading Conference; 1977:259–263.
23. Kirsch I, Jungeblut A, Jenkins L, Kolstad A. Adult Literacy in America: A First Look at the Results of the National Adult Literacy Survey. 3rd ed. Washington, DC: National Center for Education Statistics, U.S. Department of Education; 2002.
25. Ellimoottil C, Polcari A, Kadlec A, et al. Readability of websites containing information about prostate cancer treatment options. J Urol 2012;188:2171–2175.
26. Safeer RS, Keenan J. Health literacy: the gap between physicians and patients. Am Fam Physician 2005;72:463–468.
27. Weiss BD. Health literacy research: isn't there something better we could be doing? Health Commun 2015;30:1173–1175.