A Hospital-Wide Intervention to Improve Compliance With TNM Cancer Staging Documentation

1. School of Medicine, University of California, San Diego, San Diego;
2. Department of Biomedical Informatics, UC San Diego Health, University of California, San Diego, La Jolla;
3. Moores Cancer Center at UC San Diego Health, La Jolla;
4. Division of Hematology-Oncology, Department of Medicine, University of California, San Diego, La Jolla; and
5. Division of Otolaryngology – Head & Neck Surgery, Department of Medicine, University of California, San Diego, San Diego, California.

Background: Accurate oncologic staging is essential for guideline adherence, quality assessment, and survival outcomes. However, timely and uniform documentation in the electronic health record (EHR) at the time of diagnosis is a challenge for providers. This quality improvement project aimed to increase provider compliance with timely clinical TNM (cTNM) or pathologic TNM (pTNM) staging for newly diagnosed oncologic patients. Methods: Providers in the following site-specific oncologic teams were included: head and neck, skin, breast, genitourinary, gastrointestinal, lung and thoracic, gynecologic, colorectal, and bone marrow transplant. Interventions to facilitate timely cTNM and pTNM staging included standardized EHR-based workflows, learning modules, stakeholder meetings, and individualized provider training sessions. For most teams, staging was considered compliant if it was completed in the EHR within the first 7 days of the calendar month after the date of the patient visit. Factors associated with staging compliance were analyzed using logistic regression models. Results: From January 1, 2014, to December 31, 2018, 7,787 preintervention and 5,152 postintervention new patient visits occurred. During the preintervention period, staging was compliant in 5.6% of patients compared with 67.4% of patients after intervention (P<.001). In the final month of the postintervention period, the overall staging compliance rate was 78.1%. At most recent tracking, staging compliance was 95%, 97%, and 93% in December 2019, January 2020, and February 2020, respectively. Logistic regression found that increasing years of provider experience was associated with decreased staging compliance. Conclusions: High rates of staging compliance in complex multidisciplinary academic oncologic practice models can be achieved via comprehensive quality improvement and structured initiatives. This approach serves as a model for improving oncologic documentation systems to facilitate clinical decision-making and multidisciplinary coordination of care.

Background

Cancer staging categorizes the size, location, and extent of the primary tumor; the extent of lymph node involvement; and the presence of distant metastases. Accurate staging documentation is critical for guideline evaluation, quality assessment, and survival outcomes.1 Nonetheless, the timely and uniform documentation of staging at the time of diagnosis remains understudied.2 Systematic and early documentation of cancer staging in electronic health records (EHRs) is a metric of quality cancer care endorsed by the National Quality Forum.3

Stage determination is performed both in clinical encounters by providers and in cancer registries. Although highly accurate, cancer registry data are often abstracted several months after the commencement of therapy via extensive manual review of clinical, radiographic, and pathologic data. Currently at the University of California San Diego Health System, as in other academic centers, formal abstraction within the cancer registry takes 3 to 6 months.4,5 A staging compliance audit by our cancer registrar found a 15% rate of unstaged disease and conflicting staging data among primary cancer providers and other clinicians. Furthermore, the work of cancer registrars to fill in missing staging documentation in the EHR due to provider noncompliance represents a redundancy that, if addressed, could lead to more efficient use of institutional resources.1

Proper documentation of clinical or pathologic TNM (cTNM or pTNM, respectively) stage, which should occur within 4 months after diagnosis or at the time of disease progression (whichever occurs first), is crucial for initial management and provides data for quality assessment.6–8 Improving accuracy and timeliness of documentation could enhance clinical decision-making, care coordination, and communication, especially as revisions to staging systems may be adopted in real time.2

Previous clinician-led attempts at improving staging documentation have had favorable results but did not directly target clinician staging timeliness in the EHR. In 2010, the Cancer Care Ontario Stage Capture Project improved province-wide stage capture rate in the paper medical record from 30% to 68%, which underscored the importance of a multifaceted approach involving educational outreach, use of reminders, opinion leaders, and timely feedback.9 The Michigan Urological Surgery Improvement Collaborative, a statewide collaborative of 29 urology practices (>70% of urologists in the state), improved staging compliance as collected by a web-based clinical registry from 58% to 79%.10 Cecchini et al4 at the Smilow Cancer Hospital at Yale New Haven Hospital implemented an Epic best practices advisory to improve physician cancer stage reporting, which resulted in an improvement in staging compliance from 28% to 60%.

In the present study, we report the development of a quality improvement initiative at our NCI-designated comprehensive cancer center to increase provider staging compliance in the Epic EHR system (Epic Systems Corporation) with the primary aim of achieving complete, accurate, and standardized cTNM or pTNM staging for new patients.6 Epic Beacon is an Epic EHR medical oncology application that includes cancer staging, treatment plans and protocols, and therapeutic plans and outcomes.11 It supports the AJCC and FIGO oncologic staging systems.6,12,13 Our initiative included deficiency notifications, quality assessment reports, clinical decision support improvements, and communication strategies to achieve a durable improvement in institution-wide staging practices.

Methods

Context

Moores Cancer Center (MCC) is UC San Diego Health’s NCI-designated comprehensive cancer center. Patient care is delivered across MCC, 2 academic teaching hospitals, the main health sciences campus, and 7 outpatient locations. All providers are full-time employees, and all of these sites and oncologic care teams use Epic, our system-wide EHR. This quality improvement project targeted all attending medical oncology and surgical oncology physicians, nurse practitioners (NPs), and physician assistants (PAs) at MCC in the following teams: head and neck, skin, breast, genitourinary, gastrointestinal, lung/thoracic, gynecologic, colorectal, and bone marrow transplant (BMT). Because radiation oncology physicians use the ARIA EHR system (Varian Medical Systems), which is not used by other care teams within our hospital system, their performance metrics were not evaluated or targeted by this quality improvement initiative.

Specific Aims

The intervention examined provider ability to ensure that every new patient encounter had clinical or pathologic oncologic staging documented in the EHR. The primary outcome was the prevalence of timely completed cTNM staging for stageable cancers in the EHR based on internationally accepted staging systems (AJCC and FIGO).6,12,13 At the commencement of this project in 2016, Epic supported the 6th and 7th editions of the AJCC Cancer Staging Manual.14,15 In 2018, the 8th edition of the AJCC Cancer Staging Manual was implemented.6 Because new patients may present to providers either before or after a treatment intervention, both clinical and pathologic staging were acceptable for compliance.

For a new patient encounter to be considered compliant, clinical or pathologic T, N, and M (cTNM or pTNM) staging and the overall staging group must have been completed and signed in the EHR form within the first 7 days of the calendar month after the date of the patient visit. The BMT and colorectal teams were considered compliant if staging was completed within the first 7 days of the second calendar month after the patient visit, because these teams requested additional time to incorporate additional radiographic, surgical, and laboratory data. These deadlines applied regardless of whether a patient was seen at the beginning, middle, or end of the month. For new patients who visited multiple providers, each provider visit was independently assessed for compliance based on the ability to ensure that staging was present in the chart by this deadline, regardless of who signed the staging form. Because our intervention targeted providers’ ability to ensure that staging was documented in a timely manner for each new patient, we did not differentiate which provider had staged the patient.
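
To make the deadline rule concrete, the following is a minimal sketch of the compliance check; the function names and the use of Python are our own illustration, not part of the study’s actual tooling.

```python
from datetime import date

def staging_deadline(visit_date: date, months_allotted: int = 1) -> date:
    """Day 7 of the calendar month `months_allotted` months after the visit.

    Hypothetical illustration of the study's compliance rule; the deadline
    is the same regardless of where in the month the visit fell.
    """
    month0 = visit_date.month - 1 + months_allotted  # 0-indexed month math
    year = visit_date.year + month0 // 12
    month = month0 % 12 + 1
    return date(year, month, 7)

def is_compliant(visit_date: date, staged_date: date,
                 months_allotted: int = 1) -> bool:
    """True if staging was signed on or before the deadline."""
    return staged_date <= staging_deadline(visit_date, months_allotted)

# A January 20 visit must be staged by February 7 for most teams,
# or by March 7 for the BMT and colorectal teams (months_allotted=2).
assert staging_deadline(date(2018, 1, 20)) == date(2018, 2, 7)
assert staging_deadline(date(2018, 1, 20), months_allotted=2) == date(2018, 3, 7)
```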

Secondary outcomes included time in days from patient visit to Epic staging form completion and the association of patient, provider, and treatment team characteristics, including 1- or 2-month deadline, with the prevalence of and time to staging. Patient characteristics included age, sex, race, and Charlson comorbidity index (CCI) score. Provider characteristics included years of oncologic practice, sex, and type (ie, medical oncologist, surgical oncologist, NP, or PA).

Quality Improvement Methodology

The quality improvement intervention for TNM staging documentation was first performed in a pilot program with the head and neck team from June 1, 2016, to December 31, 2016. The Epic staging workflow was presented to clinicians at a head and neck multidisciplinary tumor conference. Clinicians received weekly staging prevalence reports. For every newly seen patient who lacked complete staging, an Epic in-basket message was sent to the provider with the patient’s name, visit date, and diagnosis, along with clear instructions to complete staging using the staging form (supplemental eFigure 1, available with this article at JNCCN.org). Individualized training in the provider’s clinic was made available to address questions regarding the Epic staging workflow. Providers in the pilot group identified the staging form design as the most likely barrier to documentation; therefore, implementation adjustments were made. Epic Crystal, the report platform for the Epic EHR system, was implemented to track staging; these reports revealed that redundant or incorrect oncologic diagnoses had been entered by nononcology clinicians into the EHR problem (ie, diagnosis) list. In these cases, oncology providers were asked to update the problem list for accuracy.

After the pilot period, the following intervention was rolled out to each oncologic team to improve oncologic staging compliance, with a plan to present this EHR staging improvement initiative at each disease team’s multidisciplinary tumor conference. First, the Epic Beacon team identified each disease team’s diagnosis Groupers (QPID Health), an Epic add-on that links together a set of codes, concepts, or clinical terms to represent related conditions and associated clinical data. Each Grouper was verified with the cancer registrar and programmed to trigger its corresponding AJCC or FIGO staging form. Tumor types that were not stageable according to AJCC or FIGO guidelines did not trigger the staging form. The following training material was prepared: online education resources created by the AJCC, instructions on how to use the Epic cancer staging form, and in-house eLearning video modules.16
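
Conceptually, the trigger behaves like a lookup from diagnosis Grouper to staging form. The sketch below is purely illustrative; Epic’s internal configuration is not a public API, and every grouper and form name shown is hypothetical.

```python
# Hypothetical sketch of grouper-to-staging-form triggering. All names
# below are illustrative only, not actual Epic identifiers.
STAGING_FORM_BY_GROUPER = {
    "HEAD_AND_NECK_CANCER": "AJCC_8TH_HEAD_NECK",
    "BREAST_CANCER": "AJCC_8TH_BREAST",
    "CERVICAL_CANCER": "FIGO_CERVIX",
    # Tumor types with no AJCC/FIGO staging system trigger no form.
    "UNSTAGEABLE_HEME_MALIGNANCY": None,
}

def forms_to_trigger(problem_list_groupers: list[str]) -> list[str]:
    """Return the staging forms to open for a patient's problem-list groupers."""
    return [
        form
        for grouper in problem_list_groupers
        if (form := STAGING_FORM_BY_GROUPER.get(grouper)) is not None
    ]

# Example: a breast cancer grouper triggers the AJCC breast form,
# while an unstageable diagnosis triggers nothing.
assert forms_to_trigger(
    ["BREAST_CANCER", "UNSTAGEABLE_HEME_MALIGNANCY"]
) == ["AJCC_8TH_BREAST"]
```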

At each disease team’s multidisciplinary tumor conference, a baseline staging report was presented to the disease team, together with the staging initiative’s goals and methods. Epic Crystal was used to track staging, and a best practices advisory was created within the Epic EHR. Twice per month, a detail report was emailed to each provider, and an automatic in-basket message was sent to providers for each unstaged encounter. In addition, a monthly summary report was prepared, including the staging prevalence per team and per provider. This report included each provider’s name and staging rate and was shared with the oncologic team.
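
As an illustration of what the monthly summary aggregation computes, a minimal pandas sketch follows; the column names are assumptions, and the production reports were built in Epic Crystal rather than Python.

```python
import pandas as pd

def monthly_summary(visits: pd.DataFrame) -> pd.DataFrame:
    """Staging compliance rate per team and provider for one month of visits.

    Assumes hypothetical columns 'team', 'provider_name', and a boolean
    'compliant' flag for each new patient visit.
    """
    return (
        visits.groupby(["team", "provider_name"])["compliant"]
        .agg(n_visits="count", staging_rate="mean")
        .reset_index()
        .sort_values(["team", "staging_rate"], ascending=[True, False])
    )
```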

Some team-specific modifications were made to optimize compliance and workflow. The breast team requested that PAs and NPs be included in Epic staging and in the monthly report. The gynecology team followed FIGO staging guidelines rather than AJCC guidelines. The BMT and colorectal teams requested 2 months to stage, rather than 1 month as in other teams, to allow inclusion and complete interpretation of additional relevant data for TNM staging, such as additional radiographic, surgical, and laboratory data findings.

Statistical Analysis

Patient visit information in the EHR was analyzed from January 1, 2014, through December 31, 2018. Some visits had negative values for time to stage, which represented completion of the EHR staging form during precharting before the visit, backdated staging information to reflect cancer diagnosis and care from a different provider, or previous entry by another provider in the MCC system. These visits were assigned a time-to-stage value of 0 days, rather than keeping their negative values or excluding them altogether, to reflect the provider’s ability to confirm that staging had indeed been completed in the EHR at the time of the visit.
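
For example, the clamping step might look like the following, assuming a pandas DataFrame with hypothetical column names (the study’s actual extract schema is not described):

```python
import pandas as pd

# Hypothetical column names for two example visits.
visits = pd.DataFrame({
    "visit_date": pd.to_datetime(["2018-03-05", "2018-03-12"]),
    "staged_date": pd.to_datetime(["2018-03-01", "2018-03-14"]),
})

days_to_stage = (visits["staged_date"] - visits["visit_date"]).dt.days
# Negative values mean staging already existed at the time of the visit
# (precharting, backdating, or prior entry by another MCC provider);
# clamp them to 0 days rather than dropping or keeping them.
visits["days_to_stage"] = days_to_stage.clip(lower=0)
print(visits["days_to_stage"].tolist())  # [0, 2]
```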

To further evaluate patient and provider predictors of staging compliance, regression analysis was performed. Univariable logistic regression models were used to evaluate patient and provider factors as independent variables associated with staging compliance, including patient age, sex, race, and CCI score, and provider years of oncologic practice, sex, and type, with binary compliant staging status as the dependent variable. Variable selection was applied using Akaike information criterion (AIC)–based stepwise selection. Variables that survived AIC-based selection were patient age, sex, and CCI score, and provider years of experience. The remaining variables (disease team, patient race, and provider sex) did not reach the threshold for statistical significance (P<.05). Model performance was evaluated using the area under the receiver operating characteristic curve (AUC) for discrimination and the Hosmer-Lemeshow test for calibration.17 All analyses were performed in the R environment for statistical computing and graphics (R Foundation for Statistical Computing).
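
A minimal sketch of this pipeline is shown below. The authors performed the analysis in R; this Python analogue is illustrative only, and the DataFrame `df` and its column names are assumptions.

```python
# Illustrative analogue of AIC-based selection, AUC, and Hosmer-Lemeshow.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy.stats import chi2
from sklearn.metrics import roc_auc_score

def backward_aic(y: pd.Series, X: pd.DataFrame):
    """Backward stepwise elimination by AIC on a logistic regression model."""
    cols = list(X.columns)
    best = sm.Logit(y, sm.add_constant(X[cols])).fit(disp=0)
    improved = True
    while improved and len(cols) > 1:
        improved = False
        for c in cols:
            trial_cols = [k for k in cols if k != c]
            trial = sm.Logit(y, sm.add_constant(X[trial_cols])).fit(disp=0)
            if trial.aic < best.aic:  # dropping c lowers AIC: keep the drop
                best, cols, improved = trial, trial_cols, True
                break
    return best, cols

def hosmer_lemeshow_p(y: np.ndarray, p: np.ndarray, groups: int = 10) -> float:
    """Hosmer-Lemeshow goodness-of-fit p-value over deciles of predicted risk."""
    decile = pd.qcut(p, groups, labels=False, duplicates="drop")
    obs = pd.Series(y).groupby(decile).sum()   # observed events per decile
    exp = pd.Series(p).groupby(decile).sum()   # expected events per decile
    n = pd.Series(p).groupby(decile).count()   # visits per decile
    stat = (((obs - exp) ** 2) / (exp * (1 - exp / n))).sum()
    return chi2.sf(stat, df=len(obs) - 2)

# Usage sketch (column names are assumptions):
# model, kept = backward_aic(df["compliant"],
#                            df[["age", "male", "cci", "race_white",
#                                "provider_years", "provider_male"]])
# p = model.predict(sm.add_constant(df[kept]))
# print(roc_auc_score(df["compliant"], p))                      # discrimination
# print(hosmer_lemeshow_p(df["compliant"].to_numpy(), p.to_numpy()))  # calibration
```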

Results

In total, 15,870 new patient encounters for 14,571 patients with stageable cancers were captured. Patient encounters with unstageable cancers were not included in quality assessment or statistical analyses. After eliminating repeat visits and providers who were no longer in the MCC system, the final dataset included 12,939 new patient visits across 88 providers in the head and neck (n=7), skin (n=2), breast (n=15), genitourinary (n=9), gastrointestinal (n=23), lung/thoracic (n=6), gynecologic (n=6), colorectal (n=5), and BMT (n=15) teams. There were 7,787 preintervention and 5,152 postintervention new patient visits (Table 1).

Table 1. Patient, Provider, and Team Characteristics

In both preintervention and postintervention arms, the highest proportions of patients were seen by the breast, gastrointestinal, and genitourinary teams. Notably, the percentage of patients with head and neck cancer increased from 4% of total new patient visits to 17%, reflecting an expansion of MCC head and neck cancer providers. There was no significant difference in patient or provider characteristics before versus after intervention. Before intervention, patients were, on average, aged 60.2 years and mostly female (57.3%) and White (66.5%), with a mean CCI score of 1.87. After intervention, patients were, on average, aged 60.6 years and mostly female (55.0%) and White (63.5%), with a mean CCI score of 1.96. Before intervention, providers had a mean of 19.96 years of experience and were mostly male (65.4%). After intervention, providers had a mean of 17.66 years of experience and were mostly male (64.1%) (Table 1).

Staging Compliance

In May 2016, before implementation of the intervention, the pilot group, consisting of the head and neck team, was 0% compliant in staging new patients within the first 7 days of the calendar month following the first patient encounter (n=34 patients). By the final month of the pilot phase in December 2016, staging compliance had increased to 94.7% (n=19 patients).

Timely staging occurred in 5.6% of new patients before intervention and in 67.4% of new patients after implementation of the intervention (P<.001). All care teams showed significant increases in the prevalence of staging compliance (Table 2), and staging prevalence trends improved substantially for every team (Figure 1). By December 2018, the final month of postintervention tracking, the average staging compliance prevalence was 78.1% (n=228 patients) (Figure 2). To assess the durability of the intervention effect, staging compliance was evaluated beyond the study period and was found to be 95%, 97%, and 93% in December 2019, January 2020, and February 2020, respectively.

Table 2. Staging Prevalence Rates

Figure 1. Staging prevalence with EHR documentation of clinical stage by the treating provider for each team.

Figure 2. Staging prevalence with EHR documentation of clinical stage by the treating provider for all teams.

Time to Staging

Among staged patients, median time to staging was 0 days in the preintervention period (n=433) and 1 day in the postintervention period (n=3,471; P=.007) (Table 3). Patient encounters that were never staged were excluded from the time-to-staging analysis. Combined time to staging was calculated separately for the BMT and colorectal teams to reflect staging behaviors under their extended deadlines. When analyzed by patient and provider characteristics, the median time to stage increased from the preintervention to the postintervention arm, but these changes were typically <5 days and therefore clinically insignificant; median time to stage did not exceed 4 days in either arm. These results were consistent in team-specific analyses, except in the BMT and colorectal teams. For BMT providers, median time to stage increased from 0 to 21.5 days (P<.001). For colorectal providers, median time to stage remained 15 days, with no significant change (P=.807).

Table 3. Days to Complete Staging Documentation for New Patient Visit

Regression Analysis

Overall, the intervention effect on the binary outcome of staging remained significant after adjusting for covariates using logistic regression. The odds of being staged were 34.6-fold greater in the postintervention group than in the preintervention group after adjustment for patient age, sex, and comorbidity score and provider years of oncologic practice. Notably, provider years of oncologic practice (P<.001) had a negative coefficient, suggesting an inverse relationship between staging compliance and provider experience. Comorbidity (P<.001) had a positive effect on staging (ie, improved staging compliance), whereas patient male sex (P=.001) had a negative effect (ie, worsened staging compliance). In addition, patient age trended toward a negative coefficient (P=.059) (Table 4). This model had an AUC of 0.868 (95% CI, 0.861–0.875) (Figure 3A) and was well calibrated, passing the standard Hosmer-Lemeshow test (P=.237). When stratified by provider experience, the predicted probability of staging was lower among providers with more years of oncologic practice (Figure 3B).
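
Schematically, using our own notation for the AIC-selected covariates (not the authors’), the fitted model and the adjusted odds ratio for the intervention are:

```latex
\log\frac{P(\text{staged})}{1-P(\text{staged})}
  = \beta_0 + \beta_1\,\text{postintervention} + \beta_2\,\text{age}
  + \beta_3\,\text{male} + \beta_4\,\text{CCI}
  + \beta_5\,\text{provider years},
\qquad
\mathrm{OR}_{\text{postintervention}} = e^{\beta_1} \approx 34.6 .
```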

Table 4. Coefficients of Logistic Regression Model Using AIC-Selected Variables

Figure 3. (A) ROC curve for logistic regression model to predict timely staging of cancer in the electronic health record. The green line represents the performance of a random classifier. (B) Regression model showing staging prediction probability, calibrated by years of provider experience. With increasing years of provider experience, visits were less likely to be staged in our model.

Discussion

In this report, we show that timeliness of cancer staging documentation in the EHR can be improved through a comprehensive initiative at an academic NCI-designated comprehensive cancer center. As described, this initiative incorporated educational materials and individualized training to target clinician familiarity and awareness of staging documentation guidelines and current practices, which lowered team- and individual-specific barriers and resulted in behavior change toward improved patient care.18 The intervention effect was durable, with consistently high overall prevalence of staging compliance at the most recent tracking.

The intervention was piloted in one disease team, allowing adjustments before rolling out the optimized initiative to other disease teams. Meetings with each team at respective multidisciplinary tumor conferences for feedback enabled team-specific adjustments. Openly reporting staging rates for each provider to disease team members by name created a peer comparison effect to make providers aware of their absolute performance and relative ranking. Peer comparison has been studied as an effective nonfinancial incentive to change physician behavior, especially when comparators are part of the same group and in close proximity. The effect of social pressure is more powerful when peer comparisons are unblinded to the names and performance of colleagues, as in our intervention.19

The intervention resulted in a pronounced and durable increase in staging compliance across all disease teams. However, there was some team-to-team variability in postintervention compliance and time-to-staging results. For example, the BMT team had a postintervention compliance of 46.7%, lower than that of other teams. In addition, although the BMT and colorectal teams specifically requested 2 months rather than the 1 month allotted to other teams, both teams showed improved compliance after intervention (46.7% and 75.2%, respectively) and by the end of the study period (64.3% and 81.8%, respectively). The 2 teams achieved a median time to stage of 18 days overall, suggesting that a 1-month allotment may be sufficient for staging. Although there is no specific literature on the relative difficulty of staging certain cancers versus others, we acknowledge that disease- and site-specific challenges exist at both the clinical practice and staging guideline levels. Designating the same 1-month deadline for the BMT and colorectal teams might have decreased time to stage, but because of team-specific barriers, compliance with this shorter deadline may have been lower, resulting in decreased staging compliance. We hope that our findings inspire further research into differential staging practices between disease teams, and this topic is currently an active target of quality improvement intervention at our institution.

In our logistic regression model, provider years of oncologic practice was inversely associated with staging compliance. Slower EHR adoption and decreased use in older physicians have been well documented. The National Ambulatory Medical Care Survey (2002–2011) revealed that physicians aged ≥55 years had nearly half the rate of EHR adoption compared with those aged ≤45 years.20 Increased attention during training and rollout should be provided to older providers to ensure adequate EHR staging compliance while maintaining efficient clinic workflow.

In our model, higher comorbidity score and younger patient age were associated with increased staging compliance. Although the relationship between comorbidity and EHR completeness has not been formally investigated, others have implied that the amount of exposure to the healthcare system increases opportunities for documentation.21 Patient female sex trended toward increased staging compliance (P=.059), which may be explained by the higher proportion of patients seen by the breast team (25.6% and 24.7% in preintervention and postintervention arms, respectively). Although all teams showed a dramatic increase in compliance, the breast team also had a higher relative preintervention staging rate at 11.2%.16

Limitations of our study include the single-institution design, which could limit generalizability, although the Epic EHR captures a significant portion of the academic healthcare system market.22 The reported improvements in staging compliance could have resulted from measurement itself rather than from feedback, educational resources, or one-to-one training. The Hawthorne effect, in which providers alter their behavior because they know they are being observed, likely contributed; however, this feature was used deliberately in our intervention in the form of regular audits to maintain staging at a high level and should not be considered a limitation. In addition to comprehensive intervention, this project underscores the importance of actively observing documentation compliance. Further investigation may reveal whether enhancements in stage documentation and workflow translate to improved patient outcomes.

Conclusions

Our study findings showed significantly improved compliance with cancer staging documentation by oncology providers with the implementation of a multimodal EHR quality improvement initiative. Crucial to our effort were stakeholder input and buy-in and continuous monitoring of intervention progress. The advent of EHRs has led to new methods for clinicians to examine and improve cancer diagnosis and treatment.

References

1. Evans TL, Gabriel PE, Shulman LN. Cancer staging in electronic health records: strategies to improve documentation of these critical data. J Oncol Pract 2016;12:137–139.
2. Asare EA, Washington MK, Gress DM, et al. Improving the quality of cancer staging. CA Cancer J Clin 2015;65:261–263.
3. National Quality Forum. Cancer endorsement maintenance 2011 final report: December 2012. Washington, DC: National Quality Forum; 2012.
4. Cecchini M, Framski K, Lazette P, et al. Electronic intervention to improve structured cancer stage data capture. J Oncol Pract 2016;12:e949–e956.
5. Warner JL, Levy MA, Neuss MN, et al. ReCAP: feasibility and accuracy of extracting cancer stage information from narrative electronic health record data. J Oncol Pract 2016;12:157–158; e169–e179.
6. Amin MB, Edge SB, Greene FL, et al., eds. AJCC Cancer Staging Manual, 8th ed. New York, NY: Springer; 2017.
7. Burke HB. Improving the safety and quality of cancer care. Cancer 2017;123:549–550.
8. Abernethy AP, Herndon JE, Wheeler JL, et al. Poor documentation prevents adequate assessment of quality metrics in colorectal cancer. J Oncol Pract 2009;5:167–174.
9. Lankshear S, Brierley JD, Imrie K, et al. Changing physician practice: an evaluation of knowledge transfer strategies to enhance physician documentation of cancer stage. Healthc Q 2010;13:84–92.
10. Filson CP, Boer B, Curry J, et al. Improvement in clinical TNM staging documentation within a prostate cancer quality improvement collaborative. Urology 2014;83:781–787.
11. Bellamy L, Purcell WT. One institution’s experience with implementation of EPIC/Beacon: lessons learned. Oncology (Williston Park) 2014;28:105–106, 108, C3.
12. Bhatla N, Berek JS, Cuello Fredes M, et al. Revised FIGO staging for carcinoma of the cervix uteri. Int J Gynaecol Obstet 2019;145:129–135.
13. Bhatla N, Aoki D, Sharma DN, et al. Cancer of the cervix uteri. Int J Gynaecol Obstet 2018;143(Suppl 2):22–36.
14. Greene FL, Page DL, Fleming ID, et al., eds. AJCC Cancer Staging Manual, 6th ed. New York, NY: Springer; 2002.
15. Edge SB, Byrd DR, Compton CC, et al., eds. AJCC Cancer Staging Manual, 7th ed. New York, NY: Springer; 2010.
16. American Joint Committee on Cancer. Physician. Accessed December 15, 2020. Available at: https://cancerstaging.org/CSE/Physician/Pages/Physician.aspx
17. Iezzoni LI, Shwartz M, Ash AS, et al. Risk adjustment methods can affect perceptions of outcomes. Am J Med Qual 1994;9:43–48.
18. Cabana MD, Rand CS, Powe NR, et al. Why don’t physicians follow clinical practice guidelines? A framework for improvement. JAMA 1999;282:1458–1465.
19. Navathe AS, Emanuel EJ. Physician peer comparisons as a nonfinancial strategy to improve the value of care. JAMA 2016;316:1759–1760.
20. Decker SL, Jamoom EW, Sisk JE. Physicians in nonprimary care and small practices and those age 55 and older lag in adopting electronic health record systems. Health Aff (Millwood) 2012;31:1108–1114.
21. Wells BJ, Chagin KM, Nowacki AS, et al. Strategies for handling missing data in electronic health record derived data. EGEMS (Wash DC) 2013;1:7.
22. Kanakubo T, Kharrazi H. Comparing the trends of electronic health record adoption among hospitals of the United States and Japan. J Med Syst 2019;43:224.

Submitted July 7, 2020; final revision received December 15, 2020; accepted for publication December 16, 2020.

Published online August 27, 2021.

Author contributions: Study concept and design: Kane, Califano. Implementation of interventions: Ramsey, Gold, Califano. Data collection: Ramsey. Data analysis: Kim. Supervision of data collection and analysis: Califano. Analytic design: Mohamed. Verification of analytic method: Lee. Manuscript – original draft: Lee, Faraji, Califano. Figure design: Lee, Kim, Faraji. Manuscript – review and editing: Mohamed, Ramsey, Kim, Kane, Gold, Faraji, Califano.

Disclosures: The authors have disclosed that they have not received any financial consideration from any person or organization to support the preparation, analysis, results, or discussion of this article.

Correspondence: Joseph A. Califano III, MD, Division of Otolaryngology – Head and Neck Surgery, Department of Surgery, University of California, San Diego, 9300 Campus Point Drive, La Jolla, CA 92037. Email: jcalifano@ucsd.edu


