Background
Cancer care in the United States is in a period of rapid and essential transformation. The lion’s share of research progress, and the attendant improvement in patient care, occurs at the nation’s academic cancer centers, particularly those that are members of the Association of American Cancer Institutes (AACI), a 60-year-old alliance of 70 centers that have achieved NCI designation and 26 additional centers that aspire to do so (2 more AACI centers are in Canada). A subset of these centers (28 at the time of writing) has aligned to develop and share best approaches to cancer care through the National Comprehensive Cancer Network (NCCN).1 Through their emphasis on high-quality cancer care, both NCI and NCCN support academic cancer centers’ efforts to advance care, reduce cost, and improve outcomes and survival, thereby increasing the value of cancer care.
These aspirations encounter complex challenges marked by scattershot implementation and a tangled web of demographic, geographic, socioeconomic, and political threads that vary with location and patient population. These idiosyncrasies have been cataloged only through NCCN and NCI information for centers that report data to the NIH.2 Those data do not provide a detailed view of cancer care delivery and prevention networks,3 and publicly available data do not allow direct comparison and benchmarking.
Against this backdrop, AACI undertook a first-of-its-kind mixed-methods survey of the status of cancer care across cancer center networks. The survey addressed the geographic reach of care, the distribution of oncology expertise, physician support of hospital networks, electronic medical record (EMR) consolidation, support for clinical trials, management of quality measurements and outcomes across networks, and the use of care paths and navigators.
Methods
Participants
At the time of the survey, AACI had 98 academic cancer center members. The survey was sent to 90 cancer centers in the United States and 1 in Canada; the remaining 7 centers did not receive it (5 basic science centers, 1 newly established center that was not yet treating patients, and 1 center in Canada). Differences in health system structures between the United States and Canada made the survey difficult for AACI’s Canadian members to complete.
Design
The AACI Network Care Initiative Steering Committee, composed of cancer center directors and chief medical officers, designed a mixed-methods descriptive survey. To help standardize responses, the main cancer center was defined as a tertiary care center, also known as the flagship treatment and research facility (inpatient and outpatient), where state-of-the-art clinical services are provided. The main cancer center provides specialized diagnosis, treatment, prevention, and care. Off-site, hospital-linked disease program facilities were considered part of the main cancer center if owned and operated as one unit.
Network practice sites were defined as multiple physician practice sites providing clinical inpatient or outpatient care. Network sites may or may not share the cancer center, medical center, or university’s branding and can be located within the United States or abroad. This definition was approved by the authors, piloted at 8 centers for applicability, and agreed to by all participating centers. The survey consisted of 27 multiple-choice and short-answer questions. If a respondent reported no network sites, they did not complete the remaining 26 questions.
Procedure
The survey was built in Qualtrics assessment software and emailed to each cancer center’s director and chief administrative director. Recipients were asked either to complete the survey themselves or to delegate it to an employee who oversees the cancer center’s network system. The first round of responses was collected between September 2017 and January 2018. A reminder was sent in March 2018, and the survey closed in December 2018.
Results
Demographic Characteristics of Survey Respondents
Table 1 shows demographic characteristics of survey respondents and nonrespondents. Of the 69 respondents, 74% were NCI-designated, 87% were part of a matrix health system, and 13% were freestanding. A plurality of centers were in the South (36%), followed by the Midwest (26%), Northeast (23%), and West (13%). The one Canadian respondent was part of a matrix system and reported no network sites.
Table 1. Demographics of Cancer Center Survey Respondents Versus Nonrespondents
Of 69 responding centers, 56 reported ≥1 network practice site; 13 had no network sites and did not complete the remainder of the survey. We emphasized breadth of reporting and did not aim to identify or stratify individual centers. Respondents reported 2 to 31 in-state network sites, 0 to 18 out-of-state sites, and 0 to 4 international sites. Table 2 shows the total number of NCI- and non–NCI-designated in-state, out-of-state, and international network sites reported by survey respondents. Twenty-nine respondents (52%) reported that the closest in-state network site was within 5 miles of the main cancer center (range, 0–114 miles). The median distance to the farthest site was 82 miles (mean, 231 miles; range, 0–2,983 miles).
Table 2. Total Number of All Network Practice Sites
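The distance figures above are simple descriptive statistics. As an illustration only, the short Python sketch below computes the same kinds of summaries (median, mean, and range) from a list of farthest-site distances; the distance values are hypothetical placeholders, because the underlying survey responses are not publicly available.

```python
# Illustration only: summary statistics of the kind reported above,
# computed from a hypothetical list of farthest-site distances in miles.
# The actual survey responses are not public; these values are placeholders.
from statistics import mean, median

farthest_site_miles = [0, 12, 45, 82, 110, 300, 2983]  # hypothetical data

print(f"Median: {median(farthest_site_miles)} miles")
print(f"Mean:   {mean(farthest_site_miles):.0f} miles")
print(f"Range:  {min(farthest_site_miles)}-{max(farthest_site_miles)} miles")
```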
Patient Visits Across Sites
Respondents reported the numbers of total unique, physician, and infusion patient visits over the last fiscal year or 12-month period. Of the 43 centers reporting, the medians at the main center were 14,975 unique visits, 50,000 physician visits, and 31,205 infusion visits. Of the 32 respondents who reported total patient visits at the networks over the same period, the medians were 12,615 unique visits, 29,435 physician visits, and 21,334 infusion visits.
Physician Workforce
Respondents reported the total numbers of medical/hematologic oncologists, radiation oncologists, and surgeons at the main center and network sites. Figure 1 compares the total number of physicians each center reported at its networks with the total number of networks each center reported. Of the 44 respondents, 91% reported network sites staffed by, at a minimum, board-certified medical/hematologic oncologists and radiation oncologists.
Use of Patient Navigators
Figure 2 shows how cancer centers use patient navigators at the main center and network sites. Ten respondents said their centers did not use patient navigators. The largest number of respondents (43/56) used navigators for disease-specific populations; 41 used them for assisting new patients, 35 for patients with financial needs, 34 for underserved populations, and 16 for transitioning patients on or off clinical trials. Twenty-four respondents reported that patient navigators were used for all patients. Of this subgroup, 21% indicated that patient navigators were used for all patients only at the network sites, whereas 37% indicated that they were used for all patients at both the main cancer center and network sites. Our study did not include questions allowing stratification of the data between nurse and lay navigators.
Cancer Center Care Paths and Quality Measures Across Network Sites
Respondents identified the type of care path used at their centers and described the impact of care paths in certain areas of care. Of the 56 respondents with network sites, 7 (13%) used a homegrown system, 9 (16%) used a system developed by a commercial vendor such as ClinicalPath (formerly Via Oncology), 3 (5%) used insurance carrier–defined care paths, and 16 (29%) used more than one type of care path, including care path models relying on NCCN Clinical Practice Guidelines in Oncology (NCCN Guidelines) for patient management.1 Twenty-one (38%) said monitored care paths were not available at network sites.
Respondents were asked whether using care paths had led to an overall increase, decrease, or no change in certain areas of care since they had been implemented at the main center and network sites (Figure 3). Centers using care paths reported an increase in consistency of care (92% of main centers and 96% of networks), care efficiency and use (82% of main centers and 86% of networks), coordination of care across sites (88% of main centers and 95% of networks), and survival outcomes (76% of main centers and 88% of networks), and a decrease in hospital admissions (72% of main centers and 68% of networks).
Respondents also identified certification programs in which main centers and network sites participate. Thirty-two said their network practice sites participate in the Quality Oncology Practice Initiative (QOPI), whereas 17 indicated that their main centers and network practice sites participate in “other” certification programs (Children’s Hospitals’ Solutions for Patient Safety, the National Accreditation Program for Breast Centers, and the Agency for Healthcare Research and Quality Patient Safety Certificate Program).
Access to EMRs
Respondents reported that 57% of networks had complete, integrated access to their main center’s EMRs, whereas 5 respondents reported no access between networks and the main center (Figure 4). Conversely, only half of main centers had full EMR access to their entire networks, and 6 main centers had no EMR access to their network sites. Respondents were also asked to describe, via short answer, challenges and solutions regarding information technology system incompatibility between the main center and network practice sites. Of 28 respondents, 75% indicated that different EMR systems were used at the main center and network sites, citing the lack of an interface between each location’s EMR platform as a challenge. Solutions included hiring chief information officers to act as liaisons between sites, training staff, and building interfaces to improve communication and data transfer. Half of respondents indicated that their organization was moving all sites to the Epic cloud computing–based EMR platform.
Access to Clinical Trials and Investigational Drug Services
The survey assessed how physicians at the networks participate in clinical research (Table 3). Of the 45 respondents, 23 (51%) indicated clinicians participate in the design of investigator-initiated trials, 28 (62%) participate in trials as an investigator, and 42 (93%) participate as co-investigators. A total of 27 respondents (60%) indicated clinicians at networks participate in clinical research review committees, 31 (69%) said their main cancer center provides financial support to the network sites for full-time equivalents (FTEs) associated with clinical research (eg, coordinators, research nurses, or regulatory staff), and 22 (39%) indicated that investigational drug services are available at the networks.
Table 3. Clinical Research Participation at Networks
Respondents were asked to what extent the main center is responsible for providing funding or staffing for certain components of care at network sites. Of 39 respondents, 22 (56%) indicated that the main center provides 100% of funding for clinical research activities at the network; across all networks, a mean of 76% of clinical research funding was provided by the main center. Of 35 respondents, 15 (43%) said that the main center provided 100% of pharmacy support at the network sites; across all networks, a mean of 62% of pharmacy support was provided by the main center.
Of 45 respondents, 42% reported that the main cancer center’s investigational pharmacy supports trials in the networks. Thirty-one percent of network sites procure drugs directly from the sponsor. Twelve percent reported that pharmacy support at the networks differs based on trial type. Two centers noted that drugs for cooperative group trials were supplied directly to the network sites, whereas the main center supplied drugs for pharmaceutical and noncooperative group trials.
In 2018, an AACI Clinical Research Innovation member survey on interventional treatment clinical trial office operations received 79 responses, largely overlapping with the respondents to the network survey. That survey confirmed that clinical trial activity takes place predominantly at the main center rather than at network sites. The 79 respondents reported a median of 282 (range, 31–1,833) interventional treatment trials open at the main center. The same centers reported a median of only 22 (range, 0–710) of these trials open at a network site, most of them sponsored by national cooperative groups.
Discussion
The nation’s academic cancer centers are a major component of US cancer care. Recent consolidation of medical services into larger aggregates of care is a major driver of healthcare economics and, for patients with cancer, may result in better access to consistent and advanced care, including personalized genomics; newer modalities of care, including clinical trials; and improved access to decision-making guidance and multidisciplinary management, including complex surgery. The potential value for patients is clear: improved satisfaction and outcomes and reduced treatment delays and risks. This survey’s results show considerable variability with respect to those goals. Clearly, as the nation’s major academic cancer centers spread across geographically dispersed sites, coordination and consistency of cancer care delivery remain a work in progress.
Regionally, most respondents were in the Northeast, South, or Midwest. Participation by AACI centers located in the West, particularly California, was relatively low, indicating an opportunity for further study of cancer care at networks in this region. Collectively, the network sites’ burden, and their advantage, is the arduous task of coordinating care across large regions and through many different configurations, often confronting conflicting legacy structures, employment models, and medical record integration. Of the 56 survey respondents who provided mileage counts between sites, 3 indicated that care was being administered at network sites more than 1,000 miles from the main campus, and the mean distance to the farthest site was 288 miles.
Some respondents (23%) have many physicians located throughout their networks, whereas others have very few, indicating the broad diversity of employment and deployment systems managing these sites. There was a correlation between the number of sites and the number of physicians (ie, the more sites reported, the higher the total number of network physicians), except for 2 centers that reported >120 physicians at their networks despite having only the sixth and eighth highest numbers of networks, respectively. Notably, all but one freestanding cancer center respondent elected not to provide physician numbers at networks, with some indicating that many physicians rotate between the main and network sites, making counting difficult.
Of 56 respondents reporting network sites, only 24 reported use of navigation for all patients, highlighting the need for broad expansion of navigation across the nation’s networks. With the ever-increasing complexity of cancer care, many centers rely on patient navigators to help patients transition from initial diagnosis to treatment. Navigation has been shown to increase patient awareness and self-management, improve the physician–patient relationship, and increase referrals to other specialties.4
Some centers use cancer practice pathways across networks.5 Of 56 respondents reporting network sites, most (63%) use some type of care path at their network sites. The benefits of care path coordination at network sites were evident in our survey, with respondents reporting increases in survival outcomes, care efficiency, care consistency, and care coordination. In addition, 68% of respondents saw a decrease in hospital admissions. However, we found no consistent application of practice pathways across centers, which dovetails with available research on care path practices and points to an area of potential improvement if the goal is to standardize approaches across network sites.6 ASCO has issued guidelines for implementing high-value care paths, but their uptake is unclear.7
Likewise, quality measures, such as participation in certification programs, are lacking across center–network pairs. The mandated quality measures, with additional oversight through the Centers for Medicare & Medicaid Services Oncology Care Model (OCM), were the most commonly implemented across centers and networks.8 However, no dominant approach to measuring quality exists. In addition to the OCM, most survey respondents used ≥1 quality measure: ASCO’s QOPI or the Commission on Cancer was cited by a median of 63% of network sites. Evidence on the efficacy of these metrics indicates improved patient outcomes.9
Improved coordination of services is hampered by the lack of an integrated medical records system. Of 49 respondents, 57% reported that network sites had full access to the main center’s EMRs, which is consistent with available research on how main centers and networks share EMRs.10 Many respondents noted challenges with EMR integration between the main center and networks. However, several indicated that their centers were working toward full integration of EMR systems, offering the potential to improve care coordination between sites and to reduce costs in staff time and effort.
Improved access to novel therapeutics and clinical trials is a hallmark of cancer centers. Our findings show that network sites receive scant investment or center support for clinical trial activity, including investigational drug services, and that network physicians are infrequently principal investigators. By including network sites, centers benefit from a larger patient base, particularly for accrual to biomarker-driven (genomic, immunohistochemistry) trials.
Although further analysis of care coordination between main centers and network sites in North America is needed, we propose the following to improve cancer center organization and coordination and to encourage high-quality care across geographic regions:
- More uniform distribution of subspecialty expertise across network sites, especially in underserved geographic regions, and more active workforce planning and use of navigators to accommodate patient and referral needs
- Consistent access to EMRs across networks to enhance care coordination, decrease referral gaps and lapses in handoffs, and improve navigator function
- EMR assessments for quality measures and value-based purchasing, increasing consistency of care through care paths and quality measures
- Consistent quality measures across network sites to improve outcomes and benchmarking through the OCM and QOPI
- Better clinical trial and research pharmacy access at network sites, particularly rural sites, while boosting the utility and actionability of personalized genomics and decision-making; because populations are not evenly distributed by income, ethnicity, race, or age, this dissemination would greatly improve the equity of cancer care
- Adoption of care paths to standardize treatments across sites, including tumor boards, genomic tumor referrals, and standardized diagnostics, to reduce overall cost and improve consistency of care, outcomes, and patient satisfaction
Conclusions
As personalized genomics, imaging, and other risk stratification become commonplace, cancer center networks must consolidate data access and interpretation to optimize clinical decision-making, referrals, and clinical trial access. Our survey findings reinforce this need. Although optimal cancer care consolidation remains a moving target, opportunities exist to link the incorporation of network sites to initiatives that will improve value, cost efficiencies, and patient outcomes.
References
1. National Comprehensive Cancer Network. NCCN Guidelines. Accessed October 25, 2019. Available at: https://www.nccn.org/professionals/physician_gls/default.aspx
2. National Institutes of Health, National Cancer Institute. NCI-Designated Cancer Centers. Accessed October 25, 2019. Available at: https://www.cancer.gov/research/nci-role/cancer-centers
3. Ribisl KM, Fernandez ME, Friedman DB, et al. Impact of the Cancer Prevention and Control Research Network: accelerating the translation of research into practice. Am J Prev Med 2017;52:S233–240.
4. Riley S, Riley C. The role of patient navigation in improving the value of oncology care. J Clin Pathw 2016;2:41–47.
5. Daly B, Zon RT, Page RD, et al. Oncology clinical pathways: charting the landscape of pathway providers. J Oncol Pract 2018;14:e194–200.
6. Chen KS, Glaser SM, Garda AE, et al. Utilizing clinical pathways and web-based conferences to improve quality of care in a large integrated network using breast cancer radiation therapy as the model. Radiat Oncol 2018;13:44.
7. Zon RT, Edge SB, Page RD, et al. American Society of Clinical Oncology criteria for high-quality clinical pathways in oncology. J Oncol Pract 2017;13:207–210.
8. Rocque GB, Williams CP, Hathaway AR, et al. Evaluating the impact of treatment care planning on quality measures. J Oncol Pract 2019;15:e271–276.
9. Rosenblum R, Huo R, Scarborough B, et al. Comparison of quality oncology practice initiative metrics in solid tumor oncology clinic with or without concomitant supportive oncology consultation. J Oncol Pract 2018;14:e786–793.
10. Patt D, Stella P, Bosserman L. Clinical challenges and opportunities with current electronic health records: practicing oncologists’ perspective. J Oncol Pract 2018;14:577–579.