Applying Quality Indicators to Examine Quality of Care During Active Surveillance in Low-Risk Prostate Cancer: A Population-Based Study

Authors:

Narhari Timilshina, MPH, PhD
Department of Medicine, University Health Network, Toronto, Ontario, Canada
Institute of Health Policy, Management and Evaluation, University of Toronto, Toronto, Ontario, Canada

Antonio Finelli, MD, MSc
Institute of Health Policy, Management and Evaluation, University of Toronto, Toronto, Ontario, Canada
Division of Urology and Surgical Oncology, University Health Network, Toronto, Ontario, Canada

George Tomlinson, PhD
Department of Medicine, University Health Network, Toronto, Ontario, Canada
Institute of Health Policy, Management and Evaluation, University of Toronto, Toronto, Ontario, Canada

Beate Sander, PhD
Institute of Health Policy, Management and Evaluation, University of Toronto, Toronto, Ontario, Canada
The Toronto Health Economics and Technology Assessment Collaborative, Toronto, Ontario, Canada
Toronto General Hospital Research Institute, University Health Network, Toronto, Ontario, Canada
Institute of Clinical Research Services, Toronto, Ontario, Canada
Public Health Ontario, Toronto, Ontario, Canada

Shabbir M.H. Alibhai, MD, MSc
Department of Medicine, University Health Network, Toronto, Ontario, Canada
Institute of Health Policy, Management and Evaluation, University of Toronto, Toronto, Ontario, Canada

Background: Although a few studies have reported wide variations in quality of care in active surveillance (AS), there is a lack of research using validated quality indicators (QIs). The aim of this study was to apply evidence-based QIs to examine the quality of AS care at the population level. Methods: QIs were measured using a population-based retrospective cohort of patients with low-risk prostate cancer diagnosed between 2002 and 2014. We developed 20 QIs through a modified Delphi approach with clinicians targeting the quality of AS care at the population level. QIs included structure (n=1), process of care (n=13), and outcome indicators (n=6). Abstracted pathology data were linked to cancer registry and administrative databases in Ontario, Canada. A total of 17 of 20 QIs could be applied based on available information in administrative databases. Variations in QI performance were explored according to patient age, year of diagnosis, and physician volume. Results: The cohort included 33,454 men with low-risk prostate cancer, with a median age of 65 years (IQR, 59–71 years) and a median prostate-specific antigen level of 6.2 ng/mL. Compliance varied widely for 10 process QIs (range, 36.6%–100.0%), with 6 (60%) QIs >80%. Initial AS uptake was 36.6% and increased over time. Among outcome indicators, significant variations were observed by patient age group (10-year metastasis-free survival was 95.0% for age 65–74 years and 97.5% for age <55 years) and physician average annual AS volume (10-year metastasis-free survival was 94.5% for physicians with 1–2 patients with AS and 95.8% for those with ≥6 patients with AS annually). Conclusions: This study establishes a foundation for quality-of-care assessments and monitoring during AS implementation at a population level. Considerable variations appeared in process-of-care QIs by physician volume and in outcome QIs by patient age group. These findings may represent areas for targeted quality improvement initiatives.

Background

Active surveillance (AS) for low-risk prostate cancer (PC) has become an internationally recognized standard-of-care treatment option.1 AS is increasingly used worldwide; it defers curative treatment in low-risk patients until there is evidence of disease progression, without missing the opportunity for cure.

In Ontario, approximately 9,600 patients with PC are diagnosed annually, representing 20% of all cancer diagnoses.2 Most patients with low-risk PC receive initial AS in Ontario (up to 69% in 2014)3; however, there is a lack of data on the quality of AS care in Canada and globally. Routine quality assessment assists clinicians and health service providers in delivering optimal care to men undergoing AS.4–7 The foundation for quality assessment is based on the structure-process-outcomes paradigm by Donabedian.8–11 The first quality indicators (QIs) for PC were developed more than a decade ago by investigators at RAND12–16 and have subsequently been used to show widespread variation in the quality of early-stage PC care.12,16,17 Spencer et al17 evaluated compliance with 29 quality-of-care disease-specific structure and process indicators developed by RAND from the American College of Surgeons National Cancer Database. Findings showed 70% overall compliance for structure indicators, and academic and cancer centers had higher compliance than community hospitals. Although large differences were not observed for any indicators in low-risk PC, these findings were not specific to AS.17 A study from Michigan examined the frequency of follow-up prostate-specific antigen (PSA) testing and prostate biopsy among men managed with AS,18 finding wide variation between medical practices. The median rate of guideline-concordant follow-up was 26.5% (range, 10%–68%); other QIs were not examined.18 A single population-level study found that 78.3% of men with PC were diagnosed through needle biopsy, 72.5% had ≥8 biopsy cores, 84% had tumor grade reported as per guidelines, and 23% of patients with low-risk PC received AS.19

Adoption and implementation of AS protocols at a population level remain poorly understood. There is no universal guideline for AS follow-up20–22; thus, ongoing surveillance among patients with PC likely varies given the presence of multiple guidelines and changes over time in AS care. Because almost all published data come from academic centers, whereas a substantial number of patients are managed in community settings, broad compliance with AS guidelines and the overall quality of care are largely unknown. This information would provide a broader understanding of AS implementation, particularly in community-based urologic practices, where most patients with PC are managed.

Compounding this gap, previous QIs mostly targeted curative interventions (ie, radiation therapy, radical prostatectomy, or brachytherapy) in low-risk PC and often do not apply to AS. No study has formally developed and validated QIs to evaluate the quality of AS care in low-risk PC. The substantial variations in AS use suggest gaps in the quality of care. To close these gaps, measuring AS quality of care is necessary before it can be monitored and improved.20 We recently developed QIs for measuring quality of care in patients with PC receiving AS.23 To date, the utility of these indicators has not been established. Our current study aims to apply these structure-, process-, and outcomes-based QIs to describe variations in quality of care during AS among patients with low-risk PC using population-based data.

Methods

We conducted a population-based retrospective cohort study of men with PC managed with AS in Ontario, Canada, between 2002 and 2014. The Ontario Cancer Registry (OCR) database captures the entire population with newly diagnosed cancer, along with cancer mortality.24

Determining Feasibility of Using Selected QIs in Population-Based Data

The methodology used to select QIs is described in detail in previous research.23 In brief, in the first phase we proposed a preliminary list of guideline-based QIs, based on a comprehensive literature review. A multidisciplinary panel of Canadian specialists (urologists and radiation oncologists) reached consensus and approved a total of 20 QIs through a modified Delphi methodology.

Application in Administrative Databases to Determine Feasibility

To assess the feasibility of using the 20 QIs selected by the experts in a suitable dataset, we first identified all men with a diagnosis of low-risk (Gleason score ≤6) nonmetastatic PC from the OCR.25 This cohort was linked at ICES to our previously abstracted biopsy pathology data for all men with PC who had biopsies or surgical pathology specimens in the OCR. The OCR collects information from hospitals, regional cancer centers, pathology reports, and death certificates.24 Details of the reliability and validity of administrative databases used in this study have been described previously (supplemental eTable 1, available with this article at JNCCN.org).3 The feasibility of capturing each QI using population data (based on whether the data necessary to assess the measure could be found in linked databases) was established before using the QI to assess the quality of care in AS (supplemental eFigure 1).

Analysis of QIs for Measuring Quality of Care

We used an available case analysis approach for each QI, meaning that patients whose information could not be retrieved were excluded from the numerator and denominator for each specific QI and were classified as missing for that QI (available denominators for each QI are provided in supplemental eTable 2).
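To make the available case rule concrete, the following is a minimal sketch (in Python, not the authors' SAS code) of how compliance for a single QI could be computed when some patients have unretrievable data; the qi_flag values are hypothetical.

```python
import numpy as np
import pandas as pd

# Hypothetical per-patient flag for one QI: True = indicator met,
# False = not met, NaN = required data could not be retrieved.
qi_flag = pd.Series([True, False, np.nan, True, True, np.nan])

available = qi_flag.dropna()             # available case analysis: drop missing
numerator = int(available.sum())         # patients meeting the indicator
denominator = len(available)             # patients with retrievable data
print(f"Compliance: {numerator}/{denominator} = {numerator / denominator:.1%}")
```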

Performance on all QIs was summarized using percentages with 95% confidence intervals calculated based on the large-sample normal approximation method. All time-to-event outcome QIs were estimated based on Kaplan-Meier estimates of the cumulative incidence function. Variation in AS care was assessed by patient age group, physician annual AS volume, and year of diagnosis. We chose these 3 illustrative variables based on their importance in the PC literature. For a physician’s mean annual AS volume, we used the total number of patients divided by the physician’s actual number of years of practice during the study period (2002–2014). We then specified the average annual number as a categorical variable (based on tertiles of average annual AS volume among the 364 physicians in the dataset): 1–2 patients, 3–5 patients, and ≥6 patients. We used a threshold of 80% to define a good score for each QI.26,27 We used the Cochran-Armitage test to assess monotonic relationships between the percentages meeting each QI and important covariates. All statistical analyses were performed using SAS 9.4 (SAS Institute Inc.), with a 2-sided P<.05 indicating statistical significance.
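As an illustration of the analytic steps above, the sketch below (Python rather than the SAS code actually used) shows a large-sample normal-approximation 95% CI for a QI percentage, the tertile-based grouping of physician average annual AS volume, and a hand-coded two-sided Cochran-Armitage trend test; all counts and cutoffs in the example are made up for illustration.

```python
import numpy as np
from scipy import stats

def qi_ci(numerator: int, denominator: int, alpha: float = 0.05):
    """Normal-approximation (Wald) confidence interval for a QI proportion."""
    p = numerator / denominator
    z = stats.norm.ppf(1 - alpha / 2)
    half = z * np.sqrt(p * (1 - p) / denominator)
    return p, max(0.0, p - half), min(1.0, p + half)

def volume_category(avg_annual_volume: float) -> str:
    """Tertile-style grouping of physician average annual AS volume."""
    if avg_annual_volume <= 2:
        return "1-2 patients"
    if avg_annual_volume <= 5:
        return "3-5 patients"
    return ">=6 patients"

def cochran_armitage(successes, totals, scores=None):
    """Two-sided Cochran-Armitage test for trend in proportions across ordered groups."""
    r = np.asarray(successes, dtype=float)
    n = np.asarray(totals, dtype=float)
    s = np.arange(len(r), dtype=float) if scores is None else np.asarray(scores, dtype=float)
    p = r.sum() / n.sum()                                   # pooled proportion
    t = np.sum(s * (r - n * p))                             # trend statistic
    var = p * (1 - p) * (np.sum(n * s**2) - np.sum(n * s)**2 / n.sum())
    z = t / np.sqrt(var)
    return z, 2 * stats.norm.sf(abs(z))                     # z and two-sided P value

# Made-up example: trend in QI compliance across four diagnosis-year periods.
z, pval = cochran_armitage(successes=[729, 904, 973, 985],
                           totals=[1000, 1000, 1000, 1000])
print(qi_ci(366, 1000), volume_category(4.2), z, pval)
```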

Results

The 33,454 men identified as having low-risk PC had a median age of 65 years (IQR, 59–71 years), a median PSA level of 6.2 ng/mL (IQR, 4.7–8.5 ng/mL), and a median of 2 positive cores (IQR, 1–3 cores) at diagnosis (Table 1). Median follow-up was 10 years. A total of 17 of the 20 proposed QIs were deemed feasible for measuring the quality of care at the population level (supplemental eFigure 1).

Table 1. Description of Cohort (2002–2014)

QI for Structures of Care

QI-1 refers to the percentage of patients receiving AS managed by a PC specialist (urologist/radiation oncologist); overall, 99.9% of these patients were managed by PC specialists.

QIs for Process of Care

QI-4 to QI-7 refer to the process domain measuring care at PC diagnosis and AS eligibility criteria. During the study period, 91.9% of patients underwent a biopsy containing at least 8 cores (QI-4) and 85.1% of patients with low-volume disease (≤3 positive cores and <50% maximum percentage core involvement) underwent AS (QI-7) (Figure 1).

Figure 1. Summary of QI results for AS care.

Abbreviations: AS, active surveillance; DRE, digital rectal examination; OH-CCO, Ontario Health-Cancer Care Ontario; PC, prostate cancer; PSA, prostate-specific antigen; QI, quality indicator.

QI-8 to QI-14 refer to measures at follow-up during AS and at the time of switching to definitive treatment. Overall, 81.3% of patients on AS had regular follow-up with a urologist (QI-8), whereas only 42.6% of patients on AS had a confirmatory biopsy within 6 to 12 months from diagnosis (QI-10). Repeat biopsy in 2 to 5 years was noted in 76.7% of patients (QI-11), 91.5% of patients had a PC specialist visit before switching to definitive treatment (QI-12), 76.2% had a biopsy before definitive treatment (QI-13), and 88.9% received active treatment after an upgrade in the Gleason score (QI-14) (supplemental eFigure 2). Among the 10 process QIs, 6 (60%) reached the 80% threshold.

QIs for Outcomes

QI-15 to QI-20 refer to outcomes indicators for AS. Outcomes indicators showed a 5-year treatment-free survival of 50.6%; 5- and 10-year metastasis-free survivals of 98.5% and 95.5%, respectively; and 5- and 10-year PC-specific survivals of 99.6% and 98.4%, respectively. Overall survival at 10 years was 90.7%.
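For readers who want to reproduce this type of time-to-event outcome QI on their own data, the sketch below is an illustration with simulated follow-up times using the open-source lifelines package (not the authors' code); it estimates 5- and 10-year metastasis-free survival with the Kaplan-Meier estimator.

```python
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(0)
n = 500
followup_years = rng.uniform(1, 13, n)   # hypothetical follow-up times (years)
metastasis = rng.random(n) < 0.05        # hypothetical metastasis events

kmf = KaplanMeierFitter()
kmf.fit(durations=followup_years, event_observed=metastasis,
        label="metastasis-free survival")
print(kmf.survival_function_at_times([5, 10]))  # 5- and 10-year estimates
```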

QI Score by Age Group

Most process QIs decreased (ie, worsened) with increasing age, except PC specialist visits before switching to definitive therapy, which increased with age (supplemental eFigure 2). Outcomes indicators were significantly worse in older patients; 10-year metastasis-free survival was 97.5% for age <55 years, 96.3% for age 55–64 years, 95.0% for age 65–74 years, and 89.3% for age ≥75 years (P<.001). The 10-year PC-specific survival of patients in these age groups was 99.7%, 98.8%, 98.5%, and 92.9%, respectively (P<.001) (Table 2).

Table 2. Differences in Outcomes Indicators by Age Group and Physician Average Annual AS Volume

QI Score by Physician Average Annual AS Volume

There was tremendous variability in the mean annual volume of patients receiving AS managed by a physician (supplemental eFigure 3). Figure 2 shows process and outcomes QI scores by physician average annual volume. Of 3,515 patients who were managed by physicians with a large annual volume, 71% were managed at an academic center. Process QI scores related to PC diagnosis or AS eligibility criteria increased with higher physician volume. However, process QIs that assessed AS follow-up were similar across physician annual AS volume categories (Figure 2). In addition, a modest gradient in the percentage of patients receiving definitive treatment after Gleason grade progression by physician volume was observed (87.1% in the low volume tertile, 88.4% in the middle volume tertile, and 89.4% in the high volume tertile), although this finding was not statistically significant (P=.11; see supplemental eTable 3, Figure 2).

Figure 2. Process QI scores by physician average annual AS volume.

Abbreviations: AS, active surveillance; OH-CCO, Ontario Health-Cancer Care Ontario; PC, prostate cancer; pts, patients; QI, quality indicator.

Patients managed by physicians with a lower annual volume had significantly worse 10-year metastasis-free survival (94.5% in the low volume tertile vs 95.8% in the high volume tertile; P=.019). Similarly, 10-year PC-specific survival was lower in the lower physician volume category (96.9% for the group with 1–2 patients vs 98.8% for the group with ≥6 patients; P<.001). All 6 outcomes QIs had better scores with higher physician annual case volume (Table 2); however, no difference was observed in 5-year treatment-free survival by physician volume (49.9% in the low volume tertile, 47.4% in the middle volume tertile, and 50.3% in the high volume tertile) (supplemental eTable 4).
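Between-group survival comparisons of this kind could be reproduced along the following lines; the sketch uses simulated data and a log-rank test via lifelines as one reasonable choice, and the article does not state which test underlies the reported P values.

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(1)

def simulate(n_patients: int, event_prob: float):
    """Hypothetical follow-up times (years) and metastasis indicators."""
    times = rng.uniform(1, 13, n_patients)
    events = rng.random(n_patients) < event_prob
    return times, events

t_low, e_low = simulate(4_000, 0.055)      # low-volume tertile (made-up numbers)
t_high, e_high = simulate(12_000, 0.042)   # high-volume tertile (made-up numbers)

for label, times, events in [("1-2 pts/yr", t_low, e_low), (">=6 pts/yr", t_high, e_high)]:
    kmf = KaplanMeierFitter().fit(times, events, label=label)
    print(label, "10-year MFS:", float(kmf.survival_function_at_times(10).iloc[0]))

result = logrank_test(t_low, t_high, event_observed_A=e_low, event_observed_B=e_high)
print("log-rank P value:", result.p_value)
```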

Process QI Score Change Over Time

QI-4 (using ≥8 cores in diagnostic biopsy) improved significantly over time; it was 72.9% in the diagnosis years 2002–2005, 90.4% in 2006–2008, 97.3% in 2009–2011, and 98.5% in 2012–2014. In contrast, large declines over time in QI-10 (confirmatory biopsy within 1 year) were observed in these same periods: 55.3%, 44.6%, 38.8%, and 36.7%, respectively. Of the 10 process QIs, 60% improved, 20% declined, and 20% changed minimally over time. In more recent years (2012–2014), 60% of process QIs met the 80% threshold (supplemental eFigure 4).

Discussion

The present study details the quality of care for AS of a large cohort of patients with low-risk PC within a universal care delivery system in Ontario, Canada. These provincial-level data offer a broad perspective on the quality of AS care and factors driving variations in outcomes of care. We were able to explore the quality of care at the population level with 17 of 20 QIs nominated by an expert panel.23

This study provided several important findings. Overall, the vast majority of patients receiving AS were managed by PC specialists, similar to a US national Medicare study.28 A large number of low-risk patients received initial AS, and most patients receiving AS had regular follow-up as per AS guidelines. Outcomes-based QIs indicated that approximately half of patients (52.8%) discontinued AS within 5 years, metastasis-free survival was excellent, and disease-specific death rates were low.

Findings showed a relatively good level of compliance with most process QIs; however, some process QIs revealed potential gaps in quality of care (Figures 1 and 2, and supplemental eFigure 2). We also observed improvement in several process-of-care QIs in recent years (2012–2014), most notably increases of 25.5 and 16.7 percentage points in diagnostic biopsy with ≥8 cores and in appropriate follow-up care, respectively. However, there was a decline in confirmatory biopsy within 6 to 12 months (55.3% in 2002–2005 vs 36.7% in 2012–2014) and in biopsy before definitive treatment (78.9% in 2002–2005 vs 73.0% in 2012–2014). These findings suggest moderately good quality of AS care received by patients in Ontario; however, some indicators showed opportunities to improve care for older patients and those managed by lower-volume physicians. Published studies have reported substantial gaps in adherence to repeat or confirmatory biopsy, with significant variation in adherence to confirmatory biopsy within 1 year, ranging from 53.3% to 81%.18,20,29–34 A recent systematic review found that among 6 observational studies reporting on time to confirmatory biopsy, adherence was measured at different time points (within 6 months, 12 months, and 18–24 months), and findings showed a wide range of adherence to confirmatory biopsy.20 Only 2 studies reported on adherence to repeat biopsy over time,30,34 suggesting gradual improvement in adherence between 2015 and 2019 among patients receiving AS.

In our study, 71% of patients receiving AS who were managed by physicians with a higher annual case volume were also treated at an academic center. This may partly explain the observed outcome QI differences among patients managed by high-volume physicians, because treatment patterns and care plans may differ by physician practice setting.35,36 In addition, it may be challenging for busy clinicians to identify how many and which patients in an AS practice are receiving appropriate follow-up and high-quality care (eg, regular measurement of PSA levels, repeat biopsies, and close follow-up) without an intelligent electronic management record.37 These findings suggest that quality improvement initiatives should target low-volume or community practice settings to minimize gaps in AS outcomes-related QIs and address variations in AS outcomes.

We observed that scores for several process and outcomes indicators were lower for men aged ≥75 years than for those aged <55 years. Most process QIs for men aged ≥75 years had lower quality scores, which may have contributed to the worse metastasis-free survival and PC-specific death rates seen in older patients. Three major guidelines38 recommended a threshold of age ≤75 years as an inclusion criterion for AS. However, a modified Delphi study of international PC experts suggested that a reasonable state of health and life expectancy were important principles for AS initial selection rather than age.39 It is possible that some of those men aged ≥75 years were being transitioned from AS to watchful waiting. More data and more-nuanced guidelines are needed for AS in men aged ≥75 years.

Increasing importance is being placed on meeting standards defined by clinical guidelines or benchmarks for the care delivered.40 A recent commentary proposed a threshold of 75% for receipt of a confirmatory biopsy within 6 months of diagnosis.30 We used a threshold of 80% for all process QIs. These thresholds are similar to one another but, although commonly used, remain somewhat arbitrary.

Our results must be considered in light of several limitations. One limitation is incomplete clinical information for certain QIs, in particular clinical stage, PSA measurement during AS follow-up, comorbidities, and family history, which are not commonly available in administrative databases. We also observed differences in the pathologic characteristics and types of patients managed by radiation oncologists versus urologists, as well as differences between academic and community settings for patients receiving AS; therefore, we did not report QI scores by hospital type or physician type, because these factors are difficult to modify and may reflect intrinsically different patient populations. In addition, although Birkmeyer et al6 recommended quality of life and functional status as important outcome indicators, these are not available at the population level. We only examined quality of care across 3 variables: patient age, physician volume, and year of diagnosis. There are undoubtedly multiple other variables that may be relevant to variation in quality of care (eg, race, socioeconomic status). Because of the substantially larger number of patients in the highest physician volume tertile, our ability to detect small differences between low and high physician volumes for some outcomes may have been limited. Finally, due to limited availability of data on serial biopsies, we only reported the first repeat biopsy after the confirmatory biopsy.

To our knowledge, this is the first study reporting on the quality of AS care at the population level. A major strength of our study is the application of formally developed QIs.23 We used population-level data, which reduced selection bias and represented 98% of all PC cases in Ontario.41 The direct abstraction of pathology data ensured that all selected patients had a Gleason score ≤6 at diagnosis and that coding was standardized.

Conclusions

Although AS is widely practiced, there is limited information at a population level on the quality of AS care. This study applied 17 expert-selected QIs to measure the quality of care during AS. Many elements of process of care and oncologic outcomes at a population level are similar to those in published data from academic centers. This provides assurance to patients receiving AS and their physicians that the quality of care received is generally consistent with current clinical guidelines. However, several potential gaps in quality of care were identified, particularly by physician volume and patient age. This study established a foundation on which to build benchmarking for quality-of-care assessment and to monitor the quality of care for patients receiving AS. Future efforts can focus on quality improvement initiatives and on measuring the appropriateness of care according to AS guidelines. Further research is needed to evaluate the causes of variation in quality of care by hospital type (academic vs community, cancer center vs other), which may also improve outcomes for patients receiving AS.

References

1. Cooperberg MR, Lin DW, Morgan TM, et al. Active surveillance: very much “preferred” for low-risk prostate cancer. J Urol 2022;207:262–264.
2. Canadian Cancer Statistics Advisory Committee in collaboration with the Canadian Cancer Society, Statistics Canada, and the Public Health Agency of Canada. Canadian cancer statistics 2021. Accessed January 12, 2023. Available at: https://cdn.cancer.ca/-/media/files/research/cancer-statistics/2021-statistics/2021-pdf-en-final.pdf
3. Timilshina N, Komisarenko M, Martin LJ, et al. Factors associated with discontinuation of active surveillance among men with low-risk prostate cancer: a population-based study. J Urol 2021;206:903–913.
4. Sampurno F, Earnest A, Kumari PB, et al. Quality of care achievements of the Prostate Cancer Outcomes Registry-Victoria. Med J Aust 2016;204:319.
5. Cheng JY. The Prostate Cancer Intervention Versus Observation Trial (PIVOT) in perspective. J Clin Med Res 2013;5:266–268.
6. Birkmeyer JD, Dimick JB, Birkmeyer NJ. Measuring the quality of surgical care: structure, process, or outcomes? J Am Coll Surg 2004;198:626–632.
7. Birkmeyer JD, Siewers AE, Finlayson EV, et al. Hospital volume and surgical mortality in the United States. N Engl J Med 2002;346:1128–1137.
8. Donabedian A. The quality of care: how can it be assessed? JAMA 1988;260:1743–1748.
9. Donabedian A. The role of outcomes in quality assessment and assurance. QRB Qual Rev Bull 1992;18:356–360.
10. Blumenthal D, Epstein AM. Quality of health care. Part 6: the role of physicians in the future of quality management. N Engl J Med 1996;335:1328–1331.
11. Ullman M, Metzger CK, Kuzel T, et al. Performance measurement in prostate cancer care: beyond report cards. Urology 1996;47:356–365.
12. Spencer BA, Steinberg M, Malin J, et al. Quality-of-care indicators for early-stage prostate cancer. J Clin Oncol 2003;21:1928–1936.
13. Miller DC, Litwin MS, Sanda MG, et al. Use of quality indicators to evaluate the care of patients with localized prostate carcinoma. Cancer 2003;97:1428–1435.
14. Miller DC, Saigal CS. Quality of care indicators for prostate cancer: progress toward consensus. Urol Oncol 2009;27:427–434.
15. Patt D, Page R. Measuring quality is complicated. J Oncol Pract 2018;14:12.
16. Hillner BE, Smith TJ, Desch CE. Hospital and physician volume or specialization and outcomes in cancer treatment: importance in quality of cancer care. J Clin Oncol 2000;18:2327–2340.
17. Spencer BA, Miller DC, Litwin MS, et al. Variations in quality of care for men with early-stage prostate cancer. J Clin Oncol 2008;26:3735–3742.
18. Luckenbaugh AN, Auffenberg GB, Hawken SR, et al. Variation in guideline-concordant active surveillance follow-up in diverse urology practices. J Urol 2017;197:621–626.
19. Ortelli L, Spitale A, Mazzucchelli L, et al. Quality indicators of clinical cancer care for prostate cancer: a population-based study in southern Switzerland. BMC Cancer 2018;18:733.
20. Kith G, Lisker S, Sarkar U, et al. Defining and measuring adherence in observational studies assessing outcomes of real-world active surveillance for prostate cancer: a systematic review. Eur Urol Oncol 2021;4:192–201.
21. Sampurno F, Zheng J, Di Stefano L, et al. Quality indicators for global benchmarking of localized prostate cancer management. J Urol 2018;200:319–326.
22. Cooperberg MR. Active surveillance for low-risk prostate cancer—an evolving international standard of care. JAMA Oncol 2017;3:1398–1399.
23. Timilshina N, Finelli A, Tomlinson G, et al. National consensus quality indicators to assess quality of care for active surveillance in low-risk prostate cancer: an evidence-informed, modified Delphi survey of Canadian urologists/radiation oncologists. Can Urol Assoc J 2022;16:E212–219.
24. McLaughlin JR, Kreiger N, Marrett LD, et al. Cancer incidence registration and trends in Ontario. Eur J Cancer 1991;27:1520–1524.
25. Richard PO, Alibhai SM, Panzarella T, et al. The uptake of active surveillance for the management of prostate cancer: a population-based analysis. Can Urol Assoc J 2016;10:333–338.
26. Minami CA, Wayne JD, Yang AD, et al. National evaluation of hospital performance on the new Commission on Cancer melanoma quality measures. Ann Surg Oncol 2016;23:3548–3557.
27. Karve S, Cleves MA, Helm M, et al. Good and poor adherence: optimal cut-point for adherence measures using administrative claims data. Curr Med Res Opin 2009;25:2303–2310.
28. Lai LY, Shahinian VB, Oerline MK, et al. Understanding active surveillance for prostate cancer. JCO Oncol Pract 2021;17:e1678–1687.
29. Tosoian JJ, Trock BJ, Landis P, et al. Active surveillance program for prostate cancer: an update of the Johns Hopkins experience. J Clin Oncol 2011;29:2185–2190.
30. Ginsburg KB, Cher ML, Montie JE. Defining quality metrics for active surveillance: the Michigan Urological Surgery Improvement Collaborative experience. J Urol 2020;204:1119–1121.
31. Lee EK, Baack J, Penn H, et al. Active surveillance for prostate cancer in a veteran population. Can J Urol 2010;17:5429–5435.
32. Bul M, Zhu X, Valdagni R, et al. Active surveillance for low-risk prostate cancer worldwide: the PRIAS study. Eur Urol 2013;63:597–603.
33. Bokhorst LP, Alberts AR, Rannikko A, et al. Compliance rates with the Prostate Cancer Research International Active Surveillance (PRIAS) protocol and disease reclassification in noncompliers. Eur Urol 2015;68:814–821.
34. Evans MA, Millar JL, Earnest A, et al. Active surveillance of men with low-risk prostate cancer: evidence from the Prostate Cancer Outcomes Registry-Victoria. Med J Aust 2018;208:439–443.
35. Auffenberg GB, Lane BR, Linsell S, et al. Practice- vs physician-level variation in use of active surveillance for men with low-risk prostate cancer: implications for collaborative quality improvement. JAMA Surg 2017;152:978–980.
36. Tyson MD, Graves AJ, O’Neil B, et al. Urologist-level correlation in the use of observation for low- and high-risk prostate cancer. JAMA Surg 2017;152:27–34.
37. Chen RC, Rumble RB, Loblaw DA, et al. Active surveillance for the management of localized prostate cancer (Cancer Care Ontario guideline): American Society of Clinical Oncology clinical practice guideline endorsement. J Clin Oncol 2016;34:2182–2190.
38. Bruinsma SM, Bangma CH, Carroll PR, et al. Active surveillance for prostate cancer: a narrative review of clinical guidelines. Nat Rev Urol 2016;13:151–167.
39. Merriel SWD, Moon D, Dundee P, et al. A modified Delphi study to develop a practical guide for selecting patients with prostate cancer for active surveillance. BMC Urol 2021;21:18.
40. Chiew KL, Sundaresan P, Jalaludin B, et al. A narrative synthesis of the quality of cancer care and development of an integrated conceptual framework. Eur J Cancer Care (Engl) 2018;27:e12881.
41. Robles SC, Marrett LD, Clarke EA, et al. An application of capture-recapture methods to the estimation of completeness of cancer registration. J Clin Epidemiol 1988;41:495–501.

Submitted June 29, 2022; final revision received December 20, 2022; accepted for publication December 21, 2022.

Author contributions: Study concept and design: All authors. Data analysis: Timilshina, Tomlinson. Interpretation of results: All authors. Supervision: Finelli, Alibhai. Writing—original draft: Timilshina, Finelli, Alibhai. Writing—review & editing: Finelli, Tomlinson, Sander, Alibhai.

Disclosures: The authors have not received any financial contribution from any person or organization to support the preparation, results, analysis, or discussion of this article.

Acknowledgement: This study contracted ICES data and analytic services and used de-identified data from the ICES Data Repository, which is managed by ICES with support from its funders and partners: Canada’s Strategy for Patient-Oriented Research (SPOR), the Ontario SPOR Support Unit, the Canadian Institutes of Health Research, and the Government of Ontario. This study was supported through provision of data by ICES and Ontario Health-Cancer Care Ontario and through funding support to ICES from an annual grant by the Ministry of Health and the Ontario Institute for Cancer Research. The opinions, results, and conclusions reported in this paper are those of the authors. No endorsement by ICES or any of its funders or partners is intended or should be inferred.

Correspondence: Shabbir M.H. Alibhai, MD, MSc, Department of Medicine, University Health Network, Institute of Health Policy, Management and Evaluation, University of Toronto, 200 Elizabeth Street, Room EN14-214, Toronto, Ontario M5G 2C4, Canada. Email: Shabbir.Alibhai@uhn.ca
