Effort Tracking Metrics Provide Data for Optimal Budgeting and Workload Management in Therapeutic Cancer Clinical Trials

Authors: Pam James, MS, CCRP; Patricia Bebee, RN, BA, CCRP; Linda Beekman, RN, MBA; David Browning, BS; Mathew Innes, BSE, MBA; Jeannie Kain, BA, CIP, CCRP; Theresa Royce Westcott, BBA, CCRP; and Marcy Waldinger, MHSA

From the Clinical Trials Office, University of Michigan Comprehensive Cancer Center, Ann Arbor, Michigan.

Clinical trials operations struggle to achieve optimal distribution of workload in a dynamic data management and regulatory environment, and to achieve adequate cost recovery for personnel costs. The University of Michigan Comprehensive Cancer Center developed and implemented an effort tracking application to quantify data management and regulatory workload to more effectively assess and allocate work while improving charge capture. Staff recorded how much time they spent each day performing specific study-related and general office tasks. Aggregated data on staff use of the application from 2006 through 2009 were analyzed to gain a better understanding of which trial characteristics require the most data management and regulatory effort. Analysis revealed 4 major determinants of staff effort: 1) study volume (actual accrual), 2) study accrual rate, 3) study enrollment status, and 4) study sponsor type. Effort tracking also confirmed that trials that accrued at a faster rate used fewer resources on a per-patient basis than slow-accruing trials. In general, industry-sponsored trials required more data management and regulatory support than any other sponsor type. Although it is widely assumed that most data management effort is expended while a trial is actively accruing, the authors learned that 25% to 30% of a data manager's effort is expended while the study is either not yet open or closed to enrollment. Through the use of a data-driven effort tracking tool, clinical research operations can more efficiently allocate workload and ensure that study budgets are negotiated to adequately cover study-related expenses.

In response to perceived increases in clinical trial complexity and concomitant greater demands on staffing resources, the University of Michigan Comprehensive Cancer Center (UMCCC) Clinical Trials Office (CTO) endeavored to quantify the amount of effort required by its staff to efficiently and effectively conduct clinical trials. With the development of a homegrown Research Effort Tracking Application (RETA), the CTO was able to obtain real-time metrics related to staff time required to conduct therapeutic clinical trials. Using a Web-based tool, CTO staff record how much time they spend each day performing specific study-related and general office tasks. The information gathered was incorporated into study budget preparation to aid in budget negotiation and cost recovery. RETA not only assisted in budget development but also yielded information that significantly improved workload allocation and incremental staff justification. Before the implementation of RETA in December 2005, study budgets were based on either a best estimate of the resources required, or the dollar amount offered by a sponsor to perform the study. Allocation of workload was also a challenge, because the effort required by data management and regulatory staff to complete their study-related responsibilities was poorly understood.

The development of RETA is described in greater detail in a previous paper.1 In brief, the RETA tool provides staff with a Web-based method for tracking the amount of time spent performing study-related activities on a daily basis. Staff choose from a list of activities related to their job function (i.e., data management or regulatory) and indicate how much time they spent on that activity and for which study. This differs from a traditional time study in that RETA is used on a daily basis across all studies conducted in the UMCCC CTO, rather than gathering information about only a handful of studies. It is a flexible tool capable of capturing changes in effort as research requirements and protocol complexity change. The usefulness of RETA has made it indispensable to UMCCC CTO operations. These data are being incorporated into the office's payroll process and are allowing managers to monitor how staff are spending their time on an ongoing basis.

This article discusses the method used to analyze the data yielded and to compare the impact of the primary study characteristics identified. The results and discussion examine what the authors learned in their 4-year analysis, and how the findings agree with or differ from conventional wisdom about the staff effort associated with the conduct of clinical trials. The authors will also describe how the CTO has incorporated these findings into the current and prospective management of studies coordinated through their office.

Method

This analysis details the hours of effort expended on therapeutic trials over a 4-year period: January 2006 through December 2009. Because most funding provided to support CTO operations is based on annual accrual to therapeutic trials, staff time devoted to nontherapeutic trials was not included in this analysis.

Combining information from RETA and the authors' clinical trials management system, the authors explored how the time-and-effort data would help elucidate the staffing hours required to conduct therapeutic studies. The CTO separates the regulatory and data management responsibilities; therefore, each role was analyzed independently. The staff effort required to conduct clinical trials varies according to accrual rate, complexity, and other factors. Data were collected for a 4-year period, which proved more than adequate to capture the natural ebb and flow of work through the office and the variability in study types (phase I, II, III) and sponsors (industry, investigator-initiated, other peer-reviewed and cooperative group).

Hours Reported

Over the 4-year period, an annual average of 8.9 (range, 6.2–11.3) full-time equivalent (FTE) regulatory personnel and 28.3 (range, 20.1–32.6) FTE data management personnel reported their time. The total number of hours reported by regulatory personnel during this period averaged 17,541 annual hours (range, 12,347–22,311 hours) or 1971 per FTE; of those total hours, an annual average of 10,592 hours, or 60% (1190 hours per FTE), were recorded toward study-related activity. The total number of hours data management personnel reported during this period averaged 55,944 annual hours (range, 39,771–64,439 hours), or 1976 per FTE. Of those total hours, an annual average of 39,117 hours, or 70% (1382 per FTE), were recorded toward study-related activity.
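The per-FTE figures above are straightforward ratios of the reported totals. The following sketch reproduces the arithmetic; all input figures are the annual averages reported in this section, and the variable names are illustrative:

```python
# Reproduce the 2006-2009 annual-average arithmetic reported above.
# Input figures come directly from the text; names are illustrative.
regulatory = {"fte": 8.9, "total_hours": 17_541, "study_hours": 10_592}
data_mgmt = {"fte": 28.3, "total_hours": 55_944, "study_hours": 39_117}

def per_fte_summary(group):
    """Return (total hours per FTE, % study-related, study hours per FTE)."""
    hours_per_fte = group["total_hours"] / group["fte"]
    study_share_pct = 100 * group["study_hours"] / group["total_hours"]
    study_hours_per_fte = group["study_hours"] / group["fte"]
    return round(hours_per_fte), round(study_share_pct), round(study_hours_per_fte)

print(per_fte_summary(regulatory))  # (1971, 60, 1190)
print(per_fte_summary(data_mgmt))   # (1977, 70, 1382)
```

(The data management total rounds to 1977 hours per FTE; the text truncates this to 1976.)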

As indicated in the first article,1 RETA was introduced to the CTO staff in 2006, and they initially struggled to accurately allocate their time toward assigned projects. As a result, during the first year, only 40% to 50% of each staff member's effort was allocated toward study-related tasks, which was lower than what the managers intuitively knew it should be. In an effort to improve the process during the first year, the managers focused on regular evaluation and reeducation of the staff until they became more adept at accurately reporting their time. Although the RETA data reflect that regulatory staff members are associating 60% of their time with study-related activities, the managers knew that number to be higher. Studies are not made available for tracking in RETA until they are scheduled for scientific review. However, regulatory staff begins work on studies before this time. As discussed in the first article,1 through reviewing the amount of time staff members spent on non–study-related activities (sick time, vacation, meetings, and professional development), the authors were ultimately able to determine that 70% to 75% of a staff member's time was appropriately allocated toward study-related tasks.

The data analysis also showed that the average numbers of study-related activity hours for data management and regulatory effort were artificially low because of the learning curve in the first year of RETA implementation. When considering whether data from the first year of RETA implementation were appropriate to include in the analysis, the authors examined their effect on the overall dataset and determined that excluding the first year's data did not change the results significantly. As the proportion of time reported by staff members increased, a similar proportional increase was seen in the amount of time spent on study-related tasks.

The first group to record their time in RETA was the regulatory team. After their initial training, the data managers were oriented and trained on the RETA application. The entire process was phased in over 1 year. Review of the process determined that staff spent 10 to 15 minutes per day logging activities into RETA. The hours spent on study-related activities by regulatory and data management staff were then used to analyze the amount of effort required by individual studies. Hours spent on non–study-related activities (e.g., staff meetings, professional development, sick and vacation time) were not included. Examples of data management and regulatory tasks are listed in Table 1.

Table 1

Distribution of Effort by Task


CTO staff members are not responsible for consenting patients, scheduling, or research blood draws; these tasks are performed by clinical research staff in the clinic setting. The UMCCC CTO is in the early stages of implementing RETA with the clinical research staff; however, only 1 year of data is currently available, and it is too early to draw meaningful conclusions. Except for the tasks associated with UMCCC phase I trials, other tasks, such as budget development, negotiation with sponsors regarding trial compensation, and contract negotiations, are performed by teams outside the UMCCC CTO; because these employees are not employed by the CTO, adoption of RETA by them has not been feasible to date.

Study Characteristics

Staffing for clinical trial support has been primarily based on accruals; however, this approach does not accurately account for all of the work required.2,3 The group's analyses clearly showed that 4 study characteristics significantly affect the amount of effort necessary to conduct a trial: 1) actual yearly accrual rate, 2) overall study volume, 3) sponsor type, and 4) study enrollment status.

From 2006 through 2009, the authors examined an average of 264 therapeutic trials annually (range, 239–284). Total accruals averaged 554 per year (range, 477–644). Trials were categorized by accrual rate, size, sponsor type, and study enrollment status (Table 2).

Table 2

Category Definitions


Once a study was classified, the authors examined the study and staff job roles (e.g., data management or regulatory) on both a “per study” and “per accrual” basis. In this way, they were able to quantify differences between trial types.
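The "per study" versus "per accrual" comparison described above amounts to a simple aggregation. The study records and sponsor categories in this sketch are hypothetical (RETA's actual data schema is not described here); only the method is taken from the text:

```python
from collections import defaultdict

# Hypothetical study records: (sponsor_type, annual_accruals, staff_hours).
studies = [
    ("industry", 4, 180.0),
    ("institutional", 10, 160.0),
    ("peer-reviewed", 6, 150.0),
]

# Aggregate hours by sponsor category, then derive both metrics.
by_sponsor = defaultdict(lambda: {"studies": 0, "accruals": 0, "hours": 0.0})
for sponsor, accruals, hours in studies:
    agg = by_sponsor[sponsor]
    agg["studies"] += 1
    agg["accruals"] += accruals
    agg["hours"] += hours

for sponsor, agg in sorted(by_sponsor.items()):
    per_study = agg["hours"] / agg["studies"]      # "per study" basis
    per_accrual = agg["hours"] / agg["accruals"]   # "per accrual" basis
    print(f"{sponsor}: {per_study:.0f} h/study, {per_accrual:.1f} h/accrual")
```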

Results

Analysis of the RETA data confirmed that the amount of effort spent on trials varies according to the specific trial characteristics, a concept widely assumed but lacking validation.

Data Management Effort According to Sponsor Type

An important factor in determining differences in studies and study accruals is the study sponsor. Mirroring research trends, the portfolio of research sponsors at the CTO has been changing over recent years.4 To better understand how the changing mix of sponsor types at the UMCCC affects the day-to-day work of CTO data managers and regulatory staff, the CTO reviewed the UMCCC portfolio of studies from 2002 through 2009. An increase in industry-sponsored projects occurred as the UMCCC opened more early-phase studies of novel targeted therapies (Figures 1 and 2).

In addition, the CTO participated in more peer-reviewed multisite trials to pool resources and gather data quickly to further the development of new therapies. This raised the question of whether the shift toward conducting a larger number of industry-sponsored studies could explain historical perceptions that staffing shortages existed, despite the fact that the number of staff and patients accrued to trials remained relatively consistent for several years. In the absence of RETA data in the early years to confirm these suspicions, the CTO was unable to respond appropriately to the changing sponsor mix by adding staff, and this likely led to the perceived shortages. The pharmaceutical studies were significantly more time-consuming; the effort expended for an accrual on an investigator-initiated study is far less than the effort expended for an accrual on an industry study. Although this concept is generally accepted in the clinical trials community, the extent of the disparity shown by the RETA data is remarkable. For example, Figure 3 shows that in 2009, data management for industry studies required 2.45 times more effort than institutional studies, and 2.85 times more effort than institutional-industry studies.

Because the CTO manages a relatively small number of the UMCCC's cooperative group studies, RETA data across a large enough sample of these types of studies are insufficient for drawing any meaningful conclusions. The greater effort required by industry trials is likely attributable to 1) greater protocol complexity, with increasingly more trials requiring central laboratories, pharmacokinetic and pharmacodynamic studies, and heightened monitoring and reporting of complications (e.g., serial echocardiograms); 2) increased vigilance and coordination required by the data manager to ensure that all aspects of the complicated protocols are completed; and 3) sponsor efforts to reduce costs through implementing electronic case report forms and using multiple vendors for their studies, resulting in increased time to communicate with all parties involved in the trial and greater difficulty obtaining all necessary results and documentation from multiple parties. The revelation of this significant difference provided the basis for new practices in the development of budgets and decision-making regarding workload distribution among data managers.

Figure 1

Cancer center trends: composition of therapeutic research by type (based on a yearly accrual). Number of accruals per sponsor type in the University of Michigan Comprehensive Cancer Center Clinical Trials Office portfolio from 2002–2009.

Citation: Journal of the National Comprehensive Cancer Network J Natl Compr Canc Netw 9, 12; 10.6004/jnccn.2011.0116

Figure 2

Cancer center trends: composition of therapeutic research by type (based on enrolling studies). Number of trials per sponsor type in the University of Michigan Comprehensive Cancer Center Clinical Trials Office portfolio from 2002–2009.

Figure 3

Number of hours spent per patient accrual categorized by sponsor type for 2006–2009.

In contrast, the 2009 data management effort for industry studies was similar to that for peer-reviewed studies. Peer-reviewed studies may be more closely matched to the complexity of industry studies because an external coordinating center provides oversight and quality checks of the work performed by data managers. Therefore, data managers spend more time engaged with the sponsor in addressing data queries and other questions about how to complete the data correctly. Investigator-initiated trials generally do not receive the same level of oversight, and because case report forms are developed on-site, there are fewer questions about how to accurately complete them.

The authors observed a spike in the amount of data management effort allocated toward industry-sponsored trials in 2008, because significant effort was devoted to studies without offsetting accruals to those studies. This typically occurs in the start-up and closure phases of projects. In 2008, UMCCC opened a dedicated phase I unit and, concurrently, 50% of sponsored studies closed.

Regulatory Effort According to Sponsor Type

Data captured in RETA for regulatory staff showed somewhat less variation between project sponsor types (Figure 4). Data analysis showed that industry-sponsored studies require approximately 30% to 50% more regulatory effort than institutional, institutional-industry, and peer-reviewed protocols. The increased effort expended on industry protocols likely stems from 1) the heightened communication requirements of sponsoring companies; 2) increased numbers of external serious adverse events because of the large size of industry multicenter studies; and 3) personnel changes at contract research organizations and pharmaceutical companies, requiring duplication of effort.5 As with the changes implemented in budget negotiations and work distribution for data managers, the actual data from RETA allowed for necessary adjustments in planning for regulatory staffing based on study sponsor type. The RETA tracking tool provides data to review and analyze study effort in real time, and to reallocate or justify additional resources as needed.

Data Management Effort According to Study Enrollment Status

Data managers' workloads have been determined by the number of accruals and active studies they manage that are open to enrollment. As studies close to enrollment, managers look to reallocate the data management time made available by that closed study. Before RETA was implemented, the amount of time that needed to be reserved for work associated with studies that were not open to enrollment (study initiation, long-term follow-up) was unclear. As seen in Figure 5, the data gathered from 2006 through 2009 showed that, on average, 72% of data management effort is committed to studies in the open enrollment phase. Therefore, approximately one-quarter of data management effort should be reserved for studies that are not yet open to enrollment or have been closed to enrollment.
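Applied to capacity planning, the 72% figure implies holding back roughly a quarter of each data manager's study-related hours for non-enrolling work. A minimal sketch, using the per-FTE hours reported earlier (the function name is illustrative):

```python
# ~72% of data-management effort went to studies open to enrollment
# (2006-2009 RETA average), so ~28% should be reserved for start-up
# and follow-up work on non-enrolling studies.
OPEN_ENROLLMENT_SHARE = 0.72

def open_enrollment_capacity(study_hours_per_fte: float) -> float:
    """Hours per FTE that can be assigned to actively enrolling studies."""
    return study_hours_per_fte * OPEN_ENROLLMENT_SHARE

# Using the ~1382 study-related hours per data-management FTE reported earlier:
print(round(open_enrollment_capacity(1382)))  # 995
```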

Regulatory Effort Per Study According to Study Volume

Study budgets are designed to collect funds at various milestones reached over the course of the clinical trial.6 Studies are typically distributed to data managers and regulatory staff based on expected accruals and current workload assignments. Although expenses associated with conducting a study are expected to increase with study accrual, the RETA data showed that, on average over the 4-year period, there is less than a 10% difference in the amount of regulatory effort required for small, medium, and large studies (Figure 6). This can be problematic because staffing decisions in a clinical trials office are often based on annual accruals. Additionally, payments made by sponsors to the sites are tied to accruals and milestones. This raises the question of whether regulatory charges should be flat-rate, with costs tied to metrics such as the number of amendments and serious adverse event form submissions, rather than charged per accrual, as is generally standard practice.

Figure 4

Hours of regulatory effort for each trial categorized by sponsor type for 2006–2009.

Figure 5

Percentage of data management effort allocated to each phase of study activity for 2006–2009.

Small and large studies averaged 52 and 54 hours per study, respectively, whereas medium studies averaged a slightly higher 59 hours per study over the 4-year analysis. The decrease in effort from medium to large trials can be attributed to the correlation of large studies with institutional sponsorship, a subset shown to correspond with reduced overall workload per study. As evidenced by the requirement of approximately 34 hours per study, a financial risk is associated with keeping studies that have little or no accrual open for a protracted duration, because regulatory costs will be generated without a corresponding source of revenue from accruals.

Relative Data Management Effort Per Accrual According to Accrual Rate

The accrual rate affects overall personnel and opportunity costs. Studies incur similar costs on a per-patient basis regardless of whether they are slow- or fast-accruing. In slow-accruing trials, the revenue generated per patient is collected at a slower pace, which often leads to budget shortfalls for data management effort. If enrollments are performed in quick succession, economies of scale can be realized, as shown in Figure 7. Note that the effort per accrual for studies enrolling 10 or more patients per year is less than one-half of the effort per accrual required for studies that enroll only 1 to 5 patients per year.
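The economies of scale described here amount to dividing a study's total data management hours by its accrual count. The hours below are hypothetical; the article reports only the ratio (fast-accruing studies need less than half the per-accrual effort of slow-accruing ones):

```python
def effort_per_accrual(total_hours: float, accruals: int) -> float:
    """Data-management hours expended per patient enrolled."""
    return total_hours / accruals

# Hypothetical annual totals: a slow study (3 accruals/yr) and a fast
# study (12 accruals/yr). Fixed start-up and close-out work dominates
# the slow study, so its per-accrual cost is far higher.
slow = effort_per_accrual(120.0, 3)   # 40.0 h per accrual
fast = effort_per_accrual(220.0, 12)  # ~18.3 h per accrual
assert fast < slow / 2  # consistent with the >2x disparity in Figure 7
```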

Figure 6

Average number of regulatory hours per study categorized by study size for 2006–2009.

Figure 7

Percentage of data management effort per patient accrual categorized by accrual speed for 2006–2009.

The principal investigator is responsible for accurately assessing whether the accrual goal can be met, through considering the protocol design and projected number of accruals over a specified period. Slow-accruing studies have a significant resource impact on the CTO, representing a loss of both effort and revenue. An analysis of this impact on 14 cancer centers indicated "approximately 90% of the accrual to industry and cooperative group sponsored trials is from 26% of the trials enrolling more than two patients a trial."7 Therefore, allocating effort toward higher or faster accruing trials is a more cost-effective use of personnel.

Relative Regulatory Effort Per Study According to Accrual Rate

In contrast to the inverse relationship described earlier between data management effort and accrual rate, regulatory effort per study remains relatively consistent across the various rates of accrual, with only minor differences that are readily explained by what was learned from this analysis thus far. Figure 8 shows a 7% average increase in effort for studies with fast- or medium-paced accrual compared with slow-accruing studies during the 4-year period. This relationship is logical, because a study with more accrual activity will have concomitant increases in some regulatory tasks, such as protocol deviations and internal reporting of serious adverse events. However, recent years have seen a shift in the effort required for medium-paced trials, with more effort required than for either slow- or fast-accruing trials. Fast-accruing trials are often investigator-initiated and tend to require less effort overall. The increased effort required for medium-accruing trials seen in recent years likely reflects the shift in the UMCCC portfolio of studies, which shows an increase in industry trials. The authors' data analysis has shown that opening slow-accruing industry trials is often not cost-effective, because they are complex and lack economies of scale to offset expenses, and the cost per patient is higher than for medium- or fast-accruing industry trials. As shown in Figure 8, substantial effort is associated with studies that have a flat rate of accrual (< 0.5 accruals per year). Studies that ultimately never accrue incur significant expenses that will not be recovered.

Figure 8

Percentage of regulatory effort per trial categorized by accrual speed for 2006–2009.

Discussion

Before the data provided by the RETA tracking tool were analyzed, the authors struggled with the same fundamental question asked by the rest of the clinical research community: how much effort is required to run a clinical trial? The authors' literature search showed inconsistencies in methods for collecting reliable effort metrics. The inability to accurately quantify effort through time surveys, single-study observations with limited scope, and other subjective measurements led the UMCCC CTO to create a tool that would generate quantitative data to validate or challenge assumptions widely held by the clinical research community.8 At the outset, the authors believed that higher-accruing studies result in greater overall effort. However, they learned that larger and faster-accruing studies actually require less effort per patient than smaller and slower-accruing studies. Additionally, the assumption that regulatory effort is directly proportional to accrual rate and size was proven incorrect by the data. In fact, nonaccruing studies continue to require significant effort, which is typically not billable and results in revenue shortfalls.

The CTO's assumption that industry studies are becoming more complex, and therefore more labor-intensive, was verified. The data showed that industry-sponsored trials require more than twice the effort per accrual of institutional studies. This knowledge has enabled CTO management to adjust workload assignments and plan appropriate staffing and FTE needs. Before analyzing the data, the authors were unable to quantify the effort expended on studies not actively accruing. Now they can document that 25% of a data manager's study effort should be reserved for activities outside the active enrollment period, such as pre-enrollment and follow-up activities. RETA also taught them to budget 70% to 75% of data management and regulatory staff FTEs for study activities and 25% to 30% for nonstudy activities (e.g., general office meetings, sick time, vacation).

Conclusions

The clinical research program at UMCCC is broad in scope, incorporating a study portfolio mix that is consistent with the institution's mission and investigator vision and expertise. The CTO is responsible for managing the financial and opportunity costs of administering and implementing the clinical research infrastructure. Through tracking effort metrics, UMCCC is better able to manage clinical trials, with particular emphasis on FTE requirements on a per-study basis. The RETA tool enables the research team to use verified data when negotiating with sponsors on resource needs, thereby ensuring that studies are adequately funded and staffed. These metrics provide CTO managers with the tools and data necessary to make staffing assignments and effectively monitor staff productivity. Understanding the factors that affect the effort required for different activities and projects allows resources to be dynamically redirected as needed. This practice supports fair distribution of workload among peers, reducing waste and improving efficiency and morale. The data collected provide information on CTO staffing availability for new projects and help forecast future capacity. RETA data are also helpful in determining whether conducting a proposed study is financially feasible. Conversely, the findings can be taken into consideration when deciding whether to close poorly accruing studies.

Clinical investigators were initially unsure of RETA's usefulness and its impact on their research. However, they came to realize that the data yielded enable more accurate effort estimates, and therefore greater accuracy in determining the staffing needed to conduct their trials. Additionally, there is now greater confidence that the cost of their trials will be adequately covered by the budgets developed, leading to far fewer of the shortfalls that previously required the institution or the investigator to cover the difference.

The successful implementation of RETA within the CTO has had a great impact on operations and related decision-making. Each year of data provides increased knowledge and insight into the past and future trends of clinical trials infrastructure personnel needs. With that information, the office anticipates being able to further refine its budgeting process, recognize opportunities to create efficiencies, and improve clinical research operations at UMCCC. Furthermore, information gleaned from RETA may potentially be generalized to the larger cancer clinical research community.

Mr. Innes and Mr. Browning have disclosed that they are co-inventors of the tool discussed herein. All other authors have disclosed that they have no financial interests, arrangements, or affiliations with the manufacturers of any products discussed in this article or their competitors.

The authors would like to thank Janet Tarolli, RN, BSN, CCRC, and Joy Stair, MS, BSN, for serving as editors.

References

  • 1 James P, Bebee P, Beekman L, et al. Creating an effort-tracking tool to improve therapeutic cancer clinical trials workload management and budgeting. J Natl Compr Canc Netw 2011;9:1228-1233.
  • 2 Gwede C, Daniels S, Johnson D. Organization of clinical research services at investigative sites: implications for workload measurement. Drug Information Journal 2001;35:695-705.
  • 3 Fedor CA. The evolving role of the clinical research coordinator. In: Fedor CA, Cola PR, Pierre C, eds. Responsible Research: A Guide for Coordinators. London, England: Remedica; 2006:1-10.
  • 4 Fedor CA, Gabriele EF. Future trends: the professionalization of the CRC. In: Fedor CA, Cola PR, Pierre C, eds. Responsible Research: A Guide for Coordinators. London, England: Remedica; 2006:187-197.
  • 5 Pharmaceutical Research and Manufacturers of America. Pharmaceutical Industry Profile 2009. Washington, DC: Pharmaceutical Research and Manufacturers of America; 2009.
  • 6 Mister R, Delahunty N, Grosbard A. Recruitment and retention of research subjects. In: Fedor CA, Cola PR, Pierre C, eds. Responsible Research: A Guide for Coordinators. London, England: Remedica; 2006.
  • 7 Durivage HJ, Bridges KD. Clinical trials metrics: protocol performance and resource utilization from 14 cancer centers [abstract]. J Clin Oncol 2009;27(Suppl 1):15s. Abstract 6557.
  • 8 Roche K, Paul N, Smuck B, et al. Factors affecting workload of cancer clinical trials: results of a multicenter study of the National Cancer Institute of Canada Clinical Trials Group. J Clin Oncol 2002;20:545-556.

Correspondence: Marcy Waldinger, MHSA, 1500 East Medical Center Drive, 6316 Cancer Center, Ann Arbor, MI 48109-5942. E-mail: wald@umich.edu
Figure Captions

  • Cancer center trends: composition of therapeutic research by type (based on yearly accruals). Number of accruals per sponsor type in the University of Michigan Comprehensive Cancer Center Clinical Trials Office portfolio, 2002–2009.

  • Cancer center trends: composition of therapeutic research by type (based on enrolling studies). Number of trials per sponsor type in the University of Michigan Comprehensive Cancer Center Clinical Trials Office portfolio, 2002–2009.

  • Number of hours spent per patient accrual, categorized by sponsor type, 2006–2009.

  • Hours of regulatory effort for each trial, categorized by sponsor type, 2006–2009.

  • Percentage of data management effort allocated to each phase of study activity, 2006–2009.

  • Average number of regulatory hours per study, categorized by study size, 2006–2009.

  • Percentage of data management effort per patient accrual, categorized by accrual speed, 2006–2009.

  • Percentage of regulatory effort per trial, categorized by accrual speed, 2006–2009.
