Clinical Development KPIs
      Measuring for Success

                             Manley Finch, PhD, MPH
                             Executive Director, HIV Nutrition Network
                             Sr. Medical Science Liaison, GTC BioTherapeutics

      “If we knew what it was we were doing, it would not
                           be called research, would it?”
                                          Albert Einstein

1                            M.R. Finch, PhD, MPH
Program/Project Evaluation


    Evaluation is to help projects become even better
     than they planned to be.… First and foremost,
        evaluation should support the project.…
                W.K. Kellogg Foundation
               Evaluation Approach, 1997


2                        M.R. Finch, PhD, MPH
Overview

 Evaluation measurement starts with the program development plan.

 The importance of clinical trial program evaluations; critical evaluation
    creates ROI.

 Identifying and defining the correct KPIs to track and measure.

 KPI measurement translates to clinical trial agility and efficiency; driving
    successful study completion depends on trial performance monitoring
    and corrective action plans.

 Monitoring outsourced activities is critical for excellence in execution.
3                                  M.R. Finch, PhD, MPH
Program/Project Evaluation

            What is program/project evaluation?

 Evaluation is the systematic acquisition and assessment
     of information to provide useful feedback about a
  dynamic process, the interlocking steps of the process,
       and the intended and unintended outcome(s).

  Prospective; before and during process operations –
   real time.
  Retrospective; after the fact, historical – data for the
   future.
4                           M.R. Finch, PhD, MPH
Program/Project Evaluation

                       Goal of Evaluation

    The foremost goal of evaluation should be to influence
    decision-making, in real time or in future planning, through the
     unbiased assimilation and extrapolation of empirically driven
      information from strategically designed informatics portals,
            in order to effect a more positive outcome.

                 Evaluate To Create an ROI !!!
5                            M.R. Finch, PhD, MPH
Program/Project Evaluation
                         Remember ->

    Program/project evaluation has a cost associated with its
     planning, inception, monitoring, analyses, and reporting….

          Measure only what is real, important, and
                    will create value!!




6                            M.R. Finch, PhD, MPH
Program/Project Evaluation


                        Why measure?


     Quantitative data provides measurable metrics to gauge
     ongoing and future success and to drive improvements
                  in systems and programs.




7                           M.R. Finch, PhD, MPH
Program/Project Evaluation



     Provides rationale for current and future decision making
        Evaluation programs are essential in any industry
             Reporting ROI to Senior Management

              Senior Management Buy-in = Funding



8                            M.R. Finch, PhD, MPH
Program/Project Evaluation

                      Program Evaluation:
     Formative – evaluation of a program/project during the
      development stage to ensure an iterative improvement
      process. Assesses merit, worthiness, and applicability.
        Hx data, interviews, questionnaires, focus groups, surveys.
        Proactive planning for successful real time assessments.
     Summative – evaluation of an ongoing or completed
      program/project to evaluate the successes and
      challenges in order to improve ongoing and future
      projects.
        Data driven metrics, quantitative, analyses driven.
        ROI reporting to stakeholders.
9                               M.R. Finch, PhD, MPH
Program/Project Evaluation

Formative Evaluation
      Prospective; prior to or in parallel with program/project design and
       planning.
        Define parameters (KPIs) to be monitored, assessed, and evaluated.
        Define feasibility of evaluability; don't attempt to measure
          everything. Quantitative versus Qualitative.
        Define informatics reporting process and infrastructure.
        Define risks and risk mitigation strategies.
        Define implementation and training strategies.
        Define process evaluation strategies.
        Define responsible parties at all levels and assign accountability.
10                                 M.R. Finch, PhD, MPH
Program/Project Evaluation

Summative Evaluation:
      Retrospective; after data has been collected, from historical data
       collected, or from several different programs/projects.
         Outcome evaluation; did you meet your goals?
         Impact evaluation; what was the effect of real time changes?
         Cost effectiveness/benefit evaluation; ROI?
         Secondary evaluation; examine data to answer additional questions.
         Meta-analyses; from several programs or projects, historical.



11                                 M.R. Finch, PhD, MPH
Program/Project Evaluation

                There are numerous models
            Management Oriented System Models

 PERT: Program Evaluation and Review Technique
 CPM: Critical Path Method
 GANTT: CPM Charting model

     These are only examples and must be tailored to fit your needs;
                  a combination of all is often best.
12                          M.R. Finch, PhD, MPH
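To make the CPM model above concrete, here is a minimal sketch of a critical-path forward pass. The task names, durations, and dependencies are hypothetical, invented for illustration rather than taken from the presentation; a real start-up plan would substitute its own activities.

```python
from functools import lru_cache

# Hypothetical study start-up task graph; durations in days.
# Names, durations, and dependencies are invented for illustration.
TASKS = {
    "protocol":   {"dur": 30, "deps": []},
    "ind":        {"dur": 45, "deps": ["protocol"]},
    "site_sel":   {"dur": 60, "deps": ["protocol"]},
    "irb":        {"dur": 40, "deps": ["site_sel"]},
    "activation": {"dur": 20, "deps": ["ind", "irb"]},
}

@lru_cache(maxsize=None)
def earliest_finish(name: str) -> int:
    """CPM forward pass: earliest finish = latest dependency finish + own duration."""
    task = TASKS[name]
    start = max((earliest_finish(dep) for dep in task["deps"]), default=0)
    return start + task["dur"]

for name in TASKS:
    print(f"{name}: earliest finish on day {earliest_finish(name)}")
print("critical path length (days):", max(map(earliest_finish, TASKS)))
```

Any slip on a critical-path task slips the whole program, which is where the float (slack) analysis mentioned later under study start-up comes in.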
Program/Project Evaluation


Sources for Program Evaluation Methodologies
 W.K. Kellogg
 World Health Organization
 Web Center for Social Research Methods
 Project Management Institute (PMI).
 Drug Information Association
 eXL, Barnett and others
 PERT, CPM, and GANTT methods
13                       M.R. Finch, PhD, MPH
Program/Project Evaluation

      Evaluation Program
        Design from the start of program in parallel with early
         program or protocol plan development discussions.
        Determine relevant measures of program, protocol, and
         site performance early.
        Assign team to craft, implement, and monitor early in
         process.
        Don't reinvent the wheel – rely on standards already
         developed unless the protocol demands otherwise.


14                           M.R. Finch, PhD, MPH
Program/Project Evaluation


                  Program Evaluation Steps

     Assign Program Team evaluation program responsibilities
                    and set expectations early

        SMART Goals (Specific, Measurable, Achievable, Relevant, Time-bound)

            Goals Support Clinical Program Timelines
15                         M.R. Finch, PhD, MPH
Program/Project Evaluation



        Define challenges and determine action plan
        Determine evaluation metrics and how to assess
        Design assessment tools specific to metrics
        Determine frequency of assessments
        Develop reporting format specifics
        Meet often to assess program and steer appropriately

16                            M.R. Finch, PhD, MPH
Key Performance Indicators
                  KPIs

           Why do we measure and why do we care?

 Costs of trials are increasing alarmingly;
      $400-800 million per drug in 2006; over $900 million-1.2 billion in 2011
      $26K per Phase III patient in 2006
      $47.5K per Phase III patient in 2011
 Trials are delayed more frequently, with study start-up, site
  activation, and recruitment/retention most often blamed.
 Failure to be first in class or first to market drives market
  share loss and substantially reduced revenues.
17                                M.R. Finch, PhD, MPH
Evaluation ROI Key Points



         $$- TIME IS MONEY -$$

     Every day the trial is operating costs $100-200K USD
        in operational expenses alone.

18                     M.R. Finch, PhD, MPH
Evaluation ROI Key Points


     Marketing Considerations and Opportunities
      A blockbuster drug can generate $2-5 million USD per day in sales
                       revenue ($750-1,500 million/year)
       Market share decreases dramatically based on approval tier;
              First in Class, First to Market, 2nd to Market, etc.

                Windows for marketing a drug are dynamic
                    First to market wins market share
                  Viagra® versus Cialis® as an example
19                             M.R. Finch, PhD, MPH
Evaluation ROI Key Points


            $$- TIME IS MONEY -$$
                   Delays in Time to Market
                $2-5 million per day in lost sales
             $700-1,500 million per year in revenue
      A low approval tier decreases market share from 75-80%
                         to 25-35% or less.



20                        M.R. Finch, PhD, MPH
Program/Project Evaluation


     Evaluation and Assessment Create Real ROI

             Plan early and plan in parallel

          Assess early and assess in parallel

      Real time data = real time effective changes

21                      M.R. Finch, PhD, MPH
KPIs
                  It is critical to define the
        appropriate, measurable, meaningful, and value-
                     generating KPIs early
  Study Start Up Process
     Vendor Selection
     Medical Writing; Protocol, Consent, CRF, Assessments, IVRS
     Regulatory Approval
     Study Site Selection and Activation
  Recruitment and Retention
  Data Collection and Management
  Data Cleaning and Data Locking
  Drug Approval Process
22                             M.R. Finch, PhD, MPH
KPIs and SPIs

  Key Performance Indicators - KPIs
        Similar across all trials
        Tracking for most is standard in the industry
        Determine as a corporate entity prior to program planning
        Share with vendors and sites
  Study Performance Indicators – SPIs
        May be similar within disease indication
        Vary across differing disease indications and trial phases
        Determine at the beginning of project or trial
        Share with vendors and sites

23                                 M.R. Finch, PhD, MPH
KPIs
                  Standard Program and Trial KPIs
                  How your program/project is scored
    MAP/Development Program Plan Completion
    DMF/IND Submitted to FDA and/or First Protocol Approved
    Initial IRB Approval; Phase I, II, IIIa - IIIb
    First Site Selected (FSS); First Site Approved (FSA)
    First Patient In (FPI); FPE, FPR
    First Patient Completed (FPC), LPI, LPO
    Data Cleaned, Locked, Analyzed, Data Report Completed.
     Site Close Outs, Final Study Report, etc
    NDA Submission, NDA Approval, Drug on Market

24                                 M.R. Finch, PhD, MPH
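A minimal sketch of how these milestone KPIs could be tracked as planned-versus-actual dates. The milestone names echo the slide; the dates and the variance helper are hypothetical.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Milestone:
    """One program/trial KPI milestone, tracked as planned vs. actual date."""
    name: str
    planned: date
    actual: Optional[date] = None

    def variance_days(self) -> Optional[int]:
        """Positive = late, negative = early, None = not yet reached."""
        return (self.actual - self.planned).days if self.actual else None

plan = [
    Milestone("First Site Selected (FSS)", date(2011, 1, 15), date(2011, 1, 20)),
    Milestone("First Patient In (FPI)",    date(2011, 3, 1),  date(2011, 3, 15)),
    Milestone("Last Patient Out (LPO)",    date(2012, 6, 1)),   # still pending
]
for m in plan:
    print(f"{m.name}: variance = {m.variance_days()} days")
```

Reporting variances against the plan in real time is what lets a team steer the program rather than discover slippage after the fact.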
KPIs and SPIs
          Trial Completion Key Performance Influencers

        Study Start Up
             Site assessment and selection
             Site training
             Site activation
        Study Conduct
             Patient screening and enrollment rates
             Patient retention
             Data monitoring and cleaning
        Study Closure
             Final data cleaning
             Data lock and analyses
             Study site closeouts
              SAR, FSR and metrics reporting
25                                 M.R. Finch, PhD, MPH
KPIs

Vendor Assessment & Selection Time
 Steering Committees / Lead PIs
 CROs
 Central IRBs
 Central Lab
 Central Reader/Scorer
 Rater Reliability
 Recruitment/Trial Awareness/PR (should be the first!).
 SMOs/PI Networks
26                         M.R. Finch, PhD, MPH
KPIs

              Demand Metrics from the Vendors!!!
                     CarFax = CROFax
 Vendors are service providers and therefore live and die on metrics.
 Demand formative strategy (case histories) and summative data from
  each vendor to ensure the best fit.
 Not all are created equal and a “one-stop” mentality can be fatal to your
  program or project.
 What is their out-of-scope Hx like? How often have they enrolled on
  time? How often have they completed on time? How often have they
  met or exceeded expectations?
 FDA and Sponsor Audits; CAPAs, 483s, Warning Letters, CIAs, etc?
27                               M.R. Finch, PhD, MPH
KPIs

                      CROs and Vendors

           Scope of Work & Task Order Agreements
                    Timelines, Milestones

     From these documents ALL KPIs are measurable from the
               outset. This can't be stressed enough.

     A Solid SOW and TOA = A Good Chance for Success!

28                          M.R. Finch, PhD, MPH
SOW & TOA KPIs
              Measurable Outcomes Start Here
                             “X”
                     Does Mark the Spot

  Roles and Responsibilities; who is doing what, when, where.
  Measurable Timelines and Deliverables; quantitative.
  Project Milestones are milestones, not guidelines.
  Define KPIs within the documents, set payments based on
   milestones, deliverables, and KPIs versus time burnt/FTE.
  Early communication and clarity amongst parties ensures a
   better chance for success.
29                           M.R. Finch, PhD, MPH
KPIs
                              Central IRBs

 KPIs
      Initial Protocol and Consent Approval Time
      CRF and Assessment Approval Time
      Patient Recruitment Material Approval Time
      Individual Site Materials Approval Time
      Revision Approval Times
      Meet with their team in person and demand the metrics for
       assessment.
      Web access portals, multiple boards with multiple meetings, great
       FDA standing
30                               M.R. Finch, PhD, MPH
KPIs
                   Study Start Up and Activation
                      How fast can you come online?
 Define all Critical Paths, Floats, and Assess Resources
      Site Selection Process ensures success or failure in not only start
       times but also recruitment and retention.
 Site Selection KPIs - Site Assessments and Onboarding
      Feasibility Questionnaires, Historical Data must be a prerequisite for
       site KPI assessment.
      PSVs; Rapid assessment and onboarding. Verify ALL data at PSV
      Time to Contract Negotiation
      Time to IRB Approval
      Time to FPS, FPE, FPR
31                               M.R. Finch, PhD, MPH
Recruitment KPIs
     Patient Recruitment & Enrollment
          Is this the Holy Grail?




32               M.R. Finch, PhD, MPH
Recruitment KPIs

               Will it end like this … or this?
           [two contrasting outcome images omitted]




33                       M.R. Finch, PhD, MPH
Recruitment KPIs

              80% of Trials Fail to Enroll on Time

 60-70 % are delayed greater than 3-6 months.
 50-40% are delayed greater than 6 months..
 30% are delayed up to 1 year plus.

 At $100K per day (conservative) for Phase III, this would equal a
      cost of over $36.5 million for one year, not counting
   lost revenues from delayed time to market (see the worked
                      check after this slide).

     Avg. Cost of Recruitment Plan is ~ 3 - 5% of Trial Cost
34                           M.R. Finch, PhD, MPH
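As a quick check of the arithmetic above, using only the figures the deck itself quotes ($100K/day operating cost, $2-5 million/day blockbuster revenue):

```python
# Worked check using figures quoted on earlier slides.
op_cost_per_day = 100_000      # conservative Phase III operating burn (USD/day)
delay_days = 365               # one-year enrollment delay

operational = op_cost_per_day * delay_days     # -> $36.5M, matching the slide
lost_rev_low  = 2_000_000 * delay_days         # -> $730M
lost_rev_high = 5_000_000 * delay_days         # -> $1.825B

print(f"operational cost of a one-year delay: ${operational/1e6:.1f}M")
print(f"forgone revenue range: ${lost_rev_low/1e6:.0f}M to ${lost_rev_high/1e9:.2f}B")
```

Against those numbers, a recruitment plan costing 3-5% of the trial budget is cheap insurance.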
Recruitment KPIs
Key Metrics
 Rate and Acceleration (Quantitative)
        Time to site selection & activation.
        Time to FSA, LSA.
        Time to FPS, FPE, FPR.
        LTFU, DO, Retention Rates.
        Time to FPC, LPI, LPO.
 Qualitative (Can measure effect on above at inflection points)
      Source of Subject Tracking.
      PR and Trial Awareness Plan Execution/Implementation.
      Retention Efforts.
35                                M.R. Finch, PhD, MPH
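A minimal sketch of the "rate and acceleration" idea: compute the screening rate (patients/day) between checkpoints, then the change in rate across segments. The checkpoint dates and counts are invented for illustration.

```python
from datetime import date

# Hypothetical cumulative screening counts at checkpoints (dates/counts invented).
checkpoints = [
    (date(2008, 1, 1),  5),
    (date(2008, 4, 22), 30),
    (date(2008, 7, 24), 67),
]

def rates(points):
    """Screening rate (patients/day) between consecutive checkpoints."""
    return [(n1 - n0) / (d1 - d0).days
            for (d0, n0), (d1, n1) in zip(points, points[1:])]

r = rates(checkpoints)
print("rates (pts/day):", [round(x, 2) for x in r])
# Acceleration = change in rate between segments; a positive jump at an
# inflection point suggests a recruitment-program component is working.
print("acceleration:", [round(b - a, 2) for a, b in zip(r, r[1:])])
```

This is exactly the shape of the analysis the case study at the end of the deck reports: piecewise screening rates with inflection points tied to program components.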
Recruitment KPIs

     “Selecting optimal study sites is the single
        most important study start up activity
          related to rapid trial enrollment.”
                                     &
     No matter how many great sites you select, they cannot
               overcome a poorly crafted protocol….




36                         M.R. Finch, PhD, MPH
MEASURING INVESTIGATOR
             PERFORMANCE

                     Platitudes to Ponder

     Slow site activation = slow or no patient recruitment

       Historical enrollment predicts future recruitment

                 Acceptance = Participation

                 Frustration = Abandonment
37                         M.R. Finch, PhD, MPH
MEASURING INVESTIGATOR
          PERFORMANCE


               Platitudes to Ponder

             Proactive = Performance

 Rescue programs (band-aids) cost more and do less

           Failure to plan is to plan to fail
38                      M.R. Finch, PhD, MPH
MEASURING INVESTIGATOR
             PERFORMANCE

 What are the key historical or current site performance
  metrics (KPIs) to monitor?
   Hx enrollment performance; EMRs or Paper?
           Number of patients in DB or access to patients
           Breadth and depth of referral network
           Willingness to attempt recruitment program activities
           Requests recruitment enhancement funding proactively
           Has on site trial relations or marketing manager
      Hx contract/budget negotiation time
      IRB approval time
         Central or local IRB
39                                M.R. Finch, PhD, MPH
MEASURING INVESTIGATOR
               PERFORMANCE
               Quantitative Analyses of Performance
             Create Corporate Sponsor/CRO PI Database
        Depth of PI patient DB – e.g. # of patients
        Contract/Budget negotiation time
        IRB approval time
        Site activation time
        Time to first patient screened & first patient randomized
        Number of patients screened/enrolled
        Number of patients ET/LTF/completed
        Enrollment time vs. allotted enrollment period
        Query, DCF, and DEE rates
        Various site related trial costs
         Overall cost per patient enrolled – CPP
40                                    M.R. Finch, PhD, MPH
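A minimal sketch of stratifying candidate sites with such a database: normalize a few of the metrics above to a common 0-1 scale and rank by a weighted composite. Site records, the metric subset, and the weights are all hypothetical.

```python
# Hypothetical site records drawn from a sponsor/CRO PI database.
sites = [
    {"id": "S01", "irb_days": 30, "contract_days": 45, "enroll_rate": 1.2},
    {"id": "S02", "irb_days": 60, "contract_days": 90, "enroll_rate": 0.4},
    {"id": "S03", "irb_days": 25, "contract_days": 50, "enroll_rate": 0.9},
]

def normalize(values, invert=False):
    """Scale to 0-1; invert when smaller raw values should score higher."""
    lo, hi = min(values), max(values)
    scaled = [(v - lo) / (hi - lo) if hi > lo else 0.5 for v in values]
    return [1 - s for s in scaled] if invert else scaled

irb = normalize([s["irb_days"] for s in sites], invert=True)       # faster IRB is better
ctr = normalize([s["contract_days"] for s in sites], invert=True)  # faster contracting is better
enr = normalize([s["enroll_rate"] for s in sites])                 # higher enrollment is better

scores = [0.3 * a + 0.2 * b + 0.5 * c for a, b, c in zip(irb, ctr, enr)]
for site, score in sorted(zip(sites, scores), key=lambda t: t[1], reverse=True):
    print(site["id"], round(score, 2))
```

Activating sites in descending score order is one way to implement the "cohort strata" activation described in the case study below.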
Points To Ponder
         Are you providing feedback to the sites in real time?

     Are you assisting sites to measure their own performance?

      Are you creating centers of excellence using KPI metrics?

 Are you getting feedback from your sites on your performance
                      as a sponsor or CRO?

     One dollar invested proactively in the sites = Ten dollars
                       in return performance!!


41                             M.R. Finch, PhD, MPH
MEASURING INVESTIGATOR
          PERFORMANCE

 Create DB across all internal IR/D components
 Comparing questionnaires to the Hx DB allows for
  quantitative analyses
 Select only the cream of the crop by stratifying the
  results
 Allows CTM/CPM to stratify the PI list and
  concentrate on the most rapidly activating sites to
  ensure FPI milestone capture
 Eliminate using the same non-performing sites over
  and over across the company
42                      M.R. Finch, PhD, MPH
MEASURING INVESTIGATOR
                PERFORMANCE

                                 Points to Ponder

 Not all sites are created equal nor are all investigators.
    KOLs historically are poor enrolling centers – an unfortunate but real fact
 Not all sites accurately report Hx performance – they overestimate.
                                  20% Rule Applies
 The level of involvement of the PI often is a valid predictor of enrollment
  when all other influencers are equal.
 Geographical location is important – incidence and prevalence of
  disease are impacted by population.
 These tools apply within disease indication – sites may enroll slower or
  faster in another indication.
43                                     M.R. Finch, PhD, MPH
MEASURING PROGRAM
                    PERFORMANCE

 Recruitment Program
      Include evaluation program at outset
      Design plan in conjunction with:
            Steering Committee – KOLs
            Site Input – PI, CRC, Research Dir., Marketing
            Internal Marketing and Medical Affairs
            Synergy across internal and external sources
      Implement early and evaluate early.
      Assess often and redesign as needed
      Craft and maintain an evaluation ROI report – Sr. Execs will want a
       report on the cost-to-benefit ratio – your position may depend on the effect.


44                                    M.R. Finch, PhD, MPH
MEASURING PROGRAM
                 PERFORMANCE

 Recruitment Program Evaluation
      Steering Committee, KOLs, and PIs
         Assess level of product knowledge versus
          literature, publications, presentations, posters
         Assess level of buy-in to protocol in general
               Know your message points and TPP
               Design presentations and assessment
               Assess level of knowledge
               Present data
               Reassess


           Understanding = Acceptance = Performance
45                                   M.R. Finch, PhD, MPH
MEASURING PROGRAM
                    PERFORMANCE

 Design Recruitment Program and Test
      Provide program to SC, KOLs, PI, and CRCs
      Use site specific paradigm – one size does not fit all
      Get site specific feedback and fine tune
           What media works best in their area?
           Develop site specific referral network list with contact information.

        Review with Marketing, Med Affairs, Legal
        Design evaluation KPIs for program
        Design SMART metrics and tracking system
        Design ROI report
        Circle back once more prior to implementation

                               Then off to the races!
46                                  M.R. Finch, PhD, MPH
MEASURING PROGRAM
             PERFORMANCE

 Ok, this presentation is too short for a full recruitment program
                  design seminar and KPIs – so

                 Assume program is designed –

                   What should we measure?

                    $ Follow the spend $



47                           M.R. Finch, PhD, MPH
MEASURING PROGRAM
                  PERFORMANCE


                                Case Study

      New global trial, 150 US trial subjects required, very tight study
       completion timelines
      Poor perception of ability to capture goal
      Limited Senior Management Buy-in for recruitment program and
       associated costs
      Variety of band-aids in similar trials, no success
      Perception that enrollment was based entirely on site DB



48                                 M.R. Finch, PhD, MPH
Case Study

      Action Plan
         Began internal BU expertise and resource search
         Implemented evaluation program at start
         Implemented potential PI internal evaluation
             Interviewed CTM/CPM, CRAs, MSLs, etc.
             Reviewed available internal
                 Enrollment
                 Site activation times
             Created PI/Site DB from scratch



49                             M.R. Finch, PhD, MPH
Case Study

                          Action Plan

      Implemented site selection program using quantitative
       metrics
      Engaged all internal and external stakeholders for
       recruitment program design
      Designed and implemented recruitment program early
      Stratified site selection based on quantitative data
      Began site activation based on cohort strata
50                           M.R. Finch, PhD, MPH
MEASURING PROGRAM
                 PERFORMANCE

                               Case Study Facts

     Program ROI Evaluation Demonstrated
         Source of subjects
             DB represented just under 60% of patients
                 Required CRC time resource funding for EMR/Chart
                 Emails and letters very productive
                 PI/CRC to patient discussions were driver
             Community Organization ~ 20 % of patients
             Media and PR relations ~ 20% of patients


51                                 M.R. Finch, PhD, MPH
Case Study
                           Case Study Facts
      Results
         93% of enrollment goal met in allotted time
         FPI goal met, LPI goal met
         Site activation by strata
               Time to grants/contracts and IRB reduced ~ 10%
               Time to overall activation reduced 15%
           Site stratification paradigm predicted enrollment
           PI/CRC acceptance predicted enrollment
           Screening rate elastic to program components = enrollment
            Increased site and investigator relations confirmed via summative
             evaluation questionnaire at study end
52                                       M.R. Finch, PhD, MPH
53   [Chart: Case Study – cumulative patient screening, monthly from 11/29/2007
      through 11/29/2008; y-axis 0–160]
                                      M.R. Finch, PhD, MPH
Case Study
Screening rates:
 3 rates with 2 primary inflection points
      December '07 – April 21, '08:
         0.23 pts/day
      April 22 – July 23, '08:
         0.40 pts/day
      July 24 – October 17, '08:
         0.73 pts/day
• Additional inflection point
      September 23 – October 17, '08:
         0.94 pts/day
54                            M.R. Finch, PhD, MPH
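A small sketch that turns the piecewise rates above back into approximate patient counts per segment. Segment boundaries come from the slide; the exact December '07 start day is an assumption, so the totals are indicative only.

```python
from datetime import date

# Segment rates from the slide; the December '07 start day is assumed.
segments = [
    (date(2007, 12, 1), date(2008, 4, 21), 0.23),
    (date(2008, 4, 22), date(2008, 7, 23), 0.40),
    (date(2008, 7, 24), date(2008, 10, 17), 0.73),
]

total = 0.0
for start, end, rate in segments:
    days = (end - start).days + 1          # inclusive segment length
    screened = rate * days
    total += screened
    print(f"{start} to {end}: {days} d x {rate} pts/day ~ {screened:.0f} pts "
          f"(cumulative ~{total:.0f})")
```

Under that start-date assumption the cumulative total lands in the neighborhood of the 150-subject US goal, broadly consistent with the ~93% enrollment result reported above.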
Conclusion

         Defining KPIs and SPIs Early is Critical

         Defining Evaluation Plan in Parallel with
     Program/Project Planning is Critical to Success

            Program Evaluation Creates ROI

     Measuring KPIs and SPIs Creates Documentable
           ROI and Increases Future Funding

55                       M.R. Finch, PhD, MPH
References
    Bain & Company. Has the Pharmaceutical Model Gone Bust? (www.bain.com; December 8th, 2003.)
    Body of Knowledge 5th edition, Association for Project Management, 2006.
    Colier, R. Rapidly rising clinical trial costs worry researchers. CMAJ. January 3, 2009. 180(3).
    Cutting Edge Information. “Clinical Operations: Accelerating Trials, Allocating Resources and Measuring Performance” [Accessed
     2006, www.ClinicalTrialBenchmarking.com]
    Cutting Edge Information. [Accessed 2011, www.ClinicalTrialBenchmarking.com]
    DiMasi, JA, Hansen, RW, Grabowski, HG. The price of innovation: New estimates of drug development costs. J Health Eco
     22(2003)151-185.
    Johnston, SC, Hauser, SL. Clinical Trials: Rising Costs Limit Innovation. Ann Neurol. Dec;62(6):A6-7.
    Milosevic, Dragan Z. . Project Management ToolBox: Tools and Techniques for the Practicing Project Manager. 2003, Wiley.
    Pharmaceutical Research and Manufacturers of America, Pharmaceutical Industry Profile, 2009. (Washington, DC: PhRMAA, April
     2009.
    William M.K. Trochim. Research Methods Knowledge Base: Introduction to Evaluation. Web Center for Social Research Methods.
     ,2006. [Accessed 28NOV2011, www.socialresearchmethods.net]
    W.K. Kellogg Foundation. W.K. Kellogg Foundation Program Evaluation Handbook. 1997 [ Accessed 28NOV2011,
     www.wkkf.org/knowledge-center/resources].




56                                                        M.R. Finch, PhD, MPH

Mais conteúdo relacionado

Mais procurados

Cinical trial protocol writing
Cinical trial protocol writingCinical trial protocol writing
Cinical trial protocol writingUrmila Aswar
 
Patient Safety in Clinical Trials
Patient Safety in Clinical TrialsPatient Safety in Clinical Trials
Patient Safety in Clinical TrialsMedelis
 
A Doctor’s Perspective on the Future Role of Pharmaceutical-Doctor Relationsh...
A Doctor’s Perspective on the Future Role of Pharmaceutical-Doctor Relationsh...A Doctor’s Perspective on the Future Role of Pharmaceutical-Doctor Relationsh...
A Doctor’s Perspective on the Future Role of Pharmaceutical-Doctor Relationsh...Marketing Network marcus evans
 
Breakthrough Therapy Designation- Spring 2014 Reg. Intelligence
Breakthrough Therapy Designation- Spring 2014 Reg. IntelligenceBreakthrough Therapy Designation- Spring 2014 Reg. Intelligence
Breakthrough Therapy Designation- Spring 2014 Reg. IntelligenceCharles Kemmerer
 
Guideline on patient safety and well being in clinical trials
Guideline on  patient safety and well being in clinical trialsGuideline on  patient safety and well being in clinical trials
Guideline on patient safety and well being in clinical trialsTrialJoin
 
Translational medicine
Translational medicineTranslational medicine
Translational medicineHaroon Rashid
 
Drug Development Life Cycle - Costs and Revenue
Drug Development Life Cycle - Costs and RevenueDrug Development Life Cycle - Costs and Revenue
Drug Development Life Cycle - Costs and RevenueRobert Sturm
 
Clinical Trial Protocol Review for Study Feasibility Analysis
Clinical Trial Protocol Review for Study Feasibility AnalysisClinical Trial Protocol Review for Study Feasibility Analysis
Clinical Trial Protocol Review for Study Feasibility AnalysisYing Lu
 
Pharmacovigilance and Materiovigilance, Drugs and Cosmetics Act
Pharmacovigilance and Materiovigilance, Drugs and Cosmetics ActPharmacovigilance and Materiovigilance, Drugs and Cosmetics Act
Pharmacovigilance and Materiovigilance, Drugs and Cosmetics Actshashi sinha
 
Webinar: Oncology Trial Recruitment: Challenging Indications and Challenging ...
Webinar: Oncology Trial Recruitment: Challenging Indications and Challenging ...Webinar: Oncology Trial Recruitment: Challenging Indications and Challenging ...
Webinar: Oncology Trial Recruitment: Challenging Indications and Challenging ...Medpace
 
Career in clinical research
Career in clinical researchCareer in clinical research
Career in clinical researchaceindia367
 
Breakthrough Designation Opportunities Challenges AAPS 2014
Breakthrough Designation Opportunities Challenges AAPS 2014Breakthrough Designation Opportunities Challenges AAPS 2014
Breakthrough Designation Opportunities Challenges AAPS 2014Ajaz Hussain
 
Clinical trail protocol and development
Clinical trail protocol and developmentClinical trail protocol and development
Clinical trail protocol and developmentChintamBaladattaSai
 
DIA China Making Every Patient Count
DIA China Making Every Patient CountDIA China Making Every Patient Count
DIA China Making Every Patient CountE. Dennis Bashaw
 
Documentation clinical trial
Documentation clinical trialDocumentation clinical trial
Documentation clinical trialankit sharma
 
Clinical Trials: Regulatory & Privacy Issues
Clinical Trials:  Regulatory & Privacy IssuesClinical Trials:  Regulatory & Privacy Issues
Clinical Trials: Regulatory & Privacy IssuesMichael Swit
 
Clinical trial study team
Clinical trial study teamClinical trial study team
Clinical trial study teamManjuJhakhar
 
CLINICAL TRAIL (TRIAL PROTOCOL & INSTITUTIONAL REVIEW BOARD/ INDEPENDENT ETHI...
CLINICAL TRAIL (TRIAL PROTOCOL & INSTITUTIONAL REVIEW BOARD/ INDEPENDENT ETHI...CLINICAL TRAIL (TRIAL PROTOCOL & INSTITUTIONAL REVIEW BOARD/ INDEPENDENT ETHI...
CLINICAL TRAIL (TRIAL PROTOCOL & INSTITUTIONAL REVIEW BOARD/ INDEPENDENT ETHI...amitsoni240
 

Mais procurados (20)

Cinical trial protocol writing
Cinical trial protocol writingCinical trial protocol writing
Cinical trial protocol writing
 
Patient Safety in Clinical Trials
Patient Safety in Clinical TrialsPatient Safety in Clinical Trials
Patient Safety in Clinical Trials
 
A Doctor’s Perspective on the Future Role of Pharmaceutical-Doctor Relationsh...
A Doctor’s Perspective on the Future Role of Pharmaceutical-Doctor Relationsh...A Doctor’s Perspective on the Future Role of Pharmaceutical-Doctor Relationsh...
A Doctor’s Perspective on the Future Role of Pharmaceutical-Doctor Relationsh...
 
Breakthrough Therapy Designation- Spring 2014 Reg. Intelligence
Breakthrough Therapy Designation- Spring 2014 Reg. IntelligenceBreakthrough Therapy Designation- Spring 2014 Reg. Intelligence
Breakthrough Therapy Designation- Spring 2014 Reg. Intelligence
 
Cro
CroCro
Cro
 
Guideline on patient safety and well being in clinical trials
Guideline on  patient safety and well being in clinical trialsGuideline on  patient safety and well being in clinical trials
Guideline on patient safety and well being in clinical trials
 
Translational medicine
Translational medicineTranslational medicine
Translational medicine
 
Drug Development Life Cycle - Costs and Revenue
Drug Development Life Cycle - Costs and RevenueDrug Development Life Cycle - Costs and Revenue
Drug Development Life Cycle - Costs and Revenue
 
Clinical Trial Protocol Review for Study Feasibility Analysis
Clinical Trial Protocol Review for Study Feasibility AnalysisClinical Trial Protocol Review for Study Feasibility Analysis
Clinical Trial Protocol Review for Study Feasibility Analysis
 
Pharmacovigilance and Materiovigilance, Drugs and Cosmetics Act
Pharmacovigilance and Materiovigilance, Drugs and Cosmetics ActPharmacovigilance and Materiovigilance, Drugs and Cosmetics Act
Pharmacovigilance and Materiovigilance, Drugs and Cosmetics Act
 
Webinar: Oncology Trial Recruitment: Challenging Indications and Challenging ...
Webinar: Oncology Trial Recruitment: Challenging Indications and Challenging ...Webinar: Oncology Trial Recruitment: Challenging Indications and Challenging ...
Webinar: Oncology Trial Recruitment: Challenging Indications and Challenging ...
 
Career in clinical research
Career in clinical researchCareer in clinical research
Career in clinical research
 
Breakthrough Designation Opportunities Challenges AAPS 2014
Breakthrough Designation Opportunities Challenges AAPS 2014Breakthrough Designation Opportunities Challenges AAPS 2014
Breakthrough Designation Opportunities Challenges AAPS 2014
 
Components of a clinical study protocol
Components of a clinical study protocolComponents of a clinical study protocol
Components of a clinical study protocol
 
Clinical trail protocol and development
Clinical trail protocol and developmentClinical trail protocol and development
Clinical trail protocol and development
 
DIA China Making Every Patient Count
DIA China Making Every Patient CountDIA China Making Every Patient Count
DIA China Making Every Patient Count
 
Documentation clinical trial
Documentation clinical trialDocumentation clinical trial
Documentation clinical trial
 
Clinical Trials: Regulatory & Privacy Issues
Clinical Trials:  Regulatory & Privacy IssuesClinical Trials:  Regulatory & Privacy Issues
Clinical Trials: Regulatory & Privacy Issues
 
Clinical trial study team
Clinical trial study teamClinical trial study team
Clinical trial study team
 
CLINICAL TRAIL (TRIAL PROTOCOL & INSTITUTIONAL REVIEW BOARD/ INDEPENDENT ETHI...
CLINICAL TRAIL (TRIAL PROTOCOL & INSTITUTIONAL REVIEW BOARD/ INDEPENDENT ETHI...CLINICAL TRAIL (TRIAL PROTOCOL & INSTITUTIONAL REVIEW BOARD/ INDEPENDENT ETHI...
CLINICAL TRAIL (TRIAL PROTOCOL & INSTITUTIONAL REVIEW BOARD/ INDEPENDENT ETHI...
 

Destaque

KPIs and Metrics for Advocacy & Service Delivery
KPIs and Metrics for Advocacy & Service DeliveryKPIs and Metrics for Advocacy & Service Delivery
KPIs and Metrics for Advocacy & Service DeliveryMichelle Dalton
 
BMJ Group presentation Quality Forum 2013
BMJ Group presentation Quality Forum 2013BMJ Group presentation Quality Forum 2013
BMJ Group presentation Quality Forum 2013BMJLearning
 
Clinical research ppt,
Clinical research   ppt,Clinical research   ppt,
Clinical research ppt,Malay Singh
 
Audit monitoring and inspections cro perspectives
Audit monitoring and inspections cro perspectivesAudit monitoring and inspections cro perspectives
Audit monitoring and inspections cro perspectivesDr Prashant Bodhe
 
13 tips to write skiller cover letter pdf ebook
13 tips to write skiller cover letter pdf ebook13 tips to write skiller cover letter pdf ebook
13 tips to write skiller cover letter pdf ebookjobsearchtipsa2z
 
Monitoring and auditing in clinical trials
Monitoring and auditing in clinical trialsMonitoring and auditing in clinical trials
Monitoring and auditing in clinical trialsJyotsna Kapoor
 
Top 12 skills for career success
Top 12 skills for career successTop 12 skills for career success
Top 12 skills for career successjobguide247
 
Drug Development Life Cycle
Drug Development Life CycleDrug Development Life Cycle
Drug Development Life CycleRajendra Sadare
 
Top 16 ways to make money online forever
Top 16 ways to make money online foreverTop 16 ways to make money online forever
Top 16 ways to make money online foreverjobguide247
 

Destaque (11)

KPIs and Metrics for Advocacy & Service Delivery
KPIs and Metrics for Advocacy & Service DeliveryKPIs and Metrics for Advocacy & Service Delivery
KPIs and Metrics for Advocacy & Service Delivery
 
BMJ Group presentation Quality Forum 2013
BMJ Group presentation Quality Forum 2013BMJ Group presentation Quality Forum 2013
BMJ Group presentation Quality Forum 2013
 
Clinical research ppt,
Clinical research   ppt,Clinical research   ppt,
Clinical research ppt,
 
Audit monitoring and inspections cro perspectives
Audit monitoring and inspections cro perspectivesAudit monitoring and inspections cro perspectives
Audit monitoring and inspections cro perspectives
 
Clinical Trial Phases
Clinical Trial PhasesClinical Trial Phases
Clinical Trial Phases
 
13 tips to write skiller cover letter pdf ebook
13 tips to write skiller cover letter pdf ebook13 tips to write skiller cover letter pdf ebook
13 tips to write skiller cover letter pdf ebook
 
Monitoring and auditing in clinical trials
Monitoring and auditing in clinical trialsMonitoring and auditing in clinical trials
Monitoring and auditing in clinical trials
 
Clinical trial design
Clinical trial designClinical trial design
Clinical trial design
 
Top 12 skills for career success
Top 12 skills for career successTop 12 skills for career success
Top 12 skills for career success
 
Drug Development Life Cycle
Drug Development Life CycleDrug Development Life Cycle
Drug Development Life Cycle
 
Top 16 ways to make money online forever
Top 16 ways to make money online foreverTop 16 ways to make money online forever
Top 16 ways to make money online forever
 

Semelhante a Clinical Development Kp Is Ii 08 Dec2011

How and When to Kill a Program in New Product Planning
How and When to Kill a Program in New Product PlanningHow and When to Kill a Program in New Product Planning
How and When to Kill a Program in New Product PlanningAnthony Russell
 
pc15222_brochure
pc15222_brochurepc15222_brochure
pc15222_brochureAmy Ripston
 
Does Innovation Pay DIA 2006
Does Innovation Pay DIA 2006Does Innovation Pay DIA 2006
Does Innovation Pay DIA 2006Neil Patel
 
Anthony K Taylor CV
Anthony K Taylor CVAnthony K Taylor CV
Anthony K Taylor CVAKTaylor
 
Anthony K Taylor CV
Anthony K Taylor CVAnthony K Taylor CV
Anthony K Taylor CVAKTaylor
 
Annual Results and Impact Evaluation Workshop for RBF - Day Eight - Learning ...
Annual Results and Impact Evaluation Workshop for RBF - Day Eight - Learning ...Annual Results and Impact Evaluation Workshop for RBF - Day Eight - Learning ...
Annual Results and Impact Evaluation Workshop for RBF - Day Eight - Learning ...RBFHealth
 
Introduction and Scope of M and E.pptx
Introduction and Scope of M and E.pptxIntroduction and Scope of M and E.pptx
Introduction and Scope of M and E.pptxagyeyatrippathi
 
Effectively Communicating Pharmacoeconomic Research Winnie Nelson (Brief)
Effectively Communicating Pharmacoeconomic Research Winnie Nelson (Brief)Effectively Communicating Pharmacoeconomic Research Winnie Nelson (Brief)
Effectively Communicating Pharmacoeconomic Research Winnie Nelson (Brief)wnelson0001
 
Presentation peter pfeiffer@pan-african-pmc_2017_24_05
Presentation peter pfeiffer@pan-african-pmc_2017_24_05Presentation peter pfeiffer@pan-african-pmc_2017_24_05
Presentation peter pfeiffer@pan-african-pmc_2017_24_05Peter Pfeiffer
 

Semelhante a Clinical Development Kp Is Ii 08 Dec2011 (20)

Philip Morganti Resume 1
Philip Morganti Resume 1Philip Morganti Resume 1
Philip Morganti Resume 1
 
AGENDA CBI RBM 2019 | RISK-BASED TRIAL MANAGEMENT and MONITORING
AGENDA CBI RBM 2019 | RISK-BASED TRIAL MANAGEMENT and MONITORINGAGENDA CBI RBM 2019 | RISK-BASED TRIAL MANAGEMENT and MONITORING
AGENDA CBI RBM 2019 | RISK-BASED TRIAL MANAGEMENT and MONITORING
 
How and When to Kill a Program in New Product Planning
How and When to Kill a Program in New Product PlanningHow and When to Kill a Program in New Product Planning
How and When to Kill a Program in New Product Planning
 
Disaster Risk Management
Disaster Risk ManagementDisaster Risk Management
Disaster Risk Management
 
Monitoring and Evaluation
Monitoring and EvaluationMonitoring and Evaluation
Monitoring and Evaluation
 
pc15222_brochure
pc15222_brochurepc15222_brochure
pc15222_brochure
 
Resume - 2015 April
Resume - 2015 AprilResume - 2015 April
Resume - 2015 April
 
Does Innovation Pay DIA 2006
Does Innovation Pay DIA 2006Does Innovation Pay DIA 2006
Does Innovation Pay DIA 2006
 
Anthony K Taylor CV
Anthony K Taylor CVAnthony K Taylor CV
Anthony K Taylor CV
 
Anthony K Taylor CV
Anthony K Taylor CVAnthony K Taylor CV
Anthony K Taylor CV
 
NAAF Patient-Reported Outcomes Consortium
NAAF Patient-Reported Outcomes ConsortiumNAAF Patient-Reported Outcomes Consortium
NAAF Patient-Reported Outcomes Consortium
 
Annual Results and Impact Evaluation Workshop for RBF - Day Eight - Learning ...
Annual Results and Impact Evaluation Workshop for RBF - Day Eight - Learning ...Annual Results and Impact Evaluation Workshop for RBF - Day Eight - Learning ...
Annual Results and Impact Evaluation Workshop for RBF - Day Eight - Learning ...
 
Introduction and Scope of M and E.pptx
Introduction and Scope of M and E.pptxIntroduction and Scope of M and E.pptx
Introduction and Scope of M and E.pptx
 
pc15147_brochure
pc15147_brochurepc15147_brochure
pc15147_brochure
 
M&E.ppt
M&E.pptM&E.ppt
M&E.ppt
 
Program Evaluation 1
Program Evaluation 1Program Evaluation 1
Program Evaluation 1
 
Effectively Communicating Pharmacoeconomic Research Winnie Nelson (Brief)
Effectively Communicating Pharmacoeconomic Research Winnie Nelson (Brief)Effectively Communicating Pharmacoeconomic Research Winnie Nelson (Brief)
Effectively Communicating Pharmacoeconomic Research Winnie Nelson (Brief)
 
CCIH-2017-Monitoring-and-Evaluation-Preconference-Outline
CCIH-2017-Monitoring-and-Evaluation-Preconference-OutlineCCIH-2017-Monitoring-and-Evaluation-Preconference-Outline
CCIH-2017-Monitoring-and-Evaluation-Preconference-Outline
 
2nd Annual NPD Agenda
2nd Annual NPD Agenda2nd Annual NPD Agenda
2nd Annual NPD Agenda
 
Presentation peter pfeiffer@pan-african-pmc_2017_24_05
Presentation peter pfeiffer@pan-african-pmc_2017_24_05Presentation peter pfeiffer@pan-african-pmc_2017_24_05
Presentation peter pfeiffer@pan-african-pmc_2017_24_05
 

Clinical Development Kp Is Ii 08 Dec2011

  • 1. Clinical Development KPIs Measuring for Success Manley Finch, PhD, MPH Executive Director HIV Nutrition Network Sr. Medical Science Liaison GTC BioTherapeutics “If we knew what it was we were doing, it would not be called research, would it?” Albert Einstein 1 M.R. Finch, PhD, MPH
  • 2. Program/Project Evaluation Evaluation is to help projects become even better than they planned to be.… First and foremost, evaluation should support the project.… W.K. Kellogg Foundation Evaluation Approach, 1997 2 M.R. Finch, PhD, MPH
  • 3. Overview  Evaluation measurement starts with the program development plan.  The importance of clinical trial program evaluations; critical evaluation creates ROI.  Identifying and defining the correct KPIs to track and measure.  KPI measurement translates to clinical trial agility and efficiency; driving successful study completion depends on trial performance monitoring and corrective action plans.  Monitoring outsourced activities is critical for excellence in execution. 3 M.R. Finch, PhD, MPH
  • 4. Program/Project Evaluation What is program/project evaluation? Evaluation is the systematic acquisition and assessment of information to provide useful feedback about a dynamic process, the interlocking steps of the process, and the intended and unintended outcome(s).  Prospective; before and during process operations – real time.  Retrospective; after the fact, historical – data for the 4 future. M.R. Finch, PhD, MPH
  • 5. Program/Project Evaluation Goal of Evaluation The preemptive goal of evaluation should be to influence decision-making, real-time or in future planning, through the unbiased assimilation and extrapolation of empirically- driven information resultant from strategically designed informatics portals in order to effect a more positive outcome. Evaluate To Create an ROI !!! 5 M.R. Finch, PhD, MPH
  • 6. Program/Project Evaluation Remember -> Program/project evaluation has a cost associated with it‟s planning, inception, monitoring, analyses, and reporting…. Measure only what is real, important, and will create value!! 6 M.R. Finch, PhD, MPH
  • 7. Program/Project Evaluation Why measure?  Quantitative data provides measurable metrics to gauge ongoing and future success and drive ongoing and future improvements in system or programs. 7 M.R. Finch, PhD, MPH
  • 8. Program/Project Evaluation  Provides rationale for current and future decision making  Evaluation programs are essential in any industry  Reporting ROI to Senior Management Senior Management Buy-in = Funding 8 M.R. Finch, PhD, MPH
  • 9. Program/Project Evaluation Program Evaluation:  Formative – evaluation of a program/project during the development stage to ensure reiterative improvement process. Assess the merit, worthiness, and applicability.  Hx data, interviews, questionnaires, focus groups, surveys.  Proactive planning for successful real time assessments.  Summative – evaluation of an ongoing or completed program/project to evaluate the successes and challenges in order to improve ongoing and future projects.  Data driven metrics, quantitative, analyses driven.  ROI reporting to stakeholders. 9 M.R. Finch, PhD, MPH
  • 10. Program/Project Evaluation Formative Evaluation  Prospective; prior to or in parallel with program/project design and planning.  Define parameters (KPIs) to be monitored, assessed, and evaluated.  Defines feasibility of evaluability; don‟t attempt to measure everything. Quantitative versus Qualitative.  Define informatics reporting process and infrastructure.  Define risks and risk mitigation strategies.  Define implementation and training strategies.  Define process evaluation strategies.  Define responsible parties at all levels and assign accountability. 10 M.R. Finch, PhD, MPH
  • 11. Program/Project Evaluation Summative Evaluation:  Retrospective; after data has been collected, from historical data collected, or from several different programs/projects.  Outcome evaluation; did you meet your goals?  Impact evaluation; what was the effect of real time changes?  Cost effectiveness/benefit evaluation; ROI?  Secondary evaluation; examine data to answer additional issues  Meta-analyses; from several programs or projects, historical. 11 M.R. Finch, PhD, MPH
  • 12. Program/Project Evaluation There are numerous models Management Oriented System Models  PERT: Program Evaluation and Review Technique  CPM: Critical Path Method  GANTT: CPM Charting model Only examples and must be tailored to fit your needs, a combination of all is best. 12 M.R. Finch, PhD, MPH
  • 13. Program/Project Evaluation Sources for Program Evaluation Methodologies  W.K. Kellogg  World Health Organization  Web Center of Social Research  Project Management Institute (PMI).  Drug Information Association  eXL, Barnett and others  PERT, CPM, and GANTT methods 13 M.R. Finch, PhD, MPH
  • 14. Program/Project Evaluation  Evaluation Program  Design from the start of program in parallel with early program or protocol plan development discussions.  Determine relevant measures of program, protocol, and site performance early.  Assign team to craft, implement, and monitor early in process.  Don‟t reinvent the wheel – rely on standards already developed unless protocol demands it. 14 M.R. Finch, PhD, MPH
  • 15. Program/Project Evaluation Program Evaluation Steps Assign Program Team evaluation program responsibilities and set expectations early SMART GOALS Goals Support Clinical Program Timelines 15 M.R. Finch, PhD, MPH
  • 16. Program/Project Evaluation  Define challenges and determine action plan  Determine evaluation metrics and how to assess  Design assessment tools specific to metrics  Determine frequency of assessments  Develop reporting format specifics  Meet often to assess program and steer appropriately 16 M.R. Finch, PhD, MPH
  • 17. Key Performance Indicators KPIs Why do we measure and why do we care?  Costs of trials are increasing alarmingly;  400-800 million per drug in 2006; over 900-1.2 billion in 2011  26K per Phase III patient in 2006..  47.5K per Phase III patient in 2011  Trials are delayed more frequently with study start up, site activation, recruitment and retention being blamed most.  Failure to be first in class or first to market drives market share loss and substantially impoverished revenues. 17 M.R. Finch, PhD, MPH
  • 18. Evaluation ROI Key Points $$- TIME IS MONEY -$$ Every day the trial is operating is 100 to 200k USD operational cost alone. 18 M.R. Finch, PhD, MPH
  • 19. Evaluation ROI Key Points Marketing Considerations and Opportunities  Blockbuster drug can generate 2-5 million USD per day in sales revenue ( 750 to 1,500 mil/year)  Market share decreases dramatically based on tier approval; First in Class, First to Market, 2nd to Market etc.  Windows for marketing a drug are dynamic  First to market wins market share  Viagra® versus Cialis® as an example 19 M.R. Finch, PhD, MPH
  • 20. Evaluation ROI Key Points $$- TIME IS MONEY -$$  Delays in Time to Market  2 to 5 million per day marketing.  700 to 1,500 million per year revenue  Low approval tier decreases market share from 75-80% to 35-25% or less. 20 M.R. Finch, PhD, MPH
  • 21. Program/Project Evaluation Evaluation and Assessment Create Real ROI Plan early and plan in parallel Assess early and assess in parallel Real time data = real time effective changes 21 M.R. Finch, PhD, MPH
  • 22. KPIs It is critical to define the appropriate, measurable, meaningful, and value granting KPIs early  Study Start Up Process  Vendor Selection  Medical Writing; Protocol, Consent, CRF, Assessments, IVRS  Regulatory Approval  Study Site Selection and Activation  Recruitment and Retention  Data Collection and Management  Data Cleaning and Data Locking  Drug Approval Process 22 M.R. Finch, PhD, MPH
  • 23. KPIs and SPIs  Key Performance Indicators - KPIs  Similar across all trials  Tracking for most is standard in the industry  Determine as a corporate entity prior to program planning  Share with vendors and sites  Study Performance Indicators – SPIs  May be similar within disease indication  Vary across differing disease indications and trial phases  Determine at the beginning of project or trial  Share with vendors and sites 23 M.R. Finch, PhD, MPH
  • 24. KPIs Standard Program and Trial KPIs How your program/project is scored  MAP/Development Program Plan Completion  DMF/IND Submitted to IND and/or First Protocol Approved  Initial IRB Approval; Phase I, II, IIIa - IIIb  First Site Selected (FSS); First Site Approved (FSA), and FSS  First Patient In (FPI); FPE, FPR  First Patient Completed (FPC), LPI, LPO  Data Cleaned, Locked, Analyzed, Data Report Completed.  Site Close Outs, Final Study Report, etc  NDA Submission, NDA Approval, Drug on Market 24 M.R. Finch, PhD, MPH
  • 25. KPIs and SPIs Trial Completion Key Performance Influencers  Study Start Up  Site assessment and selection  Site training  Site activation  Study Conduct  Patient screening and enrollment rates  Patient retention  Data monitoring and cleaning  Study Closure  Final data cleaning  Data lock and analyses  Study site closeouts 25  SAR, FSR and metrics reporting M.R. Finch, PhD, MPH
  • 26. KPIs Vendor Assessment & Selection Time  Steering Committees / Lead PIs  CROs  Central IRBs  Central Lab  Central Reader/Scorer  Rater Reliability  Recruitment/Trial Awareness/PR (should be the first!).  SMOs/PI Networks 26 M.R. Finch, PhD, MPH
  • 27. KPIs Demand Metrics from the Vendors!!! CarFax = CROFax  Vendors are service providers and therefore live and die on metrics.  Demand formative strategy (case histories) and summative data from each vendor to ensure the best fit.  Not all are created equal and a “one-stop” mentality can be fatal to your program or project.  What is their Hx out of scope like? How often have they enrolled on time? How often have they completed on time? How often have they met or exceeded expectations?  FDA and Sponsor Audits; CAPAs, 483s, Warning Letters, CIAs, etc? 27 M.R. Finch, PhD, MPH
  • 28. KPIs CROs and Vendors Scope of Work & Task Order Agreements Timelines, Milestones From these documents ALL KPIs are measurable from the outset. This can‟t be stressed enough. A Solid SOW and TOA = A Good Chance for Success! 28 M.R. Finch, PhD, MPH
  • 29. SOW & TOA KPIs Measurable Outcomes Start Here “X” Does Mark the Spot  Roles and Responsibilities; who is doing what, when, where.  Measureable Timelines and Deliverables; quantitative.  Project Milestones are milestones, not guidelines.  Define KPIs within the documents, set payments based on milestones, deliverables, and KPIs versus time burnt/FTE.  Early communication and clarity amongst parties ensures a 29 better chance for success. Finch, PhD, MPH M.R.
  • 30. KPIs Central IRBs  KPIs  Initial Protocol and Consent Approval Time  CRF and Assessment Approval Time  Patient Recruitment Material Approval Time  Individual Site Materials Approval Time  Revision Approval Times  Meet with their team in person and demand the metrics for assessment.  Web access portals, multiple boards with multiple meetings, great FDA standing 30 M.R. Finch, PhD, MPH
  • 31. KPIs Study Start Up and Activation How fast can you come online!  Define all Critical Paths, Floats, and Assess Resources  Site Selection Process ensures success or failure in not only start times but also recruitment and retention.  Site Selection KPIs - Site Assessments and Onboarding  Feasibility Questionnaires, Historical Data must be a prerequisite for site KPI assessment.  PSVs; Rapid assessment and onboarding. Verify ALL data at PSV  Time to Contract Negotiation  Time to IRB Approval  Time to FPS, FPE, FPR M.R. Finch, PhD, MPH 31
• 32. Recruitment KPIs: Patient Recruitment & Enrollment. Is this the Holy Grail?
• 33. Recruitment KPIs: Will it end like this... or this? (image-only slide)
• 34. Recruitment KPIs: 80% of Trials Fail to Enroll on Time
- 60-70% are delayed by more than 3-6 months.
- 40-50% are delayed by more than 6 months.
- 30% are delayed by up to a year or more.
At $100K per day (a conservative figure) for a Phase III trial, a one-year delay costs over $36.5 million, not counting lost revenue from delayed time to market (worked arithmetic below).
The average cost of a recruitment plan is ~3-5% of total trial cost.
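The delay arithmetic above can be checked directly. The $100K/day figure comes from the slide; the total trial cost used for the recruitment-plan comparison is an assumption for illustration:

```python
daily_cost = 100_000            # conservative Phase III daily cost (from slide)
delay_days = 365
delay_cost = daily_cost * delay_days
print(f"One year of delay: ${delay_cost / 1e6:.1f}M")  # $36.5M, matching the slide

# Recruitment plan at ~3-5% of trial cost; a $100M trial is assumed here.
trial_cost = 100_000_000
for pct in (0.03, 0.05):
    print(f"Recruitment plan at {pct:.0%}: ${trial_cost * pct / 1e6:.1f}M")
```

Even at the high end, the recruitment plan costs a fraction of what a single year of delay does.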
• 35. Recruitment KPIs: Key Metrics
Rate and acceleration (quantitative; sketched below):
- Time to site selection and activation
- Time to FSA and LSA (first and last site activated)
- Time to FPS, FPE, FPR
- LTFU (lost to follow-up), dropout, and retention rates
- Time to FPC, LPI, LPO
Qualitative (its effect on the above can be measured at inflection points):
- Source-of-subject tracking
- PR and trial awareness plan execution/implementation
- Retention efforts
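One way to make "rate and acceleration" concrete: compute the enrollment rate per reporting window from cumulative counts, then the change in rate between windows. All numbers here are hypothetical:

```python
# Cumulative enrollment snapshots: (study day, patients enrolled) - hypothetical.
snapshots = [(0, 0), (30, 6), (60, 18), (90, 42)]

rates = []
for (d0, n0), (d1, n1) in zip(snapshots, snapshots[1:]):
    rates.append((n1 - n0) / (d1 - d0))  # patients/day within each window

accel = [r1 - r0 for r0, r1 in zip(rates, rates[1:])]  # change in rate per window
print("Rates (pts/day):", [round(r, 2) for r in rates])  # [0.2, 0.4, 0.8]
print("Acceleration:",    [round(a, 2) for a in accel])  # [0.2, 0.4]
```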
• 36. Recruitment KPIs
"Selecting optimal study sites is the single most important study start-up activity related to rapid trial enrollment."
And no matter how many great sites you select, they cannot overcome a poorly crafted protocol....
• 37. MEASURING INVESTIGATOR PERFORMANCE: Platitudes to Ponder
- Slow site activation = slow or no patient recruitment
- Historical enrollment predicts future recruitment
- Acceptance = Participation
- Frustration = Abandonment
• 38. MEASURING INVESTIGATOR PERFORMANCE: Platitudes to Ponder
- Proactive = Performance
- Rescue programs (band-aids) cost more and do less
- Failing to plan is planning to fail
• 39. MEASURING INVESTIGATOR PERFORMANCE
What are the key historical or current site performance metrics (KPIs) to monitor?
- Historical enrollment performance; EMRs or paper?
- Number of patients in the database, or access to patients
- Breadth and depth of the referral network
- Willingness to attempt recruitment program activities
- Requests recruitment enhancement funding proactively
- Has an on-site trial relations or marketing manager
- Historical contract/budget negotiation time
- IRB approval time
- Central or local IRB
• 40. MEASURING INVESTIGATOR PERFORMANCE: Quantitative Analyses of Performance
Create a corporate sponsor/CRO PI database (a sketch of such a record follows below):
- Depth of the PI patient database (e.g., number of patients)
- Contract/budget negotiation time
- IRB approval time
- Site activation time
- Time to first patient screened and first patient randomized
- Number of patients screened/enrolled
- Number of patients early-terminated, lost to follow-up, and completed
- Enrollment time vs. allotted enrollment period
- Query, DCF, and DEE rates
- Various site-related trial costs
- Overall cost per patient enrolled (CPP)
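A minimal sketch of the kind of PI/site record such a database might hold. The fields mirror the slide's metrics; every value is hypothetical:

```python
from dataclasses import dataclass

@dataclass
class SiteRecord:
    # Fields mirror the slide's metrics; example values are hypothetical.
    pi_name: str
    db_patients: int         # depth of PI patient database
    contract_days: int       # contract/budget negotiation time
    irb_days: int            # IRB approval time
    activation_days: int     # site activation time
    screened: int
    enrolled: int
    completed: int
    cost_per_patient: float  # overall cost per patient enrolled (CPP)

    @property
    def screen_fail_rate(self) -> float:
        """Fraction of screened patients who were not enrolled."""
        return 1 - self.enrolled / self.screened if self.screened else 0.0

site = SiteRecord("Dr. Example", 1200, 45, 30, 75, 40, 28, 25, 18_500.0)
print(f"Screen-fail rate: {site.screen_fail_rate:.0%}")  # 30%
```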
• 41. Points to Ponder
- Are you providing feedback to the sites in real time?
- Are you assisting sites to measure their own performance?
- Are you creating centers of excellence using KPI metrics?
- Are you getting feedback from your sites on your performance as a sponsor or CRO?
One dollar invested proactively in the sites = ten dollars in return performance!!
• 42. MEASURING INVESTIGATOR PERFORMANCE
- Create a database across all internal IR/D components.
- Comparing feasibility questionnaires to the historical database allows for quantitative analyses.
- Select only the cream of the crop by stratifying the results (an illustrative scoring sketch follows below).
- Stratifying the PI list lets the CTM/CPM concentrate on the most rapidly activating sites to ensure the FPI milestone is captured.
- Eliminate using the same non-performing sites over and over across the company.
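Stratifying the PI list could be as simple as ranking on a weighted composite of the historical metrics. The weights and site records below are illustrative, not a validated model:

```python
# Hypothetical historical records: (site, activation_days, enrolled_vs_target).
sites = [
    ("Site A", 60, 1.10),
    ("Site B", 120, 0.55),
    ("Site C", 75, 0.95),
]

def score(activation_days: int, enroll_ratio: float) -> float:
    """Illustrative composite: reward fast activation and on-target enrollment."""
    speed = 1 - min(activation_days, 180) / 180   # 1.0 = instant, 0.0 = 180+ days
    return 0.4 * speed + 0.6 * min(enroll_ratio, 1.5)

ranked = sorted(sites, key=lambda s: score(s[1], s[2]), reverse=True)
for name, days, ratio in ranked:
    print(name, round(score(days, ratio), 2))  # activate the top stratum first
```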
• 43. MEASURING INVESTIGATOR PERFORMANCE: Points to Ponder
- Not all sites are created equal, nor are all investigators.
- KOLs historically run poorly enrolling centers; an unfortunate but real fact.
- Not all sites accurately report historical performance; they tend to overestimate (the 20% rule applies).
- The level of involvement of the PI is often a valid predictor of enrollment when all other influencers are equal.
- Geographical location is important; incidence and prevalence of disease vary with the population.
- These tools apply within a disease indication; sites may enroll slower or faster in another indication.
• 44. MEASURING PROGRAM PERFORMANCE
Recruitment Program
- Include the evaluation program at the outset.
- Design the plan in conjunction with:
  - Steering Committee (KOLs)
  - Site input: PI, CRC, Research Director, Marketing
  - Internal Marketing and Medical Affairs
- Build synergy across internal and external sources.
- Implement early and evaluate early; assess often and redesign as needed.
- Craft and maintain an evaluation ROI report; senior executives will want a report on the cost-to-benefit ratio, and your position may depend on the result.
• 45. MEASURING PROGRAM PERFORMANCE
Recruitment Program Evaluation
- Steering Committee, KOLs, and PIs
  - Assess level of product knowledge versus the literature, publications, presentations, and posters.
  - Assess level of buy-in to the protocol in general.
- Know your message points and TPP.
- Design presentations and assessments: assess level of knowledge, present data, reassess.
Understanding = Acceptance = Performance
• 46. MEASURING PROGRAM PERFORMANCE
Design the Recruitment Program and Test It
- Provide the program to the SC, KOLs, PIs, and CRCs.
- Use a site-specific paradigm; one size does not fit all.
- Get site-specific feedback and fine-tune: what media works best in their area?
- Develop a site-specific referral network list with contact information.
- Review with Marketing, Medical Affairs, and Legal.
- Design evaluation KPIs for the program.
- Design SMART metrics and a tracking system.
- Design the ROI report.
- Circle back once more prior to implementation. Then off to the races!
• 47. MEASURING PROGRAM PERFORMANCE
This presentation is too short for a full recruitment-program design seminar and its KPIs, so assume the program is designed. What should we measure?
$ Follow the spend $
• 48. MEASURING PROGRAM PERFORMANCE: Case Study
- New global trial; 150 US trial subjects required; very tight study completion timelines
- Poor perception of the ability to hit the enrollment goal
- Limited senior management buy-in for a recruitment program and its associated costs
- A variety of band-aids had been tried in similar trials, with no success
- Perception that enrollment depended entirely on site databases
• 49. Case Study: Action Plan
- Began an internal BU expertise and resource search
- Implemented the evaluation program at the start
- Implemented an internal evaluation of potential PIs
  - Interviewed CTMs/CPMs, CRAs, MSLs, etc.
- Reviewed available internal data: enrollment, site activation times
- Created a PI/site database from scratch
• 50. Case Study: Action Plan
- Implemented a site selection program using quantitative metrics
- Engaged all internal and external stakeholders for recruitment program design
- Designed and implemented the recruitment program early
- Stratified site selection based on quantitative data
- Began site activation based on cohort strata
• 51. MEASURING PROGRAM PERFORMANCE: Case Study Facts
Program ROI evaluation demonstrated (source of subjects):
- Site databases represented <60% of patients; CRC time-resource funding was required for EMR/chart review
- Emails and letters were very productive
- PI/CRC-to-patient discussions were the driver
- Community organizations: ~20% of patients
- Media and PR relations: ~20% of patients
• 52. Case Study Facts: Results
- 93% of the enrollment goal met in the allotted time
- FPI goal met; LPI goal met
- Site activation proceeded by strata
- Time to grants/contracts and IRB approval reduced ~10%
- Time to overall activation reduced ~15%
- The site stratification paradigm predicted enrollment
- PI/CRC acceptance predicted enrollment
- Screening rate was elastic to program components, which drove enrollment
- Improved site and investigator relations, confirmed via a summative evaluation questionnaire at study end
• 53. Case Study: Screening
[Chart: cumulative patients screened (0-160) from 11/29/2007 through 11/29/2008]
• 54. Case Study
Screening rates: three rates with two primary inflection points (computation sketched below)
- December 2007 - April 21, 2008: 0.23 pts/day
- April 22 - July 23, 2008: 0.40 pts/day
- July 24 - October 17, 2008: 0.73 pts/day
Additional inflection point:
- September 23 - October 17, 2008: 0.94 pts/day
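The per-segment rates on this slide are simply patients screened divided by days in the segment. A sketch of that computation, with the December start approximated as Dec 1 and cumulative counts back-calculated (hypothetically) to match the slide's rates:

```python
from datetime import date

# Segment boundaries follow the slide; patients-screened counts per segment
# are hypothetical, chosen so the computed rates match the slide's figures.
segments = [
    (date(2007, 12, 1), date(2008, 4, 21), 33),   # ~0.23 pts/day
    (date(2008, 4, 22), date(2008, 7, 23), 37),   # ~0.40 pts/day
    (date(2008, 7, 24), date(2008, 10, 17), 63),  # ~0.73 pts/day
]

for start, end, screened in segments:
    days = (end - start).days + 1  # inclusive of both endpoints
    print(f"{start} to {end}: {screened / days:.2f} pts/day over {days} days")
```

Plotting these rates against the recruitment-program timeline is what lets you attribute each inflection point to a specific program component.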
• 55. Conclusion
- Defining KPIs and SPIs early is critical.
- Defining the evaluation plan in parallel with program/project planning is critical to success.
- Program evaluation creates ROI.
- Measuring KPIs and SPIs creates documentable ROI and increases future funding.
• 56. References
- Bain & Company. Has the Pharmaceutical Model Gone Bust? www.bain.com; December 8, 2003.
- Association for Project Management. APM Body of Knowledge, 5th ed. 2006.
- Collier R. Rapidly rising clinical trial costs worry researchers. CMAJ. 2009;180(3).
- Cutting Edge Information. Clinical Operations: Accelerating Trials, Allocating Resources and Measuring Performance. www.ClinicalTrialBenchmarking.com [accessed 2006].
- Cutting Edge Information. www.ClinicalTrialBenchmarking.com [accessed 2011].
- DiMasi JA, Hansen RW, Grabowski HG. The price of innovation: new estimates of drug development costs. J Health Econ. 2003;22:151-185.
- Johnston SC, Hauser SL. Clinical trials: rising costs limit innovation. Ann Neurol. 2007;62(6):A6-A7.
- Milosevic DZ. Project Management ToolBox: Tools and Techniques for the Practicing Project Manager. Wiley; 2003.
- Pharmaceutical Research and Manufacturers of America. Pharmaceutical Industry Profile 2009. Washington, DC: PhRMA; April 2009.
- Trochim WMK. Research Methods Knowledge Base: Introduction to Evaluation. Web Center for Social Research Methods; 2006. www.socialresearchmethods.net [accessed 28NOV2011].
- W.K. Kellogg Foundation. W.K. Kellogg Foundation Program Evaluation Handbook. 1997. www.wkkf.org/knowledge-center/resources [accessed 28NOV2011].

Editor's Notes

1. If you can't justify an ROI on what is being measured, you have to ask the team whether it is relevant to measure and report at all. Labor hours cost money.
2. Notes: some things can be measured, but should they be? You can't measure everything. Remember the observer effect (often attributed to Heisenberg): the act of measuring something can itself change the outcome, so it is sometimes best to measure certain activities in a blinded fashion to get the most unbiased reporting. What is the definition and scope of the problem or issue, or what's the question? Where is the problem, and how big or serious is it? How should the program or technology be delivered to address the problem?
3. These are not just project management techniques; they are methods for measuring performance and KPIs and for assessing the merit of your program/project parameters.
4. The first three deal more with social research and social programs; however, a thorough understanding of the theories is essential to designing an evaluation program. Project management is essentially a commercialization of program evaluation theory, and it should be used for more than just charting a course from Point A to Point B.
  5. I will add vendor and CRO selection to this as well.
  6. Mention the scope of work and task order agreement at this point. Different lecture but critical.
  7. Briefly hit contents of TOA.
8. Discuss anecdotes from BMS regarding poor milestone accomplishment driven by SOWs defined as pay-for-play, FTE-based models.
9. A great IRB can speed the time to market; a poor IRB selection can ruin your timelines. Don't be afraid to pull a project if they fail to meet standards or promised delivery times.