Using a Fidelity Index to Increase Program
 Attribution in Impact Evaluation Studies



     Presented by Donna Smith-Moncrieffe
  Canadian Evaluation Society-Ontario Chapter
             October 4th-5th, 2010
Presentation Outline

National Crime Prevention Centre (NCPC) mandate

What is a Quantifiable Fidelity Index? Why is it Important?

How can using a Quantifiable index increase Program
Attribution?

Steps: From Fidelity Tool Development to interpretations
regarding program attribution

Sample Fidelity tool and output data for two Model programs

  Towards No Drug Program (results)
  Stop Now and Plan Program ® (application of treatment intensity
  measures)

Summary


                                                                    2
NCPC Mandate/Core Activities


Mission statement:

To provide national leadership on effective and cost-efficient
ways to both prevent and reduce crime by addressing known
risk factors in high risk populations and places

Core activities:

   Supporting targeted interventions in local communities

   Building and sharing practical knowledge with policy
   makers and practitioners




                                                                 3
NCPC Priorities:


Provide funding to the following target groups/crime issues:

  Children and youth at risk

  Crime prevention in Aboriginal communities

  Prevent recidivism among high-risk groups

  Priority crime issues (youth gang, drug-related crimes)




                                                               4
Use of Evaluation in NCPC
[Flow diagram linking the following elements:]

   NCPC Time-Limited Funding

   Identify and encourage the development of Model and Promising Programs

   Encourage the development of Fidelity Tools to increase Attribution

   Use Multi-site Evaluations to increase Knowledge

   Results contribute to Treasury Board Reports

   Disseminate results and encourage the province and municipalities to
   replicate effective programs

                                                                                5
What is Program Fidelity?




Fidelity in evaluation is used to describe the extent to which the
initiative/intervention corresponds to the originally intended program

The literature uses the following terms interchangeably

   Adherence
   Compliance
   Integrity




                                                                         6
Benefits of a Fidelity Index

A Fidelity Index:

   Lists the elements that contribute to the success of the
   program

   Makes the important elements of the program visible and
   helps evaluators “bound” the program

   When expected results are unfavourable, model programs can
   attribute the results to low fidelity levels while still being able to
   maintain a reputation for being an effective program

   Identifies what elements practitioners should focus on to
   increase compliance

   Ensures that elements in the index will have a complementary
   quality assurance protocol and definitions for practitioners to
   review

                                                                            7
What is a Quantifiable Fidelity Index?


A quantifiable fidelity index is a set of measurable items that
determine whether the program elements have been implemented
as planned.

  Each item provides a quantifiable measure using ratio or interval level
  data that can be later used to test for statistical significance, changes in
  effect sizes or clinical significance

  The items in the index can be checked for construct validity. We can
  confidently determine if we are measuring what we think we are
  measuring. (i.e. quantitative measures make it possible to apply
  Cronbach’s alpha and other reliability tests)

  Composite indices and aggregate scores can be inserted into
  multivariate analysis to determine whether the fidelity related scores
  contribute to participant related changes.




                                                                                 8
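A minimal sketch of the reliability check mentioned above, using the standard formula for Cronbach's alpha. The site-by-item scores are hypothetical and purely illustrative, not NCPC data.

```python
# Minimal sketch: internal-consistency check (Cronbach's alpha) for a
# quantifiable fidelity index. The site-by-item scores are hypothetical.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: rows = sites (or cohorts), columns = fidelity index items."""
    k = items.shape[1]                          # number of items in the index
    item_var = items.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the composite score
    return (k / (k - 1)) * (1 - item_var / total_var)

# Hypothetical fidelity scores for 6 sites on 4 items (each scored 0-12)
scores = np.array([
    [10, 11,  9, 12],
    [ 8,  7,  9,  8],
    [12, 12, 11, 12],
    [ 6,  5,  7,  6],
    [ 9, 10,  8,  9],
    [11, 12, 10, 11],
])
print(f"Cronbach's alpha: {cronbach_alpha(scores):.2f}")  # values near 1 = high internal consistency
```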
How Can a Measurable Fidelity Index Contribute to
          Better Program Attribution?
  You can confidently answer the question, ‘Did the model program contribute
  to change, or were additional unintended program elements contributing to
  changes?’ (i.e. Was the inconsistent implementation of cognitive
  behavioural sessions responsible for the lower than expected changes in
  anger for boys in the SNAP program?)

  Each element that is quantified can be placed into a multivariate causal
  model. The evaluator can:
    Isolate whether the program’s adherence levels contributed to
    favourable or unfavourable change in the outcomes of interest
    Isolate what specific aspect of the program contributed to greater effects
    or changes in the outcomes of interest
    Identify the need to find other explanatory variables that are contributing
    to program impact (i.e. R squared adjusted result < .80 informs us that
    there are other factors that account for the change in reduction of drugs
    or offending for example)
    Identify if low fidelity to the program is contributing to variation in the
    variables of interest (i.e. Does low participant responsiveness in 25% of
    the TND classrooms affect the overall goal of preventing or reducing
    drug use in youth?)

                                                                              9
Use of Fidelity in Four NCPC Model Programs
     (Based on literature prior to NCPC Implementation)



75% of model programs used fidelity for        25% of model programs used fidelity for
Process evaluation only                        Outcome/Impact Evaluations

Method of Attribution is limited to            Method of Attribution is stronger and can
inferences about how the Program is            identify how much correlation exists between
related to outcomes                            the Program elements and outcomes



                                                                            10
Key Steps in Using a Quantifiable Fidelity Tool to
                Increase Attribution


STEP 1: Identify all the key program elements using a comprehensive
        framework

STEP 2: Apply appropriate measures

STEP 3: Construct the causal model

STEP 4: Calculate and interpret the coefficients (multivariate
        analysis)

STEP 5: Reporting: Enhance program attribution. Determine if
        program fidelity levels contributed to outcomes



                                                                  11
STEP 1:
     Identify all key elements using a comprehensive framework

Based on Dane & Schneider Framework (1998)

Develop the fidelity tool and include all elements in the following five areas:

1.     Implementation: Was the program delivered as intended? (i.e. Were all 8
       sessions delivered on a weekly basis over a 2 month period?)

2.     Dosage: How much of the program was delivered (i.e. Did the youth receive
       the 48 hours of individual case management?)

3.     Quality: Are the main components of the program delivered clearly and
       correctly?

4.     Participant Responsiveness: Does the program stimulate interest among
       the participants and practitioners? (i.e. Are teachers interested in
       delivering the curriculum to students?)

5.     Monitoring Control/Comparison Fidelity: What services did the
       comparison/control group receive? Did the program adhere to its intention to
       provide minimal services or “usual care”?

                                                                                   12
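As a sketch of how a checklist might be organised so that every dimension listed above is covered, the structure below uses hypothetical item wording; only the dimension names come from the slide, everything else is illustrative.

```python
# Minimal sketch: organising fidelity items under the five dimensions listed
# above so no dimension is left unmeasured. Item wording is illustrative.
from dataclasses import dataclass

@dataclass
class FidelityItem:
    description: str
    achieved: float   # what was actually delivered/observed
    planned: float    # what the model program intended

fidelity_tool = {
    "Implementation": [FidelityItem("All 8 sessions delivered weekly over 2 months", 8, 8)],
    "Dosage": [FidelityItem("Hours of individual case management received", 40, 48)],
    "Quality": [FidelityItem("Sessions delivered by a trained facilitator", 10, 12)],
    "Participant Responsiveness": [FidelityItem("Teachers interested in delivering the curriculum", 3, 4)],
    "Control/Comparison Monitoring": [FidelityItem("Comparison group received usual care only", 1, 1)],
}

# Quick check that each dimension has at least one measurable item
for dimension, items in fidelity_tool.items():
    print(dimension, "->", len(items), "item(s)")
```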
STEP 2:
                 Apply Appropriate Measures

Durlak and DuPre (2008) reviewed 59 studies (1998-2005) that had some
type of fidelity tool and found the following typical measures:

   27/59 studies (46%) used categorical measures
      Ordinal scales (i.e. assigning a definition to low, moderate and
      high fidelity levels)

   32/59 studies (54%) used continuous variables/interval measures
      Averages reported as percentages (i.e. observations of each
      session identified that 65% of the youth showed interest in the
      counselling sessions)
      Likert scales results were converted to continuous variables
      Used actual # of hours, # of sessions, # of months especially for
      dosage related measures

                                                                         13
STEP 2:
    Sample Treatment Intensity Measures
          (Stop Now and Plan® Program)
Sample: Quantifiable fidelity checklist (note: this is not the full checklist)

   Target Group Population Met:                               8/10

   12 Cognitive Behavioural group sessions:                  10/12
       Sessions delivered weekly within 3 months
       Delivered by a trained facilitator
       Approved manuals used (weekly documentation)

   12 Cognitive Behavioural concurrent parenting sessions:   10/12

Fidelity Level: High            Total Score: 28/34 or 82.4%
____________________________________________________________
Fidelity Legend
No Fidelity:       0-29%
Low Fidelity:      30-69%
Moderate Fidelity: 70-79%
High Fidelity:     80-100%


                                                                                  14
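A minimal sketch of how the sample checklist above could be scored and mapped onto the fidelity legend; the item names are shorthand for the sample items, and the thresholds come from the legend on the slide.

```python
# Minimal sketch: computing a composite fidelity score from checklist items
# and mapping it onto the legend above. Item names are shorthand for the
# sample checklist; (achieved, planned) counts mirror the slide.
checklist = {
    "target_group_population_met":   (8, 10),
    "cbt_group_sessions":            (10, 12),
    "concurrent_parenting_sessions": (10, 12),
}

achieved = sum(a for a, _ in checklist.values())
planned = sum(p for _, p in checklist.values())
percent = 100 * achieved / planned

def fidelity_level(pct: float) -> str:
    """Apply the fidelity legend (0-29 none, 30-69 low, 70-79 moderate, 80-100 high)."""
    if pct >= 80:
        return "High"
    if pct >= 70:
        return "Moderate"
    if pct >= 30:
        return "Low"
    return "No Fidelity"

print(f"Total score: {achieved}/{planned} ({percent:.1f}%) -> {fidelity_level(percent)} fidelity")
# Total score: 28/34 (82.4%) -> High fidelity
```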
STEP 3: Construct the causal model
      (Multiple Regression Analysis sample)

A multiple regression equation can be used to make estimates
about key program outcomes (i.e. recidivism) based on given
values for a number of explanatory variables (i.e. levels of drug
use, # of hours participating in treatment, and type of
practitioner)

Ensure all statistical assumptions are met before using the model

Construct the causal model by using the following equation :

     y = β0 + β1X1 + β2X2 + … + βkXk + ε


                                                                    15
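A minimal sketch of fitting this equation with ordinary least squares, assuming a hypothetical data set whose columns hold the outcome and the quantified treatment and fidelity measures; the file and column names are placeholders, not an NCPC data set.

```python
# Minimal sketch: estimating y = b0 + b1*X1 + ... + bk*Xk + e with OLS.
# The data file and column names are hypothetical placeholders for an
# evaluation data set containing outcome, treatment and fidelity measures.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("fidelity_outcomes.csv")  # hypothetical participant-level data

# Outcome regressed on treatment intensity (CBS, DSH) and a fidelity score (FPS)
model = smf.ols("aggression ~ CBS + DSH + FPS", data=df).fit()

print(model.summary())      # coefficients, standard errors, p-values
print(model.rsquared_adj)   # share of outcome variability the model explains
# As the slide notes, check the statistical assumptions (linearity, normal
# residuals, no multicollinearity) before interpreting the coefficients.
```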
STEP 4 :
           Calculate and interpret coefficients

Interpretations
Overall, we are looking at the marginal contribution that each X
(explanatory) variable makes to the Y (outcome) variable when the
other X variables are held constant.

Review and interpret the output and consider the following
  questions:
  Are the elements from the fidelity tool statistically significant?
  What elements of the fidelity tool have higher values and contribute
  more to change in the key outcomes of interest?
  What is the overall contribution of the program elements to the
  outcome?
  How much impact does fidelity have on program outcomes?


                                                                         16
STEP 4:
    Calculation and Interpretation of Coefficients
                        cont’d



Be clear about how you got from your fidelity tool to discussing
whether adherence levels had an impact on program outcomes…
There are no miracles here…

    “I think you should be more explicit here in step two.”
                                                        17
Step 4:
     Calculate and Interpret Coefficients                 (sample   data only)



Y =                        Aggressive Behaviour

Constant                   5.274
Coefficient of X1 (CBS)    0.4487   Cognitive Behavioural Sessions (0-12 hours)
Coefficient of X2 (DSH)    0.3334   Duration of Service Hours (total program hours)
Coefficient of X3 (FPS)    0.2448   Levels of Fidelity (Participant Responsiveness)

R² (adj.)                  0.7553

                                                                                 18
Step 4:
           Calculate and Interpret Coefficients
Y = 5.274 - 0.4487 (CBS) - 0.3334 (CPG) - 0.2448 (FPS)
R² (adj.) = .7553
   For every increment of cognitive behavioural sessions, aggressive
   behaviour decreases by 0.4487
   For every increment of cognitive parental group sessions,
   aggressive behaviour decreases by 0.3334
   For every increment of fidelity (responsiveness), aggressive
   behaviour declines by 0.2448
   R² (adjusted): The treatment intensity of the cognitive behavioural
   sessions, the duration of service hours and total group responsiveness
   together explain 76% of the variability in aggression levels.
   Fidelity levels related to group responsiveness are positively
   correlated with the outcomes measuring changes in aggression
   levels but contribute less than treatment related factors

                                                                    19
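A small sketch that plugs the sample equation above into a prediction function to illustrate the "for every increment" interpretation; the predictor values are arbitrary and only the coefficients come from the sample output.

```python
# Minimal sketch: the sample equation above as a prediction function, used to
# show the marginal interpretation of a coefficient. Predictor values are
# arbitrary; only the coefficients come from the sample output.
def predicted_aggression(cbs: float, cpg: float, fps: float) -> float:
    return 5.274 - 0.4487 * cbs - 0.3334 * cpg - 0.2448 * fps

baseline = predicted_aggression(cbs=6, cpg=6, fps=3)
one_more_cbs = predicted_aggression(cbs=7, cpg=6, fps=3)

# Holding the other predictors constant, one additional cognitive
# behavioural session lowers predicted aggressive behaviour by 0.4487.
print(round(one_more_cbs - baseline, 4))   # -0.4487
```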
Step 4 Calculation cont’d
                Use Multi-Level Analysis
Include fidelity scores in the ANOVA or regression equation

Use a multilevel analysis approach:
 Group fidelity measure
      Participant responsiveness
      Implementation measures
      Quality of Implementation

  Individual measures
      Dosage (quantity)

Ensure all key quantifiable fidelity scores are tested for separate
cohorts (i.e. various classrooms, sites or sessions). Correlate
fidelity scores with key results (i.e. changes in knowledge,
attitudes and behaviour)

                                                                      20
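A minimal sketch of one multilevel option, assuming a hypothetical data set in which individual-level dosage and group-level fidelity measures are recorded per classroom; the file name, column names and model specification are illustrative, not the analysis NCPC ran.

```python
# Minimal sketch of a multilevel analysis: individual-level dosage plus
# group-level fidelity measures, with a random intercept per classroom/site.
# File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("multisite_fidelity.csv")  # hypothetical multi-site data

# Fixed effects: dosage (individual level), participant responsiveness and
# implementation quality (group level). Random intercept: classroom.
mixed = smf.mixedlm(
    "drug_use ~ dosage_hours + responsiveness + implementation_quality",
    data=df,
    groups=df["classroom_id"],
).fit()

print(mixed.summary())  # then correlate group fidelity scores with key outcomes
```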
Step 5: Reporting Results
              What Really Contributed to the Results?
                          High Levels of Fidelity        Low Levels of Fidelity
                          to the Program                 to the Program

 Favourable changes       Increased confidence           Determine whether fidelity scores or
 made in the outcomes     that the program is            core treatment measures are
 of interest              effective (It works!)          contributing to results
                                                         Qualify results by explaining how
                                                         fidelity may have contributed to
                                                         results
                                                         Be cautious about reporting on
                                                         program attribution

 Unfavourable changes     Increased confidence           Explore other sources of data to
 made in the outcomes     that the program is            explain findings:
 of interest              ineffective                    How many elements of the fidelity
                                                         index were low?
                                                         What elements of the index were low?


                                                                                21
STEP 5: Determine if Program Outcomes are related to Fidelity
                (Towards No Drugs sample)


Outcome                                                             Fidelity
Program-specific knowledge                                           0.33 (0.09)***
Beliefs: health-as-a-value                                           0.10 (0.04)**
Beliefs: pro-drug myths                                             -0.15 (0.06)**
Cigarette intentions                                                -0.08 (0.04)**
Marijuana intentions                                                -0.09 (0.04)**
Alcohol intentions                                                  -0.07 (0.04)*

Multi-level models; * p < .10    ** p < .05    *** p < .0001, one-tailed

Source: Rohrbach, L.A., Gunning, G., Sussman, S., & Sun, P. (May 2008).
Predictors of implementation in the Project Towards No Drug Abuse
dissemination trial.




                                                                                         22
Summary

Use all five categories (Implementation, Dosage, Quality, Participant
Responsiveness and Monitoring of control/comparison group fidelity) to
ensure the tool/index is comprehensive
   Utilize a participatory approach to identify the key elements
   Use a statistical approach (backward or forward selection procedures)

Evaluators should advocate for the development and use of a
quantifiable fidelity tool to be implemented in the evaluation
study
   Encourage the use of interval levels of measurement for each item in
   the tool. This type of measure can provide specific information about
   incremental changes in the outcomes of interest
   Qualitative information should also be collected to explain why the
   results may be favourable or unfavourable


                                                                           23
Summary
Construct a multivariate analysis model that will incorporate
elements of the fidelity tool
   Consider using multilevel models that use more than one equation and
   take different levels of data into consideration (i.e. the Towards
   No Drugs fidelity tool required a multilevel model for the analysis:
   schools are nested in communities, pupils are nested in schools, etc.)

Report how the fidelity levels related to the outcomes of interest

   Where fidelity levels and expected results are low, ensure that
   triangulation with other data is used to verify program attribution
   levels
   Explore other elements of the regression or other multivariate
   equations used to isolate what program elements may or may not have
   contributed to the results.



                                                                              24
Contact Information




Donna Smith-Moncrieffe, BSc., Crim Dip, MSc.
          Senior Evaluation Advisor
            Public Safety Canada
      National Crime Prevention Center
   Policy Research and Evaluation Division
  E-mail: donna.smith-moncrieffe@ps.gc.ca




                                               25
