Improving quality of humanitarian programmes through the use of a scoring system: the Humanitarian Indicator Tool


Dr Vivien Margaret Walden and Nigel Timmins
Acknowledgements

• We wish to acknowledge the consultants, Andy Featherstone,
  Peta Sandison, Marlise Turnbull and Sarah House, who helped
  refine the tools, and the 10 country offices that have been
  through the process and have hopefully come out of it
  unscathed.
• Photographs used in this presentation are by Caroline Gluck,
  Oxfam




Why a new tool?

• In 1995, the ODI evaluation of the response to the Rwanda
  crisis raised concerns around the quality of service delivery
• As a result of the findings, the Sphere Project was started
• In 1997 the People in Aid Code of Best Practice was
  published
• In 2000 the Humanitarian Ombudsman became HAP
• In 2001 Griekspoor and Sondorp wrote a paper highlighting
  the various quality assurance measures and posed the
  question “Do all of the developments in the field of
  accountability and performance actually improve overall
  performance?”


Why a new tool?

• Oxfam’s hypothesis – by improving the quality of a
  humanitarian programme, the likelihood of positive impact is
  increased
• The tool does not try to prove the link
• Oxfam has introduced the Global Performance Framework as
  part of reporting to DFID
• The framework has global indicators, of which the Humanitarian
  Indicator is one
• Whereas the other indicators measure impact, the humanitarian
  indicator looks at the quality of the response




Oxfam’s Global Performance Framework

Each thematic area pairs a global output indicator with a global outcome indicator; all of them contribute to the overall goal of improved quality of life for poor women and men.

• Humanitarian Assistance
  Output: total number of people provided with appropriate humanitarian assistance, disaggregated by sex
  Outcome: degree to which humanitarian responses meet recognised quality standards for humanitarian programming (e.g. Sphere guidelines)

• Adaptation and Risk Reduction
  Output: # of people supported to understand current and likely future hazards, reduce risk, and/or adapt to climatic changes and uncertainty, disaggregated by sex
  Outcome: % of supported households demonstrating greater ability to minimise risk from shocks and adapt to emerging trends & uncertainty

• Livelihood Enhancement Support
  Output: # of women and men directly supported to increase income via enhancing production and/or market access
  Outcome: % of supported households demonstrating greater income, as measured by daily consumption and expenditure per capita

• Women’s Empowerment
  Output: # of people reached to enable women to gain increased control over factors affecting their own priorities and interests
  Outcome: % of supported women demonstrating greater involvement in household decision-making and influencing affairs at the community level

• Citizen Mobilisation
  Output: # of a) citizens, CBO members and CSO staff supported to engage with state institutions/other relevant actors; and b) duty bearers benefiting from capacity support
  Outcome: degree to which selected interventions have contributed to affecting outcome change, as generated from findings of rigorous qualitative evaluations

• Campaigning and Advocacy
  Output: # of campaign actions directly undertaken or supported, e.g. contacts made with policy targets, online and offline actions taken, media coverage, publications, etc.
  Outcome: degree to which selected interventions have contributed to affecting outcome change, as generated from findings of rigorous qualitative evaluations

• Additional outcome indicators: degree to which selected interventions meet recognised standards for accountable programming; extent to which selected projects deliver good value for money
The indicator

• Output indicator – the total number of people
  provided with appropriate humanitarian
  assistance, disaggregated by sex
•   Data collected annually through an online system


• Outcome indicator – the degree to which
  humanitarian responses meet recognised quality
  standards for humanitarian programming
•   Data collected through the HIT




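The output indicator is just a headcount, but it has to be disaggregated by sex and rolled up annually across responses. A minimal Python sketch of that aggregation, using hypothetical per-response records (the field names and figures are illustrative, not the actual online reporting system):

```python
from collections import Counter

# Hypothetical per-response records; field names and figures are illustrative only.
responses = [
    {"country": "Kenya", "female": 12000, "male": 9500},
    {"country": "Pakistan", "female": 8200, "male": 7900},
]

def annual_output_indicator(records):
    """Total number of people provided with humanitarian assistance,
    disaggregated by sex, summed across all reported responses."""
    totals = Counter()
    for record in records:
        totals["female"] += record["female"]
        totals["male"] += record["male"]
    totals["total"] = totals["female"] + totals["male"]
    return dict(totals)

print(annual_output_indicator(responses))
# {'female': 20200, 'male': 17400, 'total': 37600}
```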
The tool – for rapid onset
emergencies
  Global Humanitarian Indicator: degree to which humanitarian responses meet recognised quality standards for humanitarian programming

  Rapid onset emergency – earthquake, sudden floods, tsunami, cyclones, typhoons, hurricanes, sudden conflict with displacement, AWD outbreaks

  Benchmarks 1–3 (met = score 6, almost met = 4, partially met = 2, not met = 0):
  1. Timeliness - rapid appraisal/assessment enough to make decisions within 24 hours and initial implementation within three days
  2. Coverage uses 25% of affected population as a planned figure (response should reflect the scale of the disaster) with clear justification for final count
  3. Technical aspects of programme measured against Sphere standards

  Benchmarks 4–8 (met = score 3, almost met = 2, partially met = 1, not met = 0):
  4. MEAL strategy and plan in place and being implemented using appropriate indicators
  5. Feedback/complaints system for affected population in place and functioning, and documented evidence of information sharing, consultation and participation leading to a programme relevant to context and needs
  6. Partner relationships defined, capacity assessed and partners fully engaged in all stages of programme cycle
  7. Programme is considered a safe programme: action taken to avoid harm and programme considered conflict sensitive
  8. Programme (including advocacy) addresses gender equity and specific concerns
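To make the two scoring bands concrete, here is a minimal Python sketch of how a rating would translate into points. The band split (benchmarks 1–3 scored 6/4/2/0, the remaining benchmarks 3/2/1/0) is taken from the rapid-onset table above; the function itself is only an illustration, not part of the tool.

```python
# Points per rating, taken from the rapid-onset HIT table above:
# benchmarks 1-3 are scored 6/4/2/0, the remaining benchmarks 3/2/1/0.
BAND_HIGH = {"met": 6, "almost met": 4, "partially met": 2, "not met": 0}
BAND_LOW = {"met": 3, "almost met": 2, "partially met": 1, "not met": 0}

def benchmark_score(number: int, rating: str) -> int:
    """Return the points awarded for one benchmark rating."""
    band = BAND_HIGH if number <= 3 else BAND_LOW
    return band[rating.lower()]

# Example: benchmark 2 rated 'almost met' earns 4 points,
# benchmark 5 rated 'partially met' earns 1 point.
print(benchmark_score(2, "almost met"), benchmark_score(5, "partially met"))  # 4 1
```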
The benchmarks
• Timeliness - rapid appraisal/assessment enough to make
  decisions within 24 hours and initial implementation within
  three days
• Coverage uses 25% of affected population as a planned
  figure (response should reflect the scale of the disaster) with
  clear justification for final count
• Technical aspects of programme measured against Sphere
  standards
• MEAL strategy and plan in place and being implemented
  using appropriate indicators
• Feedback/complaints system for affected population in place
  and functioning and documented evidence of information
  sharing, consultation and participation leading to a
  programme relevant to context and needs

The benchmarks
• Partner relationships defined, capacity assessed and partners
  fully engaged in all stages of programme cycle
• Programme is considered a safe programme: action taken to
  avoid harm and programme considered conflict sensitive
• Programme (including advocacy) addresses gender equity
  and specific concerns and needs of women, girls, men and
  boys and vulnerable groups
• Evidence that preparedness measures were in place and
  effectively actioned
• Programme has an advocacy/campaigns strategy and has
  incorporated advocacy into programme plans based on
  evidence from the field



The benchmarks

• Country programme has an integrated approach including
  reducing and managing risk through existing longer-term
  development programmes and building resilience for the
  future
• Evidence of appropriate staff capacity to ensure quality
  programming




The methodology

• Done by an external consultant – preferably one who has
  knowledge of Oxfam
• Done as a desk study using documentation and some
  telephone/Skype interviews
• Follows a pre-determined scoring system and list of
  documents
• Has to be commented on and accepted by the country
• The country writes a management response
• All reports and a summary of each are published on the
  Oxfam website – www.oxfam.org



The scoring
Quality standard 3: technical aspects of programme measured against Sphere standards

Evidence needed: proposals; MEAL strategy and plans; PH and EFSL strategies; technical adviser visits; training agendas and presentations; LogFrames and monitoring frameworks; donor reports; RTE and other evaluation reports; learning event or review reports

Met (score 6): Sphere standards proposed and put in place with adjusted standards for context; training in standards carried out for staff and partners; indicators use standards and monitoring against standards takes place regularly; standards evaluated

Almost met (score 4): not applicable

Partially met (score 2): Sphere standards proposed and adjusted to context; standards mentioned in proposals and LogFrames but not monitored against; some evidence of training but not widespread (staff but not partners, or only in one area)

Not met (score 0): standards only mentioned in proposals but not replicated in plans, or no mention of Sphere in any document
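The rubric lends itself to a simple lookup structure, so a reviewer can see which evidence pattern corresponds to each score. A minimal sketch for benchmark 3, transcribing the criteria from the table above; the dictionary layout is illustrative, not part of the tool itself.

```python
# Benchmark 3 rubric (technical aspects measured against Sphere standards),
# transcribed from the scoring table above. The layout is illustrative only.
BENCHMARK_3_RUBRIC = {
    6: ("Met", "Sphere standards proposed and put in place with adjusted standards "
               "for context; training carried out for staff and partners; indicators "
               "use standards and monitoring against them takes place regularly; "
               "standards evaluated"),
    4: ("Almost met", "Not applicable for this benchmark"),
    2: ("Partially met", "Standards proposed and adjusted to context; mentioned in "
                         "proposals and LogFrames but not monitored against; some "
                         "evidence of training but not widespread"),
    0: ("Not met", "Standards only mentioned in proposals but not replicated in "
                   "plans, or no mention of Sphere in any document"),
}

def describe_score(points: int) -> str:
    """Human-readable description of what a given score means for benchmark 3."""
    label, criteria = BENCHMARK_3_RUBRIC[points]
    return f"{label} ({points} points): {criteria}"

print(describe_score(2))
```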
Instructions for use

Benchmark 3: technical aspects of programme measured against Sphere standards

Evidence: proposals; MEAL strategy and plans; PH and EFSL strategies; technical adviser visits; training agendas and presentations; LogFrames and monitoring frameworks; donor reports; RTE and other evaluation reports; learning event or review reports

Quality check:
• Check proposals and strategies to see if standards are mentioned not just as a possibility but are considered in the context of the response – this might mean that Sphere has been adapted to suit the context
• The indicators on the LogFrame for technical areas should reflect Sphere standards
• The MEAL strategy should have Sphere as indicators and for data collection methods
• Check adviser reports for mention of standards and how these were implemented
• Check the RTE report for mention of Sphere standards
• Check WASH and EFSL strategies and adviser reports to see if any training was carried out for staff and partners
• Check review and evaluation reports for mention of standards
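Because the assessment is a desk study, much of the quality check is first confirming which of the listed evidence documents were actually supplied. A minimal sketch of that gap check, assuming hypothetical document labels rather than Oxfam's real filing conventions:

```python
# Evidence documents the reviewer expects for benchmark 3, per the table above.
# Labels are illustrative; real document titles will vary by country office.
EXPECTED_EVIDENCE = [
    "proposals",
    "MEAL strategy and plans",
    "PH and EFSL strategies",
    "technical adviser visit reports",
    "training agendas and presentations",
    "LogFrames and monitoring frameworks",
    "donor reports",
    "RTE and other evaluation reports",
    "learning event or review reports",
]

def evidence_gaps(supplied):
    """Return the expected evidence documents that were not supplied."""
    supplied_lower = {doc.lower() for doc in supplied}
    return [doc for doc in EXPECTED_EVIDENCE if doc.lower() not in supplied_lower]

# Example: a country office that sent only proposals and donor reports.
print(evidence_gaps(["Proposals", "Donor reports"]))
```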
Final score

•   In the first year the maximum possible score was 30
•   Scores are adjusted for non-applicable benchmarks
•   First-year scores:
•   Somalia – 17/28
•   Kenya – 24/30
•   Ethiopia – 9/28
•   Pakistan – 19/30
•   Colombia – the test case for the HIT – 18/26
•   Second year – only South Sudan is complete – 21/30




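Because the maximum is adjusted when benchmarks are not applicable (hence scores such as 17/28 and 18/26 above), countries are most easily compared as a percentage of their adjusted maximum. A minimal Python sketch of that adjustment; the per-benchmark figures in the example are purely hypothetical, and only the 6- and 3-point maxima echo the rapid-onset table shown earlier.

```python
def adjusted_score(awarded, maxima):
    """Sum the awarded points and the adjusted maximum, skipping any benchmark
    marked not applicable (None in either list), and return a percentage."""
    pairs = [(a, m) for a, m in zip(awarded, maxima)
             if a is not None and m is not None]
    total = sum(a for a, _ in pairs)
    maximum = sum(m for _, m in pairs)
    return total, maximum, round(100 * total / maximum)

# Hypothetical ratings for the eight rapid-onset benchmarks (None = not applicable).
# Maxima follow the 6/3-point bands shown earlier; the awarded points are made up.
awarded = [6, 4, 2, 3, 2, 1, None, 2]
maxima  = [6, 6, 6, 3, 3, 3, None, 3]
print(adjusted_score(awarded, maxima))  # (20, 30, 67)
```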
The findings – Somalia

• Benchmark 5 on accountability
• Score Met – 2/2
• “Oxfam’s partners appeared to differ in the level of beneficiary
  participation in design and delivery. Some documented highly
  participatory process, with qualitative and quantitative data.
  As well as gathering information, rapid assessments were
  done to establish VRCs to improve participation (criteria,
  entitlements, payment points, registration, complaints and
  feedback). Mobile phone hotlines were set up where possible,
  with feedback protocol to guide staff on how to register and
  follow-up complaints”



Pakistan

• Benchmark 7 – protection and gender
• Partially met – 1/2
• Some protection concerns were identified relating to security
  for staff and women and girls using WASH facilities. Some
  actions were taken responding to dignity and protection
  including involving women in different activities. Post-
  distribution monitoring investigated some security concerns
  related to cashing cheques and distribution points, including
  analysis of responses from women. However, protection problems
  that were observed were not communicated to agencies or
  authorities responsible for, or specialising in, protection



Ethiopia

• Benchmark 11 – advocacy
• Not met – 0/2
• Advocacy activities were clearly part of the intended response
  and there is a regional advocacy action plan with Ethiopia
  objectives and a media, advocacy and campaign strategy
  which includes a number of plans for Ethiopia. However, no
  Ethiopia country advocacy strategy was provided for the
  evaluation. The sitreps do mention Oxfam’s participation in
  influential meetings, but are not tied to an explicit strategy.
  There is no record of the impact of Oxfam’s advocacy
  activities



Lessons learnt

• There are limitations to doing a desk study – it relies heavily
  on documentation, and scores cannot distinguish between an
  absence of documentation and an actual absence of good programming
• Several country teams objected to being judged solely on
  “little pieces of paper”
• There is no opportunity to get the views of the affected
  population unless this is already documented
• Telephone/Skype interviews can be biased – it is sometimes
  difficult to triangulate
• The process needs goodwill and buy-in from the country team


Advantages

• The process is fairly inexpensive – under £6000
• The country does not have to host a consultant
• The methodology can be used to track progress in
  subsequent responses in one country
• The scores are comparable across programmes (although
  context should be considered)




She deserves the best we can give –
           thank you!






Editor's Notes

  1. They then went on to discuss the fact that there were more agencies, more funding and more media attention. They even went as far as to say that 100,000 deaths could be attributed to poor performance by relief agencies
  2. Elderly, disabled, HIV positive, single women and female-headed households are examples of vulnerable groups