Community Engagement: what constitutes
success (and how do we know?)
Susan Rudland
Thinking about evaluation
1. Clarify the program logic/outcomes as part of the planning process
2. Establish the purpose/uses of the evaluation with priority stakeholders
taking account of their needs/questions
3. Establish the type of information needed (performance indicators)
4. Establish the sources and methods for collecting information and
timeline for delivery of findings
5. Analyse information, make judgements
6. Use the judgements/findings to adaptively manage your program
Evaluating community engagement
 What is evaluation and why is it important in community
engagement?
 Program logic - one approach
 Benefits and challenges
 Supporting resources
What is evaluation?
 Evaluation is
 “the process of making a judgement about the value or worth of an
object under review” (Owen 2006:9).
 carried out in social, corporate and government settings
 a form of knowledge that informs decision-making about an object,
which is developed by systematically assessing its operation (Owen
2006:19)
 influenced by values and context
 Evaluation is the systematic collection of information about the
activities and outcomes of programs to:
 track progress
 make judgements and decisions
 improve effectiveness
 build understandings
(These purposes span the short term through to the longer term.)
Why evaluate?
 Judge merit or worth
 Improve programs
 Generate
knowledge/understandings
 Engage stakeholders
 Be accountable
 Gain support for future
projects
Why evaluate community engagement?
Evaluation enables you to find out:
 What worked well and what did not?
 Did engaging with citizens or communities actually
meet the engagement objectives?
 Did engagement enhance knowledge and decision
making?
 Were there any unanticipated outcomes?
 Was engagement cost effective in terms of time and
resources?
 What does this mean for next time?
Evaluating Engagement
Evaluation can be valuable in a number of ways to:
 Assist planning for future consultation programs
 Develop appropriate techniques for particular objectives
 Develop appropriate techniques for participants with differing needs
 Improve effectiveness of consultation techniques
 Increase consultation skills of staff
 Assess the effects of the engagement on the issues and processes with which it is concerned. This may require ways of measuring attitudes or levels of knowledge ‘before and after’ the engagement.
Integrated engagement and evaluation
Community engagement cycle: initiate and scope → develop and plan → implement and monitor → evaluate and close
Links with community engagement
Community engagement phase – Evaluation activities
Initiate and scope – clarify objectives, clarify assumptions, define stakeholders, clarify roles, identify risks, evaluation template (decide formal/informal evaluation)
Develop and plan – change control plan, communications plan, reporting strategies, documentation
Implement and monitor – tracking progress, monitoring and data collection, feedback and adjust
Evaluate and close – finalise evaluation documentation, conduct the formal/informal evaluation, conduct team review, communicate outcomes and findings
What is Program Logic?
 How we make sense of a program
 A tool for establishing the logical
connections – why, what and how,
from inputs through to the ultimate
outcomes
 Allows the logic of a project to be
questioned and challenged, and
helps to identify assumptions that
link steps together
 Connects the evaluation activities
with the intended short, medium
and long term outcomes
 Is not a project plan
Backcasting
Source: The Natural Step
Hierarchy of Outcomes
• Ultimate outcomes – impact on the overall issue and ultimate goals (social/economic, artistic, organisational, communications and sustainability outcomes).
• Intermediate outcomes – changes in individual and group knowledge, attitudes and skills; changes in aspirations, intentions, practices and behaviour.
• Immediate outcomes – levels and nature of participation; reactions to the outputs/activities by participants/stakeholders. Participation in activities, the quality and use of program collateral, and the range and appropriateness of activities are assessed at this level.
• Outputs/activities – the products/services/activities the program actually offers to engage participants.
• Needs – priority issues that the program must respond to (social, organisational, sectoral, communications), based on existing or new information (policies, data, consultation, research).
The challenge of outcomes
How do we make sense of a program?
[Diagram: problem/issue, needs, resources/inputs, process, action/outputs and outcomes, assessed in terms of appropriateness, efficiency and effectiveness]
Linking outcomes to priority needs –
you know it makes sense…doesn’t it?
 Sealand Community Engagement in Stormwater
Management Project
 (AKA Stubby Holders for School Kids)
 Stormwater run-off, turbidity and siltation in local
waterways
 Land clearing due to residential developments
 Project focus on engagement with local school-kids, to
inform and educate their parents regarding the
importance of reducing stormwater run-off into local
rivers
 Raindrop character, school performances, educational
messages and collateral including stubby holders with
pollution prevention messages
Developing a project evaluation action plan
Template columns: Outcomes hierarchy | Evaluation questions (What do we want to know?) | Data sources (Where will we find the information?) | Standard (How do we judge it?) | Utilisation (How will we use it?)
Rows to complete for each level: Ultimate outcomes, Immediate outcomes, Activities/output, Needs
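As an illustrative aside that is not part of the original deck: if you keep the action plan in a script or spreadsheet, the template above can be captured as a simple structured record so that each row is filled in consistently. The Python sketch below is a minimal example; the class, field names and example entries are hypothetical and simply mirror the column headings.

```python
# Illustrative sketch only (not from the original deck): one way to represent
# the evaluation action plan template as structured records. Field names mirror
# the column headings above; the example entries are hypothetical.
from dataclasses import dataclass


@dataclass
class PlanRow:
    level: str               # row in the outcomes hierarchy
    questions: list[str]     # Evaluation questions: what do we want to know?
    data_sources: list[str]  # Where will we find the information?
    standard: str            # How do we judge it?
    utilisation: str         # How will we use it?


evaluation_action_plan = [
    PlanRow(
        level="Immediate outcomes",
        questions=["Who participated, and how did they respond to the activities?"],
        data_sources=["attendance records", "participant feedback forms"],
        standard="participation and feedback targets agreed with stakeholders",
        utilisation="adjust activities while the program is running",
    ),
    PlanRow(
        level="Ultimate outcomes",
        questions=["Did the engagement contribute to the overall goal?"],
        data_sources=["before/after surveys", "monitoring data"],
        standard="measurable change against the baseline",
        utilisation="report to funders and inform future programs",
    ),
]

# Print a quick summary of the plan, one line per outcomes level.
for row in evaluation_action_plan:
    print(f"{row.level}: {'; '.join(row.questions)}")
```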
Thinking about evaluation
1. Clarify the program logic/outcomes as part of the planning process
2. Establish the purpose/uses of the evaluation with priority stakeholders
taking account of their needs/questions
3. Establish the type of information needed (performance indicators)
4. Establish the sources and methods for collecting information and
timeline for delivery of findings
5. Analyse information, make judgements
6. Use the judgements/findings to adaptively manage your program
Thinking about data
From the beginning…
 What’s the available data?
 Does it respond to the evaluation questions and help demonstrate
desired outcomes?
 Who is the audience – and what will they value as evidence?
 Build off the existing platform
 What else do you need to collect?
 Who else can provide data?
 Consider quality and consistency
 Is it replicable?
 Consider different types of data and perspectives.
Demonstrating success
 Evaluation matters – it demonstrates impact and outcomes
 For best results, build it in from the beginning
 Program logic makes sense of why, what and how
 It challenges our assumptions, and focusses appropriate activity
 It doesn’t have to be complicated
 An integrated approach delivers a more robust process and
outcomes
 It supports transparency and accountability
 We learn from it!
Resources
 NSW Office of Environment and Heritage (former NSW Department of Environment and Conservation) 2004, Does your project make a difference?, http://www.environment.nsw.gov.au/resources/communities/040110-Project-Evaluation.pdf
 Hendricks, Alison, Evaluation Framework for Community Engagement Based on the United Nations Brisbane Declaration, http://www.iap2.org.au/sitebuilder/resources/knowledge/asset/files/40/undecevaluationframeworkforcommunityengagement.pdf
 Queensland Government 2011, Engaging Queenslanders: evaluating community engagement, http://www.qld.gov.au/web/community-engagement/guides-factsheets/evaluating/
 Susan Rudland
srudland@urbis.com.au
Editor's notes
  1. SR Thank you for coming along today. My name is Susan Rudland and I work for a company called Urbis. We are specialists in community engagement, evaluation, research and social planning, and have delivered over 200 stakeholder and community engagement programs for councils, state government and NGOs across environmental, sustainability, planning and land-use processes. We use social research and evaluation to strengthen robust and accountable engagement.
  2. SR It is important to clearly articulate the community engagement process to be evaluated. “Clearly articulating the community engagement program is the most important aspect of any evaluation because it guides the development of evaluation questions and performance criteria which will be used to measure the success of the program. In other words, unless it is clear how a community engagement program is meant to function and what it is meant to achieve there is no way of judging how successful it is.” (Queensland Government 2011)
  3. Community engagement – what constitutes success (and how do we know)? There is no one way to engage communities – it all depends on the issue, the aims and the desired outcomes. Engagement planning often focusses on developing activities and tools – how we plan to engage. But it is important to stand back and take some time to consider why we are engaging and what we are seeking to achieve, before going to the how. A focus on the drivers for engagement and the desired outcomes is important to identify measures of success. When funding is limited, the issues are contentious, and stakeholders are diverse, how will you demonstrate and communicate success to others?

The NSW Supreme Court recently handed down a judgement regarding what constitutes genuine and effective consultation according to the NSW Strategic Regional Land Use Policy Delivery Guidelines, "Guidelines for community consultation requirements for the extraction of coal and petroleum, including coal seam gas". The case, Metgasco Limited v Minister for Resources and Energy, concerned the suspension of Metgasco's operations for failing to consult genuinely and effectively with the community – the evidence presented was that the company had failed to persuade the community to support the project. The court found in favour of Metgasco, saying that a consultation process that does not persuade the community to support a project is not necessarily a failure or ineffective. Notwithstanding what you may think of how Metgasco did or did not engage with the community, the case has raised all sorts of discussion on social media, with many suggesting that it supports a focus on activities, not outcomes – and that this is actually a vindication – as if the only outcome community engagement ever seeks to achieve is persuasion and agreement.

I think that's a mistake – community engagement is much more than process and activity. It often seeks to deliver a range of crucial outcomes – otherwise, why are we here and how can our efforts be judged effective and genuine? Community engagement may seek to educate, inform, support and resource people. It may seek to facilitate changes in attitudes, knowledge and behaviours. It may seek to achieve environmental improvements, or to provide ongoing avenues and channels of communication. It may seek to document complex questions and factors for consideration. It may seek to produce innovative and creative solutions through co-design and co-creation. And there are a number of other objectives that may be in play. The question is – have we defined what we are trying to achieve, how we will measure success, and how we will know we have made a difference? If we haven't done that, engagement runs the risk of being reduced to a judgement of how good the workshop biscuits were and whether people had a good time. I think that does a real disservice to all the very creative and experienced engagement practitioners out there who are trying to make a difference.

Program logic is one method of integrating evaluation and measurement thinking into the design of an engagement project right at the beginning. Program logic asks: does it make sense, how do we know, how is this demonstrated, and what else do we need to consider? It offers an action research framework to strengthen engagement practice, manage risk and communicate achievements more broadly. This presentation introduces key concepts and frameworks that may be applied in different engagement projects, large and small.
So I would like to explore some key issues in this presentation: what evaluation is and why it is important to community engagement; introduce a program logic approach; consider some of the challenges; identify some of the benefits; and leave you with some resources you may wish to consider.

Starter question: Before I go on, I am keen to hear from you. Could you pair up and talk to the person next to you for about 5 minutes, sharing what you see as the key challenges and benefits of evaluating community engagement in your role – what do you need evaluation to do? Just 5 minutes together, and then I will invite some of you to share those key challenges and benefits.

Round robin intros – As a quick ice-breaker, we would like you to introduce yourself and tell us about your role in community engagement and how evaluation is relevant to you. What do you need evaluation to do, and what are the anticipated benefits for your work?
  4. SR The key principles are as follows: evaluation should be an integral part of the planning and management of community engagement activities; it should be a structured and planned process; the scale and scope of the evaluation should reflect the purpose, audience, and the scale and significance of the community engagement activities; and evaluation should, wherever possible, be a participatory activity.
  5. SR Evaluation is helpful for those keen to: Build your capacity to think critically and be consistently evaluative in your work; Improve your ability to construct convincing and evidence-based rationales for new and existing work; Meaningfully assess the value of your approach; Demonstrate or defend the value of what you do; Enhance the rigour of your decision-making and problem-solving; Check that you get value-for-money and create new efficiencies; Have more frequent and structured conversations with your teams, supervisors or sectors about creating and evaluating change; Be responsive to the needs and interests of your stakeholders; Articulate your purpose/focus of activities and track progress toward it; and Consistently adapt your approach as you go, based on lessons learnt.
  6. There are two key reasons for evaluating community engagement: the process and tools used to implement it (was the engagement appropriate?) and its impacts and outcomes (what difference did the engagement activity make – was it effective?). The benefits of evaluating community engagement activities include: identifying and articulating lessons and achievements to improve practice; building an evidence base for innovative and best-practice community engagement, including creating standards and benchmarks; contributing to engagement capability development by providing feedback on performance; presenting opportunities for further participation in the evaluation process; and providing evidence for communities and clients of how effective engagement works.
  7. Evaluation is an important tool in determining whether a consultation program achieved its objectives, and whether the most effective and suitable techniques were applied.
  8. Evaluation can be embedded into any engagement project – it is part of the usual process of planning, designing, doing and reviewing. In appreciative inquiry or action research projects, you might think of this as the discovery, dream, design and deliver phases. If I could insert a word starting with D for ‘evaluate and adapt’, I would! But you get the picture. It's an iterative, ongoing, adaptive process. It informs how you design, what you do, and how you reflect on that.
  9. Embedding evaluative thinking into community engagement strengthens and enhances the engagement process.
  10. SR So what is program logic and how do we use it to evaluate community engagement? Program logic models describe the sequence of events for bringing about change and relate activities to outcomes. A program logic model describes the assumptions about how and why a program will work by identifying “what causes what”, connecting the evaluation with the intended short, medium and longer-term outcomes. Program logic focuses on the main steps that need to be achieved so that the project can be tracked to see if it is on the right path, or whether changes are required to the proposed methodology and, if so, why this may have occurred. Program logic is not a project plan, and should not replace a project plan, but should be used at the beginning of the process to refine and make sure that proposed evaluation activities are relevant to the intended outcome of the engagement activity. Benefits of program logic: it illustrates the logic or theory of a program; focuses on what matters – outcomes; can be used to create dialogue and shared understanding of a program between different stakeholders and those being evaluated; helps to identify gaps; makes assumptions explicit; identifies appropriate questions for your evaluation based on the program; and helps determine data collection sources and methods and when to collect data.
  11. SR A backcasting approach is used to develop the program logic. Program logic starts with the intended future outcome and works backwards to the how, when and whom (inputs and outputs). This involves identifying the vision of the project and then working backwards to identify the necessary steps required to achieve short, medium and long term outcomes. This allows you to think through what is needed to create the future, rather than thinking about what is currently happening and trying to predict the future. When using program logic to evaluate community engagement activities, the ultimate outcome is a process of engaging the community.
  12. SR Every program logic starts with a hierarchy of outcomes approach. Program logic models include the following. Current situation/need: what is the current situation that the program is responding to. Inputs: what you invest, including human and financial resources and time. Activities, actions or outputs: what you are going to do, including the different community engagement activities conducted and any ‘products’ that are produced (e.g. submissions, workshop notes or summary reports). Outcomes: short-term, medium-term and longer-term outcomes. Short term outcomes are what happens as a result of a particular activity and usually include changes to a specific community and participants due to that activity. Medium term outcomes can include changes to policies, plans and projects and broader changes to communities and government. Longer term outcomes are fundamental changes in social, environmental, economic and governance priorities.
  13. SR Outcomes need to be realistic, measurable, achievable and relevant. Performance indicators may include: level of participation; usefulness of the information; the extent to which participants are being heard and have influence; commitment to the process; changes in decision makers' views/perceptions; efficiency of delivery of the plan; and cost effectiveness.
  14. But as well as thinking about what we want to achieve – we need to identify measures of success. This is where we start to build the logic of an engagement program.
  15. Fabulous activities, supported across local schools, high levels of participation and engagement by schoolkids years 1 to 3. Did it make a difference? Was it appropriate? Was it effective? For whom and why? What needed to happen? Engagement with local residential developers and builders and construction teams, to ensure appropriate management practices and minimise runoff. Not school kids and not their parents. If we had developed the program logic to start with, and mapped out our desired outcomes, we could have delivered an outcome that went beyond a misguided activity.
  16. How might we have done that? By spending a couple of hours at the outset of the project, before doing anything!
  17. SR It is important to clearly articulate the community engagement process to be evaluated. “Clearly articulating the community engagement program is the most important aspect of any evaluation because it guides the development of evaluation questions and performance criteria which will be used to measure the success of the program. In other words, unless it is clear how a community engagement program is meant to function and what it is meant to achieve there is no way of judging how successful it is.” (Queensland Government 2011)
  18. SR
  19. SR
  20. I started off by referring to the recent court judgement about what constitutes success in community engagement. Program logic is one approach that allows us all to consider that question early. Program logic holds us to the why, before we jump to the how. It embeds evaluation thinking into program design and planning, and offers a structured framework for reflection that allows us to learn and adapt along the way. It can be as simple or as complicated as you need it to be, as some of these resources will demonstrate.