Welcome, overview, introductions. Photo exercise: how you feel about evaluation.
Outcome or impact evaluation focuses on results; process evaluation focuses on how the program produced those results.
Benefits: latest trends in the nonprofit sector relevant to funders and NPOs. So why DO you do what you do? Evaluation CAN…
Identify program strengths and weaknesses
Spot problem areas and suggest solutions
Provide information on met and unmet needs
Demonstrate program effectiveness
Improve your program
Increase funding opportunities
Demonstrate accountability to your clients and funders
Build community capacity and engage stakeholders
Share what works and what does not work with other programs
Provide knowledge for strategic planning
Share your successes with the public
This graphic shows the big picture of how evaluation fits into the program development process:
Start with a needs assessment to determine the situation or issue being addressed, establishing priorities in program development based on the organization's mission, vision, community needs, etc. (CONTEXT).
Assumes you have a theory of change: the assumptions and theories that describe why you think your particular program design will achieve intended or desired results.
Define your program through the use of your program logic model
Create a program based on the logic model
Conduct the program
Evaluate the program
Use the evaluation results to start over again: determine how the program fits within the organization, and whether to improve it or drop it
A logic model is a road map that describes how your program, from start to finish, intends to improve client lives and the community. Use a logic model to…
Create a common understanding of your program and its expectations
Identify items that are critical to goal attainment
Clarify what needs to be measured
Make sure a program's process is not overlooked
Enhance your learning
Make mid-course corrections as needed
Template for a process or an outcome/impact evaluation. Start with outcomes (or activities and outputs) and your questions about these.
Objectives describe your goals in measurable terms. They should be SMART: Specific, Measurable, Attainable, Realistic, Time-bound.
The socio-ecological model recognizes the interwoven relationship between the individual and their environment. While individuals are responsible for instituting and maintaining the lifestyle changes necessary to reduce risk and improve health, individual behavior is determined to a large extent by the social environment, e.g., community norms and values, regulations, and policies. Barriers to healthy behaviors are shared by the community as a whole. As these barriers are lowered or removed, behavior change becomes more achievable and sustainable; it becomes easier to "push the ball up the hill." The most effective approach to fostering healthy behaviors combines efforts at all levels: individual, interpersonal, organizational, community, and public policy.

How many of you have outcomes that are more community driven rather than focused on individual accomplishments? Can you give an example of what these outcomes look like?

Taken from Jane Moore, Ph.D., RD, Manager, Oregon Department of Human Services-Health Services, http://www.dhs.state.or.us/publichealth/hpcdp/about.cfm#why
Transition slide to evaluation planning:
Links between outcomes and data collection methods
Standards for gauging your program's performance on achieving the outcomes
Timeline and staffing instructions for putting the evaluation in motion
Show link to logic model
Stakeholders (audience): who are yours? Why should you consider them in evaluation planning? Focus the evaluation by defining stakeholders and evaluation questions. Stakeholders are "individuals or organizations who stand to gain or lose from the success or failure of a system."
Implementers: those involved in running the program
Partners: those who actively support the program
Participants: those served or affected by the program
Decision makers: those in a position to do something or decide something about the program
Data collection methods. ACTIVITY: Draft evaluation plans.
Important considerations... Confidentiality or anonymity Consent and/or assent Cultural sensitivity Administration protocol Pilot testing Other considerations
So imagine that magic occurred and poof, your data is analyzed and you’ve got your results. What now? What will you do with your results? HANDOUT: XX
Let’s talk for a few minutes about your program or organization’s environment. Setting up the right environment is important because staff and volunteers at all levels of your program may be affected by evaluation, either because their work is being examined or because they are asked to help collect the data. It will be important to build support for evaluation within your organization. It needs to be OK for staff to take an honest look at the success, or lack of success, of a program they run.

What are characteristics you would use to describe a good environment for evaluation, or a learning environment? Use flip chart to record answers. [open communication, motivation, sense of ownership]

So from the very beginning, consider how you can bring all different levels of staff, volunteers, clients, and board members into your discussions. Together you can create a “culture of inquiry” by openly discussing your mission, your values, the assumptions you make about your programs, and the hopes and dreams you have for achieving success.

Who (in the audience) thinks they have a healthy “culture of inquiry” at their program or organization? What does this look like? For those who don’t think they have one, what is standing in your way? Can you think of solutions (to obstacles) to help you achieve a culture of inquiry? What are strategies for integrating evaluation into your organization?