This presentation provides information from an interactive informational session given at the Oakland Literacy Coalition in 2015. The presentation was led by Nada Djordjevich, Paul Gibson, and Julie Johnson of Gibson and Associates. The interactive workshop was designed to help nonprofits and school-based services understand how to use program evaluation to improve their programs, inform funders, and create accountability. The event was designed for an audience somewhat familiar with program evaluation and uses tools designed in multiple contexts, including municipal funding, public health, education, nonprofit arts, childcare, and environmental agencies. The event was well received, with several agencies applying the tools within their own organizations in follow-up meetings.
1. USING PROGRAM EVALUATION
FOR CONTINUOUS IMPROVEMENT
& TO TELL YOUR STORY
OAKLAND LITERACY COALITION, JANUARY 14, 2015
Paul Gibson, Ed.M.
Nada Djordjevich, Ed.M.
Julie Johnson, M.Ed.
Gibson & Associates
3. INTRODUCTION
Please take a moment to jot down on the sticky paper any challenges that you have with evaluations. We will collect these thoughts and try to address as many as possible by the end of the presentation.
4. ABOUT G&A
For more than 25 years, G&A has collaborated with a range of organizations to build organizational cultures that value data usage, analysis, and research to support continuous improvement.
PROGRAM EVALUATIONS
• Evaluated school-day and out-of-school-time programs, and NSF grants.
• Lead evaluator for a Race to the Top grant in excess of $27 million.
• Evaluations recognized for both clarity and quality by the U.S. Dept. of Education and the California Dept. of Education.
COLLECTIVE IMPACT
• Evaluated 25 programs from 20 agencies receiving funding from the City of Berkeley Children and Youth Commission.
• Engaged multiple city partners in planning efforts to determine funding priorities for children and youth for programs in Irvine and Oakland.
GRANT WRITING
• G&A has supported agencies in receiving more than $400 million in grant funds.
• In the past few years, we completed 10 multi-million-dollar partnership grants for school districts.
LITERACY
• Developed and evaluated multiple early childhood and school-age literacy programs.
• Facilitated planning processes for early childhood and school-day literacy programs.
5. G&A’S EVALUATION SYSTEM: M4
MODEL: CREATING A LOGIC MODEL
MEASUREMENT: DATA COLLECTION
MANAGEMENT: TIMELINE, REPORTING & PRESENTATION
MEETINGS: CYCLES OF INQUIRY
7. MODEL
Logic Model: How many of you have a current logic model? Of those who have one, how many use it within your daily program operations?
8. MODEL
RESOURCES: To accomplish our goal, we need (staffing, materials).
ACTIVITIES: We will engage in the following.
OUTPUTS: Evidence that we completed activities.
SHORT & MID-TERM OUTCOMES: Evidence of the impact of activities (within 2 to 20 months).
LONG-TERM OUTCOMES: Often not measurable within a short span.
Purpose: A logic model will communicate to stakeholders what you intend to do: your resources, activities, and short- and long-term outcomes.
ASSUMPTIONS: Why do we think this program will work?
EXTERNAL FACTORS: What could impact our program?
9. MODEL
RESOURCES: Trainers, materials.
ACTIVITIES: Workshops, follow-up events, outreach.
OUTPUTS: 20 families attended 3 sessions each.
SHORT & MID-TERM OUTCOMES: Pre-post surveys indicate an increase in the number of books at home and increased home reading activities.
LONG-TERM OUTCOMES: School readiness.
Purpose: Our Family Reading Program provides training for low-income parents and guardians of K-3rd grade students on how to read to their children.
ASSUMPTIONS: By increasing access to high-quality children's literature and reading practices, children will be better prepared for kindergarten.
EXTERNAL FACTORS: Families move or are unable to attend all 3 events.
11. MODEL BEST PRACTICES
• Use a simple logic model.
• Revisit and refine quarterly.
• Consider external facilitation.
• Create measurable outcomes.
12. MEASUREMENT
Read the case study aloud and discuss the 3 questions below in a triad with people from different organizations.
• What tools could the program(s) use to measure outcomes?
• What challenges do they face in reporting information?
• How are these challenges similar to or different from those of your organization?
14. MEASUREMENT
What is the purpose of data? Formative improvement; summative outcomes.
Who should data be collected from? All clients (by participation levels); other stakeholders. Consider linguistic/cultural issues.
Where is data stored? Online/cloud; Excel or other database. How is privacy secured?
How is data shared? Frequency of sharing data; reporting methods.
15. MEASUREMENT
TERMS & EXAMPLES
• Quantitative: assessment data, pre-post results, attendance.
• Qualitative: focus groups, interviews, case studies.
• Formative: ongoing; informs daily operations.
• Summative: annual; informs program direction.
16. MEASUREMENT
Most of you collect student assessment information. Here are other examples of data that may be collected and reviewed on an ongoing basis for continuous improvement.
• Family Workshop Events: attendance; participant satisfaction with event or training.
• Staff-Teacher Professional Learning: surveys; externally conducted focus groups or interviews.
• Volunteer Activities: number of volunteers; number of hours; satisfaction.
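Measures like these need no specialized software to track. As a minimal sketch (the event names, counts, and 1-4 satisfaction scale here are hypothetical, not from the presentation), simple records can be tallied for a quarterly review:

```python
# Minimal sketch with hypothetical data: tallying attendance and
# satisfaction records for a quarterly continuous-improvement review.
from collections import Counter

# Each record: (event name, attendee count, satisfaction on a 1-4 scale)
records = [
    ("Family Reading Night", 18, 4),
    ("Family Reading Night", 22, 3),
    ("Volunteer Training", 9, 4),
]

# Total attendance across all events
total_attendance = sum(attended for _, attended, _ in records)

# How many sessions each event ran
sessions = Counter(name for name, _, _ in records)

# Average satisfaction across all sessions
avg_satisfaction = sum(score for _, _, score in records) / len(records)

print(total_attendance)                    # 49
print(sessions["Family Reading Night"])    # 2
print(round(avg_satisfaction, 2))          # 3.67
```

The same tallies can be kept in a spreadsheet; the point is that each data point maps back to a question on the slide above (who attended, how often, how satisfied).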
20. MANAGEMENT
EVALUATION TIMELINE
For each time frame, consider the type of data, how to organize it, and who receives the report.
• Quarterly: individual data; spreadsheets; clients/staff.
• Mid-Year: formative data; charts & tables; program staff/funders.
• End of Year: summative data; visuals of trends & highlights; stakeholders.
21. MANAGEMENT
VISUAL EXAMPLES
[Bar chart of survey responses ("I like evaluation," "I like data," "I like reading reports") by agreement level (Strongly Agree to Strongly Disagree), with a percentage axis running from 0% to 200%.]
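The speaker notes flag the axis scale as what makes this chart hard to read. As a minimal sketch (assuming matplotlib is available; the percentage values are hypothetical), the same survey reads much more clearly when the axis matches the data's natural 0-100% range:

```python
# Minimal sketch with hypothetical data: a survey chart whose axis
# is capped at 100%, instead of an inflated 0-200% scale that
# compresses the bars and makes them hard to compare.
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display required
import matplotlib.pyplot as plt

items = ["I like evaluation.", "I like data.", "I like reading reports."]
percent_agree = [72, 85, 40]  # hypothetical percent who agree

fig, ax = plt.subplots()
ax.barh(items, percent_agree, color="tab:blue")
ax.set_xlim(0, 100)  # axis choice: match the data's natural range
ax.set_xlabel("Percent who agree")
fig.tight_layout()
fig.savefig("survey_chart.png")
```

The single deliberate choice here, `set_xlim(0, 100)`, is the fix for the flaw the notes discuss; color scheme is the other lever mentioned.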
28. MANAGEMENT
VISUAL EXAMPLES
“If your child is not confident to ask for help, they are left behind.” (Elementary School Parent)
“The school is pretty good with below basic and basic kids, but kids proficient and advanced get nothing to challenge them.” (Middle School Parent)
“Teachers have to teach to the middle because of the number of students in each class.” (Elementary School Parent)
31. MEETINGS
TYPES OF STAKEHOLDERS TO MEET WITH AND DISCUSS PROGRAM EVALUATION INFORMATION
As part of your evaluation timeline, create a meeting schedule to share information. Here is an example.
• Management/Staff/Board: monthly or quarterly
• Parents/Families: quarterly
• Partners/Schools/Funders: quarterly or annually
• Funders/Community: annually
32. MEETINGS
A CYCLE OF INQUIRY
Review data & ask questions → Consult stakeholders & research → Sustain or modify approach → Implement changes as needed → Review progress
33. MEETINGS
SMART GOALS ARE A COMMON TOOL FOR INTERNAL DATA REFLECTION
• SPECIFIC: Describes your goal in detail.
• MEASURABLE: How will you measure progress?
• ACHIEVABLE/ACTIONABLE: What will be done to achieve the goal?
• RELEVANT/REALISTIC: Is this relevant and realistic?
• TIMELY: When will it be done?
ND: Brief intro to who we are. We have worked with a number of agencies that are here, but we are an Oakland-based business and have worked for the city, for the school district, and for many nonprofits. Our expertise includes evaluation.
Quick: I think we miss some very important examples of things we’ve done: CDBG (5 years with the City) and SMC BHRS, but I get that this has to be short.
Question: Who is our audience?
Here is a framework that we use in thinking about program evaluation: M4 – Your Model, Measurement, Management of Evaluation, and Meetings.
G&A to discuss learning objectives.
We assume most of you have at some point created a logic model. Quick show of hands: how many of you have one? Of those who have one… PG: I like asking about their experience.
This is the basic framework for logic models that you often see in grant requirements.
This description is made up but intended to show the relationship between each item. PG--I would include some standardized school readiness assessment in the right column.
Review the worksheet hand out.
A note on external support: sometimes an external person does not know your agency well, which can make this difficult. It’s best if the external support is used for facilitation but not creation.
PG: Not sure why it is difficult. Not sure I’d want to emphasize this, as it is something we do quite a bit of. There are advantages to external support too, like being more familiar with the field and with evaluation/measurement options, and the ability to speak candidly about what they see where some internal stakeholders may be timid about pointing out weaknesses.
Find two partners from organizations other than your own. Review. PG: Good questions/process.
G&A to discuss learning objectives.
PG: Important. I think data should be collected from other stakeholders as well: those who observe the impact of services on clients, because they are involved in serving them too, or they are family members.
I agree; these are examples, not necessarily inclusive. PG: Needs to expand on some of these; how these factors apply to each program will vary considerably.
Most of you know these terms but just as reminder – we’ll talk about these throughout.
I think it would be good to highlight one or two very specific examples of how a client, teacher, or volunteer can offer up a fabulous suggestion in the context of answering the simple question: “How could our services better address our clients’ needs?” PG: Do we have a case study for this?
Refer to lengthy surveys for parents or stakeholders as well as surveys with complex language. Less is more. Simple is better than complex.
G&A to discuss learning objectives.
PG-Mid-year could include review with Senior Staff, Board, and even funder.
What do you notice about this graph? Most of us don’t see this, but there is actually a website on data visualization that collects examples from global companies and newspapers with some of these attributes. In this case, the axis and the color scheme make it very difficult.
Is there some important distinction being made between this and the last example? The last one had a scale of 150%; this is 100%.
Yes, exactly: axis choices and color scheme.
What do you notice about this chart? Our eyes generally focus on the positive so having it be negative makes it difficult and takes longer for us to interpret results.
PG-- I don’t mean to be picky, but I think this is a chart that is pretty easy to grasp and that combines a great deal of information pretty clearly. HMMM.
What do you notice about this map? More distinct color gradations can make things easier to view. Also, the distinction between rich and poor results is a bit difficult to grasp from this picture and story. What other ways can we do this?
No: the use of color gradations allows you to quickly understand trends, but too much information makes it difficult to see individual results.
A word cloud can help by providing a quick glimpse into trends. Note that callouts are also useful in highlighting comments. PG: And to call out trends in charts.
G&A to discuss learning objectives.
(even if the agency uses an external evaluator, they still need their own internal system).
A cycle of inquiry is another common way of looking at data. There are many variations on this approach as well. PG- In the model I have used, the orange would be: Review Data-Ask Questions and the brown would be Consult Research & Stakeholders, Design Improvements. The Green Cycle of Inquiry would be in the middle as it is the entire process.
There are multiple variations on SMART goals, and they can be found with worksheets.