This presentation introduces the discipline of program evaluation and offers a glimpse into how developmental evaluation responds to the call for an evaluation approach suited to complex contexts, such as social innovation. I conclude by introducing the notion of design and design thinking as a way of approaching the problems we face in today's complex world.
I also briefly discuss some of the strategies I employed to manage my own thesis research as a graduate student researcher in education.
Developmental Evaluation and the Graduate Student Researcher
1. Developmental Evaluation and the Graduate Student Researcher
Chi Yan Lam
Queen’s University
EGSS ScholarShare @chiyanlam
February 22, 2012
Assessment and Evaluation Group, Queen’s University
2. Researching & Researching Evaluation
Cutting-edge Development in Evaluation
• Thinking about Evaluation
• Utilization-Focused Evaluation
• Reality-testing
• Developmental Evaluation
Researching
• Writing
• Data Management
• Project Management
• Logic-line planning
4. What would you need to know in order to make a purchase decision?
http://www.caranddriver.com/features/2013-subaru-brz-and-2013-scion-fr-s-a-study-in-comparison-and-contrast-feature
6. Insurance? Value? Best bang for the buck? Lease rate? Affordability? Social desirability? Reliability? Comfort? Ride quality? Practicality? Future plans?
9. Buying a car is an evaluative act.
http://www.caranddriver.com/features/2013-subaru-brz-and-2013-scion-fr-s-a-study-in-comparison-and-contrast-feature
11. As an evaluator, I think (and care!) deeply about evaluation. (Guba & Lincoln, 1989)
15. What?
So what?
Now what?
(Patton, 2011, p. 3)
16. Purposes of Program Evaluation
• Overall judgement - Does the program meet the needs of participants? Should we keep it?
• Learning/Improvement - What works and what doesn’t? How can the program be improved? How can quality be enhanced?
• Accountability - Are goals being met?
• Monitoring - Graduation rates? Retention?
• Knowledge generation - What are the patterns of effectiveness? Site A vs. Site B?
17. Utilization-Focused Evaluation (Patton, 2008, 2012)
Intended use by intended users
• A framework for making decisions about the evaluation in collaboration with primary users
• Attention paid to stakeholders - people affected by the program and evaluation (Greene, 2006)
• Focus is on... use!
24. Social Complexity & Social Innovation
The world we live in today is fast-changing, such that the tools we have for evaluation are no longer adequate.
26. Developmental Evaluation (Patton, 1994, 2011)
• Supports innovation development to guide adaptation [of programs] to emergent and dynamic realities in complex environments
• Processes include:
• asking evaluative questions
• applying evaluation logic
• gathering real-time data to inform ongoing decision making
• documenting and tracking program development; sense-making
• Informed by complexity science and systems thinking
27. Assessment Pilot Initiative
• Contemporary notions of classroom assessment
• Teaching and learning constraints
• Interested in integrating social media into teacher education (classroom assessment)
• The thinking was that assessment learning requires learners to actively engage with peers and challenge their own experiences and conceptions of assessment.
28. Uncertainty
• Uncertain about how to proceed
• Uncertain about what to use in order to proceed
• Uncertain how teacher candidates would respond
• Clear, measurable, and specific outcomes
• Use of planning frameworks
• Traditional evaluation cycles wouldn’t work
30. Book-ending: Concluding Conditions
• In the end, 22 candidates participated in a pilot program.
• Teacher candidates tweeted about their own experiences of trying to put contemporary notions of assessment into practice.
• Guided by the script: “Think Tweet Share”
• Developmental evaluation guided this exploration, with the instructors, evaluator, and teacher candidates acting as a collective in this participatory learning experience.
32. Research Purpose
To learn about the capacity of developmental evaluation to support innovation.
33. Why?
• DE is still tentative.
• The evaluation community craves “practical knowledge” (Schwandt, 2008) about evaluation approaches: knowledge generation.
34. Research Questions
1. To what extent does the Assessment Pilot Initiative qualify as a developmental evaluation?
2. What contribution does developmental evaluation make to enable and promote program development?
3. To what extent does developmental evaluation address the needs of the developers in ways that inform program development?
4. What insights, if any, can be drawn from this development about the roles and the responsibilities of the developmental evaluator?
35. Method & Methodology
• Questions drive method (Greene, 2007; Teddlie & Tashakkori, 2009)
• Qualitative case study
• understanding the intricacies of the phenomenon and its context
• the case is a “specific, unique, bounded system” (Stake, 2005, p. 436)
• understanding the system’s activity, and its function and interactions
• Qualitative research to describe, understand, and infer meaning
36. Data Sources
• Three pillars of data:
1. Program development records
2. Interviews with clients on the significance of various DE episodes
3. My own reactions to the ongoing development, via document-elicitation
37. Data Analysis
1. Reconstructing the evidentiary base
2. Identifying developmental episodes (p. 47)
3. Coding for developmental moments (p. 49)
4. Time-series analysis (see the sketch below)
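To make step 4 concrete, here is a minimal sketch of one way the time-series step could be run. The file name developmental_moments.csv and its date and code columns are hypothetical illustrations, as is the weekly grouping; this is not the study's actual analysis procedure.

```python
# A minimal sketch of step 4 (time-series analysis), assuming a hypothetical
# developmental_moments.csv with one row per coded developmental moment and
# two columns: `date` and `code`.
import pandas as pd

moments = pd.read_csv("developmental_moments.csv", parse_dates=["date"])

# Weekly counts of all coded moments: a simple way to see whether
# developmental evaluative activity intensifies over time.
weekly_total = moments.resample("W", on="date").size()

# Weekly counts broken out by code, one column per code.
weekly_by_code = (
    moments.groupby([pd.Grouper(key="date", freq="W"), "code"])
    .size()
    .unstack(fill_value=0)
)

print(weekly_total)
print(weekly_by_code)
```

Weekly counts like these are one simple way to surface trends such as the “intensification” of developmental evaluative activity reported in the findings.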
40. Key Developmental Episodes
• Ep 1: Evolving understanding in using social media for professional learning
• Ep 2: Explicating values through Appreciative Inquiry for program development
• Ep 3: Enhancing collaboration through structured communication
• Ep 4: Program development through the use of evaluative data
41. Major Findings
RQ1: To what extent does the API qualify as a developmental evaluation?
1. Preformative development of a potentially broad-impact, scalable innovation
2. Patton: Did something get developed? (Improvement vs. development vs. innovation)
3. Trends (patterns over time)
42. Major Findings (cont’d)
• Development occurred through purposeful interactions (i.e., developmental episodes)
• Concretization of ideas and thinking (e.g., reflection)
• “Intensification” in developmental evaluative activities
43. Major Findings
RQ2: What contribution does DE make to enable and promote program development?
1. Lent a data-informed process to innovation (p. 97)
2. Implication: responsiveness
• in candidates’ reactions
• in the program
3. Consequence: resolving uncertainty
44. Major Findings (cont’d)
• Blue developmental moments: the kinds of issues and concerns that surfaced
• Six foci of development: definition, delineation, collaboration, prototyping, illumination, evaluation
• A non-linear, cyclical process
45. Major Findings
RQ3: To what extent does DE address the needs of developers in ways that inform program development?
1. Through promoting learning, and enacting a learning framework
2. Values and valuing
46. Major Findings
RQ4: What insights, if any, can be drawn from this development about the roles and the responsibilities of the developmental evaluator?
1. Manager
2. Facilitator of learning
3. Evaluator
4. Innovation thinker
47. Conclusion
• Innovation developed in a context of complexity, guided by developmental evaluation
• Preformative development
• Evaluator as someone who co-develops a program and draws upon substantive knowledge/skills to promote development
• Innovation process (six foci of development)
49. Design + Design Thinking
“Design is the systematic exploration into the complexity of options (in program values, assumptions, output, impact, and technologies) and decision-making processes that results in purposeful decisions about the features and components of a program-in-development that is informed by the best conception of the complexity surrounding a social need.
Design is dependent on the existence and validity of highly situated and contextualized knowledge about the realities of stakeholders at a site of innovation. The design process fits potential technologies, ideas, and concepts to reconfigure the social realities. This results in the emergence of a program that is adaptive and responsive to the needs of program users.” (Lam, 2011, pp. 137-138)
50. Implications for Evaluation
• One of the first documented case studies of developmental evaluation
• Contributions to understanding, analyzing, and reporting development as a process
• Delineating the kinds of roles and responsibilities that promote development
• The notion of design emerges from this study
51. Limitations
• Contextually bound, so not generalizable
• but it does add knowledge to the field
• The study’s data are only as good as the data collected from the evaluation
• better if I had captured the program-in-action
• Analysis of the outcomes of the API could help strengthen the case study
• but not necessary for achieving the research foci
• Cross-case analysis would be a better method for generating understanding.
54. Data Management
• Document, document, document.
• keep a researcher’s log
• Use a hanging-folder method
• one folder for each phase of the project: proposal, literature, ethics, consent forms, transcripts, examples
• special folders for each type of your research data
• Same thing for your digital files! (a minimal scaffolding sketch follows below)
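As an illustration of mirroring the hanging-folder method digitally, here is a minimal Python sketch. The root name thesis-project, the log file researcher-log.md, and the data-type list are hypothetical; rename them to match your own project phases and research data types.

```python
# A minimal sketch of scaffolding a digital mirror of the hanging-folder
# method: one folder per project phase, plus one per research data type.
from pathlib import Path

PHASES = ["proposal", "literature", "ethics", "consent-forms",
          "transcripts", "examples"]
DATA_TYPES = ["interviews", "field-notes", "documents"]  # assumed types

def scaffold(root: Path = Path("thesis-project")) -> None:
    """Create the phase and data-type folders, plus a researcher's log."""
    for phase in PHASES:
        (root / phase).mkdir(parents=True, exist_ok=True)
    for data_type in DATA_TYPES:
        (root / "data" / data_type).mkdir(parents=True, exist_ok=True)
    # Keep a running researcher's log alongside the folders.
    (root / "researcher-log.md").touch(exist_ok=True)
    print(f"Scaffolded {root.resolve()}")

if __name__ == "__main__":
    scaffold()
```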
55. Project Management
• Think about:
• deadlines
• deliverables
• lead time
• wait-time
• Budget twice the time
• Budget for re-writing and re-drafting
• Work in time blocks
• Pomodoro 25-4 (25 minutes of focused work, then a short break; a toy timer sketch follows below)
• Immerse yourself
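As a toy illustration of the Pomodoro 25-4 rhythm named above, here is a minimal Python sketch; the timings come from the slide's "25-4", while the default cycle count is an assumption you can adjust.

```python
# A toy sketch of the "Pomodoro 25-4" rhythm: 25 minutes of focused work
# followed by a 4-minute break, repeated for a few cycles.
import time

WORK_MINUTES = 25
BREAK_MINUTES = 4

def pomodoro(cycles: int = 4) -> None:
    """Run `cycles` work/break rounds, announcing each transition."""
    for i in range(1, cycles + 1):
        print(f"Cycle {i}: work for {WORK_MINUTES} minutes.")
        time.sleep(WORK_MINUTES * 60)  # blocks for the full work period
        print(f"Cycle {i}: break for {BREAK_MINUTES} minutes.")
        time.sleep(BREAK_MINUTES * 60)
    print("Session complete.")

if __name__ == "__main__":
    pomodoro()
```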