A one-day workshop on evaluating communication programmes, products and campaigns. The main steps and methods are covered, with real-life examples given. This workshop was originally conducted by Glenn O'Neil of Owl RE for Gellis Communications in Brussels in October 2009.
Evaluating Communication Programmes, Products and Campaigns: Training workshop
1. Evaluating Communication
Programmes, Products and
Campaigns:
1 day training workshop for
communication professionals
Glenn O’Neil
oneil@owlre.com
www.owlre.com
Workshop originally conducted for Gellis Communications
(www.gellis.com) in Brussels on 30 October 2009
2. Schedule
1. Introduction & definitions
2. Five steps of evaluation
3. Campaign evaluation methodology
4. Programme evaluation methodology
5. Product evaluation methodology
6. Reporting on communications evaluation
3. Training objective
Communication professionals understand the
key concepts of communications evaluation
and thus increase their effectiveness in
managing evaluation aspects of their projects!
4. What is evaluation?
“Evaluation is the systematic assessment of the operation and/or
the outcomes of a program or policy compared to a set of explicit
or implicit standards, as a means of contributing to the
improvement of the program or policy”
source: “Evaluation”, by Carol Weiss (1998)
“A form of research that determines the relative effectiveness of
a public relations campaign or program by measuring program
outcomes (changes in the levels of awareness, understanding,
attitudes, opinions, and/or behaviours of a targeted audience or
public) against a predetermined set of objectives that initially
established the level or degree of change desired”
source: Stacks, D. (2006). Dictionary of Public Relations Measurement and Research. Institute for Public Relations.
5. What is communications?
Programmes, projects, campaigns and
activities that are dedicated to the
management of communications
between an organisation and its publics
Source: Grunig, J. (ed.) (1992). Excellence in Public Relations and Communication Management
6. What is communications?
– A programme is an organised set of communication
activities based on target audiences, themes or
functions running continuously or for long periods
– A campaign is an organised set of communications
activities, directed at a particular audience usually
within a specified period of time to achieve specific
outcomes
– A product is an individual object, such as a
publication, website or video created to support a
communication activity
8. What can communications change?
– What is communications attempting to
change? What are the effects desired?
– How can these effects be categorised?
– How can these effects be measured?
9. When to evaluate?
Before the activity: formative evaluation, baseline evaluation, ex-ante evaluation, appraisal
During the activity: process evaluation, intermediate evaluation, mid-term evaluation, monitoring (not evaluation!)
After the activity: summative evaluation, outcome evaluation, ex-post evaluation, impact evaluation
Different terms that mean the same thing!
10. Communications evaluation design
[Diagram: common evaluation designs, each shown as a sequence of activity and measurement points: post-only, pre-post, true & constructed cohort studies, time series, field experiments and meta-analyses]
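As a rough illustration of the simplest of these designs, the sketch below uses made-up survey figures to show how the change in an indicator could be calculated in a pre-post design (all numbers are hypothetical):

```python
# Hypothetical figures only: a simple pre-post design, where the same indicator
# is measured before and after the campaign and the change is compared.
pre_aware, pre_total = 320, 1000    # respondents aware of the issue before the campaign
post_aware, post_total = 410, 1000  # respondents aware of the issue after the campaign

pre_rate = pre_aware / pre_total
post_rate = post_aware / post_total

print(f"Awareness before: {pre_rate:.0%}, after: {post_rate:.0%}")
print(f"Change: {(post_rate - pre_rate) * 100:+.1f} percentage points")
```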
12. Five steps of an evaluation
All evaluations, communications or otherwise,
follow similar steps in how they are carried
out:
Planning → Creation → Management → Analysis → Presentation
13. Five steps of an evaluation
Planning
• Creating the inception report
• Refining the inception report
• Creating the evaluation framework
• Selecting the evaluation methods & sample
• Consulting with the client
• Establishing the team
• Determining the timeframe
• Determining the deliverables
• Creating the workplan
• Preliminary desk review
Creation
• Designing the evaluation methods
• Designing the templates and tools
• Designing the final report outline
Management
• Testing the templates and tools
• Collecting the data
• Managing the data
Analysis
• Analysing and interpreting the results
• Formulating conclusions
• Creating recommendations
• Writing the final report
• Submitting the final report
Presentation
• Presenting the final report
• Disseminating the final report
• Promoting the final report
• Creating follow-up mechanisms (e.g. steering group)
14. Five steps of an evaluation
– Which steps are usually done well?
– Which steps are usually done less well or
skipped over?
16. Two types of campaigns
Advocacy campaign
Public information campaign
17. Evaluating campaigns
– In theory, campaigns are easier to evaluate
than programmes
– Where do you start if you have been asked to
evaluate a campaign?
18. Theory of change
– A good starting point is to map out the “theory
of change”
– The theory of change shows the pathway
from inputs to impact
What was this campaign trying to achieve?
Inputs → Activities → Outcomes → Impact
19. Example: Theory of change
Inputs:
• Planning of campaign goals and activities
• Preparation of campaign materials
• Consultation and briefing of campaign partners
Activities (organisation-led):
• Communication tools
• Artistic projects (film, cartoon, book, poster)
• Special events & conferences
• Global day
• Media campaign
• Web campaign
Activities (partner-led):
• Adaptation & production of campaign material
• Special events & conferences
• Grassroots mobilisation
• Media campaign
• Web campaign
• Training programme
20. Example: Theory of change
Outcomes (alliances):
• Network actively participated in the campaign
• Engaged multiple stakeholders in the campaign at the country level & globally
• Garnered further support for the organisation
Outcomes (awareness):
• Increased awareness of human rights in general amongst rights holders
Outcomes (action):
• Stimulated debate, spurred action and reaffirmed commitment of governments, civil society, educational, cultural and human rights institutions
• Helped bridge gaps in HR implementation at the national level
Impact:
• People are protected and empowered to realise their rights
21. Evaluation framework
– The “theory of change” assists in clarifying
the objectives of the campaign
– The next step would be to create the
evaluation framework – the link from
objectives/outcomes to indicators to
evaluation methods
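As an illustration, the evaluation framework can be pictured as a simple data structure linking each outcome to its indicators, and each indicator to the tool and selection frame used to verify it. The sketch below is illustrative only and loosely follows the example on the next slide:

```python
# Minimal sketch of an evaluation framework as a data structure.
# Outcome and indicator texts below are shortened, illustrative examples.
from dataclasses import dataclass, field

@dataclass
class Indicator:
    description: str
    tool: str             # means of verification (evaluation tool)
    selection_frame: str  # who or what the tool is applied to

@dataclass
class Outcome:
    description: str
    indicators: list = field(default_factory=list)

framework = [
    Outcome(
        "Increased association of the organisation with ecological challenges",
        [
            Indicator("Change in level of association of individuals",
                      "Online panel study", "individuals recruited online"),
            Indicator("Change in media visibility of the organisation",
                      "Media monitoring", "selection of print and online media"),
        ],
    )
]

for outcome in framework:
    print(outcome.description)
    for ind in outcome.indicators:
        print(f"  - {ind.description} -> {ind.tool} ({ind.selection_frame})")
```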
22. Evaluation framework - example
Campaign outcome 6: Increased association of the organisation as a key actor for today’s ecological challenges

Proposed indicators:
• Change to level of association of individuals
• Change to level of visibility of the organisation in the media

Means of verification (evaluation tools):
• Online panel study of individuals to assess changes of association (selection frame: individuals recruited online)
• Street polls (vox pop) in major cities (selection frame: urban population)
• Event attendance statistics and feedback (selection frame: key events)
• Number of mentions of the organisation and other key words in the media and online (selection frame: selection of print and online media)
• Number of visitors to the campaign portal (selection frame: campaign portal)
23. Evaluation methods
– Standard evaluation methods are used in
campaign evaluation…in combination with
methods particularly adapted for campaigns
and communications programmes
– A combination of qualitative and quantitative
methods is recommended
24. Evaluation methods
Standard: surveys, interviews, panels, focus groups, case studies, observation studies
Adapted: expert reviews, content analysis, media monitoring, web metrics, tracking mechanisms, network mapping
We focus on the adapted methods as they are special to communications!
25. Expert review
– A specialist examines a communication
activity or product and provides an
assessment
– The assessment is often made against best
practices or standards
A brochure is compared to the corporate
identity guidelines of an organisation
A website is measured against usability
standards
26. Expert review - example
Compatibility of the organisation’s web tool and comparative tools with the ISO usability standard
(scores: your organisation / comparative organisation A / comparative organisation B / comparative organisation C)
• High-level design decisions and strategy: 57% / 71% / 71% / 71%
• Content design: 62% / 85% / 73% / 93%
• Navigation and search: 59% / 87% / 84% / 94%
• Content presentation: 59% / 86% / 86% / 93%
• General design aspects: 45% / 95% / 90% / 92%
• Overall compatibility: 58% / 86% / 83% / 92%
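Scores of this kind are typically derived from a checklist: each review area is scored as the share of criteria met. A minimal sketch, using hypothetical criteria rather than the actual ISO checklist:

```python
# Hypothetical checklist only: an expert review score expressed as the share of
# usability criteria a website meets, grouped by review area.
checklist = {
    "Navigation and search": {"site search available": True,
                              "breadcrumbs on all pages": False,
                              "consistent main menu": True},
    "Content presentation":  {"readable font sizes": True,
                              "alt text on images": False},
}

for area, criteria in checklist.items():
    met = sum(criteria.values())
    print(f"{area}: {met / len(criteria):.0%} of criteria met ({met}/{len(criteria)})")
```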
27. Content analysis
– Media reports, documents or other
sources are analysed and categorised to
identify trends and patterns
– Content analysis assists in identifying
preferences, priorities, trends, etc.
28. Content analysis - example
Theme (question), category, number of posts and keywords:
• Policy in the Middle East (research, 110 posts): Middle East, Jordan, Iran, participation
• Policy in Rwanda (research, 89 posts): Rwanda, research, quotas, gender, leadership
• Policy and gender (research, 88 posts): gender, legislation
• Careers in policy work (advice, 55 posts): consultancy, careers
• Heads of state and policy (research, 22 posts): research, leadership
• Upcoming elections & policy (advice, 18 posts): Ecuador, elections, quotas
• Indicators for measuring policy (advice, 13 posts): policy, research, evaluation, Peru
• Policy and representation (advice, 5 posts): Congo, representation
• Policy development (advice, 1 post): New Zealand, development
Content analysis of postings in an online forum
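A minimal sketch of how postings could be categorised and counted by keyword to produce a table like the one above (the themes, keywords and posts below are invented for illustration):

```python
# Hypothetical themes, keywords and posts: categorise forum postings by keyword
# matching, then count posts per theme.
from collections import Counter

themes = {
    "Policy and gender": ["gender", "legislation"],
    "Careers in policy work": ["consultancy", "careers"],
}

posts = [
    "Looking for advice on careers in consultancy",
    "New legislation on gender quotas announced",
    "Does anyone have gender-disaggregated data?",
]

counts = Counter()
for post in posts:
    text = post.lower()
    for theme, keywords in themes.items():
        if any(kw in text for kw in keywords):
            counts[theme] += 1

for theme, n in counts.most_common():
    print(f"{theme}: {n} post(s)")
```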
29. Content analysis - example
[Chart: comparison of media releases & updates issued per day during three crises, over days 1 to 35 of each crisis; one square represents one media release or update issued on that day]
30. Media monitoring
– Media monitoring measures visibility of an
issue or organisation in the media
– Most monitoring counts mentions of
keywords in a pre-selected group of media
using automated software
– Media monitoring can be an indication of
levels of awareness amongst publics – but
it is not a replacement!
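A minimal sketch of what such automated monitoring does: counting mentions of pre-selected keywords in a set of articles, here grouped by month (the keywords and articles are hypothetical):

```python
# Hypothetical keywords and articles: count keyword mentions per month in a
# pre-selected set of media articles.
from collections import defaultdict
import re

keywords = ["human rights", "campaign"]
articles = [
    {"month": "2009-10", "text": "The human rights campaign launched today..."},
    {"month": "2009-11", "text": "Partners joined the campaign in several cities."},
]

mentions_per_month = defaultdict(int)
for article in articles:
    for kw in keywords:
        mentions_per_month[article["month"]] += len(
            re.findall(re.escape(kw), article["text"], flags=re.IGNORECASE))

for month, count in sorted(mentions_per_month.items()):
    print(month, count)
```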
32. Media monitoring - example
[Chart: number of articles per month, from 0 to 2,500, comparing media coverage of Campaign 2008/9 with Campaign 2005/6]
33. Web metrics
– Web metrics are data collected by
automated software on visits and other
actions on websites
– This can be done both for an organisation’s
own website and for a whole sector
– Web metrics can measure different
variables including interests, preferences,
interaction and online behaviours
34. Web metrics - example
Language: share of articles (%) / share of visitors (%)
• English: 90 / 69
• Chinese: 6 / 15
• Russian: 3 / 4
• French: 0.5 / 3
• German: -- / 2
• Spanish: -- / 2
• Other: -- / 5
Combination of content analysis (language of content) with web metrics (language of visitors per computer settings) for an online portal
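A minimal sketch of the web-metrics side of the table above: aggregating a visit log by the visitors' language settings (the log below is hypothetical):

```python
# Hypothetical visit log: aggregate visits by browser language setting to
# approximate the language breakdown of visitors.
from collections import Counter

visits = ["en", "en", "zh", "en", "ru", "fr", "en", "zh", "es"]

by_language = Counter(visits)
total = sum(by_language.values())
for lang, count in by_language.most_common():
    print(f"{lang}: {count / total:.0%} of visits")
```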
35. Tracking mechanisms
– Tracking mechanisms record actions
taken on issues, policies, legislation, etc.
– Tracking is usually done manually, on
standard forms and in a systematic manner
Recording how many partners join a
campaign
Tracking and recording the number of
business leaders that speak out on an
issue
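A minimal sketch of a tracking mechanism: each action (here, a partner joining the campaign) is recorded on a standard form, in this case a small CSV log, and the running total can be reported at any time (the records are hypothetical):

```python
# Hypothetical records: a simple tracking log of partners joining a campaign.
import csv, io

log = io.StringIO()
writer = csv.writer(log)
writer.writerow(["date", "partner", "action"])
writer.writerow(["2009-10-05", "NGO Forum", "joined campaign"])
writer.writerow(["2009-11-12", "City Network", "joined campaign"])

log.seek(0)
records = list(csv.DictReader(log))
print(f"Partners that joined so far: {len(records)}")
for row in records:
    print(row["date"], "-", row["partner"])
```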
37. Network mapping
– Network mapping measures the relations
and flow between people, ideas and
organisations
– Network mapping is useful in measuring
growth of networks and interconnectivity
between publics and issues
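A minimal sketch of how a before/after comparison of a network could be quantified, using the networkx library (the names and connections are hypothetical):

```python
# Hypothetical participants: compare how connected a group is before and after
# an event by measuring network density with networkx.
import networkx as nx

before = nx.Graph([("Ana", "Ben")])
before.add_nodes_from(["Chen", "Dia"])  # present but not yet connected
after = nx.Graph([("Ana", "Ben"), ("Ana", "Chen"), ("Ben", "Dia"), ("Chen", "Dia")])

print(f"Density before: {nx.density(before):.2f}")
print(f"Density after:  {nx.density(after):.2f}")
print(f"New connections: {after.number_of_edges() - before.number_of_edges()}")
```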
38. Network mapping - example
Network maps of conference participants, before and after the conference
39. Network mapping - example
Network map of the directories of an online portal, combining web metrics (number of visits per directory), content analysis (content up-to-date or not) and user survey data (which directories users visited)
Legend: directories are marked as no content, mostly out-of-date or mostly up-to-date; the size of a square indicates the number of visits; connecting lines indicate users who have visited both directories, and the thickness of a line indicates how many users did so
41. Programme evaluation
– This type of evaluation examines a series
of communication activities grouped under
a programme, for example:
By target audience: communication
programme aimed at young people
By function: online communication
programme
By theme: communication programme on
health policy
42. Evaluation steps
– The evaluation steps are the same as for
campaign evaluation, but will normally be over
a longer period (2 – 6 months)
Planning → Creation → Management → Analysis → Presentation
– More consultation and interim meetings with
the client are usually built into the evaluation
planning
43. Programme evaluation
Programmes are typically more difficult
to evaluate than campaigns because:
– They are often on rolling timeframes with
no clear end
– They often have unclear or very broad
objectives
– They often lack an institutional memory on
past activities and achievements
However, organisations increasingly need to
evaluate such programmes!
44. Programme evaluation
– A similar methodology to that used for
campaigns can be applied
– Determining the programme’s objectives
and defining the evaluation framework are
key
45. Programme evaluation
Programme evaluation can focus on three
distinct areas:
– Process: how has the programme been
managed?
– Outcomes: what has the programme
achieved?
– Impact: what has the programme contributed
to overall?
Programme evaluations can combine
elements of all three!
46. Evaluation Framework
Questions for completing the evaluation
framework:
– Are programme objectives documented?
– Does any baseline data exist?
– Has any programme monitoring been done?
– What is the balance between “outcome”,
“process” and “impact” evaluation questions?
Objectives/Questions → Indicators → Tools → Source
47. Evaluation methods
Standard: surveys, interviews, panels, focus groups, case studies, observation studies
Adapted: expert reviews, content analysis, media monitoring, web metrics, tracking mechanisms, network mapping, onsite visits
48. Onsite visits
– In ongoing programmes operating across
multiple countries, onsite visits are often
included as an evaluation method
– Onsite visits involve a combination of
observation, interviews and discussions
– An evaluator can directly observe a
programme’s activities, discuss them with its
implementers and gain in-depth knowledge
– Onsite visits add credibility to the evaluation
findings
49. Evaluating Communication
Programmes, Products and
Campaigns
5. Product evaluation methodology
Glenn O’Neil
50. Product evaluation
– Product evaluation is a narrower approach,
focusing on an individual item
(or series of items)
– This evaluation provides feedback on a
product’s use and its contribution to a
communication programme (or other type
of programme)
51. Product evaluation
– Different types of products can be
considered including:
• Promotional videos
• Publications
• Websites and online tools
52. Criteria for evaluation
– Evaluation questions often include:
• Is the product considered to be of high quality in
terms of design, usability and content?
• Is the product targeted to the right audiences?
• Is the product available, accessible and
distributed to the intended target audiences?
53. Criteria for evaluation
– Evaluation questions often include (cont.):
• Is the product used in the manner for which it
was intended - and for what other unintended
purposes?
• What has the product contributed to broader
communication and organizational goals?
• What lessons can be learnt for improving future
editions of the product and design, distribution
and promotion in general?
54. Evaluation methods
The evaluation methods have to be adapted to
the type of product and can include:
Surveys, interviews, focus groups, case studies, observation studies, expert reviews, content analysis, web metrics, tracking mechanisms, distribution statistics
55. Distribution statistics - example
[Pie chart: distribution of the product by channel: promotional distribution 40%, national offices 30%, web orders 25%, fax orders 5%. Main recipient groups: students/teachers, NGOs, media and local partners]
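A minimal sketch of how channel shares like those above could be computed from raw distribution counts (the figures below are hypothetical):

```python
# Hypothetical figures: compute the share of copies distributed through each channel.
distribution = {"Fax orders": 250, "Web orders": 1250,
                "National offices": 1500, "Promotional distribution": 2000}

total = sum(distribution.values())
for channel, copies in sorted(distribution.items(), key=lambda kv: -kv[1]):
    print(f"{channel}: {copies} copies ({copies / total:.0%})")
```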
56. Mapping use - example
Uses of the product, mapped into four areas:
• Resource: developing teaching materials; creating presentations for clients
• Working tool: charts & tables used in production; used as guidelines for product design
• Training: used for staff training; used for training of national partners
• Policy support: used by authorities to revise guidelines; used by NGOs to influence debates on regulations
57. Mini case study - example
Capacity building for women in Uzbekistan, Central Asia
Nargiz, portal member, Uzbekistan
In Uzbekistan, Nargiz, a portal member, is part of a group of 50 women
who were preparing to run in parliamentary elections. For her, the portal
has been a valuable source of support and information.
“In the e-discussions I got important feedback on fundraising strategies
and financing of campaigns. This information will be used!”
Nargiz especially mentions an interesting experience from Mauritius,
shared on the portal, which she can apply in her daily work at the
Women’s NGO Forum of her country.
“The material on capacity building is also very useful for us. I hope that in
the future, we can share more of our own resources with the network.”
58. Devising precise questions
• In all communications evaluations, if an evaluation
framework exists, it should be relatively easy to move
from indicators to questions or criteria for collecting data
• But these questions and criteria must be created,
documented and shared with the persons undertaking
the evaluation
• Questions and criteria would normally be documented in
templates and guides
59. Devising precise questions
Extending the evaluation framework:
• Objective: the portal facilitates a global exchange of resources
• Evaluation questions: Are resources being exchanged? By whom, of what type, in what regions and at which frequency?
• Indicators: level of usage of the resource areas of the website; frequency and type of resources exchanged; instances of uses of resources; etc.
• Precise questions (we are here!): How often do you use the resource section? Have you contributed resources? How have you used the resources found on the Portal?
60. Evaluating Communication
Programmes, Products and
Campaigns
6. Reporting on communications
evaluation
Glenn O’Neil
61. Reporting and presenting
– We have now jumped to the final phase of the
evaluation
Planning → Creation → Management → Analysis → Presentation
– The presentation phase is often the most
neglected of all the phases
– Evaluations regularly fail to ensure that
people know of the findings and take action
63. Creating readable reports
A good evaluation report is:
• Impartial
• Credible
• Balanced
• Clear and easy to understand
• Information rich
• Action oriented and crisp
• Focused on evidence that supports conclusions

A weak evaluation report is:
• Repetitious
• Too long
• Unclear and unreadable
• Insufficiently action oriented
• Lacking hard data and relying on opinion
• Poorly structured and lacking focus on key findings
• Lacking comprehension of the local context
• Negative or vague in its findings
66. Findings table: an example
Summary of the review’s key findings (expected results and ratings):

Outputs
• Eight directories of the CR established and accessible to potential users from the disaster management community worldwide. Rating: largely achieved
• Eight directories of the CR stocked with relevant, appropriate and up-to-date information on disaster management capacities. Rating: only partially achieved

Outcomes
• Potential users from the disaster management community worldwide learned of the CR. Rating: very limited achievement
• Potential users from the disaster management community worldwide visited the CR and registered. Rating: very limited achievement
• Users obtained information of use to them in one or more of the eight directories of the CR. Rating: only partially achieved
• Users contributed information from their organisations to one or more of the eight directories of the CR. Rating: very limited achievement
• Information found on the CR facilitated the rapid identification of appropriate disaster management services. Rating: very limited achievement
• Information found on the CR contributed to the rapid delivery of humanitarian emergency assistance. Rating: not achieved

Impact
• Delivery of humanitarian emergency assistance improved. Rating: not measured in this review
67. Video report: an example
http://www.youtube.com/watch?v=q6nKXcUrNXA
68. Follow-up mechanisms
Evaluations may require follow-up
mechanisms to ensure that the findings are
disseminated and acted upon, including:
– Workshops with staff and donors to discuss findings
– Steering committees to discuss findings and
implementation
– Plans of action based on findings and
recommendations of the evaluation
69. A parting quote
“Scientific quality is not the principal standard; an
evaluation should aim to be comprehensible, correct
and complete, and credible to partisans on all sides”
Professor Lee Cronbach
oneil@owlre.com
www.owlre.com