2. The team
Beck Taylor
Cathy Henshall
Ian Litchfield
Louise Bentham
Sara Kenyon
Sheila Greenfield
Our partners
Birmingham Women’s Hospital, particularly the Home Birth Team
Birmingham South Central Clinical Commissioning Group
3. Evaluation of a service innovation
• Asked by the NHS to evaluate a dedicated Home Birth Service
• Model and its implementation evolving, not well defined
• Designed project based on Evaluability Assessment methodology
– Interviews (n=21) and a focus group (n=13) with key stakeholders, plus documentary analysis (n=9)
– Explores: programme theory and fidelity; barriers/facilitators; available data; areas for evaluation; recommendations for changes and further evaluation
4. My background
• Public health physician, working in predominantly qualitative research since 2008
• Pre-2008, conducted pragmatic evaluations and assessments with key stakeholders in short timescales – not methodologically robust!
• Considerations
– Desire to inform practice in real time
– Understanding that stakeholders want key findings rather than fine detail
– BUT not at the expense of academic rigour
5. Was there a way to analyse data and deliver findings more quickly?
– Share the emerging model (or lack of one) quickly
– Inform ongoing development and decision-making
– Ensure information is not out of date and remains useful
6. What we did
• Data gathered predominantly by BT
• Analysed ‘rapidly’ by BT and CH with input from SK
• No coding – ‘summary templates’ used to manage data, as sketched below (Alison Hamilton. Qualitative Methods in Rapid Turn-Around Health Services Research. Online presentation for the US Department of Veterans Affairs, 2013)
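A minimal sketch, in Python, of what one of these summary templates might look like if expressed in code – purely illustrative. The domain names, the SummaryTemplate class, and the matrix step are assumptions for the sketch, not the actual templates used in this study; the idea, following Hamilton, is a short neutral summary per domain for each interview, collated into a matrix, rather than line-by-line coding.

```python
from dataclasses import dataclass, field

# Hypothetical domains for the sketch; real templates would use domains
# drawn from the evaluation questions.
DOMAINS = [
    "programme theory",
    "implementation fidelity",
    "barriers",
    "facilitators",
    "available data",
]

@dataclass
class SummaryTemplate:
    """One template per interview: a brief summary per domain, no coding."""
    interview_id: str
    summaries: dict[str, str] = field(default_factory=dict)

    def summarise(self, domain: str, note: str) -> None:
        if domain not in DOMAINS:
            raise ValueError(f"Unknown domain: {domain}")
        self.summaries[domain] = note

def build_matrix(templates: list[SummaryTemplate]) -> dict[str, dict[str, str]]:
    """Collate templates into a domain x interview matrix for team review."""
    return {
        d: {t.interview_id: t.summaries.get(d, "") for t in templates}
        for d in DOMAINS
    }

if __name__ == "__main__":
    t = SummaryTemplate("INT-01")
    t.summarise("barriers", "Staffing pressures limit home birth cover.")
    print(build_matrix([t])["barriers"])  # {'INT-01': 'Staffing pressures ...'}
```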
9. Questions arising…
• How does this approach work in practice?
• Does it deliver findings more quickly than ‘traditional’ qualitative analysis?
• Does it elicit similar findings to traditional approaches? If not, how do they differ?
• What impact might any differences in findings have?
• What might the applications of this approach be?
10. “Can’t we do it the ‘normal’ way and see how the two compare?”
(Thank you Dr Kenyon!)
11. Comparative analysis project
• Repeat analysis of the data using in-depth analysis (Framework method)
• Independent, blinded researcher (Ian Litchfield)
• Input from second researcher (Louise Bentham)
• Oversight and methodological support from Prof Sheila Greenfield
16. Approach to comparison
(not much in the literature)*
1) Time
– Researcher timesheets
– Total time + time for specific tasks
2) Findings and recommendations
– Independent review – match/partial match/no match (tallied as sketched below)
– Check as a team
– Quantitative and qualitative summaries of similarities/differences
*Burgess-Allen J, Owen-Smith V. Using mind mapping techniques for rapid qualitative data analysis in public participation processes. Health Expectations. 2010;13(4):406-15.
*Putten JV, Nolen AL. Comparing Results from Constant Comparative and Computer Software Methods: A Reflection about Qualitative Data Analysis. Journal of Ethnographic & Qualitative Research. 2010;5(2):99-112.
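A minimal sketch of how the match/partial match/no match review could be tallied. The three categories come from the slide above, but the ratings, function name, and resulting percentages are invented for illustration.

```python
from collections import Counter

# Invented example ratings: one per rapid-analysis finding, recording how it
# was rated against the in-depth analysis in the independent review.
RATINGS = ["match", "match", "partial match", "no match", "match", "partial match"]

def summarise_matches(ratings: list[str]) -> dict[str, float]:
    """Percentage of findings falling into each match category."""
    counts = Counter(ratings)
    total = len(ratings)
    return {
        cat: round(100 * counts.get(cat, 0) / total, 1)
        for cat in ("match", "partial match", "no match")
    }

print(summarise_matches(RATINGS))
# {'match': 50.0, 'partial match': 33.3, 'no match': 16.7}
```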
17. Results: time
• Data management was much faster in rapid analysis (RA): in-depth analysis (IDA) took 3x longer
• Interpretation took much longer in RA (6x longer than IDA), but we think other factors influenced this
18. Results: overlap in recommendations and findings
[Overlap diagram comparing In-depth Analysis and Rapid Analysis, labelled ‘specific, detailed’, ‘key issues’, and ‘context-informed, interpretive’]
19. Comparing the two methods
Rapid Analysis | In-Depth Analysis
Clinical | Not clinical
Embedded in the field | No prior exposure to field
BT collected the data | Did not collect data
Using RA for first time – learning time, need to avoid usual practice | Experienced in method – no method to ‘learn’, doing what comes naturally
Shared office | No space for informal reflection
Equal workload | IL did the lion’s share of analysis
Main focus of work over short period | Project ‘squeezed in’ among other commitments
Focused on producing and ‘crafting’ outputs for known stakeholders | Much less focused on the stakeholder team
20. Reflections – rapid analysis
• Requires time discipline
• Uncomfortable at times – are we staying true to the data?
• How would this work for novice researchers?
• How would this work in larger teams? Would it take longer to synthesise data?
• Would this work if we were not embedded/clinical?
21. Reflections – comparing methods
• Very time-consuming
• Defining and interpreting ‘outcomes’ is not straightforward – what constitutes a ‘finding’ or a ‘recommendation’?
• Comparing apples with oranges – further work needed
• Ideally we would compare two similar research teams using different methods, but there are cost/capacity implications
22. Reflections – applications of RA
• Identifying headline/priority issues – how much of the detail do stakeholders actually use in practice?
• Providing rapid findings where time is of the essence (provided the analysis is truly rapid)
• Identifying areas for more in-depth analysis
23. Further work
• ‘More comparable’ comparisons
• Repeat comparisons in other contexts
• Comparisons of other rapid approaches
• Consultation with users of research outputs