Presentation to the 2017 European Association for Health Information and Libraries (EAHIL) Conference, Dublin. An update on the work of the Health Education England group on improving the use of metrics by helping people define better metrics and then put them to use.
1. Building better metrics – driving
better conversations
Alan Fricker - Head of NHS Partnership
& Liaison, King’s College London
2. Why Metrics?
• How are we doing?
• How do we compare?
• Have changes made a
difference?
• Have better conversations
@NHS_HealthEdEng #heelks
3. Defining terms
• "A metric is criteria against which something is
measured" (Ben Showers (2015) Library Analytics
and Metrics)
• "a criterion or set of criteria stated in quantifiable
terms" (OED)”
@NHS_HealthEdEng #heelks
4. What was the plan?
• Take a look around
• Identify appropriate methodologies
and mechanisms
• Help people get better with metrics
• Support Knowledge for Healthcare
@NHS_HealthEdEng #heelks
7. @NHS_HealthEdEng #heelks
NHS explorations
SHALL National KPI
• 2011 consultation on 6 national KPI
• Revised to 4 (not all from original list)
– % of the organisation’s workforce (headcount) who are registered library
members.
– % of the organisation’s workforce (headcount) who have registered as a
library member in the last year.
– % of the organisation’s workforce (headcount) who have used ATHENS in
the last year.
– % increase in compliance with the Library Quality Assurance Framework
(LQAF) compared with the previous year.
• Not implemented
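Each of the four revised KPIs is a simple ratio against the organisation's workforce headcount, so they are easy to reproduce once the counts are to hand. A minimal illustrative sketch in Python, using invented figures (none of these numbers come from the presentation):

def pct(part, whole):
    # Express part as a percentage of whole.
    return 100.0 * part / whole

workforce_headcount = 10_000     # organisation's workforce (headcount); invented figure
registered_members = 3_200       # registered library members; invented figure
new_members_this_year = 450      # registered as members in the last year; invented figure
athens_users_this_year = 2_100   # used Athens in the last year; invented figure
lqaf_now, lqaf_prev = 92.0, 88.0 # LQAF compliance scores (%), this year and last; invented figures

print(f"KPI 1 registered members: {pct(registered_members, workforce_headcount):.1f}%")
print(f"KPI 2 new members this year: {pct(new_members_this_year, workforce_headcount):.1f}%")
print(f"KPI 3 Athens users this year: {pct(athens_users_this_year, workforce_headcount):.1f}%")
# KPI 4 ("% increase in compliance") is read here as the relative increase on
# last year's LQAF score; it could equally be taken as the percentage-point change.
print(f"KPI 4 LQAF compliance increase: {pct(lqaf_now - lqaf_prev, lqaf_prev):.1f}%")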
8. @NHS_HealthEdEng #heelks
NHS Quality Assurance
HeLICon and the Library Quality Assurance
Framework (LQAF)
• Helpful checklist approach
• Often seen as key metrics (levels / 90%)
• But burdensome
9. Current practice in the NHS
• Brief KfH survey on metrics in use
• 150 responses but only 47 offered a metric
• 117 metrics suggested
@NHS_HealthEdEng #heelks
10. Serendipity
• Areas for focus (Van Loo in Haines-Taylor &
Wilson, 1990):
– time consuming
– space intensive
– high cost
– affect most users
– directly linked to library objectives
– well defined and easy to describe
– relatively easy to collect
– are in areas where library staff have some
control to make changes
@NHS_HealthEdEng #heelks
11. @NHS_HealthEdEng #heelks
Wider world - libraries
International standard (ISO 11620:2014)
• Generic approach to performance
indicators
• Well defined terms
– Resources
– Use (activity)
– Efficiency (cost)
– Potentials and Development (value added work)
• 52 indicators offered
12. @NHS_HealthEdEng #heelks
Wider world - libraries
International standard - criteria
Informative content (provides information for decision making)
Reliability (produces same result when repeated)
Validity (measures what it is intended to measure –
though indirect measures can be valid)
Appropriateness (units and methods of measurement
appropriate to purpose)
Practicality (does not require unreasonable staff or user
time)
Comparability (the extent to which a score will mean the
same for different services – standard is clear you should
only compare similar services)
13. @NHS_HealthEdEng #heelks
Wider world
The Metric Tide - dimensions
“Robustness: basing metrics on the best possible data
in terms of accuracy and scope
Humility: recognising that quantitative evaluation should
support – but not supplant – qualitative, expert
assessment
Transparency: keeping data collection and analytical
processes open and transparent, so that those being
evaluated can test and verify the results
Diversity: accounting for variation by field, and using a
range of indicators to reflect and support a plurality of
research and researcher career paths across the system
Reflexivity: recognising and anticipating the systemic
and potential effects of indicators, and updating them in
response.”
14. @NHS_HealthEdEng #heelks
Wider world
HSCIC – Quality Assurance Indicators Tool
Relevance (Does it meet user need? Is it actionable?)
Accurate and reliable (Quality of data? Is it a good estimate of reality?)
Timeliness and Punctuality (How long after the event is data available / collected?)
Accessibility and clarity (How easy is it to access the data? How easy is it to
interpret?)
Coherence and comparability (Are data from different sources on the same topic
similar? Can it be compared over time?)
Trade-offs (Would improving this metric have a negative impact on another?)
Assessment of user needs and perceptions (What do stakeholders think?)
Performance, cost and respondent burden (How much work is involved in
collection?)
Confidentiality and transparency
16. @NHS_HealthEdEng #heelks
Principles for good metrics
Meaningful
• Relates to goals of organisation
• Relates to needs of stakeholders
• Re-examined over time to ensure
still valid
17. @NHS_HealthEdEng #heelks
Principles for good metrics
Actionable
• Measures what matters
• Measures something you can
influence
• Drives changes to behaviour /
services
• Investigate not assume
18. @NHS_HealthEdEng #heelks
Principles for good metrics
Reproducible
• Clearly defined in advance
• Transparent
• Can be replicated
• Best available data
• Non-burdensome (to allow repetition)
19. @NHS_HealthEdEng #heelks
Principles for good metrics
Comparable
• Valid over time for internal use
• Valid externally for benchmarking
• Respect diversity of services
24. @NHS_HealthEdEng #heelks
Knowledge for Healthcare
An underpinning tool
• Supporting thinking around:
– Evaluation framework
– National Statistics Collection revision
– Future LQAF
25. @NHS_HealthEdEng #heelks
Opening a bank
Promoting sharing and supporting use
• Open submission and
publication
• Quality check – mostly around
reproducibility
• Not a big rush as yet!
• Helpful conversations
26. Thanks
Alan Fricker - Head of NHS Partnership & Liaison,
King’s College London
Alan.Fricker@kcl.ac.uk
@NHS_HealthEdEng #heelks
Editor's Notes
You could say it is something to argue with
Example of use of these figures in the NLH finance report
Previous attempt to address this issue. Feel very culpable here as one of the people who shot holes in things. Basically – I could game almost every single one – the question was – did they matter?
First six
KPI1. Percentage of the organisation's workforce (headcount) which are "active"* library users. (Indicates penetration of library service.)
KPI2. Percentage of the organisation's workforce (headcount) which are registered ATHENS users. (Indicates use of e-resources.) (E.g., 1,000 Athens users in an organisation of 10,000 staff = 10%.)
KPI3. Recurrent expenditure commitment on library services based on the organisation's workforce (WTE). (Indicates Trust commitment to Library Services.) (E.g., £100,000 spent on library services in a Trust of 10,000 staff = £10 spent on library services per WTE.)
KPI4. Number of information consultancy enquiries per member of staff based on the organisation's workforce (WTE). (Indicates penetration level of library enquiries on the organisation.) (E.g., 400 enquiries in an organisation with 1,000 staff = a penetration level of 0.4.)
KPI5. Percentage of the organisation's workforce (headcount) that subscribe to current awareness services. (Indicates penetration level of current awareness services on the organisation.)
KPI6. Percentage of the organisation's workforce (headcount) which have received information skills training in one year. (Indicates penetration of information skills/information literacy training on the organisation.)
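The worked examples quoted in KPI2, KPI3 and KPI4 are plain ratios. A short Python sketch reproducing the three calculations, using the illustrative figures from the notes rather than real service data:

# KPI2: 1,000 Athens users in an organisation of 10,000 staff (headcount)
athens_pct = 100 * 1_000 / 10_000        # = 10%
# KPI3: £100,000 spent on library services in a Trust of 10,000 staff (WTE)
spend_per_wte = 100_000 / 10_000         # = £10 per WTE
# KPI4: 400 enquiries in an organisation with 1,000 staff (WTE)
enquiry_penetration = 400 / 1_000        # = 0.4 per WTE

print(f"KPI2: {athens_pct:.0f}% of headcount are Athens users")
print(f"KPI3: £{spend_per_wte:.2f} spent on library services per WTE")
print(f"KPI4: enquiry penetration of {enquiry_penetration:.1f} per WTE")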
We started by considering where metrics (and quality assurance / KPI) had been discussed in the NHS previously – we stuck with major initiatives and did not seek out an exhaustive picture of local work.
HeLICon's roots go back to the original LINC health panel accreditation checklist and toolkit (1996-1998).
LQAF 2010 onwards
Why so few metrics? Issue with tool? Survey overload? Discomfort with metrics?
Discovered on the discard pile – these described what we were seeing in the survey data perfectly
Bingo! Powerful way to think about what we are interested in
Debate in HE around use of Metrics – post REF 2014 and in an increasingly numbers driven approach to career futures.
Now known as NHS Digital. National library of quality assurance indicators – a task under the 2012 Health and Social Care Act – aimed at healthcare delivery and performance, but works for our quality purposes too
People care about this metric
This metric makes a difference
You could repeat my metric and the results would be consistent
Take care with comparisons!
Doing this is not easy! The template is there to help
Main template – spells it out and offers gaps. Simple Word based
Checklist – good enough for Gawande and the WHO – good enough for me
Booklet with modified forms (previous version)
Uses 9 lightly modified templates to define and present quality standards
Used to help present a wide picture of the service
Revisiting standards at the moment to ensure they are still meaningful – document supply turnaround time, for example, is always met. Is it challenging? Meaningful?
Good feedback from colleagues within the service and those she works with – wider use in the division, where people were thinking about similar issues
Three years was the plan, but it was more organic in the end
(accessing GMC data, considering options for KnowledgeShare)