Pathology Quality Review
Fr3dom Health Review Team
October 2013
2013
Prepared for The Barnes Review
Fr3dom Health Solutions Limited | Fr3dom House | 11 North Street | Portslade | E Sussex | BN41 1DH | www.fr3domhealth.co.uk
FHS-PathologyQualityReview-v2.indd 1 08/11/2013 17:07:12
Contents

3	 1: Executive Summary
4	 2: Introduction
4	 3: Background
5	 4: Summary of Findings
5		 4.1: High level
5		 4.2: Objective assessments and benchmarking of quality
5		 4.3: Quality systems and learning
5		 4.4: Maintaining quality assurance during times of structural change
6	 5: Acknowledgements
7	 6: Method
8	 7: Results
9	 8: Emergent Themes
9		 8.1: Overall impression of pathology services
10		 8.2: Issues concerned with the breadth of activity covered by pathology
11		 8.3: Objective assessments and benchmarking of quality
13	 9: Quality systems and learning
13		 9.1: Some of the practices that were identified:
13		 9.2: Sharing lessons within and outside the organisation
14		 9.3: Maintaining (improving) Quality Assurance during times of structural change
15		 9.4: Maintaining Quality in times of organisational stress
16	 10: Scope of pathology services’ ability to control quality on the whole pathway
17		 10.1: Working with Commissioners to manage demand
17		 10.2: Restrictions imposed by IT systems
18	 11: Paradigm shift to commissioning led service
19	 Appendix 1: Survey Results
19		 Table 1: Survey returns
20		 Table 2: Categorisation of answers given
21	 Appendix 2: Topic Guides
21		 Discussion guide for providers:
21		 In response to the Francis Report the Royal College of Pathologists said:
21		 Discussion Guide for Commissioners
21		 In response to the Francis Report the Royal College of Pathologists said:
22	 Appendix 3: Quality Assurance in Pathology
2013 | Prepared by Fr3dom Health
Pathology Quality Review
1: Executive Summary
Fr3dom Health surveyed pathology services during
the summer of 2013; a sample of 12 providers was
subsequently approached to take part in a more in-depth
analysis. In the autumn of 2013 a total of 28 interviews
took place with 43 participants who worked within
twelve different health communities. They included
clinicians and quality managers
from provider units, Clinical Commissioning Groups
(CCGs) and a Commissioning Support Unit (CSU).
•	 QA within pathology services appears to give few
causes for concern. Clinicians and managers in
provider units approach their work and its QA with
energy, passion and enthusiasm.
•	 Neither the scope of activity undertaken nor the
quality assurance of that activity is currently driven
by commissioners.
•	 Different styles of pathology department utilise
different styles of QA. It is not clear whether the
different styles and processes have a measurable
effect on quality of service. There may be a case for an
agreed definition of quality.
•	 Some departments have very well developed
and resource intensive quality systems. The most
persuasive were those where quality was built into the
day-to-day activity of the department. We would like
to see some evidence about whether a great number
of “meetings about quality” equate to quantifiable
improvements in quality of service.
•	 Responsibility for delivering quality needs to sit with
laboratory staff and clinical staff and within their
respective management structures. Staff training and
development can reinforce this.
•	 Some work might be done in identifying objective
metrics /standards beyond simple performance
measures in order to allow benchmarking of quality
for commissioners and others. To achieve this, there
is a need to revitalise professionally-led clinical
groups to work on standardisation of protocols
and performance measures within pathology
departments, or for commissioners to more fully
promote Royal College of Pathologists (RCPath) Key
Performance Indicators (KPIs).
•	 A key test is how departments and their staff handle
error reporting and incidents. An open culture
of learning from mistakes and where appropriate
involvement of those outside pathology is indicative
of a good approach. Ensuring both clinical and
laboratory staff receive feedback and share lessons
learnt within and across organisations remains a
challenge and one that requires attention.
•	 Pathology services are undergoing a period of intensive
change (post Carter). QA needs to be kept to the fore
when planning these changes.
•	 The error reporting and risk management systems seem
to work efficiently and effectively. Largely, there are
similar ways of dealing with error reporting processes
in all pathology departments surveyed. In general
commissioners expect that they would be told if
exceptions occur.
•	 The most persuasive regimes for quality assurance
were those services where the quality managers
adopted a facilitative role of persuading and enabling,
and if necessary cajoling, front line staff to use and
drive the quality management systems, recording
and following up incidents appropriately, performing
some audits and taking part in others.
•	 IT systems may in some cases limit the internal and
external communication (including between merging
organisations) and therefore might limit effectiveness
of quality management systems (QMS). There is a
need to consider integration standards for IT systems
in pathology.
•	 Primary care and secondary care would each
welcome better communication and some pathology
departments are actively seeking to engage with
primary care users of their service. If asked for
their unmet needs, almost universally primary care
wants “the same but quicker”. Pathology services
perceive that they are driving innovation rather than
responding to a demand for innovation from primary
care.
•	 There appear to be few, if any, commissioner-led
service specifications. There may be scope for
commissioners (perhaps via CSUs or by taking
a consortium approach across CCGs) and providers to
work jointly on service specifications.
•	 There appears to be little appetite in CCGs to
undertake this work and little pathology expertise
within commissioning bodies, a situation that will
not help develop QMS in the future. This, combined
with the lack of understanding or acknowledgement
of RCPath KPIs in current commissioning, is an issue
that will need addressing if quality is to become more
systemically assured than is currently the case.
•	 The availability of pathology staff for advice and
guidance was mentioned both by providers and
commissioners as a key quality indicator although
much more difficult to measure other than as a
process e.g. number of meetings attended, out of
hours contact information. This may be a fruitful area
for commissioners and providers to work together.
This report was commissioned to inform the Review
of Quality Assurance (QA) Arrangements for NHS
Pathology Services ordered by NHS Medical Director
Sir Bruce Keogh in December 2012 and led by Dr Ian
Barnes. The review brings together experts to explore
how QA arrangements can be strengthened and
how organisations can be more confident about the
monitoring of the quality of care they offer the public.
Fr3dom Health was commissioned to review the
systems and practices in place to ensure quality within
commissioner and provider environments; and to
examine which, if any, systems are in place to support and
monitor this.
The research took place in the summer and autumn of
2013 and also explored the extent to which pathology
services and their associated Quality Management
Systems are commissioning led. We focused on:
Provider input: the attitudes and practices of providers
towards quality assurance including error reporting, the
consistency and spread of reporting channels and the
reasons for any variation.
Clinical Commissioning Group input: the
understanding of the quality of service (QoS) in terms
of how and what is commissioned, quality assurance
processes used to monitor QoS, awareness of statutory
undertakings together with the reasons for the views and
practices expressed.
Commissioning Support Units: the extent to which
commissioning support (driven by service level
agreements (SLAs) and Quality of Service (QoS)
indicators) is becoming apparent in the new system.
There is an acceptance that errors occur but anecdotal
evidence suggests that not all errors and their handling
are recorded, reported or shared with peers and Trust
management consistently across pathology facilities –
despite there being standard reporting tools in place.
There is a need to understand the extent to which
this is true and, as far as can be determined, why this
might be the case. There is also a concern that there
is variation in how errors are ranked for severity,
allowing for inconsistency in reporting. This report was
commissioned to help the review understand:
1.	 What is and is not reported locally and why?
2.	 What is and is not reported nationally and why?
3.	 What systems are used for reporting?
4.	 What arrangements are in place to ensure that
learning from incidents is promoted and
disseminated?
Post Francis in particular there is a need to ensure that
commissioners understand their duty to commission a
high quality service. Contractual lines and specifications
are not always consistent and do not always cover QoS
against all services. It is important to understand how
commissioners monitor QoS in pathology services. This
report was commissioned to help the review understand:
1.	 Do commissioners know the details and scope of
what they commission, and the quality of services
purchased?
2.	 How and what do they commission?
3.	 How do they assure themselves of the quality of the
service delivered?
4.	 Are they aware of their responsibilities/obligations
regarding quality?
5.	 What arrangements are in place to ensure that
learning from any incidents is promoted and
disseminated?
2: Introduction
3: Background
4: Summary of Findings
4.1: High level
•	 There is no clear definition of quality against which
to judge services externally or indeed to benchmark
between services. There is a plethora of processes
used, from operating procedures and systems to
“getting it right for patients” to reliance on external
QA bodies. In the absence of such a definition against
which to make an objective judgement, the research
showed that QA within the pathology services visited
within England appears to be in very good shape.
•	 We have been impressed by the energy, passion
and enthusiasm within the pathology services for
their work and assuring its quality. The laboratory
managers and clinical staff we met had quality
ingrained in their day-to-day work. We recognise
this group was self-selected, however we remain
persuaded.
•	 This high standard is driven almost entirely by the
service providers themselves – commissioners,
certainly those commissioners in the primary care
sector, have not engaged in driving change within the
services at any scale. It is not high on their agendas,
and certainly not a current priority.
•	 There is a desire from within pathology services
to do more to help the diagnostic process. Real
measurement of quality in terms of informing patient
care should be captured.
•	 The role of multi-disciplinary teams (MDTs)
in sense checking and challenging marginal
interpretive decisions is very much part of the quality
management (QM) of a pathology department but is
difficult to quantify.
•	 We wonder whether there are differences in approach
between quality managers who have emerged
from pathology (or the wider NHS) as opposed to
quality managers who have come from other sectors
including the commercial world? Is there a benefit in
sharing different approaches?
•	 There is a concern expressed by some that
investigating QA in pathology implies a deficiency
in quality. The opposite may well be true. Other
interpretive specialisms do not appear to be under
this sort of scrutiny.
•	 Similarly there is a concern that high levels of error
reporting is regarded as evidence of poor practice
rather than being an indicator of an open culture of
learning from mistakes.
•	 We perceived a difference in response to QA when
we spoke to clinicians or quality managers versus lab
managers or heads of service.
•	 There was some frustration from some quality/
risk management professionals that clinicians
had not necessarily understood or appreciated
their role in the past. If there ever were schisms
between the professions they seem to be reducing to
insignificance.
4.2: Objective assessments and
benchmarking of quality
•	 Clinical Pathology Accreditation (CPA) is effectively
the universal standard for pathology services. Within
the profession there is much debate about how useful
CPA is. As far as commissioners are concerned it is
seen as the “kite mark” and thus shorthand for a great
deal of information regarding quality. It is an obvious
starting point for them. CCGs should check that labs
are CPA-accredited or accredited to ISO 15189. They
should also check participation in external quality
assurance schemes.
•	 Concern was expressed in pathology departments
that a good CPA could reflect that an organisation is
good at passing assessments rather than providing
a good quality service per se. Similarly they felt that
there could then be a danger that the CPA standards
are seen as a target rather than a minimum standard.
•	 A majority of pathology departments visited had
well established quality management systems. This
typically consisted of open access to error logging,
regular quality meetings with incident reports as well
as audit findings and examination of trends.
•	 Incidents were reported locally and actions reviewed
by departmental managers (within pathology) or
quality leads from within pathology (who may
be service heads). QPulse is widely used as the
pathology-specific incident logging system; Datix is
then used at Trust level.
4.3: Quality systems and learning
•	 A key test is how departments and their staff handle
error reporting and incidents. An open culture
of learning from mistakes and where appropriate
involvement of those outside pathology is indicative
of a good approach.
•	 Ensuring both clinical and laboratory staff receive
feedback and share lessons learnt within and across
organisations remains a challenge.
4.4: Maintaining quality assurance
during times of structural change
•	 There are definite concerns about the changes being
driven through by the Carter review. Most can see the
merit but there are concerns about how to maintain
(let alone improve) quality through the changes.
•	 There are very high volumes of work with a wide
range of procedures that are all to be covered by the
same quality management systems.
•	 There is a sense of organisations under pressure.
We detected concerns that quality (in common
with training and / or continuing professional
development) might be sacrificed in an attempt to do
more for less.
•	 There is a case for some research as to whether
the amalgamation of different services, each with its
own systems, into a single management structure actually
improves quality. There is a need to establish a proper
QMS before such changes. As with many other quality
issues the use of IT is critical here. For example where
two or more organisations are working together,
integrating their error reporting systems (Datix) is
not straightforward.
5: Acknowledgements
Thanks must go to the dozens of participants who
made themselves available, as well as their colleagues who
supported us with administration, information and a
welcome during a hectic summer period.
Without exception we found the professionals who took
part working within pathology services up and down the
country to be enthusiastic, professional and genuinely
passionate about their work. It was a real pleasure to
meet individuals and teams taking pride in their service
and in getting things right for patients. We are extremely
grateful to them for making time to see us in their busy
schedules. Only two pathology service providers from
our original sample were unable to engage, and these
were readily substituted with similar organisations.
We would also want to thank those individuals who felt
able to contribute from commissioning organisations.
It is true to say that we had much more difficulty in
engaging with these organisations. It was a difficult time
of year to find space in diaries and CCGs and CSUs
were clearly under a lot of pressure. We struggled to find
many people who could engage with our questions in a
meaningful way but all the interviews we conducted were
courteous and useful.
We are sorry that a handful of individuals from
commissioning organisations were too busy throughout
the entire period of fieldwork to respond in any way.
6: Method
Pathology departments were surveyed during the
summer of 2013 and from the respondents to this survey
a sample of 12 providers was approached to take part in
a more in depth analysis.
A brief résumé of the findings of this survey appears at
Appendix 1.
59 individuals responded, comprising clinicians and
service managers; of these:
•	 50 reported they were aware of how pathology error
coding is reported within the Trust
•	 49 were aware of the risk manager’s role in respect of
pathology services
•	 48 reported they were aware of established
processes for sharing experiences of pathology error
coding with colleagues outside the Trust, perhaps at
other hospitals
•	 37 were aware of QoS policies at their laboratory in
respect of pathology error coding.
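For context, the awareness counts above can be expressed as proportions of the 59 respondents. This short calculation is added here as an illustration and is not part of the original survey report:

```python
# Awareness counts out of 59 survey respondents, taken from the bullets above.
respondents = 59
counts = {
    "how pathology error coding is reported within the Trust": 50,
    "the risk manager's role in respect of pathology services": 49,
    "processes for sharing error-coding experience outside the Trust": 48,
    "QoS policies at their laboratory on pathology error coding": 37,
}

# Percentage of respondents aware of each item, rounded to one decimal place.
percentages = {item: round(100 * n / respondents, 1) for item, n in counts.items()}
```

Awareness therefore ranges from roughly 63% (QoS policies) to roughly 85% (error-coding reporting), suggesting that formal QoS policy is the least visible of the four areas surveyed.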
The sample of pathology departments to be investigated
was selected in consultation with colleagues from the
Pathology Quality Assurance Review Team from NHS
England. The intention was to survey, as far as possible,
a representative mixture of pathology services by size,
geography and type of organisation.
Pathology services were contacted and invitations to
take part in the review were extended. We found the
services to be extremely hospitable. Only two services
were unable to contribute and fortunately we were able
to invite other similar sized and positioned services to
substitute almost immediately.
We asked services to identify their link commissioners
in primary care. In the event we usually approached
professionals on the commissioning side through other
channels.
Guided interviews conducted by experienced senior
researchers were used in order to gather data. Similar
Topic Guides were produced in order to facilitate the
interviews with different professionals in order to
explore the role that quality issues play in the delivery
of pathology services (copies of these are included at
Appendix 2). Interviews were conducted face-to-face
in the participant’s place of work where practicable. All
the interviews with providers took place face-to-face. In
general participants on the commissioning side appeared
to have much more pressure on their time and the
majority of these interviews took place on the telephone.
The fieldwork took place throughout England from mid
August until the early part of October.
All participants were guaranteed anonymity and, given
this, the organisations taking part and the individual
contributors will not be identified within this report
other than by using generic role descriptions.
7: Results
A total of 28 interviews took place with 43 participants who worked within 12 different health communities.
Interview                            Face to Face   Telephone   Total
Service Provider
  Quality Manager                          11            0        11
  Service or Lab Manager                    9            0         9
  Clinician                                 7            0         7
  Trust Governance / Risk Manager           3            1         4
Commissioner
  CCG Quality Director                      1            5         6
  CCG Contracts Officer / Lead              2            1         3
  CSU Lead                                  1            2         3
Total interviews                           34            9        43
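As a quick arithmetic check of the table (added for context, not part of the original report), the row and column tallies can be verified; note the column totals only reconcile with the reported 43 participants if the CSU Lead row totals 3 (1 face-to-face plus 2 telephone):

```python
# Interview counts from Table 1 as (face-to-face, telephone) pairs,
# transcribed from the report.
rows = {
    "Quality Manager": (11, 0),
    "Service or Lab Manager": (9, 0),
    "Clinician": (7, 0),
    "Trust Governance / Risk Manager": (3, 1),
    "CCG Quality Director": (1, 5),
    "CCG Contracts Officer / Lead": (2, 1),
    "CSU Lead": (1, 2),
}

face_to_face = sum(f for f, _ in rows.values())  # column total: 34
telephone = sum(t for _, t in rows.values())     # column total: 9
total = face_to_face + telephone                 # 43, matching the 43 participants
row_totals = {name: f + t for name, (f, t) in rows.items()}
```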
We had correspondence with several additional
commissioning and contracting professionals who
provided some material, but we have not included these
contacts above since their interviews fell outside the
fieldwork window due to availability issues and their
input was consistent with previous interviews.
Discussion of method and analysis
We acknowledge that this was a review with several
limitations (of time and scope); nevertheless, a
vast amount of information was gathered from the
various organisations and individuals with different job
roles (several of the interviews lasted over three hours).
A simple thematic analysis has been performed.
Saturation was reached fairly early on with several
themes.
Our brief has been to highlight the common themes
relevant to the review of QA within pathology arising
from these discussions as well as some perhaps unusual
themes that were brought up that we think are of interest
to the review.
With only one or two exceptions the pathology
departments approached were extremely welcoming
and open to the idea of taking part in the work. We
acknowledge that we may well have spoken with a very
select willing sample i.e. those who responded to the
survey request promptly and could accommodate a visit
at short notice. However some of the themes that arose
were so common that we would be extremely surprised if
the size and pragmatic nature of our sampling biased our
findings.
Many of our themes focus on the service provider
element of pathology. It proved more difficult to
identify and engage with individuals and organisations
commissioning pathology services. There may be
commissioners within CCGs who are actively pursuing
improvements in QA within pathology services and
taking a lead in these developments but we did not come
across many and weren’t given to understand that there
were many to come across. We cannot surmise what their
priorities would be.
Table 1: Interviews by type of organisation and participant’s role
8: Emergent Themes
8.1: Overall impression of
pathology services
The stereotype of pathology departments is of a “service
within a service”, perhaps aloof and surrounded by
metaphoric high walls. The fact that departments are
often in separate buildings from the main hospital and
always (for good reason) behind locked doors reinforces
this initial impression.
Significantly a similar impression was held very
strongly by several commissioners in CCGs who did
not necessarily have first-hand knowledge of their local
providers.
But there were some serious concerns.
Whilst such forthright views were not universal, or
even usual, we can be extremely clear that pathology
services were not a high priority for almost any of the
commissioners.
Pathology services often felt that they were the leaders
when it came to developing new services or suggesting
metrics to reflect performance or quality. One or two
of the pathology departments were proactive with
commissioners, for example checking whether they will
pay for new tests or innovative procedures. The process
in one Trust appeared to be to pilot an innovation in
order to show any benefit of the new service and then
present a robust business case based on the pilot.
On visiting pathology services we were pleasantly
surprised by their energy, passion and enthusiasm
around quality issues.
The sense of “otherness” was a genuine one. Many of the
participants to whom we spoke referred to the hospital or
the Trust in the third person, i.e. “they” instead of “we”.
However this sense of team spirit seemed, to us at least,
to be beneficial when trying to deliver extremely high
standards of quality within their areas of responsibility.
We may have been fortunate to find such enthusiastic
individuals in service after service but our perception
is that individuals within these services have a well-
developed sense of professional pride and a desire to
perform at the very highest standards.
Furthermore there was a sense reinforced time and time
again that the importance of maintaining these high
standards was in order to provide a high quality and safe
service for patients.
We would suggest that this desire “to get things right
for every patient every time on time” is a matter of
professional pride from individuals who work within a
rigorous scientific discipline. Whilst those who wished
to take part in the research gave us a universally warm
welcome one or two expressed concerns about the
implications of the Barnes Review.
They were keen to point out that some other interpretive
or diagnostic specialisms within the NHS were not under
the same sort of scrutiny.
Idon’thavethebackgroundinpathology
CCG Director
Thereareissuesaboutleadership,Ihavean
impressionthattheyare,orwere,oldfashioned,
commandandcontrolsortsoforganisation
CCG Director of Quality
They’reinthebowelsofthehospitalandnot
onanybody’sradar
CCG Director of Quality
Wehavedonealittlebitofworkbutlargelyit
isn’tadepartmentthatcreatesalotofnoise
forus-takethatasabarometer
CCG Director of Quality
Wearenotdirectlymonitoringanyindicators
aroundqualitybutwewouldpickuponincidents
thoughthenormalchannels…andwewouldhear
concernsfromourGPsiftherewereany
CCG Director of Quality
Pathologyservicesareoneservicelinewithin
largeacutecontractanddiscussionswithin
thisonlyhappensiftheserviceisunderpressure
Contract Manager
Itrytorememberthatthesesamplesareall
somebody’sbloodorwhateverattheendof
thedayandtheyneedmetodomybestforthem
Lab Manager
Theveryfactthatthereisareviewofquality
assurancewithinpathologyimpliesthatthere
issomekindofshortfallandIamnotsurethatthat
istrueorhelpful
Clinician
8.2: Issues concerned with the
breadth of activity covered by
pathology
Professionals working within pathology services will
realise that they are high volume organisations with
huge turnover for some tests. Pathology departments
throughout the country are experiencing high demand:
an increase of between 5% and 10% in overall workload
was the oft-quoted figure, and none of the respondents
disagreed. The second and less easily quantified issue
for QA is the breadth of activity within these services.
Activity ranges from almost industrial scale, highly
mechanised procedures to labour intensive interpretive
diagnoses of individual specimens. All of these activities
need to be covered by the same quality management
processes. The same measures may not be meaningful for
every discipline. This may be why some commissioners
have limited their monitoring of pathology services to
turn around times (see section below “Paradigm shift to
commissioning led service”).
This leads to another theme that arose from our
interviews. We would often lead into discussions by
asking what “quality” means in respect of a pathology
service and we were intrigued to find a wide variety of
responses. We would always expect to have different
degrees of interpretation or emphasis but we found real,
perhaps even fundamental, differences.
The spectrum (even across professionals working for
the same organisation) ran from definitions that might
be characterised as the textbook answer “a service that
is fit for purpose” or some variation on what we might
characterise as the Darzi definition of being safe,
effective and a positive patient experience, through "good record keeping",
“Passing our CPA” to the more visceral “getting things
right for patients”.
The usual response tended to include monitoring of
standards and responding to incidents.
We detected some difference in understanding of the
meaning of “quality” between individuals who are
quality management professionals and others (including
clinicians). We sensed that there may be a detectable
difference in approach between quality managers who
have emerged from within the NHS and those who have
come in from quality assurance roles in other industries.
Making general statements is a risk with such a small,
self-selecting sample, and it may well be that a
service which decides to recruit a full-time quality manager,
rather than handing the quality brief to individuals with
other responsibilities within the department, already
has a different corporate approach to QA. However
we sensed quite strongly that there were differences
in approach taken by departments with quality
management professionals with some independence (i.e.
whose line management came from either high up within
the department or even outside) and those who were
managing the quality brief within a portfolio of other
responsibilities.
We also found that the most persuasive regimes for QA
were those services where the quality managers adopted
a facilitative role of persuading and enabling, and if
necessary cajoling, front line staff to use and drive the
quality management systems, recording and following
up incidents appropriately, performing some audits and
taking part in others.
We came across some tension between clinicians and
quality managers and there were some stories of how
there had been considerable resistance from clinicians to
some of the systems. There was general agreement that
these issues were largely in the past and divisions had
been the result of a lack of understanding on both sides. It
was felt that, in the main, clinicians now understand and
appreciate (beyond seeing them as a necessary evil) the QA
systems in their departments.
There were two comments within different pathology
services where clinicians felt that they were
disenfranchised from the processes dealing with
incidents. This was challenged by their quality / risk
management colleagues who felt that there was a
disinclination for clinicians to get involved in any such
follow-ups. One individual clinician felt that the Trust’s
risk management professionals questioned his/her
judgement as to what was a serious incident or near miss.
The disconnect between clinicians and the quality
management systems within a service was highlighted by
a senior clinician:
"There is no shared definition of what a high quality pathology service is or how to measure it and it is therefore extremely difficult to aspire to that level of service. Is there a difference of emphasis between professionals from differing backgrounds?"
Pathologist

"I can make a diagnosis that is done in the required timescale at the right time for the right patient and so on and is therefore of 'high quality' but there is nothing about whether this is the correct diagnosis"
Clinical Pathologist
Unpicking this concern with other clinicians in other
services highlighted the role of the multi-disciplinary
team (MDT) in sense checking and challenging marginal
interpretive decisions. This resource intensive activity
is very much part of the QA activity of a pathology
service but is difficult to quantify. One or two pathology
departments had started to consider how this could be
encapsulated in their performance measures.
One pathology department has a system of the receiving
clinician acknowledging that they have received the test
result electronically from pathology and that they have
used it in their decision about the patient’s care. This was
met with resistance initially across the Trust but is being
trialled as a quality measure.
8.3: Objective assessments and
benchmarking of quality
One of the opportunities afforded by a review conducted
by researchers outside of the pathology field was to
explore different understandings of some of the most
basic assumptions around quality. A key theme to emerge
for us was how professionals view the accreditation of
their service by Clinical Pathology Accreditation Ltd
(CPA) in their working lives.
Firstly, there seemed to be some divergence about its
status.
There was some mention of one or two labs that aren’t
accredited although nobody was able to identify these.
However the status is (presumably) fairly clear for those
who need to know. What was much more interesting
for us was how views about the CPA and the process of
accreditation varied between professionals. Views were not split along recognisable lines (by job role or experience of other environments).
Even the most vociferous of detractors admitted that
the way in which accreditation had been organised over
recent years had improved significantly. There was a feeling that objective standards had been needed in the past, when the process was less systematised and there was a lack of consistency between different assessors. There seemed to have been cases of individual assessors seeking confirmation that processes were to their own personal liking rather than to objective standards.
There seemed to be a general acceptance of the new
timetables and an understanding that the moves towards
ISO type standards were beneficial.
More than one of our sample group of providers had
recently undergone an assessment and expressed how
onerous they had found it. Whilst they were content to
have passed, there was a definite sense that a significant
proportion of the evidence collected was of more use to
the assessor than to the organisation being assessed.
Is there a sense that the CPA becomes an end in itself?
It is probably worth recording that many of the
participants were in fact CPA assessors.
There was a great deal of discussion about how useful
CPA is internally. There was a sense for many that
this was at least a benchmark for their organisation.
However we would also suggest that some of the
standards represented a target to be achieved rather
than a minimum standard for their service. We had a
limited sense that services found the process useful for
improving their own QMS. CPA is considered by some
pathology departments to be the end point in a long
process of internal and external QA.
We did get a strong sense of how fundamental the CPA
is for commissioning organisations. It is seen as a kite
mark and thus shorthand for a great deal of information
regarding quality as far as commissioners are concerned.
Alongside CPA there could be information about how
a lab participates in external quality assessment (EQA)
schemes and whether concerns have been raised. The
standard of EQA schemes was questioned; there was felt to be little consistency across them, yet labs are free
“It’s voluntary but compulsory if you know what I mean”
Quality Manager
“Brilliant, professional and clear”
Quality Manager
“Utter rubbish – gets in the way of quality”
Quality Manager
“CPA and incident reporting is mandatory”
Clinician
“I don’t imagine that you could be a lab working for the NHS without it”
Lab Manager
“It is possible that getting a good assessment demonstrates that you are good at assessments rather than a high quality pathology service”
Quality Manager
to choose. Perhaps there should be a limited number of
approved EQA schemes. Our understanding is that when
a problem has been spotted via EQA a referral letter is
sent to the Trust chief executive and if the problem is not
addressed a second letter is sent. CCGs could ask about
these communications but currently a system to do this
does not exist.
Another issue brought to our attention is the lack
of agreement on performance standards for test
specification. The NHS does not specify performance
levels, for example error or precision levels for specific
tests. Again this was not something raised by the
commissioners.
In terms of performance standards, some of the RCPath KPIs were in use, with some departments viewing these as minimum levels of service and others as targets. It is
understood that these are not finalised. The pathology
departments measured turnaround times, staff training
and development and user surveys. Some identified the
attendance of pathologists at MDT meetings as a quality
measure whilst others focused on patient complaints.
9: Quality systems and learning
Arrangements for incident reporting and follow up
actions within pathology departments appear to be
robust. However there was variation as to the risk and
governance reporting procedures within the Trusts and
the significance given to pathology errors and incidents.
Typically a locally developed or off-the-shelf database
(QPulse being the most commonly cited) was used to
report and track all errors within pathology. These
would be graded by laboratory managers and /or quality
managers and entered onto the Trust risk reporting
system (usually Datix). A summary of practices found is
given in Appendix 3.
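The grading step described above is typically based on a consequence-by-likelihood risk matrix of the kind common in NHS risk registers. A minimal sketch follows; the thresholds and labels are invented for illustration and are not taken from Datix, QPulse or any particular Trust policy:

```python
def grade_incident(consequence: int, likelihood: int) -> str:
    """Grade an incident on a generic 5x5 risk matrix.
    consequence and likelihood are each scored 1 (lowest) to 5 (highest)."""
    if not (1 <= consequence <= 5 and 1 <= likelihood <= 5):
        raise ValueError("scores must be between 1 and 5")
    score = consequence * likelihood
    if score >= 15:
        return "extreme"   # would be escalated, e.g. to the Trust patient-safety lead
    if score >= 8:
        return "high"
    if score >= 4:
        return "moderate"
    return "low"           # typically handled within the department
```

The point of a shared matrix is that a grade assigned by a laboratory manager and one assigned by a Trust risk team are comparable.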
9.1: Some of the practices that
were identified:
•	 In the case of clinical or critical incidents, non-
conformity is raised on the laboratory quality
management system. The quality manager liaises
with the clinical lead, pathology manager and the
operational manager of the particular area of the
Trust. The team decides on the severity and coding
of the incident. This is progressed to the Trust lead
for patient safety who takes a final decision on the
seriousness of the incident, further investigations and
any remedial actions required.
•	 Staff record errors and team managers score these.
The pathology committee meets monthly to discuss
complaints and incidents. The Trust risk manager
receives daily information on adverse events where
there is a serious impact on patient safety. The Trust
clinical governance team has facilitators to work with
each clinical area where support or assistance to take
action is required.
•	 A local Excel system is used to record errors and an internal log identifies investigations; the Datix system is then used by the service manager and deputies to grade incidents. Any adverse incidents are
investigated internally to analyse for root/cause and
for corrective/ preventive actions to be determined.
Summary reports are presented at senior pathology
management meetings attended by heads of
departments and the pathology risk manager. Any
shared learning outcomes are provided for action
and relevant issues would be escalated appropriately
to governance/ risk management. Datix trend data
is analysed; this feeds into risk management for
discussion/ action at divisional or executive level. The
Trust quality manager has governance overview and
reports serious untoward incidents (SUI) to the Trust
Board via six monthly assurance reports. A sample of
low-level incidents is examined, including the actions taken, to check for patterns. If no action is identified, these are escalated.
•	 Use of own database for adverse events. Datix is used
Trust-wide for incident reporting, actions reviewed
by departmental managers. Quality managers provide
number of incidents, those at medium and higher risk
levels are tabled in the Trust risk register. Monthly
quality meetings are held with quality leads, each
department in pathology reports and trends are
monitored. This is a well-established system.
•	 The Datix “investigating officer” role has open entry; some incidents are escalated to root-cause analysis and SUI status, though most are not serious. The Datix manager escalates incidents and completes the national upload, and the risk management team compares local and national data.
•	 Near misses are reported on a local system; the laboratory manager grades errors, conducts root-cause analysis and reports to the risk team. Investigations are closed but there is no follow-up at three or six months.
Pathologist raises concerns via clinical director route.
As investigator, the laboratory manager sees the
errors reported by staff and conducts the root/cause
analysis. The reports go to the Trust’s risk team but
no follow-up is received back by pathology as these
incidents are regarded as internal to pathology and
not serious enough.
•	 Pathology system designates clinical incidents when
there is a delay or incorrect test ordered. Incident
reporting may show a concern regarding staff skills
or numbers, however it was unclear that action was
taken on the latter.
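Several of the practices above involve sampling low-level incidents and escalating those with no recorded corrective action. A minimal sketch of that check, assuming a simple record layout with `grade` and `actions` fields (both hypothetical names, not taken from any of the systems mentioned):

```python
import random

def sample_for_review(incidents, k=10, seed=None):
    """Pick a random sample of low-level incidents for pattern review."""
    low = [i for i in incidents if i.get("grade") == "low"]
    random.Random(seed).shuffle(low)
    return low[:k]

def needs_escalation(incident):
    """Escalate when no corrective action has been recorded against the incident."""
    return not incident.get("actions")
```

Even a simple routine like this closes one of the gaps noted above: low-level incidents regarded as “internal to pathology” still get a second look.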
9.2: Sharing lessons within and
outside the organisation
In order to communicate the outcome of recording
and investigating pathology incidents, the majority of
departments visited had well-established communication
mechanisms, usually via team meetings and quality
groups.
“They don’t get fully investigated, or lead to robust procedural changes”
Clinical Pathologist
“Quality is on every staff meeting, all staff are involved in audits”
Quality Manager
Many of the sites visited described a series of meetings
internal to the pathology department and then wider
within the Trust. In terms of clinician engagement, one
Trust has a laboratory management committee with
all consultants and senior scientists attending. The risk
management and quality management groups feed into
this.
It was not clear how often pathology staff learnt about
patient safety with other departments within the
organisation. One Trust has a “learning by improvement”
group of operational managers and clinicians who meet
quarterly and discuss patient safety incidents.
Governance arrangements in the provider organisations
clearly varied. One Trust placed incidents falling between low-level ones managed locally and SUIs into a grey area of “serious incident action review”. Weekly meetings
are held across the Trust as part of corporate governance
structures to present new incidents and conduct root/
cause analysis.
The role of a quality co-ordinator was evident at a
couple of sites. This person provided a link between the
laboratory managers and the clinicians especially for
closing the loop on feedback, continual improvement
and learning. This may be an alternative model especially
for large multi-site pathology services to ensure effective
feedback to all staff, and manage attendance at meetings
alongside other demands on staff time.
In a couple of places regional quality manager groups
exist and have been the forum to share and learn from
each other’s mistakes. Their purpose is to drive quality
through learning plus accreditation.
As was indicated to us, some of the reorganised
pathology services had found other providers more
guarded about sharing information for benchmarking
purposes. Clearly the quality agenda benefits from
learning within and across organisations.
One method to enable this may be to revitalise
professionally-led clinical groups to work on
standardisation of protocols and performance measures.
9.3: Maintaining (improving)
Quality Assurance during times of
structural change
Many of the pathology provider organisations with
whom we engaged were undergoing some structural
changes, had recently done so or were expecting changes
in the near future. Three of the providers were affected by
the Transforming Pathology Project (TPP). Several were
involved in mergers with other Trusts’ services as well as
forming partnerships with private sector providers.
In fact those services that were not undergoing or
contemplating significant change were in the minority.
The few services that were not undergoing change felt
that they had made the business case for the status quo
not least because they were an income generator for their
parent Trust.
This was echoed by the relevant commissioners.
We are not in a position to be definitive but we had
a strong impression of volatility within the services
undergoing change and we perceived that in many
of them the focus was on change rather than service
delivery. The impression given is that change was or
had been being driven almost entirely by economic
(efficiency) arguments. Benefits to the service delivery
or improvements in quality were not at the forefront of
any discussions that we had. This was true for the service
providers and for the commissioners to whom we spoke,
although even in those areas taking part in TPP we
did not get the impression that CCG quality directors
“More sharing previously, guarded conversations now, for example we will discuss a patient safety topic but how it is addressed and financial implications are not shared”
Service Manager
“Communicated to all staff through section meetings and management meeting”
Quality Manager
“Risk management has a high level of consultant involvement at this Trust”
Quality Manager
“Pathology has culture of using errors and addressing non-conformity as learning”
Risk Manager
“The service is wrapped up in block contract and that is largely why it is such good value for us”
CCG Director of Quality
“When we investigated the services we were commissioning we realised that we weren’t paying for some services we were receiving and so in fact the exercise cost us money.”
CCG Director
were intimately involved in the negotiations around the
tendering processes for pathology.
In general almost every professional with whom we
engaged could see the business case for merging smaller
laboratories with low volumes and a less interesting
mix of work. There are clearly critical mass arguments
for many procedures. There may be a case for merging
or taking over failing or struggling services (we were
led to believe that there were a number of smaller more
isolated laboratories around the country that did not
have a sufficient turnover to maintain the very highest
standards) but this quality dimension did not seem to be
driving the service redesign process.
We visited a large multi-site pathology service that had
been created as the result of several labs merging into
one management structure that included some private
sector input. The process was relatively mature yet the
individual laboratories still seemed to operate their own
QMS and standard operating procedures (SOPs). The
service was delivering more tests and anecdotally had
reduced costs. Another site visited had merged two
years previously and had managed to maintain a fairly
stable workforce and reported achieving cost savings
whilst increasing quality. The clinical and service leads
all reported to a joint board with clear incident reporting
arrangements in place.
We visited another service that was in the early stages of
merger. The change manager was very keen to explain
that the aspiration was to harmonise all of the various
sites to ensure a common QA process as well as SOPs.
There remain for us some serious questions about
maintaining quality standards through structural change
and we wonder whether there is a good evidence base for
improving quality through mergers. Certainly the need
for planned change seemed clear.
9.4: Maintaining Quality in times
of organisational stress
Many of the laboratories visited spoke vividly about the
increase in workload that they had experienced. Some
managers described how they felt under pressure to
deliver more with diminishing staff resources.
We sensed in several services real strain on teams with
challenges such as increasing workload, structural
change, a (perceived) reduction in resources, high
staff turnover, use of locums, reduction of skill level
(i.e. recruiting key personnel on lower bands within
laboratory situations).
We don’t want to paint a picture of services in crisis nor
of professional discontent, however we did get a sense
that pressure was mounting. QMS may have been the
first casualties of labs experiencing stress.
We do not want to give the impression that these professionals were bemoaning their lot; many of these open discussions were conducted with a great deal of humour. However we certainly detected an amount of
stress within many services. There were more than a
few indications that some of the day-to-day adherence
to QA procedures including internal audits, continuing
professional development, staff training, and less essential
administration were the first casualties of this stress.
What slips straight away because you have your head
down concentrating on the samples are things that are
required for CPA, to prove you’ve got a quality service, so
staff appraisals might not occur when they are scheduled,
competency assessments the same (you know the person
is doing it but the actual paperwork might not get done),
“I have to fight every time somebody leaves to make the case for replacing that person with somebody on the same grade. I lose more arguments than I win”
Service Manager
“With vacancy control that is our standard conversation, any time we have a vacancy it is ‘does it have to be at that grade? What are you going to deliver more of?’”
Lab Manager
“Our workload has ramped – certainly over the last 12 to 18 months we have had to shore up the department, lots of staff being offered overtime, I’ve been doing overtime, it’s difficult to keep disciplined”
Lab Manager
“Hardly a day goes by when I don’t have to put the white coat on and get on the bench”
Lab Manager
“Often our teams literally don’t know what they are going to be doing until we see what comes in, it makes planning extremely difficult.”
Lab Manager
10: Scope of pathology services’
ability to control quality on the
whole pathway
One issue that we came across several times was
something that is universal to any pathology service
but is worth noting when looking at QA from a wider
perspective.
The theme that arose in many of our discussions was the
extent to which the delivery of a high quality service (in
terms of, say, right result, right patient at the right time)
is dependent on some factors upstream and downstream
of the service that are not in their control. When it comes
to taking the samples, their collection and transport and
the distribution of the results a pathology service has
limited ability to drive quality.
Anecdotally the majority of sub optimal delivery
incidents we came across were as a result of incorrect
(or incomplete) labelling or a failure of communication
between the service and the healthcare professionals who
were dealing directly with patients.
Where services have capacity to investigate, analyse
and ameliorate shortcomings in the taking of samples
they did so. There were some excellent examples of
good practice including one service dealing with blood
samples that had the capacity to “go out” to wards and
practices targeting outliers. One individual concerned
was seen as an advocate for those professionals on the
wards or out in the practices who were unable to meet
the standards and delivered support and training as
appropriate.
The induction of new medical staff joining Trusts
or practices can also be a key point for pathology to
influence if they have the capacity to do so. At several
Trusts the clinical leads in pathology were involved in
presenting information about ordering tests and seeking
advice at induction sessions.
There remains some tension generated by the aspiration
to have every sample presented properly. Several
laboratories were considering a “zero tolerance approach”
to mislabelling. On the other side of the coin we picked
up some themes whereby professionals working within
the laboratories were less than willing to reject samples
or to record poor performance since they felt that this
was tantamount to accusing fellow professionals (some
of whom are working under a great deal of pressure)
of incompetence. We sensed that whilst the guidance
may be clear for senior quality managers, some level of
discretion is being used “on the front line”.
We imagine that very few quality managers are in a
position to systematically analyse outliers in terms of
incident reporting.
One clinician described QA as
The standing of the pathology department within the
organisation may be key to the balance between assisting
users of their service to improve quality versus refusing
to do the “incorrect” work, log it as an error and report
it at a meeting. It was unclear what the tipping point for
this balance might be, for example whether it depended
upon the seriousness of the error, whether it was a
audits might slip… you might not be able to release people for training
Lab Manager
Whilst there is no suggestion that a necessary condition for
QA requires a stable workforce (we did not have the time
to get anything other than an impression) there was an
understandable desire from providers to have a motivated,
properly trained and dedicated teams. More than one of
the commissioners to whom we spoke expressed an interest
in using some measure of staff volatility and training as
a proxy indicator for quality within the provider. As we
discuss below, the gap between the aspiration of some
commissioners to monitor meaningful data about QA and
their actual capacity or willingness to do so is quite wide.
“Requiring a system where staff are comfortable reporting mistakes and learning from them”
Clinical Pathologist
“repeat offender”, or the individuals’ personalities and
workload. Does the Trust size and culture influence this?
One site within a merged service had a zero-tolerance approach to errors that differed from its partners’, and had reached a combined team approach after a year working through a clinical effectiveness policy.
This is an important consideration when bringing
departments together.
10.1: Working with
Commissioners to manage
demand
There were a couple of examples where the pathology
services worked with their lead commissioner to
proactively manage demand. At one place the workload
referred by the GPs was examined via the electronic
ordering system, any inappropriate test requests
identified and the number and type of tests repeated
too soon recorded. The consultant pathologist, with
commissioner support, had translated the results into an
online tutorial for GPs.
10.2: Restrictions imposed by IT
systems
Communication difficulties between IT systems used
internally and externally (including those between
merging organisations) might limit effectiveness of QMS.
Some of the pathology departments had developed
local databases to record incidents and errors identified
by their staff. In a few places paper records were still
maintained; others were fully computerised.
The advantage of an electronic central system to hold the
operating procedures is that they can be updated easily
and only the latest copy is available to staff, even when
working across more than one site.
Where pathology services were combined across more
than one organisation (for example a number of Trusts
and/or central hub laboratory) IT systems were not
always integrated. Datix completion was handled by one
or several individuals at team leader or manager position,
usually from within pathology. Datix systems can be
locally adapted and therefore combining the information
from two Datix systems is not as straightforward as it
may seem.
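One reason combining two locally adapted Datix exports is not straightforward is that each site’s local category and severity codes must first be mapped onto a shared scheme. An illustrative sketch, with wholly invented code tables (no real Datix export format is assumed):

```python
# Hypothetical local severity code tables for two sites of a merged service.
SITE_A_SEVERITY = {"G1": "low", "G2": "moderate", "G3": "high", "G4": "extreme"}
SITE_B_SEVERITY = {"minor": "low", "significant": "moderate",
                   "major": "high", "catastrophic": "extreme"}

def normalise(record, code_table):
    """Translate one incident record's local severity code to the shared scale."""
    out = dict(record)
    out["severity"] = code_table[record["severity"]]
    return out

def merge_incident_feeds(site_a, site_b):
    """Combine two sites' incident feeds onto one comparable severity scale."""
    return ([normalise(r, SITE_A_SEVERITY) for r in site_a]
            + [normalise(r, SITE_B_SEVERITY) for r in site_b])
```

Only after this kind of normalisation can trend analysis across the merged service be meaningful.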
In the case of partnership between NHS Trusts and
private providers of laboratory services, new integrated
IT systems may be offered to the respective laboratories.
Apart from the transition from old systems across to
new, integration is seen as an advantage as long as the
respective Trust reporting requirements can be met.
Clearly an ideal scenario is that tests are ordered
electronically from the community, GPs in their
surgeries, and from other clinical staff within the
hospital. Results are also sent back electronically and
stored in electronic patient records. Evidence of this
practice was found in only a couple of sites.
For one commissioner the adoption of the electronic
ordering of tests by the GPs was the number one priority.
However a different opinion was that the electronic
system somehow lost one of the “sense checks” in the
transfer of samples.
11: Paradigm shift to
commissioning led service
All of the pathology services with whom we spoke were
making efforts to engage with colleagues in the primary
care sector whom they saw as their users, in the main
GPs. In one area an individual GP with a particular
interest in pathology had been key to some innovative
engagement and service developments. However this
seemed to be the exception rather than the rule and
whilst one or two managers had some capacity to “go out
to the practices” communication other than discussions
of individual problems was in the main restricted to
e-mail cascades and annual surveys.
We heard in many services a sense of exasperation that,
when surveyed, GPs tended to want the same service that
they had now but could the pick-ups be later or maybe
slightly quicker? Several laboratories admitted that they
would welcome a much more interactive relationship
and felt that if there were innovations such as new tests,
changes in technique or different services (perhaps
additional help with diagnosis) these were always driven
by the service rather than the users of the service.
Furthermore, in discussions with quality professionals on the commissioning side we noted time and time again that there were very few measures highlighted in the contracts.
In general the scrutiny of pathology services by CCGs
seemed to be light touch.
Professionals on the commissioning side were confident
that were there problems within pathology services they
would be made aware by the service or the parent Trust
through their normal modes of communication.
In truth almost everybody to whom we spoke was
managing vast block contracts in which the pathology
service was represented by a very few clauses.
As discussed elsewhere within this report we were given
the very strong impression almost everywhere that the
evaluation and monitoring of pathology services was not
a high priority for most CCGs.
When we discussed with CCG quality directors the sorts
of quality indicators that they would like to see in the
future, the recurring themes were:
•	 Turnaround times (we accept that turnaround times are easy to quantify and are a measure of proficiency, especially in acute and emergency care settings; however we wonder whether they are the be-all and end-all for assessing quality in pathology for primary care providers)
•	 Appropriate accreditation
•	 Some metrics around the training, retention and
development of staff
•	 Incident reporting and investigation
•	 Responsiveness to enquiries from GP colleagues
•	 A couple of commissioners did identify that some
attempt at managing demand through reducing
inappropriate referrals could be measured, especially
where electronic ordering is in place
•	 The availability of pathology staff for advice and
guidance was mentioned both by providers and
commissioners as a key quality indicator although
much more difficult to measure other than as a
process e.g. number of meetings attended, out of
hours contact information. This may be a fruitful area
for commissioners and providers to work together.
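Of the indicators above, turnaround time (TAT) is the most readily computed. A sketch of how a percentile target might be checked from request and report timestamps; the percentile convention used here is one simple choice among several, and the data layout is assumed:

```python
from datetime import datetime

def tat_percentile(orders, pct=95):
    """pct-th percentile of turnaround time in hours.
    orders is a sequence of (ordered_at, reported_at) datetime pairs."""
    tats = sorted((reported - ordered).total_seconds() / 3600
                  for ordered, reported in orders)
    if not tats:
        raise ValueError("no orders to measure")
    # Nearest-rank percentile: index of the pct-th ranked value.
    idx = max(0, int(round(pct / 100 * len(tats))) - 1)
    return tats[idx]
```

A commissioner could then compare, say, the 95th-percentile TAT against a contractual limit rather than relying on averages, which hide the slowest cases.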
“The spec is very low – we provide evidence of turnround times (TAT) for [certain types of] screening that is within contractual limits and also we provide information for sexual health screening for national surveillance (CTAD), but the latter is not a quality indicator. There is no requirement for CPA accreditation or anything else specified.”
Pathology Service Manager
“We don’t actually ask for very much, only really something about speed of returning results and to be honest we haven’t been monitoring that very closely”
CCG Contract Manager
“We assume that we would hear from the trust if the service was not performing adequately, and we would hear through the normal channels or from our GP colleagues if there was a problem”
CCG Quality Director
“There is a lack of standardisation in commissioning, block contracts will allow activity contracts to develop”
CCG Quality Director
Appendix 1: Survey Results
Question Response
Are you aware of Quality of Service (QoS) policies at your laboratory in
respect of pathology error coding?
37 aware, 22 not aware
Are you aware of how pathology error coding is reported within the
Trust?
50 aware, 9 not aware
Are you aware of any established processes for sharing experiences of
pathology error coding with colleagues outside the Trust, perhaps at
other hospitals?
48 aware, 7 not aware,
4 no answer
Are you aware of the risk manager’s role in respect of pathology
services?
49 aware, 10 not aware
Table 1: Survey returns
Response given Number Examples and issues raised
Comprehensive approach 13
Trust-wide incidents via Datix with pathology element: reports reviewed three-monthly by pathology quality manager looking for trends. Report goes back to Trust risk management.
Reviewed by the quality manager for laboratory medicine with the business group quality manager at a QA meeting that has risk management team input
Risk management 10
All errors are reported and are part of the corporate risk register
Errors in our place of work are directly reported into the Trust’s risk management system. They receive a severity grading, and are reviewed at departmental risk management meetings.
Governance structure 10
Incidents and resolution are discussed on a monthly basis at the pathology
clinical governance (CG) meeting
Pathology CG committee reports to the Trust CG committee
Quality Management System 7
Minor errors and near-misses are reported via QMS and followed up within
the department
Reported in incident forms and these go through a formal pathway of
investigation and closure that is part of the quality process.
Team / clinical input 4
Quality manager liaises with the clinical lead, pathology manager and the manager of the particular area of the Trust. The team decides on the severity and coding of the incident.
Risks are reviewed at monthly pathology meetings of senior clinical and technical staff and are also reviewed monthly within the clinical division of the Trust.
There is a risk structure but this structure mainly consists of “managers” with minimal clinical input. Clinical input is sought “as and when needed” but the main question is how will the managers automatically know unless clinicians are an integral part of this group?
Patient Safety 2
Error reports are passed to patient safety department who coordinate,
investigate and escalate as appropriate.
Trust incident reporting 2
If an error has escaped pathology, i.e. impacted on another area or patient, it would be reported on the Trust incident reporting system and investigated as a Trust incident.
Via Datix, no further information given 9
All pathology errors are reported on Datix
Not aware of action taken 2 out of 59
De-escalated and sidelined within the clinical error investigation/reporting
system because a manager outside pathology decides.
It is not clear. Pathology department not receiving feedback on remedial
measures for serious incidents reported.
Table 2: Categorisation of answers given to the question “How do error reports at your place of work link in with the Trust risk management system?”
Appendix 2: Topic Guides
Discussion guide for providers:
Icebreaker – check equipment etc.; who are you / job title / job role. Cross check to records and amend if needed.
Recap that this is entirely confidential and anonymous.
Fr3dom have never and will never release transcripts or
identifiable responses to anybody.
In response to the Francis Report the
Royal College of Pathologists said:
“We continue our work to ensure that pathology services
and their quality systems are fit for purpose in a health
service which has to manage with a restricted budget.”
Q1.	 How do you respond to this statement?
Q2.	 In general how do you (your organisation) monitor
the quality of any services that are delivered for your
patients? Might be omitted
Q3.	 In your own words what do you think is meant by
quality assurance in terms of a pathology service?
Q4. 	 What quality assurance clauses have you got in your
agreements with your commissioners?
Q5.	 Are you clear about which pathology services you
are commissioned to provide and for whom?
Q6.	 Do you know what your responsibilities are with
regard to the quality of these services?
Q7.	 Please can you explain what this means in practice?
Q8.	 In terms of any errors are you aware of what is and is
not reported locally? Do you know why these criteria
(for what is and is not reported) are in place? Should
you?
Q9.	 Similarly are you aware of what is and is not reported
nationally and why? Should you be?
Q10.	 Do you know what system is being used to report
these errors? And how? Who is responsible for this
data entry? Who uses STEIS/Datix (or other systems
such as NRLS and MHRA) respectively, and what
for?
Q11.	 What local arrangements are in place for learning
from errors or indeed any problems and incidents
identified, whether or not they led to problems?
Q12.	 Can you identify any changes that have been made
locally in response to this process?
Q13.	 What effect do these clauses have on the delivery
of the service, on a daily basis or from a strategic
perspective?
Discussion Guide for Commissioners:
Icebreaker – check equipment etc.; who are you / job title / job
role. Cross-check to records and amend if needed. Recap
that this is entirely confidential and anonymous. Fr3dom
have never and will never release transcripts or identifiable
responses to anybody.
In response to the Francis Report the
Royal College of Pathologists said:
“We continue our work to ensure that pathology services
and their quality systems are fit for purpose in a health
service which has to manage with a restricted budget.”
Q1. 	 How do you respond to this statement?
Q2.	 In general how do you (your organisation) monitor
the quality of any services that are delivered for your
patients? (Ramp-up question – might be omitted.)
Q3.	 Please can you give an overview of how you go
about commissioning pathology services?
Q4. 	 Are you clear about which pathology services you
commission and for whom?
Q5.	 In your own words what do you think is meant by
quality assurance in terms of a pathology service?
Q6.	 What quality assurance clauses have you got in
your agreements with your providers?
Q7.	 Can you please provide evidence of this? (should
have been prepared in advance – this will be the
document or pages referencing QoS from their
contracts)
Q8.	 What effect do these clauses have on the delivery
of the service, on a daily basis or from a strategic
perspective? (Investigate the extent to which they are
familiar with national directives etc.)
Q9.	 Can you provide details of the way you actually
measure QoS of providers?
Q10.	 Do you know what your responsibilities are with
regard to the quality of these services?
Q11.	 Please can you explain what this means in practice?
Q12.	 In terms of any error coding, are you aware of what
is and is not reported locally? Do you know why
these criteria (for what is and is not reported) are in
place? Should you?
Q13.	 Similarly are you aware of what is and is not reported
nationally and why? Should you be?
Q14.	 Do you know what system is being used to report
these errors? And how? Who is responsible for this
data entry? Who uses STEIS/Datix (or other systems
such as NRLS and MHRA) respectively, and what
for?
Q15.	 What arrangements have you got in place for you
to learn from errors or indeed any problems and
incidents identified, whether or not they led to
problems?
Q16.	 Can you identify any changes that have been made
locally in response to this process?
Appendix 3: Quality
Assurance in Pathology
Each topic / theme is shown with the ideal model, followed by examples seen in practice.

External accreditation
Ideal model: All parts of one pathology service are accredited.
In practice: Different disciplines have their own CPA process; different views on CPA effectiveness; other models coming? Unaware of what happens on linked but separate sites.

Incident / error reporting – department level
Ideal model: All staff trained and use a common electronic system.
In practice: Some paper-based reporting, but entered onto a common system; some issues with access – seniors; refer to leaders. Different local systems but brought together for the department; parallel systems may exist. QPulse used internally in some places.

Staff training in error reporting
Ideal model: All staff trained and use a common electronic system; part of induction and regularly reviewed; "in the bloodstream".
In practice: Some staff require others to complete incident reports. QM or equivalent monitors for outliers in terms of entries. Some top-ups, but mainly doing it is better than training.

Level of errors
Ideal model: Benchmarking possible via Keele.
In practice: High error reporting numbers do not mean a problem site. "Datix" or similar systems seen as a "slap on the wrist" by staff on wards, so completion is poor.

Communication via meetings and reports
Ideal model: Clinical representation and all staff briefed.
In practice: Clinicians not informed of actions taken once in Trust risk reporting. Internal quality meetings very good where they exist.

Feedback mechanisms
Ideal model: Actions reported back.
In practice: Can be via the incident "investigator" not the "reporter".

Trust risk reporting
Ideal model: Clear agreement on how pathology incidents feature in Trust risk reporting, with feedback to pathology.
In practice: Pathology defines an incident as a patient safety event.

Learning from errors
Ideal model: Feedback mechanism via staff groups.
In practice: Trust "Learning by Improvement" group with representation from clinicians and operational managers, including pathology. Some excellent examples; QM usually responsible for cascading learning.

Responsibility for quality
Ideal model: Sits with one person with an overview of the service, medical and laboratory staff.
In practice: Full-time quality manager covering a number of sites drives quality through systems, common documents, equipment etc. Clinical Director has responsibility, as does each laboratory manager, for quality aspects. Where does the "QM" sit?

Quality culture
Ideal model: Openness of the department and across the whole organisation.
In practice: CPD requirement for all staff to receive quality training.

Audit function
Ideal model: Team drawn from existing staff; a key part of the role and a professional opportunity for all.
In practice: Only a sample of staff work as auditors. Part of normal working where it's good – the sense is that this is one of the first casualties of work pressure.

Managing demand from hospital
Ideal model: Items specified and reimbursed by CCGs.
In practice: Educate junior doctors and new staff arrivals.

Test duplication
Ideal model: Where hospitals refer work on to another pathology department, tests should be streamlined.
In practice: Some duplication from DGH to specialist centres.
Pathology Quality Review
Review undertaken by Fr3dom Health for
and on behalf of The Barnes Review
Fr3dom Health Solutions Limited | Fr3dom House | 11 North Street | Portslade | E Sussex | BN41 1DH | www.fr3domhealth.co.uk
2013
Market-leading engagement strategies
Sustainable quality improvement
Creative patient & public involvement
Specialist public health review team
Mais conteúdo relacionado

Mais procurados

Pillars of Quality : An Overview of NABH - Dr. A.M Joglekar at Knowledge Seri...
Pillars of Quality : An Overview of NABH - Dr. A.M Joglekar at Knowledge Seri...Pillars of Quality : An Overview of NABH - Dr. A.M Joglekar at Knowledge Seri...
Pillars of Quality : An Overview of NABH - Dr. A.M Joglekar at Knowledge Seri...Hosmac India Pvt Ltd
 
Clinical audit program- A feeder and a model for the nation
Clinical audit program- A feeder and a model for the nationClinical audit program- A feeder and a model for the nation
Clinical audit program- A feeder and a model for the nationLallu Joseph
 
QualityCare4check list_lec10_feb2022_hosp_diploma_dr_hatem_el_bitar_apa.pdf
QualityCare4check list_lec10_feb2022_hosp_diploma_dr_hatem_el_bitar_apa.pdfQualityCare4check list_lec10_feb2022_hosp_diploma_dr_hatem_el_bitar_apa.pdf
QualityCare4check list_lec10_feb2022_hosp_diploma_dr_hatem_el_bitar_apa.pdfDRHatem ELbitar
 
Emr And Economic Stimulus
Emr And Economic StimulusEmr And Economic Stimulus
Emr And Economic Stimulusiternalnetworks
 
Clinical audit
Clinical audit Clinical audit
Clinical audit faheta
 
Fundamental principle of qa projects
Fundamental principle of qa projectsFundamental principle of qa projects
Fundamental principle of qa projectsLee Oi Wah
 
Barriers to implementation of nabh standards with intent and spirit- lallu j...
Barriers to implementation of nabh standards  with intent and spirit- lallu j...Barriers to implementation of nabh standards  with intent and spirit- lallu j...
Barriers to implementation of nabh standards with intent and spirit- lallu j...Lallu Joseph
 
Clinical Practice Guidelines / Pathways as a Strategy / Tool for Hospital Qua...
Clinical Practice Guidelines / Pathways as a Strategy / Tool for Hospital Qua...Clinical Practice Guidelines / Pathways as a Strategy / Tool for Hospital Qua...
Clinical Practice Guidelines / Pathways as a Strategy / Tool for Hospital Qua...Reynaldo Joson
 
Quality assurance in community health nursing
Quality assurance in community health nursingQuality assurance in community health nursing
Quality assurance in community health nursingJobin Jacob
 
Dr Hatem El Bitar 2. tqm history v ip د حاتم البيطار
Dr Hatem El Bitar 2. tqm history v ip د حاتم البيطارDr Hatem El Bitar 2. tqm history v ip د حاتم البيطار
Dr Hatem El Bitar 2. tqm history v ip د حاتم البيطارDRHatem ELbitar
 
Online Workshop: Improving Patient Care Pathways
Online Workshop: Improving Patient Care PathwaysOnline Workshop: Improving Patient Care Pathways
Online Workshop: Improving Patient Care PathwaysSIMUL8 Corporation
 
Quality assurance & monitoring in opd and outreach services
Quality assurance & monitoring in opd and outreach servicesQuality assurance & monitoring in opd and outreach services
Quality assurance & monitoring in opd and outreach serviceslionsleaders
 
Accreditation as a Strategy / Tool for Hospital Quality Service Improvement
Accreditation as a Strategy / Tool for Hospital Quality Service ImprovementAccreditation as a Strategy / Tool for Hospital Quality Service Improvement
Accreditation as a Strategy / Tool for Hospital Quality Service ImprovementReynaldo Joson
 
Journey towards achieving and sustaining quality in teaching hospitals
Journey towards achieving and sustaining quality in teaching hospitalsJourney towards achieving and sustaining quality in teaching hospitals
Journey towards achieving and sustaining quality in teaching hospitalsLallu Joseph
 
PGodfrey_Automation of Utilization Management
PGodfrey_Automation of Utilization ManagementPGodfrey_Automation of Utilization Management
PGodfrey_Automation of Utilization ManagementPaul Godfrey
 

Mais procurados (20)

Pillars of Quality : An Overview of NABH - Dr. A.M Joglekar at Knowledge Seri...
Pillars of Quality : An Overview of NABH - Dr. A.M Joglekar at Knowledge Seri...Pillars of Quality : An Overview of NABH - Dr. A.M Joglekar at Knowledge Seri...
Pillars of Quality : An Overview of NABH - Dr. A.M Joglekar at Knowledge Seri...
 
Clinical audit program- A feeder and a model for the nation
Clinical audit program- A feeder and a model for the nationClinical audit program- A feeder and a model for the nation
Clinical audit program- A feeder and a model for the nation
 
QualityCare4check list_lec10_feb2022_hosp_diploma_dr_hatem_el_bitar_apa.pdf
QualityCare4check list_lec10_feb2022_hosp_diploma_dr_hatem_el_bitar_apa.pdfQualityCare4check list_lec10_feb2022_hosp_diploma_dr_hatem_el_bitar_apa.pdf
QualityCare4check list_lec10_feb2022_hosp_diploma_dr_hatem_el_bitar_apa.pdf
 
Emr And Economic Stimulus
Emr And Economic StimulusEmr And Economic Stimulus
Emr And Economic Stimulus
 
Clinical audit
Clinical audit Clinical audit
Clinical audit
 
Fundamental principle of qa projects
Fundamental principle of qa projectsFundamental principle of qa projects
Fundamental principle of qa projects
 
Barriers to implementation of nabh standards with intent and spirit- lallu j...
Barriers to implementation of nabh standards  with intent and spirit- lallu j...Barriers to implementation of nabh standards  with intent and spirit- lallu j...
Barriers to implementation of nabh standards with intent and spirit- lallu j...
 
Radiologic Technology program powerpoint
Radiologic Technology program powerpointRadiologic Technology program powerpoint
Radiologic Technology program powerpoint
 
Clinical Practice Guidelines / Pathways as a Strategy / Tool for Hospital Qua...
Clinical Practice Guidelines / Pathways as a Strategy / Tool for Hospital Qua...Clinical Practice Guidelines / Pathways as a Strategy / Tool for Hospital Qua...
Clinical Practice Guidelines / Pathways as a Strategy / Tool for Hospital Qua...
 
Quality assurance in community health nursing
Quality assurance in community health nursingQuality assurance in community health nursing
Quality assurance in community health nursing
 
Dr Hatem El Bitar 2. tqm history v ip د حاتم البيطار
Dr Hatem El Bitar 2. tqm history v ip د حاتم البيطارDr Hatem El Bitar 2. tqm history v ip د حاتم البيطار
Dr Hatem El Bitar 2. tqm history v ip د حاتم البيطار
 
Online Workshop: Improving Patient Care Pathways
Online Workshop: Improving Patient Care PathwaysOnline Workshop: Improving Patient Care Pathways
Online Workshop: Improving Patient Care Pathways
 
Quality in health care
Quality in health careQuality in health care
Quality in health care
 
Quality assurance & monitoring in opd and outreach services
Quality assurance & monitoring in opd and outreach servicesQuality assurance & monitoring in opd and outreach services
Quality assurance & monitoring in opd and outreach services
 
Quality assurance
Quality assuranceQuality assurance
Quality assurance
 
Accreditation as a Strategy / Tool for Hospital Quality Service Improvement
Accreditation as a Strategy / Tool for Hospital Quality Service ImprovementAccreditation as a Strategy / Tool for Hospital Quality Service Improvement
Accreditation as a Strategy / Tool for Hospital Quality Service Improvement
 
quality assurance in nursing
quality assurance in nursingquality assurance in nursing
quality assurance in nursing
 
Journey towards achieving and sustaining quality in teaching hospitals
Journey towards achieving and sustaining quality in teaching hospitalsJourney towards achieving and sustaining quality in teaching hospitals
Journey towards achieving and sustaining quality in teaching hospitals
 
NABH Medical Lab Certification Program
NABH Medical Lab Certification ProgramNABH Medical Lab Certification Program
NABH Medical Lab Certification Program
 
PGodfrey_Automation of Utilization Management
PGodfrey_Automation of Utilization ManagementPGodfrey_Automation of Utilization Management
PGodfrey_Automation of Utilization Management
 

Semelhante a FHS-PathologyQualityReview

Importance of medical audit
Importance of medical auditImportance of medical audit
Importance of medical auditalicecarlos1
 
Clinical Assignment Quality Improvement Final Project Goal
Clinical Assignment Quality Improvement Final Project GoalClinical Assignment Quality Improvement Final Project Goal
Clinical Assignment Quality Improvement Final Project GoalWilheminaRossi174
 
Clingov5understandingaudit2003
Clingov5understandingaudit2003Clingov5understandingaudit2003
Clingov5understandingaudit2003Papri Sarkar
 
Quality of nursing
Quality of nursingQuality of nursing
Quality of nursinggusainrahul
 
Quality of nursing
Quality of nursingQuality of nursing
Quality of nursinggusainrahul
 
Quality assurance in nursing
Quality assurance in nursingQuality assurance in nursing
Quality assurance in nursingNamita Batra
 
introductiontojcia-170123210005 (1).pdf
introductiontojcia-170123210005 (1).pdfintroductiontojcia-170123210005 (1).pdf
introductiontojcia-170123210005 (1).pdfMOHAMMED YASER HUSSAIN
 
Introduction to jcia
Introduction to jciaIntroduction to jcia
Introduction to jciaMouad Hourani
 
Annual Results and Impact Evaluation Workshop for RBF - Day One - Paper - Opp...
Annual Results and Impact Evaluation Workshop for RBF - Day One - Paper - Opp...Annual Results and Impact Evaluation Workshop for RBF - Day One - Paper - Opp...
Annual Results and Impact Evaluation Workshop for RBF - Day One - Paper - Opp...RBFHealth
 
Annual Results and Impact Evaluation Workshop for RBF - Day One - Paper - Opp...
Annual Results and Impact Evaluation Workshop for RBF - Day One - Paper - Opp...Annual Results and Impact Evaluation Workshop for RBF - Day One - Paper - Opp...
Annual Results and Impact Evaluation Workshop for RBF - Day One - Paper - Opp...RBFHealth
 
EVALUATION OF PERFORMANCE & QUALITY
EVALUATION OF PERFORMANCE & QUALITY  EVALUATION OF PERFORMANCE & QUALITY
EVALUATION OF PERFORMANCE & QUALITY Sana Saiyed
 
fti_whitepaper_just say yes v3
fti_whitepaper_just say yes v3fti_whitepaper_just say yes v3
fti_whitepaper_just say yes v3Glick, Noah
 
QUALITY ASSURANCE IN HEALTH CARE.ppt
QUALITY ASSURANCE IN HEALTH CARE.pptQUALITY ASSURANCE IN HEALTH CARE.ppt
QUALITY ASSURANCE IN HEALTH CARE.pptS A Tabish
 
QUALITY ASSURANCE IN HEALTH CARE.ppt
QUALITY ASSURANCE IN HEALTH CARE.pptQUALITY ASSURANCE IN HEALTH CARE.ppt
QUALITY ASSURANCE IN HEALTH CARE.pptS A Tabish
 
Running Head INTEGRATED QUALITY AND RISK MANAGEMENT PLAN 1 .docx
Running Head INTEGRATED QUALITY AND RISK MANAGEMENT PLAN 1 .docxRunning Head INTEGRATED QUALITY AND RISK MANAGEMENT PLAN 1 .docx
Running Head INTEGRATED QUALITY AND RISK MANAGEMENT PLAN 1 .docxwlynn1
 
Clinical audit by Dr A. K. Khandelwal
Clinical audit  by Dr A. K. KhandelwalClinical audit  by Dr A. K. Khandelwal
Clinical audit by Dr A. K. KhandelwalDr.Ashok Khandelwal
 

Semelhante a FHS-PathologyQualityReview (20)

Importance of medical audit
Importance of medical auditImportance of medical audit
Importance of medical audit
 
Clinical Assignment Quality Improvement Final Project Goal
Clinical Assignment Quality Improvement Final Project GoalClinical Assignment Quality Improvement Final Project Goal
Clinical Assignment Quality Improvement Final Project Goal
 
Blank clinical audit report template
Blank clinical audit report templateBlank clinical audit report template
Blank clinical audit report template
 
Clingov5understandingaudit2003
Clingov5understandingaudit2003Clingov5understandingaudit2003
Clingov5understandingaudit2003
 
Quality of nursing
Quality of nursingQuality of nursing
Quality of nursing
 
Quality of nursing
Quality of nursingQuality of nursing
Quality of nursing
 
Audit in anaesthesia
Audit in anaesthesiaAudit in anaesthesia
Audit in anaesthesia
 
Quality assurance in nursing
Quality assurance in nursingQuality assurance in nursing
Quality assurance in nursing
 
introductiontojcia-170123210005 (1).pdf
introductiontojcia-170123210005 (1).pdfintroductiontojcia-170123210005 (1).pdf
introductiontojcia-170123210005 (1).pdf
 
Introduction to jcia
Introduction to jciaIntroduction to jcia
Introduction to jcia
 
Annual Results and Impact Evaluation Workshop for RBF - Day One - Paper - Opp...
Annual Results and Impact Evaluation Workshop for RBF - Day One - Paper - Opp...Annual Results and Impact Evaluation Workshop for RBF - Day One - Paper - Opp...
Annual Results and Impact Evaluation Workshop for RBF - Day One - Paper - Opp...
 
NABH Dental Standards
NABH Dental Standards NABH Dental Standards
NABH Dental Standards
 
Annual Results and Impact Evaluation Workshop for RBF - Day One - Paper - Opp...
Annual Results and Impact Evaluation Workshop for RBF - Day One - Paper - Opp...Annual Results and Impact Evaluation Workshop for RBF - Day One - Paper - Opp...
Annual Results and Impact Evaluation Workshop for RBF - Day One - Paper - Opp...
 
EVALUATION OF PERFORMANCE & QUALITY
EVALUATION OF PERFORMANCE & QUALITY  EVALUATION OF PERFORMANCE & QUALITY
EVALUATION OF PERFORMANCE & QUALITY
 
fti_whitepaper_just say yes v3
fti_whitepaper_just say yes v3fti_whitepaper_just say yes v3
fti_whitepaper_just say yes v3
 
QUALITY ASSURANCE IN HEALTH CARE.ppt
QUALITY ASSURANCE IN HEALTH CARE.pptQUALITY ASSURANCE IN HEALTH CARE.ppt
QUALITY ASSURANCE IN HEALTH CARE.ppt
 
QUALITY ASSURANCE IN HEALTH CARE.ppt
QUALITY ASSURANCE IN HEALTH CARE.pptQUALITY ASSURANCE IN HEALTH CARE.ppt
QUALITY ASSURANCE IN HEALTH CARE.ppt
 
Running Head INTEGRATED QUALITY AND RISK MANAGEMENT PLAN 1 .docx
Running Head INTEGRATED QUALITY AND RISK MANAGEMENT PLAN 1 .docxRunning Head INTEGRATED QUALITY AND RISK MANAGEMENT PLAN 1 .docx
Running Head INTEGRATED QUALITY AND RISK MANAGEMENT PLAN 1 .docx
 
quality assurance
quality assurancequality assurance
quality assurance
 
Clinical audit by Dr A. K. Khandelwal
Clinical audit  by Dr A. K. KhandelwalClinical audit  by Dr A. K. Khandelwal
Clinical audit by Dr A. K. Khandelwal
 

FHS-PathologyQualityReview

  • 1. Pathology Quality Review Fr3dom Health Review Team October 2013 2013 Prepared for The Barnes Review Fr3dom Health Solutions Limited | Fr3dom House | 11 North Street | Portslade | E Sussex | BN41 1DH | www.fr3domhealth.co.uk FHS-PathologyQualityReview-v2.indd 1 08/11/2013 17:07:12
  • 2. 2 Contents3 1: Executive Summary 4 2: Introduction 4 3: Background 5 4: Summary of Findings 5 4.1: High level 5 4.2: Objective assessments and benchmarking of quality 5 4.3: Quality systems and learning 5 4.4: Maintaining quality assurance during times of structural change 6 5: Acknowledgements 7 6: Method 8 7: Results 9 8: Emergent Themes 9 8.1: Overall impression of pathology services 10 8.2: Issues concerned with the breadth of activity covered by pathology 11 8.3: Objective assessments and benchmarking of quality 13 9: Quality systems and learning 13 9.1: Some of the practices that were identified: 13 9.2: Sharing lessons within and outside the organisation 14 9.3: Maintaining (improving) Quality Assurance during times of structural change 15 9.4: Maintaining Quality in times of organisational stress 16 10: Scope of pathology services’ ability to control quality on the whole pathway 17 10.1: Working with Commissioners to manage demand 17 10.2: Restrictions imposed by IT systems 18 11: Paradigm shift to commissioning led service 19 Appendix 1: Survey Results 19 Table 1: Survey returns 20 Table 2: Categorisation of answers given 21 Appendix 2: Topic Guides 21 Discussion guide for providers: 21 In response to the Francis Report the Royal College of Pathologists said: 21 Discussion Guide for Commissioners 21 In response to the Francis Report the Royal College of Pathologists said: 22 Appendix 3: Quality Assurance in Pathology FHS-PathologyQualityReview-v2.indd 2 08/11/2013 17:07:12
  • 3. 3 2013Prepared by Fr3dom Health Pathology Quality Review 1: Executive Summary Fr3dom Health surveyed pathology services during the summer of 2013; a sample of 12 providers was subsequently approached to take part in a more in depth analysis. In the autumn of 2013 a total of 28 interviews took place with 43 participants who worked within twelve different health communities during autumn 2013. They included clinicians and quality managers from provider units, Clinical Commissioning Groups (CCGs) and a Commissioning Support Unit (CSU). • QA within pathology services appears to give few causes for concern. Clinicians and managers in provider units approach their work and its QA with energy, passion and enthusiasm. • Neither the scope of activity undertaken nor the quality assurance of that activity is currently driven by commissioners. • Different styles of pathology department utilise different styles of QA. It is not clear whether the different styles and processes have a measurable effect on quality of service. There may be a case for an agreed definition of quality. • Some departments have very well developed and resource intensive quality systems. The most persuasive were those where quality was built into the day-to-day activity of the department. We would like to see some evidence about whether a great number of “meetings about quality” equate to quantifiable improvements in quality of service. • Responsibility for delivering quality needs to sit with laboratory staff and clinical staff and within their respective management structures. Staff training and development can reinforce this. • Some work might be done in identifying objective metrics /standards beyond simple performance measures in order to allow benchmarking of quality for commissioners and others. 
To achieve this, there is a need to revitalise professionally-led clinical groups to work on standardisation of protocols and performance measures within pathology departments, or for commissioners to more fully promote Royal College of Pathologists (RCPath) Key Performance Indicators (KPIs). • A key test is how departments and their staff handle error reporting and incidents. An open culture of learning from mistakes and where appropriate involvement of those outside pathology is indicative of a good approach. Ensuring both clinical and laboratory staff receive feedback and share lessons learnt within and across organisation remains a challenge and one that requires attention. • Pathology services are undergoing period of intensive change (post Carter). QA needs to be kept to the fore when planning these changes. • The error reporting risk management systems seem to work efficiently and effectively. Largely, there are similar ways of dealing with error reporting processes in all pathology departments surveyed. In general commissioners expect that they would be told if exceptions occur. • The most persuasive regimes for quality assurance were those services where the quality managers adopted a facilitative role of persuading and enabling, and if necessary cajoling, front line staff to use and drive the quality management systems, recording and following up incidents appropriately, performing some audits and taking part in others. • IT systems may in some cases limit the internal and external communication (including between merging organisations) and therefore might limit effectiveness of quality management systems (QMS). There is a need to consider integration standards for IT systems in pathology. • Primary care and secondary care would each welcome better communication and some pathology departments are actively seeking to engage with primary care users of their service. 
If asked for their unmet needs, almost universally primary care wants “the same but quicker”. Pathology services perceive that they are driving innovation rather than responding to a demand for innovation from primary care. • There appear to be few, if any, commissioner-led service specifications. There may be scope for commissioners (perhaps via CSUs or by taking consortium approach across CCGs) and providers to work jointly on service specifications. • There appears to be little appetite in CCGs to undertake this work and little pathology expertise within commissioning bodies, a situation that will not help develop QMS in the future. This, combined with the lack of understanding or acknowledgement of RCPath KPIs in current commissioning, is an issue that will need addressing if quality is to become more systemically assured than is currently the case. • The availability of pathology staff for advice and guidance was mentioned both by providers and commissioners as a key quality indicator although much more difficult to measure other than as a process e.g. number of meetings attended, out of hours contact information. This may be a fruitful area for commissioners and providers to work together. FHS-PathologyQualityReview-v2.indd 3 08/11/2013 17:07:13
  • 4. 4 This report was commissioned to inform the Review of Quality Assurance (QA) Arrangements for NHS Pathology Services ordered by NHS Medical Director Sir Bruce Keogh in December 2012 and led by Dr Ian Barnes. The review brings together experts to explore how QA arrangements can be strengthened and how organisations can be more confident about the monitoring of the quality of care they offer the public. Fr3dom Health was commissioned to review the systems and practices in place to ensure quality within commissioner and provider environments; and to examine which, if any systems are in place to support and monitor this. The research took place in the summer and autumn of 2013 and also explored the extent to which pathology services and their associated Quality Management Systems are commissioning led. We focused on: Provider input: the attitudes and practices of providers towards quality assurance including error reporting, the consistency and spread of reporting channels and the reasons for any variation. Clinical Commissioning Group input: the understanding of the quality of service (QoS) in terms of how and what is commissioned, quality assurance processes used to monitor QoS, awareness of statutory undertakings together with the reasons for the views and practices expressed. Commissioning Support Units: the extent to which commissioning support (driven by service level agreements (SLAs) and Quality of Service (QoS) indicators) is becoming apparent in the new system. There is an acceptance that errors occur but anecdotal evidence suggests that not all errors and their handling are recorded, reported or shared with peers and Trust management consistently across pathology facilities – despite there being standard reporting tools in place. There is a need to understand the extent to which this is true and, as far as can be determined, why this might be the case. 
There is also a concern that there is variation in how errors are ranked for severity, allowing for inconsistency in reporting. This report was commissioned to help the review understand: 1. What is and is not reported locally and why? 2. What is and is not reported nationally and why? 3. What systems are used for reporting? 4. What arrangements are in place to ensure how learning from incidents is promoted and disseminated? Post Francis in particular there is a need to ensure that commissioners understand their duty to commission a high quality service. Contractual lines and specifications are not always consistent and do not always cover QoS against all services. It is important to understand how commissioners monitor QoS in pathology services. This report was commissioned to help the review understand: 1. Do commissioners know the details and scope of what they commission, and the quality of services purchased? 2. How and what do they commission? 3. How do they assure themselves of the quality of the service delivered? 4. Are they aware of their responsibilities/obligations regarding quality? 5. What arrangements are in place to ensure how learning from any incidents are promoted and disseminated? 2: Introduction 3: Background FHS-PathologyQualityReview-v2.indd 4 08/11/2013 17:07:13
  • 5. 5 2013Prepared by Fr3dom Health Pathology Quality Review 4: Summary of Findings 4.1: High level • There is no clear definition of quality against which to judge services externally or indeed to benchmark between services. There is a plethora of processes used, from operating procedures and systems to “getting it right for patients” to reliance on external QA bodies. In the absence of such a definition against which to make an objective judgement, the research showed that QA within the pathology services visited within England appears to be in very good shape. • We have been impressed by the energy, passion and enthusiasm within the pathology services for their work and assuring its quality. The laboratory managers and clinical staff we met had quality ingrained in their day-to-day work. We recognise this group was self-selected, however we remain persuaded. • This high standard is driven almost entirely by the service providers themselves – commissioners, certainly those commissioners in the primary care sector, have not engaged in driving change within the services at any scale. It is not high on their agendas, and certainly not a current priority. • There is a desire from within pathology services to do more to help the diagnostic process. Real measurement of quality in terms of informing patient care should be captured. • The role of multi-disciplinary teams (MDTs) in sense checking and challenging marginal interpretive decisions is very much part of the quality management (QM) of a pathology department but is difficult to quantify. • We wonder whether there are differences in approach between quality managers who have emerged from pathology (or the wider NHS) as opposed to quality managers who have come from other sectors including the commercial world? Is there a benefit in sharing different approaches? • There is a concern expressed by some that investigating QA in pathology implies a deficiency in quality. The opposite may well be true. 
Other interpretive specialisms do not appear to be under this sort of scrutiny.
• Similarly, there is a concern that high levels of error reporting are regarded as evidence of poor practice rather than as an indicator of an open culture of learning from mistakes.
• We perceived a difference in response to QA when we spoke to clinicians or quality managers versus lab managers or heads of service.
• There was some frustration from some quality and risk management professionals that clinicians had not necessarily understood or appreciated their role in the past. If there ever were schisms between the professions, they seem to be reducing to insignificance.

4.2: Objective assessments and benchmarking of quality

• Clinical Pathology Accreditation (CPA) is effectively the universal standard for pathology services. Within the profession there is much debate about how useful CPA is. As far as commissioners are concerned it is seen as the "kite mark" and thus shorthand for a great deal of information regarding quality. It is an obvious starting point for them. CCGs should check that labs are CPA or ISO 15189 accredited. They should also check participation in external quality assurance schemes.
• Concern was expressed in pathology departments that a good CPA result could reflect that an organisation is good at passing assessments rather than providing a good quality service per se. Similarly, they felt there could then be a danger that the CPA standards are seen as a target rather than a minimum standard.
• A majority of pathology departments visited had well established quality management systems. These typically consisted of open access to error logging and regular quality meetings covering incident reports, audit findings and examination of trends.
• Incidents were reported locally and actions reviewed by departmental managers (within pathology) or quality leads from within pathology (who may be service heads).
QPulse is widely used as the pathology-specific incident logging system; Datix is then used at Trust level.

4.3: Quality systems and learning

• A key test is how departments and their staff handle error reporting and incidents. An open culture of learning from mistakes, with involvement of those outside pathology where appropriate, is indicative of a good approach.
• Ensuring that both clinical and laboratory staff receive feedback and share lessons learnt within and across organisations remains a challenge.

4.4: Maintaining quality assurance during times of structural change

• There are definite concerns about the changes being driven through by the Carter review. Most can see the merit, but there are concerns about how to maintain (let alone improve) quality through changes.
• There are very high volumes of work with a wide range of procedures that are all to be covered by the same quality management systems.
• There is a sense of organisations under pressure. We detected concerns that quality (in common with training and/or continuing professional development) might be sacrificed in an attempt to do more for less.
• There is a case for research into whether the amalgamation of different systems into a single management structure actually improves quality. There is a need to establish a proper QMS before making changes. As with many other quality issues, the use of IT is critical here. For example, where two or more organisations are working together, integrating their error reporting systems (Datix) is not straightforward.

5: Acknowledgements

Thanks must go to the dozens of participants who made themselves available, as well as their colleagues who supported us with administration, information and a welcome during a hectic summer period. Without exception we found the professionals who took part, working within pathology services up and down the country, to be enthusiastic, professional and genuinely passionate about their work. It was a real pleasure to meet individuals and teams taking pride in their service and in getting things right for patients. We are extremely grateful to them for making time to see us in their busy schedules. We were unable to engage with only two pathology service providers from our original sample, and these were readily substituted with similar organisations.

We would also want to thank those individuals who felt able to contribute from commissioning organisations. It is true to say that we had much more difficulty in engaging with these organisations. It was a difficult time of year to find space in diaries, and CCGs and CSUs were clearly under a lot of pressure.
We struggled to find many people who could engage with our questions in a meaningful way, but all the interviews we conducted were courteous and useful. We are sorry that a handful of individuals from commissioning organisations were too busy throughout the entire period of fieldwork to respond in any way.
6: Method

Pathology departments were surveyed during the summer of 2013, and from the respondents to this survey a sample of 12 providers was approached to take part in a more in-depth analysis. A brief résumé of the findings of this survey appears at Appendix 1. 59 individuals responded, comprising clinicians and service managers, and of these:

• 50 reported they were aware of how pathology error coding is reported within the Trust
• 49 were aware of the risk manager's role in respect of pathology services
• 48 reported they were aware of established processes for sharing experiences of pathology error coding with colleagues outside the Trust, perhaps at other hospitals
• 37 were aware of QoS policies at their laboratory in respect of pathology error coding.

The sample of pathology departments to be investigated was taken in consultation with colleagues from the Pathology Quality Assurance Review Team from NHS England. The intention was to survey, as far as possible, a representative mixture of pathology services by size, geography and type of organisation. Pathology services were contacted and invitations to take part in the review were extended. We found the services to be extremely hospitable. Only two services were unable to contribute, and fortunately we were able to invite other similarly sized and positioned services to substitute almost immediately. We asked services to identify their link commissioners in primary care. In the event we usually approached professionals on the commissioning side through other channels.

Guided interviews conducted by experienced senior researchers were used to gather data. Similar Topic Guides were produced to facilitate the interviews with different professionals, in order to explore the role that quality issues play in the delivery of pathology services (copies are included at Appendix 2).
Interviews were conducted face-to-face in the participant's place of work where practicable. All the interviews with providers took place face-to-face. In general, participants on the commissioning side appeared to have much more pressure on their time, and the majority of these interviews took place on the telephone. The fieldwork took place throughout England from mid-August until the early part of October. All participants were guaranteed anonymity; given this, the organisations taking part and the individual contributors will not be identified within this report other than by generic role descriptions.
7: Results

A total of 28 interviews took place with 43 participants, who worked within 12 different health communities.

Table 1: Interviews by type of organisation and participant's role

                                     Face to Face   Telephone   Total
Service Provider
  Quality Manager                          11            0        11
  Service or Lab Manager                    9            0         9
  Clinician                                 7            0         7
  Trust Governance / Risk Manager           3            1         4
Commissioner
  CCG Quality Director                      1            5         6
  CCG Contracts Officer / Lead              2            1         3
  CSU Lead                                  1            2         3
Total                                      34            9        43

We had correspondence with several additional commissioning and contracting professionals who provided some material, but we have not included these contacts above, since the interviews fell outside the fieldwork window due to availability issues and their input was consistent with previous interviews.

Discussion of method and analysis

We acknowledge that this was a review with several limitations (of time and scope), but a vast amount of information was gathered from the various organisations and from individuals in different job roles (several of the interviews lasted over three hours). A simple thematic analysis has been performed. Saturation was reached fairly early on with several themes. Our brief has been to highlight the common themes relevant to the review of QA within pathology arising from these discussions, as well as some perhaps unusual themes that were raised which we think are of interest to the review.

With only one or two exceptions, the pathology departments approached were extremely welcoming and open to the idea of taking part in the work. We acknowledge that we may well have spoken with a very select, willing sample, i.e. those who responded to the survey request promptly and could accommodate a visit at short notice. However, some of the themes that arose were so common that we would be extremely surprised if the size and pragmatic nature of our sampling biased our findings. Many of our themes focus on the service provider element of pathology. It proved more difficult to identify and engage with individuals and organisations commissioning pathology services. There may be commissioners within CCGs who are actively pursuing improvements in QA within pathology services and taking a lead in these developments, but we did not come across many, and we were not given to understand that there were many to come across. We cannot surmise what their priorities would be.
8: Emergent Themes

8.1: Overall impression of pathology services

The stereotype of pathology departments is of a "service within a service", perhaps aloof and surrounded by metaphorical high walls. The fact that the majority of departments are often in separate buildings from the main hospitals, and always (for good reason) behind locked doors, reinforces this initial impression. Significantly, a similar impression was held very strongly by several commissioners in CCGs who did not necessarily have first-hand knowledge of their local providers. But there were some serious concerns. Whilst such forthright views were not universal or even usual, we can be extremely clear that pathology services were not a high priority for almost any of the commissioners.

Pathology services often felt that they were the leaders when it came to developing new services or suggesting metrics to reflect performance or quality. One or two of the pathology departments were proactive with commissioners, for example checking whether they would pay for new tests or innovative procedures. The process in one Trust appeared to be to pilot an innovation in order to show any benefit of the new service and then present a robust business case based on the pilot.

On visiting pathology services we were pleasantly surprised by their energy, passion and enthusiasm around quality issues. The sense of "otherness" was a genuine one. Many of the participants to whom we spoke referred to the hospital or the Trust in the third person, i.e. "they" instead of "we". However, this sense of team spirit seemed, to us at least, to be beneficial when trying to deliver extremely high standards of quality within their areas of responsibility.
We may have been fortunate to find such enthusiastic individuals in service after service, but our perception is that individuals within these services have a well-developed sense of professional pride and a desire to perform at the very highest standards. Furthermore, there was a sense, reinforced time and time again, that the importance of maintaining these high standards lay in providing a high quality and safe service for patients. We would suggest that this desire "to get things right for every patient every time on time" is a matter of professional pride from individuals who work within a rigorous scientific discipline. Whilst those who wished to take part in the research gave us a universally warm welcome, one or two expressed concerns about the implications of the Barnes Review. They were keen to point out that some other interpretive or diagnostic specialisms within the NHS were not under the same sort of scrutiny.

"I don't have the background in pathology"
CCG Director

"There are issues about leadership, I have an impression that they are, or were, old fashioned, command and control sorts of organisation"
CCG Director of Quality

"They're in the bowels of the hospital and not on anybody's radar"
CCG Director of Quality

"We have done a little bit of work but largely it isn't a department that creates a lot of noise for us - take that as a barometer"
CCG Director of Quality

"We are not directly monitoring any indicators around quality but we would pick up on incidents through the normal channels… and we would hear concerns from our GPs if there were any"
CCG Director of Quality

"Pathology services are one service line within a large acute contract and discussions within this only happen if the service is under pressure"
Contract Manager

"I try to remember that these samples are all somebody's blood or whatever at the end of the day and they need me to do my best for them"
Lab Manager

"The very fact that there is a review of quality assurance within pathology implies that there is some kind of shortfall and I am not sure that that is true or helpful"
Clinician
8.2: Issues concerned with the breadth of activity covered by pathology

Professionals working within pathology services will recognise that these are high volume organisations with huge throughput for some tests. Pathology departments throughout the country are experiencing high demand; an increase of between 5 and 10% in overall workload was the oft-quoted figure, and none of the respondents disagreed. The second, and less easily quantified, issue for QA is the breadth of activity within these services. Activity ranges from almost industrial-scale, highly mechanised procedures to labour-intensive interpretive diagnoses of individual specimens. All of these activities need to be covered by the same quality management processes. The same measures may not be meaningful for every discipline. This may be why some commissioners have limited their monitoring of pathology services to turnaround times (see the section "Paradigm shift to commissioning led service" below).

This leads to another theme that arose from our interviews. We would often lead into discussions by asking what "quality" means in respect of a pathology service, and we were intrigued to find a wide variety of responses. We would always expect different degrees of interpretation or emphasis, but we found real, perhaps even fundamental, differences. The spectrum (even across professionals working for the same organisation) ran from definitions that might be characterised as the textbook answer, "a service that is fit for purpose", or some variation on what we might characterise as the Darzi definition about being safe, effective and positive, through "good record keeping" and "passing our CPA", to the more visceral "getting things right for patients". The usual response tended to include monitoring of standards and responding to incidents. We detected some difference in understanding of the meaning of "quality" between individuals who are quality management professionals and others (including clinicians).
We sensed that there may be a detectable difference in approach between quality managers who have emerged from within the NHS and those who have come in from quality assurance roles in other industries. Making general statements is a risk with such a small self-selecting sample, and it may well be that a service which decides to recruit a full-time quality manager, rather than handing the quality brief to individuals with other responsibilities within the department, already has a different corporate approach to QA. However, we sensed quite strongly that there were differences between the approach taken by departments with quality management professionals with some independence (i.e. whose line management came from either high up within the department or even outside it) and those managing the quality brief within a portfolio of other responsibilities.

We also found that the most persuasive regimes for QA were those services where the quality managers adopted a facilitative role of persuading, enabling and, if necessary, cajoling front line staff to use and drive the quality management systems: recording and following up incidents appropriately, performing some audits and taking part in others.

We came across some tension between clinicians and quality managers, and there were some stories of considerable resistance from clinicians to some of the systems. There was general agreement that these issues were largely in the past and that divisions had been the result of a lack of understanding on both sides. It was felt that, in the main, clinicians now understand and appreciate (beyond their being a necessary evil) the QA systems in their departments. There were two comments within different pathology services where clinicians felt that they were disenfranchised from the processes dealing with incidents.
This was challenged by their quality and risk management colleagues, who felt that there was a disinclination among clinicians to get involved in any such follow-ups. One individual clinician felt that the Trust's risk management professionals questioned his/her judgement as to what was a serious incident or near miss. The disconnect between clinicians and the quality management systems within a service was highlighted by a senior clinician:

"There is no shared definition of what a high quality pathology service is or how to measure it and it is therefore extremely difficult to aspire to that level of service. Is there a difference of emphasis between professionals from differing backgrounds?"
Pathologist

"I can make a diagnosis that is done in the required timescale at the right time for the right patient and so on and is therefore of 'high quality' but there is nothing about whether this is the correct diagnosis"
Clinical Pathologist
Unpicking this concern with other clinicians in other services highlighted the role of the multi-disciplinary team (MDT) in sense-checking and challenging marginal interpretive decisions. This resource-intensive activity is very much part of the QA activity of a pathology service but is difficult to quantify. One or two pathology departments had started to consider how it could be encapsulated in their performance measures. One pathology department has a system whereby the receiving clinician acknowledges that they have received the test result electronically from pathology and have used it in their decision about the patient's care. This met with resistance across the Trust initially but is being trialled as a quality measure.

8.3: Objective assessments and benchmarking of quality

One of the opportunities afforded by a review conducted by researchers outside of the pathology field was to explore different understandings of some of the most basic assumptions around quality. A key theme to emerge for us was how professionals view the accreditation of their service by Clinical Pathology Accreditation Ltd (CPA) in their working lives. Firstly, there seemed to be some divergence about its status. There was some mention of one or two labs that are not accredited, although nobody was able to identify these. However, the status is (presumably) fairly clear for those who need to know. What was much more interesting for us was how views about the CPA and the process of accreditation varied between professionals. Views were not split along recognisable lines (by job role or experience of other environments). Even the most vociferous of detractors admitted that the way in which accreditation had been organised over recent years had improved significantly. There was a feeling that some objective standards had been needed in the past.
In the past the process had not been well systematised, and there had been a lack of consistency between different assessors. There seemed to have been cases of individual assessors seeking confirmation that processes were to their own personal liking rather than meeting objective standards. There seemed to be a general acceptance of the new timetables and an understanding that the moves towards ISO-type standards were beneficial. More than one of our sample group of providers had recently undergone an assessment and expressed how onerous they had found it. Whilst they were content to have passed, there was a definite sense that a significant proportion of the evidence collected was of more use to the assessor than to the organisation being assessed. Is there a sense that the CPA becomes an end in itself? It is probably worth recording that many of the participants were in fact CPA assessors.

There was a great deal of discussion about how useful CPA is internally. There was a sense for many that it was at least a benchmark for their organisation. However, we would also suggest that for some the standards represented a target to be achieved rather than a minimum standard for their service. We had a limited sense that services found the process useful for improving their own QMS. CPA is considered by some pathology departments to be the end point in a long process of internal and external QA. We did get a strong sense of how fundamental the CPA is for commissioning organisations. It is seen as a kite mark, and thus shorthand for a great deal of information regarding quality, as far as commissioners are concerned. Alongside CPA there could be information about how a lab participates in external quality assessment (EQA) schemes and whether concerns have been raised.
The standard of EQA schemes was questioned: there was felt to be little consistency across them, but labs are free to choose. Perhaps there should be a limited number of approved EQA schemes. Our understanding is that when a problem has been spotted via EQA, a referral letter is sent to the Trust chief executive, and if the problem is not addressed a second letter is sent. CCGs could ask about these communications, but currently a system to do this does not exist. Another issue brought to our attention is the lack of agreement on performance standards for test specification. The NHS does not specify performance levels, for example error or precision levels for specific tests. Again, this was not something raised by the commissioners. In terms of performance standards, some of the RCPath KPIs were in use, with some departments viewing these as minimum levels of service and others as targets. It is understood that these are not yet finalised. The pathology departments measured turnaround times, staff training and development, and user surveys. Some identified the attendance of pathologists at MDT meetings as a quality measure, whilst others focused on patient complaints.

"It's voluntary but compulsory if you know what I mean"
Quality Manager

"Brilliant, professional and clear"
Quality Manager

"Utter rubbish – gets in the way of quality"
Quality Manager

"CPA and incident reporting is mandatory"
Clinician

"I don't imagine that you could be a lab working for the NHS without it"
Lab Manager

"It is possible that getting a good assessment demonstrates that you are good at assessments rather than a high quality pathology service"
Quality Manager
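As an illustration of the kind of objective performance monitoring discussed above, the sketch below shows how turnaround-time (TAT) compliance against a locally agreed target might be summarised. This is a minimal sketch only: the 24-hour target and the sample data are hypothetical examples, not RCPath KPIs or NHS standards.

```python
# Illustrative sketch: summarising turnaround-time (TAT) compliance against a
# locally agreed target. The 24-hour target and all data here are hypothetical.
from dataclasses import dataclass

@dataclass
class Sample:
    test: str
    tat_hours: float  # time from specimen receipt to authorised report

def tat_compliance(samples, target_hours=24.0):
    """Fraction of samples reported within the target turnaround time."""
    if not samples:
        return None
    within = sum(1 for s in samples if s.tat_hours <= target_hours)
    return within / len(samples)

samples = [
    Sample("FBC", 3.5),
    Sample("FBC", 5.0),
    Sample("Histology", 72.0),  # interpretive work routinely takes longer
]
rate = tat_compliance(samples)
print(f"{rate:.0%} of samples within 24h")  # prints: 67% of samples within 24h
```

As the report notes, a single measure such as this is unlikely to be meaningful across every discipline: a threshold that suits high-volume automated testing says little about the quality of labour-intensive interpretive work.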
9: Quality systems and learning

Arrangements for incident reporting and follow-up actions within pathology departments appear to be robust. However, there was variation in the risk and governance reporting procedures within the Trusts and in the significance given to pathology errors and incidents. Typically a locally developed or off-the-shelf database (QPulse being the most commonly cited) was used to report and track all errors within pathology. These would be graded by laboratory managers and/or quality managers and entered onto the Trust risk reporting system (usually Datix). A summary of the practices found is given in Appendix 3.

9.1: Some of the practices that were identified:

• In the case of clinical or critical incidents, a non-conformity is raised on the laboratory quality management system. The quality manager liaises with the clinical lead, the pathology manager and the operational manager of the particular area of the Trust. The team decides on the severity and coding of the incident. This is progressed to the Trust lead for patient safety, who takes a final decision on the seriousness of the incident, further investigations and any remedial actions required.
• Staff record errors and team managers score these. The pathology committee meets monthly to discuss complaints and incidents. The Trust risk manager receives daily information on adverse events where there is a serious impact on patient safety. The Trust clinical governance team has facilitators who work with each clinical area where support or assistance to take action is required.
• A local Excel system is used to record errors and an internal log identifies investigations; the Datix system is then used by the service manager and deputies to grade incidents. Any adverse incidents are investigated internally to analyse for root cause and to determine corrective and preventive actions.
Summary reports are presented at senior pathology management meetings attended by heads of departments and the pathology risk manager. Any shared learning outcomes are provided for action, and relevant issues are escalated appropriately to governance and risk management. Datix trend data is analysed; this feeds into risk management for discussion and action at divisional or executive level. The Trust quality manager has governance oversight and reports serious untoward incidents (SUIs) to the Trust Board via six-monthly assurance reports. A sample of low-level incidents is examined, including the actions taken, to check for patterns. If no action taken is identified, these are escalated.
• One service uses its own database for adverse events. Datix is used Trust-wide for incident reporting, with actions reviewed by departmental managers. Quality managers provide the number of incidents, and those at medium and higher risk levels are tabled in the Trust risk register. Monthly quality meetings are held with quality leads, each department in pathology reports, and trends are monitored. This is a well-established system.
• A Datix "investigating officer" has open entry; some incidents are escalated to root cause analysis and SUI status, though most are not serious. The Datix manager escalates, completes the national upload, and the risk management team compares local and national data.
• Near misses are reported on a local system; the laboratory manager grades errors and conducts root cause analysis, which is reported to the risk team. Investigations are closed but there is no follow-up at three or six months. The pathologist raises concerns via the clinical director route. As investigator, the laboratory manager sees the errors reported by staff and conducts the root cause analysis. The reports go to the Trust's risk team, but no follow-up is received back by pathology, as these incidents are regarded as internal to pathology and not serious enough.
• One pathology system designates clinical incidents when there is a delay or an incorrect test is ordered.
Incident reporting may show a concern regarding staff skills or numbers; however, it was unclear whether action was taken on the latter.

9.2: Sharing lessons within and outside the organisation

In order to communicate the outcome of recording and investigating pathology incidents, the majority of departments visited had well-established communication mechanisms, usually via team meetings and quality groups.

"They don't get fully investigated, or lead to robust procedural changes"
Clinical Pathologist

"Quality is on every staff meeting, all staff are involved in audits"
Quality Manager
Many of the sites visited described a series of meetings internal to the pathology department and then wider within the Trust. In terms of clinician engagement, one Trust has a laboratory management committee with all consultants and senior scientists attending; the risk management and quality management groups feed into this. It was not clear how often pathology staff learnt about patient safety alongside other departments within the organisation. One Trust has a "learning by improvement" group of operational managers and clinicians who meet quarterly and discuss patient safety incidents.

Governance arrangements in the provider organisations clearly varied. One Trust categorised incidents as either low level, managed locally, or SUIs, with a grey area of "serious incident action review" in between. Weekly meetings are held across the Trust as part of corporate governance structures to present new incidents and conduct root cause analysis.

The role of a quality co-ordinator was evident at a couple of sites. This person provided a link between the laboratory managers and the clinicians, especially for closing the loop on feedback, continual improvement and learning. This may be an alternative model, especially for large multi-site pathology services, to ensure effective feedback to all staff and to manage attendance at meetings alongside other demands on staff time.

In a couple of places regional quality manager groups exist and have been the forum to share and learn from each other's mistakes. Their purpose is to drive quality through learning plus accreditation. As was indicated to us, some of the reorganised pathology services had found other providers more guarded about sharing information for benchmarking purposes. Clearly the quality agenda benefits from learning within and across organisations. One method to enable this may be to revitalise professionally-led clinical groups to work on standardisation of protocols and performance measures.
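The local-log-then-escalate pattern described in section 9.1, in which incidents are graded within pathology (for example in QPulse) and those at medium or higher risk are passed to the Trust-wide system (usually Datix), can be sketched as follows. The grade names and the escalation threshold are hypothetical, chosen only to illustrate the workflow the sites described, not taken from any particular Trust's scoring scheme.

```python
# Illustrative sketch of the grade-then-escalate workflow described by sites.
# Grade names and the threshold are hypothetical, not a real Trust's scheme.
from enum import IntEnum

class Severity(IntEnum):
    NEAR_MISS = 1
    LOW = 2
    MEDIUM = 3
    HIGH = 4
    SERIOUS = 5   # e.g. a serious untoward incident (SUI)

# Grades at or above this level leave the local pathology log and are
# tabled in the Trust risk register (illustrative threshold).
ESCALATION_THRESHOLD = Severity.MEDIUM

def route_incident(severity: Severity) -> str:
    """Return where an incident is handled under the sketched workflow."""
    if severity >= ESCALATION_THRESHOLD:
        return "escalate to Trust risk register"
    return "manage within pathology quality system"

print(route_incident(Severity.NEAR_MISS))  # manage within pathology quality system
print(route_incident(Severity.SERIOUS))    # escalate to Trust risk register
```

Note that several sites also sampled the locally managed low-level incidents to check for patterns, escalating where no action had been taken; that review step would sit on the "manage within pathology" branch.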
9.3: Maintaining (improving) Quality Assurance during times of structural change

Many of the pathology provider organisations with whom we engaged were undergoing structural changes, had recently done so, or were expecting changes in the near future. Three of the providers were affected by the Transforming Pathology Project (TPP). Several were involved in mergers with other Trusts' services as well as forming partnerships with private sector providers. In fact, those services that were not undergoing or contemplating significant change were in the minority. The few services that were not undergoing change felt that they had made the business case for the status quo, not least because they were an income generator for their parent Trust. This was echoed by the relevant commissioners.

We are not in a position to be definitive, but we had a strong impression of volatility within the services undergoing change, and we perceived that in many of them the focus was on change rather than service delivery. The impression given is that change was, or had been, driven almost entirely by economic (efficiency) arguments. Benefits to service delivery or improvements in quality were not at the forefront of any discussions that we had. This was true for the service providers and for the commissioners to whom we spoke, although even in those areas taking part in TPP we did not get the impression that CCG quality directors were intimately involved in the negotiations around the tendering processes for pathology.

"More sharing previously, guarded conversations now, for example we will discuss a patient safety topic but how it is addressed and financial implications are not shared"
Service Manager

"Communicated to all staff through section meetings and management meeting"
Quality Manager

"Risk management has a high level of consultant involvement at this Trust"
Quality Manager

"Pathology has culture of using errors and addressing non-conformity as learning"
Risk Manager

"The service is wrapped up in block contract and that is largely why it is such good value for us"
CCG Director of Quality

"When we investigated the services we were commissioning we realised that we weren't paying for some services we were receiving and so in fact the exercise cost us money."
CCG Director

In general, almost every professional with whom we engaged could see the business case for merging smaller laboratories with low volumes and a less interesting mix of work. There are clearly critical mass arguments for many procedures. There may be a case for merging or taking over failing or struggling services (we were led to believe that there were a number of smaller, more isolated laboratories around the country that did not have sufficient turnover to maintain the very highest standards), but this quality dimension did not seem to be driving the service redesign process.

We visited a large multi-site pathology service that had been created as the result of several labs merging into one management structure that included some private sector input. The process was relatively mature, yet the individual laboratories still seemed to operate their own QMS and standard operating procedures (SOPs). The service was delivering more tests and, anecdotally, had reduced costs. Another site visited had merged two years previously, had managed to maintain a fairly stable workforce, and reported achieving cost savings whilst increasing quality. The clinical and service leads all reported to a joint board with clear incident reporting arrangements in place. We visited another service that was in the early stages of merger. The change manager was very keen to explain that the aspiration was to harmonise all of the various sites to ensure a common QA process as well as common SOPs.

There remain for us some serious questions about maintaining quality standards through structural change, and we wonder whether there is a good evidence base for improving quality through mergers. Certainly the need for planned change seemed clear.
  • 15. 15 2013Prepared by Fr3dom Health Pathology Quality Review were intimately involved in the negotiations around the tendering processes for pathology. In general almost every professional with whom we engaged could see the business case for merging smaller laboratories with low volumes and a less interesting mix of work. There are clearly critical mass arguments for many procedures. There may be a case for merging or taking over failing or struggling services (we were led to believe that there were a number of smaller more isolated laboratories around the country that did not have a sufficient turnover to maintain the very highest standards) but this quality dimension did not seem to be driving the service redesign process. We visited a large multi-site pathology service that had been created as the result of several labs merging into one management structure that included some private sector input. The process was relatively mature yet the individual laboratories still seemed to operate their own QMS and standard operating procedures (SOPs). The service was delivering more tests and anecdotally had reduced costs. Another site visited had merged two years previously and had managed to maintain a fairly stable workforce and reported achieving cost savings whilst increasing quality. The clinical and service leads all reported to a joint board with clear incident reporting arrangements in place. We visited another service that was in the early stages of merger. The change manager was very keen to explain that the aspiration was to harmonise all of the various sites to ensure a common QA process as well as SOPs. There remain for us some serious questions about maintaining quality standards through structural change and we wonder whether there is good evidence base for improving quality through mergers. Certainly the need for planned change seemed clear. 
9.4: Maintaining Quality in times of organisational stress

Many of the laboratories visited spoke vividly about the increase in workload that they had experienced. Some managers described how they felt under pressure to deliver more with diminishing staff resources. We sensed in several services real strain on teams, with challenges such as increasing workload, structural change, a (perceived) reduction in resources, high staff turnover, use of locums, and reduction of skill level (i.e. recruiting key personnel on lower bands within laboratory settings).

We do not want to paint a picture of services in crisis nor of professional discontent, and we do not wish to give the impression that these professionals were bemoaning their lot; much of these open discussions were conducted with a great deal of humour. However, we did get a sense that pressure was mounting and we certainly detected an amount of stress within many services. There were more than a few indications that quality management was among the first casualties of this stress: day-to-day adherence to QA procedures, including internal audits, continuing professional development, staff training and less essential administration, was often the first thing to slip.
"What slips straight away, because you have your head down concentrating on the samples, are things that are required for CPA, to prove you've got a quality service. So staff appraisals might not occur when they are scheduled, competency assessments the same (you know the person is doing it but the actual paperwork might not get done), audits might slip… you might not be able to release people for training" – Lab Manager

"I have to fight every time somebody leaves to make the case for replacing that person with somebody on the same grade. I lose more arguments than I win" – Service Manager

"With vacancy control that is our standard conversation; any time we have a vacancy it is 'does it have to be at that grade? What are you going to deliver more of?'" – Lab Manager

"Our workload has ramped – certainly over the last 12 to 18 months we have had to shore up the department, lots of staff being offered overtime, I've been doing overtime, it's difficult to keep disciplined" – Lab Manager

"Hardly a day goes by when I don't have to put the white coat on and get on the bench" – Lab Manager

"Often our teams literally don't know what they are going to be doing until we see what comes in; it makes planning extremely difficult." – Lab Manager

Whilst there is no suggestion that a stable workforce is a necessary condition for QA (we did not have the time to get anything other than an impression), there was an understandable desire from providers to have motivated, properly trained and dedicated teams. More than one of the commissioners to whom we spoke expressed an interest in using some measure of staff volatility and training as a proxy indicator for quality within the provider. As we discuss below, the gap between the aspiration of some commissioners to monitor meaningful data about QA and their actual capacity or willingness to do so is quite wide.

10: Scope of pathology services' ability to control quality on the whole pathway

One issue that we came across several times is universal to any pathology service, but is worth noting when looking at QA from a wider perspective: the extent to which the delivery of a high quality service (in terms of, say, the right result for the right patient at the right time) depends on factors upstream and downstream of the service that are not in its control. When it comes to taking the samples, their collection and transport, and the distribution of the results, a pathology service has limited ability to drive quality. Anecdotally, the majority of sub-optimal delivery incidents we came across were the result of incorrect (or incomplete) labelling, or a failure of communication between the service and the healthcare professionals who were dealing directly with patients.

Where services had the capacity to investigate, analyse and ameliorate shortcomings in the taking of samples, they did so. There were some excellent examples of good practice, including one service dealing with blood samples that had the capacity to "go out" to wards and practices targeting outliers. The individual concerned was seen as an advocate for those professionals on the wards or out in the practices who were unable to meet the standards, and delivered support and training as appropriate. The induction of new medical staff joining Trusts or practices can also be a key point for pathology to influence, if the service has the capacity to do so. At several Trusts the clinical leads in pathology were involved in presenting information about ordering tests and seeking advice at induction sessions. There remains some tension generated by the aspiration to have every sample presented properly; several laboratories were considering a "zero tolerance" approach to mislabelling.

On the other side of the coin, we picked up themes whereby professionals working within the laboratories were less than willing to reject samples or to record poor performance, since they felt that this was tantamount to accusing fellow professionals (some of whom are working under a great deal of pressure) of incompetence. We sensed that whilst the guidance may be clear for senior quality managers, some level of discretion is being used "on the front line". We imagine that very few quality managers are in a position to systematically analyse outliers in terms of incident reporting. One clinician described QA as:

"Requiring a system where staff are comfortable reporting mistakes and learning from them" – Clinical Pathologist

The standing of the pathology department within the organisation may be key to the balance between assisting users of the service to improve quality versus refusing to do the "incorrect" work, logging it as an error and reporting it at a meeting. It was unclear what the tipping point for this balance might be; for example, whether it depended upon the seriousness of the error, whether it was a "repeat offender", or the individuals' personalities and workload. Does Trust size and culture influence this? One site within a merged service had experienced a different approach from its own zero tolerance of errors, and had reached a combined team approach after a year working through a clinical effectiveness policy. This is an important consideration when bringing departments together.

10.1: Working with Commissioners to manage demand

There were a couple of examples where pathology services worked with their lead commissioner to proactively manage demand. At one place the workload referred by the GPs was examined via the electronic ordering system, any inappropriate test requests were identified, and the number and type of tests repeated too soon were recorded. The consultant pathologist, with commissioner support, had translated the results into an online tutorial for GPs.

10.2: Restrictions imposed by IT systems

Communication difficulties between IT systems used internally and externally (including those between merging organisations) might limit the effectiveness of a QMS. Some of the pathology departments had developed local databases to record incidents and errors identified by their staff. In a few places paper records were still maintained; others were fully computerised. The advantage of a central electronic system to hold the operating procedures is that they can be updated easily and only the latest copy is available to staff, even when working across more than one site. Where pathology services were combined across more than one organisation (for example a number of Trusts and/or a central hub laboratory), IT systems were not always integrated. Datix completion was handled by one or several individuals at team leader or manager level, usually from within pathology.
Datix systems can be locally adapted, and therefore combining the information from two Datix systems is not as straightforward as it may seem. In the case of partnership between NHS Trusts and private providers of laboratory services, new integrated IT systems may be offered to the respective laboratories. Apart from the transition from old systems to new, integration is seen as an advantage as long as the respective Trusts' reporting requirements can be met.

Clearly the ideal scenario is that tests are ordered electronically from the community – GPs in their surgeries – and from other clinical staff within the hospital, with results sent back electronically and stored in electronic patient records. Evidence of this practice was found in only a couple of sites. For one commissioner the adoption of electronic ordering of tests by the GPs was the number one priority. However, a different opinion was that the electronic system somehow lost one of the "sense checks" in the transfer of samples.
11: Paradigm shift to commissioning-led service

All of the pathology services with whom we spoke were making efforts to engage with colleagues in the primary care sector whom they saw as their users, in the main GPs. In one area an individual GP with a particular interest in pathology had been key to some innovative engagement and service developments. However, this seemed to be the exception rather than the rule, and whilst one or two managers had some capacity to "go out to the practices", communication other than discussions of individual problems was in the main restricted to e-mail cascades and annual surveys. We heard in many services a sense of exasperation that, when surveyed, GPs tended to want the same service that they had now – but could the pick-ups be later, or maybe slightly quicker? Several laboratories admitted that they would welcome a much more interactive relationship, and felt that innovations such as new tests, changes in technique or different services (perhaps additional help with diagnosis) were always driven by the service rather than its users.

Furthermore, in discussions with quality professionals on the commissioning side we noted time and time again that there were very few quality measures highlighted in the contracts. In general the scrutiny of pathology services by CCGs seemed to be light touch. Professionals on the commissioning side were confident that, were there problems within pathology services, they would be made aware by the service or the parent Trust through their normal modes of communication. In truth, almost everybody to whom we spoke was managing vast block contracts in which the pathology service was represented by a very few clauses. As discussed elsewhere within this report, we were given the very strong impression almost everywhere that the evaluation and monitoring of pathology services was not a high priority for most CCGs.
When we discussed with CCG quality directors the sorts of quality indicators that they would like to see in the future, the recurring themes were:

• Turnaround times (we accept that turnaround times are easy to quantify and are a measure of proficiency, especially in acute and emergency care settings; however we wonder whether they are the be-all and end-all for assessing quality in pathology for primary care providers)
• Appropriate accreditation
• Some metrics around the training, retention and development of staff
• Incident reporting and investigation
• Responsiveness to enquiries from GP colleagues
• A couple of commissioners did identify that some attempt at managing demand through reducing inappropriate referrals could be measured, especially where electronic ordering is in place
• The availability of pathology staff for advice and guidance was mentioned both by providers and commissioners as a key quality indicator, although it is much more difficult to measure other than as a process, e.g. number of meetings attended, out-of-hours contact information. This may be a fruitful area for commissioners and providers to work together.

"The spec is very low – we provide evidence of turnround times (TAT) for [certain types of] screening that is within contractual limits and also we provide information for sexual health screening for national surveillance (CTAD), but the latter is not a quality indicator. There is no requirement for CPA accreditation or anything else specified." – Pathology Service Manager

"We don't actually ask for very much, only really something about speed of returning results and to be honest we haven't been monitoring that very closely." – CCG Contract Manager

"We assume that we would hear from the trust if the service was not performing adequately, and we would hear through the normal channels or from our GP colleagues if there was a problem" – CCG Quality Director

"There is a lack of standardisation in commissioning; block contracts will allow activity contracts to develop" – CCG Quality Director
Appendix 1: Survey Results

Are you aware of Quality of Service (QoS) policies at your laboratory in respect of pathology error coding? – 37 aware, 22 not aware

Are you aware of how pathology error coding is reported within the Trust? – 50 aware, 9 not aware

Are you aware of any established processes for sharing experiences of pathology error coding with colleagues outside the Trust, perhaps at other hospitals? – 48 aware, 7 not aware, 4 no answer

Are you aware of the risk manager's role in respect of pathology services? – 49 aware, 10 not aware

Table 1: Survey returns
• Comprehensive approach (13): Trust-wide incidents via Datix with a pathology element; reports reviewed three-monthly by the pathology quality manager looking for trends, with the report going back to Trust risk management. Reviewed by the quality manager for laboratory medicine with the business group quality manager at a QA meeting that has risk management team input.

• Risk management (10): All errors are reported and are part of the corporate risk register. Errors in our place of work are directly reported into the Trust's risk management system; they receive a severity grading and are reviewed at departmental risk management meetings.

• Governance structure (10): Incidents and resolution are discussed on a monthly basis at the pathology clinical governance (CG) meeting. The pathology CG committee reports to the Trust CG committee.

• Quality Management System (7): Minor errors and near-misses are reported via the QMS and followed up within the department. Reported in incident forms, which go through a formal pathway of investigation and closure that is part of the quality process.

• Team / clinical input (4): The quality manager liaises with the clinical lead, pathology manager and the manager of the particular area of the Trust; the team decides on the severity and coding of the incident. Risks are reviewed at monthly pathology meetings of senior clinical and technical staff and are also reviewed monthly within the clinical division of the Trust. There is a risk structure, but this structure mainly consists of "managers" with minimal clinical input; clinical input is sought "as and when needed", but the main question is how the managers will automatically know unless clinicians are an integral part of this group.

• Patient Safety (2): Error reports are passed to the patient safety department, who coordinate, investigate and escalate as appropriate.

• Trust incident reporting (2): If an error has escaped pathology, i.e. impacted on another area or patient, it would be reported on the Trust incident reporting system and investigated as a Trust incident.

• Via Datix, no further information given (9): All pathology errors are reported on Datix.

• Not aware of action taken (2 of 59): De-escalated and sidelined within the clinical error investigation/reporting system because a manager outside pathology decides. It is not clear; the pathology department is not receiving feedback on remedial measures for serious incidents reported.

Table 2: Categorisation of answers given to the question "How do error reports at your place of work link in with the Trust risk management system?"
Appendix 2: Topic Guides

Discussion guide for providers:

Icebreaker – check equipment etc.; who are you / job title / job role. Cross check to records and amend if needed. Recap that this is entirely confidential and anonymous: Fr3dom have never and will never release transcripts or identifiable responses to anybody.

In response to the Francis Report the Royal College of Pathologists said: "We continue our work to ensure that pathology services and their quality systems are fit for purpose in a health service which has to manage with a restricted budget."

Q1. How do you respond to this statement?
Q2. In general how do you (your organisation) monitor the quality of any services that are delivered for your patients? (Might be omitted.)
Q3. In your own words, what do you think is meant by quality assurance in terms of a pathology service?
Q4. What quality assurance clauses have you got in your agreements with your commissioners?
Q5. Are you clear about which pathology services you are commissioned to provide, and for whom?
Q6. Do you know what your responsibilities are with regard to the quality of these services?
Q7. Please can you explain what this means in practice?
Q8. In terms of any errors, are you aware of what is and is not reported locally? Do you know why these criteria (for what is and is not reported) are in place? Should you?
Q9. Similarly, are you aware of what is and is not reported nationally, and why? Should you be?
Q10. Do you know what system is being used to report these errors? And how? Who is responsible for this data entry? Who uses STEIS/Datix (or other systems such as NRLS and MHRA) respectively, and what for?
Q11. What local arrangements are in place for learning from errors, or indeed any problems and incidents identified, whether or not they led to problems?
Q12. Can you identify any changes that have been made locally in response to this process?
Q13. What effect do these clauses have on the delivery of the service, on a daily basis or from a strategic perspective?

Discussion Guide for Commissioners:

Icebreaker – check equipment etc.; who are you / job title / job role. Cross check to records and amend if needed. Recap that this is entirely confidential and anonymous: Fr3dom have never and will never release transcripts or identifiable responses to anybody.

In response to the Francis Report the Royal College of Pathologists said: "We continue our work to ensure that pathology services and their quality systems are fit for purpose in a health service which has to manage with a restricted budget."

Q1. How do you respond to this statement?
Q2. In general how do you (your organisation) monitor the quality of any services that are delivered for your patients? (Ramp-up question – might be omitted.)
Q3. Please can you give an overview of how you go about commissioning pathology services?
Q4. Are you clear about which pathology services you commission, and for whom?
Q5. In your own words, what do you think is meant by quality assurance in terms of a pathology service?
Q6. What quality assurance clauses have you got in your agreements with your providers?
Q7. Can you please provide evidence of this? (Should have been prepared in advance – this will be the document or pages referencing QoS from their contracts.)
Q8. What effect do these clauses have on the delivery of the service, on a daily basis or from a strategic perspective? (Investigate the extent to which they are familiar with national directives etc.)
Q9. Can you provide details of the way you actually measure the QoS of providers?
Q10. Do you know what your responsibilities are with regard to the quality of these services?
Q11. Please can you explain what this means in practice?
Q12. In terms of any error coding, are you aware of what is and is not reported locally? Do you know why these criteria (for what is and is not reported) are in place? Should you?
Q13. Similarly, are you aware of what is and is not reported nationally, and why? Should you be?
Q14. Do you know what system is being used to report these errors? And how? Who is responsible for this data entry? Who uses STEIS/Datix (or other systems such as NRLS and MHRA) respectively, and what for?
Q15. What arrangements have you got in place for you to learn from errors, or indeed any problems and incidents identified, whether or not they led to problems?
Q16. Can you identify any changes that have been made locally in response to this process?
Appendix 3: Quality Assurance in Pathology

For each topic/theme, the ideal model is given first, followed by examples seen in practice.

• External Accreditation
Ideal model: All parts of one pathology service are accredited.
In practice: Different disciplines have their own CPA process. Different views on CPA effectiveness; other models coming? Unaware of what happens on linked but separate sites.

• Incident / error reporting – department level
Ideal model: All staff trained and using a common electronic system.
In practice: Some paper-based reporting, but entered onto the common system. Some issues with access – seniors refer to leaders. Different local systems, but brought together for the department; parallel systems may exist. Qpulse used internally in some places.

• Staff training in error reporting
Ideal model: All staff trained and using a common electronic system; part of induction and regularly reviewed – "in the bloodstream".
In practice: Some staff require others to complete incident reports. QM or equivalent monitoring for outliers in terms of entries. Some top-ups, but mainly doing it is better than training.

• Level of errors
Ideal model: Benchmarking possible via Keele.
In practice: High error reporting numbers do not mean a problem site. "Datix" or similar systems seen as a "slap on the wrist" by staff on wards, so completion is poor.

• Communication via meetings and reports
Ideal model: Clinical representation and all staff briefed.
In practice: Clinicians not informed of actions taken once in Trust risk reporting. Internal quality meetings very good where they exist.

• Feedback mechanisms
Ideal model: Actions reported back.
In practice: Can be via the incident "investigator", not the "reporter".

• Trust risk reporting
Ideal model: Clear agreement on how pathology incidents feature in Trust risk reporting, with feedback to pathology.
In practice: Pathology defines the incident as a patient safety event.

• Learning from errors
Ideal model: Feedback mechanism via staff groups.
In practice: Trust "Learning by Improvement" group with representation from clinicians and operational managers, including pathology. Some excellent examples. The QM is usually responsible for cascading learning.

• Responsibility for Quality
Ideal model: Sits with one person with an overview of the service, medical and lab staff.
In practice: Full-time quality manager covering a number of sites, driving quality through the system, common documents, equipment etc. The Clinical Director has responsibility, and also each lab manager for quality aspects. Where does the "QM" sit?

• Quality culture
Ideal model: Openness of the department and across the whole organisation.
In practice: CPD requirement for all staff to receive quality training.

• Audit function
Ideal model: Team drawn from existing staff; a key part of the role and a professional opportunity for all.
In practice: Only a sample of staff work as auditors. Part of normal working where it's good – the sense is that this is one of the first casualties of work pressure.

• Managing demand from hospital
Ideal model: Items specified and reimbursed by CCGs.
In practice: Educate junior doctors and new staff arrivals.

• Test duplication
Ideal model: Where hospitals refer work on to another pathology department, tests should be streamlined.
In practice: Some duplication from DGHs to specialist centres.
Pathology Quality Review

Review undertaken by Fr3dom Health for and on behalf of The Barnes Review, 2013.

Fr3dom Health Solutions Limited | Fr3dom House | 11 North Street | Portslade | E Sussex | BN41 1DH | www.fr3domhealth.co.uk

Market leading Engagement strategies | Sustainable Quality improvement | Creative Patient & Public Involvement | Specialist Public Health Review Team