This document summarizes the key topics discussed at the 2005 President's Conference on "Technology in Imaging and Radiotherapy". The conference highlighted advances in medical imaging technologies like CT and their impact on clinical practice and workflow. Presentations covered improvements in CT scanner design allowing faster, lower dose scans over wider areas. Applications of newer cardiac CT and particle therapy techniques were also discussed. The conference emphasized that while technology enables better outcomes, optimized implementation through improved workflows is also needed to increase productivity and efficiency in medicine.
The British Journal of Radiology
January 2006, Volume 79, Issue 937
February 2006, Volume 79, Issue 938
March 2006, Volume 79, Issue 939
April 2006, Volume 79, Issue 940
May 2006, Volume 79, Issue 941
June 2006, Volume 79, Issue 942
July 2006, Volume 79, Issue 943
August 2006, Volume 79, Issue 944
September 2006, Volume 79, Issue 945
October 2006, Volume 79, Issue 946
November 2006, Volume 79, Issue 947
Volume 79 (2006), Case reports
September 2006, Volume 79, Special Issue 1
January 2006, Volume 79, Issue 937
● The President’s Conference 2005: ‘‘Technology in Imaging
and Radiotherapy – towards improved workflow and
productivity’’
● CT scanning the early days
● Cardiac applications of multislice computed tomography
● Technology solutions for better outcomes: integrated
information management is key to productivity increases in
medicine
● The case for particle therapy
● The contribution of PET/CT to improved patient management
● Mesenteric panniculitis in oncologic patients: PET-CT findings
● Diagnostic efficacy of SonoVue®, a second generation
contrast agent, in the assessment of extracranial carotid or
peripheral arteries using colour and spectral Doppler
ultrasound: a multicentre study
● Lymphoepithelioma-like carcinoma of salivary glands:
treatment results and failure patterns
● Comparison of patient doses in 256-slice CT and 16-slice CT
scanners
● Assessment of tube current modulation in pelvic CT
● Radiosurgical palliation of aggressive murine SCCVII
squamous cell carcinomas using synchrotron-generated X-ray
microbeams
● Solitary pulmonary nodule with growth and contrast
enhancement at CT: inflammatory pseudotumour as an
unusual benign cause
● Non-haemorrhagic subdural collection complicating rupture of
a middle cranial fossa arachnoid cyst
● Correspondence
● A deformed skull with enlarging hand and feet in a young
female
● Acknowledgment to Referees
Commentary
The President’s Conference 2005: ‘‘Technology in Imaging
and Radiotherapy – towards improved workflow and
productivity’’
P P DENDY
Günter Dombrowe, the President of the British Institute
of Radiology (BIR), introduced the theme of this year’s
Conference, and explained its dual purpose – to highlight
the contributions of medical and information technologies
towards improving clinical practice, patient outcome and
health economics; and to pay tribute to the pioneering
work of Sir Godfrey Hounsfield, the inventor of CT
scanning, perhaps the key technology of the digital
imaging age.
This Commentary provides an overview of some of
the important topics discussed at the Conference.
Some of the key presentations are also included in this
issue.
Elizabeth Beckmann reminded the audience of the early
days of CT – the excitement generated by the images of the
brain shown at the 32nd Congress of the BIR on 20 April
1972, the delightfully understated title of Sir Godfrey’s
lecture – ‘‘Computerised axial tomography, a new means
of demonstrating some of the soft tissue structures of the
brain without the use of contrast media’’, and the
subsequent publications in the BJR [1, 2]. The enduring
memory of this and other early developments is that so
much was achieved with so little money. Was Sir Godfrey
one of the last of the brilliant, intuitive, string-and-sealing-wax
physics brigade?
The first of the two nominated Hounsfield lecturers,
Willi Kalender, gave a comprehensive review of the past,
present and future of CT from a physics and technology
standpoint. He pointed out that there had been three
distinct phases of development: (1) the 1970s had been a
time of rapid development with second, third and fourth
generation scanners; (2) the 1980s had been a period of
stagnation with the competing development of MRI (the
late 1980s was the only time during a 30 year period when
there was no increase in the number of CT scanners in
Germany); (3) the 1990s were the renaissance years,
particularly with the introduction of spiral CT and
multidetector arrays.
Scan times are now typically 0.3 s to 0.5 s per full 360°
scan and 10–30 s for the whole body. The first figure is
important for temporal resolution, especially in cardiac
applications, and one of the limitations on faster times is
the centrifugal force to which sensitive components such as
the X-ray tube are subjected [3]. To achieve better
temporal resolution, increased electronic control of the
beam and possibly multiple-tube designs are being
explored.
Improvements in total scan time will be achieved
through further development of wider detector arrays,
possibly towards flat panel detectors. This will in turn
require X-ray tubes with an even higher peak output, as
the total flux of photons required to image a given volume
remains roughly the same.
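The scaling implied here, namely that the total photon flux for a given volume is roughly fixed, so peak tube output must rise in inverse proportion to scan time, can be sketched with a small, hypothetical calculation. The function name and the kW figures below are invented for illustration and are not taken from any tube specification.

```python
def peak_output_needed(reference_kw: float, reference_time_s: float,
                       new_time_s: float) -> float:
    """Peak tube output required when the same volume is covered faster.

    Assumes the total photon flux needed to image a given volume at fixed
    noise stays roughly constant, so output scales with the inverse of
    scan time (illustrative model only).
    """
    return reference_kw * reference_time_s / new_time_s

# Covering in 10 s a volume that previously took 30 s at a nominal 60 kW:
print(peak_output_needed(60.0, 30.0, 10.0))  # 180.0
```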
Like for like, patient doses have been reduced with tube
current modulation both on rotation from anteroposterior
(AP) to lateral projections and as the beam traverses the
body from high to low attenuating regions. Achieving
the same counting statistics on all data is a worthwhile
goal [4, 5].
Since 1990 the emphasis has been on scanning volumes
rather than slices and one of the landmarks has been to
achieve isotropically uniform spatial resolution, typically
in the range 0.4–0.6 mm [6]. It is important to recall that
for isotropic resolution, radiation dose to the patient
increases with the fourth power of the resolution element.
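The fourth-power rule quoted here can be made concrete with a short sketch. The helper below is illustrative only: the voxel sizes are taken from the 0.4–0.6 mm range mentioned above, and the rule is applied at face value, assuming constant image noise.

```python
def relative_dose(voxel_old_mm: float, voxel_new_mm: float) -> float:
    """Dose multiplier needed to hold image noise constant when the
    isotropic voxel dimension shrinks: dose scales as the fourth power
    of the ratio of voxel sizes (rule of thumb from the text)."""
    return (voxel_old_mm / voxel_new_mm) ** 4

# Halving the voxel costs 16x the dose; sharpening from 0.6 mm to 0.4 mm
# costs about 5x:
print(relative_dose(1.0, 0.5))            # 16.0
print(round(relative_dose(0.6, 0.4), 2))  # 5.06
```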
These improvements must also be seen in the context of
global use of radiology. CT is a relatively high dose
technique, now accounting for 25% of all radiation
exposure, and there must be strong clinical justification
for its use, and in particular serial, repeat whole body
scans.
The future for CT is hidden from view but there are
many possibilities and it is worthwhile to summarize
Kalender’s predictions – more detector rows; shorter
effective scan times; higher resolutions and more tissue
parameters (there is renewed interest in superimposing, e.g.
a calcium density map on a real density map obtained by
dual energy CT [7]); lower doses (of course!).
The second nominated Hounsfield lecturer, Adrian
Dixon, reviewed the clinical advances in CT. Two
important issues in particular were addressed:
(1) Do the ‘‘advances’’ in CT technology make any
difference to the patient?
(2) Many cutting-edge CT investigations are still
charitably funded and if the NHS is to become
responsible for their provision, they must be shown
to be cost-effective.
As a specific example of the clinical issues, he considered
the impact of multidetector CT on abdominal problems.
The improved anatomical resolution of modern helical CT
scanners enables the diagnosis of acute appendicitis or the
cause of small bowel obstruction to be made with a high
degree of accuracy [8]. Consequent on its multitasking
abilities, CT is increasingly being used as a means of
triaging patients and facilitating early discharge for those
without serious disease – with obvious benefits to the
patients and cost savings to the NHS [9].
The British Journal of Radiology, 79 (2006), 1–4. © 2006 The British Institute of Radiology
DOI: 10.1259/bjr/19232533
CT has become so good that in many areas of radiology
the real questions are now (a) is there a role for plain film
radiography? (b) when should ultrasound be used? (c) is
there a role for MR other than to avoid the use of ionizing
radiation?
This success has come at a price: clinicians are tending
to request a CT scan without fully examining the patient;
surgeons are reluctant to operate without high quality
imaging; for outpatients in oncology the number of
requests for CT staging is starting to approximate the
number of visits to hospital. However, Dixon was able to
conclude on a positive note. For the patient CT has
replaced some very unpleasant investigations.
The tribute to Hounsfield concluded with a more
specialized lecture from Albert de Roos on cardiac CT.
De Roos summarized the technical considerations for
multislice CT in cardiac scanning – low contrast detection,
spatial resolution at high contrast, temporal resolution,
scan time and patient dose. The choice of acquisition
variables and reconstruction characteristics is very depen-
dent on the clinical problem under investigation.
De Roos then reviewed a wide range of applications
including: the quantitative assessment of coronary artery
calcification [10, 11]; the assessment of coronary artery
morphology; stent and graft patency; the selection of
patients for invasive therapy; assessment of the anatomy of
pulmonary veins and the investigation of acute chest pain.
In the last of these applications there is now a one-stage
protocol, i.e. the nirvana of the ‘‘one stop shop’’ to
diagnose accurately both cardiac and non-cardiac causes
of chest pain [12].
The Mackenzie Davidson lecture, delivered by Nicola
Strickland, touched on many aspects of modern imaging
but concentrated on information technology, especially
PACS.
PACS has now become a mature technology, especially
as a result of the DICOM standard and network
protocols. It clearly has the potential to improve workflow
and productivity but does not, in itself, solve departmental
inefficiencies and may highlight them. It is not a ‘‘quick
fix’’ and must be an integral part of workflow engineering.
Looking to the future, speech recognition and web
browsers will be developed further. The electronic patient
record remains a major challenge, since the facilities
provided need to match the service being provided. A
good example is home reporting – a full work load
requires a full diagnostic service, emergency reporting
needs only more limited facilities.
Strickland concluded that technology provides the
means for improving workflow and productivity – the
challenge is to optimize the use of technology to maximize
productivity in a clinically efficient way.
Manufacturers’ views of the use and development of
technology were also presented. Hermann Requardt from
Siemens Medical Solutions reminded us that, worldwide,
challenges to healthcare systems are dominated by two
main topics – demographic factors and progress in
medicine. In diagnostic radiology, as in some other
branches of medicine, for example molecular/genetic
medicine, the challenge now is not a lack of information
but a flood of information. Drawing an analogy from
industry where knowledge management systems are
commonplace, Requardt predicted that information tech-
nology would bring about a paradigm shift in medicine if
it could facilitate the formation of a clinical knowledge
database and enable this to be used to complement the
data from the individual patient.
Jacques Souquet from Philips Medical Systems con-
sidered some other aspects of the impact of future
technology on medical imaging. Picking up a theme
from the previous speaker on progress in medicine, he
pointed out that knowledge doubling times have fallen
from about 8 years in 1970 to 1 year in 2001. Increased
use of computer-aided decision support is one way to improve
management of data: for example, nodule identification in
a radiograph, embedded medical knowledge used to
reduce avoidable medical errors, and genetic algorithms to
discover diagnostic patterns in huge data sets.
Souquet reminded us that much remains to be done.
There are still several diseases for which no diagnostic test
is available and the development of drugs to correct
specific genetic flaws that are biological causes of cancer
has a long way to go. In conclusion, he threw out two
challenges:
(1) How can the translation from cell to mouse to man be
speeded up?
(2) How can the multidisciplinary constituencies contri-
buting to progress (basic sciences, engineering, medi-
cine, industry) be coordinated? This is a challenge that
is close to one of the fundamental aims of the BIR.
Jane Guinn from Kodak Ltd concluded the session by
comparing the techniques of computed radiography (CR)
and digital radiography (DR) from the standpoint of
workflow patterns. She listed 16 distinct stages in the
production of a traditional analogue film, many
involving radiographer movement. CR removed only
two steps, DR removed nine. This had a big impact on
average examination time and in a busy general radio-
graphy room, on patient waiting time. Unfortunately DR
does not provide the flexibility of CR for several
examinations.
Peter Williams delivered the Silvanus Thompson
Memorial Lecture. With the somewhat enigmatic title
‘‘Things can only get better’’ he reviewed the development
of external beam radiotherapy treatment delivery, con-
centrating on current developments and future promises.
Early examples of ‘‘things getting better’’ included
megavoltage therapy with Co-60; isocentric mounting;
electrons as well as X-rays; anatomical data from the CT
scanner for treatment planning. For a few years the ability
to model tumours exceeded the ability to treat, which was
restricted to a cylinder.
In 1987 the multileaf collimator (MLC) became avail-
able for beam shaping and as with most really worthwhile
medical developments, there were no formal health quality
assessments or clinical trials.
MLCs led to intensity-modulated radiotherapy (IMRT),
essentially conformal therapy for difficult targets [13, 14],
and at the same time electronic portal imaging was being
developed to provide active control of beam direction
rather than a passive verification system.
Williams then discussed the current development of real
time tumour tracking to counteract patient movement by
mounting a diagnostic machine with fluoroscopic, radiographic
and CT capabilities onto the treatment linear
accelerator. Examples of improved set-up were shown for
lung and bladder treatments – image-guided radiotherapy
will certainly make things better!
For the future, although physicists and engineers are not
yet spent (vide the next topic of proton therapy), they will
need help from other disciplines, e.g. molecular biologists
and geneticists (biological targeting for anoxia and
metabolism, and selective targeting of tumour cells), and
from radiobiologists (for example to exploit the
information on bystander effects coming from microbeam
studies).
As a fitting sequel to the Silvanus Thompson Memorial
Lecture, Bleddyn Jones presented the case for particle
therapy, especially with protons. The theoretical advan-
tages of using the Bragg dose peak to improve the
therapeutic ratio have been known for many years.
Unfortunately, for a 60 MeV beam the peak is at only
3 cm depth and treatment is limited to quite superficial
tumours. Notwithstanding, over 1200 choroidal melano-
mas have been treated successfully at the Clatterbridge
Hospital.
Work by Lomax et al [15] has shown that for treatment
of the breast and regional nodes, a 9-field photon IMRT
approach can either produce similar dose homogeneity
across the planning treatment volumes to that of a proton
plan, or similar sparing of dose to both lungs and the
heart, but not both.
Jones estimated that 10–20% of patients might be better
treated by particle radiotherapy and believes that technical
improvements in physics, bioengineering and computing,
especially in robotics and particle delivery, now make
treatment with a 200 MeV beam, with Bragg peak depths
approaching 20 cm, a practical proposition. It is antici-
pated that this will lead to a big increase in demand for
particle therapy in the UK [16].
The Conference concluded with two further papers in
diagnostic imaging. Catherine Owens gave a wide-ranging
review of the changing practice of paediatric imaging. The
diagnostic capability and accuracy of multidetector CT
(MDCT) angiography was compared with echocardio-
graphy, cardiac catheterization and surgery in the assess-
ment of the great vessels in 40 consecutive patients (mean
age 5 years) with congenital heart disease. MDCT was
accurate, showing good agreement with interventional
catheter and surgery and provided additional information.
Effective doses of radiation were low – ranging from
0.97 mSv in neonates to 1.7 mSv in adolescents [17].
Magnetic resonance coronary angiography and late-
enhancement imaging have been shown to be feasible in
children who had undergone arterial switch for transposi-
tion of the great arteries. Diagnostic quality images were
acquired in 72% of the coronary arteries imaged and this
rose to 100% in subjects over 10 years old [18].
Finally, Peter Ell discussed the contribution of PET/CT
to improved patient management. Whilst acknowledging
the important contribution in neurology and cardiology, in
the limited time available and in the context of the
Conference, Ell concentrated on oncology. Four distinct
areas were covered, diagnosis, staging, radiotherapy
planning and treatment monitoring.
Two very different challenges for this wonderful
technique were highlighted. At the cutting edge of research
there are almost unlimited opportunities for PET/CT to be
used to assess the biology of individual response to
treatment [19]. Whilst recognizing the importance of F-18
fluorodeoxyglucose in oncology, Ell emphasised the need
to look at a wide range of other novel markers that are
being developed, aimed at imaging proliferation [20, 21],
hypoxia, angiogenesis, apoptosis, etc.
At the other extreme there is the huge problem of
diffusion of technology in a cost-effective way so that, on a
day-to-day basis, many more of the millions of cancer
sufferers can benefit from the power of multimodality
imaging.
Ell’s concluding remarks were:
● PET/CT has changed patient management;
● It is best at assessing extent and severity of cancer;
● It informs radiotherapy planning; and
● It combines the power of CT with the unique
metabolic mapping obtained with PET.
These remarks were, of course, addressed to PET/CT
but, in many respects, with suitable changes of wording,
could be applied to the impact of other technological
advances discussed during the 2005 President’s
Conference. We commend to you the full articles
contributed by the speakers in this issue of the Journal.
Acknowledgments
I am grateful to Fergus Gleeson and Günter Dombrowe
for helpful contributions to this Commentary.
References
1. Hounsfield GN. Computerised transverse axial scanning
(tomography). Part 1 description of system. Br J Radiol
1973;46:1016–22.
2. Ambrose J. Computerised transverse axial scanning (tomo-
graphy). Part 2 clinical application. Br J Radiol
1973;46:1023–47.
3. Schardt P, Deuringer J, Freudenberger J, Hall E, Knipfer W,
Mattern D, et al. New X-ray tube performance in computed
tomography by introducing the rotating envelope tube
technology. Med Phys 2004;31:2699–706.
4. Kalender WA, Wolf H, Suess C. Dose reduction in CT by an
anatomically adapted tube current modulation. Med Phys
1999;26:2248–53.
5. Greess HR, Wolf H, Suess C, Lutze J, Kalender WA, Bautz
WA. Automatic exposure control to reduce dose in subsecond
multislice spiral CT – Phantom measurements and clinical
results. Radiology 2002;225 Suppl. RSNA programme p 593.
6. Kalender WA. Thin-section three dimensional spiral CT. Is
isotropic imaging possible? Radiology 1995;197:578–80.
7. Kalender WA, Klotz E, Suess C. Vertebral bone mineral
analysis: an integrated approach with CT. Radiology
1987;164:419–23.
8. See TC, Ng CS, Watson CJE, Dixon AK. Appendicitis:
spectrum of appearances in helical CT. Br J Radiol
2002;75:775–81.
9. Ng CS, Watson CJE, Palmer CR, See TC, Beharry NA,
Housden BA, et al. Evaluation of early abdominopelvic
computed tomography in patients with acute abdominal pain
of unknown cause – prospective randomised study. BMJ
2002;325:1387–9.
10. Girshman J, Wolff SD. Techniques for quantifying coronary
artery calcification. Semin Ultrasound CT MR 2003;24:33–8.
11. Thompson GR, Partridge J. Coronary calcification score:
the coronary-risk impact factor. Lancet 2004;363:
557–9.
12. White CS, Kuo D, Keleman M, Jain V, Musk A, Zaidi E,
et al. Chest pain evaluation in the emergency department; can
MDCT provide a comprehensive evaluation? AJR Am
J Roentgenol 2005;185:533–40.
13. Williams PC. IMRT: delivery techniques and quality
assurance. Br J Radiol 2003;76:766–76.
14. James HV, Scrase CD, Poynter AJ. Practical experience
with intensity modulated radiotherapy. Br J Radiol
2004;77:3–14.
15. Lomax AJ, Cella L, Weber D, Kurtz JM, Miralbell R.
Potential role of intensity-modulated photons and protons
in the treatment of the breast and regional nodes. Int J Radiat
Oncol Biol Phys 2003;55:785–92.
16. Jones B, Burnet NG, Price P, Roberts JT. Modelling the
expected increase in demand for particle therapy: implications
for the UK. Br J Radiol 2005;78:832–5.
17. Benson C, Taylor A, Ross UG, et al. Three-dimensional anatomy
of the great vessels defined by 16-slice multi-detector CT
angiography in neonates, infants, children and adolescents with
congenital heart disease. Presented at the 42nd Congress of the
European Society for Paediatric Radiology, Dublin, June 2005.
18. Taylor AM, Dymarkowski S, Hamaekers P, et al.
MR coronary angiography and late-enhancement myocardial
MR in children who underwent arterial switch surgery
for transposition of great arteries. Radiology 2005;234:542–7.
19. Brugarolas J, Clark JW, Chabner B. Using ‘‘rationally
designed drugs’’ rationally. Lancet 2003;361:1758–9.
20. Shields AF, Grierson JR, Dohmen BM, et al. Imaging
proliferation in vivo with 18FLT and positron emission
tomography. Nat Med 1998;4:1334–6.
21. Francis DL, Visvikis D, Costa DC, Croasdale I,
Arulampalam TH, Luthra SK, et al. Assessment of recurrent
colorectal cancer following 5-fluorouracil chemotherapy using
both 18FDG and 18FLT. Eur J Nucl Med Mol Imaging
2004;31:928.
President’s conference paper
CT scanning the early days
E C BECKMANN, BSc(Eng)
Lanmark, Beaconsfield, Bucks, UK
Abstract. CT scanning has become an established diagnostic tool within the radiology department. This article
covers some of the history of the development and early days of CT scanning. It is based upon the lecture given
on the Memorial Day for Sir Godfrey Hounsfield during the British Institute of Radiology President’s
Conference 2005.
It is less than 34 years ago, on 20th April 1972, that an
unknown engineer from EMI Ltd, the company better
known at the time for publishing the Beatles' records,
gave a presentation at the 32nd Congress of the British
Institute of Radiology. The engineer, Godfrey Hounsfield,
was lecturing with Dr James Ambrose from Atkinson
Morley’s Hospital on ‘‘Computerised Axial Tomography
(A new means of demonstrating some of the soft tissue
structures of the brain without the use of contrast media)’’
[1, 2]. Many people attending that BIR congress will never
forget the experience of hearing a presentation on CT
scanning for the first time. In fact Hounsfield had
presented the results of some of his animal experiments
the previous year at the 2nd congress of the European
Association of Radiology, in Amsterdam, but they had not
excited much interest. The same might have happened in
the USA because at a Neuro Postgraduate Course at the
Albert Einstein College of Medicine, New York, on
Monday 15th May 1972, only about a dozen people
stayed to hear an extra lunchtime lecture by Hounsfield
and Dr Bull, where they showed the first clinical images.
However these people realised the significance of what they
had seen and the news spread rapidly.
The beginning
In the mid 1960s Hounsfield was working on the pattern
recognition of letters when he began to consider whether
he could reconstruct a three-dimensional representation of
the contents of a box from a set of readings taken through
the box at randomly selected directions. He found that by
considering the three-dimensional object within the box as
a series of slices, reconstruction was easier than treating
the content as a volume.
He tested the theoretical principle by working with a matrix
of numbers set to zero with a square in the middle where each
number was set at 1000. He entered these data into a com-
puter programme to get simulated absorption values and
then reconstructed the picture using another programme.
Hounsfield recalled his surprise at how accurate the result was.
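Hounsfield's numerical experiment can be mimicked with a toy sketch along these lines. This is an illustrative simplification, using only two viewing directions (row and column sums) and a Kaczmarz-style iterative update, whereas his programme took readings through the box in many directions.

```python
import numpy as np

# A matrix of zeros with a central square set to 1000, "scanned" by
# summing along rows and columns, then reconstructed from those sums alone.
N = 32
phantom = np.zeros((N, N))
phantom[12:20, 12:20] = 1000.0          # the square of 1000s

row_sums = phantom.sum(axis=1)          # simulated absorption readings
col_sums = phantom.sum(axis=0)

recon = np.zeros((N, N))
for _ in range(20):                     # iterative algebraic reconstruction
    for i in range(N):                  # enforce each row-sum equation
        recon[i, :] += (row_sums[i] - recon[i, :].sum()) / N
    for j in range(N):                  # enforce each column-sum equation
        recon[:, j] += (col_sums[j] - recon[:, j].sum()) / N

# The square re-emerges as the brightest region, though two views are far
# too few for an exact picture:
print(recon[16, 16] > recon[0, 0])      # True
```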
The project proposal
Once Hounsfield had proved the theoretical principle he
went on to generate the original project proposal in 1968.
Here he stated ‘‘The purpose of the study was to investigate
the employment of a computer to make better use of the
information obtained when an object is examined by gamma
rays or X-rays’’. In this proposal Hounsfield compared the
classic conventional X-ray technique producing a confused
and fuzzy picture to the clear outline produced by the
proposed system. Hounsfield proposed a system as shown in
Figure 1 based upon reconstructing pictures of slices through
an object and in detailing the expected benefits he indicated a
theoretical accuracy of detection better than 1%.
The lathe bed model
The initial test rig was built on the bed of an old lathe which
Hounsfield had been using in a previous project working on
computer stores. Hence the early test unit became referred to
as the ‘‘Lathe bed model’’. The initial rig utilized a gamma
source, Americium 95, with a photon counter as the detector.
On this rig, the source took 160 readings on each traverse of
the object, which was rotated 1° at the end of each traverse,
through a total of 180°. It
took 9 days to collect sufficient information, and 2.5 h to
reconstruct the image on an ICL 1905 mainframe computer.
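On the usual account of the rig (160 readings per traverse, one 1° rotation between traverses over 180°), the data-collection arithmetic is easy to tally; the figures below simply restate the text.

```python
# Tally of the lathe-bed data collection (figures restated from the text).
readings_per_traverse = 160
traverses = 180                      # one per 1-degree step over 180 degrees
total_readings = readings_per_traverse * traverses
print(total_readings)                # 28800

# Nine days of collection works out at roughly half a minute per reading:
seconds_per_reading = 9 * 24 * 3600 / total_readings
print(round(seconds_per_reading))    # 27
```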
However, the resultant images proved the feasibility of the
technique and with the replacement of the gamma source by
an X-ray source as shown in Figure 2, the scanning time was
reduced to 9 h.
Initial images were of inert objects, then specimens from an
abattoir, including bullocks' brains and pigs' bodies as shown
in Figure 3. Due to the long scan times, particularly with the
gamma source, many of these specimens decayed while the
pictures were acquired, producing gas bubbles which caused
artefacts in the images. This initial work was done by a very
small team comprising Hounsfield, Stephen Bates (programming),
Peter Langstone (electronics) and Mel King (mechanics),
working on a very low budget of £25 000.
Received 12 September 2005 and accepted 16 September 2005.
Figure 1. Extract of the original 1968 project proposal.
The British Journal of Radiology, 79 (2006), 5–8. © 2006 The British Institute of Radiology
DOI: 10.1259/bjr/29444122
Dr James Ambrose recalls that, in about 1969, he received a
call from an old acquaintance, Dr Evan Lennon, then
principal medical officer in Radiology at the Department of
Health, asking if he would see a man called ‘‘Godfrey
Hounsfield’’ and listen to him. Lennon had found him
confusing but was reluctant to dismiss him as a crank
(Ambrose later learnt that other eminent radiologists had
already dismissed him as a crank!). Ambrose recalls that when
he and his senior physicist Dr John Perry met Hounsfield, the
conversation was difficult. Hounsfield would only say that the
method was fundamentally different from other methods of
X-ray imaging, more efficient in photon usage and likely to be
more sensitive to small density variations. In order to
demonstrate a clinically relevant image, Ambrose arranged
for a bottled specimen of a brain to be borrowed from a
museum and was amazed at the image Hounsfield showed
him 5 weeks later. An image of the first brain scanned is
shown in Figure 4.
Building the prototype
Having shown some clinically interesting images the
project was then ready to move to the next stage of
building a full prototype. However funding was an issue. It
was Gordon Higson at the Department of Health who had
the foresight to place an order for a machine with a
theoretical specification which included a 4–5 min scan
time and a 0.5% pixel accuracy, and this enabled the
project to continue. This order was for a prototype and
three clinical machines that would generate sufficient
income to fund a fifth machine for Hounsfield and his
team to keep and work on. The Department of Health
order would also fund half the remaining research costs
and in exchange they would receive a small royalty on
sales. At the time it was calculated that it would cost
£69 000 to build a complete working system and so it was
agreed that the Department of Health would pay £150 000
for each of the four systems.
The first clinical patient
The prototype was installed at Atkinson Morley’s
Hospital in South London where the first patient, a
middle aged lady with a suspected frontal lobe tumour,
was scanned on 1st October 1971. The surgeon who
operated on her shortly afterwards reported that ‘‘it looks
exactly like the picture’’ shown in Figure 5.
Hounsfield remained cautious. He recalled ‘‘I’ve had this
before, first time is always lucky and then everything else
goes wrong after that. So I thought, the next ones are not
going to be any good, but they did another ten more
patients and every one of them came out as being obvious
diseases of the brain showing up in various forms. Dr
Ambrose found that, by injecting iodine-based contrast
agent that would localize the particular spot where the
tumour was and it showed up even better’’. Hounsfield
took some of the contrast enhanced images and subtracted
without contrast images to compare the blood flow on
either side of the brain.
In the original system the patient’s head was placed in a
rubber cap surrounded by water. This water bag was used
to reduce the dynamic range of the detected X-rays and
improve the absolute values of the attenuation figures.
Using one sodium iodide (NaI) crystal and photomul-
tiplier tube detector per slice, plus one as a reference
detector, with a scan time of 4.5–20 min per 180° scan, the
system acquired two contiguous slices per scan, each with an
80 × 80 matrix of 3 mm × 3 mm × 13 mm voxels. Early
images showed the ability to meet the pixel density
accuracy of 0.5% in the absorption coefficient as defined in
the theoretical specification.
Figure 2. The original lathe bed model (copyright EMI Ltd).
Figure 3. Early scan of a pig.
Figure 4. First image of a brain specimen.
The three systems ordered by the Department of Health
were installed at the National Hospital for Neurology and
Neurosurgery in London, Manchester and Glasgow. After
this, the first CT scanners were installed in the USA at the
Massachusetts General Hospital and the Mayo Clinic, where
the first scan in the USA was done on 19th June 1973.
Reconstructing the picture
Early scan data were actually taken back to EMI on tape
for processing overnight which took 20 min per image on an
ICL 1905 computer. In production this was done on a mini-
computer which fortuitously had emerged at the right time.
Images were taken back the next day on tape to Atkinson
Morley’s Hospital to be displayed. The early images were
displayed in three ways; paper printout, cathode ray tube
(CRT) display or as a Polaroid picture of the CRT display.
The early images were generated using iterative algebraic
reconstruction implemented by Steve Bates on the ICL 1905
mainframe. Subsequently reconstruction used the filtered
back projection or convolution method invented and
patented by Chris Lemay, one of the many patents filed
and held by Hounsfield and his team. On the original EMI
Mk1 scanner an 80×80 image took 7 min to process; with
filtered back projection on the same computer a 160×160
image could be processed in 30 s after the end of the scan.
It had been thought that image reconstruction and
processing was so complicated that it would have to be
done at a central processing unit on a suitable large and
fast main frame machine.
But the introduction of the mini computer and the
implementation of the new improved reconstruction
algorithms were to change this.
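The two reconstruction approaches mentioned above differ mainly in computational cost: algebraic methods iterate towards a solution, whereas filtered back projection is a single filter-and-sum pass. The following is a minimal illustrative sketch in Python/NumPy (an assumption of this article's editor, not the EMI implementation): it reconstructs a uniform disk from its analytically known parallel projections, every projection of a centred disk of radius r being p(s) = 2√(r² − s²).

```python
import numpy as np

def fbp_disk(n=128, n_angles=180, radius=0.4):
    """Minimal parallel-beam filtered back projection of a centred disk.

    Every parallel projection of a disk of density 1 and radius r is
    p(s) = 2*sqrt(r^2 - s^2), independent of the viewing angle.
    """
    s = np.linspace(-1.0, 1.0, n)
    proj = np.where(np.abs(s) < radius,
                    2.0 * np.sqrt(np.maximum(radius**2 - s**2, 0.0)),
                    0.0)

    # Ram-Lak (ramp) filter applied in the Fourier domain.
    freqs = np.fft.fftfreq(n, d=s[1] - s[0])
    filtered = np.real(np.fft.ifft(np.fft.fft(proj) * np.abs(freqs)))

    # Back-project the filtered profile over all viewing angles.
    x, y = s[None, :], s[:, None]
    recon = np.zeros((n, n))
    for theta in np.linspace(0.0, np.pi, n_angles, endpoint=False):
        t = x * np.cos(theta) + y * np.sin(theta)
        recon += np.interp(t, s, filtered)
    return recon * np.pi / n_angles

image = fbp_disk()  # density close to 1 inside the disk, near 0 outside
```

Because filtering and back projection are non-iterative, each projection can in principle be processed as soon as it is acquired, which is what made image display shortly after the end of the scan possible.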
CT1010 scanner
A challenge with the original EMI Mk1 scanner was the
water bag, both as regards the ease of use with patients
and also due to the occasional water leak! Replacement of
the water bag with shaped carbon fibre wedges and bean
bags was a significant improvement. This was further
enhanced by the increase to eight detectors per slice in the
CT1010, which was still a two contiguous slice scanner
offering 160×160 and 320×320 matrix sizes over a
210 mm scan diameter and with the minimum scan time
improved to 1 min. The prototype of this system was
installed in 1975 at Atkinson Morley’s Hospital and
showed significant improvement in clinical image quality.
Body scanning
The feasibility of body scanning was proved when a slim
member of the EMI team, Tony Williams, was scanned in
a head scanner.
The first body images taken in the body prototype
machine were of Hounsfield himself on 20th December
1974. The first body images were shown to a meeting at
the first International Conference on CT Scanning in
Bermuda on Friday 14th March 1975; one of these images
is shown in Figure 6.
All the research machines were named after stones:
Opal, Pearl, Garnet and the body prototype was Emerald.
This Emerald system was first installed clinically at
Northwick Park Hospital in March 1975. The first body
scan carried out in the USA was in October 1975 at the
Mallinckrodt Institute, St Louis. Dr Ron Evans recalled
that this was a jaundiced patient, in whom it had been
difficult to differentiate between medical and surgical
jaundice. The CT scans showed that it was surgical
jaundice which was subsequently clinically confirmed.
Initially known as the CT5000, the body scanner was
developed into the commercial production machine, the
CT5005. These body scanners were single slice machines using
a gantry with 30 detectors plus a reference detector to reduce
scan time to 20 s. The matrix had been increased to 320×320
over a selectable 240 mm, 320 mm or 400 mm scan field.
The generation game
All these early scanners were the so-called 1st or 2nd
generation utilizing the translate/rotate technology where
the gantry scanned across the patient before indexing by
one degree and scanning back.
An early problem in CT scanner design was detector
stabilization and the need for calibration. The EMI scanners
were using NaI crystal photon detectors and photo multiplier
tubes, and the translate/rotate technology enabled detector
calibration by taking air readings at the end of each translate
movement. This gave high accuracy but limited the speed of
the scan. By 1976 there were 17 companies offering CT
scanners with 3rd generation rotate/rotate scanners having
Figure 5. First patient image scanned on the prototype EMI
scanner at Atkinson Morley’s Hospital on 1st October 1971.
been introduced, to offer fast scan times, most based upon
xenon gas detectors arranged in an arc [3].
Hounsfield realised the need for a system that was faster
than translate/rotate and that could overcome the
calibration and artefact issues of rotate/rotate systems.
Topaz
The patent for a scanning focus system to produce a
true volume scanner was filed on 19th October 1976. The
Topaz research system, also named after a stone and
shown in Figure 7, was a 3rd generation system with a
flying X-ray spot. The X-ray flying spot scanned in a
direction opposite to the direction of rotation of the
machine which meant that the body could be scanned with
arcs of detector readings which overlapped in such a way
that they could be compared and continuously calibrated.
Built with 612 detectors including a central zoom region,
Topaz had a resolution in the x-y plane of 0.65 mm.
Volume scans taken in June 1980 were displayed in three
dimensions in real time as 1200×1200×270 pixels.
Recognition
Initially the scale for describing the attenuation
coefficients was referred to as EMI numbers. This was
then expanded by a factor of two and became known as
Hounsfield units (H) where
H = (μtissue - μwater) / μwater × 1000

and μ is the linear attenuation coefficient. Each Hounsfield
unit is equivalent to 0.1% of the attenuation of water [3].
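As a quick arithmetic check of the definition, the scale can be written as a one-line helper. The Python sketch below uses an assumed illustrative value for the attenuation coefficient of water (roughly 0.19 cm⁻¹ at typical CT energies); the function name is hypothetical.

```python
MU_WATER = 0.19  # linear attenuation coefficient of water, cm^-1 (illustrative value)

def hounsfield(mu_tissue, mu_water=MU_WATER):
    """Hounsfield units: 0 for water, -1000 for air (mu ~ 0)."""
    return 1000.0 * (mu_tissue - mu_water) / mu_water

hu_water = hounsfield(MU_WATER)  # 0.0
hu_air = hounsfield(0.0)         # -1000.0
```

The fixed points follow directly from the definition: water maps to 0 and a vacuum (or, to a good approximation, air) to -1000, consistent with each unit being 0.1% of the attenuation of water.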
In addition to giving his name to the unit of attenuation,
Hounsfield received many awards including the BJR
Barclay prize jointly with Ambrose in 1974, the Nobel Prize
for Physiology or Medicine in 1979 [4] and a Knighthood
in 1981.
Hounsfield and his team created the CT scanner,
which has had an explosive impact on diagnostic
radiology, with little money and few resources. By the
end of the 1970s they already had plans for many of the
technologies which were to develop the CT scanner over
the next 30 years, including helical multislice scanners and
high power continuously rated scanned beam X-ray tubes.
They developed many of the techniques which formed
the foundation of modern imaging including image
subtraction. By 1976 the reconstruction techniques used
in CT were already being applied to other areas including
ultrasound and nuclear magnetic resonance.
Acknowledgments
The author is indebted to many people especially those
members of the original EMI team who worked with Sir
Godfrey Hounsfield for their input to the original lecture
and material used in this article.
References
1. Hounsfield GN. Computerised transverse axial scanning
(tomography): Part 1. Description of system. Br J Radiol
1973;46:1016–22.
2. Ambrose J. Computerised transverse axial scanning (tomo-
graphy): Part 2. Clinical application. Br J Radiol
1973;46:1023–47.
3. Brooks RA, Di Chiro G. Principles of computer assisted
tomography (CAT) in radiographic and radioisotopic imaging.
Phys Med Biol 1976;21:689–732.
4. Computed medical imaging. Nobel lectures in physiology or
medicine 1971–1980; 568–86.
Figure 7. Topaz 3rd generation flying focal spot scanner.
Figure 6. Body scan of Hounsfield taken on the prototype
scanner in the laboratories and shown at Bermuda conference
on 14th March 1975.
President’s conference paper
Cardiac applications of multislice computed tomography
¹A DE ROOS, MD, ¹L J M KROFT, MD, ²J J BAX, MD, ¹H J LAMB, MD and ¹J GELEIJNS, PhD
Departments of ¹Radiology and ²Cardiology, Leiden University Medical Center, Albinusdreef 2, 2333 ZA Leiden, The Netherlands
Multislice CT (MSCT) is gaining clinical acceptance for
cardiac imaging owing to improved temporal and spatial
resolution of the latest 16-slice and 64-slice technology.
Although the cardiac MSCT applications are promising,
there is still room for further technical improvements and
optimization of post-processing techniques for cardiac
evaluation.
Interestingly, the data acquired for CT angiography of
the coronary arteries can also be used to create volumetric
cine loops of cardiac function. The functional data are
available without the need for repeat scanning or for
administration of additional contrast material [1].
Furthermore, MSCT allows assessment of first-pass
perfusion and delayed enhancement imaging in patients
with subacute myocardial infarction. Recently, it has been
reported that MSCT reveals microvascular obstruction or
the so-called no-reflow phenomenon as a late perfusion
defect in patients with re-perfused acute infarctions,
similar to observations made by other techniques like
MRI [2]. With further development MSCT may allow
combined assessment of the presence and extent of
coronary atherosclerosis, the percent diameter stenosis,
plaque characterization and the effect of the lesion on
perfusion and myocardial function.
In this review, the technical requirements of cardiac
MSCT and some frequent clinical applications are
discussed.
MSCT imaging requirements
Requirements for cardiac MSCT image acquisition
depend strongly on the clinical problem. For example,
CT coronary angiography requires excellent spatial and
temporal resolution, whereas only modest spatial and
temporal resolution is sufficient for the assessment of the
anatomy of pulmonary veins and the left atrium. In
general, the higher the requirements for image quality
become, the more complex the acquisition, the longer the
scan time and the higher the patient dose. Main aspects with regard
to imaging performance are low-contrast and spatial
resolution, temporal resolution, and scan time. Patient
dose and radiation risk should always be considered as the
counterpart of image acquisition and image quality.
Low-contrast resolution and spatial resolution
Low-contrast resolution is the ability to visualize
structures that demonstrate only a small difference in
Hounsfield units compared with their direct environment.
In cardiac applications of CT, native tissue contrasts are in
general not sufficient to differentiate between, for example,
the vessel wall and its unenhanced lumen, or the heart and
the inner chambers. Contrast enhancement is thus
mandatory for visualizing the lumen of coronary arteries,
the heart chambers, pathology of the myocardium or
anatomy of pulmonary veins. Low-contrast resolution
depends on tube current (mA), the reconstructed slice
thickness, tube voltage, beam filtration and the reconstruc-
tion algorithm, and is strongly correlated to radiation
exposure. In general, low-contrast resolution performance
of CT scanners is not a limitation for the application of
cardiac CT.
Spatial resolution, or high-contrast resolution, deter-
mines the ability to visualize contours of small structures
within the scanned volume. Small objects can only be
resolved when there is a rather large contrast with the
direct environment. Considerable improvement of spatial
resolution in clinical acquisitions was achieved with the
latest generations of multislice CT scanners. This is of
importance, particularly for the application of CT
coronary calcification scoring and CT coronary angio-
graphy. The actual diameters of the lumen of normal
coronary artery segments range from 5 mm in the
proximal segments to less than 1 mm in the distal
segments [3]. This means that spatial resolution of
1.0 mm in all three dimensions should be sufficient for
imaging of the coronary arteries, except for distal segments
that would require a spatial resolution of at least 0.5 mm.
Bypass graft diameter typically ranges from 4 mm to
6 mm. A spatial resolution of 2 mm³ (voxel size) might
thus be sufficient for imaging the lumen of bypass grafts.
For imaging of small structures within the coronary
arteries, such as atherosclerotic plaque and stents, excellent
spatial resolution, even better than 0.5 mm³, might be
required. Voxel size is often used as an indicator of
spatial resolution. However, voxel size should be inter-
preted with care since smaller voxel size does not
necessarily imply better spatial resolution. Spatial
resolution is preferably expressed as the response of a
delta-function; in CT, this response is either called a point-
spread-function (spatial resolution in the axial plane) or a
slice sensitivity profile (spatial resolution along the z-axis).
Spatial resolution is limited by the acquisition geometry of
the CT scanner, the reconstruction algorithm and the
reconstructed slice thickness. The performance of current
64-slice scanners with regard to spatial resolution,
expressed as the full-width half-maximum of the response
of a delta-function, is within the range 0.6–1.0 mm in all
three dimensions.
Received 22 September 2005 and accepted 5 October 2005.
The British Journal of Radiology, 79 (2006), 9–16 © 2006 The British Institute of Radiology
DOI: 10.1259/bjr/67045628
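A full-width half-maximum quoted for a point-spread-function can be estimated numerically from a sampled profile. The sketch below is an illustration added by the editor (the `fwhm` helper is hypothetical, not from the paper): it locates the two half-maximum crossings by linear interpolation. For a Gaussian profile of standard deviation σ the true value is 2√(2 ln 2)·σ ≈ 2.355σ, which gives a convenient check.

```python
import numpy as np

def fwhm(x, y):
    """Estimate the full-width half-maximum of a single sampled peak.

    Assumes the peak lies well inside the sampled range, so both
    half-maximum crossings are bracketed by samples.
    """
    half = y.max() / 2.0
    above = np.where(y >= half)[0]
    i0, i1 = above[0], above[-1]

    def crossing(i, j):
        # Linear interpolation of the x-position where y passes `half`.
        return x[i] + (half - y[i]) * (x[j] - x[i]) / (y[j] - y[i])

    return crossing(i1, i1 + 1) - crossing(i0 - 1, i0)

# A Gaussian point-spread-function with sigma = 0.3 mm:
xs = np.linspace(-2.0, 2.0, 401)
psf = np.exp(-xs**2 / (2 * 0.3**2))
width = fwhm(xs, psf)  # close to 2.355 * 0.3 = 0.7065 mm
```

The same routine applied to a slice sensitivity profile gives the z-axis resolution figure, which is how the 0.6–1.0 mm numbers above are expressed.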
Temporal resolution
Temporal resolution determines whether fast moving
objects can be resolved in the CT image. Good temporal
resolution limits motion artefacts and blurring of the
image. Principally, good temporal resolution can be
achieved by a short reconstruction window providing
snap shots of the beating heart and coronary arteries.
Good temporal resolution in cardiac CT is realised by fast
data acquisition (fast rotation of the X-ray tube), but even
more importantly by a dedicated reconstruction algorithm.
A recent paper [4] provides information on the rest
period of the heart, which is a measure for the required
reconstruction window. The rest period is defined as the
time during which the 3D motion of a coronary artery is
less than 1 mm. It was reported that, for patients with a
heart rate of 64±9 beats per minute (BPM), the end-
systolic rest period duration was 76±34 ms; and the mid-
diastolic rest period duration was 65±42 ms for the
proximal to middle segments of the right coronary artery.
For the left coronary artery tree, the end-systolic rest
period duration was 80±25 ms; the mid-diastolic rest
period duration 112±42 ms. From these data it is
concluded that the duration of a ‘‘snap shot’’ of the
coronary arteries, or in other words the reconstruction
window, should be shorter than 65–110 ms. This is in good
agreement with earlier papers; in one paper it is suggested
that the reconstruction window should be lower than
100 ms for coronary angiography in mid-diastole at
62¡10 BPM [5], and in another paper it is stated that a
100 ms reconstruction window is relatively optimal for
most patients at heart rates up to 90 BPM [6]. All of these
considerations assume image reconstruction at the cardiac
phase point that is associated with least motion, e.g. a
reconstruction window starting between 60% and 80% of
the interval between two consecutive R-waves. More strict
criteria for the reconstruction window apply if the heart
should be assessed at more than one cardiac phase point,
including those that are associated with rapid movement
of the heart wall, e.g. for studying the dynamics of the
myocardium. More strict criteria apply as well when a
1 mm displacement of a coronary artery within the
duration of the snap shot becomes unacceptable. This
may happen, for example, when imaging small distal parts
of the coronary arteries, quantifying coronary stenoses and
assessment of coronary atherosclerotic plaque.
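Placing the reconstruction window at a fixed fraction of the R-R interval is simple arithmetic on the heart rate. The helper below is an illustrative sketch (the function name and defaults are the editor's, not the paper's); 0.6–0.8 corresponds to the mid-diastolic range mentioned above.

```python
def recon_window_start(heart_rate_bpm, fraction=0.7):
    """Start of the reconstruction window as a delay after the R-wave (ms).

    `fraction` is the position within the R-R interval; values of
    0.6-0.8 target the mid-diastolic rest period.
    """
    rr_ms = 60000.0 / heart_rate_bpm
    return fraction * rr_ms

delay = recon_window_start(60)  # 700 ms after the R-wave at 60 BPM
```

At higher heart rates the same fractional position corresponds to a shorter absolute delay, which is one reason heart-rate control matters for consistent image quality.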
The reconstruction algorithms that are used for
general CT applications provide, in principle, a temporal
resolution equal to the rotation time (360˚ rotation, full
reconstruction); the best achievable temporal resolution
with these general algorithms is slightly longer
than 50% of the rotation time (180˚ rotation, half
reconstruction). Current 64-slice scanners that are used
for cardiac applications provide a rotation time of 330–
400 ms. These typical rotation times are not short enough
for achieving a 100 ms or shorter snap-shot of the heart,
even if a 180˚ rotation half-reconstruction is applied.
Therefore, dedicated reconstruction algorithms are used in
cardiac CT that allow for reconstruction of synchronized
images from transmission data acquired during two or
more successive heart cycles according to a method
described already in 1977 [7]. These so-called segmented
(multicycle) reconstruction algorithms allow for merging
synchronized transmission data from successive heart
cycles. The more heart cycles that can be included in the
reconstruction, the better the temporal resolution. A low
pitch factor, which is typical for cardiac CT acquisition, is
required to acquire data from more than one heart cycle.
A pitch factor as low as 0.2 is required to record at least
two heart cycles and to achieve a temporal resolution of
the order of 100 ms for typical heart rates of 60–80 BPM.
Figure 1 shows, as an example, the
temporal resolution that is achievable with a reconstruc-
tion algorithm that can merge transmission data from an
unlimited number of heart cycles. The figure illustrates the
dependence of the reconstruction window on rotation time
and heart rate and was calculated for a pitch factor of 0.2.
From Figure 1 it can be concluded that, for achieving the
shortest reconstruction window, rotation time should be
adapted to the heart rate.
Scan time
Scan time is the time interval between the start and the
end of one acquisition, sometimes referred to as a
sequence. To avoid breathing artefacts and to limit the
amount of contrast material in contrast enhanced acquisi-
tions, scan time in cardiac CT should remain at least below
30 s, but preferably below 20 s. The extent of the target
volume, as well as acquisition parameters such as rotation
time, pitch factor, section thickness and number of
simultaneously acquired sections, define scan time. In
general a much shorter scan time than 20 s can now be
realised with the current generation of 64-slice scanners for
typical cardiac CT examinations; for example, a typical
Figure 1. Temporal resolution of CT coronary angiography.
The temporal resolution depends strongly on the rotation time
and the reconstruction algorithm. In segmented (multiphase)
reconstructions, temporal resolution depends also strongly on
the pitch factor. The lower the pitch factor, the more cardiac
phases are captured during the acquisition and the better tem-
poral resolution. The graphs are calculated for a pitch factor
of 0.2. The graphs clearly show the dependence of temporal
resolution on heart rate and rotation time.
scan time for calcium scoring is 2.5 s, for coronary
angiography 10 s and for an ungated acquisition of the
pulmonary veins 3.0 s.
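Scan time follows directly from these parameters: the table advances by pitch times the total collimation per rotation, so scan time is roughly scan range divided by that travel, times the rotation time. The sketch below is an approximation (it ignores the overscan needed at both ends of a spiral acquisition, so it slightly underestimates); with the coronary CTA parameters quoted in this paper it lands in the same range as the ~10 s figure.

```python
def scan_time_s(scan_range_mm, pitch, n_slices, slice_mm, rotation_s):
    """Approximate spiral scan time. Table travel per rotation is
    pitch * total collimation; overscan at the ends is ignored."""
    collimation_mm = n_slices * slice_mm
    travel_per_rotation_mm = pitch * collimation_mm
    n_rotations = scan_range_mm / travel_per_rotation_mm
    return n_rotations * rotation_s

# Coronary CTA: 64 x 0.5 mm collimation, pitch 0.19, 0.4 s rotation,
# 120 mm range -> roughly 8 s (before overscan).
t_cta = scan_time_s(120.0, 0.19, 64, 0.5, 0.4)
```

The same formula shows why the ungated pulmonary vein acquisition is so fast: raising the pitch from 0.19 to 0.83 increases table travel per rotation by more than a factor of four.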
Patient dose in MSCT
Radiation protection of patients is based on justification
and optimization. Justification implies that the benefit for
the patient outweighs the risk of radiation exposure.
Patient dose assessment is required for balancing harm
and benefit of the CT examination and to assess the effect
of measures for optimization of cardiac CT. Nowadays,
most CT scanners provide the user with an indication of
patient dose in the form of the CT dose index (CTDI) and
dose–length product (DLP). Effective dose can be derived
from these dose quantities. Effective dose from cardiac CT
coronary angiography is relatively high, mainly due to the
need to catch more than one cardiac cycle and the
resulting low pitch factor. On the other hand, effective
dose from an ungated acquisition, such as in ungated
pulmonary vein CT angiography, is relatively low due to
the high pitch factor. Effective dose for calcium scoring,
assessment of ventricle function or pulmonary veins is in
the range 1–3 mSv, effective dose for CT coronary
angiography is considerably higher, e.g. in the range 10–
15 mSv. Concern about radiation exposure stimulates
the development of methods for dose reduction in
cardiac CT coronary angiography. The field of view of
interest in cardiac CT is rather small and therefore
radiation exposure of tissue outside this field of view
can be limited by means of a special ‘‘small field’’ beam-
shaping filter. Another method for dose reduction is to
reduce X-ray output during the systolic phases that are
expected to be of less interest for the evaluation of
the coronary arteries (ECG triggered modulation of
dose). Pitfalls of small field scanning are the occurrence
of artefacts and reduced image quality. A pitfall of
tube modulation is reduced image quality at certain
relevant phases of cardiac cycle, e.g. due to an irregular
heart rate.
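The dose bookkeeping described above can be reproduced in a few lines: the dose-length product is CTDIvol times the scan length, and effective dose is estimated by multiplying the DLP by a body-region conversion coefficient. The coefficient used below (0.014 mSv per mGy cm for the adult chest) and the example numbers are illustrative assumptions by the editor, not values from this paper.

```python
def effective_dose_msv(ctdi_vol_mgy, scan_length_cm, k_chest=0.014):
    """Effective dose estimate from the standard CT dose descriptors:
    DLP = CTDIvol * scan length; E = k * DLP."""
    dlp_mgy_cm = ctdi_vol_mgy * scan_length_cm
    return k_chest * dlp_mgy_cm

# A gated coronary CTA at low pitch might reach a CTDIvol of ~50 mGy
# over a 12 cm range (illustrative numbers): about 8 mSv.
dose = effective_dose_msv(50.0, 12.0)
```

An ungated high-pitch acquisition with a CTDIvol an order of magnitude lower lands in the 1–3 mSv range quoted above, which is the arithmetic behind the gated-versus-ungated contrast in the text.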
Clinical applications
MSCT provides special opportunities for cardiovascular
CT in addition to angiography of the coronary arteries
and coronary bypass grafts. These options include
assessment of left ventricular (LV) and right ventricular
(RV) function, coronary calcification score, myocardial
infarction imaging and assessment of the anatomy of
pulmonary veins in patients with atrial fibrillation. Each of
these applications can be characterized by their specific
techniques for acquisition and reconstruction. Table 1
provides information about typical acquisition and
reconstruction parameters for some clinically established
cardiac CT applications.
Quantitative assessment of coronary artery calcification
Coronary artery calcification is a marker for athero-
sclerotic lesions in the coronary arteries. The amount of
coronary artery calcification is correlated to the risk of
coronary events. However, absence of coronary artery
calcification does not rule out atherosclerosis. Applications
Table 1. Typical acquisition and reconstruction characteristics of some cardiac CT examinations

Examination | Acquisition | Contrast (ml s⁻¹, ml) | Synchronization | Acquisition configuration (n×T mm) | Rotation time (s) | Tube voltage (kV) | Tube current (mA) | Pitch | Scan range (mm) | Scan time (s) | Reconstruction algorithm | Reconstruction window (b) (ms)
Calcium scoring | Sequential | none | Prospective triggering | 4×3 | 0.25 (a) | 120 | 200 | Not applicable | 120 | 2.5 | Half reconstruction | 250
CTA coronary arteries | Spiral | 4/100 | Retrospective gating | 64×0.5 | 0.4 | 120 | 300 | 0.19 | 120 | 9.5 | Multisegmental | 100
CTA coronary bypass | Spiral | 4/100 | Retrospective gating | 64×0.5 | 0.4 | 120 | 300 | 0.19 | 240 | 17.3 | Multisegmental | 100
RV function | Spiral | 2.5/40 | Retrospective gating | 16×2.0 | 0.4 | 120 | 40 | 0.19 | 120 | 9.0 | Multisegmental | 100
Pulmonary veins | Spiral | 5/70 | No synchronization | 64×0.5 | 0.4 | 100 | 300 | 0.83 | 120 | 2.7 | Half reconstruction | 250

(a) Partial rotation.
(b) 60 beats per minute assumed.
of quantitative assessment of coronary artery calcification
are screening of asymptomatic individuals with risk factors
for coronary artery disease and follow-up of patients who
received medication for the treatment of coronary artery
disease.
Coronary artery calcification is well visualized with
X-ray techniques such as radiography but only CT
provides a non-invasive method for detecting and
quantifying coronary artery calcification [8]. Coronary
calcification is best detected and measured in a plain CT
acquisition without contrast enhancement.
Quantification of coronary calcium was introduced in
1990 by Agatston et al [9]. They used electron beam
tomography and established the ‘‘Agatston score’’. The
Agatston score requires an acquisition with a special
protocol (3 mm contiguous slices, 130 kV). The Agatston
score is achieved by setting a threshold for the Hounsfield
unit (130 HU) and for the size of the lesion (1 mm²). Then
a pragmatic weighting of the calcified area is applied
depending on the maximum HU in the lesions for each
image. The total calcium score is calculated by summing
the weighted areas for all images (Figure 2).
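The scoring rule just described maps directly to code. The sketch below is an illustrative reimplementation by the editor, not validated clinical software: pixels at or above 130 HU are grouped into 4-connected lesions, lesions under 1 mm² are discarded, each lesion's area is weighted by its peak attenuation (1 for 130–199 HU, 2 for 200–299, 3 for 300–399, 4 for 400 and above) and the weighted areas are summed.

```python
import numpy as np

def agatston_score(hu, pixel_area_mm2, threshold=130, min_area_mm2=1.0):
    """Agatston score for one axial slice (illustrative sketch)."""
    mask = hu >= threshold
    seen = np.zeros(mask.shape, dtype=bool)
    rows, cols = mask.shape
    score = 0.0
    for r in range(rows):
        for c in range(cols):
            if not mask[r, c] or seen[r, c]:
                continue
            # Flood-fill one 4-connected lesion.
            stack, lesion = [(r, c)], []
            seen[r, c] = True
            while stack:
                i, j = stack.pop()
                lesion.append((i, j))
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if (0 <= ni < rows and 0 <= nj < cols
                            and mask[ni, nj] and not seen[ni, nj]):
                        seen[ni, nj] = True
                        stack.append((ni, nj))
            area = len(lesion) * pixel_area_mm2
            if area < min_area_mm2:
                continue  # below the 1 mm^2 size threshold
            peak = max(hu[i, j] for i, j in lesion)
            weight = 1 if peak < 200 else 2 if peak < 300 else 3 if peak < 400 else 4
            score += weight * area
    return score

# Example: a 10-pixel lesion at 250 HU with 0.5 mm^2 pixels
demo = np.zeros((20, 20))
demo[5:7, 5:10] = 250.0
demo_score = agatston_score(demo, 0.5)  # area 5 mm^2, weight 2 -> 10.0
```

The total score for a study is simply this per-slice score summed over all images, as described above.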
With the introduction of MSCT, new acquisition
protocols came into use; prospective ECG triggering in
combination with a half (180˚) reconstruction at 120 kV is
now generally used for calcium scoring. In prospective
ECG triggered MSCT acquisitions, the patient is only
exposed within the 170–200 ms acquisition window at
diastole and radiation exposure is therefore significantly
less compared with retrospective gated MSCT cardiovas-
cular examinations. The application of MSCT for
quantification of coronary calcium made it mandatory
to switch to new quantification methods that can be
compared for different scanners and that are robust with
respect to different scanners and acquisition protocols.
Alternatives for the Agatston score are the volume score
(the volume of all voxels exceeding a certain threshold)
and calcium mass (mg) [10]. The latter quantity holds the
promise of providing the best physical measure for
coronary artery calcification. Unfortunately, there is still
a lack of standardization of the MSCT techniques with
regard to image acquisition as well as to the methodologies
for quantitative coronary calcification scoring. The devel-
opment of standardized and reproducible protocols is a
technical prerequisite for coronary calcification scoring to
become a useful clinical tool. In addition, for screening
purposes, the coronary calcification score will have to be
established as a predictor independent of existing risk
factors for cardiovascular disease [11].
Coronary angiography
MSCT has rapidly evolved through different stages of
technological innovation, allowing high-quality non-inva-
sive 3D imaging of coronary artery morphology (Figures 3
and 4). Recently the diagnostic accuracy of 64-slice MSCT
for the identification and quantification of coronary artery
stenoses has been reported [12, 13]. The patient-based
analysis revealed that 94% of patients who required
revascularization were correctly diagnosed by CT.
Although excellent accuracy for stenosis detection was
noted, technical restrictions for exact quantification of the
degree of stenosis and reliable visualization of small vessel
segments remain [12]. In an accompanying editorial the
authors express the expectation that MSCT will be used in
the near future on a routine basis for the identification of
patients who do not need revascularization therapy despite
the presence of symptoms [14].
The potential value of MSCT for stenosis quantification
is currently under active investigation. Recently, a good
correlation between MSCT and quantitative coronary
X-ray angiography was shown for stenosis quantification
with the use of 16-slice technology, although MSCT
revealed a systematic overestimation as compared with the
reference standard [15]. Perfusion defects related to
previous myocardial infarction or ischaemia may be well
visualized with the use of MSCT (Figure 5).
In CT coronary angiography, beta-blockers may be used
to reduce the heart rate to a lower range, e.g. to 50–
60 BPM to increase the cardiac rest period and with this to
reduce motion artefacts. The resulting imaging perfor-
mance is more predictable and of more consistent quality
when using such medication. Special reconstruction
algorithms for the reconstruction pose an alternative to
the use of medication. The segmented reconstruction
algorithm yields good temporal resolution even at higher
heart rates. Also, when total scanning time is short, e.g.
below 10 s, the quality of the scan improves since, due to
the reduction of the total amount of heart beats in the
scan, less variation can be expected in the heart rate
during the acquisition. Hyperventilation and administra-
tion of oxygen may be used to stabilize heart rate
particularly at scan times of approximately 20 s scanning
time or longer.
Figure 2. Coronary artery calcification imaging at 64-row
multidetector CT (MDCT). 64-row MDCT of a 52-year-old
male patient with risk factors for coronary artery disease.
Small calcifications in the left anterior descending artery. The
total calcium score according to Agatston was 21, and the
total volumetric score was 25, indicating mild atherosclerotic
plaque with mild or minimal coronary artery narrowings likely.
CT-angiography revealed no coronary artery stenoses.
Figure 3. Normal coronary artery anatomy at 64-row multidetector CT (MDCT). 64-row MDCT of a 62-year-old male patient with
risk factors for coronary artery stenosis. No stenoses were found at MDCT coronary angiography. Left anterior (a) oblique view
and (b) caudal view. LAD, left anterior descending coronary artery; D, diagonal branch of the LAD; IM, intermediate coronary
artery branch; Cx, circumflex coronary artery; MO, obtuse marginal branch (of the Cx); DP, descending posterior branch (of the
right coronary artery).
Figure 4. Bypass imaging at 64-row multidetector CT (MDCT). 64-row MDCT of a 78-year-old male patient after coronary artery
bypass graft operation (CABG). Occlusion of multiple venous bypass grafts (nr 1 in a). Left internal mammary artery bypass graft
(nr 2 in a,b) with open anastomosis (nr 3 in a,b,c) on the left anterior descending coronary artery (nr 4 in a,c). Poor quality native
coronary artery system with multiple stenoses and poor contrast enhancement (nr 4 in a,c). b and c are displayed in two perpen-
dicular longitudinal directions.
Assessment of ventricular function
With retrospective gated 180˚ segmented sinogram space
reconstruction the data can be reconstructed for evalua-
tion of ventricular function [16]. Diastolic and systolic
images can easily be extracted and reconstructed in any
orientation for functional evaluation (Figure 6).
Global ventricular function is generally measured as the
end-systolic and end-diastolic volume (ESV, EDV).
Subsequently, stroke volume (SV) and ejection fraction
(EF) can easily be derived from ESV and EDV.
Semiautomatic software may be used for ventricular
cavity contour detection and for the calculation of
global ventricular function. Regional LV wall motion
can be assessed by visual scoring of cinematic loops of well
described myocardial segments [17].
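The derived quantities are simple arithmetic on the two measured volumes; a minimal sketch with illustrative numbers (the function name is the editor's):

```python
def ventricular_function(edv_ml, esv_ml):
    """Stroke volume (ml) and ejection fraction (%) from the
    end-diastolic and end-systolic volumes."""
    sv = edv_ml - esv_ml
    ef_percent = 100.0 * sv / edv_ml
    return sv, ef_percent

sv, ef = ventricular_function(120.0, 50.0)  # SV = 70 ml, EF ~ 58.3%
```

The semiautomatic contour detection mentioned above supplies EDV and ESV; everything downstream of that is this two-line calculation.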
Integrated CT assessment of the coronary arteries and
regional myocardial function allows assessment of the
functional consequences of a coronary artery stenosis
leading to ischaemia and contraction abnormalities. The
usefulness of this combined approach has been reported in
patients with hypertension and diabetes mellitus [18, 19].
From the same data set global function and left ventricular
mass can also be determined, which have clinical relevance
in patients with hypertension for prognosis and guidance
of therapy.
Several studies have shown that right ventricular
function can also be accurately measured by gated
MSCT. The assessment of right ventricular function may
have special interest in patients with acute pulmonary
embolism. Right ventricular enlargement on chest CT has
been shown to be a predictor of early death in patients
with acute pulmonary embolism [20, 21]. Even the
dimensions of the right ventricle in non-gated CT
images may be predictive for mortality in this setting.
The potential value of gated MSCT for assessing right
ventricular function in patients with pulmonary embolism
is now under investigation.
Assessment of pulmonary veins
Atrial arrhythmias often originate in the pulmonary
veins and can be treated with percutaneous radiofrequency
catheter ablation. With this technique, the arrhythmic foci
are electrically disconnected from the left atrium by means
of catheters placed in the left atrium [22]. Pre-procedural
MSCT examination is helpful to depict the anatomy of the
pulmonary veins and left atrium and particularly to
demonstrate additional pulmonary veins (e.g. middle
lobe vein), which is important for planning the interven-
tional procedure. Variations in pulmonary venous anat-
omy are quite common and comprise variation in the
number of veins as well as the occurrence of common ostia
Figure 5. Multiple perfusion defects imaged with 64-row multi-
detector CT (MDCT). Same patient (78-year-old male) as in
Figure 4 after coronary artery bypass graft operation and mul-
tiple venous bypass graft occlusions. Multiple perfusion defects
with regional wall thinning.
Figure 6. Ventricular function imaging at 64-row multidetector CT (MDCT). 26-year-old male patient after surgery for congenital
heart disease. Ventricular function can be assessed after drawing the endocardial ventricular contours in (a) end-diastolic and (b) end-
systolic phases at multiple cardiac levels, thereby including the ventricular volumes.
and early branching [23]. Three-dimensional surface
rendering reconstructions provide a quick overview of
the pulmonary venous anatomy, but cross-sectional
reconstruction in coronal, sagittal and transverse orienta-
tions is necessary for full appreciation of the morphology
of the pulmonary veins (Figure 7) [24]. Post-procedural
MSCT also offers an opportunity for follow-up of the
pulmonary vein after ablation [25].
MSCT pulmonary venography requires a contrast
enhanced helical acquisition. To avoid motion artefacts
a half reconstruction is generally performed, yielding a
reconstruction window of about 165–200 ms. This is
sufficiently short for imaging the rather large pulmonary
veins with diameters well above 10 mm. Reliable images
can be acquired without the use of ECG gating. Breath-
hold acquisitions with a high pitch factor and resulting
rather low patient dose are routinely obtained. The
potential additional value of ECG synchronized MSCT
is under investigation.
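The half-reconstruction window quoted above can be sketched numerically. The temporal window of a half reconstruction is roughly half the gantry rotation time (slightly more once the fan angle is included, which is ignored here); the rotation times of 330–400 ms below are assumed values typical of scanners of this era, not figures from this paper.

```python
# Sketch: temporal window of a half-scan ("half") reconstruction.
# Assumed inputs: single-source MSCT gantry rotation times of
# 330-400 ms; the fan-angle contribution is neglected.

def half_scan_window_ms(rotation_time_ms: float) -> float:
    """A half reconstruction needs ~180 degrees of projection data,
    so its temporal window is about half the rotation time."""
    return rotation_time_ms / 2.0

for t_rot in (330, 370, 400):
    print(f"rotation {t_rot} ms -> window ~{half_scan_window_ms(t_rot):.0f} ms")
```

With these assumed rotation times the window comes out at 165–200 ms, matching the range quoted in the text.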
Conclusion
MSCT is a highly accurate tool for the non-invasive
detection of coronary artery disease. Further technical
advances are expected in acquisition techniques as well as
post-processing of the CT data. Detector technology and
arrays may be further expanded, allowing shorter imaging
times. Improved temporal and spatial resolution will
contribute to better stenosis quantification and plaque
characterization. Integration of coronary artery imaging
and functional data is feasible with current MSCT.
Shorter scanning times may allow integration of coronary
imaging, first-pass perfusion imaging as well as wall
motion analysis from the same data set. Other cardiovas-
cular applications also benefit from the improvements in
CT technology. Recently, the value of MSCT for the
evaluation of patients with chest pain presenting to the
emergency department was reported [26]. It was shown that MSCT can comprehensively evaluate patients with chest pain: within a single comprehensive MSCT protocol, both cardiac and non-cardiac causes of chest pain can be diagnosed accurately. MSCT is therefore expected to become a gatekeeper for patients presenting with chest pain from various sources.
References
1. Schuijf JD, Bax JJ, Salm LP, Jukema JW, Lamb HJ, van der
Wall EE, et al. Noninvasive coronary imaging and assessment
of left ventricular function using 16-slice computed tomo-
graphy. Am J Cardiol 2005;95:571–4.
2. Paul JF, Wartski M, Caussin C, Sigal-Cinqualbre A, Lancelin
B, Angel C, et al. Late defect on delayed contrast-enhanced
multi-detector row CT scans in the prediction of SPECT
infarct size after reperfused acute myocardial infarction:
initial experience. Radiology 2005;236:485–9.
3. Dodge JT Jr, Brown BG, Bolson EL, Dodge HT. Lumen
diameter of normal human coronary arteries. Influence of
age, sex, anatomic variation, and left ventricular hypertrophy
or dilation. Circulation 1992;86:232–46.
4. Shechter G, Resar JR, McVeigh ER. Rest period duration of
the coronary arteries: implications for magnetic resonance
coronary angiography. Med Phys 2005;32:255–62.
5. Hofman MB, Wickline SA, Lorenz CH. Quantification of
in-plane motion of the coronary arteries during the cardiac
cycle: implications for acquisition window duration for MR
flow quantification. J Magn Reson Imaging 1998;8:568–76.
6. Lu B, Mao SS, Zhuang N, Bakhsheshi H, Yamamoto H,
Takasu J, et al. Coronary artery motion during the cardiac
cycle and optimal ECG triggering for coronary artery
imaging. Invest Radiol 2001;36:250–6.
7. Harell GS, Guthaner DF, Breiman RS, Morehouse CC, Seppi
EJ, Marshall WH Jr, et al. Stop-action cardiac computed
tomography. Radiology 1977;123:515–7.
8. Girshman J, Wolff SD. Techniques for quantifying
coronary artery calcification. Semin Ultrasound CT MR
2003;24:33–8.
9. Agatston AS, Janowitz WR, Hildner FJ, Zusmer NR,
Viamonte M Jr, Detrano R. Quantification of coronary
artery calcium using ultrafast computed tomography. J Am
Coll Cardiol 1990;15:827–32.
10. Ulzheimer S, Kalender WA. Assessment of calcium scoring
performance in cardiac computed tomography. Eur Radiol
2003;13:484–97.
11. Thompson GR, Partridge J. Coronary calcification score: the
coronary-risk impact factor. Lancet 2004;363:557–9.
12. Leber AW, Knez A, von Ziegler F, Becker A, Nikolaou K,
Paul S, et al. Quantification of obstructive and nonobstructive
coronary lesions by 64-slice computed tomography: a
comparative study with quantitative coronary angiography
and intravascular ultrasound. J Am Coll Cardiol
2005;46:147–54.
13. Leschka S, Alkadhi H, Plass A, Desbiolles L, Grunenfelder J,
Marincek B, et al. Accuracy of MSCT coronary angiography
with 64-slice technology: first experience. Eur Heart J
2005;26:1482–7.
14. Achenbach S, Daniel WG. Computed tomography of the
coronary arteries: more than meets the (angiographic) eye. J
Am Coll Cardiol 2005;46:155–7.
15. Cury RC, Pomerantsev EV, Ferencik M. Comparison of the
degree of coronary stenoses by multidetector computed
tomography versus by quantitative coronary angiography.
Am J Cardiol (In press).
Figure 7. Pulmonary vein imaging at 64-row multidetector CT
(MDCT). 64-row MDCT, non-ECG-synchronized imaging. 59-
year-old male patient. Pre-interventional assessment of pulmon-
ary veins for radiofrequency ablation. Posterior view of the
patient’s heart. Common ostium for the left pulmonary veins,
i.e. the pulmonary veins join before entering the left atrium.
Separate ostia for the right pulmonary veins. LS, left superior
pulmonary vein; LI, left inferior pulmonary vein; RS, right
superior pulmonary vein; RI, right inferior pulmonary vein;
LA, left atrium; LPA, left pulmonary artery; RPA, right pul-
monary artery; VC, inferior vena cava.
16. Dirksen MS, Bax JJ, de Roos A, Jukema JW, van der Geest
RJ, Geleijns K, et al. Usefulness of dynamic multislice
computed tomography of left ventricular function in unstable
angina pectoris and comparison with echocardiography. Am J
Cardiol 2002;90:1157–60.
17. Cerqueira MD, Weissman NJ, Dilsizian V, Jacobs AK, Kaul S,
Laskey WK, et al. Standardized myocardial segmentation and
nomenclature for tomographic imaging of the heart: a statement
for healthcare professionals from the Cardiac Imaging
Committee of the Council on Clinical Cardiology of the
American Heart Association. Circulation 2002;105:539–42.
18. Schuijf JD, Bax JJ, Jukema JW, Lamb HJ, Vliegen HW, van
der Wall EE, et al. Noninvasive evaluation of the coronary
arteries with multislice computed tomography in hypertensive
patients. Hypertension 2005;45:227–32.
19. Schuijf JD, Bax JJ, Jukema JW, Lamb HJ, Vliegen HW, Salm
LP, et al. Noninvasive angiography and assessment of left
ventricular function using multislice computed tomography in
patients with type 2 diabetes. Diabetes Care 2004;27:2905–10.
20. Schoepf UJ, Kucher N, Kipfmueller F, Quiroz R, Costello P,
Goldhaber SZ. Right ventricular enlargement on chest
computed tomography: a predictor of early death in acute
pulmonary embolism. Circulation 2004;110:3276–80.
21. van der Meer RW, Pattynama PM, van Strijen MJ, van den
Berg-Huijsmans AA, Hartmann IJ, Putter H, et al. Right
ventricular dysfunction and pulmonary obstruction index at
helical CT: prediction of clinical outcome during 3-month
follow-up in patients with acute pulmonary embolism.
Radiology 2005;235:798–803.
22. Pappone C, Rosanio S, Oreto G, Tocchi M, Gugliotta F,
Vicedomini G, et al. Circumferential radiofrequency ablation
of pulmonary vein ostia: A new anatomic approach for curing
atrial fibrillation. Circulation 2000;102:2619–28.
23. Ghaye B, Szapiro D, Dacher JN, Rodriguez LM,
Timmermans C, Devillers D, et al. Percutaneous ablation
for atrial fibrillation: the role of cross-sectional imaging.
Radiographics 2003;23 Spec No:S19–S33.
24. Jongbloed MR, Dirksen MS, Bax JJ, Geleijns K, Lamb HJ,
Van der Wall EE, et al. Multislice computed tomography to
evaluate pulmonary vein anatomy prior to radiofrequency
catheter ablation of atrial fibrillation. Radiology 2005.
25. Maksimovic R, Cademartiri F, Scholten M, Jordaens LJ,
Pattynama PM. Sixteen-row multislice computed tomography
in the assessment of pulmonary veins prior to ablative
treatment: validation vs conventional pulmonary venography
and study of reproducibility. Eur Radiol 2004;14:369–74.
26. White CS, Kuo D, Kelemen M, Jain V, Musk A, Zaidi E, et
al. Chest pain evaluation in the emergency department: can
MDCT provide a comprehensive evaluation? AJR Am J
Roentgenol 2005;185:533–40.
President’s conference paper
Technology solutions for better outcomes: integrated
information management is key to productivity increases in
medicine
H REQUARDT, PhD
Group Executive Management, Siemens Medical Solutions, Henkestrasse 127, 91052 Erlangen, Germany
Abstract. The challenges to healthcare systems around the world are driven primarily by two factors: demographic change and progress in medicine. An ageing population inherently needs more medical services, which add financial burdens, in particular to public healthcare. Since the level of medical education is rising at the same time, we are observing an increased demand for sophisticated (and generally expensive) medicine. Drastic changes in financing seem unavoidable. Multiple diagnoses, repeated examinations, trial-and-error, overcapacities and other signs of a lack of economic discipline are reinforced by reimbursement systems.
In a world where, in principle, all information is available everywhere, more than a patient’s history should be
accessible. Other industries have knowledge management systems in place that make state-of-the-art expertise
available everywhere. Intelligent patient databases could consist of learning cycles that (i) enable the individual
to benefit from structured knowledge, in addition to personal experience of the physician, and (ii) use the
knowledge generated from the individual to extend the database. The novel area of molecular medicine fits
perfectly well into these scenarios. Only attached to an IT backbone can the flood of information be managed in
a beneficial way. Efficiency improvements in healthcare address the needs of all parties in the system: patients,
providers, and payers. The opportunities, however, can only materialize if everyone is prepared to change. IT
will set the standards for the biggest challenge in healthcare: The paradigm shift in medicine.
Introduction
Demographic developments are placing tremendous
pressure on healthcare systems around the world.
Although age distribution varies significantly in different
countries (e.g. China’s one-child policy versus India’s fir-
tree distribution), problems come down to one common
denominator: We are all living longer.
Figure 1 [1, 2] shows the age distribution in more
developed regions and the prognoses for 2025. It is
obvious that health is a major macroeconomic factor. If
we want to avoid the situation that fewer and fewer payers
have to support more and more users of healthcare
services, we will need to see more elderly people working.
The prerequisite for this development is that they stay
healthy. Healthcare systems thus would need to prove that
the investment in them pays off as a productivity factor.
A related challenge is reflected in the fact that a growing
population is increasingly demanding to participate actively in medical progress. Mass media and the Internet depict what is possible today, with the majority of research being funded by the public purse. It is thus natural that the same paying public wants to enjoy the benefits that are generated.
The basic question is: How can all of this remain
affordable? Cutting cost by cutting services is not helpful
for addressing both the need for higher quality care and
the necessity to reduce cost. Instead, all contributors to the
delivery of healthcare need to ask themselves ‘‘How can we
do more with less?’’ If we draw an analogy with industry,
this question translates to ‘‘What levers do we see to
improve efficiency?’’
Innovations drive efficiency
Medical industries are developing not only more cost
effective and reliable systems, but are also generating more
and more relevant patient information in less examination
time.
Figure 2 shows a standard way of looking at CT
datasets. The approximately 2 GB of raw image data that
are typically acquired in a 5 s scan are stored in cache
memories, post-processed with volume renderers and displayed in the interpreter's preferred view.
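The data volumes quoted here can be illustrated with a quick back-of-the-envelope calculation. The 2 GB raw dataset and 5 s scan time are the paper's figures; the per-image arithmetic (512 × 512 pixels at 2 bytes each, 800 slices) is an illustrative assumption.

```python
# Sketch of the data volumes quoted in the text.
raw_bytes = 2 * 1024**3          # ~2 GB of raw data (paper's figure)
scan_time_s = 5.0                # 5 s scan (paper's figure)
print(f"raw data rate ~{raw_bytes / scan_time_s / 1e6:.0f} MB/s")

n_images = 800                   # assumed slice count of a large study
image_bytes = 512 * 512 * 2      # assumed 16-bit 512x512 slices
print(f"reconstructed volume ~{n_images * image_bytes / 1e6:.0f} MB")
```

Even the reconstructed volume runs to several hundred megabytes, which is why caching and dedicated volume renderers are needed for interactive display.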
A different example is given in Figure 3: Not only has
the amount of data dramatically increased, but so has the
quality. In this case, a high-resolution three-dimensional
(3D) image of the moving heart displays the stent
structures with superb resolution.
The broadening of the application scope is typical for
each of the imaging modalities: Angio suites do excellent
3D imaging with cone beam reconstruction algorithms,
linear accelerators deliver kV and MV images, magnetic
resonance scanners have left the domain of pure
morphologic imaging, and now measure functions in
various ways. As an example, Figure 4 shows colour
coded diffusion spectral imaging that is highly correlated
with the directions of nerve bundles.
The international medical industry has developed many
technologies that can be utilized to improve efficiency in
Received 16 August 2005 and accepted 16 September 2005.
The British Journal of Radiology, 79 (2006), 17–23 © 2006 The British Institute of Radiology
DOI: 10.1259/bjr/23726774
diagnostic and therapeutic processes. Figure 5 shows in a
schematic diagram how these developments can be locked
into the learning cycles of healthcare providers. The
potential for cost savings without sacrificing quality of
care is clear. It is, however, evident that leveraging this
potential is not only a matter of technology; reimburse-
ment systems and workflow structures have to be adjusted
accordingly.
Is more always better?
The basic question ‘‘Do I get enough information about
my patient?’’ is no longer appropriate from a technology
perspective. The medical industry has established time-to-
market cycles that can rapidly turn a novel clinical
parameter into a product standard. Only 6 weeks elapsed
between the identification of the SARS virus and the
availability of a clinical test. The problem is no longer the
Figure 1. The change of age distribution in more developed regions. The qualitative cost curve reflects the current status. If nothing changes, the real overall cost will be the integral of the age distribution multiplied by the cost curve.
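The cost relation in the caption can be sketched numerically: total cost as the integral over age of (population at age) × (per-capita cost at age). The age bins, population figures and cost values below are purely illustrative assumptions, not data from the figure.

```python
# Sketch of the caption's cost relation: integrate population x cost
# over age with a simple trapezoidal rule. All values are assumed.

def total_cost(population, cost_per_capita, ages):
    """Trapezoidal integration of population(age) * cost(age)."""
    total = 0.0
    for i in range(len(ages) - 1):
        f0 = population[i] * cost_per_capita[i]
        f1 = population[i + 1] * cost_per_capita[i + 1]
        total += 0.5 * (f0 + f1) * (ages[i + 1] - ages[i])
    return total

ages = [0, 20, 40, 60, 80]            # years
pop  = [1.0, 1.1, 1.0, 0.8, 0.4]      # millions per age bin (assumed)
cost = [1.0, 0.8, 1.2, 2.5, 4.0]      # cost units per capita (assumed)
print(f"total cost ~{total_cost(pop, cost, ages):.1f} units")
```

Shifting the population curve towards older ages, where the cost curve is steep, is what drives the overall integral upwards.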
Figure 2. Volume-rendered abdominal CT image. The underly-
ing image dataset consists of approximately 800 images.
Figure 3. High resolution CT image of the heart. The stent
structure is clearly delineated.
lack of data; the problem lies in filtering out the relevant
information.
There are various technological solutions for filtering. A
widely practiced method uses overlay of images with different
measurement parameters. Figure 6 shows an example in
which a positron emission tomography (PET) image shows us
where to focus in a set of hundreds of CT images. The overlaid
images help us to select the slices of interest.
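The overlay idea can be sketched as a per-pixel blend of a PET uptake map onto a CT slice, so that high-uptake voxels stand out. The snippet below is a minimal sketch on nested lists; real fused displays use colour maps and registered 3D volumes, and all pixel values here are assumed.

```python
# Minimal sketch of PET/CT overlay: linear per-pixel blending of
# normalized CT grey values with a (hypothetical) PET uptake map.

def blend(ct_slice, pet_slice, alpha=0.4):
    """Per-pixel linear blend of normalized CT and PET values."""
    return [[(1 - alpha) * c + alpha * p
             for c, p in zip(ct_row, pet_row)]
            for ct_row, pet_row in zip(ct_slice, pet_slice)]

ct  = [[0.2, 0.3], [0.4, 0.5]]   # normalized CT grey values (assumed)
pet = [[0.0, 0.9], [0.0, 0.0]]   # focal uptake in one pixel (assumed)
fused = blend(ct, pet)
print(fused)  # the high-uptake pixel now stands out from its neighbours
```

The blended image directs the reader's eye to the slice and region of interest without discarding the CT anatomy.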
A totally different approach with similar outcome
is represented by ‘‘computer-aided detection’’ (CAD)
algorithms. Figure 7 shows CAD-detected polyps within
a virtual colonoscopy dataset acquired with CT. These
algorithms have now reached a performance level that is
comparable with human readers. CAD is, however, still applicable only to simple structures, but it can help us focus our attention on the more complex features.
Progression of CAD into more complex structures will
be subject to the availability of standardized reference
cases.
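The performance figures usually quoted for CAD systems (per-lesion sensitivity and false positives per volume) can be computed as sketched below. The detections, ground truth and resulting numbers are illustrative assumptions, not the values reported in this paper.

```python
# Sketch of per-polyp CAD evaluation: sensitivity and false
# positives per volume, with assumed detections and ground truth.

def evaluate(detections, ground_truth, volumes):
    """detections/ground_truth: sets of (volume_id, lesion_id);
    a detection outside the ground truth is a false positive."""
    tp = detections & ground_truth
    fp = detections - ground_truth
    sensitivity = len(tp) / len(ground_truth)
    fp_per_volume = len(fp) / volumes
    return sensitivity, fp_per_volume

truth = {("v1", "p1"), ("v1", "p2"), ("v2", "p1"), ("v2", "p2"), ("v3", "p1")}
found = {("v1", "p1"), ("v1", "p2"), ("v2", "p1"), ("v2", "p2"),
         ("v1", "x1"), ("v2", "x1"), ("v3", "x1")}  # 3 false marks
sens, fp = evaluate(found, truth, volumes=3)
print(f"sensitivity {sens:.0%}, {fp:.1f} false positives per volume")
```

Standardized reference cases matter precisely because both numbers depend entirely on the agreed ground truth.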
It follows that innovation pressure for medical
devices will in future not only focus on the generation of
more data, but more and more on the generation of
Figure 4. Diffusion spectral MR image. Colours code for
spatial directions.
Figure 5. Continuous improvement cycles stimulated by technology (examples).
Figure 6. PET images overlaid on a volume-rendered CT dataset. The primary breast cancer is clearly delineated. The metastasis search is done within the same dataset.
‘‘smarter data’’. Yes, there will be CT scanners that acquire 256 slices. But at the same time there will be an industry focus
on systems with two or three X-ray detector systems that
can generate not only increased temporal resolution, but
also open up new degrees of freedom with respect to
contrast by applying different anode voltages in the sub-
systems. Figure 8 shows a basic set-up for such a
system.
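The new contrast degree of freedom from two anode voltages can be sketched as a two-material decomposition: with attenuation measured at two effective energies, each voxel is resolved into two basis materials by solving a 2 × 2 linear system. All coefficients below are assumed for illustration, not measured values.

```python
# Sketch of dual-energy two-material decomposition. A voxel's
# attenuation at two tube voltages is modelled as a linear mix of
# two basis materials; solving the 2x2 system gives the amounts.
# Basis attenuations and measurements are assumed values.

def decompose(mu_low, mu_high, basis):
    """basis = ((m1_low, m2_low), (m1_high, m2_high)): attenuation
    of the two basis materials at the low/high tube voltages."""
    (a, b), (c, d) = basis
    det = a * d - b * c
    x1 = (mu_low * d - b * mu_high) / det   # amount of material 1
    x2 = (a * mu_high - mu_low * c) / det   # amount of material 2
    return x1, x2

# assumed basis attenuations (e.g. water- and iodine-like, arbitrary units)
basis = ((1.0, 30.0), (0.9, 15.0))
x_water, x_iodine = decompose(mu_low=1.6, mu_high=1.2, basis=basis)
print(f"water-like {x_water:.2f}, iodine-like {x_iodine:.3f}")
```

A single-energy scan provides only one equation per voxel; the second tube voltage supplies the second equation that makes the decomposition solvable.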
Overall, the focus of industry will move from ‘‘genera-
tion of data’’ towards ‘‘exploitation of data’’. It is evident
that information technology is a key enabler for that shift.
IT enables process optimization
In a patient-centric system, the ultimate outcome of the
treatment is reflected by the status of the patient. The
typical patient process in a hospital usually starts with
diagnostic steps (radiology, ECG, lab, …), iterates with
various therapeutic procedures (medication, surgery,
radiation, …), and terminates with the recovery of the
patient (ICU, ward, rehab, …). The most competitive
healthcare provider will be the one that optimizes the
entire procedure chain rather than the individual steps
(this does not relieve the individual departments from
delivering the best quality; ‘‘best’’ according to cost
optimization criteria means ‘‘adequate and intelligent’’).
In industrial analogy this means analysis, mapping and
continuous improvement of workflow.
Workflow optimization comprises the moving of
patients, resources and information within the healthcare
continuum according to certain rules. Everything (includ-
ing the rules) is subject to best practice shared across all
relevant healthcare participants throughout the world.
Workflow can be referenced in ‘‘hospital information
systems’’ by so-called workflow engines. An example of
what a workflow engine can contribute is given in
Figure 9: The emergency treatment of an acute stroke
patient is managed by a computer network. The state-of-
the-art workflow engine would not only draft a work list,
it would also monitor all activities in feedback loops.
Cross-checks with rules engines ensure that the patient
experiences state-of-the-art stroke treatment procedures.
Figure 9 gives an impression how a workflow engine can
be programmed according to the local conditions. It is
obvious that workflow engines not only synchronize
Figure 7. Computer-aided detection (CAD) algorithms detect polyps in a virtual colonoscopy. The sensitivity for polyps ≥ 6 mm is on average 90%, and the median false positive rate is a manageable 3 per volume [3].
Figure 8. Multitube CT set-up. The system enables a new degree of freedom, allowing doubled temporal resolution and/or novel contrast opportunities.
Figure 9. Workflow engine editor. The various decision steps reflect the time-critical diagnosis and treatment of an acute stroke. The
time window for initiating thrombolysis is computer controlled.
Figure 10. Steps for cancer development. Today’s procedures detect cancer at a very late stage associated with high treatment cost
and reduced prognosis. Early detection schemes lead to cellular and molecular levels; one of the exciting novel areas of ‘‘molecular
medicine’’.
clinical activities, but also other day-to-day operations,
e.g. discharge (paper work needs to be ready, transporta-
tion needs to be arranged, room needs to be made up, bed
needs to be cleaned, etc.).
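The workflow-engine idea of monitored steps cross-checked against rules can be sketched minimally as an ordered work list with a rules check before each step is released. The step names and the 4.5 h thrombolysis window are illustrative assumptions, not the engine shown in Figure 9.

```python
# Minimal sketch of a workflow engine with a rules-engine
# cross-check: an ordered stroke work list where time-critical
# steps are validated before release. Steps and the time window
# are assumed, illustrative values.

STROKE_PATHWAY = ["admission", "CT", "lab", "decision", "thrombolysis"]
LYSIS_WINDOW_MIN = 270  # assumed 4.5 h onset-to-needle limit

def check_step(step, minutes_since_onset):
    """Rules-engine cross-check before a step is released."""
    if step == "thrombolysis" and minutes_since_onset > LYSIS_WINDOW_MIN:
        return False, "outside time window - escalate to physician"
    return True, "release step"

for step in STROKE_PATHWAY:
    ok, msg = check_step(step, minutes_since_onset=200)
    print(f"{step}: {msg}")
```

A production engine would additionally log each step, drive work lists, and close the feedback loop described in the text; the sketch only shows the rule-check at the core of it.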
Workflow engines will not only change the way care is
delivered, but will also define the requirements for newly
developed systems. Requirements and job descriptions in
both arenas, industry and healthcare services, will be
affected.
The patient is an individual
The process chain within healthcare environments
(prevention → diagnosis → therapy → care) is obviously
not limited to hospitals. If we look at a schematic
development of cancer in Figure 10, we realise that with
today’s diagnostic methods we detect cancer only at a very
late stage with higher cost and lower quality of life.
Patient-focused healthcare systems will bring the interven-
tion point forward to an earlier stage of the disease. With
early detection and prevention capabilities, healthcare will
increasingly be looked at just like every other service
industry. The patient will behave like any other customer,
but still with one fundamental difference: he/she is not free to choose the disease.
To shift the intervention point in an efficient way, much
basic research remains to be done: The complexity of the
‘‘omics’’ (genomics, proteomics, metabolomics) needs to be
understood and standardized with respect to the develop-
ment of individual diseases. The potential, however, is big
and every single day new cancer genes are being discovered
or proteins identified that originate in specific tissue
alterations. The diagnostic industry is asked not only to
deliver blood sample tests, but also software modules that
make the associated knowledge available.
The individualization, however, is not only subject to
the diagnosis of the individual patient. It also needs to give
clear recommendations for an optimized treatment. The
entire arena of pharmacogenomics will be closely asso-
ciated with ‘‘omics’’ analysis. Also, specific tumour
metabolisms can be clearly understood and thus indivi-
dually treated. It becomes evident that in scenarios like
these, the diagnostic process moves from primary diagnosis to optimized treatment planning and follow-up.
The holistic scenario
The topics discussed so far lead to a few characteristics
of future healthcare systems:
(1) they will be patient-focused and workflow-driven;
(2) the patient’s entire history will be accessible through
an electronic patient record (EPR);
(3) the providers will be in a competitive situation and
thus will publish proven outcome statistics to differ-
entiate themselves;
(4) the capability of sharing best practices with best-in-
class providers will be a differentiating factor.
The patient of the future will no longer rely just on the
individual experience of his physician, but on the entire
medical knowledge that is available. Obviously, the
individual experience becomes part of that knowledge,
but there are also other contributors. Figure 11 shows a
scenario of how the individual patient information can be
matched with the available knowledge. The individual
Figure 11. Process chart of future treatment planning. Data access for both the patient’s individual electronic patient record and a
comprehensive knowledge data base are crucial to enable state-of-the-art medical treatment for everyone, everywhere.
treatment plan for the patient is determined mainly by two
elements: (1) the clinical knowledge database with rules for
utilization of equipment and drugs, contraindications,
standardizations, procedures and others; (2) the EPR
consisting of images, lab data, structured reports, ‘‘omics’’,
etc.
Those databases will be mined by software agents for
reference cases with proven outcome data to derive the
most promising treatment plans. This enables the primary
care physician (PCP) to match his individual experience
with all the information that is available in the data stores.
The databases will not only be filled with expert knowl-
edge from medicine, but will also include related
disciplines like pharmacology, radiation biology, biome-
chanics and others. In short, the PCP has a real, powerful
tool that leaves him with a high degree of confidence that
he has done all he can to help the patient.
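The "software agent" mining described here can be sketched as ranking stored reference cases by similarity to the current patient's record and surfacing the plan with proven outcome data. The features, reference cases and outcome values below are all illustrative assumptions.

```python
# Sketch of reference-case mining: rank stored cases by overlap
# with the patient's findings (Jaccard similarity) and surface the
# best-matching case's treatment plan. All records are assumed.

def similarity(patient, case):
    """Fraction of shared findings between two records (sets)."""
    return len(patient & case) / len(patient | case)

patient = {"tumour:breast", "stage:II", "her2:pos"}
reference_cases = [
    ({"tumour:breast", "stage:II", "her2:pos"}, "plan A", 0.86),
    ({"tumour:breast", "stage:III", "her2:neg"}, "plan B", 0.61),
    ({"tumour:lung", "stage:II"}, "plan C", 0.40),
]
ranked = sorted(reference_cases,
                key=lambda c: similarity(patient, c[0]),
                reverse=True)
best_findings, best_plan, outcome = ranked[0]
print(f"closest reference case suggests {best_plan} (outcome {outcome})")
```

The ranking itself is trivial; the hard part, as the text notes, is populating the knowledge database with structured, outcome-annotated cases.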
It will certainly be a long way to reach this scenario, but
at the same time it is worth defining and working towards
a common vision. Enabling technologies are there to help
make this vision reality. Many new problems will come up
including topics like data protection, ethics, business
models or simply operational realization, and a social
consensus will be required to address them all.
Medicine will never become deductive, but managing its
complexity will become easier. Although basic work
remains to be done, the technological solutions are
available today. It is now a question of political desire
to launch the paradigm shift in medicine.
References
1. Population Division of the Department of Economic and
Social Affairs of the United Nations Secretariat. World
Population Prospects: The 2004 Revision Population
Database. [Online]. 2005 [cited 2005 March 15]. Available
from: URL: http://esa.un.org/unpp/
2. Economic Policy Committee (EPC). Budgetary challenges
posed by ageing populations: the impact on public spending
on pensions, health and long-term care for the elderly and
possible indicators of the long-term sustainability of public
finances. Brussels. 2001 October 24 (EPC/ECFIN/655/01-EN
final). p. 34.
3. Bogoni L, Cathier P, Dundar M, Jerebko A, Lakare S, Liang
J, et al. Computer-aided detection (CAD) for CT colonogra-
phy: a tool to address a growing need. Br J Radiol 2005;78:57–
62.
President’s conference paper
The case for particle therapy
B JONES, MD, FRCR, MedFIPEM
Queen Elizabeth University Hospital, Birmingham B15 2TH, UK
Abstract. Among the most important decisions facing the British Government regarding the treatment of cancer
in the National Health Service (NHS) is the purchase of charged particle therapy (CPT) centres. CPT is
different from conventional radiotherapy: the dose is deposited far more selectively in Bragg Peaks by either
protons or ‘‘heavy’’ ions, such as carbon. In this way, it is possible to ‘‘dose paint’’ targets, voxel by voxel, with
far less dose to surrounding tissues than with X-ray techniques. At present the UK possesses a 62 MeV
cyclotron proton facility at Clatterbridge (Wirral), which provides therapy for intraocular cancers such as
melanoma; for deeper-seated cancers in the pelvis, chest, etc., much higher energies (over 200 MeV) are required
from a synchrotron facility. There is an impressive expansion in particle beam therapy (PBT) centres worldwide,
since they offer good prospects of improved quality of life with enhanced cancer cures in situations where
conventional therapy is limited by radioresistance or by the close proximity of critical normal tissues. There
is a threat to UK Oncology, since it is anticipated that several thousand British patients may require referral
abroad for therapy; this would severely disrupt their multidisciplinary management and require demanding
logistical support.
The benefits of an increase in charged particle therapy
(CPT) centres in the UK would be not only for children
and young adults with cancer, where a reduced risk of
radiation induced malignancy is predicted, but also in
older patients where it is necessary to avoid irradiating abnormal tissues (for example an enlarged heart) or to restrict lung irradiation, and where artificial (metallic) joints may cause difficulties
in the use of conventional radiotherapy techniques. The
results of phase I and II clinical studies are extremely
encouraging. The UK must obtain at least one CPT centre
with protons/ions in order to conduct research and
development; it is suggested that quality adjusted life
years should be used to assess outcomes. It is anticipated
that the UK might eventually require 7–8 such centres in
10–15 years from now. In the meantime, healthcare
purchasers and providers need to put in place mechanisms
and personnel for patient referrals abroad, as well as the
establishment of UK CPT facilities.
Background
The connection between subatomic particles and health
delivery improvements may seem rather tenuous, but the
narrative begins in 1897, when J J Thomson discovered
the negatively charged electron in Cambridge, and
Aneurin Bevan was born in Wales. The subsequent
discoveries of the positively charged proton (a term
coined by Ernest Rutherford in 1920) and the uncharged
neutron by James Chadwick in 1932, confirmed the pre-eminence of our science. Bevan, with similar precision of
thought, digested the wide recommendations of the Beveridge
Report (1942) and transformed most of its principles to
practical achievements, including the National Health Service
Act of Parliament (1946) and the inception of the service in
1948. Subsequently, Britain was at the forefront of practical
applications of physics and engineering developments in
cancer therapy until the early 1990s, when the reorganized
NHS became disadvantaged in terms of expensive tech-
nological acquisition.
Dr R D Errington related the history of cyclotron
radiotherapy at the BIR President’s Day conference in
2003. He detailed how the initial promising results
obtained with neutron therapy at The Hammersmith
Hospital were not subsequently confirmed in randomized
trials at Edinburgh and at the Clatterbridge facility [1, 2],
which produced neutrons that matched a 5 MeV X-ray
beam. The latter facility was converted to produce protons
on the recommendation of the late Prof. Arthur Jones of
St Bartholomew’s Hospital. This enabled patients with
choroidal melanoma of the eye to receive radical radio-
therapy using protons; this technique was the first example
of three-dimensional (3D) radiotherapy in the UK. Over
1400 patients have by now received this therapy with a
local control rate of 98% – an outstanding achievement
within British medicine [3].
Past attempts to obtain a higher energy facility in
the UK
Since 1992, Clatterbridge, Oxford and the National
Physical Laboratory at Daresbury (near Warrington)
have all unsuccessfully attempted to obtain a higher
energy CPT facility [4]. All these bids were rejected
because of perceived lack of clinical support,
intermittent beam availability, the lack of clinical trial
evidence, the recommendation that a facility should be
sited in a University Hospital campus and perhaps
mostly, the expected high initial costs incurred at a
time when NHS reforms discouraged large-scale
projects, even the provision of new (replacement) linear
accelerators.
More recently, there has emerged a more collective
response from clinical oncologists and medical physicists
who appreciate that obtaining a CPT facility is essential
The British Journal of Radiology, 79 (2006), 24–31 © 2006 The British Institute of Radiology
DOI: 10.1259/bjr/81790390
for the advancement of radiation oncology standards in
the UK. The Royal College of Radiologists (RCR), British
Institute of Radiology (BIR) and Institute of Physics and
Engineering in Medicine (IPEM) for example all support
the case for a CPT facility. Recent improvements in the
quality of cancer imaging and the availability of
industrially produced turnkey facilities have allowed the question to be carefully reconsidered and better understood, particularly in relation to the rapid expansion in
CPT facilities abroad.
Technical aspects
The velocity of heavy charged particles (electrons are
considered to be light) is reduced as they traverse deeper
through tissues. The interaction probability to cause
ionization increases as the velocity falls, so that a peak
of dose occurs at a depth that increases with the energy
imparted to each particle. William Bragg, a British
physicist, described this phenomenon over 100 years ago
[5]. The so called Bragg peak can be ‘‘spread out’’ to
achieve a plateau of uniform dose that covers a target by
use of rotating range-shifting modulators of variable
thickness. In the past, passively scattered beams were
used in this way to provide wide circular or rectangular
beams with spread out Bragg peaks (Figure 1). More
recently, the spot scanning method allows smaller beams
to deposit their peaks within individual voxel targets
defined by good imaging techniques: by the use of
‘‘wobbler’’ magnets and particle energy selection, the
raster scanning system allows cancer-bearing voxels
(defined by x, y, z co-ordinates) to be ‘‘dose painted’’.
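The spreading-out of the Bragg peak can be illustrated numerically. The sketch below is a toy model, not real dosimetry: each pristine peak is approximated by a narrow Gaussian at its range (the entrance dose is ignored for simplicity, and units are arbitrary), and a least-squares solve finds the component-beam weights that flatten the summed dose over the target depths.

```python
import numpy as np

# Toy model: each pristine Bragg peak approximated by a narrow Gaussian
# at its range r (entrance dose ignored; illustrative only).
def pristine_peak(depth, r, width=0.3):
    return np.exp(-((depth - r) ** 2) / (2 * width ** 2))

depth = np.linspace(0.0, 15.0, 1500)        # depth in water (cm)
ranges = np.linspace(8.0, 12.0, 21)         # range-shifted component beams

# Solve for the weights that make the summed dose flat across the target
# (8-12 cm); a rotating variable-thickness modulator realises such a
# weighting mechanically.
A = np.stack([pristine_peak(depth, r) for r in ranges], axis=1)
target = ((depth >= 8.0) & (depth <= 12.0)).astype(float)
weights, *_ = np.linalg.lstsq(A, target, rcond=None)

sobp = A @ weights
plateau = sobp[(depth >= 9.0) & (depth <= 11.0)]
print(f"plateau dose: mean={plateau.mean():.3f}, ripple={np.ptp(plateau):.3f}")
```

A plain least-squares fit can return small negative weights near the plateau edges, which would be unphysical; a clinical optimiser would constrain the weights to be non-negative.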
The Bragg peak position will depend on the initial
energy imparted to the particles as well as their mass and
charge; the Bethe-Bloch equation contains all the neces-
sary parameters. It can be seen from Figure 2 that the
beam energy for clinical use should be at least 200 MeV in
the case of protons, and higher – up to 400 MeV per
nucleon – for carbon ions.
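The dependence of range on energy can be sketched with the empirical Bragg-Kleeman power-law rule. The constants below are commonly quoted approximate fit values for protons in water (they are assumptions, not taken from this article), and the rule is a rough stand-in for the full Bethe-Bloch calculation.

```python
# Empirical Bragg-Kleeman range-energy rule for protons in water,
# R = alpha * E**p, with assumed fit constants alpha ~ 0.0022 cm
# and p ~ 1.77 (a rough approximation, not Bethe-Bloch itself).
def proton_range_cm(energy_mev: float, alpha: float = 0.0022, p: float = 1.77) -> float:
    return alpha * energy_mev ** p

for e in (70, 150, 200, 250):
    print(f"{e:3d} MeV -> ~{proton_range_cm(e):4.1f} cm in water")
```

At 200 MeV the rule gives roughly 26 cm of water-equivalent range, consistent with the point that around 200 MeV is the minimum clinically useful proton energy for deep-seated targets.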
Gantries and robots
Within treatment rooms there are several options for beam
arrangement. The simplest approach is to have fixed
horizontal or vertical beams, or a combination of the two,
for the most straightforward treatments. An isocentric
rotating gantry is required for more complex geometrical
problems. Gantries consist of large cylindrical rotating
structures that contain the beam-bending magnets: they
weigh around 100 tonnes for protons and 200 tonnes for
ions, and must place the beam with 1 mm precision. Future
engineering innovations may reduce the tonnage and costs.
Robotic treatment couches are desirable in order to
rapidly position the patient at predetermined angles
relative to the beams; they may also transport patients
in fixed positions from image-guided or other localization
devices in the treatment rooms to the actual treatment
location. Radiographers may feel sensitive about robotics,
but it will always be the radiographer who commands the
robot and remotely monitors its performance.
Typical centre
The typical layout of a centre is illustrated in Figure 3.
The particles are injected from a small linear accelerator,
accelerated to higher energies in the synchrotron, then
extracted and delivered selectively to
different rooms; the beam switching time between rooms is
Figure 1. Schematic depth dose diagram of a proton beam Bragg peak, the spread out Bragg peak and a megavoltage X-ray beam (modified from Suit et al [12]). The grey shaded areas indicate the extent of dose reduction within normal tissues situated proximal and distal to the tumour target.
Figure 2. Approximate depth dose positions of partially spread out Bragg peaks for protons of different energies.
Figure 3. A schematic diagram of a synchrotron treatment centre.