Trace Detector Evaluation Criteria
Dr. Susan F. Hallowell
Introduction
The trace detection industry is offering numerous products to an increasing customer base for the
detection and interdiction of explosives, chemical warfare agents, drugs and toxic industrial chemicals
for field applications. For the topic of this discussion, only explosive trace detection (ETD) systems will
be discussed. These ETD systems detect explosive residues at extremely low levels, and can be highly
specific in their identification of each explosive.
In order to make objective comparisons of one detector to another, various aspects of each system must
be examined on an equal basis. Generally, ETD systems that arrive in airports for testing and evaluation
have already passed laboratory qualification. This simply means that systems are able to detect and
identify threat substances at a specified level in a simulated environment. What works in a simulated
environment does not always work well in the real world. Airport operational tests are critical to
reveal how well individual ETD systems meet the challenges of the demanding environment in the
airport ecosystem.
How can airports measure performance?
While an airport may take into account laboratory results indicating a system's ability to detect threat substances, it is more important that airport decision-makers conduct their own real-world operational tests to ensure that these systems work and become part of the seamless flow of passengers through airports.
This is measured by:
 False Alarm Rate
 Through-put/Clear-down
 Ease of Operation
False Alarms: Cost Money
When a system alarms, the alarm must be resolved: checkpoint personnel must follow the alarm-resolution protocol. That process slows throughput, requires additional personnel, and may even require law enforcement to respond to what turns out to be a false alarm. Even a 1 percent false alarm rate has a major impact on airport operations. For example, at an airport the size of Heathrow, with roughly 201,000 passenger arrivals and departures per day, a 1 percent rate translates to nearly 2,000 false alarms per day if every passenger were screened (in practice, only a few percent of passengers actually undergo ETD screening). Resolving false alarms can cost a single airport millions of dollars per year.
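As a rough sketch of this arithmetic, the following uses the 2,000-alarms-per-day figure from the Heathrow example above; the trap cost, resolution time, and labor rate are illustrative assumptions, not figures from this paper:

```python
# Back-of-the-envelope annual cost of false alarms.
# Alarm count comes from the Heathrow-scale example above; unit costs are
# illustrative assumptions, not figures from this paper.
false_alarms_per_day = 2_000        # Heathrow-scale example above
trap_cost = 1.50                    # assumed cost of the trap consumed by a rescan, USD
minutes_per_resolution = 5          # assumed screener time to resolve one false alarm
labor_cost_per_hour = 30.0          # assumed fully loaded screener labor rate, USD

cost_per_alarm = trap_cost + (minutes_per_resolution / 60) * labor_cost_per_hour
annual_cost = false_alarms_per_day * 365 * cost_per_alarm
print(f"~${annual_cost:,.0f} per year")   # on the order of millions of dollars
```

Even with conservative per-alarm costs, the total lands in the millions of dollars per year, consistent with the estimate above.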
Through-put: Trace Detection should not impact the flow of passengers at checkpoint
Legacy systems have traditionally been limited by the number of samples they can process per hour and by how quickly they come back on-line to analyze the next sample.
Systems utilizing ion mobility spectrometry (IMS) have traditionally not been able to screen at a very high frequency. These systems struggle to recover from alarms quickly and can take up to several hours to come back on-line after an alarm.
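To see why clear-down matters for checkpoint flow, here is a minimal sketch of effective throughput; the analysis time, clear-down times, and alarm rate are assumed values for illustration only:

```python
# Effective samples per hour once analysis time and clear-down are included.
# All timing values are assumed for illustration, not measured data.
analysis_seconds = 8.0        # assumed time to analyze one trap
cleardown_clear_s = 5.0       # assumed clear-down after a non-alarm sample
cleardown_alarm_s = 120.0     # assumed clear-down after an alarm
alarm_rate = 0.01             # 1 percent of samples alarm

avg_seconds_per_sample = (analysis_seconds
                          + (1 - alarm_rate) * cleardown_clear_s
                          + alarm_rate * cleardown_alarm_s)
print(f"~{3600 / avg_seconds_per_sample:.0f} samples per hour")
```

A system whose clear-down after an alarm stretches from minutes to hours drives this figure toward zero, which is exactly the legacy IMS behavior described above.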
Ease of Operation
ETD systems originated as adaptations of laboratory instruments. Today, systems are available that deliver laboratory performance to airport screeners and are easy to use. Ease of use means that the systems are plugged in, warmed up, and ready to screen quickly, and that they are quick to learn, self-calibrating, equipped with internal diagnostics, and easy to operate.
How to evaluate systems in your airports
When conducting tests in your airport, several factors will impact operation of systems. ETD evaluation
is based on three major criteria: operational performance, human factors and total cost of
operation/ownership.
Operational Performance and Human Factors
 Interaction of operator with instrument - is it easy for screeners to use?
 Does the system warm up and become operational quickly at the beginning of a screening
operation?
 Is it simple to prepare for use?
 Is the calibration and verification a simple process?
 Is it simple to use for screening?
 Are the results easy to interpret?
 Do the results provide the information needed for any follow-up questions the screener may need
to ask (e.g., is it important to know the explosive identity when "clearing" an alarm)?
 Does the screener have any problem collecting the sample or inserting it into the machine?
 Is the monitor easy to see?
 Does the system seem to be intuitive and easy-to-use for the screener?
 Can the screener "do it wrong", or is the detector engineered such that the operation is fairly
foolproof?
 Is there a good training package associated with the device?
 Is there a high degree of confidence that an alarm is real?
Total Cost of Operation/Ownership
Make sure to gain detailed information and specific costs of:
 Dopants and calibrants
 Consumables
 Maintenance
 False alarm rates
 Radiation-related costs (if the unit has a radioactive source or utilizes ionizing radiation)
Other Considerations
 Reliability, maintenance, and service offerings
o MTBF (Mean Time Between Failures)
o What happens when the system needs maintenance?
o Does the company have a remote diagnostic capability?
o How long does it take to get service?
 Is the system collecting data in a format that meets your needs?
o Do you need a record of the number of samples/false alarms per day?
o Use by operators
o Other factors?
Be aware of environmental factors, such as ambient temperature, dust, humidity, pollutants, and other chemicals that may be present at the checkpoint (deicers, fertilizers, etc.), which might affect ETD system performance.
ETD Trial Recommendations
Operational Testing:
Allow time for testing: run a minimum of two weeks to two months of airport operational testing.
This will yield critical data for purchasing decisions:
 How well manufacturers’ claims align with user experience
 How well systems respond to environmental conditions
 Complete performance data collection
Test 2-3 units of a particular manufacturer’s system:
 Controls for any performance variations between systems
 Gathers more experimental data
 Delivers more operator feedback
Operational performance
For each of the factors below, record a result or score during the operational trial.

Operational false alarms
Operational false alarms are a critical measure for an ETD because the rate impacts throughput and the availability of the system. ETD systems with high operational false alarm rates can also increase the total cost of operation. Measures to record:
 Operational false alarm rate in the field (%)
 Number of false alarms per year
 Cost of rescanning caused by false alarms
 Cost per trap
Clear-down time
The clear-down time is the time the system requires to clear out the previous sample and be ready for the next. Clear-down algorithms differ from one manufacturer to another. The key parameters for clear-down are:
 Average clear-down time after an alarm
 Average clear-down time for non-alarm samples
Pd (Probability of detection)
Probability of detection (Pd) is the probability that the ETD system will alarm when presented with a threat. The ETD system includes the instrument, the sampling mechanism, and the sampling wand. Pd is typically evaluated in laboratory environments with real threats, using fairly sophisticated designed experiments. These results may be obtained directly from the test center.
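Pd figures come from the test center, but when interpreting them it can help to attach an uncertainty to the underlying trial counts. The sketch below uses hypothetical counts and a standard Wilson binomial interval; it is a generic statistical illustration, not a procedure prescribed here:

```python
# Point estimate and 95% Wilson confidence interval for Pd.
# Counts are hypothetical; this is a generic binomial calculation.
import math

detections = 96          # assumed alarms on threat presentations
trials = 100             # assumed number of threat presentations
z = 1.96                 # 95% confidence

p_hat = detections / trials
denom = 1 + z**2 / trials
center = (p_hat + z**2 / (2 * trials)) / denom
half = (z / denom) * math.sqrt(p_hat * (1 - p_hat) / trials + z**2 / (4 * trials**2))

print(f"Pd ~ {p_hat:.2f}, 95% CI [{center - half:.2f}, {center + half:.2f}]")
```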
Mean time between failures (MTBF)
Mean time between failures (MTBF) is the elapsed time between inherent failures of a system in operation. MTBF can be supplied by the manufacturer and can also be determined in the field during operational tests. It is typically measured in hours.
Mean time to repair (MTTR)
This is the mean operational downtime required to diagnose and repair a unit due to all critical and non-critical failures during operating hours.
System availability
System availability is defined as:
Availability = MTBF / (MTBF + MTTR)
where MTBF is the mean time between failures and MTTR is the mean time to repair. Availability is expressed as a percentage of time. High-performance ETD systems must meet an availability of 97%.
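A minimal sketch of applying this formula to field data gathered during a trial; the MTBF and MTTR values below are assumed, illustrative numbers:

```python
# Availability = MTBF / (MTBF + MTTR), checked against the 97% target.
# MTBF and MTTR values are assumed, illustrative numbers.
mtbf_hours = 400.0    # assumed mean time between failures from the trial log
mttr_hours = 6.0      # assumed mean downtime to diagnose and repair

availability = mtbf_hours / (mtbf_hours + mttr_hours)
status = "meets" if availability >= 0.97 else "falls below"
print(f"Availability = {availability:.1%} ({status} the 97% target)")
```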
Remote diagnostics
Factory service personnel can evaluate system performance and identify faults without site visits, and can make on-line adjustments. Considerations:
 Remote diagnostic capability
 Remote system adjustment
 Remote network connections would need to be enabled
 Remote access must be cleared with the airport facility security plan
Automated maintenance
Automated maintenance is the performance of maintenance tasks on a regular basis without operator action.
 Automated maintenance capability
 Scheduled maintenance capability
Human Factors
For each of the factors below, record a result or score during the operational trial.

Ease of use
The interaction between the operator and the instrument should be minimal and very easy to perform. For an operator, the daily operations required by the system should be limited to:
 Start-up/warm-up time of 30 minutes or less
 Calibration (automatic or manual)
 Verification
Sampling wand
 Ease of installing/removing the trap from the wand
 Robustness of the wand
 Comfort and ease of use
System calibrations
ETD performance can be affected by environmental conditions; calibrations are performed to optimize the system and ensure the highest level of detection.
 How many calibrations are needed per day?
 Does calibration require operator intervention?
System verifications
The verification procedure involves sampling a known substance from a sample trap to ensure the system is properly calibrated and that the system's hardware (desorber, sampling flow, traps) is functioning properly. Systems can be evaluated by measuring the number of:
 Daily verifications
 Daily use of traps
Navigation through the GUI (software interface)
The GUI should be intuitive and easy for the user to navigate. Complex interaction slows the operator and distracts them from their primary task, screening. Automated features such as internal calibration and system prompt messages greatly reduce operator workload and simplify system operation.
 Number of interactions per day for calibration and verification
 Ease of use/intuitiveness
 Localization (local language)
Total cost of ownership
For each of the factors below, record a result or score during the operational trial.

Unit cost
The cost of ETD equipment varies between manufacturers and can range between $20K and $60K.
 Acquisition cost per unit
 Total cost of acquisition
Operational expense
The cost of operating and maintaining an ETD system may exceed the cost of acquisition within a short period of time, so estimating the operating expense is extremely important. Airports must determine how the system will be used and what materials and labor are necessary to run ETD systems. A rough roll-up of these elements is sketched after this list.
 Determine cost of consumables
 Determine cost of maintenance
 Determine cost of labor
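A minimal sketch of rolling the consumable, maintenance, and labor elements above into an annual figure; every quantity and unit cost below is an illustrative assumption, not a figure from this paper:

```python
# Rough annual operating expense for one ETD unit (illustrative assumptions only).
samples_per_day = 200           # assumed screening volume for one unit
trap_cost = 1.50                # assumed cost per sample trap, USD
samples_per_trap = 1            # assumed single-use traps
consumables_per_year = 1_000.0  # assumed dopants, calibrants, filters, USD
maintenance_per_year = 4_000.0  # assumed annual maintenance/service, USD
staffed_hours_per_day = 16.0    # assumed hours the unit is staffed
labor_cost_per_hour = 30.0      # assumed fully loaded labor rate, USD

trap_cost_per_year = samples_per_day * 365 / samples_per_trap * trap_cost
labor_cost_per_year = staffed_hours_per_day * 365 * labor_cost_per_hour
total = trap_cost_per_year + consumables_per_year + maintenance_per_year + labor_cost_per_year
print(f"~${total:,.0f} per year for one unit")
```

Even with modest assumptions, the annual operating figure quickly exceeds the $20K to $60K acquisition cost noted above.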
Sample trap cost
Manufacturers select sampling traps with excellent particle-collection efficiency for the swipe-sampling step. Traps vary greatly from one manufacturer to another and can have different useful lifespans. Airports should consider the following to evaluate the cost of traps:
 Are traps certified by any governmental agency?
 Are traps multi-use?
 Number of samples per trap?
 Cost per trap?
False alarm cost
Every time the system alarms, the operator must resolve the alarm by following the internal alarm-resolution protocol. This operation requires time, and its cost can be estimated as:
FA cost per year = (number of false alarms per year) × (cost of each individual trap + labor cost of resolving one alarm)
Radioactivity and ionizing radiation
 Radiation safety plan
 Radioactive material license
 National and international transport
 Disposal cost
Conclusion
Operational testing is absolutely essential to determine which ETD system meets the needs of the airport environment. No airport should buy a system without testing it in its own unique environment. While ETD manufacturers may provide performance data, it is critical that airports collect their own data to verify manufacturers' claims. Using the results of robust airport tests and considering the factors enumerated above will yield the critical data operators need to make an informed decision. Choosing the right system will contribute to a positive passenger experience, save resources that are otherwise consumed by false alarms and other operational problems, and in fact increase security at airports.