5. Network Centric Warfare (NCW)
• C2 is an extreme decision-making paradigm characterized by high uncertainty, high risk, and severe time pressure.
• Situational awareness and command functions rest on collected data and on the filtered aggregation and dissemination of information and orders.
6. C2 Challenges
• Minimizing risk and uncertainty requires maximizing information processing, as close to real-time as possible
– A ‘Big Data’ type problem
• More information means higher cognitive load
• High-consequence decisions increase psychological stress
• Critical decision points are correlated with maximum cognitive load
• Decision-making is often conducted under fatigue
7. Dealing with C2 challenges
• A decision support system (DSS) lightens the burden on the operator by filtering and integrating information. Two questions follow:
1. How to filter and integrate information
2. How best to communicate pertinent information
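The two questions above can be made concrete with a minimal sketch (illustrative only; the report fields, entity names, and thresholds are assumptions, not from the source): fuse per-entity sensor reports by confidence weighting, drop weakly supported tracks, and surface only the few most urgent items to the operator.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class Report:
    entity_id: str     # which battlefield entity the report concerns
    threat: float      # assessed threat level, 0..1
    confidence: float  # sensor confidence in the report, 0..1

def fuse_and_filter(reports, min_confidence=0.3, top_n=5):
    """Integrate per-entity reports by confidence-weighted averaging,
    drop weakly supported tracks, and rank the rest by threat."""
    by_entity = defaultdict(list)
    for r in reports:
        by_entity[r.entity_id].append(r)

    fused = []
    for entity_id, rs in by_entity.items():
        total_conf = sum(r.confidence for r in rs)
        if total_conf < min_confidence:
            continue  # too weakly supported to show the operator
        threat = sum(r.threat * r.confidence for r in rs) / total_conf
        fused.append((entity_id, round(threat, 2)))

    # Present only the few most urgent items to limit cognitive load.
    return sorted(fused, key=lambda t: t[1], reverse=True)[:top_n]

reports = [
    Report("tank-1", threat=0.9, confidence=0.8),
    Report("tank-1", threat=0.7, confidence=0.4),
    Report("truck-2", threat=0.2, confidence=0.9),
    Report("unknown-3", threat=0.95, confidence=0.1),  # filtered: weak support
]
print(fuse_and_filter(reports))  # -> [('tank-1', 0.83), ('truck-2', 0.2)]
```

The confidence threshold answers question 1 (what to filter), and the capped, threat-ranked list answers question 2 (what to communicate first).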
8. Algorithmic DSS
• “Adaptive Automation for Human-Robot Teaming in Future Command and Control Systems”, The International C2 Journal, Army Research Laboratory:
• The authors propose that automation of early-stage functions (information acquisition and analysis) can, if necessary, be pursued to a very high level and provide effective support of operators in C2 systems.
• Automation of decision-making functions, on the other hand, should be set at a moderate level unless very high-reliability decision algorithms can be assured, which is rarely the case.
9. Automation of Information Acquisition and Analysis
• Ultimate responsibility for system outcomes always resides with the human.
• In practice, even highly automated systems usually retain some degree of human supervisory control.
• Particularly in combat, oversight and the capability to override and control lethal systems will always be a human responsibility, for reasons of:
– system safety
– changes in the commander’s goals
– avoidance of fratricide
– unanticipated events that cannot be handled by automation
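The override requirement can be sketched as a simple authorization gate. The `Recommendation` type and approval flow below are hypothetical, for illustration only: the automation may act on non-lethal actions by itself, a lethal action waits for explicit human authorization, and the human can always veto either kind.

```python
from dataclasses import dataclass

@dataclass
class Recommendation:
    action: str    # what the automation proposes to do
    lethal: bool   # lethal actions require human authorization

def execute(rec, human_approval=None):
    """Gate automated action on human supervisory control.
    human_approval: True (authorize), False (veto), or None (no input)."""
    if human_approval is False:
        return "vetoed by operator"  # human override always wins
    if rec.lethal and human_approval is not True:
        return "held: awaiting human authorization"
    return f"executing: {rec.action}"

print(execute(Recommendation("reposition sensor", lethal=False)))
# -> executing: reposition sensor
print(execute(Recommendation("engage target", lethal=True)))
# -> held: awaiting human authorization
```

Keeping the lethal branch fail-closed (held unless explicitly authorized) is the design choice that preserves human responsibility under all four conditions listed above.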
10. Problems in human-automation interaction
• Unbalanced mental workload
• Reduced situation awareness
• Decision biases
• Mistrust
• Over-reliance
• Complacency
(Billings 1997; Parasuraman and Riley 1997; Sarter et al. 1997; Sheridan 2002; Wiener 1988)
• Crocoll and Coury (1990): the cost of imperfect automation advice is greater when participants are given recommendations to implement (decision automation) than when they are given only status information with which to make their own decision (information automation).
11. DSS paradox
• For imperfect (less than 100 percent reliable) automation, the greater the reliability, the greater the chance of operator over-reliance (because incorrect advisories are rare), with a consequent cost when the operator uncritically follows the occasional unreliable advice.
• The cost of unreliable decision automation occurs across multiple levels of decision automation, from low to high.
• Users of decision automation may no longer create or explore novel alternatives beyond those provided by the automation, leading to a greater performance cost when the automation is unreliable.
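The paradox can be illustrated with a toy calculation. The compliance model below is an assumption for illustration only (operator compliance set equal to observed reliability, i.e., a more reliable system is cross-checked less often): as reliability rises, the absolute number of wrong advisories falls, but the share of them followed uncritically grows.

```python
def uncaught_wrong_advisories(reliability, n_advisories=1000):
    """Toy model (assumption, not from the source): per n_advisories,
    how many are wrong, and how many of those the operator follows
    uncritically if compliance equals observed reliability."""
    wrong = n_advisories * (1 - reliability)
    compliance = reliability           # over-reliance grows with reliability
    uncaught = wrong * compliance      # wrong advice followed uncritically
    return round(wrong), round(uncaught)

for r in (0.60, 0.95):
    wrong, uncaught = uncaught_wrong_advisories(r)
    print(f"reliability={r:.2f}: {wrong} wrong, {uncaught} followed uncritically")
```

At 60 percent reliability, 240 of 400 wrong advisories slip through; at 95 percent, only 50 advisories are wrong, but nearly all of them (48) slip through, which is the over-reliance cost the slide describes.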
12. Approaches to solving the DSS paradox
• Vary the level and type of automation during system operations, adaptively, depending on context or operator needs.
• Information or decision support is then not fixed at the design stage but varies appropriately with context in the operational environment.
• The concept has also been criticized as potentially increasing system unpredictability.
• Who is “in charge” of adaptation, the human or the system?
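Context-driven adaptation can be sketched as a simple policy. The thresholds and the 1–5 level scale here are assumptions for illustration: automation rises with operator workload and time pressure, but automation of lethal decisions stays capped at a moderate level, in line with the earlier recommendation to keep decision automation moderate.

```python
def choose_automation_level(workload, time_pressure, decision_lethal):
    """Pick an automation level from 1 (manual) to 5 (full automation)
    based on context; thresholds are illustrative assumptions."""
    level = 1
    # Raise automation as operator workload and time pressure climb.
    if workload > 0.5 or time_pressure > 0.5:
        level = 3
    if workload > 0.8 and time_pressure > 0.8:
        level = 5
    # Decision-making over lethal actions stays at a moderate level.
    if decision_lethal:
        level = min(level, 3)
    return level

print(choose_automation_level(workload=0.9, time_pressure=0.9, decision_lethal=False))  # -> 5
print(choose_automation_level(workload=0.9, time_pressure=0.9, decision_lethal=True))   # -> 3
```

A fixed, inspectable policy like this also speaks to the unpredictability criticism: the operator can know in advance exactly when and why the level will change.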
13. Increased complexity with autonomous battlefield systems
• Multiple, highly autonomous robots of varying capabilities will be supervised by much smaller teams of human operators, who will be alerted only when the situation becomes critical (Lewis et al. 2006).
• However, all the performance problems discussed (over-reliance, lack of situation awareness, skill degradation, and workload increases during crises) will, if anything, worsen.
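The "alert only when critical" model can be sketched as a triage function (the robot names, criticality scores, and threshold are assumed for illustration): of many robots, the small operator team sees only those whose situation score crosses a criticality threshold, worst first, and never more than the team can attend to at once.

```python
def triage_alerts(robot_states, criticality_threshold=0.7, max_alerts=3):
    """Return the robots a small operator team should see: only those
    above the criticality threshold, highest criticality first,
    capped at max_alerts to bound operator workload."""
    critical = [(rid, c) for rid, c in robot_states.items()
                if c >= criticality_threshold]
    return sorted(critical, key=lambda t: t[1], reverse=True)[:max_alerts]

states = {"ugv-1": 0.2, "uav-2": 0.9, "ugv-3": 0.75, "uav-4": 0.4}
print(triage_alerts(states))  # -> [('uav-2', 0.9), ('ugv-3', 0.75)]
```

The cap and threshold bound routine workload, but they also embody the slide's warning: everything below the threshold goes unseen, so situation awareness of non-critical robots degrades by design.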