Human Machine Interactions as Partnership
Dr. Shelley P. Gallup, Dr. Mark E. Nissen
Naval Postgraduate School
Information Sciences Department
Team and Partnership: BLUF
Awareness, what is it really?
The Logic of Distinction
Update to OODA loop
…and now including Distinction
Intention
Competent Dialogue
Anticipation
Use of the Human Autonomic Nervous System (ANS) as an exemplar
Adaptation of Poly-Vagal Theory Analog
[Diagram: hyperarousal (fight or flight) above the window of tolerance; hypoarousal (fawn or freeze) below; requisite variety operates within the window]
The human 10th cranial nerve (the vagus nerve) innervates the major organs of the body, producing reactions through the release of hormones.
Learning
Human-machine partnership follows the same developmental cycle as an infant maturing into an individuated adult.
Extending the analogy to systems of systems
Increasing Danger of combat
Dynamic Window of Tolerance
Requisite variety must expand to meet an increase in context factors, or contract to focus combat power in a narrower context.
CDR's intent and ROE: towards fight or towards retreat
Self Defense if attacked
Context outside the WOT requires feedback to the SoS to increase the factors that improve requisite variety (second-order cybernetics).
Dynamic contexts
Learning to adjust to risk: dialog with automation.
Unknown unknowns
[Diagram: a trigger increases both the dynamic window of tolerance (window of control) and the requisite variety needed for control]
Adaptation through Human-machine partnership
Discussion: Main Points and Example
Although this is a biological analog offered as an example, it closely follows W. Ross Ashby's cybernetic explanation of control systems. In cybernetics, the control function keeps the "system" within tolerances so that it can perform its function; this is first-order control. Occasionally the system exceeds a threshold and requires a second-order response. The controller's ability to bring the system back into tolerance is a function of the controller's requisite variety for dealing with exceptions.
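For reference, Ashby's Law of Requisite Variety is often written in the following simplified form (standard cybernetics notation, not taken from the slides), where $V_D$ is the variety of disturbances, $V_R$ the variety of responses available to the regulator, and $V_O$ the variety of outcomes the regulator can confine the system to:

```latex
% Law of Requisite Variety (simplified multiplicative and logarithmic forms)
\[
  V_O \;\ge\; \frac{V_D}{V_R}
  \qquad\text{equivalently}\qquad
  \log V_O \;\ge\; \log V_D - \log V_R
\]
```

In words: only variety in the regulator can absorb variety in the disturbances; a controller short on requisite variety cannot keep outcomes within tolerance.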
In the first slide this is operationalized by creating a "window of tolerance." This is first-order control: our autonomic nervous system takes care of the ups and downs within this window. When exceptional triggers are discerned in the external environment, the autonomic nervous system reacts through its sympathetic (action-inducing) and parasympathetic (quieting) branches.
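To make the control loop concrete, here is a minimal Python sketch of first-order regulation inside a fixed window of tolerance, with escalation to a second-order response when a trigger falls outside it. The class and function names (WindowOfTolerance, first_order_control, regulate) and the numeric thresholds are illustrative assumptions, not part of any fielded system.

```python
# Minimal sketch of first- vs. second-order control within a window of
# tolerance, loosely following the Ashby/ANS analog in the text above.

from dataclasses import dataclass


@dataclass
class WindowOfTolerance:
    low: float   # below this -> hypoarousal (fawn/freeze analog)
    high: float  # above this -> hyperarousal (fight/flight analog)

    def contains(self, arousal: float) -> bool:
        return self.low <= arousal <= self.high


def first_order_control(arousal: float, wot: WindowOfTolerance) -> float:
    """Routine regulation: nudge arousal back toward the middle of the window,
    the way sympathetic (action-inducing) and parasympathetic (quieting)
    branches trade off in the autonomic analog."""
    midpoint = (wot.low + wot.high) / 2.0
    return arousal + 0.5 * (midpoint - arousal)   # proportional correction


def regulate(arousal: float, wot: WindowOfTolerance) -> tuple[float, bool]:
    """Return (new_arousal, escalated). Escalation models the second-order
    response required when a trigger pushes the system outside the window."""
    if wot.contains(arousal):
        return first_order_control(arousal, wot), False
    # Out of tolerance: first-order correction alone is not assumed sufficient;
    # flag the exception for a second-order (controller-level) response.
    return arousal, True


if __name__ == "__main__":
    wot = WindowOfTolerance(low=0.3, high=0.7)
    for trigger in (0.5, 0.65, 0.9):          # last value exceeds the window
        level, escalated = regulate(trigger, wot)
        print(f"arousal={trigger:.2f} -> {level:.2f}, second-order needed: {escalated}")
```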
However, in the next slide the window of tolerance is NOT fixed. It is dynamic, determined by context, commander's intent, being targeted unexpectedly, and so on; many external conditions can affect the WOT. How the system of systems reacts, and the speed of its reaction, is a function of a human-machine partnership, with each partner doing what it is best at and the human having the last word. This partnership results from a kind of "dialog" between the humans and the machine partner interacting with a time-critical controller. A Monterey Phoenix simulation is included here. (add MP)
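The dialog pattern can be sketched the same way: the machine partner computes a dynamic window from context and proposes a response, while the human controller retains the last word. The Context fields, thresholds, and function names here are hypothetical placeholders for illustration only, a sketch under those assumptions rather than a description of any real system.

```python
# Minimal sketch of the human-machine "dialog": the machine proposes when a
# threat falls outside a *dynamic* window of tolerance; the human decides.

from dataclasses import dataclass


@dataclass
class Context:
    commanders_intent: str      # e.g. "deter" or "engage"
    targeted: bool              # has the force been targeted unexpectedly?


def dynamic_window(ctx: Context) -> tuple[float, float]:
    """Widen or narrow the window of tolerance from context rather than
    treating it as fixed."""
    low, high = 0.3, 0.7
    if ctx.targeted:
        low, high = 0.2, 0.8                     # being targeted widens what must be handled
    if ctx.commanders_intent == "deter":
        low, high = low + 0.05, high - 0.05      # deterrence narrows acceptable responses
    return low, high


def machine_proposal(threat_level: float, ctx: Context) -> str:
    """The machine partner's side of the dialog: a recommendation, not an action."""
    low, high = dynamic_window(ctx)
    if threat_level > high:
        return "recommend defensive engagement"
    if threat_level < low:
        return "recommend de-escalation / monitoring"
    return "hold: within window of tolerance"


def human_decision(proposal: str, approve: bool) -> str:
    """The human controller always has the last word over any machine proposal."""
    return proposal if approve else "proposal rejected by human controller"


if __name__ == "__main__":
    ctx = Context(commanders_intent="deter", targeted=True)
    proposal = machine_proposal(threat_level=0.85, ctx=ctx)
    print(human_decision(proposal, approve=False))
```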
Relationship to AI development for war at sea
No doubt "AI" (I prefer "advanced autonomy") has made strides. But these systems are simplistic, built on the theory that if we feed the machine enough examples, it will "learn" what is correct. So far they do NOT include adequate requisite variety to act as a "controller" in fully autonomous modes with weapons-release authority. The concept of a human on another platform acting as the ultimate controller for weapons on an autonomous platform is rife with difficulties, not the least of which is the assumption of clear and consistent communications.
In a new concept, the "machine" is not "educated" to recognize and properly react to all contexts; it will never have the requisite variety needed to do so.
Instead, what is needed is an understanding of what constitutes a "trigger" in dynamic contexts, adjusting the window of tolerance and thereby adjusting the response in dialog with the human. What changes the window of tolerance? Commander's intent, effects (kinetic and non-kinetic), weapons availability (for each ship and across the force), tactics, willingness to take casualties, the influence of effects on the larger strategic picture, and whether the setting is deterrence or hot war. Many factors expand or condense the WOT. When it is condensed, there are more triggers, and second-order actions by forces are needed to bring the fight back within the window of tolerance or to expand it. A simple illustration of this relationship follows.
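One way to picture this, purely as an illustration: treat each factor as widening or narrowing the window and count how many events become triggers at each width. The factor names echo the list above; the weights, base width, and arithmetic are assumptions made for the sketch, not doctrine.

```python
# Illustrative sketch: context factors expand or condense the window of
# tolerance (WOT), and a condensed window yields more triggers.

FACTORS = {
    "commanders_intent_restrictive": -0.10,   # tighter intent condenses the WOT
    "weapons_availability_low":      -0.05,
    "strategic_sensitivity_high":    -0.10,
    "willingness_to_take_casualties": 0.10,   # expands the WOT
    "hot_war_vice_deterrence":        0.15,
}


def window_width(active_factors: set[str], base: float = 0.4) -> float:
    """Sum the active factors' contributions onto a base window width."""
    width = base + sum(FACTORS[f] for f in active_factors)
    return max(0.05, width)   # never collapse the window to zero


def count_triggers(events: list[float], width: float, center: float = 0.5) -> int:
    """An event is a trigger when it falls outside the current window."""
    low, high = center - width / 2, center + width / 2
    return sum(1 for e in events if e < low or e > high)


if __name__ == "__main__":
    events = [0.42, 0.55, 0.61, 0.70, 0.81]
    wide = window_width({"hot_war_vice_deterrence"})
    narrow = window_width({"commanders_intent_restrictive", "strategic_sensitivity_high"})
    print("triggers (expanded WOT):", count_triggers(events, wide))     # fewer triggers
    print("triggers (condensed WOT):", count_triggers(events, narrow))  # more triggers
```

Running the sketch shows the condensed window producing more triggers than the expanded one, which is the point of the preceding paragraph's last sentence.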
Words are important
There is a problem with our use of the term “artificial intelligence.” What is meant by artificial? By intelligence? These terms are problematic by themselves, and together only confuse what we are really talking about.
We can create systems with a great deal of ability to recognize a "thing." However, there is no attachment at the system level that a human would recognize as an emotional connection. That is why a dialogic approach is also needed. Anthropomorphism is important because it gives the actors a sense that they are understood and that each cares about the other's intentions and reactions.
War is a set of initial conditions, shifts in those conditions, and a complex set of unknowns and unexpected emergent behaviors.
All AI is autonomy. It can be complex autonomy, but it is still based on human development and includes biases and assumptions. Triggers do not include these. Triggers create defense mechanisms (to borrow from psychology and Dr. Wilhelm Reich). How those defense patterns are used lies in the human domain.