
Combat Decision-Making in High Intensity Conflict

Williams Foundation March 2018

JD McCreary

Chief, Disruptive Technology Programs

Georgia Tech Research Institute

“Engage with what they expect; occupying their minds while you wait for that which they cannot anticipate.” – Sun Tzu

• Decision superiority is a key element of (future) conflict

• The complex decision space will require a focus on technologies that mitigate limitations on human sensory and cognitive capacity

• Must develop trust in AI and its reliability through exposure and transparency

AI Arms Race

• “Artificial intelligence is the future, not only for Russia, but for all humankind. Whoever becomes the leader in this sphere will become the ruler of the world.” – Vladimir Putin, Sep 2017

• Artificial intelligence will be able to replace a soldier on the battlefield and a pilot in an aircraft cockpit – Viktor Bondarev (former commander of Russia’s Aerospace Force and Chairman of the Federation Council Defense and Security Committee), Nov 2017

• China’s State Council issued a plan to lead the world in AI by 2030: “Next Generation Artificial Intelligence Development Plan” – Jul 2017
  • Highlights an approach of military-civil fusion (or civil-military integration) to ensure that advances in AI can be rapidly leveraged for national defense
  • Pursues advances in big data, human-machine hybrid intelligence, swarm intelligence, and automated decision-making, along with autonomous unmanned systems and intelligent robotics

AI-assisted decision-making: platforms to personal assistant, tactical to strategic

• Russia builds lethal autonomous weapons systems, platforms, swarms

• China plans to use AI to assist submarine commanders

• U.S.: Data to Decision, Fleet Tactical Grid, Multi-Domain Command and Control

• Individuals, sensor/weapon/platform (manned-unmanned teaming), force design/employment

• Time scales, scope of complexity: human, machine

• Continuous learning, transfer of knowledge (mission, ready room, deployment)

• Integrated Tasking Order cycle (72 hrs), time sensitive targeting, kinetic or non-kinetic

• Grey Zone, Hybrid Warfare

• Deception, unconstrained solutions (not all playing the same game)

• Decision superiority preparedness (requirements)

Operational AI-assisted decision-making

• Cognitive, netted, distributed force (manned/unmanned systems, BM, MDC2)

• Algorithmic or prototype warfare (speed of software development)

• Human judgment will remain essential, but decision allocation between humans and machines will be shifting

• Man-machine teaming (Centaur or R2D2 at individual, platform, AOC/MOC)
  • Human expertise/experience, workload, pace, complexity

• AI-assisted learning, COA generation, decision-making

• Trust, transparency, training

• Live, Virtual, Constructive (LVC)
  • AI assists blue with complexity/scale: training, T&E, ops rehearsal, force/TTP design/validation
  • AI augments/supplants red/white forces; dynamic adaptation of force behaviors and environment

• Extend speed of command, decision superiority

• Computer, network, (sensor) data…knowledge, learning, decisions

Discussion

OODA vs OODA

[Diagram: two opposing OODA loops – Observe, Orient, Decide, Act]

Traditional investment

• Decision superiority

• Man-machine teaming

• AI, visualization

• Big data, COAs

• Keep in orient phase

• Introduce “false” decisions

Force Design

• Outpace adversary decision speed

• Enhance battlespace awareness, decision speed, and dynamic synchronized actions

• Computer aided data fusion, info analysis, present warfighting options…enable decision speed

• Automated, intelligent analytics and AI designed to improve Commander/Warfighter resource allocation decisions…providing COA analysis, planning, automated synchronization, as a means of creating effective human-machine teaming

The Intelligent Digital Mesh

• Intelligent: How AI is seeping into virtually every technology and, with a defined, well-scoped focus, can enable more dynamic, flexible and potentially autonomous systems.

• Digital: Blending the virtual and real worlds to create an immersive digitally enhanced and connected environment.

• Mesh: The connections between an expanding set of people, processes, devices, content and services to deliver operational outcomes.

• Data to Decision, Fleet Tactical Grid, Multi-domain C2

• Sensors, networks, data, knowledge, decisions

Decision support technology

• The current wave of progress and enthusiasm for AI has been driven by three mutually reinforcing factors: the availability of big data which provided raw material for dramatically improved machine learning approaches and algorithms, which in turn relied on the capabilities of more powerful computers. – Dec 2016 White House report

• Task vs data representation

• Supervised (classification), unsupervised (pattern analysis)

• Deep learning neural network
  • Value, policy optimization
  • Rational reasoning (IBM, Google)
  • Memory augmented (deliberative; Differentiable Neural Computer)

• Imitation learning (Simulated data)

• Reinforcement learning (DeepMind – AlphaGo Zero, OpenAI – Dota 2, self-play)

• Transfer learning
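To make the supervised/unsupervised distinction above concrete, here is a minimal sketch. It assumes scikit-learn and NumPy are available; the synthetic “track” features and labels are invented for illustration, not drawn from the briefing.

```python
# Minimal contrast of supervised classification vs. unsupervised pattern analysis.
# Assumes scikit-learn; the "track" data here is synthetic and purely illustrative.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Synthetic feature vectors for sensor tracks: [speed_m_s, altitude_m, rcs_dbsm]
X = rng.normal(loc=[[250, 9000, 5]] * 50 + [[15, 100, 20]] * 50, scale=[20, 500, 2])
y = np.array([1] * 50 + [0] * 50)  # 1 = fast air track, 0 = slow surface track

# Supervised: learn a labeled mapping (classification).
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.predict([[240, 8500, 4]]))        # -> likely [1]

# Unsupervised: find structure without labels (pattern analysis).
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(np.bincount(clusters))                # -> two groups discovered from the data alone
```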

User Interface (UI)

• Ender’s Game, Star Trek, Interstellar, Minority Report, Watson, Alexa, chatbots, HoloLens

• Computer’s job: interpret the user’s instructions, process the data, present results to the user.

• Human’s job: provide Commander’s intent, understand the information presented in the context of intended goals or the surrounding environment.

• Moving away from designing Graphical User Interfaces (GUIs), which require the user’s full attention, and towards designing less obtrusive interaction (a minimal sketch follows at the end of this slide):
  • Proactively initiate conversation about goals
  • Limit interaction to a few natural questions and responses
  • Factor in a large number of observations and assumptions
  • Present the results

• iPad vs laptop, mobile apps

• Conversational, visual, tactile/haptic

• Personalized UI/AI
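As referenced above, a minimal sketch of the less obtrusive, goal-driven interaction pattern: the assistant opens the conversation, asks only a few natural questions, folds in prior observations and stated assumptions, and presents a result. All class names, questions and limits here are hypothetical.

```python
# Hypothetical sketch of a goal-driven, low-friction conversational assistant.
# It proactively opens the exchange, asks a small number of questions,
# folds in prior observations/assumptions, and presents a recommendation.
from dataclasses import dataclass, field

@dataclass
class DialogueContext:
    commanders_intent: str = ""
    observations: dict = field(default_factory=dict)   # pulled from sensors/feeds beforehand
    assumptions: dict = field(default_factory=dict)    # stated so the user can challenge them

class DecisionAssistant:
    MAX_QUESTIONS = 3  # keep the interaction to a few natural questions

    def __init__(self, context: DialogueContext):
        self.ctx = context

    def open_conversation(self) -> str:
        # Proactively initiate conversation about goals.
        return "What outcome do you need in the next 12 hours?"

    def record_intent(self, answer: str) -> None:
        self.ctx.commanders_intent = answer

    def clarifying_questions(self) -> list[str]:
        # Ask only for what the observations/assumptions cannot already answer.
        questions = []
        if "weather" not in self.ctx.observations:
            questions.append("Should I assume current weather holds?")
        if "rules_of_engagement" not in self.ctx.assumptions:
            questions.append("Any change to the standing ROE?")
        return questions[: self.MAX_QUESTIONS]

    def present_result(self) -> str:
        # Present the recommendation together with the assumptions behind it.
        return (f"Recommended option for intent '{self.ctx.commanders_intent}', "
                f"based on {len(self.ctx.observations)} observations and "
                f"{len(self.ctx.assumptions)} stated assumptions.")
```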

• Develop 4 slides for APC

• 2 tech
  • Computing, edge, fog, neural nets, AI algorithms, learning, transfer, DNC (inflective)
  • Interface/trust, AR/VR/MR, conversational, haptic, LVC (transition to ops)

• 2 ops value
  • Tactical/Operational (individual/collective, platforms, ops centers)
  • Force (execution/design/experiment)

• “static” decision making (pre-conflict) vs “dynamic” (real-time)

• 5th gen – more valuable as “decision makers”? Persistence of combat loads (kinetic vs non-kinetic; persistence of info)

• Centaur approach

• Russia – hybrid warfare; China – informationized warfare (information dominance)

• Even if you are not a fan of cyber/EW, you must respect manned/unmanned teaming potential

• Algorithmic or prototype warfare (speed of software development)

• Human judgment will remain essential, but the line of decision allocation between humans and machines will be shifting in coming years.

AI beats pilot

• The secret to ALPHA's superhuman flying skills is a decision-making system called a genetic fuzzy tree, a subtype of fuzzy logic algorithms. The system approaches complex problems much like a human would, says Ernest, breaking the larger task into smaller subtasks, which include high-level tactics, firing, evasion, and defensiveness. By considering only the most relevant variables, it can make complex decisions with extreme speed. As a result, the A.I. can calculate the best maneuvers in a complex, dynamic environment, over 250 times faster than its human opponent can blink.

• ALPHA is currently viewed as a research tool for manned and unmanned teaming in a simulation environment.
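ALPHA’s genetic fuzzy tree itself is not reproduced in the briefing; the sketch below only illustrates the decomposition idea described above – small subtasks (evasion, firing, high-level tactics), each evaluating just its most relevant variables through fuzzy memberships, then combined. All membership functions, thresholds and variable names are invented.

```python
# Illustrative sketch of the decomposition idea behind a fuzzy-tree controller:
# break the engagement problem into small subtasks, each of which looks only at
# the few variables most relevant to it, then blend their outputs.
# This is NOT ALPHA's implementation; all memberships and thresholds are invented.

def tri(x, lo, mid, hi):
    """Triangular fuzzy membership in [0, 1]."""
    if x <= lo or x >= hi:
        return 0.0
    return (x - lo) / (mid - lo) if x <= mid else (hi - x) / (hi - mid)

def evasion_urgency(missile_range_km, closing_rate_ms):
    # Subtask: evasion. Only range and closing rate matter here.
    close = tri(missile_range_km, 0, 5, 20)
    fast = tri(closing_rate_ms, 100, 600, 1500)
    return max(close, fast)

def firing_opportunity(target_range_km, off_boresight_deg):
    # Subtask: firing. Only weapon-envelope variables matter here.
    in_range = tri(target_range_km, 2, 15, 40)
    aligned = tri(off_boresight_deg, -60, 0, 60)
    return min(in_range, aligned)

def decide(state):
    # High-level tactic: compare subtask activations and pick the dominant one.
    scores = {
        "evade": evasion_urgency(state["missile_range_km"], state["closing_rate_ms"]),
        "fire": firing_opportunity(state["target_range_km"], state["off_boresight_deg"]),
    }
    return max(scores, key=scores.get), scores

print(decide({"missile_range_km": 8, "closing_rate_ms": 700,
              "target_range_km": 18, "off_boresight_deg": 10}))
```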

• HMI – NLP, perception, visualization, audio, verbal, conversational, tactile

• Decisions – autonomous machines, COA generation

• The software agent is an intelligent nonhuman agent (IA) that has clear objectives, is able to monitor its environment, and is autonomous in the sense that it can generate courses of action (COAs) to achieve its objectives

• Mixed-initiative autonomy, wherein humans and IAs share decision-making. The IA has a degree of autonomy and communicates with its human supervisor. The focus of this report is on the decision-making relationship between IAs and their human supervisor(s) in future battle spaces that are dynamic, dangerous, and complex.
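A hypothetical sketch of the mixed-initiative pattern described above: the IA monitors its environment, generates candidate COAs against its objective, and a human supervisor retains approval authority. All class names, scores and the approval rule are invented for illustration.

```python
# Hypothetical sketch of mixed-initiative autonomy: the intelligent agent (IA)
# monitors its environment, generates candidate courses of action (COAs),
# and shares decision-making with a human supervisor who approves or rejects.
from dataclasses import dataclass

@dataclass
class COA:
    name: str
    expected_effect: float   # notional score from the IA's internal evaluation
    risk: float              # notional risk estimate, 0..1

class IntelligentAgent:
    def __init__(self, objective: str):
        self.objective = objective

    def monitor(self, environment: dict) -> dict:
        # Placeholder for sensing/state estimation.
        return environment

    def generate_coas(self, state: dict) -> list[COA]:
        # Placeholder generation: a real system would plan against the objective.
        return [
            COA("hold and observe", expected_effect=0.3, risk=0.1),
            COA("reposition sensor", expected_effect=0.6, risk=0.3),
            COA("engage with non-kinetic effect", expected_effect=0.8, risk=0.6),
        ]

def human_supervisor_approves(coa: COA) -> bool:
    # The human retains decision authority; here a stand-in policy rejects high risk.
    return coa.risk < 0.5

agent = IntelligentAgent(objective="maintain surveillance of area X")
state = agent.monitor({"contacts": 2})
for coa in sorted(agent.generate_coas(state), key=lambda c: c.expected_effect, reverse=True):
    if human_supervisor_approves(coa):
        print("Executing:", coa.name)
        break
```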

• Trust, transparency

• Industry, gaming

• operator’s understanding of the IA’s decision process: L1) operator perception of the IA’s actions and plans, L2) comprehension of the IA’s reasoning process, and L3) understanding of the IA’s predicted outcomes.

• Whatever the IA’s knowledge representation, it must translate to formats that humans understand to help foster a common language between the IA and its operator.

• 3 gradations of language processing: command processing, controlled language processing, and open-ended NLP.
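A minimal sketch of how those three gradations might be layered in practice – try rigid command processing first, then a constrained controlled-language grammar, and only then fall back to open-ended NLP. The command set, grammar and fallback stub are invented.

```python
# Illustrative sketch of the three gradations of language processing named above:
# rigid command processing, a constrained "controlled language", and a fallback
# to open-ended NLP. The grammar and the fallback hook are invented for illustration.
import re

COMMANDS = {"STATUS": "report system status", "ABORT": "abort current task"}

# Controlled language: a small, fixed sentence pattern the IA is guaranteed to parse.
CONTROLLED = re.compile(r"^(track|ignore) contact (\w+)$", re.IGNORECASE)

def open_ended_nlp(utterance: str) -> str:
    # Stand-in for a full NLP pipeline (intent classification, entity extraction, ...).
    return f"[open NLP] best-effort interpretation of: {utterance!r}"

def interpret(utterance: str) -> str:
    text = utterance.strip()
    if text.upper() in COMMANDS:                      # 1) command processing
        return f"[command] {COMMANDS[text.upper()]}"
    match = CONTROLLED.match(text)
    if match:                                         # 2) controlled language processing
        return f"[controlled] {match.group(1).lower()} contact {match.group(2)}"
    return open_ended_nlp(text)                       # 3) open-ended NLP

for u in ["STATUS", "track contact B7", "what happens if the strike package is late?"]:
    print(interpret(u))
```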

Decisions

• We have not been making all the wrong decisions. AI just presents us with available options based on more data and with less instinct than we are accustomed to.

• Assessing situations without perfect information is not just a human trait but a daily necessity and something we may not realize is constantly happening. From negotiations to ranking new business opportunities, we make informed decisions towards our desired outcomes. Moving the needle upward involves having more relevant details from more relevant sources.

Discussion of COP

• Crew- and user-specific task-related information, rather than commonality.

• Implications: The system needs to understand each user’s task context and what data would be relevant to those tasks, and to present the relevant data elements in a way that makes sense for the task(s).

• Data for decision support tools and user views of networks and effects, rather than a single (picture) portrayal of platforms.

• Implications: We need to agree on open standards for data models, and protocols for exchanging that data, not just a standard set of icons.

• Prioritising near over distant, and aggregating distant things, rather than treating all information the same way. This is in accordance with Tobler’s law: ‘everything is related to everything else, but near things are more related than distant things’. In geospatial terms, these distances are in metres; in network terms, they are in milliseconds.

• Implications: We need a system architecture that provides decentralised fusion and aggregation, to support decentralised execution, and to manage throughput in a congested or contested network connection when things are getting emotional. This requires adaptive rules for filtering; policy for when and how aggregation can support security outcomes for “special” capabilities; and a detailed understanding of how the filtering and aggregation affect decision support tools (a minimal sketch of such filtering and exception alerting follows after this list).

• Alerting of exceptions, rather than providing ‘as expected’.

• Implications: System definition of what is exceptional given the operational context, and the level of alerting that is appropriate given the user needs, network capability and the other alerts.
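As flagged in the implications above, a minimal sketch of Tobler-style prioritisation (full detail for near things, aggregation for distant ones, with “distance” in metres or milliseconds) combined with alert-on-exception. All thresholds and field names are invented.

```python
# Illustrative sketch of two COP implications above: prioritise/aggregate by
# "distance" (metres geospatially, milliseconds on the network, per Tobler's law)
# and alert on exceptions rather than pushing "as expected" updates.
# All thresholds and field names are invented for illustration.
from dataclasses import dataclass

@dataclass
class TrackUpdate:
    track_id: str
    distance_m: float       # geospatial distance from this node
    link_latency_ms: float  # network "distance" to the source
    as_expected: bool       # true if the update matches the predicted state

NEAR_M = 20_000
NEAR_MS = 50

def prioritise(updates: list[TrackUpdate]) -> tuple[list[TrackUpdate], int]:
    """Keep near things in full detail; aggregate distant things to a count."""
    near = [u for u in updates if u.distance_m <= NEAR_M and u.link_latency_ms <= NEAR_MS]
    aggregated_distant = len(updates) - len(near)
    return near, aggregated_distant

def exceptions(updates: list[TrackUpdate]) -> list[TrackUpdate]:
    """Alert only on what deviates from the expected picture."""
    return [u for u in updates if not u.as_expected]

updates = [
    TrackUpdate("A1", 5_000, 20, as_expected=True),
    TrackUpdate("B2", 150_000, 20, as_expected=False),
    TrackUpdate("C3", 8_000, 400, as_expected=True),
]
near, rolled_up = prioritise(updates)
print([u.track_id for u in near], f"+ {rolled_up} distant tracks aggregated")
print("alerts:", [u.track_id for u in exceptions(updates)])
```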

Interactions

• On the Rise: Human Augmentation, Ambient Experiences, Spatial Computing, Gait Analysis, Volumetric Displays, Brain-Computer Interface, Electrovibration, VPA-Enabled Wireless Speakers

• At the Peak: Smart Robots, Transparent Displays, Gesture Control Devices, Emotion Detection/Recognition, Virtual Assistants, Machine Learning, Muscle-Computer Interface, NLP, Intelligent Apps

• Sliding Into the Trough: Mixed Reality, Gaze Control, Speech-to-Speech Translation, Flexible Display, Smart Fabrics, Augmented Reality, Head-Mounted Displays, Sensor Fusion

• Climbing the Slope: Face Recognition, Virtual Reality, Biometric Authentication Methods, Ambient Displays

• Entering the Plateau: Speech Recognition, Handwriting Recognition

Trust

• Data scientists want to build models with high accuracy, but they also want to understand the workings of the model so that they can communicate their findings to their target audience.

• Users want to know why a model gives a certain prediction. They want to know how they will be affected by those decisions and whether they are being treated fairly. They need to have sufficient information to decide whether to opt out.

• Regulators and lawmakers want to protect people and make sure the decisions made by models are fair and transparent.
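One hedged illustration of the transparency point above: with a simple linear scoring model, every prediction decomposes exactly into per-feature contributions that can be shown to the operator alongside the score. The feature names and weights below are invented.

```python
# Illustrative sketch for the interpretability points above: with a linear model,
# each prediction can be decomposed exactly into per-feature contributions that
# can be shown to the operator. Feature names and weights are invented.
feature_names = ["closing_speed", "emitter_match", "altitude_band"]
weights = [0.9, 2.1, -0.4]
bias = -1.5

def score_and_explain(features):
    contributions = {name: w * x for name, w, x in zip(feature_names, weights, features)}
    score = bias + sum(contributions.values())
    return score, contributions

score, why = score_and_explain([1.2, 1.0, 0.5])
print(f"threat score = {score:.2f}")
for name, c in sorted(why.items(), key=lambda kv: -abs(kv[1])):
    print(f"  {name:15s} contributed {c:+.2f}")
```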

• Department of Defense (DOD) Directive 3000.09 (2012) mandates safeguards for autonomous weapons, as shown in the following:

• “Semi-autonomous weapon systems that are onboard or integrated with unmanned platforms must be designed such that, in the event of degraded or lost communications, the system does not autonomously select and engage individual targets or specific target groups that have not been previously selected by an authorized human operator.”

• “The system design . . . addresses and minimizes the probability or consequences of failures that could lead to unintended engagements or to loss of control of the system.”

• “In order for operators to make informed and appropriate decisions in engaging targets, the interface between people and machines for autonomous and semi-autonomous weapon systems shall: (a) be readily understandable to trained operators, (b) provide traceable feedback on system status, and (c) provide clear procedures for trained operators to activate and deactivate system functions.”

AI & ML Terminology

[Slide graphic: the mythological centaur alongside the AI “centaur” – close the orient & decision loop, outpace adversary decision speed]

Terms

• The term "artificial intelligence" is applied when a machine mimics "cognitive" functions that humans associate with other human minds, such as "learning" and "problem solving"

• Goals of AI research – perception, knowledge, reasoning, planning, learning, natural language processing, and the ability to move and manipulate objects

• Capabilities generally classified as AI include understanding human speech, images and videos, autonomous cars, competing at a high level in strategic game systems (e.g., Dota 2 and Go), military simulations, and interpreting complex data.