
War, Adaptation and AI
Algorithmic Support to Military Adaptation in Peace and War - a discussion about a new paper that I am presenting at a Canberra seminar this week
I am tempted to declare that whatever doctrine the Armed Forces are working on now, they have got it wrong…it does not matter that they have got it wrong. What matters is their capacity to get it right quickly when the moment arrives. Sir Michael Howard, 1974
This week, I am spending some time in Canberra, Australia at a workshop that is exploring Artificial Intelligence (AI) and its application in decision-making about the use of force in warfare. I have been fortunate to participate in this Australian National University-Harvard Belfer Center collaboration over the past couple of years, and in similar workshops before that.
I have done this because I want to better understand AI and its potential impacts on how humans are informed, and how they make military and strategic decisions, before, during and after wars.
I explored this issue in my book, War Transformed, and then also wrote about it in the follow-up published last year, White Sun War: The Campaign for Taiwan. Even before these were published, I had been dipping my toe into this debate for some time. In 2019, I wrote a piece for the Australian Journal of Defence and Strategic Studies that explored how AI might be applied. In that piece, I described how:
The relentless speed of change and the complexity of the strategic environment militaries will increasingly be required to operate in defies human capacity to adapt… Given the enormous complexity of this problem, enhancing biological sources of the intellectual edge with silicon-based intelligence—AI—appears to offer one pathway to an enhanced advantage for nations in the 21st century as it brings together the macro-sources of technology and intellectual advantage.
Other explorations I have conducted into the interaction of AI and warfare over the past few years have included how AI will impact professional military education, reading lists for military personnel on AI, as well as a short series of fictional stories and an illustrated story on how humans might eventually be intellectually augmented with AI.
It is probably fair to state that military institutions are still at a very immature stage when it comes to the absorption of AI. While a multitude of trials with different kinds of AI are taking place, from weapon systems to battle command and control to logistics and personnel management, much remains to be done to wrangle the necessary data, integrate AI into military processes and, importantly, build trust in the application of AI among military personnel.
The war in Ukraine has provided an intensive experimental environment for AI. A variety of functions have been augmented, improved or invented through the use of AI since the beginning of the war. A recent story by The Economist explored many different applications of AI, from aiding Ukrainian prosecutors to find those responsible for war crimes through to better targeting processes. Time Magazine has explored how large western companies have used Ukraine as a laboratory to test their products while also assisting the Ukrainian war effort.
The Center for a New American Security and the International Institute for Strategic Studies have both produced reports with useful insights that examine improvements to Ukrainian targeting processes, and other battlefield awareness functions, through the application of AI. The Global Governance Institute has examined how Ukraine uses AI to combat Russian regional and global propaganda activities. The Center for Strategic and International Studies has explored the use of AI in military wargaming, as has the RAND Corporation. There are many other articles and reports that have been published on this subject.
Against this background, I am presenting a paper in Canberra this week on how AI might be applied to an important individual and institutional function in war: learning and adaptation.
Learning, Adaptation and AI
Learning and adaptation is one of the ways that good military commanders, and smart military institutions, seek to reduce uncertainty and the potential for tactical and strategic surprise. But, as the history of military affairs demonstrates, not every military organisation possesses the learning cultures necessary to recognise the need for change and then conduct disciplined, multi-level adaptation to improve their effectiveness. This requires many different leadership, training, educational, technological and cultural components that are practiced in peace, and provide a foundation for an institution to be reflexively adaptive when war occurs.
Every wartime decision can and should be informed by previous decisions, and thus, can be improved through effective adaptive cultures. This might be improved further through AI decision-support tools.
Key wartime decisions that might be impacted by AI-enabled adaptation processes include decisions on the ethical use of force, balancing tactical and strategic forces, achieving an optimal force structure of crewed and uncrewed systems, prioritising munitions, equipment and personnel as well as a wide gamut of training and education initiatives. Depending on the level of data available, AI might also be employed to provide better risk calculations and estimates of casualties to improve decision-making about risk-benefit in the tactical, operational, strategic and even political arenas.
But learning and adaptation is also important in military institutions outside of wartime.
War is normally only a small proportion of the life of any military institution. Many military personnel spend their entire careers in military services without conducting a wartime deployment. Of greater pertinence, the processes, technologies, leadership philosophies and cultures incentivised in an organisation between wars provide the foundations for military effectiveness and adaptation during wars. The military institution that a nation begins a war with is almost never the military institution that it wins with.
The wars in Ukraine and Gaza have provided examples of how humans have recognised problems at the tactical, strategic and political levels, and then produced solutions that are aimed at enhancing their chances of success. Uncrewed systems have been at the forefront of many examinations of adaptation in these wars, but there are other examples.
The Ukrainian capacity to mesh civil and military sensor networks and analytical capacity on the battlefield, in the air defence environment and in other national security endeavours is another example of institutional adaptation. Russian improvements in electronic warfare capacity to degrade the performance of western precision munitions and minimise the impact of Ukrainian drones on the frontline is another example.
At the same time, however, this process can be flawed, subverted or just plain inept. The sharing of battlefield lessons is often insufficiently automated, and sometimes compromised by time pressures or individual leadership shortfalls. The strategic collation and assessment of battlefield lessons, and their translation into new capability, can be hindered by a lack of time or focus, institutional cultures, computational resource shortfalls or the absence of established learning processes. Much of the analysis that is relevant to military adaptation continues to be a laborious human endeavour.
In the Russian system, the collection and sharing of lessons is hampered by a fear of reporting failure and a culture of centralised command. This was examined in a Royal United Services Institute report on preliminary lessons from the war in November 2022. The report described how the ‘reporting culture’ of the Russian Army was deficient because it “does not encourage honest reporting of failures.” Anyone who is perceived to have failed is normally replaced or punished.
The adaptive capacity of a military institution and its ability to adapt its battlefield tactics as well as its strategic warfighting functions is critical to success in war. But the pace of war, and the speed of change in the geostrategic environment beyond warfare, means that contemporary approaches to learning, adaptation and decision-making must be improved.
Many military institutions have realised the theoretical value of AI in improving decisions and the overall effectiveness of their organisations. Some are yet to fully realise this value, although countries such as the U.S., Britain, Canada, and Australia all possess AI strategies which they are in various stages of implementing.
The most recent additions to this panoply of strategies are the 2024 NATO AI Strategy, which was released during the Washington Summit on 10 July 2024, and the U.S. Marine Corps AI Strategy, which was released on the same day.
Adaptation and AI: Building a Meshed Human-AI Adaptive Capacity
My paper this week proposes an evolved concept for multi-level, individual and institutional military adaptation, through the fusion of new learning processes and AI to speed up and enhance the quality of military adaptation and strategic decision-making. This transformation of the learning cultures and processes in military institutions has very little to do with technology, however.
The larger and most important role is played by humans. The success of enhanced adaptation through AI support will be almost entirely driven by human decision-making, processes and culture. For institutions to apply AI to improve the quality and speed of their learning and adaptation related to decision-making, I believe there are five key areas that require reform:
1. Set (and evolve) measures of effectiveness. If AI-enabled adaptive capacity is to work effectively, measures of military effectiveness are needed to guide the direction adaptation should take. These need to be developed at the tactical, operational and strategic levels to guide the development and implementation of AI-enabled adaptation.
2. Know where adaptation-relevant data is found, stored and shared. An enhanced adaptive stance in military institutions must have enhanced data awareness as a foundation. Institutional measures will be an important element, but so too will data discipline in tactical units and by individuals. As such, data awareness and management will need to become one of the basic disciplines taught to military personnel.
3. Explicitly embrace adaptation. Senior institutional leaders must nurture people and formations that are actively learning and capable of changing where it is safe and effective to do so. This culture must begin with clear statements about the leadership environment, and its tolerance for risk and new ideas.
4. Scale AI support from individual to institution. There is unlikely to be a one-size-fits-all algorithm or process that can enhance learning and adaptation at every level of military endeavour. A virtual arms room of adaptation support algorithms will be necessary in an institution-wide approach to adaptation.
5. Military process and doctrine stripping and reform. The observation and absorption of lessons needs to be part of normal military interaction rather than a separate and parallel ecosystem that often has difficulty inserting itself into strategic decision making. Tactical learning must connect to and inform strategic learning. Human processes and committees must evolve to improve this interaction.
Better Military Adaptation in Peace and War Through AI
An important virtue in contemporary and future military institutions is the ability to develop adaptive processes in peacetime that can then be applied at multiple levels during war. Succeeding in the adaptation battle is founded on robust institutional learning, and is a central military function in war and peace.
Creativity, innovation, a tolerance of failure, and the capability to quickly learn from failure are important ingredients in any effective adaptive stance in 21st century military institutions. This can be improved with AI-enabled adaptation. But it will also demand investment and experimentation in an array of different algorithms, processing and communications technologies as well as cultural evolution to improve adaptation that informs military decision making. Experimentation and testing will be necessary, and this is likely to be an ongoing requirement in any organisation that has a learning culture.
Before concluding, one final issue should be re-emphasised. The transformation of the learning cultures and adaptive processes required in contemporary military institutions will often have little to do with technology. The many technologies involved, including data management, communications technologies and AI, comprise an important capability.
But the larger and most important role in improving learning and adaptation through AI is actually that of humans. The success or otherwise of enhanced adaptation through AI support will be almost entirely driven by human decision-making, processes and culture. As such, my paper is heavily biased towards the human aspects of an AI-enabled approach to adaptation. I look forward to being able to share the full paper on AI-enabled adaptation with you in due course.
I've been studying AI/machine learning as it relates to the Ukraine war and how the US is learning from the war. Ukraine is a test bed for a lot. A cautionary tale for some. But it's really a gold mine for data, data, data. One of the big secrets I can't even get a line on is how to keep the data sanitary, and how to keep it labeled accurately. You can have the most advanced training model in the world, but if your training data is bad, your outcomes will be bad. Garbage in, garbage out. Odd... how old is that tech maxim?
I've seen a few companies labeling data. But nothing on the scale I would assume is needed.
A prediction... soon the way you will thwart your enemy's AI is to poison their data.
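The "garbage in, garbage out" and data-poisoning points above can be illustrated with a toy sketch. This is not any fielded system: the data, the labels ("decoy", "vehicle") and the trivial nearest-centroid classifier are all invented for illustration. The same code, trained once on clean labels and once on a training set where an adversary has flipped some labels, gives different answers for the same input.

```python
# Toy illustration (hypothetical data and labels) of label poisoning:
# a nearest-centroid classifier trained on clean vs. poisoned labels.

def centroid(points):
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

def train(samples):
    """samples: list of ((x, y), label). Returns one centroid per label."""
    by_label = {}
    for point, label in samples:
        by_label.setdefault(label, []).append(point)
    return {label: centroid(pts) for label, pts in by_label.items()}

def predict(model, point):
    # Assign the label of the nearest class centroid (squared distance).
    def dist2(label):
        cx, cy = model[label]
        return (point[0] - cx) ** 2 + (point[1] - cy) ** 2
    return min(model, key=dist2)

# Two well-separated clusters: "decoy" near (0, 0), "vehicle" near (10, 10).
clean = [((0, 0), "decoy"), ((1, 0), "decoy"), ((0, 1), "decoy"),
         ((10, 10), "vehicle"), ((11, 10), "vehicle"), ((10, 11), "vehicle")]

# An adversary flips the labels on part of the training set.
poisoned = ([(p, "vehicle" if lbl == "decoy" else "decoy")
             for p, lbl in clean[:4]] + clean[4:])

print(predict(train(clean), (0.5, 0.5)))     # -> decoy
print(predict(train(poisoned), (0.5, 0.5)))  # -> vehicle
```

The model trained on clean data correctly calls a point near the origin a decoy; the one trained on poisoned labels calls the same point a vehicle. Real poisoning attacks are subtler, but the mechanism, corrupting the training data rather than the model, is the same.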