The goal was a lofty one — within 15 years, 30 percent of all Army vehicles would not have a driver behind the wheel. The technology promised to replace fragile lives on the front lines with machines controlled by computers and without families to mourn their loss.
It was also the year 2000.
The goal, written into the fiscal 2001 National Defense Authorization Act, is hardly the only one the military has missed in its push for modernization. But as with many of those, a new NDAA means another chance. Last month the Army’s Chief of Staff Gen. James McConville told the Senate Armed Services Committee the service was “aggressively pursuing” unmanned ground vehicles that would be “teamed,” or linked, to human-controlled systems. It was the only mention of artificial intelligence in the socially distant “paper hearing,” a 60-page Q&A that replaced a traditional committee meeting for the fiscal 2021 NDAA.
It was also the clearest window into how the Army is thinking about its emerging technology initiatives. The general’s comments came in response to a question from Sen. David Perdue, R-Ga., whose state is home to the Maneuver Center of Excellence at Fort Benning.
“[O]vercoming advances our potential adversaries have made in their own squad-level robotics and AI development is crucial,” Perdue wrote in his question. He added current robotics testing in his home state could yield “long-term asymmetrical advantage for our warfighters.” Initial tests at Fort Benning and other military installations have been done with separate unmanned systems designed to assist manned units, and with automating some systems within a vehicle that still has a human driver.
Big returns for safety
The projected benefits of the technology are the same as they were at the turn of the century: mitigated threats to life and limb, reduced cognitive load in battle and an ease on logistics. In the two decades since that first goal was set, a key challenge has been overcome: artificial intelligence-enabled systems can now crunch incoming data with the alacrity needed to bring the Army’s long-held dream to life, Lt. Col. Chris Lowrance told FedScoop in an interview. It’s an accomplishment the Army seems intent on building upon.
Lowrance works as the automated systems lead at the Army’s AI Task Force, a collaboration with Carnegie Mellon University. There, he collaborates with other components of the Army to try to make safer battlefields for soldiers. Lowrance sees great potential in lightly manned and other automated systems in the near future.
“With the recent breakthroughs with AI, these systems are gaining enhanced capabilities,” he said.
There are other challenges besides quickly processing large amounts of data. Margarita Konaev, a research fellow at the Center for Security and Emerging Technology, cautioned the Army against thinking the “art of the possible” will become the art of battlefield reality any time soon.
“Things go wrong because the technical challenges are significant, especially for ground operations,” she said in an interview.
The urban environments the Army usually operates in, and the type Konaev studies, are full of landmines, both physical and metaphorical, for AI-enabled unmanned systems. The process of turning a computer into a brain-like thinking machine relies on running skyscrapers’ worth of training data through layers of neural nodes that self-adjust and “learn” from the input. Urban battlefields are rife with novel problems that even the best neural networks and sensor systems struggle to accommodate.
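The “self-adjusting” learning process described above can be illustrated with a toy example. This is a generic, hypothetical sketch of gradient-descent weight updates, not code from any Army or Carnegie Mellon system: a single weight repeatedly nudges itself toward whatever value best fits the labeled training data it is shown.

```python
# Toy illustration of how a neural node "learns": a weight self-adjusts
# to reduce its error on labeled training data. Here it learns y = 2x.
# Purely illustrative; real systems use millions of such weights.

def train(data, epochs=200, lr=0.05):
    w = 0.0  # the single adjustable weight
    for _ in range(epochs):
        for x, y in data:
            pred = w * x
            # gradient of squared error (pred - y)^2 with respect to w
            w -= lr * 2 * (pred - y) * x
    return w

data = [(1, 2), (2, 4), (3, 6)]  # labeled examples of the pattern y = 2x
w = train(data)
print(round(w, 2))  # converges near 2.0
```

The catch Konaev points to is visible even here: the weight only fits the patterns present in its training data, so inputs unlike anything it has seen, common in cluttered urban settings, can produce unreliable output.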
It is a problem that Lowrance acknowledges is “not solved.” He said that with more funding he and his team can collect more data and eventually develop synthetic data to better train systems.
“Curating and labeling this data is a very expensive process,” he said.
Reducing burden, but for whom?
Konaev also cautioned against assuming the benefits of adding unmanned or “lightly manned” systems are all they are cracked up to be. Beyond the safety of individual soldiers, the systems are billed as reducing the cognitive load of commanders. Konaev said it’s more of a “shift” than a pure reduction.
“On the one hand you reduce the cognitive burden of the commander because he no longer needs to supervise every truck,” Konaev said. “But you essentially redistribute the cognitive burden down the chain of command.”
Many manned/unmanned teamed systems are much more akin to cyborgs that still need human overlords to function than to fully automated machines with no operators on or in the loop. A recent RAND study on the subject found the Army is still years away from such systems, but on track to develop them.
“With current technology, human ‘operators’ are still required to monitor the driving environment and regain control when the autonomous systems are unable to handle the situation,” Shawn McKay, the lead author of the RAND study, said in a release.
The current technology also relies on a network of newly modernized back-office systems. For his part, retired Army Col. John Kuenzli, a former senior logistics expert at Army Materiel Command who is now at IBM, said the push to bring AI into manned/unmanned teamed systems is the right direction for the Army, but that the service should also think more broadly. The goal of reducing risk to soldiers, he said, can continue to drive back-office modernization and the widespread adoption of AI across systems.
“I think the Army is really moving in the right direction with its emphasis on cloud migration,” Kuenzli said. “It is starting to understand the conditions to make that paradigm right.”