The new system, called Aided Threat Recognition from Mobile Cooperative and Autonomous Sensors (ATR-MCAS), will scan and classify imagery from sensors that can be mounted on vehicles, aerial platforms and autonomous vehicles, helping soldiers recognize incoming threats.
It is a tool that Lt. Col. Chris Lowrance, head of autonomous systems with the Army’s AI Task Force, said will act as a “teammate” and reduce “cognitive load” by alerting soldiers to incoming threats.
Soldiers in vehicles or holding mobile devices will be able to customize the feed of data that the ATR-MCAS will show and alert them to, Lowrance said. For example, a soldier driving a tank could set a laptop to display images of enemy tanks only when the computer-vision system detects them. Alternatively, a soldier could watch a livestream of the raw video data the AI system is analyzing, with highlighted sections classifying what the cameras and other sensors are picking up.
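The kind of class-based alert filtering described above can be sketched in a few lines. This is an illustrative example only; the data structures and function names are hypothetical and are not drawn from the Army’s actual system.

```python
# Hypothetical sketch of class-based alert filtering; names and thresholds
# are illustrative assumptions, not the ATR-MCAS implementation.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # class assigned by the computer-vision model
    confidence: float  # model confidence score, 0.0 to 1.0

def filter_alerts(detections, watched_labels, min_confidence=0.5):
    """Return only the detections the user has asked to be alerted to."""
    return [d for d in detections
            if d.label in watched_labels and d.confidence >= min_confidence]

# Example: surface only enemy-tank detections on the display.
frame = [Detection("tank", 0.91), Detection("truck", 0.78), Detection("tank", 0.40)]
alerts = filter_alerts(frame, watched_labels={"tank"})
print([(d.label, d.confidence) for d in alerts])  # [('tank', 0.91)]
```

In this sketch, changing what a soldier is alerted to is just a matter of changing the `watched_labels` set, which mirrors the customizable feed Lowrance describes.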
“You are reducing the risk of soldiers having to get out in front of danger,” Lowrance told FedScoop.
Currently, the algorithms primarily perform image classification and other computer-vision tasks, but in the future, the Army hopes to add pattern recognition across images. Lowrance said the capabilities will “always be expanded over time.”
While pattern recognition is a goal, humans will retain full control over how they react to the information ATR-MCAS alerts them to. The aim is to provide a more detailed picture of the battlefield for a soldier and get them the most critical information, Lowrance said.
“In essence, the systems are all about being able to provide situational awareness in the form of these systems that are unmanned,” Lowrance said.
The Army inherently operates in noisy data environments, and with fast-moving objects, the AI system will need to classify targets quickly. To meet that need, the system will rely on edge computing, according to the Army.
ATR-MCAS is also designed with customizability in mind. If the Army needs to conduct a reconnaissance mission with fly-over camera feeds or radar data, Lowrance said the system can account for a change in sensor input data.
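One common way to accommodate a change in sensor input, as described above, is to put each sensor type behind a shared interface so the classification pipeline does not care where the data comes from. The sketch below assumes such a design; the class names and stub data are hypothetical and not the Army’s actual architecture.

```python
# Illustrative pluggable sensor-input interface; all names here are
# assumptions for the sketch, not the ATR-MCAS API.
from abc import ABC, abstractmethod

class SensorSource(ABC):
    """Common interface so the pipeline is indifferent to sensor type."""
    @abstractmethod
    def read_frame(self) -> dict:
        ...

class CameraSource(SensorSource):
    def read_frame(self) -> dict:
        return {"modality": "camera", "data": [[0] * 4] * 4}  # stub pixels

class RadarSource(SensorSource):
    def read_frame(self) -> dict:
        return {"modality": "radar", "data": [0.0, 0.0, 0.0]}  # stub returns

def classify(frame: dict) -> str:
    # Placeholder for the model; a real system would dispatch per modality.
    return f"classified {frame['modality']} frame"

# Swapping sensors, e.g. fly-over camera feeds for radar, is just
# swapping the source object:
for source in (CameraSource(), RadarSource()):
    print(classify(source.read_frame()))
```

Under this assumed design, adding a new sensor means writing one new `SensorSource` subclass, which matches the expandability Lowrance describes.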
“It can always be expanded over time,” Lowrance said of the type of input data the algorithms analyze.
The system is far from deployment in battlefield environments, however. Currently, the algorithms are being trained on test data being collected with help from academic partners. Researchers at Carnegie Mellon worked with the Army in mid-January to collect data to feed to machine-learning algorithms. Eventually, the Army will run the algorithms on data collected from the battlefield, Lowrance said.
The Army’s AI Task Force led the charge on this product but worked in collaboration with the Pentagon’s Joint Artificial Intelligence Center on “similar problem sets,” Lowrance said.
“We commend the Army AI Task Force for their groundbreaking work in advancing AI-enabled capabilities with ATR-MCAS,” said Lt. Cmdr. Arlo Abrahamson, a JAIC spokesperson.