AI and Attack Radios Teach DARPA Unexpected Lessons
Researchers have learned some surprising lessons from technologies developed under the Defense Department’s Squad X program, which ends this year. For example, artificial intelligence may not help the military make faster decisions, but it does provide a planning advantage over adversaries. And when it comes to detecting and electronically attacking enemy signals, systems can make intelligent decisions without artificial intelligence.
When first conceived in 2016, the Defense Advanced Research Projects Agency’s (DARPA) Squad X program was expected to explore four technology areas – precision engagement, non-kinetic engagement, squad sensing and squad autonomy – on behalf of dismounted soldiers and Marines in squad formations. Since then, the program has evolved to focus on small units at multiple echelons, such as squads, platoons and special operations teams.
The program has developed two technologies: Lockheed Martin’s Augmented Spectral Situational Awareness and Unaided Localization for Transformative Squads (ASSAULTS) system and CACI’s family of radios known as the BITS Electronic Attack Module (BEAM). The first uses autonomous robots with sensor systems to detect enemy locations, allowing small units to engage enemy forces without first being detected. It has since evolved into a test bed for evaluating artificial intelligence (AI) technologies.
“What Lockheed has done is an operating system that allows you to experiment and to plug and play different components. I can’t tell you how difficult this is,” said Philip Root, DARPA’s Squad X program manager. Adding and removing AI components is even more difficult, he added, because they can contradict one another. “They could fight each other, these AI behaviors.”
BEAM technology detects, locates and attacks specific threats in the radio frequency and cyber domains, including countering small unmanned aerial systems. Although the system is not AI-enabled, Root notes that its processing capabilities make it quite smart. The radios communicate with each other to “find the best formation, to get the most information about the enemy,” he said.
Both infantry and special operations units appear to see the system as a member of the team rather than a tool, he said. “They will give the BEAM system the mission, and it will modify its behavior depending on the threats it has seen, where it has been in the mission and where it has seen high-value targets,” he said. “So, technically, it wasn’t AI. It doesn’t use the machine learning and deep learning required for that technical term, but there are many aspects that reflect a form of intelligence.”
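The behavior Root describes – mission-driven adaptation without machine learning – can be pictured with a small rule-based sketch. The Python below is a hypothetical illustration, not the actual BEAM software; the class, field names, thresholds and rules are invented for the example.

```python
# Hypothetical sketch only - not the actual BEAM software. It shows how a system
# can adapt its behavior to a mission and to observed threats using plain rules
# rather than machine learning. All names, thresholds and rules are invented.

from dataclasses import dataclass, field

@dataclass
class Detection:
    signal_type: str      # e.g., "small_uas", "push_to_talk_radio", "wifi"
    bandwidth_mhz: float  # measured signal bandwidth
    bearing_deg: float    # direction of arrival

@dataclass
class ElectronicAttackNode:
    mission: str                                   # e.g., "protect_patrol"
    high_value_sightings: list = field(default_factory=list)

    def observe(self, det: Detection) -> str:
        """Return the next action, chosen by simple if/then rules."""
        if det.signal_type == "small_uas":
            # Remember where high-value threats were seen during the mission.
            self.high_value_sightings.append(det)
            # A wideband threat is handed to the cluster for joint geolocation.
            return "request_cluster_geolocation" if det.bandwidth_mhz > 20 else "jam"
        if det.signal_type == "push_to_talk_radio" and self.mission == "protect_patrol":
            return "monitor_and_report"
        return "ignore"

node = ElectronicAttackNode(mission="protect_patrol")
print(node.observe(Detection("small_uas", bandwidth_mhz=40.0, bearing_deg=212.0)))
# -> request_cluster_geolocation
```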
The Lockheed ASSAULTS system taught researchers some valuable lessons. One of the first is that warfighters have a lot to teach AI. For example, researchers could draw on the wisdom and experience of commanders to teach and train artificial intelligence and robotic systems.
“I didn’t see this coming, the idea that we could learn from squad leaders, company commanders, battalion commanders about tactics and then use that to inform drone and robot behaviors. That is a completely different technical direction, and one we are now starting to explore,” Root says.
He added that the experience with Lockheed Martin taught him that the military may not want to try to build artificial intelligence systems that are better than humans. “That doesn’t mean we shouldn’t try to develop good AI. It simply means that instead of trying to replace the wisdom and experience of a small unit commander, we should try to create an AI that supports the wisdom and experience of the most junior Marine on the team – and sometimes that junior Marine is a robot.”
Researchers have also learned that artificial intelligence systems do not necessarily allow the military to make decisions faster, but they can help it plan more effectively. Root says his team gathered data to test the hypothesis that AI-equipped friendly forces, known as blue forces, would make decisions faster than enemy, or red, forces.
“That turned out not to be the case. What we found was that blue could plan in depth with several decision points and courses of action, and red could not,” he explains. “Red decided really fast, but out of necessity. They reacted. Blue was able to have excellent situational awareness and then act with precision and real initiative to completely change the environment and dominate its local battlefield.”
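The planning-versus-reacting distinction Root draws can be made concrete with a toy sketch. The following Python example is purely illustrative and is not drawn from the Squad X experiments: the actions, scoring function and planning depth are invented to contrast enumerating complete courses of action several decision points deep with choosing only the best immediate move.

```python
# Toy illustration only - invented actions and scoring, not Squad X data.
import itertools

ACTIONS = ("advance", "hold", "flank")

def score(course):
    """Invented payoff for a sequence of actions; rewards setting up a flank."""
    total = 0
    for prev, cur in zip(course, course[1:]):
        if (prev, cur) == ("hold", "flank"):
            total += 2          # a hold that sets up a flank pays off later
        elif cur == "advance":
            total += 1
    return total

def plan_in_depth(decision_points=3):
    """'Blue' style: evaluate complete courses of action several decisions deep."""
    return max(itertools.product(ACTIONS, repeat=decision_points), key=score)

def react():
    """'Red' style: pick only the single action that looks best right now."""
    return max(ACTIONS, key=lambda a: score((a,)))

print("planned course of action:", plan_in_depth())   # ('hold', 'flank', 'advance')
print("reactive choice:", react())                     # 'advance'
```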
Another unexpected lesson involves the process of collecting training data for AI systems. Companies usually train systems using their own data or publicly available information. Root concluded that AI systems for military use should instead be trained on military data. “Lockheed Martin helped me understand that we need to consider different approaches to data curation, data management and AI certification. When we collect this data, it should probably be owned by the department and made available to industry for training their systems,” he suggests.
The Defense Department, he points out, collects a huge amount of information in experiments, training and operational environments. That information is better suited to military AI and robotic systems than the data industry can easily access. Root says his team collected terabytes of data in its latest experiment alone. “If we have the data, every time we do an experiment or a training exercise, we will collect more data, manage it and consider using it as additional training data. What that gets you is an AI that a unit uses – whether it’s a squad, platoon, company or battalion … – and that learns and matures as the unit continues to train with it.”
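One lightweight way to picture the data curation and management Root calls for is a government-owned catalog that tags each collection with the event, echelon and environment it came from, so relevant slices can later be released to industry for training. The sketch below is a hypothetical example; the record fields and sample values are assumptions, not an actual Defense Department schema.

```python
# Hypothetical sketch of a government-owned training-data catalog. The schema,
# field names and sample values are invented for illustration.

import json
from dataclasses import dataclass, asdict

@dataclass
class CollectionRecord:
    event: str          # e.g., an experiment or training exercise
    echelon: str        # "squad", "platoon", "company" or "battalion"
    sensor: str         # which platform or sensor produced the data
    environment: str    # where it was collected
    size_gb: float
    owner: str = "government"   # the article's point: the department retains ownership

catalog = [
    CollectionRecord("squad_experiment", "squad", "uas_camera", "desert_range", 120.0),
    CollectionRecord("platoon_exercise", "platoon", "rf_sensor", "urban_range", 45.0),
]

# A vendor request would pull only the government-owned slices relevant to the
# system being trained, rather than recreating the collection from scratch.
squad_imagery = [asdict(r) for r in catalog
                 if r.echelon == "squad" and r.sensor.endswith("camera")]
print(json.dumps(squad_imagery, indent=2))
```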
As a result, the department may need to rethink its process for testing and evaluating AI technologies. “We may need data certification and unit certification where the unit and the AI are certified at the same time, meaning the unit is capable and effective at using the AI, and the AI provides valid feedback,” Root suggests.
Former Secretary of Defense James Mattis, Root recalls, was eager to support close-combat units and set up the Close Combat Lethality Task Force. Mattis’s mantra was that a soldier or Marine should experience 20 firefights through realistic training before engaging in actual combat.
The Squad X team experimented with battalions planning and executing their missions in simulation before passing mission orders down to squads. Those squads then planned and carried out their own missions in simulation before a live training event.
“If I have battalions, companies, platoons and squads that can all be trained with the same mission-type orders and then carry out those missions in simulation, it certainly provides some value. But then, with the opportunity to get onto the training range and do that same mission live, we start to see the ability to train much faster and more comprehensively and to have AIs that are multi-tiered,” Root says.
He cites a Squad X experiment using the department’s Test Resource Management Center that involved putting AI-equipped drones into different environments with Marines wearing different types of camouflage uniforms. In some cases the uniforms worked well and the Marines blended into the background.
But Root’s point is that the service collected a great deal of relevant data in the process and can use it for AI training. “We collected gigabytes of data. It would be really expensive for industry to recreate this every time. You would need access to Marine uniforms. You would need access to the same environments. And there would be a lot of duplication if every vendor tried to do the same thing,” he explains. “We have the most important data on government training ranges.”
The BEAM family of radios also demonstrates innovation. CACI documentation says the BEAM system senses the environment to allow deployed units to counter small drones; cellular, digital or analog radio transmissions; data links; wireless fidelity signals; and digital or analog video signals. BEAM scales by working in clusters, and it can also work autonomously to deliver distributed attacks and provide a fast, responsive force protection capability in hostile environments.
“There are some signals, such as those from threat unmanned aerial systems, whose bandwidth is too wide for a single node to receive and monitor,” Root explains. “So CACI developed the ability – part of their unique capability – to connect these individual software-defined radios together to monitor these wide signals and then perform a geolocation calculation so that it can triangulate those signals.”
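The geolocation step Root describes – networked software-defined radios jointly fixing an emitter that no single node can characterize on its own – can be sketched with a basic bearing-intersection calculation. The code below is a simplified illustration, not CACI’s algorithm: it assumes flat x/y coordinates, noiseless bearings and only two receivers.

```python
# Simplified triangulation sketch - not CACI's algorithm. Two networked receivers
# each report a bearing to the same emitter, and the intersection of the two
# bearing lines gives a position fix. Flat x/y coordinates and noiseless
# measurements are simplifying assumptions.

import math

def triangulate(p1, bearing1_deg, p2, bearing2_deg):
    """Intersect two bearing lines; bearings are measured clockwise from north."""
    d1 = (math.sin(math.radians(bearing1_deg)), math.cos(math.radians(bearing1_deg)))
    d2 = (math.sin(math.radians(bearing2_deg)), math.cos(math.radians(bearing2_deg)))
    # Solve p1 + t*d1 = p2 + s*d2 for t with a 2x2 determinant (Cramer's rule).
    denom = d1[0] * (-d2[1]) - d1[1] * (-d2[0])
    if abs(denom) < 1e-9:
        raise ValueError("bearings are parallel; no unique fix")
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t = (rx * (-d2[1]) - ry * (-d2[0])) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Two receivers 1,000 m apart both detect the emitter; the fix lands between them.
print(triangulate((0.0, 0.0), 45.0, (1000.0, 0.0), 315.0))  # approx. (500.0, 500.0)
```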
The system was originally designed to be small and light enough to carry in a backpack, but CACI has since developed a larger version for ground vehicles and another for AeroVironment’s hand-launched drone known as the Puma. The radios’ flexibility allows them to protect ships, the smaller boats carrying Marines from ship to shore, landing forces and fixed locations. The 31st Marine Expeditionary Unit in Okinawa, Japan, uses the system, and special operations units have also experimented with the technology.
The system also has been deployed to combat zones. “One of the reasons we know this works is that we sent this equipment to Afghanistan and Iraq, and it had great effect. Obviously I can’t say much about it, but it got two thumbs up from the customers we’ve worked with and supported,” Root said.
Root describes the BEAM technology as very mature and says it could be adopted by the military services or by other departments and agencies, such as border patrol units. Although there is currently no planned transition for either the Lockheed Martin or the CACI technology, he is discussing both with multiple parties.
The Lockheed Martin system will not see combat use anytime soon, Root said. Instead, it will be used to experiment with AI. “It allows us to do amazing experiments, and it really helps us understand what we need. It allows us to collect a lot of data, and in the world of AI, data is the most important thing,” Root says. “The nation that collects the most tactical data has the greatest advantage, and Lockheed’s solution undoubtedly allows us to collect more data than any other experimental system I’ve seen.”
Root described the program’s upcoming end as bittersweet, noting that others will judge the value of DARPA’s work. “I can’t decide if we’ve done enough. Some future Marine will decide if we’ve done enough.”