AI on the prize: competitions encourage artificial intelligence applications for naval use

The 2021 Advanced Naval Technology Exercise provided participants from government, industry and academia with a low-risk, collaborative environment that takes advantage of the naval research and development establishment’s unique laboratories and ranges, while practicing operators and planners simultaneously explore advanced tactics and assess the operational suitability of emerging technologies. US NAVY / Joe Bullinger

Recent Department of Defense initiatives to foster rapid innovation and modernization have involved holding competitions between existing and potential defense contractors to develop technology for military use. Competitions allow small businesses to present and demonstrate their ideas in realistic scenarios and can eventually lead to production contracts.

Artificial intelligence and machine learning are receiving intensive attention from the Department of Defense because of the growing challenge of managing overwhelming amounts of data and making timely decisions. The challenge is made even more acute by sophisticated peer competition with China and Russia. The current politico-military situations in Ukraine, the Taiwan Strait and the South China Sea are scenarios in which AI/ML can potentially help decision makers. AI has applications from the national level down to the tactical level. The technology can help not only with situational awareness, but also with forecasting: indications and warnings.

In a recent competition sponsored by the Department of Defense, a small company named BigBear.ai, headquartered in Columbia, Maryland, won the prize. The 12-team competition was the Project Overmatch artificial intelligence Advanced Naval Technology Exercise, or ANTX.

Brian Frutchey, chief technology officer of BigBear.ai, said the company adapted for the Navy a program developed for another part of the US military, which “dealt with hybrid warfare, gray-zone conflict in Eastern Europe, and wanted to find tools that could automate the understanding of all the different data they need to look at for the hybrid warfare environment.”

It was no longer just a matter of “counting planes, tanks and soldiers,” Frutchey said. “They need to look at economies, political relationships, migration of people, cyber activity, all these new areas that the strategic analyst needs to be aware of and model in their anticipatory intelligence.”

BigBear.ai built the Virtual Anticipation Network (VANE) – the weather vane – to indicate where the winds of war are blowing. Now BigBear.ai’s flagship product, VANE, funded by the Deputy Assistant Secretary of Defense for Special Operations and Low-Intensity Conflict, was the tool the company used to win the $75,000 ANTX prize. BigBear.ai was one of 12 companies selected to enter the competition, which took place in the last half of 2021.

Frutchey said that in ANTX, VANE ingested telemetry for ships and maritime aircraft, as well as weather and the information environment, among other areas “that we’ve put together so we can inform when these things occur with aircraft flying through the Taiwanese air defense identification zone [ADIZ],” including the events preceding and during the events of interest, such as the entry of Chinese aircraft into the ADIZ, air traffic rerouting, even press releases. Because indicator watching is automated, human analysts are able to handle more data “and with great agility, pivot to new situations faster than before.”
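As a rough illustration of what automated indicator watching can look like in practice (not BigBear.ai’s actual implementation; the feed names, fields and timestamps below are invented), a few lines of Python can merge heterogeneous event feeds into one chronological context window around an event of interest:

```python
# Illustrative only: merge heterogeneous indicator feeds into one timeline
# around an event of interest (e.g., an ADIZ entry). Feed names, fields and
# timestamps are hypothetical, not BigBear.ai's data model.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Indicator:
    timestamp: datetime
    source: str        # e.g., "aircraft_telemetry", "air_traffic", "press_release"
    description: str

def context_window(indicators, event_time,
                   before=timedelta(hours=48), after=timedelta(hours=6)):
    """Return the indicators observed shortly before and during an event of
    interest, sorted chronologically, so an analyst sees the lead-up at a glance."""
    window = [i for i in indicators
              if event_time - before <= i.timestamp <= event_time + after]
    return sorted(window, key=lambda i: i.timestamp)

feeds = [
    Indicator(datetime(2021, 10, 1, 6), "air_traffic", "civil traffic rerouted south of Taiwan"),
    Indicator(datetime(2021, 10, 1, 20), "press_release", "ministry statement on air activity"),
    Indicator(datetime(2021, 10, 2, 3), "aircraft_telemetry", "multiple tracks entering the ADIZ"),
]
for item in context_window(feeds, event_time=datetime(2021, 10, 2, 3)):
    print(item.timestamp, item.source, item.description)
```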

He said what sets BigBear.ai apart “is that we’ve built machine learning that expects that data isn’t the whole story. I think that’s critical because, especially for our partners in defense and intelligence, you don’t always have control over the data you’re trying to analyze. There’s always this uncertainty in the data where you have gaps and holes and inaccuracies, and because of that, we have to use machine learning that assumes you’re going to have these errors, these issues, and even more, you don’t always measure what you want to measure.”
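A minimal sketch of that idea, using an off-the-shelf model rather than anything specific to BigBear.ai: scikit-learn’s histogram-based gradient boosting accepts missing values natively, so incomplete observations can still be used for training:

```python
# A minimal illustration, not BigBear.ai's method: scikit-learn's
# HistGradientBoostingRegressor accepts NaN values natively, so observations
# with gaps do not have to be discarded or hand-imputed before training.
import numpy as np
from sklearn.ensemble import HistGradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = 2.0 * X[:, 0] + X[:, 1] + rng.normal(scale=0.1, size=500)

# Simulate sensor gaps: knock out roughly 20% of the measurements at random.
X_missing = X.copy()
X_missing[rng.random(X.shape) < 0.2] = np.nan

model = HistGradientBoostingRegressor()
model.fit(X_missing, y)                 # trains despite the missing values
print(model.predict(X_missing[:5]))
```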

VANE presents its analysis on a dashboard for analysts to monitor. A “heat map or density graph” shows areas with heavy activity over a given time period.

“We also look at baseline behaviors,” Frutchey said. “Analysts are concerned when this level crosses a certain threshold. In the first week of October, it rose to 56 [Chinese] sorties in a single day at one point, and they used strategic bombers in some of those sorties. This stuff is normal low-level buzz, but I want to be alerted when the models predict that in a week, in a month, it will be above a certain threshold, or the rate of change is going to be significant. … We give them these alerts, and the user can, of course, then drill down into the alert and drill down into the forecast data. … So we were looking at aggressive activities in the South China Sea for the AI ANTX exercise.”
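A hypothetical sketch of that kind of alert logic, with invented thresholds and sortie counts, might look like this in Python:

```python
# Hypothetical alert logic with invented thresholds and sortie counts: flag a
# day when the forecast count crosses an absolute level or jumps well above
# the trailing baseline.
import pandas as pd

forecast = pd.Series(
    [12, 15, 14, 20, 38, 56, 30],
    index=pd.date_range("2021-10-01", periods=7, freq="D"),
    name="predicted_sorties",
)

ABS_THRESHOLD = 40      # alert when predicted sorties exceed this level
RATE_THRESHOLD = 1.5    # alert when the count is 1.5x the trailing 3-day mean

baseline = forecast.rolling(3, min_periods=1).mean().shift(1)
alerts = forecast[(forecast > ABS_THRESHOLD) | (forecast > RATE_THRESHOLD * baseline)]
print(alerts)
```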

“We also have scenario forecasts [that] allow us to assess courses of action,” he said. “What if Russia invades Ukraine? What would that do to the price of Bitcoin? Or, in the example of the AI ANTX, what if we organized a naval exercise in the Luzon Strait? What would that do to Chinese behavior in the South China Sea? If we were to go with an aircraft carrier into the Luzon Strait and have a little naval exercise, what would that do to behaviors? And so we can run these simulations, and then we can show the user: this is how the world would change in a month if we were to run this exercise next week.”
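A toy Monte Carlo sketch conveys the flavor of such a course-of-action comparison; the activity model and every number in it are invented for illustration and bear no relation to VANE’s actual simulations:

```python
# Toy Monte Carlo course-of-action comparison. The Poisson activity model,
# the base rate and the assumed 25% response to the exercise are all invented
# for illustration.
import numpy as np

rng = np.random.default_rng(42)

def simulate_activity(exercise_held, days=30, runs=10_000):
    """Simulated 30-day totals of adversary sorties under one scenario."""
    base_rate = 18.0                                        # assumed daily sorties
    rate = base_rate * (1.25 if exercise_held else 1.0)     # assumed reaction
    return rng.poisson(rate, size=(runs, days)).sum(axis=1)

baseline = simulate_activity(exercise_held=False)
with_exercise = simulate_activity(exercise_held=True)

print("median sorties, no exercise:  ", int(np.median(baseline)))
print("median sorties, with exercise:", int(np.median(with_exercise)))
print("P(activity rises by > 20%):   ", float((with_exercise > 1.2 * baseline).mean()))
```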

NAVWAR Commander Rear Adm. Douglas Small presents the inaugural AI-powered ANTX award to BigBear.ai Chief Technology Officer Brian Frutchey, right. NAVWAR / Elisha Gamboa

VANE is scalable, Frutchey said. “VANE is designed to elastically scale into the cloud as large as needed. That’s actually one of the great things about our platform: it’s completely serverless, which means it’s not a monolithic application with a bunch of servers consuming resources all day long. It’s a set of functions, and as clients need these functions, the system is designed to retrieve resources from the cloud, spin them up to do the necessary work, and then turn them off when the work is done. Our systems process terabytes of data to create these models on a global scale.”
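As a generic illustration of the serverless pattern Frutchey describes, a function written for an AWS Lambda-style runtime consumes compute only while a request is being handled; the handler below, including its event fields, is an invented example rather than anything from VANE:

```python
# Generic serverless sketch in the AWS Lambda style: the function exists only
# as code until invoked, the provider allocates compute for the call and
# releases it afterward. The handler name, event fields and any connection to
# VANE are assumptions for illustration.
import json

def lambda_handler(event, context):
    """Handle one forecasting request; no servers sit idle between calls."""
    region = event.get("region", "south_china_sea")        # hypothetical field
    horizon_days = int(event.get("horizon_days", 30))      # hypothetical field

    # Placeholder for the actual work a request would trigger.
    result = {"region": region, "horizon_days": horizon_days, "status": "queued"}

    return {"statusCode": 200, "body": json.dumps(result)}
```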

Frutchey said the win at the Project Overmatch ANTX shows the company’s prescriptive analytics are suited to operational and strategic purposes.

“We are starting to talk to program offices for major command and control systems, [such as] the Global Command and Control System-Maritime,” he said.

Match the best of breed

AI is also being applied at the tactical level. Draper, a company known for building ballistic missile guidance systems, entered a competition held last summer by the Naval Surface Warfare Center’s Crane Division in Crane, Indiana. The prize challenge was to determine the feasibility of taking autonomy software and implementing it on another organization’s hardware.

“I think what the government was trying to learn is how difficult it is to separate these two [software and hardware],” said Drew Mitchell, associate director of defense systems at Draper and general manager of Draper’s office in Tampa, Florida. “That way I can match the best software with the best hardware. Usually it’s not the same when the company delivers that end product to the government. And it’s also very expensive. So they’re trying to find a way to cut costs on some of these autonomous platforms.”

The prize challenge was divided into three phases. During Phase 1, Draper, which had extensive experience developing platform-independent software, submitted a white paper that was accepted along with those of 20 other companies. Phase 2 was a simulated exercise that involved loading autonomy algorithms into a small quadcopter unmanned aerial system and navigating it inside a building, mapping the interior and identifying objects in the building, all without the aid of GPS.

Hydronalix’s EMILY unmanned surface vessel and Adapt drone. HYDRONALIX

Five competitors made it to Phase 3, which involved a live demonstration of the Phase 2 scenario using a Hydronalix quadrotor drone into which their respective software was loaded.

“It was completely autonomous, so you give the drone some sort of basic instruction, basically fly forward, and then it takes over from there,” Mitchell said. “It senses the environment, and it does that through cameras, and it uses the same camera to perform the navigation algorithm using a vision-based navigation system. It uses the same camera to collect, or generate, an environment map. Much of this is very CPU-intensive. In a small package, like a small quadcopter drone not much bigger than a book, there aren’t many processors available out there to do all that.”

Draper used vision-aided navigation algorithms that it had developed in its other programs.

“We used what’s called visual inertial odometry,” Mitchell said. “It’s very similar to how the human eye and brain work in terms of referencing objects as you see them, and then as you move around your brain is always calculating: oh, I saw that point, and now the point is three feet away from me; as I move closer to it, that point is now two feet away from me, and from there you can infer a lot of direction. It’s not extremely accurate, but it’s accurate enough to help the IMU [inertial measurement unit] on board, so the IMU does not drift widely.”

“The predominant systems today use GPS to aid this inertial unit,” Mitchell explained. “The GPS gives it a position, so it knows: okay, I see this point, then I move again, I see this point, I know where I am. But if you’re inside, GPS doesn’t work, and the big push within the DoD is to do things without GPS, because they know that in a future conflict that’s probably going to be one of the first things our adversaries take away. With vision, you can get pretty decent navigation accuracy in a very small package and do it completely on its own.”
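A stripped-down, one-dimensional sketch shows why a drift-free visual fix helps: dead reckoning on a noisy, biased IMU alone wanders, while blending in camera-derived position estimates keeps the estimate anchored. This is only a complementary-filter illustration with invented numbers, not Draper’s visual inertial odometry:

```python
# One-dimensional complementary-filter illustration with invented numbers, not
# Draper's visual inertial odometry: an IMU-only estimate drifts, while blending
# in a drift-free (but noisy) camera-derived position keeps the estimate anchored.
import numpy as np

rng = np.random.default_rng(1)
dt, steps = 0.1, 200
true_vel = 1.0                                        # constant velocity, m/s
true_pos = np.arange(steps) * true_vel * dt

imu_accel = 0.05 + rng.normal(0.0, 0.1, steps)        # biased, noisy accelerometer (true accel is zero)
vision_pos = true_pos + rng.normal(0.0, 0.05, steps)  # camera fix: noisy but drift-free

# Dead reckoning from the IMU alone: velocity error accumulates into position drift.
vel_est, pos_imu = true_vel, np.zeros(steps)
for k in range(1, steps):
    vel_est += imu_accel[k] * dt
    pos_imu[k] = pos_imu[k - 1] + vel_est * dt

# Complementary filter: mostly trust the IMU prediction, nudge toward the vision fix.
alpha, pos_fused, vel_est = 0.9, np.zeros(steps), true_vel
for k in range(1, steps):
    vel_est += imu_accel[k] * dt
    predicted = pos_fused[k - 1] + vel_est * dt
    pos_fused[k] = alpha * predicted + (1 - alpha) * vision_pos[k]

print("final IMU-only position error:", abs(pos_imu[-1] - true_pos[-1]))
print("final fused position error:   ", abs(pos_fused[-1] - true_pos[-1]))
```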

“We were able to show that with very little time and resources and a very rudimentary hardware platform that the government provided us, we were able to navigate inside a building without GPS,” he said. “We were able to identify objects. We were able to map certain parts of an environment. Of course, this was not optimal. What we created on our own was much better than what we developed through this process, but we helped the government understand that it was possible to do it.”

Draper came second in the contest, with EPISCI coming in first.
