How Interconnected, Simulated Worlds Could Transform Military Training

By Mikayla Easley


Earlier this year, two Berkut 540 aircraft — codenamed Red 1 and Red 2 — raced down the runway of Santa Monica Airport and climbed into the California skies. As the two planes flew over Ventura County, a KC-46 Pegasus tanker came into the pilots’ view.

The tanker flew adjacent to Red 1, and the pilot navigated into position so the KC-46 could refuel the aircraft while Red 2 observed.

However, anyone looking up from the ground would have only seen two planes in the sky.

The third plane that “refueled” Red 1 wasn’t real — it was generated using augmented reality. Once in the air, the Berkut 540 pilots entered a common network. This allowed them to see and interact with the AR tanker in real time from their separate aircraft while communicating with each other.

Controlled by Florida-based startup Red 6, the exercise was one of the first successful training flights that linked multiple real-world platforms with synthetic assets, according to the company.

The demonstrated extended reality and networking capabilities are just some of the many technologies ushering in a new era of military training — one operated in the metaverse.

The idea of the metaverse has dominated the commercial space in recent years. While tech giants and small startups alike have hopped on the bandwagon, the U.S. military has also been developing its own “military metaverse” that could be harnessed to train thousands of warfighters simultaneously in realistic battlefield simulations.

“The metaverse means a lot of different things to a lot of different people,” Brig. Gen. William Glaser, director of the Army’s Synthetic Training Environment Cross Functional Team, said in an email. “Microsoft may be focusing on a collaborative work effort. Facebook may focus on the social aspect. If you talk to Amazon, they may say it’s about an improved shopping experience.”

Despite differing use cases, experts share some common understanding of what the metaverse is. The idea is to create a single, simulated 3D world that exists online and that multiple people can access through a device, such as a headset or tablet, from anywhere around the globe. Once inside the metaverse, a person’s “avatar” can interact with the virtual world and others in it.

Juliana Slye, founder and CEO of consulting firm Government Business Results, said the metaverse has become a viable reality due to “the commoditization and commercialization of the necessary hardware and software and other devices that make it happen.”

These include hardware components — such as headsets or tablets — that allow users to enter the metaverse, as well as advancements in high-fidelity graphics and power from gaming engines by software companies, she said.

The metaverse would be so realistic that it suspends users’ disbelief that it’s a simulation, tricking their minds into thinking they are in those conditions when they are physically somewhere else, said Bob Kleinhample, business line executive at software company Improbable.

“It’s like muscle memory. If [a warfighter is] training in something that’s realistic, in an environment they’re getting ready to go into, and the conditions are realistic, that’s much more meaningful of a training experience,” he said.

This means the technology replicates the complexities found in real life — from the way buildings, terrain and weather shape an environment to how populations interact with one another, he said.

Software company Bohemia Interactive Simulations is developing technology for the Army’s Synthetic Training Environment, or STE, that creates high-fidelity simulations for soldiers to train anywhere in the world.

Called One World Terrain, the software takes 3D data collected from satellites, sensors or scanners and combines it with additional information to render high-fidelity terrain simulations, said Pete Morrison, the company’s chief commercial officer.

The Army’s Synthetic Training Environment is an example of how the service could evolve live-virtual-constructive, or LVC, systems — which combine simulated and physical elements for training — into a single system to train ground, dismounted and aerial platforms, Glaser said.

The service began developing the program long before the metaverse became a buzzword in industry, Glaser said. The system is one of the service’s top modernization priorities and is currently expected to be fielded in 2023.

STE’s digital world will be manipulated with artificial intelligence and machine learning to achieve specific training results, he added. For example, inexperienced soldiers might enter the simulation and find perfect weather. As they get better through training, the system introduces different environmental and operational conditions, he explained.

“We want to meet the training audience where they are at, not where we want them to be and gradually move them along,” Glaser said.

Red 6, the company that controlled the simulated refueling exercise, is focused on merging synthetic and live training capabilities. The company is using AR-focused technologies to evolve pilot training, said Dan Robinson, co-founder and CEO of the company.

“The problem with LVC is as soon as you transition from beyond-visual-range to within visual-range, the whole training system collapses,” he said. “There’s no way of putting synthetic adversaries into the field-of-view of pilots and have them behave in a manner representative of real-world threats.”

Red 6’s advanced tactical augmented reality system, or ATARS, improves upon LVC capabilities and allows pilots to see augmented reality-generated objects through a headset in real time and in high-speed environments, Robinson said. When paired with the company’s combined augmented reality battlespace operational network, or CARBON, multiple users can enter the same digital space and train together, he said.

These systems can realistically generate augmented reality in outdoor environments, allowing pilots to train for difficult refueling maneuvers or even high-speed dogfights, he added. Both were used during the company’s summer exercise that demonstrated two aircraft communicating and interacting with an AR-generated KC-46 during a refueling mission.

Along with complete immersion facilitated by hardware and software, the metaverse also must be a “persistent world — one that continues to exist after a user has left. Another user could go back in. The world lives outside of users,” Slye said.

The digital world of the metaverse also needs to be adaptable, Robinson said. Otherwise, “it’s just a video game,” he said.

During combat, missions are planned and executed after an evaluation of previous outcomes and actions of enemy forces, he explained. Warfighters must adapt to changes in personnel, equipment and landscape from earlier events, he said.

“When we train — in contrast to operations — we can’t do that,” he said. But with an adaptive metaverse, “we can now put living, breathing walls in there with an enemy force that is driven by either human input or AI algorithms that’s reacting to the outcomes of our training measurements.”

This would allow the services to conduct campaign-level training over multiple weeks, where warfighters, combat commanders and senior leaders can exercise large-scale operational plans, Robinson said.

To achieve training at that scale, thousands of users need simultaneous access to a single synthetic environment. This makes the metaverse attractive for military training, Improbable’s Kleinhample said.

Improbable is one company working to improve the computing and networking capabilities the military would need to connect multiple users in a single metaverse training environment at the scale of multi-domain operations, which require rendering large numbers of virtual elements that must then react naturally.

“The more dense you get, the more that stuff has to go on at microsecond levels — really straining the compute power,” Kleinhample said.

The company is addressing the need for increased scale and complexity with Skyral, a platform-enabled ecosystem that facilitates collaboration and development for synthetic environments, he said.

It creates a “marketplace of capability” that will be needed to build and maintain these environments, without bogging down or crashing legacy computer systems, he added.

Improbable also offers M2, a networking capability that brings multiple unique users into virtual worlds simultaneously using its Morpheus networking technology.

The company recently held a “stress test” of the technology in which 20,000 bots interacted in a dense simulation, paving the way for the same number of service members to enter a synthetic world for a training event, he said.

Kleinhample added that cloud computing — a network of remote servers hosted on the internet — will also help get the metaverse to the edge of operations where warfighters are located.

Data collected about an environment from deployed units can be used to recreate that theater in the metaverse, Morrison explained. Then, warfighters who are preparing to go to that environment can train alongside those who are already there.

“That’s really powerful,” he said. “We can give troops the ability to do the same convoy driving that they’ll do in theater in the environment they’re about to deploy to.”

The training benefits are not limited to warfighters. Glaser noted that the metaverse helps Army leaders better understand their soldiers, the enemy and the terrain.

“This allows commanders to improve their speed and effectiveness of their decision-making,” he said. “What we do in a collective training environment is about synchronizing assets in time and space which requires our units to look ‘inside-out.’”

Military leaders can also harness the vast amounts of data created in the metaverse for insights on how the entire force is performing in specific exercises, Morrison said. Machine learning and analysis of that data could produce informed suggestions for improvements, he said.

Once that data exists, it can be used to customize training based on a warfighter’s strengths and weaknesses, Robinson pointed out.

“How we’ve traditionally done it is we have a common syllabus for all students, and I think that’s a flawed concept,” he said.

In the metaverse, data produced by a warfighter’s experiences would be so accurate that it serves as an actual representation of their real-life capabilities, Kleinhample said. The military could then focus training where someone might need more practice while letting them quickly move through areas in which they naturally excel, all while keeping historical records of improvement.

The benefits of multi-user simulated military training are hardly in dispute — so much so that the Defense Department has been toying with the idea since the 1980s, almost a decade before science fiction author Neal Stephenson coined the word “metaverse” in 1992, Slye noted.

In fact, many of the services are going virtual today, she said.

The Space Force earlier this year awarded a contract to develop a digital-twin space simulation and astrodynamics training platform. The Air Force is employing a synthetic training program known as the Operational Training Infrastructure 2035 Flight Plan, from which the Navy has derived its own version called Naval Aviation Training Next — Project Avenger. Likewise, the Marine Corps’ recent modernization efforts call for increased use of immersive training environments.

Their development reflects progress in the commercial world, where individual companies are each creating “a metaverse” that theoretically could be connected into one giant virtual world, Slye said.

“There are multiple metaverses that are actually being engaged in today within the military,” she said. “I think we’re leaning toward ‘the metaverse,’ but there’s a gap between where we are today and where we can get to with ‘the metaverse.’”

However, there is no concrete plan at the Defense Department to create and invest in an enterprise-wide metaverse. Instead, each service’s separate development of metaverse-enabling technologies is based on its own training needs.

Glaser emphasized that the term “military metaverse” can be misleading, as it doesn’t entirely reflect the Synthetic Training Environment’s capabilities. While there could be a military metaverse years from now, the service is going to let industry lead that effort, he said.

Just as there are multiple definitions of what the metaverse will look like, experts described many paths for how and when the world will get there. But if the metaverse is ultimately where the military decides to head with its training, it would have one key advantage: a clear goal.

“In the near term, it will be training and eventually we might leverage a collaborative planning environment in order to plan military operations in a distributive fashion,” Glaser said. “There are significant differences, but it all comes down to the purpose.”
