TRAINING AND SIMULATION
Augmented Reality Emerges as Key Tool for Military Medical Training
Having well-trained medics and other military health care professionals can mean the difference between life and death for U.S. service members. That’s why the Pentagon is eyeing augmented reality as a simulation tool to help them sharpen their skills.
Last year, the University of Miami’s Gordon Center for Simulation and Innovation in Medical Education conducted a four-month study funded by the Defense Health Agency to examine the effectiveness of teletraining that leverages AR.
“The background to this is that the DoD is faced with really the same challenges that we all are in health care, which is not enough faculty to be able to train all the people that need to be trained,” said Dr. Ivette Motola, a specialist in emergency medicine with the University of Miami Health System. “And so it was in some ways a proof of concept — can we do this effectively remotely?”
The study involved about 30 participants including personnel from the Army Trauma Training Detachment.
The study examined whether teletraining with augmented reality could be used to teach a new, complex procedure to a group of paramedics, Motola said. The AR technology used in the study was developed by ArchieMD, a Boca Raton, Florida-based company that provides educational training products in the health sciences.
“We put [trainers and trainees] in two different places and used telemedicine along with a manikin and augmented reality … imagery for the faculty member to teach the paramedic how to put in a chest tube,” Motola explained in an interview.
A tube thoracostomy involves placing a hollow plastic tube between the ribs and into the chest cavity to drain air, blood or other fluids from around the lungs — a procedure that could be required to save the life of a wounded service member.
“For this one, it was a projector over the manikin, and then they had a screen in front of them where they could see the instructor and then also see the manikin with the superimposed anatomy and the step-by-step procedure components,” she said. “Basically they could see the manikin plus the anatomy [with AR imagery] and all of the layers they were going through together.”
“It’s like they have X-ray vision,” Dr. Ross Scalese, director of educational technology development at the Gordon Center and a former Air Force flight surgeon, said of the students using the technology.
The feedback from participants was positive, Motola said.
“The features the learners found most useful were the ability to interact and receive guidance and feedback live from the instructor, and the anatomic landmarks of the AR with the step-by-step animations,” she said. “The instructors identified as most useful the high quality of the [audiovisual technology], along with the presence of the anatomy, to be able to ensure that the learner is performing the right actions in the correct location.”
What did the researchers conclude in their report for the Defense Health Agency?
“The integration of teletraining with augmented reality could become a valuable method for both training military health care personnel and treating soldiers with battlefield injuries,” Motola said. “Military medics could receive remote guidance in executing a procedure, saving time and even lives.”
The Defense Department is moving forward with these capabilities.
The Army has a project known as the Realistic Portable and Deployable Medical Patient Simulator Using Augmented Reality. In April, Army Medical Research and Development Command’s Medical Simulation and Information Sciences Research Program conducted a live demonstration of an augmented reality device called PerSim, which was developed by MedCognition Inc.
The technology leverages Microsoft HoloLens goggles to allow users to enter a virtual environment where they can treat a variety of simulated sick and injured patients, according to an Army news release. “Those virtual patients are created by animating a live human model and then ‘wrapping’ that model in a three-dimensional ‘skin’ which can then be adjusted for a number of variables depending on what exact malady or injury the patient is ostensibly suffering from at the time,” it explained.
The system can support a wide range of training procedures, including applying tourniquets and treating shrapnel injuries or sucking chest wounds, according to the release.
The Army has also been exploring AUGMED, a software tool from Design Interactive Inc. that leverages augmented reality to provide trainees with “fully immersive” tactical combat casualty care training scenarios that can be delivered remotely, according to the company’s website. It also uses the HoloLens, according to the Army.
“The long-term goal of all of our efforts is to put an end to the need to send service members to another location and to be able to provide the training anywhere and anytime,” Medical Research and Development Command Senior Program Manager Frank Karluk said in the news release. “In the end, if training can be accomplished in this way during future wars, we will always have a fully trained and ready medical force without the need for just-in-time type of training just prior to a deployment.”
Dr. Darrin Frye, a portfolio manager at the command, said: “It is of critical importance that we provide our medical and non-medical military the opportunity to practice casualty care using realistic synthetic combat trauma scenarios, and provide optimal learning experiences in a cost-effective, everywhere, anytime, immersive fashion.”
Both PerSim and AUGMED were expected to be transitioned to the Defense Health Agency’s program manager for medical simulation and training office, with the goal of putting the technology in the hands of end users, according to the release.
Meanwhile, the Air Force’s 6th Medical Group Education and Training Flight team has been chosen to collaborate with the Defense Health Agency and commercial partners to field the Defense Department’s first integrated “mixed reality” platforms for military medical readiness training, which combine augmented reality with manikin-based simulation, according to a September Air Force news release.
“The introduction of augmented reality training increases the availability of training, without a need to increase manpower availability for training setup and multiple student face-to-face interaction,” the release said. “The additional benefits to the augmented reality application in military medical readiness training is increased mobility, minimal setup and wireless connectivity that can be used in the field.”
An emerging technology that could serve as a key enabler of teletraining and augmented reality is 5G networks.
Last year, the Pentagon designated Joint Base San Antonio, Texas, as a 5G experimentation site. Officials have identified AR-guided enhanced medical training as a potential application for the technology.
“5G features, functions and advanced communications technologies, including flexible bandwidth allocation (ultra-narrow to extremely broad), in-network core applications, ultra-low latency, and ubiquitous connectivity for DoD military medical service components … will enable the joint medical community to sustain its long-term economic and military advantage,” according to a Defense Department news release.
“The prototype and demonstration of this application with 5G is to enhance in garrison (on the actual post or station) or just-in-time (right before departing for deployment) medical readiness training,” it said. “It will improve and/or augment capabilities and accessibility for reliable and realistic scenarios which meet or exceed national providers’ standards.”
The Defense Department did not respond to interview requests.
Scalese said “bandwidth is always an issue” when it comes to telemedicine and teletraining.
“If you want it to be like you’re in the room with the person doing the debriefing, or watching the scenario, then you’re always [potentially] dealing with lag and stutter,” he said. “As you go to these 3D renderings and all of that, it’s a lot of information that you’re trying to stream. And so bandwidth is going to definitely be a factor. And 5G is just one step … where that’s going to be leaps and bounds better.”
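Scalese’s point about streaming 3D renderings can be made concrete with a rough calculation. The figures below — vertex count, per-vertex size, frame rate and compression ratio — are purely illustrative assumptions, not measurements from any of the systems described here.

```python
# Back-of-envelope: raw streaming bitrate for a 3D anatomy overlay
# versus typical link capacity. All figures are illustrative assumptions.

def required_mbps(vertices, bytes_per_vertex, frames_per_second):
    """Uncompressed bitrate in megabits per second."""
    bits_per_frame = vertices * bytes_per_vertex * 8
    return bits_per_frame * frames_per_second / 1_000_000

# Assumed example: a 100,000-vertex mesh, 32 bytes per vertex
# (position, normal, texture coordinates), refreshed 30 times per second.
raw = required_mbps(100_000, 32, 30)   # 768.0 Mbps uncompressed
compressed = raw / 50                  # assuming roughly 50:1 compression

print(f"raw: {raw:.1f} Mbps, compressed: {compressed:.2f} Mbps")
```

Even under this generous compression assumption, the stream lands in the tens of megabits per second — comfortably inside 5G’s expected throughput but a tight fit on congested 4G links, which is consistent with Scalese’s observation that bandwidth remains a limiting factor.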
The Defense Department in June released a request for prototype proposals to the members of the National Spectrum Consortium aimed at supporting research and development of 5G-enabled telemedicine and medical training technology.
The solicitation focused on two areas of interest: development of a 5G-enabled augmented reality training platform that can accommodate multiple trainees under a single trainer; and telementoring capabilities for medical procedures that leverage AR and 5G to facilitate real-time communication between military service members in remote locations and medical specialists, according to a consortium press release.
Proposals were due in July.
“Augmented reality has the very real potential to create three-dimensional immersive learning experiences to train and mentor military medical personnel in critical care and trauma scenarios from remote locations,” said the consortium’s chief strategy officer, retired Vice Adm. Joseph Dyer.
Consortium Executive Director Maren Leed said the initiative offers an opportunity to significantly advance the quality of health care for service members in remote locations.
Additionally, “DoD’s investments will help advance the commercial state-of-the-art when it comes to using AR for medical training and mentoring, benefitting the nation as a whole,” she said.
How quickly are augmented reality and teletraining technologies advancing?
It is now much more user friendly than it was just a few years ago, Motola said.
Scalese said the visual experience is where technology developers have made the most progress, though the tactile, or haptic, experience still has a lot of room for improvement.
“The ideal would be when you can walk into the Holodeck and say, ‘Load trauma scenario 1B,’ and boom, it’s 3D, completely realistic, and with all your senses engaged,” Scalese said, referring to the fictional device from Star Trek that features holograms and allows users to interact with entities in a synthetic environment. “We’re not there yet, but it has come a really long way, just with all the 3D graphics and processor speeds and things like that have enabled much more realistic” scenarios.
Topics: Training and Simulation