REVIEW ARTICLE
Year: 2018 | Volume: 4 | Issue: 4 | Page: 166-172
ImmersiMed: Cross-platform simulation training
Pieter Jorissen, Ivan De Boi
Department of Industrial Sciences and Technology, Karel de Grote University College, Antwerp, Belgium
Date of Web Publication: 28-Dec-2018
Correspondence Address: Pieter Jorissen, Department of Industrial Sciences and Technology, Karel de Grote University College, Salesianenlaan 90, B-2660 Hoboken, Antwerp, Belgium
Source of Support: None, Conflict of Interest: None
DOI: 10.4103/digm.digm_12_18
This work presents our vision and work-in-progress on a new platform for immersive virtual and augmented reality (AR) training. ImmersiMed is aimed at medical educational and professional institutions for educating nurses, doctors, and other medical personnel. ImmersiMed is created with multi-platform support and extensibility in mind. By creating consistent experiences across different platforms and applications, ImmersiMed intends to increase simulation availability. Furthermore, it is expected to improve the quality of training, prepare students better for more advanced tasks, and boost confidence in their abilities. Tools for educators are being provided so that new scenarios can be added without the intervention of costly content creators or programmers. This article addresses how ImmersiMed's mixed-platform approach can ease the transition from basic school training to real-world applications by starting from a virtual reality simulation and gradually letting the student move on to guided AR in the real world. By explaining the idea of a single development platform for multiple applications using different technologies and by providing tools for educators to create their own scenarios, ImmersiMed aims to improve training quality and availability at low training and simulation costs.
Keywords: Augmented reality, cross-platform simulation training, medical education, virtual reality
How to cite this article: Jorissen P, De Boi I. ImmersiMed: Cross-platform simulation training. Digit Med 2018;4:166-72
Introduction
While the term virtual reality (VR) was coined in the 1980s, its concepts had been fueling sci-fi literature and movies for decades. Nowadays, VR refers to computer-generated virtual environments (VE) in which users are immersed using VR devices such as head-mounted displays (HMD) and intuitive, motion-sensing input devices. The goal of VR applications is to create an experience in which the user feels immersed and present in that VE. The focus of VR interactions thus remains in the digital environment.
Augmented reality (AR) differs from VR as it uses the real environment but enhances the experience by adding an interactive overlay onto it. Therefore, AR applications usually focus on real-world tasks. The virtual and physical environment layers are blended in such a way that an immersive, interactive environment is experienced.
VR and AR are no longer just about video games and research. Over the past few years, big players in the tech industry such as Facebook (Oculus Rift[1]), Google (Glasses, Cardboard, Daydream[2]), Samsung (Gear[3]), Microsoft (HoloLens[4]), and HTC (Vive[5]) have invested in the future of VR and AR. The technology has since evolved dramatically, providing high-fidelity experiences at affordable prices. VR and AR are thus no longer a sci-fi vision but a commercial reality, on the verge of being adopted in every other industry as well.
Training is one of the earliest use cases for VR and AR, especially VR, which allows for practicing dangerous, complex, uncommon, or expensive tasks in a risk-free environment. From a learner's point of view, the possibilities are unlimited, as trainees can perform "hands-on" tasks in a controlled and safe environment. Trainees can afford to make mistakes and learn from them in a VR setup where there is literally no risk at all. A comprehensive overview of the benefits of VR training is given by Gupta et al.[6] Although these benefits are numerous, a gap between the simulation and the real world remains.
As AR is a more recent technology, its advantages have not yet been studied as comprehensively. However, some studies do point out that many of the benefits VR brings to training also hold true for AR.[7],[8],[9],[10] Several AR applications have already demonstrated the technology's potential, and AR may be able to bridge the gap between achieving skill in a virtual training context and actual competence in the real world.
One of the fields that has been adopting VR for training and simulation since its early days is healthcare.[11] Over the past couple of decades, VR and simulation technology has been implemented in healthcare training and education. Surgery simulators have been invaluable for physician training. However, these tools have historically come at a considerable cost, and students often have only limited access to them. Furthermore, most of these setups were designed to focus on specific procedures, scenarios, or situations, mostly aimed at surgeon training. New real-time visualization platforms, such as smartphones, are now becoming ubiquitous, and their power is nearing that of desktop computers. This has been pushing VR and AR technologies forward and is making them cheap and available to everyone. Studies have shown that VR and AR training of medical students and residents improves their knowledge base and helps in evaluating their performance.[12] Students perceive simulation-based education as "an opportunity to learn new skills in a safe environment."[13] Use of VR training at the start of medical training has also been shown to improve understanding of basic concepts of medical science, such as pharmacology and physiology, presumably because these simulated experiences help students to understand abstract concepts of basic science that are difficult to convey with regular discourse.[14]
Apart from the understanding aspect, there is also an important confidence aspect. There has always been a strong mental connection between performance and seeing oneself as being successful: visualizing yourself succeeding helps boost your confidence, and that is one of the reasons VR and AR simulations are successful training technologies. The more medical students train and succeed in a virtual situation, the more confident they will be once they enter real-life situations. More and better training will improve the quality of care and lower the risk of medical mistakes.
The healthcare industry lies at the intersection of high-tech, medical knowledge, and legal policy. Medical knowledge doubles every 6–8 years, and new innovations in medical procedures pop up every day. Healthcare practitioners have an obligation to keep their knowledge and skillset up-to-date. Professional medical training and proficiency are a huge part of healthcare costs, and many hospitals and educational institutions lack the tools and/or time to keep up with all new developments. Due to the high cost, training is often limited to pure medical knowledge. We believe a blend of AR and VR can solve these issues. As reported earlier, many tools for both VR and AR already exist in the field of medical training. However, they are mostly complex, expensive, and focused on fixed procedures and scenarios. The largest group of medical personnel consists of nurses, a group which we believe has been overlooked by most VR application developers and researchers. Furthermore, their technical knowledge on how to adapt VR/AR applications is mostly nonexistent.
ImmersiMed
This article presents the ImmersiMed vision and project status. ImmersiMed is a combined VR/AR platform aimed at educational and medical institutions for training nurses, doctors, and other medical personnel. It envisions being an easy-to-use platform for training all kinds of procedures, from navigating through the hospital to highly specialized surgical procedures. By supporting different setups, ranging from highly specialized simulators over commercially available computer-HMD combinations down to downloadable mobile applications that users can run on their smartphones at home, ImmersiMed can be tailored to every budget and situation. Combining multiple setups based on the same platform will improve the learning experience by increasing the amount of time a student can train. Furthermore, the experience across these platforms will remain much more consistent than if separate applications were used. Highly advanced simulator setups, which may be available at the institution, will never be available to every student at all times. The mobile apps, which deliver similar experiences, can be available on every student's smartphone, allowing them to practice whenever and wherever they want. Training programs could also work their way up by letting students learn the basic knowledge and procedures at home on their smartphone applications, then move on to the more available computer-HMD setups at the institution, and finally do a simulation on the sparsely available high-end simulator. This would save valuable time and costs on these expensive simulators.
However good these expensive high-end simulators may be, a gap will still remain between virtual experiences and real experiences with real patients. ImmersiMed will be used to investigate whether AR applications can bridge that gap. This, however, is part of future research and beyond the scope of this article.
Another important aspect of the ImmersiMed platform is flexibility. ImmersiMed will provide educators, and even students, with the opportunity to create new scenarios and procedures, either for all setups or for specific ones only. Finally, ImmersiMed incorporates user feedback, directly in the virtual/augmented environment and indirectly after the session. Depending on the level of skill, the instructor or the students themselves can determine when and how much feedback is given. In all setups, this feedback can also be used as an evaluation tool to check a student's progress and skill level, and it can be stored permanently or exported.
Current Status
System overview
ImmersiMed is a medical teaching platform under development at the Karel de Grote University College. It can be used to train students and medical personnel in medical procedures. Furthermore, once completed, it can also be used to test the effectiveness of VR and AR training in a medical context. It is being developed by researchers from the Multimedia Technology program in close collaboration with teachers from the Nursing program, who are also responsible for the simulation laboratories in the healthcare department.
ImmersiMed and its first set of applications have been developed using Unity,[15] a multi-purpose, cross-platform game development environment. Unity was chosen for its flexibility and cross-platform capabilities, as it supports building for 27 different platforms. This will be crucial for future releases, especially those running on mobile devices. Furthermore, Unity has a plugin supporting OpenVR,[16] a standard which encourages a common baseline experience for VR users. OpenVR is a software development kit developed by Valve and supports most VR HMDs and motion controllers. For future AR setups and mobile applications, Unity allows for AR development and supports the most popular AR headsets, among them the Microsoft HoloLens. It has proven to be a very reliable engine for AR applications on all major platforms and allows for cross-platform development as well.
Our first set of applications, described in the next section, focuses on a computer-HMD VR setup. A standard VR-enabled desktop computer combined with an HTC Vive HMD is used, allowing the user to interact with the VE using the standard Vive controllers. Since all VR components are implemented using the OpenVR plugin, switching or adding motion controllers or changing the HMD would require very little effort. The setup was developed by two developers and one three-dimensional (3D) modeler over the course of one trimester.
This setup was the preferred starting point for several reasons:
- It will require the most resources (computational and graphical) and the most complex user interactions
- It will require the highest level of graphical detail
- It was the first platform requested by the Department of Health Care at the Karel de Grote University College, which will be our main partner in the future testing and development of the ImmersiMed platform.
Internally, an extensible markup language (XML)-based file format is used to describe scenarios, including patient details, available medications and equipment, the tasks the student should perform, and so on. A user-friendly tool that allows nursing teachers or trainers to edit and create new scenarios was also developed. It uses the same Unity modules and codebase that were used to create the applications; more details are given in the next subsection.
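The exact schema is still evolving. As an illustration only, a scenario description could look roughly like the following sketch; all element and attribute names are hypothetical and do not reflect the actual ImmersiMed file format:

```xml
<!-- Hypothetical scenario sketch; element and attribute names are illustrative only -->
<Scenario id="iv-paracetamol-01" title="Administer IV paracetamol">
  <Patient room="204" type="adult female">
    <History>Appendectomy, day 1 post-op</History>
    <Allergies>None known</Allergies>
  </Patient>
  <Cabinet id="UBC-1">
    <Drawer security="low">
      <Item kind="medication" form="vial">Paracetamol 1 g</Item>
      <Item kind="equipment">Syringe 10 ml</Item>
    </Drawer>
  </Cabinet>
  <Tasks>
    <Task order="1">Identify the patient by scanning the wristband</Task>
    <Task order="2">Check the medical chart for allergies</Task>
    <Task order="3">Retrieve the prescribed medication from UBC-1</Task>
    <Task order="4">Prepare and administer the medication</Task>
  </Tasks>
</Scenario>
```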
Current functionality and applications
Currently, the ImmersiMed applications are aimed at training nursing students in routine tasks and providing them with feedback on how well they performed. The implemented tasks were identified by a team of instructors from the Bachelor in Nursing program at the Karel de Grote University College. They prioritized the routine tasks based on the importance of each task, weighed against the lack of training possibilities in the current curriculum. From this prioritized set, a selection of high-priority tasks and procedures was made in close collaboration with the instructors. Some less important tasks were added to create a complete overall training experience. In this first stage, the tasks were developed for the computer-HMD setup, but with the other platforms in mind.
These selected routine tasks include:
- Navigate through and interact with a virtual hospital environment. The student can walk through the hallways, enter patient rooms and medical supply rooms, open and close doors, etc. [Figure 1]
- Find and identify patients [Figure 2] and check their medical charts containing history, allergies, etc. [Figure 3]
- Find and read instructions on the required medical treatments and prescriptions
- Retrieve necessary medical equipment and machines from store rooms and computer-controlled medication cabinets, also known as secured unit-based cabinets (UBCs) [Figure 4]. Our system is based on the Vanas[17] computer-controlled medical cabinet system, which is the most prevalent in Belgian hospitals
- Prepare medical equipment (syringes, needles, etc.) and medications (e.g., the correct dosage), and collect extra aids for administering medicines (a cup of water, etc.) [Figure 5]
- Look up medication information in the official professional drug and medication databases
- Administer medications to the patient according to professional guidelines [Figure 6].
Figure 1: Navigating through the Medical Institution's hallway, to get to a patient's room
Figure 2: A patient's room including a pregnant female patient. The patient can be identified by scanning a code on the wristband
Figure 3: A collection of virtual patient charts located at the department's central desk
Figure 4: Room with secured medical unit-based cabinets for retrieving medications and medical equipment
Figure 5: Drawing up medication from a vial after selecting the correct syringe and type of hypodermic needle
Instructors, on the other hand, are provided with two simple tools. The first is an easy-to-use, form-based scenario creation tool (a sketch of a possible underlying data model is given below the list) which allows them to:
Create new scenarios:
- Create a new patient with history and type (adult, child, male, female, pregnant, senior, etc.)
- Add new medications and tools, describing their form (pill, flask, etc.)
- Set up the entire UBC contents: the medication, tools, and security level can be configured per drawer, and multiple cabinets can be set up
- Describe the entire scenario:
- Select the patient that needs treatment
- Select the right medication
- Select the right way and details of how to administer the medication (using the correct tools and dosage).
Part of the interface is shown in [Figure 7].
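The scenario data gathered through this form could, for instance, map onto a small set of serializable classes. The following C# sketch uses the standard .NET XmlSerializer, which is available in Unity's scripting runtime; the class and field names are hypothetical and only illustrate the idea, they are not the actual ImmersiMed codebase:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Xml.Serialization;

// Hypothetical scenario data model; the real ImmersiMed schema may differ.
public class Scenario
{
    public string Title;
    public Patient Patient = new Patient();
    public List<CabinetDrawer> Drawers = new List<CabinetDrawer>();
    public List<string> Tasks = new List<string>();     // ordered task descriptions
}

public class Patient
{
    public string Name;
    public string Type;        // e.g. "adult", "child", "pregnant"
    public string History;
    public List<string> Allergies = new List<string>();
}

public class CabinetDrawer
{
    public int SecurityLevel;
    public List<string> Contents = new List<string>();  // medications and tools
}

public static class ScenarioIO
{
    // Write the scenario to an XML file that the training applications can load.
    public static void Save(Scenario s, string path)
    {
        var serializer = new XmlSerializer(typeof(Scenario));
        using (var writer = new StreamWriter(path))
            serializer.Serialize(writer, s);
    }

    // Load a scenario created by the instructor tool.
    public static Scenario Load(string path)
    {
        var serializer = new XmlSerializer(typeof(Scenario));
        using (var reader = new StreamReader(path))
            return (Scenario)serializer.Deserialize(reader);
    }
}
```

Saving and loading then reduce to ScenarioIO.Save(scenario, "scenario.xml") and ScenarioIO.Load("scenario.xml"), calls which the instructor tool and the training applications could share.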
The second tool provided to instructors allows them to check a student's progress by viewing the results after the session has ended. This progress is, for now, kept in a simple text-based file logging every timed action. An example of a very short session is shown in [Figure 8]. This feedback complements the feedback that can be given while the student is training in VR and could easily be exported to a database or grading tool if necessary.
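As an indication of how lightweight such logging can be, the following C# sketch writes one tab-separated line per timed action; the file naming and line format are our own illustration and not necessarily the format shown in [Figure 8]:

```csharp
using System;
using System.IO;

// Minimal sketch of a timed action logger: one line per student action.
public class SessionLogger
{
    private readonly StreamWriter _writer;
    private readonly DateTime _start = DateTime.Now;

    public SessionLogger(string studentId, string scenarioId)
    {
        // Hypothetical file naming, e.g. "log_jdoe_iv-paracetamol-01_20181228_101500.txt"
        string file = $"log_{studentId}_{scenarioId}_{_start:yyyyMMdd_HHmmss}.txt";
        _writer = new StreamWriter(file) { AutoFlush = true };
        Log("SESSION_START");
    }

    // Records an action with the elapsed time since the session started.
    public void Log(string action, bool correct = true)
    {
        TimeSpan elapsed = DateTime.Now - _start;
        _writer.WriteLine($"{elapsed:hh\\:mm\\:ss}\t{action}\t{(correct ? "OK" : "ERROR")}");
    }

    public void Close()
    {
        Log("SESSION_END");
        _writer.Close();
    }
}
```

A logged line would then read, for instance, "00:01:12	Scanned wristband of patient in room 204	OK".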
Hence, in contrast to most VR applications for medical education, our current applications do not aim to simulate complex surgical tasks, train diagnostic skills, or teach anatomy. Instead, ImmersiMed focuses on routine nursing tasks and on performing them in the correct order and in compliance with healthcare procedures.
The first set of trials, comparing different interaction methods for navigation, for preparing and administering medications, and for different forms of feedback, has already been carried out in close collaboration with the Department of Health Care at the Karel de Grote University College, Belgium. The conclusions are being consolidated, and the selected techniques are being implemented and combined into the next release. This release, along with the first version of a smartphone app, will be used to train students starting the Bachelor in Nursing program at the Karel de Grote University College in 2018–2019.
Future Developments
The next phase in the ImmersiMed project will be to create mobile training applications that mimic our computer-HMD setup. This will allow nursing students to practice the same skills whenever and wherever they want and let them learn at their own pace. Of course, the level of detail will not be the same, and interaction with the VE in the mobile app will not be as sophisticated as in our VR setup. However, by working on the same platform, using the same codebase, graphical elements, and scenarios, ImmersiMed intends to achieve a consistent experience across all our applications. Apart from allowing for more hours of practice, the combination of a mobile app and a more advanced setup also has the merit of letting students try, learn, and experience certain scenarios at home before engaging in more advanced lessons using the scarcely available advanced setup. This is expected to increase the quality of training on the more detailed aspects of the simulation and to increase students' confidence in the routines and procedures they are trying to master. Of course, these expectations will have to be confirmed in future research.
Another future development will be more detailed instructor feedback. Currently, instructor feedback is limited to what can be shown directly in the HMD application and to the simple text-based logging of the student's actions, as described in the previous section. An instructor can follow the entire simulation on a separate screen and can thus see the student's every action when present. By enlarging the amount of tracked data, nursing teachers should be able to check which students have been training and how well they have developed their skills, both at home and on the advanced simulator, even when the teacher is not around. This information could also be used to show the student's progress to both instructor and student, for example, through statistics on how correctly and how quickly procedures were performed and on the number of mistakes made over time. Furthermore, it could be used to check whether students have prepared themselves well enough to take part in training on the advanced setup.
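To illustrate the kind of aggregation meant here, the sketch below (which assumes the hypothetical tab-separated log format sketched earlier) computes a session's duration and error count; richer statistics, such as trends over multiple sessions, could be built on top of it:

```csharp
using System;
using System.IO;
using System.Linq;

public static class ProgressStats
{
    // Summarize one session log: total duration, number of actions, number of errors.
    public static (TimeSpan duration, int actions, int errors) Summarize(string logFile)
    {
        var lines = File.ReadAllLines(logFile)
                        .Select(l => l.Split('\t'))
                        .Where(parts => parts.Length >= 3)
                        .ToList();

        TimeSpan duration = lines.Count > 0
            ? TimeSpan.Parse(lines.Last()[0])    // elapsed time of the last logged action
            : TimeSpan.Zero;

        int errors = lines.Count(parts => parts[2] == "ERROR");
        return (duration, lines.Count, errors);
    }
}
```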
The next phase of ImmersiMed applications for students will start AR development. AR smartphone and high-end AR device applications will be developed to help students bridge the gap between achieving a skill in a virtual training context and actual competence in a real-world hospital. This could be realized by showing them the steps of procedures, visualizing how medications should be prepared, warning them in case of mistakes, etc. Another way ImmersiMed's AR applications may be able to bridge that gap is by letting students train in real medical environments, with real medical equipment, but on virtual patients.
In a parallel path, next to the student-focused applications, our team intends to develop AR tools for medical staff in general, aiming at reducing mistakes. All elements of our student-focused AR applications can be reused and extended with, for example, facial recognition, tools for scanning patient ID tags, and couplings to hospital medical tracking systems and patient chart databases, realizing a system that reduces medical errors. As high-end AR device prices start to drop in the near future, equipping every healthcare worker with such a tool is no longer just a sci-fi scenario; on the contrary, it is a plausible near-future reality.
Finally, one of our future developments will be storing all medication preparation tracking data. Preparing medications is one of the most important tasks of the nursing staff. However, it is error-prone and takes up a considerable share of nurses' time. As the shortage of trained medical staff is increasing in most western regions, their workload needs to be optimized. By collecting all motion controller data, robots or cobots could be trained to prepare, or assist with preparing, medications for patients, taking over these tasks from human nurses and creating more time for tasks that robots will not be able to do in the short term. This part of ImmersiMed will be developed in conjunction with the Op3Mech[18] research group at the University of Antwerp. Other future ImmersiMed opportunities include remote AR medical assistance applications that let untrained bystanders receive instructions from emergency services, remote-controlled nursing robots for highly contagious patients, and many more.
The technical challenges in these future developments are numerous and addressing them all would be beyond the scope of this paper. The most urgent ones include:
- Creating a consistent experience across mobile and more advanced setups. This will be met by developing on a single platform and codebase wherever possible. Unity allows for cross-platform development, and 3D models can be designed in different levels of detail. Code that selects the appropriate models at startup, depending on the computing and graphical capabilities of the device, will realize the optimal result on every device (a minimal sketch is given after this list)
- Developing interactions for mobile and AR platforms with limited motion controller support. As ImmersiMed's smartphone applications should be available to students at all times, the burden of adding, buying, or configuring extra peripherals should be kept to a minimum. Interactions in AR have already been studied and many techniques exist. Most of them use computer vision-based gesture recognition,[19] a field of research that has received a lot of attention and has progressed enormously over the past decade. Most AR devices and smartphones are equipped with cameras nowadays, which opens up possibilities for a whole range of controller-free interactions. Gaze control[20] is another intuitive and more lightweight way of interacting in AR (a second sketch after this list illustrates the idea)
- Allowing new graphical and technical elements to be included for new scenarios. To tackle this issue, Unity's asset bundle mechanism will be used (also shown in the first sketch after this list). Of course, this may require our XML scenario description to be expanded to incorporate new assets as well. Another option would be to develop a simple, easy-to-use visual scene editor that would allow nursing teachers to create their own VE and scenario, all in one tool. This would generate the XML containing the VE description, the interactions, and the scenario and tasks that the student has to carry out. In future versions, ImmersiMed might also let instructors create scenarios by performing the tasks themselves while their choices and actions are recorded, making scenario creation even more intuitive for instructors.
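Relating to the first and third challenges above, the following Unity C# sketch shows how a quality level and a model variant could be chosen at startup from the reported hardware characteristics and loaded from a per-level asset bundle; the thresholds, bundle names, and asset names are hypothetical:

```csharp
using System.IO;
using UnityEngine;

// Sketch: choose graphics quality and a model variant at startup based on the device.
// Thresholds, bundle names, and asset names are illustrative assumptions.
public class StartupAssetSelector : MonoBehaviour
{
    void Awake()
    {
        // Rough heuristic: use reported GPU memory to pick a detail level (0 = low).
        int vramMb = SystemInfo.graphicsMemorySize;
        int level = vramMb >= 4096 ? 2 : (vramMb >= 1024 ? 1 : 0);

        // Assumes the project defines at least three quality levels.
        QualitySettings.SetQualityLevel(level, applyExpensiveChanges: true);

        // Load the matching environment from an asset bundle built per detail level.
        // Path handling differs per platform; this is the simplest desktop case.
        string[] suffixes = { "low", "medium", "high" };
        string bundlePath = Path.Combine(Application.streamingAssetsPath,
                                         "hospital_" + suffixes[level]);
        AssetBundle bundle = AssetBundle.LoadFromFile(bundlePath);
        if (bundle != null)
        {
            GameObject prefab = bundle.LoadAsset<GameObject>("HospitalRoom");
            if (prefab != null)
                Instantiate(prefab);
        }
    }
}
```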
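For the controller-free interactions mentioned in the second challenge, gaze selection can be prototyped with a simple ray cast from the viewing camera, as in the sketch below; the dwell time and the OnGazeSelect message are illustrative choices, not an existing ImmersiMed interface:

```csharp
using UnityEngine;

// Sketch: select the object at the center of the user's view after a short dwell time.
public class GazeSelector : MonoBehaviour
{
    public float dwellSeconds = 1.5f;     // hypothetical dwell threshold
    private GameObject _target;
    private float _gazeTime;

    void Update()
    {
        // Cast a ray straight ahead from the camera this script is attached to.
        Ray ray = new Ray(transform.position, transform.forward);
        if (Physics.Raycast(ray, out RaycastHit hit, 10f))
        {
            if (hit.collider.gameObject == _target)
            {
                _gazeTime += Time.deltaTime;
                if (_gazeTime >= dwellSeconds)
                {
                    // Notify the gazed-at object; it may implement OnGazeSelect.
                    _target.SendMessage("OnGazeSelect", SendMessageOptions.DontRequireReceiver);
                    _gazeTime = 0f;
                }
            }
            else
            {
                _target = hit.collider.gameObject;
                _gazeTime = 0f;
            }
        }
        else
        {
            _target = null;
            _gazeTime = 0f;
        }
    }
}
```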
Most of these solutions are currently under development and will be tested in the next phase of the ImmersiMed project, in close collaboration with the Bachelor in Nursing program at the Karel de Grote University College.
Conclusions
In this work, we present our vision and work-in-progress on the ImmersiMed platform. ImmersiMed is a combined VR/AR platform aimed at educational and medical institutions for training nurses, doctors, and other medical personnel. ImmersiMed and its applications are being developed on top of the Unity multi-platform engine. All developments are done with extensibility, future applications, and hardware evolutions in mind. By creating consistent experiences across different setups and applications, we intend to increase simulation availability, improve the quality of training, enhance students' preparation for more advanced tasks, and boost their confidence. We presented the first set of ImmersiMed applications, aimed at training nursing students in routine tasks and providing them with feedback on how well they performed. Tools for educators provide the possibility to create new scenarios without the intervention of expensive content creators or programmers. Finally, we discussed future options for ImmersiMed, suggested how its AR applications can ease the transition from virtual training environments to the real world, and highlighted the many other opportunities the platform offers to improve medical training while lowering training and simulation costs. Proving these assumptions will be part of further research.
Acknowledgment
We gratefully acknowledge the support of the Department of Health Care at the Karel de Grote University College, especially Rik Depauw and Dieter Smis, for their input and feedback. Furthermore, we would like to thank the third-year students in Multimedia Technology at the Karel de Grote University College specializing in Virtual and 3D for their work on the 3D models and their support in developing the first set of applications.
Financial support and sponsorship
Nil.
Conflicts of interest
There are no conflicts of interest.
References
1.
2.
3.
4.
5.
6. Gupta K, Anand DK, Brough JE, Schwartz M, Kavetsky RA. Training in Virtual Environments: A Safe, Cost-Effective, and Engaging Approach to Training. College Park, Maryland: CALCE EPSC Press; 2008.
7. Barsom EZ, Graafland M, Schijven MP. Systematic review on the effectiveness of augmented reality applications in medical training. Surg Endosc 2016;30:4174-83.
8. Khor WS, Baker B, Amin K, Chan A, Patel K, Wong J, et al. Augmented and virtual reality in surgery-the digital surgical environment: Applications, limitations and legal pitfalls. Ann Transl Med 2016;4:454.
9. Ma M, Fallavollita P, Seelbach I, Von Der Heide AM, Euler E, Waschke J, et al. Personalized augmented reality for anatomy education. Clin Anat 2016;29:446-53.
10. Kamphuis C, Barsom E, Schijven M, Christoph N. Augmented reality in medical education? Perspect Med Educ 2014;3:300-11.
11. Jones F, Passos-Neto CE, Braghiroli OF. Simulation in medical education: Brief history and methodology. PPCR 2015;1:56-63.
12. Okuda Y, Bryson EO, DeMaria S Jr., Jacobson L, Quinones J, Shen B, et al. The utility of simulation in medical education: What is the evidence? Mt Sinai J Med 2009;76:330-43.
13. Weller JM. Simulation in undergraduate medical education: Bridging the gap between theory and practice. Med Educ 2004;38:32-8.
14. Rosen KR, McBride JM, Drake RL. The use of simulation in medical education to enhance students' understanding of basic sciences. Med Teach 2009;31:842-6.
15.
16.
17.
18.
19. Haria A, Subramanian A, Asokkumar N, Poddar S, Nayak JS. Hand gesture recognition for human computer interaction. Procedia Comput Sci 2017;115:367-74.
20. Nilsson S. Interaction Without Gesture or Speech – A Gaze Controlled AR System. 17th International Conference on Artificial Reality and Telexistence (ICAT 2007). Esbjerg, Jylland; 2007.