4:00pm - 4:30pm
An Immersive System for Exploring and Measuring Medical Image Data
Patrick Saalfeld, Johannes Patzschke, Bernhard Preim
Otto-von-Guericke-Universität Magdeburg, Germany
We present an immersive system to explore and measure medical 2D and 3D data for treatment planning. Our focus lies on interaction techniques oriented toward the treatment planning workflow. We use an Oculus Rift DK2 head-mounted display as the output device. For input, we use the stylus of a zSpace and implement ray-based interaction techniques. Our system is designed to be used in a sitting position at a desk, similar to a physician's treatment planning environment. The stylus is held like a pen and allows precise input, which is important for measurements. We evaluated our prototype with computer scientists, who performed exploratory and measurement tasks on medical 2D and 3D data. We assessed task completion time, usability, and perceived presence, and collected informal feedback. Our results show that our exploration and measurement techniques are fast, easily understood, and support presence. However, tracking losses due to infrared interference have to be resolved before further studies with physicians.
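To give a concrete impression of the ray-based interaction mentioned above, the following Python sketch casts a stylus ray onto an image plane and measures the distance between two picked points. The plane-based picking, function names, and units are illustrative assumptions on our part, not details taken from the paper.

    # Minimal sketch of ray-based point picking for distance measurement.
    # Assumes a tracked stylus pose (ray origin + direction); names are
    # illustrative, not from the paper.
    import numpy as np

    def intersect_ray_plane(origin, direction, plane_point, plane_normal):
        """Return where a ray hits a plane, or None if it runs parallel."""
        denom = np.dot(direction, plane_normal)
        if abs(denom) < 1e-9:
            return None
        t = np.dot(plane_point - origin, plane_normal) / denom
        return origin + t * direction if t >= 0 else None

    def measure_distance(p1, p2):
        """Euclidean distance between two picked points (scene units, e.g. mm)."""
        return float(np.linalg.norm(p2 - p1))

    # Usage: two stylus "clicks" cast rays onto an axial slice at z = 0.
    slice_point = np.zeros(3)
    slice_normal = np.array([0.0, 0.0, 1.0])
    eye = np.array([0.0, 0.0, 30.0])
    a = intersect_ray_plane(eye, np.array([0.1, 0.0, -1.0]), slice_point, slice_normal)
    b = intersect_ray_plane(eye, np.array([-0.1, 0.2, -1.0]), slice_point, slice_normal)
    print(f"measured distance: {measure_distance(a, b):.1f} mm")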
4:30pm - 5:00pm
Touchless Measurement of Medical Image Data for Interventional Support
Patrick Saalfeld1, Dominique Kasper2, Bernhard Preim1, Christian Hansen2
1Visualization Group, Otto-von-Guericke-Universität Magdeburg, Germany; 2Computer-assisted Surgery Group, Otto-von-Guericke-Universität Magdeburg, Germany
The preservation of sterility is essential during interventions. Based on interviews with physicians and observed interventions, we derive requirements for touchless distance measurements. We present interaction techniques to perform these measurements on medical 2D image data and 3D planning models using the Leap Motion Controller. A comparative user study with three medical students and eleven non-medical participants was conducted, comparing freehand gesture control with established, but non-sterile, mouse and keyboard control. We assessed time, accuracy, and usability during 2D and 3D distance measurement tasks. Freehand gesture control performed worse than mouse and keyboard control. However, we observed a steep learning curve with strong improvement for gesture control, indicating that longer training could make this input modality competitive. We discuss whether the sterility advantage of gesture control can compensate for its inferior performance.
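As a rough illustration of such a touchless measurement, the Python sketch below confirms two fingertip positions (for instance on pinch events delivered by a hand tracker such as the Leap Motion SDK) and converts the resulting pixel distance to millimeters via the image's pixel spacing. The pinch-based confirmation and all names are assumptions for illustration, not the authors' implementation.

    # Sketch of touchless two-point distance measurement. A hand tracker is
    # assumed to deliver a fingertip position whenever a pinch is detected;
    # the state handling below is illustrative only.
    import math

    class DistanceMeasurement:
        def __init__(self):
            self.points = []                 # up to two confirmed points

        def on_pinch(self, fingertip):
            """Confirm the current fingertip position as a measurement point."""
            if len(self.points) == 2:
                self.points.clear()          # a third pinch starts a new measurement
            self.points.append(fingertip)

        def distance_mm(self, pixel_spacing=1.0):
            """Distance between both points; pixel_spacing converts px to mm in 2D."""
            if len(self.points) < 2:
                return None
            return pixel_spacing * math.dist(self.points[0], self.points[1])

    m = DistanceMeasurement()
    m.on_pinch((120.0, 85.0, 0.0))           # first pinch on the image
    m.on_pinch((180.0, 85.0, 0.0))           # second pinch
    print(f"{m.distance_mm(pixel_spacing=0.7):.1f} mm")   # 0.7 mm per pixel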
5:00pm - 5:30pm
Foot Interaction Concepts to Support Radiological Interventions
Benjamin Hatscher, Maria Luz, Christian Hansen
Otto-von-Guericke-Universität Magdeburg, Germany
During neuroradiological interventions, physicians need to interact with medical image data, which is not possible while their hands are occupied. We propose foot input concepts with one degree of freedom, which matches a common interaction task in the operating room. We conducted a study comparing our concepts with regard to task completion time, subjective workload, and user experience. Relative input performed significantly better than absolute or rate-based input. Our findings may enable more effective computer interaction in the operating room and similar domains where the hands are not available.
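The three input mappings compared in the study can be sketched as simple transfer functions, applied here, purely for illustration, to scrolling through an image stack with a one-degree-of-freedom pedal. The pedal signal, gains, and names are our assumptions.

    # Sketch of the three one-DOF input mappings the study compares
    # (relative, absolute, rate-based); the pedal values are illustrative.

    def relative_update(slice_idx, pedal_delta, gain=1.0):
        """Relative: pedal movement changes the slice index by a scaled delta."""
        return slice_idx + gain * pedal_delta

    def absolute_update(pedal_pos, n_slices):
        """Absolute: pedal position (0..1) maps directly onto the stack."""
        return pedal_pos * (n_slices - 1)

    def rate_based_update(slice_idx, pedal_pos, dt, max_speed=20.0):
        """Rate-based: pedal deflection from center sets a scrolling velocity."""
        velocity = (pedal_pos - 0.5) * 2.0 * max_speed   # slices per second
        return slice_idx + velocity * dt

    # One update step per mapping (rate-based at 60 Hz):
    print(relative_update(slice_idx=40.0, pedal_delta=2.0))           # 42.0
    print(absolute_update(pedal_pos=0.25, n_slices=200))              # 49.75
    print(rate_based_update(slice_idx=40.0, pedal_pos=0.8, dt=1/60))  # 40.2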