This page is a copy of research/systems_and_control/biomed/haptic (Wed, 31 Aug 2022 15:08:02)
Whole-hand haptics with true 3D display
This new project is a multi-disciplinary project led by Ingrid Carlbom and Ewert Bengtsson at the Center for Image Analysis. Our group is responsible for the real-time tracking of the hand. This is an essential part of the whole project, as the position of each finger and the palm, relative to the displayed virtual 3D object, must be known with high accuracy and low latency in order to generate the haptic feedback.
From the presentation of the project on the web page of the main funding agency, KK-stiftelsen:
- We envision a system with true haptics for interfacing with a true 3D display of virtual models. It will provide an unprecedented experience allowing the user to touch and manipulate high contrast, high resolution, three-dimensional (3D) virtual objects suspended in space using a glove that gives such realistic whole hand haptic feedback that the interaction closely resembles interaction with real objects using a bare hand. The system will allow multiple users simultaneous access to the same virtual object in a fully lit room, providing a natural environment for collaboration on complex tasks. Such a system has numerous applications. This project is thus highly multidisciplinary, and we have assembled a team of world-class researchers from computer science, image analysis and computer graphics, engineering, material science, neurophysiology, and psychology. We have established contact with several companies in Sweden that are at the forefront in key technologies and have expressed an interest in participating in the project. These companies would manufacture components for this system or would utilize some of the components to enhance their current product lines. This project touches upon several of KK-stiftelsen's particular interest areas. It involves real-time visualization of complex models and it is a multi-modal application which invokes two of our senses: sight and touch. And it represents a new type of augmented reality mixing real and virtual data: real hands are used to manipulate virtual objects.
In order to know when any part of the hand is in contact with the virtual object, we need to track all parts of the hand in 3D with millimeter precision and millisecond response time. The most promising technology is based on cameras with infrared illumination that track small retro-reflective markers attached, on the outside of the glove, to each segment of the fingers. By using several cameras viewing the hand from different directions, the exact 3D position of each marker can be determined as the hand moves and turns. Systems of this type are used for motion capture in medical and industrial applications as well as in the computer game and special-effects film industries, and they are commercially available from several companies, e.g. Qualisys. One of the commercial partners in the project, the animation and motion capture studio Imagination Studios (http://imaginationstudios.com), will provide access to their motion capture system.
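The multi-camera reconstruction described above can be illustrated with a minimal sketch of linear triangulation (the standard DLT method), which is one common way such systems recover a marker's 3D position from its 2D image coordinates in two calibrated cameras. The projection matrices and marker coordinates below are made-up example values, not project data:

```python
# Illustrative sketch only: linear (DLT) triangulation of one marker seen by
# two calibrated cameras. P1, P2 are assumed 3x4 camera projection matrices.
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Return the 3D point whose projections best match pixels x1, x2.

    Each image observation (u, v) contributes two linear constraints of the
    form u * P[2] - P[0] = 0; the stacked system is solved by SVD.
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                    # null-space vector = homogeneous 3D point
    return X[:3] / X[3]           # dehomogenize

# Two toy cameras: one at the origin, one translated 1 m along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.2, 0.1, 2.0])            # marker position (meters)
x1 = P1 @ np.append(X_true, 1.0); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(X_true, 1.0); x2 = x2[:2] / x2[2]
print(triangulate(P1, P2, x1, x2))            # recovers X_true (up to noise)
```

With more than two cameras, each extra view simply adds two more rows to the same linear system, which also improves robustness when one camera loses sight of a marker.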
Tracking with markers may still cause problems, mainly as a result of occlusion, but a detailed biomechanical model of the hand can be used to disambiguate situations where markers leave and re-enter the cameras' field of view. The millisecond response time required for realistic haptics makes the task particularly challenging. If necessary, the optical tracking can be supplemented by miniature accelerometers in the glove that keep track of the directions in which the different finger segments are moving.
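To give a feel for how a biomechanical model helps with occlusion, here is a deliberately over-simplified sketch: one finger modeled as a planar kinematic chain, whose forward kinematics predict where an occluded fingertip marker should be, so that a reappearing marker can be matched to the prediction. The segment lengths, joint angles, and candidate marker positions are invented example values; a real hand model would be 3D, with joint-angle limits and many more degrees of freedom:

```python
# Illustrative sketch only: a finger as a planar kinematic chain. Forward
# kinematics from the last known joint angles predict marker positions
# during occlusion; reappearing markers are matched to the predictions.
import numpy as np

def finger_marker_positions(base, lengths, angles):
    """2D position of the marker at the end of each finger segment.

    base: (x, y) of the knuckle; lengths: segment lengths (mm);
    angles: joint angles (rad), each relative to the previous segment.
    """
    pts = []
    pos = np.asarray(base, dtype=float)
    heading = 0.0
    for length, angle in zip(lengths, angles):
        heading += angle                       # accumulate joint rotations
        pos = pos + length * np.array([np.cos(heading), np.sin(heading)])
        pts.append(pos.copy())
    return pts

# Predict the fingertip marker from the last known pose (example values) ...
predicted = finger_marker_positions((0.0, 0.0),
                                    lengths=[40.0, 25.0, 20.0],
                                    angles=[0.3, 0.4, 0.3])
tip = predicted[-1]

# ... then assign the reappearing marker closest to the prediction.
candidates = np.array([[70.0, 55.0], [10.0, -5.0]])   # detected markers (mm)
best = candidates[np.argmin(np.linalg.norm(candidates - tip, axis=1))]
```

The same idea extends to fusing accelerometer data: the model's predicted motion can be corrected by the measured segment accelerations between camera frames.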
Even without haptic feedback, fast and accurate tracking of the whole hand constitutes a potentially powerful input device, giving high-precision multi-finger control for manipulating virtual objects in 3D.