Haptics research has permeated many disciplines and application areas. The earliest efforts focused on sensory substitution: stimulating the sense of touch to convey imagery or speech for individuals with visual and/or auditory impairments. With the advent of force-feedback devices, there has been renewed interest in using haptic interfaces in teleoperator systems and virtual environments. The successful deployment of haptic interfaces requires continuing advances in hardware design, control, and software algorithms, as well as in our understanding of the human somatosensory system. Priority areas include, but are by no means limited to:
- Devices & technology
- Tactile display and tactile sensing
- Haptic rendering
- Perception & psychophysics
- Haptic cognition
- Multimodal perception
- Sensory guided motor control
- Haptic communication
- Applications in entertainment, medicine, rehabilitation, education, data perceptualization, art, rapid prototyping, remote collaboration, etc.
The "Technical Area Overview Report" for the IEEE Technical Committee on Haptics provides an overview of the current state of the art and of haptics research for the next 10 years. It can be downloaded from here.
An excerpt from the text is available from here:
Important items of science, technology, research and development for the next 10 years
One of the greatest challenges for the field of haptics (and this is not a new issue) has been to find compelling real-world applications that are also commercially viable. Here we outline some of the principal areas that are likely to become mainstream topics of future haptics research.
[HD Haptics and Haptic feedback for touchscreens]
With the prevalence of touchscreens that let users provide input to the computer through their fingertips, there is surprisingly little variety in the touch feedback afforded by touchscreens beyond vibration. We are all aware of the difficulties associated with typing on a soft keyboard on flat glass. It would be awesome if a computer screen could morph itself into the shapes of the virtual objects on the screen, provide key-click feedback when typing on a virtual keyboard, and otherwise provide cues as to the location and identity of the visual contents on the screen. "HD haptics" vibration feedback technology (e.g., Immersion's or Vivitouch's high-fidelity haptics) or Disney's and Senseg's touchscreen-compatible technologies might be the next big technology to break into the consumer domain.
[Haptics for neurorehabilitation]
Robotic devices have been shown to be effective at delivering the intensive, repetitive therapy that is known to induce brain plasticity and foster restoration of motor coordination after stroke, spinal cord injury, and other neural impairments. Engaging the sensorimotor system, including providing haptic feedback to the participant during rehabilitation, is an important factor in regaining motor control. Rehabilitation robotic devices should be effective in terms of both performance and training paradigms (transparency to the user, adaptability, active haptic guidance, control), and this opens the way to new developments. The science and technology of haptics thus has great potential to affect the outcomes of rehabilitation and the adoption of advanced orthotic devices.
[Fingertip force/contact displays]
The fingertip is capable of detecting a wealth of information from static contact with an edge, a rough surface, and so on; yet it is extremely challenging to recreate the force/deformation distribution at the fingertip through electromechanical means. Not only is it difficult to create haptic displays at a small scale that matches the high spatial resolution of the fingertip; it is also challenging to package all the actuators and connectors at a small enough scale for such displays to be portable and practical. The ability to interface with the fingertip directly will open new doors to applications of haptic technology.
[Haptic language for notifications]
Even though it is technologically straightforward to create different vibration patterns with a single actuator, current mobile devices are almost all stuck with a single type of buzzing. What would it take to create a meaningful haptic language, so that we learn to associate certain meanings with a few haptic stimulation patterns, just as we associate meanings with red/yellow/green traffic lights and can recognize a car horn as distinct from a siren? There is so much more we can gain from developing a haptic language for notifications on mobile devices.
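To make the idea concrete, a haptic language for notifications could start as a small vocabulary mapping message categories to distinct on/off vibration patterns, in the style of the millisecond pattern arrays used by mobile vibration APIs. The following is a minimal sketch; the categories and timings are illustrative assumptions, not an established standard.

```python
# Hypothetical haptic "vocabulary": each notification category gets a
# distinct vibration pattern, expressed as alternating
# [vibrate, pause, vibrate, ...] durations in milliseconds.
HAPTIC_VOCABULARY = {
    "message":  [50, 100, 50],              # two short taps: low urgency
    "reminder": [150, 100, 150],            # two medium pulses
    "alarm":    [400, 100, 400, 100, 400],  # three long buzzes: high urgency
}

def pattern_for(category: str) -> list[int]:
    """Return the vibration pattern for a category, defaulting to 'message'."""
    return HAPTIC_VOCABULARY.get(category, HAPTIC_VOCABULARY["message"])

def total_duration_ms(pattern: list[int]) -> int:
    """Total time the pattern occupies, vibration and pauses included."""
    return sum(pattern)
```

The point of such a vocabulary is learnability: a handful of patterns that differ clearly in rhythm and duration, so users can identify the category without looking at the screen.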
[Wearable haptics]
As the paradigm of Google Glass brings augmented reality to the consumer market, wearable haptics refers to haptic devices that are worn on the body, following the body sensor network concept, and that remain relatively transparent to the user. This enables endowing people with a sense of touch for applications involving physical interaction with digital data, such as immersive VR systems. The availability of cheap RGB-D cameras, such as the Kinect, makes it possible to devise new paradigms of human-computer interaction. This area also includes wearable systems that provide navigation cues, such as guidance systems for blind users. Moreover, the haptic channel has already been used to provide additional information cues when the visual and auditory senses are engaged in other tasks, such as driving.
[Exoskeletons]
Robots that can be worn on the body pose research issues in the area of physical human-robot interaction, mainly involving human-in-the-loop issues related to kinesthesia and proprioception. Full-body, lower-limb, and upper-limb exoskeletons can be employed as assistive devices for the elderly or the motion-impaired, as augmentation devices that increase force performance, or as rehabilitation systems for recovering lost gait and manipulation capabilities. Many technological issues in this area still need to be fully addressed.
[Real-Time Realistic Haptic Simulation]
It's hard to predict the future, but one near-certainty is that computing will become faster, smaller, and cheaper. At Moore's-law rates, in 10 years computers could be two orders of magnitude faster, smaller, or cheaper, but they will require new parallel algorithms to exploit them. How is haptics going to exploit this computing power? We may be able to run detailed simulations of complex contact phenomena, with realistic forces, deformations, and sounds. We may be able to build new haptic devices by controlling surface vibrations or dense arrays of tactors. We may be able to produce user-specific simulations that learn the user's dynamics in real time.
[Prosthesis and Neurohaptic interfaces]
Haptic hardware and haptic neuroscience have both made progress and inspired each other in recent years, but have not influenced each other directly. In the next decade, we may be able to stimulate the central nervous system more directly to produce haptic experiences. This could involve cortical implants, peripheral nerve stimulators, or even traditional mechanical interfaces that explicitly exploit knowledge of force production in muscles and force transduction in tactile and kinesthetic mechanoreceptors. Further, haptic feedback can enhance the natural control, utility, and efficacy of advanced prosthetic and orthotic devices that restore mobility and manipulability to lower- and upper-extremity amputees. Advanced prosthetic devices, for example, have decoupled the normal afferent-efferent loop and rely heavily on visual feedback to the amputee for control in the absence of haptics. Upper-limb amputees must also cope with the critical loss of sensation. For these reasons, nearly half of upper-limb amputees choose not to use prostheses, functioning instead with one good arm; by contrast, almost all lower-limb amputees use prosthetic legs.
[Telepresence and X-Human Haptics applications]
Applications of this technology include, but are not limited to, teleoperation, remote handling, and telepresence. Teleoperation is currently regaining interest for hazardous environments (e.g., space, the nuclear industry, natural or industrial disasters), not to mention its relevance to surgery and to inaccessible worlds such as micro/nano-manipulation. Telepresence and telexistence systems might also be used for mediated interaction with other agents or objects in remote environments, including collaborative tasks with haptic feedback and affective haptics in social applications.
[Haptics in entertainment]
Haptics is a relevant field for the entertainment industry. For instance, Disney is currently developing a tactile full-body suit for its theme parks (Disney's Surround Haptics). Haptics for entertainment poses two main challenges: on the one hand, haptic systems must be inexpensive and easy to install and use; on the other hand, haptic content must be provided. In particular, generating realistic and enjoyable haptic content for films is a challenge. In the future, "haptic cameras" could record this additional content directly during shooting.
[Locomotion interfaces]
Walking is an important issue in haptics research. Locomotion interfaces create a sense of walking in virtual environments, and treadmill-based locomotion devices are a major achievement in this area. Research activities on locomotion interfaces have been carried out all over the world. Other methods, such as the Robot Tile, will be a focus over the next 10 years.
[Physical Human-Robot Haptic Interaction]
Haptics is often used as a means not only to interact with but also to communicate with robots. This is often called "physical human-robot haptic interaction"; it is not new, but there has been continuous research progress in this area. There are many research issues, such as how to achieve swift and effective interaction while maintaining intrinsic safety, and how to understand human intention based on biological signals or kinesthetic information.