Workshops

  • Hand synergies - how to tame the complexity of grasping

    This workshop has been proposed and is expected to be held at ICRA 2013.

     

    The aim of this workshop is to disseminate the scientific activities and results being pursued in the EU Project "THE: The Hand Embodied", coordinated by Prof. Antonio Bicchi, and to discuss them with experts in related fields. We believe that the workshop topics involve disciplines and application areas which
     
    WORKSHOP ABSTRACT

    The human hand is a cognitive organ capable of extremely dexterous actuation and hyper-acute sensory function. With over 30 effective degrees of freedom, actuated by at least as many muscles through a complicated network of ligaments, and with sensation gathered from diverse tactile receptors, the hand is a machinery for which the central nervous system must have developed strategies that simplify control and the interpretation of sensory information. These strategies are captured by a single notion, the so-called “synergies”, which implies the efficient use of multiple, possibly diverse agents such as muscles, joints, fingers, receptors and sensory modalities.
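
    One common computational reading of such synergies, in the spirit of principal-component analyses of grasp postures, is that a few components capture most of the variance of many recorded joint angles. The sketch below is only illustrative: the joint-angle data are random placeholders, not measurements from the project.

    ```python
    # Illustrative sketch: extracting postural "synergies" as principal components
    # of recorded hand joint angles. The data here are random placeholders.
    import numpy as np

    rng = np.random.default_rng(0)
    n_postures, n_joints = 200, 20          # hypothetical grasp postures x joint angles
    angles = rng.normal(size=(n_postures, 5)) @ rng.normal(size=(5, n_joints))
    angles += 0.05 * rng.normal(size=angles.shape)   # low-rank structure plus noise

    centered = angles - angles.mean(axis=0)
    # SVD of the centered data: right singular vectors are the synergy directions
    U, s, Vt = np.linalg.svd(centered, full_matrices=False)
    explained = (s ** 2) / np.sum(s ** 2)

    k = 3                                    # keep the first few synergies
    synergies = Vt[:k]                       # (k, n_joints) basis of joint couplings
    coefficients = centered @ synergies.T    # per-posture activation of each synergy

    print(f"variance explained by {k} synergies: {explained[:k].sum():.2%}")
    ```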

    Humans are astonishingly talented at organizing the diverse elements of their body machinery into sensorimotor synergies. Within this full-day workshop we will look into this capability in search of inspiration for the development of robotic hands beyond the state of the art. We aim to bring the concept of synergy, which originates in neurophysiological studies, into the field of robotics, with a main focus on the design of robotic hands, prosthetic devices and haptic interfaces. It is in fact by further refining and improving upon the synergy principles that we strive towards creating robots that are capable of complex manipulation skills in real-world environments.

    MOTIVATION AND OBJECTIVES

    In this workshop we want to study how the embodied characteristics of the human hand and its sensors, the sensorimotor transformations, and the very constraints they impose affect and determine the learning and control strategies we use for such fundamental cognitive functions as exploring, grasping and manipulating. By learning from human data and from hypothesis-driven simulations, we ask how improved system architectures for the “hand” as a cognitive organ can be devised. This knowledge is extremely useful for designing and controlling new and improved robot hands, haptic interfaces, and hand prostheses.

    The workshop hinges on the conceptual structure and the geometry of such enabling constraints, or synergies: dependencies in redundant hand mobility (motor synergies), dependencies in the readings of redundant cutaneous and kinaesthetic receptors (multi-cue integration), and overall sensorimotor synergies.
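
    For the multi-cue integration mentioned above, a standard textbook model (an assumption here, not necessarily the model adopted in the workshop) is maximum-likelihood fusion of independent Gaussian cues, each weighted by its inverse variance. A minimal sketch with arbitrary numbers:

    ```python
    # Minimal sketch of maximum-likelihood cue combination: two noisy estimates of
    # the same property (e.g., object size from cutaneous vs. kinaesthetic cues)
    # are fused with inverse-variance weights. All values are arbitrary placeholders.

    def fuse(estimates, variances):
        """Return the ML fused estimate and its variance for independent Gaussian cues."""
        weights = [1.0 / v for v in variances]
        total = sum(weights)
        fused = sum(w * e for w, e in zip(weights, estimates)) / total
        return fused, 1.0 / total

    size_hat, size_var = fuse(estimates=[4.2, 3.8], variances=[0.25, 0.10])
    print(f"fused estimate: {size_hat:.2f} (variance {size_var:.3f})")
    ```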

    Embodiment is key to characterizing human skills with respect to grasping and manipulation capabilities. Since merely mimicking human behavior is not a route worth pursuing (it is not possible to replicate the complexity of the embodied human hand), we search for the simplifying strategies governing human motor behaviour, i.e. an abstraction layer through which effective lessons can then be taught to robotic systems.

    In the workshop, leading experts from the various fields will come together to present the underlying fundamental ideas. The workshop is a platform to identify and reveal aspects of neuroscience and learning that can be used profitably in robotic grasping, prosthetics, haptics, and rehabilitation. Topics to be addressed include, but are not limited to:

    1. How do synergies help to control such complicated many-DoF systems as hands? (A minimal coupling sketch follows this list.)

    2. How do synergies help to perceive properties of external objects through the noisy and diverse sensory system we possess?

    3. What are the limitations of the soft/adaptive synergies as implemented in compliant low-DoF grasping devices?

    4. What is the trade-off between achievable tasks and number of motor synergies (actuators)?

    5. What is the role of tactile information in the sensory synergies, and how can low-DoF hands benefit from it?
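
    As noted under question 1, the usual way a low-DoF device exploits motor synergies is a linear coupling that maps a few synergy inputs onto many joint angles. The following sketch only illustrates that mapping with made-up values; it is not the control law of any particular hand discussed in the workshop.

    ```python
    # Illustrative synergy coupling: a few synergy inputs z drive many joint angles q
    # through a fixed coupling matrix S (q = q_rest + S @ z). All values are made up.
    import numpy as np

    n_joints, n_synergies = 20, 2
    rng = np.random.default_rng(1)

    q_rest = np.zeros(n_joints)                    # hypothetical rest posture
    S = rng.normal(size=(n_joints, n_synergies))   # coupling (synergy) matrix

    z = np.array([0.8, -0.3])                      # two actuator / synergy commands
    q = q_rest + S @ z                             # resulting 20-joint posture

    print(q.round(2))
    ```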

     

    Organizer : 
    Marco Gabiccini, Alexander V. Terekhov, Thomas Wimboeck
    Location : 
    Karlsruhe, Germany
    Date: 
    Monday, May 6, 2013 to Friday, May 10, 2013
    Material Available: 
    No
    WorkshopType: 
    Workshop
    Area: 
    Europe
    IEEE Sponsored: 
    Yes
  • HAID 2012 - The Seventh International Workshop on Haptic and Audio Interaction Design (HAID)

    The combination of haptic and audio for interaction design is a challenging research
    area, and we invite researchers and practitioners interested in these non-visual
    modalities to come to HAID to exchange designs and research findings. This year's
    HAID has a particular (but not exclusive) focus on the mobile setting - while on the
    move the haptic and audio combination has great (but sadly under-exploited)
    potential. More non-visual interaction designs will make applications and devices
    easier to use for everyone.
    We invite contributions on the appropriate use of haptics and audio in interaction
    design: how do we design effectively for mobile interaction? How can we design
    effective haptic, audio and multimodal interfaces? In what new application areas can
    we apply these techniques? Are there design methods that are useful? Or evaluation
    techniques that are particularly appropriate? We also welcome artistic exhibits and
    commercial design cases for our exhibition.

    HAID12 is a direct successor to the successful workshop series inaugurated in
    Glasgow in 2006 and continued in Seoul in 2007, Jyväskylä in 2008, Dresden in 2009,
    Copenhagen in 2010 and Kyoto in 2011. The aim of HAID12 is to bring together researchers and
    practitioners who share an interest in finding out how the haptic and audio
    modalities can be used together in human computer interaction. The research
    challenges in the area are best approached through user-centred design, empirical
    studies or the development of novel theoretical frameworks.
    We invite your papers, posters, demonstrations and exhibits/design cases on these
    topics, and look forward to seeing you in Lund in August 2012!

    Topics
    Contributions are welcomed in (but not limited to) the following areas (please note
    that in all of these areas both theoretical and empirical approaches are
    encouraged):
    • Novel haptic, audio and multimodal interfaces and interactions
    • Evaluating multimodal interactions, especially in real contexts
    • Design principles for multimodal user interfaces
    • Multimodal visualisations
    • Affective roles of haptics and audio in interaction
    • Cross-modal interactions
    • Auditory and haptic displays for visually impaired people
    • Safety-critical multimodal applications (monitoring, controlling, alarming)
    • Designing haptics and audio for touch screens
    • Multimodal gaming and entertainment
    • Interaction in physical exercise
    • Collaborative multimodal systems
    • Mobile multimodal interactions
    • Emulation and simulation of the real world with audio-haptic design
    • Novel systems and interactions using other modalities (e.g. taste, smell)

    Organizer : 
    Charlotte Magnusson
    Location : 
    Lund, Sweden
    Date: 
    Thursday, August 23, 2012 to Friday, August 24, 2012
    Material Available: 
    No
    WorkshopType: 
    Workshop
    Area: 
    Europe
    IEEE Sponsored: 
    No
  • Machine Learning Methods for Human-Computer Interaction

    Abstract

    In this tutorial, I will cover various machine learning methods for pattern recognition at an overview level, illustrated with case studies mostly taken from haptics applications, and will further lay out the space covered by other methods without reviewing them specifically. I will focus on basic statistical pattern recognition methods for supervised learning, namely Bayesian decision theory, linear discriminant, and k-nearest neighbor methods, emphasizing the distinction between generative and discriminative approaches. I will close by mentioning commonly used extensions of the introduced methods and by providing resources for the participants to follow up with. I will also provide some guidelines on parameter selection and optimization for the classifiers, which is still an open research problem in pattern recognition.
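
    To give a concrete flavour of the listed methods, here is a minimal k-nearest-neighbor classifier on invented two-dimensional "gesture" features; it is not tutorial material, and the data, labels and feature choice are assumptions made purely for illustration.

    ```python
    # Minimal k-nearest-neighbor classifier on made-up 2-D "gesture" features
    # (e.g., mean acceleration and duration). Data and labels are invented.
    import numpy as np

    rng = np.random.default_rng(2)
    # two hypothetical gesture classes, 30 training samples each
    class0 = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(30, 2))
    class1 = rng.normal(loc=[2.0, 1.5], scale=0.5, size=(30, 2))
    X_train = np.vstack([class0, class1])
    y_train = np.array([0] * 30 + [1] * 30)

    def knn_predict(x, X, y, k=5):
        """Label x by majority vote among its k nearest training samples."""
        distances = np.linalg.norm(X - x, axis=1)
        nearest = y[np.argsort(distances)[:k]]
        return np.bincount(nearest).argmax()

    print(knn_predict(np.array([1.8, 1.2]), X_train, y_train))   # most likely class 1
    ```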

    Motivation

    With the development of on-body computing devices such as smartphones, human-computer interaction (HCI) has become one of the most popular research areas in computer science. Such devices have several built-in sensors that can acquire data about the motion patterns of the user. On a smaller scale, user hand gestures provide a means to communicate with the computer. Similar means of communication also exist in haptics. For example, the communication patterns of a human interacting with a robot through a haptic channel can inform the robot about the past and future states of the user. Through intelligent processing of human motion data, both at large scale (body motion) and small scale (hand gestures), the computer/robot can either make decisions about the past activity state of the user or predict his/her future intentions. For this purpose, knowledge of machine learning methods is essential.

    Organizer : 
    Kerem Altun
    Location : 
    Vancouver, CA
    Date: 
    Sunday, March 4, 2012
    Material Available: 
    No
    WorkshopType: 
    Tutorial
    Area: 
    North America
  • Integration of Haptics in Virtual Environments: from Perception to Rendering

    The design of virtual environments using haptic interfaces often remains driven more by the availability of technology than by the necessity to solve real users' issues. There is a need today for a clear change of perspective, and it is time to investigate how to design virtual environments that properly match the real expectations of the user. For instance, haptic hardware could be restricted to stimulating the part of the haptic channel which provides the best contribution to the final percept in the VE. A deeper understanding of the characteristics of the human haptic system, as well as of human perceptual processes, would help us to define more effective guidelines for developing and evaluating virtual environments and applications using haptic devices.

    Therefore, this tutorial will provide the audience with recent findings in the fields of haptics, multimodal perception and rendering. It will give methodological guidelines for the design of virtual environments that better match the characteristics of the human haptic sense and the expectations of the user. We will illustrate our approach with successful applications and systems which benefited from information stemming from human perception and from a user-centred approach.

    This tutorial is a follow-up of the previous tutorial organized at IEEE VR last year on a similar topic (www.irisa.fr/bunraku/VR07haptic). We will extend and complete the results presented last year. Envisioned topics of the tutorial include:

    • Recent results in the field of human haptics and multimodal perception
    • Design of virtual environments and haptic interfaces based on a user-centred approach
    • Software simplifications related to haptic perception, i.e., “perception-based haptic rendering” (see the sketch after this list)
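
    One widely used form of such perception-based simplification is deadband filtering, where a force sample is only re-rendered when it differs from the last rendered value by more than a Weber-like just-noticeable-difference fraction; whether the tutorial covers exactly this technique is an assumption on our part. A minimal sketch with an illustrative 10% threshold and an invented force trace:

    ```python
    # Minimal sketch of a perceptual deadband filter for force updates: a new force
    # sample is re-rendered only when it deviates from the last rendered value by
    # more than a just-noticeable-difference fraction (Weber-like threshold).
    # The 10% threshold and the force trace are illustrative assumptions.

    def deadband_filter(forces, jnd=0.10):
        """Yield (index, force) pairs that would actually be re-rendered."""
        last = None
        for i, f in enumerate(forces):
            if last is None or abs(f - last) > jnd * abs(last):
                last = f
                yield i, f

    trace = [1.00, 1.02, 1.05, 1.20, 1.21, 0.80, 0.79]
    for i, f in deadband_filter(trace):
        print(f"render sample {i}: {f:.2f}")
    ```
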
    Organizer : 
    Anatole Lecuyer (INRIA, France), anatole.lecuyer@irisa.fr; Matthias Harders (ETH-Zurich, Switzerland), mharders@vision.ee.ethz.ch
    Location : 
    Reno, NV, IEEE VR 2008
    Date: 
    Sunday, March 9, 2008
    Material Available: 
    No
    WorkshopType: 
    Tutorial
    Area: 
    North America