Workshops

Go to the Program page for the full program.

Workshops will take place on September 6th. We have accepted a variety of half-day and full-day workshops:

1. Designing Digital Touch: Social and Sensory Aspects and Challenges

Whole day: The workshop combines a half-day physical or virtual component (10.00 – 13.00) with pre- and post-workshop virtual activities; details are on the workshop website: https://designtouch2020.wordpress.com/workshop-programme/

Organisers:
Carey Jewitt and Sara Price (University College London) and Jürgen Steimle (Saarland University)

The immense richness of human touch is still only partially addressed by today’s haptic technologies and interfaces. The workshop aims to bring together haptics researchers, social researchers, interaction designers and makers to explore and discuss in depth the challenges of designing the social and sensory aspects of touch. (No prior experience or programming expertise is required, although experience with Arduino and basic electronics would help.)

The workshop will interrogate and map these challenges, along with possible techniques and methods, to inform a draft ‘Manifesto’ for the social and sensory design of digitally mediated touch experiences – one that supports accessible and inclusive design and development, speaks to the needs of all potential users, and informs future haptic technologies/interfaces. It will be a valuable source of ideas for future research and development.

Participants will interact with two new methodologies – the ‘Multi-Touch Kit’, an open-source touch-sensing toolkit, and the ‘Designing Digital Touch Toolkit’ – to explore the challenges of designing digital touch (with attention to bodies, environment, agency, norms, materiality and temporality). One key goal is to move beyond common limitations and restrictive conceptions at the level of interaction modalities, locations and users, and to explore opportunities for touch beyond vibration, the hand/forearm, a ‘standard’ ‘able’ body, and the unintentional gendering of technology.

This workshop is now fully booked. No further applications to this workshop will be considered.

2. AcTive Haptic hUMans and Robots (THUMB): Artificial Haptic Systems

Morning

A recording of the morning session is available. The video features two talks:

– Philipp Beckerle (Technische Universität Dortmund / Technische Universität Darmstadt, Germany): Considering human body experience in haptic interface and wearable robot design.

– Thrishantha Nanayakkara (Imperial College, UK): Conditioning the body to reduce the entropy of perception.

Organisers:

Lucia Seminara & Fulvio Mastrogiovanni (University of Genoa, Italy), Matteo Bianchi & Paolo Salaris (University of Pisa, Italy), Simon Watt & Ken Valyear (Bangor University, UK)

Semi-autonomous robotic haptic exploration deals with acquiring and interpreting tactile and proprioceptive sensory signals through purposive, information-seeking movements to manage constrained tasks. If high-level goals are defined by humans, haptically intelligent robots can manage a specific task in an optimal way, even outperforming humans, but they fail in open-ended situations. A special case is robotics-enabled prosthetics, which requires focusing on how to link haptics-related artificial devices to humans and examining whether a biomimetic approach is necessarily better. During this morning session, we will foster debate around different, possibly contrasting, approaches to the development of next-generation haptic feedback interfaces and touch-enabled artificial and robotic systems inspired by the current state of the art in active haptic perception in humans and robots. Decision-making under uncertainty can be the overall framework that holds these different perspectives together, implying that planning movements or recovering properties of the world is probabilistic in nature. The workshop will address broad research questions such as: (i) What level of verisimilitude is necessary to generalize lessons learned from haptics-competent robots to humans, and vice versa? (ii) Is it necessary to emulate the hierarchical structure of the biological nervous system? Can we think of the control of the hand as analogous to that of a prosthetic device with limited and noisy sensory inputs? (iii) Are biomimetic robots appropriate platforms for emulating human capabilities, with the goal of developing an adequate theory of computation to achieve human-like haptic performance?

After the keynote talks, audience interaction will be promoted through parallel interactive, unstructured discussions, driven by topics (i)–(iii) defined in this abstract and by keywords collected by the organizers during the keynote presentations.

Program (9.00 – 12.30)

9.00 – 9.10: Introduction

9.10 – 9.30: Philipp Beckerle (Technische Universität Dortmund / Technische Universität Darmstadt, Germany)

9.30 – 9.50: Thrishantha Nanayakkara (Imperial College, UK)

9.50 – 10.05: Break

10.05 – 10.25: Strahinja Dosen (Aalborg University, Denmark)

10.25 – 10.45: Hannes Saal (University of Sheffield, UK)

10.45 – 11.00: Break

11.00 – 12.15: Interactive discussion: This discussion will be organized around the keynote topics and will be led by the workshop organizers from a remote location.

12.15 – 12.30: Wrap-up

Registration: Regular registration can be done through the EuroHaptics 2020 website: http://eurohaptics2020.org/attending/

Speakers

Philipp Beckerle (Technische Universität Dortmund / Technische Universität Darmstadt, Germany): Considering human body experience in haptic interface and wearable robot design

Wearable robotic devices, e.g., robotic prostheses and exoskeletons, as well as teleoperation systems, are emerging technologies with tremendous potential to address upcoming needs due to demographic change and constrained activity in daily life. Fundamentally, such devices are artefacts made to be used by humans in tight physical interaction, which underlines the paramount importance of haptic interfaces. Due to the tightness of this interaction, both technical and human factors are crucial in design and control, while human factors are usually harder for developers to grasp. This talk presents design approaches that consider human factors methodically and focuses on creating, conducting, and interpreting human-in-the-loop experiments to understand the underlying mechanisms of human body experience, e.g., by disentangling haptic feedback from motion control. The embodiment of artificial limbs by their users is proposed as a very promising measure to assess interface design and to adapt robot control so as to distinctly improve device functionality and user satisfaction. The talk presents experimental studies of human-robot body experience regarding the upper and lower limbs, as well as cognitive models for embodiment prediction, and finally discusses directions for future research on bidirectional human-machine interfaces and non-functional haptic feedback.

Thrishantha Nanayakkara (Imperial College, UK): Conditioning the body to reduce entropy of perception

A system is called an embedded system if it can take good-enough actions in response to states within deadlines imposed by the environment. In that sense, living beings and most robots are embedded systems. When states are uncertain, the task of state estimation within deadlines becomes non-trivial. Living beings often take a recursive approach to estimating such random variables. For instance, a person asked to estimate the weight of an object will bob it up and down several times before settling on an estimate. If we frame this as a recursive Bayesian estimation process, the agent can benefit significantly from the ability to “morph” the likelihood function to sharpen the posterior distribution. In our studies, we see that participants change their elbow stiffness and bobbing behavior depending on the weight of the object in the above scenario. We see similar phenomena in other estimation tasks too. In soft tissue palpation, for instance, when a physician is required to estimate the location of the edge of a patient's liver using manual palpation, they regulate the stiffness and configuration of their fingers to condition haptic perception during palpation. In this talk, I will show some recent results of this information-morphing approach for efficient estimation of environmental states using a controllable-stiffness body, and a soft robotic approach to testing hypotheses we build based on human behavior.
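The recursive estimation idea above can be illustrated with a minimal sketch of a Gaussian Bayesian update, where each "bob" of the object yields a noisy weight observation. The numbers, the noise model, and the idea of representing a stiffer arm as a smaller observation variance are illustrative assumptions, not details from the talk.

```python
def bayes_update(prior_mean, prior_var, obs, obs_var):
    """One step of recursive Bayesian estimation for a Gaussian weight belief."""
    gain = prior_var / (prior_var + obs_var)   # how much to trust the new sample
    post_mean = prior_mean + gain * (obs - prior_mean)
    post_var = (1.0 - gain) * prior_var
    return post_mean, post_var

# Vague prior over the object's weight (kg).
mean, var = 1.0, 4.0

# Each "bob" yields a noisy observation. "Morphing" the likelihood,
# e.g. by stiffening the elbow, could be modelled as lowering obs_var,
# which sharpens the posterior faster.
for obs in [0.55, 0.62, 0.58, 0.60]:
    mean, var = bayes_update(mean, var, obs, obs_var=0.04)

print(round(mean, 2), round(var, 3))  # posterior is far tighter than the prior
```

After a few updates the estimate converges toward the observations while the posterior variance shrinks well below both the prior and the single-observation noise.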

Strahinja Dosen (Aalborg University, Denmark): Somatosensory feedback in prosthetics from the perspective of human motor control

Restoring somatosensory feedback in upper-limb prostheses is an important goal that attracts the attention of both the academic community and industry. Yet this situation is not in any way unique: something similar happened decades ago, in the 1970s and ’80s. However, we still do not have a commercial prosthesis that provides feedback to its user. In this lecture, we will advocate that it is not a lack of technology that has impeded the translation of research into clinical applications, but a lack of fundamental knowledge about the role of artificial feedback in the context of prosthesis control. We will propose that feedback needs to be understood from the perspective of human motor control, since it interacts with the other components of the motor control loop. Therefore, to be effective, feedback needs to be designed to integrate seamlessly with these components and make an impact on the overall control loop. We will demonstrate that this perspective can explain a surprising outcome seen in many studies, where supplemental feedback failed to demonstrate benefits in prosthesis control. Finally, we will show how the concepts of human motor control can be used to design effective feedback interfaces.

Hannes Saal (University of Sheffield, UK): Bio-mimetic tactile feedback in robotics and prosthetics: can we and should we?

Despite much progress in tactile sensor development and artificial intelligence, robots are still poor at haptic tasks. Similarly, tactile feedback in modern prosthetics often has low throughput and can be cumbersome for the user. One way of solving these problems is to take inspiration from the brain and develop algorithms that mimic the way tactile information is processed neurally. But which aspects of neural tactile processing are worth imitating? Here, I will review recent findings in the neuroscience of touch and how these might be implemented in a biomimetic fashion. I will then discuss recent work towards establishing guiding principles for artificial tactile processing that is not only biomimetic in some fashion but also implements the natural processing strategies that are crucial to our ability to interact with and manipulate objects effortlessly.

3. AcTive Haptic hUMans and Robots (THUMB): Human Active Touch

Afternoon

A recording of the afternoon session is available. The video features four talks:

– Vincent Hayward (Sorbonne Université, France): Tactile mechanics.

– Alessandro Moscatelli (University of Rome “Tor Vergata” / Fondazione Santa Lucia IRCCS, Italy): Integration of proprioception and touch for the control of reaching movements.

– Roberta Klatzky (Carnegie Mellon University, Pittsburgh, USA): Active touch supports the perception of rendered material.

– Gerald E. Loeb (University of Southern California, Los Angeles / SynTouch Inc., USA): Learning from bio-inspired haptics.

Organisers:

Lucia Seminara & Fulvio Mastrogiovanni (University of Genoa, Italy), Matteo Bianchi & Paolo Salaris (University of Pisa, Italy), Simon Watt & Ken Valyear (Bangor University, UK)

Touch is a highly interactive sense: tactile communication is mediated by physical contact, with motor action underpinning the interaction. Humans therefore need both to interpret the information they sense through touch and to select the movements that maximize information uptake for given tactile features, as described in the pioneering work on Exploratory Procedures by Lederman and Klatzky. During this afternoon session, the main principles of and recent findings on the mechanisms underpinning active human touch perception will be reviewed. The session will cover topics encompassing (i) the mechanics of touch, (ii) the role of exploration in the perception of rendered materials, (iii) how exploratory movements can be effectively tuned to improve haptic information gathering, and (iv) the integration of proprioception and touch in motor control, the other side of the action-perception coupling. There is a compelling case for using principles of human haptic perception – active touch – to inspire the development of robotic systems: we will finally discuss (v) how the intimate connection between movements and percepts can be profitably translated to robotic applications with bio-inspired tactile sensors.

After the keynote talks, audience interaction will be promoted through parallel interactive, unstructured discussions, driven by topics (i)–(v) defined in this abstract and by keywords collected by the organizers during the keynote presentations.

Program (13.00 – 16.30)

13.00 – 13.10: Introduction

13.10 – 13.30: Vincent Hayward (Sorbonne Université, France)

13.30 – 13.50: Knut Drewing (Justus-Liebig-University Giessen, Germany)

13.50 – 14.00: Break

14.00 – 14.20: Alessandro Moscatelli (University of Rome “Tor Vergata” / Fondazione Santa Lucia IRCCS, Italy)

14.20 – 14.40: Roberta L. Klatzky (Carnegie Mellon University, Pittsburgh, USA)

14.40 – 15.00: Gerald E. Loeb (University of Southern California, Los Angeles / SynTouch Inc., USA)

15.00 – 15.15: Break

15.15 – 16.15: Interactive discussion: This discussion will be organized around the keynote topics and will be led by the workshop organizers from a remote location.

16.15 – 16.30: Wrap-up

Registration: Regular registration can be done through the EuroHaptics 2020 website: http://eurohaptics2020.org/attending/

Speakers

Vincent Hayward (Sorbonne Université, France): Tactile Mechanics

The astonishing variety of phenomena taking place during contact between fingers and objects is a formidable trove of information that can be extracted by organisms to learn about the nature and the properties of objects. This richness is likely to have fashioned our somatosensory system at all levels of its organisation, from early mechanics to cognition. I will illustrate this idea through examples and show how the physics of mechanical interactions shape the messages that are sent to the brain; and how the early stages of the somatosensory system en route to the primary areas are organised to process these messages.

Knut Drewing (Justus-Liebig-University Giessen, Germany): How clever is human haptic perception?

Human haptic perception is an inherently active process during which humans purposively gather sensory information. When humans aim to perceive their environment by touch, for example when they aim to haptically perceive the softness of an object, they first have to explore the object appropriately with the fingers. It is this exploratory movement that generates the relevant sensory information. Often a single movement is not sufficient, and humans repeatedly explore the object before delivering a judgment. In several experiments, we investigated how humans utilize active movement control in touch. Our results show how people fine-tune their exploratory movements to improve the gathering of sensory information, how they use repeated exploratory movements to gather sufficient information, and how an interplay of closed-loop sensorimotor processes and implicit predictive open-loop processes determines the control of exploratory movement. We also studied how serially gathered information and prior information are integrated for perception. Overall, we conclude that human haptic perception is based on quite “clever” processes, in that it optimally integrates different types of information under conditions of memory decay, and in that exploratory movements are optimized in different ways.

Alessandro Moscatelli (University of Rome “Tor Vergata” / Fondazione Santa Lucia IRCCS, Italy): Integration of proprioception and touch for the control of reaching movements

According to a classical view in physiology, receptors in the musculoskeletal system provide the kinesthetic information for guiding movements of our limbs. However, perceptual tasks in which the hand was stationary have demonstrated that, in addition to proprioceptive cues, cutaneous stimuli from contact with objects can produce the illusion of hand displacement. In our recent work, we demonstrated that touch provides auxiliary proprioceptive feedback for guiding reaching hand movements. We developed a novel behavioral task in which participants slid their fingertip over a plate with raised ridges to move toward a target, without any visual feedback on hand location. Tactile motion estimates were biased by ridge orientation, inducing a systematic deviation in hand trajectories, consistent with the hypothesis that touch acts as an auxiliary proprioceptive cue. The results agree with an ideal-observer model in which motion estimates from different somatosensory cues are optimally integrated for the control of movement. These findings highlight the capacity of the brain to exploit knowledge of the mechanical properties of the body and of the external environment to increase the precision of the fused estimate.
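The ideal-observer integration mentioned above is commonly formalised as inverse-variance (maximum-likelihood) weighting of independent Gaussian cues. The sketch below, with invented numbers for a proprioceptive and a tactile direction estimate, illustrates only that standard textbook model, not the study's actual implementation.

```python
def fuse(est_a, var_a, est_b, var_b):
    """Maximum-likelihood fusion of two independent Gaussian cue estimates.

    Each cue is weighted in inverse proportion to its variance; the fused
    variance is never larger than that of the more reliable cue.
    """
    w_a = var_b / (var_a + var_b)
    fused_est = w_a * est_a + (1.0 - w_a) * est_b
    fused_var = (var_a * var_b) / (var_a + var_b)
    return fused_est, fused_var

# Hypothetical hand-motion direction estimates (degrees): a noisy
# proprioceptive cue and a more reliable tactile-flow cue.
est, var = fuse(10.0, 9.0, 4.0, 3.0)
print(est, var)  # -> 5.5 2.25: closer to the lower-variance cue
```

A biased tactile cue (e.g. from the ridge orientation in the task above) would pull the fused estimate, and hence the hand trajectory, systematically toward the biased value.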

Roberta L. Klatzky (Carnegie Mellon University, Pittsburgh, USA): Active Touch Supports Perception of Rendered Material

Over 30 years ago, Susan Lederman and I published a seminal paper on haptic Exploratory Procedures, which described invariants in active movements that accompanied the intake of information about the surface, substantive, and structural properties of objects.  Since that time, haptic researchers have come to a deep understanding of EPs: why invariant gestures occur with respect to neural optimization, how the specific movement parameters vary with context and intention, how they are expressed in non-human perceivers with different sensory apparatus such as whiskers, and how the underlying algorithms might be usefully implemented in sensing robots.  Exploration is not essential; indeed, passive presentation suffices for the perception of objects’ textural properties and even, somewhat surprisingly, for a limited impression of shape and compliance.  But what if the objects are not real?  My recent work has considered how active control contributes to rendering material properties. When perceivers explore a virtual surface or substance and receive sensory feedback that is congruent with a real object, the inevitable (one hopes) percept is that of an externalized physical object.  As indicated by psychophysical judgments, percepts can be obtained even with primitive displays that under passive conditions would be merely a series of vibrations and jolts.

Gerald E. Loeb (University of Southern California, Los Angeles / SynTouch Inc., USA): Learning from Bio-inspired Haptics

Humans excel in unstructured environments. If we want robots to perform similarly, we must understand the hardware and software of human haptics. That is not to say that we must slavishly incorporate all of it into a given robot for it to be useful. Man-machine interfaces – from computer monitors to prosthetic hands – succeed when we understand which biological features are necessary and sufficient and which can be ignored for the task at hand (pun intended). Our work with a biomimetic tactile sensor (BioTac® from SynTouch Inc., for which the speaker is a founding director) led us to appreciate the intimate and necessary connection between movements and percepts. For either an animal or a machine to interpret tactile signals, it must either control or know the details of the exploratory movement. We have developed a Bayesian algorithm for the efficient selection of such movements and interpretation of the consequent signals – a new form of bio-inspired artificial intelligence. For either an animal or a machine to explore or handle objects in a controllable manner, it must respond to tactile signals but not necessarily perceive them directly. We have developed a myoelectric prosthetic hand that uses spinal-like tactile reflexes to greatly improve the grasp of fragile objects.
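The general idea of Bayesian selection of exploratory movements can be sketched as picking, from a set of candidate movements, the one whose expected posterior entropy over object hypotheses is lowest. The object classes, movements, and likelihood values below are invented for illustration; this is not the algorithm developed by the speaker.

```python
import math

def entropy(dist):
    """Shannon entropy (bits) of a discrete distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

def posterior(prior, likelihoods):
    """Bayes' rule over a discrete hypothesis set."""
    unnorm = [p * l for p, l in zip(prior, likelihoods)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

def expected_posterior_entropy(belief, p_signal):
    """Expected entropy after observing a binary tactile signal whose
    per-hypothesis probabilities are given by p_signal."""
    h = 0.0
    for liks in (p_signal, [1.0 - p for p in p_signal]):
        p_outcome = sum(b * l for b, l in zip(belief, liks))
        if p_outcome > 0:
            h += p_outcome * entropy(posterior(belief, liks))
    return h

# Uniform belief over three hypothetical object classes, and two candidate
# movements with invented P(slip | class) profiles.
belief = [1 / 3, 1 / 3, 1 / 3]
movements = {
    "press": [0.9, 0.5, 0.1],   # highly discriminative movement
    "slide": [0.6, 0.5, 0.4],   # weakly discriminative movement
}

# Pick the movement expected to leave the least residual uncertainty.
best = min(movements,
           key=lambda m: expected_posterior_entropy(belief, movements[m]))
print(best)  # the discriminative movement is selected
```

Repeating this select-observe-update loop drives the belief toward a single hypothesis with few exploratory movements.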

4. Assistive technology for persons with deafblindness

(cancelled due to COVID-19 regulations)

Afternoon

Organisers:
Astrid Kappers, Myrthe Plaisier (Eindhoven University of Technology)

This workshop will address how assistive technology can facilitate the day-to-day lives of persons with deafblindness. Deafblindness means that the visual and auditory systems have deteriorated to such a degree that one cannot substitute for the other. The main topics covered will be communication and navigation. The aim of this workshop is to bring together scientists, professionals from expertise centers and persons with deafblindness. The workshop will be a combination of short talks and demos. We will provide the opportunity for scientists and developers to demonstrate their applications to persons with deafblindness, and persons with deafblindness will have the opportunity to try out state-of-the-art technological developments. The idea for this workshop was inspired by the European H2020 project SUITCEYES, in which a garment with haptic feedback is being developed to assist persons with deafblindness. We invite technical developers to contact us if they have a suitable demo they would like to show during this workshop.


5. Telepresence and embodiment hackathon

(cancelled due to COVID-19 regulations)

Whole day

Organisers:
Douwe Dresscher, Sara Falcone, Jan van Erp, Camille Sallaberry (University of Twente),
Ivo Stuldreher, Nanja Smets, Kees van Teeffelen (TNO).        

i-Botics intends to create synergy in multimodal telepresence, transporting the operator’s social and functional self to a robotic avatar anywhere on Earth through a compelling combination of state-of-the-art social, visual, haptic, audio and olfactory technologies. This can be captured by the concept of the Sense of Embodiment. The topics of discussion for this hackathon are: (1) avatar ownership, meaning that the operator should feel as if they are in the avatar’s skin. By orchestrating coherence between the movements and posture of the operator and the avatar, synchronizing vision, touch and proprioception, and employing body-ownership illusions, we try to provide the operator with an avatar-ownership experience. (2) Avatar agency, namely intuitive operator control over the avatar through head-slaved sensor control, arm and hand tracking to control manipulators, body tracking for avatar posture, body or leg/foot tracking to control avatar motion, and physiological sensors to control, for instance, the variable stiffness of the arms. And (3) the operator's sense of self-location, which concerns the subjective feeling of where the operator's body is. For example, to provide visual feedback in a remote condition, streaming the raw data is usually not the best choice; SLAM algorithms and VR are among the approaches used to deal with this.

We propose three challenges; you can choose the one you find most interesting. If there are too many participants for a single challenge, we will create multiple groups and hold an intra-challenge competition!
At the end of the day, each group will present its results.
The challenges are:
i) Challenge 1: how to create a sense of embodiment while teleoperating a device that is not human-like;

ii) Challenge 2: how to deal with delay, which is one of the main obstacles in teleoperation tasks;

iii) Challenge 3: what are the effects on the sense of embodiment of manipulating something very heavy (which a human being could not manipulate), and how should the force feedback be handled?

We will debate and approach the topic from different perspectives. This is a tentative plan:

09:00 – 09:20: we will first briefly discuss the state of the art of current teleoperation strategies and the different application scenarios;

09:20 – 10:00: presentations of the participants on their work;

10:00 – 10:30: we will open a discussion about: 1) how to evaluate the sense of embodiment; 2) which aspects most strongly affect the sense of embodiment; 3) how to design experiments to test the sense of embodiment;

10:30 – 10:45: Break

10:45 – 11:00: presentations of the participants on their work;

11:00 – 12:15: Hackathon

12:15 – 13:15: Lunch

13:15 – 14:30: Hackathon

14:30 – 14:45: Break

14:45 – 16:00: Hackathon

16:00 – 16:45: Demonstrations

16:45 – 17:00: Closure

Given the topic, multidisciplinary participation would be greatly appreciated and would make the discussion especially relevant and interesting.
We invite prospective participants to contact us if they want to share a short presentation about their work related to the topic, or if they are working on a challenge they would like to brainstorm about with the scientific community.