Go to the Program page for the full program.

Workshops will take place on September 6th. We have accepted a variety of half-day and full-day workshops:

1. Designing Digital Touch: Social and Sensory Aspects and Challenges

Whole day

Carey Jewitt (University College London) and Jürgen Steimle (Saarland University)

The immense richness of human touch is still only partially addressed by today’s haptic technologies and interfaces. The workshop aims to bring together haptic researchers, social researchers, interaction designers and makers to explore and discuss in depth the challenges of designing the social and sensorial aspects of touch. (No prior experience or programming expertise is required, although experience with Arduino and basic electronics would help.)

It will interrogate and map these challenges, along with possible techniques and methods, to inform a draft ‘Manifesto’ for the social and sensory design of digitally mediated touch experiences. The Manifesto is intended to inform and support accessible and inclusive design and development that speaks to the needs of all potential users and future haptic technologies/interfaces, and to serve as a valuable source of ideas for future research and development.

Participants will interact with two new methodologies: the ‘Multi-Touch Kit’, an open-source touch-sensing toolkit, and the ‘Designing Digital Touch Toolkit’, to explore the challenges of designing digital touch (with attention to bodies, environment, agency, norms, materiality and temporality). One key goal is to move beyond common limitations and restrictive conceptions at the level of interaction modalities, locations and users, and to explore opportunities for touch beyond vibration, the hand/forearm, a ‘standard’ ‘able’ body, and the unintentional gendering of technology.

This workshop is now fully booked. No further applications to this workshop will be considered.

2. AcTive Haptic hUMans and Robots (THUMB): Human Active Touch

Half day (morning)
Lucia Seminara & Fulvio Mastrogiovanni (University of Genoa, Italy), Matteo Bianchi & Paolo Salaris (University of Pisa), Simon Watt & Ken Valyear (Bangor University)

Touch is a highly interactive sense: tactile communication is mediated by physical contact and by the motor actions that underpin interaction. To this end, humans need both to interpret the information they sense through touch and to select the movements that maximize information uptake for given tactile features, as described in the pioneering work on Exploratory Procedures by Lederman and Klatzky. This morning session will review the main principles and recent findings on the mechanisms underpinning active human touch perception. It will cover (i) the mechanics of touch, (ii) the role of exploration in the perception of rendered materials, and (iii) how exploratory movements can be effectively tuned to improve haptic information gathering. (iv) The integration of proprioception and touch will also be analyzed in the context of motor control, the other side of the coin of action–perception coupling. There is a compelling case for using principles of human haptic perception – active touch – to inspire the development of robotic systems: we will finally discuss (v) how the intimate connection between movements and percepts can be profitably translated to robotic applications with bio-inspired tactile sensors. This concluding part will also serve to launch the afternoon session, which will focus mainly on robotics and prosthetics. After the keynote talks, audience interaction will be fostered through parallel, unstructured discussions driven by topics (i)–(v) defined in this abstract and by keywords collected by the organizers during the keynote presentations.

TENTATIVE PLAN 8.30 AM – 12.30 PM:

– Talks (5) + break: 3h

– Parallel interactive discussions: 50min

– Final wrap-up: 10min


Vincent Hayward (Sorbonne Université, Paris): Tactile Mechanics

Roberta L. Klatzky (Carnegie Mellon University, Pittsburgh): Active Touch Supports Perception of Rendered Material

Knut Drewing (Justus-Liebig-University, Giessen): How clever is human haptic perception?

Alessandro Moscatelli (University of Rome “Tor Vergata”): Integration of proprioception and touch for the control of reaching movements

Gerald E. Loeb (University of Southern California): Learning from Bioinspired Haptics

3. Active haptic humans and robots (THUMB): Artificial haptic systems

Half day (afternoon)
Lucia Seminara & Fulvio Mastrogiovanni (University of Genoa, Italy), Matteo Bianchi & Paolo Salaris (University of Pisa), Simon Watt & Ken Valyear (Bangor University)

Semi-autonomous robotic haptic exploration deals with acquiring and interpreting tactile and proprioceptive sensory signals through purposive, information-seeking movements to manage constrained tasks. If high-level goals are defined by humans, haptically intelligent robots can manage a specific task in an optimal way, even outperforming humans, but they fail in open-ended situations. A special case is robotics-enabled prosthetics, which requires focusing on how to link haptics-related artificial devices to humans and examining whether a biomimetic approach is necessarily better. During this afternoon session, we will foster debate around different, possibly contrasting, approaches to the development of next-generation haptic feedback interfaces and touch-enabled artificial and robotic systems, inspired by the current state of the art in active haptic perception in humans and robots. Decision-making under uncertainty can be the overall framework that holds these different perspectives together, implying that planning movements or recovering properties of the world is probabilistic in nature. The workshop will address broad research questions such as: (i) What level of verisimilitude is necessary to generalize lessons learned from haptics-competent robots to humans, and vice versa? (ii) Is it necessary to emulate the hierarchical structure of the biological nervous system? Can we think of the control of the hand as analogous to that of a prosthetic device with limited and noisy sensory inputs? (iii) Are biomimetic robots appropriate platforms for emulating human capabilities, with the goal of developing an adequate theory of computation to achieve human-like haptic performance? After the keynote talks, parallel interactive discussions will be driven by topics (i)–(iii) defined above and by keywords collected by the organizers during the keynote presentations.

TENTATIVE PLAN: 1.15 – 4.45 PM

– Talks (4) + break: 2h30

– Parallel interactive discussions: 50min

– Final wrap-up: 10min


Thrishantha Nanayakkara (Imperial College, UK): Conditioning the body to reduce entropy of perception

Philipp Beckerle (Technische Universität Dortmund, Technische Universität Darmstadt): Considering human body experience in haptic interface and wearable robot design

Strahinja Dosen (Aalborg University): Somatosensory feedback in prosthetics from the perspective of human motor control

Hannes Saal (University of Sheffield): Biomimetic tactile feedback in robotics and prosthetics: can we and should we?

4. Assistive technology for persons with deafblindness

(cancelled due to COVID-19 regulations)


Astrid Kappers, Myrthe Plaisier (Eindhoven University of Technology)

This workshop will address how assistive technology can facilitate the day-to-day life of persons with deafblindness. Deafblindness means that the visual and auditory systems have deteriorated to such a degree that one cannot substitute for the other. The main topics covered will be communication and navigation. The aim of this workshop is to bring together scientists, professionals from expertise centers, and persons with deafblindness. The workshop will combine short talks and demos. Scientists and developers will have the opportunity to demonstrate their applications to persons with deafblindness, and persons with deafblindness will have the opportunity to try out state-of-the-art technological developments. The idea for this workshop was inspired by the European H2020 project SUITCEYES, in which a garment with haptic feedback is being developed to assist persons with deafblindness. We invite technical developers to contact us if they have a suitable demo they would like to show during this workshop.


5. Telepresence and embodiment hackathon

(cancelled due to COVID-19 regulations)

Whole day

Douwe Dresscher, Sara Falcone, Jan van Erp, Camille Sallaberry (University of Twente),
Ivo Stuldreher, Nanja Smets, Kees van Teeffelen (TNO).        

i-Botics intends to create synergy in multimodal telepresence, transporting the operator’s social and functional self to a robotic avatar anywhere on Earth through a compelling combination of state-of-the-art social, visual, haptic, audio and olfactory technologies. This can be framed by the concept of Sense of Embodiment. This hackathon focuses on three aspects: (1) avatar ownership, meaning that the operator should be able to crawl into the skin of the avatar. By orchestrating coherence between the movements and posture of the operator and the avatar, synchronizing vision, touch and proprioception, and employing body-ownership illusions, we try to provide the operator with an avatar-ownership experience. (2) Avatar agency, namely intuitive operator control over the avatar through head-slaved sensor control, arm and hand tracking to control manipulators, body tracking for avatar posture, body or legs/feet tracking to control avatar motion, and physiological sensors to control, for instance, the variable stiffness of the arms. And (3) the operator’s sense of self-location, which concerns the subjective feeling of where the operator’s body is. To provide visual feedback in a remote condition, for example, streaming the raw data is usually not the best choice; SLAM algorithms and VR are among the approaches used to deal with this.

We propose three challenges. You can choose the one you find most interesting. If there are too many participants for a single challenge, we will create multiple groups and hold an intra-challenge competition!
At the end of the day, each group will present its results.
The challenges are:
i) Challenge 1: how to create the sense of embodiment while teleoperating a device that is not human-like;

ii) Challenge 2: how to deal with delay, which is one of the main obstacles in teleoperation tasks;

iii) Challenge 3: what are the effects on the sense of embodiment of manipulating something very heavy (something a human being could not manipulate), and how should the force feedback be handled?

We will debate and approach the topic from different perspectives. This is a tentative plan:

09:00 – 09:20: Brief discussion of the state of the art of current teleoperation strategies and the different application scenarios;

09:20 – 10:00: Presentations by participants on their work;

10:00 – 10:30: Open discussion on 1) how to evaluate the sense of embodiment; 2) which aspects most affect the sense of embodiment; 3) how to design experiments to test the sense of embodiment;

10:30 – 10:45: Break

10:45 – 11:00: Presentations by participants on their work;

11:00 – 12:15: Hackathon

12:15 – 13:15: Lunch

13:15 – 14:30: Hackathon

14:30 – 14:45: Break

14:45 – 16:00: Hackathon

16:00 – 16:45: Demonstrations

16:45 – 17:00: Closure

Considering the topic, a multidisciplinary group of participants would be very welcome and would make the discussion especially relevant and interesting.
We invite prospective participants to contact us if they would like to share a short presentation on work related to the topic, or if they are dealing with a challenge they would like to brainstorm with the scientific community.