Keynotes

Go to the Program page for the full program.

Speaker : Sile O’Modhrain

Title : Malleable Media: Defining Interaction Paradigms for Full-page Tactile Arrays

Date : Sunday 6 September (17:00 – 18:00)

Abstract : Five years ago, I presented a talk at the Haptics Symposium entitled “Rekindling the Search for the Holy Braille”, in which I outlined the challenges faced in developing a full-page refreshable tactile array for the display of braille and tactile graphics. I am happy to report that, while no displays are yet available on the market, there are at least five serious contenders with technical approaches that are very close to solving the problem.

Now the challenge becomes one of determining how to interact with tactile content, static or dynamic, on such a full-page refreshable display. For users of speech-based interfaces or single-line displays, this moment is as significant as the transition from the command line to the GUI.

In this talk, I will review current work seeking to define a new interaction paradigm for tactile displays and present some of the questions that are taxing those of us in the field at the moment.


Speaker : Garmt Dijksterhuis

Title : TBA (interactive opening reception)

Date : Sunday 6 September (18:00 – 19:00)

Abstract: TBA


Speaker : Hellen van Rees

Title : TBA 

Date : Monday 7 September (9:30 – 10:30)

Abstract: TBA


Speaker : Monica Gori

Title : From science to technology: the interaction between the senses during development and the creation of new multi-sensory rehabilitation devices

Date : Wednesday 9 September (16:30 – 17:30)

Abstract: During the first years of life, the sensory modalities communicate with each other. Since 2004 we have studied how the haptic, visual, and auditory modalities interact and are integrated during development. We have observed that specific sensory modalities are crucial for the development of particular skills, and that the absence of one sensory input affects the development of the other modalities. For example, in young children the haptic modality is essential for perceiving size, and a lack of haptic skills affects the ability to understand the visual size of objects. Similarly, visual information is crucial for perceiving orientation, and the lack of vision affects haptic orientation perception and verticality perception.

These results suggest that a strong interaction between sensory systems during the early period of life is necessary for correct perceptual development to occur. Starting from these premises, we have developed new technology based on multisensory feedback. Its goal is to improve the perceptual, motor, social, and learning skills of children with impairments. I will present the results obtained with ABBI and TechArm, two wearable devices that provide audio, visual, and tactile feedback associated with body movement to improve haptic manipulation and social interaction. I will also show the results obtained with WeDraw, a set of applications based on audio, tactile, and visual feedback designed to improve learning skills at the elementary-school level. These results are discussed in view of further developments and applications in hospital rehabilitation settings.