Keynote Speakers

Monica Gori

Istituto Italiano di Tecnologia, Genova, Italy

Building Multisensory Experience: The Role of Sensory Interaction in Development and Accessible Technology

An organism’s ability to interact with the external world depends on its capacity to accurately process and interpret environmental information. For example, when walking down the street, vision helps us navigate and locate objects of interest, while hearing provides critical cues, such as an approaching car or the presence of people nearby. These seemingly simple actions are the result of complex multisensory integration processes. The mechanisms underlying multisensory development are still not fully understood. In recent years, our research has highlighted the fundamental role of sensory interaction in supporting the development of multisensory integration. We have shown that some multisensory skills take time to emerge, and that vision plays a crucial role in shaping both auditory and tactile integration. In the absence of visual input, representations of auditory space and of the body are altered, making sensory integration more difficult, or even impaired, in blind infants and children compared to their sighted peers. Sensory interaction therefore represents, in many cases, a building block of multisensory development. The strong link between sensory interaction and multisensory learning provides a powerful framework for understanding the basic principles that allow us to learn to navigate and interpret a multisensory environment. This understanding also drives the development of science-based technologies aimed at improving the quality of life for individuals with sensory impairments. Our technological solutions translate the world into sound, vibration, and movement, offering children new inputs they can perceive, make sense of, and grow with.

Mark Wallace

Vanderbilt University, Nashville, USA

From Single Neurons to Immersive Environments: A Multisensory Scientist’s Journey

We live in a multisensory world, continually bombarded with stimuli through our various sensory modalities. As such, one of the important functions of the brain is to combine this information into a coherent perceptual gestalt. Multisensory research has provided a good deal of insight into the behavioral and perceptual benefits of having information available from multiple senses, as well as into the combinatorial operations and neural circuits involved in multisensory integration. In addition, evidence continues to grow linking altered multisensory function to neuropsychiatric and neurodevelopmental conditions. This talk will focus on several of our lab’s contributions to our understanding of multisensory function, as well as its development and plasticity, at the neural, perceptual, and clinical levels. Importantly, much of this work has been carried out in highly controlled laboratory settings, leaving open the question of whether multisensory abilities differ in more naturalistic scenarios. To examine this question, we are beginning to use fully immersive environments. In our augmented reality CAVE, we present high-definition 3D visual, spatialized auditory, and tactile (haptic) stimuli while monitoring eye, head, and body movements and recording brain activity via EEG. The focus of this work is to study multisensory development in more realistic settings.

Alessandro Farnè

IMPACT Team, Lyon Neuroscience Research Centre, Lyon, France

The Role of Vision in Grasping and Sensing Objects with Tools

Scientists have long questioned the origin of the exquisite human mastery of tools. How do we manage to control a tool as skillfully as humans typically do, as if it were a body part? Grasping objects with tools poses a major challenge for the visuomotor system, in that control of the hand must be transferred to the prehensile part of the tool. In the first part of my talk, I will present findings suggesting that when we use tools to grasp objects, the tool is incorporated into our body representation. Our more recent work indicates that the lack, or even late loss, of vision hampers this updating of the body-state estimate for the motor control of tools. The second part of my talk will focus on the perceptual component of tool use: sensing through tools challenges the somatosensory system, as humans are capable of localizing impacts on the entire surface of a hand-held rod with great accuracy. I will report findings indicating that, in contrast to motor control, the lack (or loss) of vision does not prevent accurate tactile localization on hand-held tools. These findings help us understand the differential role that vision plays in the sensorimotor control of tools.