Pre-Conference Workshops

IMRF 2025 will be offering three pre-conference workshops focused on methods in multisensory research.

These will run in parallel and take place in the morning of the first conference day: 

Tuesday, 15th July, 9am-12.30pm.

The three workshops will combine theory with hands-on practical tasks, and will focus on (1) behavioural, (2) neural, and (3) computational methods:
You can sign up for the workshops here before 1st July.

Note that places are limited for each workshop. We will try to ensure that you are allocated to your first choice, but this cannot be guaranteed, so please indicate a second choice in the form.

If you want to take part in the hands-on exercises, please bring your laptop and check that the respective dependencies (see below) are installed.

(1) Enhancing behavioural testing efficacy with continuous psychophysics

Organizer: Björn Jörges 

Topic: Designing experiments and analysis pipelines for multisensory experiments combining continuous stimuli and responses, thereby enhancing the efficacy of behavioural data collection.

Description: Psychophysical experiments are notorious for requiring large amounts of participant time, which in turn constrains statistical power and thereby limits the reliability of experimental results. Furthermore, typical trial-based designs are not conducive to ecological validity, as real life rarely pauses to give us time to respond before presenting us with the next perceptual challenge. Continuous Psychophysics, a novel paradigm that originated in Vision Science, addresses both of these shortcomings by coupling a continuous stimulus with a continuous participant response. In principle, it allows large amounts of data to be collected in a very short time and frees us from the shackles of trial-based designs.

This workshop will first introduce the method by providing an overview of the theoretical underpinnings of Continuous Psychophysics, discussing practical implications for experimental design, and presenting some example studies from both Vision Science and Multisensory Integration. We will then build a very simple tracking experiment together in PsychoPy, collect data from ourselves, and analyze them in R.
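To give a flavour of the kind of analysis such data afford, here is a minimal NumPy sketch of the core idea: cross-correlating a continuous target trajectory with a participant's tracking response to recover the tracking lag. All names and numbers (sampling rate, lag, noise levels) are invented for illustration and are not the workshop's actual PsychoPy/R pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a continuous target trajectory (random walk) sampled at 60 Hz.
n = 600  # 10 s of tracking data (assumed values for this sketch)
target = np.cumsum(rng.normal(0, 1, n))

# Simulate a participant's cursor response: a delayed, noisy copy of the target.
lag = 12  # ~200 ms at 60 Hz (assumed)
response = np.roll(target, lag) + rng.normal(0, 2, n)
response[:lag] = 0.0

# Core continuous-psychophysics analysis: cross-correlate target velocity
# with response velocity and read the lag off the correlogram peak.
tv = np.diff(target)
rv = np.diff(response)
tv = (tv - tv.mean()) / tv.std()
rv = (rv - rv.mean()) / rv.std()
xcorr = np.correlate(rv, tv, mode="full") / len(tv)
lags = np.arange(-len(tv) + 1, len(tv))
est_lag = lags[np.argmax(xcorr)]
print(f"estimated tracking lag: {est_lag} samples")
```

Every sample of the response carries information about the stimulus, which is why a few minutes of tracking can substitute for hundreds of discrete trials.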

Dependencies: PsychoPy, R


(2) Decoding EEG signatures of multisensory decision-making

Organizers: Ioannis Delis; Kyriakos Birmpas; Jessica Diaz

Topic: Decoding neural signals at the single-trial level: decoders can predict (a) stimuli, (b) binary decision responses, or (c) sensory categories.

Description: How does the brain integrate information from multiple senses to make perceptual decisions? In this workshop, we will explore the neural processes underlying multisensory decision-making using electroencephalography (EEG) and computational modeling.

We will begin by presenting an audiovisual object discrimination task, during which EEG was recorded to capture neural dynamics. Participants will then be introduced to a computational framework that enables the quantification of the neural evidence that is available for decision formation. This approach uses multivariate single-trial decoding to identify spatiotemporal EEG components that reflect distinct stages of neural information processing. Through hands-on demonstrations, we will examine how to extract EEG components that represent: (1) sensory versus decision-related evidence; (2) changes in neural representations due to learning/practice; (3) modality-specific (auditory and visual) and integrated (audiovisual) neural signals. Then, by linking these neural components to behavioral performance, we will quantify the neural correlates of multisensory benefits.

As an optional last step, we will introduce a novel information-theoretic framework to characterize the nature of multisensory interactions. This approach will quantify unique (modality-specific), redundant (supramodal), and synergistic (multisensory) contributions of sensory modalities, and will map when and where these interactions emerge in the human brain. Participants will gain access to a user-friendly implementation of the computational framework, enabling them to apply these methods to their own EEG data and research questions in multisensory perception and cognition.
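The workshop's framework is implemented in Matlab, but the basic idea of multivariate single-trial decoding can be sketched in a few lines of Python: learn a spatial filter across channels that maps each trial to a decision value. The data, labels, and numbers below are synthetic and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic single-trial "EEG": trials x channels at one post-stimulus
# time point. The class difference lies along an assumed scalp topography.
n_trials, n_channels = 200, 16
pattern = rng.normal(0, 1, n_channels)        # hypothetical topography
labels = rng.integers(0, 2, n_trials)         # toy condition labels (0/1)
X = rng.normal(0, 1, (n_trials, n_channels)) + np.outer(labels, pattern)

# Multivariate single-trial decoding: fit a least-squares spatial filter w
# on training trials, then read out one decision value per held-out trial.
train, test = slice(0, 150), slice(150, None)
Xb = np.column_stack([X, np.ones(n_trials)])  # add an intercept column
w, *_ = np.linalg.lstsq(Xb[train], labels[train], rcond=None)
y = Xb[test] @ w                              # single-trial decision values
acc = np.mean((y > 0.5) == labels[test])
print(f"held-out decoding accuracy: {acc:.2f}")
```

Repeating this fit at every time point yields the spatiotemporal components described above, and the continuous decision values `y` are what get linked to behavioral performance.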

Dependencies: Matlab


(3) Modelling multisensory processing within the causal inference framework

Organizers: Ulrik Beierholm, Haocheng Zhu

Topic: Causal inference, Modelling, Bayesian inference, Theory development 

Description: Computational models provide a precise yet concise way of describing the mechanisms in a theory. It can, however, feel overwhelming for someone without a computational/mathematical background to attempt to use them. A prominent model in the multisensory field is the Bayesian Causal Inference (BCI) model, which proposes that the nervous system makes inferences about the causality of presented stimuli. A toolbox programmed in Python has recently been developed to model BCI without the need for any programming, making it easier for many researchers to apply it.
This workshop will explain the theoretical background of the model and train participants, through hands-on examples, in the use of the toolbox. All theory and tutorials will be framed in terms of standard experiments from the multisensory literature.
No programming experience is required, but participants should bring their own laptop with Python (3.9-3.11) preinstalled.
Participants are welcome to also bring their own data to the workshop and discuss how the toolbox can be used to test BCI models.
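To illustrate the core computation behind the BCI model, here is a short sketch of the posterior probability of a common cause for one audiovisual trial, following the standard formulation with Gaussian likelihoods and a zero-mean Gaussian spatial prior. The parameter values are invented for illustration, and this is not the toolbox's API.

```python
import numpy as np

# Bayesian Causal Inference for one audiovisual trial: given noisy auditory
# and visual measurements x_a and x_v, infer whether they share a cause.
sigma_a, sigma_v, sigma_p = 2.0, 1.0, 10.0  # cue and prior widths (illustrative)
p_common = 0.5                              # prior probability of a common cause

def posterior_common(x_a, x_v):
    """P(C=1 | x_a, x_v) under Gaussian likelihoods and a zero-mean prior."""
    # Likelihood of both measurements under one shared source
    # (single source s integrated out analytically):
    var1 = (sigma_a**2 * sigma_v**2 + sigma_a**2 * sigma_p**2
            + sigma_v**2 * sigma_p**2)
    L1 = np.exp(-((x_a - x_v)**2 * sigma_p**2 + x_a**2 * sigma_v**2
                  + x_v**2 * sigma_a**2) / (2 * var1)) / (2 * np.pi * np.sqrt(var1))
    # Likelihood under two independent sources:
    L2 = (np.exp(-x_a**2 / (2 * (sigma_a**2 + sigma_p**2)))
          / np.sqrt(2 * np.pi * (sigma_a**2 + sigma_p**2))
          * np.exp(-x_v**2 / (2 * (sigma_v**2 + sigma_p**2)))
          / np.sqrt(2 * np.pi * (sigma_v**2 + sigma_p**2)))
    return L1 * p_common / (L1 * p_common + L2 * (1 - p_common))

print(posterior_common(1.0, 1.5))   # nearby cues -> high P(common cause)
print(posterior_common(1.0, 15.0))  # discrepant cues -> low P(common cause)
```

The model's signature prediction follows directly: integration when cues are close, segregation when they are far apart.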

Dependencies: Python 3.9-3.11
