Multisensory Integration and Conflict

Humans sample information from a rich diversity of sensory systems. These different information channels allow us to exploit correlations between various sensory features and eventually form coherent perceptual representations of the objects in the environment. Many popular phenomena used to address these multisensory interactions are based on inter-sensory conflict, such as the Ventriloquist Illusion, the McGurk effect, and the Rubber Hand Illusion, to name a few. What is more, the Bayesian framework of multisensory perception, currently the dominant approach, is grounded in evidence from experiments that precisely manipulate cross-modal disparity. Despite the prominent role of cross-modal disparity in all these approaches, it is remarkable that prior work on multisensory perception has paid little attention to the role of conflict itself. Here, we propose that conflict is not only a shortcut for revealing possible multisensory interactions in the laboratory, but a fundamental vehicle for information integration via the cognitive conflict brain network.
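For illustration, the reliability-weighted fusion rule at the core of this Bayesian framework can be sketched as follows. This is a minimal example assuming Gaussian noise on each cue; the function name, variable names and example values are placeholders for illustration, not part of the project itself.

```python
import numpy as np

# Minimal sketch of reliability-weighted cue fusion under the standard
# Bayesian (maximum-likelihood) model with independent Gaussian noise.
# Names and values are illustrative placeholders.

def fuse_cues(x_audio, sigma_audio, x_visual, sigma_visual):
    """Combine two noisy location estimates into a single fused estimate."""
    w_audio = (1 / sigma_audio**2) / (1 / sigma_audio**2 + 1 / sigma_visual**2)
    w_visual = 1 - w_audio
    fused_mean = w_audio * x_audio + w_visual * x_visual
    fused_sigma = np.sqrt(1 / (1 / sigma_audio**2 + 1 / sigma_visual**2))
    return fused_mean, fused_sigma

# Example: a ventriloquist-like situation in which vision is more reliable,
# so the fused percept is pulled towards the visual location.
print(fuse_cues(x_audio=10.0, sigma_audio=5.0, x_visual=0.0, sigma_visual=1.0))
```

Under these assumptions, each cue is weighted by its relative reliability, and the fused estimate is always more precise than either cue alone.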
In particular, the present proposal addresses the hypothesis that the conflict monitoring system plays a role in multisensory perception by regulating different possible representations of the incoming sensory inputs. We think that applying the cognitive conflict framework to perception fits well with current accounts of multisensory perception based on Bayesian principles and can be extended to perceptual inference more broadly. But why would conflict processing be relevant for multisensory perception? The idea builds on our previous work on multisensory integration and attention, and on predictive coding theory. According to predictive coding, during perception the noisy evidence arriving from the different senses is compared to predictions based on internal representations, and the output of this comparison is used in turn to recurrently update those internal representations. We hypothesise that during this inference process, cross-modal disparity between the inputs engages the conflict network. Furthermore, we posit that these conflict responses reflect competition between alternative internal models, helping to arbitrate between fused and segregated representations of the sensory inputs.
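As a purely illustrative sketch of how such an arbitration can be formalised, the standard Bayesian causal inference model compares a common-cause (fused) model against a separate-causes (segregated) model given two noisy cues. The Gaussian assumptions, prior width, and parameter values below are placeholder assumptions for the example, not the specific model the project commits to.

```python
import numpy as np

def p_common_cause(x_a, x_v, sigma_a, sigma_v,
                   mu_p=0.0, sigma_p=20.0, prior_common=0.5):
    """Posterior probability that two noisy cues share a single cause."""
    var_a, var_v, var_p = sigma_a**2, sigma_v**2, sigma_p**2

    # Likelihood of both measurements under one shared source
    # (source location integrated out analytically; Gaussian assumptions).
    denom_c1 = var_a * var_v + var_a * var_p + var_v * var_p
    like_c1 = np.exp(-0.5 * ((x_a - x_v)**2 * var_p
                             + (x_a - mu_p)**2 * var_v
                             + (x_v - mu_p)**2 * var_a) / denom_c1) \
              / (2 * np.pi * np.sqrt(denom_c1))

    # Likelihood under two independent sources (each integrated out).
    like_c2 = (np.exp(-0.5 * (x_a - mu_p)**2 / (var_a + var_p))
               / np.sqrt(2 * np.pi * (var_a + var_p))) \
            * (np.exp(-0.5 * (x_v - mu_p)**2 / (var_v + var_p))
               / np.sqrt(2 * np.pi * (var_v + var_p)))

    # Bayes' rule over the two candidate internal models.
    return prior_common * like_c1 / (prior_common * like_c1
                                     + (1 - prior_common) * like_c2)

# Small cross-modal disparity favours the fused (common-cause) interpretation,
# large disparity favours segregation.
print(p_common_cause(x_a=2.0, x_v=0.0, sigma_a=5.0, sigma_v=1.0))
print(p_common_cause(x_a=25.0, x_v=0.0, sigma_a=5.0, sigma_v=1.0))
```

Under these assumptions, small disparities yield a high posterior probability of a common cause, and hence fusion, whereas large disparities favour segregation.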
We plan to provide a proof of concept for our hypothesis through eight experiments using behavioural and EEG measures. Our team combines expertise in EEG analysis, psychophysics, brain oscillations, and multisensory integration. To the best of our knowledge, this is an original approach to understanding multisensory processes, because cognitive conflict theory has been developed mostly around sensorimotor conflict rather than perceptual conflict. Accomplishing the goals of this project will shed new light on the mechanisms that allow humans to form unified and coherent representations of the sensory environment. It should also help bridge the gap between two successful, but currently unconnected, theories about the human brain: predictive coding on the one hand and cognitive conflict on the other.

Principal researchers

Salvador Soto Faraco
Funding: Ministerio de Ciencia, Innovación y Universidades (MCIU), Agencia Estatal de Investigación (AEI), ref: PID2019-108531GB-I00/AEI/10.13039/501100011033