Li, Q., Meso, A.I., Logothetis, N.K. and Keliris, G.A., 2019. Scene regularity interacts with individual biases to modulate perceptual stability. Frontiers in Neuroscience, 13, 523.
Full text available as:

PDF (open access article): fnins-13-00523 (1).pdf - Published Version. Available under License: Creative Commons Attribution. 3MB
Copyright to original material in this document is with the original owner(s). Access to this content through BURO is granted on condition that you use it only for research, scholarly or other non-commercial purposes. If you wish to use it for any other purposes, you must contact BU via BURO@bournemouth.ac.uk. Any third party copyright material in this document remains the property of its respective owner(s). BU grants no licence for further use of that third party material.
Abstract
Sensory input is inherently ambiguous but our brains achieve remarkable perceptual stability. Prior experience and knowledge of the statistical properties of the world are thought to play a key role in the stabilization process. Individual differences in responses to ambiguous input and biases towards one or the other interpretation could modulate the decision mechanism for perception. However, the role of perceptual bias and its interaction with stimulus spatial properties such as regularity and element density remain to be understood. To this end, we developed novel bi-stable moving visual stimuli in which perception could be parametrically manipulated between two possible mutually exclusive interpretations: transparently or coherently moving. We probed perceptual stability across three composite stimulus element density levels with normal or degraded regularity using a factorial design. We found that increased density led to the amplification of individual biases and consequently to a stabilization of one interpretation over the alternative. This effect was reduced for degraded regularity, demonstrating an interaction between density and regularity. To understand how prior knowledge could be used by the brain in this task, we compared the data with simulations from four different hierarchical models of causal inference. These models made different assumptions about the use of prior information by including conditional priors that either facilitated or inhibited motion direction integration. An architecture that included a prior inhibiting motion direction integration consistently outperformed the others. Our results support the hypothesis that direction integration based on sensory likelihoods may be the default processing mode, with conditional priors inhibiting integration employed to help motion segmentation and transparency perception.
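The model comparison described in the abstract rests on Bayesian causal inference: the observer weighs whether two motion components share a common cause (one direction, coherent motion) or separate causes (two directions, transparent motion), and a conditional prior on the common-cause hypothesis can facilitate or inhibit direction integration. The sketch below is a minimal illustration of that inference step, not the authors' fitted model: it uses the standard Gaussian causal-inference arithmetic, treats direction linearly rather than circularly, and all parameter names and values (`sigma`, `sigma_p`, `p_common`) are illustrative placeholders.

```python
import numpy as np

def p_coherent(x1, x2, sigma=10.0, mu_p=0.0, sigma_p=40.0, p_common=0.3):
    """
    Posterior probability that two noisy direction measurements x1 and x2
    (degrees) arise from one shared motion direction (coherent motion)
    rather than two independent directions (transparent motion).
    Linear-Gaussian approximation; parameter values are placeholders.
    """
    var = sigma**2

    # Likelihood of (x1, x2) given a single common direction, with that
    # direction drawn from a Gaussian prior N(mu_p, sigma_p^2) and then
    # marginalised out.
    denom_c = var * var + 2.0 * var * sigma_p**2
    like_common = np.exp(-0.5 * ((x1 - x2)**2 * sigma_p**2
                                 + (x1 - mu_p)**2 * var
                                 + (x2 - mu_p)**2 * var) / denom_c) \
                  / (2.0 * np.pi * np.sqrt(denom_c))

    # Likelihood given two independent directions: each measurement is
    # marginally Gaussian with variance sigma^2 + sigma_p^2.
    var_ind = var + sigma_p**2
    like_indep = (np.exp(-0.5 * (x1 - mu_p)**2 / var_ind)
                  / np.sqrt(2.0 * np.pi * var_ind)) \
               * (np.exp(-0.5 * (x2 - mu_p)**2 / var_ind)
                  / np.sqrt(2.0 * np.pi * var_ind))

    # p_common acts as the conditional prior on integration: a low value
    # inhibits direction integration and favours the transparent reading.
    return (p_common * like_common
            / (p_common * like_common + (1.0 - p_common) * like_indep))

# The same pair of measurements is read as coherent under a prior that
# favours integration, and as ambiguous under one that inhibits it.
print(p_coherent(-5.0, 5.0, p_common=0.7))    # ~0.85: integration favoured
print(p_coherent(-5.0, 5.0, p_common=0.3))    # ~0.50: integration inhibited
print(p_coherent(-40.0, 40.0, p_common=0.3))  # near 0: transparent motion
```

In this toy setting, shifting `p_common` reproduces the qualitative distinction the abstract draws between architectures whose conditional priors facilitate versus inhibit motion direction integration; the paper's hierarchical models and fitted parameters are considerably richer than this single step.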
| Item Type: | Article |
|---|---|
| ISSN: | 1662-4548 |
| Uncontrolled Keywords: | visual perception, bias, bayesian, computational modeling, regularity, psychophysics, human perception, motion perception |
| Group: | Faculty of Science & Technology |
| ID Code: | 32384 |
| Deposited By: | Symplectic RT2 |
| Deposited On: | 11 Jun 2019 08:37 |
| Last Modified: | 14 Mar 2022 14:16 |