Gil Rodríguez, R., Hedjar, L., Toscani, M., Guarnera, D., Guarnera, G. C. and Gegenfurtner, K. R., 2024. Color constancy mechanisms in virtual reality environments. Journal of Vision, 24 (5), 6.
Full text available as:
PDF (Open Access Article): i1534-7362-24-5-6_1715250544.15467.pdf - Published Version, 7MB. Available under License Creative Commons Attribution Non-commercial No Derivatives.
Copyright to original material in this document is with the original owner(s). Access to this content through BURO is granted on condition that you use it only for research, scholarly or other non-commercial purposes. If you wish to use it for any other purposes, you must contact BU via BURO@bournemouth.ac.uk. Any third party copyright material in this document remains the property of its respective owner(s). BU grants no licence for further use of that third party material.
DOI: 10.1167/jov.24.5.6
Abstract
Prior research has demonstrated high levels of color constancy in real-world scenarios featuring single light sources, extensive fields of view, and prolonged adaptation periods. However, exploring the specific cues humans rely on becomes challenging, if not infeasible, with actual objects and lighting conditions. To circumvent these obstacles, we employed virtual reality technology to craft immersive, realistic settings that can be manipulated in real time. We designed forest and office scenes illuminated by five colors. Participants selected the test object most resembling a previously shown achromatic reference. To study color constancy mechanisms, we modified the scenes to neutralize three contributors: local surround (placing a uniform-colored leaf under test objects), maximum flux (keeping the brightest object constant), and spatial mean (maintaining a neutral average light reflectance), employing two methods for the latter: changing object reflectances or introducing new elements. We found that color constancy was high in conditions with all cues present, aligning with past research. However, removing individual cues led to varied impacts on constancy. Local surrounds significantly reduced performance, especially under green illumination, showing a strong interaction between greenish light and rose-colored contexts. In contrast, the maximum flux mechanism barely affected performance, challenging assumptions used in white balancing algorithms. The spatial mean experiment showed disparate effects: Adding objects slightly impacted performance, while changing reflectances nearly eliminated constancy, suggesting human color constancy relies more on scene interpretation than pixel-based calculations.
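The maximum flux and spatial mean cues discussed in the abstract correspond roughly to the classic white-patch (max-RGB) and gray-world heuristics used in automatic white balancing. The sketch below is purely illustrative and is not taken from the paper; the function names, the synthetic test scene, and the greenish illuminant are all assumptions chosen to show how these pixel-based illuminant estimates work.

```python
# Illustrative sketch (not from the paper): gray-world and white-patch
# white balancing, the pixel-based heuristics the abstract alludes to.
import numpy as np


def gray_world(image: np.ndarray) -> np.ndarray:
    """Gray-world: assume the spatial mean reflectance is neutral,
    so the per-channel mean estimates the illuminant color."""
    illuminant = image.reshape(-1, 3).mean(axis=0)
    balanced = image / (illuminant + 1e-8)
    return balanced / balanced.max()  # rescale back into [0, 1]


def white_patch(image: np.ndarray) -> np.ndarray:
    """White-patch / max-RGB: assume the brightest value in each channel
    (the 'maximum flux') comes from a surface reflecting the illuminant."""
    illuminant = image.reshape(-1, 3).max(axis=0)
    return image / (illuminant + 1e-8)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical scene: random reflectances rendered under a greenish light.
    reflectances = rng.uniform(0.0, 1.0, size=(64, 64, 3))
    illuminant = np.array([0.8, 1.0, 0.6])
    scene = reflectances * illuminant
    print("channel means, raw:        ", scene.reshape(-1, 3).mean(axis=0).round(3))
    print("channel means, gray-world: ", gray_world(scene).reshape(-1, 3).mean(axis=0).round(3))
    print("channel means, white-patch:", white_patch(scene).reshape(-1, 3).mean(axis=0).round(3))
```

Both estimators discount the illuminant from simple image statistics alone; the paper's finding that changing scene reflectances nearly abolishes constancy, while holding the brightest object constant barely matters, is what challenges the assumption that human vision relies on such pixel-based calculations.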
Item Type: Article
ISSN: 1534-7362
Uncontrolled Keywords: Humans; Virtual Reality; Color Perception; Cues; Lighting; Adult; Male; Female; Photic Stimulation; Young Adult
Group: Faculty of Science & Technology
ID Code: 39853
Deposited By: Symplectic RT2
Deposited On: 17 May 2024 10:28
Last Modified: 17 May 2024 10:28