
KIMA: The Wheel—Voice Turned into Vision: A participatory, immersive visual soundscape installation.

Gingrich, O., Renaud, A., Emets, E., Soraghan, S. and Villanueva-Ablanedo, D., 2018. KIMA: The Wheel—Voice Turned into Vision: A participatory, immersive visual soundscape installation. Leonardo. (In Press)

Full text available as:

PDF: LeonardoPaper_Redacted.pdf - Accepted Version (8MB)
Available under License Creative Commons Attribution Non-commercial No Derivatives.

Official URL: https://www.mitpressjournals.org/doi/abs/10.1162/l...

DOI: 10.1162/leon_a_01698

Abstract

Over the last five years, KIMA, an art and research project on sound and vision, has investigated the visual properties of sound. Previous iterations of KIMA focused on digital representations of cymatics—physical sound patterns—as a medium for performance. The current development incorporates neural networks and machine learning strategies to explore visual expressions of sound in participatory music creation. The project, displayed on a 360-degree canvas at the London Roundhouse, prompted the audience to explore their own voices as intelligent, real-time visual representations. Machine learning algorithms played a key role in the meaningful interpretation of sound as visual form. The resulting immersive performance turned the audience into co-creators of the piece.

Item Type: Article
ISSN: 0024-094X
Uncontrolled Keywords: Sound Art; Immersive Sound; Visual Sound; Participatory Art; Machine Learning
Group: Faculty of Media & Communication
ID Code: 31795
Deposited By: Unnamed user with email symplectic@symplectic
Deposited On: 15 Feb 2019 16:50
Last Modified: 10 Dec 2019 14:32
