At the Input Devices and Music Interaction Lab (IDMIL), I got involved with a technology startup specializing in emotion recognition. Their algorithm predicted arousal-valence coordinates in real time from physiological sensors worn on the body. They thought audio would be a great way to showcase their technology, and we did too. We wrote an NSERC Engage Grant together and received funding to start our collaboration.
Fortunately for our project, music is full of emotion, and researchers have identified many of the acoustic and structural cues that underlie perceived emotion in music. I encoded these cues into a sonification model whose sound changed continuously with the incoming arousal/valence coordinates.
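To give a sense of how such a mapping can work, here is a minimal sketch in Python. The parameter names, ranges, and cue assignments are illustrative assumptions on my part, not the actual model from the publications below; they follow commonly reported associations (arousal with tempo, loudness, and brightness; valence with mode and register).

```python
# Hypothetical sketch: mapping arousal/valence coordinates to acoustic cues.
# All parameter names and ranges are illustrative assumptions, not the
# model described in the publications listed below.

def emotion_to_sound(arousal: float, valence: float) -> dict:
    """Map (arousal, valence) in [-1, 1] to acoustic cue values."""
    # Clamp inputs to the expected range.
    arousal = max(-1.0, min(1.0, arousal))
    valence = max(-1.0, min(1.0, valence))

    return {
        "tempo_bpm": 60 + 60 * (arousal + 1) / 2,       # 60-120 BPM, faster with arousal
        "loudness_db": -30 + 24 * (arousal + 1) / 2,    # -30 to -6 dBFS, louder with arousal
        "brightness": 0.2 + 0.8 * (arousal + 1) / 2,    # low-pass filter openness
        "mode": "major" if valence >= 0 else "minor",   # mode tracks valence sign
        "pitch_offset_semitones": round(7 * valence),   # higher register for positive valence
    }


if __name__ == "__main__":
    # Example: high arousal, positive valence (e.g. "excited").
    print(emotion_to_sound(0.8, 0.6))
```

In practice these cue values would drive a synthesis engine (e.g. a Max/MSP or SuperCollider patch) updated at the sensor frame rate, so the sound tracks the listener's emotional state continuously.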
Publications
Journal Article
- Winters, R. M., & Wanderley, M. M. (2014). Sonification of Emotion: Strategies and Results from the Intersection with Music. Organised Sound, 19(1), 60–69.
Conference Papers
- Winters, R. M., Hattwick, I., & Wanderley, M. M. (2013). Integrating Emotional Data into Music Performance: Two Audio Environments for the Emotional Imaging Composer. Proceedings of the 3rd International Conference on Music and Emotion. Jyväskylä, Finland.
- Winters, R. M., & Wanderley, M. M. (2013). Sonification of Emotion: Strategies for Continuous Auditory Display of Arousal and Valence. Proceedings of the 3rd International Conference on Music and Emotion. Jyväskylä, Finland.