Sonification of Emotion

At the Input Devices and Music Interaction Lab (IDMIL), I got involved with a technology startup specializing in emotion recognition. Their algorithm predicted real-time arousal-valence coordinates from physiological sensors worn on the body. They thought audio would be a great way to showcase their technology, and we did too. We co-wrote an NSERC Engage grant and received funding to start our collaboration.

We made a video to help get the word out.

Fortunately for our project, music is full of emotion, and researchers have identified acoustic and structural cues that underlie perceived emotion in music. I coded these into a sonification model that changed its sound with the arousal/valence coordinates.

Excited – High Arousal, Positive Valence
Scared – High Arousal, Negative Valence
Relaxed – Low Arousal, Positive Valence
Sad – Low Arousal, Negative Valence
Neutral – Mid Arousal, No Valence
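The quadrant mapping above can be sketched in code. This is a minimal illustration, not the model described in the post: the function name, parameter ranges, and the specific cue mappings (arousal driving tempo and loudness, valence driving mode and brightness) are my assumptions about how such a mapping might look.

```python
def sonification_params(arousal, valence):
    """Map arousal/valence coordinates in [-1, 1] to synthesis parameters.

    Illustrative assumptions: arousal controls tempo and loudness,
    valence controls mode (major/minor) and spectral brightness.
    """
    # Clamp inputs to the expected range
    arousal = max(-1.0, min(1.0, arousal))
    valence = max(-1.0, min(1.0, valence))
    return {
        "tempo_bpm": 90 + 50 * arousal,        # faster when excited or scared
        "loudness_db": -12 + 9 * arousal,      # louder at high arousal
        "mode": "major" if valence >= 0 else "minor",
        "brightness": 0.5 + 0.4 * valence,     # e.g. a filter cutoff, 0..1
    }

# The five labelled states from the list above:
excited = sonification_params(0.8, 0.8)    # fast, loud, major, bright
scared  = sonification_params(0.8, -0.8)   # fast, loud, minor, dark
relaxed = sonification_params(-0.8, 0.8)   # slow, quiet, major, bright
sad     = sonification_params(-0.8, -0.8)  # slow, quiet, minor, dark
neutral = sonification_params(0.0, 0.0)    # mid tempo, mid brightness
```

In a real system these parameters would drive a synthesizer continuously, so listeners hear the sound glide between quadrants as the sensor estimates change.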


Journal Article

Conference Papers
