As a music technology researcher, I’m aware of the incredible amount of information available about music. The practice of Computational Music Analysis involves analyzing large databases of music to find patterns representing distinct styles, genres and composers.
At the end of my master’s degree, I worked with developers at the ELVIS Project to make a sonification interface for rapidly scanning through databases of music by ear. Audio seemed like an intuitive choice to represent this data.
It was fun to find out that the differences between composers could be heard, even at 10,000 notes per second.
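To give a flavor of what sonification at that rate means, here is a minimal sketch of the idea: each note is rendered as a very short sine-wave burst, so at thousands of notes per second the bursts fuse into a texture shaped by the pitch content. This is my own illustrative code, assuming MIDI pitch numbers as input; it is not the ELVIS Project's actual renderer.

```python
import math

SAMPLE_RATE = 44100  # audio samples per second

def sonify(midi_pitches, notes_per_second=10_000):
    """Render a list of MIDI pitches as a raw audio signal (a list of
    floats in [-1, 1]). Each note becomes a short sine burst at the
    note's frequency; at high playback rates the bursts blur into a
    texture whose character reflects the pitch distribution.
    (A simplified sketch, not the actual ELVIS interface.)
    """
    # At 10,000 notes/s and 44.1 kHz, each note gets only ~4 samples.
    samples_per_note = max(1, round(SAMPLE_RATE / notes_per_second))
    signal = []
    for pitch in midi_pitches:
        # Standard MIDI-to-frequency conversion (A4 = MIDI 69 = 440 Hz).
        freq = 440.0 * 2 ** ((pitch - 69) / 12)
        for n in range(samples_per_note):
            signal.append(math.sin(2 * math.pi * freq * n / SAMPLE_RATE))
    return signal

# A C major triad (C4, E4, G4) rendered at 10,000 notes per second:
audio = sonify([60, 64, 67])
```

Writing `audio` out through Python's `wave` module (after converting to 16-bit integers) would let you hear the resulting texture directly.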
You can find the app, the source code, and more examples here.
- Winters, R. M., & Cumming, J. E. (2014). Sonification of Symbolic Music in the ELVIS Project. Proceedings of the 20th International Conference on Auditory Display. New York, NY.
- Leenders-Cheng, V. (2014, April 2). The Big Picture on Big Data – Mapping Musical Intervals. McGill Reporter.