Artificial Intelligence (AI) is a fundamental building block of many new technologies. In the wild, AI algorithms are often designed to operate autonomously, but to communicate with humans they need to perceptualize what they know: to turn their internal state into something people can see or hear.
One of the most rewarding projects I’ve worked on was designing an AI sonification system to help doctors with melanoma diagnosis. A dermatologist snaps a photo of a suspicious region of skin, triggering analysis by a computer vision system trained to detect skin cancer. The doctor gets a readout of the results and a brief sonic summary. If the sound is scary, alarming, or ugly… it’s probably coming off.
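To give a flavor of how a sonification like this can work, here is a minimal sketch that maps a hypothetical malignancy probability from a classifier to a short audio cue. The mapping (pitch drops and the timbre roughens as risk rises) and the function `sonify` are illustrative assumptions, not the published design.

```python
# Illustrative sketch only: the mapping below is an assumption,
# not the sonification described in the publications.
import numpy as np
import wave

def sonify(probability, duration=1.0, sample_rate=44100):
    """Render a short tone whose character tracks the model's output.

    probability: assumed classifier output in [0, 1].
    Higher probability -> lower pitch and rougher timbre (more alarming).
    """
    t = np.linspace(0, duration, int(sample_rate * duration), endpoint=False)
    # Benign regions get a high, pure tone; suspicious ones drop in pitch.
    freq = 880.0 - 440.0 * probability
    tone = np.sin(2 * np.pi * freq * t)
    # Add amplitude-modulated roughness as the probability rises.
    roughness = probability * np.sin(2 * np.pi * 30.0 * t)
    signal = tone * (1.0 + roughness)
    signal /= np.max(np.abs(signal))           # normalize to [-1, 1]
    pcm = (signal * 32767).astype(np.int16)    # 16-bit PCM

    with wave.open("lesion_cue.wav", "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(sample_rate)
        f.writeframes(pcm.tobytes())

# e.g. sonify(0.85) writes a low, rough cue for a high-risk lesion.
```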
Publications
- Winters, R. M., Kalra, A., & Walker, B. (2019). Hearing Artificial Intelligence: Sonification Guidelines & Results from a Case-Study in Melanoma Diagnosis. Proceedings of the 25th International Conference on Auditory Display, 262–267. Northumbria, UK.
- Walker, B. N., Rehg, J. M., Kalra, A., Winters, R. M., Drews, P., Dascalu, J., David, E. O., & Dascalu, A. (2019). Dermoscopy diagnosis of cancerous lesions utilizing dual deep learning algorithms via visual and audio (sonification) outputs: Laboratory and prospective observational studies. EBioMedicine, 40, 176–183.
Press
- Bostel Technologies (2019). The Sounds of Nevi: Diagnosing Skin Cancer More Accurately with a New Deep Learning Technology and Telemedicine. MarketWatch, Feb. 29, 2019.