MIMS Final Project 2018

Auditory Data Representation


With the rise of both virtual assistants and software-embedded devices, audio-first interactions are becoming more prevalent in daily life. However, there is not yet an industry standard for designing sound experiences to communicate data generated by ubiquitous computing, particularly through emergent conversational interfaces. We imagine a future in which we will be able to explore data with a combination of the ear and a voice user interface: how might Alexa enable us to explore complex data? How might we hear the weather, stock market prices, or the status of our IoT-connected home – by sound alone?

We have conducted an in-depth literature review of prior work on data sonification, organizing our analysis around the visual analogs of “auditory graphs” (e.g., histograms, scatterplots, pie charts). Spanning disciplines including human-computer interaction (HCI), accessibility, music, and art, these papers demonstrated a wide range of approaches, from expressive, subjective design proposals to more functional, objective representations vetted through experimental procedures.

A major difficulty we encountered while reviewing the literature, however, was that much of it is dated and the audio files associated with the papers are no longer accessible. We therefore reproduced three different sonification methods in our own voice user interface (VUI), representing data from the 2010 Census and the 2015 American Community Survey. With this VUI, we conducted five in-person usability test sessions to evaluate the potential of auditory data exploration through a contemporary, conversational interface, and to develop recommendations for future work.
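The project summary does not specify which sonification methods were reproduced, but a common building block in the auditory-graph literature is mapping data values onto pitch. The sketch below (all function names are hypothetical illustrations, not from the project) uses an exponential frequency mapping so that equal steps in the data sound like equal musical intervals:

```python
import math

def value_to_pitch(value, vmin, vmax, f_low=220.0, f_high=880.0):
    """Map a data value into a pitch range (Hz), using an exponential
    mapping so equal data steps correspond to equal musical intervals."""
    t = (value - vmin) / (vmax - vmin)  # normalize to [0, 1]
    return f_low * (f_high / f_low) ** t

def sonify(series):
    """Turn a numeric series into a sequence of tone frequencies,
    one tone per data point (an auditory analog of a bar chart)."""
    vmin, vmax = min(series), max(series)
    return [value_to_pitch(v, vmin, vmax) for v in series]

# Example: a rising series maps to a rising melody (A3 up to A5)
print([round(f, 1) for f in sonify([10, 20, 30, 40])])
# → [220.0, 349.2, 554.4, 880.0]
```

In a VUI, each frequency would then be rendered as a short tone, letting a listener hear the shape of the data series without any visual display.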

This is an exciting new project that combines lessons from tangible user interface design, natural language processing, and information visualization courses at UC Berkeley’s School of Information, with sponsorship from the Berkeley Center for New Media. While primarily aimed at smart devices, the project also has implications for accessible design patterns, offering everyone an alternative way to interact with data.

Last updated: May 6, 2018