
What Does Data Sound Like?  An Overview of Data Sonification

In recent years the availability of data has soared, creating a need for new methods of perceiving patterns and trends. A great deal of attention has been given to visualization methods for sifting through this data, and there is much we can learn from visual representations of it.

But what about our other senses, such as listening?

Data sonification means representing data as non-speech sound.  The basic principles are similar to visualization, but where visualizations use elements such as lines, shapes, and colours, sonification relies on sound properties such as volume, pitch, and rhythm.
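
To make this mapping concrete, here is a minimal sketch (not drawn from any of the projects discussed here) that turns a list of numbers into a short audio file, with higher values producing higher-pitched tones. It uses only Python's standard library; the function name and parameters are hypothetical and purely for illustration.

    import math
    import struct
    import wave

    SAMPLE_RATE = 44100
    TONE_SECONDS = 0.25

    def sonify(values, filename="sonification.wav", low_hz=220.0, high_hz=880.0):
        """Map each value to a sine tone between low_hz and high_hz."""
        lo, hi = min(values), max(values)
        span = (hi - lo) or 1.0
        frames = []
        for v in values:
            freq = low_hz + (v - lo) / span * (high_hz - low_hz)  # value -> pitch
            for i in range(int(SAMPLE_RATE * TONE_SECONDS)):
                sample = math.sin(2 * math.pi * freq * i / SAMPLE_RATE)
                frames.append(struct.pack('<h', int(sample * 0.5 * 32767)))
        with wave.open(filename, 'wb') as wav:
            wav.setnchannels(1)        # mono
            wav.setsampwidth(2)        # 16-bit samples
            wav.setframerate(SAMPLE_RATE)
            wav.writeframes(b''.join(frames))

    # Rising values produce a rising sequence of pitches.
    sonify([1, 2, 3, 5, 8, 13, 21])

Volume and rhythm could be mapped in much the same way, for example by scaling each tone's amplitude or the spacing between tones.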

For a simple example, see University of Michigan librarian Justin Joque’s sonification of the Dow. A tone represents daily trading volume and changes in the closing value of the Dow from 1928 to 2011.  As stock market activity increases over time, the tone increases in pitch and complexity.

Why?

If you listened to Joque’s example and found it confusing, that’s okay. Many sonifications sound abstract and confusing, especially for new listeners.  Most people are not used to decoding this sort of abstract information from non-speech sound, and the process takes some getting used to.

So why go through the trouble?  Why use sonification?

Accessibility

An obvious strength of sonification is its utility for people with visual impairments.  Sonification has been used by many researchers to make information about objects, colours, and even maps perceivable to visually-impaired users.  Representing information in multiple ways that can be understood by those with different perceptual abilities is vital for accessibility.

Temporality

Temporality is a defining feature of sound.  All sounds occur across time in the same way images occur across space.  As a result, sonifications are well suited to representing temporal or sequential data.  Additionally, because the human auditory system is extremely adept at discerning temporal and rhythmic patterns, sonification can convey these patterns with great nuance.

The temporal strengths of sonification motivated me, Jeffrey Boase, and Hirokazu Oda when we designed the E-Rhythms Data Sonifier software. The basic premise of the software is that it creates audio histograms from time-stamped event logs.  If a lot of events occur in a short time, the sonification produces frequent and loud sounds, and if there are few events, the sounds produced are quiet and less frequent.  This makes it possible to listen to temporal and rhythmic patterns within log data.
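
As a rough illustration of the audio-histogram idea (a hypothetical sketch, not the E-Rhythms code itself), event timestamps can be counted into time bins and each bin's count normalized to a loudness value for playback:

    from collections import Counter

    def audio_histogram(timestamps, bin_seconds=3600):
        """Return (bin_index, loudness 0..1) pairs from event timestamps in seconds."""
        counts = Counter(int(t // bin_seconds) for t in timestamps)
        if not counts:
            return []
        first, last = min(counts), max(counts)
        peak = max(counts.values())
        # Busy bins play loud; bins with no events stay silent.
        return [(b, counts.get(b, 0) / peak) for b in range(first, last + 1)]

    # Example: a busy first hour, a quieter second hour, a single late event.
    events = [10, 40, 70, 3650, 3700, 7300]   # seconds since start of the log
    print(audio_histogram(events))            # [(0, 1.0), (1, 0.66...), (2, 0.33...)]

A synthesizer stage would then play one sound per bin, scaled by these loudness values, so dense periods of activity come through as loud, rapid pulses.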

For example, Jeffrey Boase and I used this software to explore temporal patterns in anonymized logs of text message metadata.  When listening to the total number of text messages sent by a large group of study participants, we would hear a steady rhythm of loud sounds as many people texted during the day, and quiet sounds or silence at night when most people were sleeping (and not texting).

The purpose of data exploration is to perceive data from a variety of perspectives so as to reveal patterns or trends that would be difficult to see otherwise.  Using sonification we could listen to texting patterns among groups, between pairs, and by individuals, all with a focus on temporal patterns in the data. This gave us a new perspective on our dataset and led us to generate hypotheses we hadn’t considered before listening.

Background monitoring

Because sonification does not require visual attention, it can be used to monitor information in the background.

One example of this is the LARAS Sonification Laboratory’s projects that allow network administrators to monitor network activity by listening to sonifications. In 2012, the Sonification Lab introduced inteNtion, and in 2014 they presented [NeAR] – Network Activity Renderer, both of which allow administrators to hear unusual behaviour – which could indicate hacking and abuse – while their eyes are focused on other tasks.

Sonification has also been used for real-time monitoring of athletes’ physical performance.  Various sensors collect information about athletes’ motions, posture, and force, then that information is sonified so athletes can listen to their performance in real time. For example, Stephan Barrass described a workshop he led about using sonification for elite rowing training. And more recently, the company Rowing in Motion has released a smartphone application for this purpose. Athletes using these sorts of applications can listen to feedback about their performance without compromising their visual focus.

Public outreach

Finally, the musical qualities of sonification make it an engaging tool for presenting information to the public.

The LHC Sound project exemplifies this.  The project involved sonifying data from the Large Hadron Collider. Part of the purpose was to provide a new analysis tool for researchers, and as an additional benefit it generated a great deal of media attention.  In an interview with the BBC, LHC Sound software engineer Archer Enrich said the sonification is “true to the data, and it’s telling you something about the data that you couldn’t know in any other way.”  He also noted that when listening to the sonification, “you feel closer to the mystery of Nature which I think a lot of scientists do when they get deep into these matters.”

Brian Foo’s sonification of income inequality in New York City provides another example.  The sonification emulates a subway ride through Brooklyn, Manhattan, and the Bronx. As the listener ‘travels’ through different neighbourhoods, the sounds change to indicate the median household income in each area.  The sonification is entertaining simply as a piece of music and manages to convey powerful information about income inequality in a few short minutes.

For now, sonification is unfamiliar to many, but these examples reveal how it is emerging as a useful tool for exploring and presenting patterns in data.  By offering perspectives on data that are difficult to replicate in other forms, sonification can reveal new sorts of knowledge and engage a variety of audiences.

Jack Jamieson is a PhD student at the University of Toronto’s Faculty of Information. His research examines how emerging communication technologies affect and are affected by cultural and social activities. His current research examines how social, cultural, and political values influence web development technologies. Other interests include user-generated mobile literature and exploratory analysis using data sonification. He can be reached by email at jack.jamieson [at] mail.utoronto.ca.

