
The Sound of Big Data

posted in: Academic Research, Tech Trends

 

Researchers from music and engineering teams are working together to turn big data into sound.

A collaboration between two professors – one of music and one of engineering – at Virginia Tech resulted in the creation of a new platform for data analysis that makes it possible to understand data better by turning it into sound.


This is a pioneering approach to studying spatially distributed data. Instead of placing information into a visual context to show patterns or correlations – that is, data visualization – the work uses an aural environment, leveraging the natural affordances of the space and the user’s location within the sound field.

The earth’s hemisphere is rendered as a half-dome (denoted in red) inside the Cube using immersive spatial sound. Each of the dome’s rectangular areas is assigned to one speaker that varies its loudness, pitch, timbre, and pulse rate to reflect changes in the atmospheric data.

Funded by the National Science Foundation, the work combines elements of music, geospatial science, computer science, and human-computer interaction. It’s the first time a research project has been led by a faculty member from the university’s School of Performing Arts working in collaboration with the College of Engineering.


Ivica Ico Bukvic, associate professor of composition and multimedia in the College of Liberal Arts and Human Sciences, and Greg Earle, professor of electrical and computer engineering, used infrastructure at the Institute for Creativity, Arts, and Technology to investigate how immersive sound can be used to develop our understanding of complex systems.


According to Bukvic, data sonification – which involves converting non-auditory information into sound – is a relatively unexplored area of research, yet it provides a unique perspective for exploring data. The human auditory system has a superior ability to recognize temporal changes and patterns, making sonification a powerful tool for studying complex systems.

“Identifying new time and space correlations between variables often leads to breakthroughs in the physical sciences,” explained Dr. Bukvic, who also serves as a senior fellow for the Institute for Creativity, Arts, and Technology. “It makes sense that we would want to go beyond two-dimensional graphical models of information and make new discoveries using senses other than our eyes.”

Titled “Spatial Audio Data Immersive Experience (SADIE),” the project is the first large-scale endeavor focusing on immersive spatially-aware sonified data using a high-density loudspeaker array. The research will focus on the earth’s upper atmospheric system, which features physical variables that are spatially and temporally rich. Each of the data sets associated with this system will be represented by distinct sound properties, such as amplitude, pitch, and volume.
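
To make that kind of mapping concrete, here is a minimal sketch in Python of how a single grid cell’s atmospheric values might be translated into sound parameters for one speaker region. The variable names, value ranges, and the choice of pitch and amplitude as targets are illustrative assumptions, not details of the SADIE implementation.

```python
# A minimal, illustrative sketch (not the SADIE codebase) of mapping one
# grid cell's atmospheric values onto sound parameters for a single speaker.
# The variable names, value ranges, and mapping targets are assumptions.

def normalize(value, lo, hi):
    """Scale a raw data value into 0..1, clamping at the edges."""
    return max(0.0, min(1.0, (value - lo) / (hi - lo)))

def sonify_sample(density, temperature):
    """Map two hypothetical upper-atmosphere variables to sound parameters.

    density     -> pitch in Hz (denser regions sound higher)
    temperature -> amplitude in 0..1 (hotter regions sound louder)
    """
    pitch_hz = 220.0 + normalize(density, 1e10, 1e12) * (880.0 - 220.0)
    amplitude = 0.1 + normalize(temperature, 500.0, 2500.0) * 0.9
    return {"pitch_hz": pitch_hz, "amplitude": amplitude}

# Example: the values for one rectangular dome region drive one loudspeaker.
print(sonify_sample(density=4.2e11, temperature=1350.0))
```

In a setup like the one described in the figure caption above, a mapping of this kind would presumably run once per dome region and time step, with each region’s output sent to its assigned loudspeaker.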


These sounds will be played through a 129-loudspeaker spatially distributed immersive sound system in the Cube, located in the Moss Arts Center. A combination of performance space, research laboratory, and studio, the Cube is a collaborative research facility at Virginia Tech where researchers, composers, and musicians are uncovering new possibilities in immersive sound.


Using the Cube’s motion capture system, similar to the interface imagined for the film Minority Report, participants will be able to navigate the sonified data using a gesture-driven interface, allowing them to rewind, fast-forward, rotate, zoom, amplify, speed up, and slow down the data playback. The system will also be used to capture user study data.
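
As a rough illustration of that gesture-driven control, the sketch below maps hypothetical gesture labels onto playback actions such as rewinding, fast-forwarding, and changing speed. The gesture names and the Playback class are assumptions for illustration only; they are not the Cube’s actual motion-capture interface.

```python
# A hypothetical sketch of gesture-driven playback control; the gesture
# labels and the Playback class are illustrative assumptions, not the
# Cube's actual motion-capture interface.

class Playback:
    def __init__(self, n_frames):
        self.n_frames = n_frames
        self.frame = 0      # current position in the sonified data stream
        self.rate = 1.0     # frames per tick; the sign sets the direction

    def tick(self):
        """Advance (or rewind) playback by the current rate, staying in range."""
        self.frame = int(max(0, min(self.n_frames - 1, self.frame + self.rate)))

def handle_gesture(playback, gesture):
    """Translate a recognized gesture label into a playback command."""
    if gesture == "swipe_right":
        playback.rate = abs(playback.rate)     # fast-forward
    elif gesture == "swipe_left":
        playback.rate = -abs(playback.rate)    # rewind
    elif gesture == "raise_hand":
        playback.rate *= 2.0                   # speed up
    elif gesture == "lower_hand":
        playback.rate *= 0.5                   # slow down

# Example: a recognized swipe starts moving forward through the data.
pb = Playback(n_frames=10_000)
handle_gesture(pb, "swipe_right")
pb.tick()
print(pb.frame)  # -> 1
```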

“Allowing the brain’s innate signal processing mechanisms to identify specific features in complex data sets is a logical way to link computational sciences with human sensory perceptions. This merging of technology and nature could further current analysis techniques and foster new breakthroughs involving complex systems in science, with the potential to produce new technologies designed to spur creativity,” concludes Dr. Bukvic.

If this approach to experiencing data proves to improve people’s understanding of complex relationships in physical systems, he says, it could be applied to other fields of study: “It could have applications to fields such as thermodynamics, quantum mechanics, and aeronautical engineering; help advance visualizations and virtual reality systems; and create interdisciplinary bridges between scientific communities, including music, computing, and the physical sciences.”

This article was originally published on The Next Web.

 

Alice Bonasio is a VR Consultant and Tech Trends’ Editor in Chief. She also regularly writes for Fast Company, Ars Technica, Quartz, Wired and others. Connect with her on LinkedIn and follow @alicebonasio on Twitter.