Create an immersive environment to visualize sonic input and translate across the modalities of sound and imagery.

An FFT is performed on the incoming audio. The resulting magnitudes are aggregated into 26 bands according to the Bark psychoacoustic scale, so the bands are perceptually equidistant. These band values are saved to a Shared State object that is broadcast throughout the distributed system, and a rolling history of the last 1000 values of each band is kept to show how the amplitudes change as the song progresses. Each visual component reads these values to render a representation of the song over time.
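
The pipeline above can be sketched as follows. This is a minimal, self-contained illustration, not the project's actual code: it assumes the FFT magnitudes arrive as a plain array, uses Zwicker's approximation to map a bin's frequency to a Bark band number, and the helper names (`barkBand`, `aggregateBands`, `BandHistory`) are hypothetical.

```cpp
#include <cmath>
#include <deque>
#include <vector>

// Map a frequency in Hz to a Bark critical-band number using Zwicker's
// approximation, then clamp to one of 26 bands (0..25).
int barkBand(double freqHz) {
    double z = 13.0 * std::atan(0.00076 * freqHz)
             + 3.5  * std::atan(std::pow(freqHz / 7500.0, 2.0));
    int band = static_cast<int>(z);
    if (band < 0) band = 0;
    if (band > 25) band = 25;
    return band;
}

// Aggregate FFT magnitude bins into 26 Bark bands by summing the
// magnitudes whose bin frequencies fall within each band.
std::vector<double> aggregateBands(const std::vector<double>& mags,
                                   double sampleRate) {
    std::vector<double> bands(26, 0.0);
    std::size_t fftSize = mags.size() * 2;  // mags holds the positive half
    for (std::size_t i = 0; i < mags.size(); ++i) {
        double freq = i * sampleRate / fftSize;
        bands[barkBand(freq)] += mags[i];
    }
    return bands;
}

// Keep a rolling history of the last 1000 aggregated frames, one
// 26-band vector per frame, for the time-based visuals.
struct BandHistory {
    static const std::size_t kMaxFrames = 1000;
    std::deque<std::vector<double>> frames;
    void push(const std::vector<double>& bands) {
        frames.push_back(bands);
        if (frames.size() > kMaxFrames) frames.pop_front();
    }
};
```

In the real application the aggregated frame would be written into the Shared State each audio callback rather than kept locally.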

Displays the current band amplitudes across 180 degrees of a circle of 3D meshes; the arc is repeated to form a full circle so that every band can be seen from any vantage point. This gives the viewer a sense of the current amplitude and intensity of the song, as well as where those values lie on the perceptual scale.
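
The geometry can be sketched like this: a hypothetical helper (not the project's code) that lays the 26 bands along a 180-degree arc, then repeats the arc to close the circle.

```cpp
#include <cmath>
#include <utility>
#include <vector>

// Compute an (x, z) position on a circle for each band mesh. The bands
// span 180 degrees, and that arc is repeated once (offset by pi) so the
// full set of bands is visible from any vantage point.
std::vector<std::pair<double, double>> bandPositions(int numBands = 26,
                                                     double radius = 1.0) {
    std::vector<std::pair<double, double>> pos;
    const double pi = std::acos(-1.0);
    for (int half = 0; half < 2; ++half) {
        for (int i = 0; i < numBands; ++i) {
            // Angle within the half circle, plus pi for the repeated copy.
            double theta = pi * i / (numBands - 1) + half * pi;
            pos.emplace_back(radius * std::cos(theta),
                             radius * std::sin(theta));
        }
    }
    return pos;
}
```

Each mesh's height would then be scaled by its band's current amplitude every frame.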

Displays stored historical values as heights around a revolving circle. This gives the viewer a sense of the overall dynamics and changes in the song.

Simulates a group of floating clouds whose size and color are determined by a partial history of the sonic values. Each cloud moves according to the intensity of the band it responds to, creating the effect of jumping and dancing along with the progression of the song.
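
One way to realize this mapping is sketched below. The struct, function names, window length, and damping factor are all illustrative assumptions, not the project's actual values.

```cpp
#include <algorithm>
#include <numeric>
#include <vector>

// A cloud driven by one band: its size follows the recent average of
// that band's history, and the current amplitude pushes it upward so it
// appears to jump along with the music.
struct Cloud {
    double y = 0.0;      // vertical position
    double scale = 1.0;  // rendered size
};

void updateCloud(Cloud& c, const std::vector<double>& bandHistory) {
    if (bandHistory.empty()) return;
    // Use only a partial history: the most recent 100 frames (or fewer).
    std::size_t n = std::min<std::size_t>(bandHistory.size(), 100);
    double sum = std::accumulate(bandHistory.end() - n, bandHistory.end(), 0.0);
    c.scale = 1.0 + sum / n;        // size tracks the recent average
    double current = bandHistory.back();
    c.y = 0.9 * c.y + current;      // damped vertical jump on loud frames
}
```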


My previous experience mapping sonic data to visual objects allowed me to create elements more quickly and thoughtfully than before. By starting with a basic output and building from there, I also had a solid analysis platform on which to layer the visual elements.

I wanted to experiment more with textures to make the piece more graphically interesting, but inexperience and a lack of time prevented me from doing so. This will, however, motivate me to explore these topics in the future.

Due to limited testing inside the sphere, there were cohesion issues between the renderers that had to be sorted out at the last minute: inconsistencies between renderers caused elements to fade in and out between areas of the sphere, interrupting the immersion I wanted. After putting more data into the Shared State, I had problems with circular access of the history array, which led to out-of-bounds reads and random "lasers" shooting across the screen. I also had trouble making the elements noticeable through the tint of the stereo glasses, which subdued many of the details I had originally included.
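
The circular-access bug class is worth illustrating. Below is a hedged sketch (not the project's code) of a fixed-size ring buffer whose reads always wrap with modulo arithmetic, so a lookup can never run past the end of the array no matter how old the requested frame is.

```cpp
#include <cstddef>
#include <vector>

// Fixed-capacity ring buffer for one band's history. Both the write
// head and every read index are reduced modulo the capacity, so no
// access can fall outside the backing array.
class RingHistory {
public:
    explicit RingHistory(std::size_t capacity)
        : data_(capacity, 0.0), head_(0), size_(0) {}

    void push(double v) {
        data_[head_] = v;
        head_ = (head_ + 1) % data_.size();   // wrap the write head
        if (size_ < data_.size()) ++size_;
    }

    // Value written `agesAgo` frames back (0 = most recent).
    double lookBack(std::size_t agesAgo) const {
        agesAgo %= data_.size();              // clamp into range
        // Adding the capacity before subtracting keeps the unsigned
        // arithmetic non-negative before the final wrap.
        std::size_t idx = (head_ + data_.size() - 1 - agesAgo) % data_.size();
        return data_[idx];
    }

    std::size_t size() const { return size_; }

private:
    std::vector<double> data_;
    std::size_t head_, size_;
};
```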

How to Run
Download and install allolib and associated dependencies
     Installation details found here: https://github.com/AlloSphere-Research-Group/allolib
Download “seeing-sound.cpp” and “sound” folder
     (Alternatively, specify a new .wav file on line 153 of "seeing-sound.cpp")
          samplePlayer.load("your file here")
Build and run “seeing-sound.cpp”
        (With allolib installed) Run "/allolib/run.sh path/to/seeing-sound.cpp" in Terminal / Command Line

    Built with Allolib, DistributedApp, and Gamma