Narratives 2.0 visualizes music, from Queen’s “We Will Rock You” to Beethoven’s “Symphony No. 5.” I believe the artist, Matthias Dittrich, used Java and Processing to analyze each piece with the same process, producing images that are different yet connected, intricate and almost organic. Here’s his description of what he did:
The music was segmented into single channels. The channels are shown fanlike, and the lines move away from the center over time. The angle of each line changes according to the frequency of its channel; when the frequency reaches a high level, the channel is highlighted in orange. The visualization does not necessarily return exact information, even if the arrangement and uniformity of the music can be read from it. The purpose was to create a visualization that responds aesthetically to the music, as an artist would.
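Reading between the lines of that description, the core per-channel mapping could be sketched roughly as below. This is a hypothetical reconstruction, not Dittrich’s actual code: the class and method names, the ±45° fan wedge, and the highlight threshold are all assumptions for illustration.

```java
// Hypothetical sketch of the frequency-to-angle mapping described above.
// Not Dittrich's actual Processing code; ranges and threshold are assumed.
public class ChannelLine {
    // Assumed cutoff above which a channel is drawn highlighted in orange.
    static final double HIGHLIGHT_THRESHOLD = 0.8;

    // Map a normalized frequency reading (0..1) to a line angle in degrees,
    // bending each channel's line within an assumed +/-45 degree wedge.
    static double angleFor(double normalizedFrequency) {
        return -45.0 + 90.0 * normalizedFrequency;
    }

    // A channel becomes highlighted while its frequency level is high.
    static boolean isHighlighted(double level) {
        return level >= HIGHLIGHT_THRESHOLD;
    }

    public static void main(String[] args) {
        // A mid-range reading keeps the line straight; a peak highlights it.
        System.out.println(angleFor(0.5));        // 0.0
        System.out.println(isHighlighted(0.9));   // true
        System.out.println(isHighlighted(0.2));   // false
    }
}
```

In the real piece, each new frequency sample would extend the channel’s line outward from the center, so the angle history accumulates into the organic, fanlike shapes seen in the images.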