The Poseidon Project
The Poseidon Project uses 100 years of seismic data sourced from the United States Geological Survey, one of the best sources for seismic and geological data in general.
Johnson divided the globe into eight segments, each representing a single instrument or a group of instruments. Every seismic event constitutes a single note on the instrument associated with the segment in which it occurred: the depth of the earthquake determines the pitch, and the magnitude determines the velocity. Five remaining MIDI channels are reserved for special events (earthquakes above magnitude 8, unusually low or high RMS (root-mean-square amplitude), and so on).
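The mapping described above can be sketched in a few lines of Python. Everything concrete here is an assumption on my part: the segment boundaries (eight equal longitude bands), the pitch and velocity ranges, and the direction of the depth-to-pitch mapping (deeper events sounding lower) are illustrative, not taken from Johnson's code.

```python
def segment_for(lon: float) -> int:
    """Map a longitude (-180..180) to one of 8 equal segments, 0..7.
    Equal longitude bands are an assumption for illustration."""
    return min(int((lon + 180.0) / 45.0), 7)

def scale(value, lo, hi, out_lo, out_hi):
    """Linearly map value from [lo, hi] into [out_lo, out_hi], clamped."""
    t = max(0.0, min(1.0, (value - lo) / (hi - lo)))
    return out_lo + t * (out_hi - out_lo)

def event_to_note(lon, depth_km, magnitude):
    """One seismic event -> (MIDI channel, pitch, velocity)."""
    channel = segment_for(lon)                        # instrument by location
    pitch = round(scale(depth_km, 0, 700, 96, 36))    # deeper -> lower (assumed)
    velocity = round(scale(magnitude, 0, 8, 0, 127))  # stronger -> louder
    return channel, pitch, velocity
```

A shallow magnitude-8 event near the prime meridian would land on channel 4 at full velocity; the five special-event channels would sit outside this 0–7 range.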
The rhythm is determined by the occurrence of the earthquakes over time. Because measurements became more widespread over the years, there are far more data points for recent years than for earlier ones. The timeline is therefore divided into seven periods, each with a different time acceleration: the earliest years are accelerated 500,000 times (meaning the events play back 500,000 times faster than in real time), while the last ten years are accelerated only 750 times. This yields a roughly even distribution of events. Had the current decade also been accelerated 500,000 times, we would probably be hearing a chaotic soundscape akin to black MIDI. The complete dataset consists of 780,000 points, and running the software in full takes eight days.
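The effect of this variable compression on total playback length is easy to sketch. Only the 500,000x and 750x factors come from the description above; the helper name and the period layout in the example are mine.

```python
def playback_seconds(periods):
    """periods: list of (start_unix, end_unix, compression_factor) tuples
    covering the dataset. Each span of real time plays back
    compression_factor times faster, so its playback length is
    real duration divided by the factor."""
    return sum((end - start) / factor for start, end, factor in periods)
```

For instance, ten years of real time (about 315 million seconds) at 750x plays back in roughly 4.9 days, which shows why the full seven-period run stretches over several days.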
The sound is accompanied by an animation of the earthquakes. Bezier curves drawn between sites identify patterns between seismic events: stronger lines indicate that events in one region tend to follow events in another.
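One plausible way to derive such curve weights is to count how often an event in one segment is followed, within some time window, by an event in another. This is a hypothetical sketch: the one-hour window, the data shape, and the function name are all my assumptions, not details from the project.

```python
from collections import Counter

def follow_counts(events, window=3600):
    """events: time-sorted list of (timestamp, segment) pairs.
    Returns a Counter mapping (segment_a, segment_b) to the number of
    times an event in segment_b occurred within `window` seconds after
    an event in segment_a."""
    counts = Counter()
    for i, (t_a, seg_a) in enumerate(events):
        for t_b, seg_b in events[i + 1:]:
            if t_b - t_a > window:
                break  # events are sorted, so nothing later can qualify
            if seg_b != seg_a:
                counts[(seg_a, seg_b)] += 1
    return counts
```

The resulting counts could then be normalized and used directly as stroke weights for the Bezier curves in the animation.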
The software is written in Processing (for the animation), Python (to collect the data), and Ableton Live (for the music). The code is available on GitHub, so you can modify it and create your own sonification.
Fault Trace is a similar work, also built on seismic data, but here Johnson uses only data from 2017 and created 12 compositions, one for each month of the year. Each sonification is stylistically different; the song featured here was inspired by artists such as Burial and Sorrow. Each song also uses different mappings. In the sonification for December, for example, Johnson used the following constraints:
Time Signature: 4/4
Time Compression: 7300x
Time Period: 1–30 December 2017
Tempo: 120 BPM
Genre: Electronic/Future Garage
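The constraints above can be sketched as code. The 7300x compression and 120 BPM come from the list; quantizing onsets to a sixteenth-note grid (to keep the 4/4 feel) is my assumption about how such a mapping might work.

```python
TEMPO_BPM = 120
COMPRESSION = 7300
SIXTEENTH = 60.0 / TEMPO_BPM / 4  # 0.125 s per sixteenth note at 120 BPM

def quantized_onset(event_time, period_start):
    """Map a real-world timestamp (seconds) to a playback time (seconds)
    that is compressed 7300x and snapped to the nearest sixteenth note."""
    playback = (event_time - period_start) / COMPRESSION
    return round(playback / SIXTEENTH) * SIXTEENTH
```

At this compression, 7300 seconds of December (about two hours) collapse into one second of music, i.e. two beats at 120 BPM.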
The programming is similar to the Poseidon Project, again using Processing, Python, and Ableton Live. Just like the Poseidon Project, the code is open source and available on GitHub.
Here is the full playlist of all 12 months:
You can also buy the album on Bandcamp.
About the artist
David Johnson is part of The Edison Union, a creative technology collective based in Melbourne working at the intersection of art and technology. They do some pretty amazing work. You can also check out their Instagram profile.