Sound Design for 4D CGI Fractal Animation

Quaternion fractals are essentially four-dimensional fractal plots, where the fourth dimension is expressed as movement through time. Here, CGI is used to plot a series of quaternion fractals and create short animations showing them ‘unfolding’ over time. The sound design brief for this experiment was to create a soundtrack that complemented the visual animations both aesthetically and texturally. The sound design for these animations was conceived and created here at The Orange Hut.
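
For readers curious about the underlying maths, the sketch below shows the basic quaternion Julia iteration (z → z² + c) in Python. The constant, bailout radius and iteration count are illustrative values only, and are not those used to generate the animations discussed here.

# Minimal sketch of the quaternion Julia iteration z -> z^2 + c.
# The constant C, escape radius and iteration count are illustrative
# choices, not the values used for the animations described here.
import numpy as np

C = np.array([-0.2, 0.6, 0.2, 0.2])  # example quaternion constant (w, x, y, z)

def quat_mul(a, b):
    """Hamilton product of two quaternions given as (w, x, y, z) arrays."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def escape_time(q, max_iter=64, bailout=4.0):
    """Return how many iterations q survives before |z| exceeds the bailout."""
    z = q.copy()
    for i in range(max_iter):
        z = quat_mul(z, z) + C
        if np.dot(z, z) > bailout * bailout:
            return i
    return max_iter

# A 3D slice of the 4D set is obtained by fixing one component (here w);
# animating that fixed value over time gives the 'unfolding' effect.
print(escape_time(np.array([0.0, 0.1, 0.2, 0.0])))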

Due to the complex nature of these CGI animations, it was initially decided to experiment with creating sound directly from each video’s own imagery. This was done using Cycling ’74’s Max visual programming software (Roads, 1996, p. 614), following a method similar to the one John Reed demonstrates in his brief YouTube tutorial on the subject, “Max 7 Tutorial #44 Sound from Video” (dearjohnreed, 2017). Essentially, a Max patch was made that created sound from the video using a fast Fourier transform (FFT), the resulting data outputs then being modulated with gate, filter and other objects. While this experiment yielded some interesting results, the overall sound – both texturally and aesthetically – felt too dislocated from the visuals. The rhythm with which the gate and filter effects were triggered and released was also hard to synchronise with the animation itself, as the complex ripples of spiralling self-similarity seemed to glide continually between various tempos and nodes.
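
For illustration, the following Python fragment sketches the general idea behind this kind of video sonification – treating a frame’s luminance as a magnitude spectrum and resynthesising it with an inverse FFT. It is not a reconstruction of the Max patch itself; the frame size, grain length and windowing are assumptions made purely for the example, and in the Max version the subsequent shaping was done with gate, filter and other objects as described above.

# Rough Python analogue of turning video frames into sound with an inverse FFT.
# A sketch of the general idea only, not a reconstruction of the Max patch
# described above; frame size, sample rate and scaling are arbitrary.
import numpy as np

SAMPLE_RATE = 44100
GRAIN_LEN = 2048  # samples of audio generated per video frame

def frame_to_grain(frame):
    """Treat a greyscale frame's column-averaged luminance as a magnitude
    spectrum and resynthesise it as a short audio grain via inverse FFT."""
    spectrum = frame.mean(axis=0)                      # one value per column
    spectrum = np.interp(np.linspace(0, 1, GRAIN_LEN // 2 + 1),
                         np.linspace(0, 1, spectrum.size), spectrum)
    phase = np.random.uniform(0, 2 * np.pi, spectrum.size)  # random phases
    grain = np.fft.irfft(spectrum * np.exp(1j * phase), n=GRAIN_LEN)
    grain *= np.hanning(GRAIN_LEN)                     # fade the grain edges
    peak = np.max(np.abs(grain))
    return grain / peak if peak > 0 else grain

# Demo with a synthetic 'frame'; in practice each decoded video frame
# (e.g. from a video decoding library) would be passed in and the grains
# concatenated into a continuous soundtrack.
fake_frame = np.random.rand(240, 320)
audio = frame_to_grain(fake_frame)
print(audio.shape, SAMPLE_RATE)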

Note: Due to the low resolution of the above animation (and its resulting size within the frame), it is recommended that this video be played in “full screen” mode in order to properly see and hear how the sound design synchronises to the moving images.

Considering the above difficulties, it was next decided that a more tailored sound would probably produce an aesthetic better suited to these animations: one that allowed every aspect of the sound design process and tonal palette to be controlled and synchronised to the animations, albeit manually and therefore only approximately. The video files were loaded directly into ProTools, and a multichannel composition was made using sounds derived from a series of modular synthesisers that were controlled directly via MIDI and then recorded back into ProTools as audio files. These recordings were then edited to form suitable textural motifs around the ever-changing visual elements. The recordings mainly used synthesis patches employing amplitude and frequency modulation to mimic the ripples and changes that could be observed within certain parts of the animations. To counterpoint these complex undulations of sound, a tonal element was added afterwards to lend a particular feeling to the animation (see “SOUND DESIGN – 3D QUATERNION FRACTAL PLOT (1)” above). This method made a myriad of sound design possibilities much easier to explore and, ultimately, allowed a more expressive sound design composition to be created than the first effort using Max.
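
As a rough indication of the kind of rippling textures described above, the short Python sketch below combines a slow frequency modulation with a faster amplitude modulation on a single sine carrier. All of the frequencies, depths and durations are invented for illustration and bear no relation to the actual modular patches used.

# Minimal sketch of an amplitude- and frequency-modulated tone of the sort
# used to mimic rippling motion. All frequencies, depths and durations here
# are invented for illustration, not taken from the patches described above.
import numpy as np

SAMPLE_RATE = 44100
DURATION = 4.0  # seconds
t = np.linspace(0, DURATION, int(SAMPLE_RATE * DURATION), endpoint=False)

carrier_hz = 220.0      # base pitch of the tone
fm_rate_hz = 0.7        # slow frequency wobble, like a drifting ripple
fm_depth_hz = 12.0
am_rate_hz = 3.0        # tremolo-like amplitude ripple
am_depth = 0.4

# Frequency modulation: integrate the instantaneous frequency to get phase.
inst_freq = carrier_hz + fm_depth_hz * np.sin(2 * np.pi * fm_rate_hz * t)
phase = 2 * np.pi * np.cumsum(inst_freq) / SAMPLE_RATE

# Amplitude modulation applied on top of the FM carrier.
amplitude = 1.0 - am_depth * (0.5 + 0.5 * np.sin(2 * np.pi * am_rate_hz * t))
signal = amplitude * np.sin(phase)

print(signal.shape)  # samples ready to be written out or layered in a DAW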

Note: Due to the low resolution of the above animation (and its resulting size within the frame), it is recommended that this video be played in “full screen” mode in order to properly see and hear how the sound design synchronises to the moving images.

Visually, the second animation lacked some of the dynamism of the first. Given the results of the first attempt with Max, it seemed possible that the idea of creating audio from video might work in this instance. A Max patch similar to Reed’s was therefore created, again incorporating an FFT component, as well as utilising some third-party ‘cv.jit’ objects and reverse-engineering some patches designed by Jean-Marc Pelletier (Pelletier, 2022). After some trial and error, the resulting sound was recorded and used as a kind of static background noise that runs throughout the animation. We felt this textural sonification suited the animation, primarily because it seemed to change as the object revolved, providing a sonic suggestion of its rotation. In addition, design elements from a Max patch that we created several years ago were used to build another sub-patch, which allowed one to manually “transcribe” (or perhaps “perform”) a sound alluding to the small flurries and ripples seen occurring within and around the quaternion fractal animation. While each performance didn’t exactly match the complex flows seen within the animation, the best performances were taken and layered over one another in a ProTools multitrack arrangement, so that the individual elements could be synchronised, as accurately as possible, to various parts of the animation.
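
The following Python sketch gives a rough, purely illustrative analogue of the motion-driven noise bed described above: the amount of change between consecutive frames scales a smoothed white-noise level, so the texture swells as the object moves. It stands in for, rather than reproduces, the cv.jit-based Max patch; the frame rate, smoothing constant and noise source are assumptions made for the example.

# Sketch of how frame-to-frame motion could drive a static noise bed so that
# the texture appears to change as the object revolves. This stands in for
# the cv.jit-based Max patch described above; the smoothing constant, frame
# rate and noise generation are illustrative assumptions.
import numpy as np

SAMPLE_RATE = 44100
SAMPLES_PER_FRAME = 1764  # 44100 samples/s at an assumed 25 frames/s

def motion_amount(prev_frame, frame):
    """Mean absolute difference between consecutive greyscale frames."""
    return float(np.mean(np.abs(frame - prev_frame)))

def noise_block(level, smoothed, smoothing=0.9):
    """Generate one frame's worth of noise, with the level smoothed over time
    so the texture swells and recedes rather than jumping."""
    smoothed = smoothing * smoothed + (1 - smoothing) * level
    block = smoothed * np.random.uniform(-1, 1, SAMPLES_PER_FRAME)
    return block, smoothed

# Demo with synthetic frames; real frames would come from the decoded video.
rng = np.random.default_rng(0)
frames = [rng.random((120, 160)) for _ in range(5)]
smoothed, audio = 0.0, []
for prev, cur in zip(frames, frames[1:]):
    block, smoothed = noise_block(motion_amount(prev, cur), smoothed)
    audio.append(block)
audio = np.concatenate(audio)
print(audio.shape)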

At present we are looking for software that will allow us to create a series of five quaternion fractal animations, each approximately three minutes in length. The software that was originally used to create some of the fractal animations is now discontinued.

To read more about quaternion fractals, please visit Paul Bourke’s informative webpage.

Bibliography

dearjohnreed (2017) Max 7 Tutorial #44 Sound from Video. [online] YouTube. Available at: https://www.youtube.com/watch?v=JcfzXIwGgPs [Accessed: 17th January 2022].

Pelletier, J. (2022) cv.jit | Computer Vision for Jitter – Jean-Marc Pelletier. [online] Jmpelletier.com. Available at: https://jmpelletier.com/cvjit/ [Accessed: 19th February 2021].

Roads, C. (1996) The Computer Music Tutorial. Cambridge, Massachusetts: The MIT Press.