At this stage of development, I feel it is important to refine my ideas to the point where everything is clearly defined and properly in place. I’ve spent a lot of time figuring out the most effective way to communicate my idea as simply as possible, and through a lot of research I’ve found a great solution.
I was struggling to find the simplest way to create an audiovisual experience where the visuals react to the sound, so I widened my research and looked to online forums to get a sense of any common or uncommon techniques that don’t require code or learning new software. Sure enough, I found a Reddit forum called ‘oscilloscope music’, and saw a comment referencing Tame Impala using an oscilloscope at a performance to react to the music. Not really knowing what an oscilloscope was, I found out that they are used to display waveforms over time on a two-axis graph. I then discovered that an oscilloscope can produce something known as a LISSAJOUS PATTERN, which is formed when two sinusoidal signals are plotted against one another, producing a curve that veers away from a linear waveform. The shape of this curve depends on the phase angle between the waveforms and the ratio of the frequencies at which they are oscillating. So, in essence, inputting sound through two channels (which correspond to the x and y axes respectively) creates a moving curve, or curves, that changes in real time as the sound changes, because the ratio of the frequencies is always changing.
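To get my head around the maths, the idea can be sketched in a few lines of Python. This is just a rough illustration assuming NumPy and Matplotlib, with arbitrary example frequencies and phase rather than anything from my actual setup: one sine wave drives the x-axis and another drives the y-axis, and together they trace out a Lissajous figure.

```python
# Minimal sketch of a Lissajous pattern: two sine waves plotted against
# each other. Frequencies and phase are arbitrary values for illustration.
import numpy as np
import matplotlib.pyplot as plt

t = np.linspace(0, 1, 5000)      # one second of "time"
fx, fy = 3, 2                    # frequency ratio 3:2 (arbitrary choice)
phase = np.pi / 4                # phase offset between the two channels

x = np.sin(2 * np.pi * fx * t + phase)   # left channel -> x-axis
y = np.sin(2 * np.pi * fy * t)           # right channel -> y-axis

plt.plot(x, y)
plt.gca().set_aspect("equal")
plt.title("Lissajous pattern, 3:2 frequency ratio")
plt.show()
```

Changing the frequency ratio or the phase offset changes the shape of the figure, which is exactly what would happen as the audio on each channel changes.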
The video that the person mentioned is linked above, and after seeing it I had the eureka moment I had been hoping for. A generative sound that is always changing, run through an oscilloscope, means the visuals will also be constantly changing in response to the sound. This would communicate my concept perfectly.

The audio-visual artist Jerobeam Fenderson created an entire audio-visual album called ‘Oscilloscope Music’, which highlights the limitless possibilities an oscilloscope can offer in terms of aesthetics. Using mathematical ratios between frequencies and phase relationships, Fenderson is able to construct beautiful moving imagery that responds directly to the frequencies being sent into the oscilloscope. I find this absolutely mind-blowing because it seems so simple, yet there is so much maths and knowledge involved. It’s as if the visuals and the sound are talking to one another, a conversation which again falls perfectly within my concept.

The mission now is to figure out the best way of setting this all up. In my notebook I drew out how it could potentially work: a camera pointing at the oscilloscope, with an HDMI cable running from the camera into a projector, providing a live view of the oscilloscope screen. I also thought of Ableton’s ‘Spectrum Analyser’, which would do a similar thing but in a much less interesting manner. The visuals would probably get quite boring, but it would be a good backup option.
Now I just need to find an oscilloscope to try this out.