EXHIBITION UNIT – INSTALLING THE WORK

My proposal was selected and the piece will be located in Gallery 1, which is the exact room I wanted. I’m so excited that my work was chosen, and I feel that simplifying the idea during development benefitted me, as there isn’t too much going on.

For the sound, I recorded a ninety-minute loop from my synth using Ableton and put it on a USB stick for playback from a Raspberry Pi. Even though the sound is not actually never-ending, it will have the same effect, as the loop is so long that no one will notice it starting and ending. Making the sound genuinely infinite would have meant having my synth physically in the room, which would have complicated matters too much.
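As a rough sketch of how the Pi playback side could work – the file path, and the use of aplay from alsa-utils, are my assumptions rather than the actual setup:

```python
# Minimal sketch of looping a long WAV from a USB stick on a Raspberry Pi.
# The path is hypothetical; aplay (from alsa-utils) is assumed installed.
import subprocess

AUDIO_FILE = "/media/usb/generative_loop.wav"  # hypothetical mount path

def loop_command(path):
    """Build the aplay command for one quiet (-q) pass of the file."""
    return ["aplay", "-q", path]

def play_forever(path):
    # Restart playback each time the file ends; with a ninety-minute loop
    # the seam comes around so rarely that nobody notices it.
    while True:
        subprocess.run(loop_command(path), check=True)

# play_forever(AUDIO_FILE)  # uncomment on the Pi itself
```

Setting the script to run on boot would mean the piece starts itself whenever the Pi is powered up.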

I arrived at the space with four days to install everything and set up each element so that it would run smoothly. I was a little nervous at first as it sank in that I was actually doing a gallery installation for real, since this is the first time I’ve ever done something like this. Below are some bullet points, with photos, explaining the process of installing the piece.

  • Got familiar with the space and thought about the best way to set everything up
  • Hoovered the floor, removing any dust, random rubbish etc.
  • Shifted the layout of the work around, so that the projector would be projecting onto a different wall than outlined in my proposal
  • Figured out where everything should go e.g. the oscilloscope in the corner and close to the projector
  • Thought about possibly adding another element, beanbags, to invite people to sit down and get lost in the experience, but decided against it as it could crowd the room
  • Began trying to get the oscilloscope working in the way I wanted, firstly by routing my laptop through an interface and then routing the interface into the oscilloscope
  • This didn’t seem to work
  • Tried changing the voltage and volume of the input, which worked to an extent but still wasn’t creating the complex Lissajous patterns I was looking for
  • Added an amplifier between the interface and the oscilloscope, which worked perfectly as I was able to boost the gain and the voltage. I’m still not entirely sure why this works, but the main thing is that I figured it out
  • Ran a stereo 1/4 inch jack to dual XLR from the amplifier to the speakers, which meant I could hear the sound and see the image on the oscilloscope at the same time
  • Updated the setup, switching out my laptop for a Raspberry Pi and plugging in a USB with the sound on it
  • Roughly positioned the camera and connected it to the projector via an HDMI cable
  • Wrote things to do on post-it notes to stay organised
  • Was originally going to mount the projector on the wall but decided against it, because drilling into the wall would have taken too much work and patience
  • Wanted to hide the oscilloscope and other tech but decided to leave it out for people to see, making sure it wasn’t a trip hazard
  • Ran the XLR cables to the speakers and tidied them up, running them along the floor next to the walls and fastening them in with hooks, nails and a hammer – this avoided any cable build-up and kept everything looking good
  • Got the projector set up and angled properly

https://drive.google.com/file/d/1-XWFV0yKbgnvfJvxWS-S0k1Tpskf3cLq/view?usp=sharing

  • Video shows the full setup working; I then blacked out the room by nailing a black cloth over the window

https://drive.google.com/file/d/1B9jkFrhd_sH-S_cK-uR2B4R1OxkJolV_/view?usp=sharing

  • Video of the final setup, with the oscilloscope being projected onto the wall and the room blacked out

FINAL THOUGHTS

I will reflect in more depth in my reflective writing; however, I am very proud of myself for taking a rough idea, developing it into a concrete concept and realising it in a short space of time. I am very excited and grateful for the opportunity to show my work to the public in a space like this. Everything looked and sounded exactly as I had envisioned, but the one thing that let it down was that my camera doesn’t stay in video view for longer than 30 minutes. This means I will have to keep turning it back on every half hour, which is frustrating because it takes away some of the magic of the work and means I will need to ask people to keep turning it on.

Overall though, this has been the most enjoyable and productive unit I have done on the course so far, and I am extremely proud of myself for creating something that I feel is really unique.

EXHIBITION UNIT – SOLIDIFYING A THEME

Someone recommended the artist Russell Haswell to me after I explained my idea to them, and from researching his work I found some interesting concepts that align with my idea while branching off it at the same time. Russell Haswell is an English multidisciplinary artist. He has exhibited conceptual and wall-based visual works, video art, public sculpture, as well as audio presentations in both art gallery and concert hall contexts. I was particularly interested in his work with lasers, and he works a lot with oscilloscopes and the Lissajous patterns they can create.

In this interview with ATTN Magazine, he speaks about a “real time synaesthetic collision”, and upon looking further into this I found out about the world of SYNAESTHESIA and THOUGHT FORMS. Synaesthesia is defined as one sensory experience being stimulated through another, for example seeing a colour when you hear a particular sound. This reminded me of hearing about Billie Eilish claiming that she can smell sounds, which I thought was complete nonsense at the time. However, synaesthesia is a real condition that affects around 2–5% of the global population. It is not damaging in any way; it simply means that people who have it can experience a sensory stimulation through the prism of an experience catered for a different sense.

‘Thought forms’ are closely linked with synaesthesia, and are essentially graphical or visual representations of what is in an artist’s mind when they listen to sound, whether that is the artist who created the sound or a separate visual artist. An article on thought forms titled ‘Synaesthesia at some space and beyond’ refers to them as “graphic scores in reverse”: they are intended to guide the listener through the experience of listening rather than abstractly represent the sound. The light show ‘Lumiere’ by Robert Henke is an example of a visual experience that guides the audience through a listening experience, resulting in an intimate relationship between sound and image.

So, to put things into the context of my own work, this idea of synaesthesia has been the driving force behind the development of this installation without me even realising. It falls perfectly in place with my overarching theme of an endless process of sound and image interacting in real time with one another, and I can now imagine my piece much more clearly as an exploration of the relationship between the senses and how this can impact someone. I want people to have a unique experience where they feel as integrated in this never-ending process of creation as the sound and visuals are.

As a result of this new knowledge and the evolution of my idea, it feels right to rename the piece from ‘Infinite Horizon’ to ‘Synaesthesia’. Originally, as shown in the title development above, I was thinking of ‘Infinite Synaesthesia’, but it felt like too much for a title and sounded awkward when I said it out loud. I think the green and black aesthetic of the oscilloscope will work really well in a dark room.

I’ve now drafted a couple of sketches for my proposal, and I will finalise this and add my finished proposal in the next post.

EXHIBITION UNIT – EXPERIMENTING WITH AN OSCILLOSCOPE

I went to talk with one of our technicians, Rory, and it turned out that the technician team actually have an old oscilloscope. I was able to borrow it and managed to get it working in the exact way I was hoping it would.

I wasn’t able to figure out how to get the oscilloscope generating a Lissajous figure from my computer audio, however I was able to connect it straight into my synth, which is the source of the generative sound. By plugging two 1/4 inch jack to RCA cables from the two outputs on the synth into the two inputs on the oscilloscope, I managed to get the oscilloscope looking like the image above. By routing a 1/4 inch jack to mini jack cable from the phones output of the synth to my laptop’s input, I could simultaneously hear the sound and see the oscilloscope’s reaction. Here’s a video.

https://drive.google.com/file/d/1sJhNphBFZwkfmiKaGz86rQwZ4Ukql8HV/view?usp=sharing

It felt so fulfilling to see this idea start coming to life, and through using the synth and oscilloscope the process and outcome felt organic and intuitive. Any parameter that I changed on the synth patch would cause a random response from the oscilloscope, and I spent hours playing around with this analogue gear. It was nice to get away from the laptop. The only problem now is figuring out how to get the same effect through audio from my laptop or a Raspberry Pi, which I’m sure I will be able to do in the coming days.

EXHIBITION UNIT – CREATING THE SOUND

I have been making a lot of generative sounds over the past few months and I feel I have got pretty good at knowing how to keep a sound endlessly interesting and immersive. I started using VCV Rack to create intricate generative patches, as well as my Behringer DeepMind 6 synth.

VCV PATCHES

The main ingredient for keeping these expansive, ambient patches infinitely interesting is LFOs reacting to each other. In the first one, for example, there is one LFO randomly controlling the note sequencer, which generates random notes in the same key, and then two more LFO modules controlling various aspects of the first LFO, such as its rate, waveform and mix. These interactions keep the sound moving randomly whilst retaining the general atmosphere or mood.
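The LFO-on-LFO idea can be sketched numerically; everything here (the scale, the rates, the modulation depths) is illustrative rather than the actual patch settings:

```python
# Toy sketch of cross-modulating LFOs driving a random note sequencer.
import math
import random

SCALE = [0, 2, 3, 5, 7, 8, 10]  # semitone degrees of a natural minor scale (assumed key)

def lfo(phase):
    """A simple sine LFO evaluated at a phase given in cycles."""
    return math.sin(2 * math.pi * phase)

def generate(steps=16, base_rate=0.25, seed=1):
    """Pick scale degrees with one LFO whose rate is modulated by a second LFO."""
    rng = random.Random(seed)
    notes = []
    phase_a = phase_b = 0.0
    for _ in range(steps):
        # LFO B slowly sweeps LFO A's rate between 0.5x and 1.5x of the base.
        phase_a += base_rate * (1.0 + 0.5 * lfo(phase_b))
        phase_b += 0.03
        # LFO A biases which degree the random sequencer lands on,
        # so the line wanders but stays in key.
        idx = int((lfo(phase_a) + 1) / 2 * (len(SCALE) - 1))
        idx = max(0, min(len(SCALE) - 1, idx + rng.choice([-1, 0, 1])))
        notes.append(SCALE[idx])
    return notes
```

The output never leaves the chosen scale, which is the same property that keeps the patch coherent while it drifts.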

DEEPMIND 6 PATCH

The patch I made on my synth took a lot of time to construct, and there are quite a few complexities without it being overly complicated. I wanted to make something that reflected the darker attitude I present through my sound practice, as well as having an evolving shine that comes and goes. My thought process was that, seeing as I am creating a piece that focuses on art’s infinite nature, I would make a sound that integrates both haunting texture and bright shimmers, to capture the essence of existing in a world where the line between dark and light is blurred. The sound needs to be a reflection of the world around it, which I hope will make it feel more organic when it exists in the gallery space. If visitors can relate in some way to the feeling of the sound, or get lost in the experience, then I would say I’ve done a good job.

Using the two LFOs to control each other’s rates as well as the envelope rates, the saw and sine waves began to take shape and evolve into a morphing wave of sound. I then used these random LFO sequences to control the distortion, which sits at the end of the FX chain, meaning it affects all of the effects before it, such as the reverb and delay. Another random LFO controlling the VCF resonance meant that the sound became more detailed at certain moments. Overall I am very happy with how it sounds, and obviously the audio clip above is only a 3-minute excerpt of what is ultimately an endless sound.
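The point about chain order can be shown with a toy numeric sketch – distortion placed last shapes everything before it (here a bare-bones delay stands in for the whole FX chain), while an LFO sweeps its drive; all the values are illustrative, not the actual patch:

```python
# Toy FX chain: delay first, LFO-swept distortion last.
import math

def delay(samples, time=3, mix=0.5):
    """A bare-bones feedforward delay standing in for the synth's delay/reverb."""
    out = list(samples)
    for i in range(time, len(out)):
        out[i] += mix * samples[i - time]
    return out

def distort(x, drive):
    """Soft-clip distortion; tanh keeps the output inside [-1, 1]."""
    return math.tanh(drive * x)

def render(samples):
    # Because the distortion comes AFTER the delay, it colours the echoes
    # too; an LFO on its drive makes that colour keep evolving.
    wet = delay(samples)
    return [distort(s, drive=1.0 + 0.5 * math.sin(0.1 * i))
            for i, s in enumerate(wet)]
```

Swapping the two stages would distort only the dry signal and leave the echoes clean, which is why the ordering matters.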

I think I will use the DeepMind 6 sound over the VCV patches because it has a more raw feeling to it and sounds a lot warmer and more natural. The VCV patches sound nice, but my intention is for the sound to be as organic as possible.

Next it is time to start fusing the sound and visuals to start forming a singular experience.

EXHIBITION UNIT – REFINING MY IDEA

At this stage of development, I feel it is important to refine my ideas to a point where things are defined and put in place properly. I’ve spent a lot of time figuring out the most effective way to communicate my idea in the simplest way possible, and through a lot of research I’ve found a great solution.

I was struggling to find the simplest way to create an audiovisual experience where the visuals react to the sound, so I widened my research and looked to online forums to get a sense of any common or uncommon techniques that don’t require code or learning new software. Sure enough, I found a subreddit called ‘oscilloscope music’, and saw a comment that referenced Tame Impala using an oscilloscope at a performance to react to the music. Not really knowing what an oscilloscope was, I found out that they are used to display waveforms over time on a two-axis graph. I then discovered that an oscilloscope can produce something known as a LISSAJOUS PATTERN, the curve traced when two sinusoidal signals drive the two axes, producing a shape that veers away from a linear waveform. This curve changes depending on the phase angle between the waveforms and the frequencies at which they oscillate. So, in essence, inputting sound through two channels (which correlate to the x and y axes respectively) creates a moving curve, or curves, that changes in real time as the sound changes, because the ratio of the frequencies is always changing.
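The maths behind this can be sketched in a few lines; the frequencies and phase here are illustrative, not taken from the piece:

```python
# Sketch of a Lissajous figure: one sinusoid drives x, the other drives y.
import math

def lissajous(freq_x, freq_y, phase, n=100):
    """Sample n (x, y) points over one second of the two sinusoids."""
    pts = []
    for i in range(n):
        t = i / n
        x = math.sin(2 * math.pi * freq_x * t)          # left channel -> x axis
        y = math.sin(2 * math.pi * freq_y * t + phase)  # right channel -> y axis
        pts.append((x, y))
    return pts

# A 1:1 frequency ratio with a 90-degree phase offset traces a circle;
# a 1:2 ratio traces a figure-of-eight.
circle = lissajous(1, 1, math.pi / 2)
```

With a generative sound those ratios and phases drift constantly, which is exactly why the figure on the screen never settles.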

The video that the person mentioned is linked above, and after seeing it I had the eureka moment I was hoping for. A generative sound that is always changing, run through an oscilloscope, means the visuals are always changing in response to the sound. This communicates my concept perfectly.

The audiovisual artist Jerobeam Fenderson created an entire audiovisual album called ‘Oscilloscope Music’, which highlights the limitless aesthetic possibilities an oscilloscope can offer. Using mathematical ratios between frequencies and phase relationships, Fenderson constructs beautiful moving imagery that responds directly to the frequencies being sent into the oscilloscope. I find this absolutely mind-blowing because it seems so simple yet there is so much maths and knowledge involved. It’s as if the visuals and the sound are talking to one another – a conversation which again falls perfectly within my concept.

The mission now is to figure out the best way of setting this all up. In my notebook I drew out how it could potentially work: a camera pointing at the oscilloscope and an HDMI cable running from the camera into a projector, providing a live view of the oscilloscope screen. I also thought of Ableton’s ‘Spectrum Analyser’, which would do a similar thing but in a much less interesting manner. The visuals would probably get quite boring, but it would be a good backup option.

Now I just need to find an oscilloscope to try this out.

EXHIBITION UNIT – EVOLVING MY IDEA

DEVELOPING A VISUAL LANGUAGE

I used an artificial-intelligence art-generating application called ‘Wombo Dream’, which is similar to Google’s Deep Dream software, to begin creating an aesthetic for my piece. I felt it was important to start seeing visual imagery, as it helps me pull my ideas together and establish a direction to go in. I always find that seeing visuals helps hugely with the sonic side of things, and vice versa. I typed words like “futuristic landscape”, “infinite horizon” and “cyberpunk” into the software so it could begin generating a response. Some of the images below are really unique, and I began to see patterns that ran through them all, for example shapes, colours and abstractions. What I find so amazing about this way of working is that I am able to generate complex images in a matter of seconds and then refine my search based on what I like and what I think could be better. The fact that each generation is completely unique makes producing a lot of content really easy, and I spent hours narrowing down my search so it became more specific and I could establish a level of control – sort of like making educated guesses about how the image would turn out.

I loved the name “Infinite Horizon” and decided that it should be my working title for the time being, because it definitely relates to my overarching theme of an endless form of art. I feel ‘Infinite Horizon’ speaks to me in the sense that I imagine someone running towards the horizon, but it never gets closer and the beauty of the sunset and surrounding landscape is endless. Much like this idea of exploring the PROCESS as opposed to a FINISHED PRODUCT.

Using the shapes, colours and forms that I felt ran through all of the AI generations, I thought I’d write the title taking influence from them. The experiments above are purely a way to help me visualise the final outcome of my work in a clearer way.

After thinking about my work more, I really do feel it is a little unnecessary to have an interactive element. I think it would work, but I don’t feel it is vital to the theme of the piece to have the audience interacting with a MIDI controller. Just a generative sound and some way of having the visuals react to it will, I think, be enough to communicate my ideas effectively.