“Playing” the Image

Initial Ideas:

Falling out of this was the idea of “memory triggers”: could we “play” our memories, emotions and ideas just as we’d play a piano, interacting with the visual forms triggered in real time as we would with a musical instrument? By “playing” the pre-constructed memories as shots we can create a montage of musical memory that uses sound to affect the colour, time, motion and movement of the images presented.

Developing the concept:

In a more detailed analysis here, I’m considering the nature of the content as well as the interactive element. Here’s my thinking on that content and how the initial treatment addresses key concepts of the moving image.

1. Memories created using images (found or filmed, still and moving) and voice-over

-exploration of the relationship between still and moving image

-exploration of colour and memory

2. Animated memories created in a virtual environment using particle systems, emitters, boids etc.

-how does the integration of still and moving image in a virtual (“un-natural”) environment affect our relationship with the image?

-opportunity for generative artificial intelligence (via boids systems) to introduce an organic development of shots (see the sketch below)
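Boids behaviour is simple to sketch: each agent steers by three classic rules (cohesion, separation and alignment), and the emergent flocking is what could give shots an organic development. Below is a minimal Python sketch of those rules; it’s a toy illustration of the technique rather than code from the actual patch, and the class and parameter names are my own.

```python
import random

# Minimal 2D boids sketch. In this context the boid positions could
# drive shot framing or particle emitters; names are illustrative.

class Boid:
    def __init__(self):
        self.x, self.y = random.uniform(0, 1), random.uniform(0, 1)
        self.vx = random.uniform(-0.01, 0.01)
        self.vy = random.uniform(-0.01, 0.01)

def step(boids, cohesion=0.01, separation=0.05, alignment=0.1):
    for b in boids:
        others = [o for o in boids if o is not b]
        n = len(others)
        # Cohesion: steer toward the centre of the flock.
        cx = sum(o.x for o in others) / n
        cy = sum(o.y for o in others) / n
        b.vx += (cx - b.x) * cohesion
        b.vy += (cy - b.y) * cohesion
        # Separation: move away from very close neighbours.
        for o in others:
            if abs(o.x - b.x) + abs(o.y - b.y) < 0.05:
                b.vx -= (o.x - b.x) * separation
                b.vy -= (o.y - b.y) * separation
        # Alignment: match the average heading of the flock.
        avx = sum(o.vx for o in others) / n
        avy = sum(o.vy for o in others) / n
        b.vx += (avx - b.vx) * alignment
        b.vy += (avy - b.vy) * alignment
    for b in boids:
        b.x += b.vx
        b.y += b.vy

flock = [Boid() for _ in range(10)]
for _ in range(100):
    step(flock)
```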

Set up:

In this set-up I’m using 3 MIDI triggers for both visual and audio, which are composited and re-introduced into the system. Sound (generative audio and voice tracks) is layered both in the A/V mixer and in the synth modules, allowing for interactive generation of an A/V montage based on the user’s (my!) interpretation of and reaction to both sound and image.
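As a rough illustration of that routing, here’s a minimal Python sketch using the mido library to read one MIDI input and drive both sides of the system. The handler functions standing in for the A/V mixer and synth modules are hypothetical placeholders, not a real API.

```python
import mido

# Each trigger drives both the visual and the audio side, so the
# composited result can be fed back into the mix. The two handlers
# below are placeholders for the A/V mixer and synth modules.

def apply_to_visuals(note, velocity):
    print(f"visual layer: note={note} velocity={velocity}")

def apply_to_audio(note, velocity):
    print(f"synth module: note={note} velocity={velocity}")

with mido.open_input() as port:  # opens the default MIDI input
    for msg in port:
        if msg.type == 'note_on' and msg.velocity > 0:
            apply_to_visuals(msg.note, msg.velocity)
            apply_to_audio(msg.note, msg.velocity)
```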

The A/V mixer can be set up in a number of different ways, with different triggers affecting different parameters.

In this way the 5 shots that can be generated would be an exploration of using the audio itself to trigger parameters in the A/V mixer that relate to:

1. Colour – volume, velocity (how hard a key is pressed) and pitch can be used to directly affect the colour balance of a clip or A/V mix (see the sketch after this list).

2. Time – similarly, triggers can be used to affect the start and end points of a clip, or the speed and direction of the playhead.

3. Movement – created directly through the order in which the clips are played, building an A/V montage.

4. Sound – this shot would relate directly to how a piece of music can create a specific shot.

5. Interaction – the key to this is the relationship between sound, image and user (in no particular order). Having run a few tests (see below) with some clips, it is really interesting to explore how the image encourages a certain key press and how the sound itself leads the user on to more experimentation: what I mean by that is it’s a very moreish toy! 🙂
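To make shots 1 and 2 concrete, here’s a minimal Python sketch of how 0–127 MIDI values (velocity and pitch) might be scaled onto colour balance and playhead speed. The parameter names and ranges are illustrative assumptions rather than the actual mixer settings.

```python
# Sketch of shots 1 and 2: mapping MIDI values onto mixer parameters.
# Ranges and parameter names are illustrative, not the mixer's API.

def scale(value, lo, hi):
    """Map a 0-127 MIDI value into the range [lo, hi]."""
    return lo + (value / 127.0) * (hi - lo)

def colour_balance(velocity, pitch):
    # Shot 1: harder key presses push the warm channel,
    # higher notes push the cool channel.
    return {"warm": scale(velocity, 0.0, 1.0),
            "cool": scale(pitch, 0.0, 1.0)}

def playhead_speed(pitch):
    # Shot 2: map pitch onto playback speed from -2x to +2x,
    # so lower notes run the clip backwards.
    return scale(pitch, -2.0, 2.0)

print(colour_balance(velocity=100, pitch=72))  # {'warm': 0.78.., 'cool': 0.56..}
print(playhead_speed(pitch=48))                # negative => reverse playback
```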