Projecting images mapped to the surface features of, for example, a building has been a great source of inspiration for me regarding the re-mapping and re-imagining of space and place through the moving image.
Although familiar with projecting live and composited images from VJing, I had never before come across this idea of using structures, with their textures and nuances, as a projection screen (examples here: http://mashable.com/2011/04/24/3d-projection-mapping/#OqLFYcretDg).
Inspired both by how this method can transform a place and by the potential scale of what could be achieved using Puredata and nothing more than a projector, I wanted to investigate projection mapping at both small and large scales, and to develop a means of incorporating my research to date into a final piece.
For these experiments and the final piece I wanted to revisit some of the themes that had drawn my attention during previous modules, including Experimental Practices and Histories and Theories, and use these as a basis to inform my development of a projection mapping patch in Puredata.
TEST #1 During the two-week period 11th to 21st July 2011, I was able to set up an installation as part of the “Great Big Empty Shop Experiment” (http://cci.glam.ac.uk/big-shop/), inspired by my Experimental Practices research into re-mapping time and space, the sense of the uncanny (das Unheimliche) (Freud, 1919), how the outside can become inside, daytime in the dark (Carroll, 1996) and the divided cinema (Cubitt, 2004).
(Stills from the Great Big Empty Shop experiment. The installation was only viewable through a reversed spy hole on the door.)
This experience gave me a great opportunity to experiment with perception (the viewer only able to witness a fish-eye view of the installation) and the sense of place (the projected image showed close-ups of lush greenery, somehow out of place in a room of shop mannequins).
Taking these ideas forward and using projection mapping techniques developed in Puredata, I produced the following test piece, which again explores the idea of das Unheimliche (the familiar made unfamiliar) by re-mapping and animating the mannequin’s face.
The “still life” here had some issues with spill, due mainly to the arrangement having been moved before projection, although the handheld camera-phone footage does allow the viewer to get a sense of the textural nature of the piece, especially the bottle, which is not completely covered.
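At the heart of these tests is corner-pinning: warping a rectangular video frame onto a quadrilateral traced over the physical object, which in Puredata/GEM is done by texturing a distorted polygon. The sketch below shows the underlying mathematics in plain Python, with no dependencies; the corner coordinates are illustrative values, not those of any actual patch.

```python
# Corner-pin sketch: compute the 3x3 homography that maps the four
# corners of a unit-square video frame onto a hand-traced quadrilateral,
# then warp individual points through it.

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def homography(src, dst):
    """Projective transform taking 4 src points onto 4 dst points (DLT)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    a_, b_, c_, d_, e_, f_, g_, h_ = solve(A, b)
    return [[a_, b_, c_], [d_, e_, f_], [g_, h_, 1.0]]

def warp(H, x, y):
    """Map one point of the source frame into projector coordinates."""
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

# Unit-square frame pinned to a made-up quad traced over the surface.
src = [(0, 0), (1, 0), (1, 1), (0, 1)]
dst = [(12, 8), (310, 25), (298, 230), (20, 214)]
H = homography(src, dst)
```

In practice the four destination corners are dragged into place by eye while the projector runs, which is essentially what aligning the mapping patch against the mannequin involved.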
TEST #3: Final Piece For this final test piece I wanted to bring together some of the ideas, technologies and theory discussed in this document, including projection mapping (this time on a larger scale), OSCeleton, perception of space and place, QR tags and Blender simulations. Although there is no ReacTIVision component at the time of writing, it could be incorporated as a Puredata patch (as I did with the OSCeleton patch), but I feel it would be better suited to a collaborative interactive piece.
Another addition to this piece is the iPad OSC controller developed with the open-source Mrmr (http://mrmr.noisepages.com/) interface designer. This controller links to Puredata via a computer-to-computer network and, while it can be developed to send multiple data streams (including accelerometer data), for the purposes of this test it mainly controls the activation of video sources.
(Developing my Mrmr iPad interface controlling Puredata)
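Under the hood, the Mrmr-to-Puredata link is Open Sound Control messages sent over UDP, which the patch unpacks to switch video sources. The sketch below hand-encodes one such message in plain Python to show the wire format; the address `/video/1/active` and port 9001 are illustrative assumptions, not the values used in my actual patch.

```python
# Minimal OSC-over-UDP sketch: encode an address string, a ",f" type
# tag and one big-endian float, each padded to a 4-byte boundary as
# the OSC 1.0 specification requires, then fire it at a listener.
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    """OSC strings are null-terminated and padded to a 4-byte boundary."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, value: float) -> bytes:
    """Encode a single-float OSC message."""
    return (osc_pad(address.encode()) +
            osc_pad(b",f") +
            struct.pack(">f", value))

def send_toggle(address="/video/1/active", value=1.0,
                host="127.0.0.1", port=9001):
    """Send one message to the Puredata listener (hypothetical address/port)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(osc_message(address, value), (host, port))
    sock.close()

packet = osc_message("/video/1/active", 1.0)
```

A slider or accelerometer stream from Mrmr is just the same message shape sent repeatedly with a changing float value, which is why the controller can be extended to multiple data streams without changing the receiving patch's structure much.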
These projection mapping tests have allowed me to bring together various areas of interest into one piece that affects the perception and experience of space and place, and of the “real” and the “virtual”, and shows how they can help re-map and re-envisage a space.
Whilst these are very much test pieces, I think they can inform the direction of my development towards my main production project.
In terms of development, the key positive outcome of this research is a library of Puredata patches that can be re-used and re-configured for different purposes.