070514 Visuals/Video Technology
Project: Design Media Arts at UCLA
The visuals for my piece will use videos of nature that represent your own memories or those of people in your social network (friends and family). These memories will be reinterpreted using conditions of the individual, such as motion, heart rate, and respiratory rate.
The video element will then be created by blending blurred video of nature with graphic elements that react to the different streams of data from sensing devices. The nature memory videos will be blurred so that it can't be determined what memory they actually represent. The installation should be thought of as creating a unique and original experience rather than re-experiencing a memory.
I have been researching different methods for doing this live, in a way that keeps the video smooth and responsive to the sensor data. Processing is the best tool for creating images and graphics in response to sensor data, but it doesn't always maintain a smooth frame rate. On the other hand, Quartz Composer does very well with real-time video. If these two technologies could somehow be combined, that would provide the best solution.
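To illustrate the Processing side of this idea, here is a minimal sketch of how a sensor reading might drive a graphic parameter. The class name, the heart-rate range, and the mapping itself are my own assumptions for illustration, not the project's actual code; in the real piece the weight would feed a blend between the blurred nature footage and the generated graphics.

```java
// Hypothetical mapping from a sensor reading to a video-blend weight.
// The 50..150 bpm range and linear mapping are assumptions for illustration.
public class BlendMapper {
    // Map a heart rate (beats per minute) to a 0..1 blend weight,
    // clamping to a plausible resting-to-exertion range of 50..150 bpm.
    public static double blendWeight(double heartRateBpm) {
        double t = (heartRateBpm - 50.0) / (150.0 - 50.0);
        return Math.max(0.0, Math.min(1.0, t));
    }
}
```

A value like this could be recomputed every frame in Processing's `draw()` loop and used to set the opacity of the graphic layer, so the imagery reacts continuously to the participant's state.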
I had no success in finding a way to bring live images from Processing into Quartz Composer, so I decided to test it manually. This required feeding the Processing video out of one computer and inputting it into another computer, which runs it through Quartz Composer and blends it with a second video layer. The second layer is the videos of memories from nature. The test produced promising results. The only downside is that it requires a lot of equipment.