    From etaymor's personal cargo

    Project 2: Phase 6: Video Documentation
    Project: Emergence and Navigating Space
    (Continued from previous post by Ben)

    For the past one and a half months, Ben and I have been toiling away on our multi-touch table. When we first set out on the journey, we did not anticipate the magnitude of the project or the number of issues we would run into. From figuring out the electronics, to the physical table, to the projection, the compliant surface, the computer setup, and the software interfacing, for every step forward there was another hill to climb. However, we were able to persevere through our problems and ended up with what we think is a very successful piece.

    To recap our project, Ben and I built a multi-touch table that applications can interface with through software. Our goal was that the project would not end with this class: we hope the table continues to live in the Design | Media Arts Department at UCLA so that other students can design their own software for it. We believe the future of technology and interaction lies in multi-touch. With the Apple iPhone setting off a huge sensation in the technology and consumer world, things are slowly moving in that direction. Multi-touch gives interfaces a tactile quality, letting users feel more directly involved with their technology. Multi-touch computing could also help alleviate medical problems caused by overusing the mouse and keyboard, which has been linked to carpal tunnel syndrome.

    Our table is based on Frustrated Total Internal Reflection (FTIR), a technique popularized for multi-touch sensing by Professor Jeff Han at NYU. IR LEDs sit in an aluminum rail around the edges of a sheet of acrylic. The LEDs scatter light throughout the acrylic, and when something makes contact with the surface (in our case a finger), the light is reflected downward off the finger. Below the table we have a webcam that has been hacked to sense IR light, so each reflection shows up to the camera as a bright blob. Using open-source tracking software, the camera image is filtered so that only these blobs are read. The software also calibrates against the screen to determine the location of each touch, and the touches are then communicated to your application. Applications can be written in Flash AS3, C++, and Processing.org, and software interfacing can also be done through the reacTIVision client (http://reactable.iua.upf.edu/?media).

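    To make the blob-reading step more concrete, here is a rough sketch of the idea (not the actual open-source tracker we used): threshold the grayscale IR camera frame for brightness, group neighboring bright pixels, and report one centroid per blob. The frame format and threshold value here are assumptions for the example.

    // Illustrative sketch only, not our tracker's code: find bright blobs in a
    // grayscale IR camera frame and return one centroid per blob.
    import java.util.ArrayDeque;
    import java.util.ArrayList;
    import java.util.List;

    public class BlobFinder {

        public static class Blob {
            public double x, y;   // centroid in camera pixel coordinates
            public int size;      // number of pixels in the blob
        }

        // frame[row][col] holds grayscale values 0-255 from the IR camera.
        public static List<Blob> findBlobs(int[][] frame, int threshold) {
            int h = frame.length, w = frame[0].length;
            boolean[][] visited = new boolean[h][w];
            List<Blob> blobs = new ArrayList<>();
            for (int y = 0; y < h; y++) {
                for (int x = 0; x < w; x++) {
                    if (frame[y][x] < threshold || visited[y][x]) continue;
                    // Flood-fill this bright region and accumulate its centroid.
                    Blob b = new Blob();
                    long sumX = 0, sumY = 0;
                    ArrayDeque<int[]> stack = new ArrayDeque<>();
                    stack.push(new int[]{x, y});
                    visited[y][x] = true;
                    while (!stack.isEmpty()) {
                        int[] p = stack.pop();
                        int px = p[0], py = p[1];
                        sumX += px; sumY += py; b.size++;
                        int[][] nbrs = {{px + 1, py}, {px - 1, py}, {px, py + 1}, {px, py - 1}};
                        for (int[] n : nbrs) {
                            int nx = n[0], ny = n[1];
                            if (nx >= 0 && nx < w && ny >= 0 && ny < h
                                    && !visited[ny][nx] && frame[ny][nx] >= threshold) {
                                visited[ny][nx] = true;
                                stack.push(new int[]{nx, ny});
                            }
                        }
                    }
                    b.x = (double) sumX / b.size;
                    b.y = (double) sumY / b.size;
                    blobs.add(b);
                }
            }
            return blobs;
        }
    }
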
    We feel that a lot of software could scale well to this new interface; ordinary programs and gallery exhibitions could take on new dimensions. Multi-touch also has an advantage over conventional touch environments, which can only read one finger at a time. Most touch interfaces in everyday use are single-touch (ATMs, touch-activated lights, restaurant kiosks), but multi-touch lets you take advantage of multiple simultaneous inputs. Not only can one person use several fingers at once, but several people can use the table at the same time!

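    As a small illustration of what multiple simultaneous inputs make possible, here is a sketch of a pinch-to-zoom gesture computed from two touch points. The Touch class and the coordinates in the example are assumptions for illustration, not part of our software.

    // Illustrative sketch only: compute a zoom factor from two simultaneous
    // touch points, the kind of gesture a single-touch kiosk cannot support.
    public class PinchZoom {

        public static class Touch {
            public double x, y;
            public Touch(double x, double y) { this.x = x; this.y = y; }
        }

        // Zoom factor = current finger spread divided by the spread when the
        // gesture began; greater than 1 means zoom in, less than 1 means zoom out.
        public static double zoomFactor(Touch startA, Touch startB, Touch nowA, Touch nowB) {
            double startDist = Math.hypot(startA.x - startB.x, startA.y - startB.y);
            double nowDist = Math.hypot(nowA.x - nowB.x, nowA.y - nowB.y);
            return nowDist / startDist;
        }

        public static void main(String[] args) {
            // Two fingers spread from 100 px apart to 150 px apart: 1.5x zoom.
            double z = zoomFactor(new Touch(100, 100), new Touch(200, 100),
                                  new Touch(75, 100), new Touch(225, 100));
            System.out.println("zoom factor = " + z);
        }
    }
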
    We hope that others can use our multi-touch table in the future and that we can continue to develop software for it. We have included pictures of our table in progress, from the beginning up to the finished product. Videos of the table in use have also been posted below.

    This is a video of the configuration software with the background not yet removed. It shows zero-force touches being picked up clearly by the webcam.

    Another view of the configuration software, this time with the background removed. Now all you see are the touches, read with great contrast, which makes for very good input when building applications.
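
    For anyone curious how the background-removed view works, here is a sketch of the idea (not the tracker's actual code): capture a frame of the empty table once, then subtract it from every live frame so that only the bright finger blobs remain. The frame format is an assumption for the example.

    // Illustrative sketch only: produce the "background removed" view by
    // subtracting a stored empty-table frame from each live camera frame.
    public class BackgroundSubtraction {

        // background: a frame captured with nothing touching the table.
        // live: the current camera frame. Both are grayscale 0-255.
        public static int[][] subtract(int[][] background, int[][] live) {
            int h = live.length, w = live[0].length;
            int[][] out = new int[h][w];
            for (int y = 0; y < h; y++) {
                for (int x = 0; x < w; x++) {
                    // Keep only brightness above the empty-table level; clamp at
                    // zero so ambient IR and hot spots are removed.
                    out[y][x] = Math.max(0, live[y][x] - background[y][x]);
                }
            }
            return out;
        }
    }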

    A video of calibrating the system, showing the raw camera footage. Calibration lines up the detected touches with their positions on screen.
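
    As a simplified sketch of what calibration accomplishes: once the corners of the projected area are known in camera coordinates, each blob can be mapped into screen coordinates. This version assumes the camera sees the projected image as an undistorted, axis-aligned rectangle; the real calibration uses a grid of points to correct for lens and perspective distortion.

    // Illustrative sketch only: map a touch position from camera pixels to screen
    // pixels, given the camera-space rectangle recorded during calibration.
    public class Calibration {

        private final double camLeft, camTop, camRight, camBottom;
        private final int screenW, screenH;

        public Calibration(double camLeft, double camTop, double camRight, double camBottom,
                           int screenW, int screenH) {
            this.camLeft = camLeft;
            this.camTop = camTop;
            this.camRight = camRight;
            this.camBottom = camBottom;
            this.screenW = screenW;
            this.screenH = screenH;
        }

        // Map a blob centroid from camera coordinates to screen coordinates.
        public double[] toScreen(double camX, double camY) {
            double u = (camX - camLeft) / (camRight - camLeft);   // 0..1 across the surface
            double v = (camY - camTop) / (camBottom - camTop);    // 0..1 down the surface
            return new double[]{u * screenW, v * screenH};
        }
    }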

    A video with an application running. This application is a bit processor-intensive, so you see a little lag; Ben noted this in his post. A faster webcam would help (FireWire cameras are probably the way to go), and a better graphics card might also improve performance. Still, you can see the system working pretty well.

    Sat, Mar 22, 2008

    Sent to project: Emergence and Navigating Space