Here’s an initial prototype of the “touch screen” interface. Right now it just uses the motion tracker widget to track motion, though that’s sort of finicky. I’m wondering if it might be better to use something like an IR camera so it tracks more precisely. The monitor in front is also way too big. But I do like that if you were sitting in front of the monitor and reached your arm around the back side, you’d basically have to hug the screen, which forces you to get pretty intimate with it.
Also, while I was trying to figure out how to get another camera to work as a webcam, I came across this tutorial. I wasn’t able to get my Canon Rebel working because I haven’t been able to find the EOS Utility software yet. But I did get CamTwist going, and got pretty excited about the prospects of that alone.
Basically it lets you draw a box around an area of your screen, which then becomes another “camera” source. So in the Netlab motion tracker widget you can use that as the camera, meaning you’re no longer bound to the camera on your laptop or one plugged in over USB. I could even use a bunch of those IP cameras we have from the show.
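If I ever wanted to roll my own tracker instead of the widget, the core idea is just frame differencing on whatever feed OpenCV sees. Here’s a rough Python sketch — the device index for the CamTwist virtual camera is a guess (it usually shows up after the built-in one), and the capture loop assumes OpenCV is installed:

```python
import numpy as np

def motion_center(prev, cur, thresh=25):
    """Centroid (x, y) of pixels whose brightness changed by more
    than thresh between two grayscale frames, or None if nothing moved."""
    diff = np.abs(cur.astype(int) - prev.astype(int))
    ys, xs = np.nonzero(diff > thresh)
    if len(xs) == 0:
        return None
    return (xs.mean(), ys.mean())

if __name__ == "__main__":
    import cv2  # assumption: OpenCV available for grabbing frames
    # Index 0 is usually the built-in camera; the CamTwist virtual
    # camera tends to appear as the next index (1 here is a guess).
    cap = cv2.VideoCapture(1)
    ok, frame = cap.read()
    prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        center = motion_center(prev, gray)
        if center:
            print("motion at", center)
        prev = gray
        if cv2.waitKey(1) == 27:  # Esc quits
            break
    cap.release()
```

Since the virtual camera is just another device index, the same loop works whether the frames come from the laptop camera, CamTwist, or (via a screen box over their viewer pages) the IP cameras.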
In the tests above I tried it with a YouTube video, a Street View, and a live chat, and they all worked just fine, though I think Street View is too low-contrast. The most exciting part was the live chat, because it means I could actually have people’s webcams control something on my computer, or, if I hook some servos up to the motion tracker, something in a physical location. It’s also interesting that it lets me use both my laptop camera and the CamTwist camera at the same time. I feel like there’s a nugget of an interesting idea in here but I haven’t quite sorted it out yet. It’s a little confusing thinking about all the different camera feeds…
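The servo part could be pretty simple too: map the tracker’s horizontal motion position to a servo angle and push it over serial to an Arduino. A sketch of that idea — the serial port name is a guess for a Mac, and pyserial plus a one-byte protocol on the Arduino side are my own assumptions, not anything the widget provides:

```python
def x_to_angle(x, width):
    """Map a horizontal pixel position (0..width-1) to a 0-180 servo angle."""
    x = max(0, min(x, width - 1))
    return round(x * 180 / (width - 1))

if __name__ == "__main__":
    import serial  # assumption: pyserial installed
    # Port name is a guess -- check `ls /dev/tty.*` on a Mac.
    link = serial.Serial("/dev/tty.usbmodem1101", 9600)
    # Pretend the tracker reported motion at x=480 in a 640-wide frame:
    angle = x_to_angle(480, 640)
    # Arduino side would read one byte and call servo.write(angle).
    link.write(bytes([angle]))
```

So someone waving in a live chat window could, in principle, swing a servo on my desk.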