Independent Efforts
I have written a few augmented and mixed reality programs while experimenting with the concept of interactive physical objects. Two of these experiments follow.
1) An interface that turns any physical surface into a touch screen. The code and demo serve as lesson material in the augmented reality workshops I organize for Bangladeshi engineering students. The demo uses a mini projector to project the computer screen onto a physical surface, and a webcam to auto-calibrate the projection area and detect hand motion (a minimal calibration sketch appears after this list).
2) Using a laser pointer as a “poor man’s game controller”; a brightest-pixel tracking sketch in this spirit also appears below. (Code and schematics available upon request.)
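The workshop code itself is not reproduced here, but the auto-calibration step can be illustrated with a minimal Processing sketch (Processing is the same environment I use in the lab work below). This version assumes the webcam faces the surface roughly head-on, so an axis-aligned bounding box of the bright projected region is enough to map camera coordinates to screen coordinates; a camera at an angle would need a full homography instead. The brightness threshold and warm-up frame count are tuning assumptions.

```
import processing.video.*;

Capture cam;
int x0, y0, x1, y1;        // projected area's bounding box in camera space
boolean calibrated = false;

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
}

void draw() {
  if (cam.available()) cam.read();
  if (!calibrated) {
    background(255);                   // flood the surface with white light
    if (frameCount > 60) calibrate();  // give the camera time to see it
    return;
  }
  background(0);
  // ...detect a fingertip at (fx, fy) in camera space, then draw a cursor
  // at camToScreen(fx, fy); the detection itself is omitted in this sketch.
}

void calibrate() {
  cam.loadPixels();
  x0 = cam.width; y0 = cam.height; x1 = 0; y1 = 0;
  // Find the extent of the brightly lit (projected) region.
  for (int y = 0; y < cam.height; y++) {
    for (int x = 0; x < cam.width; x++) {
      if (brightness(cam.pixels[y * cam.width + x]) > 200) {  // tuning assumption
        x0 = min(x0, x); y0 = min(y0, y);
        x1 = max(x1, x); y1 = max(y1, y);
      }
    }
  }
  calibrated = (x1 > x0 && y1 > y0);
}

// Map a point seen by the camera onto screen/projector coordinates.
PVector camToScreen(float cx, float cy) {
  return new PVector(map(cx, x0, x1, 0, width), map(cy, y0, y1, 0, height));
}
```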
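The laser-pointer controller boils down to brightest-pixel tracking: a laser dot usually saturates the camera sensor, so scanning for the brightest pixel above a cutoff recovers its position. The following is a minimal sketch of that idea, not the actual controller code; the cutoff of 240 is an assumption to tune for a particular camera.

```
import processing.video.*;

Capture cam;

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
}

void draw() {
  if (cam.available()) cam.read();
  image(cam, 0, 0);
  cam.loadPixels();
  float maxB = 0;
  int maxX = 0, maxY = 0;
  // A laser dot typically saturates the sensor, so the brightest pixel
  // is a good estimate of its position.
  for (int y = 0; y < cam.height; y++) {
    for (int x = 0; x < cam.width; x++) {
      float b = brightness(cam.pixels[y * cam.width + x]);
      if (b > maxB) { maxB = b; maxX = x; maxY = y; }
    }
  }
  if (maxB > 240) {               // cutoff is an assumption; tune per camera
    noFill();
    stroke(255, 0, 0);
    ellipse(maxX, maxY, 20, 20);  // feed (maxX, maxY) to the game as input
  }
}
```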
DRAB Lab
[Picture courtesy of DRAB Lab]
In the Distributed Robotics Lab at Bard College (DRAB Lab), we devised and built applications that let us interact with mobile robot systems such as the Roomba. My own project in the lab was to implement applications for a mobile projector-and-camera system that projects interactive animations and images onto a flat surface and lets the user interact with the images in real time. I have written a vision program for the robot in Processing that detects hand and foot movement using frame-differencing techniques (a minimal sketch of the approach follows). Building on this program, I have written an augmented reality game in which the user can interact with, and move around in, a 3D world projected onto a flat surface.
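The lab code is not reproduced here, but the core frame-differencing idea fits in a few lines of Processing: compare each pixel of the current camera frame with the previous frame and mark the pixels that changed as motion. The per-pixel threshold of 50 is a tuning assumption.

```
import processing.video.*;

Capture cam;
PImage prevFrame;

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
  prevFrame = createImage(width, height, RGB);
}

void draw() {
  if (!cam.available()) return;
  // Save the last frame before reading the new one.
  prevFrame.copy(cam, 0, 0, cam.width, cam.height, 0, 0, width, height);
  cam.read();
  cam.loadPixels();
  prevFrame.loadPixels();
  loadPixels();
  for (int i = 0; i < width * height; i++) {
    float diff = abs(brightness(cam.pixels[i]) - brightness(prevFrame.pixels[i]));
    // Pixels that changed between frames are treated as motion (white).
    pixels[i] = diff > 50 ? color(255) : color(0);  // threshold: tuning assumption
  }
  updatePixels();
}
```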
[Picture courtesy of DRAB Lab]
I have also been involved with the Bird’s Eye project, a collaboration between computer science and dance. An overhead camera mounted above the stage captured a dancer’s movement in real time, and live interactive visuals were projected onto the backdrop and the floor. We deployed the setup in a live performance before an audience in May 2011 (a sketch of the kind of tracking such a setup might use follows).
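The tracking behind this kind of interaction can be sketched by extending the frame differencing above: average the coordinates of all moving pixels to get a centroid that approximates the performer's position, and drive the projected visuals from that point. This is an illustrative sketch, not the project's code; both thresholds are assumptions to tune.

```
import processing.video.*;

Capture cam;
PImage prevFrame;

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
  prevFrame = createImage(width, height, RGB);
}

void draw() {
  if (!cam.available()) return;
  prevFrame.copy(cam, 0, 0, cam.width, cam.height, 0, 0, width, height);
  cam.read();
  cam.loadPixels();
  prevFrame.loadPixels();
  float sx = 0, sy = 0;
  int n = 0;
  for (int y = 0; y < cam.height; y++) {
    for (int x = 0; x < cam.width; x++) {
      int i = y * cam.width + x;
      if (abs(brightness(cam.pixels[i]) - brightness(prevFrame.pixels[i])) > 40) {
        sx += x; sy += y; n++;
      }
    }
  }
  background(0);
  if (n > 200) {  // ignore camera noise; both thresholds need tuning
    // The centroid of moving pixels approximates the dancer's position;
    // in a performance, the projected visuals would respond to this point.
    fill(255);
    ellipse(sx / n, sy / n, 40, 40);
  }
}
```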