We’ve reached the point where a camera-based gesture interface, a pico-projector, and wireless computing have come together in SixthSense, a wearable prototype that lets the user grab information about anything in the vicinity. Aim at a package on a supermarket shelf to see its environmental information; aim at a building to take its picture or see its layout in a map view; project a telephone keypad onto your hand and dial away. Information that’s rich and relevant.
Right now, of course, you have to be able to see pretty well, point your head at a target without shaking, and move your hands and fingers accurately and consistently. But we’re gonna fix that at today’s meeting, right, gang? OK, everybody push on the big steel lab door!
There’s a video of this prototype in operation. (Forget our negativity: this thing is bangin’.)