Microsoft has been on a user interface kick this week, what with their new OmniTouch and trans-fabric interfaces. But the HoloDesk project might be the coolest demo of all, at least for those of us who lie awake at night fantasizing about our very own holodeck.
Microsoft's HoloDesk combines a Kinect sensor and a beam splitter to create a virtual 3D environment that you can manipulate with your hands. The concept is simple enough: a top-mounted projector displays an image downward towards a work area, and a piece of half-silvered glass reflects that image towards your eyeballs while still allowing you to see what's going on behind it. Placing your hands inside the work area lets the Kinect sensor see what you're doing, and the display dynamically updates to let you virtually interact with objects that aren't really there.
The key to the whole system is a webcam that tracks the location of your head and eyes to make sure that the orientation of the projection stays constant. This means that you can move your head around and get slightly different angles on things without losing the synchronization between your hands and the virtual objects inside the display. The system works with real objects, too: anything you put inside the display gets incorporated into the simulation thanks to the Kinect sensor.
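To get a feel for why that head tracking matters, here's a minimal geometric sketch (not Microsoft's actual code; the function name, coordinates, and the assumption that the display sits on the plane z = 0 are all made up for illustration). It shows how the spot where a virtual object must be drawn on the half-silvered glass shifts as your head moves:

```python
def project_to_display(eye, point, display_z=0.0):
    """Intersect the ray from the eye through a virtual 3D point
    with the display plane z = display_z (a simple parallax model).
    Coordinates are (x, y, z) tuples in arbitrary units."""
    ex, ey, ez = eye
    px, py, pz = point
    if ez == pz:
        raise ValueError("eye and virtual point are coplanar with the display")
    t = (display_z - ez) / (pz - ez)  # ray parameter where it hits the plane
    return (ex + t * (px - ex), ey + t * (py - ey))

# The same virtual point (floating "below" the glass) must be drawn at
# different screen positions depending on where the viewer's head is:
straight_on = project_to_display(eye=(0.0, 0.0, 50.0), point=(0.0, 0.0, -10.0))
from_the_side = project_to_display(eye=(20.0, 0.0, 50.0), point=(0.0, 0.0, -10.0))
```

Viewed head-on, the point projects to the center of the glass; step to the side and the drawing position slides over, which is exactly the correction the head-tracking webcam lets the system make every frame.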
So now that we have a seamless way to interact with virtual objects, all that we're really missing is a good way of getting those objects to interact right back at us, and we'd have ourselves a holodeck. Wait, you say we've got that already too and it doesn't involve brain implants? Well, blimey, what are we waiting for, then? Let's get that holodeck up and running!