The 30th annual Conference on Human Factors in Computing Systems is going on right now in Texas, and it looks like Microsoft is there in force. It's showing off five new concepts in human-computer interaction, from body part projection to sound wave control to motion-capturing antennas made of people.
1. Humantenna

Think of Humantenna as a Kinect sensor, except without needing a Kinect sensor. Users wear a backpack (this is just a prototype, so it's very bulky) that measures, over time, the voltage across the surface of the body, and changes in this voltage allow the system to identify movements, gestures, and locations. Where does the voltage come from in the first place? Ambient electromagnetic noise emanating from light fixtures, power lines, and other electronic devices.
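To give a rough idea of the recognition step, here's a minimal sketch: summarize a window of voltage samples into a few features, then match against per-gesture centroids learned ahead of time. The feature set, gesture names, and numbers are all illustrative assumptions, not details from Microsoft's system.

```python
import math

def features(window):
    """(mean level, variance) of one window of body-surface voltage samples."""
    mean = sum(window) / len(window)
    var = sum((v - mean) ** 2 for v in window) / len(window)
    return (mean, var)

# Toy "training" output: gesture -> typical feature vector (made-up values)
CENTROIDS = {
    "idle":  (0.10, 0.01),
    "wave":  (0.60, 0.25),
    "punch": (1.40, 0.90),
}

def classify(window):
    """Nearest-centroid match of a sample window to a known gesture."""
    f = features(window)
    return min(CENTROIDS, key=lambda g: math.dist(CENTROIDS[g], f))
```

A real system would use far richer features (frequency content, per-limb timing) and a trained classifier, but the shape of the pipeline is the same: signal window in, gesture label out.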
2. LightGuide

It's getting easier and easier to learn new physical skills over the Internet thanks to things like video tutorials, but there's still no way to replace a human with a ruler standing there to give you a good hard smack if you screw something up. LightGuide is intended to be a virtual teacher of sorts: it combines motion sensors with a digital projector that interactively shows you how to move your limbs to complete a task. No ruler smacks (yet), but still more effective than simply watching someone else do it.
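One step of that feedback loop might look like the sketch below: given the tracked hand position and the next waypoint of the motion being taught, compute the direction cue for the projector to draw on the hand. The function name, parameters, and tolerance are assumptions for illustration, not details of Microsoft's implementation.

```python
import math

def guidance_cue(hand, waypoint, tolerance=0.02):
    """Return a unit direction vector (for the projector to render as an
    arrow on the hand) pointing from the tracked hand position toward the
    next waypoint, or None once the hand is within tolerance of it."""
    delta = [w - h for h, w in zip(hand, waypoint)]
    distance = math.sqrt(sum(d * d for d in delta))
    if distance < tolerance:
        return None  # waypoint reached; advance to the next one
    return [d / distance for d in delta]
```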
3. SoundWave

We can all agree that it would be pretty cool to have a Kinect embedded in laptops and smartphones, but since we're not there yet, Microsoft has decided to cheat a bit and fake the same sort of gesture interaction with the speakers and microphone your computer already has. SoundWave uses your computer's speakers to emit ultrasonic sound waves that your microphone can hear but you can't, and then monitors those sound waves for subtle frequency shifts caused by the Doppler effect. When you move your hand around in front of the computer, the ultrasonic sound waves are reflected back to the microphone at slightly different volumes and wavelengths, and these data can be used to measure velocity, direction, proximity, size, and rate of change.
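The physics is simple enough to sketch. A reflection off a moving hand is shifted twice (the hand "receives" the tone shifted, then "re-emits" it shifted again), so the observed shift is roughly 2·v·f/c. The pilot-tone frequency and hand speed below are illustrative numbers, not values from the SoundWave paper.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at room temperature

def doppler_shift(tone_hz, hand_velocity):
    """Approximate frequency shift (Hz) of an ultrasonic tone reflected off
    a hand moving at hand_velocity (m/s, positive = toward the microphone).
    The factor of 2 accounts for the round trip to the moving reflector."""
    return 2.0 * hand_velocity * tone_hz / SPEED_OF_SOUND

shift = doppler_shift(18_000, 0.5)  # 18 kHz tone, hand approaching at 0.5 m/s
```

A hand moving at half a meter per second shifts an 18 kHz tone by about 52 Hz, which is easily visible in an FFT of the microphone signal; the sign of the shift distinguishes an approaching hand from a retreating one.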
4. LCD Dual View
You know how if you have a relatively cheap LCD monitor and you try to look at it from too extreme an angle, the colors go all wonky? Microsoft has somehow figured out how to take advantage of that to display two images on one regular LCD monitor at the same time. I have no idea how this actually works (something about spatial multiplexing of pixel colors), but you can do some very cool tricks with it, including glasses-free 3D.
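The "spatial multiplexing" part, at least, is easy to illustrate: weave two images together pixel-by-pixel so each viewer's angle picks out one of them. The sketch below only shows that interleaving idea; how the panel actually steers different columns to different viewing angles is the part Microsoft isn't spelling out, and this is not their algorithm.

```python
def interleave(img_a, img_b):
    """Spatially multiplex two same-sized images (lists of pixel rows):
    even columns come from img_a, odd columns from img_b."""
    return [
        [a if x % 2 == 0 else b for x, (a, b) in enumerate(zip(row_a, row_b))]
        for row_a, row_b in zip(img_a, img_b)
    ]
```

Feed it a left-eye and a right-eye image instead of two unrelated pictures and you get the glasses-free 3D trick mentioned above.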
5. MirageTable

MirageTable combines gaze tracking, a projector, and a 3D display to simulate dynamic and physically realistic behavior in a 3D environment that you can move your head to look around in. Real 3D, in other words. Linking two of these systems together also allows for interactive collaboration in a shared virtual environment.
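The "move your head to look around" effect comes down to head-coupled perspective: re-render the scene every frame from wherever the tracked eye actually is. A minimal sketch, assuming a screen at the plane z = 0 and simple ray casting (not MirageTable's actual renderer):

```python
def project_to_screen(point, eye):
    """Cast a ray from the tracked eye position through a virtual 3D point
    and intersect it with the screen plane z = 0. Re-running this for every
    scene point each frame produces motion parallax as the head moves."""
    px, py, pz = point
    ex, ey, ez = eye
    t = ez / (ez - pz)  # ray parameter where the ray reaches z = 0
    return (ex + t * (px - ex), ey + t * (py - ey))
```

Note how the same virtual point lands at different screen coordinates as the eye moves sideways, which is exactly the parallax cue that makes the scene read as real 3D.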