"The desktop metaphor is dead, long live the post-PC era" is all we ever hear these days. What we don't see very often is how computer desktops could be improved with 3D depth. Watch this demo of a transparent OLED display paired with a Kinect sensor and tell me it doesn't make you lust for a Minority Report future.
Created by Jinha Lee and Cati Boulanger, two researchers at Microsoft Applied Sciences, the "see-through computer" looks like a pretty generic rectangular box at first. But pull its display upward and it turns into an awesome machine that lets you "touch" your apps and windows.
Yeah, sticking your hand underneath the display to type looks awkward, but take a look at how the 3D depth sensor lets a user manipulate onscreen elements. It works by using the depth sensor to capture the user's finger positions, which are then translated into input commands.
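The video doesn't explain how those finger positions become commands, but the basic idea can be sketched roughly like this. Everything here is a hypothetical illustration, not Microsoft's actual pipeline: the `Finger` type, the depth threshold, and the command names are all made up for the example.

```python
from dataclasses import dataclass

@dataclass
class Finger:
    x: float  # horizontal position relative to the sensor (meters)
    y: float  # vertical position (meters)
    z: float  # height above the keyboard plane (meters)

def to_command(finger: Finger, pinching: bool) -> str:
    # Hypothetical rule: fingers resting near the keyboard plane mean the
    # user is typing; fingers raised into the volume behind the display
    # mean 3D manipulation. The 5 cm threshold is an assumption.
    if finger.z < 0.05:
        return "type"
    return "grab" if pinching else "hover"
```

Because the sensor reports positions continuously, a scheme like this could also explain why switching between typing and 3D interaction looks seamless: it's just a threshold crossing, not an explicit mode change.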
In the video demo, you can see windows moved around and various objects being "grabbed" in 3D space with relative ease. Switching between typing and 3D interaction seems to be a seamless experience too. There's even head-tracking to keep the 3D viewing angles in check.
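Head-tracking like this usually means adjusting the rendered scene so its perspective follows the viewer's eyes. A minimal sketch of that idea, assuming the tracker reports the head's position relative to the center of the display (the function name and coordinate convention are my own, not from the demo):

```python
import math

def view_offset(head_x: float, head_y: float, head_z: float) -> tuple[float, float]:
    # Hypothetical: convert the tracked head position into yaw/pitch angles
    # (degrees) by which to skew the rendered 3D scene, so objects behind
    # the transparent display stay visually aligned as the head moves.
    yaw = math.degrees(math.atan2(head_x, head_z))
    pitch = math.degrees(math.atan2(head_y, head_z))
    return yaw, pitch
```

A head centered in front of the screen yields a zero offset; moving sideways skews the scene toward the viewer, which is what "keeping the 3D viewing angles in check" amounts to.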
If this is a glimpse at what computing will look like in 10 or 20 years, count me in.