An electrical emitter that sits on your tongue offers a new way to perceive the world around you.
Bar-Ilan University student Tirosh Shapira may have been lying in an fMRI machine in Israel, but he wasn't immobile. Nearly 2,000 miles away in France, a robot moved whenever Shapira thought about moving, steered by his brain activity. It's the first time a robot has been controlled directly via fMRI.
At what point will we be able to casually chat with our gadgets like the crew of the USS Enterprise does with its computer on Star Trek, or like Dave Bowman and Frank Poole do in 2001 before HAL went violently bonkers? We're taking baby steps toward normalized machine-human relations with Apple's Siri, Ford's Sync, the ivee clock radio, Samsung's voice-controlled HDTVs and IBM's "Jeopardy"-champion Watson. Perhaps a further step will be taken by the long-rumored Siri-controlled Apple HDTV later this year. But we're still a long way from considering colloquies with our appliances as normal as bar codes, Wi-Fi and touchscreens. The question is, just how long a way? And just how conversational do we want our gadgets to become before paranoiac imaginings of malevolent self-awareness set in?