Laptops and smartphones allow easy access to computing power, but researchers at the Massachusetts Institute of Technology want to go one step further by turning the entire world into a computer.
At this year's Computer-Human Interaction (CHI) conference in Boston, the Fluid Interfaces Group at MIT's Media Lab unveiled the latest prototype of SixthSense, a wearable, gesture-driven computing platform that can continually augment the physical world with digital information.
Imagine being able to check your email on any blank wall, simply by drawing an @ sign in the air with your finger, or being able to check the time by using that same finger to draw a circle, which produces the image of an analog watch right on your wrist.
Want to take a digital photograph? Just put your thumbs and forefingers together to form a picture frame. Need to make a call? Use your hand as a keypad or dial pad.
Better yet, imagine a system that can display the reason for your flight delay directly on the boarding pass you are holding in your hand.
"We're trying to make it possible to have access to relevant information in a more seamless way," says Dr Pattie Maes, who heads the Fluid Interfaces Group at MIT.
She says that while today's mobile computing devices can be useful, they are "deaf and blind," meaning that we have to stop what we're doing and tell those devices what information we need or want.
"We have a vision of a computing system that understands, at least to some extent, where the user is, what the user is doing, and who the user is interacting with," says Dr Maes. "SixthSense can then proactively make information available to that user based on the situation."
The SixthSense prototype has changed since it was first introduced to the public last year. Originally, it consisted of a web camera strapped to a bicycle helmet.
But the current prototype promises to be a bit more consumer-friendly. It consists of a small camera and projector combination, about the size of a cigarette pack, worn around the neck of the user. An accompanying smartphone runs the SixthSense software and handles the connection to the internet.
The camera, in a sense, acts as a digital eye, seeing what the user sees. It also tracks the movements of the thumbs and index fingers of both of the user's hands.
The idea is that SixthSense tries to determine not only what someone is interacting with, but also how he or she is interacting with it.
The software searches the internet for information that is potentially relevant to that situation, and then the projector takes over.
"You can turn any surface around you into an interactive surface," says Pranav Mistry, an MIT graduate student working on the SixthSense project.
"Let's say I'm in a bookstore, and I'm holding a book. SixthSense will recognise that, and will go up to Amazon. Then, it will display online reviews of that book, and prices, right on the cover of the book I'm holding."
Mistry notes that the system is customisable as well, so that if you don't want Amazon reviews of a book, you could choose instead to find out what the New York Times thinks of it.
He also says that brick-and-mortar bookstores might decide to provide their own information to the device, meaning a customer would not necessarily have to go online to find out more.
The hardware included in the SixthSense system is not that expensive. The current prototype costs about $350 to build. But this attempt to merge the digital world with the physical world requires some serious programming and engineering.
"All the work is in the software," says Dr Maes. "The system is constantly trying to figure out what's around you, and what you're trying to do. It has to recognise the images you see, track your gestures, and then relate it all to relevant information at the same time."
It is not surprising, then, that in this initial research phase the SixthSense team has developed only a few applications. In the longer term, Dr Maes envisions opening up the SixthSense platform and letting others develop applications for it.
But Pranav Mistry sees some commercial applications for the system in the near future. For example, he wants to develop a sign language application that would "speak out" a translation while someone was signing.
He also sees potential for SixthSense in the field of gaming. Unlike the Nintendo Wii, which keeps you in front of the television, the SixthSense system might "allow a kid to go outside, and be able to get a real tennis lesson on a real tennis court."
No one involved in the SixthSense project believes the platform will replace laptops and smartphones.
"If I'm doing something like CAD, I'm not going to choose the SixthSense interface," says Liyan Chang, an MIT undergraduate working on the project.
"But in certain instances, it can do something that a desktop or laptop can't do, which is quickly put information right where I want it to be, right on a wall or a newspaper in front of me."
And if SixthSense catches on, what will we all make of the sight of dozens of people checking their e-mails on the walls of airports and train stations?
Dr Maes laughs: "Well, I think it might actually be more socially acceptable than those Bluetooth earpieces people use these days. At least with our system you can actually see that people are interacting with information, instead of watching someone who looks like they're just talking to themselves on the street."