Marc Cieslak looks at three new ways to use a computer that go beyond the traditional mouse and keyboard.
By ditching that basic interface for touch, gestures or mind control, inventors hope to improve the way people interact with a machine.
Some handheld devices such as MP3 players and mobile phones have already embraced touch screens.
While their usefulness is apparent with pocket-sized devices, it is, so far, less clear that they would work well with a desktop computer or a laptop.
In 2007, Microsoft showed off a touch screen interface it imaginatively named Surface. Ultimately it wants to go beyond a table-mounted touch screen to something bigger such as a wall-mounted display.
Jeff Han from Perceptive Pixel has come up with a multi-touch screen that lets a user manipulate several applications by making simple gestures.
He believes "intuitive" interfaces could make tech more accessible to "non-computer experts".
"We are all familiar with the mouse, but it still takes a degree of skill to use it. Touch is so intuitive that it's able to be used by children, by grandparents, and people whose job it isn't to be able to use a computer," he said.
The ability of a screen to interpret on-screen gestures or simultaneous touches from several fingers, rather than tracking a single point like a mouse, could let users perform more advanced operations.
There are rumours that such multi-touch functionality will feature in the next Microsoft and Apple operating systems.
Mr Han's work is only being done for research purposes, but he believes it could "open up a whole new range of interactions".
"I see touch being used in certain applications. You don't want to write your next paper using touch. I am not an advocate of saying this is going to replace everything," he noted.
UK-based design outfit Clusta created a gestural interface for one of its clients while investigating novel ways to interact with websites.
It relies on a webcam to feed a computer data about a user's movements, which are interpreted as commands.
So far the interface has a limited repertoire of gestures it can recognise and interpret, but a similar type of system has already been shown as a way of controlling a TV.
Clusta has begun work on a new version of gesturing technology that goes beyond simply "motion and movement".
"Rather than just sensing motion in a particular portion of the screen, it will recognise where your hand is moving to and from," said Matthew Clugston, creative director at Clusta.
"So if your hand is moving left to right, it will then spin an object on screen in that direction," he said.
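The direction-sensing behaviour Clugston describes could be sketched along the following lines. This is purely an illustrative example, not Clusta's implementation: it assumes the hand's horizontal position has already been extracted from successive webcam frames, and the function name and threshold are invented for the sketch.

```python
def interpret_gesture(x_positions, threshold=0.1):
    """Classify horizontal hand motion from a sequence of normalised
    x-coordinates (0.0 = left edge of frame, 1.0 = right edge)."""
    if len(x_positions) < 2:
        return "none"
    # Net displacement over the tracked frames, not just raw motion,
    # tells us where the hand moved to and from.
    displacement = x_positions[-1] - x_positions[0]
    if displacement > threshold:
        return "spin_right"   # hand moved left to right
    if displacement < -threshold:
        return "spin_left"    # hand moved right to left
    return "none"             # too little movement to count as a gesture

# A hand tracked moving from the left of the frame to the right:
print(interpret_gesture([0.2, 0.35, 0.5, 0.7]))  # spin_right
```

The key point, as Clugston notes, is that the system classifies the direction of travel rather than merely detecting that something moved in a region of the frame.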
Over the past couple of years there have been several attempts to create a mind-controlled interface that have enjoyed varying degrees of success.
The latest attempt to tap into the power of the mind, called Emotiv, has emerged from an Australian-US collaboration.
Emotiv has created the Epoc headset, the capabilities of which are currently being shown off in conjunction with an adventure game written specifically for it.
The headset enables a player to carry out a limited number of actions by reading their neural activity. On screen, this results in their avatar lifting and manipulating objects.
The headset uses 16 electrodes to measure conscious thoughts, emotions, facial expressions and the rotation of the head.
It works by measuring the electrical impulses emitted by some of the 100 billion nerve cells, or neurons, that make up the average human brain.
The headset transmits the data it collects wirelessly to a USB dongle connected to a computer.
Emotiv is planning to launch the headset in early 2009.