By Jo Twist
BBC science and technology reporter
Changing tracks on digital music players of the future while on the move could be done with the nod of the head.
3D sound and movement could transform the way we control gadgets
Building on previous work, researchers at the University of Glasgow have been developing "audio clouds" to control gadgets using movement and sound.
Presenting their work at a US computer interface conference, they say audio clouds could make using mobile devices on the move safer and easier.
A mobile that responds to movements was launched in Japan last month.
Mobile makers are also already testing 3D audio systems for devices.
"The idea behind the whole thing is to look at new ways to present information," Professor Stephen Brewster told the BBC News website.
Interfaces and ways to control mobile devices were often visually based, he added, because they had grown out of the way people use desktop computers.
Such interfaces varied in ease of use, however, and were often hard to operate while on the move, said Professor Brewster.
"We hope to develop interfaces that are truly mobile, allowing users to concentrate on the real world while interacting with their mobile device as naturally as if they were talking to a friend while walking."
There had been a large body of work looking at different ways to control devices through vision-based gesture recognition, the professor said, but not so much had concentrated on movement.
"Lots of times, you need to use your eyes to operate a gadget - even with an iPod, you need to take it out of your pocket to look at the screen to control it.
"If you could do something with your hands, or other gestures, you would not have to take it out of your pocket," explained Professor Brewster.
The researchers have developed ways to control gadgets, such as personal digital assistants (PDAs) and music players, using 3D sound for output and gestures for input.
For 3D sound, they have been working on bone conduction headphones, which are placed behind the ears and transmit sound through the bones of the skull to the inner ear.
"What we are doing is the basic science to design these things, like how big should items be, and how they can stop interfering with each other," explained Professor Brewster.
The team has also been developing systems based on accelerometers, which sense motion and can be used to give commands to devices.
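The basic idea of turning accelerometer readings into a command can be sketched in a few lines. This is a hypothetical illustration, not the Glasgow group's method: the sample data, threshold, and function names are all invented for the example. It treats a "shake" as several samples whose acceleration magnitude spikes well above the steady 1 g of gravity.

```python
import math

# Illustrative threshold in g; a real system would calibrate per device.
SHAKE_THRESHOLD = 1.8

def magnitude(sample):
    """Overall acceleration magnitude of an (x, y, z) reading in g."""
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def detect_shake(samples, threshold=SHAKE_THRESHOLD, min_peaks=3):
    """Treat the gesture as a 'shake' command if enough samples spike."""
    peaks = sum(1 for s in samples if magnitude(s) > threshold)
    return peaks >= min_peaks

# Resting readings hover around 1 g (gravity alone); shaking produces spikes.
resting = [(0.0, 0.0, 1.0)] * 10
shaking = [(0.0, 0.0, 1.0), (1.5, 0.2, 1.2), (-1.6, 0.1, 1.1),
           (1.4, -0.3, 1.3), (0.0, 0.0, 1.0)]
```

Here `detect_shake(shaking)` would trigger the command while `detect_shake(resting)` would not; in practice filtering and debouncing would be needed to avoid false triggers while walking.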
This kind of technology is being developed by mobile companies, and the research team regularly talks to them about innovations like this.
The V603SH motion-sensitive mobile that was released last month, for example, makes use of this kind of technology.
Developed by Sharp and launched by Vodafone's Japanese division, it is intended mainly for mobile gaming, but other phone functions can be triggered using a pre-set pattern of arm movements.
Professor Brewster and his Multimodal Interaction Group realised that they could get other information out of accelerometers, too.
The actual variations in a person's gait could be read and harnessed for different uses.
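One simple use of gait data is step counting. The sketch below is an assumption about how such a reading might work, not the group's algorithm: it counts a step each time the vertical acceleration rises clearly above the 1 g baseline and then falls back, with invented sample values and thresholds.

```python
def count_steps(vertical_accel, baseline=1.0, threshold=0.3):
    """Count steps as upward crossings of baseline + threshold.

    A hypothetical sketch: real gait analysis would filter the signal
    and calibrate the baseline for the individual wearer first.
    """
    steps = 0
    above = False
    for a in vertical_accel:
        if not above and a > baseline + threshold:
            steps += 1          # rising edge of a stride's impact spike
            above = True
        elif above and a < baseline:
            above = False       # spike over; ready for the next stride

    return steps

# Invented trace of vertical acceleration (in g) over a few strides.
walk = [1.0, 1.1, 1.5, 1.2, 0.9, 1.0, 1.4, 1.1, 0.8, 1.0, 1.6, 1.0]
```

The same peak pattern that yields a step count also carries timing and regularity, which is the kind of extra information the researchers suggest could be harnessed for other uses.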
As part of the research, one recently completed student project developed a Harry Potter-type wizard game in which each person has one of the set-ups the team has prototyped.
The game works over wi-fi: when a player can "hear" another player in a different location via the "audio space", or cloud, they can gesture to cast a spell at them.
The team is also exploring ways for people to navigate and deal with the vast amounts of information and functions that phones are starting to handle as they become more powerful.
"We are talking with mobile manufacturers all the time. There has been a large growth in smartphones and people want more functionality out and about.
"Some are integrating accelerometers into phones. We are also getting better audio and video. It is starting to move into the direction we need."
He added: "The whole thing is about trying to make it more natural and using the right way to control something at the right time."
Although he thinks this is the way we will be controlling our gadgets in the future, it will be some time before it becomes socially acceptable and cheap to do.
Many people have only just come to terms with talking to themselves in the street while using a hands-free set.
Very complex algorithms also need to be formulated to make these ways of controlling gadgets work well.
The audio clouds research is a three-year project funded by the Engineering and Physical Sciences Research Council (EPSRC).
"The innovative aspect of this project is that engineering is being used to explore a new paradigm for interacting with mobile computers to create interfaces that are powerful, usable and safer," the EPSRC's Lucy Brady said.