Touch screens have dramatically changed the way people use mobile phones. But gesture controls, augmented reality and larger screens are about to change habits even more, according to mobile interface expert Christian Lindholm.
Devices will gain more sensors that can transform the mobile user experience by allowing control through gestures and other kinds of movement, according to Lindholm, managing director of Fjord, a consultancy specialising in mobile user interfaces that is exhibiting at IFA. The company helped the BBC build the mobile version of its iPlayer, which was named the best mobile music or video service at Mobile World Congress earlier this year.
The use of gestures and movement to control phones has already started to take off. Some Nokia devices allow users to reject calls by turning them upside down, and the iPhone has a "shake to undo" capability. Another obvious way to use the technology would be to share files with a flick of the wrist or by touching devices, Lindholm said.
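Gesture features like these generally come down to watching the accelerometer stream for a characteristic pattern. The following is a minimal sketch of one common approach to detecting a "shake" — the threshold, sample format and jolt count are illustrative assumptions, not any vendor's actual implementation:

```python
import math

GRAVITY = 9.81  # m/s^2; a phone at rest reads roughly 1 g of total acceleration

def is_shake(samples, threshold_g=2.0, min_jolts=3):
    """Return True if accelerometer samples look like a deliberate shake.

    samples: list of (x, y, z) readings in m/s^2.
    A 'jolt' is any sample whose total acceleration exceeds threshold_g;
    requiring several jolts filters out single bumps.
    """
    jolts = sum(
        1 for x, y, z in samples
        if math.sqrt(x * x + y * y + z * z) / GRAVITY > threshold_g
    )
    return jolts >= min_jolts
```

A phone lying still reports about 1 g, so it never registers a jolt; vigorous shaking pushes the magnitude well past the threshold several times within the sampling window.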
De facto standards for these gesture controls will eventually emerge, so tasks can be done in similar ways with different devices, according to Lindholm.
He also runs a side business called Tech21, which is working on technology to replace a phone's keyboard with a trackpad that could sense different gestures. "That's interesting because then we get pressure. So we could put a gas pedal and a brake pedal on keys," said Lindholm.
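The "gas pedal" idea turns a key from a binary switch into a continuous control by mapping pressure onto a range. As a hedged illustration only — Tech21 has not published how its sensing works — such a mapping might look like:

```python
def pedal_value(pressure, max_pressure=1.0, dead_zone=0.1):
    """Map a raw pressure reading (0..max_pressure) to a 0..1 control value.

    A dead zone means a finger merely resting on the key reads as zero,
    so the 'pedal' only engages under deliberate pressure.
    """
    if pressure <= dead_zone:
        return 0.0
    scaled = (pressure - dead_zone) / (max_pressure - dead_zone)
    return min(scaled, 1.0)  # clamp: pressing harder than full scale stays at 1.0
```

Light touches return 0.0, full pressure returns 1.0, and everything in between scales linearly — the analog behaviour a gas or brake pedal needs.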
The technology will come to market in a couple of years, he said.
The area of gesture control could also turn into a goldmine for lawyers, however. Companies such as Nokia have started to patent gestures, according to Lindholm, who hasn't decided yet if that's a good or bad thing.
Intellectual property rights can be a good thing if they are coupled with common sense, he said. But too much can lead to unnecessary litigation.
Another emerging area that could transform the way we use mobile phones and perceive the physical world is augmented reality, a technique which overlays relevant information or annotations on a view of reality, perhaps from a phone's camera. Modern smartphones are great for this application, as their GPS receivers and accelerometers allow software to determine where the phone is and which way it is pointing.
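In practice that means combining a GPS fix with a compass heading to decide which nearby points of interest fall inside the camera's view. A small sketch of the geometry involved — the 60-degree field of view and the function names are assumptions for illustration:

```python
import math

def bearing_to(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing, in degrees, from point 1 to point 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360

def in_camera_view(device_heading, poi_bearing, fov=60.0):
    """True if a point of interest lies within the camera's field of view."""
    # Signed angular difference, wrapped to -180..180 degrees
    diff = (poi_bearing - device_heading + 180) % 360 - 180
    return abs(diff) <= fov / 2
```

With the phone's GPS position and compass heading, the software computes a bearing to each annotated point and overlays only those whose bearing falls within the camera's field of view.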
The most challenging application will be using it to obtain information about someone while talking to them, according to Lindholm: holding a phone up in front of someone without an apparent reason isn't yet socially acceptable.
A lot of work is still needed on a user interface that can work in this situation: it can't demand so much attention that you can't interact normally with the other person, Lindholm said.
The size of the screen is an integral part of the user interface. It's easier to develop a good user interface on a bigger screen, and the vendors are in a race to put bigger screens on mobile phones, according to Lindholm.
But as the screen grows so does the size of the devices, and if they are considered too big they'll put users off. The iPhone was a trendsetter not just for its touch-screen and application store, but also for the width of its body, at 61 millimeters for the first generation and 62.1 millimeters for the 3G and 3GS.
"To me it's fantastically interesting that no one has dared to challenge Apple on the width because it's then perceived as being too wide," he said.
If a phone has a great user experience then an extra-wide body may be acceptable to users. But it's rare that devices have been able to deliver on that, so playing the size game is risky, he said.
Screens that fold or pull out, expected in a couple of years, could solve that problem. Samsung has already demonstrated foldable screens, Lindholm said.
Designing software for phones isn't a trivial matter. Besides smaller screens, developers have to contend with users who are on the move a lot and don't give applications their full attention, or use them only in short bursts.
"When companies design their applications they forget, or suppress, these laws of mobility, and that results in lower usage, more user errors and lower uptake," said Lindholm.