By Åse Dragland
"We are working to develop interaction between myself and this iPad which does not require me to touch the display", says Petter Risholm at SINTEF ICT. The basic technology has been around for several years, and he is now working with his colleague Jens Thielemann to extend the function to enable them to select and move objects on the display.
Expanded vocabulary
It was the ambition to lead this technology trend that prompted the Norwegian company Elliptic Labs to contact SINTEF in 2008. At that time, the focus was on touch-free interaction with stationary installations such as PCs and laptops; now it is on mobile phones and tablets.
"We’re trying to expand the vocabulary to achieve a more detailed interaction", explains Risholm. "We’ve been able to scroll through pages for some time. Now we’re working on selecting and moving objects, or saying stop by raising a hand".
Has no one done this before?
"We’re not aware of any others in this field in Norway, no".
What are the potential benefits?
The researchers see this as an extended service, not something that will replace the functions of a touch display.
"Imagine that you're baking bread. Your fingers are sticky and you want to check the tablet to see how much flour it said in the recipe. In situations like this it would be great not to have to touch the screen", says Risholm.
"Or you're working in your workshop, your hands are oily, and you want to check a phone number on your mobile. Many people also suffer from 'mouse arm', and it would be of great help to them to interact with the screen using more expansive hand and arm movements.
In demand
Many companies around the world are focusing on touch-free interaction, and there are many competing technologies. One field in which it is already widely used is the games industry; Risholm mentions Microsoft's Kinect sensor as the best-known example. It can recognise body postures and expansive arm movements.
However, touch-free interaction is not yet widely used in mobile devices, although some systems have been launched onto the market. For example, on Samsung's newest mobile phone it is possible to scroll through an e-book by moving your fingers in front of the integrated infra-red sensor. But according to the researchers, the system offers limited options and a limited field of view.
"Samsung is focusing on simple infra-red sensor technology. This is why the system has limited sensitivity and doesn’t always recognise commands. To use it, you have to perform the command directly over the sensor, which tends to be at the top of the PC/mobile/iPad. If you move your fingers or hand away just a little, nothing happens", says Risholm.
Benefits of ultrasound
The Norwegian researchers have therefore chosen to focus on ultrasound. This technology enables the whole screen to be used – which means a larger working surface.
"The system picks up on what you’re doing both in front of and beside the screen. This creates a large interaction area, and means that it’s also possible to control the device without having to screen off the display. The mobile or tablet can also detect what you’re doing as long as you move within between 2 cm and 30 cm from the screen", says Petter Risholm.
Much of the work focuses on making the system so robust that every command is understood correctly, leaving no room for misinterpretation. This becomes particularly challenging as the SINTEF researchers expand the vocabulary to include gestures for more advanced commands such as drag and drop.
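To see why robustness is hard, consider what even a simple drag-and-drop gesture involves. The Python sketch below is purely illustrative: it assumes a tracker that reports a 3D hand position (as an ultrasound system working in the 2–30 cm range described above might) and uses hysteresis, i.e. two different distance thresholds for "grab" and "release", so that small tremors in the hand do not flip the command back and forth. All names and threshold values are invented for this example; it is not Elliptic Labs' or SINTEF's actual method.

from dataclasses import dataclass
from enum import Enum, auto

class DragState(Enum):
    IDLE = auto()
    DRAGGING = auto()

@dataclass
class HandSample:
    x: float  # position across the screen plane, in cm
    y: float
    z: float  # distance from the screen, in cm

# Hypothetical thresholds: push in closer than NEAR_CM to grab,
# pull back beyond FAR_CM to release. The gap between the two
# prevents jitter around a single threshold from toggling the state.
NEAR_CM = 5.0
FAR_CM = 12.0

def update(state, sample, on_grab, on_move, on_release):
    """Advance a toy drag-and-drop state machine by one hand sample."""
    if state is DragState.IDLE and sample.z < NEAR_CM:
        on_grab(sample.x, sample.y)
        return DragState.DRAGGING
    if state is DragState.DRAGGING:
        if sample.z > FAR_CM:
            on_release(sample.x, sample.y)
            return DragState.IDLE
        on_move(sample.x, sample.y)
    return state

# Example: approach, grab, drag sideways, pull back to drop.
state = DragState.IDLE
track = [HandSample(1.0, 2.0, 20.0), HandSample(1.0, 2.0, 4.0),
         HandSample(3.0, 2.0, 4.0), HandSample(3.0, 2.0, 15.0)]
for s in track:
    state = update(state, s,
                   on_grab=lambda x, y: print("grab at", x, y),
                   on_move=lambda x, y: print("move to", x, y),
                   on_release=lambda x, y: print("drop at", x, y))

In a real system, the raw ultrasound echoes would first have to be turned into a stable position estimate across the whole interaction space, and that is where most of the robustness work described above lies.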
Does 'everyone' have to have it?
The technology for future touch-free interaction must be cheap and power-efficient. The system must be small and uncomplicated, so that it can be integrated into mobile devices, and it must be capable enough to support relatively detailed interaction.
"Cameras and infra-red technologies are being used on a global scale, but we have seen that our ultrasound technology has properties that fulfil many of these criteria", says Tom Kavli of Elliptic Labs.
"The main advantage is that our technology enables a larger space from which a device can detect and recognise the user's gestures. You are not limited to the top of the device, but can operate in front of the whole screen, which is also best in terms of ergonomics.
Kavli says that the field is developing at a furious pace, and that new, exciting applications will open up as the technology continues to develop. "We’ve had an extremely good response internationally from all mobile phone manufacturers, and have just been awarded an international innovation award in Japan", he says.
Now the company is working to commercialise the technology, and estimates that it will be rolled out in a few years’ time.
FACTS:
The first project for Elliptic Labs ran from 2008–2012. It was here that much of the basic technology for touch-free interaction with PCs or desktop devices was developed.
The project Multigest (2012–2016) is now up and running, and is extending this work to include mobile phones and tablets.
Both projects are BIA projects (User-driven Research-based Innovation) funded by the Research Council of Norway.