Elliptic Labs smartphone is controlled by waving at it

Ultrasonic speakers can turn phones into gestural interfaces

Elliptic Labs shows off its smartphone gesture control interface at Ceatec 2014 outside Tokyo on Tuesday. The system uses an ultrasonic speaker to determine the location of a user's hand.

Tired of tapping your smartphone's touchscreen? How about just waving at it?

Elliptic Labs thinks waving at a smartphone is an easier and more intuitive way to control it than touching it.

At the Ceatec 2014 technology show outside Tokyo, the Norwegian startup exhibited a smartphone control method it calls Multi Layer Interaction. It displays different content on a phone depending on the position of the user's hand and its distance from the screen.

During a demonstration, an Elliptic staffer moved his hand toward the screen on a prototype smartphone. The device immediately displayed the home screen on its Android OS. When he moved his hand closer, the device scrolled through various other menu levels.
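The layered behavior in the demo can be pictured as a simple mapping from hand distance to UI state. The function and threshold values below are invented for illustration; Elliptic Labs has not published how its software segments the distance range.

```python
# Illustrative sketch of Multi Layer Interaction: as the hand approaches,
# the phone steps through progressively deeper menu levels. The layer
# names and millimeter thresholds here are assumptions, not Elliptic
# Labs' actual values.
def ui_layer(distance_mm: float) -> str:
    if distance_mm > 150:
        return "idle"           # hand too far away; screen stays asleep
    if distance_mm > 80:
        return "home screen"    # hand detected, wake and show home
    if distance_mm > 30:
        return "menu level 1"   # closer: first menu layer
    return "menu level 2"       # closest: deeper menu layer

# A hand moving from 200 mm down to 20 mm steps through each layer.
for d in (200, 100, 50, 20):
    print(d, ui_layer(d))
```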

The feature is aimed at replacing touch for everyday phone uses such as checking messages, playing games, watching videos and navigating maps, which were also shown during the demonstration.

For instance, if you're watching a video on your mobile device and want to see how much time is left before the end, simply waving a finger at it will call up the time display.

"With Multi Layer Interaction, we can represent different content depending on your distance," said Elliptic Labs CEO Laila Danielsen. "First we had the keyboard, then the mouse, then it was touch. We knew that the natural next step is going to be touchless."

The technology works with an ultrasonic speaker developed by Japanese components maker Murata, which showcased its latest performing robots at Ceatec.

The ultrasonic speaker, measuring 5.2 millimeters square, fits into the earpiece of the smartphone prototype. It works with standard MEMS microphones already in use in smartphones, and even works in noisy environments.

It sends out ultrasonic waves that reflect off the bones in a user's hand. Those reflections are picked up by four microphones in the smartphone, one in each corner of the display surface. An algorithm then processes the data to determine the location of a hand.
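The localization step described above — one speaker emitting ultrasound and four corner microphones timing the echoes — amounts to solving for the hand position that best explains the measured round-trip delays. The sketch below simulates that with a brute-force least-squares grid search; the phone geometry, coordinates, and solver are all illustrative assumptions, not Elliptic Labs' algorithm.

```python
# Hypothetical sketch of echo-based hand localization: a speaker in the
# earpiece pings, four corner microphones time the reflection, and a
# grid search recovers the hand position whose predicted delays best
# match the measurements. Geometry and units are invented for the sketch.

SPEED_OF_SOUND = 343_000.0  # mm/s

# Speaker near the earpiece; microphones at the four display corners
# (coordinates in mm on an assumed 70 x 140 mm face, z pointing outward).
SPEAKER = (35.0, 135.0, 0.0)
MICS = [(0.0, 0.0, 0.0), (70.0, 0.0, 0.0),
        (0.0, 140.0, 0.0), (70.0, 140.0, 0.0)]

def dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def echo_delay(hand, mic):
    """Round-trip time of flight: speaker -> hand -> microphone."""
    return (dist(SPEAKER, hand) + dist(hand, mic)) / SPEED_OF_SOUND

def locate(delays, step=5.0):
    """Find the grid point whose predicted delays have the smallest
    squared error against the four measured delays."""
    best, best_err = None, float("inf")
    z = 0.0
    while z <= 200.0:
        y = 0.0
        while y <= 140.0:
            x = 0.0
            while x <= 70.0:
                p = (x, y, z)
                err = sum((echo_delay(p, m) - d) ** 2
                          for m, d in zip(MICS, delays))
                if err < best_err:
                    best, best_err = p, err
                x += step
            y += step
        z += step
    return best

# Simulate a hand hovering 50 mm above the screen centre ...
true_hand = (35.0, 70.0, 50.0)
delays = [echo_delay(true_hand, m) for m in MICS]
# ... then recover its position from the four delays alone.
print(locate(delays))
```

A real implementation would use a proper nonlinear least-squares solver and filter across frames rather than a grid search, but the underlying geometry constraint is the same.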

That calculation involves a relatively light computational load and won't impose a large drain on battery power, said Erik Forsstrom, director of user experience for Elliptic Labs. The gesture interface can remain on even when a smartphone is asleep.

Elliptic Labs said it's in talks with smartphone, tablet and laptop manufacturers to include ultrasonic speakers in models that will be launched in 2015.
