Safety concerns may ultimately drive demand for in-car hand gesture technology. But will it become standard, or simply a sideshow to the HMI mix? Brian Kenety and Justin Parker report.
Kinect – pardon the pun – was a game changer. Arguably one small step for technology long in use, Microsoft’s motion-sensing input device for the Xbox 360 was a giant leap for virtual reality, allowing players to control and interact with the gaming console solely through gestures and spoken commands. Selling eight million units within two months of launch, it is the fastest-selling consumer electronics device ever, according to Guinness World Records.
Now, Microsoft is looking to adapt Kinect’s gesture-recognition technology into its Windows-driven Connected Car Platform. “The new Connected Car will know its riders and will interact with them naturally via speech, gestures and face tracking,” the company said in a job posting last summer, which is how the news broke. “Through a growing catalogue of applications, it will inform and entertain them, and keep them connected with the people and information they care about. The possibilities are endless.”
Microsoft is far from alone in looking into harnessing gestures as yet another way of interacting with the vehicle.
In January, Hyundai unveiled HCD-14, a luxury four-door concept sedan featuring 3D gesture controls for its navigation, infotainment, audio, HVAC and smartphone connectivity functions. The driver selects a main function via eye tracking on a head-up display (HUD), confirms the selection with a thumb button on the steering wheel and then moves to gestures: moving a hand toward or away from the dashboard to zoom the navigation map in or out, a dialing motion to adjust volume and a side-to-side sweep to change radio stations.
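At its core, a gesture vocabulary like Hyundai’s is a mapping from recognized hand movements to system commands, gated by the driver’s confirmation step. A minimal sketch in Python (all gesture and action names here are invented for illustration, not Hyundai’s actual API):

```python
from typing import Optional

# Illustrative gesture-to-command table, loosely modeled on the HCD-14
# vocabulary described above; every name here is an assumption.
GESTURE_ACTIONS = {
    "hand_toward_dash": "nav_zoom_in",
    "hand_away_from_dash": "nav_zoom_out",
    "dial_clockwise": "volume_up",
    "dial_counterclockwise": "volume_down",
    "swipe_left": "radio_prev_station",
    "swipe_right": "radio_next_station",
}

def dispatch(gesture: str, confirmed: bool) -> Optional[str]:
    """Map a recognized gesture to a command, but only after the driver
    has confirmed the active function with the steering-wheel button."""
    if not confirmed:
        return None  # ignore hand movements until the thumb-button confirm
    return GESTURE_ACTIONS.get(gesture)  # unknown gestures resolve to None
```

The confirmation gate is the interesting design choice: it is one simple way to reduce the false positives that skeptics of in-car gesturing worry about, since stray hand movements map to nothing until the driver has explicitly armed a function.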
(For more on Hyundai's HCD-14, see Podcast: Gesture recognition and HMI.)
Meanwhile, Japanese carmaker Nissan is heralding its InTouch telematics system for the 2014 Infiniti Q50 sports sedan, with two capacitive touch screens and hand-gesture screen operation, as “the future of in-car connectivity.”
And, last week, Google purchased Flutter, a hand-gesture recognition company, filing a day later for a patent with the United States Patent and Trademark Office for gesture-based in-car controls. Flutter’s technology utilizes an in-dash camera to capture driver gestures (road rage outbursts aside) to control in-car features. Google said that, similar to its impetus for self-driving cars, the goal here is safer roads and less driver distraction.
(For more on Google and self-driving cars, see The autonomous car: The road to driverless driving.)
Quo Vadis, HMI?
Multi-touch user interfaces, interfaces able to recognize at least three touches simultaneously, are now standard in everything from smartphones to in-vehicle infotainment (IVI) systems. Will gesture control also become the norm or simply a sideshow to the human-machine interface (HMI) mix, which already includes touch screens, voice recognition and old-school steering wheel controls?
BMW, for one, says gestures are ever more important in defining the HMI but will never replace its iDrive controller – now with an integrated touch pad. For now, the German carmaker is only looking at simple gestures to control some iDrive display functions, like its split-screen feature, which can, for example, divide the screen into navigation and infotainment sections.
“[Our] developers are focusing on six different gestures that can be used in conjunction with the screen contents shown on the central information display in current BMW vehicles – hand movements from left to right across the screen, or up and down, plus hand movements toward or away from the instrument panel,” BMW says in a web post discussing the future of ConnectedDrive. “Gesture controls in automobiles … need to be short, concise and unambiguous,” the company writes. “They must not distract the driver or negatively impact his ability to control the car in any way.”
But other developers of gesture-recognition systems say they are looking to further blur the line between man and machine for a more immediate and intuitive user interface (UI) experience.
Harman Becker Automotive Systems, a leading in-car tech supplier that makes the multimedia interface for Audi and BMW’s ConnectedDrive, is now building up a “gesture database” by studying human behavior. Hans Roth, Harman’s director of global business development & marketing, says that the company’s next-generation infotainment system, with its (Kinect-ish) infrared-based gesture sensing, shown at CES in Las Vegas, is ready for commercial use, that talks with OEMs are well advanced and that a roll-out is certainly within two years. “We call it our premium interactive head-up display system because we’re focusing on the interactive operations, with gesture control,” Roth says.
Harman’s system focuses on the free hand as opposed to finger gestures, which require more concentration, he says, adding that like Hyundai, they are also looking at face recognition and eye-tracking technology. For now, Harman is looking to incorporate fewer than ten hand gestures into the UI mix.
“We focus on making these gestures very intuitive so that the driver doesn’t have to think, for example, ‘What must I do now to take a phone call?’” Roth says. “You can receive a phone call, see the contact person’s picture in the head-up display, take the call or reject it with a wipe … or select an entry in the phonebook and make a call, just with a free hand gesture. And you have it in front of your eyes, so you can keep them always on the road.”
(For more on HMI design, see Video: The future of HMI design.)
Apple’s core, connectivity and Kinect
According to Accenture, concern for safety may ultimately drive in-car hand gesture technology mainstream. “As the use of traditional methods of interacting with … mobile devices present a safety hazard,” the consultancy writes in its Perspectives on In-Vehicle Infotainment Systems and Telematics report, “the adoption of hands-free communication will increase.”
The market intelligence firm ABI Research says vision-based gesture recognition should be among this year’s feature set “big winners” in smartphones. “Gesture recognition is soon going to become a key differentiator in high-end flagship smartphones,” senior analyst Joshua Flood says.
Samsung’s latest Galaxy S4 already incorporates gesture recognition “and has received significant plaudits for its new innovative user experience,” Flood says.
A Galaxy S4 user can now navigate with the wave of a hand, answer his phone without picking it up, preview emails without opening them or view pictures in a folder without tapping it. Google’s new smartphone, Moto X, also responds to spoken commands and gestures. As for Apple, in 2008, the company filed a patent for technology similar to Microsoft’s Kinect. Its “Advanced Sensor-Based UI” application was finally granted a patent in September. The technology uses sensors in the bezel of a display to enable scrolling, selecting and zooming via gestures from meters away, and could be in the iPhone 6.
So how should IVI systems respond? By adding enhancements – live traffic, parking updates, etc. – and innovations in sensor detection, interconnection, and UI, Accenture says. And many are, though they are ever cautious not to add to driver distraction.
Building a better ‘mousetrap’
Still, Chris Schreiner, director of user experience practice at Strategy Analytics, isn’t quite sold on the appeal or efficiency of in-car hand gesture technology. As with voice recognition, he sees too much room for error and false positives, given that people frequently gesticulate while talking.
“I’m just a little cynical on what’s actually going to come out of it,” he says. “You have gaming systems – Kinect – where there are a lot of gestures, and it’s very compelling to play a game and mimic what’s happening on the screen. That’s great for a gaming environment, but what are you going to do with that in the car? There are not many intuitive use cases for gestures. It’s not terribly compelling.”
Mark C. Boyadjis, senior analyst & manager, infotainment & HMI at IHS Automotive, is more gung-ho on gesture control, seeing its potential to replace some existing input/output mechanisms or as an additional way to access them. While he also notes the possibility for false positives, he says a major issue going forward is that the hand gestures themselves must be standardized.
“It’s a potential solution to some of the driver distraction issues, some of the UI problems that have started bubbling up because of the urgency of [providing] connectivity,” he says. “But gesture recognition will not work if you have to learn it all over again if you have to buy a new car. … The point is that what Hyundai describes as a gesture could be different from how Ford or Volkswagen implements gesture – and that doesn’t work.”
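Boyadjis’s standardization concern can be made concrete: if each OEM ships its own mapping, the same physical gesture triggers different actions in different cars. A hypothetical sketch (the OEM profiles, gestures and commands are all invented for illustration):

```python
# Hypothetical per-OEM gesture profiles illustrating the standardization
# problem described above: the same hand movement mapped to different
# commands by different carmakers. All names here are assumptions.
OEM_PROFILES = {
    "oem_a": {"swipe_right": "next_radio_station", "palm_push": "answer_call"},
    "oem_b": {"swipe_right": "answer_call", "palm_push": "mute_audio"},
}

def interpret(oem: str, gesture: str) -> str:
    """Resolve a gesture under a given OEM's profile."""
    return OEM_PROFILES[oem].get(gesture, "unrecognized")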
He adds: “The technologies of voice, touch screens, hard keys and rotary controllers have been around for a long time, and, for the last couple of years, it’s been about who can perfect it, design a better mousetrap, so to speak. Gesture recognition has the potential to change the mousetrap a little bit.”
Brian Kenety and Justin Parker are regular contributors to TU.
For all the latest telematics trends, check out Telematics Munich 2013 on Nov. 11-12 in Munich, Germany, Telematics for Fleet Management USA 2013 on Nov. 20-21 in Atlanta, Georgia, Content and Apps for Automotive USA 2013 on Dec. 11-12 in San Francisco, Consumer Telematics Show 2014 on Jan. 6, 2014, in Las Vegas.
For exclusive telematics business analysis and insight, check out TU’s reports: Telematics Connectivity Strategies Report 2013, The Automotive HMI Report 2013, Insurance Telematics Report 2013 and Fleet & Asset Management Report 2012.