Don’t you wish your car was as intuitive to use as a smartphone? During the past decade, automotive manufacturers have added more and more electronics to take the hard work out of driving. But the proliferation of buttons, screens and lights, while reducing the physical effort required to drive a car, has also increased the risk of distraction.
So, car makers are looking to natural-language voice control as an alternative user interface that will help ensure drivers’ eyes remain focused on the road ahead.
Indeed, this message of “eyes on the road” is the marketing slogan Ford uses to promote SYNC, its voice-control technology that makes its European debut on Ford’s new B-MAX later this year.
As well as familiar features such as hands-free calling and turn-by-turn directions, SYNC can do a few other tricks: controlling the air conditioning, reading out text messages or controlling smartphone apps with your voice.
Ford hopes to sign up 13m SYNC customers globally by 2015, including 3.5m in Europe. Cadillac has gone one better, combining natural-language voice recognition with an 8-inch touch screen à la iPad in the Cadillac User Experience (CUE), its latest car user interface. The technologies that power CUE read like those of a latest-generation smartphone: Linux OS, a 3-core ARM 11 processor, JavaScript, HTML5 and so on.
Chip giant Intel sees the automobile as just another connected mobile device — albeit one that is considerably larger and more expensive than a smartphone or iPad.
While chip and car makers have grandiose visions of turning cars into internet-connected “infotainment” centers that can read your mail, connect to Facebook and sync with other devices, I think the killer application of in-vehicle voice technology is much more basic – making cars easier to use.
As someone who regularly rents vehicles, I often struggle to find the lever that opens the fuel cap on an unfamiliar car. It would be so much easier to simply say “open the fuel cap”.
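To make that idea a little more concrete, the sketch below maps a couple of recognised phrases to vehicle functions. The Python is purely illustrative: the VehicleControls class, the command phrases and the literal string matching are assumptions for the sake of a short example, not any manufacturer’s actual in-car API.

```python
# Minimal sketch: dispatching recognised voice commands to vehicle functions.
# The VehicleControls class and the command phrases are hypothetical
# illustrations, not any manufacturer's real interface.

class VehicleControls:
    def open_fuel_cap(self) -> None:
        print("Fuel cap released.")

    def set_air_conditioning(self, on: bool) -> None:
        print(f"Air conditioning {'on' if on else 'off'}.")


COMMANDS = {
    "open the fuel cap": lambda car: car.open_fuel_cap(),
    "turn on the air conditioning": lambda car: car.set_air_conditioning(True),
}


def handle_utterance(utterance: str, car: VehicleControls) -> None:
    # Real systems match intents rather than exact strings; a literal lookup
    # keeps the example short.
    action = COMMANDS.get(utterance.strip().lower())
    if action is None:
        print("Sorry, I did not understand that.")
    else:
        action(car)


if __name__ == "__main__":
    handle_utterance("Open the fuel cap", VehicleControls())
```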
Similarly, do you know the correct tire pressures for your car? Instead of hunting through the user manual, the car could simply read them out to you. Drivers are more likely to take to natural-language technology if it is used to offer genuinely useful services. Wouldn’t it be great to be able to say “I think I’m lost. Tell me how to get back onto the main road”? To which your confused voice-driven satnav system might respond: “There is no Main Road listed in this city?”
Being able to understand that “the main road”, in the context in which it is used here, does not refer to a road named “Main Road” is a difficult task for natural language systems. For this reason, automotive manufacturers are looking to use cloud-based intelligence to supplement the currently limited capabilities of their in-car voice recognition systems. But as users of Apple’s Siri already know, there can be a time lag of several seconds while their mobile device — handheld or four-wheeled — consults cloud-based assets. And with in-car systems, there is a greater chance that those assets may be offline just when they are needed most — while driving through a tunnel, for example.
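To make the earlier point concrete, here is a deliberately simplified sketch of the kind of check a navigation assistant could perform: treat “the main road” as a generic description unless the map data actually contains a road with that name. The phrase list, the map lookup and the function name are illustrative assumptions, not how any production system resolves such references.

```python
# Simplified sketch: deciding whether a phrase like "the main road" refers to
# a road literally named "Main Road" or is a generic description to resolve
# from context. The heuristics and data are illustrative assumptions only.

GENERIC_ROAD_PHRASES = {"the main road", "the highway", "the motorway"}


def interpret_road_reference(phrase: str, known_road_names: set[str]) -> str:
    normalised = phrase.strip().lower()
    name = normalised[4:] if normalised.startswith("the ") else normalised

    # If the map data really contains a road with this name, take it literally.
    if name.title() in known_road_names:
        return f"navigate to the road named '{name.title()}'"

    # Otherwise treat common phrasings as a generic request, resolved from the
    # driver's current position rather than from a street name.
    if normalised in GENERIC_ROAD_PHRASES:
        return "navigate to the nearest major road from the current position"

    return "ask the driver to clarify which road they mean"


if __name__ == "__main__":
    print(interpret_road_reference("the main road", {"Main Road", "High Street"}))
    print(interpret_road_reference("the main road", {"Broadway", "5th Avenue"}))
```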
So what is the solution to the connectivity problem? Well, as coverage and download speeds continue to improve, occasional slow responses will become a thing of the past. The impact of lost connectivity can also be mitigated by keeping crucial elements of the natural-language solution on board: language libraries for speech recognition and the number-crunching for natural language processing, for example, can be configured to reside on the device, so that a permanent connection is needed only for those queries that genuinely require access to external services.
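A rough sketch of that hybrid arrangement might look like the following: try the cloud service first, and fall back to the on-board recogniser when the link is slow or absent. The endpoint, timeout and function names are placeholders for illustration, not a description of any vendor’s actual architecture.

```python
# Hypothetical sketch of a hybrid voice-recognition pipeline: prefer the
# richer cloud-based service, but fall back to the on-board recogniser when
# connectivity is slow or missing (in a tunnel, say). All names are illustrative.

import socket
from urllib import error, request

CLOUD_ENDPOINT = "https://example.com/recognise"  # placeholder, not a real service
CLOUD_TIMEOUT_SECONDS = 2.0


def recognise_on_device(audio: bytes) -> str:
    # Stand-in for an embedded speech engine using locally stored language libraries.
    return "open the fuel cap"


def recognise_in_cloud(audio: bytes) -> str:
    req = request.Request(CLOUD_ENDPOINT, data=audio, method="POST")
    with request.urlopen(req, timeout=CLOUD_TIMEOUT_SECONDS) as resp:
        return resp.read().decode("utf-8")


def recognise(audio: bytes) -> str:
    try:
        return recognise_in_cloud(audio)
    except (error.URLError, socket.timeout):
        # No connectivity, or the link is too slow: fall back to the on-board engine.
        return recognise_on_device(audio)


if __name__ == "__main__":
    print(recognise(b"\x00\x01"))  # placeholder audio bytes
```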
Hence, as natural language technology continues to evolve at pace, there seems little doubt that the car of the future will have more intuitive user interfaces capable of understanding natural language and, equally importantly, context.