How will I communicate with my car in future?
Communication is the most crucial tool for the cultural development of mankind. So, it's no wonder that there is intense discussion on how our cars will communicate with each other (Vehicle to Vehicle, V2V) and the world around them (Vehicle to Everything, V2X) in future to make travelling easier and safer. At least as relevant, however, remains the communication interface between vehicles and occupants.
According to Jason B. Johnson, HARMAN's Director of User Experience Design & Studio Lead, Huemen, voice is likely to emerge as the principal vehicle communication medium: “As consumers become increasingly used to voice-based control in their everyday lives with the likes of Bixby, Alexa, Siri, Watson or Cortana, they will expect to be able to apply the same approaches to how they drive.” He goes on to point out that voice is also the most natural, instinctive and flexible form of interaction.
“As consumers become increasingly used to voice-based control in their everyday lives, they will expect to be able to apply the same approaches to how they drive.”
However, with a view to advanced driver assistance systems (ADAS) and shared vehicle usage models, the challenges for developers will be exactly those that human beings face every day in their voice communications, not just with their devices but with other people, too. Different languages, dialects, slang, speech impediments, tonality and double meanings are a perennial source of mirth and misunderstanding.
“So, there will always have to be further communication options built into the system,” says Johnson.
HARMAN has a number of products and technologies within its connected car portfolio that facilitate voice control as part of an integrated strategy to reduce distraction, increase safety and simplify HMI. The HARMAN Ignite modular, end-to-end cloud platform already supports all popular voice-based assistants and work is ongoing to expand their suitability for automotive applications by adding contextual in-car sensor information as well as intelligence from other devices and clouds.
Using Natural Language Understanding (NLU), these Intelligent Personal Assistants (IPA) can interact with the user using full conversation functionality for dealing with complex requests. For instance, saying “I’m cold” raises the cabin temperature to a comfortable level. When asked “What does the check engine light that just came on mean?”, the IPA explains that in this case it only indicates a minor problem and suggests scheduling a service appointment for the end of the week. HARMAN’s flexible digital cockpit solutions also leverage forthcoming technologies such as 5G telematics to maximize the consumer benefits and personalized user experience delivered by unlimited connectivity.
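The intent handling described above can be pictured as a mapping from a recognized intent to a vehicle action. The sketch below is purely illustrative: the class names, intents and keyword-based matching are hypothetical stand-ins, not HARMAN's NLU stack, which would use far richer language models and in-car sensor context.

```python
# Minimal, hypothetical sketch of utterance -> intent -> vehicle action.
# A production NLU system would use statistical models, not keywords.
from dataclasses import dataclass
from typing import Optional


@dataclass
class CabinState:
    temperature_c: float = 18.0      # current cabin temperature
    comfort_target_c: float = 22.0   # driver's preferred temperature


# Toy keyword rules standing in for a real intent classifier.
INTENT_RULES = {
    "cold": "raise_temperature",
    "check engine": "explain_warning",
}


def detect_intent(utterance: str) -> Optional[str]:
    """Return the first matching intent, or None if nothing matches."""
    text = utterance.lower()
    for keyword, intent in INTENT_RULES.items():
        if keyword in text:
            return intent
    return None


def handle(utterance: str, cabin: CabinState) -> str:
    """Map an intent to an action and produce a spoken reply."""
    intent = detect_intent(utterance)
    if intent == "raise_temperature":
        cabin.temperature_c = cabin.comfort_target_c
        return f"Raising cabin temperature to {cabin.temperature_c:.0f} degrees."
    if intent == "explain_warning":
        return ("That light indicates a minor issue. "
                "Shall I schedule a service appointment for the end of the week?")
    return "Sorry, I didn't understand that."
```

For example, `handle("I'm cold", cabin)` would set the cabin temperature to the comfort target and confirm the change aloud, while an unmatched request falls through to a clarifying reply, which is the kind of fallback Johnson alludes to when he says further communication options must be built into the system.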
“As the level and sophistication of advanced driver assistance increases, voice control and other non-contact interfaces will make everything much easier and more intuitive for the driver to access, understand and use,” adds Jason Johnson. “This will substantially reduce complexity for users and further increase road safety.”