
Jan 17, 2022

The Language of Machines

by Hugues Foltz, Executive Vice-President

Artificial intelligence

Speech is without a doubt a powerful skill that distinguishes humans from all other species living on Earth. It unifies people and allows us to pass on knowledge from one generation to the next.

Beyond the words we use, delivery, accents and intonations are amongst the many factors that can influence the message conveyed. Speech is not only the most straightforward way to communicate, but also the most precise. Yet as human-specific as the art of conversation may be, we are no longer chatting only amongst ourselves: we are also talking to machines!
 
Not so long ago, we had to memorize a fair number of phone numbers to reach our relatives and friends; nowadays, we struggle to remember more than three or four. Our good old Rolodex has been replaced by smartphones. We interact and talk with them, and above all, they answer.
Technologies capable of listening and speaking come with their share of questions and fears. Will we know when we are talking to a machine? How can we tell whether the machine is imitating the voice of a real person? Here are some possible answers.
 
Your phone, your car and your Alexa are speaking to you. Soon enough, even your fridge will do the same. Machines that listen are already part of our lives. In fact, the change taking place right now goes far deeper than the mere presence of virtual assistants in our homes. Voice is a communication tool that machines now master remarkably well. For instance, we have already begun to entrust important tasks to bots that speak with a synthetic voice.

In Copenhagen, the emergency call services have been assisted since last year by a technology named Corti, whose voice is strikingly humanlike. Corti has been tested on more than 160,000 emergency calls, during which it detected no less than 92% of heart attacks, whereas human operators detected only 73%.
 
To obtain such incredible results, AI experts turned to Natural Language Processing (NLP), a broad area of expertise. NLP combines an impressive number of sub-specialties, such as lexical semantics, machine translation and emotion analysis. As in many other areas of AI, deep learning has proven to be a major accelerator in the development of technologies that mimic and understand the human voice.
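To make one of those sub-specialties a little more concrete, here is a minimal sketch of emotion (sentiment) analysis, assuming the open-source Hugging Face transformers library and its default English sentiment model. The example utterance and setup are purely illustrative; this is not the technology behind Corti or any real emergency service.

```python
# Minimal sentiment-analysis sketch using the Hugging Face `transformers` pipeline.
# Assumes `pip install transformers torch`; the default English sentiment model is
# downloaded on first use. Purely illustrative, not a production emergency-call system.
from transformers import pipeline

# Load a ready-made text-classification pipeline configured for sentiment analysis.
classifier = pipeline("sentiment-analysis")

# A hypothetical caller utterance, already transcribed to text (speech-to-text is a
# separate NLP sub-specialty not shown here).
utterance = "My father collapsed and he isn't breathing, please hurry!"

# The pipeline returns a label and a confidence score for the utterance,
# e.g. [{'label': 'NEGATIVE', 'score': 0.99}].
print(classifier(utterance))
```

A real system like the one described above would chain several such components: speech recognition to turn the call into text, semantic and emotion analysis to interpret it, and speech synthesis to answer back.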
 
The considerable success of Copenhagen's project is now resonating in many other countries, which are (finally) grasping the full potential of artificial intelligence algorithms to analyze the complexity of an emergency call, or any other call for that matter. You must admit it's quite brilliant. But are we really surprised?
Virtual assistants have been listening to and understanding us for so long that it's only natural the algorithms have become this powerful!
 
Call centres as we know them today are very likely to disappear in the years to come. By extension, many occupations – receptionists, sales consultants, cashiers and other customer service jobs – will undoubtedly be profoundly transformed.
 
Let’s keep in mind that this is only the beginning. In the United States alone, there are already over 110 million voice assistants in smartphones and smart speakers. Imagine what awaits us when, in a few years, even more capable technologies see the light of day.