Is Google Duplex Dehumanizing Customer Experiences?


Earlier this month Google demonstrated its new artificial intelligence feature, Duplex, in which Google Assistant called a hair salon to book an appointment, carrying on a remarkably human-like conversation with a receptionist who seemed unaware that she was speaking to an AI.

While impressive, the demo also raises questions about the morality and ethics of automating more of the customer experience. As we move further away from manual, authentic modes of communication, are we giving up more than we get in return?

Sure, chatbots already simulate human conversation. Most chatbot programs rely on an “if this, then that” (IFTTT) process: if you type “Hi,” a bot might be programmed to automatically respond, “Hello! How can I help you?” And consumers increasingly accept that they may or may not be chatting with a bot. A minimal sketch of that pattern appears below.
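To make the IFTTT pattern concrete, here is a minimal Python sketch of a rule-based bot. The triggers and canned replies are hypothetical, chosen purely for illustration, but the keyword-matching structure is typical of simple chatbot programs.

```python
# Minimal sketch of the "if this, then that" (IFTTT) pattern behind
# most rule-based chatbots. Rules and replies here are hypothetical.

RULES = {
    "hi": "Hello! How can I help you?",
    "hours": "We're open 9am to 5pm, Monday through Friday.",
    "bye": "Goodbye! Have a great day.",
}

FALLBACK = "Sorry, I didn't understand that. Could you rephrase?"


def respond(message: str) -> str:
    """Return a canned reply when the message matches a known trigger."""
    text = message.strip().lower()
    for trigger, reply in RULES.items():
        if trigger in text:   # "if this..." (naive substring match)
            return reply      # "...then that"
    return FALLBACK


if __name__ == "__main__":
    print(respond("Hi"))                    # Hello! How can I help you?
    print(respond("What are your hours?"))  # We're open 9am to 5pm...
    print(respond("Can I pay by card?"))    # Falls back: no rule matches
```

Everything such a bot can say has to be scripted in advance, which is exactly the limitation Duplex’s generated speech moves beyond.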

What’s new here is that Google Duplex further fractures the expectation that we’re conversing with a human, while enabling automated interactions on a whole new scale and creating new potential for abuse. Voice-activated AI devices offer tremendous time-saving benefits for consumers and enterprises, but they can also be manipulated and hijacked. For example, researchers at the University of California, Berkeley demonstrated how audio commands can be hidden in a YouTube video to hijack Amazon’s Alexa and order it to make purchases.

Prove you’re not a robot

Another issue is that Google Assistant must record and analyze spoken utterances for Duplex to determine how to respond. About a dozen states, however, have wiretapping statutes that require all participants in a phone call to consent before a recording can be made.

A Google spokesperson has reportedly said that the demo was an early-stage version of the product, and that the final version would notify people that they were either talking to a robot or being recorded.

On the flip side, Duplex may usher in a new set of verification requirements in which callers must confirm that they are, in fact, human. In addition to the Duplex demo, Google demonstrated that it can replicate the singer John Legend’s voice. What this means for voice biometrics as an authentication method is unclear, but disruption is practically inevitable.

What’s more, Duplex is unlikely to be used only as a concierge service; it’s up to companies to decide what to do with technology that can mimic human conversation. TTEC, for instance, is exploring ways to deploy Duplex to assist associates. And even though it remains to be seen how well the technology works, the proverbial Pandora’s box is already open, and the technology will only continue to improve.

Humanizing digital experiences

Duplex represents a natural evolution of Google’s products: Google Maps and the Google Assistant already make suggestions based on information they have about our location and habits. Google Photos creates entire albums of photos based on its interpretation of the memories we’d want to preserve. What these services have in common is the notion that AI can handle the menial tasks in the background, allowing us to focus on what really matters.

But what if connecting with another person is what matters? Even if a bot were somehow able to explain a complex issue to a contact center associate, could it convey urgency? It’s no secret that consumers avoid making phone calls whenever possible, but the further we move into automated modes of communication, the more we lose something personal. The pendulum may eventually swing back if automation becomes the norm.

Delegating conversations to AI represents a seismic shift in human interaction. By relying on bots to communicate for us, we risk degrading the very qualities that separate us from machines—our unique voices and emotional intelligence. Instead of automating ourselves out of the conversation, we need to focus on truly humanizing digital experiences. And to do that, we can’t lose our ability to communicate with each other.