Liz Glagowski:
Hi and welcome to the CX Pod. I'm Liz Glagowski of the Customer Strategist Journal. After a bit of a break, we're excited to be relaunching the CX Pod as the go-to resource for all things customer experience.
To kick off our relaunch, I recently sat down with JB Bednar, TTEC's head of innovation, at the Customer Contact Week show in Las Vegas. CCW is the premier event for customer experience, and this year was the biggest ever. There were so many tech vendors, technology companies, even traditional legacy CX companies, all really excited about incorporating AI and other tools into their business and seeing what the future might bring in terms of innovation around the customer.
We had a really interesting conversation about the idea of empathy, and about why, although empathy is important, some conventional-wisdom approaches to it in the contact center may need a second look.
Here at CCW, there's a lot of talk about agentic AI and empathy. Now, JB, you've got a unique take on it, one that some have even called provocative. Do you want to share it?
JB Bednar:
To explain: when we think about AI and humans, one of the things we've grasped as being unique about humans is the ability to express and understand empathy. We thought about that question for a while. To better understand empathy, not only in how we deliver it today within CX interactions but in how it fits within agentic and AI interactions in our industry, we realized we've had this one-size-fits-all idea that if a little bit of empathy is good, a lot of empathy must be better.
So that became the standard for everything we did. And the more we dug into it, the more we realized that not every interaction warrants that same level of empathy. Not every culture, geography, or industry wants empathy delivered the same way. So why weren't we looking at it in a more strategic way?
Liz Glagowski:
So how does this buck conventional wisdom? You mentioned the idea that a little empathy is good, so a lot more must be better. How does this challenge that trend and what companies are focusing on?
JB Bednar:
Yeah, I mean, we see a very similar pattern when we work with clients or operations teams. You see that same pattern in the training material and in things like the QA process and the QA rubric, and we're essentially industrializing the same approach everywhere.
What we're seeing in some of the academic research and studies is that it's probably a terrible idea. The fact of the matter is, by doing that we may be making the interaction worse. And gosh, I've sat in enough contact centers, listened to enough calls, and sat side by side with agents to hear it go wrong in the wrong context. It raises the question: why are we doing it this way, and is there a better way to approach it?
Liz Glagowski:
So for companies unsure how to manage these different types of customers and empathy levels, you've created a matrix at TTEC for different customer types, call types, and cultures. Can you explain a little about how it works?
JB Bednar:
Yeah, we wanted to create a framework for looking at the next level of detail within empathy, and we came up with an approach that asks a basic question: are you going to focus on empathy, or on action, as your approach?
We covered all of the nuance that we see in different cultures and different places around the world. We went back and looked at some of the older academic research on culture, communication, and interaction, and we found that was a pretty interesting lens for this.
When you look at the aspects of different cultures, things like whether they're community-based or individualistic, or whether they're low-context or high-context (that is, how much of the interaction is unspoken), and you put those together in a quadrant, because we always love quadrants, you come up with an interesting mix: four flavors for how to approach empathy along two axes, action versus empathy and low-context versus high-context.
Liz Glagowski:
So the idea would be, depending on the types of customers you have or the places in which you're operating, you can map some of those calls based on that.
JB Bednar:
Yeah, that's right. You could look at it by call type and say, "Hey, for a service interaction, should it fit within a certain quadrant?" Or you can look at it by industry and where customer expectations fall. If a customer is engaging with a high-end retailer versus, oh, I don't know, the cable company, they may have a different expectation for the level of engagement and context on the call. So it's a tool to guide how to approach those interactions.
Liz Glagowski:
So where, then, does AI fit into the mix? This is another big topic here at CCW: a lot of the discussions are shifting to how AI is now evolving to be almost able to show some level of empathy. So where does AI fit in, and what should companies be thinking about?
JB Bednar:
Yeah, it's a great question. I think there are two parts, because we're looking at both the human interactions and the AI agent interactions. The first is that you can use that same empathy matrix as part of the way you design those bot interactions.
For a long time we saw a lot of different methodologies or approaches to help identify which interaction types are best suited for automation. But those generally focused on things like whether we have a high volume of certain types of interactions.
Liz Glagowski:
Yes.
JB Bednar:
Or whether those were really simple interactions. Now that AI has gotten much more sophisticated in the types of interactions you can automate, it raises the next question: how are we designing those conversational interactions with AI so that they deliver the right level of empathy, as a human interaction would?
I think the second part is, given how good AI is becoming at understanding emotion and conversational data, how are we using these tools to help us tune human agent performance to the right level of empathy? Are we spending way too much time apologizing? Are we using a lot of really low-value, flowery language in an interaction, spending handle time and effort without actually generating a better outcome at the end of the day?
Liz Glagowski:
So finally, you've been here a couple of days. What's your hot take here at CCW?
JB Bednar:
Oh gosh. It's interesting, having come to this conference for, gosh, more than a decade, maybe two decades (I'm not sure; I'm dating myself). It went from being very much services-industry focused to tech focused.
I feel like it's now 99% tech focused, but there's still such a strong desire among the folks attending, and in the conversations we had, people still think the human aspect is what matters most. When you're focusing on that, people want to have those conversations. Yes, there's a heavy emphasis on tech, but this has always been a people-focused industry, both from the customer perspective and the delivery perspective.
Liz Glagowski:
Alright. Thanks, JB. To learn more about bringing humanity to business, come see us at ttec.com or subscribe to our journal at customerstrategistjournal.com. Thanks, and see you next time.