About Ben Labay
Ben Labay is the CEO of the experimentation and CXO agency, Speero, by CXL. He combines a decade of academic and research training with UX & business development knowledge to help scope and run experimentation and CXO programs for companies including P&G, Serta Simmons, Codecademy, MongoDB, and Miro.
Mary Drumond is Chief Marketing Officer at survey tech startup Worthix, and host of the Voices of Customer Experience Podcast. Originally a passion project, the podcast runs weekly and features some of the most influential CX thought-leaders, practitioners and academia on challenges, development and the evolution of CX.
Follow Mary Drumond on Linkedin
Follow Mary Drumond on Twitter
Worthix was born in the Experience Economy, where customers are the backbone and customer-centricity is the soul of every company. Innovation is at our core, and we believe in wielding technology to bring companies and customers together. Our purpose is to use cutting-edge mathematical models and artificial intelligence to extract actionable, relevant, and easy-to-understand insights straight from your customers’ minds.
Mary Drumond: It’s season 7 of the Voices of CX podcast, still bringing you the best thought leaders, practitioners and academics of the industry, but this time with a renewed focus on the human touch. Empathy is our key word, and we’re going all out on discussing how conversations can reshape experiences and your business inside and out. No matter how big your business is, what your challenges are or your industry, connecting with your customers and their decisions is essential to leading through empathy.
We’re back with one more episode of Voices of Customer Experience. We’re on season seven and today I’m joined by Ben Labay. Hey Ben!
Ben Labay: Hey, good to be here. Thank you for having me.
Mary Drumond: Thanks for coming on. It’s kind of a gloomy day in Atlanta, which is rare for those who watch this video a lot. It’s always like this really beautiful backdrop here in the picture. People sometimes write me and say, Mary, are you using a fake background? No, I am not. This is the burbs in Atlanta.
But Ben, you are the focus of our conversation today. Not only you, but everything that you’re doing in your job and some of the changes that you’re making in your job as well, which is a really interesting topic for me. So I’m going to let you do the talking for a couple minutes and tell our listeners and our viewers about who you are, where you work and what you feel your overall mission is professionally.
Ben Labay: Yeah, let me get you… I’ll tell a little bit of a story and get to that CX topic at the end of it.
And we can pivot from there. I’m calling from Austin, Texas, from kind of a big work closet at home. I work for a company called Speero. It’s an agency; we recently rebranded from CXL Agency. So the bigger umbrella organization that I work for is known as CXL. We have a few different business units, but by and large, we’re in the industry of conversion rate optimization. That’s kind of our birthplace, and we’ve been around for about 10 years. About five years ago we created a learning platform, our own product, a subscription e-commerce product. That learning product focuses on conversion rate optimization, but more generally does digital marketing training, online training for teams on different fields of marketing and analytics. My work is on the agency side, so managed services. Our bread and butter is testing programs, experimentation programs. So that’s where I’ll kind of speak to my background. I came from, not marketing, not the field of marketing, but the field of academia.
Until four or five years ago, I spent about 10 years at the University of Texas at Austin as a staff researcher doing a lot of stats work, a lot of data work, conservation science. Modeling, mainstreaming data to general audiences, telling stories with data, a lot of predictive analytics modeling and things like that.
But in the work that we do now on these experimentation programs and CRO, a lot of the transferable skills are around research science. The idea of: what does research mean? What does data mean? How do you not get into data traps, how do you explain data properly, how do you contextualize it, how do you make decisions with data?
So that’s what we’re primarily hired by clients to do: let’s get some focus, let’s get some better decision-making with data. Our bread and butter, as I mentioned, for the last 10 years has been conversion rate optimization. So on the digital web experience, doing a lot of AB testing, a lot of experimentation on that web experience.
So not doing SEO, not doing acquisition. We’re not really part of that function of marketing. Just within that web experience: how do we journey? How do we message, how do we speak to customers in terms of what resonates, what brings them back, and what ultimately converts them?
And now we’re trying to push, we’re changing the language, changing the dialogue. That’s still a core part of our function, but we want to craft our service and our product in a way that answers not only how we can convert that customer, but how we can retain them, keep them loyal, and allow them to spread the love of that brand.
And so taking brand, internal brand point of views, internal brand messaging and getting that outward on the website and allowing that to be a vehicle for retention and loyalty and analytics metrics associated with that, boiling them back up to tests. A lot of that type of work.
So that’s our big shift right now: label-wise, we’re going from conversion rate optimization to CXO, customer experience optimization. That word optimization is the suitcase term that brings with it experimentation, AB testing, using data to make better decisions.
So very much an analytical metrics driven focus on measuring brand, measuring perception, things like that.
Mary Drumond: So I think it also adds some action to the term CX, to a certain degree, because CX is so broad. When you think about it, there’s a philosophical part, there’s a journey mapping part.
There are all these different areas inside of customer experience. But when you talk about the optimization of the customer experience, it almost feels like you’re putting something into action in order to make the experience better. So I’m down with it. I like it.
But one thing that I’m interested in, from what you just said about your background, is that even coming from an academic background where your work was not in marketing at all, your expertise in data gave you the skill set you needed to pivot to an entirely different field and still be successful, still have the tool set you needed to accomplish great work. So when it comes to your training in academia, it really was science and data collection and predictive analytics, but in… is it marine biology?
Ben Labay: It was aquatic resource conservation science, ultimately. So using data to help the department of interior of the federal government, state governments use their conservation dollars more intelligently. We have all of this data, where do we focus it? Where do we focus our efforts? Where are we focusing our time, where do we focus our calories?
That’s the same thing that we’re- so experimentation, optimization, it’s a decision support function, so it can float among the silos. Marketing, product, sales, it can float among the silos, helping those channel owners make better decisions. So I think those are the skills that transfer: being honest about data, about data being actionable, not getting into those data traps.
There’s a phenomenon in academia known as the implementation crisis, because academics are really good at assessments and planning. We can do this, I can write, I can publish the report, but does it get actioned? Does it influence bulldozers on the ground?
That’s the implementation crisis. There’s another whole branch of research and science called knowledge transfer science, trying to solve the same thing. We gain all this knowledge over here, but it just falls flat. You can talk about CX all day, but where is it getting implemented?
Where’s it getting implemented successfully? How is it getting implemented? So this is like operating system stuff, like getting it done, moving it forward. Making decisions, not getting trapped.
Mary Drumond: So with this background of yours, you’re almost, let’s say, an outsider. Let’s label you that. We’re going to label you an outsider, and you come into the customer experience vertical with an entirely different perspective. You’re looking at things from a very scientific standpoint. Not only that, you understand that this gap exists between what people say needs to get done and what actually gets implemented. So I think that’s really valuable.
And I think that the entire vertical could use more of that, because there is a lot of, how can I say this? People are really passionate about helping customers, but when it comes to actually connecting the concepts of the customer experience with the concept of the company making money, there seems to be an abyss between the two that the market is struggling to find solutions to. We have seen a lot of customer experience departments, customer experience executives, customer experience practitioners become increasingly frustrated because they don’t have budgets. Their titles are almost symbolic, there to check off a box for shareholders.
And when it comes down to it, the decision makers inside organizations don’t actually believe in the success of these individuals who are trying their best. They’re underfunded, they don’t have the resources, and any findings they have, any changes they want to make, are in many cases ignored or never really implemented by the other departments in the organization that would actually have the power to change anything at all in the customer experience. So do you feel like you have some of the tools necessary to help us solve this challenge?
Ben Labay: Yeah. I think it’s a classic change management problem, or that’s the word that kind of addresses this arena of problems and it’s like an execution problem, right?
You don’t have the proper, in this case, customer metrics set up from the top – it’s got to be led from the top – you don’t have clarity around customer experience metrics that then kind of trickles down into the different departments and different silos in a way that everyone is on the same page and working towards the same goal.
So whether it’s Getting Things Done or the Four Disciplines of Execution or OKRs, whatever framework of execution is being used, you need that goal tree where the customer metrics are delineated, along with the mechanisms of how they’re measured, how brand perception is measured. Tactically, Jeff Sauro at MeasuringU is doing some really good stuff with the SUPR-Q survey in terms of measuring and delineating way beyond NPS, measuring perceptions of user experience and customer experience, and we do a lot of work derivative of his. So, attaching those metrics to a framework of execution, and I mentioned a few of them there. That’s how to solve it. You first have a point of view from your company. In the Four Disciplines of Execution framework, what’s your wildly important goal? And there are only three categories: revenue, customer, and process.
All companies have their wildly important goals. If they’re mature enough to have that umbrella, and have leadership and a CEO that’s on it, with that vision, then it’s going to be one of those three things. And there’s a lot of research saying that it always boils up to customer metrics. These companies think they have their revenue metrics, but in the end, it always boils back to the customer metrics. The marketing function is there for the customer, the product team is there for the customer, the sales team is there for the customer. But in practice, each of those different silos has its own little metric bucket, and they start to just care about their metric bucket, slice and dice the departmental budgets accordingly, and fight over that.
And so it’s just the classic kind of siloing. It’s about getting away from that type of operational mentality and thinking a little bit differently. In the end, team efficiency is my motivator: internally, my team, the agency, how can we be the most efficient, the most motivated, having the most fun?
And then the second layer of that is our clients: first the client point of contact, and then the client organization. How can we change the organization? Change management, metrics. What’s your point of view? What’s your internal brand? Where are your customer metrics? How do you boil that up? And then how do you use this measurement function?
AB testing, experimentation, surveys. Intelligent surveying, monitoring of NPS, et cetera. How do you use those metrics to move up, or down, that goal tree, so to speak, to get to that wildly important goal, whatever it might be? I kind of rambled there, but…
Mary Drumond: You answered my next question before I asked it, which was very predictive of you in that sense, but my question was going to be, so do we need better metrics or do we just need better processes for actually doing something about the results that come from those metrics?
Ben Labay: I think, in the end, what this boils up to is better leadership. So it’s the use of the metrics. We’ve got a lot of metrics. There’s a lot of data out there, too much data, and there are data traps left and right. So it’s the organization of the data, the frameworks and decision trees for: okay, we have enough, let’s make a move.
And every week I’ve got conversations with clients about stats, and I describe it with the classic analogy of a blind man putting his hand on an elephant, trying to describe it. That’s what any metric is by itself. But in the end, you can only feel around so much before you make a decision on describing that elephant and moving forward.
We just need to have frameworks for making these decisions and I think what’s lacking a lot of times is those execution frameworks, the operating system to understand how we’re executing towards that goal, that vision. We’ve both been talking about that book Play Bigger recently.
And if you have a vision, a point of view of your agency, that’s your umbrella. My boss Peep Laja talks a ton about not listening to the customer in a lot of ways, right? Your customer can lead you astray. So you have to be driven by the point of view, that umbrella goal of the org, knowing that you’re underneath that umbrella.
Okay. Let’s use the customer data to make better decisions and make sure we’re learning, driving towards that.
Mary Drumond: If you’re listening to all your customer data and trying to act on all of it, you’re going to be in deep trouble. One of the things that we focus on doing here at Worthix is measuring impact. So for whichever feedback the customer gave, how much is that truly driving them to purchase and repurchase, or even churn, right? Because once you’re able to understand how strong people’s reactions to those experiences are, then you have a better idea of what actually needs to change and what can stay put.
Not only that, if you consider even from an investment prioritization standpoint, right? If you have these giant datasets or this like endless customer feedback, how do you know where to begin? How do you even know where to start? How do you even know how to start looking for trends without entirely skewing the data with biases, right?
So it’s about being able to measure and understand the impact. There are a lot of organizations doing that now, not only Worthix, but I would encourage organizations and practitioners to start paying attention to it, because customers talk a lot. Sometimes we get frustrated, sometimes we don’t, but when we’re frustrated with something, we tend to be more vocal. And just because we’re being vocal about it, just because we’re making noise, making a scene even, it doesn’t mean that feedback is actually going to contribute to something positive for the organization as a whole. If the company changes something that’s going to make Mary happy, how’s it gonna affect all of the other individuals that are a part of the organization?
Yesterday I was listening to a podcast. I like listening to Freakonomics Radio. I don’t know if you’ve ever listened to that. And Freakonomics has a newish podcast called No Stupid Questions. It’s Stephen Dubner alongside Angela Duckworth. Angela Duckworth is a social psychologist. She wrote that book Grit, and they talk about really kind of just daily things and questions that people send in.
And interestingly, there’s been a lot of talk on empathy in marketing in general, but specifically in customer experience about having empathy.
And there was an interesting question that was, is empathy immoral? Now that’s not really something that we give a lot of thought to, but as humans, we tend to empathize with people when we can hear their story, right? And if we don’t hear their story, then we don’t empathize with them. So sometimes we give priority to somebody who doesn’t necessarily need it more than the others, but because we formed a personal bond and a personal connection with that individual, we give them that priority. And so the discussion around it was is it ethically and morally wrong to act on empathy in these cases?
And that’s something that as organizations, we also need to keep in mind. We need to be able to have empathy at scale. We need to be able to listen to a larger group of customers so that we’re not making poor decisions based on individual conversations or a single conversation with customers.
What are your thoughts on that, as somebody who works with data so much?
Ben Labay: Yeah, this is great. I love it. And to me, context, data context comes to mind. And what you described there with empathy also can be described as a cognitive bias, a focus bias. What we’re focusing on is the most important thing, right?
From a data angle- I harp on this a ton. We do user testing, for example. User testing is a really dangerous research methodology, because you get those stories: you get somebody talking, you get the video, you get the anecdotes, and exactly as you said, you get the sympathy and you’re focused in on that. But when you do it with, like, 10 people, what’s the sample size there? Are you going to make conclusions based on opinions and be able to extrapolate, to transfer that dataset and say it’s applicable? The crazy thing is when I see statistics like “70% of my user testers thought that this design was X,” which is always alarming. It doesn’t apply that way. User testing is all about observing behavior, not about getting data on perceptions or opinions. So: data context, being question driven, not collecting data for data’s sake, and also putting the data in context.
I don’t like to do user testing by itself. I like to pair it with surveys, pair it with heat mapping, pair it with analytics, pair it with customer interviews and do it all at the same time and have confidence that an issue is worth addressing because it’s supported by different types of data, flavors of data.
So that’s the way to address it, and a lot of our research methodologies do that really explicitly: trying to get around that focus bias, so that one type of data, one category of data, doesn’t bias you toward “oh, we’ve got to do this.” And then also, again, being question driven. We’ve got a new policy at our agency; it’s a bit of a marketing campaign too. Soon we’re going to have all of our analysts (they don’t even know yet) put on LinkedIn: “I’m a Speero analyst. If I do a button color test, I get fired.” It’s a sign of an unhealthy program, a program that’s not asking good questions, not being question driven, when they’re doing dumb tests like that.
When they’re collecting data for data’s sake. Now, there are certain organizations, the Microsofts, the Booking.coms, the Googles, that can ask those types of questions. They’re doing it at scale, they’re doing continuous experimentation, they’re in their own bucket, and we’re not. The exception proves the rule here, right?
For most people when you’re doing tests like that it’s a sign of that you’re just collecting data for data’s sake and you’re not actually asking questions. You’re not being strategic.
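[Editor’s aside: Ben’s small-sample warning is easy to quantify. The sketch below is an editorial illustration, not a Speero tool; it uses a Wilson score interval, which behaves sensibly at the tiny sample sizes typical of user testing.]

```python
from math import sqrt

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a proportion -- robust at small n."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# "70% of my user testers" with only 10 participants:
lo, hi = wilson_ci(7, 10)
print(f"{lo:.0%} to {hi:.0%}")   # roughly 40% to 89%
```

In other words, a 7-out-of-10 result is statistically compatible with anything from a minority to a near-unanimous opinion, which is exactly why Ben treats user testing as a tool for observing behavior rather than for measuring perceptions.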
Mary Drumond: Then with that in mind how would that apply to some of the current key metrics for customer experience in the market?
How many of them are being asked- how many surveys are going out to customers that are collecting data just for the sake of collecting data?
Ben Labay: Where I see the fault is when people get hooked on one methodology because that’s their hobby horse, that’s their hammer. This applies to AB testing too. Not everything should be AB tested, as I just mentioned.
And I think it’s about having a good set of questions, focused on the flavors of classic heuristic frameworks: what’s causing friction, what’s motivating, what’s causing the fears, uncertainties and doubts. What are the perceptions, be it brand, usability, clarity, credibility, message clarity, or loyalty, like NPS, for example. So I just mentioned those big buckets, like UX perception; I went into the details there, but there are four buckets of things. For example, behavior is one bucket: how are they behaving?
You look at analytics, you look at heat mapping, you can look at user testing, because that’s there to observe behavior. All of those methodologies of collecting data don’t get you data on motivation, right? It’s implicit, but it’s not explicit. So let’s do some surveys. Let’s do some customer interviews.
Let’s focus in on motivations surgically. I see a lot of customer surveys where they’ll have like 13 goals in the survey itself, and each goal has like five questions. That’s kind of extreme, but there’s often a ton of different goals. That’s where we like to slap hands: no, let’s survey with one goal and get just motivation. If we want fear, uncertainty and doubt: what were three things that almost held you back from purchasing today? Or on exit intent: is there something holding you back from purchasing today? Yes or no. It’s just as easy to click yes or no as it is to close out that little poll.
If they do click yes, it opens up an open-ended question. That’s surgically fear, uncertainties and doubts. Chat logs have a lot of fear, uncertainty and doubt information too, like “I can’t find this, where’s my support?” So really surgical methodologies to get that, and then tying it into the broader thematic goal, like: we need to help our customers do product discovery more intelligently.
We’re the Cadillac of selling jewelry, so we need to dial it up to 11 on that brand messaging. How can we do that through the funnel? And then how can we take a benchmark metric of brand perception with a hundred people, and two months later, after we turned it up to 11, measure it again?
Did that work? Another example: if you’re using customer experience or customer service as your main sales channel, how is that path working? Are they getting there too quickly? Not quickly enough? What’s the value of a shorter funnel versus a longer funnel in terms of perception and the quality of the conversations with customer success, things like that?
You can measure that super explicitly.
Mary Drumond: When it comes to your job, the job that you’re doing at the moment, who are your main clients? What sort of organization uses your services to improve their revenue or their experience and optimize those? Is it large organizations, SMBs…?
Ben Labay: Yeah, not SMBs. When you use the term optimization, it’s really focused in on process and margins, and for the ROI of bringing on an agency to focus on process and margins, there’s got to be room in the margins for that. Now, there are a lot of zero-to-one situations where they might just lack resources for experience design or experimentation, development, things like that.
And then you just bring on an agency to plug into an org chart. But in a lot of ways, the ROI of a strict experimentation program needs a certain amount of data and volume and money. So if you have a hundred thousand visitors a month, a million a month in revenue, there’s triangulation among average order value, traffic, and conversion rates.
So there’s triangulation among that, depending on what the ROI is. But generally we see a lot of success with series B, series C, fast-moving startups, or a little bit later stage, like MongoDB, Codecademy; we work with Miro.com. These are fast-growing organizations.
There’s a lot of up-skilling to be done. An agency comes in with: this is how we do it, this is our process. We’re going to zero-to-one you on this whole process, train you, hand over the keys. Hopefully it gets sticky because we’re good at what we do, and we can run with your team as a partner.
There’s a lot of that as well.
Mary Drumond: So you’re talking about a certain amount of volume here, right? And from my very limited understanding, in order to have successful AB testing, there’s a law of large numbers that applies, like we were talking about earlier. You can’t say, “I have 10 responses; this is sound empirical evidence.” And I always find it funny when somebody comes straight out of marketing school to work for me and they’re like, hey, let’s AB test the hell out of this page. And I’m like, this page gets 500 views a month. You’re going to AB test that? That’s… eh.
Ben Labay: I have those conversations every week. Yeah.
Mary Drumond: So that’s something that is real, but how do companies that get, maybe less traffic or less volume of business, how do they optimize that ROI through experimentation? How do they apply some of the work that you’re doing without having those big numbers?
Is it even possible or do they just have to do it manually?
Ben Labay: So, yeah, that’s a good question, and it’s been a perennial one in the field of CRO. How do you test without a lot of traffic? You don’t AB test, you don’t do certain types of activities, but you can still optimize. The function at its core is a decision support function.
And actually we were talking about category creation or we’re moving to CXO this year. Next year, I want to go bigger. I want to try to take experimentation as a function and move it away from product and from the Chief Marketing Officer and put it underneath the Chief Operating Officer, I think it is an operating system itself.
It’s a way of thinking. A full program only works at bigger organizations, but if you look at it as a way of thinking, a data-validated decision support operating system, you can do user testing, you can do surveys, you can do customer interviews. That’s what startups do. The Mom Test is a really good book on getting data through customer interviews, in a nice framework that he put out there.
And with that analogy of a blind man putting his hand on the elephant, you’ve got to go with just what you feel. You collected data somehow, right? But the framework, the science of it- in marketing, we don’t like to use the word science. Whenever I throw that out, people are like… I feel like it puts up this big wall, or everything gets fuzzy or whatever.
But the scientific method is just a tool for using observation to find truth. So that’s really what we’re talking about here: the type of observation that you do. Look at social science. If you want to call it a science, it’s not empirical. It’s not like- all those studies get debunked every year! Like “my posture can affect my hormones,” and they totally get debunked all the time.
And so the empirical-ness of observation can always be doubted, but in the end it’s a speed-confidence thing. The faster you can make decisions that move your ship that are hopefully good decisions that you’ll get ahead of the competition or just move yourself forward.
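[Editor’s aside: the traffic threshold Ben and Mary keep circling can be made concrete with a standard two-proportion sample-size calculation. The baseline and lift below are illustrative assumptions, not Speero’s methodology.]

```python
from math import sqrt, ceil
from statistics import NormalDist

def ab_sample_size(p1, p2, alpha=0.05, power=0.80):
    """Visitors needed per variant to detect p1 -> p2 with a two-sided z-test."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # critical value for alpha
    z_b = NormalDist().inv_cdf(power)           # critical value for power
    p_bar = (p1 + p2) / 2
    num = (z_a * sqrt(2 * p_bar * (1 - p_bar))
           + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(num / (p2 - p1) ** 2)

# Detecting a 20% relative lift on a 3% baseline conversion rate:
n = ab_sample_size(0.03, 0.036)
print(n)             # roughly 14,000 visitors per variant
print(n * 2 / 500)   # months of runtime at 500 views/month: several years
```

Under these assumptions a two-arm test needs on the order of 28,000 visitors, which is why a 500-views-a-month page should lean on interviews, surveys, and user testing instead of AB tests.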
Mary Drumond: I totally agree there. Absolutely.
So what is the one thing you feel, Ben, that if you could change in the way that people are looking at their customer experience at the moment? Again, we’re going back to the outsider label, what would you have people do differently in the way they’re managing their customers’ experiences?
Ben Labay: Yeah, I think a lot of CX and UX practitioners would probably agree with me: the one thing that I would want to change is to break the silos and really have the customer at the heart of all decisions. With the caveat that it still needs to be under the umbrella of the mission statement of the company, of the brand. Just like people are more productive when they know themselves, a company is going to be more productive when it knows itself and knows what it doesn’t do.
And when a company knows what it doesn’t do, it can filter customer data more intelligently, and then you can truly put the customer at the center of everything like the marketing, the sales, the product, and you can start to break those silos.
And you really start to see cultures of innovation when those silos are broken. Because you see people start to engage with each other’s departments. When the sales department does an experiment, finds a learning, shares that, marketing says, that’s cool. We can use that over here. Let’s do that. Let’s try this. You start to see those types of activities and that type of culture. That’s brilliant. Everyone’s having fun at that point.
Mary Drumond: Yeah. This culture of corporate tribalism, where we feel like we have to stick with our people in our unit. I mean, of course, it happens all over.
We see this all across the human race, where we have this division where we try to stick to our little tribes and exclude everyone else because, you know, danger, getting preyed on by other civilizations back in the day, et cetera. But nowadays when we’re in companies, it’s amazing for me, how we default into those silos, even though we know better.
So for instance, Worthix is a start-up. We haven’t been around for that long, and we still find ourselves having to consciously break down silos in this ridiculously small organization, we’re like all in the same office! Like, how are we even doing this? How are we creating these barriers? So it’s amazing to me that’s what we default to, subconsciously.
That we feel almost like it’s our responsibility that we have to remain loyal to our individual teams and then it’s us-against-them culture. How do we break that down in a way that it doesn’t keep coming back? You know what I mean? Like it’s not, we’re not the first people to talk about silos. It’s been a while since this topic has been going on.
Ben Labay: I solve that by firing clients. It’s the only way; some clients are just not going to be there. It’s not solved; it’s so ingrained. Actually, my background at university, what I studied, was evolutionary behavior.
And you’re just not going to solve it in a lot of ways. But I think that trying to attract more enlightened people to work with and for and around you is the way to do it. And so we’re at a point as an agency where we can start to fire clients and always be the buyer, so to speak.
Think along those lines: we want to attract open, intelligent clients that give really critical feedback. They push for performance, but they’re honest and open, and it’s not an us-versus-them sort of mentality. It’s a team; we’re in this together and it’s working or it’s not. We can measure that and we’re going to measure that, but we’re going to measure the gain, not the gap. That’s the key. Personally, professionally, if you keep measuring the gap, you just kill yourself. You’ve got to measure the gains.
Mary Drumond: Yeah. Awesome. Great advice. Ben, if our listeners and viewers want to follow you, reach out, have a conversation with you, how can they do that? What’s the best way?
Ben Labay: Yeah, I’m most active on LinkedIn, that sort of professional network. I post there quite a bit and try to do more and more. So find me on LinkedIn, Ben Labay. The website is Speero.com; you can find me through there as well. Reach out, give me some thoughts. But LinkedIn is a great place for me.
Mary Drumond: Awesome. Go find Ben on LinkedIn then, folks. Ben, thank you so much for coming. It’s been such a great conversation. It’s been a different angle than we normally get at. So good talk.
Ben Labay: Yeah. Talk to you soon.
Mary Drumond: That’s our show. Thanks for joining us. We hope we’ve brought you one step closer to leading through empathy. It’s our way of making the world a better place. One business at a time. Don’t forget to subscribe and hit the bell if you want to know as soon as we publish a new episode. Voices of CX is brought to you by Worthix.