Never Go With Your Gut: Dr. Gleb Tsipursky


About Dr. Gleb Tsipursky

Known as the Disaster Avoidance Expert, Dr. Gleb Tsipursky is on a mission to protect leaders from dangerous judgment errors known as cognitive biases by developing the most effective decision-making strategies.

He has over two decades of consulting, coaching, and training experience as CEO of Disaster Avoidance Experts, and over 15 years of experience in academia as a cognitive neuroscientist and behavioral economist. Dr. Tsipursky writes for Inc. Magazine, Time, Scientific American, Fast Company, and Psychology Today. A best-selling author, his new book is Never Go With Your Gut: How Pioneering Leaders Make the Best Decisions and Avoid Business Disasters.

Follow Worthix on LinkedIn 
Follow Worthix on Twitter: @worthix

Follow Mary Drumond on LinkedIn 
Follow Mary Drumond on Twitter: @drumondmary


Follow Dr. Gleb Tsipursky on LinkedIn
Follow Dr. Gleb Tsipursky on Twitter

Buy Never Go With Your Gut

Transcript:

Mary Drumond: Welcome back to one more episode of season five of the Voices of Customer Experience Podcast. Today I am joined by Dr. Gleb Tsipursky, and he is a specialist in cognitive biases and behavior. Right?

Gleb Tsipursky: That’s indeed right. That’s what I specialize in, how to make sure that we address the dangerous judgment errors that cause us all to make really bad decisions, whether in customer experience, employee experience or other business areas.

MD: Awesome. So can you tell us, how do you prefer to be addressed? Dr. Tsipursky, Gleb, what's your favorite?

GT: Gleb is fine. I don’t need that formality, Mary.

MD: Okay, great. So, start off by telling us a little bit about your background. What got you excited about, you know, kind of delving into this topic? Ultimately, what led you to it?

GT: Well, what got me excited about this was seeing some bad decisions that my parents were making, and that's what then got me interested in cognitive biases. It got me interested in decisions. I saw my dad, for example. He's a real estate agent, and he has variable income because he works on commission. So at one time he made a lot of money, but he hid it from my mom and invested it into buying an apartment elsewhere and leasing it out. Once my mom found out, she was very mad. She was very pissed.

MD: I could imagine.

GT: My dad couldn't imagine it, clearly, because he did it. But once that was revealed, it was a big blow, a conflict. They separated for a while, actually, and they eventually reconciled, but she could never really trust him again; it was really a trust issue. And so as a kid, that really shaped me, seeing my parents split up over dumb decisions around money. So I'm like, okay, adults make really dumb decisions around money. And that was further proved to me when I was growing up in the middle of the dotcom boom and bust. I was born in 1981, so I came of age in 1999. I was 18 when companies like web [?] dot com were all booming, and just a couple of years later, they were all bust. They went bankrupt, billions of dollars down the drain, lots of people losing their life savings. And so I saw that the people who were the heroes on the front pages of The Wall Street Journal in 1999 were the zeros in 2002. Even worse were leaders like Bernie Ebbers at WorldCom, and others at Enron and Tyco, who used fraudulent accounting to hide their losses. And it couldn't have lasted more than a couple of years, just a year or two, until they themselves did the perp walk and were on the front pages of The Wall Street Journal for all the wrong reasons. They must have known; they couldn't have not known that they wouldn't get away with it. But they still made these terrible decisions, which took them from titans of industry to jail cells. So that made me realize that even the most prominent, biggest leaders in our country make terrible decisions around business. That made me want to understand how to address these problems, because I really wanted to address the suffering that I saw so many people experience. And that's when I went into studying the dangerous judgment errors that our brain makes, and teaching about them as a trainer, consultant, and coach, which I've been doing for the last two decades. And as I was doing that training, consulting, and coaching, I realized that I needed more formal education myself to understand what's actually behind this and to do some research on it. So I went into academia studying cognitive biases. I have 15 years of experience as an academic, including seven years as a professor at Ohio State. Go Bucks! And that's what brought together all the information in my new book, Never Go With Your Gut: How Pioneering Leaders Make the Best Decisions and Avoid Business Disasters, combining the 20 years of consulting and coaching experience with cutting-edge research on the dangerous judgment errors that cause us to make terrible decisions, like Bernie Ebbers did, like my parents did, like my father did, and like so many others do, unfortunately. I mean, look at what's happening at Boeing right now. Horrible decision making.

MD: Absolutely. Yeah. As we are speaking, there's a lot of stuff going on within Boeing, changing up their C-suite and their executives to try to make up for some really poor decision making. Right?

GT: Exactly. You can see the biggest error that Boeing ran into, and I have a piece about this coming out in Inc. Magazine in just a bit: it's called the normalcy bias. Boeing is kind of the victim of its own success. What was happening previously at Boeing was that each new airplane model they put out was safer and safer over time. So the safety record was improving, and a number of engineers at Boeing had said, Hey, this new model, the 737 MAX, is not as safe as it should be, so we should have more safety checks, we should have more testing, fix the software. But the Boeing leadership fell into one of the typical cognitive biases we fall into, the normalcy bias: thinking that everything will go as normal, as it did previously, and underestimating the huge potential for a major disaster. That's what led to typical stock bubbles like the 2000 dotcom boom and bust, or the housing price bubble. People in 2007 couldn't imagine that prices in 2008 would fall so drastically.

MD: Even the IPO bubble that we’re having nowadays. Right?

GT: Absolutely. That's a great example, the IPO bubble, where previously the founders were the heroes who would disrupt the world, and now banks are moving away from investing in these founder heroes and are actually looking at companies that are making money. And so the Boeing leadership, in a similar way, couldn't imagine, couldn't envision, that their new airplane, the 737 MAX, could possibly be less safe than previous airplanes. So that's what they did: they said, Oh, you know, this FAA certification process, whatever, it's just a big bureaucratic hassle. We'll just cut a few corners, pay a few bribes, do a few shady things, but we'll get it out faster and compete effectively with Airbus. Well, you see what's happened now: direct costs to Boeing are over $9 billion, Boeing has lost over $25 billion in market capitalization, they will never have the reputation for safety and security again, and they lost a lot of future orders. And of course the CEO was fired, and so were a number of other leaders. So this is the consequence of the normalcy bias, which is just one of over a hundred cognitive biases that we tend to fall into.

MD: So there are hundreds of them, but there are a couple that are more common knowledge, I think, at this point. I mean, we've had a lot of really great, amazing psychologists who have published best sellers all over the world that talk about biases. So there are a couple that are more well known, right? What are some of the ones that we should keep in mind more often?

GT: So besides the normalcy bias, which is definitely a big one, one prominent one is called the confirmation bias. This is probably the most popular one in terms of what has been talked about. The confirmation bias causes us to look for information that we want to see, cherry-picking information that confirms our beliefs and ignoring information that does not confirm our beliefs. So let's go to another recent example, WeWork. WeWork was a great company, apparently, about a year ago, and it was worth about $75 billion. Its founder and its leadership team were looking to do an IPO, an initial public offering. Adam Neumann and others said, Hey, you know, we're doing so well, let's do an initial public offering. And he was ignoring information. A lot of people were telling him, Hey, the governance structure of WeWork is kind of screwed up, maybe we shouldn't go to an initial public offering right now, there are a lot of internal problems in the structure. But he refused to listen to people like that, including from SoftBank and other investors in WeWork who didn't want an IPO, and he went ahead with the initial public offering. Now the structure of WeWork was revealed, and there was quite a bit of double dealing: Adam Neumann owned some properties and was leasing them to WeWork, which is obviously problematic, and he owned shares that were worth 10 votes whereas he wanted to sell shares that were worth one vote. And there were many, many other governance issues. So after investors actually looked at this, they were like, no, no, this is a bad investment. They lost trust in WeWork, and the $75 billion valuation was based not on the current profits of WeWork. WeWork is losing billions, many billions of dollars, actually. What people were valuing WeWork at $75 billion for was the strategy and vision of the leadership. So it was the founders who were seen as the heroes, and once it was seen that the founders were really screwing up with their strategy and governance structure, WeWork's valuation went down, to about $7.5 billion at the last valuation. So that's about $68 billion of value, an order of magnitude, lost just because of the confirmation bias, just because of ignoring information that's unpleasant and looking for information that confirms your beliefs.

MD: Yeah. And if you look at it even from a more general public point of view, we see confirmation bias all the time with elections, with voting, with what we see on the news, where we're constantly looking for information that validates our point of view. And the search engines and the algorithms that are out there aren't helping, because they only feed us whatever the algorithm thinks is interesting for us. And then we really get stuck in a knowledge bubble as well, right?

GT: Yes, that's a very insightful point, Mary. It's called a filter bubble, where we tend to get information that confirms our beliefs. And that goes back to why this happens, why these cognitive biases occur. They occur because they are part of our gut intuitions, our gut reactions, our emotions, our feelings, our instincts. What the cutting-edge research on this topic that I study has shown is that our emotions, our gut reactions, our intuitions are actually not adapted for the modern environment. This is why my book is called Never Go With Your Gut. People tell folks, go with your gut, go with your intuitions, be primal, be savage. The Tony Robbinses of the world, all the gurus. Terrible advice. Our gut reactions, our primal, natural state, are adapted for the savanna environment, when we were hunters and foragers and gatherers living in small tribes of 15 to 150 people. That's what we're adapted for. That's what we feel good about. So we feel good about looking for information that confirms our beliefs. It's about a feeling. It's about how we feel. It's a gut reaction; we like to get this information. We like to feel that we are strong, that we are good, that we are nice, and whatever, partially because that's important for our self-esteem. And it was important for us to have high self-esteem in order to try to climb up the tribal hierarchy. So tribalism was incredibly important in that time when we were in the savanna environment. And part of tribalism is the importance of climbing up in the tribal hierarchy, having as much social status as possible, because that allows us to get resources that enable us to survive and pass on our genes. So we are the descendants of those people who were excellent at climbing the tribal hierarchy and passing on their genes. And then you get the same sort of screw-ups that happened with Bernie Ebbers and other leaders around the world, who didn't want to admit their failures because it would cause them to drop in the eyes of their peers and lose social status. They would lose face. They would rather, you know, last a year or two more high in the tribal hierarchy at the cost of going to jail for the next 10 years afterwards. Terrible decisions, destroying their companies. And that's what comes from tribalism, that aspect of tribalism.

MD: You know, I was reading Sapiens, just interrupting you really quick, and in that book Sapiens, correct me if I'm wrong because I'm sure you know this better than I do, but Homo sapiens has existed for 75,000 years. And of those 75,000 years, for 72,000 we were just tribes surviving and hunting and foraging, and that is so very deeply ingrained in our DNA. And we don't realize this because society as we know it, the structure of society, has only been around for a couple of hundred years, versus thousands, tens of thousands of years of tribalism. And that is a lot more deeply ingrained in the way we make decisions than we think.

GT: That's right. Though I wouldn't say the last 72,000 years; we're probably looking at the last 6,000 years, when we were transitioning to the agricultural state and living in more complex societies. So I wouldn't quite agree with that aspect of Sapiens; it would probably be 75,000 versus 6,000 to 9,000, but the point stands. What the research on this topic shows is that our gut intuitions, our reactions, our emotions drive about 80 to 90% of our decision making. So if left to our own devices, 80 to 90% of what we do just naturally, intuitively, primitively would be shaped by our gut reactions, by those savage tribal impulses. One aspect, trying to climb up the tribal hierarchy, was a very important part of tribalism and causes us to make a lot of bad decisions right now. Another aspect of tribalism, which is looking for people like us, liking people like us, and disliking people who aren't like us, was very important in the tribal environment for us to support our tribe, because if we were kicked out of our tribe, we died, and if our tribe was defeated by some enemies, we died as well. So it was very important for us to have that tribalism, and that tribalism is one of the biggest components; it is also responsible for the filter bubble. One part is looking for information that we like because it feeds our values. Another is looking for information that affirms our tribal belonging, to whatever tribe we belong to, and that causes us, of course, to make terrible decisions in all sorts of areas. In business as well, whether in hiring and promotion, whether in evaluating new business contracts, whether you're looking at employee experience or user experience, these sorts of tribalist tendencies are called the halo effect and the horns effect. If we like one aspect of someone or something, if we see that person or whatever we're looking at as being part of our tribe, we will tend to give it a higher valuation than it deserves. We'll trust it more than it deserves. We will like it more than it deserves. And the opposite is the horns effect. That's another cognitive bias, where if we don't like one aspect of something because it's not part of our tribe, whether someone's not of our race, sexuality, gender, politics, whatever, then we tend not to like that person, or thing, as a whole. I'll give you an interesting example. I'm from Columbus, Ohio. Our big rival in football is the University of Michigan, the Wolverines. Boo, Wolverines. I was doing a presentation here in Columbus, Ohio, at a diversity and inclusion conference, to over a hundred HR and diversity and inclusion leaders here in central Ohio. And I asked them, Hey, you're diversity and inclusion HR leaders, how many of you would hire a University of Michigan Wolverines fan? And you know what? Of over a hundred leaders, only three people said that they would hire them. Only three people! That's tribalism at its strongest. However poor their sports choices, and you know, it is a very poor sports choice, it doesn't say anything about their ability to actually do a job. But because of the strong tribalism, the strong antipathy, only three out of these hundred diversity and inclusion leaders indicated they would hire such a fan. So this is how it works in all sorts of areas, and it causes us to make all sorts of bad decisions in business as well, in customer and employee experience.

MD: No, that's pretty crazy. 'Cause I remember you saying a couple of minutes ago that more or less 80% of all of our decisions are gut decisions. Is that it?

GT: 80 to 90%, depending on the decision. When the decision involves people more, more of it is influenced by emotions.

MD: So even with a whole bunch of education and a whole bunch of knowledge, even the most educated person, I would imagine, even you, who studies this for a living, still has these biases. It's pretty impossible to get away from them 100%. So what do you think the percentage is for someone who is highly trained to recognize biases and combat them? What's the percentage for that person of making a decision based on gut?

GT: So the critical thing in decision making is not to avoid feeling your emotions or avoid using emotions. The critical thing is to retrain your emotions. That's what it's all about. Naturally, intuitively, we'd eat all the donuts that we can eat, all the chocolate chip cookies if we could, and we'd sit on the couch and watch Netflix, because in the savanna environment it was really important for us to eat as much sugar as possible in order to survive. The ancestors who did that survived; those who didn't, died. So right now the obesity epidemic is one of the consequences of this, but slowly, over time, many people have been able to teach themselves to like eating salads and to dislike eating more than, you know, three chocolate chip cookies. Two is A-okay, but three is too much. So people have learned, have come to develop a mental habit, changed their emotions, to like eating healthier things. It was unpleasant at first for many people to do so, but that's what they did: they changed their emotions. Similarly, while it's very natural for us, because our brain is lazy and it causes us to be lazy, to sit on the couch and watch Netflix all day, we have trained ourselves to say, Hey, let's go and do some exercise. Let's put on our sweats and go to the gym. So we have trained ourselves to say, Hey, this is a really important mental habit for me to have, this is an important physical habit. In the same way, you can train yourself to change your intuitions, to change your gut reactions, to fight tribalism. And there is extensive research showing that it's possible to fight tribalism. Unfortunately, it's not taught right now. People aren't taught effective techniques to address these cognitive biases, and there are many effective techniques. Indeed, the research shows that we can change our intuitions, but that's essentially going from the natural and primitive state to the civilized state, civilized meaning what's adaptive for the modern, complex, multicultural world, our civilized world. It's not at all natural to refrain from eating sugar. It's not at all natural to go to the gym and do exercises. In the same way, it's not at all natural to restrain tribalism, but people definitely can do it if they use effective decision-making techniques. So it's definitely possible. And people who are trained in doing this are actually able to address a lot of these cognitive biases.

MD: So that's what you do, I imagine: you teach decision makers within organizations to combat tribalism in their decision making, and train them to recognize the biases that are clouding their judgment and effectively overcome them?

GT: Yes, exactly. That's my job. The first generation of scholars, the ones that you mentioned earlier who published bestselling books, talked about how the brain is screwed up, and that's great. You know, they talked about the cognitive biases themselves: Hey, here are over a hundred of these problems, too bad for us. My generation of scholars, kind of the second wave, is looking at, Hey, okay, we're kind of screwed; how do we actually solve all these problems? How do we address them? And that is a practice called debiasing. So the research on this topic is really cutting-edge research, coming out in the last decade, on how you address these cognitive biases. And my book is actually the first one to popularize this research for the business sector: how do we, in business and professional settings, address cognitive biases effectively? So yes, that's what I do. That's what I study, that's what I teach folks, and that's what I consult and coach on.

MD: So what do you think are the main areas that could really use this research? I can immediately tell that HR, employee experience, and hiring really need it, because both the halo and the horns effect, I imagine, are extremely strong when it comes to recruiting, employee evaluations, and other things of the sort. What are some other critical areas that could really use an overhaul when it comes to cognitive biases clouding judgment?

GT: There are a couple of areas. One surprising area that people often don't think about is strategy, in all sorts of areas. Our strategic thinking is incredibly bad. That's one of the basic conclusions of the research. We are very much oriented toward the short term. That's what our gut reactions are for; they are oriented toward short-term survival. It's that fight-or-flight response, one of the primal aspects of the tribal savanna environment. It was very important for us in the savanna environment to make very quick decisions in order to flee from attacking saber-toothed tigers or fight members of opposing tribes who were attacking us. In the current environment, we have many fewer saber-toothed tigers, but we're still very much oriented toward making very quick decisions to be defensive or aggressive, and leaders who make these quick decisions are praised. That's one big problem. The second big problem: extensive research shows that when leaders need to make more important strategic decisions, they tend to go with their gut more, rather than less. And that's terrible. That's horrible. So you might have heard of a strategic analysis like the SWOT analysis, where people evaluate the strengths, weaknesses, opportunities, and threats facing themselves or their organizations. It's a very bad instrument in many ways, because of a cognitive bias called the optimism bias, where we tend to be too optimistic about what's going on and about the positive things that will happen in the future, and another related one called the overconfidence bias, where we tend to be too confident in our judgment. As a result, people who do the SWOT analysis always, I mean always, every time I've seen this, list way too many strengths, way too many opportunities, way too few weaknesses, and way too few threats. Then they make their strategic plan around this SWOT analysis: Hey, you know, let's invest in what we see as our strengths and opportunities, and let's address our weaknesses and threats by investing resources in this way. Well, the resources end up being not nearly sufficient to address the weaknesses and the threats, and they tend to overestimate how much reward they'll get from investing in the strengths and the opportunities. And then their strategic plan really goes in the wrong direction. So especially companies that are in a stage of growth tend to make a lot of mistakes in their strategic planning, and ones that are going for a merger and acquisition tend to make a lot of mistakes in their strategic planning. So strategic planning is one big area. And of course another big area that's relevant to the listeners of this podcast is customer experience. This is a big, big one, especially for folks who do technical work, engineers, software programmers, and so on. They tend to intuitively make the product in such a way that it fits their own needs, and they do it intuitively. That's called the false consensus effect. That's one of the cognitive biases, where we tend to imagine other people to be too much like ourselves, and they're actually much more different from us than we tend to think they are. So we don't intuitively adapt the products that we create to their needs. This false consensus effect is a biggie, and it causes the products that we create and the services that we offer to be not really targeted to the needs of the users, and not framed in a way that they can engage with effectively.

MD: That’s brilliant. Is there a way to train people to do this differently or do you think the only way to solve it is by being more diverse in your teams and having representation of your customers inside your team?

GT: Oh, there's definitely training on debiasing, and that's kind of what I do. So here's a very quick technique; it takes a couple of minutes. It's about making sure that you don't screw up. It minimizes the risk of screwing up; it doesn't maximize rewards. There's a longer technique that maximizes rewards, which you should use for things like a product launch or a website overhaul, more serious things that you want to make sure to get right. But here's a quick technique that I teach all of my clients, and they teach it to all of their employees. It asks you to ask five questions about any decision. First question: what important information didn't I yet fully consider? So what evidence didn't I take into account? Here the important thing is to look at evidence that goes against your intuitions, that goes against your beliefs, so you're not cherry-picking evidence that's most comfortable for you. This question gets at a lot of the information biases; there are lots of biases that have to do with the information we choose to get, whether it's the confirmation bias or other related ones, and this helps you address them. Second: what dangerous judgment errors didn't I yet consider? So what cognitive biases didn't you yet address? There are going to be specific cognitive biases relevant to each decision. So if you are making a people decision, the halo and horns effects may be something you want to look at, and there are a number of other ones. To do this, you of course have to learn about the cognitive biases; my book, Never Go With Your Gut: How Pioneering Leaders Make the Best Decisions and Avoid Business Disasters, talks about the 30 most dangerous ones for businesses. Third: what would a trusted and objective advisor suggest I do? So think about someone who is a trusted, objective advisor to you. What would they tell you to do? What would Mary tell you to do? What would someone you trust tell you to do? You get about 50% of the benefit of this question just by asking it, because you take yourself outside of your own shoes. And of course you can also just call the person who is your trusted and objective advisor, or, if you're a millennial, text that person. Next: how have I addressed all the ways this could fail? So imagine the decision completely failing, think about why it failed, and see what you can do in advance to address that failure. If you have a new product, for example, you might not have considered, let's say, how people with disabilities would use that product. That might be one way it fails, and there might be a whole range of other ways it could fail, and you can address these problems in advance, and that's great. Finally: what new information would cause me to revisit this decision? So what would cause you to change your mind? When we're implementing a decision, we are very emotionally invested in it, we're in the heat of the moment, and it's very hard for us to consider information objectively. But when we're actually making the decision, we're in a different cognitive state, a calmer state; we're less invested in the implementation of the decision. So we can say, Hey, you know, if the new product we launched doesn't reach $450,000 within the first six months, then I need to seriously revisit the product plan.
So those five questions are very effective at addressing a whole range of risks that you wouldn't address otherwise. They automatically address a whole range of cognitive biases. So that's something that I teach all of my clients, and they make sure to have all of their employees ask these questions before they make their decisions.
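
For teams that want to build this five-question check into their workflow, here is a minimal sketch, in Python, of one way it could be encoded as a simple pre-decision checklist. The question wording is paraphrased from the conversation above, and the structure (a hypothetical DecisionCheck object that flags unanswered questions) is purely illustrative, not an official tool from the book.

```python
# Illustrative sketch only: a simple pre-decision checklist built around the
# five questions described in the interview. Not an official tool from the book.

from dataclasses import dataclass, field

FIVE_QUESTIONS = [
    "What important information didn't I yet fully consider?",
    "What dangerous judgment errors (cognitive biases) didn't I yet address?",
    "What would a trusted and objective advisor suggest I do?",
    "How have I addressed all the ways this decision could fail?",
    "What new information would cause me to revisit this decision?",
]

@dataclass
class DecisionCheck:
    decision: str
    answers: dict = field(default_factory=dict)

    def answer(self, question: str, response: str) -> None:
        # Record an answer; blank responses count as unanswered.
        self.answers[question] = response.strip()

    def unanswered(self) -> list:
        # Any missing or blank answer is treated as an unaddressed risk.
        return [q for q in FIVE_QUESTIONS if not self.answers.get(q)]

    def ready(self) -> bool:
        return not self.unanswered()


if __name__ == "__main__":
    check = DecisionCheck(decision="Launch the new product in Q3")
    check.answer(FIVE_QUESTIONS[0], "Churn data from the last two quarters.")
    check.answer(FIVE_QUESTIONS[4], "Revisit if sales miss the target in the first six months.")

    if check.ready():
        print("All five questions answered. Proceed with the decision.")
    else:
        print("Not ready. Unanswered questions:")
        for q in check.unanswered():
            print(" -", q)
```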

MD: That’s great. De-bias training, is that what it’s called?

GT: That's right. Exactly right. So it's debiasing training: how you address all these cognitive biases.

MD: And your book, Never Go With Your Gut. Where can our listeners find it?

GT: It's available at bookstores everywhere. Somebody emailed me from Barnes & Noble saying, yay, you know, this book is nicely turned face forward. That's great. So yeah, it's a traditional book published by Career Press, and of course it's at online bookstores everywhere, BarnesandNoble.com, Amazon.com, and so on. And if readers want to connect with me, they can go to disasteravoidanceexperts.com. There are videos, blogs, podcasts, and training, consulting, and coaching information. One thing I would suggest is going to disasteravoidanceexperts.com/subscribe. There's a free eight-video, module-based course on decision-making 101, informed by my book. And finally, I'm on LinkedIn, pretty active there, so connect with me, Dr. Gleb Tsipursky.

MD: That's great. Well, I have already connected with you and I've already read your book, but I look forward to hearing more from you, because, I mean, the listeners who've been following this for a while know that this is my absolute favorite topic. So it was great having you on. Absolutely fantastic. I hope our listeners enjoyed this as much as I did, because it was thrilling.

GT: I’d love to come back on the show. Mary, thank you so much. It was a great conversation.

Subscribe to our Podcast about Customer Experience – Voices of CX

Mary Drumond


Mary Drumond is Chief Marketing Officer at Worthix, the world's first cognitive dialogue technology, and host of the Voices of Customer Experience Podcast. Originally a passion project, the podcast runs weekly and features some of the most influential CX thought-leaders, practitioners and academia on challenges, development and the evolution of CX.

