Survey Fatigue and How to Fix It: Brian Lamar

About Brian Lamar

Brian is currently VP of Insights at EMI Research Solutions, where his role is understanding the sample landscape and best practices. He designs, analyzes, and presents a research-on-research study of sample providers, keeps up with trends and innovation in the industry, and has developed a product to help ensure consistency.

He has a passion for best practices and quality, and for helping companies make the best business decisions possible. He has spent the past seven years developing a unique understanding of the online quantitative sampling landscape, analyzing his own internal research to help clients reduce sample bias and determine the ideal sampling plan. He also hosts a marketing research podcast, Intellicast, and has a bulldog named Bonnie.

Follow Brian Lamar on LinkedIn 
Follow @EMI_Research on Twitter
Listen to Brian’s podcast, Intellicast.

Follow Worthix on LinkedIn
Follow Worthix on Twitter: @worthix

Follow Mary Drumond on LinkedIn
Follow Mary Drumond on Twitter: @drumondmary

Transcript:

Introduction (00:04):

Brian Lamar is currently VP of Insights at EMI Research Solutions. He designs, analyzes and presents research studies of sample providers, keeps up with trends and innovations in the industry, and has developed a product to help ensure consistency. He has a passion for best practices and quality, and for helping companies make the best business decisions possible. He has spent the past seven years developing a unique understanding of the online quantitative sampling landscape, analyzing his own internal research to help clients reduce sample bias and determine the ideal sampling plan. He also hosts a marketing research podcast, Intellicast.

Mary Drumond (01:23):

So, one more episode of season five of the Voices of CX podcast. Today I'm joined by Brian Lamar, who is an expert at survey sampling and other things in the world of market research. You have your own podcast, right, Brian?

Brian Lamar (01:43):

Yes, I’m a fellow podcaster. Thanks for having me.

MD (01:46):

And your favorite topic is surveys, isn’t it?

BL (01:49):

Oh gosh. Um, yeah. My favorite topic on the podcast is surveys and we break it up and we goof off a little bit because surveys can get a little boring, but yeah, surveys I’m passionate about.

MD (02:02):

And you went to UGA, you attended the MMR program that Worthix is on the board of. I sit on the board there. So that’s how we met, right?

BL (02:11):

Right. In Athens.

MD (02:13):

And that's where we got to talking about this topic that we're both so passionate about, and I really enjoyed, when I met you, how much insight you had and how knowledgeable you were on the topic. And I could think of no one better to come on here and talk about maybe my number one question when it comes to voice of customer surveys, which is sample sizes, panels, and that entire universe. So thanks for coming on. I really appreciate having you on.

BL (02:39):

I’m excited. Thank you. I appreciate it.

MD (02:41):

So this season we decided to focus a little bit more on market research and like the technical aspect of it because it was something that was coming up with listeners a lot. They were like, Hey, you know, you guys talk about all these cool things. You talk about technology, you talk about behavior science, but how about the nitty gritty of market research? A couple of your colleagues have joined us already and will join us along the course of the season. But before we get started, for the sake of our listeners, can you just give us a little tiny bio, a rundown of why you are so passionate about this topic and what kind of gets you up in the morning?

BL (03:13):

Yeah, so I’ve been in marketing research for my whole career, so over 20 years. And I’ve had every kind of role in marketing research. I started off as a telephone interviewer. And so I’m very passionate about the respondent experience and what we put respondents through. And that’s because I talked to people on the phone 25 years ago for sometimes 30 to 45 minutes. And then I navigated my entire career, I’ve done kind of every role in marketing research and I see bad data sometimes and I think that our industry could really grow if we’re providing more actionable, higher quality data for our clients so they can make better business decisions. And so that’s where my passion lies, is that I think our industry could really grow and deliver much better insights if we can just improve a few things.

MD (04:02):

You know, the first question I want to bring up is probably the question you get most often and it’s probably the topic that you delve into the most because everyone is kind of talking about this, which is the longterm sustainability of the survey models that we have, what to do about survey fatigue and how big of an issue is this truly for our industry at the moment? So I know lots of things to unpack in that question.

BL (04:26):

Well let me just rant away. I have a point of view on this. Okay.

MD (04:29):

I love a rant.

BL (04:29):

So survey fatigue is a passion of mine, and it's hard to measure, because we're dealing with people that are willing to take surveys, and it's hard to measure the percentage of people that aren't willing to take surveys because of what we've put them through. And so survey fatigue is a big topic I rail on, and I talk to clients about shorter surveys. We've done that for a long time. The problem stems from 20 years ago: we just moved surveys from telephone to online and didn't shorten them, and we started measuring, oh, they're 33% quicker if you take it on a desktop or a laptop. That doesn't mean it was best practice. I mean, I talked to a client the other day that had 25 open-ended questions, and that's just ridiculous. It's a 45-minute survey, and people are willing to do that.

MD (05:14):

No way.

BL (05:14):

Yeah, people are willing to do it, but that doesn't mean we should be doing the research just because you can find 200 people willing to do that. In my opinion, I would love to keep surveys to no longer than 10 to 15 minutes, but I see probably 50% of our surveys at 20 minutes or more. Trying to shorten them is super important, especially because we have a new generation of people coming up whose attention spans are shorter and who aren't used to taking 30-minute surveys. I mean, 5-10 minutes is a long survey for them. So customizing surveys so people don't get fatigued, so we can improve response rates and improve our representativeness, is super important. I'm not sure our industry is really facing that challenge head on yet. We're kind of slow to evolve to the demands of our respondents, and clients especially are slow to evolve their surveys to meet those needs.

MD (06:05):

I agree with everything you just said, but I wanted to break it down a little bit, because most people, especially people who aren't in the industry, not in the nitty gritty getting their hands dirty, tend to think of surveys in general. When they think of surveys, their mind immediately goes to transactional surveys, and I feel like if you haven't got your stuff together when it comes to transactional surveys, and if you're still going over 10 minutes, then you should just go away. Especially when you're talking to the end consumer, they're not going to provide you with more insight, especially on a transactional basis. But there are other types of research. There are deeper, more qualitative studies that do require a couple more questions. Right? So let's get into that for a second. What are some other surveys out there that, no matter how you look at them, there's no way you can make that much shorter? How can we possibly tackle these meatier studies, and how can we improve the length of those?

BL (07:06):

Yeah, so a lot of the transactional stuff is pretty easy. Every time I travel, like last week I was in Atlanta on Delta, I get the follow-up survey from Delta, I get the follow-up survey from Hilton. Those are pretty easy, few-question transactional surveys, still frustrating, but very transactional. However, most of what I deal with is more customized research. We have a client, I was just talking about them this morning, that does a lot of neuroscience and wants to put people into buckets, and they have like a 20-minute segmentation algorithm that everybody has to take with every survey. So you're taking a 20-minute survey just so the neuroscience, which very, very smart people are doing behind the scenes, can put you into a bucket, to try to understand what type of person you are and why you do the things that you do. And then they really get to the point of the survey, which is just a consumer packaged goods survey, getting your opinions on products. That's all it is, but it takes 20 minutes just to get you into the right bucket. I can't even imagine being a respondent and having to go through that. We deal with clients like that all the time. And there are studies, like attitudes, awareness, usage and behaviors, that really get deep into what you're purchasing, where you're purchasing it, where you're using the item, and they can easily be 20 to 30 minutes and go very, very deep into very customized surveys that the clients use just to try to understand the market, why people are using the product and how they're using it. It's very common, and that's most of what I deal with. That, and idea evaluation, which is another frustrating survey, at least for the respondents. Earlier in my career we would test ideas monadically, meaning respondents had just one idea to evaluate.
That's not a problem, right? I'm happy to evaluate one idea and tell you if I like it or if I would purchase it, what's unique about it, what my likes and dislikes are. But it's now changed to be more than one concept or idea at a time, sequential monadic, where you might evaluate five or 10 at a time, and that's where respondent fatigue and quality of data start to really diminish. If you're evaluating five or 10 ideas, it can be very frustrating, especially if they're boring ideas in a category that maybe isn't very exciting. I'm thinking of evaluating, like, office furniture. I mean, how exciting can that be? And I show you idea after idea after idea. It's just office furniture; it's no different from the previous six that I've seen. So that's where I really try to advise clients on best practices, to try to improve the respondent's experience as a survey taker.

MD (09:44):

Do you think that when people like submit to a study though, they already come in with a different expectation so they’re a bit more tolerant?

BL (09:54):

Yes, absolutely. We deal with online panels, and so these are groups of people that have joined the panel, that have agreed to take surveys, and they're sent an invitation. And in theory, that's a key phrase here, in theory, they're told exactly what they should be doing. They're told, hey, we have an exciting survey for you, it is 15 minutes long, we will pay you this many points for your time. Now, how accurate we are there is debatable. Unfortunately, many times we tell people it's a 15-minute survey, and a 15-minute survey for you might be a 30-minute survey for me, because I might be slower. I might have more to say. I might have to think about it. I might have a child crying in the background that I have to attend to. I might be at work and have somebody pop in. And so the 30-minute survey is where you start to get frustrated, because, hey, you told me it's 15 minutes. But these people have agreed to do this, and there's a lot of those people, and there's a lot of people taking surveys for not a lot of money for their time.

MD (10:50):

So this is where we go into like the professional survey takers. Or I don’t even think that you can name it that, but–

BL (10:58):

That's what we call it: professional survey takers. Absolutely.

MD (10:58):

Do people actually just answer surveys all day long?

BL (11:07):

There are some people that do that, yeah. They join a lot of panels. You can't make a ton of money; you could probably, you know, make a dollar or two per survey. It's not bad as a little thing on the side, to do on your day off or in the evenings if you have some downtime. There are a lot of professional survey takers that we try to identify when we're looking at the data; we try to spot people where it seems like they've taken a lot of surveys before. You could debate it, but generally people don't want professional survey takers because of the way they tend to answer surveys.

MD (11:38):

I wouldn’t want them.

BL (11:38):

Yeah. Unfortunately, they're kind of a necessary evil in our industry, in that they're ready and willing to take surveys. There are people that are really willing to give their opinions on products and services, and we need their opinions. So we want a mixture of people.

MD (11:52):

But how much do opinions actually count? And I know that this is bringing on the controversy right here. How much do their opinions actually matter versus an actual user of that service or that platform or the product?

BL (12:07):

Yeah, it really depends on the survey. Sometimes their opinions can be super valuable, and they can have quite an impact on a brand, because depending on the target, it's hard to find certain people. If you're identified as someone that drives a certain vehicle that's hard to find, then your opinion will matter a great deal, and honestly it will count more than other people's. If you're looking for a Ford Taurus or a Toyota Camry driver, you know, there's a million of them, right? But if you're looking for the Tesla owner, your opinion is going to carry much more weight, unfortunately. And so, yeah, those are very important people, and we have to try to ensure that they are who they say they are and that they aren't professional survey takers. It's super important. It's hard to do. It's very hard.

MD (13:20):

Is there some sort of, okay, this may sound a bit big brothery, but is there some sort of database that stores all of the information that survey respondents have previously answered as a way to pre segment them? So the idea occurred to me when you’re talking about the Teslas, right? So if like if you have already been pegged as a Tesla driver, could that be useful in the future?

BL (13:48):

Yeah, we call that profiling. When people join an online panel, they typically get profiled, and that is a very boring survey. What they're doing is trying to identify you: are you a homeowner? What's your level of education? What kind of car do you drive? Where do you bank? What kind of smartphone do you have? Where do you shop? What kind of products do you buy? So you'll go through this giant battery of questions, and depending on how you answer those questions, we'll target surveys towards you. If you're identified as a Tesla owner, you better believe you're going to get a lot of surveys about Teslas, and you can argue the merits of that, because maybe all they're getting now is Tesla surveys, and then they're kind of biased as a survey taker, right? Because all they talk about is Teslas. You should really be talking about a wide range of topics for best practice. But yeah, we do that. The better panels do it more with programmatic and API data and third-party data, where you're really just opting in and connecting your data rather than answering questions. Answering can be very long and monotonous, very boring and painful for respondents. Also, it changes, right? Today I may have one child, but in a year or two I might have another child. So how often do you ask those questions? I might have a new car, I might bank somewhere else, I might have different shopping behaviors, I might be in a different income bracket. You have to continuously update that, which continues the frustration. But we do that all day long, and it helps us provide a better respondent experience if we're targeting surveys towards people rather than going to just the general population.

MD (15:16):

And let’s be honest, Brian, there probably isn’t a way to make that initial profiling survey less painful, is there?

BL (15:24):

Not really, unless there's new technology that maybe will help with that. There's some pie-in-the-sky stuff I'm super excited about that maybe will help. But generally, as an industry, we ask people's opinions and their attitudes and behaviors, and so we kind of have to ask those questions in order to target future surveys towards them.

MD (15:42):

Right. So has there, in the past, I’m going to say 10 years just for the heck of it, have you noticed a decrease in people willing to even be compensated to take surveys?

BL (15:55):

A decrease in people that are willing to take surveys? Yes and no. We have a lot of attrition in survey takers, but they're constantly being replenished. The panel companies are really good at identifying websites or affiliate networks or areas where they can attract people to join their panel. The key for those companies is to increase the longevity of their members, and so it's about providing them with a good experience. Traditional research would mean that you're only there to take surveys, and you're taking surveys not really for the money; you're taking surveys because you want to give back to the products and services, you want to help shape them, and you're paid a little bit for your time, but not much. That has changed a lot over the past 10 years, in that we're trying to be more engaging. People aren't just joining a company to take surveys; you might be joining a company that provides other benefits to you. Maybe you play games while you're on there, maybe you interact with other people, maybe you get discounts on products and services, and as part of that you also take surveys and build points, and it's part of a community that you build loyalty with. That's the model that has started to trend in the past few years. Members are getting paid differently, I would say, maybe not more, but differently, and the pricing model of compensating surveys has changed a lot over the past 10 years. Some people are paid almost like it's a minimum wage: regardless of how much time you take on the survey, you're paid a minimum for it. So if it's a one-minute survey, maybe you get 50 cents, regardless of whether you even qualify for the survey; if it's a 10-minute survey, you get paid a dollar regardless. That's become a little bit of a trend as well.
I'm hopeful that attracts more people into the survey-taking business, because we need more. We don't have enough people to take surveys, which is kind of a dirty little secret in our world: there aren't enough people to take all the surveys that our clients want to pay for. And so that's why sometimes, I think, people turn a little bit of a blind eye to professional survey takers and maybe even lower the bar on quality, which is kind of sad.

MD (18:07):

I’m going to say something and you tell me if you agree or not. Okay?

BL (18:11):

Okay.

MD (18:11):

Survey sample size has always been low. True or false?

BL (18:16):

I would say false. We see really large base sizes all the time. It's very common to have a giant base size. Sometimes I question why you need to talk to 3,000 people, which seems large to me, I don't know if that seems large to you, or 10,000 people. It's very common for people to administer large-base-size surveys. I see more 500-person surveys than I do surveys of less than a hundred.

MD (18:40):

You know, last week we sent out a survey that got like 3000 responses in four hours.

BL (18:46):

Yeah, yeah, that's very common. The way we administer surveys now is almost in the moment, which is probably improving the respondent experience: we want people taking the survey in the time and place of their choosing. So if I'm at work and I want to take it on my smartphone, that's how I should be able to take the survey. If I'm at Kroger and I get a pop-up to take a survey while I'm in Kroger, that should be how you take it. Or it might be in the evening, or late at night, or in the morning before work. We're getting much better at improving when and where and how people can take surveys, and we can do them really, really quickly. That's very common, especially for public opinion and polling; you can do surveys almost in real time. It's pretty crazy, actually.

MD (19:28):

But do you think that, in general, over the past hundred years that surveys have been a thing, going all the way back to pen and paper, the ratio, or the percentage, of people that are willing to take your survey has stayed pretty much the same? Do you think there's been a drop?

BL (19:47):

Yeah, there's definitely been a drop, and a lot of that is because of trust. I saw on GreenBook, they asked how much people trust market research as an industry, and it was really low; it was like below the police. And you know, you could argue the merits of whether you should trust the police, but it's really low. I think a lot of times it's because people think of market research generally as polling, and polling might be off. And polling was off. By the way, we're doing a webinar on this soon, so this is top of mind and I'm researching it. A lot of the polls were off, like in the 2008 and 2012 elections, not just in the U.S. but in Australia and Israel and the UK and Canada. People really started distrusting polling, and marketing research felt the impact of that. It has kind of a bad reputation. We need, like, a PR person for the industry.

MD (20:37):

I think we should stop doing political polls in general and like name that industry something else. Like, we’re gonna call you political polls. Just PP. We’re MMR, you’re PP.

BL (20:50):

Exactly, because we do completely different things and we don’t have to prove to the world that we’re right or wrong because of the nature of what we do versus what the polls do.

MD (20:59):

It's scientific research. It's extremely different. But at the same time, do you think the market has also suffered from company malpractice, in the sense that companies take advantage of it and just absolutely disregard or disrespect people's time by sending them surveys that haven't been properly tailored, that haven't been properly written? Let's just throw all the questions in there, because since we're sending out a survey, we might as well get all those questions in anyway, instead of carefully segmenting the base, instead of asking only the questions they need to know. I saw some examples of that with airline companies, for example, that were making their customers go through all these qualifying questions before answering the survey, when honestly the company has access to all that information. Especially if you're an airline, you're required by law to collect so much information; they've got so much data already. So instead of using that to cut down the time of the survey, they're just being lazy about it and asking your date of birth or your age range. It's like, you know my date of birth, it's on my documents.

BL (22:08):

You touched on a couple of hot topics for me.

MD (22:11):

That was my mini rant.

BL (22:12):

Love it. That was awesome. I completely agree with you. One of them is that clients and buyers of research have a ton of data, and as an industry we're not good at answering business questions with the data you already have. We do a unique survey for everything, for every business decision, over and over and over again, instead of trying to chunk it into one survey or using what you've already surveyed. That's one thing the whole industry has been really bad about forever. The other thing you touched on is that we've abused respondents relentlessly over the past, I don't know, at least my entire career, in that we send them poorly designed surveys, or surveys that don't make sense, or we don't optimize them for every device available. I did a survey about iPhone users that you couldn't take on an iPhone, which was really crazy, right? It was among iPhone users. We ask questions over and over again. There's something in sampling called routing, and if you take a lot of surveys, you'll notice that you answer the same questions over and over again. And it's basic questions: what's your gender? What's your age, what's your income, what's your education? Then you get routed to another survey because you kind of qualify, and you answer the same questions again. Imagine how frustrating that is: hey, we want you to take a survey, your opinions are so valuable, please answer these questions, and then you get gender three times and you get income three times. As an industry, we're slowly improving that experience for people. But yeah, we have just been horrible to people, and I mean, the respondents are people, right? You and I are respondents. Everybody's a respondent.

MD (24:15):

I want to add some more accountability on behalf of the companies that are commissioning surveys, too. I think a lot of the responsibility for how they've kind of burned out their customers with surveys comes from doing absolutely nothing with that feedback, or the customer feeling like their feedback made absolutely no difference. I'm not talking about closing the loop, cause yeah, sure, closing the loop is a thing, but if you take a giant organization like Verizon that's got hundreds of thousands of customers, closing the loop might not always be possible every time a customer answers the survey. But one way or another, if that customer that painstakingly took your survey doesn't feel like the survey had any purpose whatsoever, if they finish and they're like, why the hell did I just do that? What a waste of time. They're not going to read my survey, they're not going to make the improvements I want them to make. I think that's the thing, at least for customers, especially with transactional surveys. That's the one thing that pisses me off as a customer: answering a survey and wondering why I'm even taking my time to fill out this response, because I know that sometimes the company doesn't even have a way to properly process what I wrote, because they're not going to sit there and read every answer. Right? So that's one thing that pains me as well. Okay, you know, my company Worthix, we do super duper cool NLP, and there are a lot of companies out there that have come up with, you know, revolutionary and innovative technologies to better process surveys and everything. But I still feel like companies have the obligation of acknowledging that feedback somehow. I don't really know what it is. Do you have any ideas?

BL (26:00):

Well, I used to work at a company years ago, when online research was just in its infancy, and what we had to do on every single study was take some of the data and provide it back to the respondents. One of the questions I asked them was, what frustrates you about surveys? Why would you take more surveys? What would it take? One of the number one things was feeling like my opinion is heard: getting feedback on the survey that I took, so that I know my time is valuable and my opinion matters. And so we used to do that, and it was hard to do. We used to find two or three questions in the survey, and we would respond to every single person that took it and say, hey, here's what the data looked like in the survey you took. And so, yeah, you're right that we need to do a better job as an industry of making people feel special. Some companies can do that kind of real time: you're taking a survey, you've clicked on an answer, and it tells you how you compare to other similar people taking the survey. That's so important to people.

MD (26:57):

Can I rant a little bit more about another thing that really bothers me about surveys.

BL (27:02):

Oh my gosh, yeah.

MD (27:02):

I mean you’re the perfect person to rant to. I’ve gotta be honest.

BL (27:04):

Oh please. I’m recording this too, by the way.

MD (27:09):

Another thing that really annoys me as a customer taking surveys, I've got issues with transactional surveys, okay? I think you can tell. Let's imagine I'm buying an airline ticket, and up until that moment I've had a horrible experience with that company, and then I get on the phone with a company rep, and that rep does a great job to the best of their abilities. They try really hard and do everything they can to help me solve the problem. My problem might be entirely siloed in a different department that has nothing to do with that rep. And then at the end of the call I get the survey, and it's like, please answer according to this transaction alone, exclusively this one individual that you just spoke to: how would you rate their service? I don't want to talk about this one situation. This is the only opportunity where you're giving me a voice as a customer, after all this crap that you put me through as a company. At this moment, I want to vent as a customer. I don't want to rate the representative I just talked to. I want to rate my experience as a whole, and I can't do that. I think that's a huge mistake. It's a huge opportunity that companies are missing out on, where they're being so, how could I even say this, where they're focusing on that one moment of the customer journey, and they're failing to look at the entire experience.

BL (28:33):

Right, yeah. One hundred percent agree with you. There should be some sort of, I don't know what you'd call it, but you create a system where you have the opportunity to rant about anything you want to. Maybe that's just: okay, the first five questions are about your last interaction, and then, do you want to tell us anything else? It wouldn't be that hard to create. And if that happened to you, that'd be a good example of a company listening to their customers, trying to be more customizable and offering more well-rounded feedback, things like that. I wonder, I don't know if that exists, but I completely agree with you on those transactional surveys. It's very, very frustrating, and a lot of times they're also delayed. Like asking about the time you spoke to an AT&T representative, and it was on December 14th. I'm like, I don't remember December 14th. I barely remember this morning. I don't remember the interaction or how well it went. Those are very frustrating to me as well. I can pile onto your rant.

MD (29:23):

Sometimes I’ll get a survey like a short one on like my Facebook timeline or something like that. And it’s like, do you remember seeing an ad from this company? No.

BL (29:35):

Those are the worst.

MD (29:36):

So when it comes to those, I'm like, okay, you really have to be more timely; it's more time-sensitive than that. So I'm thinking here, maybe there's a way to reach customers at each touch point. You know, in general, when a company is creating a relationship with their customer and wants to listen to their customer, which is great, I mean, that's come so far from the past 20 years, when companies couldn't care less about their customers at all. What if there was an option to opt in at certain moments of the journey? Like, do you want to answer a survey? For yes, press one; for no, press two. And then you just get these little prompts along your journey, so that you get to answer the survey whenever you want. And then when you choose to answer the survey, instead of it being transactional, it can be holistic. Not holistic in the sense where I'm asking about every single transaction, but just understanding: is there anything going on that we should know? How are we doing in general? The thing is that, as a customer, if something's going wrong, that's when I'm going to take the time to answer. If everything is great, then no news is good news.

BL (30:52):

Maybe, this might be a crazy idea, but have you been to the airport where you evaluate the restroom and you’re coming out of the restroom and there’s just the, is it clean or is it dirty? And that’s pretty much all you evaluate. It’s quick and easy. Right? And that’s a good way of getting feedback of, Oh, we need to send someone to the bathroom to clean it. That’s really pretty much the only reason that exists. What if we could put a system like that like everywhere in life? Like if you’re in the grocery store, like my deli experience was good. Oh, but now I’m in this aisle and you’re missing some items on the aisle. I’m unhappy here. And then my checkout time was way too long, so I’m unhappy here. What if you could put a system where you’re constantly just evaluating everything in life?

MD (31:30):

Yeah. But what if you can just do like a good or bad? Because check this out: when you’re leaving the bathroom at the airport, first of all, that touch screen is disgusting. I’m so terrified of touching it, cause you’re walking out of the bathroom and you’re like, ew.

BL (31:44):

I’m a germophobe, so yeah.

MD (31:44):

So but like instead of asking if this was clean or dirty, just ask, is there a problem or was everything satisfactory? You know, cause that way I can just say like problem, then you send somebody. It doesn’t matter if the faucet is broken or if there’s tissue all over the floor or the bathroom is dirty. It’s like bad and good. You know, and then, and then from there then you can dive deeper. So like if it’s something that requires more feedback or more insight then you can ask the person, Hey, do you want to talk about it? Right. But I mean some sort of prompt to give the person the opportunity. What I think it is is the following. I think that you need to give people the possibility to share their voice when they need to. So that channel has to be there. It has to be available, but if they have nothing to share, then don’t burn them out by asking them something unnecessarily. So provide the channel so that the customer knows you’re constantly there and you’re constantly ready to listen, but make it really, really easy for them to opt out of that situation and maybe opt in somewhere in the future when they do have a problem or when they want to share a compliment, you know? So I think that providing customers with the opportunity to know that you’re always listening but then make it really, really easy for them to answer or not. That for me might be a pretty good upgrade.

BL (33:08):

That would be a great upgrade. Especially because I don’t know about you, but I try to always give feedback and a lot of times I’ll go to the website and try to, I just want to give some feedback and it’s impossible to find the ability to contact someone or give feedback. It would not be that hard to put up a system of feedback on a website. That should be a model that most, especially service companies, utilize. I think, by the way, I think we’re solving all of the worst problems, right?

MD (33:31):

I think so too. I hope everyone listens to this episode, my God. But think about it: the worst thing possible is when you have something to say and nowhere to share it. Like my friend Nate, Nate Brown, he tells the story of having a problem buying a chicken sandwich at a chicken place, which shall remain unnamed. And when he tried to complain, the person that he spoke to was like, I don’t know what to do with your information. She was not equipped at all to deal with that information, and she had absolutely no way to make sure it reached the right people. But if Nate knew that there was a kiosk or some way for him to immediately give that feedback and know that it was going to go somewhere, great. There are plenty of times that I just want to let people know there is a problem. I don’t want to complain. I don’t want a gift card. I don’t want to be compensated. I just want to say, hey company, you know what? I like you. I like doing business with you, but I’ve got this issue and it’s kind of pissing me off. Can I please tell you about it so you can fix it and I can continue to have a good relationship with you? Please and thank you.

BL (34:51):

Yeah, it’s basically the comment card for the modern age. That’s what you’re talking about. That’s all it is.

MD (34:55):

This has been the most cathartic episode.

BL (35:03):

I feel like we’re getting angry.

MD (35:03):

Like it’s almost like a brainstorming session, right? That’s the feeling that I’m getting.

BL (35:08):

Yeah, me too. I think we are again solving a lot of problems and I’m excited about it.

MD (35:13):

Yeah. And I’ve got like 52 other questions to ask, so I’m going to have to have you back on the podcast sometime to get through them. But finishing up, how can people hear more from you? I mean, honestly, we talked about a whole bunch of stuff, but I’m sure you’ve got a lot more. How can people find you?

BL (35:32):

Well, they can find me at emi-rs.com, that’s our website. I’m on LinkedIn; I’m easily accessible. I also do a podcast called Intellicast. I actually did 15 minutes just last week on Publix, an Atlanta brand; I had an amazing customer service experience at a Publix in Atlanta last week, so I brought some people on to talk about it. The passion that you heard today around surveys and experiences, that’s kind of who I am. So I’m really glad that you had me on to talk about it.

MD (36:00):

Awesome. And yeah, go give Brian’s podcast a listen, because it’s really fun and it’s not boring and it’s not super duper structured. Sometimes you do some totally weird, different things, right?

BL (36:12):

Yeah, we did a taste test once on the podcast. That didn’t translate very well to the podcast environment, but we do stuff like that all the time.

MD (36:21):

I imagine it’s like chewing sounds or something.

BL (36:21):

We recorded for three minutes inside the Publix last week. So yeah,

MD (36:27):

I love me some Publix. Well, you know, I’m a fan of Publix.

BL (36:30):

You should be. They’re amazing.

MD (36:32):

They are. They really are. Brian, any social media handles you want to share? Are you active on social media?

BL (36:37):

I am not personally, but our company is @EMI_research. So yeah, follow us. We’re kind of loud and we talk about the topics that you heard here today a lot on social. So yeah, give us a follow.

MD (36:48):

Tons of fun. Thank you so much Brian for coming on and I’ll have to have you back.

BL (36:52):

Thank you. Appreciate it. Good times.

Subscribe to our Podcast about Customer Experience – Voices of CX

Mary Drumond

Mary Drumond is Chief Marketing Officer at Worthix, the world's first cognitive dialogue technology, and host of the Voices of Customer Experience Podcast. Originally a passion project, the podcast runs weekly and features some of the most influential CX thought-leaders, practitioners and academia on challenges, development and the evolution of CX.
