Where Conversion Rate Optimization Meets Human Behavior: AJ Davis



This week on the Voices of CX Podcast we hosted AJ Davis, Founder and CEO of Experiment Zone, to discuss the utility of Conversion Rate Optimization, and how it bridges the gap between economics and human behavior. Understanding what really goes into customers’ decision-making is a lot more complicated than simple AB testing; it might get you by, but it won’t tell you the whole story. That’s where AJ comes in.

About AJ Davis

AJ Davis is a Conversion Rate Optimization (CRO) Specialist. She’s the founder of Experiment Zone, a company that helps online businesses grow their revenue by improving the user experience of their website using scientific methods. Before starting Experiment Zone in 2017, AJ led optimization strategy for Fortune 500 companies during her tenure at Clearhead. She was also the lead UX researcher on the Google Optimize product.

Connect with AJ Davis

Follow AJ on LinkedIn

Check out ExperimentZone.com

Connect with the Voices of CX

Follow Worthix on LinkedIn
Follow Worthix on Instagram: @voicesofcx

Follow Mary Drumond on LinkedIn
Follow Mary Drumond on Twitter: @drumondmary

About Voices of CX Podcast

The Voices of CX Podcast covers all things business strategy, customer decision insights, empathetic leadership practices, and tips for sustainable profitability, with a little geeking out on behavioral science, A.I., and other innovations sprinkled in here and there. The guests span multiple industries, but all of them have years of experience to bring to the table.

📩Got something to say about CX or want to be featured on the show? Let us know! Email the Producer ([email protected]).



Hello to our listeners and viewers. Welcome back to another episode of the Voices of CX Podcast. Today I am joined by AJ Davis, who is going to tell us some really interesting things about user experience and data. This is actually quite interesting for me, because she's got a lot of expertise that I don't have, so it's going to be a learning experience for you and for me as well.

So AJ, go ahead, introduce yourself. Tell our listeners who you are, what you're passionate about, and what you do.


So the jargon that describes what I do is conversion rate optimization specialist. What that really means, and what we try to help our customers with, is figuring out what it is in the customer experience or journey that's getting in the way of somebody who's interested in your company or your product, but is hitting some sort of friction or problem on the way to actually becoming your customer.

So we employ a bunch of different data sources so that we're not just guessing at what's getting in their way, but know it for a fact, and then we make choices and validate in the real world that they actually work.


And how did you end up in this field? In this market? Did you plan to do this when you were thinking about what you wanted to be when you were a kid? And we’re all dreaming about being astronauts and firefighters, and you’re like, I want to be a conversion rate optimizer.


I don’t think that field existed even remotely when I was that age. You know, I think I’ve always been really curious about how we as people move through the world and make decisions. And I did start my degree in economics, so I was very interested in how people spend their money, make decisions, how we can model that.

And then it turns out we were wrong. A lot of the assumptions that go into economics are just based on assuming people are rational and predictable and all these things. So the field of behavioral economics was emerging when I was in school, which was really interesting. I had a chance to have coffee with Dan Ariely before he blew up; now you have to pay tens of thousands of dollars to spend time with him.

But it was so inspiring. I wasn’t really sure what to do with that at the time. And so it kind of just became the seed that got planted and I went into user experience research. It was very compelling and interesting to talk to people and figure out, you know, their friction points with technology to give that back to teams to solve those problems.

But there was still a gap in that. It wasn’t clear to me if the things we learned in the lab carried forward or not. And so kind of just by chance, I ended up in this Google sprint week where they brought people from all the different teams together to brainstorm and come up with new solutions for customer problems.

And I ended up with a group that ended up being the team tasked with building the product, Google Optimize. So from like pen and paper, like scratch pads, coming up with ideas, brainstorming, to the product launch. I was their head user researcher and I had a chance to see real people using experimentation platforms, and it solved this question I had of, How do we really know that our idea worked?

How do we know that all the work we did before launching our products or the experience holds up in the real world? Because of that little piece of behavioral economics in my head: we're not always sure it holds up. And so I basically came into contact with conversion rate optimization. It brought together the quantitative work from early in my career and the qualitative work from the middle of my career.

And it's this perfect intersection of all those things: user research, customer needs. It's also really interesting because I get to learn about surprising human behavior every day.

The intersection between economics and consumer behavior


You know, next time I'm in a social situation where I have to make small talk, I'm going to say, I interviewed someone who once had coffee with Dan Ariely. This happened, and now I'm going to nerd out about it. This is really pretty amazing. But the interesting thing for me is that a lot of people don't understand the intersection between economics and consumer behavior, which is odd to me because I'm so fascinated by the industry.

And, you know, here at our company, the product was created by a psychologist and an economist together. So I have witnessed firsthand how a social science and a way of thinking about numbers and the economy translate so directly into understanding customer decisions, and into explaining things like churn, loyalty, the success rate of organizations, and customer lifetime value.

And when it comes down to it, taking a good hard look at what motivates us and what drives our decisions is so important in understanding how to tailor or create a better experience for customers in the first place. And that's exactly what your job is, I'm guessing: understanding every single route, every single path, and what leads to the best possible conversion for a customer.

Is this only digital, or does it apply in the real world as well?


We've applied the concepts in the real world as well. The data is not as easy to get, so it's more expensive to apply the concepts and principles there. I've done some work in my own career out in the field. One of my favorite projects in grad school was looking at the emotional reaction to seeing a product's ingredients and description on the physical product, on a shelf versus… and then pulling that data to see what triggered those emotional reactions.

So there are some really fun, sci-fi-type things you can get into, but in practice, I think a lot of companies aren't even close to that. They just need to be thinking about their customer in the first place. Right? So often there are a lot of assumptions that aren't being considered or tested as assumptions; they're assumed to be fact.

And when we get pulled in and take a closer look, turning those strands over, we often see that those assumptions are misdirecting our decisions. So just coming back to fundamental things, talking to customers, observing them, seeing them use the digital product or physical product, will teach you all sorts of things that are very different from how most companies operate.


And when you look at yourself in the mirror, AJ, what do you see? Do you see a conversion rate optimizer? Do you see a scientist, do you see a researcher? What do you consider yourself to be?


That's a great question. I've actually worn all those titles across my career; I was a cognitive scientist at one company and a researcher at another. So I think all those things are true. But in some ways, it's less a title and more that I feel like I'm advocating for customers. Right? We all live in a society where we exchange goods and services, and we want to make sure that goes really smoothly.

And the other thing we're doing is making it possible for small and medium companies to compete with the players that have been around for a long time. So I like to think of myself as an advocate for small and medium businesses, helping them grow and compete. I mean, how many times have we all been on a flight, for example, where there's all this friction in the process and you just want to tear your hair out before you even start your vacation?

And so I think about those kinds of scenarios of we have really good intent to have a positive experience and how can we actually achieve that as a business so that our customers are like, Oh, this is what I wanted.

Improve the customer journey


Yeah, that's so amazing. And it's so intertwined. Sometimes, when we're in customer experience, we get tunnel vision and think that the only people who touch the customer so closely, who interact so directly with or influence the customer, are the people who are somehow in charge of the customer experience per se.

But when you look at the overlapping fields inside of corporate organizations, you see how much is actually influencing that decision. That's why it's really interesting to me when I talk to designers, for instance, who actually map out customer journeys: they see all the different friction points and they're able to get that bird's-eye view of all the little details.

And it’s so intricate and there are so many different things, especially if you have a really complex business model or a really large organization with multiple customer personas that have multiple journeys and multiple markets. And it’s always so amazing to me that people can look at that and, and actually architect new ways to do things to improve the customer journey.

For me, it’s fascinating. It’s absolutely fascinating. Now, when you are sitting down with a new client and you’re starting out, what is the first thing that you approach when it comes to understanding what could potentially be the issue with whatever it is they’re doing?


Hmm. Yeah, it's a good question. As you mentioned, there are these big things in the customer journey: you can make huge changes to that journey, and it can be easy to understand when you measure it, but it can be a really dramatic change. We actually like to walk before we run, because we've seen over the years that seemingly small changes can hugely impact revenue.

Simple message changes, like going from listing all the features to highlighting three differentiators, we've seen lift conversion by over 100%. We've seen email sign-up field placement matter so much that you can drive 600% more emails, in a recent AB test we just ran. So what I like to do is start with our goals: we always need to know what we're trying to change or improve.

What do we know to be problems, or what do we assume to be problems? Because before we go down the path of really digging into everything, we want to learn from the subject matter experts at the company: what has your customer service team said? What kind of feedback do you get from customers? Where are people dropping off?

We also look at the analytics and map that to the journey, to say: across the journey, where does the data tell us there are pain points? And typically from there, we can already start to identify some hypotheses of things we want to learn and test. While we're running those initial tests, though, we are taking that deeper look. We like to go layers and layers deep.

We do usability studies, survey customers, do interviews, and build a ground-up understanding of what makes the customer tick and what's causing the pain points and friction that keep them from going from visitor to customer. So the short and long of it is: find your goals, align on what those are and what we're working towards, identify the assumptions, figure out what pain points we think we have or where we need to investigate further, and then test those solutions in the real world, so that we're not just guessing.

Working to deliver the best customer experience


Yeah. Now, you and I know that when you're putting in this effort and working with organizations, it is to truly deliver the best experience possible. But it also feels like there's a lot of power being wielded in understanding user experience, and I imagine that it can be seen by some as a form of manipulation.

Is this something that you encounter, people looking at your profession from the outside and saying, oh, you just manipulate people to try to get them to buy more?


Yeah, I never hear it from people in our industry right now.


Outside of it, though?


Outside the industry, yes, but it's something we're keenly aware of. You know, I think one of the things that can happen is you can have disruptive, negative experiences that kind of trick people into doing things. If you think about certain websites that might be used by certain demographics, you can see even in the layout that there are very deliberate design choices to trick people into clicking on ads and things like that.

So the way that I like to think about it is we’ve got to align on the values of the company and make sure any design choices we’re making reflect that. So just because our goal is to get more email signups or to get more orders, we don’t want to trick people into giving away information or make them feel like they have to place an order.

So, you know, one of the things we like to think about is: how do we help the customer achieve what they're there to do? How do we make it easier to understand? How do we remove any doubt? As opposed to wanting 100% of people to do it. Because at the end of the day, if we wanted to trick everyone into doing something, we would have a big button that said "free" that somehow automatically collects your payment information, or spam, or something.

So we absolutely don't want that. But there are some gray areas where we have to be very careful and evaluate ideas based on the values of the company and the longer-term reputation that we have to protect.

CX: Free trials are not always good


You and I talked a little in our call, and I told you this story, but I'm going to tell it to our listeners as well, because it's so interesting how your profession has such a strong influence on even the upbringing of the younger generations. My child wanted to download an app, and it was free, but as soon as she tried to open the app, it wouldn't let her continue until she put a credit card in. My response was to try to educate her on the different traps we see on the web or in applications that try to get us to pay without wanting to.

And I found myself having to explain that, hey, free trials are not always good, and there's a reason they're giving away a free app. I had to educate my children that there are no free lunches, that these are corporations trying to make money who have to find a way to make it, and to have her actually sit in the driver's seat and think: hey, if you were running a business and had to make money, how would you build that into the product you're making?

Is this something that you find yourself also feeling almost a responsibility to educate people on? Like, hey, when you see this on a website, don't do that. Or, you know, sometimes even with our parents, who perhaps didn't grow up on the Internet and struggle with things like that: don't click there, don't do that, think about this.


Oh, like websites where there's the button to convert something from one format to another, and then right below it, an ad that looks like a button and takes you somewhere else. You're like, oh no, that's terrible.

User Experience: What’s the Goal?


What really gets me, yeah, before you change the subject, is when you go onto news websites and they're like, watch the video here to see what happened. And then you watch the video and it's an ad, and the story is all the way at the bottom. It's terrible.


Yeah. And I think that's why we can't just stop at metrics, right? If we're just looking at how many customers we get, or how much time someone spends on the page, we fail to account for: what is the customer's goal? What's our goal? Where do we meet in the middle on that?

And, you know, coming from an economics background, it serves both the customer and the business to have a product that somebody's selling and somebody's purchasing, because it's solving a real problem for them. And I think the less-gray area of bad practice is when we're trying to trick someone into buying something that they don't need or want.

It doesn't solve a problem for them. It doesn't add value to their lives. And that's where I think we see a lot of these dark patterns in UX: when it's really not solving something the customer needs. It might be leaning into some things, like the many games that try to get people addicted, as opposed to providing entertainment, enjoyment, or something qualitatively better.

And again, we've chosen not to work with folks like that, and to make sure the problems the companies are solving are ones they can articulate, that align with their goals and with real customer needs. But it can be tricky out there. And, not so much professionally but personally, when I'm with a family member who maybe isn't in tech, I definitely find myself trying to give them that: hey, here's what this means; hey, there's that X over there, that means you don't have to do this. Because so often people just do the thing right in front of them.

Digital experiences and data vs unethical decisions


Yeah. It's interesting. It reminds me of three authors who wrote books with absolutely amazing intentions that ended up kind of being corrupted. One of them was Influence by Robert Cialdini. That book was revolutionary; it talked about social proof and how social proof influences people's decisions. And while it was groundbreaking when it came out, ultimately people started getting upset and thinking it was a form of manipulation.

We had the same thing with B.J. Fogg, and the last one is Nir Eyal, who has been a guest on this podcast, with his book Hooked, which he wrote for very specific reasons. Ultimately, we saw many cases in which that book was used as almost a guideline or manual to get people addicted to unhealthy behavior.

And he himself wrote a follow-up book about how to get away from technology and how to break those patterns, because he saw his work ultimately being used in so many negative ways against users. And, you know, I'm sure everyone has a line, let's say, that they don't cross because it goes into the unethical or into dark patterns.

What is that line? What would you consider unethical and how do you stand by that in your business and in your practice?


Yeah, what I like to do is take the digital experience and make it analogous to something in the real world. So if we were an arcade company, say a cool bar with a bunch of arcade games, it would be totally fine if people came in and played games on a Friday night and just relaxed with their friends.

But if we were to lock the door, or just charge their credit card every hour because they're there, well, there are practices that exist in the digital world that start to feel uncomfortable when you apply them in the physical world. So the most useful exercise for me is just: how does it feel if this were in the real world? If we're a production company, like a Netflix or a YouTube, and we have these videos people watch, hey, that's analogous to somebody paying to go to the movie theater or seeing something physically in the real world.

But we wouldn't want them to just sit in the theater and watch movies all day. Similarly, I would apply that to how people order things, buy things, or subscription models. You want people to make an intentional choice; you want them to understand what their choices are and why they're being presented with them.

And there are little nudges that can influence what people choose. And I don’t know if those are bad or good on their face. Right. Again, it comes back to that real-world analogy. There’s a classic example of showing one choice for a subscription versus three choices and how that shifts how much people are willing to spend or to do.

So there are some psychological aspects that we aren't aware of in ourselves and that do exist in the real world. And that, to me, is more of the gray area of what is ultimately good or bad for the person. But if it's supported with good policy, you can cancel, or there's customer support, or other things that can help you get out of it.

The escape clause; it's the undo in UX design. That, to me, feels like a much easier thing to experiment with and to understand, as opposed to something where you've put your credit card in and now, forever, any button you press will just charge your card.

Conversion rate improvement: tools and techniques


That's such a great analogy. It made it so clear to me; you should totally keep using that one. The arcade comparison was brilliant. Now, when you're helping clients improve their optimization, what is your main tool for doing so? You talked about qualitative and quantitative resources, and you also talked about AB testing. How do you use these things to understand more about what drives customers so that you can improve the conversion rates of your clients?


Yeah, I think the primary tool is AB testing. What we want to do is have a really thoughtful idea for a solution that we want to put in the world, and test it against what's there already, because that's going to give us a true understanding of: did it work, did it not, and what else did it affect? So that's the primary research method that we use across any of the changes we're making, because we want to know for sure what happened to this customer set when this change was introduced. But we complement it with a whole bunch of other inputs as well.
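As a rough illustration of how an AB test answers "did it work or not" (a generic statistical sketch, not Experiment Zone's actual methodology; the function name and numbers are made up), a two-proportion z-test is a common way to compare a variant's conversion rate against control:

```python
import math

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: did variant B convert differently than control A?

    conv_* = number of conversions, n_* = number of visitors shown each version.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)                # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    lift = (p_b - p_a) / p_a                                # relative lift vs. control
    return z, p_value, lift

# Hypothetical example: control converts 200/5000, variant 260/5000
z, p, lift = ab_test_z(200, 5000, 260, 5000)
print(f"z={z:.2f}  p={p:.4f}  lift={lift:.0%}")  # → z=2.86  p=0.0042  lift=30%
```

A small p-value says the difference is unlikely to be chance; the lift says how big the win is. Real platforms layer on sample-size planning and corrections, but this is the core comparison.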


That’s great. Now when it comes to doing the research and trying to understand from users what their perceptions are, what their feelings are when they’re performing certain actions, where do you pull that data from and how do you make sure that you’re pulling the right data at the right time?


Yeah, good question. Usability testing and interviews are two of our go-tos for really getting the qualitative side. I like to think of it as the what and the why. An AB test will tell us what worked or what didn't. And oftentimes something will surprise us and we can infer what might have happened.

But sometimes things happen that we don't understand at all. We were so confident, we had so many signals it was going to work, and then in the real world it didn't hold up. Going to an interview or a usability study gives us a chance to explore that why, and to really dig into how our customers are responding: not just how we perceive it as experts, but how the target customers or real users respond to that change, that design, that new pricing, whatever the thing is that we're talking about.

And I find that those unlock more aha moments than anything else: wow, I thought very differently than my customer. It's that shift from the economic mindset of "this should obviously work, this is rational" to the behavioral side of understanding what they're really responding to. Because sometimes they don't even notice the change you made, or sometimes it's confusing, and you don't understand that without talking to them directly.

When Surveys and AB Testing Make UX Worse


Yeah. I mean, has it happened? I imagine so, because I've seen it happen so many times: I know a company has done their research, I know they've done their homework, I know it probably took months for them to create an update. I'm sure they tested the hell out of it and did all the AB testing, and it comes out and it's terrible. It absolutely destroys the entire experience. How does this happen?


It happens way more often than it should. One of the most common reasons companies hire my company is because they've redesigned their website, with all these promises coming from data, or from collaboration, or from experts, saying: throw the whole old experience out, bring this whole new one in, and it will be so much better.

You'll get a 20% conversion lift, the site will be faster, et cetera, et cetera. There are a lot of promises, and in part they can be informed by good research, but you also start to become blind to what you didn't look at. If you have a system you've thought about a lot, where you've had a chance to respond to problems over the last five or ten years, and then you throw it away to put something new in place, there are inevitably going to be some gaps that you didn't anticipate.

And that's very often where the missteps are. You can have a perfectly laid-out customer journey and overhaul it all, but a better choice would be to roll out a little bit at a time, to see how it impacts things, as opposed to throwing away everything, starting fresh, and then scrambling because you've invested a lot of money into something that hasn't held up.

Decision making without bias


Yeah. Now, when it comes to pulling those signals, as you said, what are some of the best ways to verify that you're pulling the right signals, or that you're diversifying enough, so that when you're making a decision it's a fully informed decision and not something based on your biases?


Well, the first thing is to not just be in a conference room sketching with people and assuming that's enough. Right? A lot of companies will rely on one or two data sources, maybe depending on who leadership is or what the history of the company is, and sometimes they're just missing opportunities for other data inputs. So, you know, the classic thing to look at would be your analytics.

What are people doing? Google Analytics will show you the pages they go to, where they're dropping off, what they're clicking on or not. That's a good data point. Behavior tracking is another data point that's complementary to that: are people scrolling? Are they moving through? Are they getting through the experience and seeing things as you'd expect them to?

And then on the qualitative side, it's pretty typical to think about surveys, usability studies, competitor analysis, and interviews. Those are probably the most common ones. And a lot of companies are missing the chance to bring those things together; they're kind of in silos. Maybe the research team's doing something and the marketing analytics team's doing something different, and they're not bringing it together.

But some of the most powerful insights come from other people in the company, and this is something that, as companies grow, they miss: the chance to bring the team together around what they collectively know about the customer. So one of my favorite things to ask our customers when we first start working together is: what has your customer service team told you? What do they know about the customer?

What do they assume about the customer, what is their mental model, and how is that different from yours? Because you're going to start seeing some differences. They're dealing with the customer in a moment when they might be frustrated, have a pain point, or have had expectations that weren't met, versus, you know, the Web team, which might be more at the top of the funnel, where the customer is: oh, I'm so excited about this product.

I'm very enthusiastic, I haven't used it yet, versus: it didn't quite hold up to those expectations. Another channel that I love to think about, I said customer service, is the sales team. The sales team is the first person talking to the customer. And what's really neat about sales teams, especially as it relates to marketing, is that they know what closes the deal, right?

They know what they talk about first, which maps to your landing page or that first message, what's splashy and draws in the attention. But they can also tell you why somebody ends up buying or not, and often that's something we just lose on our website, in our emails, in our marketing and messaging in general.

And then lastly, one that I don't think a lot of people think about: in-store associates. If you have a physical store as well as a digital experience, talk to the people who are in the store and have met the customers face to face. They walk them through the products; they show and explain them themselves, without the context of marketing.

Those are gems: insights you may not have thought about and that a lot of companies have never taken a look at, like working with associates as interview subjects to understand how they even narrate or talk about the business.

Diversify before decision making


Yeah, that's so interesting. So there are so many ways to diversify before making your decision, and also understanding that even if you diversify, even if you do all your research, at the end it could still be a problem. And you still have to have the humility to check back in, listen, and try to make things right. Is that a difficult step?

And is that something that you partake in, that step of going back and saying, hey, we thought it was going to work, but it didn't work?


And that's our job, right? Our job is to take all those data inputs, test what we come up with as a solution in the real world, and then say: did it work or not? I love losing tests. I think you win so much from losing, because you get the opportunity to examine what went wrong in the investigation or in the specific solution.

Really often what happens is that we understand the pain point or the customer need, but we get the solution wrong. The gap is that we didn’t solve it the right way for a problem we know exists. So we should keep tackling it with different solutions, looking at different ways to accomplish the same thing we wanted to do in the first place.

And I think for a lot of people it’s hard to distinguish between the solution and the pain point. So the more we can separate them, the better: these are the true pain points, and these are solutions we don’t know for sure will work, but they’re backed by evidence that the pain points are real and important.


Is there a foolproof way to get it right? Like, “If you do this, there is a 100% chance this is going to work”?


You know, I can’t imagine there is, but you can get closer, right? The more signals you have, and the stronger they are, the more likely you are to land on a solution that meets real needs and real pain points and can really change the experience. But time and time again, there are moments where I’m surprised, where something that so obviously should work doesn’t, or changes things in a way I didn’t expect.

We ran a test a few years ago where we were trying to get more email sign-ups, but we ended up driving 11% more orders for the business, which is not at all what you would expect. We wanted to make it easy and convenient to share your email, but the design treatment actually drew attention to the company’s core value proposition, because the email field was placed close to it.

And long story short, it was just one of those moments where I had no idea that would change. It would never have surfaced in any research. But in the real world, all these elements interact with each other. So by isolating them in a test, you can know for sure what’s going on and what you’ve changed, and then you can unlock all these other ideas and opportunities.
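To make a lift like that 11% concrete: whether it is statistically trustworthy depends on the traffic behind it, which is usually checked with a two-proportion z-test. Here is a minimal sketch; the visitor and order counts are purely illustrative assumptions, not numbers from AJ’s actual test:

```python
from math import erf, sqrt

def two_proportion_ztest(orders_a, visitors_a, orders_b, visitors_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a = orders_a / visitors_a
    p_b = orders_b / visitors_b
    pooled = (orders_a + orders_b) / (visitors_a + visitors_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Illustrative only: 20,000 visitors per arm, a 4.0% baseline order rate,
# and an ~11% relative lift in the variant (4.44%)
z, p = two_proportion_ztest(800, 20_000, 888, 20_000)
print(f"z = {z:.2f}, p = {p:.3f}")  # roughly z = 2.19, p = 0.029
```

With these made-up numbers the lift just clears the conventional p < 0.05 bar; the same 11% lift on a tenth of the traffic would not.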

UX: Solutions for small businesses


Yeah, that’s so cool. This is such a fascinating science. So what would you recommend to maybe individuals who have less volume, where A/B testing isn’t really viable because the sample size simply isn’t there?


Mm-hmm. Yeah. I think all these other methods are great inputs, and you do need to know what’s happening. So you’d still need your analytics, you’d still want to do click tracking, and then you’ll just lean more heavily on the qualitative. The more signal we can get, the better. But if you don’t have the traffic, it doesn’t make sense to spend the effort on testing at the moment.

The other thing that’s sort of in between: there are companies that have very little traffic and companies that have lots of traffic, but in the middle are companies that have some traffic yet aren’t able to test every tiny change. That’s when we want to take bigger changes and group them together.

So we can say, here are eight changes we’re going to group together. We won’t know which of the eight things drove the change, but it will be a big enough change that we’ll start to see it in the data and in the way customers respond. So it kind of depends on where you are in that journey and how your traffic corresponds with it.
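The intuition about grouping changes can be put in rough numbers: the smaller the lift you hope to detect, the more visitors each variant needs. A back-of-the-envelope sketch using the standard two-proportion sample-size approximation (the 3% baseline rate and the lifts are illustrative assumptions):

```python
def visitors_per_arm(p_base, rel_lift, z_alpha=1.96, z_power=0.84):
    """Approximate visitors needed per arm to detect a relative lift
    in conversion rate at 95% confidence and 80% power."""
    p_var = p_base * (1 + rel_lift)  # variant's conversion rate
    variance = p_base * (1 - p_base) + p_var * (1 - p_var)
    return (z_alpha + z_power) ** 2 * variance / (p_var - p_base) ** 2

# Illustrative 3% baseline conversion rate
for lift in (0.02, 0.05, 0.10, 0.25):
    n = visitors_per_arm(0.03, lift)
    print(f"{lift:.0%} relative lift -> ~{n:,.0f} visitors per arm")
```

At this baseline, a 2% lift needs over a million visitors per arm, while a 25% lift (the kind a bundle of eight changes might produce) needs fewer than ten thousand, which is why grouping changes makes testing feasible for mid-traffic sites.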

And then I think the other thing is to bring in experts, or bring in someone with an outside perspective. To pitch ourselves for just a moment: we do expert reviews for smaller companies, because we’ve done thousands of experiments and can apply those learnings in another context. We have less certainty that it will work than if we ran an A/B test, but it’s still better than going at it on your own.


That’s great, AJ. It’s been so educational for me. There was so much that you said that I was already familiar with, especially the aspects of behavior and behavioral economics, but so much of the practical part I didn’t know. So this has been a really exciting episode for me. Thank you so much for coming on to share, and I’m going to give you a chance to tell our listeners and viewers where to find you if they, like me, really enjoyed it and want to learn more, or maybe find your company.

Perhaps they have a need that you could be the solution to.


Yeah, absolutely. So my company is called Experiment Zone, and we’re at ExperimentZone.com. For those who aren’t really sure about conversion rate optimization or are just getting started, we have some great free resources, like checklists you can go through on your own. And we’re always open to doing a call and just chatting about what your problems are and whether we would be a good partner to help you. You can find all of that on our website.


That’s awesome. Thank you so much. Folks, thank you for joining us once again. It’s great to have you with us; tune in next time on the Voices of CX Podcast. AJ, thank you so much.


This was a lot of fun. Thanks for having me on.


Mary Drumond

Mary Drumond is Chief Marketing Officer at Worthix, the world's first cognitive dialogue technology, and host of the Voices of Customer Experience Podcast. Originally a passion project, the podcast runs weekly and features some of the most influential CX thought-leaders, practitioners and academia on challenges, development and the evolution of CX.



Join our email list to be notified when new episodes air, and get them a day early!



Voices of CX Podcast

110k plays and 100+ episodes later, we're still all about the Customer Experience

Subscribe to the Voices of CX Podcast to hear from the biggest names in CX: Joe Pine, Jeanne Bliss, Dan Gingiss, Ian Golding, and so many more.

Get notified when new episodes air. 

Where one good thing ends, another begins! Don’t worry, the podcast won’t change as much as you think.

The Voices of Customer Experience Podcast has changed its name! From now on, we’re The Customer Value Alignment Blog and Podcast.

At CustomerValueAlignment.com, you’ll find the educational and informative blog content that you’ve grown to expect. Whether you need a refresher on the basics, a deeper dive into Customer Value, or helpful content to share with your team, you’ll find it there.

Subscribe to our newsletter or follow us wherever you get your podcasts. If you follow the Voices of CX already, you don’t have to change anything – we’ll be on the same feed as before!

Thanks for sticking with us. Stay tuned for Season 11!