In today’s episode of the Internet Marketing Podcast, Andy is joined by Johann Van Tonder, COO at AWA Digital and co-author of the book ‘E-commerce Website Optimization’, recently published by Kogan Page.
On the show, Johann discusses:
- Some of the CRO (Conversion Rate Optimisation) mistakes he’s made and how you can avoid them, including the temptation to be emotionally invested in the outcome of A/B testing
- What sites with a low level of traffic can do when it comes to A/B testing
- Why A/B testing is controversial
- The best ways of listening to your users, including usability testing & remote moderated usability testing
- His book ‘E-commerce Website Optimization’ and who should be reading it
This week, for our listeners, we’ve got three copies of Johann’s book to give away. To be in with a chance of winning, simply fill out your details here!
Plus, as usual, Johann provides his top tip/key takeaway.
If you’d like to connect with Johann, you can do so on Twitter here.
Full Transcript of the Show:
Andy: Brought to you by Site Visibility at sitevisibility.com, this is Internet Marketing. Now before we start, we have a request – if you are genuinely enjoying what we do here on the Internet Marketing podcast then would you please leave us a review on iTunes or your podcast app, because it really helps us to grow the podcast and ensures that we bring you great marketing tips and advice each week. Today I’m joined by Johann Van Tonder, Chief Operating Officer at AWA Digital and co-author of the book ‘E-commerce Website Optimisation’, recently published by Kogan Page. Johann, how are you doing?
Johann Van Tonder: Hey I’m doing very well, how are you doing?
Andy: I’m very, very well indeed. You’re down in South Africa, is it Cape Town?
Johann Van Tonder: Yes I am indeed. We’ve just had a massive storm – trees uprooted, roofs blown off, that sort of thing.
Andy: Sounds quite dramatic. We’re using just a really long hose pipe with a funnel at each end, aren’t we, and the quality is amazing.
Johann Van Tonder: That’s right, it actually works. It works better than Skype.
Andy: Now, where shall we start? Tell us about yourself, AWA Digital, we’ll talk about your book a bit later [00:02:01.23].
Johann Van Tonder: Yes, so AWA Digital – we’re a CRO agency, conversion rate optimisation, specialising in e-commerce. The company is based in York; we’ve got an office in London, in Texas and then in Cape Town, where I’m based at the moment, and we work with e-commerce websites around the world. Clients include Canon, Interflora and a range of brands you’ll recognise. As for myself, I’ve been doing this for a while, eight years or thereabouts, and I started when I was working in a corporate. I was in charge of a unit that had some e-commerce sites reporting to me, there was a lot of pressure on me to grow them, to drive growth, and I found CRO to be one of the best ways to do this. I should say that at the time, CRO wasn’t really around as a term. The term was only coined later, and what I was doing was conversion rate optimisation without realising it: listening to your users, running A/B tests, and sticking with that process. So it didn’t have a name and the process was quite rough at the time. The tools that we’re used to now weren’t there – some of them were just being born, and I was a beta user of some of the ones you’ll recognise now as big names. So I’ve been in the game for a while.
CRO Mistakes to Learn From!
Andy: Now in your book, E-commerce Website Optimisation, you talk about several things. But let’s talk about CRO because you just mentioned it, conversion rate optimisation, generally from a slightly higher viewpoint. Eight years is a quite long time to be doing it. You’ve probably made some mistakes, what have you learned from the mistakes you’ve made that we could help our listeners with?
Johann Van Tonder: Yes, as you can imagine I’ve made every mistake that you can make in those eight years, and hopefully only once. I think one of the biggest ones that [00:04:01.03] jumps to mind immediately, and I found this quite hard and I see it being repeated by people coming into the industry, is the temptation to be emotionally invested in the outcome of an A/B test. That was a really big lesson for me to learn. You work hard on coming up with a hypothesis, you implement a solution to a problem you’ve identified, an area of the website you think you can improve, you launch it as an A/B test against the control, and you hope that your variation is going to win. You watch that curve every day, and if it goes south then your mental state goes south, and if it goes up then there’s every reason to celebrate. That’s exactly the wrong approach. What should happen is that you go in with an open mind when you launch an experiment, because the purpose of the experiment is to learn – to be completely open to the outcome of the test. As soon as you emotionally invest in the outcome of that test, you’ve actually lost the game already; you’ve started on the wrong foot. You should launch that test open to what happens, and no matter what the outcome is, you can learn from it and build on it. I think that’s probably the biggest lesson for me. Once I learned that, and once I started adapting my processes and my starting point – my mental state, being tuned into the process in that way – things changed dramatically.
Andy: It’s a bit like being a scientist isn’t it? Because scientists have the same problem because they get emotionally invested in their theories sometimes and when their experiments start disproving the theory, they start adjusting the experiments.
Johann Van Tonder: Yes that’s right, and this actually is the scientific method [00:05:48.11] in general. A lot of the methodologies are quite similar and all of them are based on the scientific method, which is exactly what you said now: you have a hypothesis and you set out to validate it. It could be validated, so it could be true, or it could be refuted, so it could be invalid. And that doesn’t mean you’ve failed, it doesn’t mean you’ve lost – it means you’ve learned something. Thomas Edison famously said that he didn’t fail 10,000 times when he invented the lightbulb; he found 10,000 ways in which not to do it, and each one of those learnings brought him a step closer to finally inventing it. So yes, that’s the principle.
Andy: Are there any other classic mistakes that you came across? You mentioned that you’ve made quite a few.
Johann Van Tonder: Yes, I think the other one, and I referred to this earlier, was taking yourself too seriously. I mentioned earlier listening to your user, and I think that has become a mantra that’s quite firmly entrenched in the landscape now – I’d be surprised if anyone [00:06:57.08] didn’t buy into it. The answers are in your user data, qualitative and quantitative, both methods: looking at how your users use your site and why they do things in a certain way. The closer you can get to that, the closer you get to the answers you’re looking for – where are the opportunities to improve? Initially you start off thinking: well, I know a little bit about this business, I’m very close to it, I’m very close to my customers and I have my own opinions. I’m a user of the web and I think I know how this should work. And that is a dangerous place to start from – trusting your own opinion too much. You are not your customer, your consultant is not your customer, your boss is not your customer. Your customer is your customer. That’s where you need to go and find the answers, and that’s another mistake I made. I went right in there, initially not even listening to [00:08:00.28] customers, just doing stuff that looked obvious to me – change this button, make that different there, change the menu – and some of it worked and some of it didn’t, but it’s like throwing mud against the wall. It’s the opposite of where you want to be, which is a more methodical and systematic approach. The point of all of this is to get incremental results, and while we’re talking about that: be realistic about what you can expect from a typical CRO programme. There is some data on this. We all want those big winners – 30% improvements, even triple-digit improvements, you see those.
The reality is that most of the tests you run and most of the interventions you make will have a minute impact, so sub-10% and then some people say that – that’s almost nothing. The point is, you stack up all these almost nothings and you’ve got something. And that’s the point of having a programme like this, is incrementally making those improvements.
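Johann’s point about stacking up ‘almost nothings’ is easy to check with a quick bit of arithmetic (the 3% win size and the count of twelve tests below are illustrative assumptions, not figures from the episode): small relative uplifts compound multiplicatively rather than simply adding up.

```python
# Twelve wins of 3% each don't sum to 36% -- they compound.
uplift = 1.0
for _ in range(12):
    uplift *= 1.03  # each win multiplies the running baseline by 1.03

print(f"overall uplift: {(uplift - 1) * 100:.1f}%")  # roughly 42.6%
```

So a programme of consistent sub-10% wins can still add up to the kind of headline number everyone hopes for from a single test.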
Andy: So not expecting too much from any one approach I suppose?
Johann Van Tonder: Yes, and that’s another classic mistake – I was there myself. Chasing the big wins and not being happy with anything other than a big win, when some of those big wins probably weren’t even really such big wins. If I see a big win today I immediately have questions. I’m sceptical about it because I know that’s not the norm – it falls outside of what I know to be the norm, so there’s got to be a reason for it. So if I see a really big win, I’m going to dig into it really, really deep and make sure that I can validate it.
How to Undertake A/B Testing with a Small Amount of Traffic
Andy: Now you mentioned A/B testing, which of course is a huge part of CRO. I was just wondering – testing requires a certain amount of traffic, doesn’t it? What do you do with a site that perhaps doesn’t receive that much traffic? What do you do about A/B testing in that case?
Johann Van Tonder: [00:10:02.16] Yes, a lot of people obviously are in that position – start-ups, and also those who have been around for a while but haven’t grown their traffic levels. If for whatever reason you’re in that position and you want to be doing CRO, then there are two ways of doing it. The first is, if you really don’t have much traffic, then just JDI – just do it. Instead of running a test, make the change and then compare before and after. I’d say the other general principle is not to bother with tweaks, small changes, because small changes will seldom have a significant impact. Let’s just consider what it means when you make a change to an area of your site. Say you move a button around, or an image, or whatever change it is that you make to the visual design of your site. What you’re really doing is not moving stuff around on the page – this is crucial – you are changing behaviour. You are changing the way in which people interact with your site. It is psychology, not design. So with little changes you’ve got to ask yourself: what is the potential impact of this on somebody’s behaviour? How is it likely to shift behaviour? If the answer is ‘not much’, then you probably want to lean towards bolder tests – big changes, generally. So that’s the one thing. The second thing is that when you don’t have a lot of traffic, you’ve got to be really disciplined about your approach. You always have to be disciplined about your approach, but even more so when you don’t have a lot of traffic. I’ll give you one or two examples of how that plays out in practice. You might want to pursue bigger changes, and you also want to be really disciplined about how long a timeframe you give a test, because on a site with a lot of traffic you can run a test within days and have a statistically significant outcome. I wouldn’t say that’s a good time to stop it, after days, but that’s the reality – the numbers can work out after a few days.
With a site that doesn’t have a lot of traffic, [00:12:12.29] it might take you months, and that’s not what you should be doing. You should give it a reasonable timeframe – say two weeks; I’m in favour of two weeks if it fits into the business cycle. Maybe three weeks, maybe a month at most. And then if you don’t have a result, kill it, analyse it, draw learnings from it and move on. Because each time you run a test there’s an opportunity cost. Why do I say that? You could have been running a different test at that same time, one that could have got you a win, or a better win. By letting a test which is not going anywhere occupy your testing slot, you are wasting time and burning resources. You must realise that, and this is true for everyone, whether you have a lot of traffic or not: there is a finite number of testing slots each year, and you have to be very disciplined and ruthless about what you put into them. Each time you run a test, you could have been running something that could have got you closer to your goal, so that opportunity cost is something to be aware of.
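To see why low traffic pushes you towards bolder changes, here is a minimal sketch of a standard sample-size calculation for comparing two conversion rates (this is textbook normal-approximation statistics, not a formula from the book; the 3% baseline rate, 10% target lift and 500 visitors per day are hypothetical numbers):

```python
import math
from statistics import NormalDist

def sample_size_per_variation(base_rate, rel_lift, alpha=0.05, power=0.8):
    """Approximate visitors needed per variation to detect a relative lift
    in conversion rate (two-sided normal-approximation formula)."""
    p1 = base_rate
    p2 = base_rate * (1 + rel_lift)
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for 95% confidence
    z_b = NormalDist().inv_cdf(power)          # ~0.84 for 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_a + z_b) ** 2 * variance / (p2 - p1) ** 2)

# 3% baseline conversion rate, looking for a 10% relative lift:
n = sample_size_per_variation(0.03, 0.10)
# At 500 visitors/day split across two variations, that's months of testing:
weeks = (n * 2) / (500 * 7)
```

With these assumed numbers the test needs tens of thousands of visitors per variation and would run for over half a year, whereas targeting a bolder 30% lift cuts the required sample by roughly an order of magnitude – which is the statistical reason behind the ‘bolder tests’ advice.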
A/B Testing Controversies
Andy: [00:14:07.21] Now, A/B testing – you mention it quite a lot in your book, and it’s a bit of a controversial area, isn’t it? Tell us about some of the controversies around it.
Johann Van Tonder: I think the main one is just the results and how those results and that data are interpreted. There was a data scientist [00:14:30.00] who about two years ago came out with a report based on a study he had done, and he said that two thirds of all test results were bogus. And you must understand what happens: people take these results, let’s say they’re wins, and they make changes to their site based on the outcome of those tests. You’re running an A/B test in order to have a confident decision, a data-backed decision. Now that’s a good place to be – I think if you can A/B test you should be A/B testing by all means – but drawing incorrect conclusions is worse than not testing at all. Rather don’t do testing if you’re not going to do it properly. If two thirds of your results are bogus, that means you’re going to make a lot of changes to your site that potentially harm your business. In most cases it’s not going to do anything, it’s going to make no impact, so you’re going to expect to see a particular result in your bottom line and it’s never going to come through, and you’re going to wonder – where’s the uplift? Where’s the money that I saw in the A/B test console? And the reason it’s not there, that you can’t see it reflected in your financial statements, is simply that the test result wasn’t valid. You either misinterpreted [00:16:01.11] the data or you didn’t stick to the rules of the game. And the rules of the game – this is quite a complex area, and I don’t think we should get into it because I could keep you busy on this for a very long time – are the laws of statistics and the laws of mathematics. You can’t change that; it is what it is. And so you’ll hear, and this is maybe segueing into another controversy surrounding A/B testing, a lot of these sorts of rules of thumb – for how long you should run a test, for example.
And you hear people say – make sure you have 200 conversions per variation, or 400 conversions per variation, whatever the case may be. And that’s not the way to solve it. We address this in the book, but there’s a process behind it, and it’s a process that should respect the laws of statistics and of maths – you can’t take a shortcut there. As soon as you take a shortcut, you’re potentially harming your business. So if you’re going to be doing A/B testing, this is something you should really be serious about: understanding that context.
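As an illustration of why a flat ‘N conversions per variation’ rule of thumb doesn’t respect those laws, here is a minimal sketch of the standard pooled two-proportion z-test (generic statistics, not a method from the book; the traffic figures are made up). The same 200-vs-240 conversion split can be significant on one site and not on another, depending on the underlying rates:

```python
import math
from statistics import NormalDist

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for a difference in conversion rates
    (pooled two-proportion z-test, normal approximation)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Identical conversion counts, very different traffic volumes:
p_high_rate = two_proportion_p_value(200, 4000, 240, 4000)   # 5.0% vs 6.0%
p_low_rate = two_proportion_p_value(200, 20000, 240, 20000)  # 1.0% vs 1.2%
```

Under the conventional 0.05 threshold the first comparison comes out just significant and the second does not, so the conversion count on its own tells you little about whether a result is valid.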
Why you Need to Listen to Your Users
Andy: Now you mentioned earlier about the importance of listening to your users and I was just wondering – how do you do that? What do you reckon is the best way of listening to your users?
Johann Van Tonder: Yes, there are many ways in which you can do this, so I’ll mention a couple. Earlier I spoke about qualitative methods, so let’s start there, because qualitative research is by definition a technique that puts you in direct contact with your user, whether it’s face to face, on the phone or by means of a survey. Those are all very useful methods. One of my favourite methodologies – and it’s hard to single out one – is usability testing, and more specifically remote moderated usability testing. What we do is have a screener on the homepage of a site inviting the users of that site to participate in live research. We offer [00:18:00.25] them an incentive. If they engage with that screener, we ask them a series of questions, still within the screener, and we qualify them in or out of the study based on how they respond. Then we phone them up immediately – this is the secret sauce – we phone them up immediately, set up screen sharing between us, and I’m the fly on the wall watching how they interact with your site in an authentic session, watching all the stumbling blocks they run into. That’s really powerful, and the kind of insights you draw from an exercise like that – just thirty minutes doing that with five people and you walk away with a long list of things that can potentially transform your business.
Andy: Now you also talk about prioritisation or triage in your book. Tell us a bit about that.
Johann Van Tonder: Yes, that’s crucial, and I think it’s the one area in optimisation that’s maybe not spoken about enough – it’s not such a sexy topic as A/B testing or some of the rest, you know, speaking to customers and visitors. But it’s not hard to come up with a long list of things. You can brainstorm it, you can speak to some users, and you can generate a list of dozens if not hundreds of ideas. The real difficulty is deciding what to start with – which of those ideas to pick and in which order to rank them. Clearly you want to make the best use of your resources, so you want to roadmap them in such a way that the bigger gains, or the easier, quicker gains – the low-hanging fruit, if you’ll indulge my jargon – sit at the top of the list, and the ones that potentially have less impact drift towards the bottom. There are several frameworks you can apply here. A very famous one is PIE, Chris Goward’s PIE framework. But there are many frameworks, and the point is not really which framework you use – it’s that you have a framework. Some system that everyone in your organisation agrees on and that you can apply consistently, so that [00:20:06.15] you have a set of criteria by which to evaluate which ideas bubble up to the top and in which order you’re going to attack them.
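As a sketch of what ‘having a framework’ can look like in practice, here is the PIE idea reduced to a few lines (PIE is Chris Goward’s framework as mentioned above; the candidate ideas and the 1–10 scores below are purely illustrative assumptions): each idea gets scored for Potential, Importance and Ease, and the mean of the three ranks the roadmap.

```python
def pie_score(potential, importance, ease):
    """PIE priority: the mean of three 1-10 scores
    (Potential for improvement, Importance of the page, Ease of testing)."""
    return (potential + importance + ease) / 3

# Hypothetical candidate ideas with made-up scores:
ideas = {
    "rewrite homepage headline": pie_score(6, 7, 9),
    "simplify checkout form": pie_score(8, 9, 4),
    "redesign product gallery": pie_score(7, 6, 3),
}
roadmap = sorted(ideas, key=ideas.get, reverse=True)  # highest score first
```

The exact scoring scheme matters less than the fact that the whole organisation applies the same criteria consistently, which is Johann’s point.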
Johann’s book: E-commerce Website Optimisation
Andy: Right let’s talk about your book, because we’ve got some exciting news. We’re going to give away three free copies of your book, and I’ll give you the link in a minute, but just before we do that, who should be reading your book Johann?
Johann Van Tonder: So the title – E-commerce Website Optimization – I think says it all. It’s aimed at anyone who runs an e-commerce website, or works on one, and wants to drive growth: improve the conversion rate and the revenue of that site. It’s a playbook, a very practical guide that explains exactly what process you should follow, how you should do it, the techniques you could use, how to do the triage, how to approach A/B testing – the entire process from beginning to end.
Andy: Fantastic. So listeners, if you hop along to www.sitevisibility.co.uk/johann/ and fill in your details and then Site Visibility will select, using a specialised algorithm, which is top secret, three people to get a free copy of Johann’s book. So I’ll mention that link again, so that’s sitevisibility.co.uk/johann/. Now Johann, we’ve talked about quite a few things today, but if you had one top thing that you spoke about today for our listeners that they should concentrate on, what would it be?
Johann Van Tonder: It’s hard to single out one, but I’d go back to the A/B testing, because this is such a big [00:21:54.17] industry and there’s such huge controversy around it. I encourage you to run A/B tests if you can, if it’s within what your site traffic allows, but you’ve got [00:22:08.23] to do it properly. You’ve got to educate yourself on the rules of the game. I’d say that’s my number one.
Andy: Fantastic Johann, thanks so much for coming on. How can our listeners find out more about you and AWA Digital?
Johann Van Tonder: Well, you go to the website, awa-digital.com
Andy: Fantastic, and thank you for listening, people. The show notes are in the usual place, sitevisibility.com/impodcast. If you’re enjoying the show please leave us a review, as I mentioned at the beginning. If you’ve got any questions or suggestions for future topics then there is an email address – email@example.com – or you can tweet @sitevisibility. If you want to connect with me personally, I’m doctorpod on Twitter and LinkedIn. Don’t forget the Site Visibility group on LinkedIn, and also – once again, that magic link: www.sitevisibility.co.uk/johann to get a chance of winning a free copy of Johann’s book. So that’s all from me, Andy, and it’s all from Johann.
Johann Van Tonder: Thanks Andy, bye everyone.
Andy: And we’ll see you next time on Internet Marketing.