No Hacks

Balancing Pragmatic Change Management and Effective CRO with Tim Stewart

Slobodan (Sani) Manić Season 2 Episode 5

In this episode of the No Hacks Podcast, Tim Stewart shares his wealth of knowledge from 25+ years of consultancy, breaking down what Conversion Rate Optimization (CRO) really means for businesses today. Tim doesn’t just focus on the numbers—he dives into the balance between experimentation and perfection, and how making data-driven decisions can transform a business.

You'll hear practical strategies for digital optimization, insights on navigating market shifts (like the rise of low-cost competitors such as Temu), and tips for mastering stakeholder communication. Plus, Tim brings his love for Star Wars into the mix, using mentors like Obi-Wan and Yoda as metaphors for personal and professional growth.

Whether you’re a business owner, digital consultant, or just passionate about CRO, this episode is packed with valuable takeaways and stories you won’t want to miss.

00:00 Welcome to No Hacks Podcast
01:47 Experimentation Elite Event
03:32 The Role of CRO in Business Strategy
07:24 The Pragmatic Approach to Testing
09:57 Measurement and Analytics in CRO
13:36 Fixing the Funnel: A Pragmatic Approach
18:54 Challenges in CRO and Business Strategy
19:28 Identifying Problems and Providing Solutions
20:11 Delivering Hard Truths to Clients
20:48 Persuasion Techniques in Business
24:08 Sales Skills for Optimizers
28:23 The Importance of Soft Skills
30:23 Rapid Fire Questions from the Audience
33:04 Key Takeaways and Final Thoughts



---
If you enjoyed the episode, please share it with a friend!

No Hacks website
YouTube
LinkedIn
Instagram
X

[00:00:00] Sani: Welcome to a new episode of the No Hacks Podcast, where we share expert strategies for digital and real-life optimization. This episode is about measurement, conversions, and change management, so I guess it will probably be a controversial one in countries like North Korea or Russia.

My guest is Tim Stewart, a digital consultant with 25 years of experience. He helps businesses drive digital transformation with CRO expertise. And beyond that, he's also a man of many talents: he's a photographer, a Star Wars fan, and he probably knows more dad jokes than you do. Tim, welcome to No Hacks.

Great to have you on.

[00:00:35] Tim: Thanks for having me.

[00:00:37] Sani: A man of many talents? What do you define yourself as, professionally?

[00:00:41] Tim: Yeah, it's tricky. Um, I describe myself simply as a consultant, because for the most part that's what I do. I'm consulting on people's problems, and when people ask for, uh, the elevator pitch, I tend to go: I fix broken stuff.

[00:00:56] Sani: That's a nice tagline. It's the simplest way to put it. And, you know, people go, well, what do you mean?

[00:01:00] Tim: Like, well, what have you got broken?

[00:01:02] Sani: I'll

[00:01:02] Tim: That's where it leads, and that starts the whole conversation, so that tends to be the introduction.

[00:01:08] Sani: Before we get to our main topic, I want to tell the audience about Experimentation Elite. It is the UK's premier experimentation and conversion rate optimization event.

The next edition is taking place on December 10th in London. If you go to experimentationelite.com, you can get your tickets. And if you use the code NOHACKS10, that's N-O-H-A-C-K-S, one zero, you can get 10 percent off. Tim, you've been to Experimentation Elite many times. All of them, right?

[00:01:33] Tim: I've been to all of them. The first one, when it was running with Conversion Elite, which was conversion and SEO on the same ticket, and then, yeah, pretty much all of them since.

In the UK, we haven't got the depth that maybe some of the American conferences have, but they have managed to assemble, I think the right claim is, the premier kind of experimentation conference in the country.

If you are somebody who is in, I wouldn't even say the conversion space, but the digital optimization space, or you are that way interested, or you are even a business owner or client-side, you're going to be in a room with 250 to 300 people who are similarly minded and keen to learn. And beyond the people talking on stage, um, there's going to be people who you will just run into.

I think the biggest value, the thing that really makes a difference for that conference and most conferences, is that the quality of the people attending means you can't help but learn stuff just by sitting there passively. But if you are remotely involved in talking to people, running into people, even speaking to the people who've got the sponsor stands, you're going to come away with better contacts, different ideas, ways to approach stuff.

That's invaluable.

[00:02:51] Sani: I've been to one, only one, and I will definitely be back. So let's talk. You mentioned conversion optimization versus digital optimization in general as you were talking about Experimentation Elite. I want to talk about the role of CRO and conversion optimization in the overall business and optimization strategy.

When CRO became a thing 10, 15, whatever years ago, it was a product in itself, a thing on its own. Now CRO, in my opinion, cannot function on its own if it's not part of a system. Do you agree with that?

[00:03:22] Tim: Yeah. There's a bigger piece there which I'll come on to, but I think if we're talking about CRO as people describe it, meaning kind of A/B testing and experimentation, um, it needs to be part of the whole, because otherwise it's academic. If you are just testing, going, this could be better if you change this, then unless somebody takes action on that, that potential remains potential.

Nothing happens. If people are going to give it lip service, we're going to test this, and then not take an action based on what you could have done differently, then any risk you could have mitigated, any change, any different behaviour you could have made, doesn't happen. So there's no chance for it to earn its value.

What we're seeing here is a standard kind of adoption life cycle, a maturity model if you want. And yeah, I mean, I started in it 15, 16 years ago, um, as a specialism, but the techniques involved had been going for another ten years before that. I'd been doing things like that in digital to improve performance for, you know, the decade before I started officially being a CRO. And I'd argue that for most of the last decade I've been doing stuff that a lot of people wouldn't necessarily describe as CRO, because CRO has been taken as shorthand for running an A/B test. And it's like, if we're talking about optimizing, we're talking about taking what we've got, seeing what could be better, and then using methods, a big pot called methods, to make it better.

[00:04:56] Sani: Right.

[00:04:57] Tim: But in terms of its integration with the wider business, I think what we did is we spent the first few years learning this tool, how it worked by itself. And now it's one of the tools that sits alongside a bunch we already had and new ones that are coming along. There's an inevitability that it's going to have to be part and parcel of the business anyway. And businesses that do have an experimentation mindset tend to outperform the ones that don't. And, you know, as much as we love our capitalist society, it's survival of the fittest, and therefore that is a competitive advantage.

So businesses that are taking their data, taking their optimization more seriously will generally outperform their competitors. And if they consistently outperform their competitors, they will win market share, more money, hearts and minds of customers, if it's done properly. Um, so there's a kind of inevitability in that it has to be done properly, as an integrated part of your overall mindset.

But if it's not been done at all, or done poorly, then that's kind of self-fulfilling, because the companies that do that won't be around to continue doing it. So the ones that are left will have some form of experimentation built in, because the ones that didn't do it well didn't survive. Maybe it's a bit negative to put it that way.

[00:06:22] Sani: No, it's not negative at all. We don't know for sure, but that's probably what is happening, and it has been happening for years. Now, you've seen CRO since basically day one, even before it became the digital thing we have now. What is the CRO hill you're willing to die on, alone if you have to? What's the one take you're not letting go of, no matter what?

[00:06:46] Tim: You don't have to test it.

Sometimes you don't have the time for it. Sometimes you can't design a test to accurately measure that. Sometimes you could, but the win you get from it is not going to pay for the cost to test it. So it's one of those pieces where you need to be pragmatic. If you have all the resources in the world, all the time in the world, and a very important need to not make decisions which will cost you money or put things at risk, then yes, it's probably first in the queue, waving its hand, going, I'm the ideal choice.

But if you've got something which is short term, something which is going to have maybe low impact, then you maybe don't need to do a full experiment. You could run an experiment but to a lower level of quality, like literally just set the bar lower, have it as a non-inferiority test. You know, as long as it doesn't go down, plus or minus, it's good enough.

And I think an awful lot of people get tied up with: if it's not done perfectly, with perfect stats and a perfect situation and analysed to the nth degree, to 99.99999 percent, it doesn't count.
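For readers who want to see what that lower bar can look like in practice, here is a minimal sketch of a non-inferiority check on two conversion rates. To be clear, this is an illustration of mine, not code from the episode: it uses a plain normal approximation, a one-sided 95 percent bound, and a made-up half-point margin.

```python
import math

def non_inferior(conv_c, n_c, conv_t, n_t, margin=0.005):
    """Is the variant provably 'not meaningfully worse' than control?

    Rather than demanding a significant lift, we only require that the
    lower bound of a one-sided 95% confidence interval for the
    difference in conversion rates stays above -margin.
    """
    p_c = conv_c / n_c          # control conversion rate
    p_t = conv_t / n_t          # variant conversion rate
    se = math.sqrt(p_c * (1 - p_c) / n_c + p_t * (1 - p_t) / n_t)
    z = 1.6449                  # one-sided 95% normal quantile
    lower = (p_t - p_c) - z * se
    return lower > -margin

# Variant converts at 4.9% vs control's 5.0%: slightly worse, but
# provably within a 0.5-point margin at this sample size, so under
# the lower bar it is "good enough" to ship.
print(non_inferior(2000, 40000, 1960, 40000))  # -> True
```

Under a hard superiority bar the same data would just be a flat, inconclusive result; the non-inferiority framing turns it into a usable decision, which is the pragmatic trade being described.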

[00:07:57] Sani: I'm dying on that hill with you. You're not alone on that hill.

[00:07:59] Tim: Yeah, just be practical about it, because the problem with perfectionism, with searching for it, is that quite often you end up backed into a corner where you don't get to test at all.

Like, you've literally gone: my bar is so high, nobody will work with it. I'd rather do something that is a poor version of a test, but know it's poor and make my decision with that in mind, than do something that is a grade-A, gold-standard, triple-checked test that never gets finished, because we're spending so much time on the test that the test itself becomes part of the problem.

[00:08:30] Sani: But that goes back to: done is better than perfect.

Like, 

[00:08:33] Tim: Yeah, I mean, it's just the pragmatic side of things. Just be practical. There are businesses that can do this, that can run 25 experiments a month. Great. They've got ten times your headcount, they're in a different industry that makes more profit, and their board has supported it with the investment.

You're a single person who needs to do some short-term stuff to impress people enough to maybe be able to build your team. Play it tactically. Like, that's not the perfect test? Doesn't matter. It's not serving the testing purpose, it's serving the moving-my-game-forward purpose. And we're going to talk about change management stuff.

It's one of those things where it has more purpose in what you learn from it than in what it wins. And one of the learnings can be: let's not do that test again.

[00:09:15] Sani: That's a good thing to learn; that will save you time in the future. And we all need some free time. Let's talk about measurement and analytics in the CRO context. What are some of the mistakes you see with clients you work with when it comes to measurement specifically, or to interpreting data for CRO? What are the most common mistakes you see?

[00:09:35] Tim: Hundreds. But the way I'd sum it up is suitability of the metrics. Most people are, correctly I'd say, focused on the end goal, the macro goal. Did we get more sales? Did we get more leads? Let's assume the revenue associated with it. And therefore they design a test where they really, really want to increase that number.

But then the thing they can change is one, two, three, four, seven levels removed from the actual rubber-meets-the-road, making-money part. Yet they will set their decision metric based on the making-money part.

And whilst ultimately that's what you're trying for, you have to be cognizant of your signal to noise. And if you are one, two, three, four, five, six, seven levels away from actually putting money into the browser, you may not have enough signal versus noise to conduct a realistically useful test. The minimum detectable effect you're trying to aim for will be very small.

You'll be much more susceptible to other variations, peaks and troughs, that you will need to collect more data on, and you then start running into excessively long run times and repeat-visitor issues and, yadda yadda, all the various data pollution problems you can hit.

Whereas if you are changing something like the product detail page, and you've assessed through research, or through data, or through interviews, that people aren't able to see what they get when they click the button, or are worried about what will happen next when they click the button, and your design is built to reduce that anxiety and increase that desire, friction lower, impetus higher, then the loudest signal is going to be clicking the button. That's the one action all the levers you're pulling are trying to get them to. So the most common issue I see with people trying to interpret results, plan their tests, and design their tests tends to be the suitability of the metrics they pick.

I understand that, within the wider picture, the reason we're testing this page is because this page then feeds more people through, and it is therefore a factor too. And if you want to do some maths, you can go: if this page is 10 percent of the purchasing decision and we improve that by 10 percent, that's actually a 1 percent overall purchase impact.

Can I see it there? No. Again, your data volume is too low to do that; that's not what it's intended for. But we are pulling on this lever to affect this behaviour, and if we are correct, more of this happens. And if we're incorrect, less of this happens. That's a clearly structured test. You can measure that, and you can get an answer.

And then from that, you can either plan a follow-up test, do we want to get even more, or go further down the funnel and ask: of the people who clicked more, where did they fall out next?
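Tim's back-of-envelope dilution, a 10 percent win on a page that is 10 percent of the decision arriving as a 1 percent macro effect, is also why the macro metric becomes invisible. Here is a rough illustration of mine, not from the episode: it uses the standard two-proportion sample-size approximation at 95 percent confidence and 80 percent power, and the 5 percent base rate and traffic shares are invented numbers.

```python
import math

def sample_per_arm(base_rate, relative_mde):
    """Rough per-arm sample size for detecting a relative lift on a
    conversion rate (normal approximation, 95% confidence, 80% power)."""
    z_alpha, z_beta = 1.96, 0.84
    delta = base_rate * relative_mde       # absolute difference to detect
    return math.ceil(2 * (z_alpha + z_beta) ** 2
                     * base_rate * (1 - base_rate) / delta ** 2)

page_share = 0.10   # page drives ~10% of the purchasing decision
local_lift = 0.10   # a 10% improvement measured at the button click
macro_lift = page_share * local_lift   # ~1% by the time it hits purchases

near = sample_per_arm(0.05, local_lift)   # decision metric: the click
far = sample_per_arm(0.05, macro_lift)    # decision metric: the purchase
print(near, far)  # the macro metric needs roughly 100x the traffic
```

Because required sample size scales with one over the square of the detectable effect, diluting the effect tenfold inflates the traffic requirement roughly a hundredfold, which is the "I can't see it" in practice.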

[00:12:38] Sani: Right.

[00:12:38] Tim: Think of it as a leaky funnel. Do you just plug one hole to the perfect degree and then leave the rest leaking? Or do you patch some of the holes some of the time and get an overall impact from the cumulative effect of some of these being patched, knowing that there are still leaks, but then you go back and get the ones that you didn't get the first time?

[00:12:57] Sani: So what's a better approach, in your opinion, based on your experience: fixing one hole to perfection, or working everywhere at the same time and fixing all the leaks slowly? I know, I know it depends.

[00:13:10] Tim: I refer, I refer Your Honor to my previous answer, which is, um, be pragmatic.

[00:13:17] Sani: Yep.

[00:13:17] Tim: What's appropriate to the business? Do you have all the time in the world? Do you have a particular deadline? Are you budget constrained? Are you time constrained? If there is a glaring great hole, well, I don't know if we can swear in public, but me and Arnout have said for years: fix broken stuff first. You know, if there's something that's howlingly broken, fix that. And then go for the next most broken thing, and there will be some easy wins. Those easy wins will generally pay off both in revenue generated and in good faith within the business, for you to go: this next problem. We can see a problem, we can see there is a leak, but we don't know quite where it is, and we're going to have to spend some of that goodwill on two, three, four tests that are going to explore possible reasons for this, and one of those may identify it.

Or we may, through checking these five things, by process of elimination, definitively say: not those things, they are not a big factor. But then we'll know we don't need to test them again. So it's not like you go through the funnel once and put your hands up, I'm done. And it's not like you fix one page and go, hands up, I'm done.

Because if you improve the number of people who add to bag, more people who were, say, borderline going to buy are now in your checkout. So if your checkout's poor, you've now added to the number of people who were less convinced. Your checkout is probably not going to benefit from those extra 10 percent of people who now click add to bag, because if they took that much persuading to add to bag, they're probably not going to be your best checkout users either.

So you're deferring the problem. So I don't think you can get away with just doing one area or fixing one problem, because you need to understand, one, how the users use the site; two, and I'm probably going to come on to this, what the business thought they were supposed to be doing on the site; and three, where those two are falling apart.

Because that's probably going to be the problem. And as much as you can fix a page, you're going to find that, we've got a new payment provider, bang, the fixed page is broken again. Or our competitors have got a nicer site or a cheaper price, right? Well, the friction we managed to reduce our checkout process down to was acceptable when we were comparable in price to our biggest competitor. But now our biggest competitor has also got a low-friction checkout and is half our price. There's less impetus for people to come to us at all, and the ones who do are looking for a discount code, because they are now more price sensitive, because somebody else has changed the macro game. Now, that's not something you...

[00:15:55] Sani: Temu has entered the market, in other words.

[00:15:58] Tim: Yeah, the Temu factor.

It's like, what they've done is they've placed a new anchor point in people's heads, which means that people are now looking around going, I'd like that price, but not that terrible quality. 

And it's like, do you realise those two things are connected? No? Okay. But once the seed is planted, that's something you're battling against.

That might be a messaging thing on your marketing before they even hit the site. 

That might be a decision at the business level to go, We're not going to play. That's a race to the bottom. And therefore we're going to reposition and leave Temu to fight the other dropshippers.

Because that's not our business model. But that's not something you're going to answer with an A/B test.

That's not something an improved banner on your product detail page is going to solve. But that could be the single biggest factor in your conversion rate and overall sales trends. And, uh, famously, one of my gripes: you tend to get a little bit of testing for the sake of testing.

I always used to describe that as moving the deck chairs around on the Titanic. It's busy work. It makes you feel good, all lovely, but fundamentally the ship's going down for different reasons than the deck chair alignment.

[00:17:05] Sani: So testing velocity will not save Leonardo?

[00:17:07] Tim: So it's an ego thing. If you have got everything working, and the thing that's making the difference is, you know, how much you can move, how much the rest of the business can feed off what you're doing, and how quickly you can innovate and iterate, then yes: building a team, getting test velocity up, fine. But before you do velocity, do it right.

"No, no, fail fast." It's like, if you're willing to fail fast and just throw tests at it and see what sticks, that is a method. But the cost will tend to be, unless somebody's done some joined-up thinking, that when you get a win, you won't know why. You can't repeat it. It won't fit a wider picture. You'll lose the narrative of what you're trying to prove for the customer, the user journey, the CX side of things. And unless somebody is looking at the overall picture and going, this fits where we want to go as a business, this doesn't fit where we want to go as a business, you could be testing something that's a definite winner that you're never going to ship, because it's not what your business will allow, or your market will allow, or you'd ever risk with your customers. Actually, testing for velocity has just burnt 30 percent of your budget, because you were testing shit you shouldn't have been doing.

[00:18:13] Sani: How do you handle resistance when you're suggesting major company-wide or business strategy changes based on what you found out doing CRO for the website, or any other research method? So you tell them: hey, I think this doesn't work, and not because the page is bad or poorly designed; there's a bigger problem. How do you handle resistance, if there's any?

[00:18:40] Tim: I'm kind of in a privileged position now on this, because I'm a consultant. They tend to bring me in because they know there's a problem, and I've been hired to identify the problem and generally provide a solution. Um, so I find it easier now than I used to when I was employed, or working agency-side for a client, because now they're literally paying me to tell them their baby's ugly.

Yeah.

[00:19:07] Sani: You're the last option, basically, and they need to trust you.

[00:19:10] Tim: Yeah. And it does mean that sometimes I go in and my assessment is: yeah, you've passed the event horizon. There's no further value in pursuing your current approach. You can frame it how you want, but basically the ship's going down; do you want a soft landing? And that then becomes the optimization plan.

We are going to optimize to land you as softly as possible, to lose as few people's jobs as possible. And generally, when you give that piece of news, they kind of knew it was coming, because they wouldn't have hired you unless they suspected it. Those are usually the ones where people take less convincing.

[00:19:53] Sani: They know already.

[00:19:54] Tim: Convincing them what steps they need to take, what medicine they need to take to fix that, that can be the challenge. Um, but if we're coming at it from the history rather than the current situation, it depends. There's two main methods. You either tell them straight up, and this is where randomised controlled experiments and the fact that you can put some data behind it help, customer research helps, evidence is going to help. But people make decisions based on emotion, and decisions based on emotion are not changed by data; they're changed by more emotion. So you have to work out what the audience is reacting to, what their initial thought around that plan was, and to a degree dig into what's going on, why they're not listening.

So it may be they can't admit there's a problem, because if they do, they'd have to put their hand up and say: and it's me that made that choice. Nobody wants to be the person who made the decision that cost money. So they will move heaven, earth, and fake data to make it look like it wasn't them.

Now you could argue that culturally that's a problem. If there's a business that punishes you so much, that you fear so much that you lie, you've got the Russian kind of mentality: you lie to your manager, who lies to their manager, who ends up lying to Putin, who then thinks everything's going great, because nobody dares tell the person above them it's going wrong.

Um, we see how that becomes toxic, and it does. But the persuasion mechanisms you use are similar to sales: you find out what they want to get out of it. You're kind of trading, you know, the effort you're willing to make for the result. And what does that result look like? And why do you want that?

What's the benefit to you of it happening like this? You're trying to get them to buy your plan with their time or their budget or their attention. And if they're not willing to buy, you need to find out why.

Is it your plan, your product, that's not a good fit? If it's not a good fit, why don't they think it's a good fit? What would they do differently? Maybe they've got a reason why they'd really like to buy your plan, but there's a blocker that stops them. It's that investigation piece. It's a conversation you need to have with them, and it may be that you need to go and do a test to prove it to them.

Here's my new plan, let's do a mini version of it. Let's do a smoke test to kind of go: plan A, plan B, oh, that one looks better. And some people are not so data motivated, and you have to kind of look at them and go: look, if you do, or do not do, this, here's your best case, worst case, and likely case. And show them the lines on the graph.

And most board members, if they're looking after their business, that's their job: to go, I have to steer the company so it at least mitigates risk and we take good steps forward. And if you speak the right language to them, at the level they're at, they'll generally listen. Whether they can do something about it or not is a different question.

[00:22:48] Sani: The way you describe all of that...

[00:22:50] Tim: Yeah.

[00:22:51] Sani: I'm not saying it sounds easy; it sounds simple. Definitely not easy, dealing with stakeholders and explaining to them that they're wrong in some cases. But the principles sound simple. So why does everyone make so much fuss about dealing with HiPPOs, dealing with stakeholders, like that is the worst and most difficult thing in the world?

And it's not just me; most people, from my experience, seem to be complaining about dealing with stakeholders.

[00:23:21] Tim: This was my talk at, um, Conversion Hotel last year. I basically called it Selling Magic Beans, like you're trying to swap the magic beans for the prize cow, and subtitled it Selling Without the Ick. Because I spent 15 years in sales before I moved across to the production side.

Um, it didn't sit well with me; it's not my personality. But one thing about sales is you can get into it without many qualifications and it pays quite well, so the motivators were there. I just had to be something I wasn't. But they taught me some basic sales techniques. You have to, if you're going to cold call people, if you have to try and persuade, use the persuasion architecture they teach you in sales.

And it's not something I just did in my training. When we were at university, I was reading books that used the same techniques, going back to the 1920s. You know, it is simple, because these are kind of truisms; these are known mechanisms for dealing with humans. The problem people have with it is they go: yeah, but I'm not sales. I'm not a salesperson, I don't do natural sales. Yeah, but we are optimisers, and we spend every day fixing webpages to persuade customers to do something we'd like, using psychological levers to persuade people that they should have confidence in our answers, or they should take a next step, or click to learn more.

So why can't we flip the mirror and use that to sell ideas internally? To sell the idea of giving me some dev time. And sales is probably the wrong word, but you know, it's a persuasion architecture, and we use it daily. We should be the best at doing it: we do it in multiple languages, we try 15 different pieces of copy, we know how to structure our emails when we're asking for stuff; in terms of messaging, it's a long-form sales letter. We should be really good at this, and yet we've got this blind spot when it comes not to speaking to our customers, but to our stakeholders, and also our peers.

Can I speak to my head of development and say: can I have the developer next week, not in two weeks' time, because it's useful to me, and here's what I'll give you in return; does that sound like a good deal? Oh no, that sounds a bit challenging. It's like, that's where you're falling down. If you take away the mental block and think: if I was trying to sell a new product on a landing page, I wouldn't feel icky about it.

I wouldn't have a problem with how I'd use phrases or which psychological levers I'd be thinking through; I'd be testing those. But I have to go ask for developer time and argue for more budget, and suddenly I can't, because it's the HiPPO, or it's some structural, cultural problem that I've got to get past that's "not my problem".

And it's like, it's all of our problem to make the business work better. This is your part in it, and if you can't communicate your part to help the rest of the business, then your part's not actually worth anything. So you kind of have to do this, as we were saying at the very start; this is the integration.

If we don't go to the table with the people we don't necessarily communicate with, and find a way to find common ground, then ultimately we end up in those silos we talk about. And we talked about, you know, the inefficiencies: companies that do not learn and improve ultimately perform less well.

It costs more, you've got more friction, you've got higher staff turnover. They repeat the problems.

[00:26:44] Sani: But if you approach it as if it's literally part of your job description to be able to deal with people in such situations, like, if you start with that mindset, it...

[00:26:56] Tim: I mean, if you're working in a corporate, it literally is your job.

[00:26:58] Sani: Right.

[00:26:59] Tim: You know, what I do once I've agreed something with somebody is the technical part, and in fact that's quite often outsourced. The job is being a subject matter specialist: identifying, with your specialist knowledge, what the right course of action is, and then telling the person you need to get stuff from what you need from them. And then if that ends up being a horse trade, well, I'll give you this if you give me that, that's fine. If you have to turn up with a bag of donuts at lunchtime at the developers' office and go: hi guys, where's mine on the list?

That technique's just as valid as sitting in a boardroom going: and here is a PowerPoint with 15 bullet points saying please give me more dev resource. All right? To be honest, the soft skills, the people stuff, work better than bullet points. Because if you're not able to effectively work that, if you're a manager but you can't effectively organise your team to deliver what needs to be done, that's the job not being done.

And so these inefficiencies are the sort of things that, as an optimiser, annoy the hell out of me. Because if we just all thought logically, it'd work. 

Yeah. 

[00:28:06] Sani: To quote Arnout,

[00:28:07] Tim: But, but they don't, they don't think logically.

[00:28:09] Sani: To quote Arnout: I hate, hate, hate inefficiencies. Which is basically the same thing you just said right now. And I like that.



[00:28:18] Tim: There's a reason he's one of my best friends; it's a common theme. But I'm saying: you can hate it all you like, but unless you manage to do something about it, you're not helping the problem. And that's the problem.

Part of your job as somebody who communicates is to go: okay, how can I communicate with them? If they're not like me, if they don't have the same agenda as me, if they've got something different, what do they have? What technique would I use if I'm speaking to somebody who's a potential client on the website, when I'm doing my user research, to find out what would motivate them? What other things have they got to consider? What competitors have they looked at? Okay, so let's do that.

If I'm asking for dev resource, who else is competing for that time? If he won't give me some of his budget, some of his staff, why? What marker do I need to hit for him? Because if he can't do his job, I can't do mine, so I've got to solve his problem. And if I go over here and it's, ah well, she's saying this is a lovely idea but it's not on the priority list, and it's her manager that's saying that, right, then I'll work my way back up the tree. Never mind that I shouldn't be interfering, shouldn't be going across the business like that. My job is to make things work, this is blocking it, I'm going to go fix it. So that optimising mentality works really well if you apply it internally. Change management, what you were saying in terms of how do you persuade people? Actually, an awful lot of the techniques we know, are really good at, and have indeed tested on people, work on people internally too.

[00:29:43] Sani: We only have a few minutes left, so let's go to the rapid fire section. These are questions sent in by the LinkedIn audience, ten to twenty seconds per question. Some would need a few minutes, but let's do them in twenty seconds anyway; it's more fun that way. The first one, from Armdeep Atwal: what do you think the fundamentals are of optimizing a digital business?

[00:30:06] Tim: Understanding how they make money.

[00:30:08] Sani: Next one from Arnout, a No Hacks guest. What is your favorite data visualization tool?

[00:30:13] Tim: The ones they can use.

[00:30:14] Sani: The ones they can use. So no preference, just whatever works?

[00:30:17] Tim: I tend to lean towards Power BI, because most of my current clients are on the Microsoft stack and it's easier to link with that. But if they can use Tableau because they're on a Tableau stack or they're with Salesforce, fine. If you want Qlik or one of the plug-and-play ones because you don't have any dev skills, use that one. If you're fine with Looker Studio, use that one.

[00:30:37] Sani: What's your favorite chart type?

[00:30:39] Tim: Your bars, your lines. Standard.

[00:30:41] Sani: Keep it simple, right?

[00:30:43] Tim: My personal preference, and it's not one you can use a lot, is the ribbon chart. I really like it at the moment, because it's not only a

[00:30:51] Sani: Oh, ribbon. Yeah, okay.

[00:30:53] Tim: bar chart with a line chart combined, so the line shows you the difference between them. And I really like a well-done Sankey chart.

[00:31:00] Sani: Hmm. Right. David Ivanov: what should people be doing better in analytics?

[00:31:05] Tim: To keep to the theme of the episode, I suppose: speak to the business. Don't just measure stuff because you can measure stuff. Measure stuff they can take an action on. Measure stuff that they care about. Measure stuff that you can measure reliably. And measure stuff you're allowed to measure. If you've got data you can rely on and data that people can work with, it's doing its job. "Oh, we can measure this." Why? Who cares?

[00:31:29] Sani: Exactly. Well, the last one: which one of the Star Wars characters would make the best digital consultant?

[00:31:36] Tim: I think Obi-Wan, ultimately, at the end of things. Yoda's too cryptic; we wouldn't get anything out of him. He's like the database dev who kind of knows how it all works but won't ever share it with you.

[00:31:48] Sani: His syntax is weird.

[00:31:50] Tim: His syntax is, no, his syntax is probably spot on. That's why he speaks like that. And Anakin slash Darth? Too hot-headed. He'd be too keen to find the big win and overlook the problems where you've got data quality issues.

[00:32:05] Sani: I absolutely love that.

[00:32:07] Tim: But Obi-Wan's slow, methodical, does things the hard way even when he doesn't want to. Just plows his way, gets

[00:32:14] Sani: The job done.

[00:32:15] Tim: It's not a glamorous job, it's a dedicated and practical job, and I think if you've got somebody with a long...

[00:32:23] Sani: Absolutely perfect. So let me just recap. The key takeaway from this episode: it's not just about a conversion, it's about what the business needs, and learning and figuring that out before you even start doing other things. I'll repeat the call to action for the audience: experimentationelite.com, go and get your ticket, use the code NOHACKS10. And also go to nohackspod.com/follow and subscribe to the podcast. If you are subscribed, send the episode to someone else who you think will enjoy it.

Final question for you, Tim. What is one key message or phrase you have for yourself six months from now?

[00:32:57] Tim: Have you been to the gym today?

