Do You Need an Audit of Your Analytics Process? [Podcast Episode]

By Amanda Milligan
October 6, 2020

There are so many different metrics you can track and various ways to track them.


To find out if there are ways you can optimize your reporting process, Dana DiTomaso, president and partner at Kick Point, explains what to look for and how you might want to tweak and improve the way you track various metrics.



Want more advice on how to get the best content marketing ROI? Sign up for our monthly podcast newsletter to get exclusive access to bonus interview content and resources!


In this episode, you’ll learn:

  • Why people shouldn’t overly rely on traffic as a metric
  • The two types of recommended reports you should make
  • Why it’s important to accept the things you can’t record, like ad blocker data
  • How to transition to more accurate data (even if it looks “worse”)
  • Common misunderstandings about session timeout and other metrics


Transcription:

Amanda: This week on the show, we're digging further into analytics. And I'm really pleased to have the president and partner of Kick Point, Dana DiTomaso, on the program. She has spoken at conferences around the world, and for us here, on brand building. And I think she's going to be a wonderful person to have this conversation with. So, welcome to the show, Dana.

Dana: Thank you so much for having me.

Amanda: Absolutely. So, we've had some episodes previously, that talk about analytics, I think it's valuable, even at top level to hear how different people do it. I don't think everybody does it the same way. But I think at this point, listeners of the show know that goals have to come first. Right? You have to try to understand what you're trying to measure before you're going to be able to successfully do that and tie it back to your objectives. But still, oftentimes people look at traffic or conversions as the baselines. Are there other elements that you think people should be measuring, to see if their tactics are successful? 

Dana: Oh, absolutely. I think people definitely over-focus on traffic, and I think that can be really misleading. One of the things we see a lot of when it comes to recording pageviews, for example, is that pageviews could be a metric that you decide means success. But the problem is that what's happening a lot now is that people are tab hoarders, right? They hang on to tabs in their browser forever. And whenever their browser reactivates, it sends a zero-second pageview off to Google Analytics. And if it's a particularly good resource, people could have that hanging around in their browser for months. And I think this is where you really have to dive a little bit deeper and figure out what's actually going on with those metrics, and not just look at, say, pageviews. This could be where, for example, you look at how far down the page people scroll. We have a blog post about something called content consumption that we put out, three years ago now, on Moz.com. And what content consumption measures is whether somebody actually scrolled down to the bottom of the content, and then whether they spent long enough on the page to read that piece of content. So, the timer is different for every piece of content, depending on how many words it has in it. And if both those things are true, then we consider the content to be consumed. And if not, then we think they bailed, or they just saved it in a tab for later, or they were a speed reader. And so, I think that's been really helpful to see which pieces of content people are truly consuming, as opposed to just hanging on to for later. And this has been really helpful for things like display campaigns, for example, where you might get tons and tons of pageviews from a display campaign.
But then you look at the scroll depth, and it turns out that literally no one scrolled, and everyone spent maybe one second on the page. And that probably means people were mis-clicking. We've seen that a lot with things like news websites, where people clicked on an ad they didn't mean to, and then immediately hit back. But it was long enough for Google Analytics to call it a pageview; it just wasn't a good pageview. And so, if you're missing out on a lot of those user behavior metrics, I think it can make you really miss the mark in how you're reporting on the success of campaigns and tactics.
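Dana's "content consumption" rule can be sketched in a few lines. This is a hypothetical reconstruction of the idea, not Kick Point's actual implementation: the reading-speed constant and the field names are assumptions.

```typescript
// Hypothetical sketch of the "content consumption" metric: a page only
// counts as consumed if the reader scrolled to the bottom AND stayed long
// enough to plausibly read it. The ~265 words-per-minute figure is an
// assumption, not Kick Point's actual timer.
const WORDS_PER_MINUTE = 265;

interface PageView {
  wordCount: number;        // words in the piece of content
  scrolledToBottom: boolean;
  secondsOnPage: number;
}

// The timer differs per piece of content, based on its word count.
function readTimeThresholdSeconds(wordCount: number): number {
  return (wordCount / WORDS_PER_MINUTE) * 60;
}

function isConsumed(view: PageView): boolean {
  return (
    view.scrolledToBottom &&
    view.secondsOnPage >= readTimeThresholdSeconds(view.wordCount)
  );
}

// A 1,000-word post needs roughly 226 seconds of attention:
console.log(isConsumed({ wordCount: 1000, scrolledToBottom: true, secondsOnPage: 300 })); // true
console.log(isConsumed({ wordCount: 1000, scrolledToBottom: true, secondsOnPage: 5 }));   // false
```

The zero-second "tab hoarder" pageview Dana describes fails both conditions, which is exactly why this reads as a sturdier success metric than raw pageviews.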

Amanda: So, that leads me to a question I was wondering about when I was preparing for this interview, which was: are there two kinds of buckets for analytics? Like, things to look at to see if you're already being successful, and things to look at to see if you're heading in the right direction. Like, this content is resonating, and we're still not hitting the numbers we're trying to hit, but we should stick with this.

Dana: Yeah, totally. And I think there are two types of reports that we generally recommend you make, speaking from an agency perspective here. There's one type of report that's the one you give to the client. And that's the one that's got the top-level metrics, the "are we heading in the right direction? What are we doing to hit our goals?" type of report, and that should be short, sweet, and to the point, ideally just a single page, or two pages if you have to. And then the second report is the diagnostic report. And this is for you, as a marketer, to look at and figure out what's going on, you know, are you actually hitting that mark? And that's where you look at the stuff like, this blog post is a monster, and this blog post is a real dog. And so, what can you do to improve that? And that's the one you wouldn't show to leadership or to a client. But certainly, it's an important thing to have. And I think that's what makes you better as a marketer: really being able to respond to campaigns as they're running, and then tweaking and making changes. What you don't want to have happen is for the campaign to finish and then you look at it and you're like, okay, how did we do? Because at that point you've already wasted that money, when you could have course corrected halfway through and improved things by the end.

Amanda: So, if somebody's listening, and they have a sense that their analytics are probably not on track, or that they could be doing some optimizing, where do they even start? Like, if somebody is thinking, I need to audit the way that I gather information and assess it, how can they go about figuring out what those mistakes are?

Dana: Yeah, I think start by figuring out where everything is set up. So, if you're using Google Tag Manager, what's in there, what pixels are firing, what's happening? If you're just using Google Analytics, I recommend using Google Tag Manager, because it gives you the ability to add more behaviors to how you collect Google Analytics data. Start by auditing that. And so often, when we do what we call an analytics discovery with clients, the first thing we start with is asking them about their goals, right? And you've talked about goal setting on the podcast before. So, finding out their goals, and then turning those goals into KPIs and metrics, and figuring out what we actually need to report on. And then we'll look at their current setup and the tools that they're using, because there's never just one tool, right? Like, it's never just WordPress, or just Shopify, or something like that. There's always this chat tool that they're using, or they've got Kissmetrics or some other supplemental analytics product happening on the site, or the developers have done something, right?

So, really audit everything they have currently, and then see how it lines up with the reality of what they actually want to measure, and then look at the gap in between the two. And it's similar, come to think of it, to how you do content marketing, right? It's like, here are the keywords that we want to rank for, here are the keywords that we currently rank for, here's the gap between the two. Same thing when it comes to analytics. And from there, you can put together the plan and figure out all the different stuff that has to get done. And I've got to say, to do an analytics project well, it does take a while, for sure. They always take longer than people estimate they will, myself included. I'm a perennial optimist, though I'm starting to become less of one when it comes to analytics projects. And the other thing too: actually, the day we were recording this, there was a presentation of mine, pre-recorded and put out as part of the Whitespark Local Search Summit, about analytics. And one of the things I said in that talk, which I recorded a couple months ago, was the idea that you just need to accept what you can't change, or what you can't record. A lot of people have this idea in their head that they can record absolutely everything, and that simply isn't true. And I think by accepting what you can't record, you can focus on what you can actually report on and what you can influence. So, for example--

Amanda: Give us examples. 

Dana: Yeah. So, for example, a big black box for people is ad blockers, which are terrible from a measurement perspective, but great for privacy. So, I have an ad blocker; I mostly use it so that my own activity doesn't show up in clients' analytics reports. But yeah, lots of people use ad blockers. And if you're marketing to a segment of the population that is at all tech savvy, you can guarantee that they're also using ad blockers. And they're just not going to show up in your analytics; you'll have no idea. So, a good example of what percentage this could be is when GDPR came in, and everybody suddenly had to put a cookie notification up on their site. And just as an aside, one big mistake I see people make is they have a cookie announcement, except the cookie announcement doesn't actually do anything with regards to analytics. So, it's still setting the cookies even if you say no. Don't do that; double check that that's not happening.

Anyway, back to cookies. When clients put this in, all of a sudden their traffic would drop by, say, 20 to 30%. And those are people who were saying, no thank you, don't track me. And so, if a major B2B site lost 20 to 30% of their traffic just from a cookie announcement, imagine how many of those people already had an ad blocker installed and just weren't showing up at all. And now you're losing these additional people to cookie opt-outs too. So, you could be making really important marketing decisions based on a tiny subset of users. So, don't try to put too fine a point on it, right? And don't dive too far down rabbit holes. Think about the big picture when it comes to analytics, because you can't look at something and say, well, this page had 10 pageviews, I'm going to dig into each one of these pageviews. That's not enough data for you to truly make decisions. Look at the bigger picture stuff and try to focus on that instead.
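The cookie-banner mistake Dana calls out, a banner that shows but never actually stops the tracking, comes down to a missing gate in front of the analytics calls. Here is a minimal sketch of consent gating; the class and event names are hypothetical, not any real consent-management API:

```typescript
// Minimal consent-gating sketch (all names hypothetical). Events are held
// until the visitor answers the banner; on "denied", nothing is ever sent.
// The mistake Dana describes is firing the analytics tag regardless of the
// banner's answer.
type Consent = 'pending' | 'granted' | 'denied';

class ConsentGate {
  private consent: Consent = 'pending';
  private queue: string[] = [];
  sent: string[] = []; // stands in for hits actually sent to analytics

  track(event: string): void {
    if (this.consent === 'granted') this.sent.push(event);
    else if (this.consent === 'pending') this.queue.push(event);
    // 'denied': drop silently -- do NOT keep setting cookies or sending hits
  }

  setConsent(consent: 'granted' | 'denied'): void {
    this.consent = consent;
    if (consent === 'granted') this.sent.push(...this.queue);
    this.queue = []; // either flushed or discarded
  }
}

const gate = new ConsentGate();
gate.track('pageview');        // buffered while the banner is up
gate.setConsent('denied');
gate.track('scroll');          // dropped
console.log(gate.sent.length); // 0 -- nothing was sent
```

Dana's "double check" advice amounts to verifying that the denied path really behaves like this, i.e. that no analytics cookies appear after a visitor declines.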

Amanda: Yeah, I was going to say, I have certainly been in my own rabbit holes, it can get kind of overwhelming when there's a million different things you can look into. It seems very exciting in the moment. But yeah, in terms of actual productivity, I'm not sure it's entirely useful.

Dana: No. 

Amanda: Can you give an example of a reporting setup that you've used, that's been really successful?

Dana: Yeah, definitely. So, a lot of it depends on what kind of client or leadership you're talking to. And I've sort of broken it down into three buckets in my mind. The first one is: I want to see lines that go up and to the right. Those people like to have charts, they like to have graphs, right? But they really only need a single page with lots of lines going up and to the right. And so, in that case, we'll include details on traffic, on number of leads, on how things are going. Those are the people who want to see all the details, but they don't need too many details. So, that's where I would report on traffic, like visitors, phone calls, and traffic from Google My Business broken out separately from traffic from Google Ads, right, that kind of information. And then the second kind is the person who says, I want to know that this is happening. So: I want to know my phone is ringing. And for those people, you really just need to start with a top-line report, like how many times their phone rang last month. And we'll actually call it that in the report. We won't say phone call conversions. If the client says to us, I want to know my phone is ringing, that metric is going to say, is your phone ringing? Because that speaks more to the client than "phone call conversions, goal 6," and that's just not a great way of presenting that information.

So, really try to use the client's own words when you get back to them. This kind of microcopy in reports is really important to making sure they engage with the reports and are interested in them, because it's speaking to them. So yeah, one example I'm looking at right now on my other screen is one we made for psychologists, and the top-line metrics are: is your phone ringing? Are you getting emails? Are visitors booking appointments? Those are the three things they super care about. And then from there, we get into, like, how many people are finding them via Google My Business? How many people are coming in via organic? And they sometimes look at that. But really, the top three things, people calling us, that's what they want to know. So, it's a really short report from that perspective. And then the other type of client is the one who's like, tell me everything, I want to know it all, I'm super into SEO, I read Search Engine Land, that kind of stuff, I'm going to ask you questions about Google Analytics changes, right? And for those kinds of clients, give them lots of stuff. We'll actually include some pretty complex tables, we might have a conversion funnel, and we'll look at, say, rolling dashboards, where we compare sessions over the last 30 days to the previous 30, 90 to 90, 180 to 180, 365 to 365, really looking at trends over time. And so, you can go a lot more complex with those kinds of clients. But you would never deliver that kind of report to the client who just wants to know that their phone is ringing, because they literally don't care, and you're giving them something they don't need. And as an agency in particular, I think that can really sour the relationship you have with the client, because you're not really listening to them and respecting them and giving them what they need.
And I think a lot of agencies worry, they're like, Oh, you know, we're charging this client $5,000 a month, we got to make sure this report reflects that. They are not paying you to write reports, right? They're paying you to get results. And so, the more succinct you can be in that report, is really a service for that client. Start small, if they ask for more things, add it later, but don't start with 20 pages, and then try to come back to two, because that's not how that works.

Amanda: All right. Yeah. That's a great example. How often are you looking at your analytics in order to pull reports? I mean, presumably, clients expect reports once a month, once a quarter, what have you. But how often are you and your team internally looking at these analytics to make sure that things are on track?

Dana: I think it depends on the client, and the amount of touches that they need. A lot of clients will have things like automated Google Analytics alerts set up. So, if traffic suddenly increases or drops, we'll have an email sent to that client's Slack channel, for example. And I mean, obviously, it misfires on things like Labor Day. Traffic's down? Well, yeah, it's a holiday, calm down, you know, that's going to happen. But then, in general, knowing that things are doing well, that organic is up 20% week after week, for example, that's something exciting. And so, that's where the team for sure is looking at that. And that's where those diagnostic-type reports really come in handy, because they help you adjust things in the moment. And then we use Google Data Studio for all of our reporting. When we make a dashboard for a client, usually we set it up so that they can change the time period, and then they can look at whatever their heart desires. Some of them want fixed timeframes, if they have to report to somebody else, but then they can always change it if they need to. But that report, they always have access to it. They don't have to wait for an email from us to tell them how they've done. They can always look whenever they want.

Amanda: If somebody's listening to this, and they know that they need to update their analytics, but in doing so, and getting more accurate and specific, their numbers get worse. Like, outwardly worse. What do you do to make that transition? Like, okay, this is more accurate, but it doesn't look as good.

Dana: Oh, yeah. And I have a perfect example of this. We took over a client from another agency, and this client has multiple locations. The other agency had recorded any view of a location page as a conversion, which is just such crap, right? Like, come on. I don't know if they did it maliciously, or they were just ignorant. No excuse either way; I was really irritated when I found that out. So, I had a call with the client, and I was like, okay, here's what's going to happen: we're going to change this so it's recording taps on your email address, taps on your phone number, and actual phone calls, because we're using CallRail for call tracking, and form fills as goals, instead of views of this page. Your analytics are going to look really bad if you just go in and look at conversions. Because this is a client who's a "tell me everything" type of client, if you look year over year, it's going to look bad, just FYI. And so, we really just had a conversation with them. And then in their dashboard itself, we actually have a whole section called important dates, where we list out, like: here's the date that we changed your goals from total BS to actual reality. Here's the date that your new website launched. Here's the date that we turned on this new goal that we weren't tracking before, because now we have a better way of tracking the site. And that's all really helpful for the client to remember, because if they go in and pull a report and they're like, oh, this looks really bad, why? The important dates are right there for them to refresh their memory. So, I'm looking at one of them right now: your new website launched on this date, and goals were switched that day from tracking this to this. So, year over year, just for sheer goal volume, it's not going to be accurate until launch plus one year, when we will add them back in.
So, we just don't even include them, because we know that that metric is crap.

Amanda: Yeah, that's really interesting. And I've seen that internally at previous companies, where I could tell that people just kind of used analytics as a vanity thing. They just wanted to make it seem good. It's not even that what we were doing wasn't working; they just didn't want to dig in and report on anything.

Dana: And eventually, it'll stop working. Like, that's not how analytics works. And I think it's really difficult to replicate success if you don't truly understand why you were successful in the first place. But I think there's also this real fear, like, if it's going really well and I look at what's going on, I might find out that the goal was set up wrong, or whatever. And this stuff happens all the time. And especially for this client, they knew that they weren't getting as many leads as this company was reporting, but they didn't know what the problem was. And they told the company, and the company was like, oh no, totally, 300 leads, and they were like, but we only got 100 phone calls. And so, I think that makes the client not trust you. So, just be honest, and if you screw up, you can also be honest about that. Like, that's okay. And if they're a good client, they'll understand. Just change it so you won't screw up again.

Amanda: Yeah. Is it similar advice for trying to get buy-in for like a new analytics set up internally, if you're trying to get people on board with tracking certain things over others?

Dana: Yeah, I mean, I think that a lot of the time, the analytics setups we're working with, to be frank, are quite rudimentary. So, when we show the kinds of things that they can track, people get excited. And so, we don't really have too much of a problem selling that stuff internally, especially with internal marketing teams who have been asked hard questions from leadership about why they should have this money, or whatever it is. If we can give them the tools to help them prove to their boss that they're valuable, they will authorize that, you know. And I find that whenever a company doesn't want to invest in analytics, there's something else going on there, some sort of internal politics, or there's somebody who's got their own pet analytics agency that they want to bring in. But I have to say, a lot of people are really excited about getting this set up right. And they're willing to put in the time and effort to just be able to trust their data. It seems so basic; we've had analytics since, I think it came out in 2004. Like, we've had it for 16 years now. And yet, still, it's often set up poorly. And if a company has a setup that they trust, they get really excited about it. And that's always good.

Amanda: I think you've talked before about bringing offsite marketing efforts, and some of the channels that are not as obvious to integrate into your analytics, into the fold. What are some ways to do that?

Dana: Yeah, I mean, a lot of it is tracking numbers and tracking URLs, which are pretty basic stuff. If you're using a tracking number, use something like CallRail, and then report it as offline traffic. We'll often set up a custom channel in Google Analytics for things like offline advertising. And so, we'll just set the medium to offline, for example, and then it will record, say, postcard visits. One of our clients has a seniors' discount, and seniors love lawn bowling, so he has an ad on the lawn bowling scorecards at his local club. And that URL just redirects to a page about the seniors' discount, and it includes UTM parameters on the redirect, so that we know how many visits he got, for example, and how many conversions he got from that ad. Sometimes it's just simple stuff like that. And other times, it's things like getting the blocking chart for radio ads, so you know when they were played, and then looking at things like traffic by day and hour, which is something you can pull in Analytics, and seeing if there's a correlation between when the radio ad was played and when people actually went to the website. Often we'll see a lift in things like direct traffic as a result of radio ads. And sometimes tracking URLs don't work as well; people might just Google you instead, so you'll see more branded searches during that time too.
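The UTM-tagged redirect Dana describes is simple to construct. A small sketch, using the standard WHATWG `URL` API; the landing page, source, and campaign values here are illustrative, not the client's actual campaign:

```typescript
// Build the UTM-tagged URL an offline redirect (e.g. a short URL printed on
// a lawn bowling scorecard) would land on, so the visit shows up under an
// "offline" medium in analytics. All parameter values are hypothetical.
function utmUrl(
  landingPage: string,
  source: string,
  campaign: string,
  medium = 'offline'
): string {
  const url = new URL(landingPage);
  url.searchParams.set('utm_source', source);
  url.searchParams.set('utm_medium', medium);
  url.searchParams.set('utm_campaign', campaign);
  return url.toString();
}

console.log(
  utmUrl('https://example.com/seniors-discount', 'scorecard', 'lawn-bowling')
);
// https://example.com/seniors-discount?utm_source=scorecard&utm_medium=offline&utm_campaign=lawn-bowling
```

The printed short URL then 301-redirects to this tagged URL, and a channel rule matching `medium = offline` groups these visits under the offline channel.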

Amanda: Yeah. What do you think about when people are using multiple tools? Like you mentioned at the top of the show, for example for us, not for our clients but for ourselves, we have Google Analytics, but we also have HubSpot tracking. And it is interesting to think about where the best place is to analyze some of this, when we have it in two different places.

Dana: Yeah, I think you have to pick which one you're going to trust, like which is your sole source of truth, and also really understand the difference between the attribution windows for different products. So, I think HubSpot is a first-touch attribution system. I'm saying this off the top of my head, I could be wrong, and people who are listening, if I'm wrong, please let me know. But I think that's what it is. And then something like Analytics is last non-direct click. So, if I came to the site via, say, an ad, and then I visited the site 18 more times via organic, and then I converted via organic, Analytics is going to give the credit to organic, but HubSpot will give the credit to ads. Things like Facebook, for example, have a 21-day attribution window, and the ads will pick the last non-direct ads click, not the organic click. So, you're going to end up with different attributions in different places. And that's where just understanding the different attribution models for the different products that you have is really key.
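The gap between those two models is easy to see in code. A sketch under the assumptions Dana states (first touch for one tool, last non-direct click for Analytics); the touchpoint path is hypothetical:

```typescript
// Why different tools report different numbers for the same visitor: they
// hand credit to different touchpoints in the same path. 'direct' is skipped
// by the last non-direct-click model, as in Google Analytics.
type Channel = 'ads' | 'organic' | 'direct' | 'social';

function firstTouch(path: Channel[]): Channel {
  return path[0];
}

function lastNonDirectClick(path: Channel[]): Channel {
  for (let i = path.length - 1; i >= 0; i--) {
    if (path[i] !== 'direct') return path[i];
  }
  return 'direct'; // the whole path was direct visits
}

// Dana's example: entered via an ad, then kept returning via organic.
const path: Channel[] = ['ads', 'organic', 'organic', 'direct', 'organic'];
console.log(firstTouch(path));         // ads      (what a first-touch tool credits)
console.log(lastNonDirectClick(path)); // organic  (what GA credits)
```

Same visitor, same conversion, two different "winning" channels, which is exactly why the two dashboards disagree.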

Amanda: Is there a middle ground you can try to achieve between first and last touch, so as to give credit across the board?

Dana: Yeah, we usually will look at how people actually engage with the site, and then we'll run a few test models using Google Analytics. You can compare models down in the Conversions section; it's like the very last thing on the left in Analytics. I'm just looking it up to see what it's called: it's called Multi-Channel Funnels, and then you go into the Model Comparison Tool. It is literally the very last thing on the left in Analytics, and it's the most important, in my opinion. So, go in there and try last non-direct click. The tool defaults to last click, which isn't actually what Analytics uses in its own reports; it uses last non-direct click, and I don't know why they do that. And then we might compare first touch to last touch and see if there's a huge difference between those two. We'll also look at the reports above that, which are time lag and path length. Time lag is the number of days, and path length is the number of sessions. And so, for example, if somebody has a very short time lag, maybe it's always zero days, but they have a path length of, say, three, that means those three sessions all happened in one day. So, somebody looked at the website, went away for half an hour, because the session timeout default in Analytics is half an hour, then came back again, then went away again for half an hour, and then came back again. And the frustrating thing about that is how Google records conversions: it does it based on sessions, not users. So, if I came to your website four times in the same day, and I converted on that fourth time, you would have a conversion rate of 25%, but I'm still one person, so it should be 100%. You can recalculate this using formulas in Google Data Studio. Again, not 100% accurate, but certainly better than the way sessions divide things up.
So, you do want to be a little bit more accurate in that way. But yeah, go into the Model Comparison Tool, try some different models, and see what works. I think you probably don't need a custom model; you can probably pick one that works from what's in there. A lot of the time, we end up with linear, which gives equal credit across the entire lifecycle. And then in our reports in Google Data Studio, if the client is truly interested, we'll show them the percentage of first touch and last touch by channel, or we'll just use whatever the Model Comparison Tool says the conversions are, and we'll use that for our conversion rates.
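Dana's 25%-versus-100% point, sessions versus users, can be shown with a tiny worked example. The data here is hypothetical: one person who visits four times in a day and converts on the fourth visit.

```typescript
// Session-based vs user-based conversion rate for the same behavior.
// Sample data is hypothetical: one visitor, four sessions, one conversion.
interface Session {
  userId: string;
  converted: boolean;
}

const sessions: Session[] = [
  { userId: 'u1', converted: false },
  { userId: 'u1', converted: false },
  { userId: 'u1', converted: false },
  { userId: 'u1', converted: true },
];

// What Google Analytics reports: conversions divided by sessions.
const sessionRate = sessions.filter(s => s.converted).length / sessions.length;

// The per-user view: did this person convert at all?
const users = new Set(sessions.map(s => s.userId));
const convertedUsers = new Set(sessions.filter(s => s.converted).map(s => s.userId));
const userRate = convertedUsers.size / users.size;

console.log(sessionRate); // 0.25 -- the 25% figure Dana mentions
console.log(userRate);    // 1    -- per person, arguably the truer number
```

This is the kind of recalculation Dana says can be approximated with formulas in Google Data Studio: divide conversions by users instead of by sessions.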

Amanda: Got it. Is there anything that you haven't mentioned that are some of the biggest misconceptions or the biggest mistakes that you see happening in Google Analytics?

Dana: Oh, I think that session timeout thing is something a lot of people are really unaware of. Just because, you know, it's not a prominent setting, it's buried in the admin, so people don't think about it. So, if I'm on a website and I'm reading something, let's say I'm on that page for 20 minutes, and then I go have lunch, and I'm away for 31 minutes, and I come back again, that's a new session now. And what happens too, by default in Google Analytics, if you don't have some sort of timer or something running through Tag Manager to record that somebody is actually on the page, that person's time on page is zero, because they didn't do a second action. So, often your time on page and your time on site are totally wrong, because you're not actually capturing the amount of time people spent on that page. There's a new version of Google Analytics called App + Web, which is still in beta; a lot of this is better in that product. You can also do some things in Google Tag Manager to fix it. This is where content consumption actually comes in really handy, to make your bounce rate more accurate. But for sure, that's a misconception. And then I think the other thing is that people don't understand what goes into bounce rate and what makes bounce rate what it is. So, really go through and figure out what are the things that someone could do on the site to become not bounced, and make sure you know what that list is, so that you know your bounce rate is relatively accurate based on what people are truly doing on the site.
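Dana's lunch-break example comes down to one rule: hits more than 30 minutes apart land in separate sessions. A minimal sketch of that rule, with hit times as minutes from the first visit (the data is hypothetical):

```typescript
// Sketch of Google Analytics' default 30-minute session timeout: any gap
// between hits longer than the timeout starts a new session, so one reader
// can count as several "visits".
const SESSION_TIMEOUT_MINUTES = 30;

function countSessions(hitTimesMinutes: number[]): number {
  if (hitTimesMinutes.length === 0) return 0;
  let sessions = 1;
  for (let i = 1; i < hitTimesMinutes.length; i++) {
    if (hitTimesMinutes[i] - hitTimesMinutes[i - 1] > SESSION_TIMEOUT_MINUTES) {
      sessions++;
    }
  }
  return sessions;
}

// Dana's lunch example: a hit at minute 0, then the next hit 51 minutes
// later (20 minutes of reading plus a 31-minute lunch with no hits in
// between) -> two sessions for one person.
console.log(countSessions([0, 51])); // 2
console.log(countSessions([0, 20])); // 1
```

And because the 20 minutes of reading produced no second hit, that whole stretch also reads as zero time on page, which is the other half of the problem she describes.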

Amanda: When you were talking about the session timeout issue, what is the actual way you can go in and try to eliminate that inaccuracy? You said there was a way to do that in Tag Manager.

Dana: Yeah. So, you can run things like timer events. Some of them are pretty event heavy, so you might end up running too many events in Google Analytics. But you can do things like run an event every one minute, or 30 seconds, or five seconds if that's your fancy, while they're active on the page, just to be like, yes, you're still alive; yes, you're still alive. That will definitely change your time on page significantly. We ran it on our own site, and our time on page was something like 70 minutes on one of our longest blog posts, because people are really engaging with it, which is great, but I can guarantee it wasn't saying that before. And there are a lot of different blog posts out there about it; I can't think of one specific method out of those that we've used. But if you Google things like running a timer in Google Tag Manager, or running a timer event, there are lots of different resources out there on how to do this. And I would say, out of all the different Tag Manager resources, the one that I go to the most is Simo Ahava, which is S-I-M-O A-H-A-V-A. He is essentially a wizard in Google Tag Manager and has a really fantastic blog, and when you're first starting out, you may think that he is just wild and out there, but eventually you will read his tutorials and you will definitely learn how to do a lot of this stuff.
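The heartbeat timer Dana describes can be sketched roughly like this. This is not any specific published recipe (Simo Ahava's blog has several proper ones); the event name, interval, and dataLayer stand-in are all assumptions for illustration:

```typescript
// Rough sketch of a "heartbeat" timer: while the visitor is active, push a
// timer event at a fixed interval so Analytics sees an interaction and time
// on page stops reading as zero. Event name and interval are assumptions.
const HEARTBEAT_SECONDS = 30;

// Stand-in for Google Tag Manager's dataLayer array.
const dataLayer: Array<{ event: string; seconds: number }> = [];

function pushHeartbeat(): void {
  dataLayer.push({ event: 'heartbeat', seconds: HEARTBEAT_SECONDS });
}

// In the browser you would wire this to a real timer, roughly:
//   setInterval(() => { if (!document.hidden) pushHeartbeat(); },
//               HEARTBEAT_SECONDS * 1000);

// How many "still alive" events a visit of a given length generates:
function heartbeatsFired(secondsOnPage: number, interval = HEARTBEAT_SECONDS): number {
  return Math.floor(secondsOnPage / interval);
}

// A 20-minute read at a 30-second interval sends 40 heartbeats, so the time
// is credited even if the reader never clicks anything else:
console.log(heartbeatsFired(20 * 60)); // 40
```

This is also where the "pretty event heavy" caveat bites: a five-second interval multiplies the hit volume by six versus a 30-second one, so the interval is a trade-off between accuracy and hit quota.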

Amanda: That's awesome. I'll make sure to link to that in the show notes, people can check it out. Is there anything that you haven't been asked in interviews like this, that you think is important to address for growth marketers in terms of analytics?

Dana: Oh, what a good question. I think, I don't know. Actually, I think my big thing right now is just: recognize sessions, recognize session length. People don't often ask whether you should change the session length. I think that's an interesting question, like whether you should make the session length longer or shorter. Actually, Ruth Burr Reedy, who, I don't know if she's been on your podcast yet?

Amanda: Yes, she has.

Dana: So, I don't know if she talked about it or not. But she often talks about changing your session length, and it's something that I didn't even think of until Ruth mentioned it. So, I think that that's something to think about as well. It's like, if you're running a timer event, and you're finding that your sessions run super long, generally, maybe you should change that session length.

Amanda: Awesome. Well, thank you so much for taking the time Dana to share your knowledge. I appreciate you coming on the show. 

Dana: You bet. Thanks for having me.

Join the Fractl community and get marketing tips directly to your inbox!