Google Analytics Channel Attribution Made Us Switch Off Our Growth Engine

Nobody sits down to make a channel investment decision they hope will immediately lose millions of dollars and get them axed, but that’s what’s at risk if you’re relying purely on cookie-based attribution. Attribution expert Michael Taylor compares cookie-based attribution to “looking under a street light for a wallet you lost on the other side of the street, because that’s where the light is.” In this story, I recap how a global ecommerce business switched off one of its core growth drivers, without even realizing it! More specifically, it explains how we found out that “unattributable” sponsored YouTube videos were responsible for the lion’s share of brand discovery amongst buyers, an insight that would have been impossible to surface with GA’s out-of-the-box cookie-based attribution reports alone.

DISCLAIMER: there’s no perfect way to measure marketing effectiveness. Each method has its drawbacks, but each can also give you an important perspective to spark meaningful discussion. Here we cover one to help you learn and to align stakeholders to the uncomfortable truth…

We can’t rely on just the cookie

My consultancy, Unvanity, kicked off this new client engagement in early 2021. The cookie apocalypse was picking up pace, and the client had briefed us that traffic, conversion rates, and sales were all trending down. In the months leading up to our start, the performance marketer had been instructed to significantly increase ad spend during the holiday season, which produced disappointing results. After pulling back, their conversion rates dropped to new lows instead of recovering to old benchmarks, leaving them worse off than they were before the big spend period. They feared the iOS 14 launch had messed up Facebook’s ability to show them to the right audiences, and that they needed the fresh-off-the-shelf server-side Google Tag Manager (GTM).

Together with my full-stack dev, Warren, data engineer, James, and project lead, Nitesh, I had spent the best part of a year implementing some of the most bespoke tracking and data capabilities I had seen in years of orchestrating CDPs. We were helping to collect billions of data points a month across a spectrum of businesses and industries, and were pretty confident that server-side tagging, on its own, wouldn’t answer the client’s question about where all their sales had gone, nor bring those sales back. The company lacked hard facts about what had happened or where growth had actually come from, and was suffering from shiny-new-tool syndrome. Bespoke customer data integrations from Unvanity to the rescue.

For context, the business sold solely to iOS users via a Shopify store (and wholesale). Even before our web tracking audits, we knew that two likely issues impacting data quality were Safari’s Intelligent Tracking Prevention (ITP) shredding cookies and the use of ad blockers. However, the product’s price point and use case put it in the impulse purchase category, which meant the typical consideration and purchase phase happened the same day, often in the first session… so there wasn’t enough lag for the cookie shredding to falsely attribute too many purchases. Still, it was important to get this right: if we misallocated budget, we could fail to hit our ambitious targets and lose ground to fierce competition. If we got it right, we could have an unfair advantage in the market.

Rather than over-rely on one attribution method, we decided on a 3-sided pilot approach, with the #1 objective of creating a unique POV and sharpening our understanding of the situation. Our approach was:

  1. Exploratory business analytics: leaning into existing data assets, with caution (GA, Shopify, Search Console, Google Trends, PPC data): benchmarking, trendlining, and tagging reports with metadata collected via workshops with the marketing team.
  2. Marketing Mix Modeling pilot: we developed a new tactic to get YouTubers openly sharing their data, combined with internal and external data attributes. We also looked at seasonality patterns across leading demand indicators.
  3. Manual attribution survey: integrated with Google Analytics so we could compare responses against cookie-based attribution.

We worked on all three, but 1 and 2 from that list are big topics, out of scope for this post, so instead we’ll expand on the 3rd! That’s not to say the first and second methods didn’t yield insight; they did. It’s just that the third produced a visual every marketer needs to see, because it really hits home why you can’t fully trust the cookie. Essential learning to win in today’s environment.

Broken cookie attribution in a nutshell

Just like many of today’s leading D2C businesses, the client had tapped new growth by piggybacking on podcasters and YouTubers. However, these channels created headaches when trying to compare against their other channel investments. That’s because a new brand-aware visitor created by these sorts of sponsorships doesn’t always generate a click: the brand sticks in their brain, making them more likely to search for it later on. No click. No attribution. Right?

You could try “tagging” each channel by giving the channel owners a unique promo code. But if you’re sponsoring hundreds of channels, or if you have a company policy to NEVER give discounts, or if you don’t see high referral program participation rates amongst channel sponsorships (just like this client), it’s unavoidable that a percentage of people who learn about your product or service through these awareness investments will simply search for you on Google instead of clicking on a UTM-tagged link in the video description.

This means that people finding out about your business via these channels won’t appear in your Google Analytics reports, or at least they’ll be disguised as ‘direct’ traffic or organic search. That’s exactly what was happening for this brand, and it eventually became a catalyst for the decision to switch off YouTube channel sponsorships… which just so happened to coincide with the period where their sales fell off a cliff.

Just because something isn’t “easily” measurable does not make it unworthy

Early in the collaboration, the company let us know they were collecting post-purchase survey data with a Typeform emailed after checkout. We wanted to know where buyers first remembered hearing about us, rather than blindly following what Google’s last/first click attribution reports were telling us. So we decided to integrate this survey with their Google Analytics instance to compare responses against what the cookie said, and to include the survey on the purchase success page to boost response rates.

The survey would ask buyers to select the channel where they had first heard about us, like the example screenshot below.

Then we’d ask them to be more specific. So if they selected YouTube: was it a YouTuber, a channel sponsorship, or an ad? This way we could get a signal on which efforts within that channel were moving the needle.
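To make the drill-down concrete, here’s a tiny sketch of how that two-step question could be modelled; the channel names and follow-up options below are illustrative assumptions, not the client’s actual survey copy.

```typescript
// Illustrative only: channels and follow-up options are assumptions,
// not the client's real survey copy.
type Channel = "YouTube" | "Podcast" | "Paid Social" | "Search" | "Friend / Word of mouth";

const followUpOptions: Record<Channel, string[]> = {
  "YouTube": ["A YouTuber we sponsored", "A YouTube ad", "An organic video / review"],
  "Podcast": ["A sponsored episode", "An organic mention"],
  "Paid Social": ["Instagram ad", "Facebook ad", "TikTok ad"],
  "Search": ["Searched for the brand", "Searched for the product category"],
  "Friend / Word of mouth": ["A recommendation", "Saw someone using it"],
};

// The survey jumps to the follow-up question for whichever channel was picked.
function followUpFor(channel: Channel): string[] {
  return followUpOptions[channel];
}
```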

The integration was straightforward enough. We’d use the Typeform to trigger a GA measurement event via Zapier. This would let us monitor form responses in native GA reports, and also let us create audiences based on where people said they’d found out about us, or compare manual attribution against the cookie. Let’s go!
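Under the hood, the Zapier step just needs to forward the Typeform answer to GA as a Measurement Protocol hit carrying the buyer’s client ID. Here’s a minimal sketch of that call, assuming a Universal Analytics property; the property ID, event category, and action names are placeholders, and in practice Zapier’s built-in webhook step can send this for you.

```typescript
// Minimal sketch: forward a Typeform answer to Universal Analytics via the
// Measurement Protocol. Property ID and event naming are placeholders.
async function sendSurveyHit(clientId: string, channelAnswer: string): Promise<void> {
  const params = new URLSearchParams({
    v: "1",                    // Measurement Protocol version
    tid: "UA-XXXXXXX-1",       // placeholder GA property ID
    cid: clientId,             // GA client ID captured in a hidden Typeform field
    t: "event",                // hit type
    ec: "manual-attribution",  // event category (our naming, an assumption)
    ea: "survey-response",     // event action (our naming, an assumption)
    el: channelAnswer,         // event label = where they said they first heard of us
    ni: "1",                   // non-interaction hit, so it doesn't skew bounce rate
  });

  await fetch("https://www.google-analytics.com/collect", {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: params.toString(),
  });
}
```

Because the hit carries the same client ID as the buyer’s browsing sessions, GA stitches it onto the existing user history, which is what makes the survey-vs-cookie comparison possible.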




To build the Google Data Studio report, we segmented buyers by the default GA channels, broken down by the manual attribution collected via the survey. Imagine seeing the following…

How to read this table:

  • Along the left-hand side, in column A, you have what buyers said in the survey
  • Each column represents the channel GA attributed them to, based on cookies
  • Cells with a large share of all visitors are in green; cells with a low share are in red
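For readers who want to rebuild this outside Data Studio, the underlying logic is just a cross-tab of survey answer against GA channel. A rough sketch, with illustrative field names (in practice Data Studio handled the pivoting for us):

```typescript
// Rough sketch of the pivot behind the report: count buyers by
// (survey answer, GA cookie channel). Field names are illustrative.
interface Buyer {
  surveyChannel: string; // what they told us in the post-purchase survey
  gaChannel: string;     // default channel grouping GA assigned via cookies
}

function crossTab(buyers: Buyer[]): Map<string, Map<string, number>> {
  const table = new Map<string, Map<string, number>>();
  for (const b of buyers) {
    const row = table.get(b.surveyChannel) ?? new Map<string, number>();
    row.set(b.gaChannel, (row.get(b.gaChannel) ?? 0) + 1);
    table.set(b.surveyChannel, row);
  }
  return table;
}

// e.g. crossTab(buyers).get("YouTube sponsorship")?.get("Organic Search")
// tells you how many buyers GA filed under Organic Search even though they
// said they first heard about the brand from a sponsored YouTube video.
```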


Notice how the biggest channel contributor is YouTube, hidden within each of the cookie-based channels? Imagine seeing spikes in paid social or organic search in native out-of-the-box GA reports… and thinking you’d produced a win on those channels, when in reality one of your sponsored videos on YouTube had gone live and was going viral. Imagine doubling down on efforts across those other channels. And pulling back on YouTube. Oops!

The caveat here, of course, is that people filling out the form could be lying, or misremembering where they had first heard about us… plus there are a number of implementation caveats that need caution (more on those in the implementation comments at the end). But remember what I said earlier: the objective of the 3-sided approach was to produce some new perspectives. There’s no single, one-size-fits-all attribution method. But this one, this time, was clearly highlighting that YouTube channel sponsorships were likely being under-reported, if not unreported entirely, and we knew from our other efforts that sales had begun to decline in the same period where YouTube channel sponsorships had been pulled back… so we definitely asked the question: why had they pulled back spend? 😉

Conclusion

Where people actually discover your brand and where your analytics vendors tell you they do can paint completely different pictures of what’s driving the business. Trusting the cookie blindly can lead to incorrect allocation of marketing spend or, worse, nuking your growth channel. Relying on only one attribution method could have lost me my job, or the client theirs, or even put the company out of business! Out-of-the-box GA reports rarely cut it. However, with a little bit of ingenuity and data integration prowess, you can compare a manual survey against cookie-based attribution. Try this at your own office to help colleagues dive one level deeper into the attribution rabbit hole. If you get stuck, don’t hesitate to reach out.

Implementation comments: To ensure the Google Analytics event sent from Zapier gets stitched to the correct user history, you’ll need to collect an identifier, like the Google Analytics cookie ID (client ID). While Typeform lets you add and populate hidden form fields using JavaScript, HTML, or query parameters, there are a few pitfalls to navigate when ingesting the client ID this way. If you have the support of a developer, it’s fine. Otherwise you’d be better off avoiding iframe-based forms altogether and triggering the event client-side. However, the further your survey respondents are from the initial device and date of the first session, the more likely the cookie ID will have changed, which would defeat the purpose of the analysis. So if you can’t get the survey answered close to the first session and on the same device, you’ll need to store the initial session UTMs and a unique user identifier, like an email, to later compare against a survey response. An example might be storing these values in local storage on the first session and collecting them in newsletter signups, anti-bounce popup forms, or demo request forms, with an automation in your CRM that makes a copy of those values the first time a record sees them. Then, when a prospect does eventually buy, you’d compare the survey responses against those values, using the email as the common identifier. The joys of being data driven.
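As a concrete example of the client-side piece, here’s a hedged sketch of grabbing the client ID and stashing first-touch UTMs, assuming a Universal Analytics _ga cookie (format GA1.2.<random>.<timestamp>, of which the last two segments form the client ID). The localStorage key is our own naming, not a standard.

```typescript
// Sketch: read the GA client ID from the _ga cookie and stash first-session
// UTMs in localStorage, so they can be joined to a survey response later.
// Cookie format assumed: GA1.2.<random>.<timestamp> (Universal Analytics).
function getGaClientId(): string | undefined {
  const match = document.cookie.match(/(?:^|;\s*)_ga=([^;]+)/);
  if (!match) return undefined;
  const parts = match[1].split(".");
  return parts.length >= 4 ? parts.slice(-2).join(".") : undefined;
}

// Persist first-touch UTMs once, on the very first session only.
function storeFirstTouchUtms(): void {
  const KEY = "first_touch_utms"; // our own key name, not a standard
  if (localStorage.getItem(KEY)) return; // already captured on an earlier visit
  const params = new URLSearchParams(window.location.search);
  const utms: Record<string, string> = {};
  for (const name of ["utm_source", "utm_medium", "utm_campaign", "utm_content", "utm_term"]) {
    const value = params.get(name);
    if (value) utms[name] = value;
  }
  if (Object.keys(utms).length > 0) {
    localStorage.setItem(KEY, JSON.stringify(utms));
  }
}
```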

About The Author