At adMixt, our top priority is ensuring clients’ campaigns meet their goals. That means proving our return on ad spend (ROAS) or cost per acquisition (CPA) performance. Targeting, creative testing, and optimization all play a role, but measurement is key to all of them. Without accurate measurement, optimizations don’t work, and we’re left guessing at the best course of action. The further we go toward quantifying the total contribution of a paid campaign, the better we’re able to manage client budgets and scale their business.
We’ve talked about tracking with Facebook’s Offsite Conversion Pixels, and we’ve talked about setting up Goal Tracking with Google Analytics. Now let’s look at a real-world example of how we use this data to measure performance, and how we can fill in some of the gaps. The metric that will help us unlock a ton of accurate, actionable insight is post-click data, and I’ll get into that in a minute.
For this example, we’ll use some anonymized client data from an online retailer called adMixt’s Premium Erasers (APE is not a real business). APE has a loyal customer base and runs periodic sales to draw in new prospects and engage past customers.
APE is conservative in their transaction attribution, and they only track last-click attribution in their internal analytics system. They might look at Facebook’s post-click data, but it does not factor into their CPA or ROAS goals. So let’s look at their orders for the month of April:
APE spent $20,000 over 12 days in April and tracked 638 orders, giving us a cost per acquisition of $31. That’s more than 10% below their target CPA of $35, so it looks like we did pretty well. Now let’s add in the numbers from Facebook’s Offsite Conversion Pixels. We’re only looking at conversions tracked within 28 days of users clicking on APE’s ads, and we’re excluding post-impression conversions completely.
Whoa. Facebook is showing almost 3x as many post-click transactions as we’re tracking with last-click! That brings our CPA down to under $12. By this measurement, we could have profitably spent a lot more for APE and grown their business.
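The arithmetic behind those two CPA figures is just division over the same $20,000 of spend. Here’s a quick sketch; note that the exact post-click order count is illustrative, since all we know is it was “almost 3x” the 638 last-click orders:

```python
spend = 20_000           # April spend in dollars
last_click_orders = 638  # orders tracked by APE's internal last-click system
target_cpa = 35

last_click_cpa = spend / last_click_orders
print(f"last-click CPA: ${last_click_cpa:.2f}")                 # $31.35
print(f"percent below target: {1 - last_click_cpa / target_cpa:.1%}")

# "Almost 3x" the last-click count; this exact figure is hypothetical.
post_click_orders = 1_750
post_click_cpa = spend / post_click_orders
print(f"post-click CPA: ${post_click_cpa:.2f}")                 # $11.43
```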
So which numbers do we trust? And how can we explain the difference in numbers to APE?
To answer these questions, it’s helpful to review the differences between last-click and post-click:
1. Last-click is tracked using URL parameters appended to your landing page. That leaves it open to a number of complications:
A. If your URL gets mangled or truncated by the user’s browser, the parameters could be passed incorrectly.
B. If a user clicks another URL that overwrites your URL parameters, even one from an unpaid media source (like organic search), you lose visibility into where that user originally came from.
C. Even clicking links within your own website can sometimes strip off last-click tracking codes.
D. If the user clears their cookies after clicking your link, a last-click system will lose track of the user.
2. Post-click is tracked internally by Facebook, based on its record of which users have clicked on which ads. Its offsite conversion pixels correlate that data with the users it sees triggering conversion events. This is a more aggressive way to track, for a few reasons:
A. Even if the user clicks additional links with different URL parameters, Facebook still knows the user clicked your ad and then converted.
B. Even if the user clicks your ad on one device and then converts on a different device, Facebook knows it’s the same user (if they’ve logged into Facebook on each device).
C. Clearing cookies has no effect on this type of post-click tracking, as long as the user logs back into Facebook after clearing them.
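The matching described above is identity-based rather than cookie-based. This toy sketch is not Facebook’s actual implementation, just an illustration of the idea: join click logs and conversion events on a logged-in user ID within the 28-day window, so device changes and cleared cookies don’t break the link the way stripped URL parameters do. All user IDs and dates here are hypothetical:

```python
from datetime import datetime, timedelta

# Hypothetical ad-click log keyed by a logged-in user ID.
ad_clicks = {
    "user_a": datetime(2014, 4, 2),   # clicked the ad on a phone
    "user_b": datetime(2014, 4, 5),   # clicked but never converted
}

# Hypothetical conversion events seen by the offsite pixel.
conversions = [
    ("user_a", datetime(2014, 4, 20)),  # converted later, on a laptop
    ("user_c", datetime(2014, 4, 21)),  # converted but never clicked an ad
]

WINDOW = timedelta(days=28)

# A conversion counts as post-click if the same user clicked an ad
# within the prior 28 days, regardless of device or cookies.
post_click = [
    user for user, when in conversions
    if user in ad_clicks and timedelta(0) <= when - ad_clicks[user] <= WINDOW
]
print(post_click)  # ['user_a']
```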
The big fear of trusting post-click conversion numbers is that transactions and revenue will not be properly attributed to the correct media spend. For example, if we take Facebook’s post-click numbers for APE, and then add in post-click numbers from their Google AdWords campaign, we’d find that the total number of post-click sales exceeded their total tracked sales sitewide. That’s because some users clicked on Facebook ads and Google ads before converting. They show as post-click transactions in both reports, each with their own cost incurred. If we trust all post-click numbers, we might inadvertently exceed our CPA goal.
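The double-counting problem is easy to see with a toy example (all user IDs hypothetical): each channel’s post-click report claims every converter who clicked that channel’s ads, so summing the per-channel reports counts overlapping users twice:

```python
# Converting users who clicked each channel's ads (hypothetical IDs).
facebook_post_click = {"u1", "u2", "u3", "u4"}
google_post_click = {"u3", "u4", "u5"}

# Naive sum of per-channel post-click reports.
per_channel_total = len(facebook_post_click) + len(google_post_click)

# True sitewide count: the union of converters, each counted once.
sitewide_total = len(facebook_post_click | google_post_click)

# u3 and u4 clicked ads on both channels before converting, so each
# channel's report claims them and the naive sum overstates sales.
print(per_channel_total, sitewide_total)  # 7 5
```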
This is where third-party analytics tools come in handy. Attribution-focused tools like Convertro exist to dive deep into these questions and offer solutions. But let’s see what we can do using free tools like Google Analytics.
Like Convertro, Google Analytics tracks every page on your site, so it has good visibility into where all your traffic is coming from. It can be used to track last-click conversions using UTM tracking codes, and also to analyze post-click data. Let’s take a look at APE’s last-click numbers in Google and see how they compare to the internal and Facebook numbers:
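For reference, a UTM-tagged landing URL for last-click tracking looks something like this (the domain and campaign names are hypothetical; `utm_source`, `utm_medium`, and `utm_campaign` are the standard Google Analytics parameters):

```text
https://example.com/sale?utm_source=facebook&utm_medium=paid-social&utm_campaign=april-sale
```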
Well that doesn’t help. Google’s last-click numbers are even lower than APE’s internal numbers. By this measurement, we missed APE’s goals completely. Looking day-by-day, we see that on the 18th and 19th, a campaign failed to track in Google Analytics. As we discussed previously, this can happen for a number of reasons. If those days had tracked accurately, Google Analytics would have matched APE’s numbers much more closely. One nice thing about last-click tracking is you rarely get false positives, but false negatives can be common.
So what other data does Google Analytics offer that might be of help? They’ve got a robust set of attribution modeling tools that can help identify how post-click transactions result in last-click transactions in other channels, but that’s a topic for another post. Today, let’s take a step back and look at how Facebook ads work and how users interact with social content.
When users see Facebook News Feed ads, they can interact with them as with any other post. Despite Facebook’s improvements to the Page Post Link Ad, with its larger text and call to action button, many users still click through to APE’s Facebook fan page rather than the ad’s link. We see a lot of photo views and post comments coming from our paid campaigns, so we know they’re visiting the page and interacting.
Users who do this before visiting APE’s ecommerce site would be tracked as post-click sales, but not last-click sales. Could these non-direct clicks be responsible for the big discrepancy between post-click and last-click?
To test this theory, we can use Google Analytics advanced filters to look at referral traffic from Facebook, making sure to exclude all last-click sales.
We can see an additional 678 sales coming from Facebook referral traffic in April. Lending evidence to our theory is the fact that on the 18th and 19th we see a big spike in referral traffic, corresponding with the last-click sales we expected to see. So how does this affect our CPA? To be fair, we have to recognize that even on days when no paid campaigns are running, we average 13 sales from Facebook referrals per day. So let’s remove 13 sales per day from our referral sales, without letting any day go negative.
That brings our Google Analytics CPA, including last-click and adjusted Facebook referrals, down to $17. Not bad! We’re about halfway between our post-click and last-click numbers, and more than 50% under our client’s CPA goal!
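That adjustment amounts to subtracting the 13-sale organic baseline from each day’s referral count, clamping at zero, then blending the result with last-click sales. Here’s a sketch; the per-day referral numbers are hypothetical (only the monthly total of 678 is from the report), so only the shape of the calculation matters:

```python
spend = 20_000
last_click_sales = 638
baseline = 13  # average daily Facebook referral sales with no paid campaigns

# Hypothetical per-day referral sales over the 12 campaign days;
# they sum to the reported monthly total of 678.
daily_referral_sales = [40, 55, 38, 62, 70, 95, 88, 51, 44, 60, 39, 36]

# Subtract the organic baseline each day, never going negative.
adjusted = sum(max(0, day - baseline) for day in daily_referral_sales)

blended_cpa = spend / (last_click_sales + adjusted)
print(f"blended CPA: ${blended_cpa:.2f}")  # about $17
```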
To close the gap even further, we can dive into Google’s attribution modeling tools, or implement a solution like Convertro, but even with these basic methods we can see that post-click numbers should not be discounted. They offer valuable insight into your media spend’s true impact. Advertisers committed to growing their business need to factor in this valuable data.