30-second summary:
- As third-party cookies are phased out and marketers search for alternative approaches, they may find themselves lost in a sea of data when attempting to measure and evaluate campaign impact
- Focusing on the quality of users instead of attributable conversions can mitigate the inconvenience of losing third-party cookies
- The shift from cookies to a new engagement model will require constant testing, so keep data simple where possible
For years now, digital marketers have been spoiled by third-party cookies and the ability to accurately track engagement – it has made life simple and reporting a campaign’s activity a breeze. Such an approach has allowed us to see, with minimal effort, how many conversions Meta, Criteo, or an influencer has contributed. But the eventual demise of third-party cookies demands accurate engagement data so that the transition to new identifiers can be as clear as possible. However, whether through ignorance or convenience, many advertisers still take inflated, overly optimistic metrics as the truth.
Counting your chickens before they’ve converted
Take Facebook, for example: it has no way of knowing to what extent its services contributed to a conversion. There are many ways of producing wildly inflated numbers, such as a single conversion with several touchpoints being credited to multiple channels, or even false positives. This is particularly troubling for those engaging in heavy remarketing aimed at users who have already visited or interacted with a site. One must ask the question – when working with inaccurate metrics, will remarketing actually contribute further conversions, or will it simply attribute misclicks to campaigns that don’t increase revenue?
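As a rough illustration of how that inflation happens, the sketch below uses hypothetical numbers to show platform-reported conversions exceeding what the order system actually recorded, once every channel that touched a multi-touch conversion claims full credit for it. The figures and channel names are purely illustrative.

```python
# Hypothetical illustration: each platform claims credit for any conversion
# it touched, so platform-reported totals can exceed actual conversions.
touched_conversions = {
    "facebook": 80,    # conversions where a Facebook ad was one touchpoint
    "google_ads": 90,  # conversions where a Google ad was one touchpoint
    "criteo": 60,      # conversions where a Criteo ad was one touchpoint
}

actual_conversions = 120  # what the order system actually recorded

platform_reported_total = sum(touched_conversions.values())   # 230
overcount = platform_reported_total - actual_conversions       # 110 "phantom" conversions

print(f"Platforms claim {platform_reported_total} conversions, "
      f"but only {actual_conversions} happened ({overcount} double-counted).")
```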
We as humans love to oversimplify things, especially complex patterns. Consider how complex a single visit to your webpage really is: a session is connected to a user, who carries attributes such as age, gender, location, and interests, as well as their current activity on your site. That user data is then sent to, for example, Google Ads as part of a remarketing list.
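To make that complexity concrete, here is a minimal sketch of what a single session record might carry before it feeds a remarketing list. The fields and the qualification rule are illustrative assumptions, not any platform’s actual schema or API.

```python
from dataclasses import dataclass, field

@dataclass
class Session:
    """Illustrative session record; fields are hypothetical, not a real platform schema."""
    user_id: str
    age_bracket: str
    gender: str
    location: str
    interests: list[str] = field(default_factory=list)
    pages_viewed: list[str] = field(default_factory=list)

# A visit populates a local audience list that could later be synced
# to an ad platform as a remarketing list.
remarketing_list: list[str] = []

visit = Session(
    user_id="u-123",
    age_bracket="25-34",
    gender="unknown",
    location="Stockholm",
    interests=["running", "outdoor"],
    pages_viewed=["/shoes", "/shoes/trail", "/checkout"],
)

if visit.pages_viewed:  # any on-site activity qualifies the user for remarketing
    remarketing_list.append(visit.user_id)
```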
Even the remarketing list introduces a notable variable when trying to make sense of conversions. Facebook and Google users are not 1:1; one user on Google is often connected to more devices and browsers than the average Facebook user. You could get a conversion from a device that Google has connected to the same user, while Facebook lacks any insight.
Each user who visits your website populates your remarketing lists. Those remarketing lists build “lookalike” audiences in Facebook and “similar” audiences in Google. These “similars” can be extremely useful: although traffic from one channel may be attributed few to no conversions, it could in fact help build the most efficient “similars” in Google Ads, which can then drive a large number of cheap conversions.
Identify data that helps you steer clear of over-attribution
All automated optimization efforts, whether campaign budget optimization (CBO) or Target CPA bidding, are dependent on data. The more data you feed the machines, the better the results. The bigger your remarketing lists, the more efficient your automatic/smart campaigns will be on Google. This is what makes the value of a user so multifaceted and incredibly complex, even before ad impressions are taken into account.
With this incredible complexity, we need an attribution model that can genuinely portray engagement data without inflating or underselling a campaign’s conversions. However, while many models are well suited to producing accurate results, it should be remembered that attribution is itself flawed. As consumers, we know that the actions driving us to convert in our personal lives are varied, and many of them simply cannot be tracked well enough to be attributed. While attribution cannot be perfect, it is essentially the best tool available, and it becomes far more useful when applied alongside other data points.
The last non-direct click attribution model
When trying to avoid inflated data, the easiest attribution model is last non-direct click. With this model, all direct traffic is ignored and all the credit for the conversion goes to the last channel that the customer clicked through, preventing conversions from being falsely attributed to multiple touchpoints. It is a simple model that considers only the bare minimum, yet it still solves the problem of over-attribution. This way, marketers can measure the effect rather than attributing parts of a conversion to different campaigns or channels. It really is a very straightforward approach; essentially, “if we do this to x, does that increase y?”. Of course, like all attribution models, the last non-direct click approach has its downsides. It is not a perfect solution to over- or under-attribution, but it is an easily replicable and strategically sound approach that provides reliable data you can measure in one place.
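To make the mechanics concrete, here is a minimal sketch of how last non-direct click credit could be assigned across one customer’s path. The touchpoint format and channel names are illustrative assumptions, not any analytics platform’s actual data model.

```python
def last_non_direct_click(touchpoints):
    """Return the channel credited under a last non-direct click model.

    `touchpoints` is an ordered list of channel names for one converting
    customer, e.g. ["facebook", "direct", "google_ads", "direct"].
    Direct visits are ignored; the last remaining channel gets full credit.
    Falls back to "direct" only if the path contains nothing else.
    """
    non_direct = [channel for channel in touchpoints if channel != "direct"]
    return non_direct[-1] if non_direct else "direct"

# Example path: the conversion is credited to google_ads in full, not to the
# final direct visit, and not split between facebook and google_ads.
print(last_non_direct_click(["facebook", "direct", "google_ads", "direct"]))  # google_ads
```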
In any case, the delayed death of the third-party cookie is certainly causing many to reevaluate their digital advertising methodologies. For now, proactive marketers will continue to search for privacy-friendly identifiers that can provide alternative solutions. First-party data could well have a larger role to play if consent from users can be reliably gained. While we wait for the transition, getting your data in order and finding accurate, reliable approaches to attribution must be a priority.
Ensuring the accuracy of this data is therefore imperative. This can be achieved by making sure there are no discrepancies between clicks and sessions and that all webpages are accurately tracked. In the absence of auto-tracking, UTMs should be used to track all campaigns and, if possible, tracking should be done server-side. Finally, marketers should test their tracking with Tag Assistant and make sure they don’t create duplicate sessions or lose parameters during the session. Once the third-party cookie becomes entirely obsolete, the direction marketers take will ultimately be decided by data – which must be as accurate as possible.
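As a rough illustration of consistent UTM tagging, the sketch below builds a campaign URL with the standard utm_source, utm_medium, and utm_campaign parameters. The helper function and the example values are illustrative, not part of any particular tool.

```python
from urllib.parse import urlencode, urlparse, urlunparse

def add_utm(url, source, medium, campaign):
    """Append standard UTM parameters to a landing-page URL.

    The values passed in are illustrative; the point is to tag every
    campaign consistently so sessions can be tied back to clicks.
    """
    parts = urlparse(url)
    params = {"utm_source": source, "utm_medium": medium, "utm_campaign": campaign}
    query = "&".join(filter(None, [parts.query, urlencode(params)]))
    return urlunparse(parts._replace(query=query))

print(add_utm("https://example.com/landing", "facebook", "paid_social", "spring_sale"))
# https://example.com/landing?utm_source=facebook&utm_medium=paid_social&utm_campaign=spring_sale
```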
Torkel Öhman is CTO and co-founder of Amanda AI. Responsible for building Amanda AI with his experience in data and analytics, Torkel oversees all technical aspects of the product, ensuring all ad accounts run smoothly.