Adobe Analytics attribution measurement

For mid-size and large advertisers who deploy a significant percentage of overall media spend on digital channels, the utility of implementing marketing attribution is no longer a point of debate. The debate has instead shifted to the best approach to implementing this compelling and deeply insightful capability. The standard approach for most companies is to implement an off-the-shelf attribution tracking product. This may well be the most feasible approach in many cases, but interesting alternatives certainly exist, especially for data-savvy Marketers with in-house analytics capabilities. Here is why:

  • Implementing off-the-shelf products inevitably involves some degree of customization, which in many cases outweighs the benefits of speed and agility. An example is when conversions happen offline and the conversion parameters must be conveyed back to the third-party attribution product.
  • Customers are largely limited to the models deployed by the vendor. This is inadequate in complex attribution scenarios, given that analysis requirements differ drastically from client to client. For example, clients who spend heavily on affiliate marketing might want to assign the majority of credit to the ‘introducing’ affiliate and little to none to subsequent ones, or deliberately assign low scores to affiliates who are ‘known offenders’ or cookie stuffers. Such arbitrary weighting schemes are rarely simple to implement with off-the-shelf tools.
  • In the vast majority of cases, the underlying data used for attribution modelling is not just for estimating media attribution but serves a number of other use cases, including visitor profiling, customer lifetime value calculation, advanced segmentation, fraud detection and so on. Attribution measurement products rarely provide these value-added capabilities.
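The affiliate-weighting scenario described above can be sketched in a few lines of Python. The blacklist, the 70% introducer share and the channel names below are illustrative assumptions, not rules from any real client engagement; the point is simply that such arbitrary logic is trivial to express once you own the data.

```python
# A minimal sketch of the kind of arbitrary weighting an off-the-shelf tool
# struggles with: the introducing affiliate takes most of the credit, known
# cookie-stuffers are zeroed out, and the remainder is split evenly.

KNOWN_OFFENDERS = {"affiliate_xyz"}   # hypothetical blacklist of bad actors
INTRODUCER_SHARE = 0.7                # assumed business rule, not a standard

def custom_affiliate_credit(touchpoints):
    """touchpoints: ordered list of channel ids, first = introducing affiliate."""
    valid = [t for t in touchpoints if t not in KNOWN_OFFENDERS]
    credit = {t: 0.0 for t in touchpoints}   # offenders keep zero credit
    if not valid:
        return credit
    if len(valid) == 1:
        credit[valid[0]] = 1.0
        return credit
    credit[valid[0]] += INTRODUCER_SHARE     # majority to the introducer
    rest = (1.0 - INTRODUCER_SHARE) / (len(valid) - 1)
    for t in valid[1:]:                      # split the remainder evenly
        credit[t] += rest
    return credit
```

Because the rule lives in your own code rather than a vendor's model library, adding a new exception (a seasonal bonus for a channel, say) is a one-line change rather than a vendor feature request.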

The alternative is to use the raw, cookie-level data from your Web Analytics tool and build your own attribution modelling capability. This is entirely possible as long as the Web Analytics solution (or some other visitor tracking tool) implemented provides the ability to capture cookie-level clickstream data. The section below provides a conceptual outline of Adobe Analytics attribution modelling using its Campaign Stacking feature.

Adobe Analytics (SiteCatalyst), a popular tool deployed in many leading Marketing departments, stands out in providing direct access to cookie-level data through a feature called Campaign Stacking. The concept is simple. The record of every visit to a website is captured in a single ‘campaign stack’ as a cookie value (one cookie per visitor). Visit records are attributed to specific campaigns, appended to the existing cookie value and sent to SiteCatalyst at the time of conversion. Once in SiteCatalyst, these cookie values can be extracted offline to provide analysts with the raw clickstream data that forms the foundation of advanced attribution analysis. At a conceptual level, the process of implementing custom attribution modelling using the SiteCatalyst Campaign Stacking feature can be divided into three steps:
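To make the concept concrete, here is a sketch of turning a stacked cookie value, as it might arrive in a SiteCatalyst extract, into an ordered list of touchpoints. The `>` delimiter and the `channel:campaign` code format are assumptions for illustration; the real format depends entirely on how the stacking plugin was configured during implementation.

```python
# Conceptual sketch: parse a 'campaign stack' cookie value into ordered
# touchpoints. Delimiter and code format are illustrative assumptions.

def parse_campaign_stack(stack_value, delimiter=">"):
    """Split a stacked value like 'email:spring>ppc:brand>seo:google'
    into ordered (channel, campaign) tuples, oldest exposure first."""
    touchpoints = []
    for token in stack_value.split(delimiter):
        token = token.strip()
        if not token:
            continue  # skip empty segments from trailing delimiters
        channel, _, campaign = token.partition(":")
        touchpoints.append((channel, campaign))
    return touchpoints
```

Each conversion then carries its full exposure history in one field, which is exactly what downstream attribution models need.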

  1. Data collection in SiteCatalyst: this requires setting up appropriate tagging so that the various campaign exposures are properly captured in the cookie. While this is typically not an issue for visits from campaigns, some clever JavaScript work is usually required for visits coming from SEO or direct sources. Appropriate vars and events also need to be set up within SiteCatalyst in order to take a snapshot value of each var for each conversion.
  2. Data extraction using ETL: this takes us into the realm of ETL. While it is entirely possible to simply download Excel extracts from SiteCatalyst and analyze those, that approach is crude and cumbersome and quickly becomes unmanageable. A more sophisticated approach is to implement ETL to automate the extraction from SiteCatalyst and produce the final data sets based on analysis requirements. Note that this is also the step at which any merging of data sets happens. For example, customers heavy on display advertising typically combine this data with cookie-level view-through data from ad servers to build the ‘true’ campaign stack that accounts for view-through impressions.
  3. Building the models: this is the final step, which involves actually analyzing the data sets and building the appropriate models. More often than not, the data prepared in Step 2 is fed into some kind of relational storage that tools such as SPSS or SAS can easily tap into.
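Once Step 2 delivers clean touchpoint paths per conversion, the simplest rule-based models from Step 3 can be computed in a few lines before anything moves into SPSS or SAS. The model names and path format below are illustrative; each model maps one conversion path to fractional credit per channel, which is then aggregated across all conversions.

```python
# Minimal rule-based attribution models over conversion paths, where a path
# is an ordered list of channel ids (oldest touch first). Illustrative only.

def first_touch(path):
    return {path[0]: 1.0} if path else {}

def last_touch(path):
    return {path[-1]: 1.0} if path else {}

def linear(path):
    credit = {}
    for ch in path:  # each touch gets an equal share of the conversion
        credit[ch] = credit.get(ch, 0.0) + 1.0 / len(path)
    return credit

def attribute(conversions, model):
    """Aggregate fractional credit per channel across many conversion paths."""
    totals = {}
    for path in conversions:
        for ch, c in model(path).items():
            totals[ch] = totals.get(ch, 0.0) + c
    return totals
```

Swapping in a custom model, such as the affiliate-weighting rule discussed earlier, is just a matter of passing a different function to `attribute`, which is precisely the flexibility off-the-shelf products rarely offer.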

From a skills perspective, for traditional Marketing departments Step 2 would appear to be the most challenging, given that it involves significant technology expertise in ETL and data warehousing along with a sound understanding of the underlying attribution business concepts. However, cloud-based marketing intelligence solutions can largely mitigate these challenges and make the entire data preparation process transparent to end users and analysts. At LVMetrics, we work actively with Marketers to implement Steps 1 and 2 using a methodical delivery approach that minimizes risk and wasted investment. Once set up, analysts have regular access to the clean, transformed data sets required for specific analysis tasks, and effort can shift from preparing the data to meaningful analysis. This option may not suit every Marketing department, but it certainly offers far more flexibility than an off-the-shelf product, in addition to providing long-term cost savings. Interested in finding out more? Get in touch today to discuss how we can help.