Sunday, October 20, 2019

Implications of Classifications in the Adobe Analytics UI on Analysis

As analysts working in Adobe Analytics or any other system, we want to analyze data without restrictions or caveats around volume. Analysts who work with Adobe Analytics have likely run into the dreaded "Low Traffic" bucket, which shows up when a report in the UI has more than 500,000 unique values in a month for a dimension (eVar, prop, etc.).



A customer recently asked whether they could upload over a million rows of classification data in Adobe Analytics, tied to a primary user ID stored in an eVar, and leverage the UI for analysis and segmentation. We discussed the implications of uploading this amount of classification data into Adobe Analytics and even considered Audience Manager (AAM). I recently wrote about the impact of classifications on segments for a different use case. In this post, I will cover some of the pros and cons of uploading over a million rows of classification data into Adobe Analytics and of onboarding records into AAM for deeper analysis.
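Classification data is typically uploaded as a tab-delimited file (via the Classification Importer or FTP). The exact columns depend on how the classifications are configured in your report suite, so the column names below ("Customer Tier", "Region") are made up for illustration. A minimal Python sketch of building such a payload:

```python
import csv
import io

def build_classification_file(rows):
    """Build a tab-delimited classification payload.

    rows: list of (key, tier, region) tuples, where `key` is the raw
    eVar value (e.g. the primary user ID) being classified.
    Column names here are hypothetical, not Adobe-mandated.
    """
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter="\t", lineterminator="\n")
    # Header row: the key column followed by one column per classification
    writer.writerow(["Key", "Customer Tier", "Region"])
    for key, tier, region in rows:
        writer.writerow([key, tier, region])
    return buf.getvalue()

payload = build_classification_file([
    ("user-0001", "Gold", "EMEA"),
    ("user-0002", "Silver", "APAC"),
])
print(payload)
```

At a million-plus rows, the file itself is easy to generate; the constraints discussed below are about what the UI does with it afterward.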


Pros and Cons of Classification Data (Over 500K Rows)



Adobe Analytics UI

These pros and cons are listed for the Adobe Analytics UI and do not apply to Analytics data feeds or Data Warehouse.

Pros

  • Uploaded classification data is retroactive
  • There's no extra cost in uploading classification data
  • Recommended tool for deeper data analysis and segmentation of classified dimensions (below 500K rows)
Cons

  • Classification data over 500K rows might take days or weeks to catch up, which may delay analysis
  • Classification data in the UI AND segments tied to it are subject to the 500K uniques limit
  • Classification data is also subject to hash collisions, where some IDs may show 2 or more Unique Visitors in the UI, as explained here
  • Values that fall into the "Low Traffic" bucket do not get classified. Data Warehouse does not have this limitation, so if the data can be exported, we won't run into the limit.
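Adobe's internal hashing isn't public, so the hash-collision point above can only be illustrated in principle: whenever many unique values are mapped into a smaller hash space, the pigeonhole principle guarantees some distinct IDs land in the same bucket and get counted together. A toy Python sketch (bucket count deliberately tiny for the demo):

```python
from collections import defaultdict

# Toy illustration only: maps 5,000 distinct IDs into 1,000 hash
# buckets, so some buckets must hold 2+ IDs. Adobe's real hash space
# is far larger, but the same effect explains inflated uniques.
BUCKETS = 1000

buckets = defaultdict(list)
for i in range(5000):
    user_id = f"user-{i:06d}"
    buckets[hash(user_id) % BUCKETS].append(user_id)

collided = [ids for ids in buckets.values() if len(ids) > 1]
print(f"{len(collided)} buckets hold 2 or more distinct IDs")
```

In a report keyed on the hash, each collided bucket would show multiple "unique" visitors under a single classified value, which is the anomaly described above.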

Audience Manager UI

These pros and cons highlight the advantages of uploading large amounts of data into AAM, along with the constraints that apply to deeper analysis in the AAM UI. I recently wrote about the various ways to import data into AAM.

Pros

  • Onboarded data is retroactive IF the data is segmented in AAM and an ID sync is in place
  • Large amounts of data can be handled without any volume constraints
  • Data is uploaded to AAM more quickly (24-48 hours) than an Analytics classification over 500K rows would take (assuming no errors are encountered during onboarding)
Cons

  • Limited ability for reporting and segmentation in the AAM UI compared to Analytics. The UI will show overall uniques based on segmentation but will not allow you to slice and dice the data the way Analytics can. One example is the lack of sequential segmentation, which you get in Adobe Analytics
  • Onboarded data/segments will only apply to users who visit the site AFTER the data has been onboarded when AAM segments are shared with Analytics and analysis is done there. I covered this in a post I wrote last year.
  • Each row of data will be charged as a separate server call
  • Data is mostly available in the form of Unique Visitors and not as Page Views or Visits
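For context on the "each row is a server call" point above: onboarded data reaches AAM as flat files delivered over FTP or S3, one ID per line with its trait signals. The exact line format depends on how your data source (DPID) is configured, so treat the layout below as a simplified placeholder, not the definitive spec. A rough Python sketch:

```python
def build_onboarding_lines(records):
    """Build simplified AAM inbound-file lines.

    Each line: a device/user ID, a tab, then comma-separated
    key=value trait signals. Simplified for illustration; the real
    format depends on the data source configuration in AAM.
    """
    lines = []
    for device_id, signals in records:
        pairs = ",".join(f"{k}={v}" for k, v in signals.items())
        lines.append(f"{device_id}\t{pairs}")
    return "\n".join(lines)

sample = build_onboarding_lines([
    ("abc123", {"tier": "gold", "churn_risk": "low"}),
])
print(sample)
```

Since every record is a separate line and each line is billed as a server call, the cost of onboarding scales directly with row count.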

As you can see, there's no clear winner when it comes to classifying large amounts of data for analysis in either UI. To summarize, the Adobe Analytics UI is better suited for deeper analysis and segmentation (data under 500K rows per report), whereas Audience Manager is better suited to handling large amounts of data, with segmentation capabilities geared more toward activation of audiences (cookies and mobile device IDs).


Potential Options and Alternatives

This section covers some ways to get around the "Low Traffic" issue and large volumes of classification data in Adobe Analytics:

  • Leverage Adobe Data Warehouse: It's not perfect, but you can leverage Adobe Data Warehouse to export data and run segmentation, as you won't run into the same limits as the UI.
  • Reduce the Cardinality: You can split dimensions with high cardinality into separate variables.
  • Increase the uniques exceeded limit "within reason": You can contact Customer Care to increase the uniques limit from 500K to up to 1 million. This will work for anything between 500K and 1 million but not for anything over that.
  • New - Customer Journey Analytics: CJA is a newly added offering which leverages AI/ML models and much larger datasets from Adobe Experience Platform. This offering will remove the limitations around uniques exceeded and will be added to Workspace to allow for easy analysis of big datasets.
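The "Reduce the Cardinality" option above can be sketched simply: instead of sending one value with millions of uniques to a single variable, derive lower-cardinality components and send each to its own eVar/prop, reserving the full ID for Data Warehouse-only analysis. The ID format and variable numbers below are hypothetical:

```python
def split_for_tracking(order_id):
    """Split a high-cardinality value into lower-cardinality parts.

    Hypothetical example: 'US-2019-0483321' -> the region and year
    buckets each stay far below the 500K uniques limit, while the
    full ID goes to a variable you only analyze via Data Warehouse.
    """
    region, year, sequence = order_id.split("-")
    return {
        "eVar10": region,    # ~dozens of unique values
        "eVar11": year,      # a handful of unique values
        "eVar12": order_id,  # full ID: high cardinality, DW-only
    }

result = split_for_tracking("US-2019-0483321")
print(result)
```

The trade-off is that you lose single-dimension drill-down on the full ID in the UI, but each split dimension stays fully reportable.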

Again, there's no perfect solution to this issue today, but the new Customer Journey Analytics offering aims to fix it, which is exciting. I hope this post gave you an understanding of some of the implications of uploading large amounts of data into Analytics; all is not lost, and there are some potential ways to work around the limits.