Sunday, December 9, 2018

Fix Variable Stickiness on Click Events in DTM & Launch by Adobe

This post covers a fix for a very common issue where page load variables tend to 'stick' or 'persist' on the subsequent click event. I'm leveraging the clearVars() function and covering exit link tracking in this case, but the same logic can be applied to other types of click events too. We will cover this fix in both DTM and Launch by Adobe, so without any further delay, let's jump right in!

Overview of the Issue

Before we dive into the issue, we have to take a look at the HTML structure of our page. For the purposes of this post, I'm using my demo site as an example. In the screenshot below, we can see that all of our internal site links start with ".." and our external link starts with "http://". We can use this distinction as a condition in our Tag Manager.

Here's an example of a page view call (from my test site) where we have a custom Page View event (event6) and an eVar (eVar3) to track the Page Name.

On click of an exit link, we see that event6 and eVar3 get carried over on the next call.

So, let's discuss how to fix this in both DTM and Launch by Adobe. 


DTM Fix

Given that DTM is still a popular Tag Manager, even though it will eventually be sunsetted, it makes sense to cover how to resolve this issue in this TMS.

The first step is to disable automatic exit link tracking in DTM. We do that by unchecking the "Track outbound links" checkbox in the Adobe Analytics tool.

The next step is to create an Event Based Rule tied to the anchor ('a') element with the event handler of 'click'. The reason we do that is that any link on the site can be an exit link, so it makes sense to simply look for the presence of an anchor element.

The next step is to disable any tracking under the Adobe Analytics section of the rule.

The final step is to add some custom code to fire our own exit link call for any link starting with "http" (which covers both http: and https:), where we only include eVar3 (Page Name) along with the exit link URL passed to the call. Note that I'm using s.clearVars() to remove the extra event6 from my call.
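As a hedged sketch of what that custom code could look like (the isExitLink helper is my own naming; eVar3, event6 and the "http" check come from the post):

```javascript
// Internal links on the demo site start with "..", external links
// with "http" (covers both http: and https:).
function isExitLink(href) {
  return typeof href === 'string' && href.indexOf('http') === 0;
}

// Inside the DTM rule's custom code, "this" is the clicked anchor:
// if (isExitLink(this.href)) {
//   s.clearVars();                // remove sticky page-load vars (event6)
//   s.linkTrackVars = 'eVar3';
//   s.eVar3 = s.pageName;         // only Page Name on the exit call
//   s.tl(this, 'e', this.href);   // 'e' = exit link report
// }
```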

Once published, we can see that on click of the exit link, we're only seeing eVar3 being passed but no longer seeing custom event6 being populated in the image request.

Launch By Adobe Fix

Now, let's take a look at how to tackle the exact same issue in Launch by Adobe. Given that Launch has an in-built action for clearing variables, we will leverage that instead of passing it in the custom code. 

The first step is to disable exit link tracking from Launch by Adobe by unchecking the 'Track outbound links' checkbox in the Adobe Analytics extension.

Next, I've created a new rule and am using the 'mouseover' event (which I really like as I'm able to test links by right clicking instead of clicking to monitor image requests). For Actions, I'm using 'Adobe Analytics - Clear Variables' and then just selecting 'Adobe Analytics - Set Variables' to put in my custom code in the JavaScript Editor. Note that I've not chosen the 'Send Beacon' action as we're doing our own custom event handler.

In the Code Editor, I've used the exact same code as DTM except for the clearVars() function as I'm leveraging Launch to do that.

After publishing the code, we can see the exact same image request as we saw in case of DTM, where we're only capturing eVar3 along with the exit link URL.

So, that's it! Hope you enjoyed this post and can make use of what I've done (or something better) to solve this common issue. 

Saturday, December 1, 2018

Reasons For Data Variance Between Analytics Systems

I'm almost certain that at some point in your career, you've been asked to troubleshoot the age-old problem of comparing numbers between different analytics systems. I have been asked about this (several times!) so I'm writing this post to share what, in my experience, are some of the reasons for this difference, along with some potential ways to close these gaps. I will talk about a recent experience I had with a client, some of the issues we faced and a potential solution the team is evaluating.

Based on a few comments I received suggesting that I not pursue this type of analysis and only look at trends, I want to clarify that we did look at trends and they did not line up in this case. We all know that no two systems are going to be the same, but in such a scenario, you do need a way to validate your orders to establish the web analytics tool as the ultimate source of truth. That can only be done by going through this painful exercise at least once, which in this case happened when I was first looped into the project.

The Challenge and Context

My client reported a 25% variance between eCommerce orders reported on the website and their backend system where the website was lower. The issue started after they redesigned the UI (Single Page App) and updated the backend platform. The difference was caught about 2 weeks after the new UI went live. We had used DTM to slightly revamp the implementation to add additional attributes and asked the platform development team to fire a Direct Call Rule (same as before) on the confirmation page which was already in place in the previous version. 

The client wanted us to come up with a solution to 'backfill' these historical missing orders in Adobe Analytics and also apply the right kind of marketing attribution to them, which was always going to be challenging as time had already passed. The basic issue is that once a Visitor ID has been to the site, visited a bunch of pages and set certain eVars, backfilling data for that user would 'break' the pathing flow and the visitor profile, and any downstream conversion metrics would also be skewed. This Adobe article on timestamps explains the issue better than I can.

What is an Acceptable Variance between Systems?

Honestly, the answer to that question depends on the client, system and the metric we're looking at. If we're looking at deduplicated metrics such as Visits, Visitors or even Orders, then anything below 5% is acceptable in my experience. For duplicated metrics such as Page Views or Clicks, I've seen clients accept anything below 10-15% if we're comparing one client side system (Adobe Analytics) with a backend CRM system (my use case). 

The other thing to note is that if we're comparing Adobe Analytics to Google Analytics or any other web analytics system, then a lower variance should be expected. Keep in mind that both client side solutions should fire as close to each other as possible on the page to expect a low variance.

Analysis and Approach to Pinpoint the Variance 

During the course of the investigation, we looked into the following factors that could've contributed to the variance. 

But before we begin, make sure you download records from both systems covering the same time frame and time zone, with the same dimensions and filters as much as possible. We know that no two systems will be apples to apples, but let's make sure the comparison files are as close to each other as possible in all respects. Let's now take a look at some of the factors we examined, keeping the Orders metric in mind, in comparing a web analytics solution such as Adobe Analytics vs. a backend CRM system.
  • If you're not tracking all the payment methods in one system vs. the other system, you will see some discrepancy in the overall orders.
  • There may be conflicting JavaScript code deployed on the confirmation pages (privacy, plug-ins etc.) that will not be compatible with scripts used by other payment methods or Analytics tags. This was actually one of the causes of the difference.
  • Legacy Browsers may not run JavaScript tags so that is another cause of the variance.
  • There may be internal IP filters or bot filters applied at the web analytics tool level that won't be present in the CRM system, so that is another reason.
  • Given that our Direct Call rule was firing at least 8-10 other marketing tags including Google Analytics, we thought that in some cases, Adobe Analytics didn't fire because of that so try to limit the number of tags on key pages such as confirmation.
  • Most important is the order of execution of the Adobe Analytics tag and if it's able to successfully grab all elements present in the data layer. In our case, we noticed that Adobe Analytics was not able to grab elements from the data layer 100% of the time.
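The last point above can be guarded against in code. Below is a hypothetical sketch (the function and data layer names are mine, not from the client's implementation) of retrying until the data layer's order ID is readable before firing the Direct Call Rule:

```javascript
// Hypothetical guard: only fire the Direct Call Rule once the data
// layer's order ID is available, retrying a few times before giving up.
function fireWhenReady(getOrderId, fire, retries) {
  if (getOrderId()) {
    fire(); // e.g. _satellite.track('order-confirmation')
    return true;
  }
  if (retries > 0) {
    setTimeout(function () {
      fireWhenReady(getOrderId, fire, retries - 1);
    }, 250); // wait 250ms and try again
  }
  return false;
}

// Example usage with a hypothetical data layer object:
// fireWhenReady(
//   function () { return window.digitalData && window.digitalData.order && window.digitalData.order.id; },
//   function () { _satellite.track('order-confirmation'); },
//   10);
```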

What was the Root Cause of the Variance?

The main reason for the variance was that the website did not fire the Direct Call Rule in DTM required to track the orders 100% of the time in Adobe, whereas the CRM system always captured the order if it was a successful submission. We were able to nail this down when we looked at the orders in Google Analytics, which ALSO reported a 25% variance from the CRM system.

Potential Solutions to Backfill Missing Traffic

Partial Fix: Data Insertion API

Data Insertion API is a very well known and common methodology to send data to Adobe Analytics server side. In this case, we tried to leverage this method to backfill historical orders and it worked but the other requirement around marketing channel attribution could not be met by this method. If your client also wants to fix the marketing channel attribution problem, this method is not going to work.

Regardless, I still want to quickly cover how we went about testing some orders using Postman to send a POST request to Adobe Analytics. The following screenshot shows a sample XML request to send a historical order to Analytics. Note that we need to pass an epoch-based timestamp, which is a required field when backfilling historical (or any) data.
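For reference, a Data Insertion API request body looks roughly like the sketch below. The report suite ID, visitor ID and product values are placeholders, and you should verify the exact tag names against Adobe's Data Insertion API documentation before using this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<request>
   <sc_xml_ver>1.0</sc_xml_ver>
   <reportSuiteID>myrsid</reportSuiteID>
   <visitorID>backfill-visitor-123</visitorID>
   <timestamp>1541030400</timestamp> <!-- epoch timestamp, required for backfill -->
   <pageName>order confirmation</pageName>
   <events>purchase</events>
   <products>;SKU123;1;49.99</products>
</request>
```

The request is POSTed to your Analytics tracking server's data insertion endpoint.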

Complete Fix: Data Reprocessing

I don't have much experience with this but based on what I've heard, a custom consulting engagement with Adobe's engineering team to reprocess the data can fix the attribution issue. This is still being evaluated and I will provide an update when we're done with the project.

How Can We Avoid Issues Like This?

The only way to avoid such issues in the future is to thoroughly test this in a development environment before anything goes to production. The same rigor applied in production is needed to test key metrics in a test environment, especially during major redesigns, so that issues are nipped in the bud at the onset. The other thing that you can do on an ongoing basis is create anomaly detection dashboards in Adobe Analytics or other tools that are designed to catch any noticeable drops or inflations. I know there are other ways to avoid issues but that is a separate blog post in itself.

My hope with this post is to help you understand some reasons why traffic differs between systems and to help you be better prepared when undertaking such a task. I wrote about something similar way back in 2007 when I started my career that might be relevant. Have you run into a similar issue for your clients?

Sunday, November 18, 2018

Audience Manager Segmentation Strategy

Organizations spend millions of dollars collecting data about their customers, but is that enough? No, it's not, because the data is stored in raw format in enterprise-level solutions such as Adobe Analytics and Adobe Audience Manager, and unless this data is made actionable, there's no real value in collecting it. Decision makers in any organization are tasked with selling more products, creating awareness or growing their customer base, and want concise data to gauge where prospective customers are and how they're engaging with their brand. How do we help them make all this data actionable and easy to digest?

It's done by slicing and dicing the data into meaningful and digestible chunks which is achieved by segmentation. Segmentation and activation is one of the core components of Audience Manager or any Data Management Platform for that matter. Let's dive into the various steps I recommend that you can take to leverage segmentation efficiently in AAM. 

Define your Goals

It's very easy to go crazy while creating segments, and this is where they can spiral out of control and you can end up with hundreds of meaningless segments. To avoid this, it's always a good idea to take a step back and figure out what we actually want to do with this data so we can leverage the platform efficiently. Knowing what kind of data you want to bring into AAM is one thing, but creating actionable segments is another. No analysis can be complete without knowing what the final objective is, so it's imperative for you to define your goals keeping the DMP in mind.

Build a Governance Plan

I cannot stress enough how important it is to control who has access to the platform. I've seen too many implementations where access is given to external agencies who are not aware of existing guidelines around trait/segment structure and either create new segments or delete existing ones that they're not supposed to. It often makes sense to first keep segment creation in-house and then slowly bring in other agencies, who need to be made aware of your DMP's nomenclature and taxonomy guidelines. A good way to enforce governance in Audience Manager is by leveraging Role Based Access Controls, which allow you to provide read/write access only to specific types of audiences.

Create Incoming Signals using Data Explorer

Audience Manager segments are created using traits which are raw signals that capture different types of data and no segmentation strategy can even begin without knowing what traits we want to capture and how to leverage them. Even though we are discussing segments, our segment strategy needs to be aligned with traits as they both have to go hand in hand with each other. So, the first step is to identify traits that we need to create which will eventually be used in your segments. We can easily do that by leveraging Data Explorer (see screenshot) which is a newly added feature in Audience Manager. I'm already a fan of this feature and highly recommend that you use this to create traits.

Folder Taxonomy and Nomenclature

Once relevant traits have been built and identified, it makes a ton of sense to have a consistent and scalable folder taxonomy and nomenclature. In my experience, having some level of granularity in the trait/segment name has worked out well. As an example for a large enterprise client, we followed a strict nomenclature that mirrored the folder structure along with the right amount of granularity to make sense as well as be concise. Below is a sample folder structure and taxonomy that I usually go with but it may or may not work for everyone so it's important to define it based on your business. The most important thing is to get buy-in from everyone and then follow this consistently in the platform.

Build Segments

Segments primarily break down into the following four categories, all of which can be built directly in Audience Manager:

  1. Geographical: These can be built in Audience Manager using traits created based on the 'd_' platform level variables such as d_country, d_city etc.
  2. Demographic: These are built from on-boarded traits that are typically created using CRM data. I wrote about how to bring this type of data into AAM and create traits.
  3. Online Behavior: This is usually online web data captured using Adobe Analytics or media data tied to digital advertising on other sites.
  4. Psychographic: These can be built using a variety of third party segments that are tied to lifestyle, income, hobbies etc. and are easily accessible via the AAM Marketplace.

Earlier this year, I wrote in detail about the types of segments (use cases) we can build using a DMP. Let's take a look at a segment that leverages both 1st party media data with instant suppression built-in for frequency capping and third party marketplace traits. Note that you can click on the 'Calculate Estimates' button to see potential reach before saving it.

Even though 90% of your segmentation needs can be met directly in Audience Manager, below are some cases where you'll have to share segments from Adobe Analytics to AAM, as the DMP doesn't really have the same tracking mechanisms as an analytics solution. I wrote about it in detail earlier.

  • Visit based segments tied to visit number
  • Time based segments such as time spent on page/site or time parting
  • Mobile metrics such as upgrades and launches
  • Survey data or some online other data not captured in AAM
  • Revenue related segments from Analytics

One thing to avoid is sharing segments unnecessarily from Analytics when they can be created in AAM, as there is a limit of only 20 segments that can be shared via the Experience Cloud. If you need more segments, you can leverage Audience Library. This is one of the most common mistakes I've seen clients make, where they'll simply share segments from Analytics to AAM when they could've just created them in AAM.

Segment Mapping and Activation 

Now that you've built your segments, you'll want to share them with various Demand Side Platforms or other tools for marketing activation. One easy thing to do while mapping AAM segments to external destinations is to set an expiration date to avoid continuously sending data after a campaign or initiative is over (shown below).

Segment Traffic Monitoring and Cleanup

Finally, I'd suggest that you add some kind of process around reviewing your existing segments to see if any haven't received traffic for the last 60-90 days; it might make sense to archive them (in a separate folder) if there's no data being captured. You can leverage General Reports in AAM to do that. In the folder taxonomy section, I included the year as an attribute in the segment name, so if you have segments from the prior year that are no longer being used, it'll make sense to archive them. This recommendation wouldn't really apply to traits, as traits can be reused in different segments, unless those traits were only used in a media campaign launched in the past.

Now that we've gone through the various steps that I recommend as part of a segmentation strategy in AAM, I'm sure I've missed a few that you might be using in your organization. What kind of segmentation strategy have you defined for your DMP?

Wednesday, October 31, 2018

Adobe Experience Platform SDK Setup (iOS)

There were 178 billion app downloads in 2017 (Statista). With that amount of volume, it would be unthinkable for any major organization today to not have a mobile app that allows existing and potential customers to interact with its brand. Most apps, regardless of their purpose, have some kind of analytics tracking embedded in them, and one of the tracking solutions they leverage is Adobe Analytics.

Adobe Experience Platform SDK (informally known as SDK V5.0) was recently launched and it aims to give clients more flexibility around mobile SDK deployment as it leverages Launch by Adobe to publish changes in an external environment outside of the native app. This was not the case with the previous version v4 of the SDK. The reason why that's exciting is because any configuration changes to the SDK around Analytics can now be done in Launch without having to go through the App Store. There are other differences between the two versions which is covered in more detail here.

There are two primary methods to setup the AEP SDK which are either using CocoaPods or the manual method but in this post, I'll cover how to set this up manually. A lot more information is provided in the Mobile team's official documentation but my aim is to take their instructions and document it visually using my sandbox. I'm covering two main components of this setup which leverage Launch by Adobe and XCode (V10) to setup the SDK.

Launch by Adobe Setup

This is a big enhancement over V4.0: we're now able to deploy the configuration using Launch by Adobe as opposed to the ADBMobile.json config file. The primary advantage of this is that we're able to make basic configuration changes such as changing report suites, enabling/disabling tags, activating the Experience Cloud ID service etc. directly in Launch without having to go through the App Store. There will obviously be a need to submit the app to the App Store at the start. Please note that we currently cannot set up any trackAction (clicks) or trackState (page views) calls in Launch and need to do that directly in the app, so those changes do require us to go through the App Store each time. Let's go through the steps.
  • Create a mobile property in Launch and select the platform as Mobile.

  • We then set up extensions in Launch and by default, you should see two extensions already added called "Mobile Core" and "Profile", which are explained below. I'm not going to cover these in detail as we only need to add some basic details to the Mobile Core extension. Additionally, we also need to add the Adobe Analytics extension from the Catalog and install it. Let's review these extensions.
    • Mobile Core: This provides essential SDK related functionality as well as the library to deploy the Experience Visitor ID service
    • Profile: It stores user attributes on the client and more details can be found here
    • Adobe Analytics: It provides the Adobe Analytics library to the SDK. 

Add your tracking server

Add your report suite and tracking server
  • Once we've installed the extensions, we need to retrieve the iOS libraries that are accessible in the Environments section in Launch. This snippet contains a reference to libraries that need to be included in XCode to deploy the SDK (see below). 

  • Finally, we need to publish these extensions to either the development or production environment depending on the scenario. Note that you can also disable Analytics tags from the App by publishing it in Launch if required.

XCode Setup

Before diving in, I highly recommend that you review the steps outlined in Adobe's official documentation on configuring the SDK. Some of what I'll cover is outlined in the documentation but I'm including a lot more detail in the form of screenshots from XCode which the documentation doesn't include. So, let's continue.

  • Download XCode and setup a basic App which allows you to simulate an iOS App experience.

  • Download the AEP iOS SDK from Adobe Experience Platform's Github branch located here.

  • The downloaded zip contains a bunch of files and folders but the one we need to use is called "iOS" which contains two sub folders called "device-only" and "universal". 

  • The "iOS" folder needs to be added to XCode and mapped to the Target in XCode. Once the mapping is done, we need to add some frameworks to the "Embedded Libraries" section in XCode to provide the necessary libraries to the SDK. Please note that for an actual iOS App, you need to use the frameworks present in "device-only", but for Simulators (my use case), you need to use the frameworks present in the "universal" folder. I've tried to color code the relevant details in this screenshot.

  • Once the required frameworks are added, we need to reference all libraries and call specific methods to activate lifecycle calls, enable logging and other methods as explained here. I'm using Objective C but the documentation covers Swift and Android syntax as well. Please note that I copied the installation code directly from the Environments section in Launch (covered above) and included it in XCode which ties the SDK with Launch. I've included these methods in the AppDelegate.m file of my App. The integration with Launch is done via the configureWithAppID call to which we pass the Launch App/Environment ID.

  • Once the basic lifecycle tracking methods are added, we can add a trackState (page view) method call to trigger an event. I'm basically sending data to the "screenName" variable populated with a value of "homeview". That is typically done in the ViewController.m file.

  • Finally, we need to make sure that the App build is successful and test the Analytics call. In my case, I'm looking for the variable called "screenName" which is populating in the debugging console as shown below.

The Adobe Experience Platform SDK is still relatively new and evolving, but the process of installing it is fairly easy once you get through the initial learning curve. I will continue to keep track of all upcoming features and will write about other functionalities tied to Launch by Adobe to trigger different types of mobile events.

Sunday, October 14, 2018

Adobe Audience Manager Email Pixel ID Sync

I wrote about email marketing back in 2009, and it's still very relevant and effective in 2018. According to this recent (2016) article, "users of email marketing systems are achieving $38 in ROI for every $1 spent". Almost all my clients do some kind of email marketing to send newsletters to either existing or potential customers. Some of these clients leverage a Data Management Platform to connect data from multiple sources, and email is one of the most common choices.

From an Audience Manager perspective, a very useful capability for clients is to capture custom email attributes upon impression or click, along with user IDs that can be synced with AAM for cross-device targeting. This post covers the various steps to capture email data in AAM.

  • Configure the Email Pixel: The general format of the email pixel is very similar to display banner pixels with some minor differences. Below is a screenshot of an email pixel as well as a walkthrough of the various parameters.

    • Subdomain: This is the Audience Manager subdomain. In my example, it's "ags542" as per the screenshot. 
    • d_event: This is the event type which can either be 'imp' (impression) or 'click'  depending on where you want to capture this event.
    • d_cid: This parameter allows us to sync a user ID with AAM.
    • DataSourceID: This is the data source ID created for this data point. It's '133374' in my case and I've covered this in detail below. 
    • HashedEmailID: It's 'ABC789' as per the example but it can be any numeric or alphanumeric ID for ID syncing purposes.
    • c_emailopen and c_campaignname: You can pass any customer specific parameters (c_) which in my case are outlined in the screenshot. Note that we will need to create rule based traits to capture these parameters.
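Putting the parameters above together, the full pixel URL would look something like the sketch below (the values come from the post's example; the %01 separators follow the d_cid ID sync encoding convention, so double-check the exact format against your own setup):

```
https://ags542.demdex.net/event?d_event=imp&d_cid=133374%01ABC789%011&c_emailopen=1&c_campaignname=ags542_test
```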

  • Create Data Source and Profile Merge Rule to Capture IDs: Given that we will be syncing user data with AAM, we need to create a cross device data source as shown below.

We will also need to create a profile merge rule mapped to this data source which we will need to leverage for activation.

  • Create Rule Based Traits: The next step is to create rule-based traits tied to the custom attributes that you want to capture from the email pixel. In my case, they are c_emailopen and c_campaignname. Below is what one of these customer specific traits looks like.

  • Deploy the Pixel in the Email Marketing Software: As explained above, once you've completed the prerequisites of setting up the data source and the necessary traits, you can go ahead and setup the pixels in your EMS. Please make sure that there are no extra spaces and the pixel should look very similar to this example where you need to include the additional "%01" and "%011" characters to follow the ID sync conventions. 

  • Monitor Audience Traits: This process is often not known to everyone but there is an easy way to test if user IDs synced as part of the email pixel, made it into AAM. This can be done by going to the Audience Traits folder and clicking into the ID sync trait created for email as shown below.

  • Create Rule Based Traits for the Landing Page: Given that we're only capturing the email open impression and not the click, the recommended approach is to create a rule-based trait to capture users who clicked on an email, tied to your landing page URL. So, the trait condition will be tied to the following attributes: h_referer == AND c_campaign==ags542_test.

  • Upload CRM Files (if applicable): One interesting use case of syncing user IDs with AAM is to upload demographic or other customer data from the CRM. The process is the same for uploading files in AAM and I've covered it in detail here. 

Email marketing is not going anywhere, and with a minimal amount of effort, we're able to ID sync users with the DMP. It's a great alternative for ID syncing users if you're unable to capture user IDs upon authentication. So, are you capturing email marketing data in AAM and ID syncing your users?

Sunday, September 30, 2018

Adobe Analytics Reporting Issues and Tips

It's been a while since I've been a full time analyst but I do get to dabble in data in my current role as Technical Consultant. This post covers some of the reporting/segmentation challenges I've faced with Adobe Analytics and some tricks I've learnt in my role. My primary purpose of writing this post is to share it with other technically focussed users and capture it for my own documentation. Let's get started!

  • Cardinality Issue: One issue that has caused me some strife is the Unique Exceeded issue where only the top 500,000 rows for a particular month are shown. This happens when cardinality for a particular dimension (E.g. Page URL) is very high and Analytics outputs a string called "Low Traffic" as shown below where it's the most popular data point. This means that any segments leveraging Page URL will not show accurate results as only the top 500,000 values are shown in the UI.
    • How to Address the issue: I can think of 2 solutions to tackle this issue:
      • Add Dimensions with Low Cardinality: In your implementation, define a unique page name or sub sections for a combination of URLs and capture query string in a separate variable to keep the URL clean. Basically, look at alternative dimensions that have a low cardinality.
      • Leverage DataWarehouse: DataWarehouse (shown below) allows you to export more than 500K rows, so you can essentially export as many rows as you need and then filter the data offline.
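As a quick illustrative sketch of the first option (the splitUrl helper and the variable assignments are hypothetical, not from a real implementation), the query string can be split into a separate variable so the page-level dimension stays low-cardinality:

```javascript
// Split a full URL into a clean, low-cardinality URL and its query string.
function splitUrl(fullUrl) {
  var i = fullUrl.indexOf('?');
  return i === -1
    ? { cleanUrl: fullUrl, query: '' }
    : { cleanUrl: fullUrl.slice(0, i), query: fullUrl.slice(i + 1) };
}

// Hypothetical usage in Analytics custom code:
// var parts = splitUrl(document.URL);
// s.eVar10 = parts.cleanUrl;  // low-cardinality page URL
// s.eVar11 = parts.query;     // query string captured separately
```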

  • Single Page App Issue: In a recent client scenario, we had a situation where the client had a Single Page App (SPA) built using AJAX, where the UI experience changed but the page didn't refresh. These experiences oftentimes leverage callback functions which developers can configure so that we can fire "artificial" page views even if the page doesn't reload. 
    • Issue: My client wanted to see how many users landed on the SPA experience when the "cmpid" URL query parameter was present. They populated an Analytics event tied to this query string but saw a very high number of instances which exceeded the actual page views on this landing page. The issue was tied to the fact that the Analytics URL (highlighted in Red) did not get updated when in fact, the SPA URL changed (highlighted in Green) for additional steps in the flow. Given that the Analytics URL is used to evaluate if the query string is present, the instances of the event kept getting inflated despite the fact that "cmpid" was not present anymore.

    • Solution: The solution was to update the Analytics URL by overriding the s.pageURL (Analytics URL variable) with document.URL using JavaScript. This allowed us to ONLY track events when "cmpid" was populated.
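A hedged sketch of that fix is below (the helper name and event10 are placeholders I've chosen for illustration; in practice this would live in the custom code that runs before each beacon):

```javascript
// Keep the Analytics URL in sync with the SPA's live URL, and only set
// the campaign event when "cmpid" is actually present right now.
function syncAnalyticsUrl(s, currentUrl) {
  s.pageURL = currentUrl; // override the URL Adobe records
  var hasCmpid = /[?&]cmpid=/.test(currentUrl);
  if (hasCmpid) {
    s.events = s.events ? s.events + ',event10' : 'event10'; // event10 is a placeholder
  }
  return hasCmpid;
}

// In the browser this would be called as:
// syncAnalyticsUrl(s, document.URL);
```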

  • Visitor Stitching Segment: A very common use case for enterprise clients is to stitch visitors across mobile app and mobile web where they want to attribute downstream events to visitors coming from a mobile app to a mobile web view. An example is a conversion funnel that is embedded in a native app where there is a mix of native app screens and web specific HTML or AMP pages. It will make sense to define a unique variable to differentiate between mobile app and mobile web and use that in a segment as shown below.

  • Sequential Segmentation: Sequential segments are explained very well here and I highly recommend you read this but my example is much simpler and deals with a recent question I got from a client.
    • Question: How many people stayed on the "about-us" page for at least 2 minutes and then clicked on the "visit-homepage" link?
    • Answer: The answer is a simple sequential segment which did take me a few tries and it's shown below:

  • Exclude Segments Work Well: I've used exclusion segments quite extensively to remove users tied to a particular dimension or event. I've also used exclusion as an option to create "inverse" segments to test my results. Below is a simple example of an Exclude segment that worked really well for me recently, where I had to exclude visits in which less than 15 seconds was spent on a page:

  • Finally, a Segmentation Container Primer: I'll wrap up by covering a simple yet confusing concept of segment container which has often stumped me. It's the general difference between the various segment containers (namely Visits, Visitors and Hit) and when to use these. In the following screenshots, we see how each container affects the underlying data.
    • Hit Container: When applied, this container yields the fewest page views and instances as it confines the data to the actual hit level. The requirement for this container to work is for the variable to be set in the image request.
      • Use Cases: Apply this when you want to get the most accurate results for a particular dimension or event. E.g. You can use this container if you want to get actual page views or video views for a particular time frame or search term. You should also use Hit level segments to create Virtual report suites.

    • Visit Container: This container shows data for a particular visit where something happened without a gap or timeout of 30 minutes. It will show page views or instances that are more than those of a 'Hit' container and less than those of a 'Visitor' container.
      • Use Cases: If you're looking to analyze user behavior that can only happen within the same visit such as Bounces, Click on a Call to Action on the Homepage, Landing page Visits tied to a marketing campaign etc.          

    • Visitor Container: This container will show data tied to a Visitor cookie (typically 2 years) and will show all page views and instances for the lifetime of the Visitor. This will show the most number of page views or instances as the scope of this is the biggest among the three containers.
      • Use Cases: If you're looking to find out total revenue captured across all visitors or to see how many visitors traversed from a mobile app to a mobile web view.

I hope these tips will help you navigate some simple, yet confusing concepts that often require additional investigation and can lead to unforeseen delays.