Sunday, June 28, 2020

AdobeDevBlog: Exploring the Impacts to Adobe Analytics when Migrating to AEP Web SDK

Adobe Experience Platform Web SDK is intended to replace the existing libraries across our various Adobe Experience Cloud solutions such as AppMeasurement.js, at.js, dil.js, and visitorapi.js. The important difference is that the library was written from the ground up, leveraging an updated schema via the Adobe Experience Platform Experience Data Model (XDM) to map and collect data agnostic of the solution the data is pushed into for further processing. I coauthored this post on Adobe's Medium Tech blog to explain how to route data to Adobe Analytics through the newly launched Adobe Experience Platform Web SDK. Here's a link to it.

Tuesday, June 2, 2020

AdobeDevBlog: Adobe Experience Platform Web SDK Migration Scenarios

Adobe Experience Platform Web SDK consolidates solution-specific requests into a single payload for a seamless and much more simplified implementation. I coauthored this post on Adobe's Medium Tech blog as part of a series of blogs we're going to write about migrating existing Adobe implementations to the newly launched Adobe Experience Platform Web SDK. Here's a link to it.

Sunday, May 24, 2020

Visualizing Traffic on my Blog using R

I've been a data analyst in the past, and one thing I can say for sure is that we don't have to be great analysts or statisticians to read basic graphs and understand trends. Visuals are all around us, whether it's stock market trends or data around the dreaded Covid-19 pandemic. By now, I'm sure all of us have heard about "flattening the curve". It literally took a pandemic for us to learn what it means, but the point I'm trying to make is that we are surrounded by data, and people should ideally know how to understand it. I recently learned the basics of R, a programming language mostly used in data analytics, statistical analysis, and visualization. R is a good language for data analysts and statisticians to learn and resonates really well with professionals who know SQL.

In this post, I'll visualize traffic coming to my blog since 2017 (data captured in Google Analytics) and show some commonly used graphs and visualizations using RStudio. The most obvious trend you'll see is that traffic started gradually increasing once I started writing again in January 2018 and has really spiked in the last few months. So, let's dive in!



Basics of R


As I explained earlier, R is a programming language used to analyze existing trends and also do predictive analytics using statistics. For the purposes of this article, I'm using RStudio to run basic R commands to create simple visuals such as bar graphs and line graphs as well as slightly more complex visuals such as bubble charts, a word cloud, and a map using some commonly used packages.



Data Frame


The first step before doing any analysis in R is data wrangling, which is the manipulation and transformation of data into a format you can use for analysis. In R, we do this by creating a data frame, which is essentially a table typically populated by importing a .CSV file, although other formats such as SPSS or Stata are also supported for advanced use cases.

A data frame contains rows and columns and can be compared to an Excel spreadsheet. The other thing to keep in mind is that it's okay to do some data transformation in the source file itself before bringing data into R, but a lot of the manipulation is done directly in R. In my case, I modified the source .CSV files exported from Google Analytics for basic data formatting, such as switching the metrics to the Number format. The command to bring data into R via a .CSV file is:

where df is the data frame, read.csv is the function which reads a .CSV file, and stringsAsFactors = FALSE ensures that the data is not converted into factors, keeping the source data format intact. The original file contained the Page name and some common metrics such as Page views and Visits.
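Since the original command was embedded as an image, here's a minimal sketch of what the import step likely looked like (the file name is hypothetical):

```r
# Hypothetical file name; the actual Google Analytics export will differ
df <- read.csv("blog_traffic.csv", stringsAsFactors = FALSE)

# Quick sanity checks on the resulting data frame
str(df)   # column names and types
head(df)  # first few rows
```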


Package


R packages are reusable code libraries that provide additional functionality to R and help simplify tasks. You can install packages using the install.packages() function and invoke them using the library() function. In my case, I'm using the following packages:


  • library(ggplot2)
  • library(lubridate)
  • library(ggwordcloud)
  • library(maps)
  • library(dplyr) 

Finally, before we take a look at the visuals, one other thing to note is that I did some basic date manipulation in R to convert the Month and Year using the ymd() and mdy() functions from the lubridate library. There are a lot of other things I could cover under basics, but that's outside the scope of this post.
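As a quick sketch of that date conversion step (the sample values are hypothetical, not my actual export):

```r
library(lubridate)

# Hypothetical month strings as they might appear in a Google Analytics export
dfl <- data.frame(Month  = c("1/1/2017", "2/1/2017", "3/1/2017"),
                  Visits = c(120, 150, 170),
                  stringsAsFactors = FALSE)

# mdy() parses month/day/year strings into proper Date objects
# so they sort and plot chronologically
dfl$Month <- mdy(dfl$Month)
```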


Visualizing Blog Traffic Trends


In this section, I've inserted the graphs created in RStudio, which were saved as images. I've used the "ggplot2" package, a very popular R library for visualizing data. I'll admit that my blog does not get a ton of traffic, but that's fine as I'm not in any competition to artificially inflate my numbers. My intent is to share what I know with others and document things for myself for future reference. Let's take a look at some of the trends.



Show Visits and Page Views for the Last 3 Years (Line)


In this line graph, I've visualized Visits (called Sessions in Google Analytics) and Page views for the last 3 years. 





  • If you look closely at the label I manually added, traffic started gradually increasing once I started writing again in early 2018. 
  • The biggest spike happened thanks to my last post about Real-Time CDP, which I wrote last month. This shows how much my readers want to consume content about the latest and greatest technology from Adobe, especially around Adobe Experience Platform.

I mentioned earlier that it's very common for analysts to modify the source data before bringing it into R, which is what I did to generalize the page names by removing the month and year using the MID() function in Excel.





As promised, here's the code sample to generate this visual. Please note that I created a separate data frame called 'dfl' which contained Month, Visits, and Page views. Also, note that the file format is .rmd (R Markdown), which lets you combine R commands and their output in a single document.
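The original snippet was an image, so here's a hedged reconstruction of what a line graph like this looks like in ggplot2 (the data values are made up for illustration):

```r
library(ggplot2)

# Hypothetical sample of the 'dfl' data frame (Month, Visits, Pageviews)
dfl <- data.frame(
  Month     = as.Date(c("2017-01-01", "2018-01-01", "2019-01-01", "2020-04-01")),
  Visits    = c(80, 150, 400, 1200),
  Pageviews = c(120, 220, 600, 1800)
)

# Two lines on one chart, with a colour legend distinguishing the metrics
ggplot(dfl, aes(x = Month)) +
  geom_line(aes(y = Visits, colour = "Visits")) +
  geom_line(aes(y = Pageviews, colour = "Page views")) +
  labs(title = "Blog Visits and Page Views", x = "Month", y = "Count", colour = "Metric")
```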




Top 10 Pages Visited from Jan-2017-Apr-2020 (Flipped Bar Graph)


In this bar graph, I'm showing the top 10 pages by Visits over the last 3 years.





  • The most popular page is the one I wrote to show the calculation of funnel drop off rate way back when I started blogging. This shows that there's still a sizable audience looking for calculations of basic metrics such as drop off rate, as the traffic source for this page is primarily organic search.
  • The other popular page is the homepage, which may mean that people get sent directly to my homepage via search. Again, this is inclusive of the last 3 years' worth of data, so more analysis is needed to understand this fully, which is outside the scope of this post.
Below is the code I wrote.
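Since the code image isn't reproduced here, this is a sketch of how a flipped bar graph is typically built with ggplot2 and dplyr (page names and counts are hypothetical):

```r
library(ggplot2)
library(dplyr)

# Hypothetical page-level data; the real frame came from the Google Analytics export
df <- data.frame(
  Page   = c("/funnel-drop-off-rate", "/", "/aam-overview", "/rt-cdp"),
  Visits = c(900, 700, 500, 450),
  stringsAsFactors = FALSE
)

# Keep the 10 most visited pages, sorted descending
top10 <- df %>% arrange(desc(Visits)) %>% slice_head(n = 10)

ggplot(top10, aes(x = reorder(Page, Visits), y = Visits)) +
  geom_col() +
  coord_flip() +  # horizontal bars keep long page names readable
  labs(title = "Top 10 Pages by Visits", x = "Page")
```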


Top 10 Pages by Visits and Bounce Rate from Jan-2017-Apr-2020 (Bubble Chart)


In this chart, I show the top 10 pages (last 3 years), visualizing Visits and Bounce Rate. The color of each bubble represents the page name, and the size of the bubble is tied to Visits.



  • The most popular page (drop off rate) has the highest Bounce Rate and Visits, which shows that readers searching for drop off rate who come to my blog are MOSTLY interested in this type of content and nothing else.
  • The homepage ("/") has the lowest Bounce Rate at 66%, which makes sense because users typically either search or click into a post they came to read as opposed to just staying on the homepage.

Below is the code snippet.
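As a hedged sketch of how a bubble chart like this is usually assembled in ggplot2 (sample pages and rates are invented for illustration):

```r
library(ggplot2)

# Hypothetical sample with Page, Visits and Bounce Rate columns
top10 <- data.frame(
  Page       = c("/funnel-drop-off-rate", "/", "/aam-overview"),
  Visits     = c(900, 700, 500),
  BounceRate = c(85, 66, 78)
)

# Bubble chart: colour encodes the page, bubble size is tied to Visits
ggplot(top10, aes(x = Visits, y = BounceRate, size = Visits, colour = Page)) +
  geom_point(alpha = 0.7) +
  scale_size(range = c(3, 12)) +
  labs(title = "Top 10 Pages by Visits and Bounce Rate", y = "Bounce Rate (%)")
```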



Top Traffic Sources from Jan-2017-April-2020 (Stacked Area Chart)


In this chart, I visualize the top traffic sources sending visitors to my blog over the last 3 years.





  • Organic search has traditionally been the top traffic source for my blog, but the interesting thing is that a lot of visitors come directly to my blog by typing the URL, which is very surprising to me unless they have it bookmarked.
  • Traffic via Social channels started appearing once I began sharing my blog posts on LinkedIn and Twitter in early 2018, which explains the trend.

Below is the code snippet.
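A stacked area chart like this can be sketched in ggplot2 roughly as follows (monthly counts per source are hypothetical):

```r
library(ggplot2)

# Hypothetical monthly visits per traffic source
src <- data.frame(
  Month  = rep(as.Date(c("2018-01-01", "2019-01-01", "2020-01-01")), each = 3),
  Source = rep(c("Organic Search", "Direct", "Social"), times = 3),
  Visits = c(100, 60, 10, 250, 120, 60, 600, 300, 200)
)

# geom_area() stacks the series by fill group automatically
ggplot(src, aes(x = Month, y = Visits, fill = Source)) +
  geom_area() +
  labs(title = "Top Traffic Sources", x = "Month", y = "Visits")
```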




Age and Gender Data Captured since late 2018 (Pie Chart and Bar Graph)


Now, this might be a bit surprising for some Adobe Analytics users, so let me explain how Google Analytics captures demographic data. This is done by enabling the Demographics and Interests reports in Google Analytics, which use data collected from the IDFA and Google advertising cookies to help with retargeting. Again, none of the data shown here is even borderline PII, so Google has taken privacy regulations into consideration. It might be a good addition for Adobe Analytics if it could receive similar data from the Adobe Ad Cloud platform.




There's not much to say here as these graphs are self-explanatory, but I manually added the percentages (the total is ~5500 Visitors) to show the breakdown of Gender in the pie chart. The code snippet is shown below.
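For reference, a ggplot2 pie chart is just a stacked bar transformed to polar coordinates; here's a minimal sketch with an invented gender split (the real total was ~5500 Visitors):

```r
library(ggplot2)

# Hypothetical gender breakdown for illustration only
gender <- data.frame(Gender = c("Male", "Female"), Visitors = c(3800, 1700))

# A single stacked bar flipped into polar coordinates becomes a pie chart
ggplot(gender, aes(x = "", y = Visitors, fill = Gender)) +
  geom_col(width = 1) +
  coord_polar(theta = "y") +
  theme_void() +
  labs(title = "Visitors by Gender")
```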



Word Cloud showing Internal Search Terms since early 2019


A word cloud is a commonly used visual to show search terms or popular tags that people are looking at. I've imported the "ggwordcloud" package to do this.



  • Given that I've written extensively about Adobe Audience Manager, it's not surprising that a lot of the search terms are tied to AAM.
  • This also tells me what else I can write about based on what people are searching for.

Below is the code snippet.
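A ggwordcloud visual along these lines can be sketched as follows (the search terms and frequencies are hypothetical):

```r
library(ggplot2)
library(ggwordcloud)

# Hypothetical internal search terms with frequencies
terms <- data.frame(
  word = c("aam", "profile merge rule", "destinations", "id sync", "traits"),
  freq = c(40, 25, 20, 15, 10),
  stringsAsFactors = FALSE
)

# Word size is scaled by search frequency
ggplot(terms, aes(label = word, size = freq)) +
  geom_text_wordcloud() +
  scale_size_area(max_size = 20) +
  theme_minimal()
```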



Map Showing Visitors by Country from Jan-2017-April-2020


Finally, this visual, created using the "maps" and "viridisLite" packages (already available in R), shows which countries are the most popular in terms of the total number of Visitors. As shown below, the United States is the most popular, followed by India, which are the two countries I'm associated with, so I'm not surprised :)



Below is the code snippet for this.
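A map like this can be sketched with the "maps" and "viridisLite" packages roughly as follows (the per-country visitor counts are invented):

```r
library(maps)
library(viridisLite)

# Hypothetical visitor counts per country (region names as used by the maps package)
visitors <- c("USA" = 3000, "India" = 1200, "UK" = 300)

# Draw the world map in grey, then shade the listed countries with viridis colours
map("world", fill = TRUE, col = "grey90")
cols <- viridis(length(visitors))
for (i in seq_along(visitors)) {
  map("world", regions = names(visitors)[i], fill = TRUE,
      col = cols[i], add = TRUE)
}
```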



Adobe Experience Platform Data Science Workspace


One of the best things about Adobe Experience Platform is that it provides the ability to run SQL queries and ML algorithms or models directly on the data (in XDM format) within the tool, which has never been the case in the past. This is super powerful and completely eliminates the additional time it would take to export the data and make it available in a platform outside of Adobe.

Data Science Workspace integrates Jupyter Notebooks, a very popular open source application that allows you to run ML models and perform data visualization using programming languages such as Python. The reason I'm mentioning it in this post is that it can also run code written in R, so in theory everything I showed you can work in Data Science Workspace, though my understanding is that it requires the underlying data to be in XDM format. I haven't dabbled with it due to data access constraints, but here's how you can access DSW and run R commands in case you have access.





So that was it! I hope this post piqued your curiosity about R and its data visualization capabilities.

Friday, May 8, 2020

AdobeDevBlog: Adobe I/O Architecture and Use Cases

Adobe I/O is a one-stop shop for developers, giving them everything they need to extend the capabilities of the Adobe tech ecosystem. I coauthored a post where we took a look under the hood at our I/O architecture, components, and all the dev tools and services available on I/O Runtime (Adobe's serverless platform), plus the different Adobe Experience Cloud use cases that I/O makes possible. Here's a link to Adobe's Medium Tech blog post I coauthored and below is the visual of the architecture.



Sunday, April 26, 2020

Overview of Real-Time CDP

Companies are continuously innovating and coming up with new tools and technologies to enhance the experience of their customers. Optimizing and personalizing the customer experience at every touchpoint needs to be the holy grail for all companies. I gave an overview of a Data Management Platform a few years ago and have written a few articles about Adobe Audience Manager since then.

In this post, I'll write about Real-Time Customer Data Platform, the activation engine and a service built on Adobe Experience Platform, and how it solves the ever-evolving challenge of delivering a consistent and personalized user experience. As you may be aware, Adobe Experience Platform is built with an API-first approach, which makes integrating with other platforms easier and allows us to replicate what we can do in the UI via APIs as well.


It's always helpful to start simple so I'll give a brief overview of what a Customer Data Platform is. There's a lot of documentation which compares a CDP to a DMP so I won't go into that in detail but will just focus on what a CDP is.


According to Wikipedia, "A customer data platform is a type of packaged software which creates a persistent, unified customer database that is accessible to other systems. Data is pulled from multiple sources, cleaned and combined to create a single customer profile. This structured data is then made available to other marketing systems".


I define it as a system which consolidates known and anonymous customer information from multiple data sources to serve 1:1 real-time personalized experiences across multiple touchpoints and marketing channels in a privacy-aware manner. Below is a visual overview of a CDP (taken from a 3rd party site). The diagram looks very similar to a DMP, but a CDP differs from a DMP in the following ways:

  • The activation of both known (PII) and anonymous user data is possible in a CDP
  • Profiles have a much longer data retention period in a CDP
  • A complete customer profile (historical, demographic) is maintained in a CDP



Core Elements of Real-Time CDP


Below are some core components of the Real-Time CDP and how it relates to Adobe Experience Platform.
  • XDM Schema & Datasets: The XDM Schema describes the logical model of all data coming into Adobe Experience Platform and contains information about the identities used to build the id graph. Essentially, any data that will be activated by Real-Time CDP has to be mapped to the XDM schema format. Datasets are tied to a schema and collect data sent either via streaming or batch from various sources (e.g. Audience Manager, Analytics, 3rd party sources) into AEP. 
  • Real-time Customer Profile: The Real-time Customer Profile is a combined view of all customer identities collected across multiple channels. I recently wrote about the device graph which creates a holistic view of the customer across multiple touch points. Real-Time CDP taps into the device/id graph to create a unified view of both the known and unknown customer profile on the fly depending on the use case.

  • Real-Time Web Data Streaming: The introduction of the AEP Web SDK has changed the dynamics of how Adobe data collection will happen moving forward. It essentially allows clients to send data from their website or mobile app directly to AEP in real-time, as well as to Analytics, Audience Manager, and Target. A lot of new content is being created on this which I'll share soon.

  • Real-Time Activation of Profiles: Real-Time CDP allows us to send segments and PII profiles to external destinations quickly, keeping in mind the rapid pace at which customers interact with brands across multiple channels. 
  • Built With Privacy By Design: All Adobe products, including Real-Time CDP, are built keeping privacy in mind. In the case of the AEP UI, customers can use Data Usage Labeling and Enforcement (DULE) labels to mark identities or restrict how data will be activated based on GDPR and other privacy laws. The official Adobe documentation provides more details on this.

A lot more information can be found in the official Adobe documentation in addition to what I've covered.


Adobe Solution Integration with Real-Time CDP


Visuals always help so I created a quick high-level data flow diagram outlining how data flows into and outside of AEP/Real-Time CDP from various Adobe solutions. Only the arrows coming into and flowing out of AEP/Real-Time CDP are colored and these connections can be enabled OOTB. The diagram should be self-explanatory but web data flows into AEP/Real-Time CDP via the AEP Web SDK/Adobe Analytics and 2nd/3rd party/other data flows into Real-Time CDP from Audience Manager. Data from Real-Time CDP can be shared with Adobe Campaign to personalize emails.



Real-Time CDP UI Overview


I'll now provide a high-level overview of the Real-Time CDP UI, but please be aware that these screens are likely to change as the product team is continuously optimizing the user experience and adding more enhancements.


Profiles

A profile is a collection of attributes and identities for an individual or entity which can be used to uniquely identify it. In this screenshot, the profile is tied to the ECID identity, but there can be additional identities for this profile such as an email address or phone number, to name a few. This is embedded within the Adobe Experience Platform UI.



Segments

The segmentation UI for Real-Time CDP is the same interface which AEP uses, and it looks very similar to the Segment Builder UI embedded in Adobe Analytics. Segments allow us to dissect a very large number of profiles into smaller and much more manageable chunks.



Identities

The identity page contains identity namespaces which contain unique information about a person. As shown below, an identity namespace can be an Email address, Phone number, ECID, etc. This is also embedded within the Adobe Experience Platform UI.


Sources

The Sources->Catalog tab allows us to bring data into AEP/Real-Time CDP from various sources such as Adobe applications, CRM, Marketing automation to name a few. Data from these can be combined with each other and sent to various activation platforms. Again, this is also embedded within the Adobe Experience Platform UI.



Destinations

The Destinations->Catalog tab allows us to activate the audiences and share them with ESPs, marketing automation platforms, and advertising platforms. This is the core feature of Real-Time CDP which you won't see in the regular AEP UI. There are two ways to send data via destinations:
  • Profile Export, which allows us to export PII data to ESPs and other platforms. 
  • Segment Export (anonymous), which allows us to export profiles that qualified for a segment to Demand Side Platforms and is similar to how Audience Manager exports data to DSPs. 
The UI shown below lists the standard connections but also lets us connect to Launch, which allows us to forward raw signals captured in Launch via extensions to any 3rd party partner.



The following is a visual (representational) system view which shows how many profiles are connected to sources and are sent to a destination.



Finally, I want to emphasize that I've not heard anything about Adobe Audience Manager being impacted by Real-Time CDP, as we still need AAM for 2nd and 3rd party data activation. Also, Audience Manager has over 100 destinations whereas Real-Time CDP currently only has a few, with connections to CRM, cloud providers, and email systems, but the list will only grow. I don't know about the overall product vision around the total number of destinations and AAM in general, but I'm sure our product team will clarify that.

So, that was it! I'm sure I've left out other finer details about Real-Time CDP but this tool is continuing to evolve and will only get better as its adoption increases.

Monday, March 9, 2020

Overview of Adobe Device Graph

We are surrounded by all kinds of (regular and smart) devices, whether it's a Laptop, Tablet, Smart Phone, Smart Watch, Smart TV, Smart Vacuum, Smart Plug, Nest Thermostat, Wifi Camera, or Chromecast, to name a few. However, not all of these devices are truly connected due to many factors, one being that not all devices in a household may belong to one single company, but the most obvious one is the lack of a true user "Identity". The other issue we face is that it's really difficult to track users and attribute an action across all these devices consistently. Google does this well with Google Home, where it's able to sync my Nest thermostat with Chromecast using my Gmail ID. The other company is obviously Adobe ;)

By 2030, Americans will own as many as 15 connected devices. While we're not fully there yet, the introduction of new devices every year might expedite that trend. In 2016, Adobe coined the phrase "Devices don't buy products, people do", which is still very relevant in 2020 and will be in the future. So how does Adobe establish the link between multiple devices and one person? In this post, I'll provide an overview of the various ways by which Adobe is able to link multiple devices to a person or household.


Concept of a Device Graph


According to this article, "A device graph, also known as “identity management,” is a map that links an individual to all the devices they use, which could be a person’s computer at work, laptop at home, tablet and smartphone". I'd like to add other smart devices to this list but the most common use cases for cross device analytics we see today are still around mobile, desktop and tablet but there's continuous innovation happening to bring different types of devices into the mix. 


Probabilistic and Deterministic Linking


The Adobe device graph comprises multiple devices linked together via two methods, namely deterministic and probabilistic. This Adobe article explains this concept really well, but at a high level, probabilistic device linking allows us to predict a person's identity (John Doe in our case) based on IP address, operating system, etc., while deterministic device linking allows us to identify a person based on their encrypted user ID, captured on the website and sent over to the Experience Cloud ID Service across devices.


The visual below is my attempt to explain the device graph at a high level where John Doe visits a website or mobile app using his iPhone, iPad and Mac. There are prettier visuals available in the Adobe documentation but I just wanted to create something simple to convey the concept.






Types of Device Graphs


There are primarily two types of device graphs which Adobe supports, and these are ways by which you can create a true identity of your customers across multiple devices and sites. There's also an external graph option which allows companies to leverage 3rd party device graph data, which is explained in detail here but is not in scope for this post.


Device Co-Op


The Device Co-op (available in the US and Canada) allows companies to participate in a shared device graph, which lets them identify their "linked" customer devices (at a person and household level) across a multitude of channels and websites in near real-time. The last time I heard about the scale of the Co-op, there were 100+ companies, about 300 million users, and 2 billion devices in the Co-op device graph, but this number is obviously higher now.



The concept of the Co-op can be better understood based on the visual (above) taken from the Adobe blog:

  • A customer John Doe visits the travel website and authenticates on both his mobile device and laptop. 
  • He then goes to a retail website but doesn't authenticate so the retailer doesn't know who this customer is.
  • Device co-op enables the retail website to "link" the customer's devices assuming both websites (companies) participate in the Device Co-op and identifies the anonymous user as John Doe. 
  • This in turn allows both companies to personalize the experience for this customer on both devices and websites. As you can see, the Device Co-op makes use of both the probabilistic and deterministic linking methods.

In contrast, if these companies didn't participate in the Co-op, the retail company wouldn't be able to know which devices its customer used before purchasing something and would treat him as 2 unique visitors instead of 1.

Below are some common use cases of the Device Co-op:

  • If you want access to a large pool of users for prospecting-related use cases.
  • If you want to perform frequency capping across devices.
  • Some other use cases are covered here.

Here's a link to a document which covers the membership and eligibility criteria for companies to participate in the Device Co-op. One other requirement of the Device Co-op is for the company to update its privacy policy so it takes into account all the necessary privacy requirements; the Co-op does not collect any kind of PII or behavioral data. You can view all your devices linked to the Co-op here and can unlink your devices from it anytime.

Private Device Graph


The Private Graph is another way for a company to link their customer devices in a device graph but visible and accessible only for their own organization and not any other company. 

Using the previous example, if we look at the travel website alone, we can treat it as a participant in a private graph, as the data will simply be available within the context of that company. The Private Device Graph primarily makes use of deterministic linking, as it works best when encrypted user ids are captured. Probabilistic matching will also be available for the Private Graph sometime this year.

Below are some use cases of the Private Device Graph:

  • If you want to reconcile user identities across multiple devices into one.
  • If you want to perform frequency capping across devices thereby optimizing ad spend.
  • If you want to establish a common identity across online and offline channels.

In order to participate in the Private Graph, the customer must be on Analytics Ultimate, or have Audience Manager, or have Target Premium, and also be an Adobe Experience Platform customer. If you qualify for this as a client, subscribing should be a no-brainer given that no privacy policy updates need to take place. However, the Device Co-op does give you access to a much larger number of users than you would get with a Private Graph, so you'll have to weigh your options on which one to choose.


Device Graph and Adobe Experience Cloud Solutions


In this section, I'll cover how to leverage some of the Adobe solutions I've used in conjunction with Device Co-op. Please note that I've only used Adobe Analytics and Audience Manager for customers participating in Co-op but there are use cases pertinent to Adobe Experience Platform, Adobe Target and Ad Cloud as well.

Adobe Analytics

Device Co-op is a natural fit for Adobe Analytics given the availability of the "People" metric, which deduplicates the user count tied to multiple devices and provides a true representation of a user as opposed to a device. This was introduced back in 2017, so it's been around for a while now, and you can find more information about it here. Below is an example of what this looks like in a sample taken from the Adobe documentation.



The other service you can leverage (I've not used it yet) is Cross Device Analytics (CDA) which allows you to analyze cross device behavior in Analysis Workspace using Adobe Analytics data (needs a cross device report suite). This is not a default service and all the details and eligibility requirements are included in this Adobe Spark page.


Audience Manager

If you are a member of the Co-op and use Audience Manager, you get a lot of benefits primarily around prospecting and cross device frequency capping in offsite marketing. The first thing to do is to setup your Profile Merge Rule similar to how it's shown in the screenshot below taken from Adobe's documentation. 

The other thing to note, consistent with the screenshot, is that your Co-op Devices per Person count should be lower than the Person count of other data sources. Finally, you should expect to see a much larger count of users (potential reach) in the Co-op Person number compared to any other data source.


Adobe Experience Platform, Adobe Target,  Ad Cloud

Device Co-op also provides additional capabilities to Adobe Experience Platform for id stitching, to Adobe Target for personalizing experiences across devices, and to Ad Cloud for media use cases around retargeting across devices. I haven't personally leveraged the Co-op for AEP, Target, or Ad Cloud, so I'm unable to provide any additional context as I don't have much exposure to these three in the context of Device Co-op.

So, that was a high level overview of the Device Graph but I'll be writing more about it as I learn more about how it's used with other solutions of the Adobe Experience Cloud stack. Please feel free to share your feedback!

Saturday, February 8, 2020

Review Adobe Experience Cloud ID Cookies on Google Chrome 80

Web browser cookies have long been the lifeblood of the digital marketing ecosystem. A cookie is a tiny text file used by a web browser that typically captures non-intrusive information about a browser (indirectly, a user) such as logged-in state, user preferences, and anonymized or encrypted ids. Cookies allow companies to measure browsing patterns (pages visited, products bought), remember products added to a shopping cart, or simply personalize user experiences to match previous patterns and behaviors.

As of December 2019, Google Chrome was the most popular web browser with a market share of 69%. Google recently released Chrome version 80, which introduced clear guidelines around how cookies need to be set moving forward, keeping in mind privacy regulations such as GDPR and CCPA. For comparison, Safari now completely blocks 3rd party cookies from being set. Given that Google has advertising platforms which primarily leverage 3rd party cookies, there are still a few years left before Google completely phases out 3rd party cookies by 2022, which means we don't have a lot of time left. So, what does it mean in the interim for companies like Adobe which also rely on cookies, measuring user behavior using 1st party cookies and activating anonymous profiles on publishers using 3rd party cookies?

In this post, I will review changes made by the Adobe Identity Service team to address cookie setting requirements in Google Chrome 80.  I will refer to this article written by the Adobe Identity service team and validate changes made by Adobe primarily around the Experience Cloud ID Service cookies. So let's dive in!


What is all the Fuss About?


Here, I'll cover exactly what has been changed by Chrome 80, and I've made a simple decision tree to depict what the change means in regards to the new cookie guidelines. At a high level, 3rd party cookies need to be Secure with a SameSite attribute equal to "None". On the other hand, cookies without a SameSite attribute will default to "Lax".
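To make this concrete, a 3rd party cookie that satisfies the new Chrome 80 requirements has to be issued with a response header roughly like this (the cookie value and lifetime here are illustrative, not actual Adobe values):

```http
Set-Cookie: demdex=1234567890; Domain=.demdex.net; Max-Age=15552000; SameSite=None; Secure
```

Without the SameSite=None and Secure attributes, Chrome 80 treats the cookie as SameSite=Lax and will not send it in cross-site requests.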

In the next two sections, I'll do a quick pre/post comparison between Chrome 79 and Chrome 80 and review the Experience Cloud demdex cookie set in both versions.


The World Before Google Chrome 80


I'm first looking at Chrome version 79 and I've visited an Adobe customer website using the Experience Cloud Visitor ID Service.

I'll primarily focus on the demdex cookie, which is our 3rd party cookie most susceptible to deletion after this change. I previously wrote about migrating to the Visitor ID service, where I covered some of the cookies set by the ID service in more detail. The first thing we see is an error in the developer console specifically calling out the demdex cookie, stating that it's set without the SameSite attribute.

The World After Google Chrome 80


In this section, I will show how the Google Chrome error for demdex went away after I updated my browser version and looked at the same customer website in incognito mode.

Looking at what the developer console shows for the same customer website, we can see that the change was made by the Adobe ID Service (demdex) server side and the error went away. Please note that it DID NOT require a Visitor ID service version upgrade as the change was made server side. Having said that, it's always advisable to upgrade your ID service library to the latest version. 

Now looking at how the demdex cookie is now set as shown by Chrome 80, we can see that the cookie has both the SameSite=None and Secure values set with a TTL expiration of 6 months.

What is the Recommendation for 1st Party Cookies on Safari?


In this section, I want to talk about the importance of leveraging a CNAME tracking server to measure your website activity in Adobe Analytics, which has become more of an issue post-ITP 2.1 in Safari (slightly going off topic). This Adobe article covers how we can use a CNAME to set a new s_ecid cookie that extends the AMCV cookie expiration to 2 years instead of the 7 days which Safari enforces today (see below).

Please note that ID service version 4.3.0+ is required to take advantage of this change and extend your visitor expiration to 2 years instead of 7 days on Safari.

So, that's it! Hope you found this helpful in understanding what changes were made by Chrome 80 and how Adobe is prepared to address any potential tracking issues as a result.