Saturday, October 24, 2020

Marketo Integration with Adobe Experience Cloud Solutions

I've written in the past about many of the solution integrations enabled by the Experience Cloud ID service. Its value comes to the fore through the various integrations that help clients realize the full return on their investment in Adobe technology. A relatively new addition to the Experience Cloud ecosystem is Marketo, which Adobe acquired a few years ago for its rich cross-channel activation and marketing automation capabilities. These are aimed primarily at the B2B space, but Marketo can be leveraged for B2C use cases as well. 

I know that doesn't do justice to how powerful Marketo really is, so here's a high-level overview of some of its capabilities, especially around marketing automation:

Let's start by understanding what marketing automation is. It is technology that automates the measurement and orchestration of omnichannel marketing initiatives, simplifying lead management and nurturing, event marketing, personalization, and regular and triggered emails or SMS. I have worked extensively in the automotive vertical in the past and know how crucial it is for automotive companies, or any company, to manage their leads throughout the customer journey, from awareness to purchase. Marketo is the perfect system to do that, and if you combine it with the power of the Adobe Experience Cloud, you are truly able to orchestrate and measure the customer journey from start to finish.

In this post, I will walk through the integration of Marketo with other Adobe Experience Cloud solutions, focusing on Audience Manager, but the general process is the same for Adobe Analytics and Target. Please note I'm specifically referring to Marketo Engage but will call it Marketo in this post.


Use Cases

Let's start with some common use cases that can be executed with this integration: 

  • Cross-device media activation of leads (B2B or B2C data from Marketo activated in Audience Manager leveraging the device graph)
  • Event-driven (signups, abandonments etc.) user messaging (in Marketo based on user behavioral data from Analytics)
  • Cross-channel and cross-device personalization (via Target and Marketo based on onboarded data, user behavioral data from Analytics leveraging the device graph in Audience Manager)


Prerequisites


Let's take a closer look at some of the prerequisites with this integration.
  • An Adobe org admin can enable this integration by mapping the IMS org in Marketo (Admin section).
  • It is recommended to set up the cookie sync between the Adobe Experience Cloud ID Service and Marketo's munchkin.js as early as possible to ensure a higher match rate. The "cookie sync" happens automatically as long as both scripts are present on the page.

Architecture


This is a slightly detailed architecture diagram showing the bi-directional integration between Marketo and Audience Manager; the general process is the same for Adobe Analytics and Target (currently, Target only supports Marketo -> Target audience sharing). 
  • One thing to note in this architecture is that the Marketo to AAM integration is currently manual, but the AAM -> Marketo integration is automated: AAM audiences are refreshed in real-time, with a backfill done every 24 hours. Step #4 is explained in more detail in Marketo's official documentation.
  • Another thing to note is that in Marketo, there will be an option to either "Send to Experience Cloud" or "Sync from Experience Cloud Audience" to send and receive segments from Audience Manager and Analytics respectively. For Target, it's currently a one-way sync from Marketo.


Other Facts


Finally, let's take a quick look at some other interesting facts about the integration with Marketo:
  • Any email lead data sent from Marketo to Audience Manager must be hashed (Marketo handles this automatically if email addresses are captured).
  • There is already an integration between AEM and Marketo where Marketo can receive assets from AEM to embed in emails. More information can be found here.
  • An integration between Ad Cloud, Real-Time CDP and Marketo is also possible. I have not worked with it directly but will share once I learn more.
  • This integration is not PHI compliant but can be used by any customer not mandating PHI compliance.


Hope this gave you a general understanding of how this integration works. Feel free to share your use cases for this integration or let me know if you have any questions.

Sunday, August 23, 2020

Medium Blog: Power Personalized Experiences with Project Firefly

It's never been more important for organizations of all sizes to personalize their experiences for their customers and audiences, and AEM and Adobe Target are two stellar ways to do this. 

Here's a link to a post I coauthored showing how to use Project Firefly, our framework for building custom, cloud-native Adobe apps, to integrate AEM and Target in a separate UI to more easily achieve your personalization goals.

Monday, July 27, 2020

Comparison Between Adobe Analytics and Customer Journey Analytics

Adobe Analytics has long been the undisputed leader in the world of Web Analytics and is still a marquee product for analyzing web and mobile app data. It is the bread and butter for consultants and data analysts worldwide who work on enterprise level data. However, just like any enterprise level product, it does come with its share of challenges. I wrote some posts last year outlining some of these challenges. This post lists a challenge we face while uploading classification data in Adobe Analytics and this article talks about the implication of uploading historical data (see point #3).

So, is there a solution that can make these challenges go away? YES there is, and it's Customer Journey Analytics. Customer Journey Analytics, or CJA, is an enterprise-wide analytics product built on Adobe Experience Platform. CJA allows us to join different data sources (online and offline) to give a complete view of our customers in real time across channels. Please note that CJA is considered an add-on to Adobe Analytics, is also available to non-Platform (AEP) customers, and works natively with Adobe Experience Platform.


In this article, I'll compare Adobe Analytics with CJA based on a set of standard capabilities which are common between the two solutions and highlight some of the differences. The writeup is long but I've combined all the content in a single matrix at the end so feel free to scroll down to view it in one tabular view.



Adobe Analytics

In this section, I've listed the various capabilities of Adobe Analytics and added a high level writeup explaining each of these separately. I've done the same for Customer Journey Analytics.


1.    Data Capture 
o   Primarily takes place based on the AppMeasurement library (client-side-web), Mobile SDK (mobile app), Data insertion API and Bulk Data Insertion API (server-side).

2.   Data Usage
o   Data is stored in Report Suites usually setup to receive data globally or individually based on the requirement.
o  Virtual Report Suites (VRS) can be created to “split” data based on web/mobile, region or business group and can be set up with custom session timeouts, expiration and time zones. 

3.   Reporting and Analysis
o   Data is visualized in Analysis Workspace or the legacy UI.
o  Workspace panel includes Freeform, Cohort, Fallout etc. options available to visualize data.
o  Calculated metrics can be created, and marketing channels can be used for further analysis.
o  Robust data export capabilities (PDF, CSV etc. formats) as well as access to raw data feeds.
o  Ability to setup alerts in case of anomalies.

4.   Identity
o   Primarily based on cookies for client-side web tagging. 
o  Based on ECID for mobile app (tied to each installed instance of the app).
o  Customer IDs converted to ECID for server-side implementations in general.
o  Device graph data can be accessed via the People metric or leveraged via Cross-Device Analytics.

5.   Segmentation
o   Segmentation built into Analysis Workspace
o   Visitor, Visit and Hit segment containers available.
o   Sequential segmentation and exclusion capabilities available to users.

6.   Data Limitations
o   Limited to 200 eVars/props and 1000 events.
o   UI limited to 500K unique rows of data per month (Low Traffic).

7.   Data Classifications
o   Classifications subject to the same restrictions as the UI in terms of only classifying the top 500K rows.

8.  Historical Data Ingestion
o   Historical data can be sent in, but out-of-order hits can affect the sequence of events and the attribution of eVars and marketing channels.

9.   User Permissions
o   User permissions are granted via the Admin Console at a product profile level, with granular control over report suites etc.

10. Data Latency
o   Data can take up to 2 hours to be fully available in Adobe Analytics.






Customer Journey Analytics

In this section, I've put CJA through the same set of capabilities as I did for Adobe Analytics. Please note that there are some features that CJA lacks compared to Adobe Analytics, which the product team is working to add support for. Those are explained in more detail here.

1.    Data Capture
o   Data needs to be conformed to Adobe Experience Platform’s XDM schema to bring in any type of data.
o   Web SDK is used for real-time data streaming, and a streaming API will be available for sending data server-side.

2.   Data Usage
o   Data is stored in datasets created within Adobe Experience Platform and added to CJA as Connections.
o  Data Views are similar to VRS and also allow us to define data based on the type of datasets being analyzed, as well as set custom session timeouts, expiration and separate time zones.

3.   Reporting and Analysis
o   Data in CJA is visualized in Analysis Workspace.
o   Workspace panel includes Freeform, Cohort, Fallout etc. options available to visualize data.
o   Calculated metrics can be created for further analysis; marketing channel support is not available yet, but is planned.
o   No current ability to export data in CJA (Workspace) but support is planned. However, Query Service and Data Access API provides the ability to export data.
o   No current ability to setup alerts but support is planned.

4.   Identity
o   Tied directly to the Namespace defined within Adobe Experience Platform.
o   ID can be based on anything be it cookies, CRM id, Loyalty ID or Phone number.
o   Custom namespaces can be defined.
o   Data in the device graph is NOT available yet but support is planned.

5.   Segmentation
o   Filters built into Analysis Workspace.
o   Person, Session and Event segment containers available.
o   Leverages the same standard segmentation UI/features as Adobe Analytics.

6.   Data Limitations
o   Unlimited metrics and dimensions; data in eVars/props is available in XDM format within CJA.
o   Unlimited number of rows and unique values.

7.   Data Classifications
o   Lookup Datasets created in Platform are not subject to row-count restrictions, but there is a 1 GB size limit which isn't "enforced".

8.  Historical Data Ingestion
o   Any missing historical data can be uploaded into Adobe Experience Platform and then leveraged in CJA including support for out of order hits for a person.

9.   User Permissions
o   Only product admins (not all users) can perform granular tasks such as deleting, updating and sharing Workspace dashboards with other users.

10. Data Latency
o   Data isn’t available in near real time and can take up to 2 hours to appear, but real-time support is being looked into.



Here's the matrix which consolidates the capabilities compared above in a tabular format. Please note that I took a stab at also calling out which solution is (currently) better for a particular capability by adding a checkmark. If there's no checkmark, then it means that the two solutions are on par with each other or support is planned to add that feature to CJA by the product team.


Hope this article provided you with some more information and context to figure out some similarities and differences between Adobe Analytics and Customer Journey Analytics. The key point: if you analyze large amounts of dimensional data (exceeding 500K unique rows per month), often analyze customer data across multiple channels, need to add missing historical "hit level" data after the fact, or want to connect offline data with online data to get a single view of the customer, then you should seriously consider CJA.

Are you in the process of considering this tool or have any further questions? Feel free to post them here.

Saturday, July 25, 2020

Medium Blog: Adobe Experience Platform Web SDK for Audience Manager

Website data has traditionally been collected from an object or data layer on the page, or forwarded into Adobe Audience Manager’s servers by Adobe Analytics using Adobe Experience Platform Pipeline. With the new SDK, Adobe Experience Platform Launch sends all website event data to Adobe Experience Platform’s Edge server, where it is federated out to the different Adobe Experience Cloud products, including Audience Manager.

I coauthored this post on Adobe's Medium Tech blog to explain how to route data to Adobe Audience Manager through the newly launched Adobe Experience Platform Web SDK. Here's a link to it.

Sunday, June 28, 2020

Medium Blog: Exploring the Impacts to Adobe Analytics when Migrating to AEP Web SDK

Adobe Experience Platform Web SDK is intended to replace the existing libraries across our various Adobe Experience Cloud solutions, such as AppMeasurement.js, at.js, dil.js, and visitorapi.js. The important difference is that the library was written from the ground up, leveraging an updated schema via the Adobe Experience Platform Experience Data Model (XDM) to map and collect data agnostic of the solution the data is pushed into for further processing. I coauthored this post on Adobe's Medium Tech blog to explain how to route data to Adobe Analytics through the newly launched Adobe Experience Platform Web SDK. Here's a link to it.

Tuesday, June 2, 2020

Medium Blog: Adobe Experience Platform Web SDK Migration Scenarios

Adobe Experience Platform Web SDK consolidates solution-specific requests into a single payload for a seamless and much more simplified implementation. I coauthored this post on Adobe's Medium Tech blog as part of a series of blogs we're going to write about migrating existing Adobe implementations to the newly launched Adobe Experience Platform Web SDK. Here's a link to it.

Sunday, May 24, 2020

Visualizing Traffic on my Blog using R

I've been a data analyst in the past and one thing I can say for sure is that we don't have to be great analysts or statisticians to be able to read basic graphs and understand trends. Visuals are all around us, whether it's stock market trends or data around the dreaded Covid-19 pandemic. By now, I'm sure all of us have heard about "flattening the curve". It literally took a pandemic for us to know what it means, but the point I'm trying to make is that we are surrounded by data and people should ideally know how to understand it. I recently learnt the basics of R, a programming language mostly used in data analytics, statistical analysis and visualization. R is a good language for data analysts and statisticians to learn and resonates really well with professionals who know SQL. 

In this post, I'll visualize traffic coming to my blog since 2017 (data captured in Google Analytics) and show some commonly used graphs and visualizations using R Studio. The most obvious trend you'll see is traffic started gradually increasing on my blog since I started writing again in January-18 and has really spiked in the last few months. So, let's dive in!



Basics of R


As I explained earlier, R is a programming language used to analyze existing trends and also do predictive analytics using statistics. For the purposes of this article, I'm using R Studio to run basic R commands to create simple visuals such as bar graphs, line graphs and slightly more complex visuals such as bubble charts, word cloud and a map using some commonly used packages. 



Data Frame


The first step before doing any analysis in R is data wrangling, which is the manipulation and transformation of data into a format you can use for analysis. In R, we do this by creating a data frame, essentially a table that is typically populated by importing a .CSV file, though other formats such as SPSS or Stata are also supported for advanced use cases. 

A data frame contains rows and columns and can be compared to an Excel spreadsheet. The other thing to keep in mind is that it's okay to do some data transformation in the source file itself before bringing data into R but a lot of the manipulation is done directly in R itself. In my case, I modified the source .CSV files exported from Google Analytics for basic data formatting such as switching the metrics to Number format as an example. The command to bring data into R via a .CSV file is: 
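A minimal version of that command (the filename here is a placeholder) looks like this:

```r
# Read the exported Google Analytics .CSV into a data frame, keeping
# text columns as plain strings instead of converting them to factors
df <- read.csv("ga_export.csv", stringsAsFactors = FALSE)
```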

where df is the data frame, read.csv is the function which reads a .CSV file and stringsAsFactors = FALSE ensures that the data is not converted into a factor to keep the source data format intact. The original file contained Page name and some common metrics such as Page views, Visits etc.


Package


R packages are reusable code libraries that provide additional functionality to R and help simplify tasks. You can install packages using the install.packages() function and invoke them using the library() function. In my case, I'm using the following packages:


  • library(ggplot2)
  • library(lubridate)
  • library(ggwordcloud)
  • library(maps)
  • library(dplyr) 

Finally, before we take a look at the visuals, one other thing to note is that I did some basic data manipulation in R to convert the Month and Year using the ymd() and mdy() functions from the lubridate library. There are a lot of other things I could cover under basics, but that's outside the scope of this post.
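As a sketch of that conversion (assuming the exported Month column holds strings like "2018-01"):

```r
library(lubridate)

# Append a day so ymd() can parse "2018-01" style labels into Dates,
# which lets ggplot2 order the x-axis chronologically
dfl$Month <- ymd(paste0(dfl$Month, "-01"))
```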


Visualizing Blog Traffic Trends


In this section, I've inserted the graphs created in R Studio which were saved as images. I've used the "ggplot2" package which is a very popular R library to visualize data. I'll admit that my blog does not get a ton of traffic but that is not my intent as I'm not into any competition to artificially inflate my traffic. My intent is to share what I know with others and document things for myself for future reference. Let's take a look at some of the trends.



Show Visits and Page Views for the Last 3 Years (Line)


In this line graph, I've visualized Visits (called Sessions in Google Analytics) and Page views for the last 3 years. 





  • If you look closely at the label I manually added, traffic started gradually increasing once I started writing again in early 2018. 
  • The biggest spike happened thanks to my last post about Real-Time CDP, which I wrote last month. This shows how much my readers want to consume content about the latest and greatest technology from Adobe, especially if it's around Adobe Experience Platform.

I mentioned earlier that it's very common for analysts to modify the source data before bringing it into R, which is what I did to generalize the page names by removing the month and year using the MID() function in Excel.





As promised, here's the code sample to generate this visual. Please note that I created a separate data frame called 'dfl' which contained Month, Visits and Page views. Also, note that the file format is .Rmd (R Markdown), a format that combines R code with narrative text and visual output. 
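A minimal sketch of that code, assuming dfl's metric columns are named Visits and Pageviews:

```r
library(ggplot2)

# One line per metric, plotted against the parsed Month dates
ggplot(dfl, aes(x = Month)) +
  geom_line(aes(y = Visits, color = "Visits")) +
  geom_line(aes(y = Pageviews, color = "Page views")) +
  labs(title = "Blog Visits and Page Views by Month",
       x = "Month", y = "Count", color = "Metric")
```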




Top 10 Pages Visited from Jan-2017-Apr-2020 (Flipped Bar Graph)


In this bar graph, I'm showing the top 10 pages by Visits over the last 3 years. 





  • The most popular page is the one I wrote to show the calculation of funnel drop off rate way back when I started blogging. This shows that there's still a sizable audience looking for calculations of basic metrics such as drop off rate, as the traffic source for this page is primarily search engines.
  • The other popular page is the homepage, which may mean that people get sent directly to my homepage via search. Again, this covers the last 3 years' worth of data, so more analysis is needed to understand this fully, which is outside the scope of this post.
Below is the code I wrote.
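A representative sketch of it, assuming df has Page and Visits columns:

```r
library(ggplot2)
library(dplyr)

# Keep the 10 pages with the most Visits
top_pages <- df %>% arrange(desc(Visits)) %>% head(10)

# coord_flip() turns the bars horizontal so long page names stay readable
ggplot(top_pages, aes(x = reorder(Page, Visits), y = Visits)) +
  geom_col() +
  coord_flip() +
  labs(title = "Top 10 Pages by Visits", x = "Page", y = "Visits")
```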


Top 10 Pages by Visits and Bounce Rate from Jan-2017-Apr-2020 (Bubble Chart)


In this chart, I show the top 10 pages (last 3 years) visualizing Visits and Bounce Rate. The color of the bubble is the page name and the size of the bubble is tied to Visits.



  • The most popular page name (drop off rate) has the highest Bounce Rate and Visits which shows that readers searching for drop off rate who come to my blog are MOSTLY interested in this type of content and nothing else.
  • The homepage ("/") has the lowest Bounce Rate at 66%, which makes sense because users typically either search or click into a post they came to read rather than staying on the homepage.

Below is the code snippet.
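A sketch along these lines (the Visits, BounceRate and Page column names are assumptions):

```r
library(ggplot2)
library(dplyr)

top_pages <- df %>% arrange(desc(Visits)) %>% head(10)

# Bubble chart: size encodes Visits, color encodes the page name
ggplot(top_pages, aes(x = Visits, y = BounceRate,
                      size = Visits, color = Page)) +
  geom_point(alpha = 0.7) +
  labs(title = "Top 10 Pages: Visits vs. Bounce Rate",
       y = "Bounce Rate (%)")
```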



Top Traffic Sources from Jan-2017-April-2020 (Stacked Area Chart)


In this chart, I visualize the top traffic source sending traffic to my blog for the last 3 years.





  • Organic search has traditionally been the top traffic source for my blog, but interestingly, a lot of visitors come directly to my blog by typing the URL, which is very surprising to me unless they've bookmarked it.
  • Traffic via social channels started appearing once I started sharing my blog posts on LinkedIn and Twitter in early 2018, which explains the trend.

Below is the code snippet.
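A minimal sketch, assuming a data frame df_src with Month, Source and Visits columns:

```r
library(ggplot2)

# Stacked area chart: one band per traffic source, stacked by month
ggplot(df_src, aes(x = Month, y = Visits, fill = Source)) +
  geom_area() +
  labs(title = "Traffic Sources by Month", fill = "Source")
```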




Age and Gender Data Captured since late 2018 (Pie Chart and Bar Graph)


Now, some Adobe Analytics users might find it surprising how Google Analytics captures demographic data. This is done by enabling the Demographics and Interests reports in Google Analytics, which use data collected from the IDFA and Google advertising cookies to help with retargeting. None of the data shown here is even borderline PII, so Google has taken privacy regulations into consideration. It might be a good addition for Adobe Analytics if it could receive similar data from the Adobe Ad Cloud platform.




There's not much to say here as these graphs are self-explanatory, but I manually added the percentages (total is ~5500 Visitors) to show the breakdown of Gender in the pie chart. The code snippet is shown below.
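A sketch of both visuals, assuming data frames df_gender (Gender, Visitors) and df_age (Age, Visitors):

```r
library(ggplot2)

# Pie chart of Gender: a single stacked bar wrapped into polar coordinates
ggplot(df_gender, aes(x = "", y = Visitors, fill = Gender)) +
  geom_col(width = 1) +
  coord_polar(theta = "y") +
  theme_void()

# Bar graph of Visitors by age bracket
ggplot(df_age, aes(x = Age, y = Visitors)) +
  geom_col() +
  labs(title = "Visitors by Age Group")
```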



Word Cloud showing Internal Search Terms since early 2019


Word cloud is a commonly used visual to show search terms or popular tags which people are looking at. I've imported the "ggwordcloud" package to do this. 



  • Given that I've written extensively about Adobe Audience Manager, it's not surprising that a lot of the search terms are tied to AAM.
  • This also tells me what else I can write about based on what people are searching for.

Below is the code snippet.
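A sketch using ggwordcloud, assuming df_terms holds each search Term and its Searches count:

```r
library(ggplot2)
library(ggwordcloud)

# Word size is proportional to how often the term was searched
ggplot(df_terms, aes(label = Term, size = Searches)) +
  geom_text_wordcloud() +
  scale_size_area(max_size = 12)
```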



Map Showing Visitors by Country from Jan-2017-April-2020


Finally, this visual, created using the "maps" and "viridisLite" (already available in R) packages, shows which country is the most popular in terms of the total number of Visitors. As shown below, the United States is the most popular, followed by India, which are the two countries I'm associated with, so I'm not surprised :)



Below is the code snippet for this.
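A representative sketch, assuming df_geo maps country names (region) to Visitor counts:

```r
library(ggplot2)
library(maps)
library(dplyr)

# Join visitor counts onto the world map polygons by country name
world <- map_data("world")
world_visits <- left_join(world, df_geo, by = "region")

# Fill each country by Visitors using the viridis palette;
# countries with no data are drawn in light grey
ggplot(world_visits, aes(x = long, y = lat, group = group,
                         fill = Visitors)) +
  geom_polygon() +
  scale_fill_viridis_c(na.value = "grey90")
```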



Adobe Experience Platform Data Science Workspace


One of the best things about Adobe Experience Platform is that it gives you the ability to run SQL queries and ML algorithms or models directly on the data (in XDM format) within the tool, which has never been possible in the past. This is super powerful and completely eliminates the additional time it would take to export the data and make it available in a platform outside of Adobe. 

Data Science Workspace integrates Jupyter notebooks, a very popular open-source application that lets you run ML models and perform data visualization in programming languages such as Python. The reason I'm mentioning it in this post is that it can also run code written in R, so theoretically everything I showed you can work in Data Science Workspace, though my understanding is that it requires the underlying data to be in XDM format. I haven't dabbled with it due to data access constraints, but here's how you can access DSW and run R commands in case you have access.





So that was it! I hope this post piqued your curiosity about R and its data visualization capabilities.