The Omni-Channel Multi-Party Customer Journey
St Michael slays the dragon


(Or how I learnt to slay dragons and drink from poisoned chalices)

For many large organisations the goal of better managing and focusing on their customers is a holy grail … a holy grail in the sense that we can all visualise what it looks like and we absolutely MUST hold it, yet the quest to win it is fraught with dangers and there are many twists and turns along the way.

An ironic wit might say “just creating the customer journey is a journey in itself”.

Another might quip “here be dragons”.

This is the story of how one team overcame their fears to slay the dragon and claim the grail.

In the beginning … was an Enterprise Data Warehouse

Oddly enough, our story begins with an ambitious plan to corral all of the enterprise's data assets into a Teradata warehouse. That is another story in itself, but the idea of the 360-degree customer view definitely started there, and some of the heavy lifting took place in this system. What I mean is that this enterprise had a bucketload of IT systems (SAP, a couple of billing systems, at least two CRM systems, several order management and provisioning systems, fault management, inventory, GIS and so on) that all got mapped into this warehouse, and the problems of daily/weekly/monthly refreshes got solved.

But I digress. Now, what problem are we solving?

Who is our customer?

The omni-channel, multi-party customer is a mouthful, so what does it mean?

Well, customers are complicated things. Sometimes the customer is a person buying for themselves; other times that person is buying for their company. Then there are family members and organisational hierarchies to deal with.

Sometimes the customer goes into a shop, other times they ring up the call centre. These days they are more likely to use an app or website to contact us. Sometimes they start their purchase online, then they ring up, and finally they go into a shop. So how do you even begin to trace what the customer is doing, especially when Aunty Doris is buying in Benalla on behalf of nephew Jake, who shopped online in Boronia?

It’s a minefield.

So the concept here is of a customer as an entity (perhaps a group of people) that purchases or uses a product or service. And the need for the omni-channel multi-party customer journey is the need to stitch together all the interactions, across every channel, that a customer makes to purchase and use a product or service. Simple!

Solving the problem is hard

The typical enterprise has a bucketload of systems, maybe 100 or more if that enterprise is moderately complex. The customer journey can impact many of those systems. First of all there is the website, where perhaps the customer first indicates interest by signing up. Then there are retail stores, some directly owned and some managed through a channel where information about the customer gets collected, usually into different systems and in different formats.

At the point of sale there are usually forms to fill in. These forms are used to populate ordering systems, warranty systems and then the ubiquitous CRM.

For many organisations the Customer Relationship Management system, or CRM, is, or should be, the system of record for all things customer. I have two arguments to counter this from a purely practical perspective. First, every CRM implementation I have witnessed trying to “boil the ocean” of customer relationships has run into challenges, so smaller point solutions have sprung up around the CRM, resulting in a proliferation of mini-CRMs. Secondly, the diversity of data that the enterprise collects about a customer has become immeasurably more complicated in recent years, resulting in the need for bolt-ons to the traditional CRM. IoT applications, for example, have added to the customer interface without being captured in the traditional CRM.

Moving on from the CRM, there are numerous touch points that affect the customer, not all of them appropriately tagged and mapped to a customer. For example, a project I was involved with sought to alert people to bushfire danger based on their location. One could easily extrapolate this example to a generic use case where a customer or user of enterprise services is affected by external events, necessitating the integration of datasets from outside the enterprise into the customer journey.

The problem I have described above boils down to three big issues to overcome.

Issue 1 is the ability to consolidate data from literally hundreds of different and disparate data sources, with different formats and different identifiers, into a data store that allows us to map it and identify each and every customer interaction. This is tricky and needs thinking that goes beyond the relational model for data.

Issue 2 is the ability to process literally millions of interactions at scale. That by itself may not be a challenge, but when the data sources are sitting on a huge variety of in-house hardware it is (show me the 100-year-old corporate that is 100% cloud native).

Issue 3 is to determine, to a degree of certainty, the customer that each and every interaction relates to, remembering that many of these interactions are not tagged to specific customers, that device addresses and web browsers are involved, and acknowledging the multi-party aspect of each interaction (i.e. the proxy acting for the end customer).

How did we solve the problem?

The secret sauce involved three pieces of relatively modern technology that had not been available to the builders of the Enterprise Data Warehouse some 10 to 20 years ago.

The first issue was solved through a three-step process. The first step mapped data from source systems to identical data structures in a big data lake (Hadoop, of course); the concept here was to get data in a timely manner, which meant the minimum possible transformation from source. The second step was to extract from each of the “new” transactions any new customer identifiers (remembering that we are comparing different identifiers from hundreds of systems; for example, what does a new IP address signify, a new customer or an existing customer in a new location?). The third step was to extract from all the transactions a set of customer events, some new, some representing updated data on old events.
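To make those three steps concrete, here is a minimal PySpark sketch of what the ingest could look like. The system, path and column names (ordering_system, account_number, client_ip and so on) are illustrative assumptions, not the actual sources; the point is the shape of the pipeline: land with minimal transformation, pull out identifiers, then emit events.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("journey-ingest").getOrCreate()

# Step 1: land the source extract with minimal transformation into the lake.
raw_orders = spark.read.json("hdfs:///landing/ordering_system/2020-06-01/")
raw_orders.write.mode("append").parquet("hdfs:///lake/raw/ordering_system/")

# Step 2: pull out every customer identifier the transaction carries, one row
# per identifier, so it can be compared with identifiers seen in other systems.
identifiers = (
    raw_orders.select(
        "order_id",
        F.explode(F.array(
            F.struct(F.lit("account_number").alias("id_type"),
                     F.col("account_number").cast("string").alias("id_value")),
            F.struct(F.lit("email").alias("id_type"),
                     F.col("email").cast("string").alias("id_value")),
            F.struct(F.lit("ip_address").alias("id_type"),
                     F.col("client_ip").cast("string").alias("id_value")),
        )).alias("identifier"),
    )
    .select("order_id", "identifier.*")
    .where(F.col("id_value").isNotNull())
)
identifiers.write.mode("append").parquet("hdfs:///lake/identifiers/")

# Step 3: emit customer events in a common shape, some brand new, some carrying
# updated data for events we have already seen.
events = raw_orders.select(
    F.col("order_id").alias("source_key"),
    F.lit("ordering_system").alias("source_system"),
    F.lit("order_placed").alias("event_type"),
    F.col("order_timestamp").alias("event_time"),
    F.to_json(F.struct([F.col(c) for c in raw_orders.columns])).alias("payload"),
)
events.write.mode("append").parquet("hdfs:///lake/events/")
```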

At this point modern technology comes to our aid. The key element of the solution is a JSON data store. JSON, like XML, is a flexible, self-describing data structure that allows a developer to write programs that manipulate different formats independently but store them in a single data store. The key (no pun intended) to operating the JSON data store is an identifier that allows applications to extract the right set of data. This is where the second bit of modern technology comes into play. Imagine an enterprise with (say) 10 million customers and 100 different systems. It is possible that there may be 10 million x 100 = 1 billion identifiers as a result. Of course you hope that some systems share identifiers, but often different generations of system have evolved differently (i.e. those developed in the early 2000s identified customers through an account number, those developed 10 years later had evolved to use an auto-generated customer ID, and then there were several different customer mapping/CRM systems that all used different IDs to describe the same customer). So let's assume 100 million lookups are required to match each unique customer event. Doing that takes some grunt and a handy amount of parallelism. Thankfully data lakes run on Linux clusters that provide thousands of CPUs, and processing frameworks like Spark take advantage of massive parallelism to speed through these types of lookups.
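To illustrate the lookup at scale, the sketch below resolves each extracted identifier against a cross-reference table that maps every (system, identifier) pair ever seen onto a single golden customer key. The table layout and paths are assumptions for the example; the point is that Spark turns what would be 100 million point lookups into one distributed join.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("journey-id-resolution").getOrCreate()

# Identifiers extracted from new transactions (step 2 above).
identifiers = spark.read.parquet("hdfs:///lake/identifiers/")

# Cross-reference of every (id_type, id_value) pair ever seen, mapped to one
# golden customer key.
xref = spark.read.parquet("hdfs:///lake/customer_xref/")

# The distributed join: Spark shuffles both sides by the join key and resolves
# identifiers in parallel across the cluster's CPUs.
resolved = identifiers.join(xref, on=["id_type", "id_value"], how="left")

# Anything without a match is a new identifier: either a new customer, or an
# existing customer showing up in a new guise (the new-IP-address case).
matched = resolved.where(F.col("customer_key").isNotNull())
unmatched = resolved.where(F.col("customer_key").isNull())

matched.write.mode("append").parquet("hdfs:///lake/identifiers_resolved/")
unmatched.write.mode("append").parquet("hdfs:///lake/identifiers_for_matching/")
```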

Problem solved.

One small catch with the previous solution was that the in-house data lake had physically limited processing power. A cloud migration solved that. It was not quite straightforward, but once the operational workload management was aligned to the cloud vendor's elastic scaling technology, everything started to perform (and scale elastically).
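In practice, aligning the workload to elastic scaling largely means letting Spark request and release executors as demand changes, while the cloud platform adds and removes the underlying nodes. A minimal sketch of that configuration follows; the numbers are placeholders, not the actual settings used.

```python
from pyspark.sql import SparkSession

# A sketch of aligning the workload to elastic scaling: Spark grows and shrinks
# its executor pool with demand, and the cloud vendor's autoscaler adds or
# removes the underlying nodes.
spark = (
    SparkSession.builder.appName("journey-pipeline")
    .config("spark.dynamicAllocation.enabled", "true")
    .config("spark.dynamicAllocation.minExecutors", "2")
    .config("spark.dynamicAllocation.maxExecutors", "200")
    .config("spark.dynamicAllocation.shuffleTracking.enabled", "true")
    .getOrCreate()
)
```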

Now onto the challenge of probabilistic matching. Not trivial, but quite feasible with modern machine learning frameworks. The biggest area of uncertainty was around online and mobile users, so a number of experiments were run with different assumptions and different models. Once the approach was known it needed to be fitted into a machine learning framework. To be fair, they all did the job, but in the end the solution was built in Spark-ML as it fitted the rest of the technology stack well.
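The article does not spell out the model, but one common way to frame probabilistic matching is as pairwise classification: generate candidate identifier pairs, compute similarity features, and train a classifier to score how likely two records belong to the same customer. A minimal Spark-ML sketch of that framing, with hypothetical paths and feature names:

```python
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("journey-matching").getOrCreate()

# Candidate identifier pairs with pre-computed similarity features; the labelled
# sample comes from manual review. All paths and column names are illustrative.
labelled = spark.read.parquet("hdfs:///lake/match_training/")
candidates = spark.read.parquet("hdfs:///lake/match_candidates/")

features = ["name_similarity", "address_similarity", "device_overlap", "same_ip_24h"]

pipeline = Pipeline(stages=[
    VectorAssembler(inputCols=features, outputCol="features"),
    LogisticRegression(featuresCol="features", labelCol="is_same_customer"),
])
model = pipeline.fit(labelled)

# 'probability' holds [P(different), P(same)]; a threshold on P(same) decides
# whether a pair is linked automatically, queued for review, or discarded.
scored = model.transform(candidates)
scored.write.mode("overwrite").parquet("hdfs:///lake/match_scores/")
```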

Foundation laid, now to build value

The result of everything described above was merely the foundation: a reliable and performant data pipeline that took both transactional and reference data from 100+ systems, matched it to specific customers, customer groupings and even hierarchies, extracted events from the transactional data, and enabled anyone across the enterprise to stitch together a customer journey. So what?

The value is always in what you do with the data. In the first case we have a product manager's dream: the ability to take a specific product, find out which customers bought it, and see what their journey looked like before and after purchase. The next and obvious step is to start to use this data to predict (churn, upsell, cross-sell). With such a rich dataset the prediction is not a problem (though prone to overfitting due to the volume of data). The real challenge is what action you drive, and how you measure and react to the data as it changes. Remember this data pipeline is practically real-time, so the enterprise can react almost immediately to shifts in consumer sentiment.
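As a small illustration of that product manager's view, pulling a journey out of the store is essentially a filter and a time-ordered window over the customer-keyed events. The product code, paths and column names below are assumptions carried over from the earlier sketches:

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("journey-stitch").getOrCreate()

# Assumed store of events already resolved to a golden customer key.
events = spark.read.parquet("hdfs:///lake/customer_events/")

# Everyone who bought a given (hypothetical) product ...
buyers = (
    events.where((F.col("event_type") == "order_placed") &
                 (F.col("product_code") == "PLAN_X"))
    .select("customer_key")
    .distinct()
)

# ... and their full journey, in time order, numbered so the pre- and
# post-purchase steps are easy to pick apart.
journey = (
    events.join(buyers, "customer_key")
    .withColumn("step", F.row_number().over(
        Window.partitionBy("customer_key").orderBy("event_time")))
    .orderBy("customer_key", "step")
)
```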

Hang on a sec, I think we are sub-optimising here.

Our logical unit of analysis is the product, as this is how we have organised our business. Have a quick look across your banks with LOBs split by retail, mortgage, investment, life and so on; your telcos with LOBs split by mobile vs fixed, pre- vs post-paid; insurance companies split across home, car and life. Yet our data pipeline and data store transcend all product boundaries. We start to understand behaviour across product boundaries. Moreover, we start to understand associative customer behaviours, like the whole family that churns their mobiles over a six-month period, or the group of friends that decides to switch banks in rapid succession, one after the other.

Tremendous value is being created: with every customer interaction the journey is enriched. But what of the moral consequences?

The goal is the customer

One of the interesting by-products (and there are many) of this omni-channel multi-party customer journey is that the enterprise starts to understand the time it takes for customers to make and execute decisions.

Imagine the enterprise as a series of single-part flow items all stitched together.

What do I mean by this? For those with a modicum of lean manufacturing knowledge, or who have gone beyond scrum ceremonies in the agile world, the concept of flow, as measured by lead time and cycle time, is fundamental to the operation of the enterprise.

What we have created with our customer journey data store is the measurement of each individual cycle through a transaction. Measured but not optimised.
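In the simplest framing, the cycle time of a transaction is the elapsed time between the event that opens it and the event that closes it. A sketch of that measurement over the same event store, with the event types and the transaction_id column assumed for illustration:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("journey-cycle-time").getOrCreate()

events = spark.read.parquet("hdfs:///lake/customer_events/")

# Cycle time per transaction: elapsed time from the event that opens it to the
# event that closes it.
starts = events.where(F.col("event_type") == "order_placed") \
    .select("customer_key", "transaction_id", F.col("event_time").alias("start_time"))
ends = events.where(F.col("event_type") == "order_fulfilled") \
    .select("transaction_id", F.col("event_time").alias("end_time"))

cycle_times = starts.join(ends, "transaction_id").withColumn(
    "cycle_time_hours",
    (F.unix_timestamp("end_time") - F.unix_timestamp("start_time")) / 3600,
)

# Measured, not yet optimised: the distribution is the starting point for
# seeing where flow breaks down.
cycle_times.agg(
    F.avg("cycle_time_hours").alias("mean_cycle_time_hours"),
    F.expr("percentile_approx(cycle_time_hours, 0.5)").alias("median_cycle_time_hours"),
).show()
```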

Everything I have talked about here I was lucky enough to be a part of, through a couple of customer projects that I regard as critical to my understanding of this solution, both at a technical and at a business level. The next step, turning it into something truly impactful for all customers of the enterprise by optimising flow across the enterprise, is one I have not been a part of. And that is something I would be truly excited to accomplish. Perhaps the opportunity to build the most customer-focussed business on earth …

