Customer Golden Records for Increased Customer Lifetime Value

October 10th, 2019

How does your company build and maintain customer relationships? Today, companies own numerous data sources and store massive volumes of customer data. Extensive data collection poses the risks of data duplication, poor data quality, and a lack of transparent data access. Without accurate and reliable customer data, companies face higher spending on customer acquisition and re-acquisition.

Obstacles like these were apparent across Chalhoub’s luxury brand and retail enterprise, a corporation managing over 650 stores throughout the Gulf region. To maximize the value of its customer data across multiple e-commerce and CRM systems, Chalhoub architected and deployed a cloud-based, centralized data hub on Microsoft Azure that provided data mastering capabilities to create customer golden records and enabled a 360-degree view of the customer across all company brands. In this webinar, learn how Chalhoub was able to continue its “customer first” approach and improve its brand performance with Zaloni as a partner in the project.

Join Tim Blackwell, Analytics Data Architect at Chalhoub, and Scott Gidley, VP Product at Zaloni, as they discuss how a holistic view of customer data allows for greater insights and improved targeted marketing to ultimately increase the lifetime value of their customers.

Topics Include:
– Utilizing the Customer 360 approach as a foundational data lake use case
– Building customer golden records with a zone-based architecture
– Value achieved through data acceleration

Ready to start getting greater customer insights through master data management? Ask for your custom demo!

Read the webinar transcript here:

[Brett Carpenter] Hello everyone, and thank you for joining today’s webinar, “How to Build Customer Golden Records that Increase Customer Lifetime Value through Effective Master Data Management.” My name is Brett Carpenter and I’ll be your MC for this webcast. I’m excited to introduce our speakers: first is Tim Blackwell, an analytics data architect at the Chalhoub Group, and joining him is Scott Gidley, vice president of product management at Zaloni. Now, we’ll have time to answer your questions at the end of the presentation, so don’t hesitate to ask them at any point using the Ask a Question box located just below your player window. But before we begin, I’d like to briefly introduce the Chalhoub Group to you.

 

The Chalhoub Group has been the leading partner for luxury across the Middle East since 1955. An expert in retail, distribution, and marketing services based in Dubai, the Chalhoub Group has become a major regional player in the beauty, fashion, and gift sectors. By blending its Middle East expertise and intimate knowledge of luxury, the group is building brands in the region by offering service excellence to all its partners and a unique experience to its customers. The group is moving fast from a traditional distributor and retailer of luxury in the Middle East to a hybrid retailer bringing luxury experiences to the fingertips of customers everywhere. Zaloni, meanwhile, helps enterprises drive competitive advantage by cataloging, enriching, and actioning data for rapid business value.

By utilizing the Zaloni data platform to create a unified data supply chain, businesses can be more agile and move rapidly to the cloud, providing business users self-service access to trusted data and delivering fast time to value at scale. Now, with that, I’ll turn it over to Scott and Tim to talk about what’s needed to effectively measure business value for your data project. Scott?

[Scott Gidley] Hey, thanks Brett, and thanks to everyone for joining today’s webinar, and especially thanks to Tim for co-presenting with me. During today’s session, what we’re going to do is explore the benefits that organizations are gaining from delivering a golden customer record as part of their overall customer experience and customer lifetime value initiatives. And you know, let’s face it, master data management isn’t really a new concept. This is something organizations have been trying to solve for a long time, but a lot of them are still trying to get there. This goes back to the early days of data warehousing and the customer golden record, and what I think organizations are starting to see is that the change in their data platforms and in the scale of data they’re working with has kind of thrown a wrench into a lot of the processes they had before. So we’ll look at how companies like Chalhoub started to overcome these problems and challenges with a little help from a company called Zaloni. Ultimately, our primary goal at the end of today’s session is to provide you with insight and some food for thought on how a new or alternative approach to this old problem might be able to deliver a more accurate and scalable customer view that, in turn, hopefully drives increased customer lifetime value.

So let’s get started. If you look at this slide, you can see that organizations are continuing to invest heavily in the tooling for master data management solutions that is driving their digital transformation and their customer growth. CRM applications, marketing automation, segmentation analysis: all of these systems, whether they’re on-premise, in the cloud, or a combination of both, are in high demand. But often their return on investment and their ability to drive a single customer view is limited or has reduced business impact because they’re still siloed, right? They’re not interconnected sets of applications where there’s a single ID that spans all of these systems. The organizations that have been able to solve this problem, and do it in a way that scales and works well with other systems, are really starting to drive value. You can see 84% of organizations working to improve customer experience report an increase in revenue; that alone should fund any of these initiatives as you move forward. Another one that’s kind of interesting: 80% of customers are more likely to purchase a product or go back and buy from a brand that provides a personalized experience. So again, the drive toward digital transformation is really pushing a lot of these initiatives. And then, for my own data and analytics heart, data and analytics projects relating to customer experience are going to be more than 40% of the entire data and analytics portfolio by 2020; that quote comes from Gartner, and that’s pretty impressive as well. So those are a few stats that show the importance of this, but I’d like to kick it over to Tim for a second and let him talk a little bit about the impact and the benefit Chalhoub was hoping to gain as they built out their customer data hub and enabled master data management.

(5:07: Challenges in unifying the customer data through master data management)

[Tim Blackwell] Great, thanks. Yeah, I mean, we had a particular set of challenges in dealing with our customer data, because we are quite a diverse group, made up of a selection of franchises, some of our own concept brands, some joint ventures. We have a very diverse landscape when it comes to our customer data. So the challenge there was to really get this data under control and start driving some value from it. Previously we had a centralized CRM system, but due to the fragmented nature of the business, it couldn’t really meet the needs of our brands in terms of marketing activation, helping them to know their customers. And for us as a luxury retailer, the relationship with our customers is very important, especially those higher-spending customers, those frequently returning customers, and most of our business is done face-to-face. We’re very much a bricks-and-mortar operation, although we are diversifying to become a hybrid retailer. So knowing our customers is key for us, and previously we had huge challenges in doing that: just understanding the history of the relationship that we had with our customers, in terms of what they’re spending, where they’re spending, and what their preferences are, was very difficult to achieve.

[Scott Gidley] So, Tim, were there specific metrics you were trying to capture, or wanted to try and capture, on sort of the interactions with the customers, those types of things that would help you measure some of the effectiveness of this master data management initiative?

[Tim Blackwell] Yeah, I mean, the first challenge was just to know who our customers are, to know our customer, and then to measure engagement with them: particularly the lifetime with us and the customer lifetime value, and then delving a bit more into how frequently they are shopping and what their activity is across our brands. Now, the end customer may not know, when they go from one brand to another, that those brands are part of the same group, but that information is very valuable to us because we can see their shopping preferences and behavior across multiple brands, whether it be buying handbags or shirts or makeup or gifts. So this is really high-value information for us, both to measure that engagement and then ultimately to drive some sort of customized experiences for them, and to really make sure, when we’re reaching out to these people, that we get that level of engagement right. We’re not bombarding them with text messages and email messages, and when we do contact them, it’s something of value and it’s targeted to their segment. So we’re segmenting them based on their spend, their recency, and which brands they’re shopping in.
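
The spend/recency/brand segmentation Tim describes is classic RFM (recency, frequency, monetary) analysis. As a concrete illustration, here is a minimal PySpark sketch of how such scores could be derived from a transactions table; the table name, column names, and thresholds are hypothetical, not Chalhoub’s actual schema.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("rfm-sketch").getOrCreate()

# Hypothetical transactions table: one row per purchase.
tx = spark.table("trusted.transactions")  # customer_id, brand, amount, tx_date

rfm = tx.groupBy("customer_id").agg(
    F.datediff(F.current_date(), F.max("tx_date")).alias("recency_days"),
    F.count("*").alias("frequency"),
    F.sum("amount").alias("monetary"),
    F.countDistinct("brand").alias("brands_shopped"),
)

# Quartile thresholds for the monetary score (repeat for recency/frequency).
q1, q2, q3 = rfm.approxQuantile("monetary", [0.25, 0.5, 0.75], 0.01)
rfm = rfm.withColumn(
    "m_score",
    F.when(F.col("monetary") >= q3, 4)
     .when(F.col("monetary") >= q2, 3)
     .when(F.col("monetary") >= q1, 2)
     .otherwise(1),
)

rfm.write.mode("overwrite").saveAsTable("trusted.customer_rfm")
```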

[Scott Gidley] Great, thank you. Yes, I mean, as we look at some of the value you’re trying to drive there, one of the things we want to look at is the challenges, and you mentioned some that you faced, that customers face when trying to unify this type of information. We’ve boiled it down to a few areas here: complexity, visibility, and flexibility. There’s an interesting quote from Gartner that says by 2022, 50 percent of large organizations will still have failed to unify engagement channels, resulting in the continuation of disjointed and siloed customer experiences, and I think that’s really, as Tim mentioned, what Chalhoub is trying to address. All of these issues, complexity, visibility, and flexibility, go back to the same issues we’ve seen for decades, right? Siloed data, the lack of standards, and the lack of ability to understand the quality of information make consolidating these datasets complex. Lack of quality can diminish the value; we’ve seen it over and over again as organizations look to bring their data together, let’s say into a customer 360 view of customer data. Is the data fit for purpose? Can we profile this information and identify: do we have dates of birth in the date-of-birth fields? How well populated is our social security number or identification field, whatever that might be? How do we use those things, and improve those things with an underlying data management platform, to increase the effectiveness of our consolidation efforts? And I think all of these things have been compounded by sprawling data and the overall growth in the volume and variety of information, caused by applications moving to the cloud, people purchasing more and more third-party data, more software-as-a-service applications, and, increasingly now, people trying to capture data in a more real-time fashion, whether that’s IoT or truly just pulling data from the customer experience in real time to provide, you know, the next best customer offer, those types of things.

Just to give you an example from our end: we worked with a customer in the hotel, travel, and leisure industry, and they had a matching application that they had built internally. They had been working to build out their 360-degree view of the customer for years, and it worked very well; they were very happy with the functionality. But the hotel had grown through acquisition, year over year, and they started to get more and more information in there, and their existing application was no longer able to run and meet the service level agreements that they have on a nightly basis. So let’s say they had a few hundred thousand records coming in from various hotel chains that they wanted to use to update their customer hub, and a three- or four-hour window during certain times of the evening when they could run this process. Ultimately they started to realize that it wasn’t scaling; they needed to look at a new approach that could scale in a more transient or ephemeral way, bursting the processing to support their needs from a nightly SLA perspective. So I think we’re starting to see more and more of that as organizations look to modernize their approach through modern master data management solutions.
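
The fit-for-purpose profiling Scott describes, checking how well populated fields like date of birth or an identification number are, can be expressed in a few lines. A minimal sketch in PySpark, with hypothetical table and field names:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("profiling-sketch").getOrCreate()
customers = spark.table("raw.crm_customers")  # hypothetical source table

total = customers.count()

# Per-field completeness: share of rows where the field is non-null and non-empty.
profile = customers.select([
    (F.count(F.when(F.col(c).isNotNull()
                    & (F.trim(F.col(c).cast("string")) != ""), c))
       / F.lit(total)).alias(f"{c}_fill_rate")
    for c in ["date_of_birth", "national_id", "email", "phone"]
])
profile.show(truncate=False)
```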

And you know, Tim, you talked a little bit about your challenges prior, but if there’s anything else you want to highlight here, the floor is yours.

[Tim Blackwell] Yeah, I mean, the challenge is particular to us because we’re not, you know, a monolithic brand, so the challenge very much was siloed data. Our partner brands have what we call freedom of execution: since they are the owners of that customer relationship, they own that customer data. So they implement their own e-commerce platforms, their own CRM systems, and the challenge for us is to maintain a view of our customers at a group level. Once we aggregate this data at a group level, we can provide extra value-added services back to our brands; we can provide them insight that they can’t get from their own individual CRMs. They can see, obviously, their interactions with their own customers, but they don’t know what their customers are doing outside of their range of stores, so we can provide services back to those brands, both enhancing their existing CRM capabilities through more advanced analytics and giving them that cross-brand penetration, seeing what else their customers are shopping for and what their preferences are. And then obviously we have to join that with our ERP system and that historical CRM data, so it’s really a set of data sources to integrate, some of it near-real-time data through integration APIs. But mostly it’s getting a handle on that data, governing it, and ensuring data quality. As you mentioned earlier, it’s easy to get used to collecting all this stuff without really understanding and being sure that you’re collecting the correct, quality data, and you have to cleanse it thoroughly before stitching it together and creating that trusted customer 360. That really made us realize the need for master data management.

[Scott Gidley] Sounds good. Yes, I mean, I think, as you outlined, there are a lot of different challenges here: disparate systems, trying to bring different types of data together, and then being able to provide that data back. One of the really interesting things you mentioned is being able to provide that data to the other parts of your organization, the other parts of your brand, so that they can have access to that information. Ultimately, to help our customers like Chalhoub and others accelerate this delivery of trusted, high-value data, Zaloni has built the Zaloni Data Platform. This is our governed data management solution, which is really unified, and ultimately the idea here is we’re helping our customers connect to data sources wherever those data sources may be: in the cloud, on premise, software-as-a-service applications, file systems, and streams. As we discover this information, we’re trying to create a very rich catalog that contains the metadata of the environment and isn’t just a passive, you know, catalog for documentation, but is really something that can help you take action on this data. So the idea here is that once the data has been discovered and has been added to our catalog, we can help add business context to it; we can help validate that it’s fit for use and fit for purpose. You know, Tim mentioned struggling with understanding both the quality of the data and the context of the data, a problem organizations suffer from regardless of whether they’re trying to build out a customer 360 view or a customer lifetime value application; being able to search and find relevant data for analytics and operational use cases is an issue. So organizations can use our catalog: they can add information here to provide business context, they can profile the data so that they can measure data quality, they can apply data quality rules to the information so that they can improve it if they want to use it for some other purpose, and then ultimately they can consume that data from our catalog. They can pull that data into, you know, a cloud data warehouse; they can pull that information onto an FTP site where they’re making it available to other parts of their brand or other partner applications. And this is all done with a very connected line of governance across these different initiatives.

So you may have business users in your organization who can browse the catalog and consume the information they need for a BI report, whereas somebody like Brett may be a data engineer, and he’s building pipelines to populate data for your customer data hub, for instance. So there’s really this collaborative role between business and IT that the platform provides. But key to this discussion is an add-on that we provide that’s called the Data Master extension for master data management. The Zaloni Data Master extension essentially allows you to run, from our catalog, data matching algorithms that use artificial intelligence and machine learning to identify duplicate and near-duplicate information. By applying survivorship rules, if you’re trying to create that best customer record, you can do that as well, and then ultimately persist the information, once you’ve got this golden view, into a variety of data stores and targets. It could be something, as I mentioned, where you’re publishing this data out to consumers, pushing it into things like Snowflake or Redshift in the cloud; other organizations are publishing it into an on-premise data lake, and maybe it’s being put into something like HBase that they’re using to drive analytics from the back end. I know Tim will go through some of their architecture. But our goal here is to provide this unified solution that gives you lots of flexibility to meet the scale, but then also continues to solve the underlying data management challenges that you might have had before.
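
To make the survivorship idea concrete: once matching has grouped duplicates into clusters, the golden record is assembled by choosing the best surviving value per field. Below is a minimal PySpark sketch of one common rule, “most recently updated non-null value wins”; the table, cluster_id, and column names are hypothetical, and this is not Zaloni’s actual implementation.

```python
from pyspark.sql import SparkSession, functions as F, Window

spark = SparkSession.builder.appName("survivorship-sketch").getOrCreate()

# Hypothetical matcher output: duplicate records share a cluster_id.
matched = spark.table("work.matched_customers")  # cluster_id, name, email, phone, last_updated

w_order = Window.partitionBy("cluster_id").orderBy(F.col("last_updated").desc())
w_full = w_order.rowsBetween(Window.unboundedPreceding, Window.unboundedFollowing)

golden = (
    matched
    # Survivorship rule: the most recent non-null value wins, field by field.
    .withColumn("name", F.first("name", ignorenulls=True).over(w_full))
    .withColumn("email", F.first("email", ignorenulls=True).over(w_full))
    .withColumn("phone", F.first("phone", ignorenulls=True).over(w_full))
    # Keep one golden row per cluster.
    .withColumn("rn", F.row_number().over(w_order))
    .filter("rn = 1")
    .drop("rn")
)

golden.write.mode("overwrite").saveAsTable("trusted.customer_golden")
```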

(17:00: Current and Future State – Data Matching and Master Data Management)

So one of the things I wanted to mention here quickly is, you know, we have a lot of customers that are evaluating whether they should pursue new data matching or new master data management, or potentially a new customer data hub or 360-degree customer view project. Some of that has been spawned by the fact that organizations are pursuing, you know, a data lake initiative, where they may be pushing data to a cloud data lake and trying to determine the best use of the information once they’ve landed it there, and they want to take advantage of the scale that the cloud provides. But my advice to our customers is often that just because you can do it doesn’t mean you should. I think you really need to evaluate the current state and capabilities of your existing initiative, or the tooling that exists in your current environment, versus taking on a new initiative like this just to drive this type of project, because there are a lot of things to consider. Traditional data matching and master data management solutions are generally focused on certain types of domains: name, address, email, contact information, product information. And sometimes, if you’re going to update the algorithms or update the way these tools match, it either requires an update from the vendor or there are a lot of in-house updates to make, and a professional services engagement might be needed. So flexibility can be limited there, but these are very powerful applications; they’ve been around for a while and they’ve been through the rigor, so they provide pretty good results. Most of these types of solutions scale based on the existing vendor’s technology or servers. Going back to the example I gave of the hotel chain: they had self-built something internally, but it was a single-server application and they sort of hit the bounds of it; they couldn’t buy a big enough server to make it perform in the way they needed on a nightly basis. Another thing to consider there, as I mentioned, is that the matching algorithms are often based on proprietary technology. So if there’s some new matching capability that’s available in an open source project, like the Spark machine learning libraries, you may not be able to take advantage of it in an easy way in some of the traditional matching tools. But nonetheless, they’ve been around a while, they’ve been put through their paces, and they can deliver on the use cases they’ve been promoting, so those are things you still need to consider based on what your overall needs are. We see customers pursuing Zaloni master data management where maybe they want to expand the different types of domains they’re matching on; you can build out the algorithms and build up the support via supervised learning techniques within our solution. So if you have data that’s specific to your organization that would help with the matching, you can take advantage of that, and that’s sort of the whole process of training our matching algorithms based on your live data.
From a scalability perspective, Zaloni data mastering is based on a Spark back end, so it can be scaled up and scaled down as needed, depending on the amount of data you have and the SLAs that you need to meet. So we can scale in ways that some other solutions may not be able to. And ultimately we’re building everything on top of the Spark machine learning libraries, with some customizations that we’ve made; if there are new algorithms or new ways you want to take advantage of matching, or something you’ve built internally, there is a pluggable approach to our solution, which can provide some value for customers who have very specific needs.
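
Zaloni’s matching algorithms themselves are proprietary, but the general shape of supervised, Spark-based matching that Scott describes can be sketched: block records into candidate pairs, compute similarity features, and train a classifier on pairs a subject matter expert has labeled as match or non-match. A simplified illustration in PySpark; all table, column, and label names are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("matching-sketch").getOrCreate()
c = spark.table("work.cleansed_customers")  # id, name, email, phone, postcode

# Blocking: only compare records that share a cheap key (postcode here)
# to avoid generating all O(n^2) pairs.
pairs = c.alias("a").join(
    c.alias("b"),
    (F.col("a.postcode") == F.col("b.postcode")) & (F.col("a.id") < F.col("b.id")),
)

# Similarity features for each candidate pair.
feats = pairs.select(
    F.col("a.id").alias("id_a"),
    F.col("b.id").alias("id_b"),
    (1 - F.levenshtein("a.name", "b.name")
       / F.greatest(F.length("a.name"), F.length("b.name"))).alias("name_sim"),
    (F.col("a.email") == F.col("b.email")).cast("double").alias("email_eq"),
    (F.col("a.phone") == F.col("b.phone")).cast("double").alias("phone_eq"),
).na.fill(0.0)

# Labels (is_match = 0/1) come from subject-matter-expert review.
train = feats.join(spark.table("work.labeled_pairs"), ["id_a", "id_b"])
train = VectorAssembler(
    inputCols=["name_sim", "email_eq", "phone_eq"], outputCol="features"
).transform(train)

model = LogisticRegression(labelCol="is_match").fit(train)
# The saved model can then be applied to all candidate pairs inside a pipeline.
```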

So I’ve gone over how we’re using supervised learning to make the system more intelligent. Another thing I want to highlight is the advantage of the Zaloni Data Mastering extension being able to leverage the entire Zaloni data platform. All the metadata that we’ve discovered from your various systems can be used to identify the data sources you want to use as part of the matching process. You can leverage the data quality components, the data profiling, and the other things that can help you determine whether the data is fit for use, and then you can use our data pipelines to build the solution that’s going to actually pull the data from the source systems, join it together, match it up, create this customer 360 view, and then provide you with that output table, or the stream, or whatever way you want to provide the information to your consumers. So being able to leverage this entire platform gives you a lot of capabilities and flexibility as you move forward. A few things to highlight here: we have sort of an IT interface that can be used for setting up the algorithms and doing some of the initial matching process; that’s going to essentially create the candidates for subject matter experts to review and say, yes, these look like matches, or they aren’t and I want to split these records apart, those types of things. That helps us tune the results based on your sample data. As you build out the model, once you’re happy with the accuracy of that model, you can save it, and then the IT individuals can take that model and apply it to the pipeline. So really it’s a collaborative environment where you can work together, and there’s governance involved there: a subject matter expert may provide their expertise on determining what is or isn’t a duplicate, but the actual execution of this model on live data at scale is going to be something that’s managed from the IT side.

So, just a few points there on some of the ways that we approach this problem and help our customers move forward as they look to modernize their approach to customer 360 through master data management. And now I’m going to turn it over to Tim, and he’s going to give you some insight into how Chalhoub did this for their customer data project.

(22:00: Customer Data Hub – Data Flow at Chalhoub)

[Tim Blackwell] Yeah, I mean, Scott mentioned the various approaches you can take to this problem, and at its heart it’s mastering this customer data, and how do we do that? We looked at the various options, everything from the traditional, big-scale master data management platforms you can buy; like Scott said, they’re very well proven, they’ve got a solid track record in achieving this kind of thing. At the other end, you could build it yourself: you could do this with just a series of rules in a SQL database. It’s hard, but with a hundred steps you could achieve a reasonable match of the data. But we went with this slightly novel approach because we saw an opportunity to bring in an entirely new data platform as well, both for this and for further use cases down the line. Initially the main problem statement was this deduplication of the customer data, but it was an opportunity to bring in a set of new technology for us and to do things in a slightly novel way. From the diagram, it’s a common view for a lot of businesses: we have CRM sources, we have our ERP for the transactional sources, and we also have customer data in our ERP, because some brands don’t actually have a CRM system and just use our POS system for basic CRM, so we’re also bringing that data in.

Recently we launched a group-wide loyalty program, and again, another reason for choosing a big-data-scale platform was that we knew this data was going to grow fairly quickly. The loyalty program has been a big success and we have a lot of new data coming in there, so the next phase of the project is to get that loyalty data coming in and further enrich the customer 360. Since we already had a deployment on Azure for some other BI and we had experience with the Azure cloud, we chose that; we could have done it with AWS or whatever, but it works pretty well on Azure. So we chose Azure Data Lake Storage as the storage layer, and we’re using HDInsight for our Spark cluster. HDInsight is nice because it’s platform-as-a-service, so it’s managed by Microsoft; you still scale it as you would a Spark cluster you ran yourself, balancing load and things as you want, but the underlying infrastructure management is not a problem for us, which takes that headache away. And then on top of that we have the Zaloni data platform, which does the stitching, and, as Scott said, that is just part of a bigger pipeline managed entirely by the Zaloni data platform, from the ingestion of the source files using the Zaloni data collection agent. We pump the files nightly onto the edge node, where the Zaloni data collection agent ingests them into our raw zone. It’s a traditional data lake architecture: we have a raw zone and a pre-processing zone, and then finally the trusted production zone. As part of this long pipeline, which we can see nicely through the Zaloni interface for data lineage, the cleansing is done: we conform the telephone numbers and things like that, we check that the email addresses have @ symbols, all the standard kinds of cleansing steps, all done as part of the Zaloni data workflow. Then we actually use another workflow to scale up our cluster in size before running the Data Master extension for master data management, which is the secret sauce that creates those customer golden records, and then we stitch the transactional history to that and push it all into a Hive data warehouse. As Scott said, we have a few different audiences for this data. From Hive we extract out into a SQL database, and we’re using SQL Server Analysis Services with Power BI on top for the dashboarding part of it, for the more senior managers and the non-technical users to actually see those headline metrics, so we can measure customer lifetime value and see it trending over time. We can see who has shopped in the last 12 months and how that is trending over time; we do a lot of time series analysis and create a lot of calculations in Analysis Services. And we also have the Hive database, where a couple of technical analysts embedded in our strategic marketing team do a lot of ad hoc queries through Hive, just through SQL queries to Hive, and now we’re exploring some options with notebooks and running some Python and things like that on it. That’s the next step really; once we’ve done the traditional BI stuff, we’ll take the next steps and try a bit more advanced analytics. And we’re using R on top of it, so we just import data into RStudio, where we’re doing some market basket analysis.
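
The cleansing steps Tim describes, conforming phone numbers and checking that email addresses are well formed as data moves from the raw zone toward the trusted zone, might look roughly like this in PySpark. This is a sketch only; the storage paths, column names, and rules are illustrative, not Chalhoub’s actual workflow.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("cleansing-sketch").getOrCreate()

# Hypothetical raw-zone path on the data lake.
raw = spark.read.parquet("adl://datalake/zones/raw/crm_customers")

cleansed = (
    raw
    # Conform phone numbers: strip everything but digits and a leading '+'.
    .withColumn("phone", F.regexp_replace("phone", r"[^0-9+]", ""))
    # Keep the email only when it looks plausible (an @ with a dot after it).
    .withColumn(
        "email",
        F.when(F.col("email").rlike(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"), F.col("email")),
    )
    # Trim and upper-case names for consistent matching later.
    .withColumn("full_name", F.upper(F.trim("full_name")))
)

cleansed.write.mode("overwrite").parquet("adl://datalake/zones/trusted/crm_customers")
```
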
So we’re looking at combining the RFM analysis we do with some market basket analysis, and we’re feeding this back to a couple of brands who are using it for targeted marketing. I’m rambling on a bit about the whole architecture, but this is the bit I like, the technical piece, and it works very well. I think the point is that the master data management extension, because it’s been specifically tuned to our requirements, works particularly well. It took a little while, you know, but that’s the point: you train a machine learning model, you test it, you train it, you test it, and we’re getting a very high accuracy, which I don’t believe we would have achieved either by rule-based matching or through a traditional master data management solution. So it works pretty well for us, and like I said, the beauty is it’s going to scale to future use cases as we bring in more customer data. We’re bringing in Twitter data so we can add some sentiment analysis as well. You’ve got a little view here: we’re pushing the data into some fairly simple Power BI dashboards where we just have, you know, the average number of brands customers shop in, and these headline numbers, which are very simple questions to ask, but we just didn’t have any visibility of this before. We could not see what our customer lifetime value was, how frequently customers were shopping, and what on average a known customer was spending. It’s a simple question to ask but was quite a difficult one to answer; now we have those answers, which is nice. These are the very first steps, and this has been quite successful, and we’ll have some more detailed dashboards, which we are providing back to our brands, because the brands are finding that, with all their siloed systems, the analytics they can do on top of the out-of-the-box reports these off-the-shelf CRM systems provide is not giving them very much insight at all. So through the Power BI interface they can slice and dice the data, and they can export directly from there to get a list of customers to run a particular marketing campaign. They’ll slice that list by the recency, the frequency, the money spent, and other demographics, obviously male/female, date of birth, things like that where we have the coverage of data, and they can export those lists and load them into their marketing campaign tools. So there’s a lot of activity, and this is growing pretty quickly for us.
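
For the market basket analysis Tim mentions (done in RStudio at Chalhoub), an equivalent on the Spark cluster could use MLlib’s FP-Growth to mine cross-brand purchase patterns. A minimal sketch with hypothetical table and column names:

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.ml.fpm import FPGrowth

spark = SparkSession.builder.appName("basket-sketch").getOrCreate()

tx = spark.table("trusted.transactions")  # customer_id, receipt_id, product_category

# One basket per receipt: the distinct set of categories bought together.
baskets = tx.groupBy("receipt_id").agg(F.collect_set("product_category").alias("items"))

fp = FPGrowth(itemsCol="items", minSupport=0.01, minConfidence=0.3).fit(baskets)

# Rules such as "handbags -> makeup" with their confidence.
fp.associationRules.orderBy(F.desc("confidence")).show(truncate=False)
```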

At the end of the day, at the bottom here, it’s obviously about improving the marketing ROI, and the nice thing is now we can actually monitor that uplift as well. We’re feeding this information back in, trying to create a feedback loop. So we extract the data, the brands run marketing campaigns, and do we see an uplift in sales? We know who those customers are that we targeted. When we were pulling from the CRM systems, we pulled the customer data, and, what I didn’t mention going back a bit, we actually pull the campaign data as well. So we have the customer and all the campaigns, any email and SMS campaigns that were sent; we also ingest that. We’ve done some interesting things in R looking at the correlation between just the number of SMS messages that we send to a customer and sales: is there a direct correlation between that and sales? Simple analysis, but we just didn’t have a view on that before. So now we’re looking at the feedback between running marketing campaigns and, is it worth it? Are we pestering these people? Are we getting that level of engagement right? Like I said, we’re doing things like this simple market basket analysis and joining that with the RFM so that we can make very simple customized offers. The next step, of course, is to build a complete recommendation engine on top of this. We have the Spark cluster, so it should be fairly straightforward to extend this and start building recommendations, and we want to offer this back to our brands as a service by creating an API layer. We already have one; it wasn’t shown on the previous diagram, but we actually push some of the data into our Elastic stack, where we have some simple APIs. This is for experimental purposes, but the idea is that we could retrieve customer recommendations at the POS, which is something that should be quite exciting. So we’ve achieved quite a bit of success so far with this, and we really can push on now, grow the platform, and scale out a lot more use cases.
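
As a starting point for the recommendation engine Tim describes as a next step, a common first cut on a Spark cluster is collaborative filtering with MLlib’s ALS, treating purchase counts as implicit feedback. A sketch under those assumptions; the table and column names are hypothetical, and ALS requires integer user and item IDs.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.ml.recommendation import ALS

spark = SparkSession.builder.appName("reco-sketch").getOrCreate()

tx = spark.table("trusted.transactions")  # customer_id (int), product_id (int)

# Implicit feedback: how often each customer bought each product.
ratings = tx.groupBy("customer_id", "product_id").agg(F.count("*").alias("purchases"))

als = ALS(
    userCol="customer_id", itemCol="product_id", ratingCol="purchases",
    implicitPrefs=True,       # purchase counts are implicit, not explicit ratings
    coldStartStrategy="drop"  # skip customers/products unseen at training time
)
model = als.fit(ratings)

# Top 5 product suggestions per customer, e.g. to serve via an API layer.
model.recommendForAllUsers(5).show(truncate=False)
```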

(32:48: Why Zaloni for Golden Records)

[Scott Gidley] That was really insightful; I appreciate the information. So, as we wrap things up here, I wanted to quickly review and summarize a few of the key points. As we talked about earlier, organizations, pretty much from the beginning of time, have been trying to build this holistic view of customer data. That’s going to continue, and new and modern data platforms are changing the way it’s done in certain ways, but it continues to be a struggle: the sprawled and hybrid environments, the greater volume and variety of information, and some existing technologies struggling to provide the flexibility that organizations want, or the scale they need, to help address this. Zaloni has delivered the Zaloni Data Platform, a unified, governed data management solution where we’re helping our customers deliver trusted, high-value data to accelerate time to insight. As part of this, the Data Master extension is an add-on that customers can use that’s specifically geared toward improving the overall data matching and master data management involved in building customer 360 types of projects. And again, many thanks to Tim from Chalhoub for walking through their journey as they’ve gone through this; I think there were some really good insights there on how they moved to the cloud, leveraging Azure Data Lake Storage and some of the elastic compute there to scale as needed. Tim, are there any other use cases that you’re looking to expand on?

[Tim Blackwell] The first use case has proven the value of the platform, and we always had a roadmap around customer data through master data management. We want to bring in the digital exhaust from our e-commerce platforms; it’ll be an interesting use case for semi-structured data as we grow our e-commerce business and do some e-commerce analytics, again extending that customer 360 view, and, like I said, the loyalty data will be coming in. Completely separate to that, we’re actually starting a new initiative on supply chain analytics. Again, part of our business, obviously, we’re a retailer but also a distributor locally, and that actually is a true kind of big data problem when we’re looking at things like the stock-on-hand snapshot every day for every store and every warehouse throughout the region. So this is the next big set of use cases we want to deliver on the Zaloni data platform, and once we bring that in, there’s a whole world of analytics we can do. Again, these are simple questions people are asking: they want to ensure on-shelf availability of our prime products, so we do a Pareto analysis. How do we ensure those products are always available on the shelf, and if they’re not, what is that lost opportunity? So we’re starting now to analyze these data sources, and the supply chain team is actually building their own analytics team, and this is where the Zaloni platform, especially the data catalog, is going to be a key enabler for us, because they’re going to have technical analysts who will want to get at this raw data via the data catalog. So we’re going to spend some time, take our data governance seriously, and get all this data under control. And you know, I’ve done the assessment; we’re looking at a couple of billion rows of data, so it’s not something we can just push into a SQL Server; it’s a big data use case.
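
The Pareto analysis Tim mentions boils down to a cumulative revenue share calculation: which “prime” products account for, say, 80% of sales and therefore must never be out of stock. A minimal sketch with hypothetical columns:

```python
from pyspark.sql import SparkSession, functions as F, Window

spark = SparkSession.builder.appName("pareto-sketch").getOrCreate()

sales = spark.table("trusted.transactions")  # product_id, amount

by_product = sales.groupBy("product_id").agg(F.sum("amount").alias("revenue"))
total = by_product.agg(F.sum("revenue")).first()[0]

# Running share of revenue over products sorted by revenue, highest first.
# (No partitioning: fine for a product-level table of modest size.)
w = (Window.orderBy(F.col("revenue").desc())
           .rowsBetween(Window.unboundedPreceding, Window.currentRow))

pareto = (
    by_product
    .withColumn("cum_share", F.sum("revenue").over(w) / F.lit(total))
    # Flag the prime products that cumulatively account for 80% of revenue.
    .withColumn("is_prime", F.col("cum_share") <= 0.8)
)
pareto.show()
```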

(37:00: Questions)

[Brett Carpenter] So this one is for Tim: how has this initiative improved your time to insights, and how long does it take for a customer action to turn into something marketing can use now, versus how it was before?

[Tim Blackwell] Yes, that’s an interesting question. I mean, comparatively, because we had almost zero visibility of this before, in terms of improving time to insights you could say it’s almost infinite. At the moment we’re taking transaction data daily, but the customer data we’re only refreshing on a monthly basis, so you could say our time to insight has a latency of one month before we refresh this data. But essentially the dashboards, once they’re updated, you know, are refreshed once a month and that data is always up to date: on the first of the month you have data refreshed up to the end of the last month. So it’s a pretty quick turnaround of data compared to what we had before,

which was very piecemeal and ad hoc; now there’s, you know, a centralized view of this data. So it’s a fairly quick turnaround. I mean, we could improve the turnaround of the data, but we don’t really see a strong use case for near-real-time data. We’re looking, in our cases, at activity over a longer period of time. When we look at customer lifetime value or churn, for example, calculating churn, we measure it in years. If we were in fast-moving goods, or a telco, you’d look at it in weeks or months maybe, but for us, you know, you’re selling expensive items; you’re not going to sell that many of them. So our churn period is measured in years, and we refresh the data once a month. Now, once we start bringing in other data, we may have a requirement for the business to turn that data around faster, and we’ll be able to do that better than the one-month latency we have in the data now.

[Brett Carpenter] Okay, another one we’ve got: what’s a typical implementation time for a project that size?

[Tim Blackwell] It went fairly quickly, I think, end to end. I mean, from, you know, requirements gathering to going live, we did it in under six months; I think it was about five months from kickoff to going live with the platform, which I think is a pretty quick turnaround for this kind of project.

[Scott Gidley] Yeah, I’ll jump in there. I think it varies from project to project. In some cases, as you mentioned, we’re being used for ingestion and other parts of the entire data pipelining and data lifecycle management, as at Chalhoub. In other cases, from a pure matching perspective, we have customers who have already, let’s say, put the data into a specific type of environment, and we’re just, you know, refining the algorithms, building the models, and applying them; it may go a little quicker in some of those cases. But I think six months end to end to get things into production is a fairly good timeline for a project of this nature.

[Tim Blackwell] And we’re still growing it as well; we’re still constantly onboarding new datasets. I mean, the Zaloni team trained us on the platform, so we have the skills in-house now, and look, we discovered we had this extra repository of data in our POS system, for those brands who didn’t have a CRM, which we hadn’t really considered. So now we’re in the process of onboarding that; we’re constantly refining and iterating and growing the platform.

[Brett Carpenter] So, if I’m using other technology for ETL or data ingestion purposes, could I still use ZDP’s data matching capabilities?

[Scott Gidley] So, you know, as part of the Zaloni data platform we have end-to-end capabilities for managing ingestion and ETL-style data movement, as well as cataloging, data quality, and so forth. But if you’ve already built an application, or have licensed another technology that you’re using from an ETL perspective, you can use that to move the data and we can catalog that information. From our catalog you can use the Data Master extension for master data management to, you know, create the deduplicated information or build out your customer 360 hub using our platform. So, long answer to a short question.

[Brett Carpenter] Can other data matching technology be integrated into your solution?

[Scott Gidley] Yeah, so “integrated” is an interesting word there. The way I would put it is: if you’ve already created a unique customer ID with another matching technology, that can be an input to the model that we use for matching data. So that can be one input, and then you can expand upon it with other variables in other parts of the model. We’ve seen that happen specifically with some customers where they had built out their own matching initiative and wanted to sort of leverage that as the starting point and grow from there.

[Brett Carpenter] Thanks to Tim and Scott for taking the time to speak with us about Chalhoub’s customer golden record project and the partnership with Zaloni. This presentation will be available on demand. Thanks again, and we’ll see you next time.