March 26th, 2020
A common obstacle to a successful customer 360 initiative is data sprawl and siloed data, which compromise data quality. Bremer Bank addressed this problem by transforming its organization and data operations to be more customer-centric. In this webinar, you will learn how Bremer Bank unified data across multiple business units and third-party sources to build golden records in a governed and secure way. By first building a “nucleus” of customer data, Bremer Bank was able to both align with its data ethics mission and meet regulatory requirements in a cost-effective way.
During the webinar, Zaloni’s CEO, Susan Cook, and Bremer Bank’s VP of Analytics and Data Services, Leilani Moll, discuss common obstacles faced when pursuing customer 360 initiatives, building golden records from disparate sources, technology and architectural considerations, and finding success using a DataOps approach.
Attend this webinar to learn how to:
Create golden customer records from disparate sources
Increase data operations efficiency with machine learning
Find success using an ethical, customer-centric approach
Handle COVID-related data management demands, as Bremer Bank is doing
Read the transcript of the webinar on Customer-Centric DataOps at Bremer Bank below:
[Susan Cook] Happy Wednesday, everybody, and thank you for joining us. We are super excited to be talking to you virtually. This presentation had actually been intended to be delivered live at the Gartner Data & Analytics conference in Dallas in March, but the pandemic has changed a lot of everyone’s plans right now. So we are delighted to speak to you virtually, and we hope and pray that soon this crisis will be over and we’ll get to visit with each of you in person. To that end, let me give you a quick update on Zaloni. My name is Susan Cook, and I joined Zaloni as CEO about six months ago. We’ve had some exciting things going on at Zaloni, so we might look a little different than you’ve seen in past years. We’ve got a new logo, a new website, and even a new brand for the Zaloni data platform: we recently renamed it Arena.
Arena is a place where people and ideas come together in one centralized location and magic happens, like at a sporting event or a concert. That is the genesis behind the name. Leilani and I might slip every once in a while and call it the Zaloni platform, but Arena is the new name for our product platform. It’s the same great, comprehensive platform that you’ve known for a couple of years now.
So, exciting things going on. Zaloni is all about helping large enterprises, on-premises, in the cloud, and across multiple clouds, solve the vexing challenge of data sprawl throughout the enterprise. We want to do that in a secure, governed, controlled fashion and give you transparency and visibility into the entire lifecycle of your customer-centric data from beginning to end, soup to nuts. And when we talk about beginning to end, the end is not just where the IT or data officer’s responsibility ends; we want to give you that transparency, visibility, and control all the way to when an end user consumes that data and does something with it.
We’re having some really great success partnering with AWS and Azure. We’re in both their marketplaces now, and hopefully by the end of the year we’ll also be in Google Cloud’s. We’re getting recognized by some great organizations: the CIO 100, Data Solution of the Year for finance, and just recently Dresner Advisory recognized us as the number two data pipeline solution for analytics. You can see some of our recent awards on the slide. In terms of customers, there’s Bremer Bank, who you’re going to be hearing from today. I know a lot of people on this call are from the financial services industry, so: TIAA Nuveen, RGA Insurance, PwC, Toronto Stock Exchange, Bank of England, Bank of America, a number of different types of customers.
So that’s a little bit about Zaloni, but you’re not attending today to hear from me. Let’s get to the star of this show, my very good friend and our very good customer Leilani Moll, VP of Analytics and Data Services for Bremer Bank. Welcome, Leilani. Thank you for being here.
[Leilani Moll] Thanks, Susan.
[Susan Cook] Let’s dive right in. Bremer Bank has a customer-centric 360 initiative, which I think most banks do right now. Talk a little bit about the challenges you faced when embarking on this journey, as you started to contemplate this notion of customer golden records.
[Leilani Moll] Great. So the starting point for us was not the golden customer or customer 360. Like most of our peers, we were already down the digital transformation path, but as it turns out, digital transformation requires really accurate, governed, and timely customer records. So we had to start with the customer first, and that’s where the golden customer work comes in. You can imagine that if you duplicate a customer record across your systems, your 360 will be distorted; it will not give you the customer-centric view you were hoping for. So we actually started our customer 360 initiative by putting the customer first, and that’s why we call it customer-centric.
In a perfect world, you might have a single source to onboard customers, and that would be your single source of truth. But the reality is there is no single source of truth for most banks in our position, because there are multiple places where customers can be onboarded. What we’re creating inside our customer analytics platform is better called a single version of the truth, because there is no single source of truth. So our effort to build the golden customer record, meaning one deduplicated list of customers across all of Bremer, was our starting point, and it’s worth spending enough time on that, so let me explain a little more.
To give you a 30,000-foot view of our approach: we started with a clean slate. I know not everybody will have the luxury of starting with a clean slate, but for us, turning our existing business intelligence environment into a customer-centric analytics platform was not going to work, for the simple reason that we needed both; we couldn’t wipe out the existing business intelligence.
We also know that everybody hears about data lakes turning into data swamps, and we were adamant that ours was not going to become a data swamp. A swamp is a security risk, so we had to find a way to leverage data lake technology without letting it turn into one. For that reason, we decided that every piece of data we ingest into the data lake would be known and classified before we start. Another part of our approach was that we wanted to address the people, process, and technology questions simultaneously, because they’re interlinked.
The people you use to build the customer 360 in your analytics platform may not be the exact same resources that run your business intelligence environment. In our case, I highly recommend having a really good data platform architect; that is key on the people side. On the process side, we went with DataOps, which resolves a lot of nuances because DataOps supports a bimodal model where both mode 1 and mode 2 work. From a technology perspective, this is not a technology presentation, but I will say that a technology platform that is easy to use, combined with the right partner, is what’s going to help your success. You cannot let the complexity of the technology overwhelm your resources, so we needed to pick one that does not have a steep learning curve at the beginning.

Another thing is that we really needed to do our data homework. Before you even start to build or select the platform, it’s important to know where customers are onboarded into your organization; you may find a few surprises there. Third-party apps where customers are onboarded pop up, sometimes without you knowing, so it’s beneficial to know when one of these appears so you can incorporate it into your single customer-centric view of the customer.

The other thing we did up front, which is not necessarily in line with agile, is that we already had a pretty good idea at the beginning of what our customer 360 needed to look like, and that helped guide our data pipeline roadmap to make sure we bring in the right customer-centric data at the right time. We had a record design laid out and a rough idea of what the 360 view would look like, so we didn’t have to go back and redo work. And be prepared to learn from mistakes as well as successes; I think it’s proven that the human brain learns a lot more from mistakes than from success.

I also want to highlight a few of the challenges, since those may be of most interest to people in the audience. We did our homework and thought, “Okay, we’re prepared.” Brace yourself: you are going to hit challenges you did not anticipate, but you’ll get over them. Even though we did our homework, and we knew we had third-party systems that onboard customers, we expected each system to have a unique identifier, local to that system, that uniquely identifies the customers onboarded there. You cannot trust that; it’s not guaranteed. Systems have a long lifetime, and even within a single system you can have duplicates. That was somewhat of a surprise to us.
The second challenge we faced was the non-existence of metadata. I’m a huge advocate for metadata, and you would expect systems to come with metadata so you understand what the customer-centric data is. That’s not guaranteed either, so my team had to dig really deep into the data itself for it to make sense. We learned quickly that you can’t trust the schema. For example, if a field says it stores a phone number, go verify it: trust, but verify. We had a system with a column for phone number that doesn’t store just a phone number; it might also have things like device IDs in there. That was a little bit of a surprise for us.
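To make that “trust, but verify” step concrete, here is a minimal Python sketch of profiling a column against its declared type. The column name and sample values are hypothetical; this is illustrative, not Bremer’s actual pipeline.

```python
import re

# Hypothetical extract from a source system: a "phone_number" column
# that turned out to hold more than phone numbers.
phone_values = ["651-555-0137", "(651) 555-0198", "DEVICE:9f8a2c", "n/a"]

# A permissive North American phone pattern; tighten to your own standard.
PHONE_RE = re.compile(r"^\(?\d{3}\)?[-. ]?\d{3}[-. ]?\d{4}$")

def profile_column(values, pattern):
    """Split a column into conforming values and suspects that fail
    the declared type -- 'trust, but verify' in code."""
    ok, suspects = [], []
    for v in values:
        (ok if pattern.match(str(v).strip()) else suspects).append(v)
    return ok, suspects

ok, suspects = profile_column(phone_values, PHONE_RE)
print(f"{len(suspects)} of {len(phone_values)} values fail the phone check: {suspects}")
# -> 2 of 4 values fail the phone check: ['DEVICE:9f8a2c', 'n/a']
```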
Another item that’s very important in our approach, and one I would advocate for, is to engage your business very early in the process. When dealing with discrepancies in the data, the business’s input is critical. It might be painful for them because it’s a lot of work, but I would recommend it.

The other thing that was a bit of a challenge for us was that traditional business intelligence teams have probably never used machine learning in a data curation process, so I observed them resist trusting the machine; they would overtrain models. Once they got over that, things got a lot easier. We did not have a data scientist on our team, so we invested in the DME component of Zaloni, which made it a lot easier; you can get away with not having a data scientist for that purpose.
As I said, even if you think you’re prepared for everything, brace yourself. You’re about to run into unexpected challenges.
[Susan Cook] A great lesson. All right, let’s get to this DataOps approach; you’ve mentioned it a couple of times. I just wrote an article on DataOps and what it means, and I think for some of our audience it might be a new term. It’s not exactly a panacea; the approach itself is not all things to all people. But talk about your approach. How do you define DataOps a little differently and put a very customer-centric spin on it?
[Leilani Moll] In essence, we are following the classic steps of DataOps, meaning you start with a business objective and end by operationalizing models, but we’ve added a spin to make it more applicable to banking. In banking you’d think there’s a lot of money flowing around, right? It’s actually data that flows, not money. So the same governance you put around the flow of money, you apply to the flow of data.

It always reminds me of how an air traffic controller works. The controller needs to know about all the planes that land and take off, and not just at their own airport; they almost need to know about other airports too, because their airplanes come from and go to other airports. That’s the control mindset for how the data flows. But where the controller doesn’t need to know about the passengers sitting on the plane, we do. We need to know which customers come in as part of which stream, and which streams come from which third-party system, because each one has its own nuances. There is no one-size-fits-all solution. Knowing which system a customer came in through lets us design the solution for that system’s specific nuances, because we found they are not alike; each one has different challenges. And it gives us the agility to take one customer-centric data source at a time.

So there are a few tips I want to give you on how we approached this and how we stay agile. First, address one data source at a time. Believe me, at some point all these names blend into each other, it confuses the teams, and you will lose time. So focus on one data source at a time. And as you ingest more, you do get better: the first one takes a while, and it speeds up with each individual data source you ingest, because along the way you’re going to tackle the data quality issues.
That’s the next thing I want to say: don’t expect a magical mastering process. You have to address data quality issues before records enter the mastering step. With each new source you bring in, you help your business clean up and improve customer-centric data quality by rejecting bad records and having them come back in corrected; that’s also why sources take longer at first. As you work through each data set and run the engine multiple times, you get fewer and fewer rejected records, and more makes it through the mastering process. I also want to bring back that earlier point: don’t pick a technology platform that is too hard to learn. If your team faces a steep learning curve, it takes away from the work they actually came to do, which is defining the business logic and learning the data. And another reason I like to call these data flows customer-centric: the flows that ingest, curate, and provision golden customer records are currently the only thing in our system that uses a machine learning algorithm, and as you know, that can be pretty resource-intensive. So we want to make all the cluster resources available to take care of the customer first, before we go on to ingest the rest of the data pipeline.
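As a rough illustration of the pre-mastering quality gate Leilani describes, here is a minimal Python sketch. The rules, field names, and source systems are hypothetical; a real pipeline would express these rules in the platform’s own engine.

```python
from typing import Callable, Dict, List, Tuple

Rule = Callable[[dict], bool]

# Hypothetical quality rules; real rules come from the business.
RULES: Dict[str, Rule] = {
    "has_name":     lambda r: bool(r.get("full_name", "").strip()),
    "has_contact":  lambda r: bool(r.get("email") or r.get("phone")),
    "known_source": lambda r: r.get("source_system") in {"core", "mortgage", "insurance"},
}

def quality_gate(records: List[dict]) -> Tuple[List[dict], List[dict]]:
    """Split records into those allowed into mastering and those
    rejected back to the business for cleanup."""
    accepted, rejected = [], []
    for rec in records:
        failures = [name for name, rule in RULES.items() if not rule(rec)]
        if failures:
            rejected.append({**rec, "failed_rules": failures})
        else:
            accepted.append(rec)
    return accepted, rejected

accepted, rejected = quality_gate([
    {"full_name": "Amy Olson", "email": "amy@example.com", "source_system": "core"},
    {"full_name": "", "phone": "651-555-0137", "source_system": "unknown_app"},
])
# each run of the gate shrinks the rejected pile as sources get cleaned up
```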
[Susan Cook] Actually, those are two really big learnings I want to highlight. First, address one customer-centric data source at a time. I see customers get wrapped up and almost paralyzed when they try the shotgun approach of addressing a hundred different customer-centric data sources at once. You almost need to be linear: take one source all the way from beginning to end, then tackle the next one, because they all have their own unique challenges and you need focus to succeed. That’s a great piece of advice. And second, don’t let technological complexity distract you from the real mission, which is cleaning the customer-centric data and getting it into people’s hands. So, two great pieces of advice.
(How is DataOps improving COVID-19 response)
Next topic: banks, like every other industry, are reeling in response to COVID-19, and a lot of our customers are in the pharmaceutical industry as well. We’re all learning a whole new way to do business and to conduct our daily work lives as well as our personal lives. So how has your approach, your DataOps mentality and organization, helped you be agile and responsive in light of COVID-19?
[Leilani Moll] A few things I can say about that topic. DataOps by nature helps you to be agile, to expect to be derailed at any moment, and to be able to handle it, right? Just adopting a DataOps model almost mentally prepares you to say, well, we’re not in control of anything; even if you think you are in control, things can pop up, and you need to handle them fast, because a lot of people depend on banks to disburse funding made available by the government.

Another aspect I often speak about: the better you know your data, the faster you can respond. I cannot imagine where we would be if we hadn’t gone through the whole customer-centric initiative and forced ourselves to really know our customers through their data. We don’t have the luxury of going to talk to our customers right now. All we have is getting to know them through their data, and we know them through transaction systems that capture transactional data, not customer systems that capture customer-centric data. So we sift through a lot of transactional data, read out the data about the customer, and get to know them that way, and that helps us respond a lot faster. It’s a combination of going through the effort of learning the data itself, learning who our customers really are, finding the gaps in the data we wish we had captured but didn’t, and, for the data we did capture, knowing its flaws, what you can and cannot trust. That puts you in a much better position to deal with these issues as they come up, because after COVID-19 there will be something else.
Another thing: we talked earlier about digital transformation, and this is the time when we really see the value of doing things digitally. If you had to stay home and couldn’t run any transactions digitally, imagine where we would have been during this pandemic. So this reinforces the need to ramp up our digital capabilities and how we respond.
[Susan Cook] It was interesting to watch in the news as we went through the Paycheck Protection Program, with 349 billion dollars consumed in less than two weeks. It was fascinating to see that the banks that jumped on it were out in front servicing their customers, while others left their customers hanging out to dry because they couldn’t respond fast enough.
I wanted to ask you about something you mentioned earlier, engaging the business users early; you had mentioned marketing. What’s the cadence? How do you engage them? Is it a steering committee or a regular meeting? How do you partner with your business users?
[Leilani Moll] Well, we had Ben simply become part of our team: part of our agile team, part of scrum, part of everything. The piece the business really needed to help with is this: you can imagine that when you have customer-centric data coming from different sources, you have to pick survivorship rules when you build your model, rules that say which system I can trust more than another, and IT doesn’t necessarily have that insight. So having Ben with us every day, walking through the data and coming up with the survivorship rules, was essential. And that’s another reason the business needs to stay engaged: every new system you introduce adds a new sort of customer to the mix, and you have to revisit the survivorship rules, because that source may have an attribute you can trust over other systems. I know marketing in the past really struggled with not having clean, accurate data: emails that bounce back, not getting hold of a customer, or marketing to the same customer twice, which is kind of embarrassing, right? So they had a vested interest in building a clean, clear customer-centric view with us. Engaging the business meant making them part of our scrum team, with Ben right there.
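To show what attribute-level survivorship rules might look like, here is a minimal Python sketch. The source systems and trust rankings are hypothetical; Bremer’s actual rules are defined with the business.

```python
# Sources ordered most- to least-trusted, per attribute (hypothetical).
TRUST = {
    "email":        ["online_banking", "core", "mortgage"],
    "mailing_addr": ["core", "mortgage", "online_banking"],
    "phone":        ["core", "online_banking", "mortgage"],
}

def survive(duplicates):
    """Build one golden record from duplicates of the same customer,
    taking each attribute from the most trusted source that has it."""
    golden = {}
    for attr, ranking in TRUST.items():
        by_source = {r["source"]: r.get(attr) for r in duplicates}
        golden[attr] = next(
            (by_source[s] for s in ranking if by_source.get(s)), None
        )
    return golden

dupes = [
    {"source": "core", "phone": "651-555-0137"},
    {"source": "online_banking", "email": "amy@example.com", "phone": "651-555-9999"},
]
print(survive(dupes))
# {'email': 'amy@example.com', 'mailing_addr': None, 'phone': '651-555-0137'}
```

When a new source arrives with an attribute you trust more, as Leilani notes, you update the ranking rather than rewrite the pipeline.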
[Susan Cook] So by getting them engaged and vested, you’re increasing their data literacy with each interaction.
(Unified DataOps)
Let me divert for just a moment, because we put this next slide in on unified DataOps. We’ve been using vocabulary about the DataOps approach, methodology, and organization, and I think you and Zaloni are in lockstep in that we define this a little more broadly. So I wanted to share this picture with our audience: what does this really look like across the various sources, platforms, and environments? Where do you start, and what does it entail?

When you think about this unified DataOps platform, it starts at the very beginning of the lifecycle, when you collect data and inventory it. And yes, I share Leilani’s view that metadata is essential, but not just “this is the field, this is the character length” and so on. Truly active, interactive, augmented metadata: who’s touched it, how many times, is it clean, where did it come from, who sanctioned it, what does the data steward say about it. Really good metadata tells somebody who’s actually shopping for and researching data whether it is good to use and fit for purpose, everything they need to know about that data. Then you classify it and profile it. And controlling it is not just access controls; as Leilani mentioned,
it’s automating, putting machine learning into the data quality process, and making recommendations, whether for GDPR, CCPA, or another regulation, on how to mask, tokenize, or encrypt specific types of customer-centric data. Then the secret sauce is tracking the lineage all the way back, and then forward to its ultimate endpoint, whether that’s a BI dashboard or an AI algorithm, however it gets consumed. Finally, at the consumption layer, we help our customers build marketplaces where end users and data scientists can shop, provision, and collaborate around data, so it becomes much more self-service and autonomous: they can provision data into a sandbox, set up their own Snowflake environment on AWS, and start running A/B tests across various algorithms to determine what will be most effective in reaching those customers. That, to us, is the unified DataOps platform. That’s what Arena is: helping you have transparency and visibility at every step of that lifecycle.
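For a rough picture of what an “augmented metadata” record of the kind described above might hold, here is a minimal Python sketch. The field names are illustrative assumptions, not Arena’s actual schema.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Tuple

@dataclass
class DatasetMetadata:
    name: str
    source_system: str
    classification: str          # e.g. "PII" or "public"
    steward: str                 # who sanctioned the data
    quality_score: float         # output of profiling / DQ rules
    lineage: List[str] = field(default_factory=list)            # upstream steps
    touched_by: List[Tuple[str, datetime]] = field(default_factory=list)

    def record_touch(self, user: str) -> None:
        """Track who touched the data and when -- 'active' metadata."""
        self.touched_by.append((user, datetime.utcnow()))

meta = DatasetMetadata(
    name="customer_golden",
    source_system="arena",
    classification="PII",
    steward="data_governance",
    quality_score=0.97,
    lineage=["core.customers", "mortgage.clients", "dq_gate", "mastering"],
)
meta.record_touch("analyst_01")   # e.g. a data scientist shopping the catalog
```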
All right. So having defined it that way, Leilani, let me start off the questions. Audience, please post any questions you have in the chat. Let me kick it off.
You started this from scratch. How did you decide on and choose your partner and your platform for this journey?
[Leilani Moll] I started by looking at what features I wanted and created a grid. I want a way to capture metadata. I need to be able to classify customer-centric data. Data quality for me was non-negotiable: how on earth am I going to ensure the data is trustworthy? For me, data governance is about instilling trust in the data, and data quality is one part of that, so I did not want to compromise on it. And in our world, with GDPR, CCPA, and all these regulatory requirements, we could not keep up with classifying data as fast as the regulators were reclassifying what counts as PII. So I went down the path of saying we need to classify everything. Then, if additional fields come in, for example when CCPA introduced device ID, being able to classify and tag the data, and then have rules on how I treat it based on how it’s classified, was important, because I can’t keep up doing it after the fact.
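As a rough sketch of the tag-driven handling Leilani describes, here is a minimal Python example. The tags and policies are hypothetical; a governance platform would apply such rules automatically at ingest.

```python
# Map tags to handling policies (hypothetical).
TAG_POLICIES = {
    "pii.email":     "tokenize",
    "pii.phone":     "tokenize",
    "pii.device_id": "encrypt",   # e.g. added when CCPA introduced device IDs
    "public":        "pass",
}

# Map fields to tags at classification time (hypothetical).
FIELD_TAGS = {
    "email":     "pii.email",
    "phone":     "pii.phone",
    "device_id": "pii.device_id",
    "zip_code":  "public",
}

def policy_for(field_name: str) -> str:
    """Resolve how a field is treated from its tag rather than its
    name, so a new regulation only requires retagging, not new code."""
    return TAG_POLICIES.get(FIELD_TAGS.get(field_name, "public"), "pass")

assert policy_for("device_id") == "encrypt"
assert policy_for("zip_code") == "pass"
```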
Another problem we have in our world, and maybe it exists elsewhere too, is a data platform that doesn’t allow you to leave the data where it is and use it in place, but forces you to pull it out and analyze it somewhere else. That introduces data sprawl, and it increases your risk exponentially because you lose track of where the data is. So I wanted a data platform that makes it easy to ingest data and keep it there, avoiding the need for people to pull it out to analyze it somewhere else.

We evaluated all the players, believe me, there are a lot to choose from, and checked off which features each has. That’s how we ended up with a unified platform, and, Susan, it’s the one that has by far the most features we need. I did not come from the school of wanting best-of-breed at the feature level; for me, how it all works together is far more valuable. I do not want to waste resources on integration after the fact, making tools talk to each other. There may be business value in that sometimes, and best-of-breed components are probably fine for some, but for me, a platform that is integrated and unified and gets the customer-centric data in one spot, meaning I can ingest and use it from the same platform, was key to our decision.
[Susan Cook] I heard the same from another one of our banks: if one platform can get us to 80 percent across data quality, data ingestion, augmented data management, catalog, role-based access control, the entire litany of requirements, then great, let’s keep it in one platform. Maybe every once in a while there will be an outlier, some very specific feature or function we need; then and only then will we make an exception and go outside for a specialized tool. But exactly to your point: let’s not break that transparency and lineage with a separate piece of technology unless we absolutely have to. A lot of people are thinking along the same lines.
Can you talk more about protecting customer data when building golden records?
[Leilani Moll] When you build the golden customer record, up front you sometimes need the actual values. For the team that is going to detect the duplicates, you want to keep that group as small as possible. But the moment you’re done with mastering, some of that toxic PII is no longer relevant, and the Zaloni platform at that point lets you tokenize it. So exposing the actual raw data at the beginning to a small set of people whose purpose is to build the customer-centric golden records, and then tokenizing it after you’re done, before you expose it more broadly, is a really good model. That’s how the zoned architecture works: after you’re done, it can transparently tokenize, encrypt, or mask the data.
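For illustration, here is a minimal Python sketch of deterministic tokenization after mastering. The key handling is hypothetical and this is not Zaloni’s implementation; it just shows the idea of replacing toxic PII with stable tokens.

```python
import hashlib
import hmac

SECRET_KEY = b"replace-with-a-managed-secret"   # hypothetical key store

def tokenize(value: str) -> str:
    """Replace a PII value with a stable, non-reversible token. The
    same input always yields the same token, so joins and duplicate
    checks still work downstream without exposing the raw value."""
    digest = hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256)
    return "tok_" + digest.hexdigest()[:16]

golden = {"bremer_id": "BRM-000123", "ssn": "078-05-1120"}
golden["ssn"] = tokenize(golden["ssn"])   # toxic PII leaves the record
print(golden)   # {'bremer_id': 'BRM-000123', 'ssn': 'tok_...'}
```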
[Leyton] Thank you. The next question we have: can you discuss connecting different customer data sets without unique identifiers?
[Leilani] Well, we actually don’t care whether a data set has a unique identifier, because the attributes we use to build the customer-centric golden record give us a unique customer entity at the end, and we then assign a Bremer-wide unique identifier to it. We know there are going to be duplicates, and in the machine learning model a clustering algorithm will detect that, say, these three people are the same and give them our own version of the unique identifier. If the system a record comes from has a unique identifier, we keep it as a reference so we can go back to that system, but it’s not crucial for our process to have one coming in. As a matter of fact, we know we are going to get duplicates when records come in.
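Here is a minimal Python sketch of matching without source identifiers: score attribute similarity, cluster probable duplicates transitively, and assign one Bremer-wide identifier per cluster. The names, threshold, and string-similarity measure are illustrative assumptions; the production approach uses a trained machine learning model rather than a fixed rule.

```python
import itertools
import uuid
from difflib import SequenceMatcher

def similar(a: dict, b: dict, threshold: float = 0.85) -> bool:
    """Compare two records on combined name + address similarity."""
    key = lambda r: f"{r['name']} {r['address']}".lower()
    return SequenceMatcher(None, key(a), key(b)).ratio() >= threshold

def cluster(records: list) -> list:
    """Group probable duplicates and stamp each group with one id."""
    parent = list(range(len(records)))       # union-find forest
    def find(i):
        while parent[i] != i:
            i = parent[i]
        return i
    for i, j in itertools.combinations(range(len(records)), 2):
        if similar(records[i], records[j]):
            parent[find(j)] = find(i)        # merge clusters
    ids = {}
    for i, rec in enumerate(records):
        root = find(i)
        ids.setdefault(root, f"BRM-{uuid.uuid4().hex[:8]}")
        rec["bremer_id"] = ids[root]
    return records

recs = cluster([
    {"name": "Amy Olson",  "address": "12 Oak St, St Paul MN"},
    {"name": "Amy Olsen",  "address": "12 Oak Street, St Paul MN"},
    {"name": "Bob Moller", "address": "99 Elm Ave, Fargo ND"},
])
# the two Amy records share a bremer_id; Bob gets his own
```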
[Leyton] How, and how often, do you have to make adjustments and retrain your machine learning model?
[Leilani Moll] The biggest thing is actually whether you have to adjust your survivorship rules; retraining the model is pretty easy. You build your training customer-centric data set, and the DME component makes that easy. What is sometimes a little cumbersome is rethinking the survivorship rules. Not every new system you bring in will affect survivorship, but it is something to look out for over time: you need to readjust your survivorship rules when you bring in a new source that can potentially disrupt what you originally thought about whom you can trust over whom. So it’s not so much the model side as the survivorship side that you need to watch. We do validate our models for accuracy quite frequently, because like most banks we have a model risk team, and that’s what they do: validate that the model is still accurate. We haven’t really had to retrain the model yet, but we have had to adjust the survivorship rules.
[Susan Cook] You mentioned early on that you defined the customer 360 profile in advance. How did you decide what attributes to include and which sources to use? Talk about that process of truly defining what was in the golden record at the beginning.
[Leilani Moll] The purpose of the golden record is to identify the customer: to have enough attributes to confidently say this customer is who they are. The first layer was to see what products they have, do they have insurance with us, do they have a mortgage, which means I need the account numbers, right? Then we go a step further with what we call 720: the vision is to add behavioral aspects, though we’re not there yet. The other aspect we add is what we call householding and super-householding.
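As one illustrative way to picture those layers, here is a minimal Python sketch of a golden record shape. Every field name here is hypothetical, not Bremer’s actual schema.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class GoldenCustomer:
    bremer_id: str                  # Bremer-wide unique identifier
    name: str
    # identity layer: enough attributes to say "this customer is who they are"
    identity_attrs: Dict[str, str] = field(default_factory=dict)
    # product layer: products held, keyed to account numbers
    accounts: Dict[str, str] = field(default_factory=dict)  # {"mortgage": "ACC-1"}
    # householding layers: group related customers together
    household_id: Optional[str] = None
    super_household_id: Optional[str] = None
```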
[Susan Cook] Wow, very complex. Thank you so much, Leilani. I learned a ton. Leyton, I turn it back to you.
[Leyton] Yes, of course. Thank you for answering that final question; that concludes our live session today. You can request a custom demo of our data platform, Arena.
Thank you for attending and we look forward to hearing from you. Goodbye.