Crisis-Ready Data Operations for Cost Savings and Faster Response

May 5th, 2020


Agility in data operations is always essential for business innovation, revenue growth and improving the customer experience. But in a time of crisis like the world is facing now, data management pipelines must be able to quickly pivot to recognize new sources of information in non-standard file formats, and without the added burden of hiring large teams.

On March 26, 2020, Gartner polled more than 400 IT leaders to understand their top actions for controlling costs amid the COVID-19 crisis. The number one action (besides cutting back on travel and hiring) is increasing the use of automation and other advanced IT tools.*

Streamlining your DataOps pipeline through a platform with automated workflows to ingest, catalog, and provision time-sensitive information gives enterprises the agility, self-service access, and scale in the cloud needed to respond quickly while delivering a better customer experience.
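The ingest, catalog, and provision steps above can be sketched as a minimal automated workflow. This is an illustrative sketch only; the class, method, and field names (and the source path) are assumptions for the example, not the API of any actual platform.

```python
# Minimal sketch of an automated ingest -> catalog -> provision flow.
# All names here are illustrative, not a real product API.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class CatalogEntry:
    name: str
    source: str
    ingested_at: str
    tags: list = field(default_factory=list)
    provisioned_to: list = field(default_factory=list)

class DataOpsPipeline:
    def __init__(self):
        self.catalog = {}

    def ingest(self, name, source, records):
        """Land raw records and automatically register them in the catalog."""
        self.catalog[name] = CatalogEntry(
            name=name,
            source=source,
            ingested_at=datetime.now(timezone.utc).isoformat(),
        )
        return records

    def tag(self, name, *tags):
        """Enrich the catalog entry so consumers can find the data."""
        self.catalog[name].tags.extend(tags)

    def provision(self, name, consumer):
        """Record which team or sandbox received the dataset."""
        self.catalog[name].provisioned_to.append(consumer)

pipeline = DataOpsPipeline()
pipeline.ingest("relief_claims", "sftp://agency/drop", [{"id": 1}])
pipeline.tag("relief_claims", "time-sensitive", "pii")
pipeline.provision("relief_claims", "analytics-sandbox")
```

The point of the sketch is that cataloging and provisioning happen as side effects of the pipeline itself, rather than as separate manual steps.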

The value your organization delivers is critical. Whether you are faced with processing patient data, executing government relief programs, or securing your supply chain, your analysts need real-time access to trusted data to make fast decisions that have a big impact.

Attend this webinar to understand real-world examples from companies on the forefront of fighting the COVID-19 crisis, and learn how to be DataOps ready for whatever comes your way.

*Source: Gartner Q1 Emerging Risks Webinar

Who Should Attend This Webinar
IT decision makers and those responsible for enterprise data management.

You’ll Learn Data Management Strategies for:
– Data acceleration for faster actionable insights
– Better resource utilization
– Risk reduction
– Transparent communication and collaboration


Do you have questions on how you can achieve Crisis-Ready Data Operations? Request a demo.

Read the webinar transcript here:

[Amy King] Thank you to everyone who is here. We are excited today to have DXC and Zaloni present Crisis-Ready Data Operations, and how it can help you with cost savings and faster response.

Many of you have probably seen the stat from Gartner saying that only 18% of IT leaders believe their businesses were highly prepared for the impact of coronavirus. I think that's probably even an inflated number. Many of you who have experienced some of these challenges know it is likely even lower than that, as we have all been hit with what feels like a heart attack across many of the industries we work with.

In this new reality, one of the biggest impacts came very early on from the data challenges businesses were facing. Many of them were trying to deal with monumental volumes of data from the PPP and CARES Act programs, the incredible amounts of money that needed to flow through the PPP itself as these loans were handed out, plus employment issues and manufacturing and supply chain issues. So many things involving monumental amounts of data hit businesses in such a short period of time.

Through all of this it became clear that DataOps, data operations, optimizing the data pipeline so that data gets in and is accelerated in a streamlined and governed fashion from one end to the other, all the way from ingestion to your end business users, and doing that at scale, was really the key to being able to respond to these many data challenges quickly and effectively. We've seen shifting business priorities as companies struggle to manage these things quickly while also containing costs in a high-pressure financial situation. In this environment of changing priorities, businesses are trying to get to analytics faster, accelerate the timeline, and provide a better customer experience while dealing with the CARES Act and these other challenges, and also to grow revenue as they seek to take advantage of some of the new opportunities that many parts of the COVID-19 response can afford, whether in pharma, health sciences, supply chain, or manufacturing. There are many opportunities, but all of them require this data preparation, and the ability to do it in a crisis situation. So to talk about all of this today, I'm really pleased to have two experts.
First we have Susan Cook, the CEO of Zaloni, and then Jim Coleman, the lead architect for partner integration at DXC. I want to let both of them introduce themselves, so Susan, welcome to the webinar.

[Susan Cook] Hey, Amy. Thanks so much, and hello everyone, welcome to our discussion today. As Amy said, I'm the CEO of Zaloni. I joined about six months ago, but prior to Zaloni I've always been in data and analytics, at companies like Oracle and IBM and most recently MicroStrategy, so this is a topic I am super passionate about, and I am thrilled to be here, especially with my dance partner today, Jim Coleman, who is such an expert in this area. So Jim, I'll let you introduce yourself.

[Jim Coleman] Thank you for spending an hour of your day on this relevant topic. My background includes over 25 years in data warehousing, business intelligence, and analytics. Most recently I've been architecting our analytics platform, integrating best-of-breed third-party applications, and helping our clients move to the cloud and take advantage of the capabilities in cloud environments. Back to you, Amy.

[Amy King] Thank you, Jim. So, before we start asking our panelists the questions, I just want to make sure you all as participants know that you can ask questions. You'll see the area for entering questions in the chat window; please use it, and we'll be monitoring those as we go through the webinar. We'll save most questions for the end, but we'll make sure they all get answered. You'll also see some additional information provided in a panel on the side that you can download and take with you after the presentation, and the recording will be available and circulated. All of those resources are within the BrightTALK channel itself. Okay, so let's get started. First, one of the things that has proven most important in this crisis situation is the ability to be agile with your data operations and to come up with new solutions quickly. Jim, I was really excited to hear about the DXC Pandemic Response Unit. Can you give us a little information about what it is and how DXC is using it to help your customers respond to this crisis?

[Jim Coleman] Sure, Amy. The DXC Pandemic Response Unit, or PRU, is a mobile robot with the ability to take non-contact temperature readings of up to 15 people at a distance of about two and a half to three and a half meters. It can take these readings with an efficiency of about 200 people per hour. The PRU also has computer vision for mask compliance detection, a voice broadcast system to disseminate information quickly, and the ability to perform non-contact disinfection. The pandemic response solution also includes pre-configured workflows and an AI platform to provide data and analytics to support our clients' pandemic policies and compliance efforts. One of the lessons we've learned in implementing the Pandemic Response Unit is that the more mature your DataOps environment is, the quicker it is to implement our Pandemic Response Unit solution for your organization, the easier it becomes for us to integrate the data and analytics that we generate with our solution, and the easier it is for you to use it to manage your responses to the COVID-19 pandemic.

[Amy King] That's really great. So, the more mature the DataOps environment is, the better you've seen companies able to implement your solution. Can you give us an example or two of what maturity looks like to you in a DataOps environment? What are some of the things you're looking for?

[Jim Coleman] The ability to easily integrate the data that we generate into their environment; to set up those data flows; to modify their data; to know which data tables in which areas they need to include that data in for their business processes; and then to implement that data in their business processes themselves. Those are some of the key things we've seen. There are also various versions of key fields that we want in our environment, which they need to be able to find quickly and provide to us.

[Amy King] That makes a lot of sense, and it really speaks to the end-to-end nature of data. That brings me to a question for you, Susan. Pharma and life sciences obviously are worlds that have been tasked with great challenges when it comes to COVID-19, whether they're working on testing, treatments, or vaccines, and having a data management platform has really helped them. What are some of the ways you've seen Zaloni's customers in these sectors using data intelligently to meet these challenges?

[Susan Cook] Yeah, so building on Jim's answers, the preparedness and readiness of these types of companies has been tested to the max, especially around speed and responsiveness, so I will speak through the eyes of a couple of our customers and their experiences. The first is Alexion Pharmaceuticals. They've been in the news recently for two reasons. Number one, they just completed a pretty large acquisition, but more germane to this topic, they have just entered phase three testing of a new drug that hopes to help minimize the damage to the lungs of a patient who has been stricken with COVID, so there's a lot of promise there and a lot of excitement about them entering phase three testing. Our other customer is LabCorp. LabCorp has been in the news a lot because they have productionalized and made available a home testing kit so that, after the fact, you can test yourself to see if you have antibodies, meaning, have you actually had COVID in the past and didn't even know it. Their home testing kit has a lot of promise. So those are two of our customers, and experiencing this pandemic through their eyes, there are some huge lessons learned about the ability to integrate vast amounts of new internal and external data sources. They find themselves in a position of having to purchase vast amounts of health insurance claims data and clinical trial data, and to integrate it very, very quickly. Plus, these are regulated industries; they have huge pressure on them from HIPAA and privacy requirements, so they have all of these requirements for compliance and data security. And then finally, I go back to speed, speed, speed. They have to be able to serve the public's desperate need for proactive, preemptive cures or diagnostic tools, and then, after the fact, help people recover, or understand if perhaps they have already recovered and maybe could help the greater public at large. So that's what we're seeing in our pharma and healthcare clients.

[Amy King] That is a lot, and it's impressive that they've been able to do it. Those three main things, large-scale data integration, speed, and the need for good data governance, really apply not only to pharma but to banking as well. Their challenges have been enormous from a data perspective. Can you talk to us about some of the challenges the financial services world has been dealing with, and how DataOps is really helping them overcome these?

[Susan Cook] So with this one, we have direct experience through our own eyes. We're a small startup, so we got to experience some of these programs personally, but also through our big banking customers like Bank of America, plus some of our mid-sized customers like Bremer Bank and others. We got to see this front and center, and here's the takeaway. You saw it in the news: so many of these institutions responded super well. They were out in front, processing hundreds of thousands, and in some cases millions, of these small business loan requests very quickly. Others were laggards; they struggled, and they had to start over again and again. The companies best prepared to respond quickly, you saw them in the news, and you saw which ones were ready, because they had already implemented an agile, responsive data environment. Our customer at Bremer Bank had a great quote: being ready with a DataOps approach and strategy in quote-unquote normal times prepares you for these times of crisis. A great data strategy, soup to nuts, culturally, organizationally, technologically, built in times of routine responsiveness, prepares you well because you have actually planned to be disrupted; you know things are going to change along the way. That's what we witnessed in our banking customers.

[Amy King] I think that's really a great point about how you're going to need to change in some way when the next challenge comes, and agility and the ability to scale give you the means to deal with that. You're not necessarily going to know what kind of data you'll need to handle or what the data challenge is going to be, but if you have that foundation there, then you're really ready for the unknown that comes next. So, knowing that agility and scale are so important in helping a business be prepared for crisis when it comes to their data, Jim, what are some other data-driven solutions along that line that you're providing to customers, and how do they help in terms of crisis preparedness?

[Jim Coleman] One of the things we've seen this crisis cause is disruptions in companies' supply chains. So one of the solutions we've developed at DXC uses predictive analytics to simulate various supply chain scenarios, such as the loss of a key supplier, a change in demand for products or services, changes in buyer behavior, or an overload of your key systems. Knowing how these scenarios will affect your business allows you to develop mitigation plans to reduce their impact. We're able to run the different scenarios and see what effect each has on the business, and then the business can develop mitigation plans, whether that's implementing new technologies, new skills, alternate suppliers, things like that, and be better prepared for those changes in the future. Another thing we're seeing caused by this crisis is an impact on the talent pool: a client's employees may not always be available like they were before, and there may be new skills required. So one of the things DXC is doing is leveraging our data and analytics to optimize our clients' talent pools, which allows us to rapidly and economically upskill or reskill their staff to meet the new demands and ensure business continuity, so they're able to continue to grow their business even in these difficult times. And the last solution I'll discuss is a quarantine compliance and verification system that allows a health agency or other authorized agency, such as the police, to effectively manage compliance among citizens or visitors to a country who might be issued a stay-at-home notice or quarantine order. The goal, obviously, is to reduce the spread of the COVID-19 virus. And I would repeat, you've heard this concept many times: the more mature the organization we're working with, the easier it is for us to implement the solution, to exchange the data that we need, and to provide them the data they need to track compliance.
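The scenario simulation Jim describes can be illustrated with a toy Monte Carlo model. This is a hedged sketch, not DXC's actual solution: the supplier capacities, demand figure, and normal-distribution assumption are all invented for illustration.

```python
# Toy supply chain scenario simulation: estimate how losing a key supplier
# affects the odds of meeting demand. All numbers are illustrative only.
import random

def simulate_fulfillment(suppliers, demand, lost_supplier=None,
                         trials=10_000, seed=42):
    """Return the fraction of trials in which total capacity covers demand.

    suppliers maps name -> (mean capacity, std dev); capacity per trial is
    drawn from a normal distribution for each supplier still available.
    """
    rng = random.Random(seed)
    met = 0
    for _ in range(trials):
        capacity = sum(
            rng.gauss(mean, sd)
            for name, (mean, sd) in suppliers.items()
            if name != lost_supplier
        )
        if capacity >= demand:
            met += 1
    return met / trials

# Hypothetical supplier pool and demand level.
suppliers = {"A": (500, 50), "B": (300, 40), "C": (250, 60)}
baseline = simulate_fulfillment(suppliers, demand=900)
disrupted = simulate_fulfillment(suppliers, demand=900, lost_supplier="A")
```

Comparing the baseline run against the run with supplier A removed quantifies how much a single-supplier loss erodes the odds of meeting demand, which is the kind of output a mitigation plan can be built around.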

[Amy King] AI is something we talk about a lot, obviously, with big data and with data management today. I'm wondering, though: with any crisis there's often a lot of time pressure, and that will often necessitate some quick changes to the AI functions you have within an org. Streamlining, and having a really good DataOps discipline, helps you there, whether it's retraining the algorithms you need to get AI working better for you quickly, or being able to pivot faster. What are some ways you see a more mature DataOps function helping you get better value from your AI in a time of pressure?

[Jim Coleman] Well, one thing I would say that we probably didn't appreciate as much before is that it's very important to have these capabilities available throughout your organization, because we really don't know where the next disruption is going to hit, or what part of the company it will impact. If you have data democratization in place, if all of the different organizations have specialists with access to this data, they can develop their own analytics and their own solutions, or at least have access to the folks who do have the ability to do it. That helps you get to your solution much faster. So the broader the solution is in your organization, the more people who can take advantage of it, the easier it is for you to react wherever the disruption hits.

[Amy King] That was a really great point. So, Susan, maturity is everything Jim is talking about: maturity in DataOps being a good signal of crisis-ready success, and having it spread throughout the org. A lot of these concepts can seem like a fairly large challenge, and I'm wondering if there are common threads you're seeing across companies that have been able to adapt and scale out their data operations successfully. What are some common threads to that success that really show it doesn't have to be a monumental task?

[Susan Cook] Yes. So, when I am talking to companies about their data management and DataOps approach, I think where they have been challenged, or have failed in the past, is when they dissected it and it was very fragmented. For us, we think of data almost like any other product or asset in your enterprise, and therefore it has a supply chain, just like there's an employee supply chain, an energy supply chain, a money supply chain, a product supply chain that you have to take care of and think of holistically. What often happens in an enterprise, especially big, complex ones, is somebody thinks about, okay, here's the original source, whether it's Veeva or nCino or Salesforce or SAP or whatever. Then they think separately about ingesting or streaming lots and lots of data into the big data or Hadoop or Cloudera environment. Then they think, oh, we've got some data quality issues, so let me think about how we address data quality, or mastering, or matching, separately. Then they think, okay, we need some analytics sandboxes, so let me address getting data into Snowflake or Redshift or Azure Data Lake or whatever it is, separately. And finally, a data scientist is going to consume it in some sort of R or Python algorithm, or somebody is going to consume it in a Tableau or MicroStrategy dashboard, or it's going to be consumed by another application that generates some sort of workflow. You can imagine there are all of these points of failure, all of these points of integration that don't necessarily happen in an automated way, which means the supply chain breaks frequently. So one of the takeaways I get from our more successful customers is: think about it holistically, soup to nuts, end to end.

The second takeaway is assuming disruptions will occur. In DevOps and product development and technology, everyone has now adopted this agile approach. You know you're going to have to turn on a dime; that's why sprints are only two weeks long, so you can change and turn and twist with the business. The same approach needs to be applied to data. And the final takeaway, and maybe this doesn't really apply just to data, it certainly applies to me in my CEO role: I heard an interview on COVID-19 with one of the Federal Reserve Bank leaders, and he said that in times of crisis, whether it was the recession of 2008 or COVID-19 right now, you cannot act quickly enough, and you really can't overreact. Certainly in my new role as CEO, I've had to make decisions much more quickly than I'm even comfortable with. I tend to be a very analytical person, and sometimes acting quickly and decisively is more important than getting it perfect. So don't be paralyzed by analyzing every tool, every technology, and every event that could happen; sometimes you've just got to jump out and act. Those are my key takeaways, Amy.

[Amy King] Those are great points: being able to act quickly, and really being able to look at things holistically. We talk a lot at Zaloni about things that help with that holistic view, whether it's having a view of your data across its entire journey, or collaboration tools that let you look across it in order to find and produce better insights, and to recommend and tag data for teams. All of those things that create efficiencies and more value come from that holistic approach, so I think that's really great. And while we're talking about collaboration and the things that are really changing in our space to help companies both accelerate analytics for these timely crisis responses and reduce costs in times of financial pressure, Jim, do you see any further changes in how companies will view their data pipelines and DataOps in the wake of all the things we're managing through with COVID-19 now?

[Jim Coleman] Absolutely. I'll start with the second point, the assumption today that there will be disruptions in the future. One of the most common questions we're getting these days is, how can I prepare my organization for future disruptions? We've all recognized the value of data-driven insights for years, but the recent events and the disruptions we didn't expect are prioritizing data agility and its value in being able to respond to and handle future disruptions. That's a key change organizations are talking about: the priority on data agility is going to drive leading companies toward faster implementation of DataOps, to deliver the speed, quality, and reliability in all their data and analytics applications that let them respond better to future business disruptions. And then, one thing I find interesting is that even for companies who have DataOps in place today, this recent pandemic has stressed the capabilities of those DataOps environments, and they've realized that it's more than just continuous integration and continuous deployment of data pipelines. It also requires human collaboration and supporting processes to make DataOps valuable. Being able to make changes quickly only provides value if you can make the right changes. Some organizations had the technical capability to make changes but didn't have the right processes in place, or the right people deciding what types of changes needed to be made and how. They've realized it's just as important to have the right people involved and the right processes in place as it is to have the technical ability to implement DataOps. So, a well-organized structure.

[Amy King] People, teams, processes, and pipelines. That makes a lot of sense. I'm hoping maybe you can walk us through what that looks like in a real structural example.

(29:00 Overview of the DXC + Zaloni Crisis-Ready Solution)

[Jim Coleman] Sure. The diagram we're looking at here is a high-level diagram of the DXC plus Zaloni crisis-ready solution. If an organization is just starting out, or in the early stages of developing its analytics environment, this solution provides an advanced analytics platform integrated with powerful data management tools to deliver insights in weeks instead of months. One of the keys here is that you can very quickly stand up an environment with the capabilities for ingesting data, storing data, developing analytics and reporting, and providing those analytics to other environments. And with the Arena tool, indicated in blue in the middle, you can do that very easily through one central environment. This allows you to set up users and to set up workflows for approving different things, to support quick development of the solution. And with this you get a lot of benefits. First, as I mentioned, reducing your implementation time down to one to two days. You'll be able to unlock new business opportunities: with the ability to quickly develop new analytics applications, I can create things such as new sources of revenue from applications, and things to support my pandemic responses and other business solutions. It will improve my business decisions because of the agility and the ability to quickly get data in, work with that data in different sandboxes without affecting my primary data, and develop new solutions. And now that I've got those insights, I'll be able to deliver them quickly across my enterprise. One thing I mentioned before is key: with an environment like this and a powerful tool like Arena, we can give more of our clients' employees access to this environment to develop new datasets, new analytics, and new applications, and to provide value throughout the organization.

[Amy King] That is really compelling, and I really like seeing it laid out visually, so you can see how it really is an end-to-end system that connects people, processes, and pipelines together to deliver the time and cost savings, the new sources of revenue, and all the agility and scale you need to be ready for the next big crisis. So thank you for that.

[Jim Coleman] I'll point out two more things. One is the fact that this can be implemented in a variety of environments: DXC can implement it in the cloud, on premises, or in a combination of the two. And the fact that this environment has already been integrated, and is stood up through scripts, means it can be done very quickly, as we mentioned, in days instead of months, and get you to your insights much faster than if you were implementing it on your own, hooking all these pieces together and testing the different tools and how they interact with each other.


[Amy King] Yeah, the speed and the multi-cloud support are both great points. Do you see more customers using more than one cloud service provider?

[Jim Coleman] We do. At first everyone was sort of dipping their toes in the water, trying to stand up one provider, seeing if they could get it working, and learning about it. But we still see a lot of clients who, for one reason or another, may need to keep some application or data in their on-premises environment, or who are nervous about going with only one provider and being locked in. And something we're even seeing recently, as clients get more sophisticated: there may be a particular technology that one platform has and another doesn't, and they're already on one platform but want to take advantage of that new technology on a second platform. So we are certainly seeing an increase in the number of hybrid solutions we're implementing.

[Susan Cook] Yeah, I just wanted to chime in on the DXC diagram. The reason I like it so much is that, again, it goes back to being comprehensive and holistic. It takes into consideration all different flavors of data, even if it's streaming in real time off of sensors, like the PRU that Jim talked about earlier, but also your standard internal structured data, all different types of data, even email. A lot of our customers are doing ESG-type investment products, environmental, social, and governance products, and think about the data that would feed those: it's typically unstructured, from very non-traditional, social-type sources. DXC's very holistic approach contemplates all of that. And there in the middle, the blue box is Zaloni's platform, called Arena. For those of you who have known Zaloni for many years, we rebranded the Zaloni Data Platform as Arena, because an arena is a magical place where all things come together. So Zaloni's platform is now called Arena, and that's that blue box. Amy, I thought your recent blog articulated what it is really well, using the air-traffic-controller metaphor. As your data travels through this end-to-end supply chain, Zaloni's Arena serves as that controller that helps you navigate from the origin to the destination, and lets you track it, monitor it, control it, secure it, mask it, tokenize it, deliver it, and allow somebody to provision it into a sandbox so they can actually get some value out of it. So I love that DXC takes this holistic approach to DataOps and has actually considered all steps of that journey.

[Amy King] Excellent, thank you, those are really valuable points. And Susan, I love that you mentioned ESG. I was actually just reading a piece the other day about how, in this time of COVID-19, more and more companies are looking to ESG scores, for themselves and their partners, as a way to judge trustworthiness, and those with higher ESG scores were performing better overall financially. So I think it's becoming clear how important ESG scores are, and getting that data into systems and being able to score it is a great example of a DataOps function that needs to be able to flow. Excellent. So, this has covered a lot of big topics, and I'm hoping maybe you can wrap up here with a few recommendations before we move into questions, summing up how companies can work with their DataOps to streamline, to Jim's point, and really accelerate that ingest-to-analyze process in a governed and well-thought-out way to make themselves more crisis-ready.

[Susan Cook] Sure. As Jim said in his introduction, he's been in this space for 25 years. I won't be so precise; I'll just say multiple decades. And the difference now versus way back then is that the technologies have advanced so much to handle not just incredible volumes of data, but variety of data, and I think now is the time to take advantage of all of that innovation to modernize your data architecture. There are lots of resources to help you do that on our website. We published what we trademarked as the zone-based governance model; just go to zaloni.com and you can see how we recommend you promote data from pure, raw, and untested to something that is refined, tested, and can be trusted. The second thing is DataOps. If it's a new term to any of you, DataOps is simply an automated process and methodology used by data professionals to improve quality and get organized around disciplines and best practices for handling data. There's been a lot published recently on DataOps methodology, so there are plenty of resources out there. Third, leverage a collaborative data catalog. Over the course of history there have been a lot of products that do data dictionaries and metadata management; that's not what we're talking about. We're talking about taking it multiple steps further. Think of yourself as a consumer of information. You don't just want to know the name, rank, and serial number of a file; you want to know operational things like: Who else uses it? When was it last touched? How clean is it? Where did it come from? Is it confidential information? Is it already encrypted? If you're shopping for data, you want to know a lot more than just the basic metadata about the file, which is a segue into the fourth one.
Self-service data access is something that all the analytics and BI vendors have been talking about: self-service BI, self-service analytics. Well, you can't do that unless you give people self-service access to the data they need, and it goes back to the Amazon-ification, if you will, of shopping for data. They should be able to look at all the characteristics of the data they might want to use in their analysis, then grab it, use it, and set up their own sandboxes to do what they need. And finally, I know I've hit this point again and again, but think about the supply chain of your data, end to end, because there are all sorts of compliance and regulatory requirements: CCPA out of California, GDPR out of Europe, and I could go on and on. We have to have good controls over our data from the point of inception all the way until it's consumed for a purpose. So those are the key takeaways, Amy.
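Susan's zone-based promotion idea, moving data from a raw, untested zone to a trusted one only after it passes checks, can be sketched roughly as follows. This is an illustrative toy, not Zaloni's actual model; the zone names, record fields, and quality checks here are all hypothetical.

```python
# Illustrative sketch of zone-based governance: a record is promoted one
# zone forward (raw -> refined -> trusted) only if it passes quality checks.
RAW, REFINED, TRUSTED = "raw", "refined", "trusted"

def quality_checks(record):
    """Toy checks: required fields are present and non-empty."""
    return all(record.get(field) for field in ("id", "source", "payload"))

def promote(record, zone):
    """Promote a record one zone forward if it passes checks."""
    if zone == RAW and quality_checks(record):
        return REFINED
    if zone == REFINED and quality_checks(record):
        return TRUSTED
    return zone  # failed checks or already trusted: stay put

record = {"id": 1, "source": "crm", "payload": {"name": "Acme"}}
zone = promote(record, RAW)       # raw -> refined
zone = promote(record, zone)      # refined -> trusted
bad_zone = promote({"id": None}, RAW)  # fails checks, stays raw
```

In a real pipeline each zone would be backed by separate storage with its own access controls, and the checks would be far richer (schema validation, profiling, lineage), but the promotion gate is the core idea.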

[Amy King] Those are excellent. And I'm glad you brought up the Amazon-ification, too. I was thinking the other day about how my whole life has moved to online shopping, and how efficient the online shopping cart has made all of us. The digitized data shopping cart is just as important, and the well-controlled marketplace it provides is really one of the best ways to get that governed efficiency we're looking for, so thank you for bringing that up. And we have a couple of questions from the audience. Before we get to them, I'd just like to remind everyone that if you have additional ones, please enter them into the question chat window and we'll make sure Susan and Jim address them. Actually, I'll take the first one myself, because I think we've answered it. The question is: do you need to be cloud-first, or can you do both on-prem and cloud data? The Arena platform from Zaloni can handle both on-prem and cloud. We are Amazon Web Services Marketplace partners, as well as Azure and Google Cloud Platform partners. So we work with all the cloud solutions, in hybrid situations, as well as on premises, which is really where we started. And DXC is happy to work with customers regardless of their data warehouse configuration, whether it's on-prem and/or in the cloud.

Okay, so the next question is: is Change Data Capture one of your features? Susan, I'll throw that one over to you.

[Susan Cook] Yeah, I hate to give the consultant-speak answer, but it depends. So whoever asked that question, please submit your email, because that's probably a longer conversation. The answer is maybe, because it depends on the underlying source technology and what you're going to do with it. Can we capture not the entire source, but monitor for changes and grab those as they occur? The answer is maybe; it depends on the underlying technology you're going after. For example, we couldn't do that with a mainframe source. So let's talk about it further. I'm not trying to avoid your question, I want to answer it honestly, so let's see what your specific environment and need is.
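The change-data-capture pattern Susan describes, monitoring a source for changes instead of re-reading it wholesale, can be illustrated with a simple watermark poll. This is a generic sketch, not Zaloni's implementation; the table, columns, and version counter are hypothetical, and real CDC against databases that support it would typically read a transaction log rather than poll.

```python
# Minimal watermark-based change capture: each poll fetches only rows whose
# version counter is newer than the last watermark we saw.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, value TEXT, version INTEGER)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "a", 1), (2, "b", 2), (3, "c", 3)])

def fetch_changes(conn, last_seen_version):
    """Return rows newer than the watermark, plus the new watermark."""
    rows = conn.execute(
        "SELECT id, value, version FROM orders WHERE version > ? ORDER BY version",
        (last_seen_version,),
    ).fetchall()
    new_watermark = rows[-1][2] if rows else last_seen_version
    return rows, new_watermark

changes, watermark = fetch_changes(conn, 0)              # first poll: all rows
conn.execute("INSERT INTO orders VALUES (4, 'd', 4)")     # a change arrives
new_changes, watermark = fetch_changes(conn, watermark)   # later poll: only the new row
```

The "it depends" in Susan's answer maps directly to whether the source can expose such a change marker at all; a flat mainframe extract, for instance, offers nothing to poll against.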

[Amy King] Right, okay. And then we have a question specifically for Jim: what are the new staff skills in demand due to COVID-19?

[Jim Coleman] There are a few of them. Some of the ones specific to COVID-19 would be things like being able to disinfect an environment and knowing what the requirements are. For example, a retail vendor now needs to disinfect clothes after someone has tried them on. There are a lot of new compliance processes in place today where people need to go through quick training, show that they've passed it, and implement those things. We've also seen cases where some employees aren't available right now, and a business wants to train other employees in skills it already depends on, so they can take over for the ones who aren't available at the moment. So part of what we're doing is using analytics to figure out which employees would be best to reskill, or to train in skills they didn't have before, and which ones may have corollary skills that would make it faster for them to pick up the new ones. Those are some of the things we're doing: implementing analytics for the skills and for the talent.
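Jim's point about finding employees with corollary skills can be sketched as a simple similarity ranking. This is a hypothetical illustration, not DXC's actual analytics; the skill sets and the use of Jaccard similarity as the scoring function are my assumptions.

```python
# Rank employees by how much their existing skills overlap with a target
# role's skill set, as a rough proxy for how quickly they could reskill.
def overlap_score(employee_skills, role_skills):
    """Jaccard similarity: |intersection| / |union| of the two skill sets."""
    e, r = set(employee_skills), set(role_skills)
    return len(e & r) / len(e | r) if e | r else 0.0

employees = {
    "alice": {"sql", "python", "reporting"},
    "bob": {"forklift", "inventory"},
}
target_role = {"sql", "python", "statistics"}

ranked = sorted(employees,
                key=lambda name: overlap_score(employees[name], target_role),
                reverse=True)
```

A production version would weight skills by how hard they are to acquire rather than treating every overlap equally, but the ranking idea is the same.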

[Amy King] I feel like that is an endless field right now. I keep reading pieces on things like contact tracers, and where we're going to find all of them, and people to take temperatures, all of these areas of work that didn't exist before, and how we can transfer one group of people who find themselves needing work into the skills needed for these new types of roles. That's really interesting. Okay, so we have just a couple more. Jim, this one's for you: "We already use quite a few data technologies. Why can't we use them to build these DataOps functions ourselves? Why would we need to work with another vendor?"

[Jim Coleman] First of all, I would certainly say that you can build them yourself. There are a lot of different tools out there, and you certainly can build DataOps functions with separate tools. But Susan mentioned the way Amy described Arena as an air traffic controller. I'll date myself a little bit, but I like to refer to it as something similar to an integrated development environment. I'm an old programmer, and when I first started programming I had a separate tool for editing my code, a separate tool for compiling the code, for executing the code, for testing the code, for troubleshooting the code. It took me a long time to set up an environment for a project, tie all those things together, and get a seamless set of tools to build an executable. When integrated development environments came out and you had one environment where everything was already tightly coupled for you, you could produce things and move between them much faster, and your production of applications was much faster as well. The analytics environment we have in the Zaloni Arena is somewhat similar: all of the linking is already done and it's already tightly coupled, so you're not going to spend months trying to get these capabilities to work well with each other. And even when you do get separate tools working, Arena is much more tightly integrated. Whenever I'm ingesting data through Arena, it's also capturing metadata. When I'm in my sandbox looking for data, I can search the data, and some of that metadata captured during ingestion is there for me to search on. The same thing is true if I'm building new datasets: I can tag those datasets, and it becomes easier for other people to find them because of the integration that's already in the application. So again, I would say yes, you can build it yourself, but it will take you longer, and you won't have the productivity and the ease of use that you will have if you use these more modern technologies and more mature solutions.
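Jim's point that ingestion and cataloging are tightly coupled, so metadata is captured as a side effect of ingest and is immediately searchable, can be sketched like this. The field names and in-memory catalog are purely illustrative, not Arena's API.

```python
# Sketch of ingest-time metadata capture: cataloging happens in the same
# step as ingestion, so there is no separate, lagging documentation task.
import datetime

catalog = []  # in-memory stand-in for a shared data catalog

def ingest(dataset_name, rows, source):
    """Ingest rows and record catalog metadata in one step."""
    catalog.append({
        "name": dataset_name,
        "source": source,
        "row_count": len(rows),
        "ingested_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })
    return rows  # downstream processing would continue here

def search(term):
    """Find cataloged datasets whose name or source mentions the term."""
    return [m for m in catalog if term in m["name"] or term in m["source"]]

ingest("covid_case_counts", [{"county": "Wake", "cases": 10}], "state_health_dept")
results = search("covid")
```

The contrast with stitched-together tools is that here nothing has to be wired up afterward: whoever searches the catalog sees every ingested dataset with its operational metadata by construction.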

[Amy King] Those gains you mentioned definitely seem even more important in a crisis situation, so that makes a lot of sense. I'm going to end with one last question for Susan. We've been talking quite a bit about the IT side of the data spectrum. How do you, in the end, get the business side more involved in the DataOps process?

[Susan Cook] Well, they have a vested interest, for sure; they are the consumers of this information. We have this notion in a DataOps methodology, and certainly in our catalog, of a data steward. A data steward is someone who owns that data, makes sure the definition is clear and consistent, addresses the business need, and addresses what clean data looks like. So our customers get their business counterparts in right at the very beginning. One of the things I learned from one of our banking customers is that the best way to approach this is to take a single source all the way through the process, end to end, at one time. Don't try a shotgun approach of ingesting 500 different sources and cleaning them all up at once. If you do one data source end to end with a great business partner, you will identify the vast majority of the challenges you're going to face across many, many data sources. So do one all the way through, then do another one, and you get faster and you get better. Then do another one, and you get faster and you get better, and all of a sudden you've probably got five to twenty data sources running through your DataOps approach and methodology, and you have created sponsors who turn into your spokespeople for the good use of data and good discipline around data. So get them involved and engaged from the beginning is my recommendation.

[Amy King] Thank you so much, that is great advice. And I think we are done. I just want to encourage everyone to look at the attachments and links, and we will be circulating the webinar by email. Thank you all so much for your attendance today. We hope you got some great information about how better DataOps can help prepare your business for the crisis today and the one tomorrow. I want to thank both Jim and Susan for their great participation, and again all of you for joining. We look forward to having you on Zaloni's next webinar, and you can find information about those, and anything else about our platform, on our website, zaloni.com. Thank you all.