00:00
Gorkem Sevinc, Qualytics
Well, hello everybody. Thank you for joining us today. My name is Gorkem Sevinc. I'm the co-founder and CEO of Qualytics. Today I'm joined by two good friends of mine who know the different spectrums of data quality and data governance really well. So first we have Renee Colwell from Revantage. Renee, would you like to introduce yourself real quick?
00:22
Renee Colwell, Revantage
Yeah. Hi everyone. So I'm with Revantage. Revantage is kind of the middle and back office for the Blackstone portfolio companies. We are a data aggregator and data consolidator for the Blackstone ecosystem, and I am the global DQ lead. I've been in data a really long time.
00:45
Gorkem Sevinc, Qualytics
Excellent. Thank you for that introduction, Renee. And Matt, I think we have all heard the name Georgia-Pacific a little bit, but tell us a little bit about you, the company, and your role.
00:57
Matt Robuck, Georgia-Pacific
Yeah. Good afternoon everyone. Matt Robuck. I lead data and analytics here at Georgia-Pacific. We're a large manufacturing company, with Koch Industries as our parent company. I've been in this role and at GP for about three years, but I've been in data for a good portion of my career. So really excited to be here today, and excited to be here with Renee and Gorkem to talk a little bit about data quality, which is near and dear to my heart.
01:28
Gorkem Sevinc, Qualytics
Excellent. And for our attendees, this session is being recorded. We will share the recording after we're done. So with that, I wanted to start us off with a little bit of level setting. I started the company in the first place because I was in Matt and Renee's shoes, where I personally lived and breathed these data quality issues as problems that I needed to solve. And I like to call this problem expensive whack-a-mole, because it's manual, because it's not scalable, we wait for things to break before we prioritize a data asset, and we're limited by the boundaries of our imaginations when we solve these problems. And Eric and I started the company in the first place because we wanted to move towards a proactive place within data quality.
02:19
Gorkem Sevinc, Qualytics
And I don't think I need to tell everybody too much about the fact that derivative uses of data - AI, BI, and operational data - are just exploding. We are living in a data-first world, and the success of data, the success of AI, the success of BI all hinge on data quality at the end of the day. And don't just take my word for it. Some of the metrics that I like to put on the screen here: three quarters of executives are unhappy with their data, and half of those executives are actually admitting that they have made a bad decision off of a bad data point in the last six months. Those are just the ones that admit it. It's crazy. So what do enterprises do about this? They hire chief data officers, they hire heads of data governance, heads of data quality.
03:08
Gorkem Sevinc, Qualytics
And those folks are prioritizing three things: security, discovery, and quality. From when we started the company until today, I've seen a significant uptick in the importance of data quality and data governance, and in how much interest folks have in getting foundational trust in their data. So that's what we're going to talk about today: how the existing tooling, the legacy tooling of today - the solutions that we have either used or built ourselves - is inadequate, and how those solutions require us to spend at least two years and seven-figure investments to get to any level of business maturity. I'm not even talking about technological maturity, just business maturity, being able to have business impact.
04:03
Gorkem Sevinc, Qualytics
So today what we're going to dive into is not only making the ROI case, but the fact that AI is making our problems significantly worse. We are now enabling business users to come up with previously unknown insights. And do we actually trust the insights that those folks are coming up with? Well, it's a big resounding no. And we need to get a handle on our data quality and data governance. So today's topics are going to include: how does data quality affect business outcomes? How do we articulate the ROI of data quality to people that are not living and breathing data quality and data governance all day long? What are some of the metrics that resonate with business users, and some of those real examples? So with that, I'm going to stop sharing my screen and let's go into my first question, right?
04:57
Gorkem Sevinc, Qualytics
Getting budgets approved is a problem for us all. You see a data governance product, you ran an assessment on it, you see the ROI clearly, but you're trying to make the case to the budget gods to get approval. So let's start with you, Renee, first. How do you help your business stakeholders understand and quantify the actual cost of bad data? How do you then articulate the ROI of improving it?
05:23
Renee Colwell, Revantage
You know, it's an art, not a science. And it very much depends. We've been doing this for three years, so I can pull some real stories out and say, you know, in this other area, before we started doing data quality, here were some errors and this is how much it cost you. We have, like, risk and accounting, and they can tell you exactly what the monetary pain is. And then, after data quality, those errors go away, or we're monitoring them and catching them before they cause problems. So that's a monetary value right there. The other thing is, depending on the audience - and I'm not that great with this part of it - you can do estimates for how much this costs you in manual labor, right?
06:30
Renee Colwell, Revantage
How many people are you paying to do work that you're not even thinking about - reconciliation, fixing, cleaning stuff up later? How much are you paying them? How much time are they spending? So folding that in, depending on the audience, can resonate.
06:51
Gorkem Sevinc, Qualytics
That's great. And that's from the champion perspective, right? Somebody who is a practitioner of operational data quality. And Matt, you're coming at it from the economic buyer perspective. Tell us a little bit about your journey, Matt. How did you go from prioritizing data quality to getting budgetary approvals and the ROI thought process?
07:15
Matt Robuck, Georgia-Pacific
Yeah. So I would say the first thing about investment in data quality, the first rule, is you don't talk about data quality. Talk about the business problems and the business outcomes that you will solve and that you will fix when you deploy a solution. And I would say, for a number of years we've done a great job with more rule-based data quality, more of a manual type of data quality initiative where we're building individual rules where we can to improve data quality. It wasn't really until we deployed an agent, an AI agent, on top of a sales data set to really enable our sales teams to win deals a lot faster. When we did that, it took us two weeks to get the large language model up on top of the data set returning answers.
08:17
Matt Robuck, Georgia-Pacific
It took us another six months for it to actually be useful. And that was because the data was in such poor shape that we coded around it. We spent six months coding around bad-quality data. So when I go back to our business leaders and say, hey, you know those problems that we're trying to solve? I can deploy an agent for you in two weeks, but our data is of such poor quality that it's going to take me another half a year before I get to that. When you have a story like that, things begin to start clicking. What's hard is, until you have those stories to tell, just going and saying, hey, I want to spend X dollars on data quality - that becomes a harder discussion to have.
09:06
Matt Robuck, Georgia-Pacific
So here we're really problem-led, and we think about what business problems we're going to solve, not necessarily what technology capabilities we're going to build. We don't lead with that. We marry those two things together. So that's why I say the rule is you don't talk about data quality, you talk about the business problems you have.
09:28
Renee Colwell, Revantage
I second that. You talk the language of the business to the business. For some people you're going to quantify, but sometimes a good story just gets everyone's attention. And you don't even say things like, well, how's your data quality? You say things like, do you trust the information you're using? You don't even have to use the word data, right? But we know it is data and it is quality.
09:59
Gorkem Sevinc, Qualytics
And of course it's a journey, right? Your data governance journey is going to include people and process and technology. Technology just does not solve every single problem that you have. You have to have identified people and process, and have lived and breathed some of the anomalies and data quality issues a little bit yourself, before you can actually prioritize defining this at scale. So let me ask you this question. How have you both - I've seen you both do this, so that's why I want to ask about it - how have you involved business stakeholders in such trials, proof of concepts, et cetera, early, so that they can see for themselves what it means to have data governance?
10:43
Gorkem Sevinc, Qualytics
Because showing an anomaly to a person is not meaningful enough - they're going to say, okay, yeah, you should go fix that. So how do you actually prioritize their involvement early enough? Do you go to a CFO and say, here's your data quality report, here's your data quality scorecard? What do you do? Matt, maybe you go first.
11:04
Matt Robuck, Georgia-Pacific
Yeah, so like I mentioned before, Gorkem, once you have that story and you can really tell that story - and the story that I told, that was a year and a half to two years ago for us. Since then, what we've done, and we really think about it through value streams: we start with the business problem and then we deliver a solution that solves that business problem, true end to end. What does that take? And in that example I gave, there was a six-month period that data quality cost us in delivering that value stream. So what we did next is we said, ooh, I think we've got a problem here that we need a solution for. So once we did that, we took the next one - and we've got plenty of business problems here to go solve.
11:57
Matt Robuck, Georgia-Pacific
So the next one came up where we were going to go and deploy an agent. On top of that, we said, hey, let's take our business partner along for that journey. Because in that first example, they were like, I can't believe it took half a year to do this. So we said, okay, why don't you come along the journey with us, and we will look at that true end-to-end value stream. And when you take a business leader along the journey and you talk in the language of value streams or true business outcomes, then you get questions like, hey, what can we do to accelerate this? Versus if I just come to that leader and say we need better data quality, they're like, I can't wrap my head around that.
12:39
Matt Robuck, Georgia-Pacific
But when you have them ask the question of, here's a real business problem, it took us six months, how do we accelerate that? That's a good question. So you want your partners and your business leaders asking you good questions, and you want to take them along a journey with a specific example, which we did. So that was a sales example. Later on, in the manufacturing space, we had another huge $75 million opportunity. So we said, hey, business leader, come along the journey with us and we'll talk through how we can accelerate this end to end. It just so happened that data quality was a big chunk of that, and that's ultimately what led to us making more investment in the data quality space - that journey. So there's not a 10-page slide deck that I can put together to go pitch data quality.
13:31
Matt Robuck, Georgia-Pacific
It's very much wrapped into the stories and taking our business partners along for the journey.
13:41
Gorkem Sevinc, Qualytics
That's really great. And actually, if we were to double-click a little bit more: my traditional way of thinking has been that the easiest way to calculate an ROI that I'm going to try to make a case to my CFO about is operational efficiency. If I wanted this much coverage of data quality, it's going to take me this many FTE hours to actually implement, or I can just go buy - it's the build-versus-buy decision. But what we're not talking about is the risk. How do I actually put a number around risk if I don't do this? What is the cost of inaction? Renee, thoughts on what I just said and what Matt just said?
14:27
Renee Colwell, Revantage
Yeah, so I'm definitely for concrete examples, and yes, data quality has to be baked in, and I'm one who says, get that specific example. Don't just talk in generalities - data quality is great, right? So I'll give you an example that actually has to do with risk. At Revantage, one of the services is to provide insurance for property holdings that Blackstone has in their portfolios. And it's a very complicated process. There are a lot of manual touches, and at the end of the day, you get as much information as you can about every single piece of commercial real estate, for example, that you own, and then you calculate: well, how am I going to insure it, how am I going to mitigate the risk?
15:30
Renee Colwell, Revantage
So one fine day, using AI - because it's all an AI calculator - something popped up: there's this building, and the fire component of the insurance plan is more than 10 times that of all of its peers. Well, why is that? So I said, listen, we'll do the forensics, we'll figure this out. And what happened was very simple. Someone had taken the code for what the building is constructed out of, and they put in the code for wood, right? The other attributes say it's in a zip code that's a very dense urban commercial zip code, it's 30 floors high, and it's made out of wood. So AI does not have common sense. AI goes, wow, 30 floors of wood, that's really dangerous for fire. We're going to charge you a lot for that.
16:38
Renee Colwell, Revantage
So that's a concrete example where I can say this teeny, on a tech level very small problem turned into X amount of dollars that you would have to pay, because you didn't catch it in time and because AI doesn't have common sense. So you've got to put that common sense in. The business people are often surprised that AI does not have common sense. Sometimes they assume: but that's obvious. Yes - not to AI. So that was kind of an easy sell when we found out.
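A minimal sketch of the kind of cross-field validity check that would catch the case Renee describes, before the record ever reaches the pricing model. The column names, construction codes, and the six-storey threshold are hypothetical illustrations, not Revantage's actual schema or rules.

```python
# Hypothetical cross-field validity check: a combustible construction code
# on a building whose height makes that implausible. Names and the
# 6-storey threshold are illustrative only.

COMBUSTIBLE_CODES = {"WOOD", "WOOD_FRAME"}   # assumed construction codes
MAX_PLAUSIBLE_WOOD_FLOORS = 6                # assumed business threshold

def flag_implausible_construction(record: dict) -> bool:
    """Return True if the record should be routed to a human for review."""
    code = str(record.get("construction_code", "")).upper()
    floors = record.get("floors") or 0
    return code in COMBUSTIBLE_CODES and floors > MAX_PLAUSIBLE_WOOD_FLOORS

# The 30-storey "wood" building would be flagged before it is ever priced.
building = {"construction_code": "wood", "floors": 30, "zip": "10001"}
if flag_implausible_construction(building):
    print("Route to steward: construction code inconsistent with building height")
```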
17:16
Gorkem Sevinc, Qualytics
Yeah, absolutely. I mean, that's a great example of one anomaly that is going to cost you so much operationally. So we're saying data quality is not just at the edge, it's at the operational side. You have to shift left. You have to get as close to the source as possible, so that you can actually correct issues at the source, not just slap band-aids on them.
17:44
Renee Colwell, Revantage
Yeah. And can I just tell you, that one example resonated more than any chart going, you're 50% complete and 20% valid. All of that, the business is like, oh my God, spare me. But that example was like, yes, I understand it now.
18:07
Matt Robuck, Georgia-Pacific
Yeah, Renee, I love that example. I think that's such a great example, and it triggered something in my head. We've got a data quality score here for our different data products and data assets. We don't talk about that publicly, though. That's more of an internal IT number. When we talk about metrics, we talk about missed revenue, margin leakage, rework, delayed decisions, inventory errors. Those are our data quality metrics, meaning those are tied to actual business outcomes. And we can tie it to, say: hey, we've got nine figures of inventory that we're trying to optimize, and we know we don't have all of the information because the quality of the data is not great from a buying perspective, so we know we won't make the most cost-efficient decisions when it comes to buying.
19:05
Matt Robuck, Georgia-Pacific
Therefore, we're going to improve the data quality. But what we really mean is we're going to improve our leverage from a buying perspective for our sourcing organization. So, Renee, great example. And I think it continues to come back to the business problems that you're trying to solve.
19:25
Gorkem Sevinc, Qualytics
That triggers something in my mind, which is: right, we're not in the business - or at least me as a vendor; I used to be on the buyer side, right? As a vendor now, I'm not in the business of selling you features. I'm in the business of solving your business problems with you, and solving them faster. It's amazing. We talk to a lot of companies from different industries, and if you talk to 100 people, you're going to get 150 different opinions about how to do the scoring of data quality. I care about validity more than freshness - well, okay, yes. And every anomaly is not equal - yes, I also agree with that.
20:06
Gorkem Sevinc, Qualytics
Yes, there are dimensions of data quality that we care about, but your CFO typically is not going to care whether your data quality score is going up or down. They do care if your reconciliation processes between your Oracle and your SQL Server are going to take one fifth of the FTEs and 10% of the time they used to take, with fewer errors. How do you tie a metric to that, to actually show that to the business stakeholder? That's where I get a little blurry.
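One way to make that kind of metric concrete is a simple back-of-the-envelope calculation like the sketch below. All of the inputs - head count, hours per run, hourly cost, run frequency - are hypothetical placeholders that would come from the operations or finance team, not from a data quality tool.

```python
# Back-of-the-envelope ROI for the reconciliation example: 1/5 of the FTEs,
# 10% of the elapsed time. All inputs are hypothetical placeholders.

FTE_HOURLY_COST = 85.0          # assumed fully loaded cost per hour
RUNS_PER_YEAR = 12              # assumed monthly reconciliation

before = {"ftes": 5, "hours_per_run": 40}
after  = {"ftes": 1, "hours_per_run": 4}    # 1/5 the FTEs, 10% of the time

def annual_cost(scenario: dict) -> float:
    return scenario["ftes"] * scenario["hours_per_run"] * RUNS_PER_YEAR * FTE_HOURLY_COST

savings = annual_cost(before) - annual_cost(after)
print(f"Annual reconciliation labor saved: ${savings:,.0f}")
# A stakeholder-facing metric would pair this with error counts avoided,
# since fewer manual touches generally means fewer reconciliation breaks.
```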
20:41
Matt Robuck, Georgia-Pacific
Yeah, maybe let me say something that might not directly answer that question yet, but I'll walk you through kind of our journey there. So back to my sales story. Once we identified that we've got a data quality problem - and it's coming quickly from an AI perspective, as we're trying to drive more business outcomes from agents - we realized we need another piece of technology, and people and process, to be able to much more quickly get our data into the state that it needs to be in. When we laid that out, we had a couple of options. Like I said, we had a great small team that was building rules. It would take us eight to 10 weeks to get those rules really deployed into production. It's the back and forth: did you mean that? Did you mean this?
21:34
Matt Robuck, Georgia-Pacific
I'm going to put it in dev - can you test it? So that took a long time. We could have chosen to scale that team, and if we'd scaled that team, my estimate was we'd need to go from two people to about 20 people writing the rules that we needed to write across all of our data sets. You know, at Georgia-Pacific we cover a number of different business areas, we have 35,000 employees - it's a big, complicated place from a data perspective. The other option we had was, hey, is there a more automated way? Not necessarily technology, but people and process. You've got data stewards or data owners who understand their data better than anyone. Previously, they were having to open a ticket with IT, and then IT would build the rule, and then it would take 10 weeks, and then we would move on.
22:33
Gorkem Sevinc, Qualytics
And the business logic is already stale by the time it's implemented.
22:36
Matt Robuck, Georgia-Pacific
Yeah, exactly. So that's how we ended up talking to you, Gorkem, saying, hey, is there another way, from a people and process side, where this can be more of a federated model? Yes, guardrails in place, but more of a federated model, so those data stewards and subject matter experts can build rules, or even have rules suggested to them, to really speed up that development lifecycle when it comes to deploying the rules. So that was a bit of our journey of how we got to the point where we're talking to Gorkem and team, to say, hey, what's a better way for us to do this, because of how quickly we are now going to have to move. Previously, there'd be some bad data in a report, and the report owner would say, oh, I know that's bad data.
23:28
Matt Robuck, Georgia-Pacific
It's coming from this source system, I'll get in there and fix it. Now we've got agents crunching a bunch of information and then spitting out an answer. There's no explainability the whole way through where a person is in the loop to say, oh, let me fix that real quick before you get the answer. So that's really what fundamentally changed for us in the last, really, 18 months that has led us to: hey, what's a better people, process, and technology way for us to make sure that our data is clean? So I don't know if that answers the question you were asking, Gorkem, but it helps to understand kind of where we're at, and this AI shift that's happened to us, and how we've had to now do something different.
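As a rough illustration of the federated model Matt describes - stewards declaring checks in business terms while a central team keeps guardrails such as naming, severity, and ownership - here is one possible shape for a steward-authored rule. The schema, field names, and severity levels are invented for illustration and are not Georgia-Pacific's or any vendor's actual format.

```python
# Illustrative only: a steward-authored rule expressed as data, so a central
# team can review, version, and enforce guardrails (naming, severity,
# ownership) without writing the checks themselves.
from dataclasses import dataclass
from typing import Callable

@dataclass
class QualityRule:
    name: str
    owner: str                      # the steward accountable for the rule
    severity: str                   # e.g. "warn" or "block" (guardrailed list)
    check: Callable[[dict], bool]   # True means the record passes

rules = [
    QualityRule(
        name="order_quantity_positive",
        owner="sales-ops",
        severity="block",
        check=lambda r: (r.get("quantity") or 0) > 0,
    ),
    QualityRule(
        name="ship_date_not_before_order_date",
        owner="sales-ops",
        severity="warn",
        check=lambda r: r.get("ship_date") is None or r["ship_date"] >= r["order_date"],
    ),
]

record = {"quantity": 0, "order_date": "2024-05-01", "ship_date": "2024-04-28"}
failures = [rule.name for rule in rules if not rule.check(record)]
print(failures)  # ['order_quantity_positive', 'ship_date_not_before_order_date']
```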
24:14
Gorkem Sevinc, Qualytics
Yeah, I like where you're going with that, Matt, which is: look, when we started the company, the methodologies we were thinking about were two things, right? Augmented data quality - I'm not replacing the human, I'm augmenting the human with superhuman powers, which means rule generation and rule maintenance, 95% of the rules that you need being automated so that I don't have to think about them, and I can focus on the 5% that are really complex business logic that cannot be automated. But then that 5% has to be a collaboration between the business stakeholders and the data teams. Data quality is not a data team problem, it's an organizational problem. And you have to have a cultural shift to be able to ensure that you're handling that data quality at scale. That business user, that FP&A analyst, is going to know their data the best.
25:04
Gorkem Sevinc, Qualytics
Then they're trying to communicate, through a Jira ticket, with somebody that doesn't know the data's context but knows how to write Python, trying to write that logic - that's just not a winning formula. And it's amazing that we have seen this shift happen in the market, from being a nice-to-have, this-is-cool, to now being I-have-to-have-this-in-place. Renee, how about you? You have been operationalizing data quality for a long time. What about these methodologies has landed well with your organization?
25:37
Renee Colwell, Revantage
So I have to say that real estate, and the way portfolios are structured as separate companies, is the most aggressively federated landscape I've ever seen in my life. So one of the challenges was - to me it was a total no-brainer - you're going to get a bunch of coders and they're going to do it here and they're going to do it in SQL, but oh my gosh, now we have to give the same actual rules, like in the old way, to a new set of developers, and they're going to develop it for Databricks or Python or whatever to get the same result.
26:25
Renee Colwell, Revantage
So one of the things that we really talked about with all the constituents is: you're all doing some things that are just baseline. You have to do it, and with machine learning, you know, you get that 95% - or I'm going to argue with you, sometimes it's 80%, but whatever - you get those ones right off the bat. You're not wasting resources on that. And you can also lift and shift, because we have a lot of portfolio companies and we have internal groups. Like my risk example: we have a group that's doing risk in the US, in North America, and also in the UK and in Europe, and they're all kind of similar, but not really.
27:20
Renee Colwell, Revantage
So, fine, you take what we did in the US and you send it across to Europe, and now they have the exact same set and you can tweak, instead of saying we're going to start from scratch. One thing that I tend to do is I'm really a hand-holder. I don't just go, here's a spreadsheet, I'm throwing it over the wall. I spend some time listening to their pain, because I feel like if there's nothing in it for them, people are not going to be as focused. If I'm just doing this for another group, they'll tell you, yeah, this is great, it's going to help the company, but they want to see something for them. So I spend a little time: this is what we're doing, but what is your pain?
28:08
Renee Colwell, Revantage
Maybe we can put a little bit in for you while we're standing this up - just hand-holding. And there's a lot of fear that people have with data. Frankly, they're terrified. So you soothe some of that by saying we're going to start small and focused, on something that's important. So if it's AI, we're going to start with something that's going into AI, and we're going to be with you every step of the way.
28:42
Gorkem Sevinc, Qualytics
That's great. And it's amazing. I like to think of the 1:10:100 rule of data quality, right? I don't know if everybody's familiar with this, but this is from a 1993 paper about the compounding cost of bad data. And it talks about how, if you catch yourself inputting bad data into a system - you fat-finger something and you caught yourself - it only costs the company a dollar, because you solved it right away. If that bad data has already gone into a database and is remediated later, it's going to cost $10 to actually remediate that issue. If you catch it after the fact, when you have already made a bad decision off of that data, it's cost you $100. So how do you both see that actually growing from the perspective of different flavors of AI?
29:39
Gorkem Sevinc, Qualytics
In my mind, machine learning is the next level of complexity - add a zero for machine learning. Then you go into AI models, that's another zero; you go into gen AI, another zero; and you go into agentic AI, and now I'm adding multiple zeros to that. Is that how you think about it as well? Has that been your experience? I mean, I know, Matt, you're going into that quite a bit - you're talking about the end state, but there are also multiple steps in the middle. So how do you think about that compounding cost of bad data?
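A quick arithmetic sketch of the 1:10:100 progression and the "add a zero per layer of automation" extension discussed here. The dollar values beyond $100 are the speakers' rough heuristic, not figures from the original paper.

```python
# The classic 1:10:100 progression plus the "add a zero per layer of
# automation" heuristic from the discussion. Dollar values are illustrative.
stages = [
    ("caught at entry",            1),
    ("remediated in the database", 10),
    ("acted on in a report",       100),
    ("fed into an ML model",       1_000),      # heuristic: add a zero
    ("fed into a GenAI answer",    10_000),     # heuristic: add a zero
    ("acted on by an AI agent",    100_000),    # heuristic: add a zero
]

for stage, cost in stages:
    print(f"{stage:<30} ~${cost:>7,}")
# The point: each additional layer that consumes the bad value without a
# human in the loop multiplies the cost of the same upstream error.
```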
30:15
Matt Robuck, Georgia-Pacific
Yeah, I think that study definitely needs to be updated to add in the thousand, ten thousand, hundred thousand dollar problem, right? When you've got AI now making decisions on data. You know what's interesting - maybe I'll back up for a second. For those of you who are technicians and engineers and practitioners in the space, you know that there's always this debate between the source system - let's fix it in the source system - versus let's fix it downstream. And I've got a great team, and that's a big part of this: you have to have the right mindset and the right team in place to be able to think this way. And I'm very fortunate at GP that we've got that team. But one of the things that we've been talking about - I actually had this conversation last week with our executive team.
31:08
Matt Robuck, Georgia-Pacific
We got into this debate again of who in IT is responsible for data quality. Is it the source system team or is it the data team? And I said, well, first of all, we're all responsible. And now that we're on our way to a much clearer data quality picture and true anomaly detection, we as the data team can help those source systems, and ultimately the end users, through detection - to say, hey guys, we don't care whose problem it is. It's an end-to-end problem, but we now have clear visibility into what it is. So for us it's been a bit of a mindset shift.
31:53
Matt Robuck, Georgia-Pacific
Part of the problem in these large organizations is that you have these siloed groups - I own the source system, and I own the data lake, and I own the data warehouses, and I own the analytics - and you get this diffusion of responsibilities. We call it a tragedy of the commons. So having a team in place that thinks about it truly end to end, and also having the technology that can support that true end-to-end view, means we can walk over to our ERP team and say, hey guys, we saw this yesterday and we think you should know about it. And I bet if you put a front-end rule in place to prevent this from happening, not only is it better from a data perspective, but it's also better from an ERP source system perspective.
32:37
Matt Robuck, Georgia-Pacific
So it's a bit of a paradigm shift, and that's not always easy when you've been delivering data in a specific way for a really long period of time. So to me, again, it comes back to a mindset shift; it comes back to having the right team in place. But we are at this kind of pivot point where things are really changing, and AI, yes, is driving a lot of that. But now these platforms are giving us the capability to hold the entire organization accountable, true end to end.
33:09
Gorkem Sevinc, Qualytics
It's amazing that you say that, Matt. We have had a few cases where a prospect that is trialing, call it, or just, you know, talking to us, all of a sudden says, well, you know, if I don't know about it, it doesn't exist, right? Well, you cannot get away with that mindset anymore.
33:32
Matt Robuck, Georgia-Pacific
That's how you end up with a hundred thousand dollar problem. Exactly.
33:36
Gorkem Sevinc, Qualytics
Yeah, exactly.
33:37
Renee Colwell, Revantage
Yeah. If you don't know about the termites in your house, they exist and you won't be happy with the result. Right, exactly.
33:47
Matt Robuck, Georgia-Pacific
For a long period of time.
33:49
Gorkem Sevinc, Qualytics
So let me ask you a very controversial question. We're talking about data governance, we're talking about data quality; all of these are part of data initiatives. Do you think data leaders should be reporting up to the CIO organization, to the CFO organization, or directly to the CEO? What do you think?
34:13
Matt Robuck, Georgia-Pacific
Well, that's a direct question. Renee, you want to start?
34:16
Renee Colwell, Revantage
Who, me first? So I vote for the CEO. I'm just going to go right out there, because the CEO really needs to know that data is your lifeblood. So go straight to the head. I haven't actually seen that in very many places. The CFO is great because they've got the money, and if they're a champion, you can get a lot of traction. In my experience it doesn't work so well if it's considered a tech function. So if you're reporting up to the CTO, then I feel like there's a lot of work to be done.
35:09
Matt Robuck, Georgia-Pacific
Yeah, Renee, I would agree with that, and I would add to it. Reporting structure definitely matters; mindset matters much more than that. So if you've got executive leaders who understand how data can move their P&L - how they can actually add revenue to the bottom line or reduce costs - and you've got leaders who have that perspective, that's very important at the senior level of the organization. And then the other thing I would say that's really important for us here: we talk a lot about integrated teams, and we think a lot about not necessarily a reporting structure on a sheet of paper, but do we have the best knowledge - we talk about leveraging comparative advantage - do we have the right people in a product team or a pod team? And a core component of that -
36:14
Matt Robuck, Georgia-Pacific
and I see this across my teams and across GP all the time: if I walk into a team that's delivering data and there is not a business leader, business owner, or business stakeholder as part of that group - and I don't mean that they just show up to the once-a-month report-outs, I mean they are in the scrum meetings - then I know that we're suboptimized. So we talk a lot about integrated teams and how important that is. And if you exist in an organization where the data team lives in IT and their entire job is to get data from source to a data lake, and then it hands off to someone else, and those groups don't talk - that is very difficult to drive strategic value from.
37:03
Matt Robuck, Georgia-Pacific
So to me it's less about like what the piece of paper looks like and it's much more about the operating model and mindset of the teams that are involved.
37:12
Gorkem Sevinc, Qualytics
It's almost like what the CISO organization looked like 10 years ago. The CISO role used to be just head of information security, reporting to the CIO. And nowadays you will not find an organization that does not have a CISO, one who either reports to the CIO directly, or is a C-level executive, or goes to the CEO directly. And why? The data space, to me, feels like what the information security and cyber space was 10 years ago, maybe 15 years ago. And we're following the same methods, right? They started with basically rule-based anomaly detection - you're finding anomalies in people behavior and code behavior - and now you have people and processes and all the NOC monitoring, et cetera,
38:08
Gorkem Sevinc, Qualytics
that you have in place so that you can actually feel like you have the right coverage. And with data it's actually very different, because the repeatability of threats is not the same - they're very different. But then again, the chief data officers, the heads of data, need to be empowered with budgets, with teams, with technology, and - you made a really good point, Matt - being able to go across the different parts of the organization to actually influence the work is very important. And that leads me to a question for Renee. Renee, you have been absolutely fantastic at engaging those business stakeholders in different portfolio companies, getting them to come in, jump on the bandwagon, and actually own their data quality. And that's not an easy cultural shift to make, right?
39:05
Gorkem Sevinc, Qualytics
Especially when you're going to have people that say, my data is good, your reporting is wrong, your Snowflake implementation is wrong, go blame the data engineer, blame the other guy. So what are some examples of the kind of verbiage that you use to get that stakeholder to accept that they're going to co-own data quality with you, that it's not just a you problem?
39:30
Renee Colwell, Revantage
Yeah, I definitely don't use the words data quality. I try to ask them questions. I think the toughest sell is when there is a very siloed mentality and each person is like, my data is good, not my problem. You have to break that somehow. Now, at Revantage, Blackstone is the umbrella parent. So if they tell a portfolio company, you're telling us the data is good, but it's not, they will listen - and it'll take some of the heat off of them if I come in and say, okay, these are the problems that are being noticed. Don't use the word problems - these are some of the things that have been noticed, and we're going to automate catching them before you get called on the carpet for it. They really like that. That's part of the message.
40:43
Renee Colwell, Revantage
The other thing is, when you're in a very siloed organization where they can be battling - they're all peers, and there's no one umbrella company saying you really have to get your ducks in a row - you can start by finding a bit of an ally in each group and showing what happens to the other groups. Show, don't tell. Here's something that happened: your data was great, but when it got to this layer, it was used in a different way. Right there, there's data purpose and meaning. Your stuff is great for you, for the way you use it, but there's a bit of a mismatch when it goes downstream and now somebody in analytics is using it for something that you didn't know about.
41:47
Renee Colwell, Revantage
So I'm going to show you what happened and what the result was and how we can fix it. Right. So then they're part of the solution, not just part of the problem.
42:00
Gorkem Sevinc, Qualytics
So you bring them in early, and you strike the balance of carrot and stick there.
42:07
Renee Colwell, Revantage
So I bring them in. I don't have much of a stick - I have to rely on other people to do the finger-pointing.
42:16
Gorkem Sevinc, Qualytics
Yeah, but your stick is that, you know, the big guys are gonna save you.
42:20
Renee Colwell, Revantage
I'm here to help you, right? I'm here to help you. And also, it's not just saying I'm going to bring you in early - I am going to bring them in early, but what does that mean? Does it mean I'm going to put a touch point for an hour twice a week on your calendar? No, you don't have an hour twice a week on your calendar. Does it mean I'm going to run an education PowerPoint deck to show you? Maybe a little bit, but I'm going to keep it short. I'm going to run short, targeted working sessions, and then I'm going to go away. I'm going to do a lot of the work to get you started, because you have to integrate things into their daily process, right?
43:19
Renee Colwell, Revantage
You've got to teach kids how to brush their teeth - but when is it? It's after dinner. You've got to hold their hand, and then they start to become self-sufficient. And it depends on the maturity level of each group. For some of them this is second nature already and they're on board with it, and for others, maybe not so much.
43:42
Gorkem Sevinc, Qualytics
That's interesting. Very interesting. I mean, company cultures also have a big effect here, of course. I know we're at time. I'm going to ask you both one last question, and that's going to be a very quick answer, not a long one. Do you think data governance should be a cost center or a value generator?
44:04
Matt Robuck, Georgia-Pacific
Yeah, I mean the short answer is it's got to be a value generator and it comes back to how we tie it to business outcomes and business problems.
44:14
Renee Colwell, Revantage
Yeah, I agree with that. I see it more like accounting - think of before accounting was invented. It's still a cost center, but it is ingrained, necessary; nobody will ever question it. So maybe a bit of both.
44:32
Gorkem Sevinc, Qualytics
I love that. I love that. Thank you both, Renee and Matt, for joining us for these 45 minutes - I really appreciated your insights. As I said, we are going to share the recording after this and send everybody some follow-up materials. Thank you both very much, and I hope you have a great rest of your day.
44:53
Renee Colwell, Revantage
Thanks everyone.
44:56
Gorkem Sevinc, Qualytics
Thank you.