Accelerating Value

Episode · 1 year ago

Quit Aiming for Data Perfection & Monetize

ABOUT THIS EPISODE

If you want to drive value from data, you have to understand the definition of value. It’ll be different at every organization, and most of the time it has nothing to do with financial metrics.


When you monetize data, you’re actually monetizing the manifestation of insights about your customers.

In this episode, I interview Bill Schmarzo, Data Science and Data Monetization Strategic Advisor at Dean of Big Data, about the meaning of value, the value of data, and the role of data analytics in data monetization.


In this episode we discuss:

- Why merely having data puts you in the negative percent of monetization

- How to avoid fixation on cleaning your data house

- How to avoid the cult of precision

- The ideal composition of a data science team

- Where the real value of AI manifests (hint: not with execs)

To hear more interviews like this one, subscribe to Accelerating Value on Apple Podcasts, Spotify, or your preferred podcast platform.

Listening on a desktop & can’t see the links? Just search for Accelerating Value in your favorite podcast player.

Today, every budget approval is an investment deal. If you're a marketer, sales, or business leader, you had to promise to deliver value and impact. Riding the wave to get there is hard enough. Finding your way through the storm is even harder. If you're looking for that path forward so that you don't wipe out, you've come to the right place. Let's get into the show.

Hi everybody, this is Mark Stouse with Accelerating Value, where we do our absolute damnedest to show you the way forward to create all kinds of value in your life and in your work. It's one of those things that we all strive for, and yet very few of us really understand the rules of the game. So today we have a professional umpire who is going to help us understand the rules of the game of value: Bill Schmarzo, who is known broadly in the data science space as the Dean of Big Data. They don't get any bigger or more credible. Welcome, Bill.

Thanks, Mark.

You bet. And I've got his book here, which, you know, I don't usually flog books, but this is actually a really, really outstanding book. As I have said to Bill, if you look at the cover, if you look at the title, you're going to go, "Oh my God, Mark Stouse wants me to read this book. How am I going to stay awake?" But let me just tell you that the person who writes the sparkling prose inside of this book may not be the same person who wrote the title. So if you're having a normal reaction to the cover, don't presuppose that you understand the book by its cover, because that would be a bad move here. One of the cool things about this book, and what I think is going to come out in this podcast, is that, Bill, you really know how to talk about this very authoritatively, very scientifically, very mathematically, that's your background, but in a way that means something to normal people. You know how to democratize these ideas, and that's where we're headed with this. So let's just get started. Let's jump in the deep end.

Mark, that's quite an intro. I'm not sure I can live up to that. All kinds of stupid is going to come out of my mouth now, and I'm going to be exposed. Regarding the title, the only way to have made the title even more boring was to somehow try to work the word broccoli in there.

Right. Well, you know, it's funny, because it definitely has the title of an academic treatise, and yet what's inside, even though it's very serious and very well thought out and well presented, the word "dry" is nowhere in the description of this book. Okay, so let's talk about this. By your own admission, you like to get people's attention, and so you use some phrases from time to time that are a little incendiary. So let's talk about the first one: extracting value from data. Just a few pages into your book, I think it's page four or five, somewhere in there, you write that data without analytics is an empty promise, which, if you are doing data visualization through Tableau and all this kind of stuff, seems a little bit of a tough comment. What exactly do you mean by that?

You're right, Mark, I do like to throw incendiary statements out there, because what I find you have to do to get people to listen is to sometimes get them pissed at you.
And I actually truly do mean that data without analytics is an empty promise. We have too many organizations spending too much time and money on the collection of data, on the management of the data, on the security of data, on the preparation of data, on the governance of data, and too little on the monetization of it. I was just talking to somebody about the DAMA data management cycle, and it's like, this is really cool, this is great, but it's all passive, it's all defensive. There's nothing offensive. No one's weaponizing this. No one's talking about how to monetize it.

And so the problem with having data isn't that the battle is half won by having data. The battle with data is zero percent won. It isn't like having data even gets you close. It's like saying you're going to go to the moon, so you climb to the top of a tree and say, "Hey, look, world, I'm closer to the moon." Well, you can't get to the moon from the top of a tree. And so the challenge we've got is that we've got to immediately take people and shake them out of their comfort zones. It's not about the possession of data; it's about how we're going to activate that data, how we're going to use that data. And, oh, by the way, the data itself isn't what we're going to monetize. We're going to uncover, buried in the data, those customer, product, and operational insights, those predicted behavioral and performance propensities. That's what we're going to monetize. And when I say monetize, it really isn't data monetization; it's the monetization of the insights about your customers, and how you're going to use that to improve customer retention, to improve cross-sell. How are you going to use the operational insights to reduce unplanned downtime? How are you going to use the operational insights to reduce carbon footprint? Everybody has these business initiatives, which have value associated with them. Data in itself isn't going to help, but if you can uncover those relevant insights, that's what you apply. So again, having the data isn't half of the battle. My incendiary statement here is that having the data is zero percent. In fact, I'm going to argue it's a negative percent, because data costs. There's a cost to having data: you've got to store it, manage it, and so on. So having data alone doesn't provide any value; it actually costs you money. It takes you deeper into the hole. The minute you want to start monetizing it, the first thing you've got to do is dig out of the hole and start getting into the positive.

I mean, it's one of those things where it is necessary, clearly. You can have an analytical construct, and if you don't have any data to arm it with, you've got a problem. But beyond that, you're exactly right. It is essentially the storage of unrealized potential value.

Yeah, latent value. It's all latent value; it's got potential value, but in and of itself, by itself, data is a cost.

So here's another one, and you talk about this a lot in your book: you talk about a maturity curve, and companies and teams move across this arc of maturity when it comes to data and analytics and the way they see these subjects. Perhaps one of the most common things that I hear, and that echoes pretty widely in the data science world, is the statement from mainly operational teams, but also from senior leaders who don't know any better and who rely on their operational teams, and that is, "Well, our data house isn't in order. We need to get it in order first before we do anything with analytics." What's right and what's wrong about that statement?

By the way, I like the way you use the phrase "data house," and that you avoid all these other terms, these religious battle terms, such as data warehouse, data mart, data mesh, data fabric. "Data house," it's just this thing. We do tend as an industry to love religious battles over technology and how to term something.
I was in the middle of the data mart versus data warehouse battle between Ralph Kimball and Bill Inmon for years, a battle that yielded no value and that cost hundreds of hours of time, wasted cycles, and millions of dollars. But we get into these religious battles. Anyway, back to the question. I love the fact that you don't get into religious battles about technology, because the only people who care about those religious battles are the people who aren't focused on trying to drive value. No one cares about those anymore. A rose is a rose, so quit it.

So this challenge of "I can't do anything until my data is perfect" is a great excuse to do nothing. And I don't think people do this on purpose. They make this argument because they're not certain what's over the horizon for them. All they know is, "People are going to need my data; I need to make it as clean as possible." What they don't know is which data is most important for the problems their customers are trying to solve. How good is good enough? How clean does it need to be? You can spend an infinite amount of money cleaning and aligning and governing your data, but have the conversation about the data itself: not all data is of equal value. So when you say we're going to make sure all the data is clean, well, probably ninety percent of the value of your data is going to come from a handful of data sources. First off, let's focus on those. And how do you know which data sources are most valuable? You'd better understand the business initiatives the customers are going after, your business users and the use cases that are going to support those initiatives, the decisions they're going to have to make to support the use cases, and the KPIs and metrics against which you're going to measure progress and success. Now you have a fighting chance to say, "Well, if that's where we're going, then you need these three or four data sources," and now I can focus my efforts on improving those four or five data sources, curating them, making sure they're solid, that I can reuse them, and that they're trusted by the customer.

So the whole conversation with data, I believe, has to start by understanding this: if we want to drive value from data, then we need to understand how the organization defines value. Think about that term. How does the organization define value? Some organizations, talk about digital media companies, define value by clicks and views and likes and things like that, and they want data around those because they're trying to optimize them, and those are the KPIs that are going to measure success. That's how they define value. But we are learning that value defined only on financial metrics is trouble. It can lead to very unfortunate, unintended consequences. So we need to look at society and confirmation bias and diversity and all these other ramifications that go into the definition of value. I'm going way off track here, Mark, but the point is that the conversation that we as data scientists, as data managers, as data aficionados have to have is to understand, first and foremost, how the organization creates value. That's a tricky conversation. That's not a "show up for a two-hour meeting" conversation. When I run these vision workshop exercises with customers to try to understand how they create value, they can sometimes take two weeks, because you've got to talk across a whole wide range of stakeholders who are all looking at value creation slightly differently. But I can't even start to figure out which data is most important if I don't first know how the organization defines value and how we're going to leverage analytics, going back to your previous question, to take the data and uncover those customer, product, and operational insights that drive those sources of value.

We have to take this approach of ours and turn it upside down. Don't start with the data and work your way up and hope that you find value. Hope is only a strategy in the cosmetics industry. Instead, if we want to drive value, let's start where value is created and then decompose it into the analytic and data elements we need. And you know what? When you do that, Mark, it doesn't make the process mundane, but it makes it manageable, and that's what we have to do: take this very complicated conversation around data and analytics and put in place a framework that allows us to manage how we're going to collaborate with our business stakeholders to create, or in your case accelerate, value.
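To make the top-down decomposition Bill describes concrete, here is a minimal sketch in Python. It is an editorial illustration, not from the episode; the initiative, use cases, and source names are all hypothetical. The point it shows is that once you map initiative, use cases, decisions, KPIs, and data sources, the handful of sources worth cleaning first falls out of the map.

```python
# Hypothetical illustration of Bill's top-down value decomposition:
# business initiative -> use cases -> decisions -> KPIs -> data sources.

initiative = {
    "name": "Reduce unplanned downtime",
    "use_cases": [
        {
            "name": "Predict bearing failure in wind turbines",
            "decisions": ["When to schedule preventive maintenance"],
            "kpis": ["Unplanned downtime hours", "Maintenance cost per turbine"],
            "data_sources": ["vibration_sensors", "work_orders", "weather"],
        },
        {
            "name": "Prioritize technician dispatch",
            "decisions": ["Which site gets the next crew"],
            "kpis": ["Mean time to repair"],
            "data_sources": ["work_orders", "site_telemetry"],
        },
    ],
}

def prioritized_data_sources(initiative: dict) -> list[str]:
    """Rank data sources by how many use cases they support, so cleaning
    effort goes to the handful of sources carrying most of the value."""
    counts: dict[str, int] = {}
    for use_case in initiative["use_cases"]:
        for source in use_case["data_sources"]:
            counts[source] = counts.get(source, 0) + 1
    return sorted(counts, key=counts.get, reverse=True)

print(prioritized_data_sources(initiative))
# ['work_orders', 'vibration_sensors', 'weather', 'site_telemetry']
```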
You know, it's interesting. Not too long ago I was meeting with a big pharma CFO, and he was very frustrated with his marketing department, with the fact that he couldn't get a clear portrait of value and return. He understood time lag as a finance guy, so he was even talking about that. And I said, "Well, have you ever sat down and had that conversation? Have you ever essentially outlined: this is the game we're playing, these are the rules of this game, this is how you score points, this is how you win, this is how you lose, this is how you end up in the penalty box," all that kind of stuff. And he was like, "Well, no." And I said, "So what, they're just supposed to read your mind?" It sounded like a very confrontational conversation. It really wasn't, and it got to the end in what I think was a very constructive place, because of exactly what you're talking about. They realized that they had to have a conversation that really pulled all this stuff out into the light, and that it wasn't enough for him to get frustrated and curse the darkness. He had an equal responsibility to the CMO, and for that matter other people in the company, to strike a match.

That's not covered in this book, but in a previous book I wrote, called The Art of Thinking Like a Data Scientist, the starting point for the entire data science process is this very detailed value conversation, and it has to be held not just between the data scientists and the business stakeholders. More times than not, it's between the different business stakeholders, who all have a slightly different view of how they create value. Now, I will tell you that in those diverse perspectives, in the conflict of those diverse perspectives, that's where innovation happens: when you're trying to optimize not this or that, but instead trying to optimize this and that. I use lots of design thinking techniques in my Thinking Like a Data Scientist book, which is, call it, a mentality of abundance that looks at the "and," not the "or," and the ramifications of doing that. And I do talk about this slightly in the book: what we want to do as organizations is not compromise to the least-worst option, but synergize to the best-best option. That means understanding how everybody defines value, how they win, and figuring out how you create environments where everybody can win. Now, not everybody can win in total, but you can find that there are assets that different people bring around how you create value, and when you merge those, blend those, synergize those together, you do get that innovative effect of one plus one equals seven, or thirty-one, or six thousand.

So let me, before we move on, ask you a follow-up on that, very pointedly: is it even possible to thoroughly understand and judge the quality of your data without the analytics?

No. No. I mean, the classic example is, and this happens to data scientists every day, somebody in the business comes to us and says, "Here's some data. Tell me what's interesting." And my response is always, "Well, how do you define interesting?" If we don't understand what interesting is, how do I distinguish signal from noise in the data? So "what do you mean by interesting?" gets back to the "well, what is valuable to you?" kind of conversation. And that's a conversation, by the way, that sometimes data scientists are afraid to have. The business user just said, "Tell me what's useful in the data." So they go through, they do a bunch of descriptive analytics, and they come back with a bunch of stuff, and the usual reaction to descriptive analytics is, "I can't use this. Am I going to make a decision on this? How am I going to optimize my cross-sell?" It's, "You, data scientist, screwed up totally."

Actually, a long time ago, when Proof was just starting, we had an early customer, a very, very large, very famous technology company that, fairly uniquely for the time, this was five years ago, had a large dedicated analytics and data science team for marketing. The problem was, and this goes to your point, that they were all PhDs. In fact, they all kind of came out of central casting for that kind of person: the pocket protectors, the whole deal. And they're great people. I worked with all of them at one time or another, and they're very smart, but they had absolutely no knowledge of the business, and they had no knowledge of marketing. So the natural next statement would be, "Well, they just need to talk to the business and talk to marketing." Well, that didn't happen.
The marketing department was a very couture-oriented marketing department, and so those people really didn't have any interest in talking to the guys with pocket protectors, and the business guys were too busy. And so essentially what happened at the end of the day is that the data science team had a lot of uncontextualized data that they were forced to essentially p-hack their way through, to see if they could somehow stumble on something that was important.

Ah, yes, the hope strategy. They were probably also going to sell me eye creams and things like that to reduce the lines and take years off my face.

What was kind of interesting about it was that I was the one, because this was back when Proof had like two people in it, I was the one who sat down with them and said, "Hey, here's a list of ten or fifteen business questions that could operate as our hypotheses for the time being. Let's wrap data around these hypotheses and see where we come out." Now, what was interesting is that all of a sudden the reports changed dramatically, and the business guys thought it was the best thing in the world, and unfortunately the marketing guys weren't too thrilled with it. And it continues; it's functional today in that company. But it's interesting how important the issue of context is, and the issue of "where I sit determines where I stand" on all this stuff.

It's really important. So, Mark, one of the things that we do when we do a project is we have a data science team, obviously, because this idea that you're going to have a singular unicorn data scientist who can do it all is a fallacy. There are five of them in the whole world, and they all work at Google, so you're not going to get any of them. So what you do is you create a team, but the team has to be diverse. You want to have arguments and disagreements within the team, not for the first time outside the team. So our teams include your data scientists, data engineers, ML engineers, all the right technical people, but we also include what we call value engineers. The value engineer's job is to define that value statement. And we also include subject matter experts from the customer; they're part of this team. Every one of my teams has a design thinker in there, whose job is to drive this collaboration and the ideation that takes place, and we have UI mock-up people and things like that. But the one thing we always do is use a team-of-teams approach and organizational improv, which means anybody on one team could move to another team and still maintain operational integrity, because we had a common methodology, common terminology, a common framework, and a common intent. This organizational improv is like watching a great soccer team play. There's movement on the field, but there isn't a coach sitting above them saying, "No, John, you move here; Mark, you go there; Bill, you do that." No, they've already established the common language, the common framework, the common intent, and when they're on the field, they're communicating with each other. We as organizations need to do the same thing. It seems weird that I'm talking about organizational structure in a conversation around data and analytics, but the heart of understanding and defining value and applying the insights buried in the data to create that value is a point of conflict. It will create conflict, and it's best to have a team in which that conflict is happening. You don't want to shield the team from conflict; I want the team to have conflict, because they're the ones trying to drive this ideation and innovation, so that when they go out to the business users, they've already encountered a lot of the issues those business users are going to be concerned with, and they've already had a chance to wrestle with how we're going to roll it out, how we're going to operationalize it, how we're going to scale it, all those kinds of questions that come out from the business users.
That actually brings up a great point. If you talk to a lot of business leaders about data scientists, you hear stuff like, "I've never met a more argumentative group of people." You hear phrases like "the cult of precision," and what the business leaders are really talking about here is that they feel frustrated that they can't get an operationally adequate answer from a data scientist without innumerable caveats.

You've raised a good one. So, as a data scientist, when is good enough good enough? You will get into this cult of precision if the data scientist doesn't know what good enough looks like. And what a good data science team is going to use as the basis for determining when good enough is good enough is the costs of the false positives and false negatives. That's what tells them: do I need ninety-two percent accuracy, or ninety-eight percent, or ninety-nine point nine? Well, what are the costs of the false positives and false negatives? And who determines the costs of the false positives and false negatives? It ain't the data scientists; it's the business stakeholders. They're the ones who have to tell you: if we do this marketing campaign and we show the wrong ads to the wrong people, what's it going to cost us? Well, you know, the cost of bad marketing isn't very expensive. Oh, we're going to serve up vaccinations to people? What's the cost of getting that wrong? What precision level do you need on vaccinations? So the challenge, when I hear organizations talk about that, and I've never heard this term "cult of precision" before but I love it, actually boils down to the fact that there is a lack of ownership on the business side to understand clearly what it is we're trying to do, the decisions we're trying to drive, and what the costs associated with the false positives and false negatives are, so the data scientists can know when good enough is good enough. And, by the way, the data science team does argue a lot. They're going to try different variables and metrics, different feature enrichments and hyperparameters; they're going to try all these different techniques to get to that particular point of accuracy, and they're going to yell and scream. It's a process full of failure, but if you don't fail enough times, you don't learn enough times. So it is certainly a sausage-making opportunity, but when the sausage is done is determined almost entirely by the business users and their articulation of the costs of the false positives and the false negatives.

You remind me of a story that I'll share just real fast. When I was at Honeywell Aerospace and we were doing a lot of this work, Dave Cote, who was then the chairman and CEO of Honeywell International, was really a major champion of what we were doing in this area. But he heard that I was spending a fair amount of money, like seven figures, attempting to get even more accurate, higher-confidence numbers. We were routinely in the eighty-five to ninety range, and I was trying to get to ninety-five plus, and he chewed me out. Chewed me out, and said, "This is not even remotely appropriate or relevant. I make big-time business decisions on sixty-five percent confidence level stuff all day, every day, and you're already bringing me stuff that's in the mid-eighties. So if I hear that you're tilting at the windmill, so to speak, of ninety-five percent again, you're out of here, and I don't care how good you are." And I thought, you know, he was sort of rough like that, maybe not the nicest guy in the whole wide world, but it was incredible guidance, and it was exactly what you're just talking about.

Yeah. When I was at Yahoo, we were building models to try to figure out, when somebody came to a site, what they were most likely interested in. If I had a sixty-five percent accuracy level predicting that, I could run around naked in the building. That was great, because the cost of being wrong, the cost of me showing you the wrong ad, was a fraction of a penny. Which is very different from the cost of getting a COVID vaccination decision wrong, or the cost of knowing when to replace a ball bearing in the engine of a turbine in a windmill.

Absolutely. And I think the other part, which we're now just really starting to explore via automation, is that one of the reasons why a lot of data scientists land on ninety-five percent as the number is that the time between recomputes on a lot of these models can be pretty sensitive.
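Bill's test, that the business's false-positive and false-negative costs set the accuracy target, reduces to a back-of-the-envelope expected-cost calculation. Here is a minimal sketch with entirely made-up costs and error rates; the numbers are illustrative assumptions, not from the episode.

```python
# Hypothetical numbers: the business stakeholders, not the data
# scientists, supply these costs.
COST_FALSE_POSITIVE = 0.002  # e.g. serving a wrong ad: a fraction of a penny
COST_FALSE_NEGATIVE = 0.05   # e.g. a missed conversion opportunity
# For a vaccination or turbine-maintenance decision these costs would be
# orders of magnitude higher, which is what pushes the required accuracy up.

def expected_cost_per_decision(fp_rate: float, fn_rate: float) -> float:
    """Expected cost of model errors per decision, given its error rates."""
    return fp_rate * COST_FALSE_POSITIVE + fn_rate * COST_FALSE_NEGATIVE

# Compare two candidate models: is the extra accuracy worth paying for?
cheap_model = expected_cost_per_decision(fp_rate=0.10, fn_rate=0.10)
fancy_model = expected_cost_per_decision(fp_rate=0.02, fn_rate=0.02)
decisions_per_year = 50_000_000

savings = (cheap_model - fancy_model) * decisions_per_year
print(f"Annual savings from the fancier model: ${savings:,.0f}")
# If the savings don't exceed the cost of building and running the fancier
# model, the 85-90% model is good enough: Dave Cote's point exactly.
```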
Yes, it may be six months, nine months, twelve months. So this is sort of a way of buffering against some of that degradation that's invariably going to happen. As automation takes over and you're able to not only predict but check your predictions almost in real time with new data flows, that changes the whole issue dramatically.

Yeah, we can build and test models much more quickly. And, by the way, when we're building the models, many times we're not building them on the entire data set. We're taking representative data samples and building the models on those, because we don't want to waste our data science team's time screwing around with data management issues. Just take a workable data set, build our models on that, and then test it on the full data set. That's pretty common practice, because you do want the ability to iterate. You're going to try a lot of things, and you want the ability to try lots of things in order to figure out which combinations of variables and metrics are going to yield better predictions of performance.

Okay, so then let's talk about this, because you're giving me some really great segues here. The math around all this has been around for a long time; the math is not particularly new. Multivariable linear regression, depending on who you talk to in data science, is pretty much the bedrock for seventy to eighty-five percent of whatever it is you want to understand about life. Certainly business, but also in the physical sciences and social sciences you see it as a fundamental underpinning of the scientific method of inquiry. So given this, given the fact that the math isn't really the issue, what are the big obstacles for companies that want to operationalize this kind of decision science and can't?

This is a great question, and we're going to get to it in a second, because I'm going to first take a shot at your opening statement: "the math is not new." I think actually it is. I think it's dramatically new. The things we can do with deep learning and neural networks and reinforcement learning are truly game-changing in our ability to create autonomous devices and autonomous processes that are continuously learning and adapting, in many cases with minimal human intervention. So I do think the math has changed dramatically. But your point is spot on in the sense that a lot of the basic multivariate regression analysis has been around forever. I mean, God, I learned it when I was in college, and it hasn't really changed much.

So why? Well, the problem's not the math. The problem is in the value: how do we create value? We know the math works, and that's a pretty comforting fact. The frame of the conversation shouldn't start with the model. It should start with: how do we create value? Who are the key stakeholders? What are the decisions they're trying to make, and what are the KPIs and metrics against which you're going to measure progress and success? You're going to hear that from me every thirty minutes; those words come out of my mouth whether I'm having dinner with my wife or sitting in the middle of church, I just blurt them out. And so the challenge is that, as technologists, we've always been amazed by our ability to make the dog walk on its hind legs. I've got to tell you, that's pretty cool, but I don't want that dog bringing my beer to me. So we have to move beyond the "look what I can do" kind of thing and start the conversation with "look what I can deliver." There's a blog I wrote on transitioning from outputs to outcomes. We spend too much time talking about "look at this great dashboard I can build" or "here's a model I can build, isn't that cool, look what it can do." No one on the business side cares about outputs. What they want are outcomes. How does that improve customer retention? How does that improve marketing effectiveness? How do I use that to make sure I'm retaining my most important employees? Well, how do you define important? And so on. The conversation needs to take place first around the business and what creates value, before we start figuring out these shiny objects, some of which, like you said, have been around forever. So I think that if you want to change the game, you have to change the frame.
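The build-on-a-sample, test-on-the-full-set practice Bill mentions above can be sketched in a few lines. scikit-learn and the synthetic data here are assumptions, since the episode names no tooling; the point is that iteration happens on a representative sample, and only a promising candidate gets scored against everything.

```python
# Minimal sketch of "build on a representative sample, test on the full set".
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)

# Stand-in for the full production data set.
X_full = rng.normal(size=(100_000, 10))
y_full = (X_full[:, 0] + 0.5 * X_full[:, 1] > 0).astype(int)

# Iterate quickly on a representative 5% sample instead of wrestling
# with data-management issues on the full set for every experiment.
sample_idx = rng.choice(len(X_full), size=5_000, replace=False)
model = LogisticRegression().fit(X_full[sample_idx], y_full[sample_idx])

# Once a candidate looks promising, check it against the full data set.
print("Full-set accuracy:", accuracy_score(y_full, model.predict(X_full)))
```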
We need to pivot the frame and stop talking about the math, which no one gives a crap about. I have an undergraduate degree in math, and that got me zero dates, by the way. "I've got a degree in math." Snore. Next, swipe right. So no one cares about the math, no one on the business side. It's like this conversation I had with Doug Laney, and Doug Laney's brilliant; he has been a leader in this space for a long time in trying to drive value, and Doug, of course, got famous because he's the guy who coined the "three V's" of big data: volume, variety, and velocity. I said, "Doug, no executive I've ever talked to gives a shit about the three V's of big data. They care about the four M's of big data: make me more money." So let's change the frame from the fact that we've got these really cool technologies. "Look, I got spinners on my car. Aren't they neat?" No. How do I get better outcomes? How do I improve my gas mileage? How do I get from point A to point B? The spinners don't help me get from here to there more quickly, so stop talking about spinners. Anyway, I'm off on a tangent here, but I do think that once you've changed the frame of the conversation, everything becomes easier; every conversation you have from that point forward becomes easier and more meaningful, because the business users now know that you're trying to help them, and you've got math that can help them do this better. And they've seen enough about what companies like Google and Microsoft and American Express and Tesla are doing with great math: "I want to do that too." But we have to, as citizens of data science, stop talking about data science and start talking about how we create value.

I really agree. This is a totally different application of it, but there was a documentary made about eight or ten years ago which is essentially an interview with former defense secretary Bob McNamara. Before that, in World War II, he was a data scientist before they called it that, doing prediction around how many bombs we would have to drop on Japan in order to get a particular outcome. And he was really talking about the fact that, at the end of the day, nobody cared about how he got his projections, only that they were proven correct over time. It just goes to show you that whether it's marketing or whether it's war, it's really the same from the point of view of a data scientist.

You're giving me, Mark, a really good little analogy here, which is: don't use a neural network when a multivariate regression model would work, just because a neural network is really cool and does cool things. It's so true. It just came into my head how many times I've had a conversation where I sat down with a data scientist: "Why are you using a neural network? We don't have enough data. Just use a multivariate analysis, use a cluster analysis to segment." So again, we only care about the outcomes. We don't care about all the fancy things you're doing with the technology.

And I think the other part of the problem here, though, is that so many of the technical words have been twisted out of shape by the way they've been used in the marketplace. For example, I think most people think that AI, specifically machine learning, can deliver a portrait of causality, when it can't. It can show you really amazing patterns, which may or may not be relevant causality, but there's no magic here in that sense. Another example: for a lot of people who took a high school or college stats class, one of the few things they really remember is the phrase "correlation does not imply causation," but they don't really understand what that phrase means, and so they tend to use it to discount any data science that makes them feel threatened in their life and work.

It's an interesting conversation, Mark, because, and this is a conversation we had when I was at Yahoo, it's almost the death of "why." If I'm testing whether a pink or a blue button draws the most clicks, and for whatever reason it's the blue button, I'm going to show more blue buttons. I don't know why. To be honest with you, I don't have time to care why. I just know that the blue button works better.
Now, I'm going to use A/B testing and that kind of thing to validate that blue is better than pink, and if over time we see that green actually is better than blue, then we'll switch. But it's the death of "why." And in some cases that's okay, because not knowing the why isn't a big deal. But in other cases, transparency on the why, which is what we're seeing with neural networks, understanding the weights and biases that are buried inside the neural networks, is becoming critical. Think about the Fair Credit Reporting Act. If a model says, "I'm not going to give you a loan," you have every right to say, "Okay, tell me why. Why did the model say I'm not going to get one?" Because it had better not be based on race, gender, age, any of the areas of discrimination. So we are struggling. This is a really good question, Mark. It makes me think about how we've gone through this process of not worrying, for a time, about understanding why things happen, the causality. But now we're coming back around to the fact that, when you expand the measures for the value statements against which you're going to measure success, understanding why becomes more critical.

Absolutely. There's a blog in there somewhere.

Yeah, I need to remember that one.

Particularly if you are actively trying to do that thing and make that thing happen as a result, then causation becomes really, really key, because it's the test of your hypothesis. Okay, so here's another one. I actually got a lot of these from different people who really wanted to hear what you had to say. There's an observation that very operationally focused people and teams, and this could be any kind of team, but these are people who are basically making the trains run on time on a daily basis, see data and analytics differently than those with a more strategic or more integrated understanding of a question or questions. And if, for example, you're an analytics vendor and you're selling into this mix of people, you'll typically get buy-in from the senior-most people, who then find themselves scared off by what the more operationally oriented people have to say about it. How do you see this? Have you run into it yourself? This person's actually really interested in hearing this answer.

Yeah, it's a really good point. And as we move into even more of these advanced analytics, the reinforcement learning and deep learning things I would consider to be more AI, this becomes even more critical, because I would argue that AI is going to impact the frontline employees much more than it's ever going to impact Mahogany Row. Think about the ability for a technician: a part is failing, machinery has gone down, and every thirty minutes that machine is down costs X number of millions of dollars. How do I get that machine up more quickly? Well, that's a great environment for an AI assistant to make recommendations, to do root cause analysis, to bring together all the right information. And you can start seeing how AI could start predicting ahead of time that a part is going to break, that it's going to break in the next twenty-four hours. So the idea is that AI can be like an assistant that sits on the shoulder of the frontline employees, whether they're frontline customer-engaging or frontline operations-engaging. This is where it really has huge impact, and optimizing that creature sitting on your shoulder, the dialogue between the human and the AI to continuously learn and adapt, becomes very valuable, to make sure that machinery, instead of getting it back up more quickly, never friggin' goes down in the first place, on a reasonable basis. Those are the things that impact the frontline people. It isn't like these little AI models are going to tell you, "Well, you need to go out and acquire a different company." They're not there. The real benefit of AI isn't at the senior executive level; it's at the operational level. Again, it's that point where you're in a call center and somebody's called in, and the system is listening and sees that this person is really stressed out, and here are their value scores, and it says, "I think we need to do something for them. How about we do this? How about you do that?" It's giving recommendations in real time to the person who's on the phone, making sure that very valuable customer is calmed down and their problem gets solved for them.
This augmentation of the frontline is going to be a really important place. So, getting back to your question, the people you need to win over are the operational people, because they're the ones who are going to not only get the most value from these kinds of things, but, because of the interaction they can have with an AI/ML model, this AI-human interface, they're also going to generate new sources of value that aren't going to come from Mahogany Row. It just doesn't help executives the same way. So I think this is a great question. It's all about the operational people, which, by the way, comes with its own set of problems: "Is this robot going to replace me?" Most times, no. It's going to make you more effective. You can maybe support more customers with more success. In one case, we had a recommendation engine delivering recommendations to fraud investigators. Fraud investigators get paid a bonus on how many fraud cases they resolve. So the AI model is saying, "Don't target that fraud case; this is the one you want to go after, and here's all the information you need." If you're helping people be more effective and helping them make more money, they're all in. I mean, people are coin-operated for the most part. So again, it gets back to your point: as we start talking about AI/ML, deep learning, reinforcement learning, this continuous learning and adapting environment, it's the operational people who not only need to be bought into this, but need to feel comfortable feeding in their own learning, so that you have this symbiotic relationship around continuous learning and adapting.

So, with that in mind, here's another question that's right in line with this: how would you explain the interaction between causal analytics and machine learning in terms of building out the entire portrait of value?

I've never thought of them as being separate, and maybe that's me being very naive here, because they're not.

But I think operationally they are, in many organizations.

What I'm trying to do operationally is to make better decisions, where "better" is more accurate, more timely, or whatever the value statements of the organization are. That's how I'm trying to make better decisions, whether I'm using reinforcement learning to continuously learn and figure out whether I should replace a part, or using multivariate analysis combined with some cluster or segmentation analysis to figure out which parts to replace. In reality, your data scientist is probably using a combination of lots of different things in order to solve a problem. Many times we use, for example, unsupervised machine learning to help us tag data that we then apply supervised machine learning to in order to drive at that causal factor. So that's probably a question better answered by a much smarter person than me, because all I'm trying to do is use the different analytic tools I have to drive better decisions. And again, the challenge I face is much more in the definition of "better" than in the definition of which tools to use.

Okay, so last question. You've made a statement in your book, and I'm paraphrasing, but I don't think I'm doing violence to your idea, that the economies of learning will always outperform the economies of scale.

Fair statement. So here's what I see: in knowledge-based industries, the economies of learning are more powerful than the economies of scale, because the ability to learn and adapt faster than your competition is what will actually, ultimately win for you. Building massive crystal palaces in the sky isn't going to be nearly as important as the ability of the organization to create both AI-enabled devices and empowered humans that continuously learn and adapt. We had a pandemic hit us, right? The organizations that did really well were the ones that were very flexible and agile. They had AI/ML models, in some cases, that helped tell them which customers were increasing usage, and they also had humans on the front line who were seeing this and sharing it. We know we're in a world of constant transformation: pandemics, economic transformations, technology transformations, environmental ones. It's constantly happening to us, and the idea that we can build a model and have that model work indefinitely for us isn't going to happen. When Moneyball came out, when Billy Beane came out with his concepts, he was leagues ahead of everybody else.
Right, it was like, "Oh my gosh, he discovered the secret sauce," until everybody started copying him. Then the Boston Red Sox win the World Series for the first time in about eighty-six years, and then the Chicago Cubs win the World Series for the first time in a hundred and eight years, all using a lot of the same techniques. So the world is constantly transforming. My issue with the economies of scale is that we think we can build these big monolithic processes, frameworks, and plants, but they're built in the frame of what the world looks like today, and we know the world is constantly transforming. So, the ability for us to learn: we humans have always had the ability to learn, but we do a shitty job of empowering humans, to be really honest. We put people in boxes, not in swirls. People sit in a box: "You do this. This is your box. And God help you if you go talk to somebody else in a different box." But if we empower people, they're continuously learning and adapting. They see what's going on, they make changes, they communicate those changes. And now we can build AI and machine-learning-enabled devices that are doing the same thing. A Tesla is basically a continuously learning and adapting asset. Every time it drives around a corner, it's getting smarter. And the beauty of it is that any problem the million Teslas on the road encounter in a day gets sent to the Tesla cloud in the sky and back-propagated to all the cars, and now every other car knows how to handle that same problem without ever having to experience it. So this learning mechanism can't just be with AI and ML; it also has to be with empowered humans. And when we do that right, it's not humans who are constantly putting the same bolts on the car; it's humans figuring out how to do that more effectively. Well, maybe the car shouldn't even have a friggin' bolt. And so the transformation is all about this ability to continuously learn and adapt. The economies of learning are more powerful than the economies of scale in a knowledge-based industry where things are constantly changing. And I'm going to conclude by saying that I believe every industry is a knowledge-based industry in a world of constant change. So the only way companies are going to survive against their competitors is if they can out-learn them and out-adapt them. Otherwise, they're going to be next year's, you know, Blockbuster.

You know, I'm actually a pretty deeply committed historian, and one of the more famous quotes from George Washington during the Revolution is all about this choice, which is really not a choice, that was being presented to him by the Continental Congress: "We can give you more money for more men, or we can give you more money for more understanding, more intelligence about the enemy. Which one do you want?" And he picked intelligence over more guys. He said that the importance of intelligence in war is so great that nothing more needs to be said about it. So he was faced with "do I scale my operation with more guys, or do I get smarter?" and he picked the smarter part. I'm going to try to find that quote.

That is a great historical point to the fact that the economies of learning are more powerful than the economies of scale.

Man, what a great conversation. Thank you, Bill. This has been awesome. If you're up for it, I think we'd love to have you back for kind of a part two, because there's a lot here to talk about, and this whole issue is so on the front line right now, particularly after 2020. The speed and volatility of change meant that more people than ever before realized that they couldn't intuit their way based on their data, and that past is not prologue, which was essentially the premise of what passes for a lot of data-visualization-type stuff.

So, Mark, envision, imagine a world where you've got AI and ML models that are constantly looking at historical data to uncover patterns, trends, and relationships you can use, and you've got empowered humans who are leveraging their natural-born curiosity and creativity to look over the horizon at what's next. You bring those two together: what a powerful combination. The problem in that scenario isn't the AI/ML models, which are getting there. The problem is the humans. And here's my point: what is one of the things that most distinguishes humans from machines?
One of the most important human characteristics we have is curiosity, and we are all born with natural curiosity. I like to tell a story about taking my dad's radio apart and then putting it back together, and realizing that there were a lot of extra parts my dad didn't need, and that the radio didn't work either. Oops, right? But what happens to humans is that we immediately start applying standardized testing, standardized curricula, standardized schools, standardized operating procedures. We do everything in our power to take that natural, powerful human curiosity and rub it right out. The humans that are going to survive in this world of transformation aren't going to be the ones trying to out-learn these machine learning, deep learning, reinforcement learning, federated learning, transfer learning, active learning kinds of mechanisms. No, we're not going to out-learn them. We're going to have to be more creative, and that means we need to nurture, again, that natural curiosity, and that's going to have an interesting ripple effect through our school systems: what are we doing to nurture curiosity? My favorite example is that I already gave up competing with a five-dollar calculator over who can calculate a square root more quickly. Game over. Calculator, for five bucks, you win that battle. But I can now focus on the next thing. I don't need to worry about how to calculate a square root. What can I do with it? How do I envision? How do I ideate? How do I brainstorm? How do I collaborate? So anyway, I think we're at a really interesting junction point over the next four to five years, where more and more AI and ML is going to take on these mundane tasks to free up humans to be more creative, to be more curious, and to drive innovation. Damn, it's a good time to be alive.

I agree absolutely. Thank you so much, and we look forward to having you back.

Call me anytime, Mark. Thank you.

The sooner you can optimize your marketing spend, the quicker you can start delivering clear, measurable value to your business. That's exactly where Business GPS from Proof Analytics can help. Learn more at proofanalytics.ai. You've been listening to Accelerating Value, where raw conversations about the journey to business impact help you weather the storm ahead. To make sure you never miss an episode, subscribe to the show in your favorite podcast player. Until next time.
