Accelerating Value

Episode · 1 year ago

Decoding the Red Hot World of Data & Analytics


The term decision support came about 70 years ago. Throughout the years, the discipline has gone through different iterations from machine learning to business intelligence, but at its essence, it’s still simply about using data and analytics to support people in making decisions.

Want to know more?

Join us as Donald Farmer, Principal at TreeHive Strategy, leads us through the complex world of data and analytics.

We discuss:

- The relationship between data and analytics

- The analytics process

- Working with AI

- Why data analytics needs to match the rhythm of the business

- How automation helps you do business in a new way

Keep connected with Accelerating Value on Apple Podcasts or Spotify.  

Listening on a desktop & can’t see the links? Just search for Accelerating Value in your favorite podcast player.

Today, every budget approval is an investment deal. If you're a marketing, sales, or business leader, you had to promise to deliver value and impact. Riding the wave to get there is hard enough. Finding your way through the storm is even harder. If you're looking for that path forward so that you don't wipe out, you've come to the right place. Let's get into the show. Hi everybody, this is Mark Stouse with Accelerating Value. There are a ton of questions out there. We get them all the time: hey, analytics, data, all this kind of stuff. It's red hot right now. There are a lot of people who are trying to figure out how to move forward in this whole area in their business, in their organizations. What does that look like? What should their expectations be? All these kinds of questions are bubbling up pretty much constantly, and it's happening in a context of stress and speed, lack of information, misinformation, all kinds of stuff. So we are here with Donald Farmer, who is going to help us with this. Welcome, so glad to have you. Thanks very much, Mark. Yeah, it's a fascinating time to be having this conversation. I'm delighted to be having it. So let's start kind of like at the beginning. What is the relationship between data and analytics? How would you describe that? Well, when you say start at the beginning, let's go back seventy years. Okay, absolutely. Which is before my time, I should point out. Seventy years ago people talked about a discipline called decision support. That's when that term first started to be used. I think that is still the best term for what we do in the world of data and analytics. We support people in making better decisions, and we've had all sorts of names come along, like machine learning, business intelligence, all these things. You know, we're all familiar with these terms, but ultimately those were marketing terms.
In practical terms, what we're still doing is decision support, and so the relationship between data and analytics is this: we have the data that we collect in our business, in our lives, in our work; we have decisions that we need to make; and analytics, if you like, is the bridge between what data we have and what decisions we need to make. And I know that sounds super simplified, but I think it is as simple as that. Analytics is how you get from data to decisions. So there's a point of view that's been advanced that you can have all the data in the world, and if you don't have the analytics, you don't have much. That would appear to square with this metaphor of the bridge, right? You can have all this stuff on one side of the river, but if you're not connecting it to anything... So, if I read you right, is that pretty much it? That's absolutely right. You know, I often say, in fact I think it might even be the tagline on my website, that data without analytics is a wasted asset. We often talk about data as an asset, but actually data just sitting there isn't doing much for us at all. Using the data nearly always requires some form of analysis in order to extract from the data the information which is inherent in it. So really, I'm quite serious when I say that data without analytics is largely a wasted asset. Is it possible to even know whether you have good data, or the right data, in the absence of analytics? That's an interesting question. It also raises another kind of set of issues, which is: what on earth do we mean by good data? I'm going to suggest that good data is good for a purpose, in which case you need to have a good understanding of what the purpose of the data is in order to know if it's good enough. Really, the only definition I can give you of data quality is: is it fit for purpose? Is it fit for what we're doing with it?
I can give you a great example of that, an anecdote from my own experience. Many years ago, over twenty years ago, I was working on a project for a European credit card processor, and they would issue credit cards on behalf of stores. Just as you might today get a Home Depot credit card or a Target credit card, they had stores all over Europe that had their own branded credit cards. Now, their cards could actually be used in any store, not limited to the store in question, and they had relatively small credit limits on them compared to, say, a general-purpose credit card. And of course there was a lot of analysis to be done there, and the sales and marketing team were really interested in: how do we sell more of these? How do we encourage people to use them, and how do we understand how people are using them? So they needed to build a data warehouse, and in those days that's a big operation, to do this analysis, and there were a number of problems. First of all, this was in the days when most of the credit card systems were connected with dial-up modems, sometimes even with paper slips that people had to fill out, but mostly with kind of dial-up processes. Connections were poor. Very often you'd swipe a card and it wouldn't work. Sometimes it wouldn't work because you didn't have credit. Sometimes it didn't work because the system wasn't working. All sorts of things could go wrong. And so, in order to get good data for analysis, we had to clean up all the mis-swipes, all the duplicate swipes, all the cards that, you know, were used once or twice, in order to get to the transactions and a clear definition of the transaction. And then the sales and marketing team could look at these transactions: okay, here's the clean data, let's get to work on this and devise new products, new offerings and so on. Now, in those days this was a big job.
It took us about three months to develop the cleansing and ETL process, that ETL process being extraction of the data, transformation, cleaning, and loading into the data warehouse. That ran every night for about eleven, twelve hours, from the point at which the doors were closed, and then we'd try to have it ready in the morning; it was a challenge to do it in twelve hours. And so we built this process, and it's very successful; the marketing team is absolutely delighted. Along came the bank's, the credit card issuer's, fraud department, and they said, we hear you've got this great data warehouse of all transactions across Europe; we'd love to use that for analyzing fraud. And we said, well, yeah, you know, you can have access to it, and they said, well, great. So what we need to analyze with that are all the duplicate swipes, all the failed transactions, in fact all the things that we had spent, you know, three, four months of development time and eleven hours a night throwing away, because the data that was good for fraud analysis was exactly the data that was bad for doing sales analysis, and the data that we had cleaned up to make it good for sales analysis was completely inappropriate. It was terrible, it was useless, for fraud analysis. And yet the raw data was the same for both of them. So when we talk about data quality, and whether we have the right data, what we need to understand is: what is our purpose, and how do we get the data into the shape, into the format, but also into the quality, that's suitable for our purpose? And that needs a good definition of purpose. And the idea that there's a sort of generically good, one-version-of-the-truth, one-size-fits-all data set is very misleading and has led a lot of people down, I think, blind alleys when it comes to data analysis. So that would seem to really emphasize, then, the importance of understanding the questions that you want answered.
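The credit card anecdote can be sketched in a few lines of code. This is purely an illustration of the principle, not anything from the actual project: the records, field names, and cleaning rules here are made up. The point it demonstrates is that the same raw swipe data feeds two different "clean" data sets depending on the analytical purpose, and what one purpose discards is exactly what the other needs.

```python
# Hypothetical raw swipe records: the same data serves two purposes.
raw_swipes = [
    {"card": "A", "amount": 25.0, "status": "ok"},
    {"card": "A", "amount": 25.0, "status": "ok"},        # duplicate swipe
    {"card": "B", "amount": 40.0, "status": "declined"},  # failed swipe
    {"card": "C", "amount": 15.0, "status": "ok"},
]

def sales_view(swipes):
    """Sales analysis wants clean, deduplicated, successful transactions."""
    seen, clean = set(), []
    for s in swipes:
        key = (s["card"], s["amount"])
        if s["status"] == "ok" and key not in seen:
            seen.add(key)
            clean.append(s)
    return clean

def fraud_view(swipes):
    """Fraud analysis wants exactly what sales throws away:
    the duplicates and the failed transactions."""
    seen, suspicious = set(), []
    for s in swipes:
        key = (s["card"], s["amount"])
        if s["status"] != "ok" or key in seen:
            suspicious.append(s)
        seen.add(key)
    return suspicious
```

Running both views over the same four raw records yields two disjoint "fit for purpose" data sets, which is the whole argument against a single one-size-fits-all cleansing pipeline.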
Absolutely, yeah. That's your number one step, then, right. And this is why I say that, you know, we're still in the world of decision support, because the process that I recommend to people is: start with a good understanding of the decisions you're going to make. Once you've got a good understanding of the decisions you're going to make, you can start to model what the inputs to those decisions are. And once you've modeled the inputs to those decisions, then you can start to model how you get from the raw data that you have to the right data, in the right format, with the right type of quality, at the right time, in order to provide these inputs to make better decisions. So we actually need to work backwards, and a lot of people make the mistake of working forwards: I've got all this data, what can we do with it? Rather than thinking: what is it we need to do to make our business better, and then working back from that to what data we have to provide, and maybe we need to get more data, maybe we need to get different data, maybe we need to treat this data differently in order to support that. So this would also tend to suggest that there's a pretty strong difference in the way that someone who is more operationally focused is going to see this versus a decision maker. That is true. I'd suggest two things to answer that; let me deal with the first element first. There are different types of decisions. Not all decisions are created equal. There are indeed operational decisions, and I wouldn't necessarily make this distinction between a decision maker and somebody who's operationally focused. The decisions that we make operationally are very small, in fact so small in scope, and require a relatively small amount of data, that we may even be able to automate them in many cases.
An operational decision can include things like: I'm a salesperson,...

...who do I go and visit next? And nowadays I may get a recommendation on that route from an automated system. If I'm in manufacturing and there's an alarm, what do I do about this alarm? Well, that alarm has probably already either got an automated system built around it, or there's a standard set of processes that I follow. So it's almost not like a decision at all, because it's already laid out for me. So operational decisions, if you like, are very small in scope, and each one is, to be honest, not of great business impact. However, there are millions of them in a typical business, so in aggregate they have a huge impact. At the opposite end of the scale, you have decisions which have huge business impact but are relatively rare. So, for example, a strategic decision: what new market do we enter? You know, what competitor do we acquire? What new product do we launch? Who do we hire as CTO or CMO, or something like that? You know, big decisions which aren't made very often, but they have a huge impact on the business when they are made. And those strategic decisions are ultimately fed by a huge amount of data. They're ultimately fed by all of the operational data, which has been aggregated up into an overall view of the entire business, and then there are also other data sources coming from outside, which will include things like macroeconomic information and so on. And frankly, there's a lot of human input as well at the strategic level, things like gut feel and inspiration and market intelligence, which is very often fed by the strategic decision makers' personal experience. In the middle of this, between the operational and the strategic decision making, you have tactical decision making, typically the area of a typical business manager: could be a sales manager, could be a plant manager, whatever, and they are dealing with all this operational data aggregated up to their level.
They're doing analysis, they're making decisions, of course, and then very often the analysis is being passed up from them to the strategic decision maker. So it's not as if there's one type of decision making, and each of these decision-making patterns, if you like, has its own data requirements. You can't overwhelm the strategic decision maker with operational detail. Equally, the operational decision maker isn't going to know what to take account of; "what do I do about this alarm? Well, it depends on the macroeconomic situation": that's not going to happen either. So, you know, you have to have the right information at the right kind of level for these decisions. So that's the first thing that I would suggest. The second thing, though, is that this world of decision making is changing. First of all, there are a lot more automated decisions, especially at the operational level. It's surprising how many automated decisions are actually happening even at the tactical level. Not quite at the strategic level yet, but at the tactical level I see automated hiring, for example, which is actually very common; automated investment strategies; automated supply chain decision making, which is super interesting, including automated supply chain negotiation, with bots that can actually negotiate with your suppliers on your behalf. It's fascinating to see that. So in a sense, the definition of a decision as such is changing. I always feel that when something is automated, the decision has become something a little different. So it's fascinating to me that the nature of what a decision is is changing, but also we have to remember that there's a broad range, a really broad scope, of decisions to be made. So, one of the things: when I went out to quite a few people on LinkedIn in advance of this conversation and asked them, you know, what questions do you really want me to ask, one of them was this.
There's a lot of hype around AI, and machine learning as a subset of AI, and there are a lot of people who are wondering if that's all they need, for example, to demonstrate causality. And then when you talk to them about this and you kind of, you know, tease apart their knowledge or lack of knowledge, what invariably surprises them is that ML is pattern recognition. It's not causality, right? You get causality through another type of analytics. How do you talk to people at that level about issues like that? Well, I'm going to suggest that there's a set of misunderstandings sometimes about artificial intelligence, about the nature of its intelligence. We tend to think a lot about its intelligence, and we forget that it's artificial, and I think we need to sometimes focus more on the artificiality. What does that mean? It means that we've constructed it. It doesn't necessarily mean that we're always in control of it, but ultimately it's been constructed with our understanding of what's required and our understanding of how intelligence as such works.

So what that means is that artificial intelligence, machine learning and so on are all based on, first of all, a model of the past. We look at past data, we find patterns, we extend them into the future. Artificial intelligence does that as well, and it's entirely based on existing experience. It happens to be based on its own experience, which it has acquired from the data that we give it, but still it's based on its own experience, and it has no ability, if you like, to form new experience without us feeding that to it. That restricts very much the things that it's capable of compared to human beings, who are bringing in a vast amount of experience from all over, you know, from all over the world, from all over the sensory world that they bring in. And in business we often bring to our business knowledge, in particular, analogies from the outside world, and we learn stories from the outside world that are very meaningful to us in business and often enable us to take new directions, whereas artificial intelligence can be very, very smart, but it's based on a relatively restricted model of the world. I'll give an example of that. Are you watching the Olympics? Are you following the Olympics? Sure. I'm not really very sporty, but there was an Olympic story that happened just yesterday, which is a fantastic story, about the women's road race in cycling. Did you hear about this? No, huh. So this is a long road race. It was going from Tokyo to the foothills of Mount Fuji. A long, long race, and typically in cycle racing, you know, a peloton of people forms and it's very tactical.
People are jostling for position in the peloton, and then there's a group of people who will break away towards the end of the race and, you know, sort of aim to win the race, and typically teammates kind of cooperate with each other to make sure that people are getting the right kind of drag and aerodynamics, and the strategy is built up that way. And so the peloton is racing along, and it gets towards the end of the race. They start to break away, and the people who have got the energy left at the end sprint away, and one of them breaks away. A Dutch woman breaks away from the peloton. She's leading the peloton. She gets to the end of the race, puts her arms up. She's won, she's celebrating her gold. And what she doesn't realize is that there was an Austrian woman, who had no coach, no trainer, was an amateur in the world of cycling, who had actually already finished almost a minute and a half previously, and nobody had paid any attention to her. At one point she was ten minutes ahead, which is a huge lead in a cycling race, and she had already finished. She'd already won the gold. But the peloton had been so focused on their strategy and their tactics and their dynamics that they hadn't even noticed there was somebody a minute and a half ahead, literally out of sight. Now, I tell you that story, and I know you're a great business thinker and think about business all the time, and I could absolutely take that analogy to many of my clients and many of the companies I'm talking to, and I could use that analogy. And actually, very often when we talk to people about business, we do that. We use analogies from the sporting world, from the political world, all sorts of things. Artificial intelligence can't do that. Artificial intelligence only knows what you've told it.
It's not, you know, reading the newspaper and saying, oh, that's inspiring, or, that makes me think differently. So in a sense artificial intelligence enables us to do what we already know how to do, better. It doesn't help us to do what we don't know. We literally need inspiration for that, and that always comes from outside the scope of the artificial intelligence. Does that make sense? Absolutely. So would it make sense to say that AI is more about scaling intelligence, as opposed to getting smarter? I think it is. I mean, there's a phrase which is super interesting to me: augmented intelligence. Have you heard it? Yeah, I use that a lot. Yeah. So augmented intelligence typically means that we use artificial intelligence to augment human intelligence, and that is the scaling question. You know, we can make a lot more decisions than a human being could make. We can scale it in geography, we can scale it across more customers and so on. So the scaling question is a really good one, and I think of augmented analytics as being about enabling human beings to do this at a greater scale, much faster, and actually more reliably, in the sense of more repeatably, more consistently, than human beings can do, which is great. And I actually think in some ways the most interesting aspect of the future is not so much artificial intelligence augmenting human beings as human beings augmenting artificial intelligence. And what I mean by that is knowing that there is this system which can do things faster,...

...better or stronger, faster, you know, more scalably than we can. But what can we bring to that? Can we bring our human intelligence to augment what these systems are doing? So imagine, if I think of a drone: a lot of people are using drones in all sorts of areas. Insurance is using drones to go and look at claim sites and things like that, crash sites, houses with the roof blown off, all sorts of things they're being used for. So if I think of a drone as being an extension of me, I think of it as augmenting me. The drone can go places where I can't, it can do things, but it's essentially doing what I want; it's just doing it, you know, more scalably than I can. That's one way in which the drone is augmenting me. But what if I think about it the other way around, that I'm augmenting the drone? The drone is capable of going there, but there are some things the drone can't do that I can do. Can I bring my intelligence to the drone's operations, rather than just using the drone as a way of scaling what I do? And I think that shift in mindset could get really interesting over the next few years, if we can make that happen. So here's another question. A lot of this kind of revolves around data, because there's a tremendous preoccupation with it inside of companies right now: how much data do I need to do X, Y and Z in analytics, right? Understanding that the answer is going to be different for machine learning versus something like regression. Well, there's a great saying that I've heard. I can't remember who actually used this phrase first, so I don't want to misattribute it, but somebody said that, you know, more data always beats better algorithms, and I understand what they're trying to say.
I think it's actually a little bit naive, in the sense that, as we talked about with the quality of data and the insight that you can get from data, I think algorithms are actually what help us define the purpose of what we're doing. It's great if we can throw more data at that, but I think we shouldn't assume that more data is always better. There is some really interesting work you can do with relatively small amounts of data, so long as that data has been carefully collected, carefully curated, carefully managed, so that, you know, it is fit for purpose. And equally, you know, you can get very misled by having large volumes of data which don't tell you quite what it is that you think they're telling you, and that's, to my mind, actually a real challenge. So, you know, again it comes down to this question of what's the right quality of data. I would say that one thing that's very important when we look at using data is: do I have enough data to really give me a good sense of the scope of the problem that I'm dealing with? So take automated hiring, which I mentioned. If you're hiring people into your business... I know a packaging company that does automated hiring. They are very seasonal, and at different times of the year they need to hire temporary staff who will be with them for six to eight weeks. The previous year they had about three hundred of these temporary staff, and the year before that they had about two hundred, and so on; on average they're getting about two hundred and fifty to three hundred temporaries a year. Is that really a big enough data set to analyze the attributes of past employees that you would want to apply to future employees? In other words, can you tell from last year's data, reliably, who's going to be a good hire this year, so reliably that you can do it automatically?
And my answer to that is: probably not. There's probably a lot more that we need to bring to that data set in order to be able to make that decision. It would be great, for example, if we could go out and see what's happening in other industries, and luckily there are, you know, public data sets; there's a lot of shared information out there. I think one of the key analytics that I always want my clients to look at is some form of benchmarking against industry, because our own internal data is rarely enough, no matter how much of it we've got. And that's not because of the volume of data, it's because of the scope of the data. And so in some ways I'd turn your question around and say I'm more worried about having data with the right scope rather than the right volume. So, one of the reasons why the question was posed is that I think a lot of people would agree that AI, in its varying forms, is most commonly a big data solution, but that when you...

...look at the realities inside of a lot of companies, while they have a lot of data, typically the individual data sets are rarely big data. And so does that disqualify them, in many ways, from being able to leverage AI effectively? I don't think so, but I do think you have to be somewhat careful in the algorithms that you choose and the approach that you take. There are some algorithms, the so-called naive Bayes algorithm, for example, which work actually pretty well with relatively low volumes of data. But you have to be aware that you're not going to necessarily get the accuracy, and you're not going to have the scope and variety; it won't cope, for example, with, you know, many new scenarios that might arise. So you have to be somewhat conscious in your use of it, and I think that's part of the understanding that a data scientist needs to bring to an organization. But there's also another aspect of this scaling, which is, you know, how do we maintain this? How do we actually operate with big data? We've seen in the last few years the role of the data engineer emerging, not just the data scientist but the data engineer, and the data engineer is very often a person who knows how to put into production the model and the design that the data scientist has come up with. So data scientists can, in research, find some really interesting things, things that might be very practical for the company, and the data engineer may come along and say, yeah, but that's a one-off experiment, we can't really run this continuously in production. We need more data, we need better data. Or in fact maybe there's too much data to run in production, too expensive to run. And so I think we're always kind of finding this compromise. But I don't want people to get hung up on the idea that artificial intelligence requires huge volumes of data.
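The point about naive Bayes working on small data sets can be made concrete. The sketch below is a toy illustration, not anything from the episode: a from-scratch categorical naive Bayes with Laplace smoothing, trained on a made-up hiring data set of just four records with two invented features. It works on tiny data because it only estimates per-feature counts per class, never the joint distribution.

```python
from collections import defaultdict
import math

def train_nb(rows, labels):
    """Count class frequencies and per-class feature-value frequencies."""
    class_counts = defaultdict(int)
    feature_counts = defaultdict(lambda: defaultdict(int))
    for row, label in zip(rows, labels):
        class_counts[label] += 1
        for i, value in enumerate(row):
            feature_counts[label][(i, value)] += 1
    return class_counts, feature_counts

def predict_nb(row, class_counts, feature_counts):
    """Pick the class maximizing log P(class) + sum of log P(feature|class)."""
    total = sum(class_counts.values())
    best, best_score = None, float("-inf")
    for label, count in class_counts.items():
        score = math.log(count / total)
        for i, value in enumerate(row):
            # Laplace smoothing so an unseen value doesn't zero out the class
            # (denominator assumes 2 possible values per feature, for brevity).
            score += math.log((feature_counts[label][(i, value)] + 1) / (count + 2))
        if score > best_score:
            best, best_score = label, score
    return best

# Entirely fabricated "past hires": (experience level, application channel).
rows = [("experienced", "referral"), ("novice", "ad"),
        ("experienced", "ad"), ("novice", "referral")]
labels = ["good", "poor", "good", "poor"]
cc, fc = train_nb(rows, labels)
```

Four training rows are enough for the model to run, which illustrates both halves of the argument above: the algorithm copes with small volumes, but its "scope" is exactly these counted values, so any genuinely new scenario falls back on smoothing rather than knowledge.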
It doesn't, but it does require you to tune your use of artificial intelligence to the data you've got, and equally tune the data you've got, and the process as it goes through, to the decisions that you want to make. So here's another question, and it's actually a perfect segue out of what you just said. For a lot of business leaders, whose last real brush with any of this was thirty years ago in school, when they sit down with a data science team, some of their observations go something like this: I've never worked with a more argumentative group of people. I've never worked with a bunch of people who were so obsessed with precision rather than getting me an answer that I can work with. This is a perception, right? Sure, yeah. Speed, speed to insight, right? A lot of times business leaders will say, man, I can't wait, you know, three, four or five months to get a model and get an insight, and then wait another three or four months to get it renewed again. What's the best way forward here? How do you kind of bring these two very, very different archetypes into a closer working relationship? It's really tough. You know, very often we talk in the tech world about the center of excellence, the analytics center of excellence, as a model of, you know, how to embed in the larger organization a team of experts. The problem with that is that the center of excellence can very quickly become a center of arrogance: a sort of isolated team who see themselves as masters of a very particular domain, and they can focus so much on their specialization that their relationship to the rest of the business can become fraught, frankly. This is your cycling analogy again. It is, actually, yeah, very much. Yeah.
So something that we've really noticed in the last few years, and I think people are now hiring with this specifically in mind, is that data scientists, at least the leaders in your data science organization, have to be great communicators, have to be effective data storytellers. They need to be, as somebody phrased it, data interpreters, and this is actually a really important skill. Having a data scientist who is a genius at doing data science but actually not very good at communicating what they're doing isn't of benefit to your company at all. You need somebody who is able to communicate what they're doing, and also, communication is two-way: they need to be able to receive the communication about what's needed. So when I talk to teams about developing a data science practice, the first hire should be somebody who's hired as much for their ability to communicate about the work they're doing and the findings that they're making as for actually being an expert at doing it. You'll find that balance much more important than finding...

...the personality who's a genius data scientist but can't communicate. And you can have, you know, future hires who are genius data scientists; that's fine, but you're building it around a team which is focused on communication first of all. The other aspect of this, which I think is also very important, is to try to focus a data science team, especially as they're starting their work, on a number of quick wins, a number of areas which are actually of lasting value to the business. The data scientists will often think of these as being maybe a little trivial, even, because they may be quite simple to develop. One of the quick wins that I'd suggest is a project that looks for anomalies and outliers in data sets. That's like a standard practice for a data scientist; they should be able to do this almost in their sleep. It's tremendously valuable to the business to know that, you know, out of these five hundred equipment failures we've seen, these five are actually anomalous; or, out of our top two thousand customers, these ten are actually outliers in terms of what they're buying and their patterns of buying. That can be tremendously valuable to the business and relatively straightforward for a data scientist to do. Then there's time series analysis. You can find, within any business, a time series that can be projected forward, and that can be done relatively simply and can be of great value to the business, ongoing value to the business. And then you can also do things like finding clusters. Cluster analysis is always super interesting to people, and sort of comparisons and so on. So what I always suggest is: there's a small number of very specific projects that you can do very quickly which have some advantages to the business.
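The anomaly-and-outlier "quick win" described above really can be almost a one-function project. A minimal sketch with made-up equipment readings, using the modified z-score based on the median absolute deviation; the MAD (rather than mean and standard deviation) is an assumption on my part, chosen because it stays robust when, as in small operational data sets, one extreme value would otherwise inflate the standard deviation and mask itself.

```python
import statistics

def outliers(values, threshold=3.5):
    """Flag points far from the median using the modified z-score
    (0.6745 * |x - median| / MAD), robust even in small samples."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    if mad == 0:
        return []  # all values (nearly) identical: nothing to flag
    return [v for v in values if 0.6745 * abs(v - med) / mad > threshold]

# Hypothetical sensor readings with one obvious anomaly.
readings = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 42.0]
```

Calling `outliers(readings)` flags only the 42.0 reading; a classical 3-sigma test on the same seven points would actually miss it, because the anomaly drags the standard deviation up with it, which is why the robust version is the better quick win.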
In fact they can be very advantageous to the business, even though they're relatively simple data science. But the other thing those projects do is enable the data scientists to really get to grips with the data and with the business cases. Very often the data scientists you hire might not have a lot of experience in your vertical; they certainly don't have experience of your specific business. So now they get to know your data, they get to know the people in the IT team, they get to know the business leaders they're talking with, and they're delivering what, from their point of view, might be a very simple project, but one which actually provides lasting business value. On top of that, you can now build a business and technical relationship which leads into the advanced data science that might actually move the needle for your business. But you can't expect to get there in one leap. The danger of trying is that it fails, and almost the greater danger is that you build a practice which is super expensive to run and isn't really delivering business value that can be acted upon, because it's abstract, academic and not grounded in the reality of the day-to-day business. Another problem that's cropped up, that I've heard people discuss quite a bit, is the OODA loop: needing to speed it up to match the clock time of the business. Is there a way to do that with software? You need to be very aware of the rhythm of the business, of the cycle time of the business, and you need to be aware that that is something which is continually changing. A good example of this is telecoms. A few years ago telecoms companies built some of the world's largest data warehouses, so if you were a company like Verizon with a landline telephone business, you had a huge data warehouse listing all the calls.
All the call detail records going back over many, many years; terabytes, even petabytes of data. Nowadays you may still have a huge data warehouse, or more likely a huge data lake, but every time someone changes their plan, every time Apple comes out with a new feature or some new Android capability arrives, that actually changes people's behavior. So the information you had from a year or two ago may actually not be very valid for informing your customer decision making just now, because the plan that they want now is going to be influenced by the features they have on their phones, as opposed to their past behavior from two or four years ago. So understanding that rhythm, which can actually change the economics of your business and the decisions you need to make, is super, super important. And in terms of the cycle of decision making, when do we bring in more information? There's an art to that, for sure, a kind of human art, and there's also a certain amount of science, which is analyzing these cycles of the business and knowing when to bring in more information. One very common danger is analysis paralysis: you bring in more and more data because you assume that's going to help you, and you...

...do more and more analysis, but you're not making better decisions or faster decisions, and that's a real problem. And again it comes back down to modeling the decision-making process and working back from the decisions to the inputs. So historically, one of the examples you hear cited a lot is that if you were working with Nielsen or somebody like that, they're going to charge you about two and a half million dollars and give you three models that are updated twice a year, and that seems, to someone outside the data science space, strange and somewhat disconnected from reality. I'm not asking you to endorse it one way or the other; I'm just asking, what's going on there? Why do you think it seems strange to them? Mainly it's the fact that the computation at month one ages out so quickly that waiting six or twelve months to do it again seems to be missing the point. Again, that's a mismatch of the rhythm of the business with people's expectations of what is needed. And of course, part of the problem there is that if you're an aggregator of data providing generic analyses like a Nielsen, then simply the scope of what you have to collect imposes constraints. So what I'm hearing here, I would suggest, is a kind of mismatch between what people want, the decision speed that they want, and the scope of the analysis, in which case, of course, you may be misusing the Nielsen data. You may be wanting to read into it information that you can't actually get from it. Part of the answer might be, well, we want Nielsen to speed up, but part of the answer might also be that we should use the Nielsen data in a different way.
It's not the right analysis for what we probably want, and maybe we need a different way of analyzing in the very short term rather than trying to use a longer-term analysis for that purpose. But certainly, if people feel that there's this mismatch, then I think the two possibilities are that there's a mismatch of expectations or there's a mismatch in how the systems are used. So I actually really agree with that, because I think one of the things I see a lot comes down to a failure to communicate. The business, whether internally or with outside vendors, is assuming that everyone is reading its mind, as opposed to saying, hey, this is the kind of reporting cadence we really need in order to stay ahead of the marketplace; what can you do to help us support that? As opposed to being very prescriptive: well, I want this kind of program, the kind that only generates a calculation every six months. Right. One of the great challenges I've seen in the last sixteen or eighteen months of the pandemic has been people needing analysis very quickly and not being able to do analysis of the right scope very quickly. What I mean by that is, let's say you're in an office rental business and your rentals are dropping fifteen, twenty percent as a result of the pandemic, maybe even more than that. That, of course, is very concerning to you; you might even be almost in a panic. You're going to want to respond to that, you're going to want to do something about it. I don't know whether that's trying to sell more, but that doesn't sound likely to help. Maybe you have to reduce your costs, maybe there are redundancies; all sorts of things may have to happen.
But there's something missing from that: you know what's happening to you, but you don't necessarily know what's happening to the rest of the industry, and so you have no benchmark. If your office space rentals have dropped by fifteen percent, but the industry as a whole has dropped by twenty-five, you're doing pretty well; you're actually doing better than everybody else, and maybe there's not much you need to change. And similarly the other way around: if your rentals have increased by ten percent and you're happy, but everybody else has increased by twenty, then you're not doing so well. So much of our analysis is about finding these relative benchmarks against the rest of the industry, and we don't have a good feel for it. Of course we can get some aggregate analysis; that's what analyst firms are for, doing that aggregation for us. But every business is in this situation, even in quite small things. There's a construction company on the...

East Coast where, a few years ago, there was a really bad winter season and lots of accidents on construction sites. They'd increased their training and their safety awareness and safety provisioning, but still they had a real spike in the number of accidents, which was very worrying for them. What they didn't realize until much later was that their spike in accidents was actually smaller than the rest of the industry's. They'd been doing a really good job on safety compared to other people in the industry, and it wasn't until after the winter was over, when the analyst firms in the construction industry looked back and said that was a really bad year for accidents, that they realized they'd done a good job of it. And so again it often comes down to this question of scope: what are we looking at, and what rhythm of business are we responding to? I think benchmarking is actually one of the greatest things we can do in a business in order to provide value. Now, it's very difficult to do benchmarking. You could go to the industry analyst firms.
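The relative-benchmark idea in the stories above can be made concrete with a small sketch. Assuming, hypothetically, that we know our own period-over-period change and have a set of anonymized peer changes (from an analyst firm or an aggregator), positioning ourselves is a couple of lines:

```python
from statistics import median

def benchmark(our_change, peer_changes):
    """Position our period-over-period change (in percent) against
    anonymized peer changes. Returns the peer median and the share
    of peers we outperformed."""
    med = median(peer_changes)
    beaten = sum(1 for c in peer_changes if our_change > c)
    return med, beaten / len(peer_changes)

# Hypothetical office-rental scenario: our rentals fell 15 percent,
# while anonymized peers fell by these percentages.
peers = [-25.0, -30.0, -22.0, -18.0, -10.0, -27.0]
med, share = benchmark(-15.0, peers)
print(med)    # median peer change: -23.5
print(share)  # we outperformed 5 of 6 peers
```

A dashboard that only shows the absolute fifteen percent decline would prompt a panic; the same number next to the peer median tells the opposite story, which is exactly the point about the construction company's accident spike.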
There's one opportunity emerging, though, that is really valuable: software vendors, especially now with the move to multi-tenant software as a service, have the opportunity to look across the data of many of their customers in a particular vertical, in the pharmaceutical space, in the construction space, or in a horizontal, say the ethics and compliance horizontal, and say, we could anonymously analyze the data across the entire industry sector and feed that analysis back to you, of course as a value-added service, so that you're continually able to benchmark yourself against the rest of the industry. Whenever I'm talking to software vendors, I always emphasize that this is something we can do today that we couldn't do five or ten years ago, and it's of tremendous value to an industry to know: how am I actually comparing? We've all had those conversations. Anybody who's sat in a quarterly sales meeting has had them: all these numbers from the western region are terrible, and the western region is saying, yeah, but the western region is in a bad state; we're struggling just now, and everybody we talk to in the industry is struggling. The numbers look bad on our dashboard, but actually maybe we're not doing that badly compared to the rest of the business, and that's the sort of insight we can now get that was much more difficult to get previously. Last question, then, going back to where we started, with decision support. Particularly after the pandemic, I've just never seen, in our work, such focus on this. And yet people feel that in many cases they're behind the curve; they're lower on the maturity curve than they want to be or should be.
If you fast forward two or three years, four years, five years, what do you think it's going to look like? Where do you think we're going to be on this? Not the really cutting-edge stuff, but what's going to be the norm? That's a great question. I love your intro to it because it actually sets up the problem really well: all these people think they're behind the curve. It's like the inverse of the Lake Wobegon problem: in Lake Wobegon all the children are above average, but here everybody's behind the curve. The reason we think we're behind the curve is because what we hear in the news, what we read in the business journals, is about those cutting-edge cases, and that gives us a sense that there's potential out there that we are not achieving, which is kind of true. But it's the cutting-edge cases that are exciting, and what's actually moving the engine of business and the industry forward is not what Google and Microsoft and Amazon are doing with amazing cutting-edge data and all the advanced AI; it's improving, every day, the decisions we make, a little bit at a time. And we always want the silver bullet answer. In the world of data warehousing that I used to work in, there was the beer and diapers kind of urban legend: they do this analysis and discover that beer and diapers sell well together, because guys going home on a Friday night stop at the store to buy beer and then buy diapers to assuage their guilt at buying the beer.
And it turned out it was never particularly true, but it speaks to our belief that there's a sort of silver bullet, that there's an analytic solution that will give us this insight that nobody else has. Which is a great story, but it's not as real, not as achievable, and not as valuable...

...as every single one of our decisions improving by about ten or fifteen percent. That makes a huge difference to your business, and yet we want these silver bullet stories. So I actually think that over the next few years what you will see is what a data scientist today would think of as pretty mundane analysis, the kind of analysis I was talking about for getting quick wins. Everyone will be doing better clustering, everyone will be doing better time series analysis and projections, everyone will be finding outliers and exceptions, and we'll be doing all of that much better. The other thing is we will have done a much better job of automating our businesses, automating the components of our business that can be automated, in supply chain, for example, and in finance, and so we'll be doing business in different ways, because automation at its best isn't just about doing something faster; it's about actually enabling you to do business in a new way. For example, there are financial services companies doing this. You may have heard of robotic process automation: you're filling out forms regularly in finance, and robotic process automation watches what the human being does, captures all the actions they take and then automates them, a bit like old-fashioned macros actually, but running at superfast scale with a lot of intelligence behind it. If you apply that in the financial space, you can now get automated closing of your books. That terrible process you go through every month and every quarter of closing the books, which is always a nightmare, can be automated, and audited, in a way that happens all the time: your books are always closed and balanced. That doesn't just speed things up; it actually changes the way in which you think about your business. It's the difference between a series of still photographs and a movie.
A movie is a series of still photographs, but it becomes a very different thing when it's played at speed, at twenty-five frames per second, and that makes us see it in a very different way. When your business is automated at its very best, it's not just going faster; it becomes a different way of perceiving your business, because all of this stuff that before felt like a big decision now flows more easily and you are freed up. Your imagination is freed up; your strategic, technical and operational decisions are freed up to act in a different way, because so much of what you did before is now not taken for granted but taken care of. And that's what we're going to see in the next two or three years: so much of what today is a burden will just be taken care of. I always think of electricity here. When electrification first started to come to factories and businesses, they used to appoint CEOs, and CEOs weren't chief executive officers, they were chief electricity officers, because electrification was such a big and important job that you needed somebody to actually manage that entire process. Now you just switch on, connect to the grid, and you've got electricity, and many of the things we think of today as big processes will become like that. They will become largely automated; we'll just plug in machine learning, ask for an answer and get it, and that's going to be very transformative. You know, an example of that metaphor would be GPS. GPS a few years ago was miraculous, right? And it revealed how much precision you really need: is it critical that your car is located within two feet of where it actually is, or is a thirty-foot circle enough?
One of the other things I was thinking about as I was listening to you talk: it was probably about ten or twelve years ago when most GPS companies started putting the ETA countdown clock in, right, and it was meant to give you more confidence. It was sort of the proof of it working: when you turned into your destination's parking lot and the clock went to zero, QED, it was accurate. Yeah, GPS is a great example because this technology is something that, even to this day, most people don't really understand how it works. And yet we completely rely on it for many things we don't even necessarily realize we rely on it for. One of my cousins worked on GPS systems for the military, and if he had seen today that people are using them in a watch to measure their running and their exercise, he'd be astonished, not only that they could shrink to the size of a watch, but that that would even be a use case. It's such a trivial thing, you know, but we all do it. Right. Anything else that you really wanted to say on this? Because this has been a fantastic conversation and I have a feeling that the audience is going to send me a lot of email wanting you back, but is there anything that you feel is really important to say about any part of this subject right now? Well, I've made the point a few times about the importance of understanding decisions and the importance of modeling our decisions, and another part of this that I want to emphasize is the importance of the human element in decision making. All of this automation, all of this intelligence, all of this data can do so many things, but human beings are innovative; human beings are imaginative.
Human beings do things against the data, against the pattern that we've seen for all these years before, and that's often where competitive advantage comes from. So I want people to be very purposeful about their use of data and their use of analytics, but also to build the human role into their understanding of it and into their thoughts about strategy and tactics and operations. There are things that the engineer on the shop floor can do and see and understand that the system never will; even though the system can greatly exceed the shop floor worker in terms of capacity and reliability, it can't exceed the shop floor worker in terms of insight and innovation. And we should think of our human beings as humans, with all the complexity and problems that they bring, but also that great capacity, because ultimately, when everything is automated, the difference, the innovation, the competitive advantage, the edge will come from the human beings. When we all have the same data, and we mostly do, and we all have the same capacity to make automated, insightful decisions, the difference is going to come from our human beings, and that's the asset we need to nurture. Thank you so much. What a great conversation. I've had a great deal of fun. Thank you very much. I do hope I can come back. That would be great fun. We'd love to have you back. So, guys, this has been a great opportunity to hear from an expert on this whole subject of data and analytics and how to make better decisions going forward within your organization. I'm sure that many of you will do this anyway, because you seem to on all the podcasts, but if you have additional questions that you would like us to cover in a round two, please send them in and we'll set it up; we'll have maybe not a call-in, but it'll sort of be the proxy for a call-in. How's that?
So thanks so much, guys, and we'll see you next week. The sooner you can optimize your marketing spend, the quicker you can start delivering clear, measurable value to your business. That's exactly where Business GPS from Proof Analytics can help. Learn more at proofanalytics.ai. You've been listening to Accelerating Value, where raw conversations about the journey to business impact help you weather the storm ahead. To make sure you never miss an episode, subscribe to the show in your favorite podcast player. Until next time.
