Accelerating Value

Episode · 1 year ago

The Art of Data Science: Understanding Data’s Role in Business

ABOUT THIS EPISODE

Many businesses look at the wealth of data available to them and think: “Finally, I’ll have all the answers I need.”

Meanwhile, every great data scientist is shaking their head…

Because they know that data science is about more than the data (or even the science).

One such master of the art of data science is Andy Hasselwander, Chief Analytics Officer at MarketBridge, who joins the show to share what businesses (and many data scientists) get wrong about analytics.

In this episode, we discuss:

  • Common mistakes companies make with data analytics
  • The measurement trap (and how to avoid it)
  • Why data specialists need to better understand the businesses and industries they work in

Keep connected with Accelerating Value on Apple Podcasts, Spotify, or our website.

Listening on a desktop & can’t see the links? Just search for Accelerating Value in your favorite podcast player.

Today, every budget approval is an investment deal. If you're a marketer, a sales leader, or a business leader, you've had to promise to deliver value and impact. Riding the wave to get there is hard enough; finding your way through the storm is even harder. If you're looking for that path forward so that you don't wipe out, you've come to the right place. Let's get into the show.

Hi guys, this is Mark Stouse with Accelerating Value, your weekly podcast about all things related to value creation, specifically in your job, in your work, in your companies. We always bring you really cool guests who can speak to this from very different vantage points; their track records, perspectives, and experiences are all over the place, and we do this to help you find commonalities in their experience that you can then use in your own work. With that in mind: we're in a data and analytics theme right now, and this is the penultimate interview in that series. Andy Hasselwander is the Chief Analytics Officer of a really cool company that I'm going to let him describe, but he also has a phenomenal track record, not only on the technical analytics side but also on the business and marketing side, so this is going to be very syncretic in terms of what he's learned and how he leverages it all to create value. So, Andy, welcome.

Mark, thanks so much for inviting me onto the podcast. It's fun to be here, and I'm really looking forward to hopefully sharing some insights with your audience. My name is Andy Hasselwander, and I lead the product function and the analytics function at MarketBridge. We're a marketing analytics company; our tagline is "go-to-market science." We basically look at marketing effectiveness, marketing optimization, and go-to-market effectiveness generally, on both the B2C and B2B fronts. As for our heritage, we were started in the mid-'90s by Tim Furey, who was doing research at Harvard at the time on multichannel economics and go-to-market economics: how enterprises were shifting their channel mix to take advantage of technology and to optimize selling cost versus transaction size.

Think about it: at the time, that was IBM moving from blue-suited feet on the street to the partner channel, and eventually, by the end of that era, e-commerce was the hot thing. Anyway, fast forward to 2021, and we're focused on all things go-to-market: applying a data-based lens to measuring and optimizing marketing mix and marketing channels, and to understanding audiences. So that's a little bit about us. As for my personal background, I'm a jack-of-all-trades when it comes to business and marketing: a statistician, but, as Mark mentioned, I also spend a ton of time on the strategy side, and I code up some pretty cool applications as well as doing data science. So, happy to be here.

Awesome. When I do these interviews, some of the questions come from the audience, so we're going to kick off with one of those. How would you describe, to someone who doesn't have a background in this at all, the difference between, or the relationship between, data and analytics?

Yeah. A lot of times we just talk about signal. Think about the human brain, just walking around outside: data are all the things coming in through your eyes and your ears, the raw inputs: color, shape, motion, and so on. Analytics is what your brain does with them; it's finding meaning. That's probably the simplest explanation. We actually just had a new class of analysts start, and I looked up the etymology of the word "analytics." It comes from Greek, and it roughly means untying a knot, or taking a problem apart. That's another way to think about it: you have some impenetrable problem, and you pick at the loose ends, pick at the strands of the rope, and figure out how the whole thing lays out. That's unstructured analytics, and then you can get into more predictive things as well. What's interesting about data generally: when people talk today about "big data," I think that's one of those phrases that, like a lot of things in our field, is overused. What's really interesting about data today is simply its scale. There's more data being created by machines today than there ever was, and those machines, a lot of people don't think about this, include marketing machines: everything out there in the digital ad networks, and so forth.

We also spend a lot of time, by the way, with small data, which is survey data and things like that. We think those are interesting too. But that's the way I think about it.

One of the things I see a lot, in very practical terms, inside large companies is that very little of the data they have is actually big data. When they say "big data," they mean "I have a lot of data."

Yeah, right, which is true. But the individual data sets are not big data, and I think there's a huge amount of technology chasing the big data problem, and probably quite a few executives who are not quite familiar enough with it to realize they may be barking up the wrong tree. Another way to think about it: the scope of an organization's data is often much more challenging than its scale. If I have twenty-five data sources, some of which are lookup tables with only a couple hundred records, but I'm only really using five of them, I might be missing a tremendous amount of insight.

So here's another question a lot of people have who are used to thinking of measurement as the be-all and end-all: what's the limit of the value of data? You obviously have to have it in order to power analytics, but absent analytics, how much value can you actually get from data?

We talk a lot about the measurement trap. There's an idea that if you can't measure it, it doesn't exist, which sounds vaguely like quantum physics or something, but of course it's just marketing. Some channels are inherently more measurable. Direct mail is a great example: direct mail is extremely measurable. We can predict, based on a mailing and based on a model, exactly when phone calls are going to come in, or, if there's a vanity URL, when those visits are going to come in and how many leads they're going to create. So direct mail might, for that reason, get used more than something that's harder to measure. The hardest thing for brands to measure is generally up-funnel advertising, brand advertising, and so there are companies, particularly in the more finance-driven spaces that are used to measuring everything, that shy away from up-funnel brand advertising for exactly that reason. That's the measurement trap, and you can get yourself in trouble circling the drain on optimizing the super-measurable channels. Just because you can't measure something doesn't mean it's not worth doing.
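The direct-mail predictability Andy describes can be made concrete with a small model. What follows is a minimal, hypothetical sketch, not MarketBridge's method: it assumes a known historical response-lag curve and response rate, both invented here for illustration, and convolves them with planned mail volumes to forecast weekly call counts.

```python
import numpy as np

# Hypothetical historical lag curve: share of all responders who call
# in weeks 1..5 after a mail drop (assumed, for illustration only).
RESPONSE_CURVE = np.array([0.45, 0.30, 0.15, 0.07, 0.03])
RESPONSE_RATE = 0.012  # assumed: 1.2% of pieces mailed eventually respond

def predicted_calls(weekly_mail_volumes):
    """Forecast calls per week by convolving mail drops with the lag curve."""
    expected_responders = np.asarray(weekly_mail_volumes, float) * RESPONSE_RATE
    return np.convolve(expected_responders, RESPONSE_CURVE)

# Example: 100k pieces mailed in week 1, another 50k in week 3.
print(predicted_calls([100_000, 0, 50_000]).round())
```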

Now, a CFO might argue with you and say, "show me the money."

Right. But you might just have to dig a little harder to find your evidence, and there are a lot of ways to do that. It requires a little more creativity, and maybe even a little faith sometimes, which I probably shouldn't say as an analytics person.

Well, one of the things I discovered a long time ago, when I was still a CMO, was that time lag was obscuring the value of exactly that kind of spend, and that once I had teams that could deploy multivariable linear and nonlinear regression, a lot of it snapped into focus.

Yeah, that's right. It's essentially the same math that's used to study climate change or pandemics or anything else where time lag is involved.

When you get into a conversation like this with a marketer who understands, in conversational terms, that marketing takes time to pay off, but has never really thought about mathematically calculated time lag, how do you typically address that?

The best way to convince a skeptic, or at least to lead somebody along a path toward econometric measurement, which is the fancy name for what we're talking about, is with case examples, and there are good ones out there. There's a lot of academic work, going back decades, on the utility of econometric models for understanding up-funnel marketing's contribution to enterprise value. There are also meta-studies. One of the best is "The Long and the Short of It," which came out of the UK maybe seven or eight years ago. It looked at something like a hundred consumer brands and the long-run relationship, again with time lag, between a metric called extra share of voice and brand performance. Extra share of voice compares share of voice to market share (I always have to double-check which way around it goes): if your share of voice is higher than your market share, you have positive extra share of voice; otherwise, negative. Pretty simple metric. And what they found is a positive relationship between that and overall brand performance. So case studies help a ton. Then come the questions after somebody says, okay, I get that this can be done and I get the relationship, but how can I do it? That can be a challenge, particularly when someone doesn't have much of a track record. One of the hardest things is getting a company that hasn't invested much up-funnel to start, because that's where you really do require the leap of faith. You say, look, I'll show you these case studies, I'll show you these meta-studies.
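For readers who want to see the mechanics: extra share of voice is simply share of voice minus market share (a brand with 25% share of voice and 20% market share has +5 points of ESOV), and the time-lag regressions discussed above can be sketched as a distributed-lag model. The sketch below is illustrative only; the file name, column names, and eight-week lag depth are assumptions, not anyone's production model.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical weekly data: sales, brand_spend, sov (share of voice),
# and market_share, all by week.
df = pd.read_csv("weekly_marketing.csv")
df["esov"] = df["sov"] - df["market_share"]  # extra share of voice

# Distributed lags: let up-funnel spend pay off over the next 8 weeks.
for k in range(1, 9):
    df[f"brand_spend_lag{k}"] = df["brand_spend"].shift(k)

lag_terms = " + ".join(f"brand_spend_lag{k}" for k in range(1, 9))
model = smf.ols(f"sales ~ brand_spend + {lag_terms} + esov",
                data=df.dropna()).fit()
print(model.params)  # each lag coefficient is that week's contribution
```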

But ultimately, you have to do a test, and one of the challenges there is that the conservative instinct is to do the bare minimum. Well, if I'm only going to spend half a million dollars over four weeks, that's not going to show anything. You really need to commit, which is why I think the CMO job sometimes requires a huge amount of guts.

It does, absolutely. You have to have the courage of your convictions, but you also have to have something that underlies and supports those convictions. If you just go in and say, "I know what I know, trust me," that's typically not going to fly, particularly not today. Maybe it would have worked thirty years ago.

So where do you see this going? If we look at data and analytics historically, it has continued to gain efficacy, impact, and credibility, and yet there has been a lag in adoption and in how analytics actually gets operationalized. What do you see as the big challenges there, and where is it all heading?

There are a lot of challenges. We could talk about the challenge of finding talented people to run these models or to collect these data; that's certainly real, and right now, in 2021, it seems as hard as I can ever remember it being to find really good people. But the challenge that a lot of executives, or aspiring executives, sometimes miss is how important it is to be consistent. What I mean is this. Think of a log file for a website: I have a date-time stamp, I have some kind of user ID, I've got all of that. That's big data; I might have millions and millions of these log records generated per day. That's fine; I can deal with data at that scale in a lot of different ways. Maybe fifteen years ago it would have been a really tough problem computationally, but not today. The challenge is the lookup tables that go against those data. Think of those as, if you're in Salesforce.com, the picklists: the three audience dimensions you choose, the age bands you choose, the campaign objectives, those simple things, the audience definitions. What you find in so many organizations is different picklists between systems, for example between a marketing planning system and a marketing measurement system like an Adobe (I know they do both, but you know what I mean), or dimensions that change rapidly through time. Another one: new year, new campaign plan, totally new strategy, let's change all those things. What you get into is that it's very difficult to draw inferences using analytics when the data about the data, sometimes called metadata, aren't stable. So one thing executives can do is simplify the dimensions across which we analyze and talk about the business, and keep those simple and consistent. Obviously there will be changes; we're not suggesting the business is set in concrete. But there's huge value in consistency, even in testing. When you think about how tests are run, make sure you run them consistently: if we decide at the enterprise level that ten percent of our budget goes to testing and ninety percent to production, we should stay that way, and we should use the same language when we talk about them. It will then be much easier for the fifty or hundred people who have to slice and dice the mounds of data, do the analytics, and build the reports. This, by the way, is one of the biggest problems with dashboards. It's very hard to build a dashboard, which is fundamentally a chart with a bunch of picklists across the top, when the picklists change all the time. You can't do it, which sounds obvious, but it's totally true.
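One lightweight way to enforce the consistency Andy is arguing for is to treat the shared picklists as code: a single canonical list that every system's feed is validated against before it reaches the analysts. A minimal sketch; the dimension values here are invented for illustration.

```python
from enum import Enum

class CampaignObjective(Enum):
    """Hypothetical canonical picklist, shared by planning and measurement."""
    AWARENESS = "awareness"
    DEMAND_GEN = "demand_gen"
    RETENTION = "retention"

VALID = {o.value for o in CampaignObjective}

def validate_feed(records):
    """Reject any rows whose objective falls outside the shared picklist."""
    bad = [r for r in records if r["objective"] not in VALID]
    if bad:
        raise ValueError(f"{len(bad)} rows use non-canonical objectives")
    return records

validate_feed([{"objective": "awareness"}, {"objective": "retention"}])
```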

Well, it is, and it raises a question about whether we're starting at the wrong end of the stick. I think a lot of people observed this in 2020 in particular: if you compare 2019 to 2020 and 2021, there was huge variability. How do you account for change, not only in the dependent variables (let me be clear: a dependent variable is something like sales, something you're trying to make happen, while independent variables are what you're doing, what someone else is doing, or what the marketplace is doing, so some of these you control and some you don't), but when those independent variables are changing as well? If the goal from a business perspective is ultimately to be predictive, to then show how correct that prediction was relative to reality, and to make changes and pivot along the way, how do you help your customers do that? That's a tough nut to crack.

Yeah. Sometimes I talk about the analytics jobs to be done, and I use this tremendously oversimplified framework: what, why, who, and how.

The "what" of what happened in 2020, we know. Maybe twenty years ago it was difficult, but now any organization can say: here's the time series of my sales, here's the time series of my leads; I know what happened; we were above or below goal. The "why" gets pretty difficult, and it got very difficult last year, which I think is what you're driving at: everything changed, so why did it change? Looking back, that post-hoc "why" job is maybe the most critical one for an analyst, or for a marketing analyst, because if you don't understand the drivers of why things changed, it's very tough to make strategy. That data-detective role, going back to the theme of hiring being difficult, is by the way one of the hardest to hire for. But again, the toolkit of econometric approaches, which is really organizing data, to your point, through time and across dimensions, will get you there; it will get you the answer. It's the Sherlock Holmes version of marketing. And what are those dimensions? I can look by DMA or by city; I can obviously look through time; I've got the picklists we talked about before; and I can look at what my competitors did. One thing we saw last year, for example with a couple of companies, was that it was really important to control for geographic mobility due to COVID, which was hugely different across parts of the country. You bring in those control variables and then you can say: it turns out the reason we were down over here was real suppression of customer activity here, and actually not over there. Another thing that happened last year with some of our clients was the impact of the election. So much oxygen was taken out of the room, from a paid-media perspective, because airtime was being bought up, from an earned-media perspective, and just from an audience-fatigue perspective, that you could basically take October and November of 2020, or September and October, and throw them out the window. But that job of data detective, going back to untangling the knot and getting to the why, and then the executive having the foresight to take those answers and put them into next year's strategy: to me, that's the essence of a learning marketing organization; a learning organization, period.
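Controls like the ones Andy just described typically enter an econometric model as extra regressors: a mobility index as a continuous control, and a dummy variable flagging the election window so those weeks don't distort everything else. A hedged sketch with invented file and column names, not an actual client model.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical weekly data by DMA: sales, media_spend, and an external
# COVID mobility index joined in as a control.
df = pd.read_csv("weekly_by_dma.csv", parse_dates=["week"])

# Dummy for the stretch when election coverage soaked up attention.
df["election_window"] = df["week"].between(
    pd.Timestamp("2020-09-01"), pd.Timestamp("2020-11-15")).astype(int)

model = smf.ols(
    "sales ~ media_spend + mobility_index + election_window + C(dma)",
    data=df).fit()
print(model.params[["mobility_index", "election_window"]])
```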

So one of the other questions I was asked to bring up, and this is all related, is the issue of analytics latency. If you're a business leader, you have a very clear use case for analytics: help me make a better decision, in enough time that I can still change my future. And if we look at the way things have been done historically, particularly at some of the really big firms, Nielsen, say, we're talking about a lot of latency between calculations. How do you help people deal with that?

Well, to use another analogy: in the intelligence field there's raw intelligence, then processed intelligence, then eventually it gets into the newspaper, and then somebody writes a book about it. The accuracy (maybe not the precision, but the accuracy) in theory gets higher as time goes on and more review is applied. So it's important to understand that hot takes can be very wrong; you have to balance timeliness with accuracy. That being said, I think some of the principles of agile apply. The Agile Manifesto came out of software engineering, and I'm a big believer in it for software, not so much for marketing analytics wholesale. But one of its principles, shortening the distance between, in the case of software, the developer and the customer, is brilliant, and in marketing analytics that means shortening the distance between the analyst or data scientist and the executive. So the best way to handle latency, really, is to have those weekly conversations: what are you seeing? Even if the answers are inchoate and couched in caveats, it is possible, on a weekly cadence, to get some pretty significant early indicators. Concretely, going back to our econometrics theme: say you have an econometric model looking at elasticities, marketing elasticities by channel, and some digital channel has an elasticity of 0.2, which means that if I increase the spend by a hundred percent I get twenty percent more, which is pretty typical. One thing you can do is give your codebase a little parameter that's a look-back window, and the look-back window might be the last twelve months. Every week, my fifty-two-week window moves forward a week, and I'm watching those elasticities: hmm, this thing is starting to drop; let's go talk to the performance marketing guy. And he might say, well, yeah, you're right, it's actually been really hard to find inventory lately. Well, what should we do? Those kinds of leading indicators, little sparklines, should be reflected back, and a good executive, a good manager, won't chop somebody's head off for being wrong when they're asking for fresh, unprocessed data, but should be able to look at it and act.
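The look-back window Andy describes translates almost directly into code: refit a log-log elasticity model on a trailing 52 weeks, step forward a week at a time, and watch the coefficient. A sketch under the same illustrative assumptions as the earlier examples (positive spend and sales, invented file and column names).

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("weekly_channel.csv")  # hypothetical: 'sales' and 'spend'
WINDOW = 52  # the look-back parameter: the last twelve months of weeks

elasticities = []
for start in range(len(df) - WINDOW + 1):
    w = df.iloc[start:start + WINDOW]
    X = sm.add_constant(np.log(w["spend"]))   # log-log form
    fit = sm.OLS(np.log(w["sales"]), X).fit()
    # ~0.2 here means +100% spend yields roughly +20% sales
    elasticities.append(fit.params["spend"])

# A drifting tail in this series is the early-warning sparkline.
print(pd.Series(elasticities).tail())
```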

So a related question: could you explain to the folks listening why the recalc, in so many cases, takes six months?

Well, there are recalcs and then there are recalcs. One kind of recalc is a rebuild. It's very typical in digital marketing today to, within six months, completely stop spending in a few channels and add four new ones we've never seen before. That happens; there are so many ad tech firms out there, coming and going. That kind of recalc means: I need to rebuild this model almost from the ground up to account for the new structural realities of my business. That's probably an annual exercise, and it takes a long time, because these models are a bit of a whack-a-mole: you hit one thing and something else pops up. It's not a linear build. Then there's a second, lighter version, which is just recalculating, assuming the structure is the same, on new data. If you have good data ETL pipelines, that shouldn't take too long, especially if you have a fully reproducible codebase, meaning my queries, all my analytics code, and my reporting run at the press of a button and produce the same answer at any given time, assuming the underlying data haven't changed, without seven analysts having to come in and type stuff in. And the third, easiest thing is simply to recalculate with the exact same coefficients, which isn't hard. What I think people sometimes don't understand is that first one, the big one. Whenever there are new data, whenever there are new channels, what really needs to happen? I don't know where the new channel fits; I don't know what it's impacting; I need to figure that out, along with all the interaction effects. There's almost certainly data manipulation to be done, because the source is new and its JSON files may look different from everybody else's. There are a million different reasons, but it does take time.
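The three recalc tiers map naturally onto three entry points in a reproducible pipeline: scoring with frozen coefficients is a function call, refitting on fresh data is a button press, and only the structural rebuild requires new code. A hypothetical sketch of that separation, with an invented formula.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Tier 1 (full rebuild) means changing this structure itself: new channels,
# new interactions, new data plumbing. That's the months-long exercise.
FORMULA = "sales ~ spend_search + spend_social + spend_tv"
CHANNELS = ["spend_search", "spend_social", "spend_tv"]

def refit(df: pd.DataFrame):
    """Tier 2: same structure, fresh data. One button, same answer."""
    return smf.ols(FORMULA, data=df).fit()

def score(df: pd.DataFrame, params: pd.Series) -> pd.Series:
    """Tier 3: reapply frozen coefficients without refitting anything."""
    return params["Intercept"] + df[CHANNELS].mul(params[CHANNELS]).sum(axis=1)
```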

Do you feel like one of the gaps between data scientists and business leaders is that there's almost a cult of precision in data science that doesn't exist in business?

Yeah, I think it's a cult of precision, and I also think it's a bit of the biggest problem data scientists have, and I'm not trying to impugn my colleagues: they're not interested enough in the business.

You mean a lack of domain knowledge.

Yeah. As I always say, you need to be passionate about your topic, passionate about your methods, and passionate about your industry. What I mean is that a good marketing data scientist in, say, healthcare payer should really understand and be passionate about the healthcare payer industry, should really understand and be passionate about marketing, and should really understand and be passionate about data science. Those are three different things: how is this industry structured? How do marketing data work, and how do marketers think? And how do I apply my tools against all of that? I mentioned agile earlier; in a truly agile, lean organization, ideally all those data scientists and analysts would have a lot of that, and therefore wouldn't need three layers of project managers or somebody to translate. The more layers of translation between an executive and the person doing the work, the more things get messed up.

That's actually a really important point. When I was at Honeywell and we were doing this, we figured out that you could not show the marketers on my team, or the business leaders, a Stata output.

Stata being, yeah, the statistics software.

You'd show it to them and they'd go, "what the hell am I supposed to do with this?" So a lot of interpretation had to take place; you could almost call it UX work.

Oh, yeah, right.

So that leads us to our last big topic: where do you see all this going? If you were to prognosticate: how do some of the issues we've discussed get resolved, and where are things headed, particularly coming out of 2020, when gasoline was poured on all of it, not only socially and medically, but in business?

It was Yogi Berra who said it's hard to make predictions, especially about the future. So, I don't know; everyone makes predictions based on the past. If you think ahead twenty years, I see two rough ways it could go. Way one is that marketing software finally reaches the promised land, the promises executives have been hearing at the Salesforce conference and the Adobe conference year in, year out, and a common operating system develops, similar to FASB in accounting, where it's no longer constant reinvention and things stabilize. That's one option: marketing becoming more deterministic, less fuzzy. The problem with that is that, unlike accounting, marketers are dealing with people, human beings who, to use the iceberg analogy, spend ninety-nine percent of their lives underwater, so you only ever see one percent of them.

And that's never going to change. Certainly, third-party cookies going away, continuing privacy legislation, and antitrust creeping up on some of the big advertising platforms will make it even harder. So I'd be a little short on that one, and long on marketing analytics becoming more transparent and more reproducible: more codebases being exposed, executives getting more comfortable with that. At the same time, I do see a real possibility of standardization of data structures, which goes back to the theme of picklists and dimensions. There's really no reason that, as an industry, marketers and advertisers couldn't start coming up with at least some basic standards for how to talk about customers, audiences, and so on. There are hints of that already: if you look under the hood of Salesforce, the basic structure of accounts, opportunities, leads, contacts, products, and so forth really hasn't changed in twenty years, nor probably can it; it comes straight out of Siebel. So I guess I'm hedging: I'm hedging between standardization of data structures, more reproducibility in methods, and better benchmarks on one hand, and, on the other, probably not a purely deterministic world where marketing is done entirely by machines inside software. Maybe that's a half-baked answer.

No, it's great. The interesting thing about that, I think, is that the ninety-five percent confidence standard in data science is actually an unintended obstacle to progress.

Okay. I want to hear more.

Anytime you're dealing with data that represents human beings, you are not going to get anywhere near ninety-five percent, and if you do, you need to be careful; I'd be greatly concerned.

Yeah. I thought you were also going to bring up the phenomenon of marketers p-hacking: run twenty tests and pick the one where p comes in under one in twenty, and then wonder why it doesn't reproduce the second time.

That's exactly right. P-hacking is a great example of data science without context and without a hypothesis.
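The arithmetic behind that point is easy to demonstrate: run twenty A/B tests on pure noise and, at the one-in-twenty threshold, roughly one "winner" shows up anyway. A self-contained simulation; it refers to no real experiment.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
false_positives = 0
for _ in range(20):  # twenty tests where nothing is actually different
    a = rng.normal(size=500)  # control group metric
    b = rng.normal(size=500)  # "treatment" group, same distribution
    _, p = stats.ttest_ind(a, b)
    if p < 0.05:
        false_positives += 1

# Expect about one spurious "winner"; it won't reproduce on a rerun.
# A Bonferroni-style correction (require p < 0.05 / 20) largely fixes this.
print(false_positives)
```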

One of the things I've wrapped my own mind around is the idea that, at the end of the day, the scientific method of inquiry is a really great approach: you need a question and a hypothesis, which you can then test with a model and with data. There is a time and a place where machine-learning-based pattern recognition has real value, but neither of those is deterministic, and neither is causal.

Right. The whole point of hypothesis generation, without getting too deep into the philosophy of science, which I actually think is under a huge amount of pressure right now, not just in marketing but everywhere, is this: I always ask my analysts, what's the mechanistic reason this might be happening? If you can't come up with one, it's not really a valid result. What I mean by that, for the audience, is: tell me why. Tell me what might have happened; give me a strong idea, mechanistically, of the chain of events that drove this. What is the consumer belief that drove this? This comes up all the time on the "who" side of those four questions: I imagine certain personality traits in my audience, and they're usually wrong. So you really need a strong, testable hypothesis. And this is why test design, design of experiments, which we didn't really talk about, remains so critical. A client asked me this year, after we did some inferential work with some really neat results: how many of these can we immediately put in market? I said I wouldn't put any in market; I would test them all first. And I'll always say that, because ultimately inference is inference, and you really do need to be more sure. Of course, that confounds your question about timeliness and being fast, so I guess I'm contradicting myself.

Andy, thank you so much. This has been a great conversation, and I'd love to have you back if you're willing.

Sure, Mark. This was super fun; I could talk about this for an hour or more.

So, guys, we're nothing on Accelerating Value if not eclectic. That's the end of this particular conversation. The next one, I think, is with a CFO, so a somewhat different perspective. And after that we have a religious leader, the head of a denomination, the head of a church, who is going to talk not so much about how he adds value as a priest, but how he adds value as the CEO of that church.

So that's going to be interesting, a little different, maybe even a little weird, but I think he has some really great things to say. With that, I'll say adios, and we'll see you in a week. Thanks so much.

The sooner you can optimize your marketing spend, the quicker you can start delivering clear, measurable value to your business. That's exactly where Business GPS from Proof Analytics can help. Learn more at proofanalytics.ai. You've been listening to Accelerating Value, where raw conversations about the journey to business impact help you weather the storm ahead. To make sure you never miss an episode, subscribe to the show in your favorite podcast player. Until next time.
