The Evolving Leader

‘The Art Of Uncertainty – Embracing The Unknown’ with Sir David Spiegelhalter

Sir David Spiegelhalter Season 8 Episode 3

In this episode of The Evolving Leader, co-hosts Jean Gomes and Scott Allender talk to Sir David Spiegelhalter, one of the world’s foremost authorities on risk and probability. David explains why embracing uncertainty, rather than trying to eliminate it, is essential for leaders who want to build resilience and make better judgments in complex times.

Drawing from his latest book “The Art of Uncertainty”, David shares stories that bring statistics to life, from forecasting failures to the importance of imagination and red-team thinking. This conversation offers both a challenge and a toolkit: how to hold authority while admitting doubt, how to communicate risk with clarity, and how to lead teams toward what he calls ‘safe uncertainty’.

 

Further materials from David Spiegelhalter:

Five Rules For Evidence Communication

The art and science of uncertainty - with David Spiegelhalter (recorded at the Royal Institution, 30 January 2025)


 Other reading from Jean Gomes and Scott Allender:


 Leading In A Non-Linear World (J Gomes, 2023)

The Enneagram of Emotional Intelligence (S Allender, 2023)


Social:

Instagram           @evolvingleader

LinkedIn             The Evolving Leader Podcast

Twitter               @Evolving_Leader

Bluesky            @evolvingleader.bsky.social

YouTube           @evolvingleader

 

The Evolving Leader is researched, written and presented by Jean Gomes and Scott Allender with production by Phil Kerby. It is an Outside production.


Jean Gomes:

Constant uncertainty is a given for every human being. So why are we so obsessed with it right now? Well, obviously, to many of us the world feels a little crazy and unstable, and this is deeply troubling because our brains crave the opposite. We build environments, from houses to organisations and cities, that seek to make the world feel more predictable, more certain. So we live in this paradox between reality and desire. In this show, we talk to one of the world's leading uncertainty experts who, although an incredible statistician, recognises that thinking about this most slippery of subjects is equal parts science and art. So tune in to a brilliant, mind-expanding conversation with Professor Sir David Spiegelhalter about his amazing new book, The Art of Uncertainty.

Scott Allender:

Hi friends. Welcome to The Evolving Leader, the show born from the belief that we need deeper, more accountable and more human leadership to confront the world's biggest challenges. I'm Scott Allender.

Jean Gomes:

And I'm Jean Gomes.

Scott Allender:

How are you feeling today, Mr Gomes?

Jean Gomes:

I'm feeling really uncertain. No, I'm not. I'm feeling really excited about this conversation, because the topic we're talking about seems to be pervasive at the moment, and there's a good reason for that. So, very excited. How are you feeling, Scott?

Scott Allender:

I'm feeling very thankful that it's Friday. It's been a full week, a good week, but definitely with lots of competing priorities for my attention, so I've really been looking forward to this conversation. What a wonderful way to cap the week, because today we're joined by David Spiegelhalter. He's Emeritus Professor of Statistics at the University of Cambridge, and his best-selling book The Art of Statistics has been published in 11 languages. He was knighted in 2014 for services to medical statistics, was president of the Royal Statistical Society, and became a non-executive director of the UK Statistics Authority in 2020. His latest and brilliant book, The Art of Uncertainty: How to Navigate Chance, Ignorance, Risk and Luck, is incredibly useful as a set of tools to help us get a hold on how to think in a more uncertain world. David, welcome to The Evolving Leader.

David Spiegelhalter:

Oh, great pleasure to be here.

Jean Gomes:

David, how are you feeling today?

David Spiegelhalter:

It just keeps erupting at moments: what am I doing here? Someone's going to find out.

Jean Gomes:

So uncertainty is understandably everywhere.

Scott Allender:

I have so many questions, but let me start from my perspective. I'm thinking about Trump and Brexit and Ukraine and AI, and about leaders who have been trained to eliminate uncertainty from their thinking through exhaustive planning and control, and who now feel particularly out of their depth as they get rocked by a series of systemic disruptions. How does uncertainty affect us at the most human level, do you think? What's the starting point for embracing uncertainty? How do we need to be thinking about that?

David Spiegelhalter:

Yeah, the first thing is, you can't eliminate it. People in a lot of periods think they live in an age of uncertainty, but I think we can fairly say we're in the middle of one at the moment. One of the first things we have to acknowledge is that we can't even think of all the things that can happen. I like to think in terms of possible futures. When people are planning, they like to construct scenarios: best case, worst case, perhaps a central scenario. That in no way exhausts the possibilities, and we have to realise that. And even within those scenarios, the one thing we know is that none of them is actually going to happen as we have planned it. So we have to be ready for that, and we do it in our lives: we can't think of everything we're going to be doing in five or ten years' time. We can just have a broad idea of the possibilities of where we might be and try to protect ourselves against the worst things that can happen. I suppose we're going to come back to this again and again: we're talking about resilience, not robustness. Robustness is the idea that we've got a long list of possibilities, we've hedged ourselves and we've optimised, so we're robust to the wide range of circumstances we've thought of. That's not enough any more, because things will happen that we haven't thought of. So we're talking about non-optimisation, not trying to get everything perfected. We can't assume there's a mathematical solution to dealing with this of the sort that we've been taught; I taught decision theory for years and all this stuff, so I know all about it, but it's just not true in the real world as we operate. And so we have to have this extra idea of being prepared for the things we haven't thought of.

Jean Gomes:

And in the last few years, given all the things that have been happening, how has that thinking evolved in your work and its application? What are the most exciting or interesting things you've learned through that?

David Spiegelhalter:

I don't know; I think one of the first things that really brought this home to me was 2010, when an Icelandic volcano erupted, which I'm not going to try to pronounce, and because of the direction of the wind the ash came over Northern Europe and closed down European airspace for ten days. A complete shambles. No planning was ready for that at all. And I was actually brought in on the committee that was hastily constructed in the UK around what are we going to do about it. They had all sorts of people, and it crucially came down to whether the manufacturers would allow their planes to fly in a slightly less precautionary way. The UK had a plane flying up there in the ash cloud taking measurements and showing, yes, you could do it. But there was no planning for it, and it wasn't on the UK risk register. And you might think, well, that was a black swan, nobody could have thought of that. But somebody had thought about it; it just hadn't made the risk register, and so there was no thinking about it. So although I said we often can't think of the possible futures, it doesn't mean that nobody has thought of those possible futures. What that's really brought home to me, in all sorts of other circumstances where surprises have happened and afterwards you realise, well, somebody had thought about that, is this idea of imagination. It's like futurology, I suppose, but in a much more specific sense: rather than long-term trends, we're thinking of actual specific events. I always say 'imagineering' should be a new sort of discipline, like engineering, where you just have to think of things that can happen. And some of the people who have really thought about this, I think, are the UK Ministry of Defence, in two ways. First of all, they've employed science fiction writers and published the stories. They're rather good short stories, because none of them are going to happen, but the idea is, in a way, to broaden our thinking. There's a wonderful one about drones all over London running amok, and it's all to do with governance of drone traffic in a dense area. You realise, yes, this is something that has to be thought about really carefully, and at least people are thinking about it. The other thing the Ministry of Defence really recommends is red teams, and they've got a whole guidebook on the red-team mindset. The classic idea of a red team is a separate group of people who are there to be critical of assumptions, to challenge groupthink, to think of the stuff that other people hadn't thought of, basically to be real pains in the whatsit. In other words, they're thinking of things that could go wrong. And the Ministry of Defence encourages a red-team mindset, so that even as an individual you try to do it yourself. It's self-criticality: trying to challenge your assumptions, trying to be aware of your biases and things like that, which I find very difficult indeed. I've got biases up to my eyeballs, so I've got some awareness of them, but it's very difficult to do anything about them, I think, so it's better if you've got a range of views in a team.
That again brings up some of the stories I'm interested in and talk about in the book, in the area of superforecasting. The forecasting competitions that have been run are a bit different, because they are about specified futures; they're not about general imagining of all the things that could happen. They have a list of possible futures that people have to put probabilities on, and when those events occur, or don't occur, in six months' time or so, the people with the closest probabilities, assessed using a proper scoring rule, win the competitions. And they've shown that near-amateurs with a particular mindset can beat some of the best professional analysts. That mindset is almost a red-team mindset. It's one that is very open-minded, not stuck on one way of seeing the world. They're self-critical, they take advice, they're prepared to change their minds very quickly, they're very agile. They look for every source of information, including, particularly, sources that might disprove what they originally thought. It's been shown in all the work that's been done on these competitions that this particular sort of personality or attitude is the most effective, the best set of characteristics to have.
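
[Editor's note: for readers who want to see what a proper scoring rule and calibration look like in practice, here is a minimal sketch in Python. It is not from the episode or the book; the forecasts and outcomes are invented for illustration. It computes the Brier score for a set of probability forecasts and a crude calibration check, grouping forecasts by stated probability and comparing them with the observed frequency.]

```python
# Minimal sketch: Brier score and a crude calibration check for
# probabilistic forecasts of yes/no events. Illustrative only; the
# forecasts and outcomes below are invented.

from collections import defaultdict

def brier_score(forecasts, outcomes):
    """Mean squared difference between stated probability and outcome (0 or 1).
    Lower is better; always saying 50% gives 0.25."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

def calibration_table(forecasts, outcomes, bins=(0.2, 0.4, 0.6, 0.8, 1.01)):
    """Group forecasts into probability bands and compare the average stated
    probability with the observed frequency of the event in each band."""
    groups = defaultdict(list)
    for p, o in zip(forecasts, outcomes):
        band = next(b for b in bins if p < b)
        groups[band].append((p, o))
    table = {}
    for band, pairs in sorted(groups.items()):
        mean_p = sum(p for p, _ in pairs) / len(pairs)
        freq = sum(o for _, o in pairs) / len(pairs)
        table[band] = (round(mean_p, 2), round(freq, 2), len(pairs))
    return table

# Invented example: probabilities given for ten events, and what happened.
forecasts = [0.9, 0.8, 0.7, 0.6, 0.55, 0.4, 0.3, 0.2, 0.1, 0.5]
outcomes  = [1,   1,   0,   1,   1,    0,   0,   1,   0,   0]

print("Brier score:", round(brier_score(forecasts, outcomes), 3))
print("Calibration (band -> mean forecast, observed frequency, n):")
print(calibration_table(forecasts, outcomes))
```

A well-calibrated forecaster's stated probabilities track the observed frequencies in each band, which is exactly the property these competitions reward.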

Scott Allender:

What are you seeing in terms of the trend there? Because it feels like, and this is just my feeling, I haven't studied this, but it feels like people are becoming in many ways less open, right?

David Spiegelhalter:

That's a real problem. Do you know the phrase 'the hedgehogs and the foxes'? A lot of people have used it. Very briefly: in the old story, the hedgehog knows one big thing, but the fox knows many things, and it's foxes, with that agile, uncommitted mind that isn't wedded to a particular worldview, who do best in forecasting. Hedgehogs, who are part of a group of people who think the same way and try to fit the world into their particular worldview, are not good at making these judgments. And I think you're right: it's an increasingly polarised world. People are becoming hedgehogs of one sort or another, and that fluid central ground, where you really are adapting, and adapting very quickly, to what you hear, is lost if you have taken up a committed, polarised position. It's a really depressing thought, but it could be true. I'm not sure about the empirical evidence on that, but yeah, it certainly seems plausible.

Jean Gomes:

You've also noted that people's tolerance for uncertainty varies enormously: some team members thrive on unpredictability, while others are just gripped by anxiety. If you're a leader looking after a team, or somebody who's trying to inspire people, how do you manage those differences so that both the cautious and the adventurous thinkers are heard and the team can still make good decisions?

David Spiegelhalter:

Oh, that's, I think, slightly outside my expertise as a common-or-garden statistician, but it is important, because people vary enormously, and actually people vary within themselves in their attitudes to uncertainty. I've known people who took what I thought were extraordinary physical risks and yet were very cautious about their money, and people who would take very big social risks and yet were incredibly cautious about their health. So there's all sorts of variation between us and within us, and yes, we have to take account of those different views. And of course a diversity of views is a really good thing to have. If everybody was really gung-ho and risk-taking, that could be extremely dangerous; similarly, if everyone was very risk-averse and incredibly cautious, that'd be pretty hopeless in a team too. So it's that range of viewpoints that is such a valuable thing to have, and of course you've got to have that discussion. An example I use is from back in 2011, when Osama bin Laden was thought to be in this compound in Abbottabad in Pakistan, but it wasn't certain at all. And President Obama did a brilliant thing, which is that he set multiple independent teams on the job. I don't know if he deliberately set it up so that some were rather red-teamish, very critical, and some were more gung-ho and optimistic, but they came up with a very wide range of assessments. He said afterwards that some thought there was only a 30 to 40% probability that he was there, and others were at 80 to 90%, really confident. So some were being really cautious, and he then had to make the decision, but he knew the range of opinion. He said he thought it was about 50:50, and, wait for it, it was the right decision. Well, unless you were Osama bin Laden, of course. The point being, and it's been argued afterwards, there's been an academic paper written saying that maybe those teams should have got together beforehand and come up with a single view to give the decision-maker, and I think that's completely wrong. It's absolutely right that the person who, in the end, has to carry the can knows that there's a diversity of opinion from different perspectives, and has to make that judgement themselves. They have to take responsibility for it.

Jean Gomes:

One of the things that keeps coming up in these conversations is people's ability to make the distinction between risk and uncertainty, and how that often just gets mushed up with emotions and so on. How do you help people with that?

David Spiegelhalter:

I don't actually like that distinction much at all, because I just see it all as a continuum. Risk typically characterises situations where we kind of know the probabilities, the gambling areas, the stuff where we feel we've got everything in control, those sorts of situations where you can optimise and do it mathematically. And uncertainty covers those situations where, as I said, you might not even be able to list the possibilities, you can't assess the probabilities, the evidence isn't very good, it's all a bit of a mess. I just see that as a complete continuum, because even in the risk areas there are always judgments, there are always assumptions. I always carry a two-headed coin, I've got it somewhere. I flip the coin and ask, what's the probability this coin comes up heads? They all say, oh, 50:50, and I say, you're wrong. You made that assumption because you trusted me. You must be mad. So even in those apparent risk situations you're full of judgments, full of assumptions that may be wrong. So I don't particularly like that distinction, and in particular I don't like the distinction made a hundred years ago by Keynes and Frank Knight and others, that in one case you could calculate the probabilities and in the other you couldn't. That was before the age of subjective probability and superforecasting and everything else, where we know that subjective probabilities can be extremely valuable, can be very good, coming from the right assessors, and so on. You don't need to be able to calculate probabilities based on data in order to make good judgments. So I don't really like that distinction, but there absolutely is a continuum: from nicely controlled problems, where we might be able to assess some reasonable numbers, and even the values of the outcomes, and optimise and do our proper risk analysis, down to the shambles in the corner of what people call radical, or I like the term deep, uncertainty, where you can't even list the outcomes and you certainly can't think of the probabilities. There are some interesting, slightly odd situations in between, where you can't think of everything that can happen but you can still think about probabilities. And you may think, how can you do that? The Bank of England does it. They model the central 90% of their belief about the future, about inflation and growth and so on, but the outside 10%, the two tails, they don't make any attempt to model. They just say it's somewhere out there, could be anywhere out there. They're not putting a probability distribution on it; anything could happen, essentially. In other words, they're giving a probability to 'none of the above', to something else, and I think that's really nice. It's sort of giving a probability to an unspecified situation. So when, in the first quarter of Covid, GDP dropped 25%, the classic 25-standard-deviation drop, it really happened, they could say, well, sorry, that's just part of the 5% tail. You've got to be ready for this, you've got to be resilient to these things, because they can happen. And I really like that idea of having, essentially, an honest probability given to everything else that we haven't listed above.
It's an old idea in statistics, in Bayesian statistics, what's called Cromwell's law: that you should always have a certain amount of probability unassigned, sort of free-floating, for something you've never thought of, because that mathematically enables you to adjust incredibly quickly to new circumstances. A good fox-like brain, a Bayesian brain, will operate like that, because it will never think it's actually got the problem solved.
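
[Editor's note: as a concrete illustration of Cromwell's law, here is a small sketch, ours rather than David's, with invented numbers. It shows a Bayesian update over a handful of named scenarios plus an explicit 'something else' category that never gets probability zero.]

```python
# Sketch of Cromwell's law: keep some probability on "something else"
# so that a surprise can be absorbed by updating rather than by breakage.
# Priors and likelihoods below are invented for illustration.

def bayes_update(prior, likelihood):
    """Return posterior probabilities given a prior dict and a dict of
    likelihoods P(observed data | hypothesis)."""
    unnormalised = {h: prior[h] * likelihood.get(h, 0.0) for h in prior}
    total = sum(unnormalised.values())
    return {h: v / total for h, v in unnormalised.items()}

prior = {
    "scenario A": 0.5,
    "scenario B": 0.4,
    "something else": 0.1,   # Cromwell's law: never assign zero here
}

# Data arrives that is very unlikely under A or B but quite compatible
# with some unmodelled mechanism.
likelihood = {
    "scenario A": 0.01,
    "scenario B": 0.02,
    "something else": 0.5,
}

posterior = bayes_update(prior, likelihood)
print({h: round(p, 3) for h, p in posterior.items()})
# The reserved 10% lets the belief swing quickly towards "something else".
# With a prior of exactly zero on it, no amount of evidence could move it.
```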

Jean Gomes:

It's very helpful. Thank you.

Scott Allender:

Yeah, super helpful. I like that thinking of risk and uncertainty on a continuum that feels true.

David Spiegelhalter:

Yeah, I think a continuum, because there's a lot of uncertainty in any risk assessment; there are always judgments and assumptions.

Scott Allender:

So you've shared some really interesting examples, and another story you recount in your book is the Bay of Pigs story, where a 30% success probability was watered down to being described as 'a fair chance of success', which contributed, of course, to a disastrous decision. I'm curious, do you see similar sorts of fuzzy language in businesses, and how can leaders ensure that they and their teams are communicating risks with clarity and precision?

David Spiegelhalter:

Yeah. I'm not sitting there where these important decisions are being made, but I do just grip the table slightly when people talking about important things just use words like 'could happen', 'might happen', 'could possibly'. And of course we use this in our language all the time; we can't put numbers on all the stuff we think might happen. But, as I say, intelligence agencies have gone beyond this. I've got a mug from MI5 on my desk right in front of me, because I gave a talk at MI5 and all I got was this lousy mug, but it says on here that if I think something has a 25 to 35% probability, like the Joint Chiefs of Staff thought for the success of the Bay of Pigs, I have to report that as 'unlikely', not 'a fair chance' like they did, which was just hopeless. And if I use the word 'likely' in an intelligence report, I can use words all the time, but I have to mean between 55 and 75% probability. They're quite broad bands, but they at least get people onto a similar wavelength in their use of words. The same thing is done in climate change: the word 'likely' is used in lots of different ways, but there it's pinned to a range of roughly 66 to 100%, quite a broad range for 'likely'. And I think this is incredibly valuable, because we know that people, with their different attitudes to risk, will interpret words in many different ways.
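
[Editor's note: to make the idea concrete, here is a tiny sketch, illustrative only, of the kind of probability-to-words lexicon David describes. The band edges loosely follow the figures he quotes for intelligence-style reporting; the remaining edges and terms are assumptions, not an official standard.]

```python
# Sketch of a probability lexicon: map a numerical probability to an agreed
# verbal term so that "likely" always means the same band to everyone.
# Band edges are illustrative, loosely following the ones quoted in the
# conversation (e.g. "unlikely" = 25-35%, "likely" = 55-75%); they are not
# an official standard.

LEXICON = [
    (0.00, 0.05, "remote chance"),
    (0.05, 0.25, "highly unlikely"),
    (0.25, 0.35, "unlikely"),
    (0.35, 0.55, "realistic possibility"),
    (0.55, 0.75, "likely"),
    (0.75, 0.95, "highly likely"),
    (0.95, 1.01, "almost certain"),
]

def to_words(p):
    """Translate a probability into the agreed verbal term."""
    for lo, hi, term in LEXICON:
        if lo <= p < hi:
            return term
    raise ValueError("probability must be between 0 and 1")

print(to_words(0.30))  # -> "unlikely"  (the Bay of Pigs assessment)
print(to_words(0.65))  # -> "likely"
```

The point of a fixed lexicon is not the exact edges but that everyone in the room decodes the same word into the same band.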

Jean Gomes:

One of the things that I really enjoyed was the quiz that you did at the Royal Institution talk that I saw a few months ago. Can you talk us through that, in terms of helping us to quantify our ignorance?

David Spiegelhalter:

The thing that really annoys me when people try to predict the future is that they just say what's going to happen. I don't want to know what you think is going to happen; I want to know your probabilities for what's going to happen. And so in forecasting competitions, and in good intelligence analysis, people only use probabilities. They don't say, oh, I think this will happen. No, I don't care what you think, I want to know your probabilities, and I also want good evidence that you are a good probability assessor. And you can train people up in that using the sort of quiz that I do with audiences and that I've got in the book, which essentially just asks them almanac questions: which is further north, Brussels or Kyiv, or something like that, which is in fact a really nasty question; or which has got a bigger population, Cambridge, England or Cambridge, Massachusetts? So I ask these questions, which I hope people don't know automatically, and I say, I don't want to know which one you think is right, I want to know your probability for which is right. So which one are you most confident in, A or B, and then what's your confidence? It's very simple, on a scale of five, six, seven, eight, nine, ten: ten out of ten, you're absolutely sure; five out of ten, you've got no idea. I can do this with young school kids and they get it immediately, well, certainly after the first question, when they make a mess of it. So it's a really straightforward thing to do, and we're assessing subjective probabilities for facts, which is really quite a sophisticated idea, and yet it's very natural; people can do it. And then, of course, you tell them the answer and you score them, using a particular scoring rule, what's called the Brier scoring rule, which was developed in weather forecasting in the 1950s and is the one used in all the superforecasting competitions, and I've tweaked it a bit to make it, I think, easier to use. Essentially, if you say ten out of ten and you're right, you get 25 points. If you say ten out of ten and you're wrong, you lose 75. So it's really vicious, highly asymmetric, and that's deliberately designed. If you say nine out of ten and you're right, oh God, you only get 24, and if you're wrong, you lose 81. I can do it in my head, because of a clever trick. So you get your points, and some people, the ones who know a lot or are a bit lucky, get quite a lot of points; people who are quite cautious, the risk-averse ones who know what they don't know, give fives, sixes and sevens and stay around zero; and then, of course, you get people with big negative scores, the gung-ho people who say, oh yes, I'm sure Brussels is further north than Kyiv, or I'm sure Barnstable, Massachusetts has got more people, that sort of thing. The other one I do is: who's older, President Trump or King Charles III?
Maybe that's not too difficult, but I don't know who's older, the Prince of Wales or the Princess of Wales, all these sorts of things. And then, if the gung-ho people get it wrong, they lose 75, and then they try to catch up, and they go even harder, and they end up with massive negative scores. It's particularly associated, I feel, with young males, but never mind. And I would say these are the sort of people you do not want as your financial advisers, because they don't know what they don't know. The other thing I say is that a chimpanzee can get zero just by saying five every time; you should not be getting negative points. So I think it's a really good training exercise in subjective probability assessment, to stop people being overconfident, to stop them saying, oh, I'm 99% certain, when actually they're not. They're not calibrated, in other words, because somebody is well calibrated if, when they say 'I'm 90% sure', they're right roughly 90% of the time.
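
[Editor's note: for anyone who wants to try the quiz scoring at home, here is a minimal sketch of a quadratic, Brier-style scoring rule consistent with the endpoints David quotes: +25 for a confident correct answer, minus 75 for a confident wrong one, and zero for saying 'no idea' (5 out of 10). The exact tweak used in the book may give slightly different intermediate penalties; this is an illustration of the shape, not the book's table.]

```python
# Quadratic (Brier-style) quiz score, scaled so that:
#   confidence 10/10 and right  -> +25
#   confidence 10/10 and wrong  -> -75
#   confidence  5/10 (no idea)  ->   0  either way
# This reproduces the endpoints quoted in the conversation; the precise
# tweak used in the book may differ.

def quiz_score(confidence_out_of_10, correct):
    p = confidence_out_of_10 / 10.0          # stated probability for your answer
    if not 0.5 <= p <= 1.0:
        raise ValueError("confidence must be between 5 and 10")
    if correct:
        return round(25 - 100 * (1 - p) ** 2)  # small reward, capped at +25
    return round(25 - 100 * p ** 2)            # steep penalty, down to -75

for c in (5, 6, 7, 8, 9, 10):
    print(c, "right:", quiz_score(c, True), " wrong:", quiz_score(c, False))
```

Note how asymmetric it is: the most you can ever win on a question is 25, but an overconfident wrong answer costs three times that, which is what punishes the gung-ho and rewards knowing what you don't know.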

Scott Allender:

Do you know if that leads to sustainable change for people? Like, when that overconfident young man gets such a negative score, does he actually shift?

David Spiegelhalter:

They can improve over a competition, they really can, because they learn so quickly: oh my god, I've got to be careful. Apart from the ones who say, I've got to catch up, and get even worse; that's the gambler who puts even more on after they're losing, and there's always those. But as I said, don't gamble, whatever you do. People pick up on it very quickly and realise, whoa, I've got to be careful, I don't want to lose all my points, I'm going to stick to what I genuinely feel rather than exaggerating my confidence.

Jean Gomes:

That kind of metacognition, the ability to actually understand your confidence around what you do and don't know in a situation of uncertainty, is incredibly valuable, and it could be applied in lots of different circumstances.

David Spiegelhalter:

Oh, just in your normal life. It just applies.

Jean Gomes:

What else could we do to increase that, do you think?

David Spiegelhalter:

I think quizzes like this should be built into the school curriculum. People have used this sort of probabilistic, confidence-based marking for questions, rather than simple multiple choice: you have to say how confident you are in your answer. I think it's a very good way of marking people's knowledge, because if you know what you don't know, you don't get marked down as much as if you just stab at an answer.

Jean Gomes:

Where in your personal life do you find you don't necessarily always use that yourself?

David Spiegelhalter:

Yeah, I don't like it in my personal life, because just because I know and say all this stuff doesn't mean I'm not hopelessly optimistic. That's the problem, and at least I do have that insight. So I'm a positive liability when it comes to making decisions in the face of uncertainty. I'm counterbalanced by my partner, who's very good, much more down to earth: no, that will take more time than that. She pulls me back from my gross optimism about everything we can manage to do in the next three hours, or something like that.

Jean Gomes:

Well, it's heartening that you're not perfect.

David Spiegelhalter:

No, exactly. But on the other hand, what it shows is that people are different about different things. When you go on holiday, do you plan everything? Do you know where you're going to stay, what you're going to do?

Jean Gomes:

No, my wife wants to plan everything. I don't.

David Spiegelhalter:

Okay, I was different. I used to plan every holiday meticulously: where we were going to go, everything. I was on the guidebooks, we have to see this, this is where we're going to stay. My partner didn't want to think about anything until she was on the plane, if then, and we reached a compromise. Now we go backpacking in Asia, and I'm allowed to book the first two nights, and that's it. Then we make it up as we go along. It's chaos, and I've learned to deal with it. At first it seemed absolutely horrific, but I think it's because I'm with her, because she's a really good, solid travelling companion, that I managed it. So I've managed to change and become less of a planner and more open to the uncertainty, the possibilities, and it's great, it's wonderful, but it did take a while.

Scott Allender:

Let's build on that. You said your relationship with travel has changed. Is that the primary way, or what else have you learned about coming to terms with doubt and uncertainty? You wrote about it in a recent piece; I just wanted to flag that.

David Spiegelhalter:

Yes, exactly. I felt this quite strongly, because my father used to suffer from what he called travel fever, and apparently there's a term for this in other languages: when your anxiety about what could go wrong or what might happen, your uncertainty, becomes so much that it paralyses you. He had to stop travelling, which was such a shame, because he was a good traveller, but he just got so anxious about it. And this is a standard thing: there are standard intolerance-of-uncertainty scales, and if it gets high enough it becomes a clinical condition, where people just can't cope without being absolutely sure of what's going to happen at every stage. And I started feeling I was suffering from this a lot. I couldn't sleep for days before going on a trip; I was in a terrible state. So I went to a psychotherapist, and it was really valuable, because I've since found out that what she did on me is an absolutely standard bit of cognitive behavioural therapy, but I thought it was great. She just said, okay, don't try to deny the feelings. Feel the feelings, but then just say to yourself: these feelings are identical to what I would be feeling from the excitement of a really exciting, fun thing about to happen. The butterflies, that feeling. So you're just trying to recode that anxiety in your brain as excitement. I now know it's completely standard, but I give myself a little bit of a talking-to, and I think it's worked to some extent. And what that is, is recognising that the situation is just as uncertain; I haven't changed what I do at all, not in the least. The actual chances of stuff happening haven't changed one bit. It's purely the way I respond to it that one can change.

Jean Gomes:

So there's a good reason why your book is called The Art of Uncertainty, because it's not just a book about statistics. It's also about recognising, as you say, that the map is not the territory. Can we talk about that a bit?

David Spiegelhalter:

Yeah, it's so important, because I'm a mathematical statistician. In my life I've done huge amounts of data analysis and modelling, using packages, writing packages, doing all this stuff where you are trying to tame uncertainty, to get it into the form of an uncertainty interval that you can calculate about a quantity or about the future or something like that. I've done that and taught it all, and it's great, but it's all wrong. Well, it's not really fair to say it's wrong, because that sounds like you could throw it out. No, it's deeply limited. And that's why the book is called The Art of Uncertainty: it's not a science of uncertainty, not an algorithmic process you go through, although I've taught it as that. Work out what the options are, what the possibilities are, put your utilities on them, assess the probabilities, calculate the subjective expected utility, and there's the right decision to take. I've taught that for years. It's not nonsense, but it's a sort of idealised package that really is not going to work in most circumstances. So a lot of what I teach or talk about now is trying to quantify as much as you can while being fully aware that you can never completely do it. I really love statistical analysis, I like good models and things like that, but none of them are right. It's an old saying: all models are wrong, but some are useful. And you really have to get that into your mentality. Every time I see a calculated interval, a p-value, all these things, I just know they're all wrong, because they all assume that every assumption in the model is true, and that is definitely not the case. Now, they may not be importantly wrong, but they're not right; they're not a fact about the world. And I think that's quite humbling and very valuable, because it gives you a degree of humility about any analysis you might do. That's why, as I said, with the simplest thing, flipping a coin and asking people the probability of a head, they say 50:50, and I say, sorry, you're wrong, because you trusted me, I've got a two-headed coin. You can be caught out just by not understanding something about the problem, and that can mess up everything you've done, even though you've confidently claimed this analysis. One way to contribute to that humility, I think, is something I now do and recommend, and many people do, which is, as well as doing the quantitative analysis and calculating the estimates and the intervals and all the stuff that comes out of the packages, actually to look inside yourself and say: well, how confident am I in this analysis? Do I actually believe this stuff? How good is the evidence? Is the data just ropey? Am I suspicious about where it came from, or the design of the study, and so on? Or about my understanding of what goes on in the world that I needed to build the model, all the distributional assumptions you have to make, the independence assumptions? What are all the things that could go wrong? In some situations you're very confident, you really feel you've nailed it; think of car insurance or something.
You've got vast amounts of data, you've got a lot of information, nothing can go that wrong, and so you can have high confidence in your analysis. But if you're insuring, say, spacecraft, which is becoming more common now, or things that have rarely happened, you've got to admit that you haven't got much confidence in all your calculations. Similarly, during Covid, for example, when the scientific committees were giving advice to the government, they qualified all their judgments by how confident they were in their analysis: low, low-to-moderate, moderate, moderate-to-high or high confidence that the data we've got can answer the question we've been asked. I now do that absolutely routinely, and recommend that everyone does: a sort of meta-view of your quantitative analysis. It's what you could also call indirect uncertainty, not the direct uncertainty about the quantity or what's going to happen, but your indirect uncertainty, all the other stuff, which you might normally just put as little caveats at the bottom of a report. That's completely inadequate. You should have it upfront: actually, we don't really know what's going on, or we really do know what's going on.
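
[Editor's note: as a sketch of how that habit might look in practice, here is a small, illustrative result record, our format rather than a prescribed one, that pairs the direct, quantitative uncertainty, an estimate and interval, with an explicit qualitative statement of indirect uncertainty, the analyst's confidence in the analysis itself. The example values are invented.]

```python
# Sketch: report direct uncertainty (estimate + interval) together with
# indirect uncertainty (confidence in the analysis itself), rather than
# burying the caveats in a footnote. Format and values are illustrative.

from dataclasses import dataclass

CONFIDENCE_LEVELS = ("low", "low-to-moderate", "moderate", "moderate-to-high", "high")

@dataclass
class Finding:
    quantity: str
    estimate: float
    interval_95: tuple          # direct uncertainty about the quantity
    analysis_confidence: str    # indirect uncertainty about the analysis
    caveats: str

    def report(self):
        lo, hi = self.interval_95
        return (f"{self.quantity}: {self.estimate} (95% interval {lo}-{hi}); "
                f"confidence in this analysis: {self.analysis_confidence}. "
                f"Caveats: {self.caveats}")

finding = Finding(
    quantity="weekly growth rate of infections (%)",
    estimate=3.2,
    interval_95=(1.5, 5.0),
    analysis_confidence="low-to-moderate",
    caveats="sparse data in younger age groups; model assumes constant reporting rates",
)

assert finding.analysis_confidence in CONFIDENCE_LEVELS
print(finding.report())
```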

Sara Deschamps:

Welcome back to The Evolving Leader podcast. As always, if you enjoy what you hear, then please share the podcast across your network and also leave us a rating and a review. Now let's get back to the conversation.

Scott Allender:

I'd love to get your thoughts on what might need to change in organisational cultures to allow leaders to articulate the level of uncertainty they might be feeling. Because I'm thinking, as you're talking, about the pressure that's on people to give the appearance that they know everything, right? Stakeholders and investors and your boss want to know that you know. So there's almost this externalised pressure to overstate your confidence in a decision, where actually the real strength might be in being able to articulate clearly the level of uncertainty that's part of that decision. What are some thoughts you might have for leaders listening right now?

David Spiegelhalter:

Yeah, it's very common, because I talk to people in business and people in government, and they all say, yeah, but my boss, or my minister, wants to know what's going to happen. He doesn't want all this hedging; he wants it laid out as a tightly defined series of calculated options. Essentially, they want the bigger uncertainty to be taken out of the problem, and I just say, well, who's going to tell them? It's partly because they don't want to have to admit it, and partly because they almost don't know what language to use when explaining it to others. Politicians are hopeless. They cannot admit uncertainty; they have to be absolutely confident about everything they do, about every decision, and it paints them into corners they can't get out of. In Covid, they couldn't admit that they didn't know what was going on, so they made all these recommendations that they then couldn't go back on, even when they were quite quickly found to be wrong. Wiping surfaces and all this stuff: we knew within a couple of months that it was all a waste of time, but no, they couldn't say that, so people were still wiping surfaces a year later. They can't go back on it, because they had to be so confident. They had no idea of provisionality: what we're saying may be fair for now, but we may change our minds. Some people who get to a certain level find that completely impossible to do, and it does require confidence. It also helps if you're already a trusted person; you can then admit uncertainty more. Because the research we've done, and not just us but others too, has shown that if you communicate with a sense of confident uncertainty, if you bring people into your confidence and say, look, we know a lot, but we don't know everything, and there are these things we don't actually know yet; we're doing our best to find out, but in the meantime you could do this, and our advice will change; then that is what I would call trustworthy communication of uncertainty, and it is very difficult to find among senior figures. You could find it in Covid; some senior scientists could do it. But among politicians it's extremely difficult. They don't seem to have a language for it. Partly, I think, they hold a myth: that if we admit uncertainty, nobody will ever trust us any more, the public will just reject us. It's completely wrong. The evidence we've got, not just us but from others, is that if you do admit uncertainty and give a balanced opinion, well, there are benefits and harms to this policy, but on balance we decide this is the best thing to do, though it isn't going to suit absolutely everybody and there is some uncertainty, then the people who were really sceptical of you trust you more. And that's reasonable, because you're addressing their concerns, you're saying what they were concerned about, rather than just saying, no, what you're saying is nonsense, vaccines are safe and effective. You say, you know, all vaccines are safe enough and effective enough to use on some people in some circumstances, but not on everybody all the time.
And there are still some uncertainties, and if you say that, the sceptics trust you more. What's interesting is that it means that when governments put out what you might call a persuasive message, 'vaccines are safe and effective and you should take them', they are actively decreasing trust in the group they're trying to reach. They're making it worse; they're making the people who are sceptical trust them even less. But on the whole they don't have a language for doing it differently, and that's why I push all the time for trustworthy communication, a language for expressing uncertainty. That doesn't mean saying, well, we don't know anything, or your guess is as good as mine; no, you can't do that. It's terribly important, I think, for trustworthiness in a society which is increasingly becoming polarised, with untrustworthy statements coming from both sides, to be honest, from all sides. To hold that middle ground, and to actually try to recognise that there's a balance, that there are winners and losers, that the issue is not absolutely black and white, and to take that position, is becoming very difficult, and yet I think it's even more crucial to be trustworthy.

Jean Gomes:

We've kind of talked around this, but without using this word: luck. Where does that fit into your thinking?

David Spiegelhalter:

Oh, I like luck. I think it's great. I don't think it exists out there; it's not some external force, the goddess Fortuna is not watching over us and either rewarding us or punishing us. I don't believe in that, but I think it's a good term for describing, really retrospectively, things that happen to us that are unexpected and out of our control and yet have an impact, either good or bad. We're on the wrong plane at the wrong time, or we happen to win on a lottery ticket, or something like that. So I like it as a concept because it has those characteristics, and it's one of the aspects of living with uncertainty: things will happen to us because we're not in control of our lives most of the time, only to a limited extent. Things are going to happen to us that we didn't expect, and being able to identify that is valuable; it acknowledges the lack of control we have. Many people have said this before: we've got an exaggerated sense of the amount that we control our lives and are personally responsible for the situations we're in. Philosophers have really identified this with the idea of constitutive luck, and I love this. This is the luck of just who you were born as: your ethnicity, your family, your time in history, the place, your siblings, the community and situation you're born into, your genes, your parents, things over which you had zero control. You don't choose your parents, and they rarely choose you. You've struck a nerve there; for the book I researched my own conception, which I still feel slightly uncomfortable about.

Jean Gomes:

You weren't going to make this personal.

David Spiegelhalter:

No, I know, I know. But it really changed my perspective on my life, realising the bizarre chain of circumstances that led to me being here, and how easily I might not be here. It's a bit humbling to realise how amazingly easy it is that you wouldn't be here. So that's constitutive luck, and that's hugely important: who I was born as, where and when. It's affected my life all the way through. Then they've identified circumstantial luck, which is being in the right place at the right time, or the wrong place at the wrong time, being on the wrong plane at the wrong time, or something like that. And then outcome luck: how things happen to work out for you at that moment, good or bad. So yeah, I think it's a good way of taking apart our experience.

Jean Gomes:

Well, when people look at their decisions and do the post-mortem on them, how often do they conflate luck, good or bad, with the quality of the decision?

David Spiegelhalter:

Yeah, I think it is quite reasonable to say, well, it ended up badly, but that doesn't mean it was a bad decision. This is particularly the case in health decision-making. I've got prostate cancer, so I've been in the situation of trying to make decisions about my treatment, and the anxiety is that, whatever the decision, I can't control what's going to happen; otherwise it'd be quite clear what to do. You can't, because there's so much uncertainty. But research has shown, and I feel it myself, that a big fear is the fear of regret: oh my god, if only I'd done that, this would have been all right. In other words, if only I'd known better at the time, if only I'd thought of that, if only I'd really considered the decision more. Because the evidence, I think, is that even if you make a decision and it happens to go badly, that's bad luck; your tumour coming back is luck, really. But if you just didn't do everything, if you didn't take even the minimal steps to avoid it, then actually you can regret it: I could at least have made my chances better if I had done X, Y and Z. So I think the fear of regret is very strong in us as individuals, and what that illustrates is that you don't regret things so much if they don't work out well but it's just bad luck; well, I did the best I could. And that is, I think, deeply reassuring, because we can't control everything that happens to us, we can't make sure everything turns out well. We just don't want to be kicking ourselves.

Scott Allender:

So you've touched on a lot of this already, but as we near the end of our time, how can we better prepare ourselves as leaders for a more uncertain world?

David Spiegelhalter:

Yeah, I've learned this recently, it's not in the book, that people in family therapy have developed this idea of safe uncertainty. What they found is that people used to go to therapists in a state of what they call unsafe uncertainty: oh my god, I don't understand what's going on, I don't have any control of my life, I'm in a state, anxious, panic attacks and so on. Unsafe uncertainty. And they would go to their therapist with the illusion that the therapist could cure them, could move them into a situation of safe certainty: everything's sorted, I have solved your problem, and they could walk out all cured. Now this, of course, is a delusion in therapy. And so there's this move, which I think is really clever, that you shouldn't even pretend that you can put someone into a situation of safe certainty. What you want to do is try to move them into a situation of safe uncertainty. That means we can't solve the problem completely; there will still be things you're not in control of, stuff will happen that you didn't expect. However, what we would hope is that through the therapy you have now built the resilience to be able to deal with that, in a way that it's not unsafe any more. This is now also used in child protection and all sorts of places as a recommended idea, and I really like it. When I talk to analysts who are advising ministers, I say, don't ever let them think that you can put them into a situation of safe certainty. You're their therapist, really: they want help, and they expect to be told exactly what to do or what is the case. Don't ever pretend that you can put them into that situation. What you want to try to do is move them into a situation of safe uncertainty. In other words: we've done our best, we've thought of the things that can happen, we've hedged the best we can. It's not completely safe, but we do feel now that we're resilient to what the world might throw at us, up to, you know, absurd levels. Maybe not an alien invasion, but come on, we've got to draw the line somewhere.

Jean Gomes:

I love that. At the end of your book you finish with a manifesto for uncertainty, which helps us to recognise that it's our personal relationship to the world that defines our perception of uncertainty. Can you unpack that a little for us?

David Spiegelhalter:

Yeah, it's something I feel quite strongly, actually, from my background in Bayesian statistics, which is based on the idea that probability doesn't exist, that it's not out there as a property of the world. It's just an expression of our personal relationship to the world. And this goes along with the idea of uncertainty being a conscious awareness of ignorance: it comes back to ourselves, to a relationship. And I think it's really helpful to realise that we have to live with it. We can even learn to enjoy it in some circumstances: a bit of risk, a bit of fun, a bit of uncertainty, we like it. But we do have to have the humility to realise we cannot totally control everything, we can't optimise everything, we can't model everything. We can have a go; it doesn't mean we shouldn't try, but we can never actually achieve it. And that requires both some boldness, to really try to imagine all the things that might happen, and never thinking you've actually done it, because you always have to have that built-in agility to switch when things you didn't expect happen to you. Writing this book changed my life completely, and so I kind of feel that these lessons are useful. It's not a self-help book, but I'm making it sound almost as if it is, because I actually think it is quite useful to have some of these ideas in your own personal life. But I also feel it's quite appropriate to organisations as well.

Jean Gomes:

Yeah, it definitely is. And I think at the heart of every self-help book is changing the relationship itself. You're doing that in a very different way from the typical self-help book, but it is very powerful from that perspective.

David Spiegelhalter:

Thank you, that's really kind, although I keep having to tell my publicist that it's not a self-help book. Not that it matters much, because people have bought it, I think, thinking, oh, this will tell me how to live my life in the face of uncertainty, and then they see the binomial equation and they go into a state.

Jean Gomes:

Well, Scott and I are avowedly not mathematical, and we had Marcus du Sautoy on the show, and he tested us, oh no, and we failed very badly.

David Spiegelhalter:

I wouldn't do that. Then I'm even more pleased that you actually enjoyed it.

Scott Allender:

Yeah, absolutely, yeah. Thank you. Thank you. And thank you for joining us and for sharing some of the wisdom from that book. It's such a rich conversation. Really, really great. Thank you.

David Spiegelhalter:

It's huge fun. Thank you so much.

Scott Allender:

And folks, if you haven't got his book, The Art of Uncertainty, do that now. I can't guarantee that you will like it, but I'll give it a probability of nine out of ten that you will. So do it, and until next time, remember: the world is evolving. Are you?
