The Evolving Leader

'The Extinction of Experience' with Dr Christine Rosen

Christine Rosen Season 8 Episode 7

In this thought-provoking episode of The Evolving Leader, hosts Jean Gomes and Scott Allender talk with Dr Christine Rosen, senior fellow at the American Enterprise Institute and author of The Extinction of Experience. The conversation examines how our growing dependence on technology is subtly reshaping what it means to be human: thinning our attention, dulling empathy, and changing how we connect and make sense of the world. Drawing on history, neuroscience and cultural observation, Christine invites us to pause and consider: what is the true cost of convenience, and how do we reclaim embodied, authentic experience in a digital world?

For leaders, this conversation is a wake-up call. As AI accelerates and organisations seek ever greater efficiency, Rosen argues that the future advantage will belong to those who can cultivate distinctly human capacities: presence, curiosity, and discernment. Her insights challenge us to create workplaces that strengthen, rather than outsource, our humanity.

 

Further materials from Christine Rosen:

‘The Extinction of Experience’ (January 2025)


 Other reading from Jean Gomes and Scott Allender:


 Leading In A Non-Linear World (J Gomes, 2023)

The Enneagram of Emotional Intelligence (S Allender, 2023)


Social:

Instagram           @evolvingleader

LinkedIn             The Evolving Leader Podcast

Twitter               @Evolving_Leader

Bluesky            @evolvingleader.bsky.social

YouTube           @evolvingleader

 

The Evolving Leader is researched, written and presented by Jean Gomes and Scott Allender with production by Phil Kerby. It is an Outside production.

Send a message to The Evolving Leader team

Jean Gomes:

As technology becomes ever more compelling, powerful and integrated into our lives, more of what was real experience is being mediated by screens and speakers. For what we gain, we must also question what we are losing and what that ultimately costs us in the future. This is the work of Christine Rosen, most recently explored in her new book, The Extinction of Experience. Tune in to an important conversation on The Evolving Leader.

Scott Allender:

Hi folks. Welcome to The Evolving Leader, the show born from the belief that we need deeper, more accountable and more human leadership to confront the world's biggest challenges. I'm Scott Allender.

Jean Gomes:

and I'm Jean Gomes.

Scott Allender:

How are you feeling today, Mr Gomes?

Jean Gomes:

Well, I'm feeling on the cusp of something. Even though everybody for the last month has probably been on holiday, I think I've never worked so hard getting ready for the set of challenges facing our business and our clients and our research, so I'm feeling very excited about all of that. But I'm also feeling, in a strange way, more tired as we start September than I've probably ever felt before, so I need to listen to that and do something about it so I'm good to go for the rest of the year. Other than that, I'm feeling very, very motivated and excited. How are you feeling, Scott?

Scott Allender:

Well, I feel like I need you to be rested, because I have a lot that depends on you, so I need you to take care of yourself. I'm feeling really energised today, excited about this conversation we're about to have, and just feeling a lot of gratitude. A little bit fatigued, but all on the positive side overall. Today we're joined by Dr Christine Rosen. Dr Rosen is a senior fellow at the American Enterprise Institute, where she focuses on American history, society and culture, technology and feminism. She's a prolific author, and we're delighted to be talking to her today about her new book, The Extinction of Experience. Christine, welcome to The Evolving Leader.

Christine Rosen:

Thank you so much for having me.

Scott Allender:

Christine, welcome to the show. How are you feeling today?

Christine Rosen:

You know, I like that you ask each other that question, and then I thought, oh, I hope they don't ask me. But I will say, this time of year, because kids are going back to school and things like that, I've always got a little burst of energy, even though it's been a long time since I've gone back to school in the fall. I like the change of seasons. I'm in Washington, DC, and it's just starting to get cool again, and I'm excited. I love the fall, and I like the sense of enthusiasm when you see the kids walking to school in anticipation, because by mid-November they're all trudging along like they've got the weight of the world on their backs. But no, right now I'm feeling very, very happy.

Scott Allender:

Excellent. Well, the extinction of experience is a really striking phrase and title, and it covers a big shift in society. Can we start with what you mean by the extinction of experience? What are we losing?

Christine Rosen:

So I borrowed the phrase the extinction of experience from a naturalist named Robert Michael Pyle, who, years ago, was worried about young children not having an experience of nature, not being out in the world, spending too much time indoors or staring at a television screen, and not coming to appreciate and understand the importance of the natural world. And it struck me when I read that, that it's really not only children who are having that experience, it's all of us. When we think about how often we put something between ourselves and other people, and between ourselves and the rest of the world, usually that thing is some form of screen, some form of technology, some sort of communication platform. I began to worry that it was changing our habits of mind, changing our expectations of each other, and having some ill effects above and beyond the ones we have seen in the sort of tech backlash of the past five to ten years. I worried that it was changing our expectations of what it means to be human and to have human experiences. So that prompted me to start thinking about: what are human experiences? Why do we need to distinguish between the human and the non-human, and do we need to defend some of those traditional human experiences? Because it's not just that they're good for us individually; they help us form stronger families and stronger communities. So the book isn't a binary kind of attack on technology at all. It's much more practical than that, in terms of helping us to think differently about what it is to be human.

Jean Gomes:

I mean, we're using screens right now to mediate our conversation, which we couldn't otherwise have, so there's tremendous value that you acknowledge in that. But I'm interested, when you think back on your theorising on this, was there a moment where you went, I need to write this book? Was it something specific, or a cumulative set of things? What was going on for you there?

Christine Rosen:

I would say it's cumulative, but a few things stand out to me. I'm a Gen Xer, so I grew up without all this stuff and then came to it as a cynical adult. And of course, we are the best generation; everyone overlooks our important contributions to everything. So the cynicism remains. But I was struck one day, while in a public space in Washington, DC, looking around and just noticing that I was the only person noticing things. Everyone else was either rushing around with headphones in or staring at a phone, and this was about ten years ago, so it's only gotten worse since then in terms of how people mediate their experience. And it struck me as kind of tragic and very lonely that I was in this crowded public space with lots of other people. I love to people watch, so I was really quite intensely looking at what other people were doing, and everybody was going through life as if they weren't physically there. And it was that lack of a sense of physical embodiment. I mean, they're bumping into each other because they're not paying attention, and all the frustrations of bumping into someone who's staring at their phone instead of what's in front of them. But to me, it also was this huge missed opportunity to just understand the world we live in and the world we share. Everyone was trying to escape from that shared reality into their own personal reality, and that really struck me as something that we shouldn't let happen without comment. So I guess that prompted me to start looking at other areas of life where we have cast aside face-to-face human interaction, thinking we're improving speed, convenience, efficiency and connection with the technology, but where we might actually not have improved what it means to be in a profitable, productive, flourishing human relationship with other people.

Scott Allender:

I'm supremely interested in this. Philosophers have been talking about humans' propensity to be in a kind of waking sleep state for hundreds of years, right? So it seems like technology is now only exponentially advancing that sort of predicament. In your research, I'm sure it also includes the dopamine hit we get from quick interactions with technology and being on our phones. Is that the primary reason why we're so willing to make the trade-off, or is there other stuff going on? Why are people willing to give up that sort of interpersonal interaction and just go to the device?

Christine Rosen:

So certainly the dopamine hit is real, and there have been a lot of interesting studies of what that does to us, how these are like little mini slot machines in our pockets that keep us coming back for that intermittent reward system. But I think what fascinates me is how it transforms our expectations and habits of mind. What I mean is that if we spend a lot of time with this little dopamine machine, getting those hits and rewards, then when we don't have it and we're interacting with another, difficult person, how do we handle that? Because we haven't practised that skill as much, because most of our interactions are under our own control. When they're through a screen, we can escape them with a tap; we can claim to have an internet problem and get cut off, all these other things. We have a sense of control. So two things happen. I think we try to exercise that sense of control over other people, and that's always a disaster. And we also don't spend time doing what you began this podcast asking, which is thinking about how we actually feel in a given moment. People don't sit with their feelings and try to understand them as much. It's particularly true of the young, because they can always distract themselves. But part of being a fully formed human being is to be able to sit and go, wait, am I angry or am I sad? Am I actually anxious, or am I worried about something I should be worried about? Processing one's own emotions is part of being a grown-up, and it's something that takes practice, and that practice begins at a very young age. If you can constantly distract yourself from those often uncomfortable moments, how do you ever practise them? How do you grow up to be someone who can identify their own feelings in a moment that's perhaps tense or beautiful? I think the escape from reality is the part of these devices that worries me the most.

Jean Gomes:

Yeah, and as we escape from reality, we start to weaken what makes us human in many ways: the embodied cognition that gives us advantage, our capacity for empathy, for understanding, for insight, for creativity. The list of things that allow us to actually have an advantage in an AI world is endless, and at the very moment when we most need those things, we are weakening them in ourselves. What fascinated me reading your work was that what you're doing here is really deepening our understanding of those qualities, what we're losing and how to reclaim them. We haven't got time to cover everything you've researched, but if you could help us navigate this a bit: imagine I'm somebody at the start of my life in business or in education, and AI is going to automate a whole bunch of the things I'm going to do. How do I prevent myself from becoming less? How do I become more?

Christine Rosen:

That's a great question, because everyone worries about the kind of mass de-skilling we're seeing with AI and the outsourcing to AI. I worry about the ways we've already de-skilled ourselves: those things we know without knowing why we know them, which is embodied cognition, the embodied way that we read each other's facial expressions or hand gestures, or why you can walk into a tiny metal cube with a bunch of strangers and, because everybody gives a little nod in that elevator before the doors close, you know that you're safe, or you maybe don't get on that elevator because someone looks a little dodgy. These are all things we're hardwired through thousands of years of evolutionary development to know. And I think one of the conceits of our current technology is to try to train us to mistrust those things and to think that those skills aren't actually useful or meaningful anymore, because AI can do that for you. AI can gather all this information about how often you spoke in that last meeting and what that means for your state of mind, and all of that is information. But I'm really interested in the stuff that is human. We produce this information, but to understand how we can best use it, particularly in a business setting, you have to understand what it means to be a person and how to interact with other people, and that is face-to-face interaction. You can't always be face to face, but the kind of de-skilling I think we should be concerned about with AI is the idea that it's no longer just a tool that can help us do certain things better, but that it should replace some of our own judgments about what it means to assess another person's character. You see this already with how resumes are vetted. A human being doesn't even set eyes on a resume in some of these corporations until it's gone through several different cullings. And that worries me, because people are weird and quirky, and some of the things that an AI might have been designed to weed out might be precisely what your organisation needs, because it is that quirky, creative bit that's going to bring something new to the table, a new idea that's being surfaced by a human's quirkiness, not by an algorithm's design.

Scott Allender:

So in this sort of less embodied state that you're talking about, I'm curious what you're finding about the impacts on our wellbeing. We're talking about skill outsourcing and potentially over-outsourcing certain things, like the recruitment example, but on the human side of it that you're referring to, what are some of the detrimental impacts on wellbeing?

Christine Rosen:

This is going to be a broad brush stroke, so forgive me, because there are always exceptions to these sorts of statements, but I think we're all becoming habituated to settling for much lower quality interactions in our lives. And I think we see that reflected in the concern about the loneliness epidemic, which is not quite right. We do not suffer from a loneliness epidemic; we suffer from self-isolation, which is different. We are choosing to be alone because we don't feel alone when we're engaged on a social media platform or watching YouTube videos or endless Instagram reels. But that's sort of ersatz: it's a substitute for real interaction, but it gives us just enough of a sense of connection for a little while, and it's so easy, so convenient, so available to us on demand, and we don't have to brush our hair and put on decent clothes and leave the house to do it. And so we start to think, well, it's fine, it's just fine. And the more we choose that, and the less we slightly inconvenience ourselves and go out and meet a friend at a restaurant and sit across the table from each other, look at each other's faces and interact, the more that becomes a bigger challenge, emotionally, for a lot of people, particularly a lot of young people, who feel it's a real thing if you walk up to them unannounced without a text message announcing your arrival, or, good Lord, if you call them on the telephone. I know I'm saying it mockingly, but I have kids in this generation, and I actually say it with a lot of sympathy: these are things that they haven't practised. They need to practise these human skills. So I worry that we're settling for massive quantities of lower quality experiences. And that does change us. It changes our expectations for ourselves, and it changes our interactions with each other.

Scott Allender:

I feel that. I've got kids in this generation as well, but you know, my phone rings and I sometimes pause, like, why are you calling me? It's too aggressive, too assertive. Text me like a normal person.

Jean Gomes:

That's a different issue, Scott, we'll come to it; I'll tell you after the show. How is this reshaping our minds, potentially? What is actually going on for us?

Christine Rosen:

So I think one of the areas that really concerns me is the change in how we understand time, chronological time, and the formation of memory. As you know, because you've read the book, I became a little obsessed with handwriting; I have a chapter which delves deeply into it. I thought I was doing this because I'm an old fogey who likes to do things the old-fashioned way, and I made my kids learn to write cursive and all this. But in fact, once you start looking at some of the fascinating neuroscience research, in particular on writing, the mind-body connection with regard to handwriting is implicated in all kinds of things: the formation of short-term and long-term memories, literacy. Our brains are these fascinating puzzles that we've only figured out a small piece of, and with handwriting, something that many schools said, oh, we don't need this anymore, it's not a skill that the modern child should have, we'll just teach them block letters and move straight on to keyboards, it turns out that learning to write in cursive brings this whole other layer of memory formation. So for me that became an example of something where I thought, oh, everyone should learn to write by hand, because I'm left-handed, so I had to suffer through dragging my hand across the page, and once I was able to write fairly decently it was a really proud accomplishment. But it turns out it actually does change our minds. It changes how we understand the past, how we understand and process what's right in front of us. And when you think long term about, say, a nation's cultural memory, what's that going to look like in an age where images can be totally manipulated or created from whole cloth, where you can't really trust a lot of the information, and we haven't raised generations of younger people to understand what it means to form their own memories and to understand collective memory? These are all the sorts of ways in which, on a broad scale, I think we haven't asked the right questions. We just push forward, assuming everything will turn out fine, and mostly it has. But there are some places where I think we can say, maybe we stop here with this, or maybe we reintroduce cursive handwriting, because we now know it has these other valuable things to teach us.

Scott Allender:

So historically, with every new technology there's this sort of fear that comes with it, right? The phone would make us more isolated, television too; I was told that playing video games was going to rot my brain and make me antisocial, all the things. What is different about this moment in time?

Christine Rosen:

So the difference now is the scope, the scale and the speed of what we can do with these things, and the speed is particularly important. With each new technology that was culturally disruptive, the telephone, the television, there was a lag of time: not everybody had one for a while, and by the time there was mass adoption, there had been some time for norms to develop. So people learned how to answer a phone and how to take a call; there was a little bit of delay. And we humans, as our evolutionary history teaches us, still have a lizard brain sometimes; that's why we reach for the phone, thinking, oh, is someone calling me? We need that time to develop new norms and new habits and to try to cope with that environment. These technologies that we all now carry around in our pockets, the first thing we reach for in the morning and the last thing we touch at night in many people's cases, have not been with us that long, and yet they are in ubiquitous use throughout the day, not just at work but in our private lives, literally on our bodies. If you wear an Apple Watch or an Oura ring or any of these tracking devices, we have brought them right onto our human bodies and wear them and use them all the time, and the norms have not had time to develop and catch up. I think that's why you see a lot of backlash and a lot of overreaction. You saw that with video games earlier too, but we should be cognizant of that fact and not treat every negative reaction as just a Luddite rant. I'm not a Luddite; I get called one a lot. I use this stuff every day, but I always want to ask the question: how is this changing my life for the better, and what is it taking away? What experience do I no longer have because I do it this way? In that sense, it's like the Old Order Amish. They do use some tools, but they choose them extremely carefully, because they don't want to undermine their community's broader purpose and their values. And we don't do that enough. We just say, oh my word, look at this cool new phone, and the next thing you know, you've spent three hours playing Candy Crush. So I think we actually could be a little more Amish in how we approach these new things, not to reject them entirely, but to make sure they align with our values.

Jean Gomes:

And AI takes this to a whole new level. Can you talk to us about how you're thinking about AI, both in your own use and in your work?

Christine Rosen:

So I've had a couple of off-the-record seminars with AI developers, people who develop the tools, people who use the tools, and then some of us who are a little more sceptical of some of the tools. And what's been productive there is to see when humans and AI are used together to solve a problem that humans alone could solve, but not as well, while recognising that you do need the partnership there. Radiologists, for example, can now use AI to scan things and find potentially cancerous cells that the human eye and the well-trained radiologist alone couldn't. Now, I think there are plenty of people in Silicon Valley who would say, great, we don't need these expensive radiologists anymore, we'll just let the AI do it. That is the wrong direction, because there's still a creativity and an art to the medical science of figuring out each person's scan. So AI used as a very effective, limited tool by people who know what they're doing already, that's fantastic. That is bringing orders of magnitude of power to decision making. Where it worries me is when it starts to outsource the development of skills. The perfect example here is ChatGPT with kids who are learning how to write. I had a long argument recently with a very successful businessman who was like, you writers, I don't know what you're talking about, my kids write these amazing papers. And he explained to me how ChatGPT does a draft and then they edit it. And I said, well, ChatGPT is turning your children into beautiful editors, but your children can't write. He got very upset, and I said, give them a blank screen or a blank piece of paper and tell them to write a story; see if they can do it. Then you'll know whether they can write. As someone who writes for a living, I can tell you there's no scarier moment than that blank screen with the blinking cursor, or the blank sheet of paper. So when we think that these AI tools should substitute for learning that tough skill, that difficult skill of analysing and thinking, because writing is thinking, if you're a human being, that's how you think, you can't think that way if you allow ChatGPT to do the first draft; you're not thinking in the same way anymore. And then finally, I worry about the chatbot-style AIs that want to substitute for human relationships a sycophantic AI relationship that tells you what you want to hear, is never exhausted, is never tired, never has problems of its own that you might need to help solve. That, I think, can condition people, particularly younger people who are still learning emotional development, to have different expectations for human relationships if they spend a lot of time in that kind of interaction.

Scott Allender:

What additional considerations should leaders of organisations be thinking about, from an ethics perspective, when adopting technologies, adopting AI, bringing that in?

Christine Rosen:

So whether it's AI or even just some of the older surveillance technologies: I teach college students every summer at a seminar we do at AEI, and a lot of them are about to graduate and start a new job, or have recently graduated from college. And I always say, when you get your badge, if you're going to work for an organisation and they give you that badge to get in and out of the building, just ask the HR rep what else it monitors. Every year I do this, and I've had several students, one who went to work for a very large bank in New York, email me and say, you would not believe what they are tracking. They're monitoring how often I speak in meetings, where I go in the building, and whose badges mine pings off, so they know who I'm interacting with. He was astonished. And so I said, well, this is one of those things where, if you're an employee, but especially if you're a leader, you need some transparency about how much surveillance is actually happening. Because he didn't have that, he immediately started to mistrust his new employer: why are they tracking me? Don't they trust me? So the trust, transparency and surveillance aspects right now in many organisations are not healthy. I would say that's one thing. The other thing is to be straightforward with people whose jobs might actually end up being eliminated by a technology, particularly with AI. We're seeing this with a lot of entry-level white-collar work that will probably disappear in the next five to ten years. So if you're the head of a business and you have a lot of those people on staff, you should already be having those conversations about skills retraining: okay, do you know how to use the AI? Is there a way we can integrate that to help you do better, so that you're not eliminated? These sorts of conversations should be happening, and I think there's a lot of fear in leadership of panicking the mass of employees. That's too bad, because the best integration of new technologies, and I do think we see this a lot in the medical field, is when the tool is seen as a bright opportunity to improve the broader mission: in the case of radiologists, finding cancer, or in the case of doctors with a new robotic technique, healing more people. I think every leader of every business needs to ask: what is our mission? Where are humans absolutely irreplaceable? And where they are replaceable by AI, what are the other human things that can help them fill that void? Because that's your responsibility if you're leading an organisation.

Jean Gomes:

What comes out across the course of the book, for me, is a kind of call to arms to strengthen all of these different abilities that we have as human beings. Because in an AI world, as you say, there is a reality: organisations are not going to just choose a human-centred approach. They're going to reduce the operational costs of the business as much as they can using AI. And if you think about what many people do today, it's not very demanding cognitively or even socially. They pass around other people's pieces of information, copying and pasting, attending to emails, not really adding an awful lot of value, and so on. When all that work goes away and you're left with, well, what do I do now, the organisation is going to ask people to do more difficult things. It's going to ask them to solve problems that are harder. They're going to be asked to have difficult conversations. Work is going to get cognitively and socially more demanding, potentially, for a lot of people. So from your perspective, what advice would you give people? How do they actually succeed in that world? How do they improve the qualities they have so they're ready for this?

Christine Rosen:

Well, it's a difficult question, because we're several generations into training people in habits of mind that do the opposite of that, that actually make them more machine-like. If you talk to anyone in coding, which is another area that's been completely upended: everyone was a computer science major, they were guaranteed a job out of college, they would make pretty good money, and now those jobs are just gone. But if you haven't trained in the human skills, whether it's creative problem solving or, far more important, managing humans, dealing with other people, those are the sorts of jobs that AI is not going to be able to do. And as long as there's an identifiable uncanny valley, where humans look at even a very sophisticated AI in some sort of humanised robot and we can still tell it's not human, and it freaks us out, that's a good thing; until that moment, we still need those human skills. So if you have the kind of job where half of what you do is paper pushing, that's going to be outsourced. Think about the things you do that cannot be replaced. And this is where I sound cheesy, but where can you really defend the human in the work that you do? Not just the human, but the uniquely human thing you do. Maybe you're very good at getting people together and getting them to agree on something. There are AI programmes that claim they can do that with people who disagree, but to get a bunch of people who are arguing about something to agree, you need that mediator. It's like jury service: not necessarily the person chosen as the foreman, but there's always that person on a jury who brings everybody to the centre of the table so a verdict can be reached. Every organisation has those people. They're not always using their skills wisely, though, because they don't get to practise them. Those sorts of skills have to be identified anew. And again, that shouldn't be seen as a panicked way to keep people employed, but as a real opportunity, as you say, to give people more challenging and more creative work than they used to do before.

Scott Allender:

What surprised you most in your research? I'm curious, as you were writing this book, what you came across that was particularly surprising.

Christine Rosen:

So I think the thing that was most surprising, because it was most concerning, was the lack of patience, just how impatient our culture has become. A lot of my friends in the econ department at AEI were like, no, you want people who are impatient and want things on demand; this fuels innovation, and it's good for the economy. They were giving me that whole spiel, and I'm like, okay, there's something to that. But when it comes to long-term flourishing, long-term success in life, long-term strategy and planning and seeing something complicated and large all the way through to the end, when you think about our politics and long-term planning for social projects and what we need to do as nations, you have to have people who are willing to be patient. I started by looking at rates of road rage, because I live in an area that has really terrible drivers, and everyone complains about the drivers, and I wondered if that was just me getting older. No, it turns out road rage is on the increase. Air rage is on the increase. People are more hostile and impatient in public. There are all these ways we can measure the change in behaviour, and the technology isn't entirely to blame; all kinds of stresses in life drive people into that sort of behaviour. But there's a kind of weird acceptance now that you should never be bored, that you should always have some way to entertain yourself. And that's the part of it that I didn't see talked about enough. What that led me, personally, to realise is that every time I had a free interstitial moment, I was picking up my phone to check texts, email, whatever, and I don't even use social media, so I didn't even have social media as a draw, and I was still always picking up that phone. To realise that we're filling all these little minutes of the day, which over time add up to your life, with that kind of interaction, rather than, say, interacting with another person, noticing your surroundings, letting your mind wander and daydream, all the kinds of things we used to do because we had no option. Now you have to actively choose to put the phone down and try to do something else. That was, to me, the most surprising thing in my own life, but also in seeing how impatience and on-demand thinking has changed our shared social space, because I think we are a lot less kind and empathetic and thoughtful of each other in public space these days.

Jean Gomes:

I just wanted to go back to something you mentioned earlier on, which is the Amish, who use a value-driven filter for new technology. Whilst a modern company can't necessarily go back to that kind of world, what lessons can it draw practically from that approach, do you think?

Christine Rosen:

Well, we're all going to continue using zippers and driving places, so we're not going to go Amish. But I think what you can do is this: every organisation and its leader can sit down and, first of all, make sure they have clearly defined values, mission statements; most businesses have those. But then there's a second step, because so much technology is integrated into how everybody does business these days, and that's to ask, when a new thing comes up, what is our decision-making process? What are the questions we ask before we adopt or reject a new thing? That seems simple, but many businesses don't do this. Many educational systems don't do this; this is how bad ed tech ends up in everybody's kids' schools. There are some models for how to do it. Jacques Ellul, the technology theorist of the 20th century, had a whole list of questions to ask about technology, and he broke them out into moral questions, philosophical questions, environmental questions. Each organisation should sit down and say, if our value is X and we introduce a new thing, how will this new thing change how we understand our values? So if you are a human-centred company that really invests heavily in your employees, because you care about their wellbeing and what they can do for the company, then if you introduce an AI chatbot to interact, say, between divisions, or to summarise meetings, what's the human skill that disappears there, if you care about humans? Now, many companies will say, well, it was something that wasn't that valuable, and we outsource that already, and they'll go with the chatbot. But I think some places, if they stop and think about it, will go, well, you know, Susie from HR loves to convene these meetings, because then she can see all the people from the different divisions, and after the meeting they come up to her and say, we have this problem in our division with this guy or this gal. Things happen on a human scale if you bring people together. So each leader has to go through the same set of questions, whether it's a new chatbot or, I don't know, a new outreach programme, however your organisation uses technology. You should have a list of questions you ask of every new thing, because every new thing is not always going to be an improvement if it doesn't redound to the values that you started with. And we just don't ask, because we love technology; it's so seductive and so amazing, and I'm guilty of this too, but you've got to ask those questions. They won't be the Amish questions, but they should have some of the Amish principles about community, what you value, and what might undermine those values. That's what to ask.

Jean Gomes:

How are you seeing this in your world, in academia? What are you getting right and wrong there, do you think?

Christine Rosen:

It's been fascinating to me to see a lot of my friends return to what, in the United States, we call the blue book: you sit down and do your final exam, writing it all out by hand. I have a lot of friends, particularly in the humanities, I'm a historian by training, who are using blue books again, because it avoids having to deal with kids using ChatGPT to create answers on take-home tests. That's been fascinating to me, because they all then have to struggle through the very poor handwriting of undergraduates who have not practised handwriting in a while. So you see a kind of weird retro reversion to things that will maintain the integrity of the classroom. Scholarly research has been transformed in some negative ways too. These tools are extremely effective at mimicry, so you can have an AI like ChatGPT generate a perfectly plausible article with completely made-up footnotes. I've seen this, because a friend tested me with it, in an area of American history that I know a great deal about; I've read all the scholarly literature in a kind of obscure area. And he's like, look, read this paper and tell me what's wrong with it. I found three footnotes where I was like, I have not read these books, and I would have, because of when they were published. And he's like, okay, those were made-up footnotes. The programme generated plausible footnotes, and I knew that only because I've studied that area; I went to graduate school and got a PhD to know that. A normal person looking at that, there's no way they would know. So that kind of inadvertent deception, because of the way these tools work, is very dangerous. Certainly in scientific research, and we've seen it in the legal field, where people have submitted briefs to courts that turned out to have fake footnotes; some of those lawyers have been disbarred. So I think that's where human expertise and human detection of error is going to become ever more important, because the ability of these programmes to mimic real things is incredibly sophisticated.

Scott Allender:

What about in corporate life, coming back to an organisational conversation? I sit in meetings and people are tethered to their phones and their laptops and their iPads and all sorts of things while they're in person, so they're not really present. And there's a sense of urgency in the room, that my job is so important I couldn't possibly be fully present here. Sometimes that's probably legitimate; there might be something they're waiting on that's super important. Other times, I think it's just habitual, right? You just sort of don't want to leave your devices. So as a leader of teams, who might have a bunch of people walking into the room with their devices, some necessary, some maybe not so necessary, what are some tips you could give a leader to help people pull away from those habits that aren't as necessary as they may feel?

Christine Rosen:

Well, I would say that absent presence as an acceptable way to behave in any sort of meeting, both in one's personal life and in one's professional life, shocks me. This should never have been as acceptable as it is. Again, I'm showing my age here, but the only people who were immediately accessible when I was young had a beeper, and they were either a doctor or a drug dealer. That was it; nobody else needed to be on call like that. Of course, if you understand human nature and human behaviour, there's a real sense of status signalling too, right? You go into the meeting, but somebody's buzzing me, I'd better text back. I think if you're the leader, you have to be pretty draconian about it. You either have meetings where you say, you know what, everybody, phones in the middle of the table for the next ten minutes while we brainstorm this particular problem, and if you're waiting for a call, don't even come into the meeting. That way, this is a space where, if you're doing something else, go do something else, but right here, right now, for the next 20 or 30 minutes, we're just having this conversation. And this actually works. The examples I'll give come from my real life. One is a comedian I know: when he's working on new material and he's touring, he makes people put their phones in a little Yondr bag before the show, so no cell phones. He said he does that not because he's worried about people stealing his jokes, but because when he's trying to get a read on the audience, he can't do that when every other person has either a phone up in his face or their face in their phone. It's transformative for his experience as a performer to be able to really get that interaction, understand the mood of the crowd and whether it's shifting, whether a joke lands. It's art. But that's true of someone who can lead a good meeting too: really reading the room and seeing, okay, this guy's nodding off. That makes his job easier, but it also makes everyone's experience better. The other thing is my kids, who all have smartphones and have gone off to college, and none of them have any money. When they go out to eat, they play this game where, if they want to have real conversation, everybody puts their phone in the middle of the table during the dinner, and the first person to pick up their phone has to pay for everyone's meal. They're all broke, so they're like, I'm not picking up my phone. But again, they're making a decision that says, you all are important to me for the next hour, and nothing else is as important. And I think if you're a leader in a business, you're telling that to your employees too: I value your time, and you should value my insight as your leader, so let's all try to do this. But it really does require making seemingly more draconian rules in order to have more free-flowing conversation.

Sara Deschamps:

Welcome back to The Evolving Leader podcast. As always, if you enjoy what you hear, then please share the podcast across your network and also leave us a rating and a review. Now let's get back to the conversation.

Jean Gomes:

Can we talk about some of the things that you do, and that you've learned others do, to build this presence in the world, to build our experience so it's richer? Some of these things may be blindingly obvious, but we're not doing them, so they're clearly not as obvious as we might think. What are you learning about how you build that?

Christine Rosen:

So I think choosing the face to face is the most important thing, if you can do it. Now, we can't all do that every day. Obviously it's wonderful that we can see each other while we record this conversation, but it would be a lot more fun if we were all in the same studio. And I know this because I do a podcast with some of my colleagues; some of them are in New York, and some of us are here. When we're all in the same place together, it's so much more fun, and it's also a more productive conversation. There are fewer interruptions, fewer glitchy moments, and that's just because that's how humans are meant to be: in each other's presence. That's the embodied part of it. But I think we also have to embrace the idea that we should have a little friction in our lives. For the last 15 years, Silicon Valley has given us these incredible devices and a lot of really slick marketing that says your life should always be made easier, more efficient, seamless, frictionless. But in fact, for most people, that starts to feel very weird after a while. Friction is what teaches us things. Now, does this mean I want people going around deliberately making their lives harder? No. But I think what it means is choosing the human, and maybe the potentially more difficult relationship, sometimes, because it'll teach you something about yourself and also connect you to another human being, which is a good in and of itself. It's not a commodity, but it is a good thing. And I do think a lot of the animosity we see in public space, a lot of the hostility and polarisation of our current politics, is the result of the fact that so much of it happens with this online disinhibition effect in place, meaning I'll say stuff to someone I don't see, or don't ever have to bump into on my street, that I would never say in person. So we now have to actively seek out the human, because the default now is mediation. Throughout your daily life, think about when you could actually go down the street and say hi to your neighbour in person, versus texting them, or, here in America, putting them on alert on Nextdoor because they forgot to pick up after their dog, surveilling them. It seems very mundane, but enough people need the awareness that it's a choice. It is a choice to pick up your phone and text happy birthday to your grandmother, rather than buy a birthday card and write her a note, or, even better, call her on the phone or visit her. These are all choices we have now, and I think people act as if they don't have a choice, and that is delusional, because we all have a choice, and we need to sometimes choose the more difficult, more inconvenient, but more human thing.

Scott Allender:

Is that the best way to combat the sort of dopamine addiction and that mindless activity we do? Or are there other things we should be thinking about to combat the habit? Because I fully admit, sometimes I just reach for my phone, as you described earlier, for no reason. It hasn't even done anything; I'm just checking: are you still on? What else should we be thinking about in terms of habit breaking, so that we can lean into these choices with more success?

Christine Rosen:

Well, look, I'm a big fan of having non-mediated hobbies and practices in daily life. I teach and train in the Japanese martial art of Aikido. It's very formalised: you wear a uniform, you come into the dojo, and nobody cares who you are, what you do, whether you're important or not. And it's all a bunch of adults completely making fools of themselves, learning how to do things like roll and throw, and it's so fun. It's fun because you have to use your mind and your body, and you have to give it your complete attention, and you cannot have any devices anywhere near; there are no screens anywhere. And if you don't give it your full attention, you might hurt yourself or hurt someone else. So I think everyone can have some sort of non-money-making, non-productive, just completely weird thing that's theirs, but it should use your mind and your body and not be mediated. I have friends who do that with cooking or gardening or craft making. There are a million things; whatever your thing is, find it, but then practise it. Do it often enough that it is a practice. I know for me, especially when I teach, it forces me to slow down and get back to basics, because I'm teaching someone who's never tried to do this thing with their body, and I have to break it down: oh, you put your hand here, and then you do this. It slows everything down. A lot of us exist at this level of efficiency, productivity and speed that when we do slow down, it's almost discombobulating, because we don't do it often enough. So anything unmediated where you use your mind and your body, and that slows you down, is what I would recommend.

Scott Allender:

I love that.

Jean Gomes:

What else should we be talking about? What haven't we covered in your thinking?

Christine Rosen:

You know, one of the things that worries me a lot, and it's a group that's been extremely good so far about raising alarm bells with AI, is creativity, artists' work, what artists do. Because if you think about one of the ways that humans have always communicated with each other, it's through symbol, through image, through music, through literature, communicating across generations, connecting in a way that reminds people that we might not be from the same place or even the same time, but there's something that connects us, that's human. And I think one of the challenges now is, again, this idea that we no longer have the patience to sit and be humble before the vision someone else might present to us of what they think being human means. So we'll take AI slop art or AI music and think, oh, but it's free, it's cheap, and it sounds just like Taylor Swift's latest song. Well, no, you actually want the song that was written by the human, because she had another terrible breakup, or whatever it is. And that's true of painting too. One of the stories I tell in the book that really resonated with me was about an art history professor. She was finding her students, and this is at an Ivy League school, very accomplished kids, so impatient that she just couldn't get through to them, until she started assigning them something at the beginning of the semester: to pick one work of art and spend a certain number of hours with it, I think originally four to six hours, not all at the same time, before they were allowed to comment on it, write about it or discuss it. They had to just look. They had to sit and they had to look. And what that taught was patience, awareness and humility. I think humility is something that's really lacking in our daily lives, because so much can be outsourced and done for us, but humility is an important human thing, right? It reminds us that we are one small person in a much larger universe, and that there are amazing people who can do things we can't even comprehend, and we can learn from them. And technology sends us the opposite message every moment we use it: you put this phone in your hand and you're in charge of everything. You're the centre of the map. You can summon a car, you can get tacos delivered by drone, whatever it is. You are God. And I think artists are constantly in the battle to remind us that we're human, which means we are limited. We have a time limit, we have a lifespan, we have cycles of life where things are good and bad, and that recognition requires us to slow down and really look and be aware. So that's a part of the story that continues to fascinate me. I follow a lot of artists who are really questioning that in their work, and if you go on social media, a lot of artists will now say 'human made art'. They're distinguishing what they do as a human activity, and I think that's really important.

Jean Gomes:

What's next for you in your research?

Christine Rosen:

So I'm going to start working on another book, but this is how I do things: I just start keeping notebooks, and when I get to about five notebooks full of ideas, I start going back through and trying to find the thing that's going to be the main idea. I've been really fascinated by how we understand risk culturally over time: how the modern person understands risk versus how someone 100 or 200 years ago might have understood risk, and how that is also linked to the development of a sense of morality, values, norms, those sorts of things. It's all very hazy right now, but risk is something I think we've set aside in some ways because of convenience, and we're also sometimes really unable to assess risk in a rational way. It just fascinates me how humans understand risk, how they decide certain things are risky or safe, and how that's informed by culture and history.

Jean Gomes:

We had a great conversation with the psychologist Ellen Langer about how risk is subjective, not quantifiable in that kind of financial sense. You might want to listen to that show.

Scott Allender:

Excellent. Thank you for doing this work and writing this book; it's so important. How optimistic are you about our ability to start re-embracing our humanity more fully?

Christine Rosen:

So even though I might sound a lot like a doomsayer, I'm optimistic by nature as a person, and I am actually optimistic after doing all of this research, because you can't keep humans down. We have this wonderful, quirky, rebellious, stubborn streak, which means that whatever scheme some Silicon Valley billionaire wants to impose on the rest of humanity, it just takes a couple of us going, I'm not going to do that; you're not the boss of me. And actually, that's the thing: we are the bosses of ourselves. We have a conscience, we have free will, we have choice if we're lucky, and we need to exercise those more responsibly and thoughtfully. I don't think we're all going to end up in a human zoo with an AI overlord, or Skynet. I actually think we will fight back in creative ways, but we have to see it not as a hostile war, but as a thoughtful response to these very powerful things we've created: how do we use them to help us be better humans, not to make humans conform more to the values of the machine? That's really, for me, why I can remain optimistic, because we are crazy creatures. I love humans.

Jean Gomes:

What a wonderful thought to finish on, because I think that idea of human values versus machine values is such a powerful way of framing this whole challenge. Thank you.

Scott Allender:

Thanks so much, and folks, until next time, remember: the world is evolving.