Malcolm Gladwell on Tesla, RFK, and Why AI Could Save Us
Can generative AI help us close the gap between expertise and access?
This week on a special episode of Digital Disruption, we're joined by New York Times bestselling author Malcolm Gladwell, recorded live in Las Vegas at Info-Tech's LIVE tech conference.
Malcolm Gladwell is the author of eight New York Times bestsellers, including his latest, Revenge of the Tipping Point. Named one of Time’s 100 Most Influential People and one of Foreign Policy’s Top Global Thinkers, he is renowned for his unique perspective on the forces shaping human behavior and society. An extraordinary speaker, Gladwell combines eloquence, warmth, and humor to both entertain and challenge audiences. Through masterful storytelling, he unpacks complex and often misunderstood ideas, from decision-making in Blink and the roots of success in Outliers to our underestimation of adversity in David and Goliath and the missteps we make when interacting with strangers in Talking to Strangers.
Malcolm Gladwell sits with Geoff Nielson for an engaging conversation on the future of AI, the power of storytelling, and the evolving forces that shape society. From AI’s role in closing the expertise gap to how unexpected narratives drive lasting cultural change, Gladwell offers his signature perspective: thoughtful, contrarian, and always surprising. He talks about why the most transformative uses of AI may be the simplest, how generative tools can elevate human capability, and why culture never changes in ways we expect. Malcolm provides insight into how the media, brands, and politics are changing and what that could mean for leadership while touching on the surprising truth about misinformation, expertise, and AI as a corrective tool.
00;00;02;26 - 00;00;23;26
Geoff Nielson
Hey, everyone, I'm here with Malcolm Gladwell, bestselling author, who just gave an amazing keynote. Malcolm, I wanted to jump right into it and ask you: as you look at the future, as you look out over this wave of new technologies, with generative AI obviously at the forefront, what excites you most and what concerns you most as you look over the horizon?
00;00;23;29 - 00;00;54;01
Malcolm Gladwell
Well, you know, the theme of my talk today was about how AI's clearest and maybe most promising application is strengthening weak links, and so upgrading: bringing lower-performing things up to some higher level. And if you just think about it in those terms, it's unbelievably exciting. Right? You can think of, you know, everything.
00;00;54;03 - 00;01;32;18
Malcolm Gladwell
You know, I often think any argument about AI is clarified if we remove the developed world and just say: think about this only in terms of less developed countries. What do we think the pros and cons of AI are? And it's just overwhelmingly pros. I mean, if I can bring world-class medical decision-making to some rural area, or I can give a subsistence farmer cutting-edge advice on what he should be planting, and when he should be planting it, and what the weather is, I could go on and on and on.
00;01;32;21 - 00;01;56;16
Malcolm Gladwell
So that is the clearest win. And then I think it gets a little more complicated in the developed world, where what we choose is our kind of favored approach. Take the talk that was given today, which was all about the use of AI to train teachers. There's a very clear choice: you can use AI to replace the human teacher, or you can use AI to make the human teacher a better teacher.
00;01;56;18 - 00;02;11;26
Malcolm Gladwell
I think the second is a lot bigger win for society. But the temptation to just remove and replace the teacher will be there. I just think it's up to us to decide what we think the right choice is.
00;02;11;29 - 00;02;37;28
Geoff Nielson
Yeah, it's super interesting. And one of the pieces of conventional wisdom around AI that you completely flipped on its head, which I had never heard before: there's so much talk about whether AI is smarter than the smartest human, whether it's going to replace us, all the top-of-the-mountain use cases. And the use case that you showed is more about, like, can I be dumber?
00;02;38;04 - 00;02;57;23
Geoff Nielson
Right? If I can kind of shorthand it like that. For so many of us, and for so much training, there's, and you probably have better language around this than I do, but there's a transfer from a high-expertise individual to a low-expertise individual. And that mismatch is just kind of rife for conflict or frustration.
00;02;58;00 - 00;03;12;17
Geoff Nielson
And AI can actually support that and train people on this. You talked about it in the context of schools, but have you thought about it more? Do you see other potential use cases in any other field?
00;03;12;20 - 00;03;50;25
Malcolm Gladwell
Oh, I mean, it's really limitless, once you think about the question as you posed it. There is a persistent problem that is growing larger over time, which is this gap between high-expertise and low-expertise people or environments. Think about medicine, for example. Not only is the medical professional required to know a lot more than they were required to 25 or 30 years ago, but they also have the job of explaining it to their patient, which is getting harder and harder and harder to do.
00;03;50;28 - 00;04;17;13
Malcolm Gladwell
Right? So is there a role for AI to assist, to get the patient up to speed so they can have a productive conversation with a real person when they get into the office? That would be an equivalent use case. That's a real win there. Because so much of the problem in those kinds of interactions is that there is a degree of social embarrassment on the part of the person who is on the low end of that encounter.
00;04;17;15 - 00;04;36;09
Malcolm Gladwell
They just don't want to feel dumb in front of somebody else. But, you know, remember the famous thing from early on in the internet: no one knows you're a dog. Well, with AI, no one knows you're an idiot, right? Like, it's wonderful. It's a place for you to get smarter before you go out and encounter the world.
00;04;36;14 - 00;05;08;05
Geoff Nielson
Yeah. And, you know, there's this long history, and I'm certainly not the first one to say it, that in high-expertise professions, traditionally, training is around the expertise, right? It's around learning medicine. I've said before, it's more about health than it is about care, right? It's not about that person-to-person impact. And I don't know, maybe it's against the grain, but it feels like, as you said, the people side of it is actually the best application of this technology.
00;05;08;05 - 00;05;09;28
Geoff Nielson
In some cases.
00;05;10;01 - 00;05;32;24
Malcolm Gladwell
I mean, not all. We do this really fun project with IBM where we interview a bunch of their clients, and we did this totally hilarious thing with L'Oreal, which is a major IBM client and is basically using AI to make better lipsticks, because making a great lipstick is a data problem.
00;05;32;26 - 00;05;56;20
Malcolm Gladwell
And it's gotten so complicated that there is almost no way for an unassisted researcher to sort through all of the iterations and variations. So that's a wholly different kind of search. But we knew that already. We knew that AI was going to be brilliant at running through a billion different permutations in the blink of an eye.
00;05;56;21 - 00;06;14;22
Malcolm Gladwell
Right, that part we knew. So what I'm interested in is extending the use cases into less obvious areas. And that's where I think there can be equally significant trade-offs.
00;06;14;24 - 00;06;23;18
Geoff Nielson
So, outside of the IBM work, do you find yourself experimenting with generative AI in your personal life? And if so, what does that look like, and how have you found it?
00;06;23;18 - 00;07;00;06
Malcolm Gladwell
Not as much as I probably should. I'm sort of tentatively beginning to explore it a little bit. The problem is that, from my perspective, it's solving a problem I don't necessarily have. My problem is: how do I find an unusual way to tell a conventional story, or how can I find very specific little details that serve a narrative function of sparking people's interest?
00;07;00;08 - 00;07;32;20
Malcolm Gladwell
Those two tasks are actually not what AI is good at, right? What AI is really good at is mastering the kind of conventional wisdom in a really, really thorough way. And that's not the hard part for me. So I've been a little bit slow to it, but I am beginning to understand how, when I'm doing some kind of large, complex project, and I'm doing one right now, a nine-part series on a legal case, the amount of information I'm dealing with is enormous.
00;07;32;23 - 00;07;40;24
Malcolm Gladwell
It's become a data problem. And I'm finding that AI can really, really help me efficiently manage that pile of data.
00;07;40;26 - 00;07;57;00
Geoff Nielson
So when you, when you do a new project, whether it's a new book or any sort of, you know, kind of deep, meaty topic, do you typically start with the thesis and kind of work down for examples, or do you try and find interesting stories and, you know, collate them into a broader theme?
00;07;57;02 - 00;08;19;05
Malcolm Gladwell
I mean, I typically go both ways, but usually it starts from the bottom, right? You hear a story or you hear a little snippet and it makes you think. So somebody, a friend of mine, sent me a paper that just came out. I don't know if you're a baseball fan, but a couple of years ago the Houston Astros were discovered to have cheated.
00;08;19;07 - 00;08;25;26
Malcolm Gladwell
They were decoding their opponents' signs. Pitching signs.
00;08;25;27 - 00;08;26;11
Geoff Nielson
Oh, wow.
00;08;26;17 - 00;08;36;05
Malcolm Gladwell
And communicating the pitches to their batters in advance of the pitch. Right? So the catcher would give a sign for a fastball, and they would bang a drum.
00;08;36;05 - 00;08;38;15
Geoff Nielson
It's like breaking the Navajo code or something. Exactly.
00;08;38;16 - 00;09;02;26
Malcolm Gladwell
They would bang a drum and the batter would know a fastball was coming. So these guys did this big analysis of it, and they discovered you can't find any advantage to the Houston Astros from this cheating, even though it would seem intuitive that if you can tip off a batter to the pitch that's coming, the batter should do better. Turns out the batter does not do better, and so they give a whole bunch of possible reasons why this is true.
00;09;02;26 - 00;09;28;24
Malcolm Gladwell
Now, I read that study and I was like, oh, I'm going to use that one day. I have no idea where or how, but it's been stored away, and I promise you, in the next five years that will show up somewhere. But I've got to find a home for it, right? A context in which that particular story can have real meaning.
00;09;28;24 - 00;09;45;07
Malcolm Gladwell
Because it is telling us something really kind of fascinating about when rule-breaking pays off, you know. It means that what look like advantages are not always advantages. I mean, there's any number of things. More.
00;09;45;07 - 00;09;46;03
Geoff Nielson
Information isn't.
00;09;46;10 - 00;09;51;20
Malcolm Gladwell
Necessarily better, or whatever. I don't know what it is yet, but that's very much how I work: thinking about those kinds of things.
00;09;51;27 - 00;10;14;18
Geoff Nielson
And that was kind of what I figured intuitively. And the reason I brought it up is I ran a little experiment myself: I asked ChatGPT, what do you think could be the next book Malcolm Gladwell writes, and tried to get it to come up with some hypotheses. And what it came up with, I'm curious to run it by you, but it didn't really, in my mind. Maybe you'll find differently.
00;10;14;18 - 00;10;31;10
Geoff Nielson
They didn't nail the mark. And my hypothesis for why they didn't nail the mark is that you don't start writing a book with a title, right? It's not that top level that makes what you do fascinating. It's the whole thread. Right?
00;10;31;13 - 00;11;00;19
Malcolm Gladwell
I would say this: I spend a lot of time with academics, and lately, for this project I'm working on, I've been spending a lot of time with law professors. If I take a law professor's published work over the course of 25 years, and I give it to ChatGPT and I say, predict what the next paper will be in this person's research,
00;11;00;21 - 00;11;20;23
Malcolm Gladwell
I'm guessing ChatGPT would do a very good job, because there's a clear pattern to that kind of academic research, right? You have patterns of inquiry, and they point in certain directions. But the difference with me is I don't have formal patterns of inquiry. I do stuff for the most serendipitous reasons.
00;11;20;23 - 00;11;22;04
Geoff Nielson
You're an interested guy, right?
00;11;22;06 - 00;11;30;18
Malcolm Gladwell
Someone tells me something, or something turns out to be unexpectedly interesting. So it's hard to do a prediction, right?
00;11;30;18 - 00;11;33;20
Geoff Nielson
Can I run a few by you and just see what you think?
00;11;33;22 - 00;11;35;02
Malcolm Gladwell
Sure.
00;11;35;04 - 00;11;58;22
Geoff Nielson
So the first one, and I didn't tell it to do anything about AI, by the way, but the first one is called The Algorithmic Instinct: How AI Is Rewriting Human Judgment. Concept: Gladwell could explore the tension between human intuition and algorithmic decision-making. Much like Blink examined snap judgments, the book could interrogate when and why algorithms outperform or undermine human instincts.
00;11;58;25 - 00;12;02;08
Malcolm Gladwell
It sounds interesting. I'm not going to do it.
00;12;02;08 - 00;12;20;18
Geoff Nielson
Well, now I guess you're not going to do it. One more for you here: The Network Effect: Social Contagion and Influence in the Digital Age. Building on The Tipping Point, it could revisit social epidemics in the context of social media and decentralization, exploring how ideas now go viral and how influence has evolved.
00;12;20;23 - 00;12;29;20
Malcolm Gladwell
Yeah, well, I sort of did that with my last book. But I'm not going to do that, because I don't think the world needs that book. I feel like that book's been written ten times.
00;12;29;20 - 00;12;30;09
Geoff Nielson
Right.
00;12;30;12 - 00;12;42;11
Malcolm Gladwell
So, again, ChatGPT is telling us something: it's giving you the conventional answer. But, you know, my value in the world is not in giving conventional answers.
00;12;42;11 - 00;13;14;10
Geoff Nielson
Right, right. So, to that effect, you talked today a little bit about being able to help everybody develop more expertise, this world we're moving to of elevating weak links. People are very touchy, I find, about the word expert. I don't know if I can call you an expert, or if you are an expert, but you're someone who practices expertise, who learns deeply about things.
00;13;14;13 - 00;13;34;08
Geoff Nielson
As someone sort of at that pinnacle, are you worried about a world where there's an army of Malcolm Gladwells and what you're doing is devalued? Or do you see something unique for you, or for, I don't even know what people like you are called, to be honest, but for people in this space, that transcends what this technology can do?
00;13;34;10 - 00;14;24;18
Malcolm Gladwell
I'm not worried, mostly because, and this is true of any kind of creative field, what excellence is in a creative field is very often unquantifiable. It's not intellectual capacity, or access to information, or efficiency in sorting. It's some kind of ineffable thing. You know, I did a project with Paul Simon, the musician, and he has a lovely riff he would always give about how the ear is drawn to the discordant note.
00;14;24;20 - 00;14;56;15
Malcolm Gladwell
The thing that draws you in to a melody is a little deviation, weird, unexpected, bizarre sometimes, a deviation from the conventional path that the listener thinks they're on. Right? And so that's what creativity is: it's this discordant thing. So, I don't know, one of my best friends is a very successful screenwriter. And one of the reasons he's very successful is he's just weird, in a really good way.
00;14;56;18 - 00;15;27;02
Malcolm Gladwell
He's had a very weird background: grew up in a very religious Southern environment, has a PhD in theology, smuggled Bibles into Eastern Europe in the 80s. That's what makes him interesting, because he's always doing things you wouldn't expect. I don't know, I may be wrong, but that part of it doesn't seem to be replicable with our current iterations of AI.
00;15;27;03 - 00;15;44;11
Malcolm Gladwell
Right? On the other hand, he will tell you that AI has sped up his research process by an order of magnitude. It increases productivity, but it's increased the productivity of someone who's bringing some distinctive, interesting, weird thing to the table.
00;15;44;13 - 00;15;53;25
Geoff Nielson
Because we don't want the perfect song, right? We don't want to smooth out all the wrinkles. It's the wrinkles that make it good, right? That attract us. Yeah.
00;15;53;25 - 00;16;12;19
Malcolm Gladwell
Like, yeah, I was going to give a call. You know the there's a famous thing with Paul Simon who's one of, he had a long thing. I don't know if you know the song. Take me, come and take me to the Mardi Gras. It's an old classic. Paul Simon's. Come on, take me to the Mardi Gras.
00;16;12;21 - 00;16;43;24
Malcolm Gladwell
That's a song where he took a falsetto singer, a reverend from Harlem who's one of the great falsetto singers in gospel music, took him to Muscle Shoals, Alabama, which is the center of R&B, imported a marching band from New Orleans, and they got together and they did a calypso song. So he combined four different musical traditions.
00;16;43;24 - 00;16;57;24
Malcolm Gladwell
And he was a white Jewish guy from New York. Right? And that's one of his greatest songs. That's what we're talking about: only he would think to combine those five musical traditions.
00;16;57;24 - 00;16;59;28
Geoff Nielson
Yeah.
00;17;00;02 - 00;17;16;15
Malcolm Gladwell
AI is not going to give you that. ChatGPT is not giving you "Take Me to the Mardi Gras," right? It might give you a Taylor Swift song, like the fifth-best song on her album. It's not giving you Graceland, right? It's not giving you all of the things that are kind of...
00;17;16;23 - 00;17;23;28
Geoff Nielson
Iconic. It doesn't have that unique human filter of: I happened to be exposed to these particular inputs, and that's what created it. Right?
00;17;24;02 - 00;17;26;28
Malcolm Gladwell
Yeah. Yeah. Yeah.
00;17;27;00 - 00;17;51;12
Geoff Nielson
Yeah. So, on that note, one of the things, Malcolm, that I've been talking about lately is, for everyone, but certainly for leaders, the power of storytelling as a vehicle for influence, as a vehicle for leadership, as a vehicle for, in some ways, getting your will to actually be accomplished.
00;17;51;12 - 00;18;09;29
Geoff Nielson
And this is something where I certainly view you as a world-class storyteller. Is there a recipe for a good story? What actually makes a story compelling? And is your version of storytelling, do you think, broadly applicable to everyone?
00;18;10;01 - 00;18;44;16
Malcolm Gladwell
Hmm, it's interesting. I mean, one definition of a story is that it is simply an experience that leaves the listener in a different place at the end than they were at the beginning. So there has to be movement. It's not the assertion of a fact; it's not a description of a situation.
00;18;44;18 - 00;19;26;19
Malcolm Gladwell
There has to be some distance traveled. And if, along the way in that journey, the listener's expectations are violated in some way, then we have a real story. The violation of expectations is this crucial thing as well; there has to be a turn. To give you an example of a story: for years and years and years, one of the most powerful brands in America is Tesla, right? Valued at many multiples of any other automobile maker.
00;19;26;22 - 00;19;59;07
Malcolm Gladwell
Incredibly successful. The first successful automobile startup in God knows how many years. And it has a really incredible story, with a violation of expectation. The incredible story is: a very weird, obsessive, brilliant guy is devoting all of his time and energy to creating a car that didn't exist before. Every part of that is unlike every other car manufacturer, the big corporate blandness.
00;19;59;09 - 00;20;19;09
Malcolm Gladwell
And here we have a weirdo out in California who, you know, went to college in Canada and grew up in South Africa and has this bizarre father and is just obsessed. Right? That's a story, like, oh my goodness. And he wants to do something: create electric cars. In a way, it violates our expectations about where cars come from.
00;20;19;11 - 00;20;45;02
Malcolm Gladwell
Right. And it's satisfying and meaningful to us because it's this classic primal idea of the genius, the genius who applies himself. The genius returns to cars; there's been no genius in cars since Henry Ford or Ferdinand Porsche or whatever. And then Tesla falls apart. Why does Tesla fall apart? Because the story's violated.
00;20;45;04 - 00;21;17;10
Malcolm Gladwell
He's no longer devoting his attention to it. That's why I bought the car: because I thought the story was that the genius was obsessed with every detail. And now the genius is doing a million other things. The story's gone, right? So implicit in that brand was this really interesting, weird, particular, detailed story about why it was important to the users of the brand, the adherents to the brand.
00;21;17;10 - 00;21;23;18
Malcolm Gladwell
And you have to honor that story if you're going to keep the brand afloat.
00;21;23;20 - 00;21;42;11
Geoff Nielson
It's a really, really interesting example. And I'm realizing I never processed that, for the Tesla brand, how integral a part of the brand Elon is as an individual. If you take Elon out of the equation, what is Tesla, right? It's this abstraction. Yeah, the story is no longer there.
00;21;42;13 - 00;22;05;23
Malcolm Gladwell
As a contrast, if you buy a Chevy, the story is not about Chevy. All of your narratives are about, you know, the person who buys a Chevy. Their brand identity is all caught up in people they knew who drove Chevys: my dad had one of these cars, it means something to me. It's a completely different kind of story.
00;22;05;25 - 00;22;26;09
Malcolm Gladwell
You could have the weirdest, craziest person run Chevy; it wouldn't matter. No one even knows who runs Chevy. Do you know? I don't know, and I'm a car nut. I have no idea who runs Chevy. I mean, I know who runs GM, but only at that grand level. With Tesla, there was a very particular relationship that people had.
00;22;26;11 - 00;22;42;23
Geoff Nielson
So when we think about stories, one of the adages you hear in marketing these days is that people are the new brand. I've heard a few people say that, and certainly the Elon story kind of reinforces that; the Chevy one, on the other hand, doesn't. Do you buy that?
00;22;42;23 - 00;22;56;04
Geoff Nielson
Do you think as storytellers, we need to be more conscious of our role in the story and our role in kind of the brand that's being created either personally or professionally?
00;22;56;07 - 00;23;27;23
Malcolm Gladwell
I'm not sure. My problem is that I have such a kind of catholic, small-c catholic, definition of what a story is that I feel like it can take many, many forms. I think it's certainly true of some things, but not others. And also, I think the stories differ.
00;23;27;25 - 00;23;55;16
Malcolm Gladwell
Not all customers of a brand have the same relationship to that brand. So there may be multiple different stories appealing to multiple different people. So I don't know. Certainly for someone of my age: I have an iPhone, and Steve Jobs matters. In my conception of what Apple is, Steve Jobs continues to loom large. But if I was 25? Most 25-year-old
00;23;55;16 - 00;24;11;18
Malcolm Gladwell
People with iPhones have no idea who Steve Jobs was. He's vanished. And do they care about Tim Cook? Not a whit. They have some other kind of story. So I guess I'm more impressed by the kind of diversity, the many different roles that stories can play.
00;24;11;20 - 00;24;12;01
Geoff Nielson
Right?
00;24;12;01 - 00;24;12;20
Malcolm Gladwell
You know?
00;24;12;22 - 00;24;23;22
Geoff Nielson
Yeah. And I'm still thinking, I mean, for them too, similar to Chevy, it's probably more about who has or had an iPhone, right? It's people-driven, but it's not monolithic.
00;24;23;24 - 00;24;26;18
Malcolm Gladwell
Or it's what I've done with my phone. What's on it.
00;24;26;19 - 00;24;27;06
Geoff Nielson
Yeah.
00;24;27;09 - 00;24;47;14
Malcolm Gladwell
Right. Yeah. I mean, my phone story begins with my daughter. It's a personal object now; it's full of all my personal things. So maybe my story has nothing to do with Apple whatsoever. I don't know, it's interesting.
00;24;47;18 - 00;25;13;05
Geoff Nielson
You know, that is interesting. Yeah, my background is the same thing; it's my daughter as well. But I was just reflecting on that, Malcolm, and I was thinking about Revenge of the Tipping Point, your last book, and overstories, right? The point you make in the book, and you frame it geographically, is that in these communities there are these sort of invisible narratives that guide our behavior.
00;25;13;07 - 00;25;31;00
Geoff Nielson
Do you think that goes further? I want to ask this: do you think that extends beyond just geography? Does it work in terms of brands, organizations, some of the institutions that we interact with on a daily basis?
00;25;31;00 - 00;25;57;24
Malcolm Gladwell
Yeah, oh, I think very much so. I mean, in that book I chose to focus on overstories that had a kind of dimension of place, a geography, attached to them. But I think you can conceive of these kinds of organized narratives in any number of ways. You can conceive of them generationally. You can conceive of them professionally.
00;25;57;27 - 00;26;34;05
Malcolm Gladwell
I think this idea can be kind of played with, particularly because one thing I don't go into in the book, but which I think is very clear there, is that most of us are operating within multiple overstories at any given time. I'm always reminded of, years ago, I went with a friend to visit the Lubavitchers, the ultra-Orthodox community in Brooklyn.
00;26;34;08 - 00;26;57;12
Malcolm Gladwell
And, you know, they're a sort of messianic strain of Orthodox Jewry. We were chatting to a guy who was a Lubavitcher; at the time, they thought that their rabbi was the Messiah. He's talking about how the rabbi is the Messiah, and he goes, you know, it reminds me of an episode of Mork and Mindy, which is a TV show.
00;26;57;14 - 00;27;17;11
Malcolm Gladwell
And I realized there was nothing shocking about that to him at all. One of his overstories is as someone who belongs to a close-knit ultra-Orthodox community in Brooklyn. Another of his overstories is as an Australian guy who watched a lot of American TV; he also existed in that universe.
00;27;17;11 - 00;27;43;25
Malcolm Gladwell
And he probably had seven others: you know, his son played soccer and he was obsessed with soccer, and so on. I could go on. We all have these. It's very easy for us to think that we're only functioning in one universe when, in fact, maybe one of the hallmarks of the digital world is that it's permitted us to belong to many different kinds of worlds all at once.
00;27;43;27 - 00;27;59;21
Geoff Nielson
Right. And in a way that's, I think, hidden, or at least kind of below the waterline, in a way it never was before. It used to be that if there was someone you cared about, you could see all those influences. But now they live in your device and it's...
00;27;59;22 - 00;28;17;20
Malcolm Gladwell
Yeah, you know, my assistant told me, I'm going to New York today. I was like, why are you going to New York? I'm going to a show at Madison Square Garden. What's the show? He goes, it's a live taping of a Dungeons and Dragons podcast.
00;28;17;23 - 00;28;28;24
Malcolm Gladwell
They filled Madison Square Garden. I'd never heard of this; I didn't even know this existed. But that's my point: that was one of his identities that I had no idea was part of him.
00;28;28;26 - 00;28;53;21
Geoff Nielson
Well, and that's also just culturally so interesting, and it comes back to community and identity. I mean, podcasts. I don't want to get too meta here, but the fact that podcasts have become so popular, and people see them as a medium that in a lot of ways has become more popular than a lot of traditional media. To me, it feels like people want to imagine that they're sitting at the table with you, right?
00;28;53;21 - 00;29;02;01
Geoff Nielson
There's something deeply personal about it that you don't get from a higher-budget production that's less personalized.
00;29;02;01 - 00;29;06;00
Malcolm Gladwell
Yeah, I think that's true. Yeah. That intimacy is a big, you know, calling card.
00;29;06;06 - 00;29;27;00
Geoff Nielson
Yeah. I wanted to come back to the overstory piece. I talk a lot about organizational leadership and culture, and how we can kind of define organizational culture through overstories, these kind of invisible forces. And this is something you don't really get into in your book.
00;29;27;00 - 00;29;43;26
Geoff Nielson
But I was curious about your perspective: to what degree are they actually steerable and moldable versus just being emergent? Like, can we harness them in any way, or is it better just to understand them and take advantage of our deeper understanding of them to get to our outcome?
00;29;43;26 - 00;30;15;24
Malcolm Gladwell
I do think that they are fluid and changeable. I don't think we can ever have complete confidence that we know how to change them or can change them at will. But it is pretty clear that the ways in which we organize these kinds of common narratives shift all the time. I'm doing a project right now on the death penalty in the United States.
00;30;15;26 - 00;30;37;08
Malcolm Gladwell
It's a big deal in the 40s and 50s and 60s. It goes away in the 70s, and Americans overwhelmingly turn their back on the death penalty, and then it comes back, right? I mean, so in the span of 30 years, it's big, not big, and then big again. And at no point in that cycle did anyone predict the next stage in the cycle.
00;30;37;11 - 00;31;09;10
Malcolm Gladwell
But these are deep. These are not frivolous, lightly held beliefs. These are deeply held beliefs that nonetheless changed twice in the span of a generation. Right? And in the book I talk about, you know, the doctor who moves from Buffalo to Denver and to Boulder, and whose conception of what it means to be an effective medical specialist changes instantly.
00;31;09;12 - 00;31;12;07
Malcolm Gladwell
These things are volatile.
00;31;12;08 - 00;31;37;03
Geoff Nielson
Yeah. Well, the other thing you talk about in the book that I thought was interesting is, when it comes to shifts in perception and adoption of certain ideas, the impact the media has, whether it's traditional media or singular voices. What I'm thinking about specifically, I think you talk about the Will & Grace effect on Americans' perception of gay marriage.
00;31;37;03 - 00;31;58;20
Geoff Nielson
And, you know, the one I was thinking about more recently, and I'm sure you can find dozens of examples, is, and this is not a political podcast, but the Republican Party kind of before and after Donald Trump, where this guy comes in and says, well, actually, this is my view. And suddenly there's, you know, a 180 is too strong a description of the shift.
00;31;58;27 - 00;32;20;28
Geoff Nielson
But there are these huge kind of tectonic changes in how identity is viewed, and how people view this. And I'm curious on your perspective, because my sense is, especially in a weak-link world, you would think it would be more democratized in terms of how these things shift, and there's more subtle forces at work. But I don't know.
00;32;20;28 - 00;32;31;22
Geoff Nielson
Is that true? Are both true? How do you see the impact of, you know, more monolithic forces versus more decentralized forces in the coming years?
00;32;31;24 - 00;33;12;02
Malcolm Gladwell
It's a hard question, you know? Because we have sort of swapped one model for another, at least in many realms. We've gone from these highly centralized forms of cultural production to decentralized ones, and we're discovering that the differences between those two forms are significant. You raise the question of Trump: do I think the shift to Trump would have been possible in an earlier, more centralized environment?
00;33;12;02 - 00;33;21;00
Malcolm Gladwell
And I would say no, I don't think Trump gets elected in 1972.
00;33;21;01 - 00;33;23;05
Geoff Nielson
Oh, I agree, I agree. Yeah.
00;33;23;08 - 00;33;54;18
Malcolm Gladwell
I mean, I think there's something distinctive about the kind of moment that we're in, and the idea that politics is playing such a different function culturally today than it did back then. It's really become entertainment. It's the only common conversation we're having now. Sports and politics are the only common conversations we're having, whereas we used to have far more outlets for this kind of collective discussion of who we are and what we mean.
00;33;54;20 - 00;34;12;21
Geoff Nielson
So let me ask you another difficult question, and feel free to just, you know, muse on it however you want. I completely agree with your comment about Trump: he's not happening in 1972. And I feel like, and I don't think I made this up, he's kind of the social media president, right?
00;34;12;21 - 00;34;42;07
Geoff Nielson
Like Twitter was the vehicle that kind of propelled him into office. Are we going to see more people like that? And in terms of the technologies, social media, sure, but now we're talking about generative AI and the impact some of these technologies can have. What impact does that have on the dissemination of culture and how that kind of translates into power structures?
00;34;42;10 - 00;35;06;18
Malcolm Gladwell
I mean, I would say, I don't know. And I don't know for the following reason, which is that we have to remember that all of these technologies we're discussing are in their infancy, and that historically, when you look at the advent of new technologies, particularly new forms of media, it takes years for society to figure out what they're for.
00;35;06;20 - 00;35;20;03
Malcolm Gladwell
For the first 25 years of the telephone's life, the telephone industry actively tried to discourage people from using it to gossip, to catch up with friends. They thought it was a business.
00;35;20;03 - 00;35;22;28
Geoff Nielson
Tool beneath the function of the technology. Yeah.
00;35;23;00 - 00;35;47;10
Malcolm Gladwell
You shouldn't squander this thing on chatting with your mom. It's a business tool. And they actually, like I said, actively discouraged people from it. They didn't realize what it was until the 20s. Which is, you know, when does Alexander Graham Bell invent the telephone? 1876. And it's the end of the 1920s before they wake up to what it is.
00;35;47;12 - 00;36;10;00
Malcolm Gladwell
So, and the telephone is a bigger deal than Facebook and Twitter. Yeah. But it strikes me that Facebook and Twitter are still in their infancy. They're really young. It's quite possible that if we had this conversation five years from now, both of us would have only a dim memory of this thing called Facebook.
00;36;10;00 - 00;36;22;27
Malcolm Gladwell
Yeah, or Twitter. Or, I don't know, the opposite, that it completely dominates your life. The only confidence I have is that we will be using these technologies in unanticipated ways.
00;36;22;29 - 00;36;23;24
Geoff Nielson
Yeah.
00;36;23;26 - 00;36;28;20
Malcolm Gladwell
In the future, but no one can predict what those unanticipated ways are.
00;36;28;27 - 00;36;46;28
Geoff Nielson
I think that's well said. And I, I completely buy that. And the reason I was kind of chuckling is I don't know if you've ever had this experience, but even something like Facebook, I feel like if you go back in time, rewind the clock 5 or 10 years and look at Facebook, then it's completely unrecognizable from what it looks like now.
00;36;47;04 - 00;37;08;03
Geoff Nielson
And even, I found, if I look at how I used it then, I don't even recognize myself. You know, I'm like, oh, I was just using it in this very nascent, now-bizarre-feeling way. But the technologies and our patterns are just evolving so fast, and I have to imagine that's going to continue into the future.
00;37;08;05 - 00;37;29;29
Geoff Nielson
I did want to ask you, Malcolm. I know you're not a technologist, but I'm always curious to hear your perspective on what you're hearing and the way things are changing. One of the things I like to do in these formats is ask people: what are you hearing about now, that there's a lot of hype around, technology or otherwise, that tends to be in the zeitgeist, that you're like, you know what?
00;37;30;02 - 00;37;39;05
Geoff Nielson
I think that's bullshit. I think, you know, it's BS that's being sold because it generates a good story, but that you don't think is necessarily going to have its day.
00;37;39;07 - 00;38;10;10
Malcolm Gladwell
I have a lot of confidence that whatever employment dislocation is caused by AI will be short, and not painless, but less painful than we think. I don't buy the gloom and doom thing on it. Yeah, we say this every time something comes along, and it never pans out that everyone has nothing to do.
00;38;10;10 - 00;38;13;29
Geoff Nielson
It's become kind of Malthusian, right? Like this wave is going to. Yeah.
00;38;14;01 - 00;38;53;04
Malcolm Gladwell
And I think people have more ingenuity than that. And also I think that we're probably a lot further off from truly transformative AI than we realize. But I also sometimes believe that a lot of the most revolutionary uses of AI are some of its simplest ones. It doesn't need to be this incredibly mind-blowing technological accomplishment to make a difference in our lives.
00;38;53;06 - 00;39;05;25
Malcolm Gladwell
Simply holding and organizing information and standing at the ready to give good answers to problems is huge. I mean, if that's all it did, it would be transformative.
00;39;05;28 - 00;39;16;14
Geoff Nielson
When you talk about it from a research perspective, as kind of a data organizer, right? Even if all it does is let you feed it all your data and then ask it questions about that data, you know, it's still exponential. Yeah.
00;39;16;20 - 00;39;17;14
Malcolm Gladwell
That's exponential.
00;39;17;17 - 00;39;33;29
Geoff Nielson
Yeah. You know, I was just listening to that response. Are you a natural optimist, just kind of forward-looking in general? I sort of get that sense about you, that you don't have a lot of doom in your vocabulary. Or are you just an interested observer?
00;39;34;05 - 00;40;01;18
Malcolm Gladwell
Well, I have a little bit of doom right now in my work. But, yeah, I generally think, you know, that problems are simply arms races between the problem and the solution, and we're generally pretty good at keeping up on our end of that arms race. I must say, though,
00;40;01;21 - 00;40;20;27
Malcolm Gladwell
Well, not to get political, but RFK Jr. has shaken me. I find what is going on... I did not anticipate this would ever happen in any kind of developed country in the 21st century. I am utterly gobsmacked.
00;40;21;00 - 00;40;22;22
Geoff Nielson
The anti-science approach to.
00;40;22;25 - 00;40;35;07
Malcolm Gladwell
The idea that a guy could be running the greatest scientific institution in the history of mankind, who believes Louis Pasteur was wrong, is just... I did not anticipate that.
00;40;35;11 - 00;40;57;01
Geoff Nielson
Yeah, that's a fair one. I understand why there's a sense of doom when you see that. Well, so let's maybe loop that back around to the conversation about expertise. Right? Because, and I don't know if this is too dramatic, but I feel like there's sort of a war on expertise going on right now.
00;40;57;01 - 00;41;22;05
Geoff Nielson
Like there's certainly a wide chasm between experts and non-experts. And it feels like it's become very in vogue to be like, well, the experts are as self-motivated as anybody else. They're manipulating you, right? You can't trust them. We know better. Common sense prevails. You know, Louis Pasteur be damned.
00;41;22;07 - 00;41;23;12
Malcolm Gladwell
Yeah.
00;41;23;15 - 00;41;33;00
Geoff Nielson
Yeah, well, what the hell do we do about that? And does this training piece that we were talking about 20 minutes ago play a role in that?
00;41;33;02 - 00;41;51;14
Malcolm Gladwell
I think so. I mean, this is one of the best case scenarios for AI. I had a little hilarious version of this, where I had done this podcast that was very critical of RFK Jr. and of Joe Rogan, for sure.
00;41;51;18 - 00;41;53;07
Geoff Nielson
Kind of platforming these people.
00;41;53;09 - 00;42;21;16
Malcolm Gladwell
And Elon Musk retweets some angry tweet about my work, and it gets a ton of views and blah, blah, blah. And one of the things that happens is that people on Twitter start asking Grok: is Gladwell, you know, does he make stuff up? Is he a hack? Is he all the things that I was accused of being? And Grok defends me.
00;42;21;16 - 00;42;50;03
Malcolm Gladwell
Yeah. No, this is a personal version of a larger, bigger thing, which is, you know, we have rarely been better equipped to deal with nonsense than we are now. We have a tool at our disposal, and it's very hard for AI to lie about the big issues, right?
00;42;50;05 - 00;43;14;00
Malcolm Gladwell
Your AI is never going to tell you that vaccines cause autism, because it's going to look at the data. Right? That's what it does, unless you corrupt it in some spectacular fashion. So, you know, we are building a corrective to a lot of this nonsense at the same time
00;43;14;00 - 00;43;21;20
Malcolm Gladwell
as the nonsense seems to be peaking. We've built an institution that can answer it, right?
00;43;21;22 - 00;43;43;18
Geoff Nielson
Which in some ways is really reassuring. I'm still processing what you mentioned earlier about storytelling, right? Which is that facts are not a story. It's the narrative, it's the emotional journey you go on, and the impact that these outsized voices can have because they put their own ribbon on whatever series of facts or falsehoods they want to.
00;43;43;20 - 00;43;58;00
Geoff Nielson
But, you know, we sort of backed into something that I wanted to ask you as well, which is, I think I mentioned off the top that I've been a fan of yours for a long time, and to me that's kind of a no-brainer. I just personally relate to your worldview.
00;43;58;03 - 00;44;15;11
Geoff Nielson
And as I was looking at some of your stuff online, there seems to be a Gladwell backlash out there. There are certain areas of social media, and you sort of talked about it in the context of this, where, I don't know if you want to call them haters or what, but there are people where you can say anything and they're like, you're the devil.
00;44;15;15 - 00;44;31;28
Geoff Nielson
And I'm curious why, in your mind, that is. What led to that? Because I don't see any sort of negative force in anything you've ever done. So yeah, I'm curious what you think is going on.
00;44;32;01 - 00;45;03;04
Malcolm Gladwell
I would say it's actually less now than it used to be, and it's been going on for a long time. I can give you a kind of flippant mathematical response, which is that it's a function of scale. Say 90% of people like your work, which would be high, but let's assume that across the board, for all writers, 90% of their readers approve of what they're reading.
00;45;03;07 - 00;45;29;23
Malcolm Gladwell
If you sell ten books, that means you have nine fans and one critic, right? You probably never hear from the one critic; if you hear anything at all, you hear from the fans. If you have a million readers, it means you have 900,000 fans and 100,000 critics. Right? And that's if you have 90% approval.
00;45;29;25 - 00;45;50;23
Malcolm Gladwell
So it's just a function of how many people are reading your stuff. And the haters are more motivated than the non-haters, so you're going to have this perception. And the other thing I would say is, I do love poking the bear.
00;45;50;25 - 00;45;56;19
Malcolm Gladwell
So sure, if you go after Joe Rogan, Elon, and RFK Jr., you're going to get...
00;45;56;23 - 00;45;57;08
Geoff Nielson
Mobilized.
00;45;57;08 - 00;45;59;01
Malcolm Gladwell
There. You're going to mobilize your force.
00;45;59;04 - 00;46;14;21
Geoff Nielson
Well, and I was thinking too, it's kind of like the Yelp review effect, right? You never have people just saying the restaurant was good. It was either the best meal they've ever had or, probably even more likely, they treated me like garbage and gave me and my entire family food poisoning.
00;46;14;21 - 00;46;21;29
Malcolm Gladwell
So the extent to which social media is not a representative sample, yeah, cannot be overstated.
00;46;22;02 - 00;46;27;26
Geoff Nielson
Very well said. Malcolm, I want to say a big thank you for joining me here today. I thought this was really interesting and really appreciated your time.
00;46;27;29 - 00;46;28;17
Malcolm Gladwell
Thank you so much.
00;46;28;23 - 00;46;29;01
Geoff Nielson
Thanks.

