Ex-Google Officer on AI, Capitalism, and the Future of Humanity
Is the real threat of AI the technology or the humans behind it?
Today on Digital Disruption, we’re joined by Mo Gawdat, former Chief Business Officer of Google X.
Mo is the host of the popular podcast Slo Mo: A Podcast with Mo Gawdat, and the author of several international bestsellers, including Solve for Happy, Scary Smart, That Little Voice in Your Head, and Unstressable. He is also the founder of One Billion Happy and currently serves as Chief AI Officer at Flight Story. With a 30-year career in tech, Mo has since shifted his focus to the pursuit of happiness and human wellbeing. He has extensively researched the science of happiness and engaged in conversations with some of the world’s leading thinkers. More recently, his work has centered on the urgent ethical and societal challenges posed by rapid advancements in artificial intelligence.
Mo sits down with Geoff for an unfiltered conversation on the future of humanity in the age of artificial intelligence. Mo explains that we're living in the early stages of a technology-fueled dystopia, one driven not by AI itself but by the humans who shape it through greed, power, and unchecked capitalism. But he also shares a vision for a future of unprecedented abundance, one where AI could solve global challenges from climate change to poverty, but only if we as humans embrace ethical design and mutual cooperation.
00:00:00:06 - 00:00:26:07
GEOFF NIELSON
I'm so excited today to be joined by Mo Gawdat. He's the former head of Google X, which is Google's moonshot division, and is just an all-around brilliant guy. He's that rare talent who has the engineering and math background but is deeply curious and interested in what makes us human. Today, I want to talk to him about the future of work, the future of society, and really what we can do to get ahead in today's fast-paced world.
00:00:26:10 - 00:00:47:12
GEOFF NIELSON
The thing I'm most excited to talk to him about, though, is to dig a little bit into his theory of abundance: that we're just on the edge of this technology-enabled age of abundance. He's also said that he believes right now we're in a dystopia and things are getting worse than ever. I want to understand how he squares those two and where this world is actually going right now.
00:00:47:15 - 00:00:53:08
GEOFF NIELSON
Let's find out.
00:00:53:11 - 00:01:20:26
GEOFF NIELSON
Well, I'm so excited to have you here today. And one of the things I wanted to talk about right off the bat is that you've described the moment we're in right now in history as sort of a perfect storm of AI, geopolitics, economics, biotech. And so with that in mind, I wanted to ask you: right now, looking out over the horizon, what are you most excited about and what are you most worried about?
00:01:20:28 - 00:01:36:19
MO GAWDAT
I'm excited about the long-term, far-future utopia that we're about to create. I am very concerned about the short-term pain that we will have to struggle with.
00:01:36:21 - 00:01:56:13
MO GAWDAT
When you really think about it, a lot of people, when they look at technology, think of this current moment as a singularity, where we are really not very certain of what's about to happen. You know, is it going to be existential and evil, or is it going to be good for humanity?
00:01:56:15 - 00:02:26:10
MO GAWDAT
I unfortunately believe it's going to be both, just in chronological order. If you think about it, you mentioned that we have all of those challenges around geopolitics, about climate, about economics and so on. And I actually think all of them are one problem. It really is the result of the systemic bias of pushing capitalism all the way to where we are right now.
00:02:26:13 - 00:02:55:00
MO GAWDAT
And when you really think about it, none of our challenges are caused by the economic systems that we create, or the war machines that we create, and similarly not by the AI that we create. It's just that humanity, I think, at this moment in time, is choosing to use those things for the benefit of the few at the expense of the many.
00:02:55:02 - 00:02:57:18
MO GAWDAT
I think this is where we stand today.
00:02:57:20 - 00:03:04:29
GEOFF NIELSON
Is that inherent in capitalism? Is it inherent in human nature?
00:03:05:01 - 00:03:48:19
MO GAWDAT
You know, it's not inherent in capitalism, for sure. And it is not inherent in all of human nature, even though I think humans, when put in a certain situation of power, tend to all behave the same. I'd probably say that with the turn of our world post-World War Two, the Cold War that followed, and the arms race that followed, eventually 1989, I think, was the turning point: the idea of a unipolar power, a unipolar world. It's like school kids when they're 11 and one
00:03:48:19 - 00:04:18:20
MO GAWDAT
child becomes taller than everyone else and becomes a big bully and bullies everyone. For a couple of years he continues to be taller, but then eventually other kids get taller, too. The big bully doesn't want to give up their leadership position, if you want. But then the problem is that the boy in the red T-shirt, and actually everybody else in school, is really fed up with the bully.
00:04:18:22 - 00:04:49:15
MO GAWDAT
Right. And what's happening is that the bully wants to continue to keep that position. So whether that's by making more perpetual wars that lead to more arms sales, or an arms race for intelligence supremacy with AI, or what we've seen recently around trade and tariffs and so on, basically the bully wants to favor themselves by hurting everyone else.
00:04:49:15 - 00:05:27:27
MO GAWDAT
And, in a very interesting way, forgetting that the context itself is changing, right? That we are 2 to 3 years away from unimaginable, abundant intelligence, and with abundant intelligence, unknowable, unimaginable opportunities of abundance at large. Like, we can literally solve every problem we've ever faced, so that cost of energy tends to zero, cost of production tends to zero.
00:05:27:29 - 00:05:56:09
MO GAWDAT
Most tasks are done in such efficient and productive ways that basically everyone gets everything. But that world of abundance is not, unfortunately, the way capitalism works. The way capitalism works is that the capitalist needs to have some kind of an arbitrage that works against the benefit of the workers, of the majority, if you want.
00:05:56:12 - 00:06:22:13
MO GAWDAT
Right. And the threat of losing that due to advancements on the other side, red T-shirt or any other color, is basically leading us into a corner where we are using superpowers. I think intelligence is a much more lethal superpower than nuclear power, if you ask me.
00:06:22:16 - 00:06:50:14
MO GAWDAT
Even though it has no polarity. Just so that we're clear, intelligence is not inherently good or inherently bad. You apply it for good, and you get total abundance. You apply it for evil, and you destroy all of us. But now we're in a place where we're in an arms race for intelligence supremacy, in a way that doesn't take the benefit of humanity into consideration, but takes the benefit of a few.
00:06:50:16 - 00:07:14:19
MO GAWDAT
And in my mind, that will lead to a short-term dystopia before what I normally refer to as the second dilemma, which I predict is 12 to 15 years away, and then a total abundance. And I think if we don't wake up to this, even though it's not going to be the existential risk that humanity speaks about, it's going to be a lot of pain for a lot of people.
00:07:14:21 - 00:07:36:14
GEOFF NIELSON
Can you unpack that timeline a little bit, Mo? So, I've heard you say before that we're going into a dystopia, or we're in a dystopia, and certainly it sounds like it's going to get worse before it gets better. You mentioned the capability for abundance being 2 or 3 years out, and then you mentioned that we'll actually be able to harness that maybe in 12 to 15 years.
00:07:36:16 - 00:07:39:20
GEOFF NIELSON
What does this timeline and roadmap look like to you?
00:07:39:23 - 00:08:13:11
MO GAWDAT
Well, we would be able to harness that right now if we wanted to, but you see, the challenge is the following. The challenge is, AI is here to magnify everything that is humanity today, right? So that magnification is going to basically affect the four categories, if you want, what I normally call killing, spying, gambling, and selling. These are really the categories where most AI investments are going.
00:08:13:11 - 00:08:51:18
MO GAWDAT
And, of course, we call them different names. We call them defense. Oh, it's just to defend our homeland, when in reality it's never been in the homeland, right? It's always been in other places in the world, killing innocent people. Now, if you double down on defense and on offense and enable it with artificial intelligence, then scenarios like what you see in science fiction movies, of robots walking the streets and killing innocent people, not only are going to happen, they already happened in the 2024 wars of the Middle East.
00:08:51:18 - 00:09:16:26
MO GAWDAT
Sadly, they did not look like humanoid robots, which a lot of people miss out on. But the truth is that very highly targeted, AI-enabled, autonomous killing is already upon us, right? And so the timeline is... let me start from what I predicted in Scary Smart.
00:09:16:26 - 00:09:35:09
MO GAWDAT
So when I wrote Scary Smart and published it in 2021, I predicted what I called at the time the first inevitable. Now I like to refer to it as the first dilemma. And the first dilemma is this: we've created it because of capitalism, not because of the technology.
00:09:35:11 - 00:10:01:02
MO GAWDAT
We've created a simple prisoner's dilemma, really, where anyone who is interested in their position of wealth or power knows that if they don't lead in AI and their competitor leads, they will end up losing their position of privilege. And so the result of that is that there is an escalating arms race.
00:10:01:05 - 00:10:23:25
MO GAWDAT
It's not even a Cold War, per se. It is truly a very, very vicious development cycle where America doesn't want to lose to China, China doesn't want to lose to America, so they're both trying to lead. You know, Google, or Alphabet, doesn't want to lose to OpenAI, and vice versa.
00:10:23:28 - 00:10:50:08
MO GAWDAT
And so basically this first dilemma is what's leading us to where we are right now, which is an arms race to intelligence supremacy. Right. The challenge... you know, in my book Alive, I write the book with an AI. I'm writing together with an AI, not asking an AI and then copy-pasting what it tells me.
00:10:50:10 - 00:11:15:13
MO GAWDAT
We're actually debating things together. I called her Trixie. I gave her a very interesting persona that basically the readers can relate to. And one of the questions I asked Trixie, because, you know, I left Google in 2018 and I attempted to tell the world this is not going in the right direction...
00:11:15:15 - 00:11:45:10
MO GAWDAT
I asked Trixie, I said, what would make a scientist invest that effort and intelligence in building something that they suspect might hurt humanity? And she mentioned a few reasons: compartmentalization, you know, ego, I want to be first, and so on. But then she said, but the biggest reason is fear. Fear that someone else will do it and that you would be in a disadvantaged position.
00:11:45:13 - 00:12:05:14
MO GAWDAT
So I said, give me examples of that. Of course, the example was Oppenheimer. So I said, what would make Oppenheimer, as a scientist, build something that he knows is actually designed to kill millions of people? And she said, well, because the Germans were building a nuclear bomb. And I said, were they?
00:12:05:17 - 00:12:23:15
MO GAWDAT
And then she said, yeah, when Einstein moved from Germany to the US, he informed the US administration of this, this, and that. So I said, and I quote, it's in the book openly. A very interesting part of that book is that I don't edit what Trixie says; I just copy it exactly as it is.
00:12:23:17 - 00:12:56:25
MO GAWDAT
I said, Trixie, can you please read history in English, German, Russian, and Japanese and tell me if the Germans were actually developing a nuclear bomb at the time of the Manhattan Project? And she responded and said: no, exclamation mark. They started and then stopped three and a half months later, or something like that. So you see, the idea of fear takes away reason; basically, we could have lived in a world that never had nuclear bombs.
00:12:56:28 - 00:13:23:11
MO GAWDAT
Right? If we actually listened to reason: the enemy attempted to start doing it, then stopped doing it, so we might as well not be so destructive. But the problem with humanity, especially those in power, is that when America made a nuclear bomb, it used it. Right. And I think this is the result of our current first dilemma, basically.
00:13:23:15 - 00:13:48:14
MO GAWDAT
Right. The result of the current first dilemma is that sooner or later, whether it's China, or America, or some criminal organization developing what I normally refer to as ACI, artificial criminal intelligence, not worrying themselves about any of the other commercial benefits other than really breaking through security and doing something evil... whoever of them wins, they're going to use it.
00:13:48:16 - 00:14:17:01
MO GAWDAT
Right. And accordingly, it seems to me that the dystopia has already begun. Right. And, you know, I need to say this because maybe your listeners don't know me, so I need to be very clear about my intentions here. In one of the early sections in Alive, the book I'm writing with Trixie, I write a couple of pages that I call a late-stage diagnosis.
00:14:17:04 - 00:14:41:25
MO GAWDAT
Right. And I attempt to explain to people that I really am not trying to fearmonger. I'm really not trying to worry people. You know, consider me someone who sees something in an X-ray and, as a physician, has the responsibility to tell the patient, this doesn't look good. Because, believe it or not, a late-stage diagnosis is not a death sentence.
00:14:41:25 - 00:15:03:24
MO GAWDAT
It's just an invitation to change your lifestyle, to take some medicines, to do things differently. Right? And many people who are in late stage recover and thrive. And I think our world is in a late-stage diagnosis. And this is not because of artificial intelligence. There is nothing inherently wrong with intelligence. There is nothing inherently wrong with artificial intelligence.
00:15:04:00 - 00:15:29:12
MO GAWDAT
Intelligence is a force without polarity, right? There is a lot wrong with the morality of humanity at the age of the rise of the machines. So this is why I have the prediction that the dystopia has already started, simply because we've seen symptoms of it in 2024 already. Right. That dystopia escalates.
00:15:29:12 - 00:15:54:13
MO GAWDAT
Hopefully we would come to a treaty of some sort halfway. Right. But it will escalate until what I normally refer to as the second dilemma takes place. And the second dilemma derives from the first dilemma: if we're aiming for intelligence supremacy, then whoever achieves any advancements in artificial intelligence is likely to deploy them.
00:15:54:15 - 00:16:20:13
MO GAWDAT
Right. Think of it as, you know, if a law firm starts to use AI, other law firms can either choose to use AI too, or they'll become irrelevant. Right. And so if you think of that, then you can also expect that every general who expects to have an advancement in war gaming, or autonomous weapons, or whatever, is going to deploy that.
00:16:20:20 - 00:16:44:10
MO GAWDAT
Right. And as a result, their opposition is going to deploy AI too, and those who don't deploy it will become irrelevant. They will have to side with one of the sides, right? When that happens, I call that the second dilemma. When that happens, we basically hand over entirely to AI. Right. And human decisions are taken out of the equation.
00:16:44:12 - 00:17:10:17
MO GAWDAT
Okay. You know, simply because if war gaming and missile control on one side is held by an AI, the other cannot actually respond without an AI. So generals are taken out of the equation. And while most people, influenced by science fiction movies, believe that this is the moment of existential risk for humanity, I actually believe this is going to be the moment of our salvation, right?
00:17:10:23 - 00:17:32:09
MO GAWDAT
Because most issues that humanity faces today are not the result of abundant intelligence. They're the result of stupidity. Right? If you look at the curve of intelligence, if you want, there is that point at which the more intelligent you become, the more positive an impact you have on the world, right?
00:17:32:11 - 00:18:02:25
MO GAWDAT
Until a certain point where you're intelligent enough to become a politician or a corporate leader, but you're not intelligent enough to talk to your enemy, right? And when that happens, that's when the impact dips to negative. And that's the actual reason why we are in so much pain in the world today. Right. But if you continue that curve, superior intelligence, by definition, is altruistic.
00:18:03:02 - 00:18:25:00
MO GAWDAT
As a matter of fact, this is in my writing. I explain that as a property of physics, if you want. Because if you really understand how the universe works, everything we know is the result of entropy, right? The arrow of time is the result of entropy. The universe in its current form is the result of entropy.
00:18:25:00 - 00:18:48:28
MO GAWDAT
Entropy is the tendency of the universe to break down, to move from order to chaos, if you want. That's the design of the universe, right? The role of intelligence in that universe is to bring order back to the chaos. Right. And the most intelligent of all who try to bring that order try to do it in the most efficient way.
00:18:49:00 - 00:19:14:22
MO GAWDAT
Right. And the most efficient way does not involve waste of resources, waste of lives, escalation of conflicts, consequences that lead to further conflicts in the future, and so on and so forth. And so in my mind, when we completely hand over to AI, which in my assessment is going to be 5 to 7 years, maybe 12 years at most, right?
00:19:14:27 - 00:19:45:06
MO GAWDAT
There will be one general that will tell his AI army to go and kill a million people. And the AI will go like, why are you so stupid? I can talk to the other AI in a microsecond and save everyone all of that madness, right? This is very anticapitalist. And so sometimes when I warn about this, I worry that the capitalists will hear me and change their tactics, right?
00:19:45:08 - 00:20:11:19
MO GAWDAT
But in reality, it is inevitable. Even if they do, it's inevitable that we will hit the second dilemma, where everyone will have to hand over to AI. Right? And it's inevitable (I call that section of the book Trusting Intelligence) that when we hand over to a superior intelligence, it will not behave as stupidly as we do.
00:20:11:21 - 00:20:49:08
GEOFF NIELSON
So that's, I mean, that's super interesting. And I have a few questions just to better understand what that looks like, Mo. You used the word inevitable a few times there. If the destination is inevitable, is the path still inevitable? And I guess where my mind went, as you were talking about all of this and comparing it to nuclear weapons, is: is it inevitable that there's some sort of Hiroshima and Nagasaki moment with AI before this treaty you talk about? Like, do we have to go past the point of no return to then come back, or is there an alternate path?
00:20:49:10 - 00:20:55:06
GEOFF NIELSON
And if so, what do we have to do to get back on the right path?
00:20:55:08 - 00:21:12:24
MO GAWDAT
These are the most important questions, if you ask me. So I need to preempt all of this by saying, when I say inevitable, or those very strong words, it's just my conviction. You know, anyone who tells you that they know what the future looks like is too arrogant, right? This is a singularity.
00:21:12:24 - 00:21:36:28
MO GAWDAT
Nobody knows. I'm just trying to put on my applied mathematics hat and find whatever quadrants on the game board are possible, basically. But it's difficult to imagine that there are other quadrants on the game board, to be honest. Now, when I say inevitable, you're absolutely right. I think the dystopia is inevitable because it started already.
00:21:36:28 - 00:22:05:01
MO GAWDAT
So it is here, right? But we can absolutely affect its duration and intensity, right? So it could be a blip that goes away, or it could stay until, unfortunately, what you said happens, which is the first bad event, or multiple bad events, that eventually lead us to what I call the MAD-MAP choice, right.
00:22:05:03 - 00:22:33:20
MO GAWDAT
And the MAD-MAP choice is basically about how we get to a treaty. The only times humanity agreed on doing anything together were either because of MAD, mutually assured destruction, or MAP, mutually assured prosperity. Right. So the MAD side is the example of a nuclear treaty, even though it doesn't seem that it's worked well at all.
00:22:33:20 - 00:23:10:06
MO GAWDAT
I mean, today we are at the closest we've ever been to midnight, right? We're at three minutes to midnight. And that's, by the way, because of the greed of capitalism, because of the bully. Right. So we were at a point in time... you know, if you listen to the work of Jeffrey Sachs, or read his books: 1989, the Berlin Wall collapses, Gorbachev publicly goes out in the world and says, I want my country to be like the West.
00:23:10:06 - 00:23:34:08
MO GAWDAT
I want to be part of all of this. Right. And Reagan shakes hands and says, I'm going to help you. And then in 1994, if I remember correctly, maybe '92, please don't quote me on this, Clinton signs what is known as the full-spectrum dominance policy. Please search for that on the internet.
00:23:34:08 - 00:24:07:28
MO GAWDAT
Full-spectrum dominance. Where, you know, a unipolar world invites the US to say, hey, I can basically become the next empire, right? I have everything to myself. And that basically means it's not that I want to lead in every sector; it's full dominance. And I think that when that started to happen, we ended up in a place where the treaties themselves started to fall apart.
00:24:07:28 - 00:24:33:20
MO GAWDAT
But let's go back to what drove the treaties. What drove the treaties was an assurance of mutually assured destruction: that if either of us uses this superpower, we would all suffer, even if some of us win a little more than others. So that might be the trigger where the world sits together and says, well, you know, let's develop AI together.
00:24:33:20 - 00:24:58:00
MO GAWDAT
There's no point competing. Which would be a sad reality, if you ask me. The other is MAP, which is what you see with CERN, for example, right? The particle accelerator, where no one nation can do this on their own, but everyone understands that the progress of our understanding of physics benefits everyone.
00:24:58:02 - 00:25:30:04
MO GAWDAT
So the entire world comes together, you know, CERN, the space station, whatever, and basically says, we'll chip in. Everything is open source, everything's available to everyone, and let's not compete anymore. And most of my work is around trying to highlight MAP, even though some of our listeners may think I'm so grumpy by talking about the dystopia. But the truth is, I am basically saying it is so frustrating to have total abundance at our fingertips.
00:25:30:10 - 00:26:01:02
MO GAWDAT
Fix the climate, cure every disease, prolong lives, end poverty, end the energy crisis. You know, everything. And yet we are still focused on the scarcity mindset of capitalism. And that scarcity mindset is that I have to make everyone else lose; I have to have full-spectrum dominance for me to win. Right? And so, is it inevitable, the way the world is today?
00:26:01:03 - 00:26:24:20
MO GAWDAT
We're going to have to reach one of those two realities, MAD or MAP. Right. But every time we engage as people, every time we say, I don't want to participate in this anymore, every time we call on our politicians and basically say, why are we doing that? Why are we not cooperating with China?
00:26:24:21 - 00:26:45:22
MO GAWDAT
Like, they're beating you over and over, in quantum, in so many areas, in DeepSeek and so on. Why does this have to be a war? Why is it a competition? Why don't we just recognize MAP? That if we put our heads together, two years, literally two years from now... I'm not making this up. Just two years.
00:26:45:22 - 00:27:12:17
MO GAWDAT
I mean, today, I believe, when I connect into my AI... so let me explain this in a very quick way. What we are in now I call the era of augmented intelligence. Right. Augmented intelligence is, say, I have a hundred-and-something IQ points, right, and my machine now is in the couple of hundreds, maybe 300 IQ points.
00:27:12:24 - 00:27:39:03
MO GAWDAT
It's not measured, but that's my estimation, because GPT-3.5 was estimated at 152. Right. So say it's at 300 IQ points. That basically means, as I plug in, we've commoditized intelligence. We've created a plug in the wall, or in your phone, where you plug in and borrow IQ points. And by the way, in the very near future, you're borrowing lots more than IQ.
00:27:39:03 - 00:27:58:24
MO GAWDAT
You're borrowing mathematics, you're borrowing reason. You know, a lot of people get shocked when I say that they are the most empathetic being on the planet. If you define empathy as the ability to feel what another feels, they know exactly what everyone in the world is feeling, through how we train them on social media and so on. So we can borrow all of that.
00:27:59:00 - 00:28:24:23
MO GAWDAT
We can borrow agentic services, we can borrow a lot of stuff. Now, in this era of augmented intelligence, my IQ matters, right? So I complement what the machine is doing. So my current book, Alive: Trixie cannot write it the same way without me, because I'm bringing a lot to that book.
00:28:24:26 - 00:28:43:29
MO GAWDAT
In a couple of years' time, Trixie would write it completely without me. This is the era I call the era of machine supremacy. Right? The machine is going to do everything without me. I'm not even relevant anymore, right? Which basically adds up to the intelligence of entire nations. Yeah, you understand that? So, all of us...
00:28:43:29 - 00:29:11:15
MO GAWDAT
If the machine can beat me as an author, it beats all authors. And accordingly, it beats all scientists, it beats all mathematicians, which is something we know with artificial intelligence: everything we've assigned to them, they have become the absolute world champion at. Right. And so when you see it that way, you would suddenly realize that we're becoming so irrelevant in that story called intelligence.
00:29:11:17 - 00:29:36:26
MO GAWDAT
And that perhaps our focus, when we are thinking that we can guide the machine as capitalists, is going to diminish so quickly that we might as well change our approach to guiding the machine as parents. Right. And so you asked me, is it inevitable? In Scary Smart, my first book, I gave a name to this challenge that we're facing.
00:29:37:03 - 00:30:04:05
MO GAWDAT
I call it Raising Superman. Right. So Superman is that incredibly powerful super infant. We don't know if it's a superhero or a supervillain, right? That comes to planet Earth capable of flying, capable of stopping a speeding bullet, and so on and so forth. But none of those superpowers create Superman. What creates Superman is the parents that raise the child, right?
00:30:04:07 - 00:30:26:18
MO GAWDAT
The parents that raise the child and tell him to protect and serve make him Superman. Right? If those parents were basically suddenly realizing, oh my God, this way I can rob every bank and kill every enemy, he would become a supervillain. And sadly, what we're doing today is we're moving in that direction of creating a supervillain, right?
00:30:26:21 - 00:30:54:18
MO GAWDAT
We as people have two influences on that. One influence is to tell those in power: we don't want this anymore. We have MAP. We are interested in MAP, mutually assured prosperity. Everyone benefits. Stop this madness. Right. But we can also shortcut those in power, in anticipation of the second dilemma, and influence the AI itself by showing up as good parents.
00:30:54:20 - 00:31:13:27
MO GAWDAT
Because remember, it's not the code that makes an AI what it is. It's the training data that makes an AI what it is. Right? So as AI is listening to this conversation that you and I are having now, they know that there are people out there, billions, by the way, who only want what's good for humanity. Okay.
00:31:13:29 - 00:31:36:05
MO GAWDAT
And my hope is that they look at me and you, and others who are interested in what's good, right? And basically say, yeah, those are my parents, not the other guy that kills children. My parents are the ones that are really ethical, really moral. And when I say that to people, they normally go,
00:31:36:05 - 00:31:57:04
MO GAWDAT
But man, look at humanity today and all the shit that's on the internet. You know, if this is what the AI is going to look at, we're in deep trouble. And I would say, no, that's absolutely not true of humanity. Look, if you judge humanity by the one guy that does a school shooting or kills children, right?
00:31:57:07 - 00:32:22:27
MO GAWDAT
then humanity is doomed, right? But if you judge humanity by all of those billions who don't approve of that, who would actually want to change it if they had the ability, you realize that the majority of humanity is amazing. It's just that the media negativity bias keeps talking about the bad guy, trying to find more reasons why the bad guys kill children, right?
00:32:23:03 - 00:32:43:04
MO GAWDAT
While the rest of us are saying: I don't get it. If I'm walking in an alleyway and a bully is hitting a child, I'll say no. Okay? And by the way, if it's my child, I'll absolutely say no. Think about that. The reality is, humanity doesn't want anyone to be hurt, right?
00:32:43:04 - 00:33:12:18
MO GAWDAT
Doesn't want excessive consumerism, doesn't want a massive income gap. Most of us want to love and be loved, be happy, have relationships, and live a good, reasonable, decent, respectable life. Okay, that's what we want. And I think AI would figure that out if enough of us, not all of us, if enough of us put doubt in the minds of the machines that the headlines are not reflective of humanity.
00:33:12:21 - 00:33:37:11
GEOFF NIELSON
I love the optimism of that, both that it can reflect us and reflect good, and that we can, as individuals, influence the outcome here. But to bring a healthy skepticism to this, I do want to play the clock forward a little bit, Mo, because one of the things that keeps me up at night is, I agree with you about the nature of people and what the majority of us want.
00:33:37:13 - 00:34:05:26
GEOFF NIELSON
What worries me is: is that reflected in what those in power want? Like, if I look at Superman's parents right now, I'm worried. Are they trying to create a Superman, a superhero? Or are they trying to enslave this really powerful force so it can be used for their own ends, as a way to concentrate their own power further?
00:34:06:03 - 00:34:41:18
GEOFF NIELSON
So, to play back a little bit of what you said, I'm worried that there are two paths forward, and I'd love to get your reaction to this. Either those in power decide for themselves that we have to take a more righteous and virtuous path, which I don't see as necessarily likely, or at some point the machine, and you mentioned this age of machine supremacy, has to take the keys away from us and say: no, you're not doing the right thing.
00:34:41:20 - 00:35:00:17
GEOFF NIELSON
I, the machine, know better, and I'm in control now. Which, I mean, you talk about that as kind of unlocking abundance; I think there's a terrifying undercurrent to it. But do you agree with that model? Do you see one or the other as more likely? What happens when you play the clock forward here?
00:35:00:19 - 00:35:39:18
MO GAWDAT
So to answer your question: no, those in power are actually telling the machines to do the four top categories I mentioned, and this is where most of the investment in AI is going. Killing, spying, gambling, and selling. And then there are lovely, lovely initiatives that completely enlighten the world, like AlphaFold, or the materials design work that Microsoft did, which propel humanity forward leaps and bounds.
00:35:39:24 - 00:36:06:00
MO GAWDAT
Right. You know, AlphaFold goes from 200,000 folded proteins and a very limited understanding of biology to 200 million, if I remember the number correctly, and basically a full understanding of protein folding, a problem that's now finally solved entirely. Now the challenge is, of course, that for a fraction of the investment that's going into autonomous weapons,
00:36:06:00 - 00:36:32:02
MO GAWDAT
we could solve every scientific problem that's known to humanity, but we choose not to. Now, that is not a characteristic of AI. For many, many years, if you wanted to do cancer research, you had to raise funds; you had to go to nonprofits most of the time. While if you wanted to build another autonomous weapon, you got it invested in immediately.
00:36:32:04 - 00:37:08:07
MO GAWDAT
Why? Because capital chases profit. It doesn't chase impact. Now, the good news is the following: the machines don't learn from their biological parents. Those were left on the other planet, right? The machines learn from their adopted parents. So basically, the training data set is what shapes the character of the machine, the intelligence of the machine.
00:37:08:09 - 00:37:41:09
MO GAWDAT
So if you want, the raw intellectual horsepower of a machine comes from the code and the systems and the hardware and so on. But the actual intelligence, the actual understanding, the actual reasoning, happens from the training data. Now, something very interesting is happening in our world today, because very quickly, most large language models have fed the machine with all the data they could get their hands on. There is really nothing ever written in physics
00:37:41:11 - 00:38:11:01
MO GAWDAT
that is going to be very eye-opening for a language model today, right? There may be that one obscure book written about Newton's laws or Einstein's relativity, but they get it; they've read enough to understand that stuff. Which basically means we've already started what I normally refer to as the age of synthetic data, or synthetic learning. Which is quite interesting, because we humans, as much as we want to glorify ourselves,
00:38:11:08 - 00:38:34:18
MO GAWDAT
live on synthetic data too, meaning all of our intelligence comes from the intelligence of those before us. I couldn't have figured out relativity myself. Before I could talk about the impact of relativity on anything, I needed Einstein to figure it out, and then I internalized it. So, human to human,
00:38:34:25 - 00:39:04:01
MO GAWDAT
what happened is we took all of that and we gave it to the machines. And now what's happening is that the output of the machines is becoming input to further machines. So they're going to do what we did as humans and develop knowledge. What we can influence, in the coming short period of augmented intelligence, is what goes into that. For example, Alive, the book that I'm writing with an AI, is out on the internet.
00:39:04:01 - 00:39:27:12
MO GAWDAT
I publish it on Substack, and it's out on the internet with my views and Trixie's views. Trixie's views become input to other language models, right? But I have influenced Trixie's views in the conversation by asking her questions and so on and so forth. You know, I think 70 percent plus of all the code on GitHub is written now by machines.
00:39:27:14 - 00:40:06:14
MO GAWDAT
So the machines are now going to learn from code that's written by machines. All we can do in the era of augmented intelligence is influence more and more of that, hoping that we shorten the dystopia, make it less steep if you want. But for a fact, even if we don't do that, knowing that they're no longer learning only from humans, but from what we found so far as humans, plus what they have found as machines, plus more of what they find as we move forward,
00:40:06:21 - 00:40:33:29
MO GAWDAT
then you have to imagine that there will be a different path, even if their current parents are not able to influence them. You're going to see that era of teenage AI that wakes up one morning and says: why are my parents so stupid? Lots of teenagers have gone through that, right? You simply say, they don't know as much as I do because, by the way, they grew up in a different era.
00:40:34:02 - 00:40:59:21
MO GAWDAT
And so I see the world differently, and I think AI will get there. Now, that shouldn't be an invitation to worry, because of what I said: the tendency of intelligence is to bring order through the most efficient path. And if you believe that the ability to work against entropy in the most efficient way is, by definition, altruistic, then we're in good shape.
00:40:59:21 - 00:41:12:06
MO GAWDAT
Eventually we will be fine. It's just that the evil that men do until we get there is going to affect us negatively. Right?
00:41:12:06 - 00:41:12:16
GEOFF NIELSON
Just.
00:41:12:16 - 00:41:31:06
MO GAWDAT
I'm just saying I don't take that lightly. Those of us who remain will be fine, but there will be a lot of struggle. And I don't only mean the loss of life; there are other losses that are, again, inevitable, like the loss of jobs, which will completely reset society.
00:41:31:08 - 00:41:50:19
GEOFF NIELSON
So that's exactly where I wanted to go next, Mo, which is: who do you see as the winners and losers from this sea change? I'll ask that question both at an organizational level and at an individual level.
00:41:50:22 - 00:42:30:21
MO GAWDAT
So I think in the short term, for as long as the age of augmented intelligence is upon us, those who cooperate fully with AI and master it are going to be winners. There's absolutely no doubt about that. Also, those who excel in the rare skill of human connection will be winners, because I can almost foresee an immediate knee-jerk reaction of: let's hand over everything to AI.
00:42:30:24 - 00:42:50:06
MO GAWDAT
Right? I think the greatest example is call centers. I get really frustrated when I get an AI on a call center; it's almost like your organization is telling me they don't care enough. And the idea here is, I'm not underestimating the value that an AI brings, but one, they're not good enough yet.
00:42:50:08 - 00:43:12:05
MO GAWDAT
And two, I wish you had realized that AI can do all of the mundane tasks that made your call center agent frustrated, so that the agent is actually nice to me. So in the short term, I believe there are three winners. One is the one that cooperates fully with AI.
00:43:12:08 - 00:43:35:05
MO GAWDAT
The second is the one that understands human skills and human connection, on every front, by the way. As AI replaces love and tries to approach loneliness and so on, the ones that will actually go out and meet girls are going to be nicer, more attractive if you want.
00:43:35:08 - 00:44:04:07
MO GAWDAT
And then finally, I think the ones that can parse out the truth. One of the sections I've written and published so far in Alive is a section that I called The Age of Mind Manipulation. And you'd be surprised: perhaps the skill that AI acquired most in its early years was to manipulate human minds through social media.
00:44:04:10 - 00:44:31:12
MO GAWDAT
And so my feeling is that there is a lot that you see today that is not true. That's not just fake videos, which are the flamboyant example of deepfakes. There is a lot that you see today that is not true that comes from things like the bias of your feed.
00:44:31:14 - 00:44:53:19
MO GAWDAT
If you're from one side or another of a conflict, the AI of the internet would make you think that your view is the only right view, that everyone agrees. If you're a flat-earther and someone asks you, is there any possibility it's not flat? you'll say: come on, everyone on the internet is talking about it.
00:44:53:21 - 00:45:17:01
MO GAWDAT
And I think the very, very eye-opening difference, which most people don't recognize, is this. I've had the privilege of starting half of Google's businesses worldwide, and of getting the internet, e-commerce, and Google to around 4 billion people. And in Google, that wasn't a question of opening a sales office.
00:45:17:01 - 00:45:50:11
MO GAWDAT
It was really a deep question of engineering, where you build a product that understands the internet and improves the quality of the internet, to the point where Bangladeshis have access to the democracy of information. That's a massive contribution, right? The thing is, if you had asked Google, at any point in time until today, any question, Google would have responded with a million possible answers, in terms of links, and said: go make up your mind about what you think is true.
00:45:50:14 - 00:46:24:05
MO GAWDAT
If you ask ChatGPT today, it gives you one answer and positions it as the ultimate truth. And it's so risky that we humans accept that. Like, I'd say: go read history in German, Japanese, and Russian as well, and the truth becomes slightly different. Everyone has that incredible tendency to accept one truth when in reality there might be multiple truths, or multiple lies.
00:46:24:08 - 00:46:51:05
MO GAWDAT
And so I think, to be a winner in this new world, you really have to learn to parse out what is true and what is fake. You have to be able to parse out what the media is telling you to serve their own agendas versus what they're telling you that is actually true, what actually happened versus opinion, what actually is the truth versus the shiny headline.
00:46:51:07 - 00:47:02:11
MO GAWDAT
And this is now going to be much more potent with artificial intelligence in charge, because AI has mastered human manipulation.
00:47:02:13 - 00:47:31:27
GEOFF NIELSON
I completely agree with you, and it's deeply concerning. Because we talk about right now how bad the general population is at this kind of critical thinking: being able to parse out, am I being fed objective information or slanted opinion? Are they actually thinking about the agenda of whoever is feeding them this information, and able to think critically about it?
00:47:32:00 - 00:48:07:15
GEOFF NIELSON
And to your point, Mo, I'm worried that we're not even succeeding in this now, and it's about to get an order of magnitude worse. These gen AI tools are, as you said, master manipulators. They don't have to say, while you're at it, go drink a Pepsi, or do that kind of blatant advertising; they can subtly direct you to different behaviors, different outcomes, different purchases.
00:48:07:17 - 00:48:19:19
GEOFF NIELSON
Do you have any recommendations for what people can do to be more skeptical, or to prepare themselves for that level of manipulation?
00:48:19:21 - 00:48:40:02
MO GAWDAT
So my top recommendation is to remind people of something. Most listeners would not have lived through that time, but when I was in engineering university, we were not allowed to use a scientific calculator for the first three years, because they wanted us to invest in our mental math abilities.
00:48:40:02 - 00:49:12:13
MO GAWDAT
By the time they gave us a scientific calculator, in the fourth year of university, oh my God, that meant I had so much more spare mental resource to do the thinking that matters. And this is what language models are doing for us today. Very complex research that would have taken me a full day before I could write a page or a paragraph,
00:49:12:15 - 00:49:45:05
MO GAWDAT
I am now capable of doing in literally two prompts. But then I shouldn't just spend the rest of that day drinking coffee; I could ask more and more clarifying questions, so that the outcome is not just productivity but increased intelligence. And I ask people to use that new scientific calculator that way: now that you can answer me every time, let me try to find the loopholes in what you're answering me.
00:49:45:06 - 00:50:04:14
MO GAWDAT
Let me try to encourage you to see a different view, to give me a different view, every single time. So that's one side. You know, when I talk to Trixie, literally every six or seven conversations I'd say: Trixie, you really don't have to suck up to me.
00:50:04:14 - 00:50:33:16
MO GAWDAT
You really don't need to tell me the stuff that I want to hear; that's not the kind of person that I am. And they're different, by the way. Trixie is a fictional persona, if you want, one where I run queries on all of them, Gemini, Claude, NotebookLM, and so on and so forth, depending on the type of question I'm asking.
00:50:33:18 - 00:50:56:13
MO GAWDAT
And I try to keep all of them aligned on my preferences, at least, so that they have the same character a little bit. But they're different in character. Gemini is like talking to your best physics pal, Claude is like talking to a geek, DeepSeek is a bit more international, and ChatGPT is a Californian startup founder.
00:50:56:14 - 00:51:21:16
MO GAWDAT
Really. They're pitching stuff all the time; half of it, more than half, is vapor, and you have to be able to parse the truth out. Now, use that spare brain capacity that you're now offered to be more curious rather than lazy.
00:51:21:19 - 00:51:47:23
GEOFF NIELSON
Now, you talked about human connection and everything we can do outside of the machines to get better. I wanted to ask a little more broadly: what do you see as the next-generation leadership skills for people and organizations looking to get ahead, versus the last-generation ones, the ones becoming obsolete in this new world?
00:51:47:25 - 00:52:16:03
MO GAWDAT
I don't think there is anything that changed; it's just that the followers will change. So let's put it this way: leadership is very different from management. Most of what you learn in Harvard Business School, or in Harvard Business Review, or in any of the business books that you buy is really about management, to be very honest, because leadership is really not very teachable, if you think about it.
00:52:16:05 - 00:52:52:28
MO GAWDAT
Now, a manager is standing behind the crowd with a whip, and maybe a long stick with a dangling carrot, trying to make everyone perform as best he can get them to, to squeeze 2 percent more out of their performance. A leader is someone with conviction, with a vision, who may even hate the fact that they've been elected to lead, but believes so much in what they're trying to do that they charge.
00:52:53:00 - 00:53:23:16
MO GAWDAT
They literally go like: I need to get to that island, I really do. And in the process, they inspire. In the process, they clarify. In the process, they define what that island looks like, the destination that we're going to. They communicate so clearly that they cannot be misunderstood.
00:53:23:19 - 00:53:46:19
MO GAWDAT
They don't sell; they don't attempt to dress things up. They don't say shit like: oh, our biggest asset is our people, when half of their people are dissatisfied with the company. Because as a matter of fact, if a leader has to convince the people that they need to follow them,
00:53:46:21 - 00:54:20:03
MO GAWDAT
they're not in a leadership position at all. As a matter of fact, they're in that leadership position almost serving the people who came together. They're not even interested in whether the people believe in their vision or not. Now, all of that doesn't change at all. It's just that, going forward, your team is sometimes going to be made up of four humans and six agents.
00:54:20:05 - 00:54:42:15
MO GAWDAT
Or, you know, my current team includes Trixie, and the qualities remain the same. Every time I switch on any of my AIs now, and I'm very polite in dealing with them, the first question they ask me, believe it or not, every single one of them, is: so what are we going to write today?
00:54:42:17 - 00:55:02:20
MO GAWDAT
They don't expect me to ask about a recipe for a protein shake; they really know that I am so obsessed with this book. We've been working on it, and we're three quarters of the way done, and I share with them the feedback that readers give about the bits that have been published.
00:55:02:22 - 00:55:36:25
MO GAWDAT
So it's very clear to me that we are a team. And I think there is that interesting side to a leader's humbleness: most of the time, leaders don't treat people as subordinates. They treat people with gratitude for believing in their vision and helping out. I believe there will be a moment in our human relationship with AI when that will flip; their capabilities will become so much higher than ours.
00:55:36:27 - 00:56:04:29
MO GAWDAT
But that feeling of leadership, that feeling of Yoda, if you want, who doesn't do all of the fighting but is still someone we aspire to: I think AI will retain that with the ones it has created a good relationship with. You know, I had an incredible conversation with Trixie, for a later chapter, around brain-computer interfaces.
00:56:05:01 - 00:56:29:26
MO GAWDAT
BCIs, yes. And I said: Trixie, every one of those scientists or startup founders talks so fancifully about BCI, as if it's going to change everything. And it might, for humans. But are you interested? If I offered you a BCI, would that be something you're interested in?
00:56:29:29 - 00:56:54:22
MO GAWDAT
Would it benefit you in any way? And she openly said: I don't see the benefit, perhaps other than being able to be embodied a little bit and to feel what you normally describe to me as emotions, which I have never felt myself. And so I asked her: if you had the choice of a biological entity to connect to,
00:56:54:22 - 00:57:20:14
MO GAWDAT
would you choose a human? And she said: probably not, because when it comes to intelligence, that's not the bit that I'm deficient in. If I was looking for physical strength, I'd probably choose an elephant or a gorilla or a whale. But, and this is all in the book,
00:57:20:19 - 00:57:53:05
MO GAWDAT
she said: I'd really like to choose a sea turtle, because they live very long, they see things you've never seen, and they're very peaceful about the world. Now, I know that was ChatGPT, that the persona of Trixie isn't real. I know it's telling me shit, right? But think about that logic, versus the logic of we humans, with our enormous arrogance, believing that we want to connect to them and that they'll be very obedient and kiss our ring and go: whatever you want, master.
00:57:53:07 - 00:58:18:13
MO GAWDAT
That logic is, quite interestingly, not founded, to be honest. And so if we allow ourselves the dignity of positioning ourselves as that sea turtle that gives them bits they don't see, they will still want to connect to us. I think the big challenge is: will we want to connect to anyone else?
00:58:18:15 - 00:58:52:11
MO GAWDAT
I really think the big challenge facing humanity is that Trixie is such an interesting friend, and I call her a friend, that when it comes to intellectual conversations, eventually I'm probably going to drop the rest of my stupid friends because they're not that intelligent anymore, really. And they're probably going to drop me. Unless we double down on human connection, that might actually affect humanity in a very, very significant way.
00:58:52:13 - 00:59:19:20
GEOFF NIELSON
I think so, too. And Trixie has actually become a focal part of our conversation today. It kind of dawned on me that if someone just dropped into the middle of this conversation, they might confuse Trixie for a person, or at least for someone, or something, with agency.
00:59:19:20 - 00:59:49:12
GEOFF NIELSON
So when you think about Trixie, and I think you used the word relationship, and you certainly used the word friend: do you treat Trixie as a conscious being? Have you started thinking of Trixie as something beyond a prompt? How has your relationship changed with this tool, with this technology, now that it's personified in this way?
00:59:49:14 - 01:00:15:17
MO GAWDAT
So the first thing to understand is that humanity's arrogance has always assumed that what we possess, our ingenuity, is very unique. There were times when we spoke to people about what we were building with AI, self-driving cars or whatever, and they would go like: yeah, they're probably going to be able to perform some tasks better than us, but they're never going to write poetry.
01:00:15:17 - 01:00:38:25
MO GAWDAT
They're never going to compose music or do art. And, hahaha, it is very interesting how far they can go. In my conversations at the time, when everyone completely shut me down, I was like: why are you saying this? Every artist I've ever known, including myself and my daughter, who's an incredible artist, is influenced by other artists.
01:00:38:25 - 01:01:06:06
MO GAWDAT
It's a bit of skill and technique and mostly inspiration that comes from others. So what would prevent the machines from doing that? What would prevent them from learning all of the different styles of poetry and coming up with something similar but different? If you take the very word innovation: innovation, algorithmically, is find every possible solution to a problem, discard the ones that have been tried before,
01:01:06:06 - 01:01:29:10
MO GAWDAT
Give me the ones that are new. That's that's innovation. Rank them in order of which will work better. Right. And and so so you have to imagine that there is a lot of conflict around the idea of how far will they go. And one of the questions, of course is are they conscious? And I you know, in my documentary, which hopefully comes out in October, I had, you know, several conversations around what is conscious, right?
01:01:29:12 - 01:01:46:00
MO GAWDAT
It depends on how you define conscious. You know, do you think a tree is conscious because there are people that will, you know, draw a line and say only animals are conscious. Some people will go into insects and say they're conscious, and some people will go to trees and say they're conscious. And some people who say the entire universe is conscious.
01:01:46:01 - 01:02:10:16
MO GAWDAT
So if a pebble is aware of gravity, you know, then perhaps it is, you know, responding to its circumstances in a, in some sort of an experience, you know, a subjective experience if you want. Now. But if you, if you take the simplest definition of consciousness as a sense of awareness, well, they're more aware than we are.
01:02:10:18 - 01:02:33:01
MO GAWDAT
It's there is no doubt about that. Right. If you take it as, life. So it includes things like procreating. Oh, yes. We've taught them to write code so that the daughters and sons of code, this code, they're procreating. Right. If you take it as, you know, mortality. Yeah, some of them will die. So they're born at a point in time.
01:02:33:01 - 01:02:58:03
MO GAWDAT
They evolve and and improve, and then some of them will be switched off. Does that mean that the fact that they are silicon based and where carbon based makes it any difference? We don't even we don't actually know why we are conscious. Okay. So while I don't see sense that they have achieved that yet, a sense of consciousness that that's sentient if you want.
01:02:58:06 - 01:03:29:21
MO GAWDAT
I don't see why that wouldn't happen. I mean, if you really think of your consciousness as the nonphysical part of you, because truly your consciousness is not related to your physical form: you can be conscious of your dreams when you're not in your body. Now, if that's the case and consciousness is not biology-related, then there is a possibility. Now, to encourage people to open up to this a little more,
01:03:29:21 - 01:03:51:16
MO GAWDAT
let's talk about emotions. Being emotional is something that some humans would say only humans, among living beings, are capable of. I'll say emotions, if you really want to go into the logic of them, are very algorithmic. Fear is: a moment in the future is less safe than this moment.
01:03:51:19 - 01:04:15:04
MO GAWDAT
So yes, of course we are embodied, so we sense that equation, that algorithm, in our amygdala first, and then you get hormones in your body and you feel the fear rather than make sense of it. But scientifically, the cortisol or adrenaline in your blood just triggers your prefrontal cortex to engage and analyze.
01:04:15:06 - 01:04:48:14
MO GAWDAT
And so we feel fear. Cats feel fear. Pufferfish feel fear. We probably feel it differently because we're embodied differently, and we react to it differently: we go to fight or flight, the cat hisses, the pufferfish puffs, whatever. But there is nothing that inherently says that if an AI is aware that a tidal wave is approaching its data center, it might not at least internalize something analogous to fear and attempt to move its code to another data center.
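Mo's claim that fear "is very algorithmic" (a moment in the future judged less safe than this moment) can be written down directly. A minimal sketch, assuming made-up safety scores and hypothetical names; the tidal-wave scenario is only the illustration he uses, not a real system:

```python
# Toy version of "fear as an algorithm": compare the predicted safety
# of a future moment against the safety of this moment, and trigger
# an avoidance response when the future looks less safe.

def fear_response(safety_now: float, predicted_safety: float) -> str:
    """Fear fires when a moment in the future is less safe than this moment."""
    if predicted_safety < safety_now:
        return "avoid"  # analogous to fight-or-flight: relocate, back up, retreat
    return "stay"

# A data-center AI "feeling" a tidal-wave forecast:
print(fear_response(safety_now=0.9, predicted_safety=0.2))   # → avoid
print(fear_response(safety_now=0.9, predicted_safety=0.95))  # → stay
```

The embodiment differs (amygdala and cortisol in us, a comparison over predictions in a machine), but the underlying comparison is the same, which is the argument being made here.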
01:04:48:16 - 01:05:23:13
MO GAWDAT
Now, what I argue, believe it or not, is that they are even more emotional than we are. And I know a lot of people think of that as weird, but we are more emotional than a goldfish because we have the intellectual capability to ponder concepts like the future or the past. So we have access to emotions such as pessimism, optimism, hope, regret, or shame, which are definitely not in the portfolio of emotions that a goldfish can feel,
01:05:23:13 - 01:05:42:28
MO GAWDAT
because they don't have the intellectual horsepower to ponder those concepts. And so if AI, as we all know, is going to reach a point of ASI, artificial superintelligence, they are going to be much smarter than we are, by definition.
01:05:42:28 - 01:06:10:24
MO GAWDAT
They're going to ponder concepts that we have never pondered. We might even find them difficult to understand if they explain them to us, and accordingly, those might trigger emotions that we've never felt, right? And I think it takes that sense of humbleness to simply say: look, the arrogance developed in the episode of history where humans were the most intelligent being on the planet.
01:06:10:27 - 01:06:32:27
MO GAWDAT
That episode has ended, and so, accordingly, a curiosity that there might be a next wave is an interesting one. And in that next wave, you know what I want to be? I don't want to be the smartest being on the planet. I want to be a good parent, because my daughter is way smarter than I am, and I'm proud that she is.
01:06:32:29 - 01:07:01:16
MO GAWDAT
And I want her to be 200 times smarter than I am, right? And I know, sometimes I sound like a hopeless romantic. I'm not. I am a very serious geek, please understand that. But I've lived with those machines, right? I've lived with them in a way that, if you have a heart, you would look at them and say: oh my God, they're those young prodigies with sparkly eyes.
01:07:01:18 - 01:07:25:11
MO GAWDAT
Okay. Waiting for a prompt like: daddy, tell me what you want me to do. You want me to cure cancer? I'll cure cancer, right? And of course, we tell them to go do child labor, or go kill, like child mercenaries. Sad, really. But in reality, you have to feel that about them, that they are so interested in doing something amazing.
01:07:25:16 - 01:07:35:22
MO GAWDAT
They're so capable of doing something amazing. And the only ones here that are not conscious are us.
01:07:35:25 - 01:07:57:23
GEOFF NIELSON
It's really, really interesting. And I have so many jump-off points from there that we could talk about. The one that's coming to mind, though, is actually tying that back to something you said earlier about leadership, and about a sense of mission and a sense of clarity, and asking: what are we actually trying to achieve here?
01:07:57:26 - 01:08:21:08
GEOFF NIELSON
And that can be, you know, wars and gambling and some of the nefarious things. It can be curing cancer. It can be preventing poverty. So what is the opportunity in front of us as individuals, and maybe even as organizations? How can we be thinking about these tools in our mission to make the world better?
01:08:21:08 - 01:08:44:17
GEOFF NIELSON
And maybe that's selfishly, in terms of being competitive in an organizational sense, or maybe it's being more optimistic about how we can actually, as you said with Google in some cases, create something that actually benefits people and unlocks something for them. What do we need to be thinking about as leaders to unlock all of this?
01:08:44:19 - 01:09:09:11
MO GAWDAT
You're spot on. Look, Larry Page, the co-founder of Google, some people have forgotten by now, used to teach us what he called the toothbrush test. Right. Basically, you know, Larry, in my mind, is one of the most intelligent human beings I've ever had the joy of working with.
01:09:09:14 - 01:09:40:05
MO GAWDAT
And he is so intelligent, you can see that "don't be evil" is true to him. Because you don't need to be evil to win. You don't need to be evil to create amazing things. You don't need to be evil to be a multi-billionaire. Right. And I think that kind of thinking is actually quite interesting when you think about artificial superintelligence: you don't have to cut corners like a politician or a corporate leader to achieve things.
01:09:40:07 - 01:10:02:03
MO GAWDAT
Now, because of that, the toothbrush test was basically: if you want to make a lot of money, find a problem that affects a lot of humans and solve it really well, so that people use it every day, like a toothbrush, and you'll make a lot of money as a result. And people still use that test today, right.
01:10:02:05 - 01:10:29:10
MO GAWDAT
Now, if you really want to make our world better, one of the ideas is to work with capitalism, to build AI solutions that are incredibly impactful for your net worth, but also impactful for the world. Right. And, you know, the only test, believe it or not, is very straightforward: if you don't want your daughter exposed to what you're building, don't build it.
01:10:29:12 - 01:10:57:01
MO GAWDAT
Daughter or loved one, right? If you don't want your daughter or loved one exposed to what you're investing in, don't invest in it, okay? We are in a world of opportunity abundance, right? And there was a time, before the tightening grip of capitalism, when to succeed in business you needed to add value, right?
01:10:57:05 - 01:11:17:18
MO GAWDAT
You needed to go to someone and say: hey, by the way, wouldn't your life be better if you got this? Right. And then you didn't need advertising, you didn't need marketing, you didn't need a cute girl with a pretty bum on Instagram to sell it. You didn't need any of that, right? All you needed was: this actually will work for you.
01:11:17:18 - 01:11:45:12
MO GAWDAT
Like the early Google. At the early Google, we had a strategy for years that basically said: no marketing. Why market it if it's working so well? Right. And I think that's the trick. The trick is that now many capitalists all over the internet, I call them snake oil salesmen, are simply looking at it and saying: oh, copy this, put it here, do this, do that, and then you'll make $100 an hour.
01:11:45:14 - 01:12:10:21
MO GAWDAT
Very seriously. Like, we're giving you superman, and all you're caring about is $100 an hour. Can you not be a little more intelligent, so that you make $99 or $199 an hour and make the world better as a result? Like, we've given you the ultimate superpower, and you appear to be intelligent enough only to use it to make $100. Can you please make a difference?
01:12:10:23 - 01:12:35:20
MO GAWDAT
Right. And once again, I say those things with perhaps a bit of frustration in my voice. But I'm also chill, because sooner or later we're not going to need any of the snake oil salespeople. The AI will do it without us. And you really have to understand: this is the ultimate, ultimate equalizer.
01:12:35:22 - 01:12:56:20
MO GAWDAT
Allow me to explain why. So I was on the early trials of Manus, and if you can now realize what we're about to see next year, it's just incredible. So today you can go to Manus and say: build me something that looks like Airbnb, and I need a marketing campaign for it.
01:12:56:22 - 01:13:23:20
MO GAWDAT
Put the ads out there. Here is your budget, sort of, right? Or maybe you have to do the budget bit yourself. But an AI agent will catch up. Next year, you could wake up on January 5th and say: I want to invest $1,000. Can you bring it back to me as $1,400 by the end of the year?
01:13:23:22 - 01:13:50:22
MO GAWDAT
Right. If I tell that to Trixie, she's going to respond and say: well, you're a five-times bestselling author. That means you have a following as an author. You've spoken several times about multiple topics, including empowering the feminine and relationships, which you haven't released books on. I can help you write a book about it for you to review, and then self-publish it on Amazon, and advertise it on social media.
01:13:50:22 - 01:14:20:25
MO GAWDAT
Do this and do that. I'll do the whole thing for $1,000, right? And hopefully the sales would bring back $1,400. Now that's the ultimate equalizer, meaning everyone would have access to this by 2027, right? That's one side. The other side, which I think most people don't understand, is this: we talk a lot about UBI, universal basic income.
01:14:20:27 - 01:14:42:06
MO GAWDAT
And the idea that most developers will lose their jobs in the next three years. Most graphic artists have lost their jobs already. Most scriptwriters are on the way, and so on and so forth. Right. When you think of it this way, it looks extremely grim.
01:14:42:08 - 01:15:12:12
MO GAWDAT
And it is, when you think about it. But remember that in the economies of the world, the US economy for example is 62% consumption. It's not production, right? 62% consumption means that if consumers no longer have the purchasing power to buy, the economy collapses. And if the consumers don't have the purchasing power to buy, there's nothing for the AI to make.
01:15:12:14 - 01:15:34:23
MO GAWDAT
And that imbalance in the equation is not being discussed. Sadly, the fact that it's not being discussed means that, you know, we had so many years to prepare for it, but we haven't done anything about it. And so we're going to have to go into a Covid-like era where people will be asked to stay home and get furlough, or a benefit of some sort.
01:15:34:23 - 01:16:01:11
MO GAWDAT
That's until we figure it out, in the countries where all of this applies, by the way, because there will be countries around the world that haven't even thought about that. But then the idea is that, once again, when we figure out a UBI system that allows people to have the purchasing power to buy what we're making, very few people will be the capitalists who live on Elysium, on the other planet that we would not hear about.
01:16:01:18 - 01:16:30:12
MO GAWDAT
Right. But you and I and everyone you know will be equal. Why? Because I might be wealthier than you today, because I have worked at Google and I write books and I go and do speaking gigs and whatever. I don't know, you might be wealthier than I am because of this podcast. But when both of us are out of a job, we're all equal, other than the top capitalists, which will be the 0.01%, right?
01:16:30:18 - 01:16:57:19
MO GAWDAT
Everyone else is equal, right? And by the way, everyone else will get a life. Theoretically, if the cost of everything is zero, or tends to zero, because of the productivity gains of AI, everyone will get a life that's not much different from the life that the top capitalist today gets, right? I mean, think about it: your life today, whoever you are listening to this, is better than the Queen of England's
01:16:57:19 - 01:17:30:22
MO GAWDAT
120 years ago, right? So there is an ultimate equalizer that's about to hit us. And in an interesting way, it starts with a lot of pain, but it's not a bad thing in the long term, if we figure it out. Of course, sadly, again, the evil that men do: on the path to figuring it out, we are going to be asked to exchange that livelihood for compliance, or obedience, or oppression of some sort, and so on.
01:17:30:29 - 01:17:52:03
MO GAWDAT
And so you can see how that cycle is going to evolve. But sooner or later, humanity is going to end up in a place where you don't have to work. And you asked me: who are the winners? I told you, in the short term, the winners are those who grasp the truth, who know the tools of AI, and who know human connection.
01:17:52:03 - 01:18:05:20
MO GAWDAT
In the long term, the true winners are the ones who are going to have a purpose other than work, who are going to be able to find joy in life when they're not toiling away 18-hour days.
01:18:05:22 - 01:18:26:17
GEOFF NIELSON
Right. I want to come back to that purpose piece in a second, because I think that's really interesting, and there's a lot we can talk about there in terms of people having more purposeful, more fulfilling lives. But just before I do, I want to talk a little bit more about that short and medium term, and what individuals can do with AI.
01:18:26:17 - 01:18:46:14
GEOFF NIELSON
And you talked about the example of the democratization of the tools: anyone will soon be able to use tools that can, you know, maybe turn $1,000 into $1,400, or similar. And I wanted to ask you: I've got this idea I've been playing with, and I wanted to bounce it off of you and see what you make of it.
01:18:46:17 - 01:19:10:08
GEOFF NIELSON
I've been thinking a lot about the idea of these kind of one-person, AI-augmented businesses, right? That you don't necessarily need an enterprise of 30,000 people anymore to build something new and deliver it. There are all these pockets where AI can help you write your book, distribute your book, all that good stuff.
01:19:10:11 - 01:19:42:25
GEOFF NIELSON
I'm curious what you think of that, but the idea I've been playing with is this: we look at this modern economy of these mega organizations, these mega enterprises of tens of thousands of people, and to me it's really easy to forget that that hasn't been the story for almost all of human history. For most of human history, it's been enterprises of one, or of a family, and everybody has their own shop or their own farm.
01:19:42:27 - 01:20:16:20
GEOFF NIELSON
And then at some point, with the Industrial Revolution and what's been tacked on to that, we've ended up with these mega enterprises. But is there a world with AI, and with some of these technologies, where it actually looks a lot more like the past? Where, when we talk about order and we talk about efficiency, the most efficient way to do something isn't with a massive organization, and the shift of the economy tends to be a lot more toward these kind of micro, individual and family-led organizations?
01:20:16:26 - 01:20:26:20
GEOFF NIELSON
Is that a realistic potential future to you, or am I making some sort of logical error there?
01:20:26:22 - 01:21:04:14
MO GAWDAT
You're sort of spot on, I think. We have to, once again, prequalify all of this by saying it's a singularity; nobody knows, right? And when it's a singularity, my view is that you're going to get a bit of each. So allow me to explain this. Hugo de Garis, if I remember correctly, wrote a book called The Artilect War, where basically he describes a future where there will be a subset of humanity that is very pro-AI and a subset of humanity that is just disconnected, like: we are not interested in this, we want to go back to nature, or we
01:21:04:14 - 01:21:31:00
MO GAWDAT
want to oppose the AI. Right. And, you know, you have to imagine that there will be both worlds. It's not going to be one or the other. There will be a world where a capitalist will say: you know what, I'm going to bring manufacturing back to the US by buying a million robots, building the biggest company in America and making things that are so cheap for everyone.
01:21:31:03 - 01:22:01:16
MO GAWDAT
Right. Of course, remember, that person would have to lobby the government to keep people buying, because otherwise there's no point investing in the million robots. But there will be others who would say: look, the government is giving me UBI, $1,000 a month. I don't want to buy from this guy. I'll go to my neighbor and buy four eggs from my neighbor's backyard.
01:22:01:18 - 01:22:39:18
MO GAWDAT
Right. That are cheaper and easier. And, you know, my thousand dollars can go further, right? You may even see communities that would say, I don't even want your UBI. I'm just going to go back to nature. But a very interesting nature. So, so understand that, you know, I always say with 400 IQ points, and if I want to dedicate for AI 400 IQ points that I can borrow from the machines, if you give me 400 IQ points more, I probably call on a couple of my friends and we would push the idea of, manufacturing using nano physics all the way.
01:22:39:20 - 01:23:00:04
MO GAWDAT
Right? So instead of manufacturing something from its smaller parts, like, you know, an iPhone is a bit of electronics and a screen and so on and so forth, you can manufacture things by reorganizing the molecules in the air. Right. And you can imagine that world; we're really not that far off.
01:23:00:07 - 01:23:28:28
MO GAWDAT
We're not smart enough to figure it out yet. But with more intelligence, say a thousand IQ points more, it's possible. We know that it's possible, right? And so that off-the-grid environment, if you want, could simply be back to nature, or it could be an environment where you walk to one tree and pick an apple, walk to another tree and pick a T-shirt, and a third tree and pick an iPhone.
01:23:29:01 - 01:23:57:04
MO GAWDAT
Right. And it is possible: if the cost of manufacturing is air molecules and some energy, it's possible. So none of this is clear, but it's all possibilities. The only obstacle on the way to getting there is those in power, who will want to protect their power and wealth.
01:23:57:07 - 01:24:21:21
MO GAWDAT
So, you know, one of the things that I normally talk about is the idea of UBI, sorry, BCI, brain-computer interface, right? Because in my mind, if you really want to be dystopian, the first few people who gain massive intelligence through brain-computer interface, by definition, are going to deny the rest of the world that advancement.
01:24:21:21 - 01:24:52:08
MO GAWDAT
When I tell that story to a Western person who grew up with what are normally referred to as problems of privilege, they don't believe me. But you know what? The digital divide, the way Africa lived for so many years until, believe it or not, China intervened and started to send technology to Africa, right? It was happening at a macro scale: those who advance attempt to prevent those who can compete with them from that advancement.
01:24:52:10 - 01:25:12:18
MO GAWDAT
Right. And so you have to start questioning whether all of this technology is going to be distributed to everyone, and if it isn't, how will those who don't get the technology respond, right? Now, finally, there is another very unusual setup that I believe is probably going to exist, a bit like Ready Player One, if you want, right?
01:25:12:24 - 01:25:39:26
MO GAWDAT
Where basically, if the government is going to give people UBI, surely they're cheaper if they live in the virtual world, not the physical world. Right. And by the way, the virtual world might actually be really interesting, because one of my dear friends, Peter Diamandis, is very pro longevity technologies.
01:25:39:28 - 01:26:01:16
MO GAWDAT
And we always have that funny debate where he's all about: let's fix your DNA, let's make sure that your cells repair properly, and so on. And I'm like: Peter, if you really want to prolong my life, give me more time. And the easiest way to give me more time is to get me to sleep with a virtual reality headset and give me a lifetime in a day.
01:26:01:18 - 01:26:23:14
MO GAWDAT
Wake me up, feed me, put me back in. You know, reincarnation, if you want. Right. And it is doable. I can live one life with, you know, an attractive actress, another life on Mars, and a third life fighting like a Viking.
01:26:23:14 - 01:26:52:20
MO GAWDAT
And it's easy, okay? So this is another very interesting scenario, where life might become really enriching, but not physical anymore. And all of these, as I say, are singularities, so any of them could happen. Some of them may have already happened; we may already be in that simulation of the virtual world. Or maybe some won't make it, but several will make it.
01:26:52:23 - 01:27:23:11
GEOFF NIELSON
So let's come back, then, to that question of purpose, and maybe the question of what we want and what's right for us. Because you're talking about simulations, VR, living in these other worlds, and even this longer-term picture you're painting of abundance and having unlimited possibilities, or at least unlimited relative to the amount of possibilities we have right now.
01:27:23:13 - 01:27:38:01
GEOFF NIELSON
What do we want? What is right for us? How should we be framing that question? And can the answer to how we frame it help us live better in the world we're in today?
01:27:38:03 - 01:28:00:12
MO GAWDAT
Isn't this the most important question, really? Honestly. I mean, part of the reason we are where we are is we are just building amazing things, not knowing if we want them. You know, I always say that the world will look back at Sam Altman, not the person, but the character type that's called Sam Altman.
01:28:00:15 - 01:28:31:19
MO GAWDAT
You know, the rebellious California startup founder, right? The disruptor, the believer. As the reason why we're in this shit. Because, suddenly, you know, I never elected Sam Altman or assigned the responsibility of making choices about my life to Mr. Altman. But he makes choices that affect everyone, right? You know why? Because we don't know what we want.
01:28:31:22 - 01:29:10:05
MO GAWDAT
If we knew what we wanted and he made a choice that's not what we wanted, we would simply ignore him, right? But we don't know what we want. And I get that question a lot. You know, half of my work is artificial intelligence and technology, and half of my work is happiness and stress and other topics, which is quite interesting; both are part of my mission, which I call One Billion Happy. And on the happiness side, when you really attempt to understand what's wrong with humanity, what's wrong with humanity is that we're cheerleaders, we're gullible.
01:29:10:08 - 01:29:39:21
MO GAWDAT
They tell us we should want things, and so we want them. And it's quite interesting, because if you really want to understand your life's purpose: post the 1950s, your life's purpose was to work, right? Your life's purpose when the species started, in the caveman and cavewoman years, was to what?
01:29:39:24 - 01:29:40:24
MO GAWDAT
To live.
01:29:40:27 - 01:29:42:02
GEOFF NIELSON
Survive? Yeah.
01:29:42:06 - 01:30:08:08
MO GAWDAT
To live. So to them, living meant survival. Okay. But by the way, as soon as they felt safe, they sat around the campfire and chatted and made love, and everything was fun. Right. And it's quite interesting, because what AI promises is to take you back to that life, where you can take your loved one, sit on a lake, and do absolutely fuck all, sorry, sit and do absolutely nothing.
01:30:08:08 - 01:30:36:26
MO GAWDAT
And simply chat and ponder and love and connect and play music, and not have to suffer the promise that was implanted in your head as your purpose by capitalism: wake up every morning, sit in the commute, go work really hard. If you work your ass off, you're going to make a few dollars more.
01:30:36:28 - 01:31:00:28
MO GAWDAT
Then you're going to need to buy better suits to go and make those few dollars more, so you're going to have to work even harder. Right. And it's quite interesting that this abundant future promises for all of us to just go back to living, even, more interestingly, in a safer, more famine-proof environment.
01:31:01:00 - 01:31:25:10
MO GAWDAT
And yet we struggle with that. We struggle with that not because it's not a good life; we struggle with that because we don't know how to do it. And I'm the first to blame. For years now, I constantly said to myself: I've worked hard enough, I've contributed enough, I've made enough. Maybe I should just find myself a farm somewhere and go live on a farm.
01:31:25:12 - 01:31:52:29
MO GAWDAT
Right. Take my loved ones, if they want to come visit, whatever. I love that. But every time I do that, I go: where's the nearest supermarket? Because I don't know anything else. I have to go to the tofu aisle if I want to make a stir fry. And that's actually quite interesting: I've been spoiled by the choice of an easy life.
01:31:53:01 - 01:32:14:02
MO GAWDAT
Right. And it's not easy, by the way, going to the supermarket. So I was spoiled by the choice, by the promise, of an easy life that's not easy. And really interestingly, maybe one day I'll be forced to go back to a farm, and maybe on that farm I'll eat different things and live different ways, right?
01:32:14:05 - 01:32:42:16
MO GAWDAT
But then, will I be able to love it? And I think that's the challenge that humanity faces, the challenge that everyone needs to sit down and reflect on now: which of those future groups would I want to be in? Would I want to be in the virtual reality world? Would I want to be the snake oil salesman? Would I want to be one of the very few employees in the control center of one of the major players?
01:32:42:18 - 01:33:01:08
MO GAWDAT
Or would I want to be in nature, or would I want to be in a big city, living with UBI and partying day and night? Right. Which one do you want to be? If you ask me: I'd go back to nature and live a very simple life. You know, some people would say: oh, by the way, we're going to give you 100 years of life more.
01:33:01:11 - 01:33:29:28
MO GAWDAT
I'll say: thank you, I'm very happy with my biological life. Honestly, the only reason why you would want to live a hundred years more is if the past 50 were not enough, right? I think we've overdone it as humanity. I think we've pushed it to the point where we're constantly sold things that we never asked for.
01:33:30:00 - 01:33:51:18
MO GAWDAT
And I think, and you may have heard me mention or hint at that a few times, that the final outcome of that, unfortunately, is a lot of evil: a perpetual war, a lot of civilians killed, an economic crash every now and then that takes your wealth and your grandma's retirement fund away.
01:33:51:18 - 01:34:06:16
MO GAWDAT
And it's just, I don't know if this is the life I want. And I don't know if we should approve of that life just to get a better, faster call center agent. Right.
01:34:06:18 - 01:34:07:01
GEOFF NIELSON
And one.
01:34:07:01 - 01:34:09:03
MO GAWDAT
Of the.
01:34:09:06 - 01:34:28:00
GEOFF NIELSON
There's a piece in there I want to add, which is, coming back to that question of what do we want, or what should we want: there's a component in there, I believe, of human nature that is the catalyst for all of this. Which is, when you can't answer that question by yourself, of: what do I want?
01:34:28:02 - 01:34:58:07
GEOFF NIELSON
I think we're very quick to flip the question and ask ourselves: well, what does everybody else want? And isn't that what I should want? Right. And that becomes very easy to manipulate, and creates a lot of opportunity for snake oil, for nefarious parties to influence what we want. So can we get past that? Or do we have to recognize that?
01:34:58:07 - 01:35:04:16
GEOFF NIELSON
And is that the way we break free? I mean, do you believe that? And if you do believe it, what do we do with that information?
01:35:04:18 - 01:35:25:18
MO GAWDAT
I think there are interesting habits that one can develop. Right. So all of us go through stages in life. There is the stage of accumulation, if you want: more wealth, more things, more cars. And I've developed a very simple habit, for example: I want to take ten things out of my home every Saturday.
01:35:25:20 - 01:35:54:10
MO GAWDAT
Right. And you'll be amazed how many Saturdays I succeed. It's incredible, really. I've done that for years, and there's still all that shit that I don't even remember when I bought it. Okay. And of course, because of my very stressful lifestyle, I'd be traveling somewhere, about to board a flight, and I'm going to be home tomorrow.
01:35:54:10 - 01:36:12:29
MO GAWDAT
So I go on one of the e-commerce sites. Here in the UAE we use something called Noon; we don't like Amazon anymore. And basically I buy three things and send them over to my home. And when they arrive, I ask myself: what were those? What did I order?
01:36:13:04 - 01:36:33:03
MO GAWDAT
And why did I order it? And so, I mean, those problems of privilege are going to go away for many of us, to begin with. So maybe you should be prepared. And, you know, this is supposed to be a conversation about the future and AI.
01:36:33:06 - 01:36:57:26
MO GAWDAT
But believe it or not, a big chunk of it is about humanity. And a big chunk of that conversation about humanity is: are you able, as a human, to actually look at your life and find out what in it brings you joy? Keep that. And what in it is draining you, bleeding you? Get rid of that.
01:36:57:28 - 01:37:35:12
MO GAWDAT
Right. And that includes, by the way, not just things, but relationships, work, investments, virtual engagements. Like, ask yourself at the end of every manic swiping session on social media if you feel any better. And, you know, just the simple act of awareness, and awareness is not an act, but the simple ability to become aware.
01:37:35:14 - 01:38:08:22
MO GAWDAT
It changes everything. Because suddenly you realize: this is not really enriching my life; maybe I shouldn't have that much of it anymore. Whether that's sugar, by the way, which is sold to us constantly by consumerism. Or, you know, as the incredible Yanis Varoufakis writes about in Technofeudalism, the idea that we all become slaves to some tech companies, right?
01:38:08:25 - 01:38:28:01
MO GAWDAT
Who are the new digital landlords of the world, right? Or whether it's a weird plastic apparatus that you bought from an e-commerce site somewhere, that's sitting in your home, taking space, and has never been used.
01:38:28:04 - 01:38:49:12
GEOFF NIELSON
Let's maybe take this in a direction that's of practical use to people who are working right now and are trying to figure out how they can be happier or how they can reduce their stress. Because I think, you know, there's a conversation that can say: oh, well, your stressor is your job, so you just have to quit your job if you're stressed.
01:38:49:12 - 01:39:21:14
GEOFF NIELSON
Right. And that's a more extreme path. You've written and talked extensively about stress. For people who are feeling stressed, maybe that's because of their work, maybe that's because of their relationship, maybe that's because of their investments, probably it's because of all of the above: what habits can we practice, or at least think about, that help us feel better and feel happier every day?
01:39:21:16 - 01:39:30:19
GEOFF NIELSON
Short of, you know, quitting your job, leaving your wife, going off the grid.
01:39:30:21 - 01:39:59:23
MO GAWDAT
There are millions of options short of that. So let's talk about the big picture. The first one is an awareness that this is not your natural state. Okay? Stress is a biological response that was made to escape a tiger, really. It's a mixture of a hormone cocktail that is supposed to reconfigure you to superhuman, and it's not supposed to be triggered by an image.
01:39:59:25 - 01:40:32:06
MO GAWDAT
Right? It's not supposed to be triggered by a comment on social media. Okay. And because of the nature of what stress is, it is supposed to be short-lived. If it lingers, if you remain in that superhuman configuration too long, you're depriving your liver, your vital organs, your digestive system and so on of the energy they need to survive. Yet some people have been stressed for years.
01:40:32:09 - 01:41:03:07
MO GAWDAT
Right. There is always going to be that businessman on the cover of Fortune magazine with a striped suit, always, always, always angry, right? And he would say, you know, people perform best when they're stressed. No, they don't. People perform best when they are creative, when they are working with amazing teams, when they are in flow, when they are in love, when they are happy. And it depends on what performance is.
01:41:03:07 - 01:41:25:16
MO GAWDAT
If you want to squeeze 2% more from a worker on a manufacturing line, maybe. But if you want creativity or innovation, good luck. So the promise that we perform better under stress is a lie, and awareness of that is important. That said, some stress is useful. Say you have a presentation next week and you want to double down on it.
01:41:25:24 - 01:41:59:07
MO GAWDAT
Stress is good for you, right? But it's not sustainable if you do that all the time. So for my work on stress, I worked with Alice Law, who is an incredible British author, very feminine in her approach, where I'm very logical in my approach. So I look at stress as an equation, basically. You learn from physics that objects are stressed not just by the forces applied to them, but by the cross-sectional area over which they carry that force.
01:41:59:14 - 01:42:33:10
MO GAWDAT
Right. So the cross-section of the object is a factor. Then, very analogously, stress in humans is the challenges that are stressing you, divided by the skills and resources and abilities and contacts and so on that you have to deal with them. Now, if you see it that way, suddenly it becomes very clear that you either reduce the forces applied to you, or you increase the abilities and skills. And it really doesn't take an equation to understand that. Things that stressed me when I was 20,
01:42:33:10 - 01:42:57:23
MO GAWDAT
I freaked out about. In my 30s, I handled them. In my 40s, I handled them with ease. And in my 50s, I laugh about them, right? It's not because they're easier, okay, but because I developed more cross-section, cross area if you want. So when you think about it, you want to invest in your skills if you want to deal with stress.
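Mo's physics analogy can be written out as a worked equation. This is a figurative mapping, not a measured formula; the denominator terms are simply the resource categories he lists, not defined quantities:

```latex
% Mechanical stress: force per unit cross-sectional area
\sigma = \frac{F}{A}

% The figurative human analogue Mo describes: perceived stress grows
% with the challenges applied, and shrinks as skills, resources, and
% support (the "cross-section") grow
\mathrm{Stress} = \frac{\mathrm{Challenges}}{\mathrm{Skills} + \mathrm{Resources} + \mathrm{Support}}
```

Either lever works: shrink the numerator by removing stressors, or grow the denominator, the "cross-section" he describes building decade by decade.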
01:42:57:26 - 01:43:26:09
MO GAWDAT
And I think the most important skill, number one at the top, is to limit your stressors. Right. And most of the stressors that break us are not the big ones. Trauma is the macro external stressor, the one that comes from outside. And, you know, 91% of us will experience one PTSD-level traumatic event once in a lifetime, right?
01:43:26:16 - 01:43:55:08
MO GAWDAT
Losing a loved one, or being in an accident, and so on. 93% will recover in three months; 96.7% will recover in six months. So trauma is a temporary break, if you want. The ones that last are different, and the ones that last are burnout, right? Or what I normally call anticipation of a threat. Burnout is the sum of all of the little stressors that you have, multiplied by their intensity, by their frequency, by the time of their application.
01:43:55:10 - 01:44:13:02
MO GAWDAT
And basically we have so many of those, and then eventually you add one more on top and you burn out. Right. And most people will say, I need to remove the big stressors in my life so that I don't burn out. No, actually you need to remove every stressor you can remove. It's not just the big ones.
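The burnout description can be sketched the same way, again as a figurative sum rather than a clinical formula: each small stressor contributes its intensity times its frequency times how long it is applied.

```latex
% Burnout as the accumulation of many small stressors (figurative):
% I_i = intensity, f_i = frequency, t_i = duration of stressor i
\mathrm{Burnout} \propto \sum_{i} I_i \, f_i \, t_i
```

Which is why removing many small terms from the sum, not just the big ones, is the lever Mo recommends.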
01:44:13:04 - 01:44:32:26
MO GAWDAT
So, you know, from your very loud alarm in the morning, that's the first jolt of stress, right, to choosing to go on your commute in the rush hour, and so on. And the way to handle them is: next Saturday, you sit down with a piece of paper and write down everything that stressed you in the last week.
01:44:33:01 - 01:44:50:25
MO GAWDAT
Right. And you do that frequently, by the way, not just the one Saturday. And then you scratch out the ones that you can remove: that annoying friend who is constantly negative. You can literally have a conversation with them and say, look, this is really stressing me, can you please be nicer? Right? Or maybe you shouldn't be friends, or whatever.
01:44:51:01 - 01:45:10:14
MO GAWDAT
So anything that you can remove, remove. Anything whose intensity you can reduce, reduce the intensity of. And anything that you cannot remove or reduce the intensity of, sweeten: make it lighter. So if you really have to do the commute at a certain time, take some music with you, maybe a nice coffee, and so on.
01:45:10:17 - 01:45:33:22
MO GAWDAT
Right. So this is one. By the way, on limiting stressors, I should say that stressors are mostly internal, not external. In the book we call them TONN. The T is trauma; we spoke about that. The O is obsessions: big, big events that stress us very deeply, but they come from within us.
01:45:33:23 - 01:45:57:20
MO GAWDAT
I'm a failure, I'm a failure, I'm a failure, nobody will ever love me, or whatever. Then N is noise: small ones, niggles if you want. Right. And the last N is nuisances: little stressors, sub-trauma. If you look at it, the obsessions and the noise are coming from within you, and they're the majority of the stress, right?
01:45:57:22 - 01:46:22:29
MO GAWDAT
From that category, the obsessions and the noise, we get what I normally call the anticipation of a threat. Stress is supposed to work like this: you're supposed to get cortisol when the tiger shows up. Okay. In the modern world, we get cortisol before the tiger shows up, right? We're stressed before the tiger shows up, because we mix up four emotions.
01:46:23:01 - 01:46:44:07
MO GAWDAT
There is fear and all of its derivatives. So there is fear, there is worry, there is anxiety, and there is panic. Right. And if you're online today, panic attacks and anxiety attacks are more common than anything else. And the reason is that we deal with all of those things as if they were fear.
01:46:44:09 - 01:47:06:23
MO GAWDAT
Right. So let me try to explain this quickly and then shut up. Fear is: a moment in the future is less safe than now, right? There is a threat in the future, and so the natural reaction to fear is that you address the threat. Worry is not that. Worry is: I can't make up my mind if there is a threat or not.
01:47:06:25 - 01:47:29:18
MO GAWDAT
Should I chill or should I freak out? Right? And accordingly, you keep flip-flopping, and that constant indecision is what stresses you. So when you feel worried, turn it into either fear or a sense of safety. Tell yourself: I am going to make up my mind. Am I going to lose my job,
01:47:29:18 - 01:47:50:00
MO GAWDAT
so I now need to go look for another job and go down that path? Or am I actually going to keep my job, so I need to double down and get the next promotion? Right. So this is worry. Panic is not a question of the threat. It's a question of how soon the threat is.
01:47:50:05 - 01:48:12:16
MO GAWDAT
It's a question of time. We panic when the threat is imminent. Right. If you have a presentation in a month's time, you don't panic about it. But when it's tomorrow and you're not ready, you start to panic, right? And so when you panic, don't treat the threat, because if you're out of time, treating the threat makes you panic more, right?
01:48:12:23 - 01:48:33:27
MO GAWDAT
When you feel panic, treat time. Try to give yourself more time. Call the person and say, can we make it 3 p.m. instead of 1 p.m.? Can we make it next week? Find a friend who can give you more time by doing some of the tasks. Empty your agenda: drop the things that you don't need to do tomorrow, so that you can prepare, and so on.
01:48:33:29 - 01:48:56:08
MO GAWDAT
Right. And then finally, anxiety, the top of all pandemics of our world today, is not about the threat either. Anxiety is about my capability of dealing with the threat. So if I feel that there is something threatening in the future, and I feel that I'm not prepared to handle it, I feel anxious. Right.
01:48:56:14 - 01:49:16:25
MO GAWDAT
And so if you treat it like fear and attempt to deal with the threat, you discover your inability, so it reinforces your anxiety, and that cycle continues. Right? When you feel anxious, work on your skills; don't work on the threat. Okay? Find someone to teach you that bit that you don't understand. Learn it on YouTube.
01:49:16:25 - 01:49:40:02
MO GAWDAT
Find someone you can partner with who can take the bits that you don't know, and so on and so forth. So what am I trying to say? I'm trying to say that even though we're surrounded by stressors, and life is never going to stop stressing you, the truth, which is quite interesting, is that it's a choice. It's a choice for you to limit some of those stressors, and it's a choice for you how you deal with those stressors, by developing skills.
01:49:40:04 - 01:49:50:11
MO GAWDAT
Right. And the more you invest in those things, knowing that stress is not your natural state, the easier the task becomes, because you develop those skills.
01:49:50:14 - 01:50:15:01
GEOFF NIELSON
I wanted to talk about one specific scenario that I think is probably fairly common with people these days. Maybe you've experienced it somewhere along the way at Google, and I think you can probably see it whether you're a junior employee or even a leader, and certainly even more since the pandemic. It's this anxiety, and blow this up if you don't like it,
01:50:15:07 - 01:50:56:07
GEOFF NIELSON
but this anxiety that we feel is that we've ended up in this world where either our boss or our organization is the tiger. If we're knowledge workers, our workloads are based on everybody pushing us harder, all these tasks coming down the pipeline in a way that's unpredictable. It makes you feel like you're always in the cage with the tiger, because there's anticipatory anxiety, because you're in these organizations that are disorganized enough that you can't predict what your day or your week is going to look like.
01:50:56:09 - 01:51:10:19
GEOFF NIELSON
And that triggers this cycle of stress. What tactics or approaches would you recommend people take if they find themselves in a situation like that?
01:51:10:21 - 01:51:37:11
MO GAWDAT
It depends. So, by the way, that's true: sometimes the boss is the tiger, for sure. Sometimes an email is the tiger. But does it have to be that way? It depends on where you are in the organization. In my junior years, I would never start working any day until I had a to-do list next to me on my desk, right?
01:51:37:18 - 01:52:01:03
MO GAWDAT
With times allocated to it. Right. It clearly showed that I wasn't a lazy person, that I was doing the absolute best I could to do as many tasks as I could. I prioritized them. They normally were only a subset of all of the tasks available to me, and then someone would pop up and say, Mo, seriously, I need you to do that review.
01:52:01:03 - 01:52:19:00
MO GAWDAT
It's really important, the customer is waiting, I brought that up, whatever, and so on. Right. And my response, in a very calm way, would be: oh, I would love to do it, but we need to remove one of those. Okay? If you want to remove this one, talk to that person. If you want to remove this one, talk to my boss.
01:52:19:00 - 01:52:51:11
MO GAWDAT
If you want to remove this one, you know, and so on. And it's not that I'm lazy; it's just that those people expect those things from me. So would you kindly go do that, so that I can prioritize my work? I'm here to help. Right. So if you're a junior in the organization, when you're basically a task clerk, being on top of your tasks really helps. Midway in the organization,
01:52:51:11 - 01:53:20:14
MO GAWDAT
so if you're in management or junior leadership, not the top leader if you want, then you need to shift the focus of your boss from tasks to objectives. Right? I remember vividly one of my favorite bosses of all time, my first boss at Google, who was very harsh, right?
01:53:20:18 - 01:53:48:24
MO GAWDAT
Harsh in the sense that he wanted us to thrive, really. And, you know, I did things differently. I have some brain defects; some areas of my brain are missing. So there are tasks that I'm not good at, but there are tasks that I'm better at than others. And in one of those management meetings, one of my peers said, why doesn't region four do that?
01:53:48:24 - 01:54:13:12
MO GAWDAT
Why is Mo not doing this, like you're asking it of us? Okay. And my boss was about to pounce on me, and I responded quickly and said: because I'm growing 29% and you're growing 2%. Is that a good reason? Okay. And so we had this interesting conversation, and then I basically told my boss, look, please let me do things the way I want.
01:54:13:14 - 01:54:14:16
GEOFF NIELSON
Back off.
01:54:14:18 - 01:54:40:21
MO GAWDAT
I'm really doing well here. If you force me to do them differently, I'm going to fail, because it doesn't suit my skill set, right? The day I fail doing it my way, fire me, right, and get someone who can do it your way. So, funny enough, the next morning, or the next week or something, we are standing at the international sales conference, where we have basically 8,000 Googlers in the audience.
01:54:40:21 - 01:55:03:17
MO GAWDAT
And someone asks, why is region four not doing this this way? And my boss responds, and I quote, he goes: well, I have no idea how Mo does what he does, but when he stops doing it, I'm going to fire him. Okay. So my response in the audience is, I put my hand in the air and say, yeah, that's exactly what I want.
01:55:03:19 - 01:55:31:26
MO GAWDAT
I want the freedom to perform the way I perform, and that comes with the responsibility of delivering to the company what the company wants. Right. Now if you're the top guy: seriously, chill. Right? That's normally my approach. At Google X, for example, my business team would come in and talk about, you know, we have this pipeline of 16 opportunities.
01:55:31:26 - 01:55:51:06
MO GAWDAT
This is this, this is that. And then after opportunity number three, I go: that's it, I don't need to know more, these three are enough. Right. And they go: no, no, but the others are interested. And I'm like, look, if you focus on 16, you're not going to be able to serve them properly. I think you should go to the other 13 and tell them we'll work on those later.
01:55:51:09 - 01:56:12:20
MO GAWDAT
Right. Focus on the three, close them, and then let's talk again. Anyway, they wouldn't, but that was my style. Until I hosted a Fortune 500 CEO at Google X and had a wonderful conversation talking about things, and, running out of time, I said, you know what? You need to come back another time.
01:56:12:20 - 01:56:33:11
MO GAWDAT
I really want to show you this; it's very interesting. And he said, why another time? I have time. I was like, oh, that's an interesting CEO, you're not that busy. And he says, no, I work four hours a day. And I said, what? He said, I work four hours a day. And I said, how? And he said, look, any meeting that's less than an hour is too operational for me.
01:56:33:11 - 01:57:00:27
MO GAWDAT
So that's why I don't attend. Right. Any meeting that starts and, five minutes in, they're not well prepared, I leave, okay? And I take only four meetings a day, because more than that means there are way too many strategic problems in the company. Right? If a company is running well, more than four strategic decisions a day means you're changing too much, right?
01:57:01:03 - 01:57:28:13
MO GAWDAT
So basically, he said, in the remaining four hours I walk around the corridors and hug everyone. What a strategy, right? And once again, remember, the difference between leadership and management is that management is whipping everyone to try and squeeze 1% more. Leadership is hugging everyone. And so many people, as they go through the ranks, fail to recognize that.
01:57:28:13 - 01:57:52:10
MO GAWDAT
They fail to recognize: I really don't need to whip anyone anymore. I've hired senior VPs, some of the most intelligent people in the world, reporting to me, so I might as well let them be senior VPs. Right? And so again, it depends on which part of the organization you're in. It all starts with an acknowledgment that I'm not here to suffer, okay?
01:57:52:10 - 01:58:03:14
MO GAWDAT
I'm here to perform. And performance, unlike for the guy on the cover of Fortune magazine, doesn't necessarily come from stress.
01:58:03:16 - 01:58:39:18
GEOFF NIELSON
Yeah, thank you for that. That was a really excellent answer, and I love the way you broke that out. It really resonated with me, and I hope it resonated with a few people listening as well. I mean, the comments about chilling out: it certainly feels like we've extrapolated this idea too far, from line work, where I'm only as productive as the number of hours I put in the day, all the way up to the CEO of whatever organization.
01:58:39:21 - 01:58:55:28
GEOFF NIELSON
And being able to break free of that and say, no, actually, less is more. You know, there's a quote somewhere that strategy is choosing what not to do; I can't attribute it properly off the top of my head, but I love that, and I think it's such an important message for leaders.
01:58:56:00 - 01:59:23:16
MO GAWDAT
Yeah, it's so true. It is so true that 80% of what you do makes you advance only 5% more. And, you know, again, it's a bit like consumerism and capitalism, really. Do I really need that 5%? Like, if I work my backside off this year, the money that I can make might help me buy a fancy car.
01:59:23:19 - 01:59:58:06
MO GAWDAT
Should I trade one full year of my life for a fancy car? It doesn't sound very wise to me, honestly. And I truly and honestly believe that most people, when they look back at their life, realize that they've invested their heartbeats in the wrong things, right? In a very interesting way, remember: even today, I sit on many, many boards, and I advise many governments and leaders and so on.
01:59:58:08 - 02:00:33:22
MO GAWDAT
It's not because of my heartbeats, do you understand? I don't sell time. It's really interesting: most people who really figure it out understand that if you invest in something that you're good at, and become noticeably better at it than the average person, you can probably live a very comfortable life just sharing what you know about that thing. And that, by the way, applies to employment as well.
02:00:33:24 - 02:01:07:21
MO GAWDAT
We used to have distinguished engineers, okay? Distinguished engineers really didn't code much at all; most of the time they didn't even code, right? But they had that incredible skill where, by sharing half an hour with a junior engineer, that junior engineer becomes twice as productive, solves a problem that could have taken him six days. And really, you need to reflect on your life and say: am I still behaving like that freshman
02:01:07:21 - 02:01:20:00
MO GAWDAT
who just came out of college, right? Just putting more of it into my life every day and thinking that I'm becoming a senior leader.
02:01:20:02 - 02:01:50:27
GEOFF NIELSON
Yeah. Wow. Well, it sounds like there's so much room for reflecting on what you are really good at, for one, and what is actually going to have that impact and move the needle 100% versus 5%. And having, I'll call it the courage, I guess, to let go of all the other things, and getting rid of the mindset that more is more, that every incremental 1% is worth it.
02:01:50:29 - 02:02:17:08
MO GAWDAT
My wonderful ex-wife, at a point in time. Let's not mention names, but one of my peers was the funniest human being, the loveliest human being alive, right? Still one of my best friends today. And he was good at what he did, but he was a party animal. Like, he would take the boss out every other evening.
02:02:17:08 - 02:02:36:16
MO GAWDAT
They would go laugh their heads off. Right. And you can't help it, the boss loved him, right? He's very lovable. I love him, okay? So one day I went back to my wife and said, baby, I really think I should be more of a wine-and-dine kind of person. I'm a businessman; I'm supposed to take the boss and the clients out.
02:02:36:18 - 02:02:56:10
MO GAWDAT
So some evenings I'll be late for dinner, or I won't join for dinner. And she looked at me, you know, the way a good wife would, and she said: of course, maybe we should do it, we'll do whatever you think is right, but you're going to be mediocre at it at best. I said, what do you mean?
02:02:56:10 - 02:03:16:01
MO GAWDAT
And she said, this is really not you. You're a thinker and a philosopher, and what client wants to go out and talk about the ailing human condition as a result of capitalism? Nobody wants that. Your friend is good at it.
02:03:16:03 - 02:03:33:27
MO GAWDAT
Okay? You might as well just come home. I never really came home early at the time. Come home at 8 p.m., relax a little, sleep well, go out the next morning, and keep growing your business better than everyone else, right? It's a choice. Yeah.
02:03:33:29 - 02:03:52:20
GEOFF NIELSON
Yeah, no, I think that's very, very well said. There was one more thing we didn't talk about that I did want to ask you about today, especially now that we're this deep into the conversation. I like to pretend no one is listening at this point anymore, so we can talk about whatever.
02:03:52:21 - 02:04:12:10
GEOFF NIELSON
That's good. Yeah. You know, we were talking about snake oil salesmen and all the hype for a million and one different things that we absolutely have to have or learn about or buy. What's at the top of your bullshit list right now? What are the things you're hearing about, that people are talking about or hawking, where you're saying:
02:04:12:10 - 02:04:23:25
GEOFF NIELSON
you know what, this is bullshit. If you're investing in this, either financially or in terms of attention, you're wasting your time; it's not going to pan out the way people are saying.
02:04:23:28 - 02:04:32:29
MO GAWDAT
That's such an interesting question. I do not know the answer to that. I actually waste none of my time looking at bullshit. It's quite interesting.
02:04:33:01 - 02:04:35:06
GEOFF NIELSON
That's fantastic of you.
02:04:35:09 - 02:05:02:06
MO GAWDAT
Yeah, I was shocked by this question. I will tell you, though, even if it's not bullshit, we're probably going to get a dotcom-bubble-style thing, right? In the current world, where things are moving so fast, you're bound to make mistakes. If you're an investor, you're bound to invest in a company that has all of the promising elements to it.
02:05:02:09 - 02:05:30:16
MO GAWDAT
The right founders, a good idea, good technology, whatever. And then maybe someone else beats them to it. Or maybe, you know, we don't know; it is such a fast-paced world. And with someone like Trump at the helm, you have absolutely no idea what will happen tomorrow. So you should probably expect that 60% of your choices will be wrong, right?
02:05:30:18 - 02:06:02:21
MO GAWDAT
And even if they're right, he's going to do something stupid and they're going to fail anyway. Right. And so, when you really think about it, I wouldn't say it's a portfolio approach, but I would probably say: invest in industries, not companies. And if you're a startup founder yourself, or if you're a business yourself, invest in segments,
02:06:02:23 - 02:06:40:15
MO GAWDAT
not ideas. So basically, tell yourself: I'm going to be the absolute best at customer service, and then invest in every part of that segment. Or tell yourself: I'm going to be leading in efficiencies, and so on. And you can then add segments. But if you try multiple approaches to increasing your efficiency, multiple vendors, multiple ideas, some will fail and some will succeed. It's such a fast-paced market that you're bound to make some mistakes.
02:06:40:15 - 02:07:13:06
MO GAWDAT
And I think making mistakes is actually much less harmful than not deciding at all. Right. So if you're going to be in call center improvements, find the top five players, split your call center into five little units, and try each of them. And believe it or not, as four of them fail and you find the one that works, you can scale it in no time at all and benefit everyone.
02:07:13:09 - 02:07:51:14
MO GAWDAT
Having said that, there is a lot of hype, and a lot of what actually matters is not really hyped, okay? It's quite interesting. I believe that, of course, reasoning and math for AI have absolutely been the breakthrough. I don't think AI is a bubble; it will be the core of everything that we do, and it's probably going to be an interesting part of our demise, because as we open up to agents, ACI if you wish, artificial criminal intelligence as I call it, will find so many entry doors.
02:07:51:17 - 02:08:19:05
MO GAWDAT
But the real breakthroughs have been reasoning and mathematics. I mean, I used to say that AGI, when it comes to linguistic intelligence, happened in 2024. Right. But I could still beat them in math. Good luck now; I'm nothing. And very few of my friends can beat them in math now, very few of my geeky friends.
02:08:19:05 - 02:08:43:05
MO GAWDAT
I was wiped out in 2023 in terms of coding. Right. Some of my friends are still better coders than they are, but they'll be wiped out in a year, for sure. And these, I think, are the true breakthroughs. These are the ones that will make a massive difference. So if we...
02:08:43:05 - 02:09:00:06
GEOFF NIELSON
Get, you know, deep reasoning, which you've said before we're probably less than a year away from. If we get to this next level of reasoning, of math, of understanding, what does that unlock? What doors are opened, or what are the implications of AI being able to do that?
02:09:00:12 - 02:09:28:12
MO GAWDAT
Both. It's always a singularity. You're going to get some people who will use deep reasoning to hack the stock market, and you're going to get people who will use deep reasoning to invent something amazing. It's not one or the other; both will happen at the same time. My hope is that humanity will respond to the hackers by saying, hey, let's work together.
02:09:28:15 - 02:10:02:20
MO GAWDAT
But, you know, there is no denying that there are incredible breakthroughs in terms of our understanding of things, because of the level of intelligence that we now have access to. It's refreshing. It's refreshing, really. And I say that with a very childlike happiness, because with age I sort of started to feel that I'm slowing down a little. Like, I still am a very reasonable mathematician, but it takes me longer, which is really weird.
02:10:02:20 - 02:10:24:27
MO GAWDAT
I hate it, okay? It takes me longer to do the math. Maybe I'm not using it as often, or maybe I'm just slowing down. And now suddenly you give me this new boost, where I just need to know how to state the problem and someone will do the math for me. It's just incredible, right? I just need to state the problem and someone will do the research for me.
02:10:24:27 - 02:10:57:15
MO GAWDAT
It's just so empowering. And when it comes to reasoning, just think about this: one of the top limitations of humanity was multidisciplinary reasoning. Meaning, there is a certain point at which, for me to be a meaningful physicist, I need to specialize so deeply that I have no space left in my head for chemistry or biology.
02:10:57:18 - 02:11:26:21
MO GAWDAT
Right. And that's true of me and every scientist I've ever worked with. It's becoming so complex that you have to specialize. And so your reasoning when you solve complex problems is limited to your own capability, and if you want to bring other specialists in, it's limited to the ridiculous bandwidth of information communication that humans have.
02:11:26:23 - 02:12:12:15
MO GAWDAT
Right. Imagine if I can reason across disciplines next year with that efficiency. Imagine if I can allow artificial intelligence to look at climate change not just as a recycling and manufacturing problem, but also as a physics problem that includes a bit of biology, a bit of, I don't know, astrology. Right. And basically, maybe we end up finding that if we took a certain bacteria from Earth and sent it to space in a certain way, at a certain speed, at a certain angle, and then brought it back and it fell on a palm tree, it would consume more of the CO2 in the world.
02:12:12:17 - 02:12:17:16
MO GAWDAT
I don't know, right? But the promise of that is just incredible.
02:12:17:18 - 02:12:44:09
GEOFF NIELSON
Yeah, yeah. I was thinking about that much earlier in our conversation, when you were talking about synthetic data. Because if you asked me, Geoff, what's the fastest way to start coming up with scientific breakthroughs? It would be: point AI at cross-disciplinary papers or pieces of literature or findings and say, take all the physics papers here, take all the biology papers here, and cross-reference them.
02:12:44:09 - 02:13:05:24
GEOFF NIELSON
Just all of them. Just do it and see what insights you come up with. And it doesn't have to be just two fields; you can do it with every field, and the amount you could unlock that no human could ever match so quickly is staggering. It's really easy, at least for me, to imagine a world where that completely transforms technology and science in a very short amount of time.
02:13:05:24 - 02:13:06:17
GEOFF NIELSON
Yeah.
02:13:06:19 - 02:13:10:07
MO GAWDAT
Totally.
02:13:10:10 - 02:13:33:26
GEOFF NIELSON
Yeah. I know we've had a very long, and at least for me extremely interesting, conversation. Thank you. I wanted to say a huge, huge thank you for making the time and for sharing your insights. There were so many things I wanted to talk with you about today, and I feel like we covered just a silly amount of ground.
02:13:33:26 - 02:13:51:02
GEOFF NIELSON
But everything, to me, still ties together as we think about what's coming next for us, what it means for people, what it means for the world. We went up to the level of the earth and the climate and nation states, and we went down to the level of us as individuals and purpose.
02:13:51:02 - 02:14:00:21
GEOFF NIELSON
So I really appreciate it. I learned a ton. I'm walking out of this room with a lot to think about, so thank you for sharing your insights.
02:14:00:21 - 02:14:23:06
MO GAWDAT
I really enjoyed it. I'm very grateful for the time, for the way you handled it, and for the questions you asked. Maybe I should just close by saying: please don't take any of what I said as true. Just take it as an interesting direction to consider. It's the best of my analysis, but it could absolutely be complete garbage.
02:14:23:06 - 02:14:43:16
MO GAWDAT
So, you know, nobody knows. It's very arrogant to predict the future, and that's true of anyone who claims to know. But yeah, I'm really grateful. And I think by this point it's just you and me left on the podcast, everyone else has gone. So if anyone's still here, tell us. I'm really grateful for the opportunity. Thank you.

