Our Guest Gary Rivlin Discusses
Pulitzer-Winning Journalist: This Is Why Big Tech Is Betting $300 Billion on AI
What role should government, regulation, and society play in the next chapter of Big Tech and AI?
Today on Digital Disruption, we're joined by Pulitzer Prize–winning investigative reporter Gary Rivlin.
Gary has been writing about technology since the mid-1990s and the rise of the internet. He is the author of AI Valley and 9 previous books, including Saving Main Street and Katrina: After the Flood. His work has appeared in the New York Times, Newsweek, Fortune, GQ, and Wired, among other publications. He is a two-time Gerald Loeb Award winner and former reporter for the New York Times. He lives in New York with his wife, theater director Daisy Walker, and two sons.
Gary sits down with Geoff to discuss the unchecked power of Big Tech and the evolving role of AI as a political force. From the myth of the benevolent tech founder to the real-world implications of surveillance, misinformation, and election interference, he discusses the dangers of unregulated tech influence on policy and the urgent need for greater transparency, ethical responsibility, and accountability in emerging technologies. This conversation highlights the role of venture capital in fueling today’s tech giants, what history tells us about the future of digital disruption, and whether regulation can truly govern AI and platform power.
00;00;00;27 - 00;00;01;17
Hey everyone!
00;00;01;17 - 00;00;03;21
I'm super excited to be sitting down
with Pulitzer
00;00;03;21 - 00;00;06;21
Prize winning investigative journalist
Gary Rivlin.
00;00;06;24 - 00;00;10;20
He's been reporting on the personalities
and power structures behind Silicon Valley
00;00;10;20 - 00;00;12;08
for nearly 30 years.
00;00;12;08 - 00;00;13;16
What's cool about Gary is
00;00;13;16 - 00;00;16;27
he has the inside scoop on the who
and what of the AI boom.
00;00;17;09 - 00;00;20;02
I want to ask him about the technology,
risks and promises
00;00;20;02 - 00;00;23;02
he sees on the horizon,
as well as the players driving it.
00;00;23;09 - 00;00;26;09
And if we can trust them. Let's find out.
00;00;29;27 - 00;00;32;21
The first thing I wanted to ask you,
I realize, is like a very loaded question
00;00;32;21 - 00;00;35;28
for a technology podcast, but maybe
I'll ask it anyway and see where it goes.
00;00;36;22 - 00;00;40;08
Gary,
I read that you have or used to have,
00;00;41;09 - 00;00;45;05
basically a plaque on your desk,
that your North Star is to state
00;00;45;05 - 00;00;49;17
complex social issues in human terms,
impossible to ignore.
00;00;50;02 - 00;00;51;15
Which I love, by the way.
00;00;51;15 - 00;00;56;01
But I wanted to ask you, you know,
as you kind of look from your vantage
00;00;56;01 - 00;01;01;05
point in 2025, you know what
are the complex issues of the day
00;01;01;10 - 00;01;05;00
or, you know, the things that you think
are worth crafting these narratives
00;01;05;00 - 00;01;07;13
around that,
that you want to be a part of,
00;01;07;13 - 00;01;10;13
or at least want somebody
you know, telling stories of.
00;01;10;16 - 00;01;14;03
And by the way, it's still right
there above my desk, a little bit faded.
00;01;14;05 - 00;01;14;22
I should probably,
00;01;16;03 - 00;01;17;08
redo it, but,
00;01;17;08 - 00;01;23;22
so, you know, my whole approach
is to tell stories through the people.
00;01;24;03 - 00;01;25;26
You know, we're talking about AI.
00;01;25;26 - 00;01;27;25
Like, you know, is it a good thing?
00;01;27;25 - 00;01;29;02
Is it a bad thing?
00;01;29;02 - 00;01;33;02
Why did it arrive when it did,
you know, that set of questions?
00;01;33;02 - 00;01;34;14
What's it going to mean?
00;01;34;14 - 00;01;37;14
Or the impact
on, you know, society, jobs, etc.?
00;01;37;21 - 00;01;40;06
You know, rather than,
00;01;40;06 - 00;01;44;02
writing a book exploring each one of those
issues chapter by chapter.
00;01;44;14 - 00;01;48;09
I really kind of try to stay true
to that motto.
00;01;48;09 - 00;01;49;27
And just like
00;01;49;27 - 00;01;52;24
I think it would be dull for people,
but if I can kind of introduce you
00;01;52;24 - 00;01;55;13
to a set of characters
who are almost emblematic, right?
00;01;55;13 - 00;01;58;19
I mean, here
is someone trying to cash in on AI.
00;01;58;20 - 00;02;03;06
Here's somebody who had the opposite view.
By telling their stories,
00;02;03;10 - 00;02;06;11
You know, I think people could understand
00;02;06;11 - 00;02;09;27
these complex social issues
in a very relatable human way.
00;02;11;09 - 00;02;12;02
Right.
00;02;12;02 - 00;02;14;02
So, so let's talk about that
a little bit.
00;02;14;02 - 00;02;18;16
And, you know, the
questions of AI and the story of AI
00;02;18;16 - 00;02;21;18
obviously is kind of intrinsically
linked to Silicon Valley today.
00;02;21;18 - 00;02;24;21
And you've been following
the Silicon Valley beat, if I can call it
00;02;24;21 - 00;02;27;21
that, for, you know, quite a long time.
00;02;27;23 - 00;02;31;01
As you went back there and looked
at some of these characters, you know,
00;02;31;01 - 00;02;35;10
what's changed in the past
20-some-odd years there?
00;02;35;10 - 00;02;38;02
And you know what's fundamentally
the same.
00;02;38;02 - 00;02;38;10
Right?
00;02;38;10 - 00;02;43;14
So, you know, I started writing about tech
in the mid 1990s.
00;02;43;14 - 00;02;45;27
In fact, I moved from writing
about politics to tech.
00;02;45;27 - 00;02;47;10
I have an engineering background.
00;02;47;10 - 00;02;53;01
I programmed earlier in my life, and just,
boom, the dot-com era, the internet, was with us.
00;02;53;01 - 00;02;56;28
And so,
you know, some things don't change, right?
00;02;56;28 - 00;03;00;18
You know,
just like when Netscape went public,
00;03;00;18 - 00;03;02;18
the internet was starting to spread.
00;03;02;18 - 00;03;07;01
You know, you saw a lot of excitement,
a lot of over enthusiasm
00;03;07;14 - 00;03;11;19
for it, you know, and so, you know,
kind of the hype, the overhype,
00;03;11;19 - 00;03;15;20
the promise, the over promises, you know,
the internet was going
00;03;15;20 - 00;03;17;07
to bring about world peace.
00;03;17;07 - 00;03;18;14
You know, essentially,
00;03;18;14 - 00;03;22;11
you know, now AI is going to equalize
between the developing world
00;03;22;11 - 00;03;24;25
and the developed world,
it's going to cure cancer.
00;03;24;25 - 00;03;26;04
Yes. Yeah.
00;03;26;04 - 00;03;28;22
I think AI is an incredible technology.
00;03;28;22 - 00;03;32;26
Obviously, the internet, you know,
has changed society in profound ways.
00;03;32;26 - 00;03;39;04
But, you know, some of the over-promise
almost feeds the other side's skepticism.
00;03;39;04 - 00;03;41;04
And like AI is not good.
00;03;41;04 - 00;03;44;14
It might help some scientists cure,
00;03;45;12 - 00;03;46;28
cancer, but, you know, AI is
00;03;46;28 - 00;03;50;28
not going to cure cancer,
at least not anytime soon.
00;03;52;06 - 00;03;55;04
You know, one big difference is the money.
00;03;55;04 - 00;03;57;14
So when I first started
writing about tech,
00;03;57;14 - 00;03;59;05
I was always interested
in the venture capitalists
00;03;59;05 - 00;04;02;11
and the startups and that whole ecosystem,
like this idea, like,
00;04;03;13 - 00;04;06;05
you know, our idea for a company
is either going to work,
00;04;06;05 - 00;04;09;18
be worth back then, tens of millions,
hundreds of millions
00;04;09;18 - 00;04;12;22
now, billions, if not trillions,
or it's going to be worth nothing.
00;04;12;26 - 00;04;15;07
And the venture capitalists
who are staking, you know, back
00;04;15;07 - 00;04;18;07
then, millions now, tens of millions,
hundreds of millions, billions.
00;04;18;19 - 00;04;21;04
But, you know, in 1995,
00;04;21;04 - 00;04;24;03
venture capital was under $10 billion
a year.
00;04;24;03 - 00;04;27;25
By 2021, it was over 300 billion a year.
00;04;27;25 - 00;04;32;13
You know, roughly $130 billion, $150
billion went into AI startups,
00;04;33;17 - 00;04;34;13
last year.
00;04;34;13 - 00;04;37;12
I mean, a lot of it went into a few,
you know, like,
00;04;37;13 - 00;04;40;25
Anthropic, OpenAI, xAI, that's Elon Musk.
00;04;41;02 - 00;04;44;01
You know, they raised
collectively tens of billions of dollars,
00;04;44;01 - 00;04;47;01
you know, almost $100
billion just between those three.
00;04;47;24 - 00;04;50;18
But there's still a lot more money
going to AI startups.
00;04;50;18 - 00;04;53;14
So the money has really changed.
00;04;53;14 - 00;04;57;08
I guess the final difference is,
you know, when the internet came out,
00;04;57;09 - 00;05;01;06
like maybe the biggest criticism
was around the attention span.
00;05;01;17 - 00;05;02;09
You know, oh,
00;05;02;09 - 00;05;06;02
there is, if you're always online,
you know, this instant gratification.
00;05;06;02 - 00;05;08;22
Well,
what was it going to do for consumerism?
00;05;08;22 - 00;05;11;22
in our society?
00;05;12;14 - 00;05;14;12
with AI,
00;05;14;12 - 00;05;18;17
there was much more of a worry,
much more of a backlash.
00;05;18;17 - 00;05;21;25
People didn't,
you know, greet AI with, excuse me,
00;05;21;25 - 00;05;26;21
with open arms the way they did
the internet. People are fearful of it.
00;05;26;22 - 00;05;27;28
You know, we could talk about that.
00;05;27;28 - 00;05;30;26
You know, I think it's kind of Hollywood-
induced fear.
00;05;30;26 - 00;05;34;21
I don't think the media has done
such a great job, with AI.
00;05;35;15 - 00;05;39;09
So, you know,
AI faces a double battle, like, there's
00;05;39;09 - 00;05;42;28
the usual battle of creating a startup
and trying to cash in.
00;05;42;28 - 00;05;45;26
But, you know, the second battle
of trying to convince people
00;05;45;26 - 00;05;49;01
that this is a good thing,
and the laser eyed robots aren't going to,
00;05;50;08 - 00;05;52;23
beat us into submission.
00;05;52;23 - 00;05;55;07
So. So let's talk about those laser
eyed robots for a minute.
00;05;55;07 - 00;05;58;14
I'm curious, Gary,
because, yeah, you've painted,
00;05;59;03 - 00;06;02;17
I think pretty accurately the picture
of how people talk about AI.
00;06;02;17 - 00;06;07;06
Everything from, you know, this, you know,
utopia of abundance, which, by the way,
00;06;07;06 - 00;06;10;22
I get people,
you know, who I talk to, I'd say routinely
00;06;10;22 - 00;06;13;28
who use words like that to describe
where AI is going to take us,
00;06;14;03 - 00;06;17;11
you know, curing cancer,
you know, and, you know, laser robots
00;06;17;11 - 00;06;20;15
who are going to destroy the world and,
you know, the blink of an eye, you know,
00;06;21;18 - 00;06;25;22
from, from your vantage point, like,
how transformational is this?
00;06;25;22 - 00;06;29;11
Like, what's your outlook for what
this technology can and cannot do
00;06;29;18 - 00;06;31;00
in the next few years?
00;06;31;00 - 00;06;33;21
So Reid Hoffman,
who's the main character of my book.
00;06;33;21 - 00;06;34;10
All right.
00;06;34;10 - 00;06;35;13
Here's a set of terms.
00;06;35;13 - 00;06;38;04
There's the Doomers, the laser-eyed robots.
00;06;38;04 - 00;06;39;08
We're fearful of this thing.
00;06;39;08 - 00;06;42;25
Let's bomb the data centers before they
take over to give the extreme of that,
00;06;43;26 - 00;06;48;07
and the Zoomers, who say nothing should slow this down.
00;06;48;07 - 00;06;51;05
There should be no speed bumps,
no regulation.
00;06;51;05 - 00;06;55;19
The only role of government,
is to speed this thing up, in part
00;06;55;19 - 00;06;57;15
because we're in a race
with China, in part
00;06;57;15 - 00;07;02;01
because tech is what's kept
the US, economy going and strong.
00;07;02;01 - 00;07;05;18
And, you know, it's one of our competitive
advantages as a country.
00;07;05;25 - 00;07;08;16
You know, I'm more of
what Reid Hoffman calls a bloomer.
00;07;08;16 - 00;07;10;29
You know, I, I definitely am an optimist.
00;07;10;29 - 00;07;15;02
I think AI is going to do
extraordinary things around science,
00;07;15;09 - 00;07;19;23
medicine, education,
a whole wide range of things.
00;07;19;23 - 00;07;23;16
But I do think we have to be deliberate,
about it, in part
00;07;23;20 - 00;07;26;26
because AI is
way ahead of the public.
00;07;26;27 - 00;07;29;28
You know, Pew did a study last year,
a poll last year,
00;07;30;05 - 00;07;34;06
and the majority of Americans
are fearful of AI,
00;07;34;12 - 00;07;38;22
less than a third
are optimistic or, excited about AI,
00;07;39;24 - 00;07;40;19
being here.
00;07;40;19 - 00;07;42;23
And, you
know, AI didn't have the best timing.
00;07;42;23 - 00;07;46;06
I mean, you know, innovation happens
when innovation happens, but, you know,
00;07;46;06 - 00;07;50;06
kind of hitting at the end of 2022,
the distrust for big tech,
00;07;50;06 - 00;07;54;11
the distrust for tech generally
is pretty much at an all time high.
00;07;54;19 - 00;07;57;27
And, you know, of course, with AI, like,
you know, ultimately humans
00;07;57;27 - 00;08;03;25
will lose their apex status as the, you
know, the smartest entity on Earth.
00;08;03;25 - 00;08;06;07
But, you know,
we're asking for a lot of trust. Like,
00;08;06;07 - 00;08;07;15
you know, the big thing this year,
00;08;07;15 - 00;08;09;00
of course,
I'm sure you've been talking about this,
00;08;09;00 - 00;08;11;21
is personal agents. A personal agent,
00;08;11;21 - 00;08;14;20
They really have to get to know me.
They have to know my private details.
00;08;14;20 - 00;08;17;19
They're going to have
a lot of agency in my life.
00;08;17;19 - 00;08;21;13
And if these companies, you know,
00;08;21;13 - 00;08;24;27
the Googles and Microsofts of the world
or the startups
00;08;25;15 - 00;08;28;22
that hope people will use their agents,
get too far ahead of people,
00;08;28;22 - 00;08;31;25
I, I'm not sure people
are going to trust these things.
00;08;31;25 - 00;08;34;16
And, you know, one last point on that is
00;08;34;16 - 00;08;38;12
there is going to be something bad
that happens because of AI.
00;08;38;13 - 00;08;42;09
just to make one up, like,
you know, $1 trillion is siphoned off
00;08;42;19 - 00;08;48;01
from the global economic system
before a human could even notice it.
00;08;48;01 - 00;08;51;07
And, you know, at that point, of course,
we all know the way these things play out,
00;08;51;13 - 00;08;52;28
you know, people are going to be fearful.
00;08;52;28 - 00;08;57;29
And so I, I'm a bloomer who thinks
there should be some regulation.
00;08;57;29 - 00;09;02;03
But I also think that
AI is going to be central
00;09;02;03 - 00;09;06;07
to our lives
within a decade, maybe 15 years.
00;09;07;26 - 00;09;10;02
It has vast potential.
00;09;10;02 - 00;09;11;01
Right.
00;09;11;01 - 00;09;13;21
So, so with that in mind and,
and thinking about the,
00;09;13;21 - 00;09;16;21
you know, the trust part.
00;09;17;01 - 00;09;19;26
We're in a space now where,
00;09;19;26 - 00;09;22;01
well, let me ask it as a question.
00;09;22;01 - 00;09;24;15
There's in your book
and I'm sure a little bit beyond
00;09;24;15 - 00;09;28;06
your book, there's a handful of top dogs,
I'll call them, who it seems like
00;09;28;06 - 00;09;32;28
have a really outsized influence
over this technology, how it's developed,
00;09;33;04 - 00;09;36;09
how it's played out,
how it collects data, what it does.
00;09;36;23 - 00;09;38;04
Is that fair?
00;09;38;04 - 00;09;39;25
Is there really kind of a,
00;09;39;25 - 00;09;43;18
you know, a small cabal that has,
you know, an oversize amount of control
00;09;43;26 - 00;09;47;15
and, you know, do we trust them
and should we trust them?
00;09;48;02 - 00;09;48;11
Right.
00;09;48;11 - 00;09;52;22
So, so you're getting at one of my fears
around AI.
00;09;52;22 - 00;09;56;08
It's nothing to do with laser eyed robots
or anything like that.
00;09;56;08 - 00;10;00;12
It's the consolidation in the hands
of the same few tech companies
00;10;00;12 - 00;10;04;00
that have been dominant,
for the last decade or two.
00;10;04;01 - 00;10;04;23
You know, it's funny.
00;10;04;23 - 00;10;09;05
So I started this book right at
the end of 2022, start of 2023,
00;10;09;14 - 00;10;12;14
and I went in search of the next Google,
the next meta.
00;10;12;17 - 00;10;15;05
And, you know,
I ended up concluding, like,
00;10;15;05 - 00;10;19;11
I fear that the next Google is Google in
AI, and the next Meta is Meta.
00;10;19;22 - 00;10;21;13
You know, yeah.
00;10;21;13 - 00;10;23;03
This stuff is really expensive.
00;10;23;03 - 00;10;26;03
When I first started,
you know, people were talking about,
00;10;26;14 - 00;10;32;06
you know, millions, tens of millions
to train, fine tune and operate
00;10;32;06 - 00;10;36;16
these chatbots, large language models,
whatever you want to call them.
00;10;36;16 - 00;10;38;23
And the same with text-to-video, audio-to-text,
00;10;39;25 - 00;10;42;03
text-to-audio.
00;10;42;03 - 00;10;45;24
By the time I was done
reporting at the end of 2024,
00;10;45;24 - 00;10;50;16
May 2024, it was hundreds of millions,
if not billions.
00;10;50;16 - 00;10;54;28
And Dario Amodei from Anthropic,
they do Claude, the chatbot Claude.
00;10;55;08 - 00;10;57;28
You know he's
estimating that they're going to need $100
00;10;57;28 - 00;11;01;08
billion by 2027 to train these things.
00;11;01;20 - 00;11;04;11
And so who has that kind of money?
00;11;04;11 - 00;11;06;07
You know, Google, Microsoft.
00;11;06;07 - 00;11;09;01
They have 100 billion
or so laying around in cash.
00;11;09;01 - 00;11;13;26
But if you have to raise $100 billion
or even if it's only, you know,
00;11;14;12 - 00;11;17;19
three, five, $10 billion, well,
a large venture capital outfit
00;11;17;19 - 00;11;21;27
in Silicon Valley has $1 billion,
all told, in a fund.
00;11;22;06 - 00;11;23;13
And so we're talking about billions.
00;11;23;13 - 00;11;27;23
And so that's one way this is
weighted toward big tech.
00;11;27;23 - 00;11;29;08
And the other is data. Right.
00;11;29;08 - 00;11;32;26
This is really central to the,
the remedies that government
00;11;32;26 - 00;11;36;13
is now talking to remedies
for the Google antitrust trial.
00;11;36;19 - 00;11;42;04
You know, a federal judge found that,
you know, Google is a monopolist.
00;11;42;05 - 00;11;44;22
It abused its power.
So now what should we do?
00;11;44;22 - 00;11;49;17
And a lot of the discussion
I think rightfully is around the data.
00;11;49;25 - 00;11;52;25
You know, OpenAI approached
Google and said, hey,
00;11;53;08 - 00;11;57;03
can we lease,
can we kind of buy access to your data?
00;11;57;03 - 00;11;58;14
And they said, no.
00;11;58;14 - 00;12;00;23
And that's a huge advantage.
00;12;00;23 - 00;12;04;29
In fact, anyone who reads the book sees:
every time Google steps on stage
00;12;04;29 - 00;12;08;07
in 2023 or 2024,
they fall flat on their face.
00;12;08;14 - 00;12;12;27
Google was so far ahead of every other
big tech company around machine learning.
00;12;13;00 - 00;12;13;25
You know, give them credit.
00;12;13;25 - 00;12;17;27
They were using it in the 2000s
and they went on a hiring binge.
00;12;18;07 - 00;12;21;03
You know, in the early 2010s, long
before anyone else, you know,
00;12;21;03 - 00;12;25;04
they came up with the transformer
paper in 2017, that's the T in GPT,
00;12;26;17 - 00;12;27;20
and so it's, you know,
00;12;27;20 - 00;12;32;03
foundational research, and yet, you know,
it's a big company that was scared.
00;12;32;03 - 00;12;34;17
So OpenAI puts out ChatGPT.
00;12;34;17 - 00;12;38;06
They had to try to catch up in a race they were
already winning, but they're scared.
00;12;38;06 - 00;12;39;13
So they weren't winning it.
00;12;39;13 - 00;12;41;09
And they kept on falling on their face.
00;12;41;09 - 00;12;45;13
Despite that, Gemini,
their chat bot, is one of the 2
00;12;45;13 - 00;12;48;10
or 3 most popular chatbots on the planet
because they're Google.
00;12;48;10 - 00;12;49;29
You know, for many of us,
00;12;49;29 - 00;12;51;26
Google's the front door to the internet.
00;12;51;26 - 00;12;55;15
They have all this data
and they can, you know, train,
00;12;55;27 - 00;12;59;07
better and cheaper, than others.
00;12;59;07 - 00;13;02;05
So yeah, it's, it's interesting.
00;13;02;05 - 00;13;06;28
And, you know, your position is very
similar to mine, I think, which is,
00;13;08;19 - 00;13;10;26
it really feels like there's,
I don't know if I want to call it
00;13;10;26 - 00;13;15;03
an arms race or a space race,
but there's this race
00;13;15;03 - 00;13;19;05
of some kind between the these,
you know, tech giants to try and just
00;13;20;04 - 00;13;22;06
own the platform of the future,
00;13;22;06 - 00;13;25;17
you know, own the data, use it as a way
00;13;25;17 - 00;13;28;17
to, continue to monetize,
00;13;29;00 - 00;13;31;16
you know, the the public here, I guess
00;13;31;16 - 00;13;34;16
so, you know, with that in mind,
00;13;34;23 - 00;13;38;08
who are going to be the winners
and the losers of,
00;13;38;09 - 00;13;41;09
you know, this shift to AI?
And is it necessarily, you know,
00;13;41;09 - 00;13;43;17
when you look out over it,
do you really see it's
00;13;43;17 - 00;13;47;17
winner take all with one or a few,
you know, technology giants
00;13;47;17 - 00;13;51;15
that probably are the OpenAIs, the Googles,
the Microsofts, the Anthropics?
00;13;51;23 - 00;13;54;23
Or is it going to be more democratic
than that?
00;13;55;12 - 00;13;58;09
Yeah, I mean, we're talking
about predicting the future.
00;13;58;09 - 00;14;02;20
So, you know, take anything coming out of
my mouth right now with a grain of salt.
00;14;03;26 - 00;14;04;15
You know, I
00;14;04;15 - 00;14;08;02
would kind of distinguish
between OpenAI and Anthropic, they're,
00;14;08;10 - 00;14;11;14
you know, relatively new startups
that have a lot of cash.
00;14;11;14 - 00;14;14;03
They've raised billions, tens
of billions of dollars
00;14;14;03 - 00;14;18;26
in the case of OpenAI, versus
a Google, Microsoft,
00;14;19;20 - 00;14;23;11
Meta, Amazon, and Apple, where
00;14;24;00 - 00;14;27;03
they're not quite doing it yet,
but they're, they're in the race,
00;14;27;29 - 00;14;31;15
You know, I want to be an optimist.
00;14;31;15 - 00;14;34;17
I guess I'm built to
root for startups,
00;14;35;01 - 00;14;36;25
you know, I root for the underdog.
00;14;36;25 - 00;14;37;27
You know, to me.
00;14;37;27 - 00;14;42;09
Well, one thing that was exciting about
AI is the possibility that there would be
00;14;42;19 - 00;14;46;29
kind of a new set, of major, major,
major players.
00;14;47;24 - 00;14;52;18
But I, I even wonder if an OpenAI,
despite all the successes.
00;14;52;18 - 00;14;52;25
Right.
00;14;52;25 - 00;14;55;26
It just raised $40 billion earlier
this year.
00;14;55;26 - 00;14;59;12
That's the largest raise
in the history of venture capital.
00;14;59;12 - 00;15;03;02
It's worth, you know, it's
got a paper valuation of $300 billion right
00;15;03;02 - 00;15;07;12
now, which, if you put it in the Fortune
500, would make it top 50.
00;15;07;16 - 00;15;09;17
I think it's top 50 for the globe.
00;15;09;17 - 00;15;13;11
And so, you know, it's,
it's, you know, it's ten years old,
00;15;13;19 - 00;15;18;17
but it seems on paper just kind of
a global giant that's been around forever.
00;15;18;19 - 00;15;21;02
But I'm still not sure they
can make it because they,
00;15;22;14 - 00;15;25;12
they're bringing in revenue,
but they're still losing a ton of money.
00;15;25;12 - 00;15;27;07
I mean, give them credit.
00;15;27;07 - 00;15;30;16
They were making virtually nothing
in 2022. In 2024,
00;15;30;24 - 00;15;34;02
they brought in almost $4 billion.
00;15;34;02 - 00;15;36;29
They'll have a much bigger number
than that this year,
00;15;36;29 - 00;15;38;08
you know, 10 billion or more.
00;15;38;08 - 00;15;42;20
But they're still losing money because
this stuff is so expensive to operate.
00;15;42;21 - 00;15;44;29
You can use ChatGPT and
00;15;44;29 - 00;15;46;23
other products for free.
00;15;46;23 - 00;15;51;01
You can pay a premium,
but you get a very good service for free.
00;15;51;01 - 00;15;51;16
And so,
00;15;51;16 - 00;15;54;25
you know, it's kind of like that
consumer thing, and you could just go to,
00;15;55;04 - 00;15;58;20
you know,
Claude or Gemini or,
00;16;00;11 - 00;16;03;04
what was it,
00;16;03;04 - 00;16;06;04
Grok, you could use, you know, there's a mass of them,
00;16;07;05 - 00;16;09;28
and so that, that all costs,
00;16;09;28 - 00;16;12;04
and so I, I,
00;16;12;04 - 00;16;15;02
I do fear that what's going to happen
00;16;15;02 - 00;16;18;28
is it's inevitable
that one of the giants, a Microsoft
00;16;18;28 - 00;16;21;28
or a Google, is going to buy
00;16;21;29 - 00;16;24;15
an Anthropic or an OpenAI.
00;16;24;15 - 00;16;28;10
You know, right now,
Microsoft's put around 15 billion or so
00;16;28;27 - 00;16;31;18
into OpenAI. You know, Anthropic.
00;16;31;18 - 00;16;35;08
Their two biggest
funders are Amazon and Google.
00;16;35;23 - 00;16;38;10
And so let's take Dario
00;16;38;10 - 00;16;41;14
Amodei at his word: in two years,
00;16;41;14 - 00;16;43;15
Where's he getting that $100 billion?
00;16;43;15 - 00;16;46;20
I mean I suppose
he could raise that money,
00;16;46;20 - 00;16;50;07
but I'm not even sure
they're profitable at that point.
00;16;50;07 - 00;16;53;16
So I do wonder if it's inevitable
that these
00;16;53;16 - 00;16;57;03
popular chat bots get bought,
by one of the tech giants.
00;16;57;14 - 00;17;01;00
Well, I have to imagine the tech giants
are chomping at the bit at the opportunity
00;17;01;00 - 00;17;04;07
to buy, you know, an OpenAI or an
Anthropic, too, and add them to their,
00;17;04;15 - 00;17;05;18
you know, to their stack.
00;17;05;18 - 00;17;09;23
Well, and they have been doing that,
you know, kind of this idea of an acqui-hire.
00;17;10;05 - 00;17;14;04
Inflection happened to be the company
at the center of my book.
00;17;14;04 - 00;17;17;04
It was started by Reid Hoffman and
this guy, Mustafa Suleyman,
00;17;17;09 - 00;17;20;03
you know, kind of a co-founder of DeepMind,
which, you know, it's kind of the first
00;17;20;03 - 00;17;24;08
great machine learning, startup
that started in 2010, in London.
00;17;24;08 - 00;17;27;28
And Google
bought it for 650 million dollars.
00;17;28;03 - 00;17;30;26
You know,
they had everything going for them.
00;17;30;26 - 00;17;35;02
You know, it's like Reid Hoffman, best
connected guy, arguably in Silicon Valley.
00;17;35;02 - 00;17;40;16
You know, kind of, the pedigree
of a Mustafa Suleyman, a great CEO.
00;17;40;22 - 00;17;41;29
You know, Bill Gates funded it.
00;17;41;29 - 00;17;45;24
will.i.am, you know, kind of had that,
you know, kind of Hollywood thing.
00;17;46;23 - 00;17;49;27
Ashton Kutcher was an investor.
00;17;49;29 - 00;17;51;16
They hoarded all this talent.
00;17;51;16 - 00;17;55;12
Some of the top researchers in the world.
They raised a billion
00;17;55;12 - 00;17;59;04
and a half dollars, in, like, 15 months.
00;17;59;16 - 00;18;00;24
But it wasn't enough.
00;18;00;24 - 00;18;03;25
And so basically,
everyone at the company jumped to Microsoft.
00;18;04;29 - 00;18;07;29
Microsoft paid the investors
00;18;08;00 - 00;18;11;00
$650 million to make everyone happy.
00;18;11;02 - 00;18;15;16
And, you know, essentially this company
Inflection had a really cutting-edge chatbot.
00;18;15;21 - 00;18;18;22
What I loved about it is it's great at EQ,
00;18;18;22 - 00;18;21;22
emotional intelligence, not just IQ.
00;18;21;28 - 00;18;26;12
But now the talent, virtually
the whole company is inside Microsoft.
00;18;26;12 - 00;18;29;27
Google did the same thing with Character.AI,
spent billions of dollars,
00;18;30;27 - 00;18;34;03
essentially to hire the two creators of,
00;18;35;03 - 00;18;35;22
of Character.AI,
00;18;35;22 - 00;18;38;17
another really popular chatbot.
00;18;38;17 - 00;18;41;27
These two founders, who were ex-Googlers,
got frustrated at Google
00;18;41;27 - 00;18;44;24
because they couldn't do what they wanted
to do, went and did it,
00;18;44;24 - 00;18;45;23
and it started up.
00;18;45;23 - 00;18;49;11
And then Google had to pay billions
of dollars just to bring them back.
00;18;51;04 - 00;18;51;26
Yeah.
00;18;51;26 - 00;18;54;24
So so, you know, when I think about AI
00;18;54;24 - 00;18;57;24
and the impact it's going to have,
there's clearly this kind of,
00;18;58;28 - 00;19;00;11
I don't know
if I'm supposed to call it this,
00;19;00;11 - 00;19;03;01
but I'll call it a boys
club of like the big,
00;19;03;01 - 00;19;05;04
you know, incumbent
or emerging organizations
00;19;05;04 - 00;19;07;27
and the tens of billions of dollars
and the kind of,
00;19;07;27 - 00;19;10;08
you know, political infighting
you see there. And it's.
00;19;10;08 - 00;19;14;19
Yeah, you know, it's funny, Gary,
and I don't know if this is a result of
00;19;14;19 - 00;19;18;07
your background or just kind of a parallel
I picked up on, but I can
00;19;18;10 - 00;19;21;10
I can certainly see through lines
to what's going on here, to
00;19;21;10 - 00;19;24;10
what you probably saw,
you know, covering City Hall,
00;19;24;29 - 00;19;28;03
and those kind of political beats
all those years ago where it's jockeying
00;19;28;03 - 00;19;31;28
for, for power and influence. I mean,
is that fair?
00;19;31;28 - 00;19;35;18
Is that just coming through,
00;19;35;29 - 00;19;38;28
do you think? Sorry,
is there merit to that?
00;19;39;11 - 00;19;40;00
Yeah.
00;19;40;00 - 00;19;42;19
There is. Like, you know, in a
00;19;42;19 - 00;19;45;23
political sense,
we call it backroom deals, kind of
00;19;46;16 - 00;19;49;17
going on, you know,
the politics is happening in front of us,
00;19;49;17 - 00;19;53;12
but the real deals are being
cut, behind closed doors and
00;19;54;16 - 00;19;56;08
it is true of Silicon Valley.
00;19;56;08 - 00;19;57;12
It is true of tech.
00;19;57;12 - 00;19;59;22
Like, it's one of my big,
you know, criticisms.
00;19;59;22 - 00;20;04;29
But let's just get five smart guys
and they are invariably guys in a room
00;20;04;29 - 00;20;06;06
and we'll figure it out.
00;20;06;06 - 00;20;08;23
And, you know,
I mean, that works for an app.
00;20;08;23 - 00;20;13;02
That has worked very well for tech.
00;20;13;02 - 00;20;16;02
But I think AI is different, you know,
00;20;16;02 - 00;20;20;26
AI requires
a much broader group of people.
00;20;20;26 - 00;20;24;10
I mean, some of it is what discipline
are you from?
00;20;24;10 - 00;20;25;28
Like, it's not just computer scientists.
00;20;25;28 - 00;20;26;28
It's not just,
00;20;26;28 - 00;20;31;03
you know, tech folks. You know, I,
I mean, first off, it's linguists.
00;20;31;15 - 00;20;32;14
You know, you need linguists.
00;20;32;14 - 00;20;34;18
It's math.
00;20;34;18 - 00;20;37;26
And so I, I really want a much broader
group,
00;20;37;26 - 00;20;42;23
given the power and potential of AI
and what AI is.
00;20;43;20 - 00;20;46;11
I want a broader group
in terms of discipline.
00;20;46;11 - 00;20;49;10
I want a broader group
in terms of who we're talking about.
00;20;49;16 - 00;20;52;08
I mean, tech is heavily
skewed male, of course,
00;20;52;08 - 00;20;55;11
and these days it's very white and Asian.
00;20;55;26 - 00;20;57;14
Our world is much broader than that.
00;20;57;14 - 00;20;58;17
It's a global thing.
00;20;58;17 - 00;21;02;22
And so bringing in people
from different backgrounds from,
00;21;02;23 - 00;21;06;19
you know, around the globe,
you know, to figure this out, I again, I
00;21;06;28 - 00;21;07;25
coming back to a point
00;21;07;25 - 00;21;11;04
I made before about the strategy,
I think it's gonna be very dangerous,
00;21;12;04 - 00;21;13;01
for tech,
00;21;13;01 - 00;21;16;07
if this is the same old, same old
where it's, you know,
00;21;16;16 - 00;21;20;13
Sam Altman
negotiating with a Satya Nadella
00;21;20;13 - 00;21;24;12
at Microsoft with a few other people,
in the in the room.
00;21;24;12 - 00;21;29;24
I really think for kind of trust
and safety reasons, there needs to be
00;21;29;24 - 00;21;33;05
a much broader group than the same old,
same old that Silicon Valley does.
00;21;34;24 - 00;21;35;22
It makes sense.
00;21;35;22 - 00;21;38;21
And as I think about
like what that impact could be
00;21;38;21 - 00;21;42;21
and what this technology could do, I mean,
we talked about why, you know,
00;21;42;21 - 00;21;46;02
the hype is kind of, you know, inflated
potentially on both sides.
00;21;46;23 - 00;21;49;23
What impact do you see this having,
00;21;50;29 - 00;21;53;25
I guess, across the population?
00;21;53;25 - 00;21;57;22
And is this a leap forward
in some ways for,
00;21;57;23 - 00;22;01;28
you know, the poor or the traditionally
marginalized or underserved or is it
00;22;02;12 - 00;22;06;28
do you foresee it just ending up
being a rich-get-richer platform?
00;22;07;21 - 00;22;08;00
Yeah.
00;22;08;00 - 00;22;10;15
So again, I'm about to predict the future.
00;22;10;15 - 00;22;12;03
So take anything I say.
00;22;12;03 - 00;22;14;04
That's what we do here, Gary.
00;22;14;04 - 00;22;16;02
With a great grain of salt.
00;22;16;02 - 00;22;17;06
But, you know, I.
00;22;20;16 - 00;22;23;01
So I do
00;22;23;01 - 00;22;26;01
think AI can be
00;22;27;03 - 00;22;29;29
a great equalizer.
00;22;29;29 - 00;22;30;29
You know, kind of.
00;22;30;29 - 00;22;32;24
I'll give a couple of examples.
00;22;32;24 - 00;22;38;10
So in the sense of making art: you know,
if you're a movie
00;22;38;10 - 00;22;41;22
maker and you want to make a movie,
you need a lot of money to do that.
00;22;41;22 - 00;22;43;05
You have to hire people and stuff.
00;22;43;05 - 00;22;46;16
But with AI, you can make a movie.
00;22;46;23 - 00;22;50;28
You know, it's kind of the great
equalizer in terms of the arts.
00;22;50;28 - 00;22;54;06
It gives you these superpowers,
to do things.
00;22;54;06 - 00;22;56;02
We could talk about the ramifications
for, you know,
00;22;56;02 - 00;22;59;29
all those actors and production
people, etc., who aren't getting the job.
00;22;59;29 - 00;23;01;23
And, you know, to add a caveat, it's
not just like,
00;23;01;23 - 00;23;03;24
make me a Martin Scorsese movie.
00;23;03;24 - 00;23;06;11
You know, the human is still the driver.
00;23;06;11 - 00;23;09;17
The human has to bring the storylines
and the plots and,
00;23;10;04 - 00;23;12;09
you know, plot twists and all that.
00;23;13;14 - 00;23;15;17
And, you know, another example is, again,
00;23;15;17 - 00;23;19;06
it could be an equalizer in that,
00;23;21;02 - 00;23;23;24
anyone can code, right?
00;23;23;24 - 00;23;26;29
I mean, I don't think we're quite
at that point yet, but we're not that far
00;23;26;29 - 00;23;31;17
away from a point where you don't have
to have a computer science degree.
00;23;31;17 - 00;23;34;17
You don't have to have much experience
at all, if any.
00;23;34;18 - 00;23;35;10
And you could code
00;23;35;10 - 00;23;39;05
and that will really help. So someone
in the developing world
00;23;39;14 - 00;23;41;22
could see a problem
that's not a first world problem
00;23;41;22 - 00;23;46;00
but a problem in their world
and code, you know, and,
00;23;46;01 - 00;23;50;27
and have this magic power
of coding to come up with a solution,
00;23;52;00 - 00;23;54;00
tailored to their world.
00;23;54;00 - 00;23;56;11
You know,
just to use the example of education.
00;23;56;11 - 00;24;01;02
I think AI has amazing potential
for education.
00;24;01;09 - 00;24;04;09
They did a study, in Nigeria,
00;24;04;13 - 00;24;08;11
where they had teachers use AI tutors.
00;24;08;11 - 00;24;11;13
And of course, an AI tutor
could figure out exactly what
00;24;11;13 - 00;24;15;19
a student doesn't understand
and then feed them, examples,
00;24;16;12 - 00;24;20;11
information that helps them,
learn that aspect and stuff.
00;24;20;11 - 00;24;23;14
And what they found
is just in a six week period,
00;24;23;22 - 00;24;26;20
they had like two years of advancement
in the students
00;24;26;20 - 00;24;27;17
they worked with.
00;24;27;17 - 00;24;31;13
And I really do think, you know,
this idea that, you know, a tutor at,
00;24;31;24 - 00;24;35;06
you know, $100 an hour or $200 an hour,
I don't know what a tutor costs,
00;24;35;13 - 00;24;36;08
but it's a lot of money.
00;24;36;08 - 00;24;40;08
It's beyond the means of,
you know, most people on the planet.
00;24;40;15 - 00;24;43;19
But what if this tutor was in your pocket?
00;24;44;23 - 00;24;45;07
And it was,
00;24;45;07 - 00;24;48;10
if not free, virtually
free? Or health care:
00;24;48;10 - 00;24;52;15
This is a great stat
that basically a billion or so people
00;24;52;24 - 00;24;56;01
have ready access to health care,
00;24;56;15 - 00;24;59;20
but there's 5 billion
or so who have a smartphone.
00;25;00;03 - 00;25;01;29
And should you go to a doctor?
00;25;01;29 - 00;25;04;16
Is a doctor, or a nurse practitioner, or whatever
00;25;04;16 - 00;25;06;18
health care
professional, better than a phone?
00;25;06;18 - 00;25;08;10
Yes, yes, yes, yes, yes.
00;25;08;10 - 00;25;11;15
But what about all those people
who have zero access
00;25;11;25 - 00;25;15;03
and you know, you could take a picture
and my child has this rash,
00;25;15;18 - 00;25;18;18
you know, should I be worried and
00;25;19;00 - 00;25;21;20
move heaven and earth
to get to a hospital?
00;25;21;20 - 00;25;22;21
Or is it nothing?
00;25;22;21 - 00;25;25;01
And so that could be a life
and death thing.
00;25;25;01 - 00;25;27;09
So, you know, I do think,
00;25;27;09 - 00;25;31;03
you know, kind of
this idea of an AI doctor in your pocket,
00;25;31;12 - 00;25;35;24
you know, it's certainly better
than not having any access to health care.
00;25;35;27 - 00;25;38;26
So that's another way I see,
00;25;38;26 - 00;25;43;19
you know, education or medicine,
where AI could be an equalizer.
00;25;43;19 - 00;25;47;19
But the flip side of that is that,
you know, winner takes all,
00;25;47;19 - 00;25;49;16
if not winner takes most.
00;25;49;16 - 00;25;52;18
And, you know, I am nervous that
00;25;53;17 - 00;25;55;15
well, as we've been talking
00;25;55;15 - 00;25;58;20
power over AI
in the hands of just a few people
00;25;58;27 - 00;26;03;15
will mean that that small group
in Silicon Valley or technology
00;26;03;15 - 00;26;08;26
generally, is going to grow super wealthy
and only a little bit will spill over
00;26;08;26 - 00;26;10;24
to the rest of us.
00;26;10;24 - 00;26;13;02
Yeah. It's.
00;26;13;02 - 00;26;13;16
Yeah.
00;26;13;16 - 00;26;15;28
Prediction
is a tough game, and it's,
00;26;15;28 - 00;26;18;28
there's
like so many moving pieces to this.
00;26;19;05 - 00;26;20;25
And I don't know
if this is something you heard about
00;26;20;25 - 00;26;22;02
or you're concerned about as well.
00;26;22;02 - 00;26;25;06
But when I think about that,
you know, that AI doctor, which
00;26;25;29 - 00;26;27;26
you know, whether we choose
to acknowledge it or not,
00;26;27;26 - 00;26;29;07
I feel like that's already
starting to happen.
00;26;29;07 - 00;26;31;28
Right. People are talking to ChatGPT
about their symptoms.
00;26;31;28 - 00;26;36;08
And and frankly, I think for the most
part, that's probably a good thing. But
00;26;37;15 - 00;26;40;03
what I don't know is whether it's happening right
now, but what I have to
00;26;40;03 - 00;26;45;13
imagine we're on the cusp of is ChatGPT
or whatever generative AI, saying,
00;26;45;17 - 00;26;49;04
oh, and you should take an Advil
for your symptoms.
00;26;49;22 - 00;26;54;20
And there's quietly, you know, money
changing hands between, you know, the tech
00;26;54;20 - 00;26;57;26
giant and Advil so that you take an Advil
and not a Tylenol.
00;26;57;26 - 00;27;02;26
And suddenly... You know, the difference to me
between the web
00;27;02;27 - 00;27;03;26
and some of these generative
00;27;03;26 - 00;27;07;09
AI things, and even Google,
frankly, is this:
00;27;07;18 - 00;27;11;23
It gives you an answer that it acts like
is definitive, right?
00;27;11;26 - 00;27;14;22
Yeah. And what kind of influence does
that have over people?
00;27;14;22 - 00;27;18;06
Is that something that came up at all or
something that you have a perspective on?
00;27;18;11 - 00;27;19;05
Yeah, yeah.
00;27;19;05 - 00;27;23;26
And these chatbots speak
with an authority. I joke,
00;27;23;26 - 00;27;28;07
well, they were trained
on a disproportionately male data set.
00;27;28;18 - 00;27;29;23
And so they're know-it-alls.
00;27;29;23 - 00;27;30;08
In fact,
00;27;30;08 - 00;27;31;16
they've had to be taught to say,
00;27;31;16 - 00;27;34;16
like it's okay to say,
I don't know if you don't know,
00;27;34;21 - 00;27;37;19
but what you're getting at
is one of the things,
00;27;37;19 - 00;27;40;22
again, before we were talking about
like what the fear out
00;27;40;22 - 00;27;45;11
there is, with laser-eyed robots,
there are very real fears,
00;27;46;27 - 00;27;48;27
that we could see right now.
00;27;48;27 - 00;27;53;21
And one of them is the power of
AI to manipulate or the power of AI
00;27;53;27 - 00;27;57;28
to make suggestions that you think, well,
this is just based on its knowledge
00;27;57;28 - 00;28;01;23
reading, you know, what
it's been trained on and you don't know,
00;28;01;23 - 00;28;05;01
like, oh, no, the maker of Advil has paid,
00;28;05;24 - 00;28;08;19
for this. By the way, right now,
00;28;08;19 - 00;28;13;11
I don't think there's any chat bots
that are giving recommendations.
00;28;14;07 - 00;28;17;15
You know, I
think people are using it, and probably
00;28;17;15 - 00;28;22;04
sometimes abusing it, like people
who do have access to medicine,
00;28;22;08 - 00;28;26;12
like, oh, yeah, I'm
going to learn about this here.
00;28;26;12 - 00;28;27;22
And like, you know, sometimes that works.
00;28;27;22 - 00;28;31;06
Sometimes it doesn't. You know,
there's stories where, you know,
00;28;31;06 - 00;28;35;22
someone has struggled for months, years,
to figure out what's wrong with them. They
00;28;36;22 - 00;28;39;22
give
their symptoms to ChatGPT or whatever,
00;28;40;00 - 00;28;43;00
and they figure out the solution.
00;28;43;00 - 00;28;44;25
I mean, you know, AI can be incredible.
00;28;44;25 - 00;28;49;27
There's a study where,
they are training a chat bot
00;28;50;12 - 00;28;53;19
to have early detection
00;28;54;06 - 00;28;58;14
of type two diabetes
just by hearing your voice.
00;28;58;24 - 00;29;00;17
And so, you know,
that's the positive where
00;29;00;17 - 00;29;02;21
there's amazing things that can happen.
00;29;02;21 - 00;29;06;17
You know, they're really accurate
at reading mammograms and,
00;29;06;17 - 00;29;11;01
you know, X-rays and all that
far more accurate than human doctors.
00;29;11;01 - 00;29;14;01
Me, I want doctors to use,
00;29;14;29 - 00;29;17;29
AI as a backup, as a check.
00;29;18;08 - 00;29;21;14
I don't want AI reading my, you know...
00;29;21;23 - 00;29;25;17
Another thing that worries me
is autonomous AI,
00;29;25;26 - 00;29;28;26
you know, for the foreseeable future,
if not forever.
00;29;28;26 - 00;29;31;10
We need humans in the loop. These things,
00;29;32;19 - 00;29;33;19
they know a lot
00;29;33;19 - 00;29;36;19
about everything,
but they don't understand a thing.
00;29;36;25 - 00;29;38;10
They have no common sense.
00;29;38;10 - 00;29;41;10
They have no, you know... It's
00;29;41;11 - 00;29;44;13
like a parrot repeating words,
but it doesn't really understand.
00;29;44;24 - 00;29;46;08
You know, there's an expression
00;29;46;08 - 00;29;50;17
I learned doing the research here
that it tends to know everything.
00;29;50;19 - 00;29;52;18
These chatbots, these large language
models, tend
00;29;52;18 - 00;29;56;14
to know everything
that humans over 20 years old know.
00;29;56;18 - 00;30;00;22
But they don't know a lot that a kid
under five years old
00;30;02;11 - 00;30;04;03
would know, you know, just kind of
00;30;04;03 - 00;30;07;19
just the common sense stuff,
like just stupid stuff, like,
00;30;07;29 - 00;30;10;26
you chew this, but you don't chew this,
you swallow, you know, that.
00;30;10;26 - 00;30;14;03
Kind of like
they have to be taught everything.
00;30;14;11 - 00;30;18;09
And so, you know, I do fear
00;30;19;00 - 00;30;23;02
that we're going to invest too much power
prematurely in these things.
00;30;23;02 - 00;30;26;02
I really don't want autonomous
AI anytime soon.
00;30;26;10 - 00;30;28;28
That's fair.
00;30;30;03 - 00;30;33;03
When you look at
what we have or where we're going with AI,
00;30;33;07 - 00;30;36;27
is there anything that came across to you
as being underreported right now,
00;30;36;29 - 00;30;39;14
either in capabilities or in risks
that, you know,
00;30;39;14 - 00;30;42;11
you think we need to have a little bit
more of a light on?
00;30;42;11 - 00;30;42;17
Yeah.
00;30;42;17 - 00;30;45;17
So, I mean, broadly speaking,
00;30;46;13 - 00;30;49;28
I don't think the media I mean,
I used to work for the New York Times,
00;30;49;28 - 00;30;53;00
so I am big media,
I guess you can say, at least in my past,
00;30;53;10 - 00;30;57;07
but, you know, the New York Times,
Washington Post of the world,
00;30;57;19 - 00;31;00;22
I don't think they've done a great job,
covering
00;31;01;27 - 00;31;06;17
AI, generative AI in particular,
since the release of ChatGPT.
00;31;06;24 - 00;31;09;16
And that was November of 2022.
00;31;09;16 - 00;31;13;04
You know, there was a little bit about,
you know, its potential,
00;31;13;14 - 00;31;16;28
but right away
it was all the negatives.
00;31;16;28 - 00;31;20;08
It's going to destroy education,
you know, that it's going to write
00;31;20;08 - 00;31;23;13
essays for kids and, you know,
kids are going to use it for cheating.
00;31;23;14 - 00;31;24;14
Yeah, that's a problem.
00;31;24;14 - 00;31;28;21
But, you know, what about... In fact,
I have an eighth grader, and,
00;31;28;21 - 00;31;32;01
you know, he writes a composition
and he gives it to ChatGPT
00;31;32;24 - 00;31;35;04
and it gives him feedback.
00;31;35;04 - 00;31;38;19
You know, the teacher's busy, but like,
you know, how can I improve this?
00;31;38;19 - 00;31;43;00
Well, you can make a better topic sentence
or it's repetitive, that kind of thing,
00;31;43;06 - 00;31;44;13
and, you know, gets good feedback.
00;31;44;13 - 00;31;47;01
And so it's an amazing tool for a teacher.
00;31;47;01 - 00;31;49;15
You could devise lesson plans.
00;31;49;15 - 00;31;54;01
You can, you know, use it to help
with grading and all.
00;31;54;25 - 00;31;58;19
And, you know, I think that
has been largely missing
00;31;58;27 - 00;31;59;14
from the coverage.
00;31;59;14 - 00;32;03;22
Occasionally you see an article about, oh,
we could do this around health care.
00;32;04;14 - 00;32;09;03
And also,
I really don't think the broader public
00;32;09;03 - 00;32;13;09
has a sense
of what the true promise, of this is.
00;32;13;15 - 00;32;15;19
And the flip side of that is,
00;32;16;23 - 00;32;19;09
I think people in
00;32;19;09 - 00;32;24;04
part because of the media coverage,
are scared of the wrong things.
00;32;24;13 - 00;32;29;03
I mentioned autonomous
AI. But the use of AI in warfare,
00;32;29;24 - 00;32;33;04
the use of AI for surveillance,
you know, an
00;32;33;26 - 00;32;36;26
AI that can come up with,
00;32;38;04 - 00;32;39;25
a new vaccine
00;32;39;25 - 00;32;43;25
or a new treatment
or tailored treatments for this specific,
00;32;44;12 - 00;32;47;12
cancer or whatever someone is suffering
00;32;47;18 - 00;32;52;12
through can also create a deadly pathogen
that could kill a lot of people,
00;32;52;21 - 00;32;54;09
you know, kind of the potential for
00;32;54;09 - 00;32;58;16
AI to write your wedding toast
or for your friend's 50th birthday,
00;32;58;29 - 00;33;02;09
you know, is also a powerful tool
in the hands of scammers.
00;33;02;24 - 00;33;07;28
And so I wish that's
where the coverage was.
00;33;07;28 - 00;33;11;08
You know, around
here are the potential pitfalls
00;33;11;14 - 00;33;14;15
and this is the stuff we need to navigate.
00;33;14;16 - 00;33;20;06
I mean,
every technology ever is both pro and con.
00;33;20;06 - 00;33;21;06
It goes both ways.
00;33;21;06 - 00;33;24;07
You know, the car
extraordinarily changed our society.
00;33;24;20 - 00;33;25;13
But, you know,
00;33;26;22 - 00;33;27;28
global warming
00;33;27;28 - 00;33;32;11
35 to 40 thousand people in the US
die in car accidents, you know,
00;33;32;11 - 00;33;36;13
so every technology cuts both ways, and the
same will be true with AI.
00;33;36;22 - 00;33;40;28
And I think,
I wish we were being smarter, more deliberate
00;33;41;04 - 00;33;45;26
about making sure AI was a net positive
rather than a net negative right now.
00;33;45;26 - 00;33;49;09
I don't know, like
I'll give you an example of regulation.
00;33;49;13 - 00;33;54;01
So the Biden administration imposed,
I thought, light regulation: like, okay,
00;33;54;01 - 00;33;56;01
if you're working on one of these
powerful models
00;33;56;01 - 00;33;58;14
before you release it,
we're gonna require you to red team it
00;33;58;14 - 00;34;01;28
to hire an outside group to,
you know, test for vulnerabilities
00;34;01;28 - 00;34;03;25
and then share the results
with the US government.
00;34;03;25 - 00;34;08;05
Didn't seem a very heavy lift,
but as soon as Trump took over, he,
00;34;09;07 - 00;34;11;16
counteracted that executive order.
00;34;11;16 - 00;34;16;16
And JD Vance in January,
I mean, February was in Paris
00;34;16;16 - 00;34;19;20
for the third of these worldwide
AI summits.
00;34;20;00 - 00;34;23;00
The first two were totally focused
on trust and safety.
00;34;23;06 - 00;34;25;06
This one was not. And there was J.D.
00;34;25;06 - 00;34;28;08
Vance, who stood up there and said,
stop with the hand-wringing.
00;34;28;17 - 00;34;32;15
We need to win this race,
because of the whole China, China, China thing.
00;34;32;15 - 00;34;35;09
And, you know,
that worries me.
00;34;35;09 - 00;34;37;11
Government does have a role.
00;34;37;11 - 00;34;40;07
Again,
I don't want them micromanaging this.
00;34;40;07 - 00;34;43;07
That's
not the way we have great innovation.
00;34;43;13 - 00;34;46;22
But I do think
we need guardrails.
00;34;47;00 - 00;34;50;19
And I do think
we need kind of this mindset.
00;34;50;19 - 00;34;54;15
So we're really looking at the stuff
around the corner
00;34;54;15 - 00;34;56;23
that's negative and the stuff
around the corner that's positive.
00;34;56;23 - 00;34;59;03
So we can, you know, work it out.
00;34;59;03 - 00;35;00;20
So it's more positive than negative
00;35;01;25 - 00;35;02;06
right.
00;35;02;06 - 00;35;06;08
So, yeah, you know, I'm thinking about
that kind of doomer versus zoomer mentality.
00;35;06;08 - 00;35;12;08
And it feels like, you know as we
look at the regulatory bodies broadly,
00;35;12;14 - 00;35;16;05
we've gone from doomer to zoomer
and just started, like, ripping off,
00;35;16;06 - 00;35;17;28
you know, red tape
if you can call it that.
00;35;17;28 - 00;35;22;08
But, it sounds like your perspective,
Gary, is we do need
00;35;22;26 - 00;35;28;17
the focus on some sort of responsible
AI guardrails is important
00;35;28;17 - 00;35;31;26
and that now is the right time
to get some of that right.
00;35;31;27 - 00;35;33;03
Is that fair? Right.
00;35;33;03 - 00;35;37;06
So, you know, it's funny.
If we had spoken in 2023,
00;35;37;13 - 00;35;39;24
I would have been really optimistic
about this.
00;35;39;24 - 00;35;40;19
Right?
00;35;40;19 - 00;35;42;08
I think most of us can agree
00;35;42;08 - 00;35;45;21
that the government blew it
as far as social media, right.
00;35;45;21 - 00;35;49;11
It was just like it took 20 years before
their first hearing about social media.
00;35;49;11 - 00;35;52;11
And by that time, social media had,
you know, caused the problems
00;35;53;05 - 00;35;55;17
it caused.
00;35;55;17 - 00;35;58;08
But, like, you know, remember Sam Altman
00;35;58;08 - 00;36;01;10
in the spring of 2023, you know, stood
00;36;01;10 - 00;36;05;18
before the Senate and said,
I have fears around this, too.
00;36;05;19 - 00;36;07;09
We need to be regulated.
00;36;07;09 - 00;36;10;16
He was talking about an AI body, like,
you know, a nuclear
00;36;10;16 - 00;36;13;16
regulatory agency-like thing for AI.
00;36;13;16 - 00;36;16;16
People have talked about
a global version, of that.
00;36;16;16 - 00;36;19;04
And, you know,
the senators were talking about it.
00;36;19;04 - 00;36;21;23
You know, Chuck Schumer,
then the majority leader,
00;36;21;23 - 00;36;25;18
he had a series of talks in 2023
where they had leaders in
00;36;25;18 - 00;36;29;24
AI come in and talk to the senators about,
here's the way it works.
00;36;29;24 - 00;36;30;22
Here's our fears.
00;36;30;22 - 00;36;33;06
Here's where we think the potential is.
00;36;33;06 - 00;36;36;02
But then, you know, two things happened.
00;36;36;02 - 00;36;40;16
One, the tech companies themselves,
because of the arms
00;36;40;16 - 00;36;44;16
race, as you called it before, you know,
they really dropped the trust
00;36;44;18 - 00;36;48;11
and safety aspect,
or at least that took a backseat,
00;36;49;14 - 00;36;51;06
to winning this race.
00;36;51;06 - 00;36;54;15
And, you know,
kind of getting our model out before your
00;36;55;17 - 00;36;56;08
model.
00;36;56;08 - 00;36;59;08
And, you know, the other thing
just kind of the wider politics
00;36;59;08 - 00;37;03;25
meant there really hasn't
been much room, lately for kind of debates
00;37;03;25 - 00;37;08;01
about AI when we're talking
about a whole different set of issues
00;37;08;12 - 00;37;09;17
and politics.
00;37;09;17 - 00;37;14;09
So, you know, now I am worried,
you know, whereas I was of the school
00;37;14;12 - 00;37;18;27
that I thought it was a good thing
that OpenAI put out ChatGPT.
00;37;19;25 - 00;37;21;13
When they did,
it wasn't that powerful yet.
00;37;21;13 - 00;37;25;12
It really couldn't
do much harm, at that point.
00;37;25;18 - 00;37;29;07
But it sparked the discussion,
you know, long
00;37;29;07 - 00;37;32;26
before this thing was super powerful,
you know, so, you know, it sort of.
00;37;33;15 - 00;37;35;11
We figured out how to make,
00;37;36;10 - 00;37;40;00
jet airplanes safe
or relatively safe.
00;37;40;00 - 00;37;42;12
There's very few accidents.
00;37;42;12 - 00;37;44;23
Because, you know,
it started off with prop planes
00;37;44;23 - 00;37;47;05
and we came up with rules,
and then they got better and stronger.
00;37;47;05 - 00;37;50;03
You know, cars:
00;37;50;03 - 00;37;54;13
we've got anti-lock brakes and,
you know,
00;37;55;01 - 00;37;57;22
airbags,
all those kinds of safety things,
00;37;57;22 - 00;37;59;25
you know, it's it's good
that we're kind of struggling
00;37;59;25 - 00;38;02;11
with the technology
before it's super powerful.
00;38;02;11 - 00;38;05;28
But I think in 2023 we were really jumping
on that opportunity.
00;38;06;17 - 00;38;09;05
But now, I just don't hear it at all.
00;38;09;05 - 00;38;12;05
It doesn't seem an issue.
00;38;12;06 - 00;38;13;10
Yeah. No, it's fair.
00;38;13;10 - 00;38;16;09
It feels like it has been like,
00;38;16;10 - 00;38;20;14
like almost like a pendulum swing
in the complete opposite direction.
00;38;20;14 - 00;38;23;10
And of course, pendulums
never swing to the middle; they,
00;38;23;10 - 00;38;25;16
you know, more or less,
swing to the other side.
00;38;25;16 - 00;38;26;07
Yeah.
00;38;26;07 - 00;38;27;11
Yeah, yeah.
00;38;27;11 - 00;38;29;29
No, that's well said.
00;38;29;29 - 00;38;33;04
Moving away from, from, you know,
the government and the regulators here,
00;38;33;10 - 00;38;35;26
if we think about the role
that that organizations play
00;38;35;26 - 00;38;38;02
and you know,
call it call it companies here,
00;38;39;07 - 00;38;40;29
what would be
00;38;40;29 - 00;38;46;09
your best advice for, you know,
if you were brought in to a boardroom of,
00;38;46;09 - 00;38;49;18
you know, a non,
00;38;49;18 - 00;38;52;24
whatever you want to call them, non-Magnificent
Seven tech company.
00;38;52;24 - 00;38;56;06
But just, you know, a company
that sits in the middle of the economy
00;38;56;12 - 00;38;59;12
is looking at
AI, is looking at kind of future tech
00;38;59;20 - 00;39;02;22
wants to know if it can help them
or hurt them, what they need to know.
00;39;02;23 - 00;39;05;23
What's kind of your best advice for what
they should be thinking about?
00;39;05;27 - 00;39;09;01
So my second interview
ever as a tech reporter
00;39;09;09 - 00;39;12;29
in the mid-1990s was with this guy
Paul Saffo, who is a prognosticator,
00;39;13;06 - 00;39;17;17
a futurist is kind of the more
common word for it.
00;39;17;22 - 00;39;18;17
And he said
00;39;19;25 - 00;39;22;22
it wasn't his idea,
but he's the one who shared it with me.
00;39;22;22 - 00;39;26;09
We tend to overestimate the short
term impact,
00;39;27;01 - 00;39;30;20
of a new technology
and underestimate the long-term impact.
00;39;30;20 - 00;39;32;27
And, you know, the internet was a perfect
example of that.
00;39;32;27 - 00;39;37;03
Like all these startups
thought they were going to get rich overnight.
00;39;37;12 - 00;39;41;11
And, you know,
every company had to embrace the internet
00;39;41;11 - 00;39;42;07
or they were going to die.
00;39;42;07 - 00;39;45;10
Well, you know, most of those companies
that thought they were going to get rich
00;39;45;10 - 00;39;47;04
overnight went out of business.
00;39;47;04 - 00;39;52;06
But slowly but surely, the internet
seeped into more or less everything.
00;39;52;17 - 00;39;55;26
And so, you know, it probably
took, I'm going to say, 15 years,
00;39;56;14 - 00;39;58;04
before it was central to everything.
00;39;58;04 - 00;40;00;23
And I think the same thing
is going to happen with AI.
00;40;00;23 - 00;40;02;29
So back to the boardroom.
00;40;02;29 - 00;40;04;06
You know, I'd be saying
00;40;05;18 - 00;40;07;04
this stuff is real.
00;40;07;04 - 00;40;08;14
It's coming.
00;40;08;14 - 00;40;12;20
Obviously there are tools that individual
employees can use, large language models,
00;40;12;24 - 00;40;17;07
you know, marketing team, you know,
kind of its graphics ability, and all.
00;40;17;07 - 00;40;18;14
And you should be using that.
00;40;18;14 - 00;40;22;10
In fact, I would encourage
everyone in, in an organization to use it
00;40;22;18 - 00;40;26;06
because our construct
has been a little bit
00;40;26;06 - 00;40;32;03
wrong: in the short to medium term, it's
not going to be AI that replaces humans.
00;40;32;03 - 00;40;33;18
There'll be some exceptions to that,
00;40;33;18 - 00;40;37;23
but it's going to be humans who use AI
who get the best of humans who don't.
00;40;37;29 - 00;40;40;22
This is a powerful tool.
I use it as an editor.
00;40;40;22 - 00;40;43;15
I use it as a research assistant.
I'm faster.
00;40;43;15 - 00;40;44;27
I think I'm better.
00;40;44;27 - 00;40;45;11
Because of it.
00;40;45;11 - 00;40;47;20
So I'd be encouraging
their individual employees.
00;40;47;20 - 00;40;52;14
But as far as their organization
changing its processes and stuff,
00;40;52;24 - 00;40;55;26
I would encourage them to experiment,
to be thinking about it,
00;40;56;07 - 00;41;00;14
to maybe put a pilot project here,
but it's not there yet.
00;41;00;14 - 00;41;03;14
I mean, this is 2025
is supposed to be the year of the agents.
00;41;03;29 - 00;41;06;28
And just like, well, first off,
they don't have much of a memory.
00;41;06;28 - 00;41;11;00
So there's this personal agent
that's going to make it as if we're all wealthy
00;41;11;00 - 00;41;14;00
people with a personal assistant
who knows all our preferences
00;41;14;08 - 00;41;18;07
and stuff, but it's having a hard time
remembering us from session to session.
00;41;18;07 - 00;41;22;13
So they're not a good personal assistant
in that sense.
00;41;22;13 - 00;41;27;20
And, you know, there's interesting things
I see companies doing with
00;41;28;21 - 00;41;29;15
these
00;41;29;15 - 00;41;32;27
AI agents, but it's a work in progress.
00;41;32;27 - 00;41;34;26
So not quite there yet.
00;41;34;26 - 00;41;38;22
So, you know, bottom line,
like anticipate it,
00;41;39;03 - 00;41;42;03
play with it,
figure out how it might help.
00;41;42;06 - 00;41;46;15
But, you know, right now,
there's also the exponential factor.
00;41;46;15 - 00;41;49;06
Like these things have been getting like
00;41;49;06 - 00;41;52;19
ten times
better more or less every year and stuff.
00;41;52;19 - 00;41;56;04
So what we have today, we could presume
that, like what we have in
00;41;56;04 - 00;41;59;04
ten years
will be more powerful, different.
00;41;59;07 - 00;42;01;09
And, at the same time,
00;42;01;09 - 00;42;03;22
there's this whole thrust ever since DeepSeek,
like,
00;42;03;22 - 00;42;05;27
wait, maybe we're going to have smaller
models and stuff.
00;42;05;27 - 00;42;10;21
So, you know, this stuff is moving so fast
and there's still like big changes.
00;42;10;28 - 00;42;14;29
You know, I think a corporation's job,
a business leader's,
00;42;15;02 - 00;42;16;17
or, for that matter, an educator's,
00;42;16;17 - 00;42;19;12
anyone running a big organization,
is to be paying attention
00;42;19;12 - 00;42;20;10
to this stuff.
00;42;20;10 - 00;42;23;24
Find ways you can use it to help
individual employees.
00;42;24;14 - 00;42;27;18
But I'm not sure I would turn
the company upside down yet.
00;42;28;27 - 00;42;30;08
I don't, and
00;42;30;08 - 00;42;34;19
I like that,
and it's, you know, clear advice.
00;42;34;19 - 00;42;38;10
And if I can hone
in on a part of it, going back
00;42;38;10 - 00;42;41;17
to what you said about the startups and,
you know, kind of the, the,
00;42;42;23 - 00;42;47;04
you know, the web bubble
that we had, you know, 25 years ago.
00;42;47;11 - 00;42;50;27
It sounds like you've got a, you know,
kind of a healthy skepticism of like
00;42;52;01 - 00;42;55;11
companies that are thinking, oh, well, I'm
just going to uproot everything
00;42;55;11 - 00;42;55;24
I'm doing.
00;42;55;24 - 00;42;58;27
and we're an AI company
now. That's not...
00;42;58;27 - 00;43;00;22
That's not the advice you're giving.
00;43;00;22 - 00;43;03;19
Right. So, you know,
00;43;03;19 - 00;43;06;15
I'm going to contradict myself,
but then I want to defend it.
00;43;06;15 - 00;43;10;05
I think AI is under hyped and overhyped,
00;43;10;25 - 00;43;13;15
and I think it's overhyped
00;43;13;15 - 00;43;17;19
because these startups have taken hundreds
of millions,
00;43;17;19 - 00;43;21;15
billions of dollars,
and they have to overhype it.
00;43;21;19 - 00;43;23;19
They have to tell you
that artificial general
00;43;23;19 - 00;43;27;07
intelligence is just around the corner
to justify the huge amount of money
00;43;27;17 - 00;43;31;04
they've taken. Microsoft, Google,
you know, Amazon, etc.
00;43;31;07 - 00;43;35;02
Facebook, Meta,
you know, they're putting, you know, tens
00;43;35;02 - 00;43;39;21
of billions of dollars,
you know, into AI.
00;43;39;22 - 00;43;44;00
So, of course,
they want to recoup their investment.
00;43;44;05 - 00;43;47;08
So they're overhyping this like, hey,
you know, using Copilot
00;43;47;08 - 00;43;50;09
is going to change your life
tomorrow, you know, kind of thing.
00;43;51;09 - 00;43;52;02
And so I think
00;43;52;02 - 00;43;55;02
the individual companies, startups
and big tech alike,
00;43;55;27 - 00;43;59;21
the venture capitalists behind them,
you know, they're overhyping it.
00;44;00;07 - 00;44;03;01
But I do think it's underhyped
in this sense:
00;44;03;01 - 00;44;07;08
I said before that we tend
to underestimate the long term impact.
00;44;07;18 - 00;44;09;13
You know, I,
00;44;09;13 - 00;44;12;11
I think everything
00;44;12;11 - 00;44;16;16
business, education and health care
00;44;17;22 - 00;44;19;28
consumers in their personal relationships,
00;44;19;28 - 00;44;24;22
I think everything is going
to be profoundly impacted, by AI.
00;44;24;26 - 00;44;28;05
And again, I don't know if that's
five years from now, ten years from now,
00;44;28;05 - 00;44;33;01
15 years from now, but at some point
in the next decade or so, you know,
00;44;33;01 - 00;44;36;07
the world will look very different,
the same way the world looked different,
00;44;37;01 - 00;44;41;04
after the internet, the same way
the world looked different after mobile.
00;44;42;00 - 00;44;45;23
You know, I think we're going to see
the same impact, if not a greater impact.
00;44;45;23 - 00;44;47;11
I mean, there's this sense that,
00;44;47;11 - 00;44;52;10
you know, kind of the semiconductor
plus personal computer,
00;44;52;20 - 00;44;58;02
plus the internet, plus the cloud,
plus the phone, you know, all of that
00;44;59;04 - 00;45;02;22
is undergirding AI;
all of that is going to be used in AI.
00;45;03;03 - 00;45;05;26
And so it's almost a culmination,
00;45;05;26 - 00;45;08;28
you know, kind of the computer revolution
or whatever you want to call it,
00;45;08;28 - 00;45;14;03
since, you know, over the last 50,
70 years. AI is the culmination.
00;45;14;03 - 00;45;18;15
In fact, you know, Bill Gates
has a fun quote about this.
00;45;18;15 - 00;45;19;01
And, you know, that's
00;45;19;01 - 00;45;23;06
been the Holy grail for him
since he first touched a computer, right?
00;45;23;06 - 00;45;29;06
Like, again, the power of Hollywood,
that power of science fiction and media,
00;45;29;06 - 00;45;33;18
this idea that, you know, like Star Trek,
we could talk to the computer.
00;45;33;18 - 00;45;36;05
For me, it was like,
what was it called, Lost in Space?
00;45;36;05 - 00;45;40;02
That we'd all have a robot helper
who had all the information in the world
00;45;40;02 - 00;45;42;12
that was super powerful
and could protect us.
00;45;42;12 - 00;45;45;21
And I do think that's been the dream.
00;45;46;06 - 00;45;50;15
And I do think, like,
we're getting pretty close to that dream.
00;45;50;26 - 00;45;53;12
Yeah, well, I think that's well said.
00;45;53;12 - 00;45;53;29
And it is,
00;45;55;13 - 00;45;57;26
you know, it
makes me reflect on the fact that this is,
00;45;57;26 - 00;46;00;16
this is the continuation
or the extension of a thread
00;46;00;16 - 00;46;02;08
that's been running for a long time.
Right?
00;46;02;08 - 00;46;04;14
People say AI, and sometimes,
sometimes we talk about AI abstractly
in a way
00;46;06;29 - 00;46;08;18
where it's like,
okay, what are you actually talking about
00;46;08;18 - 00;46;12;26
when you say AI because, you know, beyond
like the narrow scope of gen
00;46;12;28 - 00;46;17;21
AI, and maybe even with it, you know,
AI, it's, it's an algorithm, right?
00;46;17;21 - 00;46;21;25
Like, it's just, it's taking inputs,
transforming them into outputs
00;46;21;25 - 00;46;25;01
and in some way, you know, creating
something new or doing something new.
00;46;25;01 - 00;46;27;14
And it's like, well, that's that's
what we've been doing all along, right?
00;46;27;14 - 00;46;29;19
Like that's
what computers were meant to do.
00;46;29;19 - 00;46;34;20
We've just gotten to a point
now where, like, something has tipped over
00;46;34;26 - 00;46;36;14
and what it's capable of doing,
00;46;36;14 - 00;46;39;21
and whether it's agentic
or whether it's generative, you know,
00;46;39;27 - 00;46;42;29
but it's still the same,
the same thread.
00;46;43;06 - 00;46;44;18
If you ask me for surprises:
00;46;44;18 - 00;46;50;23
Another surprise is AI has been with us
for a long, long, long time.
00;46;50;24 - 00;46;57;05
And the term artificial intelligence
was first coined in the 1950s.
00;46;57;15 - 00;47;01;08
And, you know, there have been people
who said, like, hey, you know,
00;47;02;14 - 00;47;04;24
useful AI is just around the corner.
00;47;04;24 - 00;47;09;01
It's, you know, it's it's been a decade,
a decade away for like 70 years.
00;47;09;01 - 00;47;11;05
And we're finally around that corner.
00;47;11;05 - 00;47;14;05
We're now finally
where, you know, people can use it.
00;47;14;11 - 00;47;16;16
But the truth is
00;47;16;16 - 00;47;19;27
Google has been using AI, in search.
00;47;19;27 - 00;47;21;08
You know, it's had AI to help
00;47;21;08 - 00;47;24;17
translate garbled searches,
you know, spelling mistakes and such.
00;47;24;17 - 00;47;29;00
They've been using AI since the 2000s
to more efficiently deliver ads.
00;47;29;11 - 00;47;32;18
You know, in the 2010s,
I'll stick with Google, Google Translate.
00;47;32;18 - 00;47;33;22
That's AI.
00;47;33;22 - 00;47;37;21
And, you know, I think 2015
is when Google Translate came out.
00;47;37;21 - 00;47;38;16
So, you know,
00;47;38;16 - 00;47;43;00
billions of people have been using
AI without knowing they're using AI.
00;47;43;00 - 00;47;44;11
You know, suggestion engines.
00;47;44;11 - 00;47;49;05
You go to Netflix, you know,
you get a recommendation
00;47;49;05 - 00;47;52;24
of movies
you might like. You know, that's AI.
00;47;52;25 - 00;47;55;10
So we've been using AI for a long time.
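The suggestion engines Gary mentions can be sketched in a few lines: a minimal item-to-item recommender that scores movies by cosine similarity over user ratings. This is only an illustration of the idea, not any platform's actual system, and every name and rating below is invented.

```python
import math

# Hypothetical ratings, user -> {movie: rating}. All data invented for illustration.
ratings = {
    "alice": {"Heat": 5, "Alien": 4, "Up": 1},
    "bob":   {"Heat": 4, "Alien": 5, "Up": 2},
    "carol": {"Heat": 1, "Alien": 2, "Up": 5},
}

def movie_vector(movie):
    """Represent a movie as the vector of ratings users gave it (0 if unrated)."""
    return [ratings[u].get(movie, 0) for u in sorted(ratings)]

def cosine(a, b):
    """Cosine similarity between two rating vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def recommend(liked_movie):
    """Rank the other movies by how similarly users rated them."""
    movies = {m for r in ratings.values() for m in r}
    scored = [(cosine(movie_vector(liked_movie), movie_vector(m)), m)
              for m in movies if m != liked_movie]
    return [m for _, m in sorted(scored, reverse=True)]

print(recommend("Heat"))  # most similar movie first
```

Real recommenders layer learned embeddings and behavioral signals on top, but the core move, comparing items through the users who rated them, is the same.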
00;47;55;10 - 00;47;58;23
I think the difference with generative AI,
the difference with ChatGPT is
00;47;58;23 - 00;47;59;28
we could talk to it.
00;47;59;28 - 00;48;03;09
It wasn't this thing behind the scenes,
that you had
00;48;03;09 - 00;48;06;28
to kind of know how it works and to know
that was artificial intelligence.
00;48;07;10 - 00;48;10;00
Now, you could speak with it
and, you know, it's funny, I,
00;48;10;00 - 00;48;13;00
I kind of was a skeptic
in the second half of the
00;48;13;01 - 00;48;16;09
90s where, okay,
this stuff is really interesting.
00;48;16;09 - 00;48;20;22
I was, you know, I used email, I love,
you know, browsing things on the internet.
00;48;20;22 - 00;48;25;23
But, you know, these dot coms seem
kind of preposterous, to me.
00;48;25;23 - 00;48;27;18
And some of the claims that were being made.
00;48;27;18 - 00;48;30;17
So, you know, at the end of 2022,
I started using ChatGPT.
00;48;31;13 - 00;48;32;26
I'm a I'm a journalist.
00;48;32;26 - 00;48;35;16
I was, you know, a reporter
who covered this stuff.
00;48;35;16 - 00;48;41;03
I'm used to tech companies overstating the
potential of something. But
00;48;42;02 - 00;48;44;16
first off, OpenAI, there was no press
release.
00;48;44;16 - 00;48;49;27
There was no press event. They released
this thing on November 30th, 2022
00;48;50;01 - 00;48;54;00
with a research report, and, you know,
word spread rather quickly
00;48;54;07 - 00;48;54;21
and stuff.
00;48;54;21 - 00;48;58;06
But, you know, the first time I used this,
like I was ready to be unimpressed.
00;48;58;15 - 00;49;01;07
And, you know, it was magical.
It was cool.
00;49;01;07 - 00;49;02;22
It was fun.
00;49;02;22 - 00;49;04;05
It was like extraordinary.
00;49;04;05 - 00;49;07;25
Like, first thing I did, like,
write me a 5000 word book proposal,
00;49;08;25 - 00;49;12;25
justifying a book about AI
at the end of 2022 and stuff.
00;49;12;25 - 00;49;16;21
And, you know, two things: it's far better
read than I am, has a far better memory
00;49;18;10 - 00;49;19;13
than I do.
00;49;19;13 - 00;49;21;20
But the other thing, it's it was flat.
00;49;21;20 - 00;49;24;20
I mean, I couldn't like, hit,
you know, print and turn it in,
00;49;24;24 - 00;49;26;02
you know, kind of thing.
00;49;26;02 - 00;49;31;03
But it was like, what would have taken,
what ended up taking me a week, you know,
00;49;31;03 - 00;49;35;09
it produced a serviceable proposal
within a minute.
00;49;35;27 - 00;49;40;08
And, you know, just like its ability
to write poems, one of my favorite
00;49;40;08 - 00;49;44;25
early uses, and this kind of underscores
that it's original, this generative AI.
00;49;44;25 - 00;49;47;17
It's creating something original.
00;49;47;17 - 00;49;49;29
Explain Marx's
00;49;49;29 - 00;49;53;12
economic theory,
in the form of a Taylor Swift song.
00;49;53;25 - 00;49;55;06
And it was really good.
00;49;55;06 - 00;49;57;23
It was really clever at doing it.
00;49;57;23 - 00;49;59;05
Obviously, that was original.
00;49;59;05 - 00;50;02;19
I don't think it went out and found
that somebody had done that
00;50;03;09 - 00;50;04;25
before and reproduced it.
00;50;04;25 - 00;50;07;11
So, you know, I, I kind of did start with,
00;50;08;20 - 00;50;10;24
this stuff being
00;50;10;24 - 00;50;14;01
magical, you know,
I could be talking to you right now.
00;50;14;01 - 00;50;18;10
I don't understand Italian, but
we could translate my words with my voice.
00;50;18;14 - 00;50;21;26
And I could do this interview in Italian
or Chinese or, you know, just just,
00;50;22;00 - 00;50;24;28
you know, any of a dozen, two
00;50;24;28 - 00;50;28;08
dozen languages, probably by this point,
100-plus languages.
00;50;28;16 - 00;50;32;05
And so, yeah, I do think it's
an extraordinary technology.
00;50;32;05 - 00;50;34;02
That was kind of fun for me
because, again,
00;50;34;02 - 00;50;36;17
you know, as a journalist,
you kind of have ideas in your head.
00;50;36;17 - 00;50;37;27
I think, as a good journalist,
00;50;37;27 - 00;50;42;03
You should be willing to say,
oh, what I had in my head was wrong.
00;50;42;06 - 00;50;42;26
And not just, like,
pursue examples that show, like, oh,
pursue examples that show like, oh,
this thing is, you know, lousy.
00;50;46;20 - 00;50;49;26
So it was fun for me to like,
kind of have my world turned upside down.
00;50;50;13 - 00;50;54;21
From the start, as I was going out
and introducing myself to founders,
00;50;54;21 - 00;50;58;12
venture capitalists, you know,
those trying to cash in on this AI moment.
00;50;59;20 - 00;51;00;06
Yeah.
00;51;00;06 - 00;51;03;06
No, it's it's it's super, super neat.
00;51;03;11 - 00;51;05;10
And yeah, I think it's fair.
00;51;05;10 - 00;51;08;10
I think it's,
it came as a surprise to all of us.
00;51;08;17 - 00;51;13;09
I do, Gary, want to slightly switch gears
and go back to the human
00;51;13;09 - 00;51;17;23
in the loop and, you know, 2025
and maybe we can talk about AI
00;51;17;23 - 00;51;18;27
or maybe not, but
00;51;18;27 - 00;51;22;18
what I, what I wanted to ask you about is,
I think about six years ago,
00;51;22;18 - 00;51;27;08
you wrote a book series called
Masters at Work, on a few different,
00;51;27;08 - 00;51;31;14
You know, what you considered really
interesting, important roles at the time.
00;51;31;14 - 00;51;34;10
You did venture capitalists,
you did cybersecurity.
00;51;34;10 - 00;51;36;09
You did sports agent. Right.
00;51;36;09 - 00;51;39;13
If if you were to do that again
in 2025 or,
00;51;39;13 - 00;51;42;14
you know, 2026,
has that changed for you?
00;51;42;14 - 00;51;44;21
Are there new roles that you found
in kind of your research
00;51;44;21 - 00;51;46;13
and what you've been pressing up against
00;51;46;13 - 00;51;49;13
that you think are going to be really,
really interesting in the next few years?
00;51;49;16 - 00;51;51;00
So that was part of a bigger series.
00;51;51;00 - 00;51;54;00
And those are the three I did.
There were like 20 or so
00;51;54;19 - 00;51;58;01
professions.
You know, I'll answer it differently.
00;51;58;01 - 00;52;01;17
Like I think something
that's so interesting is,
00;52;01;29 - 00;52;06;06
you know, what's going to
be the impact of AI on jobs.
00;52;07;13 - 00;52;10;05
Already we're seeing AI,
00;52;10;05 - 00;52;12;10
being very productive,
00;52;12;10 - 00;52;17;10
in customer service, centers,
you know, this idea that, you know,
00;52;17;11 - 00;52;21;18
the AI could do a perfectly good job
of answering basic questions, like, hey,
00;52;21;21 - 00;52;25;07
I need to reset my,
you know, password, kind of thing.
00;52;25;07 - 00;52;28;22
So, you know, there's millions
of people who work in customer service.
00;52;29;02 - 00;52;32;02
It's going to take a big chunk
out of that.
00;52;32;17 - 00;52;35;17
One of the bigger ones:
there's 8 to 10 million people
00;52;36;04 - 00;52;39;12
who make a living in this country,
as drivers, long haul
00;52;39;12 - 00;52;42;11
drivers, Uber drivers, taxi drivers,
local delivery.
00;52;42;16 - 00;52;45;06
You know,
how far away are autonomous vehicles?
00;52;45;06 - 00;52;47;11
I mean,
I was just out in San Francisco and,
00;52;47;11 - 00;52;50;07
you know, there's,
you know, robo taxis everywhere.
00;52;50;07 - 00;52;52;14
And it's, you know, it's working there
in a few different cities.
00;52;52;14 - 00;52;56;03
So I can't predict if it's two,
five, ten years out.
00;52;56;10 - 00;52;59;10
But at some point in the foreseeable
future,
00;52;59;11 - 00;53;04;10
you know, all those jobs as drivers,
or most of those, will be gone.
00;53;04;10 - 00;53;07;29
Most of the delivery jobs will carry on,
unless we have a robot in the car,
00;53;07;29 - 00;53;10;06
you know,
bringing in the stuff they're delivering.
00;53;11;07 - 00;53;12;20
And so, you know,
00;53;12;20 - 00;53;15;20
there'll be job categories
that are decimated.
00;53;16;06 - 00;53;21;01
But there will be these
new categories that we can't yet understand.
00;53;21;06 - 00;53;24;06
One of the obvious ones, right, right now
is prompt engineer.
00;53;24;10 - 00;53;29;02
You know, something
that I think people don't,
00;53;29;02 - 00;53;33;24
don't understand,
is that you're still the creative.
00;53;34;02 - 00;53;39;13
You're still the one who needs
to find the use in this thing.
00;53;39;23 - 00;53;40;10
You know, you can't
00;53;40;10 - 00;53;42;22
just, like, say,
tell me something interesting.
00;53;42;22 - 00;53;47;07
You know, when I use it,
I use my knowledge,
00;53;47;07 - 00;53;51;12
and, you know, I've done a little bit of
research, and I can guide it and stuff.
00;53;51;19 - 00;53;55;20
And so prompt engineer is going to be
or is a job.
00;53;55;20 - 00;53;58;18
But there's all these job categories
that we can't imagine
00;53;58;18 - 00;54;00;05
Maybe the example I use:
00;54;00;05 - 00;54;03;05
It's, you know,
Gutenberg comes up with the printing press
00;54;03;12 - 00;54;04;29
and, you know,
we could worry about like, oh,
00;54;04;29 - 00;54;08;06
what are all these monks going to do,
how they can earn a living kind of thing.
00;54;09;03 - 00;54;11;25
It gave rise to the publishing industry.
00;54;11;25 - 00;54;16;10
It gave rise to newspapers and magazines,
like there's millions of people, you
00;54;16;10 - 00;54;21;03
know, who have jobs based on essentially
this technology in the printing press.
00;54;21;12 - 00;54;24;14
And so I can't help but think that
00;54;24;28 - 00;54;28;15
AI is going to create job categories
that our brains
00;54;28;15 - 00;54;32;29
can't quite understand,
because we need to see how it plays out.
00;54;32;29 - 00;54;35;24
I do fear that AI is going to be,
00;54;36;26 - 00;54;38;23
is going to lead to
00;54;38;23 - 00;54;42;13
the destruction of more jobs
than the creation of new jobs.
00;54;42;17 - 00;54;43;28
But I could be wrong.
00;54;43;28 - 00;54;46;29
I think I said the same thing
in the mid 1990s.
00;54;47;11 - 00;54;49;01
About the internet. Right?
00;54;49;01 - 00;54;52;00
I mean, you need less people
at the Amazon bookstore
00;54;52;05 - 00;54;54;24
than you would at the local, bookstore.
00;54;54;24 - 00;54;58;11
So in my head like, oh my goodness,
the internet is going to destroy
00;54;58;24 - 00;54;59;22
all these jobs.
00;54;59;22 - 00;55;03;02
But, you know,
we have 4% unemployment right now.
00;55;03;02 - 00;55;05;21
So somehow we've figured that out.
00;55;05;21 - 00;55;08;07
But, you know, I, I am scared of that.
00;55;08;07 - 00;55;11;07
And it actually happens to be something
I'm writing about and researching
00;55;11;09 - 00;55;14;15
now because I,
I don't know the answer in my head.
00;55;14;15 - 00;55;18;14
Again, I fear it's going to be, largely,
a big net negative.
00;55;19;07 - 00;55;22;07
But I've had that fear
before and have been wrong.
00;55;22;20 - 00;55;24;22
Well, so can you tell me a little bit more
about that?
00;55;24;22 - 00;55;27;17
What? Your, your next project
and what you're looking into?
00;55;28;20 - 00;55;29;13
I mean,
00;55;29;13 - 00;55;32;04
this is just, this is just an article
00;55;32;04 - 00;55;35;04
I'm working on for the New York Times.
00;55;35;15 - 00;55;38;15
But, you know, I,
you know, if you look at my career,
00;55;39;20 - 00;55;43;13
I've written about tech
or I've written about,
00;55;44;11 - 00;55;48;17
politics, urban issues,
and of course, what's been happening
00;55;48;17 - 00;55;52;20
the last year or so is,
you know, tech is eating politics.
00;55;52;20 - 00;55;54;05
Tech is at the center of politics.
00;55;54;05 - 00;55;57;21
I would argue that tech is the largest
00;55;58;05 - 00;56;01;01
special interest, corporate interest, whatever you want to call it,
but whatever you want to call it,
00;56;01;01 - 00;56;05;01
like in the history of politics,
like in the old days, you know,
00;56;06;14 - 00;56;09;14
the rich moguls would write big checks,
00;56;09;28 - 00;56;14;21
and, you know, kind of have their
politicians that they treated as puppets.
00;56;14;28 - 00;56;15;26
But this is different.
00;56;15;26 - 00;56;19;20
First off, you know,
look at Elon Musk, 250 million or so
00;56;19;27 - 00;56;23;27
the crypto industry
put like $200 million into the 2024
00;56;25;00 - 00;56;25;22
election.
00;56;25;22 - 00;56;29;26
They're writing much, much,
much, much, much bigger checks.
00;56;29;26 - 00;56;32;21
And so somehow I wonder if there's,
00;56;32;21 - 00;56;33;23
a book, a book there.
00;56;33;23 - 00;56;35;15
The billionaire boys' club. I did,
00;56;35;15 - 00;56;37;09
Gary, want to talk to you
about something else outside
00;56;37;09 - 00;56;41;04
of, tech and politics
that you kind of touched on earlier,
00;56;41;20 - 00;56;45;05
but both in your career and with, like
the Gutenberg press example, which is,
00;56;45;24 - 00;56;49;26
journalism
and how journalism fits into all of this
00;56;49;26 - 00;56;54;16
and what AI means for journalism
and journalists.
00;56;54;16 - 00;56;56;21
I mean,
I don't think I'm being controversial
00;56;56;21 - 00;56;59;21
by saying it's
in a pretty precarious place right now.
00;57;00;10 - 00;57;03;10
And has seen,
00;57;03;18 - 00;57;06;18
in my eyes, like a rapid and depressing,
00;57;06;20 - 00;57;10;06
you know, decline in the past
decade or more.
00;57;11;06 - 00;57;13;03
What does all of this mean for journalism?
00;57;13;03 - 00;57;16;04
And if you had, like,
well, let me maybe just leave it there.
00;57;16;14 - 00;57;20;09
Yeah. So, this has been going on
00;57;20;19 - 00;57;23;26
long before AI grew front and center.
00;57;23;26 - 00;57;25;08
In fact, you can make the argument
00;57;25;08 - 00;57;28;28
that it was the internet,
you know, the internet.
00;57;29;03 - 00;57;31;28
Craigslist wiped away
classified advertising.
00;57;31;28 - 00;57;35;29
That was a big piece of the revenue
00;57;36;07 - 00;57;39;28
for any, newspaper,
you know, a whole set of things.
00;57;40;01 - 00;57;43;08
You know, I fear AI accelerates that.
00;57;43;08 - 00;57;46;29
I mean, again,
I think AI could be a good tool
00;57;47;08 - 00;57;50;05
for a journalist
to help him or her do her job.
00;57;50;05 - 00;57;52;20
But, you know,
00;57;52;20 - 00;57;56;06
I, I do fear, in fact,
we're already seeing it.
00;57;56;17 - 00;57;59;29
That AI is good enough right now
00;58;00;13 - 00;58;03;18
where it could write a basic news story.
00;58;03;26 - 00;58;04;10
You know,
00;58;04;10 - 00;58;08;00
kind of, one of the coins of the realm
these days is, like, the hot take.
00;58;08;09 - 00;58;12;02
Well, AI could just digest
what's being said about,
00;58;12;11 - 00;58;16;05
you know, this, this event or that,
and you train it.
00;58;16;05 - 00;58;18;21
It could come up with a hot
take much faster.
00;58;18;21 - 00;58;23;11
And many, many, many more than any
human. Sports Illustrated,
00;58;23;12 - 00;58;24;23
They got in trouble for it.
00;58;24;23 - 00;58;28;08
But they created
these bots, even gave them
00;58;28;08 - 00;58;31;20
names and backstories,
as if they were real humans.
00;58;31;20 - 00;58;37;00
They were deceptive about it,
but they were using them to write stories.
00;58;37;07 - 00;58;38;18
CNet the same thing.
00;58;38;18 - 00;58;43;00
They they did it, but they, they stopped
because they found that there, you know,
00;58;43;02 - 00;58;47;06
half the stories or more, had errors,
some of them serious errors.
00;58;47;06 - 00;58;52;21
So again, I come back to this idea like,
okay, we could use AI bots,
00;58;52;21 - 00;58;56;24
though I think it's dangerous,
to create a rough draft of a story,
00;58;57;07 - 00;59;00;26
but there has to be humans in the loop
who are skeptical,
00;59;01;01 - 00;59;02;26
who are like, wait,
what's the source for this?
00;59;02;26 - 00;59;06;16
When I use, you know, a chat
bot to do research,
00;59;06;23 - 00;59;09;20
you know, I, I use one of them
like Perplexity, as an example,
00;59;09;20 - 00;59;14;10
that footnotes its sources, because, you know,
I'm not going to trust anything it says.
00;59;14;12 - 00;59;17;13
Hey, most of what it says is right,
but most of what it says is
00;59;17;13 - 00;59;18;12
right is not good enough.
00;59;19;11 - 00;59;21;07
So I always want to go, oh, okay.
00;59;21;07 - 00;59;26;07
That quote, here's a CNBC article
that had that quote in it.
00;59;26;14 - 00;59;30;11
And so, you know, it can be a tool, but,
you know, I'm scared that,
00;59;30;17 - 00;59;33;17
you know, with all the pressure on,
00;59;34;06 - 00;59;37;08
news outlets to cut costs
because we're getting,
00;59;37;08 - 00;59;41;20
you know, they're getting less revenue
from subscriptions, less revenue
00;59;41;20 - 00;59;44;20
from classified ads,
less revenue for advertising.
00;59;45;21 - 00;59;50;14
That their solution is going to be,
you know, kind of reporter bots and all.
00;59;50;14 - 00;59;52;09
But, you know,
00;59;52;09 - 00;59;54;21
who's going to go to the meeting,
00;59;54;21 - 00;59;57;05
the city council meeting
and get the quotes.
00;59;57;05 - 01;00;01;09
I mean, I still have to think
that needs to be journalists
01;00;01;09 - 01;00;04;24
out there doing this shoe
leather reporting, and all.
01;00;04;24 - 01;00;09;15
But, you know, big companies are always
looking for ways to, to save money.
01;00;09;17 - 01;00;14;23
You know, maybe we'll have reporters, but
the editors and copy editors, they'll be gone.
01;00;15;01 - 01;00;19;07
So I, I think there'll be a profound
impact from AI.
01;00;19;07 - 01;00;20;22
In fact, one more data point.
01;00;20;22 - 01;00;25;27
And again, this isn't AI, this is before AI.
You know, newspapers and magazines
01;00;25;27 - 01;00;31;08
in the US were bringing in like $50 billion
in advertising collectively in 2000.
01;00;31;16 - 01;00;34;00
But Google and Facebook, now Meta.
01;00;34;00 - 01;00;34;11
You know,
01;00;34;11 - 01;00;35;05
they've eaten
01;00;35;05 - 01;00;38;11
huge portions of that to the point
where it's now closer to 10 billion
01;00;38;11 - 01;00;39;17
rather than 50 billion.
01;00;39;17 - 01;00;42;14
That's a lot of money
that they're no longer getting.
01;00;42;14 - 01;00;43;29
And so how do you make it work?
01;00;43;29 - 01;00;47;14
How do you, as a newspaper, magazine
or publication, make it work?
01;00;47;25 - 01;00;50;18
You know, AI is a way you can save money,
01;00;52;11 - 01;00;54;21
right? And
01;00;54;21 - 01;00;55;20
it makes sense.
01;00;55;20 - 01;00;58;02
And I really like,
you know, your take on it
01;00;58;02 - 01;00;59;01
and what you covered
01;00;59;01 - 01;01;01;09
but there's one, like, specific piece
that I want to dig
01;01;01;09 - 01;01;04;27
in a little bit deeper, which is
specifically investigative journalism,
01;01;05;04 - 01;01;08;07
which is something that
that you've focused on in your career.
01;01;08;15 - 01;01;13;19
And to me it's, I don't know,
maybe the piece that's least
01;01;13;19 - 01;01;17;22
replicable by AI, either
because they can't do it or because,
01;01;17;22 - 01;01;20;24
you know, the news outlets of the future
don't necessarily want them to do it.
01;01;21;06 - 01;01;24;28
Is that like, what is the future
of investigative journalism
01;01;24;28 - 01;01;27;03
and is it in trouble in your eyes?
01;01;27;03 - 01;01;27;24
Well, investigative
01;01;27;24 - 01;01;31;17
journalism is in trouble generally
because it's very expensive, right?
01;01;31;17 - 01;01;33;02
Yeah.
01;01;33;02 - 01;01;37;09
I could write two news stories a day
if I just go to a city
01;01;37;09 - 01;01;40;25
council meeting, or I take a press release,
make a couple of calls and crank it out.
01;01;40;25 - 01;01;43;29
You know, I've,
I've spent months on investigative
01;01;45;00 - 01;01;45;29
pieces.
01;01;45;29 - 01;01;48;29
So investigative reporting is expensive.
01;01;49;01 - 01;01;51;11
Generally,
there's been kind of a nonprofit model,
01;01;51;11 - 01;01;54;11
you know, a model where, like,
oh, ProPublica.
01;01;54;16 - 01;01;58;06
You know, we'll have an outlet
that does investigative journalism.
01;01;58;06 - 01;01;59;27
That's their thing. And they're, you know,
01;01;59;27 - 01;02;03;09
funded by foundations and such and,
you know, users
01;02;04;07 - 01;02;06;14
who appreciate their, their work.
01;02;06;14 - 01;02;10;21
You know, I, I so I don't think
that's reversing anytime soon.
01;02;10;21 - 01;02;14;13
In fact,
I fear that it's going to be, you know,
01;02;14;13 - 01;02;19;06
less and less what media outlets are able
to, invest in.
01;02;19;06 - 01;02;22;15
But, you know, to kind of express
an optimistic note for a moment like,
01;02;22;28 - 01;02;25;15
hey, AI could be an amazing tool.
01;02;25;15 - 01;02;29;11
One thing that AI is so good at
is going through vast
01;02;29;27 - 01;02;33;09
sets of data
and finding connections that no human,
01;02;33;21 - 01;02;37;25
you know, could see. And so,
you know, the SEC,
01;02;37;25 - 01;02;41;23
you know, there's a gazillion filings
from publicly traded companies.
01;02;42;00 - 01;02;45;20
You could train
AI to look for patterns like, you know,
01;02;45;28 - 01;02;50;20
look for times where there was big news
announced by a company
01;02;50;20 - 01;02;53;25
who inside the company sold stock
ahead of that, you know,
01;02;53;25 - 01;02;56;25
just to use a quick example
off the top of my head.
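The filing-scan Gary sketches off the top of his head can be prototyped without any machine learning at all: join insider-sale records to news announcements by company and flag sales that land within a window before the news. Every ticker, name, and date below is invented for illustration; a real version would pull actual SEC filing data.

```python
from datetime import date

# Invented data for illustration only: insider sale filings and news announcements.
sales = [
    {"company": "ACME", "insider": "J. Doe", "date": date(2024, 3, 1)},
    {"company": "ACME", "insider": "R. Roe", "date": date(2024, 6, 20)},
    {"company": "ZORG", "insider": "A. Poe", "date": date(2024, 5, 5)},
]
news = [
    {"company": "ACME", "headline": "ACME misses earnings badly", "date": date(2024, 3, 8)},
    {"company": "ZORG", "headline": "ZORG wins major contract", "date": date(2024, 9, 1)},
]

def flag_suspicious(sales, news, window_days=14):
    """Flag sales that occurred within `window_days` before a big announcement
    by the same company -- candidates for a human reporter to dig into."""
    flagged = []
    for sale in sales:
        for item in news:
            if item["company"] != sale["company"]:
                continue
            gap = (item["date"] - sale["date"]).days  # days from sale to news
            if 0 <= gap <= window_days:
                flagged.append((sale["insider"], item["headline"], gap))
    return flagged

print(flag_suspicious(sales, news))
# J. Doe's sale, seven days before ACME's earnings miss, is the only hit
```

The output is a lead, not a story: exactly as Gary says, it still takes a human to check the context and decide whether the pattern means anything.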
01;02;56;27 - 01;03;00;08
So, you know,
AI could be an extraordinary tool again,
01;03;01;00 - 01;03;04;08
as long as there's a writer
or a set of writers
01;03;04;14 - 01;03;07;07
who are using that tool,
because it's the human instinct
01;03;07;07 - 01;03;08;27
that's going to understand, like
01;03;09;27 - 01;03;12;27
where the story is, you know, these
these things that,
01;03;13;02 - 01;03;16;15
you know,
we're at GPT-4.5, you know, right now.
01;03;16;18 - 01;03;20;18
So I don't know what GPT-8 or 9 or 10
or 12 is going to be like.
01;03;20;18 - 01;03;24;22
But these things cannot write
a good investigative story.
01;03;24;24 - 01;03;27;07
These things could not write
a good magazine story.
01;03;27;07 - 01;03;29;04
These things can't write a book.
01;03;29;04 - 01;03;31;10
And they seem a long, long,
long way away. They're flat.
01;03;31;10 - 01;03;31;27
They're boring.
01;03;31;27 - 01;03;33;15
They're, you know,
01;03;33;15 - 01;03;38;13
as you pointed out before, they're just
mathematical models looking for patterns.
01;03;40;00 - 01;03;42;22
And so it's derivative by definition.
01;03;42;22 - 01;03;45;26
What makes a book good is that it's original.
01;03;45;26 - 01;03;49;06
I would argue that the human toil
and sweat
01;03;49;18 - 01;03;52;17
is what makes it good.
01;03;52;17 - 01;03;56;20
It's the sense,
the approach, of the writer.
01;03;56;27 - 01;03;58;23
But, you know, I, I don't know.
01;03;58;23 - 01;04;01;24
I mean, I do wonder
if these things are getting, like,
01;04;02;07 - 01;04;05;07
ten times better, more or less every year.
01;04;05;09 - 01;04;08;21
Will there ever be a point
where it could write a beautiful novel,
01;04;08;25 - 01;04;12;25
will it ever get to a point where it could,
like, write a really engaging, page
01;04;12;25 - 01;04;17;27
turning work of nonfiction that's told
through characters, like, replacing me?
01;04;18;08 - 01;04;19;21
I, I don't I,
01;04;19;21 - 01;04;20;16
you know, I, I joke,
01;04;20;16 - 01;04;24;06
I'm writing as fast as I can because
I don't know if that's going to happen,
01;04;24;06 - 01;04;26;03
but I definitely think it's possible
it's going to happen.
01;04;27;07 - 01;04;27;11
Yeah.
01;04;27;11 - 01;04;29;24
What do you think? No, it's it's
what do you think about it?
01;04;29;24 - 01;04;31;15
I mean,
01;04;31;15 - 01;04;33;09
I first of all, I was going to say
01;04;33;09 - 01;04;37;17
I really appreciate the note of optimism
there about what it can do.
01;04;37;17 - 01;04;39;07
And, you know, I was I was just,
01;04;39;07 - 01;04;42;28
you know, as the wheels were turning,
as you were talking, I don't know, I think
01;04;44;00 - 01;04;46;15
it wouldn't shock me if at some point
01;04;46;15 - 01;04;51;00
AI could get to writing articles,
if not books, maybe books
01;04;51;00 - 01;04;53;12
The thing is, you've probably seen
this too in your research:
01;04;53;12 - 01;04;58;07
one of the limitations right now
is that the longer you let generative
01;04;58;07 - 01;04;59;12
AI generate,
01;04;59;12 - 01;05;03;04
the more likely it is to go off the rails
and go in some wacky direction.
01;05;03;11 - 01;05;06;11
And the analogy, by the way, that
I kind of thought of for this is like,
01;05;06;28 - 01;05;10;00
I feel like that's true
of like the human body, right?
01;05;10;00 - 01;05;11;20
Like once a human,
01;05;11;20 - 01;05;13;04
you know, turns 100, like things
01;05;13;04 - 01;05;15;19
start going wrong with the body
because it's been running for too long
01;05;15;19 - 01;05;17;12
and all those little things
start to add up.
01;05;17;12 - 01;05;19;07
Generative AI, I guess, is the same way.
01;05;20;10 - 01;05;21;14
So, so long
01;05;21;14 - 01;05;24;24
form I think
is probably safe for a little bit, but
01;05;24;24 - 01;05;28;08
I, I've certainly found in experimenting
with it, Gary, that you can ask it like,
01;05;28;09 - 01;05;33;03
you know, be compelling and in a short way
it can be more compelling and less flat.
01;05;33;15 - 01;05;37;16
But you know, the piece
that is interesting and gives me hope.
01;05;37;16 - 01;05;41;03
There is more on the investigative
journalism piece
01;05;41;11 - 01;05;44;28
and the, the, the tool for it
to be able to make those connections.
01;05;45;11 - 01;05;48;07
And you said it,
01;05;48;07 - 01;05;51;16
but it's really only as good
as the prompter is, right?
01;05;51;16 - 01;05;51;23
Right.
01;05;51;23 - 01;05;55;17
Like, yeah, it
it needs a human to ask the good questions
01;05;55;21 - 01;05;57;15
and give you the answer
to that good question.
01;05;57;15 - 01;05;58;27
And I mean, I tell you anecdotally.
01;05;58;27 - 01;06;01;19
It needs a human to digest it.
I mean, that's right.
01;06;01;19 - 01;06;05;16
It'll give you a vast amount of data
like and it's your job as the writer,
01;06;05;16 - 01;06;06;29
the journalist, the human. Yeah.
01;06;06;29 - 01;06;10;27
You know, to kind of figure out like,
oh, here's here's my story again.
01;06;10;27 - 01;06;12;02
I, I'm always playing with it.
01;06;12;02 - 01;06;15;02
I'm always like,
you know, yeah, here's here's my notes.
01;06;15;18 - 01;06;17;03
Give me a good opening.
01;06;17;03 - 01;06;20;04
I'm, like, basically
0-for-every-time doing that.
01;06;20;04 - 01;06;23;06
But you know, it's like, oh,
but that's an interesting word to use in.
01;06;23;06 - 01;06;23;25
Oh right.
01;06;23;25 - 01;06;25;29
Good place to start.
I hadn't really thought of that.
01;06;25;29 - 01;06;26;09
Yeah.
01;06;26;09 - 01;06;30;23
So again this is you know you know
Microsoft calls their chat bot copilot.
01;06;32;01 - 01;06;33;13
And I think that's perfect.
01;06;33;13 - 01;06;37;16
What I like about that
is the name tells you what it is.
01;06;37;27 - 01;06;39;29
You know, you're flying the plane.
01;06;39;29 - 01;06;41;25
You're the one who knows this,
01;06;41;25 - 01;06;45;22
but you got this really good assistant
right there that could do some work.
01;06;45;22 - 01;06;46;24
You know, I have to go to the bathroom.
01;06;46;24 - 01;06;49;05
Okay, you fly for a minute, but just for,
you know, a little bit.
01;06;49;05 - 01;06;50;24
I'm. I'm landing this thing.
01;06;50;24 - 01;06;53;22
I'm taking it off.
01;06;53;22 - 01;06;56;03
And so, you know, it is your copilot.
01;06;56;03 - 01;06;59;22
And again, whether it's, we're talking
about investigative journalists or,
01;07;00;03 - 01;07;05;18
you know, more or less
any creative professional job,
01;07;05;18 - 01;07;09;27
it really is this amazing tool.
Like reports, right?
01;07;09;27 - 01;07;14;03
It could write a really good,
efficient 20 page report
01;07;14;10 - 01;07;18;06
that reflects more or less
every 20 page report that's written on a,
01;07;18;07 - 01;07;21;09
you know, white-paper-like
thing on a basic subject.
01;07;21;09 - 01;07;25;26
So if that's
the kind of writer I was, two things.
01;07;25;26 - 01;07;28;23
One, I'd be using it. And again,
you're steering it.
01;07;28;23 - 01;07;30;06
You're the prompt engineer.
01;07;30;06 - 01;07;33;27
You're the one who is going to have
to figure out how this is useful.
01;07;34;03 - 01;07;36;19
You need to edit it,
you need to rewrite it.
01;07;36;19 - 01;07;37;06
And then yesterday.
01;07;37;06 - 01;07;40;09
But you know, instead of it
taking two weeks to write that report,
01;07;40;13 - 01;07;41;28
maybe you could do it in 3 or 4 days.
01;07;41;28 - 01;07;43;18
But the other thing
I'd be thinking is like,
01;07;43;18 - 01;07;45;29
I might want to find a different career,
01;07;45;29 - 01;07;49;17
because if my big company has 20 of us
who write reports,
01;07;50;01 - 01;07;53;01
it's not too hard to imagine in the
not so distant
01;07;53;01 - 01;07;56;05
future that instead of 20,
they might have 2 or 4 of us.
01;07;56;17 - 01;07;57;26
Yeah, yeah.
01;07;57;26 - 01;07;59;16
No, that's that, that's well said.
01;07;59;16 - 01;08;01;14
But I do love the,
01;08;01;14 - 01;08;04;28
I just wanted to kind of close that out
with the notion that,
01;08;05;08 - 01;08;06;11
you know, to me,
01;08;06;11 - 01;08;09;15
and I don't think you'll find
this controversial at all, like journalism
01;08;09;15 - 01;08;14;00
and especially investigative
journalism are a public good, right?
01;08;14;00 - 01;08;16;10
Like they do
something of benefit to society.
01;08;16;10 - 01;08;19;09
And I worry about a future
where they're not there.
01;08;19;09 - 01;08;22;28
And you, you've given me hope
that, you know, whether it's that
01;08;22;28 - 01;08;26;29
or some of the other kind of creative
tools for it, they're not going away.
01;08;26;29 - 01;08;30;18
And it can actually be a force
for good as we look at what it can do.
01;08;30;19 - 01;08;31;22
Yeah, I hope so.
01;08;31;22 - 01;08;34;06
But, you know, there
is that other trend line that, you know,
01;08;34;06 - 01;08;35;28
we don't have enough investigative
journalists.
01;08;35;28 - 01;08;39;13
You know, most local newspapers,
01;08;39;23 - 01;08;42;24
you know, have few,
if any, investigative journalists.
01;08;42;24 - 01;08;46;12
But, you know, investigative
journalists have uncovered extraordinary,
01;08;47;07 - 01;08;49;12
things: Watergate,
01;08;49;12 - 01;08;52;20
Jeffrey Epstein,
you know, kind of the list is long.
01;08;52;20 - 01;08;55;28
And I think at this moment, getting back
to a point you made before, like,
01;08;56;09 - 01;08;59;05
you know, Elon
Musk could say what he wants to say
01;08;59;05 - 01;09;02;13
and stuff, but investigative journalists
are the ones who figure out
01;09;02;13 - 01;09;07;07
and the New York Times has done
a great job of this, saying like, wait,
01;09;07;21 - 01;09;11;16
but there's all these investigations
by the federal government, and, say,
01;09;11;16 - 01;09;12;09
how does that work out?
01;09;12;09 - 01;09;17;01
Like, wait, you get contracts, in fact,
with, you know, rural broadband.
01;09;17;01 - 01;09;19;13
Like there's I don't know where it is.
01;09;19;13 - 01;09;22;13
It's not my topic,
but there's talk of Starlink
01;09;22;13 - 01;09;26;02
that's, you know, a Musk product,
01;09;26;14 - 01;09;30;15
you know, of, of Starlink
being brought in, to give us,
01;09;31;26 - 01;09;32;26
rural
01;09;32;26 - 01;09;35;26
broadband, bringing broadband
to rural areas.
01;09;36;02 - 01;09;39;08
Well, not broadband, bringing us,
you know, wireless, bringing Starlink
01;09;39;08 - 01;09;44;23
instead. Like, that could be worth billions
and billions of dollars to Musk.
01;09;44;23 - 01;09;47;01
So, you know, is that what's happening?
01;09;47;01 - 01;09;50;20
I don't know, but I think it's
an investigative journalist role
01;09;51;00 - 01;09;55;01
to unearth that and say, like, okay,
I don't mean to be a cynic here,
01;09;55;01 - 01;09;58;01
but there is
this other set of motivations,
01;09;58;02 - 01;09;59;09
that the public should know about
01;09;59;09 - 01;10;02;12
when they listen to
what Elon Musk says about X or Y.
01;10;03;04 - 01;10;06;27
Well, and if the biggest barrier to that
right now is just that it's cost
01;10;06;27 - 01;10;09;14
prohibitive
to have too many investigative journalists
01;10;09;14 - 01;10;12;02
for a local paper or for any paper,
and we say, well, wait,
01;10;12;02 - 01;10;15;16
I can decrease the cost
of an investigative journalism
01;10;15;16 - 01;10;19;02
piece by,
you know, ten x suddenly, you know,
01;10;19;25 - 01;10;23;14
in my utopian world
that's back on the table.
01;10;23;14 - 01;10;24;25
And that could be really interesting.
01;10;24;25 - 01;10;27;22
You know, ten x might be a little bit
too utopian for my taste here.
01;10;27;22 - 01;10;28;27
Fair enough. But, you know.
01;10;28;27 - 01;10;30;25
I can dream. Yeah.
01;10;33;05 - 01;10;36;03
But, you know,
I, I keep on coming back to this term,
01;10;36;03 - 01;10;40;01
I, I stole it from Reid Hoffman,
you know, kind of superpower that,
01;10;40;01 - 01;10;44;21
you know, you
it does amplify your intelligence.
01;10;44;21 - 01;10;47;11
It does amplify your ability to do things.
01;10;47;11 - 01;10;48;21
And, you know,
01;10;48;21 - 01;10;52;00
I'll just kind of restate something
I said earlier, like investigative
01;10;52;00 - 01;10;55;22
journalists using AI are more powerful
01;10;55;22 - 01;10;58;29
than investigative journalists
who don't use AI.
01;10;59;00 - 01;11;02;09
So I do think that's kind of
01;11;02;10 - 01;11;05;10
not an equalizer,
but I do think it can help.
01;11;05;25 - 01;11;08;14
You know, it can
01;11;08;14 - 01;11;09;10
help preserve
01;11;09;10 - 01;11;12;28
what's a diminishing resource
in our country.
01;11;13;16 - 01;11;14;22
Well said.
01;11;14;22 - 01;11;17;07
Gary, I want to say a big
thank you for today.
01;11;17;07 - 01;11;19;03
This has been really, really fascinating.
01;11;19;03 - 01;11;21;23
A lot of really interesting insights
I'll be taking away.
01;11;21;23 - 01;11;24;21
So I really appreciate you coming on
and the time you spent here today.
01;11;24;21 - 01;11;26;14
Oh my pleasure.
Thanks for having me. This is fun.


The Next Industrial Revolution Is Already Here
Digital Disruption is where leaders and experts share their insights on using technology to build the organizations of the future. As intelligent technologies reshape our lives and our livelihoods, we speak with the thinkers and the doers who will help us predict and harness this disruption.