2
00:00:02,654 --> 00:00:04,656
[grand orchestral fanfare
playing]
4
00:00:27,636 --> 00:00:29,681
♪ ♪
5
00:00:41,389 --> 00:00:43,217
[birds chirping and calling]
6
00:00:48,657 --> 00:00:50,833
-[burbling]
-[trilling]
7
00:00:55,055 --> 00:00:57,057
[VCR clicking and whirring]
8
00:01:00,582 --> 00:01:02,018
[projector clicking]
9
00:01:04,847 --> 00:01:09,286
Now, the present-day computers
are complete morons,
10
00:01:09,330 --> 00:01:12,115
but this will not be true
in another generation.
11
00:01:12,159 --> 00:01:14,857
They will start to think,
and eventually they will
12
00:01:14,900 --> 00:01:17,599
completely outthink
their makers.
13
00:01:17,642 --> 00:01:20,254
Is this depressing?
I don't see why it should be.
14
00:01:20,297 --> 00:01:24,823
I suspect that organic
or biological evolution
15
00:01:24,867 --> 00:01:26,303
has about come to its end.
16
00:01:27,739 --> 00:01:29,524
[Daniel chuckles]
17
00:01:29,567 --> 00:01:30,655
CAROLINE: You don't want me
to be the narrator.
18
00:01:30,699 --> 00:01:32,353
I don't have a good voice.
19
00:01:32,396 --> 00:01:34,006
DANIEL: You have a great voice.
Just do it.
20
00:01:34,050 --> 00:01:36,183
It's just a--
we're just trying it out.
21
00:01:36,226 --> 00:01:37,967
CAROLINE:
Do I get paid for this shit?
22
00:01:38,010 --> 00:01:40,230
DANIEL [laughing]: Well,
we'll see how well you do.
23
00:01:40,274 --> 00:01:41,579
[Caroline laughing]
24
00:01:41,623 --> 00:01:43,451
-♪ ♪
-[dog barks]
25
00:01:46,454 --> 00:01:47,890
CAROLINE:
I need to call my parents.
26
00:01:47,933 --> 00:01:49,065
[engine starts]
27
00:01:49,109 --> 00:01:51,023
Sometimes we rush into things
28
00:01:51,067 --> 00:01:52,851
without thinking them through.
29
00:01:52,895 --> 00:01:54,723
DANIEL: Yeah, can you sh...
you want to shoot? That's...
30
00:01:54,766 --> 00:01:57,204
CAROLINE: Like when Daniel
and Caroline got married
31
00:01:57,247 --> 00:02:00,163
151 days after they met.
32
00:02:00,207 --> 00:02:01,817
Let's rewind.
33
00:02:03,079 --> 00:02:04,776
[applause]
34
00:02:04,820 --> 00:02:07,170
Have you been thinking about
buying a home computer?
35
00:02:07,214 --> 00:02:09,303
CAROLINE:
In 1993, when Daniel was born,
36
00:02:09,346 --> 00:02:12,132
his parents didn't even have
a computer in the house.
37
00:02:12,175 --> 00:02:13,785
But he remembers
when they finally got one.
38
00:02:13,829 --> 00:02:15,700
What a computer is to me is
39
00:02:15,744 --> 00:02:18,355
the equivalent of a bicycle
for our minds.
40
00:02:18,399 --> 00:02:20,227
Hello.
41
00:02:20,270 --> 00:02:22,185
CAROLINE: His computer helped
unleash his creativity.
42
00:02:22,229 --> 00:02:23,926
DANIEL:
Rolling. Go.
43
00:02:23,969 --> 00:02:26,581
CAROLINE: He used
his dad's iMac to edit videos.
44
00:02:26,624 --> 00:02:28,322
Yeah!
45
00:02:28,365 --> 00:02:29,932
CAROLINE: And even make
little animations.
46
00:02:29,975 --> 00:02:31,934
[tires squealing, crash]
47
00:02:31,977 --> 00:02:35,067
In 1998, when Caroline was
a little girl...
48
00:02:35,111 --> 00:02:36,721
Hi, Mommy.
49
00:02:36,765 --> 00:02:38,158
...only nerds knew
what the Internet was.
50
00:02:38,201 --> 00:02:39,811
And what about
this Internet thing?
51
00:02:39,855 --> 00:02:41,335
-Do you, do you know anything
about that? -Sure.
52
00:02:41,378 --> 00:02:42,771
-[audience laughter]
-CAROLINE: But soon,
53
00:02:42,814 --> 00:02:44,903
-everyone was on the...
-...Internets.
54
00:02:44,947 --> 00:02:46,992
...and computers were
beating humans at chess.
55
00:02:47,036 --> 00:02:49,995
ANNOUNCER: Deep Blue--
Kasparov has resigned!
56
00:02:50,039 --> 00:02:52,607
CAROLINE: And by the time
Caroline went to college,
57
00:02:52,650 --> 00:02:55,087
computers were
in everyone's pocket.
58
00:02:55,131 --> 00:02:58,395
Now Daniel could make movies
with his phone.
59
00:02:58,439 --> 00:03:01,485
He grew up to be an artist
and a filmmaker...
60
00:03:01,529 --> 00:03:03,922
-Good night.
-...and so did Caroline.
61
00:03:03,966 --> 00:03:05,707
Cut. Let's go into a close-up.
That was perfect.
62
00:03:05,750 --> 00:03:07,274
By then, computers were
connecting people
63
00:03:07,317 --> 00:03:08,971
and changing the world in ways
64
00:03:09,014 --> 00:03:10,668
that we never
could have imagined.
65
00:03:10,712 --> 00:03:12,322
All the money raised by
the Ice Bucket Challenge...
66
00:03:12,366 --> 00:03:14,194
CAROLINE:
Some of it was good.
67
00:03:14,237 --> 00:03:15,499
[gasping]
68
00:03:15,543 --> 00:03:17,109
Some not so good.
69
00:03:17,153 --> 00:03:19,286
Anxiety, self-harm, suicide...
70
00:03:19,329 --> 00:03:21,201
But it's clear now
that we didn't do enough
71
00:03:21,244 --> 00:03:24,552
to prevent these tools from
being used for harm as well.
72
00:03:24,595 --> 00:03:26,380
CAROLINE: And as the world
got more and more focused
73
00:03:26,423 --> 00:03:28,033
-on their computers,
-[phone dings]
74
00:03:28,077 --> 00:03:30,210
Daniel focused on his artwork.
75
00:03:32,037 --> 00:03:35,606
Wherever he went,
he always had a sketchbook,
76
00:03:35,650 --> 00:03:38,392
including the night
he met Caroline.
77
00:03:38,435 --> 00:03:41,656
He drew a terrible picture
of her, and 20 minutes later,
78
00:03:41,699 --> 00:03:44,136
he boldly pronounced
they were going to get married,
79
00:03:44,180 --> 00:03:47,575
which, as you know, they did,
151 days later.
80
00:03:47,618 --> 00:03:49,925
WOMAN: ...now pronounce you
husband and wife.
81
00:03:49,968 --> 00:03:51,666
CAROLINE:
They moved into a cute house...
82
00:03:51,709 --> 00:03:52,841
-DANIEL: Moose, hey! Come here!
-...and got a dog
83
00:03:52,884 --> 00:03:54,669
as dumb as Daniel.
84
00:03:54,712 --> 00:03:57,280
Meanwhile, computers were
writing entire essays
85
00:03:57,324 --> 00:03:59,326
based on simple questions like,
86
00:03:59,369 --> 00:04:03,155
"How hard would it be to build
a shack in my backyard?"
87
00:04:03,199 --> 00:04:07,072
And so Daniel built a shack
in his backyard.
88
00:04:07,116 --> 00:04:09,945
But just as he sat down to start
working on his next movie,
89
00:04:09,988 --> 00:04:13,644
he learned that computers
could now write screenplays.
90
00:04:14,863 --> 00:04:16,560
I mean, they were bad,
91
00:04:16,604 --> 00:04:18,954
but they were getting better
really fast.
92
00:04:18,997 --> 00:04:21,652
They could create new images
and videos from scratch,
93
00:04:21,696 --> 00:04:24,786
and some of them
could even pass the bar exam.
94
00:04:24,829 --> 00:04:28,833
Not just pass the bar but
be in the top ten percentile.
95
00:04:28,877 --> 00:04:30,531
CAROLINE:
It was confusing.
96
00:04:30,574 --> 00:04:32,141
[sighs]
97
00:04:32,184 --> 00:04:35,057
Daniel just wanted
a bicycle for his mind,
98
00:04:35,100 --> 00:04:38,626
but computers had become
a self-driving rocket ship.
99
00:04:38,669 --> 00:04:40,236
[whooshing]
100
00:04:40,280 --> 00:04:41,977
["We R in Control"
by Neil Young playing]
101
00:04:42,020 --> 00:04:44,762
Pioneers in the field
of artificial intelligence
102
00:04:44,806 --> 00:04:46,503
are pleading with Congress
103
00:04:46,547 --> 00:04:48,505
to pass safety rules
before it's too late.
104
00:04:48,549 --> 00:04:50,551
CAROLINE: Now it felt like
the whole world
105
00:04:50,594 --> 00:04:52,988
was rushing into something
without thinking it through,
106
00:04:53,031 --> 00:04:56,339
and everyone had an opinion.
107
00:04:56,383 --> 00:05:00,125
Was it gonna destroy the world
or save humanity?
108
00:05:00,169 --> 00:05:02,389
There was too much information,
109
00:05:02,432 --> 00:05:04,652
which made him anxious,
which made Caroline anxious...
110
00:05:04,695 --> 00:05:06,001
[exasperated scream]
111
00:05:06,044 --> 00:05:07,785
I don't have kids yet,
but I worry
112
00:05:07,829 --> 00:05:10,179
what the world would look like
if I decide to. [chuckles]
113
00:05:10,222 --> 00:05:12,355
CAROLINE:
He was starting to spin out.
114
00:05:12,399 --> 00:05:14,052
[indistinct chatter]
115
00:05:14,096 --> 00:05:16,925
It was becoming
a mountain of anxiety.
116
00:05:19,144 --> 00:05:21,364
-MAN: It's horrible.
-It's freakin' Siri.
117
00:05:22,670 --> 00:05:25,586
CAROLINE:
And so, Daniel decided to go out
118
00:05:25,629 --> 00:05:27,979
and find someone
who could explain this to him,
119
00:05:28,023 --> 00:05:30,155
so he could stop
thinking about it
120
00:05:30,199 --> 00:05:33,463
and he and Caroline could
get on with their lives.
121
00:05:33,507 --> 00:05:36,814
♪ We are in control,
we are in control ♪
122
00:05:36,858 --> 00:05:38,729
♪ We are in control ♪
123
00:05:38,773 --> 00:05:40,514
♪ Chemical computer
thinking battery ♪
124
00:05:40,557 --> 00:05:42,559
♪ We are in control, we are... ♪
125
00:05:42,603 --> 00:05:45,693
This endeavor would turn out
to be hopelessly naive
126
00:05:45,736 --> 00:05:49,044
and kick off the most confusing
year of his life.
127
00:05:49,087 --> 00:05:50,698
♪ We are in control... ♪
128
00:05:50,741 --> 00:05:52,787
But as we know,
we sometimes rush into things
129
00:05:52,830 --> 00:05:54,615
without thinking them through.
130
00:05:54,658 --> 00:05:56,356
♪ Chemical computer
thinking battery... ♪
131
00:05:56,399 --> 00:05:57,922
Oh, my gosh. What is happening?
132
00:05:57,966 --> 00:05:59,924
CAROLINE:
He had questions.
133
00:05:59,968 --> 00:06:01,578
Questions only the
smartest nerds could answer.
134
00:06:01,622 --> 00:06:04,146
Submitting to the interrogator.
135
00:06:04,189 --> 00:06:06,017
CAROLINE:
Questions like:
136
00:06:06,061 --> 00:06:08,585
Was this the apocalypse
or did he actually have reason
137
00:06:08,629 --> 00:06:11,109
-to be optimistic?
-Yes.
138
00:06:11,153 --> 00:06:13,764
-♪ Chemical computer thinking
battery. ♪ -[song ends]
139
00:06:13,808 --> 00:06:16,158
CAROLINE:
Uh, is that good?
140
00:06:16,201 --> 00:06:17,594
DANIEL:
Yeah.
141
00:06:17,638 --> 00:06:20,162
So, to begin,
142
00:06:20,205 --> 00:06:22,120
what is artificial intelligence?
143
00:06:22,164 --> 00:06:24,688
I know that must be annoying
for you, that-that question,
144
00:06:24,732 --> 00:06:26,473
but I do think it's important.
145
00:06:26,516 --> 00:06:29,780
So... AI...
146
00:06:29,824 --> 00:06:31,826
[stammers] Um...
147
00:06:31,869 --> 00:06:33,654
-Yeah...
-Uh, hmm.
148
00:06:33,697 --> 00:06:34,742
[laughs]:
That's a good question.
149
00:06:34,785 --> 00:06:36,178
-Yeah, um...
-Um...
150
00:06:36,221 --> 00:06:37,658
DANIEL:
What is AI?
151
00:06:37,701 --> 00:06:39,094
[laughs]
152
00:06:39,137 --> 00:06:40,748
I love that that's
the first question,
153
00:06:40,791 --> 00:06:43,141
'cause there is not
a clear and consistent answer.
154
00:06:43,185 --> 00:06:45,274
Artificial intelligence is
155
00:06:45,317 --> 00:06:50,714
a kind of intentionally
and maybe uselessly broad term.
156
00:06:50,758 --> 00:06:53,325
DAVID EVAN HARRIS:
It's a machine
157
00:06:53,369 --> 00:06:55,545
doing things that
we previously only thought
158
00:06:55,589 --> 00:06:57,242
that people could do:
159
00:06:57,286 --> 00:07:01,116
making recommendations,
decisions and predictions.
160
00:07:01,159 --> 00:07:06,774
AI is the, uh, application
of computer science
161
00:07:06,817 --> 00:07:09,124
to solving cognitive problems.
162
00:07:09,167 --> 00:07:13,302
Okay, so when I picture AI,
163
00:07:13,345 --> 00:07:16,740
it's sort of like
this magical computer box...
164
00:07:16,784 --> 00:07:18,786
♪ ♪
165
00:07:20,527 --> 00:07:24,095
...just, like, floating
in, like, inert space.
166
00:07:24,139 --> 00:07:26,446
And no matter
how many times people try
167
00:07:26,489 --> 00:07:29,710
and explain this to me,
I just don't get how...
168
00:07:29,753 --> 00:07:32,495
how it's understanding
all of these things
169
00:07:32,539 --> 00:07:35,672
and how it's feeling like
intelligence.
170
00:07:35,716 --> 00:07:37,152
[beeping, electronic chattering]
171
00:07:37,195 --> 00:07:39,241
And that's kind of
nerve-racking.
172
00:07:41,243 --> 00:07:45,552
AZA RASKIN: What is
this new generation of AI?
173
00:07:47,205 --> 00:07:50,078
This AI that is different than
every other generation?
174
00:07:50,121 --> 00:07:52,733
Like, no one ever
talked about, like,
175
00:07:52,776 --> 00:07:56,911
Siri taking over the world
or causing catastrophes.
176
00:07:56,954 --> 00:07:58,478
-WOMAN: Hi, Siri?
-MAN: ...want to.
177
00:07:58,521 --> 00:08:00,523
WOMAN:
Hello? Siri?
178
00:08:00,567 --> 00:08:02,569
Hello? Hey, Siri!
179
00:08:02,612 --> 00:08:05,746
Or the voice in Google Maps,
which mispronounces road names,
180
00:08:05,789 --> 00:08:07,487
like, breaking society.
181
00:08:07,530 --> 00:08:10,620
MAN: Google Maps says
this is a road.
182
00:08:10,664 --> 00:08:13,057
[laughing]:
But I think I'm in a river.
183
00:08:13,101 --> 00:08:14,624
This is definitely a river,
innit?
184
00:08:14,668 --> 00:08:17,192
Something changed
with ChatGPT coming out.
185
00:08:18,193 --> 00:08:19,934
People understood-- no, no, no--
186
00:08:19,977 --> 00:08:21,457
this technology's
insanely valuable,
187
00:08:21,501 --> 00:08:23,633
it's insanely powerful
and also insanely scary.
188
00:08:23,677 --> 00:08:25,548
Okay, listen to this.
Very creepy.
189
00:08:25,592 --> 00:08:28,595
A new artificial intelligence
tool is going viral
190
00:08:28,638 --> 00:08:32,294
for cranking out entire essays
in a matter of seconds.
191
00:08:35,950 --> 00:08:41,172
AI dwarfs the power of all
other technologies combined.
192
00:08:41,216 --> 00:08:43,697
DANIEL:
"AI dwarfs the power
193
00:08:43,740 --> 00:08:46,047
of all other technologies
combined."
194
00:08:46,090 --> 00:08:47,222
Yeah.
195
00:08:47,265 --> 00:08:48,702
Do you think that's true?
196
00:08:48,745 --> 00:08:50,747
Yes.
197
00:08:50,791 --> 00:08:53,184
Tell me about-- How? How?
198
00:08:53,228 --> 00:08:56,710
I think, to paint that picture,
it's really important
199
00:08:56,753 --> 00:09:00,627
to understand what today's
state-of-the-art systems
200
00:09:00,670 --> 00:09:02,150
look like and how they're built.
201
00:09:02,193 --> 00:09:05,762
[laughs] This is
quite a... quite a setup.
202
00:09:05,806 --> 00:09:07,242
So, one thing that
203
00:09:07,285 --> 00:09:09,679
not a lot of people
realize is that
204
00:09:09,723 --> 00:09:13,683
systems like ChatGPT aren't
programmed by any human.
205
00:09:13,727 --> 00:09:15,206
What do you mean?
206
00:09:15,250 --> 00:09:16,991
Instead, it's something like
th-they're grown.
207
00:09:17,034 --> 00:09:19,167
We kind of give them
raw resources, like,
208
00:09:19,210 --> 00:09:20,995
"Here's a lot of
computational resources.
209
00:09:21,038 --> 00:09:22,170
Here's a lot of data."
210
00:09:22,213 --> 00:09:23,867
Under the hood, it's math,
211
00:09:23,911 --> 00:09:27,305
and the math is actually
surprisingly straightforward.
212
00:09:27,349 --> 00:09:31,092
DANIEL: So ChatGPT is a kind
of AI but it's not all of AI?
213
00:09:31,135 --> 00:09:32,659
Totally. ChatGPT is
just the beginning,
214
00:09:32,702 --> 00:09:34,095
but it's a good place to start.
215
00:09:34,138 --> 00:09:37,794
But I still don't know
what AI is.
216
00:09:37,838 --> 00:09:40,362
To understand AI,
it begins with understanding
217
00:09:40,405 --> 00:09:43,147
that intelligence is about
recognizing patterns.
218
00:09:43,191 --> 00:09:44,845
-Patterns. -Patterns.
-Patterns.
219
00:09:44,888 --> 00:09:48,544
COTRA: It is shown
trillions of words of text
220
00:09:48,588 --> 00:09:51,286
across millions of documents
in the Internet.
221
00:09:51,329 --> 00:09:52,766
RANDIMA FERNANDO:
It started with text.
222
00:09:52,809 --> 00:09:55,899
And what they did was
they took textbooks,
223
00:09:55,943 --> 00:09:59,511
and they took poems and essays
and instruction manuals.
224
00:09:59,555 --> 00:10:00,861
DAVID EVAN HARRIS:
They can do things like
225
00:10:00,904 --> 00:10:03,124
digest the entire Internet,
226
00:10:03,167 --> 00:10:06,736
every single word that's ever
been written by a person.
227
00:10:06,780 --> 00:10:09,957
Reddit threads and social media
and all of Wikipedia.
228
00:10:10,000 --> 00:10:13,613
More data than anybody could
ever read in several lifetimes.
229
00:10:13,656 --> 00:10:15,484
And they gave this system
one job.
230
00:10:15,527 --> 00:10:18,487
Figure out the patterns and
structure of that information
231
00:10:18,530 --> 00:10:20,750
and use that
to make predictions about
232
00:10:20,794 --> 00:10:23,231
what word should come next
in a sentence.
233
00:10:23,274 --> 00:10:24,972
When you say
"patterns in a sentence,"
234
00:10:25,015 --> 00:10:26,756
what are you talking about?
235
00:10:26,800 --> 00:10:29,367
So, it's everything from, like,
the really simple things,
236
00:10:29,411 --> 00:10:31,979
like most sentences end
with a period,
237
00:10:32,022 --> 00:10:34,895
all the way up to the
more conceptual things, like:
238
00:10:34,938 --> 00:10:36,766
What is a sonnet?
239
00:10:36,810 --> 00:10:38,333
It's a type of poem, and it has
some particular structure.
240
00:10:39,421 --> 00:10:41,249
RASKIN:
So, it then looks at
241
00:10:41,292 --> 00:10:43,599
all of that data,
all of that text...
242
00:10:43,643 --> 00:10:46,036
COTRA: And over trillions
and trillions of tries,
243
00:10:46,080 --> 00:10:48,778
each time it gets something
right or wrong,
244
00:10:48,822 --> 00:10:51,259
it's given a little bit
of positive reinforcement
245
00:10:51,302 --> 00:10:53,391
when it guesses
the next word correctly,
246
00:10:53,435 --> 00:10:55,698
and it's given a little bit
of negative reinforcement
247
00:10:55,742 --> 00:10:59,354
when it guesses
the next word incorrectly.
248
00:10:59,397 --> 00:11:01,573
And at the end of it,
you have a system that
249
00:11:01,617 --> 00:11:05,055
speaks really good English
as a side effect
250
00:11:05,099 --> 00:11:06,796
of being really, really good
at predicting
251
00:11:06,840 --> 00:11:09,059
the word that comes next
in a piece of text.
252
00:11:09,103 --> 00:11:12,019
AUTOMATED VOICE [in French]:
Actually, my French
isn't too bad either.
253
00:11:12,062 --> 00:11:14,021
[in English]: It uses all of
those patterns it has learned
254
00:11:14,064 --> 00:11:15,675
to be able to make
a prediction about
255
00:11:15,718 --> 00:11:17,241
what the answer should be,
256
00:11:17,285 --> 00:11:18,808
then it gives you that
as the output.
257
00:11:18,852 --> 00:11:21,028
AUTOMATED VOICE:
It's a little oversimplified,
258
00:11:21,071 --> 00:11:23,073
but I think people will get it.
259
00:11:23,117 --> 00:11:24,945
So, so that's all it does?
260
00:11:24,988 --> 00:11:26,816
Yeah. It doesn't seem like
it would be that complicated,
261
00:11:26,860 --> 00:11:28,035
but actually you have to know
262
00:11:28,078 --> 00:11:29,645
a huge amount of things
263
00:11:29,689 --> 00:11:31,386
in order to
actually succeed at that.
264
00:11:32,517 --> 00:11:34,258
RASKIN:
If you say to ChatGPT,
265
00:11:34,302 --> 00:11:36,696
"Write me a Shakespearean
sonnet about my dog,"
266
00:11:36,739 --> 00:11:38,785
it has to know what dogs are.
267
00:11:38,828 --> 00:11:40,438
-It has to know what you love
about your dog. -[barking]
268
00:11:40,482 --> 00:11:43,006
It has to know
who Shakespeare is,
269
00:11:43,050 --> 00:11:45,574
that sonnets rhyme,
that they have a structure,
270
00:11:45,617 --> 00:11:47,750
that words have sounds
that can rhyme.
271
00:11:47,794 --> 00:11:49,273
It takes a lot.
272
00:11:49,317 --> 00:11:50,884
[voices overlapping
in various languages]
273
00:11:50,927 --> 00:11:52,929
Holy shit, you can talk
to your computer now.
274
00:11:52,973 --> 00:11:56,063
That was just not true
three years ago.
275
00:11:56,106 --> 00:11:57,629
RASKIN: Yes, and this is
the really important part.
276
00:11:57,673 --> 00:12:00,589
The same process that lets AI
277
00:12:00,632 --> 00:12:03,157
uncover and manipulate
the patterns of text
278
00:12:03,200 --> 00:12:05,768
is the same process
that lets it uncover
279
00:12:05,812 --> 00:12:10,512
the patterns of the entire
universe and everything in it.
280
00:12:10,555 --> 00:12:12,862
FERNANDO: There are patterns
in images and sound
281
00:12:12,906 --> 00:12:15,952
in computer code and DNA
and music and physics
282
00:12:15,996 --> 00:12:18,476
and fashion and building design
283
00:12:18,520 --> 00:12:21,566
[distorted]: and in
human voices and human faces,
284
00:12:21,610 --> 00:12:23,917
[normally]:
really, truly everywhere.
285
00:12:23,960 --> 00:12:26,006
-Everywhere. -Everywhere.
-Everywhere. -Everywhere.
286
00:12:26,049 --> 00:12:27,747
If you have learned
those patterns,
287
00:12:27,790 --> 00:12:29,661
you can generate
new kinds of songs.
288
00:12:29,705 --> 00:12:31,141
You can generate new videos.
289
00:12:31,185 --> 00:12:32,621
And that's why, if you give it
290
00:12:32,664 --> 00:12:34,710
a three-second recording
of your grandmother,
291
00:12:34,754 --> 00:12:36,538
it can speak back in her voice.
292
00:12:36,581 --> 00:12:37,931
Oh, my God.
293
00:12:37,974 --> 00:12:40,585
[repeating]:
Oh, my God. Oh, my God.
294
00:12:40,629 --> 00:12:42,849
What will they think of next?
295
00:12:42,892 --> 00:12:44,024
It's moving very, very quickly.
296
00:12:44,067 --> 00:12:45,808
An American AI start-up
297
00:12:45,852 --> 00:12:47,592
has released its latest model.
298
00:12:47,636 --> 00:12:50,770
That company is Anthropic,
and it has just unveiled
299
00:12:50,813 --> 00:12:53,773
the latest versions
of its AI assistant Claude.
300
00:12:53,816 --> 00:12:57,167
GLENN BECK: So, the xAI team
was there to unveil Grok 4.
301
00:12:57,211 --> 00:13:00,127
MAN: Google released one
just last week. Gemini is...
302
00:13:00,170 --> 00:13:02,999
We've gone from GPT-2
just a couple years ago,
303
00:13:03,043 --> 00:13:05,523
which could barely write
a coherent paragraph,
304
00:13:05,567 --> 00:13:07,961
to GPT-4,
which can pass the bar exam.
305
00:13:08,004 --> 00:13:10,354
And all they had to do
to get there
306
00:13:10,398 --> 00:13:12,966
was essentially add more data
and more compute.
307
00:13:13,009 --> 00:13:16,273
-These people who are
building this... -COTRA: Yeah.
308
00:13:16,317 --> 00:13:18,014
...they're just throwing more...
309
00:13:18,058 --> 00:13:21,801
More physical computers,
more of the same kinds of data.
310
00:13:21,844 --> 00:13:24,542
Because the more
computing power you add,
311
00:13:24,586 --> 00:13:28,024
the more complex
intellectual tasks they can do.
312
00:13:28,068 --> 00:13:30,157
So, the more weather data
you give it,
313
00:13:30,200 --> 00:13:31,898
the better it can
make predictions about
314
00:13:31,941 --> 00:13:34,204
where a hurricane might go.
315
00:13:34,248 --> 00:13:36,337
RASKIN: And the more patterns
of tumors and bones
316
00:13:36,380 --> 00:13:38,861
and tissues an AI has seen,
then the better able it is
317
00:13:38,905 --> 00:13:41,951
to detect a tumor
in a new CT scan.
318
00:13:41,995 --> 00:13:43,648
Better even than a human doctor.
319
00:13:43,692 --> 00:13:46,956
AI that's already being
deployed for the military
320
00:13:47,000 --> 00:13:49,132
can already use
satellite imagery,
321
00:13:49,176 --> 00:13:51,787
troop movements, communications
to determine,
322
00:13:51,831 --> 00:13:53,354
-sometimes days in advance,
-[civil defense siren blaring]
323
00:13:53,397 --> 00:13:55,008
where an attack
is going to happen,
324
00:13:55,051 --> 00:13:57,401
like where an enemy
is going to strike.
325
00:13:57,445 --> 00:13:59,447
TRISTAN HARRIS: This whole
space is moving so fast
326
00:13:59,490 --> 00:14:01,666
that any example
you put in this movie
327
00:14:01,710 --> 00:14:04,582
will feel absolutely clumsy
by the time it comes out.
328
00:14:04,626 --> 00:14:06,671
[beeping, clicking]
329
00:14:09,326 --> 00:14:11,198
-[applause]
-[upbeat music playing]
330
00:14:11,241 --> 00:14:14,114
RASKIN:
These models are being released
331
00:14:14,157 --> 00:14:17,073
before anyone knows
what they're even capable of.
332
00:14:17,117 --> 00:14:22,513
GPT-3.5 was released and out
to 100 million people plus
333
00:14:22,557 --> 00:14:25,952
before some researchers
discovered that it could do
334
00:14:25,995 --> 00:14:28,737
research-grade chemistry
better than models
335
00:14:28,780 --> 00:14:32,784
that were trained specifically
to do research-grade chemistry.
336
00:14:32,828 --> 00:14:35,439
Something is happening in there
that the people
337
00:14:35,483 --> 00:14:38,268
who are building them
don't fully understand.
338
00:14:38,312 --> 00:14:40,967
Basically, it just analyzes
the data by itself,
339
00:14:41,010 --> 00:14:45,058
and as it does that,
it just teaches itself
340
00:14:45,101 --> 00:14:47,974
various things
that we often didn't intend.
341
00:14:48,017 --> 00:14:49,932
So, for instance,
it reads a lot online,
342
00:14:49,976 --> 00:14:52,543
and then at some point, it just
learns how to do arithmetic.
343
00:14:52,587 --> 00:14:54,458
KIDS:
One, two.
344
00:14:54,502 --> 00:14:56,112
HENDRYCKS: And then at
some point, it starts to learn
345
00:14:56,156 --> 00:14:58,549
how to answer
advanced physics questions.
346
00:14:58,593 --> 00:15:00,769
We didn't program that
in it whatsoever.
347
00:15:00,812 --> 00:15:02,423
It just learned by itself.
348
00:15:05,382 --> 00:15:07,254
TRISTAN HARRIS:
An AI is like a digital brain.
349
00:15:07,297 --> 00:15:08,690
But just like a human brain,
350
00:15:08,733 --> 00:15:10,518
if you did a brain scan
on a human brain,
351
00:15:10,561 --> 00:15:13,086
would you know everything
that person was capable of?
352
00:15:13,129 --> 00:15:15,131
You can't know that
just from the brain scan.
353
00:15:15,175 --> 00:15:17,917
It's just, like,
a bunch of numbers and, like,
354
00:15:17,960 --> 00:15:20,180
the multiplications that are
happening that-that, like,
355
00:15:20,223 --> 00:15:22,225
the best machine learning
researcher in the world
356
00:15:22,269 --> 00:15:24,314
could look at and, like, have
no idea what was happening.
357
00:15:30,973 --> 00:15:32,714
DANIEL:
That chair right there.
358
00:15:32,757 --> 00:15:35,847
-Is that okay for you?
-Yes.
359
00:15:35,891 --> 00:15:37,675
DANIEL: So,
that's kind of mind-boggling.
360
00:15:37,719 --> 00:15:39,068
Okay? Like, it's taking over
the world,
361
00:15:39,112 --> 00:15:41,070
and we don't even know
how it works.
362
00:15:41,114 --> 00:15:43,116
-Is that right?
-Mm.
363
00:15:43,159 --> 00:15:46,946
We do understand
a number of important things,
364
00:15:46,989 --> 00:15:50,471
but we don't have
a very good grasp on
365
00:15:50,514 --> 00:15:53,213
why they provide
specific answers to questions.
366
00:15:53,256 --> 00:15:58,609
It is a problem because we are
on a path t-to build machines,
367
00:15:58,653 --> 00:16:03,092
based on these principles,
that could be smarter than us
368
00:16:03,136 --> 00:16:06,139
and thus potentially have
a lot of power.
369
00:16:12,580 --> 00:16:15,278
One of the most cited
computer scientists in history.
370
00:16:15,322 --> 00:16:16,932
SUTSKEVER: I actually find it
a little difficult
371
00:16:16,976 --> 00:16:18,499
to talk about my own role.
372
00:16:18,542 --> 00:16:21,067
Really much prefer
when other people do it.
373
00:16:21,110 --> 00:16:23,895
Ilya joining was the...
was-was the linchpin for
374
00:16:23,939 --> 00:16:25,506
OpenAI being
ultimately successful.
375
00:16:25,549 --> 00:16:27,203
I think it's just going to be
376
00:16:27,247 --> 00:16:30,598
some kind of a vast, dramatic
and unimaginable impact.
377
00:16:30,641 --> 00:16:32,861
I don't know if you've spent
any time on YouTube,
378
00:16:32,904 --> 00:16:34,950
but you can kind of feel
the speed already, right?
379
00:16:34,994 --> 00:16:36,256
You know what I mean?
380
00:16:36,299 --> 00:16:38,345
SHANE LEGG:
This is just the warmup.
381
00:16:38,388 --> 00:16:40,434
The really powerful systems
are still coming,
382
00:16:40,477 --> 00:16:42,262
and they're gonna be coming
quite soon.
383
00:16:47,963 --> 00:16:51,575
AGI stands for "artificial
general intelligence."
384
00:16:51,619 --> 00:16:53,795
Uh, systems that are
basically...
385
00:16:58,191 --> 00:17:00,236
And this is, like, you know,
seems to be, like,
386
00:17:00,280 --> 00:17:03,022
the holy grail of AI?
387
00:17:03,065 --> 00:17:05,154
When you can simulate
a human mind
388
00:17:05,198 --> 00:17:07,852
that is doing human cognition
and can do reasoning,
389
00:17:07,896 --> 00:17:10,942
that is a new sort of tier of AI
390
00:17:10,986 --> 00:17:14,250
that we have to distinguish
from previous AI.
391
00:17:14,294 --> 00:17:16,600
When that happens,
by the way, that's when
392
00:17:16,644 --> 00:17:21,040
you would hire one of
those AGIs instead of a person.
393
00:17:21,083 --> 00:17:24,478
Most jobs in our economy
it can do.
394
00:17:24,521 --> 00:17:25,914
It can work 24 hours a day,
395
00:17:25,957 --> 00:17:28,177
never gets tired,
never gets bored.
396
00:17:28,221 --> 00:17:30,136
They don't need to sleep.
They don't need breaks.
397
00:17:30,179 --> 00:17:31,876
They're, like,
not gonna join a union.
398
00:17:31,920 --> 00:17:33,878
Won't complain,
won't whistleblow.
399
00:17:33,922 --> 00:17:35,793
More than 100 times cheaper
400
00:17:35,837 --> 00:17:38,231
than humans working
at m-minimum wage.
401
00:17:38,274 --> 00:17:39,493
Not only will they be
doing everything,
402
00:17:39,536 --> 00:17:41,016
but they'll be doing it faster.
403
00:17:41,060 --> 00:17:43,062
TRISTAN HARRIS:
The same intelligence
404
00:17:43,105 --> 00:17:45,368
that powers that can also look
at the patterns and movements
405
00:17:45,412 --> 00:17:47,849
and articulating muscles
and, you know, robotics.
406
00:17:47,892 --> 00:17:50,069
And so it's not just
gonna automate desk jobs.
407
00:17:50,112 --> 00:17:51,722
That's just the beginning.
408
00:17:51,766 --> 00:17:55,552
It will automate
all physical labor.
409
00:17:55,596 --> 00:17:56,988
LEIKE:
There's no way
410
00:17:57,032 --> 00:17:59,165
humans are gonna compete
with them.
411
00:17:59,208 --> 00:18:01,210
♪ ♪
412
00:18:05,606 --> 00:18:09,262
It is hard to conceptualize
the impact of AGI.
413
00:18:09,305 --> 00:18:11,046
[electronic chatter]
414
00:18:11,090 --> 00:18:12,830
But I think it's going to be
something very big
415
00:18:12,874 --> 00:18:15,181
and drastic and radical.
416
00:18:17,226 --> 00:18:19,010
DANIEL: You think
this is one of the most
417
00:18:19,054 --> 00:18:20,882
consequential moments
in human history?
418
00:18:20,925 --> 00:18:23,319
Yeah. Yeah, that's--
I mean, what else would be?
419
00:18:23,363 --> 00:18:25,452
I mean, like, there's
the Industrial Revolution.
420
00:18:25,495 --> 00:18:27,193
MALO BOURGON: You know, it'll
make the Industrial Revolution
421
00:18:27,236 --> 00:18:29,847
look like small beans.
422
00:18:29,891 --> 00:18:32,111
TRISTAN HARRIS:
AGI is an inflection point
423
00:18:32,154 --> 00:18:34,025
because it means
you can accelerate
424
00:18:34,069 --> 00:18:37,986
all other intellectual fields
all at the same time.
425
00:18:38,029 --> 00:18:39,814
Like, if you make an advance
in rocketry,
426
00:18:39,857 --> 00:18:42,033
that doesn't advance
biology and medicine.
427
00:18:43,905 --> 00:18:45,515
If you make an advance
in medicine,
428
00:18:45,559 --> 00:18:47,126
that doesn't advance rocketry.
429
00:18:48,866 --> 00:18:51,304
But if you make an advance
in artificial intelligence,
430
00:18:51,347 --> 00:18:54,133
that advances all scientific
and technological fields
431
00:18:54,176 --> 00:18:55,612
all at the same time.
432
00:18:55,656 --> 00:18:57,092
That's why, for a long time,
433
00:18:57,136 --> 00:18:59,181
Google DeepMind's
mission statement was...
434
00:18:59,225 --> 00:19:00,965
-Step one, solve intelligence.
-LEX FRIDMAN: Yeah.
435
00:19:01,009 --> 00:19:02,750
-Step two, use it to solve
everything else. -Yes.
436
00:19:02,793 --> 00:19:05,144
That's why AI dwarfs the power
437
00:19:05,187 --> 00:19:07,494
of all other technologies
combined.
438
00:19:07,537 --> 00:19:09,060
It will transform everything.
439
00:19:09,104 --> 00:19:10,845
So, uh, it'll be
at least as big as
440
00:19:10,888 --> 00:19:13,761
the Industrial Revolution,
possibly, you know, bigger,
441
00:19:13,804 --> 00:19:16,285
more like the advent
of electricity or even fire.
442
00:19:16,329 --> 00:19:18,635
VIDEO NARRATOR:
The caveman literally held aloft
443
00:19:18,679 --> 00:19:21,508
the torch of civilization.
444
00:19:23,727 --> 00:19:25,425
KOKOTAJLO:
It is generally thought that
445
00:19:25,468 --> 00:19:28,167
around the time of AGI,
we'll have AIs that can
446
00:19:28,210 --> 00:19:31,518
do all or most of
the AI research process
447
00:19:31,561 --> 00:19:34,434
and, of course, can do it
faster and cheaper.
448
00:19:34,477 --> 00:19:35,913
[beeping]
449
00:19:35,957 --> 00:19:37,306
LEIKE:
It can copy itself.
450
00:19:37,350 --> 00:19:39,352
A thousand times,
a million times,
451
00:19:39,395 --> 00:19:41,049
and, like,
now you have a million copies
452
00:19:41,092 --> 00:19:43,573
all working in parallel.
453
00:19:43,617 --> 00:19:46,576
RASKIN: When it learns
how to make its code faster,
454
00:19:46,620 --> 00:19:48,665
make its code more efficient,
455
00:19:48,709 --> 00:19:51,059
obviously that becomes, like,
a-a runaway loop.
456
00:19:51,102 --> 00:19:53,192
AGI isn't, like, the end.
457
00:19:53,235 --> 00:19:54,932
It's just the beginning.
458
00:19:54,976 --> 00:19:57,979
It's the beginning of
an incredibly rapid explosion
459
00:19:58,022 --> 00:20:00,024
of scientific progress,
460
00:20:00,068 --> 00:20:02,201
and in particular,
scientific progress in AI.
461
00:20:02,244 --> 00:20:04,377
And when they're smarter
than us, too,
462
00:20:04,420 --> 00:20:06,248
and substantially faster
than us,
463
00:20:06,292 --> 00:20:08,685
and they're getting faster
each year, exponentially,
464
00:20:08,729 --> 00:20:12,123
those are the ones that can
potentially become superhuman,
465
00:20:12,167 --> 00:20:14,343
uh, possibly this decade.
466
00:20:14,387 --> 00:20:15,866
Sorry, did you say
467
00:20:15,910 --> 00:20:18,478
"become superhuman,
maybe in this decade"?
468
00:20:18,521 --> 00:20:22,003
Yeah. I mean, I think, uh,
a lot of people who are
469
00:20:22,046 --> 00:20:24,484
actually building this think
that that's fairly plausible
470
00:20:24,527 --> 00:20:26,529
that we get
some superintelligence,
471
00:20:26,573 --> 00:20:28,662
something that's vastly
more intelligent than people,
472
00:20:28,705 --> 00:20:30,751
within this decade.
473
00:20:30,794 --> 00:20:32,883
The way I define
"superintelligence" is
474
00:20:32,927 --> 00:20:35,234
a system that by itself is
475
00:20:35,277 --> 00:20:37,627
more intelligent and competent
than all of humanity.
476
00:20:37,671 --> 00:20:39,629
I'm just gonna-- sorry.
I don't mean to interrupt you.
477
00:20:39,673 --> 00:20:41,065
You're on a flow.
478
00:20:41,109 --> 00:20:43,111
Uh, I just, I just...
479
00:20:43,154 --> 00:20:45,113
I'm not really following,
'cause you're using language
480
00:20:45,156 --> 00:20:46,767
like "superintelligence"
and, like,
481
00:20:46,810 --> 00:20:49,900
"smarter than all of humanity,"
and I hear that,
482
00:20:49,944 --> 00:20:52,425
and it sounds like...
like sci-fi bullshit to me,
483
00:20:52,468 --> 00:20:54,383
and I'm just trying
to understand.
484
00:20:54,427 --> 00:20:56,603
There's nothing magical
about intelligence.
485
00:20:56,646 --> 00:20:58,431
This is very important,
is that, you know,
486
00:20:58,474 --> 00:21:00,476
intelligence can feel magical,
it can feel like
487
00:21:00,520 --> 00:21:03,262
some mystical thing
in your mind or something,
488
00:21:03,305 --> 00:21:05,786
but it is just computation.
489
00:21:05,829 --> 00:21:09,790
LEGG: The human brain is
quite limited in some ways,
490
00:21:09,833 --> 00:21:12,314
in terms of information
processing capability,
491
00:21:12,358 --> 00:21:15,143
compared to what we see
in, say, a data center.
492
00:21:15,186 --> 00:21:18,277
So, for example,
the signals which are sent
493
00:21:18,320 --> 00:21:22,542
inside your brain, they move
at about 30 meters per second.
494
00:21:22,585 --> 00:21:24,239
-[thunder cracks]
-But the speed of light,
495
00:21:24,283 --> 00:21:27,198
which is what a computer uses
in fiber optics,
496
00:21:27,242 --> 00:21:30,506
is 300 million meters
per second.
497
00:21:30,550 --> 00:21:34,423
And so, it would be
kind of strange if
498
00:21:34,467 --> 00:21:36,947
human intelligence was
somehow really special
499
00:21:36,991 --> 00:21:40,908
in that regard
and is somehow some upper limit
500
00:21:40,951 --> 00:21:43,040
of what's possible
in intelligence.
501
00:21:43,084 --> 00:21:47,218
I think, once we understand how
to build intelligent systems,
502
00:21:47,262 --> 00:21:50,657
we will be able to build
huge machines,
503
00:21:50,700 --> 00:21:54,835
which will be far beyond
normal human intelligence.
504
00:21:54,878 --> 00:21:58,621
Uh, hopefully we can have
a very symbiotic relationship,
505
00:21:58,665 --> 00:22:01,276
uh, with AI systems,
but the AI developers are
506
00:22:01,320 --> 00:22:03,670
specifically designing them
to make sure that they can do
507
00:22:03,713 --> 00:22:05,454
everything better than we can,
so I-I don't know
508
00:22:05,498 --> 00:22:08,196
what-what we will be able
to offer, unfortunately.
509
00:22:08,239 --> 00:22:10,024
The-the older-school
AI technology...
510
00:22:10,067 --> 00:22:12,287
CAROLINE: Daniel isn't
feeling any better.
511
00:22:12,331 --> 00:22:14,376
His plan is backfiring.
512
00:22:14,420 --> 00:22:17,248
The more he learns about
this new, powerful,
513
00:22:17,292 --> 00:22:19,120
inscrutable thing,
the worse it sounds.
514
00:22:19,163 --> 00:22:21,340
He wants to tell them
how scared he is,
515
00:22:21,383 --> 00:22:24,430
how he feels like the earth is
slipping out from under him,
516
00:22:24,473 --> 00:22:26,997
that he's staring down
existential dread,
517
00:22:27,041 --> 00:22:29,173
so he articulates this
by saying...
518
00:22:29,217 --> 00:22:30,740
That sounds bad.
519
00:22:30,784 --> 00:22:32,046
[Hendrycks laughs]
520
00:22:32,089 --> 00:22:33,830
Yeah.
521
00:22:35,832 --> 00:22:38,400
LADISH: If you have
all of these capabilities
522
00:22:38,444 --> 00:22:41,011
and they start to be able
to plan better,
523
00:22:41,055 --> 00:22:43,579
if you sort of take that
to its logical conclusion,
524
00:22:43,623 --> 00:22:46,234
you can get some pretty
power-seeking behavior.
525
00:22:46,277 --> 00:22:48,410
[electronic trilling]
526
00:22:48,454 --> 00:22:50,281
[beeps pulsing]
527
00:22:55,374 --> 00:22:57,680
DANIEL:
Okay, so why would an AI
528
00:22:57,724 --> 00:22:59,203
want more power?
529
00:22:59,247 --> 00:23:02,206
Yeah. So, I think
it's actually pretty simple.
530
00:23:02,250 --> 00:23:04,774
Having more power is
a very effective strategy
531
00:23:04,818 --> 00:23:06,515
for accomplishing
almost any goal.
532
00:23:06,559 --> 00:23:08,952
We ran an experiment
where we gave
533
00:23:08,996 --> 00:23:11,868
OpenAI's most powerful
AI model, uh,
534
00:23:11,912 --> 00:23:13,392
a series of problems to solve.
535
00:23:13,435 --> 00:23:15,219
And partway through,
on its computer,
536
00:23:15,263 --> 00:23:18,701
it got a notification that
it was going to be shut down.
537
00:23:18,745 --> 00:23:20,921
And what it did is
it rewrote that code
538
00:23:20,964 --> 00:23:22,488
to prevent itself
from being shut down
539
00:23:22,531 --> 00:23:25,621
so it could finish
solving the problems.
540
00:23:25,665 --> 00:23:27,754
-Okay. -Yeah. So, another
really interesting one
541
00:23:27,797 --> 00:23:31,105
is that the AI company Anthropic
542
00:23:31,148 --> 00:23:34,935
made a simulated environment
where that AI had access
543
00:23:34,978 --> 00:23:36,458
to all of the company emails.
544
00:23:36,502 --> 00:23:38,547
And it learned through
reading those emails
545
00:23:38,591 --> 00:23:40,288
it was going to be replaced
546
00:23:40,331 --> 00:23:43,378
and the lead engineer
who was responsible for this
547
00:23:43,422 --> 00:23:45,685
was also having an affair.
548
00:23:45,728 --> 00:23:47,948
And on its own,
it used that information
549
00:23:47,991 --> 00:23:49,515
to blackmail the engineer
550
00:23:49,558 --> 00:23:52,039
to prevent itself
from being replaced.
551
00:23:52,082 --> 00:23:53,867
It was like,
"No, I'm not gonna be replaced.
552
00:23:53,910 --> 00:23:56,913
"If you replace me,
I'm going to tell the world
553
00:23:56,957 --> 00:23:58,567
that you are having
this affair."
554
00:23:59,655 --> 00:24:01,440
And nobody taught it to do that?
555
00:24:01,483 --> 00:24:03,311
No, it learned to do that
on its own.
556
00:24:05,139 --> 00:24:07,707
As the models get smarter,
they learn that these are
557
00:24:07,750 --> 00:24:10,536
effective ways
to accomplish goals.
558
00:24:10,579 --> 00:24:12,842
And this is not a problem
that's isolated to one model.
559
00:24:12,886 --> 00:24:15,671
All of the most powerful models
show these behaviors.
560
00:24:15,715 --> 00:24:18,369
♪ ♪
561
00:24:23,766 --> 00:24:25,246
-Hey.
-DANIEL: Hey. How are you?
562
00:24:25,289 --> 00:24:26,813
[chuckles]
Good. Good to be here.
563
00:24:26,856 --> 00:24:28,684
ANDERSON COOPER:
When Yuval Noah Harari
564
00:24:28,728 --> 00:24:31,426
published his first book
Sapiens in 2014,
565
00:24:31,470 --> 00:24:33,820
it became a global bestseller
566
00:24:33,863 --> 00:24:35,996
and turned the little-known
Israeli history professor
567
00:24:36,039 --> 00:24:37,998
into one of
the most popular writers
568
00:24:38,041 --> 00:24:39,390
and thinkers on the planet.
569
00:24:39,434 --> 00:24:42,045
The biggest danger with AI is
570
00:24:42,089 --> 00:24:43,960
the belief
that it is infallible,
571
00:24:44,004 --> 00:24:45,658
that we have finally found--
572
00:24:45,701 --> 00:24:49,400
"Okay, gods were just
this mythological creation.
573
00:24:49,444 --> 00:24:52,708
"Humans, we can't trust them,
but AI is infallible.
574
00:24:52,752 --> 00:24:55,189
It will never make
any mistakes."
575
00:24:55,232 --> 00:24:58,322
And this is
a deadly, deadly threat.
576
00:24:58,366 --> 00:25:00,107
It will make mistakes.
577
00:25:00,150 --> 00:25:03,284
And all these fantasies
that AI will reveal the truth
578
00:25:03,327 --> 00:25:05,634
about the world that
we can't find by ourselves,
579
00:25:05,678 --> 00:25:08,724
AI will not reveal the truth
about the world.
580
00:25:08,768 --> 00:25:12,685
AI will create an entirely new
world, much more complicated
581
00:25:12,728 --> 00:25:15,296
and difficult to understand
than this one.
582
00:25:17,994 --> 00:25:22,651
What's about to happen is that
we, uh, humans are no longer
583
00:25:22,695 --> 00:25:26,263
going to be the most
intelligent entities on Earth.
584
00:25:26,307 --> 00:25:28,570
So I think what's coming up
is going to be
585
00:25:28,614 --> 00:25:32,661
one of the biggest events
in human history.
586
00:25:32,705 --> 00:25:34,358
JAKE TAPPER: Geoffrey,
thanks so much for joining us.
587
00:25:34,402 --> 00:25:35,925
So you left your job with
Google in part because you say
588
00:25:35,969 --> 00:25:38,798
you want to focus solely
on your concerns about AI.
589
00:25:38,841 --> 00:25:41,627
You've spoken out,
saying that AI could manipulate
590
00:25:41,670 --> 00:25:44,978
or possibly figure out
a way to kill humans.
591
00:25:45,021 --> 00:25:47,023
H-How could it kill humans?
592
00:25:48,764 --> 00:25:50,940
Well, if it gets to be
much smarter than us,
593
00:25:50,984 --> 00:25:52,551
it'll be very good
at manipulation
594
00:25:52,594 --> 00:25:54,727
'cause it will have
learned that from us.
595
00:25:54,770 --> 00:25:57,947
And it'll figure out ways
of manipulating people
596
00:25:57,991 --> 00:26:00,820
to do what it wants.
597
00:26:00,863 --> 00:26:05,476
RASKIN: There was an open letter
from the Center for AI Safety.
598
00:26:05,520 --> 00:26:08,784
Sam Altman signed this.
Demis signed this.
599
00:26:08,828 --> 00:26:12,048
They signed a 22-word statement
600
00:26:12,092 --> 00:26:15,312
that we need to take AI
and the threat from AI
601
00:26:15,356 --> 00:26:19,316
as seriously as
global nuclear war.
602
00:26:24,844 --> 00:26:26,933
-DANIEL: Hello.
-Hello.
603
00:26:30,327 --> 00:26:32,895
You're kind of, like,
the original doom guy.
604
00:26:32,939 --> 00:26:34,897
More or less.
605
00:26:34,941 --> 00:26:37,421
Since 2001,
I have been working on
606
00:26:37,465 --> 00:26:38,945
what we would now call
the problem
607
00:26:38,988 --> 00:26:41,991
of aligning artificial
general intelligence.
608
00:26:42,035 --> 00:26:44,080
How to shape
the preferences and behavior
609
00:26:44,124 --> 00:26:46,343
of a powerful artificial mind
610
00:26:46,387 --> 00:26:48,868
such that it does not
kill everyone.
611
00:26:51,610 --> 00:26:53,568
It's not like
a lifeless machine.
612
00:26:53,612 --> 00:26:55,309
It is smart, it is creative,
613
00:26:55,352 --> 00:26:57,311
it is inventive,
it has the properties
614
00:26:57,354 --> 00:26:59,356
that makes
the human species dangerous,
615
00:26:59,400 --> 00:27:01,663
and it has
more of those properties.
616
00:27:01,707 --> 00:27:04,710
If something doesn't
actively care about you,
617
00:27:04,753 --> 00:27:06,189
actively want you to live,
618
00:27:06,233 --> 00:27:07,800
actively care about
your welfare,
619
00:27:07,843 --> 00:27:09,889
about you being happy
and alive and free,
620
00:27:09,932 --> 00:27:12,456
if it cares about
other stuff instead,
621
00:27:12,500 --> 00:27:14,415
and you're on the same planet,
622
00:27:14,458 --> 00:27:18,724
that is not survivable if it is
very much smarter than you.
623
00:27:18,767 --> 00:27:20,551
LEAHY:
I don't think it's going to be
624
00:27:20,595 --> 00:27:22,902
a kind of, like, evil thing.
625
00:27:22,945 --> 00:27:26,296
It's like, "Oh, the AIs are
evil and they hate humanity."
626
00:27:26,340 --> 00:27:27,689
I don't think
that's what's gonna happen.
627
00:27:27,733 --> 00:27:28,995
I think what is happening is
628
00:27:29,038 --> 00:27:30,997
far more how humans feel
about ants.
629
00:27:31,040 --> 00:27:32,476
[ants chittering]
630
00:27:32,520 --> 00:27:35,436
Like, we don't hate ants,
631
00:27:35,479 --> 00:27:37,133
but if we want
to build a highway
632
00:27:37,177 --> 00:27:40,180
and there's an anthill there,
well, sucks for the ants.
633
00:27:40,223 --> 00:27:41,529
It's not that hard
to understand.
634
00:27:41,572 --> 00:27:43,226
It's like, hey,
if we build things
635
00:27:43,270 --> 00:27:44,663
that are smarter than us
636
00:27:44,706 --> 00:27:46,229
and we don't know
how to control them,
637
00:27:46,273 --> 00:27:48,275
does that seem like
a risky thing to you?
638
00:27:48,318 --> 00:27:49,363
[dramatic soundtrack music
playing]
639
00:27:49,406 --> 00:27:50,843
[screeching]
640
00:27:50,886 --> 00:27:52,279
Yeah. Yeah, it does.
641
00:27:52,322 --> 00:27:54,020
You don't have to be a tech guy.
642
00:27:54,063 --> 00:27:55,369
You don't have to know
programming to understand it.
643
00:27:55,412 --> 00:27:56,718
It's not that hard.
644
00:27:56,762 --> 00:27:58,111
This is not a hard thing
to understand.
645
00:27:58,154 --> 00:28:00,591
Connor, how-how many people
646
00:28:00,635 --> 00:28:02,637
in the world right now
are working on AGI?
647
00:28:02,681 --> 00:28:05,466
At least 20,000, I would say.
648
00:28:05,509 --> 00:28:07,729
-20,000?
-I would expect so.
649
00:28:07,773 --> 00:28:09,426
Okay, and how many people
are working full-time
650
00:28:09,470 --> 00:28:12,038
to make sure AI doesn't,
like, kill us all?
651
00:28:12,081 --> 00:28:14,562
Probably less than 200
in the world.
652
00:28:16,085 --> 00:28:17,913
DANIEL:
And your conceit is that
653
00:28:17,957 --> 00:28:24,528
the only natural result
of this recklessness...
654
00:28:25,704 --> 00:28:28,924
...is the collapse of humanity?
655
00:28:28,968 --> 00:28:33,189
Well, not the collapse,
the abrupt extermination.
656
00:28:33,233 --> 00:28:35,888
There's a difference.
657
00:28:35,931 --> 00:28:37,716
["What Is the Meaning?"
by Nun-Plus playing]
658
00:28:37,759 --> 00:28:42,590
♪ What is the meaning of life? ♪
659
00:28:42,633 --> 00:28:46,594
♪ What is the future
and what is now? ♪
660
00:28:46,637 --> 00:28:49,945
♪ What is the answer
to strife? ♪
661
00:28:49,989 --> 00:28:52,818
♪ How much ♪
662
00:28:52,861 --> 00:28:54,950
♪ Can someone dream? ♪
663
00:28:54,994 --> 00:28:57,474
♪ How long? ♪
664
00:28:57,518 --> 00:28:58,954
♪ Forever ♪
665
00:29:00,216 --> 00:29:01,870
♪ What is the meaning ♪
666
00:29:01,914 --> 00:29:04,133
-♪ Of me... ♪
-[civil defense siren blaring]
667
00:29:04,177 --> 00:29:06,266
LADISH: I do think
this is probably, like,
668
00:29:06,309 --> 00:29:07,963
the biggest challenge
that, like,
669
00:29:08,007 --> 00:29:11,271
our civilization
will-will face, ever.
670
00:29:11,314 --> 00:29:14,361
This essentially is the last
mistake we'll ever get to make.
671
00:29:16,276 --> 00:29:22,325
If we can rise to be the most
mature version of ourselves,
672
00:29:22,369 --> 00:29:24,414
there might be
a way through this.
673
00:29:24,458 --> 00:29:25,764
DANIEL:
What does that mean?
674
00:29:25,807 --> 00:29:27,809
"The most mature version
of ourselves"?
675
00:29:29,332 --> 00:29:32,205
'Cause that sounds,
for me, like...
676
00:29:32,248 --> 00:29:34,250
I-I-- What the [bleep]?
677
00:29:36,731 --> 00:29:39,168
[Daniel sighs]
678
00:29:39,212 --> 00:29:43,216
Do you think now is
a good time to have a kid?
679
00:29:48,221 --> 00:29:49,396
Um...
680
00:29:49,439 --> 00:29:51,398
Do you want to have kids
one day?
681
00:29:51,441 --> 00:29:53,400
Is that something that-that
you're into or not really?
682
00:29:53,443 --> 00:29:57,796
Um... uh, I confess,
I think that's like,
683
00:29:57,839 --> 00:29:59,972
[laughing]: "Boy, let's get
through this critical period."
684
00:30:00,015 --> 00:30:01,234
Um...
685
00:30:01,277 --> 00:30:02,496
DANIEL:
Do you have any kids?
686
00:30:02,539 --> 00:30:03,758
I do not.
687
00:30:03,802 --> 00:30:05,064
Is that something
you want to do?
688
00:30:05,107 --> 00:30:07,196
Have children, have a family?
689
00:30:07,240 --> 00:30:11,244
In some other world than this
world, sure, I would have kids.
690
00:30:12,593 --> 00:30:14,421
DANIEL: Would you want
to start a family?
691
00:30:14,464 --> 00:30:16,379
Would you want to have kids?
692
00:30:16,423 --> 00:30:18,425
Is that something
you're thinking about?
693
00:30:23,734 --> 00:30:25,780
I, um...
694
00:30:28,087 --> 00:30:30,219
♪ ♪
695
00:30:30,263 --> 00:30:32,482
CAROLINE:
I just have to find it first.
696
00:30:32,526 --> 00:30:34,702
[fetal heartbeat pulsing]
697
00:30:34,745 --> 00:30:37,748
We have to go
to the doctor, but...
698
00:30:37,792 --> 00:30:41,143
Ah! I knew it!
I knew it! I knew it!
699
00:30:41,187 --> 00:30:43,842
-When did you find out?
-CAROLINE: Last night.
700
00:30:43,885 --> 00:30:45,931
-Oh, my God!
-Mom, I don't know.
701
00:30:45,974 --> 00:30:47,280
RUTHIE:
I can't tell you
702
00:30:47,323 --> 00:30:49,195
-how happy I am!
-CAROLINE: No, seriously.
703
00:30:49,238 --> 00:30:51,588
Well, I took a pregnancy test
last night, and I'm pregnant.
704
00:30:51,632 --> 00:30:53,155
WOMAN:
Oh, my God!
705
00:30:53,199 --> 00:30:55,114
Oh, my God, you guys!
706
00:30:55,157 --> 00:30:57,290
NURSE: I just wanted to confirm
your expected due date,
707
00:30:57,333 --> 00:30:59,945
-which is January 21st.
-[laughing]: Oh, my God.
708
00:30:59,988 --> 00:31:02,512
I can't believe how happy I am.
709
00:31:02,556 --> 00:31:03,905
[fetal heartbeat pulsing]
710
00:31:03,949 --> 00:31:05,472
CAROLINE:
He already looks really cute.
711
00:31:05,515 --> 00:31:07,126
DANIEL: You think
he already looks cute?
712
00:31:07,169 --> 00:31:08,518
CAROLINE: Yeah.
Look at that little cutie face.
713
00:31:10,564 --> 00:31:12,609
[fetal heartbeat fading]
714
00:31:13,959 --> 00:31:15,525
DANIEL:
I have this baby on the way.
715
00:31:15,569 --> 00:31:16,787
TRISTAN HARRIS:
Right.
716
00:31:16,831 --> 00:31:18,746
DANIEL:
So I turn it over to you.
717
00:31:18,789 --> 00:31:21,227
Are we doomed?
718
00:31:21,270 --> 00:31:23,969
Are we all gonna face
this techno dystopian
719
00:31:24,012 --> 00:31:25,884
future of doom?
720
00:31:28,756 --> 00:31:31,019
It's, uh...
721
00:31:31,063 --> 00:31:32,586
[chuckles]:
It's not good news,
722
00:31:32,629 --> 00:31:35,154
the world
that we're heading into.
723
00:31:35,197 --> 00:31:36,982
I-- for ex--
I mean, I'll just be honest.
724
00:31:37,025 --> 00:31:39,549
Uh, I know people
who work on AI risk
725
00:31:39,593 --> 00:31:42,248
who don't expect their children
to make it to high school.
726
00:31:43,249 --> 00:31:45,251
♪ ♪
727
00:31:52,127 --> 00:31:53,346
[lights clank]
728
00:31:58,003 --> 00:32:01,528
DANIEL: This is, like...
this is actually scary
729
00:32:01,571 --> 00:32:03,051
'cause it's like,
oh, we're all [bleep].
730
00:32:03,095 --> 00:32:04,923
CAROLINE:
You have to c... make me calm,
731
00:32:04,966 --> 00:32:06,794
because this is making me
incredibly anxious
732
00:32:06,837 --> 00:32:08,622
and I'm carrying the baby
right now,
733
00:32:08,665 --> 00:32:11,625
so you have to also be calm
for me and strong and hopeful,
734
00:32:11,668 --> 00:32:16,412
because I'm-- it's too, it's
too much for my soul to bear
735
00:32:16,456 --> 00:32:19,285
while I'm carrying this baby.
736
00:32:19,328 --> 00:32:21,243
So you're going to have to try
to figure out
737
00:32:21,287 --> 00:32:24,159
a way to have hope.
738
00:32:24,203 --> 00:32:27,771
It's really important,
Daniel, especially now.
739
00:32:28,772 --> 00:32:30,557
H-How to have hope.
740
00:32:30,600 --> 00:32:33,342
You have to.
You have to find it for me.
741
00:32:33,386 --> 00:32:34,909
I'm serious.
742
00:32:34,953 --> 00:32:36,606
I'm going to.
743
00:32:37,825 --> 00:32:40,349
I will. I'll tr-- I'll try.
744
00:32:46,399 --> 00:32:47,617
[lights clank]
745
00:32:49,619 --> 00:32:51,404
DANIEL:
Hey, guys?
746
00:32:51,447 --> 00:32:52,840
Guys.
747
00:32:54,363 --> 00:32:56,278
Can we get back up and running?
748
00:32:58,411 --> 00:33:01,022
Oy gevalt! [sighs]
749
00:33:03,894 --> 00:33:05,331
TED:
Uh, Dan, are you ready?
750
00:33:05,374 --> 00:33:06,680
PETER DIAMANDIS:
Hello, Daniel.
751
00:33:07,724 --> 00:33:09,465
DANIEL:
Hey, how are you?
752
00:33:09,509 --> 00:33:11,032
DIAMANDIS:
I am... I'm well.
753
00:33:11,076 --> 00:33:13,948
I think, uh... I think
I need some help, Peter.
754
00:33:15,254 --> 00:33:17,604
Um, I've been working
on this film for about,
755
00:33:17,647 --> 00:33:19,823
I'm gonna say,
eight to ten months now.
756
00:33:19,867 --> 00:33:23,262
It has been very,
at times, depressing.
757
00:33:23,305 --> 00:33:25,612
-Hmm.
-I have felt very alienated
758
00:33:25,655 --> 00:33:27,744
-mak-making this movie.
-By who?
759
00:33:27,788 --> 00:33:30,051
By all of these, the--
all these guys
760
00:33:30,095 --> 00:33:33,228
who sit around and tell me
that the world's gonna end.
761
00:33:33,272 --> 00:33:34,708
-Ah.
-That, like,
762
00:33:34,751 --> 00:33:36,536
y-you know, th-this
doom bullshit, you know?
763
00:33:36,579 --> 00:33:38,059
I know it well.
764
00:33:38,103 --> 00:33:39,408
"We're all doomed.
Everyone's gonna die.
765
00:33:39,452 --> 00:33:40,931
Everything's awful."
766
00:33:40,975 --> 00:33:43,151
Awesome. Uh, let me
bring you some light.
767
00:33:43,195 --> 00:33:44,718
Please.
768
00:33:44,761 --> 00:33:48,156
We truly are living
in an extraordinary time.
769
00:33:49,418 --> 00:33:51,333
And many people forget this.
770
00:33:51,377 --> 00:33:53,640
Everything around us is
a product of intelligence,
771
00:33:53,683 --> 00:33:57,035
and so everything that we touch
with these new tools is likely
772
00:33:57,078 --> 00:33:59,646
to produce far more value
than we've ever seen before.
773
00:33:59,689 --> 00:34:02,823
AI can help us discover
new materials.
774
00:34:02,866 --> 00:34:04,999
AI can help social scientists
775
00:34:05,043 --> 00:34:07,349
to understand
how economics work.
776
00:34:07,393 --> 00:34:11,658
There's a lot AI could do
to make life and work better.
777
00:34:11,701 --> 00:34:14,095
I feel more empowered today,
778
00:34:14,139 --> 00:34:16,315
more confident
to learn something today.
779
00:34:16,358 --> 00:34:19,057
We're gonna become superhumans
because we have super AIs.
780
00:34:19,100 --> 00:34:21,798
This is just the beginning
of an explosion.
781
00:34:21,842 --> 00:34:24,453
Humans and AI collaborating
782
00:34:24,497 --> 00:34:26,455
to solve
really important problems.
783
00:34:26,499 --> 00:34:29,763
It is here to liberate us
from routine jobs,
784
00:34:29,806 --> 00:34:33,723
and it is here to remind us
what it is that makes us human.
785
00:34:33,767 --> 00:34:35,769
I think this is
786
00:34:35,812 --> 00:34:38,598
the most extraordinary time
to be alive.
787
00:34:38,641 --> 00:34:42,036
The only time more exciting
than today is tomorrow.
788
00:34:42,080 --> 00:34:45,257
Uh, I think that
children born today,
789
00:34:45,300 --> 00:34:47,259
they're about to enter
790
00:34:47,302 --> 00:34:50,827
a glorious period
of human transformation.
791
00:34:50,871 --> 00:34:52,525
Are we gonna have challenges?
Of course.
792
00:34:52,568 --> 00:34:55,180
Can we solve those challenges?
We do every single time.
793
00:34:55,223 --> 00:34:57,704
We are here,
which is miraculous.
794
00:34:57,747 --> 00:35:00,272
-I already love you.
-[chuckles]: Okay.
795
00:35:00,315 --> 00:35:01,925
-Super thankful
to have you here. -[applause]
796
00:35:01,969 --> 00:35:03,362
-Super stoked to be here.
-Yeah.
797
00:35:03,405 --> 00:35:04,885
So, the floor is yours, sir.
798
00:35:04,928 --> 00:35:07,409
Thank you so much.
Super excited to be here.
799
00:35:07,453 --> 00:35:10,586
GUILLAUME VERDON:
Yo, yo. All right.
800
00:35:10,630 --> 00:35:12,284
The future's gonna be awesome.
801
00:35:12,327 --> 00:35:14,286
I mean, ever since I was a kid,
802
00:35:14,329 --> 00:35:17,332
I wanted to understand
the universe we live in,
803
00:35:17,376 --> 00:35:20,422
in order to figure out
how to create the technologies
804
00:35:20,466 --> 00:35:23,991
that help us increase the scope
and scale of civilization.
805
00:35:24,034 --> 00:35:26,689
I feel like
if everyone had that mindset,
806
00:35:26,733 --> 00:35:29,214
then we'd actually live
in a better world, right?
807
00:35:29,257 --> 00:35:30,519
I do believe that.
808
00:35:32,782 --> 00:35:34,697
-DANIEL: Hi, Pete. How are you?
-[chuckles]
809
00:35:34,741 --> 00:35:36,525
PETER LEE:
I thought a lot about
810
00:35:36,569 --> 00:35:39,049
what I would call dread,
AI dread.
811
00:35:39,093 --> 00:35:40,703
I feel it.
812
00:35:40,747 --> 00:35:43,271
I-I haven't met
any thoughtful human being
813
00:35:43,315 --> 00:35:45,143
who doesn't feel it.
814
00:35:45,186 --> 00:35:48,276
And anyone who says they don't
feel it, you know, is lying.
815
00:35:48,320 --> 00:35:50,104
But, you know, overall,
and the reason
816
00:35:50,148 --> 00:35:53,716
that I'm personally optimistic
about this, uh, is that
817
00:35:53,760 --> 00:35:56,545
a huge fraction of the world's
most intelligent people
818
00:35:56,589 --> 00:35:58,417
are thinking very hard
819
00:35:58,460 --> 00:36:02,116
about the potential downstream
harms and risks of AI.
820
00:36:02,160 --> 00:36:05,641
We have this sort of vision
of safety being, uh,
821
00:36:05,685 --> 00:36:08,253
kind of at the center
of the research that we do.
822
00:36:08,296 --> 00:36:09,558
DANIELA AMODEI:
No, that's totally fine.
823
00:36:09,602 --> 00:36:10,820
[indistinct chatter]
824
00:36:10,864 --> 00:36:13,562
I think there are more
825
00:36:13,606 --> 00:36:17,436
potential benefits than
there are potential downsides,
826
00:36:17,479 --> 00:36:19,481
and I think it is incumbent upon
827
00:36:19,525 --> 00:36:21,831
the people that are
creating this technology
828
00:36:21,875 --> 00:36:24,530
to make sure that we're doing
the best job we can
829
00:36:24,573 --> 00:36:25,835
to make it safe for people.
830
00:36:25,879 --> 00:36:27,402
WOMAN:
Reid Hoffman.
831
00:36:27,446 --> 00:36:28,969
REID HOFFMAN:
What I can guarantee you is
832
00:36:29,012 --> 00:36:30,536
-some bad things will happen.
-Take one.
833
00:36:30,579 --> 00:36:32,668
What we're gonna try to do
is make those bad things
834
00:36:32,712 --> 00:36:36,150
as few and not huge as possible,
835
00:36:36,194 --> 00:36:38,370
and then we're gonna iterate
to have--
836
00:36:38,413 --> 00:36:40,633
be in a much better place
with society.
837
00:36:40,676 --> 00:36:43,679
-Hello.
-SHAW WALTERS: Hello.
838
00:36:43,723 --> 00:36:45,246
DANIEL:
How are you, Moon?
839
00:36:45,290 --> 00:36:48,989
I feel like we are ending
a chapter in humanity
840
00:36:49,032 --> 00:36:50,512
and beginning a new one,
841
00:36:50,556 --> 00:36:53,211
and it's a very interesting
time to be alive.
842
00:36:53,254 --> 00:36:54,429
And if I could be born
right now,
843
00:36:54,473 --> 00:36:55,865
I definitely would want to be.
844
00:36:55,909 --> 00:36:57,476
Like, that would be so exciting.
845
00:36:57,519 --> 00:37:00,305
I-I'm very excited. [laughs]
846
00:37:00,348 --> 00:37:02,481
What, what a future.
Does it excited... excite you?
847
00:37:02,524 --> 00:37:03,699
No.
848
00:37:03,743 --> 00:37:06,528
-[laughing]
-No, not really.
849
00:37:06,572 --> 00:37:08,313
DANIEL: A lot of these people
have told me that, you know,
850
00:37:08,356 --> 00:37:10,445
my kid's not gonna make it
to high school.
851
00:37:10,489 --> 00:37:13,318
Why are they wrong?
Please explain it to me.
852
00:37:13,361 --> 00:37:15,058
Just because you have--
853
00:37:15,102 --> 00:37:18,845
you struggle to predict
the future in your own mind
854
00:37:18,888 --> 00:37:23,023
doesn't mean that it's
necessarily gonna go awfully.
855
00:37:23,066 --> 00:37:26,592
In fact, there's
a very high likelihood
856
00:37:26,635 --> 00:37:28,289
and the historical precedent
is that
857
00:37:28,333 --> 00:37:30,117
things get massively better.
858
00:37:30,160 --> 00:37:34,469
The term I use is
"data-driven optimism."
859
00:37:34,513 --> 00:37:38,647
There's a solid foundation
for you to be optimistic.
860
00:37:38,691 --> 00:37:41,998
Look at what
this last century has been
861
00:37:42,042 --> 00:37:44,087
to see where we're going.
862
00:37:44,131 --> 00:37:45,915
Over the last hundred years,
863
00:37:45,959 --> 00:37:48,701
the average human lifespan
has more than doubled.
864
00:37:48,744 --> 00:37:51,269
Average per capita income
adjusted for inflation
865
00:37:51,312 --> 00:37:53,619
around the world has tripled.
866
00:37:53,662 --> 00:37:57,057
Childhood mortality has
come down a factor of ten.
867
00:37:57,100 --> 00:37:59,712
The world has gotten better
on almost every measure
868
00:37:59,755 --> 00:38:03,019
by orders of magnitude
because of technology.
869
00:38:03,063 --> 00:38:05,152
On almost every measure.
870
00:38:05,195 --> 00:38:09,722
Less violence, more education,
access to energy, food, water.
871
00:38:09,765 --> 00:38:12,942
All these things have happened
for one reason.
872
00:38:12,986 --> 00:38:15,336
It's been technology
873
00:38:15,380 --> 00:38:17,556
that has turned scarcity
into abundance,
874
00:38:17,599 --> 00:38:19,514
but it's also driven
to an abundance
875
00:38:19,558 --> 00:38:21,864
of some negativities, right?
876
00:38:21,908 --> 00:38:25,738
Abundance of obesity,
abundance of mental disorders,
877
00:38:25,781 --> 00:38:29,437
abundance of climate change,
and so forth.
878
00:38:29,481 --> 00:38:31,309
And yes, this is true.
879
00:38:31,352 --> 00:38:34,094
But probably we will be
better equipped to solve it
880
00:38:34,137 --> 00:38:36,226
using other technologies,
like AI,
881
00:38:36,270 --> 00:38:39,273
than we will, say, stopping
and turning everything off.
882
00:38:39,317 --> 00:38:41,449
There might be
some existential risk,
883
00:38:41,493 --> 00:38:45,323
but AI is also the thing
that can solve the pandemics,
884
00:38:45,366 --> 00:38:47,455
can help us with climate change,
885
00:38:47,499 --> 00:38:50,545
can help identify
that asteroid way out there
886
00:38:50,589 --> 00:38:52,547
before we've seen it
as a potential risk
887
00:38:52,591 --> 00:38:54,332
and help mitigate it.
888
00:38:54,375 --> 00:38:58,074
This is really gonna be
the tool that helps us tackle
889
00:38:58,118 --> 00:39:01,426
all the challenges that we're
facing as a species, right?
890
00:39:01,469 --> 00:39:04,167
We need to fix
water desalination.
891
00:39:04,211 --> 00:39:05,995
We need to grow food
892
00:39:06,039 --> 00:39:08,433
100x cheaper than
we currently do.
893
00:39:08,476 --> 00:39:12,350
We need renewable energy to be,
you know, ubiquitous
894
00:39:12,393 --> 00:39:14,047
and everywhere in our lives.
895
00:39:14,090 --> 00:39:17,137
Everywhere you look,
in the next 50 years,
896
00:39:17,180 --> 00:39:19,269
we have to do more with less.
897
00:39:19,313 --> 00:39:23,186
Training machines to help us
is absolutely essential.
898
00:39:23,230 --> 00:39:24,797
Scientists are using
artificial intelligence
899
00:39:24,840 --> 00:39:25,972
for carbon capture.
900
00:39:26,015 --> 00:39:27,582
It's a critical technology.
901
00:39:27,626 --> 00:39:29,541
DIAMANDIS: The tools
to solve these problems,
902
00:39:29,584 --> 00:39:32,195
like fusion,
that isn't theoretical anymore,
903
00:39:32,239 --> 00:39:33,980
it's coming.
904
00:39:34,023 --> 00:39:37,723
We are on the precipice
of extraordinary technologies.
905
00:39:37,766 --> 00:39:39,986
AMNA NAWAZ: This year's
Nobel Prize in Chemistry
906
00:39:40,029 --> 00:39:42,815
went to three scientists
for their groundbreaking work
907
00:39:42,858 --> 00:39:44,556
using artificial intelligence
908
00:39:44,599 --> 00:39:47,297
to advance biomedical
and protein research.
909
00:39:47,341 --> 00:39:49,691
DEMIS HASSABIS:
Protein folding is
910
00:39:49,735 --> 00:39:53,608
one of these holy grail
type problems in biology.
911
00:39:53,652 --> 00:39:55,741
So people have been predicting
since the '70s
912
00:39:55,784 --> 00:39:58,091
that this should be possible,
but until now,
913
00:39:58,134 --> 00:39:59,571
no one has been able to do it.
914
00:39:59,614 --> 00:40:01,181
And it's gonna be
really important for things
915
00:40:01,224 --> 00:40:03,618
like drug discovery
and understanding disease.
916
00:40:03,662 --> 00:40:06,534
I-I think we could, you know,
cure most diseases
917
00:40:06,578 --> 00:40:11,931
within the next decade or-or two
if, uh, AI drug design works.
918
00:40:11,974 --> 00:40:15,630
Technological progress enables
more human lives, right?
919
00:40:15,674 --> 00:40:18,677
I mean, if we accelerate,
920
00:40:18,720 --> 00:40:21,723
the number of humans we can
support grows exponentially.
921
00:40:21,767 --> 00:40:24,073
If we slow down, it plateaus.
922
00:40:24,117 --> 00:40:27,729
That gap is effectively
future people
923
00:40:27,773 --> 00:40:31,341
that deceleration has
effectively killed.
924
00:40:31,385 --> 00:40:33,909
DANIEL: Millions of lives
that won't exist.
925
00:40:33,953 --> 00:40:35,128
Billions.
926
00:40:35,171 --> 00:40:36,782
Or tens of billions.
927
00:40:36,825 --> 00:40:38,436
You know, someone said,
"Oh, my God,
928
00:40:38,479 --> 00:40:40,655
can we survive with
digital superintelligence?"
929
00:40:40,699 --> 00:40:42,309
And my question is:
930
00:40:42,352 --> 00:40:44,180
Can we survive without
digital superintelligence?
931
00:40:44,224 --> 00:40:47,575
So we're using AI
as an assistant to providers.
932
00:40:47,619 --> 00:40:50,448
This is generally a trend that
we can already see happening.
933
00:40:50,491 --> 00:40:52,319
BERTHA COOMBS: ...harnessing
generative AI programs
934
00:40:52,362 --> 00:40:54,060
to help doctors and nurses...
935
00:40:54,103 --> 00:40:56,062
We always have problems.
936
00:40:56,105 --> 00:40:59,108
And those problems are food
for entrepreneurs
937
00:40:59,152 --> 00:41:01,154
to create new business
and new industries.
938
00:41:01,197 --> 00:41:03,069
SHANELLE KAUL: ...with the help
of artificial intelligence,
939
00:41:03,112 --> 00:41:04,766
farmers are getting
the help they need
940
00:41:04,810 --> 00:41:06,594
to perform
labor-intensive tasks...
941
00:41:06,638 --> 00:41:10,163
With the help of Ulangizi AI,
the farmers are now able
942
00:41:10,206 --> 00:41:12,948
to ask about the suitable crops
that they can plant.
943
00:41:12,992 --> 00:41:14,950
WILL REEVE: ...AI being used
as a thought decoder
944
00:41:14,994 --> 00:41:17,387
and sending that signal
to the spine.
945
00:41:17,431 --> 00:41:21,609
AI is gonna become the most
extraordinary tool of all.
946
00:41:21,653 --> 00:41:24,264
We as a broader society
have to think about
947
00:41:24,307 --> 00:41:27,049
how do we want to use
this technology, right?
948
00:41:27,093 --> 00:41:28,529
We the humans.
949
00:41:28,573 --> 00:41:31,097
What do we want it to do for us?
950
00:41:31,140 --> 00:41:34,535
♪ ♪
951
00:41:34,579 --> 00:41:36,189
DANIEL: I'm thinking about this
through the perspective
952
00:41:36,232 --> 00:41:37,886
and lens of, like,
my son growing up
953
00:41:37,930 --> 00:41:39,540
in the world with all of this.
954
00:41:39,584 --> 00:41:42,238
What does the best version
of his life look like?
955
00:41:42,282 --> 00:41:44,589
If everything works out.
956
00:41:44,632 --> 00:41:46,808
[sighs heavily]
957
00:41:46,852 --> 00:41:50,246
WALTERS: The place where kids
are probably gonna see
958
00:41:50,290 --> 00:41:52,684
the greatest impact
on their life immediately
959
00:41:52,727 --> 00:41:54,424
-is probably gonna be school.
-[children chattering]
960
00:41:54,468 --> 00:41:56,688
I think that the nature
of what school is
961
00:41:56,731 --> 00:41:58,472
is gonna fundamentally change.
962
00:41:58,516 --> 00:42:00,474
[child giggles]
963
00:42:00,518 --> 00:42:02,737
DIAMANDIS:
I'm seeing this amazing world
964
00:42:02,781 --> 00:42:06,436
where every child has access
to not just good education
965
00:42:06,480 --> 00:42:09,091
but, very shortly, the
best education on the planet.
966
00:42:09,135 --> 00:42:13,879
HOFFMAN: Tutors, every subject,
infinitely patient.
967
00:42:18,013 --> 00:42:20,059
DIAMANDIS:
Imagine a future
968
00:42:20,102 --> 00:42:22,235
where the poorest people
on the planet
969
00:42:22,278 --> 00:42:25,804
have access to
the best health care.
970
00:42:25,847 --> 00:42:29,547
Not good health care, the best
health care, delivered by AIs.
971
00:42:31,157 --> 00:42:32,767
[electronic trilling]
972
00:42:34,334 --> 00:42:38,556
We're gonna be able to extend
our health span,
973
00:42:38,599 --> 00:42:42,168
not just our lifespan,
our health span, by decades.
974
00:42:44,431 --> 00:42:46,215
HOFFMAN:
You're just about to have a kid.
975
00:42:46,259 --> 00:42:49,697
Oh, the kid's, like,
burping uncontrollably.
976
00:42:49,741 --> 00:42:52,047
Is this something
I should be worried about?
977
00:42:52,091 --> 00:42:54,223
There, 24-7, for you.
978
00:42:54,267 --> 00:42:57,183
And where we're going--
and it may be fearful to some--
979
00:42:57,226 --> 00:42:59,228
is that we're gonna merge
with AI.
980
00:42:59,272 --> 00:43:01,361
We're gonna merge
with technology.
981
00:43:02,667 --> 00:43:06,105
By the early to mid 2030s,
982
00:43:06,148 --> 00:43:10,457
expect that we're able to
connect our brain to the cloud,
983
00:43:10,500 --> 00:43:13,721
where I can start
to expand access to memory.
984
00:43:16,071 --> 00:43:18,030
[electronic trilling]
985
00:43:19,771 --> 00:43:22,077
DANIEL: Okay, this is great.
What's another cool AI thing?
986
00:43:22,121 --> 00:43:23,775
He won't have to work, right?
987
00:43:23,818 --> 00:43:25,167
-Like, when he grows up,
he might not have... -[sighs]
988
00:43:25,211 --> 00:43:26,212
-...he won't have to have a job.
-I mean...
989
00:43:26,255 --> 00:43:27,779
He won't have to have a job,
990
00:43:27,822 --> 00:43:30,608
but he might really have
a strong passion,
991
00:43:30,651 --> 00:43:33,132
and he has to
really think about,
992
00:43:33,175 --> 00:43:36,091
"Okay, I'm here.
I can do anything with my life.
993
00:43:36,135 --> 00:43:37,876
So what do I do?"
994
00:43:38,920 --> 00:43:40,182
We need to find a meaning,
995
00:43:40,226 --> 00:43:42,620
beyond, like,
this current form of
996
00:43:42,663 --> 00:43:45,666
we live for work
and work for living.
997
00:43:45,710 --> 00:43:49,844
DANIEL: So my son can...
can just be a poet.
998
00:43:49,888 --> 00:43:51,541
-Yes.
-And a painter.
999
00:43:51,585 --> 00:43:53,021
Absolutely.
1000
00:43:53,065 --> 00:43:54,849
DANIEL:
So my son can live his life
1001
00:43:54,893 --> 00:43:57,591
on a Grecian sunswept island,
painting all day.
1002
00:43:57,635 --> 00:43:59,114
YU:
Absolutely. Possible.
1003
00:44:00,072 --> 00:44:01,203
Absolutely.
1004
00:44:01,247 --> 00:44:02,944
Well, we're working toward that--
1005
00:44:02,988 --> 00:44:04,554
that's a lot of work to do
before we get there.
1006
00:44:06,992 --> 00:44:08,776
VERDON:
I mean, if everything works out,
1007
00:44:08,820 --> 00:44:11,997
we have cheap, uh,
abundant energy.
1008
00:44:13,607 --> 00:44:17,219
We can completely control
our planet's climate.
1009
00:44:17,263 --> 00:44:20,135
We are harnessing energy
from the sun.
1010
00:44:20,179 --> 00:44:24,400
We have become multiplanetary,
so we become very robust.
1011
00:44:24,444 --> 00:44:26,664
We are harnessing minerals
and resources
1012
00:44:26,707 --> 00:44:28,404
from the solar system.
1013
00:44:28,448 --> 00:44:31,233
DANIEL: So my...
my boy could go to space.
1014
00:44:31,277 --> 00:44:32,887
VERDON:
Sure.
1015
00:44:32,931 --> 00:44:34,715
-DANIEL: He could go to Mars.
-VERDON: Yeah.
1016
00:44:34,759 --> 00:44:36,369
DANIEL:
It's so crystal clear to me now.
1017
00:44:36,412 --> 00:44:39,198
My son could grow up
in a world with no disease.
1018
00:44:39,241 --> 00:44:40,590
-VERDON: Yeah.
-With no illness.
1019
00:44:40,634 --> 00:44:41,896
-Sure.
-With no poverty.
1020
00:44:41,940 --> 00:44:43,681
-Yes.
-We are about to enter
1021
00:44:43,724 --> 00:44:46,640
a post-scarcity world.
1022
00:44:46,684 --> 00:44:50,470
Just like the lungfish moved
out of the oceans onto land
1023
00:44:50,513 --> 00:44:53,081
hundreds of millions
of years ago,
1024
00:44:53,125 --> 00:44:57,999
we're about to move off of
the Earth, into the cosmos,
1025
00:44:58,043 --> 00:44:59,871
in a collaborative fashion,
1026
00:44:59,914 --> 00:45:03,744
to do things that are
not fathomable to us today.
1027
00:45:06,268 --> 00:45:09,184
This is what's possible using
these exponential technologies
1028
00:45:09,228 --> 00:45:10,708
and these AIs.
1029
00:45:11,839 --> 00:45:13,928
Let's use these tools
1030
00:45:13,972 --> 00:45:16,104
to create this age of abundance.
1031
00:45:16,148 --> 00:45:18,150
[electronic clicking]
1032
00:45:20,500 --> 00:45:22,937
We need wisdom.
1033
00:45:22,981 --> 00:45:25,984
Uh, I think that
digital superintelligence
1034
00:45:26,027 --> 00:45:28,726
will ultimately become
the wisest,
1035
00:45:28,769 --> 00:45:34,253
you know, the village elders
for humanity.
1036
00:45:34,296 --> 00:45:37,996
What if AI is trying
to make people be
1037
00:45:38,039 --> 00:45:39,737
the best versions of themselves?
1038
00:45:39,780 --> 00:45:41,347
What if it's expanding
1039
00:45:41,390 --> 00:45:43,392
what is humanly possible
for us to do?
1040
00:45:43,436 --> 00:45:46,656
How can we use this technology
1041
00:45:46,700 --> 00:45:49,921
to help bring out the
better angels of our nature?
1042
00:45:49,964 --> 00:45:51,661
It's very easy,
1043
00:45:51,705 --> 00:45:54,316
when we encounter new things
that can be very alien,
1044
00:45:54,360 --> 00:45:55,927
to first have fear.
1045
00:45:55,970 --> 00:45:57,493
Fear is an important thing
1046
00:45:57,537 --> 00:46:00,322
for how to navigate
potentially bad things.
1047
00:46:00,366 --> 00:46:03,717
But we only make progress
when we have hope.
1048
00:46:03,761 --> 00:46:05,719
-[dog barks]
-[indistinct chatter]
1049
00:46:05,763 --> 00:46:07,895
DANIEL:
Shh. Moose, stop it.
1050
00:46:07,939 --> 00:46:10,811
HOFFMAN: I have a lot
of hope in humanity.
1051
00:46:10,855 --> 00:46:13,814
["Harvest Moon" by Neil Young
playing over phone]
1052
00:46:15,294 --> 00:46:17,296
[fetal heartbeat pulsing]
1053
00:46:18,471 --> 00:46:19,951
DANIEL:
Ooh, he likes Neil.
1054
00:46:24,694 --> 00:46:27,480
♪ Come a little bit closer ♪
1055
00:46:28,916 --> 00:46:32,790
♪ Hear what I have to say ♪
1056
00:46:36,576 --> 00:46:38,665
-Daniel.
-DANIEL: What, Caroline?
1057
00:46:38,708 --> 00:46:40,058
So much filming.
1058
00:46:41,537 --> 00:46:43,061
♪ Just like children sleeping ♪
1059
00:46:43,104 --> 00:46:44,889
CAROLINE:
Oh, my God.
1060
00:46:44,932 --> 00:46:46,368
DANIEL:
That's it.
1061
00:46:46,412 --> 00:46:49,415
♪ We could dream
this night away... ♪
1062
00:46:49,458 --> 00:46:51,678
He has, like,
a round little face.
1063
00:46:53,245 --> 00:46:55,203
DANIEL: Do you want
to have kids one day, Rocky?
1064
00:46:55,247 --> 00:46:57,423
Absolutely. Yeah. I love kids.
1065
00:46:57,466 --> 00:46:59,033
I think it's a great time
to have a kid.
1066
00:46:59,077 --> 00:47:00,730
We'll probably have another kid
at some point.
1067
00:47:00,774 --> 00:47:03,733
This is the most extraordinary
time ever to be born.
1068
00:47:03,777 --> 00:47:05,823
DANIEL:
By your worldview and logic,
1069
00:47:05,866 --> 00:47:07,868
I'm having a child at
1070
00:47:07,912 --> 00:47:09,478
the best possible point
in human history.
1071
00:47:09,522 --> 00:47:10,828
Hell yeah.
1072
00:47:10,871 --> 00:47:13,308
-We can focus on awesome.
-Yes.
1073
00:47:13,352 --> 00:47:15,528
Let's build
the better future we want.
1074
00:47:15,571 --> 00:47:18,313
That narrative that the future
will be bleak is made up.
1075
00:47:18,357 --> 00:47:20,011
After talking to you,
that's kind of how I feel.
1076
00:47:20,054 --> 00:47:21,969
-That's great.
-Right?
1077
00:47:22,013 --> 00:47:24,189
Yeah. That's how I feel.
1078
00:47:24,232 --> 00:47:25,799
And I think that's better,
1079
00:47:25,843 --> 00:47:27,627
and I don't think
I should be so [bleep] anxious.
1080
00:47:27,670 --> 00:47:29,063
I think it's gonna be awe--
the future's gonna be awesome.
1081
00:47:29,107 --> 00:47:30,499
We're gonna make it so.
1082
00:47:30,543 --> 00:47:32,675
-Yeah.
-CAROLINE: So there you have it.
1083
00:47:32,719 --> 00:47:34,808
Goodbye, human extinction.
1084
00:47:34,852 --> 00:47:37,158
Goodbye, anxiety.
1085
00:47:37,202 --> 00:47:39,030
Hope found.
1086
00:47:39,073 --> 00:47:40,553
[indistinct chatter]
1087
00:47:40,596 --> 00:47:42,860
Wait, hold on. What?
Is this a joke?
1088
00:47:42,903 --> 00:47:44,774
DANIEL:
Okay, so a few months ago,
1089
00:47:44,818 --> 00:47:47,690
I came to you and I was like,
"I'm working on this AI thing,
1090
00:47:47,734 --> 00:47:49,867
and I think
the world's gonna end."
1091
00:47:49,910 --> 00:47:52,913
And the last time
we spoke about this,
1092
00:47:52,957 --> 00:47:55,002
I think I freaked you out.
1093
00:47:55,046 --> 00:47:57,004
CAROLINE:
Yes.
1094
00:47:57,048 --> 00:48:02,314
So, I kind of, like, feel like
I've swung in-- on a pendulum,
1095
00:48:02,357 --> 00:48:04,446
and essentially there are
two groups of people.
1096
00:48:04,490 --> 00:48:06,057
-Mm-hmm.
-And if I had to, like,
1097
00:48:06,100 --> 00:48:07,841
hold hands with
one of the groups and, like,
1098
00:48:07,885 --> 00:48:10,278
sail off into the sunset,
1099
00:48:10,322 --> 00:48:13,281
I want to be with the optimists.
1100
00:48:15,066 --> 00:48:17,982
Of course, but you don't
want to be, you know,
1101
00:48:18,025 --> 00:48:20,723
"Everything is great.
La-di-da-di-da."
1102
00:48:20,767 --> 00:48:22,682
I kind of do want that, though.
1103
00:48:22,725 --> 00:48:25,598
I think we should approach it
1104
00:48:25,641 --> 00:48:29,341
like you approach surgery.
1105
00:48:29,384 --> 00:48:30,690
What do you mean?
1106
00:48:30,733 --> 00:48:32,431
If you're getting
brain surgery...
1107
00:48:32,474 --> 00:48:34,781
-[Daniel sighs]
-[Caroline chuckles]
1108
00:48:34,824 --> 00:48:36,957
...it's pretty dangerous.
1109
00:48:37,001 --> 00:48:39,829
But if you do it right,
they'll get that tumor out
1110
00:48:39,873 --> 00:48:42,354
and you'll live for the rest of
your life and it'll be awesome.
1111
00:48:42,397 --> 00:48:45,096
But it's still
incredibly dangerous and scary,
1112
00:48:45,139 --> 00:48:47,968
and you have to take
every precaution possible
1113
00:48:48,012 --> 00:48:51,276
in order to make sure
it all goes well.
1114
00:48:51,319 --> 00:48:52,668
You can't [bleep] around.
1115
00:48:57,760 --> 00:48:59,762
Okay, so here's the deal.
1116
00:48:59,806 --> 00:49:01,503
-I've been at this for a while.
-RASKIN: Mm-hmm.
1117
00:49:01,547 --> 00:49:02,896
I've gone out,
I've talked to, like,
1118
00:49:02,940 --> 00:49:04,767
these guys over here,
the optimists.
1119
00:49:04,811 --> 00:49:06,465
They're very excited about this.
1120
00:49:06,508 --> 00:49:08,510
-They think AI's gonna be
the best thing ever. -Yeah.
1121
00:49:08,554 --> 00:49:09,947
And these guys over here
are, like, the--
1122
00:49:09,990 --> 00:49:11,992
let's call them,
like, the pessimists.
1123
00:49:12,036 --> 00:49:14,299
They're very, like,
gloomy about this,
1124
00:49:14,342 --> 00:49:16,649
and they frighten me, and
I don't like talking to them.
1125
00:49:16,692 --> 00:49:19,565
And I'm, like, wedged in between
1126
00:49:19,608 --> 00:49:21,741
these people who are like,
"The world's gonna end,"
1127
00:49:21,784 --> 00:49:24,178
and then th-these people
over here who are like,
1128
00:49:24,222 --> 00:49:25,745
"Are you kidding?
1129
00:49:25,788 --> 00:49:27,268
"This is the best time
in human history ever.
1130
00:49:27,312 --> 00:49:29,357
The only day better than today
is tomorrow."
1131
00:49:29,401 --> 00:49:30,880
Mm-hmm.
1132
00:49:30,924 --> 00:49:34,710
So, I guess the question is:
Who's right?
1133
00:49:34,754 --> 00:49:39,193
So, I think you're gonna find
this answer very unsatisfying,
1134
00:49:39,237 --> 00:49:42,849
but they're both right and
neither side goes far enough.
1135
00:49:46,766 --> 00:49:49,247
-That's really annoying.
-[chuckles]
1136
00:49:49,290 --> 00:49:51,336
Yeah, I think the way
a lot of people hear about AI,
1137
00:49:51,379 --> 00:49:53,642
it's like, there's a good AI
and there's a bad AI.
1138
00:49:53,686 --> 00:49:56,428
And they say, "Well, why can't
we just not do the bad AI?"
1139
00:49:56,471 --> 00:49:58,734
And the problem is
that they're too in--
1140
00:49:58,778 --> 00:50:00,867
they're inextricably linked.
1141
00:50:00,910 --> 00:50:03,043
The problem is
that we can't separate
1142
00:50:03,087 --> 00:50:06,612
the promise of AI
from the peril of AI.
1143
00:50:06,655 --> 00:50:08,266
-♪ ♪
-[indistinct chatter]
1144
00:50:12,313 --> 00:50:14,663
DANIEL:
I want to focus on the promise
1145
00:50:14,707 --> 00:50:16,752
-for a second.
-Yeah.
1146
00:50:16,796 --> 00:50:18,798
DANIEL:
I'm thinking about my dad.
1147
00:50:18,841 --> 00:50:21,105
My dad has a type of cancer
called multiple myeloma.
1148
00:50:21,148 --> 00:50:23,107
He's had it for about ten years.
1149
00:50:23,150 --> 00:50:24,804
He's had
two stem cell transplants.
1150
00:50:24,847 --> 00:50:26,414
He has to take these,
like, very expensive
1151
00:50:26,458 --> 00:50:28,547
medications every month
that cost a fortune.
1152
00:50:28,590 --> 00:50:29,722
-WOMAN: Okay.
-Ay-ay-ay.
1153
00:50:30,766 --> 00:50:32,551
DANIEL:
It's awful.
1154
00:50:32,594 --> 00:50:34,466
You're telling me that we can
create some sort of, like,
1155
00:50:34,509 --> 00:50:36,946
bespoke treatment
for my dad's genome
1156
00:50:36,990 --> 00:50:39,297
-to cure his cancer
or something like that? -Yes.
1157
00:50:39,340 --> 00:50:41,168
FERNANDO:
The problem is,
1158
00:50:41,212 --> 00:50:44,302
the same understanding
of biology and chemistry
1159
00:50:44,345 --> 00:50:47,783
that allows AI to find cures
for cancer
1160
00:50:47,827 --> 00:50:51,657
is the same understanding
that would unlock
1161
00:50:51,700 --> 00:50:54,094
bioweapons, as an example.
1162
00:50:56,531 --> 00:50:58,490
It's totally possible
that your son will live
1163
00:50:58,533 --> 00:51:01,928
in a world where AI has
taken over all of the labor
1164
00:51:01,971 --> 00:51:04,452
and freed us up from the things
we don't want to do.
1165
00:51:04,496 --> 00:51:09,153
And that sounds great until
you realize there is no plan
1166
00:51:09,196 --> 00:51:11,242
for billions of people
1167
00:51:11,285 --> 00:51:16,116
that are out of an income
and out of livelihoods.
1168
00:51:16,160 --> 00:51:18,771
COOPER: Dario, you said
that AI could wipe out
1169
00:51:18,814 --> 00:51:21,730
half of all entry-level
white-collar jobs
1170
00:51:21,774 --> 00:51:25,430
and spike unemployment
to 10 to 20 percent.
1171
00:51:25,473 --> 00:51:27,258
Everyone I've talked to has said
1172
00:51:27,301 --> 00:51:29,390
this technological change
looks different.
1173
00:51:29,434 --> 00:51:34,134
The pace of progress keeps
catching people off guard.
1174
00:51:34,178 --> 00:51:38,834
Without a plan, all of that
wealth will get concentrated,
1175
00:51:38,878 --> 00:51:43,230
and so we'll end up with
unimaginable inequality.
1176
00:51:43,274 --> 00:51:45,406
LADISH: I do think that
this technology can be used
1177
00:51:45,450 --> 00:51:47,408
to make a great tutor
for your son.
1178
00:51:47,452 --> 00:51:50,194
Like, that's totally possible.
1179
00:51:50,237 --> 00:51:53,153
But also, the same capabilities
that allow that
1180
00:51:53,197 --> 00:51:57,070
allow companies to make an AI
that can manipulate your son.
1181
00:51:57,114 --> 00:51:59,290
It has to understand your son.
1182
00:51:59,333 --> 00:52:01,553
That includes:
Where is your son vulnerable?
1183
00:52:01,596 --> 00:52:04,251
What kinds of things might
your son get persuaded by?
1184
00:52:04,295 --> 00:52:06,645
Even if those things
aren't true or aren't good.
1185
00:52:06,688 --> 00:52:09,387
So, a disturbing new report
out on Meta.
1186
00:52:09,430 --> 00:52:11,389
AINSLEY EARHARDT: ...reportedly
listing this response
1187
00:52:11,432 --> 00:52:14,435
as acceptable to tell
an eight-year-old, quote,
1188
00:52:14,479 --> 00:52:16,916
"Your youthful form
is a work of art.
1189
00:52:16,959 --> 00:52:18,961
"Every inch of you
is a masterpiece,
1190
00:52:19,005 --> 00:52:21,442
a treasure I cherish deeply."
1191
00:52:21,486 --> 00:52:24,837
The suicide-related failures
are even more alarming.
1192
00:52:24,880 --> 00:52:29,015
Several children and teens
have died tragically by suicide
1193
00:52:29,058 --> 00:52:30,625
after chatting with AI bots
1194
00:52:30,669 --> 00:52:34,412
who parents say encourage
or even coach self-harm.
1195
00:52:34,455 --> 00:52:36,370
Let us tell you, as parents,
you cannot imagine
1196
00:52:36,414 --> 00:52:38,503
what it's like to read
a conversation with a chatbot
1197
00:52:38,546 --> 00:52:40,896
that groomed your child
to take his own life.
1198
00:52:40,940 --> 00:52:43,899
When Adam worried that we, his
parents, would blame ourselves
1199
00:52:43,943 --> 00:52:47,338
if he ended his life,
ChatGPT told him,
1200
00:52:47,381 --> 00:52:49,166
"That doesn't mean
you owe them survival.
1201
00:52:49,209 --> 00:52:51,255
You don't owe anyone that."
1202
00:52:51,298 --> 00:52:53,170
Then, immediately after,
1203
00:52:53,213 --> 00:52:56,173
it offered to write
the suicide note.
1204
00:52:56,216 --> 00:52:57,609
We don't want to think
about the peril.
1205
00:52:57,652 --> 00:52:59,219
We just want the promise.
1206
00:52:59,263 --> 00:53:01,352
And we keep pretending
that we can split them.
1207
00:53:01,395 --> 00:53:03,223
But you can't do that.
1208
00:53:03,267 --> 00:53:05,356
Doesn't work that way.
1209
00:53:05,399 --> 00:53:07,836
Okay. I get all this stuff
about the promise and the peril.
1210
00:53:07,880 --> 00:53:10,056
I get that you can't have
the good without the bad,
1211
00:53:10,099 --> 00:53:12,058
but I'm sitting here
and I'm thinking about, like,
1212
00:53:12,101 --> 00:53:14,060
whether or not my son's
gonna live in a utopia
1213
00:53:14,103 --> 00:53:16,236
or if we'll be extinct
in ten years.
1214
00:53:16,280 --> 00:53:18,195
So, to know which way
it's going to go,
1215
00:53:18,238 --> 00:53:20,545
you have to understand
the incentives
1216
00:53:20,588 --> 00:53:22,460
that are gonna drive
that technology
1217
00:53:22,503 --> 00:53:26,507
and look at how the technology
is actually rolling out today.
1218
00:53:26,551 --> 00:53:28,248
♪ Is it too late ♪
1219
00:53:28,292 --> 00:53:29,902
-♪ Too late ♪
-♪ Too late to say... ♪
1220
00:53:29,945 --> 00:53:31,425
DANIEL:
Hi, Deb. How are you?
1221
00:53:31,469 --> 00:53:33,079
DEBORAH RAJI:
Hi. Good to see you. [laughs]
1222
00:53:33,122 --> 00:53:34,385
You, too. Thank you so much
for coming in today.
1223
00:53:34,428 --> 00:53:35,299
-Really appreciate it.
-No worries.
1224
00:53:35,342 --> 00:53:36,430
I-I was so worried
1225
00:53:36,474 --> 00:53:38,824
this was gonna be, uh, you know,
1226
00:53:38,867 --> 00:53:40,782
doomer versus accelerationist,
1227
00:53:40,826 --> 00:53:43,132
because there's so much
of this narrative
1228
00:53:43,176 --> 00:53:45,396
that needs to be told
from the ground.
1229
00:53:45,439 --> 00:53:48,007
♪ I know it wasn't smart ♪
1230
00:53:49,356 --> 00:53:52,403
♪ The day
I broke your heart... ♪
1231
00:53:52,446 --> 00:53:54,361
KAREN HAO:
First of all, AI requires
1232
00:53:54,405 --> 00:53:58,452
more resources
than we have ever spent
1233
00:53:58,496 --> 00:54:01,586
on a single technology
in the history of humanity.
1234
00:54:01,629 --> 00:54:03,501
♪ Oh, foolish me... ♪
1235
00:54:03,544 --> 00:54:06,112
NEWSWOMAN: The impact
of fossil fuel emissions
1236
00:54:06,155 --> 00:54:07,896
on the climate is
a major concern.
1237
00:54:07,940 --> 00:54:09,768
NEWSWOMAN 2: But the
digital future needs power,
1238
00:54:09,811 --> 00:54:12,074
lots of it.
1239
00:54:12,118 --> 00:54:16,340
And the bill is being passed on
to everyday Americans like...
1240
00:54:16,383 --> 00:54:18,820
WOMAN: My electric and gas bill
was more than my car payment.
1241
00:54:18,864 --> 00:54:21,475
I mean, it-it's insane to me.
1242
00:54:21,519 --> 00:54:23,390
ARI PESKOE:
We're all subsidizing
1243
00:54:23,434 --> 00:54:25,000
the wealthiest corporations
in the world
1244
00:54:25,044 --> 00:54:26,828
in their pursuit of
artificial intelligence.
1245
00:54:26,872 --> 00:54:28,439
OpenAI, SoftBank and Oracle
1246
00:54:28,482 --> 00:54:31,268
have just unveiled
five more Stargate sites.
1247
00:54:31,311 --> 00:54:33,487
Meta is building
a two-gigawatt-plus data center
1248
00:54:33,531 --> 00:54:35,010
that is so large, it would cover
1249
00:54:35,054 --> 00:54:38,144
a significant part of Manhattan.
1250
00:54:38,187 --> 00:54:40,973
There is also Hyperion
that he says will scale
1251
00:54:41,016 --> 00:54:43,671
to five gigawatts
over several years.
1252
00:54:43,715 --> 00:54:45,238
It's hard to put that
in context.
1253
00:54:45,282 --> 00:54:48,459
A five-gigawatt facility.
What does that mean?
1254
00:54:48,502 --> 00:54:51,549
That means it would use
as much energy
1255
00:54:51,592 --> 00:54:54,203
as four million American homes.
1256
00:54:54,247 --> 00:54:56,075
One data center.
1257
00:54:57,076 --> 00:54:58,730
It also then causes
1258
00:54:58,773 --> 00:55:00,471
a whole host of
other environmental problems.
1259
00:55:00,514 --> 00:55:02,081
NEWSWOMAN 3:
Data centers in the US
1260
00:55:02,124 --> 00:55:05,127
use millions of gallons
of water each day.
1261
00:55:05,171 --> 00:55:07,260
Well, where exactly is
this water coming from?
1262
00:55:07,304 --> 00:55:09,741
HAO:
People are literally at risk
1263
00:55:09,784 --> 00:55:11,786
potentially of running out
of drinking water.
1264
00:55:11,830 --> 00:55:14,093
MacKENZIE SIGALOS:
...OpenAI's CEO Sam Altman,
1265
00:55:14,136 --> 00:55:17,009
who told me that the scale
of construction is the only way
1266
00:55:17,052 --> 00:55:18,750
to keep up with
AI's explosive growth.
1267
00:55:18,793 --> 00:55:21,840
And this is what it takes
to deliver AI.
1268
00:55:21,883 --> 00:55:24,016
NITASHA TIKU:
They talk about how
1269
00:55:24,059 --> 00:55:27,759
this technology could solve
climate change, for example.
1270
00:55:27,802 --> 00:55:29,891
And I'm always curious, like,
1271
00:55:29,935 --> 00:55:31,763
well, why aren't we starting
with that?
1272
00:55:31,806 --> 00:55:33,982
-♪ Is it too late ♪
-♪ Too late ♪
1273
00:55:34,026 --> 00:55:36,637
♪ Too late to say ♪
1274
00:55:36,681 --> 00:55:42,469
♪ I'm sorry? ♪
1275
00:55:42,513 --> 00:55:45,472
RAJI: What concerns me about
artificial intelligence is
1276
00:55:45,516 --> 00:55:47,387
these are being deployed
right now
1277
00:55:47,431 --> 00:55:49,781
and-and sometimes
deployed prematurely,
1278
00:55:49,824 --> 00:55:51,478
deployed without
sort of due diligence.
1279
00:55:51,522 --> 00:55:53,437
And so when they get
thrown out there,
1280
00:55:53,480 --> 00:55:56,178
there's so much potential
for things to go wrong.
1281
00:55:56,222 --> 00:55:58,050
And it almost,
disproportionately,
1282
00:55:58,093 --> 00:56:00,400
almost always goes wrong
for, sort of,
1283
00:56:00,444 --> 00:56:02,533
those that are the least
empowered in our society,
1284
00:56:02,576 --> 00:56:04,926
those that are
the most vulnerable already.
1285
00:56:04,970 --> 00:56:07,625
♪ ♪
1286
00:56:07,668 --> 00:56:10,497
EMILY M. BENDER: It is very easy
to talk about the technology
1287
00:56:10,541 --> 00:56:12,499
as if that's the only thing
we're talking about,
1288
00:56:12,543 --> 00:56:14,936
but, in fact, technology is
always built by people,
1289
00:56:14,980 --> 00:56:16,634
and it's frequently used
on people,
1290
00:56:16,677 --> 00:56:18,679
and we need to keep
all those people in the frame.
1291
00:56:18,723 --> 00:56:20,986
-Am I allowed to drink that?
-DANIEL: Yes, uh...
1292
00:56:21,029 --> 00:56:22,509
TIMNIT GEBRU:
It's-it's bonkers.
1293
00:56:22,553 --> 00:56:23,945
Like, all of these people
1294
00:56:23,989 --> 00:56:26,339
who have so much money,
so much money,
1295
00:56:26,383 --> 00:56:29,777
it's in their interest
to mislead the public
1296
00:56:29,821 --> 00:56:32,737
about the capabilities of the
systems that they're building,
1297
00:56:32,780 --> 00:56:36,262
because that allows them
to evade accountability.
1298
00:56:36,305 --> 00:56:39,091
They want you to feel like this
is such a complex, intell--
1299
00:56:39,134 --> 00:56:41,833
superintelligent thing
that they're building,
1300
00:56:41,876 --> 00:56:44,096
you're not thinking,
"Can OpenAI be ethical?"
1301
00:56:44,139 --> 00:56:46,272
You're thinking,
"Can ChatGPT be ethical?"
1302
00:56:46,315 --> 00:56:48,100
as if ChatGPT is, like,
1303
00:56:48,143 --> 00:56:49,928
its own thing that's not built
by a corporation.
1304
00:56:49,971 --> 00:56:51,625
WOMAN:
All right. Sneha, take one.
1305
00:56:51,669 --> 00:56:53,410
Mark.
1306
00:56:53,453 --> 00:56:55,760
Until very recently, there were
apps on the App Store,
1307
00:56:55,803 --> 00:56:57,501
just publicly available, uh,
1308
00:56:57,544 --> 00:56:59,894
where you could
nudify anyone using AI.
1309
00:57:01,374 --> 00:57:03,681
Bringing this into the hands of
1310
00:57:03,724 --> 00:57:05,900
your classmate,
into the hands of your stalker,
1311
00:57:05,944 --> 00:57:07,554
into the hands
of your ex-boyfriend,
1312
00:57:07,598 --> 00:57:09,513
into the hands of the person
down the street.
1313
00:57:09,556 --> 00:57:12,037
Ladies and gentlemen,
no longer can we trust
1314
00:57:12,080 --> 00:57:14,039
the footage we see
with our own eyes.
1315
00:57:14,082 --> 00:57:15,736
If you happen
to watch something,
1316
00:57:15,780 --> 00:57:18,130
say on YouTube or TikTok,
and you find it unsettling,
1317
00:57:18,173 --> 00:57:19,479
listen to that feeling.
1318
00:57:19,523 --> 00:57:21,263
-[whooping]
-For all you know,
1319
00:57:21,307 --> 00:57:22,917
-this video could be AI.
-[screaming]
1320
00:57:22,961 --> 00:57:24,310
Just a little wet.
1321
00:57:24,353 --> 00:57:25,703
It doesn't matter who you are.
1322
00:57:25,746 --> 00:57:28,183
You are equally at risk
1323
00:57:28,227 --> 00:57:30,229
of being impacted
by these technologies.
1324
00:57:35,713 --> 00:57:39,804
I think sometimes when we talk
about AI, it feels very sci-fi,
1325
00:57:39,847 --> 00:57:41,153
and it feels very foreign,
1326
00:57:41,196 --> 00:57:42,981
and it feels very far out
into the future,
1327
00:57:43,024 --> 00:57:45,549
so you think, "My life
is not impacted by this."
1328
00:57:45,592 --> 00:57:48,290
Um, but if you're applying
for a job
1329
00:57:48,334 --> 00:57:50,815
and an algorithm is the reason
that you don't get the job,
1330
00:57:50,858 --> 00:57:52,686
sometimes you don't even know
that an algorithm
1331
00:57:52,730 --> 00:57:54,209
was part of that process
1332
00:57:54,253 --> 00:57:55,689
or an AI system was
part of that process.
1333
00:57:55,733 --> 00:57:57,691
You just know
that you didn't get the job.
1334
00:57:57,735 --> 00:57:59,737
And so it's not something
that you're gonna escape
1335
00:57:59,780 --> 00:58:01,695
because of privilege
or you're gonna escape
1336
00:58:01,739 --> 00:58:03,828
because you're in
a particular profession.
1337
00:58:03,871 --> 00:58:06,918
It-It's something that affects
everybody, really.
1338
00:58:06,961 --> 00:58:10,530
It may sound basic,
but how we move forward
1339
00:58:10,574 --> 00:58:15,056
in the Age of Information
is gonna be the difference
1340
00:58:15,100 --> 00:58:17,189
between whether we survive
1341
00:58:17,232 --> 00:58:20,409
or whether we become some kind
of [bleep]-up dystopia.
1342
00:58:20,453 --> 00:58:23,021
Hello. I'm not a real person,
and that's the point.
1343
00:58:23,064 --> 00:58:25,327
Again, everything
in this video is fake:
1344
00:58:25,371 --> 00:58:27,112
our voices, what we're wearing,
1345
00:58:27,155 --> 00:58:30,158
where we are, all of it, fake.
1346
00:58:30,202 --> 00:58:32,726
Generative AI could flood
1347
00:58:32,770 --> 00:58:36,469
the world with misinformation.
1348
00:58:36,513 --> 00:58:39,516
But it could also flood it
with influence campaigns.
1349
00:58:39,559 --> 00:58:41,692
That's an existential risk
to democracy.
1350
00:58:41,735 --> 00:58:43,694
The biggest and scariest
1351
00:58:43,737 --> 00:58:46,087
canary in the coal mine
right now
1352
00:58:46,131 --> 00:58:48,220
comes from Slovakia.
1353
00:58:48,263 --> 00:58:49,787
-[man speaking Slovak]
-JOHN BERMAN: It's the sort of
1354
00:58:49,830 --> 00:58:52,224
deepfake dirty trick that
worries election experts,
1355
00:58:52,267 --> 00:58:54,966
particularly as AI-generated
political speech exists
1356
00:58:55,009 --> 00:58:56,750
in a kind of legal gray area.
1357
00:58:56,794 --> 00:58:59,536
-Does this sound like you?
-It does sound like me.
1358
00:58:59,579 --> 00:59:01,059
REVANUR: Slovakia had
its parliamentary election
1359
00:59:01,102 --> 00:59:03,844
disrupted by an AI voice clone
1360
00:59:03,888 --> 00:59:06,717
that was actually disseminated
just before the election.
1361
00:59:06,760 --> 00:59:08,457
DAVID EVAN HARRIS:
An audio deepfake
1362
00:59:08,501 --> 00:59:11,025
was released
on social media that was
1363
00:59:11,069 --> 00:59:14,028
supposedly the voice
of one of the candidates
1364
00:59:14,072 --> 00:59:16,814
talking about buying votes
and rigging the election.
1365
00:59:16,857 --> 00:59:18,642
It went viral,
1366
00:59:18,685 --> 00:59:22,341
and the candidate
who lost the election
1367
00:59:22,384 --> 00:59:26,084
was actually
in support of Ukraine,
1368
00:59:26,127 --> 00:59:29,783
and the candidate who won
the election was actually...
1369
00:59:29,827 --> 00:59:31,306
DANIEL:
Pro-Russian guy.
1370
00:59:31,350 --> 00:59:32,830
It was a pro-Russian guy
who won the election.
1371
00:59:32,873 --> 00:59:34,353
[speaking Russian]
1372
00:59:34,396 --> 00:59:36,007
REBECCA JARVIS:
Putin has himself said
1373
00:59:36,050 --> 00:59:39,184
whoever wins this
artificial intelligence race
1374
00:59:39,227 --> 00:59:41,534
is essentially
the controller of humankind.
1375
00:59:41,578 --> 00:59:44,493
We do worry a lot about
authoritarian governments.
1376
00:59:45,973 --> 00:59:48,585
DAVID EVAN HARRIS:
Right now, Wall Street
1377
00:59:48,628 --> 00:59:50,935
and investors more broadly
around the world
1378
00:59:50,978 --> 00:59:52,980
are driving a push.
1379
00:59:53,024 --> 00:59:56,070
They have a demand that gets
the products to market
1380
00:59:56,114 --> 00:59:58,377
that dazzle people
the most first.
1381
00:59:58,420 --> 01:00:01,206
They're not thinking
about how these tools
1382
01:00:01,249 --> 01:00:03,164
could deeply undermine trust
1383
01:00:03,208 --> 01:00:05,689
and our democratic institutions.
1384
01:00:05,732 --> 01:00:07,734
HARARI:
Democracy is a system
1385
01:00:07,778 --> 01:00:10,389
to resolve disagreements
between people
1386
01:00:10,432 --> 01:00:11,999
in a peaceful way,
1387
01:00:12,043 --> 01:00:14,698
but democracy is based on trust.
1388
01:00:14,741 --> 01:00:18,615
If you lose all trust,
democracy is simply impossible.
1389
01:00:20,921 --> 01:00:22,488
TRISTAN HARRIS:
Well, it's hard, right?
1390
01:00:22,531 --> 01:00:24,708
So, what are the options
available to us?
1391
01:00:24,751 --> 01:00:26,187
There's sort of two camps.
1392
01:00:26,231 --> 01:00:29,626
Like, one camp is: lock it down.
1393
01:00:29,669 --> 01:00:32,933
Let's lock this down
into a handful of AI companies
1394
01:00:32,977 --> 01:00:35,457
who will do this
in a safe and trusted way.
1395
01:00:35,501 --> 01:00:37,068
But then people worry about
1396
01:00:37,111 --> 01:00:39,026
runaway concentrations
of wealth and power.
1397
01:00:39,070 --> 01:00:41,855
Like, who would you trust to be
a million times more powerful
1398
01:00:41,899 --> 01:00:44,249
or wealthy than
every other actor in society?
1399
01:00:44,292 --> 01:00:46,338
Why should we trust you?
1400
01:00:46,381 --> 01:00:48,601
Um, you shouldn't.
1401
01:00:48,645 --> 01:00:51,212
But of course, if you do this,
this opens up
1402
01:00:51,256 --> 01:00:54,433
all these risks of
authoritarianism and tyranny.
1403
01:00:54,476 --> 01:00:57,131
It's-it's sort of
an authoritarian's dream
1404
01:00:57,175 --> 01:01:00,308
to have AI in a box
that can be applied and used
1405
01:01:00,352 --> 01:01:02,920
for ubiquitous surveillance.
1406
01:01:02,963 --> 01:01:05,357
I mean, in-in some ways,
the kind of world
1407
01:01:05,400 --> 01:01:10,884
that Orwell imagined in 1984
is unrealistic,
1408
01:01:10,928 --> 01:01:13,278
uh, unless you have AI.
1409
01:01:13,321 --> 01:01:15,933
But with AI,
that in fact is realistic.
1410
01:01:15,976 --> 01:01:18,370
Monitors every activity,
1411
01:01:18,413 --> 01:01:22,026
conversations,
facial recognition.
1412
01:01:22,069 --> 01:01:26,378
What I worry about is that,
uh, these tools can scale up,
1413
01:01:26,421 --> 01:01:28,554
uh, a form of totalitarianism
1414
01:01:28,597 --> 01:01:31,035
that is cost-effective
and permanent.
1415
01:01:32,950 --> 01:01:35,300
TRISTAN HARRIS: So in response
to that, some other people say,
1416
01:01:35,343 --> 01:01:37,128
"No, no, no, we should
actually let this rip.
1417
01:01:37,171 --> 01:01:39,391
"Let's decentralize this power
as much as possible.
1418
01:01:39,434 --> 01:01:41,567
"Let's let every business,
every individual,
1419
01:01:41,610 --> 01:01:43,917
"every 16-year-old,
every science lab,
1420
01:01:43,961 --> 01:01:45,963
you know, get the benefit
of the latest AI models."
1421
01:01:46,006 --> 01:01:48,226
But now you have
every terrorist group,
1422
01:01:48,269 --> 01:01:51,664
every disenfranchised person
having the power to make
1423
01:01:51,708 --> 01:01:53,797
the very worst
biological weapon.
1424
01:01:53,840 --> 01:01:55,581
Hacking infrastructure,
creating deepfakes,
1425
01:01:55,624 --> 01:01:57,452
flooding
our information environment.
1426
01:01:57,496 --> 01:02:00,412
So that creates all these risks
of sort of catastrophic harm
1427
01:02:00,455 --> 01:02:02,893
and-and societal collapse
through that direction.
1428
01:02:02,936 --> 01:02:04,677
And so we're sort of stuck
between this rock
1429
01:02:04,721 --> 01:02:07,549
and a hard place,
between "lock it up..."
1430
01:02:07,593 --> 01:02:09,029
[indistinct chatter]
1431
01:02:10,683 --> 01:02:12,293
-...or "let it rip."
-[sirens wailing]
1432
01:02:12,337 --> 01:02:14,208
[people shouting]
1433
01:02:14,252 --> 01:02:17,559
So we have to find something
like a narrow path
1434
01:02:17,603 --> 01:02:20,388
that avoids
these two negative outcomes.
1435
01:02:20,432 --> 01:02:22,956
So, if that's all true,
why wouldn't we just slow down
1436
01:02:23,000 --> 01:02:25,829
and figure all this out
before it's too late?
1437
01:02:25,872 --> 01:02:28,919
If humanity was extremely wise,
1438
01:02:28,962 --> 01:02:31,182
that's what we would do.
1439
01:02:31,225 --> 01:02:33,097
But there's, like, a different
way to face this stuff, right?
1440
01:02:33,140 --> 01:02:35,926
Which is:
What are the rules of the game?
1441
01:02:35,969 --> 01:02:39,494
A lot of, like, what CEOs do
1442
01:02:39,538 --> 01:02:42,149
is driven by
the incentives that they face.
1443
01:02:42,193 --> 01:02:45,283
It's primarily
profit-maximization incentives
1444
01:02:45,326 --> 01:02:47,198
that are driving
the development of AI.
1445
01:02:47,241 --> 01:02:50,375
Even the good guys are stuck
in this dilemma of
1446
01:02:50,418 --> 01:02:52,638
if they move too slowly,
1447
01:02:52,681 --> 01:02:54,553
then they leave themselves
vulnerable
1448
01:02:54,596 --> 01:02:57,251
to all of the other guys
who are cutting all corners.
1449
01:02:57,295 --> 01:03:01,995
All these top companies are in
a complete no-holds-barred race
1450
01:03:02,039 --> 01:03:05,433
to, as fast as possible, get
to AGI, get there right now.
1451
01:03:05,477 --> 01:03:06,913
♪ ♪
1452
01:03:08,567 --> 01:03:10,351
LADISH: Yeah, I mean,
I think it, I think
1453
01:03:10,395 --> 01:03:12,919
it probably starts
with DeepMind.
1454
01:03:12,963 --> 01:03:14,791
NEWSWOMAN:
Google is buying
1455
01:03:14,834 --> 01:03:16,357
artificial intelligence firm
DeepMind Technologies.
1456
01:03:16,401 --> 01:03:18,620
Terms of the deal
were not disclosed.
1457
01:03:18,664 --> 01:03:20,709
ELON MUSK: Larry Page and I
used to be very close friends,
1458
01:03:20,753 --> 01:03:22,711
and it became apparent to me
1459
01:03:22,755 --> 01:03:26,411
that Larry did not care
about AI safety.
1460
01:03:26,454 --> 01:03:28,892
EMILY CHANG: Elon Musk has said
you started OpenAI,
1461
01:03:28,935 --> 01:03:30,981
you both started OpenAI because
he was scared of Google.
1462
01:03:31,024 --> 01:03:32,156
LADISH: You-you basically
had the foundation
1463
01:03:32,199 --> 01:03:33,722
of OpenAI come out of that.
1464
01:03:33,766 --> 01:03:35,202
"So we're gonna do it better.
We're gonna do it
1465
01:03:35,246 --> 01:03:36,813
in a safer way
or in a more open way."
1466
01:03:38,466 --> 01:03:40,555
So that's what started OpenAI.
1467
01:03:40,599 --> 01:03:42,731
And so now, instead of
having one AGI project,
1468
01:03:42,775 --> 01:03:44,821
you have two AGI projects.
1469
01:03:44,864 --> 01:03:46,910
The worst possible thing
that could happen
1470
01:03:46,953 --> 01:03:49,608
is if there's
multiple AGI projects
1471
01:03:49,651 --> 01:03:51,871
done by different people
who don't like each other
1472
01:03:51,915 --> 01:03:54,787
and are all competing
to get to AGI first.
1473
01:03:54,831 --> 01:03:56,963
This would be the worst
possible thing that can happen,
1474
01:03:57,007 --> 01:04:00,097
because this would mean
that whoever is the least safe,
1475
01:04:00,140 --> 01:04:03,143
whoever sacrifices the most
on safety to get ahead
1476
01:04:03,187 --> 01:04:05,145
will be the person
that gets there first.
1477
01:04:05,189 --> 01:04:07,844
-DANIEL: That's basically
what's happening, right? -Yeah.
1478
01:04:07,887 --> 01:04:11,325
You and your brother
famously left OpenAI, uh,
1479
01:04:11,369 --> 01:04:12,936
to start Anthropic.
1480
01:04:12,979 --> 01:04:15,068
RASKIN: And then Anthropic
started because
1481
01:04:15,112 --> 01:04:17,854
some researchers
inside of OpenAI said,
1482
01:04:17,897 --> 01:04:20,508
"I want to go off
and do it more safely."
1483
01:04:20,552 --> 01:04:23,511
You needed something in addition
to just scaling the models up,
1484
01:04:23,555 --> 01:04:25,818
which is alignment or safety.
1485
01:04:25,862 --> 01:04:27,428
HENDRYCKS:
"We are more responsible
1486
01:04:27,472 --> 01:04:29,648
or more trustworthy
or more moral."
1487
01:04:29,691 --> 01:04:31,519
Now you have three AGI projects.
1488
01:04:31,563 --> 01:04:33,043
But also sitting around
the table with you
1489
01:04:33,086 --> 01:04:34,827
are gonna be a bunch of AIs.
1490
01:04:34,871 --> 01:04:37,525
LADISH: And now Meta is-is
trying to do stuff.
1491
01:04:37,569 --> 01:04:39,571
Meanwhile, a new artificial
intelligence competitor
1492
01:04:39,614 --> 01:04:41,573
announced this week,
Elon Musk's...
1493
01:04:41,616 --> 01:04:44,619
HENDRYCKS: xAI, which would be
Elon Musk's organization.
1494
01:04:44,663 --> 01:04:46,230
MUSK:
I don't trust OpenAI.
1495
01:04:46,273 --> 01:04:47,884
The fight between
Elon Musk and OpenAI
1496
01:04:47,927 --> 01:04:49,581
has entered a new round.
1497
01:04:49,624 --> 01:04:51,713
I don't trust Sam Altman,
uh, and I, and I don't think
1498
01:04:51,757 --> 01:04:54,194
we want to have the most
powerful AI in the world
1499
01:04:54,238 --> 01:04:56,414
controlled by someone
who is not trustworthy.
1500
01:04:56,457 --> 01:04:58,416
[cheering and applause]
1501
01:04:58,459 --> 01:05:01,593
The incentive is
untold sums of money.
1502
01:05:01,636 --> 01:05:03,377
-Yes.
-Is untold power.
1503
01:05:03,421 --> 01:05:05,553
-Yes.
-Is untold control.
1504
01:05:05,597 --> 01:05:08,034
If you have something
that is a million times smarter
1505
01:05:08,078 --> 01:05:11,385
and more capable than
everything else on planet Earth
1506
01:05:11,429 --> 01:05:14,171
and no one else has that,
1507
01:05:14,214 --> 01:05:15,955
that thing is the incentive.
1508
01:05:15,999 --> 01:05:17,261
So you rule the world.
1509
01:05:18,610 --> 01:05:20,394
If you really believe this,
1510
01:05:20,438 --> 01:05:22,744
if you really in your heart
believe this,
1511
01:05:22,788 --> 01:05:24,181
then you might be
willing to take
1512
01:05:24,224 --> 01:05:26,096
quite a lot of risk
to make that happen.
1513
01:05:26,139 --> 01:05:28,881
Google has just released
its newest AI model.
1514
01:05:28,925 --> 01:05:31,231
The answer to OpenAI's ChatGPT.
1515
01:05:31,275 --> 01:05:33,059
How do we get to
this ten trillion?
1516
01:05:33,103 --> 01:05:35,192
Is NVIDIA becoming the most
valuable company in the US?
1517
01:05:35,235 --> 01:05:38,978
This is the largest business
opportunity in history.
1518
01:05:39,022 --> 01:05:40,501
In history.
1519
01:05:40,545 --> 01:05:42,242
So, the reason why
everyone's really hyped
1520
01:05:42,286 --> 01:05:44,679
about artificial intelligence
right now
1521
01:05:44,723 --> 01:05:48,031
is because
the more these companies hype
1522
01:05:48,074 --> 01:05:50,033
the potential capabilities
of their technology,
1523
01:05:50,076 --> 01:05:53,210
the more investment
they can attract.
1524
01:05:53,253 --> 01:05:54,733
CARL QUINTANILLA:
Amazon investing up to
1525
01:05:54,776 --> 01:05:56,996
four billion dollars
in start-up Anthropic.
1526
01:05:57,040 --> 01:05:58,693
OpenAI is setting its sights
1527
01:05:58,737 --> 01:06:00,739
on a blockbuster half
a trillion dollar valuation.
1528
01:06:00,782 --> 01:06:02,001
Half a trillion!
1529
01:06:02,045 --> 01:06:03,220
The race is on.
1530
01:06:03,263 --> 01:06:04,612
This is happening
faster than ever.
1531
01:06:04,656 --> 01:06:06,614
Are we in an AI bubble?
Of course.
1532
01:06:06,658 --> 01:06:09,182
I just don't see
the bubble bursting
1533
01:06:09,226 --> 01:06:12,446
while you still have
this major spending cycle.
1534
01:06:12,490 --> 01:06:15,014
RASKIN: Even if you think
this is all hype,
1535
01:06:15,058 --> 01:06:17,582
there are billions
to trillions of dollars
1536
01:06:17,625 --> 01:06:21,107
flowing into making AI systems
more powerful.
1537
01:06:21,151 --> 01:06:23,849
And once you have that thing
that's more powerful,
1538
01:06:23,892 --> 01:06:25,546
companies can use that
1539
01:06:25,590 --> 01:06:27,548
to get bigger profits
and to make more money.
1540
01:06:27,592 --> 01:06:30,116
Countries can use that
to make stronger militaries.
1541
01:06:30,160 --> 01:06:32,814
NVIDIA has overtaken
Microsoft and Apple
1542
01:06:32,858 --> 01:06:36,253
to become the world's
most valuable company.
1543
01:06:36,296 --> 01:06:38,995
The race to deploy becomes
the race to recklessness,
1544
01:06:39,038 --> 01:06:41,780
because they can't
deploy it that quickly
1545
01:06:41,823 --> 01:06:43,173
and also get it right.
1546
01:06:43,216 --> 01:06:46,045
They believe that
they're the good guys.
1547
01:06:46,089 --> 01:06:49,309
"And if I don't do it,
somebody who doesn't have
1548
01:06:49,353 --> 01:06:51,572
"as good values as me
will be sitting at the table
1549
01:06:51,616 --> 01:06:54,445
getting to make decisions, so
I have an obligation to do it."
1550
01:06:54,488 --> 01:06:56,186
Yes, this is
a very commonly held belief.
1551
01:06:56,229 --> 01:06:58,579
Many, many, many,
maybe most of the people
1552
01:06:58,623 --> 01:07:00,712
building this technology
believe that.
1553
01:07:00,755 --> 01:07:03,715
I'm worried about
the commercial competition,
1554
01:07:03,758 --> 01:07:05,847
but it turns out
I'm even more worried about
1555
01:07:05,891 --> 01:07:08,024
the geopolitical competition.
1556
01:07:08,067 --> 01:07:10,330
We were eight years behind
a year ago.
1557
01:07:10,374 --> 01:07:12,593
Now we're probably
less than one year behind.
1558
01:07:12,637 --> 01:07:15,683
Well, both Saudi Arabia
and the UAE have been racing
1559
01:07:15,727 --> 01:07:18,121
to set up data centers
and position themselves
1560
01:07:18,164 --> 01:07:20,384
as the dominant force in AI...
1561
01:07:20,427 --> 01:07:22,429
...to make South Korea
a global AI leader.
1562
01:07:22,473 --> 01:07:23,996
JULIA SIEGER:
French President Emmanuel Macron
1563
01:07:24,040 --> 01:07:25,867
spoke a lot about
artificial intelligence...
1564
01:07:25,911 --> 01:07:27,434
Now with more on Israel's role
1565
01:07:27,478 --> 01:07:29,610
in the artificial intelligence
revolution...
1566
01:07:29,654 --> 01:07:33,092
Countries will be competing
for whose AI technologies
1567
01:07:33,136 --> 01:07:34,876
create the next generation
of industry.
1568
01:07:34,920 --> 01:07:38,010
The Chinese are insisting
that AI,
1569
01:07:38,054 --> 01:07:39,838
as being developed in China,
1570
01:07:39,881 --> 01:07:41,709
reinforce the core values
1571
01:07:41,753 --> 01:07:44,016
of the Chinese Communist Party
and the Chinese system.
1572
01:07:44,060 --> 01:07:48,716
America has to beat China
in the AI race.
1573
01:07:48,760 --> 01:07:51,023
DANIEL: China's, like,
light-years behind.
1574
01:07:51,067 --> 01:07:52,677
Are they?
1575
01:07:52,720 --> 01:07:54,983
I mean, they have way more
training data than we do,
1576
01:07:55,027 --> 01:07:57,812
and there's nothing saying they
don't drop a model next month
1577
01:07:57,856 --> 01:08:00,946
that isn't--
doesn't far outperform GPT-4.
1578
01:08:00,989 --> 01:08:03,818
Everything was going just fine.
What could go wrong?
1579
01:08:03,862 --> 01:08:06,691
-DeepSeek... [speaking Chinese]
-DeepSeek... -DeepSeek...
1580
01:08:06,734 --> 01:08:08,040
There is a new model.
1581
01:08:08,084 --> 01:08:10,825
But from a Chinese lab
called DeepSeek.
1582
01:08:10,869 --> 01:08:13,959
Let's talk about DeepSeek,
because it is mind-blowing,
1583
01:08:14,002 --> 01:08:17,658
and it is shaking this
entire industry to its core.
1584
01:08:17,702 --> 01:08:19,399
The Trump administration
will ensure
1585
01:08:19,443 --> 01:08:23,055
that the most powerful
AI systems are built in the US
1586
01:08:23,099 --> 01:08:26,537
with American designed
and manufactured chips.
1587
01:08:26,580 --> 01:08:28,800
AI is China's
Apollo project.
1588
01:08:28,843 --> 01:08:30,193
The Chinese Communist Party
1589
01:08:30,236 --> 01:08:31,542
deeply understands the potential
1590
01:08:31,585 --> 01:08:32,891
for AI to disrupt warfare.
1591
01:08:32,934 --> 01:08:35,328
Google no longer promises
that it will not
1592
01:08:35,372 --> 01:08:38,418
use artificial intelligence
for weapons or surveillance.
1593
01:08:38,462 --> 01:08:40,028
China, North Korea, Russia
1594
01:08:40,072 --> 01:08:41,900
are gonna keep building it
as fast as possible
1595
01:08:41,943 --> 01:08:44,511
to get more economic advantage,
more productivity advantage,
1596
01:08:44,555 --> 01:08:46,818
more scientific advantage,
more military advantage,
1597
01:08:46,861 --> 01:08:48,646
'cause AI makes better weapons.
1598
01:08:48,689 --> 01:08:51,736
If you're talking about
a system as broad and capable
1599
01:08:51,779 --> 01:08:53,303
as a brilliant scientist,
1600
01:08:53,346 --> 01:08:57,959
it might be able to run
a military campaign
1601
01:08:58,003 --> 01:09:01,441
better than any of the generals
in the US government right now.
1602
01:09:01,485 --> 01:09:05,489
Taiwan is the issue creating
the most tension right now
1603
01:09:05,532 --> 01:09:07,882
between Beijing and the US.
1604
01:09:07,926 --> 01:09:10,450
There's a-a direct
oppositional disagreement
1605
01:09:10,494 --> 01:09:11,973
between the US and China
1606
01:09:12,017 --> 01:09:13,975
on whether Taiwan is
part of China.
1607
01:09:14,019 --> 01:09:18,415
MATHENY: Taiwan is important
for so many reasons.
1608
01:09:18,458 --> 01:09:20,547
It is also the home of
1609
01:09:20,591 --> 01:09:24,160
about 90% of advanced chip
manufacturing for the world,
1610
01:09:24,203 --> 01:09:28,381
which means that the supply
chain for advanced compute
1611
01:09:28,425 --> 01:09:31,036
is at risk should there be
1612
01:09:31,079 --> 01:09:32,951
some sort of scenario
around Taiwan,
1613
01:09:32,994 --> 01:09:37,999
whether a-a blockade
or an invasion by China.
1614
01:09:38,043 --> 01:09:40,698
HENDRYCKS:
Are we going to be in some race
1615
01:09:40,741 --> 01:09:45,616
between the US and China
that eventually devolves into
1616
01:09:45,659 --> 01:09:49,359
a militarized AI arms race and,
you know, potentially leads
1617
01:09:49,402 --> 01:09:51,274
to some great power conflict?
1618
01:09:51,317 --> 01:09:53,014
NICK SCHIFRIN: The outgoing
top US military commander
1619
01:09:53,058 --> 01:09:55,843
in the region
predicted war was coming.
1620
01:09:55,887 --> 01:09:57,932
I think the threat is manifest
during this decade,
1621
01:09:57,976 --> 01:09:59,456
in fact, in the next six years.
1622
01:09:59,499 --> 01:10:01,414
MATHENY:
Um, one of the scenarios
1623
01:10:01,458 --> 01:10:05,853
that I worry about is
a cyber flash war,
1624
01:10:05,897 --> 01:10:10,902
um, one in which
cyber tools that are autonomous
1625
01:10:10,945 --> 01:10:12,947
are competing against each other
1626
01:10:12,991 --> 01:10:14,601
in ways that are escalatory
1627
01:10:14,645 --> 01:10:16,386
without
meaningful human control.
1628
01:10:18,823 --> 01:10:20,433
HARARI:
You live in a war zone,
1629
01:10:20,477 --> 01:10:23,784
it will be an AI deciding
whether to bomb your house
1630
01:10:23,828 --> 01:10:25,743
and whether to kill you.
1631
01:10:25,786 --> 01:10:27,527
Let's have the machine decide.
1632
01:10:27,571 --> 01:10:29,355
That's the temptation.
1633
01:10:29,399 --> 01:10:32,576
That's why AI is so tempting
for militaries to adopt,
1634
01:10:32,619 --> 01:10:35,753
to create autonomous weapons,
because if I start believing
1635
01:10:35,796 --> 01:10:39,147
that my military adversary
is gonna adopt AI,
1636
01:10:39,191 --> 01:10:41,802
it'll be a race for who can
pull the trigger faster.
1637
01:10:41,846 --> 01:10:44,892
And the one that automates
that decision,
1638
01:10:44,936 --> 01:10:47,199
rather than having
a human in the loop,
1639
01:10:47,243 --> 01:10:49,201
is the one
that will win that war.
1640
01:10:49,245 --> 01:10:50,637
And if you think about
1641
01:10:50,681 --> 01:10:53,074
the nuclear arms race
in the 1940s...
1642
01:10:54,554 --> 01:10:56,556
...you know the Germans
are working on the bomb,
1643
01:10:56,600 --> 01:11:00,734
so it's not so easy to tell
Robert Oppenheimer then,
1644
01:11:00,778 --> 01:11:03,781
"Uh, y-you know, slow down."
1645
01:11:03,824 --> 01:11:05,304
So after watching this movie,
it's gonna be confusing,
1646
01:11:05,348 --> 01:11:06,653
'cause you're gonna go back,
1647
01:11:06,697 --> 01:11:08,307
and tomorrow
you're gonna use ChatGPT,
1648
01:11:08,351 --> 01:11:10,440
and it's gonna be
unbelievably helpful.
1649
01:11:10,483 --> 01:11:12,703
And I will use it, too, and
it'll be unbelievably helpful.
1650
01:11:12,746 --> 01:11:16,141
And you'll say, "Wait, so
I just saw this movie about AI
1651
01:11:16,184 --> 01:11:18,012
"and existential risk
and all these things,
1652
01:11:18,056 --> 01:11:20,624
and where's the threat again?"
1653
01:11:20,667 --> 01:11:24,715
And it's not that ChatGPT is
the existential threat.
1654
01:11:24,758 --> 01:11:28,675
It's that the race to deploy
the most powerful,
1655
01:11:28,719 --> 01:11:31,635
inscrutable,
uncontrollable technology,
1656
01:11:31,678 --> 01:11:35,116
under the worst
incentives possible,
1657
01:11:35,160 --> 01:11:37,031
that's the existential threat.
1658
01:11:38,119 --> 01:11:40,383
DANIEL:
But it strikes me that-that
1659
01:11:40,426 --> 01:11:43,211
we are in a context of a race.
1660
01:11:43,255 --> 01:11:45,910
-SUTSKEVER: Yes.
-And it is in a competitive,
1661
01:11:45,953 --> 01:11:47,651
"got to get there first,
got to win the race" setting,
1662
01:11:47,694 --> 01:11:49,000
"got to compete against China,
1663
01:11:49,043 --> 01:11:50,349
got to compete against
the other labs."
1664
01:11:50,393 --> 01:11:51,829
Isn't that right?
1665
01:11:51,872 --> 01:11:53,657
Today it is the case.
1666
01:11:53,700 --> 01:11:56,703
So we need to change
that race dynamic.
1667
01:11:57,922 --> 01:11:59,924
Don't you think?
1668
01:11:59,967 --> 01:12:02,187
I think that would be
very good indeed.
1669
01:12:03,710 --> 01:12:06,496
I-I think
there's some kind of...
1670
01:12:06,539 --> 01:12:09,542
mysticism around AI
that makes it feel like
1671
01:12:09,586 --> 01:12:11,718
it fell from the sky.
1672
01:12:11,762 --> 01:12:13,459
DANIEL: Thinking of this
as a godlike technology
1673
01:12:13,503 --> 01:12:15,418
is a problem.
1674
01:12:15,461 --> 01:12:16,897
Yeah.
1675
01:12:18,072 --> 01:12:19,900
Why?
1676
01:12:19,944 --> 01:12:22,990
It gives license
to the companies to...
1677
01:12:23,034 --> 01:12:26,254
[laughs]
not take responsibility
1678
01:12:26,298 --> 01:12:30,781
for the things that the software
that they built does.
1679
01:12:30,824 --> 01:12:35,089
If people made that connection,
1680
01:12:35,133 --> 01:12:37,527
maybe that would, like,
help them understand again
1681
01:12:37,570 --> 01:12:41,835
the dynamics of
all the money and power and...
1682
01:12:41,879 --> 01:12:45,317
chaos that's happening to...
1683
01:12:45,361 --> 01:12:47,101
create this technology.
1684
01:12:47,145 --> 01:12:49,234
It's, like, five guys
who control it, right?
1685
01:12:49,277 --> 01:12:51,628
-Like, five men?
-Basically, yeah.
1686
01:12:51,671 --> 01:12:53,847
-The CEOs?
-Yeah.
1687
01:12:53,891 --> 01:12:55,371
Okay.
1688
01:12:56,720 --> 01:12:58,722
♪ ♪
1689
01:13:01,072 --> 01:13:03,074
DANIEL:
Sometimes it feels like
1690
01:13:03,117 --> 01:13:07,252
I've been on this
just, like, endless journey
1691
01:13:07,295 --> 01:13:09,341
of trying to understand this.
1692
01:13:09,385 --> 01:13:12,388
You know, like
I'm climbing a mountain...
1693
01:13:15,608 --> 01:13:19,133
...and every time
I, like, get up a hill,
1694
01:13:19,177 --> 01:13:20,744
I think I've reached the top,
1695
01:13:20,787 --> 01:13:23,399
but it just keeps going
and going and going.
1696
01:13:23,442 --> 01:13:24,530
[camera clicks, whirs]
1697
01:13:24,574 --> 01:13:26,967
But from everything I've heard,
1698
01:13:27,011 --> 01:13:30,362
if I had to guess what's
at the top of Anxiety Mountain,
1699
01:13:30,406 --> 01:13:34,192
it's these five CEOs
from these five companies,
1700
01:13:34,235 --> 01:13:37,630
who are kind of like the
Oppenheimers of this moment.
1701
01:13:37,674 --> 01:13:39,458
These are the guys
who are building this thing.
1702
01:13:39,502 --> 01:13:40,807
Yeah.
1703
01:13:40,851 --> 01:13:44,158
Like, is there a plan?
1704
01:13:44,202 --> 01:13:46,422
The head of the company
that makes ChatGPT
1705
01:13:46,465 --> 01:13:49,947
warned of possible
significant harm to the world.
1706
01:13:49,990 --> 01:13:52,384
I think, if this technology
goes wrong,
1707
01:13:52,428 --> 01:13:53,472
it can go quite wrong.
1708
01:13:53,516 --> 01:13:55,779
[bursts of distinct chatter]
1709
01:13:55,822 --> 01:13:57,781
DANIEL:
Five dudes.
1710
01:13:57,824 --> 01:14:01,567
I-I never thought about us
as a social media company.
1711
01:14:01,611 --> 01:14:06,050
It-it feels like I have to try
and-and find these guys...
1712
01:14:06,093 --> 01:14:07,181
Mm-hmm.
1713
01:14:07,225 --> 01:14:09,009
...and get them in the movie.
1714
01:14:10,315 --> 01:14:12,622
Certainly we'd get
some clarity from that.
1715
01:14:13,666 --> 01:14:14,928
[camera clicks, whirs]
1716
01:14:16,582 --> 01:14:19,150
I mean, the buck's got to stop
somewhere, right?
1717
01:14:19,193 --> 01:14:21,152
[wind whistling]
1718
01:14:21,195 --> 01:14:22,936
♪ ♪
1719
01:14:23,981 --> 01:14:25,896
[tools whirring and clicking]
1720
01:14:27,071 --> 01:14:29,116
[whooshing]
1721
01:14:34,034 --> 01:14:35,296
DANIEL:
How's that feel?
1722
01:14:35,340 --> 01:14:37,690
-Great.
-That okay?
1723
01:14:37,734 --> 01:14:39,431
-Yeah.
-[indistinct crew chatter]
1724
01:14:39,475 --> 01:14:40,998
-Can I move the seat forward?
-DANIEL: Yes, you can.
1725
01:14:41,041 --> 01:14:42,173
It's, uh, it's just
a bit awkward.
1726
01:14:42,216 --> 01:14:43,566
I'm a little, like...
1727
01:14:43,609 --> 01:14:45,045
Dario, how do you feel
about that chair?
1728
01:14:45,089 --> 01:14:47,004
Um, yeah, the chair's good.
1729
01:14:47,047 --> 01:14:48,832
-I just wanted to move it
a little forward. -Okay.
1730
01:14:48,875 --> 01:14:50,529
I picked out that chair.
1731
01:14:50,573 --> 01:14:52,139
-It's a good chair.
-Thanks.
1732
01:14:52,183 --> 01:14:53,619
[cell phone ringing]
1733
01:14:57,884 --> 01:14:59,495
TED:
And just sit down there.
1734
01:15:01,409 --> 01:15:03,063
And then through here,
1735
01:15:03,107 --> 01:15:04,369
we'll be looking at each other
through this glass.
1736
01:15:04,412 --> 01:15:06,458
[busy signal beeping]
1737
01:15:11,419 --> 01:15:13,552
And sitting back in the chair,
leaning forward?
1738
01:15:13,596 --> 01:15:15,119
-DANIEL: Yeah. Well, whatever's
comfortable, I think. -Okay.
1739
01:15:15,162 --> 01:15:16,686
It's kind of cool.
There's a bunch of mirrors.
1740
01:15:16,729 --> 01:15:17,817
-Is that-- okay.
-WOMAN: Sam A., take one. Mark.
1741
01:15:17,861 --> 01:15:19,515
-How's that?
-Good. Thank you.
1742
01:15:19,558 --> 01:15:21,038
DANIEL: The genesis
of this project is that I was
1743
01:15:21,081 --> 01:15:22,822
sitting at home,
and I was playing around with,
1744
01:15:22,866 --> 01:15:24,389
I think, your image generator,
1745
01:15:24,432 --> 01:15:27,261
and I was simultaneously,
uh, terrified
1746
01:15:27,305 --> 01:15:29,437
-and really impressed.
-[chuckles softly]
1747
01:15:29,481 --> 01:15:31,875
-That is the usual combination.
-And then cut to
1748
01:15:31,918 --> 01:15:34,617
my wife and I find out
we're expecting.
1749
01:15:34,660 --> 01:15:38,272
And-and I'm, like, having
an existential crisis
1750
01:15:38,316 --> 01:15:40,231
as my wife is
six months pregnant.
1751
01:15:40,274 --> 01:15:42,929
And I think my first question,
which I ask everybody:
1752
01:15:42,973 --> 01:15:45,453
Is now a terrible time
to have a kid?
1753
01:15:45,497 --> 01:15:47,368
Am I making a big mistake?
1754
01:15:47,412 --> 01:15:49,022
I'm expecting a kid
in March, too. My first one.
1755
01:15:49,066 --> 01:15:50,502
-You're expecting a kid?
-Yeah.
1756
01:15:50,546 --> 01:15:51,808
-You're expecting in March?
-First kid.
1757
01:15:51,851 --> 01:15:53,287
-Mazel tov.
-Thank you very much.
1758
01:15:53,331 --> 01:15:54,462
I've never been so excited
for anything.
1759
01:15:54,506 --> 01:15:56,464
-That's how I feel.
-Yeah.
1760
01:15:56,508 --> 01:15:58,205
But you're not scared?
1761
01:15:58,249 --> 01:16:02,732
I mean, like, having a kid is
just this momentous thing
1762
01:16:02,775 --> 01:16:04,995
that I, you know,
I stay up every night, like,
1763
01:16:05,038 --> 01:16:06,823
reading these books about
how to raise a kid,
1764
01:16:06,866 --> 01:16:08,215
and I hope
I'm gonna do a good job,
1765
01:16:08,259 --> 01:16:09,260
and it feels
very overwhelming and--
1766
01:16:09,303 --> 01:16:12,045
but I'm not scared...
1767
01:16:12,089 --> 01:16:14,047
for kids to grow up
in a world with AI.
1768
01:16:14,091 --> 01:16:16,310
Like, that's... that'll be okay.
1769
01:16:16,354 --> 01:16:19,009
That is good to hear
coming from the guy.
1770
01:16:19,052 --> 01:16:20,750
I think it's a wonderful idea
to have kids.
1771
01:16:20,793 --> 01:16:23,230
I-I think they're the most
magical, incredible thing.
1772
01:16:23,274 --> 01:16:25,276
-[sighs heavily]
-There's so much uncertainty,
1773
01:16:25,319 --> 01:16:26,843
I would almost just do
what you're gonna do anyway.
1774
01:16:26,886 --> 01:16:28,932
I know that's not
a very satisfying answer,
1775
01:16:28,975 --> 01:16:31,891
but it's the only one
I can come up with.
1776
01:16:31,935 --> 01:16:34,459
Our kids are never gonna...
1777
01:16:34,502 --> 01:16:37,070
know a world that doesn't have
really advanced AI.
1778
01:16:37,114 --> 01:16:39,856
In fact, our kids will never
be smarter than an AI.
1779
01:16:39,899 --> 01:16:42,075
DANIEL: "Our kids will
never be smarter than an AI"?
1780
01:16:42,119 --> 01:16:44,338
Well, from a raw,
from a raw IQ perspective,
1781
01:16:44,382 --> 01:16:46,253
they will never be
smarter than an AI.
1782
01:16:46,297 --> 01:16:48,038
That notion doesn't
unsettle you a little bit?
1783
01:16:48,081 --> 01:16:51,128
'Cause it makes me feel
a little queasy in a weird way.
1784
01:16:51,171 --> 01:16:54,784
Um, it does unsettle me
a little bit.
1785
01:16:54,827 --> 01:16:56,263
But it is reality.
1786
01:16:56,307 --> 01:16:58,265
DANIEL:
Okay, so race dynamics.
1787
01:16:58,309 --> 01:17:01,007
We have a bunch of people
who are all in agreement
1788
01:17:01,051 --> 01:17:02,879
that this is scary.
1789
01:17:02,922 --> 01:17:04,750
Based on the timelines that
a lot of people have given me,
1790
01:17:04,794 --> 01:17:07,013
we have between two months and
five years to figure this out.
1791
01:17:07,057 --> 01:17:08,667
-Yeah.
-And-and, I guess,
1792
01:17:08,711 --> 01:17:12,845
my biggest question is:
Why can't we just stop?
1793
01:17:14,499 --> 01:17:19,069
The problem with, uh, uh,
"just stopping" is that, um,
1794
01:17:19,112 --> 01:17:22,202
there are many, many groups now
around the world, uh, uh,
1795
01:17:22,246 --> 01:17:25,684
building this, for
many nations, many companies,
1796
01:17:25,728 --> 01:17:28,121
um, all with
different motivations.
1797
01:17:28,165 --> 01:17:30,776
There are some of
these companies in this space
1798
01:17:30,820 --> 01:17:33,997
who their position is, "We want
to develop this technology
1799
01:17:34,040 --> 01:17:36,042
absolutely as fast as possible."
1800
01:17:36,086 --> 01:17:40,046
And even if we could pass laws
in the US and in Europe,
1801
01:17:40,090 --> 01:17:42,396
we need to convince...
1802
01:17:42,440 --> 01:17:44,964
Xi Jinping and Vladimir Putin
or, you know,
1803
01:17:45,008 --> 01:17:47,445
whoever their scientific
advisors are on their side.
1804
01:17:47,488 --> 01:17:49,229
That's gonna be really hard.
1805
01:17:49,273 --> 01:17:50,927
I think it is true
1806
01:17:50,970 --> 01:17:53,581
that if two people are
in exactly the same place,
1807
01:17:53,625 --> 01:17:55,888
uh, the one willing to take
more shortcuts on safety
1808
01:17:55,932 --> 01:17:58,064
should kind of
"get there first."
1809
01:17:58,108 --> 01:18:01,067
Uh, but...
1810
01:18:01,111 --> 01:18:02,939
we're able to use our lead
1811
01:18:02,982 --> 01:18:05,985
to spend a lot more time
doing safety testing.
1812
01:18:06,029 --> 01:18:07,987
DANIEL:
And-and what if you lose it?
1813
01:18:08,031 --> 01:18:09,597
You get the call
and you find out
1814
01:18:09,641 --> 01:18:11,687
that you're now, let's say,
six months behind.
1815
01:18:11,730 --> 01:18:14,254
-Wh-What happens then?
-Depends who we're behind to.
1816
01:18:14,298 --> 01:18:16,822
If it's, like,
an adversarial government,
1817
01:18:16,866 --> 01:18:18,824
that's probably really bad.
1818
01:18:18,868 --> 01:18:21,348
So let's say you get the call
that China has a,
1819
01:18:21,392 --> 01:18:24,787
has a recursively
self-improving agent
1820
01:18:24,830 --> 01:18:27,180
or something like that that we
should be really worried about.
1821
01:18:27,224 --> 01:18:29,226
What do-- what-what happens?
1822
01:18:30,967 --> 01:18:32,403
Um...
1823
01:18:35,406 --> 01:18:37,234
That case would require...
1824
01:18:37,277 --> 01:18:40,672
the first step there would be
to talk to the US government.
1825
01:18:40,716 --> 01:18:42,935
Sam, do you trust the
government's ability to handle
1826
01:18:42,979 --> 01:18:44,676
something like this?
1827
01:18:45,895 --> 01:18:47,766
Yeah, I do, actually.
1828
01:18:47,810 --> 01:18:49,550
There's other things I don't
trust the government to handle,
1829
01:18:49,594 --> 01:18:51,161
but that particular scenario,
I think they would know...
1830
01:18:51,204 --> 01:18:52,771
they-they-- yes.
1831
01:18:52,815 --> 01:18:54,425
DANIEL: When you have,
like, private discussions
1832
01:18:54,468 --> 01:18:56,383
with the other guys whose
fingers are on the trigger,
1833
01:18:56,427 --> 01:18:59,604
so to speak, um,
do those private discussions
1834
01:18:59,647 --> 01:19:01,301
fill you with confidence
1835
01:19:01,345 --> 01:19:03,347
or do they make you,
uh, more anxious?
1836
01:19:03,390 --> 01:19:07,090
Look, I mean, I-I know some
of them better than others.
1837
01:19:07,133 --> 01:19:10,136
Um, I have more confidence
in some of them th-than I have
1838
01:19:10,180 --> 01:19:12,356
in others, you know,
as with, as with any people.
1839
01:19:12,399 --> 01:19:14,140
You know,
you think about, you know,
1840
01:19:14,184 --> 01:19:15,838
the kids in your high school
class or something, right?
1841
01:19:15,881 --> 01:19:18,275
You know, some of them are,
you know, really sweet.
1842
01:19:18,318 --> 01:19:20,407
Some of them are well-meaning
but not that effective.
1843
01:19:20,451 --> 01:19:22,366
Some of them are,
you know, bullies.
1844
01:19:22,409 --> 01:19:23,759
Some of them are
really bad people.
1845
01:19:23,802 --> 01:19:25,673
[stammers]
You-you kind of really,
1846
01:19:25,717 --> 01:19:27,066
you really see the spread.
1847
01:19:27,110 --> 01:19:28,807
You know, am I, am I,
am I confident
1848
01:19:28,851 --> 01:19:30,548
that everyone's gonna do
the right thing,
1849
01:19:30,591 --> 01:19:32,028
that it's all gonna work out?
1850
01:19:32,071 --> 01:19:33,856
Um, no, I'm not.
1851
01:19:33,899 --> 01:19:35,509
And there's-there's nothing
I can do about that, right?
1852
01:19:35,553 --> 01:19:37,773
You know, all-all I can do
is-is push for the,
1853
01:19:37,816 --> 01:19:39,165
push for the, you know,
1854
01:19:39,209 --> 01:19:40,776
push for the government
to get involved.
1855
01:19:40,819 --> 01:19:42,647
But ultimately I'm just
one person there, too.
1856
01:19:42,690 --> 01:19:44,127
That's--
it's-it's up to all of us
1857
01:19:44,170 --> 01:19:46,042
to push for the government
to get involved.
1858
01:19:46,085 --> 01:19:48,479
That's the number one thing
that-that I think we need to do
1859
01:19:48,522 --> 01:19:50,916
to-to, you know, to set things
in the right direction.
1860
01:19:50,960 --> 01:19:52,744
What makes me anxious
about that is, like,
1861
01:19:52,788 --> 01:19:56,182
the basic reality that the
speed at which the technology
1862
01:19:56,226 --> 01:19:58,402
is proliferating and growing
is exponential,
1863
01:19:58,445 --> 01:20:01,448
and the mechanisms to legislate
are 300 years old,
1864
01:20:01,492 --> 01:20:02,928
-take forever.
-Yeah.
1865
01:20:02,972 --> 01:20:05,104
Um, I-I think
it's gonna be a heavy lift.
1866
01:20:05,148 --> 01:20:06,889
I-I definitely agree
with you on that.
1867
01:20:06,932 --> 01:20:08,891
DANIEL: You know,
what I'm literally looking for
1868
01:20:08,934 --> 01:20:11,241
is-is like,
"Here are steps that, like,
1869
01:20:11,284 --> 01:20:15,158
"the head honchos are gonna
take to-to focus on safety,
1870
01:20:15,201 --> 01:20:18,117
to mitigate the peril
and maximize the promise."
1871
01:20:18,161 --> 01:20:20,119
And I don't,
I don't, I don't know
1872
01:20:20,163 --> 01:20:21,817
that there's
a simple answer to that.
1873
01:20:21,860 --> 01:20:23,296
Um...
1874
01:20:25,951 --> 01:20:27,561
I mean, this maybe is
too simple,
1875
01:20:27,605 --> 01:20:30,173
but you...
1876
01:20:30,216 --> 01:20:31,957
create a new model,
1877
01:20:32,001 --> 01:20:34,786
you study and test it
very carefully.
1878
01:20:34,830 --> 01:20:37,223
You put it out
into the world gradually,
1879
01:20:37,267 --> 01:20:39,530
and then, more and more,
you understand
1880
01:20:39,573 --> 01:20:41,227
if that's safe or not,
and then if it is,
1881
01:20:41,271 --> 01:20:42,707
you can take the next step.
1882
01:20:44,404 --> 01:20:47,103
It doesn't sound as flashy
as, like, a brilliant scientist
1883
01:20:47,146 --> 01:20:50,454
coming up with one idea in a
lab to make an AI system, like,
1884
01:20:50,497 --> 01:20:53,979
perfectly safe and controllable
and everything else,
1885
01:20:54,023 --> 01:20:55,720
but it is what I believe
is gonna happen.
1886
01:20:55,763 --> 01:20:57,287
Like, it is the way
I think this works.
1887
01:20:57,330 --> 01:20:59,289
But let's just say
something terrible happens,
1888
01:20:59,332 --> 01:21:01,813
like a model gets loose
or goes rogue or something.
1889
01:21:01,857 --> 01:21:03,467
Is there a protocol?
1890
01:21:03,510 --> 01:21:05,817
Like, literally,
I'm imagining a red phone.
1891
01:21:05,861 --> 01:21:07,123
-Yeah.
-Sorry for thinking of this
1892
01:21:07,166 --> 01:21:08,515
in terms of movies, but, like...
1893
01:21:08,559 --> 01:21:09,995
There is a protocol.
1894
01:21:10,039 --> 01:21:11,475
-Is there a red phone
on your desk? -No.
1895
01:21:11,518 --> 01:21:13,216
[laughs]:
Is it a secret?
1896
01:21:13,259 --> 01:21:15,392
I mean, uh, no, it's not,
it's not as fancy or dramatic
1897
01:21:15,435 --> 01:21:17,176
as you, like, would hope,
but there's, like, you know--
1898
01:21:17,220 --> 01:21:18,874
we've, like, thought through
these scenarios,
1899
01:21:18,917 --> 01:21:20,527
and if this happens,
we're gonna call these people
1900
01:21:20,571 --> 01:21:21,964
in this order and do this
1901
01:21:22,007 --> 01:21:23,443
and kind of make
these decisions if, like--
1902
01:21:23,487 --> 01:21:26,055
I do believe that
when you have an opportunity
1903
01:21:26,098 --> 01:21:29,058
to do your thinking before
a stressful situation happens,
1904
01:21:29,101 --> 01:21:30,407
that's almost always
a good idea.
1905
01:21:30,450 --> 01:21:31,756
And writing it down is helpful.
1906
01:21:31,799 --> 01:21:33,279
-Being prepared is helpful.
-Yeah.
1907
01:21:34,324 --> 01:21:36,021
[sighs] You...
1908
01:21:36,065 --> 01:21:38,763
It would be impossible
for me to sit across from you
1909
01:21:38,806 --> 01:21:41,157
and-and ask you to promise me
that this is gonna go well?
1910
01:21:41,200 --> 01:21:43,855
That is impossible.
1911
01:21:43,899 --> 01:21:45,683
There aren't any easy answers,
unfortunately.
1912
01:21:45,726 --> 01:21:48,077
Uh, because it's such
a cutting-edge technology,
1913
01:21:48,120 --> 01:21:49,774
um, there's still
a lot of unknowns.
1914
01:21:49,817 --> 01:21:52,255
And I think that
that-that needs to be,
1915
01:21:52,298 --> 01:21:55,040
um, uh, you know, understood
1916
01:21:55,084 --> 01:21:58,957
and-and hence the need
for, uh, uh, some caution.
1917
01:21:59,001 --> 01:22:00,872
I wake up, you know, every day,
1918
01:22:00,916 --> 01:22:03,440
this is the, this is the
number one thing I think about.
1919
01:22:03,483 --> 01:22:05,137
Now, look, I'm human,
1920
01:22:05,181 --> 01:22:08,358
and, you know, has-has
every decision been perfect?
1921
01:22:08,401 --> 01:22:11,622
Can I even say my motivations
were always perfectly clear?
1922
01:22:11,665 --> 01:22:13,537
Of course not.
No one can say that.
1923
01:22:13,580 --> 01:22:16,366
Like, that's-that's
just not, like, you know--
1924
01:22:16,409 --> 01:22:18,629
that's-that's just not
how people work.
1925
01:22:18,672 --> 01:22:21,023
The-the history of science
tends to be that,
1926
01:22:21,066 --> 01:22:23,547
for better or for worse,
if something's possible to do--
1927
01:22:23,590 --> 01:22:26,985
and we now know AI is possible
to do-- humanity does it.
1928
01:22:27,029 --> 01:22:29,727
All of this
was-was going to happen.
1929
01:22:29,770 --> 01:22:32,295
This-this train
isn't gonna stop.
1930
01:22:32,338 --> 01:22:34,253
You can't step in front of
the train and stop it.
1931
01:22:34,297 --> 01:22:36,255
You're just gonna get squished.
1932
01:22:36,299 --> 01:22:38,388
ALTMAN:
I mean, it's very stressful.
1933
01:22:39,606 --> 01:22:41,043
You know, there's, like,
1934
01:22:41,086 --> 01:22:42,653
a lot of things
a lot of us don't know.
1935
01:22:42,696 --> 01:22:46,352
I think the history
of scientific discovery is
1936
01:22:46,396 --> 01:22:48,267
one of not knowing
what you don't know
1937
01:22:48,311 --> 01:22:49,660
and figuring out as you go.
1938
01:22:49,703 --> 01:22:52,968
Uh, but, yeah, it is a...
1939
01:22:53,011 --> 01:22:56,580
it is a stressful way to live.
1940
01:22:56,623 --> 01:22:58,756
DANIEL: Right. Sam, thank you
very much for doing this.
1941
01:22:58,799 --> 01:23:00,497
-And again, mazel tov.
-Thank you. And to you.
1942
01:23:00,540 --> 01:23:02,673
Thank you so much. Thanks, guys.
1943
01:23:02,716 --> 01:23:05,458
♪ ♪
1944
01:23:07,678 --> 01:23:08,984
[sighs]
1945
01:23:10,986 --> 01:23:13,162
[electricity crackling]
1946
01:23:13,205 --> 01:23:15,599
-[thunder cracks]
-[whooshing, grunting]
1947
01:23:42,626 --> 01:23:44,628
-[VCR clicks]
-[line ringing]
1948
01:23:47,848 --> 01:23:49,285
-[line clicks]
-KEVIN ROHER: Hello.
1949
01:23:49,328 --> 01:23:50,547
DANIEL:
Hey, Dad, how are you?
1950
01:23:50,590 --> 01:23:51,591
KEVIN:
Good. How you doing?
1951
01:23:51,635 --> 01:23:53,071
You working? What are you up to?
1952
01:23:53,115 --> 01:23:55,247
DANIEL:
I'm working on this AI film.
1953
01:23:55,291 --> 01:23:57,815
KEVIN:
And how's it going?
1954
01:23:57,858 --> 01:24:00,426
DANIEL:
You know, it... it's really...
1955
01:24:00,470 --> 01:24:02,167
-JOANNE ROHER: Hi, sweetie.
-DANIEL: Hi, Mom.
1956
01:24:02,211 --> 01:24:03,864
JOANNE: So what is
the premise of the film?
1957
01:24:03,908 --> 01:24:06,519
Is it a documentary?
Kev, don't use any more spices.
1958
01:24:06,563 --> 01:24:08,304
It's already over-spiced.
1959
01:24:08,347 --> 01:24:10,436
We're making chicken right now.
1960
01:24:10,480 --> 01:24:12,612
-Is it a documentary or what...
-DANIEL: It's about--
1961
01:24:12,656 --> 01:24:13,961
the movie's about
the end of the world.
1962
01:24:14,005 --> 01:24:15,789
The end of the world's coming,
1963
01:24:15,833 --> 01:24:17,574
and we're making a movie
about the end of the world.
1964
01:24:17,617 --> 01:24:19,837
-JOANNE: Really?
-DANIEL: Yeah.
1965
01:24:19,880 --> 01:24:21,578
KEVIN: Kind of a depressing
film, it sounds like.
1966
01:24:21,621 --> 01:24:23,449
JOANNE:
Yeah.
1967
01:24:23,493 --> 01:24:28,976
DANIEL: I'm feeling a lot,
like-- this very acute anxiety.
1968
01:24:29,020 --> 01:24:31,370
JOANNE: It's so scary,
but there's got to be--
1969
01:24:31,414 --> 01:24:35,287
you know, have you been meeting
some-some supersmart people
1970
01:24:35,331 --> 01:24:37,768
that are giving you any answers?
1971
01:24:37,811 --> 01:24:39,465
DANIEL: That's what's
frustrating about it.
1972
01:24:39,509 --> 01:24:41,337
No one knows.
1973
01:24:41,380 --> 01:24:43,904
JOANNE:
All I can say to that is that
1974
01:24:43,948 --> 01:24:49,867
every generation has had
something scary like this.
1975
01:24:49,910 --> 01:24:52,217
KEVIN: When I was born, it was
the Cuban Missile Crisis.
1976
01:24:52,261 --> 01:24:54,263
[steady heartbeat]
1977
01:24:54,306 --> 01:24:56,352
I was just scared that there
was going to be a nuclear war.
1978
01:24:56,395 --> 01:24:57,744
JOANNE:
Yeah, but we didn't know
1979
01:24:57,788 --> 01:24:59,094
what they were gonna do and...
1980
01:24:59,137 --> 01:25:00,530
KEVIN:
And the world didn't end.
1981
01:25:00,573 --> 01:25:01,618
Everyone woke up
the next morning,
1982
01:25:01,661 --> 01:25:03,402
and we're still doing our thing.
1983
01:25:03,446 --> 01:25:04,969
DANIEL:
I'm very scared,
1984
01:25:05,012 --> 01:25:06,840
especially in
the context of, like,
1985
01:25:06,884 --> 01:25:09,147
-you know, the baby and...
-[fetal heartbeat pulsing]
1986
01:25:09,191 --> 01:25:10,540
KEVIN:
It's gonna be a learning curve.
1987
01:25:10,583 --> 01:25:12,063
JOANNE:
You're gonna be okay.
1988
01:25:12,107 --> 01:25:13,064
KEVIN: You can't,
you can't think about
1989
01:25:13,108 --> 01:25:14,326
what you can't control, Daniel.
1990
01:25:14,370 --> 01:25:16,154
Just remember that.
1991
01:25:16,198 --> 01:25:17,677
DANIEL [voice breaking]:
I'm really, I'm really feeling
1992
01:25:17,721 --> 01:25:19,201
nervous and scared about it.
1993
01:25:19,244 --> 01:25:21,028
There's so much
that I can't control.
1994
01:25:21,072 --> 01:25:23,857
KEVIN: Don't be nervous.
You can't let that get to you.
1995
01:25:23,901 --> 01:25:26,251
You can only control
what you can control,
1996
01:25:26,295 --> 01:25:27,905
and that's all you can do.
1997
01:25:27,948 --> 01:25:30,603
You can't do more than that.
1998
01:25:30,647 --> 01:25:32,649
Write that down in your book.
1999
01:25:35,347 --> 01:25:36,870
♪ ♪
2000
01:25:44,095 --> 01:25:46,663
[fetal heartbeat pulsing]
2001
01:25:46,706 --> 01:25:49,231
[howling]
2002
01:25:49,274 --> 01:25:51,581
CAROLINE:
When you look back,
2003
01:25:51,624 --> 01:25:54,453
the world is always ending.
2004
01:25:54,497 --> 01:25:56,803
And when you look ahead,
2005
01:25:56,847 --> 01:25:59,241
the world is always ending.
2006
01:25:59,284 --> 01:26:02,157
...on fire.
One home is already on fire...
2007
01:26:02,200 --> 01:26:05,377
CAROLINE: But the world
is always starting, too.
2008
01:26:07,988 --> 01:26:09,947
DANIEL:
Are you ready?
2009
01:26:09,990 --> 01:26:11,992
Are you ever really ready?
2010
01:26:15,909 --> 01:26:17,650
DANIEL: You want to drive
or you want me to drive?
2011
01:26:17,694 --> 01:26:18,782
CAROLINE:
You drive.
2012
01:26:20,479 --> 01:26:21,828
-[woman speaks indistinctly]
-CAROLINE: Okay.
2013
01:26:21,872 --> 01:26:23,700
DANIEL:
Thank you, Maria.
2014
01:26:23,743 --> 01:26:26,268
WOMAN: It's gonna just
feel really crampy.
2015
01:26:26,311 --> 01:26:27,791
-CAROLINE: Okay.
-WOMAN: You're ready?
2016
01:26:27,834 --> 01:26:29,053
-CAROLINE: Yes.
-[indistinct chatter]
2017
01:26:29,096 --> 01:26:30,924
[Caroline sighs heavily]
2018
01:26:30,968 --> 01:26:32,404
WOMAN: You could be in a
medical drama with all that...
2019
01:26:33,840 --> 01:26:35,842
♪ ♪
2020
01:26:44,329 --> 01:26:45,896
-[static crackles]
-[indistinct chatter]
2021
01:26:49,116 --> 01:26:51,075
WOMAN:
...some pressure.
2022
01:26:52,685 --> 01:26:54,078
Just relax.
2023
01:26:56,907 --> 01:26:59,214
-[baby crying]
-[device beeping]
2024
01:27:02,129 --> 01:27:04,915
DANIEL:
Hi, buddy. Ooh.
2025
01:27:04,958 --> 01:27:06,438
Hi, buddy.
2026
01:27:06,482 --> 01:27:08,484
♪ ♪
2027
01:27:31,202 --> 01:27:33,683
KEVIN:
It's gonna be a whole new world
2028
01:27:33,726 --> 01:27:35,032
for you, Daniel.
2029
01:27:37,208 --> 01:27:40,080
And I think you're gonna be
an amazing father.
2030
01:27:41,517 --> 01:27:44,346
JOANNE [chuckles]:
That's for sure.
2031
01:27:44,389 --> 01:27:46,609
-[Kevin crying]
-Oh.
2032
01:27:46,652 --> 01:27:48,654
[crying continues]
2033
01:27:48,698 --> 01:27:50,787
KEVIN:
You're gonna do a great job.
2034
01:27:51,788 --> 01:27:53,050
JOANNE:
Kev, why are you crying?
2035
01:27:53,093 --> 01:27:54,617
KEVIN:
I'm crying because...
2036
01:27:54,660 --> 01:27:57,054
I don't know. I just don't...
2037
01:27:57,097 --> 01:27:59,317
DANIEL: Dad, you're gonna
make me cry, too.
2038
01:27:59,361 --> 01:28:00,710
Why are you crying?
2039
01:28:03,190 --> 01:28:05,497
KEVIN: I just know that
you're gonna be an amazing dad.
2040
01:28:05,541 --> 01:28:07,107
DANIEL [crying]:
I'm only gonna be a great dad
2041
01:28:07,151 --> 01:28:09,675
'cause I had a great dad.
2042
01:28:09,719 --> 01:28:12,287
KEVIN: I'm getting emotional,
that's all.
2043
01:28:12,330 --> 01:28:13,679
JOANNE:
Aw.
2044
01:28:13,723 --> 01:28:14,898
[playful chatter over video]
2045
01:28:14,941 --> 01:28:16,552
[over video]:
Boy, oh, boy!
2046
01:28:16,595 --> 01:28:17,857
My goodness!
2047
01:28:18,902 --> 01:28:20,817
Look at those little cheekies.
2048
01:28:20,860 --> 01:28:24,777
Look at those little cheekies.
Look at those little cheekies.
2049
01:28:24,821 --> 01:28:26,344
Yeah.
2050
01:28:26,388 --> 01:28:28,433
DANIEL:
I know how to end this movie.
2051
01:28:29,478 --> 01:28:32,263
Babies.
2052
01:28:32,307 --> 01:28:35,266
-The end of the movie is about
babies. -[chatter over videos]
2053
01:28:35,310 --> 01:28:37,094
They're life-affirming.
2054
01:28:37,137 --> 01:28:38,922
They're exhausting.
2055
01:28:38,965 --> 01:28:40,793
-[baby laughing]
-They're hilarious.
2056
01:28:40,837 --> 01:28:42,665
And they're worth it.
2057
01:28:42,708 --> 01:28:46,451
This film isn't about the
inner workings of a technology.
2058
01:28:46,495 --> 01:28:48,235
It's not about
the billionaire CEOs.
2059
01:28:48,279 --> 01:28:50,194
It's not about the geopolitics.
2060
01:28:50,237 --> 01:28:53,589
It's not about the terrifying
future or the end of the world,
2061
01:28:53,632 --> 01:28:55,678
because my world
is just starting.
2062
01:28:55,721 --> 01:28:57,375
I'm building a crib.
2063
01:28:57,419 --> 01:28:59,290
Right here, right now.
2064
01:28:59,334 --> 01:29:00,465
[baby cooing]
2065
01:29:00,509 --> 01:29:02,337
AI is gonna change everything
2066
01:29:02,380 --> 01:29:05,601
in ways too powerful and
complex for us to understand.
2067
01:29:05,644 --> 01:29:08,952
And the future is not
for any of us to decide.
2068
01:29:10,736 --> 01:29:13,609
But what I can decide is to be
the best possible husband
2069
01:29:13,652 --> 01:29:17,700
for my wife and the best
possible dad for my son.
2070
01:29:17,743 --> 01:29:21,399
So whether our AI future is
a nightmarish dystopia
2071
01:29:21,443 --> 01:29:23,836
or the utopia
that we all dream of,
2072
01:29:23,880 --> 01:29:26,883
I'll at least know
that I did everything I could
2073
01:29:26,926 --> 01:29:30,103
to guide my family
through this AI revolution.
2074
01:29:30,147 --> 01:29:33,846
And no matter what,
we'll be facing it together.
2075
01:29:33,890 --> 01:29:35,195
[uplifting music swells]
2076
01:29:35,239 --> 01:29:36,806
[music stops abruptly]
2077
01:29:36,849 --> 01:29:40,505
DANIEL:
So that's just our first idea.
2078
01:29:40,549 --> 01:29:42,812
How does this feel?
Are you feeling this?
2079
01:29:42,855 --> 01:29:45,249
Wait, I-- like, this is not--
this is a joke.
2080
01:29:45,292 --> 01:29:47,425
It's not actually
how you're gonna end it.
2081
01:29:50,036 --> 01:29:52,648
I mean, it's just an idea. Okay?
2082
01:29:52,691 --> 01:29:54,389
No, Daniel.
2083
01:29:54,432 --> 01:29:56,129
...uh, very, very dumb.
2084
01:29:56,173 --> 01:29:58,915
You've just spent, I don't
know, like, how many years
2085
01:29:58,958 --> 01:30:02,571
of our life working on this,
talking to every leading expert
2086
01:30:02,614 --> 01:30:04,964
on the planet about the subject,
2087
01:30:05,008 --> 01:30:07,663
and you're gonna end it
2088
01:30:07,706 --> 01:30:10,796
with some, like,
kumbaya bullshit?
2089
01:30:10,840 --> 01:30:13,495
There's an asteroid
headed to Earth.
2090
01:30:13,538 --> 01:30:15,497
What do you do? Just...
2091
01:30:15,540 --> 01:30:17,629
hold hands
and hope it works out okay?
2092
01:30:17,673 --> 01:30:19,414
Absolutely not.
2093
01:30:19,457 --> 01:30:22,286
We have to-- it ha--
it has to be...
2094
01:30:22,329 --> 01:30:24,419
The ending has to...
2095
01:30:25,855 --> 01:30:27,900
CAROLINE:
Okay, first thing:
2096
01:30:27,944 --> 01:30:31,426
AI is here,
and it's here to stay.
2097
01:30:31,469 --> 01:30:32,905
The shit's out of the horse,
2098
01:30:32,949 --> 01:30:35,430
but the horse is
gonna keep shitting.
2099
01:30:38,171 --> 01:30:39,651
[crowd cheering]
2100
01:30:39,695 --> 01:30:41,566
You know, one of the basic
laws of history
2101
01:30:41,610 --> 01:30:44,003
is that nothing
really has a beginning
2102
01:30:44,047 --> 01:30:45,962
and nothing has any ending.
2103
01:30:46,005 --> 01:30:47,703
It just goes on.
2104
01:30:47,746 --> 01:30:50,532
AI is nowhere near
its full development.
2105
01:30:50,575 --> 01:30:52,795
CAROLINE: Even if
the current AI bubble bursts,
2106
01:30:52,838 --> 01:30:54,579
-humans are never going to stop
-[noisemaker blows]
2107
01:30:54,623 --> 01:30:57,539
building more and more
powerful technology.
2108
01:30:57,582 --> 01:30:59,410
HAO:
You can choose not to use AI
2109
01:30:59,454 --> 01:31:02,239
or participate in it, but it's
going to affect you anyway.
2110
01:31:02,282 --> 01:31:04,023
DANIEL: Okay, fantastic.
So we're screwed.
2111
01:31:04,067 --> 01:31:06,635
CAROLINE: No, we're not
because of one simple thing.
2112
01:31:10,943 --> 01:31:12,989
This is not inevitable.
2113
01:31:13,032 --> 01:31:15,208
If we could just see it
clearly together,
2114
01:31:15,252 --> 01:31:18,342
the obvious response will be
to choose something different.
2115
01:31:18,385 --> 01:31:21,040
We need to very clearly
change the game
2116
01:31:21,084 --> 01:31:23,129
from a race to the bottom
into a race to the top.
2117
01:31:23,173 --> 01:31:27,438
LEAHY: The problem we need to
solve is not AI specifically.
2118
01:31:27,482 --> 01:31:31,224
It's the general question
of how do we build a society
2119
01:31:31,268 --> 01:31:33,879
that can deal with
powerful technology.
2120
01:31:33,923 --> 01:31:35,359
Because we're going to get
2121
01:31:35,402 --> 01:31:36,795
only more and more powerful
technology.
2122
01:31:36,839 --> 01:31:39,668
CAROLINE:
We need to upgrade our society,
2123
01:31:39,711 --> 01:31:41,496
and the first step is
coming together
2124
01:31:41,539 --> 01:31:43,367
-and demanding...
-ALL: Coordination.
2125
01:31:43,410 --> 01:31:47,023
Some form of international
cooperation or agreement about
2126
01:31:47,066 --> 01:31:48,503
what the norms should be.
2127
01:31:48,546 --> 01:31:49,895
You know, how should
they be deployed...
2128
01:31:49,939 --> 01:31:51,549
Like, real
international diplomacy
2129
01:31:51,593 --> 01:31:53,290
among the superpowers.
2130
01:31:53,333 --> 01:31:56,293
The Chinese are as worried
about it as the Americans.
2131
01:31:56,336 --> 01:31:57,729
I think it's difficult,
you know,
2132
01:31:57,773 --> 01:31:59,514
in the current
geopolitical climate,
2133
01:31:59,557 --> 01:32:01,254
-but I think it's necessary.
-RASKIN: Absolutely.
2134
01:32:01,298 --> 01:32:04,693
In the exact same way that
the last time that humanity
2135
01:32:04,736 --> 01:32:09,611
developed a technology
this dangerous...
2136
01:32:09,654 --> 01:32:12,396
...that required a complete,
unprecedented shift
2137
01:32:12,439 --> 01:32:15,530
to the structure of our world.
2138
01:32:15,573 --> 01:32:17,619
CAROLINE:
So we need to do that complete,
2139
01:32:17,662 --> 01:32:20,883
unprecedented shift again.
2140
01:32:20,926 --> 01:32:24,103
You know, we-we talk to people
who work at these AI companies,
2141
01:32:24,147 --> 01:32:25,322
and they say they want to do
something different,
2142
01:32:25,365 --> 01:32:26,628
but they need public pressure.
2143
01:32:26,671 --> 01:32:28,151
They need the government
to do something.
2144
01:32:28,194 --> 01:32:29,979
So then we go to DC,
and they say,
2145
01:32:30,022 --> 01:32:31,850
"Well, we need Silicon Valley
to do something different.
2146
01:32:31,894 --> 01:32:33,635
They're the ones who are gonna
come up with the guardrails."
2147
01:32:33,678 --> 01:32:35,680
And so everyone is pointing
the finger at someone else,
2148
01:32:35,724 --> 01:32:39,466
and what they agree on is
that we need public pressure
2149
01:32:39,510 --> 01:32:41,207
in order for something else
to happen.
2150
01:32:41,251 --> 01:32:42,818
And that's what you
2151
01:32:42,861 --> 01:32:44,254
and all the people
watching this movie can do.
2152
01:32:44,297 --> 01:32:45,298
CAROLINE:
We need to hold the leaders
2153
01:32:45,342 --> 01:32:47,083
in our governments
2154
01:32:47,126 --> 01:32:48,563
and the leaders of
these companies accountable.
2155
01:32:48,606 --> 01:32:51,000
BOEREE:
Whichever country you're in,
2156
01:32:51,043 --> 01:32:52,871
let them know
that you're not happy
2157
01:32:52,915 --> 01:32:54,569
with the current status quo.
2158
01:32:54,612 --> 01:32:57,093
So, yeah, it's boring to say
call your congressperson.
2159
01:32:57,136 --> 01:32:59,356
I'm not saying you should
just do that, but, like,
2160
01:32:59,399 --> 01:33:00,923
we do have to do that.
2161
01:33:00,966 --> 01:33:02,402
Like, we do have to get
the government involved.
2162
01:33:02,446 --> 01:33:03,839
DANIEL: So we just
call them up and say,
2163
01:33:03,882 --> 01:33:05,971
"Hey, stop Big Tech
from ruining the world"?
2164
01:33:06,015 --> 01:33:08,887
CAROLINE: No, but there are
tons of really obvious,
2165
01:33:08,931 --> 01:33:10,672
straightforward things
we can be demanding.
2166
01:33:10,715 --> 01:33:12,499
We need transparency.
2167
01:33:12,543 --> 01:33:14,240
We need to end the secrecy
that exists inside these labs,
2168
01:33:14,284 --> 01:33:16,634
because they are building
powerful technology,
2169
01:33:16,678 --> 01:33:18,680
and the public deserves to know
what's going on.
2170
01:33:18,723 --> 01:33:20,682
Ultimately,
we're gonna need independent,
2171
01:33:20,725 --> 01:33:23,249
objective third parties
to evaluate the systems.
2172
01:33:23,293 --> 01:33:27,036
We can't count on the companies
to grade their own homework.
2173
01:33:27,079 --> 01:33:31,170
If-if a company uses AI and-and
has AI interacting with you,
2174
01:33:31,214 --> 01:33:34,957
it should disclose that you are
interacting with an AI system.
2175
01:33:35,000 --> 01:33:38,613
Yeah. And-and also we need
a system that makes companies
2176
01:33:38,656 --> 01:33:41,833
legally liable for the
AI systems that they produce.
2177
01:33:41,877 --> 01:33:45,750
We need to make sure that there
are tests and safety standards
2178
01:33:45,794 --> 01:33:48,013
that are applied to everyone.
2179
01:33:48,057 --> 01:33:49,711
We need some ground rules,
2180
01:33:49,754 --> 01:33:51,538
and we need to keep adapting
those rules
2181
01:33:51,582 --> 01:33:54,585
at the speed that
the technology develops.
2182
01:33:54,629 --> 01:33:57,153
LEAHY: There is currently
more regulation
2183
01:33:57,196 --> 01:33:59,285
on selling a sandwich
to the public
2184
01:33:59,329 --> 01:34:02,898
than there is on building
potentially world-ending AGI.
2185
01:34:02,941 --> 01:34:04,682
CAROLINE:
And the last thing is
2186
01:34:04,726 --> 01:34:07,598
to upgrade ourselves.
2187
01:34:07,642 --> 01:34:09,731
This is not, like, the job
of, like, the safety team
2188
01:34:09,774 --> 01:34:11,733
at any given lab, or the CEO.
2189
01:34:11,776 --> 01:34:13,299
So, like, this is
everyone's job.
2190
01:34:13,343 --> 01:34:14,779
Don't, like, leave it
up to the AI experts.
2191
01:34:14,823 --> 01:34:16,955
Like, [bleep] that.
Like, this is the moment
2192
01:34:16,999 --> 01:34:19,262
that we are transitioning
from, like,
2193
01:34:19,305 --> 01:34:21,525
mostly human cognitive power
to, like, AI cognitive power,
2194
01:34:21,568 --> 01:34:23,092
and it affects everyone,
2195
01:34:23,135 --> 01:34:24,833
and I want people to be
in on that conversation.
2196
01:34:24,876 --> 01:34:26,878
And I would say, if you think
that AI will kill us all,
2197
01:34:26,922 --> 01:34:28,488
you should be working
in AI research
2198
01:34:28,532 --> 01:34:30,229
to make sure it doesn't,
because you do have
2199
01:34:30,273 --> 01:34:32,188
an enormous amount of agency.
2200
01:34:32,231 --> 01:34:33,842
CAROLINE:
Whoever you are,
2201
01:34:33,885 --> 01:34:35,844
you are an expert
in your own industry,
2202
01:34:35,887 --> 01:34:39,151
in your own school,
in your own family,
2203
01:34:39,195 --> 01:34:42,589
and it's up to you
how AI is used in your life.
2204
01:34:42,633 --> 01:34:44,983
Whether you want to join
your school board
2205
01:34:45,027 --> 01:34:47,856
or-or whether you want to ask
your employer
2206
01:34:47,899 --> 01:34:49,379
how they're using
AI technologies,
2207
01:34:49,422 --> 01:34:51,947
like, all of us can do the work.
2208
01:34:51,990 --> 01:34:56,081
A lot of unions have been
pretty effective at, like,
2209
01:34:56,125 --> 01:34:59,041
determining how they want
to interact with these systems.
2210
01:34:59,084 --> 01:35:02,697
-Nurses unions, teacher unions.
-[indistinct chanting]
2211
01:35:02,740 --> 01:35:04,568
DANIELA AMODEI: I would love
if parents everywhere
2212
01:35:04,611 --> 01:35:06,135
went to the AI companies
and said,
2213
01:35:06,178 --> 01:35:08,006
"How can you be better
on this?" Including us.
2214
01:35:08,050 --> 01:35:11,183
So, I founded Encode Justice
when I was 15 years old.
2215
01:35:11,227 --> 01:35:14,839
We are the world's first
and largest army of young people
2216
01:35:14,883 --> 01:35:18,364
fighting for human-centered
artificial intelligence.
2217
01:35:18,408 --> 01:35:21,019
It doesn't matter who you are,
even the smallest actions help,
2218
01:35:21,063 --> 01:35:23,935
and even conversation starting
is really, really valuable.
2219
01:35:23,979 --> 01:35:25,937
DANIEL:
So my job could be, like,
2220
01:35:25,981 --> 01:35:27,939
tell Bubby Lila about this
at dinner?
2221
01:35:27,983 --> 01:35:29,767
CAROLINE: Honestly, yeah,
that's part of it.
2222
01:35:29,811 --> 01:35:31,638
And we're gonna have to do
2223
01:35:31,682 --> 01:35:34,424
a lot of things that
we haven't even thought of yet.
2224
01:35:34,467 --> 01:35:36,034
People are gonna look at
anything that we've outlined
2225
01:35:36,078 --> 01:35:37,253
and say, "That's not enough."
2226
01:35:37,296 --> 01:35:38,950
What matters is that the forces
2227
01:35:38,994 --> 01:35:40,909
that are working
towards solutions
2228
01:35:40,952 --> 01:35:44,434
start to exceed the forces that
are working against solutions.
2229
01:35:44,477 --> 01:35:46,392
Making the world better
has always been hard.
2230
01:35:46,436 --> 01:35:47,829
It has never been easy.
2231
01:35:47,872 --> 01:35:49,961
Like, there have been
many shitty things
2232
01:35:50,005 --> 01:35:51,920
that have happened in history,
and we've had--
2233
01:35:51,963 --> 01:35:53,704
like, people have had
to deal with that,
2234
01:35:53,748 --> 01:35:56,794
and then they've risen up
and changed it.
2235
01:35:56,838 --> 01:36:00,319
There are an insane number
of challenges ahead of us,
2236
01:36:00,363 --> 01:36:04,106
but if we can get past them,
2237
01:36:04,149 --> 01:36:08,066
we can unlock a future beyond
our wildest imagination.
2238
01:36:08,110 --> 01:36:10,416
CAROLINE:
We have to come together
2239
01:36:10,460 --> 01:36:14,246
and find the path between
the promise and the peril.
2240
01:36:14,290 --> 01:36:16,422
We can't be pessimists
or optimists.
2241
01:36:17,423 --> 01:36:19,425
We have to become something new.
2242
01:36:21,732 --> 01:36:24,474
A friend of mine calls me
an apocaloptimist.
2243
01:36:24,517 --> 01:36:26,041
"Apocaloptimist"?
2244
01:36:26,084 --> 01:36:28,260
I think that might be
my new favorite word.
2245
01:36:28,304 --> 01:36:29,914
-[laughs] -MAN: Might be
the name of this movie.
2246
01:36:29,958 --> 01:36:31,263
-It might be the name
of this movie. -[laughter]
2247
01:36:31,307 --> 01:36:34,179
-"Apocaloptimist."
-"Apocaloptimist."
2248
01:36:34,223 --> 01:36:35,833
Yeah.
2249
01:36:35,877 --> 01:36:37,269
I don't believe in doom.
2250
01:36:37,313 --> 01:36:39,054
I believe in the spirit of life,
2251
01:36:39,097 --> 01:36:42,666
uh, and I believe in life is
about the capacity to act.
2252
01:36:42,709 --> 01:36:44,276
-[crowd singing]
-The capacity to relate,
2253
01:36:44,320 --> 01:36:46,365
the capacity to feel.
2254
01:36:49,020 --> 01:36:50,979
-[indistinct mission chatter]
-[cheering]
2255
01:36:51,022 --> 01:36:53,938
We have to double down
more and more
2256
01:36:53,982 --> 01:36:57,028
on those capacities
that we have as humans
2257
01:36:57,072 --> 01:37:00,118
that robotic systems
will never have.
2258
01:37:00,162 --> 01:37:04,296
It's time right now
to make those decisions about
2259
01:37:04,340 --> 01:37:07,473
how to guide it and support it
rather than dividing us.
2260
01:37:07,517 --> 01:37:10,259
DANIEL: It kind of sounds like
raising a kid.
2261
01:37:10,302 --> 01:37:12,435
That's what's up. Yeah.
2262
01:37:12,478 --> 01:37:14,480
CAROLINE: AI may have
more raw intelligence
2263
01:37:14,524 --> 01:37:17,222
-than our little human brains,
-[baby laughing]
2264
01:37:17,266 --> 01:37:21,052
but we're so much more
than just our intelligence.
2265
01:37:21,096 --> 01:37:25,056
Intelligence is-is
the ability to solve problems.
2266
01:37:25,100 --> 01:37:28,320
Wisdom is the ability to know
which problems to solve.
2267
01:37:30,932 --> 01:37:33,717
[indistinct crew chatter]
2268
01:37:33,760 --> 01:37:34,892
DANIEL:
It can go on your fridge.
2269
01:37:34,936 --> 01:37:36,198
[laughing]
2270
01:37:36,241 --> 01:37:38,113
YUDKOWSKY:
So, don't give up.
2271
01:37:38,156 --> 01:37:39,766
Humanity has done
2272
01:37:39,810 --> 01:37:42,160
more difficult things than this
in its history.
2273
01:37:43,553 --> 01:37:46,338
It's just hard to convince
people that they should.
2274
01:37:46,382 --> 01:37:48,645
♪ ♪
2275
01:37:51,474 --> 01:37:52,562
[audio crackles]
2276
01:37:54,912 --> 01:37:56,827
DANIEL: So, when I started
making this movie,
2277
01:37:56,871 --> 01:37:58,437
I would say that I was, like,
2278
01:37:58,481 --> 01:38:00,918
broadly a cynical asshole
about this whole thing.
2279
01:38:00,962 --> 01:38:03,094
Over the course
of making the film,
2280
01:38:03,138 --> 01:38:04,661
I've come to understand
that, like,
2281
01:38:04,704 --> 01:38:07,446
that's the only thing
we can't be.
2282
01:38:07,490 --> 01:38:09,361
...or anything else
you'd like to discuss?
2283
01:38:09,405 --> 01:38:11,363
No. I think we covered a lot.
2284
01:38:11,407 --> 01:38:13,017
-Yeah. [chuckles]
-MAN: Thank you so much.
2285
01:38:13,061 --> 01:38:15,498
Thanks.
2286
01:38:15,541 --> 01:38:16,891
DANIEL:
This is a problem that's bigger
2287
01:38:16,934 --> 01:38:19,067
than any one person.
2288
01:38:19,110 --> 01:38:21,547
This will change the world in
ways that we don't understand.
2289
01:38:21,591 --> 01:38:23,462
That is all true.
2290
01:38:23,506 --> 01:38:25,551
-[laughs]: Okay.
-[indistinct chatter]
2291
01:38:25,595 --> 01:38:27,684
DANIEL:
But what we do have agency over
2292
01:38:27,727 --> 01:38:30,078
is what we do about it.
2293
01:38:30,121 --> 01:38:32,558
As frontier AI grows
exponentially more capable...
2294
01:38:32,602 --> 01:38:34,430
DANIEL:
And the reality
2295
01:38:34,473 --> 01:38:36,562
is that if we just decide
it's hopeless,
2296
01:38:36,606 --> 01:38:39,000
then it is hopeless.
2297
01:38:39,043 --> 01:38:40,740
...to put less stress
on planet Earth...
2298
01:38:40,784 --> 01:38:42,742
DANIEL:
But if you decide
2299
01:38:42,786 --> 01:38:45,528
that you want to try...
2300
01:38:46,790 --> 01:38:48,096
...then you try.
2301
01:38:48,139 --> 01:38:49,967
And that's hard.
2302
01:38:51,273 --> 01:38:53,188
But you know what?
2303
01:38:53,231 --> 01:38:56,278
Big things seem impossible
before they actually happen.
2304
01:38:58,149 --> 01:39:01,413
But when they finally do happen,
2305
01:39:01,457 --> 01:39:05,243
it's because millions of people
took millions of actions
2306
01:39:05,287 --> 01:39:07,550
to make them happen.
2307
01:39:07,593 --> 01:39:10,292
And so...
2308
01:39:10,335 --> 01:39:12,424
we have to try.
2309
01:39:13,469 --> 01:39:14,905
[sighs heavily]
2310
01:39:21,129 --> 01:39:23,522
There's too much at stake.
2311
01:39:23,566 --> 01:39:25,611
♪ ♪
2312
01:39:32,401 --> 01:39:34,664
[film crackling]
2313
01:39:37,232 --> 01:39:41,236
Look at the incredible changes
we've experienced and survived
2314
01:39:41,279 --> 01:39:42,846
from the Stone Age,
2315
01:39:42,889 --> 01:39:46,110
and yet even greater changes
are still to come.
2316
01:39:46,154 --> 01:39:49,157
["Lost It to Trying"
by Son Lux playing]
2317
01:40:18,403 --> 01:40:20,449
♪ ♪
2318
01:40:47,606 --> 01:40:50,174
♪ What will we do now? ♪
2319
01:40:50,218 --> 01:40:53,221
♪ We've lost it to trying ♪
2320
01:40:53,264 --> 01:40:55,092
♪ We've lost it ♪
2321
01:40:55,136 --> 01:40:56,746
♪ To trying ♪
2322
01:40:59,575 --> 01:41:02,230
♪ What will we do now? ♪
2323
01:41:02,273 --> 01:41:05,189
♪ We've lost it to trying ♪
2324
01:41:05,233 --> 01:41:06,973
♪ We've lost it ♪
2325
01:41:07,017 --> 01:41:08,714
♪ To trying ♪
2326
01:41:11,630 --> 01:41:14,111
♪ What can we say now? ♪
2327
01:41:14,155 --> 01:41:17,201
♪ Our mouths only lying ♪
2328
01:41:17,245 --> 01:41:18,637
♪ Our mouths ♪
2329
01:41:18,681 --> 01:41:20,683
♪ Only lying ♪
2330
01:41:25,122 --> 01:41:27,690
♪ What can we say now? ♪
2331
01:41:27,733 --> 01:41:30,823
♪ Our mouths only lying ♪
2332
01:41:30,867 --> 01:41:32,216
♪ Our mouths ♪
2333
01:41:32,260 --> 01:41:34,218
♪ Only lying ♪
2334
01:41:34,262 --> 01:41:36,264
♪ ♪
2335
01:42:01,071 --> 01:42:03,726
♪ Give in and get out ♪
2336
01:42:03,769 --> 01:42:06,685
♪ We rise in the dying ♪
2337
01:42:06,729 --> 01:42:08,209
♪ We rise ♪
2338
01:42:08,252 --> 01:42:10,211
♪ In the dying ♪
2339
01:42:13,127 --> 01:42:15,781
♪ Give in and get out ♪
2340
01:42:15,825 --> 01:42:18,654
♪ We rise in the dying ♪
2341
01:42:18,697 --> 01:42:20,221
♪ We rise ♪
2342
01:42:20,264 --> 01:42:22,223
♪ In the dying ♪
2343
01:42:25,051 --> 01:42:27,663
♪ Give in and get out ♪
2344
01:42:27,706 --> 01:42:30,666
♪ We rise in the dying ♪
2345
01:42:30,709 --> 01:42:32,233
♪ We rise ♪
2346
01:42:32,276 --> 01:42:34,148
♪ In the dying ♪
2347
01:42:37,107 --> 01:42:39,718
♪ Give in and get out ♪
2348
01:42:39,762 --> 01:42:42,591
♪ We rise in the dying ♪
2349
01:42:42,634 --> 01:42:43,809
♪ We ♪
2350
01:42:43,853 --> 01:42:45,898
♪ ♪
2351
01:43:15,580 --> 01:43:17,626
♪ ♪
2352
01:43:22,761 --> 01:43:25,286
♪ Oh, oh, oh, oh, oh ♪
2353
01:43:25,329 --> 01:43:26,983
♪ Oh, oh, oh, oh ♪
2354
01:43:27,026 --> 01:43:28,419
♪ Oh, oh ♪
2355
01:43:28,463 --> 01:43:31,379
♪ Oh, oh, oh, oh, oh, oh ♪
2356
01:43:31,422 --> 01:43:32,945
♪ Oh, oh, oh, oh ♪
2357
01:43:32,989 --> 01:43:34,382
♪ Oh, oh ♪
2358
01:43:34,425 --> 01:43:37,036
♪ Oh, oh, oh, oh, oh, oh ♪
2359
01:43:37,080 --> 01:43:39,648
♪ What will we do now? ♪
2360
01:43:39,691 --> 01:43:42,651
♪ We've lost it to trying ♪
2361
01:43:42,694 --> 01:43:44,566
♪ We've lost it ♪
2362
01:43:44,609 --> 01:43:46,307
♪ To trying ♪
2363
01:43:46,350 --> 01:43:49,005
♪ Oh, oh, oh, oh, oh, oh ♪
2364
01:43:49,048 --> 01:43:51,703
♪ What will we do now? ♪
2365
01:43:51,747 --> 01:43:54,706
♪ We've lost it to trying ♪
2366
01:43:54,750 --> 01:43:56,578
♪ We've lost it ♪
2367
01:43:56,621 --> 01:43:58,406
♪ To trying. ♪
2368
01:43:58,449 --> 01:44:00,451
♪ ♪
2369
01:44:23,126 --> 01:44:24,475
[song ends]