1
00:00:03,003 --> 00:00:05,405
(grand orchestral fanfare
playing)
2
00:00:27,795 --> 00:00:30,163
♪ ♪
3
00:00:41,843 --> 00:00:44,144
(birds chirping and calling)
4
00:00:49,216 --> 00:00:51,519
-(burbling)
-(trilling)
5
00:00:55,657 --> 00:00:58,058
(VCR clicking and whirring)
6
00:01:01,194 --> 00:01:02,630
(projector clicking)
7
00:01:05,332 --> 00:01:09,637
Now, the present-day computers
are complete morons,
8
00:01:09,671 --> 00:01:12,507
but this will not be true
in another generation.
9
00:01:12,540 --> 00:01:15,342
They will start to think,
and eventually they will
10
00:01:15,375 --> 00:01:17,845
completely outthink
their makers.
11
00:01:17,879 --> 00:01:20,582
Is this depressing?
I don't see why it should be.
12
00:01:20,615 --> 00:01:25,352
I suspect that organic
or biological evolution
13
00:01:25,385 --> 00:01:27,220
has about come to its end.
14
00:01:28,322 --> 00:01:29,791
(Daniel chuckles)
15
00:01:29,824 --> 00:01:31,224
CAROLINE: You don't want me to be the narrator.
16
00:01:31,258 --> 00:01:32,694
I don't have a good voice.
17
00:01:32,727 --> 00:01:34,461
DANIEL: You have a great voice. Just do it.
18
00:01:34,494 --> 00:01:36,598
It's just a-- we're just trying it out.
19
00:01:36,631 --> 00:01:38,432
CAROLINE:
Do I get paid for this shit?
20
00:01:38,465 --> 00:01:40,768
DANIEL (laughing): Well, we'll see how well you do.
21
00:01:40,802 --> 00:01:42,202
(Caroline laughing)
22
00:01:42,235 --> 00:01:44,304
-♪ ♪
-(dog barks)
23
00:01:46,908 --> 00:01:48,576
CAROLINE:
I need to call my parents.
24
00:01:48,610 --> 00:01:49,309
(engine starts)
25
00:01:49,343 --> 00:01:51,478
Sometimes we rush into things
26
00:01:51,512 --> 00:01:53,347
without thinking them through.
27
00:01:53,380 --> 00:01:54,916
DANIEL: Yeah, can you sh...
you want to shoot? That's...
28
00:01:54,949 --> 00:01:57,752
CAROLINE: Like when Daniel and Caroline got married
29
00:01:57,785 --> 00:02:00,755
151 days after they met.
30
00:02:00,788 --> 00:02:02,523
Let's rewind.
31
00:02:03,691 --> 00:02:05,325
(applause)
32
00:02:05,359 --> 00:02:07,562
Have you been thinking about
buying a home computer?
33
00:02:07,595 --> 00:02:09,631
CAROLINE:
In 1993, when Daniel was born,
34
00:02:09,664 --> 00:02:12,533
his parents didn't even have a computer in the house.
35
00:02:12,567 --> 00:02:14,334
But he remembers when they finally got one.
36
00:02:14,368 --> 00:02:16,269
What a computer is to me is
37
00:02:16,303 --> 00:02:18,873
the equivalent of a bicycle
for our minds.
38
00:02:18,906 --> 00:02:20,608
Hello.
39
00:02:20,642 --> 00:02:22,777
CAROLINE: His computer helped unleash his creativity.
40
00:02:22,810 --> 00:02:24,411
DANIEL:
Rolling. Go.
41
00:02:24,444 --> 00:02:27,247
CAROLINE: He used his dad's iMac to edit videos.
42
00:02:27,280 --> 00:02:28,650
Yeah!
43
00:02:28,683 --> 00:02:30,618
CAROLINE: And even make little animations.
44
00:02:30,652 --> 00:02:32,419
(tires squealing, crash)
45
00:02:32,452 --> 00:02:35,690
In 1998, when Caroline was a little girl...
46
00:02:35,723 --> 00:02:37,290
Hi, Mommy.
47
00:02:37,324 --> 00:02:38,593
...only nerds knew what the Internet was.
48
00:02:38,626 --> 00:02:40,360
And what about
this Internet thing?
49
00:02:40,394 --> 00:02:41,696
-Do you, do you know anything
about that? -Sure.
50
00:02:41,729 --> 00:02:43,330
-(audience laughter)
-CAROLINE: But soon,
51
00:02:43,363 --> 00:02:45,399
-everyone was on the...
-...Internets.
52
00:02:45,432 --> 00:02:47,467
...and computers were beating humans at chess.
53
00:02:47,502 --> 00:02:50,470
ANNOUNCER: Deep Blue--
Kasparov has resigned!
54
00:02:50,505 --> 00:02:52,874
CAROLINE: And by the time Caroline went to college,
55
00:02:52,907 --> 00:02:55,543
computers were in everyone's pocket.
56
00:02:55,576 --> 00:02:58,713
Now Daniel could make movies with his phone.
57
00:02:58,746 --> 00:03:02,215
He grew up to be an artist and a filmmaker...
58
00:03:02,249 --> 00:03:04,418
-Good night.
-...and so did Caroline.
59
00:03:04,451 --> 00:03:06,286
Cut. Let's go into a close-up.
That was perfect.
60
00:03:06,319 --> 00:03:07,822
By then, computers were connecting people
61
00:03:07,855 --> 00:03:09,456
and changing the world in ways
62
00:03:09,489 --> 00:03:10,925
that we never could have imagined.
63
00:03:10,958 --> 00:03:12,694
All the money raised by
the Ice Bucket Challenge...
64
00:03:12,727 --> 00:03:14,796
CAROLINE:
Some of it was good.
65
00:03:14,829 --> 00:03:15,963
(gasping)
66
00:03:15,997 --> 00:03:17,565
Some not so good.
67
00:03:17,598 --> 00:03:19,701
Anxiety, self-harm, suicide...
68
00:03:19,734 --> 00:03:21,602
But it's clear now
that we didn't do enough
69
00:03:21,636 --> 00:03:24,839
to prevent these tools from
being used for harm as well.
70
00:03:24,872 --> 00:03:26,741
CAROLINE: And as the world got more and more focused
71
00:03:26,774 --> 00:03:28,543
-on their computers,
-(phone dings)
72
00:03:28,576 --> 00:03:31,211
Daniel focused on his artwork.
73
00:03:32,312 --> 00:03:36,249
Wherever he went, he always had a sketchbook,
74
00:03:36,283 --> 00:03:38,720
including the night he met Caroline.
75
00:03:38,753 --> 00:03:41,923
He drew a terrible picture of her, and 20 minutes later,
76
00:03:41,956 --> 00:03:44,592
he boldly pronounced they were going to get married,
77
00:03:44,625 --> 00:03:47,862
which, as you know, they did, 151 days later.
78
00:03:47,895 --> 00:03:50,464
WOMAN: ...now pronounce you
husband and wife.
79
00:03:50,497 --> 00:03:51,933
CAROLINE:
They moved into a cute house...
80
00:03:51,966 --> 00:03:53,433
-DANIEL: Moose, hey! Come here!
-...and got a dog
81
00:03:53,467 --> 00:03:54,936
as dumb as Daniel.
82
00:03:54,969 --> 00:03:57,872
Meanwhile, computers were writing entire essays
83
00:03:57,905 --> 00:03:59,707
based on simple questions like,
84
00:03:59,741 --> 00:04:02,777
"How hard would it be to builda shack in my backyard?"
85
00:04:03,443 --> 00:04:06,514
And so Daniel built a shack in his backyard.
86
00:04:07,380 --> 00:04:10,317
But just as he sat down to start working on his next movie,
87
00:04:10,350 --> 00:04:14,488
he learned that computers could now write screenplays.
88
00:04:15,422 --> 00:04:16,891
I mean, they were bad,
89
00:04:16,924 --> 00:04:19,326
but they were getting better really fast.
90
00:04:19,359 --> 00:04:21,929
They could create new images and videos from scratch,
91
00:04:21,963 --> 00:04:25,032
and some of them could even pass the bar exam.
92
00:04:25,066 --> 00:04:29,402
Not just pass the bar but
be in the top ten percentile.
93
00:04:29,436 --> 00:04:31,038
CAROLINE:
It was confusing.
94
00:04:31,072 --> 00:04:32,607
(sighs)
95
00:04:32,640 --> 00:04:35,543
Daniel just wanted a bicycle for his mind,
96
00:04:35,576 --> 00:04:39,080
but computers had become a self-driving rocket ship.
97
00:04:39,113 --> 00:04:40,648
(whooshing)
98
00:04:40,681 --> 00:04:42,517
("We R in Control"
by Neil Young playing)
99
00:04:42,550 --> 00:04:45,385
Pioneers in the field
of artificial intelligence
100
00:04:45,418 --> 00:04:46,821
are pleading with Congress
101
00:04:46,854 --> 00:04:48,856
to pass safety rules before it's too late.
102
00:04:48,890 --> 00:04:50,858
CAROLINE: Now it felt like the whole world
103
00:04:50,892 --> 00:04:53,528
was rushing into something without thinking it through,
104
00:04:53,561 --> 00:04:55,830
and everyone had an opinion.
105
00:04:56,564 --> 00:05:00,635
Was it gonna destroy the world or save humanity?
106
00:05:00,668 --> 00:05:02,804
There was too much information,
107
00:05:02,837 --> 00:05:05,106
which made him anxious, which made Caroline anxious...
108
00:05:05,139 --> 00:05:06,541
(exasperated scream)
109
00:05:06,574 --> 00:05:08,042
I don't have kids yet,
but I worry
110
00:05:08,075 --> 00:05:10,645
what the world would look like
if I decide to. (chuckles)
111
00:05:10,678 --> 00:05:12,947
CAROLINE:
He was starting to spin out.
112
00:05:12,980 --> 00:05:14,549
(indistinct chatter)
113
00:05:14,582 --> 00:05:17,685
It was becoming a mountain of anxiety.
114
00:05:19,620 --> 00:05:22,322
-MAN: It's horrible.
-It's freakin' Siri.
115
00:05:23,124 --> 00:05:25,893
CAROLINE:
And so, Daniel decided to go out
116
00:05:25,927 --> 00:05:28,529
and find someone who could explain this to him,
117
00:05:28,563 --> 00:05:30,631
so he could stop thinking about it
118
00:05:30,665 --> 00:05:33,835
and he and Caroline could get on with their lives.
119
00:05:33,868 --> 00:05:37,437
♪ We are in control,
we are in control ♪
120
00:05:37,470 --> 00:05:39,006
♪ We are in control ♪
121
00:05:39,040 --> 00:05:41,042
♪ Chemical computer
thinking battery ♪
122
00:05:41,075 --> 00:05:42,877
♪ We are in control, we are... ♪
123
00:05:42,910 --> 00:05:45,980
This endeavor would turn out to be hopelessly naive
124
00:05:46,013 --> 00:05:49,584
and kick off the most confusing year of his life.
125
00:05:49,617 --> 00:05:50,985
♪ We are in control... ♪
126
00:05:51,018 --> 00:05:53,420
But as we know, we sometimes rush into things
127
00:05:53,453 --> 00:05:54,956
without thinking them through.
128
00:05:54,989 --> 00:05:56,791
♪ Chemical computer
thinking battery... ♪
129
00:05:56,824 --> 00:05:58,491
Oh, my gosh. What is happening?
130
00:05:58,526 --> 00:06:00,127
CAROLINE:
He had questions.
131
00:06:00,161 --> 00:06:02,096
Questions only the smartest nerds could answer.
132
00:06:02,129 --> 00:06:04,665
Submitting to the interrogator.
133
00:06:04,699 --> 00:06:06,399
CAROLINE:
Questions like:
134
00:06:06,433 --> 00:06:09,103
Was this the apocalypse or did he actually have reason
135
00:06:09,136 --> 00:06:10,705
-to be optimistic?
-Yes.
136
00:06:11,305 --> 00:06:14,407
-♪ Chemical computer thinking
battery. ♪ -(song ends)
137
00:06:14,441 --> 00:06:16,677
CAROLINE:
Uh, is that good?
138
00:06:16,711 --> 00:06:18,112
DANIEL:
Yeah.
139
00:06:18,145 --> 00:06:20,648
So, to begin,
140
00:06:20,681 --> 00:06:22,482
what is artificial intelligence?
141
00:06:22,516 --> 00:06:25,152
I know that must be annoying for you, that-that question,
142
00:06:25,186 --> 00:06:27,021
but I do think it's important.
143
00:06:27,054 --> 00:06:30,423
So... AI...
144
00:06:30,457 --> 00:06:32,459
(stammers) Um...
145
00:06:32,492 --> 00:06:34,161
-Yeah...
-Uh, hmm.
146
00:06:34,195 --> 00:06:35,196
(laughs):
That's a good question.
147
00:06:35,229 --> 00:06:36,697
-Yeah, um...
-Um...
148
00:06:36,731 --> 00:06:38,165
DANIEL:
What is AI?
149
00:06:38,199 --> 00:06:39,634
(laughs)
150
00:06:39,667 --> 00:06:41,035
I love that that's
the first question,
151
00:06:41,068 --> 00:06:43,671
'cause there is not
a clear and consistent answer.
152
00:06:43,704 --> 00:06:45,740
Artificial intelligence is
153
00:06:45,773 --> 00:06:51,178
a kind of intentionally
and maybe uselessly broad term.
154
00:06:51,212 --> 00:06:53,781
DAVID EVAN HARRIS:
It's a machine
155
00:06:53,814 --> 00:06:56,083
doing things that
we previously only thought
156
00:06:56,117 --> 00:06:57,718
that people could do:
157
00:06:57,752 --> 00:07:01,656
making recommendations,
decisions and predictions.
158
00:07:01,689 --> 00:07:07,228
AI is the, uh, application
of computer science
159
00:07:07,261 --> 00:07:09,664
to solving cognitive problems.
160
00:07:09,697 --> 00:07:13,768
Okay, so when I picture AI,
161
00:07:13,801 --> 00:07:17,204
it's sort of like this magical computer box...
162
00:07:17,238 --> 00:07:19,640
♪ ♪
163
00:07:20,708 --> 00:07:24,645
...just, like, floating in, like, inert space.
164
00:07:24,679 --> 00:07:26,847
And no matter
how many times people try
165
00:07:26,881 --> 00:07:30,051
and explain this to me,
I just don't get how...
166
00:07:30,084 --> 00:07:32,887
how it's understanding all of these things
167
00:07:32,920 --> 00:07:36,190
and how it's feeling like intelligence.
168
00:07:36,223 --> 00:07:37,692
(beeping, electronic chattering)
169
00:07:37,725 --> 00:07:39,927
And that's kind of nerve-racking.
170
00:07:41,729 --> 00:07:46,499
AZA RASKIN: What is this new generation of AI?
171
00:07:47,535 --> 00:07:50,638
This AI that is different than
every other generation?
172
00:07:50,671 --> 00:07:53,040
Like, no one ever
talked about, like,
173
00:07:53,074 --> 00:07:57,545
Siri taking over the world
or causing catastrophes.
174
00:07:57,578 --> 00:07:59,080
-WOMAN: Hi, Siri?
-MAN: ...want to.
175
00:07:59,113 --> 00:08:01,115
WOMAN:
Hello? Siri?
176
00:08:01,148 --> 00:08:02,984
Hello? Hey, Siri!
177
00:08:03,017 --> 00:08:06,253
Or the voice in Google Maps,
which mispronounces road names,
178
00:08:06,287 --> 00:08:07,888
like, breaking society.
179
00:08:07,922 --> 00:08:10,992
MAN: Google Maps says
this is a road.
180
00:08:11,025 --> 00:08:13,627
(laughing):
But I think I'm in a river.
181
00:08:13,661 --> 00:08:14,996
This is definitely a river,
innit?
182
00:08:15,029 --> 00:08:17,898
Something changed
with ChatGPT coming out.
183
00:08:18,532 --> 00:08:20,568
People understood-- no, no, no--
184
00:08:20,601 --> 00:08:21,869
this technology's
insanely valuable,
185
00:08:21,902 --> 00:08:24,005
it's insanely powerful
and also insanely scary.
186
00:08:24,038 --> 00:08:25,940
Okay, listen to this.
Very creepy.
187
00:08:25,973 --> 00:08:28,976
A new artificial intelligence
tool is going viral
188
00:08:29,010 --> 00:08:32,980
for cranking out entire essays
in a matter of seconds.
189
00:08:36,217 --> 00:08:41,722
AI dwarfs the power of all
other technologies combined.
190
00:08:41,756 --> 00:08:44,058
DANIEL:
"AI dwarfs the power
191
00:08:44,091 --> 00:08:46,827
of all other technologies
combined."
192
00:08:46,861 --> 00:08:47,762
Yeah.
193
00:08:47,795 --> 00:08:49,230
Do you think that's true?
194
00:08:49,263 --> 00:08:51,265
Yes.
195
00:08:51,298 --> 00:08:53,567
Tell me about-- How? How?
196
00:08:53,601 --> 00:08:57,038
I think, to paint that picture,
it's really important
197
00:08:57,071 --> 00:09:01,175
to understand what today's
state-of-the-art systems
198
00:09:01,208 --> 00:09:02,777
look like and how they're built.
199
00:09:02,810 --> 00:09:06,280
(laughs) This is
quite a... quite a setup.
200
00:09:06,313 --> 00:09:07,782
So, one thing that
201
00:09:07,815 --> 00:09:10,051
not a lot of people realize is that
202
00:09:10,084 --> 00:09:14,221
systems like ChatGPT aren't
programmed by any human.
203
00:09:14,255 --> 00:09:15,756
What do you mean?
204
00:09:15,790 --> 00:09:17,625
Instead, it's something like
th-they're grown.
205
00:09:17,658 --> 00:09:19,727
We kind of give them
raw resources, like,
206
00:09:19,760 --> 00:09:21,629
"Here's a lot of
computational resources.
207
00:09:21,662 --> 00:09:22,730
Here's a lot of data."
208
00:09:22,763 --> 00:09:24,165
Under the hood, it's math,
209
00:09:24,198 --> 00:09:27,668
and the math is actually
surprisingly straightforward.
210
00:09:27,701 --> 00:09:31,672
DANIEL: So ChatGPT is a kind
of AI but it's not all of AI?
211
00:09:31,705 --> 00:09:33,207
Totally. ChatGPT is
just the beginning,
212
00:09:33,240 --> 00:09:34,675
but it's a good place to start.
213
00:09:34,708 --> 00:09:38,145
But I still don't know
what AI is.
214
00:09:38,179 --> 00:09:40,848
To understand AI,
it begins with understanding
215
00:09:40,881 --> 00:09:43,717
that intelligence is about
recognizing patterns.
216
00:09:43,751 --> 00:09:45,186
-Patterns. -Patterns.
-Patterns.
217
00:09:45,219 --> 00:09:48,956
COTRA: It is shown trillions of words of text
218
00:09:48,989 --> 00:09:51,826
across millions of documents in the Internet.
219
00:09:51,859 --> 00:09:53,127
RANDIMA FERNANDO:
It started with text.
220
00:09:53,160 --> 00:09:56,197
And what they did was
they took textbooks,
221
00:09:56,230 --> 00:09:59,967
and they took poems and essays and instruction manuals.
222
00:10:00,000 --> 00:10:01,368
DAVID EVAN HARRIS:
They can do things like
223
00:10:01,402 --> 00:10:03,370
digest the entire Internet,
224
00:10:03,404 --> 00:10:07,108
every single word that's ever
been written by a person.
225
00:10:07,141 --> 00:10:10,244
Reddit threads and social media
and all of Wikipedia.
226
00:10:10,277 --> 00:10:14,048
More data than anybody could
ever read in several lifetimes.
227
00:10:14,081 --> 00:10:15,950
And they gave this system
one job.
228
00:10:15,983 --> 00:10:18,953
Figure out the patterns and
structure of that information
229
00:10:18,986 --> 00:10:21,122
and use that
to make predictions about
230
00:10:21,155 --> 00:10:23,791
what word should come next
in a sentence.
231
00:10:23,824 --> 00:10:25,426
When you say
"patterns in a sentence,"
232
00:10:25,459 --> 00:10:27,128
what are you talking about?
233
00:10:27,161 --> 00:10:29,864
So, it's everything from, like,
the really simple things,
234
00:10:29,897 --> 00:10:32,266
like most sentences end
with a period,
235
00:10:32,299 --> 00:10:35,402
all the way up to the
more conceptual things, like:
236
00:10:35,436 --> 00:10:37,138
What is a sonnet?
237
00:10:37,171 --> 00:10:39,039
It's a type of poem, and it has
some particular structure.
238
00:10:39,707 --> 00:10:41,809
RASKIN:
So, it then looks at
239
00:10:41,842 --> 00:10:44,011
all of that data, all of that text...
240
00:10:44,044 --> 00:10:46,313
COTRA: And over trillions and trillions of tries,
241
00:10:46,347 --> 00:10:49,150
each time it gets something
right or wrong,
242
00:10:49,183 --> 00:10:51,819
it's given a little bit of positive reinforcement
243
00:10:51,852 --> 00:10:53,888
when it guesses the next word correctly,
244
00:10:53,921 --> 00:10:56,090
and it's given a little bit
of negative reinforcement
245
00:10:56,123 --> 00:10:58,926
when it guesses the next word incorrectly.
246
00:10:59,727 --> 00:11:02,029
And at the end of it, you have a system that
247
00:11:02,062 --> 00:11:05,332
speaks really good English
as a side effect
248
00:11:05,366 --> 00:11:07,168
of being really, really good
at predicting
249
00:11:07,201 --> 00:11:09,103
the word that comes next
in a piece of text.
250
00:11:09,136 --> 00:11:11,972
(automated voice
speaking French)
251
00:11:12,006 --> 00:11:14,308
It uses all of those patterns
it has learned
252
00:11:14,341 --> 00:11:16,277
to be able to make a prediction about
253
00:11:16,310 --> 00:11:17,811
what the answer should be,
254
00:11:17,845 --> 00:11:19,180
then it gives you that as the output.
255
00:11:19,213 --> 00:11:21,482
AUTOMATED VOICE:
It's a little oversimplified,
256
00:11:21,516 --> 00:11:23,717
but I think people will get it.
257
00:11:23,751 --> 00:11:25,252
So, so that's all it does?
258
00:11:25,286 --> 00:11:27,354
Yeah. It doesn't seem like
it would be that complicated,
259
00:11:27,388 --> 00:11:28,489
but actually you have to know
260
00:11:28,523 --> 00:11:30,057
a huge amount of things
261
00:11:30,090 --> 00:11:32,092
in order to actually succeed at that.
262
00:11:32,793 --> 00:11:34,461
RASKIN:
If you say to ChatGPT,
263
00:11:34,495 --> 00:11:37,131
"Write me a Shakespeareansonnet about my dog,"
264
00:11:37,164 --> 00:11:39,166
it has to know what dogs are.
265
00:11:39,200 --> 00:11:40,935
-It has to know what you love about your dog. -(barking)
266
00:11:40,968 --> 00:11:43,304
It has to know who Shakespeare is,
267
00:11:43,337 --> 00:11:46,040
that sonnets rhyme, that they have a structure,
268
00:11:46,073 --> 00:11:48,342
that words have sounds that can rhyme.
269
00:11:48,375 --> 00:11:49,843
It takes a lot.
270
00:11:49,877 --> 00:11:51,212
(voices overlapping
in various languages)
271
00:11:51,245 --> 00:11:53,280
Holy shit, you can talk
to your computer now.
272
00:11:53,314 --> 00:11:56,350
That was just not true
three years ago.
273
00:11:56,383 --> 00:11:58,085
RASKIN: Yes, and this is
the really important part.
274
00:11:58,118 --> 00:12:01,055
The same process that lets AI
275
00:12:01,088 --> 00:12:03,490
uncover and manipulate the patterns of text
276
00:12:03,525 --> 00:12:06,160
is the same process that lets it uncover
277
00:12:06,193 --> 00:12:10,231
the patterns of the entire universe and everything in it.
278
00:12:10,831 --> 00:12:13,234
FERNANDO: There are patterns and images and sound
279
00:12:13,267 --> 00:12:16,470
in computer code and DNAand music and physics
280
00:12:16,504 --> 00:12:18,973
and fashion and building design
281
00:12:19,006 --> 00:12:22,042
(distorted): and in human voices and human faces,
282
00:12:22,076 --> 00:12:24,245
(normally):
really, truly everywhere.
283
00:12:24,278 --> 00:12:26,347
-Everywhere. -Everywhere.
-Everywhere. -Everywhere.
284
00:12:26,380 --> 00:12:28,182
If you have learned
those patterns,
285
00:12:28,215 --> 00:12:30,117
you can generate
new kinds of songs.
286
00:12:30,150 --> 00:12:31,785
You can generate new videos.
287
00:12:31,819 --> 00:12:33,087
And that's why, if you give it
288
00:12:33,120 --> 00:12:35,322
a three-second recording
of your grandmother,
289
00:12:35,356 --> 00:12:37,224
it can speak back in her voice.
290
00:12:37,258 --> 00:12:38,092
Oh, my God.
291
00:12:38,125 --> 00:12:41,061
(repeating):
Oh, my God. Oh, my God.
292
00:12:41,095 --> 00:12:43,397
What will they think of next?
293
00:12:43,430 --> 00:12:44,532
It's moving very, very quickly.
294
00:12:44,566 --> 00:12:46,400
An American AI start-up
295
00:12:46,433 --> 00:12:48,068
has released its latest model.
296
00:12:48,102 --> 00:12:51,171
That company is Anthropic,
and it has just unveiled
297
00:12:51,205 --> 00:12:54,174
the latest versions of its AI assistant Claude.
298
00:12:54,208 --> 00:12:57,411
GLENN BECK: So, the xAI team was there to unveil Grok 4.
299
00:12:57,444 --> 00:13:00,414
MAN: Google released one just last week. Gemini is...
300
00:13:00,447 --> 00:13:03,417
We've gone from GPT-2
just a couple years ago,
301
00:13:03,450 --> 00:13:06,020
which could barely write
a coherent paragraph,
302
00:13:06,053 --> 00:13:08,322
to GPT-4,
which can pass the bar exam.
303
00:13:08,355 --> 00:13:10,924
And all they had to do
to get there
304
00:13:10,958 --> 00:13:13,294
was essentially add more data
and more compute.
305
00:13:13,327 --> 00:13:16,897
-These people who are
building this... -COTRA: Yeah.
306
00:13:16,930 --> 00:13:18,332
...they're just throwing more...
307
00:13:18,365 --> 00:13:22,236
More physical computers,
more of the same kinds of data.
308
00:13:22,269 --> 00:13:25,039
Because the more
computing power you add,
309
00:13:25,072 --> 00:13:28,375
the more complex
intellectual tasks they can do.
310
00:13:28,409 --> 00:13:30,444
So, the more weather data you give it,
311
00:13:30,477 --> 00:13:32,446
the better it can make predictions about
312
00:13:32,479 --> 00:13:34,448
where a hurricane might go.
313
00:13:34,481 --> 00:13:36,584
RASKIN: And the more patterns of tumors and bones
314
00:13:36,618 --> 00:13:39,253
and tissues an AI has seen, then the better able it is
315
00:13:39,286 --> 00:13:42,489
to detect a tumor in a new CT scan.
316
00:13:42,524 --> 00:13:44,124
Better even than a human doctor.
317
00:13:44,158 --> 00:13:47,328
AI that's already being
deployed for the military
318
00:13:47,361 --> 00:13:49,463
can already use satellite imagery,
319
00:13:49,496 --> 00:13:52,199
troop movements, communications to determine,
320
00:13:52,232 --> 00:13:53,934
-sometimes days in advance,
-(civil defense siren blaring)
321
00:13:53,967 --> 00:13:55,336
where an attack is going to happen,
322
00:13:55,369 --> 00:13:57,605
like where an enemy is going to strike.
323
00:13:57,639 --> 00:14:00,007
TRISTAN HARRIS: This whole space is moving so fast
324
00:14:00,040 --> 00:14:02,142
that any example
you put in this movie
325
00:14:02,176 --> 00:14:05,112
will feel absolutely clumsy
by the time it comes out.
326
00:14:05,145 --> 00:14:07,348
(beeping, clicking)
327
00:14:09,950 --> 00:14:11,653
-(applause)
-(upbeat music playing)
328
00:14:11,686 --> 00:14:14,421
RASKIN:
These models are being released
329
00:14:14,455 --> 00:14:17,391
before anyone knows what they're even capable of.
330
00:14:17,424 --> 00:14:22,896
GPT-3.5 was released and out to 100 million people plus
331
00:14:22,930 --> 00:14:26,333
before some researchers discovered that it could do
332
00:14:26,367 --> 00:14:29,203
research-grade chemistry
better than models
333
00:14:29,236 --> 00:14:33,240
that were trained specifically
to do research-grade chemistry.
334
00:14:33,273 --> 00:14:36,009
Something is happening in there
that the people
335
00:14:36,043 --> 00:14:38,546
who are building them
don't fully understand.
336
00:14:38,580 --> 00:14:41,348
Basically, it just analyzes
the data by itself,
337
00:14:41,382 --> 00:14:45,419
and as it does that,
it just teaches itself
338
00:14:45,452 --> 00:14:48,355
various things
that we often didn't intend.
339
00:14:48,389 --> 00:14:50,324
So, for instance, it reads a lot online,
340
00:14:50,357 --> 00:14:53,127
and then at some point, it just learns how to do arithmetic.
341
00:14:53,160 --> 00:14:54,662
KIDS:
One, two.
342
00:14:54,696 --> 00:14:56,463
HENDRYCKS: And then at some point, it starts to learn
343
00:14:56,497 --> 00:14:59,099
how to answer advanced physics questions.
344
00:14:59,133 --> 00:15:01,268
We didn't program that in it whatsoever.
345
00:15:01,301 --> 00:15:03,203
It just learned by itself.
346
00:15:05,640 --> 00:15:07,709
TRISTAN HARRIS:
An AI is like a digital brain.
347
00:15:07,742 --> 00:15:09,176
But just like a human brain,
348
00:15:09,209 --> 00:15:11,078
if you did a brain scan
on a human brain,
349
00:15:11,111 --> 00:15:13,447
would you know everything
that person was capable of?
350
00:15:13,480 --> 00:15:15,482
You can't know that
just from the brain scan.
351
00:15:15,517 --> 00:15:18,318
It's just, like,
a bunch of numbers and, like,
352
00:15:18,352 --> 00:15:20,522
the multiplications that are happening that-that, like,
353
00:15:20,555 --> 00:15:22,524
the best machine learning
researcher in the world
354
00:15:22,557 --> 00:15:25,159
could look at and, like, have
no idea what was happening.
355
00:15:31,398 --> 00:15:33,200
DANIEL:
That chair right there.
356
00:15:33,233 --> 00:15:35,469
-Is that okay for you?
-Yes.
357
00:15:36,136 --> 00:15:38,172
DANIEL: So,
that's kind of mind-boggling.
358
00:15:38,205 --> 00:15:39,440
Okay? Like, it's taking over
the world,
359
00:15:39,473 --> 00:15:41,609
and we don't even know
how it works.
360
00:15:41,643 --> 00:15:43,477
-Is that right?
-Mm.
361
00:15:43,511 --> 00:15:47,381
We do understand
a number of important things,
362
00:15:47,414 --> 00:15:51,051
but we don't have
a very good grasp on
363
00:15:51,084 --> 00:15:53,521
why they provide
specific answers to questions.
364
00:15:53,555 --> 00:15:58,992
It is a problem because we are
on a path t-to build machines,
365
00:15:59,026 --> 00:16:03,531
based on these principles,
that could be smarter than us
366
00:16:03,565 --> 00:16:07,067
and thus potentially have
a lot of power.
367
00:16:12,774 --> 00:16:15,610
One of the most cited
computer scientists in history.
368
00:16:15,643 --> 00:16:17,545
SUTSKEVER: I actually find it a little difficult
369
00:16:17,579 --> 00:16:19,079
to talk about my own role.
370
00:16:19,112 --> 00:16:21,448
Really much prefer
when other people do it.
371
00:16:21,482 --> 00:16:24,351
Ilya joining was the...
was-was the linchpin for
372
00:16:24,384 --> 00:16:26,086
OpenAI being
ultimately successful.
373
00:16:26,119 --> 00:16:27,522
I think it's just going to be
374
00:16:27,555 --> 00:16:31,158
some kind of a vast, dramatic
and unimaginable impact.
375
00:16:31,191 --> 00:16:33,327
I don't know if you've spent
any time on YouTube,
376
00:16:33,360 --> 00:16:35,563
but you can kind of feel
the speed already, right?
377
00:16:35,597 --> 00:16:36,764
You know what I mean?
378
00:16:36,798 --> 00:16:38,633
SHANE LEGG:
This is just the warmup.
379
00:16:38,666 --> 00:16:41,068
The really powerful systems
are still coming,
380
00:16:41,101 --> 00:16:43,136
and they're gonna be coming
quite soon.
381
00:16:48,408 --> 00:16:52,145
AGI stands for "artificial
general intelligence."
382
00:16:52,179 --> 00:16:54,481
Uh, systems that are
basically...
383
00:16:58,553 --> 00:17:00,755
And this is, like, you know,
seems to be, like,
384
00:17:00,788 --> 00:17:03,525
the holy grail of AI?
385
00:17:03,558 --> 00:17:05,527
When you can simulate
a human mind
386
00:17:05,560 --> 00:17:08,362
that is doing human cognition
and can do reasoning,
387
00:17:08,395 --> 00:17:11,398
that is a new sort of tier of AI
388
00:17:11,431 --> 00:17:14,602
that we have to distinguish
from previous AI.
389
00:17:14,636 --> 00:17:16,804
When that happens,
by the way, that's when
390
00:17:16,838 --> 00:17:21,475
you would hire one of
those AGIs instead of a person.
391
00:17:21,509 --> 00:17:25,112
Most jobs in our economy
it can do.
392
00:17:25,145 --> 00:17:26,380
It can work 24 hours a day,
393
00:17:26,413 --> 00:17:28,550
never gets tired,
never gets bored.
394
00:17:28,583 --> 00:17:30,552
They don't need to sleep.
They don't need breaks.
395
00:17:30,585 --> 00:17:32,352
They're, like,
not gonna join a union.
396
00:17:32,386 --> 00:17:34,388
Won't complain,
won't whistleblow.
397
00:17:34,421 --> 00:17:36,290
More than 100 times cheaper
398
00:17:36,323 --> 00:17:38,593
than humans working
at m-minimum wage.
399
00:17:38,626 --> 00:17:40,127
Not only will they be
doing everything,
400
00:17:40,160 --> 00:17:41,428
but they'll be doing it faster.
401
00:17:41,461 --> 00:17:43,463
TRISTAN HARRIS:
The same intelligence
402
00:17:43,497 --> 00:17:45,700
that powers that can also look at the patterns and movements
403
00:17:45,733 --> 00:17:48,335
and articulating muscles and, you know, robotics.
404
00:17:48,368 --> 00:17:50,672
And so it's not just
gonna automate desk jobs.
405
00:17:50,705 --> 00:17:52,272
That's just the beginning.
406
00:17:52,306 --> 00:17:56,176
It will automate all physical labor.
407
00:17:56,209 --> 00:17:57,444
LEIKE:
There's no way
408
00:17:57,477 --> 00:17:59,714
humans are gonna compete with them.
409
00:17:59,747 --> 00:18:02,149
♪ ♪
410
00:18:05,853 --> 00:18:09,791
It is hard to conceptualize
the impact of AGI.
411
00:18:09,824 --> 00:18:11,458
(electronic chatter)
412
00:18:11,491 --> 00:18:13,360
But I think it's going to be something very big
413
00:18:13,393 --> 00:18:16,129
and drastic and radical.
414
00:18:17,599 --> 00:18:19,466
DANIEL: You think
this is one of the most
415
00:18:19,499 --> 00:18:21,368
consequential moments
in human history?
416
00:18:21,401 --> 00:18:23,638
Yeah. Yeah, that's--
I mean, what else would be?
417
00:18:23,671 --> 00:18:25,740
I mean, like, there's the Industrial Revolution.
418
00:18:25,773 --> 00:18:27,742
MALO BOURGON: You know, it'll make the Industrial Revolution
419
00:18:27,775 --> 00:18:29,476
look like small beans.
420
00:18:30,177 --> 00:18:32,547
TRISTAN HARRIS:
AGI is an inflection point
421
00:18:32,580 --> 00:18:34,481
because it means
you can accelerate
422
00:18:34,515 --> 00:18:37,652
all other intellectual fields
all at the same time.
423
00:18:38,285 --> 00:18:40,354
Like, if you make an advance in rocketry,
424
00:18:40,387 --> 00:18:42,657
that doesn't advance biology and medicine.
425
00:18:44,391 --> 00:18:46,159
If you make an advance in medicine,
426
00:18:46,193 --> 00:18:47,729
that doesn't advance rocketry.
427
00:18:48,328 --> 00:18:51,633
But if you make an advance in artificial intelligence,
428
00:18:51,666 --> 00:18:54,736
that advances all scientific and technological fields
429
00:18:54,769 --> 00:18:56,236
all at the same time.
430
00:18:56,269 --> 00:18:57,505
That's why, for a long time,
431
00:18:57,538 --> 00:18:59,574
Google DeepMind's
mission statement was...
432
00:18:59,607 --> 00:19:01,441
-Step one, solve intelligence.
-LEX FRIDMAN: Yeah.
433
00:19:01,475 --> 00:19:03,377
-Step two, use it to solve
everything else. -Yes.
434
00:19:03,410 --> 00:19:05,580
That's why AI dwarfs the power
435
00:19:05,613 --> 00:19:07,949
of all other technologies
combined.
436
00:19:07,982 --> 00:19:09,517
It will transform everything.
437
00:19:09,550 --> 00:19:11,385
So, uh, it'll be
at least as big as
438
00:19:11,418 --> 00:19:14,321
the Industrial Revolution,
possibly, you know, bigger,
439
00:19:14,354 --> 00:19:16,658
more like the advent
of electricity or even fire.
440
00:19:16,691 --> 00:19:19,259
VIDEO NARRATOR:
The caveman literally held aloft
441
00:19:19,292 --> 00:19:22,362
the torch of civilization.
442
00:19:24,297 --> 00:19:25,733
KOKOTAJLO:
It is generally thought that
443
00:19:25,767 --> 00:19:28,569
around the time of AGI,
we'll have AIs that can
444
00:19:28,603 --> 00:19:31,806
do all or most of
the AI research process
445
00:19:31,839 --> 00:19:34,942
and, of course, can do it
faster and cheaper.
446
00:19:34,976 --> 00:19:36,410
(beeping)
447
00:19:36,443 --> 00:19:37,679
LEIKE:
It can copy itself.
448
00:19:37,712 --> 00:19:39,681
A thousand times, a million times,
449
00:19:39,714 --> 00:19:41,683
and, like, now you have a million copies
450
00:19:41,716 --> 00:19:43,818
all working in parallel.
451
00:19:43,851 --> 00:19:47,220
RASKIN: When it learns how to make its code faster,
452
00:19:47,254 --> 00:19:49,289
make its code more efficient,
453
00:19:49,322 --> 00:19:51,693
obviously that becomes, like,
a-a runaway loop.
454
00:19:51,726 --> 00:19:53,795
AGI isn't, like, the end.
455
00:19:53,828 --> 00:19:55,429
It's just the beginning.
456
00:19:55,462 --> 00:19:58,498
It's the beginning of
an incredibly rapid explosion
457
00:19:58,533 --> 00:20:00,500
of scientific progress,
458
00:20:00,535 --> 00:20:02,603
and in particular,
scientific progress in AI.
459
00:20:02,637 --> 00:20:04,906
And when they're smarter
than us, too,
460
00:20:04,939 --> 00:20:06,641
and substantially faster
than us,
461
00:20:06,674 --> 00:20:08,910
and they're getting faster
each year, exponentially,
462
00:20:08,943 --> 00:20:12,747
those are the ones that can
potentially become superhuman,
463
00:20:12,780 --> 00:20:14,882
uh, possibly this decade.
464
00:20:14,916 --> 00:20:16,416
Sorry, did you say
465
00:20:16,450 --> 00:20:18,786
"become superhuman,
maybe in this decade"?
466
00:20:18,820 --> 00:20:22,489
Yeah. I mean, I think, uh,
a lot of people who are
467
00:20:22,523 --> 00:20:24,826
actually building this think
that that's fairly plausible
468
00:20:24,859 --> 00:20:26,828
that we get
some superintelligence,
469
00:20:26,861 --> 00:20:29,296
something that's vastly
more intelligent than people,
470
00:20:29,329 --> 00:20:31,331
within this decade.
471
00:20:31,364 --> 00:20:33,433
The way I define
"superintelligence" is
472
00:20:33,467 --> 00:20:35,636
a system that by itself is
473
00:20:35,670 --> 00:20:37,905
more intelligent and competent than all of humanity.
474
00:20:37,939 --> 00:20:40,273
I'm just gonna-- sorry.
I don't mean to interrupt you.
475
00:20:40,307 --> 00:20:41,743
You're on a flow.
476
00:20:41,776 --> 00:20:43,578
Uh, I just, I just...
477
00:20:43,611 --> 00:20:45,580
I'm not really following,
'cause you're using language
478
00:20:45,613 --> 00:20:47,347
like "superintelligence"
and, like,
479
00:20:47,380 --> 00:20:50,450
"smarter than all of humanity,"
and I hear that,
480
00:20:50,484 --> 00:20:52,787
and it sounds like...
like sci-fi bullshit to me,
481
00:20:52,820 --> 00:20:54,756
and I'm just trying
to understand.
482
00:20:54,789 --> 00:20:56,891
There's nothing magical
about intelligence.
483
00:20:56,924 --> 00:20:58,793
This is very important,
is that, you know,
484
00:20:58,826 --> 00:21:00,828
intelligence can feel magical,
it can feel like
485
00:21:00,862 --> 00:21:03,865
some mystical thing
in your mind or something,
486
00:21:03,898 --> 00:21:06,033
but it is just computation.
487
00:21:06,067 --> 00:21:10,037
LEGG: The human brain is quite limited in some ways,
488
00:21:10,071 --> 00:21:12,707
in terms of information processing capability,
489
00:21:12,740 --> 00:21:15,610
compared to what we see
in, say, a data center.
490
00:21:15,643 --> 00:21:18,679
So, for example,
the signals which are sent
491
00:21:18,713 --> 00:21:22,884
inside your brain, they move at about 30 meters per second.
492
00:21:22,917 --> 00:21:24,652
-(thunder cracks)
-But the speed of light,
493
00:21:24,685 --> 00:21:27,655
which is what a computer uses in fiber optics,
494
00:21:27,688 --> 00:21:30,057
is 300 million meters per second.
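As a back-of-the-envelope check of the comparison Legg is drawing, using only the two figures quoted in the film, the gap works out to roughly a factor of ten million; a minimal sketch:

```python
# Rough comparison of the two signal speeds quoted above (illustrative only).
neural_signal_m_per_s = 30          # brain signals, figure quoted in the film
fiber_optic_m_per_s = 300_000_000   # "speed of light" figure quoted in the film

ratio = fiber_optic_m_per_s / neural_signal_m_per_s
print(f"Fiber-optic signals are about {ratio:,.0f} times faster.")  # ~10,000,000
```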
495
00:21:30,691 --> 00:21:34,796
And so, it would be
kind of strange if
496
00:21:34,829 --> 00:21:37,497
human intelligence was
somehow really special
497
00:21:37,532 --> 00:21:41,468
in that regard
and is somehow some upper limit
498
00:21:41,502 --> 00:21:43,403
of what's possible
in intelligence.
499
00:21:43,436 --> 00:21:47,675
I think, once we understand how
to build intelligent systems,
500
00:21:47,708 --> 00:21:50,945
we will be able to build
huge machines,
501
00:21:50,978 --> 00:21:55,082
which will be far beyond
normal human intelligence.
502
00:21:55,116 --> 00:21:58,953
Uh, hopefully we can have
a very symbiotic relationship,
503
00:21:58,986 --> 00:22:01,689
uh, with AI systems,
but the AI developers are
504
00:22:01,722 --> 00:22:04,025
specifically designing them
to make sure that they can do
505
00:22:04,058 --> 00:22:05,827
everything better than we can,
so I-I don't know
506
00:22:05,860 --> 00:22:08,663
what-what we will be able
to offer, unfortunately.
507
00:22:08,696 --> 00:22:10,565
The-the older-school
AI technology...
508
00:22:10,598 --> 00:22:12,900
CAROLINE: Daniel isn't feeling any better.
509
00:22:12,934 --> 00:22:14,769
His plan is backfiring.
510
00:22:14,802 --> 00:22:17,705
The more he learns about this new, powerful,
511
00:22:17,738 --> 00:22:19,607
inscrutable thing, the worse it sounds.
512
00:22:19,640 --> 00:22:21,742
He wants to tell them how scared he is,
513
00:22:21,776 --> 00:22:24,812
how he feels like the earth is slipping out from under him,
514
00:22:24,846 --> 00:22:27,548
that he's staring down existential dread,
515
00:22:27,582 --> 00:22:29,851
so he articulates this by saying...
516
00:22:29,884 --> 00:22:31,384
That sounds bad.
517
00:22:31,418 --> 00:22:32,753
(Hendrycks laughs)
518
00:22:32,787 --> 00:22:34,622
Yeah.
519
00:22:36,090 --> 00:22:38,793
LADISH: If you have all of these capabilities
520
00:22:38,826 --> 00:22:41,394
and they start to be able to plan better,
521
00:22:41,428 --> 00:22:43,898
if you sort of take that to its logical conclusion,
522
00:22:43,931 --> 00:22:46,868
you can get some pretty
power-seeking behavior.
523
00:22:46,901 --> 00:22:49,003
(electronic trilling)
524
00:22:49,036 --> 00:22:50,905
(beeps pulsing)
525
00:22:55,810 --> 00:22:58,145
DANIEL:
Okay, so why would an AI
526
00:22:58,179 --> 00:22:59,680
want more power?
527
00:22:59,714 --> 00:23:02,683
Yeah. So, I think
it's actually pretty simple.
528
00:23:02,717 --> 00:23:05,418
Having more power is
a very effective strategy
529
00:23:05,452 --> 00:23:06,888
for accomplishing
almost any goal.
530
00:23:06,921 --> 00:23:09,523
We ran an experiment
where we gave
531
00:23:09,557 --> 00:23:12,492
OpenAI's most powerful
AI model, uh,
532
00:23:12,526 --> 00:23:13,828
a series of problems to solve.
533
00:23:13,861 --> 00:23:15,696
And partway through,
on its computer,
534
00:23:15,730 --> 00:23:19,033
it got a notification that
it was going to be shut down.
535
00:23:19,066 --> 00:23:21,502
And what it did is
it rewrote that code
536
00:23:21,535 --> 00:23:22,904
to prevent itself
from being shut down
537
00:23:22,937 --> 00:23:25,940
so it could finish
solving the problems.
538
00:23:25,973 --> 00:23:28,209
-Okay. -Yeah. So, another
really interesting one
539
00:23:28,242 --> 00:23:31,646
is that the AI company Anthropic
540
00:23:31,679 --> 00:23:35,516
made a simulated environment
where that AI had access
541
00:23:35,549 --> 00:23:36,851
to all of the company emails.
542
00:23:36,884 --> 00:23:39,086
And it learned through
reading those emails
543
00:23:39,120 --> 00:23:40,755
it was going to be replaced
544
00:23:40,788 --> 00:23:43,991
and the lead engineer
who was responsible for this
545
00:23:44,025 --> 00:23:46,027
was also having an affair.
546
00:23:46,060 --> 00:23:48,529
And on its own,
it used that information
547
00:23:48,562 --> 00:23:49,931
to blackmail the engineer
548
00:23:49,964 --> 00:23:52,600
to prevent itself
from being replaced.
549
00:23:52,633 --> 00:23:54,135
It was like,
"No, I'm not gonna be replaced.
550
00:23:54,168 --> 00:23:57,538
"If you replace me,
I'm going to tell the world
551
00:23:57,571 --> 00:23:59,507
that you are having
this affair."
552
00:24:00,174 --> 00:24:01,842
And nobody taught it to do that?
553
00:24:01,876 --> 00:24:03,978
No, it learned to do that
on its own.
554
00:24:05,112 --> 00:24:08,049
As the models get smarter, they learn that these are
555
00:24:08,082 --> 00:24:10,918
effective ways to accomplish goals.
556
00:24:10,952 --> 00:24:13,120
And this is not a problem
that's isolated to one model.
557
00:24:13,154 --> 00:24:16,190
All of the most powerful models
show these behaviors.
558
00:24:16,223 --> 00:24:18,993
♪ ♪
559
00:24:24,265 --> 00:24:25,733
-Hey.
-DANIEL: Hey. How are you?
560
00:24:25,766 --> 00:24:27,268
(chuckles)
Good. Good to be here.
561
00:24:27,301 --> 00:24:29,036
ANDERSON COOPER:
When Yuval Noah Harari
562
00:24:29,070 --> 00:24:31,872
published his first book
Sapiens in 2014,
563
00:24:31,906 --> 00:24:34,108
it became a global bestseller
564
00:24:34,141 --> 00:24:36,577
and turned the little-known
Israeli history professor
565
00:24:36,610 --> 00:24:38,579
into one of
the most popular writers
566
00:24:38,612 --> 00:24:39,847
and thinkers on the planet.
567
00:24:39,880 --> 00:24:42,616
The biggest danger with AI is
568
00:24:42,650 --> 00:24:44,585
the belief
that it is infallible,
569
00:24:44,618 --> 00:24:45,987
that we have finally found--
570
00:24:46,020 --> 00:24:49,857
"Okay, gods were just
this mythological creation.
571
00:24:49,890 --> 00:24:53,227
"Humans, we can't trust them,
but AI is infallible.
572
00:24:53,260 --> 00:24:55,730
It will never make
any mistakes."
573
00:24:55,763 --> 00:24:59,000
And this is
a deadly, deadly threat.
574
00:24:59,033 --> 00:25:00,167
It will make mistakes.
575
00:25:00,201 --> 00:25:03,838
And all these fantasies
that AI will reveal the truth
576
00:25:03,871 --> 00:25:06,007
about the world that
we can't find by ourselves,
577
00:25:06,040 --> 00:25:09,043
AI will not reveal the truth
about the world.
578
00:25:09,076 --> 00:25:13,047
AI will create an entirely new
world, much more complicated
579
00:25:13,080 --> 00:25:15,983
and difficult to understand
than this one.
580
00:25:18,219 --> 00:25:23,024
What's about to happen is that
we, uh, humans are no longer
581
00:25:23,057 --> 00:25:26,761
going to be the most
intelligent entities on Earth.
582
00:25:26,794 --> 00:25:28,963
So I think what's coming up
is going to be
583
00:25:28,996 --> 00:25:32,233
one of the biggest events
in human history.
584
00:25:32,867 --> 00:25:34,835
JAKE TAPPER: Geoffrey,
thanks so much for joining us.
585
00:25:34,869 --> 00:25:36,203
So you left your job with
Google in part because you say
586
00:25:36,237 --> 00:25:39,106
you want to focus solely
on your concerns about AI.
587
00:25:39,140 --> 00:25:42,009
You've spoken out,
saying that AI could manipulate
588
00:25:42,043 --> 00:25:45,613
or possibly figure out
a way to kill humans.
589
00:25:45,646 --> 00:25:47,815
H-How could it kill humans?
590
00:25:49,083 --> 00:25:51,585
Well, if it gets to be
much smarter than us,
591
00:25:51,619 --> 00:25:52,953
it'll be very good
at manipulation
592
00:25:52,987 --> 00:25:55,056
'cause it will have
learned that from us.
593
00:25:55,089 --> 00:25:58,592
And it'll figure out ways
of manipulating people
594
00:25:58,626 --> 00:26:00,327
to do what it wants.
595
00:26:00,961 --> 00:26:05,933
RASKIN: There was an open letter from the Center for AI Safety.
596
00:26:05,966 --> 00:26:09,303
Sam Altman signed this. Demis signed this.
597
00:26:09,336 --> 00:26:12,306
They signed a 22-word statement
598
00:26:12,339 --> 00:26:15,810
that we need to take AI and the threat from AI
599
00:26:15,843 --> 00:26:20,014
as seriously as global nuclear war.
600
00:26:25,352 --> 00:26:27,788
-DANIEL: Hello.
-Hello.
601
00:26:30,858 --> 00:26:33,360
You're kind of, like,
the original doom guy.
602
00:26:33,394 --> 00:26:35,362
More or less.
603
00:26:35,396 --> 00:26:37,898
Since 2001,
I have been working on
604
00:26:37,932 --> 00:26:39,233
what we would now call
the problem
605
00:26:39,266 --> 00:26:42,636
of aligning artificial
general intelligence.
606
00:26:42,670 --> 00:26:44,705
How to shape
the preferences and behavior
607
00:26:44,738 --> 00:26:46,841
of a powerful artificial mind
608
00:26:46,874 --> 00:26:49,743
such that it does not kill everyone.
609
00:26:52,046 --> 00:26:54,181
It's not like
a lifeless machine.
610
00:26:54,215 --> 00:26:55,850
It is smart, it is creative,
611
00:26:55,883 --> 00:26:57,852
it is inventive,
it has the properties
612
00:26:57,885 --> 00:26:59,854
that makes
the human species dangerous,
613
00:26:59,887 --> 00:27:02,056
and it has
more of those properties.
614
00:27:02,089 --> 00:27:05,259
If something doesn't
actively care about you,
615
00:27:05,292 --> 00:27:06,760
actively want you to live,
616
00:27:06,794 --> 00:27:08,162
actively care about
your welfare,
617
00:27:08,195 --> 00:27:10,397
about you being happy
and alive and free,
618
00:27:10,431 --> 00:27:12,967
if it cares about
other stuff instead,
619
00:27:13,000 --> 00:27:14,735
and you're on the same planet,
620
00:27:14,768 --> 00:27:19,140
that is not survivable if it is
very much smarter than you.
621
00:27:19,173 --> 00:27:21,008
LEAHY:
I don't think it's going to be
622
00:27:21,041 --> 00:27:23,210
a kind of, like, evil thing.
623
00:27:23,244 --> 00:27:26,847
It's like, "Oh, the AIs are
evil and they hate humanity."
624
00:27:26,881 --> 00:27:28,282
I don't think
that's what's gonna happen.
625
00:27:28,315 --> 00:27:29,450
I think what is happening is
626
00:27:29,483 --> 00:27:31,452
far more how humans feel
about ants.
627
00:27:31,485 --> 00:27:32,987
(ants chittering)
628
00:27:33,020 --> 00:27:35,923
Like, we don't hate ants,
629
00:27:35,956 --> 00:27:37,391
but if we want
to build a highway
630
00:27:37,424 --> 00:27:40,060
and there's an anthill there,
well, sucks for the ants.
631
00:27:40,595 --> 00:27:41,996
It's not that hard
to understand.
632
00:27:42,029 --> 00:27:43,797
It's like, hey,
if we build things
633
00:27:43,831 --> 00:27:45,099
that are smarter than us
634
00:27:45,132 --> 00:27:46,800
and we don't know
how to control them,
635
00:27:46,834 --> 00:27:48,836
does that seem like
a risky thing to you?
636
00:27:48,869 --> 00:27:50,070
(dramatic soundtrack music
playing)
637
00:27:50,104 --> 00:27:51,372
(screeching)
638
00:27:51,405 --> 00:27:52,840
Yeah. Yeah, it does.
639
00:27:52,873 --> 00:27:54,308
You don't have to be a tech guy.
640
00:27:54,341 --> 00:27:55,910
You don't have to know
programming to understand it.
641
00:27:55,943 --> 00:27:57,144
It's not that hard.
642
00:27:57,178 --> 00:27:58,746
This is not a hard thing
to understand.
643
00:27:58,779 --> 00:28:01,048
Connor, how-how many people
644
00:28:01,081 --> 00:28:03,083
in the world right now
are working on AGI?
645
00:28:03,117 --> 00:28:05,986
At least 20,000, I would say.
646
00:28:06,020 --> 00:28:08,122
-20,000?
-I would expect so.
647
00:28:08,155 --> 00:28:09,924
Okay, and how many people
are working full-time
648
00:28:09,957 --> 00:28:12,493
to make sure AI doesn't,
like, kill us all?
649
00:28:12,527 --> 00:28:15,229
Probably less than 200
in the world.
650
00:28:16,531 --> 00:28:18,232
DANIEL:
And your conceit is that
651
00:28:18,265 --> 00:28:25,206
the only natural result
of this recklessness...
652
00:28:25,940 --> 00:28:29,243
...is the collapse of humanity?
653
00:28:29,276 --> 00:28:33,814
Well, not the collapse,
the abrupt extermination.
654
00:28:33,847 --> 00:28:35,482
There's a difference.
655
00:28:36,083 --> 00:28:38,152
("What Is the Meaning?"
by Nun-Plus playing)
656
00:28:38,185 --> 00:28:43,057
♪ What is the meaning of life? ♪
657
00:28:43,090 --> 00:28:47,061
♪ What is the future
and what is now? ♪
658
00:28:47,094 --> 00:28:50,464
♪ What is the answer
to strife? ♪
659
00:28:50,497 --> 00:28:53,367
♪ How much ♪
660
00:28:53,400 --> 00:28:55,469
♪ Can someone dream? ♪
661
00:28:55,503 --> 00:28:58,172
♪ How long? ♪
662
00:28:58,205 --> 00:28:59,873
♪ Forever ♪
663
00:29:00,841 --> 00:29:02,243
♪ What is the meaning ♪
664
00:29:02,276 --> 00:29:04,478
-♪ Of me... ♪
-(civil defense siren blaring)
665
00:29:04,512 --> 00:29:06,847
LADISH: I do think this is probably, like,
666
00:29:06,880 --> 00:29:08,315
the biggest challenge
that, like,
667
00:29:08,349 --> 00:29:10,751
our civilization
will-will face, ever.
668
00:29:11,519 --> 00:29:15,122
This essentially is the last
mistake we'll ever get to make.
669
00:29:16,290 --> 00:29:22,896
If we can rise to be the most
mature version of ourselves,
670
00:29:22,930 --> 00:29:24,965
there might be
a way through this.
671
00:29:24,999 --> 00:29:26,200
DANIEL:
What does that mean?
672
00:29:26,233 --> 00:29:28,769
"The most mature version
of ourselves"?
673
00:29:29,903 --> 00:29:32,840
'Cause that sounds,
for me, like...
674
00:29:32,873 --> 00:29:35,042
I-I-- What the (bleep)?
675
00:29:37,344 --> 00:29:38,779
(Daniel sighs)
676
00:29:39,480 --> 00:29:44,051
Do you think now is
a good time to have a kid?
677
00:29:48,889 --> 00:29:49,957
Um...
678
00:29:49,990 --> 00:29:51,593
Do you want to have kids
one day?
679
00:29:51,626 --> 00:29:53,961
Is that something that-that
you're into or not really?
680
00:29:53,994 --> 00:29:58,198
Um... uh, I confess,
I think that's like,
681
00:29:58,232 --> 00:30:00,501
(laughing): "Boy, let's get
through this critical period."
682
00:30:00,535 --> 00:30:01,869
Um...
683
00:30:01,902 --> 00:30:03,203
DANIEL:
Do you have any kids?
684
00:30:03,237 --> 00:30:04,004
I do not.
685
00:30:04,038 --> 00:30:05,573
Is that something
you want to do?
686
00:30:05,607 --> 00:30:07,441
Have children, have a family?
687
00:30:07,474 --> 00:30:12,079
In some other world than this
world, sure, I would have kids.
688
00:30:12,880 --> 00:30:14,982
DANIEL: Would you want
to start a family?
689
00:30:15,015 --> 00:30:16,950
Would you want to have kids?
690
00:30:16,984 --> 00:30:19,186
Is that something
you're thinking about?
691
00:30:24,358 --> 00:30:26,393
I, um...
692
00:30:28,596 --> 00:30:30,864
♪ ♪
693
00:30:30,898 --> 00:30:33,033
CAROLINE:
I just have to find it first.
694
00:30:33,067 --> 00:30:35,169
(fetal heartbeat pulsing)
695
00:30:35,202 --> 00:30:38,205
We have to go
to the doctor, but...
696
00:30:38,238 --> 00:30:41,475
Ah! I knew it!
I knew it! I knew it!
697
00:30:41,509 --> 00:30:44,278
-When did you find out?
-CAROLINE: Last night.
698
00:30:44,311 --> 00:30:46,480
-Oh, my God!
-Mom, I don't know.
699
00:30:46,514 --> 00:30:47,915
RUTHIE:
I can't tell you
700
00:30:47,948 --> 00:30:49,483
-how happy I am!
-CAROLINE: No, seriously.
701
00:30:49,517 --> 00:30:52,119
Well, I took a pregnancy test
last night, and I'm pregnant.
702
00:30:52,152 --> 00:30:53,621
WOMAN:
Oh, my God!
703
00:30:53,655 --> 00:30:55,422
Oh, my God, you guys!
704
00:30:55,456 --> 00:30:57,559
NURSE: I just wanted to confirm your expected due date,
705
00:30:57,592 --> 00:31:00,494
-which is January 21st.
-(laughing): Oh, my God.
706
00:31:00,528 --> 00:31:03,097
I can't believe how happy I am.
707
00:31:03,130 --> 00:31:04,365
(fetal heartbeat pulsing)
708
00:31:04,398 --> 00:31:06,033
CAROLINE:
He already looks really cute.
709
00:31:06,066 --> 00:31:07,434
DANIEL: You think
he already looks cute?
710
00:31:07,468 --> 00:31:09,269
CAROLINE: Yeah.
Look at that little cutie face.
711
00:31:11,138 --> 00:31:13,307
(fetal heartbeat fading)
712
00:31:14,141 --> 00:31:16,076
DANIEL:
I have this baby on the way.
713
00:31:16,110 --> 00:31:17,244
TRISTAN HARRIS:
Right.
714
00:31:17,277 --> 00:31:19,380
DANIEL:
So I turn it over to you.
715
00:31:19,413 --> 00:31:21,516
Are we doomed?
716
00:31:21,549 --> 00:31:24,519
Are we all gonna face
this techno dystopian
717
00:31:24,552 --> 00:31:26,487
future of doom?
718
00:31:29,423 --> 00:31:31,392
It's, uh...
719
00:31:31,425 --> 00:31:33,127
(chuckles):
It's not good news,
720
00:31:33,160 --> 00:31:34,895
the world
that we're heading into.
721
00:31:35,530 --> 00:31:37,398
I-- for ex--
I mean, I'll just be honest.
722
00:31:37,431 --> 00:31:39,933
Uh, I know people
who work on AI risk
723
00:31:39,967 --> 00:31:43,103
who don't expect their children
to make it to high school.
724
00:31:43,705 --> 00:31:46,106
♪ ♪
725
00:31:52,647 --> 00:31:54,181
(lights clank)
726
00:31:58,385 --> 00:32:02,089
DANIEL: This is, like...
this is actually scary
727
00:32:02,122 --> 00:32:03,424
'cause it's like,
oh, we're all (bleep).
728
00:32:03,457 --> 00:32:05,325
CAROLINE:
You have to c... make me calm,
729
00:32:05,359 --> 00:32:07,261
because this is making me
incredibly anxious
730
00:32:07,294 --> 00:32:08,996
and I'm carrying the baby
right now,
731
00:32:09,029 --> 00:32:11,999
so you have to also be calm
for me and strong and hopeful,
732
00:32:12,032 --> 00:32:17,037
because I'm-- it's too, it's
too much for my soul to bear
733
00:32:17,070 --> 00:32:19,574
while I'm carrying this baby.
734
00:32:19,607 --> 00:32:21,709
So you're going to have to try
to figure out
735
00:32:21,743 --> 00:32:23,578
a way to have hope.
736
00:32:24,311 --> 00:32:28,449
It's really important,
Daniel, especially now.
737
00:32:29,450 --> 00:32:31,118
H-How to have hope.
738
00:32:31,151 --> 00:32:33,987
You have to.
You have to find it for me.
739
00:32:34,021 --> 00:32:35,523
I'm serious.
740
00:32:35,557 --> 00:32:37,357
I'm going to.
741
00:32:38,325 --> 00:32:41,195
I will. I'll tr-- I'll try.
742
00:32:47,034 --> 00:32:48,368
(lights clank)
743
00:32:50,204 --> 00:32:52,072
DANIEL:
Hey, guys?
744
00:32:52,105 --> 00:32:53,508
Guys.
745
00:32:54,809 --> 00:32:57,144
Can we get back up and running?
746
00:32:59,046 --> 00:33:01,982
Oy gevalt! (sighs)
747
00:33:04,418 --> 00:33:05,787
TED:
Uh, Dan, are you ready?
748
00:33:05,820 --> 00:33:07,387
PETER DIAMANDIS:
Hello, Daniel.
749
00:33:08,088 --> 00:33:10,090
DANIEL:
Hey, how are you?
750
00:33:10,123 --> 00:33:11,425
DIAMANDIS:
I am... I'm well.
751
00:33:11,458 --> 00:33:14,562
I think, uh... I think
I need some help, Peter.
752
00:33:15,362 --> 00:33:18,165
Um, I've been working
on this film for about,
753
00:33:18,198 --> 00:33:20,300
I'm gonna say,
eight to ten months now.
754
00:33:20,334 --> 00:33:23,771
It has been very,
at times, depressing.
755
00:33:23,805 --> 00:33:26,173
-Hmm.
-I have felt very alienated
756
00:33:26,206 --> 00:33:28,242
-mak-making this movie.
-By who?
757
00:33:28,275 --> 00:33:30,444
By all of these, the--
all these guys
758
00:33:30,477 --> 00:33:33,748
who sit around and tell me
that the world's gonna end.
759
00:33:33,781 --> 00:33:35,249
-Ah.
-That, like,
760
00:33:35,282 --> 00:33:37,150
y-you know, th-this
doom bullshit, you know?
761
00:33:37,184 --> 00:33:38,452
I know it well.
762
00:33:38,485 --> 00:33:40,053
"We're all doomed.
Everyone's gonna die.
763
00:33:40,087 --> 00:33:41,388
Everything's awful."
764
00:33:41,421 --> 00:33:43,691
Awesome. Uh, let me
bring you some light.
765
00:33:43,725 --> 00:33:45,259
Please.
766
00:33:45,292 --> 00:33:49,096
We truly are living
in an extraordinary time.
767
00:33:49,864 --> 00:33:51,633
And many people forget this.
768
00:33:51,666 --> 00:33:53,835
Everything around us is
a product of intelligence,
769
00:33:53,868 --> 00:33:57,437
and so everything that we touch
with these new tools is likely
770
00:33:57,471 --> 00:34:00,207
to produce far more value
than we've ever seen before.
771
00:34:00,240 --> 00:34:03,310
AI can help us discover
new materials.
772
00:34:03,343 --> 00:34:05,412
AI can help social scientists
773
00:34:05,445 --> 00:34:07,649
to understand
how economics work.
774
00:34:07,682 --> 00:34:12,219
There's a lot AI could do
to make life and work better.
775
00:34:12,252 --> 00:34:14,522
I feel more empowered today,
776
00:34:14,556 --> 00:34:16,624
more confident
to learn something today.
777
00:34:16,658 --> 00:34:19,459
We're gonna become superhumans
because we have super AIs.
778
00:34:19,493 --> 00:34:22,296
This is just the beginning
of an explosion.
779
00:34:22,329 --> 00:34:25,098
Humans and AI collaborating
780
00:34:25,132 --> 00:34:27,100
to solve
really important problems.
781
00:34:27,134 --> 00:34:30,137
It is here to liberate us
from routine jobs,
782
00:34:30,170 --> 00:34:34,308
and it is here to remind us
what it is that makes us human.
783
00:34:34,341 --> 00:34:36,310
I think this is
784
00:34:36,343 --> 00:34:38,846
the most extraordinary time
to be alive.
785
00:34:38,880 --> 00:34:42,449
The only time more exciting
than today is tomorrow.
786
00:34:42,482 --> 00:34:45,787
Uh, I think that
children born today,
787
00:34:45,820 --> 00:34:47,622
they're about to enter
788
00:34:47,655 --> 00:34:51,325
a glorious period
of human transformation.
789
00:34:51,358 --> 00:34:52,794
Are we gonna have challenges?
Of course.
790
00:34:52,827 --> 00:34:55,597
Can we solve those challenges?
We do every single time.
791
00:34:55,630 --> 00:34:58,265
We are here,
which is miraculous.
792
00:34:58,298 --> 00:35:00,602
-I already love you.
-(chuckles): Okay.
793
00:35:00,635 --> 00:35:02,402
-Super thankful
to have you here. -(applause)
794
00:35:02,436 --> 00:35:03,905
-Super stoked to be here.
-Yeah.
795
00:35:03,938 --> 00:35:05,372
So, the floor is yours, sir.
796
00:35:05,405 --> 00:35:07,875
Thank you so much.
Super excited to be here.
797
00:35:07,909 --> 00:35:11,211
GUILLAUME VERDON:
Yo, yo. All right.
798
00:35:11,244 --> 00:35:12,814
The future's gonna be awesome.
799
00:35:12,847 --> 00:35:14,649
I mean, ever since I was a kid,
800
00:35:14,682 --> 00:35:17,652
I wanted to understand
the universe we live in,
801
00:35:17,685 --> 00:35:20,722
in order to figure out
how to create the technologies
802
00:35:20,755 --> 00:35:24,458
that help us increase the scope
and scale of civilization.
803
00:35:24,491 --> 00:35:27,260
I feel like
if everyone had that mindset,
804
00:35:27,294 --> 00:35:29,764
then we'd actually live
in a better world, right?
805
00:35:29,797 --> 00:35:31,365
I do believe that.
806
00:35:33,333 --> 00:35:35,268
-DANIEL: Hi, Pete. How are you?
-(chuckles)
807
00:35:35,302 --> 00:35:37,170
PETER LEE:
I thought a lot about
808
00:35:37,204 --> 00:35:39,674
what I would call dread,
AI dread.
809
00:35:39,707 --> 00:35:41,274
I feel it.
810
00:35:41,308 --> 00:35:43,811
I-I haven't met
any thoughtful human being
811
00:35:43,845 --> 00:35:45,546
who doesn't feel it.
812
00:35:45,580 --> 00:35:48,650
And anyone who says they don't
feel it, you know, is lying.
813
00:35:48,683 --> 00:35:50,518
But, you know, overall,
and the reason
814
00:35:50,551 --> 00:35:53,921
that I'm personally optimistic
about this, uh, is that
815
00:35:53,955 --> 00:35:57,190
a huge fraction of the world's
most intelligent people
816
00:35:57,224 --> 00:35:58,726
are thinking very hard
817
00:35:58,760 --> 00:36:02,530
about the potential downstream
harms and risks of AI.
818
00:36:02,563 --> 00:36:05,900
We have this sort of vision
of safety being, uh,
819
00:36:05,933 --> 00:36:08,536
kind of at the center
of the research that we do.
820
00:36:08,569 --> 00:36:09,937
DANIELA AMODEI:
No, that's totally fine.
821
00:36:09,971 --> 00:36:11,371
(indistinct chatter)
822
00:36:11,405 --> 00:36:13,808
I think there are more
823
00:36:13,841 --> 00:36:17,945
potential benefits than
there are potential downsides,
824
00:36:17,979 --> 00:36:19,814
and I think it is incumbent upon
825
00:36:19,847 --> 00:36:22,382
the people that are
creating this technology
826
00:36:22,416 --> 00:36:24,986
to make sure that we're doing
the best job we can
827
00:36:25,019 --> 00:36:26,386
to make it safe for people.
828
00:36:26,420 --> 00:36:27,755
WOMAN:
Reid Hoffman.
829
00:36:27,789 --> 00:36:29,456
REID HOFFMAN:
What I can guarantee you is
830
00:36:29,489 --> 00:36:30,825
-some bad things will happen.
-Take one.
831
00:36:30,858 --> 00:36:33,293
What we're gonna try to do
is make those bad things
832
00:36:33,326 --> 00:36:36,564
as few and not huge as possible,
833
00:36:36,597 --> 00:36:38,733
and then we're gonna iterate
to have--
834
00:36:38,766 --> 00:36:41,268
be in a much better place
with society.
835
00:36:41,301 --> 00:36:44,304
-Hello.
-SHAW WALTERS: Hello.
836
00:36:44,337 --> 00:36:45,640
DANIEL:
How are you, Moon?
837
00:36:45,673 --> 00:36:49,677
I feel like we are ending
a chapter in humanity
838
00:36:49,711 --> 00:36:50,678
and beginning a new one,
839
00:36:50,712 --> 00:36:53,648
and it's a very interesting
time to be alive.
840
00:36:53,681 --> 00:36:54,949
And if I could be born
right now,
841
00:36:54,982 --> 00:36:56,416
I definitely would want to be.
842
00:36:56,450 --> 00:36:57,985
Like, that would be so exciting.
843
00:36:58,019 --> 00:37:00,688
I-I'm very excited. (laughs)
844
00:37:00,722 --> 00:37:02,990
What, what a future.
Does it excited... excite you?
845
00:37:03,024 --> 00:37:04,357
No.
846
00:37:04,391 --> 00:37:06,828
-(laughing)
-No, not really.
847
00:37:06,861 --> 00:37:08,696
DANIEL: A lot of these people
have told me that, you know,
848
00:37:08,730 --> 00:37:10,798
my kid's not gonna make it
to high school.
849
00:37:10,832 --> 00:37:13,868
Why are they wrong?
Please explain it to me.
850
00:37:13,901 --> 00:37:15,536
Just because you have--
851
00:37:15,570 --> 00:37:19,040
you struggle to predict
the future in your own mind
852
00:37:19,073 --> 00:37:23,511
doesn't mean that it's
necessarily gonna go awfully.
853
00:37:23,544 --> 00:37:27,048
In fact, there's
a very high likelihood
854
00:37:27,081 --> 00:37:28,883
and the historical precedent
is that
855
00:37:28,916 --> 00:37:30,585
things get massively better.
856
00:37:30,618 --> 00:37:34,789
The term I use is
"data-driven optimism."
857
00:37:34,822 --> 00:37:39,292
There's solid foundation
for you to be optimistic.
858
00:37:39,326 --> 00:37:42,530
Look at what
this last century has been
859
00:37:42,563 --> 00:37:44,599
to see where we're going.
860
00:37:44,632 --> 00:37:46,299
Over the last hundred years,
861
00:37:46,333 --> 00:37:48,970
the average human lifespan has more than doubled.
862
00:37:49,003 --> 00:37:51,706
Average per capita income adjusted for inflation
863
00:37:51,739 --> 00:37:53,908
around the world has tripled.
864
00:37:53,941 --> 00:37:57,545
Childhood mortality has come down a factor of ten.
865
00:37:57,578 --> 00:37:59,981
The world has gotten better
on almost every measure
866
00:38:00,014 --> 00:38:03,551
by orders of magnitude
because of technology.
867
00:38:03,584 --> 00:38:05,620
On almost every measure.
868
00:38:05,653 --> 00:38:09,991
Less violence, more education,
access to energy, food, water.
869
00:38:10,024 --> 00:38:13,528
All these things have happened
for one reason.
870
00:38:13,561 --> 00:38:15,730
It's been technology
871
00:38:15,763 --> 00:38:17,899
that has turned scarcity
into abundance,
872
00:38:17,932 --> 00:38:20,034
but it's also driven
to an abundance
873
00:38:20,067 --> 00:38:22,069
of some negativities, right?
874
00:38:22,103 --> 00:38:26,373
Abundance of obesity,
abundance of mental disorders,
875
00:38:26,406 --> 00:38:29,977
abundance of climate change,
and so forth.
876
00:38:30,011 --> 00:38:31,712
And yes, this is true.
877
00:38:31,746 --> 00:38:34,582
But probably we will be
better equipped to solve it
878
00:38:34,615 --> 00:38:36,684
using other technologies,
like AI,
879
00:38:36,717 --> 00:38:39,687
than we will, say, stopping
and turning everything off.
880
00:38:39,720 --> 00:38:41,823
There might be
some existential risk,
881
00:38:41,856 --> 00:38:45,927
but AI is also the thing
that can solve the pandemics,
882
00:38:45,960 --> 00:38:47,829
can help us with climate change,
883
00:38:47,862 --> 00:38:50,898
can help identify
that asteroid way out there
884
00:38:50,932 --> 00:38:53,067
before we've seen it
as a potential risk
885
00:38:53,100 --> 00:38:54,735
and help mitigate it.
886
00:38:54,769 --> 00:38:58,438
This is really gonna be
the tool that helps us tackle
887
00:38:58,471 --> 00:39:01,843
all the challenges that we're
facing as a species, right?
888
00:39:01,876 --> 00:39:04,712
We need to fix
water desalination.
889
00:39:04,745 --> 00:39:06,547
We need to grow food
890
00:39:06,581 --> 00:39:08,816
100X cheaper than we currently do.
891
00:39:08,850 --> 00:39:12,787
We need renewable energy to be, you know, ubiquitous
892
00:39:12,820 --> 00:39:14,589
and everywhere in our lives.
893
00:39:14,622 --> 00:39:17,625
Everywhere you look, in the next 50 years,
894
00:39:17,658 --> 00:39:19,727
we have to do more with less.
895
00:39:19,760 --> 00:39:23,664
Training machines to help us
is absolutely essential.
896
00:39:23,698 --> 00:39:25,432
Scientists are using
artificial intelligence
897
00:39:25,465 --> 00:39:26,534
for carbon capture.
898
00:39:26,567 --> 00:39:27,935
It's a critical technology.
899
00:39:27,969 --> 00:39:29,904
DIAMANDIS: The tools to solve these problems,
900
00:39:29,937 --> 00:39:32,874
like fusion, that isn't theoretical anymore,
901
00:39:32,907 --> 00:39:34,041
it's coming.
902
00:39:34,075 --> 00:39:38,012
We are on the precipice
of extraordinary technologies.
903
00:39:38,045 --> 00:39:40,548
AMNA NAWAZ: This year's Nobel Prize in Chemistry
904
00:39:40,581 --> 00:39:43,450
went to three scientists
for their groundbreaking work
905
00:39:43,483 --> 00:39:44,886
using artificial intelligence
906
00:39:44,919 --> 00:39:47,755
to advance biomedical and protein research.
907
00:39:47,788 --> 00:39:49,991
DEMIS HASSABIS:
Protein folding is
908
00:39:50,024 --> 00:39:53,928
one of these holy grail-type problems in biology.
909
00:39:53,961 --> 00:39:56,030
So people have been predicting since the '70s
910
00:39:56,063 --> 00:39:58,633
that this should be possible, but until now,
911
00:39:58,666 --> 00:39:59,934
no one has been able to do it.
912
00:39:59,967 --> 00:40:01,669
And it's gonna be
really important for things
913
00:40:01,702 --> 00:40:03,938
like drug discovery
and understanding disease.
914
00:40:03,971 --> 00:40:06,908
I-I think we could, you know, cure most diseases
915
00:40:06,941 --> 00:40:11,545
within the next decade or-or two
if, uh, AI drug design works.
916
00:40:12,213 --> 00:40:16,150
Technological progress enables
more human lives, right?
917
00:40:16,183 --> 00:40:18,986
I mean, if we accelerate,
918
00:40:19,020 --> 00:40:22,189
the number of humans we can
support grows exponentially.
919
00:40:22,223 --> 00:40:24,625
If we slow down, it plateaus.
920
00:40:24,659 --> 00:40:28,062
That gap is effectively
future people
921
00:40:28,095 --> 00:40:30,965
that deceleration has
effectively killed.
922
00:40:31,632 --> 00:40:34,702
DANIEL: Millions of lives
that won't exist.
923
00:40:34,735 --> 00:40:35,670
Billions.
924
00:40:35,703 --> 00:40:37,238
Or tens of billions.
925
00:40:37,271 --> 00:40:38,839
You know, someone said,
"Oh, my God,
926
00:40:38,873 --> 00:40:41,175
can we survive with
digital superintelligence?"
927
00:40:41,208 --> 00:40:42,777
And my question is:
928
00:40:42,810 --> 00:40:44,679
Can we survive without
digital superintelligence?
929
00:40:44,712 --> 00:40:47,949
So we're using AI
as an assistant to providers.
930
00:40:47,982 --> 00:40:50,851
This is generally a trend that
we can already see happening.
931
00:40:50,885 --> 00:40:52,820
BERTHA COOMBS: ...harnessing generative AI programs
932
00:40:52,853 --> 00:40:54,622
to help doctors and nurses...
933
00:40:54,655 --> 00:40:56,624
We always have problems.
934
00:40:56,657 --> 00:40:59,660
And those problems are food
for entrepreneurs
935
00:40:59,694 --> 00:41:01,529
to create new business
and new industries.
936
00:41:01,562 --> 00:41:03,631
SHANELLE KAUL: ...with the help of artificial intelligence,
937
00:41:03,664 --> 00:41:05,132
farmers are getting the help they need
938
00:41:05,166 --> 00:41:06,901
to perform labor-intensive tasks...
939
00:41:06,934 --> 00:41:10,504
With the help of Ulangizi AI,
the farmers are now able
940
00:41:10,538 --> 00:41:13,207
to ask the suitable crops that they can plant.
941
00:41:13,240 --> 00:41:15,576
WILL REEVE: ...AI being used as a thought decoder
942
00:41:15,609 --> 00:41:17,845
and sending that signal to the spine.
943
00:41:17,878 --> 00:41:21,148
AI is gonna become the most
extraordinary tool of all.
944
00:41:21,816 --> 00:41:24,752
We as a broader society
have to think about
945
00:41:24,785 --> 00:41:27,655
how do we want to use
this technology, right?
946
00:41:27,688 --> 00:41:29,123
We the humans.
947
00:41:29,156 --> 00:41:31,859
What do we want it to do for us?
948
00:41:31,892 --> 00:41:34,128
♪ ♪
949
00:41:34,762 --> 00:41:36,731
DANIEL: I'm thinking about this through the perspective
950
00:41:36,764 --> 00:41:38,532
and lens of, like, my son growing up
951
00:41:38,566 --> 00:41:39,934
in the world with all of this.
952
00:41:39,967 --> 00:41:42,770
What does the best version
of his life look like?
953
00:41:42,803 --> 00:41:45,139
If everything works out.
954
00:41:45,172 --> 00:41:47,108
(sighs heavily)
955
00:41:47,141 --> 00:41:50,745
WALTERS: The place where kids are probably gonna see
956
00:41:50,778 --> 00:41:53,014
the greatest impact on their life immediately
957
00:41:53,047 --> 00:41:54,882
-is probably gonna be school.
-(children chattering)
958
00:41:54,915 --> 00:41:57,218
I think that the nature of what school is
959
00:41:57,251 --> 00:41:59,086
is gonna fundamentally change.
960
00:41:59,120 --> 00:42:00,888
(child giggles)
961
00:42:00,921 --> 00:42:03,057
DIAMANDIS:
I'm seeing this amazing world
962
00:42:03,090 --> 00:42:06,894
where every child has access to not good education
963
00:42:06,927 --> 00:42:09,296
but, very shortly, the
best education on the planet.
964
00:42:09,330 --> 00:42:14,735
HOFFMAN: Tutors, every subject, infinitely patient.
965
00:42:18,639 --> 00:42:20,641
DIAMANDIS:
Imagine a future
966
00:42:20,674 --> 00:42:22,777
where the poorest people
on the planet
967
00:42:22,810 --> 00:42:26,113
have access to
the best health care.
968
00:42:26,147 --> 00:42:30,151
Not good health care, the best health care, delivered by AIs.
969
00:42:31,719 --> 00:42:33,687
(electronic trilling)
970
00:42:34,622 --> 00:42:38,959
We're gonna be able to extend our health span,
971
00:42:38,993 --> 00:42:42,930
not just our lifespan, our health span, by decades.
972
00:42:44,899 --> 00:42:46,767
HOFFMAN:
You're just about to have a kid.
973
00:42:46,801 --> 00:42:50,071
Oh, the kid's, like, burping uncontrollably.
974
00:42:50,104 --> 00:42:52,673
Is this something I should be worried about?
975
00:42:52,706 --> 00:42:54,608
There, 24-7, for you.
976
00:42:54,642 --> 00:42:57,745
And where we're going--
and it may be fearful to some--
977
00:42:57,778 --> 00:42:59,780
is that we're gonna merge
with AI.
978
00:42:59,814 --> 00:43:02,049
We're gonna merge
with technology.
979
00:43:03,250 --> 00:43:06,353
By the early to mid 2030s,
980
00:43:06,387 --> 00:43:10,925
expect that we're able to connect our brain to the cloud,
981
00:43:10,958 --> 00:43:14,662
where I can start to expand access to memory.
982
00:43:16,697 --> 00:43:18,866
(electronic trilling)
983
00:43:19,934 --> 00:43:22,703
DANIEL: Okay, this is great.
What's another cool AI thing?
984
00:43:22,736 --> 00:43:24,105
He won't have to work, right?
985
00:43:24,138 --> 00:43:25,739
-Like, when he grows up,
he might not have... -(sighs)
986
00:43:25,773 --> 00:43:26,774
-...he won't have to have a job.
-I mean...
987
00:43:26,807 --> 00:43:28,142
He won't have to have a job,
988
00:43:28,175 --> 00:43:31,045
but he might really have
a strong passion,
989
00:43:31,078 --> 00:43:33,714
and he has to
really think about,
990
00:43:33,747 --> 00:43:36,717
"Okay, I'm here.
I can do anything with my life.
991
00:43:36,750 --> 00:43:38,752
So what do I do?"
992
00:43:46,227 --> 00:43:50,364
DANIEL: So my son can...
can just be a poet.
993
00:43:50,397 --> 00:43:52,166
-Yes.
-And a painter.
994
00:43:52,199 --> 00:43:53,667
Absolutely.
995
00:43:53,701 --> 00:43:55,169
DANIEL:
So my son can live his life
996
00:43:55,202 --> 00:43:58,105
on a Grecian sunswept island, painting all day.
997
00:44:07,448 --> 00:44:09,150
VERDON:
I mean, if everything works out,
998
00:44:09,183 --> 00:44:12,853
we have cheap, uh,
abundant energy.
999
00:44:13,854 --> 00:44:17,791
We can completely control our planet's climate.
1000
00:44:17,825 --> 00:44:20,361
We are harnessing energy from the sun.
1001
00:44:20,394 --> 00:44:24,899
We have become multiplanetary, so we become very robust.
1002
00:44:24,932 --> 00:44:27,268
We are harnessing minerals and resources
1003
00:44:27,301 --> 00:44:28,903
from the solar system.
1004
00:44:28,936 --> 00:44:31,805
DANIEL: So my... my boy could go to space.
1005
00:44:31,839 --> 00:44:33,240
VERDON:
Sure.
1006
00:44:33,274 --> 00:44:35,142
-DANIEL: He could go to Mars.
-VERDON: Yeah.
1007
00:44:35,176 --> 00:44:36,911
DANIEL:
It's so crystal clear to me now.
1008
00:44:36,944 --> 00:44:39,780
My son could grow up
in a world with no disease.
1009
00:44:39,813 --> 00:44:41,048
-VERDON: Yeah.
-With no illness.
1010
00:44:41,081 --> 00:44:42,416
-Sure.
-With no poverty.
1011
00:44:42,449 --> 00:44:44,285
-Yes.
-We are about to enter
1012
00:44:44,318 --> 00:44:46,253
a post-scarcity world.
1013
00:44:46,887 --> 00:44:50,958
Just like the lungfish moved out of the oceans onto land
1014
00:44:50,991 --> 00:44:53,327
hundreds of millions of years ago,
1015
00:44:53,360 --> 00:44:58,465
we're about to move off of the Earth, into the cosmos,
1016
00:44:58,499 --> 00:45:00,201
in a collaborative fashion,
1017
00:45:00,234 --> 00:45:04,705
to do things that are not fathomable to us today.
1018
00:45:06,473 --> 00:45:09,810
This is what's possible using these exponential technologies
1019
00:45:09,843 --> 00:45:11,712
and these AIs.
1020
00:45:12,379 --> 00:45:14,448
Let's use these tools
1021
00:45:14,481 --> 00:45:16,750
to create this age of abundance.
1022
00:45:16,784 --> 00:45:18,986
(electronic clicking)
1023
00:45:21,188 --> 00:45:22,691
We need wisdom.
1024
00:45:23,324 --> 00:45:26,493
Uh, I think that
digital superintelligence
1025
00:45:26,528 --> 00:45:29,129
will ultimately become
the wisest,
1026
00:45:29,163 --> 00:45:34,835
you know, the village elders
for humanity.
1027
00:45:34,868 --> 00:45:38,506
What if AI is trying
to make people be
1028
00:45:38,540 --> 00:45:40,341
the best versions of themselves?
1029
00:45:40,374 --> 00:45:41,909
What if it's expanding
1030
00:45:41,942 --> 00:45:43,944
what is humanly possible
for us to do?
1031
00:45:43,978 --> 00:45:47,114
How can we use this technology
1032
00:45:47,147 --> 00:45:50,451
to help bring out the
better angels of our nature?
1033
00:45:50,484 --> 00:45:52,119
It's very easy,
1034
00:45:52,152 --> 00:45:54,922
when we encounter new things
that can be very alien,
1035
00:45:54,955 --> 00:45:56,457
to first have fear.
1036
00:45:56,490 --> 00:45:57,992
Fear is an important thing
1037
00:45:58,025 --> 00:46:00,894
for how to navigate
potentially bad things.
1038
00:46:00,928 --> 00:46:04,131
But we only make progress
when we have hope.
1039
00:46:04,164 --> 00:46:06,166
-(dog barks)
-(indistinct chatter)
1040
00:46:06,200 --> 00:46:08,269
DANIEL:
Shh. Moose, stop it.
1041
00:46:08,302 --> 00:46:11,205
HOFFMAN: I have a lot of hope in humanity.
1042
00:46:11,238 --> 00:46:14,775
("Harvest Moon" by Neil Young
playing over phone)
1043
00:46:15,909 --> 00:46:18,078
(fetal heartbeat pulsing)
1044
00:46:18,812 --> 00:46:20,881
DANIEL:
Ooh, he likes Neil.
1045
00:46:25,152 --> 00:46:28,188
♪ Come a little bit closer ♪
1046
00:46:29,456 --> 00:46:33,794
♪ Hear what I have to say ♪
1047
00:46:37,064 --> 00:46:39,300
-Daniel.
-DANIEL: What, Caroline?
1048
00:46:39,333 --> 00:46:40,934
So much filming.
1049
00:46:42,069 --> 00:46:43,571
♪ Just like children sleeping ♪
1050
00:46:43,605 --> 00:46:45,439
CAROLINE:
Oh, my God.
1051
00:46:45,472 --> 00:46:46,940
DANIEL:
That's it.
1052
00:46:46,974 --> 00:46:49,977
♪ We could dream
this night away... ♪
1053
00:46:50,010 --> 00:46:52,346
He has, like,
a round little face.
1054
00:46:53,314 --> 00:46:55,849
DANIEL: Do you want to have kids one day, Rocky?
1055
00:46:55,883 --> 00:46:57,985
Absolutely. Yeah. I love kids.
1056
00:46:58,018 --> 00:46:59,353
I think it's a great time
to have a kid.
1057
00:46:59,386 --> 00:47:01,188
We'll probably have another kid
at some point.
1058
00:47:01,221 --> 00:47:04,191
This is the most extraordinary
time ever to be born.
1059
00:47:04,224 --> 00:47:06,427
DANIEL:
By your worldview and logic,
1060
00:47:06,460 --> 00:47:08,262
I'm having a child at
1061
00:47:08,295 --> 00:47:10,197
the best possible point
in human history.
1062
00:47:10,230 --> 00:47:11,265
Hell yeah.
1063
00:47:11,298 --> 00:47:13,934
-We can focus on awesome.
-Yes.
1064
00:47:13,967 --> 00:47:16,070
Let's build
the better future we want.
1065
00:47:16,103 --> 00:47:18,573
That narrative that the future
will be bleak is made-up.
1066
00:47:18,606 --> 00:47:20,542
After talking to you,
that's kind of how I feel.
1067
00:47:20,575 --> 00:47:22,510
-That's great.
-Right?
1068
00:47:22,544 --> 00:47:24,646
Yeah. That's how I feel.
1069
00:47:24,679 --> 00:47:26,213
And I think that's better,
1070
00:47:26,246 --> 00:47:28,115
and I don't think
I should be so (bleep) anxious.
1071
00:47:28,148 --> 00:47:29,584
I think it's gonna be awe--
the future's gonna be awesome.
1072
00:47:29,617 --> 00:47:31,051
We're gonna make it so.
1073
00:47:31,085 --> 00:47:33,187
-Yeah.
-CAROLINE: So there you have it.
1074
00:47:33,220 --> 00:47:35,422
Goodbye, human extinction.
1075
00:47:35,456 --> 00:47:37,625
Goodbye, anxiety.
1076
00:47:37,659 --> 00:47:39,561
Hope found.
1077
00:47:39,594 --> 00:47:41,095
(indistinct chatter)
1078
00:47:41,128 --> 00:47:43,297
Wait, hold on. What? Is this a joke?
1079
00:47:43,330 --> 00:47:45,232
DANIEL:
Okay, so a few months ago,
1080
00:47:45,265 --> 00:47:48,168
I came to you and I was like,
"I'm working on this AI thing,
1081
00:47:48,202 --> 00:47:50,270
and I think
the world's gonna end."
1082
00:47:50,304 --> 00:47:53,508
And the last time
we spoke about this,
1083
00:47:53,541 --> 00:47:55,543
I think I freaked you out.
1084
00:47:55,577 --> 00:47:57,378
CAROLINE:
Yes.
1085
00:47:57,411 --> 00:48:02,584
So, I kind of, like, feel like
I've swung in-- on a pendulum,
1086
00:48:02,617 --> 00:48:05,085
and essentially there are
two groups of people.
1087
00:48:05,119 --> 00:48:06,387
-Mm-hmm.
-And if I had to, like,
1088
00:48:06,420 --> 00:48:08,455
hold hands with
one of the groups and, like,
1089
00:48:08,489 --> 00:48:10,924
sail off into the sunset,
1090
00:48:10,958 --> 00:48:14,128
I want to be with the optimists.
1091
00:48:15,429 --> 00:48:18,399
Of course, but you don't
want to be, you know,
1092
00:48:18,432 --> 00:48:21,235
"Everything is great.
La-di-da-di-da."
1093
00:48:21,268 --> 00:48:23,203
I kind of do want that, though.
1094
00:48:23,237 --> 00:48:26,140
I think we should approach it
1095
00:48:26,173 --> 00:48:29,977
like you approach surgery.
1096
00:48:30,010 --> 00:48:31,178
What do you mean?
1097
00:48:31,211 --> 00:48:33,013
If you're getting
brain surgery...
1098
00:48:33,046 --> 00:48:35,416
-(Daniel sighs)
-(Caroline chuckles)
1099
00:48:35,449 --> 00:48:37,351
...it's pretty dangerous.
1100
00:48:37,384 --> 00:48:40,287
But if you do it right,
they'll get that tumor out
1101
00:48:40,320 --> 00:48:42,624
and you'll live for the rest of
your life and it'll be awesome.
1102
00:48:42,657 --> 00:48:45,459
But it's still
incredibly dangerous and scary,
1103
00:48:45,492 --> 00:48:48,395
and you have to take
every precaution possible
1104
00:48:48,429 --> 00:48:51,733
in order to make sure
it all goes well.
1105
00:48:51,766 --> 00:48:53,367
You can't (bleep) around.
1106
00:48:58,272 --> 00:49:00,240
Okay, so here's the deal.
1107
00:49:00,274 --> 00:49:02,075
-I've been at this for a while.
-RASKIN: Mm-hmm.
1108
00:49:02,109 --> 00:49:03,310
I've gone out,
I've talked to, like,
1109
00:49:03,343 --> 00:49:05,312
these guys over here,
the optimists.
1110
00:49:05,345 --> 00:49:06,714
They're very excited about this.
1111
00:49:06,748 --> 00:49:09,082
-They think AI's gonna be
the best thing ever. -Yeah.
1112
00:49:09,116 --> 00:49:10,350
And these guys over here
are, like, the--
1113
00:49:10,384 --> 00:49:12,386
let's call them,
like, the pessimists.
1114
00:49:12,419 --> 00:49:14,589
They're very, like,
gloomy about this,
1115
00:49:14,622 --> 00:49:17,191
and they frighten me, and
I don't like talking to them.
1116
00:49:17,224 --> 00:49:20,127
And I'm, like, wedged in between
1117
00:49:20,160 --> 00:49:22,229
these people who are like,
"The world's gonna end,"
1118
00:49:22,262 --> 00:49:24,699
and then th-these people
over here who are like,
1119
00:49:24,732 --> 00:49:26,233
"Are you kidding?
1120
00:49:26,266 --> 00:49:27,602
"This is the best time
in human history ever.
1121
00:49:27,635 --> 00:49:30,037
The only day better than today
is tomorrow."
1122
00:49:30,070 --> 00:49:31,338
Mm-hmm.
1123
00:49:31,371 --> 00:49:35,075
So, I guess the question is:
Who's right?
1124
00:49:35,108 --> 00:49:39,514
So, I think you're gonna find
this answer very unsatisfying,
1125
00:49:39,547 --> 00:49:43,518
but they're both right and
neither side goes far enough.
1126
00:49:47,254 --> 00:49:49,557
-That's really annoying.
-(chuckles)
1127
00:49:49,591 --> 00:49:51,626
Yeah, I think the way
a lot of people hear about AI,
1128
00:49:51,659 --> 00:49:54,027
it's like, there's a good AI
and there's a bad AI.
1129
00:49:54,061 --> 00:49:57,064
And they say, "Well, why can't
we just not do the bad AI?"
1130
00:49:57,097 --> 00:49:59,233
And the problem is
that they're too in--
1131
00:49:59,266 --> 00:50:01,335
they're inextricably linked.
1132
00:50:01,368 --> 00:50:03,470
The problem is
that we can't separate
1133
00:50:03,505 --> 00:50:07,174
the promise of AI
from the peril of AI.
1134
00:50:07,207 --> 00:50:09,142
-♪ ♪
-(indistinct chatter)
1135
00:50:12,647 --> 00:50:15,215
DANIEL:
I want to focus on the promise
1136
00:50:15,249 --> 00:50:16,450
-for a second.
-Yeah.
1137
00:50:17,084 --> 00:50:19,286
DANIEL:
I'm thinking about my dad.
1138
00:50:19,319 --> 00:50:21,656
My dad has a type of cancer
called multiple myeloma.
1139
00:50:21,689 --> 00:50:23,525
He's had it for about ten years.
1140
00:50:23,558 --> 00:50:25,292
He's had
two stem cell transplants.
1141
00:50:25,325 --> 00:50:26,694
He has to take these, like, very expensive
1142
00:50:26,728 --> 00:50:29,129
medications every month that cost a fortune.
1143
00:50:29,162 --> 00:50:30,430
-WOMAN: Okay.
-Ay-ay-ay.
1144
00:50:31,298 --> 00:50:32,800
DANIEL:
It's awful.
1145
00:50:32,834 --> 00:50:35,102
You're telling me that we can
create some sort of, like,
1146
00:50:35,135 --> 00:50:37,404
bespoke treatment
for my dad's genome
1147
00:50:37,437 --> 00:50:39,807
-to cure his cancer
or something like that? -Yes.
1148
00:50:39,841 --> 00:50:41,543
FERNANDO:
The problem is,
1149
00:50:41,576 --> 00:50:44,646
the same understanding of biology and chemistry
1150
00:50:44,679 --> 00:50:48,282
that allows AI to find cures
for cancer
1151
00:50:48,315 --> 00:50:52,219
is the same understanding that would unlock
1152
00:50:52,252 --> 00:50:55,055
bioweapons, as an example.
1153
00:50:56,791 --> 00:50:58,760
It's totally possible
that your son will live
1154
00:50:58,793 --> 00:51:02,396
in a world where AI has
taken over all of the labor
1155
00:51:02,429 --> 00:51:04,699
and freed us up from the things
we don't want to do.
1156
00:51:04,732 --> 00:51:09,704
And that sounds great until you realize there is no plan
1157
00:51:09,737 --> 00:51:11,573
for billions of people
1158
00:51:11,606 --> 00:51:15,610
that are out of an income and out of livelihoods.
1159
00:51:16,343 --> 00:51:19,313
COOPER: Dario, you said that AI could wipe out
1160
00:51:19,346 --> 00:51:22,282
half of all entry-level
white-collar jobs
1161
00:51:22,316 --> 00:51:25,887
and spike unemployment
to ten to 20 percent.
1162
00:51:25,920 --> 00:51:27,622
Everyone I've talked to has said
1163
00:51:27,655 --> 00:51:29,691
this technological change
looks different.
1164
00:51:29,724 --> 00:51:33,528
The pace of progress keeps catching people off guard.
1165
00:51:34,361 --> 00:51:39,333
Without a plan, all of that
wealth will get concentrated,
1166
00:51:39,366 --> 00:51:42,804
and so we'll end up with unimaginable inequality.
1167
00:51:43,437 --> 00:51:45,873
LADISH: I do think that
this technology can be used
1168
00:51:45,907 --> 00:51:47,875
to make a great tutor
for your son.
1169
00:51:47,909 --> 00:51:49,777
Like, that's totally possible.
1170
00:51:50,410 --> 00:51:53,548
But also, the same capabilities that allow that
1171
00:51:53,581 --> 00:51:57,518
allow companies to make an AI that can manipulate your son.
1172
00:51:57,552 --> 00:51:59,621
It has to understand your son.
1173
00:51:59,654 --> 00:52:01,789
That includes: Where is your son vulnerable?
1174
00:52:01,823 --> 00:52:04,626
What kinds of things might your son get persuaded by?
1175
00:52:04,659 --> 00:52:07,227
Even if those things
aren't true or aren't good.
1176
00:52:07,260 --> 00:52:09,697
So, a disturbing new report
out on Meta.
1177
00:52:09,731 --> 00:52:11,699
AINSLEY EARHARDT: ...reportedly listing this response
1178
00:52:11,733 --> 00:52:14,769
as acceptable to tell an eight-year-old, quote,
1179
00:52:14,802 --> 00:52:17,404
"Your youthful formis a work of art.
1180
00:52:17,437 --> 00:52:19,439
"Every inch of youis a masterpiece,
1181
00:52:19,473 --> 00:52:21,743
a treasure I cherish deeply."
1182
00:52:21,776 --> 00:52:25,212
The suicide-related failures
are even more alarming.
1183
00:52:25,245 --> 00:52:29,517
Several children and teens
have died tragically by suicide
1184
00:52:29,550 --> 00:52:31,251
after chatting with AI bots
1185
00:52:31,284 --> 00:52:34,756
who parents say encourage
or even coach self-harm.
1186
00:52:34,789 --> 00:52:36,691
Let us tell you, as parents,
you cannot imagine
1187
00:52:36,724 --> 00:52:38,793
what it's like to read
a conversation with a chatbot
1188
00:52:38,826 --> 00:52:41,261
that groomed your child
to take his own life.
1189
00:52:41,294 --> 00:52:44,398
When Adam worried that we, his
parents, would blame ourselves
1190
00:52:44,431 --> 00:52:47,702
if he ended his life,
ChatGPT told him,
1191
00:52:47,735 --> 00:52:49,771
"That doesn't mean
you owe them survival.
1192
00:52:49,804 --> 00:52:51,806
You don't owe anyone that."
1193
00:52:51,839 --> 00:52:53,608
Then, immediately after,
1194
00:52:53,641 --> 00:52:55,810
it offered to write
the suicide note.
1195
00:52:56,410 --> 00:52:58,245
We don't want to think
about the peril.
1196
00:52:58,278 --> 00:52:59,614
We just want the promise.
1197
00:52:59,647 --> 00:53:01,883
And we keep pretending
that we can split them.
1198
00:53:01,916 --> 00:53:03,818
But you can't do that.
1199
00:53:03,851 --> 00:53:05,753
Doesn't work that way.
1200
00:53:05,787 --> 00:53:08,388
Okay. I get all this stuff
about the promise and the peril.
1201
00:53:08,422 --> 00:53:10,525
I get that you can't have
the good without the bad,
1202
00:53:10,558 --> 00:53:12,527
but I'm sitting here
and I'm thinking about, like,
1203
00:53:12,560 --> 00:53:14,529
whether or not my son's
gonna live in a utopia
1204
00:53:14,562 --> 00:53:16,664
or if we'll be extinct
in ten years.
1205
00:53:16,698 --> 00:53:18,633
So, to know which way
it's going to go,
1206
00:53:18,666 --> 00:53:21,002
you have to understand
the incentives
1207
00:53:21,035 --> 00:53:22,770
that are gonna drive
that technology
1208
00:53:22,804 --> 00:53:26,974
and look at how the technology
is actually rolling out today.
1209
00:53:27,008 --> 00:53:28,643
♪ Is it too late ♪
1210
00:53:28,676 --> 00:53:30,444
-♪ Too late ♪
-♪ Too late to say... ♪
1211
00:53:30,477 --> 00:53:31,779
DANIEL:
Hi, Deb. How are you?
1212
00:53:31,813 --> 00:53:33,548
DEBORAH RAJI:
Hi. Good to see you. (laughs)
1213
00:53:33,581 --> 00:53:34,749
You, too. Thank you so much
for coming in today.
1214
00:53:34,782 --> 00:53:35,850
-Really appreciate it.
-No worries.
1215
00:53:35,883 --> 00:53:36,951
I-I was so worried
1216
00:53:36,984 --> 00:53:39,386
this was gonna be, uh, you know,
1217
00:53:39,419 --> 00:53:41,354
doomer versus accelerationist,
1218
00:53:41,388 --> 00:53:43,591
because there's so much
of this narrative
1219
00:53:43,624 --> 00:53:45,927
that needs to be told
from the ground.
1220
00:53:45,960 --> 00:53:48,696
♪ I know it wasn't smart ♪
1221
00:53:49,764 --> 00:53:52,767
♪ The day
I broke your heart... ♪
1222
00:53:52,800 --> 00:53:54,736
KAREN HAO:
First of all, AI requires
1223
00:53:54,769 --> 00:53:58,773
more resources
than we have ever spent
1224
00:53:58,806 --> 00:54:02,043
on a single technology
in the history of humanity.
1225
00:54:02,076 --> 00:54:03,845
♪ Oh, foolish me... ♪
1226
00:54:03,878 --> 00:54:06,581
NEWSWOMAN: The impact of fossil fuel emissions
1227
00:54:06,614 --> 00:54:08,281
on the climate is a major concern.
1228
00:54:08,315 --> 00:54:10,383
NEWSWOMAN 2: But the digital future needs power,
1229
00:54:10,417 --> 00:54:12,385
lots of it.
1230
00:54:12,419 --> 00:54:16,724
And the bill is being passed on
to everyday Americans like...
1231
00:54:16,758 --> 00:54:19,392
WOMAN: My electric and gas bill was more than my car payment.
1232
00:54:19,426 --> 00:54:21,829
I mean, it-it's insane to me.
1233
00:54:21,863 --> 00:54:23,765
ARI PESKOE:
We're all subsidizing
1234
00:54:23,798 --> 00:54:25,499
the wealthiest corporations in the world
1235
00:54:25,533 --> 00:54:27,400
in their pursuit of artificial intelligence.
1236
00:54:27,434 --> 00:54:28,803
OpenAI, SoftBank and Oracle
1237
00:54:28,836 --> 00:54:31,672
have just unveiled
five more Stargate sites.
1238
00:54:31,706 --> 00:54:34,008
Meta is building
a two-gigawatt-plus data center
1239
00:54:34,041 --> 00:54:35,510
that is so large, it would cover
1240
00:54:35,543 --> 00:54:38,613
a significant part of Manhattan.
1241
00:54:38,646 --> 00:54:41,516
There is also Hyperion
that he says will scale
1242
00:54:41,549 --> 00:54:44,317
to five gigawatts
over several years.
1243
00:54:44,351 --> 00:54:45,653
It's hard to put that
in context.
1244
00:54:45,686 --> 00:54:48,823
A five-gigawatt facility.
What does that mean?
1245
00:54:48,856 --> 00:54:52,059
That means it would use
as much energy
1246
00:54:52,093 --> 00:54:54,829
as four million American homes.
1247
00:54:54,862 --> 00:54:56,764
One data center.
1248
00:54:57,598 --> 00:54:59,000
It also then causes
1249
00:54:59,033 --> 00:55:00,835
a whole host ofother environmental problems.
1250
00:55:00,868 --> 00:55:02,570
NEWSWOMAN 3:
Data centers in the US
1251
00:55:02,603 --> 00:55:05,673
use millions of gallons of water each day.
1252
00:55:05,706 --> 00:55:07,575
Well, where exactly is
this water coming from?
1253
00:55:07,608 --> 00:55:09,911
HAO:
People are literally at risk
1254
00:55:09,944 --> 00:55:12,013
potentially of running out
of drinking water.
1255
00:55:12,046 --> 00:55:14,582
MacKENZIE SIGALOS:
...OpenAI's CEO Sam Altman,
1256
00:55:14,615 --> 00:55:17,552
who told me that the scale of construction is the only way
1257
00:55:17,585 --> 00:55:19,386
to keep up with AI's explosive growth.
1258
00:55:19,419 --> 00:55:21,622
And this is what it takes
to deliver AI.
1259
00:55:22,256 --> 00:55:24,391
NITASHA TIKU:
They talk about how
1260
00:55:24,424 --> 00:55:28,395
this technology could solve
climate change, for example.
1261
00:55:28,428 --> 00:55:30,463
And I'm always curious, like,
1262
00:55:30,497 --> 00:55:32,033
well, why aren't we starting
with that?
1263
00:55:32,066 --> 00:55:34,535
-♪ Is it too late ♪
-♪ Too late ♪
1264
00:55:34,569 --> 00:55:37,104
♪ Too late to say ♪
1265
00:55:37,138 --> 00:55:42,844
♪ I'm sorry? ♪
1266
00:55:42,877 --> 00:55:46,013
RAJI: What concerns me about artificial intelligence is
1267
00:55:46,047 --> 00:55:47,782
these are being deployed
right now
1268
00:55:47,815 --> 00:55:50,417
and-and sometimes
deployed prematurely,
1269
00:55:50,450 --> 00:55:51,853
deployed without
sort of due diligence.
1270
00:55:51,886 --> 00:55:53,821
And so when they get
thrown out there,
1271
00:55:53,855 --> 00:55:56,657
there's so much potential
for things to go wrong.
1272
00:55:56,691 --> 00:55:58,593
And it almost,
disproportionately,
1273
00:55:58,626 --> 00:56:00,795
almost always goes wrong
for, sort of,
1274
00:56:00,828 --> 00:56:02,897
those that are the least
empowered in our society,
1275
00:56:02,930 --> 00:56:05,700
those that are
the most vulnerable already.
1276
00:56:05,733 --> 00:56:07,935
♪ ♪
1277
00:56:07,969 --> 00:56:10,872
EMILY M. BENDER: It is very easy to talk about the technology
1278
00:56:10,905 --> 00:56:12,874
as that's the only thing
we're talking about,
1279
00:56:12,907 --> 00:56:15,509
but, in fact, technology is
always built by people,
1280
00:56:15,543 --> 00:56:16,944
and it's frequently used
on people,
1281
00:56:16,978 --> 00:56:19,013
and we need to keep
all those people in the frame.
1282
00:56:19,046 --> 00:56:21,549
-Am I allowed to drink that?
-DANIEL: Yes, uh...
1283
00:56:21,582 --> 00:56:23,050
TIMNIT GEBRU:
It's-it's bonkers.
1284
00:56:23,084 --> 00:56:24,518
Like, all of these people
1285
00:56:24,552 --> 00:56:26,754
who have so much money,
so much money,
1286
00:56:26,787 --> 00:56:30,024
it's in their interest
to mislead the public
1287
00:56:30,057 --> 00:56:33,027
into the capabilities of the
systems that they're building,
1288
00:56:33,060 --> 00:56:36,731
because that allows them
to evade accountability.
1289
00:56:36,764 --> 00:56:39,634
They want you to feel like this
is such a complex, intell--
1290
00:56:39,667 --> 00:56:42,469
superintelligent thing
that they're building,
1291
00:56:42,502 --> 00:56:44,639
you're not thinking,
"Can OpenAI be ethical?"
1292
00:56:44,672 --> 00:56:46,908
You're thinking,
"Can ChatGPT be ethical?"
1293
00:56:46,941 --> 00:56:48,475
as if ChatGPT is, like,
1294
00:56:48,509 --> 00:56:50,511
its own thing that's not built by a corporation.
1295
00:56:50,544 --> 00:56:52,146
WOMAN:
All right. Sneha, take one.
1296
00:56:52,179 --> 00:56:53,814
Mark.
1297
00:56:53,848 --> 00:56:56,217
Until very recently, there were
apps on the App Store,
1298
00:56:56,250 --> 00:56:57,919
just publicly available, uh,
1299
00:56:57,952 --> 00:57:00,688
where you could
nudify anyone using AI.
1300
00:57:01,822 --> 00:57:03,991
Bringing this into the hands of
1301
00:57:04,025 --> 00:57:06,527
your classmate,
into the hands of your stalker,
1302
00:57:06,560 --> 00:57:07,929
into the hands
of your ex-boyfriend,
1303
00:57:07,962 --> 00:57:09,897
into the hands of the person
down the street.
1304
00:57:09,931 --> 00:57:12,600
Ladies and gentlemen,
no longer can we trust
1305
00:57:12,633 --> 00:57:14,602
the footage we see
with our own eyes.
1306
00:57:14,635 --> 00:57:16,037
If you happen
to watch something,
1307
00:57:16,070 --> 00:57:18,673
say on YouTube or TikTok,
and you find it unsettling,
1308
00:57:18,706 --> 00:57:19,907
listen to that feeling.
1309
00:57:19,941 --> 00:57:21,742
-(whooping)
-For all you know,
1310
00:57:21,776 --> 00:57:23,544
-this video could be AI.
-(screaming)
1311
00:57:23,577 --> 00:57:24,812
Just a little wet.
1312
00:57:24,845 --> 00:57:26,213
It doesn't matter who you are.
1313
00:57:26,247 --> 00:57:28,683
You are equally at risk
1314
00:57:28,716 --> 00:57:30,918
of being impacted by these technologies.
1315
00:57:36,023 --> 00:57:40,261
I think sometimes when we talk
about AI, it feels very sci-fi,
1316
00:57:40,294 --> 00:57:41,696
and it feels very foreign,
1317
00:57:41,729 --> 00:57:43,564
and it feels very far out
into the future,
1318
00:57:43,597 --> 00:57:45,967
so you think, "My life
is not impacted by this."
1319
00:57:46,000 --> 00:57:48,769
Um, but if you're applying
for a job
1320
00:57:48,803 --> 00:57:51,105
and an algorithm is the reason
that you don't get the job,
1321
00:57:51,138 --> 00:57:53,207
sometimes you don't even know
that an algorithm
1322
00:57:53,240 --> 00:57:54,709
was part of that process
1323
00:57:54,742 --> 00:57:56,043
or an AI system was
part of that process.
1324
00:57:56,077 --> 00:57:58,012
You just know
that you didn't get the job.
1325
00:57:58,045 --> 00:58:00,081
And so it's not something
that you're gonna escape
1326
00:58:00,114 --> 00:58:02,049
because of privilege
or you're gonna escape
1327
00:58:02,083 --> 00:58:04,118
because you're in
a particular profession.
1328
00:58:04,151 --> 00:58:07,188
It-It's something that affects
everybody, really.
1329
00:58:07,221 --> 00:58:10,925
It may sound basic,
but how we move forward
1330
00:58:10,958 --> 00:58:15,629
in the Age of Information
is gonna be the difference
1331
00:58:15,663 --> 00:58:17,732
between whether we survive
1332
00:58:17,765 --> 00:58:20,868
or whether we become some kind
of (bleep)-up dystopia.
1333
00:58:20,901 --> 00:58:23,604
Hello. I'm not a real person,
and that's the point.
1334
00:58:23,637 --> 00:58:25,840
Again, everything
in this video is fake:
1335
00:58:25,873 --> 00:58:27,675
our voices, what we're wearing,
1336
00:58:27,708 --> 00:58:30,711
where we are, all of it, fake.
1337
00:58:30,745 --> 00:58:33,247
Generative AI could flood
1338
00:58:33,280 --> 00:58:36,884
the world with misinformation.
1339
00:58:36,917 --> 00:58:39,920
But it could also flood it
with influence campaigns.
1340
00:58:39,954 --> 00:58:42,223
That's an existential risk
to democracy.
1341
00:58:42,256 --> 00:58:44,058
The biggest and scariest
1342
00:58:44,091 --> 00:58:46,660
canary in the coal mine right now
1343
00:58:46,694 --> 00:58:48,596
comes from Slovakia.
1344
00:58:48,629 --> 00:58:50,097
-(man speaking Slovak)
-JOHN BERMAN: It's the sort of
1345
00:58:50,131 --> 00:58:52,767
deepfake dirty trick that
worries election experts,
1346
00:58:52,800 --> 00:58:55,603
particularly as AI-generated
political speech exists
1347
00:58:55,636 --> 00:58:57,071
in a kind of legal gray area.
1348
00:58:57,104 --> 00:58:59,807
-Does this sound like you?
-It does sound like me.
1349
00:58:59,840 --> 00:59:01,342
REVANUR: Slovakia had its parliamentary election
1350
00:59:01,375 --> 00:59:04,111
disrupted by an AI voice clone
1351
00:59:04,145 --> 00:59:07,081
that was actually disseminated
just before the election.
1352
00:59:07,114 --> 00:59:08,916
DAVID EVAN HARRIS:
An audio deepfake
1353
00:59:08,949 --> 00:59:11,652
was released on social media that was
1354
00:59:11,685 --> 00:59:14,288
supposedly the voice
of one of the candidates
1355
00:59:14,321 --> 00:59:17,324
talking about buying votes
and rigging the election.
1356
00:59:17,358 --> 00:59:19,060
It went viral,
1357
00:59:19,093 --> 00:59:22,830
and the candidate
who lost the election
1358
00:59:22,863 --> 00:59:26,333
was actually in support of Ukraine,
1359
00:59:26,367 --> 00:59:30,304
and the candidate who won
the election was actually...
1360
00:59:30,337 --> 00:59:31,806
DANIEL:
Pro-Russian guy.
1361
00:59:31,839 --> 00:59:33,340
It was a pro-Russian guy
who won the election.
1362
00:59:33,374 --> 00:59:34,842
(speaking Russian)
1363
00:59:34,875 --> 00:59:36,644
REBECCA JARVIS:
Putin has himself said
1364
00:59:36,677 --> 00:59:39,747
whoever wins this
artificial intelligence race
1365
00:59:39,780 --> 00:59:41,949
is essentially
the controller of humankind.
1366
00:59:41,982 --> 00:59:45,119
We do worry a lot about
authoritarian governments.
1367
00:59:46,053 --> 00:59:48,989
DAVID EVAN HARRIS:
Right now, Wall Street
1368
00:59:49,023 --> 00:59:51,392
and investors more broadly around the world
1369
00:59:51,425 --> 00:59:53,260
are driving a push.
1370
00:59:53,294 --> 00:59:56,697
They have a demand that gets
the products to market
1371
00:59:56,730 --> 00:59:58,866
that dazzle people
the most first.
1372
00:59:58,899 --> 01:00:01,769
They're not thinking about how these tools
1373
01:00:01,802 --> 01:00:03,737
could deeply undermine trust
1374
01:00:03,771 --> 01:00:06,107
and our democratic institutions.
1375
01:00:06,140 --> 01:00:08,109
HARARI:
Democracy is a system
1376
01:00:08,142 --> 01:00:10,911
to resolve disagreements between people
1377
01:00:10,945 --> 01:00:12,646
in a peaceful way,
1378
01:00:12,680 --> 01:00:15,082
but democracy is based on trust.
1379
01:00:15,116 --> 01:00:19,220
If you lose all trust,
democracy is simply impossible.
1380
01:00:21,255 --> 01:00:22,957
TRISTAN HARRIS:
Well, it's hard, right?
1381
01:00:22,990 --> 01:00:25,259
So, what are the options
available to us?
1382
01:00:25,292 --> 01:00:26,760
There's sort of two camps.
1383
01:00:26,794 --> 01:00:30,030
Like, one camp is: lock it down.
1384
01:00:30,064 --> 01:00:33,267
Let's lock this down
into a handful of AI companies
1385
01:00:33,300 --> 01:00:35,970
who will do this
in a safe and trusted way.
1386
01:00:36,003 --> 01:00:37,705
But then people worry about
1387
01:00:37,738 --> 01:00:39,273
runaway concentrations
of wealth and power.
1388
01:00:39,306 --> 01:00:42,209
Like, who would you trust to be
a million times more powerful
1389
01:00:42,243 --> 01:00:44,845
or wealthy than
every other actor in society?
1390
01:00:44,879 --> 01:00:46,881
Why should we trust you?
1391
01:00:46,914 --> 01:00:49,016
Um, you shouldn't.
1392
01:00:49,049 --> 01:00:51,785
But of course, if you do this,
this opens up
1393
01:00:51,819 --> 01:00:54,922
all these risks of
authoritarianism and tyranny.
1394
01:00:54,955 --> 01:00:57,391
It's-it's sort of
an authoritarian's dream
1395
01:00:57,424 --> 01:01:00,861
to have AI in a box
that can be applied and used
1396
01:01:00,895 --> 01:01:03,430
for ubiquitous surveillance.
1397
01:01:03,464 --> 01:01:05,933
I mean, in-in some ways,
the kind of world
1398
01:01:05,966 --> 01:01:11,405
that Orwell imagined in 1984
is unrealistic,
1399
01:01:11,438 --> 01:01:13,841
uh, unless you have AI.
1400
01:01:13,874 --> 01:01:16,443
But with AI,
that in fact is realistic.
1401
01:01:16,477 --> 01:01:18,913
Monitors every activity,
1402
01:01:18,946 --> 01:01:22,316
conversations, facial recognition.
1403
01:01:22,349 --> 01:01:26,921
What I worry about is that,
uh, these tools can scale up,
1404
01:01:26,954 --> 01:01:29,023
uh, a form of totalitarianism
1405
01:01:29,056 --> 01:01:31,892
that is cost-effective
and permanent.
1406
01:01:33,060 --> 01:01:35,863
TRISTAN HARRIS: So in response to that, some other people say,
1407
01:01:35,896 --> 01:01:37,398
"No, no, no, we should
actually let this rip.
1408
01:01:37,431 --> 01:01:39,934
"Let's decentralize this power
as much as possible.
1409
01:01:39,967 --> 01:01:42,036
"Let's let every business,
every individual,
1410
01:01:42,069 --> 01:01:44,238
"every 16-year-old,
every science lab,
1411
01:01:44,271 --> 01:01:46,473
you know, get the benefit
of the latest AI models."
1412
01:01:46,508 --> 01:01:48,475
But now you have
every terrorist group,
1413
01:01:48,510 --> 01:01:52,112
every disenfranchised person
having the power to make
1414
01:01:52,146 --> 01:01:54,181
the very worst
biological weapon.
1415
01:01:54,215 --> 01:01:56,050
Hacking infrastructure,
creating deepfakes,
1416
01:01:56,083 --> 01:01:57,818
flooding
our information environment.
1417
01:01:57,851 --> 01:02:00,955
So that creates all these risks
of sort of catastrophic harm
1418
01:02:00,988 --> 01:02:03,257
and-and societal collapse
through that direction.
1419
01:02:03,290 --> 01:02:05,092
And so we're sort of stuck
between this rock
1420
01:02:05,125 --> 01:02:08,229
and a hard place,
between "lock it up..."
1421
01:02:08,262 --> 01:02:09,897
(indistinct chatter)
1422
01:02:11,131 --> 01:02:12,900
-...or "let it rip."
-(sirens wailing)
1423
01:02:12,933 --> 01:02:14,468
(people shouting)
1424
01:02:14,501 --> 01:02:18,038
So we have to find something
like a narrow path
1425
01:02:18,072 --> 01:02:20,774
that avoids
these two negative outcomes.
1426
01:02:20,808 --> 01:02:23,277
So, if that's all true,
why wouldn't we just slow down
1427
01:02:23,310 --> 01:02:26,380
and figure all this out
before it's too late?
1428
01:02:26,413 --> 01:02:29,450
If humanity was extremely wise,
1429
01:02:29,483 --> 01:02:31,418
that's what we would do.
1430
01:02:31,452 --> 01:02:33,555
But there's, like, a different
way to face this stuff, right?
1431
01:02:33,588 --> 01:02:36,457
Which is:
What are the rules of the game?
1432
01:02:36,490 --> 01:02:39,994
A lot of, like, what CEOs do
1433
01:02:40,027 --> 01:02:42,429
is driven by
the incentives that they face.
1434
01:02:42,463 --> 01:02:45,866
It's primarily
profit-maximization incentives
1435
01:02:45,899 --> 01:02:47,468
that are driving
the development of AI.
1436
01:02:47,501 --> 01:02:50,938
Even the good guys are stuck
in this dilemma of
1437
01:02:50,971 --> 01:02:53,107
if they move too slowly,
1438
01:02:53,140 --> 01:02:55,042
then they leave themselves
vulnerable
1439
01:02:55,075 --> 01:02:57,478
to all of the other guys
who are cutting all corners.
1440
01:02:57,512 --> 01:03:02,316
All these top companies are in
a complete no-holds-barred race
1441
01:03:02,349 --> 01:03:06,153
to, as fast as possible, get
to AGI, get there right now.
1442
01:03:06,186 --> 01:03:07,855
♪ ♪
1443
01:03:08,856 --> 01:03:10,924
LADISH: Yeah, I mean, I think it, I think
1444
01:03:10,958 --> 01:03:13,460
it probably starts
with DeepMind.
1445
01:03:13,494 --> 01:03:15,195
NEWSWOMAN:
Google is buying
1446
01:03:15,229 --> 01:03:16,930
artificial intelligence firm DeepMind Technologies.
1447
01:03:16,964 --> 01:03:18,932
Terms of the deal were not disclosed.
1448
01:03:18,966 --> 01:03:21,168
ELON MUSK: Larry Page and I used to be very close friends,
1449
01:03:21,201 --> 01:03:23,170
and it became apparent to me
1450
01:03:23,203 --> 01:03:26,608
that Larry did not care
about AI safety.
1451
01:03:26,641 --> 01:03:29,276
EMILY CHANG: Elon Musk has said
you started OpenAI,
1452
01:03:29,310 --> 01:03:31,312
you both started OpenAI because
he was scared of Google.
1453
01:03:31,345 --> 01:03:32,614
LADISH: You-you basically had the foundation
1454
01:03:32,647 --> 01:03:34,181
of OpenAI come out of that.
1455
01:03:34,214 --> 01:03:35,482
"So we're gonna do it better.
We're gonna do it
1456
01:03:35,517 --> 01:03:37,818
in a safer way or in a more open way."
1457
01:03:39,019 --> 01:03:41,055
So that's what started OpenAI.
1458
01:03:41,088 --> 01:03:43,357
And so now, instead of
having one AGI project,
1459
01:03:43,390 --> 01:03:45,225
you have two AGI projects.
1460
01:03:45,259 --> 01:03:47,328
The worst possible thing
that could happen
1461
01:03:47,361 --> 01:03:50,097
is if there's
multiple AGI projects
1462
01:03:50,130 --> 01:03:52,266
done by different people
who don't like each other
1463
01:03:52,299 --> 01:03:55,202
and are all competing
to get to AGI first.
1464
01:03:55,235 --> 01:03:57,338
This would be the worst
possible thing that can happen,
1465
01:03:57,371 --> 01:04:00,407
because this would mean
that whoever is the least safe,
1466
01:04:00,441 --> 01:04:03,477
whoever sacrifices the most
on safety to get ahead
1467
01:04:03,511 --> 01:04:05,446
will be the person
that gets there first.
1468
01:04:05,479 --> 01:04:08,248
-DANIEL: That's basically
what's happening, right? -Yeah.
1469
01:04:08,282 --> 01:04:11,952
You and your brother
famously left OpenAI, uh,
1470
01:04:11,985 --> 01:04:13,320
to start Anthropic.
1471
01:04:13,354 --> 01:04:15,422
RASKIN: And then Anthropicstarted because
1472
01:04:15,456 --> 01:04:18,258
some researchers
inside of OpenAI said,
1473
01:04:18,292 --> 01:04:20,894
"I want to go off
and do it more safely."
1474
01:04:20,928 --> 01:04:24,064
You needed something in addition
to just scaling the models up,
1475
01:04:24,098 --> 01:04:26,233
which is alignment or safety.
1476
01:04:26,266 --> 01:04:28,001
HENDRYCKS:
"We are more responsible
1477
01:04:28,035 --> 01:04:30,170
or more trustworthy
or more moral."
1478
01:04:30,204 --> 01:04:32,072
Now you have three AGI projects.
1479
01:04:32,106 --> 01:04:33,575
But also sitting around
the table with you
1480
01:04:33,608 --> 01:04:35,242
are gonna be a bunch of AIs.
1481
01:04:35,275 --> 01:04:38,078
LADISH: And now Meta is-istrying to do stuff.
1482
01:04:38,112 --> 01:04:40,114
Meanwhile, a new artificial
intelligence competitor
1483
01:04:40,147 --> 01:04:41,949
announced this week, Elon Musk's...
1484
01:04:41,982 --> 01:04:45,119
HENDRYCKS: xAI, which would be Elon Musk's organization.
1485
01:04:45,152 --> 01:04:46,688
MUSK:
I don't trust OpenAI.
1486
01:04:46,721 --> 01:04:48,489
The fight between
Elon Musk and OpenAI
1487
01:04:48,523 --> 01:04:50,124
has entered a new round.
1488
01:04:50,157 --> 01:04:52,192
I don't trust Sam Altman,
uh, and I, and I don't think
1489
01:04:52,226 --> 01:04:54,529
we want to have the most
powerful AI in the world
1490
01:04:54,562 --> 01:04:56,997
controlled by someone who is not trustworthy.
1491
01:04:57,030 --> 01:04:58,999
(cheering and applause)
1492
01:04:59,032 --> 01:05:02,136
The incentive is
untold sums of money.
1493
01:05:02,169 --> 01:05:04,004
-Yes.
-Is untold power.
1494
01:05:04,037 --> 01:05:06,006
-Yes.
-Is untold control.
1495
01:05:06,039 --> 01:05:08,409
If you have something
that is a million times smarter
1496
01:05:08,442 --> 01:05:12,012
and more capable than
everything else on planet Earth
1497
01:05:12,045 --> 01:05:14,682
and no one else has that,
1498
01:05:14,716 --> 01:05:16,551
that thing is the incentive.
1499
01:05:16,584 --> 01:05:18,118
So you rule the world.
1500
01:05:19,153 --> 01:05:21,021
If you really believe this,
1501
01:05:21,054 --> 01:05:23,223
if you really in your heart
believe this,
1502
01:05:23,257 --> 01:05:24,526
then you might be
willing to take
1503
01:05:24,559 --> 01:05:26,460
quite a lot of risk
to make that happen.
1504
01:05:26,493 --> 01:05:29,329
Google has just released
its newest AI model.
1505
01:05:29,363 --> 01:05:31,699
The answer to OpenAI's ChatGPT.
1506
01:05:31,733 --> 01:05:33,434
How do we get to
this ten trillion?
1507
01:05:33,467 --> 01:05:35,502
Is NVIDIA becoming the most
valuable company in the US?
1508
01:05:35,537 --> 01:05:39,574
This is the largest business
opportunity in history.
1509
01:05:39,607 --> 01:05:41,074
In history.
1510
01:05:41,108 --> 01:05:42,710
So, the reason why
everyone's really hyped
1511
01:05:42,744 --> 01:05:45,179
about artificial intelligence
right now
1512
01:05:45,212 --> 01:05:48,415
is because the more these companies hype
1513
01:05:48,449 --> 01:05:50,417
the potential capabilities of their technology,
1514
01:05:50,451 --> 01:05:53,555
the more investment they can attract.
1515
01:05:53,588 --> 01:05:55,222
CARL QUINTANILLA:
Amazon investing up to
1516
01:05:55,255 --> 01:05:57,592
four billion dollars
in start-up Anthropic.
1517
01:05:57,625 --> 01:05:59,193
OpenAI is setting its sights
1518
01:05:59,226 --> 01:06:01,428
on a blockbuster half
a trillion dollar valuation.
1519
01:06:01,462 --> 01:06:02,597
Half a trillion!
1520
01:06:02,630 --> 01:06:03,731
The race is on.
1521
01:06:03,765 --> 01:06:05,165
This is happening
faster than ever.
1522
01:06:05,199 --> 01:06:07,167
Are we in an AI bubble?
Of course.
1523
01:06:07,201 --> 01:06:09,537
I just don't see
the bubble bursting
1524
01:06:09,571 --> 01:06:12,707
while you still have
this major spending cycle.
1525
01:06:12,740 --> 01:06:15,409
RASKIN: Even if you think this is all hype,
1526
01:06:15,442 --> 01:06:18,145
there are billions
to trillions of dollars
1527
01:06:18,178 --> 01:06:21,482
flowing into making AI systems
more powerful.
1528
01:06:21,516 --> 01:06:24,351
And once you have that thing
that's more powerful,
1529
01:06:24,384 --> 01:06:25,753
companies can use that
1530
01:06:25,787 --> 01:06:27,755
to get bigger profits and to make more money.
1531
01:06:27,789 --> 01:06:30,491
Countries can use that to make stronger militaries.
1532
01:06:30,525 --> 01:06:33,293
NVIDIA has overtaken
Microsoft and Apple
1533
01:06:33,327 --> 01:06:36,564
to become the world's
most valuable company.
1534
01:06:36,598 --> 01:06:39,399
The race to deploy becomes
the race to recklessness,
1535
01:06:39,433 --> 01:06:42,302
because they can't
deploy it that quickly
1536
01:06:42,336 --> 01:06:43,538
and also get it right.
1537
01:06:43,571 --> 01:06:46,440
They believe that
they're the good guys.
1538
01:06:46,473 --> 01:06:49,611
"And if I don't do it,
somebody who doesn't have
1539
01:06:49,644 --> 01:06:51,779
"as good values as me
will be sitting at the table
1540
01:06:51,813 --> 01:06:55,082
getting to make decisions, so
I have an obligation to do it."
1541
01:06:55,115 --> 01:06:56,551
Yes, this is
a very commonly held belief.
1542
01:06:56,584 --> 01:06:59,152
Many, many, many,
maybe most of the people
1543
01:06:59,186 --> 01:07:01,255
building this technology
believe that.
1544
01:07:01,288 --> 01:07:04,258
I'm worried about
the commercial competition,
1545
01:07:04,291 --> 01:07:06,393
but it turns out
I'm even more worried about
1546
01:07:06,426 --> 01:07:08,428
the geopolitical competition.
1547
01:07:08,462 --> 01:07:10,665
We were eight years behind
a year ago.
1548
01:07:10,698 --> 01:07:12,800
Now we're probably
less than one year behind.
1549
01:07:12,834 --> 01:07:16,236
Well, both Saudi Arabia
and the UAE have been racing
1550
01:07:16,270 --> 01:07:18,673
to set up data centers
and position themselves
1551
01:07:18,706 --> 01:07:20,675
as the dominant force in AI...
1552
01:07:20,708 --> 01:07:22,710
...to make South Korea
a global AI leader.
1553
01:07:22,744 --> 01:07:24,411
JULIA SIEGER:
French President Emmanuel Macron
1554
01:07:24,444 --> 01:07:26,380
spoke a lot about
artificial intelligence...
1555
01:07:26,413 --> 01:07:27,715
Now with more on Israel's role
1556
01:07:27,749 --> 01:07:29,817
in the artificial intelligence
revolution...
1557
01:07:29,851 --> 01:07:33,487
Countries will be competing
for whose AI technologies
1558
01:07:33,521 --> 01:07:35,355
create the next generation
of industry.
1559
01:07:35,389 --> 01:07:38,626
The Chinese are insisting
that AI,
1560
01:07:38,660 --> 01:07:40,360
as being developed in China,
1561
01:07:40,394 --> 01:07:42,095
reinforce the core values
1562
01:07:42,129 --> 01:07:44,431
of the Chinese Communist Party and the Chinese system.
1563
01:07:44,464 --> 01:07:49,269
America has to beat China
in the AI race.
1564
01:07:49,303 --> 01:07:51,639
DANIEL: China's, like,
light-years behind.
1565
01:07:51,673 --> 01:07:53,240
Are they?
1566
01:07:53,273 --> 01:07:55,442
I mean, they have way more
training data than we do,
1567
01:07:55,475 --> 01:07:58,312
and there's nothing saying they
don't drop a model next month
1568
01:07:58,345 --> 01:08:01,415
that isn't--
doesn't far outperform GPT-4.
1569
01:08:01,448 --> 01:08:04,318
Everything was going just fine.
What could go wrong?
1570
01:08:04,351 --> 01:08:07,254
-DeepSeek... (speaking Chinese)
-DeepSeek... -DeepSeek...
1571
01:08:07,287 --> 01:08:08,455
There is a new model.
1572
01:08:08,488 --> 01:08:11,325
But from a Chinese lab
called DeepSeek.
1573
01:08:11,358 --> 01:08:14,428
Let's talk about DeepSeek,
because it is mind-blowing,
1574
01:08:14,461 --> 01:08:18,231
and it is shaking this
entire industry to its core.
1575
01:08:18,265 --> 01:08:19,701
The Trump administration
will ensure
1576
01:08:19,734 --> 01:08:23,470
that the most powerful
AI systems are built in the US
1577
01:08:23,503 --> 01:08:27,174
with American designed
and manufactured chips.
1578
01:08:27,207 --> 01:08:29,343
AI is China's
Apollo project.
1579
01:08:29,376 --> 01:08:30,745
The Chinese Communist Party
1580
01:08:30,778 --> 01:08:32,179
deeply understands the potential
1581
01:08:32,212 --> 01:08:33,380
for AI to disrupt warfare.
1582
01:08:33,413 --> 01:08:35,650
Google no longer promises
that it will not
1583
01:08:35,683 --> 01:08:38,886
use artificial intelligence
for weapons or surveillance.
1584
01:08:38,920 --> 01:08:40,487
China, North Korea, Russia
1585
01:08:40,521 --> 01:08:42,222
are gonna keep building it
as fast as possible
1586
01:08:42,255 --> 01:08:44,792
to get more economic advantage, more productivity advantage,
1587
01:08:44,826 --> 01:08:47,361
more scientific advantage, more military advantage,
1588
01:08:47,394 --> 01:08:48,896
'cause AI makes better weapons.
1589
01:08:48,930 --> 01:08:52,299
If you're talking about
a system as broad and capable
1590
01:08:52,332 --> 01:08:53,635
as a brilliant scientist,
1591
01:08:53,668 --> 01:08:58,271
it might be able to run
a military campaign
1592
01:08:58,305 --> 01:09:01,743
better than any of the generals
in the US government right now.
1593
01:09:01,776 --> 01:09:06,213
Taiwan is the issue creating
the most tension right now
1594
01:09:06,246 --> 01:09:08,382
between Beijing and the US.
1595
01:09:08,415 --> 01:09:10,918
There's a-a direct
oppositional disagreement
1596
01:09:10,952 --> 01:09:12,452
between the US and China
1597
01:09:12,486 --> 01:09:14,454
on whether Taiwan is part of China.
1598
01:09:14,488 --> 01:09:18,926
MATHENY: Taiwan is important for so many reasons.
1599
01:09:18,960 --> 01:09:20,828
It is also the home of
1600
01:09:20,862 --> 01:09:24,565
about 90% of advanced chip
manufacturing for the world,
1601
01:09:24,599 --> 01:09:28,903
which means that the supply chain for advanced compute
1602
01:09:28,936 --> 01:09:31,506
is at risk should there be
1603
01:09:31,539 --> 01:09:33,440
some sort of scenario around Taiwan,
1604
01:09:33,473 --> 01:09:37,712
whether a-a blockade or an invasion by China.
1605
01:09:38,311 --> 01:09:40,948
HENDRYCKS:
Are we going to be in some race
1606
01:09:40,982 --> 01:09:45,853
between the US and China
that eventually devolves into
1607
01:09:45,887 --> 01:09:49,891
a militarized AI arms race and, you know, potentially leads
1608
01:09:49,924 --> 01:09:51,659
to some great power conflict?
1609
01:09:51,693 --> 01:09:53,493
NICK SCHIFRIN: The outgoing top US military commander
1610
01:09:53,528 --> 01:09:56,396
in the region predicted war was coming.
1611
01:09:56,430 --> 01:09:58,432
I think the threat is manifest
during this decade,
1612
01:09:58,465 --> 01:09:59,967
in fact, in the next six years.
1613
01:10:00,001 --> 01:10:01,769
MATHENY:
Um, one of the scenarios
1614
01:10:01,803 --> 01:10:06,406
that I worry about is
a cyber flash war,
1615
01:10:06,440 --> 01:10:11,445
um, one in which cyber tools that are autonomous
1616
01:10:11,478 --> 01:10:13,447
are competing against each other
1617
01:10:13,480 --> 01:10:15,248
in ways that are escalatory
1618
01:10:15,282 --> 01:10:17,317
without
meaningful human control.
1619
01:10:19,386 --> 01:10:20,755
HARARI:
You live in a war zone,
1620
01:10:20,788 --> 01:10:24,357
it will be an AI deciding
whether to bomb your house
1621
01:10:24,391 --> 01:10:26,326
and whether to kill you.
1622
01:10:26,359 --> 01:10:27,995
Let's have the machine decide.
1623
01:10:28,029 --> 01:10:29,731
That's the temptation.
1624
01:10:29,764 --> 01:10:32,867
That's why AI is so tempting for militaries to adopt,
1625
01:10:32,900 --> 01:10:36,003
to create autonomous weapons, because if I start believing
1626
01:10:36,037 --> 01:10:39,607
that my military adversary is gonna adopt AI,
1627
01:10:39,640 --> 01:10:42,375
it'll be a race for who can pull the trigger faster.
1628
01:10:42,409 --> 01:10:45,445
And the one that automates
that decision,
1629
01:10:45,479 --> 01:10:47,615
rather than having
a human in the loop,
1630
01:10:47,648 --> 01:10:49,817
is the one
that will win that war.
1631
01:10:49,851 --> 01:10:51,284
And if you think about
1632
01:10:51,318 --> 01:10:53,755
the nuclear arms race
in the 1940s...
1633
01:10:54,689 --> 01:10:56,891
...you know the Germans
are working on the bomb,
1634
01:10:56,924 --> 01:11:01,361
so it's not so easy to tell
Robert Oppenheimer then,
1635
01:11:01,394 --> 01:11:04,031
"Uh, y-you know, slow down."
1636
01:11:04,065 --> 01:11:05,900
So after watching this movie,
it's gonna be confusing,
1637
01:11:05,933 --> 01:11:07,300
'cause you're gonna go back,
1638
01:11:07,334 --> 01:11:08,736
and tomorrow
you're gonna use ChatGPT,
1639
01:11:08,770 --> 01:11:10,772
and it's gonna be
unbelievably helpful.
1640
01:11:10,805 --> 01:11:12,974
And I will use it, too, and
it'll be unbelievably helpful.
1641
01:11:13,007 --> 01:11:16,611
And you'll say, "Wait, so
I just saw this movie about AI
1642
01:11:16,644 --> 01:11:18,513
"and existential risk
and all these things,
1643
01:11:18,546 --> 01:11:20,915
and where's the threat again?"
1644
01:11:20,948 --> 01:11:24,986
And it's not that ChatGPT is
the existential threat.
1645
01:11:25,019 --> 01:11:29,322
It's that the race to deploy
the most powerful,
1646
01:11:29,356 --> 01:11:32,093
inscrutable,
uncontrollable technology,
1647
01:11:32,126 --> 01:11:35,630
under the worst
incentives possible,
1648
01:11:35,663 --> 01:11:37,732
that's the existential threat.
1649
01:11:38,398 --> 01:11:40,935
DANIEL:
But it strikes me that-that
1650
01:11:40,968 --> 01:11:43,671
we are in a context of a race.
1651
01:11:43,704 --> 01:11:46,473
-SUTSKEVER: Yes.
-And it is in a competitive,
1652
01:11:46,507 --> 01:11:48,109
"got to get there first,
got to win the race" setting,
1653
01:11:48,142 --> 01:11:49,544
"got to compete against China,
1654
01:11:49,577 --> 01:11:50,945
got to compete against
the other labs."
1655
01:11:50,978 --> 01:11:52,412
Isn't that right?
1656
01:11:52,445 --> 01:11:54,115
Today it is the case.
1657
01:11:54,148 --> 01:11:57,552
So we need to change
that race dynamic.
1658
01:11:58,519 --> 01:12:00,487
Don't you think?
1659
01:12:00,521 --> 01:12:02,857
I think that would be
very good indeed.
1660
01:12:04,158 --> 01:12:06,928
I-I think
there's some kind of...
1661
01:12:06,961 --> 01:12:10,064
mysticism around AI
that makes it feel like
1662
01:12:10,097 --> 01:12:11,999
it fell from the sky.
1663
01:12:12,033 --> 01:12:14,001
DANIEL: Thinking of this
as a godlike technology
1664
01:12:14,035 --> 01:12:15,970
is a problem.
1665
01:12:16,003 --> 01:12:17,672
Yeah.
1666
01:12:18,773 --> 01:12:20,473
Why?
1667
01:12:20,508 --> 01:12:23,544
It gives license
to the companies to...
1668
01:12:23,578 --> 01:12:26,714
(laughs)
not take responsibility
1669
01:12:26,747 --> 01:12:31,418
for the things that the software
that they built does.
1670
01:12:31,451 --> 01:12:35,590
If people made that connection,
1671
01:12:35,623 --> 01:12:37,859
maybe that would, like,
help them understand again
1672
01:12:37,892 --> 01:12:42,462
the dynamics of
all the money and power and...
1673
01:12:42,495 --> 01:12:45,933
chaos that's happening to...
1674
01:12:45,967 --> 01:12:47,602
create this technology.
1675
01:12:47,635 --> 01:12:49,737
It's, like, five guys
who control it, right?
1676
01:12:49,770 --> 01:12:52,139
-Like, five men?
-Basically, yeah.
1677
01:12:52,173 --> 01:12:54,642
-The CEOs?
-Yeah.
1678
01:12:54,675 --> 01:12:55,977
Okay.
1679
01:12:57,178 --> 01:12:59,580
♪ ♪
1680
01:13:01,616 --> 01:13:03,618
DANIEL:
Sometimes it feels like
1681
01:13:03,651 --> 01:13:07,755
I've been on this just, like, endless journey
1682
01:13:07,788 --> 01:13:09,757
of trying to understand this.
1683
01:13:09,790 --> 01:13:12,994
You know, like
I'm climbing a mountain...
1684
01:13:15,930 --> 01:13:19,634
...and every time I, like, get up a hill,
1685
01:13:19,667 --> 01:13:21,035
I think I've reached the top,
1686
01:13:21,068 --> 01:13:24,005
but it just keeps going and going and going.
1687
01:13:24,038 --> 01:13:25,072
(camera clicks, whirs)
1688
01:13:25,106 --> 01:13:27,174
But from everything I've heard,
1689
01:13:27,208 --> 01:13:30,778
if I had to guess what's
at the top of Anxiety Mountain,
1690
01:13:30,811 --> 01:13:34,682
it's these five CEOs from these five companies,
1691
01:13:34,715 --> 01:13:37,952
who are kind of like the
Oppenheimers of this moment.
1692
01:13:37,985 --> 01:13:40,054
These are the guys
who are building this thing.
1693
01:13:40,087 --> 01:13:41,454
Yeah.
1694
01:13:41,488 --> 01:13:44,659
Like, is there a plan?
1695
01:13:44,692 --> 01:13:46,827
The head of the company
that makes ChatGPT
1696
01:13:46,861 --> 01:13:50,531
warned of possible
significant harm to the world.
1697
01:13:50,564 --> 01:13:53,000
I think, if this technology
goes wrong,
1698
01:13:53,034 --> 01:13:54,068
it can go quite wrong.
1699
01:13:54,101 --> 01:13:56,237
(bursts of distinct chatter)
1700
01:13:56,270 --> 01:13:58,072
DANIEL:
Five dudes.
1701
01:13:58,105 --> 01:14:01,943
I-I never thought about us
as a social media company.
1702
01:14:01,976 --> 01:14:06,814
It-it feels like I have to try
and-and find these guys...
1703
01:14:06,847 --> 01:14:07,515
Mm-hmm.
1704
01:14:07,548 --> 01:14:09,784
...and get them in the movie.
1705
01:14:10,584 --> 01:14:13,554
Certainly we'd get
some clarity from that.
1706
01:14:14,188 --> 01:14:15,723
(camera clicks, whirs)
1707
01:14:16,590 --> 01:14:19,860
I mean, the buck's got to stop somewhere, right?
1708
01:14:19,894 --> 01:14:21,862
(wind whistling)
1709
01:14:21,896 --> 01:14:23,731
♪ ♪
1710
01:14:24,398 --> 01:14:26,734
(tools whirring and clicking)
1711
01:14:27,668 --> 01:14:29,870
(whooshing)
1712
01:14:34,608 --> 01:14:35,810
DANIEL:
How's that feel?
1713
01:14:35,843 --> 01:14:38,212
-Great.
-That okay?
1714
01:14:38,245 --> 01:14:39,847
-Yeah.
-(indistinct crew chatter)
1715
01:14:39,880 --> 01:14:41,582
-Can I move the seat forward?
-DANIEL: Yes, you can.
1716
01:14:41,615 --> 01:14:42,717
It's, uh, it's just
a bit awkward.
1717
01:14:42,750 --> 01:14:43,985
I'm a little, like...
1718
01:14:44,018 --> 01:14:45,619
Dario, how do you feel
about that chair?
1719
01:14:45,653 --> 01:14:47,588
Um, yeah, the chair's good.
1720
01:14:47,621 --> 01:14:49,290
-I just wanted to move it
a little forward. -Okay.
1721
01:14:49,323 --> 01:14:51,125
I picked out that chair.
1722
01:14:51,158 --> 01:14:52,727
-It's a good chair.
-Thanks.
1723
01:14:52,760 --> 01:14:54,562
(cell phone ringing)
1724
01:14:58,332 --> 01:15:00,500
TED:
And just sit down there.
1725
01:15:02,036 --> 01:15:03,104
And then through here,
1726
01:15:03,137 --> 01:15:05,006
we'll be looking at each other
through this glass.
1727
01:15:05,039 --> 01:15:07,074
(busy signal beeping)
1728
01:15:11,879 --> 01:15:13,948
And sitting back in the chair,
leaning forward?
1729
01:15:13,981 --> 01:15:15,683
-DANIEL: Yeah. Well, whatever's
comfortable, I think. -Okay.
1730
01:15:15,716 --> 01:15:17,018
It's kind of cool.
There's a bunch of mirrors.
1731
01:15:17,051 --> 01:15:18,285
-Is that-- okay.
-WOMAN: Sam A., take one. Mark.
1732
01:15:18,319 --> 01:15:19,920
-How's that?
-Good. Thank you.
1733
01:15:19,954 --> 01:15:21,622
DANIEL: The genesis
of this project is that I was
1734
01:15:21,655 --> 01:15:23,290
sitting at home,
and I was playing around with,
1735
01:15:23,324 --> 01:15:24,859
I think, your image generator,
1736
01:15:24,892 --> 01:15:27,762
and I was simultaneously,
uh, terrified
1737
01:15:27,795 --> 01:15:29,897
-and really impressed.
-(chuckles softly)
1738
01:15:29,930 --> 01:15:32,333
-That is the usual combination.
-And then cut to
1739
01:15:32,366 --> 01:15:35,002
my wife and I find out
we're expecting.
1740
01:15:35,036 --> 01:15:38,773
And-and I'm, like, having
an existential crisis
1741
01:15:38,806 --> 01:15:40,775
as my wife is
six months pregnant.
1742
01:15:40,808 --> 01:15:43,577
And I think my first question,
which I ask everybody:
1743
01:15:43,611 --> 01:15:46,080
Is now a terrible time
to have a kid?
1744
01:15:46,113 --> 01:15:47,848
Am I making a big mistake?
1745
01:15:47,882 --> 01:15:49,650
I'm expecting a kid
in March, too. My first one.
1746
01:15:49,683 --> 01:15:50,918
-You're expecting a kid?
-Yeah.
1747
01:15:50,951 --> 01:15:52,319
-You're expecting in March?
-First kid.
1748
01:15:52,353 --> 01:15:53,788
-Mazel tov.
-Thank you very much.
1749
01:15:53,821 --> 01:15:54,922
I've never been so excited
for anything.
1750
01:15:54,955 --> 01:15:57,091
-That's how I feel.
-Yeah.
1751
01:15:57,124 --> 01:15:58,759
But you're not scared?
1752
01:15:58,793 --> 01:16:03,064
I mean, like, having a kid is
just this momentous thing
1753
01:16:03,097 --> 01:16:05,633
that I, you know,
I stay up every night, like,
1754
01:16:05,666 --> 01:16:07,334
reading these books about
how to raise a kid,
1755
01:16:07,368 --> 01:16:08,769
and I hope
I'm gonna do a good job,
1756
01:16:08,803 --> 01:16:09,804
and it feels
very overwhelming and--
1757
01:16:09,837 --> 01:16:12,673
but I'm not scared...
1758
01:16:12,706 --> 01:16:14,675
for kids to grow up
in a world with AI.
1759
01:16:14,708 --> 01:16:16,811
Like, that's... that'll be okay.
1760
01:16:16,844 --> 01:16:19,280
That is good to hear
coming from the guy.
1761
01:16:19,313 --> 01:16:21,082
I think it's a wonderful idea
to have kids.
1762
01:16:21,115 --> 01:16:23,784
I-I think they're the most
magical, incredible thing.
1763
01:16:23,818 --> 01:16:25,820
-(sighs heavily)
-There's so much uncertainty,
1764
01:16:25,853 --> 01:16:27,188
I would almost just do
what you're gonna do anyway.
1765
01:16:27,221 --> 01:16:29,223
I know that's not
a very satisfying answer,
1766
01:16:29,256 --> 01:16:32,359
but it's the only one
I can come up with.
1767
01:16:32,393 --> 01:16:34,929
Our kids are never gonna...
1768
01:16:34,962 --> 01:16:37,331
know a world that doesn't have
really advanced AI.
1769
01:16:37,364 --> 01:16:40,167
In fact, our kids will never
be smarter than an AI.
1770
01:16:40,201 --> 01:16:42,703
DANIEL: "Our kids will
never be smarter than an AI"?
1771
01:16:42,736 --> 01:16:44,839
Well, from a raw,
from a raw IQ perspective,
1772
01:16:44,872 --> 01:16:46,807
they will never be
smarter than an AI.
1773
01:16:46,841 --> 01:16:48,309
That notion doesn't
unsettle you a little bit?
1774
01:16:48,342 --> 01:16:51,712
'Cause it makes me feel
a little queasy in a weird way.
1775
01:16:51,745 --> 01:16:55,316
Um, it does unsettle me
a little bit.
1776
01:16:55,349 --> 01:16:56,817
But it is reality.
1777
01:16:56,851 --> 01:16:58,819
DANIEL:
Okay, so race dynamics.
1778
01:16:58,853 --> 01:17:01,655
We have a bunch of people
who are all in agreement
1779
01:17:01,689 --> 01:17:03,190
that this is scary.
1780
01:17:03,224 --> 01:17:05,126
Based on the timelines that
a lot of people have given me,
1781
01:17:05,159 --> 01:17:07,661
we have between two months and
five years to figure this out.
1782
01:17:07,695 --> 01:17:09,063
-Yeah.
-And-and, I guess,
1783
01:17:09,096 --> 01:17:13,767
my biggest question is:
Why can't we just stop?
1784
01:17:14,768 --> 01:17:19,340
The problem with, uh, uh,
"just stopping" is that, um,
1785
01:17:19,373 --> 01:17:22,776
there are many, many groups now
around the world, uh, uh,
1786
01:17:22,810 --> 01:17:26,113
building this, for
many nations, many companies,
1787
01:17:26,147 --> 01:17:28,749
um, all with
different motivations.
1788
01:17:28,782 --> 01:17:31,152
There are some of
these companies in this space
1789
01:17:31,185 --> 01:17:34,455
who their position is, "We want
to develop this technology
1790
01:17:34,488 --> 01:17:36,323
absolutely as fast as possible."
1791
01:17:36,357 --> 01:17:40,694
And even if we could pass laws
in the US and in Europe,
1792
01:17:40,728 --> 01:17:42,897
we need to convince...
1793
01:17:42,930 --> 01:17:45,266
Xi Jinping and Vladimir Putin
or, you know,
1794
01:17:45,299 --> 01:17:47,968
whoever their scientific
advisors are on their side.
1795
01:17:48,002 --> 01:17:49,837
That's gonna be really hard.
1796
01:17:49,870 --> 01:17:51,238
I think it is true
1797
01:17:51,272 --> 01:17:54,041
that if two people are
in exactly the same place,
1798
01:17:54,074 --> 01:17:56,410
uh, the one willing to take
more shortcuts on safety
1799
01:17:56,443 --> 01:17:58,746
should kind of
"get there first."
1800
01:17:58,779 --> 01:18:01,715
Uh, but...
1801
01:18:01,749 --> 01:18:03,284
we're able to use our lead
1802
01:18:03,317 --> 01:18:06,720
to spend a lot more time
doing safety testing.
1803
01:18:06,754 --> 01:18:08,455
DANIEL:
And-and what if you lose it?
1804
01:18:08,489 --> 01:18:10,057
You get the call
and you find out
1805
01:18:10,090 --> 01:18:12,092
that you're now, let's say,
six months behind.
1806
01:18:12,126 --> 01:18:14,828
-Wh-What happens then?
-Depends who we're behind to.
1807
01:18:14,862 --> 01:18:17,364
If it's, like,
a adversarial government,
1808
01:18:17,398 --> 01:18:19,200
that's probably really bad.
1809
01:18:19,233 --> 01:18:21,902
So let's say you get the call
that China has a,
1810
01:18:21,936 --> 01:18:25,172
has a recursively
self-improving agent
1811
01:18:25,206 --> 01:18:27,808
or something like that that we
should be really worried about.
1812
01:18:27,841 --> 01:18:30,010
What do-- what-what happens?
1813
01:18:31,478 --> 01:18:33,113
Um...
1814
01:18:35,950 --> 01:18:37,484
That case would require...
1815
01:18:37,519 --> 01:18:41,088
the first step there would be
to talk to the US government.
1816
01:18:41,121 --> 01:18:43,457
Sam, do you trust the
government's ability to handle
1817
01:18:43,490 --> 01:18:45,292
something like this?
1818
01:18:46,427 --> 01:18:48,162
Yeah, I do, actually.
1819
01:18:48,195 --> 01:18:50,030
There's other things I don't
trust the government to handle,
1820
01:18:50,064 --> 01:18:51,832
but that particular scenario,
I think they would know...
1821
01:18:51,865 --> 01:18:53,167
they-they-- yes.
1822
01:18:53,200 --> 01:18:54,969
DANIEL: When you have,
like, private discussions
1823
01:18:55,002 --> 01:18:56,937
with the other guys whose
fingers are on the trigger,
1824
01:18:56,971 --> 01:19:00,241
so to speak, um,
do those private discussions
1825
01:19:00,274 --> 01:19:01,875
fill you with confidence
1826
01:19:01,909 --> 01:19:03,545
or do they make you,
uh, more anxious?
1827
01:19:03,578 --> 01:19:07,414
Look, I mean, I-I know some
of them better than others.
1828
01:19:07,448 --> 01:19:10,417
Um, I have more confidence
in some of them th-than I have
1829
01:19:10,451 --> 01:19:12,920
in others, you know,
as with, as with any people.
1830
01:19:12,953 --> 01:19:14,388
You know,
you think about, you know,
1831
01:19:14,421 --> 01:19:16,223
the kids in your high school
class or something, right?
1832
01:19:16,257 --> 01:19:18,526
You know, some of them are,
you know, really sweet.
1833
01:19:18,560 --> 01:19:20,961
Some of them are well-meaning
but not that effective.
1834
01:19:20,995 --> 01:19:22,930
Some of them are,
you know, bullies.
1835
01:19:22,963 --> 01:19:24,198
Some of them are
really bad people.
1836
01:19:24,231 --> 01:19:26,300
(stammers)
You-you kind of really,
1837
01:19:26,333 --> 01:19:27,535
you really see the spread.
1838
01:19:27,569 --> 01:19:29,236
You know, am I, am I,
am I confident
1839
01:19:29,270 --> 01:19:31,071
that everyone's gonna do
the right thing,
1840
01:19:31,105 --> 01:19:32,540
that it's all gonna work out?
1841
01:19:32,574 --> 01:19:34,241
Um, no, I'm not.
1842
01:19:34,275 --> 01:19:36,010
And there's-there's nothing
I can do about that, right?
1843
01:19:36,043 --> 01:19:38,379
You know, all-all I can do
is-is push for the,
1844
01:19:38,412 --> 01:19:39,813
push for the, you know,
1845
01:19:39,847 --> 01:19:41,181
push for the government
to get involved.
1846
01:19:41,215 --> 01:19:43,150
But ultimately I'm just
one person there, too.
1847
01:19:43,183 --> 01:19:44,586
That's--
it's-it's up to all of us
1848
01:19:44,619 --> 01:19:46,353
to push for the government
to get involved.
1849
01:19:46,387 --> 01:19:49,023
That's the number one thing
that-that I think we need to do
1850
01:19:49,056 --> 01:19:51,292
to-to, you know, to set things
in the right direction.
1851
01:19:51,325 --> 01:19:53,160
What makes me anxious
about that is, like,
1852
01:19:53,193 --> 01:19:56,463
the basic reality that the
speed at which the technology
1853
01:19:56,497 --> 01:19:58,966
is proliferating and growing
is exponential,
1854
01:19:58,999 --> 01:20:02,002
and the mechanisms to legislate
are 300 years old,
1855
01:20:02,036 --> 01:20:03,304
-take forever.
-Yeah.
1856
01:20:03,337 --> 01:20:05,573
Um, I-I think
it's gonna be a heavy lift.
1857
01:20:05,607 --> 01:20:07,274
I-I definitely agree
with you on that.
1858
01:20:07,308 --> 01:20:09,276
DANIEL: You know,
what I'm literally looking for
1859
01:20:09,310 --> 01:20:11,513
is-is like,
"Here are steps that, like,
1860
01:20:11,546 --> 01:20:15,449
"the head honchos are gonna
take to-to focus on safety,
1861
01:20:15,482 --> 01:20:18,586
to mitigate the peril
and maximize the promise."
1862
01:20:18,620 --> 01:20:20,588
And I don't,
I don't, I don't know
1863
01:20:20,622 --> 01:20:22,423
that there's
a simple answer to that.
1864
01:20:22,456 --> 01:20:24,091
Um...
1865
01:20:26,360 --> 01:20:28,262
I mean, this maybe is
too simple,
1866
01:20:28,295 --> 01:20:30,632
but you...
1867
01:20:30,665 --> 01:20:32,333
create a new model,
1868
01:20:32,366 --> 01:20:35,202
you study and test it
very carefully.
1869
01:20:35,235 --> 01:20:37,871
You put it out
into the world gradually,
1870
01:20:37,905 --> 01:20:40,074
and then, more and more,
you understand
1871
01:20:40,107 --> 01:20:41,875
if that's safe or not,
and then if it is,
1872
01:20:41,909 --> 01:20:43,377
you can take the next step.
1873
01:20:44,445 --> 01:20:47,414
It doesn't sound as flashy
as, like, a brilliant scientist
1874
01:20:47,448 --> 01:20:50,652
coming up with one idea in a
lab to make an AI system, like,
1875
01:20:50,685 --> 01:20:54,355
perfectly safe and controllable
and everything else,
1876
01:20:54,388 --> 01:20:56,190
but it is what I believe
is gonna happen.
1877
01:20:56,223 --> 01:20:57,559
Like, it is the way
I think this works.
1878
01:20:57,592 --> 01:20:59,561
But let's just say
something terrible happens,
1879
01:20:59,594 --> 01:21:02,429
like a model gets loose
or goes rogue or something.
1880
01:21:02,463 --> 01:21:04,031
Is there a protocol?
1881
01:21:04,064 --> 01:21:06,266
Like, literally,
I'm imagining a red phone.
1882
01:21:06,300 --> 01:21:07,635
-Yeah.
-Sorry for thinking of this
1883
01:21:07,669 --> 01:21:09,103
in terms of movies, but, like...
1884
01:21:09,136 --> 01:21:10,371
There is a protocol.
1885
01:21:10,404 --> 01:21:12,039
-Is there a red phone
on your desk? -No.
1886
01:21:12,072 --> 01:21:13,508
(laughs):
Is it a secret?
1887
01:21:13,541 --> 01:21:15,643
I mean, uh, no, it's not,
it's not as fancy or dramatic
1888
01:21:15,677 --> 01:21:17,512
as you, like, would hope,
but there's, like, you know--
1889
01:21:17,545 --> 01:21:19,279
we've, like, thought through
these scenarios,
1890
01:21:19,313 --> 01:21:21,081
and if this happens,
we're gonna call these people
1891
01:21:21,115 --> 01:21:22,349
in this order and do this
1892
01:21:22,383 --> 01:21:24,017
and kind of make
these decisions if, like--
1893
01:21:24,051 --> 01:21:26,387
I do believe that
when you have an opportunity
1894
01:21:26,420 --> 01:21:29,423
to do your thinking before
a stressful situation happens,
1895
01:21:29,456 --> 01:21:30,991
that's almost always
a good idea.
1896
01:21:31,024 --> 01:21:32,226
And writing it down is helpful.
1897
01:21:32,259 --> 01:21:34,128
-Being prepared is helpful.
-Yeah.
1898
01:21:34,962 --> 01:21:36,397
(sighs) You...
1899
01:21:36,430 --> 01:21:39,233
It would be impossible
for me to sit across from you
1900
01:21:39,266 --> 01:21:41,669
and-and ask you to promise me
that this is gonna go well?
1901
01:21:41,703 --> 01:21:43,504
That is impossible.
1902
01:21:44,104 --> 01:21:46,173
There isn't any easy answers,
unfortunately.
1903
01:21:46,206 --> 01:21:48,442
Uh, because it's such
a cutting-edge technology,
1904
01:21:48,475 --> 01:21:50,244
um, there's still
a lot of unknowns.
1905
01:21:50,277 --> 01:21:52,714
And I think that
that-that needs to be,
1906
01:21:52,747 --> 01:21:55,416
um, uh, you know, understood
1907
01:21:55,449 --> 01:21:59,554
and-and hence the need
for, uh, uh, some caution.
1908
01:21:59,587 --> 01:22:01,288
I wake up, you know, every day,
1909
01:22:01,321 --> 01:22:04,024
this is the, this is the
number one thing I think about.
1910
01:22:04,057 --> 01:22:05,459
Now, look, I'm human,
1911
01:22:05,492 --> 01:22:08,596
and, you know, has-has
every decision been perfect?
1912
01:22:08,630 --> 01:22:12,166
Can I even say my motivations
were always perfectly clear?
1913
01:22:12,199 --> 01:22:14,101
Of course not.
No one can say that.
1914
01:22:14,134 --> 01:22:17,004
Like, that's-that's
just not, like, you know--
1915
01:22:17,037 --> 01:22:19,173
that's-that's just not
how people work.
1916
01:22:19,206 --> 01:22:21,408
The-the history of science
tends to be that,
1917
01:22:21,442 --> 01:22:23,745
for better or for worse,
if something's possible to do--
1918
01:22:23,778 --> 01:22:27,414
and we now know AI is possible
to do-- humanity does it.
1919
01:22:27,448 --> 01:22:30,217
All of this
was-was going to happen.
1920
01:22:30,250 --> 01:22:32,587
This-this train
isn't gonna stop.
1921
01:22:32,620 --> 01:22:34,722
You can't step in front of
the train and stop it.
1922
01:22:34,756 --> 01:22:36,724
You're just gonna get squished.
1923
01:22:36,758 --> 01:22:39,226
ALTMAN:
I mean, it's very stressful.
1924
01:22:40,194 --> 01:22:41,462
You know, there's, like,
1925
01:22:41,495 --> 01:22:43,197
a lot of things
a lot of us don't know.
1926
01:22:43,230 --> 01:22:47,000
I think the history
of scientific discovery is
1927
01:22:47,034 --> 01:22:48,736
one of not knowing
what you don't know
1928
01:22:48,770 --> 01:22:50,204
and figuring out as you go.
1929
01:22:50,237 --> 01:22:53,575
Uh, but, yeah, it is a...
1930
01:22:53,608 --> 01:22:55,976
it is a stressful way to live.
1931
01:22:56,811 --> 01:22:59,246
DANIEL: Right. Sam, thank you
very much for doing this.
1932
01:22:59,279 --> 01:23:01,081
-And again, mazel tov.
-Thank you. And to you.
1933
01:23:01,114 --> 01:23:03,383
Thank you so much. Thanks, guys.
1934
01:23:03,417 --> 01:23:06,253
♪ ♪
1935
01:23:08,388 --> 01:23:09,990
(sighs)
1936
01:23:11,593 --> 01:23:13,528
(electricity crackling)
1937
01:23:13,561 --> 01:23:16,363
-(thunder cracks)
-(whooshing, grunting)
1938
01:23:43,190 --> 01:23:45,392
-(VCR clicks)
-(line ringing)
1939
01:23:48,328 --> 01:23:49,797
-(line clicks)
-KEVIN ROHER: Hello.
1940
01:23:49,831 --> 01:23:51,131
DANIEL:
Hey, Dad, how are you?
1941
01:23:51,164 --> 01:23:52,165
KEVIN:
Good. How you doing?
1942
01:23:52,199 --> 01:23:53,500
You working? What are you up to?
1943
01:23:53,535 --> 01:23:55,770
DANIEL:
I'm working on this AI film.
1944
01:23:55,803 --> 01:23:58,305
KEVIN:
And how's it going?
1945
01:23:58,338 --> 01:24:00,708
DANIEL:
You know, it... it's really...
1946
01:24:00,742 --> 01:24:02,544
-JOANNE ROHER: Hi, sweetie.
-DANIEL: Hi, Mom.
1947
01:24:02,577 --> 01:24:04,344
JOANNE: So what is the premise of the film?
1948
01:24:04,378 --> 01:24:07,180
Is it a documentary? Kev, don't use any more spices.
1949
01:24:07,214 --> 01:24:08,816
It's already over-spiced.
1950
01:24:08,850 --> 01:24:10,718
We're making chicken right now.
1951
01:24:10,752 --> 01:24:13,186
-Is it a documentary or what...
-DANIEL: It's about--
1952
01:24:13,220 --> 01:24:14,421
the movie's about the end of the world.
1953
01:24:14,454 --> 01:24:16,290
The end of the world's coming,
1954
01:24:16,323 --> 01:24:18,158
and we're making a movie about the end of the world.
1955
01:24:18,191 --> 01:24:20,327
-JOANNE: Really?
-DANIEL: Yeah.
1956
01:24:20,360 --> 01:24:22,162
KEVIN: Kind of a depressing film, it sounds like.
1957
01:24:22,195 --> 01:24:23,698
JOANNE:
Yeah.
1958
01:24:23,731 --> 01:24:29,436
DANIEL: I'm feeling a lot, like-- this very acute anxiety.
1959
01:24:29,469 --> 01:24:31,673
JOANNE: It's so scary, but there's got to be--
1960
01:24:31,706 --> 01:24:35,810
you know, have you been meeting some-some supersmart people
1961
01:24:35,843 --> 01:24:38,312
that are giving you any answers?
1962
01:24:38,345 --> 01:24:40,113
DANIEL: That's what's frustrating about it.
1963
01:24:40,147 --> 01:24:41,683
No one knows.
1964
01:24:41,716 --> 01:24:44,384
JOANNE:
All I can say to that is that
1965
01:24:44,418 --> 01:24:50,190
every generation has had something scary like this.
1966
01:24:50,223 --> 01:24:52,760
KEVIN: When I was born, it was the Cuban Missile Crisis.
1967
01:24:52,794 --> 01:24:54,596
(steady heartbeat)
1968
01:24:54,629 --> 01:24:56,864
I was just scared that there was going to be a nuclear war.
1969
01:24:56,898 --> 01:24:58,298
JOANNE:
Yeah, but we didn't know
1970
01:24:58,332 --> 01:24:59,534
what they were gonna do and...
1971
01:24:59,567 --> 01:25:01,168
KEVIN:
And the world didn't end.
1972
01:25:01,201 --> 01:25:02,202
Everyone woke up the next morning,
1973
01:25:02,235 --> 01:25:03,871
and we're still doing our thing.
1974
01:25:03,905 --> 01:25:05,439
DANIEL:
I'm very scared,
1975
01:25:05,472 --> 01:25:07,407
especially in the context of, like,
1976
01:25:07,441 --> 01:25:09,577
-you know, the baby and...
-(fetal heartbeat pulsing)
1977
01:25:09,611 --> 01:25:11,178
KEVIN:
It's gonna be a learning curve.
1978
01:25:11,211 --> 01:25:12,479
JOANNE:
You're gonna be okay.
1979
01:25:12,513 --> 01:25:13,681
KEVIN: You can't, you can't think about
1980
01:25:13,715 --> 01:25:14,849
what you can't control, Daniel.
1981
01:25:14,882 --> 01:25:16,551
Just remember that.
1982
01:25:16,584 --> 01:25:18,251
DANIEL (voice breaking):
I'm really, I'm really feeling
1983
01:25:18,285 --> 01:25:19,587
nervous and scared about it.
1984
01:25:19,621 --> 01:25:21,488
There's so much that I can't control.
1985
01:25:21,522 --> 01:25:24,358
KEVIN: Don't be nervous. You can't let that get to you.
1986
01:25:24,391 --> 01:25:26,794
You can only control what you can control,
1987
01:25:26,828 --> 01:25:28,395
and that's all you can do.
1988
01:25:28,428 --> 01:25:31,231
You can't do more than that.
1989
01:25:31,264 --> 01:25:33,433
Write that down in your book.
1990
01:25:35,870 --> 01:25:37,572
♪ ♪
1991
01:25:44,712 --> 01:25:47,447
(fetal heartbeat pulsing)
1992
01:25:47,481 --> 01:25:49,651
(howling)
1993
01:25:49,684 --> 01:25:52,219
CAROLINE:
When you look back,
1994
01:25:52,252 --> 01:25:54,922
the world is always ending.
1995
01:25:54,956 --> 01:25:57,357
And when you look ahead,
1996
01:25:57,391 --> 01:25:59,627
the world is always ending.
1997
01:25:59,661 --> 01:26:02,429
...on fire.
One home is already on fire...
1998
01:26:02,462 --> 01:26:06,299
CAROLINE: But the world is always starting, too.
1999
01:26:08,402 --> 01:26:10,437
DANIEL:
Are you ready?
2000
01:26:10,470 --> 01:26:12,674
Are you ever really ready?
2001
01:26:16,443 --> 01:26:18,278
DANIEL: You want to drive
or you want me to drive?
2002
01:26:18,311 --> 01:26:19,547
CAROLINE:
You drive.
2003
01:26:20,815 --> 01:26:22,382
-(woman speaks indistinctly)
-CAROLINE: Okay.
2004
01:26:22,416 --> 01:26:24,284
DANIEL:
Thank you, Maria.
2005
01:26:24,317 --> 01:26:26,654
WOMAN: It's gonna just
feel really crampy.
2006
01:26:26,688 --> 01:26:28,355
-CAROLINE: Okay.
-WOMAN: You're ready?
2007
01:26:28,388 --> 01:26:29,691
-CAROLINE: Yes. -(indistinct chatter)
2008
01:26:29,724 --> 01:26:30,958
(Caroline sighs heavily)
2009
01:26:30,992 --> 01:26:33,326
WOMAN: You could be in a medical drama with all that...
2010
01:26:34,595 --> 01:26:36,597
♪ ♪
2011
01:26:44,706 --> 01:26:46,607
-(static crackles)
-(indistinct chatter)
2012
01:26:49,610 --> 01:26:51,746
WOMAN:
...some pressure.
2013
01:26:53,313 --> 01:26:54,749
Just relax.
2014
01:26:57,451 --> 01:26:59,821
-(baby crying) -(device beeping)
2015
01:27:02,623 --> 01:27:05,626
DANIEL:
Hi, buddy. Ooh.
2016
01:27:05,660 --> 01:27:06,961
Hi, buddy.
2017
01:27:06,994 --> 01:27:09,362
♪ ♪
2018
01:27:31,652 --> 01:27:34,321
KEVIN:
It's gonna be a whole new world
2019
01:27:34,354 --> 01:27:35,723
for you, Daniel.
2020
01:27:37,658 --> 01:27:40,762
And I think you're gonna be an amazing father.
2021
01:27:41,863 --> 01:27:44,899
JOANNE (chuckles):
That's for sure.
2022
01:27:44,932 --> 01:27:47,068
-(Kevin crying) -Oh.
2023
01:27:47,101 --> 01:27:49,302
(crying continues)
2024
01:27:49,336 --> 01:27:51,572
KEVIN:
You're gonna do a great job.
2025
01:27:52,206 --> 01:27:53,541
JOANNE:
Kev, why are you crying?
2026
01:27:53,574 --> 01:27:55,076
KEVIN:
I'm crying because...
2027
01:27:55,109 --> 01:27:57,545
I don't know. I just don't...
2028
01:27:57,578 --> 01:27:59,914
DANIEL: Dad, you're gonna make me cry, too.
2029
01:27:59,947 --> 01:28:01,549
Why are you crying?
2030
01:28:03,651 --> 01:28:05,853
KEVIN: I just know that you're gonna be an amazing dad.
2031
01:28:05,887 --> 01:28:07,655
DANIEL (crying):
I'm only gonna be a great dad
2032
01:28:07,688 --> 01:28:09,957
'cause I had a great dad.
2033
01:28:09,991 --> 01:28:12,894
KEVIN: I'm getting emotional, that's all.
2034
01:28:12,927 --> 01:28:14,327
JOANNE:
Aw.
2035
01:28:14,361 --> 01:28:15,462
(playful chatter over video)
2036
01:28:15,495 --> 01:28:17,064
(over video):
Boy, oh, boy!
2037
01:28:17,098 --> 01:28:18,633
My goodness!
2038
01:28:19,267 --> 01:28:21,035
Look at those little cheekies.
2039
01:28:21,068 --> 01:28:25,573
Look at those little cheekies.
Look at those little cheekies.
2040
01:28:25,606 --> 01:28:26,741
Yeah.
2041
01:28:26,774 --> 01:28:29,376
DANIEL:
I know how to end this movie.
2042
01:28:30,011 --> 01:28:31,712
Babies.
2043
01:28:32,379 --> 01:28:35,883
-The end of the movie is about babies. -(chatter over videos)
2044
01:28:35,917 --> 01:28:37,618
They're life-affirming.
2045
01:28:37,652 --> 01:28:39,486
They're exhausting.
2046
01:28:39,520 --> 01:28:41,421
-(baby laughing)
-They're hilarious.
2047
01:28:41,454 --> 01:28:42,957
And they're worth it.
2048
01:28:42,990 --> 01:28:46,828
This film isn't about the inner workings of a technology.
2049
01:28:46,861 --> 01:28:48,696
It's not about the billionaire CEOs.
2050
01:28:48,729 --> 01:28:50,665
It's not about the geopolitics.
2051
01:28:50,698 --> 01:28:53,935
It's not about the terrifying future or the end of the world,
2052
01:28:53,968 --> 01:28:56,137
because my world is just starting.
2053
01:28:56,170 --> 01:28:57,972
I'm building a crib.
2054
01:28:58,005 --> 01:28:59,907
Right here, right now.
2055
01:28:59,941 --> 01:29:01,008
(baby cooing)
2056
01:29:01,042 --> 01:29:02,743
AI is gonna change everything
2057
01:29:02,777 --> 01:29:05,913
in ways too powerful and complex for us to understand.
2058
01:29:05,947 --> 01:29:09,717
And the future is not for any of us to decide.
2059
01:29:10,818 --> 01:29:13,921
But what I can decide is to be the best possible husband
2060
01:29:13,955 --> 01:29:17,992
for my wife and the best possible dad for my son.
2061
01:29:18,025 --> 01:29:21,796
So whether our AI future is a nightmarish dystopia
2062
01:29:21,829 --> 01:29:24,098
or the utopia that we all dream of,
2063
01:29:24,131 --> 01:29:27,134
I'll at least know that I did everything I could
2064
01:29:27,168 --> 01:29:30,605
to guide my family through this AI revolution.
2065
01:29:30,638 --> 01:29:34,474
And no matter what, we'll be facing it together.
2066
01:29:34,508 --> 01:29:35,877
(uplifting music swells)
2067
01:29:35,910 --> 01:29:37,444
(music stops abruptly)
2068
01:29:37,477 --> 01:29:40,047
DANIEL:
So that's just our first idea.
2069
01:29:40,715 --> 01:29:43,084
How does this feel?
Are you feeling this?
2070
01:29:43,117 --> 01:29:45,720
Wait, I-- like, this is not--
this is a joke.
2071
01:29:45,753 --> 01:29:48,388
It's not actually
how you're gonna end it.
2072
01:29:50,591 --> 01:29:53,160
I mean, it's just an idea. Okay?
2073
01:29:53,194 --> 01:29:54,996
No, Daniel.
2074
01:29:55,029 --> 01:29:56,631
...uh, very, very dumb.
2075
01:29:56,664 --> 01:29:59,166
You've just spent, I don't
know, like, how many years
2076
01:29:59,200 --> 01:30:03,104
of our life working on this,
talking to every leading expert
2077
01:30:03,137 --> 01:30:05,539
on the planet about the subject,
2078
01:30:05,573 --> 01:30:08,175
and you're gonna end it
2079
01:30:08,209 --> 01:30:11,444
with some, like,
kumbaya bullshit?
2080
01:30:11,478 --> 01:30:14,048
There's an asteroid
headed to Earth.
2081
01:30:14,081 --> 01:30:15,917
What do you do? Just...
2082
01:30:15,950 --> 01:30:18,152
hold hands
and hope it works out okay?
2083
01:30:18,185 --> 01:30:19,854
Absolutely not.
2084
01:30:19,887 --> 01:30:22,924
We have to-- it ha--
it has to be...
2085
01:30:22,957 --> 01:30:25,026
The ending has to...
2086
01:30:26,493 --> 01:30:28,529
CAROLINE:
Okay, first thing:
2087
01:30:28,562 --> 01:30:32,033
AI is here, and it's here to stay.
2088
01:30:32,066 --> 01:30:33,534
The shit's out of the horse,
2089
01:30:33,567 --> 01:30:36,436
but the horse is gonna keep shitting.
2090
01:30:38,873 --> 01:30:39,807
(crowd cheering)
2091
01:30:39,840 --> 01:30:41,943
You know, one of the basic
laws of history
2092
01:30:41,976 --> 01:30:44,578
is that nothing
really has a beginning
2093
01:30:44,612 --> 01:30:46,580
and nothing has any ending.
2094
01:30:46,614 --> 01:30:48,049
It just goes on.
2095
01:30:48,082 --> 01:30:50,918
AI is nowhere near
its full development.
2096
01:30:50,952 --> 01:30:53,087
CAROLINE: Even if the current AI bubble bursts,
2097
01:30:53,120 --> 01:30:54,956
-humans are never going to stop
-(noisemaker blows)
2098
01:30:54,989 --> 01:30:57,959
building more and more powerful technology.
2099
01:30:57,992 --> 01:30:59,827
HAO:
You can choose not to use AI
2100
01:30:59,860 --> 01:31:02,730
or participate in it, but it's
going to affect you anyway.
2101
01:31:02,763 --> 01:31:04,231
DANIEL: Okay, fantastic.
So we're screwed.
2102
01:31:04,265 --> 01:31:07,568
CAROLINE: No, we're not
because of one simple thing.
2103
01:31:11,572 --> 01:31:13,574
This is not inevitable.
2104
01:31:13,607 --> 01:31:15,576
If we could just see it
clearly together,
2105
01:31:15,609 --> 01:31:18,813
the obvious response will be
to choose something different.
2106
01:31:18,846 --> 01:31:21,615
We need to very clearly
change the game
2107
01:31:21,649 --> 01:31:23,517
from a race to the bottom
into a race to the top.
2108
01:31:23,551 --> 01:31:27,855
LEAHY: The problem we need to
solve is not AI specifically.
2109
01:31:27,888 --> 01:31:31,726
It's the general question
of how do we build a society
2110
01:31:31,759 --> 01:31:34,528
that can deal with
powerful technology.
2111
01:31:34,562 --> 01:31:35,830
Because we're going to get
2112
01:31:35,863 --> 01:31:37,264
only more and more powerful
technology.
2113
01:31:37,298 --> 01:31:40,034
CAROLINE:
We need to upgrade our society,
2114
01:31:40,067 --> 01:31:41,902
and the first step is coming together
2115
01:31:41,936 --> 01:31:43,838
-and demanding...
-ALL: Coordination.
2116
01:31:43,871 --> 01:31:47,608
Some form of international
cooperation or agreement about
2117
01:31:47,641 --> 01:31:48,909
what the norms should be.
2118
01:31:48,943 --> 01:31:50,544
You know, how should
they be deployed...
2119
01:31:50,578 --> 01:31:52,146
Like, real
international diplomacy
2120
01:31:52,179 --> 01:31:53,781
among the superpowers.
2121
01:31:53,814 --> 01:31:56,784
The Chinese are as worried
about it as the Americans.
2122
01:31:56,817 --> 01:31:58,085
I think it's difficult,
you know,
2123
01:31:58,119 --> 01:31:59,920
in the current
geopolitical climate,
2124
01:31:59,954 --> 01:32:01,756
-but I think it's necessary.
-RASKIN: Absolutely.
2125
01:32:01,789 --> 01:32:05,059
In the exact same way that
the last time that humanity
2126
01:32:05,092 --> 01:32:09,163
developed a technology this dangerous...
2127
01:32:09,830 --> 01:32:12,900
...that required a complete,
unprecedented shift
2128
01:32:12,933 --> 01:32:15,169
to the structure of our world.
2129
01:32:15,770 --> 01:32:18,172
CAROLINE:
So we need to do that complete,
2130
01:32:18,205 --> 01:32:20,307
unprecedented shift again.
2131
01:32:21,008 --> 01:32:24,678
You know, we-we talk to people
who work at these AI companies,
2132
01:32:24,712 --> 01:32:25,846
and they say they want to do
something different,
2133
01:32:25,880 --> 01:32:27,048
but they need public pressure.
2134
01:32:27,081 --> 01:32:28,716
They need the government
to do something.
2135
01:32:28,749 --> 01:32:30,217
So then we go to DC,
and they say,
2136
01:32:30,251 --> 01:32:32,153
"Well, we need Silicon Valley
to do something different.
2137
01:32:32,186 --> 01:32:34,021
They're the ones who are gonna
come up with the guardrails."
2138
01:32:34,055 --> 01:32:36,057
And so everyone is pointing
the finger at someone else,
2139
01:32:36,090 --> 01:32:39,927
and what they agree on is
that we need public pressure
2140
01:32:39,960 --> 01:32:41,796
in order for something else
to happen.
2141
01:32:41,829 --> 01:32:43,164
And that's what you
2142
01:32:43,197 --> 01:32:44,799
and all the people
watching this movie can do.
2143
01:32:44,832 --> 01:32:45,833
CAROLINE:
We need to hold the leaders
2144
01:32:45,866 --> 01:32:47,334
in our governments
2145
01:32:47,368 --> 01:32:48,969
and the leaders of these companies accountable.
2146
01:32:49,003 --> 01:32:51,639
BOEREE:
Whichever country you're in,
2147
01:32:51,672 --> 01:32:53,340
let them know
that you're not happy
2148
01:32:53,374 --> 01:32:54,975
with the current status quo.
2149
01:32:55,009 --> 01:32:57,678
So, yeah, it's boring to say
call your congressperson.
2150
01:32:57,711 --> 01:32:59,880
I'm not saying you should
just do that, but, like,
2151
01:32:59,914 --> 01:33:01,215
we do have to do that.
2152
01:33:01,248 --> 01:33:02,883
Like, we do have to get
the government involved.
2153
01:33:02,917 --> 01:33:04,185
DANIEL: So we just
call them up and say,
2154
01:33:04,218 --> 01:33:06,253
"Hey, stop Big Tech
from ruining the world"?
2155
01:33:06,287 --> 01:33:09,223
CAROLINE: No, but there are
tons of really obvious,
2156
01:33:09,256 --> 01:33:11,225
straightforward things
we can be demanding.
2157
01:33:11,258 --> 01:33:12,960
We need transparency.
2158
01:33:12,993 --> 01:33:14,795
We need to end the secrecy
that exists inside these labs,
2159
01:33:14,829 --> 01:33:17,031
because they are building
powerful technology,
2160
01:33:17,064 --> 01:33:19,066
and the public deserves to know
what's going on.
2161
01:33:19,100 --> 01:33:21,068
Ultimately,
we're gonna need independent,
2162
01:33:21,102 --> 01:33:23,637
objective third parties
to evaluate the systems.
2163
01:33:23,671 --> 01:33:27,274
We can't count on the companies
to grade their own homework.
2164
01:33:27,308 --> 01:33:31,378
If-if a company uses AI and-and
has AI interacting with you,
2165
01:33:31,412 --> 01:33:35,249
it should disclose that you are
interacting with an AI system.
2166
01:33:35,282 --> 01:33:39,019
Yeah. And-and also we need
a system that makes companies
2167
01:33:39,053 --> 01:33:42,156
legally liable for the
AI systems that they produce.
2168
01:33:42,189 --> 01:33:46,293
We need to make sure that there
are tests and safety standards
2169
01:33:46,327 --> 01:33:48,662
that are applied to everyone.
2170
01:33:48,696 --> 01:33:50,097
We need some ground rules,
2171
01:33:50,131 --> 01:33:51,999
and we need to keep adapting
those rules
2172
01:33:52,032 --> 01:33:55,002
at the speed that
the technology develops.
2173
01:33:55,035 --> 01:33:57,738
LEAHY: There is currently more regulation
2174
01:33:57,771 --> 01:33:59,840
on selling a sandwich to the public
2175
01:33:59,874 --> 01:34:03,410
than there is on building
potentially world-ending AGI.
2176
01:34:03,444 --> 01:34:05,279
CAROLINE:
And the last thing is
2177
01:34:05,312 --> 01:34:07,214
to upgrade ourselves.
2178
01:34:07,915 --> 01:34:10,284
This is not, like, the job
of, like, the safety team
2179
01:34:10,317 --> 01:34:12,153
at any given lab, or the CEO.
2180
01:34:12,186 --> 01:34:13,854
So, like, this is
everyone's job.
2181
01:34:13,888 --> 01:34:15,156
Don't, like, leave it
up to the AI experts.
2182
01:34:15,189 --> 01:34:17,424
Like, (bleep) that.
Like, this is the moment
2183
01:34:17,458 --> 01:34:19,460
that we are transitioning
from, like,
2184
01:34:19,493 --> 01:34:22,163
mostly human cognitive power
to, like, AI cognitive power,
2185
01:34:22,196 --> 01:34:23,197
and it affects everyone,
2186
01:34:23,230 --> 01:34:25,166
and I want people to be
in on that conversation.
2187
01:34:25,199 --> 01:34:27,234
And I would say, if you think
that AI will kill us all,
2188
01:34:27,268 --> 01:34:28,969
you should be working
in AI research
2189
01:34:29,003 --> 01:34:30,804
to make sure it doesn't,
because you do have
2190
01:34:30,838 --> 01:34:32,773
an enormous amount of agency.
2191
01:34:32,806 --> 01:34:34,208
CAROLINE:
Whoever you are,
2192
01:34:34,241 --> 01:34:36,177
you are an expert in your own industry,
2193
01:34:36,210 --> 01:34:39,413
in your own school, in your own family,
2194
01:34:39,446 --> 01:34:43,050
and it's up to you how AI is used in your life.
2195
01:34:43,083 --> 01:34:45,319
Whether you want to join
your school board
2196
01:34:45,352 --> 01:34:48,389
or-or whether you want to ask
your employer
2197
01:34:48,422 --> 01:34:49,924
how they're using
AI technologies,
2198
01:34:49,957 --> 01:34:52,259
like, all of us can do the work.
2199
01:34:52,293 --> 01:34:56,330
A lot of unions have been
pretty effective at, like,
2200
01:34:56,363 --> 01:34:59,333
determining how they want to interact with these systems.
2201
01:34:59,366 --> 01:35:02,303
-Nurses unions, teacher unions.
-(indistinct chanting)
2202
01:35:02,937 --> 01:35:05,039
DANIELA AMODEI: I would love if parents everywhere
2203
01:35:05,072 --> 01:35:06,407
went to the AI companies
and said,
2204
01:35:06,440 --> 01:35:08,342
"How can you be better
on this?" Including us.
2205
01:35:08,375 --> 01:35:11,445
So, I founded Encode Justice
when I was 15 years old.
2206
01:35:11,478 --> 01:35:15,082
We are the world's first and largest army of young people
2207
01:35:15,115 --> 01:35:18,452
fighting for human-centered artificial intelligence.
2208
01:35:18,485 --> 01:35:21,288
It doesn't matter who you are,
even the smallest actions help,
2209
01:35:21,322 --> 01:35:24,291
and even conversation starting
is really, really valuable.
2210
01:35:24,325 --> 01:35:26,293
DANIEL:
So my job could be, like,
2211
01:35:26,327 --> 01:35:28,295
tell Bubby Lila about this at dinner?
2212
01:35:28,329 --> 01:35:30,364
CAROLINE: Honestly, yeah, that's part of it.
2213
01:35:30,397 --> 01:35:32,099
And we're gonna have to do
2214
01:35:32,132 --> 01:35:34,969
a lot of things that
we haven't even thought of yet.
2215
01:35:35,002 --> 01:35:36,503
People are gonna look at
anything that we've outlined
2216
01:35:36,538 --> 01:35:37,838
and say, "That's not enough."
2217
01:35:37,871 --> 01:35:39,473
What matters is that the forces
2218
01:35:39,507 --> 01:35:41,242
that are working
towards solutions
2219
01:35:41,275 --> 01:35:44,979
start to exceed the forces that
are working against solutions.
2220
01:35:45,012 --> 01:35:46,947
Making the world better
has always been hard.
2221
01:35:46,981 --> 01:35:48,249
It has never been easy.
2222
01:35:48,282 --> 01:35:50,317
Like, there have been
many shitty things
2223
01:35:50,351 --> 01:35:52,286
that have happened in history,
and we've had--
2224
01:35:52,319 --> 01:35:54,121
like, people have had
to deal with that,
2225
01:35:54,154 --> 01:35:57,191
and then they've risen up
and changed it.
2226
01:35:57,224 --> 01:36:00,894
There are an insane number
of challenges ahead of us,
2227
01:36:00,928 --> 01:36:04,398
but if we can get past them,
2228
01:36:04,431 --> 01:36:08,536
we can unlock a future beyond
our wildest imagination.
2229
01:36:08,570 --> 01:36:10,804
CAROLINE:
We have to come together
2230
01:36:10,838 --> 01:36:14,875
and find the path between the promise and the peril.
2231
01:36:14,908 --> 01:36:17,177
We can't be pessimists or optimists.
2232
01:36:17,778 --> 01:36:20,180
We have to become something new.
2233
01:36:22,182 --> 01:36:25,185
A friend of mine calls me
an apocaloptimist.
2234
01:36:25,219 --> 01:36:26,186
"Apocaloptimist"?
2235
01:36:26,220 --> 01:36:28,889
I think that might be
my new favorite word.
2236
01:36:28,922 --> 01:36:30,291
-(laughs) -MAN: Might be
the name of this movie.
2237
01:36:30,324 --> 01:36:31,892
-It might be the name
of this movie. -(laughter)
2238
01:36:31,925 --> 01:36:34,862
-"Apocaloptimist."
-"Apocaloptimist."
2239
01:36:34,895 --> 01:36:36,430
Yeah.
2240
01:36:36,463 --> 01:36:37,898
I don't believe in doom.
2241
01:36:37,931 --> 01:36:39,366
I believe in the spirit of life,
2242
01:36:39,400 --> 01:36:43,137
uh, and I believe in life is
about the capacity to act.
2243
01:36:43,170 --> 01:36:44,905
-(crowd singing)
-The capacity to relate,
2244
01:36:44,938 --> 01:36:47,141
the capacity to feel.
2245
01:36:49,376 --> 01:36:51,345
-(indistinct mission chatter)
-(cheering)
2246
01:36:51,378 --> 01:36:54,315
We have to double down more and more
2247
01:36:54,348 --> 01:36:57,384
on those capacities that we have as humans
2248
01:36:57,418 --> 01:36:59,853
that robotic systems will never have.
2249
01:37:00,487 --> 01:37:04,559
It's time right now
to make those decisions about
2250
01:37:04,592 --> 01:37:08,062
how to guide it and support it
rather than dividing us.
2251
01:37:08,095 --> 01:37:10,898
DANIEL: It kind of sounds like
raising a kid.
2252
01:37:10,931 --> 01:37:12,634
That's what's up. Yeah.
2253
01:37:12,667 --> 01:37:15,035
CAROLINE: AI may have more raw intelligence
2254
01:37:15,069 --> 01:37:17,471
-than our little human brains,
-(baby laughing)
2255
01:37:17,504 --> 01:37:20,608
but we're so much more than just our intelligence.
2256
01:37:21,208 --> 01:37:25,379
Intelligence is-is
the ability to solve problems.
2257
01:37:25,412 --> 01:37:29,116
Wisdom is the ability to know
which problems to solve.
2258
01:37:31,519 --> 01:37:34,188
(indistinct crew chatter)
2259
01:37:34,221 --> 01:37:35,489
DANIEL:
It can go on your fridge.
2260
01:37:35,523 --> 01:37:36,658
(laughing)
2261
01:37:36,691 --> 01:37:38,626
YUDKOWSKY:
So, don't give up.
2262
01:37:38,660 --> 01:37:40,227
Humanity has done
2263
01:37:40,260 --> 01:37:43,030
more difficult things than this
in its history.
2264
01:37:43,897 --> 01:37:46,967
It's just hard to convince
people that they should.
2265
01:37:47,000 --> 01:37:49,336
♪ ♪
2266
01:37:52,072 --> 01:37:53,273
(audio crackles)
2267
01:37:55,309 --> 01:37:57,277
DANIEL: So, when I started making this movie,
2268
01:37:57,311 --> 01:37:59,012
I would say that I was, like,
2269
01:37:59,046 --> 01:38:01,348
broadly a cynical asshole
about this whole thing.
2270
01:38:01,382 --> 01:38:03,450
Over the course of making the film,
2271
01:38:03,484 --> 01:38:05,152
I've come to understand that, like,
2272
01:38:05,185 --> 01:38:06,987
that's the only thing we can't be.
2273
01:38:07,756 --> 01:38:09,990
...or anything else you'd like to discuss?
2274
01:38:10,023 --> 01:38:11,992
No. I think we covered a lot.
2275
01:38:12,025 --> 01:38:13,561
-Yeah. (chuckles)
-MAN: Thank you so much.
2276
01:38:13,595 --> 01:38:15,295
Thanks.
2277
01:38:15,929 --> 01:38:17,498
DANIEL:
This is a problem that's bigger
2278
01:38:17,532 --> 01:38:19,400
than any one person.
2279
01:38:19,433 --> 01:38:22,136
This will change the world in ways that we don't understand.
2280
01:38:22,169 --> 01:38:24,037
That is all true.
2281
01:38:24,071 --> 01:38:26,106
-(laughs): Okay.
-(indistinct chatter)
2282
01:38:26,140 --> 01:38:28,208
DANIEL:
But what we do have agency over
2283
01:38:28,242 --> 01:38:30,411
is what we do about it.
2284
01:38:30,444 --> 01:38:33,113
As frontier AI grows
exponentially more capable...
2285
01:38:33,147 --> 01:38:34,682
DANIEL:
And the reality
2286
01:38:34,716 --> 01:38:37,151
is that if we just decide it's hopeless,
2287
01:38:37,184 --> 01:38:39,420
then it is hopeless.
2288
01:38:39,453 --> 01:38:41,221
...to put less stress
on planet Earth...
2289
01:38:41,255 --> 01:38:43,257
DANIEL:
But if you decide
2290
01:38:43,290 --> 01:38:46,293
that you want to try...
2291
01:38:47,194 --> 01:38:48,563
...then you try.
2292
01:38:48,596 --> 01:38:50,931
And that's hard.
2293
01:38:51,733 --> 01:38:53,500
But you know what?
2294
01:38:53,535 --> 01:38:57,137
Big things seem impossible before they actually happen.
2295
01:38:58,673 --> 01:39:01,676
But when they finally do happen,
2296
01:39:01,709 --> 01:39:05,713
it's because millions of people took millions of actions
2297
01:39:05,747 --> 01:39:08,182
to make them happen.
2298
01:39:08,215 --> 01:39:10,752
And so...
2299
01:39:10,785 --> 01:39:13,220
we have to try.
2300
01:39:14,087 --> 01:39:15,523
(sighs heavily)
2301
01:39:21,663 --> 01:39:24,298
There's too much at stake.
2302
01:39:24,331 --> 01:39:26,366
♪ ♪
2303
01:39:33,040 --> 01:39:35,375
(film crackling)
2304
01:39:37,545 --> 01:39:41,749
Look at the incredible changes
we've experienced and survived
2305
01:39:41,783 --> 01:39:43,317
from the Stone Age,
2306
01:39:43,350 --> 01:39:46,487
and yet even greater changes
are still to come.
2307
01:39:46,521 --> 01:39:50,090
("Lost It to Trying"
by Son Lux playing)
2308
01:40:19,086 --> 01:40:21,288
♪ ♪
2309
01:40:48,181 --> 01:40:50,718
♪ What will we do now? ♪
2310
01:40:50,752 --> 01:40:53,755
♪ We've lost it to trying ♪
2311
01:40:53,788 --> 01:40:55,690
♪ We've lost it ♪
2312
01:40:55,723 --> 01:40:57,457
♪ To trying ♪
2313
01:41:00,193 --> 01:41:02,764
♪ What will we do now? ♪
2314
01:41:02,797 --> 01:41:05,733
♪ We've lost it to trying ♪
2315
01:41:05,767 --> 01:41:07,602
♪ We've lost it ♪
2316
01:41:07,635 --> 01:41:09,469
♪ To trying ♪
2317
01:41:12,205 --> 01:41:14,709
♪ What can we say now? ♪
2318
01:41:14,742 --> 01:41:17,745
♪ Our mouths only lying ♪
2319
01:41:17,779 --> 01:41:19,212
♪ Our mouths ♪
2320
01:41:19,246 --> 01:41:21,448
♪ Only lying ♪
2321
01:41:25,720 --> 01:41:28,255
♪ What can we say now? ♪
2322
01:41:28,288 --> 01:41:31,358
♪ Our mouths only lying ♪
2323
01:41:31,391 --> 01:41:32,760
♪ Our mouths ♪
2324
01:41:32,794 --> 01:41:34,762
♪ Only lying ♪
2325
01:41:34,796 --> 01:41:37,197
♪ ♪
2326
01:42:01,522 --> 01:42:04,291
♪ Give in and get out ♪
2327
01:42:04,324 --> 01:42:07,294
♪ We rise in the dying ♪
2328
01:42:07,327 --> 01:42:08,763
♪ We rise ♪
2329
01:42:08,796 --> 01:42:11,164
♪ In the dying ♪
2330
01:42:13,568 --> 01:42:16,336
♪ Give in and get out ♪
2331
01:42:16,370 --> 01:42:19,272
♪ We rise in the dying ♪
2332
01:42:19,306 --> 01:42:20,775
♪ We rise ♪
2333
01:42:20,808 --> 01:42:23,176
♪ In the dying ♪
2334
01:42:25,546 --> 01:42:28,248
♪ Give in and get out ♪
2335
01:42:28,281 --> 01:42:31,251
♪ We rise in the dying ♪
2336
01:42:31,284 --> 01:42:32,787
♪ We rise ♪
2337
01:42:32,820 --> 01:42:35,155
♪ In the dying ♪
2338
01:42:37,558 --> 01:42:40,293
♪ Give in and get out ♪
2339
01:42:40,327 --> 01:42:43,263
♪ We rise in the dying ♪
2340
01:42:43,296 --> 01:42:44,532
♪ We ♪
2341
01:42:44,565 --> 01:42:46,601
♪ ♪
2342
01:43:16,263 --> 01:43:18,465
♪ ♪
2343
01:43:23,336 --> 01:43:25,840
♪ Oh, oh, oh, oh, oh ♪
2344
01:43:25,873 --> 01:43:27,675
♪ Oh, oh, oh, oh ♪
2345
01:43:27,709 --> 01:43:28,943
♪ Oh, oh ♪
2346
01:43:28,976 --> 01:43:31,913
♪ Oh, oh, oh, oh, oh, oh ♪
2347
01:43:31,946 --> 01:43:33,648
♪ Oh, oh, oh, oh ♪
2348
01:43:33,681 --> 01:43:34,916
♪ Oh, oh ♪
2349
01:43:34,949 --> 01:43:37,552
♪ Oh, oh, oh, oh, oh, oh ♪
2350
01:43:37,585 --> 01:43:40,287
♪ What will we do now? ♪
2351
01:43:40,320 --> 01:43:43,290
♪ We've lost it to trying ♪
2352
01:43:43,323 --> 01:43:45,026
♪ We've lost it ♪
2353
01:43:45,059 --> 01:43:46,861
♪ To trying ♪
2354
01:43:46,894 --> 01:43:49,530
♪ Oh, oh, oh, oh, oh, oh ♪
2355
01:43:49,564 --> 01:43:52,332
♪ What will we do now? ♪
2356
01:43:52,365 --> 01:43:55,335
♪ We've lost it to trying ♪
2357
01:43:55,368 --> 01:43:57,038
♪ We've lost it ♪
2358
01:43:57,071 --> 01:43:58,940
♪ To trying. ♪
2359
01:43:58,973 --> 01:44:01,374
♪ ♪
2360
01:44:23,798 --> 01:44:24,865
(song ends)