The Social Dilemma (2020), English SDH subtitles (720p WEBRip x264)
2
00:00:31,114 --> 00:00:34,659
[interviewer] Why don't you go ahead?
Sit down and see if you can get comfy.
3
00:00:37,579 --> 00:00:39,789
-You good? All right.
-Yeah. [exhales]
4
00:00:39,914 --> 00:00:42,125
-[interviewer] Um...
-[cell phone vibrates]
5
00:00:43,043 --> 00:00:44,794
[crew member] Take one, marker.
6
00:00:46,796 --> 00:00:48,798
[interviewer] Wanna start
by introducing yourself?
7
00:00:48,882 --> 00:00:49,799
[crew member coughs]
8
00:00:50,467 --> 00:00:53,344
Hello, world. Bailey. Take three.
9
00:00:53,970 --> 00:00:56,347
-[interviewer] You good?
-This is the worst part, man.
10
00:00:56,890 --> 00:00:59,517
[chuckling] I don't like this.
11
00:00:59,851 --> 00:01:02,228
I worked at Facebook in 2011 and 2012.
12
00:01:02,312 --> 00:01:05,190
I was one of the really early employees
at Instagram.
13
00:01:05,273 --> 00:01:08,693
[man 1] I worked at, uh, Google, uh, YouTube.
14
00:01:08,777 --> 00:01:11,696
[woman] Apple, Google, Twitter, Palm.
15
00:01:12,739 --> 00:01:15,533
I helped start Mozilla Labs
and switched over to the Firefox side.
16
00:01:15,617 --> 00:01:18,119
-[interviewer] Are we rolling? Everybody?
-[crew members reply]
17
00:01:18,203 --> 00:01:19,162
[interviewer] Great.
18
00:01:21,206 --> 00:01:22,624
[man 2] I worked at Twitter.
19
00:01:23,041 --> 00:01:23,917
My last job there
20
00:01:24,000 --> 00:01:26,169
was the senior vice president
of engineering.
21
00:01:27,337 --> 00:01:29,255
-[man 3] I was the president of Pinterest.
-[sips]
22
00:01:29,339 --> 00:01:32,717
Before that, um,
I was the... the director of monetization
23
00:01:32,801 --> 00:01:34,260
at Facebook for five years.
24
00:01:34,344 --> 00:01:37,972
While at Twitter, I spent a number
of years running their developer platform,
25
00:01:38,056 --> 00:01:40,225
and then I became
head of consumer product.
26
00:01:40,308 --> 00:01:44,270
I was the coinventor of Google Drive,
Gmail Chat,
27
00:01:44,354 --> 00:01:46,689
Facebook Pages,
and the Facebook like button.
28
00:01:47,440 --> 00:01:50,777
Yeah. This is... This is why I spent,
like, eight months
29
00:01:50,860 --> 00:01:52,779
talking back and forth with lawyers.
30
00:01:54,072 --> 00:01:55,406
This freaks me out.
31
00:01:58,409 --> 00:01:59,702
[man 2] When I was there,
32
00:01:59,786 --> 00:02:02,914
I always felt like,
fundamentally, it was a force for good.
33
00:02:03,414 --> 00:02:05,375
I don't know if I feel that way anymore.
34
00:02:05,458 --> 00:02:10,588
I left Google in June 2017, uh,
due to ethical concerns.
35
00:02:10,672 --> 00:02:14,134
And... And not just at Google
but within the industry at large.
36
00:02:14,217 --> 00:02:15,385
I'm very concerned.
37
00:02:16,636 --> 00:02:17,679
I'm very concerned.
38
00:02:19,097 --> 00:02:21,808
It's easy today to lose sight of the fact
39
00:02:21,891 --> 00:02:27,814
that these tools actually have created
some wonderful things in the world.
40
00:02:27,897 --> 00:02:31,943
They've reunited lost family members.
They've found organ donors.
41
00:02:32,026 --> 00:02:36,573
I mean, there were meaningful,
systemic changes happening
42
00:02:36,656 --> 00:02:39,159
around the world
because of these platforms
43
00:02:39,242 --> 00:02:40,285
that were positive!
44
00:02:40,827 --> 00:02:44,539
I think we were naive
about the flip side of that coin.
45
00:02:45,540 --> 00:02:48,585
Yeah, these things, you release them,
and they take on a life of their own.
46
00:02:48,668 --> 00:02:52,005
And how they're used is pretty different
than how you expected.
47
00:02:52,088 --> 00:02:56,509
Nobody, I deeply believe,
ever intended any of these consequences.
48
00:02:56,593 --> 00:02:59,554
There's no one bad guy.
No. Absolutely not.
49
00:03:01,598 --> 00:03:03,975
[interviewer] So, then,
what's the... what's the problem?
50
00:03:09,147 --> 00:03:11,482
[interviewer] Is there a problem,
and what is the problem?
51
00:03:12,108 --> 00:03:13,026
[swallows]
52
00:03:17,614 --> 00:03:19,991
[clicks tongue] Yeah, it is hard
to give a single, succinct...
53
00:03:20,074 --> 00:03:22,118
I'm trying to touch on
many different problems.
54
00:03:22,535 --> 00:03:23,953
[interviewer] What is the problem?
55
00:03:24,621 --> 00:03:25,914
[clicks tongue, chuckles]
56
00:03:27,916 --> 00:03:29,500
[birds singing]
57
00:03:31,169 --> 00:03:32,670
[dog barking in distance]
58
00:03:33,463 --> 00:03:35,340
[reporter 1]
Despite facing mounting criticism,
59
00:03:35,423 --> 00:03:37,675
the so-called Big Tech names are getting bigger.
60
00:03:37,759 --> 00:03:40,929
The entire tech industry is
under a new level of scrutiny.
61
00:03:41,012 --> 00:03:43,806
And a new study sheds light on the link
62
00:03:43,890 --> 00:03:46,142
between mental health
and social media use.
63
00:03:46,226 --> 00:03:48,686
[on TV]
Here to talk about the latest research...
64
00:03:48,770 --> 00:03:51,397
[Tucker Carlson] ...is going on that gets no coverage at all.
65
00:03:51,481 --> 00:03:54,108
Tens of millions of Americans are hopelessly addicted
66
00:03:54,192 --> 00:03:56,319
to their electronic devices.
67
00:03:56,402 --> 00:03:57,987
[reporter 2] It's exacerbated by the fact
68
00:03:58,071 --> 00:04:00,698
that you can literally isolate yourself now
69
00:04:00,782 --> 00:04:02,742
in a bubble, thanks to our technology.
70
00:04:02,825 --> 00:04:04,577
Fake news is becoming more advanced
71
00:04:04,661 --> 00:04:06,788
and threatening societies
around the world.
72
00:04:06,871 --> 00:04:10,250
We weren't expecting any of this
when we created Twitter over 12 years ago.
73
00:04:10,333 --> 00:04:12,502
White House officials say
they have no reason to believe
74
00:04:12,585 --> 00:04:14,754
the Russian cyberattacks will stop.
75
00:04:14,837 --> 00:04:18,132
YouTube is being forced
to concentrate on cleansing the site.
76
00:04:18,216 --> 00:04:21,552
[reporter 3] TikTok, if you talk to any tween out there...
77
00:04:21,636 --> 00:04:24,013
[on TV] ...there's no chance they'll delete this thing...
78
00:04:24,097 --> 00:04:26,224
Hey, Isla,
can you get the table ready, please?
79
00:04:26,307 --> 00:04:28,601
[reporter 4] There's a question about whether social media
80
00:04:28,685 --> 00:04:29,978
is making your child depressed.
81
00:04:30,061 --> 00:04:32,105
[mom] Isla,
can you set the table, please?
82
00:04:32,188 --> 00:04:35,316
[reporter 5] These cosmetic procedures are becoming so popular with teens,
83
00:04:35,400 --> 00:04:37,902
plastic surgeons have coined
a new syndrome for it,
84
00:04:37,986 --> 00:04:40,822
"Snapchat dysmorphia,"
with young patients wanting surgery
85
00:04:40,905 --> 00:04:43,741
so they can look more like they do
in filtered selfies.
86
00:04:43,825 --> 00:04:45,910
Still don't see why you let her have
that thing.
87
00:04:45,994 --> 00:04:47,412
What was I supposed to do?
88
00:04:47,495 --> 00:04:49,580
I mean, every other kid
in her class had one.
89
00:04:50,164 --> 00:04:51,165
She's only 11.
90
00:04:51,249 --> 00:04:52,959
Cass, no one's forcing you to get one.
91
00:04:53,042 --> 00:04:55,086
You can stay disconnected
as long as you want.
92
00:04:55,169 --> 00:04:59,340
Hey, I'm connected without a cell phone,
okay? I'm on the Internet right now.
93
00:04:59,424 --> 00:05:03,094
Also, that isn't even actual connection.
It's just a load of sh--
94
00:05:03,177 --> 00:05:05,013
Surveillance capitalism has come to shape
95
00:05:05,096 --> 00:05:07,765
our politics and culture
in ways many people don't perceive.
96
00:05:07,849 --> 00:05:10,101
[reporter 6]
ISIS inspired followers online,
97
00:05:10,184 --> 00:05:12,812
and now white supremacists are doing the same.
98
00:05:12,895 --> 00:05:14,147
Recently in India,
99
00:05:14,230 --> 00:05:17,442
Internet lynch mobs have killed
a dozen people, including these five...
100
00:05:17,525 --> 00:05:20,361
[reporter 7] It's not just fake news; it's fake news with consequences.
101
00:05:20,445 --> 00:05:24,073
[reporter 8] How do you handle an epidemic in the age of fake news?
102
00:05:24,157 --> 00:05:26,993
Can you get the coronavirus
by eating Chinese food?
103
00:05:27,535 --> 00:05:32,540
We have gone from the information age
into the disinformation age.
104
00:05:32,623 --> 00:05:34,667
Our democracy is under assault.
105
00:05:34,751 --> 00:05:36,919
[man 4] What I said was,
"I think the tools
106
00:05:37,003 --> 00:05:39,005
that have been created today are starting
107
00:05:39,088 --> 00:05:41,799
to erode the social fabric
of how society works."
108
00:05:41,883 --> 00:05:44,427
[eerie instrumental music continues]
109
00:05:55,980 --> 00:05:58,483
-[music fades]
-[indistinct chatter]
110
00:05:58,566 --> 00:05:59,442
[crew member] Fine.
111
00:06:00,151 --> 00:06:03,446
[stage manager] Aza does
welcoming remarks. We play the video.
112
00:06:04,197 --> 00:06:07,325
And then, "Ladies and gentlemen,
Tristan Harris."
113
00:06:07,408 --> 00:06:08,868
-Right.
-[stage manager] Great.
114
00:06:08,951 --> 00:06:12,038
So, I come up, and...
115
00:06:13,831 --> 00:06:17,126
basically say, "Thank you all for coming."
Um...
116
00:06:17,919 --> 00:06:22,048
So, today, I wanna talk about a new agenda
for technology.
117
00:06:22,131 --> 00:06:25,468
And why we wanna do that
is because if you ask people,
118
00:06:25,551 --> 00:06:27,804
"What's wrong in the tech industry
right now?"
119
00:06:28,262 --> 00:06:31,641
there's a cacophony of grievances
and scandals,
120
00:06:31,724 --> 00:06:33,893
and "They stole our data."
And there's tech addiction.
121
00:06:33,976 --> 00:06:35,978
And there's fake news.
And there's polarization
122
00:06:36,062 --> 00:06:37,855
and some elections
that are getting hacked.
123
00:06:38,189 --> 00:06:41,609
But is there something
that is beneath all these problems
124
00:06:41,692 --> 00:06:44,612
that's causing all these things
to happen at once?
125
00:06:44,821 --> 00:06:46,364
[stage manager speaking indistinctly]
126
00:06:46,447 --> 00:06:48,408
-Does this feel good?
-Very good. Yeah.
127
00:06:49,033 --> 00:06:49,992
Um... [sighs]
128
00:06:50,743 --> 00:06:52,954
I'm just trying to...
Like, I want people to see...
129
00:06:53,037 --> 00:06:55,123
Like, there's a problem happening
in the tech industry,
130
00:06:55,206 --> 00:06:56,707
and it doesn't have a name,
131
00:06:56,791 --> 00:07:00,211
and it has to do with one source,
like, one...
132
00:07:00,795 --> 00:07:03,589
[eerie instrumental music playing]
133
00:07:05,091 --> 00:07:09,387
[Tristan] When you look around you,
it feels like the world is going crazy.
134
00:07:12,765 --> 00:07:15,309
You have to ask yourself, like,
"Is this normal?
135
00:07:16,102 --> 00:07:18,771
Or have we all fallen under some kind
of spell?"
136
00:07:27,989 --> 00:07:30,491
I wish more people could understand
how this works
137
00:07:30,575 --> 00:07:34,036
because it shouldn't be something
that only the tech industry knows.
138
00:07:34,120 --> 00:07:36,247
It should be something
that everybody knows.
139
00:07:36,330 --> 00:07:38,708
[backpack zips]
140
00:07:41,419 --> 00:07:42,378
[softly] Bye.
141
00:07:43,629 --> 00:07:44,881
[guard] Here you go, sir.
142
00:07:47,383 --> 00:07:48,676
-[employee] Hello!
-[Tristan] Hi.
143
00:07:48,759 --> 00:07:50,678
-Tristan. Nice to meet you.
-It's Tris-tan, right?
144
00:07:50,761 --> 00:07:51,721
-Yes.
-Awesome. Cool.
145
00:07:53,181 --> 00:07:55,933
[presenter] Tristan Harris
is a former design ethicist for Google
146
00:07:56,017 --> 00:07:59,395
and has been called the closest thing
Silicon Valley has to a conscience.
147
00:07:59,479 --> 00:08:00,730
[reporter] He's asking tech
148
00:08:00,813 --> 00:08:04,192
to bring what he calls "ethical design"
to its products.
149
00:08:04,275 --> 00:08:06,903
[Anderson Cooper] It's rare for a tech insider to be so blunt,
150
00:08:06,986 --> 00:08:10,114
but Tristan Harris believes someone needs to be.
151
00:08:11,324 --> 00:08:12,700
[Tristan] When I was at Google,
152
00:08:12,783 --> 00:08:16,037
I was on the Gmail team,
and I just started getting burnt out
153
00:08:16,120 --> 00:08:18,372
'cause we'd had
so many conversations about...
154
00:08:19,457 --> 00:08:23,169
you know, what the inbox should look like
and what color it should be, and...
155
00:08:23,252 --> 00:08:25,880
And I, you know, felt personally addicted
to e-mail,
156
00:08:26,297 --> 00:08:27,632
and I found it fascinating
157
00:08:27,715 --> 00:08:31,511
there was no one at Gmail working
on making it less addictive.
158
00:08:31,969 --> 00:08:34,514
And I was like,
"Is anybody else thinking about this?
159
00:08:34,597 --> 00:08:36,390
I haven't heard anybody talk about this."
160
00:08:36,849 --> 00:08:39,685
-And I was feeling this frustration...
-[sighs]
161
00:08:39,769 --> 00:08:41,229
...with the tech industry, overall,
162
00:08:41,312 --> 00:08:43,147
that we'd kind of, like, lost our way.
163
00:08:43,231 --> 00:08:46,442
-[ominous instrumental music playing]
-[message alerts chiming]
164
00:08:46,817 --> 00:08:49,820
[Tristan] You know, I really struggled
to try and figure out
165
00:08:49,904 --> 00:08:52,573
how, from the inside, we could change it.
166
00:08:52,907 --> 00:08:55,117
[energetic piano music playing]
167
00:08:55,201 --> 00:08:58,120
[Tristan] And that was when I decided
to make a presentation,
168
00:08:58,204 --> 00:08:59,497
kind of a call to arms.
169
00:09:00,998 --> 00:09:04,961
Every day, I went home and I worked on it
for a couple hours every single night.
170
00:09:05,044 --> 00:09:06,087
[typing]
171
00:09:06,170 --> 00:09:08,548
[Tristan] It basically just said,
you know,
172
00:09:08,631 --> 00:09:11,884
never before
in history have 50 designers--
173
00:09:12,426 --> 00:09:15,263
20- to 35-year-old white guys
in California--
174
00:09:15,888 --> 00:09:19,725
made decisions that would have an impact
on two billion people.
175
00:09:21,018 --> 00:09:24,438
Two billion people will have thoughts
that they didn't intend to have
176
00:09:24,522 --> 00:09:28,401
because a designer at Google said,
"This is how notifications work
177
00:09:28,484 --> 00:09:30,778
on that screen that you wake up to
in the morning."
178
00:09:31,195 --> 00:09:35,283
And we have a moral responsibility,
as Google, for solving this problem.
179
00:09:36,075 --> 00:09:37,743
And I sent this presentation
180
00:09:37,827 --> 00:09:41,789
to about 15, 20 of my closest colleagues
at Google,
181
00:09:41,872 --> 00:09:44,959
and I was very nervous about it.
I wasn't sure how it was gonna land.
182
00:09:46,460 --> 00:09:48,045
When I went to work the next day,
183
00:09:48,129 --> 00:09:50,464
most of the laptops
had the presentation open.
184
00:09:52,133 --> 00:09:54,552
Later that day, there was, like,
400 simultaneous viewers,
185
00:09:54,635 --> 00:09:56,053
so it just kept growing and growing.
186
00:09:56,137 --> 00:10:00,266
I got e-mails from all around the company.
I mean, people in every department saying,
187
00:10:00,349 --> 00:10:02,852
"I totally agree."
"I see this affecting my kids."
188
00:10:02,935 --> 00:10:04,979
"I see this affecting
the people around me."
189
00:10:05,062 --> 00:10:06,939
"We have to do something about this."
190
00:10:07,481 --> 00:10:10,818
It felt like I was sort of launching
a revolution or something like that.
191
00:10:11,861 --> 00:10:15,197
Later, I found out Larry Page
had been notified about this presentation
192
00:10:15,281 --> 00:10:17,908
-in three separate meetings that day.
-[indistinct chatter]
193
00:10:17,992 --> 00:10:20,286
[Tristan] And so, it created
this kind of cultural moment
194
00:10:20,870 --> 00:10:24,415
-that Google needed to take seriously.
-[whooshing]
195
00:10:26,000 --> 00:10:28,878
-[Tristan] And then... nothing.
-[whooshing fades]
196
00:10:32,673 --> 00:10:34,216
[message alerts chiming]
197
00:10:34,300 --> 00:10:36,135
[Tim] Everyone in 2006...
198
00:10:37,219 --> 00:10:39,221
including all of us at Facebook,
199
00:10:39,305 --> 00:10:43,392
just had total admiration for Google
and what Google had built,
200
00:10:43,476 --> 00:10:47,396
which was this incredibly useful service
201
00:10:47,480 --> 00:10:51,442
that did, far as we could tell,
lots of goodness for the world,
202
00:10:51,525 --> 00:10:54,695
and they built
this parallel money machine.
203
00:10:55,404 --> 00:11:00,034
We had such envy for that,
and it seemed so elegant to us...
204
00:11:00,826 --> 00:11:02,161
and so perfect.
205
00:11:02,953 --> 00:11:05,289
Facebook had been around
for about two years,
206
00:11:05,373 --> 00:11:08,376
um, and I was hired to come in
and figure out
207
00:11:08,459 --> 00:11:10,586
what the business model was gonna be
for the company.
208
00:11:10,670 --> 00:11:13,422
I was the director of monetization.
The point was, like,
209
00:11:13,506 --> 00:11:17,051
"You're the person who's gonna figure out
how this thing monetizes."
210
00:11:17,134 --> 00:11:19,804
And there were a lot of people
who did a lot of the work,
211
00:11:19,887 --> 00:11:25,476
but I was clearly one of the people
who was pointing towards...
212
00:11:26,769 --> 00:11:28,562
"Well, we have to make money, A...
213
00:11:29,313 --> 00:11:33,651
and I think this advertising model
is probably the most elegant way."
214
00:11:36,278 --> 00:11:38,280
[bright instrumental music playing]
215
00:11:42,243 --> 00:11:44,370
Uh-oh. What's this video Mom just sent us?
216
00:11:44,453 --> 00:11:46,747
Oh, that's from a talk show,
but that's pretty good.
217
00:11:46,831 --> 00:11:47,873
Guy's kind of a genius.
218
00:11:47,957 --> 00:11:50,584
He's talking all about deleting
social media, which you gotta do.
219
00:11:50,668 --> 00:11:52,878
I might have to start blocking
her e-mails.
220
00:11:52,962 --> 00:11:54,880
I don't even know
what she's talking about, man.
221
00:11:54,964 --> 00:11:56,090
She's worse than I am.
222
00:11:56,173 --> 00:11:58,509
-No, she only uses it for recipes.
-Right, and work.
223
00:11:58,592 --> 00:12:00,553
-And workout videos.
-[guy] And to check up on us.
224
00:12:00,636 --> 00:12:03,055
And everyone else she's ever met
in her entire life.
225
00:12:04,932 --> 00:12:07,893
If you are scrolling through your social media feed
226
00:12:07,977 --> 00:12:11,731
while you're watchin' us, you need to put the damn phone down and listen up
227
00:12:11,814 --> 00:12:14,817
'cause our next guest has written
an incredible book
228
00:12:14,900 --> 00:12:18,112
about how much it's wrecking our lives.
229
00:12:18,195 --> 00:12:19,447
Please welcome author
230
00:12:19,530 --> 00:12:23,951
of Ten Arguments for Deleting Your Social Media Accounts Right Now...
231
00:12:24,034 --> 00:12:26,287
-[Sunny Hostin] Uh-huh.
-...Jaron Lanier.
232
00:12:26,370 --> 00:12:27,913
[cohosts speaking indistinctly]
233
00:12:27,997 --> 00:12:31,834
[Jaron] Companies like Google and Facebook
are some of the wealthiest
234
00:12:31,917 --> 00:12:33,544
and most successful of all time.
235
00:12:33,711 --> 00:12:36,839
Uh, they have relatively few employees.
236
00:12:36,922 --> 00:12:41,427
They just have this giant computer
that rakes in money, right? Uh...
237
00:12:41,510 --> 00:12:42,970
Now, what are they being paid for?
238
00:12:43,053 --> 00:12:45,222
[chuckles]
That's a really important question.
239
00:12:47,308 --> 00:12:50,311
[Roger] So, I've been an investor
in technology for 35 years.
240
00:12:51,020 --> 00:12:54,356
The first 50 years of Silicon Valley,
the industry made products--
241
00:12:54,440 --> 00:12:55,566
hardware, software--
242
00:12:55,649 --> 00:12:58,402
sold 'em to customers.
Nice, simple business.
243
00:12:58,486 --> 00:13:01,447
For the last ten years,
the biggest companies in Silicon Valley
244
00:13:01,530 --> 00:13:03,866
have been in the business
of selling their users.
245
00:13:03,949 --> 00:13:05,910
It's a little even trite to say now,
246
00:13:05,993 --> 00:13:09,205
but... because we don't pay
for the products that we use,
247
00:13:09,288 --> 00:13:12,166
advertisers pay
for the products that we use.
248
00:13:12,249 --> 00:13:14,210
Advertisers are the customers.
249
00:13:14,710 --> 00:13:16,086
We're the thing being sold.
250
00:13:16,170 --> 00:13:17,630
The classic saying is:
251
00:13:17,713 --> 00:13:21,592
"If you're not paying for the product,
then you are the product."
252
00:13:23,385 --> 00:13:27,223
A lot of people think, you know,
"Oh, well, Google's just a search box,
253
00:13:27,306 --> 00:13:29,850
and Facebook's just a place to see
what my friends are doing
254
00:13:29,934 --> 00:13:31,101
and see their photos."
255
00:13:31,185 --> 00:13:35,481
But what they don't realize
is they're competing for your attention.
256
00:13:36,524 --> 00:13:41,111
So, you know, Facebook, Snapchat,
Twitter, Instagram, YouTube,
257
00:13:41,195 --> 00:13:45,699
companies like this, their business model
is to keep people engaged on the screen.
258
00:13:46,283 --> 00:13:49,578
Let's figure out how to get
as much of this person's attention
259
00:13:49,662 --> 00:13:50,955
as we possibly can.
260
00:13:51,455 --> 00:13:53,374
How much time can we get you to spend?
261
00:13:53,874 --> 00:13:56,669
How much of your life can we get you
to give to us?
262
00:13:58,629 --> 00:14:01,090
[Justin] When you think about
how some of these companies work,
263
00:14:01,173 --> 00:14:02,424
it starts to make sense.
264
00:14:03,050 --> 00:14:06,095
There are all these services
on the Internet that we think of as free,
265
00:14:06,178 --> 00:14:09,473
but they're not free.
They're paid for by advertisers.
266
00:14:09,557 --> 00:14:11,559
Why do advertisers pay those companies?
267
00:14:11,642 --> 00:14:14,687
They pay in exchange for showing their ads
to us.
268
00:14:14,770 --> 00:14:18,357
We're the product. Our attention
is the product being sold to advertisers.
269
00:14:18,816 --> 00:14:20,442
That's a little too simplistic.
270
00:14:20,860 --> 00:14:23,654
It's the gradual, slight,
imperceptible change
271
00:14:23,737 --> 00:14:26,574
in your own behavior and perception
that is the product.
272
00:14:27,658 --> 00:14:30,244
And that is the product.
It's the only possible product.
273
00:14:30,327 --> 00:14:34,081
There's nothing else on the table
that could possibly be called the product.
274
00:14:34,164 --> 00:14:37,001
That's the only thing there is
for them to make money from.
275
00:14:37,668 --> 00:14:39,253
Changing what you do,
276
00:14:39,336 --> 00:14:41,714
how you think, who you are.
277
00:14:42,631 --> 00:14:45,301
It's a gradual change. It's slight.
278
00:14:45,384 --> 00:14:48,971
If you can go to somebody and you say,
"Give me $10 million,
279
00:14:49,054 --> 00:14:54,310
and I will change the world one percent
in the direction you want it to change..."
280
00:14:54,852 --> 00:14:58,188
It's the world! That can be incredible,
and that's worth a lot of money.
281
00:14:59,315 --> 00:15:00,149
Okay.
282
00:15:00,691 --> 00:15:04,570
[Shoshana] This is what every business
has always dreamt of:
283
00:15:04,653 --> 00:15:10,910
to have a guarantee that if it places
an ad, it will be successful.
284
00:15:11,327 --> 00:15:12,786
That's their business.
285
00:15:12,870 --> 00:15:14,413
They sell certainty.
286
00:15:14,997 --> 00:15:17,625
In order to be successful
in that business,
287
00:15:17,708 --> 00:15:19,793
you have to have great predictions.
288
00:15:20,085 --> 00:15:24,173
Great predictions begin
with one imperative:
289
00:15:25,215 --> 00:15:26,926
you need a lot of data.
290
00:15:29,136 --> 00:15:31,305
Many people call this
surveillance capitalism,
291
00:15:31,639 --> 00:15:34,350
capitalism profiting
off of the infinite tracking
292
00:15:34,433 --> 00:15:38,062
of everywhere everyone goes
by large technology companies
293
00:15:38,145 --> 00:15:40,356
whose business model is to make sure
294
00:15:40,439 --> 00:15:42,858
that advertisers are as successful
as possible.
295
00:15:42,942 --> 00:15:45,569
This is a new kind of marketplace now.
296
00:15:45,653 --> 00:15:48,072
It's a marketplace
that never existed before.
297
00:15:48,822 --> 00:15:55,371
And it's a marketplace
that trades exclusively in human futures.
298
00:15:56,080 --> 00:16:01,585
Just like there are markets that trade
in pork belly futures or oil futures.
299
00:16:02,127 --> 00:16:07,591
We now have markets
that trade in human futures at scale,
300
00:16:08,175 --> 00:16:13,472
and those markets have produced
the trillions of dollars
301
00:16:14,014 --> 00:16:19,269
that have made the Internet companies
the richest companies
302
00:16:19,353 --> 00:16:22,356
in the history of humanity.
303
00:16:23,357 --> 00:16:25,359
[indistinct chatter]
304
00:16:27,361 --> 00:16:30,990
[Jeff] What I want people to know
is that everything they're doing online
305
00:16:31,073 --> 00:16:34,326
is being watched, is being tracked,
is being measured.
306
00:16:35,035 --> 00:16:39,623
Every single action you take
is carefully monitored and recorded.
307
00:16:39,707 --> 00:16:43,836
Exactly what image you stop and look at,
for how long you look at it.
308
00:16:43,919 --> 00:16:45,796
Oh, yeah, seriously,
for how long you look at it.
309
00:16:45,879 --> 00:16:47,881
[monitors beeping]
310
00:16:50,509 --> 00:16:52,219
[Tristan] They know when people are lonely.
311
00:16:52,302 --> 00:16:53,804
They know when people are depressed.
312
00:16:53,887 --> 00:16:57,099
They know when people are looking at photos of your ex-romantic partners.
313
00:16:57,182 --> 00:17:00,853
They know what you're doing late at night. They know the entire thing.
314
00:17:01,270 --> 00:17:03,230
Whether you're an introvert or an extrovert,
315
00:17:03,313 --> 00:17:06,817
or what kind of neuroses you have, what your personality type is like.
316
00:17:08,193 --> 00:17:11,613
[Shoshana] They have more information
about us
317
00:17:11,697 --> 00:17:14,324
than has ever been imagined
in human history.
318
00:17:14,950 --> 00:17:16,368
It is unprecedented.
319
00:17:18,579 --> 00:17:22,791
And so, all of this data that we're...
that we're just pouring out all the time
320
00:17:22,875 --> 00:17:26,754
is being fed into these systems
that have almost no human supervision
321
00:17:27,463 --> 00:17:30,883
and that are making better and better
and better and better predictions
322
00:17:30,966 --> 00:17:33,552
about what we're gonna do
and... and who we are.
323
00:17:33,635 --> 00:17:35,637
[indistinct chatter]
324
00:17:36,305 --> 00:17:39,349
[Aza] People have the misconception
it's our data being sold.
325
00:17:40,350 --> 00:17:43,187
It's not in Facebook's business interest
to give up the data.
326
00:17:45,522 --> 00:17:47,107
What do they do with that data?
327
00:17:49,401 --> 00:17:50,986
[console whirring]
328
00:17:51,070 --> 00:17:54,490
[Aza] They build models
that predict our actions,
329
00:17:54,573 --> 00:17:57,618
and whoever has the best model wins.
330
00:18:02,706 --> 00:18:04,041
His scrolling speed is slowing.
331
00:18:04,124 --> 00:18:06,085
Nearing the end
of his average session length.
332
00:18:06,168 --> 00:18:07,002
Decreasing ad load.
333
00:18:07,086 --> 00:18:08,337
Pull back on friends and family.
334
00:18:09,588 --> 00:18:11,340
[Tristan] On the other side of the screen,
335
00:18:11,423 --> 00:18:15,469
it's almost as if they had
this avatar voodoo doll-like model of us.
336
00:18:16,845 --> 00:18:18,180
All of the things we've ever done,
337
00:18:18,263 --> 00:18:19,473
all the clicks we've ever made,
338
00:18:19,556 --> 00:18:21,642
all the videos we've watched,
all the likes,
339
00:18:21,725 --> 00:18:25,354
that all gets brought back into building
a more and more accurate model.
340
00:18:25,896 --> 00:18:27,481
The model, once you have it,
341
00:18:27,564 --> 00:18:29,858
you can predict the kinds of things
that person does.
342
00:18:29,942 --> 00:18:31,777
Right, let me just test.
343
00:18:32,569 --> 00:18:34,988
[Tristan] Where you'll go.
I can predict what kind of videos
344
00:18:35,072 --> 00:18:36,115
will keep you watching.
345
00:18:36,198 --> 00:18:39,159
I can predict what kinds of emotions tend
to trigger you.
346
00:18:39,243 --> 00:18:40,410
[blue AI] Yes, perfect.
347
00:18:41,578 --> 00:18:43,372
The most epic fails of the year.
348
00:18:46,125 --> 00:18:47,543
-[crowd groans on video]
-[whooshes]
349
00:18:48,627 --> 00:18:51,088
-Perfect. That worked.
-Following with another video.
350
00:18:51,171 --> 00:18:54,049
Beautiful. Let's squeeze in a sneaker ad
before it starts.
351
00:18:56,426 --> 00:18:58,178
[Tristan] At a lot
of technology companies,
352
00:18:58,262 --> 00:18:59,721
there's three main goals.
353
00:18:59,805 --> 00:19:01,348
There's the engagement goal:
354
00:19:01,431 --> 00:19:03,684
to drive up your usage,
to keep you scrolling.
355
00:19:04,601 --> 00:19:06,145
There's the growth goal:
356
00:19:06,228 --> 00:19:08,689
to keep you coming back
and inviting as many friends
357
00:19:08,772 --> 00:19:10,816
and getting them to invite more friends.
358
00:19:11,650 --> 00:19:13,152
And then there's the advertising goal:
359
00:19:13,235 --> 00:19:14,987
to make sure that,
as all that's happening,
360
00:19:15,070 --> 00:19:17,406
we're making as much money as possible
from advertising.
361
00:19:18,115 --> 00:19:19,158
[console beeps]
362
00:19:19,241 --> 00:19:21,994
Each of these goals are powered
by algorithms
363
00:19:22,077 --> 00:19:24,454
whose job is to figure out
what to show you
364
00:19:24,538 --> 00:19:26,165
to keep those numbers going up.
365
00:19:26,623 --> 00:19:29,918
We often talked about, at Facebook,
this idea
366
00:19:30,002 --> 00:19:34,006
of being able to just dial that as needed.
367
00:19:34,673 --> 00:19:38,594
And, you know, we talked
about having Mark have those dials.
368
00:19:41,305 --> 00:19:44,474
"Hey, I want more users in Korea today."
369
00:19:45,684 --> 00:19:46,602
"Turn the dial."
370
00:19:47,436 --> 00:19:49,188
"Let's dial up the ads a little bit."
371
00:19:49,980 --> 00:19:51,899
"Dial up monetization, just slightly."
372
00:19:52,858 --> 00:19:55,444
And so, that happ--
373
00:19:55,527 --> 00:19:59,239
I mean, at all of these companies,
there is that level of precision.
374
00:19:59,990 --> 00:20:02,409
-Dude, how--
-I don't know how I didn't get carded.
375
00:20:02,492 --> 00:20:05,704
-That ref just, like, sucked or something.
-You got literally all the way...
376
00:20:05,787 --> 00:20:07,956
-That's Rebecca. Go talk to her.
-I know who it is.
377
00:20:08,040 --> 00:20:10,834
-Dude, yo, go talk to her.
-[guy] I'm workin' on it.
378
00:20:10,918 --> 00:20:14,171
His calendar says he's on a break
right now. We should be live.
379
00:20:14,755 --> 00:20:16,465
[sighs] Want me to nudge him?
380
00:20:17,132 --> 00:20:18,050
Yeah, nudge away.
381
00:20:18,133 --> 00:20:19,092
[console beeps]
382
00:20:21,637 --> 00:20:24,181
"Your friend Tyler just joined.
Say hi with a wave."
383
00:20:26,016 --> 00:20:27,184
[Engagement AI] Come on, Ben.
384
00:20:27,267 --> 00:20:29,311
Send a wave. [sighs]
385
00:20:29,394 --> 00:20:32,606
-You're not... Go talk to her, dude.
-[phone vibrates, chimes]
386
00:20:33,857 --> 00:20:35,484
-[Ben sighs]
-[cell phone chimes]
387
00:20:36,902 --> 00:20:37,986
[console beeps]
388
00:20:38,070 --> 00:20:40,447
New link! All right, we're on. [exhales]
389
00:20:40,948 --> 00:20:46,078
Follow that up with a post
from User 079044238820, Rebecca.
390
00:20:46,161 --> 00:20:49,790
Good idea. GPS coordinates indicate
that they're in close proximity.
391
00:20:55,921 --> 00:20:57,172
He's primed for an ad.
392
00:20:57,631 --> 00:20:58,632
Auction time.
393
00:21:00,133 --> 00:21:02,803
Sold! To Deep Fade hair wax.
394
00:21:03,387 --> 00:21:07,933
We had 468 interested bidders. We sold Ben
at 3.262 cents for an impression.
395
00:21:08,850 --> 00:21:10,852
[melancholy piano music playing]
396
00:21:14,147 --> 00:21:15,065
[Ben sighs]
397
00:21:17,109 --> 00:21:18,735
[Jaron] We've created a world
398
00:21:18,819 --> 00:21:21,530
in which online connection
has become primary,
399
00:21:22,072 --> 00:21:23,907
especially for younger generations.
400
00:21:23,991 --> 00:21:28,328
And yet, in that world,
any time two people connect,
401
00:21:29,162 --> 00:21:33,250
the only way it's financed
is through a sneaky third person
402
00:21:33,333 --> 00:21:35,627
who's paying to manipulate
those two people.
403
00:21:36,128 --> 00:21:39,381
So, we've created
an entire global generation of people
404
00:21:39,464 --> 00:21:44,011
who are raised within a context
where the very meaning of communication,
405
00:21:44,094 --> 00:21:47,431
the very meaning of culture,
is manipulation.
406
00:21:47,514 --> 00:21:49,641
We've put deceit and sneakiness
407
00:21:49,725 --> 00:21:52,311
at the absolute center
of everything we do.
408
00:22:05,615 --> 00:22:07,242
-[interviewer] Grab the...
-[Tristan] Okay.
409
00:22:07,326 --> 00:22:09,286
-Where's it help to hold it?
-[interviewer] Great.
410
00:22:09,369 --> 00:22:10,787
-[Tristan] Here?
-[interviewer] Yeah.
411
00:22:10,871 --> 00:22:13,832
How does this come across on camera
if I were to do, like, this move--
412
00:22:13,915 --> 00:22:15,542
-[interviewer] We can--
-[blows] Like that?
413
00:22:15,625 --> 00:22:16,918
-[interviewer laughs] What?
-Yeah.
414
00:22:17,002 --> 00:22:19,004
-[interviewer] Do that again.
-Exactly. Yeah. [blows]
415
00:22:19,087 --> 00:22:20,589
Yeah. No, it's probably not...
416
00:22:20,672 --> 00:22:21,965
Like... yeah.
417
00:22:22,466 --> 00:22:23,884
I mean, this one is less...
418
00:22:29,681 --> 00:22:33,268
[interviewer laughs] Larissa's, like,
actually freaking out over here.
419
00:22:34,728 --> 00:22:35,562
Is that good?
420
00:22:35,645 --> 00:22:37,773
[instrumental music playing]
421
00:22:37,856 --> 00:22:41,068
[Tristan] I was, like, five years old
when I learned how to do magic.
422
00:22:41,151 --> 00:22:45,781
And I could fool adults,
fully-grown adults with, like, PhDs.
423
00:22:55,040 --> 00:22:57,709
Magicians were almost like
the first neuroscientists
424
00:22:57,793 --> 00:22:58,960
and psychologists.
425
00:22:59,044 --> 00:23:02,005
Like, they were the ones
who first understood
426
00:23:02,089 --> 00:23:03,382
how people's minds work.
427
00:23:04,216 --> 00:23:07,677
They just, in real time, are testing
lots and lots of stuff on people.
428
00:23:09,137 --> 00:23:11,139
A magician understands something,
429
00:23:11,223 --> 00:23:14,017
some part of your mind
that we're not aware of.
430
00:23:14,101 --> 00:23:15,936
That's what makes the illusion work.
431
00:23:16,019 --> 00:23:20,607
Doctors, lawyers, people who know
how to build 747s or nuclear missiles,
432
00:23:20,690 --> 00:23:24,361
they don't know more about
how their own mind is vulnerable.
433
00:23:24,444 --> 00:23:26,113
That's a separate discipline.
434
00:23:26,571 --> 00:23:28,990
And it's a discipline
that applies to all human beings.
435
00:23:30,909 --> 00:23:34,079
From that perspective, you can have
a very different understanding
436
00:23:34,162 --> 00:23:35,580
of what technology is doing.
437
00:23:36,873 --> 00:23:39,584
When I was
at the Stanford Persuasive Technology Lab,
438
00:23:39,668 --> 00:23:41,044
this is what we learned.
439
00:23:41,628 --> 00:23:43,463
How could you use everything we know
440
00:23:43,547 --> 00:23:45,882
about the psychology
of what persuades people
441
00:23:45,966 --> 00:23:48,385
and build that into technology?
442
00:23:48,468 --> 00:23:50,887
Now, many of you in the audience
are geniuses already.
443
00:23:50,971 --> 00:23:55,851
I think that's true, but my goal is
to turn you into a behavior-change genius.
444
00:23:56,852 --> 00:24:01,148
There are many prominent Silicon Valley
figures who went through that class--
445
00:24:01,231 --> 00:24:05,485
key growth figures at Facebook and Uber
and... and other companies--
446
00:24:05,569 --> 00:24:09,197
and learned how to make technology
more persuasive,
447
00:24:09,614 --> 00:24:10,782
Tristan being one.
448
00:24:12,284 --> 00:24:14,619
[Tristan] Persuasive technology
is just sort of design
449
00:24:14,703 --> 00:24:16,580
intentionally applied to the extreme,
450
00:24:16,663 --> 00:24:18,874
where we really want to modify
someone's behavior.
451
00:24:18,957 --> 00:24:20,542
We want them to take this action.
452
00:24:20,625 --> 00:24:23,336
We want them to keep doing this
with their finger.
453
00:24:23,420 --> 00:24:26,256
You pull down and you refresh,
it's gonna be a new thing at the top.
454
00:24:26,339 --> 00:24:28,508
Pull down and refresh again, it's new.
Every single time.
455
00:24:28,592 --> 00:24:33,722
Which, in psychology, we call
a positive intermittent reinforcement.
456
00:24:33,805 --> 00:24:37,142
You don't know when you're gonna get it
or if you're gonna get something,
457
00:24:37,225 --> 00:24:40,061
which operates just like the slot machines
in Vegas.
458
00:24:40,145 --> 00:24:42,230
It's not enough
that you use the product consciously,
459
00:24:42,314 --> 00:24:44,024
I wanna dig down deeper
into the brain stem
460
00:24:44,107 --> 00:24:45,817
and implant, inside of you,
461
00:24:45,901 --> 00:24:47,652
an unconscious habit
462
00:24:47,736 --> 00:24:50,864
so that you are being programmed
at a deeper level.
463
00:24:50,947 --> 00:24:52,115
You don't even realize it.
464
00:24:52,532 --> 00:24:54,034
[teacher] A man, James Marshall...
465
00:24:54,117 --> 00:24:56,286
[Tristan] Every time you see it there
on the counter,
466
00:24:56,369 --> 00:24:59,789
and you just look at it,
and you know if you reach over,
467
00:24:59,873 --> 00:25:01,333
it just might have something for you,
468
00:25:01,416 --> 00:25:03,877
so you play that slot machine
to see what you got, right?
469
00:25:03,960 --> 00:25:06,046
That's not by accident.
That's a design technique.
470
00:25:06,129 --> 00:25:08,632
[teacher] He brings a golden nugget
to an officer
471
00:25:09,841 --> 00:25:11,301
in the army in San Francisco.
472
00:25:12,219 --> 00:25:15,388
Mind you, the... the population
of San Francisco was only...
473
00:25:15,472 --> 00:25:17,432
[Jeff]
Another example is photo tagging.
474
00:25:17,516 --> 00:25:19,643
-[teacher] The secret didn't last.
-[phone vibrates]
475
00:25:19,726 --> 00:25:21,186
[Jeff] So, if you get an e-mail
476
00:25:21,269 --> 00:25:24,064
that says your friend just tagged you
in a photo,
477
00:25:24,147 --> 00:25:28,568
of course you're going to click
on that e-mail and look at the photo.
478
00:25:29,152 --> 00:25:31,821
It's not something
you can just decide to ignore.
479
00:25:32,364 --> 00:25:34,157
This is deep-seated, like,
480
00:25:34,241 --> 00:25:36,326
human personality
that they're tapping into.
481
00:25:36,409 --> 00:25:38,078
What you should be asking yourself is:
482
00:25:38,161 --> 00:25:40,288
"Why doesn't that e-mail contain
the photo in it?
483
00:25:40,372 --> 00:25:42,457
It would be a lot easier
to see the photo."
484
00:25:42,541 --> 00:25:45,919
When Facebook found that feature,
they just dialed the hell out of that
485
00:25:46,002 --> 00:25:48,505
because they said, "This is gonna be
a great way to grow activity.
486
00:25:48,588 --> 00:25:51,091
Let's just get people tagging each other
in photos all day long."
487
00:25:51,174 --> 00:25:53,176
[upbeat techno music playing]
488
00:25:57,889 --> 00:25:58,890
[cell phone chimes]
489
00:25:59,349 --> 00:26:00,475
He commented.
490
00:26:00,559 --> 00:26:01,434
[Growth AI] Nice.
491
00:26:01,935 --> 00:26:04,688
Okay, Rebecca received it,
and she is responding.
492
00:26:04,771 --> 00:26:07,566
All right, let Ben know that she's typing
so we don't lose him.
493
00:26:07,649 --> 00:26:08,733
Activating ellipsis.
494
00:26:09,776 --> 00:26:11,945
[teacher continues speaking indistinctly]
495
00:26:13,697 --> 00:26:15,865
[tense instrumental music playing]
496
00:26:19,953 --> 00:26:21,329
Great, she posted.
497
00:26:21,454 --> 00:26:24,249
He's commenting on her comment
about his comment on her post.
498
00:26:25,041 --> 00:26:26,418
Hold on, he stopped typing.
499
00:26:26,751 --> 00:26:27,752
Let's autofill.
500
00:26:28,420 --> 00:26:30,005
Emojis. He loves emojis.
501
00:26:33,842 --> 00:26:34,676
He went with fire.
502
00:26:34,759 --> 00:26:36,803
[clicks tongue, sighs]
I was rootin' for eggplant.
503
00:26:38,597 --> 00:26:42,726
[Tristan] There's an entire discipline
and field called "growth hacking."
504
00:26:42,809 --> 00:26:47,147
Teams of engineers
whose job is to hack people's psychology
505
00:26:47,230 --> 00:26:48,565
so they can get more growth.
506
00:26:48,648 --> 00:26:50,984
They can get more user sign-ups,
more engagement.
507
00:26:51,067 --> 00:26:52,861
They can get you to invite more people.
508
00:26:52,944 --> 00:26:55,989
After all the testing, all the iterating,
all of this stuff,
509
00:26:56,072 --> 00:26:57,907
you know the single biggest thing
we realized?
510
00:26:57,991 --> 00:27:00,702
Get any individual to seven friends
in ten days.
511
00:27:01,953 --> 00:27:02,787
That was it.
512
00:27:02,871 --> 00:27:05,498
Chamath was the head of growth at Facebook
early on,
513
00:27:05,582 --> 00:27:08,251
and he's very well known
in the tech industry
514
00:27:08,335 --> 00:27:11,004
for pioneering a lot of the growth tactics
515
00:27:11,087 --> 00:27:14,758
that were used to grow Facebook
at incredible speed.
516
00:27:14,841 --> 00:27:18,553
And those growth tactics have then become
the standard playbook for Silicon Valley.
517
00:27:18,637 --> 00:27:21,222
They were used at Uber
and at a bunch of other companies.
518
00:27:21,306 --> 00:27:27,062
One of the things that he pioneered
was the use of scientific A/B testing
519
00:27:27,145 --> 00:27:28,480
of small feature changes.
520
00:27:29,022 --> 00:27:30,940
Companies like Google and Facebook
521
00:27:31,024 --> 00:27:34,569
would roll out
lots of little, tiny experiments
522
00:27:34,653 --> 00:27:36,821
that they were constantly doing on users.
523
00:27:36,905 --> 00:27:39,866
And over time,
by running these constant experiments,
524
00:27:39,949 --> 00:27:43,036
you... you develop the most optimal way
525
00:27:43,119 --> 00:27:45,288
to get users to do
what you want them to do.
526
00:27:45,372 --> 00:27:46,790
It's... It's manipulation.
527
00:27:47,332 --> 00:27:49,459
[interviewer]
Uh, you're making me feel like a lab rat.
528
00:27:49,834 --> 00:27:51,920
You are a lab rat. We're all lab rats.
529
00:27:52,545 --> 00:27:55,548
And it's not like we're lab rats
for developing a cure for cancer.
530
00:27:55,632 --> 00:27:58,134
It's not like they're trying
to benefit us.
531
00:27:58,218 --> 00:28:01,680
Right? We're just zombies,
and they want us to look at more ads
532
00:28:01,763 --> 00:28:03,181
so they can make more money.
533
00:28:03,556 --> 00:28:05,266
[Shoshana] Facebook conducted
534
00:28:05,350 --> 00:28:08,228
what they called
"massive-scale contagion experiments."
535
00:28:08,311 --> 00:28:09,145
Okay.
536
00:28:09,229 --> 00:28:13,066
[Shoshana] How do we use subliminal cues
on the Facebook pages
537
00:28:13,400 --> 00:28:17,654
to get more people to go vote
in the midterm elections?
538
00:28:17,987 --> 00:28:20,824
And they discovered
that they were able to do that.
539
00:28:20,907 --> 00:28:24,160
One thing they concluded
is that we now know
540
00:28:24,744 --> 00:28:28,915
we can affect real-world behavior
and emotions
541
00:28:28,998 --> 00:28:32,877
without ever triggering
the user's awareness.
542
00:28:33,378 --> 00:28:37,382
They are completely clueless.
543
00:28:38,049 --> 00:28:41,970
We're pointing these engines of AI
back at ourselves
544
00:28:42,053 --> 00:28:46,224
to reverse-engineer what elicits responses
from us.
545
00:28:47,100 --> 00:28:49,561
Almost like you're stimulating nerve cells
on a spider
546
00:28:49,644 --> 00:28:51,479
to see what causes its legs to respond.
547
00:28:51,938 --> 00:28:53,940
So, it really is
this kind of prison experiment
548
00:28:54,023 --> 00:28:56,735
where we're just, you know,
roping people into the matrix,
549
00:28:56,818 --> 00:29:00,572
and we're just harvesting all this money
and... and data from all their activity
550
00:29:00,655 --> 00:29:01,489
to profit from.
551
00:29:01,573 --> 00:29:03,450
And we're not even aware
that it's happening.
552
00:29:04,117 --> 00:29:07,912
So, we want to psychologically figure out
how to manipulate you as fast as possible
553
00:29:07,996 --> 00:29:10,081
and then give you back that dopamine hit.
554
00:29:10,165 --> 00:29:12,375
We did that brilliantly at Facebook.
555
00:29:12,625 --> 00:29:14,919
Instagram has done it.
WhatsApp has done it.
556
00:29:15,003 --> 00:29:17,380
You know, Snapchat has done it.
Twitter has done it.
557
00:29:17,464 --> 00:29:19,424
I mean, it's exactly the kind of thing
558
00:29:19,507 --> 00:29:22,427
that a... that a hacker like myself
would come up with
559
00:29:22,510 --> 00:29:27,015
because you're exploiting a vulnerability
in... in human psychology.
560
00:29:27,807 --> 00:29:29,726
[chuckles] And I just...
I think that we...
561
00:29:29,809 --> 00:29:33,438
you know, the inventors, creators...
562
00:29:33,980 --> 00:29:37,317
uh, you know, and it's me, it's Mark,
it's the...
563
00:29:37,400 --> 00:29:40,403
you know, Kevin Systrom at Instagram...
It's all of these people...
564
00:29:40,487 --> 00:29:46,451
um, understood this consciously,
and we did it anyway.
565
00:29:50,580 --> 00:29:53,750
No one got upset when bicycles showed up.
566
00:29:55,043 --> 00:29:58,004
Right? Like, if everyone's starting
to go around on bicycles,
567
00:29:58,087 --> 00:30:00,924
no one said,
"Oh, my God, we've just ruined society.
568
00:30:01,007 --> 00:30:03,051
[chuckles]
Like, bicycles are affecting people.
569
00:30:03,134 --> 00:30:05,303
They're pulling people
away from their kids.
570
00:30:05,386 --> 00:30:08,723
They're ruining the fabric of democracy.
People can't tell what's true."
571
00:30:08,807 --> 00:30:11,476
Like, we never said any of that stuff
about a bicycle.
572
00:30:12,769 --> 00:30:16,147
If something is a tool,
it genuinely is just sitting there,
573
00:30:16,731 --> 00:30:18,733
waiting patiently.
574
00:30:19,317 --> 00:30:22,821
If something is not a tool,
it's demanding things from you.
575
00:30:22,904 --> 00:30:26,533
It's seducing you. It's manipulating you.
It wants things from you.
576
00:30:26,950 --> 00:30:30,495
And we've moved away from having
a tools-based technology environment
577
00:30:31,037 --> 00:30:34,499
to an addiction- and manipulation-based
technology environment.
578
00:30:34,582 --> 00:30:35,708
That's what's changed.
579
00:30:35,792 --> 00:30:39,420
Social media isn't a tool
that's just waiting to be used.
580
00:30:39,504 --> 00:30:43,466
It has its own goals,
and it has its own means of pursuing them
581
00:30:43,550 --> 00:30:45,677
by using your psychology against you.
582
00:30:45,760 --> 00:30:47,762
[ominous instrumental music playing]
583
00:30:57,564 --> 00:31:00,567
[Tim] Rewind a few years ago,
I was the...
584
00:31:00,650 --> 00:31:02,318
I was the president of Pinterest.
585
00:31:03,152 --> 00:31:05,113
I was coming home,
586
00:31:05,196 --> 00:31:08,366
and I couldn't get off my phone
once I got home,
587
00:31:08,449 --> 00:31:12,161
despite having two young kids
who needed my love and attention.
588
00:31:12,245 --> 00:31:15,748
I was in the pantry, you know,
typing away on an e-mail
589
00:31:15,832 --> 00:31:17,542
or sometimes looking at Pinterest.
590
00:31:18,001 --> 00:31:19,627
I thought, "God, this is classic irony.
591
00:31:19,711 --> 00:31:22,046
I am going to work during the day
592
00:31:22,130 --> 00:31:26,426
and building something
that then I am falling prey to."
593
00:31:26,509 --> 00:31:30,096
And I couldn't... I mean, some
of those moments, I couldn't help myself.
594
00:31:30,179 --> 00:31:31,848
-[notification chimes]
-[woman gasps]
595
00:31:32,307 --> 00:31:36,102
The one
that I'm... I'm most prone to is Twitter.
596
00:31:36,185 --> 00:31:38,021
Uh, used to be Reddit.
597
00:31:38,104 --> 00:31:42,859
I actually had to write myself software
to break my addiction to reading Reddit.
598
00:31:42,942 --> 00:31:44,903
-[notifications chime]
-[slot machines whir]
599
00:31:45,403 --> 00:31:47,780
I'm probably most addicted to my e-mail.
600
00:31:47,864 --> 00:31:49,866
I mean, really. I mean, I... I feel it.
601
00:31:49,949 --> 00:31:51,409
-[notifications chime]
-[woman gasps]
602
00:31:51,492 --> 00:31:52,493
[electricity crackles]
603
00:31:52,577 --> 00:31:54,954
Well, I mean, it's sort-- it's interesting
604
00:31:55,038 --> 00:31:58,166
that knowing what was going on
behind the curtain,
605
00:31:58,249 --> 00:32:01,628
I still wasn't able to control my usage.
606
00:32:01,711 --> 00:32:03,046
So, that's a little scary.
607
00:32:03,630 --> 00:32:07,050
Even knowing how these tricks work,
I'm still susceptible to them.
608
00:32:07,133 --> 00:32:09,886
I'll still pick up the phone,
and 20 minutes will disappear.
609
00:32:09,969 --> 00:32:11,387
[notifications chime]
610
00:32:11,471 --> 00:32:12,722
-[fluid rushes]
-[woman gasps]
611
00:32:12,805 --> 00:32:15,725
Do you check your smartphone
before you pee in the morning
612
00:32:15,808 --> 00:32:17,477
or while you're peeing in the morning?
613
00:32:17,560 --> 00:32:19,479
'Cause those are the only two choices.
614
00:32:19,562 --> 00:32:23,274
I tried through willpower,
just pure willpower...
615
00:32:23,358 --> 00:32:26,903
"I'll put down my phone, I'll leave
my phone in the car when I get home."
616
00:32:26,986 --> 00:32:30,573
I think I told myself a thousand times,
a thousand different days,
617
00:32:30,657 --> 00:32:32,617
"I am not gonna bring my phone
to the bedroom,"
618
00:32:32,700 --> 00:32:34,535
and then 9:00 p.m. rolls around.
619
00:32:34,619 --> 00:32:37,121
"Well, I wanna bring my phone
in the bedroom."
620
00:32:37,205 --> 00:32:39,290
[takes a deep breath]
And so, that was sort of...
621
00:32:39,374 --> 00:32:41,125
Willpower was kind of attempt one,
622
00:32:41,209 --> 00:32:44,295
and then attempt two was,
you know, brute force.
623
00:32:44,379 --> 00:32:48,091
[announcer] Introducing the Kitchen Safe.
The Kitchen Safe is a revolutionary,
624
00:32:48,174 --> 00:32:51,678
new, time-locking container
that helps you fight temptation.
625
00:32:51,761 --> 00:32:56,724
All David has to do is place those temptations in the Kitchen Safe.
626
00:32:57,392 --> 00:33:00,395
Next, he rotates the dial to set the timer.
627
00:33:01,479 --> 00:33:04,232
And, finally, he presses the dial to activate the lock.
628
00:33:04,315 --> 00:33:05,525
The Kitchen Safe is great...
629
00:33:05,608 --> 00:33:06,776
We have that, don't we?
630
00:33:06,859 --> 00:33:08,653
...video games, credit cards, and cell phones.
631
00:33:08,736 --> 00:33:09,654
Yeah, we do.
632
00:33:09,737 --> 00:33:12,407
[announcer] Once the Kitchen Safe is locked, it cannot be opened
633
00:33:12,490 --> 00:33:13,866
until the timer reaches zero.
634
00:33:13,950 --> 00:33:15,618
[Anna] So, here's the thing.
635
00:33:15,702 --> 00:33:17,537
Social media is a drug.
636
00:33:17,620 --> 00:33:20,873
I mean,
we have a basic biological imperative
637
00:33:20,957 --> 00:33:23,084
to connect with other people.
638
00:33:23,167 --> 00:33:28,214
That directly affects the release
of dopamine in the reward pathway.
639
00:33:28,297 --> 00:33:32,552
Millions of years of evolution, um,
are behind that system
640
00:33:32,635 --> 00:33:35,596
to get us to come together
and live in communities,
641
00:33:35,680 --> 00:33:38,016
to find mates, to propagate our species.
642
00:33:38,099 --> 00:33:41,853
So, there's no doubt
that a vehicle like social media,
643
00:33:41,936 --> 00:33:45,690
which optimizes this connection
between people,
644
00:33:45,773 --> 00:33:48,568
is going to have the potential
for addiction.
645
00:33:52,071 --> 00:33:54,115
-Mmm! [laughs]
-Dad, stop!
646
00:33:55,450 --> 00:33:58,453
I have, like, 1,000 more snips
to send before dinner.
647
00:33:58,536 --> 00:34:00,788
-[dad] Snips?
-I don't know what a snip is.
648
00:34:00,872 --> 00:34:03,207
-Mm, that smells good, baby.
-All right. Thank you.
649
00:34:03,291 --> 00:34:05,877
I was, um, thinking we could use
all five senses
650
00:34:05,960 --> 00:34:07,712
to enjoy our dinner tonight.
651
00:34:07,795 --> 00:34:11,382
So, I decided that we're not gonna have
any cell phones at the table tonight.
652
00:34:11,466 --> 00:34:13,301
So, turn 'em in.
653
00:34:13,801 --> 00:34:14,802
-Really?
-[mom] Yep.
654
00:34:15,928 --> 00:34:18,056
-All right.
-Thank you. Ben?
655
00:34:18,139 --> 00:34:20,433
-Okay.
-Mom, the phone pirate. [scoffs]
656
00:34:21,100 --> 00:34:21,934
-Got it.
-Mom!
657
00:34:22,518 --> 00:34:26,147
So, they will be safe in here
until after dinner...
658
00:34:27,273 --> 00:34:30,651
-and everyone can just chill out.
-[safe whirs]
659
00:34:30,735 --> 00:34:31,569
Okay?
660
00:34:40,828 --> 00:34:41,704
[Cass sighs]
661
00:34:45,708 --> 00:34:47,043
[notification chimes]
662
00:34:47,418 --> 00:34:49,253
-Can I just see who it is?
-No.
663
00:34:54,759 --> 00:34:56,969
Just gonna go get another fork.
664
00:34:58,304 --> 00:34:59,263
Thank you.
665
00:35:04,727 --> 00:35:06,771
Honey, you can't open that.
666
00:35:06,854 --> 00:35:09,315
I locked it for an hour,
so just leave it alone.
667
00:35:11,192 --> 00:35:13,361
So, what should we talk about?
668
00:35:13,444 --> 00:35:14,695
Well, we could talk
669
00:35:14,779 --> 00:35:17,615
about the, uh, Extreme Center wackos
I drove by today.
670
00:35:17,698 --> 00:35:18,825
-[mom] Please, Frank.
-What?
671
00:35:18,908 --> 00:35:20,785
[mom] I don't wanna talk about politics.
672
00:35:20,868 --> 00:35:23,538
-What's wrong with the Extreme Center?
-See? He doesn't even get it.
673
00:35:23,621 --> 00:35:24,622
It depends on who you ask.
674
00:35:24,705 --> 00:35:26,624
It's like asking,
"What's wrong with propaganda?"
675
00:35:26,707 --> 00:35:28,376
-[safe smashes]
-[mom and Frank scream]
676
00:35:28,709 --> 00:35:29,710
[Frank] Isla!
677
00:35:32,797 --> 00:35:33,756
Oh, my God.
678
00:35:36,425 --> 00:35:38,553
-[sighs] Do you want me to...
-[mom] Yeah.
679
00:35:41,973 --> 00:35:43,933
[Anna] I... I'm worried about my kids.
680
00:35:44,016 --> 00:35:46,686
And if you have kids,
I'm worried about your kids.
681
00:35:46,769 --> 00:35:50,189
Armed with all the knowledge that I have
and all of the experience,
682
00:35:50,273 --> 00:35:52,108
I am fighting my kids about the time
683
00:35:52,191 --> 00:35:54,443
that they spend on phones
and on the computer.
684
00:35:54,527 --> 00:35:58,197
I will say to my son, "How many hours do
you think you're spending on your phone?"
685
00:35:58,281 --> 00:36:01,075
He'll be like, "It's, like, half an hour.
It's half an hour, tops."
686
00:36:01,159 --> 00:36:04,829
I'd say upwards of an hour, hour and a half.
687
00:36:04,912 --> 00:36:06,789
I looked at his screen report
a couple weeks ago.
688
00:36:06,873 --> 00:36:08,708
-Three hours and 45 minutes.
-[James] That...
689
00:36:11,377 --> 00:36:13,588
I don't think that's...
No. Per day, on average?
690
00:36:13,671 --> 00:36:15,506
-Yeah.
-Should I go get it right now?
691
00:36:15,590 --> 00:36:19,177
There's not a day that goes by
that I don't remind my kids
692
00:36:19,260 --> 00:36:21,762
about the pleasure-pain balance,
693
00:36:21,846 --> 00:36:24,390
about dopamine deficit states,
694
00:36:24,473 --> 00:36:26,267
about the risk of addiction.
695
00:36:26,350 --> 00:36:27,310
[Mary] Moment of truth.
696
00:36:27,935 --> 00:36:29,687
Two hours, 50 minutes per day.
697
00:36:29,770 --> 00:36:31,772
-Let's see.
-Actually, I've been using a lot today.
698
00:36:31,856 --> 00:36:33,357
-Last seven days.
-That's probably why.
699
00:36:33,441 --> 00:36:37,361
Instagram, six hours, 13 minutes.
Okay, so my Instagram's worse.
700
00:36:39,572 --> 00:36:41,991
My screen's completely shattered.
701
00:36:42,200 --> 00:36:43,201
Thanks, Cass.
702
00:36:44,410 --> 00:36:45,995
What do you mean, "Thanks, Cass"?
703
00:36:46,078 --> 00:36:49,040
You keep freaking Mom out about our phones
when it's not really a problem.
704
00:36:49,373 --> 00:36:51,167
We don't need our phones to eat dinner!
705
00:36:51,250 --> 00:36:53,878
I get what you're saying.
It's just not that big a deal. It's not.
706
00:36:56,047 --> 00:36:58,382
If it's not that big a deal,
don't use it for a week.
707
00:36:59,634 --> 00:37:00,593
[Ben sighs]
708
00:37:01,135 --> 00:37:06,349
Yeah. Yeah, actually, if you can put
that thing away for, like, a whole week...
709
00:37:07,725 --> 00:37:09,518
I will buy you a new screen.
710
00:37:10,978 --> 00:37:12,897
-Like, starting now?
-[mom] Starting now.
711
00:37:15,149 --> 00:37:16,859
-Okay. You got a deal.
-[mom] Okay.
712
00:37:16,943 --> 00:37:19,111
Okay, you gotta leave it here, though,
buddy.
713
00:37:19,862 --> 00:37:21,364
All right, I'm plugging it in.
714
00:37:22,531 --> 00:37:25,076
Let the record show... I'm backing away.
715
00:37:25,159 --> 00:37:25,993
Okay.
716
00:37:27,787 --> 00:37:29,413
-You're on the clock.
-[Ben] One week.
717
00:37:29,497 --> 00:37:30,331
Oh, my...
718
00:37:31,457 --> 00:37:32,416
Think he can do it?
719
00:37:33,000 --> 00:37:34,252
I don't know. We'll see.
720
00:37:35,002 --> 00:37:36,128
Just eat, okay?
721
00:37:44,220 --> 00:37:45,263
Good family dinner!
722
00:37:47,682 --> 00:37:49,809
[Tristan] These technology products
were not designed
723
00:37:49,892 --> 00:37:53,896
by child psychologists who are trying
to protect and nurture children.
724
00:37:53,980 --> 00:37:56,148
They were just designing
to make these algorithms
725
00:37:56,232 --> 00:37:58,734
that were really good at recommending
the next video to you
726
00:37:58,818 --> 00:38:02,321
or really good at getting you
to take a photo with a filter on it.
727
00:38:15,710 --> 00:38:16,669
[cell phone chimes]
728
00:38:16,752 --> 00:38:18,879
[Tristan] It's not just
that it's controlling
729
00:38:18,963 --> 00:38:20,548
where they spend their attention.
730
00:38:21,173 --> 00:38:26,304
Especially social media starts to dig
deeper and deeper down into the brain stem
731
00:38:26,387 --> 00:38:29,765
and take over kids' sense of self-worth
and identity.
732
00:38:41,736 --> 00:38:42,903
[notifications chiming]
733
00:38:52,371 --> 00:38:56,208
[Tristan] We evolved to care about
whether other people in our tribe...
734
00:38:56,751 --> 00:38:59,128
think well of us or not
'cause it matters.
735
00:38:59,837 --> 00:39:04,550
But were we evolved to be aware
of what 10,000 people think of us?
736
00:39:04,633 --> 00:39:08,763
We were not evolved
to have social approval being dosed to us
737
00:39:08,846 --> 00:39:10,348
every five minutes.
738
00:39:10,431 --> 00:39:13,142
That was not at all what we were built
to experience.
739
00:39:15,394 --> 00:39:19,982
[Chamath] We curate our lives
around this perceived sense of perfection
740
00:39:20,733 --> 00:39:23,527
because we get rewarded
in these short-term signals--
741
00:39:23,611 --> 00:39:25,154
hearts, likes, thumbs-up--
742
00:39:25,237 --> 00:39:28,407
and we conflate that with value,
and we conflate it with truth.
743
00:39:29,825 --> 00:39:33,120
And instead, what it really is
is fake, brittle popularity...
744
00:39:33,913 --> 00:39:37,458
that's short-term and that leaves you
even more, and admit it,
745
00:39:37,541 --> 00:39:39,919
vacant and empty before you did it.
746
00:39:41,295 --> 00:39:43,381
Because then it forces you
into this vicious cycle
747
00:39:43,464 --> 00:39:47,176
where you're like, "What's the next thing
I need to do now? 'Cause I need it back."
748
00:39:48,260 --> 00:39:50,846
Think about that compounded
by two billion people,
749
00:39:50,930 --> 00:39:54,767
and then think about how people react then
to the perceptions of others.
750
00:39:54,850 --> 00:39:56,435
It's just a... It's really bad.
751
00:39:56,977 --> 00:39:58,229
It's really, really bad.
752
00:40:00,856 --> 00:40:03,484
[Jonathan] There has been
a gigantic increase
753
00:40:03,567 --> 00:40:06,529
in depression and anxiety
for American teenagers
754
00:40:06,612 --> 00:40:10,950
which began right around...
between 2011 and 2013.
755
00:40:11,033 --> 00:40:15,371
The number of teenage girls out of 100,000
in this country
756
00:40:15,454 --> 00:40:17,123
who were admitted to a hospital every year
757
00:40:17,206 --> 00:40:19,917
because they cut themselves
or otherwise harmed themselves,
758
00:40:20,000 --> 00:40:23,921
that number was pretty stable
until around 2010, 2011,
759
00:40:24,004 --> 00:40:25,756
and then it begins going way up.
760
00:40:28,759 --> 00:40:32,513
It's up 62 percent for older teen girls.
761
00:40:33,848 --> 00:40:38,310
It's up 189 percent for the preteen girls.
That's nearly triple.
762
00:40:40,312 --> 00:40:43,524
Even more horrifying,
we see the same pattern with suicide.
763
00:40:44,775 --> 00:40:47,570
The older teen girls, 15 to 19 years old,
764
00:40:47,653 --> 00:40:49,196
they're up 70 percent,
765
00:40:49,280 --> 00:40:51,699
compared to the first decade
of this century.
766
00:40:52,158 --> 00:40:55,077
The preteen girls,
who have very low rates to begin with,
767
00:40:55,161 --> 00:40:57,663
they are up 151 percent.
768
00:40:58,831 --> 00:41:01,709
And that pattern points to social media.
769
00:41:04,044 --> 00:41:07,214
Gen Z, the kids born after 1996 or so,
770
00:41:07,298 --> 00:41:10,342
those kids are the first generation
in history
771
00:41:10,426 --> 00:41:12,636
that got on social media in middle school.
772
00:41:12,720 --> 00:41:14,722
[thunder rumbling in distance]
773
00:41:15,890 --> 00:41:17,600
[Jonathan] How do they spend their time?
774
00:41:19,727 --> 00:41:22,730
They come home from school,
and they're on their devices.
775
00:41:24,315 --> 00:41:29,195
A whole generation is more anxious,
more fragile, more depressed.
776
00:41:29,320 --> 00:41:30,529
-[thunder rumbles]
-[Isla gasps]
777
00:41:30,613 --> 00:41:33,282
[Jonathan] They're much less comfortable
taking risks.
778
00:41:34,325 --> 00:41:37,536
The rates at which they get
driver's licenses have been dropping.
779
00:41:38,954 --> 00:41:41,081
The number
who have ever gone out on a date
780
00:41:41,165 --> 00:41:44,251
or had any kind of romantic interaction
is dropping rapidly.
781
00:41:47,505 --> 00:41:49,715
This is a real change in a generation.
782
00:41:53,177 --> 00:41:57,306
And remember, for every one of these,
for every hospital admission,
783
00:41:57,389 --> 00:42:00,267
there's a family that is traumatized
and horrified.
784
00:42:00,351 --> 00:42:02,353
"My God, what is happening to our kids?"
785
00:42:08,734 --> 00:42:09,693
[Isla sighs]
786
00:42:19,411 --> 00:42:21,413
[Tim] It's plain as day to me.
787
00:42:22,873 --> 00:42:28,128
These services are killing people...
and causing people to kill themselves.
788
00:42:29,088 --> 00:42:33,300
I don't know any parent who says, "Yeah,
I really want my kids to be growing up
789
00:42:33,384 --> 00:42:36,887
feeling manipulated by tech designers, uh,
790
00:42:36,971 --> 00:42:39,723
manipulating their attention,
making it impossible to do their homework,
791
00:42:39,807 --> 00:42:42,560
making them compare themselves
to unrealistic standards of beauty."
792
00:42:42,643 --> 00:42:44,687
Like, no one wants that. [chuckles]
793
00:42:45,104 --> 00:42:46,355
No one does.
794
00:42:46,438 --> 00:42:48,482
We... We used to have these protections.
795
00:42:48,566 --> 00:42:50,943
When children watched
Saturday morning cartoons,
796
00:42:51,026 --> 00:42:52,778
we cared about protecting children.
797
00:42:52,861 --> 00:42:56,574
We would say, "You can't advertise
to these age children in these ways."
798
00:42:57,366 --> 00:42:58,784
But then you take YouTube for Kids,
799
00:42:58,867 --> 00:43:02,454
and it gobbles up that entire portion
of the attention economy,
800
00:43:02,538 --> 00:43:04,915
and now all kids are exposed
to YouTube for Kids.
801
00:43:04,999 --> 00:43:07,710
And all those protections
and all those regulations are gone.
802
00:43:08,210 --> 00:43:10,212
[tense instrumental music playing]
803
00:43:18,304 --> 00:43:22,141
[Tristan] We're training and conditioning
a whole new generation of people...
804
00:43:23,434 --> 00:43:29,148
that when we are uncomfortable or lonely
or uncertain or afraid,
805
00:43:29,231 --> 00:43:31,775
we have a digital pacifier for ourselves
806
00:43:32,234 --> 00:43:36,488
that is kind of atrophying our own ability
to deal with that.
807
00:43:53,881 --> 00:43:55,674
[Tristan] Photoshop didn't have
1,000 engineers
808
00:43:55,758 --> 00:43:58,969
on the other side of the screen,
using notifications, using your friends,
809
00:43:59,053 --> 00:44:02,431
using AI to predict what's gonna
perfectly addict you, or hook you,
810
00:44:02,514 --> 00:44:04,516
or manipulate you, or allow advertisers
811
00:44:04,600 --> 00:44:08,437
to test 60,000 variations
of text or colors to figure out
812
00:44:08,520 --> 00:44:11,065
what's the perfect manipulation
of your mind.
813
00:44:11,148 --> 00:44:14,985
This is a totally new species
of power and influence.
814
00:44:16,070 --> 00:44:19,156
I... I would say, again, the methods used
815
00:44:19,239 --> 00:44:22,785
to play on people's ability
to be addicted or to be influenced
816
00:44:22,868 --> 00:44:25,204
may be different this time,
and they probably are different.
817
00:44:25,287 --> 00:44:28,749
They were different when newspapers
came in and the printing press came in,
818
00:44:28,832 --> 00:44:31,835
and they were different
when television came in,
819
00:44:31,919 --> 00:44:34,004
and you had three major networks and...
820
00:44:34,463 --> 00:44:36,423
-At the time.
-At the time. That's what I'm saying.
821
00:44:36,507 --> 00:44:38,384
But I'm saying the idea
that there's a new level
822
00:44:38,467 --> 00:44:42,054
and that new level has happened
so many times before.
823
00:44:42,137 --> 00:44:45,099
I mean, this is just the latest new level
that we've seen.
824
00:44:45,182 --> 00:44:48,727
There's this narrative that, you know,
"We'll just adapt to it.
825
00:44:48,811 --> 00:44:51,188
We'll learn how to live
with these devices,
826
00:44:51,271 --> 00:44:53,732
just like we've learned how to live
with everything else."
827
00:44:53,816 --> 00:44:56,694
And what this misses
is there's something distinctly new here.
828
00:44:57,486 --> 00:45:00,155
Perhaps the most dangerous piece
of all this is the fact
829
00:45:00,239 --> 00:45:04,410
that it's driven by technology
that's advancing exponentially.
830
00:45:05,869 --> 00:45:09,081
Roughly, if you say from, like,
the 1960s to today,
831
00:45:09,873 --> 00:45:12,960
processing power has gone up
about a trillion times.
832
00:45:13,794 --> 00:45:18,340
Nothing else that we have has improved
at anything near that rate.
833
00:45:18,424 --> 00:45:22,177
Like, cars are, you know,
roughly twice as fast.
834
00:45:22,261 --> 00:45:25,013
And almost everything else is negligible.
835
00:45:25,347 --> 00:45:27,182
And perhaps most importantly,
836
00:45:27,266 --> 00:45:31,353
our human-- our physiology,
our brains have evolved not at all.
837
00:45:37,401 --> 00:45:41,488
[Tristan] Human beings, at a mind and body
and sort of physical level,
838
00:45:41,947 --> 00:45:43,866
are not gonna fundamentally change.
839
00:45:44,825 --> 00:45:45,868
[indistinct chatter]
840
00:45:47,035 --> 00:45:48,954
[chuckling] I know, but they...
841
00:45:49,037 --> 00:45:51,623
[continues speaking indistinctly]
842
00:45:53,584 --> 00:45:54,752
[camera shutter clicks]
843
00:45:56,837 --> 00:46:00,924
[Tristan] We can do genetic engineering
and develop new kinds of human beings,
844
00:46:01,008 --> 00:46:05,220
but realistically speaking,
you're living inside of hardware, a brain,
845
00:46:05,304 --> 00:46:07,222
that was, like, millions of years old,
846
00:46:07,306 --> 00:46:10,559
and then there's this screen, and then
on the opposite side of the screen,
847
00:46:10,642 --> 00:46:13,562
there's these thousands of engineers
and supercomputers
848
00:46:13,645 --> 00:46:16,106
that have goals that are different
than your goals,
849
00:46:16,190 --> 00:46:19,693
and so, who's gonna win in that game?
Who's gonna win?
850
00:46:25,699 --> 00:46:26,617
How are we losing?
851
00:46:27,159 --> 00:46:29,828
-I don't know.
-Where is he? This is not normal.
852
00:46:29,912 --> 00:46:32,080
Did I overwhelm him
with friends and family content?
853
00:46:32,164 --> 00:46:34,082
-Probably.
-Well, maybe it was all the ads.
854
00:46:34,166 --> 00:46:37,795
No. Something's very wrong.
Let's switch to resurrection mode.
855
00:46:39,713 --> 00:46:44,051
[Tristan] When you think of AI,
you know, an AI's gonna ruin the world,
856
00:46:44,134 --> 00:46:47,221
and you see, like, a Terminator,
and you see Arnold Schwarzenegger.
857
00:46:47,638 --> 00:46:48,680
I'll be back.
858
00:46:48,764 --> 00:46:50,933
[Tristan] You see drones,
and you think, like,
859
00:46:51,016 --> 00:46:52,684
"Oh, we're gonna kill people with AI."
860
00:46:53,644 --> 00:46:59,817
And what people miss is that AI
already runs today's world right now.
861
00:46:59,900 --> 00:47:03,237
Even talking about "an AI"
is just a metaphor.
862
00:47:03,320 --> 00:47:09,451
At these companies like... like Google,
there's just massive, massive rooms,
863
00:47:10,327 --> 00:47:13,121
some of them underground,
some of them underwater,
864
00:47:13,205 --> 00:47:14,498
of just computers.
865
00:47:14,581 --> 00:47:17,835
Tons and tons of computers,
as far as the eye can see.
866
00:47:18,460 --> 00:47:20,504
They're deeply interconnected
with each other
867
00:47:20,587 --> 00:47:22,923
and running
extremely complicated programs,
868
00:47:23,006 --> 00:47:26,009
sending information back and forth
between each other all the time.
869
00:47:26,802 --> 00:47:28,595
And they'll be running
many different programs,
870
00:47:28,679 --> 00:47:31,014
many different products
on those same machines.
871
00:47:31,348 --> 00:47:33,684
Some of those things could be described
as simple algorithms,
872
00:47:33,767 --> 00:47:35,227
some could be described as algorithms
873
00:47:35,310 --> 00:47:37,521
that are so complicated,
you would call them intelligence.
874
00:47:39,022 --> 00:47:39,982
[crew member sighs]
875
00:47:40,065 --> 00:47:42,568
[Cathy]
I like to say that algorithms are opinions
876
00:47:42,651 --> 00:47:43,777
embedded in code...
877
00:47:45,070 --> 00:47:47,656
and that algorithms are not objective.
878
00:47:48,365 --> 00:47:51,577
Algorithms are optimized
to some definition of success.
879
00:47:52,244 --> 00:47:53,370
So, if you can imagine,
880
00:47:53,453 --> 00:47:57,124
if a... if a commercial enterprise builds
an algorithm
881
00:47:57,207 --> 00:47:59,293
to their definition of success,
882
00:47:59,835 --> 00:48:01,211
it's a commercial interest.
883
00:48:01,587 --> 00:48:02,671
It's usually profit.
884
00:48:03,130 --> 00:48:07,384
You are giving the computer
the goal state, "I want this outcome,"
885
00:48:07,467 --> 00:48:10,262
and then the computer itself is learning
how to do it.
886
00:48:10,345 --> 00:48:12,598
That's where the term "machine learning"
comes from.
887
00:48:12,681 --> 00:48:14,850
And so, every day, it gets slightly better
888
00:48:14,933 --> 00:48:16,977
at picking the right posts
in the right order
889
00:48:17,060 --> 00:48:19,438
so that you spend longer and longer
in that product.
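[The "goal state" idea described here can be sketched in code. Below is a minimal, hypothetical example, assuming a toy epsilon-greedy learner and invented dwell-time numbers; nothing in it is any real platform's implementation. The operator specifies only the objective, time in the product, and the learner discovers on its own which posts to serve.]

    import random

    posts = ["news", "friend_photo", "outrage_clip", "cat_video"]
    # Hidden "true" average dwell time per post type (seconds, invented).
    # The learner never sees these directly, only noisy per-session feedback.
    true_dwell = {"news": 20, "friend_photo": 35, "outrage_clip": 80, "cat_video": 50}

    estimates = {p: 0.0 for p in posts}  # learned value of each post type
    counts = {p: 0 for p in posts}

    def session(epsilon=0.1):
        """Serve a post (mostly greedily), observe dwell time, update the estimate."""
        if random.random() < epsilon:
            choice = random.choice(posts)                    # explore
        else:
            choice = max(posts, key=lambda p: estimates[p])  # exploit
        reward = random.gauss(true_dwell[choice], 10)        # noisy engagement signal
        counts[choice] += 1
        estimates[choice] += (reward - estimates[choice]) / counts[choice]  # running mean

    for _ in range(5000):  # "every day, it gets slightly better"
        session()

    print(sorted(estimates.items(), key=lambda kv: -kv[1]))

[No one tells the learner that outrage is valuable; it simply converges on whatever keeps the session going longest, which is the dynamic the speakers describe.]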
890
00:48:19,521 --> 00:48:22,232
And no one really understands
what they're doing
891
00:48:22,316 --> 00:48:23,901
in order to achieve that goal.
892
00:48:23,984 --> 00:48:28,238
The algorithm has a mind of its own,
so even though a person writes it,
893
00:48:28,906 --> 00:48:30,657
it's written in a way
894
00:48:30,741 --> 00:48:35,037
that you kind of build the machine,
and then the machine changes itself.
895
00:48:35,120 --> 00:48:37,873
There's only a handful of people
at these companies,
896
00:48:37,956 --> 00:48:40,000
at Facebook and Twitter
and other companies...
897
00:48:40,083 --> 00:48:43,795
There's only a few people who understand
how those systems work,
898
00:48:43,879 --> 00:48:46,715
and even they don't necessarily
fully understand
899
00:48:46,798 --> 00:48:49,551
what's gonna happen
with a particular piece of content.
900
00:48:49,968 --> 00:48:55,474
So, as humans, we've almost lost control
over these systems.
901
00:48:55,891 --> 00:48:59,603
Because they're controlling, you know,
the information that we see,
902
00:48:59,686 --> 00:49:02,189
they're controlling us more
than we're controlling them.
903
00:49:02,522 --> 00:49:04,733
-[console whirs]
-[Growth AI] Cross-referencing him
904
00:49:04,816 --> 00:49:07,319
against comparables
in his geographic zone.
905
00:49:07,402 --> 00:49:09,571
His psychometric doppelgangers.
906
00:49:09,655 --> 00:49:13,700
There are 13,694 people
behaving just like him in his region.
907
00:49:13,784 --> 00:49:16,370
-What's trending with them?
-We need something actually good
908
00:49:16,453 --> 00:49:17,704
for a proper resurrection,
909
00:49:17,788 --> 00:49:19,957
given that the typical stuff
isn't working.
910
00:49:20,040 --> 00:49:21,875
Not even that cute girl from school.
911
00:49:22,334 --> 00:49:25,253
My analysis shows that going political
with Extreme Center content
912
00:49:25,337 --> 00:49:28,256
has a 62.3 percent chance
of long-term engagement.
913
00:49:28,340 --> 00:49:29,299
That's not bad.
914
00:49:29,383 --> 00:49:32,010
[sighs] It's not good enough to lead with.
915
00:49:32,302 --> 00:49:35,305
Okay, okay, so we've tried notifying him
about tagged photos,
916
00:49:35,389 --> 00:49:39,017
invitations, current events,
even a direct message from Rebecca.
917
00:49:39,101 --> 00:49:42,813
But what about User 01265923010?
918
00:49:42,896 --> 00:49:44,648
Yeah, Ben loved all of her posts.
919
00:49:44,731 --> 00:49:47,776
For months and, like,
literally all of them, and then nothing.
920
00:49:47,859 --> 00:49:50,445
I calculate a 92.3 percent chance
of resurrection
921
00:49:50,529 --> 00:49:52,030
with a notification about Ana.
922
00:49:56,535 --> 00:49:57,494
And her new friend.
923
00:49:59,621 --> 00:50:01,623
[eerie instrumental music playing]
924
00:50:10,590 --> 00:50:11,675
[cell phone vibrates]
925
00:50:25,689 --> 00:50:27,441
[Ben] Oh, you gotta be kiddin' me.
926
00:50:32,404 --> 00:50:33,613
Uh... [sighs]
927
00:50:35,657 --> 00:50:36,616
Okay.
928
00:50:38,869 --> 00:50:40,996
-What?
-[fanfare plays, fireworks pop]
929
00:50:41,413 --> 00:50:42,789
[claps] Bam! We're back!
930
00:50:42,873 --> 00:50:44,374
Let's get back to making money, boys.
931
00:50:44,458 --> 00:50:46,334
Yes, and connecting Ben
with the entire world.
932
00:50:46,418 --> 00:50:49,087
I'm giving him access
to all the information he might like.
933
00:50:49,755 --> 00:50:53,717
Hey, do you guys ever wonder if, you know,
like, the feed is good for Ben?
934
00:50:57,095 --> 00:50:58,430
-No.
-No. [chuckles slightly]
935
00:51:00,307 --> 00:51:03,268
-[chuckles softly]
-["I Put a Spell on You" playing]
936
00:51:17,491 --> 00:51:19,076
♪ I put a spell on you ♪
937
00:51:25,040 --> 00:51:26,374
♪ 'Cause you're mine ♪
938
00:51:28,627 --> 00:51:32,089
[vocalizing] ♪ Ah! ♪
939
00:51:34,508 --> 00:51:36,593
♪ You better stop the things you do ♪
940
00:51:41,181 --> 00:51:42,265
♪ I ain't lyin' ♪
941
00:51:44,976 --> 00:51:46,686
♪ No, I ain't lyin' ♪
942
00:51:49,981 --> 00:51:51,817
♪ You know I can't stand it ♪
943
00:51:53,026 --> 00:51:54,611
♪ You're runnin' around ♪
944
00:51:55,612 --> 00:51:57,239
♪ You know better, Daddy ♪
945
00:51:58,782 --> 00:52:02,077
♪ I can't stand it
'Cause you put me down ♪
946
00:52:03,286 --> 00:52:04,121
♪ Yeah, yeah ♪
947
00:52:06,456 --> 00:52:08,375
♪ I put a spell on you ♪
948
00:52:12,379 --> 00:52:14,840
♪ Because you're mine ♪
949
00:52:18,718 --> 00:52:19,845
♪ You're mine ♪
950
00:52:20,929 --> 00:52:24,349
[Roger] So, imagine you're on Facebook...
951
00:52:24,766 --> 00:52:29,312
and you're effectively playing
against this artificial intelligence
952
00:52:29,396 --> 00:52:31,314
that knows everything about you,
953
00:52:31,398 --> 00:52:34,568
can anticipate your next move,
and you know literally nothing about it,
954
00:52:34,651 --> 00:52:37,404
except that there are cat videos
and birthdays on it.
955
00:52:37,821 --> 00:52:39,656
That's not a fair fight.
956
00:52:41,575 --> 00:52:43,869
Ben and Jerry, it's time to go, bud!
957
00:52:48,039 --> 00:52:48,874
[sighs]
958
00:52:51,126 --> 00:52:51,960
Ben?
959
00:53:01,011 --> 00:53:02,137
[knocks lightly on door]
960
00:53:02,679 --> 00:53:04,723
-[Cass] Ben.
-[Ben] Mm.
961
00:53:05,182 --> 00:53:06,057
Come on.
962
00:53:07,225 --> 00:53:08,351
School time. [claps]
963
00:53:08,435 --> 00:53:09,269
Let's go.
964
00:53:12,189 --> 00:53:13,148
[Ben sighs]
965
00:53:25,118 --> 00:53:27,120
[excited chatter]
966
00:53:31,374 --> 00:53:33,627
-[tech] How you doing today?
-Oh, I'm... I'm nervous.
967
00:53:33,710 --> 00:53:35,003
-Are ya?
-Yeah. [chuckles]
968
00:53:37,380 --> 00:53:39,132
[Tristan]
We were all looking for the moment
969
00:53:39,216 --> 00:53:42,969
when technology would overwhelm
human strengths and intelligence.
970
00:53:43,053 --> 00:53:47,015
When is it gonna cross the singularity,
replace our jobs, be smarter than humans?
971
00:53:48,141 --> 00:53:50,101
But there's this much earlier moment...
972
00:53:50,977 --> 00:53:55,315
when technology exceeds
and overwhelms human weaknesses.
973
00:53:57,484 --> 00:54:01,780
This point being crossed
is at the root of addiction,
974
00:54:02,113 --> 00:54:04,741
polarization, radicalization,
outrage-ification,
975
00:54:04,824 --> 00:54:06,368
vanity-ification, the entire thing.
976
00:54:07,702 --> 00:54:09,913
This is overpowering human nature,
977
00:54:10,538 --> 00:54:13,500
and this is checkmate on humanity.
978
00:54:20,131 --> 00:54:21,883
-[sighs deeply]
-[door opens]
979
00:54:30,558 --> 00:54:31,851
I'm sorry. [sighs]
980
00:54:37,607 --> 00:54:39,609
-[seat belt clicks]
-[engine starts]
981
00:54:41,736 --> 00:54:44,656
[Jaron] One of the ways
I try to get people to understand
982
00:54:45,198 --> 00:54:49,828
just how wrong feeds from places
like Facebook are
983
00:54:49,911 --> 00:54:51,454
is to think about the Wikipedia.
984
00:54:52,956 --> 00:54:56,209
When you go to a page, you're seeing
the same thing as other people.
985
00:54:56,584 --> 00:55:00,297
So, it's one of the few things online
that we at least hold in common.
986
00:55:00,380 --> 00:55:03,425
Now, just imagine for a second
that Wikipedia said,
987
00:55:03,508 --> 00:55:07,178
"We're gonna give each person
a different customized definition,
988
00:55:07,262 --> 00:55:09,472
and we're gonna be paid by people
for that."
989
00:55:09,556 --> 00:55:13,435
So, Wikipedia would be spying on you.
Wikipedia would calculate,
990
00:55:13,518 --> 00:55:17,188
"What's the thing I can do
to get this person to change a little bit
991
00:55:17,272 --> 00:55:19,899
on behalf of some commercial interest?"
Right?
992
00:55:19,983 --> 00:55:21,818
And then it would change the entry.
993
00:55:22,444 --> 00:55:24,738
Can you imagine that?
Well, you should be able to,
994
00:55:24,821 --> 00:55:26,823
'cause that's exactly what's happening
on Facebook.
995
00:55:26,906 --> 00:55:28,992
It's exactly what's happening
in your YouTube feed.
996
00:55:29,075 --> 00:55:31,786
When you go to Google and type in
"Climate change is,"
997
00:55:31,870 --> 00:55:34,998
you're going to see different results
depending on where you live.
998
00:55:36,166 --> 00:55:38,460
In certain cities,
you're gonna see it autocomplete
999
00:55:38,543 --> 00:55:40,462
with "climate change is a hoax."
1000
00:55:40,545 --> 00:55:42,088
In other cases, you're gonna see
1001
00:55:42,172 --> 00:55:44,841
"climate change is causing the destruction
of nature."
1002
00:55:44,924 --> 00:55:48,428
And that's a function not
of what the truth is about climate change,
1003
00:55:48,511 --> 00:55:51,097
but about
where you happen to be Googling from
1004
00:55:51,181 --> 00:55:54,100
and the particular things
Google knows about your interests.
1005
00:55:54,851 --> 00:55:58,021
Even two friends
who are so close to each other,
1006
00:55:58,104 --> 00:56:00,190
who have almost the exact same set
of friends,
1007
00:56:00,273 --> 00:56:02,817
they think, you know,
"I'm going to news feeds on Facebook.
1008
00:56:02,901 --> 00:56:05,403
I'll see the exact same set of updates."
1009
00:56:05,487 --> 00:56:06,738
But it's not like that at all.
1010
00:56:06,821 --> 00:56:08,448
They see completely different worlds
1011
00:56:08,531 --> 00:56:10,575
because they're based
on these computers calculating
1012
00:56:10,658 --> 00:56:12,035
what's perfect for each of them.
1013
00:56:12,118 --> 00:56:14,245
[whistling over monitor]
1014
00:56:14,329 --> 00:56:18,416
[Roger] The way to think about it
is it's 2.7 billion Truman Shows.
1015
00:56:18,500 --> 00:56:21,294
Each person has their own reality,
with their own...
1016
00:56:22,670 --> 00:56:23,671
facts.
1017
00:56:23,755 --> 00:56:27,008
Why do you think that, uh, Truman has never come close
1018
00:56:27,092 --> 00:56:30,095
to discovering the true nature of his world until now?
1019
00:56:31,054 --> 00:56:34,140
We accept the reality of the world
with which we're presented.
1020
00:56:34,224 --> 00:56:35,141
It's as simple as that.
1021
00:56:36,476 --> 00:56:41,064
Over time, you have the false sense
that everyone agrees with you,
1022
00:56:41,147 --> 00:56:44,067
because everyone in your news feed
sounds just like you.
1023
00:56:44,567 --> 00:56:49,072
And that once you're in that state,
it turns out you're easily manipulated,
1024
00:56:49,155 --> 00:56:51,741
the same way you would be manipulated
by a magician.
1025
00:56:51,825 --> 00:56:55,370
A magician shows you a card trick
and says, "Pick a card, any card."
1026
00:56:55,453 --> 00:56:58,164
What you don't realize
was that they've done a set-up,
1027
00:56:58,456 --> 00:57:00,583
so you pick the card
they want you to pick.
1028
00:57:00,667 --> 00:57:03,169
And that's how Facebook works.
Facebook sits there and says,
1029
00:57:03,253 --> 00:57:06,172
"Hey, you pick your friends.
You pick the links that you follow."
1030
00:57:06,256 --> 00:57:08,716
But that's all nonsense.
It's just like the magician.
1031
00:57:08,800 --> 00:57:11,302
Facebook is in charge of your news feed.
1032
00:57:11,386 --> 00:57:14,514
We all simply are operating
on a different set of facts.
1033
00:57:14,597 --> 00:57:16,474
When that happens at scale,
1034
00:57:16,558 --> 00:57:20,645
you're no longer able to reckon with
or even consume information
1035
00:57:20,728 --> 00:57:23,690
that contradicts with that world view
that you've created.
1036
00:57:23,773 --> 00:57:26,443
That means we aren't actually being
objective,
1037
00:57:26,526 --> 00:57:28,319
constructive individuals. [chuckles]
1038
00:57:28,403 --> 00:57:32,449
[crowd chanting] Open up your eyes,
don't believe the lies! Open up...
1039
00:57:32,532 --> 00:57:34,701
[Justin] And then you look
over at the other side,
1040
00:57:35,243 --> 00:57:38,746
and you start to think,
"How can those people be so stupid?
1041
00:57:38,830 --> 00:57:42,125
Look at all of this information
that I'm constantly seeing.
1042
00:57:42,208 --> 00:57:44,627
How are they not seeing
that same information?"
1043
00:57:44,711 --> 00:57:47,297
And the answer is, "They're not seeing
that same information."
1044
00:57:47,380 --> 00:57:50,800
[crowd continues chanting]
Open up your eyes, don't believe the lies!
1045
00:57:50,884 --> 00:57:52,010
[shouting indistinctly]
1046
00:57:52,093 --> 00:57:55,472
-[interviewer] What are Republicans like?
-People that don't have a clue.
1047
00:57:55,555 --> 00:57:58,933
The Democrat Party is a crime syndicate,
not a real political party.
1048
00:57:59,017 --> 00:58:03,188
A huge new Pew Research Center study
of 10,000 American adults
1049
00:58:03,271 --> 00:58:05,315
finds us more divided than ever,
1050
00:58:05,398 --> 00:58:09,152
with personal and political polarization
at a 20-year high.
1051
00:58:11,738 --> 00:58:14,199
[pundit] You have
more than a third of Republicans saying
1052
00:58:14,282 --> 00:58:16,826
the Democratic Party is a threat
to the nation,
1053
00:58:16,910 --> 00:58:20,580
more than a quarter of Democrats saying
the same thing about the Republicans.
1054
00:58:20,663 --> 00:58:22,499
So many of the problems
that we're discussing,
1055
00:58:22,582 --> 00:58:24,417
like, around political polarization
1056
00:58:24,501 --> 00:58:28,046
exist in spades on cable television.
1057
00:58:28,129 --> 00:58:31,007
The media has this exact same problem,
1058
00:58:31,090 --> 00:58:33,343
where their business model, by and large,
1059
00:58:33,426 --> 00:58:35,762
is that they're selling our attention
to advertisers.
1060
00:58:35,845 --> 00:58:38,890
And the Internet is just a new,
even more efficient way to do that.
1061
00:58:40,141 --> 00:58:44,145
[Guillaume] At YouTube, I was working
on YouTube recommendations.
1062
00:58:44,229 --> 00:58:47,148
It worries me that an algorithm
that I worked on
1063
00:58:47,232 --> 00:58:50,401
is actually increasing polarization
in society.
1064
00:58:50,485 --> 00:58:53,112
But from the point of view of watch time,
1065
00:58:53,196 --> 00:58:57,617
this polarization is extremely efficient
at keeping people online.
1066
00:58:58,785 --> 00:59:00,870
The only reason these teachers are teaching this stuff
1067
00:59:00,954 --> 00:59:02,288
is 'cause they're getting paid to.
1068
00:59:02,372 --> 00:59:04,374
-It's absolutely absurd.
-[Cass] Hey, Benji.
1069
00:59:04,916 --> 00:59:06,292
No soccer practice today?
1070
00:59:06,376 --> 00:59:08,795
Oh, there is. I'm just catching up
on some news stuff.
1071
00:59:08,878 --> 00:59:11,506
[vlogger] Do research. Anything that sways from the Extreme Center--
1072
00:59:11,589 --> 00:59:14,008
Wouldn't exactly call the stuff
that you're watching news.
1073
00:59:15,552 --> 00:59:18,846
You're always talking about how messed up
everything is. So are they.
1074
00:59:19,305 --> 00:59:21,140
But that stuff is just propaganda.
1075
00:59:21,224 --> 00:59:24,060
[vlogger] Neither is true. It's all about what makes sense.
1076
00:59:24,769 --> 00:59:26,938
Ben, I'm serious.
That stuff is bad for you.
1077
00:59:27,021 --> 00:59:29,232
-You should go to soccer practice.
-[Ben] Mm.
1078
00:59:31,109 --> 00:59:31,943
[Cass sighs]
1079
00:59:35,154 --> 00:59:37,490
I share this stuff because I care.
1080
00:59:37,574 --> 00:59:41,077
I care that you are being misled, and it's not okay. All right?
1081
00:59:41,160 --> 00:59:43,121
[Guillaume] People think
the algorithm is designed
1082
00:59:43,204 --> 00:59:46,833
to give them what they really want,
only it's not.
1083
00:59:46,916 --> 00:59:52,589
The algorithm is actually trying to find
a few rabbit holes that are very powerful,
1084
00:59:52,672 --> 00:59:56,217
trying to find which rabbit hole
is the closest to your interest.
1085
00:59:56,301 --> 00:59:59,262
And then if you start watching
one of those videos,
1086
00:59:59,846 --> 01:00:02,223
then it will recommend it
over and over again.
1087
01:00:02,682 --> 01:00:04,934
It's not like anybody wants this
to happen.
1088
01:00:05,018 --> 01:00:07,812
It's just that this is
what the recommendation system is doing.
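[The rabbit-hole behavior described here falls out of a watch-time objective almost mechanically. A hypothetical sketch, with invented category names and watch fractions, not YouTube's actual system: a recommender that greedily maximizes predicted watch time, lightly boosted by the user's own history, locks onto the highest-retention niche and serves it over and over.]

    # Assumed average fraction of a video a typical user watches, per category.
    categories = {"mainstream": 0.3, "niche_conspiracy": 0.9}
    history = []  # what this user has been served (and watched) so far

    def recommend():
        def score(cat):
            # predicted watch time, boosted by the user's prior exposure
            return categories[cat] * (1 + 0.1 * history.count(cat))
        return max(categories, key=score)

    for step in range(5):
        pick = recommend()
        history.append(pick)  # the user watches whatever is served
        print(step, pick)
    # Every slot goes to "niche_conspiracy": nobody chose that outcome;
    # it is just what maximizing the metric does.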
1089
01:00:07,895 --> 01:00:10,815
So much so that Kyrie Irving,
the famous basketball player,
1090
01:00:11,065 --> 01:00:14,235
uh, said he believed the Earth was flat,
and he apologized later
1091
01:00:14,319 --> 01:00:16,154
because he blamed it
on a YouTube rabbit hole.
1092
01:00:16,487 --> 01:00:18,656
You know, like,
you click the YouTube clip
1093
01:00:18,740 --> 01:00:21,534
and it goes, like,
how deep the rabbit hole goes.
1094
01:00:21,618 --> 01:00:23,369
When he later came on to NPR to say,
1095
01:00:23,453 --> 01:00:25,955
"I'm sorry for believing this.
I didn't want to mislead people,"
1096
01:00:26,039 --> 01:00:28,291
a bunch of students in a classroom
were interviewed saying,
1097
01:00:28,374 --> 01:00:29,667
"The round-Earthers got to him."
1098
01:00:29,751 --> 01:00:30,960
[audience chuckles]
1099
01:00:31,044 --> 01:00:33,963
The flat-Earth conspiracy theory
was recommended
1100
01:00:34,047 --> 01:00:37,634
hundreds of millions of times
by the algorithm.
1101
01:00:37,717 --> 01:00:43,890
It's easy to think that it's just
a few stupid people who get convinced,
1102
01:00:43,973 --> 01:00:46,893
but the algorithm is getting smarter
and smarter every day.
1103
01:00:46,976 --> 01:00:50,188
So, today, they are convincing the people
that the Earth is flat,
1104
01:00:50,271 --> 01:00:53,983
but tomorrow, they will be convincing you
of something that's false.
1105
01:00:54,317 --> 01:00:57,820
[reporter] On November 7th, the hashtag "Pizzagate" was born.
1106
01:00:57,904 --> 01:00:59,197
[Renée] Pizzagate...
1107
01:01:00,114 --> 01:01:01,449
[clicks tongue] Oh, boy.
1108
01:01:01,532 --> 01:01:02,533
Uh... [laughs]
1109
01:01:03,159 --> 01:01:06,913
I still am not 100 percent sure
how this originally came about,
1110
01:01:06,996 --> 01:01:12,377
but the idea that ordering a pizza
meant ordering a trafficked person.
1111
01:01:12,460 --> 01:01:15,046
As the groups got bigger on Facebook,
1112
01:01:15,129 --> 01:01:19,967
Facebook's recommendation engine
started suggesting to regular users
1113
01:01:20,051 --> 01:01:21,761
that they join Pizzagate groups.
1114
01:01:21,844 --> 01:01:27,392
So, if a user was, for example,
anti-vaccine or believed in chemtrails
1115
01:01:27,475 --> 01:01:30,645
or had indicated to Facebook's algorithms
in some way
1116
01:01:30,728 --> 01:01:33,398
that they were prone to belief
in conspiracy theories,
1117
01:01:33,481 --> 01:01:36,859
Facebook's recommendation engine
would serve them Pizzagate groups.
1118
01:01:36,943 --> 01:01:41,072
Eventually, this culminated in
a man showing up with a gun,
1119
01:01:41,155 --> 01:01:44,617
deciding that he was gonna go liberate
the children from the basement
1120
01:01:44,701 --> 01:01:46,911
of the pizza place
that did not have a basement.
1121
01:01:46,994 --> 01:01:48,538
[officer 1] What were you doing?
1122
01:01:48,871 --> 01:01:50,498
[man] Making sure
there was nothing there.
1123
01:01:50,581 --> 01:01:52,458
-[officer 1] Regarding?
-[man] Pedophile ring.
1124
01:01:52,542 --> 01:01:54,293
-[officer 1] What?
-[man] Pedophile ring.
1125
01:01:54,377 --> 01:01:55,962
[officer 2] He's talking about Pizzagate.
1126
01:01:56,045 --> 01:02:00,216
This is an example of a conspiracy theory
1127
01:02:00,299 --> 01:02:03,678
that was propagated
across all social networks.
1128
01:02:03,761 --> 01:02:06,097
The social network's
own recommendation engine
1129
01:02:06,180 --> 01:02:07,974
is voluntarily serving this up to people
1130
01:02:08,057 --> 01:02:10,643
who had never searched
for the term "Pizzagate" in their life.
1131
01:02:12,437 --> 01:02:14,439
[Tristan] There's a study, an MIT study,
1132
01:02:14,522 --> 01:02:19,819
that fake news on Twitter spreads
six times faster than true news.
1133
01:02:19,902 --> 01:02:21,863
What is that world gonna look like
1134
01:02:21,946 --> 01:02:24,741
when one has a six-times advantage
to the other one?
1135
01:02:25,283 --> 01:02:27,660
You can imagine
these things are sort of like...
1136
01:02:27,744 --> 01:02:31,706
they... they tilt the floor
of... of human behavior.
1137
01:02:31,789 --> 01:02:34,709
They make some behavior harder
and some easier.
1138
01:02:34,792 --> 01:02:37,420
And you're always free
to walk up the hill,
1139
01:02:37,503 --> 01:02:38,796
but fewer people do,
1140
01:02:38,880 --> 01:02:43,092
and so, at scale, at society's scale,
you really are just tilting the floor
1141
01:02:43,176 --> 01:02:45,970
and changing what billions of people think
and do.
1142
01:02:46,053 --> 01:02:52,018
We've created a system
that biases towards false information.
1143
01:02:52,643 --> 01:02:54,437
Not because we want to,
1144
01:02:54,520 --> 01:02:58,816
but because false information makes
the companies more money
1145
01:02:59,400 --> 01:03:01,319
than the truth. The truth is boring.
1146
01:03:01,986 --> 01:03:04,489
It's a disinformation-for-profit
business model.
1147
01:03:04,906 --> 01:03:08,159
You make money the more you allow
unregulated messages
1148
01:03:08,701 --> 01:03:11,287
to reach anyone for the best price.
1149
01:03:11,662 --> 01:03:13,956
Because climate change? Yeah.
1150
01:03:14,040 --> 01:03:16,751
It's a hoax. Yeah, it's real. That's the point.
1151
01:03:16,834 --> 01:03:20,046
The more they talk about it and the more they divide us,
1152
01:03:20,129 --> 01:03:22,423
the more they have the power, the more...
1153
01:03:22,507 --> 01:03:25,468
[Tristan] Facebook has trillions
of these news feed posts.
1154
01:03:26,552 --> 01:03:29,180
They can't know what's real
or what's true...
1155
01:03:29,972 --> 01:03:33,726
which is why this conversation
is so critical right now.
1156
01:03:33,810 --> 01:03:37,021
[reporter 1] It's not just COVID-19 that's spreading fast.
1157
01:03:37,104 --> 01:03:40,191
There's a flow of misinformation online about the virus.
1158
01:03:40,274 --> 01:03:41,818
[reporter 2] The notion drinking water
1159
01:03:41,901 --> 01:03:43,694
will flush coronavirus from your system
1160
01:03:43,778 --> 01:03:47,490
is one of several myths about the virus circulating on social media.
1161
01:03:47,573 --> 01:03:50,451
[automated voice] The government planned this event, created the virus,
1162
01:03:50,535 --> 01:03:53,621
and had a simulation of how the countries would react.
1163
01:03:53,955 --> 01:03:55,581
Coronavirus is a... a hoax.
1164
01:03:56,165 --> 01:03:57,959
[man] SARS, coronavirus.
1165
01:03:58,376 --> 01:04:01,045
And look at when it was made. 2018.
1166
01:04:01,128 --> 01:04:03,798
I think the US government started
this shit.
1167
01:04:04,215 --> 01:04:09,095
Nobody is sick. Nobody is sick.
Nobody knows anybody who's sick.
1168
01:04:09,512 --> 01:04:13,015
Maybe the government is using
the coronavirus as an excuse
1169
01:04:13,099 --> 01:04:15,643
to get everyone to stay inside
because something else is happening.
1170
01:04:15,726 --> 01:04:18,020
Coronavirus is not killing people,
1171
01:04:18,104 --> 01:04:20,940
it's the 5G radiation
that they're pumping out.
1172
01:04:21,023 --> 01:04:22,525
[crowd shouting]
1173
01:04:22,608 --> 01:04:24,944
[Tristan]
We're being bombarded with rumors.
1174
01:04:25,403 --> 01:04:28,823
People are blowing up actual physical cell phone towers.
1175
01:04:28,906 --> 01:04:32,201
We see Russia and China spreading rumors and conspiracy theories.
1176
01:04:32,285 --> 01:04:35,246
[reporter 3] This morning, panic and protest in Ukraine as...
1177
01:04:35,329 --> 01:04:38,916
[Tristan] People have no idea what's true, and now it's a matter of life and death.
1178
01:04:39,876 --> 01:04:42,628
[woman] Those sources that are spreading
coronavirus misinformation
1179
01:04:42,712 --> 01:04:45,798
have amassed
something like 52 million engagements.
1180
01:04:45,882 --> 01:04:50,094
You're saying that silver solution
would be effective.
1181
01:04:50,177 --> 01:04:54,140
Well, let's say it hasn't been tested
on this strain of the coronavirus, but...
1182
01:04:54,223 --> 01:04:57,226
[Tristan] What we're seeing with COVID is just an extreme version
1183
01:04:57,310 --> 01:05:00,521
of what's happening across our information ecosystem.
1184
01:05:00,938 --> 01:05:05,026
Social media amplifies exponential gossip and exponential hearsay
1185
01:05:05,109 --> 01:05:07,111
to the point that we don't know what's true,
1186
01:05:07,194 --> 01:05:08,946
no matter what issue we care about.
1187
01:05:15,161 --> 01:05:16,579
[teacher] He discovers this.
1188
01:05:16,662 --> 01:05:18,664
[continues lecturing indistinctly]
1189
01:05:19,874 --> 01:05:21,292
[Rebecca whispers] Ben.
1190
01:05:26,130 --> 01:05:28,257
-Are you still on the team?
-[Ben] Mm-hmm.
1191
01:05:30,384 --> 01:05:32,678
[Rebecca] Okay, well,
I'm gonna get a snack before practice
1192
01:05:32,762 --> 01:05:34,430
if you... wanna come.
1193
01:05:35,640 --> 01:05:36,515
[Ben] Hm?
1194
01:05:36,974 --> 01:05:38,601
[Rebecca] You know, never mind.
1195
01:05:38,684 --> 01:05:40,686
[footsteps fading]
1196
01:05:45,066 --> 01:05:47,526
[vlogger] Nine out of ten people are dissatisfied right now.
1197
01:05:47,610 --> 01:05:50,613
The EC is like any political movement in history, when you think about it.
1198
01:05:50,696 --> 01:05:54,492
We are standing up, and we are... we are standing up to this noise.
1199
01:05:54,575 --> 01:05:57,036
You are my people. I trust you guys.
1200
01:05:59,246 --> 01:06:02,583
-The Extreme Center content is brilliant.
-He absolutely loves it.
1201
01:06:02,667 --> 01:06:03,626
Running an auction.
1202
01:06:04,627 --> 01:06:08,547
840 bidders. He sold for 4.35 cents
to a weapons manufacturer.
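[The auction dramatized here resembles real-time bidding, where an impression is sold in the instant a page loads. A hypothetical sketch, assuming a second-price design in which the winner pays the runner-up's bid and using random bids in cents; actual platforms vary in their auction rules.]

    import random

    # 840 hypothetical bidders submit prices (in cents) for one impression.
    bids = {f"advertiser_{i}": random.uniform(0.5, 5.0) for i in range(840)}
    ranked = sorted(bids.items(), key=lambda kv: -kv[1])
    winner = ranked[0][0]
    clearing_price = ranked[1][1]  # second-highest bid sets the price
    print(f"{winner} wins the impression at {clearing_price:.2f} cents")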
1203
01:06:08,631 --> 01:06:10,800
Let's promote some of these events.
1204
01:06:10,883 --> 01:06:13,511
Upcoming rallies in his geographic zone
later this week.
1205
01:06:13,594 --> 01:06:15,179
I've got a new vlogger lined up, too.
1206
01:06:15,262 --> 01:06:16,263
[chuckles]
1207
01:06:17,765 --> 01:06:22,979
And... and, honestly, I'm telling you, I'm willing to do whatever it takes.
1208
01:06:23,062 --> 01:06:24,939
And I mean whatever.
1209
01:06:32,154 --> 01:06:33,197
-Subscribe...
-[Cass] Ben?
1210
01:06:33,280 --> 01:06:35,908
...and also come back because I'm telling you, yo...
1211
01:06:35,992 --> 01:06:38,869
-[knocking on door]
-...I got some real big things comin'.
1212
01:06:38,953 --> 01:06:40,162
Some real big things.
1213
01:06:40,788 --> 01:06:45,292
[Roger] One of the problems with Facebook
is that, as a tool of persuasion,
1214
01:06:45,793 --> 01:06:47,920
it may be the greatest thing ever created.
1215
01:06:48,004 --> 01:06:52,508
Now, imagine what that means in the hands
of a dictator or an authoritarian.
1216
01:06:53,718 --> 01:06:57,638
If you want to control the population
of your country,
1217
01:06:57,722 --> 01:07:01,308
there has never been a tool
as effective as Facebook.
1218
01:07:04,937 --> 01:07:07,398
[Cynthia]
Some of the most troubling implications
1219
01:07:07,481 --> 01:07:10,985
of governments and other bad actors
weaponizing social media,
1220
01:07:11,235 --> 01:07:13,612
um, is that it has led
to real, offline harm.
1221
01:07:13,696 --> 01:07:15,072
I think the most prominent example
1222
01:07:15,156 --> 01:07:17,658
that's gotten a lot of press
is what's happened in Myanmar.
1223
01:07:19,243 --> 01:07:21,203
In Myanmar,
when people think of the Internet,
1224
01:07:21,287 --> 01:07:22,913
what they are thinking about is Facebook.
1225
01:07:22,997 --> 01:07:25,916
And what often happens is
when people buy their cell phone,
1226
01:07:26,000 --> 01:07:29,920
the cell phone shop owner will actually
preload Facebook on there for them
1227
01:07:30,004 --> 01:07:31,505
and open an account for them.
1228
01:07:31,589 --> 01:07:34,884
And so when people get their phone,
the first thing they open
1229
01:07:34,967 --> 01:07:37,595
and the only thing they know how to open
is Facebook.
1230
01:07:38,179 --> 01:07:41,891
Well, a new bombshell investigation
exposes Facebook's growing struggle
1231
01:07:41,974 --> 01:07:43,809
to tackle hate speech in Myanmar.
1232
01:07:43,893 --> 01:07:46,020
[crowd shouting]
1233
01:07:46,103 --> 01:07:49,190
Facebook really gave the military
and other bad actors
1234
01:07:49,273 --> 01:07:51,776
a new way to manipulate public opinion
1235
01:07:51,859 --> 01:07:55,529
and to help incite violence
against the Rohingya Muslims
1236
01:07:55,613 --> 01:07:57,406
that included mass killings,
1237
01:07:58,115 --> 01:07:59,867
burning of entire villages,
1238
01:07:59,950 --> 01:08:03,704
mass rape, and other serious crimes
against humanity
1239
01:08:03,788 --> 01:08:04,955
that have now led
1240
01:08:05,039 --> 01:08:08,209
to 700,000 Rohingya Muslims
having to flee the country.
1241
01:08:11,170 --> 01:08:14,799
It's not
that highly motivated propagandists
1242
01:08:14,882 --> 01:08:16,550
haven't existed before.
1243
01:08:16,634 --> 01:08:19,762
It's that the platforms make it possible
1244
01:08:19,845 --> 01:08:23,724
to spread manipulative narratives
with phenomenal ease,
1245
01:08:23,808 --> 01:08:25,434
and without very much money.
1246
01:08:25,518 --> 01:08:27,812
If I want to manipulate an election,
1247
01:08:27,895 --> 01:08:30,564
I can now go into
a conspiracy theory group on Facebook,
1248
01:08:30,648 --> 01:08:32,233
and I can find 100 people
1249
01:08:32,316 --> 01:08:34,443
who believe
that the Earth is completely flat
1250
01:08:34,860 --> 01:08:37,780
and think it's all this conspiracy theory
that we landed on the moon,
1251
01:08:37,863 --> 01:08:41,450
and I can tell Facebook,
"Give me 1,000 users who look like that."
1252
01:08:42,118 --> 01:08:46,080
Facebook will happily send me
thousands of users that look like them
1253
01:08:46,163 --> 01:08:49,250
that I can now hit
with more conspiracy theories.
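["Give me 1,000 users who look like that" describes lookalike targeting, which can be framed as nearest-neighbor search over interest vectors. A hypothetical sketch with invented features and a cosine-similarity metric; no real ad platform's internals are shown.]

    import math

    def cosine(a, b):
        """Cosine similarity between two interest vectors."""
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb) if na and nb else 0.0

    # Invented interest vectors: [conspiracy_pages, science_pages, local_news]
    seed_group = [[9, 1, 2], [8, 0, 3]]  # the known believers
    candidates = {"u1": [7, 1, 2], "u2": [1, 9, 4], "u3": [8, 2, 1]}

    # Rank everyone else by similarity to the seed group's centroid.
    centroid = [sum(col) / len(seed_group) for col in zip(*seed_group)]
    lookalikes = sorted(candidates, key=lambda u: -cosine(candidates[u], centroid))
    print(lookalikes[:2])  # the most similar users get the next round of ads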
1254
01:08:50,376 --> 01:08:53,087
-[button clicks]
-Sold for 3.4 cents an impression.
1255
01:08:53,379 --> 01:08:56,382
-New EC video to promote.
-[Advertising AI] Another ad teed up.
1256
01:08:58,509 --> 01:09:00,928
[Justin] Algorithms
and manipulative politicians
1257
01:09:01,011 --> 01:09:02,138
are becoming so expert
1258
01:09:02,221 --> 01:09:04,056
at learning how to trigger us,
1259
01:09:04,140 --> 01:09:08,352
getting so good at creating fake news
that we absorb as if it were reality,
1260
01:09:08,435 --> 01:09:10,813
and confusing us into believing
those lies.
1261
01:09:10,896 --> 01:09:12,606
It's as though we have
less and less control
1262
01:09:12,690 --> 01:09:14,150
over who we are and what we believe.
1263
01:09:14,233 --> 01:09:16,235
[ominous instrumental music playing]
1264
01:09:31,375 --> 01:09:32,835
[vlogger] ...so they can pick sides.
1265
01:09:32,918 --> 01:09:34,879
There's lies here, and there's lies over there.
1266
01:09:34,962 --> 01:09:36,338
So they can keep the power,
1267
01:09:36,422 --> 01:09:39,967
-so they can control everything.
-[police siren blaring]
1268
01:09:40,050 --> 01:09:42,553
[vlogger] They can control our minds,
1269
01:09:42,636 --> 01:09:46,390
-so that they can keep their secrets.
-[crowd chanting]
1270
01:09:48,517 --> 01:09:50,895
[Tristan] Imagine a world
where no one believes anything true.
1271
01:09:52,897 --> 01:09:55,649
Everyone believes
the government's lying to them.
1272
01:09:56,317 --> 01:09:58,444
Everything is a conspiracy theory.
1273
01:09:58,527 --> 01:10:01,197
"I shouldn't trust anyone.
I hate the other side."
1274
01:10:01,280 --> 01:10:02,698
That's where all this is heading.
1275
01:10:02,781 --> 01:10:06,160
The political earthquakes in Europe
continue to rumble.
1276
01:10:06,243 --> 01:10:08,412
This time, in Italy and Spain.
1277
01:10:08,495 --> 01:10:11,999
[reporter] Overall, Europe's traditional, centrist coalition lost its majority
1278
01:10:12,082 --> 01:10:15,002
while far right and far left populist parties made gains.
1279
01:10:15,085 --> 01:10:16,086
[man shouts]
1280
01:10:16,170 --> 01:10:17,504
[crowd chanting]
1281
01:10:19,757 --> 01:10:20,591
Back up.
1282
01:10:21,300 --> 01:10:22,509
-[radio beeps]
-Okay, let's go.
1283
01:10:24,845 --> 01:10:26,847
[police siren wailing]
1284
01:10:28,390 --> 01:10:31,268
[reporter] These accounts were deliberately, specifically attempting
1285
01:10:31,352 --> 01:10:34,355
-to sow political discord in Hong Kong.
-[crowd shouting]
1286
01:10:36,440 --> 01:10:37,399
[sighs]
1287
01:10:38,609 --> 01:10:40,361
-All right, Ben.
-[car doors lock]
1288
01:10:42,863 --> 01:10:45,032
What does it look like to be a country
1289
01:10:45,115 --> 01:10:48,410
whose entire diet is Facebook
and social media?
1290
01:10:48,953 --> 01:10:50,871
Democracy crumbled quickly.
1291
01:10:50,955 --> 01:10:51,830
Six months.
1292
01:10:51,914 --> 01:10:53,791
[reporter 1] After that chaos in Chicago,
1293
01:10:53,874 --> 01:10:57,086
violent clashes between protesters and supporters...
1294
01:10:58,003 --> 01:11:01,632
[reporter 2] Democracy is facing a crisis of confidence.
1295
01:11:01,715 --> 01:11:04,343
What we're seeing is a global assault
on democracy.
1296
01:11:04,426 --> 01:11:05,427
[crowd shouting]
1297
01:11:05,511 --> 01:11:07,930
[Renée] Most of the countries
that are targeted are countries
1298
01:11:08,013 --> 01:11:09,723
that run democratic elections.
1299
01:11:10,641 --> 01:11:12,518
[Tristan] This is happening at scale.
1300
01:11:12,601 --> 01:11:15,562
By state actors,
by people with millions of dollars saying,
1301
01:11:15,646 --> 01:11:18,524
"I wanna destabilize Kenya.
I wanna destabilize Cameroon.
1302
01:11:18,607 --> 01:11:20,651
Oh, Angola? That only costs this much."
1303
01:11:20,734 --> 01:11:23,362
[reporter] An extraordinary election took place Sunday in Brazil.
1304
01:11:23,445 --> 01:11:25,823
With a campaign that's been powered
by social media.
1305
01:11:25,906 --> 01:11:29,702
[crowd chanting in Portuguese]
1306
01:11:31,036 --> 01:11:33,956
[Tristan] We in the tech industry
have created the tools
1307
01:11:34,039 --> 01:11:37,418
to destabilize
and erode the fabric of society
1308
01:11:37,501 --> 01:11:40,254
in every country, all at once, everywhere.
1309
01:11:40,337 --> 01:11:44,508
You have this in Germany, Spain, France,
Brazil, Australia.
1310
01:11:44,591 --> 01:11:47,261
Some of the most "developed nations"
in the world
1311
01:11:47,344 --> 01:11:49,221
are now imploding on each other,
1312
01:11:49,305 --> 01:11:50,931
and what do they have in common?
1313
01:11:51,974 --> 01:11:52,975
Knowing what you know now,
1314
01:11:53,058 --> 01:11:56,312
do you believe Facebook impacted
the results of the 2016 election?
1315
01:11:56,770 --> 01:11:58,814
[Mark Zuckerberg]
Oh, that's... that is hard.
1316
01:11:58,897 --> 01:12:00,691
You know, it's... the...
1317
01:12:01,275 --> 01:12:04,653
the reality is, well, there
were so many different forces at play.
1318
01:12:04,737 --> 01:12:07,865
Representatives from Facebook, Twitter,
and Google are back on Capitol Hill
1319
01:12:07,948 --> 01:12:09,450
for a second day of testimony
1320
01:12:09,533 --> 01:12:12,578
about Russia's interference
in the 2016 election.
1321
01:12:12,661 --> 01:12:17,291
The manipulation
by third parties is not a hack.
1322
01:12:18,500 --> 01:12:21,462
Right? The Russians didn't hack Facebook.
1323
01:12:21,545 --> 01:12:24,965
What they did was they used the tools
that Facebook created
1324
01:12:25,049 --> 01:12:27,843
for legitimate advertisers
and legitimate users,
1325
01:12:27,926 --> 01:12:30,346
and they applied it
to a nefarious purpose.
1326
01:12:32,014 --> 01:12:34,391
[Tristan]
It's like remote-control warfare.
1327
01:12:34,475 --> 01:12:36,602
One country can manipulate another one
1328
01:12:36,685 --> 01:12:39,229
without actually invading
its physical borders.
1329
01:12:39,605 --> 01:12:42,232
[reporter 1] We're seeing violent images. It appears to be a dumpster
1330
01:12:42,316 --> 01:12:43,317
being pushed around...
1331
01:12:43,400 --> 01:12:46,028
[Tristan] But it wasn't
about who you wanted to vote for.
1332
01:12:46,362 --> 01:12:50,574
It was about sowing total chaos
and division in society.
1333
01:12:50,657 --> 01:12:53,035
[reporter 2] Now, this was in Huntington Beach. A march...
1334
01:12:53,118 --> 01:12:54,870
[Tristan] It's about making two sides
1335
01:12:54,953 --> 01:12:56,413
who couldn't hear each other anymore,
1336
01:12:56,497 --> 01:12:58,123
who didn't want to hear each other
anymore,
1337
01:12:58,207 --> 01:12:59,875
who didn't trust each other anymore.
1338
01:12:59,958 --> 01:13:03,212
[reporter 3] This is a city where hatred was laid bare
1339
01:13:03,295 --> 01:13:05,464
and transformed into racial violence.
1340
01:13:05,547 --> 01:13:07,549
[crowd shouting]
1341
01:13:09,009 --> 01:13:11,178
[indistinct shouting]
1342
01:13:12,471 --> 01:13:14,014
[men grunting]
1343
01:13:17,851 --> 01:13:20,062
[police siren blaring]
1344
01:13:20,145 --> 01:13:20,979
[Cass] Ben!
1345
01:13:21,605 --> 01:13:22,439
Cassandra!
1346
01:13:22,981 --> 01:13:23,816
-Cass!
-Ben!
1347
01:13:23,899 --> 01:13:25,484
[officer 1] Come here! Come here!
1348
01:13:27,486 --> 01:13:31,156
Arms up. Arms up.
Get down on your knees. Now, down.
1349
01:13:31,240 --> 01:13:32,491
[crowd continues shouting]
1350
01:13:36,120 --> 01:13:37,204
-[officer 2] Calm--
-Ben!
1351
01:13:37,287 --> 01:13:38,664
[officer 2] Hey! Hands up!
1352
01:13:39,623 --> 01:13:41,750
Turn around. On the ground. On the ground!
1353
01:13:43,710 --> 01:13:46,463
-[crowd echoing]
-[melancholy piano music playing]
1354
01:13:51,969 --> 01:13:54,388
[siren continues wailing]
1355
01:13:56,723 --> 01:14:00,018
[Tristan] Do we want this system for sale
to the highest bidder?
1356
01:14:01,437 --> 01:14:05,399
For democracy to be completely for sale,
where you can reach any mind you want,
1357
01:14:05,482 --> 01:14:09,069
target a lie to that specific population,
and create culture wars?
1358
01:14:09,236 --> 01:14:10,237
Do we want that?
1359
01:14:14,700 --> 01:14:16,577
[Marco Rubio] We are a nation of people...
1360
01:14:16,952 --> 01:14:18,871
that no longer speak to each other.
1361
01:14:19,872 --> 01:14:23,000
We are a nation of people
who have stopped being friends with people
1362
01:14:23,083 --> 01:14:25,461
because of who they voted for
in the last election.
1363
01:14:25,878 --> 01:14:28,422
We are a nation of people
who have isolated ourselves
1364
01:14:28,505 --> 01:14:30,966
to only watch channels
that tell us that we're right.
1365
01:14:32,259 --> 01:14:36,597
My message here today is that tribalism
is ruining us.
1366
01:14:37,347 --> 01:14:39,183
It is tearing our country apart.
1367
01:14:40,267 --> 01:14:42,811
It is no way for sane adults to act.
1368
01:14:43,187 --> 01:14:45,314
If everyone's entitled to their own facts,
1369
01:14:45,397 --> 01:14:49,401
there's really no need for compromise,
no need for people to come together.
1370
01:14:49,485 --> 01:14:51,695
In fact, there's really no need
for people to interact.
1371
01:14:52,321 --> 01:14:53,530
We need to have...
1372
01:14:53,989 --> 01:14:58,410
some shared understanding of reality.
Otherwise, we aren't a country.
1373
01:14:58,952 --> 01:15:02,998
So, uh, long-term, the solution here is
to build more AI tools
1374
01:15:03,081 --> 01:15:08,128
that find patterns of people using
the services that no real person would do.
1375
01:15:08,212 --> 01:15:11,840
We are allowing the technologists
to frame this as a problem
1376
01:15:11,924 --> 01:15:13,884
that they're equipped to solve.
1377
01:15:15,135 --> 01:15:16,470
That is... That's a lie.
1378
01:15:17,679 --> 01:15:20,724
People talk about AI
as if it will know truth.
1379
01:15:21,683 --> 01:15:23,685
AI's not gonna solve these problems.
1380
01:15:24,269 --> 01:15:27,189
AI cannot solve the problem of fake news.
1381
01:15:28,649 --> 01:15:31,026
Google doesn't have the option of saying,
1382
01:15:31,109 --> 01:15:36,240
"Oh, is this conspiracy? Is this truth?"
Because they don't know what truth is.
1383
01:15:36,782 --> 01:15:37,783
They don't have a...
1384
01:15:37,908 --> 01:15:40,827
They don't have a proxy for truth
that's better than a click.
1385
01:15:41,870 --> 01:15:45,123
If we don't agree on what is true
1386
01:15:45,207 --> 01:15:47,584
or that there is such a thing as truth,
1387
01:15:48,293 --> 01:15:49,294
we're toast.
1388
01:15:49,753 --> 01:15:52,089
This is the problem
beneath other problems
1389
01:15:52,172 --> 01:15:54,424
because if we can't agree on what's true,
1390
01:15:55,092 --> 01:15:57,803
then we can't navigate
out of any of our problems.
1391
01:15:57,886 --> 01:16:00,806
-[ominous instrumental music playing]
-[console droning]
1392
01:16:05,435 --> 01:16:07,729
[Growth AI] We should suggest
Flat Earth Football Club.
1393
01:16:07,813 --> 01:16:10,566
[Engagement AI] Don't show him
sports updates. He doesn't engage.
1394
01:16:11,483 --> 01:16:14,027
[AIs speaking indistinctly]
1395
01:16:15,696 --> 01:16:17,698
[music swells]
1396
01:16:39,886 --> 01:16:42,764
[Jaron] A lot of people in Silicon Valley
subscribe to some kind of theory
1397
01:16:42,848 --> 01:16:45,142
that we're building
some global super brain,
1398
01:16:45,309 --> 01:16:48,020
and all of our users
are just interchangeable little neurons,
1399
01:16:48,103 --> 01:16:49,563
no one of which is important.
1400
01:16:50,230 --> 01:16:53,150
And it subjugates people
into this weird role
1401
01:16:53,233 --> 01:16:56,069
where you're just, like,
this little computing element
1402
01:16:56,153 --> 01:16:58,905
that we're programming
through our behavior manipulation
1403
01:16:58,989 --> 01:17:02,367
for the service of this giant brain,
and you don't matter.
1404
01:17:02,451 --> 01:17:04,911
You're not gonna get paid.
You're not gonna get acknowledged.
1405
01:17:04,995 --> 01:17:06,455
You don't have self-determination.
1406
01:17:06,538 --> 01:17:09,416
We'll sneakily just manipulate you
because you're a computing node,
1407
01:17:09,499 --> 01:17:12,336
so we need to program you 'cause that's
what you do with computing nodes.
1408
01:17:14,504 --> 01:17:16,506
[reflective instrumental music playing]
1409
01:17:20,093 --> 01:17:21,845
Oh, man. [sighs]
1410
01:17:21,928 --> 01:17:25,390
[Tristan] When you think about technology
and it being an existential threat,
1411
01:17:25,474 --> 01:17:28,060
you know, that's a big claim, and...
1412
01:17:29,603 --> 01:17:33,982
it's easy to then, in your mind, think,
"Okay, so, there I am with the phone...
1413
01:17:35,609 --> 01:17:37,235
scrolling, clicking, using it.
1414
01:17:37,319 --> 01:17:39,196
Like, where's the existential threat?
1415
01:17:40,280 --> 01:17:41,615
Okay, there's the supercomputer.
1416
01:17:41,698 --> 01:17:43,950
The other side of the screen,
pointed at my brain,
1417
01:17:44,409 --> 01:17:47,537
got me to watch one more video.
Where's the existential threat?"
1418
01:17:47,621 --> 01:17:49,623
[indistinct chatter]
1419
01:17:54,252 --> 01:17:57,631
[Tristan] It's not
about the technology
1420
01:17:57,714 --> 01:17:59,341
being the existential threat.
1421
01:18:03,679 --> 01:18:06,264
It's the technology's ability
1422
01:18:06,348 --> 01:18:09,476
to bring out the worst in society...
[chuckles]
1423
01:18:09,559 --> 01:18:13,522
...and the worst in society
being the existential threat.
1424
01:18:18,819 --> 01:18:20,570
If technology creates...
1425
01:18:21,697 --> 01:18:23,115
mass chaos,
1426
01:18:23,198 --> 01:18:24,533
outrage, incivility,
1427
01:18:24,616 --> 01:18:26,326
lack of trust in each other,
1428
01:18:27,452 --> 01:18:30,414
loneliness, alienation, more polarization,
1429
01:18:30,706 --> 01:18:33,333
more election hacking, more populism,
1430
01:18:33,917 --> 01:18:36,962
more distraction and inability
to focus on the real issues...
1431
01:18:37,963 --> 01:18:39,715
that's just society. [scoffs]
1432
01:18:40,340 --> 01:18:46,388
And now society
is incapable of healing itself
1433
01:18:46,471 --> 01:18:48,515
and just devolving into a kind of chaos.
1434
01:18:51,977 --> 01:18:54,938
This affects everyone,
even if you don't use these products.
1435
01:18:55,397 --> 01:18:57,524
These things have become
digital Frankensteins
1436
01:18:57,607 --> 01:19:00,068
that are terraforming the world
in their image,
1437
01:19:00,152 --> 01:19:01,862
whether it's the mental health of children
1438
01:19:01,945 --> 01:19:04,489
or our politics
and our political discourse,
1439
01:19:04,573 --> 01:19:07,492
without taking responsibility
for taking over the public square.
1440
01:19:07,576 --> 01:19:10,579
-So, again, it comes back to--
-And who do you think's responsible?
1441
01:19:10,662 --> 01:19:13,582
I think we have
to have the platforms be responsible
1442
01:19:13,665 --> 01:19:15,584
for when they take over
election advertising,
1443
01:19:15,667 --> 01:19:17,794
they're responsible
for protecting elections.
1444
01:19:17,878 --> 01:19:20,380
When they take over mental health of kids
or Saturday morning,
1445
01:19:20,464 --> 01:19:22,841
they're responsible
for protecting Saturday morning.
1446
01:19:23,592 --> 01:19:27,929
The race to keep people's attention
isn't going away.
1447
01:19:28,388 --> 01:19:31,850
Our technology's gonna become
more integrated into our lives, not less.
1448
01:19:31,933 --> 01:19:34,895
The AIs are gonna get better at predicting
what keeps us on the screen,
1449
01:19:34,978 --> 01:19:37,105
not worse at predicting
what keeps us on the screen.
1450
01:19:38,940 --> 01:19:42,027
I... I am 62 years old,
1451
01:19:42,110 --> 01:19:44,821
getting older every minute,
the more this conversation goes on...
1452
01:19:44,905 --> 01:19:48,033
-[crowd chuckles]
-...but... but I will tell you that, um...
1453
01:19:48,700 --> 01:19:52,370
I'm probably gonna be dead and gone,
and I'll probably be thankful for it,
1454
01:19:52,454 --> 01:19:54,331
when all this shit comes to fruition.
1455
01:19:54,790 --> 01:19:59,586
Because... Because I think
that this scares me to death.
1456
01:20:00,754 --> 01:20:03,048
Do... Do you...
Do you see it the same way?
1457
01:20:03,548 --> 01:20:06,885
Or am I overreacting to a situation
that I don't know enough about?
1458
01:20:09,805 --> 01:20:11,598
[interviewer]
What are you most worried about?
1459
01:20:13,850 --> 01:20:18,480
[sighs] I think,
in the... in the shortest time horizon...
1460
01:20:19,523 --> 01:20:20,524
civil war.
1461
01:20:24,444 --> 01:20:29,908
If we go down the current status quo
for, let's say, another 20 years...
1462
01:20:31,117 --> 01:20:34,579
we probably destroy our civilization
through willful ignorance.
1463
01:20:34,663 --> 01:20:37,958
We probably fail to meet the challenge
of climate change.
1464
01:20:38,041 --> 01:20:42,087
We probably degrade
the world's democracies
1465
01:20:42,170 --> 01:20:46,132
so that they fall into some sort
of bizarre autocratic dysfunction.
1466
01:20:46,216 --> 01:20:48,426
We probably ruin the global economy.
1467
01:20:48,760 --> 01:20:52,264
Uh, we probably, um, don't survive.
1468
01:20:52,347 --> 01:20:54,808
You know,
I... I really do view it as existential.
1469
01:20:54,891 --> 01:20:56,893
[helicopter blades whirring]
1470
01:21:02,524 --> 01:21:04,985
[Tristan]
Is this the last generation of people
1471
01:21:05,068 --> 01:21:08,488
that are gonna know what it was like
before this illusion took place?
1472
01:21:11,074 --> 01:21:14,578
Like, how do you wake up from the matrix
when you don't know you're in the matrix?
1473
01:21:14,661 --> 01:21:16,538
[ominous instrumental music playing]
1474
01:21:27,382 --> 01:21:30,635
[Tristan] A lot of what we're saying
sounds like it's just this...
1475
01:21:31,511 --> 01:21:33,680
one-sided doom and gloom.
1476
01:21:33,763 --> 01:21:36,808
Like, "Oh, my God,
technology's just ruining the world
1477
01:21:36,892 --> 01:21:38,059
and it's ruining kids,"
1478
01:21:38,143 --> 01:21:40,061
and it's like... "No." [chuckles]
1479
01:21:40,228 --> 01:21:44,065
It's confusing
because it's simultaneous utopia...
1480
01:21:44,608 --> 01:21:45,567
and dystopia.
1481
01:21:45,942 --> 01:21:50,447
Like, I could hit a button on my phone,
and a car shows up in 30 seconds,
1482
01:21:50,530 --> 01:21:52,699
and I can go exactly where I need to go.
1483
01:21:52,782 --> 01:21:55,660
That is magic. That's amazing.
1484
01:21:56,161 --> 01:21:57,662
When we were making the like button,
1485
01:21:57,746 --> 01:22:01,499
our entire motivation was, "Can we spread
positivity and love in the world?"
1486
01:22:01,583 --> 01:22:05,003
The idea that, fast-forward to today,
and teens would be getting depressed
1487
01:22:05,086 --> 01:22:06,421
when they don't have enough likes,
1488
01:22:06,504 --> 01:22:08,632
or it could be leading
to political polarization
1489
01:22:08,715 --> 01:22:09,883
was nowhere on our radar.
1490
01:22:09,966 --> 01:22:12,135
I don't think these guys set out
to be evil.
1491
01:22:13,511 --> 01:22:15,764
It's just the business model
that has a problem.
1492
01:22:15,847 --> 01:22:20,226
You could shut down the service
and destroy whatever it is--
1493
01:22:20,310 --> 01:22:24,522
$20 billion of shareholder value--
and get sued and...
1494
01:22:24,606 --> 01:22:27,108
But you can't, in practice,
put the genie back in the bottle.
1495
01:22:27,192 --> 01:22:30,403
You can make some tweaks,
but at the end of the day,
1496
01:22:30,487 --> 01:22:34,032
you've gotta grow revenue and usage,
quarter over quarter. It's...
1497
01:22:34,658 --> 01:22:37,535
The bigger it gets,
the harder it is for anyone to change.
1498
01:22:38,495 --> 01:22:43,458
What I see is a bunch of people
who are trapped by a business model,
1499
01:22:43,541 --> 01:22:46,169
an economic incentive,
and shareholder pressure
1500
01:22:46,252 --> 01:22:48,922
that makes it almost impossible
to do something else.
1501
01:22:49,005 --> 01:22:50,924
I think we need to accept that it's okay
1502
01:22:51,007 --> 01:22:53,176
for companies to be focused
on making money.
1503
01:22:53,259 --> 01:22:55,637
What's not okay
is when there's no regulation, no rules,
1504
01:22:55,720 --> 01:22:56,888
and no competition,
1505
01:22:56,972 --> 01:23:00,850
and the companies are acting
as sort of de facto governments.
1506
01:23:00,934 --> 01:23:03,353
And then they're saying,
"Well, we can regulate ourselves."
1507
01:23:03,436 --> 01:23:05,981
I mean, that's just a lie.
That's just ridiculous.
1508
01:23:06,064 --> 01:23:08,650
Financial incentives kind of run
the world,
1509
01:23:08,733 --> 01:23:12,529
so any solution to this problem
1510
01:23:12,612 --> 01:23:15,573
has to realign the financial incentives.
1511
01:23:16,074 --> 01:23:18,785
There's no fiscal reason
for these companies to change.
1512
01:23:18,868 --> 01:23:21,329
And that is why I think
we need regulation.
1513
01:23:21,413 --> 01:23:24,290
The phone company
has tons of sensitive data about you,
1514
01:23:24,374 --> 01:23:27,544
and we have a lot of laws that make sure
they don't do the wrong things.
1515
01:23:27,627 --> 01:23:31,506
We have almost no laws
around digital privacy, for example.
1516
01:23:31,589 --> 01:23:34,426
We could tax data collection
and processing
1517
01:23:34,509 --> 01:23:37,554
the same way that you, for example,
pay your water bill
1518
01:23:37,637 --> 01:23:39,723
by monitoring the amount of water
that you use.
1519
01:23:39,806 --> 01:23:43,226
You tax these companies on the data assets
that they have.
1520
01:23:43,309 --> 01:23:44,769
It gives them a fiscal reason
1521
01:23:44,853 --> 01:23:47,856
to not acquire every piece of data
on the planet.
1522
01:23:47,939 --> 01:23:50,567
The law runs way behind on these things,
1523
01:23:50,650 --> 01:23:55,864
but what I know is the current situation
exists not for the protection of users,
1524
01:23:55,947 --> 01:23:58,700
but for the protection
of the rights and privileges
1525
01:23:58,783 --> 01:24:01,453
of these gigantic,
incredibly wealthy companies.
1526
01:24:02,245 --> 01:24:05,832
Are we always gonna defer to the richest,
most powerful people?
1527
01:24:05,915 --> 01:24:07,417
Or are we ever gonna say,
1528
01:24:07,959 --> 01:24:12,047
"You know, there are times
when there is a national interest.
1529
01:24:12,130 --> 01:24:15,592
There are times
when the interests of people, of users,
1530
01:24:15,675 --> 01:24:17,385
is actually more important
1531
01:24:18,011 --> 01:24:21,473
than the profits of somebody
who's already a billionaire"?
1532
01:24:21,556 --> 01:24:26,603
These markets undermine democracy,
and they undermine freedom,
1533
01:24:26,686 --> 01:24:28,521
and they should be outlawed.
1534
01:24:29,147 --> 01:24:31,816
This is not a radical proposal.
1535
01:24:31,900 --> 01:24:34,194
There are other markets that we outlaw.
1536
01:24:34,277 --> 01:24:36,988
We outlaw markets in human organs.
1537
01:24:37,072 --> 01:24:39,491
We outlaw markets in human slaves.
1538
01:24:39,949 --> 01:24:44,037
Because they have
inevitable destructive consequences.
1539
01:24:44,537 --> 01:24:45,830
We live in a world
1540
01:24:45,914 --> 01:24:50,001
in which a tree is worth more,
financially, dead than alive,
1541
01:24:50,085 --> 01:24:53,838
in a world in which a whale
is worth more dead than alive.
1542
01:24:53,922 --> 01:24:56,341
For so long as our economy works
in that way
1543
01:24:56,424 --> 01:24:58,134
and corporations go unregulated,
1544
01:24:58,218 --> 01:25:00,678
they're going to continue
to destroy trees,
1545
01:25:00,762 --> 01:25:01,763
to kill whales,
1546
01:25:01,846 --> 01:25:06,101
to mine the earth, and to continue
to pull oil out of the ground,
1547
01:25:06,184 --> 01:25:08,394
even though we know
it is destroying the planet
1548
01:25:08,478 --> 01:25:12,148
and we know that it's going to leave
a worse world for future generations.
1549
01:25:12,232 --> 01:25:13,858
This is short-term thinking
1550
01:25:13,942 --> 01:25:16,694
based on this religion of profit
at all costs,
1551
01:25:16,778 --> 01:25:20,156
as if somehow, magically, each corporation
acting in its selfish interest
1552
01:25:20,240 --> 01:25:21,950
is going to produce the best result.
1553
01:25:22,033 --> 01:25:24,494
This has been affecting the environment
for a long time.
1554
01:25:24,577 --> 01:25:27,288
What's frightening,
and what hopefully is the last straw
1555
01:25:27,372 --> 01:25:29,207
that will make us wake up
as a civilization
1556
01:25:29,290 --> 01:25:31,709
to how flawed this theory has been
in the first place
1557
01:25:31,793 --> 01:25:35,004
is to see that now we're the tree,
we're the whale.
1558
01:25:35,088 --> 01:25:37,048
Our attention can be mined.
1559
01:25:37,132 --> 01:25:39,134
We are more profitable to a corporation
1560
01:25:39,217 --> 01:25:41,594
if we're spending time
staring at a screen,
1561
01:25:41,678 --> 01:25:42,971
staring at an ad,
1562
01:25:43,054 --> 01:25:45,890
than if we're spending that time
living our life in a rich way.
1563
01:25:45,974 --> 01:25:47,559
And so, we're seeing the results of that.
1564
01:25:47,642 --> 01:25:50,687
We're seeing corporations using
powerful artificial intelligence
1565
01:25:50,770 --> 01:25:53,648
to outsmart us and figure out
how to pull our attention
1566
01:25:53,731 --> 01:25:55,358
toward the things they want us to look at,
1567
01:25:55,441 --> 01:25:57,277
rather than the things
that are most consistent
1568
01:25:57,360 --> 01:25:59,237
with our goals and our values
and our lives.
1569
01:25:59,320 --> 01:26:01,322
[static crackles]
1570
01:26:02,991 --> 01:26:04,450
[crowd cheering]
1571
01:26:05,535 --> 01:26:06,911
[Steve Jobs] What a computer is,
1572
01:26:06,995 --> 01:26:10,290
is it's the most remarkable tool
that we've ever come up with.
1573
01:26:11,124 --> 01:26:13,877
And it's the equivalent of a bicycle
for our minds.
1574
01:26:15,628 --> 01:26:20,091
The idea of humane technology,
that's where Silicon Valley got its start.
1575
01:26:21,050 --> 01:26:25,722
And we've lost sight of it
because it became the cool thing to do,
1576
01:26:25,805 --> 01:26:27,265
as opposed to the right thing to do.
1577
01:26:27,348 --> 01:26:29,726
The Internet was, like,
a weird, wacky place.
1578
01:26:29,809 --> 01:26:31,394
It was experimental.
1579
01:26:31,477 --> 01:26:34,731
Creative things happened on the Internet,
and certainly, they do still,
1580
01:26:34,814 --> 01:26:38,610
but, like, it just feels like this,
like, giant mall. [chuckles]
1581
01:26:38,693 --> 01:26:42,071
You know, it's just like, "God,
there's gotta be...
1582
01:26:42,155 --> 01:26:44,157
there's gotta be more to it than that."
1583
01:26:44,991 --> 01:26:45,992
[man typing]
1584
01:26:46,659 --> 01:26:48,411
[Bailey] I guess I'm just an optimist.
1585
01:26:48,494 --> 01:26:52,040
'Cause I think we can change
what social media looks like and means.
1586
01:26:54,083 --> 01:26:56,711
[Justin] The way the technology works
is not a law of physics.
1587
01:26:56,794 --> 01:26:57,921
It is not set in stone.
1588
01:26:58,004 --> 01:27:02,175
These are choices that human beings
like myself have been making.
1589
01:27:02,759 --> 01:27:05,345
And human beings can change
those technologies.
1590
01:27:06,971 --> 01:27:09,974
[Tristan] And the question now is
whether or not we're willing to admit
1591
01:27:10,475 --> 01:27:15,438
that those bad outcomes are coming
directly as a product of our work.
1592
01:27:21,027 --> 01:27:24,864
It's that we built these things,
and we have a responsibility to change it.
1593
01:27:28,409 --> 01:27:30,411
[static crackling]
1594
01:27:37,085 --> 01:27:38,711
[Tristan] The attention extraction model
1595
01:27:38,795 --> 01:27:42,298
is not how we want to treat
human beings.
1596
01:27:45,343 --> 01:27:48,137
[distorted] Is it just me or...
1597
01:27:49,722 --> 01:27:51,099
[distorted] Poor sucker.
1598
01:27:51,516 --> 01:27:53,226
[Tristan] The fabric of a healthy society
1599
01:27:53,309 --> 01:27:56,145
depends on us getting off
this corrosive business model.
1600
01:27:56,938 --> 01:27:58,064
[console beeps]
1601
01:27:58,147 --> 01:28:00,149
[gentle instrumental music playing]
1602
01:28:01,526 --> 01:28:04,612
[console whirs, grows quiet]
1603
01:28:04,696 --> 01:28:08,157
[Tristan] We can demand
that these products be designed humanely.
1604
01:28:09,409 --> 01:28:13,121
We can demand to not be treated
as an extractable resource.
1605
01:28:15,164 --> 01:28:18,334
The intention could be:
"How do we make the world better?"
1606
01:28:20,336 --> 01:28:21,504
[Jaron] Throughout history,
1607
01:28:21,587 --> 01:28:23,798
every single time
something's gotten better,
1608
01:28:23,881 --> 01:28:26,342
it's because somebody has come along
to say,
1609
01:28:26,426 --> 01:28:28,428
"This is stupid. We can do better."
[laughs]
1610
01:28:29,178 --> 01:28:32,557
Like, it's the critics
that drive improvement.
1611
01:28:33,141 --> 01:28:35,393
It's the critics
who are the true optimists.
1612
01:28:37,020 --> 01:28:39,147
[sighs] Hello.
1613
01:28:42,984 --> 01:28:44,277
[sighs] Um...
1614
01:28:46,195 --> 01:28:47,697
I mean, it seems kind of crazy, right?
1615
01:28:47,780 --> 01:28:51,534
It's like the fundamental way
that this stuff is designed...
1616
01:28:52,994 --> 01:28:55,163
isn't going in a good direction.
[chuckles]
1617
01:28:55,246 --> 01:28:56,873
Like, the entire thing.
1618
01:28:56,956 --> 01:29:00,626
So, it sounds crazy to say
we need to change all that,
1619
01:29:01,169 --> 01:29:02,670
but that's what we need to do.
1620
01:29:04,297 --> 01:29:05,923
[interviewer] Think we're gonna get there?
1621
01:29:07,383 --> 01:29:08,301
We have to.
1622
01:29:14,515 --> 01:29:16,476
[tense instrumental music playing]
1623
01:29:20,646 --> 01:29:24,942
[interviewer] Um,
it seems like you're very optimistic.
1624
01:29:26,194 --> 01:29:27,570
-Is that how I sound?
-[crew laughs]
1625
01:29:27,653 --> 01:29:28,905
[interviewer] Yeah, I mean...
1626
01:29:28,988 --> 01:29:31,449
I can't believe you keep saying that,
because I'm like, "Really?
1627
01:29:31,532 --> 01:29:33,409
I feel like we're headed toward dystopia.
1628
01:29:33,493 --> 01:29:35,328
I feel like we're on the fast track
to dystopia,
1629
01:29:35,411 --> 01:29:37,830
and it's gonna take a miracle
to get us out of it."
1630
01:29:37,914 --> 01:29:40,291
And that miracle is, of course,
collective will.
1631
01:29:41,000 --> 01:29:44,587
I am optimistic
that we're going to figure it out,
1632
01:29:44,670 --> 01:29:47,048
but I think it's gonna take a long time.
1633
01:29:47,131 --> 01:29:50,385
Because not everybody recognizes
that this is a problem.
1634
01:29:50,468 --> 01:29:55,890
I think one of the big failures
in technology today
1635
01:29:55,973 --> 01:29:58,643
is a real failure of leadership,
1636
01:29:58,726 --> 01:30:01,979
of, like, people coming out
and having these open conversations
1637
01:30:02,063 --> 01:30:05,900
about things that... not just
what went well, but what isn't perfect
1638
01:30:05,983 --> 01:30:08,194
so that someone can come in
and build something new.
1639
01:30:08,277 --> 01:30:10,321
At the end of the day, you know,
1640
01:30:10,405 --> 01:30:14,617
this machine isn't gonna turn around
until there's massive public pressure.
1641
01:30:14,700 --> 01:30:18,329
By having these conversations
and... and voicing your opinion,
1642
01:30:18,413 --> 01:30:21,082
in some cases
through these very technologies,
1643
01:30:21,165 --> 01:30:24,252
we can start to change the tide.
We can start to change the conversation.
1644
01:30:24,335 --> 01:30:27,004
It might sound strange,
but it's my world. It's my community.
1645
01:30:27,088 --> 01:30:29,632
I don't hate them. I don't wanna do
any harm to Google or Facebook.
1646
01:30:29,715 --> 01:30:32,885
I just want to reform them
so they don't destroy the world. You know?
1647
01:30:32,969 --> 01:30:35,513
I've uninstalled a ton of apps
from my phone
1648
01:30:35,596 --> 01:30:37,723
that I felt were just wasting my time.
1649
01:30:37,807 --> 01:30:40,685
All the social media apps,
all the news apps,
1650
01:30:40,768 --> 01:30:42,520
and I've turned off notifications
1651
01:30:42,603 --> 01:30:45,815
on anything that was vibrating my leg
with information
1652
01:30:45,898 --> 01:30:48,943
that wasn't timely and important to me
right now.
1653
01:30:49,026 --> 01:30:51,279
It's for the same reason
I don't keep cookies in my pocket.
1654
01:30:51,362 --> 01:30:53,197
Reduce the number of notifications
you get.
1655
01:30:53,281 --> 01:30:54,449
Turn off notifications.
1656
01:30:54,532 --> 01:30:55,950
Turning off all notifications.
1657
01:30:56,033 --> 01:30:58,536
I'm not using Google anymore,
I'm using Qwant,
1658
01:30:58,619 --> 01:31:01,497
which doesn't store your search history.
1659
01:31:01,581 --> 01:31:04,459
Never accept a video recommended to you
on YouTube.
1660
01:31:04,542 --> 01:31:07,003
Always choose.
That's another way to fight.
1661
01:31:07,086 --> 01:31:12,133
There are tons of Chrome extensions
that remove recommendations.
1662
01:31:12,216 --> 01:31:15,178
[interviewer] You're recommending
something to undo what you made.
1663
01:31:15,261 --> 01:31:16,554
[laughing] Yep.
1664
01:31:16,929 --> 01:31:21,642
Before you share, fact-check,
consider the source, do that extra Google.
1665
01:31:21,726 --> 01:31:25,104
If it seems like it's something designed
to really push your emotional buttons,
1666
01:31:25,188 --> 01:31:26,314
like, it probably is.
1667
01:31:26,397 --> 01:31:29,025
Essentially, you vote with your clicks.
1668
01:31:29,108 --> 01:31:30,359
If you click on clickbait,
1669
01:31:30,443 --> 01:31:33,779
you're creating a financial incentive
that perpetuates this existing system.
1670
01:31:33,863 --> 01:31:36,949
Make sure that you get
lots of different kinds of information
1671
01:31:37,033 --> 01:31:37,909
in your own life.
1672
01:31:37,992 --> 01:31:40,995
I follow people on Twitter
that I disagree with
1673
01:31:41,078 --> 01:31:44,207
because I want to be exposed
to different points of view.
1674
01:31:44,665 --> 01:31:46,584
Notice that many people
in the tech industry
1675
01:31:46,667 --> 01:31:49,045
don't give these devices
to their own children.
1676
01:31:49,128 --> 01:31:51,047
My kids don't use social media at all.
1677
01:31:51,839 --> 01:31:53,549
[interviewer] Is that a rule,
or is that a...
1678
01:31:53,633 --> 01:31:54,509
That's a rule.
1679
01:31:55,092 --> 01:31:57,845
We are zealots about it.
1680
01:31:57,929 --> 01:31:59,222
We're... We're crazy.
1681
01:31:59,305 --> 01:32:05,603
And we don't let our kids have
really any screen time.
1682
01:32:05,686 --> 01:32:08,564
I've worked out
what I think are three simple rules, um,
1683
01:32:08,648 --> 01:32:12,610
that make life a lot easier for families
and that are justified by the research.
1684
01:32:12,693 --> 01:32:15,571
So, the first rule is
all devices out of the bedroom
1685
01:32:15,655 --> 01:32:17,281
at a fixed time every night.
1686
01:32:17,365 --> 01:32:20,535
Whatever the time is, half an hour
before bedtime, all devices out.
1687
01:32:20,618 --> 01:32:24,038
The second rule is no social media
until high school.
1688
01:32:24,121 --> 01:32:26,374
Personally, I think the age should be 16.
1689
01:32:26,457 --> 01:32:28,960
Middle school's hard enough.
Keep it out until high school.
1690
01:32:29,043 --> 01:32:32,964
And the third rule is
work out a time budget with your kid.
1691
01:32:33,047 --> 01:32:34,757
And if you talk with them and say,
1692
01:32:34,840 --> 01:32:37,927
"Well, how many hours a day
do you wanna spend on your device?
1693
01:32:38,010 --> 01:32:39,637
What do you think is a good amount?"
1694
01:32:39,720 --> 01:32:41,597
they'll often say
something pretty reasonable.
1695
01:32:42,056 --> 01:32:44,642
Well, look, I know perfectly well
1696
01:32:44,725 --> 01:32:48,563
that I'm not gonna get everybody
to delete their social media accounts,
1697
01:32:48,646 --> 01:32:50,439
but I think I can get a few.
1698
01:32:50,523 --> 01:32:54,402
And just getting a few people
to delete their accounts matters a lot,
1699
01:32:54,485 --> 01:32:58,406
and the reason why is that that creates
the space for a conversation
1700
01:32:58,489 --> 01:33:00,908
because I want there to be enough people
out in the society
1701
01:33:00,992 --> 01:33:05,204
who are free of the manipulation engines
to have a societal conversation
1702
01:33:05,288 --> 01:33:07,540
that isn't bounded
by the manipulation engines.
1703
01:33:07,623 --> 01:33:10,126
So, do it! Get out of the system.
1704
01:33:10,209 --> 01:33:12,503
Yeah, delete. Get off the stupid stuff.
1705
01:33:13,546 --> 01:33:16,507
The world's beautiful.
Look. Look, it's great out there.
1706
01:33:17,258 --> 01:33:18,384
[laughs]
1707
01:33:21,971 --> 01:33:24,432
-[birds singing]
-[children playing and shouting]