1
00:00:31,114 --> 00:00:34,659
Why don't you go ahead?
Sit down and see if you can get comfy.
2
00:00:37,579 --> 00:00:39,789
- You good? All right.
- Yeah.
3
00:00:39,914 --> 00:00:42,125
Um...
4
00:00:43,043 --> 00:00:44,794
Take one, marker.
5
00:00:46,796 --> 00:00:48,798
Wanna start
by introducing yourself?
6
00:00:50,467 --> 00:00:53,344
Hello, world. Bailey. Take three.
7
00:00:53,970 --> 00:00:56,347
- You good?
- This is the worst part, man.
8
00:00:56,890 --> 00:00:59,517
I don't like this.
9
00:00:59,851 --> 00:01:02,228
I worked at Facebook in 2011 and 2012.
10
00:01:02,312 --> 00:01:05,190
I was one of the really early employees
at Instagram.
11
00:01:05,272 --> 00:01:08,693
I worked at, uh, Google, uh, YouTube.
12
00:01:08,777 --> 00:01:11,696
Apple, Google, Twitter, Palm.
13
00:01:12,739 --> 00:01:15,533
I helped start Mozilla Labs
and switched over to the Firefox side.
14
00:01:15,617 --> 00:01:18,119
Are we rolling? Everybody?
15
00:01:18,203 --> 00:01:19,162
Great.
16
00:01:21,206 --> 00:01:22,624
I worked at Twitter.
17
00:01:23,041 --> 00:01:23,917
My last job there
18
00:01:24,000 --> 00:01:26,169
was the senior vice president
of engineering.
19
00:01:27,337 --> 00:01:29,255
I was the president of Pinterest.
20
00:01:29,339 --> 00:01:32,717
Before that, um,
I was the... the director of monetization
21
00:01:32,801 --> 00:01:34,260
at Facebook for five years.
22
00:01:34,344 --> 00:01:37,972
While at Twitter, I spent a number
of years running their developer platform,
23
00:01:38,056 --> 00:01:40,225
and then I became
head of consumer product.
24
00:01:40,308 --> 00:01:44,270
I was the coinventor of Google Drive,
Gmail Chat,
25
00:01:44,354 --> 00:01:46,689
Facebook Pages,
and the Facebook like button.
26
00:01:47,440 --> 00:01:50,777
Yeah. This is... This is why I spent,
like, eight months
27
00:01:50,860 --> 00:01:52,779
talking back and forth with lawyers.
28
00:01:54,072 --> 00:01:55,406
This freaks me out.
29
00:01:58,409 --> 00:01:59,702
When I was there,
30
00:01:59,786 --> 00:02:02,914
I always felt like,
fundamentally, it was a force for good.
31
00:02:03,414 --> 00:02:05,375
I don't know if I feel that way anymore.
32
00:02:05,458 --> 00:02:10,588
I left Google in June 2017, uh,
due to ethical concerns.
33
00:02:10,672 --> 00:02:14,134
And... And not just at Google
but within the industry at large.
34
00:02:14,217 --> 00:02:15,385
I'm very concerned.
35
00:02:16,636 --> 00:02:17,679
I'm very concerned.
36
00:02:19,097 --> 00:02:21,808
It's easy today to lose sight of the fact
37
00:02:21,891 --> 00:02:27,814
that these tools actually have created
some wonderful things in the world.
38
00:02:27,897 --> 00:02:31,943
They've reunited lost family members.
They've found organ donors.
39
00:02:32,026 --> 00:02:36,573
I mean, there were meaningful,
systemic changes happening
40
00:02:36,656 --> 00:02:39,159
around the world
because of these platforms
41
00:02:39,242 --> 00:02:40,285
that were positive!
42
00:02:40,827 --> 00:02:44,539
I think we were naive
about the flip side of that coin.
43
00:02:45,540 --> 00:02:48,585
Yeah, these things, you release them,
and they take on a life of their own.
44
00:02:48,668 --> 00:02:52,005
And how they're used is pretty different
than how you expected.
45
00:02:52,088 --> 00:02:56,509
Nobody, I deeply believe,
ever intended any of these consequences.
46
00:02:56,593 --> 00:02:59,554
There's no one bad guy.
No. Absolutely not.
47
00:03:01,598 --> 00:03:03,975
So, then,
what's the... what's the problem?
48
00:03:09,147 --> 00:03:11,482
Is there a problem,
and what is the problem?
49
00:03:17,614 --> 00:03:19,991
Yeah, it is hard
to give a single, succinct...
50
00:03:20,074 --> 00:03:22,118
I'm trying to touch on
many different problems.
51
00:03:22,535 --> 00:03:23,953
What is the problem?
52
00:03:33,463 --> 00:03:35,340
Despite facing mounting criticism,
53
00:03:35,423 --> 00:03:37,675
the so-called Big Tech names are getting bigger.
54
00:03:37,759 --> 00:03:40,929
The entire tech industry is
under a new level of scrutiny.
55
00:03:41,012 --> 00:03:43,806
And a new study sheds light on the link
56
00:03:43,890 --> 00:03:46,142
between mental health
and social media use.
57
00:03:46,226 --> 00:03:48,686
Here to talk about the latest research...
58
00:03:48,770 --> 00:03:51,397
...is going on that gets no coverage at all.
59
00:03:51,481 --> 00:03:54,108
Tens of millions of Americans are hopelessly addicted
60
00:03:54,192 --> 00:03:56,319
to their electronic devices.
61
00:03:56,402 --> 00:03:57,987
It's exacerbated by the fact
62
00:03:58,071 --> 00:04:00,698
that you can literally isolate yourself now
63
00:04:00,782 --> 00:04:02,742
in a bubble, thanks to our technology.
64
00:04:02,825 --> 00:04:04,577
Fake news is becoming more advanced
65
00:04:04,661 --> 00:04:06,788
and threatening societies
around the world.
66
00:04:06,871 --> 00:04:10,250
We weren't expecting any of this
when we created Twitter over 12 years ago.
67
00:04:10,333 --> 00:04:12,502
White House officials say
they have no reason to believe
68
00:04:12,585 --> 00:04:14,754
the Russian cyberattacks will stop.
69
00:04:14,837 --> 00:04:18,132
YouTube is being forced
to concentrate on cleansing the site.
70
00:04:18,216 --> 00:04:21,552
TikTok, if you talk to any tween out there...
71
00:04:21,636 --> 00:04:24,013
...there's no chance they'll delete this thing...
72
00:04:24,097 --> 00:04:26,224
Hey, Isla,
can you get the table ready, please?
73
00:04:26,307 --> 00:04:28,601
There's a question about whether social media
74
00:04:28,685 --> 00:04:29,978
is making your child depressed.
75
00:04:30,061 --> 00:04:32,105
Isla,
can you set the table, please?
76
00:04:32,188 --> 00:04:35,316
These cosmetic procedures are becoming so popular with teens,
77
00:04:35,400 --> 00:04:37,902
plastic surgeons have coined
a new syndrome for it,
78
00:04:37,986 --> 00:04:40,822
"Snapchat dysmorphia,"
with young patients wanting surgery
79
00:04:40,905 --> 00:04:43,741
so they can look more like they do
in filtered selfies.
80
00:04:43,825 --> 00:04:45,910
Still don't see why you let her have
that thing.
81
00:04:45,994 --> 00:04:47,412
What was I supposed to do?
82
00:04:47,495 --> 00:04:49,580
I mean, every other kid
in her class had one.
83
00:04:50,164 --> 00:04:51,165
She's only 11.
84
00:04:51,249 --> 00:04:52,959
Cass, no one's forcing you to get one.
85
00:04:53,042 --> 00:04:55,086
You can stay disconnected
as long as you want.
86
00:04:55,169 --> 00:04:59,340
Hey, I'm connected without a cell phone,
okay? I'm on the Internet right now.
87
00:04:59,424 --> 00:05:03,094
Also, that isn't even actual connection.
It's just a load of sh--
88
00:05:03,177 --> 00:05:05,013
Surveillance capitalism has come to shape
89
00:05:05,096 --> 00:05:07,765
our politics and culture
in ways many people don't perceive.
90
00:05:07,849 --> 00:05:10,101
ISIS inspired followers online,
91
00:05:10,184 --> 00:05:12,812
and now white supremacists are doing the same.
92
00:05:12,895 --> 00:05:14,147
Recently in India,
93
00:05:14,230 --> 00:05:17,442
Internet lynch mobs have killed
a dozen people, including these five...
94
00:05:17,525 --> 00:05:20,361
It's not just fake news; it's fake news with consequences.
95
00:05:20,445 --> 00:05:24,073
How do you handle an epidemic in the age of fake news?
96
00:05:24,157 --> 00:05:26,993
Can you get the coronavirus
by eating Chinese food?
97
00:05:27,535 --> 00:05:32,540
We have gone from the information age
into the disinformation age.
98
00:05:32,623 --> 00:05:34,667
Our democracy is under assault.
99
00:05:34,751 --> 00:05:36,919
What I said was,
"I think the tools
100
00:05:37,003 --> 00:05:39,005
that have been created today are starting
101
00:05:39,088 --> 00:05:41,799
to erode the social fabric
of how society works."
102
00:05:58,566 --> 00:05:59,442
Fine.
103
00:06:00,151 --> 00:06:03,446
Aza does
welcoming remarks. We play the video.
104
00:06:04,197 --> 00:06:07,325
And then, "Ladies and gentlemen,
Tristan Harris."
105
00:06:07,408 --> 00:06:08,868
- Right.
- Great.
106
00:06:08,951 --> 00:06:12,038
So, I come up, and...
107
00:06:13,831 --> 00:06:17,126
basically say, "Thank you all for coming."
Um...
108
00:06:17,919 --> 00:06:22,048
So, today, I wanna talk about a new agenda
for technology.
109
00:06:22,131 --> 00:06:25,468
And why we wanna do that
is because if you ask people,
110
00:06:25,551 --> 00:06:27,804
"What's wrong in the tech industry
right now?"
111
00:06:28,262 --> 00:06:31,641
There's a cacophony of grievances
and scandals,
112
00:06:31,724 --> 00:06:33,893
and "They stole our data."
And there's tech addiction.
113
00:06:33,976 --> 00:06:35,978
And there's fake news.
And there's polarization
114
00:06:36,062 --> 00:06:37,855
and some elections
that are getting hacked.
115
00:06:38,189 --> 00:06:41,609
But is there something
that is beneath all these problems
116
00:06:41,692 --> 00:06:44,612
that's causing all these things
to happen at once?
117
00:06:46,447 --> 00:06:48,408
- Does this feel good?
- Very good. Yeah.
118
00:06:49,033 --> 00:06:49,992
Um...
119
00:06:50,743 --> 00:06:52,954
I'm just trying to...
Like, I want people to see...
120
00:06:53,037 --> 00:06:55,123
Like, there's a problem happening
in the tech industry,
121
00:06:55,206 --> 00:06:56,707
and it doesn't have a name,
122
00:06:56,791 --> 00:07:00,211
and it has to do with one source,
like, one...
123
00:07:05,091 --> 00:07:09,387
When you look around you,
it feels like the world is going crazy.
124
00:07:12,765 --> 00:07:15,309
You have to ask yourself, like,
"Is this normal?
125
00:07:16,102 --> 00:07:18,771
Or have we all fallen under some kind
of spell?"
126
00:07:27,989 --> 00:07:30,491
I wish more people could understand
how this works
127
00:07:30,575 --> 00:07:34,036
because it shouldn't be something
that only the tech industry knows.
128
00:07:34,120 --> 00:07:36,247
It should be something
that everybody knows.
129
00:07:41,419 --> 00:07:42,378
Bye.
130
00:07:43,629 --> 00:07:44,881
Here you go, sir.
131
00:07:47,383 --> 00:07:48,676
- Hello!
- Hi.
132
00:07:48,759 --> 00:07:50,678
- Tristan. Nice to meet you.
- It's Tris-tan, right?
133
00:07:50,761 --> 00:07:51,721
- Yes.
- Awesome. Cool.
134
00:07:53,181 --> 00:07:55,933
Tristan Harris
is a former design ethicist for Google
135
00:07:56,017 --> 00:07:59,395
and has been called the closest thing
Silicon Valley has to a conscience.
136
00:07:59,479 --> 00:08:00,730
He's asking tech
137
00:08:00,813 --> 00:08:04,192
to bring what he calls "ethical design"
to its products.
138
00:08:04,275 --> 00:08:06,903
It's rare for a tech insider to be so blunt,
139
00:08:06,986 --> 00:08:10,114
but Tristan Harris believes someone needs to be.
140
00:08:11,324 --> 00:08:12,700
When I was at Google,
141
00:08:12,783 --> 00:08:16,037
I was on the Gmail team,
and I just started getting burnt out
142
00:08:16,120 --> 00:08:18,372
'cause we'd had
so many conversations about...
143
00:08:19,457 --> 00:08:23,169
you know, what the inbox should look like
and what color it should be, and...
144
00:08:23,252 --> 00:08:25,880
And I, you know, felt personally addicted
to e-mail,
145
00:08:26,297 --> 00:08:27,632
and I found it fascinating
146
00:08:27,715 --> 00:08:31,511
there was no one at Gmail working
on making it less addictive.
147
00:08:31,969 --> 00:08:34,514
And I was like,
"Is anybody else thinking about this?
148
00:08:34,597 --> 00:08:36,390
I haven't heard anybody talk about this."
149
00:08:36,849 --> 00:08:39,684
And I was feeling this frustration...
150
00:08:39,769 --> 00:08:41,229
...with the tech industry, overall,
151
00:08:41,312 --> 00:08:43,147
that we'd kind of, like, lost our way.
152
00:08:46,817 --> 00:08:49,820
You know, I really struggled
to try and figure out
153
00:08:49,904 --> 00:08:52,573
how, from the inside, we could change it.
154
00:08:55,201 --> 00:08:58,120
And that was when I decided
to make a presentation,
155
00:08:58,204 --> 00:08:59,497
kind of a call to arms.
156
00:09:00,998 --> 00:09:04,961
Every day, I went home and I worked on it
for a couple hours every single night.
157
00:09:06,170 --> 00:09:08,548
It basically just said,
you know,
158
00:09:08,631 --> 00:09:11,884
never before
in history have 50 designers--
159
00:09:12,426 --> 00:09:15,263
20- to 35-year-old white guys
in California--
160
00:09:15,888 --> 00:09:19,725
made decisions that would have an impact
on two billion people.
161
00:09:21,018 --> 00:09:24,438
Two billion people will have thoughts
that they didn't intend to have
162
00:09:24,522 --> 00:09:28,401
because a designer at Google said,
"This is how notifications work
163
00:09:28,484 --> 00:09:30,778
on that screen that you wake up to
in the morning."
164
00:09:31,195 --> 00:09:35,283
And we have a moral responsibility,
as Google, for solving this problem.
165
00:09:36,075 --> 00:09:37,743
And I sent this presentation
166
00:09:37,827 --> 00:09:41,789
to about 15, 20 of my closest colleagues
at Google,
167
00:09:41,872 --> 00:09:44,959
and I was very nervous about it.
I wasn't sure how it was gonna land.
168
00:09:46,460 --> 00:09:48,045
When I went to work the next day,
169
00:09:48,129 --> 00:09:50,464
most of the laptops
had the presentation open.
170
00:09:52,133 --> 00:09:54,552
Later that day, there was, like,
400 simultaneous viewers,
171
00:09:54,635 --> 00:09:56,053
so it just kept growing and growing.
172
00:09:56,137 --> 00:10:00,266
I got e-mails from all around the company.
I mean, people in every department saying,
173
00:10:00,349 --> 00:10:02,852
"I totally agree."
"I see this affecting my kids."
174
00:10:02,935 --> 00:10:04,979
"I see this affecting
the people around me."
175
00:10:05,062 --> 00:10:06,939
"We have to do something about this."
176
00:10:07,481 --> 00:10:10,818
It felt like I was sort of launching
a revolution or something like that.
177
00:10:11,861 --> 00:10:15,197
Later, I found out Larry Page
had been notified about this presentation
178
00:10:15,281 --> 00:10:17,908
in three separate meetings that day.
179
00:10:17,992 --> 00:10:20,286
And so, it created
this kind of cultural moment
180
00:10:20,870 --> 00:10:24,415
that Google needed to take seriously.
181
00:10:26,000 --> 00:10:28,878
And then... nothing.
182
00:10:34,300 --> 00:10:36,135
Everyone in 2006...
183
00:10:37,219 --> 00:10:39,221
including all of us at Facebook,
184
00:10:39,305 --> 00:10:43,392
just had total admiration for Google
and what Google had built,
185
00:10:43,476 --> 00:10:47,396
which was this incredibly useful service
186
00:10:47,480 --> 00:10:51,442
that did, far as we could tell,
lots of goodness for the world,
187
00:10:51,525 --> 00:10:54,695
and they built
this parallel money machine.
188
00:10:55,404 --> 00:11:00,034
We had such envy for that,
and it seemed so elegant to us...
189
00:11:00,826 --> 00:11:02,161
and so perfect.
190
00:11:02,953 --> 00:11:05,289
Facebook had been around
for about two years,
191
00:11:05,373 --> 00:11:08,376
um, and I was hired to come in
and figure out
192
00:11:08,459 --> 00:11:10,586
what the business model was gonna be
for the company.
193
00:11:10,670 --> 00:11:13,422
I was the director of monetization.
The point was, like,
194
00:11:13,506 --> 00:11:17,051
"You're the person who's gonna figure out
how this thing monetizes."
195
00:11:17,134 --> 00:11:19,804
And there were a lot of people
who did a lot of the work,
196
00:11:19,887 --> 00:11:25,476
but I was clearly one of the people
who was pointing towards...
197
00:11:26,769 --> 00:11:28,562
"Well, we have to make money, A...
198
00:11:29,313 --> 00:11:33,651
and I think this advertising model
is probably the most elegant way."
199
00:11:42,243 --> 00:11:44,370
Uh-oh. What's this video Mom just sent us?
200
00:11:44,453 --> 00:11:46,747
Oh, that's from a talk show,
but that's pretty good.
201
00:11:46,831 --> 00:11:47,873
Guy's kind of a genius.
202
00:11:47,957 --> 00:11:50,584
He's talking all about deleting
social media, which you gotta do.
203
00:11:50,668 --> 00:11:52,878
I might have to start blocking
her e-mails.
204
00:11:52,962 --> 00:11:54,880
I don't even know
what she's talking about, man.
205
00:11:54,964 --> 00:11:56,090
She's worse than I am.
206
00:11:56,173 --> 00:11:58,509
- No, she only uses it for recipes.
- Right, and work.
207
00:11:58,592 --> 00:12:00,553
- And workout videos.
- And to check up on us.
208
00:12:00,636 --> 00:12:03,055
And everyone else she's ever met
in her entire life.
209
00:12:04,932 --> 00:12:07,893
If you are scrolling through your social media feed
210
00:12:07,977 --> 00:12:11,731
while you're watchin' us, you need to put the damn phone down and listen up
211
00:12:11,814 --> 00:12:14,817
'cause our next guest has written
an incredible book
212
00:12:14,900 --> 00:12:18,112
about how much it's wrecking our lives.
213
00:12:18,195 --> 00:12:19,447
Please welcome author
214
00:12:19,530 --> 00:12:23,951
of Ten Arguments for Deleting Your Social Media Accounts Right Now...
215
00:12:24,034 --> 00:12:26,287
- Uh-huh.
- ...Jaron Lanier.
216
00:12:27,997 --> 00:12:31,834
Companies like Google and Facebook
are some of the wealthiest
217
00:12:31,917 --> 00:12:33,544
and most successful of all time.
218
00:12:33,711 --> 00:12:36,839
Uh, they have relatively few employees.
219
00:12:36,922 --> 00:12:41,427
They just have this giant computer
that rakes in money, right? Uh...
220
00:12:41,510 --> 00:12:42,970
Now, what are they being paid for?
221
00:12:43,053 --> 00:12:45,222
That's a really important question.
222
00:12:47,308 --> 00:12:50,311
So, I've been an investor
in technology for 35 years.
223
00:12:51,020 --> 00:12:54,356
The first 50 years of Silicon Valley,
the industry made products--
224
00:12:54,440 --> 00:12:55,566
hardware, software--
225
00:12:55,649 --> 00:12:58,402
sold 'em to customers.
Nice, simple business.
226
00:12:58,486 --> 00:13:01,447
For the last ten years,
the biggest companies in Silicon Valley
227
00:13:01,530 --> 00:13:03,866
have been in the business
of selling their users.
228
00:13:03,949 --> 00:13:05,910
It's a little even trite to say now,
229
00:13:05,993 --> 00:13:09,205
but... because we don't pay
for the products that we use,
230
00:13:09,288 --> 00:13:12,166
advertisers pay
for the products that we use.
231
00:13:12,249 --> 00:13:14,210
Advertisers are the customers.
232
00:13:14,710 --> 00:13:16,086
We're the thing being sold.
233
00:13:16,170 --> 00:13:17,630
The classic saying is:
234
00:13:17,713 --> 00:13:21,592
"If you're not paying for the product,
then you are the product."
235
00:13:23,385 --> 00:13:27,223
A lot of people think, you know,
"Oh, well, Google's just a search box,
236
00:13:27,306 --> 00:13:29,850
and Facebook's just a place to see
what my friends are doing
237
00:13:29,934 --> 00:13:31,101
and see their photos."
238
00:13:31,185 --> 00:13:35,481
But what they don't realize
is they're competing for your attention.
239
00:13:36,524 --> 00:13:41,111
So, you know, Facebook, Snapchat,
Twitter, Instagram, YouTube,
240
00:13:41,195 --> 00:13:45,699
companies like this, their business model
is to keep people engaged on the screen.
241
00:13:46,283 --> 00:13:49,578
Let's figure out how to get
as much of this person's attention
242
00:13:49,662 --> 00:13:50,955
as we possibly can.
243
00:13:51,455 --> 00:13:53,374
How much time can we get you to spend?
244
00:13:53,874 --> 00:13:56,669
How much of your life can we get you
to give to us?
245
00:13:58,629 --> 00:14:01,090
When you think about
how some of these companies work,
246
00:14:01,173 --> 00:14:02,424
it starts to make sense.
247
00:14:03,050 --> 00:14:06,095
There are all these services
on the Internet that we think of as free,
248
00:14:06,178 --> 00:14:09,473
but they're not free.
They're paid for by advertisers.
249
00:14:09,557 --> 00:14:11,559
Why do advertisers pay those companies?
250
00:14:11,642 --> 00:14:14,687
They pay in exchange for showing their ads
to us.
251
00:14:14,770 --> 00:14:18,357
We're the product. Our attention
is the product being sold to advertisers.
252
00:14:18,816 --> 00:14:20,442
That's a little too simplistic.
253
00:14:20,860 --> 00:14:23,654
It's the gradual, slight,
imperceptible change
254
00:14:23,737 --> 00:14:26,574
in your own behavior and perception
that is the product.
255
00:14:27,658 --> 00:14:30,244
And that is the product.
It's the only possible product.
256
00:14:30,327 --> 00:14:34,081
There's nothing else on the table
that could possibly be called the product.
257
00:14:34,164 --> 00:14:37,001
That's the only thing there is
for them to make money from.
258
00:14:37,668 --> 00:14:39,253
Changing what you do,
259
00:14:39,336 --> 00:14:41,714
how you think, who you are.
260
00:14:42,631 --> 00:14:45,301
It's a gradual change. It's slight.
261
00:14:45,384 --> 00:14:48,971
If you can go to somebody and you say,
"Give me $10 million,
262
00:14:49,054 --> 00:14:54,310
and I will change the world one percent
in the direction you want it to change..."
263
00:14:54,852 --> 00:14:58,188
It's the world! That can be incredible,
and that's worth a lot of money.
264
00:14:59,315 --> 00:15:00,149
Okay.
265
00:15:00,691 --> 00:15:04,570
This is what every business
has always dreamt of:
266
00:15:04,653 --> 00:15:10,910
to have a guarantee that if it places
an ad, it will be successful.
267
00:15:11,327 --> 00:15:12,786
That's their business.
268
00:15:12,870 --> 00:15:14,413
They sell certainty.
269
00:15:14,997 --> 00:15:17,625
In order to be successful
in that business,
270
00:15:17,708 --> 00:15:19,793
you have to have great predictions.
271
00:15:20,085 --> 00:15:24,173
Great predictions begin
with one imperative:
272
00:15:25,215 --> 00:15:26,926
you need a lot of data.
273
00:15:29,136 --> 00:15:31,305
Many people call this
surveillance capitalism,
274
00:15:31,639 --> 00:15:34,350
capitalism profiting
off of the infinite tracking
275
00:15:34,433 --> 00:15:38,062
of everywhere everyone goes
by large technology companies
276
00:15:38,145 --> 00:15:40,356
whose business model is to make sure
277
00:15:40,439 --> 00:15:42,858
that advertisers are as successful
as possible.
278
00:15:42,942 --> 00:15:45,569
This is a new kind of marketplace now.
279
00:15:45,653 --> 00:15:48,072
It's a marketplace
that never existed before.
280
00:15:48,822 --> 00:15:55,371
And it's a marketplace
that trades exclusively in human futures.
281
00:15:56,080 --> 00:16:01,585
Just like there are markets that trade
in pork belly futures or oil futures.
282
00:16:02,127 --> 00:16:07,591
We now have markets
that trade in human futures at scale,
283
00:16:08,175 --> 00:16:13,472
and those markets have produced
the trillions of dollars
284
00:16:14,014 --> 00:16:19,269
that have made the Internet companies
the richest companies
285
00:16:19,353 --> 00:16:22,356
in the history of humanity.
286
00:16:27,361 --> 00:16:30,990
What I want people to know
is that everything they're doing online
287
00:16:31,073 --> 00:16:34,326
is being watched, is being tracked,
is being measured.
288
00:16:35,035 --> 00:16:39,623
Every single action you take
is carefully monitored and recorded.
289
00:16:39,707 --> 00:16:43,836
Exactly what image you stop and look at,
for how long you look at it.
290
00:16:43,919 --> 00:16:45,796
Oh, yeah, seriously,
for how long you look at it.
291
00:16:50,509 --> 00:16:52,219
They know when people are lonely.
292
00:16:52,302 --> 00:16:53,804
They know when people are depressed.
293
00:16:53,887 --> 00:16:57,099
They know when people are looking at photos of your ex-romantic partners.
294
00:16:57,182 --> 00:17:00,853
They know what you're doing late at night. They know the entire thing.
295
00:17:01,270 --> 00:17:03,230
Whether you're an introvert or an extrovert,
296
00:17:03,313 --> 00:17:06,817
or what kind of neuroses you have, what your personality type is like.
297
00:17:08,193 --> 00:17:11,613
They have more information
about us
298
00:17:11,696 --> 00:17:14,324
than has ever been imagined
in human history.
299
00:17:14,950 --> 00:17:16,367
It is unprecedented.
300
00:17:18,579 --> 00:17:22,790
And so, all of this data that we're...
that we're just pouring out all the time
301
00:17:22,875 --> 00:17:26,753
is being fed into these systems
that have almost no human supervision
302
00:17:27,463 --> 00:17:30,883
and that are making better and better
and better and better predictions
303
00:17:30,966 --> 00:17:33,552
about what we're gonna do
and... and who we are.
304
00:17:36,305 --> 00:17:39,349
People have the misconception
it's our data being sold.
305
00:17:40,350 --> 00:17:43,187
It's not in Facebook's business interest
to give up the data.
306
00:17:45,522 --> 00:17:47,107
What do they do with that data?
307
00:17:51,070 --> 00:17:54,490
They build models
that predict our actions,
308
00:17:54,573 --> 00:17:57,618
and whoever has the best model wins.
309
00:18:02,706 --> 00:18:04,041
His scrolling speed is slowing.
310
00:18:04,124 --> 00:18:06,085
Nearing the end
of his average session length.
311
00:18:06,168 --> 00:18:07,002
Decreasing ad load.
312
00:18:07,086 --> 00:18:08,337
Pull back on friends and family.
313
00:18:09,588 --> 00:18:11,340
On the other side of the screen,
314
00:18:11,423 --> 00:18:15,469
it's almost as if they had
this avatar voodoo doll-like model of us.
315
00:18:16,845 --> 00:18:18,180
All of the things we've ever done,
316
00:18:18,263 --> 00:18:19,473
all the clicks we've ever made,
317
00:18:19,556 --> 00:18:21,642
all the videos we've watched,
all the likes,
318
00:18:21,725 --> 00:18:25,354
that all gets brought back into building
a more and more accurate model.
319
00:18:25,896 --> 00:18:27,481
The model, once you have it,
320
00:18:27,564 --> 00:18:29,858
you can predict the kinds of things
that person does.
321
00:18:29,942 --> 00:18:31,777
Right, let me just test.
322
00:18:32,569 --> 00:18:34,988
Where you'll go.
I can predict what kind of videos
323
00:18:35,072 --> 00:18:36,115
will keep you watching.
324
00:18:36,198 --> 00:18:39,159
I can predict what kinds of emotions tend
to trigger you.
325
00:18:39,243 --> 00:18:40,410
Yes, perfect.
326
00:18:41,578 --> 00:18:43,372
The most epic fails of the year.
327
00:18:48,627 --> 00:18:51,088
- Perfect. That worked.
- Following with another video.
328
00:18:51,171 --> 00:18:54,049
Beautiful. Let's squeeze in a sneaker ad
before it starts.
329
00:18:56,426 --> 00:18:58,178
At a lot
of technology companies,
330
00:18:58,262 --> 00:18:59,721
there's three main goals.
331
00:18:59,805 --> 00:19:01,348
There's the engagement goal:
332
00:19:01,431 --> 00:19:03,684
to drive up your usage,
to keep you scrolling.
333
00:19:04,601 --> 00:19:06,145
There's the growth goal:
334
00:19:06,228 --> 00:19:08,689
to keep you coming back
and inviting as many friends
335
00:19:08,772 --> 00:19:10,816
and getting them to invite more friends.
336
00:19:11,650 --> 00:19:13,152
And then there's the advertising goal:
337
00:19:13,235 --> 00:19:14,987
to make sure that,
as all that's happening,
338
00:19:15,070 --> 00:19:17,406
we're making as much money as possible
from advertising.
339
00:19:19,241 --> 00:19:21,994
Each of these goals is powered
by algorithms
340
00:19:22,077 --> 00:19:24,454
whose job is to figure out
what to show you
341
00:19:24,538 --> 00:19:26,165
to keep those numbers going up.
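[A rough sketch of those three goals as dials over a feed-ranking score. Every name and number here is invented for illustration, not taken from any company's actual system.]

    # Hypothetical sketch of the three goals as dials on a ranking score.
    from dataclasses import dataclass

    @dataclass
    class Dials:
        engagement: float = 1.0  # weight on predicted time-on-screen
        growth: float = 1.0      # weight on invite/nudge potential
        ad_load: float = 1.0     # weight on expected ad revenue

    def rank_feed(items, dials):
        # Rank candidate items by a dial-weighted blend of the three goals.
        def score(item):
            return (dials.engagement * item["p_keep_scrolling"]
                    + dials.growth * item["p_invites_friend"]
                    + dials.ad_load * item["expected_ad_revenue"])
        return sorted(items, key=score, reverse=True)

    # "Dial up monetization, just slightly."
    dials = Dials(ad_load=1.1)
    items = [
        {"p_keep_scrolling": 0.8, "p_invites_friend": 0.1, "expected_ad_revenue": 0.002},
        {"p_keep_scrolling": 0.5, "p_invites_friend": 0.4, "expected_ad_revenue": 0.010},
    ]
    print(rank_feed(items, dials)[0])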
342
00:19:26,623 --> 00:19:29,918
We often talked about, at Facebook,
this idea
343
00:19:30,002 --> 00:19:34,006
of being able to just dial that as needed.
344
00:19:34,673 --> 00:19:38,594
And, you know, we talked
about having Mark have those dials.
345
00:19:41,305 --> 00:19:44,474
"Hey, I want more users in Korea today."
346
00:19:45,684 --> 00:19:46,602
"Turn the dial."
347
00:19:47,436 --> 00:19:49,188
"Let's dial up the ads a little bit."
348
00:19:49,980 --> 00:19:51,899
"Dial up monetization, just slightly."
349
00:19:52,858 --> 00:19:55,444
And so, that happ--
350
00:19:55,527 --> 00:19:59,239
I mean, at all of these companies,
there is that level of precision.
351
00:19:59,990 --> 00:20:02,409
- Dude, how--
- I don't know how I didn't get carded.
352
00:20:02,492 --> 00:20:05,704
- That ref just, like, sucked or something.
- You got literally all the way...
353
00:20:05,787 --> 00:20:07,956
- That's Rebecca. Go talk to her.
- I know who it is.
354
00:20:08,040 --> 00:20:10,834
- Dude, yo, go talk to her.
- I'm workin' on it.
355
00:20:10,918 --> 00:20:14,171
His calendar says he's on a break
right now. We should be live.
356
00:20:14,755 --> 00:20:16,465
Want me to nudge him?
357
00:20:17,132 --> 00:20:18,050
Yeah, nudge away.
358
00:20:21,637 --> 00:20:24,181
"Your friend Tyler just joined.
Say hi with a wave."
359
00:20:26,016 --> 00:20:27,184
Come on, Ben.
360
00:20:27,267 --> 00:20:29,311
Send a wave.
361
00:20:29,394 --> 00:20:32,606
You're not... Go talk to her, dude.
362
00:20:38,070 --> 00:20:40,447
New link! All right, we're on.
363
00:20:40,948 --> 00:20:46,078
Follow that up with a post
from User 079044238820, Rebecca.
364
00:20:46,161 --> 00:20:49,790
Good idea. GPS coordinates indicate
that they're in close proximity.
365
00:20:55,921 --> 00:20:57,172
He's primed for an ad.
366
00:20:57,631 --> 00:20:58,632
Auction time.
367
00:21:00,133 --> 00:21:02,803
Sold! To Deep Fade hair wax.
368
00:21:03,387 --> 00:21:07,933
We had 468 interested bidders. We sold Ben
at 3.262 cents for an impression.
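[The auction dramatized here resembles real-time bidding for a single ad impression; many exchanges have historically used a second-price rule, where the winner pays just above the runner-up's bid. A toy version with invented bids, 468 bidders as in the scene:]

    # Toy second-price auction for one ad impression. Bids are random
    # and purely illustrative.
    import random

    random.seed(7)
    bids = [(f"bidder_{i}", random.uniform(0.5, 3.5)) for i in range(468)]  # cents

    bids.sort(key=lambda b: b[1], reverse=True)
    winner, top_bid = bids[0]
    _, runner_up = bids[1]

    # Second-price rule: winner pays the runner-up's bid plus a tiny increment.
    clearing_price = runner_up + 0.001
    print(f"Sold to {winner} at {clearing_price:.3f} cents for an impression")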
369
00:21:17,109 --> 00:21:18,735
We've created a world
370
00:21:18,819 --> 00:21:21,530
in which online connection
has become primary,
371
00:21:22,072 --> 00:21:23,907
especially for younger generations.
372
00:21:23,991 --> 00:21:28,328
And yet, in that world,
any time two people connect,
373
00:21:29,162 --> 00:21:33,250
the only way it's financed
is through a sneaky third person
374
00:21:33,333 --> 00:21:35,627
who's paying to manipulate
those two people.
375
00:21:36,128 --> 00:21:39,381
So, we've created
an entire global generation of people
376
00:21:39,464 --> 00:21:44,011
who are raised within a context
where the very meaning of communication,
377
00:21:44,094 --> 00:21:47,431
the very meaning of culture,
is manipulation.
378
00:21:47,514 --> 00:21:49,641
We've put deceit and sneakiness
379
00:21:49,725 --> 00:21:52,311
at the absolute center
of everything we do.
380
00:22:05,615 --> 00:22:07,242
- Grab the...
- Okay.
381
00:22:07,326 --> 00:22:09,286
- Where's it help to hold it?
- Great.
382
00:22:09,369 --> 00:22:10,787
- Here?
- Yeah.
383
00:22:10,871 --> 00:22:13,832
How does this come across on camera
if I were to do, like, this move--
384
00:22:13,915 --> 00:22:15,542
- We can--
- Like that?
385
00:22:15,625 --> 00:22:16,918
- What?
- Yeah.
386
00:22:17,002 --> 00:22:19,004
- Do that again.
- Exactly. Yeah.
387
00:22:19,087 --> 00:22:20,589
Yeah. No, it's probably not...
388
00:22:20,672 --> 00:22:21,965
Like... yeah.
389
00:22:22,466 --> 00:22:23,884
I mean, this one is less...
390
00:22:29,681 --> 00:22:33,268
Larissa's, like,
actually freaking out over here.
391
00:22:34,728 --> 00:22:35,562
Is that good?
392
00:22:37,856 --> 00:22:41,068
I was, like, five years old
when I learned how to do magic.
393
00:22:41,151 --> 00:22:45,781
And I could fool adults,
fully grown adults with, like, PhDs.
394
00:22:55,040 --> 00:22:57,709
Magicians were almost like
the first neuroscientists
395
00:22:57,793 --> 00:22:58,960
and psychologists.
396
00:22:59,044 --> 00:23:02,005
Like, they were the ones
who first understood
397
00:23:02,089 --> 00:23:03,382
how people's minds work.
398
00:23:04,216 --> 00:23:07,677
They just, in real time, are testing
lots and lots of stuff on people.
399
00:23:09,137 --> 00:23:11,139
A magician understands something,
400
00:23:11,223 --> 00:23:14,017
some part of your mind
that we're not aware of.
401
00:23:14,101 --> 00:23:15,936
That's what makes the illusion work.
402
00:23:16,019 --> 00:23:20,607
Doctors, lawyers, people who know
how to build 747s or nuclear missiles,
403
00:23:20,690 --> 00:23:24,361
they don't know more about
how their own mind is vulnerable.
404
00:23:24,444 --> 00:23:26,113
That's a separate discipline.
405
00:23:26,571 --> 00:23:28,990
And it's a discipline
that applies to all human beings.
406
00:23:30,909 --> 00:23:34,079
From that perspective, you can have
a very different understanding
407
00:23:34,162 --> 00:23:35,580
of what technology is doing.
408
00:23:36,873 --> 00:23:39,584
When I was
at the Stanford Persuasive Technology Lab,
409
00:23:39,668 --> 00:23:41,044
this is what we learned.
410
00:23:41,628 --> 00:23:43,463
How could you use everything we know
411
00:23:43,547 --> 00:23:45,882
about the psychology
of what persuades people
412
00:23:45,966 --> 00:23:48,385
and build that into technology?
413
00:23:48,468 --> 00:23:50,887
Now, many of you in the audience
are geniuses already.
414
00:23:50,971 --> 00:23:55,851
I think that's true, but my goal is
to turn you into a behavior-change genius.
415
00:23:56,852 --> 00:24:01,148
There are many prominent Silicon Valley
figures who went through that class--
416
00:24:01,231 --> 00:24:05,485
key growth figures at Facebook and Uber
and... and other companies--
417
00:24:05,569 --> 00:24:09,197
and learned how to make technology
more persuasive,
418
00:24:09,614 --> 00:24:10,782
Tristan being one.
419
00:24:12,284 --> 00:24:14,619
Persuasive technology
is just sort of design
420
00:24:14,703 --> 00:24:16,580
intentionally applied to the extreme,
421
00:24:16,663 --> 00:24:18,874
where we really want to modify
someone's behavior.
422
00:24:18,957 --> 00:24:20,542
We want them to take this action.
423
00:24:20,625 --> 00:24:23,336
We want them to keep doing this
with their finger.
424
00:24:23,420 --> 00:24:26,256
You pull down and you refresh,
it's gonna be a new thing at the top.
425
00:24:26,339 --> 00:24:28,508
Pull down and refresh again, it's new.
Every single time.
426
00:24:28,592 --> 00:24:33,722
Which, in psychology, we call
a positive intermittent reinforcement.
427
00:24:33,805 --> 00:24:37,142
You don't know when you're gonna get it
or if you're gonna get something,
428
00:24:37,225 --> 00:24:40,061
which operates just like the slot machines
in Vegas.
429
00:24:40,145 --> 00:24:42,230
It's not enough
that you use the product consciously,
430
00:24:42,314 --> 00:24:44,024
I wanna dig down deeper
into the brain stem
431
00:24:44,107 --> 00:24:45,817
and implant, inside of you,
432
00:24:45,901 --> 00:24:47,652
an unconscious habit
433
00:24:47,736 --> 00:24:50,864
so that you are being programmed
at a deeper level.
434
00:24:50,947 --> 00:24:52,115
You don't even realize it.
435
00:24:52,532 --> 00:24:54,034
A man, James Marshall...
436
00:24:54,117 --> 00:24:56,286
Every time you see it there
on the counter,
437
00:24:56,369 --> 00:24:59,789
and you just look at it,
and you know if you reach over,
438
00:24:59,873 --> 00:25:01,333
it just might have something for you,
439
00:25:01,416 --> 00:25:03,877
so you play that slot machine
to see what you got, right?
440
00:25:03,960 --> 00:25:06,046
That's not by accident.
That's a design technique.
441
00:25:06,129 --> 00:25:08,632
He brings a golden nugget
to an officer
442
00:25:09,841 --> 00:25:11,301
in the army in San Francisco.
443
00:25:12,219 --> 00:25:15,388
Mind you, the... the population
of San Francisco was only...
444
00:25:15,472 --> 00:25:17,432
Another example is photo tagging.
445
00:25:17,516 --> 00:25:19,643
The secret didn't last.
446
00:25:19,726 --> 00:25:21,186
So, if you get an e-mail
447
00:25:21,269 --> 00:25:24,064
that says your friend just tagged you
in a photo,
448
00:25:24,147 --> 00:25:28,568
of course you're going to click
on that e-mail and look at the photo.
449
00:25:29,152 --> 00:25:31,821
It's not something
you can just decide to ignore.
450
00:25:32,364 --> 00:25:34,157
This is deep-seated, like,
451
00:25:34,241 --> 00:25:36,326
human personality
that they're tapping into.
452
00:25:36,409 --> 00:25:38,078
What you should be asking yourself is:
453
00:25:38,161 --> 00:25:40,288
"Why doesn't that e-mail contain
the photo in it?
454
00:25:40,372 --> 00:25:42,457
It would be a lot easier
to see the photo."
455
00:25:42,541 --> 00:25:45,919
When Facebook found that feature,
they just dialed the hell out of that
456
00:25:46,002 --> 00:25:48,505
because they said, "This is gonna be
a great way to grow activity.
457
00:25:48,588 --> 00:25:51,091
Let's just get people tagging each other
in photos all day long."
458
00:25:59,349 --> 00:26:00,475
He commented.
459
00:26:00,559 --> 00:26:01,434
Nice.
460
00:26:01,935 --> 00:26:04,688
Okay, Rebecca received it,
and she is responding.
461
00:26:04,771 --> 00:26:07,566
All right, let Ben know that she's typing
so we don't lose him.
462
00:26:07,649 --> 00:26:08,733
Activating ellipsis.
463
00:26:19,953 --> 00:26:21,329
Great, she posted.
464
00:26:21,454 --> 00:26:24,249
He's commenting on her comment
about his comment on her post.
465
00:26:25,041 --> 00:26:26,418
Hold on, he stopped typing.
466
00:26:26,751 --> 00:26:27,752
Let's autofill.
467
00:26:28,420 --> 00:26:30,005
Emojis. He loves emojis.
468
00:26:33,842 --> 00:26:34,676
He went with fire.
469
00:26:34,759 --> 00:26:36,803
I was rootin' for eggplant.
470
00:26:38,597 --> 00:26:42,726
There's an entire discipline
and field called "growth hacking."
471
00:26:42,809 --> 00:26:47,147
Teams of engineers
whose job is to hack people's psychology
472
00:26:47,230 --> 00:26:48,565
so they can get more growth.
473
00:26:48,648 --> 00:26:50,984
They can get more user sign-ups,
more engagement.
474
00:26:51,067 --> 00:26:52,861
They can get you to invite more people.
475
00:26:52,944 --> 00:26:55,989
After all the testing, all the iterating,
all of this stuff,
476
00:26:56,072 --> 00:26:57,907
you know the single biggest thing
we realized?
477
00:26:57,991 --> 00:27:00,702
Get any individual to seven friends
in ten days.
478
00:27:01,953 --> 00:27:02,787
That was it.
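[The "seven friends in ten days" line describes an activation metric: the share of new users who pass a threshold within a window of signing up. A minimal sketch with invented data:]

    # Hypothetical activation metric: fraction of new users reaching
    # 7 friends within 10 days of signup. All data below is invented.
    from datetime import date

    signups = {
        "ana": (date(2024, 1, 1), [date(2024, 1, d) for d in (2, 2, 3, 4, 5, 6, 8)]),
        "ben": (date(2024, 1, 1), [date(2024, 1, d) for d in (3, 20)]),
    }

    def activated(signup_day, friend_days, threshold=7, window_days=10):
        in_window = [d for d in friend_days if (d - signup_day).days <= window_days]
        return len(in_window) >= threshold

    rate = sum(activated(s, f) for s, f in signups.values()) / len(signups)
    print(f"activation rate: {rate:.0%}")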
479
00:27:02,871 --> 00:27:05,498
Chamath was the head of growth at Facebook
early on,
480
00:27:05,582 --> 00:27:08,251
and he's very well known
in the tech industry
481
00:27:08,335 --> 00:27:11,004
for pioneering a lot of the growth tactics
482
00:27:11,087 --> 00:27:14,758
that were used to grow Facebook
at incredible speed.
483
00:27:14,841 --> 00:27:18,553
And those growth tactics have then become
the standard playbook for Silicon Valley.
484
00:27:18,637 --> 00:27:21,222
They were used at Uber
and at a bunch of other companies.
485
00:27:21,306 --> 00:27:27,062
One of the things that he pioneered
was the use of scientific A/B testing
486
00:27:27,145 --> 00:27:28,480
of small feature changes.
487
00:27:29,022 --> 00:27:30,940
Companies like Google and Facebook
488
00:27:31,024 --> 00:27:34,569
would roll out
lots of little, tiny experiments
489
00:27:34,653 --> 00:27:36,821
that they were constantly doing on users.
490
00:27:36,905 --> 00:27:39,866
And over time,
by running these constant experiments,
491
00:27:39,949 --> 00:27:43,036
you... you develop the most optimal way
492
00:27:43,119 --> 00:27:45,288
to get users to do
what you want them to do.
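[The constant experiments described are A/B tests: show one random group variant A and another variant B, then keep whichever moves the metric. A minimal readout using a standard two-proportion z-test; the counts are invented:]

    # Minimal A/B test readout with a two-proportion z-test. Counts invented.
    import math

    def z_test(clicks_a, n_a, clicks_b, n_b):
        p_a, p_b = clicks_a / n_a, clicks_b / n_b
        p = (clicks_a + clicks_b) / (n_a + n_b)            # pooled click rate
        se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))  # standard error
        return (p_b - p_a) / se

    # Variant B: the photo e-mail links straight to the photo page.
    z = z_test(clicks_a=480, n_a=10_000, clicks_b=560, n_b=10_000)
    print(f"z = {z:.2f}  ->  {'ship B' if z > 1.96 else 'keep testing'}")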
493
00:27:45,372 --> 00:27:46,790
It's... It's manipulation.
494
00:27:47,332 --> 00:27:49,459
Uh, you're making me feel like a lab rat.
495
00:27:49,834 --> 00:27:51,920
You are a lab rat. We're all lab rats.
496
00:27:52,545 --> 00:27:55,548
And it's not like we're lab rats
for developing a cure for cancer.
497
00:27:55,632 --> 00:27:58,134
It's not like they're trying
to benefit us.
498
00:27:58,218 --> 00:28:01,680
Right? We're just zombies,
and they want us to look at more ads
499
00:28:01,763 --> 00:28:03,181
so they can make more money.
500
00:28:03,556 --> 00:28:05,266
Facebook conducted
501
00:28:05,350 --> 00:28:08,228
what they called
"massive-scale contagion experiments."
502
00:28:08,311 --> 00:28:09,145
Okay.
503
00:28:09,229 --> 00:28:13,066
How do we use subliminal cues
on the Facebook pages
504
00:28:13,400 --> 00:28:17,654
to get more people to go vote
in the midterm elections?
505
00:28:17,987 --> 00:28:20,824
And they discovered
that they were able to do that.
506
00:28:20,907 --> 00:28:24,160
One thing they concluded
is that we now know
507
00:28:24,744 --> 00:28:28,915
we can affect real-world behavior
and emotions
508
00:28:28,998 --> 00:28:32,877
without ever triggering
the user's awareness.
509
00:28:33,378 --> 00:28:37,382
They are completely clueless.
510
00:28:38,049 --> 00:28:41,970
We're pointing these engines of AI
back at ourselves
511
00:28:42,053 --> 00:28:46,224
to reverse-engineer what elicits responses
from us.
512
00:28:47,100 --> 00:28:49,561
Almost like you're stimulating nerve cells
on a spider
513
00:28:49,644 --> 00:28:51,479
to see what causes its legs to respond.
514
00:28:51,938 --> 00:28:53,940
So, it really is
this kind of prison experiment
515
00:28:54,023 --> 00:28:56,735
where we're just, you know,
roping people into the matrix,
516
00:28:56,818 --> 00:29:00,572
and we're just harvesting all this money
and... and data from all their activity
517
00:29:00,655 --> 00:29:01,489
to profit from.
518
00:29:01,573 --> 00:29:03,450
And we're not even aware
that it's happening.
519
00:29:04,117 --> 00:29:07,912
So, we want to psychologically figure out
how to manipulate you as fast as possible
520
00:29:07,996 --> 00:29:10,081
and then give you back that dopamine hit.
521
00:29:10,165 --> 00:29:12,375
We did that brilliantly at Facebook.
522
00:29:12,625 --> 00:29:14,919
Instagram has done it.
WhatsApp has done it.
523
00:29:15,003 --> 00:29:17,380
You know, Snapchat has done it.
Twitter has done it.
524
00:29:17,464 --> 00:29:19,424
I mean, it's exactly the kind of thing
525
00:29:19,507 --> 00:29:22,427
that a... that a hacker like myself
would come up with
526
00:29:22,510 --> 00:29:27,015
because you're exploiting a vulnerability
in... in human psychology.
527
00:29:27,807 --> 00:29:29,726
And I just...
I think that we...
528
00:29:29,809 --> 00:29:33,438
you know, the inventors, creators...
529
00:29:33,980 --> 00:29:37,317
uh, you know, and it's me, it's Mark,
it's the...
530
00:29:37,400 --> 00:29:40,403
you know, Kevin Systrom at Instagram...
It's all of these people...
531
00:29:40,487 --> 00:29:46,451
um, understood this consciously,
and we did it anyway.
532
00:29:50,580 --> 00:29:53,750
No one got upset when bicycles showed up.
533
00:29:55,043 --> 00:29:58,004
Right? Like, if everyone's starting
to go around on bicycles,
534
00:29:58,087 --> 00:30:00,924
no one said,
"Oh, my God, we've just ruined society.
535
00:30:01,007 --> 00:30:03,051
Like, bicycles are affecting people.
536
00:30:03,134 --> 00:30:05,303
They're pulling people
away from their kids.
537
00:30:05,386 --> 00:30:08,723
They're ruining the fabric of democracy.
People can't tell what's true."
538
00:30:08,807 --> 00:30:11,476
Like, we never said any of that stuff
about a bicycle.
539
00:30:12,769 --> 00:30:16,147
If something is a tool,
it genuinely is just sitting there,
540
00:30:16,731 --> 00:30:18,733
waiting patiently.
541
00:30:19,317 --> 00:30:22,821
If something is not a tool,
it's demanding things from you.
542
00:30:22,904 --> 00:30:26,533
It's seducing you. It's manipulating you.
It wants things from you.
543
00:30:26,950 --> 00:30:30,495
And we've moved away from having
a tools-based technology environment
544
00:30:31,037 --> 00:30:34,499
to an addiction- and manipulation-based
technology environment.
545
00:30:34,582 --> 00:30:35,708
That's what's changed.
546
00:30:35,792 --> 00:30:39,420
Social media isn't a tool
that's just waiting to be used.
547
00:30:39,504 --> 00:30:43,466
It has its own goals,
and it has its own means of pursuing them
548
00:30:43,550 --> 00:30:45,677
by using your psychology against you.
549
00:30:57,564 --> 00:31:00,567
Rewind a few years ago,
I was the...
550
00:31:00,650 --> 00:31:02,318
I was the president of Pinterest.
551
00:31:03,152 --> 00:31:05,113
I was coming home,
552
00:31:05,196 --> 00:31:08,366
and I couldn't get off my phone
once I got home,
553
00:31:08,449 --> 00:31:12,161
despite having two young kids
who needed my love and attention.
554
00:31:12,245 --> 00:31:15,748
I was in the pantry, you know,
typing away on an e-mail
555
00:31:15,832 --> 00:31:17,542
or sometimes looking at Pinterest.
556
00:31:18,001 --> 00:31:19,627
I thought, "God, this is classic irony.
557
00:31:19,711 --> 00:31:22,046
I am going to work during the day
558
00:31:22,130 --> 00:31:26,426
and building something
that then I am falling prey to."
559
00:31:26,509 --> 00:31:30,096
And I couldn't... I mean, some
of those moments, I couldn't help myself.
560
00:31:32,307 --> 00:31:36,102
The one
that I'm... I'm most prone to is Twitter.
561
00:31:36,185 --> 00:31:38,021
Uh, used to be Reddit.
562
00:31:38,104 --> 00:31:42,859
I actually had to write myself software
to break my addiction to reading Reddit.
563
00:31:45,403 --> 00:31:47,780
I'm probably most addicted to my e-mail.
564
00:31:47,864 --> 00:31:49,866
I mean, really. I mean, I... I feel it.
565
00:31:52,577 --> 00:31:54,954
Well, I mean, it's sort-- it's interesting
566
00:31:55,038 --> 00:31:58,166
that knowing what was going on
behind the curtain,
567
00:31:58,249 --> 00:32:01,628
I still wasn't able to control my usage.
568
00:32:01,711 --> 00:32:03,046
So, that's a little scary.
569
00:32:03,630 --> 00:32:07,050
Even knowing how these tricks work,
I'm still susceptible to them.
570
00:32:07,133 --> 00:32:09,886
I'll still pick up the phone,
and 20 minutes will disappear.
571
00:32:12,805 --> 00:32:15,725
Do you check your smartphone
before you pee in the morning
572
00:32:15,808 --> 00:32:17,477
or while you're peeing in the morning?
573
00:32:17,560 --> 00:32:19,479
'Cause those are the only two choices.
574
00:32:19,562 --> 00:32:23,274
I tried through willpower,
just pure willpower...
575
00:32:23,358 --> 00:32:26,903
"I'll put down my phone, I'll leave
my phone in the car when I get home."
576
00:32:26,986 --> 00:32:30,573
I think I told myself a thousand times,
a thousand different days,
577
00:32:30,657 --> 00:32:32,617
"I am not gonna bring my phone
to the bedroom,"
578
00:32:32,700 --> 00:32:34,535
and then 9:00 p.m. rolls around.
579
00:32:34,619 --> 00:32:37,121
"Well, I wanna bring my phone
in the bedroom."
580
00:32:37,205 --> 00:32:39,290
And so, that was sort of...
581
00:32:39,374 --> 00:32:41,125
Willpower was kind of attempt one,
582
00:32:41,209 --> 00:32:44,295
and then attempt two was,
you know, brute force.
583
00:32:44,379 --> 00:32:48,091
Introducing the Kitchen Safe.
The Kitchen Safe is a revolutionary,
584
00:32:48,174 --> 00:32:51,678
new, time-locking container
that helps you fight temptation.
585
00:32:51,761 --> 00:32:56,724
All David has to do is place those temptations in the Kitchen Safe.
586
00:32:57,392 --> 00:33:00,395
Next, he rotates the dial to set the timer.
587
00:33:01,479 --> 00:33:04,232
And, finally, he presses the dial to activate the lock.
588
00:33:04,315 --> 00:33:05,525
The Kitchen Safe is great...
589
00:33:05,608 --> 00:33:06,776
We have that, don't we?
590
00:33:06,859 --> 00:33:08,653
...video games, credit cards, and cell phones.
591
00:33:08,736 --> 00:33:09,654
Yeah, we do.
592
00:33:09,737 --> 00:33:12,407
Once the Kitchen Safe is locked, it cannot be opened
593
00:33:12,490 --> 00:33:13,866
until the timer reaches zero.
594
00:33:13,950 --> 00:33:15,618
So, here's the thing.
595
00:33:15,702 --> 00:33:17,537
Social media is a drug.
596
00:33:17,620 --> 00:33:20,873
I mean,
we have a basic biological imperative
597
00:33:20,957 --> 00:33:23,084
to connect with other people.
598
00:33:23,167 --> 00:33:28,214
That directly affects the release
of dopamine in the reward pathway.
599
00:33:28,297 --> 00:33:32,552
Millions of years of evolution, um,
are behind that system
600
00:33:32,635 --> 00:33:35,596
to get us to come together
and live in communities,
601
00:33:35,680 --> 00:33:38,016
to find mates, to propagate our species.
602
00:33:38,099 --> 00:33:41,853
So, there's no doubt
that a vehicle like social media,
603
00:33:41,936 --> 00:33:45,690
which optimizes this connection
between people,
604
00:33:45,773 --> 00:33:48,568
is going to have the potential
for addiction.
605
00:33:52,071 --> 00:33:54,115
- Mmm!
- Dad, stop!
606
00:33:55,450 --> 00:33:58,453
I have, like, 1,000 more snips
to send before dinner.
607
00:33:58,536 --> 00:34:00,788
- Snips?
- I don't know what a snip is.
608
00:34:00,872 --> 00:34:03,207
- Mm, that smells good, baby.
- All right. Thank you.
609
00:34:03,291 --> 00:34:05,877
I was, um, thinking we could use
all five senses
610
00:34:05,960 --> 00:34:07,712
to enjoy our dinner tonight.
611
00:34:07,795 --> 00:34:11,382
So, I decided that we're not gonna have
any cell phones at the table tonight.
612
00:34:11,466 --> 00:34:13,301
So, turn 'em in.
613
00:34:13,801 --> 00:34:14,802
- Really?
- Yep.
614
00:34:15,928 --> 00:34:18,056
- All right.
- Thank you. Ben?
615
00:34:18,139 --> 00:34:20,433
- Okay.
- Mom, the phone pirate.
616
00:34:21,100 --> 00:34:21,934
- Got it.
- Mom!
617
00:34:22,518 --> 00:34:26,147
So, they will be safe in here
until after dinner...
618
00:34:27,273 --> 00:34:30,650
and everyone can just chill out.
619
00:34:30,735 --> 00:34:31,569
Okay?
620
00:34:47,418 --> 00:34:49,253
- Can I just see who it is?
- No.
621
00:34:54,759 --> 00:34:56,969
Just gonna go get another fork.
622
00:34:58,304 --> 00:34:59,263
Thank you.
623
00:35:04,727 --> 00:35:06,771
Honey, you can't open that.
624
00:35:06,854 --> 00:35:09,315
I locked it for an hour,
so just leave it alone.
625
00:35:11,192 --> 00:35:13,361
So, what should we talk about?
626
00:35:13,444 --> 00:35:14,695
Well, we could talk
627
00:35:14,779 --> 00:35:17,615
about the, uh, Extreme Center wackos
I drove by today.
628
00:35:17,698 --> 00:35:18,825
- Please, Frank.
- What?
629
00:35:18,908 --> 00:35:20,785
I don't wanna talk about politics.
630
00:35:20,868 --> 00:35:23,538
- What's wrong with the Extreme Center?
- See? He doesn't even get it.
631
00:35:23,621 --> 00:35:24,622
It depends on who you ask.
632
00:35:24,705 --> 00:35:26,624
It's like asking,
"What's wrong with propaganda?"
633
00:35:28,709 --> 00:35:29,710
Isla!
634
00:35:32,797 --> 00:35:33,756
Oh, my God.
635
00:35:36,425 --> 00:35:38,553
- Do you want me to...
- Yeah.
636
00:35:41,973 --> 00:35:43,933
I... I'm worried about my kids.
637
00:35:44,016 --> 00:35:46,686
And if you have kids,
I'm worried about your kids.
638
00:35:46,769 --> 00:35:50,189
Armed with all the knowledge that I have
and all of the experience,
639
00:35:50,273 --> 00:35:52,108
I am fighting my kids about the time
640
00:35:52,191 --> 00:35:54,443
that they spend on phones
and on the computer.
641
00:35:54,527 --> 00:35:58,197
I will say to my son, "How many hours do
you think you're spending on your phone?"
642
00:35:58,281 --> 00:36:01,075
He'll be like, "It's, like, half an hour.
It's half an hour, tops."
643
00:36:01,159 --> 00:36:04,829
I'd say upwards of an hour, hour and a half.
644
00:36:04,912 --> 00:36:06,789
I looked at his screen report
a couple weeks ago.
645
00:36:06,873 --> 00:36:08,708
- Three hours and 45 minutes.
- That...
646
00:36:11,377 --> 00:36:13,588
I don't think that's...
No. Per day, on average?
647
00:36:13,671 --> 00:36:15,506
- Yeah.
- Should I go get it right now?
648
00:36:15,590 --> 00:36:19,177
There's not a day that goes by
that I don't remind my kids
649
00:36:19,260 --> 00:36:21,762
about the pleasure-pain balance,
650
00:36:21,846 --> 00:36:24,390
about dopamine deficit states,
651
00:36:24,473 --> 00:36:26,267
about the risk of addiction.
652
00:36:26,350 --> 00:36:27,310
Moment of truth.
653
00:36:27,935 --> 00:36:29,687
Two hours, 50 minutes per day.
654
00:36:29,770 --> 00:36:31,772
- Let's see.
- Actually, I've been using a lot today.
655
00:36:31,856 --> 00:36:33,357
- Last seven days.
- That's probably why.
656
00:36:33,441 --> 00:36:37,361
Instagram, six hours, 13 minutes.
Okay, so my Instagram's worse.
657
00:36:39,572 --> 00:36:41,991
My screen's completely shattered.
658
00:36:42,200 --> 00:36:43,201
Thanks, Cass.
659
00:36:44,410 --> 00:36:45,995
What do you mean, "Thanks, Cass"?
660
00:36:46,078 --> 00:36:49,040
You keep freaking Mom out about our phones
when it's not really a problem.
661
00:36:49,373 --> 00:36:51,167
We don't need our phones to eat dinner!
662
00:36:51,250 --> 00:36:53,878
I get what you're saying.
It's just not that big a deal. It's not.
663
00:36:56,047 --> 00:36:58,382
If it's not that big a deal,
don't use it for a week.
664
00:37:01,135 --> 00:37:06,349
Yeah. Yeah, actually, if you can put
that thing away for, like, a whole week...
665
00:37:07,725 --> 00:37:09,518
I will buy you a new screen.
666
00:37:10,978 --> 00:37:12,897
- Like, starting now?
- Starting now.
667
00:37:15,149 --> 00:37:16,859
- Okay. You got a deal.
- Okay.
668
00:37:16,943 --> 00:37:19,111
Okay, you gotta leave it here, though,
buddy.
669
00:37:19,862 --> 00:37:21,364
All right, I'm plugging it in.
670
00:37:22,531 --> 00:37:25,076
Let the record show... I'm backing away.
671
00:37:25,159 --> 00:37:25,993
Okay.
672
00:37:27,787 --> 00:37:29,413
- You're on the clock.
- One week.
673
00:37:29,497 --> 00:37:30,331
Oh, my...
674
00:37:31,457 --> 00:37:32,416
Think he can do it?
675
00:37:33,000 --> 00:37:34,252
I don't know. We'll see.
676
00:37:35,002 --> 00:37:36,128
Just eat, okay?
677
00:37:44,220 --> 00:37:45,263
Good family dinner!
678
00:37:47,682 --> 00:37:49,809
These technology products
were not designed
679
00:37:49,892 --> 00:37:53,896
by child psychologists who are trying
to protect and nurture children.
680
00:37:53,980 --> 00:37:56,148
They were just designing
to make these algorithms
681
00:37:56,232 --> 00:37:58,734
that were really good at recommending
the next video to you
682
00:37:58,818 --> 00:38:02,321
or really good at getting you
to take a photo with a filter on it.
683
00:38:16,752 --> 00:38:18,879
It's not just
that it's controlling
684
00:38:18,963 --> 00:38:20,548
where they spend their attention.
685
00:38:21,173 --> 00:38:26,304
Especially social media starts to dig
deeper and deeper down into the brain stem
686
00:38:26,387 --> 00:38:29,765
and take over kids' sense of self-worth
and identity.
687
00:38:52,371 --> 00:38:56,208
We evolved to care about
whether other people in our tribe...
688
00:38:56,751 --> 00:38:59,128
think well of us or not
'cause it matters.
689
00:38:59,837 --> 00:39:04,550
But were we evolved to be aware
of what 10,000 people think of us?
690
00:39:04,633 --> 00:39:08,763
We were not evolved
to have social approval being dosed to us
691
00:39:08,846 --> 00:39:10,348
every five minutes.
692
00:39:10,431 --> 00:39:13,142
That was not at all what we were built
to experience.
693
00:39:15,394 --> 00:39:19,982
We curate our lives
around this perceived sense of perfection
694
00:39:20,733 --> 00:39:23,527
because we get rewarded
in these short-term signals--
695
00:39:23,611 --> 00:39:25,154
hearts, likes, thumbs-up--
696
00:39:25,237 --> 00:39:28,407
and we conflate that with value,
and we conflate it with truth.
697
00:39:29,825 --> 00:39:33,120
And instead, what it really is
is fake, brittle popularity...
698
00:39:33,913 --> 00:39:37,458
that's short-term and that leaves you
even more, and admit it,
699
00:39:37,541 --> 00:39:39,919
vacant and empty than before you did it.
700
00:39:41,295 --> 00:39:43,381
Because then it forces you
into this vicious cycle
701
00:39:43,464 --> 00:39:47,176
where you're like, "What's the next thing
I need to do now? 'Cause I need it back."
702
00:39:48,260 --> 00:39:50,846
Think about that compounded
by two billion people,
703
00:39:50,930 --> 00:39:54,767
and then think about how people react then
to the perceptions of others.
704
00:39:54,850 --> 00:39:56,435
It's just a... It's really bad.
705
00:39:56,977 --> 00:39:58,229
It's really, really bad.
706
00:40:00,856 --> 00:40:03,484
There has been
a gigantic increase
707
00:40:03,567 --> 00:40:06,529
in depression and anxiety
for American teenagers
708
00:40:06,612 --> 00:40:10,950
which began right around...
between 2011 and 2013.
709
00:40:11,033 --> 00:40:15,371
The number of teenage girls out of 100,000
in this country
710
00:40:15,454 --> 00:40:17,123
who were admitted to a hospital every year
711
00:40:17,206 --> 00:40:19,917
because they cut themselves
or otherwise harmed themselves,
712
00:40:20,000 --> 00:40:23,921
that number was pretty stable
until around 2010, 2011,
713
00:40:24,004 --> 00:40:25,756
and then it begins going way up.
714
00:40:28,759 --> 00:40:32,513
It's up 62 percent for older teen girls.
715
00:40:33,848 --> 00:40:38,310
It's up 189 percent for the preteen girls.
That's nearly triple.
716
00:40:40,312 --> 00:40:43,524
Even more horrifying,
we see the same pattern with suicide.
717
00:40:44,775 --> 00:40:47,570
The older teen girls, 15 to 19 years old,
718
00:40:47,653 --> 00:40:49,196
they're up 70 percent,
719
00:40:49,280 --> 00:40:51,699
compared to the first decade
of this century.
720
00:40:52,158 --> 00:40:55,077
The preteen girls,
who have very low rates to begin with,
721
00:40:55,161 --> 00:40:57,663
they are up 151 percent.
722
00:40:58,831 --> 00:41:01,709
And that pattern points to social media.
723
00:41:04,044 --> 00:41:07,214
Gen Z, the kids born after 1996 or so,
724
00:41:07,298 --> 00:41:10,342
those kids are the first generation
in history
725
00:41:10,426 --> 00:41:12,636
that got on social media in middle school.
726
00:41:15,890 --> 00:41:17,600
How do they spend their time?
727
00:41:19,727 --> 00:41:22,730
They come home from school,
and they're on their devices.
728
00:41:24,315 --> 00:41:29,195
A whole generation is more anxious,
more fragile, more depressed.
729
00:41:30,613 --> 00:41:33,282
They're much less comfortable
taking risks.
730
00:41:34,325 --> 00:41:37,536
The rates at which they get
driver's licenses have been dropping.
731
00:41:38,954 --> 00:41:41,081
The number
who have ever gone out on a date
732
00:41:41,165 --> 00:41:44,251
or had any kind of romantic interaction
is dropping rapidly.
733
00:41:47,505 --> 00:41:49,715
This is a real change in a generation.
734
00:41:53,177 --> 00:41:57,306
And remember, for every one of these,
for every hospital admission,
735
00:41:57,389 --> 00:42:00,267
there's a family that is traumatized
and horrified.
736
00:42:00,351 --> 00:42:02,353
"My God, what is happening to our kids?"
737
00:42:19,411 --> 00:42:21,413
It's plain as day to me.
738
00:42:22,873 --> 00:42:28,128
These services are killing people...
and causing people to kill themselves.
739
00:42:29,088 --> 00:42:33,300
I don't know any parent who says, "Yeah,
I really want my kids to be growing up
740
00:42:33,384 --> 00:42:36,887
feeling manipulated by tech designers, uh,
741
00:42:36,971 --> 00:42:39,723
manipulating their attention,
making it impossible to do their homework,
742
00:42:39,807 --> 00:42:42,560
making them compare themselves
to unrealistic standards of beauty."
743
00:42:42,643 --> 00:42:44,687
Like, no one wants that.
744
00:42:45,104 --> 00:42:46,355
No one does.
745
00:42:46,438 --> 00:42:48,482
We... We used to have these protections.
746
00:42:48,566 --> 00:42:50,943
When children watched
Saturday morning cartoons,
747
00:42:51,026 --> 00:42:52,778
we cared about protecting children.
748
00:42:52,861 --> 00:42:56,574
We would say, "You can't advertise
to these age children in these ways."
749
00:42:57,366 --> 00:42:58,784
But then you take YouTube for Kids,
750
00:42:58,867 --> 00:43:02,454
and it gobbles up that entire portion
of the attention economy,
751
00:43:02,538 --> 00:43:04,915
and now all kids are exposed
to YouTube for Kids.
752
00:43:04,999 --> 00:43:07,710
And all those protections
and all those regulations are gone.
753
00:43:18,304 --> 00:43:22,141
We're training and conditioning
a whole new generation of people...
754
00:43:23,434 --> 00:43:29,148
that when we are uncomfortable or lonely
or uncertain or afraid,
755
00:43:29,231 --> 00:43:31,775
we have a digital pacifier for ourselves
756
00:43:32,234 --> 00:43:36,488
that is kind of atrophying our own ability
to deal with that.
757
00:43:53,881 --> 00:43:55,674
Photoshop didn't have
1,000 engineers
758
00:43:55,758 --> 00:43:58,969
on the other side of the screen,
using notifications, using your friends,
759
00:43:59,053 --> 00:44:02,431
using AI to predict what's gonna
perfectly addict you, or hook you,
760
00:44:02,514 --> 00:44:04,516
or manipulate you, or allow advertisers
761
00:44:04,600 --> 00:44:08,437
to test 60,000 variations
of text or colors to figure out
762
00:44:08,520 --> 00:44:11,065
what's the perfect manipulation
of your mind.
763
00:44:11,148 --> 00:44:14,985
This is a totally new species
of power and influence.
764
00:44:16,070 --> 00:44:19,156
I... I would say, again, the methods used
765
00:44:19,239 --> 00:44:22,785
to play on people's ability
to be addicted or to be influenced
766
00:44:22,868 --> 00:44:25,204
may be different this time,
and they probably are different.
767
00:44:25,287 --> 00:44:28,749
They were different when newspapers
came in and the printing press came in,
768
00:44:28,832 --> 00:44:31,835
and they were different
when television came in,
769
00:44:31,919 --> 00:44:34,004
and you had three major networks and...
770
00:44:34,463 --> 00:44:36,423
- At the time.
- At the time. That's what I'm saying.
771
00:44:36,507 --> 00:44:38,384
But I'm saying the idea
that there's a new level
772
00:44:38,467 --> 00:44:42,054
and that new level has happened
so many times before.
773
00:44:42,137 --> 00:44:45,099
I mean, this is just the latest new level
that we've seen.
774
00:44:45,182 --> 00:44:48,727
There's this narrative that, you know,
"We'll just adapt to it.
775
00:44:48,811 --> 00:44:51,188
We'll learn how to live
with these devices,
776
00:44:51,271 --> 00:44:53,732
just like we've learned how to live
with everything else."
777
00:44:53,816 --> 00:44:56,694
And what this misses
is there's something distinctly new here.
778
00:44:57,486 --> 00:45:00,155
Perhaps the most dangerous piece
of all this is the fact
779
00:45:00,239 --> 00:45:04,410
that it's driven by technology
that's advancing exponentially.
780
00:45:05,869 --> 00:45:09,081
Roughly, if you say from, like,
the 1960s to today,
781
00:45:09,873 --> 00:45:12,960
processing power has gone up
about a trillion times.
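
A quick back-of-the-envelope check of that trillionfold figure, as a minimal sketch assuming a Moore's-law-style doubling of processing power every 18 months (the doubling interval is an assumption; the speaker only gives the endpoint):

```python
# Rough sanity check of the "trillion times" claim, assuming a
# doubling of processing power every 1.5 years (assumed interval).
years = 2020 - 1960               # "the 1960s to today", roughly
doublings = years / 1.5           # ~40 doublings
growth = 2 ** doublings
print(f"{doublings:.0f} doublings -> about {growth:.1e}x")  # ~1.1e12
```

Forty doublings over sixty years compounds to about 1.1 trillion, the order of magnitude claimed.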
782
00:45:13,794 --> 00:45:18,340
Nothing else that we have has improved
at anything near that rate.
783
00:45:18,424 --> 00:45:22,177
Like, cars are, you know,
roughly twice as fast.
784
00:45:22,261 --> 00:45:25,013
And almost everything else is negligible.
785
00:45:25,347 --> 00:45:27,182
And perhaps most importantly,
786
00:45:27,266 --> 00:45:31,353
our human-- our physiology,
our brains have evolved not at all.
787
00:45:37,401 --> 00:45:41,488
Human beings, at a mind and body
and sort of physical level,
788
00:45:41,947 --> 00:45:43,866
are not gonna fundamentally change.
789
00:45:47,035 --> 00:45:48,954
I know, but they...
790
00:45:56,837 --> 00:46:00,924
We can do genetic engineering
and develop new kinds of human beings,
791
00:46:01,008 --> 00:46:05,220
but realistically speaking,
you're living inside of hardware, a brain,
792
00:46:05,304 --> 00:46:07,222
that was, like, millions of years old,
793
00:46:07,306 --> 00:46:10,559
and then there's this screen, and then
on the opposite side of the screen,
794
00:46:10,642 --> 00:46:13,562
there's these thousands of engineers
and supercomputers
795
00:46:13,645 --> 00:46:16,106
that have goals that are different
than your goals,
796
00:46:16,190 --> 00:46:19,693
and so, who's gonna win in that game?
Who's gonna win?
797
00:46:25,699 --> 00:46:26,617
How are we losing?
798
00:46:27,159 --> 00:46:29,828
- I don't know.
- Where is he? This is not normal.
799
00:46:29,912 --> 00:46:32,080
Did I overwhelm him
with friends and family content?
800
00:46:32,164 --> 00:46:34,082
- Probably.
- Well, maybe it was all the ads.
801
00:46:34,166 --> 00:46:37,795
No. Something's very wrong.
Let's switch to resurrection mode.
802
00:46:39,713 --> 00:46:44,051
When you think of AI,
you know, an AI's gonna ruin the world,
803
00:46:44,134 --> 00:46:47,221
and you see, like, a Terminator,
and you see Arnold Schwarzenegger.
804
00:46:47,638 --> 00:46:48,680
I'll be back.
805
00:46:48,764 --> 00:46:50,933
You see drones,
and you think, like,
806
00:46:51,016 --> 00:46:52,684
"Oh, we're gonna kill people with AI."
807
00:46:53,644 --> 00:46:59,817
And what people miss is that AI
already runs today's world right now.
808
00:46:59,900 --> 00:47:03,237
Even talking about "an AI"
is just a metaphor.
809
00:47:03,320 --> 00:47:09,451
At these companies like... like Google,
there's just massive, massive rooms,
810
00:47:10,327 --> 00:47:13,121
some of them underground,
some of them underwater,
811
00:47:13,205 --> 00:47:14,498
of just computers.
812
00:47:14,581 --> 00:47:17,835
Tons and tons of computers,
as far as the eye can see.
813
00:47:18,460 --> 00:47:20,504
They're deeply interconnected
with each other
814
00:47:20,587 --> 00:47:22,923
and running
extremely complicated programs,
815
00:47:23,006 --> 00:47:26,009
sending information back and forth
between each other all the time.
816
00:47:26,802 --> 00:47:28,595
And they'll be running
many different programs,
817
00:47:28,679 --> 00:47:31,014
many different products
on those same machines.
818
00:47:31,348 --> 00:47:33,684
Some of those things could be described
as simple algorithms,
819
00:47:33,767 --> 00:47:35,227
some could be described as algorithms
820
00:47:35,310 --> 00:47:37,521
that are so complicated,
you would call them intelligence.
821
00:47:40,065 --> 00:47:42,568
I like to say that algorithms are opinions
822
00:47:42,651 --> 00:47:43,777
embedded in code...
823
00:47:45,070 --> 00:47:47,656
and that algorithms are not objective.
824
00:47:48,365 --> 00:47:51,577
Algorithms are optimized
to some definition of success.
825
00:47:52,244 --> 00:47:53,370
So, if you can imagine,
826
00:47:53,453 --> 00:47:57,124
if a... if a commercial enterprise builds
an algorithm
827
00:47:57,207 --> 00:47:59,293
to their definition of success,
828
00:47:59,835 --> 00:48:01,211
it's a commercial interest.
829
00:48:01,587 --> 00:48:02,671
It's usually profit.
830
00:48:03,130 --> 00:48:07,384
You are giving the computer
the goal state, "I want this outcome,"
831
00:48:07,467 --> 00:48:10,262
and then the computer itself is learning
how to do it.
832
00:48:10,345 --> 00:48:12,598
That's where the term "machine learning"
comes from.
833
00:48:12,681 --> 00:48:14,850
And so, every day, it gets slightly better
834
00:48:14,933 --> 00:48:16,977
at picking the right posts
in the right order
835
00:48:17,060 --> 00:48:19,438
so that you spend longer and longer
in that product.
836
00:48:19,521 --> 00:48:22,232
And no one really understands
what they're doing
837
00:48:22,316 --> 00:48:23,901
in order to achieve that goal.
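
As a hedged illustration of that "goal state" idea, here is a minimal sketch: the machine is handed nothing but a number to maximize (simulated session minutes), and a crude hill-climbing loop adjusts ranking weights on its own. The feature names, weights, and behavior simulator are all invented for illustration; real systems are far more complex.

```python
import random

# The only objective handed to the machine: time spent in the product.
FEATURES = ["outrage", "novelty", "friend_activity"]
weights = {f: 0.0 for f in FEATURES}

def simulated_session_minutes(w):
    # Stand-in for real user behavior; in this toy world, outrage-heavy
    # ranking happens to keep people around longer.
    return 10 + 8 * w["outrage"] + 3 * w["novelty"] + random.gauss(0, 0.5)

best = simulated_session_minutes(weights)
for _ in range(1000):
    candidate = {f: v + random.gauss(0, 0.1) for f, v in weights.items()}
    score = simulated_session_minutes(candidate)
    if score > best:                      # keep whatever scores higher;
        weights, best = candidate, score  # nobody inspects *why* it scores higher

print(weights)  # the outrage weight drifts upward without anyone choosing it
```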
838
00:48:23,984 --> 00:48:28,238
The algorithm has a mind of its own,
so even though a person writes it,
839
00:48:28,906 --> 00:48:30,657
it's written in a way
840
00:48:30,741 --> 00:48:35,037
that you kind of build the machine,
and then the machine changes itself.
841
00:48:35,120 --> 00:48:37,873
There's only a handful of people
at these companies,
842
00:48:37,956 --> 00:48:40,000
at Facebook and Twitter
and other companies...
843
00:48:40,083 --> 00:48:43,795
There's only a few people who understand
how those systems work,
844
00:48:43,879 --> 00:48:46,715
and even they don't necessarily
fully understand
845
00:48:46,798 --> 00:48:49,551
what's gonna happen
with a particular piece of content.
846
00:48:49,968 --> 00:48:55,474
So, as humans, we've almost lost control
over these systems.
847
00:48:55,891 --> 00:48:59,603
Because they're controlling, you know,
the information that we see,
848
00:48:59,686 --> 00:49:02,189
they're controlling us more
than we're controlling them.
849
00:49:02,522 --> 00:49:04,733
Cross-referencing him
850
00:49:04,816 --> 00:49:07,319
against comparables
in his geographic zone.
851
00:49:07,402 --> 00:49:09,571
His psychometric doppelgangers.
852
00:49:09,655 --> 00:49:13,700
There are 13,694 people
behaving just like him in his region.
853
00:49:13,784 --> 00:49:16,370
- What's trending with them?
- We need something actually good
854
00:49:16,453 --> 00:49:17,704
for a proper resurrection,
855
00:49:17,788 --> 00:49:19,957
given that the typical stuff
isn't working.
856
00:49:20,040 --> 00:49:21,875
Not even that cute girl from school.
857
00:49:22,334 --> 00:49:25,253
My analysis shows that going political
with Extreme Center content
858
00:49:25,337 --> 00:49:28,256
has a 62.3 percent chance
of long-term engagement.
859
00:49:28,340 --> 00:49:29,299
That's not bad.
860
00:49:29,383 --> 00:49:32,010
It's not good enough to lead with.
861
00:49:32,302 --> 00:49:35,305
Okay, okay, so we've tried notifying him
about tagged photos,
862
00:49:35,389 --> 00:49:39,017
invitations, current events,
even a direct message from Rebecca.
863
00:49:39,101 --> 00:49:42,813
But what about User 01265923010?
864
00:49:42,896 --> 00:49:44,648
Yeah, Ben loved all of her posts.
865
00:49:44,731 --> 00:49:47,776
For months and, like,
literally all of them, and then nothing.
866
00:49:47,859 --> 00:49:50,445
I calculate a 92.3 percent chance
of resurrection
867
00:49:50,529 --> 00:49:52,030
with a notification about Ana.
868
00:49:56,535 --> 00:49:57,494
And her new friend.
869
00:50:25,689 --> 00:50:27,441
Oh, you gotta be kiddin' me.
870
00:50:32,404 --> 00:50:33,613
Uh...
871
00:50:35,657 --> 00:50:36,616
Okay.
872
00:50:38,869 --> 00:50:40,996
What?
873
00:50:41,413 --> 00:50:42,789
Bam! We're back!
874
00:50:42,873 --> 00:50:44,374
Let's get back to making money, boys.
875
00:50:44,458 --> 00:50:46,334
Yes, and connecting Ben
with the entire world.
876
00:50:46,418 --> 00:50:49,087
I'm giving him access
to all the information he might like.
877
00:50:49,755 --> 00:50:53,717
Hey, do you guys ever wonder if, you know,
like, the feed is good for Ben?
878
00:50:57,095 --> 00:50:58,430
- No.
- No.
879
00:51:17,491 --> 00:51:19,076
♪ I put a spell on you ♪
880
00:51:25,040 --> 00:51:26,374
♪ 'Cause you're mine ♪
881
00:51:28,627 --> 00:51:32,089
♪ Ah! ♪
882
00:51:34,508 --> 00:51:36,593
♪ You better stop the things you do ♪
883
00:51:41,181 --> 00:51:42,265
♪ I ain't lyin' ♪
884
00:51:44,976 --> 00:51:46,686
♪ No, I ain't lyin' ♪
885
00:51:49,981 --> 00:51:51,817
♪ You know I can't stand it ♪
886
00:51:53,026 --> 00:51:54,611
♪ You're runnin' around ♪
887
00:51:55,612 --> 00:51:57,239
♪ You know better, Daddy ♪
888
00:51:58,782 --> 00:52:02,077
♪ I can't stand it 'cause you put me down ♪
889
00:52:03,286 --> 00:52:04,121
♪ Yeah, yeah ♪
890
00:52:06,456 --> 00:52:08,375
♪ I put a spell on you ♪
891
00:52:12,379 --> 00:52:14,840
♪ Because you're mine ♪
892
00:52:18,718 --> 00:52:19,845
♪ You're mine ♪
893
00:52:20,929 --> 00:52:24,349
So, imagine you're on Facebook...
894
00:52:24,766 --> 00:52:29,312
and you're effectively playing
against this artificial intelligence
895
00:52:29,396 --> 00:52:31,314
that knows everything about you,
896
00:52:31,398 --> 00:52:34,568
can anticipate your next move,
and you know literally nothing about it,
897
00:52:34,651 --> 00:52:37,404
except that there are cat videos
and birthdays on it.
898
00:52:37,821 --> 00:52:39,656
That's not a fair fight.
899
00:52:41,575 --> 00:52:43,869
Ben and Jerry, it's time to go, bud!
900
00:52:51,126 --> 00:52:51,960
Ben?
901
00:53:02,679 --> 00:53:04,723
- Ben.
- Mm.
902
00:53:05,182 --> 00:53:06,057
Come on.
903
00:53:07,225 --> 00:53:08,351
School time.
904
00:53:08,435 --> 00:53:09,269
Let's go.
905
00:53:31,374 --> 00:53:33,627
- How you doing today?
- Oh, I'm... I'm nervous.
906
00:53:33,710 --> 00:53:35,003
- Are ya?
- Yeah.
907
00:53:37,380 --> 00:53:39,132
We were all looking for the moment
908
00:53:39,216 --> 00:53:42,969
when technology would overwhelm
human strengths and intelligence.
909
00:53:43,053 --> 00:53:47,015
When is it gonna cross the singularity,
replace our jobs, be smarter than humans?
910
00:53:48,141 --> 00:53:50,101
But there's this much earlier moment...
911
00:53:50,977 --> 00:53:55,315
when technology exceeds
and overwhelms human weaknesses.
912
00:53:57,484 --> 00:54:01,780
This point being crossed
is at the root of addiction,
913
00:54:02,113 --> 00:54:04,741
polarization, radicalization,
outrage-ification,
914
00:54:04,824 --> 00:54:06,368
vanity-ification, the entire thing.
915
00:54:07,702 --> 00:54:09,913
This is overpowering human nature,
916
00:54:10,538 --> 00:54:13,500
and this is checkmate on humanity.
917
00:54:30,558 --> 00:54:31,851
I'm sorry.
918
00:54:41,736 --> 00:54:44,656
One of the ways
I try to get people to understand
919
00:54:45,198 --> 00:54:49,828
just how wrong feeds from places
like Facebook are
920
00:54:49,911 --> 00:54:51,454
is to think about the Wikipedia.
921
00:54:52,956 --> 00:54:56,209
When you go to a page, you're seeing
the same thing as other people.
922
00:54:56,584 --> 00:55:00,297
So, it's one of the few things online
that we at least hold in common.
923
00:55:00,380 --> 00:55:03,425
Now, just imagine for a second
that Wikipedia said,
924
00:55:03,508 --> 00:55:07,178
"We're gonna give each person
a different customized definition,
925
00:55:07,262 --> 00:55:09,472
and we're gonna be paid by people
for that."
926
00:55:09,556 --> 00:55:13,435
So, Wikipedia would be spying on you.
Wikipedia would calculate,
927
00:55:13,518 --> 00:55:17,188
"What's the thing I can do
to get this person to change a little bit
928
00:55:17,272 --> 00:55:19,899
on behalf of some commercial interest?"
Right?
929
00:55:19,983 --> 00:55:21,818
And then it would change the entry.
930
00:55:22,444 --> 00:55:24,738
Can you imagine that?
Well, you should be able to,
931
00:55:24,821 --> 00:55:26,823
'cause that's exactly what's happening
on Facebook.
932
00:55:26,906 --> 00:55:28,992
It's exactly what's happening
in your YouTube feed.
933
00:55:29,075 --> 00:55:31,786
When you go to Google and type in
"Climate change is,"
934
00:55:31,870 --> 00:55:34,998
you're going to see different results
depending on where you live.
935
00:55:36,166 --> 00:55:38,460
In certain cities,
you're gonna see it autocomplete
936
00:55:38,543 --> 00:55:40,462
with "climate change is a hoax."
937
00:55:40,545 --> 00:55:42,088
In other cases, you're gonna see
938
00:55:42,172 --> 00:55:44,841
"climate change is causing the destruction
of nature."
939
00:55:44,924 --> 00:55:48,428
And that's a function not
of what the truth is about climate change,
940
00:55:48,511 --> 00:55:51,097
but about
where you happen to be Googling from
941
00:55:51,181 --> 00:55:54,100
and the particular things
Google knows about your interests.
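
A toy sketch of that conditioning, with an entirely invented suggestion table: the point is only that the same query returns different completions depending on the profile attached to the request, where a shared resource would return one answer for everyone.

```python
# Invented, illustrative suggestion table keyed on (query, profile).
SUGGESTIONS = {
    ("climate change is", "profile_A"): "climate change is a hoax",
    ("climate change is", "profile_B"):
        "climate change is causing the destruction of nature",
}

def autocomplete(query, profile):
    # A shared page would ignore the profile; a personalized
    # service keys its answer on it.
    return SUGGESTIONS.get((query, profile), query)

print(autocomplete("climate change is", "profile_A"))
print(autocomplete("climate change is", "profile_B"))
```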
942
00:55:54,851 --> 00:55:58,021
Even two friends
who are so close to each other,
943
00:55:58,104 --> 00:56:00,190
who have almost the exact same set
of friends,
944
00:56:00,273 --> 00:56:02,817
they think, you know,
"I'm going to news feeds on Facebook.
945
00:56:02,901 --> 00:56:05,403
I'll see the exact same set of updates."
946
00:56:05,487 --> 00:56:06,738
But it's not like that at all.
947
00:56:06,821 --> 00:56:08,448
They see completely different worlds
948
00:56:08,531 --> 00:56:10,575
because they're based
on these computers calculating
949
00:56:10,658 --> 00:56:12,035
what's perfect for each of them.
950
00:56:14,329 --> 00:56:18,416
The way to think about it
is it's 2.7 billion Truman Shows.
951
00:56:18,500 --> 00:56:21,294
Each person has their own reality,
with their own...
952
00:56:22,670 --> 00:56:23,671
facts.
953
00:56:23,755 --> 00:56:27,008
Why do you think that, uh, Truman has never come close
954
00:56:27,092 --> 00:56:30,095
to discovering the true natureof his world until now?
955
00:56:31,054 --> 00:56:34,140
We accept the reality of the world
with which we're presented.
956
00:56:34,224 --> 00:56:35,141
It's as simple as that.
957
00:56:36,476 --> 00:56:41,064
Over time, you have the false sense
that everyone agrees with you,
958
00:56:41,147 --> 00:56:44,067
because everyone in your news feed
sounds just like you.
959
00:56:44,567 --> 00:56:49,072
And that once you're in that state,
it turns out you're easily manipulated,
960
00:56:49,155 --> 00:56:51,741
the same way you would be manipulated
by a magician.
961
00:56:51,825 --> 00:56:55,370
A magician shows you a card trick
and says, "Pick a card, any card."
962
00:56:55,453 --> 00:56:58,164
What you don't realize
was that they've done a set-up,
963
00:56:58,456 --> 00:57:00,583
so you pick the card
they want you to pick.
964
00:57:00,667 --> 00:57:03,169
And that's how Facebook works.
Facebook sits there and says,
965
00:57:03,253 --> 00:57:06,172
"Hey, you pick your friends.
You pick the links that you follow."
966
00:57:06,256 --> 00:57:08,716
But that's all nonsense.
It's just like the magician.
967
00:57:08,800 --> 00:57:11,302
Facebook is in charge of your news feed.
968
00:57:11,386 --> 00:57:14,514
We all simply are operating
on a different set of facts.
969
00:57:14,597 --> 00:57:16,474
When that happens at scale,
970
00:57:16,558 --> 00:57:20,645
you're no longer able to reckon with
or even consume information
971
00:57:20,728 --> 00:57:23,690
that contradicts with that world view
that you've created.
972
00:57:23,773 --> 00:57:26,443
That means we aren't actually being
objective,
973
00:57:26,526 --> 00:57:28,319
constructive individuals.
974
00:57:28,403 --> 00:57:32,449
Open up your eyes,
don't believe the lies! Open up...
975
00:57:32,532 --> 00:57:34,701
And then you look
over at the other side,
976
00:57:35,243 --> 00:57:38,746
and you start to think,
"How can those people be so stupid?
977
00:57:38,830 --> 00:57:42,125
Look at all of this information
that I'm constantly seeing.
978
00:57:42,208 --> 00:57:44,627
How are they not seeing
that same information?"
979
00:57:44,711 --> 00:57:47,297
And the answer is, "They're not seeing
that same information."
980
00:57:47,380 --> 00:57:50,800
Open up your eyes, don't believe the lies!
981
00:57:52,093 --> 00:57:55,472
- What are Republicans like?
- People that don't have a clue.
982
00:57:55,555 --> 00:57:58,933
The Democrat Party is a crime syndicate,
not a real political party.
983
00:57:59,017 --> 00:58:03,188
A huge new Pew Research Center study
of 10,000 American adults
984
00:58:03,271 --> 00:58:05,315
finds us more divided than ever,
985
00:58:05,398 --> 00:58:09,152
with personal and political polarization
at a 20-year high.
986
00:58:11,738 --> 00:58:14,199
You have
more than a third of Republicans saying
987
00:58:14,282 --> 00:58:16,826
the Democratic Party is a threat
to the nation,
988
00:58:16,910 --> 00:58:20,580
more than a quarter of Democrats saying
the same thing about the Republicans.
989
00:58:20,663 --> 00:58:22,499
So many of the problems
that we're discussing,
990
00:58:22,582 --> 00:58:24,417
like, around political polarization
991
00:58:24,501 --> 00:58:28,046
exist in spades on cable television.
992
00:58:28,129 --> 00:58:31,007
The media has this exact same problem,
993
00:58:31,090 --> 00:58:33,343
where their business model, by and large,
994
00:58:33,426 --> 00:58:35,762
is that they're selling our attention
to advertisers.
995
00:58:35,845 --> 00:58:38,890
And the Internet is just a new,
even more efficient way to do that.
996
00:58:40,141 --> 00:58:44,145
At YouTube, I was working
on YouTube recommendations.
997
00:58:44,229 --> 00:58:47,148
It worries me that an algorithm
that I worked on
998
00:58:47,232 --> 00:58:50,401
is actually increasing polarization
in society.
999
00:58:50,485 --> 00:58:53,112
But from the point of view of watch time,
1000
00:58:53,196 --> 00:58:57,617
this polarization is extremely efficient
at keeping people online.
1001
00:58:58,785 --> 00:59:00,870
The only reason these teachers are teaching this stuff
1002
00:59:00,954 --> 00:59:02,288
is 'cause they're getting paid to.
1003
00:59:02,372 --> 00:59:04,374
- It's absolutely absurd.
- Hey, Benji.
1004
00:59:04,916 --> 00:59:06,292
No soccer practice today?
1005
00:59:06,376 --> 00:59:08,795
Oh, there is. I'm just catching up
on some news stuff.
1006
00:59:08,878 --> 00:59:11,506
Do research. Anything that sways from the Extreme Center--
1007
00:59:11,589 --> 00:59:14,008
Wouldn't exactly call the stuff
that you're watching news.
1008
00:59:15,552 --> 00:59:18,846
You're always talking about how messed up
everything is. So are they.
1009
00:59:19,305 --> 00:59:21,140
But that stuff is just propaganda.
1010
00:59:21,224 --> 00:59:24,060
Neither is true. It's all about what makes sense.
1011
00:59:24,769 --> 00:59:26,938
Ben, I'm serious.
That stuff is bad for you.
1012
00:59:27,021 --> 00:59:29,232
- You should go to soccer practice.
- Mm.
1013
00:59:35,154 --> 00:59:37,490
I share this stuff because I care.
1014
00:59:37,574 --> 00:59:41,077
I care that you are being misled, and it's not okay. All right?
1015
00:59:41,160 --> 00:59:43,121
People think
the algorithm is designed
1016
00:59:43,204 --> 00:59:46,833
to give them what they really want,
only it's not.
1017
00:59:46,916 --> 00:59:52,589
The algorithm is actually trying to find
a few rabbit holes that are very powerful,
1018
00:59:52,672 --> 00:59:56,217
trying to find which rabbit hole
is the closest to your interest.
1019
00:59:56,301 --> 00:59:59,262
And then if you start watching
one of those videos,
1020
00:59:59,846 --> 01:00:02,223
then it will recommend it
over and over again.
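
A minimal sketch of that dynamic, with invented interest vectors and clusters: score each "rabbit hole" against the user's interests, pick the closest one, then queue recommendations from inside it.

```python
def closest_rabbit_hole(user_interests, rabbit_holes):
    # Dot product as a stand-in for whatever similarity the real system uses.
    def similarity(hole):
        return sum(user_interests.get(k, 0) * v
                   for k, v in hole["profile"].items())
    return max(rabbit_holes, key=similarity)

user = {"sports": 0.2, "conspiracy": 0.7}        # invented interest vector
holes = [
    {"name": "flat_earth", "profile": {"conspiracy": 1.0}},
    {"name": "highlight_reels", "profile": {"sports": 1.0}},
]

hole = closest_rabbit_hole(user, holes)
queue = [f"{hole['name']}_video_{i}" for i in range(5)]  # then it repeats
print(queue)
```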
1021
01:00:02,682 --> 01:00:04,934
It's not like anybody wants this
to happen.
1022
01:00:05,018 --> 01:00:07,812
It's just that this is
what the recommendation system is doing.
1023
01:00:07,895 --> 01:00:10,815
So much so that Kyrie Irving,
the famous basketball player,
1024
01:00:11,065 --> 01:00:14,235
uh, said he believed the Earth was flat,
and he apologized later
1025
01:00:14,319 --> 01:00:16,154
because he blamed it
on a YouTube rabbit hole.
1026
01:00:16,487 --> 01:00:18,656
You know, like,
you click the YouTube click
1027
01:00:18,740 --> 01:00:21,534
and it goes, like,
how deep the rabbit hole goes.
1028
01:00:21,618 --> 01:00:23,369
When he later came on to NPR to say,
1029
01:00:23,453 --> 01:00:25,955
"I'm sorry for believing this.
I didn't want to mislead people,"
1030
01:00:26,039 --> 01:00:28,291
a bunch of students in a classroom
were interviewed saying,
1031
01:00:28,374 --> 01:00:29,667
"The round-Earthers got to him."
1032
01:00:31,044 --> 01:00:33,963
The flat-Earth conspiracy theory
was recommended
1033
01:00:34,047 --> 01:00:37,634
hundreds of millions of times
by the algorithm.
1034
01:00:37,717 --> 01:00:43,890
It's easy to think that it's just
a few stupid people who get convinced,
1035
01:00:43,973 --> 01:00:46,893
but the algorithm is getting smarter
and smarter every day.
1036
01:00:46,976 --> 01:00:50,188
So, today, they are convincing the people
that the Earth is flat,
1037
01:00:50,271 --> 01:00:53,983
but tomorrow, they will be convincing you
of something that's false.
1038
01:00:54,317 --> 01:00:57,820
On November 7th, the hashtag "Pizzagate" was born.
1039
01:00:57,904 --> 01:00:59,197
Pizzagate...
1040
01:01:00,114 --> 01:01:01,449
Oh, boy.
1041
01:01:01,532 --> 01:01:02,533
Uh...
1042
01:01:03,159 --> 01:01:06,913
I still am not 100 percent sure
how this originally came about,
1043
01:01:06,996 --> 01:01:12,377
but the idea that ordering a pizza
meant ordering a trafficked person.
1044
01:01:12,460 --> 01:01:15,046
As the groups got bigger on Facebook,
1045
01:01:15,129 --> 01:01:19,967
Facebook's recommendation engine
started suggesting to regular users
1046
01:01:20,051 --> 01:01:21,761
that they join Pizzagate groups.
1047
01:01:21,844 --> 01:01:27,392
So, if a user was, for example,
anti-vaccine or believed in chemtrails
1048
01:01:27,475 --> 01:01:30,645
or had indicated to Facebook's algorithms
in some way
1049
01:01:30,728 --> 01:01:33,398
that they were prone to belief
in conspiracy theories,
1050
01:01:33,481 --> 01:01:36,859
Facebook's recommendation engine
would serve them Pizzagate groups.
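
A hedged sketch of that recommendation pattern, with an invented signal-to-group table: a user who has shown one conspiracy-adjacent signal gets served adjacent groups they never searched for.

```python
# Invented adjacency table: which groups get suggested for which signals.
ADJACENT_GROUPS = {
    "anti_vaccine": ["pizzagate_discussion"],
    "chemtrails": ["pizzagate_discussion"],
}

def suggest_groups(user_signals):
    suggestions = set()
    for signal in user_signals:
        suggestions.update(ADJACENT_GROUPS.get(signal, []))
    return sorted(suggestions)

# The user never searched for the term; the signal alone triggers it.
print(suggest_groups(["chemtrails"]))  # ['pizzagate_discussion']
```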
1051
01:01:36,943 --> 01:01:41,072
Eventually, this culminated in
a man showing up with a gun,
1052
01:01:41,155 --> 01:01:44,617
deciding that he was gonna go liberate
the children from the basement
1053
01:01:44,701 --> 01:01:46,911
of the pizza place
that did not have a basement.
1054
01:01:46,994 --> 01:01:48,538
What were you doing?
1055
01:01:48,871 --> 01:01:50,498
Making sure
there was nothing there.
1056
01:01:50,581 --> 01:01:52,458
- Regarding?
- Pedophile ring.
1057
01:01:52,542 --> 01:01:54,293
- What?
- Pedophile ring.
1058
01:01:54,377 --> 01:01:55,962
He's talking about Pizzagate.
1059
01:01:56,045 --> 01:02:00,216
This is an example of a conspiracy theory
1060
01:02:00,299 --> 01:02:03,678
that was propagated
across all social networks.
1061
01:02:03,761 --> 01:02:06,097
The social network's
own recommendation engine
1062
01:02:06,180 --> 01:02:07,974
is voluntarily serving this up to people
1063
01:02:08,057 --> 01:02:10,643
who had never searched
for the term "Pizzagate" in their life.
1064
01:02:12,437 --> 01:02:14,439
There's a study, an MIT study,
1065
01:02:14,522 --> 01:02:19,819
that fake news on Twitter spreads
six times faster than true news.
1066
01:02:19,902 --> 01:02:21,863
What is that world gonna look like
1067
01:02:21,946 --> 01:02:24,741
when one has a six-times advantage
to the other one?
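
One crude reading of that six-times advantage: if each hop of sharing multiplies reach, and the false story's multiplier is six times the true story's, the gap is 6^k after k hops. A sketch with illustrative rates:

```python
# Illustrative per-hop multipliers; only their 6x ratio matters here.
true_rate, fake_rate = 1.5, 9.0
true_reach = fake_reach = 1.0
for hop in range(1, 6):
    true_reach *= true_rate
    fake_reach *= fake_rate
    print(f"hop {hop}: fake/true reach ratio = {fake_reach / true_reach:.0f}x")
# The ratio is 6^hop: 6x, 36x, 216x, 1296x, 7776x.
```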
1068
01:02:25,283 --> 01:02:27,660
You can imagine
these things are sort of like...
1069
01:02:27,744 --> 01:02:31,706
they... they tilt the floor
of... of human behavior.
1070
01:02:31,789 --> 01:02:34,709
They make some behavior harder
and some easier.
1071
01:02:34,792 --> 01:02:37,420
And you're always free
to walk up the hill,
1072
01:02:37,503 --> 01:02:38,796
but fewer people do,
1073
01:02:38,880 --> 01:02:43,092
and so, at scale, at society's scale,
you really are just tilting the floor
1074
01:02:43,176 --> 01:02:45,970
and changing what billions of people think
and do.
1075
01:02:46,053 --> 01:02:52,018
We've created a system
that biases towards false information.
1076
01:02:52,643 --> 01:02:54,437
Not because we want to,
1077
01:02:54,520 --> 01:02:58,816
but because false information makes
the companies more money
1078
01:02:59,400 --> 01:03:01,319
than the truth. The truth is boring.
1079
01:03:01,986 --> 01:03:04,489
It's a disinformation-for-profit
business model.
1080
01:03:04,906 --> 01:03:08,159
You make money the more you allow
unregulated messages
1081
01:03:08,701 --> 01:03:11,287
to reach anyone for the best price.
1082
01:03:11,662 --> 01:03:13,956
Because climate change? Yeah.
1083
01:03:14,040 --> 01:03:16,751
It's a hoax. Yeah, it's real. That's the point.
1084
01:03:16,834 --> 01:03:20,046
The more they talk about it and the more they divide us,
1085
01:03:20,129 --> 01:03:22,423
the more they have the power, the more...
1086
01:03:22,507 --> 01:03:25,468
Facebook has trillions
of these news feed posts.
1087
01:03:26,552 --> 01:03:29,180
They can't know what's real
or what's true...
1088
01:03:29,972 --> 01:03:33,726
which is why this conversation
is so critical right now.
1089
01:03:33,810 --> 01:03:37,021
It's not just COVID-19 that's spreading fast.
1090
01:03:37,104 --> 01:03:40,191
There's a flow of misinformation online about the virus.
1091
01:03:40,274 --> 01:03:41,818
The notion drinking water
1092
01:03:41,901 --> 01:03:43,694
will flush coronavirus from your system
1093
01:03:43,778 --> 01:03:47,490
is one of several myths about the virus circulating on social media.
1094
01:03:47,573 --> 01:03:50,451
The government planned this event, created the virus,
1095
01:03:50,535 --> 01:03:53,621
and had a simulation of how the countries would react.
1096
01:03:53,955 --> 01:03:55,581
Coronavirus is a... a hoax.
1097
01:03:56,165 --> 01:03:57,959
SARS, coronavirus.
1098
01:03:58,376 --> 01:04:01,045
And look at when it was made. 2018.
1099
01:04:01,128 --> 01:04:03,798
I think the US government started
this shit.
1100
01:04:04,215 --> 01:04:09,095
Nobody is sick. Nobody is sick.
Nobody knows anybody who's sick.
1101
01:04:09,512 --> 01:04:13,015
Maybe the government is using
the coronavirus as an excuse
1102
01:04:13,099 --> 01:04:15,643
to get everyone to stay inside
because something else is happening.
1103
01:04:15,726 --> 01:04:18,020
Coronavirus is not killing people,
1104
01:04:18,104 --> 01:04:20,940
it's the 5G radiation
that they're pumping out.
1105
01:04:22,608 --> 01:04:24,944
We're being bombarded with rumors.
1106
01:04:25,403 --> 01:04:28,823
People are blowing up actual physical cell phone towers.
1107
01:04:28,906 --> 01:04:32,201
We see Russia and China spreading rumors and conspiracy theories.
1108
01:04:32,285 --> 01:04:35,246
This morning, panic and protest in Ukraine as...
1109
01:04:35,329 --> 01:04:38,916
People have no idea what's true, and now it's a matter of life and death.
1110
01:04:39,876 --> 01:04:42,628
Those sources that are spreading
coronavirus misinformation
1111
01:04:42,712 --> 01:04:45,798
have amassed
something like 52 million engagements.
1112
01:04:45,882 --> 01:04:50,094
You're saying that silver solution
would be effective.
1113
01:04:50,177 --> 01:04:54,140
Well, let's say it hasn't been tested
on this strain of the coronavirus, but...
1114
01:04:54,223 --> 01:04:57,226
What we're seeing with COVID is just an extreme version
1115
01:04:57,310 --> 01:05:00,521
of what's happening across our information ecosystem.
1116
01:05:00,938 --> 01:05:05,026
Social media amplifies exponential gossip and exponential hearsay
1117
01:05:05,109 --> 01:05:07,111
to the point that we don't know what's true,
1118
01:05:07,194 --> 01:05:08,946
no matter what issue we care about.
1119
01:05:15,161 --> 01:05:16,579
He discovers this.
1120
01:05:19,874 --> 01:05:21,292
Ben.
1121
01:05:26,130 --> 01:05:28,257
- Are you still on the team?
- Mm-hmm.
1122
01:05:30,384 --> 01:05:32,678
Okay, well,
I'm gonna get a snack before practice
1123
01:05:32,762 --> 01:05:34,430
if you... wanna come.
1124
01:05:35,640 --> 01:05:36,515
Hm?
1125
01:05:36,974 --> 01:05:38,601
You know, never mind.
1126
01:05:45,066 --> 01:05:47,526
Nine out of ten people are dissatisfied right now.
1127
01:05:47,610 --> 01:05:50,613
The EC is like any political movement in history, when you think about it.
1128
01:05:50,696 --> 01:05:54,492
We are standing up, and we are... we are standing up to this noise.
1129
01:05:54,575 --> 01:05:57,036
You are my people. I trust you guys.
1130
01:05:59,246 --> 01:06:02,583
- The Extreme Center content is brilliant.
- He absolutely loves it.
1131
01:06:02,667 --> 01:06:03,626
Running an auction.
1132
01:06:04,627 --> 01:06:08,547
840 bidders. He sold for 4.35 cents
to a weapons manufacturer.
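
The scene dramatizes real-time bidding: advertisers bid on a single user's attention, and the impression clears in milliseconds. A minimal sketch with invented bidders and prices; many real exchanges have used second-price-style rules, which is what this shows.

```python
# Invented bids, in dollars per impression.
bids = {"weapons_mfr": 0.0435, "soda_brand": 0.0412, "car_dealer": 0.0388}

winner = max(bids, key=bids.get)
clearing_price = sorted(bids.values())[-2]  # winner pays the runner-up's bid
print(f"{winner} wins the impression at ${clearing_price:.4f}")
```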
1133
01:06:08,631 --> 01:06:10,800
Let's promote some of these events.
1134
01:06:10,883 --> 01:06:13,511
Upcoming rallies in his geographic zone
later this week.
1135
01:06:13,594 --> 01:06:15,179
I've got a new vlogger lined up, too.
1136
01:06:17,765 --> 01:06:22,979
And... and, honestly, I'm telling you, I'm willing to do whatever it takes.
1137
01:06:23,062 --> 01:06:24,939
And I mean whatever.
1138
01:06:32,154 --> 01:06:33,197
- Subscribe...
- Ben?
1139
01:06:33,280 --> 01:06:35,908
...and also come back because I'm telling you, yo...
1140
01:06:35,992 --> 01:06:38,869
...I got some real big things comin'.
1141
01:06:38,953 --> 01:06:40,162
Some real big things.
1142
01:06:40,788 --> 01:06:45,292
One of the problems with Facebook
is that, as a tool of persuasion,
1143
01:06:45,793 --> 01:06:47,920
it may be the greatest thing ever created.
1144
01:06:48,004 --> 01:06:52,508
Now, imagine what that means in the hands
of a dictator or an authoritarian.
1145
01:06:53,718 --> 01:06:57,638
If you want to control the population
of your country,
1146
01:06:57,722 --> 01:07:01,308
there has never been a tool
as effective as Facebook.
1147
01:07:04,937 --> 01:07:07,398
Some of the most troubling implications
1148
01:07:07,481 --> 01:07:10,985
of governments and other bad actors
weaponizing social media,
1149
01:07:11,235 --> 01:07:13,612
um, is that it has led
to real, offline harm.
1150
01:07:13,696 --> 01:07:15,072
I think the most prominent example
1151
01:07:15,156 --> 01:07:17,658
that's gotten a lot of press
is what's happened in Myanmar.
1152
01:07:19,243 --> 01:07:21,203
In Myanmar,
when people think of the Internet,
1153
01:07:21,287 --> 01:07:22,913
what they are thinking about is Facebook.
1154
01:07:22,997 --> 01:07:25,916
And what often happens is
when people buy their cell phone,
1155
01:07:26,000 --> 01:07:29,920
the cell phone shop owner will actually
preload Facebook on there for them
1156
01:07:30,004 --> 01:07:31,505
and open an account for them.
1157
01:07:31,589 --> 01:07:34,884
And so when people get their phone,
the first thing they open
1158
01:07:34,967 --> 01:07:37,595
and the only thing they know how to open
is Facebook.
1159
01:07:38,179 --> 01:07:41,891
Well, a new bombshell investigation
exposes Facebook's growing struggle
1160
01:07:41,974 --> 01:07:43,809
to tackle hate speech in Myanmar.
1161
01:07:46,103 --> 01:07:49,190
Facebook really gave the military
and other bad actors
1162
01:07:49,273 --> 01:07:51,776
a new way to manipulate public opinion
1163
01:07:51,859 --> 01:07:55,529
and to help incite violence
against the Rohingya Muslims
1164
01:07:55,613 --> 01:07:57,406
that included mass killings,
1165
01:07:58,115 --> 01:07:59,867
burning of entire villages,
1166
01:07:59,950 --> 01:08:03,704
mass rape, and other serious crimes
against humanity
1167
01:08:03,788 --> 01:08:04,955
that have now led
1168
01:08:05,039 --> 01:08:08,209
to 700,000 Rohingya Muslims
having to flee the country.
1169
01:08:11,170 --> 01:08:14,799
It's not
that highly motivated propagandists
1170
01:08:14,882 --> 01:08:16,550
haven't existed before.
1171
01:08:16,634 --> 01:08:19,761
It's that the platforms make it possible
1172
01:08:19,845 --> 01:08:23,724
to spread manipulative narratives
with phenomenal ease,
1173
01:08:23,808 --> 01:08:25,434
and without very much money.
1174
01:08:25,518 --> 01:08:27,812
If I want to manipulate an election,
1175
01:08:27,895 --> 01:08:30,564
I can now go into
a conspiracy theory group on Facebook,
1176
01:08:30,648 --> 01:08:32,233
and I can find 100 people
1177
01:08:32,316 --> 01:08:34,443
who believe
that the Earth is completely flat
1178
01:08:34,859 --> 01:08:37,779
and think it's all this conspiracy theory
that we landed on the moon,
1179
01:08:37,863 --> 01:08:41,450
and I can tell Facebook,
"Give me 1,000 users who look like that."
1180
01:08:42,118 --> 01:08:46,080
Facebook will happily send me
thousands of users that look like them
1181
01:08:46,162 --> 01:08:49,250
that I can now hit
with more conspiracy theories.
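
A minimal sketch of that lookalike expansion, with invented feature vectors: the seed users define a centroid, and the nearest non-seed users are returned as the expanded audience.

```python
def lookalike_audience(seed, population, k):
    # Centroid of the seed users' (invented) feature vectors.
    dims = len(next(iter(seed.values())))
    centroid = [sum(u[d] for u in seed.values()) / len(seed)
                for d in range(dims)]

    def distance(uid):
        return sum((population[uid][d] - centroid[d]) ** 2
                   for d in range(dims))

    candidates = [uid for uid in population if uid not in seed]
    return sorted(candidates, key=distance)[:k]

seed = {"u1": [0.9, 0.8], "u2": [0.8, 0.9]}   # e.g. flat-earth engagers
population = {**seed,
              "u3": [0.85, 0.9], "u4": [0.1, 0.2], "u5": [0.9, 0.7]}
print(lookalike_audience(seed, population, k=2))  # ['u3', 'u5']
```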
1182
01:08:50,376 --> 01:08:53,087
Sold for 3.4 cents an impression.
1183
01:08:53,379 --> 01:08:56,381
- New EC video to promote.
- Another ad teed up.
1184
01:08:58,509 --> 01:09:00,928
Algorithms
and manipulative politicians
1185
01:09:01,011 --> 01:09:02,138
are becoming so expert
1186
01:09:02,220 --> 01:09:04,055
at learning how to trigger us,
1187
01:09:04,140 --> 01:09:08,352
getting so good at creating fake news
that we absorb as if it were reality,
1188
01:09:08,435 --> 01:09:10,813
and confusing us into believing
those lies.
1189
01:09:10,895 --> 01:09:12,606
It's as though we have
less and less control
1190
01:09:12,689 --> 01:09:14,149
over who we are and what we believe.
1191
01:09:31,375 --> 01:09:32,835
...so they can pick sides.
1192
01:09:32,917 --> 01:09:34,879
There's lies here,and there's lies over there.
1193
01:09:34,962 --> 01:09:36,337
So they can keep the power,
1194
01:09:36,421 --> 01:09:39,966
so they can control everything.
1195
01:09:40,050 --> 01:09:42,553
They can control our minds,
1196
01:09:42,636 --> 01:09:46,390
so that they can keep their secrets.
1197
01:09:48,517 --> 01:09:50,895
Imagine a world
where no one believes anything true.
1198
01:09:52,897 --> 01:09:55,649
Everyone believes
the government's lying to them.
1199
01:09:56,317 --> 01:09:58,444
Everything is a conspiracy theory.
1200
01:09:58,527 --> 01:10:01,197
"I shouldn't trust anyone.
I hate the other side."
1201
01:10:01,280 --> 01:10:02,698
That's where all this is heading.
1202
01:10:02,781 --> 01:10:06,160
The political earthquakes in Europe
continue to rumble.
1203
01:10:06,243 --> 01:10:08,412
This time, in Italy and Spain.
1204
01:10:08,495 --> 01:10:11,999
Overall, Europe's traditional, centrist coalition lost its majority
1205
01:10:12,082 --> 01:10:15,002
while far right and far left populist parties made gains.
1206
01:10:19,757 --> 01:10:20,591
Back up.
1207
01:10:21,300 --> 01:10:22,509
Okay, let's go.
1208
01:10:28,390 --> 01:10:31,268
These accounts were deliberately, specifically attempting
1209
01:10:31,352 --> 01:10:34,355
to sow political discord in Hong Kong.
1210
01:10:38,609 --> 01:10:40,361
All right, Ben.
1211
01:10:42,863 --> 01:10:45,032
What does it look like to be a country
1212
01:10:45,115 --> 01:10:48,410
whose entire diet is Facebook
and social media?
1213
01:10:48,953 --> 01:10:50,871
Democracy crumbled quickly.
1214
01:10:50,955 --> 01:10:51,830
Six months.
1215
01:10:51,914 --> 01:10:53,791
After that chaos in Chicago,
1216
01:10:53,874 --> 01:10:57,086
violent clashes between protesters and supporters...
1217
01:10:58,003 --> 01:11:01,632
Democracy is facing a crisis of confidence.
1218
01:11:01,715 --> 01:11:04,343
What we're seeing is a global assault
on democracy.
1219
01:11:05,511 --> 01:11:07,930
Most of the countries
that are targeted are countries
1220
01:11:08,013 --> 01:11:09,723
that run democratic elections.
1221
01:11:10,641 --> 01:11:12,518
This is happening at scale.
1222
01:11:12,601 --> 01:11:15,562
By state actors,
by people with millions of dollars saying,
1223
01:11:15,646 --> 01:11:18,524
"I wanna destabilize Kenya.
I wanna destabilize Cameroon.
1224
01:11:18,607 --> 01:11:20,651
Oh, Angola? That only costs this much."
1225
01:11:20,734 --> 01:11:23,362
An extraordinary election took place Sunday in Brazil.
1226
01:11:23,445 --> 01:11:25,823
With a campaign that's been powered
by social media.
1227
01:11:31,036 --> 01:11:33,956
We in the tech industry
have created the tools
1228
01:11:34,039 --> 01:11:37,418
to destabilize
and erode the fabric of society
1229
01:11:37,501 --> 01:11:40,254
in every country, all at once, everywhere.
1230
01:11:40,337 --> 01:11:44,508
You have this in Germany, Spain, France,
Brazil, Australia.
1231
01:11:44,591 --> 01:11:47,261
Some of the most "developed nations"
in the world
1232
01:11:47,344 --> 01:11:49,221
are now imploding on each other,
1233
01:11:49,305 --> 01:11:50,931
and what do they have in common?
1234
01:11:51,974 --> 01:11:52,975
Knowing what you know now,
1235
01:11:53,058 --> 01:11:56,312
do you believe Facebook impacted
the results of the 2016 election?
1236
01:11:56,770 --> 01:11:58,814
Oh, that's... that is hard.
1237
01:11:58,897 --> 01:12:00,691
You know, it's... the...
1238
01:12:01,275 --> 01:12:04,653
the reality is, well, there
were so many different forces at play.
1239
01:12:04,737 --> 01:12:07,865
Representatives from Facebook, Twitter,
and Google are back on Capitol Hill
1240
01:12:07,948 --> 01:12:09,450
for a second day of testimony
1241
01:12:09,533 --> 01:12:12,578
about Russia's interference
in the 2016 election.
1242
01:12:12,661 --> 01:12:17,291
The manipulation
by third parties is not a hack.
1243
01:12:18,500 --> 01:12:21,462
Right? The Russians didn't hack Facebook.
1244
01:12:21,545 --> 01:12:24,965
What they did was they used the tools
that Facebook created
1245
01:12:25,049 --> 01:12:27,843
for legitimate advertisers
and legitimate users,
1246
01:12:27,926 --> 01:12:30,346
and they applied it
to a nefarious purpose.
1247
01:12:32,014 --> 01:12:34,391
It's like remote-control warfare.
1248
01:12:34,475 --> 01:12:36,602
One country can manipulate another one
1249
01:12:36,685 --> 01:12:39,229
without actually invading
its physical borders.
1250
01:12:39,605 --> 01:12:42,232
We're seeing violent images. It appears to be a dumpster
1251
01:12:42,316 --> 01:12:43,317
being pushed around...
1252
01:12:43,400 --> 01:12:46,028
But it wasn't
about who you wanted to vote for.
1253
01:12:46,362 --> 01:12:50,574
It was about sowing total chaos
and division in society.
1254
01:12:50,657 --> 01:12:53,035
Now, this was in Huntington Beach. A march...
1255
01:12:53,118 --> 01:12:54,870
It's about making two sides
1256
01:12:54,953 --> 01:12:56,413
who couldn't hear each other anymore,
1257
01:12:56,497 --> 01:12:58,123
who didn't want to hear each other
anymore,
1258
01:12:58,207 --> 01:12:59,875
who didn't trust each other anymore.
1259
01:12:59,958 --> 01:13:03,212
This is a city where hatred was laid bare
1260
01:13:03,295 --> 01:13:05,464
and transformed into racial violence.
1261
01:13:20,145 --> 01:13:20,979
Ben!
1262
01:13:21,605 --> 01:13:22,439
Cassandra!
1263
01:13:22,981 --> 01:13:23,816
- Cass!
- Ben!
1264
01:13:23,899 --> 01:13:25,484
Come here! Come here!
1265
01:13:27,486 --> 01:13:31,156
Arms up. Arms up.
Get down on your knees. Now, down.
1266
01:13:36,120 --> 01:13:37,204
- Calm--
- Ben!
1267
01:13:37,287 --> 01:13:38,664
Hey! Hands up!
1268
01:13:39,623 --> 01:13:41,750
Turn around. On the ground. On the ground!
1269
01:13:56,723 --> 01:14:00,018
Do we want this system for sale
to the highest bidder?
1270
01:14:01,437 --> 01:14:05,399
For democracy to be completely for sale,
where you can reach any mind you want,
1271
01:14:05,482 --> 01:14:09,069
target a lie to that specific population,
and create culture wars?
1272
01:14:09,236 --> 01:14:10,237
Do we want that?
1273
01:14:14,700 --> 01:14:16,577
We are a nation of people...
1274
01:14:16,952 --> 01:14:18,871
that no longer speak to each other.
1275
01:14:19,872 --> 01:14:23,000
We are a nation of people
who have stopped being friends with people
1276
01:14:23,083 --> 01:14:25,461
because of who they voted for
in the last election.
1277
01:14:25,878 --> 01:14:28,422
We are a nation of people
who have isolated ourselves
1278
01:14:28,505 --> 01:14:30,966
to only watch channels
that tell us that we're right.
1279
01:14:32,259 --> 01:14:36,597
My message here today is that tribalism
is ruining us.
1280
01:14:37,347 --> 01:14:39,183
It is tearing our country apart.
1281
01:14:40,267 --> 01:14:42,811
It is no way for sane adults to act.
1282
01:14:43,187 --> 01:14:45,314
If everyone's entitled to their own facts,
1283
01:14:45,397 --> 01:14:49,401
there's really no need for compromise,
no need for people to come together.
1284
01:14:49,485 --> 01:14:51,695
In fact, there's really no need
for people to interact.
1285
01:14:52,321 --> 01:14:53,530
We need to have...
1286
01:14:53,989 --> 01:14:58,410
some shared understanding of reality.
Otherwise, we aren't a country.
1287
01:14:58,952 --> 01:15:02,998
So, uh, long-term, the solution here is
to build more AI tools
1288
01:15:03,081 --> 01:15:08,128
that find patterns of people using
the services that no real person would do.
1289
01:15:08,212 --> 01:15:11,840
We are allowing the technologists
to frame this as a problem
1290
01:15:11,924 --> 01:15:13,884
that they're equipped to solve.
1291
01:15:15,135 --> 01:15:16,470
That is... That's a lie.
1292
01:15:17,679 --> 01:15:20,724
People talk about AI
as if it will know truth.
1293
01:15:21,683 --> 01:15:23,685
AI's not gonna solve these problems.
1294
01:15:24,269 --> 01:15:27,189
AI cannot solve the problem of fake news.
1295
01:15:28,649 --> 01:15:31,026
Google doesn't have the option of saying,
1296
01:15:31,109 --> 01:15:36,240
"Oh, is this conspiracy? Is this truth?"
Because they don't know what truth is.
1297
01:15:36,782 --> 01:15:37,783
They don't have a...
1298
01:15:37,908 --> 01:15:40,827
They don't have a proxy for truth
that's better than a click.
1299
01:15:41,870 --> 01:15:45,123
If we don't agree on what is true
1300
01:15:45,207 --> 01:15:47,584
or that there is such a thing as truth,
1301
01:15:48,293 --> 01:15:49,294
we're toast.
1302
01:15:49,753 --> 01:15:52,089
This is the problem
beneath other problems
1303
01:15:52,172 --> 01:15:54,424
because if we can't agree on what's true,
1304
01:15:55,092 --> 01:15:57,803
then we can't navigate
out of any of our problems.
1305
01:16:05,435 --> 01:16:07,729
We should suggest
Flat Earth Football Club.
1306
01:16:07,813 --> 01:16:10,566
Don't show him
sports updates. He doesn't engage.
1307
01:16:39,886 --> 01:16:42,764
A lot of people in Silicon Valley
subscribe to some kind of theory
1308
01:16:42,848 --> 01:16:45,142
that we're building
some global super brain,
1309
01:16:45,309 --> 01:16:48,020
and all of our users
are just interchangeable little neurons,
1310
01:16:48,103 --> 01:16:49,563
no one of which is important.
1311
01:16:50,230 --> 01:16:53,150
And it subjugates people
into this weird role
1312
01:16:53,233 --> 01:16:56,069
where you're just, like,
this little computing element
1313
01:16:56,153 --> 01:16:58,905
that we're programming
through our behavior manipulation
1314
01:16:58,989 --> 01:17:02,367
for the service of this giant brain,
and you don't matter.
1315
01:17:02,451 --> 01:17:04,911
You're not gonna get paid.
You're not gonna get acknowledged.
1316
01:17:04,995 --> 01:17:06,455
You don't have self-determination.
1317
01:17:06,538 --> 01:17:09,416
We'll sneakily just manipulate you
because you're a computing node,
1318
01:17:09,499 --> 01:17:12,336
so we need to program you 'cause that's
what you do with computing nodes.
1319
01:17:20,093 --> 01:17:21,845
Oh, man.
1320
01:17:21,928 --> 01:17:25,390
When you think about technology
and it being an existential threat,
1321
01:17:25,474 --> 01:17:28,060
you know, that's a big claim, and...
1322
01:17:29,603 --> 01:17:33,982
it's easy to then, in your mind, think,
"Okay, so, there I am with the phone...
1323
01:17:35,609 --> 01:17:37,235
scrolling, clicking, using it.
1324
01:17:37,319 --> 01:17:39,196
Like, where's the existential threat?
1325
01:17:40,280 --> 01:17:41,615
Okay, there's the supercomputer.
1326
01:17:41,698 --> 01:17:43,950
The other side of the screen,
pointed at my brain,
1327
01:17:44,409 --> 01:17:47,537
got me to watch one more video.
Where's the existential threat?"
1328
01:17:54,252 --> 01:17:57,631
It's not
about the technology
1329
01:17:57,714 --> 01:17:59,341
being the existential threat.
1330
01:18:03,679 --> 01:18:06,264
It's the technology's ability
1331
01:18:06,348 --> 01:18:09,476
to bring out the worst in society...
1332
01:18:09,559 --> 01:18:13,522
...and the worst in society
being the existential threat.
1333
01:18:18,819 --> 01:18:20,570
If technology creates...
1334
01:18:21,697 --> 01:18:23,115
mass chaos,
1335
01:18:23,198 --> 01:18:24,533
outrage, incivility,
1336
01:18:24,616 --> 01:18:26,326
lack of trust in each other,
1337
01:18:27,452 --> 01:18:30,414
loneliness, alienation, more polarization,
1338
01:18:30,706 --> 01:18:33,333
more election hacking, more populism,
1339
01:18:33,917 --> 01:18:36,962
more distraction and inability
to focus on the real issues...
1340
01:18:37,963 --> 01:18:39,715
that's just society.
1341
01:18:40,340 --> 01:18:46,388
And now society
is incapable of healing itself
1342
01:18:46,471 --> 01:18:48,515
and just devolving into a kind of chaos.
1343
01:18:51,977 --> 01:18:54,938
This affects everyone,
even if you don't use these products.
1344
01:18:55,397 --> 01:18:57,524
These things have become
digital Frankensteins
1345
01:18:57,607 --> 01:19:00,068
that are terraforming the world
in their image,
1346
01:19:00,152 --> 01:19:01,862
whether it's the mental health of children
1347
01:19:01,945 --> 01:19:04,489
or our politics
and our political discourse,
1348
01:19:04,573 --> 01:19:07,492
without taking responsibility
for taking over the public square.
1349
01:19:07,576 --> 01:19:10,579
- So, again, it comes back to--
- And who do you think's responsible?
1350
01:19:10,662 --> 01:19:13,582
I think we have
to have the platforms be responsible
1351
01:19:13,665 --> 01:19:15,584
for when they take over
election advertising,
1352
01:19:15,667 --> 01:19:17,794
they're responsible
for protecting elections.
1353
01:19:17,878 --> 01:19:20,380
When they take over mental health of kids
or Saturday morning,
1354
01:19:20,464 --> 01:19:22,841
they're responsible
for protecting Saturday morning.
1355
01:19:23,592 --> 01:19:27,929
The race to keep people's attention
isn't going away.
1356
01:19:28,388 --> 01:19:31,850
Our technology's gonna become
more integrated into our lives, not less.
1357
01:19:31,933 --> 01:19:34,895
The AIs are gonna get better at predicting
what keeps us on the screen,
1358
01:19:34,978 --> 01:19:37,105
not worse at predicting
what keeps us on the screen.
1359
01:19:38,940 --> 01:19:42,027
I... I am 62 years old,
1360
01:19:42,110 --> 01:19:44,821
getting older every minute,
the more this conversation goes on...
1361
01:19:44,905 --> 01:19:48,033
...but... but I will tell you that, um...
1362
01:19:48,700 --> 01:19:52,370
I'm probably gonna be dead and gone,
and I'll probably be thankful for it,
1363
01:19:52,454 --> 01:19:54,331
when all this shit comes to fruition.
1364
01:19:54,790 --> 01:19:59,586
Because... Because I think
that this scares me to death.
1365
01:20:00,754 --> 01:20:03,048
Do... Do you...
Do you see it the same way?
1366
01:20:03,548 --> 01:20:06,885
Or am I overreacting to a situation
that I don't know enough about?
1367
01:20:09,805 --> 01:20:11,598
What are you most worried about?
1368
01:20:13,850 --> 01:20:18,480
I think,
in the... in the shortest time horizon...
1369
01:20:19,523 --> 01:20:20,524
civil war.
1370
01:20:24,444 --> 01:20:29,908
If we go down the current status quo
for, let's say, another 20 years...
1371
01:20:31,117 --> 01:20:34,579
we probably destroy our civilization
through willful ignorance.
1372
01:20:34,663 --> 01:20:37,958
We probably fail to meet the challenge
of climate change.
1373
01:20:38,041 --> 01:20:42,087
We probably degrade
the world's democracies
1374
01:20:42,170 --> 01:20:46,132
so that they fall into some sort
of bizarre autocratic dysfunction.
1375
01:20:46,216 --> 01:20:48,426
We probably ruin the global economy.
1376
01:20:48,760 --> 01:20:52,264
Uh, we probably, um, don't survive.
1377
01:20:52,347 --> 01:20:54,808
You know,
I... I really do view it as existential.
1378
01:21:02,524 --> 01:21:04,985
Is this the last generation of people
1379
01:21:05,068 --> 01:21:08,488
that are gonna know what it was like
before this illusion took place?
1380
01:21:11,074 --> 01:21:14,578
Like, how do you wake up from the matrix
when you don't know you're in the matrix?
1381
01:21:27,382 --> 01:21:30,635
A lot of what we're saying
sounds like it's just this...
1382
01:21:31,511 --> 01:21:33,680
one-sided doom and gloom.
1383
01:21:33,763 --> 01:21:36,808
Like, "Oh, my God,
technology's just ruining the world
1384
01:21:36,892 --> 01:21:38,059
and it's ruining kids,"
1385
01:21:38,143 --> 01:21:40,061
and it's like... "No."
1386
01:21:40,228 --> 01:21:44,065
It's confusing
because it's simultaneous utopia...
1387
01:21:44,608 --> 01:21:45,567
and dystopia.
1388
01:21:45,942 --> 01:21:50,447
Like, I could hit a button on my phone,
and a car shows up in 30 seconds,
1389
01:21:50,530 --> 01:21:52,699
and I can go exactly where I need to go.
1390
01:21:52,782 --> 01:21:55,660
That is magic. That's amazing.
1391
01:21:56,161 --> 01:21:57,662
When we were making the like button,
1392
01:21:57,746 --> 01:22:01,499
our entire motivation was, "Can we spread
positivity and love in the world?"
1393
01:22:01,583 --> 01:22:05,003
The idea that, fast-forward to today,
and teens would be getting depressed
1394
01:22:05,086 --> 01:22:06,421
when they don't have enough likes,
1395
01:22:06,504 --> 01:22:08,632
or it could be leading
to political polarization
1396
01:22:08,715 --> 01:22:09,883
was nowhere on our radar.
1397
01:22:09,966 --> 01:22:12,135
I don't think these guys set out
to be evil.
1398
01:22:13,511 --> 01:22:15,764
It's just the business model
that has a problem.
1399
01:22:15,847 --> 01:22:20,226
You could shut down the service
and destroy whatever it is--
1400
01:22:20,310 --> 01:22:24,522
$20 billion of shareholder value--
and get sued and...
1401
01:22:24,606 --> 01:22:27,108
But you can't, in practice,
put the genie back in the bottle.
1402
01:22:27,192 --> 01:22:30,403
You can make some tweaks,
but at the end of the day,
1403
01:22:30,487 --> 01:22:34,032
you've gotta grow revenue and usage,
quarter over quarter. It's...
1404
01:22:34,658 --> 01:22:37,535
The bigger it gets,
the harder it is for anyone to change.
1405
01:22:38,495 --> 01:22:43,458
What I see is a bunch of people
who are trapped by a business model,
1406
01:22:43,541 --> 01:22:46,169
an economic incentive,
and shareholder pressure
1407
01:22:46,252 --> 01:22:48,922
that makes it almost impossible
to do something else.
1408
01:22:49,005 --> 01:22:50,924
I think we need to accept that it's okay
1409
01:22:51,007 --> 01:22:53,176
for companies to be focused
on making money.
1410
01:22:53,259 --> 01:22:55,637
What's not okay
is when there's no regulation, no rules,
1411
01:22:55,720 --> 01:22:56,888
and no competition,
1412
01:22:56,972 --> 01:23:00,850
and the companies are acting
as sort of de facto governments.
1413
01:23:00,934 --> 01:23:03,353
And then they're saying,
"Well, we can regulate ourselves."
1414
01:23:03,436 --> 01:23:05,981
I mean, that's just a lie.
That's just ridiculous.
1415
01:23:06,064 --> 01:23:08,650
Financial incentives kind of run
the world,
1416
01:23:08,733 --> 01:23:12,529
so any solution to this problem
1417
01:23:12,612 --> 01:23:15,573
has to realign the financial incentives.
1418
01:23:16,074 --> 01:23:18,785
There's no fiscal reason
for these companies to change.
1419
01:23:18,868 --> 01:23:21,329
And that is why I think
we need regulation.
1420
01:23:21,413 --> 01:23:24,290
The phone company
has tons of sensitive data about you,
1421
01:23:24,374 --> 01:23:27,544
and we have a lot of laws that make sure
they don't do the wrong things.
1422
01:23:27,627 --> 01:23:31,506
We have almost no laws
around digital privacy, for example.
1423
01:23:31,589 --> 01:23:34,426
We could tax data collection
and processing
1424
01:23:34,509 --> 01:23:37,554
the same way that you, for example,
pay your water bill
1425
01:23:37,637 --> 01:23:39,723
by monitoring the amount of water
that you use.
1426
01:23:39,806 --> 01:23:43,226
You tax these companies on the data assets
that they have.
1427
01:23:43,309 --> 01:23:44,769
It gives them a fiscal reason
1428
01:23:44,853 --> 01:23:47,856
to not acquire every piece of data
on the planet.
1429
01:23:47,939 --> 01:23:50,567
The law runs way behind on these things,
1430
01:23:50,650 --> 01:23:55,864
but what I know is the current situation
exists not for the protection of users,
1431
01:23:55,947 --> 01:23:58,700
but for the protection
of the rights and privileges
1432
01:23:58,783 --> 01:24:01,453
of these gigantic,
incredibly wealthy companies.
1433
01:24:02,245 --> 01:24:05,832
Are we always gonna defer to the richest,
most powerful people?
1434
01:24:05,915 --> 01:24:07,417
Or are we ever gonna say,
1435
01:24:07,959 --> 01:24:12,047
"You know, there are times
when there is a national interest.
1436
01:24:12,130 --> 01:24:15,592
There are times
when the interests of people, of users,
1437
01:24:15,675 --> 01:24:17,385
are actually more important
1438
01:24:18,011 --> 01:24:21,473
than the profits of somebody
who's already a billionaire"?
1439
01:24:21,556 --> 01:24:26,603
These markets undermine democracy,
and they undermine freedom,
1440
01:24:26,686 --> 01:24:28,521
and they should be outlawed.
1441
01:24:29,147 --> 01:24:31,816
This is not a radical proposal.
1442
01:24:31,900 --> 01:24:34,194
There are other markets that we outlaw.
1443
01:24:34,277 --> 01:24:36,988
We outlaw markets in human organs.
1444
01:24:37,072 --> 01:24:39,491
We outlaw markets in human slaves.
1445
01:24:39,949 --> 01:24:44,037
Because they have
inevitable destructive consequences.
1446
01:24:44,537 --> 01:24:45,830
We live in a world
1447
01:24:45,914 --> 01:24:50,001
in which a tree is worth more,
financially, dead than alive,
1448
01:24:50,085 --> 01:24:53,838
in a world in which a whale
is worth more dead than alive.
1449
01:24:53,922 --> 01:24:56,341
For so long as our economy works
in that way
1450
01:24:56,424 --> 01:24:58,134
and corporations go unregulated,
1451
01:24:58,218 --> 01:25:00,678
they're going to continue
to destroy trees,
1452
01:25:00,762 --> 01:25:01,763
to kill whales,
1453
01:25:01,846 --> 01:25:06,101
to mine the earth, and to continue
to pull oil out of the ground,
1454
01:25:06,184 --> 01:25:08,394
even though we know
it is destroying the planet
1455
01:25:08,478 --> 01:25:12,148
and we know that it's going to leave
a worse world for future generations.
1456
01:25:12,232 --> 01:25:13,858
This is short-term thinking
1457
01:25:13,942 --> 01:25:16,694
based on this religion of profit
at all costs,
1458
01:25:16,778 --> 01:25:20,156
as if somehow, magically, each corporation
acting in its selfish interest
1459
01:25:20,240 --> 01:25:21,950
is going to produce the best result.
1460
01:25:22,033 --> 01:25:24,494
This has been affecting the environment
for a long time.
1461
01:25:24,577 --> 01:25:27,288
What's frightening,
and what hopefully is the last straw
1462
01:25:27,372 --> 01:25:29,207
that will make us wake up
as a civilization
1463
01:25:29,290 --> 01:25:31,709
to how flawed this theory has been
in the first place
1464
01:25:31,793 --> 01:25:35,004
is to see that now we're the tree,
we're the whale.
1465
01:25:35,088 --> 01:25:37,048
Our attention can be mined.
1466
01:25:37,132 --> 01:25:39,134
We are more profitable to a corporation
1467
01:25:39,217 --> 01:25:41,594
if we're spending time
staring at a screen,
1468
01:25:41,678 --> 01:25:42,971
staring at an ad,
1469
01:25:43,054 --> 01:25:45,890
than if we're spending that time
living our life in a rich way.
1470
01:25:45,974 --> 01:25:47,559
And so, we're seeing the results of that.
1471
01:25:47,642 --> 01:25:50,687
We're seeing corporations using
powerful artificial intelligence
1472
01:25:50,770 --> 01:25:53,648
to outsmart us and figure out
how to pull our attention
1473
01:25:53,731 --> 01:25:55,358
toward the things they want us to look at,
1474
01:25:55,441 --> 01:25:57,277
rather than the things
that are most consistent
1475
01:25:57,360 --> 01:25:59,237
with our goals and our values
and our lives.
1476
01:26:05,535 --> 01:26:06,911
What a computer is,
1477
01:26:06,995 --> 01:26:10,290
is it's the most remarkable tool
that we've ever come up with.
1478
01:26:11,124 --> 01:26:13,877
And it's the equivalent of a bicycle
for our minds.
1479
01:26:15,628 --> 01:26:20,091
The idea of humane technology,
that's where Silicon Valley got its start.
1480
01:26:21,050 --> 01:26:25,722
And we've lost sight of it
because it became the cool thing to do,
1481
01:26:25,805 --> 01:26:27,265
as opposed to the right thing to do.
1482
01:26:27,348 --> 01:26:29,726
The Internet was, like,
a weird, wacky place.
1483
01:26:29,809 --> 01:26:31,394
It was experimental.
1484
01:26:31,477 --> 01:26:34,731
Creative things happened on the Internet,
and certainly, they do still,
1485
01:26:34,814 --> 01:26:38,610
but, like, it just feels like this,
like, giant mall.
1486
01:26:38,693 --> 01:26:42,071
You know, it's just like, "God,
there's gotta be...
1487
01:26:42,155 --> 01:26:44,157
there's gotta be more to it than that."
1488
01:26:46,659 --> 01:26:48,411
I guess I'm just an optimist.
1489
01:26:48,494 --> 01:26:52,040
'Cause I think we can change
what social media looks like and means.
1490
01:26:54,083 --> 01:26:56,711
The way the technology works
is not a law of physics.
1491
01:26:56,794 --> 01:26:57,921
It is not set in stone.
1492
01:26:58,004 --> 01:27:02,175
These are choices that human beings
like myself have been making.
1493
01:27:02,759 --> 01:27:05,345
And human beings can change
those technologies.
1494
01:27:06,971 --> 01:27:09,974
And the question now is
whether or not we're willing to admit
1495
01:27:10,475 --> 01:27:15,438
that those bad outcomes are coming
directly as a product of our work.
1496
01:27:21,027 --> 01:27:24,864
It's that we built these things,
and we have a responsibility to change it.
1497
01:27:37,085 --> 01:27:38,711
The attention extraction model
1498
01:27:38,795 --> 01:27:42,298
is not how we want to treat
human beings.
1499
01:27:45,343 --> 01:27:48,137
Is it just me or...
1500
01:27:49,722 --> 01:27:51,099
Poor sucker.
1501
01:27:51,516 --> 01:27:53,226
The fabric of a healthy society
1502
01:27:53,309 --> 01:27:56,145
depends on us getting off
this corrosive business model.
1503
01:28:04,696 --> 01:28:08,157
We can demand
that these products be designed humanely.
1504
01:28:09,409 --> 01:28:13,121
We can demand to not be treated
as an extractable resource.
1505
01:28:15,164 --> 01:28:18,334
"How do we make the world better?"
1506
01:28:20,336 --> 01:28:21,504
Throughout history,
1507
01:28:21,587 --> 01:28:23,798
every single time
something's gotten better,
1508
01:28:23,881 --> 01:28:26,342
it's because somebody has come along
to say,
1509
01:28:26,426 --> 01:28:28,428
"This is stupid. We can do better."
1510
01:28:29,178 --> 01:28:32,557
Like, it's the critics
that drive improvement.
1511
01:28:33,141 --> 01:28:35,393
It's the critics
who are the true optimists.
1512
01:28:37,020 --> 01:28:39,147
Hello.
1513
01:28:42,984 --> 01:28:44,277
Um...
1514
01:28:46,195 --> 01:28:47,697
I mean, it seems kind of crazy, right?
1515
01:28:47,780 --> 01:28:51,534
It's like the fundamental way
that this stuff is designed...
1516
01:28:52,994 --> 01:28:55,163
isn't going in a good direction.
1517
01:28:55,246 --> 01:28:56,873
Like, the entire thing.
1518
01:28:56,956 --> 01:29:00,626
So, it sounds crazy to say
we need to change all that,
1519
01:29:01,169 --> 01:29:02,670
but that's what we need to do.
1520
01:29:04,297 --> 01:29:05,923
Think we're gonna get there?
1521
01:29:07,383 --> 01:29:08,301
We have to.
1522
01:29:20,646 --> 01:29:24,942
Um,
it seems like you're very optimistic.
1523
01:29:26,194 --> 01:29:27,570
Is that how I sound?
1524
01:29:27,653 --> 01:29:28,905
Yeah, I mean...
1525
01:29:28,988 --> 01:29:31,449
I can't believe you keep saying that,
because I'm like, "Really?
1526
01:29:31,532 --> 01:29:33,409
I feel like we're headed toward dystopia.
1527
01:29:33,493 --> 01:29:35,328
I feel like we're on the fast track
to dystopia,
1528
01:29:35,411 --> 01:29:37,830
and it's gonna take a miracle
to get us out of it."
1529
01:29:37,914 --> 01:29:40,291
And that miracle is, of course,
collective will.
1530
01:29:41,000 --> 01:29:44,587
I am optimistic
that we're going to figure it out,
1531
01:29:44,670 --> 01:29:47,048
but I think it's gonna take a long time.
1532
01:29:47,131 --> 01:29:50,385
Because not everybody recognizes
that this is a problem.
1533
01:29:50,468 --> 01:29:55,890
I think one of the big failures
in technology today
1534
01:29:55,973 --> 01:29:58,643
is a real failure of leadership,
1535
01:29:58,726 --> 01:30:01,979
of, like, people coming out
and having these open conversations
1536
01:30:02,063 --> 01:30:05,900
about things that... not just
what went well, but what isn't perfect
1537
01:30:05,983 --> 01:30:08,194
so that someone can come in
and build something new.
1538
01:30:08,277 --> 01:30:10,321
At the end of the day, you know,
1539
01:30:10,405 --> 01:30:14,617
this machine isn't gonna turn around
until there's massive public pressure.
1540
01:30:14,700 --> 01:30:18,329
By having these conversations
and... and voicing your opinion,
1541
01:30:18,413 --> 01:30:21,082
in some cases
through these very technologies,
1542
01:30:21,165 --> 01:30:24,252
we can start to change the tide.
We can start to change the conversation.
1543
01:30:24,335 --> 01:30:27,004
It might sound strange,
but it's my world. It's my community.
1544
01:30:27,088 --> 01:30:29,632
I don't hate them. I don't wanna do
any harm to Google or Facebook.
1545
01:30:29,715 --> 01:30:32,885
I just want to reform them
so they don't destroy the world. You know?
1546
01:30:32,969 --> 01:30:35,513
I've uninstalled a ton of apps
from my phone
1547
01:30:35,596 --> 01:30:37,723
that I felt were just wasting my time.
1548
01:30:37,807 --> 01:30:40,685
All the social media apps,
all the news apps,
1549
01:30:40,768 --> 01:30:42,520
and I've turned off notifications
1550
01:30:42,603 --> 01:30:45,815
on anything that was vibrating my leg
with information
1551
01:30:45,898 --> 01:30:48,943
that wasn't timely and important to me
right now.
1552
01:30:49,026 --> 01:30:51,279
It's for the same reason
I don't keep cookies in my pocket.
1553
01:30:51,362 --> 01:30:53,197
Reduce the number of notifications
you get.
1554
01:30:53,281 --> 01:30:54,449
Turn off notifications.
1555
01:30:54,532 --> 01:30:55,950
Turning off all notifications.
1556
01:30:56,033 --> 01:30:58,536
I'm not using Google anymore,
I'm using Qwant,
1557
01:30:58,619 --> 01:31:01,497
which doesn't store your search history.
1558
01:31:01,581 --> 01:31:04,459
Never accept a video recommended to you
on YouTube.
1559
01:31:04,542 --> 01:31:07,003
Always choose.
That's another way to fight.
1560
01:31:07,086 --> 01:31:12,133
There are tons of Chrome extensions
that remove recommendations.
1561
01:31:12,216 --> 01:31:15,178
You're recommending
something to undo what you made.
1562
01:31:15,261 --> 01:31:16,554
Yep.
1563
01:31:16,929 --> 01:31:21,642
Before you share, fact-check,
consider the source, do that extra Google.
1564
01:31:21,726 --> 01:31:25,104
If it seems like it's something designed
to really push your emotional buttons,
1565
01:31:25,188 --> 01:31:26,314
like, it probably is.
1566
01:31:26,397 --> 01:31:29,025
Essentially, you vote with your clicks.
1567
01:31:29,108 --> 01:31:30,359
If you click on clickbait,
1568
01:31:30,443 --> 01:31:33,779
you're creating a financial incentive
that perpetuates this existing system.
1569
01:31:33,863 --> 01:31:36,949
Make sure that you get
lots of different kinds of information
1570
01:31:37,033 --> 01:31:37,909
in your own life.
1571
01:31:37,992 --> 01:31:40,995
I follow people on Twitter
that I disagree with
1572
01:31:41,078 --> 01:31:44,207
because I want to be exposed
to different points of view.
1573
01:31:44,665 --> 01:31:46,584
Notice that many people
in the tech industry
1574
01:31:46,667 --> 01:31:49,045
don't give these devices
to their own children.
1575
01:31:49,128 --> 01:31:51,047
My kids don't use social media at all.
1576
01:31:51,839 --> 01:31:53,549
Is that a rule,
or is that a...
1577
01:31:53,633 --> 01:31:54,509
That's a rule.
1578
01:31:55,092 --> 01:31:57,845
We are zealots about it.
1579
01:31:57,929 --> 01:31:59,222
We're... We're crazy.
1580
01:31:59,305 --> 01:32:05,603
And we don't let our kids have
really any screen time.
1581
01:32:05,686 --> 01:32:08,564
I've worked out
what I think are three simple rules, um,
1582
01:32:08,648 --> 01:32:12,610
that make life a lot easier for families
and that are justified by the research.
1583
01:32:12,693 --> 01:32:15,571
So, the first rule is
all devices out of the bedroom
1584
01:32:15,655 --> 01:32:17,281
at a fixed time every night.
1585
01:32:17,365 --> 01:32:20,535
Whatever the time is, half an hour
before bedtime, all devices out.
1586
01:32:20,618 --> 01:32:24,038
The second rule is no social media
until high school.
1587
01:32:24,121 --> 01:32:26,374
Personally, I think the age should be 16.
1588
01:32:26,457 --> 01:32:28,960
Middle school's hard enough.
Keep it out until high school.
1589
01:32:29,043 --> 01:32:32,964
And the third rule is
work out a time budget with your kid.
1590
01:32:33,047 --> 01:32:34,757
And if you talk with them and say,
1591
01:32:34,840 --> 01:32:37,927
"Well, how many hours a day
do you wanna spend on your device?
1592
01:32:38,010 --> 01:32:39,637
What do you think is a good amount?"
1593
01:32:39,720 --> 01:32:41,597
They'll often say
something pretty reasonable.
1594
01:32:42,056 --> 01:32:44,642
Well, look, I know perfectly well
1595
01:32:44,725 --> 01:32:48,563
that I'm not gonna get everybody
to delete their social media accounts,
1596
01:32:48,646 --> 01:32:50,439
but I think I can get a few.
1597
01:32:50,523 --> 01:32:54,402
And just getting a few people
to delete their accounts matters a lot,
1598
01:32:54,485 --> 01:32:58,406
and the reason why is that that creates
the space for a conversation
1599
01:32:58,489 --> 01:33:00,908
because I want there to be enough people
out in the society
1600
01:33:00,992 --> 01:33:05,204
who are free of the manipulation engines
to have a societal conversation
1601
01:33:05,288 --> 01:33:07,540
that isn't bounded
by the manipulation engines.
1602
01:33:07,623 --> 01:33:10,126
So, do it! Get out of the system.
1603
01:33:10,209 --> 01:33:12,503
Yeah, delete. Get off the stupid stuff.
1604
01:33:13,546 --> 01:33:16,507
The world's beautiful.
Look. Look, it's great out there.