1
00:00:10,454 --> 00:00:13,610
In 2014, Facebook scientists
published the results
2
00:00:13,634 --> 00:00:17,293
of one of the biggest psychological
experiments ever conducted.
3
00:00:21,533 --> 00:00:25,009
They took almost 700,000
Facebook profiles
4
00:00:25,189 --> 00:00:27,212
and deliberately skewed
their news feeds
5
00:00:27,252 --> 00:00:30,252
to be either more positive
or more negative.
6
00:00:33,267 --> 00:00:36,009
Then, they used the company's
sophisticated algorithms
7
00:00:36,025 --> 00:00:38,625
to measure any shift
in people's mood.
8
00:00:40,244 --> 00:00:43,177
Their findings?
The more positive the feed,
9
00:00:43,259 --> 00:00:45,992
the happier Facebook
users seemed to be.
10
00:00:46,283 --> 00:00:49,431
The more negative,
the more depressed they became.
11
00:00:50,055 --> 00:00:53,070
It proved the power of Facebook
to affect what we think
12
00:00:53,095 --> 00:00:54,587
and how we feel.
13
00:00:55,572 --> 00:00:57,907
Facebook has very cleverly
figured out
14
00:00:57,963 --> 00:01:00,430
how to wrap itself
around our lives.
15
00:01:00,486 --> 00:01:02,009
It's the family photo album.
16
00:01:02,259 --> 00:01:04,634
It's your messaging
to your friends.
17
00:01:05,064 --> 00:01:07,478
It's your daily diary.
It's your contact list.
18
00:01:07,503 --> 00:01:09,673
It's all of these things
wrapped around your life.
19
00:01:12,666 --> 00:01:15,173
This is the story of how
one of the world's biggest
20
00:01:15,189 --> 00:01:17,587
and most powerful
private corporations
21
00:01:17,737 --> 00:01:21,664
is turning our lives and
our data into vast profits,
22
00:01:21,774 --> 00:01:24,103
and in ways we have
no control over.
23
00:01:26,627 --> 00:01:30,532
They are the most
successful company
24
00:01:30,564 --> 00:01:35,415
arguably in human history
at just gathering people's time
25
00:01:35,447 --> 00:01:37,603
and turning that time
into money.
26
00:01:40,189 --> 00:01:41,220
Like his company,
27
00:01:41,245 --> 00:01:43,782
Facebook's founder
hardly needs introducing.
28
00:01:44,048 --> 00:01:46,497
Mark Zuckerberg started
the social media platform
29
00:01:46,552 --> 00:01:49,364
13 years ago when
he was just 19
30
00:01:49,388 --> 00:01:52,161
as a site for Harvard
undergraduate students.
31
00:01:52,590 --> 00:01:53,605
When we first launched,
32
00:01:53,630 --> 00:01:56,563
we were hoping for maybe
400 to 500 people.
33
00:01:56,856 --> 00:01:58,145
Harvard didn't have a Facebook
34
00:01:58,170 --> 00:02:00,786
so that was the gap
that we were trying to fill
35
00:02:00,817 --> 00:02:02,379
and now we're at
100,000 people
36
00:02:02,404 --> 00:02:04,380
so who knows where
we are going next.
37
00:02:07,232 --> 00:02:08,765
Within its first month,
38
00:02:08,794 --> 00:02:11,424
more than half the students
had joined, setting the trend
39
00:02:11,449 --> 00:02:13,590
for the membership
explosion that followed.
40
00:02:14,216 --> 00:02:17,871
Now, almost a quarter of the
world's population has signed on.
41
00:02:18,113 --> 00:02:20,574
It is bigger than any country.
42
00:02:23,349 --> 00:02:26,270
Facebook is now
a global colossus.
43
00:02:26,434 --> 00:02:28,958
It is one of the world's
most valuable corporations,
44
00:02:28,983 --> 00:02:31,544
worth over 400 billion dollars.
45
00:02:34,325 --> 00:02:37,434
Mark Zuckerberg is an
international powerbroker
46
00:02:37,458 --> 00:02:38,809
in his own right.
47
00:02:42,052 --> 00:02:45,059
He's like a king, right?
He's like a monarch.
48
00:02:45,138 --> 00:02:48,880
He's making decisions
about your life on Facebook,
49
00:02:48,966 --> 00:02:52,567
what the rules are,
and he's a benevolent dictator.
50
00:02:54,497 --> 00:02:57,497
You can't say this is
accountable governance
51
00:02:57,560 --> 00:03:02,512
or a participatory governance
in any particular way.
52
00:03:04,825 --> 00:03:08,958
But almost two billion users
still isn't enough for Facebook.
53
00:03:09,193 --> 00:03:11,942
Mark Zuckerberg is aiming
for the next billion.
54
00:03:14,036 --> 00:03:16,192
There is a limit to how
much they can grow
55
00:03:16,216 --> 00:03:19,442
in established markets like
North America and Australia.
56
00:03:19,458 --> 00:03:21,294
But the 32-year-old businessman
57
00:03:21,310 --> 00:03:24,184
sees huge potential
in the developing world.
58
00:03:26,313 --> 00:03:28,579
There are a billion
people in India
59
00:03:28,607 --> 00:03:30,961
who do not have access
to the internet yet
60
00:03:31,525 --> 00:03:35,040
and if what you care about is
connecting everyone in the world
61
00:03:35,235 --> 00:03:38,500
then you can't do that when
there are so many people
62
00:03:38,540 --> 00:03:41,368
who don't even have access
to basic connectivity.
63
00:03:41,797 --> 00:03:43,407
There's a term that's being used
64
00:03:43,432 --> 00:03:46,407
by folks connected to
Facebook and Google
65
00:03:46,524 --> 00:03:49,266
called the last billion where
they're basically trying
66
00:03:49,291 --> 00:03:51,696
to figure out a way to
spread internet access,
67
00:03:51,721 --> 00:03:53,704
but the internet that
they're going to spread
68
00:03:53,766 --> 00:03:57,954
is an internet that's shaped by
Facebook and Facebook's agendas.
69
00:03:58,024 --> 00:04:01,329
That's actually part
of the long game here.
70
00:04:01,376 --> 00:04:04,586
Most people in a lot
of the developing world
71
00:04:05,594 --> 00:04:08,368
are accessing the internet
through their mobile phones
72
00:04:08,399 --> 00:04:12,602
and there are these programs
that are known as zero rating
73
00:04:12,649 --> 00:04:17,836
or Facebook Zero so when
you get your smartphone,
74
00:04:18,055 --> 00:04:22,125
you get free data if
you're using Facebook
75
00:04:22,149 --> 00:04:24,118
and so people stay on Facebook.
76
00:04:24,196 --> 00:04:25,996
They don't go anywhere else,
77
00:04:26,625 --> 00:04:31,632
so that their whole world
on the internet becomes
78
00:04:31,667 --> 00:04:37,114
very much the same
as, you know...
79
00:04:37,149 --> 00:04:39,290
they don't know any
other kind of internet.
80
00:04:42,226 --> 00:04:44,154
Facebook is a free service,
81
00:04:44,256 --> 00:04:45,865
but that's because
Zuckerberg has learned
82
00:04:45,912 --> 00:04:49,623
how to turn our data
into dollars. Lots of dollars.
83
00:04:50,451 --> 00:04:54,459
Last year his company
earned 27 and a half billion US dollars,
84
00:04:54,561 --> 00:04:57,209
just under 16 dollars
for each user,
85
00:04:57,256 --> 00:05:00,029
and he's buying even more
internet real estate.
86
00:05:02,264 --> 00:05:05,597
Clearly there's one topic
we have to start with.
87
00:05:05,724 --> 00:05:08,435
You bought WhatsApp
for 19 billion dollars.
88
00:05:08,467 --> 00:05:10,076
Why did you do it
and what does it mean?
89
00:05:10,287 --> 00:05:11,842
You can look at
other messaging apps
90
00:05:11,866 --> 00:05:15,045
that are out there, whether it's
Kakao or Line or WeChat
91
00:05:15,083 --> 00:05:16,295
that are already monetizing
92
00:05:16,320 --> 00:05:18,272
at a rate of two to
three dollars per person
93
00:05:18,296 --> 00:05:20,795
with pretty early efforts
and I think that shows
94
00:05:20,842 --> 00:05:24,295
if we do a pretty good job
at helping WhatsApp to grow
95
00:05:24,320 --> 00:05:26,475
that is just going to be
a huge business.
96
00:05:26,835 --> 00:05:29,968
Facebook has WhatsApp,
Facebook has Instagram.
97
00:05:30,053 --> 00:05:33,873
Facebook has Oculus Rift,
not necessarily mainstream
98
00:05:33,898 --> 00:05:38,506
but these are very big
corners of the internet.
99
00:05:38,756 --> 00:05:42,014
Should we be concerned
about a monopoly?
100
00:05:42,139 --> 00:05:44,147
We should always be
concerned about monopoly.
101
00:05:44,178 --> 00:05:47,318
We should always be concerned
about concentration of power.
102
00:05:47,835 --> 00:05:49,904
We should always be
concerned about that
103
00:05:50,342 --> 00:05:53,287
and we need to hold their
feet to the fire at all times.
104
00:05:55,459 --> 00:05:57,303
Facebook is all about
community,
105
00:05:57,342 --> 00:06:00,436
what people all around the world
are coming together to do,
106
00:06:00,478 --> 00:06:02,100
connecting with friends
and family,
107
00:06:02,171 --> 00:06:04,014
informing these communities.
108
00:06:05,811 --> 00:06:08,647
Facebook presents itself
as a digital platform,
109
00:06:08,796 --> 00:06:11,522
a neutral stage upon
which life plays out.
110
00:06:12,014 --> 00:06:15,108
2016: we all went
through it together.
111
00:06:16,256 --> 00:06:19,443
It says it is a company that
develops digital technology,
112
00:06:19,577 --> 00:06:21,217
not social engineering.
113
00:06:22,717 --> 00:06:24,600
For all the talk
about community,
114
00:06:24,694 --> 00:06:27,894
Facebook is neither democratic
nor transparent.
115
00:06:30,717 --> 00:06:34,475
Any place we go to
that is not truly open,
116
00:06:34,499 --> 00:06:36,748
that's not governed
by us as users,
117
00:06:36,788 --> 00:06:40,162
that's not governed by
some democratic accountability,
118
00:06:40,350 --> 00:06:44,006
is actually a place
that is not truly ours.
119
00:06:44,038 --> 00:06:45,490
It's a place that we can use,
120
00:06:45,515 --> 00:06:47,420
it provides great value
in many ways,
121
00:06:47,445 --> 00:06:49,959
don't get me wrong,
to its users.
122
00:06:49,991 --> 00:06:54,217
But it's incorrect to see it
as a neutral place.
123
00:06:54,600 --> 00:06:57,225
It can do things
like a government
124
00:06:57,241 --> 00:07:02,209
and indeed it has inherited
some government-like functions,
125
00:07:02,287 --> 00:07:04,014
but I don't think that passes
the smell test
126
00:07:04,039 --> 00:07:07,608
to imagine that Facebook
or any online platform
127
00:07:07,633 --> 00:07:09,490
is truly democratic,
they're not.
128
00:07:15,639 --> 00:07:18,839
If we tell the computer
to look at two numbers
129
00:07:18,881 --> 00:07:22,100
and compare them and put
the larger number on one side
130
00:07:22,125 --> 00:07:24,710
and the smaller one
on the other then,
131
00:07:24,741 --> 00:07:27,248
with a series of steps
we will be able to reorder it.
132
00:07:27,561 --> 00:07:29,654
To understand
how Facebook works,
133
00:07:29,694 --> 00:07:32,389
we need to understand
what goes on under the hood.
134
00:07:32,842 --> 00:07:35,897
The engine that drives the
system is built on algorithms,
135
00:07:35,960 --> 00:07:38,701
sets of instructions
that Facebook's engineers use
136
00:07:38,726 --> 00:07:41,139
to determine what we see
in our News Feed.
137
00:07:41,530 --> 00:07:44,795
Dr Suelette Dreyfus,
an information systems expert,
138
00:07:44,921 --> 00:07:47,897
demonstrates how
a basic algorithm works.
139
00:07:49,771 --> 00:07:53,983
Typically, an algorithm might be
for processing some data
140
00:07:53,999 --> 00:07:59,178
or doing some arithmetic,
summing something for example,
141
00:07:59,342 --> 00:08:02,194
or it might be
to try and recreate
142
00:08:02,218 --> 00:08:03,413
the decision-making process
143
00:08:03,429 --> 00:08:06,553
that we use in our human brain
on a more sophisticated level.
144
00:08:10,413 --> 00:08:12,835
Facebook's algorithms
were originally configured
145
00:08:12,859 --> 00:08:14,772
to help Harvard University
students
146
00:08:14,796 --> 00:08:16,171
stay in touch with one another.
147
00:08:16,866 --> 00:08:18,999
They exploited the way
the students had
148
00:08:19,046 --> 00:08:20,881
a small group of close friends,
149
00:08:20,921 --> 00:08:23,254
and a wider,
looser social circle.
150
00:08:24,093 --> 00:08:27,272
The algorithms are now
vastly more complex,
151
00:08:27,515 --> 00:08:28,881
but exactly how they work
152
00:08:28,906 --> 00:08:31,030
is a closely guarded
commercial secret.
153
00:08:31,491 --> 00:08:34,335
We do know that they are
designed with one aim in mind,
154
00:08:34,413 --> 00:08:36,858
to keep us online
for as long as possible.
155
00:08:39,765 --> 00:08:43,467
The algorithms are designed
to be helpful
156
00:08:43,492 --> 00:08:45,631
and give us information
that's relevant to us,
157
00:08:45,718 --> 00:08:47,217
but don't for a minute
assume that
158
00:08:47,242 --> 00:08:48,975
the algorithms are just there
to help us.
159
00:08:49,116 --> 00:08:53,249
The algorithms are there to make
a profit for Facebook.
160
00:08:55,093 --> 00:08:57,093
And that is Facebook's genius.
161
00:08:57,413 --> 00:09:00,420
It is a giant agency
that uses its platform
162
00:09:00,445 --> 00:09:02,147
to deliver us advertising.
163
00:09:08,445 --> 00:09:11,257
By tracking what we do,
who we associate with,
164
00:09:11,320 --> 00:09:12,987
what websites we look at,
165
00:09:13,140 --> 00:09:15,632
Facebook is able to make
sophisticated judgements
166
00:09:15,687 --> 00:09:17,354
about the stories we see,
167
00:09:17,609 --> 00:09:20,913
but also advertising that
is likely to move us to spend.
168
00:09:27,140 --> 00:09:28,835
We will probably
always live in a world
169
00:09:28,859 --> 00:09:30,640
with old-fashioned display ads.
170
00:09:30,749 --> 00:09:33,374
Times Square simply wouldn't be
the same without them.
171
00:09:33,671 --> 00:09:35,405
But these ads nudge
towards products
172
00:09:35,439 --> 00:09:37,624
with all the subtlety
of a punch in the nose.
173
00:09:38,249 --> 00:09:39,630
Facebook on the other hand uses
174
00:09:39,656 --> 00:09:42,007
the extraordinary
amounts of data that it gathers
175
00:09:42,117 --> 00:09:45,335
on each and every one of us
to help advertisers reach us
176
00:09:45,366 --> 00:09:47,679
with precision that
we've never known before.
177
00:09:47,913 --> 00:09:50,522
And it gives anybody
in the business of persuasion
178
00:09:50,616 --> 00:09:52,811
power that is unprecedented.
179
00:09:54,554 --> 00:09:57,163
Depending on what we post
at any given moment,
180
00:09:57,367 --> 00:10:00,210
Facebook can figure out
what we are doing and thinking,
181
00:10:00,235 --> 00:10:01,421
and exploit that.
182
00:10:03,484 --> 00:10:06,726
Facebook's very well aware
of, you know, our sentiment,
183
00:10:06,751 --> 00:10:08,117
our mood and how
we talk to people
184
00:10:08,142 --> 00:10:09,647
and it can put
all that data together
185
00:10:09,656 --> 00:10:12,029
and start to understand
like who our exes are
186
00:10:12,054 --> 00:10:14,059
and who our friends are
and who our old friends are
187
00:10:14,084 --> 00:10:17,091
and who our new friends are
and that's how it really works
188
00:10:17,116 --> 00:10:19,319
to incentivise another post.
189
00:10:19,466 --> 00:10:21,309
What you're saying is
Facebook has the capacity
190
00:10:21,334 --> 00:10:23,481
to understand our moods?
191
00:10:23,725 --> 00:10:24,725
Yes.
192
00:10:25,443 --> 00:10:27,950
Could that be used to influence
our buying behaviours?
193
00:10:28,006 --> 00:10:30,716
Of course it can be used
to influence our behaviour
194
00:10:30,741 --> 00:10:32,263
in general, not just buying.
195
00:10:32,630 --> 00:10:35,661
You can be incredibly
hyper targeted.
196
00:10:35,873 --> 00:10:37,318
Can I give you an example?
197
00:10:37,490 --> 00:10:39,395
We don't always act our age
198
00:10:39,420 --> 00:10:41,810
or according to
our gender stereotypes.
199
00:10:41,857 --> 00:10:43,997
A middle-aged woman
might like rap music.
200
00:10:44,060 --> 00:10:45,286
She is sick of getting ads
201
00:10:45,311 --> 00:10:47,114
for gardening gloves
and weight loss.
202
00:10:47,193 --> 00:10:49,700
So she posts on her Facebook
203
00:10:49,748 --> 00:10:52,146
that she likes Seth Sentry's
Waitress Song.
204
00:10:52,224 --> 00:10:54,552
Now she gets ads
for a streaming music service,
205
00:10:54,577 --> 00:10:56,333
something she might
actually buy.
206
00:10:59,012 --> 00:11:02,661
Adam Helfgott runs a digital
marketing company in New York.
207
00:11:02,943 --> 00:11:05,833
He uses a tool called
Facebook Pixel.
208
00:11:06,037 --> 00:11:09,263
Facebook gives it to advertisers
to embed in their sites.
209
00:11:09,443 --> 00:11:11,927
They can track anybody
who visits their site
210
00:11:12,029 --> 00:11:14,833
and target them with ads
on Facebook.
211
00:11:15,473 --> 00:11:17,199
Well if you've ever logged
into Facebook
212
00:11:17,262 --> 00:11:21,324
with any of your browsers,
213
00:11:21,434 --> 00:11:23,363
it's a good chance
it'll know it's you.
214
00:11:23,403 --> 00:11:24,948
You don't have to be logged in,
215
00:11:24,973 --> 00:11:27,496
you have to have been there
at some point in time.
216
00:11:28,356 --> 00:11:29,691
If it's a brand new computer
217
00:11:29,731 --> 00:11:31,293
and you've never
logged into Facebook,
218
00:11:32,160 --> 00:11:34,637
Facebook at that moment in time
won't know it's you,
219
00:11:34,832 --> 00:11:38,340
but based upon their algorithms
and your usage
220
00:11:38,371 --> 00:11:39,433
they'll figure it out.
221
00:11:39,582 --> 00:11:45,567
So, what you can then do
is put this piece of script
222
00:11:45,692 --> 00:11:47,160
onto your website.
223
00:11:47,254 --> 00:11:51,222
And then use Facebook data
to find the people
224
00:11:51,403 --> 00:11:54,465
that looked at your website
and then target ads to them.
225
00:11:54,490 --> 00:11:55,331
That's correct.
226
00:11:55,356 --> 00:11:56,489
- Through Facebook.
- Yep.
227
00:11:57,887 --> 00:12:00,574
That feels a little bit creepy,
I mean...
228
00:12:00,792 --> 00:12:04,316
are there privacy issue
involved with that?
229
00:12:04,637 --> 00:12:07,254
From a legal point of view
there's no privacy issue,
230
00:12:07,285 --> 00:12:09,746
that's just the internet today,
231
00:12:09,793 --> 00:12:12,254
and the state of it
and using a product
232
00:12:12,279 --> 00:12:14,176
that generates a lot
of revenue for Facebook.
233
00:12:18,066 --> 00:12:20,176
For advertisers it is a boon,
234
00:12:20,528 --> 00:12:24,191
giving them access to the most
intimate details of our lives.
235
00:12:24,762 --> 00:12:26,879
Megan Brownlow
is a media strategist
236
00:12:26,904 --> 00:12:29,160
for Price Waterhouse Coopers
in Sydney.
237
00:12:32,153 --> 00:12:35,020
When you change your status,
for example,
238
00:12:35,222 --> 00:12:38,496
we might see something,
a young woman
239
00:12:38,521 --> 00:12:40,621
changes her status to engaged.
240
00:12:40,646 --> 00:12:44,738
Suddenly she gets ads
for bridal services.
241
00:12:44,950 --> 00:12:47,380
These sorts of things are clues
242
00:12:47,403 --> 00:12:50,203
about what her interests
might really be.
243
00:12:50,442 --> 00:12:54,456
The research from consumers
is they don't like advertising
244
00:12:54,481 --> 00:12:56,152
if it's not relevant to them.
245
00:12:56,277 --> 00:12:58,566
If it actually is something
that they want,
246
00:12:58,603 --> 00:13:00,340
they don't mind it so much.
247
00:13:00,379 --> 00:13:02,902
This is actually
not a bad thing.
248
00:13:07,364 --> 00:13:09,262
Nik Cubrilovic is
a former hacker
249
00:13:09,270 --> 00:13:11,137
turned security consultant.
250
00:13:11,856 --> 00:13:13,863
He's been using his skills
to investigate
251
00:13:13,888 --> 00:13:15,582
the way our data is tracked.
252
00:13:17,387 --> 00:13:19,512
One day Cubrilovic
made a discovery
253
00:13:19,543 --> 00:13:21,238
that startled the tech world.
254
00:13:21,762 --> 00:13:24,644
He found that even if you're not
logged on to Facebook,
255
00:13:24,684 --> 00:13:26,137
even if you're not a member,
256
00:13:26,207 --> 00:13:27,879
the company tracks and stores
257
00:13:27,903 --> 00:13:29,902
a huge amount
of your browsing history.
258
00:13:30,175 --> 00:13:31,642
And you can't opt out.
259
00:13:35,270 --> 00:13:36,394
If you don't like Facebook,
260
00:13:36,410 --> 00:13:39,277
if you don't like the kinds
of things you're describing,
261
00:13:39,418 --> 00:13:41,199
just close your account?
262
00:13:42,434 --> 00:13:46,058
It's very difficult to opt out
of Facebook's reach on the web.
263
00:13:46,309 --> 00:13:48,168
Even if you close your account,
264
00:13:48,193 --> 00:13:50,972
even if you log out
of all of your services
265
00:13:51,098 --> 00:13:53,113
the way that they're set up,
266
00:13:53,138 --> 00:13:55,176
with their sharing buttons
and so forth,
267
00:13:55,223 --> 00:13:58,019
they're still going to be able
to build a profile for you.
268
00:13:58,044 --> 00:14:00,268
And it's just not going to have
the same level of information
269
00:14:00,293 --> 00:14:01,527
associated with it.
270
00:14:01,724 --> 00:14:04,472
They don't even tell us clearly
what they're doing.
271
00:14:05,567 --> 00:14:09,988
They tell us some things
but it's not specific enough
272
00:14:10,051 --> 00:14:12,543
to really answer the question,
273
00:14:12,568 --> 00:14:15,701
if somebody was going
to build a dossier on me
274
00:14:16,215 --> 00:14:19,160
based on what Facebook
knows about me,
275
00:14:19,371 --> 00:14:20,621
what would it look like?
276
00:14:20,684 --> 00:14:22,777
I should be able to know that,
277
00:14:23,723 --> 00:14:25,737
so that I can make
informed decisions
278
00:14:25,762 --> 00:14:27,597
about how I'm going
to use the platform.
279
00:14:35,020 --> 00:14:37,949
Facebook is not just
influencing what we buy.
280
00:14:38,106 --> 00:14:40,152
It's changing the world
we live in.
281
00:14:48,199 --> 00:14:50,799
Sure they want to
bring their service
282
00:14:50,824 --> 00:14:52,784
to everybody on the planet.
283
00:14:52,809 --> 00:14:56,863
From a commercial standpoint
that's obviously a goal.
284
00:14:57,137 --> 00:15:00,144
Whether it makes the world
a better place
285
00:15:00,169 --> 00:15:02,035
is another question.
286
00:15:02,082 --> 00:15:03,707
Not only have you built
this big business
287
00:15:03,739 --> 00:15:05,113
and this big social network,
288
00:15:05,215 --> 00:15:08,756
you now are determining
the course of
289
00:15:09,093 --> 00:15:12,156
possibly determining
the course of world events.
290
00:15:14,770 --> 00:15:17,683
That's exactly what happened
in the streets of Cairo.
291
00:15:21,098 --> 00:15:24,051
In January 2011,
millions gathered in the city
292
00:15:24,076 --> 00:15:27,051
demanding the resignation
of the autocrat Hosni Mubarak.
293
00:15:29,700 --> 00:15:32,309
It became known
as the Facebook revolution.
294
00:15:36,535 --> 00:15:40,793
The organizers used Facebook to
rally vast crowds of protesters.
295
00:15:41,074 --> 00:15:42,152
They were so effective
296
00:15:42,184 --> 00:15:44,847
that the government
tried to shut down the internet.
297
00:15:53,371 --> 00:15:56,262
It took just 18 days
to topple Mubarak.
298
00:16:00,371 --> 00:16:04,387
So what Facebook came to stand
for several months, I would say,
299
00:16:04,410 --> 00:16:06,332
or at least in its early days
300
00:16:06,418 --> 00:16:10,699
after the events of Tahrir
Square in the Arab Spring
301
00:16:10,724 --> 00:16:15,879
was a symbol of people's ability
to organize and express
302
00:16:15,903 --> 00:16:18,236
and share information
more widely.
303
00:16:18,285 --> 00:16:22,301
It symbolised that so much so
that I like to tell stories
304
00:16:22,326 --> 00:16:25,526
about how I could buy
T-shirts in Tahrir Square
305
00:16:25,628 --> 00:16:28,065
which said
"Facebook, Tool of Revolution".
306
00:16:29,183 --> 00:16:30,785
I understand as well as anybody
307
00:16:30,817 --> 00:16:33,340
just how effective
Facebook can be.
308
00:16:34,067 --> 00:16:36,262
Three years ago,
I was imprisoned in Egypt
309
00:16:36,293 --> 00:16:38,426
on trumped up terrorism charges.
310
00:16:39,106 --> 00:16:42,308
My family used Facebook as
a way of organizing supporters,
311
00:16:42,356 --> 00:16:43,816
and keeping them informed.
312
00:16:44,035 --> 00:16:47,027
It became one of the most
vital tools in the campaign
313
00:16:47,052 --> 00:16:48,995
that ultimately
got me out of prison.
314
00:16:54,964 --> 00:16:56,955
The Facebook page became a place
315
00:16:56,987 --> 00:16:59,495
that anybody could find
the latest on our case.
316
00:16:59,824 --> 00:17:02,230
The underlying algorithms
helped push it to people
317
00:17:02,254 --> 00:17:05,191
who might have been interested
in supporting our cause,
318
00:17:05,301 --> 00:17:07,230
even before
they knew it existed.
319
00:17:12,215 --> 00:17:13,215
Peter!
320
00:17:13,980 --> 00:17:15,113
How are you, man?
321
00:17:15,527 --> 00:17:18,260
- Good to see you.
- Good to see you, man.
322
00:17:18,503 --> 00:17:20,290
Let's go inside, mate.
It's cold.
323
00:17:26,728 --> 00:17:29,610
Mohamed Soltan was also
arrested for protesting.
324
00:17:29,837 --> 00:17:32,625
He was imprisoned
in the same jail as me.
325
00:17:33,330 --> 00:17:35,298
There was this medium
that people just wanted
326
00:17:35,323 --> 00:17:38,189
to express themselves because
there was no other avenue
327
00:17:38,220 --> 00:17:40,025
in the public space
to express themselves
328
00:17:40,048 --> 00:17:42,267
and then they found
this outlet...
329
00:17:42,423 --> 00:17:44,829
and then they found this outlet
to the outside world as well,
330
00:17:44,884 --> 00:17:47,673
where they would put
331
00:17:47,712 --> 00:17:50,790
how they feel
about social justice issues,
332
00:17:50,821 --> 00:17:54,154
on just day to day inequalities
that they've seen
333
00:17:54,603 --> 00:17:57,376
and then there was
the second phase of that
334
00:17:57,392 --> 00:18:00,126
where they saw that
there's quite a few of them
335
00:18:00,151 --> 00:18:03,087
that feel the same way about
a lot of different things.
336
00:18:06,134 --> 00:18:09,938
It took a prolonged court case,
a 500-day hunger strike
337
00:18:10,047 --> 00:18:13,040
and intense international
pressure to get him released.
338
00:18:14,728 --> 00:18:18,368
For him too, Facebook became
an invaluable tool.
339
00:18:20,626 --> 00:18:25,277
Facebook unlike other platforms
340
00:18:25,288 --> 00:18:27,853
and social media outlets,
341
00:18:27,888 --> 00:18:31,008
it allowed for us
to put out the reports,
342
00:18:31,048 --> 00:18:35,071
the medical reports, it allowed
my family
343
00:18:35,360 --> 00:18:39,501
to share stories
and it established credibility.
344
00:18:39,720 --> 00:18:45,247
So... Facebook provides this
345
00:18:45,283 --> 00:18:48,941
this place that is almost ideal
for finding like-minded people,
346
00:18:48,976 --> 00:18:52,016
whether it means finding people
who live in a certain place,
347
00:18:52,041 --> 00:18:53,485
who are interested
in a certain thing
348
00:18:53,510 --> 00:18:57,032
or people who are in the thrall
of a dangerous ideology.
349
00:19:02,594 --> 00:19:06,273
In Australia, Facebook has also
become a powerful political tool
350
00:19:06,298 --> 00:19:09,565
for mainstream causes,
and groups on the fringe.
351
00:19:10,946 --> 00:19:12,540
They don't want
to represent you.
352
00:19:12,571 --> 00:19:16,399
They want to represent the great
global international agenda
353
00:19:16,424 --> 00:19:19,337
that corrupts our people
from the inside,
354
00:19:19,368 --> 00:19:21,149
builds mosques
in our communities
355
00:19:21,236 --> 00:19:23,993
and floods our beautiful country
with third world immigrants.
356
00:19:24,032 --> 00:19:25,798
But is that what we want?
357
00:19:25,977 --> 00:19:27,188
No!
358
00:19:27,446 --> 00:19:28,938
Blair Cottrell leads a group
359
00:19:28,977 --> 00:19:30,883
called the United
Patriots Front,
360
00:19:31,071 --> 00:19:33,414
a right-wing movement
built on Facebook,
361
00:19:33,618 --> 00:19:36,985
that campaigns against Muslims
and immigrants across Australia.
362
00:19:40,274 --> 00:19:42,250
Facebook's been
extremely effective for us.
363
00:19:42,275 --> 00:19:44,828
That's indispensable
to the development
364
00:19:44,852 --> 00:19:46,185
of our organisation.
365
00:19:47,953 --> 00:19:52,117
Without it, we would probably be
a separatist cult,
366
00:19:52,399 --> 00:19:54,805
where no one would be able
to relate to us,
367
00:19:54,830 --> 00:19:56,899
because no one would actually
hear us directly,
368
00:19:56,923 --> 00:19:58,696
they would only hear about us
369
00:19:58,721 --> 00:20:00,844
through established
media corporations.
370
00:20:01,524 --> 00:20:05,188
Islam can only pose a threat
to our nation
371
00:20:05,212 --> 00:20:09,602
if our weak leadership,
or rather lack of leadership,
372
00:20:09,634 --> 00:20:11,234
is allowed to continue.
373
00:20:11,680 --> 00:20:14,719
Facebook has helped turn
a disparate group of individuals
374
00:20:14,743 --> 00:20:18,078
into a political force
that some say is dangerous.
375
00:20:21,008 --> 00:20:25,235
It gives us the ability
to cut out the middleman,
376
00:20:25,415 --> 00:20:27,141
to go directly to the people,
377
00:20:27,204 --> 00:20:29,180
to the audience
with our message,
378
00:20:29,391 --> 00:20:31,414
to speak directly
to the Australian people
379
00:20:31,446 --> 00:20:35,336
which is a power that hitherto
has only been held
380
00:20:35,360 --> 00:20:37,110
by established
media corporations
381
00:20:37,134 --> 00:20:40,453
and anybody who speaks
through such media corporations.
382
00:20:40,688 --> 00:20:42,219
But now anybody has that power.
383
00:20:42,243 --> 00:20:44,510
Anybody has access
to that power.
384
00:20:45,767 --> 00:20:49,555
Some of the UPF's more inflammatory
statements have been censored,
385
00:20:49,556 --> 00:20:52,079
and Cottrell has been prosecuted
for staging a mock beheading
386
00:20:52,095 --> 00:20:54,657
that they filmed
and posted on Facebook.
387
00:20:55,024 --> 00:20:57,164
Facebook removed
some of their posts,
388
00:20:57,282 --> 00:20:59,422
including the original
beheading video,
389
00:20:59,447 --> 00:21:01,375
and threatened
to suspend the page.
390
00:21:03,024 --> 00:21:05,641
Sometimes Facebook has removed
or censored
391
00:21:05,666 --> 00:21:07,446
certain posts of ours
392
00:21:07,866 --> 00:21:10,532
because we've used
the word Muslim, for example.
393
00:21:11,126 --> 00:21:12,782
Not in a negative way at all.
394
00:21:12,807 --> 00:21:17,165
If we'd explained an incident
or a point of view
395
00:21:17,189 --> 00:21:18,782
and we've used the word Muslim,
396
00:21:19,040 --> 00:21:21,493
sometimes that registers
in Facebook's computer
397
00:21:21,774 --> 00:21:23,696
and they automatically delete it
for some reason.
398
00:21:23,720 --> 00:21:27,063
I don't know if it's a person
who deletes it or a computer,
399
00:21:27,517 --> 00:21:30,797
but that can be
a bit problematic.
400
00:21:30,821 --> 00:21:33,704
We actually started altering
the way we spelt Muslim
401
00:21:34,001 --> 00:21:35,782
in order to have our posts
remain up
402
00:21:35,806 --> 00:21:36,922
when we were speaking about
403
00:21:36,947 --> 00:21:38,602
the Muslim people
of the Islamic faith.
404
00:21:44,001 --> 00:21:45,336
Facebook has been criticized
405
00:21:45,361 --> 00:21:47,930
for the way it censors
controversial posts.
406
00:21:48,602 --> 00:21:51,172
Whenever someone flags
a post as offensive,
407
00:21:51,259 --> 00:21:53,414
it gets sent
to a human moderator
408
00:21:53,477 --> 00:21:55,594
who decides
if it should be taken down.
409
00:21:56,571 --> 00:21:58,172
The company says it reviews
410
00:21:58,188 --> 00:22:01,455
a hundred million pieces
of content every month.
411
00:22:05,704 --> 00:22:07,219
People are under
a lot of pressure
412
00:22:07,259 --> 00:22:10,282
to review a great deal
of content very quickly
413
00:22:11,610 --> 00:22:17,336
and I certainly hear about
what appear to be mistakes
414
00:22:17,634 --> 00:22:22,477
quite frequently and some
of them are kind of ridiculous
415
00:22:22,501 --> 00:22:27,018
like at the end of 2015
416
00:22:27,366 --> 00:22:30,983
a whole bunch of women named Isis
had their accounts deactivated
417
00:22:31,018 --> 00:22:32,641
because clearly somebody went
418
00:22:32,674 --> 00:22:35,344
and flagged them
as being terrorists.
419
00:22:38,227 --> 00:22:40,025
After an Egyptian court
convicted me
420
00:22:40,050 --> 00:22:41,675
of terrorism charges,
421
00:22:41,783 --> 00:22:44,025
my own Facebook page
was suspended.
422
00:22:45,205 --> 00:22:47,228
We were never told
why it disappeared.
423
00:22:47,705 --> 00:22:50,431
Facebook says it was
to protect my privacy.
424
00:22:51,237 --> 00:22:53,550
We believed I'd been labelled
a terrorist,
425
00:22:54,026 --> 00:22:57,291
violating what Zuckerberg calls
its community standards.
426
00:23:00,588 --> 00:23:03,674
You can't have a common standard
for 1.8 billion people.
427
00:23:03,729 --> 00:23:06,431
Our diversity is actually
our strength, right?
428
00:23:06,596 --> 00:23:08,720
Part of what makes us
a global community
429
00:23:08,893 --> 00:23:11,744
is the reality that
what forms a global community
430
00:23:11,769 --> 00:23:14,759
are our incredibly
fundamental differences.
431
00:23:15,916 --> 00:23:18,900
In one infamous example,
Facebook removed a post
432
00:23:18,925 --> 00:23:22,158
showing one of the most powerful
images of the Vietnam war.
433
00:23:22,658 --> 00:23:24,259
The photograph of a naked girl
434
00:23:24,284 --> 00:23:26,551
violated
its community standards.
435
00:23:29,057 --> 00:23:31,830
The community standards
are developed by his staff.
436
00:23:33,838 --> 00:23:36,682
The community didn't develop
those standards.
437
00:23:38,526 --> 00:23:40,783
They're called
community standards
438
00:23:41,393 --> 00:23:42,932
but they were developed
by Facebook
439
00:23:42,971 --> 00:23:46,385
and yes they've had input
here and there over time,
440
00:23:46,416 --> 00:23:48,391
they also get input
from governments
441
00:23:48,416 --> 00:23:50,861
about... you know, recently
a number of governments
442
00:23:50,893 --> 00:23:53,346
told them "You need to amend
your community standards
443
00:23:53,370 --> 00:23:57,041
to be harder
on extremist content", you know.
444
00:23:57,237 --> 00:23:59,260
And so they amended
their community standards.
445
00:23:59,291 --> 00:24:00,978
It's not like the community
got together
446
00:24:01,003 --> 00:24:02,736
and developed these standards.
447
00:24:08,229 --> 00:24:11,596
Facebook is also transforming
politics as we know it.
448
00:24:13,284 --> 00:24:16,400
Politicians have used social
media for years of course,
449
00:24:16,432 --> 00:24:19,580
but in this last election
campaigners used big data
450
00:24:19,605 --> 00:24:22,189
to radically transform
American politics.
451
00:24:22,518 --> 00:24:25,557
In the words of some observers,
they weaponized the data.
452
00:24:25,948 --> 00:24:27,697
We're on our way
to Washington DC
453
00:24:27,722 --> 00:24:30,455
to find out what impact
Facebook has had
454
00:24:30,558 --> 00:24:32,150
on the fight for
political power.
455
00:24:37,518 --> 00:24:40,033
At the heart of political power
is information.
456
00:24:40,666 --> 00:24:42,728
That's why government
security agencies
457
00:24:42,745 --> 00:24:45,541
go to extraordinary lengths
to vacuum up data.
458
00:24:46,104 --> 00:24:48,814
But increasingly it is also
becoming the key
459
00:24:48,831 --> 00:24:49,964
to winning power.
460
00:24:51,573 --> 00:24:56,049
I think that there's
a legitimate argument to this
461
00:24:56,074 --> 00:24:58,994
that Facebook influenced
the election,
462
00:24:59,065 --> 00:25:01,465
the United States Election
results.
463
00:25:01,901 --> 00:25:04,103
I think that Facebook
and algorithms
464
00:25:04,127 --> 00:25:09,111
are partially responsible,
if not the main reason why,
465
00:25:09,307 --> 00:25:10,814
there's this shift towards
466
00:25:10,839 --> 00:25:14,775
hyper partisan belief
systems these days.
467
00:25:17,290 --> 00:25:19,290
We will soon have, by the way,
468
00:25:19,315 --> 00:25:21,830
a very strong
and powerful border.
469
00:25:22,346 --> 00:25:24,744
When Donald Trump became
the presidential frontrunner
470
00:25:24,769 --> 00:25:27,572
in last year's US election,
few pundits predicted
471
00:25:27,597 --> 00:25:28,971
that he'd actually win.
472
00:25:31,245 --> 00:25:33,268
One of the Trump campaign's
secret weapons
473
00:25:33,293 --> 00:25:36,004
was an ability to research
social media data
474
00:25:36,041 --> 00:25:37,641
in extraordinary detail.
475
00:25:38,432 --> 00:25:41,260
It helped him understand
and target his voters
476
00:25:41,377 --> 00:25:43,525
with a precision
we've never seen before.
477
00:25:46,230 --> 00:25:50,971
By using Facebook's
ad targeting engine for example,
478
00:25:51,018 --> 00:25:54,471
they know if some of those
independent voters
479
00:25:54,628 --> 00:25:56,596
have actually liked
Republican pages
480
00:25:56,636 --> 00:25:58,299
or liked the Bernie Sanders page
481
00:25:58,316 --> 00:26:00,116
or like a Donald Trump page.
482
00:26:00,213 --> 00:26:02,923
So you can go to them
to spend money,
483
00:26:02,948 --> 00:26:06,549
to target advertising
specifically to those voters
484
00:26:06,698 --> 00:26:09,807
and it is a much more reliable
ultimately form of targeting
485
00:26:09,832 --> 00:26:14,479
than many of the other
online vehicles out there.
486
00:26:15,237 --> 00:26:17,260
Political strategist
Patrick Ruffini
487
00:26:17,292 --> 00:26:20,839
runs a company that mines big
data for the Republican Party.
488
00:26:21,300 --> 00:26:24,151
He produces social media maps
that help them make sure
489
00:26:24,176 --> 00:26:27,003
their political messages
hit their targets.
490
00:26:27,667 --> 00:26:31,776
What it does give us is
much greater level of certainty
491
00:26:31,801 --> 00:26:35,401
and granularity and precision
down to the individual voter,
492
00:26:35,425 --> 00:26:37,253
down to the individual precinct
493
00:26:37,371 --> 00:26:39,081
about how things
are going to go.
494
00:26:39,121 --> 00:26:42,284
It used to be we could survey
eight hundred,
495
00:26:42,363 --> 00:26:45,589
a thousand registered voters
nationwide,
496
00:26:45,707 --> 00:26:47,581
but you couldn't really
make projections
497
00:26:47,605 --> 00:26:51,384
from that about how
an individual state would go
498
00:26:51,409 --> 00:26:54,714
or how an individual voter
would ultimately go.
499
00:26:58,058 --> 00:27:01,487
I Donald John Trump
do solemnly swear
500
00:27:01,512 --> 00:27:04,729
that I will faithfully execute
the office
501
00:27:04,754 --> 00:27:07,087
of President
of the United States.
502
00:27:09,996 --> 00:27:12,519
It is one thing to know
your voters of course,
503
00:27:12,544 --> 00:27:15,401
and quite another to get them
to change their minds.
504
00:27:15,699 --> 00:27:17,832
Facebook can help with that too.
505
00:27:21,268 --> 00:27:24,300
The ability to take the pools
of big data that we've got
506
00:27:24,402 --> 00:27:27,362
and do really deep
analysis of it
507
00:27:27,378 --> 00:27:32,409
to understand small groups
of customers' preferences,
508
00:27:32,582 --> 00:27:35,115
can be applied
in a political setting
509
00:27:35,230 --> 00:27:38,612
in a way that is
potentially worrying
510
00:27:38,636 --> 00:27:44,370
because it allows politicians
to potentially lie better.
511
00:27:44,885 --> 00:27:45,705
For instance,
512
00:27:45,730 --> 00:27:47,737
you can't make a political
statement on television,
513
00:27:47,762 --> 00:27:49,894
without it being disclosed
who paid for it.
514
00:27:50,332 --> 00:27:52,855
Those same controls
on the web are very lax.
515
00:27:52,871 --> 00:27:54,019
For instance,
516
00:27:54,113 --> 00:27:57,784
I could see a story saying
a certain XYZ politician
517
00:27:57,809 --> 00:27:58,894
has done a great thing,
518
00:27:58,919 --> 00:28:02,292
produced on a completely
different third party news site.
519
00:28:02,730 --> 00:28:05,263
I cannot know that
that ad was placed
520
00:28:05,449 --> 00:28:08,933
by a political operation who
have specifically targeted me,
521
00:28:08,972 --> 00:28:11,034
because that information
is not disclosed, anywhere.
522
00:28:11,261 --> 00:28:14,821
I am understanding
yet troubled
523
00:28:14,862 --> 00:28:20,155
by the data-driven advertising
and targeted ads that occur,
524
00:28:20,246 --> 00:28:24,416
but I'm even more uncomfortable
by the reality
525
00:28:24,441 --> 00:28:27,456
that our elections and
how our elections are structured
526
00:28:27,481 --> 00:28:31,651
and configured
can be hijacked by these forces
527
00:28:31,675 --> 00:28:33,742
that are not transparent to us.
528
00:28:36,488 --> 00:28:40,058
One of the most important parts
of any democracy is news -
529
00:28:40,105 --> 00:28:43,479
almost half of all Americans
get theirs from Facebook.
530
00:28:44,072 --> 00:28:45,603
But the last US election
531
00:28:45,628 --> 00:28:48,472
also saw an explosion
of fake news,
532
00:28:48,558 --> 00:28:51,025
turbocharged by sharing
on Facebook.
533
00:28:52,714 --> 00:28:55,331
These things look like news,
they function like news,
534
00:28:55,371 --> 00:28:56,815
they're shared like news,
535
00:28:57,019 --> 00:29:00,612
they don't match up
with traditional ideas
536
00:29:00,636 --> 00:29:03,222
of what news is for
and what it should do.
537
00:29:04,355 --> 00:29:05,878
Facebook is in the middle
of this,
538
00:29:05,904 --> 00:29:09,503
they are the company
that can see all of this
539
00:29:09,528 --> 00:29:11,011
and make judgements about it,
540
00:29:12,175 --> 00:29:14,675
I think they would prefer
not to have to do that.
541
00:29:17,832 --> 00:29:20,684
Adam Schrader is a journalist
who used to edit stories
542
00:29:20,722 --> 00:29:22,753
for Facebook's Trending News
section.
543
00:29:23,136 --> 00:29:26,136
Part of his job was
to filter out fake news.
544
00:29:27,230 --> 00:29:29,651
Essentially, we operated
like a newsroom,
545
00:29:30,119 --> 00:29:32,050
it was structured
like a newsroom,
546
00:29:32,075 --> 00:29:37,339
copy editors would make sure
that the topics met standards,
547
00:29:37,425 --> 00:29:40,394
make sure that they were
unbiased, checked facts.
548
00:29:40,628 --> 00:29:44,495
There were often times that
fake articles would appear
549
00:29:44,613 --> 00:29:47,413
and present themselves
as possibly being
550
00:29:47,550 --> 00:29:50,159
a legitimate trending topic.
551
00:29:50,292 --> 00:29:52,144
And our job was
identifying those
552
00:29:52,167 --> 00:29:54,729
and the original term
was blacklisting.
553
00:29:57,082 --> 00:29:58,346
In the heat of the campaign,
554
00:29:58,371 --> 00:30:01,214
right-wing commentators
accused the team of bias.
555
00:30:01,488 --> 00:30:04,777
Facebook sacked the team and
handed the job to an algorithm.
556
00:30:06,223 --> 00:30:09,644
An algorithm cannot do the job
of a trained journalist.
557
00:30:09,715 --> 00:30:13,081
They don't have
the ability to reason,
558
00:30:13,106 --> 00:30:15,917
artificial intelligence
hasn't gotten to the point
559
00:30:15,996 --> 00:30:19,824
where it can really
function like a human brain
560
00:30:19,863 --> 00:30:21,886
and determine
what has news value
561
00:30:21,910 --> 00:30:24,847
and what is good for the public
and what is not.
562
00:30:25,621 --> 00:30:28,207
Schrader says
after the team was sacked,
563
00:30:28,301 --> 00:30:30,160
fake news really took off.
564
00:30:30,387 --> 00:30:33,011
Yeah after the trending news
team was let go,
565
00:30:33,036 --> 00:30:38,253
there was a big problem
with sensational
566
00:30:38,277 --> 00:30:43,558
or factually incorrect
or misleading news sources
567
00:30:43,583 --> 00:30:48,316
and trending topics.
It was just a disaster.
568
00:30:48,520 --> 00:30:52,058
The more partisan
news sources you consume,
569
00:30:52,129 --> 00:30:54,300
the less likely
you are to believe
570
00:30:54,340 --> 00:30:56,722
fact checkers or experts.
571
00:30:56,770 --> 00:31:00,661
And so, this can create
some really dangerous
572
00:31:00,685 --> 00:31:04,214
divisive believers
of alternative facts.
573
00:31:06,660 --> 00:31:08,488
During last year's
election campaign,
574
00:31:08,512 --> 00:31:11,824
a news site published a story
alleging that Hillary Clinton
575
00:31:11,841 --> 00:31:14,035
and her campaign chairman
John Podesta
576
00:31:14,145 --> 00:31:16,078
were running a child sex ring
577
00:31:16,103 --> 00:31:18,457
out of the Comet Ping Pong
Pizza restaurant
578
00:31:18,512 --> 00:31:19,816
in Washington DC.
579
00:31:22,080 --> 00:31:24,813
The story was widely shared
on Facebook.
580
00:31:24,988 --> 00:31:27,816
It was utterly fake,
but it gained so much traction
581
00:31:27,841 --> 00:31:30,816
that a 28-year-old man
finally went to the restaurant
582
00:31:30,840 --> 00:31:34,527
armed with an assault rifle,
a revolver and a shotgun
583
00:31:34,566 --> 00:31:36,558
to find and rescue the children.
584
00:31:39,402 --> 00:31:41,230
One of the hosts runs up
and he's like
585
00:31:41,281 --> 00:31:43,745
"Did you seee that guy?
He had a big gun".
586
00:31:45,261 --> 00:31:47,636
He fired several shots
into the restaurant
587
00:31:47,722 --> 00:31:49,522
before police arrested him.
588
00:31:49,753 --> 00:31:53,659
The story still circulates
online and on Facebook.
589
00:31:55,355 --> 00:31:58,009
I don't think that, um,
590
00:31:58,322 --> 00:32:02,835
I trust the general
public's ability to
591
00:32:04,032 --> 00:32:09,519
identify fake news, real news,
you know, anything like that.
592
00:32:11,941 --> 00:32:14,097
One study found that
in the closing months
593
00:32:14,136 --> 00:32:15,469
of the US election,
594
00:32:15,527 --> 00:32:18,994
Facebook users shared
the top 20 fake news stories
595
00:32:19,097 --> 00:32:22,081
over a million times more
than the top 20 stories
596
00:32:22,106 --> 00:32:23,773
from major news outlets.
597
00:32:24,043 --> 00:32:27,237
Fake stories are often written
either for political advantage,
598
00:32:27,308 --> 00:32:28,441
or to make money.
599
00:32:29,441 --> 00:32:31,862
There are a lot of people out
there who aren't journalists
600
00:32:31,902 --> 00:32:33,519
and aren't publishers
who are publishing.
601
00:32:33,550 --> 00:32:37,120
They don't have the same
sense of obligation
602
00:32:37,160 --> 00:32:40,105
so we are really
in uncharted territory.
603
00:32:40,214 --> 00:32:42,034
I think one of the most
important things
604
00:32:42,058 --> 00:32:45,175
is that we actually need
a big public debate about this
605
00:32:45,332 --> 00:32:48,800
because it's changed
the nature of news,
606
00:32:48,863 --> 00:32:52,370
and in doing so it's changed
our reality as a society.
607
00:32:55,128 --> 00:32:59,105
If you suspect a news story
is fake, you can report it.
608
00:32:59,464 --> 00:33:01,776
Mark Zuckerberg initially
dismissed the notion
609
00:33:01,801 --> 00:33:04,316
that Fake News somehow
skewed the election,
610
00:33:04,519 --> 00:33:06,183
but he is rolling out a system
611
00:33:06,207 --> 00:33:07,925
that allows
the Facebook community
612
00:33:07,950 --> 00:33:09,821
to flag suspect stories.
613
00:33:10,496 --> 00:33:12,761
Mark Zuckerberg has said that
614
00:33:12,957 --> 00:33:15,487
he's not in the news
publication business, right?
615
00:33:15,519 --> 00:33:17,495
That they're not
a media company.
616
00:33:17,543 --> 00:33:20,753
But I think that's a mistake,
kind of a denial, right?
617
00:33:20,785 --> 00:33:22,276
So, they're definitely
a media company
618
00:33:22,301 --> 00:33:24,408
and I think that they should try
and treat themselves
619
00:33:24,433 --> 00:33:25,800
more as one in the future.
620
00:33:29,816 --> 00:33:33,191
As more and more consumers get
their news from digital sources
621
00:33:33,216 --> 00:33:34,816
and Facebook in particular,
622
00:33:35,043 --> 00:33:36,941
the old-fashioned world
of newspapers
623
00:33:36,972 --> 00:33:39,253
and TV stations is collapsing.
624
00:33:41,308 --> 00:33:42,941
Like most news businesses,
625
00:33:43,004 --> 00:33:44,895
the New York Times
is struggling.
626
00:33:45,675 --> 00:33:47,995
Facebook sends plenty of readers
to their stories,
627
00:33:48,019 --> 00:33:50,839
but most advertising dollars
go to Facebook.
628
00:33:51,925 --> 00:33:54,034
Newsrooms are shrinking
along with the resources
629
00:33:54,058 --> 00:33:55,573
for serious journalism.
630
00:33:56,949 --> 00:33:59,573
You'll hear this from small
publications and large ones
631
00:33:59,605 --> 00:34:04,167
that their absolute audience
is larger than it's ever been
632
00:34:04,644 --> 00:34:07,464
and that surely
has to mean something
633
00:34:07,527 --> 00:34:10,127
but it certainly
hasn't meant profits.
634
00:34:10,410 --> 00:34:13,105
I don't think there's a major
news company in the world
635
00:34:13,130 --> 00:34:17,636
that hasn't changed
its operations around Facebook
636
00:34:17,661 --> 00:34:20,011
in a real way and I mean that
both in the way
637
00:34:20,036 --> 00:34:22,878
that it produces stories
and approaches stories
638
00:34:22,910 --> 00:34:24,425
and the way that it makes money.
639
00:34:26,371 --> 00:34:29,050
If Facebook is playing
an increasingly important role
640
00:34:29,066 --> 00:34:30,808
in how we understand the world,
641
00:34:30,902 --> 00:34:34,542
its social mood study showed
it affects how we feel about it.
642
00:34:35,769 --> 00:34:37,151
When its researchers explained
643
00:34:37,183 --> 00:34:38,791
how they manipulated
the news feeds
644
00:34:38,816 --> 00:34:41,253
of some 700,000 users,
645
00:34:41,425 --> 00:34:43,847
they were criticized for playing
with people's psychology
646
00:34:43,872 --> 00:34:45,081
without telling them.
647
00:34:46,691 --> 00:34:49,198
And yet its algorithms do that
every day.
648
00:34:49,605 --> 00:34:51,690
They give us stories
they know we want to see,
649
00:34:51,707 --> 00:34:55,574
to keep us online, and help
advertisers send us more ads.
650
00:34:59,199 --> 00:35:01,636
I don't think we can treat
Facebook as benign.
651
00:35:01,668 --> 00:35:03,956
I think it has
enormous implications
652
00:35:03,988 --> 00:35:08,886
for how we experience our lives,
and how we live our lives,
653
00:35:08,910 --> 00:35:13,243
and I think simultaneously
that's one of the things
654
00:35:13,268 --> 00:35:15,847
that makes that network
and others like it
655
00:35:15,872 --> 00:35:20,566
so phenomenally interesting
and important.
656
00:35:22,027 --> 00:35:25,722
But Mark Zuckerberg's plans
would have Facebook do far more.
657
00:35:25,839 --> 00:35:27,972
He wants its users
to rely on the network
658
00:35:28,019 --> 00:35:29,878
for the way we organize society,
659
00:35:30,004 --> 00:35:32,629
discover the world
and conduct politics.
660
00:35:32,925 --> 00:35:35,105
He wants the community
to inform Facebook,
661
00:35:35,160 --> 00:35:37,331
while Facebook
watches over us.
662
00:35:39,277 --> 00:35:42,042
The philosophy of everything
we do at Facebook
663
00:35:42,269 --> 00:35:44,669
is that our community
can teach us what we need to do
664
00:35:44,714 --> 00:35:47,370
and our job is to learn
as quickly as we can
665
00:35:47,395 --> 00:35:49,128
and keep on getting
better and better
666
00:35:49,332 --> 00:35:52,792
and that's especially true when
it comes to keeping people safe.
667
00:35:54,643 --> 00:35:57,370
In a post on his own profile
in February,
668
00:35:57,425 --> 00:35:59,206
he outlined his plan
for the company,
669
00:35:59,261 --> 00:36:01,596
a kind of manifesto
for the central role
670
00:36:01,621 --> 00:36:05,034
he wants the platform to play
in societies around the world.
671
00:36:05,129 --> 00:36:06,910
We're also gonna focus
on building
672
00:36:06,949 --> 00:36:08,542
the infrastructure
for the community,
673
00:36:09,355 --> 00:36:12,659
for supporting us,
for keeping us safe,
674
00:36:12,933 --> 00:36:14,448
for informing us,
675
00:36:14,793 --> 00:36:18,722
for civic engagement
and for inclusion of everyone.
676
00:36:19,324 --> 00:36:22,526
So it's a document that really
felt like an attempt
677
00:36:22,548 --> 00:36:27,292
to take some responsibility
but it wasn't apologetic.
678
00:36:27,433 --> 00:36:32,253
It was fairly bold
and it seems to suggest
679
00:36:32,278 --> 00:36:36,925
that the solution to Facebook's
problems is more Facebook.
680
00:36:38,972 --> 00:36:41,245
Zuckerberg has great ambition.
681
00:36:41,574 --> 00:36:43,784
"Progress needs humanity
to come together
682
00:36:43,800 --> 00:36:45,909
as a global community",
he writes.
683
00:36:46,293 --> 00:36:47,917
Facebook can help build it.
684
00:36:48,511 --> 00:36:50,144
It wants to
'help fight terrorism',
685
00:36:50,183 --> 00:36:53,347
while its news service can
show us 'more diverse content'.
686
00:36:56,793 --> 00:37:00,566
He is behaving recently in ways
687
00:37:00,652 --> 00:37:03,463
more befitting
of a politician than a CEO.
688
00:37:05,627 --> 00:37:08,111
There's a lot of speculation
that he may run for office
689
00:37:08,159 --> 00:37:11,774
and to my mind,
if Facebook continues to be
690
00:37:11,799 --> 00:37:14,283
as successful as it is,
not just through Facebook
691
00:37:14,308 --> 00:37:16,791
but through its other products,
through Instagram and WhatsApp,
692
00:37:16,816 --> 00:37:19,416
if it continues to be so central
in people's lives,
693
00:37:19,456 --> 00:37:21,057
he doesn't need
to run for office,
694
00:37:21,143 --> 00:37:26,079
he will be presiding
over a platform and a venue
695
00:37:26,104 --> 00:37:31,221
where people conduct
a real portion of their lives.
696
00:37:33,581 --> 00:37:35,620
It seems there is
almost no limit
697
00:37:35,644 --> 00:37:37,932
to Facebook's intrusion
into our lives,
698
00:37:37,963 --> 00:37:39,799
for better or for worse.
699
00:37:41,798 --> 00:37:44,135
I want to thank all of you
for joining us to hear more
700
00:37:44,160 --> 00:37:46,361
about some of the things
we're working on at Facebook
701
00:37:46,393 --> 00:37:48,096
to help keep
our communities safe.
702
00:37:49,057 --> 00:37:52,057
Zuckerberg convened what
he called the Social Good Forum
703
00:37:52,096 --> 00:37:54,689
to help people whose
Facebook posts indicate
704
00:37:54,706 --> 00:37:57,315
that they might be at risk
of harming themselves
705
00:37:57,362 --> 00:37:58,829
or committing suicide.
706
00:38:00,182 --> 00:38:02,018
We're starting to do
more proactive work.
707
00:38:02,151 --> 00:38:03,908
Like when we use
artificial intelligence
708
00:38:03,933 --> 00:38:06,658
to identify things that
could be bad or harmful
709
00:38:06,690 --> 00:38:09,088
and then flag them
so our teams can review them.
710
00:38:09,674 --> 00:38:12,072
Or when someone shares
a post that makes it seem
711
00:38:12,097 --> 00:38:13,666
like they might want
to harm themselves.
712
00:38:13,691 --> 00:38:15,057
Then we give them
and their friends
713
00:38:15,082 --> 00:38:16,293
suicide prevention tools
714
00:38:16,338 --> 00:38:18,447
that they can share
to get the help that they need.
715
00:38:18,845 --> 00:38:22,532
The ability to get big data
716
00:38:22,758 --> 00:38:27,878
and to gather data about us
in all aspects of our lives
717
00:38:28,595 --> 00:38:33,838
creates a particular
type of power among the people
718
00:38:33,863 --> 00:38:36,072
or organizations
that have that data.
719
00:38:36,284 --> 00:38:40,228
So they can say "Oh these are
your daily habits".
720
00:38:40,284 --> 00:38:44,557
There's been some research showing
that nearly half of what we do
721
00:38:44,627 --> 00:38:47,854
is just repeating patterns
of what we did the day before.
722
00:38:48,159 --> 00:38:51,612
From that,
you can also predict potentially
723
00:38:51,637 --> 00:38:54,815
how people will behave
in the near future as well.
724
00:38:55,018 --> 00:38:57,338
And that's perhaps
a little bit concerning
725
00:38:57,362 --> 00:38:59,955
for people who care a lot
about their privacy.
726
00:39:00,393 --> 00:39:03,877
You use the word potential harm,
that's a fairly big word.
727
00:39:03,901 --> 00:39:05,643
That's a fairly serious phrase.
728
00:39:05,878 --> 00:39:07,604
What sort of harm do you mean?
729
00:39:08,229 --> 00:39:09,752
There are a couple
of factors here.
730
00:39:09,800 --> 00:39:12,276
The first is the issues
that we know about already.
731
00:39:12,448 --> 00:39:16,487
They range from little things,
such as ad targeting,
732
00:39:16,512 --> 00:39:18,823
giving away what your girlfriend
bought you for your birthday,
733
00:39:18,848 --> 00:39:20,127
and doesn't want you
to know about
734
00:39:20,152 --> 00:39:23,760
through to a 14-year-old boy
who hasn't come out as gay yet,
735
00:39:23,785 --> 00:39:25,323
and his parents
discover the fact
736
00:39:25,347 --> 00:39:27,791
because of the advertising
that's being targeted to him,
737
00:39:28,018 --> 00:39:29,987
through to potential
future harms.
738
00:39:30,073 --> 00:39:31,784
One of the problems
in the privacy realm
739
00:39:31,815 --> 00:39:33,502
is that we only have
one identity,
740
00:39:33,542 --> 00:39:35,901
and we can't take back
what we've already handed over.
741
00:39:38,143 --> 00:39:41,010
Facebook is far more intimately
involved in our lives
742
00:39:41,035 --> 00:39:43,557
than any company
we've ever seen before.
743
00:39:46,534 --> 00:39:50,955
Facebook has a responsibility
to inform people
744
00:39:50,987 --> 00:39:53,387
of what is happening
to their data
745
00:39:53,620 --> 00:39:58,818
and so then there can be a
conversation also with "the community"
746
00:39:58,853 --> 00:40:03,914
um, about whether people agree
747
00:40:03,949 --> 00:40:06,172
that this is an appropriate use
and right now
748
00:40:06,197 --> 00:40:09,494
they're not providing
enough information
749
00:40:09,581 --> 00:40:12,182
for that conversation
to take place.
750
00:40:12,471 --> 00:40:15,752
There's no take-back
on private data.
751
00:40:15,777 --> 00:40:17,510
The implications
that are going to occur
752
00:40:17,534 --> 00:40:19,213
in five or 10 years' time,
753
00:40:19,291 --> 00:40:21,065
we need to protect
against that now.
754
00:40:21,128 --> 00:40:23,260
And to an extent,
it almost feels like,
755
00:40:23,401 --> 00:40:26,885
I'm reminded of Einstein's
letter to Eisenhower
756
00:40:26,925 --> 00:40:28,518
warning about
the potential dangers
757
00:40:28,543 --> 00:40:31,788
of nuclear holocaust or whatever,
with the dangers in uranium,
758
00:40:31,823 --> 00:40:34,680
not to say that it's that severe,
but we are at the point now,
759
00:40:34,699 --> 00:40:37,034
where we know that there's danger,
but we don't know the extent of it
760
00:40:37,069 --> 00:40:39,260
and we don't know the potential
implications of it.
761
00:40:45,065 --> 00:40:47,596
Our lives are now
measured in data.
762
00:40:47,956 --> 00:40:50,089
What we look at, who we talk to,
763
00:40:50,114 --> 00:40:53,221
what we do is all recorded
in digital form.
764
00:40:53,987 --> 00:40:55,720
In handing it to Facebook,
765
00:40:55,745 --> 00:40:57,463
we are making
Mark Zuckerberg's company
766
00:40:57,488 --> 00:40:59,807
one of the most powerful
in history.
767
00:41:00,244 --> 00:41:03,674
And the question is,
at what cost to ourselves?