1
00:00:03,580 --> 00:00:05,340
Tonight on Panorama,
2
00:00:05,340 --> 00:00:08,540
we investigate one of the most popular companies on the planet.
3
00:00:10,580 --> 00:00:15,260
I think the relationship of users to Facebook is that of a drug addict with a drug, in many ways.
4
00:00:15,260 --> 00:00:19,300
Facebook may know more about us than any organisation in history.
5
00:00:19,300 --> 00:00:22,900
What I care about is giving people the power to share, giving every person a voice,
6
00:00:22,900 --> 00:00:26,180
so we can make the world more open and connected.
7
00:00:26,180 --> 00:00:30,340
It makes a fortune from advertising, using our personal information.
8
00:00:31,660 --> 00:00:33,940
It's scary, we are all walking barcodes.
9
00:00:35,500 --> 00:00:39,420
And politicians are now using our data to target us, too.
10
00:00:41,580 --> 00:00:44,580
The way we bought media on Facebook was like no-one else
11
00:00:44,580 --> 00:00:46,580
in politics had ever done.
12
00:00:46,580 --> 00:00:51,820
If it's influencing elections, is it now time to control Facebook?
13
00:00:51,820 --> 00:00:55,900
With Facebook, you have a media which is increasingly seen as the most
14
00:00:55,900 --> 00:00:59,340
valuable media that campaigns invest in, in the election period,
15
00:00:59,340 --> 00:01:01,300
but which is totally unregulated.
16
00:01:20,420 --> 00:01:24,100
32 million of us in the UK share our lives on Facebook.
17
00:01:27,420 --> 00:01:31,700
You may have high privacy settings, but anything you share with friends
18
00:01:31,700 --> 00:01:35,340
can become public if they don't have high privacy settings, too.
19
00:01:37,540 --> 00:01:41,580
I'm trying to get a look at how private your settings really are. Yeah.
20
00:01:41,580 --> 00:01:45,020
How much of your life is out there. Anything on cats I love.
21
00:01:45,020 --> 00:01:48,060
So everyone can just see those? Yes. Oh, my God.
22
00:01:50,940 --> 00:01:55,780
Facebook says it helps people manage their privacy and have control.
23
00:01:55,780 --> 00:01:57,300
Oh!
24
00:01:57,300 --> 00:02:01,140
Right, so I think I'll have to change some of my settings now.
25
00:02:03,500 --> 00:02:07,580
But our tests show how information and pictures can slip into
26
00:02:07,580 --> 00:02:09,260
public view.
27
00:02:09,260 --> 00:02:12,860
We're looking at basically weapons here.
28
00:02:12,860 --> 00:02:16,860
But are they available? They are available, yes. Wow.
29
00:02:16,860 --> 00:02:21,420
Although I support it, I don't want it in public view.
30
00:02:21,420 --> 00:02:24,580
These are pages that you've liked. Yeah.
31
00:02:24,580 --> 00:02:26,260
Metallica. Yeah.
32
00:02:28,500 --> 00:02:34,740
I'm surprised that people can get to that level and get this information.
33
00:02:34,740 --> 00:02:37,700
I don't think some of my friends will be very happy about that.
34
00:02:41,500 --> 00:02:43,940
So what did you think overall?
35
00:02:43,940 --> 00:02:47,580
Overall, I'm very concerned because I think my pictures are private,
36
00:02:47,580 --> 00:02:48,980
and I like those.
37
00:02:48,980 --> 00:02:52,300
So I thought that it was just your profile picture which you
38
00:02:52,300 --> 00:02:56,540
know is always public. As opposed to...
39
00:02:56,540 --> 00:02:59,580
the rest that we've seen today, so...
40
00:02:59,580 --> 00:03:01,420
Yeah. There weren't too many.
41
00:03:01,420 --> 00:03:05,700
No, there weren't, but, you know, there are too many for me.
42
00:03:10,300 --> 00:03:12,620
When we first launched, we were hoping for, you know,
43
00:03:12,620 --> 00:03:14,300
maybe 400, 500 people.
44
00:03:14,300 --> 00:03:17,660
And now we're at 100,000 people, so who knows where we're going next.
45
00:03:19,820 --> 00:03:23,100
Mark Zuckerberg's college project is now used by
46
00:03:23,100 --> 00:03:24,980
almost 2 billion people.
47
00:03:26,700 --> 00:03:30,900
Facebook is where many of us share those important moments in our lives.
48
00:03:30,900 --> 00:03:33,700
It's transformed the way the world stays in touch.
49
00:03:35,260 --> 00:03:38,740
Our development is guided by the idea that every year,
50
00:03:38,740 --> 00:03:42,820
the amount that people want to add and share and express is increasing.
51
00:03:44,460 --> 00:03:49,140
The freedom of the internet has helped Facebook grow so quickly.
52
00:03:49,140 --> 00:03:52,140
But after years of unfettered free speech,
53
00:03:52,140 --> 00:03:56,260
social media companies are now facing severe criticism
54
00:03:56,260 --> 00:03:59,300
and being called to account for the material on their sites.
55
00:04:02,140 --> 00:04:04,100
With respect, I...
56
00:04:04,100 --> 00:04:07,700
You always preface your comments "with respect", but it's a simple
57
00:04:07,700 --> 00:04:12,340
question - do you feel any shame at all by some of what we've seen here?
58
00:04:12,340 --> 00:04:16,100
I don't think it's a simple question to suggest, do you have no shame?
59
00:04:16,100 --> 00:04:18,460
I feel very responsible, as do my colleagues,
60
00:04:18,460 --> 00:04:22,460
for the safety of the 1.9 billion people using our service.
61
00:04:22,460 --> 00:04:25,820
And I'm very proud of the work we do. Proud of the work...
62
00:04:25,820 --> 00:04:29,580
I don't think Facebook should be necessarily responsible for
63
00:04:29,580 --> 00:04:33,380
every idiot on the internet. But we need to agree that.
64
00:04:33,380 --> 00:04:35,860
We need to have regulation that reflects what we think,
65
00:04:35,860 --> 00:04:39,700
what our values are. And right now, we haven't got any...
66
00:04:39,700 --> 00:04:42,580
any kind of framework that sets that out.
67
00:04:50,700 --> 00:04:53,340
Facebook is the giant of social media.
68
00:04:53,340 --> 00:04:58,260
That scale and power makes it different from other internet companies.
69
00:04:58,260 --> 00:05:01,860
It has personal information about every aspect of our lives.
70
00:05:04,100 --> 00:05:07,940
It uses that information to create the most targeted advertising
71
00:05:07,940 --> 00:05:10,060
machine in history.
72
00:05:10,060 --> 00:05:12,020
And the ad industry loves it.
73
00:05:14,060 --> 00:05:18,460
The amount of targeting you can do on Facebook is extraordinary.
74
00:05:18,460 --> 00:05:21,300
Sometimes I'll be on a website called Competitive Cyclists,
75
00:05:21,300 --> 00:05:24,020
and I'll be looking at a new wheel set or something.
76
00:05:24,020 --> 00:05:28,580
Well, they'll chase me all over the internet with ads for bicycle parts.
77
00:05:28,580 --> 00:05:31,100
Well, if I'm on Facebook and I'm going to see ads, I'd rather
78
00:05:31,100 --> 00:05:35,020
see an ad for a bicycle part than women's pair of shoes, right?
79
00:05:35,020 --> 00:05:38,220
So, as a user, I'm into it because now they're showing
80
00:05:38,220 --> 00:05:40,540
me ads about things that I'm interested in.
81
00:05:41,740 --> 00:05:44,500
It's scary. We are all walking barcodes.
82
00:05:44,500 --> 00:05:45,580
HE LAUGHS
83
00:05:48,980 --> 00:05:52,500
It's a brilliant business model that has turned Facebook into
84
00:05:52,500 --> 00:05:55,020
a $400 billion company.
85
00:05:56,380 --> 00:05:58,980
We produce Facebook's content for free,
86
00:05:58,980 --> 00:06:03,620
while advertisers queue up to sell us products.
87
00:06:03,620 --> 00:06:06,100
It's the ultimate crowd-funded media company.
88
00:06:06,100 --> 00:06:10,260
All the media comes from the users, and all the viewers are users,
89
00:06:10,260 --> 00:06:12,940
and advertisers get to slip into the middle of that.
90
00:06:17,060 --> 00:06:22,100
People love Facebook. But there is a potential downside.
91
00:06:22,100 --> 00:06:25,900
Its computer algorithms track our lives.
92
00:06:25,900 --> 00:06:29,220
It's claimed that Facebook have more information about us than any
93
00:06:29,220 --> 00:06:32,780
government organisation or any other business ever.
94
00:06:32,780 --> 00:06:36,940
And the details can cover just about every aspect of our lives.
95
00:06:36,940 --> 00:06:38,780
This is the house that Mark built.
96
00:06:47,220 --> 00:06:52,300
Ultimately, Facebook is the company that is making billions
97
00:06:52,300 --> 00:06:54,500
out of this data.
98
00:06:54,500 --> 00:06:57,580
It has immense power and it has immense responsibility that
99
00:06:57,580 --> 00:07:00,500
goes with that, and I don't think Facebook recognises that.
100
00:07:13,820 --> 00:07:16,260
The way Facebook's algorithms work
101
00:07:16,260 --> 00:07:18,980
is one of its most closely guarded secrets.
102
00:07:24,220 --> 00:07:27,660
But one former insider has agreed to talk to me.
103
00:07:29,380 --> 00:07:34,020
We're sailing between the San Juan Islands in Washington state,
104
00:07:34,020 --> 00:07:36,900
right up in the north-west corner of America.
105
00:07:36,900 --> 00:07:39,700
Canada, a couple of miles that direction.
106
00:07:39,700 --> 00:07:42,780
Silicon Valley, about 1,000 miles that way.
107
00:07:48,300 --> 00:07:52,900
Antonio Garcia Martinez helped build Facebook's advertising machine.
108
00:07:55,700 --> 00:07:58,300
He now has a very different lifestyle.
109
00:08:00,860 --> 00:08:03,500
He left the company after a difference of opinion.
110
00:08:05,220 --> 00:08:07,980
Somewhat like being inducted into a cult.
111
00:08:07,980 --> 00:08:10,780
There's actually an initiation ritual called "onboarding", in which
112
00:08:10,780 --> 00:08:13,500
they try to inculcate the values of Facebook and the Facebook
113
00:08:13,500 --> 00:08:14,620
way of doing things.
114
00:08:14,620 --> 00:08:17,820
I guess the closest example might be a baptism in the case of a Catholic Church,
115
00:08:17,820 --> 00:08:20,340
or perhaps more an evangelical church, where as an adult,
116
00:08:20,340 --> 00:08:23,900
you sort of leave your sort of past and adopt this new path in life.
117
00:08:23,900 --> 00:08:26,260
Well, if it's a cult, every cult has to have a leader.
118
00:08:26,260 --> 00:08:29,860
Who's the leader? Yeah, well, Mark Zuckerberg is the prophet of that religion.
119
00:08:29,860 --> 00:08:32,900
And he's very much in charge.
120
00:08:32,900 --> 00:08:34,820
He's definitely the Jesus of that church.
121
00:08:36,620 --> 00:08:40,860
Antonio knows how the world's most powerful advertising tool works.
122
00:08:44,100 --> 00:08:47,100
This is how the internet gets paid for.
123
00:08:51,260 --> 00:08:53,500
Every time you get almost every page on the internet,
124
00:08:53,500 --> 00:08:55,860
or every app that you load on your phone.
125
00:08:55,860 --> 00:08:59,540
In that moment, literally in that instant, we're talking tens of milliseconds,
126
00:08:59,540 --> 00:09:02,980
there are signals that go out that say, "Hey, this person showed up."
127
00:09:08,420 --> 00:09:10,140
Then comes the real magic.
128
00:09:12,940 --> 00:09:18,220
Instantly, Facebook gathers all it knows about you and matches
129
00:09:18,220 --> 00:09:21,540
it with data from your real life.
130
00:09:21,540 --> 00:09:24,700
Your age, your home.
131
00:09:26,260 --> 00:09:29,820
Your bank, your credit card purchases...
132
00:09:29,820 --> 00:09:32,540
your holidays...
133
00:09:32,540 --> 00:09:35,100
Anything that helps advertisers target you.
134
00:09:36,620 --> 00:09:40,940
It happens hundreds of billions of times a day, all over the internet.
135
00:09:40,940 --> 00:09:43,020
How long does that process take?
136
00:09:46,020 --> 00:09:49,300
Literally half the time that it takes for you to blink your eyes.
137
00:09:49,300 --> 00:09:51,340
That's how fast it is.
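[To make the ad-serving flow Antonio describes a little more concrete, here is a minimal, purely illustrative Python sketch. The data, user IDs and function names are invented; this is not Facebook's actual code or API, just the shape of the process: a page load emits a "this person showed up" signal, the platform merges what it already holds about the user with offline data, and an ad is chosen almost instantly.]

```python
# Purely illustrative sketch -- invented data and function names,
# not Facebook's real systems. It mirrors the flow described above:
# a page load emits a "this person showed up" signal, the platform
# merges what it already holds about the user with offline data,
# and an ad is chosen almost instantly.

import time

PROFILE_DATA = {
    "user_123": {"age": 34, "home": "Leeds", "interests": ["cycling"]},
}
OFFLINE_DATA = {
    "user_123": {"recent_purchases": ["wheel set"]},
}
ADS = [
    {"ad": "bicycle parts", "wants": "cycling"},
    {"ad": "women's shoes", "wants": "fashion"},
]


def handle_page_load(user_id: str) -> dict:
    """Simulate the signal sent when a user opens a page or app."""
    start = time.perf_counter()
    # Gather everything known about this user, online and offline.
    profile = {**PROFILE_DATA.get(user_id, {}), **OFFLINE_DATA.get(user_id, {})}
    # Pick the first ad that matches one of the user's interests.
    chosen = next(
        (ad for ad in ADS if ad["wants"] in profile.get("interests", [])),
        {"ad": "generic brand ad"},
    )
    elapsed_ms = (time.perf_counter() - start) * 1000
    return {"ad": chosen["ad"], "elapsed_ms": round(elapsed_ms, 3)}


print(handle_page_load("user_123"))
# e.g. {'ad': 'bicycle parts', 'elapsed_ms': 0.01}
```

[Running it prints something like the line in the final comment; the lookup itself takes a tiny fraction of a second, echoing the "half a blink" point above.]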
138
00:09:52,820 --> 00:09:56,180
'Facebook says it complies with all regulations that apply to it.
139
00:09:57,740 --> 00:10:01,260
'But critics say those regulations were not designed for
140
00:10:01,260 --> 00:10:03,620
'a company of Facebook's reach and complexity.'
141
00:10:04,940 --> 00:10:09,180
Facebook has brought real value to many people's lives but the
142
00:10:09,180 --> 00:10:14,060
fact is that it's not about whether or not
143
00:10:14,060 --> 00:10:18,820
Mark Zuckerberg is a nice guy or has benign intentions,
144
00:10:18,820 --> 00:10:23,420
it's about people's rights and responsibilities.
145
00:10:23,420 --> 00:10:28,100
I can see a future where we are too tied into Facebook, as citizens,
146
00:10:28,100 --> 00:10:30,140
where we are scared to move from it,
147
00:10:30,140 --> 00:10:32,380
leave it, because it's got all our data.
148
00:10:32,380 --> 00:10:35,220
But I can also see a future with effective regulation,
149
00:10:35,220 --> 00:10:38,380
and that's also the future, I think, which is best for Facebook.
150
00:10:42,420 --> 00:10:44,060
The trouble is,
151
00:10:44,060 --> 00:10:47,380
Facebook's computer algorithms are beyond the comprehension of
152
00:10:47,380 --> 00:10:50,940
most of its own staff, let alone government regulators.
153
00:10:52,340 --> 00:10:54,340
You know what's naive?
154
00:10:54,340 --> 00:10:55,900
The thought that a European bureaucrat,
155
00:10:55,900 --> 00:10:58,420
who's never managed so much as a blog, could somehow,
156
00:10:58,420 --> 00:11:02,340
in any way, get up to speed or even attempt to regulate the algorithm
157
00:11:02,340 --> 00:11:06,260
that drives literally a quarter of the internet in the entire world,
158
00:11:06,260 --> 00:11:09,260
right, that's never going to happen, in any realistic way.
159
00:11:13,380 --> 00:11:16,420
Knowing what Facebook does with our personal information is now
160
00:11:16,420 --> 00:11:19,500
more important than ever.
161
00:11:19,500 --> 00:11:22,340
Because it's no longer just advertisers using our data.
162
00:11:23,500 --> 00:11:26,180
Politicians are targeting us, too.
163
00:11:26,180 --> 00:11:28,820
MAN: Isis scum! CROWD: Off our streets!
164
00:11:28,820 --> 00:11:31,460
Isis scum! Off our streets!
165
00:11:31,460 --> 00:11:35,060
Take the far-right group Britain First.
166
00:11:35,060 --> 00:11:37,260
They've told Panorama
167
00:11:37,260 --> 00:11:41,020
that they pay Facebook to repeatedly promote their videos.
168
00:11:43,500 --> 00:11:48,580
And it works. Britain First now has 1.6 million Facebook followers.
169
00:11:51,540 --> 00:11:54,620
The British people have spoken and the answer is, we're out.
170
00:11:55,860 --> 00:11:58,500
But it was the Brexit vote last summer that revealed the true
171
00:11:58,500 --> 00:12:00,460
power of Facebook.
172
00:12:00,460 --> 00:12:02,340
CHEERING
173
00:12:04,580 --> 00:12:07,620
I think Facebook was a game-changer for the campaign,
174
00:12:07,620 --> 00:12:10,660
I don't think there's any question about it.
175
00:12:10,660 --> 00:12:14,660
Both sides in the referendum used Facebook to target us individually.
176
00:12:15,940 --> 00:12:21,660
'You can say to Facebook, "I would like to make sure that I can'
177
00:12:21,660 --> 00:12:27,260
"micro-target that fisherman, in certain parts of the UK,"
178
00:12:27,260 --> 00:12:31,380
so that they are specifically hearing, on Facebook,
179
00:12:31,380 --> 00:12:34,460
that if you vote to leave,
180
00:12:34,460 --> 00:12:40,740
that you will be able to change the way that the regulations
181
00:12:40,740 --> 00:12:43,580
are set for the fishing industry.
182
00:12:43,580 --> 00:12:47,300
Now, I can do the exact same thing, for example, for people who live
183
00:12:47,300 --> 00:12:51,300
in the Midlands, that are struggling because the factory has shut down.
184
00:12:51,300 --> 00:12:55,820
So I may send a specific message through Facebook to them
185
00:12:55,820 --> 00:12:58,420
that nobody else is seeing.
186
00:13:00,340 --> 00:13:03,300
However you want to micro-target that message,
187
00:13:03,300 --> 00:13:05,340
you can do it on Facebook.
188
00:13:05,340 --> 00:13:08,540
The referendum showed how much Facebook has changed.
189
00:13:13,260 --> 00:13:17,900
We think Facebook is about keeping in touch with family and friends.
190
00:13:19,180 --> 00:13:22,500
But it's quietly become an important political player.
191
00:13:25,220 --> 00:13:31,340
Donald Trump's presidential campaign took full advantage.
192
00:13:31,340 --> 00:13:35,780
The way we bought media on Facebook was like no-one else in politics has ever done.
193
00:13:35,780 --> 00:13:39,500
How much money would you have spent with Facebook? Could you tell me?
194
00:13:39,500 --> 00:13:42,460
Uh, that's a number we haven't publicised.
195
00:13:42,460 --> 00:13:45,100
If I said 70 million, from the official campaign,
196
00:13:45,100 --> 00:13:49,220
would that be there or thereabouts? Uh, I would say that's nearby, yes.
197
00:13:56,700 --> 00:14:00,660
By using people's personal information, Facebook helped
198
00:14:00,660 --> 00:14:04,860
the campaigns to target voters more effectively than ever before.
199
00:14:04,860 --> 00:14:07,660
CROWD CHANTS AND APPLAUDS
200
00:14:07,660 --> 00:14:10,100
We take your name, address, you know, phone number,
201
00:14:10,100 --> 00:14:12,540
if we have it, e-mail, and we can take these pieces of data,
202
00:14:12,540 --> 00:14:16,100
basically put 'em in a file, send 'em to Facebook,
203
00:14:16,100 --> 00:14:19,500
Facebook then matches them to the user's profile.
204
00:14:19,500 --> 00:14:21,820
So, if you're on Facebook and you have an entry,
205
00:14:21,820 --> 00:14:24,540
I live at so-and-so address, I can then match you.
206
00:14:24,540 --> 00:14:27,780
And then once you're matched, I can put you into what is called an audience.
207
00:14:27,780 --> 00:14:32,580
An audience is essentially a bucket of users that I can then target.
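[As a rough illustration of the matching process just described, here is a short Python sketch. It assumes, as is common for this kind of list matching, that records are normalised and hashed before comparison; the records, identifiers and function names are invented, and this is not Facebook's real upload API.]

```python
# Purely illustrative sketch of the voter-file matching described above --
# invented records and function names, not Facebook's real upload API.
# A common approach is to normalise and hash each field before matching,
# which is what this sketch assumes.

import hashlib


def normalise_and_hash(value: str) -> str:
    """Lower-case and trim the value, then hash it for matching."""
    return hashlib.sha256(value.strip().lower().encode()).hexdigest()


# The campaign's file: the voters it holds contact details for.
campaign_file = [
    {"name": "Jane Doe", "email": "jane@example.com"},
    {"name": "John Roe", "email": "john@example.com"},
]

# The platform's side: hashed e-mails it already holds for its users.
platform_users = {
    normalise_and_hash("jane@example.com"): "facebook_user_42",
}

# Match the uploaded records against platform users to build an "audience" --
# the bucket of matched users that can then be targeted.
audience = [
    platform_users[hashed]
    for record in campaign_file
    if (hashed := normalise_and_hash(record["email"])) in platform_users
]

print(audience)  # ['facebook_user_42']
```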
208
00:14:32,580 --> 00:14:34,900
Was Facebook decisive? Yes.
209
00:14:36,100 --> 00:14:40,380
Now, I'm going to make our country rich again!
210
00:14:40,380 --> 00:14:43,060
CHEERING AND APPLAUSE
211
00:14:47,140 --> 00:14:51,380
Facebook says it wants to be the most open and transparent company.
212
00:14:54,660 --> 00:14:58,260
But it won't tell us how much cash it took from Republicans
213
00:14:58,260 --> 00:15:01,740
and Democrats, citing client confidentiality.
214
00:15:05,340 --> 00:15:10,340
It's estimated the election earned Facebook at least $250 million.
215
00:15:14,140 --> 00:15:17,060
There it is, Facebook's Washington HQ.
216
00:15:17,060 --> 00:15:20,420
Millions and millions of campaign dollars were spent here.
217
00:15:20,420 --> 00:15:22,460
The company had dedicated teams to help
218
00:15:22,460 --> 00:15:24,180
the political parties reach voters.
219
00:15:25,340 --> 00:15:28,580
Up the street, three blocks, the White House.
220
00:15:30,620 --> 00:15:35,020
Most of us had little idea Facebook had become so involved in politics.
221
00:15:36,380 --> 00:15:41,980
It even sent its own people to work directly with the campaigns.
222
00:15:41,980 --> 00:15:45,580
Both parties are going to have a team that are from Facebook,
223
00:15:45,580 --> 00:15:48,340
you know, we had folks on the ground with us, on the campaign,
224
00:15:48,340 --> 00:15:50,380
working with us day in, day out,
225
00:15:50,380 --> 00:15:52,700
we had dedicated staff that we would call when we ran into
226
00:15:52,700 --> 00:15:56,340
problems or got stuck on using a new tool or product of theirs.
227
00:15:56,340 --> 00:15:59,020
Or, we had an idea and we wanted to see if we could do it on
228
00:15:59,020 --> 00:16:01,820
Facebook and they would help us create a unique solution.
229
00:16:01,820 --> 00:16:05,500
We have been asking Facebook about this for weeks.
230
00:16:07,140 --> 00:16:10,980
In terms of the US presidential election, did you have Facebook
231
00:16:10,980 --> 00:16:15,420
people working directly with their respective campaigns?
232
00:16:15,420 --> 00:16:19,020
No Facebook employee worked directly for a campaign.
233
00:16:19,020 --> 00:16:21,660
Not "for", WITH, alongside?
234
00:16:21,660 --> 00:16:24,380
So, one of the things we absolutely are there to do is to
235
00:16:24,380 --> 00:16:26,860
help people make use of Facebook products.
236
00:16:26,860 --> 00:16:31,220
We do have people who are... whose role is to help politicians
237
00:16:31,220 --> 00:16:33,620
and governments make good use of Facebook.
238
00:16:33,620 --> 00:16:35,940
And that's not just around campaigns.
239
00:16:35,940 --> 00:16:40,220
But how many people did you have working effectively with these campaigns?
240
00:16:41,460 --> 00:16:43,660
I can't give you the number of exactly how many people
241
00:16:43,660 --> 00:16:45,900
worked with those campaigns but I can tell you that it was
242
00:16:45,900 --> 00:16:48,980
completely demand driven. So it's really up to the campaigns.
243
00:16:51,980 --> 00:16:56,860
So, according to the winners of both Brexit and the US presidential
244
00:16:56,860 --> 00:16:59,140
race, Facebook was decisive.
245
00:17:00,580 --> 00:17:04,580
Facebook is now expected to play a major role in our general election.
246
00:17:06,700 --> 00:17:08,860
This matters because critics say Facebook is largely
247
00:17:08,860 --> 00:17:11,900
unregulated and unaccountable.
248
00:17:11,900 --> 00:17:14,740
Historically, there have been quite strict rules about the way
249
00:17:14,740 --> 00:17:17,820
information is presented. Broadcasters worked to a very...
250
00:17:17,820 --> 00:17:21,060
a very strict code in terms of impartiality.
251
00:17:21,060 --> 00:17:24,300
And there are restrictions on use of advertising but on something
252
00:17:24,300 --> 00:17:27,940
like Facebook, you have a media which is increasingly seen as
253
00:17:27,940 --> 00:17:30,700
the most important, most valuable media that campaigns
254
00:17:30,700 --> 00:17:34,060
invest in, in an election period, but which is totally unregulated.
255
00:17:42,940 --> 00:17:46,260
Facebook is also under fire for fake news.
256
00:17:50,500 --> 00:17:52,980
It was the main platform for spreading made-up stories
257
00:17:52,980 --> 00:17:55,700
during the US presidential election.
258
00:18:02,740 --> 00:18:06,660
Some of them concerned Donald Trump.
259
00:18:06,660 --> 00:18:08,940
Most were about Hillary Clinton.
260
00:18:15,020 --> 00:18:18,180
There are dozens and dozens of fake news stories about
261
00:18:18,180 --> 00:18:21,380
Hillary Clinton to be found,
262
00:18:21,380 --> 00:18:24,220
linking her to sex scandals and several murders.
263
00:18:27,260 --> 00:18:31,740
'Mark Zuckerberg has tried to play down the importance of fake news.'
264
00:18:33,140 --> 00:18:38,500
The idea that fake news on Facebook influenced the election in
265
00:18:38,500 --> 00:18:42,020
any way, I think, is a pretty crazy idea.
266
00:18:42,020 --> 00:18:45,180
You know, voters make decisions based on their lived experience.
267
00:18:47,540 --> 00:18:50,620
But some of the made-up stories contained false information
268
00:18:50,620 --> 00:18:52,580
about when to vote.
269
00:18:54,860 --> 00:18:57,460
The Democrats say Facebook refused to take them down.
270
00:19:00,180 --> 00:19:04,860
A lot of them were very specifically targeted at African Americans or
271
00:19:04,860 --> 00:19:08,580
targeted at other populations that are very likely to be Democrats.
272
00:19:11,140 --> 00:19:15,260
We took it up with all of the large social media and technology companies.
273
00:19:15,260 --> 00:19:18,860
None of 'em was willing to just sort of get rid of the stuff.
274
00:19:18,860 --> 00:19:22,340
Including Facebook? None of 'em was willing, yeah. Yeah, yeah, none of 'em.
275
00:19:24,980 --> 00:19:28,060
Why didn't you simply take them down when you were asked to?
276
00:19:28,060 --> 00:19:31,260
It is my understanding that indeed we did take them down when we
277
00:19:31,260 --> 00:19:34,540
were asked to. When we were told about those posts, we took them down. And we also...
278
00:19:34,540 --> 00:19:37,340
Well, can I just... Can I just show you that there?
279
00:19:37,340 --> 00:19:41,220
I got that on the internet just on a Facebook page just the other day.
280
00:19:41,220 --> 00:19:43,820
So they're still up there.
281
00:19:43,820 --> 00:19:46,380
One of the things about our platform is we do rely on people to
282
00:19:46,380 --> 00:19:49,140
let us know about content they believe shouldn't be on
283
00:19:49,140 --> 00:19:52,300
Facebook, particularly this kind of material, and therefore,
284
00:19:52,300 --> 00:19:55,540
when people let us know about that, then we take action.
285
00:19:55,540 --> 00:19:58,460
And it's my understanding that we did indeed take down the posts
286
00:19:58,460 --> 00:20:03,940
that we were notified of by the Clinton campaign.
287
00:20:03,940 --> 00:20:09,380
'Facebook says it helped 2 million people register to vote in the US election.
288
00:20:09,380 --> 00:20:12,540
'And it has launched a series of initiatives to combat fake news.
289
00:20:14,740 --> 00:20:17,620
'They include suspending fake accounts and working with
290
00:20:17,620 --> 00:20:21,780
'fact checkers to highlight what Facebook calls disputed stories.
291
00:20:23,980 --> 00:20:27,540
'But why doesn't it simply remove them?'
292
00:20:27,540 --> 00:20:30,780
Facebook says it doesn't want to ban fake news because it doesn't
293
00:20:30,780 --> 00:20:33,020
want to censor the internet.
294
00:20:33,020 --> 00:20:38,060
But there's another reason the company might be reluctant to get rid of fake news.
295
00:20:38,060 --> 00:20:42,580
All those made-up stories - they help boost its profits.
296
00:20:51,340 --> 00:20:55,140
The key to Facebook's profitability is engagement.
297
00:20:55,140 --> 00:20:57,380
The more time people stay online,
298
00:20:57,380 --> 00:20:59,900
the more money Facebook earns from advertising.
299
00:21:01,380 --> 00:21:05,100
And fake news about the US election kept people engaged.
300
00:21:07,540 --> 00:21:10,820
Zuck got up and said, "Well, only about 1% of the content is fake news."
301
00:21:10,820 --> 00:21:13,220
Numerically, that might be true,
302
00:21:13,220 --> 00:21:15,700
in terms of all the content in the world. However,
303
00:21:15,700 --> 00:21:19,300
the fraction of engagement that those posts, you know, drove, and by
304
00:21:19,300 --> 00:21:23,100
engagement I mean likes, comments, shares, is way more than 1%.
305
00:21:23,100 --> 00:21:25,820
And why is that? Well, because click bait, as it's called, works.
306
00:21:25,820 --> 00:21:28,780
Right? A headline that proclaims that Hillary, whatever,
307
00:21:28,780 --> 00:21:30,620
just murdered her campaign manager,
308
00:21:30,620 --> 00:21:33,020
will obviously drive a lot more interest than some quiet
309
00:21:33,020 --> 00:21:35,980
sober analysis about the state of the election, or whatever.
310
00:21:39,420 --> 00:21:43,060
How much money did Facebook make through fake news?
311
00:21:44,500 --> 00:21:46,940
This is a really important issue to us, er,
312
00:21:46,940 --> 00:21:48,980
we talk about it a lot as a company.
313
00:21:48,980 --> 00:21:52,620
I can tell you that we estimate the amount of fake news on Facebook is
314
00:21:52,620 --> 00:21:56,220
very small and the amount of money we make from it is negligible.
315
00:21:56,220 --> 00:21:58,500
What does negligible actually mean?
316
00:21:58,500 --> 00:22:01,820
It means a very small amount, compared with everything else,
317
00:22:01,820 --> 00:22:05,380
all the other content on Facebook. And indeed, if you think about it...
318
00:22:05,380 --> 00:22:09,300
OK, you've got your valuation right now of $400 billion,
319
00:22:09,300 --> 00:22:12,940
in that context, can you tell me what negligible really means?
320
00:22:12,940 --> 00:22:17,020
So, the amount of advertising revenue that we get from fake news
321
00:22:17,020 --> 00:22:20,180
is negligible because the amount of fake news on Facebook is very small.
322
00:22:20,180 --> 00:22:21,660
Can you give me an actual figure?
323
00:22:21,660 --> 00:22:25,140
So, we take fake news very seriously, we estimate the amount
324
00:22:25,140 --> 00:22:27,820
is very small and the amount of money we make from it is negligible.
325
00:22:27,820 --> 00:22:29,460
But we want to get both of those to zero.
326
00:22:29,460 --> 00:22:33,380
I appreciate that but what is it actually in pounds, shillings and pence?
327
00:22:33,380 --> 00:22:34,900
We take this issue very seriously.
328
00:22:34,900 --> 00:22:37,500
We think it's a very small amount of the content on Facebook.
329
00:22:37,500 --> 00:22:39,620
But you can't give me an actual figure?
330
00:22:39,620 --> 00:22:42,580
We take the issue very seriously but what is very clear, is we
331
00:22:42,580 --> 00:22:45,580
estimate it to be very small and the amount of money that we make
332
00:22:45,580 --> 00:22:48,820
from it is negligible and we're determined to do everything we can,
333
00:22:48,820 --> 00:22:53,660
humanly, and technically possible, to reduce both of those to zero.
334
00:22:56,340 --> 00:22:59,700
A Westminster enquiry into fake news is expected to resume
335
00:22:59,700 --> 00:23:01,820
after the general election.
336
00:23:03,460 --> 00:23:06,380
The analysis done on the spreading of fake news has shown that Facebook
337
00:23:06,380 --> 00:23:09,220
has been the principal platform whereby fake news is shared.
338
00:23:09,220 --> 00:23:11,860
So, we're not only looking at Facebook but clearly they will be
339
00:23:11,860 --> 00:23:13,220
a major part of our enquiry.
340
00:23:13,220 --> 00:23:15,700
Primarily, I think, there is an obligation on behalf of the
341
00:23:15,700 --> 00:23:18,380
companies themselves to recognise there is a problem, that
342
00:23:18,380 --> 00:23:20,780
they have a social obligation to do something about it.
343
00:23:20,780 --> 00:23:22,020
But if they refuse to,
344
00:23:22,020 --> 00:23:24,860
then we'll be looking to make recommendations about what
345
00:23:24,860 --> 00:23:27,060
could be done to put in a better regime that, you know,
346
00:23:27,060 --> 00:23:30,460
creates a legal obligation on behalf of those companies to combat
347
00:23:30,460 --> 00:23:33,380
fake news and other sort of unhelpful and illicit material.
348
00:23:37,140 --> 00:23:39,820
It's not just fake news.
349
00:23:43,340 --> 00:23:46,980
Pressure is now growing for Facebook to take responsibility for
350
00:23:46,980 --> 00:23:49,140
all the material on its site.
351
00:23:58,100 --> 00:24:02,060
Critics say the company is just too slow to take content down
352
00:24:02,060 --> 00:24:04,980
when harmful and offensive material is posted.
353
00:24:08,780 --> 00:24:12,380
A possible group action is now being prepared by lawyers
354
00:24:12,380 --> 00:24:16,100
representing children who have been victimised and bullied online.
355
00:24:19,100 --> 00:24:21,780
Facebook tend to use the excuse, as most of the social networking
356
00:24:21,780 --> 00:24:25,380
giants do - they say, "Look, we can't monitor everything.
357
00:24:25,380 --> 00:24:29,060
"There's far too much coming in." But suddenly, if they're faced with a major threat,
358
00:24:29,060 --> 00:24:32,700
it's surprising what they can do and what they can do quickly.
359
00:24:32,700 --> 00:24:34,540
This is all about timing.
360
00:24:39,380 --> 00:24:43,900
The classic example was the 1972 photograph of the naked child
361
00:24:43,900 --> 00:24:46,940
running away from a napalm bomb in the Vietnamese village.
362
00:24:48,540 --> 00:24:51,380
Facebook, that fell foul of their algorithm and it was pulled
363
00:24:51,380 --> 00:24:54,620
until somebody drew it to their attention.
364
00:24:54,620 --> 00:24:59,500
Yet, for some reason, they can't act with the same speed and dexterity
365
00:24:59,500 --> 00:25:06,820
if some child's naked picture is posted, in a revenge porn or a schoolyard prank or whatever.
366
00:25:12,740 --> 00:25:15,980
One common complaint is that Facebook is very slow to
367
00:25:15,980 --> 00:25:19,620
remove material once it's complained of. Why is that?
368
00:25:19,620 --> 00:25:22,460
We take our responsibilities really seriously in this area and we
369
00:25:22,460 --> 00:25:25,300
aim to get...to reach all reports very quickly.
370
00:25:25,300 --> 00:25:27,940
Indeed, we recently committed, along with other companies,
371
00:25:27,940 --> 00:25:30,380
that when it comes to hate speech, for instance,
372
00:25:30,380 --> 00:25:33,820
we would aim to review all reports within 24 hours.
373
00:25:33,820 --> 00:25:38,780
And we certainly aim to get all reports of whatever nature addressed within 48 hours.
374
00:25:38,780 --> 00:25:42,300
'After a series of controversies, Facebook last week announced
375
00:25:42,300 --> 00:25:46,180
'plans to hire an extra 3,000 people to review content.
376
00:25:53,460 --> 00:25:56,980
'Critics say Facebook has an unfair commercial advantage.
377
00:25:58,420 --> 00:26:02,700
'It's a media company that doesn't check most of what it publishes.
378
00:26:02,700 --> 00:26:07,660
'It's an advertising company that no regulator fully understands.
379
00:26:07,660 --> 00:26:11,020
'And it's a political tool that seems beyond regulation.'
380
00:26:14,260 --> 00:26:16,740
They have no competitors.
381
00:26:16,740 --> 00:26:20,340
We don't even know what market it is they should be being regulated in.
382
00:26:20,340 --> 00:26:24,060
And we're all kind of dancing around this huge ten-tonne elephant,
383
00:26:24,060 --> 00:26:26,860
which is Facebook, and nobody knows who is responsible.
384
00:26:33,780 --> 00:26:37,820
We're slowly finding out more about what Facebook does with our information.
385
00:26:39,020 --> 00:26:42,100
By finding little things about you and then putting it all
386
00:26:42,100 --> 00:26:44,900
together, in fact, they're learning a lot about my life
387
00:26:44,900 --> 00:26:48,380
and what I'm like as a person, which I think's really weird.
388
00:26:48,380 --> 00:26:50,340
All right.
389
00:26:50,340 --> 00:26:53,660
We're really getting a sense of your character here, I think.
390
00:26:53,660 --> 00:26:56,500
Of your interests, your job, your music... Yeah.
391
00:26:56,500 --> 00:27:01,380
I think I should pay more attention to what I comment
392
00:27:01,380 --> 00:27:07,140
and where I leave comments. Especially if, like, if it's a controversial subject -
393
00:27:07,140 --> 00:27:09,980
Brexit, for example,
394
00:27:09,980 --> 00:27:12,020
or, you know, Trump.
395
00:27:19,260 --> 00:27:22,580
Legislators around the world now seem determined to tighten
396
00:27:22,580 --> 00:27:24,860
the rules for Facebook.
397
00:27:27,180 --> 00:27:30,220
They have to accept that as the owners and creators of this platform,
398
00:27:30,220 --> 00:27:32,940
they have a social obligation towards the way it is used.
399
00:27:34,140 --> 00:27:36,780
If their technology is being used by people to, you know,
400
00:27:36,780 --> 00:27:40,820
spread messages of hate, to spread lies, to disrupt elections,
401
00:27:40,820 --> 00:27:43,860
then they have a social obligation to act against that.
402
00:27:45,260 --> 00:27:49,100
And that could be bad news for the company that Mark built.
403
00:27:51,380 --> 00:27:53,300
If Zuck is losing sleep over anything,
404
00:27:53,300 --> 00:27:56,020
it's the government actually imposing that level
405
00:27:56,020 --> 00:27:58,860
of control over messaging and editorial control over what appears.
406
00:27:58,860 --> 00:28:00,700
That's what he's worried about.
407
00:28:07,460 --> 00:28:10,980
A quarter of the world's population has signed up to Facebook.
408
00:28:12,620 --> 00:28:16,180
The simple social network has developed at astonishing speed.
409
00:28:18,820 --> 00:28:23,060
Is it now time for our lawmakers and regulators to catch up?