1
00:00:42,700 --> 00:00:46,700
Is this a realistic prospect in ten years,
and do you understand
2
00:00:46,750 --> 00:00:49,160
why it kind of creeps
people out a little bit?
3
00:00:51,120 --> 00:00:55,700
There's what I call the creepy line,
and the Google policy about a lot of
4
00:00:55,750 --> 00:00:58,910
these things is to get right
up to the creepy line but not cross it.
5
00:00:59,000 --> 00:01:03,120
I would argue that implanting things in
your brain is beyond the creepy line.
6
00:01:03,160 --> 00:01:06,540
Mine in particular.
Yes, at least for the moment.
7
00:02:45,830 --> 00:02:47,330
What is internet anyway?
8
00:02:47,410 --> 00:02:48,580
What do you mean?
9
00:02:48,660 --> 00:02:51,330
How does one… What,
do you write to it, like mail?
10
00:02:51,410 --> 00:02:53,160
No, a lot of people
use it and communicate.
11
00:02:53,200 --> 00:02:55,950
I guess they can communicate
with NBC writers and producers.
12
00:02:56,000 --> 00:02:58,410
Allison, can you explain what Internet is?
13
00:02:58,540 --> 00:03:00,500
No, she can't say anything
in 10 seconds or less.
14
00:03:02,160 --> 00:03:04,910
Allison will be in the studio shortly.
What is it?
15
00:03:05,000 --> 00:03:08,870
It’s a giant computer network
made up of… Started from…
16
00:03:08,950 --> 00:03:10,330
Oh, I thought you were going
to tell us what this was.
17
00:03:10,410 --> 00:03:11,870
It’s like a computer billboard.
18
00:03:11,950 --> 00:03:15,910
It’s not anything. It's a computer
billboard, but it's nationwide,
19
00:03:16,000 --> 00:03:18,830
it’s several universities and
everything all joined together.
20
00:03:18,950 --> 00:03:20,370
Right.
And others can access it.
21
00:03:20,450 --> 00:03:21,870
Right.
And it's getting bigger and bigger
22
00:03:21,910 --> 00:03:23,040
all the time.
23
00:03:31,620 --> 00:03:36,330
At the end of the 20th century, Silicon
Valley industry titans like Steve Jobs
24
00:03:36,410 --> 00:03:42,040
and Bill Gates made a promise that would
forever alter how we perceive the world.
25
00:03:42,450 --> 00:03:47,080
It spans the globe like a
superhighway. It is called “Internet.”
26
00:03:47,250 --> 00:03:52,200
Suddenly, you're part of a new mesh of
people, programs, archives, ideas.
27
00:03:52,500 --> 00:03:57,120
The personal computer and the Internet
ushered humanity into the information age,
28
00:03:57,200 --> 00:04:00,160
laying the groundwork for
unlimited innovation.
29
00:04:01,450 --> 00:04:05,450
At the dawn of the 21st century,
there would be a new revolution.
30
00:04:05,540 --> 00:04:11,660
A new generation of young geniuses made
a new promise beyond our wildest dreams.
31
00:04:12,080 --> 00:04:15,830
The idea is that we take all
the world's information
32
00:04:15,910 --> 00:04:18,790
and make it accessible
and useful to everyone.
33
00:04:19,040 --> 00:04:23,700
Limitless information, artificial
intelligence, machines that would know
34
00:04:23,790 --> 00:04:26,620
what we wanted and would
tend to our every need.
35
00:04:27,040 --> 00:04:32,120
The technology would be beyond
our imaginations, almost like magic,
36
00:04:32,410 --> 00:04:36,750
yet the concept would be as
simple as a single word: search.
37
00:04:37,870 --> 00:04:42,040
This was the beginning of Google,
and less than a decade later, Facebook.
38
00:04:42,870 --> 00:04:45,160
They would make a new promise to humanity.
39
00:04:45,700 --> 00:04:47,500
What I think is so interesting
about Google
40
00:04:47,540 --> 00:04:51,120
and Facebook is they were founded
at American universities.
41
00:04:51,370 --> 00:04:55,870
At Stanford University, you had two
students, Larry Page and Sergey Brin,
42
00:04:56,000 --> 00:04:59,830
decide they wanted to create the
ultimate search engine for the Internet.
43
00:05:00,410 --> 00:05:04,200
Across the country, you had, in the case
of Facebook, Mark Zuckerberg,
44
00:05:04,450 --> 00:05:08,660
who was a student at Harvard, decided he
wanted to create a social media platform,
45
00:05:08,750 --> 00:05:13,000
basically to meet girls and to make
friends, and hence we got Facebook.
46
00:05:13,620 --> 00:05:17,410
You never think that you could build
this company or anything like that,
47
00:05:17,450 --> 00:05:20,000
right? Because, I mean,
we were college students, right?
48
00:05:20,040 --> 00:05:22,370
And we were just building
stuff because we thought it was cool.
49
00:05:22,450 --> 00:05:25,910
It was all about promise, it was all about
this shiny future, it was all about
50
00:05:25,950 --> 00:05:30,910
the free flow of information, but there
was also a certain idealism behind it.
51
00:05:31,250 --> 00:05:34,200
I think that a company like Google,
52
00:05:34,330 --> 00:05:37,160
we have the potential to
make very big differences,
53
00:05:37,290 --> 00:05:41,580
very big positive differences in the
world, and I think we also have an
54
00:05:41,620 --> 00:05:43,700
obligation as a consequence of that.
55
00:05:43,790 --> 00:05:47,910
Facebook and Google were started
with a great idea and great ideals.
56
00:05:48,290 --> 00:05:51,450
Unfortunately, there was
also a very dark side.
57
00:05:51,580 --> 00:05:56,330
The goal that we went into it with
wasn't to make an online community,
58
00:05:56,370 --> 00:05:59,870
but sort of like a mirror for the real
community that existed in real life.
59
00:06:10,410 --> 00:06:16,830
Google handles 65% of all Internet
search in the US and even more overseas.
60
00:06:19,370 --> 00:06:23,370
In the early days, Google
was just a search engine.
61
00:06:23,620 --> 00:06:29,080
All the search engine was, initially, was
an index to what's on the Internet.
62
00:06:29,200 --> 00:06:33,410
The Internet was pretty small back then.
They had a better index than other
63
00:06:33,450 --> 00:06:36,120
indexes that were around; theirs
was not the first search engine.
64
00:06:36,620 --> 00:06:40,500
Page famously said that the point of
Google is to get people onto Google
65
00:06:40,580 --> 00:06:44,500
and off of Google out into the open
web as quickly as possible, and largely,
66
00:06:44,580 --> 00:06:45,870
they were really successful about that.
67
00:06:45,910 --> 00:06:50,120
The success of Google led, in many ways,
to the success of the open web
68
00:06:50,160 --> 00:06:54,950
and this explosive growth in start-ups
and innovation during that period.
69
00:06:55,080 --> 00:06:59,250
Google had developed this incredible
technology called PageRank,
70
00:06:59,330 --> 00:07:01,830
unveiling what was known as
the BackRub algorithm,
71
00:07:01,950 --> 00:07:05,870
and it was just leaps and bounds ahead
of services like Yahoo, AltaVista,
72
00:07:06,000 --> 00:07:10,200
and quickly became the clear winner in
the space just from a technology basis.
73
00:07:10,290 --> 00:07:14,000
And what PageRank does is it spiders
around the Internet instead of
74
00:07:14,080 --> 00:07:16,290
being a directory service like Yahoo was.
75
00:07:16,370 --> 00:07:20,200
It’s taking pictures of web pages
and analyzing the links between the
76
00:07:20,250 --> 00:07:24,370
web pages and using that to make
some assumptions about relevance;
77
00:07:24,410 --> 00:07:28,000
and so, if you're looking for some
history of Abraham Lincoln
78
00:07:28,120 --> 00:07:31,370
and there's a bunch of links across
the web pointing to a particular site,
79
00:07:31,410 --> 00:07:35,080
that must mean that this
page is the “most relevant.”
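To make the link analysis described above concrete, here is a minimal PageRank-style sketch in Python. The four-page link graph, page names, and damping factor are invented for illustration; Google's production system is far more elaborate.

```python
# Toy PageRank: pages that attract more inbound links earn a higher
# score. The four-page link graph and damping factor are invented.
links = {
    "lincoln-bio": ["loc-gov"],
    "loc-gov": ["lincoln-bio", "blog"],
    "blog": ["lincoln-bio"],
    "forum": ["lincoln-bio", "loc-gov"],
}

damping = 0.85
rank = {page: 1.0 / len(links) for page in links}

for _ in range(50):  # power iteration until the scores settle
    new_rank = {}
    for page in links:
        # Rank flows in from every page that links here, split evenly
        # across each linking page's outbound links.
        inbound = sum(rank[src] / len(outs)
                      for src, outs in links.items() if page in outs)
        new_rank[page] = (1 - damping) / len(links) + damping * inbound
    rank = new_rank

# The most-linked-to page ends up on top: it is deemed "most relevant".
for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page:12s} {score:.3f}")
```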
80
00:07:36,000 --> 00:07:38,950
Now, they had to figure out:
how do we make money off this?
81
00:07:39,000 --> 00:07:41,950
We've got an index. How do we make money?
82
00:07:42,160 --> 00:07:44,120
Well, they figured out how to do that.
83
00:07:44,200 --> 00:07:47,790
It's pretty simple.
All you do is track people’s searches.
84
00:07:48,120 --> 00:07:52,040
Your search history is
very, very informative.
85
00:07:52,120 --> 00:07:55,790
That's gonna tell someone immediately
whether you're Republican or Democrat,
86
00:07:55,870 --> 00:07:59,500
whether you like one
breakfast cereal versus another.
87
00:07:59,580 --> 00:08:01,620
That's gonna tell you whether
you're gay or straight.
88
00:08:01,700 --> 00:08:04,950
It's gonna tell you thousands of
things about a person over time,
89
00:08:05,000 --> 00:08:08,000
because they can see
what websites you're going to.
90
00:08:08,250 --> 00:08:09,910
And where do they go?
91
00:08:10,000 --> 00:08:14,080
Did they go to a porn site?
Did they go to a shopping site?
92
00:08:14,160 --> 00:08:16,910
Did they go to a website
looking up how to make bombs?
93
00:08:17,120 --> 00:08:19,870
These pieces of information, essentially,
are building blocks,
94
00:08:19,910 --> 00:08:25,200
they are constructing a profile
of you and that profile is real,
95
00:08:25,370 --> 00:08:29,870
it's detailed, it's granular
and it never goes away.
96
00:08:30,580 --> 00:08:33,870
Google has more than doubled
its profits in the last year
97
00:08:33,910 --> 00:08:35,790
selling Internet advertising,
98
00:08:36,160 --> 00:08:40,000
but with Microsoft and Yahoo
gunning for its search engine users,
99
00:08:40,040 --> 00:08:41,750
fortunes can quickly change.
100
00:08:43,620 --> 00:08:48,160
This was really the first large-scale
mechanism for targeted advertising.
101
00:08:48,500 --> 00:08:51,620
For example, I want to sell umbrellas.
102
00:08:52,080 --> 00:08:56,370
Well, Google could tell you exactly,
you know, who is looking for umbrellas
103
00:08:56,500 --> 00:08:58,830
because they have your
search history, they know.
104
00:08:59,080 --> 00:09:03,080
So, they could hook up the umbrella makers
with people searching for umbrellas.
105
00:09:03,200 --> 00:09:06,000
Basically send you targeted ads.
106
00:09:06,700 --> 00:09:10,700
That is where Google gets
more than 90% of its revenue.
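A rough sketch of the matching just described, with made-up users, advertisers, and keywords; real ad systems run auctions over far richer behavioral profiles.

```python
# Made-up search histories and advertiser keyword lists.
search_histories = {
    "user_a": ["weather tomorrow", "best umbrellas", "rain boots"],
    "user_b": ["dog food reviews", "puppy training tips"],
}
advertisers = {
    "AcmeUmbrellas": ["umbrella", "rain"],
    "BestDogChow": ["dog food", "puppy"],
}

# Show an advertiser's ad to every user whose history hits a keyword.
for user, history in search_histories.items():
    text = " ".join(history)
    for advertiser, keywords in advertisers.items():
        if any(kw in text for kw in keywords):
            print(f"show {advertiser} ad to {user}")
```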
107
00:09:12,500 --> 00:09:15,790
If you look at how they structured
the company, it was structured so that
108
00:09:15,910 --> 00:09:19,080
the founders of the company
would always have controlling shares
109
00:09:19,160 --> 00:09:22,620
and would always determine the
direction and the future of the company.
110
00:09:22,700 --> 00:09:24,620
Anybody else who bought into the company
111
00:09:24,700 --> 00:09:27,700
was simply along for the ride
in terms of making money.
112
00:09:27,830 --> 00:09:31,250
So, there was this idealism
that they said motivated them,
113
00:09:31,290 --> 00:09:34,950
and their corporate motto
was “don't be evil.”
114
00:09:37,200 --> 00:09:42,450
When the “don't be evil” slogan
came up - it was during a meeting,
115
00:09:42,620 --> 00:09:45,750
I think in 2003, if I recall right -
people were trying to say, “Well,
116
00:09:45,790 --> 00:09:48,450
what are Google’s values?”
and one of the engineers
117
00:09:48,580 --> 00:09:50,790
who was invited to this meeting said,
118
00:09:50,830 --> 00:09:53,200
“We could just sum it all up
by just saying ‘don't be evil.’”
119
00:09:53,500 --> 00:09:56,700
What's so interesting about this term
that they were not going to be evil,
120
00:09:56,790 --> 00:09:58,500
is they never defined what evil was.
121
00:09:58,580 --> 00:10:00,450
They never even gave us a
sense of what it meant,
122
00:10:00,580 --> 00:10:04,830
but people latched on to it and sort
of adopted it and trusted it because,
123
00:10:04,910 --> 00:10:08,040
of course, nobody wants to be
associated with an evil company,
124
00:10:08,120 --> 00:10:11,000
so people gave the term “evil”
their own meaning.
125
00:10:11,120 --> 00:10:15,080
Even their home page indicates to
you that you're not being manipulated
126
00:10:15,370 --> 00:10:19,500
and they sort of broadcast to you
the idea that this was a clean, simple,
127
00:10:19,580 --> 00:10:24,250
reliable, honest interface, and that
the company that produced it wasn't
128
00:10:24,410 --> 00:10:29,250
overtly nefarious in its operations.
So, there were no ads on the front page,
129
00:10:29,330 --> 00:10:30,910
for example, which was a big deal.
130
00:10:31,040 --> 00:10:34,580
And so, everybody in that
initial phase of excitement, of course,
131
00:10:34,700 --> 00:10:38,080
was more concerned about exploring
the possibilities of the technology
132
00:10:38,160 --> 00:10:41,750
than concerned about the potential
costs of the technology.
133
00:10:41,870 --> 00:10:46,000
It was always kind of a generous thing,
“We're gonna give you these great tools,
134
00:10:46,080 --> 00:10:48,250
we're gonna give you these
things that you can use,”
135
00:10:49,660 --> 00:10:54,250
and then Google was saying, “No, actually,
136
00:10:54,370 --> 00:10:58,870
we're gonna keep your data
and we're gonna organize society.
137
00:10:58,950 --> 00:11:01,080
We know how to improve
and optimize the world.”
138
00:11:02,000 --> 00:11:07,160
In 2007, there were a couple of trends that
I think led Google off its path of
139
00:11:07,250 --> 00:11:11,200
being the place that just matched people
with the best information on the Web
140
00:11:11,370 --> 00:11:16,660
and led them off into the Internet,
and that was the birth of smartphones.
141
00:11:16,750 --> 00:11:22,160
Steve Jobs unveiled the iPhone, and with
the rise of Facebook, this notion
142
00:11:22,250 --> 00:11:25,160
of time spent on a site, where users were
143
00:11:25,290 --> 00:11:27,580
spending lots and
lots of time on Facebook,
144
00:11:27,660 --> 00:11:31,540
became a really important metric that
advertisers were paying attention to,
145
00:11:31,660 --> 00:11:35,950
and so Google's response was basically
to deviate from that original ethos
146
00:11:36,000 --> 00:11:40,120
by becoming a place where they were
trying to keep people on Google.
147
00:11:40,330 --> 00:11:44,790
So, they very rapidly
started to branch out into other areas.
148
00:11:45,200 --> 00:11:48,830
They were getting a lot of information
from people using the search engine,
149
00:11:49,200 --> 00:11:53,410
but if people went directly to
a website, uh-oh, that's bad,
150
00:11:53,540 --> 00:11:55,790
because now Google doesn't know that.
151
00:11:56,000 --> 00:12:00,160
So, they developed a browser,
which is now the most widely
152
00:12:00,200 --> 00:12:02,160
used browser in the world: Chrome.
153
00:12:02,330 --> 00:12:06,250
By getting people to use Chrome,
they were able now to collect information
154
00:12:06,290 --> 00:12:09,660
about every single website you visited,
whether or not you were
155
00:12:09,750 --> 00:12:13,080
using their search engine, but of course,
even that's not enough, right,
156
00:12:13,120 --> 00:12:17,410
because people do a lot of things now on
their mobile devices, on their computers,
157
00:12:17,540 --> 00:12:22,000
without using their browser. You want
to know what people are doing,
158
00:12:22,040 --> 00:12:23,750
even when they're not online.
159
00:12:23,910 --> 00:12:28,120
So, Google developed an operating system,
which is called Android,
160
00:12:28,290 --> 00:12:31,790
which on mobile devices is the
dominant operating system,
161
00:12:31,870 --> 00:12:38,250
and Android records what we're doing,
even when we are not online.
162
00:12:38,580 --> 00:12:43,410
As soon as you connect to the Internet,
Android uploads to Google a complete
163
00:12:43,500 --> 00:12:47,540
history of where you've been that day,
among many other things.
164
00:12:47,620 --> 00:12:50,120
It's a progression of
surveillance that Google
165
00:12:50,200 --> 00:12:52,370
has been engaged in since the beginning.
166
00:12:52,830 --> 00:12:56,870
When Google develops
another tool for us to use,
167
00:12:56,950 --> 00:13:00,830
they're doing it not to make our
lives easier, they're doing it to get
168
00:13:00,950 --> 00:13:03,660
another source of information about us.
169
00:13:03,950 --> 00:13:06,830
These are all free services,
but obviously they're not,
170
00:13:06,870 --> 00:13:10,330
because huge complicated
machines aren't free.
171
00:13:10,620 --> 00:13:14,160
Your interaction with them is governed
so that it will generate revenue.
172
00:13:18,040 --> 00:13:22,660
And that's what Google Docs is,
and that's what Google Maps is,
173
00:13:22,750 --> 00:13:26,950
and literally now more than a hundred
different platforms that Google controls.
174
00:13:27,250 --> 00:13:30,500
Companies that use the
surveillance business model -
175
00:13:30,580 --> 00:13:35,000
the two biggest being Google and
Facebook - they don't sell you anything.
176
00:13:35,370 --> 00:13:39,870
They sell you! We are the product.
177
00:13:40,580 --> 00:13:43,250
Now, what Google will tell you is,
“We're very transparent.
178
00:13:43,330 --> 00:13:44,410
There's a user agreement.
179
00:13:44,540 --> 00:13:47,830
Everybody knows that if you're going to
search on Google and we're going to
180
00:13:47,870 --> 00:13:50,870
give you access to this free
information, we need to get paid,
181
00:13:51,000 --> 00:13:52,620
so we're going to take your data.”
182
00:13:52,700 --> 00:13:57,040
The problem is, most people don't
believe or don't want to believe
183
00:13:57,120 --> 00:14:00,200
that that data is going to
be used to manipulate you.
184
00:14:10,580 --> 00:14:13,450
The commonality between
Facebook and Google,
185
00:14:13,580 --> 00:14:15,830
even though they're
different platforms, is math.
186
00:14:15,910 --> 00:14:18,870
They're driven by math and
the ability to collect data.
187
00:14:19,000 --> 00:14:23,660
You sign on, you make a profile
about yourself by answering some questions,
188
00:14:23,790 --> 00:14:28,040
entering in some information, such as
your concentration or major at school,
189
00:14:29,410 --> 00:14:32,200
contact information about phone numbers,
190
00:14:32,250 --> 00:14:35,700
instant messaging, screen names,
anything you want to tell. Interests,
191
00:14:35,790 --> 00:14:40,290
what books you like, movies, and
most importantly, who your friends are.
192
00:14:40,910 --> 00:14:44,870
The founders of these companies were
essentially mathematicians who understood
193
00:14:44,910 --> 00:14:49,120
that by collecting this information,
you not only could monetize it
194
00:14:49,410 --> 00:14:53,790
and sell it to advertisers, but you
could actually control and steer people
195
00:14:53,870 --> 00:14:57,540
or nudge people in the direction
that you wanted them to go.
196
00:14:57,910 --> 00:15:02,000
An algorithm basically determines
how certain things should react.
197
00:15:02,120 --> 00:15:06,500
So, for example, you give it some inputs;
say, in the case of Google,
198
00:15:06,540 --> 00:15:10,080
you would enter in a query, a search
phrase, and then certain math will make
199
00:15:10,200 --> 00:15:12,910
decisions on what information
should be presented to you.
200
00:15:13,040 --> 00:15:16,540
Algorithms are used throughout the
entire process in everyday life,
201
00:15:16,660 --> 00:15:19,540
everything from search
engines to how cars drive.
202
00:15:19,660 --> 00:15:23,750
You decide the variables, and when
you have a human factor deciding
203
00:15:23,830 --> 00:15:26,290
what the variables are, it’s
gonna affect the outcome.
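A tiny sketch of that point: the two signals and weights below are invented, but they show how the variables a human picks determine which result an algorithm puts first.

```python
# Invented example: the same two results, scored with two different
# hand-picked weightings of the same two signals.
results = [
    {"title": "in-depth history site", "relevance": 0.9, "freshness": 0.2},
    {"title": "breaking-news article", "relevance": 0.5, "freshness": 0.9},
]

def winner(w_relevance, w_freshness):
    # Whoever sets these weights decides which result comes out on top.
    score = lambda r: w_relevance * r["relevance"] + w_freshness * r["freshness"]
    return max(results, key=score)["title"]

print(winner(0.8, 0.2))  # -> in-depth history site
print(winner(0.2, 0.8))  # -> breaking-news article
```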
204
00:15:26,370 --> 00:15:29,500
I want you to imagine walking into a room,
205
00:15:30,790 --> 00:15:33,040
a control room with a bunch of people,
206
00:15:33,080 --> 00:15:36,000
a hundred people hunched
over at desks with little dials,
207
00:15:36,500 --> 00:15:40,500
and that that control room
will shape the thoughts
208
00:15:40,790 --> 00:15:43,950
and feelings of a billion people.
209
00:15:45,700 --> 00:15:52,040
This might sound like science fiction,
but this actually exists right now, today.
210
00:15:52,660 --> 00:15:56,950
Our ethical presuppositions are
built into the software, but our unethical
211
00:15:57,040 --> 00:15:59,750
presuppositions are built
into the software as well,
212
00:16:00,080 --> 00:16:03,700
so whatever it is that
motivates Google is going
213
00:16:03,830 --> 00:16:07,450
to be built right into the
automated processes that sift our data.
214
00:16:07,620 --> 00:16:10,120
I used to be in one of
those control rooms.
215
00:16:10,250 --> 00:16:12,500
I was a design ethicist at Google,
216
00:16:12,790 --> 00:16:16,410
where I studied “How do you
ethically steer people's thoughts?”
217
00:16:16,750 --> 00:16:19,370
Because what we don't talk
about is a handful of people
218
00:16:19,450 --> 00:16:23,660
working at a handful of technology
companies, through their choices,
219
00:16:24,000 --> 00:16:27,370
will steer what a billion
people are thinking today.
220
00:16:27,700 --> 00:16:31,870
What Google is essentially is a
gigantic compression algorithm,
221
00:16:31,950 --> 00:16:36,330
and what a compression algorithm
does is look at a very complex environment
222
00:16:36,450 --> 00:16:38,580
and simplify it;
and that's what you want,
223
00:16:38,700 --> 00:16:42,540
when you put a term into a search engine,
you want all the possibilities
224
00:16:42,620 --> 00:16:46,500
sifted so that you find the one that
you can take action on essentially.
225
00:16:46,910 --> 00:16:51,910
And so Google simplifies the world
and presents the simplification to you,
226
00:16:52,160 --> 00:16:55,870
which is also what your perceptual
systems do, and so basically
227
00:16:55,950 --> 00:16:59,200
we're building a giant perceptual machine
228
00:16:59,290 --> 00:17:02,580
that's an intermediary between
us and the complex world.
229
00:17:02,700 --> 00:17:06,790
The problem with that is that whatever the
assumptions are that Google operates
230
00:17:06,910 --> 00:17:09,620
under are going to be the filters
231
00:17:09,790 --> 00:17:13,370
that determine how the world
is simplified and presented.
232
00:17:14,330 --> 00:17:17,750
So, when you want to find something,
a term, a person, do you Google it?
233
00:17:17,870 --> 00:17:19,660
It's really fast, no doubt about that,
234
00:17:19,750 --> 00:17:22,370
but are you getting the best
answers to your questions?
235
00:17:22,540 --> 00:17:26,580
The algorithm that they use
to show us search results,
236
00:17:26,700 --> 00:17:31,080
it was written for the precise
purpose of doing two things,
237
00:17:31,200 --> 00:17:36,200
which make it inherently biased,
and those two things are (1) filtering,
238
00:17:36,330 --> 00:17:40,700
that is to say, the algorithm has to
look at all the different web pages
239
00:17:40,790 --> 00:17:44,660
it has in its index,
and it's got billions of them,
240
00:17:44,790 --> 00:17:46,370
and it's got to make a selection.
241
00:17:46,450 --> 00:17:48,950
Let's say I type in,
“What's the best dog food?”
242
00:17:49,000 --> 00:17:54,160
It has to look up all the dog foods it has
in its database and all the websites
243
00:17:54,200 --> 00:17:56,830
associated with those dog foods -
that's the filtering,
244
00:17:56,910 --> 00:18:00,500
but then it's got to put them into an
order, and that's the ordering.
245
00:18:00,580 --> 00:18:06,830
So, the filtering is biased, because
why did it pick those websites instead
246
00:18:06,870 --> 00:18:11,120
of the other 14 billion?
And then the ordering is biased,
247
00:18:11,160 --> 00:18:16,080
because why did it put Purina first
and some other company second?
248
00:18:16,500 --> 00:18:22,200
It would be useless for us
unless it was biased.
249
00:18:22,250 --> 00:18:25,790
That's precisely what we want.
We want it to be biased.
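The two steps just described, filtering then ordering, can be sketched in a few lines. The index, topics, and scores here are fabricated; the two-step structure is the point.

```python
# Fabricated index. A query first FILTERS the index down to candidate
# pages, then ORDERS the survivors; both steps embody choices.
index = [
    {"url": "purina.example", "topic": "dog food", "score": 0.92},
    {"url": "acme-chow.example", "topic": "dog food", "score": 0.81},
    {"url": "catnip.example", "topic": "cat toys", "score": 0.99},
]

def search(query_topic):
    filtered = [p for p in index if p["topic"] == query_topic]  # step 1: filter
    return sorted(filtered, key=lambda p: -p["score"])          # step 2: order

for page in search("dog food"):
    print(page["url"])
# Which pages survive the filter, and which survivor is shown first,
# are both decisions baked into the algorithm.
```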
250
00:18:25,950 --> 00:18:30,080
So, the problem comes in, obviously,
when we're not talking about dog foods,
251
00:18:30,200 --> 00:18:32,200
but when we're talking
about things like elections.
252
00:18:32,290 --> 00:18:37,000
I am running for President
of the United States.
253
00:18:38,500 --> 00:18:41,910
Because if we're asking questions
about issues like immigration
254
00:18:42,200 --> 00:18:48,580
or about candidates, well, again, that
algorithm is gonna do those two things.
255
00:18:48,700 --> 00:18:53,450
It's gonna do the filtering, so it's gonna
do some selecting from among billions
256
00:18:53,500 --> 00:18:56,750
of web pages, and then it's
gonna put them into an order.
257
00:18:56,790 --> 00:19:00,950
It will always favor one
dog food over another,
258
00:19:02,120 --> 00:19:05,950
one online music service over another,
259
00:19:06,200 --> 00:19:11,040
one comparative shopping service over
another, and one candidate over another.
260
00:19:11,330 --> 00:19:16,250
The Google search algorithm is made
up of several hundred signals that
261
00:19:16,330 --> 00:19:20,540
we try to put together to serve
the best results for each user.
262
00:19:20,660 --> 00:19:25,580
Just last year, we launched over 500
changes to our algorithm, so by some count
263
00:19:25,700 --> 00:19:29,790
we change our algorithm almost every day,
almost twice over.
264
00:19:29,910 --> 00:19:34,160
Now it's possible that they're using
unbiased algorithms, but they're not using
265
00:19:34,200 --> 00:19:38,950
unbiased algorithms to do things like
search for unacceptable content on Twitter
266
00:19:39,000 --> 00:19:40,790
and on YouTube and on Facebook.
267
00:19:40,910 --> 00:19:42,410
Those aren't unbiased at all.
268
00:19:42,450 --> 00:19:46,160
They're built specifically
to filter out whatever's bad.
269
00:19:47,120 --> 00:19:51,080
What happens is the best technical people
with the biggest computers with the
270
00:19:51,120 --> 00:19:54,370
best bandwidth to those computers
become more empowered than the others,
271
00:19:54,450 --> 00:19:56,870
and a great example of that
is a company like Google,
272
00:19:56,950 --> 00:20:00,580
which for the most part is just scraping
the same Internet and the same data
273
00:20:00,660 --> 00:20:04,330
any of us can access and yet is able to
build this huge business of directing
274
00:20:04,410 --> 00:20:08,000
what links we see in front of us, which is
an incredible influence on the world,
275
00:20:08,080 --> 00:20:12,370
by having the bigger computers,
the better scientists, and more access.
276
00:20:12,870 --> 00:20:18,410
As a digital platform, we feel like we
are on the cutting edge of the issues
277
00:20:18,450 --> 00:20:21,290
which society is grappling
with at any given time.
278
00:20:21,450 --> 00:20:24,700
Where do you draw the line of
freedom of speech and so on?
279
00:20:25,250 --> 00:20:30,450
The question precisely is, “What's good
and bad, according to whose judgment?”
280
00:20:30,580 --> 00:20:34,870
And that's especially relevant given that
these systems serve a filtering purpose.
281
00:20:35,040 --> 00:20:40,500
They're denying or allowing us access
to information before we see and think,
282
00:20:40,830 --> 00:20:43,870
so we don't even know what's
going on behind the scenes.
283
00:20:44,000 --> 00:20:48,620
The people who are designing these systems
are building a gigantic unconscious mind
284
00:20:48,830 --> 00:20:52,700
that will filter the world for us and it's
increasingly going to be tuned
285
00:20:52,750 --> 00:20:55,910
to our desires in a manner that
will benefit the purposes
286
00:20:55,950 --> 00:20:57,660
of the people who are
building the machines.
287
00:20:58,790 --> 00:21:03,160
What these algorithms basically do is
they create bubbles for ourselves,
288
00:21:03,410 --> 00:21:08,620
because the algorithm learns, “This is
what Peter Schweizer is interested in,”
289
00:21:08,700 --> 00:21:12,620
and what does it give me when
I go online? It gives me more of myself.
290
00:21:12,950 --> 00:21:15,620
I mean, it's narrowing your
vision in some ways.
291
00:21:15,700 --> 00:21:19,200
It's funneling my vision; it's leading
me to a view of the world.
292
00:21:19,410 --> 00:21:22,700
And it may not be Facebook's view of the
world, but it's the view of the world
293
00:21:22,790 --> 00:21:25,040
that will make Facebook the most money.
294
00:21:25,290 --> 00:21:29,830
They're attempting to addict us and
they’re addicting us on the basis of data.
295
00:21:30,250 --> 00:21:34,040
The problem with that is that you
only see what you already know,
296
00:21:34,330 --> 00:21:38,040
and that's not a good thing, because
you also need to see what you don't know,
297
00:21:38,160 --> 00:21:40,410
because otherwise you can't learn.
298
00:21:40,870 --> 00:21:44,790
Facebook this morning is defending itself
against accusations of political bias.
299
00:21:44,870 --> 00:21:48,870
An article posted Monday on the tech
news site Gizmodo said Facebook
300
00:21:49,000 --> 00:21:53,700
workers suppressed conservative-leaning
news stories in its trending section.
301
00:21:54,040 --> 00:21:58,410
And it creates a terrible dynamic when it
comes to the consumption of information
302
00:21:58,500 --> 00:22:01,700
and news, because it leads
us to only see the world
303
00:22:01,790 --> 00:22:04,290
in a very, very small and narrow way.
304
00:22:04,410 --> 00:22:08,120
It makes us easier to control
and easier to manipulate.
305
00:22:17,500 --> 00:22:21,620
I think what's so troubling about
Google and Facebook is you don't know
306
00:22:21,700 --> 00:22:26,500
what you don't know, and while
the Internet provides a lot of openness,
307
00:22:26,620 --> 00:22:29,500
in the sense that you have
access to all this information,
308
00:22:29,620 --> 00:22:32,540
the fact of the matter is you
have basically two gatekeepers:
309
00:22:32,660 --> 00:22:36,410
Facebook and Google, which
control what we have access to.
310
00:22:36,830 --> 00:22:41,540
News organizations have to be prepared
that their home pages are less important.
311
00:22:41,660 --> 00:22:44,830
People are using… They're using
Facebook as a homepage;
312
00:22:45,000 --> 00:22:47,250
they're using Twitter as
their own homepage.
313
00:22:47,330 --> 00:22:52,250
Those news streams are their
portals into the world of information.
314
00:22:52,540 --> 00:22:56,950
Google and Facebook started out as
Internet companies, and that has changed.
315
00:22:57,080 --> 00:22:59,290
They've morphed into media companies,
316
00:22:59,370 --> 00:23:03,000
they've morphed into telecommunications
providers, and at least in the US,
317
00:23:03,080 --> 00:23:06,160
media companies are regulated,
newspapers are regulated,
318
00:23:06,250 --> 00:23:08,160
telecommunications
providers are regulated.
319
00:23:08,250 --> 00:23:11,330
Internet companies, social media companies
like Google and Facebook, they're not.
320
00:23:11,450 --> 00:23:13,040
So, the roles have changed.
321
00:23:13,080 --> 00:23:17,040
Studies show now that individuals
expect the news to find them.
322
00:23:18,950 --> 00:23:22,580
When we started, the number one thing
that we threw out with our company was
323
00:23:22,620 --> 00:23:25,500
that millennials aren't interested in
news. We were like, “They are.
324
00:23:25,540 --> 00:23:27,660
That doesn't make any sense;
everyone wants to be informed.”
325
00:23:27,700 --> 00:23:29,200
Of course.
It’s just that they're busy
326
00:23:29,290 --> 00:23:33,370
and you need to create a different
form of news to fit into their lives.
327
00:23:36,950 --> 00:23:41,120
There's so much active misinformation,
328
00:23:41,250 --> 00:23:43,620
and it's packaged very well and it looks
329
00:23:43,660 --> 00:23:46,250
the same when you see
it on a Facebook page.
330
00:23:46,290 --> 00:23:52,120
We can lose so much of what we've gained,
in terms of the kind of democratic
331
00:23:52,370 --> 00:23:55,450
freedoms and market-based
economies and prosperity
332
00:23:55,500 --> 00:23:57,540
that we've come to take for granted.
333
00:23:57,620 --> 00:24:01,250
Both Google and Facebook announced
they are updating their company policies
334
00:24:01,290 --> 00:24:04,450
to ban fake news sites from
using their advertising networks.
335
00:24:04,540 --> 00:24:07,750
Both tech giants have come under
scrutiny in recent weeks for presenting
336
00:24:07,830 --> 00:24:10,290
fake news during the
presidential election.
337
00:24:10,370 --> 00:24:14,120
Well, fake news is not even really a
description of anything anymore.
338
00:24:14,200 --> 00:24:19,370
It's a weapon, it gets tossed around.
If there's a real news story that somebody
339
00:24:19,410 --> 00:24:21,910
doesn't like, they just
dismiss it as fake news.
340
00:24:22,000 --> 00:24:25,450
It's kind of a way of trying to get it
to go away and it's being misused.
341
00:24:25,540 --> 00:24:28,040
It's being used as a way to dismiss
342
00:24:28,120 --> 00:24:31,160
actual factual evidence
that something has happened.
343
00:24:31,290 --> 00:24:34,950
Right now, in the news, you'll
see a lot of concern with
344
00:24:35,040 --> 00:24:37,200
so-called fake news stories.
345
00:24:37,410 --> 00:24:41,830
There are three reasons why we
should not worry about fake news stories.
346
00:24:42,120 --> 00:24:47,750
Fake news is competitive. You can
place your fake news stories about me
347
00:24:47,830 --> 00:24:51,410
and I can place my fake news stories
about you; it’s completely competitive.
348
00:24:51,500 --> 00:24:53,750
There have always been fake news stories.
349
00:24:53,830 --> 00:24:57,830
There have always been fake TV
commercials and fake billboards.
350
00:24:57,950 --> 00:25:03,160
That has always existed and always will.
Second, you can see them.
351
00:25:03,700 --> 00:25:07,000
They're visible; you can actually
see them with your eyeballs.
352
00:25:07,120 --> 00:25:12,330
Because you can see them, that
means you can weigh in on them.
353
00:25:12,620 --> 00:25:16,080
You also know that there's a
human element, because usually,
354
00:25:16,160 --> 00:25:18,500
there's a name of an author
on the fake news stories.
355
00:25:19,290 --> 00:25:22,250
You know, you can
decide just to ignore them.
356
00:25:23,160 --> 00:25:25,580
Generally speaking, most news stories,
357
00:25:25,660 --> 00:25:29,080
like most advertisements,
are not actually read by anyone,
358
00:25:29,160 --> 00:25:31,120
so you put them in front
of someone's eyeballs,
359
00:25:31,160 --> 00:25:33,660
and people just kind of
skip right over them.
360
00:25:33,790 --> 00:25:37,410
The third reason why fake news
stories are not really an issue
361
00:25:37,500 --> 00:25:40,200
is because of something
called confirmation bias.
362
00:25:40,330 --> 00:25:43,580
We pay attention to
and actually believe things
363
00:25:43,700 --> 00:25:46,080
if they support
what we already believe,
364
00:25:46,200 --> 00:25:50,290
so if you see a fake news story
that says Hillary Clinton is the devil,
365
00:25:50,450 --> 00:25:53,000
but you love Hillary Clinton,
you're not gonna buy it.
366
00:25:53,080 --> 00:25:55,660
You're gonna say,
“That's ridiculous.” On the other hand,
367
00:25:55,750 --> 00:25:57,620
if you happen to think
Hillary Clinton is the devil,
368
00:25:57,660 --> 00:26:00,040
you're gonna go, “Yes! I knew that.”
369
00:26:00,200 --> 00:26:05,870
So, you don't really change people's
opinions with fake news stories;
370
00:26:06,080 --> 00:26:09,290
you support beliefs that
people already have.
371
00:26:09,410 --> 00:26:15,950
The real problem is these new forms
of influence that (1) are not competitive;
372
00:26:16,000 --> 00:26:20,540
(2) people can't see and (3) are
not subject to confirmation bias.
373
00:26:21,000 --> 00:26:24,700
If you can't see something,
confirmation bias can't kick in,
374
00:26:24,950 --> 00:26:31,080
so we're in a world right now in which
our opinions, beliefs, attitudes,
375
00:26:31,290 --> 00:26:37,450
voting preferences, purchases are
all being pushed one way or another
376
00:26:37,540 --> 00:26:41,910
every single day by forces
that we cannot see.
377
00:26:43,750 --> 00:26:48,330
The real problem of manipulation
of news and information is that it's secret.
378
00:26:48,500 --> 00:26:51,910
It's what Google and Facebook
are doing on a regular basis.
379
00:26:51,950 --> 00:26:55,290
By suppressing stories,
by steering us towards
380
00:26:55,370 --> 00:26:58,540
other stories rather than the
stories we’re actually seeking,
381
00:26:58,620 --> 00:27:01,370
that's the real
manipulation that's going on.
382
00:27:01,450 --> 00:27:06,040
Fake news is the shiny object
they want us to chase that really isn't
383
00:27:06,120 --> 00:27:08,500
relevant to the problems
we're facing today.
384
00:27:08,910 --> 00:27:12,620
We use Google so extensively;
the average person does,
385
00:27:12,700 --> 00:27:15,830
you know, multiple queries per day,
and some more than others,
386
00:27:15,950 --> 00:27:19,750
and over time, you don't realize
that it's impacting you.
387
00:27:19,870 --> 00:27:24,290
And we already know that people are not
necessarily fact-checking on their own;
388
00:27:24,370 --> 00:27:28,120
they rely on media companies,
they rely on mainstream media that rely on
389
00:27:28,250 --> 00:27:30,870
companies like Google to give
them answers about the world
390
00:27:31,000 --> 00:27:34,290
without necessarily
looking at, “Is that true?”
391
00:27:34,500 --> 00:27:38,950
Facebook has become the number
one source of news for Americans.
392
00:27:39,200 --> 00:27:44,790
Well, gee, what if they were biasing the
newsfeed toward one cause or another,
393
00:27:44,870 --> 00:27:49,200
one candidate or another? Would we have
any way of knowing that? Of course not.
394
00:27:51,540 --> 00:27:53,830
And we don't know what
Facebook's aims are.
395
00:27:53,910 --> 00:27:57,540
In fact, Facebook doesn't know what
its aims are because it's going to be the
396
00:27:57,620 --> 00:28:01,540
sum total of all the people
who are working on these algorithms.
397
00:28:01,660 --> 00:28:05,040
A whistleblower, someone
who used to work for Facebook,
398
00:28:05,160 --> 00:28:07,120
came forward last year and said,
399
00:28:07,250 --> 00:28:11,250
“I was one of the news
curators at Facebook.
400
00:28:11,330 --> 00:28:16,620
A bunch of us used to sit around every
day and we used to remove stories
401
00:28:16,660 --> 00:28:19,580
from the newsfeed that were too
conservative, and now and then,
402
00:28:19,620 --> 00:28:23,830
we’d inject a story that we
just thought was really cool.”
403
00:28:25,500 --> 00:28:29,750
Facebook founder Mark Zuckerberg says
he's committed to giving everyone a voice.
404
00:28:29,910 --> 00:28:32,750
He's responding to an
allegation that Facebook
405
00:28:32,870 --> 00:28:36,750
edits conservative views
out of its trending topics.
406
00:28:37,040 --> 00:28:40,700
They can suppress certain types
of results based on what they think
407
00:28:40,750 --> 00:28:44,040
you should be seeing, based on
what your followers are presenting.
408
00:28:44,120 --> 00:28:47,330
Now a new report claims that according
to a former Facebook employee,
409
00:28:47,410 --> 00:28:51,540
the social media mega-company sometimes
ignores what is actually trending
410
00:28:51,580 --> 00:28:56,750
among its billion users, if the story
originated from a conservative
411
00:28:56,790 --> 00:29:01,540
news source or if it's a topic
causing buzz among conservatives.
412
00:29:03,660 --> 00:29:06,080
Facebook constantly
manipulates their users.
413
00:29:06,200 --> 00:29:08,870
They do it by the things that they
insert into the news feeds,
414
00:29:09,000 --> 00:29:12,330
they do it by the types of posts
they allow their users to see,
415
00:29:12,500 --> 00:29:14,870
and the fact that they
actually decided to do
416
00:29:14,950 --> 00:29:17,250
psychological experiments
on their users
417
00:29:17,290 --> 00:29:21,620
is something that I think a lot of
people need to really fully understand,
418
00:29:21,700 --> 00:29:25,580
and they were doing it based upon
the different things that
419
00:29:25,700 --> 00:29:29,370
people posted; they wanted to see
how other people would react to them.
420
00:29:29,500 --> 00:29:32,500
On HealthWatch, what your
Facebook friends post
421
00:29:32,580 --> 00:29:34,620
can have a direct effect on your mood.
422
00:29:34,700 --> 00:29:37,540
New research shows the
more negative posts you see,
423
00:29:37,660 --> 00:29:40,120
the more negative you could become.
424
00:29:40,540 --> 00:29:44,410
So, for example, let's say somebody
wanted to post something
425
00:29:44,500 --> 00:29:47,870
on the newsfeed that was
a very negative story,
426
00:29:48,040 --> 00:29:50,200
they wanted to see how
their users would react
427
00:29:50,370 --> 00:29:54,290
via their likes, via their statements,
via their posts, and they would show
428
00:29:54,370 --> 00:29:58,660
people who already had a predilection
to maybe having some depression
429
00:29:58,750 --> 00:30:00,910
or maybe having some
other issues in their lives;
430
00:30:01,000 --> 00:30:04,160
and they can figure that out based upon
your likes, based upon your connections,
431
00:30:04,200 --> 00:30:07,870
based upon where you're going, and
so what they wanted to do was take that
432
00:30:07,950 --> 00:30:10,540
information and then use it to basically
433
00:30:10,620 --> 00:30:13,040
weaponize this information
against their users,
434
00:30:13,080 --> 00:30:17,330
so that way their users could see
different things that may affect their
435
00:30:17,370 --> 00:30:20,250
mood and may affect how they
interact with others.
436
00:30:20,330 --> 00:30:22,500
And that's something
that is highly unethical.
437
00:30:22,580 --> 00:30:26,540
It appears that some young people
may have been so affected by this
438
00:30:26,660 --> 00:30:29,870
that they may have done harm to themselves
based upon what they saw on their
439
00:30:29,910 --> 00:30:33,500
Facebook feed, and it was all
because of these experiments.
440
00:30:33,750 --> 00:30:35,950
The thing is that we have no
standing with Facebook.
441
00:30:36,040 --> 00:30:38,750
We're not citizens of Facebook;
we have no vote on Facebook.
442
00:30:38,790 --> 00:30:44,620
It's not a democracy, and this process is
not a way we can design the future.
443
00:30:44,700 --> 00:30:48,830
We can't rely on this single company
to invent our digital future.
444
00:30:50,660 --> 00:30:53,700
Basically, there is legislation
in this country that says that
445
00:30:53,790 --> 00:30:57,830
if you are a platform, you are not liable
for what people publish on you;
446
00:30:57,910 --> 00:31:00,120
however, if you start to edit
447
00:31:00,200 --> 00:31:03,700
what people publish on your platform,
then your legal obligations increase.
448
00:31:03,950 --> 00:31:07,660
We really, I think, should be concerned
when these companies that represent
449
00:31:07,790 --> 00:31:11,370
themselves as neutral platforms are
actually exercising editorial control and
450
00:31:11,410 --> 00:31:15,450
pushing particular political viewpoints,
because they're enjoying a special
451
00:31:15,500 --> 00:31:20,160
protected legal status on the theory
that they're providing a neutral platform
452
00:31:20,250 --> 00:31:23,080
for people to speak, to have a
diversity of views and so forth;
453
00:31:23,160 --> 00:31:26,660
and that status is based on section 230
of the Communications Decency Act,
454
00:31:26,700 --> 00:31:30,410
which protects them from legal liability,
and so this is the magic sauce
455
00:31:30,540 --> 00:31:33,750
from a legal standpoint that makes these
large platform companies possible,
456
00:31:33,830 --> 00:31:37,790
is they're not legally responsible for the
things that people say on their sites.
457
00:31:37,870 --> 00:31:40,200
It's the people who say
the things on their sites
458
00:31:40,330 --> 00:31:41,750
who bear legal responsibility.
459
00:31:41,830 --> 00:31:45,410
If they had to be legally responsible for
everything that everyone says on
460
00:31:45,500 --> 00:31:47,200
their sites, their business
model would break down.
461
00:31:47,290 --> 00:31:48,410
It wouldn't work very well.
462
00:31:48,500 --> 00:31:53,290
But I think that there's something wrong
when you use the legal liability
463
00:31:53,330 --> 00:31:56,750
that you were given to be a neutral
platform at the same time you're
464
00:31:56,830 --> 00:31:58,910
exercising very aggressive
editorial control
465
00:31:59,000 --> 00:32:00,950
to essentially pick sides politically.
466
00:32:01,290 --> 00:32:03,660
Does Facebook consider itself
to be a neutral public forum?
467
00:32:03,750 --> 00:32:06,910
The representatives of your company
have given conflicting answers on this.
468
00:32:07,120 --> 00:32:10,120
Are you a first amendment
speaker expressing your views
469
00:32:10,250 --> 00:32:13,620
or are you a neutral public forum
allowing everyone to speak?
470
00:32:13,830 --> 00:32:16,200
Senator, here's how we think about this.
471
00:32:16,330 --> 00:32:17,950
I don't believe that…
472
00:32:19,290 --> 00:32:23,620
There's certain content that clearly
we do not allow, right? Hate speech,
473
00:32:23,750 --> 00:32:25,950
terrorist content,
474
00:32:26,660 --> 00:32:30,370
nudity, anything that makes people
feel unsafe in the community.
475
00:32:30,700 --> 00:32:32,500
From that perspective, that's
476
00:32:32,580 --> 00:32:36,370
why we generally try to refer to
what we do as a platform for all ideas.
477
00:32:36,410 --> 00:32:38,120
Let me try this, because
the time is constrained.
478
00:32:38,830 --> 00:32:45,200
It's just a simple question. The predicate
for section 230 immunity under the CDA is
479
00:32:45,250 --> 00:32:47,410
that you are a neutral public forum.
480
00:32:47,500 --> 00:32:50,250
Do you consider yourself a neutral
public forum or are you engaged
481
00:32:50,370 --> 00:32:53,250
in political speech, which is your right
under the First Amendment?
482
00:32:53,540 --> 00:32:57,500
Well, Senator, our goal is certainly
not to engage in political speech.
483
00:32:57,620 --> 00:33:01,750
I'm not that familiar with the specific
legal language of the law that
484
00:33:01,830 --> 00:33:05,330
you speak to, so I would need to
follow up with you on that.
485
00:33:05,500 --> 00:33:09,120
If you're going to continue to behave
the way that you behave now,
486
00:33:09,330 --> 00:33:14,040
which is by editing content, filtering it
and steering the conversation the way
487
00:33:14,120 --> 00:33:18,040
you want it to go for political purposes,
you are going to be regulated
488
00:33:18,200 --> 00:33:21,040
every bit as much as any media company.
489
00:33:21,330 --> 00:33:24,750
So, what makes sense with this
approach is that it essentially allows the
490
00:33:24,790 --> 00:33:27,910
tech companies to decide,
now that they've grown up,
491
00:33:28,040 --> 00:33:29,750
what do they actually want to be?
492
00:33:29,910 --> 00:33:32,950
And it's a choice that they should make,
but they should be forced to
493
00:33:33,000 --> 00:33:38,250
make the choice and not hide
behind this fraud of legislation
494
00:33:38,290 --> 00:33:41,290
that gives them a free hand when
they don't deserve a free hand.
495
00:33:50,160 --> 00:33:54,950
There's what I call “the creepy line”,
and the Google policy about a lot of
496
00:33:55,000 --> 00:33:57,540
these things is to get right up to
the creepy line, but not cross it.
497
00:33:57,660 --> 00:34:01,250
Google crosses the creepy line every day.
498
00:34:01,950 --> 00:34:04,870
Not only does Google cross the creepy line,
499
00:34:04,950 --> 00:34:07,500
the location of that line keeps shifting.
500
00:34:08,330 --> 00:34:10,620
Well, it's an interesting word, “creepy”,
501
00:34:10,700 --> 00:34:13,660
right? Because it's a word
that connotes horror.
502
00:34:14,000 --> 00:34:17,120
He didn't say, “Dangerous,”
he didn't say, “Unethical.”
503
00:34:17,250 --> 00:34:20,910
There's all sorts of words that could have
fit in that slot. He said, “Creepy.”
504
00:34:21,330 --> 00:34:24,870
And a creep is someone who
creeps around and follows you
505
00:34:24,910 --> 00:34:29,370
and spies on you for unsavory purposes,
right? That's the definition of a creep.
506
00:34:29,450 --> 00:34:33,830
You know, I don't think the
typical ethical person says,
507
00:34:34,160 --> 00:34:37,750
“I'm going to push right up to the
line of creepy and stop there.”
508
00:34:38,040 --> 00:34:41,870
You know, they say something more like,
“How about we don't get anywhere near
509
00:34:41,950 --> 00:34:48,620
the creepy line, so that we're never
engaging in even pseudo-creepy behavior?”
510
00:34:48,870 --> 00:34:51,750
Because creepy is really bad,
you know, it's…
511
00:34:52,910 --> 00:34:55,500
A creepy mugger is worse than a mugger.
512
00:34:55,660 --> 00:34:59,370
The mugger wants your money. God only
knows what the creepy mugger wants;
513
00:34:59,540 --> 00:35:01,000
it's more than your money.
514
00:35:01,160 --> 00:35:02,700
You give Google a lot of information.
515
00:35:02,830 --> 00:35:06,120
You’re searching for the most private
stuff on Google, you're searching about,
516
00:35:06,200 --> 00:35:09,910
you know, illnesses and diseases that
your family has, you're searching about
517
00:35:10,160 --> 00:35:14,580
things that you might not want your wife
or your spouse to know about,
518
00:35:14,700 --> 00:35:18,950
and you’re telling Google more than you
would tell a family member or your spouse
519
00:35:19,040 --> 00:35:20,540
or a very close friend,
520
00:35:20,750 --> 00:35:25,830
and Google has so much
information about you, that it's scary.
521
00:35:26,040 --> 00:35:29,700
For Google alone, we're studying
three very powerful ways in
522
00:35:29,750 --> 00:35:32,660
which they're impacting people's opinions.
523
00:35:33,750 --> 00:35:36,700
This phenomenon that we discovered
a few years ago, “SEME”,
524
00:35:36,790 --> 00:35:40,120
the search engine manipulation effect,
is a list effect.
525
00:35:40,250 --> 00:35:44,080
A list effect is an effect that a list
has on some aspects
526
00:35:44,160 --> 00:35:46,040
of our cognitive functioning.
527
00:35:46,120 --> 00:35:49,450
What's higher in a list is easier
to remember, for example.
528
00:35:49,700 --> 00:35:53,660
SEME is an example of a list effect
but with a difference, because
529
00:35:53,830 --> 00:35:59,370
it's the only list effect I'm aware of
that is supported by a daily regimen of
530
00:35:59,450 --> 00:36:04,540
operant conditioning that tells us that
what's at the top of search results is
531
00:36:04,660 --> 00:36:07,750
better and truer than
what's lower in the list.
532
00:36:07,910 --> 00:36:12,540
It's gotten to a point that fewer than
5% of people click beyond page one,
533
00:36:12,620 --> 00:36:16,080
because the stuff that’s on page
one and typically at the top
534
00:36:16,160 --> 00:36:19,370
in that sort of above-the-fold
position is the best.
535
00:36:20,200 --> 00:36:24,870
Most of the searches that we conduct
are just simple routine searches.
536
00:36:24,910 --> 00:36:27,580
What is the capital of Kansas?
537
00:36:27,830 --> 00:36:31,290
And on those simple
searches over and over again,
538
00:36:31,370 --> 00:36:34,200
the correct answer
arises right at the top.
539
00:36:34,290 --> 00:36:40,580
So, we are learning over and over and
over again, and then the day comes
540
00:36:40,750 --> 00:36:44,370
when we look something up that
we're really unsure about, like,
541
00:36:44,450 --> 00:36:47,750
where should I go on vacation?
Which car should I buy?
542
00:36:47,950 --> 00:36:49,910
Which candidate should I vote for?
543
00:36:50,040 --> 00:36:53,080
And we do our little search
and there are our results,
544
00:36:53,160 --> 00:36:59,450
and what do we know? What’s at the top is
better. What's at the top is truer.
545
00:36:59,540 --> 00:37:04,500
That is why it's so easy to use search
results to shift people's opinions.
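A small simulation of the list effect being described, using invented click-through rates that decay by position; the rates and page names are illustrative, not measured values.

```python
import random

# Invented click-through rates that fall off sharply by position,
# mimicking the habit described above: what's at the top gets the clicks.
CTR_BY_POSITION = [0.30, 0.15, 0.08, 0.04, 0.02]

def simulate_clicks(ranking, trials=100_000, seed=1):
    rng = random.Random(seed)
    clicks = {item: 0 for item in ranking}
    for _ in range(trials):
        for item, ctr in zip(ranking, CTR_BY_POSITION):
            if rng.random() < ctr:
                clicks[item] += 1
                break  # the user stops at the first result they click
    return clicks

results = ["pro-candidate page", "neutral page A", "neutral page B",
           "anti-candidate page", "unrelated page"]
print(simulate_clicks(results))
print(simulate_clicks(list(reversed(results))))
# Same five pages, opposite order: attention flips with position alone.
```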
546
00:37:05,000 --> 00:37:10,000
This kind of phenomenon, this is really
scary compared to something like fake news
547
00:37:10,080 --> 00:37:13,370
because it's invisible.
There's a manipulation occurring.
548
00:37:13,410 --> 00:37:17,370
They can't see it. It's not competitive
because, well, for one thing,
549
00:37:17,410 --> 00:37:20,160
Google for all intents and
purposes has no competitors.
550
00:37:20,200 --> 00:37:25,830
Google controls 90% of
search in most of the world.
551
00:37:25,910 --> 00:37:29,910
Google is only going to show
you one list, in one order,
552
00:37:30,040 --> 00:37:32,580
which has a dramatic
impact on the decisions
553
00:37:32,620 --> 00:37:35,330
people make, on people's
opinions, beliefs and so on.
554
00:37:36,950 --> 00:37:40,250
The second thing they do is
what you might call the autofill.
555
00:37:40,370 --> 00:37:45,040
If you are searching for a political
candidate, Congressman John Smith,
556
00:37:45,160 --> 00:37:48,410
you type in Congressman John Smith
and they will give you several options
557
00:37:48,500 --> 00:37:52,700
that will be a prompt, as it were, as
to what you might be looking for.
558
00:37:52,870 --> 00:37:57,370
If you have four positive search
suggestions for a candidate,
559
00:37:57,580 --> 00:37:59,250
well, guess what happens?
560
00:37:59,330 --> 00:38:00,790
SEME happens.
561
00:38:00,910 --> 00:38:06,370
People whose opinions can be shifted shift
because people are likely to
562
00:38:06,410 --> 00:38:09,200
click on one of those suggestions
that's gonna bring them
563
00:38:09,410 --> 00:38:13,000
search results, obviously,
that favor that candidate.
564
00:38:13,160 --> 00:38:17,080
That is going to connect them to web
pages that favor that candidate,
565
00:38:17,200 --> 00:38:22,450
but now we know that if you allow just
one negative to appear in that list,
566
00:38:22,580 --> 00:38:27,160
it wipes out the shift completely, because
of a phenomenon called negativity bias.
567
00:38:27,290 --> 00:38:32,790
One negative can, in some demographic
groups, draw 10 to 15 times as many
568
00:38:32,870 --> 00:38:35,290
clicks as a neutral item in the same list.
569
00:38:35,410 --> 00:38:42,120
Turns out that Google is manipulating your
opinions from the very first character
570
00:38:42,250 --> 00:38:44,330
that you type into
the search bar.
571
00:38:44,450 --> 00:38:49,410
So, this is an incredibly powerful and
simple means of manipulation.
572
00:38:49,580 --> 00:38:51,500
If you are a search engine company,
573
00:38:51,540 --> 00:38:55,950
and you want to support one candidate or
one cause or one product or one company,
574
00:38:56,040 --> 00:39:00,200
just suppress negatives in
your search suggestions.
575
00:39:00,370 --> 00:39:05,290
All kinds of people over time will shift
their opinions in the direction
576
00:39:05,370 --> 00:39:09,120
you want them to shift, but you
don't suppress negatives for the
577
00:39:09,160 --> 00:39:11,910
other candidate, the other cause,
the other product.
578
00:39:12,080 --> 00:39:15,250
And then third, very often
Google gives you a box,
579
00:39:15,330 --> 00:39:18,290
so they will give you search results,
but they're below the box
580
00:39:18,370 --> 00:39:22,200
and up above is a box and the
box just gives you the answer.
581
00:39:22,950 --> 00:39:25,750
When it comes to local search,
which it turns out is the most
582
00:39:25,790 --> 00:39:30,160
common thing people do on Google,
40% of all searches, it's not so clear.
583
00:39:30,330 --> 00:39:34,160
If I'm doing a search for a pediatrician
in Scranton, Pennsylvania,
584
00:39:34,250 --> 00:39:39,790
what happens is Google still bisects the
page, shoves the organic, meritocracy-
585
00:39:39,870 --> 00:39:44,790
based information far down the page
and plops its answer box up at the top,
586
00:39:44,910 --> 00:39:49,160
but it's populating that box with its own,
sort of, restricted set of
587
00:39:49,250 --> 00:39:52,790
information that it's, kind of, drawing
from its own kind of proprietary sandbox.
588
00:39:52,950 --> 00:39:56,660
These are just Google's reviews that it's
attempted to collect over the years.
589
00:39:56,790 --> 00:40:02,040
Users are habituated to assume the stuff
at the top is the best, but instead,
590
00:40:02,080 --> 00:40:07,290
it's just what Google wants them to see,
and it looks the same, and what happens is
591
00:40:07,370 --> 00:40:11,330
Google’s basically able to put its hand
on the scale and create this direct
592
00:40:11,370 --> 00:40:15,620
consumer harm, because that mom searching
for the pediatrician in Scranton is not
593
00:40:15,700 --> 00:40:19,500
finding the highest-rated pediatrician
according to Google's own algorithm.
594
00:40:19,580 --> 00:40:23,700
She's just getting served the Google
thing, no matter what, and that,
595
00:40:23,750 --> 00:40:27,370
I think, is leading to, you know,
terrible outcomes in the offline world.
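A sketch of the page layout being criticized, with invented listings: the answer box draws only from the platform's own pool and sits above the organic ranking.

```python
# Invented listings: organic results ranked by rating vs. an answer
# box filled only from the platform's own review pool.
organic = [("Dr. Alvarez", 4.9), ("Dr. Brennan", 4.7), ("Dr. Chu", 4.5)]
own_pool = [("Dr. Chu", 3.8)]  # only listings the platform itself collected

def render_page():
    box_name, _ = max(own_pool, key=lambda r: r[1])  # box ignores the open web
    print("ANSWER BOX:", box_name)
    print("--- organic results, pushed below the box ---")
    for name, rating in sorted(organic, key=lambda r: -r[1]):
        print(f"{name} ({rating})")

render_page()
# The top result the organic ranking would have shown never appears first.
```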
596
00:40:27,580 --> 00:40:31,660
So, Google has at its disposal
on the search engine itself,
597
00:40:31,790 --> 00:40:37,000
at least three different ways of impacting
your opinion, and it is using them.
598
00:40:37,120 --> 00:40:43,910
We're talking about a single company
having the power to shift the opinions of
599
00:40:44,000 --> 00:40:47,450
literally billions of people,
without anyone having the
600
00:40:47,500 --> 00:40:49,250
slightest idea that they're doing so.
601
00:40:58,410 --> 00:41:00,290
Let me ask you, guys,
about… Switching gears,
602
00:41:00,370 --> 00:41:03,000
I wanna ask you about sort of
privacy and information.
603
00:41:03,120 --> 00:41:06,080
We have a phone that we were
talking about, it will be, I guess,
604
00:41:06,160 --> 00:41:09,830
technically always on,
we have Google which basically
605
00:41:10,330 --> 00:41:13,410
maybe knows what I'm going to
ask for before I ask for it.
606
00:41:13,450 --> 00:41:17,120
It finishes my search request; it knows
everything that's going on in my household.
607
00:41:17,200 --> 00:41:19,250
Should I be concerned about
how much you know about me?
608
00:41:19,950 --> 00:41:23,000
We've had this question for more
than a decade at Google,
609
00:41:23,160 --> 00:41:24,950
and our answer is always the same.
610
00:41:25,080 --> 00:41:27,750
You have control over the
information that we have about you,
611
00:41:27,830 --> 00:41:31,200
you can actually have it deleted,
you can also search anonymously,
612
00:41:31,330 --> 00:41:35,450
you get to choose not to give us this
information, but if you give it to us,
613
00:41:35,580 --> 00:41:38,330
we can do a better job of making
better services for you,
614
00:41:38,450 --> 00:41:40,160
and I think that's the right answer.
615
00:41:40,250 --> 00:41:43,160
These computer systems
naturally collect this data,
616
00:41:43,330 --> 00:41:46,540
but we also forget it after a while,
and we've written down why and how.
617
00:41:51,540 --> 00:41:56,120
Google wants you very much to have
privacy from everyone except them.
618
00:41:57,370 --> 00:42:01,250
The first thing we all should
do is quit using Gmail.
619
00:42:01,330 --> 00:42:05,200
Google stores and analyzes all Gmails that
620
00:42:05,290 --> 00:42:09,080
we write and all the
incoming emails that we read.
621
00:42:09,160 --> 00:42:12,790
The ones coming in from other
email services, Google tracks it all.
622
00:42:12,950 --> 00:42:18,410
They not only track the Gmails that you
write, they track the drafts of those
623
00:42:18,500 --> 00:42:21,540
crazy emails that you decided not to send.
624
00:42:23,250 --> 00:42:30,200
And the information from Gmail is then put
into one's personal profile and becomes
625
00:42:30,250 --> 00:42:34,750
another source of information they have
for sending people targeted ads.
626
00:42:35,330 --> 00:42:39,500
Tech giants are increasingly under
scrutiny from politicians, regulators
627
00:42:39,580 --> 00:42:41,830
and experts on the left and the right.
628
00:42:41,950 --> 00:42:43,950
Some are concerned
about their growing power,
629
00:42:44,080 --> 00:42:47,370
even calling them monopolies,
and the tension keeps building.
630
00:42:47,500 --> 00:42:51,040
Yet their role in contemporary
life certainly isn't shrinking.
631
00:42:51,250 --> 00:42:54,620
We too at the NewsHour have worked
and collaborated with Facebook,
632
00:42:54,700 --> 00:42:56,870
Google and many other
new media businesses.
633
00:42:57,000 --> 00:43:00,450
Journalists whom I communicate
with regularly about these issues,
634
00:43:00,540 --> 00:43:05,700
including journalists at Time magazine,
The Guardian, The Hill,
635
00:43:05,910 --> 00:43:08,910
I could go on and on. They're using Gmail.
636
00:43:09,040 --> 00:43:13,290
Not only that, their companies
are using a form of Gmail,
637
00:43:13,450 --> 00:43:16,290
their emails are running
through Google's servers.
638
00:43:16,450 --> 00:43:19,620
All of their incoming and
outgoing communications
639
00:43:19,750 --> 00:43:22,250
are being monitored by Google.
640
00:43:22,370 --> 00:43:24,950
This is happening also
in major universities.
641
00:43:25,040 --> 00:43:28,450
If you entered the University of
California after a certain date,
642
00:43:28,540 --> 00:43:29,700
guess what?
643
00:43:29,750 --> 00:43:32,120
You're using Gmail whether
you know it or not.
644
00:43:32,250 --> 00:43:36,500
And it was found that Google was,
yes, in fact, scanning student emails
645
00:43:36,580 --> 00:43:41,410
for non-educational purposes. And what
were those non-educational purposes?
646
00:43:41,700 --> 00:43:44,910
We still don't know to this day because
Google still has not come clean with
647
00:43:45,160 --> 00:43:46,410
what they were doing with the information.
648
00:43:46,450 --> 00:43:49,700
They still haven't certified that
they have deleted that data.
649
00:43:49,830 --> 00:43:52,290
The big challenge was
when I found out that Google was
650
00:43:52,370 --> 00:43:54,040
scanning our student emails
651
00:43:54,120 --> 00:43:57,910
for advertising purposes and
other non-educational purposes.
652
00:43:58,000 --> 00:44:01,750
I went to my state lawmakers
and I tried to encourage them
653
00:44:01,870 --> 00:44:05,160
to introduce legislation
that would ban these practices.
654
00:44:05,290 --> 00:44:09,540
When I went to the State House, who
was there lobbying against any stronger
655
00:44:09,750 --> 00:44:14,250
privacy protections for our kids?
It was Google and Facebook.
656
00:44:17,330 --> 00:44:20,250
Well, I guess it makes sense
that so many parts of our life
657
00:44:20,330 --> 00:44:22,160
would be influenced by Google,
658
00:44:22,290 --> 00:44:25,830
but guess what?
The Federal Government runs on Google,
659
00:44:25,950 --> 00:44:29,330
so if you're counting on the Federal
Government to regulate Google,
660
00:44:29,410 --> 00:44:32,660
realize that the Federal
Government is using Google Docs.
661
00:44:32,830 --> 00:44:34,790
They're using Google tools.
662
00:44:34,950 --> 00:44:40,830
In some cases, they're even linked to
Gmail, so Google has fused itself with
663
00:44:40,910 --> 00:44:43,160
our Federal Government in so many ways.
664
00:44:43,620 --> 00:44:47,000
The government's motivation for using
it, as well, was that Google's already done it,
665
00:44:47,160 --> 00:44:49,200
and they have this thing that works.
666
00:44:49,330 --> 00:44:52,200
It seems to be working pretty well for
consumers, so the idea behind it was,
667
00:44:52,330 --> 00:44:56,000
“Well, hey, we'll just put some government
agencies on Google documents,”
668
00:44:56,370 --> 00:44:58,750
or, you know, “We'll use
Google Cloud for some things.”
669
00:44:58,870 --> 00:45:02,620
They lack the technical expertise to
have the necessary security on it,
670
00:45:03,000 --> 00:45:06,330
and then it becomes a big embarrassment
afterwards to try and keep it out of the
671
00:45:06,410 --> 00:45:07,790
media because people would rightfully
672
00:45:07,830 --> 00:45:10,410
point out that there's
a huge security risk.
673
00:45:10,540 --> 00:45:14,450
Some members of Congress are calling
for an investigation of Google
674
00:45:14,580 --> 00:45:18,370
for secretly tracking iPhone
users all over the internet.
675
00:45:18,500 --> 00:45:22,450
Even users who thought they had
blocked that kind of surveillance.
676
00:45:23,160 --> 00:45:28,290
You see this pattern emerging over and
over again, so look at Google Street View.
677
00:45:28,660 --> 00:45:32,160
Google agreeing to pay $7
million to settle with 38 states
678
00:45:32,250 --> 00:45:36,040
over its Street View cars' collection of
data from unsecured Wi-Fi networks.
679
00:45:36,160 --> 00:45:39,290
The Street View vehicles were equipped with
antennas, and open-source software
680
00:45:39,370 --> 00:45:41,580
gathered network
identification information,
681
00:45:41,700 --> 00:45:43,660
as well as data frames and payload
682
00:45:43,750 --> 00:45:46,790
data being transmitted over
unsecured wireless networks
683
00:45:46,830 --> 00:45:48,580
as the cars were driving by.
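For readers curious about the technical side, the "network identification information" mentioned here is broadcast openly in Wi-Fi beacon frames. Below is a minimal sketch of logging it with the scapy library; the interface name is an assumption, a wireless card in monitor mode is required, and the payload capture that Google was actually fined over is deliberately not shown.

```python
# Sketch of the "network identification" half of that collection: logging
# SSID/BSSID pairs from Wi-Fi beacon frames, which access points broadcast
# in the clear. Capturing other people's payload data (the part Google was
# fined for) is deliberately not shown. Requires the scapy library and a
# wireless interface in monitor mode; the interface name is an assumption.
from scapy.all import sniff, Dot11, Dot11Beacon, Dot11Elt

seen = {}

def handle(pkt):
    if pkt.haslayer(Dot11Beacon):
        bssid = pkt[Dot11].addr2                            # access point MAC
        ssid = pkt[Dot11Elt].info.decode(errors="ignore")   # network name
        if bssid not in seen:
            seen[bssid] = ssid
            print(bssid, ssid)

sniff(iface="wlan0mon", prn=handle, store=False)  # hypothetical interface
```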
684
00:45:48,660 --> 00:45:52,660
It's a great tool for, say, tourists
or homesick transplants,
685
00:45:52,790 --> 00:45:57,330
but privacy advocate Kevin Bankston
says Google is being too invasive.
686
00:45:57,410 --> 00:46:02,370
There are a lot of people on the
web who are, I think, freaked out by this.
687
00:46:02,410 --> 00:46:04,660
They find it kind of
icky and uncomfortable.
688
00:46:05,160 --> 00:46:10,660
And so, absent the fact that it got caught
and they were fined and that they tried
689
00:46:10,830 --> 00:46:14,160
to cover up some of the things that they
were doing and the media got wind of it,
690
00:46:14,200 --> 00:46:15,700
nothing would have happened to them.
691
00:46:15,830 --> 00:46:20,290
The new Google Home Mini has a
massive problem and it isn’t even out yet.
692
00:46:20,370 --> 00:46:23,540
It was released to reporters for
free last week to test it out.
693
00:46:23,620 --> 00:46:26,000
One journalist discovered
his device recorded
694
00:46:26,120 --> 00:46:29,790
everything in earshot and uploaded it.
695
00:46:29,910 --> 00:46:33,750
More and more people are beginning
to use these smart devices in their homes.
696
00:46:33,830 --> 00:46:36,540
There are two things with that, too, that
I think people need to be aware of.
697
00:46:36,620 --> 00:46:39,790
The fact is that it's always on and
always listening, so it’s collecting
698
00:46:39,830 --> 00:46:42,040
an enormous amount of data on its users.
699
00:46:42,200 --> 00:46:45,370
The other thing is the fact that it is
only giving you one answer back.
700
00:46:45,410 --> 00:46:48,660
Okay, Google. How tall is a T-Rex?
701
00:46:50,580 --> 00:46:54,910
Here’s a summary from the website
humboldts.nationalgeographic.com.
702
00:46:55,120 --> 00:46:59,200
Fossil evidence shows that
Tyrannosaurus was about 40 feet long.
703
00:46:59,290 --> 00:47:02,950
I think it's something we should be
aware of: what's happening with the bias
704
00:47:03,000 --> 00:47:05,700
that’s being introduced, because more
and more people are starting to use
705
00:47:05,790 --> 00:47:08,000
these units, and certainly
children are using them.
706
00:47:08,160 --> 00:47:12,580
And, kids don't have the ability to
discern that it's a biased result.
707
00:47:12,790 --> 00:47:15,870
According to a new report by the
Electronic Frontier Foundation,
708
00:47:15,950 --> 00:47:19,660
schools are collecting and storing kids’
names, birth dates, browsing histories,
709
00:47:19,750 --> 00:47:23,910
location data, and much more,
without proper privacy protections.
710
00:47:24,040 --> 00:47:28,120
They went into schools and they
offered their Google Apps for Education
711
00:47:28,200 --> 00:47:30,790
for free to schools.
What they weren’t telling anyone
712
00:47:30,870 --> 00:47:33,120
was they’re actually
collecting information
713
00:47:33,200 --> 00:47:36,910
on our kids secretly and they’re
using this for profiling purposes.
714
00:47:37,000 --> 00:47:39,700
We just don't know how that information
is going to be utilized against them.
715
00:47:39,790 --> 00:47:42,410
It could be utilized against them
when they apply to college,
716
00:47:42,540 --> 00:47:45,700
it could be utilized against them
when they go in for a job, it could be
717
00:47:45,790 --> 00:47:48,910
utilized against them for insurance,
it could be utilized against them
718
00:47:49,000 --> 00:47:50,950
whenever you want to buy something online.
719
00:47:51,080 --> 00:47:53,870
And, they're taking that information
and they’re weaponizing it
720
00:47:53,950 --> 00:47:56,160
against our own kids.
721
00:47:56,410 --> 00:47:59,500
And, guess what happened.
Nothing. Absolutely nothing happened when…
722
00:47:59,540 --> 00:48:02,330
when the FTC heard about it,
when the Department of Education
723
00:48:02,410 --> 00:48:04,290
heard about it, they took no action.
724
00:48:05,370 --> 00:48:08,200
So, Google grew out of a PhD
program at Stanford. Right?
725
00:48:08,290 --> 00:48:10,080
They have this affinity for academics.
726
00:48:10,160 --> 00:48:12,580
They view themselves as
academic, a little bit, in nature.
727
00:48:12,660 --> 00:48:14,830
Their campus looks like a college campus.
728
00:48:14,910 --> 00:48:18,160
So, Google provides a ton of
funding for academic institutions,
729
00:48:18,250 --> 00:48:20,250
individual academics,
all around the world.
730
00:48:20,330 --> 00:48:25,500
So, we found something like 300 papers
that were funded by Google in some way
731
00:48:25,580 --> 00:48:27,540
that supported Google’s policy positions.
732
00:48:27,620 --> 00:48:30,160
They like to say, “Hey,
we have the country's
733
00:48:30,250 --> 00:48:32,830
foremost academic experts behind us.”
734
00:48:32,870 --> 00:48:36,370
And, what they are not telling the
public is the fact that they’re actually
735
00:48:36,450 --> 00:48:41,330
funding the so-called independent experts,
these so-called independent academics.
736
00:48:41,450 --> 00:48:43,160
Eric Schmidt was testifying before Congress
737
00:48:43,250 --> 00:48:44,950
about whether or not
Google was a monopoly.
738
00:48:45,080 --> 00:48:49,200
He was relying on an academic study to
bolster his point, but he didn’t disclose
739
00:48:49,290 --> 00:48:51,330
that the paper was actually
funded by Google.
740
00:48:52,830 --> 00:48:58,950
Are you concerned that your company has
been “exerting enormous power to direct
741
00:48:59,080 --> 00:49:03,450
internet traffic in ways that hurt
many small, rural businesses”?
742
00:49:03,910 --> 00:49:06,120
Extremely good and well-meaning
743
00:49:06,200 --> 00:49:08,830
small businesses move
up and down in the rankings,
744
00:49:08,910 --> 00:49:11,950
but we are in the rankings business.
And so, for every loser,
745
00:49:12,040 --> 00:49:13,830
there's a winner, and so forth.
746
00:49:13,910 --> 00:49:19,290
I am satisfied that the vast majority of
small businesses are extremely well served
747
00:49:19,330 --> 00:49:23,000
by our approach, and as
I said earlier to Senator Klobuchar,
748
00:49:23,160 --> 00:49:27,790
I do believe that if anything, our system
promotes and enhances small business
749
00:49:27,870 --> 00:49:31,200
over larger businesses because
it gives them a hearing and a role
750
00:49:31,250 --> 00:49:35,370
that they would not otherwise have because of
the nature of the way the algorithms work.
751
00:49:38,450 --> 00:49:40,500
Google has blacklists,
752
00:49:40,620 --> 00:49:43,830
and the biggest blacklist they
have is called their quarantine list.
753
00:49:46,120 --> 00:49:49,580
Now, I'm guessing very few people
have ever heard of this list,
754
00:49:49,660 --> 00:49:54,370
and yet I'm telling you, it not only is
a tool of censorship, it is by far the
755
00:49:54,410 --> 00:49:58,330
biggest and most dangerous
list that Google maintains
756
00:49:58,450 --> 00:50:00,540
for the purpose of
controlling information.
757
00:50:00,620 --> 00:50:05,410
The quarantine list is a list of websites
that Google doesn't want you to visit.
758
00:50:05,750 --> 00:50:08,540
Are there 1,000 items on this list? No.
759
00:50:08,620 --> 00:50:11,330
There are millions
of websites on this list.
760
00:50:11,500 --> 00:50:15,500
Google has the power to
block access to websites,
761
00:50:15,660 --> 00:50:18,120
and there are no relevant regulations.
762
00:50:18,250 --> 00:50:21,750
There's no oversight,
there's no advisory group.
763
00:50:21,790 --> 00:50:25,580
There’s nothing. No one even
realizes that Google is doing it.
764
00:50:25,870 --> 00:50:28,080
There was a particular day
where Google shut down
765
00:50:28,120 --> 00:50:30,200
the entire Internet for forty minutes.
766
00:50:30,290 --> 00:50:33,370
This was reported by the Guardian.
Google did not deny it.
767
00:50:33,790 --> 00:50:38,450
They shut down half of the Internet in
Japan, and again, they acknowledged it.
768
00:50:38,700 --> 00:50:41,660
We are talking about a
company with so much power.
769
00:50:41,790 --> 00:50:48,000
Well, who gave Google the
power to shut down the Internet?
770
00:50:48,700 --> 00:50:50,200
Where did that come from?
771
00:51:00,330 --> 00:51:05,540
They should say, “Look. We are
liberal or left-wing organizations.
772
00:51:05,620 --> 00:51:08,370
We don't want to give a forum for others.”
773
00:51:08,450 --> 00:51:10,700
Yeah. And then, I have no issue,
774
00:51:11,040 --> 00:51:16,450
but then they say they are a public
forum, and that’s not honest.
775
00:51:17,790 --> 00:51:20,700
Google has this ability,
because they are such a large
776
00:51:20,830 --> 00:51:24,660
platform, to act as the
world's most powerful censor,
777
00:51:24,950 --> 00:51:28,830
and it’s a power and a tool that
they use on a regular basis.
778
00:51:28,950 --> 00:51:32,580
Consider a simple case
involving an eminent scholar,
779
00:51:32,660 --> 00:51:35,700
Doctor Jordan Peterson,
of the University of Toronto.
780
00:51:35,950 --> 00:51:41,080
Canadian Professor Jordan Peterson shot to
fame when he opposed a transgender law
781
00:51:41,120 --> 00:51:44,660
that infringed badly on the
free-speech rights of Canadians.
782
00:51:44,750 --> 00:51:47,950
Now, his speech is being
suppressed in a different way.
783
00:51:48,080 --> 00:51:50,330
Google blocked him from
his YouTube account.
784
00:51:50,410 --> 00:51:53,290
We don't actually know why,
because Google refused to specify.
785
00:51:54,330 --> 00:52:00,410
I made a couple of videos at home
criticizing a new law that was set to be
786
00:52:00,500 --> 00:52:04,910
passed in Canada, purporting to add
gender identity and gender expression
787
00:52:05,000 --> 00:52:09,120
to the list of protected groups under
Canadian human rights legislation.
788
00:52:09,580 --> 00:52:12,160
I was objecting to two
parts of the legislation,
789
00:52:12,250 --> 00:52:16,410
and one part was what I regarded
as compelled speech.
790
00:52:16,540 --> 00:52:18,620
And, that was the provisions
and the policies
791
00:52:18,660 --> 00:52:20,410
surrounding the law
making it mandatory
792
00:52:20,500 --> 00:52:25,250
to use pronouns of people's choice;
mandatory under punishment of law.
793
00:52:25,790 --> 00:52:29,910
And, the second is that it writes a
social constructionist view of gender
794
00:52:29,950 --> 00:52:34,080
into the substructure of Canadian law.
That's wrong.
795
00:52:34,620 --> 00:52:39,040
It’s factually incorrect. So, we now
have a legal system that
796
00:52:39,120 --> 00:52:43,620
has a scientifically incorrect
doctrine built into it and people don't
797
00:52:43,700 --> 00:52:47,830
understand how dangerous that is,
but I understood how dangerous it was.
798
00:52:47,910 --> 00:52:50,540
And, that caused a real media storm,
799
00:52:50,580 --> 00:52:53,250
I would say, that in some sense
still hasn't subsided.
800
00:52:54,580 --> 00:52:59,450
Google shut off my Gmail account and
blocked access to my YouTube channel,
801
00:52:59,750 --> 00:53:03,620
and by that time, my YouTube channel had
about 260 videos on it and about
802
00:53:03,750 --> 00:53:07,910
15 million views, and about
400,000 subscribers.
803
00:53:08,000 --> 00:53:10,330
So, it’s a major YouTube channel.
804
00:53:11,450 --> 00:53:13,750
But they blocked access
to my mail as well,
805
00:53:13,790 --> 00:53:18,290
which had everything, like, all my mail
from the last 20 years, basically,
806
00:53:18,370 --> 00:53:22,700
and my calendar data - everything.
They said that I'd violated their policy.
807
00:53:22,910 --> 00:53:26,000
They said, first of all, that it was
a machine that flagged me,
808
00:53:26,120 --> 00:53:30,160
but then it was reviewed by human beings
and they decided to continue the ban.
809
00:53:30,580 --> 00:53:32,290
They gave me no reason.
810
00:53:32,410 --> 00:53:35,910
They said I'd violated their policy,
but it was completely vague
811
00:53:36,000 --> 00:53:37,910
and I got no reason at all.
812
00:53:38,040 --> 00:53:41,000
Now, I've received all
sorts of conflicting reports,
813
00:53:42,040 --> 00:53:45,540
and so I really have no
idea why it happened.
814
00:53:46,000 --> 00:53:49,120
I can't be certain that it
was political targeting,
815
00:53:49,580 --> 00:53:51,750
but that's really not the
point in some sense.
816
00:53:51,870 --> 00:53:56,200
The point is just that it was arbitrarily
shut off, and that's a real problem.
817
00:53:56,450 --> 00:54:00,290
You come to rely on these things,
and when the plug is pulled suddenly,
818
00:54:00,750 --> 00:54:03,540
then that puts a big hole in your life.
819
00:54:05,870 --> 00:54:11,080
Before January 1, 2012, I just never
gave Google a second thought.
820
00:54:11,200 --> 00:54:14,080
I just thought it was cool and
used it like everyone else does.
821
00:54:14,330 --> 00:54:17,620
On January 1 of 2012,
822
00:54:17,870 --> 00:54:23,500
I got a bunch of e-mails
from Google saying that
823
00:54:23,580 --> 00:54:25,700
my website had
been hacked,
824
00:54:25,750 --> 00:54:28,620
it contained malware, and that
they were blocking access.
825
00:54:28,790 --> 00:54:33,410
I really got curious.
Why was Google notifying me?
826
00:54:33,540 --> 00:54:37,500
Why wasn't I being notified by
some government agency
827
00:54:37,620 --> 00:54:41,500
or some sort of nonprofit
organization? One thing that
828
00:54:41,620 --> 00:54:45,250
I thought was curious was they had
no customer service department.
829
00:54:45,620 --> 00:54:50,410
So, here, they're blocking access to your
websites, including my main website,
830
00:54:50,580 --> 00:54:53,790
except there is no one you
can call to help you.
831
00:54:53,910 --> 00:54:57,620
That I thought was very odd for a
company that big and that cool.
832
00:54:58,040 --> 00:55:01,540
Access was being blocked not
just through Google’s search engine,
833
00:55:01,660 --> 00:55:05,790
not just through Google's browser,
which is Chrome, but access was being
834
00:55:05,870 --> 00:55:10,950
blocked through Safari, which is an
Apple product, and through Firefox.
835
00:55:11,950 --> 00:55:16,330
And, I was thinking, “How could
Google impact what happens
836
00:55:16,410 --> 00:55:19,200
when someone is using
another product, like Firefox?”
837
00:55:19,870 --> 00:55:23,870
And, it actually took me quite a
while to figure out how that all worked.
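A plausible piece of the answer is that Firefox and Safari consult Google's Safe Browsing blocklist, so a Google flag on a site follows it into non-Google browsers. Here is a sketch of querying that blocklist through the publicly documented Safe Browsing v4 Lookup API, with a placeholder API key; treat the details as an assumption to verify against the current documentation.

```python
# Sketch: ask Google's Safe Browsing service whether a URL is on its
# blocklist. Firefox and Safari consult this same Google-maintained list,
# which is the likely reason a Google flag can block a site in non-Google
# browsers. The API key is a placeholder.
import json
import urllib.request

API_KEY = "YOUR_API_KEY"  # placeholder
ENDPOINT = f"https://safebrowsing.googleapis.com/v4/threatMatches:find?key={API_KEY}"

def check_url(url):
    body = {
        "client": {"clientId": "example-client", "clientVersion": "1.0"},
        "threatInfo": {
            "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING"],
            "platformTypes": ["ANY_PLATFORM"],
            "threatEntryTypes": ["URL"],
            "threatEntries": [{"url": url}],
        },
    }
    req = urllib.request.Request(
        ENDPOINT,
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        result = json.load(resp)
    # An empty response body means "not currently flagged".
    return result.get("matches", [])

print(check_url("http://example.com/"))
```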
838
00:55:24,290 --> 00:55:30,580
So, that's really the event that took
place in early 2012 - New Year’s Day -
839
00:55:30,830 --> 00:55:34,910
that started to get me to look
a little more critically,
840
00:55:34,950 --> 00:55:36,700
a little more professionally, at Google,
841
00:55:36,790 --> 00:55:39,160
figuring out how this company operates.
842
00:55:39,250 --> 00:55:42,910
How do they do what they're doing?
How big is their reach?
843
00:55:43,200 --> 00:55:48,080
What are their motives for doing
what they do? And, over the years,
844
00:55:48,160 --> 00:55:50,450
the more I’ve learned,
the more concerned I’ve become.
845
00:55:50,910 --> 00:55:54,620
Well, Dr. Robert Epstein is a
world-renowned psychologist,
846
00:55:54,700 --> 00:55:57,500
and his work gets a lot of
attention from the media.
847
00:55:57,700 --> 00:56:02,250
In fact, the Washington Post ran a
story on the very question that Epstein
848
00:56:02,330 --> 00:56:04,540
had been researching,
and that question is,
849
00:56:04,660 --> 00:56:07,790
“Could Google actually tilt an election?”
850
00:56:07,950 --> 00:56:09,790
And, the article was fascinating.
851
00:56:09,910 --> 00:56:12,450
But what's even more
troubling and fascinating is
852
00:56:12,500 --> 00:56:15,540
what happened the day after
that article appeared.
853
00:56:15,700 --> 00:56:19,950
That's when Google decided to
shut off Dr. Robert Epstein.
854
00:56:20,500 --> 00:56:25,750
The next day, I could no longer access
Google.com. I mean, you're seeing
855
00:56:25,830 --> 00:56:28,700
something here that probably very few
people in the world have ever seen.
856
00:56:28,750 --> 00:56:35,290
You are seeing a timeout on Google.com.
So, I started, you know, asking around.
857
00:56:35,370 --> 00:56:40,000
I started doing some research on it, and
I actually found how a company like
858
00:56:40,160 --> 00:56:43,580
Google could, in fact, cut
you off from their services.
859
00:56:43,830 --> 00:56:47,750
I also found in their Terms of
Service, very clear language, saying
860
00:56:47,870 --> 00:56:50,580
they had every right to cut
you off from their services
861
00:56:50,660 --> 00:56:53,160
whenever they pleased,
with or without cause.
862
00:56:54,540 --> 00:56:58,200
When Google decided to go after
Robert Epstein and Jordan Peterson,
863
00:56:58,370 --> 00:57:00,000
in a way they were making a mistake
864
00:57:00,450 --> 00:57:02,250
because these are two
individuals that have a
865
00:57:02,290 --> 00:57:06,830
lot of contacts in the media, have a
lot of relationships, and can draw
866
00:57:06,910 --> 00:57:10,410
attention to the fact that they
are being censored by Google.
867
00:57:10,580 --> 00:57:15,120
But consider somebody who's had the
spigot shut off by Google who doesn't have
868
00:57:15,160 --> 00:57:18,080
those kind of relationships.
What happens to those people?
869
00:57:18,250 --> 00:57:22,620
Those people essentially disappear
because Google has decided
870
00:57:22,790 --> 00:57:25,620
they don't want you to hear
what they have to say.
871
00:57:25,700 --> 00:57:29,790
You started just having smaller
content providers that are doing incredibly,
872
00:57:29,950 --> 00:57:35,790
you know, innocuous things, like reviews
of movies or expressing an opinion
873
00:57:35,830 --> 00:57:39,870
on something that somebody at
YouTube apparently objected to.
874
00:57:40,000 --> 00:57:43,000
It could be a critical
review of Wonder Woman,
875
00:57:43,120 --> 00:57:45,290
and all of a sudden, the
video gets demonetized.
876
00:57:45,370 --> 00:57:50,830
They’re no longer there to be discussed.
That's the problem with censorship.
877
00:57:50,910 --> 00:57:56,000
These companies want to present themselves
as only interested in the bottom line,
878
00:57:56,120 --> 00:57:58,500
only interested in serving customers,
879
00:57:58,580 --> 00:58:02,160
but when you look at the pattern of
behavior, you look at the censorship
880
00:58:02,250 --> 00:58:06,750
and the manipulation and the one-sided
nature of it, you can only come to the
881
00:58:06,830 --> 00:58:11,830
conclusion that these companies have a far
deeper agenda than they want to let on.
882
00:58:22,160 --> 00:58:26,790
Google is definitely the biggest kingmaker
on this Earth that has ever existed,
883
00:58:27,000 --> 00:58:30,450
because it can make kings, and
not just in the United States,
884
00:58:30,540 --> 00:58:33,040
but pretty much in every
country in the world.
885
00:58:35,370 --> 00:58:36,790
That’s the thing.
886
00:58:36,870 --> 00:58:40,120
These people are playing with forces
that they don't understand
887
00:58:40,290 --> 00:58:41,790
and they think they can control them.
888
00:58:42,200 --> 00:58:44,540
Kingmakers aren't always benevolent forces
889
00:58:44,620 --> 00:58:47,040
by any stretch of the imagination,
890
00:58:47,250 --> 00:58:50,080
and especially when they're
operating behind the scenes.
891
00:58:50,290 --> 00:58:54,620
Google and Facebook have the
power to undermine democracy
892
00:58:54,700 --> 00:58:58,580
without us knowing that democracy
has been undermined.
893
00:59:00,450 --> 00:59:05,200
If the major players in tech right now -
and that's mainly Google and Facebook -
894
00:59:05,290 --> 00:59:10,080
banded together and got behind the same
candidate, they could shift, we figured,
895
00:59:10,410 --> 00:59:13,080
10% of the vote in
the United States
896
00:59:13,120 --> 00:59:15,200
with no one knowing that
they had done anything.
897
00:59:16,040 --> 00:59:19,620
Epstein came from a background
rooted in psychology.
898
00:59:19,790 --> 00:59:21,750
He is a Harvard-trained PhD.
899
00:59:21,870 --> 00:59:25,620
He’s done a series of peer-reviewed
studies funded and supported by the
900
00:59:25,700 --> 00:59:27,370
National Science Foundation.
901
00:59:27,450 --> 00:59:30,410
And, essentially, what they
set out to do was to say,
902
00:59:30,660 --> 00:59:35,410
"If we present people with information
in a biased way through a search engine,
903
00:59:35,500 --> 00:59:38,410
can we steer them and
change their opinions?”
904
00:59:38,500 --> 00:59:41,830
And, what they found out
consistently, again and again,
905
00:59:42,040 --> 00:59:45,450
was that yes, it was easy, actually, to shift
906
00:59:45,540 --> 00:59:49,410
and steer people's opinions in the
direction that they wanted them to go.
907
00:59:49,830 --> 00:59:53,540
The way we studied this was
with an experiment.
908
00:59:53,620 --> 00:59:56,290
We start with a group of people
who we know are
909
00:59:56,330 --> 01:00:00,500
undecided on a particular candidate,
and we guarantee
910
01:00:00,540 --> 01:00:03,500
that they were undecided on the election
that we were going to show them,
911
01:00:03,580 --> 01:00:08,450
because we decided to show them the 2010
election for Prime Minister of Australia.
912
01:00:09,160 --> 01:00:13,370
They had no preconceived notions
about that election or the candidates.
913
01:00:13,660 --> 01:00:15,660
We give them two short paragraphs;
914
01:00:15,750 --> 01:00:19,250
one about the first candidate, Gillard,
and one about the other candidate, Abbott.
915
01:00:19,330 --> 01:00:22,000
And then, we ask them a bunch
of questions about them.
916
01:00:22,120 --> 01:00:25,660
How much do you trust each one?
How much do you like each one?
917
01:00:25,750 --> 01:00:31,500
What's your overall impression? We ask on
an 11-point scale, where one candidate is
918
01:00:31,580 --> 01:00:33,410
at one end of the scale,
one’s at the other,
919
01:00:33,450 --> 01:00:35,410
and then we ask them the
ultimate question, which is,
920
01:00:35,500 --> 01:00:38,910
“Well, if you had to vote right now,
which one would you vote for?”
921
01:00:39,120 --> 01:00:42,580
So, at that point, they have
to pick either Gillard or Abbott.
922
01:00:42,790 --> 01:00:46,870
So, this is all pre-search.
Now, we let them do some research online.
923
01:00:46,950 --> 01:00:51,750
Now, they’re using a search engine that we
created, which is modeled after Google,
924
01:00:52,000 --> 01:00:56,450
and it's called Kadoodle. So, we show
them five pages of search results,
925
01:00:56,540 --> 01:00:58,450
six results per page.
926
01:00:58,540 --> 01:01:03,290
And then, they can use the search
engine as they would use Google.
927
01:01:03,370 --> 01:01:08,080
They’re seeing real search results from
that election connecting to real webpages.
928
01:01:08,160 --> 01:01:13,450
So, before the search, the split we’re
getting is exactly what you would
929
01:01:13,580 --> 01:01:17,330
expect to get from people who
are undecided, and when they're done,
930
01:01:17,410 --> 01:01:19,410
we ask them all those questions again.
931
01:01:19,580 --> 01:01:23,910
This is now the post-search part of the
experiment, and the question is,
932
01:01:24,120 --> 01:01:27,160
do we get any shift?
Here's what people don't know:
933
01:01:27,250 --> 01:01:30,950
People are being randomly
assigned to one of three groups.
934
01:01:31,160 --> 01:01:35,040
In one group, they’re
seeing those search results
935
01:01:35,160 --> 01:01:38,450
in an order that favors Tony Abbott.
936
01:01:38,540 --> 01:01:42,200
In another group, they’re
seeing those search results
937
01:01:42,290 --> 01:01:45,040
in an order that favors Julia Gillard.
938
01:01:45,250 --> 01:01:47,250
And, in a third group,
they’re seeing them mixed up.
939
01:01:47,370 --> 01:01:48,750
That’s the control group.
940
01:01:48,950 --> 01:01:52,700
So, this is a randomized
experiment, in other words.
941
01:01:52,910 --> 01:01:56,870
We got a shift in that
first experiment of 48%.
942
01:01:57,250 --> 01:02:00,330
What that means is,
if I have 50 people over here,
943
01:02:00,500 --> 01:02:03,660
48% of them, almost half of them, shifted.
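As an illustration of how such a randomized design and its shift statistic fit together, here is a toy simulation in Python; the group names follow the study, but the 0.48 persuasion probability is simply plugged in as an assumption to show the bookkeeping, not a model of the real effect.

```python
# Toy simulation of the randomized experiment described above, on
# synthetic data; the 0.48 figure is an assumed input, not Epstein's data.
import random

random.seed(0)
GROUPS = ["pro_abbott", "pro_gillard", "control"]
FAVORED = {"pro_abbott": "abbott", "pro_gillard": "gillard"}

def simulate_participant(group):
    """Pre-search: undecided, so a coin flip. Post-search: the biased
    rankings move the participant with some assumed probability."""
    pre = random.choice(["abbott", "gillard"])
    favored = FAVORED.get(group)  # None for the control group
    post = favored if (favored and random.random() < 0.48) else pre
    return pre, post

def shift_toward_bias(n=1000):
    """For each biased group: the fraction of participants who started on
    the other side and ended up choosing the favored candidate."""
    stats = {}
    for group in GROUPS:
        favored = FAVORED.get(group)
        moved = total = 0
        for _ in range(n):
            pre, post = simulate_participant(group)
            if favored and pre != favored:
                total += 1
                moved += post == favored
        stats[group] = round(moved / total, 2) if total else None
    return stats

print(shift_toward_bias())
# e.g. {'pro_abbott': ~0.48, 'pro_gillard': ~0.48, 'control': None}
```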
944
01:02:03,790 --> 01:02:06,950
We got shifts on all the numbers,
not just on the votes,
945
01:02:07,040 --> 01:02:10,040
but on the trusting, the liking,
the overall impression.
946
01:02:10,200 --> 01:02:13,200
Everything we asked
shifted in a direction
947
01:02:13,500 --> 01:02:16,750
that matched the bias
in the search rankings.
948
01:02:16,910 --> 01:02:20,620
This is random assignments, so we’re
arbitrarily putting some people in the
949
01:02:20,700 --> 01:02:24,870
pro-Abbott group, some people in the
pro-Gillard group; it’s just arbitrary.
950
01:02:24,910 --> 01:02:29,660
It’s random. And, wherever we put them,
they shift. So, that's the power here.
951
01:02:29,870 --> 01:02:32,580
There was another thing that caught our
attention in this first experiment, and
952
01:02:32,700 --> 01:02:36,910
that is, three quarters of the people
in that experiment seemed to have no
953
01:02:36,950 --> 01:02:40,750
awareness whatsoever that they
were seeing biased search rankings,
954
01:02:40,790 --> 01:02:43,250
even though these were blatantly biased.
955
01:02:43,870 --> 01:02:47,620
When we decided to repeat this,
we thought, why don't we try to mask
956
01:02:47,700 --> 01:02:51,080
what we’re doing just a little bit
and see if we still get a shift,
957
01:02:51,450 --> 01:02:56,330
and we’re going to take the fourth item,
so let’s say that’s a pro-Abbott item,
958
01:02:56,540 --> 01:02:59,160
and we’re going to swap
it with the pro-Gillard item.
959
01:02:59,290 --> 01:03:01,120
So, we’re just going
to mix it up a little bit.
960
01:03:01,250 --> 01:03:06,040
So, in that second experiment, same exact
procedure, 103 new participants.
961
01:03:06,200 --> 01:03:09,330
Two things happened.
Number one, we got a shift again.
962
01:03:09,410 --> 01:03:12,250
This time, 63%, even bigger.
963
01:03:12,700 --> 01:03:17,120
Even more interesting, the percentage
of people who seemed unaware
964
01:03:17,330 --> 01:03:20,950
that they were seeing biased
search rankings went up to 85%.
965
01:03:21,080 --> 01:03:22,910
We said, “Oh, let's do it again and let's
966
01:03:23,000 --> 01:03:24,750
be a little more
aggressive with the masks.”
967
01:03:24,870 --> 01:03:29,080
Third experiment, we again get
this enormous shift, but now,
968
01:03:29,250 --> 01:03:35,330
100% of our participants saw
no bias in the search results.
969
01:03:35,410 --> 01:03:40,660
So, this told us that not only could
we shift opinions and voting preferences
970
01:03:40,830 --> 01:03:43,080
just by manipulating search results,
971
01:03:43,200 --> 01:03:48,540
but we could do it in such
a way that no one was aware of it.
972
01:03:49,120 --> 01:03:53,700
Now, you've got incredible
power to manipulate people.
973
01:03:55,080 --> 01:03:59,910
That was our first set of experiments.
The next thing we did was to replicate
974
01:04:00,000 --> 01:04:03,200
the experiments we had done on a
small scale, nationally in the US.
975
01:04:03,290 --> 01:04:07,040
So, we had more than 2,000
participants from all 50 states.
976
01:04:07,160 --> 01:04:10,620
We used masking, and in
some demographic groups,
977
01:04:10,700 --> 01:04:13,200
the effect was smaller, and
in some, it was larger.
978
01:04:13,290 --> 01:04:14,750
And, in one demographic group,
979
01:04:14,830 --> 01:04:17,160
the shift was 80%.
980
01:04:17,450 --> 01:04:21,160
Now, we also had so many people in
this particular study that we could
981
01:04:21,250 --> 01:04:25,500
look specifically at the very,
very small number of people
982
01:04:25,580 --> 01:04:30,500
who did notice the bias in the search
rankings, and that is when we learned
983
01:04:30,620 --> 01:04:33,330
that the people who notice the bias
984
01:04:33,450 --> 01:04:36,790
shift even farther in
the direction of the bias.
985
01:04:37,370 --> 01:04:43,620
It seems that if you see the bias,
since you believe in search results
986
01:04:43,750 --> 01:04:47,120
and you believe in search engines
and you believe they’re objective,
987
01:04:47,200 --> 01:04:50,790
then what you're seeing
is simply confirming
988
01:04:50,870 --> 01:04:52,870
that that candidate must be better
989
01:04:53,000 --> 01:04:57,540
because the search engine
itself prefers that candidate.
990
01:04:57,620 --> 01:05:01,370
In other words, this algorithm has
chosen that candidate over the other.
991
01:05:01,450 --> 01:05:05,160
That the algorithm has to be right
because, of course, it's objective.
992
01:05:06,910 --> 01:05:12,370
So, we went to India because in early
2014, there took place the largest
993
01:05:12,410 --> 01:05:17,410
democratic election in history. And,
we recruited more than 2,000 people
994
01:05:17,500 --> 01:05:20,790
from throughout India to participate
in the same kind of experiment.
995
01:05:20,910 --> 01:05:26,830
Now, we’re using real voters right
smack in the middle of an extremely
996
01:05:26,950 --> 01:05:30,910
intense campaign, and we're
showing them biased search results.
997
01:05:31,200 --> 01:05:33,910
What do we get? Well,
overall, we found
998
01:05:34,000 --> 01:05:37,080
that we could easily get
a shift of over 20%,
999
01:05:37,120 --> 01:05:40,790
and in some demographic groups,
the shift was over 60%.
1000
01:05:42,410 --> 01:05:45,370
So, this was still an enormous effect.
1001
01:05:45,580 --> 01:05:49,250
We actually have identified a manipulation
1002
01:05:49,370 --> 01:05:55,250
that a search engine company could
implement for free and, as icing on the cake,
1003
01:05:55,370 --> 01:06:02,120
99.5% of the people in the study saw
no bias in the search results.
1004
01:06:04,450 --> 01:06:07,910
Hillary Clinton and Donald Trump
held dueling rallies Wednesday,
1005
01:06:08,000 --> 01:06:10,620
just shy of four weeks until election day.
1006
01:06:10,830 --> 01:06:14,750
Clinton rallied supporters in Colorado,
while Trump was in Florida.
1007
01:06:14,910 --> 01:06:18,080
Both candidates wasted no
time going after the other.
1008
01:06:18,540 --> 01:06:23,160
We developed a new monitoring system,
unprecedented, as far as I know,
1009
01:06:23,580 --> 01:06:26,870
and this system allowed
us, in effect, to look
1010
01:06:26,910 --> 01:06:30,950
over the shoulders of people as
they were using search engines
1011
01:06:31,120 --> 01:06:32,870
for about five months
1012
01:06:32,910 --> 01:06:36,500
before the election, all the
way through election day,
1013
01:06:36,580 --> 01:06:38,330
and then we actually went
a little bit past that.
1014
01:06:38,410 --> 01:06:41,580
The American people are
the victims of this system.
1015
01:06:42,410 --> 01:06:46,500
We recorded all the webpages that
their search results connected to.
1016
01:06:46,750 --> 01:06:51,160
We knew where in those search results
people were seeing those links,
1017
01:06:51,410 --> 01:06:56,450
so we could compute bias. We could
actually compute whether or not those
1018
01:06:56,620 --> 01:07:01,410
search results were favoring either
Donald Trump or Hillary Clinton.
1019
01:07:01,830 --> 01:07:06,950
We found systematic bias
in favor of one candidate.
1020
01:07:07,040 --> 01:07:10,370
Now, it happens to be Hillary Clinton, but
in my mind, it doesn’t matter who it is;
1021
01:07:10,500 --> 01:07:15,080
if there’s systematic bias for one
candidate, that can shift a lot of votes.
1022
01:07:15,660 --> 01:07:20,830
Here, we’re looking at bias
according to search position.
1023
01:07:21,200 --> 01:07:24,450
So, this is position one in the
search results, position two,
1024
01:07:24,580 --> 01:07:25,950
position three, and so on.
1025
01:07:26,080 --> 01:07:30,000
Any dot above the line, again,
would indicate a pro-Clinton bias,
1026
01:07:30,080 --> 01:07:32,870
and I mean this bias
in a statistical sense.
1027
01:07:33,250 --> 01:07:36,750
Any one below would indicate
a bias in favor of Trump.
1028
01:07:37,000 --> 01:07:41,750
We were seeing bias
in all ten search positions
1029
01:07:41,830 --> 01:07:43,870
on the first page of search results.
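A simplified sketch of the kind of statistic such a monitoring system might compute: label each logged result by which candidate it favors, then average the labels per search position. The records below are synthetic, invented for illustration.

```python
# Simplified bias-by-position statistic on synthetic logs. Each record is
# (search position, label): +1 favors candidate A, -1 favors candidate B,
# 0 is neutral. These records are invented for illustration.
from collections import defaultdict

logged_results = [
    (1, +1), (2, +1), (3, 0),
    (1, +1), (2, -1), (3, +1),
    (1, 0),  (2, +1), (3, +1),
]

def bias_by_position(records):
    """Mean label per position: above 0 leans toward A, below 0 toward B."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for pos, label in records:
        sums[pos] += label
        counts[pos] += 1
    return {pos: round(sums[pos] / counts[pos], 2) for pos in sorted(sums)}

print(bias_by_position(logged_results))
# {1: 0.67, 2: 0.33, 3: 0.67} -> every position here leans toward A
```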
1030
01:07:44,000 --> 01:07:45,410
That's pretty blatant.
1031
01:07:45,540 --> 01:07:48,450
I mean, frankly, if
I were running the show,
1032
01:07:48,620 --> 01:07:52,330
I wouldn't want to use a
manipulation that’s so blatant.
1033
01:07:52,620 --> 01:07:57,080
But remember, we're running
this tracking system secretly.
1034
01:07:57,250 --> 01:08:02,290
If there had been no bias in search
rankings, given the vast number of people
1035
01:08:02,370 --> 01:08:06,250
these days who get information from the
Internet, who look for information
1036
01:08:06,330 --> 01:08:10,410
about political issues online, who look
for information about candidates online,
1037
01:08:10,500 --> 01:08:15,910
if you took away this bias,
it is possible that Clinton
1038
01:08:16,040 --> 01:08:19,080
and Trump would've been neck
and neck in the popular vote.
1039
01:08:20,540 --> 01:08:24,410
I'm strictly apolitical,
and the organization
1040
01:08:24,500 --> 01:08:28,580
where I’ve conducted the
research is strictly nonpartisan.
1041
01:08:28,870 --> 01:08:31,500
I will say, in the 2016 election,
1042
01:08:31,580 --> 01:08:34,540
I do feel Hillary Clinton
was better qualified
1043
01:08:34,620 --> 01:08:37,450
to serve as President than
Donald Trump, but the work
1044
01:08:37,540 --> 01:08:40,160
that I'm doing has
nothing to do with politics.
1045
01:08:40,250 --> 01:08:42,290
It has nothing to do with candidates.
1046
01:08:42,370 --> 01:08:48,500
What I am finding out is that a handful
of people, located within a few miles
1047
01:08:48,580 --> 01:08:53,370
of each other in Silicon Valley, have
tremendous power that they shouldn't have.
1048
01:08:54,330 --> 01:08:57,750
Now, they could use that power on
Monday for one political party
1049
01:08:57,790 --> 01:08:59,790
and on Tuesday for a
different political party.
1050
01:08:59,910 --> 01:09:05,120
Do any of us want to be in that kind of
world where people like that have
1051
01:09:05,250 --> 01:09:09,700
so much power and they can wield it
this way or that way? I don't think we do.
1052
01:09:09,910 --> 01:09:12,540
There’s a reason all our political
ads end with “I’m so-and-so
1053
01:09:12,660 --> 01:09:15,950
and I approve this message,” which is that
if you want people to make informed
1054
01:09:16,040 --> 01:09:17,910
decisions based on having
more political speech,
1055
01:09:18,000 --> 01:09:19,700
not less, based on not tamping down
1056
01:09:19,790 --> 01:09:23,750
but allowing as many speakers as possible,
then people need to know when they’re
1057
01:09:23,790 --> 01:09:27,330
being subjected to political speech and
political decision-making, and so forth.
1058
01:09:27,450 --> 01:09:29,250
And so, I think that certainly,
1059
01:09:29,330 --> 01:09:31,950
if they engaged in that and
if that’s something that they do,
1060
01:09:32,080 --> 01:09:34,700
that needs to be in the
purview of a transparency
1061
01:09:34,790 --> 01:09:36,040
rule of Congress, if
we were to design one,
1062
01:09:36,080 --> 01:09:37,450
because that’s something that, you know,
1063
01:09:37,580 --> 01:09:40,500
it’s not that people shouldn’t be
subjected to that if a private company
1064
01:09:40,540 --> 01:09:43,580
wants to do it, in my judgment, but
they should know that it’s happening.
1065
01:09:43,700 --> 01:09:46,910
When you start to talk about
these big platforms,
1066
01:09:47,000 --> 01:09:50,000
now, other possibilities arise, because if
1067
01:09:50,080 --> 01:09:54,620
Google or Facebook want to favor
some candidate or some cause,
1068
01:09:54,870 --> 01:09:59,750
you cannot possibly correct
for that. It is impossible.
1069
01:10:00,700 --> 01:10:05,700
If, a few months before the election,
Facebook sends out "Go out
1070
01:10:05,830 --> 01:10:09,080
and register” reminders to certain people,
1071
01:10:09,120 --> 01:10:11,450
but not to others, so we call this TME,
1072
01:10:11,620 --> 01:10:15,580
the Targeted Messaging Effect.
What happens if over and over again
1073
01:10:15,620 --> 01:10:16,950
they're sending out these messages,
1074
01:10:17,040 --> 01:10:19,250
“go register”,
“go register”, “go register”,
1075
01:10:19,370 --> 01:10:23,370
and they’re doing so selectively?
Couldn’t they shift registrations?
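A toy simulation of the targeted-messaging idea, with invented baseline and lift numbers, shows how a selectively delivered reminder could produce a registration gap that no individual user would ever notice.

```python
# Toy simulation of the Targeted Messaging Effect described above, with
# invented numbers: a reminder lifts registration probability, and it is
# sent only to users the platform predicts will favor candidate A.
import random

random.seed(1)
BASE_RATE = 0.50   # assumed baseline registration probability
LIFT = 0.10        # assumed lift from receiving a reminder

def simulate(n_users=100_000):
    registered = {"A": 0, "B": 0}
    counts = {"A": 0, "B": 0}
    for _ in range(n_users):
        leaning = random.choice(["A", "B"])  # inferred from profile data
        counts[leaning] += 1
        p = BASE_RATE + (LIFT if leaning == "A" else 0)  # selective reminder
        registered[leaning] += random.random() < p
    return {side: round(registered[side] / counts[side], 2) for side in counts}

print(simulate())  # A registers at ~0.60, B at ~0.50: an invisible edge
```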
1076
01:10:23,910 --> 01:10:26,200
Here’s the problem
with Facebook and Google:
1077
01:10:26,290 --> 01:10:28,750
They present themselves,
not as a government
1078
01:10:28,790 --> 01:10:31,330
trying to steer people
towards some utopia,
1079
01:10:31,450 --> 01:10:34,370
but as companies that are
simply providing services;
1080
01:10:34,410 --> 01:10:38,200
services that we want and
services that are good for us.
1081
01:10:38,290 --> 01:10:41,080
You’re powerful enough to
change the political landscape.
1082
01:10:41,160 --> 01:10:44,620
It isn’t a question of whether
you want to, if you can;
1083
01:10:44,910 --> 01:10:48,790
it’s a question of: convince me that
you’re not doing it, if you can,
1084
01:10:49,200 --> 01:10:53,540
because there’s no reason for me to
assume that you’re not subject to the
1085
01:10:53,580 --> 01:10:58,000
same dark motivations of power
and domination that are characteristic
1086
01:10:58,080 --> 01:11:02,160
of any system that has the
capacity for power and domination.
1087
01:11:02,290 --> 01:11:06,040
You should assume that
you have a tyrant in you,
1088
01:11:06,160 --> 01:11:10,080
instead of assuming
that you can just not be evil.
1089
01:11:10,450 --> 01:11:13,000
It’s not that funny when
it's a system that big.
1090
01:11:13,200 --> 01:11:15,910
When you’re building a super-powerful
1091
01:11:16,080 --> 01:11:20,910
super-intelligence, not being evil
actually turns out to be a big deal.
1092
01:11:21,330 --> 01:11:25,290
If you look at the traditional
notion of fascism as demonstrated by
1093
01:11:25,370 --> 01:11:29,540
Mussolini in Italy, it was a
policy of corporatism.
1094
01:11:29,700 --> 01:11:33,500
It was a fusing of powerful
private corporations
1095
01:11:33,580 --> 01:11:36,160
with the government
and joining them together.
1096
01:11:36,290 --> 01:11:40,120
And, what we are experiencing
increasingly in the tech world
1097
01:11:40,250 --> 01:11:43,830
is a fusing of these large tech firms,
like Google and Facebook,
1098
01:11:44,000 --> 01:11:45,500
with our federal government.
1099
01:11:45,620 --> 01:11:49,200
It’s not just about trying to get the sort of
regulatory freedom that they want,
1100
01:11:49,410 --> 01:11:53,540
it's about involvement in areas
related to military technology,
1101
01:11:53,750 --> 01:11:59,000
related to artificial intelligence, to
steering the tech ship of the future
1102
01:11:59,290 --> 01:12:03,660
of the American people, and the problem
is that the conversation is being held by
1103
01:12:03,790 --> 01:12:07,910
our political leaders with Google and
Facebook, but it's a conversation
1104
01:12:08,120 --> 01:12:11,370
that never includes us at the table.
1105
01:12:20,580 --> 01:12:22,950
The adoption rate of smart phones
1106
01:12:23,000 --> 01:12:26,120
is so profound that that
is the information tool,
1107
01:12:26,500 --> 01:12:29,660
and so how people get their information,
what they believe, what they don't,
1108
01:12:29,750 --> 01:12:31,620
is, I think, a project
for the next decade.
1109
01:12:32,950 --> 01:12:36,410
I think we are increasingly moving to
an era where people are saying,
1110
01:12:36,540 --> 01:12:39,250
"We need to take better care
of our digital environments
1111
01:12:39,330 --> 01:12:40,330
and we need to take
1112
01:12:40,370 --> 01:12:44,750
better care of our digital selves”,
and what that means is being aware of
1113
01:12:44,870 --> 01:12:48,250
what we are putting into our
computers, but more importantly,
1114
01:12:48,370 --> 01:12:51,790
what our computers and our
devices are putting into us.
1115
01:12:51,870 --> 01:12:55,950
We are sharing information with them,
increasingly sensitive information.
1116
01:12:56,120 --> 01:13:00,200
The amount of data that they had on
us 15 years ago was relatively small;
1117
01:13:00,410 --> 01:13:04,160
now it's amazingly complex, and
they never forget any of it.
1118
01:13:05,410 --> 01:13:10,000
Mr. Zuckerberg, I remember well your first
visit to Capitol Hill back in 2010.
1119
01:13:10,580 --> 01:13:14,200
You spoke to the Senate Republican
High-Tech Task Force, which I chair.
1120
01:13:14,540 --> 01:13:18,000
You said back then that
Facebook would always be free.
1121
01:13:18,830 --> 01:13:20,870
Is that still your objective?
1122
01:13:21,910 --> 01:13:26,250
Senator, yes. There will always
be a version of Facebook that is free.
1123
01:13:26,330 --> 01:13:28,950
It is our mission to try to help
connect everyone around the world
1124
01:13:29,080 --> 01:13:30,870
and to bring the world closer together.
1125
01:13:30,950 --> 01:13:33,870
In order to do that, we believe that
we need to offer a service that everyone
1126
01:13:33,910 --> 01:13:36,200
can afford and we’re
committed to doing that.
1127
01:13:36,410 --> 01:13:38,830
Well, if so, how do you
sustain a business model
1128
01:13:38,910 --> 01:13:41,040
in which users don't pay for your service?
1129
01:13:42,790 --> 01:13:44,870
Senator, we run ads.
1130
01:13:46,250 --> 01:13:47,540
I see.
1131
01:13:47,750 --> 01:13:50,330
At the end of the day,
there is no free lunch,
1132
01:13:50,410 --> 01:13:53,370
especially when it comes to
social media and the Internet.
1133
01:13:53,620 --> 01:13:57,080
The entire internet economy is
literally based on surveillance,
1134
01:13:57,370 --> 01:14:00,120
and that's something that we really
need to think long and hard about:
1135
01:14:00,250 --> 01:14:03,370
"Is that a sustainable business
model to protect our privacy
1136
01:14:03,450 --> 01:14:06,580
and protect our families and
protect our kids, moving forward?”
1137
01:14:06,830 --> 01:14:11,080
What we need to think about is, how
can we continue with a light touch,
1138
01:14:11,200 --> 01:14:17,040
deregulated approach, but one where people
have the information they need to know
1139
01:14:17,250 --> 01:14:20,040
what's being done and whether
they consent to it or not?
1140
01:14:20,080 --> 01:14:23,370
And so, that's why I think we need to
have a really big focus on transparency,
1141
01:14:24,080 --> 01:14:27,330
on privacy, on a level playing
field across the board,
1142
01:14:27,370 --> 01:14:28,500
so that people understand
1143
01:14:28,580 --> 01:14:30,660
what's happening and they
can make decisions in a way
1144
01:14:30,750 --> 01:14:33,290
that they're not able
to very clearly right now.
1145
01:14:34,080 --> 01:14:38,580
It isn't obvious that regulators are fast
enough to keep up with the tech world.
1146
01:14:39,000 --> 01:14:42,700
They’re going to be five years behind the
game, and that’s like a hundred years.
1147
01:14:42,790 --> 01:14:45,580
That’s like wrestling with
Victorian England, you know?
1148
01:14:45,700 --> 01:14:50,410
It seems to me that it would be better in
many ways if there were multiple
1149
01:14:50,540 --> 01:14:54,120
competing search engines and
if there were multiple Facebooks,
1150
01:14:54,540 --> 01:14:58,080
because at least then we’d have a
diversity of ethical conundrums,
1151
01:14:58,200 --> 01:15:02,540
instead of this totalitarian
conundrum that we have right now.
1152
01:15:02,910 --> 01:15:06,450
My concern at this point is that I don’t
think a lot of people in government
1153
01:15:06,580 --> 01:15:09,000
really fully understand the
extent of the problem.
1154
01:15:09,120 --> 01:15:11,870
I don’t even think they really
understand what they’re up against.
1155
01:15:12,160 --> 01:15:15,500
I mean, these are massive,
massive technological changes,
1156
01:15:15,660 --> 01:15:17,830
and they’re all happening in parallel.
1157
01:15:17,910 --> 01:15:21,370
We have no idea of what the
consequences of that are going to be.
1158
01:15:21,870 --> 01:15:26,370
So, my concern fundamentally is that these
machines will reflect us ethically,
1159
01:15:26,750 --> 01:15:29,080
and that should be frightening because
1160
01:15:29,250 --> 01:15:32,620
I wouldn’t say that our ethical
house is particularly in order.
1161
01:15:32,750 --> 01:15:35,580
So, they're going to magnify what we are.
1162
01:15:35,830 --> 01:15:39,410
That's making the presumption that the
thing that we’re building will be
1163
01:15:39,540 --> 01:15:43,700
a good thing, and I don't think that
it will be a good thing
1164
01:15:43,790 --> 01:15:45,500
because it will reflect us.
1165
01:15:45,620 --> 01:15:49,700
If they have this kind of power,
then democracy is an illusion.
1166
01:15:49,830 --> 01:15:52,040
The free and fair election doesn't exist.
1167
01:15:52,120 --> 01:15:56,830
There have to be numerous safeguards
in place to make sure not only that
1168
01:15:56,910 --> 01:16:02,790
they don't exercise these powers, but that
they can't exercise these powers.
1169
01:16:03,040 --> 01:16:07,750
The Internet belongs to all of us; it does
not belong to Google or Facebook.
1170
01:16:07,870 --> 01:16:11,410
I mean, we have empirical evidence
that Google is engaging in
1171
01:16:11,450 --> 01:16:15,080
self-serving bias in a way that
directly harms consumers.
1172
01:16:15,290 --> 01:16:20,540
As a society, as a democracy, any time
you have concentration at the levels
1173
01:16:20,620 --> 01:16:24,410
they are in the information sector, it
should be concerning because they really
1174
01:16:24,500 --> 01:16:26,660
are now more powerful than governments.
1175
01:16:26,750 --> 01:16:28,250
They are a form of regulator,
1176
01:16:28,330 --> 01:16:32,160
but they are a private regulator
with no democratic accountability.
1177
01:16:32,290 --> 01:16:37,950
I can't believe in a system in which the
power is separate from the people.
1178
01:16:38,000 --> 01:16:41,450
We’re talking about some pretty
arrogant people, in my opinion,
1179
01:16:41,580 --> 01:16:46,700
who think of themselves as
gods of sorts, and who really
1180
01:16:46,830 --> 01:16:50,200
want to have a complete
hold over humanity.
1181
01:16:50,290 --> 01:16:54,000
These are basically big mind control
machines, and mind control machines
1182
01:16:54,080 --> 01:16:56,870
that are really good at controlling minds.
1183
01:16:57,250 --> 01:17:01,080
It’s going to be harder and harder
to fight them, if we don’t do so,
1184
01:17:01,160 --> 01:17:03,080
I would say, as soon as possible.
1185
01:17:03,250 --> 01:17:08,700
The more rope we give them,
the sooner we are all hanged.
1186
01:17:10,450 --> 01:17:14,160
I don't believe our species can
survive unless we fix this.
1187
01:17:14,250 --> 01:17:19,330
We cannot have a society in which if
two people wish to communicate,
1188
01:17:19,450 --> 01:17:23,040
the only way that can happen is
if it's financed by a third person
1189
01:17:23,120 --> 01:17:25,870
who wishes to manipulate them.
1190
01:17:28,910 --> 01:17:34,040
The traditional notion of totalitarianism
rested on the premise
1191
01:17:34,080 --> 01:17:39,080
or the idea that a government would try
to achieve total control over your life,
1192
01:17:39,660 --> 01:17:42,080
and they would do it by using the might
1193
01:17:42,160 --> 01:17:45,330
and muscle of government to
do so under compulsion.
1194
01:17:45,540 --> 01:17:49,910
Well, today, we essentially have a
totalitarian force in the world
1195
01:17:50,080 --> 01:17:53,080
and that is these large tech companies.
1196
01:17:53,200 --> 01:17:57,620
But guess what, they didn't use storm
troopers, they didn't use the Gulag,
1197
01:17:57,790 --> 01:18:01,250
they didn't use the arrest of
political prisoners to accomplish it.
1198
01:18:01,450 --> 01:18:04,620
We all opted in to do it ourselves.
1199
01:18:04,700 --> 01:18:07,330
We volunteered for this arrangement.
1200
01:18:07,410 --> 01:18:12,620
And, we live in a world today where
these tech giants have a level of control
1201
01:18:12,830 --> 01:18:16,790
and an ability to manipulate
us that Stalin, Mao,
1202
01:18:16,830 --> 01:18:20,450
Hitler, and Mussolini could
only have dreamed of.
1203
01:18:20,660 --> 01:18:23,910
The power is immense and
we are essentially trusting these
1204
01:18:23,950 --> 01:18:28,450
large tech companies to make the
right and good decisions for us.
1205
01:18:28,830 --> 01:18:34,160
I, for one, am not prepared to cede that
level of power to these individuals.
1206
01:18:37,450 --> 01:18:42,830
In the meantime, if the companies
won’t change, delete your accounts. Okay?