1
00:00:01,645 --> 00:00:05,525
It's just 25 years since
the World Wide Web was created.
2
00:00:05,525 --> 00:00:09,365
It now touches all of our lives,
our personal information
3
00:00:09,365 --> 00:00:13,085
and data swirling through
the internet on a daily basis.
4
00:00:13,085 --> 00:00:17,245
Yet it's now caught in the greatest
controversy of its life -
5
00:00:17,245 --> 00:00:18,325
surveillance.
6
00:00:18,325 --> 00:00:21,405
This is a spy master's dream.
7
00:00:21,405 --> 00:00:24,645
No spy of the previous generations
could have imagined that we
8
00:00:24,645 --> 00:00:28,085
would all volunteer for the world's
best tracking device.
9
00:00:28,085 --> 00:00:32,045
The revelations of US intelligence
contractor Edward Snowden
10
00:00:32,045 --> 00:00:37,365
have led many to ask if the Web we
love has been turned against us...
11
00:00:37,365 --> 00:00:40,245
They don't just want your search
data or your e-mail.
12
00:00:40,245 --> 00:00:41,725
They want everything.
13
00:00:41,725 --> 00:00:46,765
And, as you are being surveilled
24/7, you are more under control.
14
00:00:46,765 --> 00:00:48,845
You are less free.
15
00:00:48,845 --> 00:00:53,885
..leading to soul-searching
amongst those responsible
for the Web itself.
16
00:00:53,885 --> 00:00:56,805
I used to think that in some
countries you worry about the
17
00:00:56,805 --> 00:01:00,405
government and in some countries you
worry about the corporations.
18
00:01:00,405 --> 00:01:02,805
I realise now that that was naive.
19
00:01:02,805 --> 00:01:05,685
But thanks to a collection
of brilliant thinkers
20
00:01:05,685 --> 00:01:08,325
and researchers,
science has been fighting back...
21
00:01:08,325 --> 00:01:14,845
It's really no surprise
that the privacy issue has
unfolded the way it has.
22
00:01:14,845 --> 00:01:20,285
..developing technology
to defeat surveillance,
protecting activists...
23
00:01:20,285 --> 00:01:24,525
They tried to intimidate me. I knew
that they don't know anything.
24
00:01:24,525 --> 00:01:28,365
..and in the process coming
into conflict with global power.
25
00:01:28,365 --> 00:01:32,885
They are detaining me
at airports, threatening me.
26
00:01:32,885 --> 00:01:36,205
But now, thanks to
a new digital currency...
27
00:01:36,205 --> 00:01:40,205
If you're buying less
than half a Bitcoin, you'll
have to go to market price.
28
00:01:40,205 --> 00:01:43,245
..this technology is sending
lawmakers into a panic,
29
00:01:43,245 --> 00:01:47,525
due to the growth of a new
black market in the Dark Web.
30
00:01:47,525 --> 00:01:50,965
It was like a buffet
dinner for narcotics.
31
00:01:50,965 --> 00:01:55,045
Our detection rate is dropping.
It's risk-free crime.
32
00:01:55,045 --> 00:01:58,645
This is the story of a battle to
shape the technology which now
33
00:01:58,645 --> 00:02:03,165
defines our world, and whoever wins
will influence not only the
34
00:02:03,165 --> 00:02:08,645
future of the internet but the very
idea of what it means to be free.
35
00:02:08,645 --> 00:02:11,285
The sickness that befalls
the internet is something that
36
00:02:11,285 --> 00:02:12,685
befalls the whole world.
37
00:02:23,685 --> 00:02:29,125
Where does the outside world stop
and private space begin?
38
00:02:29,125 --> 00:02:33,205
What details of your life are you
willing to share with strangers?
39
00:02:36,485 --> 00:02:38,085
Take this house.
40
00:02:40,325 --> 00:02:43,605
Every morning, lights come on.
41
00:02:43,605 --> 00:02:47,645
Coffee brews - automatically.
42
00:02:50,085 --> 00:02:53,085
Technology just like this
is increasingly being
43
00:02:53,085 --> 00:02:56,165
installed into millions of homes
across the world
44
00:02:56,165 --> 00:03:01,485
and it promises to change
the way we live for ever.
45
00:03:01,485 --> 00:03:03,685
So there are sensors of all kinds,
46
00:03:03,685 --> 00:03:05,525
there's lights and locks
and thermostats
47
00:03:05,525 --> 00:03:06,845
and once they're connected
48
00:03:06,845 --> 00:03:09,965
then our platform can make them do
whatever you want them to do.
49
00:03:09,965 --> 00:03:11,405
So as an example,
50
00:03:11,405 --> 00:03:15,045
if I wake up in the morning,
the house knows that I'm waking up.
51
00:03:15,045 --> 00:03:16,445
It can wake up with me.
52
00:03:16,445 --> 00:03:19,725
When we walk in the kitchen,
it will play the local news
53
00:03:19,725 --> 00:03:23,525
and sort of greet us into the day,
tell us the weather forecast
54
00:03:23,525 --> 00:03:26,205
so we know how to dress
for the day and so on.
55
00:03:26,205 --> 00:03:29,685
This technology is known
as the internet of things,
56
00:03:29,685 --> 00:03:33,405
where the objects in our houses -
kitchen appliances,
57
00:03:33,405 --> 00:03:37,445
anything electronic -
can be connected to the internet.
58
00:03:37,445 --> 00:03:43,445
But for it to be useful, we're
going to have to share intimate
details of our private life.
59
00:03:43,445 --> 00:03:45,885
So this is my things app.
60
00:03:45,885 --> 00:03:48,765
It can run on your mobile phone or
on a tablet or something like that.
61
00:03:48,765 --> 00:03:50,805
I can do things like look
at the comings
62
00:03:50,805 --> 00:03:52,365
and goings of family members.
63
00:03:52,365 --> 00:03:54,485
It can automatically detect
when we come and go
64
00:03:54,485 --> 00:03:58,325
based on our mobile phones or
you can have it detect your presence
65
00:03:58,325 --> 00:04:01,205
with a little sensor, you can put it
in your car or something like that.
66
00:04:01,205 --> 00:04:04,525
So when we leave the house,
and there's no-one home,
that's when it'll lock up
67
00:04:04,525 --> 00:04:07,485
and shut down all
the electricity used and so on.
68
00:04:07,485 --> 00:04:10,725
For most of us,
this is deeply private information.
69
00:04:10,725 --> 00:04:11,925
Yet once we hand it over,
70
00:04:11,925 --> 00:04:14,565
we have to trust a company
to keep it confidential.
71
00:04:14,565 --> 00:04:18,045
The consumer really owns 100% of
their own data so they're opting in.
72
00:04:18,045 --> 00:04:21,085
It's not something where
that data would ever be shared
73
00:04:21,085 --> 00:04:23,165
without their giving
their permission.
74
00:04:27,725 --> 00:04:33,045
This house represents a new normal
where even the movements
75
00:04:33,045 --> 00:04:36,405
within our own home are
documented and stored.
76
00:04:36,405 --> 00:04:37,565
It's a new frontier.
77
00:04:39,965 --> 00:04:45,045
The internet is asking us to
redefine what we consider private.
78
00:04:54,405 --> 00:04:57,645
To understand the enormous
changes taking place,
79
00:04:57,645 --> 00:05:00,485
it's necessary to come here.
80
00:05:00,485 --> 00:05:03,125
Almost 150 years ago, this hut
was on the frontier of the world's
81
00:05:03,125 --> 00:05:08,445
first information revolution -
the telegraph.
82
00:05:09,685 --> 00:05:13,485
It was a crucial hub for
a global network of wires -
83
00:05:13,485 --> 00:05:15,685
a role that is
just as important today.
84
00:05:17,765 --> 00:05:21,565
Seeing the cables in a room
like this shows that the physical
85
00:05:21,565 --> 00:05:24,045
infrastructure needed
to move information
86
00:05:24,045 --> 00:05:27,925
around the world hasn't changed
very much in the past hundred years.
87
00:05:27,925 --> 00:05:29,965
I think that the internet,
88
00:05:29,965 --> 00:05:32,965
we tend to think of as a cloud
floating somewhere off in cyberspace,
89
00:05:32,965 --> 00:05:35,205
but they're physical wires,
physical cables
90
00:05:35,205 --> 00:05:40,165
and sometimes wireless signals that
are communicating with each other.
91
00:05:40,165 --> 00:05:43,765
Cornwall, where the
telegraph cables come ashore,
92
00:05:43,765 --> 00:05:47,605
still remains crucial
for today's internet.
93
00:05:47,605 --> 00:05:51,885
25% of all traffic passes
through here.
94
00:05:51,885 --> 00:05:53,885
Running from the United States
95
00:05:53,885 --> 00:05:56,965
and other places to the United
Kingdom are a large number of the
96
00:05:56,965 --> 00:06:03,045
most significant fibre-optic cables
that carry huge amounts of data.
97
00:06:03,045 --> 00:06:06,525
Alongside this information
super-highway is a site
98
00:06:06,525 --> 00:06:10,325
belonging to the UK Government,
GCHQ Bude.
99
00:06:11,645 --> 00:06:14,845
We now know that this listening
station has been gathering
100
00:06:14,845 --> 00:06:19,925
and analysing everything that
comes across these wires.
101
00:06:19,925 --> 00:06:23,165
Any data that passes across the
internet could theoretically come
102
00:06:23,165 --> 00:06:28,325
down these cables, so that's e-mails,
websites, BitTorrent downloads,
103
00:06:28,325 --> 00:06:32,645
the films that you're accessing
through Netflix and online services.
104
00:06:32,645 --> 00:06:38,405
The sheer amount of data
captured here is almost
impossible to comprehend.
105
00:06:38,405 --> 00:06:40,685
In terms of what the GCHQ
were looking at, we know
106
00:06:40,685 --> 00:06:44,965
from internal documents that,
in 2011, they were tapping 200
107
00:06:44,965 --> 00:06:47,965
ten-gigabit cables
coming into Cornwall.
108
00:06:47,965 --> 00:06:52,205
To give a rough idea of how much
data that is, if you were to
109
00:06:52,205 --> 00:06:55,245
digitise the entire contents
of the British Library, then you
110
00:06:55,245 --> 00:07:00,325
could transfer it down that set
of cables in about 40 seconds.
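That figure can be sanity-checked with simple arithmetic. The ~10 TB size used here for the digitised British Library is an assumption for illustration (estimates vary); the cable numbers come from the narration above.

```python
# Back-of-envelope check: 200 cables at 10 Gbit/s each, against an
# assumed ~10 TB (10,000 GB) for the digitised British Library.
cables = 200
gbit_per_s_each = 10
aggregate_gb_per_s = cables * gbit_per_s_each / 8   # bits -> bytes
library_gb = 10_000                                 # assumed figure
seconds = library_gb / aggregate_gb_per_s
print(f"{aggregate_gb_per_s:.0f} GB/s aggregate; "
      f"transfer takes about {seconds:.0f} s")
```

Under that assumption the aggregate rate is 250 GB/s and the transfer takes about 40 seconds, matching the claim.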
111
00:07:00,325 --> 00:07:02,805
Tapping the wires is
surprisingly simple.
112
00:07:10,925 --> 00:07:17,685
The data carried by the fibre-optic
cable just needs to be diverted.
113
00:07:17,685 --> 00:07:20,445
A fibre-optic cable signal
is a beam of light
114
00:07:20,445 --> 00:07:24,525
travelling down a cable
made from glass.
115
00:07:24,525 --> 00:07:27,765
Pulses of light represent
the pieces of information
116
00:07:27,765 --> 00:07:31,205
travelling across the internet,
which is the e-mails, the web pages,
117
00:07:31,205 --> 00:07:34,605
everything that's going
over the internet.
118
00:07:34,605 --> 00:07:38,325
Every 50 miles or so,
that signal becomes sufficiently
119
00:07:38,325 --> 00:07:41,965
weak that it needs to be repeated,
and this is the weak spot.
120
00:07:42,965 --> 00:07:48,445
And it's very easy to insert
an optical tap at that point.
121
00:07:48,445 --> 00:07:52,165
And that's just what GCHQ did.
122
00:07:52,165 --> 00:07:56,525
A device was placed into the beam
of data which created a mirror
123
00:07:56,525 --> 00:07:59,405
image of the millions of e-mails,
web searches
124
00:07:59,405 --> 00:08:02,485
and internet traffic passing through
the cables every second.
125
00:08:05,605 --> 00:08:08,965
What you effectively get is two
copies of the signal, one going
126
00:08:08,965 --> 00:08:12,565
off to the GCHQ and one carrying
on to its original destination.
127
00:08:22,165 --> 00:08:25,445
All of the information going over
those cables is able to be
128
00:08:25,445 --> 00:08:28,685
replayed over the course of
three days so you can rewind
129
00:08:28,685 --> 00:08:31,725
and see what was going over
the internet at a particular moment.
130
00:08:31,725 --> 00:08:35,285
Analysing this amount of data
is an impressive achievement
131
00:08:35,285 --> 00:08:37,965
but it also attracts criticism.
132
00:08:37,965 --> 00:08:41,685
When you see the capacity and the
potential for that technology,
133
00:08:41,685 --> 00:08:45,125
and the fact that it is being used
without transparency
134
00:08:45,125 --> 00:08:48,885
and without very high levels
of accountability, it's incredibly
135
00:08:48,885 --> 00:08:52,885
concerning because the power of that
data to predict and analyse what
136
00:08:52,885 --> 00:08:54,885
we're going to do is very, very high
137
00:08:54,885 --> 00:08:56,925
and giving that power
to somebody else,
138
00:08:56,925 --> 00:09:01,405
regardless of the original or
stated intentions, is very worrying.
139
00:09:08,725 --> 00:09:13,405
We only know about the GCHQ project
thanks to documents released
140
00:09:13,405 --> 00:09:18,245
by US whistle-blower Edward Snowden,
and the revelation has begun
141
00:09:18,245 --> 00:09:22,885
to change the way many people think
about privacy and the internet -
142
00:09:22,885 --> 00:09:27,405
among them the inventor of
the World Wide Web, Tim Berners-Lee.
143
00:09:27,405 --> 00:09:30,045
Information is
a funny sort of power.
144
00:09:30,045 --> 00:09:33,805
The way a government can use it to
145
00:09:33,805 --> 00:09:35,245
keep control of its citizens
146
00:09:35,245 --> 00:09:39,285
is insidious and sneaky,
nasty in some cases.
147
00:09:39,285 --> 00:09:43,245
We have to all learn more about it
148
00:09:43,245 --> 00:09:48,925
and we have to in a way rethink,
rebase a lot of our philosophy.
149
00:09:50,525 --> 00:09:54,405
Part of changing that philosophy
is to understand that our lives are
150
00:09:54,405 --> 00:09:57,845
not analysed in
the first instance by people,
151
00:09:57,845 --> 00:09:59,805
but by computer programs.
152
00:10:01,525 --> 00:10:05,005
Some people just don't have an
understanding about what's possible.
153
00:10:05,005 --> 00:10:07,805
I've heard people say, "Well,
we know nobody's reading our e-mails
154
00:10:07,805 --> 00:10:09,645
"because they don't have
enough people."
155
00:10:09,645 --> 00:10:12,685
Actually, hello, it's not... These
e-mails are not being read by people.
156
00:10:12,685 --> 00:10:14,525
They're being read by machines.
157
00:10:16,765 --> 00:10:19,805
They're being read by machines which
can do the sorts of things
158
00:10:19,805 --> 00:10:23,245
that search engines do
and can look at all the e-mails
159
00:10:23,245 --> 00:10:26,525
and at all the social connections
and can watch.
160
00:10:26,525 --> 00:10:30,565
Sometimes these machines
learn how to spot trends
161
00:10:30,565 --> 00:10:35,685
and can build systems which will
just watch a huge amount of data
162
00:10:35,685 --> 00:10:39,965
and start to pick things out
and then will suggest to the
163
00:10:39,965 --> 00:10:44,165
security agency, well, "These
people need to be investigated."
164
00:10:46,405 --> 00:10:49,845
But then the thing could be wrong.
Boy, we have to have a protection.
165
00:10:52,125 --> 00:10:55,125
The Snowden revelations have
generated greater interest
166
00:10:55,125 --> 00:10:58,485
than ever in how the internet
is being used for the purposes
167
00:10:58,485 --> 00:11:00,685
of surveillance.
168
00:11:00,685 --> 00:11:04,365
But watching isn't
just done by governments.
169
00:11:07,605 --> 00:11:13,445
The most detailed documenting
of our lives is done
by technology companies.
170
00:11:13,445 --> 00:11:17,485
Two years ago, tech researcher
Julia Angwin decided to investigate
171
00:11:17,485 --> 00:11:21,325
how much these companies
track our behaviour daily.
172
00:11:21,325 --> 00:11:25,085
Her findings give us one of
the best pictures yet of just who is
173
00:11:25,085 --> 00:11:29,485
watching us online
every minute of every day.
174
00:11:29,485 --> 00:11:32,405
When I talk about the underbelly
of the information revolution,
175
00:11:32,405 --> 00:11:36,765
I'm really talking
about the unseen downside
176
00:11:36,765 --> 00:11:38,205
of the information revolution.
177
00:11:38,205 --> 00:11:41,045
You know, we obviously have
seen all the benefits of having
178
00:11:41,045 --> 00:11:42,885
all this information
at our fingertips.
179
00:11:42,885 --> 00:11:46,125
But we're just awakening to
the fact that we're also being
180
00:11:46,125 --> 00:11:47,965
surveilled all the time.
181
00:11:47,965 --> 00:11:51,605
We're being monitored in ways
that were never before possible.
182
00:11:55,085 --> 00:11:57,525
Every time we browse the internet,
183
00:11:57,525 --> 00:12:02,165
what we do can be collated
and sold to advertisers.
184
00:12:02,165 --> 00:12:04,805
So, basically, online there
are hundreds of companies that
185
00:12:04,805 --> 00:12:09,285
sort of install invisible
tracking technology on websites.
186
00:12:12,925 --> 00:12:16,005
They've installed basically
a serial number on your computer
187
00:12:16,005 --> 00:12:19,045
and they watch you whenever
they see you across the Web
188
00:12:19,045 --> 00:12:21,285
and build a dossier
about your reading habits,
189
00:12:21,285 --> 00:12:23,925
your shopping habits,
whatever they can obtain.
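The dossier-building described here can be sketched in a few lines: a tracker that sees the same persistent identifier on many sites simply accumulates sightings keyed on it. All identifiers, site names and pages below are invented for illustration.

```python
from collections import defaultdict

# Toy sketch of cross-site tracking: the same "serial number" is seen
# on every participating site, and each sighting is appended to a
# dossier keyed on that identifier. All data here is hypothetical.
dossiers = defaultdict(list)

def record_visit(tracker_id: str, site: str, page: str) -> None:
    """Log one sighting of this identifier on a participating site."""
    dossiers[tracker_id].append((site, page))

record_visit("user-7f3a", "news.example", "/politics")
record_visit("user-7f3a", "shop.example", "/prams")
record_visit("user-7f3a", "health.example", "/pregnancy-tests")

# The accumulated profile, ready to be auctioned to advertisers:
print(dossiers["user-7f3a"])
```

Even this toy version shows why the data is hard to keep anonymous: the pattern of sites alone is revealing.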
190
00:12:23,925 --> 00:12:25,925
And then there's a real
market for that data.
191
00:12:25,925 --> 00:12:27,045
They buy and sell it
192
00:12:27,045 --> 00:12:31,005
and there's an online auction
bidding for information about you.
193
00:12:31,005 --> 00:12:35,085
And so the people who know your
browsing habits can also discover
194
00:12:35,085 --> 00:12:39,925
your deepest secrets, and then
sell them to the highest bidder.
195
00:12:42,005 --> 00:12:44,445
So let's say a woman takes
a pregnancy test
196
00:12:44,445 --> 00:12:46,365
and finds out she's pregnant.
197
00:12:46,365 --> 00:12:49,085
Then she might go
look for something online.
198
00:12:49,085 --> 00:12:52,965
By making pregnancy-related
searches, this woman has become a
199
00:12:52,965 --> 00:12:58,245
hot property for the people looking
for a good target for advertising.
200
00:12:58,245 --> 00:13:00,085
The process of selling then begins.
201
00:13:02,445 --> 00:13:07,125
Within seconds, she is identified
by the companies watching her.
202
00:13:07,125 --> 00:13:09,245
Her profile is now sold
multiple times.
203
00:13:11,125 --> 00:13:15,085
She will then find herself bombarded
with ads relating to pregnancy.
204
00:13:16,685 --> 00:13:20,165
She may well find that she's been
followed around the Web by ads
205
00:13:20,165 --> 00:13:23,045
and that's really
a result of that auction house.
206
00:13:23,045 --> 00:13:26,885
Now they will say to you that they
know, they know she's pregnant
but they don't know her name
207
00:13:26,885 --> 00:13:28,685
but what's happening is now that,
208
00:13:28,685 --> 00:13:31,765
more and more, that information
is really not that anonymous.
209
00:13:31,765 --> 00:13:34,405
You know it's sort of like
if they know you're pregnant,
210
00:13:34,405 --> 00:13:37,205
and they know where you live,
yes maybe they don't know your name.
211
00:13:37,205 --> 00:13:39,685
That's just because they haven't
bothered to look it up.
212
00:13:39,685 --> 00:13:43,525
We continually create new
information about ourselves and this
213
00:13:43,525 --> 00:13:48,365
information gives a continual window
into our behaviours and habits.
214
00:13:48,365 --> 00:13:51,445
The next stage of surveillance
comes from the offices
215
00:13:51,445 --> 00:13:54,805
and buildings around us,
216
00:13:54,805 --> 00:13:58,765
sending out Wi-Fi signals
across the city.
217
00:13:58,765 --> 00:14:03,045
OK, so let's see how many signals
we have right here, Wi-Fi.
218
00:14:03,045 --> 00:14:07,925
So we have one, two -
16, 17, 18, 19,
219
00:14:07,925 --> 00:14:10,765
20, 21, oh, and a few more
just added themselves,
220
00:14:10,765 --> 00:14:12,885
so we're around 25 right now.
221
00:14:12,885 --> 00:14:14,845
Right there at this point
we have 25 signals that
222
00:14:14,845 --> 00:14:16,245
are reaching out, basically
223
00:14:16,245 --> 00:14:18,605
sending a little signal to
my phone saying, "I'm here,"
224
00:14:18,605 --> 00:14:21,165
and my phone is sending a signal
back saying, "I'm also here."
225
00:14:21,165 --> 00:14:23,165
As long as your phone's Wi-Fi
connection is on
226
00:14:23,165 --> 00:14:28,205
and connected to a signal,
you can be tracked.
227
00:14:28,205 --> 00:14:31,445
Google, Apple, other big companies
are racing to map the whole
228
00:14:31,445 --> 00:14:35,325
world using Wi-Fi signals and then,
whenever your phone is somewhere,
229
00:14:35,325 --> 00:14:38,285
they know exactly how far
you are from the closest Wi-Fi
230
00:14:38,285 --> 00:14:41,405
signal and they can map you much
more precisely even than GPS.
231
00:14:41,405 --> 00:14:45,085
And we now know that this data
has been seized by governments
232
00:14:45,085 --> 00:14:50,365
as part of their internet
surveillance operations.
233
00:14:50,365 --> 00:14:53,605
The NSA was like, "Oh, that's
an awesome way to track people.
234
00:14:53,605 --> 00:14:57,285
"Let's scoop up that information
too."
235
00:14:57,285 --> 00:15:01,645
So we have seen that the governments
find all this data irresistible.
236
00:15:05,405 --> 00:15:07,845
But is this simply
a matter of principle?
237
00:15:15,325 --> 00:15:18,285
Is losing privacy
ultimately the price
238
00:15:18,285 --> 00:15:21,965
we pay for peaceful streets
and freedom from terror attacks?
239
00:15:26,165 --> 00:15:28,325
This man thinks
we should look deeper.
240
00:15:30,605 --> 00:15:34,445
Bruce Schneier is a leading
internet security expert.
241
00:15:34,445 --> 00:15:38,845
He was part of the team which first
analysed the Snowden documents.
242
00:15:38,845 --> 00:15:40,725
They don't just want your search data
243
00:15:40,725 --> 00:15:42,965
or your e-mail. They want everything.
244
00:15:42,965 --> 00:15:45,925
They want to tie it to your
real world behaviours - location
245
00:15:45,925 --> 00:15:49,485
data from your cellphone -
and it's these correlations.
246
00:15:49,485 --> 00:15:55,645
And as you are being surveilled
24/7, you are more under control.
247
00:15:55,645 --> 00:15:59,925
Right? You are less free,
you are less autonomous.
248
00:15:59,925 --> 00:16:02,685
Schneier believes the ultimate
result from all this
249
00:16:02,685 --> 00:16:05,605
surveillance may be
a loss of freedom.
250
00:16:05,605 --> 00:16:09,445
What data does is gives someone
control over you.
251
00:16:09,445 --> 00:16:12,405
The reason Google and
Facebook are collecting this
252
00:16:12,405 --> 00:16:14,725
is for psychological manipulation.
253
00:16:14,725 --> 00:16:19,645
That's their stated business
purpose, right? Advertising.
254
00:16:19,645 --> 00:16:21,725
They want to convince you
255
00:16:21,725 --> 00:16:25,645
to buy things you might
not want to buy otherwise.
256
00:16:25,645 --> 00:16:28,445
And so that data is
all about control.
257
00:16:30,005 --> 00:16:32,925
Governments collect it
also for control. Right?
258
00:16:32,925 --> 00:16:34,765
They want to control
their population.
259
00:16:34,765 --> 00:16:36,805
Maybe they're
concerned about dissidents,
260
00:16:36,805 --> 00:16:39,685
maybe they're
concerned about criminals.
261
00:16:43,205 --> 00:16:46,565
It's hard to imagine that
this data can exert such
262
00:16:46,565 --> 00:16:51,445
a level of potential control over us
but looking for the patterns
263
00:16:51,445 --> 00:16:56,565
in the data can give anyone
analysing it huge power.
264
00:16:59,365 --> 00:17:02,805
What's become increasingly
understood is how much is
265
00:17:02,805 --> 00:17:06,045
revealed by meta-data,
by the information about who is
266
00:17:06,045 --> 00:17:09,085
sending messages and how often
they're sending them to each other.
267
00:17:09,085 --> 00:17:12,165
This reveals information
about our social networks, it can
268
00:17:12,165 --> 00:17:16,605
reveal information about the places
that we go on a day-to-day basis.
269
00:17:16,605 --> 00:17:20,485
All this meta-data stacks up to
allow a very detailed
270
00:17:20,485 --> 00:17:22,765
view into our lives
and increasingly what
271
00:17:22,765 --> 00:17:26,125
we find is that these patterns
of communication can even be
272
00:17:26,125 --> 00:17:30,885
used in a predictive sense to
determine factors about our lives.
273
00:17:30,885 --> 00:17:35,805
And what some people fear
is what the spread of this
analysis could lead to.
274
00:17:35,805 --> 00:17:40,565
Ultimately the concern of this is
that, in a dystopian scenario,
275
00:17:40,565 --> 00:17:44,445
you have a situation where
every facet of your life is
276
00:17:44,445 --> 00:17:48,285
something that is open to analysis
by whoever has access to the data.
277
00:17:48,285 --> 00:17:52,965
They can look at the likelihood that
you should be given health insurance
278
00:17:52,965 --> 00:17:57,005
because you come from a family that
has a history of heart disease.
279
00:17:57,005 --> 00:17:59,965
They can look at the likelihood
that you're going to get
280
00:17:59,965 --> 00:18:04,085
Alzheimer's disease because of
the amount of active intellectual
281
00:18:04,085 --> 00:18:06,285
entertainment you take part
in on a day-to-day basis.
282
00:18:06,285 --> 00:18:09,925
And when you start giving this
level of minute control over
283
00:18:09,925 --> 00:18:14,125
people's lives, then you allow far
too much power over individuals.
284
00:18:17,525 --> 00:18:23,445
The picture painted is certainly
dark but there is another way.
285
00:18:27,285 --> 00:18:31,165
It comes from the insights
of a scientist whose work once
286
00:18:31,165 --> 00:18:36,445
seemed like a footnote in the
history of the internet - until now.
287
00:18:41,965 --> 00:18:45,605
The story begins in the late
'70s in Berkeley, California.
288
00:18:52,525 --> 00:18:56,565
Governments and companies had begun
to harness the power of computing.
289
00:19:00,285 --> 00:19:03,285
They seemed to promise
a future of efficiency,
290
00:19:03,285 --> 00:19:06,405
of problems being overcome
thanks to technology.
291
00:19:08,925 --> 00:19:11,565
Yet not everyone was so convinced.
292
00:19:11,565 --> 00:19:16,245
David Chaum was a computer
scientist at Berkeley.
293
00:19:16,245 --> 00:19:19,325
For him, a world in which
we would become increasingly joined
294
00:19:19,325 --> 00:19:23,365
together by machines in
a network held grave dangers.
295
00:19:26,085 --> 00:19:29,245
As computing advanced,
he grew increasingly
296
00:19:29,245 --> 00:19:32,125
convinced of the threats these
networks could pose.
297
00:19:33,525 --> 00:19:35,965
David Chaum was very far
ahead of his time.
298
00:19:35,965 --> 00:19:39,685
He predicted in the early 1980s
concerns that would
299
00:19:39,685 --> 00:19:43,085
arise on the internet
15 or 20 years later -
300
00:19:43,085 --> 00:19:44,765
the whole field of traffic analysis
301
00:19:44,765 --> 00:19:47,405
that allows you to predict
the behaviours of individuals,
302
00:19:47,405 --> 00:19:51,805
not by looking at the contents
of their e-mails but by looking
at the patterns of communication.
303
00:19:51,805 --> 00:19:56,285
David Chaum to some extent foresaw
that and solved the problem.
304
00:20:00,925 --> 00:20:05,845
Well, it's sad to me but it is
really no surprise
305
00:20:05,845 --> 00:20:10,085
that the privacy issue
has unfolded the way it has.
306
00:20:10,085 --> 00:20:16,005
I spelled it out in the early
publications in the '80s.
307
00:20:16,005 --> 00:20:19,565
Chaum's papers explained that
in a future world where we would
308
00:20:19,565 --> 00:20:22,085
increasingly use computers,
it would be easy
309
00:20:22,085 --> 00:20:24,725
to conduct mass surveillance.
310
00:20:24,725 --> 00:20:28,565
Chaum wanted to find
a way to stop it.
311
00:20:28,565 --> 00:20:33,685
I always had a deep feeling that
privacy is intimately tied to
312
00:20:33,685 --> 00:20:37,925
human potential and
313
00:20:37,925 --> 00:20:43,205
that it's an extraordinarily
important aspect of democracy.
314
00:20:45,605 --> 00:20:48,285
Chaum focused on the new
technology of e-mails.
315
00:20:49,925 --> 00:20:52,565
Anyone watching the network
through which these messages
316
00:20:52,565 --> 00:20:57,365
travelled could find out enormous
amounts about that person.
317
00:20:57,365 --> 00:20:59,445
He wanted to make this
more difficult.
318
00:21:00,805 --> 00:21:04,805
Well, I was driving from Berkeley
to Santa Barbara
319
00:21:04,805 --> 00:21:09,405
along the coastline in
my VW Camper van and
320
00:21:09,405 --> 00:21:11,245
out of nowhere... You know, it was
321
00:21:11,245 --> 00:21:14,245
beautiful scenery, I was just
driving along and it occurred to me
322
00:21:14,245 --> 00:21:17,685
how to solve this problem I'd been
trying to solve for a long time.
323
00:21:17,685 --> 00:21:21,405
Yeah, it was a kind of a,
you know, a eureka-moment type.
324
00:21:21,405 --> 00:21:23,485
I felt like, "Hey, this is it."
325
00:21:25,805 --> 00:21:29,205
Chaum's focus was
the pattern of communications that
326
00:21:29,205 --> 00:21:31,525
a computer made on the network.
327
00:21:31,525 --> 00:21:35,405
If that pattern could be disguised
using cryptography, it would
328
00:21:35,405 --> 00:21:39,365
be harder for anyone watching to
identify individuals and carry out
329
00:21:39,365 --> 00:21:42,485
effective surveillance,
330
00:21:42,485 --> 00:21:45,965
and Chaum's system had a twist.
331
00:21:45,965 --> 00:21:50,205
Cryptography has traditionally
been used to provide
332
00:21:50,205 --> 00:21:57,165
secrecy for message content
and so I used this message secrecy
333
00:21:57,165 --> 00:22:03,085
technology of encryption to actually
protect the meta-data of who
334
00:22:03,085 --> 00:22:08,885
talks to who and when,
and that was quite a paradigm shift.
335
00:22:08,885 --> 00:22:12,365
Chaum had realised
something about surveillance.
336
00:22:12,365 --> 00:22:16,845
Who we talk to and when is
just as important as what we say.
337
00:22:19,365 --> 00:22:22,605
The key to avoiding this
type of traffic analysis was to
338
00:22:22,605 --> 00:22:25,365
render the user effectively
anonymous.
339
00:22:30,445 --> 00:22:34,365
But he realised that wasn't enough.
340
00:22:34,365 --> 00:22:39,085
He wanted to build a secure network
and to do this
341
00:22:39,085 --> 00:22:41,125
he needed more anonymous users.
342
00:22:43,645 --> 00:22:46,285
One cannot be anonymous alone.
343
00:22:46,285 --> 00:22:51,965
One can only be anonymous relative
to a set of people.
344
00:22:51,965 --> 00:22:55,725
The more anonymous users you can
gather together in a network,
345
00:22:55,725 --> 00:22:59,925
the harder it becomes for someone
watching to keep track of them,
346
00:22:59,925 --> 00:23:03,365
especially if they're mixed up.
347
00:23:03,365 --> 00:23:06,765
And so a whole batch of input
messages from different
348
00:23:06,765 --> 00:23:14,125
people are shuffled and then sent
to another computer, then shuffled
349
00:23:14,125 --> 00:23:19,085
again and so forth, and you can't
tell as an observer of the network
350
00:23:19,085 --> 00:23:24,085
which item that went in corresponds
to which item coming out.
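The core of Chaum's mix, as he describes it here, is a batch shuffle. A minimal sketch (a real mix would also re-encrypt each message at every hop so its appearance changes, which this toy version omits):

```python
import random

# Toy Chaum mix: collect a batch of messages, shuffle them, and emit
# the batch, so an observer cannot match inputs to outputs by position.
def mix(batch: list[str], rng: random.Random) -> list[str]:
    out = list(batch)
    rng.shuffle(out)
    return out

inbound = ["msg-from-alice", "msg-from-bob", "msg-from-carol"]
outbound = mix(inbound, random.Random(42))

# Same set of messages leaves the mix, but the ordering no longer
# reveals which sender produced which output slot.
print(outbound)
```

Chaining several such mixes, as the narration describes, means every mix in the chain would have to be compromised before an observer could link a specific input to a specific output.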
351
00:23:27,685 --> 00:23:33,005
David Chaum was trying to provide
protection against a world in which
352
00:23:33,005 --> 00:23:36,885
our communications would be analysed
and potentially used against us.
353
00:23:36,885 --> 00:23:42,925
Chaum's response to this was to
say, in order to have a free society,
354
00:23:42,925 --> 00:23:47,845
we need to have freedom
from analysis of our behaviours
and our communications.
355
00:23:47,845 --> 00:23:51,285
But Chaum's system didn't take off
because communication using
356
00:23:51,285 --> 00:23:55,365
e-mail was still the preserve
of a few academics and technicians.
357
00:23:58,365 --> 00:24:00,445
Yet his insights weren't forgotten.
358
00:24:03,085 --> 00:24:06,725
Within a decade, the arrival
of the World Wide Web took
359
00:24:06,725 --> 00:24:09,765
communication increasingly online.
360
00:24:15,765 --> 00:24:18,725
The US Government understood
the importance of protecting
361
00:24:18,725 --> 00:24:22,205
its own online
communications from surveillance.
362
00:24:22,205 --> 00:24:24,485
It began to put money into research.
363
00:24:28,645 --> 00:24:32,085
At the US Naval Research Laboratory,
364
00:24:32,085 --> 00:24:36,805
a team led by scientist
Paul Syverson got to work.
365
00:24:41,285 --> 00:24:46,245
Suppose we wanted to have a system
where people could communicate
366
00:24:46,245 --> 00:24:51,405
back to their home office or
with each other over the internet
367
00:24:51,405 --> 00:24:57,925
but without people being able to
associate source and destination.
368
00:24:57,925 --> 00:25:02,365
Syverson soon came
across the work of David Chaum.
369
00:25:02,365 --> 00:25:06,845
The first work which is associated
with this area is
370
00:25:06,845 --> 00:25:08,165
the work of David Chaum.
371
00:25:09,485 --> 00:25:13,165
A Chaum mix basically
gets its security
372
00:25:13,165 --> 00:25:16,205
because it takes in
a bunch of messages
373
00:25:16,205 --> 00:25:22,125
and then re-orders them and changes
their appearance and spews them out.
374
00:25:22,125 --> 00:25:26,285
But this was now
the age of the World Wide Web.
375
00:25:26,285 --> 00:25:30,325
The Navy wanted to develop anonymous
communications for this new era.
376
00:25:33,765 --> 00:25:37,725
So Syverson and his colleagues set
to work on building a system
377
00:25:37,725 --> 00:25:40,565
that could be used by operatives
across the world.
378
00:25:44,725 --> 00:25:50,605
You have enough of a network with
enough distribution that it's
379
00:25:50,605 --> 00:25:53,765
going to be very
hard for an adversary to be in all
380
00:25:53,765 --> 00:25:58,725
the places and to see all
the traffic wherever it is.
381
00:25:58,725 --> 00:26:02,765
Syverson's system was called
the Tor network.
382
00:26:02,765 --> 00:26:06,885
Tor stands for "the onion router".
It works like this.
383
00:26:10,445 --> 00:26:13,325
A user wants to visit a website
but doesn't want to
384

00:26:13,325 --> 00:26:18,325
reveal the address
that identifies their computer.
385
00:26:18,325 --> 00:26:21,365
As they send the request,
three layers of encryption
386
00:26:21,365 --> 00:26:25,285
are placed around it
like the layers of an onion.
387
00:26:25,285 --> 00:26:28,445
The message is then sent through
a series of computers which
388
00:26:28,445 --> 00:26:30,525
have volunteered to
act as relay points.
389
00:26:33,205 --> 00:26:36,245
As the message
passes from computer to computer,
390
00:26:36,245 --> 00:26:38,885
a layer of encryption is removed.
391
00:26:38,885 --> 00:26:42,925
Each time it is removed,
all the relay computer can see
392
00:26:42,925 --> 00:26:46,205
is an order which tells it to
pass the message on.
393
00:26:46,205 --> 00:26:49,045
The final computer relay
decrypts the innermost
394
00:26:49,045 --> 00:26:53,485
layer of encryption, revealing
the content of the communication.
395
00:26:53,485 --> 00:26:57,365
However - importantly -
the identity of the user is hidden.
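The layering and peeling described in these steps can be sketched as a toy in Python. This is only a structural illustration under my own naming — in real Tor each layer is encrypted with a key negotiated with that relay, whereas here a "layer" is just a dict naming who may open it.

```python
def wrap_onion(message, relays):
    """Toy onion: wrap the message in one layer per relay,
    innermost layer last. Real Tor encrypts each layer; this
    sketch only models the nesting."""
    packet = {"content": message}
    for relay in reversed(relays):
        packet = {"opened_by": relay, "payload": packet}
    return packet

def peel(relay, packet):
    """A relay removes only its own layer and passes the payload
    on. It never sees the full route or, except for the final
    relay, the content."""
    assert packet["opened_by"] == relay
    return packet["payload"]

pkt = wrap_onion("GET example.com", ["relay1", "relay2", "relay3"])
pkt = peel("relay1", pkt)   # outer layer removed
pkt = peel("relay2", pkt)   # middle layer removed
pkt = peel("relay3", pkt)   # final relay uncovers the request
assert pkt == {"content": "GET example.com"}
```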
396
00:27:00,005 --> 00:27:05,725
Somebody who wants to look at things
around the Web and not necessarily
397
00:27:05,725 --> 00:27:09,605
have people know what he's
interested in. It might just be the
398
00:27:09,605 --> 00:27:13,365
local internet services provider,
he doesn't want them to know
399
00:27:13,365 --> 00:27:17,045
which things he's looking at, but it
might be also the destination.
400
00:27:19,885 --> 00:27:21,365
Syverson's system worked.
401
00:27:23,845 --> 00:27:28,045
It was now possible to surf the net
without being watched.
402
00:27:30,365 --> 00:27:34,325
As David Chaum had observed,
the more anonymous people,
403
00:27:34,325 --> 00:27:36,365
the better the security.
404
00:27:36,365 --> 00:27:40,005
The Navy had what they believed was
a smart way of achieving that...
405
00:27:45,205 --> 00:27:47,325
..open the network out to everyone.
406
00:27:50,005 --> 00:27:54,805
It's not enough for a government
system to carry traffic
just for the government.
407
00:27:54,805 --> 00:27:58,805
It also has to carry
traffic for other people.
408
00:27:58,805 --> 00:28:01,965
Part of anonymity is having a large
number of people who are also
409
00:28:01,965 --> 00:28:06,085
anonymous because you can't be
anonymous on your own.
410
00:28:06,085 --> 00:28:11,045
What Syverson and his team had done,
building on the work of David Chaum,
411
00:28:11,045 --> 00:28:15,245
would begin to revolutionise the way
that people could operate online.
412
00:28:19,245 --> 00:28:21,045
Over the coming years,
413
00:28:21,045 --> 00:28:24,725
the Tor network expanded as more
people volunteered to become
414
00:28:24,725 --> 00:28:29,965
relay computers, the points through
which the messages could be relayed.
415
00:28:29,965 --> 00:28:32,885
What Tor did was it made
a useable system for people.
416
00:28:32,885 --> 00:28:34,685
People wanted to protect
what they were
417
00:28:34,685 --> 00:28:38,605
looking at on the internet
from being watched by their ISP or
418
00:28:38,605 --> 00:28:42,565
their government or the company that
they're working for at the time.
419
00:28:42,565 --> 00:28:45,645
But Tor's success wasn't
just down to the Navy.
420
00:28:49,885 --> 00:28:54,125
In the mid-2000s, they handed
the network over to a non-profit
421
00:28:54,125 --> 00:28:56,885
organisation who
overhauled the system.
422
00:28:59,525 --> 00:29:03,325
Now the network would be represented
by people like this -
423
00:29:03,325 --> 00:29:04,525
Jake Appelbaum...
424
00:29:07,045 --> 00:29:10,085
..researchers dedicated
to the opportunities
425
00:29:10,085 --> 00:29:12,325
they felt Tor could
give for free speech.
426
00:29:16,005 --> 00:29:17,805
We work with the research community
427
00:29:17,805 --> 00:29:20,805
all around the world, the academic
community, the hacker community.
428
00:29:20,805 --> 00:29:22,685
It's a free software project
429
00:29:22,685 --> 00:29:25,045
so that means that all
the source code is available.
430
00:29:25,045 --> 00:29:27,485
That means that anyone can
look at it and see how it works,
431
00:29:27,485 --> 00:29:30,765
and that means everybody that does
and shares it with us helps improve
432
00:29:30,765 --> 00:29:34,765
the programme for
everyone else on the planet
and the network as a whole.
433
00:29:36,365 --> 00:29:43,325
Appelbaum now travels the world
promoting the use of the software.
434
00:29:43,325 --> 00:29:46,565
The Tor network gives each person
the ability to read without
435
00:29:46,565 --> 00:29:50,645
creating a data trail that will
later be used against them.
436
00:29:50,645 --> 00:29:53,685
It gives every person a voice.
437
00:29:53,685 --> 00:29:56,245
Every person has the right
to read and to speak freely,
438
00:29:56,245 --> 00:29:58,365
not one human excluded.
439
00:29:58,365 --> 00:30:02,325
And one place Tor has become
important is the Middle East.
440
00:30:04,645 --> 00:30:06,005
During the Arab Spring,
441
00:30:06,005 --> 00:30:09,525
as disturbances spread across
the region, it became a vital
442
00:30:09,525 --> 00:30:10,725
tool for dissidents...
443
00:30:12,605 --> 00:30:15,605
..especially in places like Syria.
444
00:30:17,885 --> 00:30:22,525
One of those who used it from
the beginning was opposition
activist Reem al Assil.
445
00:30:25,965 --> 00:30:29,365
I found out first about Tor
back in 2011.
446
00:30:32,245 --> 00:30:37,325
Surveillance in Syria is
a very big problem for activists
447
00:30:37,325 --> 00:30:39,405
or for anyone even,
448
00:30:39,405 --> 00:30:43,845
because the Syrian regime are trying
all the time to get into people's
449
00:30:43,845 --> 00:30:50,125
e-mails and Facebook to see what
they are up to, what they are doing.
450
00:30:50,125 --> 00:30:52,565
By the time the Syrian
uprising happened,
451
00:30:52,565 --> 00:30:55,645
the Tor project had developed
a browser which made
452
00:30:55,645 --> 00:30:59,085
downloading the software
very simple.
453
00:30:59,085 --> 00:31:02,485
You basically go
and download Tor in your computer
454
00:31:02,485 --> 00:31:07,005
and once it's installed,
whenever you want to browse the Web,
455
00:31:07,005 --> 00:31:13,085
you go and click on it just like
Internet Explorer or Google Chrome.
456
00:31:13,085 --> 00:31:16,845
Reem had personal experience
of the protection offered by Tor
457
00:31:16,845 --> 00:31:19,285
when she was
arrested by the secret police.
458
00:31:22,085 --> 00:31:26,285
I denied having any relation with
any opposition work, you know,
459
00:31:26,285 --> 00:31:29,485
or anything
and they tried to intimidate me
460
00:31:29,485 --> 00:31:34,205
and they said, "Well, see, we have...
we know everything about you so now,
461
00:31:34,205 --> 00:31:38,685
"just we need you to tell us," but I
knew that they don't know anything.
462
00:31:38,685 --> 00:31:43,325
By using Tor, I was an anonymous
user so they couldn't tell
463
00:31:43,325 --> 00:31:47,845
that Reem is doing so-and-so,
is watching so-and-so.
464
00:31:47,845 --> 00:31:51,485
So that's why Tor
protected me in this way.
465
00:31:53,685 --> 00:31:56,685
Syria was not the only place
where Tor was vital.
466
00:32:01,125 --> 00:32:05,725
It's used in China and in Iran.
467
00:32:05,725 --> 00:32:08,965
In any country where internet
access is restricted,
468
00:32:08,965 --> 00:32:16,125
Tor can be used by citizens to avoid
the gaze of the authorities.
469
00:32:16,125 --> 00:32:19,165
China, for example, regularly
attacks and blocks the Tor network
470
00:32:19,165 --> 00:32:22,285
and they don't attack us directly
471
00:32:22,285 --> 00:32:24,605
so much as they actually attack
people in China using Tor.
472
00:32:24,605 --> 00:32:26,925
They stop them
from using the Tor network.
473
00:32:28,845 --> 00:32:34,325
But Tor wasn't just helping people
inside repressive regimes.
474
00:32:34,325 --> 00:32:39,205
It was now being used for
whistle-blowing in the West,
475
00:32:39,205 --> 00:32:42,245
through WikiLeaks,
476
00:32:42,245 --> 00:32:44,485
founded by Julian Assange.
477
00:32:49,445 --> 00:32:52,005
Assange has spent
the last two years under
478
00:32:52,005 --> 00:32:55,285
the protection of
the Ecuadorian Embassy in London.
479
00:32:55,285 --> 00:32:59,445
He is fighting extradition to
Sweden on sexual assault charges,
480
00:32:59,445 --> 00:33:00,725
charges he denies.
481
00:33:01,965 --> 00:33:04,965
I'd been involved in cryptography
and anonymous communications
482
00:33:04,965 --> 00:33:08,845
for almost 20 years,
since the early 1990s.
483
00:33:08,845 --> 00:33:11,605
Cryptographic anonymity
didn't come from nowhere.
484
00:33:11,605 --> 00:33:14,365
It was a long-standing quest
which had a Holy Grail,
485
00:33:14,365 --> 00:33:16,765
which is to be able to communicate
486
00:33:16,765 --> 00:33:20,445
individual-to-individual
freely and anonymously.
487
00:33:20,445 --> 00:33:22,685
Tor was the first protocol,
488
00:33:22,685 --> 00:33:24,725
first anonymous protocol,
489
00:33:24,725 --> 00:33:26,965
that got the balance right.
490
00:33:28,805 --> 00:33:32,245
From its early years, people
who wanted to submit documents
491
00:33:32,245 --> 00:33:34,845
anonymously to WikiLeaks
could use Tor.
492
00:33:36,805 --> 00:33:39,645
Tor was and is
one of the mechanisms
493
00:33:39,645 --> 00:33:42,205
which we have received
important documents, yes.
494
00:33:45,445 --> 00:33:47,965
One man who provided
a link between WikiLeaks
495
00:33:47,965 --> 00:33:50,685
and the Tor project
was Jake Appelbaum.
496
00:33:53,365 --> 00:33:55,805
Sources that want to leak documents
497
00:33:55,805 --> 00:33:58,445
need to be able to
communicate with WikiLeaks
498
00:33:58,445 --> 00:34:00,325
and it has always been the case
499
00:34:00,325 --> 00:34:04,565
that they have offered a Tor-hidden
service and that Tor-hidden service
500
00:34:04,565 --> 00:34:08,605
allows people to reach
the WikiLeaks submission engine.
501
00:34:10,325 --> 00:34:12,965
In 2010, what could be achieved
502
00:34:12,965 --> 00:34:17,165
when web activism met anonymity
was revealed to the world.
503
00:34:21,765 --> 00:34:23,605
WikiLeaks received a huge leak
504
00:34:23,605 --> 00:34:26,645
of confidential US government
material,
505
00:34:26,645 --> 00:34:30,325
mainly relating to the wars
in Afghanistan and Iraq.
506
00:34:35,805 --> 00:34:38,045
The first release was this footage
507
00:34:38,045 --> 00:34:41,285
which showed a US Apache
helicopter attack in Iraq.
508
00:34:49,005 --> 00:34:52,565
Among those dead were two
journalists from Reuters,
509
00:34:52,565 --> 00:34:55,125
Saeed Chmagh and Namir Noor-Eldeen.
510
00:34:56,245 --> 00:34:58,565
The Americans investigated,
511
00:34:58,565 --> 00:35:01,005
but say there was no wrong-doing
512
00:35:01,005 --> 00:35:05,485
yet this footage would never have
become public if Chelsea Manning,
513
00:35:05,485 --> 00:35:08,205
a US Army analyst in Iraq,
hadn't leaked it.
514
00:35:09,605 --> 00:35:13,245
Chelsea Manning has said
that to the court,
515
00:35:13,245 --> 00:35:16,245
that he used Tor
amongst a number of other things
516
00:35:16,245 --> 00:35:18,565
to submit documents to WikiLeaks.
517
00:35:18,565 --> 00:35:20,525
Obviously, we can't comment on that,
518
00:35:20,525 --> 00:35:23,125
because we have an obligation
to protect our sources.
519
00:35:26,045 --> 00:35:28,845
The release of the documents
provided by Manning,
520
00:35:28,845 --> 00:35:32,245
which culminated in 250,000 cables,
521
00:35:32,245 --> 00:35:35,125
seemed to reveal the power
of anonymity through encryption.
522
00:35:37,965 --> 00:35:39,525
Because with encryption,
523
00:35:39,525 --> 00:35:43,325
two people can come together
to communicate privately.
524
00:35:43,325 --> 00:35:46,725
The full might of a superpower
cannot break that encryption
525
00:35:46,725 --> 00:35:50,365
if it is properly implemented
and that's an extraordinary thing,
526
00:35:50,365 --> 00:35:54,405
where individuals are given
a certain type of freedom of action
527
00:35:54,405 --> 00:35:57,445
that is equivalent to the freedom
of action that a superpower has.
528
00:35:59,165 --> 00:36:02,045
But it wasn't the technology that
let Manning down.
529
00:36:04,405 --> 00:36:07,925
He confessed what he had done
to a contact and was arrested.
530
00:36:10,965 --> 00:36:15,165
Those who had used anonymity to leak
secrets were now under fire.
531
00:36:19,205 --> 00:36:22,125
WikiLeaks had already been
criticised for the release
532
00:36:22,125 --> 00:36:25,085
of un-redacted documents revealing
the names of Afghans
533
00:36:25,085 --> 00:36:26,725
who had assisted the US.
534
00:36:27,925 --> 00:36:31,405
The US government
was also on the attack.
535
00:36:31,405 --> 00:36:34,645
The United States strongly
condemns the illegal
536
00:36:34,645 --> 00:36:38,285
disclosure
of classified information.
537
00:36:38,285 --> 00:36:42,805
It puts people's lives in danger,
threatens our national security...
538
00:36:45,005 --> 00:36:49,445
Meanwhile, the Tor project,
started by the US government,
539
00:36:49,445 --> 00:36:50,685
was becoming a target.
540
00:36:53,125 --> 00:36:55,725
It's very funny, right,
because on the one hand,
541
00:36:55,725 --> 00:36:59,125
these people are funding Tor because
they say they believe in anonymity.
542
00:36:59,125 --> 00:37:01,885
And on the other hand,
they're detaining me at airports,
543
00:37:01,885 --> 00:37:04,485
threatening me and doing
things like that.
544
00:37:04,485 --> 00:37:07,525
And they've even said to me,
"We love what you do in Iran
545
00:37:07,525 --> 00:37:09,445
"and in China,
in helping Tibetan people.
546
00:37:09,445 --> 00:37:11,405
"We love all the stuff
that you're doing,
547
00:37:11,405 --> 00:37:13,245
"but why do you have to do it here?"
548
00:37:14,845 --> 00:37:16,285
Thanks to Edward Snowden,
549
00:37:16,285 --> 00:37:19,125
we now know that this
culminated in the Tor network
550
00:37:19,125 --> 00:37:23,365
being the focus of failed attacks by
America's National Security Agency.
551
00:37:24,725 --> 00:37:26,645
They revealed their frustration
552
00:37:26,645 --> 00:37:30,885
in a confidential PowerPoint
presentation called Tor Stinks,
553
00:37:30,885 --> 00:37:35,765
which set out the ways in which the
NSA had tried to crack the network.
554
00:37:35,765 --> 00:37:39,845
They think Tor stinks
because they want to attack people
555
00:37:39,845 --> 00:37:42,885
and sometimes technology
makes that harder.
556
00:37:42,885 --> 00:37:46,765
It is because the users have
something which bothers them,
557
00:37:46,765 --> 00:37:48,645
which is real autonomy.
558
00:37:48,645 --> 00:37:51,845
It gives them true privacy
and security.
559
00:37:51,845 --> 00:37:54,885
Tor, invented and funded
by the US government,
560
00:37:54,885 --> 00:37:58,125
was now used by activists,
journalists,
561
00:37:58,125 --> 00:38:01,805
anybody who wanted to
communicate anonymously,
562
00:38:01,805 --> 00:38:04,445
and it wasn't long
before its potential
563
00:38:04,445 --> 00:38:07,045
began to attract
a darker type of user.
564
00:38:11,645 --> 00:38:13,365
It began here in Washington.
565
00:38:14,485 --> 00:38:16,605
Jon Iadonisi is a former Navy SEAL
566
00:38:16,605 --> 00:38:19,725
turned advisor on cyber operations
to government.
567
00:38:21,885 --> 00:38:26,045
There was some tips that came in
out of Baltimore,
568
00:38:26,045 --> 00:38:27,965
two federal agents saying,
569
00:38:27,965 --> 00:38:29,805
"You are a police officer,
570
00:38:29,805 --> 00:38:32,285
"you really should take
a look at this website
571
00:38:32,285 --> 00:38:34,685
"and, oh, by the way,
the only way you get to it
572
00:38:34,685 --> 00:38:36,525
"is if you anonymise yourself
573
00:38:36,525 --> 00:38:39,165
"through something
called the Tor router."
574
00:38:39,165 --> 00:38:43,845
What they found was
a website called Silk Road.
575
00:38:43,845 --> 00:38:48,085
And they were amazed when they were
able to download this plug-in
576
00:38:48,085 --> 00:38:50,925
on their browser,
go into the Silk Road
577
00:38:50,925 --> 00:38:55,005
and then from there, see, literally
they can make a purchase,
578
00:38:55,005 --> 00:38:57,525
it was like a buffet dinner
for narcotics.
579
00:39:00,965 --> 00:39:03,725
Silk Road was
a global drugs marketplace
580
00:39:03,725 --> 00:39:06,845
which brought together
anonymous buyers and sellers
581
00:39:06,845 --> 00:39:08,205
from around the world.
582
00:39:10,245 --> 00:39:13,085
And what they found was
people aren't just buying
583
00:39:13,085 --> 00:39:15,125
one or two instances
of designer drugs
584
00:39:15,125 --> 00:39:18,165
but they're buying massive
quantity wholesale.
585
00:39:18,165 --> 00:39:22,845
In London, the tech community
was watching closely.
586
00:39:22,845 --> 00:39:25,765
Thomas Olofsson was
one of many interested
587
00:39:25,765 --> 00:39:27,965
in how much money
the site was making.
588
00:39:27,965 --> 00:39:31,765
Well, in Silk Road, they have
something called an "escrow" system
589
00:39:31,765 --> 00:39:35,605
so if you want to buy drugs,
you pay money into Silk Road
590
00:39:35,605 --> 00:39:36,845
as a facilitator
591
00:39:36,845 --> 00:39:40,325
and they keep the money
in escrow until you sign off
592
00:39:40,325 --> 00:39:43,125
that you have had
your drugs delivered.
593
00:39:43,125 --> 00:39:47,645
They will then release
your money to the drug dealer.
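The escrow flow just described — buyer pays the facilitator, the facilitator holds the money until the buyer signs off on delivery, then releases it to the seller — is an ordinary escrow state machine. A minimal sketch, with illustrative names (this is not Silk Road's actual code):

```python
class Escrow:
    """Toy escrow: the marketplace holds the buyer's payment
    until the buyer confirms delivery, then releases it."""

    def __init__(self, amount):
        self.amount = amount
        self.state = "funded"       # buyer has paid the facilitator

    def confirm_delivery(self):
        """Buyer signs off; the held funds go to the seller."""
        if self.state != "funded":
            raise RuntimeError("nothing held in escrow")
        self.state = "released"
        return self.amount

deal = Escrow(amount=100)
paid_to_seller = deal.confirm_delivery()
assert paid_to_seller == 100 and deal.state == "released"
```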
594
00:39:47,645 --> 00:39:51,245
We're talking about several
millions a day in trade.
595
00:39:54,885 --> 00:39:57,445
The extraordinary success
of Silk Road
596
00:39:57,445 --> 00:40:00,605
attracted new customers
to new illegal sites.
597
00:40:01,725 --> 00:40:05,725
This part of the internet
even had a new name - the Dark Web.
598
00:40:07,045 --> 00:40:10,165
A dark website
is impossible to shut down
599
00:40:10,165 --> 00:40:13,565
because you don't know where
a dark website is hosted
600
00:40:13,565 --> 00:40:17,485
or even where it's physically
located or who's behind it.
601
00:40:20,925 --> 00:40:23,965
And there was one other thing
which made Silk Road
602
00:40:23,965 --> 00:40:26,525
and its imitators difficult to stop.
603
00:40:26,525 --> 00:40:33,005
You paid with a new currency that
only exists online, called Bitcoin.
604
00:40:33,005 --> 00:40:36,045
Before, even if
you had anonymity as a user,
605
00:40:36,045 --> 00:40:39,405
you could still track
the transactions,
606
00:40:39,405 --> 00:40:41,805
the money flowing between persons,
607
00:40:41,805 --> 00:40:45,725
because if you use your Visa card,
your ATM, bank transfer,
608
00:40:45,725 --> 00:40:49,365
Western Union, there is always...
Money leaves a mark.
609
00:40:49,365 --> 00:40:51,925
This is the first time that
you can anonymously
610
00:40:51,925 --> 00:40:53,805
move money between two persons.
611
00:41:02,565 --> 00:41:06,005
Bitcoin is no longer
an underground phenomenon.
612
00:41:06,005 --> 00:41:08,845
Buy Bitcoin, $445.
613
00:41:08,845 --> 00:41:13,365
Sellers will sell at $460.
614
00:41:13,365 --> 00:41:15,205
If you're buying less
than half a Bitcoin,
615
00:41:15,205 --> 00:41:16,805
you'll have to go to market price.
616
00:41:16,805 --> 00:41:20,045
This Bitcoin event
is taking place on Wall Street.
617
00:41:22,885 --> 00:41:25,325
45 bid, 463 asked.
618
00:41:28,045 --> 00:41:29,765
But how does the currency work?
619
00:41:31,005 --> 00:41:35,045
One of the people best placed
to explain is Peter Todd.
620
00:41:35,045 --> 00:41:38,125
He's chief scientist for a number
of Bitcoin companies
621
00:41:38,125 --> 00:41:42,165
including one of the hottest,
Dark Wallet.
622
00:41:43,885 --> 00:41:46,685
So, what Bitcoin is,
is it's virtual money.
623
00:41:46,685 --> 00:41:48,485
I can give it to you electronically,
624
00:41:48,485 --> 00:41:51,205
you can give it to someone
else electronically.
625
00:41:51,205 --> 00:41:53,365
The key thing to understand
about Bitcoin
626
00:41:53,365 --> 00:41:57,045
is that these two people trading
here are making a deal
627
00:41:57,045 --> 00:41:59,245
without any bank involvement.
628
00:41:59,245 --> 00:42:01,765
It's a form of electronic cash
629
00:42:01,765 --> 00:42:04,725
and that has massive implications.
630
00:42:04,725 --> 00:42:07,525
So, of course, in normal
electronic banking systems,
631
00:42:07,525 --> 00:42:10,485
what I would say is, "Please
transfer money from my account
632
00:42:10,485 --> 00:42:12,845
"to someone else's account,"
but fundamentally,
633
00:42:12,845 --> 00:42:17,125
who owns what money is recorded
by the bank, by the intermediary?
634
00:42:18,965 --> 00:42:22,605
What's really interesting about
Bitcoin is this virtual money
635
00:42:22,605 --> 00:42:25,765
is not controlled by a bank,
it's not controlled by government.
636
00:42:25,765 --> 00:42:27,645
It's controlled by an algorithm.
637
00:42:27,645 --> 00:42:31,325
The algorithm in question
is a triumph of mathematics.
638
00:42:32,365 --> 00:42:35,285
This is a Bitcoin
transaction in action.
639
00:42:37,765 --> 00:42:41,245
To pay someone in Bitcoin,
the transaction must be signed
640
00:42:41,245 --> 00:42:45,925
using a cryptographic key which
proves the parties agree to it.
641
00:42:45,925 --> 00:42:49,885
An unchangeable electronic record
is then made of this transaction.
642
00:42:50,925 --> 00:42:54,085
This electronic record
contains every transaction
643
00:42:54,085 --> 00:42:56,725
ever made on Bitcoin.
644
00:42:56,725 --> 00:42:58,885
It's called the block chain
645
00:42:58,885 --> 00:43:03,645
and it's stored in a distributed
form by every user,
646
00:43:03,645 --> 00:43:06,765
not by a bank or other authority.
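The "unchangeable electronic record" described here can be sketched as a hash chain: each block commits to the previous block's hash, so rewriting any past transaction breaks every later link. A minimal sketch with my own function names — real Bitcoin additionally requires signed transactions and proof-of-work:

```python
import hashlib
import json

def block_hash(prev, transactions):
    """Hash a block's transactions together with the previous
    block's hash, chaining the whole history together."""
    encoded = json.dumps({"prev": prev, "tx": transactions},
                         sort_keys=True).encode()
    return hashlib.sha256(encoded).hexdigest()

def append_block(chain, transactions):
    prev = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"prev": prev, "tx": transactions,
                  "hash": block_hash(prev, transactions)})

def verify(chain):
    """Recompute every hash; any tampering breaks the chain."""
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev:
            return False
        if block_hash(block["prev"], block["tx"]) != block["hash"]:
            return False
        prev = block["hash"]
    return True

chain = []
append_block(chain, [{"from": "alice", "to": "bob", "amount": 1}])
append_block(chain, [{"from": "bob", "to": "carol", "amount": 1}])
assert verify(chain)
chain[0]["tx"][0]["amount"] = 100   # try to rewrite history...
assert not verify(chain)            # ...and verification fails
```

Because every user stores and checks this chain, no bank or other authority is needed to agree on who owns what.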
647
00:43:06,765 --> 00:43:10,125
The block chain is really
the revolutionary part of Bitcoin.
648
00:43:10,125 --> 00:43:12,845
What's really unique
about it is it's all public
649
00:43:12,845 --> 00:43:16,125
so you can run the Bitcoin
algorithm on your computer
650
00:43:16,125 --> 00:43:19,085
and your computer's inspecting
every single transaction
651
00:43:19,085 --> 00:43:21,325
to be sure that it
actually followed the rules,
652
00:43:21,325 --> 00:43:23,325
and the rules
are really what Bitcoin is.
653
00:43:23,325 --> 00:43:26,085
You know, that's the rules of
the system, that's the algorithm.
654
00:43:26,085 --> 00:43:29,445
It says things like, "You can only
send money to one person at once,"
655
00:43:29,445 --> 00:43:32,085
and we all agree to those rules.
656
00:43:32,085 --> 00:43:34,405
And Bitcoin has one other
characteristic
657
00:43:34,405 --> 00:43:37,045
which it shares with cash.
658
00:43:37,045 --> 00:43:40,205
It can be very hard to trace.
659
00:43:40,205 --> 00:43:42,645
Well, what's controversial
about Bitcoin
660
00:43:42,645 --> 00:43:46,085
is that it goes back
to something quite like cash.
661
00:43:46,085 --> 00:43:49,205
It's not like a bank account
where a government investigator
662
00:43:49,205 --> 00:43:51,325
can just call up the bank
and get all the records
663
00:43:51,325 --> 00:43:54,805
of who I've ever transacted
with without any effort at all.
664
00:43:54,805 --> 00:43:57,885
You know, if they want to go and
find out where I got my Bitcoins,
665
00:43:57,885 --> 00:43:59,325
they're going to have to ask me.
666
00:43:59,325 --> 00:44:01,125
They're going to have
to investigate.
667
00:44:04,765 --> 00:44:08,045
The emergence of Bitcoin
and the growth of the Dark Web
668
00:44:08,045 --> 00:44:12,125
was now leading law enforcement in
Washington to take a close interest.
669
00:44:13,885 --> 00:44:18,165
Transactions in the Dark Web
were unbelievably more enabled
670
00:44:18,165 --> 00:44:20,605
and in many cases could exist
671
00:44:20,605 --> 00:44:23,845
because of this virtual currency
known as Bitcoin.
672
00:44:25,085 --> 00:44:27,725
So, as people started
to take a look at Bitcoin
673
00:44:27,725 --> 00:44:30,645
and understand, "How do we regulate
this, how do we monitor this?
674
00:44:30,645 --> 00:44:33,845
"Oh, my God! It's completely
anonymous, we have no record,"
675
00:44:33,845 --> 00:44:37,565
they started seeing transactions
in the sort of the digital exhaust
676
00:44:37,565 --> 00:44:40,125
that led them into Silk Road.
677
00:44:40,125 --> 00:44:44,285
And so, now you had criminal grounds
to start taking a look at this
678
00:44:44,285 --> 00:44:49,645
coupled with the movement from the
financial side and regulatory side
679
00:44:49,645 --> 00:44:51,725
on the virtual currency.
680
00:44:51,725 --> 00:44:55,245
So these two fronts
began converging.
681
00:44:55,245 --> 00:44:57,605
The FBI began to mount
a complex plot
682
00:44:57,605 --> 00:45:01,045
against the alleged lead
administrator of Silk Road
683
00:45:01,045 --> 00:45:04,125
who used the pseudonym
"Dread Pirate Roberts".
684
00:45:06,525 --> 00:45:10,205
They really started focusing
on Dread Pirate Roberts
685
00:45:10,205 --> 00:45:14,445
and his role as not just a leader
but sort of the mastermind
686
00:45:14,445 --> 00:45:17,165
in the whole
ecosystem of the platform, right,
687
00:45:17,165 --> 00:45:21,925
from management administratively
to financial merchandising
688
00:45:21,925 --> 00:45:24,805
to vendor placement,
recruitment, etc.
689
00:45:26,445 --> 00:45:30,125
Dread Pirate Roberts was believed
to be Ross Ulbricht,
690
00:45:30,125 --> 00:45:32,205
listed on his LinkedIn entry
691
00:45:32,205 --> 00:45:37,165
as an investment advisor and
entrepreneur from Austin, Texas.
692
00:45:37,165 --> 00:45:42,485
Taking him down was really almost
a story out of a Hollywood movie.
693
00:45:42,485 --> 00:45:45,285
I mean, we had people
that staged the death
694
00:45:45,285 --> 00:45:47,765
of one of the potential informants.
695
00:45:47,765 --> 00:45:51,805
We had undercover police officers
acting as cocaine...
696
00:45:51,805 --> 00:45:54,445
under-kingpins inside Silk Road.
697
00:45:54,445 --> 00:45:57,925
So, at the conclusion of all
these different elements,
698
00:45:57,925 --> 00:46:03,205
they actually finally ended up
bringing in Dread Pirate Roberts
699
00:46:03,205 --> 00:46:07,285
and now are trying
to move forward with his trial.
700
00:46:07,285 --> 00:46:09,685
Whether or not he is
Dread Pirate Roberts,
701
00:46:09,685 --> 00:46:14,245
Ulbricht's arrest in 2013
brought down Silk Road.
702
00:46:14,245 --> 00:46:18,245
In the process, the FBI seized
$28.5 million
703
00:46:18,245 --> 00:46:22,645
from the site's escrow account,
money destined for drug dealers.
704
00:46:25,965 --> 00:46:28,405
Yet the problem
for the US authorities
705
00:46:28,405 --> 00:46:30,925
was that their success
was only temporary.
706
00:46:35,045 --> 00:46:37,125
The site is now up
and running again.
707
00:46:38,565 --> 00:46:42,005
It re-emerged just two months
later by some other guys
708
00:46:42,005 --> 00:46:44,245
that took the same code base,
the same,
709
00:46:44,245 --> 00:46:46,245
actually the same site more or less,
710
00:46:46,245 --> 00:46:48,165
just other people running the site,
711
00:46:48,165 --> 00:46:53,445
because it's obviously
a very, very profitable site to run.
712
00:46:53,445 --> 00:46:57,245
And Silk Road has now been
joined by a host of other sites.
713
00:47:00,165 --> 00:47:04,645
One of the boom industries
on the Dark Web is financial crime.
714
00:47:05,965 --> 00:47:08,205
Yeah, this website is quite focused
715
00:47:08,205 --> 00:47:10,805
on credit card numbers
and stolen data.
716
00:47:10,805 --> 00:47:13,885
On here, for instance,
is credit card numbers
717
00:47:13,885 --> 00:47:16,725
from around $5-6 per
credit card number
718
00:47:16,725 --> 00:47:19,965
paying in equivalent
of Bitcoins, totally anonymous.
719
00:47:21,405 --> 00:47:24,885
The cards are sold in
something called a "dump".
720
00:47:26,245 --> 00:47:30,725
I mean, it's quite a large number
of entries, it's tens of thousands.
721
00:47:30,725 --> 00:47:34,085
You get everything you need to
be able to buy stuff online
722
00:47:34,085 --> 00:47:38,885
with these credit cards -
first name, last name, address,
723
00:47:38,885 --> 00:47:41,605
the card number,
everything, the CVV number,
724
00:47:41,605 --> 00:47:44,565
everything
but the PIN code, basically.
725
00:47:44,565 --> 00:47:48,205
The Dark Web is now used for
various criminal activities -
726
00:47:48,205 --> 00:47:51,045
drugs and guns, financial crime,
727
00:47:51,045 --> 00:47:54,085
and even child sexual exploitation.
728
00:48:01,685 --> 00:48:05,045
So, is anonymity
a genuine threat to society?
729
00:48:08,205 --> 00:48:10,085
A nightmare that should haunt us?
730
00:48:15,005 --> 00:48:19,965
This man who leads Europe's fight
against cybercrime believes so.
731
00:48:19,965 --> 00:48:23,765
The Tor network plays a role
because it hides criminals.
732
00:48:23,765 --> 00:48:25,605
I know it was not the intention,
733
00:48:25,605 --> 00:48:28,445
but that's the outcome
and this is my job,
734
00:48:28,445 --> 00:48:32,085
to tell the society
what is the trade-offs here.
735
00:48:32,085 --> 00:48:35,445
By having no possibilities
to penetrate this,
736
00:48:35,445 --> 00:48:37,325
we will then secure criminals
737
00:48:37,325 --> 00:48:40,325
that they can continue their
crimes on a global network.
738
00:48:42,445 --> 00:48:45,405
And despite the success over
Silk Road and others,
739
00:48:45,405 --> 00:48:47,365
he is worried for the future.
740
00:48:49,925 --> 00:48:53,205
Our detection rate is dropping.
It's very simple.
741
00:48:53,205 --> 00:48:56,645
The business model of the criminal
is to make profit with low risk
742
00:48:56,645 --> 00:49:00,205
and, here, you actually
eliminate the risk
743
00:49:00,205 --> 00:49:02,525
because there is no risk
to get identified
744
00:49:02,525 --> 00:49:05,365
so either you have to screw up
or be very unlucky
745
00:49:05,365 --> 00:49:06,845
as somebody rats you out.
746
00:49:06,845 --> 00:49:09,685
Otherwise, you're secure. So, if you
run a tight operation,
747
00:49:09,685 --> 00:49:12,925
it's very, very difficult
for the police to penetrate
748
00:49:12,925 --> 00:49:14,565
so it's risk-free crime.
749
00:49:14,565 --> 00:49:18,005
So, does the anonymity offered
by the Tor network
750
00:49:18,005 --> 00:49:21,605
encourage crime
or simply displace it?
751
00:49:21,605 --> 00:49:26,205
Those who work for the project are
well aware of the charges it faces.
752
00:49:26,205 --> 00:49:30,405
There is often asserted
certain narratives about anonymity
753
00:49:30,405 --> 00:49:35,565
and, of course, one of the narratives
is that anonymity creates crime
754
00:49:35,565 --> 00:49:38,125
so you hear about things
like the Silk Road and you hear,
755
00:49:38,125 --> 00:49:41,525
"Oh, it's terrible, someone can do
something illegal on the internet."
756
00:49:41,525 --> 00:49:42,925
Well, welcome to the internet.
757
00:49:42,925 --> 00:49:45,125
It is a reflection of human society
758
00:49:45,125 --> 00:49:48,045
where there is sometimes
illegal behaviour.
759
00:49:48,045 --> 00:49:49,845
These arguments aside,
760
00:49:49,845 --> 00:49:53,245
for users wanting
to avoid surveillance,
761
00:49:53,245 --> 00:49:55,245
Tor has limitations -
762
00:49:55,245 --> 00:49:58,845
it's slow, content isn't
automatically encrypted
763
00:49:58,845 --> 00:50:00,565
on exiting the network
764
00:50:00,565 --> 00:50:04,165
and some have claimed
a bug can de-anonymise users.
765
00:50:04,165 --> 00:50:06,445
Tor say they have
a fix for this problem.
766
00:50:07,565 --> 00:50:11,405
Whilst the search for a solution
to bulk surveillance continues,
767
00:50:11,405 --> 00:50:13,445
this man has a different approach.
768
00:50:15,885 --> 00:50:19,445
He is Eugene Kaspersky,
CEO of one of the world's
769
00:50:19,445 --> 00:50:23,085
fastest-growing
internet security companies.
770
00:50:23,085 --> 00:50:26,365
300 million users
now rely on its software.
771
00:50:27,445 --> 00:50:31,925
For Kaspersky, widespread
anonymity is not a solution.
772
00:50:31,925 --> 00:50:35,605
In fact, he thinks
we need the absolute opposite,
773
00:50:35,605 --> 00:50:37,605
a form of online passport.
774
00:50:40,245 --> 00:50:45,125
My idea is that all the services
on the internet must be split.
775
00:50:47,365 --> 00:50:50,605
So, the non-critical -
your personal e-mails,
776
00:50:50,605 --> 00:50:52,645
the news from the internet,
777
00:50:52,645 --> 00:50:55,125
your chatting with your family,
778
00:50:55,125 --> 00:51:00,565
what else? - leave them alone,
so you don't need any ID.
779
00:51:00,565 --> 00:51:02,405
It's a place of freedom.
780
00:51:02,405 --> 00:51:06,525
And there are critical services,
like banking services,
781
00:51:06,525 --> 00:51:09,685
financial services, banks,
booking their tickets,
782
00:51:09,685 --> 00:51:12,365
booking the hotels or what else?
783
00:51:12,365 --> 00:51:15,605
So please present your ID
if you do this.
784
00:51:18,285 --> 00:51:20,885
So, it's a kind of balance -
785
00:51:20,885 --> 00:51:22,885
freedom and security,
786
00:51:22,885 --> 00:51:26,405
anonymity and wearing the badge,
wearing your ID.
787
00:51:26,405 --> 00:51:29,165
In his view, this shouldn't be
a problem,
788
00:51:29,165 --> 00:51:32,445
as privacy has pretty much
disappeared online anyway.
789
00:51:33,765 --> 00:51:38,205
The reality is that we don't have
too much privacy in the cyber space.
790
00:51:38,205 --> 00:51:42,005
If you want to travel, if you want
to pay with your credit card,
791
00:51:42,005 --> 00:51:45,205
if you want to access internet,
forget about privacy.
792
00:51:49,525 --> 00:51:51,965
Yet, the idea that we could soon
inhabit a world
793
00:51:51,965 --> 00:51:55,725
where our lives are ever more
transparent is very unpopular.
794
00:51:57,445 --> 00:52:01,845
Some people say,
"Privacy is over, get over it",
795
00:52:01,845 --> 00:52:05,045
because basically we're
trending towards the idea
796
00:52:05,045 --> 00:52:07,885
of just completely
transparent lives.
797
00:52:07,885 --> 00:52:12,685
I think that's nonsense because
information boundaries are important
798
00:52:12,685 --> 00:52:16,125
so that means that we've got to
have systems which respect them
799
00:52:16,125 --> 00:52:18,605
so we've got to have
the technology
800
00:52:18,605 --> 00:52:20,125
which can produce privacy.
801
00:52:26,125 --> 00:52:29,325
For those of us who might wish
to resist surveillance,
802
00:52:29,325 --> 00:52:32,205
the hunt for that
technology is now on
803
00:52:32,205 --> 00:52:36,045
and it is to cryptographers
that we must look for answers.
804
00:52:37,685 --> 00:52:41,605
If you want a demonstration of their
importance to today's internet,
805
00:52:41,605 --> 00:52:45,405
you only have to come to
this bunker in Virginia.
806
00:52:45,405 --> 00:52:49,245
Today, an unusual ceremony
is taking place.
807
00:52:52,685 --> 00:52:55,045
COMPUTER: 'Please centre
your eyes in the mirror.'
808
00:52:55,045 --> 00:52:56,765
The internet is being upgraded.
809
00:52:56,765 --> 00:52:59,285
'Thank you, your identity
has been verified.'
810
00:53:00,405 --> 00:53:01,445
Right, we're in.
811
00:53:01,445 --> 00:53:04,405
This is the biggest security
upgrade to the internet
812
00:53:04,405 --> 00:53:06,725
in over 20 years.
813
00:53:06,725 --> 00:53:11,005
A group of tech specialists have
been summoned by Steve Crocker,
814
00:53:11,005 --> 00:53:14,845
one of the godfathers
of the internet.
815
00:53:14,845 --> 00:53:18,005
We discovered that there
were some vulnerabilities
816
00:53:18,005 --> 00:53:20,485
in the basic
domain name system structure
817
00:53:20,485 --> 00:53:24,965
that can lead to having
a domain name hijacked
818
00:53:24,965 --> 00:53:28,565
or having someone directed
to a false site and, from there,
819
00:53:28,565 --> 00:53:32,325
passwords can be detected
and accounts can be cleaned out.
820
00:53:33,685 --> 00:53:36,965
At the heart of this upgrade
is complex cryptography.
821
00:53:39,125 --> 00:53:41,925
Work began on adding
cryptographically strong signatures
822
00:53:41,925 --> 00:53:45,525
to every entry
in the domain name system
823
00:53:45,525 --> 00:53:50,885
in order to make it impossible to
spoof or plant false information.
824
00:53:55,005 --> 00:53:57,885
To make sure these cryptographic
codes remain secure,
825
00:53:57,885 --> 00:54:00,325
they are reset every three months.
826
00:54:02,405 --> 00:54:06,005
Three trusted experts from
around the world have been summoned
827
00:54:06,005 --> 00:54:10,485
with three keys, keys that
open these safety deposit boxes.
828
00:54:10,485 --> 00:54:13,565
So, now we are opening box 1-2-4-0.
829
00:54:13,565 --> 00:54:15,765
Inside are smart cards.
830
00:54:15,765 --> 00:54:17,805
When the three are put together,
831
00:54:17,805 --> 00:54:21,885
a master code can be validated
which resets the system
832
00:54:21,885 --> 00:54:24,085
for millions of websites.
833
00:54:24,085 --> 00:54:26,325
We are very lucky because
every one of these people
834
00:54:26,325 --> 00:54:29,205
is a respected member of
the technical community,
835
00:54:29,205 --> 00:54:30,805
the technical internet community,
836
00:54:30,805 --> 00:54:32,405
and that's what we were doing.
837
00:54:32,405 --> 00:54:35,245
Just like the internet itself
was formed from the bottom up,
838
00:54:35,245 --> 00:54:37,085
high techies from around the world.
839
00:54:37,085 --> 00:54:38,925
So, step 22.
840
00:54:40,325 --> 00:54:42,805
A complex set of instructions
is followed
841
00:54:42,805 --> 00:54:46,245
to make sure the system
is ready to operate.
842
00:54:46,245 --> 00:54:49,485
And now we need to verify the KSR.
843
00:54:49,485 --> 00:54:51,125
This is the key step.
844
00:54:53,165 --> 00:54:56,965
So, this is it. When I hit "yes",
it will be signed.
845
00:54:58,925 --> 00:55:01,045
There you go, thank you very much.
846
00:55:01,045 --> 00:55:02,525
APPLAUSE
847
00:55:06,165 --> 00:55:09,405
These scientists want
to use cryptography to make
848
00:55:09,405 --> 00:55:13,365
the architecture of the
internet more resilient.
849
00:55:13,365 --> 00:55:17,525
But, for users,
cryptography has another purpose.
850
00:55:17,525 --> 00:55:20,885
We can use it to
encrypt our message content.
851
00:55:20,885 --> 00:55:23,805
So, if we want to change the way
that mass surveillance is done,
852
00:55:23,805 --> 00:55:25,725
encryption, it turns out,
853
00:55:25,725 --> 00:55:28,085
is one of the ways that we do that.
854
00:55:28,085 --> 00:55:29,805
When we encrypt our data,
855
00:55:29,805 --> 00:55:32,645
we change the value
that mass surveillance presents.
856
00:55:32,645 --> 00:55:35,165
When phone calls are
end-to-end encrypted
857
00:55:35,165 --> 00:55:38,125
such that no-one else
can decipher their content,
858
00:55:38,125 --> 00:55:41,285
thanks to the science
of mathematics, of cryptography.
859
00:55:43,125 --> 00:55:46,245
And those who understand
surveillance from the inside
860
00:55:46,245 --> 00:55:49,605
agree that it's the only way
to protect our communications.
861
00:56:21,365 --> 00:56:23,245
However, there's a problem.
862
00:56:23,245 --> 00:56:27,125
It's still very difficult to encrypt
content in a user-friendly way.
863
00:56:28,645 --> 00:56:32,085
One of the best ways to protect
your privacy is to use encryption,
864
00:56:32,085 --> 00:56:34,925
but encryption is
so incredibly hard to use.
865
00:56:34,925 --> 00:56:37,365
I am a technology person
866
00:56:37,365 --> 00:56:41,445
and I struggle all the time to
get my encryption products to work.
867
00:56:41,445 --> 00:56:44,045
And that's the next challenge,
868
00:56:44,045 --> 00:56:47,445
developing an internet
where usable encryption
869
00:56:47,445 --> 00:56:48,925
makes bulk surveillance,
870
00:56:48,925 --> 00:56:51,765
for those who want to avoid it,
more difficult.
871
00:56:53,165 --> 00:56:56,765
At the moment, the systems
we're using are fairly insecure
872
00:56:56,765 --> 00:56:59,525
in lots of ways. I think
the programmers out there
873
00:56:59,525 --> 00:57:02,125
have got to help build tools
to make that easier.
874
00:57:04,765 --> 00:57:07,365
If encryption is to become
more commonplace,
875
00:57:07,365 --> 00:57:10,485
the pressure will likely
come from the market.
876
00:57:11,565 --> 00:57:15,365
The result of the revelations
about the National Security Agency
877
00:57:15,365 --> 00:57:18,005
is that people are becoming
quite paranoid.
878
00:57:18,005 --> 00:57:21,205
It's important to not be
paralysed by that paranoia,
879
00:57:21,205 --> 00:57:25,125
but rather to express
the desire for privacy
880
00:57:25,125 --> 00:57:28,325
and by expressing
the desire for privacy,
881
00:57:28,325 --> 00:57:30,365
the market will fill the demand.
882
00:57:30,365 --> 00:57:32,605
By making it harder,
you protect yourself,
883
00:57:32,605 --> 00:57:35,885
you protect your family and you
also protect other people
884
00:57:35,885 --> 00:57:39,525
by making it more expensive
to surveil everyone all the time.
885
00:57:39,525 --> 00:57:43,485
In the end, encryption
is all about mathematics
886
00:57:43,485 --> 00:57:45,605
and, for those
who want more privacy,
887
00:57:45,605 --> 00:57:47,685
the numbers work in their favour.
888
00:57:48,765 --> 00:57:51,925
It turns out that it's
easier in this universe
889
00:57:51,925 --> 00:57:55,965
to encrypt information,
much easier, than it is to decrypt it
890
00:57:55,965 --> 00:57:59,405
if you're someone
watching from the outside.
891
00:57:59,405 --> 00:58:02,085
The universe
fundamentally favours privacy.
892
00:58:04,245 --> 00:58:06,005
We have reached a critical moment
893
00:58:06,005 --> 00:58:08,165
when the limits of privacy
894
00:58:08,165 --> 00:58:10,885
could be defined for a generation.
895
00:58:10,885 --> 00:58:14,045
We are beginning to grasp
the shape of this new world.
896
00:58:15,685 --> 00:58:18,925
It's time to decide
whether we are happy to accept it.
897
00:58:45,125 --> 00:58:47,885
'Thank you. Your identity
has been verified.'