1
00:00:00,480 --> 00:00:07,280
One of the biggest threats to your security, privacy and anonymity online is, potentially and unfortunately,
2
00:00:07,310 --> 00:00:07,970
your own
3
00:00:07,970 --> 00:00:12,190
government, and this threat is coming from a number of fronts.
4
00:00:12,200 --> 00:00:21,000
The two main fronts: one is the push to weaken and regulate encryption, and the other is to legalize, or
5
00:00:21,000 --> 00:00:26,590
I guess legitimize, mass surveillance and spying on citizens.
6
00:00:26,640 --> 00:00:36,270
Many countries are talking about implementing policies that limit encryption and policies to legitimize
7
00:00:36,270 --> 00:00:37,950
and legalize spying.
8
00:00:37,980 --> 00:00:41,420
That has been going on illegally for years anyway.
9
00:00:41,550 --> 00:00:48,750
And in fact, by the time you listen to this, because it's moving so fast, in your particular country things
10
00:00:48,750 --> 00:00:57,210
may have changed, and it may now be more legal to spy on you, and encryption may be regulated to a further
11
00:00:57,210 --> 00:01:07,140
extent. This regulation and mandating of insecurity and legalized spying is going on in many places: the United
12
00:01:07,140 --> 00:01:13,450
States, the U.K., China, Russia, Brazil, India, etc.
13
00:01:13,620 --> 00:01:20,540
The U.K. has got the Data Communications Bill, which includes recording 12 months of Internet history and
14
00:01:20,610 --> 00:01:22,820
other Orwellian measures.
15
00:01:23,030 --> 00:01:27,600
There are other examples that show the tide of change in other countries.
16
00:01:27,630 --> 00:01:33,980
You have WhatsApp being banned for 48 hours in Brazil by the government because of its encryption.
17
00:01:33,990 --> 00:01:42,240
India has some very strong ideas on limiting encryption, and Kazakhstan is legally requiring backdoors.
18
00:01:42,720 --> 00:01:45,420
Encryption is fundamentally mathematics.
19
00:01:45,440 --> 00:01:47,550
It cannot be banned.
20
00:01:47,610 --> 00:01:51,470
The horse has already left the stable; it's been created.
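To make the point that encryption is just mathematics concrete, here is a minimal illustrative sketch in Python (my own toy example, not something from the course): a one-time pad is nothing more than XORing message bytes with random key bytes, a few lines of code anyone can write from memory, so there is no single artifact for a ban to remove.

```python
import secrets

def otp(data: bytes, key: bytes) -> bytes:
    """One-time pad: XOR each byte of data with a key byte.
    With a truly random, never-reused key of equal length,
    the ciphertext is information-theoretically unbreakable."""
    assert len(key) == len(data)
    return bytes(d ^ k for d, k in zip(data, key))

message = b"attack at dawn"
key = secrets.token_bytes(len(message))  # random key, used once
ciphertext = otp(message, key)
# XOR is its own inverse, so decryption is the same operation.
assert otp(ciphertext, key) == message
```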
21
00:01:51,520 --> 00:02:00,770
It cannot be weakened just for a terrorist or a criminal or for someone who you want to have weak encryption.
22
00:02:00,930 --> 00:02:07,560
Those people will just use the strong encryption that's already out there and everyone else will be
23
00:02:07,560 --> 00:02:13,800
stuck with weakened security and weakened encryption because they're forced to use the weakened crypto.
24
00:02:13,980 --> 00:02:21,840
If it is weakened or backdoored, it is weakened for everyone, including the hackers trying to compromise
25
00:02:21,840 --> 00:02:30,210
our systems. Something like this was actually tried already, during the Crypto Wars of the 1990s: something
26
00:02:30,210 --> 00:02:36,600
called the Clipper Chip was proposed by the U.S. government, and a flaw was found in it.
27
00:02:36,690 --> 00:02:42,180
And luckily there was no widespread adoption, because this chip was going to be put into, you know, every
28
00:02:42,900 --> 00:02:45,660
electronic device that was going to do encryption.
29
00:02:45,660 --> 00:02:51,370
So the government could bypass encryption and look at what it is you were doing.
30
00:02:51,450 --> 00:02:56,700
So if that had happened that would have been a complete disaster because of the vulnerability that was
31
00:02:56,700 --> 00:02:57,230
found.
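For context, the vulnerability, published in 1994 by Matt Blaze (the same Matt Blaze whose testimony is played later in this video), was that the Clipper Chip's Law Enforcement Access Field (LEAF) was validated by only a 16-bit checksum, so a user could brute-force a bogus LEAF that passed validation while giving the government nothing. Here is a toy sketch of why 16 bits is hopeless; the checksum function is a stand-in of my own, not the chip's real, classified construction.

```python
import hashlib
import secrets

def toy_checksum16(leaf: bytes) -> int:
    # Stand-in 16-bit check; the real LEAF scheme was classified.
    return int.from_bytes(hashlib.sha256(leaf).digest()[:2], "big")

target = toy_checksum16(b"legitimate LEAF carrying the escrowed key")

# A 16-bit checksum means ~2**16 (65,536) random tries on average.
tries = 0
while True:
    tries += 1
    bogus = secrets.token_bytes(16)  # random bytes instead of the real key
    if toy_checksum16(bogus) == target:
        break
print(f"forged a valid-looking LEAF after {tries} tries")
```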
32
00:02:57,270 --> 00:02:58,320
This is the problem.
33
00:02:58,470 --> 00:03:02,440
If you weaken encryption, you weaken it for everybody.
34
00:03:02,460 --> 00:03:08,910
Terrorists and criminals will continue to use strong encryption even if normal citizens are banned.
35
00:03:08,910 --> 00:03:13,570
There is no evidence that weakening encryption will help at all.
36
00:03:13,650 --> 00:03:21,670
And to add to all that we have no feasible technical way of achieving this.
37
00:03:21,810 --> 00:03:28,670
Unfortunately, all of this stuff is perhaps too complicated to really grasp for
38
00:03:28,680 --> 00:03:34,440
those people that make decisions on these things. Or maybe they do understand it, but because of political
39
00:03:34,480 --> 00:03:36,890
agendas they're still pushing forward.
40
00:03:37,020 --> 00:03:44,930
This is Matt Blaze speaking to a U.S. congressional committee on the feasibility of these plans.
41
00:03:45,240 --> 00:03:48,510
And it's definitely worth a watch.
42
00:03:48,540 --> 00:03:50,590
So I'm going to play it now.
43
00:03:50,880 --> 00:03:52,860
It's only five minutes long.
44
00:03:52,950 --> 00:03:53,600
Dr. Blaze.
45
00:03:53,620 --> 00:03:54,980
Five minutes.
46
00:03:55,500 --> 00:03:57,130
Thank you Mr. Chairman.
47
00:03:57,870 --> 00:04:06,180
As a technologist, I'm finding myself in the very curious position of participating in a debate over the desirability
48
00:04:06,180 --> 00:04:13,920
of something that sounds wonderful, which is security systems that can be bypassed by the good guys but
49
00:04:13,920 --> 00:04:20,540
that also reliably keep the bad guys out. And we could certainly discuss that.
50
00:04:20,540 --> 00:04:30,600
But as a technologist I can't ignore this stark reality which is simply that it can't be done safely.
51
00:04:30,810 --> 00:04:38,850
And if we make wishful policies that assume and pretend that we can there will be terrible consequences
52
00:04:38,850 --> 00:04:42,070
for our economy and for our national security.
53
00:04:43,490 --> 00:04:49,970
So it would be difficult to overstate today the importance of robust reliable computing and communications
54
00:04:49,970 --> 00:04:53,990
to our personal, commercial and national security.
55
00:04:54,020 --> 00:05:00,980
Modern computing and network technologies are obviously yielding great benefits to our society and we
56
00:05:01,040 --> 00:05:07,340
are depending on them to be reliable and trustworthy in the same way that we depend on power and water
57
00:05:07,340 --> 00:05:10,170
and the rest of our critical infrastructure today.
58
00:05:10,610 --> 00:05:17,330
But unfortunately, the software-based systems that are the foundation on which all of this modern communications
59
00:05:17,400 --> 00:05:26,120
technology is based are also notoriously vulnerable to attack by criminals and by hostile nation states.
60
00:05:26,780 --> 00:05:35,000
Large-scale data breaches, of course, are literally a daily occurrence, and this problem is getting
61
00:05:35,000 --> 00:05:41,750
worse rather than better as we build larger and more complex systems. And it's really not an exaggeration
62
00:05:41,750 --> 00:05:49,560
to characterize the state of software security as an emerging national crisis.
63
00:05:49,910 --> 00:05:58,610
And the sad truth behind this is that computer science, my field, simply does not know how to build complex
64
00:05:58,610 --> 00:06:03,220
large scale software that has reliably correct behavior.
65
00:06:03,470 --> 00:06:08,930
And this is not a new problem; it has nothing to do with encryption or modern technology.
66
00:06:08,930 --> 00:06:15,080
It's been the central focus of computing research since the dawn of the programmable computer.
67
00:06:15,320 --> 00:06:21,050
And as new technology allows us to build larger and more complex systems the problem of ensuring their
68
00:06:21,050 --> 00:06:26,660
reliability becomes actually exponentially harder with more and more components interacting with each
69
00:06:26,660 --> 00:06:27,470
other.
70
00:06:27,470 --> 00:06:34,370
So as we integrate insecure, vulnerable systems into the fabric of our economy, the consequences of those
71
00:06:34,370 --> 00:06:40,780
systems failing become both more likely and increasingly serious.
72
00:06:42,210 --> 00:06:47,910
Unfortunately, there is no magic bullet for securing software-based systems.
73
00:06:47,910 --> 00:06:55,930
Large systems are fundamentally risky, and this is something that we can at best manage rather than
74
00:06:56,040 --> 00:06:58,170
fix outright.
75
00:06:58,350 --> 00:07:06,090
There are really only two known ways to manage the risk of unreliable and insecure software.
76
00:07:06,090 --> 00:07:15,300
One is the use of encryption, which allows us to process sensitive data over insecure media and insecure
77
00:07:15,810 --> 00:07:18,740
software systems to the extent that we can.
78
00:07:18,750 --> 00:07:26,040
And the other is to design our software systems to be as small and as simple as we possibly can to minimize
79
00:07:26,040 --> 00:07:32,080
the number of features that a malicious attacker might be able to find flaws to exploit.
80
00:07:32,830 --> 00:07:39,700
And this is why proposals for law enforcement access features frighten me so much. Cryptographic systems
81
00:07:39,700 --> 00:07:44,930
are among the most fragile and subtle elements of modern software.
82
00:07:44,940 --> 00:07:51,270
We often discover devastating weaknesses in even very simple cryptographic systems years after they're
83
00:07:51,280 --> 00:07:57,910
designed and fielded. What third-party access requirements do is take even very simple problems that
84
00:07:57,940 --> 00:08:03,940
we don't really know how to solve and turn them into far more complex problems that we really have no
85
00:08:03,940 --> 00:08:06,640
chance of reliably solving.
86
00:08:06,700 --> 00:08:14,530
So backdoor cryptography of the kind advocated by the FBI, you know, might solve some problems if we
87
00:08:14,530 --> 00:08:15,010
could do it.
88
00:08:15,010 --> 00:08:21,820
But it's a notoriously difficult and well-known problem. We've found subtle flaws even in systems designed
89
00:08:21,820 --> 00:08:25,320
by the National Security Agency, such as the Clipper Chip,
90
00:08:25,420 --> 00:08:31,210
two decades ago. And even if we could get the cryptography right, we'd be left with the problem of integrating
91
00:08:31,210 --> 00:08:37,840
access features into the software design, and requiring designers to design around third-party access
92
00:08:37,840 --> 00:08:44,740
requirements will basically undermine our already tenuous ability to defend against attack.
93
00:08:44,760 --> 00:08:48,670
It's tempting to frame this debate as being between personal privacy and law enforcement.
94
00:08:48,670 --> 00:08:51,180
But in fact the stakes are higher than that.
95
00:08:51,400 --> 00:08:58,370
We just can't do what the FBI is asking without seriously weakening our infrastructure.
96
00:08:58,540 --> 00:09:03,510
The ultimate beneficiaries will be criminals and rival nation states.
97
00:09:03,520 --> 00:09:12,010
Congress faces a crucial choice here: to effectively legislate mandatory insecurity in our critical infrastructure,
98
00:09:12,490 --> 00:09:19,720
or recognize the critical importance of robust security in preventing crime in our increasingly connected
99
00:09:19,720 --> 00:09:20,200
world.
100
00:09:20,350 --> 00:09:21,210
Thank you very much.
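Blaze's first risk-management technique, using encryption to move sensitive data across insecure media, is easy to demonstrate. Here is a minimal sketch, assuming Python with the third-party cryptography package installed (my illustration, not anything from the testimony): the payload is encrypted and authenticated before it ever touches the untrusted channel, so the channel itself never has to be trusted.

```python
# pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # shared with the recipient out of band
f = Fernet(key)

# Encrypt-then-send: the insecure medium only ever sees ciphertext,
# and any tampering makes decryption fail rather than yield altered data.
token = f.encrypt(b"sensitive data crossing an untrusted network")
assert f.decrypt(token) == b"sensitive data crossing an untrusted network"
```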
101
00:09:22,500 --> 00:09:24,650
This here is a good paper to read.
102
00:09:24,930 --> 00:09:30,900
And it's from a number of top crypto experts, including some of the people that actually designed the crypto
103
00:09:31,260 --> 00:09:37,500
that we're going to talk about in the course, on why mandating insecurity is a bad idea.
104
00:09:37,590 --> 00:09:39,300
So if you want to dig deeper into that,
105
00:09:39,480 --> 00:09:45,030
consider the reading your homework. Another quick document that you can read is 'The Case Against Regulating
106
00:09:45,120 --> 00:09:46,390
Encryption Technology.'
107
00:09:46,410 --> 00:09:47,580
Only a couple of pages.
108
00:09:47,730 --> 00:09:49,640
And to give you more of a background.
109
00:09:49,870 --> 00:09:55,030
And this is a good read: this is the 'Nine Epic Failures of Regulating Encryption.'
110
00:09:55,030 --> 00:10:01,480
There is a great report, by Bruce Schneier, to give you an idea of the number of crypto products out there.
111
00:10:01,620 --> 00:10:09,310
This is 'A Worldwide Survey of Encryption Products.' If you google it, there's the PDF version, which is here.
112
00:10:09,370 --> 00:10:16,080
There's also an Excel version. The Excel version is pretty cool because you can sort by the type, so you can look
113
00:10:16,080 --> 00:10:21,060
at all the different crypto products, and maybe when we get to those sections and those products you
114
00:10:21,060 --> 00:10:29,430
can see all the ones that are out there. It finds 865 hardware and software products incorporating encryption
115
00:10:29,760 --> 00:10:36,390
from 55 different countries. So obviously, if you have a law in one country, it doesn't affect all the
116
00:10:36,390 --> 00:10:42,540
other countries, and, you know, people will just use the crypto from whichever country they choose.
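If you do grab the Excel version of the survey, sorting it by product type takes a couple of lines. Here is a hedged sketch with pandas, where the file name and the 'Type' and 'Country' column names are my guesses at the spreadsheet's layout rather than confirmed headers:

```python
# pip install pandas openpyxl
import pandas as pd

# File and column names are assumptions about the survey spreadsheet.
df = pd.read_excel("encryption_products_survey.xlsx")
print(df.sort_values("Type").head())        # browse products grouped by type
print(df["Country"].value_counts().head())  # products per country of origin
```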
117
00:10:42,540 --> 00:10:47,270
Now let's move on to the legalization of spying and mass surveillance.
118
00:10:47,270 --> 00:10:55,010
I think we can start with some quotes from Edward Snowden. So we have here: arguing that you don't
119
00:10:55,010 --> 00:10:58,730
care about the right to privacy because you have nothing to hide
120
00:10:58,940 --> 00:11:04,290
is no different than saying you don't care about free speech because you have nothing to say.
121
00:11:04,340 --> 00:11:08,700
He continues: people who use the 'I have nothing to hide' line
122
00:11:08,750 --> 00:11:12,050
don't understand the basic foundation of human rights.
123
00:11:12,050 --> 00:11:15,990
Nobody needs to justify why they need a right.
124
00:11:16,010 --> 00:11:21,890
The burden of justification falls on the ones seeking to infringe upon the right.
125
00:11:21,920 --> 00:11:27,980
If one person chooses to disregard his right to privacy that doesn't automatically mean everyone should
126
00:11:27,980 --> 00:11:28,810
follow suit, either.
127
00:11:28,850 --> 00:11:34,440
You can't give away the rights of others because they're not useful to you.
128
00:11:34,460 --> 00:11:41,450
More simply, the majority cannot vote away the natural rights of the minority.
129
00:11:41,450 --> 00:11:42,320
My view is this.
130
00:11:42,320 --> 00:11:46,720
When people know they are being watched, that they are being spied on,
131
00:11:46,760 --> 00:11:48,740
they alter what they do.
132
00:11:48,740 --> 00:11:50,870
They are no longer free.
133
00:11:50,930 --> 00:11:58,120
Terrorists want us to lose our freedom. By creating
134
00:11:58,210 --> 00:11:59,990
mass surveillance infrastructure to prevent terrorism,
135
00:11:59,990 --> 00:12:03,830
we lose the very freedom we are trying to protect.
136
00:12:03,980 --> 00:12:10,100
But the counter-argument to this is that we will be more secure with mass surveillance, that we will be more
137
00:12:10,100 --> 00:12:12,330
safe with mass surveillance.
138
00:12:12,390 --> 00:12:15,310
But the evidence to support that is thin.
139
00:12:15,500 --> 00:12:22,360
The former head of the NSA's global intelligence gathering operations, a guy called Bill Binney,
140
00:12:22,400 --> 00:12:28,430
he says that mass surveillance interferes with the government's ability to catch bad guys, and that the
141
00:12:28,430 --> 00:12:31,400
government failed in terms of 9/11,
142
00:12:31,400 --> 00:12:38,540
the Boston bombing, the Texas shooting and other terrorist attacks because it was overwhelmed with data
143
00:12:38,630 --> 00:12:40,440
from mass surveillance.
144
00:12:40,460 --> 00:12:47,150
For me, the issue of mass surveillance is about giving away too much power to a government.
145
00:12:47,150 --> 00:12:55,910
Key questions to consider and ask: can you trust all the people, government offices, agencies, companies
146
00:12:55,970 --> 00:13:02,800
and contractors with your personal and private data gathered through this mass surveillance?
147
00:13:02,810 --> 00:13:09,230
Can you trust that they will always have your best interests at heart and that they will act justly
148
00:13:09,320 --> 00:13:10,100
with this power,
149
00:13:10,100 --> 00:13:18,230
this new power that they will have? And not just now, but in the future, and with your children, because
150
00:13:18,230 --> 00:13:21,650
your children will inherit a watched world.
151
00:13:21,650 --> 00:13:29,570
This data will be kept, and any slight deviation from what is considered acceptable could be used against
152
00:13:29,570 --> 00:13:33,380
you if you oppose those in power.
153
00:13:33,470 --> 00:13:35,700
Consider the civil rights movement.
154
00:13:35,810 --> 00:13:42,470
If mass surveillance had been going on during that political movement, how would it have affected political change
155
00:13:42,470 --> 00:13:45,560
for black people in the United States?
156
00:13:45,560 --> 00:13:51,530
Would civil rights have happened much more slowly, or more violently, because of the mass surveillance?
157
00:13:51,530 --> 00:13:58,490
Or would the movement have been crushed completely, so those rights wouldn't even exist now, if we had had mass surveillance?
158
00:13:58,760 --> 00:13:59,820
Things to consider.
159
00:14:01,100 --> 00:14:06,210
Also consider donating to some of these privacy causes
160
00:14:06,230 --> 00:14:12,770
if privacy is something that you are particularly interested in and passionate about. Regulating encryption,
161
00:14:13,190 --> 00:14:22,010
mandating insecurity and legalizing spying is unfortunately an active threat that's potentially on your
162
00:14:22,010 --> 00:14:24,590
threat landscape that you need to be aware of.
163
00:14:24,620 --> 00:14:33,620
If your ability to use encryption is reduced, then your security will be reduced as well, and you will
164
00:14:33,620 --> 00:14:36,500
need all the mitigating controls you can get.