1
00:00:16,059 --> 00:00:23,069
♪
2
00:00:23,110 --> 00:00:25,810
Narrator: The U.S. government
suffers the most invasive
3
00:00:25,851 --> 00:00:29,121
cyberattack in its history.
4
00:00:29,159 --> 00:00:31,029
Nikolas Badminton: The hackers
broke into email accounts
5
00:00:31,074 --> 00:00:34,384
affiliated with the head of the
Department of Homeland Security.
6
00:00:34,425 --> 00:00:35,985
Ramona Pringle:
You would think they'd have
7
00:00:36,036 --> 00:00:37,906
all sorts of
safeguards in place.
8
00:00:37,950 --> 00:00:40,780
Narrator: Job hunters across
the world are facing
9
00:00:40,823 --> 00:00:43,653
stressful interviews conducted
not by a person,
10
00:00:43,695 --> 00:00:45,915
but by an algorithm
that scrutinises
11
00:00:45,958 --> 00:00:48,308
their every word and gesture.
12
00:00:48,352 --> 00:00:50,572
K. Alexander: It analyzes their
facial expressions;
13
00:00:50,615 --> 00:00:53,485
how much eye contact they make,
and even their tone of voice.
14
00:00:53,531 --> 00:00:56,931
M. Aitken: They're essentially
trying to impress a machine.
15
00:00:56,969 --> 00:00:59,889
Narrator: In India, a research
student is using new technology
16
00:00:59,929 --> 00:01:02,239
to help deaf people
communicate in a way
17
00:01:02,279 --> 00:01:04,799
that they have never
been able to do before.
18
00:01:04,847 --> 00:01:06,937
Aitken: She wants to invent an
AI program that can translate
19
00:01:06,979 --> 00:01:09,549
visual 'sign language'
into English text.
20
00:01:09,591 --> 00:01:10,981
Anthony Morgan: And it isn't
just limited to people
21
00:01:11,027 --> 00:01:13,157
with hearing conditions.
AI can help people
22
00:01:13,203 --> 00:01:20,693
who are blind or have
other disabilities.
23
00:01:20,732 --> 00:01:22,342
Narrator: These are the
stories of the future
24
00:01:22,386 --> 00:01:26,216
that big data is bringing
to our doorsteps.
25
00:01:26,260 --> 00:01:30,350
The real world impact of
predictions and surveillance.
26
00:01:30,394 --> 00:01:32,614
The power of artificial
intelligence
27
00:01:32,657 --> 00:01:34,917
and autonomous machines.
28
00:01:34,964 --> 00:01:36,444
For better or for worse,
29
00:01:36,487 --> 00:01:46,497
these are the
Secrets of Big Data.
30
00:01:46,541 --> 00:01:49,761
Narrator: In late 2020, an
employee at the Silicon Valley
31
00:01:49,805 --> 00:01:52,545
headquarters of FireEye
in California,
32
00:01:52,590 --> 00:01:56,030
one of the most respected and
successful cybersecurity firms
33
00:01:56,072 --> 00:01:59,682
in the United States, is doing
a routine systems check
34
00:01:59,728 --> 00:02:03,508
when she notices something
out of the ordinary.
35
00:02:03,558 --> 00:02:05,078
Alexander: One of their
employees seems to have
36
00:02:05,125 --> 00:02:07,425
two phones registered to
his network.
37
00:02:07,475 --> 00:02:10,775
Narrator: While this may appear
insignificant to an outsider,
38
00:02:10,826 --> 00:02:13,476
FireEye's clientele includes
some of the world's
39
00:02:13,524 --> 00:02:17,014
biggest companies and top-level
government institutions,
40
00:02:17,049 --> 00:02:20,579
which makes them a constant
target for cyber-espionage.
41
00:02:20,618 --> 00:02:22,358
Badminton: Because of their
high profile,
42
00:02:22,403 --> 00:02:24,803
FireEye is always under
threat of attack.
43
00:02:24,840 --> 00:02:26,320
That's why anything unusual
in their systems
44
00:02:26,363 --> 00:02:28,413
is cause for concern.
45
00:02:28,452 --> 00:02:30,802
Narrator: The employee that
registered the second phone
46
00:02:30,846 --> 00:02:33,146
is contacted and tells the
security team
47
00:02:33,196 --> 00:02:35,806
that he has no idea why
there is another number
48
00:02:35,851 --> 00:02:37,721
attached to his network.
49
00:02:37,766 --> 00:02:39,506
Pringle: Alarm bells
start to go off.
50
00:02:39,550 --> 00:02:41,990
FireEye can only conclude that
they've been compromised
51
00:02:42,031 --> 00:02:44,771
and somebody has accessed
their systems.
52
00:02:44,816 --> 00:02:46,986
Narrator: The company
immediately launches
53
00:02:47,036 --> 00:02:48,776
an investigation.
54
00:02:48,820 --> 00:02:51,950
After several weeks of analysis,
FireEye discovers
55
00:02:51,997 --> 00:02:54,477
that not only did someone breach
their network,
56
00:02:54,522 --> 00:02:57,182
but they also stole hacking
applications
57
00:02:57,220 --> 00:02:59,350
that the company employs to
assess the safety
58
00:02:59,396 --> 00:03:02,176
of its own clients' networks.
59
00:03:02,225 --> 00:03:04,835
Alexander: This is very bad.
The tools that they stole could
60
00:03:04,880 --> 00:03:08,840
be used to stage sophisticated
new attacks around the world.
61
00:03:08,884 --> 00:03:11,154
Narrator: FireEye is able to
trace the intrusion back
62
00:03:11,191 --> 00:03:13,541
to something seemingly harmless,
63
00:03:13,584 --> 00:03:17,634
a routine software update from a
company called SolarWinds,
64
00:03:17,675 --> 00:03:20,025
a leading provider of system
management tools
65
00:03:20,069 --> 00:03:23,589
for network and infrastructure
monitoring.
66
00:03:23,638 --> 00:03:25,598
Badminton: SolarWinds is a
major player in the space,
67
00:03:25,640 --> 00:03:28,600
with hundreds of thousands
of customers around the world.
68
00:03:28,643 --> 00:03:31,253
Narrator: The software,
called Orion,
69
00:03:31,298 --> 00:03:33,468
is a popular network
management system.
70
00:03:33,517 --> 00:03:36,737
To update it, users were
prompted to log into
71
00:03:36,781 --> 00:03:39,001
the SolarWinds' development
website,
72
00:03:39,044 --> 00:03:41,874
enter their password and then
the new software would be
73
00:03:41,917 --> 00:03:46,047
automatically integrated
into their servers.
74
00:03:46,095 --> 00:03:48,445
Alexander: On the surface,
it's a pretty standard update.
75
00:03:48,489 --> 00:03:51,879
Some bug fixes, performance
improvements and such.
76
00:03:51,927 --> 00:03:56,017
Narrator: But below the surface,
it's anything but standard.
77
00:03:56,061 --> 00:03:59,111
Someone has managed to
insert a malicious code
78
00:03:59,151 --> 00:04:01,421
into the Orion software update,
79
00:04:01,458 --> 00:04:04,418
and unaware of this, some 18,000
80
00:04:04,461 --> 00:04:08,251
SolarWinds customers downloaded
the tainted product.
81
00:04:08,291 --> 00:04:11,731
Once the update was completed,
the perpetrators were able
82
00:04:11,773 --> 00:04:14,563
to gain access to other
companies and organisations
83
00:04:14,602 --> 00:04:18,912
that these customers used and
even worked for.
84
00:04:18,954 --> 00:04:21,134
Including tech giants Intel,
85
00:04:21,173 --> 00:04:25,053
Cisco and Microsoft.
86
00:04:25,090 --> 00:04:26,610
Pringle: But what's more
concerning is that a number
87
00:04:26,657 --> 00:04:29,527
of US federal agencies
are also compromised,
88
00:04:29,573 --> 00:04:32,273
including the Treasury, Justice
and Energy departments
89
00:04:32,315 --> 00:04:34,795
and even the Pentagon.
90
00:04:34,839 --> 00:04:38,099
Narrator: The SolarWinds hack
is one of the largest and most
91
00:04:38,147 --> 00:04:42,587
sophisticated cybersecurity
breaches of the 21st century.
92
00:04:42,630 --> 00:04:45,500
Authorities begin to
investigate, trying to ascertain
93
00:04:45,546 --> 00:04:47,676
who is behind this brazen attack
94
00:04:47,722 --> 00:04:51,682
and how exactly they managed to
execute it.
95
00:04:51,726 --> 00:04:52,896
Badminton: The SolarWinds hack
is what's called
96
00:04:52,944 --> 00:04:54,604
a supply-chain attack.
97
00:04:54,642 --> 00:04:57,562
Rather than trying to breach a
company or institution directly,
98
00:04:57,601 --> 00:05:00,871
hackers identify a third-party
vendor with weak cybersecurity
99
00:05:00,909 --> 00:05:03,429
and use them to gain access.
100
00:05:03,477 --> 00:05:05,347
Alexander: As there are many
possibilities for
101
00:05:05,392 --> 00:05:07,922
who that third party is,
there are also
102
00:05:07,959 --> 00:05:10,139
a few different types
of supply chain attacks.
103
00:05:10,179 --> 00:05:12,219
But one common trick is to
breach businesses
104
00:05:12,268 --> 00:05:14,098
that build websites.
105
00:05:14,139 --> 00:05:16,139
Narrator: In a website
builder attack,
106
00:05:16,185 --> 00:05:18,485
hackers compromise
companies who use
107
00:05:18,535 --> 00:05:21,055
ready-made templates
to create websites,
108
00:05:21,103 --> 00:05:23,933
usually digital ad agencies
or developers.
109
00:05:23,975 --> 00:05:26,055
Once the business is breached,
110
00:05:26,108 --> 00:05:29,328
the attackers manipulate the
core script of the template,
111
00:05:29,372 --> 00:05:32,292
redirecting victims
to a corrupt domain.
112
00:05:32,332 --> 00:05:35,382
Malware is then installed onto
the systems of those
113
00:05:35,422 --> 00:05:38,602
browsing legitimate websites.
114
00:05:38,642 --> 00:05:40,782
Pringle: Builder attacks
are very efficient
115
00:05:40,818 --> 00:05:43,648
because instead of targeting a
bunch of websites individually,
116
00:05:43,691 --> 00:05:47,431
hackers can gain access
to any site that uses the
117
00:05:47,477 --> 00:05:52,697
doctored code, all by gaining
access to just one company.
118
00:05:52,743 --> 00:05:54,353
Badminton: What we're also
seeing more and more of
119
00:05:54,397 --> 00:05:56,177
are so-called
"watering hole attacks",
120
00:05:56,225 --> 00:05:58,745
where hackers single out a
website that's visited often
121
00:05:58,793 --> 00:06:00,933
by employees of a
certain organisation
122
00:06:00,969 --> 00:06:05,709
or even a whole sector
like healthcare or defence.
123
00:06:05,756 --> 00:06:07,846
Narrator: Once the target
website of a watering hole
124
00:06:07,889 --> 00:06:11,589
attack is compromised, the
perpetrators distribute malware,
125
00:06:11,632 --> 00:06:14,642
sometimes without the
victims realising it.
126
00:06:14,678 --> 00:06:17,898
But because users trust the
site, it can also be hidden
127
00:06:17,942 --> 00:06:20,862
in a file that they
deliberately download,
128
00:06:20,902 --> 00:06:24,862
unaware of the malicious
content it contains.
129
00:06:24,906 --> 00:06:28,996
In 2021, Google's Threat
Analysis Group discovered
130
00:06:29,040 --> 00:06:31,610
a watering hole attack that
breached several media
131
00:06:31,652 --> 00:06:34,522
and pro-democracy websites to
target visitors
132
00:06:34,568 --> 00:06:37,008
specifically from Hong Kong.
133
00:06:37,048 --> 00:06:40,308
Cybersecurity experts suspect
that the Chinese government
134
00:06:40,356 --> 00:06:44,136
was behind the attack.
135
00:06:44,186 --> 00:06:46,276
Pringle: Hackers are now using
'watering hole' sites
136
00:06:46,318 --> 00:06:48,448
for cyber attacks against
an array of victims
137
00:06:48,495 --> 00:06:50,975
in many different sectors.
138
00:06:51,019 --> 00:06:53,459
Narrator: But the most popular
supply chain attack method
139
00:06:53,500 --> 00:06:56,150
is third-party software
interference,
140
00:06:56,198 --> 00:06:59,508
as witnessed in the
SolarWinds hack.
141
00:06:59,549 --> 00:07:01,989
And as the investigation
into the breach deepens,
142
00:07:02,030 --> 00:07:04,730
the ingenuity and complexity
of the operation
143
00:07:04,772 --> 00:07:08,082
becomes evident to authorities.
144
00:07:08,123 --> 00:07:10,133
Alexander: They discover that
the hackers originally gained
145
00:07:10,168 --> 00:07:14,478
access to SolarWinds over a year
before the attack was exposed.
146
00:07:14,521 --> 00:07:17,441
As a sort of trial run, they
inserted a small snippet
147
00:07:17,480 --> 00:07:19,570
of harmless code into
a software update
148
00:07:19,613 --> 00:07:21,663
to see what they could
get away with.
149
00:07:21,702 --> 00:07:23,752
Badminton: Once that version of
the software was published
150
00:07:23,791 --> 00:07:26,141
and distributed with
their code still intact,
151
00:07:26,184 --> 00:07:29,364
they knew that a full-scale
attack was possible.
152
00:07:29,405 --> 00:07:31,275
Narrator: On the heels
of this victory,
153
00:07:31,320 --> 00:07:33,800
the attackers then
do something strange,
154
00:07:33,844 --> 00:07:37,674
vanish for five months.
155
00:07:37,718 --> 00:07:39,588
Alexander: Presumably they were
working on writing the code
156
00:07:39,633 --> 00:07:41,983
for the main operation,
because when they reappear,
157
00:07:42,026 --> 00:07:43,936
they come equipped with
a backdoor attack
158
00:07:43,985 --> 00:07:47,855
the likes of which
the world has never seen.
159
00:07:47,902 --> 00:07:49,902
Narrator: Investigators are
stunned when they discover
160
00:07:49,947 --> 00:07:52,987
exactly how the perpetrators
managed to introduce
161
00:07:53,037 --> 00:07:56,907
the tainted code into the
SolarWinds software update.
162
00:07:56,954 --> 00:08:00,394
The first step was to embed code
that informed them
163
00:08:00,436 --> 00:08:02,476
whenever an employee on the
development team
164
00:08:02,525 --> 00:08:05,695
was preparing new software.
165
00:08:05,746 --> 00:08:07,616
Pringle: These companies
have a digital library,
166
00:08:07,661 --> 00:08:09,621
and every time they
engineer an update,
167
00:08:09,663 --> 00:08:11,583
the developer has
to check the code out
168
00:08:11,621 --> 00:08:13,361
and then when they're done
modifying it,
169
00:08:13,405 --> 00:08:16,055
they check it back in.
170
00:08:16,104 --> 00:08:18,194
Badminton: This creates a
digital trail so it's easy
171
00:08:18,236 --> 00:08:20,236
to track who has had
access to the files
172
00:08:20,282 --> 00:08:22,722
and when they were worked on.
173
00:08:22,763 --> 00:08:24,203
Narrator: Once an
update is complete,
174
00:08:24,242 --> 00:08:26,592
what's called a build
process is started,
175
00:08:26,636 --> 00:08:28,546
which converts the code
from human language
176
00:08:28,595 --> 00:08:30,765
to computer language.
177
00:08:30,814 --> 00:08:33,124
The finished software is
then stamped with
178
00:08:33,164 --> 00:08:35,734
what could be described
as a digital seal,
179
00:08:35,776 --> 00:08:37,726
which in most cases
makes it impossible
180
00:08:37,778 --> 00:08:41,218
to tamper with without someone
being alerted.
181
00:08:41,259 --> 00:08:43,039
Alexander: The hackers were
able to study the SolarWinds
182
00:08:43,087 --> 00:08:44,917
build process and
sneak the code in
183
00:08:44,959 --> 00:08:47,529
at the very last second
so it went undetected.
184
00:08:47,570 --> 00:08:49,880
Badminton: In real-world terms,
it's like if someone slipped
185
00:08:49,920 --> 00:08:52,620
a poison pill into a bottle
of aspirin at the factory,
186
00:08:52,662 --> 00:08:55,972
only a moment before the
bottle was sealed shut.
187
00:08:56,013 --> 00:08:58,103
Narrator: The resulting
malicious software update
188
00:08:58,146 --> 00:09:01,626
was then unknowingly sent out
to SolarWinds customers,
189
00:09:01,671 --> 00:09:04,721
giving the attackers total
anonymous access
190
00:09:04,761 --> 00:09:07,761
to any Orion user who
installed the update
191
00:09:07,808 --> 00:09:10,808
and had an internet connection.
192
00:09:10,854 --> 00:09:13,774
Pringle: The code itself was
sophisticated but brief,
193
00:09:13,814 --> 00:09:16,734
only 3,500 encrypted
characters long.
194
00:09:16,773 --> 00:09:19,823
The best hackers are very
economical in their programming,
195
00:09:19,863 --> 00:09:23,653
the more concise the code,
the harder it is to detect.
196
00:09:23,693 --> 00:09:26,573
Narrator: As investigators
unravel how the operation
197
00:09:26,609 --> 00:09:29,739
was carried out, they begin to
search for clues
198
00:09:29,786 --> 00:09:33,046
as to who was behind it, but
determining the identity
199
00:09:33,094 --> 00:09:36,404
of the attackers is proving
difficult.
200
00:09:36,445 --> 00:09:38,575
Badminton: A lot of hackers
inadvertently leave evidence
201
00:09:38,621 --> 00:09:41,621
behind, some have coding tics
that give them away based on
202
00:09:41,668 --> 00:09:44,628
known previous attacks or they
might even write something
203
00:09:44,671 --> 00:09:48,331
in their native language, which
can give away their nationality.
204
00:09:48,370 --> 00:09:51,330
Narrator: But the SolarWinds
code is so sophisticated,
205
00:09:51,373 --> 00:09:53,683
that there are no clues
to its origin.
206
00:09:53,723 --> 00:09:57,163
Authorities can't find any
evidence to pin down exactly
207
00:09:57,205 --> 00:10:01,635
where the attack came from, but
they have their suspicions.
208
00:10:01,688 --> 00:10:06,608
In 2017, the most damaging and
expensive cyberattack in history
209
00:10:06,649 --> 00:10:09,569
was allegedly perpetrated
by the Russian military.
210
00:10:09,609 --> 00:10:14,399
The hack, called NotPetya, also
used corrupted software
211
00:10:14,439 --> 00:10:16,659
as a delivery method.
212
00:10:16,703 --> 00:10:19,013
Alexander: NotPetya was
originally a cyber-weapon used
213
00:10:19,053 --> 00:10:20,843
against Ukraine by the Russians.
214
00:10:20,881 --> 00:10:23,711
They breached tax software
called M.E. Doc
215
00:10:23,753 --> 00:10:26,363
that is widely used by
Ukrainian businesses.
216
00:10:26,408 --> 00:10:31,198
From there, it spread like
wildfire around the world.
217
00:10:31,239 --> 00:10:33,759
Narrator: Investigators
discovered that NotPetya
218
00:10:33,807 --> 00:10:37,027
had been specifically programmed
to make it impossible
219
00:10:37,071 --> 00:10:42,211
to recover any files once
systems were infected.
220
00:10:42,250 --> 00:10:44,250
It was designed to
completely destroy
221
00:10:44,295 --> 00:10:46,945
any computer that
it infiltrated.
222
00:10:46,994 --> 00:10:49,874
The malware targeted everything
from energy companies,
223
00:10:49,910 --> 00:10:53,650
the power grid and gas
stations to airports,
224
00:10:53,696 --> 00:10:57,696
banks and major corporations.
225
00:10:57,744 --> 00:10:59,484
Pringle: The US government
assessed that the
226
00:10:59,528 --> 00:11:03,398
NotPetya attack ended up causing
about $10 Billion USD
227
00:11:03,445 --> 00:11:06,535
worth of damages worldwide,
making it the most expensive
228
00:11:06,578 --> 00:11:09,928
and destructive
cyberattack in history.
229
00:11:09,973 --> 00:11:12,503
Narrator: US authorities begin
to suspect that Russia
230
00:11:12,541 --> 00:11:16,331
may be behind the SolarWinds
hack as well, because both
231
00:11:16,371 --> 00:11:19,161
used tainted software code as
a launching point.
232
00:11:19,200 --> 00:11:21,720
But the problem for
investigators is
233
00:11:21,768 --> 00:11:24,678
that the similarities end there.
234
00:11:24,727 --> 00:11:27,077
Pringle: NotPetya was hell-bent
on destroying everything
235
00:11:27,121 --> 00:11:31,041
in its path, but the SolarWinds
attack was done covertly.
236
00:11:31,081 --> 00:11:35,911
They were also selective about
what institutions to target.
237
00:11:35,956 --> 00:11:38,476
Narrator: Breaches as large
as the SolarWinds hack
238
00:11:38,523 --> 00:11:42,353
can present an embarrassment of
riches for attackers.
239
00:11:42,397 --> 00:11:45,877
With so many options, it can be
hard for them to narrow down
240
00:11:45,922 --> 00:11:50,882
which companies or government
agencies they want to access.
241
00:11:50,927 --> 00:11:52,887
Badminton: What the attackers do
is create a passive domain name
242
00:11:52,929 --> 00:11:55,629
server system that not only
identifies potential targets
243
00:11:55,671 --> 00:11:58,111
by IP address, but also
gives them a little bit
244
00:11:58,152 --> 00:12:00,072
of information about each one.
245
00:12:00,110 --> 00:12:01,330
Pringle: The hackers then choose
246
00:12:01,372 --> 00:12:03,552
which targets are worthy
of their attention.
247
00:12:03,592 --> 00:12:05,732
They mostly go
after tech companies
248
00:12:05,768 --> 00:12:09,688
and high-profile branches
of the government.
249
00:12:09,729 --> 00:12:11,769
Narrator: The SolarWinds
attackers manage to breach
250
00:12:11,818 --> 00:12:14,998
about a dozen vital US
government agencies.
251
00:12:15,038 --> 00:12:18,738
Disturbingly, they even break
into the Cybersecurity
252
00:12:18,781 --> 00:12:22,311
and Infrastructure Security
Agency, or CISA,
253
00:12:22,350 --> 00:12:25,610
the office at the Department
of Homeland Security
254
00:12:25,657 --> 00:12:28,657
whose primary function is to
defend government networks
255
00:12:28,704 --> 00:12:31,884
from cyberattacks.
256
00:12:31,925 --> 00:12:34,095
Alexander: For them to breach
the very agency whose job it is
257
00:12:34,144 --> 00:12:36,104
to defend against
these kinds of attacks
258
00:12:36,146 --> 00:12:38,236
is a major embarrassment
for US authorities.
259
00:12:38,279 --> 00:12:40,539
How does that even happen?
260
00:12:40,585 --> 00:12:43,145
Narrator: According to the
Department of Homeland Security,
261
00:12:43,197 --> 00:12:46,937
their system only detects known
threats and the SolarWinds
262
00:12:46,983 --> 00:12:50,123
attack is unlike anything
they've ever seen before.
263
00:12:50,160 --> 00:12:53,160
On top of that issue, their
processes don't involve
264
00:12:53,207 --> 00:12:54,987
scanning software updates.
265
00:12:55,035 --> 00:12:58,255
So even if the system could have
identified the malicious code,
266
00:12:58,299 --> 00:13:00,259
they never would
have detected it,
267
00:13:00,301 --> 00:13:03,831
because it was buried
in the update.
268
00:13:03,870 --> 00:13:05,960
Pringle: That seems like a
pretty big oversight on the part
269
00:13:06,002 --> 00:13:08,792
of the government,
who you would think would have
270
00:13:08,831 --> 00:13:10,661
all sorts of
safeguards in place.
271
00:13:10,702 --> 00:13:12,972
Narrator: As investigators
dig deeper,
272
00:13:13,009 --> 00:13:15,929
they discover something
truly shocking.
273
00:13:15,969 --> 00:13:19,409
The attackers had unfettered
access to some of these systems
274
00:13:19,450 --> 00:13:24,670
for an astonishing nine months
before the hack was detected.
275
00:13:24,716 --> 00:13:26,756
Badminton: Interestingly, the
hackers didn't seem to disrupt
276
00:13:26,806 --> 00:13:28,626
any systems or
destroy any files.
277
00:13:28,677 --> 00:13:31,027
They just kind of
silently roamed around.
278
00:13:31,071 --> 00:13:35,291
Which points to one
thing - cyber espionage.
279
00:13:35,336 --> 00:13:37,686
Narrator: Authorities now have
the motive for the attack,
280
00:13:37,729 --> 00:13:40,039
but are still unable to find
any evidence
281
00:13:40,080 --> 00:13:42,910
as to who perpetrated it.
282
00:13:42,952 --> 00:13:46,002
But they can't help but come
back to their prime suspect:
283
00:13:46,042 --> 00:13:47,652
Russia.
284
00:13:47,696 --> 00:13:50,956
More specifically, one of the
most tenacious and cunning
285
00:13:51,004 --> 00:13:53,054
hacking groups on the planet:
286
00:13:53,093 --> 00:13:56,443
the Russian Intelligence-backed
APT29,
287
00:13:56,487 --> 00:14:00,967
also known as Cozy Bear.
288
00:14:01,014 --> 00:14:02,974
Pringle: Cozy Bear has been
responsible for some of the most
289
00:14:03,016 --> 00:14:05,756
infamous hacks of
US and NATO member countries
290
00:14:05,801 --> 00:14:07,671
over the past several years.
291
00:14:07,716 --> 00:14:10,066
They're the cream of the crop.
292
00:14:10,110 --> 00:14:14,110
Narrator: In 2016, WikiLeaks
released 20,000 emails
293
00:14:14,157 --> 00:14:16,287
from the Democratic
National Committee
294
00:14:16,333 --> 00:14:19,643
that they acquired after Cozy
Bear and another hacking team
295
00:14:19,684 --> 00:14:21,734
believed to be tied to a
separate branch
296
00:14:21,773 --> 00:14:23,513
of the Russian
intelligence service,
297
00:14:23,558 --> 00:14:26,648
accessed the DNC's
internal network.
298
00:14:26,691 --> 00:14:29,871
Cozy Bear camped out in the
system undetected
299
00:14:29,912 --> 00:14:33,262
for over a year, actions
suspiciously similar
300
00:14:33,307 --> 00:14:35,087
to the SolarWinds hack.
301
00:14:35,135 --> 00:14:38,045
And the resemblance
doesn't stop there.
302
00:14:38,094 --> 00:14:40,014
Badminton: Both of these
attacks have a common thread,
303
00:14:40,053 --> 00:14:42,143
they use cutting
edge digital tools,
304
00:14:42,185 --> 00:14:44,795
and this suggests state funding.
305
00:14:44,840 --> 00:14:46,840
They went after strategic
information,
306
00:14:46,886 --> 00:14:48,576
rather than financial gain;
307
00:14:48,626 --> 00:14:50,106
and they chose
targets of interest
308
00:14:50,150 --> 00:14:53,810
to Russia's
intelligence community.
309
00:14:53,849 --> 00:14:56,289
Narrator: For its part, Russia
denied any involvement
310
00:14:56,330 --> 00:14:57,980
in the SolarWinds hack.
311
00:14:58,027 --> 00:15:00,117
But the US was unconvinced
312
00:15:00,160 --> 00:15:03,990
and imposed sanctions on them as
punishment for the attack.
313
00:15:04,033 --> 00:15:05,563
Pringle: The U.S.
ultimately announced
314
00:15:05,600 --> 00:15:08,430
that 10 Russian diplomats would
be expelled from the country
315
00:15:08,472 --> 00:15:12,042
and 32 entities and individuals
would be blacklisted.
316
00:15:12,085 --> 00:15:15,035
The sanctions also targeted
six Russian tech firms
317
00:15:15,088 --> 00:15:18,438
linked to intelligence services.
318
00:15:18,482 --> 00:15:20,272
Narrator: The full extent
of the damage caused
319
00:15:20,310 --> 00:15:23,790
by the SolarWinds hack remains
something of a mystery.
320
00:15:23,835 --> 00:15:26,135
For the government agencies that
were breached,
321
00:15:26,186 --> 00:15:29,226
it's virtually impossible for
them to know the sum total
322
00:15:29,276 --> 00:15:33,236
of the information the
Russians had access to.
323
00:15:33,280 --> 00:15:35,060
Badminton: They do know for sure
that the hackers broke into
324
00:15:35,108 --> 00:15:37,148
email accounts
affiliated with the
325
00:15:37,197 --> 00:15:39,457
Head of the Department of
Homeland Security
326
00:15:39,503 --> 00:15:41,333
and also several others who work
327
00:15:41,375 --> 00:15:44,245
in the department's
cybersecurity division.
328
00:15:44,291 --> 00:15:47,641
Narrator: For private companies,
the impact is also murky.
329
00:15:47,685 --> 00:15:51,035
Microsoft reported no evidence
of stolen or leaked
330
00:15:51,080 --> 00:15:53,740
customer data from the attack.
331
00:15:53,778 --> 00:15:55,518
Alexander: It seems pretty
clear that the US government
332
00:15:55,563 --> 00:15:57,523
was the primary target
of the attack.
333
00:15:57,565 --> 00:16:00,605
The tech companies were probably
just collateral damage.
334
00:16:00,655 --> 00:16:03,345
Narrator: Up against this kind
of formidable enemy,
335
00:16:03,397 --> 00:16:06,267
American authorities face
difficult questions.
336
00:16:06,313 --> 00:16:08,713
Was the attack just
the tip of the spear
337
00:16:08,750 --> 00:16:11,010
in the escalating cyberwar
between familiar
338
00:16:11,057 --> 00:16:13,097
cold war adversaries?
339
00:16:13,146 --> 00:16:16,796
Is another, more destructive
hack waiting around the corner?
340
00:16:16,845 --> 00:16:19,715
Most experts agree - it's
not a matter of
341
00:16:19,761 --> 00:16:26,991
if it happens again, but when.
342
00:16:27,029 --> 00:16:30,859
♪ [show theme music]
343
00:16:30,902 --> 00:16:38,262
♪♪
344
00:16:38,301 --> 00:16:40,781
Narrator: 25-year-old New York
native Sheikh Ahmed
345
00:16:40,825 --> 00:16:43,565
is on the hunt for a job
as a bank teller.
346
00:16:43,611 --> 00:16:46,661
He has applied for many
positions across the city,
347
00:16:46,701 --> 00:16:49,361
and today he gets the news that
he has been selected
348
00:16:49,399 --> 00:16:52,529
for not just one,
but eight interviews.
349
00:16:52,576 --> 00:16:55,576
However, these would be no
ordinary meetings with
350
00:16:55,623 --> 00:16:58,713
Human Resources representatives
or branch managers.
351
00:16:58,756 --> 00:17:01,456
These are HireVue assessments,
352
00:17:01,498 --> 00:17:03,148
a state-of-the-art
recruiting tool
353
00:17:03,196 --> 00:17:05,066
that uses artificial
intelligence
354
00:17:05,111 --> 00:17:07,111
to assess a candidate's
worthiness
355
00:17:07,156 --> 00:17:10,586
with no prospective
employer present.
356
00:17:10,638 --> 00:17:12,378
Badminton: The HireVue system
uses a person's phone
357
00:17:12,422 --> 00:17:15,252
or computer camera to scrutinize
the smallest details
358
00:17:15,295 --> 00:17:18,075
of their answers to a
standard set of questions.
359
00:17:18,124 --> 00:17:20,264
Alexander: It analyzes their
facial expressions;
360
00:17:20,300 --> 00:17:22,000
how much eye contact they make,
361
00:17:22,041 --> 00:17:24,301
what words they use and even
their tone of voice.
362
00:17:24,347 --> 00:17:26,957
HireVue claims that by using
these metrics,
363
00:17:27,002 --> 00:17:29,002
they can determine how
enthusiastic a person is
364
00:17:29,048 --> 00:17:31,398
about getting the job.
365
00:17:31,441 --> 00:17:33,751
Narrator: This information is
then used to automatically
366
00:17:33,791 --> 00:17:36,321
produce an employability score,
367
00:17:36,359 --> 00:17:38,449
which is ranked against
other candidates.
368
00:17:38,492 --> 00:17:41,802
The HireVue algorithm is part of
a burgeoning field
369
00:17:41,843 --> 00:17:44,933
of artificial intelligence
called ERT,
370
00:17:44,976 --> 00:17:48,846
Emotion Recognition Technology.
371
00:17:48,893 --> 00:17:51,073
Aitken: ERT basically tries to
identify how someone is feeling
372
00:17:51,113 --> 00:17:52,593
based on their facial
expressions
373
00:17:52,636 --> 00:17:56,026
and other physical clues.
374
00:17:56,075 --> 00:17:59,375
Narrator: These systems rely on
two factors - computer vision,
375
00:17:59,426 --> 00:18:01,856
to accurately recognize
facial movements,
376
00:18:01,906 --> 00:18:05,476
and machine learning to analyze
and decipher them.
377
00:18:05,519 --> 00:18:09,309
The algorithms reference huge
image databases of human faces
378
00:18:09,349 --> 00:18:11,699
that are classified by emotion,
379
00:18:11,742 --> 00:18:15,142
and then try to match
them to the subject.
380
00:18:15,181 --> 00:18:16,791
Badminton:
Six basic feelings are used:
381
00:18:16,834 --> 00:18:18,924
fear, anger, joy, sadness,
382
00:18:18,967 --> 00:18:21,397
disgust, and surprise.
383
00:18:21,448 --> 00:18:25,148
Narrator: Pioneering American
Psychologist Doctor Paul Ekman
384
00:18:25,191 --> 00:18:27,671
was the first to categorize
these as the fundamental
385
00:18:27,715 --> 00:18:30,535
human emotions back
in the 1960s.
386
00:18:30,587 --> 00:18:34,417
Ekman is considered to be the
founding father of ERT,
387
00:18:34,461 --> 00:18:36,991
and his early research still
echoes in today's
388
00:18:37,028 --> 00:18:40,118
sophisticated artificial
intelligence systems.
389
00:18:40,162 --> 00:18:41,472
Alexander: He believed
that these were the
390
00:18:41,511 --> 00:18:43,911
universal feelings
that all humans shared,
391
00:18:43,948 --> 00:18:45,688
regardless of gender, culture,
392
00:18:45,733 --> 00:18:47,953
location or situation.
393
00:18:47,996 --> 00:18:50,606
Narrator: In 1978, he published
394
00:18:50,651 --> 00:18:54,521
the Facial Action
Coding System or FACS.
395
00:18:54,568 --> 00:18:57,178
Alexander: The FACS system
categorized around 40 unique
396
00:18:57,223 --> 00:19:00,103
muscle movements of the face
and called the elements of each
397
00:19:00,139 --> 00:19:02,179
expression an action unit.
398
00:19:02,228 --> 00:19:05,488
Narrator: For the most part,
FACS was a resounding success,
399
00:19:05,535 --> 00:19:07,835
but there were issues.
400
00:19:07,885 --> 00:19:12,145
The central problem was that it
was very time-consuming to use,
401
00:19:12,194 --> 00:19:16,074
it took up to 100 hours to teach
users the procedures,
402
00:19:16,111 --> 00:19:20,251
and an hour to evaluate only one
minute of film footage.
403
00:19:20,289 --> 00:19:23,339
But a promising new technology
that might help overcome
404
00:19:23,379 --> 00:19:25,639
these obstacles was on the
horizon:
405
00:19:25,686 --> 00:19:28,116
computer vision.
406
00:19:28,167 --> 00:19:30,727
Aitken: In the early '90s,
researchers realized
407
00:19:30,778 --> 00:19:33,038
that in order to take advantage
of advancing technology,
408
00:19:33,084 --> 00:19:35,264
they needed a database of
standardized images
409
00:19:35,304 --> 00:19:36,874
to work with.
410
00:19:36,914 --> 00:19:38,834
Badminton: At this point, the
US government stepped in
411
00:19:38,873 --> 00:19:41,663
and financed a program to
compile facial pictures.
412
00:19:41,702 --> 00:19:45,402
They saw the potential for ERT
as a security application.
413
00:19:45,445 --> 00:19:48,005
Narrator: By the
end of the 1990s,
414
00:19:48,056 --> 00:19:50,096
machine-learning scientists
began to collect
415
00:19:50,145 --> 00:19:52,225
and classify these archives,
416
00:19:52,278 --> 00:19:55,238
resulting in robust image
datasets that provide
417
00:19:55,281 --> 00:19:58,851
the foundation for much of
today's AI-based research.
418
00:19:58,893 --> 00:20:01,333
And emotion recognition
technology
419
00:20:01,374 --> 00:20:05,294
is quickly becoming big
business.
420
00:20:05,334 --> 00:20:07,124
Alexander: One early
provider of ERT services
421
00:20:07,162 --> 00:20:09,382
was a startup called Affectiva.
422
00:20:09,425 --> 00:20:11,335
Their technology was
sold to businesses
423
00:20:11,384 --> 00:20:13,304
as a market research product,
424
00:20:13,342 --> 00:20:15,952
analyzing real-time
emotional reactions to ads
425
00:20:15,997 --> 00:20:18,997
and new products
in focus groups.
426
00:20:19,043 --> 00:20:21,923
Narrator: ERT has since expanded
into many other areas
427
00:20:21,959 --> 00:20:24,699
of business, particularly
recruitment,
428
00:20:24,745 --> 00:20:26,825
where companies like
HireVue claim
429
00:20:26,877 --> 00:20:28,967
that they can streamline the
hiring process
430
00:20:29,010 --> 00:20:32,320
by using their systems to weed
out unworthy applicants
431
00:20:32,361 --> 00:20:35,231
quickly and accurately.
432
00:20:35,277 --> 00:20:36,497
Badminton: A process that
used to take weeks
433
00:20:36,539 --> 00:20:38,239
now only takes a few days.
434
00:20:38,280 --> 00:20:42,110
It's way cheaper and faster
than if humans were involved.
435
00:20:42,153 --> 00:20:46,463
Narrator: In fact, ERT is now
so prevalent in human resources,
436
00:20:46,506 --> 00:20:49,416
that there are online guides
with tips for candidates
437
00:20:49,465 --> 00:20:52,115
on how to best present
themselves to the camera.
438
00:20:52,163 --> 00:20:55,473
For job seekers like Sheikh
Ahmed, the process can be
439
00:20:55,515 --> 00:20:58,685
an intimidating and
distressing experience.
440
00:20:58,735 --> 00:21:01,085
Aitken: They're essentially
trying to impress a machine.
441
00:21:01,129 --> 00:21:02,999
It really is kind of strange.
442
00:21:03,044 --> 00:21:06,274
Narrator: Ahmed has spent
countless hours studying guides
443
00:21:06,308 --> 00:21:08,568
on how to speak and
comport himself,
444
00:21:08,615 --> 00:21:10,525
but on the day of the interviews,
445
00:21:10,573 --> 00:21:13,053
he frets over something
seemingly trivial:
446
00:21:13,097 --> 00:21:15,837
how to position the camera.
447
00:21:15,883 --> 00:21:18,283
Alexander: A high angle might
make him seem weak and small,
448
00:21:18,320 --> 00:21:22,110
whereas a low angle might
make him appear too dominant.
449
00:21:22,150 --> 00:21:23,370
Narrator: And there
are other factors
450
00:21:23,412 --> 00:21:25,242
fuelling Ahmed's anxiety,
451
00:21:25,284 --> 00:21:29,114
namely that random sounds
might harm his score.
452
00:21:29,157 --> 00:21:31,637
He turns off the air
conditioning system,
453
00:21:31,681 --> 00:21:33,641
and tucks himself into the
corner of his father's
454
00:21:33,683 --> 00:21:37,173
soundproof music studio, far
away from the normally
455
00:21:37,208 --> 00:21:40,118
pleasant chirping of the
family's pet bird.
456
00:21:40,168 --> 00:21:41,908
Badminton: Because
the software analyses
457
00:21:41,952 --> 00:21:44,782
the sound of people's voices,
any outside interference
458
00:21:44,825 --> 00:21:47,605
could have an impact
on his evaluation.
459
00:21:47,654 --> 00:21:50,094
Narrator: Ahmed settles
into a gruelling day
460
00:21:50,134 --> 00:21:53,624
and confronts the unsettling
reality of facing an algorithm
461
00:21:53,660 --> 00:21:56,620
that judges every involuntary
gesture that he makes
462
00:21:56,663 --> 00:21:58,583
and every word that he utters.
463
00:21:58,621 --> 00:22:01,491
To critics of emotion
recognition technology,
464
00:22:01,537 --> 00:22:05,447
and there are many,
this is problematic.
465
00:22:05,498 --> 00:22:08,278
Most wonder if artificial
intelligence can really
466
00:22:08,327 --> 00:22:12,897
interpret something as complex
and nuanced as human behaviour.
467
00:22:12,940 --> 00:22:15,290
Aitken: Some argue that it's
impossible to know definitively
468
00:22:15,334 --> 00:22:16,864
what a person is feeling
simply by reading
469
00:22:16,900 --> 00:22:18,290
their facial expressions.
470
00:22:18,337 --> 00:22:20,467
People sometimes smile even if
they're not happy
471
00:22:20,513 --> 00:22:22,783
or scowl when they aren't angry.
472
00:22:22,819 --> 00:22:25,079
Narrator: And there
are other problems.
473
00:22:25,126 --> 00:22:29,036
Critics of ERT claim that the
image categorizing process
474
00:22:29,086 --> 00:22:32,346
used to develop algorithms
is overly simplistic.
475
00:22:32,394 --> 00:22:35,144
Something as complicated
as human emotion
476
00:22:35,179 --> 00:22:38,919
can't be distilled down
to six basic feelings.
477
00:22:38,966 --> 00:22:41,096
Badminton: Emotions are complex
and often interrelated.
478
00:22:41,142 --> 00:22:43,012
There are many grey areas.
479
00:22:43,057 --> 00:22:46,667
There are subtleties that no AI
is capable of detecting.
480
00:22:46,713 --> 00:22:49,023
Well, not quite yet!
481
00:22:49,063 --> 00:22:50,593
Narrator: Some are also
quick to point out
482
00:22:50,630 --> 00:22:53,720
that people express emotions
in many different ways,
483
00:22:53,763 --> 00:22:56,383
not just using
facial expressions.
484
00:22:56,418 --> 00:22:58,858
Factors like body language
are also indicators
485
00:22:58,899 --> 00:23:00,899
of how a person is feeling.
486
00:23:00,944 --> 00:23:02,824
Alexander: Physical cues
such as crossed arms
487
00:23:02,859 --> 00:23:04,949
or a slumped posture
can sometimes convey
488
00:23:04,992 --> 00:23:07,912
someone's state of mind better
than the look on their face.
489
00:23:07,951 --> 00:23:09,821
Narrator: But Ekman and his
supporters counter
490
00:23:09,866 --> 00:23:12,996
that the research is sound and
stand by the assertion
491
00:23:13,043 --> 00:23:15,873
that if a universal emotion is
triggered in a person,
492
00:23:15,916 --> 00:23:18,176
then an involuntary facial
movement
493
00:23:18,222 --> 00:23:21,402
naturally appears on their face.
494
00:23:21,443 --> 00:23:23,623
Aitken: So the argument goes
that even if that person
495
00:23:23,663 --> 00:23:25,233
tried to hide their feelings,
496
00:23:25,273 --> 00:23:27,623
the basic,
reflex emotion would surface,
497
00:23:27,667 --> 00:23:31,187
and if someone knew what to look
for, they could identify it.
498
00:23:31,235 --> 00:23:33,885
Narrator: Still, there are
many skeptics who question
499
00:23:33,934 --> 00:23:36,554
the scientific validity of ERT
500
00:23:36,589 --> 00:23:39,589
and have problems with some of
the methodology.
501
00:23:39,635 --> 00:23:41,545
Alexander: One of the issues
people have is that the
502
00:23:41,594 --> 00:23:44,474
image datasets may be
made up of posed faces.
503
00:23:44,510 --> 00:23:46,770
If someone is asked to
make a sad face,
504
00:23:46,816 --> 00:23:48,426
it may look different
from how their face
505
00:23:48,470 --> 00:23:50,990
actually looks when they're sad.
506
00:23:51,038 --> 00:23:53,558
Aitken: It's a valid argument,
so the most recent systems
507
00:23:53,606 --> 00:23:55,606
have started to draw on
images that are candid,
508
00:23:55,651 --> 00:23:57,481
footage of people doing
mundane things like
509
00:23:57,523 --> 00:23:59,703
driving their cars
or watching TV.
510
00:23:59,742 --> 00:24:02,922
Narrator: There has also been
criticism of the forced-choice
511
00:24:02,963 --> 00:24:06,313
answer method of labelling
pictures in datasets.
512
00:24:06,357 --> 00:24:08,487
Because there are limited
options when asked
513
00:24:08,534 --> 00:24:10,844
to ascribe an emotion
to a picture,
514
00:24:10,884 --> 00:24:13,714
there's no room
for interpretation.
515
00:24:13,756 --> 00:24:15,366
Badminton: Someone might look at
an image and think the person
516
00:24:15,410 --> 00:24:16,980
is feeling guilt or shame,
517
00:24:17,020 --> 00:24:21,500
but those feelings may not be on
the list of possible choices.
518
00:24:21,547 --> 00:24:23,937
Narrator: And there
are cultural concerns.
519
00:24:23,984 --> 00:24:25,864
People from different regions
of the world
520
00:24:25,899 --> 00:24:29,159
convey emotions
in different ways.
521
00:24:29,206 --> 00:24:31,376
Alexander: Many people use
smiles to show happiness
522
00:24:31,426 --> 00:24:33,946
but for example,
in Japan some smiles
523
00:24:33,994 --> 00:24:36,784
are simple expressions of
politeness, rather than joy.
524
00:24:36,823 --> 00:24:39,393
So it can be fairly nuanced.
525
00:24:39,434 --> 00:24:41,394
Aitken: But even if one culture
has a slightly different idea
526
00:24:41,436 --> 00:24:43,216
of what a happy face looks like,
527
00:24:43,264 --> 00:24:45,664
most people recognize
joy when they see it,
528
00:24:45,701 --> 00:24:48,791
regardless of where
they're from.
529
00:24:48,835 --> 00:24:51,705
Narrator: In order to mitigate
the effect of cultural nuances,
530
00:24:51,751 --> 00:24:55,671
ERT companies are compiling
more diverse datasets.
531
00:24:55,711 --> 00:24:58,411
Affectiva, one of the
leaders in the field,
532
00:24:58,453 --> 00:25:01,203
boasts a collection of more
than 10 million images
533
00:25:01,238 --> 00:25:05,808
of people's facial expressions
from 87 countries.
534
00:25:05,852 --> 00:25:07,852
And they are always adjusting
their algorithms
535
00:25:07,897 --> 00:25:10,807
to make them more accurate.
536
00:25:10,857 --> 00:25:12,637
Badminton: What these companies
are now doing is including
537
00:25:12,685 --> 00:25:14,765
an element of analysis
to their systems.
538
00:25:14,817 --> 00:25:17,647
So that rather than just
identifying an emotion,
539
00:25:17,690 --> 00:25:20,480
the AI is able to apply a
cultural context
540
00:25:20,519 --> 00:25:23,129
when classifying it.
541
00:25:23,173 --> 00:25:26,223
Narrator: Context is another
issue that critics of ERT
542
00:25:26,263 --> 00:25:27,743
take issue with.
543
00:25:27,787 --> 00:25:31,967
In 1972, Paul Ekman conducted
an experiment
544
00:25:32,008 --> 00:25:34,448
to study the differences between
how Japanese
545
00:25:34,489 --> 00:25:37,359
and American audiences reacted
to a horror film.
546
00:25:37,405 --> 00:25:40,705
He found that Japanese viewers
showed fewer negative expressions
547
00:25:40,756 --> 00:25:44,106
when there was an authority
figure in the room.
548
00:25:44,151 --> 00:25:46,071
Alexander: Different cultures
have their own set of rules
549
00:25:46,109 --> 00:25:48,329
about who can show
certain emotions to whom.
550
00:25:48,372 --> 00:25:51,072
In this case, the Japanese
audience probably behaved
551
00:25:51,114 --> 00:25:52,904
differently because they knew
there was someone there
552
00:25:52,942 --> 00:25:54,292
who may have been judging them.
553
00:25:54,335 --> 00:25:56,895
Narrator: And for people like
Sheikh Ahmed,
554
00:25:56,946 --> 00:25:59,986
being judged by an Artificial
Intelligence system
555
00:26:00,036 --> 00:26:02,336
while merely trying to find a
job would certainly
556
00:26:02,386 --> 00:26:04,866
have an effect on their
behaviour.
557
00:26:04,911 --> 00:26:06,961
Aitken: Ahmed altered his
responses slightly
558
00:26:07,000 --> 00:26:09,090
over the course of the
eight interviews that day.
559
00:26:09,132 --> 00:26:11,442
I guess he thought that if he
gave the algorithm a variety
560
00:26:11,482 --> 00:26:15,232
of answers, it might increase
his chances of a positive score.
561
00:26:15,269 --> 00:26:17,099
Narrator: By the
end of the ordeal,
562
00:26:17,140 --> 00:26:21,010
an exhausted Ahmed is drenched
in sweat, his mouth is dry,
563
00:26:21,057 --> 00:26:23,707
and he can't shake the feeling
that he hasn't made enough
564
00:26:23,756 --> 00:26:26,926
eye contact with the camera or
said the right things.
565
00:26:26,976 --> 00:26:28,796
Alexander:
Not enough eye contact?
566
00:26:28,848 --> 00:26:30,238
You're shy
and have no confidence.
567
00:26:30,284 --> 00:26:31,814
Too much eye contact,
568
00:26:31,851 --> 00:26:34,111
and you're
aggressive and too intense.
569
00:26:34,157 --> 00:26:36,457
Badminton: As difficult
as the process may be,
570
00:26:36,507 --> 00:26:39,507
the reality is that ERT
in recruitment is only
571
00:26:39,554 --> 00:26:42,564
going to become more common
as the technology advances.
572
00:26:42,601 --> 00:26:45,131
Narrator: And ERT is
gradually creeping its way
573
00:26:45,168 --> 00:26:47,208
into other fields as well.
574
00:26:47,257 --> 00:26:50,567
The latest sector to feel
its touch is education.
575
00:26:50,609 --> 00:26:54,399
True Light College, a secondary
school for girls in Hong Kong,
576
00:26:54,438 --> 00:26:58,048
used ERT software to evaluate
students' faces
577
00:26:58,094 --> 00:27:00,714
as they learned remotely
during the pandemic.
578
00:27:00,749 --> 00:27:03,009
Alexander: The developers say
that the system helps teachers
579
00:27:03,056 --> 00:27:05,006
make learning more
engaging and personal,
580
00:27:05,058 --> 00:27:07,838
by reacting to a student's
expressions in real time.
581
00:27:07,887 --> 00:27:11,537
It even sends them alerts if
they seem distracted or bored.
582
00:27:11,586 --> 00:27:13,676
Narrator: The company behind the
software claims
583
00:27:13,719 --> 00:27:16,199
that it is able to correctly
decipher a child's
584
00:27:16,243 --> 00:27:19,553
emotional state about
85% of the time
585
00:27:19,594 --> 00:27:22,644
and demand for the program
has increased dramatically,
586
00:27:22,684 --> 00:27:25,034
with the number of schools using
it in Hong Kong
587
00:27:25,078 --> 00:27:28,298
more than doubling
from 34 to 83.
588
00:27:28,342 --> 00:27:30,392
Aitken: The whole thing seems
really invasive to me.
589
00:27:30,431 --> 00:27:31,871
These are kids, after all.
590
00:27:31,911 --> 00:27:33,221
Do we really need
to be monitoring
591
00:27:33,260 --> 00:27:35,830
their faces as they learn?
592
00:27:35,871 --> 00:27:38,441
Narrator: Elsewhere in China,
ERT is being used
593
00:27:38,482 --> 00:27:41,272
for even more intrusive
purposes.
594
00:27:41,311 --> 00:27:43,531
Cameras with emotion
recognition systems
595
00:27:43,574 --> 00:27:45,584
have been installed in Xinjiang,
596
00:27:45,620 --> 00:27:48,880
the region where an estimated
1 million mostly Uyghur Muslims
597
00:27:48,928 --> 00:27:51,228
are being detained
in prison camps.
598
00:27:51,278 --> 00:27:54,318
Chinese authorities believe that
their algorithms are able to
599
00:27:54,368 --> 00:27:58,418
identify potential criminals by
determining their mental state.
600
00:27:58,459 --> 00:28:01,589
Badminton: It's kind of the next
step in the evolution of ERT.
601
00:28:01,636 --> 00:28:04,676
Some believe that not only can
the AI detect how a person is
602
00:28:04,726 --> 00:28:07,286
feeling, but it's even able to
predict their future actions
603
00:28:07,337 --> 00:28:10,727
and give an overall impression
of their personality.
604
00:28:10,776 --> 00:28:12,466
Aitken: But there's no real
evidence that these systems
605
00:28:12,516 --> 00:28:14,126
are even remotely accurate.
606
00:28:14,170 --> 00:28:16,870
They're based on very
vague so-called science.
607
00:28:16,912 --> 00:28:20,442
Narrator: In 2018, a
controversial study
608
00:28:20,481 --> 00:28:23,141
out of Stanford University
in California
609
00:28:23,179 --> 00:28:25,359
even went so far as to declare
that
610
00:28:25,399 --> 00:28:27,879
facial analysis is capable of
identifying
611
00:28:27,923 --> 00:28:30,143
a person's sexuality.
612
00:28:30,186 --> 00:28:32,796
Using a dataset of over
35,000 images
613
00:28:32,841 --> 00:28:34,671
taken from dating websites,
614
00:28:34,713 --> 00:28:37,453
a machine-learning system
was able to differentiate
615
00:28:37,498 --> 00:28:39,758
between pictures of gay
and straight people
616
00:28:39,805 --> 00:28:42,015
with surprising accuracy.
617
00:28:42,068 --> 00:28:44,638
Alexander: The program was
able to correctly categorize
618
00:28:44,679 --> 00:28:47,379
81% of cases involving
images of men
619
00:28:47,421 --> 00:28:50,161
and 74% of photographs of women.
620
00:28:50,206 --> 00:28:52,636
When humans did the same test,
those numbers dropped
621
00:28:52,687 --> 00:28:54,987
by 20% across both genders.
622
00:28:55,037 --> 00:28:57,517
The researchers were actually
quite shocked at how easy it was
623
00:28:57,561 --> 00:28:59,521
for the algorithm to
make the distinction.
624
00:28:59,563 --> 00:29:01,653
Narrator: The authors of the
study concluded
625
00:29:01,696 --> 00:29:03,996
that there is mounting
scientific evidence
626
00:29:04,046 --> 00:29:07,436
that there may be connections
between faces and psychology
627
00:29:07,484 --> 00:29:10,274
that are impossible to detect
with the human eye,
628
00:29:10,313 --> 00:29:13,803
but are identifiable
to machine learning systems.
629
00:29:13,839 --> 00:29:16,189
Badminton: Several prominent
LGBT organizations
630
00:29:16,232 --> 00:29:19,152
were not happy and demanded
that Stanford distance itself
631
00:29:19,192 --> 00:29:22,592
from the research, calling
it dangerous and flawed.
632
00:29:22,630 --> 00:29:25,370
Aitken: People were angry,
because it is without question
633
00:29:25,415 --> 00:29:27,235
an international
human rights issue.
634
00:29:27,287 --> 00:29:29,857
This kind of technology could
be used to expose people as gay
635
00:29:29,898 --> 00:29:31,768
whether accurately
or inaccurately.
636
00:29:31,813 --> 00:29:33,823
And in countries like
Saudi Arabia and Iran,
637
00:29:33,859 --> 00:29:35,989
where homosexuality is
punished by execution,
638
00:29:36,035 --> 00:29:38,775
that is very dangerous!
639
00:29:38,820 --> 00:29:40,040
Narrator: While the
controversies around
640
00:29:40,082 --> 00:29:43,092
emotion recognition
technology swirl,
641
00:29:43,129 --> 00:29:46,089
the industry shows no sign
of slowing down.
642
00:29:46,132 --> 00:29:48,662
Some estimates project
that it will reach
643
00:29:48,699 --> 00:29:52,529
37 billion US dollars by 2026,
644
00:29:52,573 --> 00:29:56,883
up from $19.5 billion in 2020.
645
00:29:56,925 --> 00:29:59,145
And it's not just
plucky startups
646
00:29:59,188 --> 00:30:01,228
looking to get a
piece of the pie.
647
00:30:01,277 --> 00:30:05,277
Tech industry titans like Apple,
Microsoft and Amazon are all
648
00:30:05,325 --> 00:30:08,975
investing heavily in developing
their own ERT products.
649
00:30:09,024 --> 00:30:11,204
Badminton: Obviously these
companies see something of value
650
00:30:11,244 --> 00:30:13,554
in the technology,
but I think there will always be
651
00:30:13,594 --> 00:30:17,084
lingering questions about
its scientific integrity.
652
00:30:17,119 --> 00:30:19,029
Narrator: Meanwhile,
back in New York,
653
00:30:19,078 --> 00:30:21,038
Sheikh Ahmed waits nervously
654
00:30:21,080 --> 00:30:25,910
to finally find out
if he got a job.
655
00:30:25,954 --> 00:30:29,784
♪ [show theme music]
656
00:30:29,828 --> 00:30:33,568
♪♪
657
00:30:33,614 --> 00:30:37,624
♪
658
00:30:37,661 --> 00:30:39,931
Narrator: In a Barron County,
Wisconsin courtroom,
659
00:30:39,968 --> 00:30:42,188
48-year-old Paul Zilly
660
00:30:42,231 --> 00:30:44,101
is about to receive
his sentence.
661
00:30:44,146 --> 00:30:46,056
The stakes are pretty high:
662
00:30:46,105 --> 00:30:49,275
he'll either be sent to prison
or be given probation.
663
00:30:49,325 --> 00:30:52,755
A significant factor guiding
the judge's decision
664
00:30:52,807 --> 00:30:55,287
will be whether or
not he is likely to commit
665
00:30:55,331 --> 00:30:57,811
another crime in the future.
666
00:30:57,856 --> 00:30:59,766
Morgan: He had been arrested a
few months earlier
667
00:30:59,814 --> 00:31:01,994
for stealing a lawn mower
and some other tools.
668
00:31:02,034 --> 00:31:04,214
And he pleaded guilty
to all of the charges.
669
00:31:04,253 --> 00:31:06,603
Narrator: Before his appearance
in court,
670
00:31:06,647 --> 00:31:09,687
the county prosecutor
offered him a plea deal:
671
00:31:09,737 --> 00:31:12,437
One year in jail and
follow-up supervision
672
00:31:12,479 --> 00:31:15,179
to make sure that he
doesn't reoffend.
673
00:31:15,221 --> 00:31:17,661
Aitken: His court-appointed
attorney agrees to the terms,
674
00:31:17,701 --> 00:31:19,921
saying that a long jail term
isn't in his client's
675
00:31:19,965 --> 00:31:22,005
best interest because he's
not a career criminal...
676
00:31:22,054 --> 00:31:23,884
In other words, the attorney
doesn't think
677
00:31:23,925 --> 00:31:25,575
it's likely he will reoffend.
678
00:31:25,622 --> 00:31:27,892
Narrator: Unfortunately,
it doesn't turn out
679
00:31:27,929 --> 00:31:29,579
the way he expects.
680
00:31:29,626 --> 00:31:33,016
Wisconsin judges are now looking
to a new tool
681
00:31:33,065 --> 00:31:35,755
to help determine if criminals
will reoffend:
682
00:31:35,806 --> 00:31:38,636
An artificial intelligence
risk-assessment program
683
00:31:38,679 --> 00:31:40,939
that is designed to predict
future behavior
684
00:31:40,986 --> 00:31:42,986
of convicted criminals.
685
00:31:43,031 --> 00:31:45,251
Morgan: Wisconsin is
one of the first US states
686
00:31:45,294 --> 00:31:47,604
to integrate it into their
criminal justice system.
687
00:31:47,644 --> 00:31:49,394
Narrator: To Zilly's surprise,
688
00:31:49,429 --> 00:31:51,779
it has rated him 'High Risk'
689
00:31:51,822 --> 00:31:54,222
for committing violent
crime in the future
690
00:31:54,260 --> 00:31:57,350
and 'Medium Risk' overall
as a potential reoffender.
691
00:31:57,393 --> 00:31:59,793
Based on the algorithm's
prediction,
692
00:31:59,830 --> 00:32:02,790
the judge overturns the
prosecution's plea deal
693
00:32:02,833 --> 00:32:06,323
and sentences Zilly
to two years in county jail.
694
00:32:06,359 --> 00:32:07,529
[gavel strikes]
695
00:32:07,577 --> 00:32:08,797
Aitken: He is
completely shocked.
696
00:32:08,839 --> 00:32:10,539
The judge doubled
his prison time.
697
00:32:10,580 --> 00:32:13,710
He thought he was only
going to serve a year.
698
00:32:13,757 --> 00:32:16,537
Narrator: Zilly is adamant
he won't reoffend,
699
00:32:16,586 --> 00:32:19,976
but some people working within
the justice system are confident
700
00:32:20,025 --> 00:32:22,845
that AI can accurately and
fairly predict
701
00:32:22,897 --> 00:32:26,287
if someone will commit
a crime in the future.
702
00:32:26,335 --> 00:32:27,595
Badminton: Over the
past few years
703
00:32:27,641 --> 00:32:29,951
several tech companies
have used AI
704
00:32:29,991 --> 00:32:31,991
in the development of
'Risk Assessment' software
705
00:32:32,037 --> 00:32:35,207
that can be licensed by
various judicial systems.
706
00:32:35,257 --> 00:32:37,647
Narrator: The software's
big selling point
707
00:32:37,694 --> 00:32:40,354
is that it's able to mimic
the problem-solving
708
00:32:40,393 --> 00:32:43,443
and decision-making capabilities
of the human mind.
709
00:32:43,483 --> 00:32:45,703
Using machine-learning
algorithms,
710
00:32:45,746 --> 00:32:48,966
it analyzes existing data to
detect patterns
711
00:32:49,010 --> 00:32:52,270
and predict the likelihood that
crimes will occur in the future.
712
00:32:52,318 --> 00:32:55,188
Morgan: It's kind of like how a
bookie determines the odds
713
00:32:55,234 --> 00:32:57,764
for a sporting event or how
pollsters figure out
714
00:32:57,801 --> 00:32:59,671
who might win an election.
715
00:32:59,716 --> 00:33:02,546
Narrator: AI risk assessment
programs are now being used
716
00:33:02,589 --> 00:33:05,939
by at least 16 different
European countries
717
00:33:05,984 --> 00:33:08,254
and almost every US state.
718
00:33:08,290 --> 00:33:10,640
Aitken: Legal agencies are
seriously understaffed,
719
00:33:10,684 --> 00:33:12,694
courts are deluged
with criminal cases,
720
00:33:12,729 --> 00:33:15,299
and so there is a backlog of
cases waiting to be heard.
721
00:33:15,341 --> 00:33:17,001
So while awaiting trial,
722
00:33:17,038 --> 00:33:18,518
people have to sit in prison,
723
00:33:18,561 --> 00:33:21,131
which only contributes
to overcrowding.
724
00:33:21,173 --> 00:33:23,393
Narrator: The AI
programs are designed to help
725
00:33:23,436 --> 00:33:25,436
alleviate these problems.
726
00:33:25,481 --> 00:33:28,141
Punitive decisions are
difficult to make,
727
00:33:28,180 --> 00:33:30,660
and made more so when
judges are flooded
728
00:33:30,704 --> 00:33:32,884
with many complex cases
that require
729
00:33:32,923 --> 00:33:35,103
a lot of knowledge and context.
730
00:33:35,143 --> 00:33:38,323
The hope is that AI
will help judges,
731
00:33:38,364 --> 00:33:42,804
making the process more accurate
and more efficient.
732
00:33:42,846 --> 00:33:44,806
Badminton: It's not easy.
The reality is that they may
733
00:33:44,848 --> 00:33:46,848
end up making a terrible mistake
734
00:33:46,894 --> 00:33:50,074
by granting a dangerous criminal
parole or on the flip-side,
735
00:33:50,115 --> 00:33:54,025
sentencing a person to prison
when probation would be better.
736
00:33:54,075 --> 00:33:56,075
So essentially the software
is being used
737
00:33:56,121 --> 00:33:58,781
to minimise these
kinds of errors.
738
00:33:58,819 --> 00:34:01,949
Narrator: One of America's
leading risk assessment firms
739
00:34:01,996 --> 00:34:04,826
claims that there are studies
proving that the technology
740
00:34:04,868 --> 00:34:07,778
is more accurate than human
judges in predicting
741
00:34:07,828 --> 00:34:10,608
a criminal's likelihood of
reoffending.
742
00:34:10,657 --> 00:34:14,007
In 2020, researchers at
Stanford University
743
00:34:14,052 --> 00:34:16,402
and UC Berkeley in California
744
00:34:16,445 --> 00:34:19,225
discovered that when assessing
things as complex
745
00:34:19,274 --> 00:34:21,414
as a criminal justice system,
746
00:34:21,450 --> 00:34:24,240
AI is up to 30% more accurate
747
00:34:24,279 --> 00:34:27,629
with its decisions than the
judges they surveyed.
748
00:34:27,674 --> 00:34:29,634
Morgan: Critics challenge
these claims; they say
749
00:34:29,676 --> 00:34:31,716
that there is not enough
evidence to prove that
750
00:34:31,765 --> 00:34:35,065
this technology can actually
improve decision-making.
751
00:34:35,116 --> 00:34:38,246
Narrator: In one example,
a 54-year-old Florida man
752
00:34:38,293 --> 00:34:39,903
with an extensive
criminal record
753
00:34:39,947 --> 00:34:41,987
involving aggravated assault,
754
00:34:42,036 --> 00:34:44,946
multiple thefts and
felony drug trafficking,
755
00:34:44,995 --> 00:34:47,775
was arrested for shoplifting
and, surprisingly,
756
00:34:47,824 --> 00:34:50,964
the algorithm rated him
'low risk' for reoffending.
757
00:34:51,001 --> 00:34:52,701
Aitken: Judging by
his criminal history,
758
00:34:52,742 --> 00:34:54,402
you would probably
think the opposite.
759
00:34:54,440 --> 00:34:55,920
But this could
indicate a problem
760
00:34:55,963 --> 00:34:57,753
in how the software
assesses risk.
761
00:34:57,791 --> 00:35:01,191
Narrator: It may also indicate
Paul Zilly is not actually
762
00:35:01,229 --> 00:35:05,969
at 'High Risk' of reoffending
and was unfairly sentenced.
763
00:35:06,016 --> 00:35:07,406
Badminton: The algorithms
look at police records
764
00:35:07,453 --> 00:35:09,463
and court documents
to see if the individual
765
00:35:09,498 --> 00:35:11,668
has any prior arrests
or convictions.
766
00:35:11,718 --> 00:35:14,458
Those reports also present
other relevant information
767
00:35:14,503 --> 00:35:16,553
to the algorithm,
like for example,
768
00:35:16,592 --> 00:35:19,512
if the individual has
a history of substance abuse.
769
00:35:19,552 --> 00:35:22,292
Narrator: It turns out,
Paul Zilly has a history
770
00:35:22,337 --> 00:35:24,507
of drug abuse.
Before he was arrested,
771
00:35:24,557 --> 00:35:26,427
he was struggling
with an addiction
772
00:35:26,472 --> 00:35:28,602
to crystal methamphetamine.
773
00:35:28,648 --> 00:35:31,128
He had told police he intended
to sell the items he stole
774
00:35:31,172 --> 00:35:33,092
to fuel his drug habit.
775
00:35:33,131 --> 00:35:34,961
Aitken: This may well
have contributed to him
776
00:35:35,002 --> 00:35:37,002
being rated high risk
by the algorithm.
777
00:35:37,047 --> 00:35:39,137
Drug addiction is
often related to crime,
778
00:35:39,180 --> 00:35:40,790
as the desperate need
for a substance
779
00:35:40,834 --> 00:35:42,534
leads to desperate measures.
780
00:35:42,575 --> 00:35:46,135
Narrator: The algorithm also
analyses the responses to the
781
00:35:46,187 --> 00:35:49,277
questionnaire Zilly filled out
while he was incarcerated.
782
00:35:49,321 --> 00:35:52,931
It consists of 137 questions
that help determine
783
00:35:52,976 --> 00:35:56,236
if a person is at risk of
reoffending.
784
00:35:56,284 --> 00:35:58,634
Badminton: Some of the questions
are serious ethical quandaries,
785
00:35:58,678 --> 00:36:00,848
for example,
"Does a hungry person
786
00:36:00,897 --> 00:36:02,767
have the right to access food?"
787
00:36:02,812 --> 00:36:05,512
Whereas others are based on
one's subjective opinion,
788
00:36:05,554 --> 00:36:08,824
like "If people make me angry,
I can be dangerous."
789
00:36:08,862 --> 00:36:12,042
It seems some of these questions
may not have a clear answer,
790
00:36:12,082 --> 00:36:15,562
in which case,
why are they using them?
791
00:36:15,608 --> 00:36:17,218
Narrator: According to
Zilly's risk assessment,
792
00:36:17,262 --> 00:36:19,702
he scored poorly on the
questionnaire;
793
00:36:19,742 --> 00:36:21,832
combined with his history of
drug abuse,
794
00:36:21,875 --> 00:36:24,565
he was labelled 'High Risk'.
795
00:36:24,617 --> 00:36:26,397
Morgan: This may help
explain why his sentence
796
00:36:26,445 --> 00:36:28,835
was upped from one year to two.
797
00:36:28,882 --> 00:36:31,582
In response, Zilly's
court-appointed attorney
798
00:36:31,624 --> 00:36:34,104
filed an appeal,
trying to reduce the sentence.
799
00:36:34,148 --> 00:36:37,328
Narrator: There is concern that
the questionnaire may also be
800
00:36:37,369 --> 00:36:40,849
biased in what it specifically
asks of individuals,
801
00:36:40,894 --> 00:36:43,514
like if they are employed,
where they live
802
00:36:43,549 --> 00:36:46,249
and what the crime levels are
like in their neighborhood.
803
00:36:46,291 --> 00:36:48,381
Critics say this could result
in the algorithms
804
00:36:48,423 --> 00:36:51,823
making assessments that
are discriminatory.
805
00:36:51,861 --> 00:36:53,391
Badminton: Since poorer
neighbourhoods often have
806
00:36:53,428 --> 00:36:55,558
higher crime rates
than more wealthy ones,
807
00:36:55,604 --> 00:36:58,044
the algorithm may assume
its residents
808
00:36:58,085 --> 00:36:59,695
are at a greater risk of
committing a crime
809
00:36:59,739 --> 00:37:03,399
than if they were living
in a rich area.
810
00:37:03,438 --> 00:37:05,698
Narrator: Civil rights activists
believe the technology
811
00:37:05,745 --> 00:37:08,225
could unfairly flag
people of colour,
812
00:37:08,269 --> 00:37:10,879
who, statistically, live in
poorer neighbourhoods
813
00:37:10,924 --> 00:37:12,624
with higher crime rates.
814
00:37:12,665 --> 00:37:16,795
And there is compelling evidence
that it is already happening.
815
00:37:16,843 --> 00:37:19,453
Aitken: Recently, a study of the
AI program that had provided
816
00:37:19,498 --> 00:37:22,198
risk scores to offenders in
Broward County, Florida,
817
00:37:22,240 --> 00:37:24,500
found the algorithm incorrectly
flagged black defendants
818
00:37:24,546 --> 00:37:25,976
as future criminals at almost
819
00:37:26,026 --> 00:37:28,026
double the rate
of white defendants.
820
00:37:28,071 --> 00:37:30,601
And white defendants were
incorrectly assessed as low risk
821
00:37:30,639 --> 00:37:33,859
to offend more often than
their black counterparts.
822
00:37:33,903 --> 00:37:36,253
Narrator: Critics cite
the case of an 18-year-old
823
00:37:36,297 --> 00:37:39,207
African-American woman who
was arrested for burglary
824
00:37:39,257 --> 00:37:41,257
in Ft. Lauderdale, Florida.
825
00:37:41,302 --> 00:37:43,652
Despite being a
first-time offender,
826
00:37:43,696 --> 00:37:46,476
the algorithm rated
her 'high risk'.
827
00:37:46,525 --> 00:37:48,345
Morgan: Compare this
to the previous summer,
828
00:37:48,396 --> 00:37:50,786
when an older white man from the
same area was arrested
829
00:37:50,833 --> 00:37:54,623
and rated low risk,
despite having been
830
00:37:54,663 --> 00:37:57,453
previously convicted
of armed robbery.
831
00:37:57,492 --> 00:38:00,322
Narrator: This leads civil
rights activists to conclude
832
00:38:00,365 --> 00:38:02,315
that risk assessment programs
833
00:38:02,367 --> 00:38:05,367
may be perpetuating
existing biases,
834
00:38:05,413 --> 00:38:07,763
further compounding
prejudice and racism
835
00:38:07,807 --> 00:38:09,457
in the justice system.
836
00:38:09,504 --> 00:38:12,514
But the companies that license
their software to several US
837
00:38:12,551 --> 00:38:15,641
state justice systems claim that
a person's race
838
00:38:15,684 --> 00:38:18,774
isn't a factor in the
algorithms' risk assessment.
839
00:38:18,818 --> 00:38:22,518
AI advocates believe the
opposite, that the technology
840
00:38:22,561 --> 00:38:24,781
actually makes the criminal
justice system
841
00:38:24,824 --> 00:38:26,704
fairer for people of colour.
842
00:38:26,739 --> 00:38:29,829
Aitken: They say that it cuts
the human, or biased, factor out,
843
00:38:29,872 --> 00:38:32,012
meaning that it should give a
more objective,
844
00:38:32,048 --> 00:38:34,088
less biased evaluation
of each person.
845
00:38:34,137 --> 00:38:35,917
Badminton: But as we just
saw in Florida,
846
00:38:35,965 --> 00:38:37,745
this isn't always the case.
847
00:38:37,793 --> 00:38:41,883
The algorithm perpetuates
existing biases.
848
00:38:41,928 --> 00:38:43,888
Narrator: At Paul Zilly's
appeal hearing,
849
00:38:43,930 --> 00:38:46,410
his lawyer questions
Dr. Tim Brennan,
850
00:38:46,454 --> 00:38:48,894
one of the creators
of the AI software
851
00:38:48,935 --> 00:38:50,625
that assessed his client.
852
00:38:50,676 --> 00:38:52,626
Morgan: Brennan testifies that
his software
853
00:38:52,678 --> 00:38:55,118
wasn't designed to be used in
sentencing.
854
00:38:55,158 --> 00:38:56,378
In fact, he didn't want it
involved
855
00:38:56,421 --> 00:38:58,471
in the criminal justice
system at all.
856
00:38:58,510 --> 00:39:01,120
Its purpose was
to help reduce crime,
857
00:39:01,164 --> 00:39:04,394
not to further punish
people like Paul Zilly.
858
00:39:04,429 --> 00:39:06,299
Narrator: In light of
Brennan's testimony,
859
00:39:06,344 --> 00:39:09,964
the judge reduces Zilly's
sentence to 18 months, admitting
860
00:39:09,999 --> 00:39:12,569
that he may have put too much
faith in the algorithm.
861
00:39:12,611 --> 00:39:14,661
Badminton: Here is a case
where the judicial system
862
00:39:14,700 --> 00:39:17,310
relied far too heavily
on this technology,
863
00:39:17,355 --> 00:39:20,525
leading the judge to make an
unfair sentencing decision.
864
00:39:20,575 --> 00:39:23,795
And unfortunately, it's probably
safe to assume
865
00:39:23,839 --> 00:39:27,149
that this isn't the only case
where this has happened.
866
00:39:27,190 --> 00:39:28,930
Narrator: To attempt
to remedy this,
867
00:39:28,975 --> 00:39:32,975
several civil rights lawyers,
UN officials and labor unions
868
00:39:33,022 --> 00:39:35,722
are now lobbying for more
government regulation
869
00:39:35,764 --> 00:39:38,334
of AI's use within
the legal system.
870
00:39:38,376 --> 00:39:39,766
Aitken: It's a
civil rights issue.
871
00:39:39,812 --> 00:39:42,682
Because the technology
can perpetuate biases,
872
00:39:42,728 --> 00:39:45,338
the use of algorithmic tools
in courtrooms can lead to
873
00:39:45,383 --> 00:39:47,693
violations of a person's right
to fair sentencing.
874
00:39:47,733 --> 00:39:49,823
Narrator: Despite
AI's inherent problems,
875
00:39:49,865 --> 00:39:53,035
several legal analysts believe
its potential for good
876
00:39:53,086 --> 00:39:56,306
far outweighs the harm
it may cause.
877
00:39:56,350 --> 00:39:57,920
Morgan:
The State of Virginia claims
878
00:39:57,960 --> 00:40:00,140
that they've managed to cut down
on their prison populations
879
00:40:00,180 --> 00:40:02,840
by 26% using these algorithms.
880
00:40:02,878 --> 00:40:05,928
They say they've been able to do
so by releasing people early
881
00:40:05,968 --> 00:40:08,488
who are at 'low risk' of
reoffending.
882
00:40:08,536 --> 00:40:10,446
Aitken: Unburdening
the justice system,
883
00:40:10,495 --> 00:40:12,845
while providing people with a
chance to rebuild their lives
884
00:40:12,888 --> 00:40:15,938
outside of prison, is obviously
a benefit to everyone involved.
885
00:40:15,978 --> 00:40:17,888
The question, as always, is
886
00:40:17,937 --> 00:40:19,937
whether we can live with the
negative consequences
887
00:40:19,982 --> 00:40:21,902
of employing this technology.
888
00:40:21,941 --> 00:40:25,681
Is it worth it if even one
person is sentenced unjustly?
889
00:40:25,727 --> 00:40:27,247
Maybe not.
890
00:40:27,294 --> 00:40:30,084
Narrator: The case of Paul Zilly
clearly illustrates
891
00:40:30,123 --> 00:40:33,613
the inherent perils of allowing
algorithms to make decisions
892
00:40:33,648 --> 00:40:36,958
regarding something as important
as a person's freedom.
893
00:40:36,999 --> 00:40:39,179
And while it's still unknown
what future impact
894
00:40:39,219 --> 00:40:41,439
AI will have on our
legal systems,
895
00:40:41,482 --> 00:40:44,312
as more courtrooms adopt
this technology,
896
00:40:44,354 --> 00:40:47,624
and barring a proper framework
regulating its use,
897
00:40:47,662 --> 00:40:50,882
it's likely that injustices
will continue
898
00:40:50,926 --> 00:40:54,146
and the controversy surrounding
it will further intensify.
899
00:40:54,190 --> 00:40:56,020
♪
900
00:40:56,062 --> 00:40:59,852
♪ [show theme music]
901
00:40:59,892 --> 00:41:03,642
♪♪
902
00:41:03,678 --> 00:41:07,068
♪
903
00:41:07,116 --> 00:41:09,076
Narrator: At the Vellore
Institute of Technology
904
00:41:09,118 --> 00:41:10,768
in southern India,
905
00:41:10,816 --> 00:41:13,166
19-year-old Priyanjali Gupta
906
00:41:13,209 --> 00:41:15,339
is a 2nd year
engineering student
907
00:41:15,385 --> 00:41:18,125
specialising in data science.
908
00:41:18,171 --> 00:41:22,091
In February 2021, she decides to
take the weekend off
909
00:41:22,131 --> 00:41:24,871
from her studies and visit
her mother in New Delhi.
910
00:41:24,917 --> 00:41:28,617
While there, Priyanjali
confides something to her.
911
00:41:28,660 --> 00:41:30,660
Pringle: She's been thinking
about trying to develop
912
00:41:30,705 --> 00:41:32,965
some kind of new technology
that can help people.
913
00:41:33,012 --> 00:41:35,012
But she doesn't know what
specifically.
914
00:41:35,057 --> 00:41:38,227
Narrator: One day, Priyanjali
has an epiphany of sorts
915
00:41:38,278 --> 00:41:40,278
and realizes that
virtual assistants
916
00:41:40,323 --> 00:41:43,893
such as Alexa and Siri,
which rely on voice commands,
917
00:41:43,936 --> 00:41:46,496
are not accessible to people
who are deaf...
918
00:41:46,547 --> 00:41:49,027
So she sets out to
create an application
919
00:41:49,071 --> 00:41:51,771
that will be inclusive of people
with hearing disabilities,
920
00:41:51,813 --> 00:41:53,693
using artificial intelligence.
921
00:41:53,728 --> 00:41:55,558
Aitken: She wants
to invent an AI program
922
00:41:55,600 --> 00:41:57,470
that can translate
visual 'sign language'
923
00:41:57,515 --> 00:42:01,605
into English text
and do so in real-time.
924
00:42:01,649 --> 00:42:03,829
Narrator: There are many
different forms of sign language
925
00:42:03,869 --> 00:42:06,999
but Priyanjali's AI
application translates
926
00:42:07,046 --> 00:42:10,956
the most commonly used form,
American Sign Language, or ASL.
927
00:42:11,006 --> 00:42:13,486
It's used by
around 500,000 people
928
00:42:13,531 --> 00:42:15,451
in the US and Canada.
929
00:42:15,489 --> 00:42:18,579
Priyanjali is hopeful she can
get the software to work,
930
00:42:18,623 --> 00:42:20,543
but it will be no easy feat.
931
00:42:20,581 --> 00:42:24,321
It's a complicated
and technical process.
932
00:42:24,367 --> 00:42:25,887
Aitken: The process is called
'Deep Learning'.
933
00:42:25,934 --> 00:42:28,024
It's where the AI is
trained to perform tasks
934
00:42:28,067 --> 00:42:29,807
by analyzing
large amounts of data,
935
00:42:29,851 --> 00:42:32,111
much like
how human beings learn.
936
00:42:32,158 --> 00:42:34,898
And it does so using what's
called a "neural network."
937
00:42:34,943 --> 00:42:37,343
The more data the algorithms
are able to analyse,
938
00:42:37,380 --> 00:42:39,300
the more they will "learn"
and the more accurate
939
00:42:39,339 --> 00:42:41,119
they will become
in their analysis.
940
00:42:41,167 --> 00:42:43,997
Narrator: In Priyanjali's case,
her machine will need to analyze
941
00:42:44,039 --> 00:42:45,869
different sign language gestures
942
00:42:45,911 --> 00:42:48,041
to be able to learn
what they mean.
943
00:42:48,087 --> 00:42:51,257
It's in the development stages,
but it looks promising.
944
00:42:51,307 --> 00:42:53,477
Some 70 million people
around the world
945
00:42:53,527 --> 00:42:55,487
use sign language
to communicate,
946
00:42:55,529 --> 00:42:58,659
so it's crucial that
technologies like this exist.
947
00:42:58,706 --> 00:43:00,266
Morgan: And it isn't just
limited to people
948
00:43:00,316 --> 00:43:01,926
with hearing disabilities.
949
00:43:01,970 --> 00:43:03,930
AI can help people who are blind
or have
950
00:43:03,972 --> 00:43:06,452
other physical or cognitive
disabilities.
951
00:43:06,496 --> 00:43:08,756
Pringle: The technology has the
potential to provide
952
00:43:08,803 --> 00:43:12,463
more independence in their
day-to-day lives.
953
00:43:12,502 --> 00:43:14,772
Narrator: There are already many
AI powered tools
954
00:43:14,809 --> 00:43:16,639
available to disabled people.
955
00:43:16,681 --> 00:43:19,901
People with visual impairments
can access talking keyboards,
956
00:43:19,945 --> 00:43:22,685
and use various virtual
assistants like 'Siri'
957
00:43:22,730 --> 00:43:28,080
and 'Alexa' to perform a web
search or write an email.
958
00:43:28,127 --> 00:43:29,867
Pringle: There are also
AI powered applications
959
00:43:29,911 --> 00:43:31,611
that can read out words
on a printed page,
960
00:43:31,652 --> 00:43:33,702
a computer screen or smartphone.
961
00:43:33,741 --> 00:43:35,921
They can even describe
what's on screen,
962
00:43:35,961 --> 00:43:39,051
such as application icons,
photo images and videos.
963
00:43:39,094 --> 00:43:41,364
Narrator: Artificial
intelligence is also having
964
00:43:41,401 --> 00:43:43,321
an impact on helping
those for whom
965
00:43:43,359 --> 00:43:46,059
communication can be
challenging.
966
00:43:46,101 --> 00:43:47,671
Morgan: People with
certain brain injuries
967
00:43:47,712 --> 00:43:50,672
or with conditions like
Parkinson's can have a hard time
968
00:43:50,715 --> 00:43:53,715
speaking in ways that are easy
for others to understand.
969
00:43:53,761 --> 00:43:56,021
But AI algorithms
can take what they are saying
970
00:43:56,068 --> 00:43:58,458
and transform it into
audio or text files
971
00:43:58,505 --> 00:44:00,455
that are easier to understand.
972
00:44:00,507 --> 00:44:03,077
Narrator: The technology can
also assist people
973
00:44:03,118 --> 00:44:06,208
who may be unable
to speak at all.
974
00:44:06,252 --> 00:44:10,602
In Nebraska, Kaden Bowen is a
teenager with cerebral palsy,
975
00:44:10,648 --> 00:44:11,948
a condition
that prevents him
976
00:44:11,997 --> 00:44:14,607
from being able to walk or talk.
977
00:44:14,652 --> 00:44:18,662
To help him communicate, he uses
a rudimentary speaking device
978
00:44:18,699 --> 00:44:22,269
with buttons containing
preselected words or phrases.
979
00:44:22,311 --> 00:44:26,141
But recently he and his father
began using an Amazon Echo,
980
00:44:26,185 --> 00:44:30,315
a virtual assistant that uses AI
in its voice-control system.
981
00:44:30,363 --> 00:44:31,763
Aitken: Using his
speaking device,
982
00:44:31,799 --> 00:44:34,759
Kaden can have the Echo perform
tasks for him,
983
00:44:34,802 --> 00:44:36,982
like calling his family members
on their phones,
984
00:44:37,022 --> 00:44:38,982
and ask them, for example,
to take him for a car ride.
985
00:44:39,024 --> 00:44:41,724
It may seem small, but this
provides him with
986
00:44:41,766 --> 00:44:44,376
an ability to communicate
that he didn't have before.
987
00:44:44,420 --> 00:44:46,600
Narrator: One of the most
significant ways
988
00:44:46,640 --> 00:44:49,250
that technology is improving
disabled people's lives
989
00:44:49,295 --> 00:44:51,205
is in transportation.
990
00:44:51,253 --> 00:44:54,563
Mobility is often one of their
most challenging issues.
991
00:44:54,604 --> 00:44:58,174
But AI powered navigation tools
like 'Google Maps'
992
00:44:58,217 --> 00:45:01,087
can help them attain
more autonomy.
993
00:45:01,133 --> 00:45:03,833
Morgan: Apps like these utilize
GPS technology to make it
994
00:45:03,875 --> 00:45:06,225
really easy to visualise the
route you need to take,
995
00:45:06,268 --> 00:45:08,968
all while providing information
about accessibility,
996
00:45:09,010 --> 00:45:12,410
like where ramps
or elevators are.
997
00:45:12,448 --> 00:45:14,838
Narrator: The recent advancement
of self-driving cars
998
00:45:14,886 --> 00:45:17,756
is also a potentially
significant development.
999
00:45:17,802 --> 00:45:19,762
Aitken: People with disabilities
that prevent them
1000
00:45:19,804 --> 00:45:21,894
from being able to drive,
might be able to
1001
00:45:21,936 --> 00:45:24,026
use self-driving cars to
get around on their own,
1002
00:45:24,069 --> 00:45:25,719
providing them
with a degree of independence
1003
00:45:25,766 --> 00:45:28,416
they may not have had before.
1004
00:45:28,464 --> 00:45:31,254
Narrator: Despite AI's many
positive benefits,
1005
00:45:31,293 --> 00:45:33,733
there are experts who are
raising questions about
1006
00:45:33,774 --> 00:45:36,654
its potential limitations,
particularly
1007
00:45:36,690 --> 00:45:40,220
around the technology's
'financial accessibility'.
1008
00:45:40,259 --> 00:45:43,089
According to recent data,
roughly 26 percent
1009
00:45:43,131 --> 00:45:45,181
of US citizens with disabilities
1010
00:45:45,220 --> 00:45:47,660
are currently living in poverty,
1011
00:45:47,701 --> 00:45:50,051
a rate nearly two and a half times
higher than that of people
1012
00:45:50,095 --> 00:45:51,655
who aren't disabled.
1013
00:45:51,705 --> 00:45:53,785
And it's more or less the
same for people
1014
00:45:53,838 --> 00:45:56,278
living in EU countries.
1015
00:45:56,318 --> 00:45:58,318
Morgan: They may be unable to
find work that pays
1016
00:45:58,364 --> 00:46:01,024
a decent wage, if they are
able to work at all.
1017
00:46:01,062 --> 00:46:03,502
They might not have financial
support from friends or family
1018
00:46:03,543 --> 00:46:05,373
and any government
disability funding
1019
00:46:05,414 --> 00:46:08,204
might not provide them
enough to live on.
1020
00:46:08,243 --> 00:46:10,463
Pringle: So owing to their
economic insecurity,
1021
00:46:10,506 --> 00:46:12,416
they may not have
the money to spend
1022
00:46:12,465 --> 00:46:14,675
on cutting edge technology
or software,
1023
00:46:14,728 --> 00:46:17,208
leaving them unable
to benefit from it.
1024
00:46:17,252 --> 00:46:20,082
Narrator: This situation could
be even more challenging
1025
00:46:20,125 --> 00:46:22,555
in developing nations where
poverty rates are higher
1026
00:46:22,605 --> 00:46:25,295
and the median income is lower.
1027
00:46:25,347 --> 00:46:26,827
Aitken: Problems with
access to technology
1028
00:46:26,871 --> 00:46:28,441
are sure to be
difficult to address,
1029
00:46:28,481 --> 00:46:30,311
because it is
systemic in nature.
1030
00:46:30,352 --> 00:46:32,222
Meaning that there are many
contributing factors
1031
00:46:32,267 --> 00:46:33,967
and reasons as to why it exists,
1032
00:46:34,008 --> 00:46:36,618
making it all the more
difficult to solve.
1033
00:46:36,663 --> 00:46:40,363
Morgan: But some are trying.
1034
00:46:40,406 --> 00:46:42,756
One company is setting up
an AI interface to help
1035
00:46:42,800 --> 00:46:45,930
people with disabilities find
employment opportunities.
1036
00:46:45,977 --> 00:46:49,457
It can browse job search results
and even set up interviews,
1037
00:46:49,502 --> 00:46:52,202
then provide the individual
with interactive
1038
00:46:52,244 --> 00:46:55,994
voice response, chatbots,
and voice assistants.
1039
00:46:56,030 --> 00:46:59,690
Narrator: And recently,
Microsoft invested $25 million
1040
00:46:59,729 --> 00:47:02,249
in a global AI
Accessibility initiative,
1041
00:47:02,297 --> 00:47:04,557
funding projects that
develop software
1042
00:47:04,604 --> 00:47:06,654
and technologies for
disabled people,
1043
00:47:06,693 --> 00:47:08,433
aiming to improve their
independence
1044
00:47:08,477 --> 00:47:10,347
and quality of life.
1045
00:47:10,392 --> 00:47:12,702
Pringle: It's worth remembering
that as the technology
1046
00:47:12,742 --> 00:47:15,832
becomes more available, its
costs will also come down.
1047
00:47:15,876 --> 00:47:17,876
And so that will increase
its accessibility
1048
00:47:17,922 --> 00:47:20,492
to people with disabilities.
1049
00:47:20,533 --> 00:47:23,233
Narrator: Perhaps there is some
hope in the fact that there are
1050
00:47:23,275 --> 00:47:26,665
so many young AI developers
like Priyanjali Gupta.
1051
00:47:26,713 --> 00:47:29,283
On her webcam, she records
herself doing
1052
00:47:29,324 --> 00:47:32,634
several basic sign language
gestures.
1053
00:47:32,675 --> 00:47:34,755
Aitken: The AI software will
then interpret the motions
1054
00:47:34,808 --> 00:47:37,248
and translate them into
readable English text.
1055
00:47:37,289 --> 00:47:39,899
Narrator: The project is still
in its initial phases
1056
00:47:39,944 --> 00:47:42,424
and faces some technical
limitations.
1057
00:47:42,468 --> 00:47:45,208
But maybe with time, it can
become a full-fledged
1058
00:47:45,253 --> 00:47:48,173
on screen translator
of sign language.
1059
00:47:48,213 --> 00:47:50,083
Pringle: Thankfully, Gupta is
not the only one
1060
00:47:50,128 --> 00:47:52,258
developing AI programs
to help people
1061
00:47:52,304 --> 00:47:55,354
with hearing disabilities
or impairments.
1062
00:47:55,394 --> 00:47:57,924
Narrator: Several tech companies
are developing smartphone
1063
00:47:57,962 --> 00:48:01,492
applications that use the phone's
camera to lip-read.
1064
00:48:01,530 --> 00:48:04,580
There are also AI applications
that utilize
1065
00:48:04,620 --> 00:48:07,670
Automated Speech Recognition,
or ASR,
1066
00:48:07,710 --> 00:48:09,890
which can transcribe the
conversation
1067
00:48:09,930 --> 00:48:13,190
of a group of people
in real-time.
1068
00:48:13,238 --> 00:48:14,498
Morgan: Something like that
could help people with
1069
00:48:14,543 --> 00:48:16,723
hearing disabilities to be
included in a conversation
1070
00:48:16,763 --> 00:48:18,853
without even needing
to read lips.
1071
00:48:18,896 --> 00:48:21,506
The icing on the cake is that
these algorithms can add things
1072
00:48:21,550 --> 00:48:25,340
like punctuation and the name of
the person who's speaking.
1073
00:48:25,380 --> 00:48:27,340
Narrator: Internet
accessibility is also
1074
00:48:27,382 --> 00:48:30,172
a significant issue
for disabled people.
1075
00:48:30,211 --> 00:48:33,211
While some websites are now
optimizing their platforms
1076
00:48:33,258 --> 00:48:35,128
to allow visually impaired
individuals
1077
00:48:35,173 --> 00:48:37,443
to adjust the font size
and colour,
1078
00:48:37,479 --> 00:48:39,659
to be more easily seen and read,
1079
00:48:39,699 --> 00:48:43,269
a recent study showed that
98% of the world's
1080
00:48:43,311 --> 00:48:46,921
top one million websites don't
offer full accessibility.
1081
00:48:46,967 --> 00:48:48,657
And there are serious concerns
1082
00:48:48,708 --> 00:48:50,968
about the ones that
are accessible,
1083
00:48:51,015 --> 00:48:54,185
specifically regarding
online privacy.
1084
00:48:54,235 --> 00:48:56,235
Morgan: Many of these
tools are cloud-based,
1085
00:48:56,281 --> 00:48:58,241
so it's possible that
information about
1086
00:48:58,283 --> 00:49:01,293
a person's disability could be
obtained by a third party.
1087
00:49:01,329 --> 00:49:04,589
In addition to being a huge
violation of privacy,
1088
00:49:04,637 --> 00:49:06,937
information like this falling
into the wrong hands
1089
00:49:06,987 --> 00:49:08,817
leaves the door open
to things like
1090
00:49:08,858 --> 00:49:11,038
discrimination or
social exclusion,
1091
00:49:11,078 --> 00:49:13,078
or even just online bullying.
1092
00:49:13,124 --> 00:49:16,654
Narrator: But many disabled
people are embracing technology,
1093
00:49:16,692 --> 00:49:19,432
believing its ability to
help them far outweighs
1094
00:49:19,478 --> 00:49:21,608
any potential problems
it may cause,
1095
00:49:21,654 --> 00:49:23,964
and for people like
Priyanjali Gupta,
1096
00:49:24,004 --> 00:49:27,014
this is all the
encouragement they need.
1097
00:49:27,051 --> 00:49:29,921
Morgan: Gupta is currently
able to get her AI technology
1098
00:49:29,967 --> 00:49:33,267
to translate six different sign
language gestures into English:
1099
00:49:33,318 --> 00:49:36,758
"Yes", "No",
"Please", "Thank You",
1100
00:49:36,799 --> 00:49:39,499
"Hello" and "I Love You".
1101
00:49:39,541 --> 00:49:42,851
Narrator: Now in her 3rd year,
the 21-year-old
1102
00:49:42,892 --> 00:49:45,942
university student is
researching a new neural network
1103
00:49:45,983 --> 00:49:49,293
that will improve the video
analysis done by the AI.
1104
00:49:49,334 --> 00:49:51,994
And she's also trying to
secure additional funding
1105
00:49:52,032 --> 00:49:54,122
in order to make improvements.
1106
00:49:54,165 --> 00:49:56,645
Aitken: She sees her invention
as a small, but very important
1107
00:49:56,689 --> 00:49:59,389
step in helping people
struggling with disabilities.
1108
00:49:59,431 --> 00:50:02,301
Narrator: As technology
continues to be integrated
1109
00:50:02,347 --> 00:50:05,567
into the day-to-day lives of
people with disabilities,
1110
00:50:05,611 --> 00:50:08,571
there is real hope that it has
the potential to help them
1111
00:50:08,614 --> 00:50:11,314
live more independent and
fulfilling lives.
1112
00:50:11,356 --> 00:50:13,916
And as long as the Priyanjali
Guptas of the world
1113
00:50:13,967 --> 00:50:16,097
are out there using their
expertise to develop
1114
00:50:16,143 --> 00:50:18,673
new and innovative applications,
1115
00:50:18,711 --> 00:50:21,411
the future looks
more accessible than ever.