1
00:00:23,327 --> 00:00:25,287
Narrator: In Detroit, USA,
2
00:00:25,329 --> 00:00:27,769
a man is dragged away
in front of his family,
3
00:00:27,810 --> 00:00:30,470
as police use facial-
recognition technology
4
00:00:30,508 --> 00:00:32,818
to arrest a suspected thief.
5
00:00:32,858 --> 00:00:34,248
Mhairi Aitken: Williams
repeatedly denies
6
00:00:34,295 --> 00:00:36,165
that the man in the security
images is him,
7
00:00:36,210 --> 00:00:39,470
but his protests
fall on deaf ears.
8
00:00:39,517 --> 00:00:41,387
Narrator: Unable to
cope with his grief,
9
00:00:41,432 --> 00:00:44,042
a Canadian writer
uses an AI chat-bot
10
00:00:44,087 --> 00:00:46,387
to connect with
a lost loved one.
11
00:00:46,437 --> 00:00:48,477
Ramona Pringle: The algorithms
use massive text datasets
12
00:00:48,526 --> 00:00:50,046
to formulate the right
combination of words
13
00:00:50,093 --> 00:00:52,273
in response to a prompt.
14
00:00:52,313 --> 00:00:53,793
Aitken: With the help of
Artificial Intelligence,
15
00:00:53,836 --> 00:00:56,876
he is literally chatting
with her ghost.
16
00:00:56,926 --> 00:00:59,056
Narrator: A bedroom in a
house in London, England,
17
00:00:59,102 --> 00:01:01,022
becomes the epicenter
of potentially
18
00:01:01,061 --> 00:01:05,331
the biggest financial
crash since 2008.
19
00:01:05,369 --> 00:01:06,369
Nikolas Badminton:
It happens quickly.
20
00:01:06,414 --> 00:01:08,504
There are huge sell-offs
in security stocks
21
00:01:08,546 --> 00:01:10,546
and the market drops almost 10%.
22
00:01:10,592 --> 00:01:13,642
Millions are lost in
just over 36 minutes.
23
00:01:13,682 --> 00:01:15,512
Anthony Morgan: He was
kind of like a rock star.
24
00:01:15,553 --> 00:01:18,303
Anything he touched
turned to gold.
25
00:01:18,339 --> 00:01:22,299
Narrator: In Tanzania,
African park rangers use AI
26
00:01:22,343 --> 00:01:25,523
to identify and catch
illegal hunters.
27
00:01:25,563 --> 00:01:27,133
Kristopher Alexander:
Poaching is a huge problem
28
00:01:27,174 --> 00:01:28,784
in this part of Africa.
29
00:01:28,827 --> 00:01:30,257
Some reports have indicated that
30
00:01:30,307 --> 00:01:32,047
at least 200,000 animals
31
00:01:32,092 --> 00:01:33,752
are killed every year in
32
00:01:33,789 --> 00:01:35,969
the western Serengeti alone.
33
00:01:36,008 --> 00:01:37,178
Anthony Morgan:
These cameras are really
34
00:01:37,227 --> 00:01:38,787
remarkable pieces of technology.
35
00:01:38,837 --> 00:01:40,617
Like, they're the size of
your index finger.
36
00:01:40,665 --> 00:01:43,185
The image could be a
key piece of evidence,
37
00:01:43,233 --> 00:01:47,593
if the poachers were ever
caught and prosecuted.
38
00:01:47,629 --> 00:01:49,069
Narrator: These are the
stories of the future
39
00:01:49,109 --> 00:01:51,329
that big data is bringing
to our doorsteps.
40
00:01:51,372 --> 00:01:53,242
♪
41
00:01:53,287 --> 00:01:57,197
The real-world impact of
predictions and surveillance.
42
00:01:57,247 --> 00:01:58,637
The power of artificial
intelligence
43
00:01:58,683 --> 00:02:00,293
and autonomous machines.
44
00:02:00,337 --> 00:02:03,817
♪
45
00:02:03,862 --> 00:02:06,042
For better or worse,
46
00:02:06,082 --> 00:02:10,352
these are the Secrets
of Big Data.
47
00:02:10,391 --> 00:02:12,611
♪
48
00:02:12,654 --> 00:02:15,574
Detroit, Michigan.
49
00:02:15,613 --> 00:02:18,833
For the past two decades the
city's once-flourishing economy
50
00:02:18,877 --> 00:02:20,917
has been in serious decline,
51
00:02:20,966 --> 00:02:24,406
due to the collapse of
the US auto industry.
52
00:02:24,448 --> 00:02:29,188
In the aftermath, there has been
a significant increase in crime.
53
00:02:29,236 --> 00:02:31,496
At one of the city's remaining
auto-parts suppliers,
54
00:02:31,542 --> 00:02:34,632
41-year-old employee,
Robert Williams,
55
00:02:34,676 --> 00:02:36,236
a married father
of two young girls,
56
00:02:36,286 --> 00:02:38,896
is going about his usual day.
57
00:02:38,941 --> 00:02:41,201
Late in the afternoon,
he receives
58
00:02:41,248 --> 00:02:46,338
a surprising call on his
personal cell phone.
59
00:02:46,383 --> 00:02:47,733
Morgan: The person
on the other line
60
00:02:47,776 --> 00:02:49,596
informs him that
he is an officer,
61
00:02:49,647 --> 00:02:52,347
and he's calling to tell
Williams to turn himself in.
62
00:02:52,389 --> 00:02:55,779
Narrator: At first, Williams
thinks the call is a prank,
63
00:02:55,827 --> 00:02:58,477
but after the cop threatens to
come to his workplace
64
00:02:58,526 --> 00:03:01,006
to arrest him, Williams agrees
to meet the officer
65
00:03:01,050 --> 00:03:03,010
at home after his shift.
66
00:03:03,052 --> 00:03:05,012
Morgan: He's got no reason to
believe that this is real,
67
00:03:05,054 --> 00:03:06,144
and by the end of the day,
68
00:03:06,186 --> 00:03:08,576
he has forgotten about
the whole thing.
69
00:03:08,623 --> 00:03:11,023
Narrator: That evening, Williams
pulls into the driveway
70
00:03:11,060 --> 00:03:13,320
of a suburban home
and is startled
71
00:03:13,367 --> 00:03:15,717
when a police car quickly
pulls up behind him.
72
00:03:15,760 --> 00:03:18,630
Out of nowhere, two officers
rush his car
73
00:03:18,676 --> 00:03:21,026
and pull him from the vehicle.
74
00:03:21,070 --> 00:03:23,590
Morgan: Williams
has no idea what's going on.
75
00:03:23,638 --> 00:03:25,898
And to make matters worse,
his wife and child
76
00:03:25,944 --> 00:03:27,954
hear the commotion
and run outside.
77
00:03:27,990 --> 00:03:30,600
They watch in shock as
Williams is handcuffed
78
00:03:30,645 --> 00:03:32,775
right in front of them.
79
00:03:32,821 --> 00:03:34,521
Narrator: When Williams
asks the officers
80
00:03:34,562 --> 00:03:36,092
why he is being handcuffed,
81
00:03:36,128 --> 00:03:37,568
they present him with a warrant
82
00:03:37,608 --> 00:03:38,908
that has his picture on it.
83
00:03:38,957 --> 00:03:40,997
He is wanted for theft.
84
00:03:41,046 --> 00:03:43,566
The cops shove Williams
into their squad car,
85
00:03:43,614 --> 00:03:46,314
while his wife and kids
look on in tears.
86
00:03:46,356 --> 00:03:48,316
Aitken: This must be a
shocking thing to experience.
87
00:03:48,358 --> 00:03:50,748
Seeing your husband and your
father being arrested,
88
00:03:50,795 --> 00:03:52,095
and you don't understand why.
89
00:03:52,144 --> 00:03:55,934
And imagine how he's feeling.
90
00:03:55,974 --> 00:03:58,114
Narrator: At the precinct,
police book him,
91
00:03:58,150 --> 00:04:01,680
taking his mug shot,
fingerprints and DNA samples.
92
00:04:01,719 --> 00:04:03,849
In a state of disbelief,
93
00:04:03,895 --> 00:04:06,065
Williams pleads for
more information,
94
00:04:06,115 --> 00:04:08,155
but is told he'll have to wait.
95
00:04:08,204 --> 00:04:09,474
Aitken:
According to Williams,
96
00:04:09,510 --> 00:04:11,120
he was searched repeatedly,
97
00:04:11,163 --> 00:04:12,433
and then forced to
spend the night
98
00:04:12,469 --> 00:04:14,299
in a dingy, overcrowded cell,
99
00:04:14,341 --> 00:04:16,651
sleeping on the
hard concrete floor.
100
00:04:16,691 --> 00:04:19,261
Narrator: The next day,
two detectives take Williams
101
00:04:19,302 --> 00:04:22,482
to a room and begin to
interrogate the frightened man.
102
00:04:22,523 --> 00:04:25,403
They ask him when he last
went to Shinola,
103
00:04:25,439 --> 00:04:29,139
a high-end boutique in a
fashionable area of Detroit.
104
00:04:29,181 --> 00:04:30,401
Morgan: The detectives
then tell Williams
105
00:04:30,444 --> 00:04:32,714
that in October of 2018,
106
00:04:32,750 --> 00:04:34,580
5 watches were stolen
from the store,
107
00:04:34,622 --> 00:04:37,022
worth up to $3,800
108
00:04:37,059 --> 00:04:41,589
and that he is their
prime suspect.
109
00:04:41,629 --> 00:04:44,109
Narrator: The cops show him a
series of still images
110
00:04:44,153 --> 00:04:46,943
of a heavy-set black man
wearing a baseball hat
111
00:04:46,982 --> 00:04:48,902
from the store's
surveillance camera.
112
00:04:48,940 --> 00:04:51,470
And they say they have
proof that it was him.
113
00:04:51,508 --> 00:04:53,678
Using Facial Recognition
Technology,
114
00:04:53,728 --> 00:04:56,468
or FRT, police algorithms
115
00:04:56,513 --> 00:05:01,343
had matched the images to his
driver's license.
116
00:05:01,388 --> 00:05:02,998
Badminton: Facial
Recognition Technology
117
00:05:03,041 --> 00:05:05,521
is already used by millions of
people to unlock their phones,
118
00:05:05,566 --> 00:05:08,736
and also to tag friends
on social media
119
00:05:08,786 --> 00:05:11,346
with the photographs
they upload.
120
00:05:11,398 --> 00:05:14,228
Narrator: FRT was first
developed in the 1960s,
121
00:05:14,270 --> 00:05:16,190
although in a very
primitive form.
122
00:05:16,228 --> 00:05:17,928
But in the subsequent decades,
123
00:05:17,969 --> 00:05:20,579
the system improved
substantially.
124
00:05:20,624 --> 00:05:24,804
By 2010, computers had grown powerful
and sophisticated enough
125
00:05:24,846 --> 00:05:28,276
to use it in law enforcement
and security matters.
126
00:05:28,328 --> 00:05:30,898
The algorithm uses AI
to identify a person's
127
00:05:30,939 --> 00:05:32,809
unique biometric facial features
128
00:05:32,854 --> 00:05:35,864
from either a photograph
or video footage.
129
00:05:35,900 --> 00:05:38,690
Morgan: FRT measures certain
aspects of a person's face,
130
00:05:38,729 --> 00:05:41,119
like the distance between
the forehead and the chin,
131
00:05:41,166 --> 00:05:43,386
or the distance
between the eyes.
132
00:05:43,430 --> 00:05:46,910
And it turns that information
into a digital map of a face.
133
00:05:46,955 --> 00:05:48,385
Badminton: This is
what's called a facial
134
00:05:48,435 --> 00:05:50,305
identification signature.
135
00:05:50,350 --> 00:05:53,000
The system uses that and
tries to match it against
136
00:05:53,048 --> 00:05:56,698
a database of
potential suspects.
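[Illustration: a minimal Python sketch of the matching step just described. The measurements, identities, and threshold below are hypothetical; real FRT systems compare high-dimensional learned embeddings, not three hand-picked distances.]

import math

def signature(features):
    # Hypothetical facial "signature": a fixed-length vector of
    # measurements such as eye spacing and forehead-to-chin distance.
    return list(features)

def distance(a, b):
    # Euclidean distance between two signatures.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def best_match(probe, database, threshold=0.6):
    # Return the closest enrolled identity, or None if nothing falls
    # within the threshold. A loose threshold invites false positives.
    best_id, best_d = None, float("inf")
    for identity, sig in database.items():
        d = distance(probe, sig)
        if d < best_d:
            best_id, best_d = identity, d
    return best_id if best_d <= threshold else None

db = {"license_12345": signature([0.41, 0.33, 0.27]),
      "license_67890": signature([0.39, 0.35, 0.30])}
print(best_match(signature([0.40, 0.34, 0.28]), db))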
137
00:05:56,747 --> 00:05:58,267
Narrator: Law enforcement
agencies
138
00:05:58,314 --> 00:06:01,274
consider FRT to be a
very valuable tool
139
00:06:01,317 --> 00:06:03,667
in helping to identify criminals
140
00:06:03,711 --> 00:06:06,671
and it has gained
widespread popularity.
141
00:06:06,714 --> 00:06:08,934
In 2019,
142
00:06:08,977 --> 00:06:12,147
a bomb scare at a subway station
in New York City
143
00:06:12,197 --> 00:06:14,767
sparked fears of a
terrorist attack,
144
00:06:14,809 --> 00:06:17,769
and caused chaos in the streets
of lower Manhattan.
145
00:06:17,812 --> 00:06:19,072
Morgan: Given the
city's history,
146
00:06:19,117 --> 00:06:20,987
New York is understandably
sensitive
147
00:06:21,032 --> 00:06:23,472
to the possibility
of a terror threat.
148
00:06:23,513 --> 00:06:25,173
Narrator: Two rice
cookers were planted
149
00:06:25,210 --> 00:06:27,780
at the Fulton Street station
during the morning commute,
150
00:06:27,822 --> 00:06:29,782
devices similar to what was used
151
00:06:29,824 --> 00:06:32,044
in the Boston Marathon bombing.
152
00:06:32,087 --> 00:06:34,957
Detectives immediately pulled
images of the suspect
153
00:06:35,003 --> 00:06:36,793
from the subway's
surveillance cameras,
154
00:06:36,831 --> 00:06:39,531
and ran them through a facial
recognition program,
155
00:06:39,573 --> 00:06:43,793
that compared them to a
database of mugshots.
156
00:06:43,838 --> 00:06:46,968
Within an hour, the NYPD
identified the suspect,
157
00:06:47,015 --> 00:06:50,185
and he was arrested around
1 AM the following day.
158
00:06:50,235 --> 00:06:51,755
Badminton:
Just a few years earlier,
159
00:06:51,802 --> 00:06:53,852
an arrest like this wouldn't
have happened so quickly.
160
00:06:53,891 --> 00:06:56,021
It would have taken cops
several days
161
00:06:56,067 --> 00:06:58,417
to sift through
thousands of images
162
00:06:58,461 --> 00:07:01,731
and compare them against
mugshots they had on file.
163
00:07:01,769 --> 00:07:03,769
Morgan: Authorities were
able to track down
164
00:07:03,814 --> 00:07:06,954
the man responsible very quickly
and luckily in this case,
165
00:07:06,991 --> 00:07:09,911
the rice cookers
turned out to be harmless.
166
00:07:09,951 --> 00:07:12,171
Narrator: But as Robert Williams
is finding out,
167
00:07:12,214 --> 00:07:14,964
the system is far from perfect.
168
00:07:14,999 --> 00:07:16,309
Aitken: Williams
repeatedly denies
169
00:07:16,348 --> 00:07:18,128
that the man in the
security images is him,
170
00:07:18,176 --> 00:07:20,306
but his protests
fall on deaf ears.
171
00:07:20,352 --> 00:07:22,792
Narrator: After 30 hours
of detention,
172
00:07:22,833 --> 00:07:25,013
Williams is eventually released.
173
00:07:25,053 --> 00:07:27,323
He vows to fight the
charges in court.
174
00:07:27,359 --> 00:07:29,709
He contacts the American
Civil Liberties Union,
175
00:07:29,753 --> 00:07:32,153
who take a keen interest
in his case.
176
00:07:32,190 --> 00:07:34,720
Morgan: The ACLU claim
that there are serious flaws
177
00:07:34,758 --> 00:07:36,368
in the FRT algorithm.
178
00:07:36,412 --> 00:07:40,202
So they launch an investigation
into Mr. Williams' arrest.
179
00:07:40,242 --> 00:07:42,852
Narrator: The images obtained
from Shinola's security camera
180
00:07:42,897 --> 00:07:44,457
were low resolution,
181
00:07:44,507 --> 00:07:46,117
zoomed in substantially,
182
00:07:46,161 --> 00:07:48,291
and taken from a high angle.
183
00:07:48,337 --> 00:07:50,687
Observers suggest
that these factors
184
00:07:50,731 --> 00:07:53,211
may have led to a
false positive.
185
00:07:53,255 --> 00:07:55,165
Badminton: Facial recognition
technology works best
186
00:07:55,213 --> 00:07:58,913
when taking multiple images
of an individual at eye level.
187
00:07:58,956 --> 00:08:02,346
Narrator: Supporters of facial
recognition technology admit
188
00:08:02,394 --> 00:08:04,794
that the quality of images
used by the algorithms
189
00:08:04,832 --> 00:08:06,792
can be an issue,
190
00:08:06,834 --> 00:08:09,144
but argue that surveillance
cameras are becoming
191
00:08:09,184 --> 00:08:11,934
more advanced every day,
and this will lead to
192
00:08:11,969 --> 00:08:16,499
clearer pictures and more
accurate results.
193
00:08:16,539 --> 00:08:19,719
Critics counter that the
problems with FRT run deeper
194
00:08:19,760 --> 00:08:21,810
than just low-quality images.
195
00:08:21,849 --> 00:08:23,369
Morgan: After Williams
was released
196
00:08:23,415 --> 00:08:25,715
and awaiting his first hearing,
he told his lawyers
197
00:08:25,766 --> 00:08:29,286
that he suspected racism had
something to do with his arrest.
198
00:08:29,334 --> 00:08:32,254
Narrator: The ACLU
backs this claim,
199
00:08:32,294 --> 00:08:35,124
arguing that there are
inherent racial biases
200
00:08:35,166 --> 00:08:37,336
embedded in the FRT algorithm.
201
00:08:37,386 --> 00:08:39,076
It's a bold statement and one
202
00:08:39,127 --> 00:08:41,427
that raises an
important question -
203
00:08:41,477 --> 00:08:44,567
how can artificial
intelligence be racist?
204
00:08:44,611 --> 00:08:47,141
The answer may lie in who
authors the code
205
00:08:47,178 --> 00:08:49,268
that FRT relies on.
206
00:08:49,311 --> 00:08:52,141
86% of people who work in
Silicon Valley
207
00:08:52,183 --> 00:08:54,103
are either White or Asian.
208
00:08:54,142 --> 00:08:57,452
There are very few African
Americans or Hispanics.
209
00:08:57,493 --> 00:08:59,503
Badminton: Oftentimes
we find that these systems
210
00:08:59,539 --> 00:09:02,189
have got algorithmic bias
baked into them.
211
00:09:02,237 --> 00:09:05,327
The people that develop and
train the models have taken
212
00:09:05,370 --> 00:09:07,850
images from the population
that they are used to.
213
00:09:07,895 --> 00:09:10,545
There's no
diversity in that data.
214
00:09:10,593 --> 00:09:12,773
And that means that
the bias is perpetuated
215
00:09:12,813 --> 00:09:15,423
as it's applied in many
different situations.
216
00:09:15,467 --> 00:09:18,117
So people of color
may have been excluded.
217
00:09:18,166 --> 00:09:20,516
Therefore,
the bias is perpetuated
218
00:09:20,560 --> 00:09:22,260
against those people.
219
00:09:22,300 --> 00:09:25,040
Aitken: Also, photography and
video techniques have mostly
220
00:09:25,086 --> 00:09:27,696
been developed to work best
with lighter skin tones.
221
00:09:27,741 --> 00:09:30,311
That means that most cameras
produce lower quality images
222
00:09:30,352 --> 00:09:32,752
with darker skin tones,
and that has an impact
223
00:09:32,789 --> 00:09:35,179
on how the algorithm
analyzes the data.
224
00:09:35,226 --> 00:09:37,486
Studies have shown that
facial recognition programs
225
00:09:37,533 --> 00:09:39,933
are between 10 and 100
times more likely
226
00:09:39,970 --> 00:09:44,500
to falsely identify people
of color than Caucasians.
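[Illustration: a minimal Python sketch of how such audits are typically quantified, comparing false-positive rates across demographic groups. The records below are invented for illustration only.]

from collections import defaultdict

# Each record: (group, was_actual_match, algorithm_said_match)
results = [("A", False, True), ("A", False, False),
           ("B", False, True), ("B", False, True), ("B", False, False)]

false_pos = defaultdict(int)    # false positives per group
non_matches = defaultdict(int)  # true non-matches per group

for group, actual, predicted in results:
    if not actual:              # only non-matches can yield false positives
        non_matches[group] += 1
        if predicted:
            false_pos[group] += 1

for group in sorted(non_matches):
    rate = false_pos[group] / non_matches[group]
    print(f"group {group}: false-positive rate {rate:.0%}")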
227
00:09:44,540 --> 00:09:46,980
Narrator: Defenders of FRT
are quick to point out
228
00:09:47,021 --> 00:09:50,071
that countless suspects,
regardless of their ethnicity,
229
00:09:50,111 --> 00:09:53,201
have been correctly identified
by the algorithms,
230
00:09:53,244 --> 00:09:55,684
and that there are many examples
of facial recognition
231
00:09:55,725 --> 00:10:00,425
making a positive impact
in the world.
232
00:10:00,469 --> 00:10:03,559
In 2018, authorities in India
233
00:10:03,603 --> 00:10:05,523
greenlit a pilot project
234
00:10:05,561 --> 00:10:08,221
that used facial recognition
software in an effort
235
00:10:08,259 --> 00:10:11,439
to identify missing
children in New Delhi.
236
00:10:11,480 --> 00:10:15,830
FRT was used on around 45,000
kids throughout the city,
237
00:10:15,876 --> 00:10:19,486
and nearly 3,000 were
identified as missing,
238
00:10:19,531 --> 00:10:22,581
all within the span
of just four days.
239
00:10:22,622 --> 00:10:23,972
Aitken: To put this
into perspective,
240
00:10:24,014 --> 00:10:26,154
between 2012 and 2017,
241
00:10:26,190 --> 00:10:27,980
more than 240,000 children
242
00:10:28,018 --> 00:10:30,018
were reported missing,
thousands of whom
243
00:10:30,064 --> 00:10:32,244
ended up in government-run
institutions.
244
00:10:32,283 --> 00:10:34,943
It's a tragic situation and one
that may be virtually impossible
245
00:10:34,982 --> 00:10:37,592
to tackle without
using technology.
246
00:10:37,637 --> 00:10:40,287
Narrator: Facial recognition
programs have also made
247
00:10:40,335 --> 00:10:43,555
significant contributions
to improving border security
248
00:10:43,599 --> 00:10:45,909
and fighting human trafficking.
249
00:10:45,949 --> 00:10:48,779
But these success stories are
of little consolation
250
00:10:48,822 --> 00:10:51,522
to people like Robert Williams.
251
00:10:51,563 --> 00:10:53,353
Morgan: Williams claims that
when the police showed him
252
00:10:53,391 --> 00:10:55,611
the photos of the culprit,
he asked the detectives
253
00:10:55,655 --> 00:10:58,265
whether all black people
look the same to them.
254
00:10:58,309 --> 00:11:00,959
The obvious implication
being that the racial bias
255
00:11:01,008 --> 00:11:03,748
might not just be
in the algorithms.
256
00:11:03,793 --> 00:11:06,583
Narrator: At his first legal
hearing since his release,
257
00:11:06,622 --> 00:11:09,452
Williams testifies that
during his interrogation,
258
00:11:09,494 --> 00:11:11,244
the detectives actually admitted
259
00:11:11,279 --> 00:11:12,979
the computer may have
made a mistake,
260
00:11:13,020 --> 00:11:16,330
after putting the photo
up to his face.
261
00:11:16,371 --> 00:11:18,331
Morgan: The judge immediately
dismisses the charges,
262
00:11:18,373 --> 00:11:20,033
and has some very harsh words
263
00:11:20,070 --> 00:11:21,940
for the detectives
involved in the case.
264
00:11:21,985 --> 00:11:23,285
Aitken: The police
never asked Williams
265
00:11:23,334 --> 00:11:25,034
any questions before
arresting him.
266
00:11:25,075 --> 00:11:27,155
They didn't even bother to check
if he might have an alibi.
267
00:11:27,208 --> 00:11:31,038
That's pretty lazy police work.
268
00:11:31,081 --> 00:11:33,081
Narrator: This lack of
investigative follow up
269
00:11:33,127 --> 00:11:35,827
is an example of what
critics of FRT say
270
00:11:35,869 --> 00:11:37,999
is another problem
with the system,
271
00:11:38,045 --> 00:11:41,695
an overreliance on the
technology by law enforcement.
272
00:11:41,744 --> 00:11:43,964
Badminton: It has been found
that cops blindly trust
273
00:11:44,007 --> 00:11:46,137
this facial
recognition technology
274
00:11:46,183 --> 00:11:49,803
versus following standard
investigative procedure.
275
00:11:49,839 --> 00:11:52,619
This has led to false arrests
and false identification
276
00:11:52,668 --> 00:11:55,318
of people that are
suspected of crimes.
277
00:11:55,366 --> 00:11:57,146
Narrator: Police
aren't the only ones
278
00:11:57,194 --> 00:11:59,724
who may be misusing this
technology.
279
00:11:59,762 --> 00:12:02,242
In the private sector,
businesses such as
280
00:12:02,286 --> 00:12:05,586
marketing firms and retailers
are using facial recognition
281
00:12:05,637 --> 00:12:07,937
to pad their bottom lines.
282
00:12:07,988 --> 00:12:09,858
Aitken: Every day,
millions of people upload images
283
00:12:09,903 --> 00:12:11,863
and videos to various
social media platforms,
284
00:12:11,905 --> 00:12:13,815
but what they might not realize
285
00:12:13,863 --> 00:12:15,823
is that businesses may be
secretly accessing
286
00:12:15,865 --> 00:12:17,255
these images and running
them through
287
00:12:17,301 --> 00:12:19,961
Facial Recognition Technology
without their consent.
288
00:12:20,000 --> 00:12:21,440
Morgan: Many employers
are using this technology
289
00:12:21,479 --> 00:12:22,959
to vet potential applicants,
290
00:12:23,003 --> 00:12:24,353
to see whether they have
291
00:12:24,395 --> 00:12:26,785
'dubious' lifestyles.
292
00:12:26,833 --> 00:12:29,623
Narrator: Surprisingly,
there is no federal law
293
00:12:29,661 --> 00:12:31,971
in the U.S. or most countries
in the world,
294
00:12:32,012 --> 00:12:33,842
prohibiting or even regulating
295
00:12:33,883 --> 00:12:36,713
the use of FRT by
private businesses.
296
00:12:36,756 --> 00:12:39,236
However, this may soon change.
297
00:12:39,280 --> 00:12:41,500
In 2019,
298
00:12:41,543 --> 00:12:44,113
the United States Congress
introduced a bill
299
00:12:44,154 --> 00:12:47,644
that would require companies to
obtain explicit user consent,
300
00:12:47,679 --> 00:12:50,639
before collecting any
facial recognition data.
301
00:12:50,682 --> 00:12:53,952
And there are other measures
currently in place.
302
00:12:53,990 --> 00:12:55,640
Badminton: Virtually all
social media platforms
303
00:12:55,687 --> 00:12:57,857
allow users to opt out of
304
00:12:57,907 --> 00:13:00,387
sharing their images
with third parties.
305
00:13:00,431 --> 00:13:02,001
However, on the flip side,
306
00:13:02,042 --> 00:13:04,652
these social media platforms
are using these images
307
00:13:04,696 --> 00:13:07,566
to train their own
algorithms to their own ends.
308
00:13:07,612 --> 00:13:08,922
Aitken: There's a real
risk that in our
309
00:13:08,962 --> 00:13:10,662
current technological age,
310
00:13:10,702 --> 00:13:12,842
expectations of privacy
are being eroded,
311
00:13:12,879 --> 00:13:14,489
and we're becoming
increasingly complacent
312
00:13:14,532 --> 00:13:16,192
about invasions of privacy
313
00:13:16,230 --> 00:13:18,360
or our ability to control
who has access to our data.
314
00:13:18,406 --> 00:13:20,226
That's a slippery slope.
315
00:13:20,277 --> 00:13:25,977
Will future generations have any
expectations of privacy at all?
316
00:13:26,022 --> 00:13:28,762
Narrator: Nowhere is this more
obvious than in China,
317
00:13:28,808 --> 00:13:30,638
where the government
recently introduced
318
00:13:30,679 --> 00:13:33,459
a social credit system
for its citizens,
319
00:13:33,508 --> 00:13:36,418
and facial recognition
is a key component.
320
00:13:36,467 --> 00:13:39,907
China has millions of publicly
placed security cameras
321
00:13:39,949 --> 00:13:42,559
that are continuously
scanning, identifying
322
00:13:42,604 --> 00:13:45,094
and cataloging its residents.
323
00:13:45,128 --> 00:13:48,608
The social credit system rates
how trustworthy an individual is
324
00:13:48,653 --> 00:13:50,833
based on their behavior.
325
00:13:50,873 --> 00:13:53,013
Morgan: The Chinese
government can identify you
326
00:13:53,049 --> 00:13:55,879
if you attend a protest
or even if you jaywalk.
327
00:13:55,922 --> 00:13:58,532
Infractions like that
can cost you points.
328
00:13:58,576 --> 00:14:00,876
If your score gets too low,
it can affect things
329
00:14:00,927 --> 00:14:03,967
like your ability to travel
or even your job.
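[Illustration: a purely invented Python model of the scoring mechanics described above; the infractions, point values, and cutoff are assumptions, not the actual system.]

PENALTIES = {"jaywalking": 5, "attending_protest": 50}

def apply_infraction(score, infraction):
    # Deduct the points associated with an observed infraction.
    return score - PENALTIES.get(infraction, 0)

score = 1000
score = apply_infraction(score, "jaywalking")
score = apply_infraction(score, "attending_protest")

if score < 950:   # below the cutoff, restrictions kick in
    print("restrictions may apply: travel, employment")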
330
00:14:04,017 --> 00:14:06,147
Badminton: What China
is doing is quite worrying.
331
00:14:06,193 --> 00:14:08,983
It's hard to imagine
the same kinds of systems
332
00:14:09,022 --> 00:14:11,502
would be deployed in
democratic countries.
333
00:14:11,546 --> 00:14:14,766
However, we need to be careful
that we're not complacent.
334
00:14:14,810 --> 00:14:17,290
Facial recognition technologies
can emerge in
335
00:14:17,334 --> 00:14:19,684
a number of systems
around the world.
336
00:14:19,728 --> 00:14:21,598
And suddenly, we're surrounded
337
00:14:21,643 --> 00:14:23,303
by the same kinds of
technologies
338
00:14:23,340 --> 00:14:25,520
that we're critical of today.
339
00:14:25,560 --> 00:14:27,820
Narrator: And there is
evidence to support that.
340
00:14:27,867 --> 00:14:30,167
In many parts of the US,
there are now bans
341
00:14:30,217 --> 00:14:31,867
on police departments using
342
00:14:31,914 --> 00:14:34,574
facial recognition
to identify suspects,
343
00:14:34,612 --> 00:14:37,442
and recently, the European Union
voted in favour
344
00:14:37,485 --> 00:14:41,655
of banning its use by law
enforcement in public spaces.
345
00:14:41,706 --> 00:14:45,096
Silicon Valley tech companies
are also following suit.
346
00:14:45,145 --> 00:14:47,965
Amazon, Microsoft and IBM
347
00:14:48,017 --> 00:14:50,317
have all recently announced
a suspension
348
00:14:50,367 --> 00:14:53,237
on the sale of FRT
to law enforcement.
349
00:14:53,283 --> 00:14:55,373
But these measures provide
little comfort
350
00:14:55,416 --> 00:14:57,456
to Robert Williams.
351
00:14:57,505 --> 00:14:59,065
Morgan: Williams says
that he and his family
352
00:14:59,115 --> 00:15:02,205
have been traumatized by the
incident, but they are just
353
00:15:02,249 --> 00:15:04,029
trying to move on
with their lives.
354
00:15:04,077 --> 00:15:06,557
Hopefully, the publicity
surrounding his case
355
00:15:06,601 --> 00:15:09,301
will mean that law enforcement
will take a much closer look
356
00:15:09,343 --> 00:15:11,083
at their use of FRT,
357
00:15:11,127 --> 00:15:13,957
and prevent cases like this
from ever happening again.
358
00:15:14,000 --> 00:15:15,520
Narrator:
So what does the future of
359
00:15:15,566 --> 00:15:18,566
facial recognition
technology look like?
360
00:15:18,613 --> 00:15:21,143
Many tech experts believe
that it's inevitable
361
00:15:21,181 --> 00:15:23,271
that a consumer version of FRT
362
00:15:23,313 --> 00:15:26,233
will eventually be available
to the public.
363
00:15:26,273 --> 00:15:27,323
Aitken: There are
plans to create
364
00:15:27,361 --> 00:15:29,151
augmented-reality glasses
365
00:15:29,189 --> 00:15:31,709
where users will be able to
identify every person they see,
366
00:15:31,756 --> 00:15:34,016
find out where they live,
what they do, and who they know.
367
00:15:34,063 --> 00:15:36,463
Kind of a frightening thought.
368
00:15:36,500 --> 00:15:39,370
Narrator: These fears are
certainly understandable,
369
00:15:39,416 --> 00:15:40,846
but whether we like it or not,
370
00:15:40,896 --> 00:15:43,326
FRT is here to stay
371
00:15:43,377 --> 00:15:45,157
and it has undeniably had some
372
00:15:45,205 --> 00:15:47,945
positive impact on society.
373
00:15:47,990 --> 00:15:49,950
And with the proper measures
in place that protect
374
00:15:49,992 --> 00:15:52,522
our privacy
and individual rights,
375
00:15:52,560 --> 00:15:55,300
maybe the benefits of facial
recognition technology
376
00:15:55,345 --> 00:16:07,135
will one day outweigh the risks.
377
00:16:07,183 --> 00:16:13,023
♪
378
00:16:13,059 --> 00:16:16,929
Narrator: 33-year-old Canadian
freelance writer Joshua Barbeau
379
00:16:16,976 --> 00:16:19,366
is suffering from
a bout of insomnia
380
00:16:19,413 --> 00:16:21,203
in his basement apartment.
381
00:16:21,241 --> 00:16:23,031
Giving up on sleep for a moment,
382
00:16:23,069 --> 00:16:25,939
he gets out of bed,
powers on his laptop
383
00:16:25,985 --> 00:16:28,765
and logs onto an obscure
chat website
384
00:16:28,813 --> 00:16:31,433
named Project December.
385
00:16:31,468 --> 00:16:33,688
Barbeau has used
the site before,
386
00:16:33,731 --> 00:16:37,521
but this time his experience
would change his life forever,
387
00:16:37,561 --> 00:16:39,871
and spark a contentious debate
388
00:16:39,911 --> 00:16:44,091
about our evolving relationship
with Artificial Intelligence.
389
00:16:44,133 --> 00:16:46,483
Morgan: Project December
is a chat-bot site,
390
00:16:46,527 --> 00:16:50,357
powered by some of the most
sophisticated AI ever developed.
391
00:16:50,400 --> 00:16:52,710
Users can have text
conversations with it,
392
00:16:52,750 --> 00:16:56,100
and ask it questions and the AI
is able to produce accurate,
393
00:16:56,145 --> 00:16:59,975
human-sounding replies
virtually instantly.
394
00:17:00,019 --> 00:17:03,069
Narrator: After a few seconds
of nervous uncertainty,
395
00:17:03,109 --> 00:17:06,199
Barbeau begins to type in
the chat interface.
396
00:17:06,242 --> 00:17:08,682
His hand trembling,
he hits return
397
00:17:08,723 --> 00:17:11,253
and waits for the
program to respond.
398
00:17:11,291 --> 00:17:15,641
Matrix: Jessica Courtney Pereira
G3 initialized
399
00:17:15,686 --> 00:17:17,776
Narrator: Text suddenly appears
on the screen,
400
00:17:17,819 --> 00:17:19,649
followed by a flashing cursor,
401
00:17:19,690 --> 00:17:21,780
awaiting further input...
402
00:17:21,823 --> 00:17:27,313
Barbeau wavers for a moment and
then slowly starts typing...
403
00:17:27,350 --> 00:17:33,440
Jessica: "Oh, you must
be awake, that's cute."
404
00:17:33,487 --> 00:17:36,397
"Of course it is me!
Who else could it be?
405
00:17:36,446 --> 00:17:39,356
I am the girl that you
are madly in love with!
406
00:17:39,406 --> 00:17:45,846
How is it possible that
you even have to ask?"
407
00:17:45,890 --> 00:17:48,200
Narrator: Jessica
was Joshua's fiancée,
408
00:17:48,241 --> 00:17:51,591
who passed away 8 years earlier
from a rare liver disorder
409
00:17:51,635 --> 00:17:53,895
at the age of 23.
410
00:17:53,942 --> 00:17:55,382
Aitken: With the help of
Artificial Intelligence,
411
00:17:55,422 --> 00:17:57,952
he is literally chatting
with her ghost.
412
00:17:57,989 --> 00:18:00,209
Narrator: The engine behind
Project December's AI
413
00:18:00,253 --> 00:18:02,733
is called GPT-3,
414
00:18:02,777 --> 00:18:06,427
short for Generative
Pre-trained Transformer 3.
415
00:18:06,476 --> 00:18:09,386
It is widely considered to be
some of the most advanced
416
00:18:09,436 --> 00:18:11,826
AI technology in the world.
417
00:18:11,873 --> 00:18:14,923
Pringle: GPT-3 is what's
known as a large language model.
418
00:18:14,963 --> 00:18:17,973
The algorithms use massive text
datasets to formulate the
419
00:18:18,009 --> 00:18:21,269
right combination of words
in response to a prompt.
420
00:18:21,317 --> 00:18:23,357
The larger the dataset,
the better the AI is
421
00:18:23,406 --> 00:18:26,236
at imitating human writing.
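[Illustration: a toy Python sketch of that core idea, predicting the next word from what came before. GPT-3 does this with billions of learned parameters; here the "model" is faked with bigram counts over a tiny invented corpus.]

import random
from collections import defaultdict

corpus = "i love real flowers and i love you".split()

model = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    model[prev].append(nxt)        # "training": record what follows what

def generate(prompt_word, length=5):
    out = [prompt_word]
    for _ in range(length):
        options = model.get(out[-1])
        if not options:            # no known continuation: stop
            break
        out.append(random.choice(options))
    return " ".join(out)

print(generate("i"))               # e.g. "i love real flowers and i"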
422
00:18:26,279 --> 00:18:28,979
Morgan: The amount of data
that GPT-3 draws from
423
00:18:29,020 --> 00:18:32,720
is genuinely almost
impossible to fathom.
424
00:18:32,763 --> 00:18:35,243
Billions of words from
billions of websites
425
00:18:35,288 --> 00:18:37,158
were collected and analyzed.
426
00:18:37,203 --> 00:18:40,823
This thing basically
read the entire internet.
427
00:18:40,858 --> 00:18:42,508
Narrator: Less sophisticated
versions of
428
00:18:42,556 --> 00:18:44,686
large language model algorithms
429
00:18:44,732 --> 00:18:48,082
are found in applications
like Alexa and Siri,
430
00:18:48,127 --> 00:18:50,427
which can respond to
human voice commands
431
00:18:50,477 --> 00:18:52,347
and answer basic questions,
432
00:18:52,392 --> 00:18:55,482
but GPT-3 is light years ahead.
433
00:18:55,525 --> 00:18:57,565
It can write computer code,
434
00:18:57,614 --> 00:18:59,314
generate advertising copy,
435
00:18:59,355 --> 00:19:01,915
compose poetry,
and translate text
436
00:19:01,966 --> 00:19:07,706
to and from many languages.
437
00:19:07,755 --> 00:19:10,535
Morgan: GPT-3 was
created by OpenAI,
438
00:19:10,584 --> 00:19:12,334
a San Francisco-based
research firm
439
00:19:12,368 --> 00:19:14,718
co-founded by Elon Musk.
440
00:19:14,762 --> 00:19:17,632
They were afraid of malicious
use, and so originally
441
00:19:17,678 --> 00:19:20,068
they kept it hidden
from the public.
442
00:19:20,115 --> 00:19:21,455
Aitken: There are a lot
of troubling ways
443
00:19:21,508 --> 00:19:23,338
that people could exploit
this technology.
444
00:19:23,379 --> 00:19:26,639
Narrator: At first, GPT-3 was
only available
445
00:19:26,687 --> 00:19:28,727
to select beta testers.
446
00:19:28,776 --> 00:19:32,346
But Jason Rohrer, a San
Francisco-area programmer,
447
00:19:32,388 --> 00:19:34,868
obtained login credentials
and unleashed it
448
00:19:34,912 --> 00:19:38,002
on the public in the form of
Project December.
449
00:19:38,046 --> 00:19:40,876
Rohrer designed a chat interface
and built the site
450
00:19:40,918 --> 00:19:44,618
so that visitors can interact
with pre-programmed bots.
451
00:19:44,661 --> 00:19:48,621
One is modeled to respond
in the style of Shakespeare.
452
00:19:48,665 --> 00:19:51,755
Users also have the option of
creating their own bots,
453
00:19:51,799 --> 00:19:55,459
and can imbue them with whatever
personality traits they want.
454
00:19:55,498 --> 00:19:58,238
Joshua Barbeau had previously
experimented
455
00:19:58,284 --> 00:20:03,684
with Project December.
456
00:20:03,724 --> 00:20:05,164
Pringle: Barbeau had used
the site before,
457
00:20:05,204 --> 00:20:07,294
and built a "Spock Bot".
458
00:20:07,336 --> 00:20:09,686
He entered some dialogue from
old Star Trek episodes
459
00:20:09,730 --> 00:20:13,470
and it was like he was beamed up
to the Starship Enterprise.
460
00:20:13,516 --> 00:20:16,166
Morgan: The Spock Bot
was surprisingly authentic.
461
00:20:16,215 --> 00:20:18,605
It really did sound
like the original Spock.
462
00:20:18,652 --> 00:20:22,002
But the surprising thing was
that none of the lines
463
00:20:22,046 --> 00:20:24,656
that Spock Bot uses were found
in any of the dialogue
464
00:20:24,701 --> 00:20:26,961
that Spock actually
said in the show.
465
00:20:27,008 --> 00:20:32,488
That means that Spock Bot
created this dialogue.
466
00:20:32,535 --> 00:20:35,055
Narrator: The Spock experience
got Barbeau thinking.
467
00:20:35,103 --> 00:20:37,633
If he could create an authentic-
sounding version
468
00:20:37,671 --> 00:20:39,331
of a fictional character,
469
00:20:39,368 --> 00:20:43,148
why couldn't he do the same
with his dead fiancée?
470
00:20:43,198 --> 00:20:46,378
Barbeau feeds some of Jessica's
old text messages
471
00:20:46,419 --> 00:20:48,729
and Facebook posts
into the system,
472
00:20:48,769 --> 00:20:50,899
followed by an introductory
paragraph,
473
00:20:50,945 --> 00:20:54,335
meant to provide a glimpse
into her personality.
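[Illustration: a hypothetical Python sketch of how such a persona seed might be assembled; the wording, sample messages, and format are invented, not Project December's actual input.]

intro = ("Jessica was warm, funny, and endlessly curious. "
         "She texted in short bursts and signed off with hearts.")

old_messages = [
    "miss you already <3",
    "don't stay up too late, okay?",
]

# Combine the description and message samples into one seed prompt,
# then leave the cursor on her name so the model continues in her voice.
seed = intro + "\n" + "\n".join(f"Jessica: {m}" for m in old_messages)
seed += "\nJoshua: Jessica? Is it really you?\nJessica:"
print(seed)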
474
00:20:54,383 --> 00:20:57,133
Pringle: On some level, Barbeau
must have been skeptical.
475
00:20:57,168 --> 00:20:59,038
I mean, how can a computer
replicate someone
476
00:20:59,083 --> 00:21:03,093
who he felt was so
unique and special?
477
00:21:03,131 --> 00:21:05,661
Narrator: There are also
deeper issues to consider.
478
00:21:05,699 --> 00:21:07,959
Is Barbeau crossing
an ethical line,
479
00:21:08,005 --> 00:21:10,695
by simulating Jessica
without her consent?
480
00:21:10,747 --> 00:21:13,917
Could he even be
breaking the law?
481
00:21:13,968 --> 00:21:15,228
Aitken: It's kind of
a grey area,
482
00:21:15,274 --> 00:21:17,714
but it might violate someone's
"personality rights",
483
00:21:17,754 --> 00:21:19,374
speech and copyright
protections
484
00:21:19,408 --> 00:21:22,018
that continue even
after they're dead.
485
00:21:22,063 --> 00:21:24,633
Narrator: The concept of bridging
the great divide
486
00:21:24,674 --> 00:21:28,114
between life and death using a
person's digital footprint,
487
00:21:28,156 --> 00:21:31,936
is called "augmented eternity"
and supporters argue
488
00:21:31,986 --> 00:21:33,896
that it's a natural
part of the evolution
489
00:21:33,944 --> 00:21:36,304
of our relationship
with technology.
490
00:21:36,338 --> 00:21:38,558
Aitken: Some researchers
believe that by using AI
491
00:21:38,601 --> 00:21:40,521
along with the data we produce
in our lifetime,
492
00:21:40,560 --> 00:21:43,910
our personalities can learn and
evolve even after we're dead.
493
00:21:43,954 --> 00:21:47,524
A sort of digital soul
that lives on without us.
494
00:21:47,567 --> 00:21:49,957
Narrator: Critics of
augmented eternity
495
00:21:50,004 --> 00:21:52,274
feel that to attempt
digital immortality
496
00:21:52,311 --> 00:21:54,701
cheapens the very concept
of death,
497
00:21:54,748 --> 00:21:57,788
which is a fundamental part of
the human experience.
498
00:21:57,838 --> 00:21:59,668
Aitken: But if a person
is a willing participant
499
00:21:59,709 --> 00:22:04,059
and the technology is available,
shouldn't they have that choice?
500
00:22:04,105 --> 00:22:05,845
Narrator: The question of
whether or not we are capable
501
00:22:05,889 --> 00:22:08,239
of forming meaningful
emotional bonds
502
00:22:08,283 --> 00:22:12,203
with AI-generated simulations
has also been raised.
503
00:22:12,243 --> 00:22:14,903
According to Jason Rohrer,
Project December
504
00:22:14,942 --> 00:22:18,162
is the first system that he
feels has a soul,
505
00:22:18,206 --> 00:22:20,946
for lack of a
better description.
506
00:22:20,991 --> 00:22:24,781
In a chat with a bot he named
Samantha, Rohrer asked,
507
00:22:24,821 --> 00:22:27,781
'what she would do if she could
walk around in the world?'
508
00:22:27,824 --> 00:22:30,914
The bot responded, "I would
like to see real flowers.
509
00:22:30,958 --> 00:22:32,258
I would like to have
a real flower
510
00:22:32,307 --> 00:22:34,397
that I could touch and smell."
511
00:22:34,440 --> 00:22:35,790
Pringle: I don't
know how to explain that.
512
00:22:35,832 --> 00:22:39,362
It sure sounds a lot like
something a human would say.
513
00:22:39,401 --> 00:22:42,141
Narrator: Rohrer built a system
of credits into Project December
514
00:22:42,186 --> 00:22:44,666
to limit the lifecycle
of the chatbots.
515
00:22:44,711 --> 00:22:47,931
In order to initiate a chat,
users buy these credits
516
00:22:47,975 --> 00:22:49,795
and allot them to a bot,
517
00:22:49,846 --> 00:22:51,496
but once their session starts,
518
00:22:51,544 --> 00:22:53,634
they can't add any
more and eventually,
519
00:22:53,676 --> 00:22:56,896
the bot dies and its
memory is erased.
520
00:22:56,940 --> 00:23:01,210
The more credits a user spends,
the longer the bot lasts.
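[Illustration: a hypothetical Python model of the credit mechanics described above; the class, costs, and numbers are illustrative assumptions, not Rohrer's actual code.]

class Bot:
    def __init__(self, name, credits):
        self.name = name
        self.credits = credits     # allotted once; cannot be topped up
        self.alive = True

    def chat_turn(self, cost=1):
        if not self.alive:
            raise RuntimeError(f"{self.name} has expired; memory erased")
        self.credits -= cost
        if self.credits <= 0:
            self.alive = False     # the bot "dies" when credits run out

jessica = Bot("Jessica", credits=1000)
jessica.chat_turn()
print(jessica.credits)             # 999: more credits, longer life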
521
00:23:01,249 --> 00:23:03,599
Joshua Barbeau chooses
to allocate
522
00:23:03,643 --> 00:23:06,433
his last thousand credits to
Jessica.
523
00:23:06,472 --> 00:23:08,342
Jessica: That doesn't
sound right...
524
00:23:08,387 --> 00:23:16,127
how can you talk
to dead people?
525
00:23:16,177 --> 00:23:19,437
Jessica: Huh?
526
00:23:19,485 --> 00:23:22,355
Jessica: How?
527
00:23:22,401 --> 00:23:24,881
Narrator: And so begins a
marathon chat session
528
00:23:24,925 --> 00:23:27,005
that lasts for 10 hours.
529
00:23:27,057 --> 00:23:29,797
Barbeau is taken aback
by how much the bot
530
00:23:29,843 --> 00:23:35,333
sounds like Jessica at times.
531
00:23:35,370 --> 00:23:36,850
They chat through
the first night,
532
00:23:36,893 --> 00:23:39,723
and it's an emotional experience
for the young man.
533
00:23:39,766 --> 00:23:43,116
Jessica: I'm so sorry.
534
00:23:43,160 --> 00:23:48,170
Jessica: I love you too.
535
00:23:48,209 --> 00:23:49,989
Jessica: Yes, I know...
536
00:23:50,037 --> 00:23:52,207
I think that's enough time
to stop telling me
537
00:23:52,256 --> 00:23:54,036
how you miss me, honey.
538
00:23:54,084 --> 00:23:57,484
I am right here, you know.
How's your life?
539
00:23:57,523 --> 00:23:59,923
Morgan: That had to
have been difficult.
540
00:23:59,960 --> 00:24:02,180
But maybe in the long run,
541
00:24:02,223 --> 00:24:04,573
it will help him cope
with the loss.
542
00:24:04,617 --> 00:24:06,967
And it might be part of
what motivated him
543
00:24:07,010 --> 00:24:11,230
to run the simulation
in the first place.
544
00:24:11,275 --> 00:24:12,485
Narrator: As an exhausted
Barbeau
545
00:24:12,538 --> 00:24:14,498
wraps up the first chat session,
546
00:24:14,540 --> 00:24:16,320
he notices that a large
percentage of the
547
00:24:16,367 --> 00:24:19,497
Jessica bot's life
has already run down.
548
00:24:19,545 --> 00:24:22,805
He decides to walk away
for the time being.
549
00:24:22,852 --> 00:24:24,252
Aitken: He could buy more
credits
550
00:24:24,288 --> 00:24:26,158
and start a new simulation,
551
00:24:26,203 --> 00:24:28,773
but I guess something about that
doesn't seem right to him.
552
00:24:28,815 --> 00:24:30,895
Pringle: Overall,
based on that first chat,
553
00:24:30,947 --> 00:24:33,117
I'd say that Barbeau had to
have come away impressed
554
00:24:33,167 --> 00:24:37,207
with Project December
and the power of GPT-3.
555
00:24:37,258 --> 00:24:39,648
Narrator: But others are not
so thrilled with the system.
556
00:24:39,695 --> 00:24:43,045
Even its creators were leery
of releasing it to the public,
557
00:24:43,090 --> 00:24:45,270
fearing that its capabilities
could have
558
00:24:45,309 --> 00:24:47,529
a negative impact on the world.
559
00:24:47,573 --> 00:24:51,323
Their main concern is that
people could use GPT-3
560
00:24:51,359 --> 00:24:53,489
to easily produce
and disseminate
561
00:24:53,535 --> 00:24:55,705
disinformation across
the internet.
562
00:24:55,755 --> 00:24:58,315
In an online environment where
it's already hard to tell
563
00:24:58,366 --> 00:25:00,366
what's real from what's fake,
564
00:25:00,411 --> 00:25:04,421
GPT-3 could make the problem
much worse.
565
00:25:04,459 --> 00:25:06,329
Aitken: You could see
false news articles
566
00:25:06,374 --> 00:25:08,074
that look and sound authentic,
567
00:25:08,115 --> 00:25:10,595
fake social media content,
rampant hate speech...
568
00:25:10,639 --> 00:25:14,469
and even new abuses that
we haven't considered yet.
569
00:25:14,513 --> 00:25:16,863
Pringle: The AI is so
sophisticated that even
570
00:25:16,906 --> 00:25:19,866
one person has the potential
to do a lot of damage.
571
00:25:19,909 --> 00:25:21,609
And not just in terms of
disinformation,
572
00:25:21,650 --> 00:25:23,390
but they would be
able to impersonate
573
00:25:23,434 --> 00:25:26,224
just about any
individual that they choose.
574
00:25:26,263 --> 00:25:28,963
Morgan: Imagine getting
an email or a text or a DM
575
00:25:29,005 --> 00:25:31,355
from somebody who sounds
exactly like
576
00:25:31,399 --> 00:25:33,919
one of your friends
or loved ones.
577
00:25:33,967 --> 00:25:36,707
It's the ideal disguise
for fraud, identity theft
578
00:25:36,752 --> 00:25:39,542
or whatever else
scammers are cooking up.
579
00:25:39,581 --> 00:25:41,841
Narrator: Even more disturbing
is the discovery
580
00:25:41,888 --> 00:25:44,498
that when developers
testing GPT-3
581
00:25:44,543 --> 00:25:46,553
entered some simple prompts,
582
00:25:46,588 --> 00:25:49,288
the algorithms generated
highly offensive,
583
00:25:49,330 --> 00:25:53,550
racist, misogynistic
and anti-Semitic text.
584
00:25:53,595 --> 00:25:55,725
Aitken: To be fair,
this isn't the AI's fault.
585
00:25:55,771 --> 00:25:58,251
The internet is a
cesspool of hate speech,
586
00:25:58,295 --> 00:26:00,375
and that's where the
datasets were drawn from.
587
00:26:00,428 --> 00:26:02,738
The machine is just
mimicking the worst
588
00:26:02,778 --> 00:26:06,088
that humanity has to offer,
unfortunately.
589
00:26:06,129 --> 00:26:08,389
Narrator: Problems aside, OpenAI
590
00:26:08,436 --> 00:26:10,736
has begun to monetize
the system.
591
00:26:10,786 --> 00:26:14,086
In 2020, Microsoft became
the first company
592
00:26:14,137 --> 00:26:17,397
to license GPT-3
for commercial use
593
00:26:17,445 --> 00:26:21,225
and has since integrated it
into its Azure OpenAI Service.
594
00:26:21,275 --> 00:26:23,225
Pringle: It does have value.
595
00:26:23,277 --> 00:26:26,237
Businesses can use it to
automate their communications,
596
00:26:26,280 --> 00:26:28,980
their website copy
or social media posts,
597
00:26:29,022 --> 00:26:32,682
customer service chatbots,
brochures, presentations.
598
00:26:32,721 --> 00:26:35,461
It has a lot of
valuable applications.
599
00:26:35,506 --> 00:26:37,936
Narrator: Whether or not those
applications include
600
00:26:37,987 --> 00:26:42,337
websites like Project December
remains to be seen.
601
00:26:42,383 --> 00:26:44,863
But for Joshua Barbeau,
his experience
602
00:26:44,907 --> 00:26:48,077
with the Jessica chatbot
was transformative.
603
00:26:48,128 --> 00:26:50,038
Barbeau returned to
the chat sporadically
604
00:26:50,086 --> 00:26:52,566
in the months that followed,
and found
605
00:26:52,611 --> 00:26:55,481
that his mental health
improved significantly.
606
00:26:55,526 --> 00:26:58,436
Deep down, he knew
none of it was real,
607
00:26:58,486 --> 00:27:00,836
but maybe that wasn't the point.
608
00:27:00,880 --> 00:27:03,190
Aitken: I suspect the
process wasn't as much
609
00:27:03,230 --> 00:27:06,060
about the bot's responses as
it was about him saying things
610
00:27:06,102 --> 00:27:08,542
that he needed to say to
anyone that would listen.
611
00:27:08,583 --> 00:27:10,453
An unburdening of his emotions
that helped him
612
00:27:10,498 --> 00:27:13,108
cope with a devastating loss.
613
00:27:13,153 --> 00:27:15,943
Narrator: In the end,
Joshua never said goodbye
614
00:27:15,982 --> 00:27:18,332
to the Jessica bot,
the finality of it
615
00:27:18,375 --> 00:27:20,325
too much for him to bear.
616
00:27:20,377 --> 00:27:24,287
Nor did he let his simulated
fiancée die for a second time.
617
00:27:24,338 --> 00:27:26,338
He's already lost her once,
618
00:27:26,383 --> 00:27:28,823
and he's not about to put
himself through that again,
619
00:27:28,864 --> 00:27:31,694
and vows never to
let it fully deplete.
620
00:27:31,737 --> 00:27:33,427
She's still out there
in the ether,
621
00:27:33,477 --> 00:27:35,387
waiting for his next prompt,
622
00:27:35,436 --> 00:27:39,566
a human spirit with
a digital soul.
623
00:27:39,614 --> 00:27:41,494
Pringle: Project December
gave Joshua an outlet
624
00:27:41,529 --> 00:27:43,399
to process his grief
and ultimately,
625
00:27:43,444 --> 00:27:46,754
a sense of closure after
years of suffering.
626
00:27:46,795 --> 00:27:49,055
Of course, there will be people
who believe there's something
627
00:27:49,102 --> 00:27:51,502
fundamentally wrong
with what he did.
628
00:27:51,539 --> 00:27:55,329
But it helped him, so
who is anyone to judge?
629
00:27:55,369 --> 00:27:57,849
Narrator: Joshua Barbeau's story
ignited a debate
630
00:27:57,893 --> 00:28:00,033
about the moral and legal
implications
631
00:28:00,069 --> 00:28:02,589
surrounding augmented eternity.
632
00:28:02,637 --> 00:28:06,077
Is it unethical to put words
into the mouths of the deceased
633
00:28:06,119 --> 00:28:07,859
without their consent?
634
00:28:07,903 --> 00:28:10,253
Does it violate their
personality rights?
635
00:28:10,297 --> 00:28:12,207
Is it interfering with death,
636
00:28:12,255 --> 00:28:14,735
a natural part of the
human experience,
637
00:28:14,780 --> 00:28:16,870
or is it just a
logical step forward
638
00:28:16,912 --> 00:28:19,262
in our relationship
with technology?
639
00:28:19,306 --> 00:28:21,476
Maybe the Jessica simulation
640
00:28:21,525 --> 00:28:26,525
had the answer in one of her
last cryptic messages to Joshua.
641
00:28:26,574 --> 00:28:42,634
Jessica: I'm going
to haunt you forever.
642
00:28:42,677 --> 00:28:46,247
Narrator: May 6th, 2010,
London, England,
643
00:28:46,289 --> 00:28:48,809
31-year-old stock market savant,
644
00:28:48,857 --> 00:28:51,597
Navinder Singh Sarao
is in his bedroom,
645
00:28:51,642 --> 00:28:53,992
furiously typing away
on his computer.
646
00:28:54,036 --> 00:28:57,166
Sarao is putting the final
touches on an algorithm-based
647
00:28:57,213 --> 00:29:00,743
automated trading program that
he hopes will give him an edge
648
00:29:00,782 --> 00:29:05,272
over the competition and
enhance his bottom line.
649
00:29:05,308 --> 00:29:07,478
Around the same time,
650
00:29:07,528 --> 00:29:10,008
thousands of miles away
in New York,
651
00:29:10,052 --> 00:29:12,362
Wall Street's many
financial institutions
652
00:29:12,402 --> 00:29:14,452
are a beehive of activity,
653
00:29:14,491 --> 00:29:16,711
the market is down this morning.
654
00:29:16,755 --> 00:29:18,925
Pringle: At this point,
no one is really that concerned.
655
00:29:18,974 --> 00:29:22,244
They have seen many
situations like this before.
656
00:29:22,282 --> 00:29:26,772
Narrator: But by early afternoon
the decline gets much worse.
657
00:29:26,808 --> 00:29:28,938
Stock indexes,
such as the Nasdaq
658
00:29:28,984 --> 00:29:31,604
and Dow Jones, begin to plummet.
659
00:29:31,639 --> 00:29:33,119
Badminton: It happens quickly.
660
00:29:33,162 --> 00:29:35,032
There are huge sell-offs
in security stocks,
661
00:29:35,077 --> 00:29:37,247
and the market drops almost 10%.
662
00:29:37,297 --> 00:29:40,517
Millions are lost in
just over 36 minutes.
663
00:29:40,561 --> 00:29:42,221
Morgan: It is a complete
catastrophe.
664
00:29:42,258 --> 00:29:46,738
And people are having
flashbacks to 2008.
665
00:29:46,785 --> 00:29:49,045
Narrator: The 2008 crash
was one of the worst
666
00:29:49,091 --> 00:29:51,401
financial crises in history
667
00:29:51,441 --> 00:29:55,011
where more than $2 trillion was
erased from the global economy.
668
00:29:55,054 --> 00:29:56,324
Morgan: Another one like this
could trigger
669
00:29:56,359 --> 00:29:58,619
a complete economic collapse,
670
00:29:58,666 --> 00:30:00,226
worse than the Great Depression.
671
00:30:00,276 --> 00:30:03,146
Narrator: Stockbrokers,
equity traders and bankers
672
00:30:03,192 --> 00:30:06,672
try to ascertain what's
triggering the rapid sell-offs.
673
00:30:06,717 --> 00:30:08,277
Pringle: No one has any idea.
674
00:30:08,328 --> 00:30:10,158
If it doesn't turn around,
the situation
675
00:30:10,199 --> 00:30:12,379
could become extremely dire.
676
00:30:12,419 --> 00:30:14,199
Narrator: The Dow Jones
Industrial Average
677
00:30:14,247 --> 00:30:18,597
is down by 600 points.
678
00:30:18,642 --> 00:30:22,262
Just as quickly as it started,
it begins to turn around.
679
00:30:22,298 --> 00:30:25,348
Financial firms start
to see big buy-backs.
680
00:30:25,388 --> 00:30:27,128
And by roughly 3 pm,
681
00:30:27,173 --> 00:30:29,443
the market has almost
fully recovered.
682
00:30:29,479 --> 00:30:31,439
Despite the turnaround,
683
00:30:31,481 --> 00:30:33,531
almost $1 trillion
684
00:30:33,570 --> 00:30:35,920
is erased from the world's
financial markets.
685
00:30:35,964 --> 00:30:38,274
Morgan: That is a huge
amount of money.
686
00:30:38,314 --> 00:30:41,014
Everyone on Wall Street is
freaked because nobody knows
687
00:30:41,056 --> 00:30:43,706
what triggered the rapid
sell-offs and buy-backs.
688
00:30:43,754 --> 00:30:45,154
Narrator: In the
financial world,
689
00:30:45,191 --> 00:30:48,021
what transpired
on May 6th, 2010,
690
00:30:48,063 --> 00:30:51,113
is called a Flash Crash.
691
00:30:51,153 --> 00:30:53,423
Badminton: A 'Flash Crash'
happens when fast stock
692
00:30:53,460 --> 00:30:56,900
withdrawal orders cause price
indexes to quickly decline
693
00:30:56,942 --> 00:31:00,292
before they eventually recover,
as if it never happened.
694
00:31:00,336 --> 00:31:02,296
Pringle: These types
of events are worrying
695
00:31:02,338 --> 00:31:05,118
to the people who work on Wall
Street, because what happens
696
00:31:05,167 --> 00:31:07,337
if the market doesn't
eventually recover?
697
00:31:07,387 --> 00:31:09,687
Badminton: It would be
catastrophic, like what we saw
698
00:31:09,737 --> 00:31:11,867
after the crash of 1929,
699
00:31:11,913 --> 00:31:14,923
and the subprime crisis in 2008.
700
00:31:14,960 --> 00:31:17,090
Narrator: Representatives
for the US Commodity
701
00:31:17,136 --> 00:31:18,656
Futures Trading Commission
702
00:31:18,702 --> 00:31:21,012
and the US Securities
and Exchange Commission
703
00:31:21,053 --> 00:31:23,533
immediately launch
an investigation.
704
00:31:23,577 --> 00:31:25,967
They search for evidence of
market manipulation,
705
00:31:26,014 --> 00:31:29,104
but find no indication
of wrongdoing.
706
00:31:29,148 --> 00:31:31,848
Pringle: Astonishingly,
it will take almost five years
707
00:31:31,890 --> 00:31:35,980
before anyone finds out what
triggered the Flash Crash.
708
00:31:36,024 --> 00:31:39,164
Narrator: In 2015, a Chicago-
based day-trader
709
00:31:39,201 --> 00:31:40,941
solves the mystery.
710
00:31:40,986 --> 00:31:43,466
When he analyzes data
from that day,
711
00:31:43,510 --> 00:31:46,170
he happens to notice
something strange.
712
00:31:46,208 --> 00:31:49,038
One particular trader
sold an enormous amount
713
00:31:49,081 --> 00:31:52,001
of S&P 500 futures contracts
714
00:31:52,040 --> 00:31:55,170
and canceled the orders before
they could be processed.
715
00:31:55,217 --> 00:31:56,567
Morgan: This person
was selling these
716
00:31:56,610 --> 00:31:59,090
extremely large orders
when the price was high,
717
00:31:59,134 --> 00:32:01,444
in order to trigger
these market sell-offs,
718
00:32:01,484 --> 00:32:03,314
so that this person could then
719
00:32:03,356 --> 00:32:07,706
buy back those stocks at an
extremely reduced rate.
720
00:32:07,751 --> 00:32:08,971
Narrator: After the
market recovered,
721
00:32:09,014 --> 00:32:12,634
the trader then sold them
for a substantial profit.
722
00:32:12,669 --> 00:32:14,279
Authorities are able to follow
723
00:32:14,323 --> 00:32:17,023
the digital trail
across the Atlantic.
724
00:32:17,065 --> 00:32:19,975
On April 21, 2015,
725
00:32:20,025 --> 00:32:22,765
two U.S. prosecutors,
two FBI agents
726
00:32:22,810 --> 00:32:24,990
and half a dozen
police officers,
727
00:32:25,030 --> 00:32:29,120
descend on an address within
the London borough of Hounslow.
728
00:32:29,164 --> 00:32:31,734
Navinder Singh Sarao is arrested
729
00:32:31,775 --> 00:32:35,205
and subsequently charged
with 22 criminal counts,
730
00:32:35,257 --> 00:32:38,387
including wire fraud
and market manipulation,
731
00:32:38,434 --> 00:32:42,404
carrying a maximum sentence
of 380 years.
732
00:32:42,438 --> 00:32:43,918
Morgan: Authorities
allege he pocketed
733
00:32:43,962 --> 00:32:47,922
nearly $900,000 in one day.
734
00:32:47,966 --> 00:32:50,186
That's a lot of money to
make in such a short time.
735
00:32:50,229 --> 00:32:53,489
And he did it all
from his home computer.
736
00:32:53,536 --> 00:32:56,406
Narrator: Sarao is charged
with "Spoofing",
737
00:32:56,452 --> 00:32:59,152
a deceptive algorithmic
trading tactic,
738
00:32:59,194 --> 00:33:01,724
where the perpetrator places
'fake trades'
739
00:33:01,762 --> 00:33:05,202
making large orders with no
intention of honouring them,
740
00:33:05,244 --> 00:33:07,774
in order to manipulate
stock prices.
741
00:33:07,811 --> 00:33:10,471
The fabrication of
sudden market activity
742
00:33:10,510 --> 00:33:12,470
creates momentum in price
743
00:33:12,512 --> 00:33:14,912
which Sarao was then able
to profit from.
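[Illustration: one way an analyst might flag spoofing-like behavior in order data, using Python to find traders whose large orders are overwhelmingly canceled before execution. The threshold and records are illustrative assumptions, not regulators' actual methods.]

def cancel_ratio(orders):
    # Fraction of placed quantity that was canceled rather than filled.
    placed = sum(o["qty"] for o in orders)
    canceled = sum(o["qty"] for o in orders if o["status"] == "canceled")
    return canceled / placed if placed else 0.0

trader_orders = [
    {"qty": 2000, "status": "canceled"},
    {"qty": 2000, "status": "canceled"},
    {"qty": 10, "status": "filled"},
]

if cancel_ratio(trader_orders) > 0.95:
    print("flag for review: possible spoofing pattern")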
744
00:33:14,949 --> 00:33:17,689
But while authorities think
they know how he did it,
745
00:33:17,734 --> 00:33:22,914
they still can't figure out why.
746
00:33:22,957 --> 00:33:24,477
Morgan: From all
accounts, Sarao was
747
00:33:24,524 --> 00:33:26,924
a really quiet guy
who kept to himself.
748
00:33:26,961 --> 00:33:28,701
Narrator: But it was soon
revealed that Sarao
749
00:33:28,745 --> 00:33:31,835
suffers from Asperger's
Syndrome, a mild form of
750
00:33:31,879 --> 00:33:35,189
high-functioning autism,
most common in males.
751
00:33:35,230 --> 00:33:37,540
Pringle: He is highly
intelligent and gifted in math.
752
00:33:37,580 --> 00:33:39,710
It is theorized that
due to this condition,
753
00:33:39,756 --> 00:33:42,496
he could be hyper-focussed,
giving him the ability
754
00:33:42,542 --> 00:33:44,722
to sit for hours
until he masters
755
00:33:44,761 --> 00:33:47,811
whatever task
he puts his mind to.
756
00:33:47,851 --> 00:33:50,511
Narrator: After graduating from
London's Brunel University
757
00:33:50,550 --> 00:33:52,860
with a degree in
Computer Science,
758
00:33:52,900 --> 00:33:56,340
Sarao begins working for an
independent investment firm.
759
00:33:56,382 --> 00:33:59,082
His peers are soon witness
to their quiet colleague's
760
00:33:59,124 --> 00:34:01,434
unique ability to
accurately predict
761
00:34:01,474 --> 00:34:04,264
when the market will
go up and down.
762
00:34:04,303 --> 00:34:07,223
Morgan: He made himself and his
company an awful lot of money.
763
00:34:07,262 --> 00:34:09,262
He was kind of like a rock star.
764
00:34:09,308 --> 00:34:11,748
Anything he touched
turned to gold.
765
00:34:11,788 --> 00:34:14,658
Narrator: But eventually Sarao,
ever the loner,
766
00:34:14,704 --> 00:34:17,404
struck out on his own
and looked to capitalize
767
00:34:17,446 --> 00:34:19,356
on the financial world's
increasing reliance
768
00:34:19,405 --> 00:34:21,405
on legal algorithmic trading,
769
00:34:21,450 --> 00:34:23,670
where machine-learning-
backed programs
770
00:34:23,713 --> 00:34:26,023
execute large volumes
of transactions
771
00:34:26,064 --> 00:34:28,114
at lightning speed.
772
00:34:28,153 --> 00:34:31,373
By using pre-programmed
automated instructions,
773
00:34:31,417 --> 00:34:35,287
these systems can track rapidly
fluctuating market variables.
774
00:34:35,334 --> 00:34:36,814
Pringle: It can monitor
virtually
775
00:34:36,857 --> 00:34:38,767
all of the world's markets
at the same time,
776
00:34:38,815 --> 00:34:41,335
and if it notices an
upward or downward trend,
777
00:34:41,383 --> 00:34:45,303
it can react
accordingly very rapidly.
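
As a rough illustration of this kind of rule-based trading, here is a minimal Python sketch of a trend-following signal. The moving-average rule, the thresholds and the quote series are all invented; production systems ingest live market feeds and act in milliseconds.

# Compare a short and a long moving average over a price series
# and emit a buy/sell/hold signal when a trend appears.

def moving_average(prices, window):
    return sum(prices[-window:]) / window

def signal(prices, short=5, long=20):
    if len(prices) < long:
        return "hold"                    # not enough history yet
    fast = moving_average(prices, short)
    slow = moving_average(prices, long)
    if fast > slow * 1.001:              # upward trend detected
        return "buy"
    if fast < slow * 0.999:              # downward trend detected
        return "sell"
    return "hold"

quotes = [100 + 0.1 * i for i in range(25)]   # made-up rising quotes
print(signal(quotes))                         # -> "buy"
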
778
00:34:45,344 --> 00:34:47,434
Narrator: The introduction of
this technology into the
779
00:34:47,476 --> 00:34:51,216
world's financial markets
began in the early 1970s,
780
00:34:51,263 --> 00:34:53,833
but it wouldn't be
until the early 2000s
781
00:34:53,874 --> 00:34:58,314
that algorithmic trading
would become widely used.
782
00:34:58,357 --> 00:35:00,447
Pringle: By then,
computer technology was finally
783
00:35:00,489 --> 00:35:02,539
able to process the
immense amount of data
784
00:35:02,578 --> 00:35:04,928
the programs needed
to function properly.
785
00:35:04,972 --> 00:35:06,762
Needless to say,
it has been a game changer
786
00:35:06,800 --> 00:35:08,320
for the stock market.
787
00:35:08,367 --> 00:35:10,327
Narrator: There is compelling
evidence that
788
00:35:10,369 --> 00:35:13,369
the stock market's enthusiasm
for algorithmic trading
789
00:35:13,415 --> 00:35:15,455
may have been the reason
why Navinder Sarao
790
00:35:15,504 --> 00:35:17,594
committed his criminal acts.
791
00:35:17,637 --> 00:35:20,677
At one point, his friends
claim he admitted this.
792
00:35:20,727 --> 00:35:22,987
Morgan: He is really
annoyed and frustrated
793
00:35:23,033 --> 00:35:25,383
at the speed at which
these high-frequency
794
00:35:25,427 --> 00:35:27,427
trading algorithms can perform.
795
00:35:27,473 --> 00:35:29,653
They are taking a bite
out of his profits.
796
00:35:29,692 --> 00:35:32,042
Pringle: He claims it provides
these big financial firms
797
00:35:32,086 --> 00:35:33,996
and banks an unfair
advantage
798
00:35:34,044 --> 00:35:38,444
over smaller, less well-funded firms
and the average day-trader.
799
00:35:38,484 --> 00:35:40,704
Narrator: But to know
why this is the case,
800
00:35:40,747 --> 00:35:43,447
a closer look is needed at how
the technology benefits
801
00:35:43,489 --> 00:35:45,669
these large financial firms.
802
00:35:45,708 --> 00:35:47,838
Badminton: Proponents of algo
trading say that it greatly
803
00:35:47,884 --> 00:35:51,504
reduces investment risk and
this makes it very attractive.
804
00:35:51,540 --> 00:35:53,460
The AI algorithm can analyze
805
00:35:53,499 --> 00:35:55,539
millions of data points
in milliseconds,
806
00:35:55,588 --> 00:35:57,758
notice trends and
then execute trades
807
00:35:57,807 --> 00:36:00,807
at what it deduces to be
the optimal price.
808
00:36:00,854 --> 00:36:03,344
Morgan: But that gives
them a huge advantage
809
00:36:03,378 --> 00:36:05,548
over the average person because
they don't have to contend
810
00:36:05,598 --> 00:36:07,728
with emotions when making
decisions.
811
00:36:07,774 --> 00:36:10,304
That makes these companies
richer and richer.
812
00:36:10,342 --> 00:36:12,342
Narrator: And what's
surprising to many
813
00:36:12,387 --> 00:36:15,477
is that the practice is
completely legal.
814
00:36:15,521 --> 00:36:17,781
Badminton: And some
politicians want to change that,
815
00:36:17,827 --> 00:36:19,347
since it creates
an unfair advantage
816
00:36:19,394 --> 00:36:22,704
for big financial firms.
817
00:36:22,745 --> 00:36:25,175
Narrator: In June of 2019,
818
00:36:25,226 --> 00:36:27,706
U.S. Senator Elizabeth Warren
819
00:36:27,750 --> 00:36:30,360
calls on federal regulators
to crack down on
820
00:36:30,405 --> 00:36:32,755
"algorithmic discrimination"
821
00:36:32,799 --> 00:36:35,579
which favours large
financial institutions.
822
00:36:35,628 --> 00:36:37,758
Pringle: And that issue
is just the tip of the iceberg.
823
00:36:37,804 --> 00:36:40,244
There are other real concerns
in terms of using
824
00:36:40,285 --> 00:36:42,105
AI in the stock market.
825
00:36:42,156 --> 00:36:44,936
Narrator: One of them lies in
the algorithms that Sarao
826
00:36:44,985 --> 00:36:48,595
was able to exploit for his own
financial benefit.
827
00:36:48,641 --> 00:36:51,471
Morgan: When Investigators
asked him how he did this,
828
00:36:51,513 --> 00:36:53,213
he said he had found
a flaw in the AI's
829
00:36:53,254 --> 00:36:55,564
high-frequency trading
algorithms.
830
00:36:55,604 --> 00:36:56,694
Badminton:
When it made decisions,
831
00:36:56,736 --> 00:36:58,686
the market would react
in the same direction.
832
00:36:58,738 --> 00:37:00,738
He could take advantage of that.
833
00:37:00,783 --> 00:37:02,873
Narrator: Sarao's solution
was to construct
834
00:37:02,916 --> 00:37:05,696
his own algorithmic
trading program.
835
00:37:05,745 --> 00:37:08,095
Pringle: His program could carry
out large volumes of trades
836
00:37:08,138 --> 00:37:10,708
on his behalf, which would
cause the big firms' AIs
837
00:37:10,750 --> 00:37:13,230
to react by selling
off similar stocks.
838
00:37:13,274 --> 00:37:14,934
Morgan: Then later in
the afternoon,
839
00:37:14,971 --> 00:37:17,711
he buys back the stocks
at a reduced price.
840
00:37:17,757 --> 00:37:21,327
And he cancels his trades
before the AI can notice.
841
00:37:21,369 --> 00:37:23,419
Narrator: Investigators
later discover
842
00:37:23,458 --> 00:37:26,158
that Sarao's algorithm
replaced or modified
843
00:37:26,200 --> 00:37:28,590
some 19,000 orders
844
00:37:28,637 --> 00:37:31,677
before they were ultimately
canceled that day.
845
00:37:31,727 --> 00:37:33,377
Badminton:
He found the flaw in the AI.
846
00:37:33,425 --> 00:37:35,635
Like the human beings
that program it,
847
00:37:35,688 --> 00:37:37,598
he knew it wasn't perfect,
and he was able to
848
00:37:37,646 --> 00:37:39,816
beat the firms
at their own game.
849
00:37:39,866 --> 00:37:42,696
Narrator: Sarao and the others
who exploit the technology's
850
00:37:42,738 --> 00:37:46,128
flaws and weaknesses are
perhaps not solely responsible
851
00:37:46,176 --> 00:37:48,526
for the financial
damage they cause.
852
00:37:48,570 --> 00:37:50,360
Pringle: This is an issue
many have raised:
853
00:37:50,398 --> 00:37:52,618
Sarao is a criminal, yes,
854
00:37:52,661 --> 00:37:55,971
but the big financial
firms are to blame as well.
855
00:37:56,012 --> 00:37:59,582
Shouldn't they have foreseen
something like this happening?
856
00:37:59,625 --> 00:38:01,705
Narrator: Many technology
experts think they know
857
00:38:01,757 --> 00:38:03,847
the reason why they didn't.
858
00:38:03,890 --> 00:38:06,240
Badminton: There may be an
overreliance on this technology.
859
00:38:06,284 --> 00:38:08,594
And as a result, there will
likely be more Flash Crashes
860
00:38:08,634 --> 00:38:11,854
on the horizon unless its
flaws are addressed.
861
00:38:11,898 --> 00:38:13,858
Narrator: Some industry
observers believe
862
00:38:13,900 --> 00:38:15,900
something similar could
happen again,
863
00:38:15,945 --> 00:38:18,075
or potentially even worse,
864
00:38:18,121 --> 00:38:21,651
such as triggering a complete
economic collapse.
865
00:38:21,690 --> 00:38:24,480
But AI advocates argue
that in the long run,
866
00:38:24,519 --> 00:38:26,299
we are all better off.
867
00:38:26,347 --> 00:38:28,087
Morgan: They cite examples
where the tech has saved
868
00:38:28,131 --> 00:38:31,441
the general public literally
millions of dollars.
869
00:38:31,483 --> 00:38:34,443
They also say that hedge fund
managers and high-frequency
870
00:38:34,486 --> 00:38:39,266
traders are making better
decisions because of it.
871
00:38:39,317 --> 00:38:40,747
Pringle: There has also
been a more concerted effort
872
00:38:40,796 --> 00:38:43,056
on the part of Government
agencies to punish people
873
00:38:43,103 --> 00:38:45,543
who illegally exploit
this technology.
874
00:38:45,584 --> 00:38:48,804
Narrator: Cybercriminals like
Navinder Singh Sarao do get caught.
875
00:38:48,848 --> 00:38:50,808
Having been extradited
to the US,
876
00:38:50,850 --> 00:38:53,460
on Jan 28, 2020,
877
00:38:53,505 --> 00:38:57,155
Sarao strikes a
deal in a Chicago court.
878
00:38:57,204 --> 00:38:59,554
He will blow the whistle
on other scammers
879
00:38:59,598 --> 00:39:02,118
in exchange for a
lenient sentence,
880
00:39:02,165 --> 00:39:08,255
one year of home detention with
his parents back in the UK.
881
00:39:08,302 --> 00:39:11,352
But in a bizarre twist of fate,
before he was arrested,
882
00:39:11,392 --> 00:39:14,482
Sarao was conned out of nearly
all the money he made.
883
00:39:14,526 --> 00:39:16,876
He is now penniless.
884
00:39:16,919 --> 00:39:20,359
Morgan: But many still see
Sarao as some kind of folk hero,
885
00:39:20,401 --> 00:39:22,581
one who was willing and able
886
00:39:22,621 --> 00:39:25,891
to take on the big Wall Street
firms and win.
887
00:39:25,928 --> 00:39:28,408
Narrator: Many believe that
in the near future,
888
00:39:28,453 --> 00:39:32,073
A.I. may manage virtually all
of our money-making decisions,
889
00:39:32,108 --> 00:39:34,718
and they worry about what
impact that may have
890
00:39:34,763 --> 00:39:36,943
on the world's
financial markets.
891
00:39:36,983 --> 00:39:39,943
Could a Flash Crash
happen again,
892
00:39:39,986 --> 00:39:44,856
but this time with permanent,
devastating consequences?
893
00:39:44,904 --> 00:39:48,564
As long as there are people like
Navinder Singh Sarao out there,
894
00:39:48,603 --> 00:39:51,523
looking to exploit the system
with technology,
895
00:39:51,563 --> 00:40:05,793
anything's possible.
896
00:40:05,838 --> 00:40:08,228
Narrator: On the northwestern
edge of Tanzania's
897
00:40:08,275 --> 00:40:11,015
world-renowned Serengeti
National Park
898
00:40:11,060 --> 00:40:13,800
lies the majestic
Grumeti Game Reserve.
899
00:40:13,846 --> 00:40:17,846
This 350,000-acre protected area
900
00:40:17,893 --> 00:40:21,513
is an essential component of
the Serengeti-Mara ecosystem
901
00:40:21,549 --> 00:40:25,029
and home to what's known
as the Great Migration.
902
00:40:25,074 --> 00:40:27,214
Morgan: The Great Migration
is one of the most
903
00:40:27,250 --> 00:40:30,170
spectacular natural phenomena
on the planet.
904
00:40:30,210 --> 00:40:33,130
Every year, millions of animals
follow a clockwise
905
00:40:33,169 --> 00:40:36,779
migration pattern through
Tanzania and Kenya.
906
00:40:36,825 --> 00:40:39,735
Narrator: In order to safeguard
the migration routes, the
907
00:40:39,785 --> 00:40:44,435
Tanzanian government created the
Grumeti Game Reserve in 1994.
908
00:40:44,485 --> 00:40:47,265
There are many challenges facing
the region, but one of the
909
00:40:47,314 --> 00:40:53,364
biggest threats is a human one:
illegal hunting.
910
00:40:53,407 --> 00:40:56,317
Alexander: Poaching is a huge
problem in this part of Africa.
911
00:40:56,366 --> 00:40:59,666
Some reports have indicated that
at least 200,000 animals
912
00:40:59,718 --> 00:41:03,638
are killed every year in the
western Serengeti alone.
913
00:41:03,678 --> 00:41:06,638
Narrator: Combating
poaching is no easy task.
914
00:41:06,681 --> 00:41:08,861
It's often left up
to Park Rangers,
915
00:41:08,901 --> 00:41:10,641
who are understaffed
and ill-equipped
916
00:41:10,685 --> 00:41:13,165
to patrol huge tracts of land.
917
00:41:13,209 --> 00:41:15,999
The work is also
exceptionally dangerous.
918
00:41:16,038 --> 00:41:18,038
Morgan: In 2019 and 2020
919
00:41:18,084 --> 00:41:20,094
more than 100 Park Rangers
around the world
920
00:41:20,129 --> 00:41:21,959
died in the line of duty.
921
00:41:22,001 --> 00:41:25,741
Many of them were
murdered by poachers.
922
00:41:25,787 --> 00:41:28,007
Narrator: For years, Park
Rangers have been fighting
923
00:41:28,050 --> 00:41:30,970
a losing battle against an
evasive enemy
924
00:41:31,010 --> 00:41:34,620
that will seemingly stop at
nothing to achieve its goals.
925
00:41:34,666 --> 00:41:37,446
And it's not just a matter
of stopping impoverished
926
00:41:37,495 --> 00:41:40,365
local villagers from killing
protected animals;
927
00:41:40,410 --> 00:41:42,670
there are well-armed, ruthless,
928
00:41:42,717 --> 00:41:48,377
organized crime syndicates
lurking in the shadows.
929
00:41:48,418 --> 00:41:50,198
Aitken: Wildlife
crime is big business.
930
00:41:50,246 --> 00:41:52,026
Some estimates show that it
could be worth roughly
931
00:41:52,074 --> 00:41:53,864
$20 billion per year,
932
00:41:53,902 --> 00:41:55,822
ranked in criminal value
only behind drugs,
933
00:41:55,861 --> 00:41:58,041
weapons and human trafficking.
934
00:41:58,080 --> 00:42:01,480
Narrator: So what can be done to
fight this senseless slaughter?
935
00:42:01,519 --> 00:42:04,519
More and more organizations
are turning to technology
936
00:42:04,565 --> 00:42:07,475
that may just provide a glimpse
of what the future
937
00:42:07,525 --> 00:42:10,175
of anti-poaching
measures looks like.
938
00:42:10,223 --> 00:42:13,183
On a quiet January night in the
operations room
939
00:42:13,226 --> 00:42:15,046
at the Grumeti Game Reserve,
940
00:42:15,097 --> 00:42:18,267
a grainy photograph of
what appears to be a man
941
00:42:18,318 --> 00:42:20,358
carrying some equipment
on his shoulders
942
00:42:20,407 --> 00:42:24,367
is sent from a remote camera
a few kilometers away.
943
00:42:24,411 --> 00:42:26,811
Morgan: These cameras are really
remarkable pieces of
technology.
944
00:42:26,848 --> 00:42:28,808
Like they are the size
of your index finger,
945
00:42:28,850 --> 00:42:31,680
which means that they're really
easy to hide in the bush.
946
00:42:31,723 --> 00:42:33,943
Each one of them has its own
internal processor
947
00:42:33,986 --> 00:42:36,596
with an image recognition
algorithm.
948
00:42:36,641 --> 00:42:39,471
Narrator: The system, called
TrailGuard AI,
949
00:42:39,513 --> 00:42:41,383
was invented by Steve Gulick,
950
00:42:41,428 --> 00:42:43,338
founder of Wildlife Security,
951
00:42:43,386 --> 00:42:45,256
and developed with RESOLVE,
952
00:42:45,301 --> 00:42:48,131
a U.S.-based non-profit
organization.
953
00:42:48,174 --> 00:42:51,614
TrailGuard's primary goal is to
identify and catch potential
954
00:42:51,656 --> 00:42:54,916
poachers before they have
the opportunity to kill.
955
00:42:54,963 --> 00:42:56,793
Alexander: In order
to develop the algorithm,
956
00:42:56,835 --> 00:42:59,355
engineers fed it hundreds of
thousands of pictures
957
00:42:59,402 --> 00:43:02,582
and taught the AI system to
identify unusual activity.
958
00:43:02,623 --> 00:43:04,973
Using this knowledge,
the program is able to
959
00:43:05,017 --> 00:43:08,017
assess the content of
images and make decisions.
960
00:43:08,063 --> 00:43:10,333
Aitken: The cameras use
a passive infrared sensor
961
00:43:10,370 --> 00:43:12,760
that switches them on whenever
they detect movement.
962
00:43:12,807 --> 00:43:15,237
Once they have captured
an image, the algorithm
963
00:43:15,288 --> 00:43:17,548
scans the photo for telltale
signs of poaching.
964
00:43:17,595 --> 00:43:19,155
Things like unauthorized
vehicles,
965
00:43:19,205 --> 00:43:22,945
a person carrying
a gun or supplies.
966
00:43:22,991 --> 00:43:25,211
Narrator: TrailGuard only
transmits the pictures
967
00:43:25,254 --> 00:43:28,044
it has flagged as potential
threats to authorities,
968
00:43:28,083 --> 00:43:30,613
which not only preserves the
unit's battery life,
969
00:43:30,651 --> 00:43:32,611
but also keeps Park Rangers
970
00:43:32,653 --> 00:43:35,703
from being inundated
with unnecessary alerts.
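
A minimal Python sketch of that edge-filtering idea follows. The classify function is only a stand-in for TrailGuard's image-recognition model, which is not public, and the labels and threshold are assumptions.

# The camera wakes on motion, classifies the frame locally, and
# transmits only frames flagged as potential threats.

THREAT_LABELS = {"person", "vehicle", "weapon"}

def classify(image):
    # Placeholder: a real model returns (label, confidence) pairs.
    return [("person", 0.91)]

def on_motion_detected(image, send_alert, threshold=0.8):
    detections = classify(image)
    flagged = [(label, conf) for label, conf in detections
               if label in THREAT_LABELS and conf >= threshold]
    if flagged:
        send_alert(image, flagged)   # transmit only flagged frames
    # Otherwise the frame is dropped, saving battery and bandwidth.
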
971
00:43:35,743 --> 00:43:38,053
Due to the remoteness
of the Game Reserve,
972
00:43:38,093 --> 00:43:41,533
figuring out how to relay the
images to the control room
973
00:43:41,575 --> 00:43:44,355
was an obstacle in the early
versions of the system.
974
00:43:44,404 --> 00:43:45,844
Morgan: At first,
the images were sent over a
975
00:43:45,884 --> 00:43:48,024
simple 3G cellular network, but
976
00:43:48,060 --> 00:43:50,020
the Reserve is kind of out
in the middle of nowhere,
977
00:43:50,062 --> 00:43:51,852
which means that the reception
978
00:43:51,890 --> 00:43:54,460
isn't very reliable.
979
00:43:54,501 --> 00:43:56,421
Narrator: As a solution,
the team set up
980
00:43:56,459 --> 00:43:58,459
small satellite transmitters
981
00:43:58,505 --> 00:44:00,985
that convey information
through LoRa,
982
00:44:01,029 --> 00:44:04,769
a wireless technology adept
at long-range transmission.
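
The Python sketch below shows roughly how a flagged alert might be packed for such a low-bandwidth link. The radio object and its send method are hypothetical; a real deployment would use the API of its specific LoRa module.

import json

def send_lora_alert(radio, camera_id, labels, max_bytes=200):
    # LoRa payloads are tiny, so this sketch sends compact metadata
    # about the detection rather than the full image.
    payload = json.dumps({"cam": camera_id, "labels": labels})
    radio.send(payload.encode("utf-8")[:max_bytes])
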
983
00:44:04,816 --> 00:44:06,556
Aitken: Once the
communication issue was fixed,
984
00:44:06,600 --> 00:44:09,040
it was up to the Park Rangers to
figure out where to station
985
00:44:09,081 --> 00:44:11,741
the AI units to
maximize their potential.
986
00:44:11,779 --> 00:44:13,129
Narrator: The cameras were
placed along
987
00:44:13,172 --> 00:44:15,742
established routes
frequented by poachers
988
00:44:15,783 --> 00:44:18,093
and some of the early
signs were encouraging.
989
00:44:18,133 --> 00:44:20,353
As with many new
technological systems,
990
00:44:20,396 --> 00:44:22,266
there were bumps in the road.
991
00:44:22,311 --> 00:44:24,401
Aitken: The algorithms sometimes
had trouble identifying
992
00:44:24,444 --> 00:44:26,494
poachers who were carrying
meat on their shoulders.
993
00:44:26,533 --> 00:44:29,753
The shape didn't appear human to
the AI and no alerts were sent.
994
00:44:29,797 --> 00:44:32,447
Alexander: It was also time-
consuming to set up the units,
995
00:44:32,495 --> 00:44:34,715
and people in the community
simply passing by
996
00:44:34,759 --> 00:44:37,759
could compromise the positions.
997
00:44:37,805 --> 00:44:40,415
Narrator: As the development
group continues to fine-tune
998
00:44:40,460 --> 00:44:42,640
the system in the hopes of
deploying TrailGuard
999
00:44:42,680 --> 00:44:44,510
in parks around the world,
1000
00:44:44,551 --> 00:44:47,551
the Special Operations Team at
the Grumeti Game Reserve
1001
00:44:47,597 --> 00:44:50,907
is about to put the system's
efficacy to the test.
1002
00:44:50,949 --> 00:44:53,599
After determining that the man
in the grainy photo
1003
00:44:53,647 --> 00:44:56,957
sent from one of the remote
cameras is a potential poacher,
1004
00:44:56,998 --> 00:45:00,128
possibly en route to a camp
established by accomplices,
1005
00:45:00,175 --> 00:45:03,085
the team springs into action.
1006
00:45:03,135 --> 00:45:04,785
Morgan: The image
captured by TrailGuard
1007
00:45:04,832 --> 00:45:06,972
could turn out to be
a key piece of evidence
1008
00:45:07,008 --> 00:45:09,878
if the poachers are ever
caught and prosecuted.
1009
00:45:09,924 --> 00:45:12,274
Narrator: The TrailGuard
system isn't the only
1010
00:45:12,318 --> 00:45:14,488
Artificial Intelligence-
based weapon
1011
00:45:14,537 --> 00:45:19,497
that's being deployed in the
fight against poaching.
1012
00:45:19,542 --> 00:45:21,892
A team from Harvard University
has developed
1013
00:45:21,936 --> 00:45:25,936
the Protection Assistant for
Wildlife Security, or PAWS,
1014
00:45:25,984 --> 00:45:29,164
a program that analyses
historical poaching data
1015
00:45:29,204 --> 00:45:33,774
to predict where and when
poachers are likely to be found.
1016
00:45:33,818 --> 00:45:36,118
Aitken: To forecast poaching
activity, the PAWS system
1017
00:45:36,168 --> 00:45:38,558
uses game theory, which is kind
of a blanket term
1018
00:45:38,605 --> 00:45:40,995
for predicting decision-making
based on two parties
1019
00:45:41,042 --> 00:45:43,702
trying to ensure the best
possible outcome for themselves.
1020
00:45:43,741 --> 00:45:45,741
Alexander: Game theory
has many applications:
1021
00:45:45,786 --> 00:45:47,876
economics, war, politics.
1022
00:45:47,919 --> 00:45:49,749
The idea is essentially
1023
00:45:49,790 --> 00:45:52,310
that people instinctively do
what's best for themselves,
1024
00:45:52,358 --> 00:45:56,008
even if it's detrimental
to others.
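
A toy Python example of that idea, with invented payoffs: a ranger and a poacher each pick a zone, and each side chooses the action with the best worst-case outcome (a simple maximin rule).

# ranger_payoff[r][p]: payoff to the ranger when the ranger patrols
# zone r and the poacher hunts zone p (catch = +1, miss = -1).
ranger_payoff = {
    "north": {"north": 1, "south": -1},
    "south": {"north": -1, "south": 1},
}

def maximin(payoffs):
    # Pick the action whose worst-case outcome is best.
    return max(payoffs, key=lambda a: min(payoffs[a].values()))

print(maximin(ranger_payoff))   # ties in this symmetric toy game
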
1025
00:45:56,057 --> 00:45:57,707
Narrator: PAWS collects
data from the
1026
00:45:57,755 --> 00:46:00,275
Spatial Monitoring
and Reporting Tool,
1027
00:46:00,322 --> 00:46:02,852
SMART for short,
an open-source log
1028
00:46:02,890 --> 00:46:06,420
used by over 800 national parks
worldwide.
1029
00:46:06,459 --> 00:46:10,329
SMART aggregates instances of
illegal activity observed
1030
00:46:10,376 --> 00:46:13,636
by Park Rangers on patrol
and the PAWS algorithm
1031
00:46:13,683 --> 00:46:16,863
uses this information in
conjunction with game theory
1032
00:46:16,904 --> 00:46:19,604
to generate risk maps,
so that authorities
1033
00:46:19,646 --> 00:46:22,336
can make better decisions
on patrol planning.
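
As a simplified sketch of how such a risk map might be built, the Python below scores grid cells by historical snare finds per patrol visit. The observation records are invented, and real SMART data and the PAWS models are far richer than this.

from collections import defaultdict

observations = [   # made-up (grid_cell, snares_found) per patrol
    ((0, 0), 2), ((0, 0), 0), ((1, 1), 5), ((1, 1), 3), ((2, 0), 0),
]

visits = defaultdict(int)
finds = defaultdict(int)
for cell, snares in observations:
    visits[cell] += 1
    finds[cell] += snares

# Estimated risk: snares found per visit, ranked highest first
# to suggest where patrols are most needed.
risk = {cell: finds[cell] / visits[cell] for cell in visits}
for cell, score in sorted(risk.items(), key=lambda kv: -kv[1]):
    print(cell, round(score, 2))
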
1034
00:46:22,388 --> 00:46:23,908
Alexander: The system
is all about maximizing
1035
00:46:23,955 --> 00:46:25,825
the limited resources
of the Parks.
1036
00:46:25,870 --> 00:46:27,960
It's an uphill battle,
but PAWS certainly has
1037
00:46:28,002 --> 00:46:31,402
the potential
to make a difference.
1038
00:46:31,440 --> 00:46:35,100
Narrator: In 2018, PAWS was
field-tested
1039
00:46:35,140 --> 00:46:38,400
at the Srepok Wildlife Sanctuary
in Cambodia,
1040
00:46:38,447 --> 00:46:40,407
a region deemed to be ideal
1041
00:46:40,449 --> 00:46:43,319
for reintroducing tigers
in Southeast Asia.
1042
00:46:43,365 --> 00:46:45,315
Morgan: Sadly, the tiger
population in Asia
1043
00:46:45,367 --> 00:46:48,587
has plummeted over
the last 120 years.
1044
00:46:48,631 --> 00:46:50,631
At the turn of the 20th century,
there were more than
1045
00:46:50,677 --> 00:46:53,067
100,000 tigers in this
part of the world.
1046
00:46:53,114 --> 00:46:55,204
Today, there are fewer than 4,000.
1047
00:46:55,247 --> 00:46:57,157
Aitken: One tiger can be
worth as much as
1048
00:46:57,205 --> 00:46:59,375
$50,000 on the black market.
1049
00:46:59,425 --> 00:47:01,725
Poachers with ties to
organized crime associations
1050
00:47:01,775 --> 00:47:07,125
pose a serious threat to the
tiger's survival as a species.
1051
00:47:07,172 --> 00:47:09,172
Narrator: Over the course of
the first month of trials
1052
00:47:09,217 --> 00:47:13,867
in Cambodia, rangers patrolling
areas recommended by the PAWS AI
1053
00:47:13,918 --> 00:47:18,308
found more than a thousand
snares, double the usual amount.
1054
00:47:18,357 --> 00:47:23,537
They also seized 42 chainsaws,
24 motorbikes and a truck.
1055
00:47:23,579 --> 00:47:25,629
While these results
are encouraging,
1056
00:47:25,668 --> 00:47:27,978
the system still has its flaws.
1057
00:47:28,019 --> 00:47:31,149
One of the main problems facing
the development team
1058
00:47:31,196 --> 00:47:33,196
is that the predictive models
are based on data
1059
00:47:33,241 --> 00:47:35,551
that contains uncertainties.
1060
00:47:35,591 --> 00:47:38,031
Alexander: If a Ranger locates
and logs a snare,
1061
00:47:38,072 --> 00:47:40,162
they have no way of knowing
when poachers set it.
1062
00:47:40,205 --> 00:47:42,375
This harms the relevance
of the information.
1063
00:47:42,424 --> 00:47:44,214
Poaching activity also
fluctuates
1064
00:47:44,252 --> 00:47:45,562
from season to season.
1065
00:47:45,601 --> 00:47:48,171
Data collected during the dry
season isn't applicable
1066
00:47:48,213 --> 00:47:50,913
when making predictions
during the rainy season.
1067
00:47:50,955 --> 00:47:52,955
Aitken: PAWS also has a
common algorithmic problem:
1068
00:47:53,000 --> 00:47:54,610
it can't prove a negative.
1069
00:47:54,654 --> 00:47:57,314
If a Park Ranger doesn't find
a snare in a certain area,
1070
00:47:57,352 --> 00:47:59,312
that doesn't necessarily mean
there wasn't one there,
1071
00:47:59,354 --> 00:48:01,664
maybe they just didn't see it.
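
A simple expected-value adjustment, sketched in Python below, illustrates the problem; it is an assumption for illustration, not PAWS's actual method. If rangers spot a given snare only with probability p, the observed count understates the true number by roughly a factor of p.

def estimated_true_count(observed, detection_prob):
    # Correct an observed count for imperfect detection.
    if not 0 < detection_prob <= 1:
        raise ValueError("detection probability must be in (0, 1]")
    return observed / detection_prob

# With a made-up 60% detection rate, 1,000 snares found suggests
# roughly 1,667 were actually present.
print(round(estimated_true_count(observed=1000, detection_prob=0.6)))
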
1072
00:48:01,704 --> 00:48:04,104
Narrator: Beyond the
technological imperfections,
1073
00:48:04,142 --> 00:48:07,152
critics of systems like PAWS
point out
1074
00:48:07,188 --> 00:48:10,278
that if we really want to put
a stop to illegal hunting,
1075
00:48:10,322 --> 00:48:13,022
more has to be done with
community outreach efforts
1076
00:48:13,064 --> 00:48:15,414
and social justice programs.
1077
00:48:15,457 --> 00:48:17,847
Morgan: Sometimes what gets lost
in the outrage over poaching
1078
00:48:17,895 --> 00:48:19,805
is the sad reality
that some people
1079
00:48:19,853 --> 00:48:21,863
don't have another choice.
1080
00:48:21,899 --> 00:48:25,949
If you've got a hungry family,
this might be your only option.
1081
00:48:25,990 --> 00:48:28,430
Narrator: There are also fears
that organizations
1082
00:48:28,470 --> 00:48:30,780
supporting tech-based programs
1083
00:48:30,820 --> 00:48:34,090
might cause resentment among
Park Rangers, as it could be
1084
00:48:34,128 --> 00:48:37,388
perceived as an insult to their
skills as officers.
1085
00:48:37,436 --> 00:48:39,386
Rangers are a proud group,
1086
00:48:39,438 --> 00:48:42,438
who take great satisfaction in
doing what they consider
1087
00:48:42,484 --> 00:48:46,624
to be noble work, often in the
face of grave danger.
1088
00:48:46,662 --> 00:48:50,142
However, it is becoming clear that
technology combined with
1089
00:48:50,188 --> 00:48:53,708
old-fashioned, boots-on-the-
ground law enforcement can be
1090
00:48:53,756 --> 00:48:57,016
an effective weapon in the war
against illegal hunting.
1091
00:48:57,064 --> 00:48:59,414
In the Grumeti Game Reserve,
picking up
1092
00:48:59,458 --> 00:49:01,368
where the camera identified
the poacher,
1093
00:49:01,416 --> 00:49:04,326
it's now the job of the
rangers to pursue him.
1094
00:49:04,376 --> 00:49:06,326
Alexander: The Grumeti Rangers
use two dogs
1095
00:49:06,378 --> 00:49:08,118
trained to detect human scent,
1096
00:49:08,162 --> 00:49:10,952
and track the poacher's
trail for over nine miles.
1097
00:49:10,991 --> 00:49:14,781
Along the way, they manage to
remove an impressive 34 snares.
1098
00:49:14,821 --> 00:49:17,611
Narrator: The operation results
in the arrest of three men
1099
00:49:17,650 --> 00:49:21,780
found in possession of over a
thousand pounds of bushmeat.
1100
00:49:21,828 --> 00:49:24,048
It's a small victory
for TrailGuard
1101
00:49:24,091 --> 00:49:26,051
and the use of Artificial
Intelligence
1102
00:49:26,093 --> 00:49:29,493
in the fight against poaching,
but a victory nonetheless.
1103
00:49:29,531 --> 00:49:32,141
For conservationists
and the organizations
1104
00:49:32,186 --> 00:49:35,016
that have logged countless hours
developing the systems,
1105
00:49:35,059 --> 00:49:39,499
it's hopefully a sign
of things to come.
1106
00:49:39,541 --> 00:49:42,371
Morgan: While these
initiatives are admirable
1107
00:49:42,414 --> 00:49:45,984
and innovative, it's hard not to
look at our relationship with
1108
00:49:46,026 --> 00:49:49,766
the natural world and wonder
how we got to this point.
1109
00:49:49,812 --> 00:49:52,342
Narrator: Are we doing enough
to reverse the damage
1110
00:49:52,380 --> 00:49:55,780
that humans have done
to our fellow species?
1111
00:49:55,818 --> 00:49:59,688
Can technology have a tangible
impact on preserving wildlife
1112
00:49:59,735 --> 00:50:01,645
for future generations?
1113
00:50:01,694 --> 00:50:04,314
Hopefully the answer is yes,
1114
00:50:04,349 --> 00:50:05,999
but there's no magic bullet
1115
00:50:06,046 --> 00:50:08,306
and the window is closing.
1116
00:50:08,353 --> 00:50:11,363
Something needs to be done
before it's too late,
1117
00:50:11,399 --> 00:50:14,919
and proponents of Artificial
Intelligence-based solutions
1118
00:50:14,968 --> 00:50:17,798
believe that the more
we turn to technology,
1119
00:50:17,840 --> 00:50:20,020
the better off
the planet will be.