1
00:00:23,371 --> 00:00:25,371
Narrator: Just east
of Tehran, Iran,
2
00:00:25,416 --> 00:00:28,286
shots ring out as a brazen
daylight assassination
3
00:00:28,332 --> 00:00:31,072
is carried out, killing top
Iranian nuclear scientist
4
00:00:31,118 --> 00:00:32,768
Mohsen Fakhrizadeh.
5
00:00:32,815 --> 00:00:35,205
As the gunfire ceases,
there is no one around
6
00:00:35,252 --> 00:00:36,782
with their finger
on the trigger.
7
00:00:36,819 --> 00:00:38,559
Ramona Pringle: Strangely, the
bodyguards didn't find anybody
8
00:00:38,603 --> 00:00:41,393
in the area, so who
killed Fakhrizadeh,
9
00:00:41,432 --> 00:00:45,922
and how exactly did
they pull it off?
10
00:00:45,958 --> 00:00:48,348
Narrator: On a quiet
night in Arizona, USA,
11
00:00:48,396 --> 00:00:50,526
one of the marvels of
modern technology,
12
00:00:50,572 --> 00:00:53,442
a driverless car,
is out for a test drive,
13
00:00:53,488 --> 00:00:56,268
when it fails to recognize a
pedestrian in its path.
14
00:00:56,317 --> 00:00:57,797
Nikolas Badminton: Sensors only
recognized the bicycle
15
00:00:57,840 --> 00:00:59,150
she was pushing
16
00:00:59,189 --> 00:01:04,109
and then they could only
recommend emergency braking.
17
00:01:04,151 --> 00:01:05,801
Narrator: At a
famous auction house,
18
00:01:05,848 --> 00:01:08,888
a French portrait created by
Artificial Intelligence
19
00:01:08,938 --> 00:01:12,028
goes up for sale, stoking
controversy in the art world.
20
00:01:12,072 --> 00:01:14,422
Pringle: Some in the
art establishment
21
00:01:14,465 --> 00:01:16,985
weren't exactly thrilled to see
a computer generated painting
22
00:01:17,033 --> 00:01:18,383
up for auction.
23
00:01:18,426 --> 00:01:19,506
Mhairi Aitken: But the
suggestion that the piece
24
00:01:19,557 --> 00:01:20,987
was created with no
human involvement
25
00:01:21,037 --> 00:01:22,467
is overly simplistic.
26
00:01:22,517 --> 00:01:24,347
Somebody had to write
the code that created it.
27
00:01:24,388 --> 00:01:29,388
♪
28
00:01:29,437 --> 00:01:31,127
Narrator: These are the stories
of the future
29
00:01:31,178 --> 00:01:34,138
that big data is bringing
to our doorsteps.
30
00:01:34,181 --> 00:01:37,661
The real world impact of
predictions and surveillance.
31
00:01:37,706 --> 00:01:39,966
The power of artificial
intelligence
32
00:01:40,012 --> 00:01:41,882
and autonomous machines.
33
00:01:41,927 --> 00:01:44,577
For better or for worse, these
34
00:01:44,626 --> 00:01:55,806
are the Secrets of Big Data.
35
00:01:55,854 --> 00:01:57,554
Mohsen Fakhrizadeh,
36
00:01:57,595 --> 00:02:00,155
the head of Iran's
nuclear weapons program,
37
00:02:00,207 --> 00:02:02,427
and the country's
deputy defence minister,
38
00:02:02,470 --> 00:02:05,650
is driving with his wife to
his vacation house in Absard,
39
00:02:05,690 --> 00:02:08,130
around 80 kilometers
east of Tehran,
40
00:02:08,171 --> 00:02:10,521
when a sudden burst
of gunfire rings out
41
00:02:10,565 --> 00:02:12,435
across the quiet countryside.
42
00:02:12,480 --> 00:02:14,260
The bullets tear through
the hood of his car.
43
00:02:14,308 --> 00:02:16,788
As Fakhrizadeh swerves
and tries to stop,
44
00:02:16,832 --> 00:02:19,402
another round of shots
shatters the windshield.
45
00:02:19,443 --> 00:02:22,493
Now wounded, he exits the car
and tries in vain
46
00:02:22,533 --> 00:02:24,413
to hide behind the open door.
47
00:02:24,448 --> 00:02:27,578
A third torrent of shots tears
through Fakhrizadeh's body,
48
00:02:27,625 --> 00:02:32,405
killing him instantly.
49
00:02:32,456 --> 00:02:34,066
Anthony Morgan: This was a
brazen assassination
50
00:02:34,110 --> 00:02:37,720
of a top government official.
51
00:02:37,766 --> 00:02:39,376
Narrator: Fakhrizadeh's
security detail
52
00:02:39,420 --> 00:02:43,030
that had been tailing his car
rushes to the scene, guns drawn,
53
00:02:43,075 --> 00:02:45,465
ready to engage
with the attacker.
54
00:02:45,513 --> 00:02:47,253
Pringle: Strangely, the
bodyguards didn't find anybody
55
00:02:47,297 --> 00:02:49,997
in the area.
So who killed Fakhrizadeh?
56
00:02:50,039 --> 00:02:53,429
And how exactly did
they pull it off?
57
00:02:53,477 --> 00:02:55,217
Narrator: The answer
sits in an office
58
00:02:55,262 --> 00:02:57,312
far away from the carnage,
59
00:02:57,351 --> 00:03:00,481
staring at a computer screen as
though playing a video game.
60
00:03:00,528 --> 00:03:02,308
But this is no video game.
61
00:03:02,356 --> 00:03:05,186
This is a complex
military operation.
62
00:03:05,228 --> 00:03:08,188
Morgan: The assassin
ended Fakhrizadeh's life
63
00:03:08,231 --> 00:03:12,541
without ever setting foot
on Iranian soil.
64
00:03:12,583 --> 00:03:15,243
Narrator: Unmanned military
weaponry is nothing new.
65
00:03:15,282 --> 00:03:18,112
As far back as the 1930s,
the U.S. Navy
66
00:03:18,154 --> 00:03:20,904
was experimenting with
unmanned aerial vehicles,
67
00:03:20,939 --> 00:03:23,119
commonly known as drones.
68
00:03:23,159 --> 00:03:25,289
But it wasn't until the
Vietnam War
69
00:03:25,335 --> 00:03:27,895
that they were deployed
extensively.
70
00:03:27,946 --> 00:03:29,636
Aitken: Early drone models
were used mainly
71
00:03:29,687 --> 00:03:32,597
for reconnaissance and acted
as decoys during battles.
72
00:03:32,647 --> 00:03:35,127
They also contributed to US
propaganda campaigns,
73
00:03:35,171 --> 00:03:38,001
dropping leaflets in
enemy territory.
74
00:03:38,043 --> 00:03:40,923
Narrator: Even though the US
eventually succeeded
75
00:03:40,959 --> 00:03:43,789
in mass-producing drones for
military purposes,
76
00:03:43,832 --> 00:03:47,142
they were often viewed as
costly and unreliable.
77
00:03:47,183 --> 00:03:50,713
However, in 1982 this
attitude shifted
78
00:03:50,752 --> 00:03:53,152
when Israel defeated
Syrian forces using
79
00:03:53,189 --> 00:03:56,579
unmanned aircraft and suffered
very few casualties.
80
00:03:56,627 --> 00:03:58,797
[explosion]
81
00:03:58,847 --> 00:04:00,587
Pringle: The United States
sat up and took notice
82
00:04:00,631 --> 00:04:03,901
of the success Israel had with
drones and decided to redouble
83
00:04:03,939 --> 00:04:08,159
their efforts to further
develop their own technology.
84
00:04:08,204 --> 00:04:10,214
Narrator: Weaponized drones
used by US Forces
85
00:04:10,250 --> 00:04:13,470
made their debut in Afghanistan.
86
00:04:13,514 --> 00:04:15,394
The first unmanned aerial
missile strike
87
00:04:15,429 --> 00:04:18,479
happened in October 2001,
88
00:04:18,519 --> 00:04:20,259
when the US attempted to
assassinate
89
00:04:20,303 --> 00:04:22,613
Taliban leader Mullah Omar.
90
00:04:22,653 --> 00:04:24,263
[explosion]
91
00:04:24,307 --> 00:04:26,607
The missile failed to find
its intended target,
92
00:04:26,657 --> 00:04:28,357
but several bodyguards
were killed
93
00:04:28,398 --> 00:04:30,748
in a vehicle near
Omar's compound.
94
00:04:30,792 --> 00:04:33,192
Aitken: These early
attack drones were reliant on a
95
00:04:33,229 --> 00:04:35,969
remote human operator to control
them and launch their weapons,
96
00:04:36,014 --> 00:04:38,324
but today the technology has
advanced to the point
97
00:04:38,365 --> 00:04:42,845
where full autonomous
capabilities are possible.
98
00:04:42,891 --> 00:04:45,201
Narrator: Armed with Artificial
Intelligence systems,
99
00:04:45,241 --> 00:04:47,851
the latest unmanned
weapons are self-guiding
100
00:04:47,896 --> 00:04:51,466
and can even attack without
human intervention.
101
00:04:51,508 --> 00:04:54,508
In the aftermath of the attack
on Fakhrizadeh in Iran,
102
00:04:54,555 --> 00:04:56,635
there is much confusion as
103
00:04:56,687 --> 00:04:59,557
his security detail tries to
figure out what happened.
104
00:04:59,603 --> 00:05:01,263
Morgan: The immediate
assumption was that a drone
105
00:05:01,301 --> 00:05:03,221
carried out the execution.
106
00:05:03,259 --> 00:05:06,829
Drones are often used in
complex, high-risk operations.
107
00:05:06,871 --> 00:05:08,831
Pringle: A drone assault
seems unlikely in this case
108
00:05:08,873 --> 00:05:10,793
because they mostly
deploy missiles.
109
00:05:10,832 --> 00:05:13,922
This was a high caliber
machine gun attack.
110
00:05:13,965 --> 00:05:16,225
Narrator: The chaos is
heightened when moments later,
111
00:05:16,272 --> 00:05:19,412
a pickup truck parked nearby
explodes,
112
00:05:19,449 --> 00:05:21,409
raining debris down on the road.
113
00:05:21,451 --> 00:05:25,151
In the smoldering wreckage,
Fakhrizadeh's bodyguards
114
00:05:25,194 --> 00:05:27,504
notice what appears to be
a large machine gun
115
00:05:27,544 --> 00:05:29,554
attached to a robotic apparatus.
116
00:05:29,590 --> 00:05:32,380
Quite literally,
the smoking gun.
117
00:05:32,419 --> 00:05:34,159
Morgan: The
assassination was carried out
118
00:05:34,203 --> 00:05:36,343
with a remote-controlled
machine gun.
119
00:05:36,379 --> 00:05:39,899
It was likely assisted
by some form of AI.
120
00:05:39,948 --> 00:05:42,338
Narrator: The world's
superpowers are developing
121
00:05:42,385 --> 00:05:45,345
Artificial Intelligence
applications for an array
122
00:05:45,388 --> 00:05:50,218
of military functions
at breakneck speeds.
123
00:05:50,262 --> 00:05:51,742
In recent simulated dogfights
124
00:05:51,786 --> 00:05:54,046
tested by the United States
military,
125
00:05:54,092 --> 00:05:56,572
AI fighter pilot
systems outperformed
126
00:05:56,617 --> 00:06:00,397
their human counterparts
by a significant margin.
127
00:06:00,447 --> 00:06:02,487
Morgan: AI pilot systems
use something called
128
00:06:02,536 --> 00:06:04,276
deep reinforcement learning.
129
00:06:04,320 --> 00:06:07,320
That's when they are repeatedly
put in combat situations
130
00:06:07,367 --> 00:06:09,407
and are rewarded for
successful actions
131
00:06:09,456 --> 00:06:13,236
and penalized for
unsuccessful ones.
132
00:06:13,285 --> 00:06:15,935
Narrator: In the early stages,
AI systems are merely trying
133
00:06:15,984 --> 00:06:18,204
to keep their aircraft
from crashing,
134
00:06:18,247 --> 00:06:20,467
but after billions of
different scenarios,
135
00:06:20,510 --> 00:06:22,730
they have become highly
proficient
136
00:06:22,773 --> 00:06:25,123
in air combat techniques.
137
00:06:25,167 --> 00:06:27,257
Morgan: Although this
technology is advancing rapidly,
138
00:06:27,299 --> 00:06:30,039
we are plausibly
years away from seeing
139
00:06:30,085 --> 00:06:34,515
autonomous warplanes in
real combat situations.
140
00:06:34,568 --> 00:06:35,738
Narrator: But this
isn't the case
141
00:06:35,786 --> 00:06:37,786
with other unmanned weaponry.
142
00:06:37,832 --> 00:06:40,012
According to a
United Nations Report,
143
00:06:40,051 --> 00:06:41,711
in March of 2020,
144
00:06:41,749 --> 00:06:44,189
the first autonomous drone
attacks in history
145
00:06:44,229 --> 00:06:47,539
took place on a battlefield
in Libya.
146
00:06:47,581 --> 00:06:51,321
Soldiers loyal to warlord
Khalifa Haftar were retreating
147
00:06:51,367 --> 00:06:54,067
from the Turkish backed forces
of the Libyan government
148
00:06:54,109 --> 00:06:56,679
when they were hunted down
and dive-bombed
149
00:06:56,720 --> 00:06:58,590
by munitions-packed drones
150
00:06:58,635 --> 00:07:00,985
operating without human control.
151
00:07:01,029 --> 00:07:02,599
Aitken: This marks
the first time ever,
152
00:07:02,639 --> 00:07:04,509
that the decision to attack
humans was made by
153
00:07:04,554 --> 00:07:06,644
a machine rather than a person.
154
00:07:06,687 --> 00:07:08,037
These weapons are
what's known as
155
00:07:08,079 --> 00:07:10,429
"loitering munitions"
or kamikaze drones
156
00:07:10,473 --> 00:07:14,133
and are programmed to strike
without connectivity.
157
00:07:14,172 --> 00:07:16,702
Narrator: The incident has
raised serious ethical concerns
158
00:07:16,740 --> 00:07:20,350
among some observers, who feel
that it is morally reprehensible
159
00:07:20,396 --> 00:07:22,786
for machines to make life or
death decisions
160
00:07:22,833 --> 00:07:24,793
without any human input,
161
00:07:24,835 --> 00:07:27,925
and there have been calls for a
ban on the technology.
162
00:07:27,969 --> 00:07:30,059
Pringle: Some would claim that
these fears are exaggerated,
163
00:07:30,101 --> 00:07:32,801
and that humans will ultimately
always be in control,
164
00:07:32,843 --> 00:07:35,543
and the idea of autonomous
machines running amok
165
00:07:35,585 --> 00:07:38,625
and hunting down humans
is a little far fetched.
166
00:07:38,675 --> 00:07:40,845
But what's really scary
is these tools
167
00:07:40,895 --> 00:07:44,155
ending up in the
wrong human hands.
168
00:07:44,202 --> 00:07:46,342
Narrator: Defenders of
unmanned weaponry argue that
169
00:07:46,378 --> 00:07:48,158
armed with the relevant data,
170
00:07:48,206 --> 00:07:51,166
AI-enabled technology
is so advanced
171
00:07:51,209 --> 00:07:53,169
that it can perform tasks on the
battlefield
172
00:07:53,211 --> 00:07:55,651
with pinpoint accuracy,
173
00:07:55,692 --> 00:07:58,262
minimizing collateral damage.
174
00:07:58,303 --> 00:08:01,743
Such is the case in the
assassination of Fakhrizadeh.
175
00:08:01,785 --> 00:08:04,175
Aitken: Fakhrizadeh's wife was
sitting right next to him,
176
00:08:04,222 --> 00:08:06,442
in the passenger's seat
when the killing took place.
177
00:08:06,486 --> 00:08:08,226
Considering that they are
in a moving car,
178
00:08:08,270 --> 00:08:10,360
and the execution was
performed remotely,
179
00:08:10,402 --> 00:08:13,232
it's remarkable that she
escaped without a scratch.
180
00:08:13,275 --> 00:08:16,275
Narrator: The operation is a
master class in AI,
181
00:08:16,321 --> 00:08:18,631
assisted military technology.
182
00:08:18,672 --> 00:08:21,682
The first hurdle that the
assassination team has to clear,
183
00:08:21,718 --> 00:08:24,978
is to positively identify
Fakhrizadeh as the driver,
184
00:08:25,026 --> 00:08:26,846
in real time.
185
00:08:26,897 --> 00:08:30,597
This was achieved by a decoy car
staged to look broken-down,
186
00:08:30,640 --> 00:08:32,820
set up along the
scientist's route.
187
00:08:32,860 --> 00:08:35,170
Morgan: Iranian
officials have speculated
188
00:08:35,210 --> 00:08:37,260
that the decoy car
must have had a camera.
189
00:08:37,299 --> 00:08:39,429
Images from that camera
would then be fed
190
00:08:39,475 --> 00:08:41,215
into a facial
recognition algorithm
191
00:08:41,259 --> 00:08:43,699
to confirm that it was
indeed Fakhrizadeh.
192
00:08:43,740 --> 00:08:46,400
Narrator: The robotic apparatus
and gun are constructed
193
00:08:46,438 --> 00:08:48,528
to fit the back
of a pickup truck.
194
00:08:48,571 --> 00:08:50,971
Several cameras aimed in
different directions
195
00:08:51,008 --> 00:08:52,968
are then attached
to the vehicle,
196
00:08:53,010 --> 00:08:56,190
giving the team a live view of
the whole area.
197
00:08:56,231 --> 00:08:59,971
To top it all off, the truck is
loaded with explosives
198
00:09:00,017 --> 00:09:02,587
so that it could be destroyed
after the assassination,
199
00:09:02,629 --> 00:09:04,809
eliminating any
potential evidence.
200
00:09:04,848 --> 00:09:06,628
Pringle: One of the
biggest challenges
201
00:09:06,676 --> 00:09:08,416
the assassination team faces
202
00:09:08,460 --> 00:09:10,990
is that a machine gun
recoils after every shot,
203
00:09:11,028 --> 00:09:13,638
altering the course of
the bullets that follow.
204
00:09:13,683 --> 00:09:16,033
Narrator: Also complicating
matters is that there is
205
00:09:16,077 --> 00:09:17,947
a one-and-a-half-second delay
206
00:09:17,992 --> 00:09:20,602
between what the command room sees
on their screens,
207
00:09:20,647 --> 00:09:22,607
and what is happening
on the ground.
208
00:09:22,649 --> 00:09:24,479
This may seem insignificant,
209
00:09:24,520 --> 00:09:27,130
but the car is in motion
and it's enough of a lag
210
00:09:27,175 --> 00:09:30,215
for even the best-aimed shot
to miss its mark.
211
00:09:30,265 --> 00:09:34,225
Amazingly, the AI system used
by the assassination team
212
00:09:34,269 --> 00:09:37,009
is programmed to account
for the visual delay,
213
00:09:37,054 --> 00:09:40,014
the gun's shake, and the speed
of Fakhrizadeh's car,
214
00:09:40,057 --> 00:09:43,187
which results in the flawless
execution of the operation
215
00:09:43,234 --> 00:09:45,244
with no collateral damage.
216
00:09:45,280 --> 00:09:47,720
Fakhrizadeh's assassin
pulled the trigger
217
00:09:47,761 --> 00:09:49,461
from the comfort of an office
218
00:09:49,501 --> 00:09:52,501
thousands of miles away,
leading some experts to believe
219
00:09:52,548 --> 00:09:54,768
that this is yet another example
220
00:09:54,811 --> 00:09:58,211
that we are heading towards a
future of "armchair warfare"
221
00:09:58,249 --> 00:10:01,559
where unmanned battlefields are
more and more commonplace.
222
00:10:01,601 --> 00:10:03,081
Aitken: Those in favor
of autonomous weapons
223
00:10:03,124 --> 00:10:05,824
argue that delegating acts of
war to machines makes sense.
224
00:10:05,866 --> 00:10:07,686
And by removing humans
from the firing line,
225
00:10:07,737 --> 00:10:09,347
they might lead
to fewer casualties.
226
00:10:09,391 --> 00:10:11,131
But that argument
largely only accounts
227
00:10:11,175 --> 00:10:14,395
for casualties on one side.
228
00:10:14,439 --> 00:10:16,879
Narrator: In the bigger picture,
there are those that fear
229
00:10:16,920 --> 00:10:19,100
we are entering
a period of escalation
230
00:10:19,140 --> 00:10:21,660
in AI-backed military technology
231
00:10:21,708 --> 00:10:23,748
reminiscent of the
nuclear arms race
232
00:10:23,797 --> 00:10:26,927
between the US and former Soviet
Union during the Cold War,
233
00:10:26,974 --> 00:10:29,504
with China as a third party.
234
00:10:29,541 --> 00:10:32,021
Aitken: We know very little
about what kind of AI-enabled
235
00:10:32,066 --> 00:10:34,546
military technology Russia
or China may have developed.
236
00:10:34,590 --> 00:10:36,160
That secrecy leads to
governments trying to
237
00:10:36,200 --> 00:10:38,550
second-guess
where others are at.
238
00:10:38,594 --> 00:10:41,294
Narrator: Secrecy is indeed
vital to the success
239
00:10:41,336 --> 00:10:43,686
of any sensitive military
operation
240
00:10:43,730 --> 00:10:46,910
and as the burial of Mohsen
Fakhrizadeh took place,
241
00:10:46,950 --> 00:10:49,820
no one had claimed
responsibility for his death,
242
00:10:49,866 --> 00:10:52,166
but eventually
one nation confessed.
243
00:10:52,216 --> 00:10:54,306
Morgan: In the months following
Fakhrizadeh's death,
244
00:10:54,349 --> 00:10:56,389
the former head of the Mossad
all but acknowledged
245
00:10:56,438 --> 00:10:59,618
Israel's responsibility
for the attack.
246
00:10:59,659 --> 00:11:02,179
This isn't exactly a surprise
given that Israel's fear
247
00:11:02,226 --> 00:11:04,786
of a nuclear-armed
Iran is well known.
248
00:11:04,838 --> 00:11:06,968
Narrator: For many years,
the Mossad,
249
00:11:07,014 --> 00:11:09,634
Israel's foreign
intelligence agency,
250
00:11:09,669 --> 00:11:13,279
has been trying to derail
Iran's nuclear program.
251
00:11:13,324 --> 00:11:16,684
So while the nationality of the
perpetrators is unsurprising,
252
00:11:16,719 --> 00:11:19,899
the audacity and use of
technology in the operation
253
00:11:19,940 --> 00:11:23,290
leave some international
observers taken aback.
254
00:11:23,334 --> 00:11:26,994
And the Iranians still had no
idea how exactly the Israelis
255
00:11:27,034 --> 00:11:30,394
got the gun and robotic
attachment into Iran.
256
00:11:30,428 --> 00:11:33,078
Pringle: Using some
old-fashioned military stealth,
257
00:11:33,127 --> 00:11:35,827
the Israelis managed to sneak
the weapon across the border
258
00:11:35,869 --> 00:11:38,829
by dismantling it, and then
smuggling the pieces into Iran
259
00:11:38,872 --> 00:11:41,272
individually, at different
times and places.
260
00:11:41,309 --> 00:11:44,969
The gun was then covertly
rebuilt at an unknown location.
261
00:11:45,008 --> 00:11:47,528
Narrator: Curiously, on the
day of the assassination,
262
00:11:47,576 --> 00:11:49,836
Fakhrizadeh ignored the
recommendation
263
00:11:49,883 --> 00:11:51,973
of his security team
who suggested
264
00:11:52,015 --> 00:11:55,145
they drive him to Absard
in an armoured vehicle.
265
00:11:55,192 --> 00:11:58,072
For some reason, he insisted on
driving himself
266
00:11:58,108 --> 00:11:59,888
and his wife in his own car.
267
00:11:59,936 --> 00:12:02,366
And it ended up
costing him his life.
268
00:12:02,417 --> 00:12:05,247
Pringle: It seems strange that
he would make this decision,
269
00:12:05,289 --> 00:12:07,339
as the Israelis had
made numerous attempts
270
00:12:07,378 --> 00:12:09,688
to kill him in the past.
271
00:12:09,729 --> 00:12:12,209
Narrator: An assassination
squad had prepared an attack
272
00:12:12,253 --> 00:12:15,603
on Fakhrizadeh in Tehran
twelve years earlier,
273
00:12:15,647 --> 00:12:19,037
but the mission was cancelled
at the eleventh hour.
274
00:12:19,086 --> 00:12:20,646
The Mossad believed
the plan was leaked,
275
00:12:20,696 --> 00:12:23,606
and Iranian forces were
ready with an ambush.
276
00:12:23,655 --> 00:12:27,915
This time, Fakhrizadeh
wasn't so lucky.
277
00:12:27,964 --> 00:12:30,364
With the operation
a resounding success,
278
00:12:30,401 --> 00:12:34,011
some experts wonder if the
assassination of Fakhrizadeh
279
00:12:34,057 --> 00:12:36,757
is a glimpse into the future of
military operations,
280
00:12:36,799 --> 00:12:39,579
where data driven Artificial
Intelligence systems
281
00:12:39,628 --> 00:12:41,588
do most of the heavy lifting.
282
00:12:41,630 --> 00:12:43,550
Aitken: It's an area
of significant concern.
283
00:12:43,588 --> 00:12:46,068
But at least in the short term,
most militaries are looking
284
00:12:46,113 --> 00:12:48,033
into Artificial
Intelligence to assist,
285
00:12:48,071 --> 00:12:50,161
rather than replace,
humans in war.
286
00:12:50,204 --> 00:12:51,814
Narrator: But there are
warning signs.
287
00:12:51,858 --> 00:12:53,898
A 2018 research paper
288
00:12:53,947 --> 00:12:56,857
by the non-profit Rand
Corporation declared
289
00:12:56,906 --> 00:12:59,996
that the overuse of Artificial
Intelligence in militaries
290
00:13:00,040 --> 00:13:02,000
could result in an accidental
291
00:13:02,042 --> 00:13:05,002
nuclear war by the year 2040.
292
00:13:05,045 --> 00:13:06,995
Aitken: Inevitably
at some point,
293
00:13:07,047 --> 00:13:09,137
mistakes will happen,
things will go wrong.
294
00:13:09,179 --> 00:13:12,789
And we don't yet have adequate
legal or regulatory
295
00:13:12,835 --> 00:13:15,005
frameworks in place to know
who is accountable, or
296
00:13:15,055 --> 00:13:17,575
who should be held responsible
when mistakes happen.
297
00:13:17,622 --> 00:13:19,892
Narrator: Are we moving
towards a future
298
00:13:19,929 --> 00:13:23,059
where warfare will pit algorithm
against algorithm,
299
00:13:23,106 --> 00:13:26,016
and military dominance will
shift from scale of force
300
00:13:26,066 --> 00:13:29,326
and superior weaponry to
technological factors
301
00:13:29,373 --> 00:13:32,553
like more advanced data
collection and A.I.?
302
00:13:32,594 --> 00:13:35,164
Only time will tell, but if
recent developments
303
00:13:35,205 --> 00:13:37,425
are any indication, it appears
304
00:13:37,468 --> 00:13:40,558
that the future of warfare
is already upon us.
305
00:13:40,602 --> 00:13:42,302
♪
306
00:13:42,343 --> 00:13:50,353
♪ [show theme music]
307
00:13:50,394 --> 00:13:53,444
Narrator: On a quiet
night in March 2018,
308
00:13:53,484 --> 00:13:55,924
the streets of Tempe, Arizona
309
00:13:55,965 --> 00:13:58,835
bear witness to a marvel of
modern technology,
310
00:13:58,881 --> 00:14:02,231
and one of Big Data's most
impressive accomplishments...
311
00:14:02,276 --> 00:14:03,966
The autonomous car.
312
00:14:04,017 --> 00:14:05,497
Morgan: It was an Uber
self-driving car
313
00:14:05,540 --> 00:14:06,890
out for a test drive.
314
00:14:06,933 --> 00:14:09,283
A Volvo XC90 SUV.
315
00:14:09,326 --> 00:14:11,586
It was going just over
40 miles per hour,
316
00:14:11,633 --> 00:14:14,203
which was under the speed limit
for that four lane road.
317
00:14:14,244 --> 00:14:15,994
Narrator: Sitting in the
driver's seat
318
00:14:16,029 --> 00:14:19,339
is a safety backup driver,
Rafaela Vasquez,
319
00:14:19,380 --> 00:14:21,210
a woman with a checkered past,
320
00:14:21,251 --> 00:14:23,511
who was trying to turn her
life around,
321
00:14:23,558 --> 00:14:26,428
and was able to take advantage
of Uber's hiring policy
322
00:14:26,474 --> 00:14:28,874
that gives those in need
a second chance.
323
00:14:28,911 --> 00:14:31,831
Vasquez's job is
to monitor the trip.
324
00:14:31,871 --> 00:14:34,441
Her hands and feet are not
touching the controls
325
00:14:34,482 --> 00:14:37,312
but she is ready to spring
into action if required.
326
00:14:37,354 --> 00:14:39,054
Badminton: An autonomous
vehicle uses
327
00:14:39,095 --> 00:14:41,485
a complex system of
sensor fusion.
328
00:14:41,532 --> 00:14:43,492
It combines data from cameras,
329
00:14:43,534 --> 00:14:46,064
from LiDAR, from GPS,
330
00:14:46,102 --> 00:14:48,632
and brings it all together to
build a picture of the world.
331
00:14:48,670 --> 00:14:51,930
It can then determine how best
to drive that autonomous vehicle
332
00:14:51,978 --> 00:14:54,758
through the spaces
that it takes.
333
00:14:54,806 --> 00:14:57,156
Kris Alexander: So you have
massive amounts of information
334
00:14:57,200 --> 00:14:58,990
being fed into an
onboard computer
335
00:14:59,028 --> 00:15:01,338
with incredible processing
capability.
336
00:15:01,378 --> 00:15:04,078
The system's autonomous
algorithms process everything
337
00:15:04,120 --> 00:15:07,250
the vehicle encounters
and use it to navigate.
338
00:15:07,297 --> 00:15:09,597
Narrator: The algorithms
are programmed to use
339
00:15:09,647 --> 00:15:12,387
machine-learning, and gain
intelligence in each moment,
340
00:15:12,433 --> 00:15:14,573
as they process millions
of simulated
341
00:15:14,609 --> 00:15:16,609
and real-world scenarios.
342
00:15:16,654 --> 00:15:20,834
In 2016, Uber, envisioning a
future where it would replace
343
00:15:20,876 --> 00:15:23,916
its human drivers with machines,
began test-driving
344
00:15:23,966 --> 00:15:27,226
a fleet of nine self-driving
vehicles in Arizona.
345
00:15:27,274 --> 00:15:28,934
Morgan: Uber
recognized that human error
346
00:15:28,971 --> 00:15:30,931
is the cause of many accidents.
347
00:15:30,973 --> 00:15:33,193
And they dedicated themselves
to improving safety
348
00:15:33,236 --> 00:15:35,456
for self-driving vehicles.
349
00:15:35,499 --> 00:15:38,809
Narrator: Uber's self-driving
car is performing perfectly,
350
00:15:38,850 --> 00:15:41,550
until a figure suddenly
appears in the darkness.
351
00:15:41,592 --> 00:15:43,992
[distant sirens]
352
00:15:44,030 --> 00:15:46,380
Morgan: A woman pushing
a bicycle was crossing the road
353
00:15:46,423 --> 00:15:48,823
just in the Volvo's path
and for some reason,
354
00:15:48,860 --> 00:15:51,470
the car didn't stop.
355
00:15:51,515 --> 00:15:54,075
Narrator: The explanation
for this failure lies deep
356
00:15:54,127 --> 00:15:56,127
in the state-of-the-art
technology that performs
357
00:15:56,172 --> 00:15:58,442
thousands of operations
per second
358
00:15:58,479 --> 00:16:02,609
onboard self-driving vehicles.
359
00:16:02,657 --> 00:16:06,567
The car's sensor data reveals
a surprising fact:
360
00:16:06,617 --> 00:16:08,747
for some reason, the
autonomous vehicle
361
00:16:08,793 --> 00:16:11,973
didn't recognize the woman,
Elaine Herzberg, as a pedestrian.
362
00:16:12,014 --> 00:16:14,364
In the almost six seconds
before the crash,
363
00:16:14,408 --> 00:16:16,928
she was classified and
reclassified
364
00:16:16,976 --> 00:16:19,146
at least eight times.
365
00:16:19,195 --> 00:16:21,715
Badminton: The radar and
LiDAR systems first mistook her
366
00:16:21,763 --> 00:16:23,683
as a vehicle,
and then an unknown object,
367
00:16:23,721 --> 00:16:25,811
and then a vehicle again,
368
00:16:25,854 --> 00:16:28,684
before finally determining
that she was a bicycle,
369
00:16:28,726 --> 00:16:31,596
coming into the path
of the oncoming Uber.
370
00:16:31,642 --> 00:16:34,732
Narrator: As it turns out, there
was a mistake in the algorithms.
371
00:16:34,776 --> 00:16:37,126
The engineers didn't program
their systems
372
00:16:37,170 --> 00:16:40,870
to recognize people crossing the
street illegally.
373
00:16:40,912 --> 00:16:44,052
Only those using crosswalks were
classified as pedestrians,
374
00:16:44,090 --> 00:16:46,310
and Herzberg was
not at a crosswalk.
375
00:16:46,353 --> 00:16:47,663
Badminton: The sensors
only recognized
376
00:16:47,702 --> 00:16:49,442
the bicycle she was pushing,
377
00:16:49,486 --> 00:16:52,316
and then they could only
recommend emergency braking.
378
00:16:52,359 --> 00:16:54,749
Narrator: Braking has been
an ongoing issue with
379
00:16:54,796 --> 00:16:57,706
Uber's self-driving cars, and
the problems stem directly
380
00:16:57,755 --> 00:16:59,365
from one of their policies.
381
00:16:59,409 --> 00:17:01,059
Morgan: Uber's vehicles
were experiencing
382
00:17:01,107 --> 00:17:03,497
what they called "erratic
vehicle behavior".
383
00:17:03,544 --> 00:17:06,164
Something as small as a bird
darting across the road
384
00:17:06,199 --> 00:17:09,119
would be enough to trigger the
vehicle to slam on its brakes.
385
00:17:09,158 --> 00:17:12,378
This caused some really
rough rides for passengers.
386
00:17:12,422 --> 00:17:14,772
And so, they came
up with a solution.
387
00:17:14,816 --> 00:17:16,906
Alexander: But Uber disabled
the auto-braking feature
388
00:17:16,948 --> 00:17:18,728
in all of its vehicles
in autonomous mode,
389
00:17:18,776 --> 00:17:20,336
so only the safety operator
390
00:17:20,387 --> 00:17:22,557
could apply the brakes
in an emergency.
391
00:17:22,606 --> 00:17:24,256
Morgan: Turning off the
auto-braking system
392
00:17:24,304 --> 00:17:26,654
of an entire fleet of vehicles,
393
00:17:26,697 --> 00:17:28,347
even if you have
good reason to do it,
394
00:17:28,395 --> 00:17:31,135
comes with a lot of risk.
395
00:17:31,180 --> 00:17:33,970
Narrator: Emergency personnel
were unable to revive her,
396
00:17:34,009 --> 00:17:37,099
and Elaine Herzberg became the
world's first pedestrian
397
00:17:37,143 --> 00:17:39,543
ever struck and killed
by a self-driving car.
398
00:17:39,580 --> 00:17:41,410
[sirens]
399
00:17:41,451 --> 00:17:43,371
Morgan: Vasquez, who showed
no signs of impairment,
400
00:17:43,410 --> 00:17:45,060
stayed on the scene.
401
00:17:45,107 --> 00:17:47,717
She said she was "monitoring the
self-driving car's interface"
402
00:17:47,762 --> 00:17:50,462
at the time of the impact and
didn't see the woman right away.
403
00:17:50,504 --> 00:17:52,164
As soon as she did see her,
404
00:17:52,201 --> 00:17:55,201
she slammed on the brakes,
but too late.
405
00:17:55,248 --> 00:17:58,158
Narrator: But police estimate
Elaine Herzberg would have been
406
00:17:58,207 --> 00:18:01,337
visible on the road nearly
six seconds before impact.
407
00:18:01,384 --> 00:18:04,264
And the question was asked, why
didn't the vehicle's sensors
408
00:18:04,300 --> 00:18:06,520
recognize she was crossing
in front of it,
409
00:18:06,563 --> 00:18:08,393
and activate the braking system?
410
00:18:08,435 --> 00:18:10,305
Alexander: When a
self-driving accident occurs,
411
00:18:10,350 --> 00:18:12,310
assigning blame is very tricky.
412
00:18:12,352 --> 00:18:14,572
It could be the
hardware, software,
413
00:18:14,615 --> 00:18:16,745
or some other
technological malfunction.
414
00:18:16,791 --> 00:18:20,451
And then there's also
the human element.
415
00:18:20,490 --> 00:18:23,280
Narrator: As the investigation
into Elaine's death continued,
416
00:18:23,319 --> 00:18:26,539
authorities discovered
another startling fact.
417
00:18:26,583 --> 00:18:29,593
A former employee issued
a haunting warning
418
00:18:29,630 --> 00:18:33,240
to his superiors just
five days before the crash.
419
00:18:33,286 --> 00:18:35,156
Morgan: An Uber
operations manager sent a
420
00:18:35,201 --> 00:18:37,511
resignation email to various
company executives,
421
00:18:37,551 --> 00:18:38,901
saying that their vehicles were
422
00:18:38,943 --> 00:18:41,773
"routinely in accidents" and
423
00:18:41,816 --> 00:18:46,296
"hitting things
every 15,000 miles."
424
00:18:46,342 --> 00:18:48,522
Narrator: The employee alleged
that "a car was damaged
425
00:18:48,562 --> 00:18:50,832
nearly every other day"
426
00:18:50,868 --> 00:18:52,998
in the month before
the Herzberg accident,
427
00:18:53,044 --> 00:18:55,664
and also cited "poorly
vetted or trained"
428
00:18:55,699 --> 00:18:58,309
safety drivers as
a cause of these accidents.
429
00:18:58,354 --> 00:19:00,404
The police then start
to look closely
430
00:19:00,443 --> 00:19:03,453
at the role of the safety driver
Rafaela Vasquez,
431
00:19:03,490 --> 00:19:05,970
and discover some
shocking evidence.
432
00:19:06,014 --> 00:19:07,894
The SUV's dashcam footage
433
00:19:07,929 --> 00:19:10,409
tells a disturbing tale.
434
00:19:10,453 --> 00:19:11,933
Alexander:
After careful analysis,
435
00:19:11,976 --> 00:19:14,806
authorities determined that for
roughly a third of the trip,
436
00:19:14,849 --> 00:19:18,069
Vasquez was looking down to the
right instead of monitoring
437
00:19:18,113 --> 00:19:21,073
the road and vehicle conditions
like she should have been doing.
438
00:19:21,116 --> 00:19:23,546
It was likely she was
looking at her cellphone.
439
00:19:23,597 --> 00:19:25,247
Badminton: During the
investigation,
440
00:19:25,294 --> 00:19:27,514
Vasquez's phone told them
everything they needed to know.
441
00:19:27,557 --> 00:19:30,127
She was watching
The Voice on Hulu
442
00:19:30,169 --> 00:19:34,699
and wasn't paying attention
to the road at all.
443
00:19:34,738 --> 00:19:36,478
Narrator: It's a stunning
discovery,
444
00:19:36,523 --> 00:19:39,353
but some believe that
Rafaela Vasquez's behavior
445
00:19:39,395 --> 00:19:42,045
is not solely responsible
for the accident,
446
00:19:42,093 --> 00:19:44,443
and there may be some
underlying factors
447
00:19:44,487 --> 00:19:46,967
within autonomous technology
that should shoulder
448
00:19:47,011 --> 00:19:48,931
at least some of the blame.
449
00:19:48,970 --> 00:19:50,890
Badminton: Her
distraction isn't surprising,
450
00:19:50,928 --> 00:19:52,278
it's almost expected.
451
00:19:52,321 --> 00:19:54,241
It's called
automation complacency.
452
00:19:54,280 --> 00:19:56,720
And this happens when people
work with automated systems,
453
00:19:56,760 --> 00:19:58,940
and they stop paying attention
to what's happening
454
00:19:58,980 --> 00:20:01,200
on a second-by-second basis.
455
00:20:01,243 --> 00:20:03,723
Alexander: These people work
tedious and repetitive shifts
456
00:20:03,767 --> 00:20:06,377
and expect nothing to go wrong,
because nothing does go wrong
457
00:20:06,422 --> 00:20:08,602
nearly 100% of the time.
458
00:20:08,642 --> 00:20:13,042
So, naturally they lose interest
and don't pay attention.
459
00:20:13,081 --> 00:20:15,131
Narrator: Authorities found that
Uber's self-driving division
460
00:20:15,170 --> 00:20:17,090
failed to do the
necessary training
461
00:20:17,128 --> 00:20:19,348
and supervision of
backup drivers.
462
00:20:19,392 --> 00:20:22,052
Another factor in
the tragic outcome.
463
00:20:22,090 --> 00:20:25,400
Badminton: It begs the question:
is the machine or the person
464
00:20:25,441 --> 00:20:28,841
at fault in
Elaine Herzberg's death?
465
00:20:28,879 --> 00:20:31,139
Narrator: As the media
stokes the controversy
466
00:20:31,186 --> 00:20:34,486
around the tragedy, Arizona's
citizens lash out
467
00:20:34,537 --> 00:20:37,107
at the driverless machines
on their roads.
468
00:20:37,148 --> 00:20:39,538
Technology development
company Waymo,
469
00:20:39,586 --> 00:20:41,456
part of Google parent Alphabet, was targeted.
470
00:20:41,501 --> 00:20:42,721
Alexander:
The publicity around the case
471
00:20:42,763 --> 00:20:44,333
stirred up anger in people.
472
00:20:44,373 --> 00:20:46,683
In nearby Chandler,
Waymo's autonomous white vans
473
00:20:46,723 --> 00:20:48,293
were harassed repeatedly.
474
00:20:48,334 --> 00:20:50,684
They had rocks thrown at them
and several attempts were made
475
00:20:50,727 --> 00:20:52,287
to run them off the road.
476
00:20:52,338 --> 00:20:54,168
Badminton: The anger
spilled out into action.
477
00:20:54,209 --> 00:20:56,779
Tires were slashed,
and even on one occasion,
478
00:20:56,820 --> 00:20:59,560
a backup driver was
threatened with a gun.
479
00:20:59,606 --> 00:21:01,866
Narrator: But Waymo
representatives insist
480
00:21:01,912 --> 00:21:04,182
their self-driving
vehicles are safe,
481
00:21:04,219 --> 00:21:06,089
citing 20 million miles driven
482
00:21:06,134 --> 00:21:08,314
on public roads by early 2020,
483
00:21:08,354 --> 00:21:10,974
74,000 of those
without a driver.
484
00:21:11,008 --> 00:21:14,098
Badminton: They've had multiple
accidents but zero fatalities,
485
00:21:14,142 --> 00:21:16,142
which is pretty great
when you consider
486
00:21:16,187 --> 00:21:17,617
there's an average
of one death per
487
00:21:17,667 --> 00:21:22,277
160 million kilometers
driven by humans.
488
00:21:22,324 --> 00:21:25,464
Narrator: Tesla has faced
lawsuits for two separate deaths
489
00:21:25,501 --> 00:21:27,981
that resulted from crashes
while its drivers used
490
00:21:28,025 --> 00:21:32,765
the car's Autopilot program.
491
00:21:32,813 --> 00:21:35,733
Alexander: Autopilot is not a
fully autonomous driving system.
492
00:21:35,772 --> 00:21:38,562
It's a form of driver-assistance
that accelerates,
493
00:21:38,601 --> 00:21:41,391
brakes and steers the vehicle
while the driver is present
494
00:21:41,430 --> 00:21:43,350
but not actively driving.
495
00:21:43,389 --> 00:21:45,779
The problem is, instead of
watching the road,
496
00:21:45,826 --> 00:21:50,046
the drivers get bored and
careless and accidents happen.
497
00:21:50,091 --> 00:21:51,881
Narrator: It's another case
of the dangers
498
00:21:51,919 --> 00:21:54,009
of automation complacency.
499
00:21:54,051 --> 00:21:57,231
Since 2016, there have been
at least ten deaths
500
00:21:57,272 --> 00:22:00,542
in eight accidents where Tesla's
Autopilot was involved.
501
00:22:00,580 --> 00:22:02,970
Morgan: The controversies
around self-driving cars
502
00:22:03,017 --> 00:22:05,447
are extremely complex.
503
00:22:05,498 --> 00:22:07,238
Just think of the
ethical decisions
504
00:22:07,282 --> 00:22:09,462
you and I might have to
make while driving.
505
00:22:09,502 --> 00:22:12,162
Imagine a self-driving vehicle
forced to choose between
506
00:22:12,200 --> 00:22:14,900
driving onto the sidewalk and
hitting a crowd of pedestrians,
507
00:22:14,942 --> 00:22:16,732
in order to save its passengers
508
00:22:16,770 --> 00:22:20,730
from striking an object
on the road.
509
00:22:20,774 --> 00:22:23,564
Narrator: Arizona authorities
clear Uber of any wrongdoing,
510
00:22:23,603 --> 00:22:25,083
but suspend their program of
511
00:22:25,126 --> 00:22:27,606
autonomous vehicle tests
on their roads.
512
00:22:27,650 --> 00:22:29,440
Rafaela Vasquez is arrested
513
00:22:29,478 --> 00:22:32,958
and charged with
negligent homicide.
514
00:22:33,003 --> 00:22:35,573
Morgan: There's still no
denying the possible benefits
515
00:22:35,615 --> 00:22:38,045
of autonomous vehicles
replacing cars
516
00:22:38,095 --> 00:22:40,445
and transport trucks
on the road.
517
00:22:40,489 --> 00:22:42,969
Some of the most significant
potential for growth
518
00:22:43,013 --> 00:22:46,543
is in public transportation
where subways, trains and buses
519
00:22:46,582 --> 00:22:51,762
could all be replaced by
autonomous vehicles.
520
00:22:51,805 --> 00:22:54,365
Narrator: In fact, there have
already been many
521
00:22:54,416 --> 00:22:56,416
pilot projects testing
driverless
522
00:22:56,462 --> 00:23:00,162
public transit systems
around the world.
523
00:23:00,204 --> 00:23:04,084
But some have been marred
by controversy.
524
00:23:04,121 --> 00:23:06,911
At the time of the accident
that killed Elaine Herzberg,
525
00:23:06,950 --> 00:23:08,910
confidence in autonomous
vehicles
526
00:23:08,952 --> 00:23:11,652
was growing around the world.
527
00:23:11,694 --> 00:23:14,744
But the incident cast a
shadow over the industry,
528
00:23:14,784 --> 00:23:16,964
with many arguing that the
weaknesses in their
529
00:23:17,004 --> 00:23:25,934
safety and reliability still
need to be addressed.
530
00:23:25,969 --> 00:23:27,669
Badminton: Self-driving
vehicles are a new frontier,
531
00:23:27,710 --> 00:23:29,540
and the market's
expected to grow
532
00:23:29,582 --> 00:23:33,672
to $64 billion by 2026,
533
00:23:33,716 --> 00:23:36,676
a 22% growth year-on-year.
534
00:23:36,719 --> 00:23:38,849
Narrator: And Uber is just
one of the companies
535
00:23:38,895 --> 00:23:42,245
racing to be first in the global
autonomous vehicle market.
536
00:23:42,290 --> 00:23:45,030
Traditional carmakers like GM,
537
00:23:45,075 --> 00:23:48,635
Ford and Volvo are
trying to muscle in
538
00:23:48,688 --> 00:23:51,388
on what was once the territory
of tech giants.
539
00:23:51,430 --> 00:23:53,340
By March of 2018,
540
00:23:53,388 --> 00:23:55,128
confidence in autonomous
vehicles
541
00:23:55,172 --> 00:23:57,742
was at an all-time high
around the world.
542
00:23:57,784 --> 00:24:00,444
But in light of the accident
that killed Elaine Herzberg,
543
00:24:00,482 --> 00:24:04,012
criticism of the technology
grew, with many arguing
544
00:24:04,051 --> 00:24:06,531
that the weaknesses in their
safety and reliability
545
00:24:06,575 --> 00:24:09,575
still need to be addressed.
546
00:24:09,622 --> 00:24:11,192
Alexander: The more
congested cities get,
547
00:24:11,232 --> 00:24:12,762
the more safety issues
there will be
548
00:24:12,799 --> 00:24:14,189
with self-driving vehicles.
549
00:24:14,235 --> 00:24:16,145
And there's still many
unresolved dangers,
550
00:24:16,193 --> 00:24:18,113
like adverse weather conditions.
551
00:24:18,152 --> 00:24:20,462
How can a self-driving
vehicle navigate,
552
00:24:20,502 --> 00:24:23,072
when snow or rain eliminates
its sensors' ability
553
00:24:23,113 --> 00:24:25,643
to detect lane markers
and lane dividers?
554
00:24:25,681 --> 00:24:27,641
Narrator: But one of the
biggest problems faced
555
00:24:27,683 --> 00:24:31,083
by technology firms and
car manufacturers is:
556
00:24:31,121 --> 00:24:33,731
will their autonomous vehicles
ever truly be able to
557
00:24:33,776 --> 00:24:37,816
function safely without
any human interaction?
558
00:24:37,867 --> 00:24:41,127
If the tragedy of Elaine
Herzberg is any indication,
559
00:24:41,175 --> 00:24:43,255
we have a long road to travel
560
00:24:43,307 --> 00:24:48,007
before we reach that
destination.
561
00:24:48,051 --> 00:24:58,891
♪ [show theme music]
562
00:24:58,932 --> 00:25:00,592
Narrator: New York City.
563
00:25:00,629 --> 00:25:02,849
The epicenter of the
modern art world.
564
00:25:02,892 --> 00:25:06,072
At the venerable Christie's
auction house in Manhattan,
565
00:25:06,113 --> 00:25:09,293
a bizarre-looking portrait is
about to go up for sale,
566
00:25:09,333 --> 00:25:11,253
which will send shockwaves
567
00:25:11,292 --> 00:25:12,772
through the global
art community.
568
00:25:12,815 --> 00:25:14,855
Pringle: I think
it's an incredible piece.
569
00:25:14,904 --> 00:25:17,214
It's strange because
you can tell it's a portrait
570
00:25:17,254 --> 00:25:18,954
in the style of the old masters,
571
00:25:18,995 --> 00:25:22,165
but the subject's facial
features are blurry and smudged.
572
00:25:22,216 --> 00:25:24,036
It also kind of
looks unfinished,
573
00:25:24,087 --> 00:25:27,867
with large sections
of the canvas left blank.
574
00:25:27,917 --> 00:25:30,307
Narrator: The work is credited
to a Paris-based collective
575
00:25:30,354 --> 00:25:32,884
who call themselves 'Obvious'.
576
00:25:32,922 --> 00:25:35,712
The group consists of three
twenty-something French men,
577
00:25:35,751 --> 00:25:37,621
Hugo Caselles-Dupré,
578
00:25:37,666 --> 00:25:40,836
Pierre Fautrel and
Gauthier Vernier,
579
00:25:40,887 --> 00:25:43,627
none of whom are well
known in the art world.
580
00:25:43,672 --> 00:25:45,202
Alexander: One interesting thing
about the sale is that
581
00:25:45,239 --> 00:25:47,329
the portrait was never
previously shown
582
00:25:47,371 --> 00:25:49,591
at exhibitions or in galleries.
583
00:25:49,635 --> 00:25:52,285
It's highly unusual for a
painting to come to market
584
00:25:52,333 --> 00:25:54,683
at auction, sight unseen.
585
00:25:54,727 --> 00:25:57,157
My guess is that Obvious were
intentionally trying to shock
586
00:25:57,207 --> 00:25:59,117
the art world by keeping the
portrait under wraps,
587
00:25:59,166 --> 00:26:01,466
until it was revealed
at Christie's.
588
00:26:01,516 --> 00:26:02,906
Narrator: A tag on
the wall proclaims
589
00:26:02,996 --> 00:26:07,256
that the subject of the portrait
is a man named Edmond de Belamy.
590
00:26:07,304 --> 00:26:09,524
He appears to be wearing
a black frock coat
591
00:26:09,568 --> 00:26:11,608
with a white collar
showing at the neck.
592
00:26:11,657 --> 00:26:13,657
This indicates that
he could be French,
593
00:26:13,702 --> 00:26:15,882
and may be a member
of the clergy.
594
00:26:15,922 --> 00:26:17,842
Aitken: I guess he was
somebody important,
595
00:26:17,880 --> 00:26:21,320
but the name Edmond Belamy
doesn't ring a bell.
596
00:26:21,362 --> 00:26:23,972
Narrator: There's a reason why
the name Edmond de Belamy
597
00:26:24,017 --> 00:26:27,717
may be unfamiliar,
he never existed.
598
00:26:27,760 --> 00:26:29,760
In the bottom right hand
corner of the portrait,
599
00:26:29,805 --> 00:26:32,895
in lieu of a signature, there is
a mathematical equation
600
00:26:32,939 --> 00:26:35,329
written in cursive
Gallic script.
601
00:26:35,376 --> 00:26:37,676
Alexander: The signature is
actually a snippet
602
00:26:37,726 --> 00:26:39,946
of computer code -
part of an algorithm.
603
00:26:39,989 --> 00:26:42,079
Narrator: The Portrait of Edmond
de Belamy is about to
604
00:26:42,122 --> 00:26:44,602
make history as the first work
of art produced
605
00:26:44,646 --> 00:26:47,606
by artificial intelligence
to be sold at auction.
606
00:26:47,649 --> 00:26:50,909
And in the process, it may just
redefine our notion of art
607
00:26:50,957 --> 00:26:52,787
in the 21st century.
608
00:26:52,828 --> 00:26:55,528
Pringle: Some in the art
establishment weren't exactly
609
00:26:55,570 --> 00:26:57,270
thrilled to see a
computer-generated
610
00:26:57,311 --> 00:26:59,101
painting up for auction.
611
00:26:59,139 --> 00:27:01,749
Especially at Christie's, the
oldest and most prestigious
612
00:27:01,794 --> 00:27:03,974
fine arts auction house
in the world.
613
00:27:04,013 --> 00:27:06,363
Narrator: The inclusion of
Obvious certainly causes
614
00:27:06,407 --> 00:27:09,187
some indignation in the
highbrow art community,
615
00:27:09,236 --> 00:27:11,016
and leaves some
asking the question,
616
00:27:11,064 --> 00:27:13,504
"If the portrait was not
created by human hands,
617
00:27:13,544 --> 00:27:15,634
is it even art?"
618
00:27:15,677 --> 00:27:17,417
Aitken: It's an
interesting question.
619
00:27:17,461 --> 00:27:19,031
But the suggestion that the
piece was created
620
00:27:19,072 --> 00:27:21,552
with no human involvement
is overly simplistic.
621
00:27:21,596 --> 00:27:24,156
Somebody had to write
the code that created it.
622
00:27:24,207 --> 00:27:25,687
Narrator: Obvious believe that
623
00:27:25,731 --> 00:27:28,471
"creativity isn't
just for humans."
624
00:27:28,516 --> 00:27:30,946
And produce the portrait using
a machine-learning model
625
00:27:30,997 --> 00:27:33,697
called a Generative
Adversarial Network
626
00:27:33,739 --> 00:27:35,959
or GAN for short.
627
00:27:36,002 --> 00:27:38,002
Alexander: GANs use
two neural networks
628
00:27:38,047 --> 00:27:40,087
that are essentially competing
with one another to see
629
00:27:40,136 --> 00:27:44,176
which one can become
better at its assigned job.
630
00:27:44,227 --> 00:27:46,097
Narrator: These two neural
networks are known as
631
00:27:46,142 --> 00:27:48,802
the generator
and the discriminator.
632
00:27:48,841 --> 00:27:52,021
The purpose of the generator is
to produce outputs
633
00:27:52,061 --> 00:27:54,411
that trick the discriminator
into mistaking them
634
00:27:54,455 --> 00:27:57,195
for real data, while the
discriminator
635
00:27:57,240 --> 00:28:00,900
attempts to flag which outputs
are artificial.
636
00:28:00,940 --> 00:28:03,380
Pringle: It's a zero-sum
game where one side's loss
637
00:28:03,420 --> 00:28:05,290
is another side's gain.
638
00:28:05,335 --> 00:28:09,115
In this case, is it a face
or is it not a face?
639
00:28:09,165 --> 00:28:12,125
Narrator: This back-and-forth
creates a feedback loop
640
00:28:12,168 --> 00:28:15,128
and eventually the generator
will begin to create
641
00:28:15,171 --> 00:28:18,521
superior output and the
discriminator will become more
642
00:28:18,566 --> 00:28:22,046
adept at catching data that has
been artificially generated.
643
00:28:22,091 --> 00:28:24,011
Morgan: Think of it
like a counterfeiter,
644
00:28:24,050 --> 00:28:26,790
trying to fool a bank teller
with a fake note.
645
00:28:26,835 --> 00:28:28,705
If the counterfeiter keeps
getting caught,
646
00:28:28,750 --> 00:28:30,010
they're going to kind of
get better and better
647
00:28:30,056 --> 00:28:31,796
at producing fake notes.
648
00:28:31,840 --> 00:28:33,710
But at the same time,
the bank teller
649
00:28:33,755 --> 00:28:35,835
will get better and better
at detecting them.
650
00:28:35,888 --> 00:28:38,628
And there's this
evolutionary arms race.
651
00:28:38,673 --> 00:28:41,893
Narrator: In time, the generator
produces synthetic data
652
00:28:41,937 --> 00:28:45,157
that fools the discriminator
into believing it is genuine.
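The generator-versus-discriminator loop described above can be sketched as a toy feedback loop. This is a deliberate simplification — a real GAN like the one Obvious used trains two neural networks on image data — but the structure is the same: the generator keeps adjusting until its output passes for real.

```python
# Toy sketch of the GAN-style feedback loop (a simplification,
# not a real neural GAN). The "generator" tries to output a number
# that looks like real data (values near 10.0); the "discriminator"
# flags outputs that sit too far from what real data looks like.

REAL_MEAN = 10.0

def discriminator(sample, tolerance=0.5):
    """Return True if the sample passes as 'real' data."""
    return abs(sample - REAL_MEAN) < tolerance

def train(rounds=100):
    gen_output = 0.0  # the generator starts out producing obvious fakes
    for _ in range(rounds):
        if not discriminator(gen_output):
            # Caught by the discriminator: nudge the output
            # toward the real data distribution and try again.
            gen_output += 0.2 * (REAL_MEAN - gen_output)
    return gen_output

final = train()
print(f"generator output after training: {final:.2f}")  # close to 10.0
```

After enough rounds the generator's output lands inside the discriminator's tolerance — the point at which, in the real process, training is complete and the output can fool the judge.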
653
00:28:45,201 --> 00:28:47,551
For Obvious, this meant
the creation of
654
00:28:47,595 --> 00:28:49,855
the Portrait of
Edmond de Belamy.
655
00:28:49,902 --> 00:28:51,952
Alexander:
Obvious fed 15,000 portraits
656
00:28:51,991 --> 00:28:55,171
from the 14th to the 20th
century into the GAN algorithm.
657
00:28:55,211 --> 00:28:56,781
The generator network
then learned
658
00:28:56,822 --> 00:28:59,042
the basic rules of
the images, for example,
659
00:28:59,085 --> 00:29:01,385
they all have a mouth,
a nose and two eyes.
660
00:29:01,435 --> 00:29:03,565
Based on those rules,
the network then begins
661
00:29:03,611 --> 00:29:05,791
to generate new images.
662
00:29:05,831 --> 00:29:07,791
Narrator: The discriminator
network reviews the images
663
00:29:07,833 --> 00:29:10,443
and tries to guess which ones
are authentic portraits
664
00:29:10,487 --> 00:29:12,227
from the dataset and which ones
665
00:29:12,272 --> 00:29:14,932
were artificially created
by the generator.
666
00:29:14,970 --> 00:29:17,410
When the generator succeeds in
tricking the discriminator,
667
00:29:17,451 --> 00:29:19,711
the process is complete.
668
00:29:19,758 --> 00:29:21,148
Aitken: At this point,
it's up to Obvious
669
00:29:21,194 --> 00:29:23,854
to evaluate the outputs and
choose the best examples.
670
00:29:23,892 --> 00:29:26,982
I guess they liked the Portrait
of Edmond de Belamy the most.
671
00:29:27,026 --> 00:29:29,376
Narrator: Heading into the
auction, it was estimated
672
00:29:29,419 --> 00:29:31,729
that the portrait would
likely fetch somewhere between
673
00:29:31,770 --> 00:29:34,510
$7,000 and $10,000 U.S.
674
00:29:34,555 --> 00:29:37,985
But as the bidding begins,
it becomes abundantly clear
675
00:29:38,037 --> 00:29:41,037
that interest in this new form
of artistic expression
676
00:29:41,083 --> 00:29:43,743
has been grossly underestimated.
677
00:29:43,782 --> 00:29:46,312
The creativity of AI has
also made headway
678
00:29:46,349 --> 00:29:48,529
into other artistic disciplines.
679
00:29:48,569 --> 00:29:51,049
In late 2021, Ai-Da the robot,
680
00:29:51,093 --> 00:29:52,793
Robot: I am Ai-Da.
681
00:29:52,834 --> 00:29:54,794
Narrator: ...made history by
becoming the first machine
682
00:29:54,836 --> 00:29:57,226
to recite poetry
written exclusively
683
00:29:57,273 --> 00:30:00,363
by its algorithms in front
of an audience.
684
00:30:00,407 --> 00:30:02,317
Pringle: The robot was
fed Dante's epic poem
685
00:30:02,365 --> 00:30:04,715
The Divine Comedy and then the
algorithms evaluated
686
00:30:04,759 --> 00:30:07,459
the language patterns and used
her database of words
687
00:30:07,501 --> 00:30:09,421
to create original poetry.
688
00:30:09,459 --> 00:30:12,459
Narrator: Ai-Da is capable of
producing an astonishing
689
00:30:12,506 --> 00:30:14,676
20,000 words in ten seconds,
690
00:30:14,725 --> 00:30:17,155
generated by her
AI language model.
691
00:30:17,206 --> 00:30:20,376
Ai-Da: I wept silently,
692
00:30:20,427 --> 00:30:22,907
taking in the scene.
693
00:30:22,951 --> 00:30:25,261
Aitken: Okay, now
that raises some concerns!
694
00:30:25,301 --> 00:30:27,221
Narrator: And it's not just
in the written word
695
00:30:27,260 --> 00:30:30,310
that AI's encroachment on the
arts is having an impact.
696
00:30:30,350 --> 00:30:32,610
The music industry
is feeling it too.
697
00:30:32,656 --> 00:30:34,476
♪ [gentle music]
698
00:30:34,528 --> 00:30:36,918
Pringle: We're now seeing
artificial intelligence
699
00:30:36,965 --> 00:30:39,005
that is capable of producing
music that is virtually
700
00:30:39,054 --> 00:30:41,844
indistinguishable from
human compositions.
701
00:30:41,883 --> 00:30:45,373
Narrator: In 2019,
Chinese tech giant Huawei
702
00:30:45,408 --> 00:30:48,278
wanted to showcase the
technology in their smartphones,
703
00:30:48,324 --> 00:30:50,814
so they decided to do
something bold,
704
00:30:50,849 --> 00:30:53,549
finish one of the most notable
incomplete symphonies
705
00:30:53,590 --> 00:30:56,330
in music history
using artificial intelligence.
706
00:30:56,376 --> 00:30:57,726
Alexander: Schubert's
symphony number 8
707
00:30:57,768 --> 00:30:59,948
is one of his most
acclaimed works,
708
00:30:59,988 --> 00:31:02,248
but for some reason,
he never finished it.
709
00:31:02,295 --> 00:31:04,505
Narrator: With the help of
composer Lucas Cantor,
710
00:31:04,558 --> 00:31:07,078
Huawei set out to
answer one question,
711
00:31:07,126 --> 00:31:09,476
"if Schubert had finished the
last two movements,
712
00:31:09,519 --> 00:31:11,829
what would they sound like?"
713
00:31:11,870 --> 00:31:13,780
The composer's body of work,
714
00:31:13,828 --> 00:31:16,138
some 2,000 pieces
of piano music,
715
00:31:16,178 --> 00:31:19,488
was fed into the phone's
dual Neural Processing Unit
716
00:31:19,529 --> 00:31:22,179
in the form of data.
717
00:31:22,228 --> 00:31:25,228
The AI then analyzed the tones,
pitch and rhythms
718
00:31:25,274 --> 00:31:27,714
that Schubert preferred
in his symphonies,
719
00:31:27,755 --> 00:31:30,275
and created melodies
from that information.
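The statistical idea — learn the patterns a composer favors, then generate new material from them — can be illustrated with a tiny Markov-chain sketch. This is an illustrative assumption, not Huawei's method (their system used a neural model), and the `corpus` here is a made-up toy melody.

```python
import random

# Toy style-imitation by statistics: count which note tends to
# follow which in the corpus, then generate a new melody by
# sampling from those learned transitions.

corpus = ["C", "E", "G", "E", "C", "E", "G", "C", "D", "E", "C"]

# Build transition table: note -> list of notes that followed it
transitions = {}
for a, b in zip(corpus, corpus[1:]):
    transitions.setdefault(a, []).append(b)

def generate(start="C", length=8, seed=0):
    random.seed(seed)  # seeded for repeatability
    melody = [start]
    for _ in range(length - 1):
        nxt = random.choice(transitions.get(melody[-1], corpus))
        melody.append(nxt)
    return melody

print(generate())  # notes drawn from the learned transition table
```

A richer corpus produces richer transitions — which is, in spirit, why the 2,000 Schubert pieces mattered: the more data, the closer the generated material hews to the composer's habits.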
720
00:31:30,323 --> 00:31:32,283
Alexander: For those
who bristle at the idea
721
00:31:32,325 --> 00:31:35,365
of AI impinging on something as
revered as classical music,
722
00:31:35,415 --> 00:31:40,025
you could make the argument that
music is essentially just code.
723
00:31:40,072 --> 00:31:42,602
Narrator: But algorithms can't
create art in a vacuum,
724
00:31:42,639 --> 00:31:45,339
they still need to be
guided by humans.
725
00:31:45,381 --> 00:31:47,431
For the Portrait of
Edmond de Belamy,
726
00:31:47,470 --> 00:31:50,950
Obvious had to make creative
decisions and curate
727
00:31:50,996 --> 00:31:53,816
the results generated by the
artificial intelligence.
728
00:31:53,868 --> 00:31:55,998
Pringle: AI merely follows
a series of steps
729
00:31:56,044 --> 00:31:58,354
laid out by a set of
prescribed rules.
730
00:31:58,394 --> 00:32:01,624
It takes a human eye to discern
what may or may not have value.
731
00:32:01,658 --> 00:32:03,178
Narrator: And back at the
auction house,
732
00:32:03,225 --> 00:32:07,135
the Portrait of Edmond de Belamy
appears to have some value.
733
00:32:07,186 --> 00:32:08,966
Alexander: There
are four competing bidders.
734
00:32:09,014 --> 00:32:11,414
One online in France,
two others by phone,
735
00:32:11,451 --> 00:32:13,711
and one in person
in the room in New York.
736
00:32:13,757 --> 00:32:15,497
Narrator:
As the auction heats up,
737
00:32:15,542 --> 00:32:18,372
eyebrows are being raised
in the art community.
738
00:32:18,414 --> 00:32:21,814
Some onlookers are shocked that
a work created by a computer
739
00:32:21,852 --> 00:32:24,202
is garnering so much attention.
740
00:32:24,246 --> 00:32:27,376
Many still don't even consider
the portrait of Edmond de Belamy
741
00:32:27,423 --> 00:32:29,863
to be art in the
traditional sense.
742
00:32:29,904 --> 00:32:31,564
Aitken: Is it art?
743
00:32:31,601 --> 00:32:34,261
I suppose that depends on
one's definition of the word.
744
00:32:34,300 --> 00:32:36,220
Most creatives would say that
745
00:32:36,258 --> 00:32:39,998
art is a method by which humans
express some concept or emotion.
746
00:32:40,045 --> 00:32:42,175
So by that definition,
AI generated art
747
00:32:42,221 --> 00:32:43,741
wouldn't be considered art.
748
00:32:43,787 --> 00:32:45,347
Pringle: But isn't art,
like beauty,
749
00:32:45,398 --> 00:32:46,968
in the eye of the beholder?
750
00:32:47,008 --> 00:32:49,658
If it evokes some kind of
response in the audience,
751
00:32:49,706 --> 00:32:52,356
then by all means it
should be considered art.
752
00:32:52,405 --> 00:32:55,665
And who decides
what is and isn't art?
753
00:32:55,712 --> 00:32:58,982
I would argue that the datasets
and algorithms are tools.
754
00:32:59,020 --> 00:33:01,020
Just as paint and
canvas are tools.
755
00:33:01,066 --> 00:33:02,716
The artist is the one
manipulating them
756
00:33:02,763 --> 00:33:04,633
and making those
creative choices.
757
00:33:04,678 --> 00:33:06,158
Narrator: Critics are
quick to point out
758
00:33:06,201 --> 00:33:08,811
that Obvious' work lacks intent,
759
00:33:08,856 --> 00:33:11,816
one of the cornerstones of
artistic expression.
760
00:33:11,859 --> 00:33:13,509
Aitken: When human artists
distort the aspects
761
00:33:13,556 --> 00:33:16,426
of a subject, they do it on
purpose, it has intent.
762
00:33:16,472 --> 00:33:18,342
But in the case of the
Portrait of Edmond de Belamy,
763
00:33:18,387 --> 00:33:20,517
the AI has reproduced
what it thinks
764
00:33:20,563 --> 00:33:22,873
a human face might look like.
Simply put,
765
00:33:22,913 --> 00:33:25,873
the AI hasn't exactly
been entirely successful.
766
00:33:25,916 --> 00:33:28,216
Narrator: Skepticism toward
new creative fields
767
00:33:28,267 --> 00:33:30,567
is not unique to our times.
768
00:33:30,617 --> 00:33:33,487
Photography as an art
was derided in its early days
769
00:33:33,533 --> 00:33:35,623
because it came from a machine,
770
00:33:35,665 --> 00:33:37,885
but now, it's a vital
and respected
771
00:33:37,928 --> 00:33:39,538
element in the art world.
772
00:33:39,582 --> 00:33:41,372
Pringle:
Change is really hard.
773
00:33:41,410 --> 00:33:43,460
People are afraid of new things.
774
00:33:43,499 --> 00:33:46,549
Especially things they
don't fully understand.
775
00:33:46,589 --> 00:33:48,939
Narrator: Acceptance into
the mainstream is not
776
00:33:48,983 --> 00:33:51,813
the only challenge
facing AI artists,
777
00:33:51,855 --> 00:33:54,375
there is also the
issue of credit.
778
00:33:54,423 --> 00:33:57,693
Controversially, Obvious didn't
even write the algorithm
779
00:33:57,731 --> 00:34:00,651
that they used to create the
Portrait of Edmond de Belamy.
780
00:34:00,690 --> 00:34:02,910
It was authored by
a pioneering artist
781
00:34:02,953 --> 00:34:05,173
and programmer named
Robbie Barrat,
782
00:34:05,217 --> 00:34:07,917
whose algorithm-generated
nudes and landscapes
783
00:34:07,958 --> 00:34:10,398
were lauded by the AI community.
784
00:34:10,439 --> 00:34:12,749
Barrat shared his code online,
785
00:34:12,789 --> 00:34:15,439
under a free open-source license.
786
00:34:15,488 --> 00:34:17,488
Alexander: So if
Obvious didn't write the code
787
00:34:17,533 --> 00:34:19,673
and the data sets they used
were old portraits
788
00:34:19,709 --> 00:34:22,019
from hundreds of years ago,
not original works,
789
00:34:22,060 --> 00:34:24,760
what exactly did they
contribute to the process?
790
00:34:24,801 --> 00:34:27,541
Narrator: Obvious doesn't deny
using Barrat's algorithm,
791
00:34:27,587 --> 00:34:29,237
but they claim that
they tweaked it
792
00:34:29,284 --> 00:34:31,634
in order to produce
the desired outcomes.
793
00:34:31,678 --> 00:34:33,718
In response to this, Tom White,
794
00:34:33,767 --> 00:34:35,767
an AI artist from New Zealand
795
00:34:35,812 --> 00:34:38,822
used Barrat's code to produce
his own set of portraits
796
00:34:38,859 --> 00:34:43,079
and the results were strikingly
similar to the work of Obvious,
797
00:34:43,124 --> 00:34:44,914
calling into question
just how much
798
00:34:44,952 --> 00:34:47,002
they adjusted the algorithm.
799
00:34:47,041 --> 00:34:48,831
Pringle: At the heart
of this controversy,
800
00:34:48,869 --> 00:34:51,349
lies the definition of
authorship and ownership,
801
00:34:51,393 --> 00:34:53,223
which is complex in
a case like this.
802
00:34:53,265 --> 00:34:56,745
If they used Barrat's code,
shouldn't he get some credit
803
00:34:56,790 --> 00:34:59,140
and even a share in the
proceeds of the auction?
804
00:34:59,184 --> 00:35:00,794
Alexander:
Appropriation happens
805
00:35:00,837 --> 00:35:02,447
all the time in the art world.
806
00:35:02,491 --> 00:35:04,231
Writers "borrow" styles
from other writers.
807
00:35:04,276 --> 00:35:05,966
Rap artists sample other songs.
808
00:35:06,016 --> 00:35:08,016
It's part of the game.
809
00:35:08,062 --> 00:35:10,892
Narrator: Barrat bears no ill
will towards Obvious.
810
00:35:10,934 --> 00:35:13,504
After all, he made the code
readily available
811
00:35:13,546 --> 00:35:15,716
on the internet free of charge.
812
00:35:15,765 --> 00:35:18,375
But he is somewhat critical of
the portrait itself,
813
00:35:18,420 --> 00:35:20,120
and he is not alone.
814
00:35:20,161 --> 00:35:21,161
Morgan: Many artists
would point out
815
00:35:21,206 --> 00:35:22,946
technical flaws in the work.
816
00:35:22,990 --> 00:35:25,120
The composition is skewed
to the upper left
817
00:35:25,166 --> 00:35:26,776
and there are many gaps
in the canvas
818
00:35:26,820 --> 00:35:29,560
that make it feel incomplete.
819
00:35:29,605 --> 00:35:31,085
Alexander: Maybe these
imperfections
820
00:35:31,129 --> 00:35:32,609
are the whole point.
821
00:35:32,652 --> 00:35:34,832
This is a new visual style
created by an artist
822
00:35:34,871 --> 00:35:36,571
and a machine
collaborating in a way
823
00:35:36,612 --> 00:35:39,832
that we really
haven't seen before.
824
00:35:39,876 --> 00:35:42,356
Narrator: After seven minutes
of frenzied bidding,
825
00:35:42,401 --> 00:35:44,401
when the hammer falls at
Christie's,
826
00:35:44,446 --> 00:35:46,226
the Portrait of
Edmond de Belamy
827
00:35:46,274 --> 00:35:48,234
sells for an extraordinary
828
00:35:48,276 --> 00:35:51,976
$432,500 USD
829
00:35:52,019 --> 00:35:53,849
to an anonymous phone bidder,
830
00:35:53,890 --> 00:35:58,200
more than 40 times
the estimated price.
831
00:35:58,243 --> 00:36:01,593
Like it or not, AI generated art
has officially arrived
832
00:36:01,637 --> 00:36:04,067
on the global stage.
833
00:36:04,118 --> 00:36:05,818
Alexander: It's a
staggering amount of money,
834
00:36:05,859 --> 00:36:08,209
considering that Obvious
were basically unknown
835
00:36:08,253 --> 00:36:10,173
in the art world,
up until that point.
836
00:36:10,211 --> 00:36:12,521
And they aren't even artists,
837
00:36:12,561 --> 00:36:14,961
according to the traditional
definition of the word.
838
00:36:14,998 --> 00:36:17,128
Narrator: The sale shocks
the establishment,
839
00:36:17,175 --> 00:36:19,215
and may one day be
looked back on
840
00:36:19,264 --> 00:36:22,574
as one of the seminal moments
for art in the 21st century,
841
00:36:22,615 --> 00:36:27,135
redefining the very idea
of what art is.
842
00:36:27,185 --> 00:36:29,225
There are two differing
arguments emerging
843
00:36:29,274 --> 00:36:31,284
regarding this type
of creative AI.
844
00:36:31,319 --> 00:36:33,499
Those who think it will
kill our creativity,
845
00:36:33,539 --> 00:36:37,589
and those who think
it will enhance it.
846
00:36:37,630 --> 00:36:39,020
Alexander: Technology
has disrupted
847
00:36:39,066 --> 00:36:40,936
pretty much every industry.
848
00:36:40,981 --> 00:36:43,291
So why should
the art world be immune?
849
00:36:43,331 --> 00:36:46,471
If artists learn to create in
conjunction with technology
850
00:36:46,508 --> 00:36:48,598
and not use it as
some sort of crutch,
851
00:36:48,641 --> 00:36:51,731
then I don't see it as
an existential threat.
852
00:36:51,774 --> 00:36:53,344
Narrator: But there are
alarmists,
853
00:36:53,385 --> 00:36:55,465
who fear that we are moving
towards a future
854
00:36:55,517 --> 00:36:57,557
where computers will
write our novels,
855
00:36:57,606 --> 00:37:00,826
compose our songs
and paint our pictures,
856
00:37:00,870 --> 00:37:03,400
and the result will be
an overall diminishment
857
00:37:03,438 --> 00:37:06,698
in human creativity.
Others disagree.
858
00:37:06,746 --> 00:37:08,746
Pringle: The bottom
line is that even the most
859
00:37:08,791 --> 00:37:12,361
sophisticated AI systems can't
replicate the human instinct
860
00:37:12,404 --> 00:37:14,624
to find inspiration
in our surroundings,
861
00:37:14,667 --> 00:37:17,577
and to use our ingrained
creativity to produce art.
862
00:37:17,626 --> 00:37:20,496
But it's an exciting
new tool for artists,
863
00:37:20,542 --> 00:37:22,372
and in that way we are
right at the frontier
864
00:37:22,414 --> 00:37:24,554
of what could be
a whole new medium.
865
00:37:24,590 --> 00:37:26,110
Narrator: There will always
be doubters
866
00:37:26,156 --> 00:37:29,676
who challenge the legitimacy of
AI-generated creative work,
867
00:37:29,725 --> 00:37:31,375
questioning whether or not
868
00:37:31,423 --> 00:37:33,643
it even qualifies as art.
869
00:37:33,686 --> 00:37:35,336
Morgan: In the words
of Andy Warhol,
870
00:37:35,383 --> 00:37:37,523
who, in the tradition of
many great artists,
871
00:37:37,559 --> 00:37:40,299
may have stolen this phrase
from Marshall McLuhan,
872
00:37:40,345 --> 00:37:45,345
"Art is what you can
get away with."
873
00:37:45,393 --> 00:37:47,833
Ai-Da Robot: We looked up
from our verses
874
00:37:47,874 --> 00:37:50,444
like blindfolded captives,
875
00:37:50,485 --> 00:37:53,615
Sent out to seek the light;
876
00:37:53,662 --> 00:37:55,532
but it never came,
877
00:37:55,577 --> 00:37:59,837
A needle and thread
would be necessary,
878
00:37:59,886 --> 00:38:03,496
For the completion
of the picture.
879
00:38:03,542 --> 00:38:14,862
♪ [show theme music]
880
00:38:14,901 --> 00:38:17,301
Narrator: A Japanese woman in
her 60s is at an appointment
881
00:38:17,338 --> 00:38:19,078
at the Institute
of Medical Science
882
00:38:19,122 --> 00:38:21,522
at the University of Tokyo.
883
00:38:21,560 --> 00:38:24,040
It is the foremost centre for
advanced medical
884
00:38:24,084 --> 00:38:26,744
and bioscience research
in the country.
885
00:38:26,782 --> 00:38:30,532
She has been recently diagnosed
with a rare form of cancer,
886
00:38:30,569 --> 00:38:32,879
acute myeloid leukemia,
887
00:38:32,919 --> 00:38:35,699
which starts in the bone marrow
and can quickly spread
888
00:38:35,748 --> 00:38:38,488
to the bloodstream and other
parts of the body.
889
00:38:38,533 --> 00:38:42,023
If not treated, it can
be life-threatening.
890
00:38:42,058 --> 00:38:44,018
Morgan: Unfortunately,
the treatments her previous
891
00:38:44,060 --> 00:38:46,720
doctors administered, which
included chemotherapy
892
00:38:46,759 --> 00:38:49,329
were largely unsuccessful.
893
00:38:49,370 --> 00:38:50,980
Pringle: They didn't know
why she wasn't responding
894
00:38:51,024 --> 00:38:52,244
to the treatments.
895
00:38:52,286 --> 00:38:54,676
The woman's recovery was
unusually slow
896
00:38:54,723 --> 00:38:56,683
and her doctors were confounded.
897
00:38:56,725 --> 00:38:58,545
Narrator: If medical experts
at the Institute
898
00:38:58,597 --> 00:39:00,377
can't find a solution,
899
00:39:00,425 --> 00:39:03,515
it's likely the woman will
succumb to her illness.
900
00:39:03,558 --> 00:39:06,338
Narrator: For some time, the hospital
has been using AI
901
00:39:06,387 --> 00:39:09,697
to help with the diagnosing
of patients.
902
00:39:09,738 --> 00:39:12,958
An Artificial Intelligence
program developed by IBM,
903
00:39:13,002 --> 00:39:15,742
nicknamed "Watson".
904
00:39:15,788 --> 00:39:17,008
Aitken: It's an exciting new
technology
905
00:39:17,050 --> 00:39:18,310
they've been trying out.
906
00:39:18,356 --> 00:39:20,176
They're hoping it can
successfully aid physicians
907
00:39:20,227 --> 00:39:21,837
in the treatment of patients.
908
00:39:21,881 --> 00:39:23,971
Watson has access to a
massive database,
909
00:39:24,013 --> 00:39:26,153
containing 20 million
research papers.
910
00:39:26,189 --> 00:39:27,759
This includes a huge
amount of information
911
00:39:27,800 --> 00:39:29,500
on gene-related cancer.
912
00:39:29,541 --> 00:39:31,761
Morgan: The hospital
has used the same AI
913
00:39:31,804 --> 00:39:34,634
for other patients suffering
from hematological diseases,
914
00:39:34,676 --> 00:39:36,716
with up to 80% success in
915
00:39:36,765 --> 00:39:38,895
identifying the causes
of their illness.
916
00:39:38,941 --> 00:39:41,811
Narrator: For an AI like 'Watson'
to work, it needs to
917
00:39:41,857 --> 00:39:44,897
analyze a considerable
amount of information,
918
00:39:44,947 --> 00:39:47,647
a patient's medical history,
current symptoms
919
00:39:47,689 --> 00:39:49,869
and notes from their
previous doctors.
920
00:39:49,909 --> 00:39:52,689
It will then cross-check it
against huge amounts
921
00:39:52,738 --> 00:39:55,608
of clinical cancer case data,
where it will form
922
00:39:55,654 --> 00:39:59,224
its hypotheses and suggest
possible treatments.
923
00:39:59,266 --> 00:40:01,746
Aitken: And it can do
this very, very quickly,
924
00:40:01,790 --> 00:40:04,970
accessing and analyzing up to
200 million pages of information
925
00:40:05,011 --> 00:40:08,281
on nearly 100 servers, in an
extremely short period of time.
926
00:40:08,318 --> 00:40:10,228
Narrator: Over the past decade,
927
00:40:10,277 --> 00:40:12,317
highly advanced machine
learning systems
928
00:40:12,366 --> 00:40:15,016
have become an integral part
of healthcare systems
929
00:40:15,064 --> 00:40:18,334
around the world, taking on
important medical tasks,
930
00:40:18,372 --> 00:40:20,162
once only reserved for
931
00:40:20,200 --> 00:40:22,900
highly-trained medical
specialists.
932
00:40:22,942 --> 00:40:24,552
Aitken: These systems
don't just analyze
933
00:40:24,596 --> 00:40:26,726
and interpret or cross-check
the raw data.
934
00:40:26,772 --> 00:40:30,042
They actually "learn" from it
and they get smarter over time.
935
00:40:30,079 --> 00:40:31,599
Pringle: It's really
incredible in that
936
00:40:31,646 --> 00:40:33,556
it's been quite successful
in making these
937
00:40:33,605 --> 00:40:35,995
reliable hypotheses
to help patients.
938
00:40:36,042 --> 00:40:39,092
Narrator: AI has been deployed
in many ICUs,
939
00:40:39,132 --> 00:40:41,962
where it assists in monitoring
and treating patients
940
00:40:42,004 --> 00:40:44,574
with life-threatening
conditions.
941
00:40:44,616 --> 00:40:46,746
It's also used in
Radiology departments
942
00:40:46,792 --> 00:40:49,452
to aid technicians
in X-ray diagnostics,
943
00:40:49,490 --> 00:40:52,100
and in planning appropriate
drug treatments.
944
00:40:52,145 --> 00:40:56,315
In 2015, a Canadian woman,
Krista Jones,
945
00:40:56,366 --> 00:40:58,716
was facing a double mastectomy,
946
00:40:58,760 --> 00:41:01,890
due to a rare form of
breast cancer.
947
00:41:01,937 --> 00:41:04,587
But she was able to forgo
the invasive procedure
948
00:41:04,636 --> 00:41:07,156
when machine-learning algorithms
were used to locate
949
00:41:07,203 --> 00:41:10,863
previously imperceptible
tumors in her mammograms.
950
00:41:10,903 --> 00:41:14,043
She is now cancer free.
951
00:41:14,080 --> 00:41:16,000
While examples like
this are encouraging,
952
00:41:16,038 --> 00:41:18,688
some still question
whether or not
953
00:41:18,737 --> 00:41:22,177
there is concrete proof that AI
improves a patient's outcome.
954
00:41:22,218 --> 00:41:23,788
Aitken: AI can analyze
and cross-check
955
00:41:23,829 --> 00:41:26,089
literally millions of pieces
of medical data.
956
00:41:26,135 --> 00:41:29,005
No doctor or even a team
of doctors can do that.
957
00:41:29,051 --> 00:41:31,531
Narrator: Researchers at
Stanford University
958
00:41:31,576 --> 00:41:33,666
found that in patients
they examined,
959
00:41:33,708 --> 00:41:35,408
AI was much faster
960
00:41:35,449 --> 00:41:37,799
than the average radiologist
when screening
961
00:41:37,843 --> 00:41:41,023
and diagnosing for several
different pathologies.
962
00:41:41,063 --> 00:41:44,333
A patient could obtain their
result in less than 2 minutes,
963
00:41:44,371 --> 00:41:48,071
when normally, they would have
to wait for up to 3 hours.
964
00:41:48,114 --> 00:41:51,474
But there are still many critics
who are concerned about its use.
965
00:41:51,509 --> 00:41:53,509
Pringle: The criticism is
that many of the studies
966
00:41:53,554 --> 00:41:55,514
that have been released
are not conclusive.
967
00:41:55,556 --> 00:41:57,426
And in fact, many argue that
968
00:41:57,471 --> 00:42:01,301
there are actually
inherent risks to the patients.
969
00:42:01,344 --> 00:42:04,094
Narrator: Researchers suggest one
potential contributing factor:
970
00:42:04,130 --> 00:42:06,390
the limitations
AI encounters
971
00:42:06,436 --> 00:42:10,046
in accessing various data sets.
972
00:42:10,092 --> 00:42:12,492
Aitken: For AI to
function effectively, it needs
973
00:42:12,530 --> 00:42:15,270
to scan large amounts of data
from many different sources.
974
00:42:15,315 --> 00:42:17,265
But the problem is, not all
of these sources
975
00:42:17,317 --> 00:42:19,837
are always included in the
AI's diagnostic decisions.
976
00:42:19,885 --> 00:42:21,665
Sometimes that's a result of
human error,
977
00:42:21,713 --> 00:42:23,453
sometimes not everything
gets transferred,
978
00:42:23,497 --> 00:42:24,717
or there might be
inconsistencies
979
00:42:24,759 --> 00:42:27,589
in how data is entered
into the system.
980
00:42:27,632 --> 00:42:30,642
Narrator: A 2021 study by
the University of Washington
981
00:42:30,678 --> 00:42:33,248
revealed an additional concern.
982
00:42:33,289 --> 00:42:37,249
While examining various AI
models designed to detect Covid,
983
00:42:37,293 --> 00:42:40,603
researchers found that instead
of 'learning' about pathologies
984
00:42:40,645 --> 00:42:43,425
related to Covid, the technology
had a tendency
985
00:42:43,473 --> 00:42:46,653
to search for potential
diagnostic shortcuts,
986
00:42:46,694 --> 00:42:49,834
such as making potentially
erroneous associations
987
00:42:49,871 --> 00:42:52,001
with irrelevant
medical factors,
988
00:42:52,047 --> 00:42:53,957
like the age of the patient.
989
00:42:54,006 --> 00:42:58,746
This increases its
chances of misdiagnosis.
990
00:42:58,793 --> 00:43:00,933
Pringle: It's safe to say
that in their current state,
991
00:43:00,969 --> 00:43:03,409
these AI programs
aren't perfect;
992
00:43:03,450 --> 00:43:07,590
they can, and sometimes do,
make mistakes.
993
00:43:07,628 --> 00:43:10,978
Narrator: But AI advocates say
its pros vastly outweigh
994
00:43:11,023 --> 00:43:12,813
any potential cons.
995
00:43:12,851 --> 00:43:16,071
Recently, St. Michael's Hospital
in Toronto, Canada
996
00:43:16,115 --> 00:43:18,595
began using machine learning
algorithms to help
997
00:43:18,639 --> 00:43:21,899
with diagnoses and they saw
a 20-percent reduction
998
00:43:21,947 --> 00:43:25,077
in mortality amongst
high-risk patients.
999
00:43:25,124 --> 00:43:28,134
By using IBM's AI 'Watson',
1000
00:43:28,170 --> 00:43:30,910
medical experts at the
University of Tokyo
1001
00:43:30,956 --> 00:43:33,306
are hopeful it may be able to
help the woman
1002
00:43:33,349 --> 00:43:35,399
suffering from leukemia.
1003
00:43:35,438 --> 00:43:38,618
Watson will detail and
analyze gene mutations
1004
00:43:38,659 --> 00:43:40,919
in the female patient
and it turns out,
1005
00:43:40,966 --> 00:43:43,706
there are more than
one thousand of them.
1006
00:43:43,751 --> 00:43:45,971
Pringle: Watson
can do it in 10 minutes,
1007
00:43:46,014 --> 00:43:48,284
but it would take 2 weeks
for human doctors
1008
00:43:48,321 --> 00:43:50,021
to accomplish the same task.
1009
00:43:50,062 --> 00:43:51,632
Narrator: Doctors wait
for Watson's
1010
00:43:51,672 --> 00:43:54,112
diagnostic assessment on the
woman's condition.
1011
00:43:54,153 --> 00:43:57,773
But its use has led to both
ethical and legal dilemmas.
1012
00:43:57,809 --> 00:43:59,719
Aitken: Mistakes are
always a possibility,
1013
00:43:59,767 --> 00:44:01,807
and they could exacerbate a
patient's health problems,
1014
00:44:01,856 --> 00:44:03,896
or possibly even lead to death.
1015
00:44:03,945 --> 00:44:06,465
Narrator: If something like this
does happen, many wonder,
1016
00:44:06,513 --> 00:44:10,433
who or even what would be
held responsible?
1017
00:44:10,473 --> 00:44:11,823
Pringle: It may be
difficult to establish
1018
00:44:11,866 --> 00:44:14,086
legal accountability in
these situations.
1019
00:44:14,129 --> 00:44:16,779
Is it the patient's doctor?
The programmer?
1020
00:44:16,828 --> 00:44:19,788
Or even the AI itself?
1021
00:44:19,831 --> 00:44:21,271
Morgan: It may sound
like science fiction
1022
00:44:21,310 --> 00:44:23,880
to put an AI on trial,
but it's not that far fetched
1023
00:44:23,922 --> 00:44:25,882
when you consider
the legal actions we've taken
1024
00:44:25,924 --> 00:44:28,494
against things like
corporations.
1025
00:44:28,535 --> 00:44:30,755
Narrator: Legal experts counter
that they currently have
1026
00:44:30,798 --> 00:44:34,188
more pressing concerns
regarding the technology.
1027
00:44:34,236 --> 00:44:36,756
Specifically, in how
personal medical data
1028
00:44:36,804 --> 00:44:39,204
is obtained and shared.
1029
00:44:39,241 --> 00:44:40,941
Aitken: AI primarily
learns about a patient
1030
00:44:40,982 --> 00:44:43,252
through medical records, and
if they share these records
1031
00:44:43,289 --> 00:44:45,509
with other institutions it may
be in violation
1032
00:44:45,552 --> 00:44:47,862
of a person's privacy.
1033
00:44:47,902 --> 00:44:51,082
Narrator: In 2020, the renowned
US medical centre,
1034
00:44:51,123 --> 00:44:54,043
The Mayo Clinic, known
for its specialty in
1035
00:45:54,082 --> 00:45:56,522
cancer, cardiology,
and heart surgery,
1036
00:45:56,563 --> 00:45:59,573
was accused of sharing
private medical data
1037
00:44:59,609 --> 00:45:02,049
with several AI developers.
1038
00:45:02,090 --> 00:45:03,480
Pringle: It was for
research purposes,
1039
00:45:03,526 --> 00:45:06,306
but the patients were not
informed that they did this.
1040
00:45:06,355 --> 00:45:09,705
And understandably many say
this is highly unethical.
1041
00:45:09,750 --> 00:45:12,060
Narrator: Legal experts
also warn of another
1042
00:45:12,100 --> 00:45:14,320
potential breach of privacy.
1043
00:45:14,363 --> 00:45:17,803
A hospital's AI could obtain a
person's private data
1044
00:45:17,845 --> 00:45:21,495
from companies that are not
in the medical or tech industry,
1045
00:45:21,544 --> 00:45:24,114
such as banks or
insurance companies.
1046
00:45:24,156 --> 00:45:26,716
That could result in it making
inferences
1047
00:45:26,767 --> 00:45:29,027
about a patient's 'lifestyle'.
1048
00:45:29,074 --> 00:45:30,734
Morgan: It could change the way
that it does diagnostics,
1049
00:45:30,771 --> 00:45:32,471
the medications it prescribes,
1050
00:45:32,512 --> 00:45:34,382
the prognosis of the outcome.
1051
00:45:34,427 --> 00:45:36,207
This could have serious impacts
1052
00:45:36,255 --> 00:45:38,465
on whether we treat different
groups of people
1053
00:45:38,518 --> 00:45:41,428
the same way
and treat them all fairly.
1054
00:45:41,477 --> 00:45:42,827
Narrator: There may be
other biases
1055
00:45:42,870 --> 00:45:44,960
entrenched in the AI algorithms.
1056
00:45:45,003 --> 00:45:47,353
There is evidence that in
the US, people of colour
1057
00:45:47,396 --> 00:45:51,656
are underrepresented in
the data it analyzes.
1058
00:45:51,705 --> 00:45:53,315
Aitken: Most of the data
used to train these algorithms,
1059
00:45:53,359 --> 00:45:55,799
is obtained from
only 3 U.S. states.
1060
00:45:55,840 --> 00:45:57,800
That's not a
diverse enough dataset,
1061
00:45:57,842 --> 00:45:59,582
and could result in
these AI systems
1062
00:45:59,626 --> 00:46:02,496
treating people
of colour less effectively.
1063
00:46:02,542 --> 00:46:04,152
Pringle: Some of these
algorithms allocate
1064
00:46:04,196 --> 00:46:07,156
fewer medical resources to
minority groups because often
1065
00:46:07,199 --> 00:46:09,509
they have less access
to medical care,
1066
00:46:09,549 --> 00:46:12,729
and therefore use fewer resources
than Caucasian patients.
1067
00:46:12,770 --> 00:46:15,080
This results in the AI thinking
the minority patients
1068
00:46:15,120 --> 00:46:19,250
are healthier than they are,
and giving them less priority.
1069
00:46:19,298 --> 00:46:21,948
Narrator: Critics believe this
could lead to more incidents
1070
00:46:21,996 --> 00:46:25,386
of misdiagnosis in minority
patients and treatment errors,
1071
00:46:25,434 --> 00:46:28,394
such as prescribing
the wrong medications.
1072
00:46:28,437 --> 00:46:30,827
But AI developers and
hospitals counter
1073
00:46:30,875 --> 00:46:32,875
that many medical centers are
expanding
1074
00:46:32,920 --> 00:46:35,180
the diversity of their test
subjects,
1075
00:46:35,227 --> 00:46:37,877
and in the long run, people of
colour will greatly benefit
1076
00:46:37,925 --> 00:46:43,145
from AI's integration
into healthcare systems.
1077
00:46:43,191 --> 00:46:45,411
Morgan: The tech is
constantly improving
1078
00:46:45,454 --> 00:46:48,504
and according to its proponents,
it has greater and greater
1079
00:46:48,544 --> 00:46:52,114
access to minority populations
for research purposes.
1080
00:46:52,157 --> 00:46:53,377
Aitken: This is especially
important for people
1081
00:46:53,419 --> 00:46:54,939
living in less
developed countries.
1082
00:46:54,986 --> 00:46:56,896
For example, several
countries in Africa,
1083
00:46:56,944 --> 00:46:59,214
have hospitals with no
radiologists at all.
1084
00:46:59,251 --> 00:47:01,341
Thankfully AI is helping
to fill that gap,
1085
00:47:01,383 --> 00:47:05,433
analyzing images and x-rays
taken from patients.
1086
00:47:05,474 --> 00:47:07,224
Narrator: Asian countries
in particular,
1087
00:47:07,259 --> 00:47:09,959
have become leaders in
AI healthcare innovation,
1088
00:47:10,001 --> 00:47:12,131
with Japan at the forefront.
1089
00:47:12,177 --> 00:47:14,177
And as Watson readies its
diagnosis
1090
00:47:14,222 --> 00:47:16,052
at the University of Tokyo,
1091
00:47:16,094 --> 00:47:18,404
we're about to see why.
1092
00:47:18,444 --> 00:47:20,794
Morgan: Her doctors
are stunned when Watson
1093
00:47:20,838 --> 00:47:24,618
determines that the original
diagnosis is in fact wrong.
1094
00:47:24,667 --> 00:47:27,847
It turns out she has a
rare form of leukemia.
1095
00:47:27,888 --> 00:47:30,798
And that's why the earlier
treatments weren't working.
1096
00:47:30,848 --> 00:47:33,238
They were treating
the wrong illness.
1097
00:47:33,285 --> 00:47:35,585
Narrator: Armed with
this new information,
1098
00:47:35,635 --> 00:47:38,025
doctors revise the woman's
treatment plan,
1099
00:47:38,072 --> 00:47:41,292
and her condition improves
significantly.
1100
00:47:41,336 --> 00:47:45,036
Morgan: What ended up saving
this woman's life...was data.
1101
00:47:45,079 --> 00:47:47,169
The AI was able to recognize
patterns in it
1102
00:47:47,212 --> 00:47:49,822
that her doctors were unable to.
1103
00:47:49,867 --> 00:47:51,557
Narrator: But the positive
outcome raises
1104
00:47:51,607 --> 00:47:53,737
an important question
that people working in
1105
00:47:53,783 --> 00:47:56,573
health and medicine
will have to deal with.
1106
00:47:56,612 --> 00:47:59,662
If AI can do this
more accurately and efficiently
1107
00:47:59,702 --> 00:48:01,662
than a medical professional,
1108
00:48:01,704 --> 00:48:03,324
what impact will this
have on the future
1109
00:48:03,358 --> 00:48:05,668
of Healthcare's workforce?
1110
00:48:05,708 --> 00:48:07,538
Pringle: It's a situation
that medical experts
1111
00:48:07,580 --> 00:48:09,150
are seriously concerned about.
1112
00:48:09,190 --> 00:48:12,280
The tech is fully automated
and self-reliant.
1113
00:48:12,324 --> 00:48:14,334
So, larger scale
integration of AI
1114
00:48:14,369 --> 00:48:15,889
could create a significant
number of
1115
00:48:15,936 --> 00:48:17,626
unemployed, highly
skilled workers,
1116
00:48:17,677 --> 00:48:20,067
like doctors, nurses
and technicians.
1117
00:48:20,114 --> 00:48:21,464
Of course, what they've got
going for them
1118
00:48:21,507 --> 00:48:23,807
that the AI doesn't
have is empathy.
1119
00:48:23,857 --> 00:48:25,817
They've got
bedside manner.
1120
00:48:25,859 --> 00:48:27,429
Aitken: As AI plays
increasing roles
1121
00:48:27,469 --> 00:48:29,039
in all aspects of healthcare,
1122
00:48:29,080 --> 00:48:30,820
it will inevitably change the
ways we interact
1123
00:48:30,864 --> 00:48:32,524
with health services and lead to
1124
00:48:32,561 --> 00:48:34,611
different kinds of jobs
in healthcare.
1125
00:48:34,650 --> 00:48:36,440
But it's unlikely
ever to replace
1126
00:48:36,478 --> 00:48:39,218
medical professionals
altogether.
1127
00:48:39,264 --> 00:48:41,404
Narrator: Penetration
of AI into medicine
1128
00:48:41,440 --> 00:48:43,750
will likely be very slow,
1129
00:48:43,790 --> 00:48:46,360
providing ample time
for healthcare institutions
1130
00:48:46,401 --> 00:48:50,361
to re-adjust and accommodate
factors such as staff training.
1131
00:48:50,405 --> 00:48:53,755
And according to recent
research, AI may actually
1132
00:48:53,800 --> 00:48:56,110
increase employment
opportunities in health care
1133
00:48:56,150 --> 00:48:59,020
by 15% in the upcoming years.
1134
00:48:59,066 --> 00:49:01,026
Morgan:
It's the old saying,
1135
00:49:01,068 --> 00:49:04,028
'when one door closes,
another opens,'
another opens,'
1136
00:49:04,071 --> 00:49:06,071
and a lot of people argue that
AI's integration
1137
00:49:06,117 --> 00:49:07,947
will produce a ton
of high-skilled
1138
00:49:07,988 --> 00:49:09,558
computer programming jobs.
1139
00:49:09,598 --> 00:49:12,378
But I'd say we should
be careful about this.
1140
00:49:12,427 --> 00:49:14,557
We want to make sure not to
automate the kinds of jobs
1141
00:49:14,603 --> 00:49:17,133
we would actually want to do.
1142
00:49:17,171 --> 00:49:19,351
Aitken: It's important to
note that AI's functionality
1143
00:49:19,391 --> 00:49:21,571
is optimal when it's not
working autonomously,
1144
00:49:21,610 --> 00:49:23,530
but rather when it's used
as an additional tool
1145
00:49:23,569 --> 00:49:26,659
to assist medical practitioners.
1146
00:49:26,702 --> 00:49:28,442
Narrator: A perfect
example of that,
1147
00:49:28,487 --> 00:49:30,317
is the elderly Japanese woman,
1148
00:49:30,358 --> 00:49:33,448
whose life was saved
with the help of Watson.
1149
00:49:33,492 --> 00:49:36,152
She was discharged from the
hospital and began receiving
1150
00:49:36,190 --> 00:49:38,410
outpatient treatment at home.
1151
00:49:38,453 --> 00:49:39,633
Pringle: She is
managing her condition
1152
00:49:39,672 --> 00:49:42,242
and what's most important,
she is alive.
1153
00:49:42,283 --> 00:49:44,243
She can thank AI for that.
1154
00:49:44,285 --> 00:49:46,105
Aitken: There is no
denying that AI will play
1155
00:49:46,157 --> 00:49:47,937
a big part in the
future of healthcare.
1156
00:49:47,985 --> 00:49:49,985
Who knows how far the
technology will go?
1157
00:49:50,030 --> 00:49:52,120
Narrator: As AI continues
to be integrated
1158
00:49:52,163 --> 00:49:53,993
into our healthcare systems,
1159
00:49:54,034 --> 00:49:56,564
the moral and legal debate
surrounding its use
1160
00:49:56,602 --> 00:49:59,302
will surely intensify.
1161
00:49:59,344 --> 00:50:02,394
Is it really more dependable
than a human doctor?
1162
00:50:02,434 --> 00:50:04,314
Can it be ethical
and impartial
1163
00:50:04,349 --> 00:50:06,349
when dealing with
minority patients?
1164
00:50:06,394 --> 00:50:10,574
And if not, who or what
will be held accountable?
1165
00:50:10,616 --> 00:50:13,746
Or perhaps AI proponents are
correct, when they say
1166
00:50:13,793 --> 00:50:16,013
that there is something
more important,
1167
00:50:16,056 --> 00:50:18,796
all the lives it might
help save in the future.