1
00:00:16,059 --> 00:00:24,809
♪
2
00:00:24,850 --> 00:00:27,240
Narrator: Hundreds of
millions of dollars disappear
3
00:00:27,288 --> 00:00:30,508
after the owner of Canada's
largest cryptocurrency exchange
4
00:00:30,552 --> 00:00:35,512
allegedly dies under mysterious
circumstances in India.
5
00:00:35,557 --> 00:00:37,117
Ramona Pringle: There are people
out there that want the body
6
00:00:37,167 --> 00:00:40,517
exhumed so that they can verify
that he's actually dead.
7
00:00:40,562 --> 00:00:41,872
Nikolas Badminton:
Sometimes they simply disappear
8
00:00:41,911 --> 00:00:47,221
once they have people's money.
9
00:00:47,264 --> 00:00:49,574
Narrator: An American woman is
denied prescription medication,
10
00:00:49,614 --> 00:00:51,624
because an algorithm determines
11
00:00:51,660 --> 00:00:54,580
that she is a high risk
for addiction.
12
00:00:54,619 --> 00:00:57,669
Kris Alexander: But she claims
she hasn't been abusing opiates.
13
00:00:57,709 --> 00:00:59,449
Mhairi Aitken: It's possible
the A.I. made a mistake.
14
00:00:59,494 --> 00:01:01,804
And if it did, that raises
serious concerns
15
00:01:01,844 --> 00:01:04,894
about its reliability.
16
00:01:04,934 --> 00:01:06,544
Narrator: The President of Gabon
appears in a
17
00:01:06,588 --> 00:01:08,238
video address to the nation,
18
00:01:08,285 --> 00:01:09,935
but some citizens believe
19
00:01:09,982 --> 00:01:12,722
that all may not be as it seems.
20
00:01:12,768 --> 00:01:15,898
Pringle: His face looks
a lot puffier and smoother,
21
00:01:15,945 --> 00:01:18,505
and the distance between
his eyes looks different.
22
00:01:18,556 --> 00:01:21,116
Anthony Morgan: There are
plenty of experts who admit
23
00:01:21,168 --> 00:01:23,038
that President Bongo's
video looks
24
00:01:23,083 --> 00:01:28,133
like it could have
been fabricated.
25
00:01:28,175 --> 00:01:29,955
Narrator: These are the stories
of the future
26
00:01:30,002 --> 00:01:33,752
that big data is bringing
to our doorsteps.
27
00:01:33,789 --> 00:01:37,659
The real world impact of
predictions and surveillance.
28
00:01:37,706 --> 00:01:40,136
The power of artificial
intelligence
29
00:01:40,187 --> 00:01:42,187
and autonomous machines.
30
00:01:42,232 --> 00:01:44,102
For better or for worse,
31
00:01:44,147 --> 00:01:47,147
these are the
Secrets of Big Data.
32
00:01:47,194 --> 00:01:59,644
♪♪
33
00:01:59,684 --> 00:02:02,254
Narrator: On December 8, 2018,
34
00:02:02,296 --> 00:02:04,166
30-year-old Gerald Cotten and
35
00:02:04,211 --> 00:02:06,171
his new bride Jennifer Robertson
36
00:02:06,213 --> 00:02:07,783
check into the opulent
37
00:02:07,823 --> 00:02:13,393
Oberoi Rajvilas resort
in Jaipur, India.
38
00:02:13,437 --> 00:02:15,737
Cotten is the founder of
QuadrigaCX,
39
00:02:15,787 --> 00:02:20,137
Canada's largest
cryptocurrency exchange.
40
00:02:20,183 --> 00:02:22,053
Pringle: QuadrigaCX
is an online platform
41
00:02:22,098 --> 00:02:24,668
where people can trade Bitcoin
and other cryptocurrencies.
42
00:02:24,709 --> 00:02:27,449
Sort of like a stock exchange.
43
00:02:27,495 --> 00:02:30,795
Narrator: Cryptocurrency can be
used to facilitate the sale,
44
00:02:30,846 --> 00:02:33,196
purchase, or trade of goods
between parties,
45
00:02:33,240 --> 00:02:36,630
just like the money
in our pockets.
46
00:02:36,678 --> 00:02:39,988
The difference is that it's
digital and uses encryption
47
00:02:40,029 --> 00:02:42,469
to control the creation of
monetary units
48
00:02:42,510 --> 00:02:47,950
and to verify the transfer and
ownership of funds.
49
00:02:47,993 --> 00:02:51,433
2017 was a banner year
for Bitcoin.
50
00:02:51,475 --> 00:02:55,475
Its value skyrocketed from
around $900 US
51
00:02:55,523 --> 00:02:58,223
to almost $20,000,
52
00:02:58,265 --> 00:03:02,085
and as a result, QuadrigaCX
flourished and Gerald Cotten
53
00:03:02,138 --> 00:03:06,708
became something of a
cryptocurrency tycoon.
54
00:03:06,751 --> 00:03:08,671
Badminton: Quadriga charged a
fee for every trade,
55
00:03:08,710 --> 00:03:11,800
and they handled almost
$2 billion in transactions
56
00:03:11,843 --> 00:03:15,673
from 363,000 accounts a year.
57
00:03:15,717 --> 00:03:18,147
There was a speculative frenzy
and everybody was looking
58
00:03:18,198 --> 00:03:22,848
to pump the market
and make some money.
59
00:03:22,898 --> 00:03:24,638
Narrator: Cotten and Robertson
are in India
60
00:03:24,682 --> 00:03:26,512
to celebrate their honeymoon.
61
00:03:26,554 --> 00:03:29,084
But things take a shocking turn.
62
00:03:29,121 --> 00:03:31,821
Shortly after checking
into the hotel,
63
00:03:31,863 --> 00:03:35,003
Cotten complains of severe
stomach pains.
64
00:03:35,040 --> 00:03:37,700
A hotel employee drives them to
a local hospital,
65
00:03:37,739 --> 00:03:40,179
where doctors initially dismiss
his condition
66
00:03:40,220 --> 00:03:41,790
as traveller's sickness.
67
00:03:41,830 --> 00:03:44,350
However, within 24 hours,
68
00:03:44,398 --> 00:03:50,618
Gerald Cotten is declared dead.
69
00:03:50,665 --> 00:03:52,315
Morgan: Cotten suffered
from Crohn's disease,
70
00:03:52,362 --> 00:03:56,372
but he didn't really
tell anyone about it.
71
00:03:56,410 --> 00:03:58,630
It causes inflammation of the
digestive tract,
72
00:03:58,673 --> 00:04:00,283
which can lead to abdominal
pain, fatigue,
73
00:04:00,327 --> 00:04:01,807
malnutrition, weight loss...
74
00:04:01,850 --> 00:04:06,770
But it is rarely fatal.
75
00:04:06,811 --> 00:04:08,071
Narrator: The official
cause of death is
76
00:04:08,117 --> 00:04:10,417
complications from Crohn's,
77
00:04:10,467 --> 00:04:13,167
but the gastroenterologist who
treated Cotten
78
00:04:13,209 --> 00:04:17,779
later admitted that they were
not sure about the diagnosis.
79
00:04:17,822 --> 00:04:20,962
Regardless of the cause, this
seemingly healthy young man
80
00:04:20,999 --> 00:04:24,259
is now dead and as
the story unfolds,
81
00:04:24,307 --> 00:04:26,737
the fallout from his passing
will have a profound effect
82
00:04:26,788 --> 00:04:30,788
on Quadriga's customers.
83
00:04:30,835 --> 00:04:33,875
Pringle: Cotten took Quadriga's
account passwords to his grave.
84
00:04:33,925 --> 00:04:37,275
Basically, around
$250 million Canadian dollars
85
00:04:37,320 --> 00:04:43,200
from over 75,000
customers vanished.
86
00:04:43,239 --> 00:04:44,939
Narrator: Cotten had
earlier claimed
87
00:04:44,980 --> 00:04:47,770
that he wrote the passwords down
and kept them in a safe
88
00:04:47,809 --> 00:04:50,769
bolted to the rafters
in his attic.
89
00:04:50,812 --> 00:04:53,292
After his death, one of his
employees went to the house
90
00:04:53,336 --> 00:04:56,296
to search for it,
but came up empty.
91
00:04:56,339 --> 00:04:59,599
He also didn't have a so-called
"dead man's switch",
92
00:04:59,647 --> 00:05:02,127
a back-up system that would have
automatically transmitted
93
00:05:02,171 --> 00:05:04,571
the codes to a
prearranged contact,
94
00:05:04,608 --> 00:05:06,648
should Quadriga's
accounts go dormant
95
00:05:06,697 --> 00:05:07,827
for a certain amount of time.
96
00:05:07,872 --> 00:05:13,662
♪
97
00:05:13,704 --> 00:05:16,324
Morgan: At this point, his
customers really start to panic.
98
00:05:16,359 --> 00:05:18,189
Many of them have invested
huge amounts of money
99
00:05:18,230 --> 00:05:20,280
and it looks like it's gone.
100
00:05:20,320 --> 00:05:25,110
♪
101
00:05:25,150 --> 00:05:27,020
Narrator: As the news of
Cotten's death
102
00:05:27,065 --> 00:05:29,235
and the missing passwords
hits the internet,
103
00:05:29,285 --> 00:05:32,415
some of Quadriga's account
holders become suspicious.
104
00:05:32,462 --> 00:05:35,122
Rumours begin to circulate
that Quadriga
105
00:05:35,160 --> 00:05:37,030
was nothing more than
a Ponzi scheme,
106
00:05:37,075 --> 00:05:40,165
an investment fraud that pays
out existing customers
107
00:05:40,209 --> 00:05:43,079
with money acquired
from new investors.
108
00:05:43,125 --> 00:05:45,345
Some even take it
one step further
109
00:05:45,388 --> 00:05:47,998
and suggest something
truly scandalous.
110
00:05:48,043 --> 00:05:50,743
Maybe Gerald Cotten
faked his own death.
111
00:05:50,785 --> 00:05:56,395
♪
112
00:05:56,443 --> 00:05:59,753
Morgan: There is a theory that
this was an elaborate exit scam.
113
00:05:59,794 --> 00:06:02,454
When a perpetrator needs a way
out of paying their investors
114
00:06:02,492 --> 00:06:05,712
by concocting some kind of story
that is beyond their control,
115
00:06:05,756 --> 00:06:08,756
something like "our bank
froze our assets."
116
00:06:08,803 --> 00:06:12,593
So they'll stall and avoid the
issue until investors eventually
117
00:06:12,633 --> 00:06:17,733
just give up and accept that
their money is probably gone.
118
00:06:17,768 --> 00:06:19,028
Badminton: Sometimes
they simply disappear
119
00:06:19,074 --> 00:06:20,604
once they have people's money.
120
00:06:20,641 --> 00:06:22,821
This kind of fraud is
common in the crypto world,
121
00:06:22,860 --> 00:06:28,080
because it's difficult to trace
and to catch the scammers.
122
00:06:28,126 --> 00:06:30,646
Narrator: The difficulty is
due to the anonymous nature
123
00:06:30,694 --> 00:06:32,354
of virtual assets.
124
00:06:32,392 --> 00:06:35,482
Cryptocurrencies are
decentralized, meaning
125
00:06:35,525 --> 00:06:38,745
all transactions are peer to
peer with no authorities,
126
00:06:38,789 --> 00:06:44,139
such as banks or government
institutions involved.
127
00:06:44,186 --> 00:06:45,876
Pringle: Transactions are
recorded on what's called
128
00:06:45,927 --> 00:06:48,927
the blockchain, which is a
public digital ledger.
129
00:06:48,973 --> 00:06:50,853
It's kind of like
a shared spreadsheet
130
00:06:50,888 --> 00:06:52,668
that's accessible to
anyone on the internet
131
00:06:52,716 --> 00:06:56,846
where every transaction
is publicly recorded.
132
00:06:56,894 --> 00:06:58,374
Narrator: With its
humble beginnings,
133
00:06:58,418 --> 00:07:00,678
few people would have predicted
the meteoric rise
134
00:07:00,724 --> 00:07:04,124
in the value of Bitcoin.
135
00:07:04,162 --> 00:07:06,162
This once obscure cryptocurrency
136
00:07:06,208 --> 00:07:08,468
grew out of the dark
corners of the internet
137
00:07:08,515 --> 00:07:10,995
and exploded onto
the investing scene.
138
00:07:11,039 --> 00:07:14,039
And it made some
people very rich.
139
00:07:14,085 --> 00:07:18,695
Gerald Cotten included.
140
00:07:18,742 --> 00:07:20,792
Pringle: Before his death,
Cotten and Robertson
141
00:07:20,831 --> 00:07:23,181
enjoyed the high life.
142
00:07:23,225 --> 00:07:25,835
They owned numerous houses
and rental properties,
143
00:07:25,880 --> 00:07:30,670
a 51-foot yacht and
even a small airplane.
144
00:07:30,711 --> 00:07:32,801
Morgan: The couple also
spent a small fortune
145
00:07:32,843 --> 00:07:34,723
on all kinds of luxuries.
146
00:07:34,758 --> 00:07:37,588
They liked to brag that they'd
been to more than 56 countries.
147
00:07:37,631 --> 00:07:39,591
But there's only
one small problem
148
00:07:39,633 --> 00:07:44,553
with all of their wealth
- it wasn't real.
149
00:07:44,594 --> 00:07:47,734
Narrator: When the price of
Bitcoin tanked in 2018,
150
00:07:47,771 --> 00:07:50,471
many of Quadriga's customers
became alarmed
151
00:07:50,513 --> 00:07:52,303
and wanted to pull
their money out,
152
00:07:52,341 --> 00:07:56,781
but Cotten struggled
to make the payments.
153
00:07:56,824 --> 00:07:58,224
Badminton: Cotten was helping
himself to customers' money
154
00:07:58,260 --> 00:08:00,040
to finance his lifestyle.
155
00:08:00,088 --> 00:08:02,608
Quadriga became his own
personal bank account
156
00:08:02,656 --> 00:08:04,526
and when people started
asking for their money,
157
00:08:04,571 --> 00:08:07,701
he didn't have it.
158
00:08:07,748 --> 00:08:09,878
Narrator: Curious internet
sleuths began looking into
159
00:08:09,924 --> 00:08:12,674
Cotten's background and
discovered that this wasn't
160
00:08:12,709 --> 00:08:18,539
the first time he had duped
unsuspecting investors.
161
00:08:18,585 --> 00:08:21,625
Incredibly, as a teenager,
he launched his first
162
00:08:21,675 --> 00:08:25,065
online pyramid scheme
called S&S Investments,
163
00:08:25,113 --> 00:08:27,073
a venture that promised returns
164
00:08:27,115 --> 00:08:31,855
of up to 150 percent
in just two days.
165
00:08:31,902 --> 00:08:34,782
Cotten ran the scam for three
months before closing up shop
166
00:08:34,818 --> 00:08:38,338
and vanishing with
investors' money.
167
00:08:38,387 --> 00:08:40,387
Pringle: Cotten went on to
perpetrate other scams
168
00:08:40,432 --> 00:08:44,002
before launching Quadriga,
engaging in money laundering,
169
00:08:44,045 --> 00:08:50,435
identity theft and
various other schemes.
170
00:08:50,486 --> 00:08:52,226
Narrator: While it's inarguable
that cryptocurrency
171
00:08:52,270 --> 00:08:54,490
has something of a
shadowy history,
172
00:08:54,534 --> 00:08:57,584
it has developed an almost
cult-like following
173
00:08:57,624 --> 00:09:01,154
among a younger generation of
technophiles.
174
00:09:01,192 --> 00:09:03,802
Many supporters staunchly
believe that digital currencies
175
00:09:03,847 --> 00:09:06,677
could soon become
part of daily life.
176
00:09:06,720 --> 00:09:09,550
Others are doubtful.
177
00:09:09,592 --> 00:09:11,332
Badminton: One of the biggest
problems with cryptocurrencies
178
00:09:11,376 --> 00:09:14,816
in their current state
is price volatility.
179
00:09:14,858 --> 00:09:17,378
There are huge swings in value,
which makes them unusable
180
00:09:17,426 --> 00:09:22,516
on a day-to-day basis as
a method of payment.
181
00:09:22,562 --> 00:09:24,132
Narrator: Critics of
cryptocurrency
182
00:09:24,172 --> 00:09:26,782
also like to point out
that it seems to be
183
00:09:26,827 --> 00:09:30,657
the currency of choice among
tech savvy criminals.
184
00:09:30,700 --> 00:09:33,090
The clandestine nature
of the blockchain
185
00:09:33,137 --> 00:09:35,527
makes it ideal for
money laundering.
186
00:09:35,575 --> 00:09:37,095
The history of crypto
187
00:09:37,141 --> 00:09:41,321
is littered with tales
of theft and fraud.
188
00:09:41,363 --> 00:09:44,193
The veil of secrecy isn't
always impenetrable,
189
00:09:44,235 --> 00:09:47,195
and sometimes, for people
like Gerald Cotten,
190
00:09:47,238 --> 00:09:50,328
the truth eventually
comes to light.
191
00:09:50,372 --> 00:09:52,502
Pringle: Cotten knew his house
of cards was collapsing
192
00:09:52,548 --> 00:09:54,378
as the price of Bitcoin
plummeted
193
00:09:54,419 --> 00:09:56,379
and he only had two choices,
194
00:09:56,421 --> 00:10:00,471
go to jail or disappear forever.
195
00:10:00,512 --> 00:10:02,512
His death came at a rather
convenient time,
196
00:10:02,558 --> 00:10:06,778
which is why some people think
he might still be alive.
197
00:10:06,823 --> 00:10:10,313
♪
198
00:10:10,348 --> 00:10:12,388
Narrator: Those who believe
Cotten may have faked
199
00:10:12,437 --> 00:10:15,437
his own death cite several
suspicious details
200
00:10:15,484 --> 00:10:17,974
that were revealed
in the aftermath.
201
00:10:18,008 --> 00:10:21,268
First off, there is the matter
of the death certificate -
202
00:10:21,316 --> 00:10:23,486
Cotten's name is misspelled.
203
00:10:23,535 --> 00:10:26,225
This may seem inconsequential,
204
00:10:26,277 --> 00:10:28,107
but the former chairman
of the company
205
00:10:28,149 --> 00:10:31,589
that managed the hospital and
issued the certificate
206
00:10:31,631 --> 00:10:38,201
had recently been found
guilty of fraud.
207
00:10:38,246 --> 00:10:40,116
Morgan: It's no secret
that it's easy to buy
208
00:10:40,161 --> 00:10:42,121
forged and faked
documents in India,
209
00:10:42,163 --> 00:10:44,513
so for somebody who wanted to
fake their own death,
210
00:10:44,556 --> 00:10:47,726
it would be a good
place to do it.
211
00:10:47,777 --> 00:10:50,297
Narrator: But the real red flag
for the conspiracy theorists
212
00:10:50,345 --> 00:10:53,995
is the fact that Cotten
composed a will, only four days
213
00:10:54,044 --> 00:10:56,664
before he and Robertson
left for India,
214
00:10:56,699 --> 00:10:59,479
leaving 12 million dollars
worth of real estate,
215
00:10:59,528 --> 00:11:01,618
along with various other assets,
216
00:11:01,661 --> 00:11:02,841
to his new bride.
217
00:11:02,879 --> 00:11:11,109
♪
218
00:11:11,148 --> 00:11:14,888
The will also made no reference
to any external hard drives,
219
00:11:14,935 --> 00:11:18,585
known as cold wallets, which
contained most of the money
220
00:11:18,634 --> 00:11:22,204
that Quadriga's customers
had invested.
221
00:11:22,246 --> 00:11:23,986
Pringle: It might just be
a coincidence,
222
00:11:24,031 --> 00:11:27,251
but a relatively healthy 30-year-old
man with a history of fraud,
223
00:11:27,295 --> 00:11:29,775
writing a will less than
a week before his death,
224
00:11:29,819 --> 00:11:32,299
definitely seems a bit fishy.
225
00:11:32,343 --> 00:11:34,823
And why no mention of the
supposed millions
226
00:11:34,868 --> 00:11:36,958
stored in the cold wallets?
227
00:11:37,000 --> 00:11:39,870
Narrator: Shortly after
Cotten's reported passing,
228
00:11:39,916 --> 00:11:42,006
blockchain investigators
discovered
229
00:11:42,049 --> 00:11:45,049
that almost all of
the wallets were empty.
230
00:11:45,095 --> 00:11:48,265
Cotten had transferred the funds
into personal accounts
231
00:11:48,316 --> 00:11:50,406
on other cryptocurrency
exchanges
232
00:11:50,448 --> 00:11:55,578
and then squandered most
of them on risky trades.
233
00:11:55,627 --> 00:11:57,627
Morgan: This was
a massive fraud.
234
00:11:57,673 --> 00:12:00,153
Cotten had to know that when
he came back to Canada,
235
00:12:00,197 --> 00:12:04,587
he was going to have
to face the music.
236
00:12:04,636 --> 00:12:07,546
Narrator: Cotten's scam wasn't
particularly sophisticated,
237
00:12:07,596 --> 00:12:09,726
it was really just a modern take
238
00:12:09,772 --> 00:12:11,732
on an old fashioned
Ponzi scheme.
239
00:12:11,774 --> 00:12:14,344
But cryptocurrency is
susceptible
240
00:12:14,385 --> 00:12:16,815
to all kinds of
illegal activity.
241
00:12:16,866 --> 00:12:20,256
Hackers have shown no shortage
of creativity when it comes to
242
00:12:20,304 --> 00:12:24,614
devising new and innovative
ways to steal it.
243
00:12:24,656 --> 00:12:26,746
Badminton: Hackers can break
into mobile data networks
244
00:12:26,789 --> 00:12:29,309
or users' home WiFi,
and then manipulate
245
00:12:29,357 --> 00:12:31,447
or divert crypto trades.
246
00:12:31,489 --> 00:12:35,449
They can also capture login and
password information this way.
247
00:12:35,493 --> 00:12:38,453
Pringle: Online digital wallets
and exchange platforms
248
00:12:38,496 --> 00:12:42,366
similar to Quadriga are also
vulnerable to cyberattacks.
249
00:12:42,413 --> 00:12:45,683
Narrator: Research analysts note
that while crypto-related crime
250
00:12:45,721 --> 00:12:49,071
may be at historic highs,
the growth of lawful use
251
00:12:49,116 --> 00:12:51,896
far outweighs the growth
of criminal activity.
252
00:12:51,945 --> 00:12:54,945
In 2021, illegal trades
253
00:12:54,991 --> 00:12:58,301
represented just 0.15 percent
254
00:12:58,342 --> 00:13:01,172
of the $15.8 trillion
255
00:13:01,215 --> 00:13:04,605
in total crypto transactions.
256
00:13:04,653 --> 00:13:05,743
Badminton: It's not
all doom and gloom.
257
00:13:05,785 --> 00:13:07,655
While the blockchain
is somewhat anonymous,
258
00:13:07,699 --> 00:13:10,089
every transaction
is logged on a ledger
259
00:13:10,137 --> 00:13:12,837
that can be seen by anyone with
an internet connection.
260
00:13:12,879 --> 00:13:15,529
And with the right resources,
authorities can use
261
00:13:15,577 --> 00:13:20,447
this transparency to
stop illegal activity.
262
00:13:20,495 --> 00:13:22,755
Narrator: Crypto detractors
admit that although
263
00:13:22,802 --> 00:13:25,672
it may not work as a currency
for the time being,
264
00:13:25,717 --> 00:13:27,847
the underlying blockchain
technology
265
00:13:27,894 --> 00:13:30,944
does have value in other areas.
266
00:13:30,984 --> 00:13:32,904
Morgan: Businesses are starting
to use the blockchain
267
00:13:32,942 --> 00:13:35,552
to secure their data,
to oversee supply chain
268
00:13:35,597 --> 00:13:39,557
and even to manage
intellectual property.
269
00:13:39,601 --> 00:13:41,821
Narrator: But the primary
function of the blockchain
270
00:13:41,864 --> 00:13:43,874
is still cryptocurrency,
271
00:13:43,910 --> 00:13:46,090
and as long as there is
money to be made,
272
00:13:46,129 --> 00:13:48,519
there will always be characters
like Gerald Cotten
273
00:13:48,566 --> 00:13:51,606
looking to cheat the system.
274
00:13:51,656 --> 00:13:53,746
For her part, Jennifer Robertson
275
00:13:53,789 --> 00:13:55,789
is adamant that her
husband is dead,
276
00:13:55,835 --> 00:13:59,745
and that all of the
speculation is ludicrous.
277
00:13:59,795 --> 00:14:02,445
She claims she brought Cotten's
body back to Canada
278
00:14:02,493 --> 00:14:05,933
and that he was buried five days
after his death in a small,
279
00:14:05,975 --> 00:14:09,585
closed casket ceremony in
Halifax, Nova Scotia.
280
00:14:09,631 --> 00:14:14,381
But still suspicions linger...
281
00:14:14,418 --> 00:14:16,418
Pringle: There are people out
there that want the body exhumed
282
00:14:16,464 --> 00:14:20,864
so that they can verify
that he's actually dead.
283
00:14:20,903 --> 00:14:23,783
Narrator: The FBI and Royal
Canadian Mounted Police
284
00:14:23,819 --> 00:14:28,169
are actively investigating
Cotten's case.
285
00:14:28,215 --> 00:14:29,995
Morgan: Is Cotten still alive?
286
00:14:30,043 --> 00:14:31,873
Well, the jury is still out.
287
00:14:31,914 --> 00:14:34,744
It is pretty tough to fake your
own death and get away with it,
288
00:14:34,786 --> 00:14:37,486
but Cotten does have a
proven track record
289
00:14:37,528 --> 00:14:40,358
of successfully lying to
huge numbers of people.
290
00:14:40,401 --> 00:14:45,101
So...who knows?
291
00:14:45,145 --> 00:14:47,145
Narrator: Investigators were
eventually able to recover
292
00:14:47,190 --> 00:14:49,280
around $46 million
293
00:14:49,323 --> 00:14:51,983
from Quadriga's accounts.
294
00:14:52,021 --> 00:14:54,761
Of the over 75,000
duped customers,
295
00:14:54,806 --> 00:14:57,676
only 17,000 of them made claims,
296
00:14:57,722 --> 00:14:59,292
and they will likely receive
297
00:14:59,333 --> 00:15:04,123
only around 10% of
their money back.
298
00:15:04,164 --> 00:15:07,174
The strange saga of
Gerald Cotten is a window
299
00:15:07,210 --> 00:15:11,000
into the controversial world
of cryptocurrency.
300
00:15:11,040 --> 00:15:13,430
Will it render traditional
currency obsolete
301
00:15:13,477 --> 00:15:15,307
and become the unified system
302
00:15:15,349 --> 00:15:18,569
for all global
financial transactions?
303
00:15:18,613 --> 00:15:22,233
Crypto evangelists, often with
full-throated zeal,
304
00:15:22,269 --> 00:15:25,749
say it is inevitable that one
day cryptocurrency
305
00:15:25,794 --> 00:15:29,364
will be an integral part of
our everyday lives.
306
00:15:29,406 --> 00:15:32,366
If and when that day comes
is anyone's guess.
307
00:15:32,409 --> 00:15:34,589
Maybe Gerald Cotten knows.
308
00:15:34,629 --> 00:15:35,889
Wherever he is.
309
00:15:35,935 --> 00:15:39,675
♪
310
00:15:39,721 --> 00:15:42,551
♪ [show theme music]
311
00:15:42,593 --> 00:15:49,473
♪♪
312
00:15:49,513 --> 00:15:51,993
Narrator: For many years,
a 33-year-old woman
313
00:15:52,038 --> 00:15:54,338
living in Michigan, USA
has endured
314
00:15:54,388 --> 00:15:58,258
a painful medical condition
called endometriosis,
315
00:15:58,305 --> 00:16:00,955
a disorder of the female
reproductive system,
316
00:16:01,003 --> 00:16:06,533
where abnormal cells begin to
grow outside the uterus.
317
00:16:06,574 --> 00:16:09,754
As a result of this condition,
she has had emergency surgery
318
00:16:09,794 --> 00:16:12,804
to remove life-threatening cysts
on one of her ovaries
319
00:16:12,841 --> 00:16:16,151
and lives in constant pain.
320
00:16:16,192 --> 00:16:18,242
Alexander: In order to
protect her identity,
321
00:16:18,281 --> 00:16:20,811
we will call her Kathryn.
It's not her real name,
322
00:16:20,849 --> 00:16:23,459
but to manage her day-to-day
pain, Kathryn's doctor
323
00:16:23,504 --> 00:16:25,994
had prescribed her different
oral opioids like Percocet.
324
00:16:26,028 --> 00:16:28,158
Fortunately, they seem
to be working.
325
00:16:28,204 --> 00:16:32,514
♪
326
00:16:32,556 --> 00:16:35,426
Narrator: But sometimes,
no amount of medication
327
00:16:35,472 --> 00:16:37,692
can alleviate Kathryn's
suffering.
328
00:16:37,735 --> 00:16:41,565
One day, the pain becomes too
intense for her to bear,
329
00:16:41,609 --> 00:16:44,439
and she checks into
the local hospital.
330
00:16:44,481 --> 00:16:47,531
Doctors immediately begin
monitoring her to see
331
00:16:47,571 --> 00:16:51,271
if she may be developing
another ovarian cyst.
332
00:16:51,314 --> 00:16:54,194
They provide her with
intravenous opioid medication
333
00:16:54,230 --> 00:16:56,840
to help relieve her pain.
334
00:16:56,885 --> 00:16:58,925
But on her fourth day
in the hospital,
335
00:16:58,974 --> 00:17:02,504
something startling happens.
336
00:17:02,543 --> 00:17:04,333
Pringle: Even though she has
the potential to develop
337
00:17:04,371 --> 00:17:07,331
another cyst, which would only
intensify the pain,
338
00:17:07,374 --> 00:17:10,034
a staff member informs her that
she will no longer be receiving
339
00:17:10,072 --> 00:17:14,292
any kind of opioid medication.
340
00:17:14,337 --> 00:17:15,947
Aitken: The reason is that the
medical staff believe
341
00:17:15,991 --> 00:17:18,651
she needs help that is not
'pain-related'.
342
00:17:18,689 --> 00:17:21,689
Narrator: Although 'Kathryn' is
still in serious pain,
343
00:17:21,736 --> 00:17:23,686
she is discharged
from the hospital
344
00:17:23,738 --> 00:17:25,908
with no further explanation.
345
00:17:25,957 --> 00:17:32,527
And only two weeks later, she
gets even more troubling news.
346
00:17:32,573 --> 00:17:34,793
Pringle: She receives a letter
from her gynecologist's office
347
00:17:34,836 --> 00:17:37,136
telling her that they are
dropping her as a patient.
348
00:17:37,186 --> 00:17:42,536
And she is very surprised
to find out why.
349
00:17:42,583 --> 00:17:44,853
Narrator: Her doctor recently
obtained a medical report
350
00:17:44,889 --> 00:17:47,199
from the NarxCare database,
351
00:17:47,240 --> 00:17:49,590
an AI program that uses
algorithms
352
00:17:49,633 --> 00:17:52,293
to help identify a patient's
risk of misusing
353
00:17:52,332 --> 00:17:57,422
or overdosing on drugs.
354
00:17:57,467 --> 00:18:02,817
The A.I. had flagged
Kathryn as "High Risk".
355
00:18:02,864 --> 00:18:08,874
Opioids are potentially very
dangerous medications.
356
00:18:08,913 --> 00:18:11,133
They are derived from the plant
that shares the same name
357
00:18:11,177 --> 00:18:15,527
as the drug: 'Opium'.
358
00:18:15,572 --> 00:18:20,102
When ingested, opioids interact
with specific cell receptors
359
00:18:20,142 --> 00:18:22,062
that mute a person's
pain perception
360
00:18:22,101 --> 00:18:26,541
and then boost feelings
of pleasure.
361
00:18:26,583 --> 00:18:28,633
Alexander: They are relatively
safe, if properly prescribed
362
00:18:28,672 --> 00:18:30,372
and monitored by a physician.
363
00:18:30,413 --> 00:18:31,893
But there are serious
health risks
364
00:18:31,936 --> 00:18:35,586
if opioids are used
incorrectly or abused.
365
00:18:35,636 --> 00:18:38,726
Narrator: Too high of a dose can
slow a person's breathing
366
00:18:38,769 --> 00:18:41,769
and heart rate, which
can lead to death.
367
00:18:41,816 --> 00:18:45,076
And they are very addictive.
368
00:18:45,124 --> 00:18:46,954
Pringle: The problem was born
in the '90s,
369
00:18:46,995 --> 00:18:49,815
with doctors overprescribing
opioids to patients.
370
00:18:49,867 --> 00:18:51,997
But before doctors and the rest
of society realized
371
00:18:52,043 --> 00:18:55,573
how addictive these drugs were,
the damage had been done.
372
00:18:55,612 --> 00:18:58,402
Now it's become a serious
problem all over the world,
373
00:18:58,441 --> 00:19:03,621
but it's at epidemic
levels in the US.
374
00:19:03,664 --> 00:19:08,104
Narrator: In 2019, nearly
50,000 Americans died
375
00:19:08,147 --> 00:19:10,667
from opioid-related overdoses,
376
00:19:10,714 --> 00:19:12,804
more than any other year
on record,
377
00:19:12,847 --> 00:19:18,327
136 people every day.
378
00:19:18,374 --> 00:19:21,294
Over the last 20 years, the
U.S. Department of Justice
379
00:19:21,334 --> 00:19:27,514
has spent millions of dollars
trying to mitigate the problem.
380
00:19:27,557 --> 00:19:29,207
Alexander: They created these
state-level prescription drug
381
00:19:29,255 --> 00:19:31,425
databases called P.D.M.P.
382
00:19:31,474 --> 00:19:34,044
- Prescription Drug
Monitoring Programs.
383
00:19:34,085 --> 00:19:36,125
They collect data on any
controlled substance
384
00:19:36,175 --> 00:19:39,695
that is prescribed to patients.
385
00:19:39,743 --> 00:19:42,403
Narrator: The U.S. government
has also recently contracted
386
00:19:42,442 --> 00:19:46,102
many private companies to mine
this P.D.M.P. data,
387
00:19:46,141 --> 00:19:48,361
some incorporating the
assistance of
388
00:19:48,404 --> 00:19:51,324
machine learning algorithms,
which are meant to indicate
389
00:19:51,364 --> 00:19:56,154
if a patient is potentially
misusing opioids.
390
00:19:56,195 --> 00:19:57,755
Aitken: It tracks the specific
doctors who prescribe
391
00:19:57,805 --> 00:20:00,365
their medications, the
pharmacists handing them out,
392
00:20:00,416 --> 00:20:02,846
the amount of dosages they
receive and if there are any
393
00:20:02,897 --> 00:20:06,467
overlapping days in their
prescriptions.
394
00:20:06,509 --> 00:20:08,379
Pringle: Based on the data,
it calculates and creates
395
00:20:08,424 --> 00:20:11,604
a patient's 'risk score', with 0
being those individuals
396
00:20:11,645 --> 00:20:13,035
who are at the least at risk,
397
00:20:13,081 --> 00:20:19,131
and 999 being those
highest at risk.
398
00:20:19,174 --> 00:20:21,394
Narrator: Many medical experts
assert that these programs
399
00:20:21,437 --> 00:20:24,787
are reliable in accurately
assessing a patient's risk,
400
00:20:24,832 --> 00:20:30,792
and are an essential tool to
help curb the opioid epidemic.
401
00:20:30,838 --> 00:20:32,798
Alexander: It allows doctors and
pharmacists to intervene,
402
00:20:32,840 --> 00:20:34,580
giving them the chance
to stop or reduce
403
00:20:34,624 --> 00:20:38,544
a patient's prescriptions before
they become a problem.
404
00:20:38,585 --> 00:20:40,665
Narrator: In 2021, the U.S.
405
00:20:40,717 --> 00:20:43,237
Centers for Disease
Control and Prevention
406
00:20:43,285 --> 00:20:46,155
found that there were
almost 20,000 more
407
00:20:46,201 --> 00:20:50,641
opioid overdose deaths
than the previous year.
408
00:20:50,684 --> 00:20:53,084
Pringle: This is a
significant increase but
409
00:20:53,121 --> 00:20:54,951
many addiction
experts counter that
410
00:20:54,992 --> 00:20:57,652
prescription opioids
are not the true culprit.
411
00:20:57,691 --> 00:20:59,301
That many deaths
are the result of
412
00:20:59,345 --> 00:21:02,825
black-market opioids
like fentanyl.
413
00:21:02,870 --> 00:21:04,920
Aitken: Some argue we should use
every tool at our disposal
414
00:21:04,959 --> 00:21:06,739
to combat opioid addiction.
415
00:21:06,787 --> 00:21:11,917
And that includes these
problematic A.I. programs.
416
00:21:11,966 --> 00:21:14,096
Narrator: But leveraging these
systems might come
417
00:21:14,142 --> 00:21:17,232
at the expense of fair treatment
for prescription users,
418
00:21:17,276 --> 00:21:19,706
who have been taking
opioids for years,
419
00:21:19,756 --> 00:21:22,626
with no reported problems.
420
00:21:22,672 --> 00:21:26,072
The issue lies with how the
technology analyzes the data
421
00:21:26,110 --> 00:21:28,330
of patients like Kathryn.
422
00:21:28,374 --> 00:21:30,074
Aitken: It's possible
the A.I. made a mistake.
423
00:21:30,114 --> 00:21:32,604
And if it did, that raises
serious concerns about
424
00:21:32,639 --> 00:21:34,899
its reliability in such a
high-stakes context
425
00:21:34,945 --> 00:21:38,165
as pain relief and
medical treatment.
426
00:21:38,209 --> 00:21:39,779
Alexander: There is
evidence to suggest
427
00:21:39,820 --> 00:21:41,650
that people with
long-term illnesses
428
00:21:41,691 --> 00:21:43,911
do tend to get flagged
the most, and not because
429
00:21:43,954 --> 00:21:46,094
they're abusing opioids,
but because of the
430
00:21:46,130 --> 00:21:50,270
large amount of data they've
accumulated over their lifetime.
431
00:21:50,309 --> 00:21:52,489
Narrator: But a patient's history
isn't the only data
432
00:21:52,528 --> 00:21:54,658
that the system keeps track of.
433
00:21:54,704 --> 00:21:59,364
The PDMP database also keeps
detailed records of physicians,
434
00:21:59,405 --> 00:22:01,615
and if the algorithm suspects
435
00:22:01,668 --> 00:22:04,108
that they may be
over-prescribing opioids,
436
00:22:04,148 --> 00:22:08,808
they could be at risk of losing
their medical license.
437
00:22:08,849 --> 00:22:11,499
Pringle: This could incentivize
doctors to not take on patients
438
00:22:11,547 --> 00:22:13,507
suffering from long-term
chronic pain,
439
00:22:13,549 --> 00:22:17,379
out of fear that they could
jeopardize their careers.
440
00:22:17,423 --> 00:22:20,773
Narrator: A 2021 survey of
U.S. medical clinics reveals
441
00:22:20,817 --> 00:22:23,597
that roughly 43 percent
are now declining
442
00:22:23,646 --> 00:22:27,906
new patients who
require opioids.
443
00:22:27,955 --> 00:22:29,515
Alexander: This leaves patients
with no other choice
444
00:22:29,565 --> 00:22:31,605
but to keep looking
for a physician.
445
00:22:31,654 --> 00:22:34,314
The problem is, if they are
lucky enough to find one,
446
00:22:34,353 --> 00:22:39,143
their practice may be far away,
possibly out of state.
447
00:22:39,183 --> 00:22:42,543
Narrator: Critics say this may
only exacerbate the problem.
448
00:22:42,578 --> 00:22:45,798
The algorithms know when a
person changes physicians
449
00:22:45,842 --> 00:22:48,852
and can easily ascertain the
distance between their clinic
450
00:22:48,889 --> 00:22:53,199
and the patient's home,
and flag this as a problem.
451
00:22:53,241 --> 00:22:54,421
Aitken: If a patient is
travelling far
452
00:22:54,460 --> 00:22:55,900
to see their physician,
453
00:22:55,939 --> 00:22:57,719
it may look like they are
'Doctor Shopping',
454
00:22:57,767 --> 00:22:59,417
which is when a person
is trying to obtain
455
00:22:59,465 --> 00:23:01,675
multiple prescriptions from as
many doctors as possible.
456
00:23:01,728 --> 00:23:06,518
It's common among people with
substance abuse issues.
457
00:23:06,559 --> 00:23:08,299
Narrator: And there are
further complications
458
00:23:08,343 --> 00:23:10,693
if patients are prescribed
other medications
459
00:23:10,737 --> 00:23:14,217
in concert with opioids.
460
00:23:14,262 --> 00:23:17,092
At one point in time,
Kathryn was suffering
461
00:23:17,134 --> 00:23:19,224
from post-traumatic
stress disorder,
462
00:23:19,267 --> 00:23:21,397
from an unspecified incident,
463
00:23:21,443 --> 00:23:24,623
and was given a prescription
for a benzodiazepine,
464
00:23:24,664 --> 00:23:31,934
a sedative that helps alleviate
anxiety and insomnia.
465
00:23:31,975 --> 00:23:35,275
A report by the U.S. National
Institute on Drug Abuse
466
00:23:35,326 --> 00:23:38,976
found that combining
benzodiazepines with opioids
467
00:23:39,026 --> 00:23:41,716
is extremely dangerous because
both drugs
468
00:23:41,768 --> 00:23:44,808
are strong sedatives and
can affect a person's ability
469
00:23:44,858 --> 00:23:49,208
to breathe - a common cause
of fatal overdoses.
470
00:23:49,253 --> 00:23:52,873
The study also found that when
these drugs are taken together,
471
00:23:52,909 --> 00:23:56,169
a person's chances of dying
are 10 times higher
472
00:23:56,217 --> 00:23:59,997
than those taking only opioids.
473
00:24:00,047 --> 00:24:02,347
Alexander: As it turns out, the
algorithm that assessed Kathryn
474
00:24:02,397 --> 00:24:04,967
does include drugs that treat
post-traumatic stress
475
00:24:05,008 --> 00:24:09,578
as "variables" in how
it determines risk.
476
00:24:09,622 --> 00:24:11,672
Aitken: This shows the A.I. did
what it is designed to do.
477
00:24:11,711 --> 00:24:15,451
It flagged Kathryn as being at a
higher risk of overdose.
478
00:24:15,497 --> 00:24:17,797
Narrator: Many academics believe
that the technology's
479
00:24:17,847 --> 00:24:22,107
good intentions might have
unforeseen consequences.
480
00:24:22,156 --> 00:24:25,116
The algorithms may unfairly
target women because
481
00:24:25,159 --> 00:24:30,689
statistically, they are more
likely to suffer from PTSD.
482
00:24:30,730 --> 00:24:32,600
Pringle: Studies show
that many women endure
483
00:24:32,645 --> 00:24:34,905
significant trauma
on a day-to-day basis.
484
00:24:34,951 --> 00:24:37,081
This can include spousal abuse or,
485
00:24:37,127 --> 00:24:41,867
in some instances, sexual abuse.
486
00:24:41,915 --> 00:24:45,655
Narrator: In 2021, a U.S. law
professor published a paper
487
00:24:45,701 --> 00:24:49,011
citing evidence that women
who've endured sexual abuse
488
00:24:49,052 --> 00:24:51,532
during their lifetime
are much more likely
489
00:24:51,577 --> 00:24:55,487
to be denied
opioid prescriptions.
490
00:24:55,537 --> 00:24:57,707
With no legal access
to medications
491
00:24:57,757 --> 00:25:00,407
for their psychological
or physical pain,
492
00:25:00,455 --> 00:25:06,155
it's believed many will turn
to the black market.
493
00:25:06,200 --> 00:25:07,770
Alexander: If you can't get your
prescription from your doctor
494
00:25:07,810 --> 00:25:10,200
and pharmacist, you're
left with no choice
495
00:25:10,247 --> 00:25:13,687
but to obtain them
by some other means.
496
00:25:13,729 --> 00:25:15,299
Aitken: And that puts these
women, and anyone else
497
00:25:15,339 --> 00:25:17,819
for that matter, in an even
more vulnerable situation
498
00:25:17,864 --> 00:25:19,694
because once
on the black market,
499
00:25:19,735 --> 00:25:21,475
the quality of the
product can vary.
500
00:25:21,520 --> 00:25:23,300
This means they're exposing
themselves to drugs
501
00:25:23,347 --> 00:25:27,737
that may be laced with
other unknown substances.
502
00:25:27,787 --> 00:25:30,087
Narrator: Kathryn still hasn't
found a new doctor,
503
00:25:30,137 --> 00:25:32,397
so she is understandably
concerned about
504
00:25:32,443 --> 00:25:34,663
where she'll be able to get
her medications from,
505
00:25:34,707 --> 00:25:37,057
if her endometriosis
flares up again,
506
00:25:37,100 --> 00:25:42,060
or if any other medical
situation arises.
507
00:25:42,105 --> 00:25:44,535
Pringle: The hospital never
provided her with an explanation
508
00:25:44,586 --> 00:25:46,976
as to why her prescriptions
were discontinued,
509
00:25:47,023 --> 00:25:54,383
so she remains in the dark.
510
00:25:54,422 --> 00:25:57,032
Narrator: Despondent and
desperate for answers,
511
00:25:57,077 --> 00:25:59,467
Kathryn began doing
research online
512
00:25:59,514 --> 00:26:01,654
and stumbled on a
possible reason
513
00:26:01,690 --> 00:26:04,300
for why she may have been
flagged by the system:
514
00:26:04,345 --> 00:26:06,695
she had sick pets.
515
00:26:06,739 --> 00:26:08,959
At the time of her
hospitalization,
516
00:26:09,002 --> 00:26:11,182
she owned two older rescue dogs
517
00:26:11,221 --> 00:26:13,831
with significant medical issues.
518
00:26:13,876 --> 00:26:17,486
Both had been prescribed
opioids, benzodiazepines,
519
00:26:17,532 --> 00:26:21,932
and barbiturates by
their veterinarians.
520
00:26:21,971 --> 00:26:23,361
Alexander: Prescriptions for
pets are put under
521
00:26:23,407 --> 00:26:25,847
their owner's name. So to the
algorithm it may look like
522
00:26:25,888 --> 00:26:28,018
she is using her
sick dogs as an excuse
523
00:26:28,064 --> 00:26:32,374
to obtain opioid
prescriptions for herself.
524
00:26:32,416 --> 00:26:35,026
Narrator: But A.I. advocates
say that such mistakes
525
00:26:35,071 --> 00:26:38,901
could be remedied by continually
improving the technology,
526
00:26:38,945 --> 00:26:42,425
and streamlining the data so
that the algorithms only use
527
00:26:42,470 --> 00:26:46,430
information that is
relevant to the patient.
528
00:26:46,474 --> 00:26:48,354
Aitken: The problem of opioid
addiction is complex,
529
00:26:48,389 --> 00:26:51,039
and it can't be resolved by
any one single solution.
530
00:26:51,087 --> 00:26:53,127
So technology won't
solve everything.
531
00:26:53,176 --> 00:26:55,176
But if it's designed and
developed responsibly,
532
00:26:55,222 --> 00:26:56,922
there's a role for technology
in addressing
533
00:26:56,963 --> 00:26:59,793
some of these challenges.
534
00:26:59,835 --> 00:27:01,745
Pringle: This situation
needs as much help
535
00:27:01,794 --> 00:27:03,624
and attention as it can get.
536
00:27:03,665 --> 00:27:05,665
Since the beginning
of the COVID-19 pandemic,
537
00:27:05,711 --> 00:27:10,591
overdose deaths in the U.S.
have soared by nearly 30%.
538
00:27:10,629 --> 00:27:12,239
Narrator: And it's
not just there.
539
00:27:12,282 --> 00:27:15,022
Opioid abuse in Canada,
Australia,
540
00:27:15,068 --> 00:27:18,068
and some European countries is
now reaching similar levels
541
00:27:18,114 --> 00:27:20,604
to those seen in the U.S.,
542
00:27:20,639 --> 00:27:22,949
which has prompted the
governments of Belgium,
543
00:27:22,989 --> 00:27:24,689
Finland and the U.K.
544
00:27:24,730 --> 00:27:26,730
to issue new guidelines
for doctors
545
00:27:26,775 --> 00:27:30,125
in prescribing opioid-based
medicines.
546
00:27:30,170 --> 00:27:31,780
Alexander: They too are
exploring the possibility
547
00:27:31,824 --> 00:27:36,444
of using A.I. to
address the issue.
548
00:27:36,480 --> 00:27:38,740
Narrator: It remains to be seen
what real impact
549
00:27:38,787 --> 00:27:43,617
technology will have on the
opioid epidemic.
550
00:27:43,662 --> 00:27:46,012
While the intentions are no
doubt in the right place,
551
00:27:46,055 --> 00:27:48,485
the middle ground between
preventing addiction
552
00:27:48,536 --> 00:27:50,356
and providing comfort to those
553
00:27:50,407 --> 00:27:53,577
who legitimately need
medication remains elusive.
554
00:27:53,628 --> 00:28:00,288
♪
555
00:28:00,330 --> 00:28:03,160
♪ [show theme music]
556
00:28:03,203 --> 00:28:12,433
♪♪
557
00:28:12,473 --> 00:28:14,783
Narrator: It's the fall of 2018
558
00:28:14,823 --> 00:28:18,093
and Ali Bongo,
the President of Gabon,
559
00:28:18,131 --> 00:28:21,611
hasn't been seen in public
in several weeks.
560
00:28:21,656 --> 00:28:25,786
Many citizens have become
suspicious of his absence.
561
00:28:25,834 --> 00:28:27,714
Knowing that he had
suffered a stroke,
562
00:28:27,749 --> 00:28:29,449
the public begins to suspect
563
00:28:29,490 --> 00:28:32,230
he might be incapacitated.
564
00:28:32,275 --> 00:28:34,535
But the government assures them
that he will be delivering
565
00:28:34,582 --> 00:28:39,202
his customary New Year's
address.
566
00:28:39,239 --> 00:28:42,629
On December 31st, a video of
President Bongo
567
00:28:42,677 --> 00:28:45,417
is released to the public.
568
00:28:45,462 --> 00:28:47,902
Ali Bongo: Hello, compatriots.
569
00:28:47,943 --> 00:28:50,553
The Gabonese people, before
anything else,
570
00:28:50,598 --> 00:28:55,908
allow me to address to you...
571
00:28:55,951 --> 00:28:57,471
Pringle: When people see
this video, it raises
572
00:28:57,518 --> 00:28:59,828
more questions than it answers.
573
00:28:59,868 --> 00:29:02,648
Ali Bongo:
[speaking foreign language]
574
00:29:02,697 --> 00:29:04,437
Narrator:
There is something unusual
575
00:29:04,481 --> 00:29:07,401
about Bongo's appearance.
576
00:29:07,441 --> 00:29:08,831
Morgan: His voice
sounds odd,
577
00:29:08,877 --> 00:29:11,227
and he doesn't really
look like himself.
578
00:29:11,271 --> 00:29:14,541
His facial expressions
are a little unnatural.
579
00:29:14,578 --> 00:29:19,058
Ali Bongo:
[speaking foreign language]
580
00:29:19,105 --> 00:29:21,795
Narrator: Rumours begin to
circulate that his video address
581
00:29:21,847 --> 00:29:24,107
was fabricated by the
Gabonese government,
582
00:29:24,153 --> 00:29:28,553
in order to deceive
its citizens.
583
00:29:28,592 --> 00:29:30,122
Aitken: What's driving the
rumours is the fact that
584
00:29:30,159 --> 00:29:32,549
a few months earlier, the
President was out of the country
585
00:29:32,596 --> 00:29:38,856
receiving medical treatment in
Saudi Arabia and England.
586
00:29:38,907 --> 00:29:40,337
Narrator: There is a
growing suspicion
587
00:29:40,387 --> 00:29:42,217
that he is either very ill,
588
00:29:42,258 --> 00:29:50,268
or even more troubling,
that he is actually dead.
589
00:29:50,310 --> 00:29:52,010
Some are convinced
that the video
590
00:29:52,051 --> 00:29:55,361
is what's called a deepfake.
591
00:29:55,402 --> 00:29:58,582
It's a form of synthetic media that
uses artificial intelligence
592
00:29:58,622 --> 00:30:01,062
to manipulate images or audio
593
00:30:01,103 --> 00:30:04,063
in order to create
fake events or actions,
594
00:30:04,106 --> 00:30:05,846
often to portray individuals
595
00:30:05,891 --> 00:30:09,681
saying or doing something
they actually didn't.
596
00:30:09,720 --> 00:30:14,200
And they do so with
striking authenticity.
597
00:30:14,247 --> 00:30:17,857
The roots of deepfakes go back
as far as the 1990s,
598
00:30:17,903 --> 00:30:21,433
when huge advancements were
made in digital media.
599
00:30:21,471 --> 00:30:24,041
Computer programmers
began experimenting
600
00:30:24,083 --> 00:30:27,043
with this new technology,
fabricating images
601
00:30:27,086 --> 00:30:31,126
and superimposing human faces
on other people.
602
00:30:31,177 --> 00:30:33,087
Badminton: One group of
programmers is eventually able
603
00:30:33,135 --> 00:30:35,825
to digitally modify
a person's mouth movements,
604
00:30:35,877 --> 00:30:37,657
so it looks like they are
saying things they never did.
605
00:30:37,705 --> 00:30:41,095
But it wasn't very convincing.
606
00:30:41,143 --> 00:30:42,713
Narrator: In the following
decades,
607
00:30:42,753 --> 00:30:45,373
and with the development of
artificial intelligence,
608
00:30:45,408 --> 00:30:48,368
the technology makes
huge leaps and bounds.
609
00:30:48,411 --> 00:30:52,201
With its help, programmers are
better able to accurately mimic
610
00:30:52,241 --> 00:30:58,251
or fabricate human beings in
the digital environment.
611
00:30:58,291 --> 00:31:00,291
Aitken: The term 'Deepfake'
doesn't really become
612
00:31:00,336 --> 00:31:03,296
part of the popular vocabulary
until 2017,
613
00:31:03,339 --> 00:31:05,249
when a user by that
very same name,
614
00:31:05,298 --> 00:31:08,518
posts a counterfeit video
in a Reddit thread.
615
00:31:08,562 --> 00:31:09,962
Morgan: People are shocked.
616
00:31:09,998 --> 00:31:13,088
It's a pornographic film
where the performer's face
617
00:31:13,132 --> 00:31:17,272
has been swapped with the face
of the actress, Gal Gadot.
618
00:31:17,310 --> 00:31:18,750
Fake Obama:
We're entering an era...
619
00:31:18,789 --> 00:31:20,749
Narrator: The first Deepfake
to garner significant
620
00:31:20,791 --> 00:31:24,321
mainstream attention
is a 2018 video
621
00:31:24,360 --> 00:31:27,890
that features former President
Barack Obama giving a speech
622
00:31:27,929 --> 00:31:30,629
about the dangers of
what's on the internet.
623
00:31:30,671 --> 00:31:33,371
Fake Obama: You see,
I would never say these things.
624
00:31:33,413 --> 00:31:34,763
Badminton:
But it turns out, a filmmaker
625
00:31:34,805 --> 00:31:36,715
had simply swapped
his mouth for Obama's,
626
00:31:36,764 --> 00:31:38,114
making it look and sound
627
00:31:38,157 --> 00:31:40,067
as if the President
was really talking.
628
00:31:40,115 --> 00:31:42,245
Jordan Peele: Moving forward,
we need to be more vigilant
629
00:31:42,291 --> 00:31:44,471
with what we trust
from the internet.
630
00:31:44,511 --> 00:31:46,861
Narrator: The video was
intended as an ironic
631
00:31:46,905 --> 00:31:49,335
Public Service Announcement
for the digital age,
632
00:31:49,385 --> 00:31:52,555
a deepfaked warning
about deepfakes.
633
00:31:52,606 --> 00:31:54,646
And it went viral
after being shared
634
00:31:54,695 --> 00:31:58,655
millions of times across
social media platforms.
635
00:31:58,699 --> 00:32:00,789
According to the
deepfake's creators,
636
00:32:00,831 --> 00:32:03,621
it required 56 hours
of machine learning
637
00:32:03,660 --> 00:32:06,620
to get the simulation right.
638
00:32:06,663 --> 00:32:08,233
The message culminates
with something
639
00:32:08,274 --> 00:32:10,494
Obama would never say.
640
00:32:10,537 --> 00:32:13,707
Fake Obama:
Stay woke bitches!
641
00:32:13,757 --> 00:32:16,627
Narrator: Deepfakes are created
by using algorithms
642
00:32:16,673 --> 00:32:19,763
that analyze various images
and videos of a person
643
00:32:19,807 --> 00:32:22,937
and then use this knowledge
to digitally reconstruct
644
00:32:22,984 --> 00:32:26,034
their body, its movements,
manner of speech
645
00:32:26,074 --> 00:32:28,994
and even the cadence
of their voice.
646
00:32:29,034 --> 00:32:31,124
Aitken: So not surprisingly,
there's a lot of concern
647
00:32:31,166 --> 00:32:32,556
that the technology
is going to be used
648
00:32:32,602 --> 00:32:35,262
for nefarious purposes.
But some people argue
649
00:32:35,301 --> 00:32:39,351
that it can also have
practical applications.
650
00:32:39,392 --> 00:32:41,392
Narrator: Educators say
'Deepfakes' can allow
651
00:32:41,437 --> 00:32:44,607
school boards to adopt
innovative new teaching methods
652
00:32:44,658 --> 00:32:46,528
that have the potential to
engage students
653
00:32:46,573 --> 00:32:50,143
on a deeper level.
654
00:32:50,185 --> 00:32:53,095
Pringle: Imagine a video or a
hologram of Abraham Lincoln,
655
00:32:53,145 --> 00:32:56,405
or Albert Einstein speaking
directly to a class.
656
00:32:56,452 --> 00:32:59,372
This will have far more of an
impact than just reading
657
00:32:59,412 --> 00:33:02,812
about them in a textbook or
watching a documentary.
658
00:33:02,850 --> 00:33:04,290
Narrator: But Deepfake
critics fear that
659
00:33:04,330 --> 00:33:06,550
as the technology improves,
660
00:33:06,593 --> 00:33:10,213
so will its overall
accessibility.
661
00:33:10,249 --> 00:33:12,559
Over the past few years,
the amount of
662
00:33:12,599 --> 00:33:15,859
'Deepfake' content has been
growing rapidly.
663
00:33:15,906 --> 00:33:17,686
At the end of 2018,
664
00:33:17,734 --> 00:33:21,744
there were roughly 8,000
deepfake videos posted online,
665
00:33:21,782 --> 00:33:23,872
but just nine months later,
666
00:33:23,914 --> 00:33:28,144
that number grew
to almost 15,000.
667
00:33:28,180 --> 00:33:30,230
Aitken: It's not just companies
related to the tech industry.
668
00:33:30,269 --> 00:33:32,619
There are academics,
industrial researchers,
669
00:33:32,662 --> 00:33:35,802
even amateur enthusiasts
that are now making them.
670
00:33:35,839 --> 00:33:37,059
Tom Cruise Deepfake:
What's up TikTok?
671
00:33:37,102 --> 00:33:38,542
Got a little tip for you.
672
00:33:38,581 --> 00:33:41,281
Call it a TipTok
[laughs]
673
00:33:41,323 --> 00:33:43,503
Narrator: One enterprising
visual effects artist
674
00:33:43,543 --> 00:33:47,903
even went so far as to create a
wildly popular TikTok channel
675
00:33:47,938 --> 00:33:51,288
entirely devoted to deepfake
Tom Cruise videos.
676
00:33:51,333 --> 00:33:54,253
It features an AI-generated
doppelganger
677
00:33:54,293 --> 00:33:56,033
meant to look and
sound like him.
678
00:33:56,077 --> 00:33:58,297
Fake Cruise: It's crazy!
679
00:33:58,340 --> 00:34:00,430
Narrator: But there is
a downside,
680
00:34:00,473 --> 00:34:02,523
as deepfakes continue to
proliferate
681
00:34:02,562 --> 00:34:04,562
and improve in quality,
682
00:34:04,607 --> 00:34:07,997
there is a genuine fear that the
general public may be unable
683
00:34:08,046 --> 00:34:12,826
to distinguish imitation
from reality.
684
00:34:12,876 --> 00:34:15,616
Morgan: Technology like this has
the potential to wreak havoc
685
00:34:15,662 --> 00:34:22,322
on our society if it falls
into the wrong hands.
686
00:34:22,364 --> 00:34:25,414
Narrator: Critics warn that it
may also have dire consequences
687
00:34:25,454 --> 00:34:27,284
if a government uses the
technology
688
00:34:27,326 --> 00:34:29,756
to misinform its people.
689
00:34:29,806 --> 00:34:32,586
And Gabon's citizens suspect
this could be
690
00:34:32,635 --> 00:34:36,115
why their government faked
Ali Bongo's video address.
691
00:34:36,161 --> 00:34:37,731
Morgan: According to the
country's constitution,
692
00:34:37,771 --> 00:34:41,341
if a sitting President
is too ill, or unfit to lead,
693
00:34:41,383 --> 00:34:46,133
the opposition party
can call an election.
694
00:34:46,171 --> 00:34:47,481
Pringle: This would give the
government a very good reason
695
00:34:47,520 --> 00:34:49,090
to deceive the public.
696
00:34:49,130 --> 00:34:51,570
Remember, he had had a stroke.
697
00:34:51,611 --> 00:34:55,751
And if he is not well enough
to lead, he can be replaced.
698
00:34:55,789 --> 00:34:58,659
Narrator: What President Bongo's
government doesn't predict
699
00:34:58,705 --> 00:35:02,055
is that the video ends up having
the exact opposite effect,
700
00:35:02,100 --> 00:35:04,540
only adding to people's
suspicions
701
00:35:04,580 --> 00:35:08,630
and increasing instability
in the country.
702
00:35:08,671 --> 00:35:10,541
Barely a week after
the video is released,
703
00:35:10,586 --> 00:35:14,066
a faction within the military
launches a coup,
704
00:35:14,112 --> 00:35:18,032
claiming that the government
has lost its legitimacy.
705
00:35:18,072 --> 00:35:19,472
Aitken:
The coup fails and tensions
706
00:35:19,508 --> 00:35:21,378
within the country
eventually simmer down.
707
00:35:21,423 --> 00:35:22,733
But it does raise
the possibility
708
00:35:22,772 --> 00:35:24,302
that a similar situation,
709
00:35:24,339 --> 00:35:26,599
where a government's legitimacy
is called into question,
710
00:35:26,646 --> 00:35:29,166
could happen really anywhere.
711
00:35:29,214 --> 00:35:33,394
Narrator: In 2019, the U.S.
intelligence community warned
712
00:35:33,435 --> 00:35:36,865
that Deepfakes were a serious
national security risk,
713
00:35:36,917 --> 00:35:39,697
because adversarial countries
could use them as a weapon
714
00:35:39,746 --> 00:35:43,446
against them and their allies.
715
00:35:43,489 --> 00:35:46,009
Morgan: 'Deepfakes' can be
used to destroy trust in
716
00:35:46,056 --> 00:35:51,756
public and private institutions,
and even individuals.
717
00:35:51,801 --> 00:35:53,671
Narrator: Shortly after
the Russian invasion
718
00:35:53,716 --> 00:35:56,196
of Ukraine in 2022,
719
00:35:56,241 --> 00:35:59,981
a fake video of Ukrainian
president Volodymyr Zelensky
720
00:36:00,027 --> 00:36:01,637
telling his soldiers to
surrender,
721
00:36:01,681 --> 00:36:03,811
was circulated online.
722
00:36:03,857 --> 00:36:07,507
It wasn't very convincing.
Zelensky's head was pixelated
723
00:36:07,556 --> 00:36:09,376
and seemed too big for his body,
724
00:36:09,428 --> 00:36:11,518
and his voice sounded different.
725
00:36:11,560 --> 00:36:20,220
Fake Zelensky:
[speaking Ukrainian]
726
00:36:20,265 --> 00:36:23,005
Narrator: The video was quickly
dismissed as a deepfake,
727
00:36:23,050 --> 00:36:25,100
but it showed the potential
for the spread of
728
00:36:25,139 --> 00:36:29,879
misinformation in a time of war.
729
00:36:29,926 --> 00:36:31,836
As was the case with Zelensky,
730
00:36:31,885 --> 00:36:34,315
when people viewed
the Ali Bongo video,
731
00:36:34,366 --> 00:36:38,196
they couldn't help but
notice his peculiar appearance.
732
00:36:38,239 --> 00:36:40,889
Pringle: His face looks
a lot puffier and smoother,
733
00:36:40,937 --> 00:36:45,027
and the distance between his
eyes looks different.
734
00:36:45,072 --> 00:36:46,292
Aitken: Other things
people claimed were odd:
735
00:36:46,334 --> 00:36:48,644
how little he blinks and
that his speech pattern
736
00:36:48,684 --> 00:36:50,254
seems totally different.
737
00:36:50,295 --> 00:36:58,735
[speaking foreign language]
738
00:36:58,781 --> 00:37:00,651
Narrator: And there are
other telltale signs,
739
00:37:00,696 --> 00:37:04,216
unnatural-looking colour,
blurring of the face
740
00:37:04,265 --> 00:37:09,575
and technical inconsistencies
within the audio.
741
00:37:09,618 --> 00:37:11,578
Morgan: There are plenty of
'Deepfake' experts
742
00:37:11,620 --> 00:37:14,100
who admit that President
Bongo's video looks
743
00:37:14,144 --> 00:37:15,974
like it could have
been fabricated.
744
00:37:16,016 --> 00:37:17,966
Narrator: But others counter
that the stroke
745
00:37:18,018 --> 00:37:19,758
he suffered months earlier
746
00:37:19,802 --> 00:37:22,202
may have impacted
his speaking ability,
747
00:37:22,240 --> 00:37:24,890
and that it's possible
he had botox injections,
748
00:37:24,938 --> 00:37:28,938
which caused the
puffiness in his face.
749
00:37:28,985 --> 00:37:30,545
Pringle: They also claim he
had a large amount of make-up
750
00:37:30,596 --> 00:37:32,026
applied to his face...
751
00:37:32,075 --> 00:37:33,725
These explanations
make sense.
752
00:37:33,773 --> 00:37:36,303
So it is possible
the doubters are wrong.
753
00:37:36,341 --> 00:37:38,951
But once you've sown
the seeds of doubt,
754
00:37:38,995 --> 00:37:42,085
the damage has been done.
755
00:37:42,129 --> 00:37:44,479
Narrator: Critics warn that
'Deepfake' technology poses
756
00:37:44,523 --> 00:37:47,443
an even greater threat to
other members of society -
757
00:37:47,482 --> 00:37:49,702
specifically women.
758
00:37:49,745 --> 00:37:52,355
In 2019, it was discovered
759
00:37:52,400 --> 00:37:56,320
that of the 15,000 deepfake
videos posted online,
760
00:37:56,361 --> 00:38:02,501
a staggering 96%
were pornographic.
761
00:38:02,541 --> 00:38:05,811
Morgan: A significant number of
them are female public figures.
762
00:38:05,848 --> 00:38:08,238
Obviously these videos were
made without their consent.
763
00:38:08,286 --> 00:38:10,586
And they're just
the easiest targets.
764
00:38:10,636 --> 00:38:15,506
Really, no one is safe.
765
00:38:15,554 --> 00:38:18,734
Narrator: A cyber-security firm
found that in 2020,
766
00:38:18,774 --> 00:38:21,824
'Deepfakes' were weaponized
against as many as
767
00:38:21,864 --> 00:38:23,654
100,000 different women
768
00:38:23,692 --> 00:38:27,962
as a form of 'revenge porn'.
769
00:38:28,001 --> 00:38:29,571
Aitken: It's a real
and horrific issue,
770
00:38:29,611 --> 00:38:32,351
and one which is very
hard to regulate.
771
00:38:32,397 --> 00:38:33,787
Morgan: It's illegal
in some countries,
772
00:38:33,833 --> 00:38:36,663
as well as certain US states.
773
00:38:36,705 --> 00:38:38,185
Aitken: Legislating
against deepfakes is
774
00:38:38,228 --> 00:38:40,268
much more difficult,
and much progress remains
775
00:38:40,318 --> 00:38:43,538
to be made on the issue.
776
00:38:43,582 --> 00:38:45,112
Narrator: Some of the tech
industry's most
777
00:38:45,148 --> 00:38:47,668
prominent companies are also
trying to limit
778
00:38:47,716 --> 00:38:49,976
the damage that deepfakes cause.
779
00:38:50,023 --> 00:38:53,073
Google and Facebook are
currently exploring new ways
780
00:38:53,113 --> 00:38:55,293
to flag synthetic content.
781
00:38:55,333 --> 00:38:57,073
One of them involves using
782
00:38:57,117 --> 00:39:02,207
the very same technology
that helps create them.
783
00:39:02,252 --> 00:39:03,732
Badminton: Artificial
Intelligence can be trained
784
00:39:03,776 --> 00:39:05,946
to spot technical
inconsistencies in images,
785
00:39:05,995 --> 00:39:09,695
videos and audio that
have been 'Deepfaked'.
786
00:39:09,738 --> 00:39:11,608
Morgan: Many companies are
also considering adding
787
00:39:11,653 --> 00:39:14,443
a unique digital signature
to images and videos
788
00:39:14,482 --> 00:39:17,702
so that it's easy to track
where they're coming from.
789
00:39:17,746 --> 00:39:20,446
Narrator: But even something
like a digital signature
790
00:39:20,488 --> 00:39:23,268
may not have been enough to
quell the suspicion surrounding
791
00:39:23,317 --> 00:39:27,097
the authenticity of Ali Bongo's
New Year's address.
792
00:39:27,147 --> 00:39:28,837
Pringle: Was it actually fake?
793
00:39:28,888 --> 00:39:30,888
To solve the mystery
once and for all,
794
00:39:30,933 --> 00:39:33,113
in 2020, a cyber-defence firm
795
00:39:33,153 --> 00:39:34,553
runs the New Year's Eve video
796
00:39:34,589 --> 00:39:36,499
through two forensic tests.
797
00:39:36,548 --> 00:39:39,678
To the surprise of many, they
conclude it was likely
798
00:39:39,725 --> 00:39:43,335
not 'Deepfaked'!
799
00:39:43,381 --> 00:39:45,341
Morgan: But even so,
look at the damage
800
00:39:45,383 --> 00:39:47,823
that just the accusation caused:
801
00:39:47,863 --> 00:39:51,873
not just public mistrust,
but an attempted coup!
802
00:39:51,911 --> 00:39:55,921
Narrator: Yet, as 'Deepfake'
technology continues to improve,
803
00:39:55,958 --> 00:39:59,608
as evidenced by the
@deeptomcruise TikTok account,
804
00:39:59,658 --> 00:40:02,838
many experts believe it may
eventually become impossible
805
00:40:02,878 --> 00:40:06,838
for anyone to determine what's
real and what isn't.
806
00:40:06,882 --> 00:40:09,192
Aitken: And this may be
where the true damage lies.
807
00:40:09,232 --> 00:40:12,712
Simply by existing, deepfakes
will call into question
808
00:40:12,758 --> 00:40:14,888
our understanding of what
constitutes proof
809
00:40:14,934 --> 00:40:17,764
that something has
happened, or existed.
810
00:40:17,806 --> 00:40:20,366
Morgan: In the past, videos
and images have been taken
811
00:40:20,418 --> 00:40:24,328
as incontrovertible proof that
something did actually occur.
812
00:40:24,378 --> 00:40:26,508
But now, it's going
to be a lot harder
813
00:40:26,554 --> 00:40:31,044
to tell fact from fiction.
814
00:40:31,080 --> 00:40:33,040
Narrator:
In the best case scenario,
815
00:40:33,082 --> 00:40:36,352
the positive applications of
deepfake technology
816
00:40:36,390 --> 00:40:38,480
will one day emerge
from the shadow
817
00:40:38,523 --> 00:40:40,263
of their destructive
counterparts.
818
00:40:40,307 --> 00:40:42,957
And become a force
for good in the world.
819
00:40:43,005 --> 00:40:48,615
But until then, don't believe
everything you see.
820
00:40:48,663 --> 00:40:49,843
Tom Cruise Deepfake: Just wait
'till what's coming next!
821
00:40:49,882 --> 00:40:55,192
[laughs]
822
00:40:55,235 --> 00:41:02,975
♪ [show theme music]
823
00:41:03,025 --> 00:41:10,815
♪
824
00:41:10,859 --> 00:41:12,509
Narrator: Chinnavenkateswarlu
825
00:41:12,557 --> 00:41:15,337
is a farmer in
Andhra Pradesh, India,
826
00:41:15,385 --> 00:41:18,075
looking for new ways to
increase the size of the crop
827
00:41:18,127 --> 00:41:22,177
on his small, 3-acre farm.
828
00:41:22,218 --> 00:41:24,608
Alexander: Chinnavenkateswarlu's
family has had this plot of land
829
00:41:24,656 --> 00:41:26,046
for several generations.
830
00:41:26,092 --> 00:41:29,662
On it, he grows several
different types of groundnuts.
831
00:41:29,704 --> 00:41:32,494
Narrator: In early 2016,
832
00:41:32,533 --> 00:41:34,973
he is invited to be part
of a pilot program,
833
00:41:35,014 --> 00:41:37,154
which has been organized
by Microsoft,
834
00:41:37,190 --> 00:41:40,410
and a prominent international
crop research institute,
835
00:41:40,454 --> 00:41:45,424
along with the cooperation
of the Indian government.
836
00:41:45,459 --> 00:41:47,809
Pringle: He is one of 175
farmers within the region
837
00:41:47,853 --> 00:41:49,073
who have been
chosen to participate
838
00:41:49,115 --> 00:41:51,765
in this research project.
839
00:41:51,813 --> 00:41:54,083
Narrator: Its aim is to see
if they can improve India's
840
00:41:54,120 --> 00:41:59,600
agricultural industry by using
artificial intelligence.
841
00:41:59,647 --> 00:42:01,127
Morgan: India is just
one of many countries
842
00:42:01,170 --> 00:42:03,910
that is using AI to optimize
food production.
843
00:42:03,956 --> 00:42:06,566
The projects are still
in their initial phases,
844
00:42:06,611 --> 00:42:09,441
but they are already
showing some promise.
845
00:42:09,483 --> 00:42:11,703
Narrator: The AI's
automated system will send
846
00:42:11,746 --> 00:42:15,136
Chinnavenkateswarlu and the
other farmers text messages
847
00:42:15,184 --> 00:42:20,714
containing important information
on how to improve their crops.
848
00:42:20,755 --> 00:42:22,445
Alexander: It will do this by
providing recommendations
849
00:42:22,496 --> 00:42:25,326
for land preparation,
suggested sowing dates,
850
00:42:25,368 --> 00:42:27,368
managing and testing
soil nutrients
851
00:42:27,414 --> 00:42:31,854
and providing notes on the
application of fertilizers.
852
00:42:31,897 --> 00:42:34,067
Narrator: Researchers for the
pilot project are hopeful
853
00:42:34,116 --> 00:42:37,246
that AI will be able to boost
the farmers' food production
854
00:42:37,293 --> 00:42:39,033
while improving the overall
855
00:42:39,078 --> 00:42:44,038
nutritional quality
of their crops.
856
00:42:44,083 --> 00:42:46,833
With an estimated 2 billion
people across the world
857
00:42:46,868 --> 00:42:48,868
suffering from malnutrition,
858
00:42:48,914 --> 00:42:52,054
and millions more who simply
don't get enough to eat,
859
00:42:52,091 --> 00:42:55,751
programs like this may be able
to make a contribution
860
00:42:55,790 --> 00:42:58,970
to combating the
world's food crisis.
861
00:42:59,011 --> 00:43:00,491
Pringle: The causes of the
global food crisis
862
00:43:00,534 --> 00:43:02,884
are varied and complex.
863
00:43:02,928 --> 00:43:05,228
But one of the major
reasons is poverty.
864
00:43:05,278 --> 00:43:07,368
A lot of people are living
on an income
865
00:43:07,410 --> 00:43:09,460
of less than $2 a day,
866
00:43:09,499 --> 00:43:13,159
meaning they can't afford to
feed their families.
867
00:43:13,199 --> 00:43:14,899
Narrator: And there is
substantial evidence
868
00:43:14,940 --> 00:43:17,900
that the food crisis
will only get worse.
869
00:43:17,943 --> 00:43:21,773
Recent studies say that by 2050,
870
00:43:21,816 --> 00:43:25,776
the global population will hit
10 billion.
871
00:43:25,820 --> 00:43:27,950
And the UN predicts
we'll need to produce
872
00:43:27,996 --> 00:43:33,646
up to 70% more food
to meet this increase.
873
00:43:33,698 --> 00:43:36,608
With this in mind, AI is
being used in order to make
874
00:43:36,657 --> 00:43:40,177
cultivation more
efficient and bountiful.
875
00:43:40,226 --> 00:43:42,526
Pringle: Machine learning
algorithms can analyze visual
876
00:43:42,576 --> 00:43:45,226
data of farmers' crops that have
been captured by drones
877
00:43:45,274 --> 00:43:47,804
and provide a report on the
state of the crops
878
00:43:47,842 --> 00:43:50,282
and the viability of the farm.
879
00:43:50,323 --> 00:43:51,853
Morgan: AI can also
use that drone data
880
00:43:51,890 --> 00:43:53,810
to improve the quality
of the soil,
881
00:43:53,848 --> 00:43:58,458
by identifying potential
nutrient deficiencies.
882
00:43:58,505 --> 00:44:01,285
It can also spot insects,
like locusts,
883
00:44:01,334 --> 00:44:02,904
and then warn farmers
so that they know
884
00:44:02,944 --> 00:44:07,514
that their crops are
at risk of being devoured.
885
00:44:07,557 --> 00:44:10,127
Narrator: In Uganda, farmers
have been using this technology
886
00:44:10,169 --> 00:44:13,219
to help identify
different diseases.
887
00:44:13,259 --> 00:44:15,869
Using a simple
smartphone application,
888
00:44:15,914 --> 00:44:20,404
they are able to analyze plants
for possible contamination.
889
00:44:20,440 --> 00:44:22,920
The app can then provide
treatment information
890
00:44:22,964 --> 00:44:28,234
to help mitigate the risk of it
infecting their entire crop.
891
00:44:28,274 --> 00:44:29,974
Alexander: Several tech
companies are also developing
892
00:44:30,015 --> 00:44:32,885
AI robots that can assess the
quality of crops,
893
00:44:32,931 --> 00:44:34,761
and harvest them at a much
faster pace
894
00:44:34,802 --> 00:44:38,632
and at a higher
volume than before.
895
00:44:38,676 --> 00:44:40,456
Narrator: But as AI
becomes more common
896
00:44:40,503 --> 00:44:42,423
in the agriculture industry,
897
00:44:42,462 --> 00:44:44,292
it may have a negative impact
898
00:44:44,333 --> 00:44:48,903
on some of the very people
it's meant to help.
899
00:44:48,947 --> 00:44:51,337
Pringle: Agriculture is a
$3 trillion industry
900
00:44:51,384 --> 00:44:54,394
that employs over 1.5 billion
people worldwide.
901
00:44:54,430 --> 00:44:57,870
That is 20% of the
world's population.
902
00:44:57,912 --> 00:45:01,052
And it's predicted that over the
coming decades this technology
903
00:45:01,089 --> 00:45:04,139
may put millions of
workers out of a job.
904
00:45:04,179 --> 00:45:06,619
Morgan: Robotic equipment is
both faster and cheaper
905
00:45:06,660 --> 00:45:08,710
than human labour,
so many people worry
906
00:45:08,749 --> 00:45:10,879
that once this technology
is introduced,
907
00:45:10,925 --> 00:45:13,185
there will be huge shocks
in the labour market
908
00:45:13,232 --> 00:45:18,192
as humans are forced
out of work.
909
00:45:18,237 --> 00:45:20,537
Narrator: But there are other
positive effects to consider.
910
00:45:20,587 --> 00:45:22,977
AI's application in agriculture
911
00:45:23,024 --> 00:45:25,774
may be beneficial to the
environment.
912
00:45:25,810 --> 00:45:28,460
With the help of technology,
farmers might be able
913
00:45:28,508 --> 00:45:34,298
to produce greater yields while
using less land.
914
00:45:34,340 --> 00:45:36,430
Pringle: Less land to cultivate
and harvest means
915
00:45:36,472 --> 00:45:39,172
there will be lower
carbon emissions.
916
00:45:39,214 --> 00:45:41,484
Morgan: Every year,
huge numbers of crops are lost
917
00:45:41,521 --> 00:45:46,831
to extreme weather events
as a result of climate change.
918
00:45:46,874 --> 00:45:50,574
Narrator: A 2019 study by the
University of Illinois
919
00:45:50,617 --> 00:45:52,617
found that since the early '80s,
920
00:45:52,662 --> 00:45:54,882
corn yields in the
American midwest
921
00:45:54,926 --> 00:45:57,666
were reduced by as much as 37%,
922
00:45:57,711 --> 00:46:01,671
due to excessive
rainfall and drought.
923
00:46:01,715 --> 00:46:04,625
But AI applications are proving
to be effective in
924
00:46:04,674 --> 00:46:10,074
preparing farmers for the impact
of these adverse conditions.
925
00:46:10,115 --> 00:46:12,115
Alexander: Algorithms are now
able to analyze a region's
926
00:46:12,160 --> 00:46:14,210
weather conditions from
satellite imagery
927
00:46:14,249 --> 00:46:18,429
and compare it
to historical data.
928
00:46:18,471 --> 00:46:20,781
Narrator: Such information will
be especially important
929
00:46:20,821 --> 00:46:23,081
to farmers like
Chinnavenkateswarlu,
930
00:46:23,128 --> 00:46:25,608
as Andhra Pradesh is
known to experience
931
00:46:25,652 --> 00:46:31,922
unpredictable weather.
932
00:46:31,963 --> 00:46:33,443
Morgan: For the past decade,
933
00:46:33,486 --> 00:46:35,226
as a likely result
of climate change,
934
00:46:35,270 --> 00:46:37,840
it's become more and more
difficult to predict exactly
935
00:46:37,882 --> 00:46:40,062
when monsoon season will happen,
936
00:46:40,101 --> 00:46:43,931
and that has meant poorer
and poorer crop yields.
937
00:46:43,975 --> 00:46:46,535
This AI technology can
analyze weather data
938
00:46:46,586 --> 00:46:48,496
as far back as we have it,
939
00:46:48,544 --> 00:46:50,854
in order to make very
accurate predictions about
940
00:46:50,895 --> 00:46:55,505
exactly when we should be
planting and reaping our crops.
941
00:46:55,551 --> 00:46:58,211
This can go a long way towards
reducing the impact
942
00:46:58,250 --> 00:47:03,210
of monsoons on crop yields.
943
00:47:03,255 --> 00:47:06,555
♪
944
00:47:06,606 --> 00:47:09,086
Narrator: Chinnavenkateswarlu
usually sows his nut crops
945
00:47:09,130 --> 00:47:11,610
in early June,
but before he does,
946
00:47:11,654 --> 00:47:14,274
he receives a text
suggesting he should wait.
947
00:47:14,309 --> 00:47:18,879
So he puts it off
until June 25th.
948
00:47:18,923 --> 00:47:20,843
Pringle: He's hoping this
slight shift in timing
949
00:47:20,881 --> 00:47:24,281
will help increase his harvest.
950
00:47:24,319 --> 00:47:26,229
Narrator: Initiatives like
the one Chinnavenkateswarlu
951
00:47:26,278 --> 00:47:30,978
is participating in are becoming
more and more commonplace.
952
00:47:31,022 --> 00:47:34,072
Globally, investment in
artificial intelligence
953
00:47:34,112 --> 00:47:37,862
in agriculture is predicted to
grow from $1 billion in 2020,
954
00:47:37,898 --> 00:47:41,988
to $4 billion in 2026.
955
00:47:42,033 --> 00:47:43,303
Alexander: Some of this
investment is targeting
956
00:47:43,338 --> 00:47:46,338
what is called AI-driven
indoor vertical farming.
957
00:47:46,385 --> 00:47:48,995
This is when, in order to
maximize land use,
958
00:47:49,040 --> 00:47:51,740
crops are grown upwards,
not as they usually are
959
00:47:51,781 --> 00:47:54,571
on a horizontal, flat field.
960
00:47:54,610 --> 00:47:57,270
Pringle: This approach uses
up to 90% less water
961
00:47:57,309 --> 00:48:00,829
and can produce over 20 times
more food per acre
962
00:48:00,878 --> 00:48:03,788
than the traditional
way of farming.
963
00:48:03,837 --> 00:48:05,527
Narrator: With so many
corporations getting into
964
00:48:05,578 --> 00:48:09,228
AI farming, critics worry what
impact it may have
965
00:48:09,277 --> 00:48:11,887
on farms that don't have the
financial resources
966
00:48:11,932 --> 00:48:16,152
to invest in this new
technology.
967
00:48:16,197 --> 00:48:19,507
Over the past decade,
family-owned farms have been
968
00:48:19,548 --> 00:48:22,328
disappearing in many parts of
the western world,
969
00:48:22,377 --> 00:48:24,417
most of them put out
of business by large,
970
00:48:24,466 --> 00:48:28,596
corporate food producers.
971
00:48:28,644 --> 00:48:30,124
Morgan: Some of these family
farms have been around
972
00:48:30,168 --> 00:48:37,648
for decades,
and now they are gone.
973
00:48:37,697 --> 00:48:40,177
Narrator: Owing to the high
cost of implementation
974
00:48:40,221 --> 00:48:42,571
and integration,
there are concerns
975
00:48:42,615 --> 00:48:46,225
that as AI increasingly becomes
a part of food production,
976
00:48:46,271 --> 00:48:49,491
large corporations will continue
to increase their control
977
00:48:49,535 --> 00:48:53,055
over the world's food resources.
978
00:48:53,104 --> 00:48:54,284
Alexander: If just
a few corporations
979
00:48:54,322 --> 00:48:56,672
dominate food production,
history tells us
980
00:48:56,716 --> 00:48:58,756
that problems like climate
change and poverty
981
00:48:58,805 --> 00:49:02,235
are likely not to be
adequately addressed.
982
00:49:02,287 --> 00:49:04,377
Pringle: For such enormous
issues to be tackled,
983
00:49:04,419 --> 00:49:07,899
we would need a shift not just
in how and what we produce,
984
00:49:07,945 --> 00:49:10,725
but also in how the
profits are distributed.
985
00:49:10,773 --> 00:49:12,733
And that might be
beyond the reach
986
00:49:12,775 --> 00:49:15,945
of artificial intelligence.
987
00:49:15,996 --> 00:49:17,736
Narrator: AI can't do
everything,
988
00:49:17,780 --> 00:49:20,910
but as Chinnavenkateswarlu is
about to find out,
989
00:49:20,958 --> 00:49:25,528
it might be able to improve
crop health and yield.
990
00:49:25,571 --> 00:49:28,791
He harvests his groundnuts
on October 28th,
991
00:49:28,835 --> 00:49:32,395
which is a little
later than usual.
992
00:49:32,447 --> 00:49:35,537
Morgan: He's usually able to
harvest about a tonne per hectare,
993
00:49:35,581 --> 00:49:37,411
but this year he's noticed
that it's increased
994
00:49:37,452 --> 00:49:39,802
to about 1.35 tonnes
per hectare.
995
00:49:39,846 --> 00:49:42,496
That's a big deal.
996
00:49:42,544 --> 00:49:44,334
Alexander: And even more
significantly,
997
00:49:44,372 --> 00:49:46,552
all the other farmers that were
part of the same project
998
00:49:46,592 --> 00:49:49,512
had an average increase
of 30% in their yield.
999
00:49:49,551 --> 00:49:51,601
So judging by this little
initiative,
1000
00:49:51,640 --> 00:49:55,780
it seems that AI can definitely
have some positive impact.
1001
00:49:55,818 --> 00:49:58,038
Narrator: What real effect
artificial intelligence
1002
00:49:58,082 --> 00:50:01,822
will have in the fight against
world hunger remains unknown.
1003
00:50:01,868 --> 00:50:04,038
But one thing is for certain,
1004
00:50:04,088 --> 00:50:07,048
in order to feed the world's
growing population,
1005
00:50:07,091 --> 00:50:11,531
it's paramount that we transform
our agricultural systems.
1006
00:50:11,573 --> 00:50:14,193
And technology might just
be the approach
1007
00:50:14,228 --> 00:50:16,538
that puts more food
on the table.
1008
00:50:16,578 --> 00:50:21,018
♪