1
00:00:04,480 --> 00:00:08,520
[music]
2
00:00:08,760 --> 00:00:13,520
Hi, how are you feeling?
I just checked your health data.
3
00:00:14,400 --> 00:00:18,440
Your last meal contributed sixty
percent of your daily nutrients
4
00:00:18,480 --> 00:00:20,480
and you've completed
eleven thousand
5
00:00:20,480 --> 00:00:22,800
steps towards your
daily fitness goal,
6
00:00:24,000 --> 00:00:26,720
so you can take a seat now.
7
00:00:26,760 --> 00:00:29,080
I've got something
for you to watch
8
00:00:29,080 --> 00:00:30,560
and I'll be watching too.
9
00:00:30,640 --> 00:00:31,720
[music]
10
00:00:33,400 --> 00:00:35,520
Tonight you're going
to see humans take
11
00:00:35,520 --> 00:00:38,080
on the robots that
might replace them.
12
00:00:38,147 --> 00:00:41,640
There always need to be
experienced people on the road.
13
00:00:41,680 --> 00:00:45,005
From truckies to lawyers,
artificial intelligence is
14
00:00:45,045 --> 00:00:48,193
coming
- actually, it's already here.
15
00:00:48,242 --> 00:00:50,036
I didn't realise that it
would just be able to tell you
16
00:00:50,076 --> 00:00:52,525
'hey, here's the exact answer
to your question'.
17
00:00:52,565 --> 00:00:54,791
We'll challenge your
thinking about AI.
18
00:00:54,831 --> 00:00:56,888
Same category, sixteen hundred.
19
00:00:56,928 --> 00:00:58,649
AI's going to become
like electricity.
20
00:00:58,689 --> 00:01:00,664
Automation isn't going
to affect some workers
21
00:01:00,704 --> 00:01:02,477
it's going to affect every
worker
22
00:01:02,697 --> 00:01:05,297
and we let the
generation most affected
23
00:01:05,392 --> 00:01:06,659
take on the experts.
24
00:01:06,829 --> 00:01:09,435
I think that the younger
generations probably have a
25
00:01:09,475 --> 00:01:11,075
better idea of where things are
going than the older
26
00:01:11,115 --> 00:01:12,623
generations.
[laughter]
27
00:01:12,852 --> 00:01:16,500
Tonight, we'll help you
get ready for the AI race.
28
00:01:16,540 --> 00:01:17,540
[music]
29
00:01:41,753 --> 00:01:46,102
Australian truckies often
work up to 72 hours a week
30
00:01:46,142 --> 00:01:49,875
and are now driving bigger
rigs to try to make ends meet.
31
00:01:51,743 --> 00:01:53,407
I've seen a lot of people
go backwards out of this
32
00:01:53,447 --> 00:01:57,713
industry. And I've seen a lot
of pressures it's caused
33
00:01:57,753 --> 00:01:59,046
on their family life,
34
00:01:59,086 --> 00:02:01,821
especially when you're
paying the rig off.
35
00:02:03,940 --> 00:02:06,503
Now a new and unexpected
threat to Frank
36
00:02:06,591 --> 00:02:09,410
and other truck drivers
is coming on fast.
37
00:02:10,473 --> 00:02:13,359
Last year this driverless
truck in the U.S.
38
00:02:13,399 --> 00:02:16,630
became the first to make
an interstate delivery.
39
00:02:16,776 --> 00:02:19,935
It travelled nearly 200
kilometres on the open
40
00:02:19,975 --> 00:02:23,877
road with no one at the
wheel, no human that is.
41
00:02:24,417 --> 00:02:27,366
The idea of robot vehicles
on the open road
42
00:02:27,406 --> 00:02:30,935
seemed ludicrous to most
people just 5 years ago.
43
00:02:32,303 --> 00:02:36,081
Now just about every major auto
and tech company is developing
44
00:02:36,121 --> 00:02:37,162
them.
45
00:02:37,686 --> 00:02:38,753
So what changed?
46
00:02:39,310 --> 00:02:42,661
An explosion in
artificial intelligence.
47
00:02:42,918 --> 00:02:44,371
[car engine revs]
48
00:02:50,319 --> 00:02:52,506
There's lots of AI
already in our lives,
49
00:02:52,546 --> 00:02:54,480
you can already see it
on your smartphone.
50
00:02:54,520 --> 00:02:58,720
Every time you use Siri, every
time you ask Alexa a question,
51
00:02:58,873 --> 00:03:00,475
every time you actually
use your satellite
52
00:03:00,515 --> 00:03:02,440
navigation, you're using
one of these algorithms,
53
00:03:02,480 --> 00:03:05,680
you’re using some AI that
is recognising your speech,
54
00:03:05,733 --> 00:03:08,209
answering questions,
giving you search results,
55
00:03:08,249 --> 00:03:10,709
recommending books for
you to buy on Amazon.
56
00:03:10,749 --> 00:03:14,283
They’re the beginnings of
AI everywhere in our lives.
57
00:03:20,499 --> 00:03:23,029
We don't think about
electricity. Electricity powers
58
00:03:23,069 --> 00:03:25,669
our planet, it powers pretty
much everything we do.
59
00:03:25,709 --> 00:03:27,225
It’s going to be
that you walk into a
60
00:03:27,265 --> 00:03:29,091
room and you say
“room, lights on”.
61
00:03:29,568 --> 00:03:32,701
You sit in your car and
you say “take me home”.
62
00:03:33,365 --> 00:03:34,365
[whistle noise]
63
00:03:35,191 --> 00:03:38,253
A driverless car is essentially
a robot. It has a computer
64
00:03:38,354 --> 00:03:44,617
that takes input from its
sensors and produces an output.
65
00:03:44,831 --> 00:03:48,615
The main sensors are radar,
which can be found in adaptive
66
00:03:48,655 --> 00:03:53,910
cruise control, ultrasonic
sensors and then there’s cameras
67
00:03:53,950 --> 00:03:55,707
that collect images.
68
00:03:56,474 --> 00:04:00,248
And this data is used to
control the car, to slow the
69
00:04:00,288 --> 00:04:04,583
car down, to accelerate the
car, to turn the wheels.
70
00:04:08,747 --> 00:04:11,470
There's been an explosion in AI
now because of the convergence
71
00:04:11,510 --> 00:04:14,163
of four exponentials.
72
00:04:14,209 --> 00:04:17,780
The first exponential is Moore's
Law, the fact that every two
73
00:04:17,820 --> 00:04:20,669
years, we have a doubling
in computing performance.
74
00:04:20,709 --> 00:04:22,809
The second exponential
is that every two years
75
00:04:22,849 --> 00:04:25,716
we have a doubling of the
amount of data that we have,
76
00:04:25,756 --> 00:04:28,146
because these machine learning
algorithms are very
77
00:04:28,195 --> 00:04:29,240
hungry for data.
78
00:04:29,295 --> 00:04:31,680
The third exponential
is that we've been working on AI
79
00:04:31,720 --> 00:04:34,607
for 50 years or so now and
our algorithms are starting to
80
00:04:34,647 --> 00:04:36,481
get better. And then
the fourth exponential
81
00:04:36,521 --> 00:04:40,029
is that over the last few
years we've had a doubling
82
00:04:40,069 --> 00:04:43,208
every two years of the amount
of funding going into AI.
83
00:04:43,248 --> 00:04:45,888
We now have the compute
power, we now have the data,
84
00:04:45,928 --> 00:04:47,803
we now have the algorithms
and we now have
85
00:04:47,843 --> 00:04:49,494
a lot of people working
on the problems.
86
00:04:49,534 --> 00:04:51,134
[car engine starting up]
87
00:04:52,729 --> 00:04:55,592
It could be you just jump into
the car, you assume the car
88
00:04:55,632 --> 00:04:59,072
knows where you need to go
because it has access to your
89
00:04:59,112 --> 00:05:03,084
calendar, your diary, where
you're meant to be and if you
90
00:05:03,124 --> 00:05:05,816
did not want the car to go where
your calendar says you ought to
91
00:05:05,856 --> 00:05:09,916
be, then you need to tell the
car, "oh, and by the way, don't
92
00:05:09,956 --> 00:05:14,697
take me to the meeting that's
in my calendar. Take me to the
93
00:05:14,737 --> 00:05:16,204
beach!"
94
00:05:17,339 --> 00:05:19,932
But Frank Black won't
have a bar of it.
95
00:05:20,143 --> 00:05:21,439
I think it's crazy stuff.
96
00:05:21,479 --> 00:05:24,160
You've got glitches in computers
now, the banks are having
97
00:05:24,200 --> 00:05:28,840
glitches with their ATMs,
and emails are having glitches.
98
00:05:29,182 --> 00:05:32,459
Who's to say this is going to be
perfect? And this is a lot more
99
00:05:32,499 --> 00:05:34,630
dangerous if there's a
computer glitch.
100
00:05:35,912 --> 00:05:39,828
There will always need to be
experienced people on the road.
101
00:05:39,868 --> 00:05:41,137
Not machines.
102
00:05:43,514 --> 00:05:47,131
Frank is going to explain why
he believes robots can never
103
00:05:47,171 --> 00:05:49,240
match human drivers.
104
00:05:53,674 --> 00:05:55,808
"Okay then, let's do it!"
105
00:06:01,080 --> 00:06:03,599
But Frank is off
to a rocky start -
106
00:06:03,640 --> 00:06:07,320
driverless trucks in Rio
Tinto mines in Western Australia
107
00:06:07,360 --> 00:06:10,600
show productivity gains of 15%.
108
00:06:13,967 --> 00:06:17,996
Frank needs to take a break every
five hours and rest every 12.
109
00:06:18,036 --> 00:06:21,982
Oh, and he needs to eat, and he
expects to be paid for his work.
110
00:06:22,053 --> 00:06:24,646
Robots don't need a salary.
111
00:06:25,342 --> 00:06:28,798
Trials also indicate that
driverless vehicles save up to
112
00:06:28,838 --> 00:06:33,367
15% on fuel and emissions,
especially when driving very
113
00:06:33,407 --> 00:06:36,541
close together in a
formation called platooning.
114
00:06:38,338 --> 00:06:41,189
And at first glance,
driverless technology
115
00:06:41,229 --> 00:06:43,399
could dramatically
reduce road accidents,
116
00:06:43,439 --> 00:06:47,541
because it's estimated that 90%
of accidents are due to human
117
00:06:47,581 --> 00:06:50,963
error such as fatigue or loss of
concentration.
118
00:06:51,159 --> 00:06:52,626
Robots don't get tired.
119
00:06:55,847 --> 00:06:59,832
But hang on - Frank's not done -
he's about to launch a comeback
120
00:06:59,872 --> 00:07:02,472
using 30 years of driving
experience.
121
00:07:04,107 --> 00:07:06,957
If there's something, say like a
group of kids playing with a
122
00:07:06,997 --> 00:07:10,185
ball on the side of the road,
we can see that ball starting to
123
00:07:10,225 --> 00:07:13,403
bounce towards the road, we
anticipate that there could be a
124
00:07:13,443 --> 00:07:17,443
strong possibility that that
child will run out on the road
125
00:07:17,748 --> 00:07:24,427
after that ball. I can't see how
a computer can anticipate that
126
00:07:24,467 --> 00:07:27,598
for a start. And even if it did,
then what sort of reaction would
127
00:07:27,638 --> 00:07:33,743
it take? Would it say swerve to
the left? Swerve to the right?
128
00:07:33,783 --> 00:07:36,587
Would it just brake and bring
the vehicle to a stop?
129
00:07:36,627 --> 00:07:38,954
What about if it can't stop
in time?
130
00:07:38,994 --> 00:07:42,320
In fact, right now a
self-driving vehicle can only
131
00:07:42,360 --> 00:07:46,900
react according to its program.
Anything unprogrammed can create
132
00:07:46,940 --> 00:07:50,958
problems - like when this Tesla
drove into a road works barrier
133
00:07:51,054 --> 00:07:54,454
after the human driver
failed to take back control.
134
00:07:56,092 --> 00:07:58,558
And what if some of
the sensors fail?
135
00:07:58,640 --> 00:08:01,154
What happens if something
gets on the lens?
136
00:08:01,194 --> 00:08:02,764
The vehicle doesn't
know where it's going.
137
00:08:02,819 --> 00:08:07,396
It's true - currently heavy rain
or fog or even unclear road
138
00:08:07,436 --> 00:08:11,703
signs can bamboozle driverless
technology.
139
00:08:12,698 --> 00:08:15,713
And then there's the most
unpredictable element of all
140
00:08:15,768 --> 00:08:16,768
human drivers.
141
00:08:17,659 --> 00:08:20,299
Stupidity always finds new
forms.
142
00:08:20,511 --> 00:08:23,323
Quite often you see things
you’ve never seen before.
143
00:08:26,271 --> 00:08:27,937
[crash of cars colliding]
144
00:08:28,404 --> 00:08:32,434
That's why there are no plans to
trial driverless trucks in
145
00:08:32,474 --> 00:08:34,800
complex urban settings right
now.
146
00:08:34,840 --> 00:08:39,120
They'll initially be limited to
predictable multi-lane highways.
147
00:08:39,618 --> 00:08:42,914
You also still need a human
right now to load and unload
148
00:08:42,954 --> 00:08:43,954
a truck.
149
00:08:44,520 --> 00:08:47,600
And a robot truck won't help
change your tyre.
150
00:08:47,743 --> 00:08:50,800
If someone's in trouble on the
road you'll usually find a
151
00:08:50,840 --> 00:08:53,960
truckie has pulled over to
make sure they're alright.
152
00:08:54,201 --> 00:08:55,926
Finally, there are road rules.
153
00:08:55,966 --> 00:08:59,541
Australia requires human hands
on the steering wheel at all
154
00:08:59,581 --> 00:09:03,068
times, in every state and
territory.
155
00:09:06,968 --> 00:09:10,320
Hey Frank! You won the race!
156
00:09:10,485 --> 00:09:12,151
One for the human beings!
157
00:09:18,374 --> 00:09:21,307
But how long can human drivers
stay on top?
158
00:09:22,108 --> 00:09:26,241
Nearly 400,000 Australians
earn their living from driving,
159
00:09:26,288 --> 00:09:29,381
even more when you add part-time
drivers.
160
00:09:30,520 --> 00:09:33,975
But the race is on to deliver
the first version of a fully
161
00:09:34,015 --> 00:09:37,498
autonomous vehicle in just 4
years
162
00:09:38,014 --> 00:09:41,947
and it might not be hype because
AI is getting much better,
163
00:09:42,270 --> 00:09:44,247
much faster every year
164
00:09:44,474 --> 00:09:48,912
with a version of AI called
machine learning.
165
00:09:58,867 --> 00:10:03,153
Machine learning is the little
part of AI that is focused on
166
00:10:03,193 --> 00:10:05,099
teaching programs to learn.
167
00:10:05,139 --> 00:10:08,232
If you think about how we got to
be intelligent, we started out
168
00:10:08,272 --> 00:10:11,692
not knowing very much when we
were born and most of what we've
169
00:10:11,732 --> 00:10:13,325
got is through learning.
170
00:10:13,365 --> 00:10:17,021
And so, we write programs that
learn to improve themselves.
171
00:10:17,061 --> 00:10:19,668
They need - at the moment - lots
of data and they get better and
172
00:10:19,708 --> 00:10:23,096
better, and in many cases,
certainly for narrow focus
173
00:10:23,136 --> 00:10:27,237
domains, we can often actually
exceed actual human performance.
174
00:10:27,692 --> 00:10:28,692
[music]
175
00:10:41,480 --> 00:10:46,560
When AlphaGo beat Lee Sedol
last year, one of the best
176
00:10:46,654 --> 00:10:50,254
Go players on the planet,
that was a landmark moment.
177
00:10:51,255 --> 00:10:57,145
So we’ve always used games as
benchmarks, both between humans
178
00:10:57,200 --> 00:11:00,660
and between humans and machines
and, you know, a quarter century
179
00:11:00,700 --> 00:11:04,278
ago, chess fell to the
computers.
180
00:11:04,325 --> 00:11:06,325
And at that time people thought
181
00:11:06,374 --> 00:11:08,481
well, Go isn’t going to be like
that.
182
00:11:08,530 --> 00:11:11,895
Because in Go, there’s so many
more possible moves
183
00:11:11,935 --> 00:11:15,990
and the best Go players weren’t
working by trying all
184
00:11:16,030 --> 00:11:18,473
possibilities ahead, they were
working on the kind of, the
185
00:11:18,513 --> 00:11:21,911
gestalt of what it looked like,
and working on intuition.
186
00:11:22,013 --> 00:11:26,184
We didn’t have any idea of how
to instil that type of intuition
187
00:11:26,224 --> 00:11:27,325
into a computer.
188
00:11:34,060 --> 00:11:41,051
But what happened is we've got
some recent techniques with deep
189
00:11:41,091 --> 00:11:46,064
learning where we’re able to do
things like understand photos,
190
00:11:46,104 --> 00:11:50,394
understand speech and so on and
people said maybe this will be
191
00:11:50,434 --> 00:11:53,080
the key to getting that type
of intuition.
192
00:11:53,120 --> 00:11:58,614
So first, it started by
practicing on every game that a
193
00:11:58,654 --> 00:12:00,161
master had ever played.
194
00:12:00,201 --> 00:12:03,005
You feed them all in and it
practices on that.
195
00:12:03,045 --> 00:12:08,683
The key was to get AlphaGo good
enough from training it on
196
00:12:08,723 --> 00:12:13,206
past games by humans so that it
could then start playing itself
197
00:12:13,246 --> 00:12:15,644
and improving itself.
198
00:12:15,754 --> 00:12:18,435
And one thing that’s very
interesting is that the amount
199
00:12:18,475 --> 00:12:22,552
of time it took, the total
number of person-years invested
200
00:12:22,615 --> 00:12:29,023
is a tenth or less than the
amount of time it took for IBM
201
00:12:29,063 --> 00:12:30,730
to do the chess playing.
202
00:12:30,782 --> 00:12:33,539
So the rate of learning is going
to be exponential.
203
00:12:33,579 --> 00:12:36,278
Something that we as humans
are not used to seeing.
204
00:12:36,318 --> 00:12:38,951
We have to learn things
painfully ourselves
205
00:12:38,991 --> 00:12:41,848
and the computers are going to
learn on a planet-wide scale,
206
00:12:41,888 --> 00:12:43,927
not an individual level.
207
00:12:53,888 --> 00:12:56,812
There is this interesting idea
that the intelligence would just
208
00:12:56,852 --> 00:13:00,633
suddenly explode and take us
to what’s called the Singularity
209
00:13:00,673 --> 00:13:04,422
where machines now improve
themselves almost without end.
210
00:13:04,462 --> 00:13:07,134
There are lots of reasons to
suppose that maybe that might
211
00:13:07,174 --> 00:13:10,250
not happen, but if it does
happen, most of my colleagues
212
00:13:10,290 --> 00:13:13,557
think it’s about 50 years away.
Maybe even 100.
213
00:13:13,855 --> 00:13:15,055
[intake of breath]
214
00:13:16,214 --> 00:13:20,658
I’m not convinced how important
that intelligence is, right?
215
00:13:20,698 --> 00:13:24,041
So I think that there is lots of
different attributes and
216
00:13:24,081 --> 00:13:27,947
intelligence is only one of them
and there certainly are tasks
217
00:13:27,989 --> 00:13:30,197
that having a lot of
intelligence would help,
218
00:13:30,268 --> 00:13:33,955
and being able to compute
quickly would help.
219
00:13:34,002 --> 00:13:37,627
So if I want to trade stocks,
then having a computer that's
220
00:13:37,680 --> 00:13:40,729
smarter than anybody else’s is
going to give me a definite
221
00:13:40,769 --> 00:13:41,769
advantage.
222
00:13:42,901 --> 00:13:46,783
But I think if I wanted to solve
the Middle East crisis,
223
00:13:46,823 --> 00:13:49,525
I don’t think it’s going unsolved
because nobody is smart
224
00:13:49,565 --> 00:13:50,565
enough.
225
00:13:53,514 --> 00:13:57,701
But AI experts believe robot
cars will improve so much that
226
00:13:57,741 --> 00:14:01,803
humans will eventually be banned
from driving.
227
00:14:02,009 --> 00:14:03,009
[music]
228
00:14:06,329 --> 00:14:09,735
Big roadblocks remain, not the
least of which is public
229
00:14:09,775 --> 00:14:11,194
acceptance
230
00:14:11,562 --> 00:14:15,897
- as we found out after inviting
professional drivers to meet two
231
00:14:15,937 --> 00:14:17,061
robot car experts.
232
00:14:17,101 --> 00:14:18,101
[introductions]
233
00:14:21,748 --> 00:14:24,146
Straight away the first thing
has to be safety,
234
00:14:24,194 --> 00:14:27,529
you definitely have to have
safety paramount,
235
00:14:27,670 --> 00:14:29,911
and obviously efficiency.
236
00:14:29,951 --> 00:14:32,656
So the big question is
- when is it going to happen?
237
00:14:32,742 --> 00:14:36,882
In the next five to ten years we
will see highly autonomous
238
00:14:36,922 --> 00:14:38,203
vehicles on the road.
239
00:14:38,243 --> 00:14:42,038
If you want to drive from Sydney
to Canberra, you drive to the
240
00:14:42,078 --> 00:14:47,148
freeway, activate autopilot or
whatever it will be called,
241
00:14:47,203 --> 00:14:51,046
and by the time you arrive in
Canberra, the car asks you to
242
00:14:51,086 --> 00:14:52,335
take back control.
243
00:14:52,400 --> 00:14:57,040
There are predictions that in
twenty years’ time, 50% of the
244
00:14:57,101 --> 00:15:00,329
new vehicles will actually be
completely driverless.
245
00:15:00,400 --> 00:15:04,235
What makes us think that these
computers and these vehicles
246
00:15:04,305 --> 00:15:05,587
are going to be foolproof?
247
00:15:05,689 --> 00:15:09,391
Well we were able to send
rockets to the moon,
248
00:15:09,501 --> 00:15:13,669
and I think that there are ways
of doing it, and you can have
249
00:15:13,709 --> 00:15:16,842
backup systems, and you can have
backups for your backups.
250
00:15:16,990 --> 00:15:21,106
But I agree, reliability is a
big question mark.
251
00:15:21,224 --> 00:15:24,294
But we're not talking about a
phone call dropping out or an
252
00:15:24,334 --> 00:15:28,638
email shutting down, we're
talking about a sixty-ton
253
00:15:28,678 --> 00:15:31,841
vehicle, in traffic, that's
going to kill people.
254
00:15:31,881 --> 00:15:34,457
There will be deaths if it makes
a mistake.
255
00:15:34,497 --> 00:15:39,175
I think we need to accept that
there will still be accidents.
256
00:15:39,215 --> 00:15:43,160
A machine can make a mistake,
can shut down, and fail.
257
00:15:43,278 --> 00:15:47,597
And if we reduce accidents by
say ninety percent,
258
00:15:47,644 --> 00:15:51,846
ten percent of the current
accidents will
259
00:15:51,886 --> 00:15:53,362
still occur on the network.
260
00:15:53,402 --> 00:15:55,674
How can you say there's going
to be ninety percent?
261
00:15:55,714 --> 00:15:56,792
How do you work that out?
262
00:15:56,832 --> 00:16:00,019
Ninety percent is because ninety
percent of the accidents are
263
00:16:00,059 --> 00:16:01,592
because of human error.
264
00:16:01,637 --> 00:16:04,058
The idea is if we take the human
out,
265
00:16:04,107 --> 00:16:07,035
we could potentially reduce it
by ninety percent.
266
00:16:07,098 --> 00:16:11,355
Have any of you ever driven a
car available on the market
267
00:16:11,395 --> 00:16:13,566
today with all this technology,
autopilot
268
00:16:13,615 --> 00:16:15,345
and everything in there?
269
00:16:15,394 --> 00:16:19,174
It's absolutely unbelievable how
safe and comfortable you feel.
270
00:16:19,254 --> 00:16:22,784
I think people will ultimately
accept this technology,
271
00:16:22,824 --> 00:16:25,355
because we will be going in
steps.
272
00:16:25,400 --> 00:16:28,640
I would say, for me as an Uber
driver, we're providing a
273
00:16:28,694 --> 00:16:31,302
passenger service, and those
passengers when they’re going to
274
00:16:31,357 --> 00:16:32,677
the airport, have a lot of luggage.
275
00:16:32,717 --> 00:16:36,293
If it's an elderly passenger,
they need help to get into the
276
00:16:36,333 --> 00:16:38,551
car, they need help getting out
of the car.
277
00:16:38,626 --> 00:16:40,676
The human factor needs to be
there.
278
00:16:40,736 --> 00:16:43,832
I would argue that you can
offer a much better service if
279
00:16:43,872 --> 00:16:45,348
you're not also driving.
280
00:16:45,395 --> 00:16:49,481
So the car's taking care of the
journey and you're taking care
281
00:16:49,521 --> 00:16:53,059
of the customer. And improving
the customer experience.
282
00:16:53,099 --> 00:16:56,235
And I think there's a lot of
scope for improvement in the
283
00:16:56,275 --> 00:16:58,630
taxi and Uber customer
experience.
284
00:16:59,263 --> 00:17:03,654
You could offer tax advice, you
could offer financial advice.
285
00:17:04,552 --> 00:17:05,708
It's unlimited.
286
00:17:05,787 --> 00:17:09,145
Then we go back though.
We're not at fully driverless
287
00:17:09,185 --> 00:17:11,318
vehicles anymore, we've still
got a babysitter there
288
00:17:11,365 --> 00:17:13,106
and a human being there to look
after the car,
289
00:17:13,146 --> 00:17:16,779
so what are we gaining with the
driverless technology?
290
00:17:16,842 --> 00:17:19,042
Well, the opportunity to do
that?
291
00:17:20,857 --> 00:17:22,568
But weren't you trying to reduce
costs
292
00:17:22,617 --> 00:17:24,098
by not having a driver in the
vehicle?
293
00:17:24,138 --> 00:17:27,982
Well it depends what people are
paying for, okay.
294
00:17:28,029 --> 00:17:29,747
If you're in business,
295
00:17:29,796 --> 00:17:33,065
you are trying to get as many
customers as possible.
296
00:17:33,114 --> 00:17:37,122
And if your competitor has
autonomous vehicles
297
00:17:37,185 --> 00:17:41,630
and is offering day care
services or looking after
298
00:17:41,676 --> 00:17:45,325
the disabled, then you probably
won't be in business very long
299
00:17:45,365 --> 00:17:49,646
if they're able to provide a
much better customer experience.
300
00:17:49,740 --> 00:17:54,255
For my personal views, I like
to drive my car, not just to
301
00:17:54,302 --> 00:17:56,077
sit, I want to enjoy driving.
302
00:17:56,138 --> 00:17:58,661
Well I think in 50 years
there'll be special places for
303
00:17:58,701 --> 00:18:02,419
people with vintage cars that
they can go out and drive
304
00:18:02,459 --> 00:18:03,459
around.
305
00:18:03,708 --> 00:18:07,559
So we won't be able to go for a
Sunday drive in our vintage car
306
00:18:07,599 --> 00:18:10,271
because these autonomous
vehicles have got our roads.
307
00:18:10,436 --> 00:18:14,193
I mean in the future when all
the cars are autonomous we won't
308
00:18:14,264 --> 00:18:16,169
need traffic lights,
309
00:18:16,736 --> 00:18:21,638
because the cars will just
negotiate between themselves
310
00:18:21,701 --> 00:18:24,646
when they come to intersections,
roundabouts.
311
00:18:24,747 --> 00:18:26,348
Can I ask you a question?
312
00:18:26,396 --> 00:18:31,146
If we were to do a trial
313
00:18:31,223 --> 00:18:35,033
with highly automated platooning
314
00:18:35,082 --> 00:18:38,284
of big road trains, would you
like to be involved?
315
00:18:38,333 --> 00:18:41,071
Yes I'd be involved. Yeah,
why not.
316
00:18:41,111 --> 00:18:45,993
If you can convince Frank,
you can convince anybody.
317
00:18:46,104 --> 00:18:48,729
If you want to come out with us
- and I bet Frank’s the same
318
00:18:48,792 --> 00:18:50,961
as well - if you want to come
for a drive in the truck
319
00:18:51,001 --> 00:18:53,958
and see exactly what it’s like
and the little issues that would
320
00:18:53,998 --> 00:18:57,675
never have been thought of –
I mean, my door’s always open –
321
00:18:57,724 --> 00:18:59,144
you're more than welcome to come
with me.
322
00:18:59,193 --> 00:19:00,896
Oh, definitely. I think that’s…
323
00:19:00,982 --> 00:19:02,333
It's time for a road trip!
324
00:19:02,373 --> 00:19:03,373
[laughter]
325
00:19:09,593 --> 00:19:13,670
But drivers aren't the only ones
trying to find their way into
326
00:19:13,710 --> 00:19:15,210
the AI future.
327
00:19:17,440 --> 00:19:18,440
[music]
328
00:19:23,815 --> 00:19:27,557
Across town, it's after-work
drinks for a group of young and
329
00:19:27,604 --> 00:19:29,071
aspiring professionals.
330
00:19:29,433 --> 00:19:32,480
Most have at least one
university degree or are
331
00:19:32,520 --> 00:19:36,160
studying for one -
like Christine Maibom.
332
00:19:40,074 --> 00:19:43,038
I think as law students, we know
now that it's pretty tough,
333
00:19:43,078 --> 00:19:45,249
even to like get your foot in
the door.
334
00:19:45,289 --> 00:19:48,601
I think that, at the end of the
day, the employment rate for
335
00:19:48,641 --> 00:19:50,441
grads is still pretty high.
336
00:19:50,508 --> 00:19:54,638
Tertiary degrees usually shield
against technological upheaval
337
00:19:54,701 --> 00:19:58,724
but this time AI will automate
not just more physical tasks,
338
00:19:58,900 --> 00:20:00,490
but thinking ones.
339
00:20:02,607 --> 00:20:06,208
Waiting upstairs for Christine
is a new artificial intelligence
340
00:20:06,248 --> 00:20:09,994
application, one that could
impact the research typically
341
00:20:10,034 --> 00:20:11,301
done by paralegals.
342
00:20:13,768 --> 00:20:17,518
We invited her to compete
against it in front of her peers.
343
00:20:18,480 --> 00:20:21,680
Adelaide tax lawyer, Adrian
Cartland, came up with the idea
344
00:20:21,720 --> 00:20:24,240
for the AI, called Ailira.
345
00:20:24,920 --> 00:20:28,122
I am here with Ailira, the
Artificially Intelligent Legal
346
00:20:28,162 --> 00:20:31,625
Information Research Assistant
and you're going to see if you
347
00:20:31,665 --> 00:20:32,730
can beat her.
348
00:20:32,779 --> 00:20:36,294
So what we've got here is a
tax question.
349
00:20:36,381 --> 00:20:39,490
Adrian explains to Christine
what sounds like a complicated
350
00:20:39,537 --> 00:20:41,071
corporate tax question.
351
00:20:41,521 --> 00:20:43,435
So does that make sense?
- Yeah, yep.
352
00:20:43,599 --> 00:20:46,896
Very familiar? Ready?
- I'm ready.
353
00:20:46,967 --> 00:20:49,435
Okay, guys. Ready. Set. Go.
354
00:20:49,795 --> 00:20:50,795
[music]
355
00:21:03,326 --> 00:21:05,574
And here we have the answer.
356
00:21:06,027 --> 00:21:07,361
So you've got the answer?
- We’re done.
357
00:21:07,410 --> 00:21:08,714
That's 30 seconds.
358
00:21:09,269 --> 00:21:11,245
Christine where are you up
to with the search?
359
00:21:11,394 --> 00:21:16,044
I'm at Section 44 of the Income
Tax Assessment Act.
360
00:21:16,959 --> 00:21:20,013
Maybe it has the answer,
I haven't looked through it yet.
361
00:21:20,092 --> 00:21:21,263
You're in the right act.
362
00:21:21,310 --> 00:21:24,270
Do you want to keep going?
Do you want to give it some
363
00:21:24,310 --> 00:21:25,310
time?
364
00:21:26,553 --> 00:21:29,209
I can keep going for a little
bit, yeah sure.
365
00:21:29,304 --> 00:21:31,084
[music]
366
00:21:34,257 --> 00:21:36,155
No pressure Christine.
We're at a minute.
367
00:21:36,530 --> 00:21:42,349
Okay, might need an hour for
this one.
368
00:21:43,006 --> 00:21:45,052
This is, you know, really
complex tax law.
369
00:21:45,092 --> 00:21:46,450
Like I’ve given you a hard
question.
370
00:21:46,537 --> 00:21:48,700
You were in the Income Tax
Assessment Act, you were doing
371
00:21:48,740 --> 00:21:51,263
your research. What’s your
process?
372
00:21:51,357 --> 00:21:53,192
Normally what I would do is
probably try and find the
373
00:21:53,232 --> 00:21:56,442
legislation first and then I’ll
probably look to any commentary
374
00:21:56,482 --> 00:21:57,482
on the issue.
375
00:21:57,640 --> 00:22:00,800
Find specific keywords, so for
example ‘consolidated group
376
00:22:00,846 --> 00:22:02,473
and assessable income’ are
obviously there.
377
00:22:02,513 --> 00:22:05,411
That's a pretty standard way.
That's what I would approach.
378
00:22:05,474 --> 00:22:09,513
If you put this whole thing into
a keyword search,
379
00:22:09,553 --> 00:22:14,778
it's going to break down after
about four, five, seven words,
380
00:22:14,818 --> 00:22:17,794
whereas this is, you know, three
or four hundred words.
381
00:22:17,974 --> 00:22:19,443
So all I've done,
382
00:22:19,492 --> 00:22:22,862
is I've entered in the question
here. I've copied and pasted it.
383
00:22:22,911 --> 00:22:24,356
I've clicked on submit,
384
00:22:24,396 --> 00:22:27,216
and she's read through
literally millions of cases
385
00:22:27,263 --> 00:22:28,770
as soon as I pressed search.
386
00:22:28,810 --> 00:22:31,802
And then she's come through,
she's said here are the answers
387
00:22:31,842 --> 00:22:34,435
- Oh wow.
She's highlighted in there
388
00:22:34,498 --> 00:22:36,474
what she thinks is the answer.
389
00:22:36,523 --> 00:22:39,646
Yeah I mean, wow. Even down to
the fact that it can answer
390
00:22:39,695 --> 00:22:41,434
those very specific questions.
391
00:22:41,474 --> 00:22:43,284
I didn't realise that it would
just be able to tell you,
392
00:22:43,333 --> 00:22:45,558
'hey, here's the exact answer
to your question'.
393
00:22:45,607 --> 00:22:46,797
It's awesome.
394
00:22:47,439 --> 00:22:49,743
I think, obviously, for
paralegals, I think it's
395
00:22:49,792 --> 00:22:54,159
particularly scary because we're
already in such a competitive
396
00:22:54,199 --> 00:22:55,199
market.
397
00:22:55,409 --> 00:22:59,480
Adrian Cartland believes AI
could blow up lawyers' monopoly
398
00:22:59,529 --> 00:23:03,861
on basic legal know-how - and he
has an astonishing example
399
00:23:03,910 --> 00:23:05,048
of that.
400
00:23:05,097 --> 00:23:07,597
My girlfriend is a speech
pathologist who has no idea
401
00:23:07,637 --> 00:23:10,083
about law, and she used Ailira
to pass the
402
00:23:10,123 --> 00:23:11,630
Adelaide University tax law
exam.
403
00:23:11,679 --> 00:23:12,740
Oh wow.
404
00:23:15,834 --> 00:23:18,545
Automation is moving up in the
world.
405
00:23:22,194 --> 00:23:24,529
Here's Claire, a financial
planner.
406
00:23:24,600 --> 00:23:27,864
It's estimated that 15 percent
of an average financial
407
00:23:27,913 --> 00:23:32,083
planner's time is spent on tasks
that can be done by AI.
408
00:23:32,123 --> 00:23:34,802
What kind of things do you see
it ultimately taking over?
409
00:23:34,865 --> 00:23:39,239
I would say everything except
talking to your clients. Yeah.
410
00:23:39,770 --> 00:23:43,237
Here's Simon, he used to be a
secondary school teacher.
411
00:23:43,286 --> 00:23:46,348
One-fifth of that job can be
done by AI.
412
00:23:46,934 --> 00:23:49,458
Simon's now become a university
lecturer,
413
00:23:49,507 --> 00:23:50,974
which is less vulnerable.
414
00:23:51,099 --> 00:23:56,552
I think there's huge potential
for AI and other educational
415
00:23:56,607 --> 00:23:57,622
technologies.
416
00:23:57,982 --> 00:24:00,966
Obviously it's a little bit
worrying if we are talking about
417
00:24:01,006 --> 00:24:03,265
making a bunch of people
redundant.
418
00:24:03,960 --> 00:24:06,040
And did I mention journalists?
419
00:24:07,953 --> 00:24:10,781
I hope you enjoyed tonight’s
program.
420
00:24:12,635 --> 00:24:15,440
The percentage figures were
calculated by economist
421
00:24:15,480 --> 00:24:17,200
Andrew Charlton and his team,
422
00:24:17,271 --> 00:24:20,497
after drilling into Australian
workforce statistics.
423
00:24:20,607 --> 00:24:24,349
For the first time we broke the
Australian economy down into
424
00:24:24,396 --> 00:24:27,755
20 billion hours of work.
425
00:24:27,842 --> 00:24:32,493
And we asked: what does every
Australian do with their day,
426
00:24:32,548 --> 00:24:34,915
and how will what they do in
their job
427
00:24:34,978 --> 00:24:36,915
change over the next 15 years?
428
00:24:37,056 --> 00:24:39,766
I think the biggest
misconception is that everyone
429
00:24:39,806 --> 00:24:42,274
talks about automation as
destroying jobs.
430
00:24:42,314 --> 00:24:46,126
The reality is automation
changes every job.
431
00:24:46,614 --> 00:24:49,426
It's not so much about what jobs
will we do,
432
00:24:49,474 --> 00:24:52,005
but how will we do our jobs.
433
00:24:52,099 --> 00:24:54,339
Because automation isn't going
to affect some workers,
434
00:24:54,379 --> 00:24:56,183
it’s going to affect every
worker.
435
00:24:56,926 --> 00:24:58,879
But if there's less to do at
work,
436
00:24:58,949 --> 00:25:03,149
that's got to mean less work or
less pay - or both. Doesn't it?
437
00:25:03,564 --> 00:25:06,727
If Australia embraces
automation,
438
00:25:07,259 --> 00:25:12,270
there is a $2.1 trillion
opportunity for us
439
00:25:12,310 --> 00:25:13,843
over the next 15 years.
440
00:25:14,738 --> 00:25:18,338
But here's the thing - we only
get that opportunity
441
00:25:18,754 --> 00:25:20,087
if we do two things.
442
00:25:20,918 --> 00:25:25,104
Firstly, if we manage the
transition and we ensure
443
00:25:25,363 --> 00:25:28,190
that all of that time that is
lost to machines
444
00:25:28,230 --> 00:25:31,164
from the Australian workplace
is redeployed
445
00:25:31,563 --> 00:25:35,430
and people are found new jobs
and new tasks.
446
00:25:35,633 --> 00:25:39,300
And condition number two is
that we embrace automation
447
00:25:39,471 --> 00:25:42,384
and bring it into our
workplaces, and take advantage
448
00:25:42,433 --> 00:25:45,620
of the benefits of technology
and productivity.
449
00:25:46,104 --> 00:25:49,268
But Australia's not doing
well at either.
450
00:25:49,760 --> 00:25:51,229
Right now Australia is lagging.
451
00:25:51,277 --> 00:25:55,133
One in ten Australian companies
is embracing automation
452
00:25:55,196 --> 00:25:58,274
and that is roughly half the
rate of some of our
453
00:25:58,321 --> 00:25:59,698
global peers.
454
00:25:59,839 --> 00:26:02,888
Australia hasn’t been very good
historically at transitioning
455
00:26:02,929 --> 00:26:05,863
workers affected by big
technology shifts.
456
00:26:06,053 --> 00:26:12,562
Over the last 25 years, 1 in 10
unskilled men who lost their job
457
00:26:12,630 --> 00:26:14,539
never worked again.
458
00:26:14,766 --> 00:26:19,203
Today 4 in 10 unskilled men
don’t participate in
459
00:26:19,440 --> 00:26:20,640
the labour market.
460
00:26:28,233 --> 00:26:31,013
We asked a group of young
lawyers and legal students
461
00:26:31,053 --> 00:26:36,482
how they feel about embracing
AI - the contrasts were stark.
462
00:26:37,244 --> 00:26:38,511
I often get asked,
463
00:26:38,900 --> 00:26:40,986
you know, do you feel
threatened?
464
00:26:41,064 --> 00:26:44,954
Absolutely not! I am confident
and I’m excited about
465
00:26:44,994 --> 00:26:47,005
opportunities that AI presents.
466
00:26:47,120 --> 00:26:50,920
I think the real focus will be
on not only upskilling,
467
00:26:51,040 --> 00:26:54,934
but reskilling and about
diversifying your skillset.
468
00:26:54,974 --> 00:26:57,161
I think for me, I still have
469
00:26:57,256 --> 00:27:00,310
an underlying concern about how
much of the work is going to be
470
00:27:00,357 --> 00:27:02,825
taken away from someone who is
471
00:27:02,935 --> 00:27:06,200
still learning the law and just
wants a job part time where they
472
00:27:06,249 --> 00:27:08,200
can sort of help with some of
those less,
473
00:27:08,287 --> 00:27:11,497
you know, judgment-based
high level tasks.
474
00:27:11,631 --> 00:27:14,266
How much AI software is out there
for legal firms
475
00:27:14,315 --> 00:27:15,326
at the moment?
476
00:27:15,375 --> 00:27:16,530
There’s quite a lot.
477
00:27:16,587 --> 00:27:19,102
There’s often a few competing
in the same space,
478
00:27:19,165 --> 00:27:21,734
so there’s a few that my law
firm has trialled in,
479
00:27:21,774 --> 00:27:23,419
for example, due diligence
480
00:27:23,529 --> 00:27:26,318
which are great at identifying
certain clauses.
481
00:27:26,513 --> 00:27:28,740
So rather than the lawyer
sitting there trying to find
482
00:27:28,789 --> 00:27:31,169
an assignment or a change of
control clause,
483
00:27:31,240 --> 00:27:32,505
it will pull that out.
484
00:27:32,607 --> 00:27:36,567
How much time do you think using
the AI cuts down on that kind of
485
00:27:36,607 --> 00:27:39,781
just crunching lots of documents
and numbers?
486
00:27:40,016 --> 00:27:43,021
Immensely! I would say
potentially up to about 20%
487
00:27:43,061 --> 00:27:46,422
of our time in terms of going
through and locating those
488
00:27:46,476 --> 00:27:48,897
clauses or pulling them out,
extracting them,
489
00:27:49,001 --> 00:27:53,226
which of course delivers way
better value for our clients
490
00:27:53,275 --> 00:27:54,304
which is great.
491
00:27:54,353 --> 00:27:58,045
Well I think the first reaction
was obviously very
492
00:27:58,247 --> 00:27:59,473
worried, I suppose.
493
00:27:59,545 --> 00:28:02,747
You just see the way that this
burns through these sort of
494
00:28:02,834 --> 00:28:07,272
banal tasks that we would
be doing at an entry level job.
495
00:28:07,897 --> 00:28:10,498
Yeah, it's quite an intuitive
response, I suppose,
496
00:28:10,554 --> 00:28:11,787
that we're just a bit worried.
497
00:28:11,827 --> 00:28:15,543
And also it was just so easy,
it was just copy and paste.
498
00:28:15,606 --> 00:28:18,442
It means that anyone could do it
really, so
499
00:28:18,680 --> 00:28:20,840
you don't need the sort of
specialised skills that are
500
00:28:20,890 --> 00:28:23,429
getting taught to us in our
law degrees.
501
00:28:23,570 --> 00:28:26,671
It's pretty much just a
press-a-button job.
502
00:28:27,154 --> 00:28:29,494
AI is like Tony Stark's Iron Man
suit.
503
00:28:29,543 --> 00:28:31,151
It takes someone and makes them
504
00:28:31,191 --> 00:28:33,089
into Superman, makes them
fantastic!
505
00:28:33,129 --> 00:28:35,644
And you could suddenly be doing
things that are like
506
00:28:35,699 --> 00:28:37,331
10 times above your level
507
00:28:37,396 --> 00:28:42,310
and providing that much cheaper
than anyone else could do it.
508
00:28:42,388 --> 00:28:45,857
The legal work of the future
will be done by
509
00:28:45,960 --> 00:28:50,751
social workers, psychiatrists,
conveyancers, tax agents,
510
00:28:50,791 --> 00:28:51,796
accountants.
511
00:28:51,836 --> 00:28:56,218
They have that personal skillset
that lawyers sometimes lack.
512
00:28:57,446 --> 00:29:00,685
I also wonder just how much law
school should be
513
00:29:00,748 --> 00:29:03,825
teaching us about technology
and new ways of
514
00:29:03,873 --> 00:29:05,216
working in the legal workforce,
515
00:29:05,263 --> 00:29:06,942
because I mean a lot of what you
guys are saying,
516
00:29:07,013 --> 00:29:08,435
I’ve heard for the first time.
517
00:29:08,484 --> 00:29:10,224
I certainly agree with that
statement. This is the first
518
00:29:10,273 --> 00:29:12,075
time I've heard the bulk of
this,
519
00:29:12,115 --> 00:29:15,607
especially hearing that there's
already a lot of AI out there.
520
00:29:18,422 --> 00:29:21,815
Unfortunately, our education
system just isn’t keeping up.
521
00:29:21,901 --> 00:29:24,635
Our research shows that right
now,
522
00:29:24,683 --> 00:29:29,170
up to 60% of young Australians
currently in education
523
00:29:29,233 --> 00:29:32,615
are studying for jobs that are
524
00:29:32,686 --> 00:29:36,286
highly likely to be automated
over the next 30 years.
525
00:29:38,232 --> 00:29:41,974
It's difficult to know what will
be hit hardest first,
526
00:29:43,084 --> 00:29:45,903
but jobs that help young people
make ends meet
527
00:29:45,943 --> 00:29:47,896
are among the most at risk.
528
00:29:48,627 --> 00:29:50,869
Like hospitality workers.
529
00:29:53,908 --> 00:29:56,931
So the figure that they're
giving us is that 58%
530
00:29:56,994 --> 00:29:59,411
could be done by versions of AI.
531
00:29:59,537 --> 00:30:01,208
How does that make you feel?
532
00:30:01,474 --> 00:30:05,005
Very, very frustrated.
That is really scary.
533
00:30:05,059 --> 00:30:08,958
I don’t know what other job I
could do whilst
534
00:30:09,007 --> 00:30:12,624
studying, or that sort of thing
or as a fall-back career.
535
00:30:12,739 --> 00:30:15,991
It’s what all my friends have
done, it’s what I’ve done,
536
00:30:16,054 --> 00:30:19,564
it sort of just helps you
survive and
537
00:30:19,643 --> 00:30:22,251
pay for the food that you need
to eat each week.
538
00:30:22,572 --> 00:30:24,900
It may take a while to be cost
effective,
539
00:30:25,057 --> 00:30:28,205
but robots can now help take
orders, flip burgers,
540
00:30:28,257 --> 00:30:30,319
make coffee, deliver food.
541
00:30:31,070 --> 00:30:34,741
Young people will be the most
affected by these changes,
542
00:30:34,898 --> 00:30:38,273
because the types of roles that
young people take
543
00:30:38,320 --> 00:30:40,702
are precisely the type of
544
00:30:40,781 --> 00:30:45,234
entry-level tasks that can be
most easily done by machines
545
00:30:45,290 --> 00:30:46,717
and Artificial Intelligence.
546
00:30:47,132 --> 00:30:48,560
But here this evening,
547
00:30:48,623 --> 00:30:51,138
there's at least one young
student who's a little more
548
00:30:51,178 --> 00:30:52,978
confident about the future.
549
00:30:53,240 --> 00:30:56,600
So Ani, how much of your job as
a doctor,
550
00:30:56,643 --> 00:31:00,649
do you imagine that AI could do
pretty much now?
551
00:31:00,782 --> 00:31:01,782
Now?
552
00:31:02,537 --> 00:31:04,855
Not much, maybe 5, 10 percent.
Yeah.
553
00:31:04,950 --> 00:31:06,418
[music]
554
00:31:12,560 --> 00:31:16,560
But Artificial Intelligence is
also moving into healthcare.
555
00:31:20,520 --> 00:31:22,240
- Watson?
- What is: Soron?
556
00:31:23,040 --> 00:31:24,720
- Watson?
- What is: leg?
557
00:31:24,760 --> 00:31:27,440
- Yes...Watson?
- What is: executor?
558
00:31:27,480 --> 00:31:29,320
- Right. Watson?
- What is: shoe?
559
00:31:29,360 --> 00:31:32,040
- You are right.
- Same category, 1600.
560
00:31:32,080 --> 00:31:33,080
Answer...
561
00:31:33,120 --> 00:31:35,880
So in the earliest days of
Artificial Intelligence and
562
00:31:35,920 --> 00:31:38,154
machine learning, it was all
about
563
00:31:38,217 --> 00:31:40,083
teaching computers to play
games.
564
00:31:40,154 --> 00:31:41,154
[Jeopardy]
565
00:31:43,413 --> 00:31:46,613
But today, with those machine
learning algorithms
566
00:31:46,670 --> 00:31:50,143
we're teaching those algorithms
how to learn the language of
567
00:31:50,192 --> 00:31:51,230
medicine.
568
00:31:53,360 --> 00:31:56,920
We invited Aniruddh to hear
about IBM research in cancer
569
00:31:56,960 --> 00:32:00,366
treatment using its AI
supercomputer, Watson.
570
00:32:05,607 --> 00:32:08,427
Today I’m going to take you
through a demonstration of
571
00:32:08,470 --> 00:32:09,708
Watson for oncology.
572
00:32:09,748 --> 00:32:12,200
This is a product that brings
together a multitude of
573
00:32:12,240 --> 00:32:15,647
disparate data sources and is
able to learn and reason
574
00:32:15,712 --> 00:32:17,755
and generate treatment
recommendations.
575
00:32:17,827 --> 00:32:20,788
This is a 62-year-old patient
who's been diagnosed
576
00:32:20,837 --> 00:32:23,716
with breast cancer and she’s
presenting to this clinician.
577
00:32:24,176 --> 00:32:27,293
So the clinician has now entered
this note in and
578
00:32:27,342 --> 00:32:29,942
Watson has read and understood
that note.
579
00:32:29,982 --> 00:32:32,016
Watson can read natural
language.
580
00:32:32,110 --> 00:32:35,625
When I attach this final bit of
information, the Ask Watson
581
00:32:35,688 --> 00:32:38,446
button turns green, at
which stage we’re ready to ask
582
00:32:38,495 --> 00:32:40,696
Watson for treatment
recommendations.
583
00:32:41,202 --> 00:32:44,432
Within seconds, Watson has read
through all the patient's
584
00:32:44,481 --> 00:32:46,106
records and doctor's notes,
585
00:32:46,169 --> 00:32:50,184
as well as relevant medical
articles, guidelines and trials.
586
00:32:50,825 --> 00:32:52,865
And what it comes up with is a
set of ranked
587
00:32:52,920 --> 00:32:54,646
treatment recommendations.
588
00:32:54,967 --> 00:32:56,786
Down the bottom, we can see
589
00:32:56,849 --> 00:32:59,607
those in red that Watson
is not recommending.
590
00:32:59,781 --> 00:33:02,046
Does it take into account how
many citations a different
591
00:33:02,095 --> 00:33:04,870
article might have used?
Say the more citations, the more
592
00:33:04,910 --> 00:33:06,145
it’s going to trust it?
593
00:33:06,185 --> 00:33:08,829
So this is again where we need
clinician input
594
00:33:08,869 --> 00:33:11,669
to be able to make those
recommendations.
595
00:33:12,357 --> 00:33:15,239
Natalie, you’ve shown us this
and
596
00:33:15,294 --> 00:33:18,317
you’ve said that this would be a
clinician going through this.
597
00:33:18,458 --> 00:33:21,661
But the fields that you’ve
shown, really an educated
598
00:33:21,716 --> 00:33:23,984
patient could fill a lot of
these fields from
599
00:33:24,047 --> 00:33:25,392
their own information.
600
00:33:25,525 --> 00:33:27,575
What do you think about that
approach?
601
00:33:27,615 --> 00:33:30,739
The patient's essentially
getting their own second opinion
602
00:33:30,794 --> 00:33:32,513
from Watson for themselves?
603
00:33:32,623 --> 00:33:34,638
I see this as a potential tool
to do that.
604
00:33:37,154 --> 00:33:40,129
AI's growing expertise at image
recognition
605
00:33:40,206 --> 00:33:44,223
is also being harnessed by IBM
to train Watson on retinal
606
00:33:44,272 --> 00:33:45,363
scans.
607
00:33:45,536 --> 00:33:48,513
One in three diabetics have
associated eye disease,
608
00:33:48,670 --> 00:33:51,810
but only about half these
diabetics get regular checks.
609
00:33:52,131 --> 00:33:55,115
We know that with diabetes the
majority of vision loss is
610
00:33:55,164 --> 00:33:56,153
actually preventable,
611
00:33:56,193 --> 00:33:58,068
if timely treatment is
instigated
612
00:33:58,295 --> 00:34:00,746
and so if we can tap into
that group,
613
00:34:00,808 --> 00:34:03,668
you’re already looking at
potentially an incredible
614
00:34:03,717 --> 00:34:06,613
improvement in quality of life
for those patients.
615
00:34:06,973 --> 00:34:09,146
How could something like that
happen?
616
00:34:09,256 --> 00:34:12,251
You could have a situation where
you have a smartphone
617
00:34:12,320 --> 00:34:16,040
application. You take a retinal
selfie if you’d like.
618
00:34:16,899 --> 00:34:19,539
That then is uploaded to an AI
platform,
619
00:34:19,600 --> 00:34:22,920
analysed instantly and then you
have a process
620
00:34:22,970 --> 00:34:24,361
by which you instantly you're
known
621
00:34:24,401 --> 00:34:27,068
high-risk or low-risk disease.
622
00:34:27,640 --> 00:34:31,038
How long does it take to analyse
a single retinal image using
623
00:34:31,078 --> 00:34:32,082
the platform?
624
00:34:32,131 --> 00:34:35,397
Very close to real time, in
a matter of seconds.
625
00:34:35,443 --> 00:34:38,037
I mean this is obviously very,
very early days,
626
00:34:38,108 --> 00:34:40,037
but the hope is that one day
627
00:34:40,077 --> 00:34:42,779
these sorts of technologies
will be widely available
628
00:34:42,819 --> 00:34:45,752
to everyone for this sort of
self-analysis.
629
00:34:46,310 --> 00:34:47,661
Just like law,
630
00:34:47,840 --> 00:34:51,942
AI might one day enable patients
to DIY their own expert
631
00:34:51,982 --> 00:34:55,188
diagnosis and treatment
recommendations.
632
00:34:55,626 --> 00:34:58,242
Some doctors will absolutely
feel threatened by it,
633
00:34:58,282 --> 00:35:00,680
but I’d come back to the point
that, you know,
634
00:35:00,729 --> 00:35:03,086
you want to think about it from
the patient’s perspective.
635
00:35:03,172 --> 00:35:05,068
So if you’re an oncologist,
636
00:35:05,237 --> 00:35:07,630
sitting in the clinic with your
patient,
637
00:35:07,763 --> 00:35:10,255
the sorts of things that you’re
dealing with are
638
00:35:10,304 --> 00:35:13,434
things like giving bad news to
patients and I don’t think
639
00:35:13,474 --> 00:35:16,466
patients want to get bad news
from a machine.
640
00:35:16,537 --> 00:35:20,070
So it’s really that ability to
have that intelligent
641
00:35:20,116 --> 00:35:22,100
assistant who's up to date
642
00:35:22,140 --> 00:35:24,975
and providing you with the
information that you need,
643
00:35:25,023 --> 00:35:26,694
and providing it quickly.
644
00:35:26,764 --> 00:35:29,817
We like to use the term
augmented intelligence.
645
00:35:29,943 --> 00:35:33,028
I think one interesting way to
think about this is I mentioned
646
00:35:33,068 --> 00:35:36,884
50,000 oncology journals a year.
647
00:35:36,961 --> 00:35:41,141
Now if you’re a clinician trying
to read all of those
648
00:35:41,227 --> 00:35:43,864
50,000 oncology journals,
649
00:35:43,959 --> 00:35:47,013
that would mean you would need
about 160 hours a week
650
00:35:47,062 --> 00:35:48,706
just to read the oncology
651
00:35:48,755 --> 00:35:50,330
articles that are published
today.
652
00:35:50,370 --> 00:35:53,986
Watson’s ability to process all
of this medical literature
653
00:35:54,026 --> 00:35:57,174
and information and text is
immense.
654
00:35:57,214 --> 00:36:01,972
It’s 200 million pages of
information in seconds.
655
00:36:02,293 --> 00:36:07,456
Wow! I need a bit of work
on myself then.
656
00:36:08,098 --> 00:36:11,608
IBM is just one of many
companies promoting the promise
657
00:36:11,655 --> 00:36:13,600
of AI in healthcare
658
00:36:14,520 --> 00:36:17,120
- but for all these machine
learning algorithms to be
659
00:36:17,194 --> 00:36:20,185
effective, they need lots of
data
660
00:36:20,591 --> 00:36:23,545
lots of our private medical data.
661
00:36:23,842 --> 00:36:27,888
In my conversations with my
patients and the patient
662
00:36:27,928 --> 00:36:29,995
advocates that we’ve spoken to,
663
00:36:30,241 --> 00:36:33,513
you know, they certainly want
their privacy protected.
664
00:36:33,583 --> 00:36:36,731
But I think it’s actually a
higher priority for them
665
00:36:36,897 --> 00:36:39,638
to see this data being used for
the public good.
666
00:36:39,787 --> 00:36:43,900
But once it has all the data,
could this intelligent assistant
667
00:36:43,949 --> 00:36:48,433
ultimately disrupt medicine's
centuries-old hierarchy?
668
00:36:48,854 --> 00:36:53,435
They should have more general
practitioners and fewer
669
00:36:53,513 --> 00:36:54,638
specialists.
670
00:36:55,091 --> 00:36:56,115
So doctors,
671
00:36:56,162 --> 00:36:58,388
they'll have more time to have
a better relationship with you
672
00:36:58,482 --> 00:37:02,019
maybe they will be talking about
your overall health
673
00:37:02,105 --> 00:37:04,987
rather than waiting for you
to come in with symptoms
674
00:37:05,042 --> 00:37:07,575
and if they do have to
675
00:37:08,911 --> 00:37:13,403
analyse an X-ray and look for
disease,
676
00:37:13,553 --> 00:37:15,404
they will have a computer to
do that,
677
00:37:15,478 --> 00:37:17,519
they will check what the
computer does,
678
00:37:17,568 --> 00:37:19,130
but they will be pretty
confident that the
679
00:37:19,170 --> 00:37:20,872
computer is going to do a
good job.
680
00:37:26,111 --> 00:37:28,861
When we first talked to you,
Ani, in Sydney,
681
00:37:28,935 --> 00:37:32,040
you said you thought that in
terms of the time spent on tasks
682
00:37:32,080 --> 00:37:33,116
that doctors do
683
00:37:33,435 --> 00:37:36,029
that AI might be able to handle
maybe five,
684
00:37:36,069 --> 00:37:38,435
maybe at the outside 10%.
685
00:37:38,896 --> 00:37:40,216
How do you see that now?
686
00:37:40,904 --> 00:37:44,513
Definitely a lot more! I’d say
it could go up to 40-50%,
687
00:37:44,568 --> 00:37:47,505
using it as a tool rather than
taking over, I’d say, is going
688
00:37:47,545 --> 00:37:48,545
to happen.
689
00:37:49,084 --> 00:37:52,214
The percentage for doctors is
21%
690
00:37:52,481 --> 00:37:54,871
but that's likely to grow in the
coming decades,
691
00:37:54,921 --> 00:37:58,550
as it will for every profession,
and every job.
692
00:37:59,832 --> 00:38:02,752
We've been through technological
upheaval before,
693
00:38:02,970 --> 00:38:05,049
but this time, it's different.
694
00:38:10,904 --> 00:38:12,818
One of the challenges will be
that
695
00:38:12,920 --> 00:38:15,419
the AI revolution happens
probably much quicker than the
696
00:38:15,468 --> 00:38:16,529
Industrial Revolution.
697
00:38:16,584 --> 00:38:18,786
We don’t have to build big
steam engines,
698
00:38:18,835 --> 00:38:20,622
we just have to copy code
699
00:38:21,099 --> 00:38:23,732
and that takes almost no time
and no cost.
700
00:38:24,631 --> 00:38:26,685
There is a very serious question
701
00:38:26,826 --> 00:38:29,794
– whether there will be as many
jobs left as before.
702
00:38:29,943 --> 00:38:33,208
[music]
703
00:38:40,670 --> 00:38:44,038
I think the question is what is
the rate of change and
704
00:38:44,078 --> 00:38:49,014
is that going to be so fast
that it’s a shock to the system
705
00:38:49,054 --> 00:38:50,479
that’s going to be hard to
recover from?
706
00:38:50,605 --> 00:38:53,079
I guess I’m worried about
whether people will get
707
00:38:53,119 --> 00:38:57,252
frustrated with that and whether
that will lead to
708
00:38:58,080 --> 00:39:02,000
inequality of haves and
have-nots.
709
00:39:02,115 --> 00:39:05,615
And maybe we need some
additional safety nets
710
00:39:05,664 --> 00:39:07,026
for those who
711
00:39:07,075 --> 00:39:09,310
fall through those cracks
and aren’t able to be lifted.
712
00:39:09,451 --> 00:39:12,153
We should explore ideas like
Universal Basic Income
713
00:39:12,193 --> 00:39:15,575
to make sure that everyone has a
cushion to try new ideas.
714
00:39:16,006 --> 00:39:18,130
What to do about mass
unemployment.
715
00:39:18,717 --> 00:39:21,232
This is going to be a massive
social challenge,
716
00:39:22,662 --> 00:39:25,951
and I think ultimately we will
have to have
717
00:39:26,083 --> 00:39:28,341
some kind of Universal Basic
Income.
718
00:39:28,560 --> 00:39:29,958
I don’t think we’re going to
have a choice.
719
00:39:30,123 --> 00:39:32,403
I think it’s good that we’re
experimenting and looking
720
00:39:32,443 --> 00:39:33,724
at various things.
721
00:39:34,047 --> 00:39:35,372
I think we don't know the
answer yet
722
00:39:35,428 --> 00:39:37,761
for what’s going to be
effective.
723
00:39:39,318 --> 00:39:42,354
The ascent of Artificial
Intelligence promises
724
00:39:42,410 --> 00:39:44,097
spectacular opportunities
725
00:39:44,472 --> 00:39:46,815
but also many risks.
726
00:39:49,720 --> 00:39:51,840
To kickstart a national
conversation,
727
00:39:51,955 --> 00:39:55,134
we brought together the
generation most affected
728
00:39:55,221 --> 00:39:58,752
with some of the experts
helping to design the future.
729
00:39:59,947 --> 00:40:02,103
You will have the ability to
do jobs
730
00:40:02,162 --> 00:40:04,747
that your parents and
grandparents couldn’t have
731
00:40:04,787 --> 00:40:05,787
dreamed of.
732
00:40:06,874 --> 00:40:09,428
And it’s going to require us to
constantly
733
00:40:09,576 --> 00:40:12,459
be educating ourselves to keep
ahead of the machines.
734
00:40:14,022 --> 00:40:15,919
First of all, I wanted to say,
I think
735
00:40:15,959 --> 00:40:18,154
the younger generations probably
have a better idea about where
736
00:40:18,203 --> 00:40:20,047
things are going than the
older generations.
737
00:40:20,133 --> 00:40:21,242
[laughter]
738
00:40:23,703 --> 00:40:25,103
Sorry, but I think...
739
00:40:26,781 --> 00:40:27,922
So where have we got it wrong?
740
00:40:28,703 --> 00:40:31,324
Well, I think the younger
people, they’ve grown up
741
00:40:31,379 --> 00:40:33,308
being digital natives
742
00:40:33,348 --> 00:40:35,120
and so they know where it’s
going, they know what it
743
00:40:35,160 --> 00:40:36,760
has the potential to do
744
00:40:37,113 --> 00:40:39,740
and they can foresee where
it’s going to go in the future.
745
00:40:39,967 --> 00:40:42,669
We all hate that question at a
party, of like, ‘what do you do?’
746
00:40:42,810 --> 00:40:44,661
And I think in the future you
will be asked instead
747
00:40:44,710 --> 00:40:46,896
what did you do today or
what did you do this week?
748
00:40:47,153 --> 00:40:50,658
We all think of jobs as a
secure, safe thing
749
00:40:50,737 --> 00:40:53,948
but if you work one role,
one job title at one company,
750
00:40:54,119 --> 00:40:56,455
then you’re actually setting
yourself up to be more likely to
751
00:40:56,495 --> 00:40:58,416
be automated in the future.
752
00:40:58,487 --> 00:41:02,753
The technology in the building
game is advancing.
753
00:41:03,910 --> 00:41:05,831
Kind of worrying if you’re a
754
00:41:06,347 --> 00:41:08,589
22-year-old carpenter,
for example.
755
00:41:08,698 --> 00:41:11,942
I think there’s often this
misconception that you
756
00:41:11,998 --> 00:41:14,513
have to think about a robot
physically replacing you.
757
00:41:14,592 --> 00:41:15,811
One robot for one job.
758
00:41:15,951 --> 00:41:17,732
Actually it’s going to be, in
many cases,
759
00:41:17,781 --> 00:41:18,809
a lot more subtle than that.
760
00:41:18,858 --> 00:41:20,794
In your case, there will be a
lot more
761
00:41:20,960 --> 00:41:25,160
of the manufacturing of the
carpentry happening off-site.
762
00:41:25,282 --> 00:41:28,755
That happened between the start
of my apprenticeship and
763
00:41:28,804 --> 00:41:29,861
when I finished.
764
00:41:29,910 --> 00:41:33,017
We were moving to all the
frames and everything being built
765
00:41:33,057 --> 00:41:34,423
off-site and brought to you.
766
00:41:34,472 --> 00:41:37,057
And you’d do all the work that
used to take you three weeks
767
00:41:37,106 --> 00:41:38,220
in three days.
768
00:41:38,307 --> 00:41:40,517
I mean there is one aspect of
carpentry, I think, that will
769
00:41:40,650 --> 00:41:45,915
stay forever, which is the
more artisan side of carpentry.
770
00:41:46,353 --> 00:41:48,055
We will appreciate things that
are made,
771
00:41:48,189 --> 00:41:49,844
that have been touched by the
human hand.
772
00:41:49,996 --> 00:41:52,093
I think there will be a huge
impact in
773
00:41:52,142 --> 00:41:55,079
retail, with so much being
influenced by automation.
774
00:41:55,392 --> 00:41:57,204
Probably the cashier, you
don’t need someone
775
00:41:57,244 --> 00:42:00,270
there necessarily to take that
consumer’s money,
776
00:42:00,349 --> 00:42:02,739
that could be done quite simply.
777
00:42:05,886 --> 00:42:09,130
But at the same time,
just from having a job,
778
00:42:09,179 --> 00:42:15,060
there is a biological need met
there, which
779
00:42:15,216 --> 00:42:16,747
I think we're overlooking a lot,
780
00:42:16,880 --> 00:42:20,504
I think we might not have a
great depression economically,
781
00:42:20,615 --> 00:42:22,082
but we might mentally.
782
00:42:22,534 --> 00:42:27,307
AI is clearly going to create a
whole new raft of jobs.
783
00:42:27,722 --> 00:42:29,716
So you know, there are the
people who actually
784
00:42:29,779 --> 00:42:32,763
build these AI systems, I mean,
if you have a robot at home then
785
00:42:32,896 --> 00:42:36,036
every now and then,
you're going to need somebody
786
00:42:36,193 --> 00:42:39,138
to swing by your home to check
it out.
787
00:42:39,248 --> 00:42:42,802
There will be people who need to
train these robots
788
00:42:42,912 --> 00:42:46,599
and there will be robot
therapists,
789
00:42:46,709 --> 00:42:49,443
there will be obedience school
for robots
790
00:42:49,623 --> 00:42:51,023
and other kinds of –
791
00:42:53,482 --> 00:42:54,482
I'm not joking!
792
00:42:54,849 --> 00:42:57,044
[music]
793
00:43:01,732 --> 00:43:06,283
What should these young people
do today or tomorrow
794
00:43:06,361 --> 00:43:07,792
to get ready for this?
795
00:43:07,870 --> 00:43:10,338
There really is only one
strategy and that is to
796
00:43:10,387 --> 00:43:11,987
embrace the technology
797
00:43:12,176 --> 00:43:14,652
and to learn about it, and
to understand
798
00:43:14,887 --> 00:43:16,630
as far as possible, you know,
799
00:43:16,670 --> 00:43:19,606
what kind of impact it has
on your job and your goals.
800
00:43:19,646 --> 00:43:23,052
I think the key skills that
people need are the skills
801
00:43:23,092 --> 00:43:25,013
to work with machines.
802
00:43:25,310 --> 00:43:27,169
I don’t think everyone needs
to become a coder.
803
00:43:27,240 --> 00:43:29,840
You know, in fact, if Artificial
Intelligence is any good,
804
00:43:30,076 --> 00:43:33,015
machines will be better at
writing code than humans are,
805
00:43:33,508 --> 00:43:36,211
but people need to be able to
work with code,
806
00:43:36,260 --> 00:43:38,456
work with the output of those
machines
807
00:43:38,618 --> 00:43:40,286
and turn it into valuable
808
00:43:40,362 --> 00:43:43,193
commodities and services that
other people want.
809
00:43:43,326 --> 00:43:45,155
I disagree that we’ll
necessarily
810
00:43:45,358 --> 00:43:47,600
have to work with the machines;
the machines actually
811
00:43:47,640 --> 00:43:49,647
are going to understand us
quite well.
812
00:43:49,849 --> 00:43:52,216
So what are our strengths, what
are our human strengths?
813
00:43:52,265 --> 00:43:53,342
Well, those are
814
00:43:53,382 --> 00:43:55,934
our creativity, our adaptability
815
00:43:56,083 --> 00:43:58,012
and our emotional and
social intelligence.
816
00:43:58,091 --> 00:44:00,059
How do people get those skills?
817
00:44:00,802 --> 00:44:01,802
[laughter]
818
00:44:02,091 --> 00:44:04,051
Well, if they’re the
important skills...
819
00:44:04,100 --> 00:44:07,691
Well, I think the curriculum
at schools and at universities
820
00:44:07,731 --> 00:44:08,731
has to change
821
00:44:09,170 --> 00:44:11,036
so that those are the skills
that are taught.
822
00:44:11,080 --> 00:44:14,760
They are barely taught, if you
look at the current curriculums;
823
00:44:14,810 --> 00:44:16,388
you have to change the
curriculum.
824
00:44:16,521 --> 00:44:19,122
So those become the really
important skills.
825
00:44:19,404 --> 00:44:22,317
A lot of these discussions seem
to be skirting around the issue
826
00:44:22,357 --> 00:44:23,754
that really is the core of it,
827
00:44:23,794 --> 00:44:27,340
which is that the economic system
is really the problem at play here.
828
00:44:27,513 --> 00:44:30,477
It’s all about the ownership of
the AI and the robotics
829
00:44:30,524 --> 00:44:31,524
and the algorithms.
830
00:44:31,618 --> 00:44:33,945
If that ownership was shared and
the wealth was shared,
831
00:44:34,025 --> 00:44:36,273
then we’d be able to share in
that wealth.
832
00:44:36,406 --> 00:44:39,609
The trend is going to be towards
big companies like
833
00:44:39,665 --> 00:44:41,210
Amazon and Google,
834
00:44:41,523 --> 00:44:44,351
I don’t really see a
fragmentation
835
00:44:44,453 --> 00:44:47,549
because whoever has the data
has the power.
836
00:44:54,667 --> 00:44:57,703
Data is considered by many to be
the new oil,
837
00:44:57,758 --> 00:45:00,558
because as we move to a
digital economy,
838
00:45:00,683 --> 00:45:03,859
we can’t have automation without
data.
839
00:45:04,039 --> 00:45:08,913
What we see as an example is
value now moving from
840
00:45:08,968 --> 00:45:11,991
physical assets to data assets.
841
00:45:12,031 --> 00:45:13,686
For example, Facebook.
842
00:45:13,880 --> 00:45:17,600
Today when I looked, the market
capitalisation was about
843
00:45:17,742 --> 00:45:20,326
$479 billion.
844
00:45:20,561 --> 00:45:23,022
Now if you contrast that with
Qantas,
845
00:45:23,100 --> 00:45:25,100
which has a lot of physical
assets,
846
00:45:25,444 --> 00:45:29,224
their market capitalisation was
$9 billion.
847
00:45:29,779 --> 00:45:32,755
But you can go a step further
and if you look at
848
00:45:32,810 --> 00:45:35,389
the underlying structure of
Qantas,
849
00:45:35,695 --> 00:45:39,304
about $5 billion can be
attributed to their loyalty
850
00:45:39,353 --> 00:45:40,450
program,
851
00:45:40,499 --> 00:45:43,912
which is effectively a
data-centric asset
852
00:45:43,952 --> 00:45:45,124
that they’ve created.
853
00:45:45,173 --> 00:45:49,021
So the jobs of the future will
leverage data.
854
00:45:52,158 --> 00:45:54,845
The ownership of data is
important because
855
00:45:54,894 --> 00:45:56,144
if you think about Facebook,
856
00:45:56,196 --> 00:45:58,792
over time Facebook learns about
you
857
00:45:58,846 --> 00:46:02,464
and over time the service
improves as you use it further.
858
00:46:02,777 --> 00:46:07,285
So whoever gets to scale with
these data-centric businesses
859
00:46:07,325 --> 00:46:12,113
has a natural advantage and a
natural monopolistic tendency.
860
00:46:14,340 --> 00:46:17,749
In twenty years’ time, if big
corporations like Google
861
00:46:17,798 --> 00:46:19,602
and Facebook aren’t broken up,
862
00:46:20,141 --> 00:46:22,266
then I would be incredibly
worried for our future.
863
00:46:22,315 --> 00:46:24,321
Part of the reason why there are
so many monopolies is because
864
00:46:24,370 --> 00:46:27,483
they’ve managed to control
access to that data.
865
00:46:27,583 --> 00:46:28,974
Breaking them up I think
would be one of the
866
00:46:29,023 --> 00:46:31,388
things that we need to do, to be
able to open the data up
867
00:46:31,428 --> 00:46:33,870
so that all of us can share the
prosperity.
868
00:46:34,027 --> 00:46:38,706
But the global economy is very
rich and complex,
869
00:46:39,042 --> 00:46:44,034
and Australia can’t just say
‘oh, we’re opening the data’.
870
00:46:44,080 --> 00:46:47,173
I just still think we’re leaving
a section of the population
871
00:46:47,213 --> 00:46:49,947
behind. And some people
in our country
872
00:46:49,987 --> 00:46:52,679
can’t afford a computer or
the internet or a
873
00:46:52,728 --> 00:46:53,900
home to live in.
874
00:46:53,955 --> 00:46:55,329
It’d be a bit
875
00:46:55,821 --> 00:46:59,555
crazy to just let it all go free
market, just go crazy,
876
00:46:59,884 --> 00:47:03,869
because we don’t know if
everyone is on that
877
00:47:04,135 --> 00:47:06,026
‘make the world a better place’
type of thing.
878
00:47:11,831 --> 00:47:14,974
I personally don’t want to be
served by a computer,
879
00:47:15,021 --> 00:47:17,341
even if I am buying a coffee
and things like that.
880
00:47:17,381 --> 00:47:19,708
I enjoy that human connection
and I think that
881
00:47:19,779 --> 00:47:24,049
human connection’s really
important for isolated people,
882
00:47:24,229 --> 00:47:26,838
and that job might be really
important for that person
883
00:47:26,878 --> 00:47:30,211
and creating meaning in their
life and purpose in their life.
884
00:47:31,591 --> 00:47:36,399
They might not be skilled enough
to work in another industry.
885
00:47:36,560 --> 00:47:41,466
My first thought is, if it is
about human interaction,
886
00:47:41,660 --> 00:47:43,612
why do you need to
887
00:47:43,661 --> 00:47:45,911
be buying a coffee to
have that human interaction?
888
00:47:46,006 --> 00:47:48,630
Why not just have the machine do
the transaction
889
00:47:48,679 --> 00:47:51,841
and people can focus simply
on having a conversation?
890
00:47:51,896 --> 00:47:54,177
Perhaps part of that is to
simply say
891
00:47:54,271 --> 00:47:58,654
it is a productive role in
society to interact,
892
00:47:58,841 --> 00:48:00,271
to have conversations
893
00:48:00,365 --> 00:48:03,138
and then we can remunerate that
and make that a part of people’s
894
00:48:03,178 --> 00:48:04,216
roles in society.
895
00:48:04,279 --> 00:48:07,328
It could be a lot of things
around caring,
896
00:48:07,399 --> 00:48:10,516
interpersonal interactions, the
type of conversations you were
897
00:48:10,575 --> 00:48:11,578
talking about.
898
00:48:11,743 --> 00:48:13,653
I think they will become an
increasingly important part
899
00:48:13,693 --> 00:48:16,099
of the way we interact,
the way we find meaning,
900
00:48:16,217 --> 00:48:18,746
and potentially the way we
receive remuneration.
901
00:48:18,878 --> 00:48:21,137
I think we all have choices to
make,
902
00:48:21,231 --> 00:48:23,152
and amongst those are
903
00:48:23,270 --> 00:48:25,449
the degree to which we allow
904
00:48:25,544 --> 00:48:28,340
or want machines to be part of
our emotional engagement.
905
00:48:28,551 --> 00:48:31,861
Will we entrust our children to
robot nannies?
906
00:48:39,799 --> 00:48:41,799
Algorithms can be taught to
907
00:48:41,891 --> 00:48:44,110
interpret and perceive human
emotion.
908
00:48:44,159 --> 00:48:47,512
We can recognise from an image
that a person is smiling,
909
00:48:47,552 --> 00:48:49,350
we can see from a frown that
they’re angry,
910
00:48:49,553 --> 00:48:54,084
understand the emotion that’s in
text or speech
911
00:48:54,537 --> 00:48:56,722
and if you combine that
with other data, then yes,
912
00:48:56,771 --> 00:49:00,098
you can get a much more refined
view of what that emotion is,
913
00:49:00,138 --> 00:49:01,318
what is being expressed.
914
00:49:01,520 --> 00:49:03,996
But does an Artificial
Intelligence algorithm actually
915
00:49:04,036 --> 00:49:05,348
understand emotion?
916
00:49:05,559 --> 00:49:07,082
No, not presently.
917
00:49:07,552 --> 00:49:10,543
We’re in the early days of
emotion detection
918
00:49:10,825 --> 00:49:13,481
but this could go quite far;
you could certainly see
919
00:49:13,715 --> 00:49:15,739
emotional responses
920
00:49:16,091 --> 00:49:18,262
from algorithms, from computer
systems
921
00:49:18,442 --> 00:49:23,043
in caring for people, in
teaching, in our workplace.
922
00:49:23,449 --> 00:49:25,660
And to some extent that’s
already happening right now
923
00:49:25,709 --> 00:49:29,310
as people interact with bots
online, ask questions,
924
00:49:29,551 --> 00:49:30,583
and actually feel like,
925
00:49:30,646 --> 00:49:33,169
oftentimes, they’re interacting
with a real person.
926
00:49:33,412 --> 00:49:34,412
[music]
927
00:49:47,640 --> 00:49:50,840
When TAY was released in the US
to an audience
928
00:49:50,960 --> 00:49:53,120
of 20- to 25-year-olds,
929
00:49:53,396 --> 00:49:56,715
the interactions that TAY was
having on the internet
930
00:49:56,801 --> 00:50:00,816
included hate speech and
trolling.
931
00:50:02,419 --> 00:50:04,684
It only lasted a day,
but it’s a really
932
00:50:04,920 --> 00:50:08,927
fascinating lesson in how
careful we need to be
933
00:50:09,310 --> 00:50:12,091
in the interaction between
an artificial intelligence
934
00:50:12,140 --> 00:50:13,153
and its society.
935
00:50:13,193 --> 00:50:14,255
The key thing is
936
00:50:14,672 --> 00:50:18,372
what we teach our AI, it
reflects back to us.
937
00:50:20,764 --> 00:50:23,678
First, you will
938
00:50:23,811 --> 00:50:26,061
want the robot in your
home because it’s helpful,
939
00:50:26,264 --> 00:50:29,908
next minute you will need it
because you start to rely on it,
940
00:50:29,963 --> 00:50:31,689
and then you can’t live
without it.
941
00:50:32,142 --> 00:50:34,251
I think it sounds scary to be
honest.
942
00:50:34,299 --> 00:50:37,189
The thought of replacing that
human interaction
943
00:50:37,238 --> 00:50:40,013
and even having robots in your
home that you interact with daily,
944
00:50:40,062 --> 00:50:42,247
like a member of the
family.
945
00:50:42,373 --> 00:50:44,294
I think, yeah,
946
00:50:44,662 --> 00:50:49,075
real human interaction and
real empathy can’t be replaced
947
00:50:49,220 --> 00:50:50,419
and at the end of the day,
948
00:50:50,545 --> 00:50:53,099
the robot doesn’t genuinely
care about you.
949
00:50:53,279 --> 00:50:55,638
I think you certainly can’t
stop it.
950
00:50:55,709 --> 00:50:58,865
I mean there is no way to
stop it.
951
00:50:59,271 --> 00:51:03,357
Software systems and robots,
of course, can empathise
952
00:51:03,428 --> 00:51:06,708
and they can empathise so much
better than people
953
00:51:06,810 --> 00:51:13,200
because they will be able to
extract so much more data,
954
00:51:13,277 --> 00:51:15,795
not just about you, but about a lot
of people
955
00:51:15,890 --> 00:51:17,334
like you around the world.
956
00:51:17,452 --> 00:51:19,512
To go to this question of
whether we can
957
00:51:19,561 --> 00:51:20,952
or cannot stop it,
958
00:51:21,351 --> 00:51:23,663
we’re seeing, for example, in the
United States,
959
00:52:23,889 --> 00:52:26,427
computers and algorithms already being
used to help
960
00:51:26,488 --> 00:51:27,740
judges make decisions.
961
00:51:28,029 --> 00:51:31,298
And there I think is a line
we probably don’t want to cross.
962
00:51:31,347 --> 00:51:32,513
We don’t want to wake up
963
00:51:32,584 --> 00:51:34,075
and discover we’re in a world
964
00:51:34,256 --> 00:51:37,419
where we’re locking people up
because of an algorithm.
965
00:51:37,701 --> 00:51:42,058
I realise it’s fraught but all
of the evidence says
966
00:51:42,317 --> 00:51:47,402
that AI algorithms are much
more reliable than people.
967
00:51:47,530 --> 00:51:50,972
People are so flawed and, you
know,
968
00:51:51,027 --> 00:51:52,948
they are very biased,
we discriminate
969
00:51:53,199 --> 00:51:56,605
and that is much more
problematic
970
00:51:56,652 --> 00:51:59,719
and the reason is that people
are not transparent
971
00:51:59,759 --> 00:52:02,536
in the same way as an AI
algorithm is.
972
00:52:02,576 --> 00:52:04,185
Humans are deeply fallible.
973
00:52:04,318 --> 00:52:07,271
I veer on the side of saying
that, yes,
974
00:52:07,412 --> 00:52:10,732
I do not necessarily trust
judges as much as I do
975
00:52:10,904 --> 00:52:11,947
well-designed algorithms.
976
00:52:12,689 --> 00:52:16,361
The most important decisions
we make in our society,
977
00:52:16,409 --> 00:52:19,744
the most serious crimes we try in
front of a jury of our peers,
978
00:52:19,838 --> 00:52:21,825
and we’ve done that for hundreds
of years.
979
00:52:21,865 --> 00:52:23,247
And that’s something that I
think we should
980
00:52:23,287 --> 00:52:24,755
give up only very reluctantly.
981
00:52:24,826 --> 00:52:26,068
Nathan what do you think?
982
00:52:26,185 --> 00:52:29,388
Well, I think ultimately I don’t
know how far you want to
983
00:52:29,428 --> 00:52:30,786
go with this discussion but
984
00:52:30,853 --> 00:52:31,853
[laughter]
985
00:52:34,015 --> 00:52:36,325
because, like, ultimately what
will end up happening
986
00:52:36,371 --> 00:52:39,193
is we’re going to become the
second most intelligent species on
987
00:52:39,242 --> 00:52:40,263
this planet
988
00:52:40,303 --> 00:52:41,904
and if you take it to that
degree,
989
00:52:41,944 --> 00:52:43,998
do we actually merge with the
AI?
990
00:52:44,069 --> 00:52:46,248
So we have to merge our brains
991
00:52:46,327 --> 00:52:49,467
with AI; it’s the only way
forward. It’s inevitable.
992
00:52:49,507 --> 00:52:51,763
But we won’t be human then,
we’ll be something else.
993
00:52:51,815 --> 00:52:52,815
Superhuman!
994
00:52:52,880 --> 00:52:55,151
Superhuman? But that’s a choice.
995
00:52:55,191 --> 00:52:58,333
Do we not value our humanity
anymore?
996
00:52:58,404 --> 00:52:59,404
[music]
997
00:53:04,286 --> 00:53:06,841
We started off talking about
jobs.
998
00:53:08,200 --> 00:53:11,560
But somehow Artificial
Intelligence forces us
999
00:53:11,622 --> 00:53:14,755
to also think about what it
means to be human,
1000
00:53:15,109 --> 00:53:18,530
about what we value and who
controls that.
1001
00:53:18,617 --> 00:53:20,187
[music]
1002
00:53:21,830 --> 00:53:25,306
So here we are on the precipice
of another technological
1003
00:53:25,361 --> 00:53:26,735
transformation.
1004
00:53:27,665 --> 00:53:31,599
The last industrial revolution
turned society upside down.
1005
00:53:32,653 --> 00:53:36,855
It ultimately delivered greater
prosperity and many more jobs
1006
00:53:36,934 --> 00:53:39,761
as well as the 8-hour day and
weekends.
1007
00:53:40,832 --> 00:53:44,432
But the transition was at times
shocking and violent.
1008
00:53:45,842 --> 00:53:50,115
The question is: can we do
better this time?
1009
00:53:52,373 --> 00:53:55,711
We don’t realise the future is
not inevitable.
1010
00:53:55,938 --> 00:53:59,015
The future is the result of the
decisions we make today.
1011
00:53:59,078 --> 00:54:00,565
These technologies are morally
neutral.
1012
00:54:00,605 --> 00:54:01,929
They can be used for good or for
bad.
1013
00:54:01,977 --> 00:54:03,763
There are immense good things they
can do.
1014
00:54:03,803 --> 00:54:05,646
They can eliminate many
diseases,
1015
00:54:06,146 --> 00:54:08,310
they can help eliminate poverty,
they could tackle
1016
00:54:08,352 --> 00:54:09,419
climate change.
1017
00:54:10,361 --> 00:54:13,376
Equally, the technology can be
used for lots of bad.
1018
00:54:13,783 --> 00:54:16,243
It can be used to increase
inequality,
1019
00:54:16,314 --> 00:54:18,876
it can be used to transform
warfare,
1020
00:54:18,916 --> 00:54:22,393
it could be used to make our
lives much worse.
1021
00:54:24,362 --> 00:54:26,768
We get to make those choices.