(Clip)

"What Alison should know-- what is the Internet, anyway?"

"The Internet is that massive computer network, the one that's becoming really big now."

"What do you mean, that's big? How does it-- (mumbles) you write to it, like mail?"

"No, a lot of people use it to communicate with. I guess they can communicate with NBC writers and producers. Alison, can you explain what the Internet is?"

(dramatic music)
How amazing is that? Just over 20 years ago, people didn't even know what the Internet was, and today we can't even imagine our lives without it. Welcome to the Deep Learning A-Z course. My name is Kirill Eremenko, and along with my co-instructor, Hadelin de Ponteves, we're super excited to have you on board. Today we're going to give you a quick overview of what deep learning is and why it's picking up right now. So, let's get started.
Why did we have a look at that clip, and what is this photo over here? Well, that clip was from 1994, and this is a photo of a computer from 1980. The reason we're delving into history a little bit is that neural networks, along with deep learning, have been around for quite some time, and they've only started picking up and impacting the world right now. If you look back at the '80s, you'll see that even though neural networks were invented in the '60s and '70s, they really caught wind in the '80s. People started talking about them a lot, there was a lot of research in the area, and everybody thought that deep learning, or neural networks, was this new thing that was going to impact the world, change everything, and solve all the world's problems. And then it kind of slowly died off over the next decade.
So what happened? Why did neural networks not survive and not change the world? Is the reason that they were just not good enough, that they're not that good at predicting things, not that good at modeling, basically just not a good invention? Or is there another reason? Well, actually, there is another reason, and it's right in front of us: the technology back then was simply not up to the standard needed to support neural networks. For neural networks and deep learning to work properly, you need two things. You need data, a lot of data, and you need processing power: strong computers to process that data and run the neural networks.
So let's have a look at how data, or rather the storage of data, has evolved over the years, and then we'll look at how technology has evolved. Here we've got three years: 1956, 1980, and 2017. What did storage look like back in 1956? Well, there's a hard drive, and that hard drive is only a 5, wait for it, megabyte hard drive. That's 5 megabytes right there on the forklift, the size of a small room, a hard drive being transported to another location on a plane. That is what storage looked like in 1956. A company had to pay $2,500, in the dollars of the day, to rent that hard drive. To rent it, not buy it, and to rent it for one month. In 1980 the situation had improved a little bit: here we've got a 10 megabyte hard drive for $3,500. Still very expensive, and only 10 megabytes, which is about one photo these days. And today, in 2017, we've got a 256 gigabyte SSD card for $150 that can fit on your fingertip. If you're watching this video a year later, or in 2019 or 2025, you're probably laughing to yourself, because by then you'll have access to even bigger storage. But, nevertheless, the point stands.
If we compare these across the board, without even taking price and size into consideration, just looking at the capacity of whatever the leading technology was at the time: from 1956 to 1980 capacity roughly doubled, and then from 1980 to 2017 it increased about 25,600 times. And the two periods are not that different in length: 1956 to 1980 is 24 years, 1980 to 2017 is 37 years. So not that much more time, but a huge jump in technological progress, which goes to show that this is not a linear trend; it is exponential growth in technology. And if we take price and size into account as well, the improvement is in the millions.
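
As a quick sanity check on those numbers, here is a minimal sketch in Python. It assumes decimal units (1 GB = 1,000 MB), which is how the 25,600x figure works out:

# Capacity comparison of the three drives mentioned above,
# assuming decimal units (1 GB = 1,000 MB).
capacity_mb = {1956: 5, 1980: 10, 2017: 256 * 1000}  # 5 MB, 10 MB, 256 GB

growth_1956_to_1980 = capacity_mb[1980] / capacity_mb[1956]  # 2x
growth_1980_to_2017 = capacity_mb[2017] / capacity_mb[1980]  # 25,600x

print(f"1956 -> 1980: {growth_1956_to_1980:g}x over {1980 - 1956} years")
print(f"1980 -> 2017: {growth_1980_to_2017:,.0f}x over {2017 - 1980} years")

Run as-is, it prints a 2x jump over 24 years and then a 25,600x jump over 37 years, which is exactly the point about exponential rather than linear growth.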
Here we actually have a chart on a logarithmic scale: if we plot hard drive cost per gigabyte over time, it looks something like this. We're very quickly approaching zero. Right now you can get cloud storage on Dropbox and Google Drive that doesn't cost you anything, and that trend is going to continue. In fact, over the years this is going to go even further: scientists are now looking into using DNA for storage. Right now it's quite expensive, it costs $7,000 to synthesize 2 megabytes of data and then another $2,000 to read it back, but that is reminiscent of the whole situation with the hard drive on the plane; with this kind of exponential growth, those costs are going to come down very, very quickly. Ten or twenty years from now, everybody could be using DNA storage if we keep going in this direction.

Here are some stats on all that. You can explore this further, maybe pause the video if you want to read a bit more; this is from nature.com. Basically, you could store all of the world's data in just 1 kilogram of DNA, or about one billion terabytes of data in one gram of it. That is just to show how quickly we're progressing, and it's why deep learning is picking up now: we are finally at the stage where we have enough data to train super cool, super sophisticated models. Back in the '80s, when the field first got going, that just wasn't the case.
The second thing we talked about is processing capacity. Here we've got an exponential curve, again on a log scale. It's not ideally portrayed here, but on the right you can see it's a log scale, and it shows how computers have been evolving. Again, feel free to pause on this slide. This is Moore's law, which you've probably heard of: it describes how quickly the processing capacity of computers has been growing. Right now we're somewhere over here, where an average computer you can buy for a thousand dollars computes at roughly the speed of a rat's brain. By around 2023 to 2025 it will match the speed of a single human brain, and by around 2045 to 2050 it will surpass all human brains combined. So, basically, we are entering an era of computers that are extremely powerful, that can process things far faster than we can imagine, and that is what is facilitating deep learning.
All of this brings us to the question: what is deep learning? What is this whole neural network thing? What is going on, what are we even talking about here? You've probably seen a picture of something like this, so let's dive into it. What is deep learning? This gentleman over here, Geoffrey Hinton, is known as the godfather of deep learning. He did research on deep learning back in the '80s and has published lots and lots of research papers in the field, and right now he works at Google. A lot of the things we're going to be talking about actually come from Geoffrey Hinton. He has quite a few YouTube videos and explains things really well, so I highly recommend checking them out.
The idea behind deep learning is to look at the human brain (and there's going to be quite a bit of neuroscience coming up in these tutorials), because what we're trying to do here is mimic how the human brain operates. We don't know everything about the human brain, but the little that we do know, we want to mimic and recreate. And why is that? Well, because the human brain seems to be one of the most powerful learning tools on this planet: it learns and adapts skills and then applies them. If computers could copy that, then we could just leverage what natural selection has already decided for us, all of those algorithms it has decided are the best; we're just going to leverage that (mumbles).
So, let's see how this works. Here we've got some neurons. These are neurons that have been smeared onto glass, stained, and then examined under a microscope, and you can see what they look like: they have a body, they have these branches, they have tail-like extensions, and you can see the nucleus inside, in the middle. That's basically what a neuron looks like. In the human brain there are approximately 100 billion neurons altogether. These are individual neurons, and in fact motor neurons, because they're bigger and therefore easier to see. Nevertheless, there are about 100 billion neurons in the human brain, and each neuron is connected to as many as about a thousand of its neighbors. To give you a picture, this is what it looks like. This is an actual section of the human brain: the cerebellum, the part of your brain at the back. It's responsible for motor control and for keeping your balance, and it also plays a role in some language capabilities and things like that. This is just to show how vast it is, how many neurons there are. There are billions and billions and billions of neurons all connected in your brain. It's not like we're talking about five, or 500, or a thousand, or a million; there are billions of neurons in there. So that's what we're going to try to recreate.
So, how do we recreate this in a computer? Well, we create an artificial structure called an artificial neural network, where we have nodes, or neurons. We're going to have some neurons for the input values; these are the values that you know about a certain situation. For instance, if you're modeling something, if you want to predict something, you're always going to have some input, something to start your predictions off. That's called the input layer. Then you have the output, the value that you want to predict: whether it's a price, whether somebody is going to leave the bank or stay with the bank, whether a transaction is fraudulent or genuine, and so on. That's the output layer. And in between, we're going to have a hidden layer. In your brain you have so many neurons: information comes in through your eyes, ears, and nose, basically your senses, and it doesn't go straight to the output where you have the result; it passes through billions and billions of neurons before it gets to the output. That's the whole concept behind this: we're modeling the brain, so we need these hidden layers that sit before the output. The input layer neurons are connected to the hidden layer neurons, and the hidden layer neurons are connected to the output.
262
263
00:11:30,490 --> 00:11:32,020
Where is the deep learning here?
263
264
00:11:32,020 --> 00:11:32,890
Why is it called deep learning?
264
265
00:11:32,890 --> 00:11:34,000
There's nothing deep in here.
265
266
00:11:34,000 --> 00:11:36,970
Well, this is kind of like an option
266
267
00:11:36,970 --> 00:11:39,390
which one might call shallow learning,
267
268
00:11:39,390 --> 00:11:41,800
where there isn't much indeed going on.
268
269
00:11:41,800 --> 00:11:43,340
But, why is it called deep learning?
269
270
00:11:43,340 --> 00:11:46,090
Well, because then we take this to the next level.
270
271
00:11:46,090 --> 00:11:48,180
We separate it even further
271
272
00:11:48,180 --> 00:11:50,940
and we have not just one hidden layer,
272
273
00:11:50,940 --> 00:11:55,470
we have lots and lots and lots of hidden layers,
273
274
00:11:55,470 --> 00:11:57,710
and then we connect everything,
274
275
00:11:57,710 --> 00:11:59,110
just like in the human brain.
275
276
00:11:59,110 --> 00:12:01,850
Connect everything, interconnect everything,
276
277
00:12:01,850 --> 00:12:06,110
and that's how the input values are processed
277
278
00:12:06,110 --> 00:12:07,380
through all these hidden layers,
278
279
00:12:07,380 --> 00:12:10,150
just like in the human brain, then we have an output value.
279
280
00:12:10,150 --> 00:12:12,360
And now we're talking deep learning.
280
281
00:12:12,360 --> 00:12:14,290
So that's what deep learning is all about
281
282
00:12:14,290 --> 00:12:15,800
on a very abstract level.
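
Continuing that hedged Keras sketch, the only structural change needed to go from the shallow network to a deep one is stacking several fully connected hidden layers between the input and the output. The number and width of the layers here are arbitrary illustrations:

# Same assumptions as the earlier sketch, but with several hidden layers
# stacked between input and output: that stacking is the "deep" in deep learning.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Input

deep_model = Sequential([
    Input(shape=(3,)),               # same three input values as before
    Dense(16, activation="relu"),    # hidden layer 1
    Dense(16, activation="relu"),    # hidden layer 2
    Dense(16, activation="relu"),    # hidden layer 3
    Dense(1, activation="sigmoid"),  # output layer
])
deep_model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
deep_model.summary()                 # each Dense layer is fully connected to the previous one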
In the further tutorials we're going to dissect deep learning and dive deep into it, and by the end of it you will know what deep learning is all about and how to apply it in your own projects. We're super excited about this and can't wait to get started, and I look forward to seeing you in the next tutorial. Until then, enjoy deep learning.