1
1
00:00:00,011 --> 00:00:02,463
(light music)
2
2
00:00:02,463 --> 00:00:05,270
(keyboard clicking)
3
3
00:00:05,270 --> 00:00:06,930
All right, so in a few videos
4
4
00:00:06,930 --> 00:00:10,610
we're going to start looking at sort algorithms,
5
5
00:00:10,610 --> 00:00:12,690
and it would be really helpful for us to be
6
6
00:00:12,690 --> 00:00:16,490
able to compare the performance of one algorithm
7
7
00:00:16,490 --> 00:00:18,190
against another algorithm.
8
8
00:00:18,190 --> 00:00:21,060
So you might think that one way we could do that
9
9
00:00:21,060 --> 00:00:25,430
is to implement one algorithm and then put a line of code
10
10
00:00:25,430 --> 00:00:29,820
that records the start time and then run the implementation
11
11
00:00:29,820 --> 00:00:32,670
and then have a line of code that records the end time
12
12
00:00:32,670 --> 00:00:36,100
and we can subtract the start time from the end time
13
13
00:00:36,100 --> 00:00:38,910
and we get the running time of the implementation
14
14
00:00:38,910 --> 00:00:40,280
of that algorithm.
15
15
00:00:40,280 --> 00:00:42,560
And then we do the same for another algorithm
16
16
00:00:42,560 --> 00:00:44,120
and we just compare the two.
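As a rough sketch of that timing idea (the helper name here is mine, not from the video), it might look something like this in Python:

    import time

    def time_implementation(fn, data):
        start = time.perf_counter()   # record the start time
        fn(data)                      # run the implementation
        end = time.perf_counter()     # record the end time
        return end - start            # running time = end time minus start time

    # e.g. time_implementation(sorted, list(range(100_000))) gives different numbers
    # on different machines, even though the algorithm is exactly the same.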
17
17
00:00:44,120 --> 00:00:46,490
Sounds great in theory but unfortunately
18
18
00:00:46,490 --> 00:00:49,540
that's not a good way to do it because of hardware.
19
19
00:00:49,540 --> 00:00:52,550
Hardware is going to influence the running time
20
20
00:00:52,550 --> 00:00:53,540
of these algorithms.
21
21
00:00:53,540 --> 00:00:54,710
I mean, think about it.
22
22
00:00:54,710 --> 00:00:59,470
If we were to run an implementation on a desktop computer
23
23
00:00:59,470 --> 00:01:02,900
that was built in 2017 and compare that
24
24
00:01:02,900 --> 00:01:06,650
to the running time of the exact same implementation
25
25
00:01:06,650 --> 00:01:09,760
on a desktop computer that was built 20 years ago
26
26
00:01:09,760 --> 00:01:11,810
and perhaps is an old Pentium computer,
27
27
00:01:11,810 --> 00:01:14,450
or we'll even go further than that and run
28
28
00:01:14,450 --> 00:01:18,180
the implementation on a Commodore 64 or something like that,
29
29
00:01:18,180 --> 00:01:21,350
obviously the implementations are going to run slower
30
30
00:01:21,350 --> 00:01:23,420
on the older hardware.
31
31
00:01:23,420 --> 00:01:26,470
And if we were to run an implementation on a super computer,
32
32
00:01:26,470 --> 00:01:28,810
it's going to run really, really fast even though
33
33
00:01:28,810 --> 00:01:31,710
it might be a really inefficient algorithm.
34
34
00:01:31,710 --> 00:01:34,400
So we need a more objective measure
35
35
00:01:34,400 --> 00:01:36,870
than just the straight running time.
36
36
00:01:36,870 --> 00:01:41,870
And so what we do is we look at the number of steps
37
37
00:01:42,220 --> 00:01:45,740
that it takes to execute an algorithm.
38
38
00:01:45,740 --> 00:01:49,200
And we call this the time complexity.
39
39
00:01:49,200 --> 00:01:50,850
There are two types of complexity:
40
40
00:01:50,850 --> 00:01:52,900
there's time complexity which is the number
41
41
00:01:52,900 --> 00:01:56,590
of steps involved to run an algorithm;
42
42
00:01:56,590 --> 00:01:59,750
and then there is memory complexity which is the amount
43
43
00:01:59,750 --> 00:02:01,800
of memory it takes to run an algorithm.
44
44
00:02:01,800 --> 00:02:04,590
Now these days, because memory is so cheap,
45
45
00:02:04,590 --> 00:02:07,270
memory complexity isn't such an issue.
46
46
00:02:07,270 --> 00:02:09,850
So these days we mainly concern ourselves
47
47
00:02:09,850 --> 00:02:10,960
with time complexity.
48
48
00:02:10,960 --> 00:02:14,500
So we're interested in: how many steps does it take
49
49
00:02:14,500 --> 00:02:15,890
to run an algorithm?
50
50
00:02:15,890 --> 00:02:19,270
Now when we're doing this, we like to look at the worst case.
51
51
00:02:19,270 --> 00:02:21,760
Looking at the best case doesn't help us because
52
52
00:02:21,760 --> 00:02:24,280
you're rarely gonna have the best case.
53
53
00:02:24,280 --> 00:02:26,460
Now we could take the average case
54
54
00:02:26,460 --> 00:02:30,050
but that's not gonna tell us the absolute
55
55
00:02:30,050 --> 00:02:31,610
worst time complexity.
56
56
00:02:31,610 --> 00:02:33,840
So if we wanna know what the upper bound is,
57
57
00:02:33,840 --> 00:02:36,990
like what is the absolute worst that we can expect
58
58
00:02:36,990 --> 00:02:39,380
from this algorithm, it's much more helpful
59
59
00:02:39,380 --> 00:02:40,930
to look at the worst case.
60
60
00:02:40,930 --> 00:02:44,580
So it's helpful to compare the worst case scenario
61
61
00:02:44,580 --> 00:02:47,160
for one algorithm against the worst case scenario
62
62
00:02:47,160 --> 00:02:48,240
for the other algorithm.
63
63
00:02:48,240 --> 00:02:50,330
And so that's how we do it.
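As a quick illustration of best case versus worst case (this example is mine, not from the video), consider a simple linear search:

    def linear_search(items, target):
        for index, value in enumerate(items):   # up to len(items) comparisons
            if value == target:
                return index
        return -1

    # Best case: the target is the first element, so only 1 comparison is needed.
    # Worst case: the target is last or missing, so all N elements are compared --
    # and that worst case is the upper bound we compare algorithms by.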
64
64
00:02:50,330 --> 00:02:52,060
So let's look at the algorithm
65
65
00:02:52,060 --> 00:02:54,100
that I have up on this screen.
66
66
00:02:54,100 --> 00:02:56,650
We've already looked at an algorithm for making tea
67
67
00:02:56,650 --> 00:02:59,850
so now we're going to look at one little step.
68
68
00:02:59,850 --> 00:03:03,120
And we have an algorithm for adding sugar to tea.
69
69
00:03:03,120 --> 00:03:07,060
So in this algorithm step one is to get the bowl
70
70
00:03:07,060 --> 00:03:08,450
containing the sugar.
71
71
00:03:08,450 --> 00:03:10,730
Step two is to get a spoon.
72
72
00:03:10,730 --> 00:03:13,360
So we're assuming we have loose sugar here.
73
73
00:03:13,360 --> 00:03:16,540
Step three is to scoop out sugar using the spoon,
74
74
00:03:16,540 --> 00:03:19,070
and step four is to pour the sugar
75
75
00:03:19,070 --> 00:03:21,190
from the spoon into the tea.
76
76
00:03:21,190 --> 00:03:24,880
And of course if you want more than one teaspoon
77
77
00:03:24,880 --> 00:03:27,690
then you have to repeat steps three and four
78
78
00:03:27,690 --> 00:03:30,330
until you've added the desired amount of sugar.
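One way you might express that algorithm in code, with a counter so we can see how many steps it takes (the function name is my own, not from the course):

    def add_sugar_to_tea(desired_sugars):
        steps = 0
        steps += 1                    # step 1: get the bowl containing the sugar
        steps += 1                    # step 2: get a spoon
        for _ in range(desired_sugars):
            steps += 1                # step 3: scoop out sugar using the spoon
            steps += 1                # step 4: pour the sugar from the spoon into the tea
        return steps

    # add_sugar_to_tea(1) -> 4 steps, add_sugar_to_tea(2) -> 6 steps, add_sugar_to_tea(4) -> 10 steps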
79
79
00:03:30,330 --> 00:03:33,460
Now we can see from this that the number of steps
80
80
00:03:33,460 --> 00:03:35,800
that it's going to take to add sugar to your tea
81
81
00:03:35,800 --> 00:03:38,920
is going to depend on how many sugars you want.
82
82
00:03:38,920 --> 00:03:41,000
If you only want one sugar,
83
83
00:03:41,000 --> 00:03:44,410
then this algorithm will run taking four steps.
84
84
00:03:44,410 --> 00:03:45,840
But if you want two sugars,
85
85
00:03:45,840 --> 00:03:47,760
it's going to take six steps because
86
86
00:03:47,760 --> 00:03:50,510
you're going to have to repeat steps three and four.
87
87
00:03:50,510 --> 00:03:52,910
So let's have a look at what happens
88
88
00:03:52,910 --> 00:03:54,620
for different numbers of sugars.
89
89
00:03:54,620 --> 00:03:58,330
So if you have one sugar, you just need to do four steps.
90
90
00:03:58,330 --> 00:04:02,350
If you have two sugars, you're gonna need six steps.
91
91
00:04:02,350 --> 00:04:04,080
If you want three sugars in your tea
92
92
00:04:04,080 --> 00:04:06,270
you're gonna need eight steps because you're gonna
93
93
00:04:06,270 --> 00:04:09,830
have to repeat steps three and four three times.
94
94
00:04:09,830 --> 00:04:11,930
And if you want four sugars in your tea,
95
95
00:04:11,930 --> 00:04:14,230
I'm sure most of us would have bailed by now,
96
96
00:04:14,230 --> 00:04:16,770
but I do have a friend who, I'd say, takes
97
97
00:04:16,770 --> 00:04:19,360
tea with her sugar rather than the other way around.
98
98
00:04:19,360 --> 00:04:22,230
So she might actually make it down to row four.
99
99
00:04:22,230 --> 00:04:24,640
If you want four sugars in your tea,
100
100
00:04:24,640 --> 00:04:25,770
then you're gonna have to repeat
101
101
00:04:25,770 --> 00:04:28,430
steps three and four four times.
102
102
00:04:28,430 --> 00:04:30,380
And so you're going to take 10 steps
103
103
00:04:30,380 --> 00:04:31,810
to put sugar in your tea.
104
104
00:04:31,810 --> 00:04:33,990
So as we can see the number of steps
105
105
00:04:33,990 --> 00:04:36,920
or the time complexity of our sugar algorithm
106
106
00:04:36,920 --> 00:04:40,540
depends on how many sugars someone wants in their tea.
107
107
00:04:40,540 --> 00:04:42,470
If they only want one sugar,
108
108
00:04:42,470 --> 00:04:44,060
it just takes them four steps.
109
109
00:04:44,060 --> 00:04:47,920
But if they want four sugars, it's gonna take them 10 steps.
110
110
00:04:47,920 --> 00:04:50,440
So the time complexity gives us an idea
111
111
00:04:50,440 --> 00:04:54,110
of how an algorithm will perform as the number of items
112
112
00:04:54,110 --> 00:04:57,080
it has to deal with grows. So as we can see,
113
113
00:04:57,080 --> 00:04:59,730
as the number of sugars this algorithm has to add
114
114
00:04:59,730 --> 00:05:04,120
to tea increases, the number of steps required increases.
115
115
00:05:04,120 --> 00:05:06,270
Another way of saying this is it tells us
116
116
00:05:06,270 --> 00:05:08,890
how well an algorithm will scale.
117
117
00:05:08,890 --> 00:05:11,060
So how well will it do when it has to deal
118
118
00:05:11,060 --> 00:05:15,590
with 100 items, versus 1,000 items, versus a million items?
119
119
00:05:15,590 --> 00:05:18,840
And we'll see that some algorithms will scale really well
120
120
00:05:18,840 --> 00:05:20,720
and others not so well.
121
121
00:05:20,720 --> 00:05:22,130
The more items there are,
122
122
00:05:22,130 --> 00:05:25,240
the more an algorithm's performance will degrade.
123
123
00:05:25,240 --> 00:05:28,710
Now the big O notation is a way of expressing
124
124
00:05:28,710 --> 00:05:31,733
the complexity related to the number of items
125
125
00:05:31,733 --> 00:05:33,900
that an algorithm has to deal with.
126
126
00:05:33,900 --> 00:05:36,060
And it's written as a capital O
127
127
00:05:36,060 --> 00:05:38,680
followed by an expression in parentheses.
128
128
00:05:38,680 --> 00:05:43,680
So let's work out the big O value for our sugar algorithm.
129
129
00:05:44,570 --> 00:05:49,570
So it's conventional to designate the number of items by N.
130
130
00:05:50,730 --> 00:05:54,910
So in our case the number of desired sugars will equal N.
131
131
00:05:54,910 --> 00:05:58,370
And as we can see from our table
132
132
00:05:58,370 --> 00:06:00,733
and from our algorithm back here,
133
133
00:06:01,760 --> 00:06:06,760
the number of steps is going to be two times N plus two.
134
134
00:06:07,880 --> 00:06:10,950
And the reason for that is you have to repeat
135
135
00:06:10,950 --> 00:06:14,390
steps three and four for the number of sugars you have,
136
136
00:06:14,390 --> 00:06:15,980
so that's the two N.
137
137
00:06:15,980 --> 00:06:19,190
And then the plus two is steps one and two.
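As a quick check of that formula: 2 × 1 + 2 = 4, 2 × 2 + 2 = 6, 2 × 3 + 2 = 8, and 2 × 4 + 2 = 10, which matches the four, six, eight, ten column in the table.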
138
138
00:06:19,190 --> 00:06:23,140
Now as we've seen when N grows the number of steps grows.
139
139
00:06:23,140 --> 00:06:25,170
Now notice that the two,
140
140
00:06:25,170 --> 00:06:26,530
this is gonna be hard for me to say,
141
141
00:06:26,530 --> 00:06:30,220
but the two twos in the two N plus two,
142
142
00:06:30,220 --> 00:06:32,000
they remain constant.
143
143
00:06:32,000 --> 00:06:34,930
They don't change as the number of sugars changes.
144
144
00:06:34,930 --> 00:06:38,100
And because of that we don't consider them
145
145
00:06:38,100 --> 00:06:40,980
when we're coming up with the big O value.
146
146
00:06:40,980 --> 00:06:44,090
It's N that's influencing the number of steps.
147
147
00:06:44,090 --> 00:06:47,050
And so in this case we say that the time complexity
148
148
00:06:47,050 --> 00:06:50,410
for this algorithm is O of N.
149
149
00:06:50,410 --> 00:06:53,110
And because the time complexity for this algorithm
150
150
00:06:53,110 --> 00:06:55,340
increases in a linear fashion,
151
151
00:06:55,340 --> 00:06:58,614
we consider this to be linear time complexity.
152
152
00:06:58,614 --> 00:07:03,614
So a time complexity of O of N is linear time complexity.
153
153
00:07:03,640 --> 00:07:06,340
And if we go back to our table,
154
154
00:07:06,340 --> 00:07:09,580
we'll see that we go four, six, eight, 10
155
155
00:07:09,580 --> 00:07:13,350
and of course we're gonna keep going, 12, 14, 16, 18.
156
156
00:07:13,350 --> 00:07:16,570
That's a linear progression, it's a linear sequence.
157
157
00:07:16,570 --> 00:07:21,120
And so the time complexity of our adding sugars
158
158
00:07:21,120 --> 00:07:23,920
to tea algorithm is O of N.
159
159
00:07:23,920 --> 00:07:25,820
So these are the big O values
160
160
00:07:25,820 --> 00:07:28,627
we're going to mainly see in this course.
161
161
00:07:28,627 --> 00:07:33,440
You have O of one which is constant time complexity,
162
162
00:07:33,440 --> 00:07:36,930
O of log N which is logarithmic time complexity,
163
163
00:07:36,930 --> 00:07:41,460
and that's log to the base two not log to the base 10.
164
164
00:07:41,460 --> 00:07:44,080
So it's log to the base two N.
165
165
00:07:44,080 --> 00:07:47,030
O of N which is linear time complexity.
166
166
00:07:47,030 --> 00:07:49,540
And then we have O of N log N
167
167
00:07:49,540 --> 00:07:52,820
which is N times log to the base two N,
168
168
00:07:52,820 --> 00:07:56,710
and that's N log N time complexity.
169
169
00:07:56,710 --> 00:07:58,970
And we have O of N squared
170
170
00:07:58,970 --> 00:08:01,210
which is quadratic time complexity.
171
171
00:08:01,210 --> 00:08:04,000
You'll see that when we look at sort algorithms,
172
172
00:08:04,000 --> 00:08:06,930
the first few we look at will actually be quadratic.
173
173
00:08:06,930 --> 00:08:11,320
And the order in the table is from best to worst.
174
174
00:08:11,320 --> 00:08:15,110
So if you can use a constant time algorithm
175
175
00:08:15,110 --> 00:08:19,190
that's way better than a quadratic time algorithm.
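To make those classes concrete, here are some illustrative Python operations (my examples, not the course's) that typically fall into each class, assuming a list of N elements:

    from bisect import bisect_left

    def constant_time(items):               # O(1): one step no matter how big N is
        return items[0]

    def logarithmic_time(sorted_items, x):   # O(log N): binary search halves the range each step
        return bisect_left(sorted_items, x)

    def linear_time(items, x):               # O(N): may have to look at every element once
        return x in items

    def n_log_n_time(items):                 # O(N log N): comparison-based sorting
        return sorted(items)

    def quadratic_time(items):               # O(N squared): compare every element with every other
        return [(a, b) for a in items for b in items]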
176
176
00:08:19,190 --> 00:08:24,190
Now as I said, you heard me say O of one and O of N,
177
177
00:08:24,230 --> 00:08:25,620
that's how you say it.
178
178
00:08:25,620 --> 00:08:28,570
O of and then the value that's in brackets.
179
179
00:08:28,570 --> 00:08:31,700
Now when I was taught about big O notation,
180
180
00:08:31,700 --> 00:08:36,170
for some reason the person teaching the information
181
181
00:08:36,170 --> 00:08:40,770
instead said O to the one, O to the N.
182
182
00:08:40,770 --> 00:08:42,880
And so that was drilled into my head
183
183
00:08:42,880 --> 00:08:44,430
and it's a hard habit to break.
184
184
00:08:44,430 --> 00:08:46,550
The big O notation, it's the sort of thing
185
185
00:08:46,550 --> 00:08:48,800
that you don't have to say; most of the time
186
186
00:08:48,800 --> 00:08:50,870
you're reading it or you're writing it.
187
187
00:08:50,870 --> 00:08:54,770
So the correct way of saying it is O of one, O of N,
188
188
00:08:54,770 --> 00:08:56,570
O of log N, et cetera.
189
189
00:08:56,570 --> 00:08:59,040
But I am going to slip up occasionally
190
190
00:08:59,040 --> 00:09:01,790
and say O to the because that's what was
191
191
00:09:01,790 --> 00:09:04,370
drilled into my head when I learned this material.
192
192
00:09:04,370 --> 00:09:05,900
And so if you hear me say that,
193
193
00:09:05,900 --> 00:09:07,730
if you hear me say O to the one,
194
194
00:09:07,730 --> 00:09:09,490
O to the N squared, et cetera,
195
195
00:09:09,490 --> 00:09:13,140
just convert that to O of in your mind.
196
196
00:09:13,140 --> 00:09:15,090
Chances are you're never gonna actually have
197
197
00:09:15,090 --> 00:09:16,700
to say this ever.
198
198
00:09:16,700 --> 00:09:21,700
In my entire career I never once had to say O of whatever.
199
199
00:09:21,860 --> 00:09:25,220
In fact in my entire career we never once discussed
200
200
00:09:25,220 --> 00:09:26,800
big O notation.
201
201
00:09:26,800 --> 00:09:28,800
I'm sure that depending on what type of
202
202
00:09:28,800 --> 00:09:30,360
application you're working on,
203
203
00:09:30,360 --> 00:09:31,690
that might be different for you.
204
204
00:09:31,690 --> 00:09:35,450
But in my career big O notation was sort of a non-issue.
205
205
00:09:35,450 --> 00:09:38,860
It was never something that we needed to worry about.
206
206
00:09:38,860 --> 00:09:41,500
And so if you're talking to somebody
207
207
00:09:41,500 --> 00:09:43,690
and they ask you about big O notation
208
208
00:09:43,690 --> 00:09:46,420
or big O values, say "O of."
209
209
00:09:46,420 --> 00:09:49,670
And if you hear me say O to the in this course,
210
210
00:09:49,670 --> 00:09:51,790
just convert that to O of.
211
211
00:09:51,790 --> 00:09:54,100
So now let's go over to Wikipedia
212
212
00:09:54,100 --> 00:09:57,500
so we can actually see the big O values
213
213
00:09:57,500 --> 00:10:00,810
plotted on a graph so we can get a visual idea
214
214
00:10:00,810 --> 00:10:04,713
of the differences between the different big O values.
215
215
00:10:08,500 --> 00:10:12,820
So I'm here at Wikipedia and you'll find this image
216
216
00:10:12,820 --> 00:10:15,560
in the Wikipedia article about big O notation
217
217
00:10:15,560 --> 00:10:19,380
and I've also included a link directly to this image
218
218
00:10:19,380 --> 00:10:20,850
in the Resources section.
219
219
00:10:20,850 --> 00:10:25,180
This image was created by user Cmglee.
220
220
00:10:25,180 --> 00:10:30,180
And so this is a graph of some of the big O values,
221
221
00:10:30,680 --> 00:10:34,230
and this represents how an algorithm would degrade.
222
222
00:10:34,230 --> 00:10:38,470
Along the X axis we have the input size,
223
223
00:10:38,470 --> 00:10:39,990
so the number of items;
224
224
00:10:39,990 --> 00:10:43,950
and along the Y axis we have the number of steps.
225
225
00:10:43,950 --> 00:10:46,770
So you can see for a constant algorithm,
226
226
00:10:46,770 --> 00:10:51,050
O of one which is in this violet colour,
227
227
00:10:51,050 --> 00:10:53,550
as the number of items increases
228
228
00:10:53,550 --> 00:10:55,770
the number of steps remains the same.
229
229
00:10:55,770 --> 00:10:58,220
This is just a straight horizontal line.
230
230
00:10:58,220 --> 00:10:59,800
So here's the number of steps.
231
231
00:10:59,800 --> 00:11:02,000
And this line isn't climbing at all
232
232
00:11:02,000 --> 00:11:04,210
as the number of items increases.
233
233
00:11:04,210 --> 00:11:05,590
So that's the best case.
234
234
00:11:05,590 --> 00:11:08,150
If you can get a constant algorithm,
235
235
00:11:08,150 --> 00:11:10,320
that's pretty much the best you can do.
236
236
00:11:10,320 --> 00:11:12,800
The next best thing is logarithmic.
237
237
00:11:12,800 --> 00:11:15,800
And notice that it's log to the base two.
238
238
00:11:15,800 --> 00:11:18,370
And we're gonna see some algorithms
239
239
00:11:18,370 --> 00:11:20,530
that have this time complexity.
240
240
00:11:20,530 --> 00:11:22,840
This is in a dark blue.
241
241
00:11:22,840 --> 00:11:26,220
And as we can see it kind of starts off climbing.
242
242
00:11:26,220 --> 00:11:28,250
So as the number of items increases,
243
243
00:11:28,250 --> 00:11:29,490
the number of steps goes up.
244
244
00:11:29,490 --> 00:11:32,440
But then it levels off and it's pretty much
245
245
00:11:32,440 --> 00:11:34,160
almost constant time.
246
246
00:11:34,160 --> 00:11:36,070
If you can achieve log
247
247
00:11:36,070 --> 00:11:39,410
to the base two N time complexity, that's great.
248
248
00:11:39,410 --> 00:11:41,800
We're not gonna see any with,
249
249
00:11:41,800 --> 00:11:43,850
I think that's square root, the square root of N,
250
250
00:11:43,850 --> 00:11:46,340
but we are going to see some linear algorithms.
251
251
00:11:46,340 --> 00:11:49,280
We've already discussed linear time.
252
252
00:11:49,280 --> 00:11:51,860
And so here's the graph, this is in green.
253
253
00:11:51,860 --> 00:11:55,210
So if you have 10 items you're gonna take 10 steps,
254
254
00:11:55,210 --> 00:11:57,120
20 items, 20 steps.
255
255
00:11:57,120 --> 00:11:59,690
It doesn't have to be an exact match like this.
256
256
00:11:59,690 --> 00:12:01,630
It's more about the pattern of growth.
257
257
00:12:01,630 --> 00:12:04,430
So if the pattern of growth is linear,
258
258
00:12:04,430 --> 00:12:06,500
it's a linear sequence, then the graph
259
259
00:12:06,500 --> 00:12:07,470
is gonna look like this.
260
260
00:12:07,470 --> 00:12:09,320
And this is what it looked like for our
261
261
00:12:09,320 --> 00:12:11,330
adding sugar to tea algorithm.
262
262
00:12:11,330 --> 00:12:13,440
We didn't start at zero steps.
263
263
00:12:13,440 --> 00:12:16,680
If you want no sugars, obviously there's zero steps,
264
264
00:12:16,680 --> 00:12:19,420
but our algorithm is for adding sugar to your tea,
265
265
00:12:19,420 --> 00:12:22,380
so that assumes that there's going to be sugar added.
266
266
00:12:22,380 --> 00:12:26,340
And so the minimum number of steps for one sugar is four.
267
267
00:12:26,340 --> 00:12:27,900
So we'd actually start at four
268
268
00:12:27,900 --> 00:12:30,640
but then we go six, eight, 10, et cetera.
269
269
00:12:30,640 --> 00:12:32,490
And so it's a linear progression.
270
270
00:12:32,490 --> 00:12:36,550
Now this orange line is N log to the base two N.
271
271
00:12:36,550 --> 00:12:40,190
So here we're multiplying log to the base two N
272
272
00:12:40,190 --> 00:12:41,890
by the number of items.
273
273
00:12:41,890 --> 00:12:43,360
And we're gonna see some algorithms
274
274
00:12:43,360 --> 00:12:45,070
that have this time complexity.
275
275
00:12:45,070 --> 00:12:49,140
This is worse than linear because as you can see
276
276
00:12:49,140 --> 00:12:52,610
the number of steps required climbs much more sharply.
277
277
00:12:52,610 --> 00:12:55,870
And then of course worse than that is quadratic
278
278
00:12:55,870 --> 00:12:58,330
and we're gonna start out looking at sort algorithms
279
279
00:12:58,330 --> 00:13:02,710
that actually have a time complexity of quadratic time.
280
280
00:13:02,710 --> 00:13:04,690
And this is an even sharper climb.
281
281
00:13:04,690 --> 00:13:06,820
This is the red dashed line.
282
282
00:13:06,820 --> 00:13:10,640
And so if you have 10 items it's gonna take 100 steps
283
283
00:13:10,640 --> 00:13:12,530
because that's gonna be 10 squared.
284
284
00:13:12,530 --> 00:13:15,020
And so if you have a quadratic algorithm,
285
285
00:13:15,020 --> 00:13:16,360
this quickly degrades.
286
286
00:13:16,360 --> 00:13:19,100
1,000 items is already a million steps.
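A small sketch (mine, not from the article) that prints roughly how those curves diverge as the input size grows:

    import math

    for n in (10, 100, 1000):
        print(n,
              round(math.log2(n)),        # logarithmic: barely grows
              n,                          # linear: grows in step with n
              round(n * math.log2(n)),    # N log N: climbs more sharply
              n * n)                      # quadratic: 10 -> 100 steps, 1,000 -> 1,000,000 steps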
287
287
00:13:19,100 --> 00:13:21,910
So keep this graph in mind as we look
288
288
00:13:21,910 --> 00:13:23,150
at the different algorithms.
289
289
00:13:23,150 --> 00:13:26,040
The absolute best is constant time.
290
290
00:13:26,040 --> 00:13:28,780
If you can't get that, you want logarithmic time,
291
291
00:13:28,780 --> 00:13:30,480
log to the base two N.
292
292
00:13:30,480 --> 00:13:32,670
Linear time is still pretty good.
293
293
00:13:32,670 --> 00:13:37,250
Once you start getting up into N log N and N squared,
294
294
00:13:37,250 --> 00:13:39,490
you can see that the number of steps required
295
295
00:13:39,490 --> 00:13:43,233
rises sharply as the number of items increases.
296
296
00:13:46,490 --> 00:13:49,010
In conclusion, this is what you have to remember really,
297
297
00:13:49,010 --> 00:13:51,580
big O notation gives us a way of comparing
298
298
00:13:51,580 --> 00:13:54,140
the time complexity of different algorithms
299
299
00:13:54,140 --> 00:13:57,940
in an objective manner, in a hardware-independent manner.
300
300
00:13:57,940 --> 00:14:00,848
Okay, so that's it for now for the big O notation.
301
301
00:14:00,848 --> 00:14:05,060
We'll revisit this as we go through the course.
302
302
00:14:05,060 --> 00:14:06,710
So I'll see you in the next video.