1
00:00:00,210 --> 00:00:02,760
Hello and welcome to this new tutorial.
2
00:00:02,760 --> 00:00:08,640
Previously, we defined a generator through our class G, which contains the architecture of the neural
3
00:00:08,640 --> 00:00:16,040
network inside the init function, and the forward function to propagate the signal inside this architecture.
4
00:00:16,170 --> 00:00:21,660
And now that we have defined the class, we are ready to create as many objects as we want, that is, as
5
00:00:21,660 --> 00:00:26,870
many generators as we want. But we only need one, and that's the one we'll create.
6
00:00:26,940 --> 00:00:28,460
In this tutorial.
7
00:00:28,530 --> 00:00:33,290
So to create an object of the class we need to choose a name for this object.
8
00:00:33,510 --> 00:00:40,350
And the name we'll choose is netG, for, as you might have guessed, the neural network of the generator,
9
00:00:41,090 --> 00:00:44,780
G. And then, we need to create a new object of the class.
10
00:00:44,840 --> 00:00:49,850
Well, there is nothing simpler: you take your G class and then you add some parentheses.
11
00:00:49,890 --> 00:00:52,170
Why do you only need to add some parentheses?
12
00:00:52,290 --> 00:00:59,190
It's because in the arguments of the class we only inherited from the nn.Module, and we didn't put any
13
00:00:59,250 --> 00:00:59,890
argument.
14
00:00:59,940 --> 00:01:04,490
So basically there is no argument, and therefore there is no argument to put here.
15
00:01:04,530 --> 00:01:06,410
Hence the empty parentheses.
16
00:01:06,540 --> 00:01:07,190
Perfect.
17
00:01:07,200 --> 00:01:12,910
And so, in a flash, we got our generator neural network.
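Put together, this step looks like the snippet below. The layer stack here is only a small stand-in (the real architecture comes from the previous tutorial); the point is that G's init takes no arguments, so empty parentheses are enough to create the object:

```python
import torch.nn as nn

class G(nn.Module):  # stand-in for the class defined in the previous tutorial
    def __init__(self):
        super(G, self).__init__()  # inherit from nn.Module, no extra arguments
        # the architecture lives in a sequential container (shortened here)
        self.main = nn.Sequential(
            nn.ConvTranspose2d(100, 64, 4, 1, 0, bias=False),
            nn.BatchNorm2d(64),
            nn.ReLU(True),
            nn.ConvTranspose2d(64, 3, 4, 2, 1, bias=False),
            nn.Tanh(),
        )

    def forward(self, input):
        # propagate the signal through the architecture
        return self.main(input)

# __init__ takes no arguments, hence the empty parentheses:
netG = G()
```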
18
00:01:13,080 --> 00:01:14,250
Congratulations.
19
00:01:14,400 --> 00:01:21,390
And now, as I said at the end of the previous tutorial, we need to initialize the weights the proper way
20
00:01:21,510 --> 00:01:28,290
to respect the convention of the adversarial networks, and to do this we have the weights_init function
21
00:01:28,290 --> 00:01:29,760
that can do that for us.
22
00:01:29,760 --> 00:01:32,450
So I'm quickly going to explain what it's going to do.
23
00:01:32,640 --> 00:01:36,780
As you can see, we start with the classname variable.
24
00:01:36,780 --> 00:01:44,030
It is some kind of a search tool that will look for some names in the definition of the class. So
25
00:01:44,040 --> 00:01:49,980
it will look for some names inside this class, and the names it's going to look for are Conv and BatchNorm.
26
00:01:49,980 --> 00:01:54,360
And since ConvTranspose2d contains Conv,
27
00:01:54,550 --> 00:01:57,220
well, it will find ConvTranspose2d.
28
00:01:57,240 --> 00:02:06,090
And then it will initialize the weights with a mean of 0.0 and a standard deviation of 0.02 for the convolution modules. And then, same for
29
00:02:06,090 --> 00:02:06,550
BatchNorm.
30
00:02:06,570 --> 00:02:13,260
It's going to look for any name in the class that contains BatchNorm, which of course matches the BatchNorm2d
31
00:02:13,380 --> 00:02:16,110
modules that batch-normalize the feature maps.
32
00:02:16,180 --> 00:02:22,530
And on each of these layers related to the batch norms, inside each of these BatchNorm layers, it will
33
00:02:22,560 --> 00:02:26,140
initialize the weights with a mean of 1.0 and a standard deviation of 0.02.
34
00:02:26,280 --> 00:02:34,140
And remember, in each layer we also have some biases, and all the biases at the batch norm level will be initialized
35
00:02:34,140 --> 00:02:34,790
to zero.
36
00:02:35,010 --> 00:02:41,160
So that's exactly what it's going to do, and it's using this classname trick to look for the convolutions
37
00:02:41,370 --> 00:02:46,860
and the batch normalizations inside the class, to initialize these weights the right way.
38
00:02:46,860 --> 00:02:48,560
All right so that's how it works.
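In code, a weights_init function matching the description above can be sketched like this (this is the conventional DCGAN initializer; the mean/std values follow the explanation given here):

```python
import torch.nn as nn

def weights_init(m):
    # m is each module of the network; classname is its class name as a string
    classname = m.__class__.__name__
    if classname.find('Conv') != -1:
        # convolutions (including ConvTranspose2d): normal(mean=0.0, std=0.02)
        m.weight.data.normal_(0.0, 0.02)
    elif classname.find('BatchNorm') != -1:
        # batch norm weights: normal(mean=1.0, std=0.02); biases set to zero
        m.weight.data.normal_(1.0, 0.02)
        m.bias.data.fill_(0)
```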
39
00:02:48,600 --> 00:02:55,740
And now, to apply this function, we just need to take our generator neural network, which we've
40
00:02:55,740 --> 00:03:05,070
just called netG, and then we add a dot, and then we're going to use the apply function to apply the weights_init
41
00:03:05,100 --> 00:03:05,760
function.
42
00:03:05,760 --> 00:03:11,510
So I'm just copying this and pasting it inside.
43
00:03:11,880 --> 00:03:16,290
All right, and this will just apply the weights_init function to our netG object.
44
00:03:16,290 --> 00:03:19,360
That is the neural network of our generator.
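Assuming the G class and the weights_init function from earlier (reproduced here in compact stub form so the snippet runs on its own), applying the initialization is one line:

```python
import torch.nn as nn

class G(nn.Module):  # compact stand-in for the real generator class
    def __init__(self):
        super(G, self).__init__()
        self.main = nn.Sequential(
            nn.ConvTranspose2d(100, 64, 4, 1, 0, bias=False),
            nn.BatchNorm2d(64),
            nn.ReLU(True),
        )

    def forward(self, input):
        return self.main(input)

def weights_init(m):
    classname = m.__class__.__name__
    if classname.find('Conv') != -1:
        m.weight.data.normal_(0.0, 0.02)
    elif classname.find('BatchNorm') != -1:
        m.weight.data.normal_(1.0, 0.02)
        m.bias.data.fill_(0)

netG = G()
# .apply recursively visits every sub-module and calls weights_init on each
netG.apply(weights_init)
```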
45
00:03:19,380 --> 00:03:20,170
All right.
46
00:03:20,220 --> 00:03:21,590
So congratulations.
47
00:03:21,600 --> 00:03:26,250
We now have a generator, a real generator neural network.
48
00:03:26,250 --> 00:03:31,830
So basically we are done with the first big step of this implementation of the deep convolutional GANs,
49
00:03:32,190 --> 00:03:35,940
which was all about defining and creating the generator.
50
00:03:35,940 --> 00:03:41,070
Now we're going to move on to the second big step of this implementation which will be about defining
51
00:03:41,250 --> 00:03:44,380
and creating this time the discriminator.
52
00:03:44,610 --> 00:03:46,840
So we'll do that in the next few tutorials.
53
00:03:46,860 --> 00:03:51,870
We will start by defining the class then we'll define the forward function and then eventually we'll
54
00:03:51,870 --> 00:03:55,200
create our discriminator object.
55
00:03:55,200 --> 00:03:56,990
Until then enjoy computer vision.