1
00:00:04,990 --> 00:00:08,830
Now that we know the general gist of path
tracing, let's take a look at how the node
2
00:00:08,830 --> 00:00:10,910
tree evaluation happens.
3
00:00:10,910 --> 00:00:15,379
Keep in mind that the node tree is evaluated
each time a ray hits a mesh.
4
00:00:15,379 --> 00:00:17,750
First we need to understand what a node is.
5
00:00:17,750 --> 00:00:19,910
A node is basically just a function.
6
00:00:19,910 --> 00:00:24,550
It takes some inputs, processes them in some
way, and gives some outputs.
7
00:00:24,550 --> 00:00:28,980
There are some exceptional nodes that we'll
look at, but for the most part that's it.
8
00:00:28,980 --> 00:00:33,800
The key point to remember is that except for
some special nodes, the output of a node depends
9
00:00:33,800 --> 00:00:35,690
only on its inputs.
10
00:00:35,690 --> 00:00:39,390
Given the same inputs, it will always give
the exact same output.
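In code terms, a node of this kind can be pictured as a pure function. This is a hypothetical sketch to illustrate the idea, not Blender's actual implementation:

```python
# Hypothetical model of a node as a pure function.
# Same inputs in, same output out -- every time.

def math_node_add(a, b):
    """A Math node in Add mode: two inputs, one output."""
    return a + b

print(math_node_add(1.0, 2.0))  # 3.0
print(math_node_add(1.0, 2.0))  # 3.0 again, deterministically
```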
11
00:00:39,390 --> 00:00:42,040
So for instance, here we have a Math node.
12
00:00:42,040 --> 00:00:45,790
In the Add mode, it takes two inputs and generates
one output.
13
00:00:45,790 --> 00:00:48,900
The output is simply the result of adding
the two inputs together.
14
00:00:48,900 --> 00:00:54,390
Here, it should be pretty obvious that given
the same inputs we always get the same output,
15
00:00:54,390 --> 00:00:57,400
but we'll look at some less obvious examples
later.
16
00:00:57,400 --> 00:01:00,280
Let's take a look at a simple node tree.
17
00:01:00,280 --> 00:01:03,010
Here we are adding the values one and two
together.
18
00:01:03,010 --> 00:01:08,040
Then, the output, which should be three, is
being fed into a multiplication, and multiplied
19
00:01:08,040 --> 00:01:11,859
by 0.5, resulting in 1.5.
20
00:01:11,859 --> 00:01:16,150
Then we are making a color, using the output
of the addition for the Red channel, and the
21
00:01:16,150 --> 00:01:19,140
output of the multiplication for the green
and blue channels.
22
00:01:19,140 --> 00:01:24,229
So at this point red should be three, and
green and blue should be 1.5.
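This little tree is just function composition. Here is a sketch of it in plain Python; the function names mirror the nodes in the video, and none of this is Blender API code:

```python
# Hypothetical sketch of the example node tree as function composition.

def add(a, b):
    return a + b

def multiply(a, b):
    return a * b

def combine_color(r, g, b):
    return (r, g, b)

s = add(1.0, 2.0)               # Math node (Add): 1 + 2 = 3
m = multiply(s, 0.5)            # Math node (Multiply): 3 * 0.5 = 1.5
color = combine_color(s, m, m)  # Red = 3, Green = Blue = 1.5
print(color)  # (3.0, 1.5, 1.5)
```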
23
00:01:24,229 --> 00:01:26,880
This then goes into an Emission shader.
24
00:01:26,880 --> 00:01:30,940
A shader is a somewhat special type of node,
which determines the surface behavior, so
25
00:01:30,940 --> 00:01:35,079
basically a shader decides what happens to
a ray that hits the surface.
26
00:01:35,079 --> 00:01:38,969
For now it's enough to know that an Emission
shader causes the ray to stop and simply return
27
00:01:38,969 --> 00:01:41,710
its color value, without bouncing further.
28
00:01:41,710 --> 00:01:47,560
So an Emission shader behaves as a light source,
just as if the ray had reached an actual lamp.
29
00:01:47,560 --> 00:01:51,350
Lastly the shader is fed into the Surface
socket of the Material Output.
30
00:01:51,350 --> 00:01:55,380
The Material Output is what tells the render
engine what to actually compute.
31
00:01:55,380 --> 00:01:59,549
The render engine will look for the Material
Output node, and then evaluate all the nodes
32
00:01:59,549 --> 00:02:01,079
connected to it.
33
00:02:01,079 --> 00:02:05,039
The Surface socket just tells the render engine
that we want to use this shader for the actual
34
00:02:05,039 --> 00:02:09,959
surface of the object, while the Volume socket
is for shaders that are evaluated inside an
35
00:02:09,959 --> 00:02:14,550
object's volume, which is useful for things
like clouds and smoke, but we won't be looking
36
00:02:14,550 --> 00:02:16,560
into those in this course.
37
00:02:16,560 --> 00:02:21,610
Now, since we are using only fixed values
and no variable inputs in our computation,
38
00:02:21,610 --> 00:02:25,470
we should get the same result on the whole
object, and since we're using an Emission
39
00:02:25,470 --> 00:02:29,610
shader, the rendered color should directly
reflect the result of our computation.
40
00:02:29,610 --> 00:02:33,780
I applied this material to a cube, so let's
take a look at the result.
41
00:02:33,780 --> 00:02:38,270
Indeed, we got a single color, and we can't
even differentiate the faces of the cube,
42
00:02:38,270 --> 00:02:42,410
as no external lighting is being applied,
since all rays coming from the camera terminate
43
00:02:42,410 --> 00:02:44,370
immediately once they hit the cube.
44
00:02:44,370 --> 00:02:49,390
Now to check that the color is actually what
we computed, we can right click on the cube.
45
00:02:49,390 --> 00:02:53,260
Then at the very bottom, Blender gives us
some information about the pixel we are clicking
46
00:02:53,260 --> 00:02:57,810
on, and here we see that indeed the red value
is three, and the blue and green values are
47
00:02:57,810 --> 00:02:59,590
both 1.5.
48
00:02:59,590 --> 00:03:04,320
To the right, we also see different values
for red, green, and blue.
49
00:03:04,320 --> 00:03:09,260
Those are the values after color management
is applied, as indicated by the CM.
50
00:03:09,260 --> 00:03:11,880
We'll look at color management in a separate
chapter.
51
00:03:11,880 --> 00:03:17,290
For now, we are only interested in the raw
values computed directly from our shader.
52
00:03:17,290 --> 00:03:21,160
Going back to the node tree, we've looked
at it from left to right, as that's the direction
53
00:03:21,160 --> 00:03:22,160
of data flow.
54
00:03:22,160 --> 00:03:26,710
If we are only looking at a set of sequential
operations, this makes sense, but when we
55
00:03:26,710 --> 00:03:30,960
start looking at more complex trees, with
several branches, it's difficult to know on
56
00:03:30,960 --> 00:03:35,350
which branch to start, so it actually makes
more sense to read from right to left, as
57
00:03:35,350 --> 00:03:38,270
every tree ends with a Material Output.
58
00:03:38,270 --> 00:03:42,480
So in this case we would look at the output,
and see that it's using an Emission shader,
59
00:03:42,480 --> 00:03:46,430
that's already a lot of useful information,
and then we can follow the connections to
60
00:03:46,430 --> 00:03:49,100
see where the color is coming from, and so
on.
61
00:03:49,100 --> 00:03:52,910
This gives us the most important information
first, and lets us make informed decisions
62
00:03:52,910 --> 00:03:56,430
on which branches of the tree to inspect next.
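This right-to-left reading matches how the evaluation itself can be thought of: pull values starting from the output, recursively evaluating whatever each node depends on. A hypothetical sketch, with an invented tree encoding purely for illustration:

```python
# Hypothetical pull-based evaluation: start at the node feeding the
# Material Output and recursively evaluate upstream dependencies.
# The dictionary encoding of the tree is invented for this sketch.

NODES = {
    "add":      {"fn": lambda a, b: a + b,
                 "inputs": [("const", 1.0), ("const", 2.0)]},
    "multiply": {"fn": lambda a, b: a * b,
                 "inputs": [("node", "add"), ("const", 0.5)]},
    "color":    {"fn": lambda r, g, b: (r, g, b),
                 "inputs": [("node", "add"), ("node", "multiply"),
                            ("node", "multiply")]},
}

def evaluate(name):
    """Evaluate a node by first evaluating everything it depends on."""
    node = NODES[name]
    args = [evaluate(src) if kind == "node" else src
            for kind, src in node["inputs"]]
    return node["fn"](*args)

# The render engine would start from what's plugged into the Surface socket:
print(evaluate("color"))  # (3.0, 1.5, 1.5)
```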
63
00:03:56,430 --> 00:04:01,260
I've mentioned a few times that there are
some special nodes, so let's look at those.
64
00:04:01,260 --> 00:04:05,180
The special nodes are those in the input and
output categories.
65
00:04:05,180 --> 00:04:08,940
From the output category we already talked
about the Material Output node, which must
66
00:04:08,940 --> 00:04:13,700
exist in every material, and it's the only
output node that we'll cover in this course.
67
00:04:13,700 --> 00:04:17,600
The output nodes are special because they
are the only nodes that don't have any output
68
00:04:17,600 --> 00:04:21,810
sockets, and they don't actually process anything;
they just tell the render engine where to
69
00:04:21,810 --> 00:04:26,979
look to find the relevant parts of our tree
and start the node tree evaluation.
70
00:04:26,979 --> 00:04:30,199
But more interesting than the output nodes,
are the input nodes.
71
00:04:30,199 --> 00:04:34,940
For the most part, these nodes also don't
do any processing, though there are some exceptions,
72
00:04:34,940 --> 00:04:37,860
and most of them don't have any input sockets.
73
00:04:37,860 --> 00:04:42,970
We already looked at the simplest input node,
the Value node, which just lets us input values
74
00:04:42,970 --> 00:04:44,449
into our tree.
75
00:04:44,449 --> 00:04:48,949
These are pretty boring, and mostly unnecessary,
as we can just type values in the other nodes
76
00:04:48,949 --> 00:04:53,860
directly, but they are handy when we need
to input the same value into multiple nodes.
77
00:04:53,860 --> 00:04:58,050
The really interesting thing about input nodes,
which was hinted at in the path tracing basics
78
00:04:58,050 --> 00:05:03,819
video, is that many of them allow us to access
crucial information about the object, mesh,
79
00:05:03,819 --> 00:05:07,979
and ray that are being processed, as well
as some other more specialized things like
80
00:05:07,979 --> 00:05:12,729
hair and particle meta-data, but we won't
be looking at those in this course.
81
00:05:12,729 --> 00:05:14,699
Here are some of the most common input nodes.
82
00:05:14,699 --> 00:05:19,979
The Geometry node gives us a bunch of information
about the mesh, like the world space coordinate
83
00:05:19,979 --> 00:05:24,280
at which the current ray hit the mesh, the
normal at that same point, and whether the
84
00:05:24,280 --> 00:05:28,020
face is pointing towards the ray direction
or away from it.
85
00:05:28,020 --> 00:05:32,160
The Texture Coordinate node gives us things
like the UV coordinates of the point at which
86
00:05:32,160 --> 00:05:36,260
the current ray hit the mesh, as well as other
useful coordinates.
87
00:05:36,260 --> 00:05:39,129
We'll look more at these in a dedicated chapter.
88
00:05:39,129 --> 00:05:43,189
The Light Path node gives us some information
about the current ray, like if it came directly
89
00:05:43,189 --> 00:05:47,080
from the camera, or if it got reflected or
refracted in different ways before reaching
90
00:05:47,080 --> 00:05:48,599
the current point.
91
00:05:48,599 --> 00:05:52,749
It also tells us how many times the path
bounced to reach this point, and how far it
92
00:05:52,749 --> 00:05:54,319
traveled through space.
93
00:05:54,319 --> 00:05:58,789
Note that I kept saying that things refer
to the point at which the current ray hit
94
00:05:58,789 --> 00:05:59,930
the mesh.
95
00:05:59,930 --> 00:06:03,900
That's because it's important to understand
that (with very few exceptions) whenever the
96
00:06:03,900 --> 00:06:08,180
node tree is evaluated, it doesn't have any
knowledge of the surroundings.
97
00:06:08,180 --> 00:06:12,840
The node tree is evaluated each time a ray
hits a surface, and it only has information
98
00:06:12,840 --> 00:06:15,229
about that single point in space.
99
00:06:15,229 --> 00:06:19,770
When a ray hits a surface, all these different
attributes are computed for that one point,
100
00:06:19,770 --> 00:06:22,610
and we can access them with the input nodes.
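One way to picture this is that each ray hit produces a small record of attributes for that single point, and the input nodes just read from it. The field names below are invented for this sketch; it is not how Blender actually stores hit data:

```python
# Hypothetical model of per-hit data exposed by input nodes.
# Every field describes exactly one point in space -- the hit point.

from dataclasses import dataclass

@dataclass
class HitPoint:
    position: tuple       # world-space hit coordinates (Geometry node)
    normal: tuple         # surface normal at the hit (Geometry node)
    uv: tuple             # UV coordinates at the hit (Texture Coordinate node)
    is_camera_ray: bool   # ray came straight from the camera (Light Path node)

def geometry_node(hit):
    # An input node has no input sockets; it only reads the hit record.
    return {"Position": hit.position, "Normal": hit.normal}

hit = HitPoint(position=(0.2, 0.0, 1.0), normal=(0.0, 0.0, 1.0),
               uv=(0.5, 0.5), is_camera_ray=True)
print(geometry_node(hit)["Position"])  # (0.2, 0.0, 1.0)
```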
101
00:06:22,610 --> 00:06:26,729
The input nodes are also the only nodes that
depend on data from the scene, and therefore
102
00:06:26,729 --> 00:06:31,699
are the only nodes that can give different
outputs when given the same inputs, in fact,
103
00:06:31,699 --> 00:06:35,560
these nodes we are looking at don't take any
input at all.
104
00:06:35,560 --> 00:06:40,059
Every single other node that is not an input
or output node takes some kind of input, and
105
00:06:40,059 --> 00:06:45,659
processes it into some kind of output, without
any awareness of the scene context.
106
00:06:45,659 --> 00:06:50,499
Given a constant input, they will always give
the same output for every ray, everywhere,
107
00:06:50,499 --> 00:06:51,499
every time.
108
00:06:51,499 --> 00:06:56,229
Now, if you've done a bit of procedural texturing
in Blender before, this last point may seem
109
00:06:56,229 --> 00:07:00,789
to be false, as there is a whole class of
nodes that doesn't seem to output the same values
110
00:07:00,789 --> 00:07:01,789
everywhere.
111
00:07:01,789 --> 00:07:03,879
I'm talking about the texture nodes.
112
00:07:03,879 --> 00:07:08,190
For instance, here we have a Noise Texture
node, with a certain set of inputs, and we
113
00:07:08,190 --> 00:07:12,839
don't seem to be feeding it any input values
that vary throughout space, yet it clearly
114
00:07:12,839 --> 00:07:15,539
does not output the same color everywhere.
115
00:07:15,539 --> 00:07:17,809
It's as if it is aware of where it is in space.
116
00:07:17,809 --> 00:07:22,120
That is because it is actually being fed spatial
coordinates.
117
00:07:22,120 --> 00:07:26,319
For all the texture nodes, when we leave the
Vector input disconnected, Blender sneakily
118
00:07:26,319 --> 00:07:31,370
feeds them Generated texture coordinates,
except for image textures, which get UV coordinates
119
00:07:31,370 --> 00:07:32,560
instead.
120
00:07:32,560 --> 00:07:36,900
Blender does this to make our lives easier,
as we usually do want textures to vary throughout
121
00:07:36,900 --> 00:07:41,289
space, and it would probably seem weird if
the default output of a texture node was just
122
00:07:41,289 --> 00:07:42,979
a solid color.
123
00:07:42,979 --> 00:07:47,409
That means that this setup, with only the Noise
node, is exactly equivalent to this one where
node, is exactly equivalent to this one where
124
00:07:47,409 --> 00:07:51,089
we explicitly feed it Generated texture coordinates.
125
00:07:51,089 --> 00:07:55,099
And if we look at the Generated coordinates
themselves, they do indeed vary throughout
126
00:07:55,099 --> 00:07:56,099
space.
127
00:07:56,099 --> 00:07:59,789
So it turns out that the variable data in
the texture nodes is actually the same that
128
00:07:59,789 --> 00:08:04,460
comes from an input node, it's just that it's
being passed in implicitly.
129
00:08:04,460 --> 00:08:09,220
To show that when feeding constant data into
a texture node, we always get a consistent
130
00:08:09,220 --> 00:08:13,290
output, just like with any other normal node,
we can actually feed a constant vector to
131
00:08:13,290 --> 00:08:18,259
the Noise, and then we see that indeed it
outputs the same color every time.
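The implicit-coordinate behavior can be sketched like this. The "noise" below is a stand-in deterministic hash, not Blender's actual Noise Texture algorithm, and the fallback logic is an assumption made explicit for illustration:

```python
# Hypothetical sketch: a texture node is a pure function of its Vector
# input; leaving Vector unconnected just means the engine implicitly
# feeds in the hit point's Generated coordinates.

import math

def noise_texture(vector):
    """Deterministic pseudo-noise: a pure function of its input vector."""
    x, y, z = vector
    return math.sin(x * 12.9898 + y * 78.233 + z * 37.719) * 0.5 + 0.5

def evaluate_texture(generated_coords, vector_input=None):
    # Nothing plugged into Vector? Fall back to Generated coordinates.
    coords = vector_input if vector_input is not None else generated_coords
    return noise_texture(coords)

a = evaluate_texture((0.1, 0.2, 0.3))   # varies with the hit point
b = evaluate_texture((0.7, 0.8, 0.9))   # different point, different value
c = evaluate_texture((0.1, 0.2, 0.3), (1.0, 1.0, 1.0))  # constant vector
d = evaluate_texture((0.7, 0.8, 0.9), (1.0, 1.0, 1.0))  # same constant input
print(c == d)  # True: constant input, constant output
```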
132
00:08:18,259 --> 00:08:22,539
This consistent behavior of the nodes is a
really important point to understand, in order
133
00:08:22,539 --> 00:08:23,909
to master procedural texturing.