Beginning Metal - Part 12: Diffuse and Specular | Ray Wenderlich

In this Metal video tutorial, you’ll learn how to add shading and highlights to your models with diffuse and specular lighting.


This is a companion discussion topic for the original entry at https://www.raywenderlich.com/3537-beginning-metal/lessons/12

Hey there! Thanks for the great tutorial!

I wonder about eye position, we calculate it like so:

vertexOut.eyePosition = (modelConstants.modelViewMatrix * vertexIn.position).xyz;

Which I find a bit confusing, since that’s awfully close to how we calculate the vertex position:

float4x4 matrix = sceneConstants.projectionMatrix * modelConstants.modelViewMatrix;
vertexOut.position = matrix * vertexIn.position;

I’d expect an eye position vector to be related to the camera, not to the vertex as it is here.

Could anybody please offer an explanation or suggest a read on this?
Thanks

I think this vector is not an actual eye position, but rather represents the direction from the eye to the vertex (with the eye position presumably treated as the origin in this case).

And if the “direction from eye to vertex” matches the reflection direction, the specular lighting factor is higher.

But I still don’t get why we don’t use projection matrix here. Still would appreciate a read or an explanation.

@bpashch - You’re right in that it’s not the “eyePosition”. It is the direction from the vertex to the camera.

The vertex starts in object space, and the modelViewMatrix moves it to camera space.

In camera space, your camera is at the origin, at (0, 0, 0). If your vertex is at (0, 0, 2), then that’s the direction from the camera. Projection doesn’t come into it: all calculations here are done in camera space. Projection is for flattening 3D vertices into 2D space.

Nowadays I tend to do the lighting calculation in world space rather than camera space. So I just multiply by the modelMatrix and send the fragment shader the camera’s world position, which gives me the direction between the camera and the vertex in world space. The light position is generally in world space too, so I find it easier to compare the camera and light directions and the reflection about the vertex.

As long as you do your calculations in the same space, it doesn’t really matter.
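As a sketch of that idea, here is a camera-space specular calculation in plain C++ standing in for MSL (the float3 helpers and function names below are illustrative, not the tutorial’s shader code). The key step is that the camera sits at the origin, so the view direction is just the negated camera-space position of the fragment:

```cpp
#include <algorithm>
#include <cmath>

// Minimal stand-in for MSL's float3 (illustrative only).
struct float3 { float x, y, z; };

float dot(float3 a, float3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

float3 normalize(float3 v) {
    float len = std::sqrt(dot(v, v));
    return {v.x / len, v.y / len, v.z / len};
}

// Same convention as MSL's reflect(): i is the incident direction.
float3 reflect(float3 i, float3 n) {
    float d = 2.0f * dot(i, n);
    return {i.x - d * n.x, i.y - d * n.y, i.z - d * n.z};
}

// Phong specular factor in camera space. Because the camera is at the
// origin, the direction from the fragment back to the camera is simply
// the negated camera-space position (the "eyePosition" in the shader).
float specularFactor(float3 eyePosition, float3 normal,
                     float3 lightDirection, float shininess) {
    float3 viewDir = normalize({-eyePosition.x, -eyePosition.y, -eyePosition.z});
    float3 r = reflect(normalize(lightDirection), normalize(normal));
    return std::pow(std::max(dot(r, viewDir), 0.0f), shininess);
}
```

The same math works in world space if you instead build the view direction from the camera’s world position and the fragment’s world position.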

https://www.scratchapixel.com is probably the most complete resource, but it can get a bit technical at times.

LearnOpenGL - Basic Lighting is a good explanation for Ambient/Diffuse/Specular. Ignore the OpenGL code - the explanation is good.

@Caroline Thanks for the explanation and for the read! I appreciate that you responded and that it was surprisingly fast :slight_smile:

@Caroline a couple of questions on normals:

  • For the plane primitive, what should the normal be? For the positions below, I’d expect the normal to be float3(0, 0, 1), pointing straight along the z axis since the plane is perpendicular to z, but what actually works is float3(1, 0, 0), which makes little sense to me:
    let positions: [float3] = [float3(-1, 1, 0), float3(-1, -1, 0), float3(1, -1, 0), float3(1, 1, 0)]

[screenshot]

  • For the cube primitive, what should the normals be? I’d expect the normals to equal the vertex positions, but that doesn’t work:

[screenshot]

I notice that in this RW tutorial normals are specified per vertex, but the primitives here are drawn without indexing.

Here’s my implementation of cube: GitHub

@caroline and another question: I notice that humanFigure.obj (and a couple of other models) doesn’t have normals in it (there are no vn occurrences in the file).

So I wonder whether it’s possible to build the normals in code from the mesh somehow, or whether it makes more sense to load the model into some tool that can build the normals and re-export it (while still preserving the texture data). What do you think is the best way to approach this?
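For what it’s worth, building smooth normals in code is usually done by summing each triangle’s cross-product normal into its three vertices and normalizing at the end. A minimal sketch in plain C++, assuming an indexed triangle mesh (the helper names are illustrative, not from the tutorial):

```cpp
#include <cmath>
#include <vector>

struct float3 { float x, y, z; };

float3 sub(float3 a, float3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
float3 add(float3 a, float3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }

float3 cross(float3 a, float3 b) {
    return {a.y * b.z - a.z * b.y,
            a.z * b.x - a.x * b.z,
            a.x * b.y - a.y * b.x};
}

float3 normalize(float3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}

// Accumulate the (area-weighted) face normal of every triangle that
// touches a vertex, then normalize, giving smooth per-vertex normals.
std::vector<float3> buildNormals(const std::vector<float3>& positions,
                                 const std::vector<int>& indices) {
    std::vector<float3> normals(positions.size(), {0, 0, 0});
    for (size_t i = 0; i + 2 < indices.size(); i += 3) {
        float3 a = positions[indices[i]];
        float3 b = positions[indices[i + 1]];
        float3 c = positions[indices[i + 2]];
        float3 n = cross(sub(b, a), sub(c, a)); // face normal, CCW winding
        for (int k = 0; k < 3; ++k)
            normals[indices[i + k]] = add(normals[indices[i + k]], n);
    }
    for (float3& n : normals) n = normalize(n);
    return normals;
}
```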

Cheers

P.S. if that’s something that will be talked about in RW Metal by Tutorials book - please feel free to disregard the question, as I’m going to be reading it next :relaxed:

@bpashch - you could investigate what the normals are in the GPU debugger.

I note that you’re going to be reading Metal by Tutorials book next week - I hope you enjoy it :blush:

I can’t remember what went on in this series as it was a couple of years ago. The first chapter of the book loads up a Model I/O primitive. I just tried it with a plane, and it does seem to have weird numbers. But I then created a plane in Blender and used that, and it had numbers that you’d expect:

[screenshot]

I tend to use Blender obj models rather than the Model I/O primitives, as I’m usually importing objs anyway.

The book does go into obj models, and encourages you to look at them in Blender. (It uses v2.79 I think, rather than the new beta 2.8)

Between Blender and the Xcode shader debugger, you can see everything that’s going on.

You can add normals in code with Model I/O’s MDLMesh.addNormals(withAttributeNamed:creaseThreshold:) (see the Apple Developer Documentation).

(That’s not in the book I don’t think, although adding tangent values is)

However, because any app that is using external obj models will probably expect them all to be in the same format, personally, I’d simply take the obj into Blender and reexport it. I tried that with humanFigure.obj and it turned out fine.

[screenshot]

If you are going to be using obj models, I strongly suggest you get at least slightly familiar with looking at them in a 3d app. Blender is free, which is why that’s the one used in the book, but any 3d modeler, such as Maya, Maya LT, Cheetah 3D etc will be fine.

(One thing to be aware of, if you go into Blender, is that the Z axis is the one that points up - as opposed to the Y axis pointing up in the Metal app)
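That axis change is just a rotation of -90 degrees about the x axis. A minimal sketch in plain C++ of one common convention (Blender’s OBJ exporter normally applies an equivalent transform for you when exporting with Y up, so this is only for hand-converted data):

```cpp
struct float3 { float x, y, z; };

// One common Z-up to Y-up conversion: rotate -90 degrees about the
// x axis, i.e. (x, y, z) -> (x, z, -y). Blender's up axis (+z) lands
// on Metal's up axis (+y), and handedness is preserved.
float3 zUpToYUp(float3 v) { return {v.x, v.z, -v.y}; }
```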

@bpashch - This is a screen capture in Blender, which suggests that the normals of a Blender primitive cube are, as you suggest, the same as the vertex positions (normals are in blue):

[screenshot]

However, they are just numbers, so whether they work depends on the other numbers involved: what the matrices and the projection are, whether it’s a left- or right-handed coordinate system, and so on.

Correction: although that is how Blender displays them, that’s not how they are exported from Blender and read into Model I/O.

This is the exported cube file. The normals are per face, not per vertex. Presumably Model I/O takes care of that on the way in.

# Blender v2.77 (sub 0) OBJ File: ''
# www.blender.org
mtllib cube.mtl
o Cube
v 1.000000 -1.000000 -5.323584
v 1.000000 -1.000000 -3.323584
v -1.000000 -1.000000 -3.323584
v -1.000000 -1.000000 -5.323585
v 1.000000 1.000000 -5.323584
v 0.999999 1.000000 -3.323584
v -1.000000 1.000000 -3.323585
v -1.000000 1.000000 -5.323584
vn 0.0000 -1.0000 0.0000
vn 0.0000 1.0000 0.0000
vn 1.0000 0.0000 0.0000
vn -0.0000 -0.0000 1.0000
vn -1.0000 -0.0000 -0.0000
vn 0.0000 0.0000 -1.0000
usemtl Material
s off
f 1//1 2//1 3//1 4//1
f 5//2 8//2 7//2 6//2
f 1//3 5//3 6//3 2//3
f 2//4 6//4 7//4 3//4
f 3//5 7//5 8//5 4//5
f 5//6 1//6 4//6 8//6
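To illustrate what the v//vn indexing above means, here is a rough C++ sketch (not Model I/O’s actual implementation) of de-indexing face records into a flat vertex array, where each corner copies its position and its face’s normal:

```cpp
#include <vector>

struct float3 { float x, y, z; };

// One corner of an OBJ face record "v//vn": 1-based position and
// normal indices, no texture coordinate.
struct Corner { int v, vn; };

struct Vertex { float3 position; float3 normal; };

// De-index OBJ faces into a flat, unindexed vertex array. Every corner
// of every face gets its own Vertex, so per-face normals in the file
// end up duplicated per corner, which is roughly what the GPU sees.
std::vector<Vertex> expandFaces(const std::vector<float3>& positions,
                                const std::vector<float3>& normals,
                                const std::vector<std::vector<Corner>>& faces) {
    std::vector<Vertex> out;
    for (const auto& face : faces)
        for (const auto& c : face)
            out.push_back({positions[c.v - 1], normals[c.vn - 1]});
    return out;
}
```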

These are the normals being used (imported by Model I/O)

[screenshot]

(I stuffed up the z position by moving it accidentally in Blender btw!)

@Caroline appreciate the explanation. Both addNormals and reexporting model from blender works nicely, thanks :+1:t2:
The weird numbers in the GPU debugger are exactly what I experienced. Obj it is for now, then; I guess it’s not a big loss :slight_smile:

Thanks again for your work Caroline :slightly_smiling_face: