[Chapter 11] - How to start [Chapter 12] - Finding mistakes


First of all, I want to say that I really enjoy the book. I feel like it is the only accessible way to get started with Metal if you have no graphics programming experience.
The only downside, and that is purely my opinion, is that I find it quite confusing that the book doesn't keep building on the same project. I am following along in my own project so that I can add extra comments and notes, but when you reach chapters 10 and 11 it becomes hard to fold that information in, because those chapters don't follow the same project setup as the earlier ones.

For chapter 11, I don't know what a good way would be to implement the chapter's starting state in the architecture used in the chapter 9 project, so I would love some tips on that. I was able to add the transparency, fog toggle and blending to that project setup, but for chapter 11 I am a bit lost. For now I have decided to skip that section, hoping the rest of the book will give me more insight.

So I got started on chapter 12, and I am seeing a different result between my current project and the starter project, and I don't know what else to check. I compared the lights, the scene setup, the car's .obj, .mtl and texture files, and the vertex and PBR shaders, and to me they look the same. However, I am seeing a difference in the render:

This is the render in my project:

This is the render in the chapter 12 starting project:

You can really see the difference in the lights, grille and wheels. Any idea what could have gone wrong here? :thinking:

Thanks in advance, and again thanks a lot for the great book!

I’m glad you’re enjoying it :slight_smile: - I understand your difficulty with the changing projects. Sometimes it’s easier to separate out the concepts for learning; the main problem then is one of software architecture: putting the pieces back together again.

The difference between the two renders looks like the fragment shader to me - could also be the lighting setup.

If you want to upload the two projects here, or somewhere else if you can, I can take a look at them and see how they differ.


Thanks for the response! Is it ok if I invite you as a collaborator on my GitHub? I am not ready for it to be public yet :stuck_out_tongue_winking_eye: but I am totally fine with posting code related to the issue here, so we can all learn from each other's mistakes, of course. I think your GitHub is carolight, so I sent the request to that account.

Kind regards, and thank you for the effort!!

@caroline Hey Caroline, I am really sorry if I have wasted your time! I finally found it … I had old versions of the roughness and racing-car-color textures :man_facepalming:

For Chapter 11 can I do something like this?
This is my Scene:

class TesselationScene: Scene {

    let model = Model(vertices: [
        [-1,  0,  1],
        [ 1,  0, -1],
        [-1,  0, -1],
        [-1,  0,  1],
        [ 1,  0, -1],
        [ 1,  0,  1]
    ], vertexFunction: "vertex_basic", fragmentFunction: "fragment_basic")

    let arcBallCamera: Camera = {
        let camera = ArcballCamera()
        camera.distance = 8
        camera.target = [0, 0, 0]
        camera.rotation.x = Float(-15).degreesToRadians
        camera.rotation.y = Float(-105).degreesToRadians
        return camera
    }()

    override func setupScene() {
        [model].forEach { add(node: $0) }
        inputController.keyboardDelegate = self
        cameras = [arcBallCamera]
    }
}

Now I made an extra init on Model:

    init(vertices: [float3],
         vertexFunction: String = "vertex_main",
         fragmentFunction: String = "fragment_mainPBR",
         position: float3 = [0, 0, 0],
         rotation: float3 = [0, 0, 0],
         scale: float3 = [1, 1, 1],
         tiling: UInt32 = 1,
         transparencyEnabled: Bool = true,
         blendingEnabled: Bool = false) {
        self.tiling = tiling
        self.blendingEnabled = blendingEnabled
        self.transparencyEnabled = transparencyEnabled
        let allocator = MTKMeshBufferAllocator(device: Renderer.device)
        let vertexBuffer = allocator.newBuffer(MemoryLayout<float3>.stride * vertices.count, type: .vertex)
        let vertexMap = vertexBuffer.map()
        vertexMap.bytes.assumingMemoryBound(to: float3.self).assign(from: vertices, count: vertices.count)
        let indices: [UInt16] = (UInt16(0)..<UInt16(vertices.count)).map { $0 }
        let indexBuffer = allocator.newBuffer(MemoryLayout<UInt16>.stride * indices.count, type: .index)
        let indexMap = indexBuffer.map()
        indexMap.bytes.assumingMemoryBound(to: UInt16.self).assign(from: indices, count: indices.count)

        let submesh = MDLSubmesh(indexBuffer: indexBuffer,
                                 indexCount: indices.count,
                                 indexType: .uInt16,
                                 geometryType: .triangles,
                                 material: nil)

        let vertexDescriptor = MDLVertexDescriptor()
        vertexDescriptor.attributes[0] = MDLVertexAttribute(name: MDLVertexAttributePosition,
                                                            format: .float2,
                                                            offset: 0,
                                                            bufferIndex: 0)
        let mdlMesh = MDLMesh(vertexBuffer: vertexBuffer,
                              vertexCount: vertices.count,
                              descriptor: vertexDescriptor,
                              submeshes: [submesh])

        var mtkMeshes: [MTKMesh] = []
        let mdlMeshes = [mdlMesh]
        mdlMeshes.forEach { mdlMesh in
            mdlMesh.addTangentBasis(forTextureCoordinateAttributeNamed: MDLVertexAttributeTextureCoordinate,
                                    tangentAttributeNamed: MDLVertexAttributeTangent,
                                    bitangentAttributeNamed: MDLVertexAttributeBitangent)
            Model.vertexDescriptor = mdlMesh.vertexDescriptor
            guard let mtkMesh = try? MTKMesh(mesh: mdlMesh, device: Renderer.device) else { return }
            mtkMeshes.append(mtkMesh)
        }
        meshes = zip(mdlMeshes, mtkMeshes).map {
            Mesh(mdlMesh: $0.0,
                 mtkMesh: $0.1,
                 startTime: .zero,
                 endTime: .zero,
                 vertexFunctionName: vertexFunction,
                 fragmentFunctionName: fragmentFunction,
                 blendingEnabled: blendingEnabled)
        }
        self.samplerState = Model.buildSamplerState()
        debugBoundingBox = DebugBoundingBox(boundingBox: mdlMesh.boundingBox)
        self.animations = [:]
        super.init(name: "VerticesModel", position: position, rotation: rotation, scale: scale, boundingBox: mdlMesh.boundingBox)
    }


struct VertexIn {
    float4 position [[attribute(Position)]];
    float3 normal [[attribute(Normal)]];
    float2 uv [[attribute(UV)]];
    float3 tangent [[attribute(Tangent)]];
    float3 bitangent [[attribute(Bitangent)]];
    ushort4 joints [[attribute(Joints)]];
    float4 weights [[attribute(Weights)]];
};

struct VertexOut {
    float4 position [[position]];
    float3 worldPosition;
    float3 worldNormal;
    float3 worldTangent;
    float3 worldBitangent;
    float2 uv;
};

vertex VertexOut vertex_basic(const VertexIn vertexIn [[stage_in]],
                              constant float4x4 *jointMatrices [[buffer(22), function_constant(hasSkeleton)]],
                              constant Uniforms &uniforms [[buffer(BufferIndexUniforms)]]) {
    float4 position = vertexIn.position;
    float4 normal = float4(vertexIn.normal, 0);
    VertexOut out {
        .position = uniforms.projectionMatrix * uniforms.viewMatrix * uniforms.modelMatrix * position,
        .worldPosition = (uniforms.modelMatrix * position).xyz,
        .worldNormal = uniforms.normalMatrix * normal.xyz,
        .worldTangent = uniforms.normalMatrix * vertexIn.tangent,
        .worldBitangent = uniforms.normalMatrix * vertexIn.bitangent,
        .uv = vertexIn.uv
    };
    return out;
}

fragment float4 fragment_basic(VertexOut in [[stage_in]],
                               constant Light *lights [[buffer(BufferIndexLights)]],
                               constant Material &material [[buffer(BufferIndexMaterials)]],
                               sampler textureSampler [[sampler(0)]],
                               constant FragmentUniforms &fragmentUniforms [[buffer(BufferIndexFragmentUniforms)]],
                               texture2d<float> baseColorTexture [[texture(0), function_constant(hasColorTexture)]],
                               texture2d<float> normalTexture [[texture(1), function_constant(hasNormalTexture)]],
                               texture2d<float> roughnessTexture [[texture(2), function_constant(hasRoughnessTexture)]],
                               texture2d<float> metallicTexture [[texture(3), function_constant(hasMetallicTexture)]],
                               texture2d<float> aoTexture [[texture(4), function_constant(hasAOTexture)]],
                               constant bool &transparencyEnabled [[buffer(BufferIndexTransparency)]],
                               constant bool &blendingEnabled [[buffer(BufferIndexBlending)]],
                               constant bool &fogEnabled [[buffer(BufferIndexFog)]]) {
    return float4(0, 0, 1, 1);
}

Unfortunately this is telling me: Failed to set (contentViewController) user defined inspected property on (NSWindow): Must provide texture coordinates
But I am confused about what this actually means.
It shows a blank screen, without the possibility to use the GPU debugger.

No worries - I didn’t have the time to get to your project, so I’m glad you found it.

Again it’s really hard to tell without seeing the whole app.

When debugging, you can try stripping out things until the app works. It’s often much easier to see what’s going on in less code.

Hey Caroline, after working through Chapter 12 I had an epiphany:
I had to set up rendering for this as something separate. I will post the solution that worked for me here, so if anyone wants to get started on Chapter 11 within the whole Renderer-Model-Scene setup, you can find a possible starting setup below. Could you let me know if you see any flaws in this? I have been wondering whether it is bad practice to use the same renderEncoder for different kinds of models, but it seems to work out at the moment…
I still have to go through chapter 11 to see how it pans out, but at least the starting state seems ok.

I added a class used for drawing Models by handing it the vertices:

import MetalKit

class TesselatedModel {
    private let name: String
    private let vertices: [float3]
    private static let pipelineState = TesselatedModel.buildPipelineState()
    private static let depthStencilState = TesselatedModel.buildDepthStencilState()
    private lazy var vertexBuffer = {
        Renderer.device.makeBuffer(bytes: vertices,
                                   length: MemoryLayout<float3>.stride * vertices.count,
                                   options: [])
    }()

    init(name: String, vertices: [float3]) {
        self.name = name
        self.vertices = vertices
    }

    func render(renderEncoder: MTLRenderCommandEncoder, uniforms vertex: Uniforms) {
        // set the states for this model's pipeline before drawing
        renderEncoder.setRenderPipelineState(TesselatedModel.pipelineState)
        renderEncoder.setDepthStencilState(TesselatedModel.depthStencilState)
        renderEncoder.setVertexBuffer(self.vertexBuffer, offset: 0, index: 0)
        let position = float3([0, 0, 0])
        let rotation = float3(Float(-90).degreesToRadians, 0, 0)
        var modelMatrix: float4x4 {
          let translationMatrix = float4x4(translation: position)
          let rotationMatrix = float4x4(rotation: rotation)
          return translationMatrix * rotationMatrix
        }
        var mvp = vertex.projectionMatrix * vertex.viewMatrix.inverse * modelMatrix
        renderEncoder.setVertexBytes(&mvp, length: MemoryLayout<float4x4>.stride, index: 1)
        renderEncoder.drawPrimitives(type: .triangle, vertexStart: 0, vertexCount: vertices.count)
    }

    private static func buildPipelineState() -> MTLRenderPipelineState {
        let descriptor = MTLRenderPipelineDescriptor()
        descriptor.sampleCount = Renderer.sampleCount
        descriptor.colorAttachments[0].pixelFormat = .bgra8Unorm
        descriptor.depthAttachmentPixelFormat = .depth32Float
        let vertexFunction = Renderer.library.makeFunction(name: "vertex_tesselation")
        let fragmentFunction = Renderer.library.makeFunction(name: "fragment_tesselation")
        descriptor.vertexFunction = vertexFunction
        descriptor.fragmentFunction = fragmentFunction
        let vertexDescriptor = MTLVertexDescriptor()
        vertexDescriptor.attributes[0].format = .float3
        vertexDescriptor.attributes[0].offset = 0
        vertexDescriptor.attributes[0].bufferIndex = 0
        vertexDescriptor.layouts[0].stride = MemoryLayout<float3>.stride
        descriptor.vertexDescriptor = vertexDescriptor
        guard let pipelineState = try? Renderer.device.makeRenderPipelineState(descriptor: descriptor) else {
            fatalError("Could not create TesselatedModel PipelineState")
        }
        return pipelineState
    }

    private static func buildDepthStencilState() -> MTLDepthStencilState {
        let descriptor = MTLDepthStencilDescriptor()
        descriptor.depthCompareFunction = .less
        descriptor.isDepthWriteEnabled = true
        return Renderer.device.makeDepthStencilState(descriptor: descriptor)!
    }
    /**
     Create control points
     - Parameters:
       - patches: number of patches across and down
       - size: size of plane
     - Returns: an array of patch control points. Each group of four makes one patch.
     */
    func createControlPoints(patches: (horizontal: Int, vertical: Int),
                             size: (width: Float, height: Float)) -> [float3] {
        var points: [float3] = []
        // per-patch width and height
        let width = 1 / Float(patches.horizontal)
        let height = 1 / Float(patches.vertical)
        for j in 0..<patches.vertical {
            let row = Float(j)
            for i in 0..<patches.horizontal {
                let column = Float(i)
                let left = width * column
                let bottom = height * row
                let right = width * column + width
                let top = height * row + height
                points.append([left, 0, top])
                points.append([right, 0, top])
                points.append([right, 0, bottom])
                points.append([left, 0, bottom])
            }
        }
        // size and convert to Metal coordinates
        // eg. 6 across would be -3 to +3
        points = points.map {
            [$0.x * size.width - size.width / 2,
             0,
             $0.z * size.height - size.height / 2]
        }
        return points
    }
}

I added a list of tessellated models to my Scene class:

var tesselatedModels: [TesselatedModel] = []

Renderer calls render on each of the tessellated models (add this before the renderEncoder.endEncoding()):

    scene.tesselatedModels.forEach { $0.render(renderEncoder: renderEncoder, uniforms: scene.uniforms) }

This is the setup of my Scene:

import MetalKit

class TesselationScene: Scene {

    let plane: TesselatedModel = {
        let vertices: [float3] = [
          [-1,  0,  1],
          [ 1,  0, -1],
          [-1,  0, -1],
          [-1,  0,  1],
          [ 1,  0, -1],
          [ 1,  0,  1]
        ]
        return TesselatedModel(name: "plane", vertices: vertices)
    }()

    let arcBallCamera: Camera = {
        let camera = ArcballCamera()
        camera.distance = -2
        camera.target = [0, 0, 0]
        return camera
    }()

    override func setupScene() {
        self.tesselatedModels = [plane]
        inputController.keyboardDelegate = self
        cameras = [arcBallCamera]
    }
}

And this is the setup for the shaders, which I put in a new file:

#include <metal_stdlib>
#import "Common.h"

using namespace metal;

struct VertexOut {
  float4 position [[position]];
  float4 color;
};

struct VertexIn {
  float4 position [[attribute(0)]];
};

vertex VertexOut vertex_tesselation(VertexIn in [[stage_in]],
                                    constant float4x4 &mvp [[buffer(1)]]) {
  VertexOut out;
  float4 position = mvp * in.position;
  out.position = position;
  out.color = float4(0, 0, 1, 1);
  return out;
}

fragment float4 fragment_tesselation(VertexOut in [[stage_in]]) {
  return in.color;
}

Excellent work :slight_smile: - I haven’t run the code, but it looks pretty good.

How about making TesselatedModel a subclass of Node? That will allow you to treat TesselatedModel just the same as the other nodes, and you should be able to position it.

One of the strengths of the app design is that you can make anything a node and because each subclass overrides the render method, you can render that subclass in any way you want.

You should use the same render command encoder for each pass. Later you’ll do a shadow pass which will use a different command encoder.
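That subclassing idea might look something like this (a sketch only, assuming the book project's Node base class with position/rotation properties, a modelMatrix computed from them, and an overridable render method; adjust the names to your own setup):

    // Hypothetical sketch: TesselatedModel rewritten as a Node subclass.
    class TesselatedNode: Node {
        private let vertices: [float3]

        init(name: String, vertices: [float3]) {
            self.vertices = vertices
            super.init()
            self.name = name
        }

        override func render(renderEncoder: MTLRenderCommandEncoder, uniforms: Uniforms) {
            // Node's modelMatrix (built from position/rotation/scale) replaces the
            // hard-coded transform, so this node can be positioned like any other.
            var mvp = uniforms.projectionMatrix * uniforms.viewMatrix * modelMatrix
            renderEncoder.setVertexBytes(&mvp, length: MemoryLayout<float4x4>.stride, index: 1)
            renderEncoder.drawPrimitives(type: .triangle, vertexStart: 0, vertexCount: vertices.count)
        }
    }

With this, the renderer's existing loop over renderables picks the model up automatically, instead of needing a separate tesselatedModels array.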


Thanks for the feedback, great idea I am going to refactor it into a node!

Hey @caroline, if you do find the time, could you check out my repo? I have described the issue in a ticket there. I have tried and tried, but I am unable to find the mistake and move forward, unfortunately. :disappointed: You should have access to the repo.

Hi there!

Your architecture is wrong.

You have in your render loop

create command buffer
command buffer creates render encoder

loop through renderables {
  call render method
commit command buffer

The render method creates another command buffer for the compute encoder which never gets committed.

With terrible code, I got it working.

  1. In Renderer, I created a new type property:

static var currentDrawable: MTLDrawable?

  2. In draw, I moved guard let drawable = view.currentDrawable else { return } to the top of the method, and added:

Self.currentDrawable = drawable

That made drawable global :scream: - don’t ever do this!

  3. In TessellatedNode, I checked the global drawable:

guard let drawable = Renderer.currentDrawable else { return }

After the compute encoder’s endEncoding(), I put:


I just did all that to make sure the basic tessellation code worked, which it did. This is terrible architecture; don’t nest command buffers within command buffers!

I would suggest, with the current book architecture, that before rendering anything, and before creating the render command encoder, you have a separate loop in draw where you run through the renderables and find out what needs pre-processing. Call a pre-processing method in your model.

draw() {
  create command buffer
  loop through renderables to pre-process and do computes
  create render encoder
  loop through renderables to render
  commit command buffer
}
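In Swift, that outline might look roughly like this in the book's Renderer (a sketch only; preprocess(commandBuffer:) is a hypothetical method you would add to each renderable that needs compute work, such as calculating tessellation factors):

    // Sketch of the suggested two-loop draw(), under the assumptions above.
    func draw(in view: MTKView) {
        guard let descriptor = view.currentRenderPassDescriptor,
              let commandBuffer = Renderer.commandQueue.makeCommandBuffer() else { return }

        // Phase 1: compute / pre-processing, before the render encoder exists.
        for renderable in scene.renderables {
            renderable.preprocess(commandBuffer: commandBuffer)
        }

        // Phase 2: rendering, on the same command buffer.
        guard let renderEncoder =
            commandBuffer.makeRenderCommandEncoder(descriptor: descriptor) else { return }
        for renderable in scene.renderables {
            renderable.render(renderEncoder: renderEncoder, uniforms: scene.uniforms)
        }
        renderEncoder.endEncoding()

        guard let drawable = view.currentDrawable else { return }
        commandBuffer.present(drawable)
        commandBuffer.commit()
    }

This keeps a single command buffer per frame and avoids the global drawable workaround entirely.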

Oh my … thank you so much for the effort and the detailed answer, @caroline. I do know it takes up quite a bit of your time.
But I wouldn’t have found this in a million years! It is working with the code you posted; now I am going to try to improve the architecture like you suggested!

So the main issue was that I did not commit the command buffer that was created in the TesselatedNode.

We created that commandBuffer to take care of the kernel function’s execution, so with your changes we now wait for it to complete.

That commandBuffer manipulated the tessellationFactorsBuffer which gets passed to the render commandBuffer.

The way the render commandBuffer knows what to draw would then be the combination of the control points buffer we send and the setTessellationFactorBuffer(tessellationFactorsBuffer) call we make on it? So basically what was going on here is that the tessellationFactorsBuffer couldn’t provide what was needed to render, right?

By Bolding the do computes here that does still mean creating and commiting that compute specific command buffer, correct?

Yes - not committing the command buffer to the GPU meant that the edge factors and inside factors weren’t being calculated. These factors decide how many points the tessellator will create.


None of that formatting was me :smiley: - when I used backticks to show it was “pseudo code”, it got rendered like that.

You can use the same command buffer as the one you use for the render encoder, but you have to have two different phases, as you can’t be encoding two things at one time.
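In code, those two phases on one command buffer might look like this (a sketch; tessellationPipelineState, tessellationFactorsBuffer and the threadgroup sizes are placeholders standing in for your own compute setup, not the book's exact values):

    // One command buffer, two phases: the compute encoder must end
    // before the render encoder begins, because only one encoder can
    // be actively encoding on a command buffer at a time.
    let commandBuffer = Renderer.commandQueue.makeCommandBuffer()!

    // Phase 1: a kernel computes the tessellation factors.
    let computeEncoder = commandBuffer.makeComputeCommandEncoder()!
    computeEncoder.setComputePipelineState(tessellationPipelineState) // placeholder
    computeEncoder.setBuffer(tessellationFactorsBuffer, offset: 0, index: 2) // placeholder
    computeEncoder.dispatchThreadgroups(MTLSize(width: 1, height: 1, depth: 1),
                                        threadsPerThreadgroup: MTLSize(width: 1, height: 1, depth: 1))
    computeEncoder.endEncoding() // phase 1 ends here

    // Phase 2: render, reading the factors the kernel just wrote.
    let renderEncoder = commandBuffer.makeRenderCommandEncoder(descriptor: descriptor)!
    renderEncoder.setTessellationFactorBuffer(tessellationFactorsBuffer,
                                              offset: 0,
                                              instanceStride: 0)
    // ... drawPatches and other render commands ...
    renderEncoder.endEncoding()
    commandBuffer.commit()

Metal guarantees that commands encoded earlier on a command buffer finish before later ones read their results, so no explicit wait is needed between the phases.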


Thank you so much, I finally understand it now. I was able to use a single commandBuffer: first all the compute that is needed, making sure to call endEncoding afterwards, and then the rendering on the same one!
