Beginning Metal - Part 6: Textures | Ray Wenderlich

In this video tutorial on Metal, you'll learn how textures map to vertices and how to use sampler states.


This is a companion discussion topic for the original entry at https://www.raywenderlich.com/3537-beginning-metal/lessons/6

I think there might be something wrong with the Download Materials for Part 6: Textures. The file won't download for me. I can download all the others.

@doyle - I'm sorry to hear that. Can you try again please? Maybe refresh the page? It downloads for me OK, and I've also heard from one other person for whom it also downloads OK.

Yes, sorry - I cleared my browser and it's working now.


I've completed the demo part of this lesson, but the result is split in half and mirrored in each triangle. This is on an iPhone 6 running iOS 10.0.2, updated to 10.1.1 with the same result.

Shader.metal snippet
fragment half4 textured_fragment(
VertexOut vertexIn [[ stage_in ]],
sampler sampler2d [[ sampler(0) ]],
texture2d texture [[ texture(0) ]]
) {
float4 color = texture.sample(sampler2d, vertexIn.textureCoordinates);
return half4(color.r, color.g, color.b, 1);
}

Plane.swift snippet
let vertices: [Vertex] = [
Vertex(position: float3(-1, 1, 0),
color: float4(1, 0, 0, 1),
texture: float2(0,1)),
Vertex(position: float3(-1, -1, 0),
color: float4(0,1,0,1),
texture: float2(0,0)),
Vertex(position: float3(1, -1, 0),
color: float4(0,0,1,1),
texture: float2(1,0)),
Vertex(position: float3(1,1,0),
color: float4(1,0,1,1),
texture: float2(1,1))
]

Plane.swift Renderable extension snippet
var vertexDescriptor: MTLVertexDescriptor {
let vertexDescriptor = MTLVertexDescriptor()
vertexDescriptor.attributes[0].format = .float3
vertexDescriptor.attributes[0].offset = 0
vertexDescriptor.attributes[0].bufferIndex = 0
vertexDescriptor.attributes[1].format = .float4
vertexDescriptor.attributes[1].offset = MemoryLayout.stride
vertexDescriptor.attributes[1].bufferIndex = 0
vertexDescriptor.attributes[2].format = .float2
vertexDescriptor.attributes[2].offset = MemoryLayout.stride + MemoryLayout.stride
vertexDescriptor.attributes[2].bufferIndex = 0
vertexDescriptor.layouts[0].stride = MemoryLayout.stride
return vertexDescriptor
}

@zurzurzur - Hi - you can compare your code with the final sample code provided.

You'll need to specifically check Shader.metal's structs and vertex and fragment functions and make sure that all your vertices are the same (they look OK). Check your indices are the same. Check those attributes have the correct offset - your copied code is missing bits. Your fragment function header is also missing a <float>, but it wouldn't compile without that.

It shouldn't be anything to do with the iOS version. The symptom of that is the whole image being upside down.

Thanks for your help. I've been looking at my code all day, comparing it to the finished example, and I can't seem to find the problem. Does this code look OK to you? Does anything look out-of-the-ordinary? In Plane.swift, I put the vertexDescriptor variable in a different place… that's the only difference I can find.

Plane.swift

import MetalKit

class Plane: Node {
	var vertexBuffer: MTLBuffer?
	var indexBuffer: MTLBuffer?
	
	// Renderable stored properties
	var pipelineState: MTLRenderPipelineState!
	var fragmentFunctionName: String = "fragment_shader"
	var vertexFunctionName: String = "vertex_shader"
	
	// Texturable stored properties.
	var texture: MTLTexture?
	
	let vertices: [Vertex] = [
		Vertex(position: float3(-1, 1, 0),
		       color: float4(1, 0, 0, 1),
		       texture: float2(0, 1)),
		Vertex(position: float3(-1, -1, 0),
		       color: float4(0, 1, 0, 1),
		       texture: float2(0, 0)),
		Vertex(position: float3(1, -1, 0),
		       color: float4(0, 0, 1, 1),
		       texture: float2(1, 0)),
		Vertex(position: float3(1, 1, 0),
		       color: float4(1, 0, 1, 1),
		       texture: float2(1, 1))
	]
	
	let indices: [UInt16] = [
		0, 1, 2,
		2, 3, 0
	]
	
	var time: Float = 0
	
	struct Constants {
		var animateBy: Float = 0.0
	}
	
	var constants = Constants()
	
	init(device: MTLDevice) {
		super.init()
		buildBuffers(device: device)
		pipelineState = buildPipelineState(device: device)
	}
	
	init(device: MTLDevice, imageName: String) {
		super.init()
		self.texture = setTexture(device: device, imageName: imageName)
		
		if let texture = setTexture(device: device, imageName: imageName) {
			self.texture = texture
			fragmentFunctionName = "textured_fragment"
		}
		
		buildBuffers(device: device)
		pipelineState = buildPipelineState(device: device)
		
	}
	
	private func buildBuffers(device: MTLDevice) {
		vertexBuffer = device.makeBuffer(
			bytes: vertices,
			length: vertices.count * MemoryLayout<Vertex>.size,
			options: []
		)
		indexBuffer = device.makeBuffer(
			bytes: indices,
			length: indices.count * MemoryLayout<UInt16>.size,
			options: []
		)
	}
	
	override func render(commandEncoder: MTLRenderCommandEncoder, deltaTime: Float) {
		super.render(commandEncoder: commandEncoder, deltaTime: deltaTime)
		guard let indexBuffer = indexBuffer else { return }
		
		commandEncoder.setRenderPipelineState(pipelineState)
		
		time += deltaTime
		let animateBy = abs(sin(time)/2 + 0.5)
		constants.animateBy = animateBy
		
		commandEncoder.setVertexBuffer(vertexBuffer, offset: 0, at: 0)
		commandEncoder.setVertexBytes(&constants,
		                              length: MemoryLayout<Constants>.stride,
		                              at: 1
		)
		commandEncoder.setFragmentTexture(texture, at: 0)
		commandEncoder.drawIndexedPrimitives(type: .triangle,
		                                     indexCount: indices.count,
		                                     indexType: .uint16,
		                                     indexBuffer: indexBuffer,
		                                     indexBufferOffset: 0
		)

	}
}

extension Plane: Renderable {
	var vertexDescriptor: MTLVertexDescriptor {
		let vertexDescriptor = MTLVertexDescriptor()
		vertexDescriptor.attributes[0].format = .float3
		vertexDescriptor.attributes[0].offset = 0
		vertexDescriptor.attributes[0].bufferIndex = 0
		vertexDescriptor.attributes[1].format = .float4
		vertexDescriptor.attributes[1].offset = MemoryLayout<float3>.stride
		vertexDescriptor.attributes[1].bufferIndex = 0
		vertexDescriptor.attributes[2].format = .float2
		vertexDescriptor.attributes[2].offset = MemoryLayout<float3>.stride + MemoryLayout<float4>.stride
		vertexDescriptor.attributes[2].bufferIndex = 0
		vertexDescriptor.layouts[0].stride = MemoryLayout<Vertex>.stride
		return vertexDescriptor
	}
}

extension Plane: Texturable {
	
}

Shader.metal

using namespace metal;

struct Constants {
	float animateBy;
};

struct VertexIn {
	float4 position [[ attribute(0) ]];
	float4 color [[ attribute(1) ]];
	float2 textureCoordinates [[ attribute(2) ]];
};

struct VertexOut {
	float4 position [[ position ]];
	float4 color;
	float2 textureCoordinates;
};

vertex VertexOut vertex_shader(const VertexIn vertexIn [[ stage_in ]] ) {
	VertexOut vertexOut;
	vertexOut.position = vertexIn.position;
	vertexOut.color = vertexIn.color;
	vertexOut.textureCoordinates = vertexIn.textureCoordinates;
	
	return vertexOut;
}

fragment half4 fragment_shader(VertexOut vertexIn [[ stage_in ]]) {
	return half4(vertexIn.color);
}

fragment half4 textured_fragment(VertexOut vertexIn [[ stage_in ]],
                                 sampler sampler2d [[ sampler(0) ]],
                                 texture2d<float> texture [[ texture(0) ]]) {
	float4 color = texture.sample(sampler2d, vertexIn.textureCoordinates);
	return half4(color.r, color.g, color.b, 1);
}

Types.swift

import simd

struct Vertex {
	var position: float3
	var color: float4
	var texture: float2
}

Texturable.swift

import MetalKit

protocol Texturable {
	var texture: MTLTexture? { get set }
}

extension Texturable {
	func setTexture(device: MTLDevice, imageName: String) -> MTLTexture? {
		let textureLoader = MTKTextureLoader(device: device)
		var texture: MTLTexture? = nil
		let textureLoaderOptions: [String: NSObject]
		
		if #available(iOS 10.0, *) {
			let origin = NSString(string: MTKTextureLoaderOriginBottomLeft)
			textureLoaderOptions = [MTKTextureLoaderOptionOrigin : origin]
		} else {
			textureLoaderOptions = [:]
		}
		
		if let textureURL = Bundle.main.url(forResource: imageName, withExtension: nil) {
			do {
				texture = try textureLoader.newTexture(withContentsOf: textureURL, options: textureLoaderOptions)
			} catch {
				print("texture not created")
			}
		}
		
		return texture
	}
}

@zurzurzur

Size vs stride - I may not have made the difference clear enough in the videos, so thank you for bringing this up.

Take a look at Apple Developer Documentation

In buildBuffers(), you have MemoryLayout<Vertex>.size. This makes the buffer length count * 40 bytes, whereas the amount Vertex actually takes up in memory is 48 bytes, because of internal padding and alignment.

Another similar example that catches the unwary is if you are matching 3 floats in a shader function. The float3 type has a size and stride of 16 because of alignment. To match 3 floats, you'd have to get the size of float and multiply it by 3.
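You can see the size/stride difference in a playground. Here's a small sketch - the struct is hypothetical (not the tutorial's Vertex, which needs simd types), but it shows the same padding effect:

```swift
// A struct whose tail padding makes size and stride differ,
// analogous to the tutorial's Vertex (size 40, stride 48).
struct Example {
    var d: Double  // 8 bytes, 8-byte aligned
    var f: Float   // 4 bytes
}

// size: the bytes of actual data.
// stride: the distance between consecutive array elements,
// rounded up to the type's alignment.
print(MemoryLayout<Example>.size)    // 12
print(MemoryLayout<Example>.stride)  // 16

// For a Metal vertex buffer, always size the buffer with stride:
// length: vertices.count * MemoryLayout<Vertex>.stride
```

Using .size to compute a buffer length under-allocates by the tail padding of every element, which is exactly the 40-vs-48 discrepancy above.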

I've been steadily going through this series and loving it. I do admit I feel like I'm consistently treading water, but it's getting easier. This was the first one that I had problems with.

I wound up spending a day discovering three problems: something I seem to have missed, something that's changed in the OS (I'm using iOS 11b2), and a potential bug in Xcode at this beta stage.

I managed to miss changing the Renderable buildPipelineState() shader function declarations to use the properties. I haven't been able to find where that was supposed to happen.

In 11b2, the texture loader options now seem to be [MTKTextureLoader.Option: Any]?, which changed my code to:

textureLoaderOptions = [.origin: MTKTextureLoader.Origin.bottomLeft]

Finally, the killer was that dragging in the Images folder for some reason did not allow the resources to be found by:

Bundle.main.url(forResource:withExtension:) 

This kept returning nil. I created a group folder in the project and re-added the images, and that worked. I'm going to play with that API and Xcode 9 and see if it's a Radar issue.

Despite this… I got my Zombie… I'm starting to have glimmers of "Grok" and still persevering.

Thanks so much, Caroline!

@lordandrei - I'm glad you're enjoying it.

  1. The challenge PDF in Video 5 has a whole lot of refactoring in it, and that's where Renderable is built up.

  2. I apologise - I really thought I'd put that code up here.

This is my texture loader options code, which is the same as yours. They changed it from strings, which I guess is a move forwards.

let textureLoaderOptions: [MTKTextureLoader.Option: Any]
if #available(iOS 10.0, *) {
  textureLoaderOptions = [.origin : MTKTextureLoader.Origin.bottomLeft]
} else {
  textureLoaderOptions = [:]
}
  3. I've been finding that when dragging files into Xcode 9 projects, most of the time they are not included in the target. So after adding files, I am always now checking Target Membership in the File Inspector.

So happy you got your Zombie :smile:

Great tutorials, I've really learned a lot from these.

I'm just trying to build on the challenge and make a texture semi-transparent. Unfortunately, all I seem to be able to manage is a semi-transparent texture that also 'cuts' through anything that's in the background to reveal the clear colour.

I've got two textured quads, with the opacity of one set to 0.5, and it's scaled to 75% of the other's size.

The quads are:

vertical

and

vertical

The MTKView is cleared with red. When I try to blend them the result is:

bad_mix

What I'm expecting is:

good_mix

For my pipeline setup I'm using:

    descriptor.colorAttachments[0].pixelFormat = .bgra8Unorm
    descriptor.colorAttachments[0].isBlendingEnabled = true
    descriptor.colorAttachments[0].rgbBlendOperation = .add
    descriptor.colorAttachments[0].alphaBlendOperation = .add
    descriptor.colorAttachments[0].sourceRGBBlendFactor = .sourceAlpha
    descriptor.colorAttachments[0].sourceAlphaBlendFactor = .sourceAlpha
    descriptor.colorAttachments[0].destinationRGBBlendFactor = .oneMinusSourceAlpha
    descriptor.colorAttachments[0].destinationAlphaBlendFactor = .oneMinusSourceAlpha

The Metal shader functions are:

vertex VertexOut vertex_shader(const VertexIn vertex_in [[ stage_in ]],
                               constant ModelMatrix &matrix [[ buffer(1) ]],
                               constant const UniformsStruct &uniforms [[ buffer(2) ]]) {
    VertexOut vertex_out;
    vertex_out.position = matrix.mvpMatrix * vertex_in.position;
    vertex_out.colour = vertex_in.colour;
    vertex_out.textureCoordinates = vertex_in.textureCoordinates;
    vertex_out.opacity = uniforms.opacity;
    return vertex_out;
}

fragment half4 masked_textured_fragment_shader(VertexOut vertex_from_vertex_shader [[ stage_in ]],
                                               sampler sampler2d [[ sampler(0) ]],
                                               texture2d<float> mask [[ texture(1) ]],
                                               texture2d<float> texture [[ texture(0) ]]) {
    float4 keyPixel = mask.sample(sampler2d, vertex_from_vertex_shader.textureCoordinates);
    float4 colour = texture.sample(sampler2d, vertex_from_vertex_shader.textureCoordinates);
    return half4(colour.r * keyPixel.r, colour.g * keyPixel.g, colour.b * keyPixel.b, vertex_from_vertex_shader.opacity);
}

My current best guess is that the pipeline isn't set with the correct options, but changing them doesn't make the two quads blend - it does give some interesting effects, though!

@todd6868 -

Setting the correct pipeline state for the second quad is the only thing you have to do to achieve blending - you don't have to do any calculations in the fragment function.

Try setting up one simple pipeline with none of the blending for the back quad.
Then set up the pipeline as you have above for the front quad.

When you render the two quads, switch the pipeline state so the back quad renders without blending and the front quad renders with blending.

To get the above result, this is my fragment function for both quads:

fragment float4 fragment_main (VertexOut in [[ stage_in ]],
                       texture2d<float> texture [[ texture(0) ]]) {
  constexpr sampler sampler2d;
  float4 color = texture.sample(sampler2d, in.textureCoordinates);
  color.a = 0.5;  // set opacity. Ignored if blending is not enabled
  return color;
}

This is ā€œfixed functionā€ blending - you set the state of the GPU to blend by the use of two different pipeline states.

You can read more about it at Metal By Example
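The two-pipeline setup described above might be sketched like this. It's a configuration sketch, not the tutorial's exact code - the shader function names and pixel format are taken from this thread, and the rest is illustrative:

```swift
import MetalKit

// Sketch: build two pipeline states from the same shader functions -
// one opaque for the back quad, one with alpha blending for the front.
// "vertex_shader" / "fragment_main" are the names used in this thread.
func makePipelines(device: MTLDevice, library: MTLLibrary) throws
    -> (opaque: MTLRenderPipelineState, blended: MTLRenderPipelineState) {
  let descriptor = MTLRenderPipelineDescriptor()
  descriptor.vertexFunction = library.makeFunction(name: "vertex_shader")
  descriptor.fragmentFunction = library.makeFunction(name: "fragment_main")
  descriptor.colorAttachments[0].pixelFormat = .bgra8Unorm

  // Back quad: no blending (isBlendingEnabled defaults to false).
  let opaque = try device.makeRenderPipelineState(descriptor: descriptor)

  // Front quad: standard "source over" alpha blending.
  // The descriptor can be mutated and reused - pipeline states are
  // immutable snapshots taken at creation time.
  descriptor.colorAttachments[0].isBlendingEnabled = true
  descriptor.colorAttachments[0].rgbBlendOperation = .add
  descriptor.colorAttachments[0].alphaBlendOperation = .add
  descriptor.colorAttachments[0].sourceRGBBlendFactor = .sourceAlpha
  descriptor.colorAttachments[0].sourceAlphaBlendFactor = .sourceAlpha
  descriptor.colorAttachments[0].destinationRGBBlendFactor = .oneMinusSourceAlpha
  descriptor.colorAttachments[0].destinationAlphaBlendFactor = .oneMinusSourceAlpha
  let blended = try device.makeRenderPipelineState(descriptor: descriptor)

  return (opaque, blended)
}

// At draw time: encoder.setRenderPipelineState(opaque) before drawing
// the back quad, then encoder.setRenderPipelineState(blended) before
// drawing the front quad.
```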


Hi Caroline, I'm trying to add a texture to my object based on this section. When I run the project, the texture looks inside out, like below:

Screen Shot 2021-01-07 at 23.39.07

When I changed the .uv value in the vertex_main function like this:
VertexOut out {
.position = uniforms.projectionMatrix * uniforms.viewMatrix
* uniforms.modelMatrix * vertexIn.position,
.worldPosition = (uniforms.modelMatrix * vertexIn.position).xyz,
.worldNormal = uniforms.normalMatrix * vertexIn.normal,
.uv = float2(1-vertexIn.uv.x, vertexIn.uv.y)
};

It looks OK, like below:

Screen Shot 2021-01-07 at 23.42.35

Did I do something wrong? I'm loading my texture using the .origin: MTKTextureLoader.Origin.bottomLeft option.

If needed, my project is on GitHub → GitHub - THaliloglu/solarsystem-metal: Solar System demonstration using Apple MetalKit

@thaliloglu - Hi, and welcome to the forums :).

I generally just flip the texture in my texture editor or Preview to suit my code until it works :slight_smile:

The Metal texture coordinates are 0, 0 at the top left. I removed the origin bottomLeft assignment altogether and also your uv changing, so it was just the straight uv numbers and got this:

Screen Shot 2021-01-08 at 8.47.44 pm

Bright green is at 0, 1. So you can see where your model uvs are. You can see that the texture is being sampled as per your uvs, with 0, 1 at the bottom left.

When you use the origin bottomLeft, it flips the pixel coordinates of the texture, so you get this

Screen Shot 2021-01-08 at 8.52.54 pm

Which is obviously pixel flipped.

https://developer.apple.com/documentation/metalkit/mtktextureloader/option/1645864-origin

You have the choice of changing your uvs on the sphere, or changing the texture, or changing the code.

P.S. Thank you for posting working code, it makes it easier to see what's going on :slight_smile:

Hi Caroline,

I'm using Blender for the sphere object, and it actually looked OK in Blender when I added the image file to the object for unwrapping.

I used the mirror-x option after adding the texture file, and now the Blender preview doesn't look right, like below :slight_smile:
Screen Shot 2021-01-11 at 23.21.31

I'm new to using Blender; for now I will use the mirror-x option, because I don't know what I'm doing wrong :smiley:

Thank you so much :+1:

:woman_facepalming: - I'm sorry, I steered you wrong. Everything exported from Blender is flipped in our app. It's nothing you're doing wrong.

You can, as you've done, flip it in Blender (which I think is easier), or you can flip it with a matrix in the vertex function.
