In this video tutorial on Metal, you'll learn how textures map to vertices and how to use sampler states.
This is a companion discussion topic for the original entry at https://www.raywenderlich.com/3537-beginning-metal/lessons/6
I think there might be something wrong with the Download Materials for Part 6: Textures. The file won't download for me. I can download all the others.
@doyle - I'm sorry to hear that. Can you try again please? Maybe refresh the page? It downloads for me OK, and I've also heard from one other person for whom it also downloads OK.
Yes sorry, I cleared my browser and it's working now.
I've completed the demo part of this lesson, but the result is split in half and mirrored in each triangle. I'm on an iPhone 6 running iOS 10.0.2, and updating to 10.1.1 gives the same result.
Shader.metal snippet
fragment half4 textured_fragment(
VertexOut vertexIn [[ stage_in ]],
sampler sampler2d [[ sampler(0) ]],
texture2d texture [[ texture(0) ]]
) {
float4 color = texture.sample(sampler2d, vertexIn.textureCoordinates);
return half4(color.r, color.g, color.b, 1);
}
Plane.swift snippet
let vertices: [Vertex] = [
Vertex(position: float3(-1, 1, 0),
color: float4(1, 0, 0, 1),
texture: float2(0,1)),
Vertex(position: float3(-1, -1, 0),
color: float4(0,1,0,1),
texture: float2(0,0)),
Vertex(position: float3(1, -1, 0),
color: float4(0,0,1,1),
texture: float2(1,0)),
Vertex(position: float3(1,1,0),
color: float4(1,0,1,1),
texture: float2(1,1))
]
Plane.swift Renderable extension snippet
var vertexDescriptor: MTLVertexDescriptor {
let vertexDescriptor = MTLVertexDescriptor()
vertexDescriptor.attributes[0].format = .float3
vertexDescriptor.attributes[0].offset = 0
vertexDescriptor.attributes[0].bufferIndex = 0
vertexDescriptor.attributes[1].format = .float4
vertexDescriptor.attributes[1].offset = MemoryLayout.stride
vertexDescriptor.attributes[1].bufferIndex = 0
vertexDescriptor.attributes[2].format = .float2
vertexDescriptor.attributes[2].offset = MemoryLayout.stride + MemoryLayout.stride
vertexDescriptor.attributes[2].bufferIndex = 0
vertexDescriptor.layouts[0].stride = MemoryLayout.stride
return vertexDescriptor
}
@zurzurzur - Hi - you can compare your code with the final sample code provided.
You'll need to specifically check Shader.metal's structs and its vertex and fragment functions. Make sure that all your vertices are the same (they look OK). Check your indices are the same. Check that those attributes have the correct offsets - your copied code is missing bits. Your fragment function header is also missing a <float>, but it wouldn't compile without that.
It shouldn't be anything to do with the iOS version. The symptom of that is the whole image being upside down.
Thanks for your help. I've been looking at my code all day, comparing it to the finished example, and I can't seem to find the problem. Does this code look OK to you? Does anything look out of the ordinary? In Plane.swift, I put the vertexDescriptor variable in a different place… that's the only difference I can find.
import MetalKit
class Plane: Node {
var vertexBuffer: MTLBuffer?
var indexBuffer: MTLBuffer?
// Renderable stored properties
var pipelineState: MTLRenderPipelineState!
var fragmentFunctionName: String = "fragment_shader"
var vertexFunctionName: String = "vertex_shader"
// Texturable stored properties.
var texture: MTLTexture?
let vertices: [Vertex] = [
Vertex(position: float3(-1, 1, 0),
color: float4(1, 0, 0, 1),
texture: float2(0,1)),
Vertex(position: float3(-1, -1, 0),
color: float4(0,1,0,1),
texture: float2(0,0)),
Vertex(position: float3(1, -1, 0),
color: float4(0,0,1,1),
texture: float2(1,0)),
Vertex(position: float3(1,1,0),
color: float4(1,0,1,1),
texture: float2(1,1))
]
let indices: [UInt16] = [
0, 1, 2,
2, 3, 0
]
var time: Float = 0
struct Constants {
var animateBy: Float = 0.0
}
var constants = Constants()
init(device: MTLDevice) {
super.init()
buildBuffers(device: device)
pipelineState = buildPipelineState(device: device)
}
init(device: MTLDevice, imageName: String) {
super.init()
if let texture = setTexture(device: device, imageName: imageName) {
self.texture = texture
fragmentFunctionName = "textured_fragment"
}
buildBuffers(device: device)
pipelineState = buildPipelineState(device: device)
}
private func buildBuffers(device: MTLDevice) {
vertexBuffer = device.makeBuffer(
bytes: vertices,
length: vertices.count * MemoryLayout<Vertex>.size,
options: []
)
indexBuffer = device.makeBuffer(
bytes: indices,
length: indices.count * MemoryLayout<UInt16>.size,
options: []
)
}
override func render(commandEncoder: MTLRenderCommandEncoder, deltaTime: Float) {
super.render(commandEncoder: commandEncoder, deltaTime: deltaTime)
guard let indexBuffer = indexBuffer else { return }
commandEncoder.setRenderPipelineState(pipelineState)
time += deltaTime
let animateBy = abs(sin(time)/2 + 0.5)
constants.animateBy = animateBy
commandEncoder.setVertexBuffer(vertexBuffer, offset: 0, at: 0)
commandEncoder.setVertexBytes(&constants,
length: MemoryLayout<Constants>.stride,
at: 1
)
commandEncoder.setFragmentTexture(texture, at: 0)
commandEncoder.drawIndexedPrimitives(type: .triangle,
indexCount: indices.count,
indexType: .uint16,
indexBuffer: indexBuffer,
indexBufferOffset: 0
)
}
}
extension Plane: Renderable {
var vertexDescriptor: MTLVertexDescriptor {
let vertexDescriptor = MTLVertexDescriptor()
vertexDescriptor.attributes[0].format = .float3
vertexDescriptor.attributes[0].offset = 0
vertexDescriptor.attributes[0].bufferIndex = 0
vertexDescriptor.attributes[1].format = .float4
vertexDescriptor.attributes[1].offset = MemoryLayout<float3>.stride
vertexDescriptor.attributes[1].bufferIndex = 0
vertexDescriptor.attributes[2].format = .float2
vertexDescriptor.attributes[2].offset = MemoryLayout<float3>.stride + MemoryLayout<float4>.stride
vertexDescriptor.attributes[2].bufferIndex = 0
vertexDescriptor.layouts[0].stride = MemoryLayout<Vertex>.stride
return vertexDescriptor
}
}
extension Plane: Texturable {
}
#include <metal_stdlib>
using namespace metal;
struct Constants {
float animateBy;
};
struct VertexIn {
float4 position [[ attribute(0) ]];
float4 color [[ attribute(1) ]];
float2 textureCoordinates [[ attribute(2) ]];
};
struct VertexOut {
float4 position [[ position ]];
float4 color;
float2 textureCoordinates;
};
vertex VertexOut vertex_shader(const VertexIn vertexIn [[ stage_in ]] ) {
VertexOut vertexOut;
vertexOut.position = vertexIn.position;
vertexOut.color = vertexIn.color;
vertexOut.textureCoordinates = vertexIn.textureCoordinates;
return vertexOut;
}
fragment half4 fragment_shader(VertexOut vertexIn [[ stage_in ]]) {
return half4(vertexIn.color);
}
fragment half4 textured_fragment(
VertexOut vertexIn [[ stage_in ]],
sampler sampler2d [[ sampler(0) ]],
texture2d<float> texture [[ texture(0) ]]) {
float4 color = texture.sample(sampler2d, vertexIn.textureCoordinates);
return half4(color.r, color.g, color.b, 1);
}
import simd
struct Vertex {
var position: float3
var color: float4
var texture: float2
}
import MetalKit
protocol Texturable {
var texture: MTLTexture? { get set }
}
extension Texturable {
func setTexture(device: MTLDevice, imageName: String) -> MTLTexture? {
let textureLoader = MTKTextureLoader(device: device)
var texture: MTLTexture? = nil
let textureLoaderOptions: [String: NSObject]
if #available(iOS 10.0, *) {
let origin = NSString(string: MTKTextureLoaderOriginBottomLeft)
textureLoaderOptions = [MTKTextureLoaderOptionOrigin : origin]
} else {
textureLoaderOptions = [:]
}
if let textureURL = Bundle.main.url(forResource: imageName, withExtension: nil) {
do {
texture = try textureLoader.newTexture(withContentsOf: textureURL, options: textureLoaderOptions)
} catch {
print("texture not created")
}
}
return texture
}
}
This comes down to size vs stride. I may not have made the difference clear enough in the videos, so thank you for bringing this up.
Take a look at Apple Developer Documentation
In buildBuffers(), you have MemoryLayout<Vertex>.size. This makes the buffer length count * 40, whereas the amount Vertex actually takes up in memory is 48, because of internal padding and alignment.
Another similar example that catches the unwary is if you are matching 3 floats in a shader function. The float3 type has a size and stride of 16 because of alignment. To match 3 tightly packed floats, you'd have to get the size of float and multiply it by 3.
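The size vs stride difference can be checked directly in a playground or script. This is a minimal sketch, assuming a Vertex struct laid out like the one in this thread; it uses the standard library SIMD types (simd's float3/float4/float2 are type aliases for SIMD3<Float>/SIMD4<Float>/SIMD2<Float>):

```swift
// A stand-in for the tutorial's Vertex struct, built from standard
// library SIMD types so it runs anywhere Swift does.
struct Vertex {
    var position: SIMD3<Float>  // size 16 (3 floats padded to 4), alignment 16
    var color: SIMD4<Float>     // size 16
    var texture: SIMD2<Float>   // size 8
}

// size = bytes one value occupies; stride = distance between consecutive
// array elements, including trailing padding for alignment.
print(MemoryLayout<Vertex>.size)    // 40
print(MemoryLayout<Vertex>.stride)  // 48

// Sizing a vertex buffer with .size comes up 8 bytes short per vertex,
// which scrambles every vertex after the first when the GPU reads it.
let vertices = [Vertex](repeating: Vertex(position: .zero, color: .zero, texture: .zero),
                        count: 4)
print(vertices.count * MemoryLayout<Vertex>.size)    // 160 - too small
print(vertices.count * MemoryLayout<Vertex>.stride)  // 192 - correct buffer length
```

This is why buildBuffers(device:) should use MemoryLayout<Vertex>.stride for the length, matching the vertexDescriptor's layouts[0].stride.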
I've been steadily going through this series and loving it. I do admit I feel like I'm consistently treading water, but it's getting easier. This was the first one that I had problems with.
I wound up spending a day discovering three problems: something I seemed to have missed, something that's changed in the OS (I'm using iOS 11b2), and a potential bug in Xcode at this beta stage.
I managed to miss changing the Renderable buildPipelineState() shader function declarations to use the properties. I haven't been able to find where that was supposed to happen.
In 11b2, the MTKTextureLoader options now seem to be [MTKTextureLoader.Option: Any]?, which changed my code to:
textureLoaderOptions = [.origin: MTKTextureLoader.Origin.bottomLeft]
Finally, the killer was that dragging in the Images folder for some reason did not allow the resources to be found by:
Bundle.main.url(forResource:withExtension:)
This kept returning nil. I created a group folder in the project and re-added the images, and that worked. I'm going to play with that API and Xcode 9 and see if it's a Radar issue.
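For anyone hitting the same nil, a quick diagnostic is to print what the bundle lookup actually returns before handing the name to MTKTextureLoader. This is an illustrative sketch, not code from the course; the helper name is made up:

```swift
import Foundation

// Diagnostic sketch: confirm the image really made it into the app bundle.
// If this prints nil, the file wasn't copied as a bundle resource - check
// Target Membership / Copy Bundle Resources - which is the symptom above.
func bundleURL(forImageNamed imageName: String) -> URL? {
    let url = Bundle.main.url(forResource: imageName, withExtension: nil)
    print("\(imageName) -> \(url?.path ?? "nil (not in bundle)")")
    return url
}
```

Dragging a folder in as a folder reference vs a group changes how resources are copied, so checking the URL directly narrows down whether the problem is the lookup or the texture loader.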
Despite this… I got my Zombie… I'm starting to have glimmers of "Grok" and still persevering.
Thanks so much, Caroline!
@lordandrei - I'm glad you're enjoying it.
The challenge PDF in Video 5 has a whole lot of refactoring in it, and that's where Renderable is built up.
I apologise - I really thought I'd put that code up here.
This is my texture loader options code, which is the same as yours. They changed it from strings, which I guess is a move forwards.
let textureLoaderOptions: [MTKTextureLoader.Option: Any]
if #available(iOS 10.0, *) {
textureLoaderOptions = [.origin : MTKTextureLoader.Origin.bottomLeft]
} else {
textureLoaderOptions = [:]
}
So happy you got your Zombie
Great tutorials, I've really learned a lot from these.
I'm just trying to build on the challenge and make a texture semi-transparent. Unfortunately, all I seem to be able to manage is a semi-transparent texture that also "cuts" through anything that's in the background to reveal the clear colour.
I've got two textured quads, with the opacity of one set to 0.5 and it scaled to 75% of the other's size.
The quads are:
and
The MTKView is cleared with red. When I try to blend them, the result is:
What I'm expecting is:
For my pipeline setup, I'm using:
descriptor.colorAttachments[0].pixelFormat = .bgra8Unorm
descriptor.colorAttachments[0].isBlendingEnabled = true
descriptor.colorAttachments[0].rgbBlendOperation = .add
descriptor.colorAttachments[0].alphaBlendOperation = .add
descriptor.colorAttachments[0].sourceRGBBlendFactor = .sourceAlpha
descriptor.colorAttachments[0].sourceAlphaBlendFactor = .sourceAlpha
descriptor.colorAttachments[0].destinationRGBBlendFactor = .oneMinusSourceAlpha
descriptor.colorAttachments[0].destinationAlphaBlendFactor = .oneMinusSourceAlpha
The Metal shader functions are:
vertex VertexOut vertex_shader(const VertexIn vertex_in [[ stage_in ]], constant ModelMatrix &matrix [[ buffer(1) ]], constant const UniformsStruct &uniforms [[ buffer(2) ]]) {
VertexOut vertex_out;
vertex_out.position = matrix.mvpMatrix * vertex_in.position;
vertex_out.colour = vertex_in.colour;
vertex_out.textureCoordinates = vertex_in.textureCoordinates;
vertex_out.opacity = uniforms.opacity;
return vertex_out;
}
fragment half4 masked_textured_fragment_shader(VertexOut vertex_from_vertex_shader [[ stage_in ]], sampler sampler2d [[ sampler(0) ]], texture2d<float> mask [[ texture(1) ]], texture2d<float> texture [[ texture(0) ]]) {
float4 keyPixel = mask.sample(sampler2d, vertex_from_vertex_shader.textureCoordinates);
float4 colour = texture.sample(sampler2d, vertex_from_vertex_shader.textureCoordinates);
return half4(colour.r * keyPixel.r, colour.g * keyPixel.g, colour.b * keyPixel.b, vertex_from_vertex_shader.opacity);
}
My current best guess is that the pipeline isn't set with the correct options, but changing them doesn't make the two quads blend - it does give some interesting effects, though!
@todd6868 -
Setting the correct pipeline state for the second quad is the only thing you have to do to achieve blending - you don't have to do any calculations in the fragment function.
Try setting up one simple pipeline with none of the blending for the back quad.
Then set up the pipeline as you have above for the front quad.
When you render the two quads, switch the pipeline state so the back quad renders without blending and the front quad renders with blending.
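The two-pipeline setup described above can be sketched as follows. This is a minimal configuration sketch, not the course's exact code; the helper function and the shader function names are assumptions based on the snippets in this thread:

```swift
import MetalKit

// Build either an opaque or a blending pipeline state from one descriptor
// helper, so the only difference between the two quads is the blend setup.
func makePipelineState(device: MTLDevice,
                       library: MTLLibrary,
                       blending: Bool) throws -> MTLRenderPipelineState {
    let descriptor = MTLRenderPipelineDescriptor()
    descriptor.vertexFunction = library.makeFunction(name: "vertex_shader")
    descriptor.fragmentFunction = library.makeFunction(name: "textured_fragment")
    descriptor.colorAttachments[0].pixelFormat = .bgra8Unorm
    if blending {
        // Standard "source over" alpha blending, as in the post above.
        descriptor.colorAttachments[0].isBlendingEnabled = true
        descriptor.colorAttachments[0].rgbBlendOperation = .add
        descriptor.colorAttachments[0].alphaBlendOperation = .add
        descriptor.colorAttachments[0].sourceRGBBlendFactor = .sourceAlpha
        descriptor.colorAttachments[0].sourceAlphaBlendFactor = .sourceAlpha
        descriptor.colorAttachments[0].destinationRGBBlendFactor = .oneMinusSourceAlpha
        descriptor.colorAttachments[0].destinationAlphaBlendFactor = .oneMinusSourceAlpha
    }
    return try device.makeRenderPipelineState(descriptor: descriptor)
}

// In the draw loop, switch states between the two quads:
// commandEncoder.setRenderPipelineState(opaquePipelineState)
// backQuad.render(commandEncoder: commandEncoder, deltaTime: deltaTime)
// commandEncoder.setRenderPipelineState(blendingPipelineState)
// frontQuad.render(commandEncoder: commandEncoder, deltaTime: deltaTime)
```

Because blending is fixed-function state baked into the pipeline state object, switching states per draw call is cheap compared with rebuilding descriptors each frame.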
To get the above result, this is my fragment function for both quads:
fragment float4 fragment_main (VertexOut in [[ stage_in ]],
texture2d<float> texture [[ texture(0) ]]) {
constexpr sampler sampler2d;
float4 color = texture.sample(sampler2d, in.textureCoordinates);
color.a = 0.5; // set opacity. Ignored if blending is not enabled
return color;
}
This is "fixed function" blending - you set the state of the GPU to blend by the use of two different pipeline states.
You can read more about it at Metal By Example
Hi Caroline, I'm trying to add a texture to my object based on this section. When I run the project, the texture looks inside out, like below:
When I changed the .uv value in the vertex_main function like this:
VertexOut out {
.position = uniforms.projectionMatrix * uniforms.viewMatrix
* uniforms.modelMatrix * vertexIn.position,
.worldPosition = (uniforms.modelMatrix * vertexIn.position).xyz,
.worldNormal = uniforms.normalMatrix * vertexIn.normal,
.uv = float2(1-vertexIn.uv.x, vertexIn.uv.y)
};
It looks OK, like below:
Did I do something wrong? I'm loading my texture using the .origin: MTKTextureLoader.Origin.bottomLeft option.
If needed, my project is on GitHub: GitHub - THaliloglu/solarsystem-metal: Solar System demonstration using Apple MetalKit
@thaliloglu - Hi, and welcome to the forums :).
I generally just flip the texture in my texture editor or Preview to suit my code until it works.
The Metal texture coordinates are 0, 0 at the top left. I removed the origin bottomLeft assignment altogether, and also your uv change, so it was just the straight uv numbers, and got this:
Bright green is at 0, 1. So you can see where your model uvs are. You can see that the texture is being sampled as per your uvs, with 0, 1 at the bottom left.
When you use origin bottomLeft, it flips the pixel coordinates of the texture, so you get this:
Which is obviously pixel flipped.
https://developer.apple.com/documentation/metalkit/mtktextureloader/option/1645864-origin
You have the choice of changing your uvs on the sphere, or changing the texture, or changing the code.
P.S. Thank you for posting working code, it makes it easier to see what's going on.
Hi Caroline,
I'm using Blender for the sphere object, and it actually looked OK in Blender when I added the image file to the object for unwrapping.
I've used the mirror-x option after adding the texture file; now the Blender preview doesn't look right, like below:
I'm new to using Blender. For now I will use the mirror-x option, because I don't know what I'm doing wrong.
Thank you so much
- I'm sorry, I steered you wrong. Everything exported from Blender is flipped in our app. It's nothing you're doing wrong.
You can, as you've done, flip it in Blender (which I think is easier), or you can flip it with a matrix in the vertex function.