I have combined chapters 9 and 16 to get a 3D scene with particles.
The app runs with no error message, but on the screen I see only the particles (very nice!).
The renderParticlesEncoder is the last in the commandBuffer.
I do not understand why I don't see the 3D scene. When I comment out the renderParticlesEncoder, the 3D scene renders normally on the screen.
Perhaps a blending problem?
Thank you for your help.
When you have two different render command encoders, only the final one should draw to the view's drawable texture. Have you gone through chapter 14, Multipass Rendering? It shows you how to set up an encoder with a render pass descriptor. The render pass descriptor holds the color attachments you write to. In the final pass, you send those color attachments as texture parameters to the final fragment function and combine them into the final texture.
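Roughly, the multipass setup looks like this. This is only a sketch: names like `sceneTexture`, `drawScene(encoder:)` and `drawFullScreenQuad(encoder:)` are placeholders, not the book's actual API, and the textures and pipeline states are assumed to be created elsewhere.

```swift
// First pass: render the scene into an offscreen texture
let descriptor = MTLRenderPassDescriptor()
descriptor.colorAttachments[0].texture = sceneTexture   // offscreen target
descriptor.colorAttachments[0].loadAction = .clear
descriptor.colorAttachments[0].storeAction = .store     // keep the result for the next pass

guard let firstEncoder =
  commandBuffer.makeRenderCommandEncoder(descriptor: descriptor) else { return }
drawScene(encoder: firstEncoder)
firstEncoder.endEncoding()

// Final pass: draw to the view's drawable, sampling the offscreen texture
guard let viewDescriptor = view.currentRenderPassDescriptor,
      let finalEncoder =
  commandBuffer.makeRenderCommandEncoder(descriptor: viewDescriptor) else { return }
finalEncoder.setFragmentTexture(sceneTexture, index: 0)
drawFullScreenQuad(encoder: finalEncoder)
finalEncoder.endEncoding()
```

The key point is that only the final encoder uses the view's `currentRenderPassDescriptor`; the earlier pass stores its result into a texture that the final fragment function reads.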
The second half of chapter 20, Advanced Lighting, also uses multiple render passes.
However, do you really need two render passes? You can change the pipeline state on a render encoder, so the same render encoder can perform a draw using a different pipeline state (and therefore different shader functions) into the view's drawable texture. You should only need multiple passes for things like shadowing, deferred rendering or reflections, i.e. when the result of one render texture is needed in the next.
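The single-encoder approach can be sketched like this. Again, this is only illustrative: `scenePipelineState`, `particlePipelineState`, `drawScene(encoder:)` and `drawParticles(encoder:)` are assumed names, and the particle pipeline would be the one configured with blending enabled.

```swift
guard let descriptor = view.currentRenderPassDescriptor,
      let encoder =
  commandBuffer.makeRenderCommandEncoder(descriptor: descriptor) else { return }

// Draw the 3D scene first with its own pipeline state
encoder.setRenderPipelineState(scenePipelineState)
drawScene(encoder: encoder)

// Then switch pipeline state and draw the particles on top
encoder.setRenderPipelineState(particlePipelineState)
drawParticles(encoder: encoder)

encoder.endEncoding()
```

Because both draws go through the same encoder and render pass, the particles composite over the scene instead of replacing it in the drawable.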
Thank you very much for your answer. I now understand where the problem comes from. Indeed, I went through chapter 14 much too fast! In fact, everything is written in this great book!
I am quite taken with the solution you suggest: use a single render encoder and change the pipeline state to call the right functions. This is what is done in chapter 9 with the submeshes. I will try to implement it this week and come back to keep you informed.
Thank you again for your help.
Hello Caroline, I implemented your solution with a single encoder, and now it works very well. Thank you very much for your advice!
@jplozach: Great - well done!