In this iOS video depth maps tutorial, learn how to harness iOS 11's incredibly powerful depth maps to apply real-time video filters and create a special effects masterpiece!
Do you mean pure black and pure white pixels, ignoring the grey ones? Take a look at the file CVPixelBufferExtension.swift. Inside is a function called clamp(). This function loops over every pixel in a CVPixelBuffer and clamps its value so it is no less than 0.0 and no more than 1.0.
You could write a similar function that counts the number of black pixels (0.0) and the number of white pixels (1.0) and divides those counts by the total number of pixels to get a percentage.
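As a rough sketch of that counting logic: in CVPixelBufferExtension.swift you would wrap this in an extension on CVPixelBuffer (locking the base address and walking the Float32 depth values, just as clamp() does), but the core math is the same as on a plain array of floats. The function name below is illustrative, not from the tutorial.

```swift
import Foundation

// Counts pure-black (0.0) and pure-white (1.0) pixels and returns each
// count as a fraction of the total pixel count. Grey values are ignored.
func blackAndWhitePercentages(of pixels: [Float]) -> (black: Double, white: Double) {
    guard !pixels.isEmpty else { return (0, 0) }
    var blackCount = 0
    var whiteCount = 0
    for value in pixels {
        if value == 0.0 { blackCount += 1 }   // pure black
        if value == 1.0 { whiteCount += 1 }   // pure white
    }
    let total = Double(pixels.count)
    return (Double(blackCount) / total, Double(whiteCount) / total)
}

// Example: 2 black, 1 white, 1 grey out of 4 pixels
let (black, white) = blackAndWhitePercentages(of: [0.0, 0.0, 1.0, 0.5])
print(black, white)  // 0.5 0.25
```

In the real CVPixelBuffer version you would call CVPixelBufferLockBaseAddress before reading, bind the base address to a Float pointer, and unlock when done, mirroring the pattern clamp() uses.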