Camera Capture, Variable Scope & "Main thread" purple errors

I am new to Swift and iOS dev, and this is the first app I am messing about with.

I have a Core ML image-classification task that takes the camera capture stream from the iOS device's video camera and runs in the background. Once objects have been identified, and other app logic has run, I would like to update a UILabel with some of the data.

I have used public sources and tutorial content to get to my current state, which works except for getting the data onto the UILabel (printing to the console works as expected). Most recently, I made some edits to add the @escaping closure, on someone's suggestion that this would resolve the issue. However, the same purple "main thread" warning appears, and the variable reads as zero when I call out to the main thread; it now happens inside the captureOutput function's switch statement.

Can someone explain how the callout to DispatchQueue.main.async(execute: { }) is able to access the variable(s) I have been working with? Is this essentially a scoping issue?

The code I am currently using:

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    processCameraBuffer(sampleBuffer: sampleBuffer) { weightedResult in
        print("captureOutput Result: \(weightedResult)") // works as expected
        switch weightedResult {
        case _ where weightedResult >= 10:
            DispatchQueue.main.async {
                let percentFormatted = String(format: "%.2f", weightedResult / 65 * 100)
                self.labelPrediction.text = "Golf Ball: \(percentFormatted)%" // always displays as 0.00%
            }
            let vibrate = SystemSoundID(kSystemSoundID_Vibrate)
            AudioServicesPlaySystemSound(vibrate) // works
        default:
            DispatchQueue.main.async {
                self.labelPrediction.text = "No Golf Ball..."
            }
        }
    }
}
func processCameraBuffer(sampleBuffer: CMSampleBuffer, completion: @escaping (Int) -> Void) {
    let coreMLModel = Inceptionv3()
    if let model = try? VNCoreMLModel(for: coreMLModel.model) {
        let request = VNCoreMLRequest(model: model, completionHandler: { (request, error) in
            if let results = request.results as? [VNClassificationObservation] {
                var counter = 0
                var weightedResult = 0
                for item in results[0...9] {
                    if item.identifier.contains("something") {
                        //some coding logic
                        weightedResult = something
                    }
                    counter += 1
                }
                print("Weighted: \(weightedResult)")
                completion(weightedResult)
            }
        })

        if let pixelBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
            let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
            do {
                try handler.perform([request])
            } catch {
                print(error)
            }
        }
    }
}

Many thanks, hope I’ve explained this well enough…

I don’t think it is a scope issue.
I think it is because you made the value in the completion an Int. When you divide the integer 10 by 65, you get the integer 0, which stays 0 when you multiply by 100.
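To see the truncation concretely, here is a standalone sketch using a stand-in value of 10 for weightedResult (hypothetical, just to illustrate):

```swift
// Int division truncates toward zero, so any weightedResult below 65 becomes 0.
let weightedResult = 10            // Int, matching the (Int) -> Void completion signature
print(weightedResult / 65)         // 0 — truncated before the multiplication happens
print(weightedResult / 65 * 100)   // 0 — multiplying by 100 can't recover the lost fraction
```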
If you change the declaration of processCameraBuffer to read

… completion: @escaping (Float) …

I think it will work. Or you could do

Float(weightedResult) / 65 * 100

for calculating percentFormatted.
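For example, again with a stand-in weightedResult of 10, promoting to Float before dividing keeps the fractional part:

```swift
import Foundation

let weightedResult = 10                          // Int, as produced by the classifier loop
let percent = Float(weightedResult) / 65 * 100   // Float division, no truncation
print(String(format: "%.2f", percent))           // prints "15.38"
```

This also fixes the formatting call itself: passing an Int to a "%.2f" format specifier is what was producing the "0.00%" label, since %f expects a floating-point argument.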
