Kodeco Forums

How to Play, Record, and Edit Videos in iOS

This is a blog post by iOS Tutorial Team member Abdul Azeem, software architect and co-founder at Datainvent Systems, a software development and IT services company. Update 8/14/12: Fixes and clarifications made by Joseph Neuman. Recording videos (and playing around with them programmatically) is one of the coolest things you can do with your phone, […]


This is a companion discussion topic for the original entry at https://www.raywenderlich.com/2902-how-to-play-record-and-edit-videos-in-ios

Neither of my video files contains audio, which is why I'm getting a nil value error from tracksWithMediaType(AVMediaTypeVideo). What is the proper way to handle this?
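If a clip may be missing a track, it's safer to guard on the track array than to force-index [0] the way the tutorial does. A minimal Swift 3-style sketch, assuming a composition built like the tutorial's (addAudioIfPresent is a hypothetical helper name):

import AVFoundation

// Only add an audio track to the composition when the clip actually has one;
// tracks(withMediaType:) returns an empty array for silent clips.
func addAudioIfPresent(from asset: AVAsset, to composition: AVMutableComposition) {
    guard let sourceAudio = asset.tracks(withMediaType: AVMediaTypeAudio).first else {
        print("Clip has no audio track; merging video only.")
        return
    }
    let audioTrack = composition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid)
    // try? avoids a crash if the insert fails; inspect the error in real code.
    try? audioTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, asset.duration), of: sourceAudio, at: kCMTimeZero)
}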

Hi,

This is Sharukh, an iOS developer from India. I've followed this tutorial and built an app in Objective-C that lets users merge multiple videos into a single video file. It worked fine before iOS 10, but since iOS 10 the video merging feature no longer works. Can anybody tell me how to merge videos on iOS 10 using Objective-C?

Any help would be highly appreciated.
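Hard to say without seeing the code, but one common breaking change in iOS 10 is the stricter privacy model: the app must declare NSPhotoLibraryUsageDescription (plus NSCameraUsageDescription and NSMicrophoneUsageDescription if it records) in Info.plist, or it is terminated the moment it touches the photo library. It's also worth checking authorization before merging; a small Swift sketch of that check (the Objective-C equivalent is [PHPhotoLibrary requestAuthorization:]):

import Photos

// iOS 10 kills the app on photo-library access unless Info.plist declares
// NSPhotoLibraryUsageDescription; request authorization before loading clips.
PHPhotoLibrary.requestAuthorization { status in
    guard status == .authorized else {
        print("Photo library access denied; can't load or save merged videos.")
        return
    }
    // Safe to pick assets and run the merge/export here.
}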

Since this tutorial is very old, I had to struggle quite a bit to write the app in Swift 3. Also, the rotation part does not work at all if you follow the tutorial: you actually need to scale, translate, and rotate the video to get videos shot in portrait to work.

Below is the code I worked out for merging two videos taken in portrait.

// A portrait clip carries a 90-degree rotation in its preferredTransform;
// a landscape clip carries the identity (or a 180-degree rotation).
func assetIsPortrait(assetTrack: AVAssetTrack) -> Bool {
    let trackTransform: CGAffineTransform = assetTrack.preferredTransform
    if (trackTransform.a == 0 && trackTransform.b == 1.0 && trackTransform.c == -1.0 && trackTransform.d == 0) {
        return true  // rotated 90 degrees: portrait
    }
    if (trackTransform.a == 0 && trackTransform.b == -1.0 && trackTransform.c == 1.0 && trackTransform.d == 0) {
        return true  // rotated 270 degrees: portrait, upside down
    }
    if (trackTransform.a == 1.0 && trackTransform.b == 0 && trackTransform.c == 0 && trackTransform.d == 1.0) {
        return false // identity: landscape
    }
    if (trackTransform.a == -1.0 && trackTransform.b == 0 && trackTransform.c == 0 && trackTransform.d == -1.0) {
        return false // rotated 180 degrees: landscape, upside down
    }
    return true // default to portrait for any other transform
}

// The composition's render size must be big enough for every clip;
// portrait tracks report a landscape naturalSize, so swap width and height.
func renderSizeForTracks(assetTracks: [AVAssetTrack]) -> CGSize {
    var renderWidth: CGFloat = 0
    var renderHeight: CGFloat = 0
    for assetTrack: AVAssetTrack in assetTracks {
        if (self.assetIsPortrait(assetTrack: assetTrack)) {
            // naturalSize ignores the rotation, so its height is the on-screen width.
            renderWidth = max(renderWidth, assetTrack.naturalSize.height)
            renderHeight = max(renderHeight, assetTrack.naturalSize.width)
        } else {
            renderWidth = max(renderWidth, assetTrack.naturalSize.width)
            renderHeight = max(renderHeight, assetTrack.naturalSize.height)
        }
    }
    return CGSize(width: renderWidth, height: renderHeight)
}

// Scale each clip up to fill the screen. Note this mixes UIScreen.main.bounds
// (points) with nativeBounds (pixels); pick one unit consistently if that
// matters for your target devices.
func scaleFactorForAsset(assetTrack: AVAssetTrack) -> CGSize {
    if (!self.assetIsPortrait(assetTrack: assetTrack)) {
        // Landscape: a clip already larger than the screen needs no scaling.
        if (assetTrack.naturalSize.width > UIScreen.main.nativeBounds.width && assetTrack.naturalSize.height > UIScreen.main.nativeBounds.height) {
            return CGSize(width: 1.0, height: 1.0)
        }
        let widthRatio: CGFloat = UIScreen.main.bounds.width / assetTrack.naturalSize.width
        let heightRatio: CGFloat = UIScreen.main.bounds.height / assetTrack.naturalSize.height
        // Use the smaller ratio on both axes to scale uniformly without cropping.
        return CGSize(width: min(widthRatio, heightRatio), height: min(widthRatio, heightRatio))
    } else {
        // Portrait: naturalSize is reported pre-rotation, so compare against
        // the screen dimensions swapped.
        if (assetTrack.naturalSize.width > UIScreen.main.nativeBounds.height && assetTrack.naturalSize.height > UIScreen.main.nativeBounds.width) {
            return CGSize(width: 1.0, height: 1.0)
        }
        let widthRatio: CGFloat = UIScreen.main.nativeBounds.height / assetTrack.naturalSize.width
        let heightRatio: CGFloat = UIScreen.main.nativeBounds.width / assetTrack.naturalSize.height
        return CGSize(width: min(widthRatio, heightRatio), height: min(widthRatio, heightRatio))
    }
}

@IBAction func mergeVideos(_ sender: UIButton) {
    if (self.firstAsset != nil && self.secondAsset != nil) {
        self.spinner.startAnimating()
        let mixComposition: AVMutableComposition = AVMutableComposition()
        
        // Put each clip on its own video track, back to back on the timeline.
        let firstTrack: AVMutableCompositionTrack = mixComposition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
        let firstAssetTrack: AVAssetTrack = self.firstAsset!.tracks(withMediaType: AVMediaTypeVideo)[0]
        try! firstTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, self.firstAsset!.duration), of: firstAssetTrack, at: kCMTimeZero)
        
        let secondTrack: AVMutableCompositionTrack = mixComposition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
        let secondAssetTrack: AVAssetTrack = self.secondAsset!.tracks(withMediaType: AVMediaTypeVideo)[0]
        try! secondTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, self.secondAsset!.duration), of: secondAssetTrack, at: self.firstAsset!.duration)
        
        // Optional soundtrack spanning the full merged duration.
        if (self.audioAsset != nil) {
            let audioTrack: AVMutableCompositionTrack = mixComposition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid)
            try! audioTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, CMTimeAdd(self.firstAsset!.duration, self.secondAsset!.duration)), of: self.audioAsset!.tracks(withMediaType: AVMediaTypeAudio)[0], at: kCMTimeZero)
        }
        // One instruction covering the whole timeline.
        let mainInstruction: AVMutableVideoCompositionInstruction = AVMutableVideoCompositionInstruction()
        mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeAdd(self.firstAsset!.duration, self.secondAsset!.duration))
        
        // First clip: scale, translate, then rotate so portrait footage plays upright.
        let firstLayerInstruction: AVMutableVideoCompositionLayerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: firstTrack)
        var firstTransform: CGAffineTransform = self.firstAsset!.preferredTransform
        let firstScale: CGSize = self.scaleFactorForAsset(assetTrack: firstAssetTrack)
        let firstScaleTransform = CGAffineTransform(scaleX: firstScale.width, y: firstScale.height)
        if (self.assetIsPortrait(assetTrack: firstAssetTrack)) {
            // Translate right by the rotated width so the 90-degree rotation lands on screen.
            let translateTransform = firstScaleTransform.translatedBy(x: firstAssetTrack.naturalSize.height, y: 0)
            firstTransform = translateTransform.rotated(by: CGFloat.pi / 2)
        }
        
        firstLayerInstruction.setTransform(firstTransform, at: kCMTimeZero)
        // Hide the first clip once it ends so it doesn't sit on top of the second.
        firstLayerInstruction.setOpacity(0, at: self.firstAsset!.duration)
        
        // Second clip: same treatment, but it stays visible to the end.
        let secondLayerInstruction: AVMutableVideoCompositionLayerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: secondTrack)
        var secondTransform = self.secondAsset!.preferredTransform
        let secondScale: CGSize = self.scaleFactorForAsset(assetTrack: secondAssetTrack)
        let secondScaleTransform = CGAffineTransform(scaleX: secondScale.width, y: secondScale.height)
        if (self.assetIsPortrait(assetTrack: secondAssetTrack)) {
            let translateTransform = secondScaleTransform.translatedBy(x: secondAssetTrack.naturalSize.height, y: 0)
            secondTransform = translateTransform.rotated(by: CGFloat.pi / 2)
        }
        
        secondLayerInstruction.setTransform(secondTransform, at: kCMTimeZero)
        
        mainInstruction.layerInstructions = [firstLayerInstruction, secondLayerInstruction]
        let mainCompositionInstructions: AVMutableVideoComposition = AVMutableVideoComposition()
        mainCompositionInstructions.instructions = [mainInstruction]
        mainCompositionInstructions.frameDuration = CMTimeMake(1, 30) // 30 fps
        mainCompositionInstructions.renderSize = self.renderSizeForTracks(assetTracks: [firstAssetTrack, secondAssetTrack])
        
        // Write to a unique file in the temp directory. The extension must match
        // the output file type below: .mov for AVFileTypeQuickTimeMovie.
        let savePathUrl = URL(fileURLWithPath: NSTemporaryDirectory().appending("mergedVideo-\(arc4random() % 1000).mov"))
        
        let exporter: AVAssetExportSession = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)!
        exporter.outputURL = savePathUrl
        exporter.outputFileType = AVFileTypeQuickTimeMovie
        exporter.shouldOptimizeForNetworkUse = true
        exporter.videoComposition = mainCompositionInstructions
        exporter.exportAsynchronously(completionHandler: {
            // The completion handler runs on a background queue; hop back to
            // the main queue before touching any UI.
            DispatchQueue.main.async {
                self.exportDidFinish(session: exporter)
            }
        })
    } else {
        let alert: UIAlertController = self.dismissAlertWithTitleAndMessage(title: "Error", message: "Please first select videos to merge (audio optional)")
        self.presentAlert(alert: alert)
    }
}
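One gap worth filling: exportDidFinish(session:) is called above but not shown. A minimal sketch, assuming you want to copy the merged file into the photo library with the Photos framework (PHPhotoLibrary replaces the deprecated ALAssetsLibrary the original tutorial used; spinner is the same outlet as above):

import Photos // add at the top of the view controller's file

func exportDidFinish(session: AVAssetExportSession) {
    self.spinner.stopAnimating()
    guard session.status == .completed, let outputURL = session.outputURL else {
        print("Export failed: \(session.error?.localizedDescription ?? "unknown error")")
        return
    }
    // Copy the merged movie from the temp directory into the photo library.
    PHPhotoLibrary.shared().performChanges({
        _ = PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: outputURL)
    }, completionHandler: { success, error in
        DispatchQueue.main.async {
            print(success ? "Video saved." : "Save failed: \(error?.localizedDescription ?? "unknown error")")
        }
    })
}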

Please update this tutorial.

I have a doubt about this part of the tutorial:
UIImageOrientation FirstAssetOrientation_ = UIImageOrientationUp;
or
UIImageOrientation SecondAssetOrientation_ = UIImageOrientationUp;

It doesn't work for me. Why not?
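For what it's worth, those variables come from the tutorial's step that derives a UIImageOrientation from each track's preferredTransform. A Swift sketch of that mapping, mirroring the assetIsPortrait checks in the post above (orientationForTrack is a hypothetical helper name):

import AVFoundation
import UIKit

// Map a track's preferredTransform onto the tutorial's orientation values.
func orientationForTrack(assetTrack: AVAssetTrack) -> UIImageOrientation {
    let t = assetTrack.preferredTransform
    if (t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0) {
        return .right // rotated 90 degrees: portrait
    }
    if (t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0) {
        return .left  // rotated 270 degrees: portrait, upside down
    }
    if (t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0) {
        return .down  // rotated 180 degrees: landscape, upside down
    }
    return .up        // identity: landscape (the tutorial's default)
}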

Please update the tutorial.

Please post the complete code… it doesn't work for me.

This tutorial is more than six months old, so questions are no longer supported for it at the moment. Thank you!