Video synthesis

AVMutableComposition

The AVFoundation framework provides a rich set of classes for audio and video editing. The key concept is the composition, which combines tracks from one or more existing assets into a new asset. The AVMutableComposition class provides an interface for inserting and deleting tracks and for managing their ordering in time. The figure below shows how a composition is assembled from existing assets. If all you need is to merge multiple assets sequentially into a single file, a composition alone is enough. But if you want to apply any custom processing to the audio or video tracks, you need to mix the audio and compose the video separately.

As the figure shows, we load two local media resources as AVAsset objects, each with three tracks: two video tracks and one audio track. From these two assets we create an AVMutableComposition to serve as the composite output, adding an AVMutableCompositionTrack for each corresponding AVAssetTrack.

AVMutableAudioMix

The AVMutableAudioMix class performs custom processing on the audio tracks of a composition. Through it you can set the volume of an audio track and apply volume ramps (fade effects) over time.
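As a minimal sketch (assuming `compositionAudioTrack` is an audio track already added to the composition and `duration` is the composition's total duration, both of which appear later in this article), a two-second fade-out could be set up like this:

// Fade the audio track out over its final two seconds.
AVMutableAudioMixInputParameters *params =
    [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:compositionAudioTrack];
CMTime fadeDuration = CMTimeMake(2, 1);
CMTime fadeStart = CMTimeSubtract(duration, fadeDuration);
[params setVolumeRampFromStartVolume:1.0 toEndVolume:0.0
                           timeRange:CMTimeRangeMake(fadeStart, fadeDuration)];

AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
audioMix.inputParameters = @[params];
// Later, hand this to the export session: exporter.audioMix = audioMix;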

AVMutableVideoComposition

The AVMutableVideoComposition class processes the video tracks directly. When you output video from a video composition, you can specify the render size, scale, and frame rate. Through video composition instructions (AVMutableVideoCompositionInstruction) you can set the video's background color and attach layer instructions. Layer instructions (AVMutableVideoCompositionLayerInstruction) apply opacity ramps, transforms, and transform ramps to a video track. A video composition can also apply Core Animation effects to the video through its animationTool property.
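As a minimal sketch of the built-in (non-custom) pipeline, assuming `videoComposition`, `compositionVideoTrack`, and `timeRange` come from a composition like the one built later in this article, a fade-out via a layer instruction looks like this:

// One instruction covering the clip, with a layer instruction that fades the track out.
AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = timeRange;

AVMutableVideoCompositionLayerInstruction *layerInstruction =
    [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack];
[layerInstruction setOpacityRampFromStartOpacity:1.0 toEndOpacity:0.0 timeRange:timeRange];

instruction.layerInstructions = @[layerInstruction];
videoComposition.instructions = @[instruction];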

AVAssetExportSession

To combine the audio and video and write them out, use AVAssetExportSession. Initialize an export session with the composition, then set its audioMix and videoComposition properties respectively.
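A short sketch of that wiring, assuming the `composition`, `audioMix`, and `videoComposition` objects built above (the full export code appears at the end of this article):

AVAssetExportSession *exporter =
    [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
exporter.audioMix = audioMix;                 // custom audio processing
exporter.videoComposition = videoComposition; // custom video processing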

The whole process

Here is a simple video composition example, which gives us an intuitive understanding of the key steps of video editing:

  1. Obtain the video resources (AVAsset).
  2. Create the mutable composition object (AVMutableComposition).
  3. Create the video composition (AVMutableVideoComposition). This class handles the edits applied to the video: you can set the desired render size, scale, and frame duration, and it manages the instructions that drive the composition.
  4. Create a class that conforms to the AVVideoCompositing protocol and assign it to customVideoCompositorClass. This class defines the properties and methods of the custom video compositor.
  5. Add the resource data to the mutable composition as tracks (AVMutableCompositionTrack); generally two per clip, an audio track and a video track.
  6. Create the video composition instructions (AVMutableVideoCompositionInstruction), which manage the instructions applied to the composition.
  7. Create the layer instructions (AVMutableVideoCompositionLayerInstruction), which control how each video frame is applied and combined: when a sub-video appears and disappears in the final video, its size, its animation, and so on.
  8. Create the export session object (AVAssetExportSession), which renders a new video according to the videoComposition and writes it to a specified file path.

A worked example

Create AVAsset

Create the two video assets that will be composited:

NSURL *url1 = [[NSBundle mainBundle] URLForResource:@"cat.mp4" withExtension:nil];
NSURL *url2 = [[NSBundle mainBundle] URLForResource:@"girl.mp4" withExtension:nil];
self.assets = @[[AVAsset assetWithURL:url1], [AVAsset assetWithURL:url2]];
self.editor = [[SimpleEditor alloc] initWithClips:self.assets];

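One caveat: assetWithURL: does not load the asset's data synchronously, so properties such as tracks and duration may not be immediately available. A sketch of loading the needed keys first, using the standard AVAsynchronousKeyValueLoading API:

// Load the keys the editor needs before touching asset.tracks / asset.duration.
for (AVAsset *asset in self.assets) {
    [asset loadValuesAsynchronouslyForKeys:@[@"tracks", @"duration"] completionHandler:^{
        NSError *error = nil;
        if ([asset statusOfValueForKey:@"tracks" error:&error] == AVKeyValueStatusLoaded) {
            // Safe to query the asset's tracks and duration here.
        }
    }];
}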
Create AVMutableComposition

You can create a custom composition with the AVMutableComposition class, and add one or more custom tracks to it with the AVMutableCompositionTrack class.

AVAssetTrack *clipVideoTrack = [[[self.clips objectAtIndex:0] tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
CGSize videoSize = clipVideoTrack.naturalSize;
AVMutableComposition *composition = [AVMutableComposition composition];
// Use the first clip's video size as the composition size.
composition.naturalSize = videoSize;
Create AVMutableVideoComposition

The AVMutableVideoComposition object applies custom processing to the video tracks of the composition. With a video composition you can also specify the render size, scale, and frame rate of the video tracks.

AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.customVideoCompositorClass = [CustomVideoCompositor class];

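A video composition built from scratch like this also needs its frame rate and render size set before it can be used for export (the article's helper class presumably configures these elsewhere); a short sketch, reusing the videoSize computed earlier:

videoComposition.frameDuration = CMTimeMake(1, 30); // 30 fps
videoComposition.renderSize = videoSize;            // taken from the first clip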
Create CustomVideoCompositor

CustomVideoCompositor conforms to the AVVideoCompositing protocol, which declares the following members:

@protocol AVVideoCompositing <NSObject>

// The pixel buffer attributes the compositor accepts for source frames.
@property (nonatomic, readonly, nullable) NSDictionary<NSString *, id> *sourcePixelBufferAttributes;

// The pixel buffer attributes required of new buffers the compositor creates.
@property (nonatomic, readonly) NSDictionary<NSString *, id> *requiredPixelBufferAttributesForRenderContext;

// Called to notify the compositor that rendering will switch to a different render context.
- (void)renderContextChanged:(AVVideoCompositionRenderContext *)newRenderContext;

// Called for each video frame that needs to be composed.
- (void)startVideoCompositionRequest:(AVAsynchronousVideoCompositionRequest *)asyncVideoCompositionRequest;

@optional
// Cancel all pending composition requests.
- (void)cancelAllPendingVideoCompositionRequests;

@end
Implement CustomVideoCompositor
// The pixel buffer attributes the compositor accepts for source frames.
- (NSDictionary *)sourcePixelBufferAttributes {
    return @{(NSString *)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithUnsignedInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange],
             (NSString *)kCVPixelBufferOpenGLESCompatibilityKey : [NSNumber numberWithBool:YES]};
}

// The pixel buffer attributes required of new buffers the compositor creates.
- (NSDictionary *)requiredPixelBufferAttributesForRenderContext {
    return @{(NSString *)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithUnsignedInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange],
             (NSString *)kCVPixelBufferOpenGLESCompatibilityKey : [NSNumber numberWithBool:YES]};
}

// Called when rendering will switch to a different render context.
- (void)renderContextChanged:(nonnull AVVideoCompositionRenderContext *)newRenderContext {
}

// Called for each video frame that needs to be composed.
- (void)startVideoCompositionRequest:(nonnull AVAsynchronousVideoCompositionRequest *)request {
    @autoreleasepool {
        dispatch_async(_renderingQueue, ^{
            if (self.shouldCancelAllRequests) {
                [request finishCancelledRequest];
            } else {
                NSError *err = nil;
                // Get the next rendered pixel buffer.
                CVPixelBufferRef resultPixels = [self newRenderedPixelBufferForRequest:request error:&err];
                if (resultPixels) {
                    // Hand the composed frame back to the request, then release our +1 reference.
                    [request finishWithComposedVideoFrame:resultPixels];
                    CFRelease(resultPixels);
                } else {
                    [request finishWithError:err];
                }
            }
        });
    }
}

- (CVPixelBufferRef)newRenderedPixelBufferForRequest:(AVAsynchronousVideoCompositionRequest *)request error:(NSError **)errOut {
    CustomVideoCompositionInstruction *currentInstruction =
        (CustomVideoCompositionInstruction *)request.videoCompositionInstruction;
    // This compositor simply passes the source frame for the instruction's track through;
    // a real compositor would render the sources into a new buffer here.
    CVPixelBufferRef dstPixels = [request sourceFrameByTrackID:currentInstruction.trackID];
    CVPixelBufferRetain(dstPixels); // the "new..." name implies a +1 reference for the caller
    return dstPixels;
}

// Cancel all pending composition requests.
- (void)cancelAllPendingVideoCompositionRequests {
    _shouldCancelAllRequests = YES;
    dispatch_barrier_async(_renderingQueue, ^{
        self.shouldCancelAllRequests = NO;
    });
}
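The CustomVideoCompositionInstruction class used above is not an AVFoundation type and its definition is not shown in the original code. As a hypothetical sketch of its interface: a custom instruction must conform to the AVVideoCompositionInstruction protocol, and the initializer below matches the one called later in buildTransitionComposition:.

// Hypothetical interface for the custom instruction class (not part of AVFoundation).
@interface CustomVideoCompositionInstruction : NSObject <AVVideoCompositionInstruction>

// The track whose frames this instruction composes.
@property (nonatomic, assign) CMPersistentTrackID trackID;

// Required by the AVVideoCompositionInstruction protocol.
@property (nonatomic, assign) CMTimeRange timeRange;
@property (nonatomic, assign) BOOL enablePostProcessing;
@property (nonatomic, assign) BOOL containsTweening;
@property (nonatomic, strong, nullable) NSArray *requiredSourceTrackIDs;
@property (nonatomic, assign) CMPersistentTrackID passthroughTrackID;

- (instancetype)initTransitionWithSourceTrackIDs:(NSArray *)sourceTrackIDs
                                    forTimeRange:(CMTimeRange)timeRange;

@end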
Fill the tracks with material and create the instructions
- (void)buildTransitionComposition:(AVMutableComposition *)composition andVideoComposition:(AVMutableVideoComposition *)videoComposition {
    NSUInteger clipsCount = self.clips.count;
    CMTime nextClipStartTime = kCMTimeZero;

    // Add two video tracks and two audio tracks
    AVMutableCompositionTrack *compositionVideoTracks[2];
    AVMutableCompositionTrack *compositionAudioTracks[2];
    compositionVideoTracks[0] = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    compositionVideoTracks[1] = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    compositionAudioTracks[0] = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    compositionAudioTracks[1] = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

    CMTimeRange *timeRanges = alloca(sizeof(CMTimeRange) * clipsCount);

    // Fill the tracks with the AVAssetTrack material from each clip
    for (int i = 0; i < clipsCount; i++) {
        AVAsset *asset = [self.clips objectAtIndex:i];
        CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, [asset duration]);

        AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        [compositionVideoTracks[i] insertTimeRange:timeRange ofTrack:clipVideoTrack atTime:nextClipStartTime error:nil];

        AVAssetTrack *clipAudioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
        [compositionAudioTracks[i] insertTimeRange:timeRange ofTrack:clipAudioTrack atTime:nextClipStartTime error:nil];

        timeRanges[i] = CMTimeRangeMake(nextClipStartTime, timeRange.duration);
        nextClipStartTime = CMTimeAdd(nextClipStartTime, timeRange.duration);
    }

    NSMutableArray *instructions = [NSMutableArray array];
    for (int i = 0; i < clipsCount; i++) {
    // Create the video composition instruction for this clip
        CustomVideoCompositionInstruction *videoInstruction = [[CustomVideoCompositionInstruction alloc] initTransitionWithSourceTrackIDs:@[@(compositionVideoTracks[i].trackID)] forTimeRange:timeRanges[i]];
        videoInstruction.trackID = compositionVideoTracks[i].trackID;
        [instructions addObject:videoInstruction];
    }
    videoComposition.instructions = instructions;
}

Output the video
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:self.editor.composition presetName:AVAssetExportPresetHighestQuality];
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.timeRange = CMTimeRangeMake(kCMTimeZero, duration);
exporter.outputURL = [NSURL fileURLWithPath:cachesDir];
exporter.shouldOptimizeForNetworkUse = YES;
exporter.videoComposition = self.editor.videoComposition;
[exporter exportAsynchronouslyWithCompletionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        if (exporter.status == AVAssetExportSessionStatusCompleted) {
            NSLog(@"export succeeded");
        } else {
            NSLog(@"export failed - %@", exporter.error);
        }
    });
}];
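One practical note: the export typically fails if a file already exists at outputURL (and cachesDir here must be a full file path, not a directory), so clearing the destination first is a common pattern:

// Remove any leftover file at the destination before calling exportAsynchronously.
[[NSFileManager defaultManager] removeItemAtURL:[NSURL fileURLWithPath:cachesDir] error:nil];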
