This article covers capturing (trimming) video and adding background music. The classes and methods involved have long names, but they are fairly simple to use once you understand a few multimedia concepts. Let's first introduce several commonly used AVFoundation classes:

  • AVURLAsset: a subclass of AVAsset, mainly used to read multimedia information such as the video and audio track types, duration, and frame rate; it can also be used to grab a thumbnail at a specified position in the video.
  • AVMutableCompositionTrack: both video and audio capture go through this class; it can be thought of as the track object corresponding to one video or audio resource.
  • AVMutableComposition: a mutable subclass of AVAsset; its class method [AVMutableComposition composition] returns a new, empty AVMutableComposition object.
  • CMTime: not the minutes-and-seconds time we usually talk about; it is covered in detail below.
  • AVAssetExportSession: merges the video and audio you collected and saves the result as a new file; you can set the output file type and path, and it exposes an AVAssetExportSessionStatus describing the state of the merge.

A separate utility class, MediaManager, is created to handle all of this.

Here is the method interface in MediaManager.h:
#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>

/** Completion callback block for adding music */
typedef void (^MixcompletionBlock)(void);

@interface MediaManager : NSObject

/** Capture video and add background music */
+ (void)addBackgroundMiusicWithVideoUrlStr:(NSURL *)videoUrl
                                  audioUrl:(NSURL *)audioUrl
                  andCaptureVideoWithRange:(NSRange)videoRange
                                completion:(MixcompletionBlock)completionHandle;

/** Get the duration of a multimedia file */
+ (CGFloat)getMediaDurationWithMediaUrl:(NSString *)mediaUrlStr;

/** Get the merged multimedia file path */
+ (NSString *)getMediaFilePath;

@end
MediaManager.m

In the method that adds background music, first create the AVURLAsset objects for the video and the audio:

AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:audioUrl options:nil];
AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:videoUrl options:nil];

// Create the AVMutableComposition object that the video/audio AVMutableCompositionTrack objects will be added to
AVMutableComposition *mixComposition = [AVMutableComposition composition];
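A side note: asset properties such as duration are loaded lazily, so reading videoAsset.duration immediately after creation may block or return an unready value. A minimal sketch (not part of the original flow) of loading it up front with AVFoundation's asynchronous key loading:

[videoAsset loadValuesAsynchronouslyForKeys:@[@"duration"] completionHandler:^{
    NSError *error = nil;
    if ([videoAsset statusOfValueForKey:@"duration" error:&error] == AVKeyValueStatusLoaded) {
        // videoAsset.duration is now safe to read
    }
}];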

When we capture a video, the time point and length of the capture are inevitably involved, so let's first look at CMTime and CMTimeRange in detail.

  • CMTimeMake(int64_t value, int32_t timescale): timescale is the number of time units per second (for video, effectively the frames per second), and value/timescale gives the actual time in seconds. The timescale generally stays fixed, and you adjust the captured length by changing value.

CMTimeMakeWithSeconds(Float64 seconds, int32_t preferredTimeScale) can also be used; here seconds is the time in seconds, and preferredTimeScale is the number of units (frames) per second.

  • CMTimeRange is similar to NSRange, except that its fields are the start time and duration of the video, both of CMTime type; you can create one with CMTimeRangeMake(start, duration). My method just takes an NSRange and does the conversion internally, as shown in the sketch after this list.
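To make that conversion concrete, here is a small sketch (with made-up numbers) of how an NSRange expressed in seconds maps onto a CMTimeRange, roughly what the method does internally:

NSRange videoRange = NSMakeRange(2, 5);   // start at second 2, capture 5 seconds
int32_t timescale = 600;                  // illustrative; the real code uses videoAsset.duration.timescale

CMTime startTime = CMTimeMakeWithSeconds(videoRange.location, timescale); // value 1200, 1200/600 = 2 seconds
CMTime duration  = CMTimeMakeWithSeconds(videoRange.length, timescale);   // value 3000, 3000/600 = 5 seconds
CMTimeRange range = CMTimeRangeMake(startTime, duration);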

After understanding these, you can start to collect the video and audio. The video capture comes first; if needed, you can also pull in the video's original audio track the same way. One thing to watch out for: tracksWithMediaType: can return an empty array (for example, a video with no audio track), and passing a missing track straight to insertTimeRange: will crash the app, which is why the code checks the count first.

// Capture start time: startTime
CMTime startTime = CMTimeMakeWithSeconds(videoRange.location, videoAsset.duration.timescale);
// Capture length: videoDuration
CMTime videoDuration = CMTimeMakeWithSeconds(videoRange.length, videoAsset.duration.timescale);
CMTimeRange videoTimeRange = CMTimeRangeMake(startTime, videoDuration);

// Video capture track: compositionVideoTrack
AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
// ofTrack: the source track; atTime: the insertion position within the composition
[compositionVideoTrack insertTimeRange:videoTimeRange
                               ofTrack:([videoAsset tracksWithMediaType:AVMediaTypeVideo].count > 0) ? [videoAsset tracksWithMediaType:AVMediaTypeVideo].firstObject : nil
                                atTime:kCMTimeZero
                                 error:nil];
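As mentioned above, the video's original sound can be kept as well: the pattern is the same, just reading the audio track of videoAsset instead. A minimal sketch, assuming the source video actually has an audio track:

// Optional: also carry over the video's own audio track
NSArray *originalAudioTracks = [videoAsset tracksWithMediaType:AVMediaTypeAudio];
if (originalAudioTracks.count > 0) {
    AVMutableCompositionTrack *originalAudioTrack =
        [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                    preferredTrackID:kCMPersistentTrackID_Invalid];
    [originalAudioTrack insertTimeRange:videoTimeRange
                                ofTrack:originalAudioTracks.firstObject
                                 atTime:kCMTimeZero
                                  error:nil];
}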

Next, capturing the background audio:

CMTimeRange audioTimeRange = CMTimeRangeMake(kCMTimeZero, videoDuration);

// Audio capture track: compositionAudioTrack
AVMutableCompositionTrack *compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionAudioTrack insertTimeRange:audioTimeRange
                               ofTrack:([audioAsset tracksWithMediaType:AVMediaTypeAudio].count > 0) ? [audioAsset tracksWithMediaType:AVMediaTypeAudio].firstObject : nil
                                atTime:kCMTimeZero
                                 error:nil];
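One thing to be careful of: if the background music is shorter than videoDuration, the insert above can fail, since the range would run past the end of the audio. A hedged sketch (not in the original) that clamps the range with Core Media's CMTimeMinimum:

// Clamp the audio range so it never exceeds the music file's actual length
CMTime audioDuration = CMTimeMinimum(videoDuration, audioAsset.duration);
CMTimeRange safeAudioTimeRange = CMTimeRangeMake(kCMTimeZero, audioDuration);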

Then merge the video and background audio; here you need to set the output file's save path and file type.

// AVAssetExportSession merges the tracks and exports the merged file; presetName sets the output preset
AVAssetExportSession *assetExportSession = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetPassthrough];

// Output path of the mixed video
NSString *outPutPath = [NSTemporaryDirectory() stringByAppendingPathComponent:MediaFileName];
NSURL *outPutUrl = [NSURL fileURLWithPath:outPutPath];
if ([[NSFileManager defaultManager] fileExistsAtPath:outPutPath]) {
    [[NSFileManager defaultManager] removeItemAtPath:outPutPath error:nil];
}

// Output file format: AVFileTypeMPEG4, AVFileTypeQuickTimeMovie...
// (assetExportSession.supportedFileTypes lists the formats available for the chosen preset)
assetExportSession.outputFileType = AVFileTypeQuickTimeMovie;
assetExportSession.outputURL = outPutUrl;
// Optimize the output file for network playback
assetExportSession.shouldOptimizeForNetworkUse = YES;
[assetExportSession exportAsynchronouslyWithCompletionHandler:^{
    completionHandle();
}];
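The completion handler above fires whether the export succeeded or not; in practice you would check the AVAssetExportSessionStatus mentioned at the start before reporting success. A minimal variant:

[assetExportSession exportAsynchronouslyWithCompletionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        if (assetExportSession.status == AVAssetExportSessionStatusCompleted) {
            completionHandle();
        } else {
            // Failed or cancelled; the error property explains why
            NSLog(@"Export failed: %@", assetExportSession.error);
        }
    });
}];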

This is how to get the duration of a multimedia file:

+ (CGFloat)getMediaDurationWithMediaUrl:(NSString *)mediaUrlStr {
    
    NSURL *mediaUrl = [NSURL URLWithString:mediaUrlStr];
    AVURLAsset *mediaAsset = [[AVURLAsset alloc] initWithURL:mediaUrl options:nil];
    CMTime duration = mediaAsset.duration;
    
    // CMTimeGetSeconds avoids the truncation of the integer division value / timescale
    return CMTimeGetSeconds(duration);
}
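The implementation of getMediaFilePath, declared in the header, isn't shown here; given the export code above, a plausible sketch is simply rebuilding the same path (MediaFileName being the same file-name constant the export uses):

+ (NSString *)getMediaFilePath {
    // Same temporary-directory path the export session writes to
    return [NSTemporaryDirectory() stringByAppendingPathComponent:MediaFileName];
}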

Finally, the externally exposed method for adding background music just needs to be called:

- (IBAction)addBackgroundmusic:(id)sender {
    if (_videoUrl && _audioUrl && self.endTextField.text && self.startTextField.text) {
        [MediaManager addBackgroundMiusicWithVideoUrlStr:_videoUrl
                                                audioUrl:_audioUrl
                                andCaptureVideoWithRange:NSMakeRange([self.startTextField.text floatValue],
                                                                     [self.endTextField.text floatValue] - [self.startTextField.text floatValue])
                                              completion:^{
                                                  NSLog(@"Video merge complete");
                                              }];
    }
}