Preface

WeChat is hugely popular now, and its features keep getting more powerful. You may be familiar with the video trimming you get when posting a short video to WeChat Moments, or with the similar video editing feature on Apple devices (the author suspects WeChat borrowed the idea from Apple). I find this feature genuinely convenient and practical, and since I have recently been studying audio and video, I implemented it myself.

The feature looks simple enough, but the implementation ran into quite a few pitfalls. This article is partly a record of the work, and partly a walk-through of the implementation process so that the code is easier to follow.

The effect

Let's first take a look at what was achieved.

Implementation

Implementation process analysis

The whole feature can be divided into three parts:

  • Video playback, for which we can encapsulate a standalone video player
  • The sliding view below the player, which is the more involved part and itself breaks down into four pieces: the gray masks, the left and right handle sliders, the two lines along the top and bottom between the handles, and the image management view
  • The controller view, which assembles the logic and implements the overall feature

Video player wrapper

Here AVPlayer, AVPlayerLayer and AVPlayerItem are used to implement video playback. Since all of the player's state changes are observed through KVO, block properties are exposed so that callers can listen for these events from the outside.

#import "FOFMoviePlayer.h" @interface FOFMoviePlayer() { AVPlayerLooper *_playerLooper; AVPlayerItem *_playItem; BOOL _loop; } @property(nonatomic,strong)NSURL *url; @property(nonatomic,strong)AVPlayer *player; @property(nonatomic,strong)AVPlayerLayer *playerLayer; @property(nonatomic,strong)AVPlayerItem *playItem; @property (nonatomic,assign) CMTime duration; @end @implementation FOFMoviePlayer -(instancetype)initWithFrame:(CGRect)frame url:(NSURL *)url superLayer:(CALayer *)superLayer{ self = [super init]; if (self) { [self initplayers:superLayer]; _playerLayer.frame = frame; self.url = url; } return self; } -(instancetype)initWithFrame:(CGRect)frame url:(NSURL *)url superLayer:(CALayer *)superLayer loop:(BOOL)loop{ self = [self initWithFrame:frame url:url superLayer:superLayer]; if (self) { _loop = loop; } return self; } - (void)initplayers:(CALayer *)superLayer{ self.player = [[AVPlayer alloc] init]; self.playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player]; self.playerLayer.videoGravity = AVLayerVideoGravityResize; [superLayer addSublayer:self.playerLayer]; } - (void)initLoopPlayers:(CALayer *)superLayer{ self.player = [[AVQueuePlayer alloc] init]; self.playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player]; self.playerLayer.videoGravity = AVLayerVideoGravityResize; [superLayer addSublayer:self.playerLayer]; } -(void)fof_play{ [self.player play]; } -(void)fof_pause{ [self.player pause]; } #pragma mark - Observe -(void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary<NSKeyValueChangeKey,id> *)change context:(void *)context{ if ([keyPath isEqualToString:@"status"]) { AVPlayerItem *item = (AVPlayerItem *)object; AVPlayerItemStatus status = [[change objectForKey:@"new"] intValue]; / / get the changed the status of the if (status = = AVPlayerItemStatusReadyToPlay) {_duration = item. Duration; NSLog(@" ready to play "); // NSLog(@" ready to play "); if (self.blockStatusReadyPlay) { self.blockStatusReadyPlay(item); } } else if (status == AVPlayerItemStatusFailed) { if (self.blockStatusFailed) { self.blockStatusFailed(); } AVPlayerItem *item = (AVPlayerItem *)object; NSLog(@"%@",item.error); NSLog(@"AVPlayerStatusFailed"); } else { self.blockStatusUnknown(); NSLog(@"%@",item.error); NSLog(@"AVPlayerStatusUnknown"); } }else if ([keyPath isEqualToString:@"tracking"]){ NSInteger status = [change[@"new"] integerValue]; if (self.blockTracking) { self.blockTracking(status); } if (status) {// Dragging [self.player pause]; } else {/ / stop dragging}} else if ([keyPath isEqualToString: @ "loadedTimeRanges"]) {NSArray * array = _playItem. 
LoadedTimeRanges; CMTimeRange timeRange = [array.firstObject CMTimeRangeValue]; CGFloat startSeconds = CMTimeGetSeconds(timerange.start); CGFloat durationSeconds = CMTimeGetSeconds(timeRange.duration); NSTimeInterval totalBuffer = startSeconds + durationSeconds; / / the total length of buffer double progress = totalBuffer/CMTimeGetSeconds (_duration); if (self.blockLoadedTimeRanges) { self.blockLoadedTimeRanges(progress); } NSLog(@" current buffer time: %f",totalBuffer); }else if ([keyPath isEqualToString:@"playbackBufferEmpty"]){NSLog(@" keyPath isEqualToString:@"playbackBufferEmpty"]){ ); }else if ([keyPath isEqualToString:@"playbackLikelyToKeepUp"]){ if (self.blockPlaybackLikelyToKeepUp) { self.blockPlaybackLikelyToKeepUp([change[@"new"] boolValue]); } } } -(void)setUrl:(NSURL *)url{ _url = url; [self.player replaceCurrentItemWithPlayerItem:self.playItem]; } -(AVPlayerItem *)playItem{ _playItem = [[AVPlayerItem alloc] initWithURL:_url]; // Listen to the status of the player, Ready to play an unknown error, failure, [_playItem addObserver: self forKeyPath: @ "status" options: NSKeyValueObservingOptionNew context: nil]; / / to monitor the cache time [_playItem addObserver: self forKeyPath: @ "loadedTimeRanges" options: NSKeyValueObservingOptionNew context: nil];  // When the cache is insufficient, the video cannot be loaded:  [_playItem addObserver:self forKeyPath:@"playbackBufferEmpty" options:NSKeyValueObservingOptionNew context:nil]; / / used to monitor the cache state of play enough [_playItem addObserver: self forKeyPath: @ "playbackLikelyToKeepUp" options: NSKeyValueObservingOptionNew context:nil]; [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(private_playerMovieFinish) name:AVPlayerItemDidPlayToEndTimeNotification object:nil]; return _playItem; } - (void)private_playerMovieFinish{NSLog(@" PlayerMovieFinish "); if (self.blockPlayToEndTime) { self.blockPlayToEndTime(); } if (_loop) {// Default to provide a loop function [self.player pause]; CMTime time = CMTimeMake(1, 1); __weak typeof(self)this = self; [self.player seekToTime:time completionHandler:^(BOOL finished) { [this.player play]; }]; }} -(void)dealloc{NSLog(@"----- destroy -----"); } @endCopy the code
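For reference, here is a minimal, hypothetical usage sketch of this wrapper. Only the initializer and the block property names come from the code above; playerView, videoPath and totalSeconds are placeholders invented for the example.

// Hypothetical usage sketch of FOFMoviePlayer; playerView, videoPath and totalSeconds are placeholders.
NSURL *url = [NSURL fileURLWithPath:videoPath];
self.moviePlayer = [[FOFMoviePlayer alloc] initWithFrame:self.playerView.bounds
                                                     url:url
                                              superLayer:self.playerView.layer
                                                    loop:YES];
__weak typeof(self) this = self;
self.moviePlayer.blockStatusReadyPlay = ^(AVPlayerItem *item) {
    // Remember the total duration so slider offsets can later be converted into times.
    this.totalSeconds = CMTimeGetSeconds(item.duration);
    [this.moviePlayer fof_play];
};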

The author plans to write a separate article about video players.

The sliding view below

Gray masks

The gray masks are the easy part; the author simply uses two plain UIViews.

self.leftMaskView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 0, height)];
self.leftMaskView.backgroundColor = [UIColor grayColor];
self.leftMaskView.alpha = 0.8;
[self addSubview:self.leftMaskView];

self.rightMaskView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 0, height)];
self.rightMaskView.backgroundColor = [UIColor grayColor];
self.rightMaskView.alpha = 0.8;

The two lines along the top and bottom of the slider

These two lines are encapsulated in a separate Line view. At first the author tried plain UIViews, but ran into a problem: while dragging, the lines could not keep up with the handles and always lagged slightly behind. Redrawing the line between a begin point and an end point in drawRect: solved it.

@implementation Line

-(void)setBeginPoint:(CGPoint)beginPoint{
    _beginPoint = beginPoint;
    [self setNeedsDisplay];
}

-(void)setEndPoint:(CGPoint)endPoint{
    _endPoint = endPoint;
    [self setNeedsDisplay];
}

- (void)drawRect:(CGRect)rect {
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(context, 3);
    CGContextSetStrokeColorWithColor(context, [UIColor colorWithWhite:0.9 alpha:1].CGColor);
    CGContextMoveToPoint(context, self.beginPoint.x, self.beginPoint.y);
    CGContextAddLineToPoint(context, self.endPoint.x, self.endPoint.y);
    CGContextStrokePath(context);
}

Image management view

A VideoPieces view is encapsulated to assemble the logic of the handles, lines and masks and to display the thumbnail images. Since there are only 10 images, they are simply added in a for loop as 10 UIImageViews (a minimal sketch of that loop follows the VideoPieces code below).

@interface VideoPieces() {
    CGPoint _beginPoint;
}
@property(nonatomic,strong) Haft *leftHaft;
@property(nonatomic,strong) Haft *rightHaft;
@property(nonatomic,strong) Line *topLine;
@property(nonatomic,strong) Line *bottomLine;
@property(nonatomic,strong) UIView *leftMaskView;
@property(nonatomic,strong) UIView *rightMaskView;
@end

@implementation VideoPieces

-(instancetype)initWithFrame:(CGRect)frame{
    self = [super initWithFrame:frame];
    if (self) {
        [self initSubViews:frame];
    }
    return self;
}

- (void)initSubViews:(CGRect)frame{
    CGFloat height = CGRectGetHeight(frame);
    CGFloat width = CGRectGetWidth(frame);
    CGFloat minGap = 30;
    CGFloat widthHaft = 10;
    CGFloat heightLine = 3;

    _leftHaft = [[Haft alloc] initWithFrame:CGRectMake(0, 0, widthHaft, height)];
    _leftHaft.alpha = 0.8;
    _leftHaft.backgroundColor = [UIColor colorWithWhite:0.9 alpha:1];
    _leftHaft.rightEdgeInset = 20;
    _leftHaft.lefEdgeInset = 5;
    __weak typeof(self) this = self;
    [_leftHaft setBlockMove:^(CGPoint point) {
        CGFloat maxX = this.rightHaft.frame.origin.x - minGap;
        if (point.x < maxX) {
            this.topLine.beginPoint = CGPointMake(point.x, heightLine/2.0);
            this.bottomLine.beginPoint = CGPointMake(point.x, heightLine/2.0);
            this.leftHaft.frame = CGRectMake(point.x, 0, widthHaft, height);
            this.leftMaskView.frame = CGRectMake(0, 0, point.x, height);
            if (this.blockSeekOffLeft) {
                this.blockSeekOffLeft(point.x);
            }
        }
    }];
    [_leftHaft setBlockMoveEnd:^{
        if (this.blockMoveEnd) {
            this.blockMoveEnd();
        }
    }];

    _rightHaft = [[Haft alloc] initWithFrame:CGRectMake(width-widthHaft, 0, widthHaft, height)];
    _rightHaft.alpha = 0.8;
    _rightHaft.backgroundColor = [UIColor colorWithWhite:0.9 alpha:1];
    _rightHaft.lefEdgeInset = 20;
    _rightHaft.rightEdgeInset = 5;
    [_rightHaft setBlockMove:^(CGPoint point) {
        CGFloat minX = this.leftHaft.frame.origin.x + minGap + CGRectGetWidth(this.rightHaft.bounds);
        if (point.x >= minX) {
            this.topLine.endPoint = CGPointMake(point.x-widthHaft, heightLine/2.0);
            this.bottomLine.endPoint = CGPointMake(point.x-widthHaft, heightLine/2.0);
            this.rightHaft.frame = CGRectMake(point.x, 0, widthHaft, height);
            this.rightMaskView.frame = CGRectMake(point.x+widthHaft, 0, width-point.x-widthHaft, height);
            if (this.blockSeekOffRight) {
                this.blockSeekOffRight(point.x);
            }
        }
    }];
    [_rightHaft setBlockMoveEnd:^{
        if (this.blockMoveEnd) {
            this.blockMoveEnd();
        }
    }];

    _topLine = [[Line alloc] init];
    _topLine.alpha = 0.8;
    _topLine.frame = CGRectMake(widthHaft, 0, width-2*widthHaft, heightLine);
    _topLine.beginPoint = CGPointMake(0, heightLine/2.0);
    _topLine.endPoint = CGPointMake(CGRectGetWidth(_topLine.bounds), heightLine/2.0);
    _topLine.backgroundColor = [UIColor clearColor];
    [self addSubview:_topLine];

    _bottomLine = [[Line alloc] init];
    _bottomLine.alpha = 0.8;
    _bottomLine.frame = CGRectMake(widthHaft, height-heightLine, width-2*widthHaft, heightLine);
    _bottomLine.beginPoint = CGPointMake(0, heightLine/2.0);
    _bottomLine.endPoint = CGPointMake(CGRectGetWidth(_bottomLine.bounds), heightLine/2.0);
    _bottomLine.backgroundColor = [UIColor clearColor];
    [self addSubview:_bottomLine];

    [self addSubview:_leftHaft];
    [self addSubview:_rightHaft];

    self.leftMaskView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 0, height)];
    self.leftMaskView.backgroundColor = [UIColor grayColor];
    self.leftMaskView.alpha = 0.8;
    [self addSubview:self.leftMaskView];

    self.rightMaskView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 0, height)];
    self.rightMaskView.backgroundColor = [UIColor grayColor];
    self.rightMaskView.alpha = 0.8;
    [self addSubview:self.rightMaskView];
}

-(void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event{
    UITouch *touch = touches.anyObject;
    _beginPoint = [touch locationInView:self];
}
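The thumbnail strip itself is only described in prose above. Below is a minimal sketch of what that for loop might look like, assuming the 10 UIImageViews are created inside VideoPieces and inserted behind the handles, lines and masks; the names count, imageWidth and imageViews are made up for illustration.

// Hypothetical sketch: 10 UIImageViews laid out side by side between the two handles.
NSInteger count = 10;
CGFloat imageWidth = (width - 2 * widthHaft) / count;
NSMutableArray<UIImageView *> *imageViews = [NSMutableArray array];
for (NSInteger i = 0; i < count; i++) {
    UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(widthHaft + i * imageWidth, 0, imageWidth, height)];
    imageView.contentMode = UIViewContentModeScaleAspectFill;
    imageView.clipsToBounds = YES;
    [self insertSubview:imageView atIndex:0]; // keep the thumbnails behind the handles, lines and masks
    [imageViews addObject:imageView];
}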

Handle implementation

The handle implementation contains one optimization to make dragging more responsive. Initially the handle did not track the finger well: the finger would slide but the handle often would not move. The fix is to enlarge the handle's touch area by overriding the following method:

-(BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event

@implementation Haft

-(instancetype)initWithFrame:(CGRect)frame{
    self = [super initWithFrame:frame];
    if (self) {
        self.userInteractionEnabled = true;
    }
    return self;
}

-(BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event{
    CGRect rect = CGRectMake(self.bounds.origin.x - self.lefEdgeInset,
                             self.bounds.origin.y - self.topEdgeInset,
                             CGRectGetWidth(self.bounds) + self.lefEdgeInset + self.rightEdgeInset,
                             CGRectGetHeight(self.bounds) + self.bottomEdgeInset + self.topEdgeInset);
    if (CGRectContainsPoint(rect, point)) {
        return YES;
    }
    return NO;
}

-(void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event{
    NSLog(@"Began");
}

-(void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event{
    NSLog(@"Move");
    UITouch *touch = touches.anyObject;
    CGPoint point = [touch locationInView:self.superview];
    CGFloat maxX = CGRectGetWidth(self.superview.bounds) - CGRectGetWidth(self.bounds);
    if (point.x > maxX) {
        point.x = maxX;
    }
    if (point.x >= 0 && point.x <= (CGRectGetWidth(self.superview.bounds) - CGRectGetWidth(self.bounds)) && self.blockMove) {
        self.blockMove(point);
    }
}

-(void)touchesEnded:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event{
    if (self.blockMoveEnd) {
        self.blockMoveEnd();
    }
}

- (void)drawRect:(CGRect)rect {
    CGFloat width = CGRectGetWidth(self.bounds);
    CGFloat height = CGRectGetHeight(self.bounds);
    CGFloat lineWidth = 1.5;
    CGFloat lineHeight = 12;
    CGFloat gap = (width - lineWidth * 2) / 3.0;
    CGFloat lineY = (height - lineHeight) / 2.0;

    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(context, lineWidth);
    CGContextSetStrokeColorWithColor(context, [[UIColor grayColor] colorWithAlphaComponent:0.8].CGColor);
    CGContextMoveToPoint(context, gap + lineWidth/2, lineY);
    CGContextAddLineToPoint(context, gap + lineWidth/2, lineY + lineHeight);
    CGContextStrokePath(context);

    CGContextSetLineWidth(context, lineWidth);
    CGContextSetStrokeColorWithColor(context, [[UIColor grayColor] colorWithAlphaComponent:0.8].CGColor);
    CGContextMoveToPoint(context, gap*2 + lineWidth + lineWidth/2, lineY);
    CGContextAddLineToPoint(context, gap*2 + lineWidth + lineWidth/2, lineY + lineHeight);
    CGContextStrokePath(context);
}

Controller view: logic assembly and feature implementation

This part of the logic is the most important and complex.

  • Get 10 thumbnails
- (NSArray *)getVideoThumbnail:(NSString *)path count:(NSInteger)count splitCompleteBlock:(void(^)(BOOL success, NSMutableArray *splitimgs))splitCompleteBlock {
    AVAsset *asset = [AVAsset assetWithURL:[NSURL fileURLWithPath:path]];
    NSMutableArray *arrayImages = [NSMutableArray array];
    [asset loadValuesAsynchronouslyForKeys:@[@"duration"] completionHandler:^{
        AVAssetImageGenerator *generator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
        // generator.maximumSize = CGSizeMake(480, 136);
        // If CGSizeMake(480,136) is set, the generated image comes out as {240, 136}, i.e. scaled proportionally to the actual size.
        generator.appliesPreferredTrackTransform = YES; // Ensures the thumbnails have the correct orientation, e.g. for videos that need rotating.
        // The following two properties must be set; without them the captured times can be off by quite a lot, with them the difference is tiny.
        generator.requestedTimeToleranceAfter = kCMTimeZero;
        generator.requestedTimeToleranceBefore = kCMTimeZero;
        Float64 seconds = CMTimeGetSeconds(asset.duration);
        NSMutableArray *array = [NSMutableArray array];
        for (int i = 0; i < count; i++) {
            CMTime time = CMTimeMakeWithSeconds(i * (seconds/10.0), 1);
            [array addObject:[NSValue valueWithCMTime:time]];
        }
        __block int i = 0;
        [generator generateCGImagesAsynchronouslyForTimes:array completionHandler:^(CMTime requestedTime, CGImageRef _Nullable imageRef, CMTime actualTime, AVAssetImageGeneratorResult result, NSError * _Nullable error) {
            i++;
            if (result == AVAssetImageGeneratorSucceeded) {
                UIImage *image = [UIImage imageWithCGImage:imageRef];
                [arrayImages addObject:image];
            } else {
                NSLog(@"Failed to get the image!");
            }
            if (i == count) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    splitCompleteBlock(YES, arrayImages);
                });
            }
        }];
    }];
    return arrayImages;
}

Getting the 10 images is straightforward, but there is one caveat: the completion callback must be dispatched back to the main queue asynchronously, otherwise the thumbnails will show up with a serious display delay.
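As an illustration, a hypothetical call site might look like the sketch below; videoPath and the imageViews array are placeholders, and since the method already dispatches its completion to the main queue, the image views can be updated directly inside the block.

// Hypothetical usage sketch of getVideoThumbnail:count:splitCompleteBlock:.
__weak typeof(self) this = self;
[self getVideoThumbnail:videoPath count:10 splitCompleteBlock:^(BOOL success, NSMutableArray *splitimgs) {
    if (!success) { return; }
    // The block arrives on the main queue, so UI updates are safe here.
    [splitimgs enumerateObjectsUsingBlock:^(UIImage *image, NSUInteger idx, BOOL *stop) {
        if (idx < this.imageViews.count) {
            this.imageViews[idx].image = image;
        }
    }];
}];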

  • Listen for left and right slider events
[_videoPieces setBlockSeekOffLeft:^(CGFloat offX) {
    this.seeking = true;
    [this.moviePlayer fof_pause];
    this.lastStartSeconds = this.totalSeconds * offX / CGRectGetWidth(this.videoPieces.bounds);
    [this.moviePlayer.player seekToTime:CMTimeMakeWithSeconds(this.lastStartSeconds, 1) toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero];
}];

[_videoPieces setBlockSeekOffRight:^(CGFloat offX) {
    this.seeking = true;
    [this.moviePlayer fof_pause];
    this.lastEndSeconds = this.totalSeconds * offX / CGRectGetWidth(this.videoPieces.bounds);
    [this.moviePlayer.player seekToTime:CMTimeMakeWithSeconds(this.lastEndSeconds, 1) toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero];
}];

By listening for the left and right handle events, the horizontal offset is converted into a time, which is then used as the clip's start or end time and to seek the player to that position.

  • Loop playback
self.timeObserverToken = [self.moviePlayer.player addPeriodicTimeObserverForInterval:CMTimeMakeWithSeconds(0.5, NSEC_PER_SEC) queue:dispatch_get_main_queue() usingBlock:^(CMTime time) {
    if (!this.seeking) {
        if (fabs(CMTimeGetSeconds(time) - this.lastEndSeconds) <= 0.02) {
            [this.moviePlayer fof_pause];
            [this private_replayAtBeginTime:this.lastStartSeconds];
        }
    }
}];
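private_replayAtBeginTime: is not shown in this excerpt; based on how it is used above, a minimal sketch (an assumption, not necessarily the author's exact code) could be:

// Hypothetical sketch: seek back to the left handle's time and resume playback.
- (void)private_replayAtBeginTime:(CGFloat)beginSeconds {
    __weak typeof(self) this = self;
    [self.moviePlayer.player seekToTime:CMTimeMakeWithSeconds(beginSeconds, 1)
                        toleranceBefore:kCMTimeZero
                         toleranceAfter:kCMTimeZero
                      completionHandler:^(BOOL finished) {
        if (finished) {
            [this.moviePlayer fof_play];
        }
    }];
}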

There are two caveats:

  1. The observer added with addPeriodicTimeObserverForInterval:queue:usingBlock: must be removed, otherwise it will leak memory.
-(void)dealloc{
    [self.moviePlayer.player removeTimeObserver:self.timeObserverToken];
}
  2. The periodic observer listens to the playback time and checks whether playback has reached the time set by the right handle; if so, it jumps back to the start time and plays again. The author puzzled over this for quite a while: how do you preview the trimmed clip while it plays? I almost made the mistake of actually cutting the video on every adjustment. In fact there is no need to cut anything during editing; simply controlling the playback start and end times is enough, and the video only needs to be trimmed once at the very end (a minimal sketch of that final export follows).
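The article does not show that final trimming step. A common way to do it is AVAssetExportSession with a timeRange; the sketch below is an assumption, not the author's code, and presumes lastStartSeconds and lastEndSeconds hold the two handle times while videoPath and outputURL are placeholders.

// Hypothetical sketch of the single, final export using AVAssetExportSession.
AVAsset *asset = [AVAsset assetWithURL:[NSURL fileURLWithPath:videoPath]];
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset
                                                                       presetName:AVAssetExportPresetHighestQuality];
exportSession.outputURL = outputURL;          // placeholder: a writable file URL
exportSession.outputFileType = AVFileTypeMPEG4;
// Only the range between the two handles is exported.
CMTime start = CMTimeMakeWithSeconds(self.lastStartSeconds, 600);
CMTime duration = CMTimeMakeWithSeconds(self.lastEndSeconds - self.lastStartSeconds, 600);
exportSession.timeRange = CMTimeRangeMake(start, duration);
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"Trim finished: %@", outputURL);
    } else {
        NSLog(@"Trim failed: %@", exportSession.error);
    }
}];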

Conclusion

Implementing this WeChat-style short video editing feature did run into quite a few small problems, but after some careful digging everything finally worked out, which is a relief. Ha ha.

The source code

GitHub source code

My blog

FlyOceanFish