A few words to start: building on the multi-image mosaic function completed earlier, this article captures frame views at arbitrary positions in a video and stitches them together. Below, a simple utility class is wrapped up to implement this requirement.

Related article: Using CGContextRef in iOS to implement a simple multi-image mosaic function

One, the effect

Select video

1. Drag on the playback view to preview and scrub the video.

2. Tap the screen to capture the current frame.

3. Display the number of selected images and merge them.

4. Temporarily save the result to the sandbox directory.

Here is the resulting multi-image mosaic under the sandbox path:


Two, code implementation

1. First look at the external calling code

Externally, everything goes through the WSLVideoPlayManagement class.

The playWithAVPlayerItem:onView: method takes two arguments:

one is an AVPlayerItem, the video resource;

the other is a UIView, the carrier view the video plays on.
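The view-controller code that makes the call is not reproduced here, so below is a minimal sketch of what it looks like; the property name playerView and the bundled file name video.mp4 are assumptions for illustration.

#import "WSLVideoPlayManagement.h"

// e.g. in the hosting view controller's viewDidLoad
NSURL *localUrl = [[NSBundle mainBundle] URLForResource:@"video" withExtension:@"mp4"];
AVPlayerItem *playerItem = [WSLVideoPlayManagement createAVPlayerItemWithLocalUrl:localUrl];
// The management object is retained by the carrier view through the UIView category,
// so the caller does not need to keep a strong reference to it.
[WSLVideoPlayManagement playWithAVPlayerItem:playerItem onView:self.playerView];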

2. Function list

First, a general overview of the features:

1. Local video player wrapper:

(1) drag on the playback view to adjust the progress;

(2) tap the screen to take a screenshot of the current frame;

(3) show the number of images the user has selected and a merge button.

2. Frame-view capture and saving wrapper:

(1) get the current frame of the player as an image;

(2) cache the images the user has selected.

3. A UIView category that mounts the video management object onto the carrier view and hands over its lifecycle.

3. Code

A UIView category is created so that the lifecycle of the video-processing instance is handed over to the playback carrier view; the category only adds an associated-object property.

1. WSLVideoPlayManagement, the video player class

WSLVideoPlayManagement.h

#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

NS_ASSUME_NONNULL_BEGIN

@interface WSLVideoPlayManagement : NSObject

+ (WSLVideoPlayManagement *)playWithAVPlayerItem:(AVPlayerItem *)playerItem onView:(UIView *)onView;
+ (AVPlayerItem *)createAVPlayerItemWithLocalUrl:(NSURL *)localUrl;
- (void)playWithAVPlayerItem:(AVPlayerItem *)playerItem onView:(UIView *)onView;
// Reset the frames of the player UI components
- (void)resetPlayerUI;
// Show or hide the time indicator
- (void)isShowTime:(BOOL)isShowTime;
// Seek relative to the current playback position (drag offset)
- (void)playerSeek:(float)changeDistance;
// Show or hide the merge button
- (void)isShowCombineBtn:(BOOL)isShowCombineBtn;
// Show the number of selected images
- (void)showChooseImageNum:(NSInteger)chooseNum;

@end

NS_ASSUME_NONNULL_END

WSLVideoPlayManagement.m

#import "WSLVideoPlayManagement.h"
#import "UIView+WSLVideoPlayManagement.h"
#import "WSLVideoScreenshotsManagement.h"

@interface WSLVideoPlayManagement()

// Player-related
@property (nonatomic,strong) AVAsset * asset;

@property (nonatomic,strong) AVPlayerItem * currentPlayerItem;

@property (nonatomic,strong) AVPlayer * player;

@property (nonatomic,strong) AVPlayerLayer * playerLayer;

// The carrier view the player is mounted on
@property (nonatomic,weak) UIView * onView;
// Time indicator
@property (nonatomic,strong) UILabel * timeLab;
// Button that starts combining the images
@property (nonatomic,strong) UIButton * benginCombineImagesBtn;
// Number of selected images
@property (nonatomic,strong) UILabel * chooseImageCountLab;
// Video output (used to grab frame images)
@property (nonatomic,strong) AVPlayerItemVideoOutput * videoOutPut;
// Total video duration as a string
@property (nonatomic,strong) NSString * videoTotleTimeStr;
// Playback time when a drag begins (needed to compute the drag offset)
@property (nonatomic,assign) CMTime beginTime;
// Video screenshot utility
@property (nonatomic,strong) WSLVideoScreenshotsManagement * videoScreenshotsManagement;

@end

@implementation WSLVideoPlayManagement

+ (WSLVideoPlayManagement *)playWithAVPlayerItem:(AVPlayerItem *)playerItem onView:(UIView *)onView
{
    WSLVideoPlayManagement * videoPlayManagement = [[WSLVideoPlayManagement alloc] init];
    [videoPlayManagement playWithAVPlayerItem:playerItem onView:onView];
    // Register an observer to watch the player's playback status
    [videoPlayManagement registObserver];
    onView.videoPlayManagement = videoPlayManagement;
    return videoPlayManagement;
}

+ (AVPlayerItem *)createAVPlayerItemWithLocalUrl:(NSURL *)localUrl
{
    AVAsset * asset = [AVAsset assetWithURL:localUrl];
    AVPlayerItem *playerItem = [[AVPlayerItem alloc] initWithAsset:asset];
    return playerItem;
}

- (void)playWithAVPlayerItem:(AVPlayerItem *)playerItem onView:(UIView *)onView
{
    self.onView = onView;
    self.currentPlayerItem = playerItem;
    self.videoOutPut = [[AVPlayerItemVideoOutput alloc] init];
    [self.currentPlayerItem addOutput:self.videoOutPut];

    self.player = [[AVPlayer alloc] initWithPlayerItem:self.currentPlayerItem];
    self.playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    self.playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    self.playerLayer.frame = onView.layer.bounds;
    [onView.layer addSublayer:self.playerLayer];

    // Frame-capture utility
    self.videoScreenshotsManagement = [[WSLVideoScreenshotsManagement alloc] initWithOnView:onView player:self.player videoOutput:self.videoOutPut];

    // Time indicator shown while dragging
    self.timeLab = [[UILabel alloc] initWithFrame:CGRectZero];
    self.timeLab.hidden = YES;
    self.timeLab.backgroundColor = [[UIColor groupTableViewBackgroundColor] colorWithAlphaComponent:0.2];
    self.timeLab.textColor = [UIColor whiteColor];
    self.timeLab.textAlignment = NSTextAlignmentCenter;
    CMTime cTime = self.currentPlayerItem.asset.duration;

    self.videoTotleTimeStr = [self transTime:cTime.value / (cTime.timescale)];

    self.timeLab.text = [NSString stringWithFormat:@" 0:00 / %@",self.videoTotleTimeStr];
    [self.timeLab sizeToFit];
    self.timeLab.layer.masksToBounds = YES;
    self.timeLab.layer.cornerRadius = self.timeLab.frame.size.height / 2.0;

    self.timeLab.center = CGPointMake(self.onView.center.x, (self.onView.frame.size.height - self.timeLab.frame.size.height) / 2.0);

    [self.onView addSubview:self.timeLab];

    __weak typeof(self) weakSelf = self;

    [self.player addPeriodicTimeObserverForInterval:CMTimeMake(1.0, 1.0) queue:dispatch_get_main_queue() usingBlock:^(CMTime time) {

        // Update the playback progress
        weakSelf.timeLab.text = [NSString stringWithFormat:@" %@ / %@   ",[weakSelf transTime:time.value / (time.timescale)],weakSelf.videoTotleTimeStr];
        [weakSelf.timeLab sizeToFit];
    }];
    
    // Merge button
    self.benginCombineImagesBtn = [UIButton buttonWithType:(UIButtonTypeCustom)];
    self.benginCombineImagesBtn.hidden = YES;
    self.benginCombineImagesBtn.frame = CGRectMake(10, self.onView.frame.size.height - 50, 70, 40);
    [self.benginCombineImagesBtn setTitle:@"Merge" forState:(UIControlStateNormal)];
    [self.benginCombineImagesBtn setTitleColor:[UIColor whiteColor] forState:(UIControlStateNormal)];
    self.benginCombineImagesBtn.layer.masksToBounds = YES;
    self.benginCombineImagesBtn.layer.cornerRadius = 5;
    self.benginCombineImagesBtn.backgroundColor = [[UIColor groupTableViewBackgroundColor] colorWithAlphaComponent:0.2];
    [self.benginCombineImagesBtn addTarget:self action: @selector(beginCombineImages) forControlEvents:(UIControlEventTouchUpInside)];
    [self.onView addSubview:self.benginCombineImagesBtn];

    // Label showing the number of selected images
    self.chooseImageCountLab = [[UILabel alloc] initWithFrame:CGRectMake(CGRectGetMaxX(self.benginCombineImagesBtn.frame) - 10, CGRectGetMinY(self.benginCombineImagesBtn.frame) - 5, 0, 0)];
    self.chooseImageCountLab.layer.masksToBounds = YES;
    self.chooseImageCountLab.backgroundColor = [UIColor whiteColor];
    self.chooseImageCountLab.textColor = [UIColor redColor];
    self.chooseImageCountLab.font = [UIFont systemFontOfSize:15.f];
    [self.onView addSubview:self.chooseImageCountLab];
}

// Register the observer
- (void)registObserver
{
    // Observe the status of the playerItem
    [self.player.currentItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary<NSKeyValueChangeKey,id> *)change context:(void *)context
{
    if ([object isKindOfClass:[AVPlayerItem class]]) {
        if ([keyPath isEqualToString:@"status"]) {
            AVPlayerStatus status = [[change objectForKey:@"new"] intValue];
            switch (status) {
                case AVPlayerStatusReadyToPlay:
                {
                    // The item is ready to play
                    [self.player play];
                    // Grab a frame and adjust the player view size to the video's aspect ratio
                    [self resizePlayerView];
                }
                    break;
                default:
                    break;
            }
        }
    }
}

// Grab a frame and resize the player view to match the video's aspect ratio
- (void)resizePlayerView
{
    [self.videoScreenshotsManagement reSizePlayerView];
}

// Reset the frames of the player UI components
- (void)resetPlayerUI
{
    self.playerLayer.frame = self.onView.layer.bounds;
    self.timeLab.center = CGPointMake(self.onView.center.x, (self.onView.frame.size.height - self.timeLab.frame.size.height) / 2.0);
    self.benginCombineImagesBtn.frame = CGRectMake(10, self.onView.frame.size.height - 50, 70, 40);
    self.chooseImageCountLab.frame = CGRectMake(CGRectGetMaxX(self.benginCombineImagesBtn.frame) - 10, CGRectGetMinY(self.benginCombineImagesBtn.frame) - 5, 0, 0);
}

// Convert a number of seconds to a mm:ss string
- (NSString *)transTime:(NSUInteger)time
{
    NSString * timeStr = @"";
    if (time < 60) {
        timeStr = [NSString stringWithFormat:@"00:%02lu",(unsigned long)time];
    } else {
        NSUInteger minutes = time / 60;
        NSUInteger sections = time % 60;
        timeStr = [NSString stringWithFormat:@"%lu:%02lu",(unsigned long)minutes,(unsigned long)sections];
    }
    return timeStr;
}

// Show or hide the drag-progress time indicator
- (void)isShowTime:(BOOL)isShowTime
{
    self.timeLab.hidden = !isShowTime;
    if (isShowTime) {
        [self.player pause];
        self.beginTime = self.currentPlayerItem.currentTime;
    } else {
        [self.player play];
    }
}

// Show or hide the merge button
- (void)isShowCombineBtn:(BOOL)isShowCombineBtn
{
    self.benginCombineImagesBtn.hidden = !isShowCombineBtn;
    self.chooseImageCountLab.hidden = self.benginCombineImagesBtn.hidden;
}

// Show the number of selected images
- (void)showChooseImageNum:(NSInteger)chooseNum
{
    // Pad the count so the label stays roughly circular after sizeToFit
    NSString * numStr = [NSString stringWithFormat:@"  %ld  ",(long)chooseNum];
    self.chooseImageCountLab.text = numStr;
    [self.chooseImageCountLab sizeToFit];
    self.chooseImageCountLab.layer.cornerRadius = self.chooseImageCountLab.frame.size.height / 2.0;

}

// Seek by an offset relative to the position where the drag began
- (void)playerSeek:(float)changeDistance
{
    CMTime currentTime = self.beginTime;
    CMTime totleTime = self.currentPlayerItem.asset.duration;
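    // Map the horizontal drag distance to a time offset: a drag across the full width
    // of the carrier view corresponds to the full duration of the video.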
    int64_t addValue = totleTime.value * (changeDistance / self.onView.frame.size.width);
    int64_t min = 0;
    int64_t max = totleTime.value;
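    // Express the drag-start time in the asset's timescale before adding the offset,
    // then clamp the result to the valid range [0, duration].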
    int64_t value = (currentTime.value / currentTime.timescale) * totleTime.timescale + addValue;
    if (value <= min) {
        value = min;
    } else if (value >= max) {
        value = max;
    }
    CMTime seekTime = CMTimeMake(value, totleTime.timescale);
    [self.player seekToTime:seekTime];
    self.timeLab.text = [NSString stringWithFormat:@" %@ / %@   ",[self transTime:value / (totleTime.timescale)],self.videoTotleTimeStr];
    [self.timeLab sizeToFit];
}

// Start merging the selected images
- (void)beginCombineImages
{
    [self.videoScreenshotsManagement beginCombineImages];
}

- (void)dealloc
{
    [self.player.currentItem removeObserver:self forKeyPath:@"status"];
    NSLog(@"WSLVideoPlayManagement deallocated");
}

@end
2. WSLVideoScreenshotsManagement, the frame-capture and saving class

WSLVideoScreenshotsManagement.h

#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

NS_ASSUME_NONNULL_BEGIN

@interface WSLVideoScreenshotsManagement : NSObject

- (instancetype)initWithOnView:(UIView *)onView player:(AVPlayer *)player videoOutput:(AVPlayerItemVideoOutput *)videoOutput;
// Resize the player view to the video's aspect ratio
- (void)reSizePlayerView;
// Get the current frame as an image
- (UIImage *)getCurrentImage;
// Start merging the selected images
- (void)beginCombineImages;

@end

NS_ASSUME_NONNULL_END

WSLVideoScreenshotsManagement.m

#import "WSLVideoScreenshotsManagement.h" #import "UIView+WSLVideoPlayManagement.h" #import "WSLImageCombineOperation.h" @ interface WSLVideoScreenshotsManagement () / / video mount view @ property (nonatomic, weak) UIView * onView; @Property (nonatomic,strong) UITapGestureRecognizer * lightTap; @property (nonatomic,strong) UIPanGestureRecognizer * panGes; // Start dragging the starting point @property (nonatomic,assign) CGPoint startPanPoint; @property (nonatomic,weak) AVPlayer * player; @property (nonatomic,weak) AVPlayerItemVideoOutput * videoOutPut; @property (nonatomic,strong) NSMutableArray * saveNeedCombineImagesArr; @end @implementation WSLVideoScreenshotsManagement - (instancetype)initWithOnView:(UIView *)onView player:(AVPlayer *)player videoOutput:(AVPlayerItemVideoOutput *)videoOutput; { if (self = [super init]) { self.onView = onView; self.player = player; self.videoOutPut = videoOutput; self.onView.userInteractionEnabled = **YES**; self.lightTap = [[UITapGestureRecognizer alloc] initWithTarget:self action: @selector(lightTapAction:)]; [self.onView addGestureRecognizer:self.lightTap]; self.panGes = [[UIPanGestureRecognizer alloc] initWithTarget:self action: @selector(panGes:)]; [self.onView addGestureRecognizer:self.panGes]; } return self; } // light event - (void)lightTapAction:(UITapGestureRecognizer *)lightTap {switch (lighttap.state) {case UIGestureRecognizerStateEnded: {the if (self) player) status = = AVPlayerStatusReadyToPlay) {NSLog (@ "start gathering picture"); [self addSaveAnimation]; } } break; default: break; }} / / drag and drop event - (void) panGes: (UIPanGestureRecognizer *) pan {switch (pan) state) {case UIGestureRecognizerStateBegan: { self.startPanPoint = [pan locationInView:self.onView]; [self.onView.videoPlayManagement isShowTime:YES]; } break; case UIGestureRecognizerStateChanged: { CGPoint currentPoint = [pan locationInView:self.onView]; [self.onView.videoPlayManagement playerSeek:currentPoint.x - self.startPanPoint.x]; } break; case UIGestureRecognizerStateEnded:{ [self.onView.videoPlayManagement isShowTime:NO]; } default: break; } // Add a screenshot animation - (void)addSaveAnimation {UIImage * needSaveImage = [self getCurrentImage]; if (needSaveImage) { UIImageView * imageView = [[UIImageView alloc] initWithFrame:self.onView.frame]; imageView.contentMode = UIViewContentModeScaleAspectFit; imageView.image = needSaveImage; [self.onView.superview addSubview:imageView]; [UIView animateWithDuration: 0.4 animations: ^ {imageView. Frame = CGRectMake (imageView. Frame. Origin. X. CGRectGetMaxY(self.onView.frame), imageView.frame.size.width / 10, imageView.frame.size.height / 10); ImageView. Alpha = 0.1;} completion: ^ (BOOL finished) {/ / save picture [self. SaveNeedCombineImagesArr addObject: needSaveImage];  [self.onView.videoPlayManagement isShowCombineBtn:YES];  [self.onView.videoPlayManagement showChooseImageNum:self.saveNeedCombineImagesArr.count];  [imageView removeFromSuperview]; }]; }} / / to get frames - (UIImage *) getCurrentImage {CMTime itemTime = self. The player. The currentItem. 
CurrentTime; CVPixelBufferRef pixelBuffer = [self.videoOutPut copyPixelBufferForItemTime:itemTime itemTimeForDisplay:nil]; CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer]; CIContext *temporaryContext = [CIContext contextWithOptions:nil]; CGImageRef videoImage = [temporaryContext createCGImage:ciImage fromRect:CGRectMake(0, 0, CVPixelBufferGetWidth(pixelBuffer), CVPixelBufferGetHeight(pixelBuffer))]; / / the current frame picture UIImage * currentImage = [UIImage imageWithCGImage: videoImage]; CGImageRelease(videoImage); return currentImage; {dispatch_after(dispatch_time(DISPATCH_TIME_NOW, dispatch_time); (int64_t)(0.1 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{UIImage * image = [self getCurrentImage]; if (image && self.onView) { self.onView.frame = CGRectMake(self.onView.frame.origin.x, self.onView.frame.origin.y, self.onView.frame.size.width, self.onView.frame.size.width * (image.size.height / image.size.width)); [self.onView.videoPlayManagement resetPlayerUI]; }}); } / / merge - (void) beginCombineImages {[WSLImageCombineOperation combineImages: self. SaveNeedCombineImagesArr callBack:^(UIImage * _Nonnull resultImage) { NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,NSUserDomainMask, YES); Nsstrings * filePath = [[paths objectAtIndex: 0] stringByAppendingPathComponent: @ "scarlet ball rise. JPG"].  NSLog(@"filePath = %@",filePath); [WSLImageCombineOperation saveImageToCache:resultImage filePath:filePath];  [self.saveNeedCombineImagesArr removeAllObjects];  [self.onView.videoPlayManagement showChooseImageNum:self.saveNeedCombineImagesArr.count];  [self.onView.videoPlayManagement isShowCombineBtn:**NO**]; }]; } - (NSMutableArray *)saveNeedCombineImagesArr {if (! _saveNeedCombineImagesArr) { _saveNeedCombineImagesArr = [[NSMutableArray alloc] init]; } return _saveNeedCombineImagesArr; } @endCopy the code
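One caveat: copyPixelBufferForItemTime: can return NULL when no frame is ready (for example right after a seek). A small defensive variant of the frame grab is sketched below; the method name currentImageIfAvailable is an assumption for illustration, not part of the class above.

// Hypothetical defensive variant of getCurrentImage (a sketch, not the article's code):
// check that a frame exists for the requested time and guard the NULL cases.
- (UIImage *)currentImageIfAvailable
{
    CMTime itemTime = self.player.currentItem.currentTime;
    // hasNewPixelBufferForItemTime: reports whether the output can vend a buffer for this time
    if (![self.videoOutPut hasNewPixelBufferForItemTime:itemTime]) {
        return nil;
    }
    CVPixelBufferRef pixelBuffer = [self.videoOutPut copyPixelBufferForItemTime:itemTime itemTimeForDisplay:NULL];
    if (pixelBuffer == NULL) {
        return nil;
    }
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:CGRectMake(0, 0, CVPixelBufferGetWidth(pixelBuffer), CVPixelBufferGetHeight(pixelBuffer))];
    CVPixelBufferRelease(pixelBuffer); // the copy call returns a +1 retained buffer
    if (cgImage == NULL) {
        return nil;
    }
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return image;
}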
3. UIView+WSLVideoPlayManagement, the UIView category

UIView+WSLVideoPlayManagement.h

#import <UIKit/UIKit.h>
#import "WSLVideoPlayManagement.h"

NS_ASSUME_NONNULL_BEGIN

@interface UIView (WSLVideoPlayManagement)

// The video management object associated with this view
@property (nonatomic,strong) WSLVideoPlayManagement * videoPlayManagement;

@end

NS_ASSUME_NONNULL_END

UIView+WSLVideoPlayManagement.m


#import "UIView+WSLVideoPlayManagement.h"

#import <objc/runtime.h>

// Key for the associated object (its address is used as the key, so the value is irrelevant)
static char mineWSLVideoPlayManagement;


@implementation UIView (WSLVideoPlayManagement)

- (WSLVideoPlayManagement *)videoPlayManagement
{
    return objc_getAssociatedObject(self, &mineWSLVideoPlayManagement);
}

- (void)setVideoPlayManagement:(WSLVideoPlayManagement *)videoPlayManagement
{
    objc_setAssociatedObject(self, &mineWSLVideoPlayManagement, videoPlayManagement, OBJC_ASSOCIATION_RETAIN_NONATOMIC);
}

@end


Three, summary and thoughts

CMTime is a structure with two key fields:

1. value: the number of time units; value / timescale gives the time in seconds.

2. timescale: the number of units one second is divided into.

Two CMTime values may use different timescales, so convert them to a common scale before comparing or combining them.
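As a quick worked example (a standalone sketch, not part of the project code):

// #import <CoreMedia/CoreMedia.h>
// 90 units at a timescale of 30 (30 units per second) is 3.0 seconds.
CMTime t = CMTimeMake(90, 30);
Float64 seconds = CMTimeGetSeconds(t);   // 3.0
// The same instant expressed at a timescale of 600: the value becomes 1800.
CMTime rescaled = CMTimeConvertScale(t, 600, kCMTimeRoundingMethod_Default);
NSLog(@"%.1f s, rescaled: %lld / %d", seconds, rescaled.value, rescaled.timescale);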

That's it: a simple utility class that grabs arbitrary frame views from a local video and stitches them into a single image. The code above is complete.

The code is rough, please don't laugh [fist][fist][fist]