Live Photo development principles
Demo repository: https://github.com/filelife/FLLivePhotoDemo
LivePhoto overview
A Live Photo is made up of a 3-second video plus a still image.
Natively, the Live Photo video is captured in a window made of the 1.5 s before the shutter is pressed plus the 1.5 s after it:
[-1.5s ~ 0s, shutter moment, 0s ~ 1.5s]
The still photo that the Live Photo displays is taken from this capture: it is the center frame of the 3-second clip the camera collects.
How the Live Photo is composed
Materials:
1. A video; 2. A processed image. In short, given a video and an image, it is easy to produce a Live Photo. With the image and the video in hand, we process each of them: an iOS Live Photo's image and video carry a simple connection that lets iOS recognize the pair as a Live Photo.
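Concretely, that connection is a shared asset identifier: the same string (typically a UUID) is written into the photo's Apple MakerNote and into the video's QuickTime metadata, and Photos pairs the two resources by that value. A minimal sketch of generating it (the variable name is only for illustration):
NSString *assetIdentifier = [NSUUID UUID].UUIDString;
// Both of the write methods below must receive this same string.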
1. Process the image
We need to process the image metadata. The important key is:
NSString *const kFigAppleMakerNote_AssetIdentifier = @"17";
The following shows how to write this "17" MakerNote entry into the image metadata and save the result to finalJPGPath; the JPEG written there is the one later stored as the Live Photo's still image.
NSString *const kKeySpaceQuickTimeMetadata = @"mdta";
// Standard QuickTime metadata identifiers used by Live Photos; both are referenced in the code below.
NSString *const kKeyContentIdentifier = @"com.apple.quicktime.content.identifier";
NSString *const kKeyStillImageTime = @"com.apple.quicktime.still-image-time";
+ (AVAssetWriterInputMetadataAdaptor *)metadataSetAdapter {
    NSString *identifier = [kKeySpaceQuickTimeMetadata stringByAppendingFormat:@"/%@", kKeyStillImageTime];
    NSDictionary *spec = @{
        (__bridge NSString *)kCMMetadataFormatDescriptionMetadataSpecificationKey_Identifier : identifier,
        (__bridge NSString *)kCMMetadataFormatDescriptionMetadataSpecificationKey_DataType : @"com.apple.metadata.datatype.int8"
    };
    CMFormatDescriptionRef desc = NULL;
    CMMetadataFormatDescriptionCreateWithMetadataSpecifications(kCFAllocatorDefault, kCMMetadataFormatType_Boxed, (__bridge CFArrayRef)@[spec], &desc);
    AVAssetWriterInput *input = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeMetadata outputSettings:nil sourceFormatHint:desc];
    CFRelease(desc);
    return [AVAssetWriterInputMetadataAdaptor assetWriterInputMetadataAdaptorWithAssetWriterInput:input];
}
// Requires <ImageIO/ImageIO.h> and <MobileCoreServices/MobileCoreServices.h> (for kUTTypeJPEG).
- (void)writeToFileWithOriginJPGPath:(NSURL *)originJPGPath
                 TargetWriteFilePath:(NSURL *)finalJPGPath
                     AssetIdentifier:(NSString *)assetIdentifier {
    CGImageDestinationRef dest = CGImageDestinationCreateWithURL((__bridge CFURLRef)finalJPGPath, kUTTypeJPEG, 1, nil);
    CGImageSourceRef imageSourceRef = CGImageSourceCreateWithData((__bridge CFDataRef)[NSData dataWithContentsOfFile:originJPGPath.path], nil);
    NSMutableDictionary *metaData = [(__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(imageSourceRef, 0, nil) mutableCopy];
    // Write the asset identifier into the Apple MakerNote dictionary under key "17".
    NSMutableDictionary *makerNote = [NSMutableDictionary dictionary];
    [makerNote setValue:assetIdentifier forKey:kFigAppleMakerNote_AssetIdentifier];
    [metaData setValue:makerNote forKey:(__bridge NSString *)kCGImagePropertyMakerAppleDictionary];
    CGImageDestinationAddImageFromSource(dest, imageSourceRef, 0, (__bridge CFDictionaryRef)metaData);
    CGImageDestinationFinalize(dest);
    CFRelease(imageSourceRef);
    CFRelease(dest);
}
After doing this, we get the image.
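If you want to confirm that the identifier really landed in the file, you can read the metadata back from finalJPGPath. This check is not part of the original demo, just a small verification sketch:
CGImageSourceRef src = CGImageSourceCreateWithURL((__bridge CFURLRef)finalJPGPath, nil);
NSDictionary *props = (__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(src, 0, nil);
NSDictionary *makerApple = props[(__bridge NSString *)kCGImagePropertyMakerAppleDictionary];
NSLog(@"Asset identifier in MakerApple[17]: %@", makerApple[@"17"]);
CFRelease(src);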
2. Process the video
Use the following method to process the video so that it can be paired with the image.
+ (void)writeToFileWithOriginMovPath:(NSURL *)originMovPath
                 TargetWriteFilePath:(NSURL *)finalMovPath
                     AssetIdentifier:(NSString *)assetIdentifier {
    AVURLAsset *asset = [AVURLAsset assetWithURL:originMovPath];
    AVAssetTrack *videoTrack = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;
    AVAssetTrack *audioTrack = [asset tracksWithMediaType:AVMediaTypeAudio].firstObject;
    if (!videoTrack) {
        return;
    }
    AVAssetReaderOutput *videoOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:@{(__bridge NSString *)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]}];
    NSDictionary *audioDic = @{AVFormatIDKey : @(kAudioFormatLinearPCM),
                               AVLinearPCMIsBigEndianKey : @NO,
                               AVLinearPCMIsFloatKey : @NO,
                               AVLinearPCMBitDepthKey : @(16)
                               };
    AVAssetReaderTrackOutput *audioOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack outputSettings:audioDic];
    NSError *error;
    AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:&error];
    if ([reader canAddOutput:videoOutput]) {
        [reader addOutput:videoOutput];
    } else {
        NSLog(@"Add video output error\n");
    }
    if ([reader canAddOutput:audioOutput]) {
        [reader addOutput:audioOutput];
    } else {
        NSLog(@"Add audio output error\n");
    }
    NSDictionary *outputSetting = @{AVVideoCodecKey : AVVideoCodecH264,
                                    AVVideoWidthKey : [NSNumber numberWithFloat:videoTrack.naturalSize.width],
                                    AVVideoHeightKey : [NSNumber numberWithFloat:videoTrack.naturalSize.height]
                                    };
    AVAssetWriterInput *videoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:outputSetting];
    videoInput.expectsMediaDataInRealTime = true;
    videoInput.transform = videoTrack.preferredTransform;
    NSDictionary *audioSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   [NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
                                   [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
                                   [NSNumber numberWithFloat:44100], AVSampleRateKey,
                                   [NSNumber numberWithInt:128000], AVEncoderBitRateKey,
                                   nil];
    AVAssetWriterInput *audioInput = [AVAssetWriterInput assetWriterInputWithMediaType:[audioTrack mediaType] outputSettings:audioSettings];
    audioInput.expectsMediaDataInRealTime = true;
    audioInput.transform = audioTrack.preferredTransform;
    NSError *error_two;
    AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:finalMovPath fileType:AVFileTypeQuickTimeMovie error:&error_two];
    if (error_two) {
        NSLog(@"CreateWriterError:%@\n", error_two);
    }
    // The content identifier written here ties this movie to the still image.
    writer.metadata = @[[self metaDataSet:assetIdentifier]];
    [writer addInput:videoInput];
    [writer addInput:audioInput];
    NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
                                                           [NSNumber numberWithInt:kCVPixelFormatType_32BGRA],
                                                           (__bridge id)kCVPixelBufferPixelFormatTypeKey, nil];
    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoInput sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];
    AVAssetWriterInputMetadataAdaptor *adapter = [self metadataSetAdapter];
    [writer addInput:adapter.assetWriterInput];
    [writer startWriting];
    [reader startReading];
    [writer startSessionAtSourceTime:kCMTimeZero];
    CMTimeRange dummyTimeRange = CMTimeRangeMake(CMTimeMake(0, 1000), CMTimeMake(200, 3000));
    // Metadata reset: mark the still-image time so the system knows which frame pairs with the photo.
    AVMutableMetadataItem *item = [AVMutableMetadataItem metadataItem];
    item.key = kKeyStillImageTime;
    item.keySpace = kKeySpaceQuickTimeMetadata;
    item.value = [NSNumber numberWithInt:0];
    item.dataType = @"com.apple.metadata.datatype.int8";
    [adapter appendTimedMetadataGroup:[[AVTimedMetadataGroup alloc] initWithItems:[NSArray arrayWithObject:item] timeRange:dummyTimeRange]];
    dispatch_queue_t createMovQueue = dispatch_queue_create("createMovQueue", DISPATCH_QUEUE_SERIAL);
    dispatch_async(createMovQueue, ^{
        while (reader.status == AVAssetReaderStatusReading) {
            CMSampleBufferRef videoBuffer = [videoOutput copyNextSampleBuffer];
            CMSampleBufferRef audioBuffer = [audioOutput copyNextSampleBuffer];
            if (videoBuffer) {
                while (!videoInput.isReadyForMoreMediaData || !audioInput.isReadyForMoreMediaData) {
                    usleep(1);
                }
                if (audioBuffer) {
                    [audioInput appendSampleBuffer:audioBuffer];
                    CFRelease(audioBuffer);
                }
                // With cropping:
                // CMTime startTime = CMSampleBufferGetPresentationTimeStamp(videoBuffer);
                // CVPixelBufferRef pixBufferRef = [self cropSampleBuffer:videoBuffer inRect:CGRectMake(0, 0, 720, 720)];
                // [adaptor appendPixelBuffer:pixBufferRef withPresentationTime:startTime];
                // CVPixelBufferRelease(pixBufferRef);
                // CMSampleBufferInvalidate(videoBuffer);
                // videoBuffer = nil;
                // Without cropping:
                [adaptor.assetWriterInput appendSampleBuffer:videoBuffer];
                CMSampleBufferInvalidate(videoBuffer);
                CFRelease(videoBuffer);
                videoBuffer = nil;
            } else {
                // videoBuffer was NULL; keep polling until the reader finishes.
                continue;
            }
        }
        dispatch_sync(dispatch_get_main_queue(), ^{
            [writer finishWritingWithCompletionHandler:^{
                NSLog(@"Finish \n");
            }];
        });
    });
    // Block until the writer is done so the file is ready when this method returns.
    while (writer.status == AVAssetWriterStatusWriting) {
        [[NSRunLoop currentRunLoop] runUntilDate:[NSDate dateWithTimeIntervalSinceNow:0.5]];
    }
}

+ (AVMetadataItem *)metaDataSet:(NSString *)assetIdentifier {
    AVMutableMetadataItem *item = [AVMutableMetadataItem metadataItem];
    item.key = kKeyContentIdentifier;
    item.keySpace = kKeySpaceQuickTimeMetadata;
    item.value = assetIdentifier;
    item.dataType = @"com.apple.metadata.datatype.UTF-8";
    return item;
}
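The commented-out branch in the read/write loop above refers to a -cropSampleBuffer:inRect: helper whose implementation is not shown in this article. For reference, one possible Core Image based version could look like the following; this is only a sketch, not the demo's actual cropping code:
+ (CVPixelBufferRef)cropSampleBuffer:(CMSampleBufferRef)sampleBuffer inRect:(CGRect)rect {
    CVPixelBufferRef sourceBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *image = [CIImage imageWithCVPixelBuffer:sourceBuffer];
    // Crop, then move the cropped region back to the origin.
    CIImage *cropped = [image imageByCroppingToRect:rect];
    cropped = [cropped imageByApplyingTransform:CGAffineTransformMakeTranslation(-rect.origin.x, -rect.origin.y)];
    CVPixelBufferRef outputBuffer = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault,
                        (size_t)rect.size.width,
                        (size_t)rect.size.height,
                        kCVPixelFormatType_32BGRA,
                        (__bridge CFDictionaryRef)@{(__bridge NSString *)kCVPixelBufferIOSurfacePropertiesKey : @{}},
                        &outputBuffer);
    static CIContext *context = nil;
    if (!context) { context = [CIContext contextWithOptions:nil]; }
    [context render:cropped toCVPixelBuffer:outputBuffer];
    return outputBuffer; // The caller releases this with CVPixelBufferRelease, as in the commented-out code above.
}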
3. Compose the Live Photo
With the methods above, we have processed both the video and the cover image. Next we use PHPhotoLibrary to write them to the photo library directly as a Live Photo. The following method saves the Live Photo and reports the result of the save through a callback.
- (void)saveLivePhotoToAlbumWithMovPath:(NSURL *)movPath ImagePath:(NSURL *)jpgPath completed:(void (^)(BOOL isSuccess))didSaveLivePhoto {
    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
        PHAssetCreationRequest *request = [PHAssetCreationRequest creationRequestForAsset];
        PHAssetResourceCreationOptions *options = [[PHAssetResourceCreationOptions alloc] init];
        // The paired video and the photo carry the same asset identifier, so Photos treats them as one Live Photo.
        [request addResourceWithType:PHAssetResourceTypePairedVideo fileURL:movPath options:options];
        [request addResourceWithType:PHAssetResourceTypePhoto fileURL:jpgPath options:options];
    } completionHandler:^(BOOL success, NSError * _Nullable error) {
        if (success) {
            NSLog(@"Save success\n");
            didSaveLivePhoto(YES);
        } else {
            didSaveLivePhoto(NO);
        }
    }];
}
If you find this useful, please give it a like. I do have a working demo, but it is currently mixed into a larger project and not yet properly encapsulated, so I will upload it to GitHub once it is cleaned up.
Addendum: if you have already recorded a video, here is one more method that grabs a single frame from it to use as the image.
- (UIImage *)firstFrame:(NSURL *)videoURL {
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
    AVAssetImageGenerator *generator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
    generator.appliesPreferredTrackTransform = YES;
    CGImageRef cgImage = [generator copyCGImageAtTime:CMTimeMake(0, 1) actualTime:nil error:nil];
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return image;
}
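If you go this route, the returned UIImage still has to end up on disk before it can be fed into the image-processing step above. A small sketch (the output path is a placeholder):
UIImage *cover = [self firstFrame:videoURL];
NSData *jpgData = UIImageJPEGRepresentation(cover, 1.0);
NSURL *originJPG = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"origin.jpg"]];
[jpgData writeToURL:originJPG atomically:YES];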