CMTime definition

Developers usually think of time as a floating-point value. We use NSTimeInterval, which is simply a typedef for double, to represent time in a variety of scenarios; AV Foundation itself uses this type in the time-related APIs of AVAudioPlayer and AVAudioRecorder. Although double-precision values suffice in many common development situations, they cannot be applied to more advanced time-based media development: a single rounding error can result in dropped frames or audio that drifts out of sync. Apple therefore defines the CMTime data type as the time format of the Core Media framework. The type is defined as follows:

 typedef struct
 {
     CMTimeValue    value;      // the numerator: a 64-bit signed integer
     CMTimeScale    timescale;  // the denominator: a 32-bit signed integer
     CMTimeFlags    flags;      // a bitmask describing the state of the time
     CMTimeEpoch    epoch;      // distinguishes otherwise-identical times, e.g. across loop iterations
 } CMTime;


The three most relevant fields of this structure are value, timescale, and flags. CMTimeValue and CMTimeScale are 64-bit and 32-bit signed integers, respectively, and together they form the fractional representation of the time: value is the numerator and timescale is the denominator. CMTimeFlags is a bitmask used to indicate the state of the time, such as whether it is valid, indefinite, or has been rounded. A CMTime instance can mark a specific point in time or represent a duration.
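The flags are rarely touched directly; Core Media provides macros and constants for the common states. A minimal sketch of checking them (the variable names are just for illustration):

CMTime valid = CMTimeMake(5, 1);
CMTime invalid = kCMTimeInvalid; // its flags mark the value as invalid

if (CMTIME_IS_VALID(valid)) {
    // safe to use in arithmetic and comparisons
}
if (CMTIME_IS_INVALID(invalid)) {
    // arithmetic involving an invalid time yields an invalid result
}
// Other constants with dedicated flags: kCMTimeZero, kCMTimeIndefinite,
// kCMTimePositiveInfinity, kCMTimeNegativeInfinity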

CMTime creation

There are several ways to create a CMTime instance, but the most common is the CMTimeMake function, which takes a 64-bit value parameter and a 32-bit timescale parameter. For example, here are several different ways to create a CMTime that represents 5 seconds:

CMTime t1 = CMTimeMake(5, 1);
CMTime t2 = CMTimeMake(3000, 600);
CMTime t3 = CMTimeMake(5000, 1000);

Printing these values to the console using the CMTimeShow function produces the following results:

{5/1 = 5.000}
{3000/600 = 5.000}
{5000/1000 = 5.000}
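Going the other way, CMTimeGetSeconds converts a CMTime back to seconds as a Float64, which is handy for display or logging (precision can be lost in the conversion, so keep the CMTime as the source of truth):

Float64 s1 = CMTimeGetSeconds(t1); // 5.000
Float64 s2 = CMTimeGetSeconds(t2); // 5.000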

Note: when processing video content, a common timescale is 600, a multiple of the usual video frame rates of 24 FPS, 25 FPS, and 30 FPS. A common timescale for audio data is the sample rate, such as 44,100 (44.1 kHz) or 48,000 (48 kHz).
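When a time expressed in one timescale must line up with another, say aligning audio samples to a 600-based video timeline, CMTimeConvertScale retimes it. A small sketch:

CMTime audioTime = CMTimeMake(44100, 44100); // 1s at the audio sample rate
CMTime videoTime = CMTimeConvertScale(audioTime, 600, kCMTimeRoundingMethod_Default);
CMTimeShow(videoTime); // expected: {600/600 = 1.000}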

CMTime calculation

Add

CMTime t4 = CMTimeAdd(t1, t2);

Subtract

CMTime t5 = CMTimeSubtract(t3, t1);
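Printing the results confirms the arithmetic; the exact numerator and denominator Core Media picks when reconciling the two timescales may vary, but the values should read as 10s and 0s:

CMTimeShow(t4); // t1 + t2 = 5s + 5s = 10s
CMTimeShow(t5); // t3 - t1 = 5s - 5s = 0s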

Note: multiplication is handled by CMTimeMultiply; refer to the CMTime documentation for the full set of arithmetic functions.
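For reference, a minimal sketch of the multiply functions (CMTimeMultiply takes an integer multiplier, CMTimeMultiplyByFloat64 a floating-point one):

CMTime t6 = CMTimeMultiply(t1, 2); // 5s * 2 = 10s
CMTimeShow(t6); // expected: {10/1 = 10.000}
CMTime t7 = CMTimeMultiplyByFloat64(t1, 0.5); // 5s * 0.5 = 2.5s
CMTimeShow(t7);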

CMTimeMakeWithSeconds vs. CMTimeMake

In CMTimeMake(a, b), a is the value (for video, typically the current frame count) and b is the timescale (frames per second), so the resulting time is a/b seconds. In CMTimeMakeWithSeconds(a, b), a is already the current time in seconds and b is the preferred timescale to store it with. Here is some code to illustrate:

Float64 seconds = 3;
int32_t preferredTimeScale = 600; // the common timescale for video; a common audio timescale is the sample rate, e.g. 44,100 (44.1 kHz) or 48,000 (48 kHz)
CMTime inTime = CMTimeMakeWithSeconds(seconds, preferredTimeScale);
CMTimeShow(inTime);
// OUTPUT: {1800/600 = 3.000}

And with CMTimeMake:

int64_t value = 1100;
int32_t preferredTimeScale = 600;
CMTime inTime = CMTimeMake(value, preferredTimeScale);
CMTimeShow(inTime);
// OUTPUT: {1100/600 = 1.833}

The output {1100/600 = 1.833} represents a time of 1.833s: 1100 units in total at 600 units per second. In fact, in our case all we care about is the total time at the end; in other words, any (value, 600) pair that evaluates to the same time would do.

requestedTimeTolerance

So why can the extracted frame still be so far off from the requested time? When we grab a frame with

CGImageRef image = [gen copyCGImageAtTime:time actualTime:&actualTime error:&error];

and print CMTimeShow(actualTime), the actual time can turn out to be far from the requested one. Why is that? First, actualTime uses FPS * 1000 as its timescale; as an aside, the frame rate can be obtained like this:

float fps = [[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] nominalFrameRate];

Next, consider why the API has both a request time and an actualTime at all. I was confused by this at first: why does the time I request not equal actualTime? The documentation explains that when you ask for the frame at a point in time, the generator searches within a range; if a cached frame or an indexed keyframe falls inside that range, it is returned directly, as a performance optimization. The range is defined by requestedTimeToleranceAfter and requestedTimeToleranceBefore, so if we want exact times, we can set:

gen.requestedTimeToleranceAfter = kCMTimeZero;
gen.requestedTimeToleranceBefore = kCMTimeZero;

Note: when using AVAssetImageGenerator to grab video thumbnails, setting gen.requestedTimeToleranceAfter = kCMTimeZero; and gen.requestedTimeToleranceBefore = kCMTimeZero; occasionally causes generation to fail; if that happens, just remove those two settings.
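Putting the pieces together, here is a minimal sketch of frame-accurate thumbnail extraction; videoURL stands in for your own media URL:

AVAsset *asset = [AVAsset assetWithURL:videoURL];
AVAssetImageGenerator *gen = [[AVAssetImageGenerator alloc] initWithAsset:asset];
gen.appliesPreferredTrackTransform = YES; // honor the track's rotation
// Ask for frame-accurate results; remove these two lines if generation fails
gen.requestedTimeToleranceBefore = kCMTimeZero;
gen.requestedTimeToleranceAfter = kCMTimeZero;

CMTime requestedTime = CMTimeMakeWithSeconds(3, 600);
CMTime actualTime;
NSError *error = nil;
CGImageRef image = [gen copyCGImageAtTime:requestedTime actualTime:&actualTime error:&error];
if (image) {
    CMTimeShow(actualTime); // compare against the requested time
    UIImage *thumbnail = [UIImage imageWithCGImage:image];
    CGImageRelease(image);
}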

CMTimeRange

The Core Media framework also provides a data type for time ranges, CMTimeRange, which plays an important role in the resource-editing APIs. It is defined as follows:

typedef struct
{
    CMTime          start;      // the starting point of the range
    CMTime          duration;   // the length of the range
} CMTimeRange;

Here start is a CMTime marking the starting point of the range, and duration is a CMTime describing how long the range lasts. Ranges are generally created with CMTimeRangeMake or CMTimeRangeFromTimeToTime, as follows:

 CMTimeRange timeRange1 = CMTimeRangeMake(t1, t2);
 CMTimeRange timeRange2 = CMTimeRangeFromTimeToTime(t4, t3);
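Besides creating ranges, Core Media can also query them; a small sketch, assuming t1 and t2 are the 5s values from earlier (so timeRange1 spans 5s to 10s):

Boolean contains = CMTimeRangeContainsTime(timeRange1, CMTimeMake(7, 1)); // expected: true
CMTime end = CMTimeRangeGetEnd(timeRange1); // expected: 10s (start + duration)
CMTimeShow(end);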

Intersection and union of CMTimeRange

CMTimeRange range1 = CMTimeRangeMake(kCMTimeZero, CMTimeMake(5, 1));
CMTimeRange range2 = CMTimeRangeMake(CMTimeMake(2, 1), CMTimeMake(5, 1));
// Intersection: the overlap of the two ranges, 2s to 5s
CMTimeRange intersectionRange = CMTimeRangeGetIntersection(range1, range2);
CMTimeRangeShow(intersectionRange);
// Union: the combined range, 0s to 7s
CMTimeRange unionRange = CMTimeRangeGetUnion(range1, range2);
CMTimeRangeShow(unionRange);

The print result is as follows:

{{2/1 = 2.000}, {3/1 = 3.000}}
{{0/1 = 0.000}, {7/1 = 7.000}}