The main job of this class is to extract the image for any frame of a video.
1. Creating an AVAssetImageGenerator

AVAssetImageGenerator has two initializers, an instance method and a class method; both take an AVAsset object as their only parameter:
- (instancetype)initWithAsset:(AVAsset *)asset;
+ (instancetype)assetImageGeneratorWithAsset:(AVAsset *)asset;
2. Usage

Getting the video frame (image) at a specific time

The main method is:

- (nullable CGImageRef)copyCGImageAtTime:(CMTime)requestedTime actualTime:(nullable CMTime *)actualTime error:(NSError * _Nullable * _Nullable)outError;
Here CMTime is a struct:

typedef struct
{
    CMTimeValue value;      /*! @field value The value of the CMTime. value/timescale = seconds. */
    CMTimeScale timescale;  /*! @field timescale The timescale of the CMTime. value/timescale = seconds. */
    CMTimeFlags flags;      /*! @field flags The flags, eg. kCMTimeFlags_Valid, kCMTimeFlags_PositiveInfinity, etc. */
    CMTimeEpoch epoch;      /*! @field epoch Differentiates between equal timestamps that are actually different
                                because of looping, multi-item sequencing, etc. Used during comparison: greater
                                epochs happen after lesser ones. Addition/subtraction is only possible within a
                                single epoch, however, since epoch length may be unknown/variable. */
} CMTime;
This is a C API. The key relationship to remember is: the time we want, in seconds, = value / timescale. For example, 1 s = CMTimeMake(1, 1) and 0.5 s = CMTimeMake(1, 2).
So, to grab the frame at a particular moment:
- (void)getImageWithTime:(CMTime)time {
    NSURL *videoUrl = [[NSBundle mainBundle] URLForResource:@"hubblecast.m4v" withExtension:nil];
    AVAsset *videoAsset = [AVAsset assetWithURL:videoUrl];
    self.videoAsset = videoAsset;
    AVAssetImageGenerator *imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:self.videoAsset];
    self.imageGenerator = imageGenerator;
    imageGenerator.maximumSize = CGSizeMake(200, 0); // scaled proportionally; if unset, images use the video's natural size
    CMTime actualTime; // the exact time of the image actually returned
    NSError *error = nil;
    CGImageRef CGImage = [imageGenerator copyCGImageAtTime:time actualTime:&actualTime error:&error];
    if (!error) {
        UIImage *image = [UIImage imageWithCGImage:CGImage];
        self.imageView.image = image;
        CGImageRelease(CGImage); // copyCGImageAtTime: follows the Create/Copy rule, so release the image to avoid a leak
        CMTimeShow(actualTime); // {111600/90000 = 1.240}
        CMTimeShow(time);       // {1/1 = 1.000}
    }
}
// Call it: fetch the frame at the 1-second mark
[self getImageWithTime:CMTimeMake(1, 1)];
Notice above that the time we asked for (time) and the actual time of the frame we got back (actualTime) are different. To avoid the discrepancy, set zero tolerance (note that exact-time requests force precise frame decoding, which is slower):

// prevent the returned time from drifting off the requested time
imageGenerator.requestedTimeToleranceBefore = kCMTimeZero;
imageGenerator.requestedTimeToleranceAfter = kCMTimeZero;

Now we get:

CMTimeShow(actualTime); // {90000/90000 = 1.000}
CMTimeShow(time);       // {1/1 = 1.000}
Getting frames at multiple times

The main method is the one below. Wrap each CMTime in an NSValue and pass them in as an array; the completion handler is then invoked once per requested time, so the number of callbacks equals the number of NSValues:

- (void)generateCGImagesAsynchronouslyForTimes:(NSArray<NSValue *> *)requestedTimes completionHandler:(AVAssetImageGeneratorCompletionHandler)handler;
self.imageGenerator =                                                    // 1
    [AVAssetImageGenerator assetImageGeneratorWithAsset:self.videoAsset];

// Generate the @2x equivalent
self.imageGenerator.maximumSize = CGSizeMake(200.0f, 0.0f);              // 2

CMTime duration = self.videoAsset.duration;

NSMutableArray *times = [NSMutableArray array];                          // 3
CMTimeValue increment = duration.value / 20;
CMTimeValue currentValue = 2.0 * duration.timescale;
while (currentValue <= duration.value) {
    CMTime time = CMTimeMake(currentValue, duration.timescale);
    [times addObject:[NSValue valueWithCMTime:time]];
    currentValue += increment;
}

__block NSUInteger imageCount = times.count;                             // 4
__block NSMutableArray *images = [NSMutableArray array];

AVAssetImageGeneratorCompletionHandler handler;                          // 5
handler = ^(CMTime requestedTime,
            CGImageRef imageRef,
            CMTime actualTime,
            AVAssetImageGeneratorResult result,
            NSError *error) {

    if (result == AVAssetImageGeneratorSucceeded) {                      // 6
        UIImage *image = [UIImage imageWithCGImage:imageRef];
        [images addObject:image];
        NSLog(@"%@", image);
    } else {
        NSLog(@"Error: %@", [error localizedDescription]);
    }

    // If the decremented image count is at 0, we're all done.
    if (--imageCount == 0) {                                             // 7
        dispatch_async(dispatch_get_main_queue(), ^{
            // All images have been generated; update the UI here.
        });
    }
};

[self.imageGenerator generateCGImagesAsynchronouslyForTimes:times        // 8
                                          completionHandler:handler];