iOS Development: Splitting a Video into Frame Images

A video is ultimately just a sequence of still images, so splitting a video means extracting those images frame by frame.

Here the frame extraction is done with AVAssetImageGenerator, a class that makes it straightforward to grab video frames at arbitrary timestamps. Note that this kind of frame extraction should generally run on a background thread, while any UI updates must be dispatched back to the main thread.
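For a single frame, the synchronous counterpart `copyCGImageAtTime:actualTime:error:` can be run on a background queue, hopping back to the main queue for the UI work. A minimal sketch of that threading pattern (the `imageView` property and the 2-second timestamp are assumptions for illustration):

```objc
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

// Grab one frame (here at the 2-second mark) off the main thread,
// then hand the result back to the main thread for display.
- (void)thumbnailForVideoAtURL:(NSURL *)url {
    dispatch_async(dispatch_get_global_queue(QOS_CLASS_UTILITY, 0), ^{
        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
        AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
        generator.appliesPreferredTrackTransform = YES; // respect the track's rotation metadata

        NSError *error = nil;
        CMTime actualTime;
        CGImageRef cgImage = [generator copyCGImageAtTime:CMTimeMake(2, 1)
                                               actualTime:&actualTime
                                                    error:&error];
        if (!cgImage) {
            NSLog(@"frame grab failed: %@", error);
            return;
        }
        UIImage *frame = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage); // copyCGImageAtTime returns a +1 reference

        dispatch_async(dispatch_get_main_queue(), ^{
            self.imageView.image = frame; // UI update on the main thread (imageView is assumed)
        });
    });
}
```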

  • Use the following method to fetch each frame image for processing:
/*!
@method     generateCGImagesAsynchronouslyForTimes:completionHandler:
@abstract       Returns a series of CGImageRefs for an asset at or near the specified times. 
@param          requestedTimes An NSArray of NSValues, each containing a CMTime, specifying the asset times at which an image is requested. 
@param          handler A block that will be called when an image request is complete. 
@discussion     Employs an efficient "batch mode" for getting images in time order. The client will receive exactly one handler callback for each requested time in requestedTimes. Changes to generator properties (snap behavior, maximum size, etc...) will not affect outstanding asynchronous image generation requests. The generated image is not retained. Clients should retain the image if they wish it to persist after the completion handler returns.
*/

// Fetch each frame image
- (void)generateCGImagesAsynchronouslyForTimes:(NSArray<NSValue *> *)requestedTimes completionHandler:(AVAssetImageGeneratorCompletionHandler)handler;

  • The core code:
- (void)splitVideo:(NSURL *)fileUrl fps:(float)fps splitCompleteBlock:(void (^)(BOOL success, NSMutableArray *splitImgs))splitCompleteBlock {
    if (!fileUrl) {
        return;
    }
    NSMutableArray *splitImages = [NSMutableArray array];
    // Precise duration/timing is not needed here, so opt out to speed up loading.
    NSDictionary *optDict = @{AVURLAssetPreferPreciseDurationAndTimingKey : @(NO)};
    AVURLAsset *avasset = [[AVURLAsset alloc] initWithURL:fileUrl options:optDict];

    CMTime cmtime = avasset.duration;                    // duration as a CMTime struct
    Float64 durationSeconds = CMTimeGetSeconds(cmtime);  // total length in seconds
    NSMutableArray *times = [NSMutableArray array];
    Float64 totalFrames = durationSeconds * fps;         // total number of frames to request
    CMTime timeFrame;
    for (int i = 1; i <= totalFrames; i++) {
        timeFrame = CMTimeMake(i, (int32_t)fps);         // the i-th frame at the given frame rate
        NSValue *timeValue = [NSValue valueWithCMTime:timeFrame];
        [times addObject:timeValue];
    }

    AVAssetImageGenerator *imgGenerator = [[AVAssetImageGenerator alloc] initWithAsset:avasset];
    // Zero tolerance so the generated frames match the requested times exactly.
    imgGenerator.requestedTimeToleranceBefore = kCMTimeZero;
    imgGenerator.requestedTimeToleranceAfter = kCMTimeZero;

    NSInteger timesCount = [times count];
    // Fetch the image for each requested time.
    [imgGenerator generateCGImagesAsynchronouslyForTimes:times completionHandler:^(CMTime requestedTime, CGImageRef _Nullable image, CMTime actualTime, AVAssetImageGeneratorResult result, NSError * _Nullable error) {
        NSLog(@"current-----: %lld", requestedTime.value);
        NSLog(@"timeScale----: %d", requestedTime.timescale); // the timescale equals fps here
        BOOL isSuccess = NO;
        switch (result) {
            case AVAssetImageGeneratorCancelled:
                NSLog(@"Cancelled");
                break;
            case AVAssetImageGeneratorFailed:
                NSLog(@"Failed");
                break;
            case AVAssetImageGeneratorSucceeded: {
                // The generator does not retain the CGImage; wrap it in a UIImage to keep it.
                UIImage *frameImg = [UIImage imageWithCGImage:image];
                [splitImages addObject:frameImg];
                if (requestedTime.value == timesCount) { // callbacks arrive in time order, so this is the last frame
                    isSuccess = YES;
                    NSLog(@"completed");
                }
            }
                break;
        }
        // Report back only once: when the last frame has arrived, or when a request fails.
        if (splitCompleteBlock) {
            if (isSuccess) {
                splitCompleteBlock(YES, splitImages);
            } else if (result != AVAssetImageGeneratorSucceeded) {
                splitCompleteBlock(NO, splitImages);
            }
        }
    }];
}
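A call site might look like the following sketch, assuming the method lives on the same class and that `self.videoURL` and `self.imageView` exist; since the completion handler may fire on a background thread, the UI work is pushed to the main queue:

```objc
// Split at 10 fps, then play the frames back as a simple UIImageView animation.
[self splitVideo:self.videoURL fps:10.0 splitCompleteBlock:^(BOOL success, NSMutableArray *splitImgs) {
    if (!success) {
        NSLog(@"video split failed");
        return;
    }
    NSLog(@"extracted %lu frames", (unsigned long)splitImgs.count);
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.animationImages = splitImgs;          // imageView is an assumption
        self.imageView.animationDuration = splitImgs.count / 10.0; // play back at the split fps
        [self.imageView startAnimating];
    });
}];
```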


GitHub repository: https://github.com/ismilesky/MaxVideo.git
