Stitching a video from cropped images (video frames) with AVFoundation on iOS

1. Initialize the video writer

- (void)initVideoSetting{
    NSDictionary *outputSettings =
    [NSDictionary dictionaryWithObjectsAndKeys:
     [NSNumber numberWithInt:720], AVVideoWidthKey,
     [NSNumber numberWithInt:1280], AVVideoHeightKey,
     AVVideoCodecH264, AVVideoCodecKey,
     nil];

    _assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                           outputSettings:outputSettings];
    _assetWriterInput.expectsMediaDataInRealTime = YES;

    _pixelBufferAdaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc]
        initWithAssetWriterInput:_assetWriterInput
        sourcePixelBufferAttributes:[NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey, nil]];

    NSString *outputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"1613.mov"];
    // AVAssetWriter fails if the target file already exists, so delete any leftover first.
    [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];
    NSURL *fileUrl = [NSURL fileURLWithPath:outputFilePath];

    NSError *error = nil;
    _assetWriter = [[AVAssetWriter alloc] initWithURL:fileUrl fileType:AVFileTypeQuickTimeMovie error:&error];
    NSAssert(_assetWriter != nil, @"Failed to create AVAssetWriter: %@", error);
    [_assetWriter addInput:_assetWriterInput];
    [_assetWriter startWriting];
    [_assetWriter startSessionAtSourceTime:kCMTimeZero];
}

2. Call the crop-and-compose method from the frame-processing delegate callback

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    @autoreleasepool {
        @try {
            // cropRect: the rectangle you want to crop to, in the frame's coordinates
            [self cropSampleBuffer:sampleBuffer inRect:cropRect];
        } @catch (NSException *e) {
            // skip a frame that fails to process rather than crash the capture session
        }
    }
}
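For context, here is a minimal sketch of wiring up an `AVCaptureVideoDataOutput` so that this delegate method fires. The `_session` ivar and the queue name are assumptions for illustration, not from the original post:

```objectivec
// Sketch: attaching a video data output to an existing AVCaptureSession (_session assumed).
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                               @(kCVPixelFormatType_32BGRA) };
videoOutput.alwaysDiscardsLateVideoFrames = YES; // drop late frames instead of queueing them
dispatch_queue_t queue = dispatch_queue_create("video.frame.queue", DISPATCH_QUEUE_SERIAL);
[videoOutput setSampleBufferDelegate:self queue:queue];
if ([_session canAddOutput:videoOutput]) {
    [_session addOutput:videoOutput];
}
```

The delegate callback is then invoked on the serial `queue` for every captured frame.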

3. The method that crops the sampleBuffer

- (void)cropSampleBuffer:(CMSampleBufferRef)sampleBuffer inRect:(CGRect)rect{
    CFRetain(sampleBuffer);
    UIImage *originImage = [self imageFromSampleBuffer:sampleBuffer]; // standard helper, easy to find online
    UIImage *cropImage = [self cropImage:originImage atRect:rect];    // standard helper, easy to find online
    CVPixelBufferRef pixelBuffer = [self pixelBufferFromUIImage:cropImage];
    [self startTakingVideo:pixelBuffer];
    CVPixelBufferRelease(pixelBuffer); // pixelBufferFromUIImage returns a +1 buffer; release it to avoid a leak
    CFRelease(sampleBuffer);
}
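The post leaves `imageFromSampleBuffer:` and `cropImage:atRect:` as an exercise for the reader. A minimal sketch of the cropping helper, assuming `rect` is given in the image's pixel coordinates:

```objectivec
// Sketch of the cropping helper; assumes rect is in pixel coordinates.
- (UIImage *)cropImage:(UIImage *)image atRect:(CGRect)rect{
    CGImageRef croppedRef = CGImageCreateWithImageInRect(image.CGImage, rect);
    UIImage *cropped = [UIImage imageWithCGImage:croppedRef
                                           scale:image.scale
                                     orientation:image.imageOrientation];
    CGImageRelease(croppedRef);
    return cropped;
}
```

Note that `CGImageCreateWithImageInRect` works in pixels, so if the source `UIImage` has a scale other than 1 the rect must be multiplied by `image.scale` first.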

4.將裁減后的UIImage轉(zhuǎn)成CVPixelBufferRef

- (CVPixelBufferRef)pixelBufferFromUIImage:(UIImage *)originImage{
    CGImageRef image = [originImage CGImage];
    int height = 1280;
    int width = 720;
    
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    CVPixelBufferRef pxbuffer = NULL;
    
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, width,
                                          height, kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef) options,
                                          &pxbuffer);
    
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);
    
    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);
    
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    
    CGContextRef context = CGBitmapContextCreate(pxdata, width,
                                                 height, 8, 4*width, rgbColorSpace,
                                                 kCGImageAlphaNoneSkipFirst);
    NSParameterAssert(context);
    CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
    // Scale the cropped image to fill the fixed 720x1280 buffer so it matches
    // the writer's output settings.
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    
    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
    
    return pxbuffer; // caller must CVPixelBufferRelease() this buffer
}

5.開始將裁減后的圖和成視頻

- (void)startTakingVideo:(CVPixelBufferRef)pixelBuffer{
    // A very crude way to track this frame's time in the output stream
    // (assumes a constant 25 fps); it's just an example.
    static int64_t frameNumber = 0;
    if (_assetWriterInput.readyForMoreMediaData) {
        [_pixelBufferAdaptor appendPixelBuffer:pixelBuffer
                          withPresentationTime:CMTimeMake(frameNumber, 25)];
        frameNumber++; // only advance the clock for frames that were actually appended
    }
}
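A frame counter at a fixed 25 fps drifts whenever the camera delivers frames at a different or variable rate. A sketch of an alternative that reuses the capture timestamp of the source buffer (the first such timestamp would then also need to be passed to startSessionAtSourceTime: in place of kCMTimeZero):

```objectivec
// Sketch: stamp each appended frame with the source buffer's own capture time.
- (void)appendPixelBuffer:(CVPixelBufferRef)pixelBuffer
         fromSampleBuffer:(CMSampleBufferRef)sampleBuffer{
    CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    if (_assetWriterInput.readyForMoreMediaData) {
        [_pixelBufferAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:pts];
    }
}
```

With this approach the output timeline matches the real capture times, so dropped or irregular frames do not shift the rest of the movie.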

6.停止錄制,并保存視頻

- (void)stopTakingVideo{
    [_assetWriterInput markAsFinished];
    [_assetWriter finishWriting]; // deprecated since iOS 6; prefer finishWritingWithCompletionHandler:
}
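Since finishWriting is deprecated, a sketch of the replacement, which finishes asynchronously and reports the final status:

```objectivec
// Sketch: the non-deprecated asynchronous finish (iOS 6+).
- (void)stopTakingVideoAsync{
    [_assetWriterInput markAsFinished];
    [_assetWriter finishWritingWithCompletionHandler:^{
        if (_assetWriter.status == AVAssetWriterStatusCompleted) {
            NSLog(@"movie written to %@", _assetWriter.outputURL);
        } else {
            NSLog(@"writing failed: %@", _assetWriter.error);
        }
    }];
}
```

The completion block is the right place to move the file out of NSTemporaryDirectory() or hand it to the photo library.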

最后一些注意事項

Do not use AVCaptureMovieFileOutput with this approach: AVCaptureMovieFileOutput conflicts with recording frames through AVAssetWriterInputPixelBufferAdaptor. For the details, see Apple's official documentation.

最后編輯于
?著作權(quán)歸作者所有,轉(zhuǎn)載或內(nèi)容合作請聯(lián)系作者
【社區(qū)內(nèi)容提示】社區(qū)部分內(nèi)容疑似由AI輔助生成,瀏覽時請結(jié)合常識與多方信息審慎甄別。
平臺聲明:文章內(nèi)容(如有圖片或視頻亦包括在內(nèi))由作者上傳并發(fā)布,文章內(nèi)容僅代表作者本人觀點(diǎn),簡書系信息發(fā)布平臺,僅提供信息存儲服務(wù)。

相關(guān)閱讀更多精彩內(nèi)容

  • 原文:AVFoundation Programming Guide 寫在前面 簡單翻譯一下AVFoundation...
    朦朧1919閱讀 6,410評論 1 14
  • 發(fā)現(xiàn) 關(guān)注 消息 iOS 第三方庫、插件、知名博客總結(jié) 作者大灰狼的小綿羊哥哥關(guān)注 2017.06.26 09:4...
    肇東周閱讀 15,289評論 4 61
  • 目前只翻譯完一小部分,先保存一下.之后有空繼續(xù)....(好長好長啊...). 翻譯的比較渣,但應(yīng)該可以看懂......
    NaitY閱讀 6,102評論 2 12
  • 在路上需要一個和你同甘共苦的人一起走,需要一個懂你的人,那個人你在哪,希望能夠遇見你。
    遇見知己閱讀 184評論 0 1
  • 企業(yè)的需求與轉(zhuǎn)變,決定后期人員流動的狀態(tài),管理層要不斷的創(chuàng)新模式,改變原有的現(xiàn)狀。 參與磨合期的發(fā)展,對于新老員工...
    楊平的閱讀 242評論 0 0

友情鏈接更多精彩內(nèi)容