iOS AVCaptureSession Study Notes (Part 1)

Basic Usage Flow

AVCaptureSession is the core class of AVFoundation for capturing video and audio: it coordinates the flow of data from the inputs to the outputs. The diagram below (found on Jianshu) shows the classes that revolve around AVCaptureSession.

(Figure: AVCapureSession.jpg — the classes surrounding AVCaptureSession)

Overview of the core classes around AVCaptureSession

Common operations on a session:

1. Create the AVCaptureSession

Set the sessionPreset, which controls the quality (resolution) of the output streams; note that it is not a direct bitrate setting.

// 1 Create the session
AVCaptureSession *session = [AVCaptureSession new];
// Set the session's output resolution
if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone)
    [session setSessionPreset:AVCaptureSessionPreset640x480];
else
    [session setSessionPreset:AVCaptureSessionPresetPhoto];
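Not every device supports every preset, so it is safer to check before assigning. A minimal sketch (the 720p preference here is an arbitrary assumption, not from the demo):

```objectivec
// Prefer 720p, but fall back to a broadly supported preset if it is unavailable.
if ([session canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
    session.sessionPreset = AVCaptureSessionPreset1280x720;
} else {
    session.sessionPreset = AVCaptureSessionPresetMedium;
}
```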

2. Add an input to the session

The input is usually video or audio data (you can also add both). It is supplied to the AVCaptureSession as an AVCaptureDeviceInput.

// 2 Get the default video device (the back camera) and add it to the capture session
NSError *error = nil;
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];

isUsingFrontFacingCamera = NO; // ivar from the SquareCam demo
if ([session canAddInput:deviceInput]){
    [session addInput:deviceInput];
}
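The snippet above always returns the default (back) camera. To actually honor the isUsingFrontFacingCamera flag you would select a device by position; a sketch using the devicesWithMediaType: API of the same era as the demo (deprecated in iOS 10 in favor of AVCaptureDeviceDiscoverySession):

```objectivec
// Pick the camera at the desired position (front or back).
AVCaptureDevicePosition desired = isUsingFrontFacingCamera
    ? AVCaptureDevicePositionFront
    : AVCaptureDevicePositionBack;
AVCaptureDevice *camera = nil;
for (AVCaptureDevice *d in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
    if (d.position == desired) {
        camera = d;
        break;
    }
}
```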

3. Add an output to the session

Add an AVCaptureOutput, i.e. an output of the AVCaptureSession. Outputs generally fall into audio/video frame outputs, still-image outputs, and file outputs:

  • Audio/video frame outputs: AVCaptureAudioDataOutput and AVCaptureVideoDataOutput.
  • Still-image output: AVCaptureStillImageOutput (replaced by AVCapturePhotoOutput in iOS 10).
  • File output: AVCaptureMovieFileOutput.

If you need the raw audio/video frames, configure the output before adding it to the session: set its videoSettings or audioSettings (mainly the sample format) and assign the callback delegate along with its dispatch queue.

// 4 Create the AVCaptureStillImageOutput used for taking photos, register a KVO observer on "capturingStillImage", and add the output to the session. When the observed value becomes YES a still-image capture has started; the KVO callback shows a shutter-flash effect
stillImageOutput = [AVCaptureStillImageOutput new];
[stillImageOutput addObserver:self forKeyPath:@"capturingStillImage" options:NSKeyValueObservingOptionNew context:(__bridge void * _Nullable)(AVCaptureStillImageIsCapturingStillImageContext)];
if ([session canAddOutput:stillImageOutput]){
    [session addOutput:stillImageOutput];
}

// 5 Create the video data output, set its video settings and the delegate callback queue, then add it to the session
videoDataOutput = [AVCaptureVideoDataOutput new];

// we want BGRA, both CoreGraphics and OpenGL work well with 'BGRA'
NSDictionary *rgbOutputSettings = [NSDictionary dictionaryWithObject:
                                   [NSNumber numberWithInt:kCMPixelFormat_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
[videoDataOutput setVideoSettings:rgbOutputSettings];
[videoDataOutput setAlwaysDiscardsLateVideoFrames:YES]; // discard if the data output queue is blocked (as we process the still image)

// create a serial dispatch queue used for the sample buffer delegate as well as when a still image is captured
// a serial dispatch queue must be used to guarantee that video frames will be delivered in order
// see the header doc for setSampleBufferDelegate:queue: for more information
videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
[videoDataOutput setSampleBufferDelegate:self queue:videoDataOutputQueue];

if ([session canAddOutput:videoDataOutput]){
    [session addOutput:videoDataOutput];
}
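The frames delivered by this output arrive through the AVCaptureVideoDataOutputSampleBufferDelegate callback on videoDataOutputQueue. A minimal sketch of the delegate method (the body is illustrative; SquareCam's version runs face detection here):

```objectivec
// Called on videoDataOutputQueue for every captured video frame.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Per the videoSettings above, each sample buffer wraps a 32BGRA pixel buffer.
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL) return;
    // Process the frame here.
}
```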

4. AVCaptureConnection: setting key properties of an input/output link

After the inputs and outputs have been added to the session, you can obtain an AVCaptureConnection from an audio or video output. Through the connection you set important properties of that output's audio or video, for example the output video's videoOrientation. (Note that videoOrientation is not the device orientation: by default recorded video comes out rotated 90 degrees, which is due to the orientation of the camera sensor.)

[[videoDataOutput connectionWithMediaType:AVMediaTypeVideo] setEnabled:NO];
AVCaptureConnection *videoCon = [videoDataOutput connectionWithMediaType:AVMediaTypeVideo];

// Without the assignment below the recorded video comes out rotated 90 degrees; that is the default.
if ([videoCon isVideoOrientationSupported]) {
//        videoCon.videoOrientation = AVCaptureVideoOrientationPortrait;
    // The commented-out line below matches the system's default video orientation. If you want
    // the output sample images to come out upright, set the orientation to portrait instead.
    //videoCon.videoOrientation = AVCaptureVideoOrientationLandscapeLeft;
}
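If you want the frames delivered upright, set the orientation explicitly; a sketch (portrait is an assumption that matches a portrait-only UI):

```objectivec
AVCaptureConnection *conn = [videoDataOutput connectionWithMediaType:AVMediaTypeVideo];
if ([conn isVideoOrientationSupported]) {
    // Rotate delivered frames so they come out upright in a portrait UI.
    conn.videoOrientation = AVCaptureVideoOrientationPortrait;
}
```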

5. The video preview layer, AVCaptureVideoPreviewLayer

Once the inputs and outputs have been added to the session, you can create an AVCaptureVideoPreviewLayer from it; this is the layer that shows the live camera preview.

previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[previewLayer setBackgroundColor:[[UIColor blackColor] CGColor]];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill]; // fill while preserving aspect ratio (edges may be cropped)
CALayer *rootLayer = [previewView layer];
[rootLayer setMasksToBounds:YES];
[previewLayer setFrame:[rootLayer bounds]];
[rootLayer addSublayer:previewLayer];

6. Start the session

// 7 Start the session; the outputs begin receiving sample-buffer callbacks
[session startRunning];
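Note that startRunning is a blocking call and can take a noticeable amount of time, so Apple's documentation recommends not calling it on the main thread. A sketch (the queue name is arbitrary):

```objectivec
// startRunning blocks until capture is fully started; keep it off the main thread.
dispatch_queue_t sessionQueue = dispatch_queue_create("SessionQueue", DISPATCH_QUEUE_SERIAL);
dispatch_async(sessionQueue, ^{
    [session startRunning];
});
```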

Complete session initialization from Apple's SquareCam demo

See the inline comments for details.

/**
 *  Camera initialization
 */
- (void)setupAVCapture
{
    NSError *error = nil;
    
    // 1 Create the session
    AVCaptureSession *session = [AVCaptureSession new];
    // 2 Set the session's output resolution
    if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone)
        [session setSessionPreset:AVCaptureSessionPreset640x480];
    else
        [session setSessionPreset:AVCaptureSessionPresetPhoto];
    
    // 3 Get the default video device (the back camera) and add it to the capture session
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    
    isUsingFrontFacingCamera = NO;
    if ( [session canAddInput:deviceInput] )
        [session addInput:deviceInput];
    
    // 4 Create the AVCaptureStillImageOutput used for taking photos, register a KVO observer
    //   on "capturingStillImage", and add the output to the session. When the observed value
    //   becomes YES a still-image capture has started; the KVO callback shows a shutter-flash effect
    stillImageOutput = [AVCaptureStillImageOutput new];
    [stillImageOutput addObserver:self forKeyPath:@"capturingStillImage" options:NSKeyValueObservingOptionNew context:(__bridge void * _Nullable)(AVCaptureStillImageIsCapturingStillImageContext)];
    if ( [session canAddOutput:stillImageOutput] )
        [session addOutput:stillImageOutput];
    
    // 5 Create the video data output, set its video settings and the delegate callback queue,
    //   then add it to the session
    videoDataOutput = [AVCaptureVideoDataOutput new];
    videoDataOutput = [AVCaptureVideoDataOutput new];
    
    // we want BGRA, both CoreGraphics and OpenGL work well with 'BGRA'
    NSDictionary *rgbOutputSettings = [NSDictionary dictionaryWithObject:
                                       [NSNumber numberWithInt:kCMPixelFormat_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    [videoDataOutput setVideoSettings:rgbOutputSettings];
    [videoDataOutput setAlwaysDiscardsLateVideoFrames:YES]; // discard if the data output queue is blocked (as we process the still image)
    
    // create a serial dispatch queue used for the sample buffer delegate as well as when a still image is captured
    // a serial dispatch queue must be used to guarantee that video frames will be delivered in order
    // see the header doc for setSampleBufferDelegate:queue: for more information
    videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
    [videoDataOutput setSampleBufferDelegate:self queue:videoDataOutputQueue];
    
    if ( [session canAddOutput:videoDataOutput] )
        [session addOutput:videoDataOutput];
    
    
    [[videoDataOutput connectionWithMediaType:AVMediaTypeVideo] setEnabled:NO];
    AVCaptureConnection *videoCon = [videoDataOutput connectionWithMediaType:AVMediaTypeVideo];
    
    // Without the assignment below the recorded video comes out rotated 90 degrees; that is the default.
    if ([videoCon isVideoOrientationSupported]) {
//        videoCon.videoOrientation = AVCaptureVideoOrientationPortrait;
        
        // The commented-out line below matches the system's default video orientation. If you want
        // the output sample images to come out upright, set the orientation to portrait instead.
        //videoCon.videoOrientation = AVCaptureVideoOrientationLandscapeLeft;
    }
    effectiveScale = 1.0;
    // 6 Create the camera's live preview layer, set its gravity to AVLayerVideoGravityResizeAspectFill,
    //   size it to the container's bounds, and add it to the view hierarchy
    previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    [previewLayer setBackgroundColor:[[UIColor blackColor] CGColor]];
    [previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill]; // fill while preserving aspect ratio (edges may be cropped)
    CALayer *rootLayer = [previewView layer];
    [rootLayer setMasksToBounds:YES];
    [previewLayer setFrame:[rootLayer bounds]];
    [rootLayer addSublayer:previewLayer];
    
    // 7 Start the session; the outputs begin receiving sample-buffer callbacks
    [session startRunning];

bail: // in the original SquareCam code this label is reached via goto when an earlier call fails
    if (error) {
        UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:[NSString stringWithFormat:@"Failed with error %d", (int)[error code]]
                                                            message:[error localizedDescription]
                                                           delegate:nil 
                                                  cancelButtonTitle:@"Dismiss" 
                                                  otherButtonTitles:nil];
        [alertView show];
        [self teardownAVCapture];
    }
}
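The teardownAVCapture called on failure should undo the setup above. A hypothetical sketch of what it needs to cover (the names mirror this listing; SquareCam's actual implementation may differ in detail):

```objectivec
// Hypothetical teardown mirroring setupAVCapture.
- (void)teardownAVCapture
{
    [[previewLayer session] stopRunning];
    [stillImageOutput removeObserver:self forKeyPath:@"capturingStillImage"];
    [previewLayer removeFromSuperlayer];
    previewLayer = nil;
}
```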

For more detail, see Apple's official sample code: SquareCam.

