iOS Live-Streaming Video Capture

1. Core Capture Classes

1. AVFoundation: the framework that contains the main audio/video capture classes, listed below.


2. AVCaptureDevice: represents a piece of hardware such as the camera or the microphone. Use this class to obtain the device objects for video and audio capture (it can represent either an audio or a video capture device).

3. AVCaptureDeviceInput: a device-input object that manages the data captured by a device and feeds it into the session (it can be initialized with either an audio or a video device).

4. AVCaptureOutput: the abstract base class for output objects that receive the captured data.

5. AVCaptureVideoDataOutput: a subclass of AVCaptureOutput that vends video frames.

6. AVCaptureAudioDataOutput: a subclass of AVCaptureOutput that vends audio buffers.

7. AVCaptureConnection: once an input and an output have been added to an AVCaptureSession, the session establishes connections between them, and the connection object can be retrieved from the AVCaptureOutput. (Comparing a sample buffer's connection against the stored video connection is how audio and video data are told apart.)

8. AVCaptureVideoPreviewLayer: the preview layer for video capture; the layer on which the recording is displayed.

9. AVCaptureSession: coordinates the flow of captured audio/video data from inputs to outputs; it is the central object that manages capture and connects inputs to outputs.

2. Steps to Capture Audio and Video (per the official documentation; reference: http://www.itdecent.cn/p/c71bfda055fa)

1. Create an AVCaptureSession object.

2. Obtain the AVCaptureDevice recording devices: the camera for video and the microphone for audio. Note that an AVCaptureDevice does not supply input data itself; it is only used to configure the hardware.

3. From each hardware device (AVCaptureDevice), create the corresponding input object (AVCaptureDeviceInput), which manages data input.

4. Create a video data output object (AVCaptureVideoDataOutput) and set its sample-buffer delegate (setSampleBufferDelegate:queue:) to receive the captured video frames through it.

5. Create an audio data output object (AVCaptureAudioDataOutput) and set its sample-buffer delegate (setSampleBufferDelegate:queue:) to receive the captured audio buffers through it.

6. Add the input objects (AVCaptureDeviceInput) and output objects (AVCaptureOutput) to the session (AVCaptureSession); the session automatically connects the audio input to the audio output and the video input to the video output.

7. Create an AVCaptureVideoPreviewLayer with the session and add it to the container view's layer.

8. Start the AVCaptureSession; data only flows from inputs to outputs while the session is running.

3. The Code

// capture session
@property (nonatomic, strong) AVCaptureSession *captureSession;
// camera device
@property (nonatomic, strong) AVCaptureDevice *videoDevice;
// microphone device
@property (nonatomic, strong) AVCaptureDevice *audioDevice;
// video input object
@property (nonatomic, strong) AVCaptureDeviceInput *videoDeviceInput;
// audio input object
@property (nonatomic, strong) AVCaptureDeviceInput *audioDeviceInput;
// video data output object
@property (nonatomic, strong) AVCaptureVideoDataOutput *videoOutput;
// audio data output object
@property (nonatomic, strong) AVCaptureAudioDataOutput *audioOutput;
// connection between the video input and output
@property (nonatomic, strong) AVCaptureConnection *videoConnection;
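
The method below reads self.captureSession, self.videoDeviceInput, and so on without showing how they are created. A minimal sketch of the lazy getters, assuming the property names above (the error handling and the default audio device lookup are my assumptions, not shown in the original post):

```objc
// Lazily build the session and inputs (a sketch; the original post
// does not show these getters).
- (AVCaptureSession *)captureSession {
    if (!_captureSession) {
        _captureSession = [[AVCaptureSession alloc] init];
    }
    return _captureSession;
}

- (AVCaptureDevice *)audioDevice {
    if (!_audioDevice) {
        _audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    }
    return _audioDevice;
}

- (AVCaptureDeviceInput *)videoDeviceInput {
    if (!_videoDeviceInput) {
        NSError *error = nil;
        _videoDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:self.videoDevice
                                                                   error:&error];
        if (error) NSLog(@"video input error: %@", error);
    }
    return _videoDeviceInput;
}

- (AVCaptureDeviceInput *)audioDeviceInput {
    if (!_audioDeviceInput) {
        NSError *error = nil;
        _audioDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:self.audioDevice
                                                                   error:&error];
        if (error) NSLog(@"audio input error: %@", error);
    }
    return _audioDeviceInput;
}
```

The output objects (videoOutput, audioOutput) would be created the same way with plain alloc/init.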

- (void)captureVideoAndAudio
{
    // begin configuration
    [self.captureSession beginConfiguration];
    // add the video and audio inputs to the session
    if ([self.captureSession canAddInput:self.videoDeviceInput]) {
        [self.captureSession addInput:self.videoDeviceInput];
    }
    if ([self.captureSession canAddInput:self.audioDeviceInput]) {
        [self.captureSession addInput:self.audioDeviceInput];
    }

    // set the delegate that receives the video output data;
    // a serial queue is used for delivery
    dispatch_queue_t videoOutputQueue = dispatch_queue_create("videoOutputQueue", DISPATCH_QUEUE_SERIAL);
    [self.videoOutput setSampleBufferDelegate:self queue:videoOutputQueue];
    if ([self.captureSession canAddOutput:self.videoOutput]) {
        // add the video output to the session
        [self.captureSession addOutput:self.videoOutput];
    }
    dispatch_queue_t audioOutputQueue = dispatch_queue_create("audioOutputQueue", DISPATCH_QUEUE_SERIAL);
    [self.audioOutput setSampleBufferDelegate:self queue:audioOutputQueue];
    if ([self.captureSession canAddOutput:self.audioOutput]) {
        [self.captureSession addOutput:self.audioOutput];
    }
    // grab the video connection, used later to tell audio and video data apart
    self.videoConnection = [self.videoOutput connectionWithMediaType:AVMediaTypeVideo];

    // show the preview layer
    AVCaptureVideoPreviewLayer *capturePreViewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession];
    capturePreViewLayer.frame = self.view.bounds;
    [self.view.layer insertSublayer:capturePreViewLayer atIndex:0];
    // commit the configuration
    [self.captureSession commitConfiguration];
    // start the session
    [self.captureSession startRunning];
}
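
The post stores self.videoConnection but never shows the delegate callback that uses it. Both data outputs deliver sample buffers through the same method of AVCaptureVideoDataOutputSampleBufferDelegate / AVCaptureAudioDataOutputSampleBufferDelegate; a sketch, assuming the view controller adopts both protocols:

```objc
// Video and audio outputs share this callback; comparing the incoming
// connection with self.videoConnection tells the two streams apart.
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if (connection == self.videoConnection) {
        // video frame: hand the CMSampleBufferRef to the video encoder
        NSLog(@"captured a video frame");
    } else {
        // audio buffer: hand it to the audio encoder
        NSLog(@"captured an audio buffer");
    }
}
```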

// get the camera at the given position
- (AVCaptureDevice *)getVideoDevice:(AVCaptureDevicePosition)position
{
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if (device.position == position) {
            return device;
        }
    }
    return nil;
}
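
Note that devicesWithMediaType: has been deprecated since iOS 10; an equivalent lookup via AVCaptureDeviceDiscoverySession might look like this (my sketch, not from the original post; the method name is hypothetical):

```objc
// iOS 10+ replacement for the deprecated devicesWithMediaType: lookup
- (AVCaptureDevice *)discoverVideoDevice:(AVCaptureDevicePosition)position
{
    AVCaptureDeviceDiscoverySession *discovery =
        [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera]
                                                               mediaType:AVMediaTypeVideo
                                                                position:position];
    return discovery.devices.firstObject;
}
```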

- (void)changeCameraPosition
{
    // work out the position to switch to
    AVCaptureDevicePosition currentCameraPosition = [self.videoDevice position];

    if (currentCameraPosition == AVCaptureDevicePositionBack) {
        currentCameraPosition = AVCaptureDevicePositionFront;
    } else {
        currentCameraPosition = AVCaptureDevicePositionBack;
    }

    // fetch the camera for the new position
    AVCaptureDevice *videoDevice = [self getVideoDevice:currentCameraPosition];
    // rebuild the video input
    NSError *error = nil;
    AVCaptureDeviceInput *videoDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:videoDevice error:&error];
    if (videoDeviceInput) {
        [self.captureSession beginConfiguration];
        [self.captureSession removeInput:self.videoDeviceInput];
        if ([self.captureSession canAddInput:videoDeviceInput]) {
            [self.captureSession addInput:videoDeviceInput];
            self.videoDeviceInput = videoDeviceInput;
            self.videoConnection = [self.videoOutput connectionWithMediaType:AVMediaTypeVideo];
            self.videoDevice = videoDevice;
        } else {
            // fall back to the previous input
            [self.captureSession addInput:self.videoDeviceInput];
        }
        [self.captureSession commitConfiguration];
    }
}
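
One thing the post does not cover: since iOS 10 the app must declare NSCameraUsageDescription and NSMicrophoneUsageDescription in Info.plist, and capture only works after the user grants access. A sketch of requesting camera access before starting the session (the method name is my own):

```objc
// Request camera access before calling captureVideoAndAudio (a sketch;
// NSCameraUsageDescription must also be present in Info.plist).
- (void)requestCaptureAccess
{
    [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo
                             completionHandler:^(BOOL granted) {
        if (granted) {
            // the completion handler runs on an arbitrary queue;
            // hop back to main before touching UIKit
            dispatch_async(dispatch_get_main_queue(), ^{
                [self captureVideoAndAudio];
            });
        } else {
            NSLog(@"camera access denied");
        }
    }];
}
```

The same request should be made for AVMediaTypeAudio before capturing from the microphone.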

Finally, the demo project: https://github.com/liaoYuanDi/LYDLiveDemo
