iOS Video Recording, Photo Capture, Specified-Region Output, and the Pitfalls

A requirement came up for something like WeChat's short video feature, so without further ado, let's get straight to it.
First, the result:


(demo GIF: recording and playback)

There are roughly three options: UIImagePickerController, AVCaptureSession + AVCaptureMovieFileOutput, and AVCaptureSession + AVAssetWriter. I'll go through them from simplest to hardest, and then cover the various pitfalls encountered along the way, especially on iPhone X and iPad.

UIImagePickerController

This is currently the simplest way to capture video; of course, the degree of customization drops accordingly.
1. Create a UIImagePickerController object and set up sourceType, mediaTypes, delegate and so on:

UIImagePickerController *systemImagePickerVc = [[UIImagePickerController alloc] init];
systemImagePickerVc.delegate = self;
systemImagePickerVc.sourceType = UIImagePickerControllerSourceTypeCamera; // camera source; required for cameraCaptureMode below
systemImagePickerVc.navigationBar.barTintColor = viewController.navigationController.navigationBar.barTintColor;
systemImagePickerVc.navigationBar.tintColor = viewController.navigationController.navigationBar.tintColor;
systemImagePickerVc.videoMaximumDuration = 10;
systemImagePickerVc.mediaTypes = @[(NSString *)kUTTypeImage, (NSString *)kUTTypeMovie]; // kUTType* constants come from <MobileCoreServices/MobileCoreServices.h>
systemImagePickerVc.videoQuality = UIImagePickerControllerQualityTypeHigh;
systemImagePickerVc.cameraCaptureMode = UIImagePickerControllerCameraCaptureModePhoto;
systemImagePickerVc.modalPresentationStyle = UIModalPresentationOverCurrentContext;
systemImagePickerVc.modalTransitionStyle = UIModalTransitionStyleCrossDissolve; // cross-fade presentation
[viewController presentViewController:systemImagePickerVc animated:YES completion:nil];

Then do your work in the delegate: pull out the image or video and handle it however you need:

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSString *type = [info objectForKey:UIImagePickerControllerMediaType];
    if ([type isEqualToString:@"public.image"]) {
        UIImage *image = info[UIImagePickerControllerOriginalImage]; // the captured photo
    } else if ([type isEqualToString:@"public.movie"]) {
        NSURL *videoURL = info[UIImagePickerControllerMediaURL];     // URL of the recorded movie
    }
}

2. UIImagePickerController supports a custom UI: you can provide your own camera controls by hiding the default ones, building a custom view that holds your controls, and laying it over the camera preview:

UIView *cameraOverlayView = [UIView new];
picker.showsCameraControls = NO;
picker.cameraOverlayView = cameraOverlayView;

Summary: UIImagePickerController is fairly plain, but trivial to implement. If the camera is not a core feature of your app, it is a reasonable way to keep things simple, and there are essentially no pitfalls.

AVCaptureSession + AVCaptureMovieFileOutput

I. Video Recording

1. First, create the AVCaptureSession

self.captureSession = ({
        AVCaptureSession *session = [[AVCaptureSession alloc] init];
        if (IS_IPHONEX) {
            if ([session canSetSessionPreset:AVCaptureSessionPreset1920x1080]) {
                [session setSessionPreset:AVCaptureSessionPreset1920x1080];
            }
        } else {
            if ([session canSetSessionPreset:AVCaptureSessionPresetHigh]) {
                [session setSessionPreset:AVCaptureSessionPresetHigh];
            }
        }
        
        session;
    });

2. Add the audio and video inputs

// Video input
AVCaptureDevice *captureDevice = [self getCameraDeviceWithPosition:_cameraDevice];
NSError *error = nil;
self.captureDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:captureDevice error:&error];
if (error) {
    PRINT("captureDeviceInput error:%@",error.localizedDescription)
    return NO;
}

// Audio input
AVCaptureDevice *audioCaptureDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];
error = nil;
AVCaptureDeviceInput *audioCaptureDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:audioCaptureDevice error:&error];
if (error) {
    PRINT("audioCaptureDeviceInput error:%@",error.localizedDescription)
    return NO;
}

// Add the inputs to the session
if ([self.captureSession canAddInput:self.captureDeviceInput]) {
    [self.captureSession addInput:self.captureDeviceInput];
    [self.captureSession addInput:audioCaptureDeviceInput];
    // Video stabilization (note: this connection is only non-nil once captureMovieFileOutput
    // has been created and added to the session, see step 3)
    AVCaptureConnection *connection = [self.captureMovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    if ([connection isVideoStabilizationSupported]) {
        connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeCinematic;
    }
}

3. Add the movie file output and the still image output

    // Movie file output
    self.captureMovieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
    if ([self.captureSession canAddOutput:self.captureMovieFileOutput]) {
        [self.captureSession addOutput:self.captureMovieFileOutput];
    }
    
    // Still image output
    self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [self.stillImageOutput setOutputSettings:outputSettings];
    if ([self.captureSession canAddOutput:self.stillImageOutput]) {
        [self.captureSession addOutput:self.stillImageOutput];
    }

4. Finally, attach the session to a preview layer so it can be displayed

    // Create the video preview layer
    self.captureVideoPreviewLayer = ({
        AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession];
        previewLayer.frame = self.AVCaptureBackgroundView.bounds;
        if (IS_IPAD) {
            previewLayer.videoGravity = AVLayerVideoGravityResizeAspect;
        } else {
            previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        }
        
        // Use the local previewLayer here; self.captureVideoPreviewLayer is still nil inside this block
        if ([previewLayer.connection isVideoOrientationSupported]) {
            [previewLayer.connection setVideoOrientation:[self avOrientationForDeviceOrientation:[UIDevice currentDevice].orientation]];
        }
        [self.AVCaptureBackgroundView.layer insertSublayer:previewLayer atIndex:0];
        previewLayer;
        
    });

5. Start recording

// Get the connection from the movie file output
AVCaptureConnection *connection = [self.captureMovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
// The output's data is driven through this connection
if (![self.captureMovieFileOutput isRecording]) {
    // Begin a background task if multitasking is supported
    if ([[UIDevice currentDevice] isMultitaskingSupported]) {
        //                self.backgroundTaskIdentifier = [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:nil];
    }
    if (self.saveVideoUrl) {
        [[NSFileManager defaultManager] removeItemAtURL:self.saveVideoUrl error:nil];
    }
    if ([connection isVideoOrientationSupported]) {
        connection.videoOrientation = [self avOrientationForDeviceOrientation:[UIDevice currentDevice].orientation];
    }
    if ([connection isVideoMirroringSupported]) {
        AVCaptureDevicePosition currentPosition = [[self.captureDeviceInput device] position];
        if (currentPosition == AVCaptureDevicePositionUnspecified || currentPosition == AVCaptureDevicePositionFront) {
            connection.videoMirrored = YES;
        } else {
            connection.videoMirrored = NO;
        }
    }
    NSString *outputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"myMovie.mov"];
    PRINT("save path is :%@",outputFilePath)
    [self.captureMovieFileOutput startRecordingToOutputFileURL:[NSURL fileURLWithPath:outputFilePath] recordingDelegate:self];
} else {
    [self.captureMovieFileOutput stopRecording];
}

Stopping the recording is simply the [self.captureMovieFileOutput stopRecording]; call in the else branch.
6. Retrieve the file, which is done in the AVCaptureFileOutputRecordingDelegate methods.
There are two delegate callbacks because startRecording does not actually begin capturing immediately, and stopping is delayed in the same way, so the real handling is generally done in these delegates.

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections{
    PRINT("begin record")
}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error{
    PRINT("end record")
}
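
Inside the didFinish... callback you then do whatever you need with outputFileURL. As one possibility (a sketch, not the article's code), you could save the movie to the photo library with the Photos framework, which needs #import <Photos/Photos.h> and photo-library permission:

[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    [PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL:outputFileURL];
} completionHandler:^(BOOL success, NSError * _Nullable saveError) {
    // On success, remove the temp file and update the UI on the main queue.
}];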

II. Taking Photos

We already added an AVCaptureStillImageOutput in the recording setup above, so we grab the photo directly from it.

    AVCaptureConnection *imageConnection = [self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
    if (!imageConnection) {
        return;
    }
    if ([imageConnection isVideoOrientationSupported]) {
        imageConnection.videoOrientation = [self avOrientationForDeviceOrientation:[UIDevice currentDevice].orientation];
    }
    if ([imageConnection isVideoMirroringSupported]) {
        AVCaptureDevicePosition currentPosition = [[self.captureDeviceInput device] position];
        if (currentPosition == AVCaptureDevicePositionUnspecified || currentPosition == AVCaptureDevicePositionFront) {
            imageConnection.videoMirrored = YES;
        } else {
            imageConnection.videoMirrored = NO;
        }
    }
    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:imageConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer == NULL) {
            return;
        }
        [self stopSession];
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        UIImage *image = [UIImage imageWithData:imageData];
        
        if (!image) {
            return;
        }
        
        self.resultImage = [image fixOrientation];
    }];
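
The -fixOrientation call above is a common UIImage category that bakes the EXIF orientation into the pixels; the author's implementation is not shown, but a minimal sketch looks like this:

@implementation UIImage (FixOrientation)
- (UIImage *)fixOrientation {
    if (self.imageOrientation == UIImageOrientationUp) return self;
    // Redraw the image so the result's orientation metadata is simply Up.
    UIGraphicsBeginImageContextWithOptions(self.size, NO, self.scale);
    [self drawInRect:CGRectMake(0, 0, self.size.width, self.size.height)];
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}
@end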

III. Pitfalls

That is the rough flow; now for the pitfalls inside it:
1. The biggest pitfall: attentive readers will have noticed the IS_IPAD and IS_IPHONEX checks above, and that the full-screen look of models like the 6s cannot be achieved on every device. The camera's capture area does not share the screen's aspect ratio (i.e. mainScreen.bounds), so the photo or video you get back differs in size and field of view from what you saw while shooting; on iPhone X and iPad in particular, the result contains more of the scene than was visible on screen. That hurts both the experience and the user's sense of safety, because things they never meant to have in frame end up captured. The mitigation above is to let the user see exactly the camera's frame before shooting; since fit mode is used, the pre-capture preview on iPhone X or iPad is therefore not full screen. There are two complete fixes: (1) keep a full-screen preview while shooting and crop the video or image afterwards (slow and a poor experience; a cropping sketch for the photo case follows below); (2) use the third approach, AVAssetWriter, and process every frame yourself, which solves it cleanly and is covered later.
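
For fix (1) applied to a still photo, a minimal sketch: crop the orientation-normalized image down to the region an AspectFill preview actually showed. The helper name and the centered-crop assumption are mine, not the article's.

- (UIImage *)cropImage:(UIImage *)image toPreviewAspect:(CGSize)previewSize {
    // Assumes image.imageOrientation is Up (e.g. after -fixOrientation) and an AspectFill preview,
    // which shows a centered slice of the camera frame matching the preview's aspect ratio.
    CGFloat imageRatio = image.size.width / image.size.height;
    CGFloat viewRatio = previewSize.width / previewSize.height;
    CGRect cropRect;
    if (imageRatio > viewRatio) {
        // Image is relatively wider: the preview showed a centered horizontal slice.
        CGFloat visibleWidth = image.size.height * viewRatio;
        cropRect = CGRectMake((image.size.width - visibleWidth) / 2.0, 0, visibleWidth, image.size.height);
    } else {
        // Image is relatively taller: the preview showed a centered vertical slice.
        CGFloat visibleHeight = image.size.width / viewRatio;
        cropRect = CGRectMake(0, (image.size.height - visibleHeight) / 2.0, image.size.width, visibleHeight);
    }
    // UIImage.size is in points; CGImage coordinates are in pixels.
    CGFloat scale = image.scale;
    CGRect pixelRect = CGRectMake(cropRect.origin.x * scale, cropRect.origin.y * scale,
                                  cropRect.size.width * scale, cropRect.size.height * scale);
    CGImageRef croppedCGImage = CGImageCreateWithImageInRect(image.CGImage, pixelRect);
    UIImage *cropped = [UIImage imageWithCGImage:croppedCGImage scale:scale orientation:UIImageOrientationUp];
    CGImageRelease(croppedCGImage);
    return cropped;
}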
2. The system automatically mirrors the front camera, so you need to mirror it once more to undo that; and after switching cameras the mirroring has to be set again, which is why it is configured at the moment recording starts in the code above.
3. Audio session usage. With this screen open, press Home to go back to the home screen and you will see a red indicator flash in the status bar: that is the audio-in-use flag. QQ behaves the same way; WeChat does not, presumably because it captures audio and video separately and merges them afterwards. This matters because it can interfere with other audio features. If your app also has voice calls, for example, you need to deactivate the audio session (AVAudioSession setActive:NO) before the other feature starts, and deactivation itself takes time, so getting the timing of this logic right is important. A sketch follows below.
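
A minimal sketch of releasing the audio session once capture has fully stopped, so another feature (or another app) can use the audio hardware again; the option flag is a common choice here, not something taken from the article:

NSError *audioError = nil;
[[AVAudioSession sharedInstance] setActive:NO
                               withOptions:AVAudioSessionSetActiveOptionNotifyOthersOnDeactivation
                                     error:&audioError];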
4. stopRunning and startRunning block the calling thread, so it is best to run them on a serial background queue. They also take effect with a delay rather than instantly; once the session has actually started or stopped, the AVCaptureSessionDidStartRunningNotification / AVCaptureSessionDidStopRunningNotification notifications fire, so wrap the calls up like this:

- (void)stopRunningSession:(AVCaptureSession *)session completion:(void(^)(BOOL success))completion {
    if (!session || ![session isRunning]) {
        if (completion) {
            completion(YES);
        }
        return;
    }
    __block NSObject *stopRunningOKObserver = nil;
    
    stopRunningOKObserver = [[NSNotificationCenter defaultCenter] addObserverForName:AVCaptureSessionDidStopRunningNotification object:session queue:[NSOperationQueue mainQueue] usingBlock:^(NSNotification * _Nonnull note) {
        if (note.object == session) {
            PRINT("session stopRunning success")
            if (completion) {
                completion(YES);
            }
            [[NSNotificationCenter defaultCenter] removeObserver:stopRunningOKObserver];
            stopRunningOKObserver = nil;
        }
    }];
    
    dispatch_async(self.sessionHandleQueue, ^{
        PRINT("session stopRunning")
        [session stopRunning];
    });
}

- (void)startRunningSession:(AVCaptureSession *)session completion:(void(^)(BOOL success))completion {
    if (!session || [session isRunning]) {
        if (completion) {
            completion(NO);
        }
        return;
    }
    __block NSObject *startRunningOKObserver = nil;
    
    startRunningOKObserver = [[NSNotificationCenter defaultCenter] addObserverForName:AVCaptureSessionDidStartRunningNotification object:session queue:[NSOperationQueue mainQueue] usingBlock:^(NSNotification * _Nonnull note) {
        if (note.object == session) {
            if (completion) {
                completion(YES);
            }
            [[NSNotificationCenter defaultCenter] removeObserver:startRunningOKObserver];
            startRunningOKObserver = nil;
        }
    }];
    
    dispatch_async(self.sessionHandleQueue, ^{
        [session startRunning];
    });
}

5. Solving problem 4 surfaces a new one: because [self.captureSession startRunning] and [self.captureSession stopRunning] are delayed, the preview goes black (or flashes) for a moment when starting or stopping. Handle this by picking the right moment, for example only revealing the view that hosts the preview layer once startup has completed (which is why the scan screens of some apps open with an animation or simply a brief black frame). Also, if you want to freeze the picture mid-session, the effect of a nonexistent [self.captureSession pauseRunning], you can disable the connection instead: self.captureVideoPreviewLayer.connection.enabled = NO; to pause and self.captureVideoPreviewLayer.connection.enabled = YES; to resume. A sketch of both tricks follows.
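
A sketch of both tricks, reusing the names from this article (AVCaptureBackgroundView hosts the preview layer, and startRunningSession:completion: is the wrapper above):

// Keep the preview hidden until frames are actually flowing, so the black flash is never visible.
self.AVCaptureBackgroundView.hidden = YES;
[self startRunningSession:self.captureSession completion:^(BOOL success) {
    self.AVCaptureBackgroundView.hidden = NO;
}];

// "pauseRunning" effect: freeze and resume the preview without stopping the session.
self.captureVideoPreviewLayer.connection.enabled = NO;  // pause
self.captureVideoPreviewLayer.connection.enabled = YES; // resume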
6. Similar to 5: switching between front and back cameras involves removeInput/addInput operations, and the swap flashes or blacks out briefly. You can hide it behind a transition of some kind; WeChat, for example, overlays a blur to mask the glitch while the change happens (a sketch of that follows the snippet below).

    [self.captureSession beginConfiguration];
    
    // Remove the old input
    [self.captureSession removeInput:self.captureDeviceInput];
    // Add the new input
    if ([self.captureSession canAddInput:newCaptureDeviceInput]) {
        [self.captureSession addInput:newCaptureDeviceInput];
        self.captureDeviceInput = newCaptureDeviceInput;
    }

    [self.captureSession commitConfiguration];
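
A sketch of the blur mask mentioned in point 6; the UIVisualEffectView approach here is an assumption on my part, not necessarily what WeChat does:

// Cover the preview with a blur before swapping inputs.
UIVisualEffectView *blurView = [[UIVisualEffectView alloc]
    initWithEffect:[UIBlurEffect effectWithStyle:UIBlurEffectStyleLight]];
blurView.frame = self.AVCaptureBackgroundView.bounds;
[self.AVCaptureBackgroundView addSubview:blurView];

// ...perform the beginConfiguration / removeInput / addInput / commitConfiguration shown above...

// Fade the blur out once the new camera is delivering frames.
[UIView animateWithDuration:0.25 animations:^{
    blurView.alpha = 0;
} completion:^(BOOL finished) {
    [blurView removeFromSuperview];
}];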

7. There is also focus, flash, white balance and so on when taking photos; I won't paste that code here, but a generic sketch follows.
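
Not the article's omitted code, but a generic sketch of how tap-to-focus, exposure, flash and white balance are usually configured, reusing the property names from this article (captureVideoPreviewLayer, captureDeviceInput); viewPoint is the tap location in the preview view:

- (void)focusAtViewPoint:(CGPoint)viewPoint {
    // Convert a tap in view coordinates into the device's normalized (0-1) point of interest.
    CGPoint point = [self.captureVideoPreviewLayer captureDevicePointOfInterestForPoint:viewPoint];
    AVCaptureDevice *device = [self.captureDeviceInput device];
    NSError *error = nil;
    if (![device lockForConfiguration:&error]) {
        return;
    }
    if ([device isFocusPointOfInterestSupported] && [device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
        device.focusPointOfInterest = point;
        device.focusMode = AVCaptureFocusModeAutoFocus;
    }
    if ([device isExposurePointOfInterestSupported] && [device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]) {
        device.exposurePointOfInterest = point;
        device.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
    }
    if ([device hasFlash] && [device isFlashModeSupported:AVCaptureFlashModeAuto]) {
        device.flashMode = AVCaptureFlashModeAuto;
    }
    if ([device isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance]) {
        device.whiteBalanceMode = AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance;
    }
    [device unlockForConfiguration];
}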

Summary: this approach suits quickly building your own capture UI, but it has drawbacks on models like iPhone X, and with more full-screen devices on the way it is not a perfect fit. It does offer a few other useful options, such as stopping the recording after a given duration, once the file reaches a given size, or when the device's free disk space drops below a threshold (see the sketch below). If you need more than that, say custom audio/video compression settings, or processing the audio/video samples before they are written to the file, you need the third approach.
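
The stop conditions mentioned above map onto properties of AVCaptureMovieFileOutput; a brief illustration with arbitrary values:

self.captureMovieFileOutput.maxRecordedDuration = CMTimeMakeWithSeconds(10, 600);  // stop after 10 seconds
self.captureMovieFileOutput.maxRecordedFileSize = 50 * 1024 * 1024;                // ...or once the file reaches 50 MB
self.captureMovieFileOutput.minFreeDiskSpaceLimit = 200 * 1024 * 1024;             // ...or when free space drops below 200 MB
// When any limit is hit, recording stops and the didFinishRecording... delegate is called with an error saying why.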

The demo for the AVCaptureMovieFileOutput approach is here; it also includes post-capture image cropping, mosaic, doodling and more.

AVCaptureSession + AVAssetWriter

This is the most polished and also the most troublesome option.
1. First drop the AVCaptureMovieFileOutput created in the second approach and add separate video and audio data outputs. Note the videoQueue, which is the single serial queue all the callbacks are handled on, and the two new delegate protocols AVCaptureVideoDataOutputSampleBufferDelegate and AVCaptureAudioDataOutputSampleBufferDelegate:

@property (nonatomic, strong) dispatch_queue_t videoQueue;
@property (nonatomic, strong) AVCaptureVideoDataOutput *videoOutput;
@property (nonatomic, strong) AVCaptureAudioDataOutput *audioOutput;

    // Video data output; self.videoQueue is the serial queue mentioned above,
    // e.g. self.videoQueue = dispatch_queue_create("videoQueue", DISPATCH_QUEUE_SERIAL); (label illustrative)
    self.videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    [self.videoOutput setSampleBufferDelegate:self queue:self.videoQueue];
    if ([self.captureSession canAddOutput:self.videoOutput]) {
        [self.captureSession addOutput:self.videoOutput];
        [self.captureSession beginConfiguration];
        AVCaptureConnection *connection = [self.videoOutput connectionWithMediaType:AVMediaTypeVideo];
        if ([connection isVideoStabilizationSupported]) {
            connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeCinematic;
        }
        if ([connection isVideoMirroringSupported]) {
            AVCaptureDevicePosition currentPosition = [[self.captureDeviceInput device] position];
            if (currentPosition == AVCaptureDevicePositionUnspecified || currentPosition == AVCaptureDevicePositionFront) {
                connection.videoMirrored = YES;
            } else {
                connection.videoMirrored = NO;
            }
        }
        [self.captureSession commitConfiguration];
    }
    self.audioOutput = [[AVCaptureAudioDataOutput alloc] init];
    [self.audioOutput setSampleBufferDelegate:self queue:self.videoQueue];
    if([self.captureSession canAddOutput:self.audioOutput]) {
        [self.captureSession addOutput:self.audioOutput];
    }

At this point the delegate gives you real-time callbacks. If you return early inside the delegate, that frame is simply not written, so with a switch such as shouldWrite you can get the TikTok-style effect of recording half a clip, pausing, moving somewhere else and recording the other half.

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    @autoreleasepool {
        // Video
        if (connection == [self.videoOutput connectionWithMediaType:AVMediaTypeVideo]) {
            if (!self.assetWriteManager.outputVideoFormatDescription) {
                @synchronized(self) {
                    CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
                    self.assetWriteManager.outputVideoFormatDescription = formatDescription;
                }
            } else {
                @synchronized(self) {
                    if (self.assetWriteManager.writeState == FMRecordStateRecording) {
                        [self.assetWriteManager appendSampleBuffer:sampleBuffer ofMediaType:AVMediaTypeVideo];
                    }
                }
            }
        }
        
        // Audio
        if (connection == [self.audioOutput connectionWithMediaType:AVMediaTypeAudio]) {
            if (!self.assetWriteManager.outputAudioFormatDescription) {
                @synchronized(self) {
                    CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
                    self.assetWriteManager.outputAudioFormatDescription = formatDescription;
                }
            }
            @synchronized(self) {
                if (self.assetWriteManager.writeState == FMRecordStateRecording) {
                    [self.assetWriteManager appendSampleBuffer:sampleBuffer ofMediaType:AVMediaTypeAudio];
                }
            }
        }
    }
}

The self.assetWriteManager used here is the focus of what follows.
2. For convenience, the AVAssetWriter and its inputs are wrapped in an AVAssetWriteManager, which contains:

@property (nonatomic, strong) dispatch_queue_t writeQueue; // single queue for all write operations
@property (nonatomic, strong) NSURL *videoUrl;             // file URL the frames are written to
@property (nonatomic, strong) AVAssetWriter *assetWriter;  // the writer doing the actual work
@property (nonatomic, strong) AVAssetWriterInput *assetWriterVideoInput;
@property (nonatomic, strong) AVAssetWriterInput *assetWriterAudioInput;
@property (nonatomic, strong) NSDictionary *videoCompressionSettings;
@property (nonatomic, strong) NSDictionary *audioCompressionSettings;
@property (nonatomic, assign) BOOL canWrite;               // switch that allows or blocks writing
@property (nonatomic, assign) CGSize outputSize;           // video size used when writing

Then initialize the writer. When setting the outputSettings for the video input, the AVVideoWidthKey and AVVideoHeightKey parameters specify the output video's width and height; they are the core of the fix for pitfall 1 of the previous approach.

- (void)setUpWriterWithIsFront:(BOOL)isFront {
    self.assetWriter = [AVAssetWriter assetWriterWithURL:self.videoUrl fileType:AVFileTypeMPEG4 error:nil];
    // Output video size
    NSInteger numPixels = self.outputSize.width * self.outputSize.height;
    // Bits per pixel
    CGFloat bitsPerPixel = 6.0;
    NSInteger bitsPerSecond = numPixels * bitsPerPixel;
    // Bit rate and frame rate settings
    NSDictionary *compressionProperties = @{AVVideoAverageBitRateKey:@(bitsPerSecond), AVVideoExpectedSourceFrameRateKey:@(30), AVVideoMaxKeyFrameIntervalKey:@(30), AVVideoProfileLevelKey:AVVideoProfileLevelH264BaselineAutoLevel};
    // Video settings
    UIDeviceOrientation orientation = [UIDevice currentDevice].orientation;
    CGFloat widthKey = self.outputSize.height;
    CGFloat heightKey = self.outputSize.width;
    if (IS_IPAD && UIDeviceOrientationIsLandscape(orientation)) {
        widthKey = self.outputSize.width;
        heightKey = self.outputSize.height;
    } // This is a pitfall, discussed below.
    self.videoCompressionSettings = @{AVVideoCodecKey:AVVideoCodecH264, AVVideoScalingModeKey:AVVideoScalingModeResizeAspectFill, AVVideoWidthKey:@(widthKey), AVVideoHeightKey:@(heightKey), AVVideoCompressionPropertiesKey:compressionProperties};
    
    _assetWriterVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:self.videoCompressionSettings];

    _assetWriterVideoInput.expectsMediaDataInRealTime = YES;
    switch (orientation) {
        case UIDeviceOrientationPortraitUpsideDown:
            _assetWriterVideoInput.transform = CGAffineTransformMakeRotation(isFront ? M_PI_2 : (M_PI + M_PI_2));
            break;
        case UIDeviceOrientationLandscapeLeft:
            _assetWriterVideoInput.transform = CGAffineTransformMakeRotation(isFront ? -M_PI : 0);
            break;
        case UIDeviceOrientationLandscapeRight:
            _assetWriterVideoInput.transform = CGAffineTransformMakeRotation(isFront ? 0 : M_PI);
            break;
        default:
            _assetWriterVideoInput.transform = CGAffineTransformMakeRotation(isFront ? M_PI + M_PI_2 : M_PI_2);
            break;
    }

    // Audio settings
    self.audioCompressionSettings = @{AVEncoderBitRatePerChannelKey:@(28000), AVFormatIDKey:@(kAudioFormatMPEG4AAC), AVNumberOfChannelsKey:@(1), AVSampleRateKey:@(22050)};
    
    _assetWriterAudioInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:self.audioCompressionSettings];
    _assetWriterAudioInput.expectsMediaDataInRealTime = YES;
    
    if ([_assetWriter canAddInput:_assetWriterVideoInput]) {
        [_assetWriter addInput:_assetWriterVideoInput];
    } else {
        PRINT("AssetWriter videoInput append Failed")
    }
    if ([_assetWriter canAddInput:_assetWriterAudioInput]) {
        [_assetWriter addInput:_assetWriterAudioInput];
    } else {
        PRINT("AssetWriter audioInput Append Failed")
    }
    
    self.writeState = FMRecordStateRecording;
}

Then perform the actual writing as each frame arrives:

// Append incoming sample buffers to the writer
- (void)appendSampleBuffer:(CMSampleBufferRef)sampleBuffer ofMediaType:(NSString *)mediaType {
    if (sampleBuffer == NULL) {
        PRINT("empty sampleBuffer")
        return;
    }
    
    @synchronized(self) {
        if (self.writeState < FMRecordStateRecording) {
            return;
        }
    }
    
    CFRetain(sampleBuffer);
    dispatch_async(self.writeQueue, ^{
        @autoreleasepool {
            @synchronized(self) {
                if (self.writeState > FMRecordStateRecording) {
                    CFRelease(sampleBuffer);
                    return;
                }
            }
            
            if (!self.canWrite && mediaType == AVMediaTypeVideo) {
                [self.assetWriter startWriting];
                [self.assetWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
                self.canWrite = YES;
            }
            
            // Append the video data
            if (mediaType == AVMediaTypeVideo) {
                if (self.assetWriterVideoInput.readyForMoreMediaData) {
                    BOOL success = [self.assetWriterVideoInput appendSampleBuffer:sampleBuffer];
                    if (!success) {
                        @synchronized (self) {
                            [self stopWrite];
                            [self destroyWrite];
                        }
                    }
                }
            }
            
            // Append the audio data
            if (mediaType == AVMediaTypeAudio) {
                if (self.assetWriterAudioInput.readyForMoreMediaData) {
                    BOOL success = [self.assetWriterAudioInput appendSampleBuffer:sampleBuffer];
                    if (!success) {
                        @synchronized (self) {
                            [self stopWrite];
                            [self destroyWrite];
                        }
                    }
                }
            }
            CFRelease(sampleBuffer);
        }
    } );
}

When you want to finish, call finishWriting and report the result back via the delegate:

- (void)stopWrite {
    self.writeState = FMRecordStateFinish;

    __weak __typeof(self)weakSelf = self;
    if (_assetWriter && _assetWriter.status == AVAssetWriterStatusWriting){
        dispatch_async(self.writeQueue, ^{
            [_assetWriter finishWritingWithCompletionHandler:^{
                dispatch_async(dispatch_get_main_queue(), ^{
                    if (weakSelf.delegate && [weakSelf.delegate respondsToSelector:@selector(finishWritingWithURL:)]) {
                        [weakSelf.delegate finishWritingWithURL:weakSelf.videoUrl];
                    }
                });
            }];
        });
    } else {
        if (_assetWriter) {
            PRINT("assetWriter wrong, status:%ld", _assetWriter.status)
        } else {
            PRINT("assetWriter not exist")
        }
        if (weakSelf.delegate && [weakSelf.delegate respondsToSelector:@selector(finishWritingWithURL:)]) {
            [weakSelf.delegate finishWritingWithURL:weakSelf.videoUrl];
        }
    }
}

That is the rough flow, but the number of pitfalls along the writer route is anything but ordinary...

1. As the comment in the writer setup above hints, AVVideoWidthKey and AVVideoHeightKey are a trap. The width and height you assign are swapped relative to what you would expect: judged by how a person watches it, a short video's resolution is width x height, but the capture device's default orientation is landscape left, i.e. rotated 90 degrees, so the actual buffer resolution is height x width, the opposite of the intuitive order. On iPad in landscape the device has already rotated those 90 degrees, so the two agree again, hence the patch-up:

if (IS_IPAD && UIDeviceOrientationIsLandscape(orientation)) {
    widthKey = self.outputSize.width;
    heightKey = self.outputSize.height;
}

2. About videoUrl, the path the data is written to: no file may already exist there. You have to remove any existing file before writing starts, otherwise the writer crashes and reports status == AVAssetWriterStatusFailed:

- (BOOL)checkPathUrl:(NSURL *)url {
    if (!url) {
        return NO;
    }
    if ([[NSFileManager defaultManager] fileExistsAtPath:[url path]]) {
        return [[NSFileManager defaultManager] removeItemAtPath:[url path] error:nil];
    }
    return YES;
}

3. When writing starts, make sure startWriting is called before startSessionAtSourceTime.
4. Mind the threading, and remember to release the sampleBuffer (the CFRetain/CFRelease pair above).
5. When initializing the writer, note that the current camera position is passed in. This hides a huge pitfall: the rotation transform differs between the front and back cameras, and it has to be right so that videos recorded in landscape on iPhone come out with the correct orientation.
6. Back in the session view controller where self.assetWriteManager is driven: it is best to re-create the assetWriteManager every time a recording starts, otherwise you can hit the error status == AVAssetWriterStatusUnknown:

    self.assetWriteManager = [[JTAVAssetWriteManager alloc] initWithURL:[NSURL fileURLWithPath:outputFilePath] outputSize:self.outputSize];
    self.assetWriteManager.delegate = self;
    [self.assetWriteManager startWriteWithIsFront:isFront];

7. To increase sharpness, enlarge outputSize (up to a limit set by the device's camera); raising the bit rate also helps.
Also: a demo of this third approach will be added later.

Summary: this approach is undeniably precise; every setting can be customized, it is just more work. If your requirements are high, use it, and it is also the most extensible: features like pausing a recording and continuing later are easy to add afterwards. And since you get the sample buffers in real time, things like adding a watermark are convenient to do on the fly rather than in a slow pass after recording finishes (a sketch follows):
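
A hedged sketch of that idea: composite a watermark CIImage onto each video pixel buffer before handing it to appendSampleBuffer. The method and names are illustrative, not the article's code:

- (void)applyWatermark:(CIImage *)watermarkImage toSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL) {
        return;
    }
    CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    // Position the watermark as needed; here it is simply drawn over the frame at its own extent.
    CIImage *composited = [watermarkImage imageByCompositingOverImage:frame];
    static CIContext *context = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        context = [CIContext contextWithOptions:nil];
    });
    // Render the composite back into the pixel buffer; in a real pipeline you may prefer rendering
    // into your own buffer (e.g. via AVAssetWriterInputPixelBufferAdaptor) rather than in place.
    [context render:composited toCVPixelBuffer:pixelBuffer];
}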

Finally, a comparison shot of the three approaches:


References:
About AVFoundation
Capturing Video on iOS
Camera Capture on iOS
AVCaptureSession pause
