I. The Capture Session: AVCaptureSession
AVCaptureSession connects input and output resources. A capture session manages the data streams coming from physical devices (such as cameras and microphones) and routes them to one or more destinations. The input and output pipeline can be reconfigured dynamically, which lets developers rearrange the capture environment while a session is running.
A capture session can be configured with a session preset, which controls the format and quality of the captured data. The default preset is AVCaptureSessionPresetHigh.
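As a minimal sketch, a different preset can be applied like this; the availability check matters because not every device supports every preset:

```objectivec
AVCaptureSession *session = [[AVCaptureSession alloc] init];
// Ask for 720p output if the current device supports it;
// otherwise the session keeps its default preset (High).
if ([session canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
    session.sessionPreset = AVCaptureSessionPreset1280x720;
}
```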
II. Capture Devices
AVCaptureDevice provides access to the system's capture devices; the most commonly used method is defaultDeviceWithMediaType:.
AVCaptureDevice defines an interface for physical devices such as cameras and microphones. These devices are built into Macs and iPhones, but may also be external digital cameras or camcorders. AVCaptureDevice offers a large number of methods for controlling the physical hardware, such as the camera's focus, exposure, white balance, and flash.
III. Capture Device Inputs
Before a capture device can be used, it must be added to a session. A capture device cannot be added to an AVCaptureSession directly; it must first be wrapped in an AVCaptureDeviceInput instance, which acts as a bridge between the device's output data and the capture session. An AVCaptureDeviceInput is created like this:
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
IV. Capture Device Outputs
AVFoundation defines a number of concrete subclasses of AVCaptureOutput, an abstract base class that routes the data produced by a capture session to an output destination. The framework provides the following high-level subclasses:
AVCaptureStillImageOutput: captures still images.
AVCaptureMovieFileOutput: captures audio and video data to a movie file.
The lower-level output classes, AVCaptureAudioDataOutput and AVCaptureVideoDataOutput, give direct access to the digital samples captured by the hardware, which makes real-time processing of audio and video possible.
V. Capture Connections
AVCaptureConnection represents a connection between a capture input and a capture output. A connection can be used to enable or disable the data flow for a given input or output, and to monitor the average and peak power levels on an audio channel.
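As an illustrative sketch of level monitoring (movieOutput here is an assumed AVCaptureMovieFileOutput already attached to a running session):

```objectivec
// Inspect per-channel audio levels on a connection. Values are in dB,
// where 0 dB is full scale and more negative values are quieter.
AVCaptureConnection *audioConnection =
    [movieOutput connectionWithMediaType:AVMediaTypeAudio];
for (AVCaptureAudioChannel *channel in audioConnection.audioChannels) {
    float average = channel.averagePowerLevel;
    float peak = channel.peakHoldLevel;
    NSLog(@"average: %.1f dB, peak: %.1f dB", average, peak);
}
```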
VI. Capture Previews
AVCaptureVideoPreviewLayer provides a real-time preview of the captured data. It is similar to AVPlayerLayer, but tailored to camera capture. AVCaptureVideoPreviewLayer also supports the concept of video gravity, which controls how the video content is scaled and stretched within the layer:
AVLayerVideoGravityResizeAspect: preserves the video's aspect ratio.
AVLayerVideoGravityResizeAspectFill: preserves the aspect ratio and fills the entire layer, which may crop the video.
AVLayerVideoGravityResize: fills the layer, distorting the video.
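A minimal sketch of wiring up a preview, assuming an existing session and a view to host the layer:

```objectivec
// Create a preview layer for the session and size it to the host view.
AVCaptureVideoPreviewLayer *previewLayer =
    [AVCaptureVideoPreviewLayer layerWithSession:session];
previewLayer.frame = view.bounds;
// Fill the layer while preserving aspect ratio (edges may be cropped).
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[view.layer addSublayer:previewLayer];
```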
// Creating a capture session
AVCaptureSession *session = [[AVCaptureSession alloc] init];
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if ([session canAddInput:input]) {
    [session addInput:input];
}
AVCaptureStillImageOutput *output = [[AVCaptureStillImageOutput alloc] init];
if ([session canAddOutput:output]) {
    [session addOutput:output];
}
VII. Key Points
1. Coordinate-space conversion. The capture device's coordinate space differs from the screen's. The capture device coordinate space is based on the camera sensor's native orientation; it does not rotate with the device, and runs from (0, 0) at the top-left corner to (1, 1) at the bottom-right.
2. AVCaptureVideoPreviewLayer provides two methods for converting between the two coordinate spaces:
    captureDevicePointOfInterestForPoint: takes a point in screen coordinates and returns the corresponding point in device coordinates.
    pointForCaptureDevicePointOfInterest: takes a point in device coordinates and returns the corresponding point in screen coordinates.
    Tap-to-focus and tap-to-expose typically rely on these conversions.
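For illustration, a tap handler might perform the conversion like this (previewLayer and focusAtPoint: are assumed to exist elsewhere in the controller):

```objectivec
- (void)handleTapToFocus:(UITapGestureRecognizer *)recognizer {
    // Convert the touch from view coordinates to the capture device's
    // (0,0)-(1,1) coordinate space before asking the camera to focus.
    CGPoint screenPoint = [recognizer locationInView:recognizer.view];
    CGPoint devicePoint =
        [self.previewLayer captureDevicePointOfInterestForPoint:screenPoint];
    [self focusAtPoint:devicePoint];
}
```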
3. Setting up the capture session
// CameraController
- (BOOL)setupSession:(NSError **)error {
    // Create the capture session
    self.captureSession = [[AVCaptureSession alloc] init];
    // Get the default video capture device (the back camera on an iPhone)
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:error];
    if (videoInput && [self.captureSession canAddInput:videoInput]) {
        [self.captureSession addInput:videoInput];
        self.activeVideoInput = videoInput;
    } else {
        return NO;
    }
    // Create the audio capture device input
    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:error];
    if (audioInput && [self.captureSession canAddInput:audioInput]) {
        [self.captureSession addInput:audioInput];
    } else {
        return NO;
    }
    // Still image output
    self.imageOutput = [[AVCaptureStillImageOutput alloc] init];
    self.imageOutput.outputSettings = @{AVVideoCodecKey : AVVideoCodecJPEG};
    if ([self.captureSession canAddOutput:self.imageOutput]) {
        [self.captureSession addOutput:self.imageOutput];
    }
    // Movie file output, for saving to the file system
    self.movieOutput = [[AVCaptureMovieFileOutput alloc] init];
    if ([self.captureSession canAddOutput:self.movieOutput]) {
        [self.captureSession addOutput:self.movieOutput];
    }
    self.videoQueue = dispatch_queue_create("com.videoqueue", NULL);
    return YES;
}
4. Starting and stopping the session. Before a capture session can be used it must be started; the first step is to start the data flow so the session is ready to capture photos and video. a: check whether the session is already running, and call startRunning only if it is not. This is a synchronous call that takes some time, so it should be dispatched asynchronously on videoQueue. b: stopRunning stops the data flow through the system; it is also a synchronous call, so it too should be invoked asynchronously.
- (void)startSession {    //a
    if (![self.captureSession isRunning]) {
        dispatch_async(self.videoQueue, ^{
            [self.captureSession startRunning];
        });
    }
}
- (void)stopSession {    //b
    if ([self.captureSession isRunning]) {
        dispatch_async(self.videoQueue, ^{
            [self.captureSession stopRunning];
        });
    }
}
5. Switching cameras
- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position {
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if (device.position == position) {
            return device;
        }
    }
    return nil;
}
- (AVCaptureDevice *)activeCamera {
    return self.activeVideoInput.device;
}
- (AVCaptureDevice *)inactiveCamera {
    AVCaptureDevice *device = nil;
    if (self.cameraCount > 1) {
        if ([self activeCamera].position == AVCaptureDevicePositionBack) {
            device = [self cameraWithPosition:AVCaptureDevicePositionFront];
        } else {
            device = [self cameraWithPosition:AVCaptureDevicePositionBack];
        }
    }
    return device;
}
- (BOOL)canSwitchCameras {
    return self.cameraCount > 1;
}
- (NSUInteger)cameraCount {
    return [[AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo] count];
}
切換前置和后置攝像頭需要重新配置捕捉會(huì)話, 幸運(yùn)的是可以動(dòng)態(tài)配置AVCaptureSession, 所以不必?fù)?dān)心停止會(huì)話和重啟會(huì)話帶來(lái)的開(kāi)銷, 不過(guò)對(duì)會(huì)話進(jìn)行的任何改變都要通過(guò)beginConfiguration和commitConfiguration進(jìn)行單獨(dú)的原子性的變化.
- (BOOL)switchCameras {
    if (![self canSwitchCameras]) {
        return NO;
    }
    NSError *error;
    AVCaptureDevice *videoDevice = [self inactiveCamera];
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
    if (videoInput) {
        [self.captureSession beginConfiguration];    // mark the start of the atomic configuration change
        [self.captureSession removeInput:self.activeVideoInput];
        if ([self.captureSession canAddInput:videoInput]) {
            [self.captureSession addInput:videoInput];
            self.activeVideoInput = videoInput;
        } else {
            // restore the previous input if the new one cannot be added
            [self.captureSession addInput:self.activeVideoInput];
        }
        // batch all of the queued changes into a single atomic modification of the session
        [self.captureSession commitConfiguration];
    } else {
        // the error could be surfaced to a delegate here before returning
        return NO;
    }
    return YES;
}
6. Configuring capture devices. AVCaptureDevice defines many methods that let developers control the camera; in particular, focus, exposure, and white balance can be adjusted and locked independently. Focus and exposure also support point-of-interest settings, which make tap-to-focus and tap-to-expose possible. AVCaptureDevice can also control the device's LED, used as a flash when taking photos or as a torch. Before modifying a camera device, always check that the operation is supported, otherwise an exception is thrown; not every camera supports every feature. For example, the front camera does not support focusing, while the back camera supports full autofocus.
AVCaptureDevice *device = ...
if ([device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
    // When the change is supported, the pattern is: lock the device for
    // configuration, perform the required changes, then unlock.
    NSError *error;
    if ([device lockForConfiguration:&error]) {
        device.focusMode = AVCaptureFocusModeAutoFocus;
        [device unlockForConfiguration];
    } else {
        // handle error
    }
}
7. Adjusting focus. a: ask whether the camera supports point-of-interest focus. b: receive a point that has already been converted from screen coordinates to capture-device coordinates. c: check that the device supports both point-of-interest focus and autofocus mode; this mode performs a single-scan autofocus.
Implementing tap-to-focus:
- (BOOL)cameraSupportsTapToFocus {    //a
    return [[self activeCamera] isFocusPointOfInterestSupported];
}
- (void)focusAtPoint:(CGPoint)point {    //b
    AVCaptureDevice *device = [self activeCamera];
    if (device.isFocusPointOfInterestSupported &&
        [device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {    //c
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            device.focusPointOfInterest = point;
            // setting the focus mode triggers the focus scan at the new point
            device.focusMode = AVCaptureFocusModeAutoFocus;
            [device unlockForConfiguration];
        } else {
            // handle error
        }
    }
}
8. Tap to expose. a: check whether the device supports point-of-interest exposure. b: check whether the device supports locked exposure mode; if it does, use KVO to observe the state of the device's adjustingExposure property. Observing this property tells us when the exposure adjustment has finished, giving us the chance to lock the exposure at that point. c: confirm that the device is no longer adjusting its exposure level and that its exposureMode can be set to AVCaptureExposureModeLocked.
- (BOOL)cameraSupportsTapToExpose {    //a
    return [[self activeCamera] isExposurePointOfInterestSupported];
}

static const NSString *THCameraAdjustingExposureContext;

- (void)exposeAtPoint:(CGPoint)point {
    AVCaptureDevice *device = [self activeCamera];
    AVCaptureExposureMode exposureMode = AVCaptureExposureModeContinuousAutoExposure;
    if (device.isExposurePointOfInterestSupported &&
        [device isExposureModeSupported:exposureMode]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            device.exposurePointOfInterest = point;
            device.exposureMode = exposureMode;
            if ([device isExposureModeSupported:AVCaptureExposureModeLocked]) {    //b
                [device addObserver:self
                         forKeyPath:@"adjustingExposure"
                            options:NSKeyValueObservingOptionNew
                            context:&THCameraAdjustingExposureContext];
            }
            [device unlockForConfiguration];
        } else {
            // handle error
        }
    }
}
- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    if (context == &THCameraAdjustingExposureContext) {
        AVCaptureDevice *device = (AVCaptureDevice *)object;
        if (!device.isAdjustingExposure &&
            [device isExposureModeSupported:AVCaptureExposureModeLocked]) {    //c
            [object removeObserver:self
                        forKeyPath:@"adjustingExposure"
                           context:&THCameraAdjustingExposureContext];
            dispatch_async(dispatch_get_main_queue(), ^{
                NSError *error;
                if ([device lockForConfiguration:&error]) {
                    device.exposureMode = AVCaptureExposureModeLocked;
                    [device unlockForConfiguration];
                } else {
                    // handle error
                }
            });
        }
    } else {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    }
}
9. Resetting focus and exposure
- (void)resetFocusAndExposureMode {
    AVCaptureDevice *device = [self activeCamera];
    AVCaptureFocusMode focusMode = AVCaptureFocusModeContinuousAutoFocus;
    BOOL canResetFocus = [device isFocusPointOfInterestSupported] &&
                         [device isFocusModeSupported:focusMode];
    AVCaptureExposureMode exposureMode = AVCaptureExposureModeContinuousAutoExposure;
    BOOL canResetExposure = [device isExposurePointOfInterestSupported] &&
                            [device isExposureModeSupported:exposureMode];
    CGPoint centerPoint = CGPointMake(0.5f, 0.5f);
    NSError *error;
    if ([device lockForConfiguration:&error]) {
        if (canResetFocus) {
            device.focusMode = focusMode;
            device.focusPointOfInterest = centerPoint;
        }
        if (canResetExposure) {
            device.exposureMode = exposureMode;
            device.exposurePointOfInterest = centerPoint;
        }
        [device unlockForConfiguration];
    } else {
        // handle error
    }
}
10. Adjusting flash and torch modes. AVCaptureDevice lets developers control the camera's flash and torch modes: the LED on the back of the device acts as a flash when capturing still images, and as a continuous light (torch) when shooting video. The capture device's flashMode and torchMode properties can be set to one of three values: AVCapture(Flash|Torch)Mode(On|Off|Auto).
- (BOOL)cameraHasFlash {
    return [[self activeCamera] hasFlash];
}
- (AVCaptureFlashMode)flashMode {
    return [[self activeCamera] flashMode];
}
- (void)setFlashMode:(AVCaptureFlashMode)flashMode {
    AVCaptureDevice *device = [self activeCamera];
    if ([device isFlashModeSupported:flashMode]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            device.flashMode = flashMode;
            [device unlockForConfiguration];
        } else {
            // handle error
        }
    }
}
- (BOOL)cameraHasTorch {
    return [[self activeCamera] hasTorch];
}
- (AVCaptureTorchMode)torchMode {
    return [[self activeCamera] torchMode];
}
- (void)setTorchMode:(AVCaptureTorchMode)torchMode {
    AVCaptureDevice *device = [self activeCamera];
    if ([device isTorchModeSupported:torchMode]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            device.torchMode = torchMode;
            [device unlockForConfiguration];
        } else {
            // handle error
        }
    }
}
11. Capturing still images. In the setupSession implementation we added an AVCaptureStillImageOutput instance, an AVCaptureOutput subclass used for capturing still images, to the capture session. a: get a pointer to the current AVCaptureConnection used by the AVCaptureStillImageOutput; when looking up the still image output's connection, the AVMediaTypeVideo media type is typically passed.
- (void)captureStillImage {
    AVCaptureConnection *connection = [self.imageOutput connectionWithMediaType:AVMediaTypeVideo];    //a
    if (connection.isVideoOrientationSupported) {
        connection.videoOrientation = [self currentVideoOrientation];
    }
    id handler = ^(CMSampleBufferRef sampleBuffer, NSError *error) {
        if (sampleBuffer != NULL) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:sampleBuffer];
            UIImage *image = [[UIImage alloc] initWithData:imageData];
            // hand the image off here, e.g. write it to the assets library
        } else {
            // handle error
        }
    };
    [self.imageOutput captureStillImageAsynchronouslyFromConnection:connection
                                                  completionHandler:handler];
}
- (AVCaptureVideoOrientation)currentVideoOrientation {
    AVCaptureVideoOrientation orientation;
    switch ([UIDevice currentDevice].orientation) {
        case UIDeviceOrientationPortrait:
            orientation = AVCaptureVideoOrientationPortrait;
            break;
        case UIDeviceOrientationLandscapeRight:
            // device landscape-right corresponds to video landscape-left, and vice versa
            orientation = AVCaptureVideoOrientationLandscapeLeft;
            break;
        case UIDeviceOrientationPortraitUpsideDown:
            orientation = AVCaptureVideoOrientationPortraitUpsideDown;
            break;
        default:
            orientation = AVCaptureVideoOrientationLandscapeRight;
            break;
    }
    return orientation;
}
12. Using the Assets Library framework. The Assets Library framework lets developers work with the user's photo and video library.
ALAuthorizationStatus status = [ALAssetsLibrary authorizationStatus];
if (status == ALAuthorizationStatusDenied) {
    // the app has no access to the library
} else {
    // perform authorized access to the library
}

- (void)writeImageToAssetsLibrary:(UIImage *)image {    // save the image to the photo library
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library writeImageToSavedPhotosAlbum:image.CGImage
                              orientation:(ALAssetOrientation)image.imageOrientation
                          completionBlock:^(NSURL *assetURL, NSError *error) {
        // notify a delegate or update the UI here
    }];
}
13. Video capture. When AVCaptureMovieFileOutput starts recording, it first writes a minimal header at the front of the file; as recording proceeds, fragments are written at a regular interval, building up a complete header. The interval can be changed via the capture output's movieFragmentInterval property. a: query the AVCaptureMovieFileOutput's recording state. b: video stabilization; when supported, enabling it can significantly improve the quality of the captured video. c: smooth autofocus mode slows down the camera's focusing speed. Normally, when the user moves the camera it tries to refocus quickly, which produces a pulsing effect in the captured video; smooth autofocus reduces the focusing speed, giving a more natural-looking recording.
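As a small sketch (assuming the movieOutput property created in setupSession), the fragment interval could be shortened like this:

```objectivec
// Write movie fragments every 5 seconds instead of the default 10,
// so that less footage is lost if recording is interrupted.
self.movieOutput.movieFragmentInterval = CMTimeMake(5, 1);
```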
- (BOOL)isRecording {    //a
    return self.movieOutput.isRecording;
}
- (void)startRecording {
    if (![self isRecording]) {
        AVCaptureConnection *videoConnection = [self.movieOutput connectionWithMediaType:AVMediaTypeVideo];
        if ([videoConnection isVideoOrientationSupported]) {
            videoConnection.videoOrientation = [self currentVideoOrientation];
        }
        if ([videoConnection isVideoStabilizationSupported]) {    //b
            videoConnection.enablesVideoStabilizationWhenAvailable = YES;
        }
        AVCaptureDevice *device = [self activeCamera];
        if (device.isSmoothAutoFocusSupported) {    //c
            NSError *error;
            if ([device lockForConfiguration:&error]) {
                device.smoothAutoFocusEnabled = YES;
                [device unlockForConfiguration];
            } else {
                // handle error
            }
        }
        self.outputURL = [self uniqueURL];
        [self.movieOutput startRecordingToOutputFileURL:self.outputURL recordingDelegate:self];
    }
}
- (NSURL *)uniqueURL {
    NSFileManager *fileManager = [NSFileManager defaultManager];
    // temporaryDirectoryWithTemplateString: is a custom NSFileManager category method
    NSString *dirPath = [fileManager temporaryDirectoryWithTemplateString:@"temp"];
    if (dirPath) {
        NSString *filePath = [dirPath stringByAppendingPathComponent:@"1.mov"];
        return [NSURL fileURLWithPath:filePath];
    }
    return nil;
}
- (void)stopRecording {
    if ([self isRecording]) {
        [self.movieOutput stopRecording];
    }
}
14. Implementing the AVCaptureFileOutputRecordingDelegate protocol. a: before writing to the assets library, check that the video can be written. b: set the thumbnail height according to the video's aspect ratio, and also set appliesPreferredTrackTransform to YES so that the video's transform (such as its orientation) is taken into account when generating the thumbnail.
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error {
    if (error) {
        [self.delegate mediaCaptureFailedWithError:error];
    } else {
        [self writeVideoToAssetsLibrary:[self.outputURL copy]];
    }
    self.outputURL = nil;
}
- (void)writeVideoToAssetsLibrary:(NSURL *)videoURL {
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:videoURL]) {    //a
        ALAssetsLibraryWriteVideoCompletionBlock completionBlock;
        completionBlock = ^(NSURL *assetURL, NSError *error) {
            if (error) {
                [self.delegate assetLibraryWriteFailedWithError:error];
            } else {
                [self generateThumbnailForVideoAtURL:videoURL];
            }
        };
        [library writeVideoAtPathToSavedPhotosAlbum:videoURL completionBlock:completionBlock];
    }
}
- (void)generateThumbnailForVideoAtURL:(NSURL *)videoURL {
    dispatch_async(self.videoQueue, ^{
        AVAsset *asset = [AVAsset assetWithURL:videoURL];
        AVAssetImageGenerator *imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
        // a width of 100 with a height of 0 preserves the video's aspect ratio    //b
        imageGenerator.maximumSize = CGSizeMake(100.0f, 0.0f);
        imageGenerator.appliesPreferredTrackTransform = YES;
        CGImageRef imageRef = [imageGenerator copyCGImageAtTime:kCMTimeZero actualTime:NULL error:nil];
        UIImage *image = [UIImage imageWithCGImage:imageRef];
        CGImageRelease(imageRef);
        dispatch_async(dispatch_get_main_queue(), ^{
            // deliver the thumbnail to the UI
        });
    });
}