UIImagePickerController
Currently, the simplest way to integrate video capture into your app is UIImagePickerController: a view controller that wraps a complete video-capture pipeline together with a camera UI.
Before instantiating the camera, first check whether the device supports camera recording:
if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
    NSArray *availableMediaTypes = [UIImagePickerController availableMediaTypesForSourceType:UIImagePickerControllerSourceTypeCamera];
    if ([availableMediaTypes containsObject:(NSString *)kUTTypeMovie]) {
        // video recording is supported
    }
}
Then create a UIImagePickerController instance and set its delegate, so you can post-process the recorded video (for example, save it to the photo album) and respond when the user dismisses the camera:
UIImagePickerController *camera = [UIImagePickerController new];
camera.sourceType = UIImagePickerControllerSourceTypeCamera;
camera.mediaTypes = @[(NSString *)kUTTypeMovie];
camera.delegate = self;
typedef NS_ENUM(NSInteger, UIImagePickerControllerSourceType) {
    UIImagePickerControllerSourceTypePhotoLibrary,
    UIImagePickerControllerSourceTypeCamera,
    UIImagePickerControllerSourceTypeSavedPhotosAlbum
} __TVOS_PROHIBITED;
The mediaTypes values map as follows: kUTTypeAudio for audio only, kUTTypeVideo for video without audio, kUTTypeMovie for video with audio, and kUTTypeImage for still images.
// encoding quality
[camera setVideoQuality:UIImagePickerControllerQualityTypeIFrame1280x720];
// which camera is active by default
[camera setCameraDevice:UIImagePickerControllerCameraDeviceFront];
Finally, the delegate callbacks fired after the user finishes picking or recording:
#pragma mark - UIImagePickerControllerDelegate methods
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    NSURL *recordedVideoURL = [info objectForKey:UIImagePickerControllerMediaURL];
    if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:recordedVideoURL]) {
        [library writeVideoAtPathToSavedPhotosAlbum:recordedVideoURL
                                    completionBlock:^(NSURL *assetURL, NSError *error){}];
    }
    [picker dismissViewControllerAnimated:YES completion:nil];
}
- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker
{
    [picker dismissViewControllerAnimated:YES completion:nil];
}
Extension: customizing the UIImagePickerController UI
UIView *cameraOverlay = [[UIView alloc] init];
camera.showsCameraControls = NO;
camera.cameraOverlayView = cameraOverlay;
然后關(guān)聯(lián) 開始/停止 操作
- (void)takePicture NS_AVAILABLE_IOS(3_1);
// programmatically initiates still image capture. ignored if image capture is in-flight.
// clients can initiate additional captures after receiving -imagePickerController:didFinishPickingMediaWithInfo: delegate callback
- (BOOL)startVideoCapture NS_AVAILABLE_IOS(4_0);
- (void)stopVideoCapture NS_AVAILABLE_IOS(4_0);
Summary: UIImagePickerController makes it very easy to browse the photo library and pick videos or images, and to shoot video and photos yourself. The system provides the common functionality along with a standard UI. At the same time, precisely because everything is uniform and system-provided, it limits how far you can customize the functionality and the interface.
--
AVFoundation
AVFoundation provides many interfaces, built around AVCaptureSession, for working closer to the hardware. AVCaptureSession acts as the bridge between the audio/video inputs and the file outputs, carrying the media data stream between them.
Following that data flow, the initialization code looks like this:
AVCaptureSession *captureSession = [AVCaptureSession new];
AVCaptureDeviceInput *cameraDeviceInput = …
AVCaptureDeviceInput *micDeviceInput = …
AVCaptureMovieFileOutput *movieFileOutput = …

if ([captureSession canAddInput:cameraDeviceInput]) {
    [captureSession addInput:cameraDeviceInput];
}
if ([captureSession canAddInput:micDeviceInput]) {
    [captureSession addInput:micDeviceInput];
}
if ([captureSession canAddOutput:movieFileOutput]) {
    [captureSession addOutput:movieFileOutput];
}
[captureSession startRunning];
Things to note:

- An AVCaptureDeviceInput must be validated before being added to the session:

```
/* @discussion An AVCaptureInput instance can only be added to a session using -addInput: if canAddInput: returns YES. */
- (BOOL)canAddInput:(AVCaptureInput *)input;
```

- All calls into the capture session are blocking, so it is recommended to dispatch them onto a background serial queue:
```
_sessionQueue = dispatch_queue_create( "com.example.capturepipeline.session", DISPATCH_QUEUE_SERIAL );

// perform all capture-session work on this background serial queue
- (void)startRunning
{
    dispatch_sync( _sessionQueue, ^{
        [_captureSession startRunning];
    } );
}
```
Tip: tasks dispatched asynchronously onto a serial queue execute one after another in FIFO order, and the system backs the queue with at most one worker thread, so all the tasks run sequentially. A task submitted with dispatch_sync does not spin up a new thread; it simply runs on the calling thread while the caller waits.
Input configuration: audio and video quality parameters

- If your needs are simple and you do not require precise control over the output quality, use the session preset:
```
/* @discussion The value of this property is an NSString (one of AVCaptureSessionPreset*) indicating the current session preset in use by the receiver. The sessionPreset property may be set while the receiver is running. */
@property(nonatomic, copy) NSString *sessionPreset;
```
Just set this property to one of the system-provided preset levels:
```
NSString *const AVCaptureSessionPresetPhoto;
NSString *const AVCaptureSessionPresetHigh;
NSString *const AVCaptureSessionPresetMedium;
NSString *const AVCaptureSessionPresetLow;
NSString *const AVCaptureSessionPreset352x288;
NSString *const AVCaptureSessionPreset640x480;
NSString *const AVCaptureSessionPreset1280x720;
NSString *const AVCaptureSessionPreset1920x1080;
NSString *const AVCaptureSessionPresetiFrame960x540;
NSString *const AVCaptureSessionPresetiFrame1280x720;
NSString *const AVCaptureSessionPresetInputPriority;
```
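As a sketch of applying a preset (assuming the `captureSession` from the initialization code above), validate it with canSetSessionPreset: before assigning:

```
// Hypothetical snippet: captureSession is the AVCaptureSession created earlier.
if ([captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
    captureSession.sessionPreset = AVCaptureSessionPreset1280x720;
} else {
    // Fall back to a preset that every device supports.
    captureSession.sessionPreset = AVCaptureSessionPresetMedium;
}
```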
- Precise control of frame rate, focus, and exposure

AVCaptureDevice lets you configure frame-rate ranges, exposure, focus, white balance, and more.
```
// Official sample code.
// Different hardware supports different formats, so query the device and pick a suitable one.
- (void)configureCameraForHighestFrameRate:(AVCaptureDevice *)device
{
    AVCaptureDeviceFormat *bestFormat = nil;
    AVFrameRateRange *bestFrameRateRange = nil;
    for ( AVCaptureDeviceFormat *format in [device formats] ) {
        for ( AVFrameRateRange *range in format.videoSupportedFrameRateRanges ) {
            if ( range.maxFrameRate > bestFrameRateRange.maxFrameRate ) {
                bestFormat = format;
                bestFrameRateRange = range;
            }
        }
    }
    if ( bestFormat ) {
        if ( [device lockForConfiguration:NULL] == YES ) {
            device.activeFormat = bestFormat;
            device.activeVideoMinFrameDuration = bestFrameRateRange.minFrameDuration;
            device.activeVideoMaxFrameDuration = bestFrameRateRange.minFrameDuration;
            [device unlockForConfiguration];
        }
    }
}
```
對(duì)于 硬件設(shè)置以后另開一個(gè)單獨(dú)寫 。
- Video stabilization: AVCaptureConnection

Video stabilization was introduced with iOS 6 and the iPhone 4S. The iPhone 6 added a stronger, smoother mode known as cinematic video stabilization. Stabilization is configured not on the capture device but on the AVCaptureConnection. Because not all device formats support every stabilization mode, in practice you should first confirm that the specific mode is supported:
```
AVCaptureDevice *device = ...;
AVCaptureConnection *connection = ...;

AVCaptureVideoStabilizationMode stabilizationMode = AVCaptureVideoStabilizationModeCinematic;
if ([device.activeFormat isVideoStabilizationModeSupported:stabilizationMode]) {
    [connection setPreferredVideoStabilizationMode:stabilizationMode];
}
```
- HDR
Another new feature of the iPhone 6 is video HDR (high dynamic range).
There are two ways to configure video HDR: either set the capture device's videoHDREnabled property on or off directly, or set the automaticallyAdjustsVideoHDREnabled property and leave the decision to the system.
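A minimal sketch of the first approach, assuming `device` is the active video AVCaptureDevice; note that setting videoHDREnabled directly requires first disabling the automatic adjustment, and only works on formats that support HDR:

```
// Hypothetical snippet: device is the active video AVCaptureDevice.
if (device.activeFormat.isVideoHDRSupported) {
    NSError *error = nil;
    if ([device lockForConfiguration:&error]) {
        device.automaticallyAdjustsVideoHDREnabled = NO;
        device.videoHDREnabled = YES;
        [device unlockForConfiguration];
    }
}
```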
#### Hardware access permissions
If the user has not granted access, attempts to record video or audio yield a black picture and silence.
- (void)requestAccessForMediaType:(NSString *)mediaType completionHandler:(void (^)(BOOL granted))handler NS_AVAILABLE_IOS(7_0);
//The media type, either AVMediaTypeVideo or AVMediaTypeAudio
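For example, a sketch of requesting camera access before configuring the session; the completion handler may be invoked on an arbitrary queue, so hop back to the main queue for any UI work:

```
[AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
    dispatch_async(dispatch_get_main_queue(), ^{
        if (granted) {
            // Safe to configure and start the capture session.
        } else {
            // Explain the situation and point the user to Settings.
        }
    });
}];
```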
#### Output configuration
There are two kinds of output:
- File-based: AVCaptureMovieFileOutput. The callbacks received while recording are file-based and hand back the URL of the stored file; you cannot operate on the underlying CMSampleBufferRefs.
- Stream-based: AVCaptureVideoDataOutput and AVCaptureAudioDataOutput. These outputs capture video and audio sample buffers respectively and deliver them to their delegates, which can either process the buffers (for example, apply a filter to the video) or pass them along unmodified.
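As a sketch of setting up the stream-based outputs (the queue names and `self` as delegate are assumptions for illustration; `captureSession` is the session from earlier):

```
AVCaptureVideoDataOutput *videoDataOutput = [AVCaptureVideoDataOutput new];
videoDataOutput.alwaysDiscardsLateVideoFrames = NO; // keep every frame when recording
dispatch_queue_t videoQueue = dispatch_queue_create("com.example.videodata", DISPATCH_QUEUE_SERIAL);
[videoDataOutput setSampleBufferDelegate:self queue:videoQueue];

AVCaptureAudioDataOutput *audioDataOutput = [AVCaptureAudioDataOutput new];
dispatch_queue_t audioQueue = dispatch_queue_create("com.example.audiodata", DISPATCH_QUEUE_SERIAL);
[audioDataOutput setSampleBufferDelegate:self queue:audioQueue];

if ([captureSession canAddOutput:videoDataOutput]) {
    [captureSession addOutput:videoDataOutput];
}
if ([captureSession canAddOutput:audioDataOutput]) {
    [captureSession addOutput:audioDataOutput];
}
```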
##### AVCaptureMovieFileOutput
- 1. Specify where the encoded file should be saved:
```
- (void)startRecordingToOutputFileURL:(NSURL*)outputFileURL recordingDelegate:(id<AVCaptureFileOutputRecordingDelegate>)delegate;
```
- 2. Callback when recording starts:
```
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections;
```
- 3. Callback when recording finishes:
```
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error;
```
##### AVAssetWriter: operating on the data stream directly
- 1. Create an AVCaptureAudioDataOutput and an AVCaptureVideoDataOutput and set their delegates:
```
[_videoDataOutput setSampleBufferDelegate:self queue:_videoDataOutputQueue];
```
- 2. Prepare for encoding: configure an AVAssetWriterInput for audio and one for video, with the desired output settings:
```
if ( [_assetWriter canApplyOutputSettings:audioSettings forMediaType:AVMediaTypeAudio] ) {
    _audioInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeAudio outputSettings:audioSettings sourceFormatHint:audioFormatDescription];
}
```
- 3. Start encoding:
```
[_assetWriter startWriting];
```
- 4. Sample-buffer callback:
```
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection;
```
This callback is shared by audio and video, so you must distinguish between the two when handling the CMSampleBufferRef. At this point we have the live sample stream in hand and can do further processing here, such as applying filters.
- 5. Finally, write the buffer:
```
[input appendSampleBuffer:sampleBuffer];
```
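Putting the last two steps together, a sketch of the shared callback (assuming `_videoInput`/`_audioInput` are the configured AVAssetWriterInputs and `_videoDataOutput` is the video output created in step 1):

```
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // Route the buffer to the matching writer input by comparing the source output.
    AVAssetWriterInput *input = (captureOutput == _videoDataOutput) ? _videoInput : _audioInput;
    // An input only accepts buffers while it is ready; otherwise this frame is dropped.
    if (input.readyForMoreMediaData) {
        [input appendSampleBuffer:sampleBuffer];
    }
}
```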
Additional notes
AVAssetWriterInput can also take care of output configuration; you supply a dictionary of concrete output settings. The keys for audio output settings and for video output settings are defined in the corresponding AVFoundation headers.
There is also a way to obtain a ready-made settings dictionary that matches the current output configuration, with all parameters filled in:
_videoCompressionSettings = [_videoDataOutput recommendedVideoSettingsForAssetWriterWithOutputFileType:AVFileTypeQuickTimeMovie];
_audioCompressionSettings = [_audioDataOutput recommendedAudioSettingsForAssetWriterWithOutputFileType:AVFileTypeQuickTimeMovie];