AVFoundation Video Capture

AVFoundation provides a family of classes for media capture. The core ones are AVCaptureSession, AVCaptureInput, and AVCaptureOutput. AVCaptureSession coordinates the flow of data from the media inputs to the outputs.

AVCaptureSession

Configuring AVCaptureSession

// Configure the AVCaptureSession
AVCaptureSession *session = [[AVCaptureSession alloc] init];
[session beginConfiguration];
if ([session canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
    [session setSessionPreset:AVCaptureSessionPreset1280x720];
}
[session commitConfiguration];
self.captureSession = session;

AVCaptureInput

有了Session之后,就需要配置輸入,輸出了。媒體捕捉的輸入主要來自于設(shè)備,使用AVCaptureInput一個子類——AVCaptureDeviceInput。

Configuring AVCaptureDeviceInput

NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
for (AVCaptureDevice *device in devices) {
    if (device.position == AVCaptureDevicePositionBack) { // use the back camera here
        self.backCamera = device;
        NSError *error = nil;
        self.backCameraInput = [[AVCaptureDeviceInput alloc] initWithDevice:device error:&error];
        if (error != nil) {
            NSLog(@"Error creating device input: %@", error);
        }
    }
}
if ([self.captureSession canAddInput:self.backCameraInput]) {
    [self.captureSession addInput:self.backCameraInput];
}

AVCaptureOutput

Commonly used outputs include AVCaptureVideoDataOutput, AVCaptureAudioDataOutput, AVCaptureStillImageOutput, and AVCaptureFileOutput.

AVCaptureStillImageOutput

AVCaptureStillImageOutput *stillOutput = [[AVCaptureStillImageOutput alloc] init];
self.imageOutput = stillOutput;
NSDictionary *imageSetting = @{AVVideoCodecKey: AVVideoCodecJPEG};
[self.imageOutput setOutputSettings:imageSetting];
if ([self.captureSession canAddOutput:self.imageOutput]) {
    [self.captureSession addOutput:self.imageOutput];
}
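AVCaptureVideoDataOutput, listed among the common outputs above, has no example in the original post. The sketch below shows the usual delegate-based setup; the queue name is an illustrative assumption, and it assumes `self.captureSession` is the session configured earlier and that the class adopts AVCaptureVideoDataOutputSampleBufferDelegate.

```objc
// Sketch: receiving raw frames with AVCaptureVideoDataOutput (assumptions noted above).
AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
videoDataOutput.videoSettings = @{
    (NSString *)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)
};
// Drop late frames instead of queuing them up.
videoDataOutput.alwaysDiscardsLateVideoFrames = YES;
dispatch_queue_t videoQueue = dispatch_queue_create("com.example.videoQueue", DISPATCH_QUEUE_SERIAL);
[videoDataOutput setSampleBufferDelegate:self queue:videoQueue];
if ([self.captureSession canAddOutput:videoDataOutput]) {
    [self.captureSession addOutput:videoDataOutput];
}

// Delegate method (goes in the class implementation):
// called once per captured frame on videoQueue.
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Process the pixel buffer here (e.g., hand it to an encoder or a renderer).
}
```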

AVCaptureVideoPreviewLayer

With input and output configured, we still need a way to see what is being captured; for this we use AVCaptureVideoPreviewLayer.
self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession];
self.previewLayer.frame = [UIScreen mainScreen].bounds;
[self.view.layer addSublayer:self.previewLayer];
配置好一起之后,可以開始捕獲了。
[self.captureSession startRunning];

Capturing a Still Image

Use the image output to capture a single still image:
- (void)captureImage {
    // Find the connection that carries video.
    AVCaptureConnection *imageConnection = nil;
    for (AVCaptureConnection *connection in self.imageOutput.connections) {
        for (AVCaptureInputPort *port in connection.inputPorts) {
            if ([port.mediaType isEqualToString:AVMediaTypeVideo]) {
                imageConnection = connection;
                break;
            }
        }
        if (imageConnection != nil) {
            break;
        }
    }
    if (imageConnection != nil) {
        [self.imageOutput captureStillImageAsynchronouslyFromConnection:imageConnection
                                                      completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
            if (error != nil || imageDataSampleBuffer == NULL) {
                NSLog(@"capture failed: %@", error);
                return;
            }
            // Convert the sample buffer to JPEG data, then to a UIImage.
            NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *image = [[UIImage alloc] initWithData:jpegData];
            NSLog(@"get image success: %@", image);
            // Alternatively, access the raw pixels:
            // CVImageBufferRef imageRef = CMSampleBufferGetImageBuffer(imageDataSampleBuffer);
        }];
    }
}

Recording Audio

Capturing and playing audio is handled by dedicated classes (AVAudioRecorder and AVAudioPlayer, configured through AVAudioSession), as the following code shows.
[UIDevice currentDevice].proximityMonitoringEnabled = YES;
if ([[[UIDevice currentDevice] systemVersion] compare:@"7.0"] != NSOrderedAscending) {
    AVAudioSession *session = [AVAudioSession sharedInstance];
    NSError *sessionError = nil;
    // If the phone is held close to the ear, route audio through the receiver
    // (the screen also dims while the proximity sensor is covered).
    if ([[UIDevice currentDevice] proximityState] == YES) {
        NSLog(@"Device is close to user");
        [session setCategory:AVAudioSessionCategoryPlayAndRecord error:&sessionError];
    } else {
        NSLog(@"Device is not close to user");
        [session setCategory:AVAudioSessionCategoryPlayback error:&sessionError];
    }
    if (sessionError != nil) {
        NSLog(@"Error configuring session: %@", sessionError);
    } else {
        [session setActive:YES error:nil];
    }
}
NSString *docDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *playName = [NSString stringWithFormat:@"%@/play.aac", docDir];
if ([[NSFileManager defaultManager] fileExistsAtPath:playName]) {
    [[NSFileManager defaultManager] removeItemAtPath:playName error:nil];
}
// AAC settings; the linear-PCM keys from the original are ignored for AAC and omitted here.
NSDictionary *recorderSettingsDict = @{
    AVFormatIDKey: @(kAudioFormatMPEG4AAC),
    AVSampleRateKey: @(44100),
    AVNumberOfChannelsKey: @(2)
};
NSError *error = nil;
// AVAudioRecorder needs a file URL, not a plain string URL.
self.recorder = [[AVAudioRecorder alloc] initWithURL:[NSURL fileURLWithPath:playName]
                                            settings:recorderSettingsDict
                                               error:&error];
if (self.recorder) {
    self.recorder.meteringEnabled = YES;
    [self.recorder prepareToRecord];
    [self.recorder record];
    // Record for 5 seconds, then play the result back.
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(5 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
        [self.recorder stop];
        // Re-check proximity before playback to pick the output route.
        if ([[UIDevice currentDevice] proximityState] == YES) {
            NSLog(@"Device is close to user");
            [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
        } else {
            NSLog(@"Device is not close to user");
            [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil];
        }
        self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:playName] error:nil];
        [self.audioPlayer prepareToPlay];
        [self.audioPlayer play];
    });
}

最后編輯于
?著作權(quán)歸作者所有,轉(zhuǎn)載或內(nèi)容合作請聯(lián)系作者
【社區(qū)內(nèi)容提示】社區(qū)部分內(nèi)容疑似由AI輔助生成,瀏覽時請結(jié)合常識與多方信息審慎甄別。
平臺聲明:文章內(nèi)容(如有圖片或視頻亦包括在內(nèi))由作者上傳并發(fā)布,文章內(nèi)容僅代表作者本人觀點,簡書系信息發(fā)布平臺,僅提供信息存儲服務(wù)。

相關(guān)閱讀更多精彩內(nèi)容

友情鏈接更多精彩內(nèi)容