Goal: use AVCaptureSession from AVFoundation to set the camera resolution and frame rate (including high frame rates), switch between the front and back cameras, focus, handle screen rotation, adjust exposure, and more.
Before reading:
- For the underlying concepts, see the companion article "iOS Video Capture Overview (AVCaptureSession)"
- Based on the AVFoundation framework
GitHub (with sample code): iOS Video Capture in Practice (AVCaptureSession)
1. Setting Resolution and Frame Rate
1.1. Low frame rate mode (fps <= 30)
When the required frame rate is at most 30 fps, resolution and frame rate are configured independently: one method sets the frame rate, another sets the resolution, and the two are not coupled.
- Setting the resolution
The method below sets the camera resolution. The available presets are listed in the API documentation; the largest currently supported is 3840x2160. If you never need the camera to exceed 30 fps, this approach is sufficient.
- (void)setCameraResolutionByPresetWithHeight:(int)height session:(AVCaptureSession *)session {
    /*
     Note: this method only supports frame rates <= 30 fps, because rates above
     30 fps require `activeFormat`, and `activeFormat` and `sessionPreset` are
     mutually exclusive.
     */
    AVCaptureSessionPreset preset = [self getSessionPresetByResolutionHeight:height];
    if ([session.sessionPreset isEqualToString:preset]) {
        NSLog(@"No need to set the camera resolution repeatedly!");
        return;
    }
    
    if (![session canSetSessionPreset:preset]) {
        NSLog(@"Can't set the sessionPreset!");
        return;
    }
    
    [session beginConfiguration];
    session.sessionPreset = preset;
    [session commitConfiguration];
}
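The helper `getSessionPresetByResolutionHeight:` is not shown in the article; a minimal sketch, assuming a few common preset heights (the exact set you support is a project decision), could look like this:

```objc
// Hypothetical helper: maps a resolution height to a session preset.
// Note that 480 maps to a 4:3 preset; the others are 16:9.
- (AVCaptureSessionPreset)getSessionPresetByResolutionHeight:(int)height {
    switch (height) {
        case 2160: return AVCaptureSessionPreset3840x2160;
        case 1080: return AVCaptureSessionPreset1920x1080;
        case 720:  return AVCaptureSessionPreset1280x720;
        case 480:  return AVCaptureSessionPreset640x480;
        default:   return AVCaptureSessionPreset1280x720;
    }
}
```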
- Setting the frame rate
The method below sets the camera frame rate; it only supports rates up to 30 fps.
- (void)setCameraForLFRWithFrameRate:(int)frameRate {
    // Only for frame rates <= 30 fps
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([captureDevice lockForConfiguration:NULL]) {
        [captureDevice setActiveVideoMinFrameDuration:CMTimeMake(1, frameRate)];
        [captureDevice setActiveVideoMaxFrameDuration:CMTimeMake(1, frameRate)];
        [captureDevice unlockForConfiguration];
    }
}
1.2. High frame rate mode (fps > 30)
To run a given resolution at a high frame rate (50, 60, 120 fps, ...), the old approach of calling setActiveVideoMinFrameDuration and setActiveVideoMaxFrameDuration on top of a sessionPreset no longer works. Apple requires that these frame-duration setters be used together with the activeFormat property, which now also selects the resolution.
The activeFormat approach and sessionPreset are mutually exclusive: setting one invalidates the other. If your project may ever need high frame rates, it is best to standardize on activeFormat and drop the low-frame-rate methods, to avoid compatibility problems.
With this change, Apple merged the previously separate resolution and frame-rate settings into one: each format pairs a resolution with the frame-rate ranges it supports, and each frame rate likewise constrains the available resolutions, so we must enumerate the device's formats to find a match. In high-frame-rate mode the old separate setters are effectively deprecated. Choose based on your project: if you are certain you will never need more than 30 fps, the old methods remain simple and effective.
Note: once activeFormat is used, the resolution previously set through sessionPreset automatically becomes `AVCaptureSessionPresetInputPriority`, so any `if` statements comparing against canSetSessionPreset will no longer behave as expected. If your project must support high frame rates, abandon the sessionPreset approach entirely.
+ (BOOL)setCameraFrameRateAndResolutionWithFrameRate:(int)frameRate andResolutionHeight:(CGFloat)resolutionHeight bySession:(AVCaptureSession *)session position:(AVCaptureDevicePosition)position videoFormat:(OSType)videoFormat {
    AVCaptureDevice *captureDevice = [self getCaptureDevicePosition:position];
    
    for (AVCaptureDeviceFormat *vFormat in [captureDevice formats]) {
        CMFormatDescriptionRef description = vFormat.formatDescription;
        float maxRate = ((AVFrameRateRange *)[vFormat.videoSupportedFrameRateRanges objectAtIndex:0]).maxFrameRate;
        if (maxRate >= frameRate && CMFormatDescriptionGetMediaSubType(description) == videoFormat) {
            // Compare the format's dimensions with the requested resolution
            CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(description);
            if (dims.height == resolutionHeight && dims.width == [self getResolutionWidthByHeight:resolutionHeight]) {
                [session beginConfiguration];
                if ([captureDevice lockForConfiguration:NULL]) {
                    captureDevice.activeFormat = vFormat;
                    [captureDevice setActiveVideoMinFrameDuration:CMTimeMake(1, frameRate)];
                    [captureDevice setActiveVideoMaxFrameDuration:CMTimeMake(1, frameRate)];
                    [captureDevice unlockForConfiguration];
                } else {
                    NSLog(@"%s: lock failed!", __func__);
                }
                [session commitConfiguration];
                return YES;
            }
        }
    }
    
    NSLog(@"%s: no matching format for frame rate %d, resolution height = %f", __func__, frameRate, resolutionHeight);
    return NO;
}
+ (AVCaptureDevice *)getCaptureDevicePosition:(AVCaptureDevicePosition)position {
    NSArray *devices = nil;
    
    if (@available(iOS 10.0, *)) {
        AVCaptureDeviceDiscoverySession *deviceDiscoverySession = [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera] mediaType:AVMediaTypeVideo position:position];
        devices = deviceDiscoverySession.devices;
    } else {
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wdeprecated-declarations"
        devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
#pragma clang diagnostic pop
    }
    
    for (AVCaptureDevice *device in devices) {
        if (position == device.position) {
            return device;
        }
    }
    
    return nil;
}
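The width check above relies on `getResolutionWidthByHeight:`, which the article also leaves out. A minimal sketch, assuming 16:9 formats (with 640x480 as the 4:3 exception), might be:

```objc
// Hypothetical helper: derives the expected width for a given format height.
+ (int)getResolutionWidthByHeight:(CGFloat)height {
    if (height == 480) {
        return 640;                     // 4:3 format
    }
    return (int)(height * 16 / 9);      // 16:9: 720 -> 1280, 1080 -> 1920, 2160 -> 3840
}
```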
2. Switching Between Front and Back Cameras
Switching cameras looks simple, but in real applications it causes many problems, because on the same device the front and back cameras support different resolutions and frame rates; switching from a supported configuration to an unsupported one fails. A concrete case:
On an iPhone X, the back camera supports up to (4K, 60 fps) while the front camera tops out at (2K, 30 fps). Switching from the back camera at (4K, 60 fps) to the front camera without adjusting the configuration will fail and leave the program in a broken state.
Note
In the code below we set `session.sessionPreset = AVCaptureSessionPresetLow;`. When switching cameras we need to recompute the maximum resolution and frame rate the new input device supports, but we cannot query that until the input has been added to the session. So we first set a preset that is guaranteed to be acceptable, add the input, then compute the device's maximum supported resolution and frame rate and apply them.
- (void)setCameraPosition:(AVCaptureDevicePosition)position session:(AVCaptureSession *)session input:(AVCaptureDeviceInput *)input videoFormat:(OSType)videoFormat resolutionHeight:(CGFloat)resolutionHeight frameRate:(int)frameRate {
    if (input) {
        [session beginConfiguration];
        [session removeInput:input];
        
        AVCaptureDevice *device = [self.class getCaptureDevicePosition:position];
        
        NSError *error = nil;
        AVCaptureDeviceInput *newInput = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                               error:&error];
        if (newInput == nil) {
            NSLog(@"%s: error:%@", __func__, error.localizedDescription);
            [session commitConfiguration];
            return;
        }
        
        // Example: the back camera may run at 4K while the front camera tops out at 2K,
        // so switching requires a downgrade. We cannot compute the new camera's maximum
        // resolution until its input has been added to the session, so set a safe preset first.
        session.sessionPreset = AVCaptureSessionPresetLow;
        if ([session canAddInput:newInput]) {
            self.input = newInput;
            [session addInput:newInput];
        } else {
            NSLog(@"%s: add input failed.", __func__);
            [session commitConfiguration];
            return;
        }
        
        int maxResolutionHeight = [self getMaxSupportResolutionByPreset];
        if (resolutionHeight > maxResolutionHeight) {
            resolutionHeight = maxResolutionHeight;
            self.cameraModel.resolutionHeight = resolutionHeight;
            NSLog(@"%s: Current max supported resolution height = %d", __func__, maxResolutionHeight);
        }
        
        int maxFrameRate = [self getMaxFrameRateByCurrentResolution];
        if (frameRate > maxFrameRate) {
            frameRate = maxFrameRate;
            self.cameraModel.frameRate = frameRate;
            NSLog(@"%s: Current max supported frame rate = %d", __func__, maxFrameRate);
        }
        
        BOOL isSuccess = [self.class setCameraFrameRateAndResolutionWithFrameRate:frameRate
                                                              andResolutionHeight:resolutionHeight
                                                                        bySession:session
                                                                         position:position
                                                                      videoFormat:videoFormat];
        if (!isSuccess) {
            NSLog(@"%s: Set resolution and frame rate failed.", __func__);
        }
        
        [session commitConfiguration];
    }
}
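The helpers `getMaxSupportResolutionByPreset` and `getMaxFrameRateByCurrentResolution` are not shown in the article. As one possible sketch of the frame-rate query, assuming `self.input` holds the current `AVCaptureDeviceInput` and `self.cameraModel.resolutionHeight` the target height:

```objc
// Hypothetical helper: highest frame rate the current camera supports
// at the resolution stored in self.cameraModel.
- (int)getMaxFrameRateByCurrentResolution {
    int maxFrameRate = 0;
    AVCaptureDevice *device = self.input.device;
    for (AVCaptureDeviceFormat *format in device.formats) {
        CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(format.formatDescription);
        if (dims.height != self.cameraModel.resolutionHeight) {
            continue;   // Only consider formats at the target resolution
        }
        for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
            if (range.maxFrameRate > maxFrameRate) {
                maxFrameRate = (int)range.maxFrameRate;
            }
        }
    }
    return maxFrameRate;
}
```

The resolution query can be built the same way, taking the maximum `dims.height` over the device's formats.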
3. Screen and Video Orientation
First distinguish screen orientation from video orientation: one describes the device (UIDeviceOrientation), the other describes the video (AVCaptureVideoOrientation). With AVCaptureSession, supporting rotation means rotating the video frames whenever the screen rotates.
Screen rotation can be observed through the UIDeviceOrientationDidChangeNotification notification; we will not cover that in detail here.
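For completeness, a minimal sketch of subscribing to that notification (the selector name and setup method here are assumptions, not part of the article's code):

```objc
// Somewhere during setup (e.g. viewDidLoad):
- (void)registerForOrientationChanges {
    [[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(deviceOrientationDidChange:)
                                                 name:UIDeviceOrientationDidChangeNotification
                                               object:nil];
}

- (void)deviceOrientationDidChange:(NSNotification *)notification {
    UIDeviceOrientation orientation = [UIDevice currentDevice].orientation;
    // Forward `orientation` to the video-orientation adjustment method.
}
```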
- (void)adjustVideoOrientationByScreenOrientation:(UIDeviceOrientation)orientation previewFrame:(CGRect)previewFrame previewLayer:(AVCaptureVideoPreviewLayer *)previewLayer videoOutput:(AVCaptureVideoDataOutput *)videoOutput {
    [previewLayer setFrame:previewFrame];
    
    // Note: in landscape, device orientation and video orientation are mirrored:
    // UIDeviceOrientationLandscapeLeft corresponds to AVCaptureVideoOrientationLandscapeRight.
    switch (orientation) {
        case UIDeviceOrientationPortrait:
            [[previewLayer connection] setVideoOrientation:AVCaptureVideoOrientationPortrait];
            [self adjustAVOutputDataOrientation:AVCaptureVideoOrientationPortrait
                                    videoOutput:videoOutput];
            break;
        case UIDeviceOrientationPortraitUpsideDown:
            [[previewLayer connection] setVideoOrientation:AVCaptureVideoOrientationPortraitUpsideDown];
            [self adjustAVOutputDataOrientation:AVCaptureVideoOrientationPortraitUpsideDown
                                    videoOutput:videoOutput];
            break;
        case UIDeviceOrientationLandscapeLeft:
            [[previewLayer connection] setVideoOrientation:AVCaptureVideoOrientationLandscapeRight];
            [self adjustAVOutputDataOrientation:AVCaptureVideoOrientationLandscapeRight
                                    videoOutput:videoOutput];
            break;
        case UIDeviceOrientationLandscapeRight:
            [[previewLayer connection] setVideoOrientation:AVCaptureVideoOrientationLandscapeLeft];
            [self adjustAVOutputDataOrientation:AVCaptureVideoOrientationLandscapeLeft
                                    videoOutput:videoOutput];
            break;
        default:
            break;
    }
}
- (void)adjustAVOutputDataOrientation:(AVCaptureVideoOrientation)orientation videoOutput:(AVCaptureVideoDataOutput *)videoOutput {
    for (AVCaptureConnection *connection in videoOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                if ([connection isVideoOrientationSupported]) {
                    [connection setVideoOrientation:orientation];
                }
            }
        }
    }
}
4. Focus
For focus, manually setting the focus point needs special attention: the focus API only accepts coordinates in a space whose top-left corner is (0,0) and bottom-right corner is (1,1), so UIView coordinates must be converted. The conversion depends on several factors:
- Whether the video output is mirrored: the front camera may enable mirroring, which flips the x coordinate.
- Whether the device is in landscape with the Home button on the right or on the left: with Home on the right the origin is the top-left corner; with Home on the left it is the bottom-right corner.
- The video rendering mode: preserving the aspect ratio versus filling the screen. Depending on the device model, the picture may be letterboxed or extend beyond the screen, and the focus point must be recomputed accordingly.
If we render directly with AVCaptureSession's AVCaptureVideoPreviewLayer, the captureDevicePointOfInterestForPoint method computes the conversion automatically, taking all of the above into account. But if we render the frames ourselves, we must compute the focus point manually and handle every case. Both the automatic and the manual approach are shown below.
- (void)autoFocusAtPoint:(CGPoint)point {
    AVCaptureDevice *device = self.input.device;
    if ([device isFocusPointOfInterestSupported] && [device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            [device setExposurePointOfInterest:point];
            [device setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
            [device setFocusPointOfInterest:point];
            [device setFocusMode:AVCaptureFocusModeAutoFocus];
            [device unlockForConfiguration];
        }
    }
}
4.1. Automatic focus-point calculation
- (CGPoint)convertToPointOfInterestFromViewCoordinates:(CGPoint)viewCoordinates captureVideoPreviewLayer:(AVCaptureVideoPreviewLayer *)captureVideoPreviewLayer {
    CGPoint pointOfInterest = CGPointMake(.5f, .5f);
    CGSize frameSize = [captureVideoPreviewLayer frame].size;
    
    if ([captureVideoPreviewLayer.connection isVideoMirrored]) {
        viewCoordinates.x = frameSize.width - viewCoordinates.x;
    }
    
    // Convert UIKit coordinates to a focus point in (0,0)~(1,1)
    pointOfInterest = [captureVideoPreviewLayer captureDevicePointOfInterestForPoint:viewCoordinates];
    
    return pointOfInterest;
}
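A tap handler might combine this conversion with the `autoFocusAtPoint:` method above (the gesture wiring and the `self.previewLayer` property are assumptions for illustration):

```objc
- (void)handleTap:(UITapGestureRecognizer *)gesture {
    CGPoint viewPoint = [gesture locationInView:gesture.view];
    // Map the tap from view coordinates into the device's (0,0)~(1,1) space,
    // then drive focus and exposure at that point.
    CGPoint devicePoint = [self convertToPointOfInterestFromViewCoordinates:viewPoint
                                                   captureVideoPreviewLayer:self.previewLayer];
    [self autoFocusAtPoint:devicePoint];
}
```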
4.2. Manual focus-point calculation
- If the screen's aspect ratio exactly matches the video's, simply map the coordinates into the (0,0)~(1,1) space.
- If the ratios differ, the rendering mode matters: preserving the aspect ratio (aspect-fit) leaves black bars, whose length must be subtracted when computing the focus point; filling the screen (aspect-fill) crops away some pixels, which must be added back when computing the focus point.
- (CGPoint)manualConvertFocusPoint:(CGPoint)point frameSize:(CGSize)frameSize captureVideoPreviewLayer:(AVCaptureVideoPreviewLayer *)captureVideoPreviewLayer position:(AVCaptureDevicePosition)position videoDataOutput:(AVCaptureVideoDataOutput *)videoDataOutput input:(AVCaptureDeviceInput *)input {
    CGPoint pointOfInterest = CGPointMake(.5f, .5f);
    
    if ([[videoDataOutput connectionWithMediaType:AVMediaTypeVideo] isVideoMirrored]) {
        point.x = frameSize.width - point.x;
    }
    
    for (AVCaptureInputPort *port in [input ports]) {
        if ([[port mediaType] isEqualToString:AVMediaTypeVideo]) {
            CGRect cleanAperture = CMVideoFormatDescriptionGetCleanAperture([port formatDescription], YES);
            CGSize resolutionSize = cleanAperture.size;
            
            CGFloat resolutionRatio = resolutionSize.width / resolutionSize.height;
            CGFloat screenSizeRatio = frameSize.width / frameSize.height;
            CGFloat xc = .5f;
            CGFloat yc = .5f;
            
            if (resolutionRatio == screenSizeRatio) {
                // Ratios match: a straight normalization is enough.
                xc = point.x / frameSize.width;
                yc = point.y / frameSize.height;
            } else if (resolutionRatio > screenSizeRatio) {
                if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspectFill]) {
                    // Aspect-fill: part of the width is cropped; add it back.
                    CGFloat needScreenWidth = resolutionRatio * frameSize.height;
                    CGFloat cropWidth = (needScreenWidth - frameSize.width) / 2;
                    xc = (cropWidth + point.x) / needScreenWidth;
                    yc = point.y / frameSize.height;
                } else if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspect]) {
                    // Aspect-fit: black bars above and below; subtract them.
                    CGFloat needScreenHeight = frameSize.width * (1 / resolutionRatio);
                    CGFloat blackBarLength = (frameSize.height - needScreenHeight) / 2;
                    xc = point.x / frameSize.width;
                    yc = (point.y - blackBarLength) / needScreenHeight;
                } else if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResize]) {
                    xc = point.x / frameSize.width;
                    yc = point.y / frameSize.height;
                }
            } else {
                if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspectFill]) {
                    // Aspect-fill: part of the height is cropped; add it back.
                    CGFloat needScreenHeight = (1 / resolutionRatio) * frameSize.width;
                    CGFloat cropHeight = (needScreenHeight - frameSize.height) / 2;
                    xc = point.x / frameSize.width;
                    yc = (cropHeight + point.y) / needScreenHeight;
                } else if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspect]) {
                    // Aspect-fit: black bars on the left and right; subtract them.
                    CGFloat needScreenWidth = frameSize.height * resolutionRatio;
                    CGFloat blackBarLength = (frameSize.width - needScreenWidth) / 2;
                    xc = (point.x - blackBarLength) / needScreenWidth;
                    yc = point.y / frameSize.height;
                } else if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResize]) {
                    xc = point.x / frameSize.width;
                    yc = point.y / frameSize.height;
                }
            }
            
            pointOfInterest = CGPointMake(xc, yc);
        }
    }
    
    if (position == AVCaptureDevicePositionBack) {
        if (captureVideoPreviewLayer.connection.videoOrientation == AVCaptureVideoOrientationLandscapeLeft) {
            pointOfInterest = CGPointMake(1 - pointOfInterest.x, 1 - pointOfInterest.y);
        }
    } else {
        pointOfInterest = CGPointMake(pointOfInterest.x, 1 - pointOfInterest.y);
    }
    
    return pointOfInterest;
}
5. Exposure
If a UISlider drives the adjustment, the simplest approach is to give it the same range as the exposure bias, (-8 ~ 8), so the value can be passed straight through without conversion. For gestures or other controls, adapt as needed; it is straightforward, so we won't elaborate.
- (void)setExposureWithNewValue:(CGFloat)newExposureValue device:(AVCaptureDevice *)device {
    NSError *error;
    if ([device lockForConfiguration:&error]) {
        [device setExposureTargetBias:newExposureValue completionHandler:nil];
        [device unlockForConfiguration];
    }
}
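Rather than hard-coding (-8 ~ 8), the slider bounds can be read from the device itself, since the supported bias range is device-dependent. A sketch (the `exposureSlider` property name is an assumption):

```objc
// Configure the slider from the device's actual exposure bias range.
AVCaptureDevice *device = self.input.device;
self.exposureSlider.minimumValue = device.minExposureTargetBias;
self.exposureSlider.maximumValue = device.maxExposureTargetBias;
self.exposureSlider.value = device.exposureTargetBias;
```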
6. Torch Mode
- AVCaptureTorchModeAuto: automatic
- AVCaptureTorchModeOn: on
- AVCaptureTorchModeOff: off
- (void)setTorchState:(BOOL)isOpen device:(AVCaptureDevice *)device {
    if ([device hasTorch]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            device.torchMode = isOpen ? AVCaptureTorchModeOn : AVCaptureTorchModeOff;
            [device unlockForConfiguration];
        }
    } else {
        NSLog(@"The device does not support torch!");
    }
}
7. Video Stabilization
Note: on some models and at some resolutions, rendering with this property enabled can cause problems (e.g. iPhone XS with custom rendering).
- (void)adjustVideoStabilizationWithOutput:(AVCaptureVideoDataOutput *)output {
    NSArray *devices = nil;
    
    if (@available(iOS 10.0, *)) {
        AVCaptureDeviceDiscoverySession *deviceDiscoverySession = [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera] mediaType:AVMediaTypeVideo position:self.cameraModel.position];
        devices = deviceDiscoverySession.devices;
    } else {
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wdeprecated-declarations"
        devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
#pragma clang diagnostic pop
    }
    
    for (AVCaptureDevice *device in devices) {
        if ([device hasMediaType:AVMediaTypeVideo]) {
            if ([device.activeFormat isVideoStabilizationModeSupported:AVCaptureVideoStabilizationModeAuto]) {
                for (AVCaptureConnection *connection in output.connections) {
                    for (AVCaptureInputPort *port in [connection inputPorts]) {
                        if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                            if (connection.supportsVideoStabilization) {
                                connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeStandard;
                                NSLog(@"activeVideoStabilizationMode = %ld", (long)connection.activeVideoStabilizationMode);
                            } else {
                                NSLog(@"connection doesn't support video stabilization");
                            }
                        }
                    }
                }
            } else {
                NSLog(@"device doesn't support video stabilization");
            }
        }
    }
}
8. White Balance
- temperature: adjusts by color temperature (-150 ~ 250)
- tint: adjusts by tint (-150 ~ 150)
Note: before calling setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains, you must check that the AVCaptureWhiteBalanceGains values are within the valid range (each gain between 1.0 and the device's maxWhiteBalanceGain).
- (AVCaptureWhiteBalanceGains)clampGains:(AVCaptureWhiteBalanceGains)gains toMinVal:(CGFloat)minVal andMaxVal:(CGFloat)maxVal {
    AVCaptureWhiteBalanceGains tmpGains = gains;
    tmpGains.blueGain  = MAX(MIN(tmpGains.blueGain,  maxVal), minVal);
    tmpGains.redGain   = MAX(MIN(tmpGains.redGain,   maxVal), minVal);
    tmpGains.greenGain = MAX(MIN(tmpGains.greenGain, maxVal), minVal);
    return tmpGains;
}
- (void)setWhiteBalanceValueByTemperature:(CGFloat)temperature device:(AVCaptureDevice *)device {
    if ([device isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeLocked]) {
        [device lockForConfiguration:nil];
        AVCaptureWhiteBalanceGains currentGains = device.deviceWhiteBalanceGains;
        CGFloat currentTint = [device temperatureAndTintValuesForDeviceWhiteBalanceGains:currentGains].tint;
        AVCaptureWhiteBalanceTemperatureAndTintValues tempAndTintValues = {
            .temperature = temperature,
            .tint        = currentTint,
        };
        
        AVCaptureWhiteBalanceGains deviceGains = [device deviceWhiteBalanceGainsForTemperatureAndTintValues:tempAndTintValues];
        CGFloat maxWhiteBalanceGain = device.maxWhiteBalanceGain;
        deviceGains = [self clampGains:deviceGains toMinVal:1 andMaxVal:maxWhiteBalanceGain];
        
        [device setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:deviceGains completionHandler:nil];
        [device unlockForConfiguration];
    }
}

- (void)setWhiteBalanceValueByTint:(CGFloat)tint device:(AVCaptureDevice *)device {
    if ([device isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeLocked]) {
        [device lockForConfiguration:nil];
        CGFloat maxWhiteBalanceGain = device.maxWhiteBalanceGain;
        AVCaptureWhiteBalanceGains currentGains = device.deviceWhiteBalanceGains;
        currentGains = [self clampGains:currentGains toMinVal:1 andMaxVal:maxWhiteBalanceGain];
        CGFloat currentTemperature = [device temperatureAndTintValuesForDeviceWhiteBalanceGains:currentGains].temperature;
        AVCaptureWhiteBalanceTemperatureAndTintValues tempAndTintValues = {
            .temperature = currentTemperature,
            .tint        = tint,
        };
        
        AVCaptureWhiteBalanceGains deviceGains = [device deviceWhiteBalanceGainsForTemperatureAndTintValues:tempAndTintValues];
        deviceGains = [self clampGains:deviceGains toMinVal:1 andMaxVal:maxWhiteBalanceGain];
        
        [device setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:deviceGains completionHandler:nil];
        [device unlockForConfiguration];
    }
}
9. Video Gravity
- AVLayerVideoGravityResizeAspect: preserves the aspect ratio; if the screen and video ratios differ, black bars appear.
- AVLayerVideoGravityResizeAspectFill: preserves the aspect ratio while filling the screen, fitting the shorter side; the pixels that overflow the screen are sacrificed.
- AVLayerVideoGravityResize: stretches the video to fill the screen; no pixels are lost, but the image is distorted.
- (void)setVideoGravity:(AVLayerVideoGravity)videoGravity previewLayer:(AVCaptureVideoPreviewLayer *)previewLayer session:(AVCaptureSession *)session {
    [session beginConfiguration];
    [previewLayer setVideoGravity:videoGravity];
    [session commitConfiguration];
}