Getting Started with Video
[iOS] Video Editing Basics: Compositing Background Music
[iOS] Video Editing Basics: Adding a Watermark
[iOS] Video Editing Basics: Picture-in-Picture (Changing the Video Size)
Try searching for the picture-in-picture feature yourself; there really is very little material on it. This is the final effect:

(Screenshot: iPhone 8 Simulator, 2019-03-12)
This, too, is implemented with AVFoundation. Below I walk through the concrete steps. Some parts, such as grabbing the audio and video tracks at the start, are the same as in the previous two articles, so I won't split them out into separate steps.
1. Load the asset and grab the video and audio tracks
NSDictionary *opts = [NSDictionary dictionaryWithObject:@(YES)
                                                 forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
// Load the source video file
videoAsset = [AVURLAsset URLAssetWithURL:videoPath options:opts];
// Trim 0.2 s from each end of the clip (use CMTimeGetSeconds to avoid
// the integer division of value / timescale truncating to whole seconds)
CMTime startTime = CMTimeMakeWithSeconds(0.2, 600);
CMTime endTime = CMTimeMakeWithSeconds(CMTimeGetSeconds(videoAsset.duration) - 0.2,
                                       videoAsset.duration.timescale);
AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
// Video track: a composition contains tracks (audio, video, ...) into which
// segments of source material are inserted
AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                    preferredTrackID:kCMPersistentTrackID_Invalid];
// Insert the source video track into the mutable track; the time range
// passed here is where trimming happens
[videoTrack insertTimeRange:CMTimeRangeFromTimeToTime(startTime, endTime)
                    ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                     atTime:kCMTimeZero error:nil];
// Audio: reuse the same file to pull out the video's own soundtrack
AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:videoPath options:opts];
// Audio track in the composition
AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                    preferredTrackID:kCMPersistentTrackID_Invalid];
// Source audio track
AVAssetTrack *audioAssetTrack = [[audioAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];
[audioTrack insertTimeRange:CMTimeRangeFromTimeToTime(startTime, endTime)
                    ofTrack:audioAssetTrack
                     atTime:kCMTimeZero error:nil];
// 3.1 AVMutableVideoCompositionInstruction: one instruction covering a time
// range of the composition
AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
mainInstruction.timeRange = CMTimeRangeFromTimeToTime(kCMTimeZero, videoTrack.timeRange.duration);
// 3.2 AVMutableVideoCompositionLayerInstruction: a per-track instruction that
// can scale, rotate, change opacity, etc.
AVMutableVideoCompositionLayerInstruction *videolayerInstruction =
    [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
AVAssetTrack *videoAssetTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
[videolayerInstruction setOpacity:0.0 atTime:endTime];
All of the operations above are preparation; the really important part is what follows.
Once you have actually worked through a few video-editing tasks yourself, you will notice that a video whose reported size is 1080 × 1920 can nonetheless play back in landscape. This comes down to the rotation metadata (the track's preferredTransform). For details, see the article
"Several questions about video orientation".

So the first thing to do here is determine the rotation.
2. Determine the rotation and give the video the correct playback size
Wrapper code:
typedef enum {
    LBVideoOrientationUp,            // Device started recording in portrait
    LBVideoOrientationDown,          // Device started recording in portrait, upside down
    LBVideoOrientationLeft,          // Landscape left (home button on the left)
    LBVideoOrientationRight,         // Landscape right (home button on the right)
    LBVideoOrientationNotFound = 99  // An error occurred, or the AVAsset has no video track
} LBVideoOrientation;

#define RadiansToDegrees(radians) ((radians) * 180.0 / M_PI)

- (LBVideoOrientation)videoOrientationWithAsset:(AVAsset *)asset
{
    NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
    if ([videoTracks count] == 0) {
        return LBVideoOrientationNotFound;
    }
    AVAssetTrack *videoTrack = [videoTracks objectAtIndex:0];
    // The rotation is encoded in the track's preferredTransform
    CGAffineTransform txf = [videoTrack preferredTransform];
    CGFloat videoAngleInDegree = RadiansToDegrees(atan2(txf.b, txf.a));
    LBVideoOrientation orientation = LBVideoOrientationNotFound;
    switch ((int)videoAngleInDegree) {
        case 0:
            orientation = LBVideoOrientationRight;
            break;
        case 90:
            orientation = LBVideoOrientationUp;
            break;
        case 180:
            orientation = LBVideoOrientationLeft;
            break;
        case -90:
            orientation = LBVideoOrientationDown;
            break;
        default:
            orientation = LBVideoOrientationNotFound;
            break;
    }
    return orientation;
}
Handle LBVideoOrientationUp/LBVideoOrientationDown and LBVideoOrientationRight/LBVideoOrientationLeft differently: when the video is portrait, swap the width and height.
// isVideoAssetPortrait_ is YES for the Up/Down orientations
if (isVideoAssetPortrait_) {
    // Portrait: swap width and height; the extra 500 pt leaves room for the image strip
    naturalSize = CGSizeMake(videoAssetTrack.naturalSize.height, videoAssetTrack.naturalSize.width + 500);
} else {
    naturalSize = CGSizeMake(videoAssetTrack.naturalSize.width, videoAssetTrack.naturalSize.height + 500);
}
// CGFloat scaleValue = screenSize.height / naturalSize.height;
CGFloat scaleValue = 1;
Now handle the rotation; this is the key step:
LBVideoOrientation videoOrientation = [self videoOrientationWithAsset:videoAsset];
CGAffineTransform t1 = CGAffineTransformIdentity;
CGAffineTransform t2 = CGAffineTransformIdentity;
CGAffineTransform t3 = CGAffineTransformIdentity;
NSLog(@" --- video orientation -- %ld", (long)videoOrientation);
switch (videoOrientation) {
    case LBVideoOrientationUp:
        // Portrait: shift right by the natural height, then rotate 90 degrees
        t1 = CGAffineTransformMakeTranslation(videoTrack.naturalSize.height, 0);
        t2 = CGAffineTransformRotate(t1, M_PI_2);
        t3 = CGAffineTransformScale(t2, scaleValue, scaleValue);
        break;
    case LBVideoOrientationDown:
        // Upside-down portrait: here naturalSize.width is the displayed height
        t1 = CGAffineTransformMakeTranslation(0, videoTrack.naturalSize.width);
        t2 = CGAffineTransformRotate(t1, -M_PI_2);
        t3 = CGAffineTransformScale(t2, scaleValue, scaleValue);
        break;
    case LBVideoOrientationRight:
        // Already upright; nothing to rotate
        t1 = CGAffineTransformMakeTranslation(0, 0);
        t2 = CGAffineTransformRotate(t1, 0);
        t3 = CGAffineTransformScale(t2, scaleValue, scaleValue);
        break;
    case LBVideoOrientationLeft:
        // Rotate 180 degrees, translating by the full natural size
        t1 = CGAffineTransformMakeTranslation(videoTrack.naturalSize.width,
                                              videoTrack.naturalSize.height);
        t2 = CGAffineTransformRotate(t1, M_PI);
        t3 = CGAffineTransformScale(t2, scaleValue, scaleValue);
        break;
    default:
        NSLog(@"[No supported orientation found for this video]");
        break;
}
[videolayerInstruction setTransform:t3 atTime:kCMTimeZero];
With the operations above, the video's orientation and size are basically sorted. What remains is positioning the image, which is really just the watermark technique again, with one thing to watch out for: the layout. iOS (UIKit) layout normally starts from the top-left, but here the watermark layers are laid out from the bottom-left.
3. Laying out the watermark image
mainInstruction.layerInstructions = [NSArray arrayWithObjects:videolayerInstruction, nil];
// AVMutableVideoComposition manages all video tracks and decides the final
// video's size; cropping happens here
AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
float renderWidth = naturalSize.width;
float renderHeight = naturalSize.height;
mainCompositionInst.renderSize = CGSizeMake(renderWidth, renderHeight);
mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
mainCompositionInst.frameDuration = CMTimeMake(1, 25); // 25 fps
[self applyVideoEffectsToComposition:mainCompositionInst WithWaterImg:img size:CGSizeMake(renderWidth, renderHeight)];
- (void)applyVideoEffectsToComposition:(AVMutableVideoComposition *)composition
                          WithWaterImg:(UIImage *)img
                                  size:(CGSize)size {
    // Watermark / image layer; frames here are in bottom-left coordinates,
    // so (0, 0) is the bottom of the render
    CALayer *imgLayer = [CALayer layer];
    imgLayer.contents = (id)img.CGImage;
    imgLayer.frame = CGRectMake(0, 0, size.width, 500);
    // The usual overlay
    CALayer *overlayLayer = [CALayer layer];
    [overlayLayer addSublayer:imgLayer];
    overlayLayer.frame = CGRectMake(0, 0, size.width, size.height);
    [overlayLayer setMasksToBounds:YES];
    CALayer *parentLayer = [CALayer layer];
    CALayer *videoLayer = [CALayer layer];
    parentLayer.frame = CGRectMake(0, 0, size.width, size.height);
    videoLayer.frame = CGRectMake(0, 0, size.width, size.height);
    [parentLayer addSublayer:videoLayer];
    [parentLayer addSublayer:overlayLayer];
    composition.animationTool = [AVVideoCompositionCoreAnimationTool
        videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer
                                                                inLayer:parentLayer];
}
4. Export the video
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:
                           [NSString stringWithFormat:@"%@.mp4", fileName]];
// Remove any previous file at the output path
unlink([myPathDocs UTF8String]);
NSURL *videoUrl = [NSURL fileURLWithPath:myPathDocs];
// Drive a progress callback while exporting
dlink = [CADisplayLink displayLinkWithTarget:self selector:@selector(updateProgress)];
[dlink setFrameInterval:15];
[dlink addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
[dlink setPaused:NO];
// Export the composed video file
exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                            presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL = videoUrl;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.shouldOptimizeForNetworkUse = YES;
exporter.videoComposition = mainCompositionInst;
[exporter exportAsynchronouslyWithCompletionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        // The export has finished; do whatever you need with the file
        [self exportDidFinish:exporter];
    });
}];
- (void)exportDidFinish:(AVAssetExportSession *)session {
    if (session.status == AVAssetExportSessionStatusCompleted) {
        NSURL *outputURL = session.outputURL;
        dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.2 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
            __block PHObjectPlaceholder *placeholder;
            if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(outputURL.path)) {
                NSError *error;
                [[PHPhotoLibrary sharedPhotoLibrary] performChangesAndWait:^{
                    PHAssetChangeRequest *createAssetRequest =
                        [PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL:outputURL];
                    placeholder = [createAssetRequest placeholderForCreatedAsset];
                } error:&error];
                if (error) {
                    [SVProgressHUD showErrorWithStatus:[NSString stringWithFormat:@"%@", error]];
                } else {
                    [SVProgressHUD showSuccessWithStatus:@"The video has been saved to the photo library"];
                }
            } else {
                [SVProgressHUD showErrorWithStatus:NSLocalizedString(@"Failed to save the video to the photo library; please grant the app photo-library access", nil)];
            }
        });
    }
}
Those are the basic steps and pitfalls for placing a video and an image side by side and compositing them into one video. In the next article: two videos placed side by side, which is arguably the real picture-in-picture.