The project has a requirement for custom-region video cropping: the user selects a region to keep, and everything outside it is discarded. The effect looks like this:
(Screenshot: IMG_5337.PNG)
As shown, a draggable frame sits on top of the video; with gestures attached, it can be resized freely or constrained to an aspect ratio, and the selected region determines the crop. The UI layer is not the topic here. The cropping itself relies on two classes many developers may not have met: AVMutableVideoCompositionInstruction and AVMutableVideoCompositionLayerInstruction.
An AVMutableVideoCompositionInstruction describes how the composition's video tracks are composited over one time range. An AVMutableVideoCompositionLayerInstruction belongs to a single track within that instruction and carries the per-track transform (scale, rotation, translation) and opacity. Straight to the code:
BOOL Assetvertical = NO; // force the asset to be treated as portrait (rotated branch below)
// Create an AVAsset instance; it exposes all of the video's metadata and tracks
NSDictionary *option = [NSDictionary dictionaryWithObject:@(YES) forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
AVAsset *videoAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:videoPath] options:option];
// Create an AVMutableComposition instance.
AVMutableComposition *mixComposition = [AVMutableComposition composition];
// Start time
CMTime startTime = CMTimeMakeWithSeconds(0, videoAsset.duration.timescale);
// End time: use the asset's duration directly (dividing value by timescale in
// integer arithmetic would truncate fractional seconds)
CMTime endTime = videoAsset.duration;
NSLog(@"video duration: %lld / %d", videoAsset.duration.value, videoAsset.duration.timescale);
// 3. Video track: a composition holds tracks (video, audio, ...) into which source segments are inserted
AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
preferredTrackID:kCMPersistentTrackID_Invalid];
NSError *error;
// Insert the source video track's time range into the mutable track.
// CMTimeRangeMake takes (start, duration); since startTime is 0, endTime
// doubles as the duration. Adjusting this range trims the video in time.
[videoTrack insertTimeRange:CMTimeRangeMake(startTime, endTime)
                    ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] lastObject]
                     atTime:kCMTimeZero error:&error];
// If the asset has audio
if ([[videoAsset tracksWithMediaType:AVMediaTypeAudio] count] > 0) {
    // Audio source asset
    AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:videoPath] options:option];
    // Audio track in the composition
    AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    // Source audio track
    AVAssetTrack *audioAssetTrack = [[audioAsset tracksWithMediaType:AVMediaTypeAudio] lastObject];
    [audioTrack insertTimeRange:CMTimeRangeMake(startTime, endTime) ofTrack:audioAssetTrack atTime:kCMTimeZero error:nil];
}
}
// 3.1 AVMutableVideoCompositionInstruction: describes compositing for one time range
AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
mainInstruction.timeRange = CMTimeRangeMake(startTime, endTime);
// 3.2 AVMutableVideoCompositionLayerInstruction: the per-track transform/opacity within that instruction
AVMutableVideoCompositionLayerInstruction *videolayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
AVAssetTrack *videoAssetTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] lastObject];
UIImageOrientation videoAssetOrientation_ = UIImageOrientationUp;
// Was the video shot in portrait? Inspect the track's preferredTransform:
// its 2x2 rotation part (a, b, c, d) encodes the capture orientation.
BOOL isVideoAssetvertical = NO;
CGAffineTransform videoTransform = videoAssetTrack.preferredTransform;
if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
    isVideoAssetvertical = YES;
    videoAssetOrientation_ = UIImageOrientationUp; // shot upright (portrait)
} else if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
    isVideoAssetvertical = YES;
    videoAssetOrientation_ = UIImageOrientationDown; // shot upside down
} else if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
    isVideoAssetvertical = NO;
    videoAssetOrientation_ = UIImageOrientationLeft; // landscape, rotated left
} else if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
    isVideoAssetvertical = NO;
    videoAssetOrientation_ = UIImageOrientationRight; // landscape, rotated right
}
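The four matrix checks above reduce to classifying the 2×2 rotation part of the track's preferredTransform. A minimal sketch in plain C (the type and function names are hypothetical, not part of any Apple API):

```c
#include <stdbool.h>

/* Hypothetical helper mirroring the branches above: classify a video
 * track's preferred transform by its 2x2 rotation components. */
typedef enum {
    SHOT_UPRIGHT,         /* (0, 1, -1, 0): portrait */
    SHOT_UPSIDE_DOWN,     /* (0, -1, 1, 0): portrait, upside down */
    SHOT_LANDSCAPE_LEFT,  /* (1, 0, 0, 1): identity, landscape */
    SHOT_LANDSCAPE_RIGHT, /* (-1, 0, 0, -1): landscape, rotated 180 */
    SHOT_UNKNOWN
} ShotOrientation;

ShotOrientation classify_orientation(double a, double b, double c, double d,
                                     bool *is_portrait) {
    if (a == 0 && b == 1.0 && c == -1.0 && d == 0) { *is_portrait = true;  return SHOT_UPRIGHT; }
    if (a == 0 && b == -1.0 && c == 1.0 && d == 0) { *is_portrait = true;  return SHOT_UPSIDE_DOWN; }
    if (a == 1.0 && b == 0 && c == 0 && d == 1.0)  { *is_portrait = false; return SHOT_LANDSCAPE_LEFT; }
    if (a == -1.0 && b == 0 && c == 0 && d == -1.0){ *is_portrait = false; return SHOT_LANDSCAPE_RIGHT; }
    *is_portrait = false;
    return SHOT_UNKNOWN;
}
```

Note the comparisons are exact: camera-produced preferredTransforms contain exactly these values, so no epsilon is needed here.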
float scaleX = 1.0, scaleY = 1.0, scale = 1.0;
CGSize trackSize = videoAssetTrack.naturalSize;
CGSize originVideoSize;
if (isVideoAssetvertical || Assetvertical) {
    // Portrait: naturalSize is reported pre-rotation, so swap width and height
    originVideoSize = CGSizeMake(trackSize.height, trackSize.width);
} else {
    originVideoSize = CGSizeMake(trackSize.width, trackSize.height);
}
// Crop origin (top-left of the user's drag frame), in video coordinates
float x = videoPoint.x;
float y = videoPoint.y;
if (shouldScale) {
    scaleX = videoSize.width / originVideoSize.width;
    scaleY = videoSize.height / originVideoSize.height;
    // Aspect-fill: take the larger ratio so the video covers the crop rect
    scale = MAX(scaleX, scaleY);
    if (scaleX > scaleY) {
        NSLog(@"portrait crop");
    } else {
        NSLog(@"landscape crop");
    }
} else {
    scaleX = scaleY = scale = 1.0;
}
if (Assetvertical) {
    // Scale the source transform, shift by the crop origin, then rotate 90 degrees.
    // NOTE: the 720 offset is hard-coded for a specific render width; adjust it
    // to match your renderSize.
    CGAffineTransform trans = CGAffineTransformMake(videoAssetTrack.preferredTransform.a*scale, videoAssetTrack.preferredTransform.b*scale, videoAssetTrack.preferredTransform.c*scale, videoAssetTrack.preferredTransform.d*scale, videoAssetTrack.preferredTransform.tx*scale-x+720, videoAssetTrack.preferredTransform.ty*scale-y);
    CGAffineTransform trans2 = CGAffineTransformRotate(trans, M_PI_2);
    [videolayerInstruction setTransform:trans2 atTime:kCMTimeZero];
} else {
    // Scale the source transform and translate so the crop origin maps to (0, 0)
    CGAffineTransform trans = CGAffineTransformMake(videoAssetTrack.preferredTransform.a*scale, videoAssetTrack.preferredTransform.b*scale, videoAssetTrack.preferredTransform.c*scale, videoAssetTrack.preferredTransform.d*scale, videoAssetTrack.preferredTransform.tx*scale-x, videoAssetTrack.preferredTransform.ty*scale-y);
    [videolayerInstruction setTransform:trans atTime:kCMTimeZero];
}
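The translation components are what implement the crop: every element of the source transform is scaled, then tx/ty are shifted by the crop origin so the selected region lands at the render target's (0, 0). A plain-C sketch of the non-rotated branch (struct and function names are hypothetical, standing in for CGAffineTransform):

```c
/* Stand-in for CGAffineTransform: x' = a*x + c*y + tx, y' = b*x + d*y + ty */
typedef struct { double a, b, c, d, tx, ty; } Affine;

/* Scale an existing transform uniformly, then translate so the crop
 * origin (x, y) maps to the output's (0, 0) -- the non-rotated branch. */
Affine crop_transform(Affine t, double scale, double x, double y) {
    Affine r = { t.a * scale, t.b * scale, t.c * scale, t.d * scale,
                 t.tx * scale - x, t.ty * scale - y };
    return r;
}
```

Anything the transform pushes to negative coordinates, or beyond renderSize, is discarded by the compositor, which is exactly the "throw away everything outside the region" behavior.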
// Attach the layer instruction that carries the crop transform
mainInstruction.layerInstructions = [NSArray arrayWithObjects:videolayerInstruction, nil];
// AVMutableVideoComposition manages all video tracks and fixes the output
// render size; the actual cropping happens via renderSize below
AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
CGSize naturalSize = originVideoSize;
int64_t renderWidth = 0, renderHeight = 0;
if (videoSize.height == 0.0 || videoSize.width == 0.0) {
    // No crop size chosen: keep the source dimensions
    renderWidth = naturalSize.width;
    renderHeight = naturalSize.height;
} else {
    // Round up so the render size is a whole number of pixels
    renderWidth = ceil(videoSize.width);
    renderHeight = ceil(videoSize.height);
}
mainCompositionInst.renderSize = CGSizeMake(renderWidth, renderHeight);
mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
mainCompositionInst.frameDuration = CMTimeMake(1, 30); // 30 fps
With that, we have the AVMutableVideoComposition we need; all that remains is exporting the video with AVAssetExportSession, as follows:
// Export the video file
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
presetName:presetName];
exporter.outputURL = compressionFileURL;
exporter.outputFileType = outputFileType;
exporter.shouldOptimizeForNetworkUse = YES;
exporter.videoComposition = mainCompositionInst;
[exporter exportAsynchronouslyWithCompletionHandler:^{
dispatch_async(dispatch_get_main_queue(), ^{
switch (exporter.status) {
    case AVAssetExportSessionStatusFailed: {
        if (completeBlock) {
            completeBlock(exporter.error, compressionFileURL);
        }
        break;
    }
    case AVAssetExportSessionStatusCancelled: {
        NSLog(@"Export Status: Cancelled");
        break;
    }
    case AVAssetExportSessionStatusCompleted: {
        if (completeBlock) {
            completeBlock(nil, compressionFileURL);
        }
        break;
    }
    case AVAssetExportSessionStatusUnknown: {
        NSLog(@"Export Status: Unknown");
        break;
    }
    case AVAssetExportSessionStatusExporting: {
        NSLog(@"Export Status: Exporting");
        break;
    }
    case AVAssetExportSessionStatusWaiting: {
        NSLog(@"Export Status: Waiting");
        break;
    }
}
});
}];
Note: you can also set exporter.timeRange, which trims the video in time while cropping it in space. If that part is unclear, see the previous article in this series, "Short Video from Scratch (10): Custom Video Time Trimming".
Questions or problems? Feel free to leave a comment.
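exporter.timeRange follows the same (start, duration) convention as the track insertion earlier, which is an easy thing to get wrong. A sketch of the arithmetic with CMTime modeled as a value/timescale pair (plain C; these are hypothetical stand-ins, not the real CoreMedia API):

```c
#include <stdint.h>

/* Minimal model of CMTime: value / timescale seconds. */
typedef struct { int64_t value; int32_t timescale; } Time;

/* Model of CMTimeRange: a range is (start, duration), NOT (start, end). */
typedef struct { Time start; Time duration; } TimeRange;

/* Build a trim range from start/end expressed in seconds. */
TimeRange make_range_seconds(double start_s, double end_s, int32_t timescale) {
    TimeRange r;
    r.start    = (Time){ (int64_t)(start_s * timescale), timescale };
    r.duration = (Time){ (int64_t)((end_s - start_s) * timescale), timescale };
    return r;
}
```

This is why the main listing can pass endTime as the second argument of CMTimeRangeMake: the start there is 0, so end and duration happen to coincide. For a non-zero start, the duration must be end minus start.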