AV Foundation Series (5): Media Composition

A common need in practice: you have two video clips, you want to combine them into a single video, and you also want to add background music to the combined result. This is where media composition comes in. AVMutableComposition is the core class of this technique; it inherits from AVComposition, which in turn inherits from the asset class AVAsset.
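Because AVMutableComposition is ultimately an AVAsset, a composition can also be handed straight to the playback classes without exporting first. A minimal sketch (not from the original article, just an illustration of the class relationship):

```objectivec
// A composition is itself an AVAsset, so it can be previewed directly:
AVMutableComposition *composition = [AVMutableComposition composition];
// ... insert video/audio segments into the composition's tracks ...
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:composition];
AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
```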

Suppose we have four clips: video 1, video 2, video 3, and video 4. The goal is to combine videos 1, 2, and 3 into a single video, keeping video 3's video and audio data in sync with each other, and to extract video 4's audio track as background music for the front portion of the new video.

@property(nonatomic,strong)AVMutableComposition *mutableComposition;
@property(nonatomic,strong)AVAsset *nebual1Asset;
@property(nonatomic,strong)AVAsset *nebual3Asset;
@property(nonatomic,strong)AVAsset *backhole2Asset;
@property(nonatomic,strong)AVAsset *vidAsset;

@property(nonatomic,strong)AVMutableCompositionTrack *mutableVideoTrack;
@property(nonatomic,strong)AVMutableCompositionTrack *mutableAudioTrack;

@property(nonatomic,strong)AVAsset *compositionAsset;
@property(nonatomic,strong)AVPlayerItem *playerItem;
@property(nonatomic,strong)AVPlayer *player;
@property(nonatomic,strong)AVPlayerLayer *playerLayer;
@property(nonatomic,strong)NSString *storePath;

Initializing the relevant data

-(void)setAssetInfo
{
    NSURL *nebula1Url = [[NSBundle mainBundle] URLForResource:@"01_nebula" withExtension:@"mp4"];
    NSURL *nebula3Url = [[NSBundle mainBundle] URLForResource:@"03_nebula" withExtension:@"mp4"];
    NSURL *backholeUrl = [[NSBundle mainBundle] URLForResource:@"02_blackhole" withExtension:@"mp4"];
    NSURL *vidUrl = [[NSBundle mainBundle] URLForResource:@"video" withExtension:@"mp4"];
    
    self.storePath = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES)[0];
    self.storePath = [self.storePath stringByAppendingPathComponent:@"cm.mp4"];
    
    self.nebual1Asset = [AVURLAsset URLAssetWithURL:nebula1Url options:nil];
    self.nebual3Asset = [AVURLAsset URLAssetWithURL:nebula3Url options:nil];
    self.backhole2Asset = [AVURLAsset URLAssetWithURL:backholeUrl options:nil];
    self.vidAsset = [AVURLAsset URLAssetWithURL:vidUrl options:nil];
    
    self.mutableComposition = [AVMutableComposition composition];
    
    // Add a video track and an audio track to the composition
    self.mutableVideoTrack = [self.mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    self.mutableAudioTrack = [self.mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

}

Combining the video and audio segments

- (IBAction)btnClicked:(id)sender
{
    CMTime startTime = kCMTimeZero;
    CMTime duration = self.nebual1Asset.duration;
    
    AVAssetTrack *video1Track = [[self.nebual1Asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    // Insert this segment into the composition's video track
    [self.mutableVideoTrack insertTimeRange:CMTimeRangeMake(startTime, duration) ofTrack:video1Track atTime:kCMTimeZero error:nil];
    
    AVAssetTrack *video2Track = [[self.nebual3Asset tracksWithMediaType:AVMediaTypeVideo]firstObject];
    // Insert this segment into the composition's video track
    [self.mutableVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, self.nebual3Asset.duration) ofTrack:video2Track atTime:CMTimeAdd(startTime, duration) error:nil];
    
    AVAssetTrack *vidVideoTrack = [[self.vidAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    // Insert this segment into the composition's video track
    [self.mutableVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, self.vidAsset.duration) ofTrack:vidVideoTrack atTime:CMTimeAdd(self.nebual3Asset.duration, self.nebual1Asset.duration) error:nil];
    
    AVAssetTrack *audioTrack = [[self.backhole2Asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
    // Insert this segment into the composition's audio track
    [self.mutableAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, CMTimeAdd(self.nebual3Asset.duration, self.nebual1Asset.duration)) ofTrack:audioTrack atTime:kCMTimeZero error:nil];
    
    AVAssetTrack *vidAudioTrack = [[self.vidAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];
    // Insert this segment into the composition's audio track
    [self.mutableAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, self.vidAsset.duration) ofTrack:vidAudioTrack atTime:CMTimeAdd(self.nebual1Asset.duration, self.nebual3Asset.duration) error:nil];

    // Export the composition
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:self.mutableComposition presetName:AVAssetExportPresetMediumQuality];
    // AVFileTypeMPEG4 matches the .mp4 output path; a leftover file at the
    // output URL makes the export fail, so remove any stale result first
    exportSession.outputFileType = AVFileTypeMPEG4;
    exportSession.outputURL = [NSURL fileURLWithPath:self.storePath];
    [[NSFileManager defaultManager] removeItemAtPath:self.storePath error:nil];
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        // Export finished
        if (exportSession.status == AVAssetExportSessionStatusCompleted) {
            // Rotate to landscape
            [self switchScreen];
            // Simple playback
            [self playerCompostionVideo];
        }
    }];
}
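Each insertTimeRange:ofTrack:atTime:error: call above passes nil for the error parameter. The method actually returns a BOOL and reports failures (for example, an incompatible time range) through its out-error, so a more defensive version of one of the insertions (a sketch using the same property names as above) might look like:

```objectivec
NSError *insertError = nil;
AVAssetTrack *video1Track = [[self.nebual1Asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
BOOL ok = [self.mutableVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, self.nebual1Asset.duration)
                                          ofTrack:video1Track
                                           atTime:kCMTimeZero
                                            error:&insertError];
if (!ok) {
    NSLog(@"Failed to insert segment: %@", insertError.localizedDescription);
}
```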

Forcing landscape orientation relies on the device's setOrientation: method. First check whether the current device responds to that selector, and if so, call it; here we call it through NSInvocation. An NSInvocation encapsulates all the information a method call needs: which object initiates the call (the target), which method is called (the selector), and the arguments, set via setArgument:atIndex:; if there is a return value, it can be read back with getReturnValue:. Note that when setting arguments, the index starts at 2. This is not surprising at all: the C function an IMP points to takes the hidden target and selector as its first two parameters, so explicit arguments begin at index 2.

-(void)switchScreen
{
    dispatch_async(dispatch_get_main_queue(), ^{
        if ([[UIDevice currentDevice] respondsToSelector:@selector(setOrientation:)]) {
            SEL selector = NSSelectorFromString(@"setOrientation:");
            NSInvocation *invocation = [NSInvocation invocationWithMethodSignature:[UIDevice instanceMethodSignatureForSelector:selector]];
            [invocation setSelector:selector];
            [invocation setTarget:[UIDevice currentDevice]];
            int val = UIInterfaceOrientationLandscapeRight;
            // Explicit arguments start at index 2 (0 and 1 are the hidden target and selector)
            [invocation setArgument:&val atIndex:2];
            [invocation invoke];
        }
    });
}
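To see the argument-index convention in isolation, here is a small Foundation-only sketch (the string and selector are chosen purely for illustration):

```objectivec
// Invoke [@"hello" stringByAppendingString:@" world"] via NSInvocation.
NSString *base = @"hello";
SEL sel = @selector(stringByAppendingString:);
NSInvocation *inv = [NSInvocation invocationWithMethodSignature:
                        [NSString instanceMethodSignatureForSelector:sel]];
[inv setTarget:base];                 // hidden argument 0
[inv setSelector:sel];                // hidden argument 1
NSString *suffix = @" world";
[inv setArgument:&suffix atIndex:2];  // first explicit argument
[inv invoke];
__unsafe_unretained NSString *result = nil;
[inv getReturnValue:&result];         // result is @"hello world"
```

Under ARC the return value is read into an `__unsafe_unretained` variable, since getReturnValue: does not transfer ownership.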

Simple playback in playerCompostionVideo

-(void)playerCompostionVideo
{
    NSFileManager *fileManager = [NSFileManager defaultManager];
    if ([fileManager fileExistsAtPath:self.storePath]) {
        
        NSURL *videoUrl = [NSURL fileURLWithPath:self.storePath];
        self.compositionAsset = [AVAsset assetWithURL:videoUrl];
        self.playerItem = [[AVPlayerItem alloc] initWithAsset:self.compositionAsset];
        // Track the playerItem's status via KVO
        [self.playerItem addObserver:self forKeyPath:@"status" options:0 context:nil];
        self.player = [AVPlayer playerWithPlayerItem:self.playerItem];
        self.playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
        
        dispatch_async(dispatch_get_main_queue(), ^{
            self.playerLayer.frame = self.view.bounds;
            [self.view.layer addSublayer:self.playerLayer];
        });
    }
}

We use KVO to observe the AVPlayerItem's status and start playback once it reaches AVPlayerItemStatusReadyToPlay:

-(void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary<NSString *,id> *)change context:(void *)context
{
    AVPlayerItem *playerItem = (AVPlayerItem *)object;
    if ([keyPath isEqualToString:@"status"] && playerItem.status == AVPlayerItemStatusReadyToPlay) {
        // The item is ready; stop observing and start playback
        [playerItem removeObserver:self forKeyPath:@"status"];
        [self.player play];
    }
}

With that, the media-composition feature is complete. Media composition is quite practical; many media-editing apps rely on it.

最后編輯于
?著作權歸作者所有,轉載或內容合作請聯(lián)系作者
【社區(qū)內容提示】社區(qū)部分內容疑似由AI輔助生成,瀏覽時請結合常識與多方信息審慎甄別。
平臺聲明:文章內容(如有圖片或視頻亦包括在內)由作者上傳并發(fā)布,文章內容僅代表作者本人觀點,簡書系信息發(fā)布平臺,僅提供信息存儲服務。

相關閱讀更多精彩內容

友情鏈接更多精彩內容