While learning Metal recently, I got the idea of using it to render video. Two approaches came to mind.
1. Reading data with AVAssetReader
GPUImage3's Inputs/MovieInput.swift uses AVAssetReader to read CMSampleBuffers and then renders them to a RenderView.
The rough flow is:

I've written up the rough code here: click to view
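The AVAssetReader flow described above can be sketched roughly as follows. This is a minimal illustration, not the MovieInput.swift implementation; the `handler` closure is a placeholder standing in for whatever feeds the pixel buffers to the Metal renderer.

```swift
import AVFoundation

// Minimal sketch of the AVAssetReader approach, assuming a local file URL.
// `handler` is a placeholder for the hand-off to the Metal renderer.
func readFrames(from url: URL, handler: (CVPixelBuffer) -> Void) throws {
    let asset = AVAsset(url: url)
    guard let track = asset.tracks(withMediaType: .video).first else { return }

    let reader = try AVAssetReader(asset: asset)
    // Ask for a pixel format the Metal pipeline can consume directly.
    let settings: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String:
            kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
    ]
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: settings)
    reader.add(output)
    reader.startReading()

    // Pull CMSampleBuffers until the track is exhausted.
    while let sampleBuffer = output.copyNextSampleBuffer() {
        if let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
            handler(pixelBuffer)
        }
    }
}
```

Note that this pulls frames as fast as the reader can decode them; the real MovieInput paces delivery against the frames' presentation timestamps.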
2. Reading with AVPlayerItemVideoOutput
With the approach above, no sound plays during rendering. I wondered how to get audio playing alongside the rendered frames, and after digging through the documentation I found AVPlayerItemVideoOutput.
AVPlayerItemVideoOutput has a method, `copyPixelBuffer(forItemTime:itemTimeForDisplay:)`, that retrieves an image suitable for display at the specified item time and marks that image as acquired.

于是就有了想法,使用播放器播放聲音,畫(huà)面通過(guò)Metal渲染。

Enough talk, here's the code:
```swift
import AVFoundation
import UIKit

/// Delegate protocol (not shown in the original post); assumed to deliver
/// decoded frames and their overall timeline position to the Metal renderer.
protocol XTVideoMovieDelegate: AnyObject {
    func prepare(at pixelBuffer: CVPixelBuffer)
    func prepare(at time: CMTime)
}

class XTVideoMovie: NSObject, AVPlayerItemOutputPullDelegate {
    weak var delegate: XTVideoMovieDelegate?
    var aqPlayer: AVQueuePlayer?
    var displayLink: CADisplayLink?
    var playerItems: [AVPlayerItem] = []
    var outputs: [AVPlayerItemVideoOutput] = []
    var playIndex = 0

    deinit {
        NotificationCenter.default.removeObserver(self)
        displayLink?.invalidate()
    }

    init(items: [AVPlayerItem]) {
        super.init()
        initDisplayLink()
        setupItems(items: items)
    }

    func initDisplayLink() {
        displayLink = CADisplayLink(target: self, selector: #selector(displayLinkCallBack(_:)))
        displayLink?.add(to: .current, forMode: .common)
        displayLink?.isPaused = true
    }

    @objc func displayLinkCallBack(_ displayLink: CADisplayLink) {
        processPixelBuffer(at: aqPlayer?.currentItem?.currentTime())
    }

    func setupItems(items: [AVPlayerItem]) {
        playerItems = items
        for item in items {
            NotificationCenter.default.addObserver(self,
                                                   selector: #selector(playEnd(notification:)),
                                                   name: .AVPlayerItemDidPlayToEndTime,
                                                   object: item)
            let outputSetting: [String: Any] = [
                kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
            ]
            let output = AVPlayerItemVideoOutput(outputSettings: outputSetting)
            output.setDelegate(self, queue: DispatchQueue.main)
            // Without this call, outputMediaDataWillChange(_:) is never delivered.
            output.requestNotificationOfMediaDataChange(withAdvanceInterval: 0.1)
            item.add(output)
            outputs.append(output)
        }
        aqPlayer = AVQueuePlayer(items: items)
    }

    func play() {
        if playerItems.count > 0 {
            aqPlayer?.play()
            displayLink?.isPaused = false
        }
    }

    func pause() {
        if aqPlayer?.rate != 0 {
            aqPlayer?.pause()
            displayLink?.isPaused = true
        }
    }

    func reset() {
        playIndex = 0
        pause()
        aqPlayer?.seek(to: .zero)
        play()
    }

    func processPixelBuffer(at time: CMTime?) {
        guard let outputTime = time, playIndex < outputs.count else {
            return
        }
        guard outputs[playIndex].hasNewPixelBuffer(forItemTime: outputTime) else {
            return
        }
        /// Overall timeline position: the current item's time plus the
        /// durations of every item already played.
        var currentTime = outputTime
        for i in 0..<playIndex {
            currentTime = CMTimeAdd(currentTime, playerItems[i].asset.duration)
        }
        /// Pull the new pixel buffer for this item time.
        guard let pixelBuffer = outputs[playIndex].copyPixelBuffer(forItemTime: outputTime, itemTimeForDisplay: nil) else {
            return
        }
        /// Hand the frame and its timeline position to the renderer.
        delegate?.prepare(at: pixelBuffer)
        delegate?.prepare(at: currentTime)
    }

    @objc func playEnd(notification: Notification) {
        playIndex += 1
        if playIndex >= playerItems.count {
            print("playback finished")
            displayLink?.isPaused = true
        }
    }

    // MARK: - AVPlayerItemOutputPullDelegate

    func outputMediaDataWillChange(_ sender: AVPlayerItemOutput) {
        guard displayLink?.isPaused ?? false else {
            return
        }
        displayLink?.isPaused = false
    }

    func outputSequenceWasFlushed(_ output: AVPlayerItemOutput) {
    }
}
```
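The loop in `processPixelBuffer` maps the current item's time onto the whole queue's timeline by accumulating the durations of the items already played. That arithmetic can be checked in isolation (the durations below are made-up values, not from the post):

```swift
import CoreMedia

// Durations of items already played (hypothetical values).
let playedDurations = [CMTime(seconds: 4, preferredTimescale: 600),
                       CMTime(seconds: 6, preferredTimescale: 600)]
// Current time inside the item now playing.
var overall = CMTime(seconds: 1.5, preferredTimescale: 600)
// The same accumulation the class performs with CMTimeAdd.
for duration in playedDurations {
    overall = CMTimeAdd(overall, duration)
}
print(CMTimeGetSeconds(overall))  // 11.5
```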
With that, we have video playback with audio. We can also make this class conform to GPUImage3's ImageSource protocol and use it as a data source, which lets us operate on individual video frames. By checking the timestamp, we can even apply different filters to specific ranges of frames.
The full code is in Metal-10 (video playback).
Love life, record life!
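The idea of applying different filters over different time ranges boils down to a small scheduling lookup. A minimal sketch, with hypothetical type and filter names (in a real pipeline the `String` would be a GPUImage3 operation instead):

```swift
// Hypothetical helper: map a playback time (in seconds) to a filter name.
struct FilterSchedule {
    /// Ordered list of (time range in seconds, filter name); first match wins.
    let entries: [(range: ClosedRange<Double>, filter: String)]

    /// Returns the filter active at `seconds`, or nil when no range matches.
    func filter(at seconds: Double) -> String? {
        entries.first { $0.range.contains(seconds) }?.filter
    }
}

let schedule = FilterSchedule(entries: [
    (0.0...5.0, "sepia"),
    (5.0...10.0, "gaussianBlur")
])
print(schedule.filter(at: 2.0) ?? "none")   // sepia
print(schedule.filter(at: 7.5) ?? "none")   // gaussianBlur
print(schedule.filter(at: 12.0) ?? "none")  // none
```

In `prepare(at:)` for the time callback, the delegate would consult the schedule with `CMTimeGetSeconds(time)` and swap the active filter accordingly.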