1. Introduction
The goal here is to capture the iOS screen, covering both the system-wide screen and the content of the app itself, and to be able to start the broadcast from inside the app. This requires the ReplayKit APIs available from iOS 12 onward, and it comes with the many restrictions that apply to app extensions, such as the roughly 50 MB memory cap on a broadcast extension.
So the requirements break down as follows:
- There are two ways to send video frame data from the extension to the host app:
  - use a socket for inter-process transfer from the Broadcast Upload Extension to the host app
  - use an App Group
- Keep the app alive in the background so screen capture keeps running
- Encode the video data in the host app
- The host app and the extension share common utility classes, so we also need to create a framework
With those goals in mind, the plan is:
- Build environment: Xcode 14.2, deployment target iOS 12
- Create a Broadcast Upload Extension
- Keep the app permanently alive
- Create a framework so the Broadcast Upload Extension and the host app can share classes
- Capture the system screen
- Share the in-app screen
2. Step one: create the Broadcast Upload Extension
Steps: File -> New -> Target

Once created, Xcode generates an extension target with a SampleHandler class, as shown in the figure. SampleHandler is where the video and audio frame data is delivered continuously.

broadcastStartedWithSetupInfo: is called once when the host app starts broadcasting the screen.
processSampleBuffer:withType: is called back in real time.
- (void)broadcastStartedWithSetupInfo:(NSDictionary<NSString *,NSObject *> *)setupInfo {
    // User has requested to start the broadcast. Setup info from the UI extension can be supplied but optional.
    // Called once when the host app starts broadcasting the screen.
    // Set up the socket; see the Demo for the FIAgoraSampleHandlerSocketManager implementation.
    [[FIAgoraSampleHandlerSocketManager sharedManager] setUpSocket];
}

- (void)broadcastPaused {
    // User has requested to pause the broadcast. Samples will stop being delivered.
}

- (void)broadcastResumed {
    // User has requested to resume the broadcast. Samples delivery will resume.
}

- (void)broadcastFinished {
    // User has requested to finish the broadcast.
}

// Called in real time with the captured data
- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {
    switch (sampleBufferType) {
        case RPSampleBufferTypeVideo:
            // Handle video sample buffer: send the video frame to the host app
            [[FIAgoraSampleHandlerSocketManager sharedManager] sendVideoBufferToHostApp:sampleBuffer];
            break;
        case RPSampleBufferTypeAudioApp:
            // Handle audio sample buffer for app audio
            break;
        case RPSampleBufferTypeAudioMic:
            // Handle audio sample buffer for mic audio
            break;
        default:
            break;
    }
}
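The article doesn't reproduce the internals of sendVideoBufferToHostApp: (see the Demo for the real implementation). Conceptually it has to turn the frame into a byte stream before writing it to the socket; a minimal sketch of that serialization step might look like the following. The method name, the 3-field header layout, and the 32BGRA pixel format are illustrative assumptions, not the Demo's actual wire format:

```objectivec
// Illustrative only: pack a video frame's pixels into NSData so it can be
// written to the inter-process socket. Assumes a single-plane 32BGRA buffer.
- (NSData *)packedDataFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    size_t width       = CVPixelBufferGetWidth(pixelBuffer);
    size_t height      = CVPixelBufferGetHeight(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    void  *baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer);

    // Hypothetical header: width / height / stride, so the host app can
    // rebuild the frame on the other end of the socket.
    uint32_t header[3] = {(uint32_t)width, (uint32_t)height, (uint32_t)bytesPerRow};
    NSMutableData *packet = [NSMutableData dataWithBytes:header length:sizeof(header)];
    [packet appendBytes:baseAddress length:bytesPerRow * height];

    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    return packet;
}
```

Whatever the actual framing is, keep the extension side lightweight: the 50 MB memory cap means you should avoid holding on to sample buffers or accumulating unsent packets.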
3. Put the data-transfer classes such as FIAgoraSampleHandlerSocketManager into a framework
Steps: File -> New -> Target, create a framework.
Once created, link it from both the host app and the extension, as shown in figure 2.

4. The host app
Start the broadcast manually; the system UI has a fixed style, so some work is needed to change its appearance.
Permanent keep-alive is needed. I originally assumed that starting a broadcast would keep the app alive automatically, but my broadcasts kept getting interrupted for no obvious reason, so as far as I can tell this step is currently necessary.
A socket block that receives the frame data callbacks.
Encoding: raw video frames contain a lot of redundant data, so they need to be compressed and cropped so the video can be transmitted without dropping frames; this process is encoding, and the output is typically an H.264 stream.
Push the encoded data to the streaming server.
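The article uses an h264code helper for the encoding step but doesn't show its internals. As a rough idea of what such an encoder sets up, here is a minimal VideoToolbox sketch; the method name, the didCompressCallback function, and the self.session property are assumptions for illustration, not the Demo's actual code:

```objectivec
#import <VideoToolbox/VideoToolbox.h>

// Illustrative sketch: create a hardware H.264 compression session.
// didCompressCallback (a VTCompressionOutputCallback, not shown) would
// receive the encoded NAL units and hand them to the streaming pipeline.
- (void)setupEncoderWithWidth:(int32_t)width height:(int32_t)height {
    VTCompressionSessionRef session = NULL;
    OSStatus status = VTCompressionSessionCreate(kCFAllocatorDefault,
                                                 width, height,
                                                 kCMVideoCodecType_H264,
                                                 NULL,   // encoder specification
                                                 NULL,   // source image buffer attributes
                                                 NULL,   // compressed data allocator
                                                 didCompressCallback,
                                                 (__bridge void *)self,
                                                 &session);
    if (status != noErr) { return; }

    // Real-time encoding with no frame reordering (no B-frames):
    // sensible defaults for live screen sharing.
    VTSessionSetProperty(session, kVTCompressionPropertyKey_RealTime, kCFBooleanTrue);
    VTSessionSetProperty(session, kVTCompressionPropertyKey_AllowFrameReordering, kCFBooleanFalse);
    VTCompressionSessionPrepareToEncodeFrames(session);
    self.session = session; // assumes a VTCompressionSessionRef property
}
```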
4.1 Create the button that starts the broadcast
self.broadcastPickerView.preferredExtension binds the picker to your extension's bundle ID, so that when the broadcast sheet comes up, the system only shows your own extension.
We also change the UI of the system-provided button. This is risky and may break in a future iOS release, but it works fine for now.
// Set up the system broadcast picker view
- (void)setupSystemBroadcastPickerView
{
    // Requires iOS 12 or later
    if (@available(iOS 12.0, *)) {
        self.broadcastPickerView = [[RPSystemBroadcastPickerView alloc] initWithFrame:CGRectMake(50, 200, 100, 100)];
        self.broadcastPickerView.preferredExtension = @"summerxx.com.screen-share-ios.broadcast-extension";
        self.broadcastPickerView.backgroundColor = UIColor.cyanColor;
        self.broadcastPickerView.showsMicrophoneButton = NO;
        [self.view addSubview:self.broadcastPickerView];
    }

    // Restyle the system-provided button. Risky: this may break in a
    // future iOS release, but it works fine for now.
    UIButton *startButton = [UIButton buttonWithType:UIButtonTypeCustom];
    startButton.frame = CGRectMake(50, 310, 100, 100);
    startButton.backgroundColor = UIColor.cyanColor;
    [startButton setTitle:@"Start camera" forState:UIControlStateNormal];
    [startButton setTitleColor:UIColor.blackColor forState:UIControlStateNormal];
    [startButton addTarget:self action:@selector(startAction) forControlEvents:UIControlEventTouchUpInside];
    [self.view addSubview:startButton];
}
4.2 永久?;? 這里采用的是持續(xù)播放音頻

// Observe foreground/background transitions
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(didEnterBackGround) name:UIApplicationDidEnterBackgroundNotification object:nil];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(willEnterForeground) name:UIApplicationWillEnterForegroundNotification object:nil];

- (void)willEnterForeground
{
    // See the Demo for the details
    [[FJDeepSleepPreventerPlus sharedInstance] stop];
}

- (void)didEnterBackGround
{
    [[FJDeepSleepPreventerPlus sharedInstance] start];
}
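The internals of FJDeepSleepPreventerPlus are in the Demo rather than this article. The general audio keep-alive technique it represents is to loop a silent clip at zero volume so the system keeps the process runnable in the background. A hedged sketch, assuming the "audio" background mode is enabled in Capabilities and a bundled silence.wav asset exists (both assumptions, not the Demo's actual file names):

```objectivec
#import <AVFoundation/AVFoundation.h>

// Illustrative keep-alive: loop a silent audio clip so the app keeps
// running in the background. Requires the "audio" background mode.
- (void)startKeepAlive {
    NSError *error = nil;
    // MixWithOthers so the silent loop doesn't interrupt other audio.
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback
                                     withOptions:AVAudioSessionCategoryOptionMixWithOthers
                                           error:&error];
    [[AVAudioSession sharedInstance] setActive:YES error:&error];

    // "silence.wav" is an assumed bundled asset containing silence.
    NSURL *url = [[NSBundle mainBundle] URLForResource:@"silence" withExtension:@"wav"];
    self.player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
    self.player.numberOfLoops = -1; // loop indefinitely
    self.player.volume = 0.0;
    [self.player play];
}
```

Note that this trick can be rejected in App Store review if the audio background mode isn't justified; it's shown here only to explain why playing audio keeps the capture alive.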
4.3 The data callback
__weak __typeof(self) weakSelf = self;
[FIAgoraClientBufferSocketManager sharedManager].testBlock = ^(NSString *testText, CMSampleBufferRef sampleBuffer) {
    // Encode the video frame
    [weakSelf.h264code encodeSampleBuffer:sampleBuffer H264DataBlock:^(NSData *data) {
        NSLog(@"%@", data);
        // The encoded data can now go into the streaming pipeline
    }];
};
That covers transferring video frames over a socket, along with some of the detail-level problems I ran into.
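For completeness: the host-app side of this channel has to listen on a local socket that the extension connects to. The Demo's FIAgoraClientBufferSocketManager handles this; a minimal BSD-socket sketch of just the listening half is below. The port number 8999 and the method name are arbitrary illustrations, not the Demo's actual values:

```objectivec
#include <sys/socket.h>
#include <netinet/in.h>
#include <string.h>

// Illustrative only: the host app listens on a loopback TCP port and the
// broadcast extension connects to it to deliver frame packets.
- (int)startListening {
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    if (fd < 0) { return -1; }

    struct sockaddr_in addr;
    memset(&addr, 0, sizeof(addr));
    addr.sin_family      = AF_INET;
    addr.sin_port        = htons(8999);                 // arbitrary example port
    addr.sin_addr.s_addr = htonl(INADDR_LOOPBACK);      // localhost only

    if (bind(fd, (struct sockaddr *)&addr, sizeof(addr)) != 0) { return -1; }
    listen(fd, 1);
    // accept() on a background queue, then read the framed video packets
    // and hand them to the testBlock callback shown above.
    return fd;
}
```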
5. Transferring the data with an App Group
Create an App Group for the extension.
Create an NSUserDefaults bound to that App Group.
Write the frames into that NSUserDefaults to transfer them.

- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType
{
    switch (sampleBufferType) {
        case RPSampleBufferTypeVideo: {
            // Handle video sample buffer
            @autoreleasepool {
                CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
                float cropRate = (float)CVPixelBufferGetWidth(pixelBuffer) / (float)CVPixelBufferGetHeight(pixelBuffer);
                CGSize targetSize = CGSizeMake(540, 960);
                NTESVideoPackOrientation targetOrientation = NTESVideoPackOrientationPortrait;
                if (@available(iOS 11.0, *)) {
                    CFStringRef RPVideoSampleOrientationKeyRef = (__bridge CFStringRef)RPVideoSampleOrientationKey;
                    NSNumber *orientation = (NSNumber *)CMGetAttachment(sampleBuffer, RPVideoSampleOrientationKeyRef, NULL);
                    if (orientation.integerValue == kCGImagePropertyOrientationUp ||
                        orientation.integerValue == kCGImagePropertyOrientationUpMirrored) {
                        targetOrientation = NTESVideoPackOrientationPortrait;
                    } else if (orientation.integerValue == kCGImagePropertyOrientationDown ||
                               orientation.integerValue == kCGImagePropertyOrientationDownMirrored) {
                        targetOrientation = NTESVideoPackOrientationPortraitUpsideDown;
                    } else if (orientation.integerValue == kCGImagePropertyOrientationLeft ||
                               orientation.integerValue == kCGImagePropertyOrientationLeftMirrored) {
                        targetOrientation = NTESVideoPackOrientationLandscapeLeft;
                    } else if (orientation.integerValue == kCGImagePropertyOrientationRight ||
                               orientation.integerValue == kCGImagePropertyOrientationRightMirrored) {
                        targetOrientation = NTESVideoPackOrientationLandscapeRight;
                    }
                }
                NTESI420Frame *videoFrame = [NTESYUVConverter pixelBufferToI420:pixelBuffer
                                                                       withCrop:cropRate
                                                                     targetSize:targetSize
                                                                 andOrientation:targetOrientation];
                NSDictionary *frame = @{
                    @"width": @(videoFrame.width),
                    @"height": @(videoFrame.height),
                    @"data": [videoFrame bytes],
                    @"timestamp": @(CACurrentMediaTime() * 1000)
                };
                [self.userDefaults setObject:frame forKey:@"frame"];
                [self.userDefaults synchronize];
            }
        }
            break;
        case RPSampleBufferTypeAudioApp:
            // Handle audio sample buffer for app audio
            break;
        case RPSampleBufferTypeAudioMic:
            // Handle audio sample buffer for mic audio
            break;
        default:
            break;
    }
}
In the host app:
// App Group data transfer
- (void)setupUserDefaults
{
    // Build the data channel with NSUserDefaults and receive the video frames sent by the extension
    self.userDefaults = [[NSUserDefaults alloc] initWithSuiteName:kAppGroup];
}
// Observe the screen frame data
- (void)addObserver
{
    // KVO
    [self.userDefaults addObserver:self forKeyPath:@"frame" options:NSKeyValueObservingOptionNew context:KVOContext];
}
- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary<NSKeyValueChangeKey,id> *)change
                       context:(void *)context
{
    if ([keyPath isEqualToString:@"frame"]) {
        NSDictionary *i420Frame = change[NSKeyValueChangeNewKey];
        NSData *data = i420Frame[@"data"];
        NTESI420Frame *frame = [NTESI420Frame initWithData:data];
        CMSampleBufferRef sampleBuffer = [frame convertToSampleBuffer];
        if (sampleBuffer == NULL) {
            return;
        }
#warning Do not decode here. Encoding and decoding the screen-share data at the same time makes memory spike; this is only for testing the picture.
        [self.h264code encodeSampleBuffer:sampleBuffer H264DataBlock:^(NSData *data) {
            NSLog(@"%@", data);
            // Normally this is where you would push the stream
        }];
        // Release the sample buffer
        CFRelease(sampleBuffer);
    }
}
- (void)dealloc
{
    [self.userDefaults removeObserver:self forKeyPath:@"frame"];
}
Summary:
That's the App Group way of transferring the data. I wrote two Demos covering these two approaches; the Demos also include decoding, camera capture, and rendering, which I used to test the encode/decode path.
I consulted a lot of material along the way; the relevant links are collected below.
The Demos are here if you want to take a look:
App Group Demo: https://github.com/summerxx27/ReplayKitShareScreen
Socket Demo: https://github.com/summerxx27/ReplayKitShareScreen-socket
References
Video stream output options
https://zhuanlan.zhihu.com/p/549325898
NetEase Yunxin docs on screen sharing
http://dev.yunxin.163.com/docs/product/音視頻通話1.0/SDK開發(fā)集成/iOS開發(fā)集成/屏幕共享
Using ffmpeg to handle audio/video formats and convert raw screen-recording data to mp4
http://www.itdecent.cn/p/41ea7e06c971
Handling the iOS ReplayKit 50 MB limit
http://www.itdecent.cn/p/8c25a3bbcb16
Manually starting a screen broadcast on iOS 12
https://www.cnblogs.com/songliquan/p/15891392.html
Encoding demo
https://github.com/gezhaoyou/CaptureVideoDemo/tree/master
Handling the iOS ReplayKit 50 MB limit
https://juejin.cn/post/6968738257123147807
Encoding with VideoToolbox
http://www.itdecent.cn/p/67d0dd931ed6
Live-streaming basics
https://www.cnblogs.com/junhuawang/p/7fe457786.html
Add support for publishing in background mode: VideoToolBox now supports background mode
https://github.com/shogo4405/HaishinKit.swift/issues/626
iOS audio/video development part 8: video encoding, supporting both H.264 and H.265
https://blog.csdn.net/m0_60259116/article/details/124804169
iOS VideoToolbox hardware-encoding error codes
http://www.itdecent.cn/p/dce0a52e1bd6
Tencent Cloud article
https://cloud.tencent.com/developer/article/2021517
Alibaba Cloud docs
https://developer.aliyun.com/ask/64678?spm=a2c6h.13159736
A fairly detailed write-up on the broadcast extension
http://www.itdecent.cn/p/bbe736e7b5eb
Changing the button style
http://kinoandworld.github.io/2021/07/20/RecordScreenLiveSummary/
ReplayKit screen recording in practice on iOS
http://www.itdecent.cn/p/392777d1995c
Tencent Cloud screen sharing