A while ago I implemented live screen sharing on the Android platform. This article organizes that work; the main building blocks are MediaProjection and MediaCodec.
I. Obtaining a MediaProjection
MediaProjection is a screen-recording API introduced in Android 5.0 that does not require root. It is used together with MediaProjectionManager and MediaCodec. Note that the permission below is signature-level and reserved for system apps; a regular app does not declare it in AndroidManifest.xml and instead asks the user at runtime (next section):
<uses-permission android:name="android.permission.MANAGE_MEDIA_PROJECTION" />
a. Obtaining the MediaProjectionManager
To obtain a MediaProjection you first need a MediaProjectionManager. It is a system-level service, similar to WindowManager or ActivityManager, and its instance is retrieved through getSystemService():
MediaProjectionManager mediaProjectionManager =
        (MediaProjectionManager) getSystemService(Context.MEDIA_PROJECTION_SERVICE);

// defined in Context:
public static final String MEDIA_PROJECTION_SERVICE = "media_projection";
b. Making the request
With the MediaProjectionManager in hand, the next step is to request the MediaProjection itself:
private void requestPermission() {
    MediaProjectionManager mediaProjectionManager =
            (MediaProjectionManager) getSystemService(Context.MEDIA_PROJECTION_SERVICE);
    startActivityForResult(mediaProjectionManager.createScreenCaptureIntent(),
            REQUEST_MEDIA_PROJECTION);
}
The request goes through startActivityForResult(); createScreenCaptureIntent() builds the request Intent:
public Intent createScreenCaptureIntent() {
    Intent i = new Intent();
    i.setClassName("com.android.systemui",
            "com.android.systemui.media.MediaProjectionPermissionActivity");
    return i;
}
As the Intent shows, the request launches an Activity inside SystemUI called MediaProjectionPermissionActivity.
c. Handling the request
Because screen recording touches user privacy, a confirmation dialog is required. Here is the relevant logic in MediaProjectionPermissionActivity:
public class MediaProjectionPermissionActivity extends Activity
        implements DialogInterface.OnClickListener, CheckBox.OnCheckedChangeListener,
        DialogInterface.OnCancelListener {
    ......

    @Override
    public void onCreate(Bundle icicle) {
        super.onCreate(icicle);
        mPackageName = getCallingPackage();
        IBinder b = ServiceManager.getService(MEDIA_PROJECTION_SERVICE);
        mService = IMediaProjectionManager.Stub.asInterface(b);

        if (mPackageName == null) {
            finish();
            return;
        }

        PackageManager packageManager = getPackageManager();
        ApplicationInfo aInfo;
        try {
            aInfo = packageManager.getApplicationInfo(mPackageName, 0);
            mUid = aInfo.uid;
        } catch (PackageManager.NameNotFoundException e) {
            Log.e(TAG, "unable to look up package name", e);
            finish();
            return;
        }

        try {
            if (mService.hasProjectionPermission(mUid, mPackageName)) {
                setResult(RESULT_OK, getMediaProjectionIntent(mUid, mPackageName,
                        false /*permanentGrant*/));
                finish();
                return;
            }
        } catch (RemoteException e) {
            Log.e(TAG, "Error checking projection permissions", e);
            finish();
            return;
        }

        ......
        // show the confirmation dialog asking the user to grant permission
        ......
    }
    ......
}
From the code above, after MediaProjectionPermissionActivity is created it mainly does the following:
1. Obtains a reference to the MediaProjectionManager service through ServiceManager;
2. Looks up the caller's package name and uid and checks them: if this package has already requested and been granted permission, setResult() is called immediately; otherwise the confirmation dialog is shown.
Next, the setResult() logic that runs once the user allows the capture:
setResult(RESULT_OK, getMediaProjectionIntent(mUid, mPackageName, mPermanentGrant));

private Intent getMediaProjectionIntent(int uid, String packageName, boolean permanentGrant)
        throws RemoteException {
    IMediaProjection projection = mService.createProjection(uid, packageName,
            MediaProjectionManager.TYPE_SCREEN_CAPTURE, permanentGrant);
    Intent intent = new Intent();
    intent.putExtra(MediaProjectionManager.EXTRA_MEDIA_PROJECTION, projection.asBinder());
    return intent;
}
getMediaProjectionIntent() uses the mService reference obtained earlier to create an IMediaProjection instance, puts that instance's binder (via asBinder()) into an Intent extra, and returns the Intent.
d. Receiving the result
After MediaProjectionPermissionActivity calls setResult(), the requesting side receives the result in onActivityResult(); data is the Intent above, and getMediaProjection() extracts the MediaProjection from it:
@Override
public void onActivityResult(int requestCode, int resultCode, Intent data) {
    if (resultCode != RESULT_OK) {
        Toast.makeText(this,
                "User denied screen recorder permission", Toast.LENGTH_SHORT).show();
        return;
    }
    mMediaProjection = mProjectionManager.getMediaProjection(resultCode, data);
}
At this point the MediaProjection has been obtained.
Obtaining a MediaProjection in ROM development
If the app is built as part of the platform and you do not want the consent dialog, you can skip startActivityForResult() and, with it, the MediaProjectionPermissionActivity dialog, as follows:
a. Give the app system privileges
android:sharedUserId="android.uid.system"
b. Build the data Intent that onActivityResult() would otherwise return, directly:
public Intent getMediaProjectionIntent(Context context, String packageName,
        boolean permanentGrant) throws RemoteException {
    IBinder b = ServiceManager.getService(MEDIA_PROJECTION_SERVICE);
    sMediaProjectionManager = IMediaProjectionManager.Stub.asInterface(b);

    PackageManager packageManager = context.getPackageManager();
    ApplicationInfo aInfo;
    int uid;
    try {
        aInfo = packageManager.getApplicationInfo(packageName, 0);
        uid = aInfo.uid;
    } catch (PackageManager.NameNotFoundException e) {
        Log.e(TAG, "unable to look up package name", e);
        return null;
    }

    IMediaProjection projection = sMediaProjectionManager.createProjection(uid, packageName,
            MediaProjectionManager.TYPE_SCREEN_CAPTURE, permanentGrant);
    Intent intent = new Intent();
    intent.putExtra(MediaProjectionManager.EXTRA_MEDIA_PROJECTION, projection.asBinder());
    return intent;
}
c. Obtain the MediaProjection
Intent data = getMediaProjectionIntent(context, PKG_NAME, true);
mMediaProjection = mProjectionManager.getMediaProjection(Activity.RESULT_OK, data);
II. Screen recording
With the MediaProjection obtained, the app now has permission to record. The next step is to create a VirtualDisplay to receive the screen content:
mVirtualDisplay = mMediaProjection.createVirtualDisplay("-display", width, height,
        1, DisplayManager.VIRTUAL_DISPLAY_FLAG_PUBLIC, surface, null, null);
When creating the VirtualDisplay, note the following:
a. width and height are the pixel dimensions of the recorded display;
b. surface must not be null; with a null surface, no screen data is produced;
c. if surface is surfaceView.getHolder().getSurface(), the capture is rendered directly onto that SurfaceView. When loading the SurfaceView you must call surfaceView.getHolder().setFixedSize(VIDEO_WIDTH, VIDEO_HEIGHT), and VIDEO_WIDTH/VIDEO_HEIGHT must match the width/height passed to createVirtualDisplay(); otherwise the video inside the SurfaceView is stretched or shifted;
d. if surface = vencoder.createInputSurface(), i.e. the MediaCodec input surface, that surface acts as an entry point: the screen content flows into it as the input source and is handed to MediaCodec for encoding, after which the data can be sent over the network to other devices for display.
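For case (c), the local-preview setup can be sketched as follows. This is a configuration fragment built on Android-framework calls rather than standalone runnable code; surfaceView, width, and height are assumed fields, and the display name and flags simply mirror the snippet above:

```java
// Match the SurfaceView buffer size to the VirtualDisplay size; a mismatch
// causes the stretching/shifting described in note (c).
surfaceView.getHolder().setFixedSize(width, height);
Surface surface = surfaceView.getHolder().getSurface();

// Render the captured screen straight onto the SurfaceView.
mVirtualDisplay = mMediaProjection.createVirtualDisplay("-display", width, height,
        1 /* densityDpi */, DisplayManager.VIRTUAL_DISPLAY_FLAG_PUBLIC,
        surface, null /* callback */, null /* handler */);
```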
With all of the above ready, MediaCodec enters the picture. MediaCodec gives access to the low-level media codecs and can both encode and decode media: encoding is the recording side, decoding the display side.
III. Encoding and decoding with MediaCodec
Encoding is the recording side, capturing screen data in real time. Let's create the encoder with MediaCodec.
a.Encoder
The encoder captures screen data in real time and buffers it so it can later be sent over the network.
1. Encoder configuration and creation
public static final String MIMETYPE_VIDEO_AVC = "video/avc";

private void startVideoEncoder() {
    MediaCodec vencoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
    vencoder.configure(format /* configured MediaFormat */, null, null,
            MediaCodec.CONFIGURE_FLAG_ENCODE);
    Surface surface = vencoder.createInputSurface();
    mVirtualDisplay = mMediaProjection.createVirtualDisplay("-display", width, height,
            1, DisplayManager.VIRTUAL_DISPLAY_FLAG_PUBLIC, surface, null, null);
    vencoder.start();
}
createEncoderByType() creates the encoder; the type "video/avc" means the screen video is H.264-encoded. After obtaining the encoder, createInputSurface() creates the surface that serves as the entry point for screen data, which will be buffered and then sent; once the Format is configured, start() kicks off the recording.
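The format passed to configure() above was elided in the original. A plausible configuration is sketched below; the bit-rate, frame-rate, and I-frame-interval values are illustrative assumptions, not taken from the original, while COLOR_FormatSurface is what tells the codec its input arrives via createInputSurface(). This is an Android-framework configuration fragment:

```java
MediaFormat format = MediaFormat.createVideoFormat(
        MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
// Input comes from the surface created by createInputSurface().
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, 4_000_000); // illustrative value
format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);      // illustrative value
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1); // one key frame per second
vencoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
```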
2. Fetching the screen data and sending it to the remote end for rendering
MediaCodec.BufferInfo vBufferInfo = new MediaCodec.BufferInfo();
while (isRunning) {
    // Dequeue the index of a filled output buffer, ready for sending.
    int outputBufferId = vencoder.dequeueOutputBuffer(vBufferInfo, 0);
    ByteBuffer bb;
    if (outputBufferId >= 0) {
        if (Build.VERSION.SDK_INT < 21) {
            // Pre-API-21: fetch the whole array of output buffers.
            ByteBuffer[] outputBuffers = vencoder.getOutputBuffers();
            bb = outputBuffers[outputBufferId];
        } else {
            bb = vencoder.getOutputBuffer(outputBufferId);
        }
    }
}
To read the output, first create a BufferInfo object, then loop calling dequeueOutputBuffer(BufferInfo info, long timeoutUs) to request an output-buffer index outputBufferId, and use getOutputBuffer() with that index to fetch the output buffer. The BufferInfo passed in is filled with metadata about the ByteBuffer, e.g. whether the current frame is a config frame or a key frame. It is used like this:
// Read the data at the returned index, convert it, and send it to the remote end.
private void onEncodedAvcFrame(ByteBuffer buffer, MediaCodec.BufferInfo info) {
    if ((info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
        /*
         * Codec-specific configuration data (SPS/PPS etc.), not media data.
         */
    } else if ((info.flags & MediaCodec.BUFFER_FLAG_KEY_FRAME) != 0) {
        /* delimiter: 00 00 00 01 */
        /* I-frame: buf[4] == 0x65; SPS: buf[4] == 0x67; PPS: buf[4] == 0x68 */
    }
    // send the data
}
After the data has been sent, release the output buffer back to the codec:
vencoder.releaseOutputBuffer(outputBufferId, false /* do not render to a surface */);
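The frame-type checks sketched in onEncodedAvcFrame() reduce to reading the first NAL header byte. A small plain-Java helper, assuming (as in the comments above) a 4-byte Annex-B start code 00 00 00 01, so the NAL header sits at index 4:

```java
public class NalUtil {
    // H.264 nal_unit_type values referenced in this article.
    public static final int NAL_IDR = 5; // I-frame slice, header byte 0x65
    public static final int NAL_SPS = 7; // header byte 0x67
    public static final int NAL_PPS = 8; // header byte 0x68

    // Returns the nal_unit_type of the first NAL unit in buf, assuming the
    // buffer starts with the 4-byte start code 00 00 00 01: the NAL header
    // is then buf[4], and the type is its low five bits.
    public static int nalUnitType(byte[] buf) {
        return buf[4] & 0x1F;
    }
}
```

For example, the SPS bytes shown later in the decoder section ({0, 0, 0, 1, 103, ...}) yield nalUnitType(...) == NalUtil.NAL_SPS.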
3. Encoder workflow diagram

b.Decoder
The decoder renders the screen data: frames received from the network are queued in, dequeued, and then rendered.
1. Decoder configuration and creation
private void startVideoDecoder() {
    MediaCodec decoder = MediaCodec.createDecoderByType(MIME_TYPE);
    final MediaFormat format = MediaFormat.createVideoFormat(MIME_TYPE, VIDEO_WIDTH, VIDEO_HEIGHT);
    format.setInteger(MediaFormat.KEY_BIT_RATE, VIDEO_WIDTH * VIDEO_HEIGHT);
    format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
    format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
    // landscape
    byte[] header_sps = {0, 0, 0, 1, 103, 66, -128, 31, -38, 1, 64, 22, -24, 6, -48, -95, 53};
    byte[] header_pps = {0, 0, 0, 1, 104, -50, 6, -30};
    // portrait (use this pair instead of the one above)
    // byte[] header_sps = {0, 0, 0, 1, 103, 66, -128, 31, -38, 2, -48, 40, 104, 6, -48, -95, 53};
    // byte[] header_pps = {0, 0, 0, 1, 104, -50, 6, -30};
    format.setByteBuffer("csd-0", ByteBuffer.wrap(header_sps));
    format.setByteBuffer("csd-1", ByteBuffer.wrap(header_pps));
    // mSurface is the surface of the SurfaceView used for display
    decoder.configure(format, mSurface, null, 0);
    decoder.start();
}
createDecoderByType() creates the decoder; the type "video/avc" again means the screen video is H.264-encoded. Once the Format is configured, start() launches it.
2. Rendering the screen data received from the remote end
a. When remote screen data arrives, store it into an input buffer:
// Get the index of an input buffer that can accept data.
int inputBufferIndex = decoder.dequeueInputBuffer(100);
ByteBuffer inputBuffer;
if (inputBufferIndex >= 0) {
    if (Build.VERSION.SDK_INT < 21) {
        // Pre-API-21: fetch the whole array of input buffers.
        ByteBuffer[] inputBuffers = decoder.getInputBuffers();
        inputBuffer = inputBuffers[inputBufferIndex];
    } else {
        inputBuffer = decoder.getInputBuffer(inputBufferIndex);
    }
    inputBuffer.clear();
    // Copy the received data into the buffer at the dequeued index.
    inputBuffer.put(buf, offset, length);
    // Queue the buffer for decoding; the presentation timestamp is in microseconds.
    decoder.queueInputBuffer(inputBufferIndex, 0, length, System.nanoTime() / 1000, 0);
}
getInputBuffer(inputBufferIndex) returns the requested input buffer. Call clear() before use so stale buffer contents cannot corrupt the current frame, then copy the network data in and call queueInputBuffer(...) to enqueue it.
b. Keep dequeuing the decoded data and render it onto the SurfaceView:
while (mIsRunning) {
    // Get the index of an output buffer that already holds decoded data
    // (the data queued in step a).
    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
    int outputBufferIndex = decoder.dequeueOutputBuffer(bufferInfo, 100);
    while (outputBufferIndex >= 0) {
        // true: render the buffer to the surface (displayed on the SurfaceView).
        decoder.releaseOutputBuffer(outputBufferIndex, true);
        // Keep dequeuing so the next frame is ready to render.
        outputBufferIndex = decoder.dequeueOutputBuffer(bufferInfo, 0);
    }
}
As the code shows, the client first requests an empty input buffer, fills it with data, and passes it to the codec; the codec processes the data and writes the result into an empty output buffer; the client then obtains the filled output buffer, consumes its contents, and releases it back to the codec for reuse.
3. The flow step by step
a. The client dequeues an empty buffer from the input queue [dequeueInputBuffer];
b. The client copies the data to be encoded/decoded into the empty buffer and queues it into the input queue [queueInputBuffer];
c. The MediaCodec module takes one frame of data from the input queue and encodes/decodes it;
d. When processing is done, MediaCodec marks the original buffer empty, returns it to the input queue, and puts the processed data into the output queue;
e. The client dequeues a processed buffer from the output queue [dequeueOutputBuffer];
f. The client renders/plays the processed buffer;
g. After rendering/playback, the client returns the buffer to the output queue [releaseOutputBuffer].
Original article: https://blog.csdn.net/gb702250823/java/article/details/81627503
4. Decoder workflow diagram

IV. Summary
To wrap up the analysis above, the overall screen-recording and sharing workflow is:
1. The sharing side obtains a MediaProjection;
2. The sharing side creates an encoder with MediaCodec.createEncoderByType(), configures it, and calls start();
3. The viewing side creates a decoder with MediaCodec.createDecoderByType(), configures it, and calls start();
4. The sharing side loops over dequeueOutputBuffer(), getOutputBuffer(), sendData(), releaseOutputBuffer(..., false);
5. The viewing side loops over dequeueInputBuffer(), getInputBuffer(), queueInputBuffer(), dequeueOutputBuffer(), releaseOutputBuffer(..., true);
6. When sharing and viewing end, call stop() and release().
The config frame
cfgFrame: the configuration frame. The decoder can only start decoding after it has received this frame; without it, artifacts such as a green screen appear. Its format:
byte[] cfgFrame1 = {0, 0, 0, 1, 103, 66, -128, 31, -38, 1, 64, 22, -23, 72, 40, 48, 48, 54, -123, 9, -88, 0, 0, 0, 1, 104, -50, 6, -30};
byte[] cfgFrame2 = {0, 0, 0, 1, 103, 66, -128, 31, -38, 1, 64, 61, -91, 32, -96, -64, -64, -38, 20, 38, -96, 0, 0, 0, 1, 104, -50, 6, -30};
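The config frame is just the SPS and PPS NAL units concatenated, each preceded by a 00 00 00 01 start code (compare the bytes with header_sps/header_pps in the decoder section). A receiver can split it back into individual NAL units with plain Java; this sketch scans for start codes only, which is sufficient for short parameter sets like these:

```java
import java.util.ArrayList;
import java.util.List;

public class CsdSplitter {
    // Splits an Annex-B config frame (SPS + PPS, each preceded by the
    // 4-byte start code 00 00 00 01) into its individual NAL units,
    // each returned with its start code still attached.
    public static List<byte[]> split(byte[] frame) {
        List<Integer> starts = new ArrayList<>();
        for (int i = 0; i + 3 < frame.length; i++) {
            if (frame[i] == 0 && frame[i + 1] == 0
                    && frame[i + 2] == 0 && frame[i + 3] == 1) {
                starts.add(i);
            }
        }
        List<byte[]> nals = new ArrayList<>();
        for (int j = 0; j < starts.size(); j++) {
            int from = starts.get(j);
            int to = (j + 1 < starts.size()) ? starts.get(j + 1) : frame.length;
            byte[] nal = new byte[to - from];
            System.arraycopy(frame, from, nal, 0, nal.length);
            nals.add(nal);
        }
        return nals;
    }
}
```

Splitting cfgFrame1 above yields two NAL units: a 21-byte SPS (header byte 103, i.e. 0x67) and an 8-byte PPS (header byte 104, i.e. 0x68).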
The sharing flow in one picture

This completes the live screen-sharing pipeline on the Android platform.