Contents:
1 MessageQueue next()
2 Vsync
3 Choreographer doFrame
4 Input
An operating system is an endless-loop model, and Android is no exception: once a process is created, it settles into an infinite loop.
The two most important concepts in how the system runs are input and output.
- Input: key events, touch events, mouse events, trackball events
- Output: how Choreographer orchestrates the drawing of a frame
In Android, this input/output cycle is driven by the message mechanism inside the Looper.
Within the Looper's loop, MessageQueue.next() pulls messages and processes them: input events are handled, output is produced, and the interaction with the user completes.

Over an application's lifetime, messages are continuously produced into the MessageQueue, from both the Java layer and the native layer.
The core method is MessageQueue's next(): it handles Java-layer messages first, and when the Java side has none due, it calls nativePollOnce to process native messages and to listen for events on file descriptors.

From a hardware perspective, the screen does not need to refresh arbitrarily fast; its refresh only needs to satisfy the persistence of vision of the human eye.
At around 24 Hz of continuously refreshed frames (the rate of film), the eye already perceives motion as fluid.
So we only need to perform drawing, in step with the display's refresh rate, whenever the UI actually needs updating.
How is every frame drawn at this rate? Android's answer is to drive rendering with the Vsync signal.
On a typical 60 Hz display, Vsync fires at 60 Hz, i.e. a Vsync signal is sent every 16.6667 ms to prompt the system to compose a frame.
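The frame interval follows directly from the refresh rate. A quick sanity check on the numbers (a sketch; 60 Hz is the classic default, and modern devices may run at 90 or 120 Hz):

```java
// Frame interval for a given refresh rate, in nanoseconds.
// At 60 Hz this is ~16.67 ms, matching the Vsync cadence described above.
public class FrameInterval {
    public static long intervalNanos(int refreshRateHz) {
        return 1_000_000_000L / refreshRateHz;
    }
}
```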
The application layer cannot itself observe the display refresh and emit Vsync; the system delivers this crucial signal from native to Java through a JNI callback into Choreographer's Vsync listener.
In general, both input events and Vsync signals are first captured in native code and then crossed over JNI into the Java layer, where the business logic runs.
What drives all of this is the key method of MessageQueue: next().
1 MessageQueue next()
The logic of next() splits into a Java part and a native part.
On the Java side, it pulls the next msg from the Java-level MessageQueue and executes it; when no msg is pending, it runs the IdleHandlers.
When the Java layer has no msg due, it falls through to native Looper's pollOnce.
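That structure can be modeled in plain Java (a simplified sketch, not the framework implementation: a BlockingQueue stands in for the Java-side message list, and the timed poll stands in for nativePollOnce blocking on epoll):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

// Simplified model of MessageQueue.next(): drain Java-side messages first;
// when none are ready, run idle handlers once, then block the way
// nativePollOnce blocks on epoll until a message or fd event arrives.
public class SimpleQueue {
    private final BlockingQueue<Runnable> messages = new LinkedBlockingQueue<>();
    private final List<Runnable> idleHandlers = new ArrayList<>();

    public void post(Runnable r) { messages.add(r); }
    public void addIdleHandler(Runnable r) { idleHandlers.add(r); }

    // Returns the next message, or null on timeout (the real next() loops forever).
    public Runnable next(long timeoutMillis) throws InterruptedException {
        Runnable msg = messages.poll();          // Java-layer message ready?
        if (msg != null) return msg;
        for (Runnable idle : idleHandlers) idle.run();  // queue idle: run IdleHandlers
        return messages.poll(timeoutMillis, TimeUnit.MILLISECONDS); // "nativePollOnce"
    }
}
```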


In the native Looper, the fd listeners are wrapped as a request queue; epoll_wait pairs each ready fd's events with its request into a response for processing, and processing invokes handleEvent on the callback registered for that fd.
/**
 * Waits for events to be available, with optional timeout in milliseconds.
 * Invokes callbacks for all file descriptors on which an event occurred.
 *
 * If the timeout is zero, returns immediately without blocking.
 * If the timeout is negative, waits indefinitely until an event appears.
 *
 * Returns POLL_WAKE if the poll was awoken using wake() before
 * the timeout expired and no callbacks were invoked and no other file
 * descriptors were ready.
 *
 * Returns POLL_CALLBACK if one or more callbacks were invoked.
 *
 * Returns POLL_TIMEOUT if there was no data before the given
 * timeout expired.
 *
 * Returns POLL_ERROR if an error occurred.
 *
 * Returns a value >= 0 containing an identifier if its file descriptor has data
 * and it has no callback function (requiring the caller here to handle it).
 * In this (and only this) case outFd, outEvents and outData will contain the poll
 * events and data associated with the fd, otherwise they will be set to NULL.
 *
 * This method does not return until it has finished invoking the appropriate callbacks
 * for all file descriptors that were signalled.
 */
int pollOnce(int timeoutMillis, int* outFd, int* outEvents, void** outData);
What native pollOnce mainly does:
- epoll_wait on the monitored fds, wrapping ready events into responses
- process all native messages
- process the responses by invoking handleEvent, which in many cases calls back up into the Java layer
Both the Vsync signal and input events arrive through exactly this mechanism.
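The request/response dispatch step can be sketched in plain Java (hypothetical names; an int "fd" keyed map stands in for the epoll set, and the callback's return value decides whether it stays registered, mirroring LooperCallback::handleEvent):

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the native Looper's dispatch: each monitored fd has a registered
// callback (the "request"); ready events are paired with it (a "response")
// and handleEvent is invoked. Returning 0 unregisters the fd, 1 keeps it.
public class MiniLooper {
    public interface FdCallback { int handleEvent(int fd, int events); }

    private final Map<Integer, FdCallback> requests = new HashMap<>();

    public void addFd(int fd, FdCallback cb) { requests.put(fd, cb); }

    // Simulates the pollInner step that follows epoll_wait: dispatch the
    // ready (fd, events) pairs to their registered callbacks.
    public void dispatch(Map<Integer, Integer> readyEvents) {
        for (Map.Entry<Integer, Integer> e : readyEvents.entrySet()) {
            FdCallback cb = requests.get(e.getKey());
            if (cb != null && cb.handleEvent(e.getKey(), e.getValue()) == 0) {
                requests.remove(e.getKey()); // callback asked to be removed
            }
        }
    }

    public boolean isRegistered(int fd) { return requests.containsKey(fd); }
}
```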
2 Vsync
The events obtained through the epoll_wait mechanism are all handled as responses inside pollOnce/pollInner.
From there, dispatchVsync crosses from native back into the Java layer:

native:
int DisplayEventDispatcher::handleEvent(int, int events, void*) {
    if (events & (Looper::EVENT_ERROR | Looper::EVENT_HANGUP)) {
        ALOGE("Display event receiver pipe was closed or an error occurred. "
              "events=0x%x",
              events);
        return 0; // remove the callback
    }
    ......
    dispatchVsync(vsyncTimestamp, vsyncDisplayId, vsyncCount, vsyncEventData);
    ......
    return 1; // keep the callback
}
java:
// Called from native code.
@SuppressWarnings("unused")
@UnsupportedAppUsage
private void dispatchVsync(long timestampNanos, long physicalDisplayId, int frame) {
    onVsync(timestampNanos, physicalDisplayId, frame);
}

private final class FrameDisplayEventReceiver extends DisplayEventReceiver
        implements Runnable {
    @Override
    public void onVsync(long timestampNanos, long physicalDisplayId, int frame) {
        mTimestampNanos = timestampNanos;
        mFrame = frame;
        Message msg = Message.obtain(mHandler, this);
        msg.setAsynchronous(true);
        mHandler.sendMessageAtTime(msg, timestampNanos / TimeUtils.NANOS_PER_MS);
    }

    @Override
    public void run() {
        mHavePendingVsync = false;
        doFrame(mTimestampNanos, mFrame);
    }
}
On receiving the Vsync signal, Choreographer executes doFrame.
Nearly all of the application layer's important work happens inside doFrame.
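Note how onVsync posts its message with sendMessageAtTime using the Vsync timestamp: by the time the message is enqueued that timestamp is already in the past, so the effective delay clamps to zero and the runnable (hence doFrame) executes as soon as the Looper reaches it. A sketch of that clamping (simplified; the real scheduling lives in MessageQueue):

```java
// The vsync timestamp marks when the signal was generated, which is at or
// before "now" when the message is posted, so the message is due immediately.
public class VsyncPost {
    public static long delayMillis(long postAtMillis, long nowMillis) {
        return Math.max(0, postAtMillis - nowMillis);
    }
}
```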
3 Choreographer doFrame
First, look at what doFrame executes:
try {
    Trace.traceBegin(Trace.TRACE_TAG_VIEW, "Choreographer#doFrame");
    AnimationUtils.lockAnimationClock(frameTimeNanos / TimeUtils.NANOS_PER_MS);

    mFrameInfo.markInputHandlingStart();
    doCallbacks(Choreographer.CALLBACK_INPUT, frameTimeNanos);

    mFrameInfo.markAnimationsStart();
    doCallbacks(Choreographer.CALLBACK_ANIMATION, frameTimeNanos);
    doCallbacks(Choreographer.CALLBACK_INSETS_ANIMATION, frameTimeNanos);

    mFrameInfo.markPerformTraversalsStart();
    doCallbacks(Choreographer.CALLBACK_TRAVERSAL, frameTimeNanos);
    doCallbacks(Choreographer.CALLBACK_COMMIT, frameTimeNanos);
} finally {
    AnimationUtils.unlockAnimationClock();
    Trace.traceEnd(Trace.TRACE_TAG_VIEW);
}
The UI thread's core work lives in these few calls:

The callback processing above corresponds to the key phases shown in the figure: input, animation, and traversal are handled in that order.
The cycle nominally runs every 16.6 ms; in practice, delays can push callbacks late and cause dropped frames.
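When doFrame starts late, Choreographer compares the elapsed time against the frame interval to count dropped frames (the familiar "Skipped N frames!" log). A minimal sketch of that arithmetic (simplified from the framework logic):

```java
// If the frame callback runs later than one full interval after the
// intended frame time, each intervening interval is a dropped frame.
public class JankCounter {
    public static long skippedFrames(long intendedFrameTimeNanos,
                                     long actualStartNanos,
                                     long frameIntervalNanos) {
        long jitter = actualStartNanos - intendedFrameTimeNanos;
        return jitter >= frameIntervalNanos ? jitter / frameIntervalNanos : 0;
    }
}
```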
4 Input
The overall flow of input events mirrors Vsync:
native handleEvent, here in NativeInputEventReceiver, processes the event and, depending on its type, crosses over JNI
into the Java layer, where WindowInputEventReceiver dispatches and consumes it.

native:
int NativeInputEventReceiver::handleEvent(int receiveFd, int events, void* data) {
    // Allowed return values of this function as documented in LooperCallback::handleEvent
    constexpr int REMOVE_CALLBACK = 0;
    constexpr int KEEP_CALLBACK = 1;
    if (events & ALOOPER_EVENT_INPUT) {
        JNIEnv* env = AndroidRuntime::getJNIEnv();
        status_t status = consumeEvents(env, false /*consumeBatches*/, -1, nullptr);
        mMessageQueue->raiseAndClearException(env, "handleReceiveCallback");
        return status == OK || status == NO_MEMORY ? KEEP_CALLBACK : REMOVE_CALLBACK;
    }
    ......
}

status_t NativeInputEventReceiver::consumeEvents(JNIEnv* env,
        bool consumeBatches, nsecs_t frameTime, bool* outConsumedBatch) {
    ......
    ScopedLocalRef<jobject> receiverObj(env, nullptr);
    bool skipCallbacks = false;
    for (;;) {
        uint32_t seq;
        InputEvent* inputEvent;
        status_t status = mInputConsumer.consume(&mInputEventFactory,
                consumeBatches, frameTime, &seq, &inputEvent);
        if (status != OK && status != WOULD_BLOCK) {
            ALOGE("channel '%s' ~ Failed to consume input event. status=%s(%d)",
                  getInputChannelName().c_str(), statusToString(status).c_str(), status);
            return status;
        }
        if (status == WOULD_BLOCK) {
            if (!skipCallbacks && !mBatchedInputEventPending && mInputConsumer.hasPendingBatch()) {
                // There is a pending batch. Come back later.
                if (!receiverObj.get()) {
                    receiverObj.reset(jniGetReferent(env, mReceiverWeakGlobal));
                    if (!receiverObj.get()) {
                        ALOGW("channel '%s' ~ Receiver object was finalized "
                              "without being disposed.",
                              getInputChannelName().c_str());
                        return DEAD_OBJECT;
                    }
                }
                mBatchedInputEventPending = true;
                if (kDebugDispatchCycle) {
                    ALOGD("channel '%s' ~ Dispatching batched input event pending notification.",
                          getInputChannelName().c_str());
                }
                env->CallVoidMethod(receiverObj.get(),
                        gInputEventReceiverClassInfo.onBatchedInputEventPending,
                        mInputConsumer.getPendingBatchSource());
                if (env->ExceptionCheck()) {
                    ALOGE("Exception dispatching batched input events.");
                    mBatchedInputEventPending = false; // try again later
                }
            }
            return OK;
        }
        assert(inputEvent);
        if (!skipCallbacks) {
            if (!receiverObj.get()) {
                receiverObj.reset(jniGetReferent(env, mReceiverWeakGlobal));
                if (!receiverObj.get()) {
                    ALOGW("channel '%s' ~ Receiver object was finalized "
                          "without being disposed.", getInputChannelName().c_str());
                    return DEAD_OBJECT;
                }
            }
            jobject inputEventObj;
            switch (inputEvent->getType()) {
                case AINPUT_EVENT_TYPE_KEY:
                    if (kDebugDispatchCycle) {
                        ALOGD("channel '%s' ~ Received key event.", getInputChannelName().c_str());
                    }
                    inputEventObj = android_view_KeyEvent_fromNative(env,
                            static_cast<KeyEvent*>(inputEvent));
                    break;
                case AINPUT_EVENT_TYPE_MOTION: {
                    if (kDebugDispatchCycle) {
                        ALOGD("channel '%s' ~ Received motion event.", getInputChannelName().c_str());
                    }
                    MotionEvent* motionEvent = static_cast<MotionEvent*>(inputEvent);
                    if ((motionEvent->getAction() & AMOTION_EVENT_ACTION_MOVE) && outConsumedBatch) {
                        *outConsumedBatch = true;
                    }
                    inputEventObj = android_view_MotionEvent_obtainAsCopy(env, motionEvent);
                    break;
                }
                case AINPUT_EVENT_TYPE_FOCUS: {
                    FocusEvent* focusEvent = static_cast<FocusEvent*>(inputEvent);
                    if (kDebugDispatchCycle) {
                        ALOGD("channel '%s' ~ Received focus event: hasFocus=%s, inTouchMode=%s.",
                              getInputChannelName().c_str(), toString(focusEvent->getHasFocus()),
                              toString(focusEvent->getInTouchMode()));
                    }
                    env->CallVoidMethod(receiverObj.get(), gInputEventReceiverClassInfo.onFocusEvent,
                            jboolean(focusEvent->getHasFocus()),
                            jboolean(focusEvent->getInTouchMode()));
                    finishInputEvent(seq, true /* handled */);
                    continue;
                }
                case AINPUT_EVENT_TYPE_CAPTURE: {
                    const CaptureEvent* captureEvent = static_cast<CaptureEvent*>(inputEvent);
                    if (kDebugDispatchCycle) {
                        ALOGD("channel '%s' ~ Received capture event: pointerCaptureEnabled=%s",
                              getInputChannelName().c_str(),
                              toString(captureEvent->getPointerCaptureEnabled()));
                    }
                    env->CallVoidMethod(receiverObj.get(),
                            gInputEventReceiverClassInfo.onPointerCaptureEvent,
                            jboolean(captureEvent->getPointerCaptureEnabled()));
                    finishInputEvent(seq, true /* handled */);
                    continue;
                }
                case AINPUT_EVENT_TYPE_DRAG: {
                    const DragEvent* dragEvent = static_cast<DragEvent*>(inputEvent);
                    if (kDebugDispatchCycle) {
                        ALOGD("channel '%s' ~ Received drag event: isExiting=%s",
                              getInputChannelName().c_str(), toString(dragEvent->isExiting()));
                    }
                    env->CallVoidMethod(receiverObj.get(), gInputEventReceiverClassInfo.onDragEvent,
                            jboolean(dragEvent->isExiting()), dragEvent->getX(),
                            dragEvent->getY());
                    finishInputEvent(seq, true /* handled */);
                    continue;
                }
                default:
                    assert(false); // InputConsumer should prevent this from ever happening
                    inputEventObj = nullptr;
            }
            if (inputEventObj) {
                if (kDebugDispatchCycle) {
                    ALOGD("channel '%s' ~ Dispatching input event.", getInputChannelName().c_str());
                }
                env->CallVoidMethod(receiverObj.get(),
                        gInputEventReceiverClassInfo.dispatchInputEvent, seq, inputEventObj);
                if (env->ExceptionCheck()) {
                    ALOGE("Exception dispatching input event.");
                    skipCallbacks = true;
                }
                env->DeleteLocalRef(inputEventObj);
            } else {
                ALOGW("channel '%s' ~ Failed to obtain event object.",
                      getInputChannelName().c_str());
                skipCallbacks = true;
            }
        }
    }
}
java:
final class WindowInputEventReceiver extends InputEventReceiver {
    public WindowInputEventReceiver(InputChannel inputChannel, Looper looper) {
        super(inputChannel, looper);
    }

    @Override
    public void onInputEvent(InputEvent event) {
        Trace.traceBegin(Trace.TRACE_TAG_VIEW, "processInputEventForCompatibility");
        List<InputEvent> processedEvents;
        try {
            processedEvents =
                    mInputCompatProcessor.processInputEventForCompatibility(event);
        } finally {
            Trace.traceEnd(Trace.TRACE_TAG_VIEW);
        }
        if (processedEvents != null) {
            if (processedEvents.isEmpty()) {
                // InputEvent consumed by mInputCompatProcessor
                finishInputEvent(event, true);
            } else {
                for (int i = 0; i < processedEvents.size(); i++) {
                    enqueueInputEvent(
                            processedEvents.get(i), this,
                            QueuedInputEvent.FLAG_MODIFIED_FOR_COMPATIBILITY, true);
                }
            }
        } else {
            enqueueInputEvent(event, this, 0, true);
        }
    }

    @Override
    public void onBatchedInputEventPending(int source) {
        // mStopped: There will be no more choreographer callbacks if we are stopped,
        // so we must consume all input immediately to prevent ANR
        final boolean unbuffered = mUnbufferedInputDispatch
                || (source & mUnbufferedInputSource) != SOURCE_CLASS_NONE
                || mStopped;
        if (unbuffered) {
            if (mConsumeBatchedInputScheduled) {
                unscheduleConsumeBatchedInput();
            }
            // Consume event immediately if unbuffered input dispatch has been requested.
            consumeBatchedInputEvents(-1);
            return;
        }
        scheduleConsumeBatchedInput();
    }

    @Override
    public void onFocusEvent(boolean hasFocus, boolean inTouchMode) {
        windowFocusChanged(hasFocus, inTouchMode);
    }

    @Override
    public void dispose() {
        unscheduleConsumeBatchedInput();
        super.dispose();
    }
}
The processing flow of an input event:
the incoming event goes through deliverInputEvent,
the delivered input event then arrives at an InputStage,
and InputStage, a chain of responsibility, dispatches and consumes these InputEvents.
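The InputStage pipeline is a textbook chain of responsibility, and can be modeled in plain Java (a sketch with hypothetical stage names; the real chain in ViewRootImpl includes stages such as ViewPreImeInputStage and ViewPostImeInputStage):

```java
// Chain-of-responsibility sketch: each stage either consumes the event
// (FINISH_HANDLED) or forwards it to the next stage, in the spirit of
// ViewRootImpl.InputStage's deliver/forward.
public class StageChain {
    public static final int FORWARD = 0;
    public static final int FINISH_HANDLED = 1;

    public static abstract class Stage {
        private final Stage next;
        protected Stage(Stage next) { this.next = next; }

        // Returns true if some stage in the chain consumed the event.
        public boolean deliver(String event) {
            if (onProcess(event) == FINISH_HANDLED) return true;
            return next != null && next.deliver(event); // forward down the chain
        }

        protected abstract int onProcess(String event);
    }
}
```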
Below, take one fling of a RecyclerView as an example; the overall flow is:

The Vsync signal arrives, doFrame runs, and execution reaches the input phase.

The touch event is consumed; RecyclerView lays out some ViewHolders.

When the fill during the scroll finishes, a change to a RecyclerView ViewProperty runs and triggers invalidate.
invalidate takes the hardware-accelerated path all the way up to ViewRootImpl, which posts the traversal callback to Choreographer; it runs once doFrame reaches the traversal phase.

ViewRootImpl runs performTraversals, decides whether a fresh layout is needed, and then executes the layout, draw, and related passes.

When the whole input-to-traversal cycle ends, after hardware rendering the sync task is handed to the GPU, and a frame is composed.
It is then handed to SurfaceFlinger for display.

SurfaceFlinger is a system process; each application process is a client that hands its image display work to SurfaceFlinger over IPC.
Launching an app:
