Android Custom Camera2 Series (4): Real-Time Processing of the Preview and Capture with OpenGL ES

Previous posts covered some GLSL syntax; this article records my experience developing with Camera2 + GLSurfaceView + GLSL. Corrections are welcome if you spot mistakes.

Parts of this article build on the camera opening, preview setup, and binding covered in part two of this custom Camera2 series. If anything is unclear, refer back to that article for the relevant APIs.

GitHub address

GitHub address of the main classes

Let's look at the result first. Here I only test the R channel for black in the fragment shader. If instead you want a negative (inverted) image, you can compute 1-R, 1-G, 1-B per channel. Keep in mind that colors in GLSL range from 0 to 1, while RGB values range from 0 to 255.
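The 0-1 versus 0-255 mapping and the 1-R inversion can be sketched in plain Java (`ColorRange` is a hypothetical helper for illustration, not part of the demo):

```java
public class ColorRange {
    // Map an 8-bit channel value (0-255) to GLSL's normalized 0.0-1.0 range.
    static float toGlsl(int channel255) {
        return channel255 / 255.0f;
    }

    // Invert a normalized channel, as in gl_FragColor = vec4(1.0 - r, ...).
    static float invert(float channel01) {
        return 1.0f - channel01;
    }

    public static void main(String[] args) {
        float r = toGlsl(255);             // full red channel -> 1.0
        System.out.println(r);             // prints 1.0
        System.out.println(invert(r));     // prints 0.0 (inverted)
    }
}
```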


1. Adding the layout

<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:background="@android:color/black"
    tools:context="cn.tongue.tonguecamera.ui.CameraActivity">
    <FrameLayout
        android:id="@+id/frame_layout"
        android:layout_width="match_parent"
        android:layout_height="wrap_content">
    </FrameLayout>
    <RelativeLayout
        android:id="@+id/homecamera_bottom_relative"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:background="#00ffffff"
        android:layout_alignParentBottom="true">
        <ImageView
            android:id="@+id/iv_back"
            android:layout_width="40dp"
            android:layout_height="30dp"
            android:scaleType="centerInside"
            android:layout_marginBottom="20dp"
            android:layout_marginStart="20dp"
            android:layout_centerVertical="true"
            android:background="@drawable/icon_back" />
        <ImageView
            android:id="@+id/img_camera"
            android:layout_width="80dp"
            android:layout_height="80dp"
            android:scaleType="centerInside"
            android:layout_marginBottom="20dp"
            android:layout_centerInParent="true"
            android:background="@drawable/camera" />
    </RelativeLayout>
    <LinearLayout
        android:id="@+id/home_custom_top_relative"
        android:layout_width="match_parent"
        android:layout_height="50dp"
        android:gravity="center_vertical"
        android:orientation="horizontal"
        android:visibility="gone"
        android:background="#00ffffff"
        android:layout_alignParentTop="true"
        >
        <ImageView
            android:id="@+id/camera_flash"
            android:layout_width="0dp"
            android:layout_height="wrap_content"
            android:layout_weight="1"
            android:padding="10dp"
            android:src="@drawable/icon_camera_off" />
        <View
            android:layout_width="0dp"
            android:layout_height="wrap_content"
            android:layout_weight="5"/>
        <ImageView
            android:id="@+id/camera_switch"
            android:layout_width="0dp"
            android:layout_height="wrap_content"
            android:layout_weight="1"
            android:padding="10dp"
            android:src="@drawable/btn_camera_turn_n" />
    </LinearLayout>
</RelativeLayout>

The layout only contains a FrameLayout; we add the GLSurfaceView to it dynamically at runtime.

2. Getting the screen size

The setUpCameraOutputs method configures the camera properties and returns the camera's Size, so that we can achieve a full-screen camera.

    /**
     * Sets up the camera-related member variables.
     * @param width  available width of the camera preview
     * @param height available height of the camera preview
     */
    @RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
    public Size setUpCameraOutputs(int width, int height) {
        mFile = new File(mActivity.getExternalFilesDir(null), "pic.png");
        CameraManager manager = (CameraManager) mActivity.getSystemService(Context.CAMERA_SERVICE);
        try {
            for (String cameraId : manager.getCameraIdList()) {
                CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
                // Skip the front-facing camera
                Integer facing = characteristics.get(CameraCharacteristics.LENS_FACING);
                if (facing != null && facing == CameraCharacteristics.LENS_FACING_FRONT) {
                    continue;
                }
                StreamConfigurationMap map = characteristics.get(
                        CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
                if (map == null) {
                    continue;
                }
                // For still image capture, choose the largest available size.
                Size largest = Collections.max(
                        Arrays.asList(map.getOutputSizes(ImageFormat.JPEG)),
                        new CompareSizesByArea());
                mImageReader = ImageReader.newInstance(largest.getWidth(),
                        largest.getHeight(), ImageFormat.JPEG, 2);
                mImageReader.setOnImageAvailableListener(
                        mOnImageAvailableListener, mBackgroundHandler);
                // Determine whether we need to swap dimensions to get the preview size relative to the sensor
                int displayRotation = mActivity.getWindowManager().getDefaultDisplay().getRotation();
                mSensorOrientation = characteristics.get(CameraCharacteristics.SENSOR_ORIENTATION);
                boolean swappedDimensions = false;
                switch (displayRotation) {
                    case Surface.ROTATION_0:
                    case Surface.ROTATION_180:
                        if (mSensorOrientation == 90 || mSensorOrientation == 270) {
                            swappedDimensions = true;
                        }
                        break;
                    case Surface.ROTATION_90:
                    case Surface.ROTATION_270:
                        if (mSensorOrientation == 0 || mSensorOrientation == 180) {
                            swappedDimensions = true;
                        }
                        break;
                    default:
                        Log.e(TAG, "Display rotation is invalid: " + displayRotation);
                }
                Point displaySize = new Point();
                mActivity.getWindowManager().getDefaultDisplay().getSize(displaySize);
                int rotatedPreviewWidth = width;
                int rotatedPreviewHeight = height;
                int maxPreviewWidth = displaySize.x;
                int maxPreviewHeight = displaySize.y;
                if (swappedDimensions) {
                    rotatedPreviewWidth = height;
                    rotatedPreviewHeight = width;
                    maxPreviewWidth = displaySize.y;
                    maxPreviewHeight = displaySize.x;
                }
                if (maxPreviewWidth > MAX_PREVIEW_WIDTH) {
                    maxPreviewWidth = MAX_PREVIEW_WIDTH;
                }
                if (maxPreviewHeight > MAX_PREVIEW_HEIGHT) {
                    maxPreviewHeight = MAX_PREVIEW_HEIGHT;
                }
                mPreviewSize = chooseOptimalSize(map.getOutputSizes(SurfaceTexture.class),
                        rotatedPreviewWidth, rotatedPreviewHeight, maxPreviewWidth,
                        maxPreviewHeight, largest);

                // Match the aspect ratio of the preview view to the preview size we chose.
                int orientation = mActivity.getResources().getConfiguration().orientation;
                // Check whether a flash unit is available
                Boolean available = characteristics.get(CameraCharacteristics.FLASH_INFO_AVAILABLE);
                mFlashSupported = available == null ? false : available;
                mCameraId = cameraId;
            }
        } catch (CameraAccessException e) {
            e.printStackTrace();
        } catch (NullPointerException ignored) {
        }
        return mPreviewSize;
    }
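The CompareSizesByArea comparator used above is not shown in this excerpt; a minimal sketch, following the shape of Google's Camera2Basic sample, could look like this (a plain width/height pair stands in for android.util.Size so it runs off-device):

```java
import java.util.Arrays;
import java.util.Collections;
import java.util.Comparator;

public class CompareSizesByAreaDemo {
    // Stand-in for android.util.Size so the sketch runs off-device.
    static class Size {
        final int width, height;
        Size(int w, int h) { width = w; height = h; }
    }

    // Compare by area; use long arithmetic so large sizes don't overflow int.
    static class CompareSizesByArea implements Comparator<Size> {
        @Override
        public int compare(Size lhs, Size rhs) {
            return Long.signum((long) lhs.width * lhs.height
                    - (long) rhs.width * rhs.height);
        }
    }

    public static void main(String[] args) {
        Size largest = Collections.max(
                Arrays.asList(new Size(1920, 1080), new Size(4032, 3024), new Size(640, 480)),
                new CompareSizesByArea());
        System.out.println(largest.width + "x" + largest.height); // prints 4032x3024
    }
}
```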

Using the returned Size, we set the width and height of the CameraV2GLSurfaceView layout.


public class CameraV2GLSurfaceView extends GLSurfaceView {
    public static boolean shouldTakePic = false;

    public void init(CameraV2 camera, boolean isPreviewStarted, Context context) {
        setEGLContextClientVersion(2);
        CameraV2Renderer mCameraV2Renderer = new CameraV2Renderer();
        mCameraV2Renderer.init(this, camera, isPreviewStarted, context);
        setRenderer(mCameraV2Renderer);
        setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
    }

    public CameraV2GLSurfaceView(Context context) {
        super(context);
    }
}

After subclassing GLSurfaceView, we need to set a Renderer. CameraV2 here is the helper class that wraps the camera operations. In the Renderer's onSurfaceCreated method we create an OES texture.

    /**
     * Called when the GLSurfaceView is created
     *
     * @param gl GL10
     * @param config EGLConfig
     */
    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        // Create the texture and return its texture ID
        mOESTextureId = Utils.createOESTextureObject();
        // Configure the filter: load the vertex and fragment shaders
        FilterEngine mFilterEngine = new FilterEngine(mOESTextureId, mContext);
        mDataBuffer = mFilterEngine.getBuffer();
        mShaderProgram = mFilterEngine.getShaderProgram();
        glGenFramebuffers(1, mFBOIds, 0);
        glBindFramebuffer(GL_FRAMEBUFFER, mFBOIds[0]);
        // Look up the locations of the vertex and fragment shader variables
        uColorType = glGetUniformLocation(mShaderProgram, FilterEngine.COLOR_TYPE);
        hChangeColor = GLES20.glGetUniformLocation(mShaderProgram, "vChangeColor");
        hChangeColor2 = GLES20.glGetUniformLocation(mShaderProgram, "vChangeColorB");
        hChangeColor3 = GLES20.glGetUniformLocation(mShaderProgram, "vChangeColorC");
        hArraySize = GLES20.glGetUniformLocation(mShaderProgram, "vArraysSize");
    }

Next, create a SurfaceTexture from the OES texture ID to receive Camera2's preview frames.

    @RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
    private boolean initSurfaceTexture() {
        if (mCamera == null || mCameraV2GLSurfaceView == null) {
            Log.i(TAG, "mCamera or mGLSurfaceView is null!");
            return false;
        }
        // Create a SurfaceTexture from the OES texture ID
        mSurfaceTexture = new SurfaceTexture(mOESTextureId);
        mSurfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
            @Override
            public void onFrameAvailable(SurfaceTexture surfaceTexture) {
                // Request an OpenGL ES render pass whenever a new frame arrives
                mCameraV2GLSurfaceView.requestRender();
            }
        });
        // Use this SurfaceTexture as the camera preview output (binding them together)
        mCamera.setPreviewTexture(mSurfaceTexture);
        mCamera.createCameraPreviewSession();
        return true;
    }

Finally, initialize the OpenGL ES environment: write and compile the shaders and link them into a program. (Since there is quite a lot of code here, I will attach my GitHub demo address at the end of the article.)

/**
 * Filter utility
 * Reference: [https://blog.csdn.net/lb377463323/article/details/78054892]
 * @date 2019-02-12 14:10:07
 * @author ymc
 */

public class FilterEngine {
    @SuppressLint("StaticFieldLeak")
    private static FilterEngine filterEngine = null;
    private Context mContext;
    /**
     * FloatBuffer holding the vertex and texture coordinate data
     */
    private FloatBuffer mBuffer;
    private int mOESTextureId = -1;
    private int vertexShader = -1;
    private int fragmentShader = -1;
    private int mShaderProgram = -1;
    private int aPositionLocation = -1;
    private int aTextureCoordLocation = -1;
    private int uTextureMatrixLocation = -1;
    private int uTextureSamplerLocation = -1;
    /**
     * In each row, the first two values are vertex coordinates and the last two are texture coordinates
     */
    private static final float[] VERTEX_DATA = {
            1f, 1f, 1f, 1f,
            -1f, 1f, 0f, 1f,
            -1f, -1f, 0f, 0f,
            1f, 1f, 1f, 1f,
            -1f, -1f, 0f, 0f,
            1f, -1f, 1f, 0f
    };
    public static final String POSITION_ATTRIBUTE = "aPosition";
    public static final String TEXTURE_COORD_ATTRIBUTE = "aTextureCoordinate";
    public static final String TEXTURE_MATRIX_UNIFORM = "uTextureMatrix";
    public static final String TEXTURE_SAMPLER_UNIFORM = "uTextureSampler";
    public static final String COLOR_TYPE = "vColorType";

    /**
     * Constructor
     * @param oestextureid OES texture id
     * @param context context
     */
    public FilterEngine(int oestextureid, Context context) {
        mContext = context;
        mOESTextureId = oestextureid;
        mBuffer = createBuffer(VERTEX_DATA);
        /*
         * For camera preview the vertex shader stays the same, but the fragment shader must
         * sample with samplerExternalOES instead of sampler2D, and declare the external
         * texture extension at the top:
         * #extension GL_OES_EGL_image_external : require
         */
        fragmentShader = loadShader(GL_FRAGMENT_SHADER, Utils.readShaderFromResource(mContext, R.raw.base_fragment_shader));
        vertexShader = loadShader(GL_VERTEX_SHADER, Utils.readShaderFromResource(mContext, R.raw.base_vertex_shader));
        mShaderProgram = linkProgram(vertexShader, fragmentShader);
    }

    /**
     * Create a direct FloatBuffer (allocated in native memory so it is not moved by GC)
     * @param vertexData float array
     * @return FloatBuffer
     */
    private FloatBuffer createBuffer(float[] vertexData) {
        FloatBuffer buffer = ByteBuffer.allocateDirect(vertexData.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer();
        buffer.put(vertexData, 0, vertexData.length).position(0);
        return buffer;
    }

    /**
     * Load a shader.
     * GL_VERTEX_SHADER creates a vertex shader
     * GL_FRAGMENT_SHADER creates a fragment shader
     * @param type shader type
     * @param shaderSource shader source string
     * @return shader handle
     */
    private int loadShader(int type, String shaderSource) {
        int shader = glCreateShader(type);
        if (shader == 0) {
            throw new RuntimeException("Create Shader Failed!" + glGetError());
        }
        glShaderSource(shader, shaderSource);
        glCompileShader(shader);
        // Check the compile status so shader errors surface immediately
        int[] compiled = new int[1];
        glGetShaderiv(shader, GL_COMPILE_STATUS, compiled, 0);
        if (compiled[0] == 0) {
            throw new RuntimeException("Compile Shader Failed!" + glGetShaderInfoLog(shader));
        }
        return shader;
    }

    /**
     * Link the two shaders into a program
     * @param verShader vertex shader
     * @param fragShader fragment shader
     * @return program
     */
    private int linkProgram(int verShader, int fragShader) {
        int program = glCreateProgram();
        if (program == 0) {
            throw new RuntimeException("Create Program Failed!" + glGetError());
        }
        // Attach the vertex and fragment shaders
        glAttachShader(program, verShader);
        glAttachShader(program, fragShader);
        // Link the program
        glLinkProgram(program);
        // Tell OpenGL ES to use this program
        glUseProgram(program);
        return program;
    }

    public void drawTexture(float[] transformMatrix) {
        aPositionLocation = glGetAttribLocation(mShaderProgram, FilterEngine.POSITION_ATTRIBUTE);
        aTextureCoordLocation = glGetAttribLocation(mShaderProgram, FilterEngine.TEXTURE_COORD_ATTRIBUTE);
        uTextureMatrixLocation = glGetUniformLocation(mShaderProgram, FilterEngine.TEXTURE_MATRIX_UNIFORM);
        uTextureSamplerLocation = glGetUniformLocation(mShaderProgram, FilterEngine.TEXTURE_SAMPLER_UNIFORM);

        glActiveTexture(GLES20.GL_TEXTURE0);
        glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mOESTextureId);
        glUniform1i(uTextureSamplerLocation, 0);
        glUniformMatrix4fv(uTextureMatrixLocation, 1, false, transformMatrix, 0);

        if (mBuffer != null) {
            mBuffer.position(0);
            glEnableVertexAttribArray(aPositionLocation);
            glVertexAttribPointer(aPositionLocation, 2, GL_FLOAT, false, 16, mBuffer);

            mBuffer.position(2);
            glEnableVertexAttribArray(aTextureCoordLocation);
            glVertexAttribPointer(aTextureCoordLocation, 2, GL_FLOAT, false, 16, mBuffer);

            glDrawArrays(GL_TRIANGLES, 0, 6);
        }
    }
    // ... getters and setters omitted ...
   
}
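The createBuffer method above relies on a direct, native-ordered buffer so the float data survives GC and is readable by the GL driver. The same allocation can be verified in plain Java (BufferDemo is just an off-device sketch of the pattern):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class BufferDemo {
    // Same pattern as FilterEngine.createBuffer: 4 bytes per float,
    // native byte order, position reset to 0 so GL reads from the start.
    static FloatBuffer createBuffer(float[] vertexData) {
        FloatBuffer buffer = ByteBuffer.allocateDirect(vertexData.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer();
        buffer.put(vertexData, 0, vertexData.length).position(0);
        return buffer;
    }

    public static void main(String[] args) {
        float[] data = {1f, 1f, 1f, 1f, -1f, 1f, 0f, 1f};
        FloatBuffer buf = createBuffer(data);
        System.out.println(buf.capacity()); // prints 8
        System.out.println(buf.position()); // prints 0
        System.out.println(buf.isDirect()); // prints true
    }
}
```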

The utility class above loads the shaders and links them into the program. Next, we write the vertex shader.

attribute vec4 aPosition;
uniform mat4 uTextureMatrix;
attribute vec4 aTextureCoordinate;
varying vec2 vTextureCoord;
void main()
{
  vTextureCoord = (uTextureMatrix * aTextureCoordinate).xy;
  gl_Position = aPosition;
}

Now the fragment shader GLSL code; below I include more than one filter effect.

#extension GL_OES_EGL_image_external : require
precision mediump float;
uniform samplerExternalOES uTextureSampler;
uniform int vColorType;
varying vec2 vTextureCoord;
uniform int vArraysSize;
uniform vec3 vChangeColor;
uniform vec3 vChangeColorB;
uniform vec3 vChangeColorC;

float debugFloatA;
float debugFloatB;

void main()
{
    // Demo effect: invert any pixel whose R channel matches a target value
    // (e.g. turning black areas white)
    vec4 vCameraColor = texture2D(uTextureSampler, vTextureCoord);
    gl_FragColor = vec4(vCameraColor.r, vCameraColor.g, vCameraColor.b, 1.0);
    for(int i = 0; i < vArraysSize; ++i){
        debugFloatA = vCameraColor.r * 255.0 - 1.0;
        debugFloatB = vCameraColor.r * 255.0 + 1.0;
        if( debugFloatA <= vChangeColor[i] ){
            if( vChangeColor[i] <= debugFloatB ){
                gl_FragColor = vec4(1.0 - vCameraColor.r, 1.0 - vCameraColor.g, 1.0 - vCameraColor.b, 1.0);
            }
        }
    }

    // Alternative: negative (color-inversion) effect -- swap in for the loop above
    // gl_FragColor = vec4(1.0 - vCameraColor.r, 1.0 - vCameraColor.g, 1.0 - vCameraColor.b, 1.0);
}
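The R-channel matching in the loop above works in 0-255 space with a ±1 tolerance around each target value. The same bounds check can be written as a plain-Java reference sketch for off-device testing (ChannelFilterDemo is hypothetical; the target values are passed as a float array):

```java
public class ChannelFilterDemo {
    // Returns true when r*255 is within +/-1 of any target value,
    // mirroring the debugFloatA/debugFloatB bounds in the fragment shader.
    static boolean matches(float r01, float[] targets255) {
        float lo = r01 * 255.0f - 1.0f;
        float hi = r01 * 255.0f + 1.0f;
        for (float t : targets255) {
            if (lo <= t && t <= hi) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        float[] targets = {0.0f};                   // match near-black on R
        System.out.println(matches(0.0f, targets)); // prints true  -> pixel inverted
        System.out.println(matches(0.5f, targets)); // prints false -> pixel kept
    }
}
```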

Opening the camera

    /**
     * Open the camera
     *
     * @return boolean
     */
    @RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
    public boolean openCamera() {
        CameraManager cameraManager = (CameraManager) mActivity.getSystemService(Context.CAMERA_SERVICE);
        try {
            if (ActivityCompat.checkSelfPermission(mActivity, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
                return false;
            }
            // Acquire the lock before opening; it is released in the state callback
            if (!mCameraOpenCloseLock.tryAcquire(2500, TimeUnit.MILLISECONDS)) {
                throw new RuntimeException("Time out waiting to lock camera opening.");
            }
            cameraManager.openCamera(mCameraId, mStateCallback, mBackgroundHandler);
        } catch (CameraAccessException | InterruptedException e) {
            e.printStackTrace();
            return false;
        }
        return true;
    }

Camera state change callback

    /**
     * Called when the CameraDevice changes state
     */
    private CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {
        @RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
        @Override
        public void onOpened(@NonNull CameraDevice camera) {
            mCameraDevice = camera;
            // This method is called when the camera is opened. We start the preview here.
            mCameraOpenCloseLock.release();
//            createCameraPreviewSession();
        }
        @Override
        public void onDisconnected(@NonNull CameraDevice camera) {
            mCameraOpenCloseLock.release();
            camera.close();
            mCameraDevice = null;
        }
        @Override
        public void onError(@NonNull CameraDevice camera, int error) {
            mCameraOpenCloseLock.release();
            camera.close();
            mCameraDevice = null;
            mActivity.finish();
        }
    };

Capturing a still picture

    /**
     * Capture a still picture. This method should be called when we get a response
     */
    @RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
    private void captureStillPicture() {
        try {
            if (null == mActivity || null == mCameraDevice) {
                return;
            }
            final CaptureRequest.Builder captureBuilder =
                    mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
            captureBuilder.addTarget(mImageReader.getSurface());

            // Use the same AE and AF modes as the preview.
            captureBuilder.set(CaptureRequest.CONTROL_AF_MODE,
                    CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
            // Use flash if it is supported
            if (mFlashSupported) {
                captureBuilder.set(CaptureRequest.CONTROL_AE_MODE,
                        CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
            }
            int rotation = mActivity.getWindowManager().getDefaultDisplay().getRotation();
            captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, getOrientation(rotation));

            CameraCaptureSession.CaptureCallback capturecallback
                    = new CameraCaptureSession.CaptureCallback() {

                @Override
                public void onCaptureCompleted(@NonNull CameraCaptureSession session,
                                               @NonNull CaptureRequest request,
                                               @NonNull TotalCaptureResult result) {
                    Log.e(TAG, "--------------------:" + mFile.toString());
                    unlockFocus();
                }
            };
            mCaptureSession.stopRepeating();
            mCaptureSession.abortCaptures();
            mCaptureSession.capture(captureBuilder.build(), capturecallback, mBackgroundHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
            Log.e(TAG, "capture err: " + e.getMessage());
        }
    }

    /**
     * Unlock the focus. Called at the end of the still-image capture sequence
     */
    @RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
    private void unlockFocus() {
        try {
            mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER,
                    CameraMetadata.CONTROL_AF_TRIGGER_CANCEL);
            if (mFlashSupported) {
                mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE,
                        CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
            }
            mCaptureSession.capture(mPreviewRequestBuilder.build(), mCaptureCallback,
                    mBackgroundHandler);
            // Return the camera to its normal preview state
            mState = STATE_PREVIEW;
            mCaptureSession.setRepeatingRequest(mPreviewRequest, mCaptureCallback,
                    mBackgroundHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

OpenGL ES then draws the OES texture data to the screen, and we can preview the effect produced by the fragment shader. Note, however, that only the preview is modified; the photo captured through ImageReader is still the original image. My current workaround is a screenshot approach: read the pixel data from the preview stream, loop over all pixels, and redraw and save them as an image.
This happens in the Renderer's onDrawFrame() method:

        /*
         * Take a screenshot if the flag is set
         * Reference: [http://hounychang.github.io/2015/05/13/%E5%AF%B9GLSurfaceView%E6%88%AA%E5%9B%BE/]
         */
        if (CameraV2GLSurfaceView.shouldTakePic) {
            CameraV2GLSurfaceView.shouldTakePic = false;
//            bindfbo();
            int w = surfaceWidth;
            int h = surfaceHeight;
            int b[] = new int[w * h];
            int bt[] = new int[w * h];
            IntBuffer buffer = IntBuffer.wrap(b);
            buffer.position(0);
            GLES20.glReadPixels(0, 0, w, h, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, buffer);
            for (int i = 0; i < h; i++) {
                for (int j = 0; j < w; j++) {
                    int pix = b[i * w + j];
                    int pb = (pix >> 16) & 0xff;
                    int pr = (pix << 16) & 0x00ff0000;
                    int pix1 = (pix & 0xff00ff00) | pr | pb;
                    bt[(h - i - 1) * w + j] = pix1;
                }
            }
            // Build the bitmap directly from the swizzled, vertically flipped pixel array
            Bitmap inBitmap = Bitmap.createBitmap(bt, w, h, Bitmap.Config.ARGB_8888);
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            inBitmap.compress(Bitmap.CompressFormat.PNG, 90, bos);
            byte[] bitmapData = bos.toByteArray();
            ByteArrayInputStream fis = new ByteArrayInputStream(bitmapData);
            File mFile = new File(mContext.getExternalFilesDir(null), "pic1.png");
            try {
                FileOutputStream fos = new FileOutputStream(mFile);
                byte[] buf = new byte[1024];
                int len;
                while ((len = fis.read(buf)) > 0) {
                    fos.write(buf, 0, len);
                }
                fis.close();
                fos.close();

            } catch (IOException e) {
                e.printStackTrace();
            } finally {
                // rotation angle
//                int rotate = BitmapRotating.readPictureDegree(mFile.getPath());
//                BitmapRotating.rotaingImageView(rotate,inBitmap);
                inBitmap.recycle();
//                unbindfbo();
            }
        }
        long t2 = System.currentTimeMillis();
        long t = t2 - t1;
        Log.i(TAG, "onDrawFrame: time: " + t);
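The pixel loop above converts glReadPixels' RGBA byte order into the ARGB int layout that Bitmap.createBitmap expects, and flips the image vertically because OpenGL's origin is bottom-left while Bitmap's is top-left. The bit manipulation can be isolated and checked off-device (PixelConvertDemo is a hypothetical sketch of the same logic):

```java
public class PixelConvertDemo {
    // glReadPixels(GL_RGBA, GL_UNSIGNED_BYTE) read into an int buffer on a
    // little-endian device yields ints laid out as ABGR; Bitmap wants ARGB.
    // Swap the R and B bytes, keeping A and G in place.
    static int abgrToArgb(int pix) {
        int b = (pix >> 16) & 0xff;          // B byte moves down to bits 0-7
        int r = (pix << 16) & 0x00ff0000;    // R byte moves up to bits 16-23
        return (pix & 0xff00ff00) | r | b;   // A and G stay where they are
    }

    // OpenGL rows run bottom-up, Bitmap rows run top-down: reverse the rows.
    static int[] flipVertically(int[] src, int w, int h) {
        int[] dst = new int[w * h];
        for (int i = 0; i < h; i++) {
            for (int j = 0; j < w; j++) {
                dst[(h - i - 1) * w + j] = abgrToArgb(src[i * w + j]);
            }
        }
        return dst;
    }

    public static void main(String[] args) {
        // ABGR pixel A=FF B=11 G=22 R=33 becomes ARGB 0xFF332211
        System.out.printf("%08X%n", abgrToArgb(0xFF112233)); // prints FF332211
    }
}
```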

That concludes this walkthrough of last month's simple demo project. If you find any mistakes, please point them out; I'm still a beginner and learning as I go....

最后編輯于
?著作權(quán)歸作者所有,轉(zhuǎn)載或內(nèi)容合作請(qǐng)聯(lián)系作者
【社區(qū)內(nèi)容提示】社區(qū)部分內(nèi)容疑似由AI輔助生成,瀏覽時(shí)請(qǐng)結(jié)合常識(shí)與多方信息審慎甄別。
平臺(tái)聲明:文章內(nèi)容(如有圖片或視頻亦包括在內(nèi))由作者上傳并發(fā)布,文章內(nèi)容僅代表作者本人觀點(diǎn),簡(jiǎn)書系信息發(fā)布平臺(tái),僅提供信息存儲(chǔ)服務(wù)。

相關(guān)閱讀更多精彩內(nèi)容

友情鏈接更多精彩內(nèi)容