OpenCV on Android Development (4): Fixing the Portrait Preview Orientation, Continued

The previous post used the approach from the official OpenCV samples to work around the auto-rotation problem in portrait preview, but it required locking the app to landscape orientation, which would feel very strange in a real camera app. Being a perfectionist, I wanted normal preview and detection in portrait mode. After two days of digging through code on foreign sites, I finally found a workable solution. It is not perfect, though; a few small bugs remain.

Links to the solutions:
https://github.com/opencv/opencv/issues/4704
https://stackoverflow.com/questions/16669779/opencv-camera-orientation-issue
Others have run into the same problem (http://answers.opencv.org/question/20325/how-can-i-change-orientation-without-ruin-camera-settings/), but no working fix had been posted there.

In my tests, two approaches performed reasonably well.

方法一

This method requires modifying the OpenCV library sources.

First, replace the entire deliverAndDrawFrame(CvCameraViewFrame frame) method in CameraBridgeViewBase.java with:

protected void deliverAndDrawFrame(CvCameraViewFrame frame) {
        Mat modified;

        if (mListener != null) {
            modified = mListener.onCameraFrame(frame);
        } else {
            modified = frame.rgba();
        }

        boolean bmpValid = true;
        if (modified != null) {
            try {
                Utils.matToBitmap(modified, mCacheBitmap);
            } catch(Exception e) {
                Log.e(TAG, "Mat type: " + modified);
                Log.e(TAG, "Bitmap type: " + mCacheBitmap.getWidth() + "*" + mCacheBitmap.getHeight());
                Log.e(TAG, "Utils.matToBitmap() throws an exception: " + e.getMessage());
                bmpValid = false;
            }
        }

        if (mFpsMeter != null) {
            mFpsMeter.measure();
        }
    }
Next, replace the initializeCamera(int width, int height) method in JavaCameraView.java and add two helper methods, private void setDisplayOrientation(Camera camera, int angle) and private String getOrientation(). Full-screen portrait display is achieved through mCamera.setPreviewDisplay(getHolder());.

protected boolean initializeCamera(int width, int height) {
        Log.d(TAG, "Initialize java camera");
        boolean result = true;
        synchronized (this) {
            mCamera = null;

            if (mCameraIndex == CAMERA_ID_ANY) {
                Log.d(TAG, "Trying to open camera with old open()");
                try {
                    mCamera = Camera.open();
                }
                catch (Exception e){
                    Log.e(TAG, "Camera is not available (in use or does not exist): " + e.getLocalizedMessage());
                }

                if(mCamera == null && Build.VERSION.SDK_INT >= Build.VERSION_CODES.GINGERBREAD) {
                    boolean connected = false;
                    for (int camIdx = 0; camIdx < Camera.getNumberOfCameras(); ++camIdx) {
                        Log.d(TAG, "Trying to open camera with new open(" + Integer.valueOf(camIdx) + ")");
                        try {
                            mCamera = Camera.open(camIdx);
                            connected = true;
                        } catch (RuntimeException e) {
                            Log.e(TAG, "Camera #" + camIdx + " failed to open: " + e.getLocalizedMessage());
                        }
                        if (connected) break;
                    }
                }
            } else {
                if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.GINGERBREAD) {
                    int localCameraIndex = mCameraIndex;
                    if (mCameraIndex == CAMERA_ID_BACK) {
                        Log.i(TAG, "Trying to open back camera");
                        Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
                        for (int camIdx = 0; camIdx < Camera.getNumberOfCameras(); ++camIdx) {
                            Camera.getCameraInfo( camIdx, cameraInfo );
                            if (cameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_BACK) {
                                localCameraIndex = camIdx;
                                break;
                            }
                        }
                    } else if (mCameraIndex == CAMERA_ID_FRONT) {
                        Log.i(TAG, "Trying to open front camera");
                        Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
                        for (int camIdx = 0; camIdx < Camera.getNumberOfCameras(); ++camIdx) {
                            Camera.getCameraInfo( camIdx, cameraInfo );
                            if (cameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
                                localCameraIndex = camIdx;
                                break;
                            }
                        }
                    }
                    if (localCameraIndex == CAMERA_ID_BACK) {
                        Log.e(TAG, "Back camera not found!");
                    } else if (localCameraIndex == CAMERA_ID_FRONT) {
                        Log.e(TAG, "Front camera not found!");
                    } else {
                        Log.d(TAG, "Trying to open camera with new open(" + Integer.valueOf(localCameraIndex) + ")");
                        try {
                            mCamera = Camera.open(localCameraIndex);
                        } catch (RuntimeException e) {
                            Log.e(TAG, "Camera #" + localCameraIndex + " failed to open: " + e.getLocalizedMessage());
                        }
                    }
                }
            }

            if (mCamera == null)
                return false;

            /* Now set camera parameters */
            try {
                Camera.Parameters params = mCamera.getParameters();
                Log.d(TAG, "getSupportedPreviewSizes()");
                List<android.hardware.Camera.Size> sizes = params.getSupportedPreviewSizes();

                if (sizes != null) {
                    /* Image format NV21 causes issues in the Android emulators */
                    if (Build.FINGERPRINT.startsWith("generic")
                            || Build.FINGERPRINT.startsWith("unknown")
                            || Build.MODEL.contains("google_sdk")
                            || Build.MODEL.contains("Emulator")
                            || Build.MODEL.contains("Android SDK built for x86")
                            || Build.MANUFACTURER.contains("Genymotion")
                            || (Build.BRAND.startsWith("generic") && Build.DEVICE.startsWith("generic"))
                            || "google_sdk".equals(Build.PRODUCT))
                        params.setPreviewFormat(ImageFormat.YV12);  // "generic" or "android" = android emulator
                    else
                        params.setPreviewFormat(ImageFormat.NV21);

                    mPreviewFormat = params.getPreviewFormat();
                    // changes relative to the stock OpenCV code start here
                    if (!Build.MODEL.equals("GT-I9100")) params.setRecordingHint(true);
                    // note: the preview size is hard-coded here; on other devices
                    // pick one from params.getSupportedPreviewSizes()
                    params.setPreviewSize(1920, 1080);
                    mCamera.setParameters(params);

                    mFrameWidth = 1920;
                    mFrameHeight = 1080;

                    if (mFpsMeter != null) {
                        mFpsMeter.setResolution(mFrameWidth, mFrameHeight);
                    }

                    int size = mFrameWidth * mFrameHeight;
                    size  = size * ImageFormat.getBitsPerPixel(params.getPreviewFormat()) / 8;
                    mBuffer = new byte[size];

                    mCamera.addCallbackBuffer(mBuffer);
                    mCamera.setPreviewCallbackWithBuffer(this);

                    mFrameChain = new Mat[2];
                    mFrameChain[0] = new Mat(mFrameHeight + (mFrameHeight/2), mFrameWidth, CvType.CV_8UC1);
                    mFrameChain[1] = new Mat(mFrameHeight + (mFrameHeight/2), mFrameWidth, CvType.CV_8UC1);

                    AllocateCache();

                    mCameraFrame = new JavaCameraFrame[2];
                    mCameraFrame[0] = new JavaCameraFrame(mFrameChain[0], mFrameWidth, mFrameHeight);
                    mCameraFrame[1] = new JavaCameraFrame(mFrameChain[1], mFrameWidth, mFrameHeight);
                    
                    // different from the stock code

                    mSurfaceTexture = new SurfaceTexture(MAGIC_TEXTURE_ID);
                    mCamera.setPreviewTexture(mSurfaceTexture);
                    
                    // main modification

                    if (getOrientation().equals("portrait")) {
                        setDisplayOrientation(mCamera, 90);
                    } else if (getOrientation().equals("reverse landscape")){
                        setDisplayOrientation(mCamera, 180);
                    } else if (getOrientation().equals("reverse portrait")) {
                        setDisplayOrientation(mCamera, 270);
                    }
                    mCamera.setPreviewDisplay(getHolder());
                    
                    //end

                    mCamera.startPreview();
                }
                else
                    result = false;
            } catch (Exception e) {
                result = false;
                e.printStackTrace();
            }
        }

        return result;
    }
    // the two helper methods added
     
    private void setDisplayOrientation(Camera camera, int angle){
        Method downPolymorphic;
        try {
            downPolymorphic = camera.getClass().getMethod("setDisplayOrientation", int.class);
            if (downPolymorphic != null) {
                downPolymorphic.invoke(camera, angle);
            }
        }
        catch (Exception e) {
            e.printStackTrace();
        }
    }

    private String getOrientation() {
        int orientation = Surface.ROTATION_0;

        WindowManager wm = (WindowManager) getContext().getSystemService(Context.WINDOW_SERVICE);
        if (wm != null) {
            Display display = wm.getDefaultDisplay();
            // Display.getOrientation() is deprecated; getRotation() returns the same values
            orientation = display.getOrientation();
        }

        if (orientation == Surface.ROTATION_0) {
            return "portrait";
        } else if (orientation == Surface.ROTATION_90) {
            return "landscape";
        } else if (orientation == Surface.ROTATION_180) {
            return "reverse portrait";
        } else {
            return "reverse landscape";
        }
    }
    //end
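As an aside, Camera.setDisplayOrientation() has been a public method since API level 8, so the reflection in setDisplayOrientation() above is only needed for very old devices. The Android Camera documentation also recommends deriving the angle from the sensor mounting orientation (Camera.CameraInfo.orientation) together with the display rotation, rather than hard-coding 90/180/270. Here is a sketch of that formula, reduced to pure arithmetic so it can be checked off-device (the class and method names are mine):

```java
public class CameraOrientation {
    /**
     * Display-orientation formula from the Android Camera docs, as pure math.
     * sensorOrientation: CameraInfo.orientation (0, 90, 180 or 270).
     * displayDegrees: display rotation in degrees (ROTATION_0 -> 0, ROTATION_90 -> 90, ...).
     * Returns the angle to pass to Camera.setDisplayOrientation().
     */
    public static int displayOrientation(int sensorOrientation, int displayDegrees,
                                         boolean frontFacing) {
        if (frontFacing) {
            int result = (sensorOrientation + displayDegrees) % 360;
            return (360 - result) % 360; // compensate for the front camera's mirroring
        } else {
            return (sensorOrientation - displayDegrees + 360) % 360;
        }
    }
}
```

For a typical rear sensor mounted at 90° and a portrait display (0°) this yields 90, matching the hard-coded value above, while also covering devices whose sensors are mounted differently.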

This method renders perfectly, but it has one unacceptable drawback:
our onCameraFrame() no longer takes effect. Because we call mCamera.setPreviewDisplay(getHolder());, the camera draws straight to the surface and OpenCV's per-frame processing is never displayed. How to fix that, I do not know yet; if anyone has a solution, please share it.

Method 2

This method only requires modifying one OpenCV library file, CameraBridgeViewBase.java. The preview looks quite good, but problems remain: in my tests the image is zoomed in when the device is in landscape, while portrait is close to perfect. The fps drops noticeably, though it stays within an acceptable range.

The modification is as follows:
protected void deliverAndDrawFrame(CvCameraViewFrame frame) {
        Mat modified;

        if (mListener != null) {
            modified = mListener.onCameraFrame(frame);
        } else {
            modified = frame.rgba();
        }

        boolean bmpValid = true;
        if (modified != null) {
            try {
                Utils.matToBitmap(modified, mCacheBitmap);
            } catch(Exception e) {
                Log.e(TAG, "Mat type: " + modified);
                Log.e(TAG, "Bitmap type: " + mCacheBitmap.getWidth() + "*" + mCacheBitmap.getHeight());
                Log.e(TAG, "Utils.matToBitmap() throws an exception: " + e.getMessage());
                bmpValid = false;
            }
        }

        if (bmpValid && mCacheBitmap != null) {
            Canvas canvas = getHolder().lockCanvas();
            if (canvas != null) {
                canvas.drawColor(0, android.graphics.PorterDuff.Mode.CLEAR);
/*
                // original drawing code
                if (BuildConfig.DEBUG)
                    Log.d(TAG, "mStretch value: " + mScale);

                if (mScale != 0) {
                    canvas.drawBitmap(mCacheBitmap, new Rect(0,0,mCacheBitmap.getWidth(), mCacheBitmap.getHeight()),
                         new Rect((int)((canvas.getWidth() - mScale*mCacheBitmap.getWidth()) / 2),
                         (int)((canvas.getHeight() - mScale*mCacheBitmap.getHeight()) / 2),
                         (int)((canvas.getWidth() - mScale*mCacheBitmap.getWidth()) / 2 + mScale*mCacheBitmap.getWidth()),
                         (int)((canvas.getHeight() - mScale*mCacheBitmap.getHeight()) / 2 + mScale*mCacheBitmap.getHeight())), null);
                } else {
                     canvas.drawBitmap(mCacheBitmap, new Rect(0,0,mCacheBitmap.getWidth(), mCacheBitmap.getHeight()),
                         new Rect((canvas.getWidth() - mCacheBitmap.getWidth()) / 2,
                         (canvas.getHeight() - mCacheBitmap.getHeight()) / 2,
                         (canvas.getWidth() - mCacheBitmap.getWidth()) / 2 + mCacheBitmap.getWidth(),
                         (canvas.getHeight() - mCacheBitmap.getHeight()) / 2 + mCacheBitmap.getHeight()), null);
                }
*/


    
                // rotate the cached bitmap with minimal processing
                Matrix matrix = new Matrix();

                if (getDisplay().getRotation() == Surface.ROTATION_0) {
                    matrix.preTranslate((canvas.getWidth() - mCacheBitmap.getWidth()) / 2,(canvas.getHeight() - mCacheBitmap.getHeight()) / 2);
                    matrix.postRotate(90f,(canvas.getWidth()) / 2,(canvas.getHeight()) / 2);
                    float scale = (float) canvas.getWidth() / (float) mCacheBitmap.getHeight();
                    matrix.postScale(scale, scale, canvas.getWidth()/2 , canvas.getHeight()/2 );
                    canvas.drawBitmap(mCacheBitmap, matrix, new Paint());
                } else if (getDisplay().getRotation() == Surface.ROTATION_90) {
                    float scale = (float) canvas.getWidth() / (float) mCacheBitmap.getHeight();
                    matrix.postScale(scale, scale, canvas.getWidth()/2 , canvas.getHeight()/2 );
                    canvas.drawBitmap(mCacheBitmap, matrix, new Paint());
                } else if (getDisplay().getRotation() == Surface.ROTATION_180) {
                    matrix.preTranslate((canvas.getWidth() - mCacheBitmap.getWidth()) / 2,(canvas.getHeight() - mCacheBitmap.getHeight()) / 2);
                    matrix.postRotate(270f,(canvas.getWidth()) / 2,(canvas.getHeight()) / 2);
                    float scale = (float) canvas.getWidth() / (float) mCacheBitmap.getHeight();
                    matrix.postScale(scale, scale, canvas.getWidth()/2 , canvas.getHeight()/2 );
                    canvas.drawBitmap(mCacheBitmap, matrix, new Paint());
                } else if (getDisplay().getRotation() == Surface.ROTATION_270) {
                    matrix.postRotate(180f,(canvas.getWidth()) / 2,(canvas.getHeight()) / 2);
                    float scale = (float) canvas.getWidth() / (float) mCacheBitmap.getHeight();
                    matrix.postScale(scale, scale, canvas.getWidth()/2 , canvas.getHeight()/2 );
                    canvas.drawBitmap(mCacheBitmap, matrix, new Paint());
                }


                if (mFpsMeter != null) {
                    mFpsMeter.measure();
                    mFpsMeter.draw(canvas, 20, 30);
                }
                getHolder().unlockCanvasAndPost(canvas);
            }
        }
    }

The core change is actually small. In deliverAndDrawFrame(CvCameraViewFrame frame), after the block

 if (bmpValid && mCacheBitmap != null) {
            Canvas canvas = getHolder().lockCanvas();
            if (canvas != null) {
                canvas.drawColor(0, android.graphics.PorterDuff.Mode.CLEAR);

replace the original drawing code that follows with:

Matrix matrix = new Matrix(); // rotate it with minimal processing
matrix.preTranslate((canvas.getWidth() - mCacheBitmap.getWidth()) / 2,(canvas.getHeight() - mCacheBitmap.getHeight()) / 2);
matrix.postRotate(90f,(canvas.getWidth()) / 2,(canvas.getHeight()) / 2);
float scale = (float) canvas.getWidth() / (float) mCacheBitmap.getHeight();
matrix.postScale(scale, scale, canvas.getWidth()/2 , canvas.getHeight()/2 );
canvas.drawBitmap(mCacheBitmap, matrix, new Paint());

and that is it.
Being a perfectionist, I added the branch shown earlier so that both portrait and landscape display correctly, and that is exactly how I found this method's bug: in landscape the image is drawn zoomed in.
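To see where the portrait scale factor comes from, here is a worked example with numbers of my own choosing: on a 1080×1920 portrait canvas with a 1920×1080 camera frame, the 90° rotation leaves the frame's 1080-pixel height lying along the canvas's 1080-pixel width, so scale = canvasWidth / bitmapHeight = 1080 / 1080 = 1 and the frame exactly fills the screen width; on a 1440-pixel-wide screen the same formula gives 1440 / 1080 ≈ 1.33. As a tiny self-contained helper (the class and method names are mine):

```java
public class PortraitScale {
    // Scale that makes a 90°-rotated camera frame fill the canvas width:
    // exactly the expression used in the snippet above.
    public static float fit(int canvasWidth, int bitmapHeight) {
        return (float) canvasWidth / (float) bitmapHeight;
    }
}
```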

It seems I still need to dig further to see whether the landscape zoom bug can be fixed.

However, if the activity is locked to portrait in AndroidManifest.xml:

<activity android:name=".MainActivity"
            android:screenOrientation="portrait">

the landscape bug never appears. The fps drop of this method is still quite noticeable, but the display quality stays within an acceptable range.
Screenshots of method 2 (images not included here).
Even so, face detection only works well when the phone is held in landscape; in portrait faces are still detected, but very unstably.

All in all, I decided to go with method 2 as the basis for my next step. Time is running short.

Update

Being a perfectionist, I thought of a way to fix portrait face detection while falling asleep. Since OpenCV detects best on landscape-oriented frames, I can first rotate the portrait frame 90° clockwise so that it looks like a landscape frame, run detection and draw the green boxes on it, then rotate the annotated frame 90° counterclockwise before output. That gives working face detection in portrait.
So I modified the onCameraViewStarted() and onCameraFrame() methods in MainActivity.java as follows:

@Override
    public void onCameraViewStarted(int width, int height){
        rgbaImage = new Mat(width, height, CvType.CV_8UC4);
        grayscaleImage = new Mat(height, width, CvType.CV_8UC4);
        Matlin = new Mat(width, height, CvType.CV_8UC4);
        gMatlin = new Mat(width, height, CvType.CV_8UC4);

        absoluteFaceSize = (int)(height * 0.2);

    }

    @Override
    public void onCameraViewStopped(){

    }

    @RequiresApi(api = Build.VERSION_CODES.JELLY_BEAN_MR1)
    @Override
    public Mat onCameraFrame(CameraBridgeViewBase.CvCameraViewFrame InputFrame) {       
        grayscaleImage = InputFrame.gray();
        rgbaImage = InputFrame.rgba();
        int rotation = openCvCameraView.getDisplay().getRotation();

        // mirror front-camera frames so they appear upright
        if (camera_scene == CAMERA_FRONT) {
            Core.flip(rgbaImage, rgbaImage, 1);
            Core.flip(grayscaleImage, grayscaleImage, 1);
        }

        //MatOfRect faces = new MatOfRect();

        if (rotation == Surface.ROTATION_0) {
            MatOfRect faces = new MatOfRect();
            Core.rotate(grayscaleImage, gMatlin, Core.ROTATE_90_CLOCKWISE);
            Core.rotate(rgbaImage, Matlin, Core.ROTATE_90_CLOCKWISE);
            if (cascadeClassifier != null) {
                cascadeClassifier.detectMultiScale(gMatlin, faces, 1.1, 2, 2, new Size(absoluteFaceSize, absoluteFaceSize), new Size());
            }

            Rect[] faceArray = faces.toArray();
            for (int i = 0; i < faceArray.length; i++)

                Imgproc.rectangle(Matlin, faceArray[i].tl(), faceArray[i].br(), new Scalar(0, 255, 0, 255), 2);
            Core.rotate(Matlin, rgbaImage, Core.ROTATE_90_COUNTERCLOCKWISE);

        } else {
            MatOfRect faces = new MatOfRect();
            if (cascadeClassifier != null) {
                cascadeClassifier.detectMultiScale(grayscaleImage, faces, 1.1, 2, 2, new Size(absoluteFaceSize, absoluteFaceSize), new Size());
            }

            Rect[] faceArray = faces.toArray();
            for (int i = 0; i < faceArray.length; i++)

                Imgproc.rectangle(rgbaImage, faceArray[i].tl(), faceArray[i].br(), new Scalar(0, 255, 0, 255), 2);
        }

        return rgbaImage;
    }
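A possible optimization I have not tried yet (my own sketch, not part of the code above): the second Core.rotate() on the full-color frame could be avoided by detecting on the rotated grayscale image only and mapping the resulting rectangles back into the coordinates of the un-rotated portrait frame before drawing them. For a 90° clockwise rotation of an image of height H, a pixel at (x, y) in the rotated image came from (y, H - 1 - x) in the original, which gives this rectangle mapping:

```java
public class RectUnrotate {
    /**
     * Map a rectangle {x, y, w, h} detected in a 90°-clockwise-rotated frame
     * back to the coordinates of the original (un-rotated) frame of height origH.
     * Width and height swap, and the new y is measured down from what used to be
     * the bottom of the original frame.
     */
    public static int[] toOriginal(int[] r, int origH) {
        int x = r[0], y = r[1], w = r[2], h = r[3];
        return new int[] { y, origH - x - w, h, w };
    }
}
```

The face boxes could then be drawn directly on rgbaImage with Imgproc.rectangle(), skipping one full-frame rotation of the 4-channel image per frame.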

Finally, after these changes the results look like this:
Portrait, rear camera (screenshot not included)
Portrait, front camera (screenshot not included)
Detection works fairly well, although profile faces are not recognized. There is one remaining problem: face detection no longer works in landscape. I suspect this is because I locked the activity to portrait, so int rotation = openCvCameraView.getDisplay().getRotation() always returns Surface.ROTATION_0 and the other branches are never reached.
So instead of locking to portrait, I changed AndroidManifest.xml to:

<activity android:name=".MainActivity"
            android:screenOrientation="fullSensor">

and modified MainActivity.java to use the full if chain below. I also tried a switch statement here, but fps dropped sharply with very noticeable stutter.

MatOfRect faces = new MatOfRect();
if (rotation == Surface.ROTATION_0) {
            Core.rotate(grayscaleImage, gMatlin, Core.ROTATE_90_CLOCKWISE);
            Core.rotate(rgbaImage, Matlin, Core.ROTATE_90_CLOCKWISE);
            if (cascadeClassifier != null) {
                cascadeClassifier.detectMultiScale(gMatlin, faces, 1.1, 2, 2, new Size(absoluteFaceSize, absoluteFaceSize), new Size());
            }

            Rect[] faceArray = faces.toArray();
            for (int i = 0; i < faceArray.length; i++)

                Imgproc.rectangle(Matlin, faceArray[i].tl(), faceArray[i].br(), new Scalar(0, 255, 0, 255), 2);
            Core.rotate(Matlin, rgbaImage, Core.ROTATE_90_COUNTERCLOCKWISE);

        } else if (rotation == Surface.ROTATION_90) {
            if (cascadeClassifier != null) {
                cascadeClassifier.detectMultiScale(grayscaleImage, faces, 1.1, 2, 2, new Size(absoluteFaceSize, absoluteFaceSize), new Size());
            }

            Rect[] faceArray = faces.toArray();
            for (int i = 0; i < faceArray.length; i++)

                Imgproc.rectangle(rgbaImage, faceArray[i].tl(), faceArray[i].br(), new Scalar(0, 255, 0, 255), 2);
        } else if (rotation == Surface.ROTATION_180) {
            Core.rotate(grayscaleImage, gMatlin, Core.ROTATE_90_COUNTERCLOCKWISE);
            Core.rotate(rgbaImage, Matlin, Core.ROTATE_90_COUNTERCLOCKWISE);
            if (cascadeClassifier != null) {
                cascadeClassifier.detectMultiScale(gMatlin, faces, 1.1, 2, 2, new Size(absoluteFaceSize, absoluteFaceSize), new Size());
            }

            Rect[] faceArray = faces.toArray();
            for (int i = 0; i < faceArray.length; i++)

                Imgproc.rectangle(Matlin, faceArray[i].tl(), faceArray[i].br(), new Scalar(0, 255, 0, 255), 2);
            Core.rotate(Matlin, rgbaImage, Core.ROTATE_90_CLOCKWISE);
        } else if (rotation == Surface.ROTATION_270) {
            Core.rotate(grayscaleImage, gMatlin, Core.ROTATE_180);
            Core.rotate(rgbaImage, Matlin, Core.ROTATE_180);
            if (cascadeClassifier != null) {
                cascadeClassifier.detectMultiScale(gMatlin, faces, 1.1, 2, 2, new Size(absoluteFaceSize, absoluteFaceSize), new Size());
            }

            Rect[] faceArray = faces.toArray();
            for (int i = 0; i < faceArray.length; i++)

                Imgproc.rectangle(Matlin, faceArray[i].tl(), faceArray[i].br(), new Scalar(0, 255, 0, 255), 2);
            Core.rotate(Matlin, rgbaImage, Core.ROTATE_180);
        }
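The if chain above always pairs one "pre-rotation" before detection with its inverse before returning the frame. That pairing can be documented as two small lookup arrays indexed by the display rotation, which also sidesteps any per-frame switch (the constants below mirror org.opencv.core.Core.ROTATE_* and the array indices mirror android.view.Surface.ROTATION_0..ROTATION_270; -1 is my sentinel for "no rotation needed"):

```java
public class RotatePlan {
    // Constants mirroring org.opencv.core.Core.ROTATE_*; -1 means "no rotation".
    public static final int NONE = -1;
    public static final int ROTATE_90_CLOCKWISE = 0;
    public static final int ROTATE_180 = 1;
    public static final int ROTATE_90_COUNTERCLOCKWISE = 2;

    // Indexed by the Surface.ROTATION_* value (0..3): the rotation applied to the
    // frame before detection, and its inverse applied before returning the frame.
    public static final int[] PRE  = { ROTATE_90_CLOCKWISE, NONE, ROTATE_90_COUNTERCLOCKWISE, ROTATE_180 };
    public static final int[] POST = { ROTATE_90_COUNTERCLOCKWISE, NONE, ROTATE_90_CLOCKWISE, ROTATE_180 };
}
```

In onCameraFrame() one could then check PRE[rotation] once, call Core.rotate() only when it is not NONE, and undo with POST[rotation]; this is a sketch of the table, not a drop-in replacement for the code above.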

Then, in deliverAndDrawFrame() in CameraBridgeViewBase.java, after the block

if (bmpValid && mCacheBitmap != null) {
            Canvas canvas = getHolder().lockCanvas();
            if (canvas != null) {
                canvas.drawColor(0, android.graphics.PorterDuff.Mode.CLEAR);

后面的片段修改為

Matrix matrix = new Matrix(); // rotate it with minimal processing
                float portraitscale = (float) canvas.getWidth() / (float) mCacheBitmap.getHeight();
                float landscapscale = 1f;

                if (getDisplay().getRotation() == Surface.ROTATION_0) {
                    matrix.preTranslate((canvas.getWidth() - mCacheBitmap.getWidth()) / 2,(canvas.getHeight() - mCacheBitmap.getHeight()) / 2);
                    matrix.postRotate(90f,(canvas.getWidth()) / 2,(canvas.getHeight()) / 2);
                    //float scale = (float) canvas.getWidth() / (float) mCacheBitmap.getHeight();
                    //matrix.postScale(scale, scale, canvas.getWidth()/2 , canvas.getHeight()/2 );
                    matrix.postScale(portraitscale, portraitscale, canvas.getWidth()/2 , canvas.getHeight()/2 );
                    canvas.drawBitmap(mCacheBitmap, matrix, new Paint());
                } else if (getDisplay().getRotation() == Surface.ROTATION_90) {
                    matrix.preTranslate((canvas.getWidth() - mCacheBitmap.getWidth()) / 2,(canvas.getHeight() - mCacheBitmap.getHeight()) / 2);
                    //float scale = 1f;
                    //matrix.postScale(scale, scale, canvas.getWidth()/2 , canvas.getHeight()/2 );
                    matrix.postScale(landscapscale, landscapscale, canvas.getWidth()/2 , canvas.getHeight()/2 );
                    canvas.drawBitmap(mCacheBitmap, matrix, new Paint());
                } else if (getDisplay().getRotation() == Surface.ROTATION_180) {
                    matrix.preTranslate((canvas.getWidth() - mCacheBitmap.getWidth()) / 2,(canvas.getHeight() - mCacheBitmap.getHeight()) / 2);
                    matrix.postRotate(270f,(canvas.getWidth()) / 2,(canvas.getHeight()) / 2);
                    //float scale = (float) canvas.getWidth() / (float) mCacheBitmap.getHeight();
                    //matrix.postScale(scale, scale, canvas.getWidth()/2 , canvas.getHeight()/2 );
                    matrix.postScale(portraitscale, portraitscale, canvas.getWidth()/2 , canvas.getHeight()/2 );
                    canvas.drawBitmap(mCacheBitmap, matrix, new Paint());
                } else if (getDisplay().getRotation() == Surface.ROTATION_270) {
                    matrix.preTranslate((canvas.getWidth() - mCacheBitmap.getWidth()) / 2,(canvas.getHeight() - mCacheBitmap.getHeight()) / 2);
                    matrix.postRotate(180f,(canvas.getWidth()) / 2,(canvas.getHeight()) / 2);
                    //float scale = 1f;
                    //matrix.postScale(scale, scale, canvas.getWidth()/2 , canvas.getHeight()/2 );
                    matrix.postScale(landscapscale, landscapscale, canvas.getWidth()/2 , canvas.getHeight()/2 );
                    canvas.drawBitmap(mCacheBitmap, matrix, new Paint());
                }

Setting the landscape scale to 1 fixes the incorrect landscape zoom seen earlier. I also tried replacing this block with a switch statement, and again it stuttered badly; I am not sure why.
After the changes above, face detection works in both portrait and landscape, although reverse landscape (rotated 270°) stutters very noticeably.
For now that is acceptable, since the app will normally be locked to portrait anyway.
What remains is to improve detection accuracy and add profile-face and eye detection.
