Getting Started with CameraX in Java

1. An Introduction to CameraX

- CameraX integrates with Lifecycle, making it easy for developers to manage the camera's lifecycle. It is much simpler than Camera2.
- CameraX is implemented on top of the Camera2 API and is compatible with most devices on the market.
- Through extensions, developers can use the same features as the device's native camera app (e.g. portrait, night mode, filters, beauty effects).
- CameraX requires a minimum API level of 21 and Android Studio 3.6 or later.

2. Gradle Setup for CameraX

In the module-level build.gradle file:
android {
  ...
  compileOptions {
          sourceCompatibility JavaVersion.VERSION_1_8
          targetCompatibility JavaVersion.VERSION_1_8
   }
}
dependencies {
    def camerax_version = "1.0.0-beta07"
    // CameraX core library using camera2 implementation
    implementation "androidx.camera:camera-camera2:$camerax_version"
    // CameraX Lifecycle Library
    implementation "androidx.camera:camera-lifecycle:$camerax_version"
    // CameraX View class
    implementation "androidx.camera:camera-view:1.0.0-alpha14"
}

The camera requires permissions: CameraX needs the camera permission declared in the manifest and, on API 23+, requested dynamically at runtime.
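The article does not enumerate the exact permissions; at a minimum the camera permission must be declared in AndroidManifest.xml (and requested at runtime on API 23+, e.g. via `ActivityCompat.requestPermissions`). A minimal sketch:

```xml
<!-- Required for any CameraX use case -->
<uses-permission android:name="android.permission.CAMERA" />
<!-- Only needed on older API levels if saving outside app-private storage -->
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"
    android:maxSdkVersion="28" />
```

The sample below saves into `getExternalFilesDir(null)`, which is app-private storage, so the storage permission is typically not required there.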

3. Preview and Photo Capture with CameraX

<androidx.camera.view.CameraView
        android:id="@+id/view_finder"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        />
private CameraView mViewFinder;

@Override
protected void onCreate(Bundle savedInstanceState) {
    ...
    mViewFinder = findViewById(R.id.view_finder);
    mViewFinder.bindToLifecycle(this);
    mViewFinder.setCaptureMode(CameraView.CaptureMode.IMAGE);
    // Take a photo when the capture button is clicked
    // (mCaptureButton is whatever button triggers the capture in your layout)
    mCaptureButton.setOnClickListener(v -> takePicture());
}

@SuppressLint("RestrictedApi")
private void takePicture() {
    // Save into app-private external storage
    final File file = new File(getExternalFilesDir(null), "wt.jpg");
    Log.e("TAG", file.toString());
    ImageCapture.OutputFileOptions outputFileOptions =
            new ImageCapture.OutputFileOptions.Builder(file).build();
    mViewFinder.takePicture(outputFileOptions, ContextCompat.getMainExecutor(this),
            new ImageCapture.OnImageSavedCallback() {
                @Override
                public void onImageSaved(@NonNull ImageCapture.OutputFileResults outputFileResults) {
                    Uri savedUri = outputFileResults.getSavedUri();
                    if (savedUri == null) {
                        savedUri = Uri.fromFile(file);
                    }
                    onFileSaved(savedUri);
                }

                @Override
                public void onError(@NonNull ImageCaptureException exception) {
                    Log.e("TAG", "Image capture failed", exception);
                }
            });
}
    // Make the saved file visible to the media store
    private void onFileSaved(Uri savedUri) {
        if (Build.VERSION.SDK_INT < Build.VERSION_CODES.N) {
            sendBroadcast(new Intent(Camera.ACTION_NEW_PICTURE, savedUri));
        }
        String mimeTypeFromExtension = MimeTypeMap.getSingleton().getMimeTypeFromExtension(MimeTypeMap
                .getFileExtensionFromUrl(savedUri.getPath()));
        MediaScannerConnection.scanFile(getApplicationContext(),
                new String[]{new File(savedUri.getPath()).getAbsolutePath()},
                new String[]{mimeTypeFromExtension}, new MediaScannerConnection.OnScanCompletedListener() {
                    @Override
                    public void onScanCompleted(String path, Uri uri) {
                        Log.d("TAG", "Image capture scanned into media store: " + uri);
                    }
                });
    }
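`MimeTypeMap` is an Android framework class, but the extension-to-MIME lookup it performs is simple to reason about. A framework-free sketch (the mapping table here is an illustrative subset, not the platform's full database, and it assumes the last `.` in the path belongs to the file name):

```java
import java.util.HashMap;
import java.util.Locale;
import java.util.Map;

public class MimeTypes {
    // Illustrative subset of extension -> MIME mappings; Android's
    // MimeTypeMap covers far more types.
    private static final Map<String, String> MAP = new HashMap<>();
    static {
        MAP.put("jpg", "image/jpeg");
        MAP.put("jpeg", "image/jpeg");
        MAP.put("png", "image/png");
        MAP.put("webp", "image/webp");
    }

    /** Returns the MIME type for a file path, or null if the extension is unknown. */
    public static String fromPath(String path) {
        int dot = path.lastIndexOf('.');
        if (dot < 0 || dot == path.length() - 1) {
            return null;
        }
        return MAP.get(path.substring(dot + 1).toLowerCase(Locale.ROOT));
    }
}
```

In the activity code above, passing the resolved MIME type to `MediaScannerConnection.scanFile` helps the scanner index the file correctly.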

4. Processing the Preview Data

<androidx.camera.view.PreviewView
        android:id="@+id/viewFinder"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />
private PreviewView mViewFinder;

protected void onCreate(Bundle savedInstanceState) {
    ...
    mViewFinder = findViewById(R.id.viewFinder);
    startCamera();
}
private ListenableFuture<ProcessCameraProvider> cameraProviderFuture;
private void startCamera() {
        cameraProviderFuture = ProcessCameraProvider.getInstance(this);
        cameraProviderFuture.addListener(new Runnable() {
            @Override
            public void run() {
                try {
                    ProcessCameraProvider processCameraProvider = cameraProviderFuture.get();
                    // Preview use case: renders camera frames into the PreviewView
                    Preview preview = new Preview.Builder()
                            .build();
                    preview.setSurfaceProvider(mViewFinder.createSurfaceProvider());
                    // ImageCapture use case: used for taking photos
                    ImageCapture imageCapture = new ImageCapture.Builder()
                            .build();
                    // Keep the capture rotation in sync with the physical device orientation
                    OrientationEventListener orientationEventListener = new OrientationEventListener(PreActivity.this) {
                        @Override
                        public void onOrientationChanged(int orientation) {
                            int rotation;
                            if (orientation >= 45 && orientation < 135) {
                                rotation = Surface.ROTATION_270;
                            } else if (orientation >= 135 && orientation < 225) {
                                rotation = Surface.ROTATION_180;
                            } else if (orientation >= 225 && orientation < 315) {
                                rotation = Surface.ROTATION_90;
                            } else {
                                rotation = Surface.ROTATION_0;
                            }
                            imageCapture.setTargetRotation(rotation);
                        }
                    };
                    orientationEventListener.enable();
                    // ImageAnalysis use case: delivers frames to an Analyzer on a background thread
                    ImageAnalysis imageAnalysis = new ImageAnalysis.Builder()
                            .setTargetAspectRatio(AspectRatio.RATIO_16_9)
                            .build();
                    ExecutorService executorService = Executors.newSingleThreadExecutor();
                    imageAnalysis.setAnalyzer(executorService, new LuminosityAnalyzer());
                    // Unbind any previous use cases, then bind the new ones to the lifecycle
                    processCameraProvider.unbindAll();
                    processCameraProvider.bindToLifecycle(PreActivity.this, CameraSelector.DEFAULT_BACK_CAMERA, preview, imageCapture, imageAnalysis);
                } catch (ExecutionException e) {
                    e.printStackTrace();
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
            }
        }, ContextCompat.getMainExecutor(this));
    }
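The orientation-to-rotation mapping in the listener above is pure arithmetic, so it can be exercised outside Android. A framework-free sketch, assuming the numeric values of `Surface.ROTATION_0`..`ROTATION_270` (which are 0..3 on the platform):

```java
public class RotationMapper {
    // Numeric values match android.view.Surface:
    // ROTATION_0=0, ROTATION_90=1, ROTATION_180=2, ROTATION_270=3.
    public static final int ROTATION_0 = 0;
    public static final int ROTATION_90 = 1;
    public static final int ROTATION_180 = 2;
    public static final int ROTATION_270 = 3;

    /** Maps a device orientation in degrees (0-359) to the target Surface rotation. */
    public static int toSurfaceRotation(int orientation) {
        if (orientation >= 45 && orientation < 135) {
            return ROTATION_270; // device tilted 90 degrees clockwise
        } else if (orientation >= 135 && orientation < 225) {
            return ROTATION_180; // device upside down
        } else if (orientation >= 225 && orientation < 315) {
            return ROTATION_90;  // device tilted 90 degrees counter-clockwise
        }
        return ROTATION_0;       // natural orientation (315-359 and 0-44)
    }
}
```

Note that the surface rotation runs opposite to the device orientation: turning the device clockwise requires rotating the captured image counter-clockwise.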

private class LuminosityAnalyzer implements ImageAnalysis.Analyzer {
        @Override
        public void analyze(@NonNull ImageProxy image) {
            // Pack the YUV_420_888 planes into a single semi-planar byte array
            ImageProxy.PlaneProxy[] planes = image.getPlanes();

            byte[] dataFromImage = new byte[image.getWidth() * image.getHeight() * 3 / 2];
            // Copy the full Y plane first
            ByteBuffer yBuffer = planes[0].getBuffer();
            int yLen = image.getWidth() * image.getHeight();
            yBuffer.get(dataFromImage, 0, yLen);
            // Then interleave the U and V samples after it
            ByteBuffer uBuffer = planes[1].getBuffer();
            ByteBuffer vBuffer = planes[2].getBuffer();
            int pixelStride = planes[1].getPixelStride();
            // Use < (not <=) so the index stays inside the buffer bounds
            for (int i = 0; i < uBuffer.remaining(); i += pixelStride) {
                dataFromImage[yLen++] = uBuffer.get(i);
                dataFromImage[yLen++] = vBuffer.get(i);
            }
            // dataFromImage now holds the preview frame data
            image.close();
        }
    }
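The packing step in the analyzer is plain byte shuffling, so it can be tested with synthetic buffers. A minimal sketch, assuming a frame with pixel stride 1 and ignoring the row strides that real `YUV_420_888` frames may also have:

```java
import java.nio.ByteBuffer;

public class YuvPacker {
    /**
     * Copies the Y plane, then interleaves U/V samples after it (U first,
     * as in the analyzer above, which yields NV12-style ordering).
     */
    public static byte[] pack(ByteBuffer y, ByteBuffer u, ByteBuffer v,
                              int width, int height, int pixelStride) {
        byte[] out = new byte[width * height * 3 / 2];
        int yLen = width * height;
        y.get(out, 0, yLen);
        // Absolute gets: i must stay strictly below remaining() to avoid overrun
        for (int i = 0; i < u.remaining(); i += pixelStride) {
            out[yLen++] = u.get(i);
            out[yLen++] = v.get(i);
        }
        return out;
    }
}
```

For a 4x2 frame, the chroma planes hold 2 samples each, so the output is 8 luma bytes followed by 4 interleaved chroma bytes.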

? Copyright belongs to the author. For reprints or content cooperation, please contact the author.