Preface
I have recently been developing a camera app on Android. Since the system-provided camera application could not be used, I had to write my own. Android's original camera API was called Camera, but Google deprecated it at API level 21. Many of the tutorials online still use the old Camera API, so to learn camera2 properly I dug into the official sample that Google provides. A GitHub link where the full source can be found is given below.
Source Code Analysis
The example below provides three main features: camera preview, photo capture and photo saving. Concretely: on entering the app you see the camera preview and a button for taking a picture; once a picture is taken it is saved to external storage (see mFile in the source for the exact path).
Architecture
First, an overview of the overall structure of camera2:

As shown above, the whole of camera2 is managed centrally by a CameraManager, which is obtained by calling Context's getSystemService method. The manager then operates on the camera mainly through three classes, introduced below:
- CameraDevice: represents a camera device. An Android device may have several cameras, distinguished by their camera IDs. Its most important member is a state callback, which is invoked once an open-camera command succeeds.
- CameraCharacteristics: the concrete parameters of a particular camera device. This example mainly uses the output formats it reports (i.e., the formats of the output data).
- CameraCaptureSession: a camera capture session, through which you talk to the camera (preview, single still capture, video recording and so on). It has two callbacks: one for the capture state and one for the captured data (described in detail later).
The upper-left part of the figure shows the communication between the Android device and the camera device: the two exchange data through a pipeline. When a different operation is needed, a CaptureRequest is sent to the camera through the pipeline; the camera reacts accordingly and returns the resulting CameraMetadata to the Android device through the same pipeline.
Notes
This example needs quite a few permissions; see AndroidManifest.xml in the source.
The screen orientation of an Android device does not generally match the native orientation of the camera sensor, so an orientation conversion is needed. As a rule of thumb, the sensor orientation matches the screen when the device is held in landscape.
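To make this rule concrete, the swap decision can be sketched as a plain function (a minimal sketch using raw degree values instead of the Surface.ROTATION_* constants so it runs outside Android; the full version is the switch statement in setUpCameraOutputs() in the source below):

```java
// Minimal sketch of the width/height swap decision from setUpCameraOutputs(),
// written with raw degrees so it can run and be tested outside Android.
public class OrientationSwap {
    // displayRotationDegrees: how far the screen is rotated from its natural orientation
    // sensorOrientation: the camera sensor's mounting angle (90 on most phones)
    static boolean swapRequired(int displayRotationDegrees, int sensorOrientation) {
        switch (displayRotationDegrees) {
            case 0:
            case 180: // portrait on a typical phone
                return sensorOrientation == 90 || sensorOrientation == 270;
            case 90:
            case 270: // landscape on a typical phone
                return sensorOrientation == 0 || sensorOrientation == 180;
            default:
                return false;
        }
    }
    public static void main(String[] args) {
        // A portrait screen with the usual 90-degree sensor needs the swap;
        // in landscape the two are already aligned.
        System.out.println(swapRequired(0, 90));  // true
        System.out.println(swapRequired(90, 90)); // false
    }
}
```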
To avoid distortion (pictures being stretched or squashed), the aspect ratio of the preview, the aspect ratio of the picture and the aspect ratio of the camera's output format must all be kept consistent.
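The consistency check boils down to the same integer comparison that chooseOptimalSize() in the source below applies to each candidate size (a standalone sketch; the class and method names here are illustrative):

```java
// Standalone sketch of the aspect-ratio filter used in chooseOptimalSize():
// a candidate size is only acceptable when its ratio matches the target picture ratio.
public class AspectRatioCheck {
    // Same integer test as the sample: height == width * ratioH / ratioW
    static boolean matchesRatio(int width, int height, int ratioW, int ratioH) {
        return height == width * ratioH / ratioW;
    }
    public static void main(String[] args) {
        System.out.println(matchesRatio(1920, 1080, 16, 9)); // true  (16:9)
        System.out.println(matchesRatio(1280, 720, 16, 9));  // true  (16:9)
        System.out.println(matchesRatio(1440, 1080, 16, 9)); // false (4:3, not 16:9)
    }
}
```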
Code Flow
In this example an activity hosts a fragment, and all the code lives in the fragment. Several of the fragment's lifecycle methods are overridden:
- onCreateView: inflates the fragment's layout;
- onViewCreated: instantiates the layout's views;
- onActivityCreated: creates the jpg file on external storage that the captured photo will be written into;
- onResume: starts the camera thread and performs some logic checks;
- onPause: closes the camera and stops the camera thread;

In the normal case the overall flow is as shown in Figure 2. After the activity loads the fragment, the texture view used for the preview starts loading; once the view is ready, the onSurfaceTextureAvailable() callback fires, and inside it the camera is opened (openCamera() is called).
Inside openCamera(), the camera outputs are configured first (preview frames and still pictures get different handling), then the current display environment is examined to decide whether the data needs to be transformed, and finally the camera is opened through the CameraManager (by calling cameraManager.openCamera()).
The details are in the source below; in Figure 2, the middle part is the logic that determines the screen orientation, and the right part is the logic that automatically selects the most suitable display size.
Source
// The code is fairly long, so please read it patiently; some comments may be inaccurate, and corrections are welcome.
// Note: to fit my own use case the capture button has been removed from the code below, but the capture
// method is kept, so a picture can still be taken by calling it.
package com.eric_lai.weeding_robot.fragment;
/**
* Created by ERIC_LAI on 16/3/18.
*/
import android.app.Activity;
import android.app.AlertDialog;
import android.app.Dialog;
import android.app.DialogFragment;
import android.app.Fragment;
import android.content.Context;
import android.content.DialogInterface;
import android.content.res.Configuration;
import android.graphics.ImageFormat;
import android.graphics.Matrix;
import android.graphics.Point;
import android.graphics.RectF;
import android.graphics.SurfaceTexture;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CameraMetadata;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.CaptureResult;
import android.hardware.camera2.TotalCaptureResult;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.media.Image;
import android.media.ImageReader;
import android.os.Bundle;
import android.os.Handler;
import android.os.HandlerThread;
import android.support.annotation.NonNull;
import android.util.Log;
import android.util.Size;
import android.util.SparseIntArray;
import android.view.LayoutInflater;
import android.view.Surface;
import android.view.TextureView;
import android.view.View;
import android.view.ViewGroup;
import android.widget.Toast;
import com.eric_lai.weeding_robot.R;
import com.eric_lai.weeding_robot.view.AutoFitTextureView;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;
public class CameraFragment extends Fragment {
/**
* Conversion from screen rotation to JPEG orientation.
*/
private static final SparseIntArray ORIENTATIONS = new SparseIntArray();
private static final String FRAGMENT_DIALOG = "dialog";
static {
ORIENTATIONS.append(Surface.ROTATION_0, 90);
ORIENTATIONS.append(Surface.ROTATION_90, 0);
ORIENTATIONS.append(Surface.ROTATION_180, 270);
ORIENTATIONS.append(Surface.ROTATION_270, 180);
}
/**
* Tag for debug logging
*/
private static final String TAG = "CameraFragment";
/**
* Camera states:
* 0: showing the camera preview
* 1: waiting for the focus to lock (the preview is locked before a shot so the image stops changing)
* 2: waiting for the precapture sequence (focusing, exposure, etc.)
* 3: waiting for a state other than precapture (flash, etc.)
* 4: the picture has been taken
*/
private static final int STATE_PREVIEW = 0;
private static final int STATE_WAITING_LOCK = 1;
private static final int STATE_WAITING_PRECAPTURE = 2;
private static final int STATE_WAITING_NON_PRECAPTURE = 3;
private static final int STATE_PICTURE_TAKEN = 4;
/**
* Maximum preview width and height guaranteed by the Camera2 API
*/
private static final int MAX_PREVIEW_WIDTH = 1920;
private static final int MAX_PREVIEW_HEIGHT = 1080;
/**
* Listener for the SurfaceTexture used by the preview
*/
private final TextureView.SurfaceTextureListener mSurfaceTextureListener
= new TextureView.SurfaceTextureListener() {
@Override
public void onSurfaceTextureAvailable(SurfaceTexture texture, int width, int height) {
// the SurfaceTexture is ready, so open the camera
openCamera(width, height);
}
@Override
public void onSurfaceTextureSizeChanged(SurfaceTexture texture, int width, int height) {
// the preview orientation has changed, so reconfigure the transform
configureTransform(width, height);
}
@Override
public boolean onSurfaceTextureDestroyed(SurfaceTexture texture) {
return true;
}
@Override
public void onSurfaceTextureUpdated(SurfaceTexture texture) {
}
};
/**
* Id of the camera currently in use
*/
private String mCameraId;
/**
* Custom TextureView used for the preview
*/
private AutoFitTextureView mTextureView;
/**
* Capture session used for the preview
*/
private CameraCaptureSession mCaptureSession;
/**
* The camera currently in use
*/
private CameraDevice mCameraDevice;
/**
* Size of the preview data
*/
private Size mPreviewSize;
/**
* Callback invoked when the camera state changes
*/
private final CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {
@Override
public void onOpened(@NonNull CameraDevice cameraDevice) {
// When the camera opens:
// 1. release the open/close semaphore
// 2. point the in-use camera reference at the opened camera
// 3. create the camera preview session
mCameraOpenCloseLock.release();
mCameraDevice = cameraDevice;
createCameraPreviewSession();
}
@Override
public void onDisconnected(@NonNull CameraDevice cameraDevice) {
// When the camera is disconnected:
// 1. release the open/close semaphore
// 2. close the camera
// 3. null out the in-use camera reference
mCameraOpenCloseLock.release();
cameraDevice.close();
mCameraDevice = null;
}
@Override
public void onError(@NonNull CameraDevice cameraDevice, int error) {
// When a camera error occurs:
// 1. release the open/close semaphore
// 2. close the camera
// 3. null out the in-use camera reference
// 4. get the current activity and finish it
mCameraOpenCloseLock.release();
cameraDevice.close();
mCameraDevice = null;
Activity activity = getActivity();
if (null != activity) {
activity.finish();
}
}
};
/**
* Background thread that handles the capture work
*/
private HandlerThread mBackgroundThread;
/**
* Handler running on the background thread defined above
*/
private Handler mBackgroundHandler;
/**
* Handles still image capture (taking pictures)
*/
private ImageReader mImageReader;
/**
* File the output photo is written to
*/
private File mFile;
/**
* Callback for the ImageReader; onImageAvailable fires when a picture is ready to be saved
*/
private final ImageReader.OnImageAvailableListener mOnImageAvailableListener
= new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
mBackgroundHandler.post(new ImageSaver(reader.acquireNextImage(), mFile));
}
};
/**
* Builder for the preview request; builds the "preview request" (defined below) that is sent to the camera device through the pipeline
*/
private CaptureRequest.Builder mPreviewRequestBuilder;
/**
* The preview request, built by the builder above
*/
private CaptureRequest mPreviewRequest;
/**
* The current camera state; initialized to preview, since the preview should show as soon as the fragment loads
*/
private int mState = STATE_PREVIEW;
/**
* Semaphore guarding camera open/close, so the app cannot quit before the camera has been closed
* (quitting without closing would stop other apps from using the camera). While one piece of code
* holds the permit, any other code that needs it has to wait for the release.
*/
private Semaphore mCameraOpenCloseLock = new Semaphore(1);
/**
* Callback for the capture session
*/
private CameraCaptureSession.CaptureCallback mCaptureCallback
= new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureProgressed(@NonNull CameraCaptureSession session,
@NonNull CaptureRequest request,
@NonNull CaptureResult partialResult) {
process(partialResult);
}
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session,
@NonNull CaptureRequest request,
@NonNull TotalCaptureResult result) {
process(result);
}
// a helper that processes the capture results
private void process(CaptureResult result) {
switch (mState) {
case STATE_PREVIEW: {
// nothing to do while showing the preview
break;
}
case STATE_WAITING_LOCK: {
// waiting for the focus lock; CONTROL_AF_STATE may be null on some devices once locking completes
Integer afState = result.get(CaptureResult.CONTROL_AF_STATE);
if (afState == null) {
captureStillPicture();
} else if (CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED == afState ||
CaptureResult.CONTROL_AF_STATE_NOT_FOCUSED_LOCKED == afState) {
// the focus is locked (whether or not autofocus succeeded), so check the AE state;
// note that CONTROL_AE_STATE may also be null on some devices
Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
if (aeState == null ||
aeState == CaptureResult.CONTROL_AE_STATE_CONVERGED) {
// auto exposure (AE) is settled, so mark the state as picture-taken and capture the still
mState = STATE_PICTURE_TAKEN;
captureStillPicture();
} else {
// none of the above held, so run the precapture sequence
runPrecaptureSequence();
}
}
break;
}
case STATE_WAITING_PRECAPTURE: {
// waiting for precapture; CONTROL_AE_STATE may be null on some devices
Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
if (aeState == null ||
aeState == CaptureResult.CONTROL_AE_STATE_PRECAPTURE ||
aeState == CaptureResult.CONTROL_AE_STATE_FLASH_REQUIRED) {
// AE is either running precapture or needs the flash, so move on to waiting for the non-precapture state
mState = STATE_WAITING_NON_PRECAPTURE;
}
break;
}
case STATE_WAITING_NON_PRECAPTURE: {
// CONTROL_AE_STATE may be null on some devices
Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
if (aeState == null || aeState != CaptureResult.CONTROL_AE_STATE_PRECAPTURE) {
// AE has left the precapture stage, so mark the state as picture-taken and capture the still
mState = STATE_PICTURE_TAKEN;
captureStillPicture();
}
break;
}
}
}
};
/**
* Shows a Toast on the UI thread
*/
private void showToast(final String text) {
final Activity activity = getActivity();
if (activity != null) {
activity.runOnUiThread(new Runnable() {
@Override
public void run() {
Toast.makeText(activity, text, Toast.LENGTH_SHORT).show();
}
});
}
}
/**
* Returns the most suitable preview size.
*
* @param choices the sizes the camera supports for the intended output class
* @param textureViewWidth width of the texture view
* @param textureViewHeight height of the texture view
* @param maxWidth the maximum width that may be chosen
* @param maxHeight the maximum height that may be chosen
* @param aspectRatio the target aspect ratio (the pictureSize; the image is only undistorted when
*                    pictureSize and textureSize share the same ratio)
* @return the most suitable preview size
*/
private static Size chooseOptimalSize(Size[] choices, int textureViewWidth,
int textureViewHeight, int maxWidth, int maxHeight, Size aspectRatio) {
// sizes no larger than the limit and at least as large as the texture view
List<Size> bigEnough = new ArrayList<>();
// sizes within the limit but smaller than the texture view
List<Size> notBigEnough = new ArrayList<>();
int w = aspectRatio.getWidth();
int h = aspectRatio.getHeight();
for (Size option : choices) {
if (option.getWidth() <= maxWidth && option.getHeight() <= maxHeight &&
option.getHeight() == option.getWidth() * h / w) {
// option.getHeight() == option.getWidth() * h / w guarantees that the
// w / h ratio of pictureSize matches the w / h ratio of textureSize
if (option.getWidth() >= textureViewWidth &&
option.getHeight() >= textureViewHeight) {
bigEnough.add(option);
} else {
notBigEnough.add(option);
}
}
}
// 1. if bigEnough has entries, return the smallest of those large-enough sizes
// 2. otherwise, if notBigEnough has entries, return the largest of those too-small sizes
// 3. if both are empty, log an error and fall back to the first choice
if (bigEnough.size() > 0) {
return Collections.min(bigEnough, new CompareSizesByArea());
} else if (notBigEnough.size() > 0) {
return Collections.max(notBigEnough, new CompareSizesByArea());
} else {
Log.e(TAG, "Couldn't find any suitable preview size");
return choices[0];
}
}
public static CameraFragment newInstance() {
return new CameraFragment();
}
@Override
public View onCreateView(LayoutInflater inflater, ViewGroup container,
Bundle savedInstanceState) {
return inflater.inflate(R.layout.fragment_camera, container, false);
}
@Override
public void onViewCreated(final View view, Bundle savedInstanceState) {
mTextureView = (AutoFitTextureView) view.findViewById(R.id.texture);
}
@Override
public void onActivityCreated(Bundle savedInstanceState) {
super.onActivityCreated(savedInstanceState);
mFile = new File(getActivity().getExternalFilesDir(null), "pic.jpg");
}
@Override
public void onResume() {
super.onResume();
startBackgroundThread();
// When the screen is turned off and back on, the SurfaceTexture may already be available, in which
// case onSurfaceTextureAvailable will not fire; open the camera directly then, and otherwise wait
// for the SurfaceTexture-available callback.
if (mTextureView.isAvailable()) {
openCamera(mTextureView.getWidth(), mTextureView.getHeight());
} else {
mTextureView.setSurfaceTextureListener(mSurfaceTextureListener);
}
}
@Override
public void onPause() {
closeCamera();
stopBackgroundThread();
super.onPause();
}
/**
* Sets up the camera outputs, for both the preview and the still capture.
*
* The flow is:
* 1. pick the camera to use and set the still output to the highest quality
* 2. check whether the display orientation matches the sensor orientation and whether the image must be rotated
* 3. take the current display size and the camera's output sizes and choose the best preview size
*
* @param width preview width
* @param height preview height
*/
private void setUpCameraOutputs(int width, int height) {
// get the current activity
Activity activity = getActivity();
// get the CameraManager instance
CameraManager manager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);
try {
// iterate over every camera on the device running this app
for (String cameraId : manager.getCameraIdList()) {
CameraCharacteristics characteristics
= manager.getCameraCharacteristics(cameraId);
// skip the front camera, if any (this app does not use it)
Integer facing = characteristics.get(CameraCharacteristics.LENS_FACING);
if (facing != null && facing == CameraCharacteristics.LENS_FACING_FRONT) {
continue;
}
StreamConfigurationMap map = characteristics.get(
CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
if (map == null) {
continue;
}
// use the highest available picture quality
// (maxImages is the maximum number of images the ImageReader can give access to at once)
Size largest = Collections.max(
Arrays.asList(map.getOutputSizes(ImageFormat.JPEG)),
new CompareSizesByArea());
Log.d(TAG, "largest.width: " + largest.getWidth());
Log.d(TAG, "largest.height: " + largest.getHeight());
mImageReader = ImageReader.newInstance(largest.getWidth(), largest.getHeight(),
ImageFormat.JPEG, /*maxImages*/2);
mImageReader.setOnImageAvailableListener(
mOnImageAvailableListener, mBackgroundHandler);
// get the device's current rotation (landscape or portrait; for a device whose "natural"
// height exceeds its width, landscape is ROTATION_90 or ROTATION_270 and portrait is ROTATION_0 or ROTATION_180)
int displayRotation = activity.getWindowManager().getDefaultDisplay().getRotation();
// get the camera sensor orientation (0 when mounted upright in the "natural" orientation,
// increasing by 90 degrees clockwise). Note that this value is chosen by the device
// manufacturer; it is 90 on most devices, and the switch below handles the unusual ones.
int sensorOrientation =
characteristics.get(CameraCharacteristics.SENSOR_ORIENTATION);
boolean swappedDimensions = false;
Log.d(TAG, "displayRotation: " + displayRotation);
Log.d(TAG, "sensorOritentation: " + sensorOrientation);
switch (displayRotation) {
// ROTATION_0 and ROTATION_180 are both portrait and are handled the same way:
// when the display is portrait and the sensor orientation is 90 or 270, a swap is needed (set the flag)
case Surface.ROTATION_0:
case Surface.ROTATION_180:
if (sensorOrientation == 90 || sensorOrientation == 270) {
Log.d(TAG, "swappedDimensions set true !");
swappedDimensions = true;
}
break;
// ROTATION_90 and ROTATION_270 are both landscape and are handled the same way:
// when the display is landscape and the sensor orientation is 0 or 180, a swap is needed (set the flag)
case Surface.ROTATION_90:
case Surface.ROTATION_270:
if (sensorOrientation == 0 || sensorOrientation == 180) {
swappedDimensions = true;
}
break;
default:
Log.e(TAG, "Display rotation is invalid: " + displayRotation);
}
// store the current display size in a Point
Point displaySize = new Point();
activity.getWindowManager().getDefaultDisplay().getSize(displaySize);
// preview width before any rotation, taken from the incoming parameter
int rotatedPreviewWidth = width;
// preview height before any rotation, taken from the incoming parameter
int rotatedPreviewHeight = height;
// use the current display size as the maximum preview size (the size that can actually be
// shown, used for the calculation; the texture view may be smaller and has to be fitted)
int maxPreviewWidth = displaySize.x;
int maxPreviewHeight = displaySize.y;
// if the image must be rotated, swap the widths and the heights
if (swappedDimensions) {
rotatedPreviewWidth = height;
rotatedPreviewHeight = width;
maxPreviewWidth = displaySize.y;
maxPreviewHeight = displaySize.x;
}
// clamp sizes that are too large
if (maxPreviewWidth > MAX_PREVIEW_WIDTH) {
maxPreviewWidth = MAX_PREVIEW_WIDTH;
}
if (maxPreviewHeight > MAX_PREVIEW_HEIGHT) {
maxPreviewHeight = MAX_PREVIEW_HEIGHT;
}
// automatically work out the most suitable preview size
// (the first argument, map.getOutputSizes(SurfaceTexture.class), lists the sizes SurfaceTexture supports)
mPreviewSize = chooseOptimalSize(map.getOutputSizes(SurfaceTexture.class),
rotatedPreviewWidth, rotatedPreviewHeight, maxPreviewWidth,
maxPreviewHeight, largest);
// get the current screen orientation
int orientation = getResources().getConfiguration().orientation;
if (orientation == Configuration.ORIENTATION_LANDSCAPE) {
// landscape
mTextureView.setAspectRatio(
mPreviewSize.getWidth(), mPreviewSize.getHeight());
} else {
// not landscape (i.e. portrait)
mTextureView.setAspectRatio(
mPreviewSize.getHeight(), mPreviewSize.getWidth());
}
Log.d(TAG, "real preview width: " + rotatedPreviewWidth);
Log.d(TAG, "real preview height: " + rotatedPreviewHeight);
// Log.d(TAG, "max preview width: " + maxPreviewWidth);
// Log.d(TAG, "max preview height: " + maxPreviewHeight);
// the two logs below print the computed previewSize
Log.d(TAG, "mPreviewSize.getWidth: " + mPreviewSize.getWidth());
Log.d(TAG, "mPreviewSize.getHeight: " + mPreviewSize.getHeight());
mCameraId = cameraId;
return;
}
} catch (CameraAccessException e) {
e.printStackTrace();
} catch (NullPointerException e) {
// show the error in a dialog
ErrorDialog.newInstance(getString(R.string.camera_error))
.show(getChildFragmentManager(), FRAGMENT_DIALOG);
}
}
/**
* Opens the camera specified by cameraId
*/
private void openCamera(int width, int height) {
// set up the camera outputs
setUpCameraOutputs(width, height);
// configure the preview transform
configureTransform(width, height);
// get the current activity and the CameraManager instance
Activity activity = getActivity();
CameraManager manager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);
try {
// try to acquire the camera open/close permit; throw if it cannot be obtained within 2500 ms
if (!mCameraOpenCloseLock.tryAcquire(2500, TimeUnit.MILLISECONDS)) {
throw new RuntimeException("Time out waiting to lock camera opening.");
}
// open the camera; the arguments are the camera id, the camera state callback and the background handler
manager.openCamera(mCameraId, mStateCallback, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
} catch (InterruptedException e) {
throw new RuntimeException("Interrupted while trying to lock camera opening.", e);
}
}
/**
* Closes the camera currently in use
*/
private void closeCamera() {
try {
// acquire the camera open/close permit
mCameraOpenCloseLock.acquire();
// close the capture session
if (null != mCaptureSession) {
mCaptureSession.close();
mCaptureSession = null;
}
// close the current camera
if (null != mCameraDevice) {
mCameraDevice.close();
mCameraDevice = null;
}
// close the still image reader
if (null != mImageReader) {
mImageReader.close();
mImageReader = null;
}
} catch (InterruptedException e) {
throw new RuntimeException("Interrupted while trying to lock camera closing.", e);
} finally {
// release the camera open/close permit
mCameraOpenCloseLock.release();
}
}
/**
* Starts the background thread
*/
private void startBackgroundThread() {
mBackgroundThread = new HandlerThread("CameraBackground");
mBackgroundThread.start();
mBackgroundHandler = new Handler(mBackgroundThread.getLooper());
}
/**
* Stops the background thread
*/
private void stopBackgroundThread() {
mBackgroundThread.quitSafely();
try {
mBackgroundThread.join();
mBackgroundThread = null;
mBackgroundHandler = null;
} catch (InterruptedException e) {
e.printStackTrace();
}
}
/**
* Creates the camera preview session
*/
private void createCameraPreviewSession() {
try {
// get the texture instance
SurfaceTexture texture = mTextureView.getSurfaceTexture();
assert texture != null;
// set the buffer size to the chosen preview size
texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
// the output surface the preview is rendered into
Surface surface = new Surface(texture);
// build the preview request
mPreviewRequestBuilder
= mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
mPreviewRequestBuilder.addTarget(surface);
// create the capture session for the preview
mCameraDevice.createCaptureSession(Arrays.asList(surface, mImageReader.getSurface()),
new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
// the camera is already closed; just return
if (null == mCameraDevice) {
return;
}
// the session is ready; store it in the field
mCaptureSession = cameraCaptureSession;
try {
// continuous auto focus
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE,
CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
// automatic flash
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE,
CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
// build the request defined above
mPreviewRequest = mPreviewRequestBuilder.build();
// repeat the request above endlessly so the preview keeps showing
mCaptureSession.setRepeatingRequest(mPreviewRequest,
mCaptureCallback, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
@Override
public void onConfigureFailed(
@NonNull CameraCaptureSession cameraCaptureSession) {
showToast("Failed");
}
}, null
);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
/**
* Configures the transform applied to the preview data when the screen orientation changes.
*
* @param viewWidth width of mTextureView
* @param viewHeight height of mTextureView
*/
private void configureTransform(int viewWidth, int viewHeight) {
Activity activity = getActivity();
if (null == mTextureView || null == mPreviewSize || null == activity) {
return;
}
int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
Matrix matrix = new Matrix();
RectF viewRect = new RectF(0, 0, viewWidth, viewHeight);
RectF bufferRect = new RectF(0, 0, mPreviewSize.getHeight(), mPreviewSize.getWidth());
float centerX = viewRect.centerX();
float centerY = viewRect.centerY();
if (Surface.ROTATION_90 == rotation || Surface.ROTATION_270 == rotation) {
bufferRect.offset(centerX - bufferRect.centerX(), centerY - bufferRect.centerY());
matrix.setRectToRect(viewRect, bufferRect, Matrix.ScaleToFit.FILL);
float scale = Math.max(
(float) viewHeight / mPreviewSize.getHeight(),
(float) viewWidth / mPreviewSize.getWidth());
matrix.postScale(scale, scale, centerX, centerY);
matrix.postRotate(90 * (rotation - 2), centerX, centerY);
} else if (Surface.ROTATION_180 == rotation) {
matrix.postRotate(180, centerX, centerY);
}
mTextureView.setTransform(matrix);
}
/**
* Takes a picture
*/
private void takePicture() {
lockFocus();
}
/**
* Locks the focus (the first step of taking a picture)
*/
private void lockFocus() {
try {
// build an autofocus-trigger request
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER,
CameraMetadata.CONTROL_AF_TRIGGER_START);
// tell mCaptureCallback which state to wait for
mState = STATE_WAITING_LOCK;
// submit a single-image capture request to the camera
mCaptureSession.capture(mPreviewRequestBuilder.build(), mCaptureCallback,
mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
/**
* Runs the precapture sequence
*/
private void runPrecaptureSequence() {
try {
// build a precapture-trigger request
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER,
CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER_START);
// tell mCaptureCallback which state to wait for
mState = STATE_WAITING_PRECAPTURE;
// submit a single-image capture request to the camera
mCaptureSession.capture(mPreviewRequestBuilder.build(), mCaptureCallback,
mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
/**
* Captures a still picture
*/
private void captureStillPicture() {
try {
final Activity activity = getActivity();
if (null == activity || null == mCameraDevice) {
return;
}
// This is the CaptureRequest.Builder that we use to take a picture.
final CaptureRequest.Builder captureBuilder =
mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(mImageReader.getSurface());
// Use the same AE and AF modes as the preview.
captureBuilder.set(CaptureRequest.CONTROL_AF_MODE,
CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
captureBuilder.set(CaptureRequest.CONTROL_AE_MODE,
CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
// Orientation
int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));
CameraCaptureSession.CaptureCallback CaptureCallback
= new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session,
@NonNull CaptureRequest request,
@NonNull TotalCaptureResult result) {
showToast("Saved: " + mFile);
Log.d(TAG, mFile.toString());
unlockFocus();
}
};
mCaptureSession.stopRepeating();
mCaptureSession.capture(captureBuilder.build(), CaptureCallback, null);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
/**
* Unlocks the focus
*/
private void unlockFocus() {
try {
// build a request that cancels the autofocus trigger
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER,
CameraMetadata.CONTROL_AF_TRIGGER_CANCEL);
// restore automatic flash (the capture may have forced the flash on or off; set it back to auto here)
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE,
CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
// submit the request built above
mCaptureSession.capture(mPreviewRequestBuilder.build(), mCaptureCallback,
mBackgroundHandler);
// after the shot, go back to the preview state and resume the repeating preview request
mState = STATE_PREVIEW;
mCaptureSession.setRepeatingRequest(mPreviewRequest, mCaptureCallback,
mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
/**
* Saves a JPEG image into the given file; runs on the background thread
*/
private static class ImageSaver implements Runnable {
/**
* The JPEG image
*/
private final Image mImage;
/**
* The file to save into
*/
private final File mFile;
public ImageSaver(Image image, File file) {
mImage = image;
mFile = file;
}
@Override
public void run() {
ByteBuffer buffer = mImage.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.remaining()];
buffer.get(bytes);
FileOutputStream output = null;
try {
output = new FileOutputStream(mFile);
output.write(bytes);
} catch (IOException e) {
e.printStackTrace();
} finally {
mImage.close();
if (null != output) {
try {
output.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
}
}
/**
* Compares two Sizes based on their areas
*/
static class CompareSizesByArea implements Comparator<Size> {
@Override
public int compare(Size lhs, Size rhs) {
// We cast here to ensure the multiplications won't overflow
return Long.signum((long) lhs.getWidth() * lhs.getHeight() -
(long) rhs.getWidth() * rhs.getHeight());
}
}
/**
* Dialog that shows an error message
*/
public static class ErrorDialog extends DialogFragment {
private static final String ARG_MESSAGE = "message";
public static ErrorDialog newInstance(String message) {
ErrorDialog dialog = new ErrorDialog();
Bundle args = new Bundle();
args.putString(ARG_MESSAGE, message);
dialog.setArguments(args);
return dialog;
}
@Override
public Dialog onCreateDialog(Bundle savedInstanceState) {
final Activity activity = getActivity();
return new AlertDialog.Builder(activity)
.setMessage(getArguments().getString(ARG_MESSAGE))
.setPositiveButton(android.R.string.ok, new DialogInterface.OnClickListener() {
@Override
public void onClick(DialogInterface dialogInterface, int i) {
activity.finish();
}
})
.create();
}
}
}