Camera2: Implementing a Camera Preview That Follows the Gravity Sensor in All Four Orientations

Published: 2025-03-24



Requirement

Make the Camera2 camera preview display with the correct orientation in all four gravity-sensor directions.

Scenario

  • A customer requirement involved gravity sensing in all four orientations, but the existing project code only handled two, so the screen could not display correctly in all four directions.
  • Two preview orientations would be enough in principle, but in practice the customer's gravity-sensor module and board cannot always be physically assembled in the specified orientation; different vendors use different molds, so boards end up mounted in different directions. Given that reality, the software has to adapt to all four orientations, and two preview orientations no longer meet the requirement.
  • Real product examples: surveillance cameras, HDMI-IN, in-vehicle cameras…

Implementation

setAspectRatio: setting the display aspect ratio

Set the width and height of the display view:
textureView.setAspectRatio(width, height)


 private void setupCamera(int width, int height) {
        CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
        try {
            for (String cameraId : manager.getCameraIdList()) {
              ...............
                int orientation = getResources().getConfiguration().orientation;
                if (orientation == Configuration.ORIENTATION_LANDSCAPE) {
                    Log.d(TAG, "setupCamera: ORIENTATION_LANDSCAPE  " + mPreviewSize.getWidth() + " x " + mPreviewSize.getHeight());
                    textureView.setAspectRatio(1920, 1080);
                } else {
                    textureView.setAspectRatio(mPreviewSize.getHeight(), mPreviewSize.getWidth());
                    Log.d(TAG, "setupCamera:  ORIENTATION_PORT  " + mPreviewSize.getWidth() + " x " + mPreviewSize.getHeight());
                }

                break;
            }
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }
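
Note: setAspectRatio is not a method on the stock TextureView; it comes from a custom AutoFitTextureView (the same idea as in Google's Camera2Basic sample). That class isn't shown in this post, so here is a minimal sketch of what it is assumed to look like:

import android.content.Context;
import android.util.AttributeSet;
import android.view.TextureView;

// Minimal AutoFitTextureView sketch (assumed, not shown in the original post).
public class AutoFitTextureView extends TextureView {
    private int mRatioWidth = 0;
    private int mRatioHeight = 0;

    public AutoFitTextureView(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    // Remember the requested aspect ratio and ask for a re-layout.
    public void setAspectRatio(int width, int height) {
        mRatioWidth = width;
        mRatioHeight = height;
        requestLayout();
    }

    @Override
    protected void onMeasure(int widthMeasureSpec, int heightMeasureSpec) {
        super.onMeasure(widthMeasureSpec, heightMeasureSpec);
        int width = MeasureSpec.getSize(widthMeasureSpec);
        int height = MeasureSpec.getSize(heightMeasureSpec);
        if (mRatioWidth == 0 || mRatioHeight == 0) {
            // No ratio set yet: just fill whatever size we were given.
            setMeasuredDimension(width, height);
        } else if (width < height * mRatioWidth / mRatioHeight) {
            setMeasuredDimension(width, width * mRatioHeight / mRatioWidth);
        } else {
            setMeasuredDimension(height * mRatioWidth / mRatioHeight, height);
        }
    }
}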

Where it is called: when the view's SurfaceTexture becomes available, call setupCamera inside onSurfaceTextureAvailable:

 TextureView.SurfaceTextureListener textureListener = new TextureView.SurfaceTextureListener() {
        @Override
        public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
            setupCamera(width, height);
            configureTransform(width, height);
            openCamera();
        }


        @Override
        public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
            configureTransform(width, height);
        }

        @Override
        public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
            return false;
        }

        @Override
        public void onSurfaceTextureUpdated(SurfaceTexture surface) {

        }
    };
				

postScale / postRotate: scaling and rotation

For example, the output stream size here is fixed at 1920*1080 (set according to the actual product). In portrait, the view still has to show that output stream at its proper ratio, so the image must be scaled before the whole frame can be displayed.
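
To make that concrete, a minimal sketch with assumed numbers (a 1920×1080 stream shown on a portrait screen 1080 px wide; only the 1080/1920 ratio also appears in configureTransform below, the rest is illustrative):

// Assumed sizes, for illustration only.
float streamWidth  = 1920f;
float streamHeight = 1080f;
// Shrink the 1920-px-wide frame so it fits a 1080-px-wide portrait screen:
float portraitScale = streamHeight / streamWidth;   // 1080 / 1920 = 0.5625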

A quick refresher on Matrix:
Matrix operations fall into four kinds: translate, rotate, scale and skew. For each of them the Android API provides set, post and pre variants, and all of them except translate can take a pivot point.

set writes the Matrix directly: every set call replaces the whole matrix, which means that if you call setScale and then setTranslate on the same matrix, only the translate survives and the earlier scale is lost.

post is a post-multiplication: the current matrix is multiplied by the matrix described by the arguments. post calls can be chained to build up the whole transform. For example, to scale an image first and then translate it:
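
The original example snippet isn't included in the post; a minimal illustration (the ImageView and its matrix scaleType are assumptions, not from the original code):

Matrix matrix = new Matrix();
// Scale first...
matrix.postScale(0.5f, 0.5f);
// ...then shift; because post is a post-multiplication, the translate is applied after the scale.
matrix.postTranslate(100f, 100f);
imageView.setImageMatrix(matrix);   // assumes an ImageView with android:scaleType="matrix"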

By the same logic, the display-size setup above has to be followed by scaling and rotation logic. The output stream itself never changes, so the receiving side has to adapt how the stream is displayed to the actual situation, i.e. the gravity-sensor orientation and the physical scenario (portrait vs. landscape): scale and rotate.


Call chain: onSurfaceTextureAvailable -> configureTransform(width, height) -> Configuration config = getResources().getConfiguration() (from the display/gravity-sensor orientation, work out the scale factor, the rotation angle and the rotation direction).

The core code:
private void configureTransform(int viewWidth, int viewHeight) {
        if (null == textureView || null == mPreviewSize) {
            return;
        }
        int rotation = getWindowManager().getDefaultDisplay().getRotation();
        Configuration config =  getResources().getConfiguration();
        if (config.orientation == Configuration.ORIENTATION_PORTRAIT) {
            Log.d(TAG, "configureTransform: ORIENTATION_PORTRAIT");
            if(Surface.ROTATION_90 == rotation || Surface.ROTATION_270 == rotation){
              isRotation = true;
            }
        } else if (config.orientation == Configuration.ORIENTATION_LANDSCAPE) {
            Log.d(TAG, "configureTransform: ORIENTATION_LANDSCAPE");
            if(Surface.ROTATION_0 == rotation || Surface.ROTATION_180 == rotation){
              isRotation = true;
            }
        }
        Log.d(TAG, "configureTransform: isRotation = "+isRotation);
        if(isRotation){
            rotation = rotation + 1;
            if(rotation > 3){
               rotation = rotation - 3;
            }
        }
        Log.d(TAG," aaaaaaaaaaaa rotation:"+rotation);
        Matrix matrix = new Matrix();
        RectF viewRect = new RectF(0, 0, viewWidth, viewHeight);
        RectF bufferRect = new RectF(0, 0, mPreviewSize.getHeight(), mPreviewSize.getWidth());
        float centerX = viewRect.centerX();
        float centerY = viewRect.centerY();
        if (Surface.ROTATION_90 == rotation) {
            bufferRect.offset(centerX - bufferRect.centerX(), centerY - bufferRect.centerY());
            matrix.setRectToRect(viewRect, bufferRect, Matrix.ScaleToFit.FILL);
            float scale = Math.max(
                    (float) viewHeight / mPreviewSize.getHeight(),
                    (float) viewWidth / mPreviewSize.getWidth());
            matrix.postScale(scale, scale, centerX, centerY);
            matrix.postRotate(90 * (rotation - 2), centerX, centerY);
            Log.d(TAG," 00000000000000  ");
        }else  if ( Surface.ROTATION_270 == rotation) {
            bufferRect.offset(centerX - bufferRect.centerX(), centerY - bufferRect.centerY());
            matrix.setRectToRect(viewRect, bufferRect, Matrix.ScaleToFit.FILL);
            float scale = Math.max(
                    (float) viewHeight / mPreviewSize.getHeight(),
                    (float) viewWidth / mPreviewSize.getWidth());
            matrix.postScale(scale, scale, centerX, centerY);
            matrix.postRotate(270, centerX, centerY);
            Log.d(TAG," bbbbbbbbbbbbbbbb  ");
        }  else if (Surface.ROTATION_180 == rotation) {
            float scale = (float)1080/ 1920;
            Log.d(TAG, "configureTransform: portrait state  scale:"+scale+"    centerX:"+centerX+"   centerY:"+centerY);
            matrix.postScale(scale, scale, centerX, centerY);
            textureView.setRotation(270f);
            Log.d(TAG, "configureTransform:   "+"    centerX:"+centerX+"   centerY:"+centerY);
            Log.d(TAG," 111111111111111  ");
        }else if(Surface.ROTATION_0 == rotation){
            float scale = (float)1080/ 1920;

            Log.d(TAG, "configureTransform: portrait state  scale:"+scale+"    centerX:"+centerX+"   centerY:"+centerY);
            matrix.postScale(scale, scale, centerX, centerY);
            textureView.setRotation(270f);
            Log.d(TAG," 222222222222222222  ");

        }
        textureView.setTransform(matrix);
    }
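
For reference, the +1 bump applied when isRotation is true remaps the display rotation as follows (Surface.ROTATION_0..ROTATION_270 are the ints 0..3). A tiny stand-alone reproduction of just that arithmetic, not part of the app:

public class RotationBumpDemo {
    public static void main(String[] args) {
        // The cases that set isRotation: portrait + ROTATION_90/270, landscape + ROTATION_0/180.
        int[] cases = {1, 3, 0, 2};
        for (int rotation : cases) {
            int adjusted = rotation + 1;
            if (adjusted > 3) {
                adjusted = adjusted - 3;   // note: 3 (ROTATION_270) wraps to 1, not to 0
            }
            System.out.println(rotation + " -> " + adjusted);
        }
        // Output: 1 -> 2, 3 -> 1, 0 -> 1, 2 -> 3
        // i.e. these cases end up in the ROTATION_180, ROTATION_90, ROTATION_90 and ROTATION_270 branches.
    }
}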

manager.openCamera: opening the camera

Opening the camera is a fairly routine step. As described above, the display view is ready and its rotation and scaling have been configured, i.e. all of the view's parameters are set; the next step is to open the camera so that stream data can come in.
The basic code for opening the camera is as follows:


public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
    setupCamera(width, height);
    configureTransform(width, height);
    openCamera();
}


private void openCamera() {
        CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
        int i = 0;
        try {
            i = manager.getCameraIdList().length;
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
        if (getDevVideoList()) {
            i = i - 1;
        }
        if (i == 2) {
            mCameraId = "9";
        } else {
            mCameraId = "0";
        }
        try {
            if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
                Log.d(TAG," has no CAMERA  permission");
                return;
            }
            manager.openCamera(mCameraId, stateCallback, null);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

startPreview

In the camera-open flow, if the camera opens successfully, the onOpened callback is where the preview has to be set up. This is a camera-side (Camera2) operation; don't confuse it with the preview view, textureView.

This part uses the Camera2 API.

  private void startPreview() {
        setupImageReader();
        SurfaceTexture mSurfaceTexture = textureView.getSurfaceTexture();
        Log.d(TAG, "startPreview:  setDefaultBufferSize : width:"+mPreviewSize.getWidth()+"  x  "+mPreviewSize.getHeight()+"   buffer actually hardcoded to 1920 x 1080");
        mSurfaceTexture.setDefaultBufferSize(1920, 1080);
        mPreviewSurface = new Surface(mSurfaceTexture);
        try {
            getPreviewRequestBuilder();
            mCameraDevice.createCaptureSession(Arrays.asList(mPreviewSurface, mImageReader.getSurface()), new CameraCaptureSession.StateCallback() {
                @Override
                public void onConfigured(CameraCaptureSession session) {
                    mCaptureSession = session;
                    repeatPreview();
                }
                @Override
                public void onConfigureFailed(CameraCaptureSession session) {
                }
            }, null);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

getPreviewRequestBuilder: setting the preview request parameters:

private void getPreviewRequestBuilder() {
        try {
            mPreviewRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
        mPreviewRequestBuilder.addTarget(mPreviewSurface);
        MeteringRectangle[] meteringRectangles = mPreviewRequestBuilder.get(CaptureRequest.CONTROL_AF_REGIONS);
        if (meteringRectangles != null && meteringRectangles.length > 0) {
            Log.d(TAG, "PreviewRequestBuilder: AF_REGIONS=" + meteringRectangles[0].getRect().toString());
        }
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_MODE, CaptureRequest.CONTROL_MODE_AUTO);
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CaptureRequest.CONTROL_AF_TRIGGER_IDLE);
    }

createCaptureSession: preparing the preview

This simply follows the Camera2 API workflow: do the preview preparation first, which can be understood as parameter setup.
	  mCameraDevice.createCaptureSession(Arrays.asList(mPreviewSurface, mImageReader.getSurface()), new CameraCaptureSession.StateCallback() {
                @Override
                public void onConfigured(CameraCaptureSession session) {
                    mCaptureSession = session;
                    repeatPreview();
                }
                @Override
                public void onConfigureFailed(CameraCaptureSession session) {
                }
            }, null);

setRepeatingRequest: requesting the preview

This is just the request step: the request built above uses the preview template, so submitting it here is what starts the preview.

    private void repeatPreview() {
        mPreviewRequestBuilder.setTag(TAG_PREVIEW);
        mPreviewRequest = mPreviewRequestBuilder.build();

        try {
            mCaptureSession.setRepeatingRequest(mPreviewRequest, mPreviewCaptureCallback, null);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }
	
	
	

Summary

  • Basic Camera2 API usage
  • Adapting to four gravity-sensor orientations; note the essential operations on the display view: display size, scale, rotation, and the rotation pivot (x, y)
  • This is a notes-style post; the technique is useful in quite a few places, so this is just a brief summary

The full core code is below.

public class MainActivity extends Activity {

    private static final String TAG = "MainActivity";

    private static final String TAG_PREVIEW = "dddd";

    private static final SparseIntArray ORIENTATION = new SparseIntArray();

    static {
        ORIENTATION.append(Surface.ROTATION_0, 90);
        ORIENTATION.append(Surface.ROTATION_90, 0);
        ORIENTATION.append(Surface.ROTATION_180, 270);
        ORIENTATION.append(Surface.ROTATION_270, 180);
    }

    private String mCameraId;

    private Size mPreviewSize;

    private ImageReader mImageReader;

    private CameraDevice mCameraDevice;

    private CameraCaptureSession mCaptureSession;

    private CaptureRequest mPreviewRequest;

    private CaptureRequest.Builder mPreviewRequestBuilder;

    private AutoFitTextureView textureView;

    private Surface mPreviewSurface;
    private String hdmiPlug = "0";
    private boolean isRotation = false;

    Handler mHandler = new Handler();
    Runnable mRun = new Runnable() {
        @Override
        public void run() {
            Log.d("dddd", "mHandler =" + hdmiPlug);
            plugHandler.removeCallbacks(plugRun);
            sendHdmiInSpk("0");
            finish();
        }
    };

    Handler plugHandler = new Handler();
    Runnable plugRun = new Runnable() {
        @Override
        public void run() {
            Log.d("dddd", "plugHandler =" + hdmiPlug);
            getHdmiInPlug();
        }
    };

    TextureView.SurfaceTextureListener textureListener = new TextureView.SurfaceTextureListener() {
        @Override
        public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
            setupCamera(width, height);
            configureTransform(width, height);
            openCamera();
        }


        @Override
        public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
            configureTransform(width, height);
        }

        @Override
        public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
            return false;
        }

        @Override
        public void onSurfaceTextureUpdated(SurfaceTexture surface) {

        }
    };

    private final CameraDevice.StateCallback stateCallback = new CameraDevice.StateCallback() {
        @Override
        public void onOpened(CameraDevice camera) {
            mCameraDevice = camera;
            startPreview();
        }

        @Override
        public void onDisconnected(CameraDevice camera) {
            Log.i(TAG, "CameraDevice Disconnected");
            // Recommended Camera2 practice: release the device once it disconnects.
            camera.close();
            mCameraDevice = null;
        }

        @Override
        public void onError(CameraDevice camera, int error) {
            Log.e(TAG, "CameraDevice Error = " + error);
            if (error == 4) {
                Toast.makeText(MainActivity.this, MainActivity.this.getString(R.string.fise_error_title), Toast.LENGTH_SHORT).show();
                sendHdmiInPlug();
                plugHandler.removeCallbacks(plugRun);
                mHandler.postDelayed(mRun, 1000);
            }
        }
    };

    private CameraCaptureSession.CaptureCallback mPreviewCaptureCallback = new CameraCaptureSession.CaptureCallback() {
        @Override
        public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {

        }

        @Override
        public void onCaptureProgressed(CameraCaptureSession session, CaptureRequest request, CaptureResult partialResult) {

        }
    };


    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        int uiFlags = View.SYSTEM_UI_FLAG_LAYOUT_STABLE
                | View.SYSTEM_UI_FLAG_LAYOUT_HIDE_NAVIGATION
                | View.SYSTEM_UI_FLAG_LAYOUT_FULLSCREEN
                | View.SYSTEM_UI_FLAG_HIDE_NAVIGATION
                | View.SYSTEM_UI_FLAG_FULLSCREEN
                | View.SYSTEM_UI_FLAG_IMMERSIVE_STICKY;
        getWindow().getDecorView().setSystemUiVisibility(uiFlags);
        getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
        // Add flags to force the Activity into full screen
        getWindow().addFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN);
        getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN, WindowManager.LayoutParams.FLAG_FULLSCREEN);
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.HONEYCOMB) {
            getWindow().getDecorView().setSystemUiVisibility(View.SYSTEM_UI_FLAG_HIDE_NAVIGATION);
        }
    }

    @Override
    protected void onStart() {
        super.onStart();
        Permission.checkPermission(this);

    }

    private void setupCamera(int width, int height) {
        CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
        try {
            for (String cameraId : manager.getCameraIdList()) {
                CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
                if (characteristics.get(CameraCharacteristics.LENS_FACING) == CameraCharacteristics.LENS_FACING_FRONT)
                    continue;
                StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
                mPreviewSize = getOptimalSize(map.getOutputSizes(SurfaceTexture.class), width, height);
                Log.d(TAG, "setupCamera: requested view size  " + width + " x " + height);
                int orientation = getResources().getConfiguration().orientation;
                if (orientation == Configuration.ORIENTATION_LANDSCAPE) {
                    Log.d(TAG, "setupCamera: ORIENTATION_LANDSCAPE  " + mPreviewSize.getWidth() + " x " + mPreviewSize.getHeight());
                    textureView.setAspectRatio(1920, 1080);
                } else {
                    textureView.setAspectRatio(mPreviewSize.getHeight(), mPreviewSize.getWidth());
                    Log.d(TAG, "setupCamera:  ORIENTATION_PORT  " + mPreviewSize.getWidth() + " x " + mPreviewSize.getHeight());
                }

                break;
            }
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    private boolean getDevVideoList() {
        File file = new File("/dev/video3");
        if (file.exists()) {
            Log.d("huanghb", "video3 true");
            return true;
        }
        Log.d("huanghb", "video3 false");
        return false;
    }


    private void openCamera() {
        CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
        int i = 0;
        try {
            i = manager.getCameraIdList().length;
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
        if (getDevVideoList()) {
            i = i - 1;
        }
        if (i == 2) {
            mCameraId = "9";
        } else {
            mCameraId = "0";
        }
        try {
            if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
                Log.d(TAG," has no CAMERA  permission");
                return;
            }
            manager.openCamera(mCameraId, stateCallback, null);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    private void closeCamera() {
         Log.d(TAG, "closeCamera");
        if (null != mCaptureSession) {
            mCaptureSession.close();
            mCaptureSession = null;
        }
        if (null != mCameraDevice) {
            mCameraDevice.close();
            mCameraDevice = null;
        }
        if (null != mImageReader) {
            mImageReader.close();
            mImageReader = null;
        }
    }

    private void configureTransform(int viewWidth, int viewHeight) {
        if (null == textureView || null == mPreviewSize) {
            return;
        }
        int rotation = getWindowManager().getDefaultDisplay().getRotation();
        Configuration config =  getResources().getConfiguration();
        if (config.orientation == Configuration.ORIENTATION_PORTRAIT) {
            Log.d(TAG, "configureTransform: ORIENTATION_PORTRAIT");
            if(Surface.ROTATION_90 == rotation || Surface.ROTATION_270 == rotation){
              isRotation = true;
            }
        } else if (config.orientation == Configuration.ORIENTATION_LANDSCAPE) {
            Log.d(TAG, "configureTransform: ORIENTATION_LANDSCAPE");
            if(Surface.ROTATION_0 == rotation || Surface.ROTATION_180 == rotation){
              isRotation = true;
            }
        }
        Log.d(TAG, "configureTransform: isRotation = "+isRotation);
        if(isRotation){
            rotation = rotation + 1;
            if(rotation > 3){
               rotation = rotation - 3;
            }
        }
        Log.d(TAG," aaaaaaaaaaaa rotation:"+rotation);
        Matrix matrix = new Matrix();
        RectF viewRect = new RectF(0, 0, viewWidth, viewHeight);
        RectF bufferRect = new RectF(0, 0, mPreviewSize.getHeight(), mPreviewSize.getWidth());
        float centerX = viewRect.centerX();
        float centerY = viewRect.centerY();
        if (Surface.ROTATION_90 == rotation) {
            bufferRect.offset(centerX - bufferRect.centerX(), centerY - bufferRect.centerY());
            matrix.setRectToRect(viewRect, bufferRect, Matrix.ScaleToFit.FILL);
            float scale = Math.max(
                    (float) viewHeight / mPreviewSize.getHeight(),
                    (float) viewWidth / mPreviewSize.getWidth());
            matrix.postScale(scale, scale, centerX, centerY);
            matrix.postRotate(90 * (rotation - 2), centerX, centerY);
            Log.d(TAG," 00000000000000  ");
        }else  if ( Surface.ROTATION_270 == rotation) {
            bufferRect.offset(centerX - bufferRect.centerX(), centerY - bufferRect.centerY());
            matrix.setRectToRect(viewRect, bufferRect, Matrix.ScaleToFit.FILL);
            float scale = Math.max(
                    (float) viewHeight / mPreviewSize.getHeight(),
                    (float) viewWidth / mPreviewSize.getWidth());
            matrix.postScale(scale, scale, centerX, centerY);
            matrix.postRotate(270, centerX, centerY);
            Log.d(TAG," bbbbbbbbbbbbbbbb  ");
        }  else if (Surface.ROTATION_180 == rotation) {
            float scale = (float)1080/ 1920;
            Log.d(TAG, "configureTransform: portrait state  scale:"+scale+"    centerX:"+centerX+"   centerY:"+centerY);
            matrix.postScale(scale, scale, centerX, centerY);
            textureView.setRotation(270f);
            Log.d(TAG, "configureTransform:   "+"    centerX:"+centerX+"   centerY:"+centerY);
            Log.d(TAG," 111111111111111  ");
        }else if(Surface.ROTATION_0 == rotation){
            float scale = (float)1080/ 1920;

            Log.d(TAG, "configureTransform: portrait state  scale:"+scale+"    centerX:"+centerX+"   centerY:"+centerY);
            matrix.postScale(scale, scale, centerX, centerY);
            textureView.setRotation(270f);
            Log.d(TAG," 222222222222222222  ");

        }
        textureView.setTransform(matrix);
    }

    private void startPreview() {
        setupImageReader();
        SurfaceTexture mSurfaceTexture = textureView.getSurfaceTexture();
        Log.d(TAG, "startPreview:  setDefaultBufferSize : width:"+mPreviewSize.getWidth()+"  x  "+mPreviewSize.getHeight()+"   buffer actually hardcoded to 1920 x 1080");
        mSurfaceTexture.setDefaultBufferSize(1920, 1080);
        mPreviewSurface = new Surface(mSurfaceTexture);
        try {
            getPreviewRequestBuilder();
            mCameraDevice.createCaptureSession(Arrays.asList(mPreviewSurface, mImageReader.getSurface()), new CameraCaptureSession.StateCallback() {
                @Override
                public void onConfigured(CameraCaptureSession session) {
                    mCaptureSession = session;
                    repeatPreview();
                }
                @Override
                public void onConfigureFailed(CameraCaptureSession session) {
                }
            }, null);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    private void repeatPreview() {
        mPreviewRequestBuilder.setTag(TAG_PREVIEW);
        mPreviewRequest = mPreviewRequestBuilder.build();

        try {
            mCaptureSession.setRepeatingRequest(mPreviewRequest, mPreviewCaptureCallback, null);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }


    private void setupImageReader() {

        mImageReader = ImageReader.newInstance(mPreviewSize.getWidth(), mPreviewSize.getHeight(), ImageFormat.JPEG, 1);
        mImageReader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
            @Override
            public void onImageAvailable(ImageReader reader) {
                Log.i(TAG, "Image Available!");
                Image image = reader.acquireLatestImage();
                if (image == null) {
                    // acquireLatestImage can return null when no new frame is ready yet
                    return;
                }
                new Thread(new ImageSaver(image)).start();
            }
        }, null);
    }
    private Size getOptimalSize(Size[] sizeMap, int width, int height) {
        for(Size size:sizeMap){
            Log.d(TAG, ":  size "+size.getWidth()+"   "+size.getHeight());
        }

        List<Size> sizeList = new ArrayList<>();
        for (Size option : sizeMap) {
            if (width > height) {
                if (option.getWidth() > width && option.getHeight() > height) {
                    sizeList.add(option);
                }
            } else {
                if (option.getWidth() > height && option.getHeight() > width) {
                    sizeList.add(option);
                }
            }
        }
        if (sizeList.size() > 0) {
            return Collections.min(sizeList, new Comparator<Size>() {
                @Override
                public int compare(Size lhs, Size rhs) {
                    return Long.signum(lhs.getWidth() * lhs.getHeight() - rhs.getWidth() * rhs.getHeight());
                }
            });
        }
        Log.d(TAG, "getOptimalSize: final fallback sizeMap[0]: width:"+sizeMap[0].getWidth()+"   height:"+sizeMap[0].getHeight());
        return sizeMap[0];
    }

    private void getPreviewRequestBuilder() {
        try {
            mPreviewRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
        mPreviewRequestBuilder.addTarget(mPreviewSurface);
        MeteringRectangle[] meteringRectangles = mPreviewRequestBuilder.get(CaptureRequest.CONTROL_AF_REGIONS);
        if (meteringRectangles != null && meteringRectangles.length > 0) {
            Log.d(TAG, "PreviewRequestBuilder: AF_REGIONS=" + meteringRectangles[0].getRect().toString());
        }
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_MODE, CaptureRequest.CONTROL_MODE_AUTO);
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CaptureRequest.CONTROL_AF_TRIGGER_IDLE);
    }

    


    public static class ImageSaver implements Runnable {
        private Image mImage;
        public ImageSaver(Image image) {
            mImage = image;
        }
        @Override
        public void run() {
            ByteBuffer buffer = mImage.getPlanes()[0].getBuffer();
            byte[] data = new byte[buffer.remaining()];
            buffer.get(data);
            File imageFile = new File(Environment.getExternalStorageDirectory() + "/DCIM/myPicture.jpg");
            FileOutputStream fos = null;
            try {
                fos = new FileOutputStream(imageFile);
                fos.write(data, 0, data.length);
            } catch (IOException e) {
                e.printStackTrace();
            } finally {
                // Close the Image so the ImageReader (maxImages = 1) can deliver the next frame.
                mImage.close();
                if (fos != null) {
                    try {
                        fos.close();
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
            }
        }
    }
    
    @Override
		public boolean onKeyDown(int keyCode, KeyEvent event) {
			if (keyCode == KeyEvent.KEYCODE_BACK) {
				plugHandler.removeCallbacks(plugRun);
				sendHdmiInSpk("0");
				finish();
				return true;
			}else if (keyCode == KeyEvent.KEYCODE_HOME) {
				plugHandler.removeCallbacks(plugRun);
				sendHdmiInSpk("0");
				finish();
				return true;
			}
			return super.onKeyDown(keyCode, event);
		}
    
    @Override
    protected void onPause() {
        super.onPause();
        Log.d(TAG, "onPause");
        plugHandler.removeCallbacksAndMessages(null);  
        closeCamera();
        sendHdmiInSpk("0");
    }
    
    @Override
    protected void onDestroy() {
        super.onDestroy();
        sendHdmiInSpk("0");
        Log.d("huanghb","onDestroy=");
    }

    @Override
    protected void onResume() {
        super.onResume();
        Log.d(TAG, "onResume");
        if(Permission.isPermissionGranted(this)) {
            setContentView(R.layout.activity_main);
            sendHdmiInPlug();
            textureView = findViewById(R.id.textureView);
            textureView.setSurfaceTextureListener(textureListener);
            Log.d(TAG, "permission granted, running the normal flow...");
            plugHandler.removeCallbacksAndMessages(null);
            plugHandler.postDelayed(plugRun, 2000);
            Log.d(TAG, "onResume: ");
            sendHdmiInSpk("1");
        }else{
            Log.d(TAG, "permission not granted..");
        }
    }

    public void  sendHdmiInSpk(String spk) {
        Log.d(TAG, "sendHdmiInSpk: cmdParam:" + spk);
        Intent intent =new  Intent();
        intent.setAction("com.fise.hdmiin.spk");
        intent.putExtra("hdmiinSpk", spk);
        sendBroadcast(intent);
    }
    
    public void sendHdmiInPlug(){
        Intent intent = new Intent();
        intent.setAction("com.fise.hdmiin.plug");
        sendBroadcast(intent);
    }
    
    private void getHdmiInPlug(){

        hdmiPlug = Uitls.read("/sys/class/fise_hdmiin_state_irq/fise_hdmiin_state_irq_enable");
        Log.d(TAG,"getHdmiInPlug ="+hdmiPlug);
        if(hdmiPlug == null || hdmiPlug.toString().trim().equals("")){
            return;
        }
        if(hdmiPlug.equals("1")){
			       Toast.makeText(MainActivity.this, MainActivity.this.getString(R.string.fise_error_title), Toast.LENGTH_SHORT).show();
			       sendHdmiInPlug();
			       plugHandler.removeCallbacks(plugRun);
			       mHandler.postDelayed(mRun, 1000);
        }else{
          plugHandler.postDelayed(plugRun, 1000);
        }
    }


    @Override
    public void onRequestPermissionsResult(
            int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
        if (requestCode == Permission.REQUEST_CODE) {
            for (int grantResult : grantResults) {
                if (grantResult != PackageManager.PERMISSION_GRANTED) {
                    Log.e("Permission", "permission denied!");
                    Toast.makeText(MainActivity.this, " Please grant permission", Toast.LENGTH_LONG).show();
                    this.finish();
                    return;
                }
            }
        }
    }



}