Prologue
Picking up right where the previous article's openCamera walkthrough left off, we now follow the stream-configuration step: createCaptureSession. Don't blink, here we go!
Opening
As before, we start from the app layer. First, a quick recap of the previous chapter's code plus what this chapter adds. On to the code!
// Open the camera
private void openCamera() {
    if (mCameraId == null || mCameraManager == null) {
        return;
    }
    try {
        // Check the camera permission first
        if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
                != PackageManager.PERMISSION_GRANTED) {
            return;
        }
        // Open the camera
        mCameraManager.openCamera(mCameraId, mStateCallback, mBackgroundHandler);
    } catch (CameraAccessException e) {
        e.printStackTrace();
        runOnUiThread(() -> Toast.makeText(this, "Failed to open camera", Toast.LENGTH_SHORT).show());
    }
}

// Camera device state callback
private final CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {
    @Override
    public void onOpened(@NonNull CameraDevice camera) {
        // Camera opened successfully: keep the CameraDevice and create the preview session
        mCameraDevice = camera;
        createCaptureSession();
    }

    @Override
    public void onDisconnected(@NonNull CameraDevice camera) {
        camera.close();
        mCameraDevice = null;
    }

    @Override
    public void onError(@NonNull CameraDevice camera, int error) {
        camera.close();
        mCameraDevice = null;
        runOnUiThread(() -> Toast.makeText(Camera2PreviewActivity.this,
                "Camera open failed: " + error, Toast.LENGTH_SHORT).show());
    }
};

// Create the capture session
private void createCaptureSession() {
    if (mCameraDevice == null || !mTextureView.isAvailable()) {
        return;
    }
    try {
        // Get the TextureView's SurfaceTexture and wrap it in a Surface
        SurfaceTexture surfaceTexture = mTextureView.getSurfaceTexture();
        surfaceTexture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
        Surface surface = new Surface(surfaceTexture);
        // Build the preview request
        mPreviewRequestBuilder = mCameraDevice.createCaptureRequest(
                CameraDevice.TEMPLATE_PREVIEW);
        // Add the Surface as the preview target
        mPreviewRequestBuilder.addTarget(surface);
        // Prepare the output surface list (preview surface only)
        List<Surface> outputSurfaces = new ArrayList<>();
        outputSurfaces.add(surface);
        // Create the camera capture session
        mCameraDevice.createCaptureSession(outputSurfaces,
                new CameraCaptureSession.StateCallback() {
                    @Override
                    public void onConfigured(@NonNull CameraCaptureSession session) {
                        // Session configured successfully
                        mCaptureSession = session;
                        // Start the preview
                        startPreviewRequest();
                    }

                    @Override
                    public void onConfigureFailed(@NonNull CameraCaptureSession session) {
                        runOnUiThread(() -> Toast.makeText(Camera2PreviewActivity.this,
                                "Preview configuration failed", Toast.LENGTH_SHORT).show());
                    }
                }, mBackgroundHandler);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
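The snippet above reads mPreviewSize without showing where it comes from. A common helper picks the supported size that best matches the view's aspect ratio; the sketch below is not from this article's project (the Size class stands in for android.util.Size, and chooseOptimalSize is a hypothetical name), just one plausible way to do it:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;

// Minimal stand-in for android.util.Size so the sketch stays self-contained.
class Size {
    final int width, height;
    Size(int w, int h) { width = w; height = h; }
}

public class PreviewSizeChooser {
    // Pick the smallest supported size that is at least as large as the view
    // and has the same aspect ratio; fall back to the largest size otherwise.
    static Size chooseOptimalSize(List<Size> choices, int viewWidth, int viewHeight) {
        List<Size> bigEnough = new ArrayList<>();
        for (Size s : choices) {
            boolean sameRatio = s.width * viewHeight == s.height * viewWidth;
            if (sameRatio && s.width >= viewWidth && s.height >= viewHeight) {
                bigEnough.add(s);
            }
        }
        Comparator<Size> byArea = Comparator.comparingLong(s -> (long) s.width * s.height);
        if (!bigEnough.isEmpty()) {
            return Collections.min(bigEnough, byArea);
        }
        return Collections.max(choices, byArea);
    }
}
```

The chosen size is then handed to setDefaultBufferSize so the SurfaceTexture buffers match what the camera produces.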
Earlier, openCamera took an important parameter, mStateCallback, which listens for the result of opening the camera. On success, onOpened() is where the capture session gets created; on failure, onError() is where we surface an error message to the user.
Now the protagonist of this article takes the stage: the createCaptureSession() flow. Keep reading to see what it actually does and why this step is so critical.
Development
Before tracing mCameraDevice.createCaptureSession, we need to know where mCameraDevice comes from.
From the sample code above, it is assigned in the onOpened callback of CameraDevice.StateCallback. So where does that CameraDevice come from?
For that we go back to the openCamera flow from the previous article:
private CameraDevice openCameraDeviceUserAsync(String cameraId,
        CameraDevice.StateCallback callback, Executor executor, final int uid,
        final int oomScoreOffset) throws CameraAccessException {
    try {
        cameraUser = cameraService.connectDevice(callbacks, cameraId,
                mContext.getOpPackageName(), mContext.getAttributionTag(), uid,
                oomScoreOffset, mContext.getApplicationInfo().targetSdkVersion);
    } catch (ServiceSpecificException e) {
        // This path ends up calling onError; details omitted
        deviceImpl.setRemoteFailure(e);
    } catch (RemoteException e) {
        // This path ends up calling onError; details omitted
        deviceImpl.setRemoteFailure(e);
    }
    // This path ends up calling onOpened; keep following it, because we want
    // to find out what mCameraDevice above actually is
    deviceImpl.setRemoteDevice(cameraUser);
    return device;
}
Continuing:
frameworks/base/core/java/android/hardware/camera2/impl/CameraDeviceImpl.java
public void setRemoteDevice(ICameraDeviceUser remoteDevice) throws CameraAccessException {
    // If the connection is fine, a task is posted to the device executor,
    // and the callback fires from there
    mDeviceExecutor.execute(mCallOnOpened);
}

private final Runnable mCallOnOpened = new Runnable() {
    @Override
    public void run() {
        // Note that what gets passed in is CameraDeviceImpl itself
        mDeviceCallback.onOpened(CameraDeviceImpl.this);
    }
};
So mCameraDevice is a CameraDeviceImpl instance, and mCameraDevice.createCaptureSession is really a call into the createCaptureSession function of CameraDeviceImpl.java.
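The dispatch pattern here, posting onOpened onto a client-supplied executor with the impl passing itself as the CameraDevice, can be sketched in plain Java. All names below (FakeDeviceImpl, StateCallback) are illustrative stand-ins, not the framework's:

```java
import java.util.concurrent.Executor;

// Illustrative sketch of CameraDeviceImpl's mDeviceExecutor.execute(mCallOnOpened)
// pattern: once the remote connection is up, the impl hands *itself* back to the
// client's onOpened callback via the executor.
public class DeviceOpenDispatch {
    interface StateCallback {
        void onOpened(Object device);
    }

    static class FakeDeviceImpl {
        private final Executor executor;
        private final StateCallback callback;

        FakeDeviceImpl(Executor executor, StateCallback callback) {
            this.executor = executor;
            this.callback = callback;
        }

        // Analogue of setRemoteDevice(): deliver onOpened with this impl
        // instance as the "CameraDevice" the client will hold on to.
        void setRemoteDevice() {
            executor.execute(() -> callback.onOpened(this));
        }
    }
}
```

This is why the mCameraDevice the app stores in onOpened is the impl object itself, not some separate wrapper.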
With that groundwork done, we can start tracing the session-creation flow proper.
frameworks/base/core/java/android/hardware/camera2/impl/CameraDeviceImpl.java
@Override
public void createCaptureSession(List<Surface> outputs,
        CameraCaptureSession.StateCallback callback, Handler handler)
        throws CameraAccessException {
    List<OutputConfiguration> outConfigurations = new ArrayList<>(outputs.size());
    for (Surface surface : outputs) {
        outConfigurations.add(new OutputConfiguration(surface));
    }
    createCaptureSessionInternal(null, outConfigurations, callback,
            checkAndWrapHandler(handler), /*operatingMode*/ ICameraDeviceUser.NORMAL_MODE,
            /*sessionParams*/ null);
}
Before going further, look at the parameters passed to createCaptureSessionInternal: outConfigurations and callback. In the app-side sample at the top of this article we passed in a single Surface, for preview only. In practice you can also pass a preview Surface and a still-capture Surface at the same time:

Arrays.asList(surface, mImageReader.getSurface())

A real camera app is rarely preview-only; still capture is there by default. There is also the preview-plus-recording case, where recording usually supports snapshots (i.e. still capture) as well. That case needs three Surface objects:

Arrays.asList(surface, mMediaRecorderSurface, mImageReader.getSurface())

Here one Surface is for preview, one feeds frames to MediaRecorder, and one captures frames from the current session. As for the callback parameter: once the session is configured, its onConfigured callback is where the preview gets started.
As an aside, a common recording flow nowadays looks like this:
// 1. Create a persistent input Surface (for the encoder)
Surface persistentSurface = MediaCodec.createPersistentInputSurface();

// 2. Configure MediaRecorder with that Surface as its input
MediaRecorder recorder = new MediaRecorder();
recorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
// ... other configuration (output file, encoder parameters, etc.) ...
recorder.setInputSurface(persistentSurface);
recorder.prepare();

// 3. Camera configuration: use the same Surface as an output target
CameraDevice cameraDevice = ...; // already-opened camera
CaptureRequest.Builder previewRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
previewRequestBuilder.addTarget(persistentSurface); // camera frames go straight to the encoder's Surface

// 4. Create the capture session and start recording
cameraDevice.createCaptureSession(Collections.singletonList(persistentSurface),
        new CameraCaptureSession.StateCallback() {
            @Override
            public void onConfigured(@NonNull CameraCaptureSession session) {
                session.setRepeatingRequest(previewRequestBuilder.build(), null, backgroundHandler);
                recorder.start(); // start recording; data flows from camera straight to the encoder
            }
            // ... other callbacks ...
        }, backgroundHandler);
Now let's continue downward.
private void createCaptureSessionInternal(InputConfiguration inputConfig,
        List<OutputConfiguration> outputConfigurations,
        CameraCaptureSession.StateCallback callback, Executor executor,
        int operatingMode, CaptureRequest sessionParams) throws CameraAccessException {
    // TODO: dont block for this
    boolean configureSuccess = true;
    CameraAccessException pendingException = null;
    Surface input = null;
    try {
        // configure streams and then block until IDLE
        configureSuccess = configureStreamsChecked(inputConfig, outputConfigurations,
                operatingMode, sessionParams, createSessionStartTime);
        if (configureSuccess == true && inputConfig != null) {
            input = mRemoteDevice.getInputSurface();
        }
    } catch (CameraAccessException e) {
        configureSuccess = false;
        pendingException = e;
        input = null;
        if (DEBUG) {
            Log.v(TAG, "createCaptureSession - failed with exception ", e);
        }
    }
}
Here we focus only on the key call, configureStreamsChecked (the parts omitted above matter too, but we are tracing the main flow).
public boolean configureStreamsChecked(InputConfiguration inputConfig,
        List<OutputConfiguration> outputs, int operatingMode, CaptureRequest sessionParams,
        long createSessionStartTime)
        throws CameraAccessException {
    mRemoteDevice.beginConfigure();

    // Delete all streams first (to free up HW resources)
    for (Integer streamId : deleteList) {
        mRemoteDevice.deleteStream(streamId);
        mConfiguredOutputs.delete(streamId);
    }

    // Add all new streams
    for (OutputConfiguration outConfig : outputs) {
        if (addSet.contains(outConfig)) {
            int streamId = mRemoteDevice.createStream(outConfig);
            mConfiguredOutputs.put(streamId, outConfig);
        }
    }

    int offlineStreamIds[];
    if (sessionParams != null) {
        offlineStreamIds = mRemoteDevice.endConfigure(operatingMode,
                sessionParams.getNativeCopy(), createSessionStartTime);
    } else {
        offlineStreamIds = mRemoteDevice.endConfigure(operatingMode, null,
                createSessionStartTime);
    }
    return success;
}
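The deleteList and addSet used above are computed, in code trimmed from the snippet, as a diff between the currently configured outputs and the newly requested ones: stale streams are deleted to free hardware resources, genuinely new ones are created, and unchanged ones are left alone. A simplified plain-Java sketch of that diff (the String configs are stand-ins for OutputConfiguration, and real equality checks are richer than this):

```java
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Simplified sketch of the stream diff done between beginConfigure and
// endConfigure: compare live streams against the requested configuration.
public class StreamDiff {
    public static class Result {
        public final Set<Integer> deleteList = new HashSet<>(); // stream ids to delete
        public final Set<String> addSet = new HashSet<>();      // configs to create
    }

    // configured: streamId -> config currently live; requested: desired configs.
    static Result diff(Map<Integer, String> configured, Set<String> requested) {
        Result r = new Result();
        r.addSet.addAll(requested);
        for (Map.Entry<Integer, String> e : configured.entrySet()) {
            if (requested.contains(e.getValue())) {
                r.addSet.remove(e.getValue()); // already configured, keep as-is
            } else {
                r.deleteList.add(e.getKey());  // stale, free its HW resources
            }
        }
        return r;
    }
}
```

This is why reconfiguring a session with an unchanged preview surface does not tear that stream down: only the delta is applied.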
Now go straight to the most critical call, mRemoteDevice.endConfigure.
Climax
frameworks/av/services/camera/libcameraservice/api2/CameraDeviceClient.cpp
binder::Status CameraDeviceClient::endConfigure(int operatingMode,
        const hardware::camera2::impl::CameraMetadataNative& sessionParams, int64_t startTimeMs,
        std::vector<int>* offlineStreamIds /*out*/) {
    // Many lines omitted; follow only the key call
    status_t err = mDevice->configureStreams(sessionParams, operatingMode);
    return res;
}
What follows is a chain of function calls; skipping the noise, we jump straight to the final destination.
frameworks/av/services/camera/libcameraservice/device3/Camera3Device.cpp
status_t Camera3Device::configureStreams(const CameraMetadata& sessionParams, int operatingMode) {
    ATRACE_CALL();
    ALOGV("%s: E", __FUNCTION__);

    Mutex::Autolock il(mInterfaceLock);
    Mutex::Autolock l(mLock);

    // In case the client doesn't include any session parameter, try a
    // speculative configuration using the values from the last cached
    // default request.
    if (sessionParams.isEmpty() &&
            ((mLastTemplateId > 0) && (mLastTemplateId < CAMERA_TEMPLATE_COUNT)) &&
            (!mRequestTemplateCache[mLastTemplateId].isEmpty())) {
        ALOGV("%s: Speculative session param configuration with template id: %d", __func__,
                mLastTemplateId);
        return filterParamsAndConfigureLocked(mRequestTemplateCache[mLastTemplateId],
                operatingMode);
    }

    return filterParamsAndConfigureLocked(sessionParams, operatingMode);
}

status_t Camera3Device::filterParamsAndConfigureLocked(const CameraMetadata& sessionParams,
        int operatingMode) {
    return configureStreamsLocked(operatingMode, filteredParams);
}

status_t Camera3Device::configureStreamsLocked(int operatingMode,
        const CameraMetadata& sessionParams, bool notifyRequestThread) {
    res = mInterface->configureStreams(sessionBuffer, &config, bufferSizes);
    return OK;
}
At this point it's the same story as before: we cross over via HIDL. So let's jump; I'm tired and can't wait either.
hardware/interfaces/camera/device/3.2/default/CameraDeviceSession.cpp
Return<void> CameraDeviceSession::configureStreams(
        const StreamConfiguration& requestedConfiguration,
        ICameraDeviceSession::configureStreams_cb _hidl_cb) {
    // mDevice here is a camera_device_t*, so we can go straight to the HAL .so
    // for the implementation without any detours
    status_t ret = mDevice->ops->configure_streams(mDevice, &stream_list);
}
If you read the previous article you know where to look, and it turns up right away:
hardware/qcom/camera/msm8998/QCamera2/HAL3/QCamera3HWI.cpp
int QCamera3HardwareInterface::configure_streams(
        const struct camera3_device *device,
        camera3_stream_configuration_t *stream_list)
{
    LOGD("E");
    QCamera3HardwareInterface *hw =
            reinterpret_cast<QCamera3HardwareInterface *>(device->priv);
    if (!hw) {
        LOGE("NULL camera device");
        return -ENODEV;
    }
    int rc = hw->configureStreams(stream_list);
    LOGD("X");
    return rc;
}
Then I went digging and pulled this out:
status_t OutputBufferDispatcher::configureStreams(camera3_stream_configuration_t *streamList)
{
    std::lock_guard<std::mutex> lock(mLock);
    mStreamBuffers.clear();

    if (!streamList) {
        ALOGE("%s: streamList is nullptr.", __FUNCTION__);
        return -EINVAL;
    }

    // Create a "frame-number -> buffer" map for each stream.
    for (uint32_t i = 0; i < streamList->num_streams; i++) {
        mStreamBuffers.emplace(streamList->streams[i], std::map<uint32_t, Buffer>());
    }
    return OK;
}
No, wrong one. The one below is the real thing; skipping sleep really does blur the eyes, time for an iced cola to perk up.
Judging by the official comment, this should be it:
/*===========================================================================
* FUNCTION : configureStreams
*
* DESCRIPTION: Reset HAL camera device processing pipeline and set up new input
* and output streams.
*
* PARAMETERS :
* @stream_list : streams to be configured
*
* RETURN :
*
*==========================================================================*/
int QCamera3HardwareInterface::configureStreams(
        camera3_stream_configuration_t *streamList)
{
    ATRACE_CAMSCOPE_CALL(CAMSCOPE_HAL3_CFG_STRMS);
    int rc = 0;

    // Acquire perfLock before configure streams
    mPerfLockMgr.acquirePerfLock(PERF_LOCK_START_PREVIEW);
    rc = configureStreamsPerfLocked(streamList);
    mPerfLockMgr.releasePerfLock(PERF_LOCK_START_PREVIEW);

    return rc;
}
The next function left me speechless; I have never seen one this long. Is Google not worried I'll go blind? The code below has been heavily trimmed.
/*===========================================================================
* FUNCTION : configureStreamsPerfLocked
*
* DESCRIPTION: configureStreams while perfLock is held.
*
* PARAMETERS :
* @stream_list : streams to be configured
*
* RETURN : int32_t type of status
* NO_ERROR -- success
* none-zero failure code
*==========================================================================*/
int QCamera3HardwareInterface::configureStreamsPerfLocked(
        camera3_stream_configuration_t *streamList)
{
    // Stop all existing streams first, to guarantee stream validity and
    // integrity; they are restarted later when requests come in
    /* first invalidate all the steams in the mStreamList
     * if they appear again, they will be validated */
    for (List<stream_info_t*>::iterator it = mStreamInfo.begin();
            it != mStreamInfo.end(); it++) {
        QCamera3ProcessingChannel *channel = (QCamera3ProcessingChannel*)(*it)->stream->priv;
        if (channel) {
            channel->stop();
        }
        (*it)->status = INVALID;
    }

    if (mRawDumpChannel) {
        mRawDumpChannel->stop();
        delete mRawDumpChannel;
        mRawDumpChannel = NULL;
    }

    if (mHdrPlusRawSrcChannel) {
        mHdrPlusRawSrcChannel->stop();
        delete mHdrPlusRawSrcChannel;
        mHdrPlusRawSrcChannel = NULL;
    }

    if (mSupportChannel)
        mSupportChannel->stop();

    if (mAnalysisChannel) {
        mAnalysisChannel->stop();
    }

    if (mMetadataChannel) {
        /* If content of mStreamInfo is not 0, there is metadata stream */
        mMetadataChannel->stop();
    }

    // Check the state; if configuration is not yet done, carry on with it
    // Check state
    switch (mState) {
        case INITIALIZED:
        case CONFIGURED:
        case STARTED:
            /* valid state */
            break;
        default:
            LOGE("Invalid state %d", mState);
            pthread_mutex_unlock(&mMutex);
            return -ENODEV;
    }

    // Everything is re-initialized below; configuring streams really means
    // establishing the routing so that data can be fetched on later requests
    // Update the state: stream configuration is done at this point
    // Update state
    mState = CONFIGURED;

    // Create analysis stream all the time, even when h/w support is not available
    if (!onlyRaw) {
        mAnalysisChannel = new QCamera3SupportChannel(
                mCameraHandle->camera_handle,
                mChannelHandle,
                mCameraHandle->ops,
                &analysisInfo.analysis_padding_info,
                analysisFeatureMask,
                CAM_STREAM_TYPE_ANALYSIS,
                &analysisDim,
                (analysisInfo.analysis_format
                        == CAM_FORMAT_Y_ONLY ? CAM_FORMAT_Y_ONLY
                        : CAM_FORMAT_YUV_420_NV21),
                analysisInfo.hw_analysis_supported,
                gCamCapability[mCameraId]->color_arrangement,
                this,
                0); // force buffer count to 0
    }

    // Create the different channels and apply the various stream settings
    //RAW DUMP channel
    if (mEnableRawDump && isRawStreamRequested == false){
        cam_dimension_t rawDumpSize;
        rawDumpSize = getMaxRawSize(mCameraId);
        cam_feature_mask_t rawDumpFeatureMask = CAM_QCOM_FEATURE_NONE;
        setPAAFSupport(rawDumpFeatureMask,
                CAM_STREAM_TYPE_RAW,
                gCamCapability[mCameraId]->color_arrangement);
        mRawDumpChannel = new QCamera3RawDumpChannel(mCameraHandle->camera_handle,
                mChannelHandle,
                mCameraHandle->ops,
                rawDumpSize,
                &padding_info,
                this, rawDumpFeatureMask);
        if (!mRawDumpChannel) {
            LOGE("Raw Dump channel cannot be created");
            pthread_mutex_unlock(&mMutex);
            return -ENOMEM;
        }
    }

    if (!onlyRaw && isSupportChannelNeeded(streamList, mStreamConfigInfo)) {
        mSupportChannel = new QCamera3SupportChannel(
                mCameraHandle->camera_handle,
                mChannelHandle,
                mCameraHandle->ops,
                &gCamCapability[mCameraId]->padding_info,
                callbackFeatureMask,
                CAM_STREAM_TYPE_CALLBACK,
                &QCamera3SupportChannel::kDim,
                CAM_FORMAT_YUV_420_NV21,
                supportInfo.hw_analysis_supported,
                gCamCapability[mCameraId]->color_arrangement,
                this, 0);
    }

    if ((mOpMode == CAMERA3_STREAM_CONFIGURATION_CONSTRAINED_HIGH_SPEED_MODE) &&
            !m_bIsVideo) {
        mDummyBatchChannel = new QCamera3RegularChannel(mCameraHandle->camera_handle,
                mChannelHandle,
                mCameraHandle->ops, captureResultCb,
                setBufferErrorStatus, &gCamCapability[mCameraId]->padding_info,
                this,
                &mDummyBatchStream,
                CAM_STREAM_TYPE_VIDEO,
                dummyFeatureMask,
                mMetadataChannel);
    }
}
These channel instances are initialized and started later, in processCaptureRequest, when preview or capture actually begins. That is a topic for another article, but here is a first look:
int QCamera3HardwareInterface::processCaptureRequest(
        camera3_capture_request_t *request,
        List<InternalRequest> &internallyRequestedStreams)
{
    if (mRawDumpChannel) {
        rc = mRawDumpChannel->initialize(IS_TYPE_NONE);
        if (rc != NO_ERROR) {
            LOGE("Error: Raw Dump Channel init failed");
            pthread_mutex_unlock(&mMutex);
            goto error_exit;
        }
    }

    if (mHdrPlusRawSrcChannel) {
        rc = mHdrPlusRawSrcChannel->initialize(IS_TYPE_NONE);
        if (rc != NO_ERROR) {
            LOGE("Error: HDR+ RAW Source Channel init failed");
            pthread_mutex_unlock(&mMutex);
            goto error_exit;
        }
    }

    if (mSupportChannel) {
        rc = mSupportChannel->initialize(IS_TYPE_NONE);
        if (rc < 0) {
            LOGE("Support channel initialization failed");
            pthread_mutex_unlock(&mMutex);
            goto error_exit;
        }
    }

    if (mAnalysisChannel) {
        rc = mAnalysisChannel->initialize(IS_TYPE_NONE);
        if (rc < 0) {
            LOGE("Analysis channel initialization failed");
            pthread_mutex_unlock(&mMutex);
            goto error_exit;
        }
    }

    if (mDummyBatchChannel) {
        rc = mDummyBatchChannel->setBatchSize(mBatchSize);
        if (rc < 0) {
            LOGE("mDummyBatchChannel setBatchSize failed");
            pthread_mutex_unlock(&mMutex);
            goto error_exit;
        }
        rc = mDummyBatchChannel->initialize(IS_TYPE_NONE);
        if (rc < 0) {
            LOGE("mDummyBatchChannel initialization failed");
            pthread_mutex_unlock(&mMutex);
            goto error_exit;
        }
    }

    if (mState == CONFIGURED && mChannelHandle) {
        //Then start them.
        LOGH("Start META Channel");
        rc = mMetadataChannel->start();
        if (rc < 0) {
            LOGE("META channel start failed");
            pthread_mutex_unlock(&mMutex);
            return rc;
        }

        if (mAnalysisChannel) {
            rc = mAnalysisChannel->start();
            if (rc < 0) {
                LOGE("Analysis channel start failed");
                mMetadataChannel->stop();
                pthread_mutex_unlock(&mMutex);
                return rc;
            }
        }

        if (mSupportChannel) {
            rc = mSupportChannel->start();
            if (rc < 0) {
                LOGE("Support channel start failed");
                mMetadataChannel->stop();
                /* Although support and analysis are mutually exclusive today
                   adding it in anycase for future proofing */
                if (mAnalysisChannel) {
                    mAnalysisChannel->stop();
                }
                pthread_mutex_unlock(&mMutex);
                return rc;
            }
        }
    }
}
Epilogue
From this last part we can see that stream configuration mainly does three things:
1. Establish the paths from the sensor to each output target;
2. Allocate hardware buffers;
3. Maintain session state, as the foundation for subsequent capture requests.
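The state handling running through the HAL code above (configure_streams is only legal from INITIALIZED/CONFIGURED/STARTED and always lands the device back in CONFIGURED; the first processCaptureRequest then starts the channels and moves it to STARTED) can be summarized as a small state machine. This is a sketch of my reading of the QCamera3 code, with my own names, not the HAL's actual types:

```java
// Sketch of the HAL-side state gate seen in configureStreamsPerfLocked and
// processCaptureRequest. State names mirror the mState values in the code
// above; the class and method names here are illustrative only.
public class HalStateMachine {
    public enum State { CLOSED, INITIALIZED, CONFIGURED, STARTED, ERROR }

    private State state = State.INITIALIZED;

    public State state() { return state; }

    // Analogue of configure_streams(): tear down old streams, rebuild,
    // and end in CONFIGURED; any other starting state is rejected.
    public boolean configureStreams() {
        switch (state) {
            case INITIALIZED:
            case CONFIGURED:
            case STARTED:
                state = State.CONFIGURED;
                return true;
            default:
                return false; // the real HAL returns -ENODEV here
        }
    }

    // Analogue of the first processCaptureRequest(): channels start here
    // and the device transitions to STARTED.
    public boolean processCaptureRequest() {
        if (state == State.CONFIGURED || state == State.STARTED) {
            state = State.STARTED;
            return true;
        }
        return false;
    }
}
```

Note that reconfiguring from STARTED is allowed, which is exactly what happens when an app switches, say, from preview to recording: the running streams are stopped, the device drops back to CONFIGURED, and the next request starts the new channels.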
Stream configuration is best studied together with starting the preview. And Google's original code really does have some moves; I'll come back when I have time and savor it properly.