
【HarmonyOS】Custom Camera Photo Capture and Video Recording (Part 2): Recording

I. Development Approach for Custom Camera Video Recording

The overall approach is the same as for photo capture: CameraKit is used to access the camera device, the video input and output streams are bound to a session, and the relevant parameters are configured on that session.

The only difference from photo capture is that the recording lifecycle is managed through an AVRecorder: starting and pausing a recording are both controlled through this recorder object. Compared with photo capture, recording takes a bit more work to develop.
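To make the AVRecorder's role concrete, here is a minimal sketch of its recording lifecycle based on the Media Kit AVRecorder interfaces (error handling omitted; `config` stands for an AVRecorderConfig prepared like the one in the demo below):

import { media } from '@kit.MediaKit';

// Minimal AVRecorder lifecycle sketch
async function recordLifecycleSketch(config: media.AVRecorderConfig): Promise<void> {
  let recorder: media.AVRecorder = await media.createAVRecorder();
  await recorder.prepare(config);                            // enter the prepared state
  let surfaceId: string = await recorder.getInputSurface();  // surface handed to the camera's VideoOutput
  console.info(`video surfaceId: ${surfaceId}`);
  await recorder.start();                                    // start recording
  await recorder.pause();                                    // optional: pause
  await recorder.resume();                                   // optional: resume
  await recorder.stop();                                     // stop recording
  await recorder.release();                                  // release the recorder
}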

II. Video Recording Development Steps

The development steps for custom camera video recording are:
1. Set up the camera UI
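The camera UI is usually just an XComponent that supplies the surfaceId for the preview stream. A minimal sketch (the component name and layout values are illustrative):

// Illustrative preview component: the XComponent's surfaceId is later passed to createPreviewOutput
@Component
struct CameraPreview {
  private xComponentCtrl: XComponentController = new XComponentController();
  private surfaceId: string = '';

  build() {
    XComponent({ id: 'cameraXComponent', type: 'surface', controller: this.xComponentCtrl })
      .onLoad(() => {
        // The surface is ready; grab its id and hand it to the camera initialization
        this.surfaceId = this.xComponentCtrl.getXComponentSurfaceId();
      })
      .width('100%')
      .height('70%')
  }
}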

2. Select a camera instance
Use the CameraManager provided by CameraKit to obtain a camera manager instance and get the device's camera list, which usually contains a front and a rear camera. Pick the one you want to use.
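A minimal sketch of this step (assuming `context` is the current ability context; error handling omitted):

import { camera } from '@kit.CameraKit';

let cameraManager: camera.CameraManager = camera.getCameraManager(context);
let cameras: Array<camera.CameraDevice> = cameraManager.getSupportedCameras();
// The list typically contains the rear and front cameras; pick the one you need
let selectedCamera: camera.CameraDevice = cameras[0];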

3. Create the camera input stream
Pass the selected camera instance to cameraManager.createCameraInput to create the camera input stream, open it with cameraInput.open, and configure the camera parameters (photo or video mode, flash, zoom ratio).
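Continuing the sketch above (`cameraManager` and `selectedCamera` are assumed from step 2):

let cameraInput: camera.CameraInput = cameraManager.createCameraInput(selectedCamera);
await cameraInput.open(); // open the camera device before adding the input to the session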

4. Configure the camera session
Create a session and bind the input stream and the output streams (preview, photo, or video) to it. Once that is done, start the session; the camera streams will then output normally, and photo capture and recording can be operated as expected. You can think of the session as a combiner.
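Sketched out, the session is the combiner that ties everything together (here `previewOutput` and `videoOutput` are assumed to have been created from the XComponent surface and the AVRecorder surface, respectively):

let session: camera.VideoSession =
  cameraManager.createSession(camera.SceneMode.NORMAL_VIDEO) as camera.VideoSession;
session.beginConfig();
session.addInput(cameraInput);    // camera input stream
session.addOutput(previewOutput); // preview stream bound to the XComponent surface
session.addOutput(videoOutput);   // recording stream bound to the AVRecorder surface
await session.commitConfig();
await session.start();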

5. Recording operations
Use the AVRecorder to start, pause, and release the recording.
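In sketch form, the camera VideoOutput and the AVRecorder are started and stopped together (variables assumed from the previous steps):

await videoOutput.start();   // start the camera's recording output stream
await avRecorder.start();    // start writing frames to the file
await avRecorder.pause();    // optional: pause recording
await avRecorder.resume();   // optional: resume recording
await avRecorder.stop();     // stop recording
await videoOutput.stop();    // stop the recording output stream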

6. Destroy the camera when leaving the page
If the camera resources are not released, the next initialization may show a black screen, and it can even affect the system camera and cause other abnormal behavior.
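The teardown order in sketch form (matching destroyVideoCamera in the demo below; variables assumed from the previous steps):

await session.stop();            // stop the session first
await cameraInput.close();       // release the camera input
await previewOutput.release();   // release the preview output stream
await videoOutput.release();     // release the recording output stream
await session.release();         // release the session itself
await avRecorder.release();      // release the AVRecorder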

III. Demo Source Code Example:

import { camera } from '@kit.CameraKit';
import { image } from '@kit.ImageKit';
import { BusinessError } from '@kit.BasicServicesKit';
import { media } from '@kit.MediaKit';
import { common } from '@kit.AbilityKit';
import { photoAccessHelper } from '@kit.MediaLibraryKit';
import { fileIo as fs } from '@kit.CoreFileKit';

export class CameraMgr {
  private TAG: string = "CameraMgr";

  private static mCameraMgr: CameraMgr | null = null;

  private mCameraInput: camera.CameraInput | undefined = undefined;
  private mPreviewOutput: camera.PreviewOutput | undefined = undefined;

  private mVideoSession: camera.VideoSession | undefined = undefined;
  private mVideoOutput: camera.VideoOutput | undefined = undefined;
  private mAvRecorder: media.AVRecorder | undefined = undefined;

  private mFile: fs.File | null = null;

  public static Ins(): CameraMgr {
    if (CameraMgr.mCameraMgr) {
      return CameraMgr.mCameraMgr
    }
    CameraMgr.mCameraMgr = new CameraMgr();
    return CameraMgr.mCameraMgr
  }

  /**
   * Initialize the video camera
   * @param context
   * @param surfaceId
   * @returns
   */
  public async initVideoCamera(context: common.Context, surfaceId: string): Promise<void> {
    // Create the CameraManager object
    let cameraManager: camera.CameraManager = camera.getCameraManager(context);
    if (!cameraManager) {
      console.error("camera.getCameraManager error");
      return;
    }

    // Listen for camera status changes
    cameraManager.on('cameraStatus', (err: BusinessError, cameraStatusInfo: camera.CameraStatusInfo) => {
      if (err !== undefined && err.code !== 0) {
        console.error('cameraStatus with errorCode = ' + err.code);
        return;
      }
      console.info(`camera : ${cameraStatusInfo.camera.cameraId}`);
      console.info(`status: ${cameraStatusInfo.status}`);
    });

    // Get the camera list
    let cameraArray: Array<camera.CameraDevice> = [];
    try {
      cameraArray = cameraManager.getSupportedCameras();
    } catch (error) {
      let err = error as BusinessError;
      console.error(`getSupportedCameras call failed. error code: ${err.code}`);
    }

    if (cameraArray.length <= 0) {
      console.error("cameraManager.getSupportedCameras error");
      return;
    }

    // Get the supported scene modes
    let sceneModes: Array<camera.SceneMode> = cameraManager.getSupportedSceneModes(cameraArray[0]);
    let isSupportVideoMode: boolean = sceneModes.indexOf(camera.SceneMode.NORMAL_VIDEO) >= 0;
    if (!isSupportVideoMode) {
      console.error('video mode not support');
      return;
    }

    // Get the output stream capabilities supported by the camera device
    let cameraOutputCap: camera.CameraOutputCapability =
      cameraManager.getSupportedOutputCapability(cameraArray[0], camera.SceneMode.NORMAL_VIDEO);
    if (!cameraOutputCap) {
      console.error("cameraManager.getSupportedOutputCapability error")
      return;
    }
    console.info("outputCapability: " + JSON.stringify(cameraOutputCap));

    let previewProfilesArray: Array<camera.Profile> = cameraOutputCap.previewProfiles;
    if (!previewProfilesArray) {
      console.error("createOutput previewProfilesArray == null || undefined");
    }

    let photoProfilesArray: Array<camera.Profile> = cameraOutputCap.photoProfiles;
    if (!photoProfilesArray) {
      console.error("createOutput photoProfilesArray == null || undefined");
    }

    let videoProfilesArray: Array<camera.VideoProfile> = cameraOutputCap.videoProfiles;
    if (!videoProfilesArray) {
      console.error("createOutput videoProfilesArray == null || undefined");
    }
    // The width/height of the videoProfile must match the width/height of the AVRecorderProfile, and must be a size the AVRecorderProfile supports
    let videoSize: camera.Size = {
      width: 640,
      height: 480
    }
    let videoProfile: undefined | camera.VideoProfile = videoProfilesArray.find((profile: camera.VideoProfile) => {
      return profile.size.width === videoSize.width && profile.size.height === videoSize.height;
    });
    if (!videoProfile) {
      console.error('videoProfile is not found');
      return;
    }
    // The configuration parameters are subject to the ranges supported by the actual hardware
    let aVRecorderProfile: media.AVRecorderProfile = {
      audioBitrate: 48000,
      audioChannels: 2,
      audioCodec: media.CodecMimeType.AUDIO_AAC,
      audioSampleRate: 48000,
      fileFormat: media.ContainerFormatType.CFT_MPEG_4,
      videoBitrate: 2000000,
      videoCodec: media.CodecMimeType.VIDEO_AVC,
      videoFrameWidth: videoSize.width,
      videoFrameHeight: videoSize.height,
      videoFrameRate: 30
    };
    let options: photoAccessHelper.CreateOptions = {
      title: Date.now().toString()
    };
    let accessHelper: photoAccessHelper.PhotoAccessHelper = photoAccessHelper.getPhotoAccessHelper(context);
    let videoUri: string = await accessHelper.createAsset(photoAccessHelper.PhotoType.VIDEO, 'mp4', options);
    this.mFile = fs.openSync(videoUri, fs.OpenMode.READ_WRITE | fs.OpenMode.CREATE);
    let aVRecorderConfig: media.AVRecorderConfig = {
      audioSourceType: media.AudioSourceType.AUDIO_SOURCE_TYPE_MIC,
      videoSourceType: media.VideoSourceType.VIDEO_SOURCE_TYPE_SURFACE_YUV,
      profile: aVRecorderProfile,
      url: `fd://${this.mFile.fd.toString()}`, // The file must be created by the caller with read/write permission; pass its fd here, e.g. fd://45 or file:///data/media/01.mp4
      rotation: 0, // Valid values are 0, 90, 180, 270; other values cause the prepare() call to report an error
      location: { latitude: 30, longitude: 130 }
    };

    let avRecorder: media.AVRecorder | undefined = undefined;
    try {
      avRecorder = await media.createAVRecorder();
    } catch (error) {
      let err = error as BusinessError;
      console.error(`createAVRecorder call failed. error code: ${err.code}`);
    }

    if (avRecorder === undefined) {
      return;
    }
    this.mAvRecorder = avRecorder;
    try {
      await avRecorder.prepare(aVRecorderConfig);
    } catch (error) {
      let err = error as BusinessError;
      console.error(`prepare call failed. error code: ${err.code}`);
    }

    let videoSurfaceId: string | undefined = undefined; // This surfaceId is passed to the camera API to create the videoOutput
    try {
      videoSurfaceId = await avRecorder.getInputSurface();
    } catch (error) {
      let err = error as BusinessError;
      console.error(`getInputSurface call failed. error code: ${err.code}`);
    }
    if (videoSurfaceId === undefined) {
      return;
    }
    // Create the VideoOutput object
    let videoOutput: camera.VideoOutput | undefined = undefined;
    try {
      videoOutput = cameraManager.createVideoOutput(videoProfile, videoSurfaceId);
    } catch (error) {
      let err = error as BusinessError;
      console.error(`Failed to create the videoOutput instance. error: ${JSON.stringify(err)}`);
    }
    if (videoOutput === undefined) {
      return;
    }
    this.mVideoOutput = videoOutput;
    // Listen for video output errors
    videoOutput.on('error', (error: BusinessError) => {
      console.error(`Video output error code: ${error.code}`);
    });

    // Create the session
    let videoSession: camera.VideoSession | undefined = undefined;
    try {
      videoSession = cameraManager.createSession(camera.SceneMode.NORMAL_VIDEO) as camera.VideoSession;
    } catch (error) {
      let err = error as BusinessError;
      console.error(`Failed to create the session instance. error: ${JSON.stringify(err)}`);
    }
    if (videoSession === undefined) {
      return;
    }
    this.mVideoSession = videoSession;
    // Listen for session errors
    videoSession.on('error', (error: BusinessError) => {
      console.error(`Video session error code: ${error.code}`);
    });

    // Begin session configuration
    try {
      videoSession.beginConfig();
    } catch (error) {
      let err = error as BusinessError;
      console.error(`Failed to beginConfig. error: ${JSON.stringify(err)}`);
    }

    // Create the camera input stream
    let cameraInput: camera.CameraInput | undefined = undefined;
    try {
      cameraInput = cameraManager.createCameraInput(cameraArray[0]);
    } catch (error) {
      let err = error as BusinessError;
      console.error(`Failed to createCameraInput. error: ${JSON.stringify(err)}`);
    }
    if (cameraInput === undefined) {
      return;
    }
    this.mCameraInput = cameraInput;
    // Listen for cameraInput errors
    let cameraDevice: camera.CameraDevice = cameraArray[0];
    cameraInput.on('error', cameraDevice, (error: BusinessError) => {
      console.error(`Camera input error code: ${error.code}`);
    });

    // Open the camera
    try {
      await cameraInput.open();
    } catch (error) {
      let err = error as BusinessError;
      console.error(`Failed to open cameraInput. error: ${JSON.stringify(err)}`);
    }

    // Add the camera input stream to the session
    try {
      videoSession.addInput(cameraInput);
    } catch (error) {
      let err = error as BusinessError;
      console.error(`Failed to add cameraInput. error: ${JSON.stringify(err)}`);
    }

    // Create the preview output stream
    let previewOutput: camera.PreviewOutput | undefined = undefined;
    try {
      previewOutput = cameraManager.createPreviewOutput(previewProfilesArray[0], surfaceId);
    } catch (error) {
      let err = error as BusinessError;
      console.error(`Failed to create the PreviewOutput instance. error: ${JSON.stringify(err)}`);
    }

    if (previewOutput === undefined) {
      return;
    }
    this.mPreviewOutput = previewOutput;
    // Add the preview output stream to the session
    try {
      videoSession.addOutput(previewOutput);
    } catch (error) {
      let err = error as BusinessError;
      console.error(`Failed to add previewOutput. error: ${JSON.stringify(err)}`);
    }

    // Add the video (recording) output stream to the session
    try {
      videoSession.addOutput(videoOutput);
    } catch (error) {
      let err = error as BusinessError;
      console.error(`Failed to add videoOutput. error: ${JSON.stringify(err)}`);
    }

    // Commit the session configuration
    try {
      await videoSession.commitConfig();
    } catch (error) {
      let err = error as BusinessError;
      console.error(`videoSession commitConfig error: ${JSON.stringify(err)}`);
    }

    // Start the session
    try {
      await videoSession.start();
    } catch (error) {
      let err = error as BusinessError;
      console.error(`videoSession start error: ${JSON.stringify(err)}`);
    }

  }

  /**
   * Start recording
   */
  public async startRecord() {
    // Start the video output stream
    this.mVideoOutput?.start((err: BusinessError) => {
      if (err) {
        console.error(`Failed to start the video output. error: ${JSON.stringify(err)}`);
        return;
      }
      console.info('Callback invoked to indicate the video output start success.');
    });
    // Start the AVRecorder
    try {
      await this.mAvRecorder?.start();
    } catch (error) {
      let err = error as BusinessError;
      console.error(`avRecorder start error: ${JSON.stringify(err)}`);
    }
  }

  /**
   * Stop recording
   */
  public async stopRecorder() {
    // Stop the video output stream
    this.mVideoOutput?.stop((err: BusinessError) => {
      if (err) {
        console.error(`Failed to stop the video output. error: ${JSON.stringify(err)}`);
        return;
      }
      console.info('Callback invoked to indicate the video output stop success.');
    });
    // Stop the AVRecorder
    try {
      await this.mAvRecorder?.stop();
    } catch (error) {
      let err = error as BusinessError;
      console.error(`avRecorder stop error: ${JSON.stringify(err)}`);
    }
  }

  public async destroyVideoCamera() {

    // Stop the current session
    await this.mVideoSession?.stop();

    // Close the recording file
    if (this.mFile) {
      fs.closeSync(this.mFile);
      this.mFile = null;
    }

    // Release the camera input stream
    await this.mCameraInput?.close();

    // Release the preview output stream
    await this.mPreviewOutput?.release();

    // Release the video output stream
    await this.mVideoOutput?.release();

    // Release the session
    await this.mVideoSession?.release();

    // Release the AVRecorder
    await this.mAvRecorder?.release();
    this.mAvRecorder = undefined;

    // Clear the session reference
    this.mVideoSession = undefined;
  }
}
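For completeness, here is a hypothetical page that wires this manager to an XComponent. The page name and layout are illustrative, and it assumes the ohos.permission.CAMERA and ohos.permission.MICROPHONE permissions have already been granted (photoAccessHelper.createAsset additionally needs media write access):

import { common } from '@kit.AbilityKit';
import { CameraMgr } from './CameraMgr'; // hypothetical path to the class above

@Entry
@Component
struct CameraPreviewPage {
  private xComponentCtrl: XComponentController = new XComponentController();

  build() {
    Column() {
      XComponent({ id: 'cameraPreview', type: 'surface', controller: this.xComponentCtrl })
        .onLoad(() => {
          // Initialize the video camera once the preview surface is ready
          let surfaceId: string = this.xComponentCtrl.getXComponentSurfaceId();
          let context = getContext(this) as common.UIAbilityContext;
          CameraMgr.Ins().initVideoCamera(context, surfaceId);
        })
        .width('100%')
        .height('80%')
      Row() {
        Button('Start').onClick(() => CameraMgr.Ins().startRecord())
        Button('Stop').onClick(() => CameraMgr.Ins().stopRecorder())
      }
    }
  }

  aboutToDisappear(): void {
    // Release camera resources when leaving the page
    CameraMgr.Ins().destroyVideoCamera();
  }
}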


