
Getting Started with Audio and Video Data Processing: Color Space (2) --- FFmpeg

Table of Contents

Overview

Workflow

Overall Workflow

Initialization Method

Initialization Code

Conversion Method

Conversion Code

Release Method

Complete Code Walkthrough

Code Location


Overview

This article gives a brief introduction to color space conversion based on FFmpeg's libswscale. libswscale implements conversion between all kinds of image pixel formats, for example between YUV and RGB. What follows is a short walkthrough of how to use libswscale for color space conversion.

Workflow

Overall Workflow

libswscale is straightforward to use; only three functions really matter:
(1) sws_getContext(): initializes an SwsContext with the given parameters.
(2) sws_scale(): converts one frame of image data.
(3) sws_freeContext(): frees the SwsContext.
sws_getContext() can also be replaced by sws_getCachedContext(). A minimal sketch of this three-call workflow is shown below.
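
As a quick orientation, here is a minimal sketch of that workflow. The 480x272 YUV420P-to-RGB24 conversion is only an illustrative assumption (not part of the demo below), and error handling is trimmed:

extern "C" {
#include <libswscale/swscale.h>
#include <libavutil/imgutils.h>
}

bool ConvertOneFrame()
{
	const int w = 480, h = 272;

	// (1) Initialize the SwsContext with source/destination geometry and pixel formats.
	SwsContext* ctx = sws_getContext(w, h, AV_PIX_FMT_YUV420P,
		w, h, AV_PIX_FMT_RGB24, SWS_BICUBIC, nullptr, nullptr, nullptr);
	if (nullptr == ctx)
		return false;

	// Allocate plane pointers and line sizes for both sides (tightly packed, align = 1).
	uint8_t* srcData[4]; int srcLinesize[4];
	uint8_t* dstData[4]; int dstLinesize[4];
	av_image_alloc(srcData, srcLinesize, w, h, AV_PIX_FMT_YUV420P, 1);
	av_image_alloc(dstData, dstLinesize, w, h, AV_PIX_FMT_RGB24, 1);

	// ... fill srcData with one YUV420P frame here ...

	// (2) Convert one frame.
	sws_scale(ctx, srcData, srcLinesize, 0, h, dstData, dstLinesize);

	// (3) Release everything.
	av_freep(&srcData[0]);
	av_freep(&dstData[0]);
	sws_freeContext(ctx);
	return true;
}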

Initialization Method

Here we initialize the SwsContext with sws_getContext(). There is also a second, more flexible approach that exposes additional parameters; it calls the following functions (see the sketch after this list):
1) sws_alloc_context(): allocates the SwsContext structure.
2) av_opt_set_XXX(): sets fields of the SwsContext through av_opt_set_int(), av_opt_set() and related calls. Note that the definition of SwsContext is not visible to user code, so its members cannot be assigned directly; they must be set through the av_opt_set() family of APIs.
3) sws_init_context(): initializes the SwsContext structure.
Compared with the first approach, this more verbose one can configure options that sws_getContext() cannot, for example whether the YUV pixel values follow the JPEG standard (Y, U and V all in 0-255, i.e. full range) or the MPEG standard (Y in 16-235, U and V in 16-240, i.e. limited range).
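
A sketch of that second initialization path follows. The option names ("srcw", "src_range" and so on) are assumed to be the AVOption names libswscale registers on SwsContext; verify them against your FFmpeg version:

extern "C" {
#include <libswscale/swscale.h>
#include <libavutil/opt.h>
}

SwsContext* CreateFullRangeContext(int srcW, int srcH, int dstW, int dstH)
{
	// 1. Allocate the (opaque) SwsContext.
	SwsContext* ctx = sws_alloc_context();
	if (nullptr == ctx)
		return nullptr;

	// 2. Configure it through the AVOption API; the struct members are not
	//    accessible directly, so av_opt_set_int() is the only way in.
	av_opt_set_int(ctx, "srcw", srcW, 0);
	av_opt_set_int(ctx, "srch", srcH, 0);
	av_opt_set_int(ctx, "src_format", AV_PIX_FMT_YUV420P, 0);
	av_opt_set_int(ctx, "dstw", dstW, 0);
	av_opt_set_int(ctx, "dsth", dstH, 0);
	av_opt_set_int(ctx, "dst_format", AV_PIX_FMT_RGB24, 0);
	av_opt_set_int(ctx, "sws_flags", SWS_BICUBIC, 0);
	// YUV range: 1 = JPEG/full range (0-255), 0 = MPEG/limited range (16-235 / 16-240).
	av_opt_set_int(ctx, "src_range", 1, 0);
	av_opt_set_int(ctx, "dst_range", 1, 0);

	// 3. Initialize the context; the two filter arguments may be NULL.
	if (sws_init_context(ctx, nullptr, nullptr) < 0)
	{
		sws_freeContext(ctx);
		return nullptr;
	}
	return ctx;
}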

Initialization Code

m_imgConvertCtx = sws_getContext(cfg->srcWide, cfg->srcHigh, srcIter->second,
		cfg->dstWide, cfg->dstHigh, dstIter->second, SWS_BICUBIC, NULL, NULL, NULL);

Conversion Method

The conversion step itself needs little explanation: just call sws_scale(). The one thing to watch is that the data pointers and line sizes passed to it must be laid out plane by plane according to the pixel format in use; see the sketch after the call below.

Conversion Code

sws_scale(m_imgConvertCtx, m_srcPointers, m_srcLinesizes, 0, m_srcHigh, m_dstPointers, m_dstLinesizes);
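
The class below fills the plane pointers and line sizes by hand with memcpy, but when the frame already sits in one contiguous buffer they can also be derived with av_image_fill_arrays(). A small sketch, assuming a tightly packed YUV420P frame (align = 1):

extern "C" {
#include <libavutil/imgutils.h>
}

// For a 480x272 YUV420P frame this yields:
//   data[0] -> Y plane, linesize[0] = 480
//   data[1] -> U plane, linesize[1] = 240
//   data[2] -> V plane, linesize[2] = 240
void FillPlanes(const uint8_t* packedFrame, int w, int h,
	uint8_t* data[4], int linesize[4])
{
	av_image_fill_arrays(data, linesize, packedFrame, AV_PIX_FMT_YUV420P, w, h, 1);
}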

Release Method


	if (nullptr != m_imgConvertCtx)
	{
		sws_freeContext(m_imgConvertCtx);
	}

	m_imgConvertCtx = nullptr;

Complete Code Walkthrough

The FFmpeg color space conversion demo is wrapped into a class that converts between NV12, NV21, YUV420P, YUV422P, RGB24 and RGBA. To support additional formats, implement and register the corresponding pack/unpack functions. A short usage sketch follows, and the full listing comes after it.
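
A minimal usage sketch of the class (the 480x272 frame size and the caller-provided buffers are assumptions for illustration; buffer sizes follow bytes per pixel: 1.5 * w * h for YUV420P, 3 * w * h for RGB24):

#include "ColorConversionFFmpeg.h"

// Convert one 480x272 YUV420P frame into an RGB24 buffer supplied by the caller.
int ConvertFrame(const char* yuv420pIn, char* rgb24Out)
{
	FFmpegSwscaleConfig cfg;
	cfg.srcWide = 480;  cfg.srcHigh = 272;  cfg.srcFormat = FFMPEG_AV_PIX_FMT_YUV420P;
	cfg.dstWide = 480;  cfg.dstHigh = 272;  cfg.dstFormat = FFMPEG_AV_PIX_FMT_RGB24;

	ColorConversionFFmpeg conv;
	if (0 != conv.Init(&cfg))              // builds the SwsContext and the plane buffers
		return -1;
	conv.Conversion(yuv420pIn, rgb24Out);  // one frame in, one frame out
	return static_cast<int>(conv.UnInit());
}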

Header file: ColorConversionFFmpeg.h

/**
 * Color space conversion with FFmpeg
 * YUV Transformation
 *
 * 梁启东 qidong.liang
 * 18088708700@163.com
 * https://blog.csdn.net/u011645307
 *
 *
 * This program implements YUV-to-YUV and YUV-to-RGB conversion with FFmpeg.
 * The following formats can be converted between one another:
 *	FFMPEG_AV_PIX_FMT_NV12,
 *	FFMPEG_AV_PIX_FMT_NV21,
 *	FFMPEG_AV_PIX_FMT_YUV420P,
 *	FFMPEG_AV_PIX_FMT_YUV422P,
 *	FFMPEG_AV_PIX_FMT_RGB24,
 *	FFMPEG_AV_PIX_FMT_RGBA
 */
#ifndef COLOR_CONVERSION_FFMPEG_H
#define	COLOR_CONVERSION_FFMPEG_H

#ifdef _WIN32
//Windows
extern "C"
{
#include "libswscale/swscale.h"
#include "libavutil/opt.h"
#include "libavutil/imgutils.h"
};
#else
//Linux...
#ifdef __cplusplus
extern "C"
{
#endif
#include <libavutil/opt.h>
#include <libswscale/swscale.h>
#include <libavutil/imgutils.h>
#ifdef __cplusplus
};
#endif
#endif

#include <map>
#include <functional>

#ifndef FFMPEG_PIX_FORMAT
#define	FFMPEG_PIX_FORMAT
typedef enum FFmpegAVPixelFormat
{
	FFMPEG_AV_PIX_FMT_NOKNOW,
	FFMPEG_AV_PIX_FMT_NV12,
	FFMPEG_AV_PIX_FMT_NV21,
	FFMPEG_AV_PIX_FMT_YUV420P,
	FFMPEG_AV_PIX_FMT_YUV422P,
	FFMPEG_AV_PIX_FMT_RGB24,
	FFMPEG_AV_PIX_FMT_RGBA

}FFmpegAVPixelFormat;

#endif//FFMPEG_PIX_FORMAT
#ifndef FFMPEG_SCALE_CONFIG
#define	FFMPEG_SCALE_CONFIG
typedef struct FFmpegSwscaleConfig
{
	unsigned int srcWide;
	unsigned int srcHigh;
	FFmpegAVPixelFormat srcFormat;
	unsigned int dstWide;
	unsigned int dstHigh;
	FFmpegAVPixelFormat dstFormat;

	FFmpegSwscaleConfig()
	{
		srcWide = 0;
		srcHigh = 0;
		srcFormat = FFMPEG_AV_PIX_FMT_NOKNOW;
		dstWide = 0;
		dstHigh = 0;
		dstFormat = FFMPEG_AV_PIX_FMT_NOKNOW;
	}
}FFmpegSwscaleConfig;
#endif // !FFMPEG_SCALE_CONFIG

class ColorConversionFFmpeg
{
public:
	ColorConversionFFmpeg();
	~ColorConversionFFmpeg();

	long Init(FFmpegSwscaleConfig* cfg);
	long Conversion(const char* inputBuff, char* outputBuff);
	long UnInit();

private:
	long BuffToAVPixFmtYUV420P(char* inputBuff, unsigned char** pixBuff);
	long BuffToAVPixFmtRGBA(char* inputBuff, unsigned char** pixBuff);
	long BuffToAVPixFmtRGB24(char* inputBuff, unsigned char** pixBuff);
	long BuffToAVPixFmtNV12(char* inputBuff, unsigned char** pixBuff);
	long BuffToAVPixFmtNV21(char* inputBuff, unsigned char** pixBuff);
	long BuffToAVPixFmtYUV422P(char* inputBuff, unsigned char** pixBuff);

	long AVPixFmtYUV420PToBuff(unsigned char** pixBuff, char* outputBuff);
	long AVPixFmtNV12ToBuff(unsigned char** pixBuff, char* outputBuff);
	long AVPixFmtNV21ToBuff(unsigned char** pixBuff, char* outputBuff);
	long AVPixFmtYUV422PToBuff(unsigned char** pixBuff, char* outputBuff);
	long AVPixFmtRGB24ToBuff(unsigned char** pixBuff, char* outputBuff);
	long AVPixFmtRGBAToBuff(unsigned char** pixBuff, char* outputBuff);

private:
	SwsContext* m_imgConvertCtx;
	uint8_t* m_srcPointers[4]{ nullptr,nullptr,nullptr,nullptr };
	int m_srcLinesizes[4]{0,0,0,0};
	uint8_t* m_dstPointers[4]{ nullptr,nullptr,nullptr,nullptr };
	int m_dstLinesizes[4]{ 0,0,0,0 };

	int m_srcHigh;
	int m_srcWide;
	int m_dstHigh;
	int m_dstWide;
	std::function < long(char* inputBuff, unsigned char** pixBuff) > m_infun;
	std::function < long(unsigned char** pixBuff, char* outputBuff) > m_outfun;
	std::map<FFmpegAVPixelFormat, AVPixelFormat>			m_PixelFormatMap;
	std::map<FFmpegAVPixelFormat,
		std::function < long(
			char* inputBuff,
			unsigned char** pixBuff) >>					    m_srcFormatFunMap;
	std::map<FFmpegAVPixelFormat,
		std::function < long(
			unsigned char** pixBuff,
			char* outputBuff) >>						    m_dstFormatFunMap;
};
#endif//COLOR_CONVERSION_FFMPEG_H

Source file: ColorConversionFFmpeg.cpp

#include "ColorConversionFFmpeg.h"

ColorConversionFFmpeg::ColorConversionFFmpeg()
	: m_imgConvertCtx(nullptr)
	, m_infun(nullptr)
	, m_outfun(nullptr)
	, m_srcHigh(0)
	, m_srcWide(0)
	, m_dstHigh(0)
	, m_dstWide(0)
{

	m_PixelFormatMap.insert(std::pair<FFmpegAVPixelFormat, AVPixelFormat>(FFMPEG_AV_PIX_FMT_NV12, AV_PIX_FMT_NV12));
	m_PixelFormatMap.insert(std::pair<FFmpegAVPixelFormat, AVPixelFormat>(FFMPEG_AV_PIX_FMT_NV21, AV_PIX_FMT_NV21));
	m_PixelFormatMap.insert(std::pair<FFmpegAVPixelFormat, AVPixelFormat>(FFMPEG_AV_PIX_FMT_YUV420P, AV_PIX_FMT_YUV420P));
	m_PixelFormatMap.insert(std::pair<FFmpegAVPixelFormat, AVPixelFormat>(FFMPEG_AV_PIX_FMT_YUV422P, AV_PIX_FMT_YUV422P));
	m_PixelFormatMap.insert(std::pair<FFmpegAVPixelFormat, AVPixelFormat>(FFMPEG_AV_PIX_FMT_RGB24, AV_PIX_FMT_RGB24));
	m_PixelFormatMap.insert(std::pair<FFmpegAVPixelFormat, AVPixelFormat>(FFMPEG_AV_PIX_FMT_RGBA, AV_PIX_FMT_RGBA));

	m_srcFormatFunMap[FFMPEG_AV_PIX_FMT_NV12] = std::bind(&ColorConversionFFmpeg::BuffToAVPixFmtNV12,
		this,
		std::placeholders::_1,
		std::placeholders::_2);
	m_srcFormatFunMap[FFMPEG_AV_PIX_FMT_NV21] = std::bind(&ColorConversionFFmpeg::BuffToAVPixFmtNV21,
		this,
		std::placeholders::_1,
		std::placeholders::_2);
	m_srcFormatFunMap[FFMPEG_AV_PIX_FMT_YUV420P] = std::bind(&ColorConversionFFmpeg::BuffToAVPixFmtYUV420P,
		this,
		std::placeholders::_1,
		std::placeholders::_2);
	m_srcFormatFunMap[FFMPEG_AV_PIX_FMT_YUV422P] = std::bind(&ColorConversionFFmpeg::BuffToAVPixFmtYUV422P,
		this,
		std::placeholders::_1,
		std::placeholders::_2);
	m_srcFormatFunMap[FFMPEG_AV_PIX_FMT_RGB24] = std::bind(&ColorConversionFFmpeg::BuffToAVPixFmtRGB24,
		this,
		std::placeholders::_1,
		std::placeholders::_2);
	m_srcFormatFunMap[FFMPEG_AV_PIX_FMT_RGBA] = std::bind(&ColorConversionFFmpeg::BuffToAVPixFmtRGBA,
		this,
		std::placeholders::_1,
		std::placeholders::_2);


	m_dstFormatFunMap[FFMPEG_AV_PIX_FMT_NV12] = std::bind(&ColorConversionFFmpeg::AVPixFmtNV12ToBuff,
		this,
		std::placeholders::_1,
		std::placeholders::_2);
	m_dstFormatFunMap[FFMPEG_AV_PIX_FMT_NV21] = std::bind(&ColorConversionFFmpeg::AVPixFmtNV21ToBuff,
		this,
		std::placeholders::_1,
		std::placeholders::_2);
	m_dstFormatFunMap[FFMPEG_AV_PIX_FMT_YUV420P] = std::bind(&ColorConversionFFmpeg::AVPixFmtYUV420PToBuff,
		this,
		std::placeholders::_1,
		std::placeholders::_2);
	m_dstFormatFunMap[FFMPEG_AV_PIX_FMT_YUV422P] = std::bind(&ColorConversionFFmpeg::AVPixFmtYUV422PToBuff,
		this,
		std::placeholders::_1,
		std::placeholders::_2);
	m_dstFormatFunMap[FFMPEG_AV_PIX_FMT_RGB24] = std::bind(&ColorConversionFFmpeg::AVPixFmtRGB24ToBuff,
		this,
		std::placeholders::_1,
		std::placeholders::_2);
	m_dstFormatFunMap[FFMPEG_AV_PIX_FMT_RGBA] = std::bind(&ColorConversionFFmpeg::AVPixFmtRGBAToBuff,
		this,
		std::placeholders::_1,
		std::placeholders::_2);

}

ColorConversionFFmpeg::~ColorConversionFFmpeg()
{
	m_PixelFormatMap.clear();
	m_srcFormatFunMap.clear();
	m_dstFormatFunMap.clear();

}

long ColorConversionFFmpeg::Init(FFmpegSwscaleConfig* cfg)
{
	if (nullptr == cfg)
	{
		return -1;
	}
	auto srcIter = m_PixelFormatMap.find(cfg->srcFormat);
	auto dstIter = m_PixelFormatMap.find(cfg->dstFormat);
	if (srcIter == m_PixelFormatMap.end() ||
		dstIter == m_PixelFormatMap.end())
	{
		return -2;
	}
	auto srcFormatFunIter = m_srcFormatFunMap.find(cfg->srcFormat);
	auto dstFormatFunIter = m_dstFormatFunMap.find(cfg->dstFormat);
	if (dstFormatFunIter == m_dstFormatFunMap.end() ||
		srcFormatFunIter == m_srcFormatFunMap.end())
	{
		return -3;
	}

	m_infun = srcFormatFunIter->second;
	m_outfun = dstFormatFunIter->second;

	int nSrctBuffLen = 0, nDstBuffLen = 0;

	nSrctBuffLen = av_image_alloc(m_srcPointers, m_srcLinesizes, cfg->srcWide, cfg->srcHigh, srcIter->second, 1);
	if (nSrctBuffLen <= 0)
	{
		return -4;
	}
	nDstBuffLen = av_image_alloc(m_dstPointers, m_dstLinesizes, cfg->dstWide, cfg->dstHigh, dstIter->second, 1);
	if (nDstBuffLen <= 0 )
	{
		av_freep(&m_srcPointers[0]);
		return -5;
	}

	m_imgConvertCtx = sws_getContext(cfg->srcWide, cfg->srcHigh, srcIter->second,
		cfg->dstWide, cfg->dstHigh, dstIter->second, SWS_BICUBIC, NULL, NULL, NULL);
	
	if (nullptr == m_imgConvertCtx)
	{
		av_freep(&m_srcPointers[0]);
		av_freep(&m_dstPointers[0]);
		return -6;
	}

	m_srcHigh = cfg->srcHigh;
	m_srcWide = cfg->srcWide;
	m_dstHigh = cfg->dstHigh;
	m_dstWide = cfg->dstWide;

	return 0;
}

long ColorConversionFFmpeg::Conversion(const char* inputBuff, char* outputBuff)
{
	if (nullptr == m_infun ||
		nullptr == m_outfun ||
		nullptr == m_dstPointers[0] ||
		nullptr == m_srcPointers[0] ||
		nullptr == m_imgConvertCtx)
	{
		return -1;
	}
	
	m_infun(const_cast<char*>(inputBuff), m_srcPointers);

	sws_scale(m_imgConvertCtx, m_srcPointers, m_srcLinesizes, 0, m_srcHigh, m_dstPointers, m_dstLinesizes);

	m_outfun(m_dstPointers, outputBuff);
	return 0;
}

long ColorConversionFFmpeg::UnInit()
{
	if (nullptr != m_srcPointers[0])
	{
		av_freep(&m_srcPointers[0]);
	}
	if (nullptr != m_dstPointers[0])
	{
		av_freep(&m_dstPointers[0]);
	}

	if (nullptr != m_imgConvertCtx)
	{
		sws_freeContext(m_imgConvertCtx);
	}

	m_imgConvertCtx = nullptr;
	m_outfun = nullptr;
	m_infun = nullptr;

	return 0;
}

long ColorConversionFFmpeg::BuffToAVPixFmtYUV420P(char* inputBuff, unsigned char** pixBuff)
{
	memcpy(pixBuff[0], inputBuff, static_cast<size_t>(m_srcWide * m_srcHigh));											//Y
	memcpy(pixBuff[1], inputBuff + m_srcWide * m_srcHigh, m_srcWide * m_srcHigh / 4);				//U
	memcpy(pixBuff[2], inputBuff + m_srcWide * m_srcHigh * 5 / 4, m_srcWide * m_srcHigh / 4);		//V
	return 0;
}

long ColorConversionFFmpeg::BuffToAVPixFmtRGBA(char* inputBuff, unsigned char** pixBuff)
{
	memcpy(pixBuff[0], inputBuff, m_srcWide * m_srcHigh*4);
	return 0;
}

long ColorConversionFFmpeg::BuffToAVPixFmtRGB24(char* inputBuff, unsigned char** pixBuff)
{
	memcpy(pixBuff[0], inputBuff, m_srcWide * m_srcHigh * 3);
	return 0;
}

long ColorConversionFFmpeg::BuffToAVPixFmtNV12(char* inputBuff, unsigned char** pixBuff)
{
	memcpy(pixBuff[0], inputBuff, m_srcHigh * m_srcWide);                    //Y
	memcpy(pixBuff[1], inputBuff + m_srcHigh * m_srcWide, m_srcHigh * m_srcWide / 2);      //interleaved UV
	return 0;
}

long ColorConversionFFmpeg::BuffToAVPixFmtNV21(char* inputBuff, unsigned char** pixBuff)
{
	memcpy(pixBuff[0], inputBuff, m_srcHigh * m_srcWide);                    //Y
	memcpy(pixBuff[1], inputBuff + m_srcHigh * m_srcWide, m_srcHigh * m_srcWide / 2);      //interleaved VU
	return 0;
}

long ColorConversionFFmpeg::BuffToAVPixFmtYUV422P(char* inputBuff, unsigned char** pixBuff)
{
	memcpy(pixBuff[0], inputBuff, m_srcWide * m_srcHigh);											//Y
	memcpy(pixBuff[1], inputBuff + m_srcWide * m_srcHigh, m_srcWide * m_srcHigh / 2);				//U
	memcpy(pixBuff[2], inputBuff + m_srcWide * m_srcHigh * 3 / 2, m_srcWide * m_srcHigh / 2);		//V
	return 0;
}

long ColorConversionFFmpeg::AVPixFmtYUV420PToBuff(unsigned char** pixBuff, char* outputBuff)
{
	memcpy(outputBuff, pixBuff[0], m_dstWide * m_dstHigh);											//Y
	memcpy(outputBuff + m_dstWide * m_dstHigh, pixBuff[1], m_dstWide * m_dstHigh / 4);				//U
	memcpy(outputBuff + m_dstWide * m_dstHigh * 5 / 4, pixBuff[2], m_dstWide * m_dstHigh / 4);		//V
	return 0;
}

long ColorConversionFFmpeg::AVPixFmtNV12ToBuff(unsigned char** pixBuff, char* outputBuff)
{
	memcpy(outputBuff, pixBuff[0], m_dstHigh * m_dstWide);                    //Y
	memcpy(outputBuff + m_dstHigh * m_dstWide, pixBuff[1], m_dstHigh * m_dstWide / 2);      //interleaved UV
	return 0;
}

long ColorConversionFFmpeg::AVPixFmtNV21ToBuff(unsigned char** pixBuff, char* outputBuff)
{
	memcpy(outputBuff, pixBuff[0], m_dstHigh * m_dstWide);                    //Y
	memcpy(outputBuff + m_dstHigh * m_dstWide, pixBuff[1], m_dstHigh * m_dstWide / 2);      //interleaved VU
	return 0;
}

long ColorConversionFFmpeg::AVPixFmtYUV422PToBuff(unsigned char** pixBuff, char* outputBuff)
{
	memcpy(outputBuff, pixBuff[0], m_dstWide * m_dstHigh);											//Y
	memcpy(outputBuff + m_dstWide * m_dstHigh, pixBuff[1], m_dstWide * m_dstHigh / 2);				//U
	memcpy(outputBuff + m_dstWide * m_dstHigh * 3 / 2, pixBuff[2], m_dstWide * m_dstHigh / 2);		//V
	return 0;
}

long ColorConversionFFmpeg::AVPixFmtRGB24ToBuff(unsigned char** pixBuff, char* outputBuff)
{
	memcpy(outputBuff, pixBuff[0], m_dstWide * m_dstHigh * 3);
	return 0;
}

long ColorConversionFFmpeg::AVPixFmtRGBAToBuff(unsigned char** pixBuff, char* outputBuff)
{
	memcpy(outputBuff, pixBuff[0], m_dstWide * m_dstHigh * 4);
	return 0;
}

Test file: main.cpp

/**
* Test program for FFmpeg color space conversion
* YUV Transformation
*
* 梁启东 qidong.liang
* 18088708700@163.com
* https://blog.csdn.net/u011645307
*
*
* Exercises the ColorConversionFFmpeg class on raw YUV/RGB files.
*/

#include <iostream>
#include "ColorConversionFFmpeg.h"


#define NV12_To_I420	0
#define I420_To_NV12	0
#define NV21_To_I420	0
#define I420_To_NV21	0
#define I420_To_RGB32	0
#define RGB32_To_I420	0
#define I420_To_RGB24	0
#define RGB24_To_I420	0
#define NV12_To_YUV422P	0
#define YUV422P_To_NV12	1
int main()
{
	FILE* file_in = nullptr;
	FILE* file_out = nullptr;
	char* input_name = nullptr;
	char* output_name = nullptr;

	int w = 0, h = 0;
	float flotScale = 0;
	int out_w = 0, out_h = 0;
	float out_flotScale = 0;

	FFmpegSwscaleConfig cfg;
	ColorConversionFFmpeg obj;

#if NV12_To_YUV422P
	input_name = const_cast<char*>("../in/nv21_480x272.yuv");
	output_name = const_cast<char*>("../out/yuvv422p_480x272.yuv");

	cfg.srcWide = 480;
	cfg.dstWide = 480;
	cfg.dstHigh = 272;
	cfg.srcHigh = 272;
	cfg.srcFormat = FFMPEG_AV_PIX_FMT_NV12;
	cfg.dstFormat = FFMPEG_AV_PIX_FMT_YUV422P;

	w = 480;
	h = 272;
	flotScale = 1.5;
	out_w = 480;
	out_h = 272;
	out_flotScale = 2;

#endif

#if YUV422P_To_NV12

	input_name = const_cast<char*>("../in/YV16(422)_480x272.yuv");
	output_name = const_cast<char*>("../out/nv21_480x272.yuv");
	cfg.srcWide = 480;
	cfg.dstWide = 480;
	cfg.dstHigh = 272;
	cfg.srcHigh = 272;
	cfg.srcFormat = FFMPEG_AV_PIX_FMT_YUV422P;
	cfg.dstFormat = FFMPEG_AV_PIX_FMT_NV12;

	w = 480;
	h = 272;
	flotScale = 2;
	out_w = 480;
	out_h = 272;
	out_flotScale = 1.5;

#endif

#if NV21_To_I420
	input_name = const_cast<char*>("../in/nv21_480x272.yuv");
	output_name = const_cast<char*>("../out/I420_480x272.yuv");

	cfg.srcWide = 480;
	cfg.dstWide = 480;
	cfg.dstHigh = 272;
	cfg.srcHigh = 272;
	cfg.srcFormat = FFMPEG_AV_PIX_FMT_NV21;
	cfg.dstFormat = FFMPEG_AV_PIX_FMT_YUV420P;

	w = 480;
	h = 272;
	flotScale = 1.5;
	out_w = 480;
	out_h = 272;
	out_flotScale = 1.5;

#endif

#if I420_To_NV21

	input_name = const_cast<char*>("../in/I420_480x272.yuv");
	output_name = const_cast<char*>("../out/nv21_480x272.yuv");
	cfg.srcWide = 480;
	cfg.dstWide = 480;
	cfg.dstHigh = 272;
	cfg.srcHigh = 272;
	cfg.srcFormat = FFMPEG_AV_PIX_FMT_YUV420P;
	cfg.dstFormat = FFMPEG_AV_PIX_FMT_NV21;

	w = 480;
	h = 272;
	flotScale = 1.5;
	out_w = 480;
	out_h = 272;
	out_flotScale = 1.5;

#endif

#if NV12_To_I420
	input_name = const_cast<char*>("../in/nv12_480x272.yuv");
	output_name = const_cast<char*>("../out/I420_480x272.yuv");

	cfg.srcWide = 480;
	cfg.dstWide = 480;
	cfg.dstHigh = 272;
	cfg.srcHigh = 272;
	cfg.srcFormat = FFMPEG_AV_PIX_FMT_NV12;
	cfg.dstFormat = FFMPEG_AV_PIX_FMT_YUV420P;

	w = 480;
	h = 272;
	flotScale = 1.5;
	out_w = 480;
	out_h = 272;
	out_flotScale = 1.5;

#endif

#if I420_To_NV12

	input_name = const_cast<char*>("../in/I420_480x272.yuv");
	output_name = const_cast<char*>("../out/nv12_480x272.yuv");
	cfg.srcWide = 480;
	cfg.dstWide = 480;
	cfg.dstHigh = 272;
	cfg.srcHigh = 272;
	cfg.srcFormat = FFMPEG_AV_PIX_FMT_YUV420P;
	cfg.dstFormat = FFMPEG_AV_PIX_FMT_NV12;

	w = 480;
	h = 272;
	flotScale = 1.5;
	out_w = 480;
	out_h = 272;
	out_flotScale = 1.5;

#endif

#if I420_To_RGB24
	input_name = const_cast<char*>("../in/I420_480x272.yuv");
	output_name = const_cast<char*>("../out/rgb_480x272.rgb");

	w = 480;
	h = 272;
	flotScale = 1.5;
	out_w = 480;
	out_h = 272;
	out_flotScale = 3;

	cfg.srcWide = w;
	cfg.dstWide = out_w;
	cfg.dstHigh = out_h;
	cfg.srcHigh = h;
	cfg.srcFormat = FFMPEG_AV_PIX_FMT_YUV420P;
	cfg.dstFormat = FFMPEG_AV_PIX_FMT_RGB24;


#endif

#if RGB24_To_I420
	input_name = const_cast<char*>("../in/rgb_480x272.rgb");
	output_name = const_cast<char*>("../out/I420_480x272.yuv");

	w = 480;
	h = 272;
	flotScale = 3;
	out_w = 480;
	out_h = 272;
	out_flotScale = 1.5;

	cfg.srcWide = w;
	cfg.dstWide = out_w;
	cfg.dstHigh = out_h;
	cfg.srcHigh = h;
	cfg.dstFormat = FFMPEG_AV_PIX_FMT_YUV420P;
	cfg.srcFormat = FFMPEG_AV_PIX_FMT_RGB24;

#endif

#if I420_To_RGB32
	input_name = const_cast<char*>("../in/I420_480x272.yuv");
	output_name = const_cast<char*>("../out/rgba_480x272.rgb");

	w = 480;
	h = 272;
	flotScale = 1.5;
	out_w = 480;
	out_h = 272;
	out_flotScale = 4;

	cfg.srcWide = w;
	cfg.dstWide = out_w;
	cfg.dstHigh = out_h;
	cfg.srcHigh = h;
	cfg.srcFormat = FFMPEG_AV_PIX_FMT_YUV420P;
	cfg.dstFormat = FFMPEG_AV_PIX_FMT_RGBA;

#endif

#if RGB32_To_I420
	input_name = const_cast<char*>("../in/rgba_480x272.rgb");
	output_name = const_cast<char*>("../out/I420_480x272.yuv");

	w = 480;
	h = 272;
	flotScale = 4;
	out_w = 480;
	out_h = 272;
	out_flotScale = 1.5;

	cfg.srcWide = w;
	cfg.dstWide = out_w;
	cfg.dstHigh = out_h;
	cfg.srcHigh = h;
	cfg.dstFormat = FFMPEG_AV_PIX_FMT_YUV420P;
	cfg.srcFormat = FFMPEG_AV_PIX_FMT_RGBA;

#endif

	int in_buff_len = w * h * flotScale;
	int out_buff_len = out_w * out_h * out_flotScale;
	char* inbuff = new char[in_buff_len];
	char* outbuff = new char[out_buff_len];
	fopen_s(&file_in, input_name, "rb+");
	fopen_s(&file_out, output_name, "wb+");
	if (nullptr == file_in || nullptr == file_out)
	{
		printf("failed to open input/output file\n");
		delete[] inbuff;
		delete[] outbuff;
		return -1;
	}


	int ret = obj.Init(&cfg);
	if (0 != ret)
	{
		printf("ColorConversionFFmpeg::Init ret:%d\n", ret);
		fclose(file_in);
		fclose(file_out);
		file_in = nullptr;
		file_out = nullptr;
		delete[] inbuff;
		delete[] outbuff;
		return -1;
	}
	while (true)
	{
		if (fread(inbuff, 1, in_buff_len, file_in) != in_buff_len)
		{
			break;
		}

		ret = obj.Conversion(inbuff, outbuff);
		if (0 != ret)
		{
			printf("ColorConversionFFmpeg::Conversion ret:%d\n", ret);
			continue;
		}
		fwrite(outbuff, 1, out_buff_len, file_out);
	}
	ret = obj.UnInit();
	if (0 != ret)
	{
		printf("ColorConversionFFmpeg::UnInit ret:%d\n", ret);
	}
	fclose(file_in);
	fclose(file_out);
	file_in = nullptr;
	file_out = nullptr;
	delete[] inbuff;
	delete[] outbuff;

	std::cout << "conversion finished\n";
	return 0;
}

Code Location

CSDN: https://download.csdn.net/download/u011645307/21739481?spm=1001.2014.3001.5501

GitHub: https://github.com/liangqidong/ColorConversion.git

