Implementing H264 RTSP Live Streaming with live555 (Windows Edition)


Why label this the "Windows edition"? Because firehood has already implemented the Linux version: H264 RTSP live streaming via live555 (see the reference link at the end).

Related articles:

[1] Compiling live555 with VS2013 (Visual Studio 2013) on Win7 (Windows 7)

[2] RTSP protocol analysis

[3] Windows named pipes

1. Basics

Learning live555 usually starts from the testOnDemandRTSPServer.cpp example in E:\live555\testProgs, which implements the simplest possible RTSP server. The "OnDemand" in the name means the server acts on request: only when a client actively accesses the server via its URL and sends the relevant commands does the server stream the file and push it to that client. The example is based on RTP unicast; for background on unicast, see the earlier article on implementing unicast, multicast and broadcast with jrtplib in Qt.

testOnDemandRTSPServer.cpp illustrates the steps for building an RTSP server. Create a new Win32 console project named h264LiveMediaServer, add a file h264LiveMediaServer.cpp, copy the contents of testOnDemandRTSPServer.cpp into it, and then trim it so that only the parts related to the H.264 session remain, as shown below:


#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

UsageEnvironment* env;

// True: a client that connects later starts playing from wherever the first client has already reached
// False: every client plays the video file from the beginning
Boolean reuseFirstSource = False;

// Prints information about the stream
static void announceStream(RTSPServer* rtspServer, ServerMediaSession* sms,
	char const* streamName, char const* inputFileName); 

int main(int argc, char** argv) 
{
	// Create the task scheduler and initialize the usage environment
	TaskScheduler* scheduler = BasicTaskScheduler::createNew();
	env = BasicUsageEnvironment::createNew(*scheduler);

	UserAuthenticationDatabase* authDB = NULL;

	// Create the RTSP server and start listening for client connections.
	// Note that the port here is not the default 554, so it must be specified in the URL.
	RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554, authDB);
	if (rtspServer == NULL) 
	{
		*env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
		exit(1);
	}

	char const* descriptionString
		= "Session streamed by \"h264LiveMediaServer\"";

	// Stream name (media name)
	char const* streamName = "h264ESVideoTest";
	// File name: when a client requests the stream named h264ESVideoTest, the file actually opened is 480320.264.
	// One caveat: when h264LiveMediaServer is run from the IDE, live555 streams the media found in the project working
	// directory (the directory containing *.vcxproj), so the video file must be placed there. When h264LiveMediaServer.exe
	// is launched by double-clicking, the video file naturally belongs in the same directory as the executable.
	char const* inputFileName = "480320.264"; 
	// When requesting playback, the client supplies the stream name streamName to tell the RTSP server which stream it wants.
	// Create the media session. The mapping from stream name to file name is established by adding subsessions. The media
	// session manages session-level information such as the session description, session duration and stream name.
	// 2nd parameter: media name; 3rd: media info; 4th: media description
	ServerMediaSession* sms = ServerMediaSession::createNew(*env, streamName, streamName, descriptionString);
	// Add the H.264 subsession. The file name passed here names the file that actually gets opened.
	// H264VideoFileServerMediaSubsession derives from FileServerMediaSubsession, which derives from OnDemandServerMediaSubsession,
	// and OnDemandServerMediaSubsession and PassiveServerMediaSubsession both derive from ServerMediaSubsession.
	// Reading the file is implemented along this chain; to turn on-demand playback into live streaming, derive from these classes and override the relevant methods.
	sms->addSubsession(H264VideoFileServerMediaSubsession::createNew(*env, inputFileName, reuseFirstSource));
	// Add the session to the RTSP server
	rtspServer->addServerMediaSession(sms);
	// Print the stream information to standard output
	announceStream(rtspServer, sms, streamName, inputFileName);
    
	// Try to create an HTTP server for RTSP-over-HTTP tunneling.
	if (rtspServer->setUpTunnelingOverHTTP(80) || rtspServer->setUpTunnelingOverHTTP(8000) || rtspServer->setUpTunnelingOverHTTP(8080)) 
	{
		*env << "\n(We use port " << rtspServer->httpServerPortNum() << " for optional RTSP-over-HTTP tunneling.)\n";
	}
	else 
	{
		*env << "\n(RTSP-over-HTTP tunneling is not available.)\n";
	}
	// Enter the event loop. Socket read events and the delayed sending of media data are all handled inside this loop.
	env->taskScheduler().doEventLoop();

	return 0; 
}

static void announceStream(RTSPServer* rtspServer, ServerMediaSession* sms,
	char const* streamName, char const* inputFileName) {
	char* url = rtspServer->rtspURL(sms);
	UsageEnvironment& env = rtspServer->envir();
	env << "\n\"" << streamName << "\" stream, from the file \""
		<< inputFileName << "\"\n";
	env << "Play this stream using the URL \"" << url << "\"\n";
	delete[] url;
}

For how to test, see [1].
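For example, assuming the server runs on the local machine, the stream can be opened in VLC via "Media → Open Network Stream", or with ffplay, using the URL that announceStream() prints (port 8554 and the stream name h264ESVideoTest are the values hard-coded in main() above):

ffplay rtsp://127.0.0.1:8554/h264ESVideoTest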


2. Implementation

In the Linux article referenced above, firehood implemented the hand-off through a FIFO queue, which on Linux is in fact a named pipe. Windows has named pipes as well, so the flow on Windows looks like this:

[Figure: flowchart of the named-pipe based streaming flow on Windows]

For details on Windows named pipes, see [3].

This article does not actually use a named pipe, however. Instead it reads a local H264 file directly, splits it into StartCode+NALU memory blocks, and copies them into the Live555 server. Converting this to the named-pipe form later is then straightforward: the pipe client reads the local H264 file, splits it into StartCode (0x000001 or 0x00000001) + NALU blocks and writes them into the pipe, while the pipe server (inside the Live555 server) reads the pipe data and copies it into Live555, as the sketch below shows.
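Here is a minimal sketch of what that pipe client's write side might look like. It is not from the original article: the pipe name \\.\pipe\h264Pipe and the buffer size are illustrative assumptions, and it reuses the getNextNalu() helper defined later in h264LiveFramedSource.cpp.

#include <windows.h>
#include <cstdio>
#include <cstdlib>

int getNextNalu(FILE* inpf, unsigned char* buf); // defined in h264LiveFramedSource.cpp below

int main()
{
	// Open the pipe; the server side would have created it with CreateNamedPipe
	// and would read the blocks with ReadFile. The pipe name is hypothetical.
	HANDLE hPipe = CreateFileA("\\\\.\\pipe\\h264Pipe", GENERIC_WRITE,
	                           0, NULL, OPEN_EXISTING, 0, NULL);
	if (hPipe == INVALID_HANDLE_VALUE) return 1;

	FILE* fp = fopen("480320.264", "rb");
	unsigned char* buf = (unsigned char*)malloc(100 * 1024);
	int len;
	// Write one StartCode+NALU block per WriteFile call
	while (fp != NULL && !feof(fp) && (len = getNextNalu(fp, buf)) > 0)
	{
		DWORD written = 0;
		if (!WriteFile(hPipe, buf, (DWORD)len, &written, NULL)) break;
	}
	if (fp) fclose(fp);
	free(buf);
	CloseHandle(hPipe);
	return 0;
}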


From the analysis in "Basics", a custom server needs to replace H264VideoFileServerMediaSubsession in sms->addSubsession(H264VideoFileServerMediaSubsession::createNew(*env, inputFileName, reuseFirstSource)) with its own subsession class. In its createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) function, H264VideoFileServerMediaSubsession calls ByteStreamFileSource::createNew(envir(), fFileName), and the frames themselves are fetched in ByteStreamFileSource's doGetNextFrame() function. We therefore derive from H264VideoFileServerMediaSubsession and ByteStreamFileSource, overriding createNewStreamSource and doGetNextFrame respectively.


The code is shown below:

h264LiveFramedSource.hh

#ifndef _H264LIVEFRAMEDSOURCE_HH
#define _H264LIVEFRAMEDSOURCE_HH


#include <ByteStreamFileSource.hh>


class H264LiveFramedSource : public ByteStreamFileSource
{
public:
	static H264LiveFramedSource* createNew(UsageEnvironment& env, unsigned preferredFrameSize = 0, unsigned playTimePerFrame = 0);


protected:
	H264LiveFramedSource(UsageEnvironment& env, unsigned preferredFrameSize, unsigned playTimePerFrame);
	~H264LiveFramedSource();


private:
	// Override the virtual function
	virtual void doGetNextFrame();
};

#endif

h264LiveFramedSource.cpp

#include "h264LiveFramedSource.hh"
#include "GroupsockHelper.hh"
#include "spsdecode.h"

// Returns 1 if buf holds zeros_in_startcode 0x00 bytes followed by 0x01
// (i.e. the tail of a 3- or 4-byte Annex-B start code), 0 otherwise
int findStartCode(unsigned char *buf, int zeros_in_startcode)
{
	int info;
	int i;

	info = 1;
	for (i = 0; i < zeros_in_startcode; i++)
	if (buf[i] != 0)
		info = 0;

	if (buf[i] != 1)
		info = 0;
	return info;
}
// The NALU returned here includes its start code
int getNextNalu(FILE* inpf, unsigned char* buf)
{
	int pos = 0;
	int startCodeFound = 0;
	int info2 = 0;
	int info3 = 0;

	while (!feof(inpf) && (buf[pos++] = fgetc(inpf)) == 0);

	while (!startCodeFound)
	{
		if (feof(inpf))
		{
			return pos - 1;
		}
		buf[pos++] = fgetc(inpf);
		// Check for a 4-byte (00 00 00 01) start code first, then a 3-byte (00 00 01) one
		info3 = findStartCode(&buf[pos - 4], 3);
		if (info3 != 1)
			info2 = findStartCode(&buf[pos - 3], 2);
		startCodeFound = (info2 == 1 || info3 == 1);
	}
	// Rewind the file pointer past the next NALU's start code and return the
	// length of the block just read (its own start code included)
	if (info2)
	{
		fseek(inpf, -3, SEEK_CUR);
		return pos - 3;
	}
	fseek(inpf, -4, SEEK_CUR);
	return pos - 4;
}
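For reference, an Annex-B H.264 elementary stream is just a sequence of NAL units, each preceded by a 3-byte or 4-byte start code, for example:

00 00 00 01 [SPS] 00 00 00 01 [PPS] 00 00 00 01 [IDR slice] 00 00 01 [slice] ...

getNextNalu() therefore scans forward until it meets the next start code, rewinds the file pointer to just before it, and returns the length of the block it has accumulated.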

// State shared by the live source; kept as file-scope globals, following the original article
FILE* inpf;
unsigned char* inBuf;
int inLen;
int nFrameRate;
H264LiveFramedSource::H264LiveFramedSource(UsageEnvironment& env, unsigned preferredFrameSize, unsigned playTimePerFrame)
: ByteStreamFileSource(env, 0, preferredFrameSize, playTimePerFrame)
{
	const char* fname = "480320.264";
	inpf = fopen(fname, "rb");
	inBuf = (unsigned char*)calloc(1024 * 100, sizeof(char));
	inLen = getNextNalu(inpf, inBuf);
	// Read the SPS, assumed to be the first NALU in the file, preceded by a 4-byte start code
	unsigned int nSpsLen = inLen - 4;
	unsigned char *pSps = (unsigned char*)malloc(nSpsLen);
	memcpy(pSps, inBuf + 4, nSpsLen);

	// Decode the SPS to obtain the video width, height and frame rate
	int width = 0, height = 0, fps = 0;

	h264_decode_sps(pSps, nSpsLen, width, height, fps);
	free(pSps);

	// Fall back to 25 fps when the SPS carries no timing information
	nFrameRate = (fps != 0) ? fps : 25;
}

H264LiveFramedSource* H264LiveFramedSource::createNew(UsageEnvironment& env, unsigned preferredFrameSize, unsigned playTimePerFrame)
{
	H264LiveFramedSource* newSource = new H264LiveFramedSource(env, preferredFrameSize, playTimePerFrame);
	return newSource;
}

H264LiveFramedSource::~H264LiveFramedSource()
{
	free(inBuf);
	fclose(inpf);
}

// This function is called when new frame data is available from the device.
// We deliver this data by copying it to the 'downstream' object, using the following parameters (class members):
// 'in' parameters (these should *not* be modified by this function):
//     fTo: The frame data is copied to this address.
//         (Note that the variable "fTo" is *not* modified.  Instead,
//          the frame data is copied to the address pointed to by "fTo".)
//     fMaxSize: This is the maximum number of bytes that can be copied
//         (If the actual frame is larger than this, then it should
//          be truncated, and "fNumTruncatedBytes" set accordingly.)
// 'out' parameters (these are modified by this function):
//     fFrameSize: Should be set to the delivered frame size (<= fMaxSize).
//     fNumTruncatedBytes: Should be set iff the delivered frame would have been
//         bigger than "fMaxSize", in which case it's set to the number of bytes
//         that have been omitted.
//     fPresentationTime: Should be set to the frame's presentation time
//         (seconds, microseconds).  This time must be aligned with 'wall-clock time' - i.e., the time that you would get
//         by calling "gettimeofday()".
//     fDurationInMicroseconds: Should be set to the frame's duration, if known.
//         If, however, the device is a 'live source' (e.g., encoded from a camera or microphone), then we probably don't need
//         to set this variable, because - in this case - data will never arrive 'early'.
void H264LiveFramedSource::doGetNextFrame()
{
	fFrameSize = inLen;
	if (fFrameSize > fMaxSize)
	{
		fNumTruncatedBytes = fFrameSize - fMaxSize;
		fFrameSize = fMaxSize;
	}
	else
	{
		fNumTruncatedBytes = 0;
	}
	memmove(fTo, inBuf, fFrameSize);

	// Pre-fetch the next StartCode+NALU block for the following call
	inLen = getNextNalu(inpf, inBuf);
	gettimeofday(&fPresentationTime, NULL); // presentation timestamp, aligned with wall-clock time
	fDurationInMicroseconds = 1000000 / nFrameRate; // paces delivery at the source frame rate
	// Schedule afterGetting() with zero delay; calling afterGetting(this) directly would also work
	nextTask() = envir().taskScheduler().scheduleDelayedTask(0, (TaskFunc*)FramedSource::afterGetting, this);
}
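A quick sanity check on the pacing: with the 25 fps fallback, fDurationInMicroseconds works out to 1000000 / 25 = 40000 µs, so live555 schedules one NALU delivery every 40 ms. That is what keeps playback at real-time speed instead of draining the whole file as fast as the network allows.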

h264LiveVideoServerMediaSubssion.hh

#ifndef _H264LIVEVIDEOSERVERMEDIASUBSSION_HH
#define _H264LIVEVIDEOSERVERMEDIASUBSSION_HH
#include "H264VideoFileServerMediaSubsession.hh"

class H264LiveVideoServerMediaSubssion : public H264VideoFileServerMediaSubsession {

public:
	static H264LiveVideoServerMediaSubssion* createNew(UsageEnvironment& env, Boolean reuseFirstSource);

protected: 
	H264LiveVideoServerMediaSubssion(UsageEnvironment& env, Boolean reuseFirstSource);
	~H264LiveVideoServerMediaSubssion();

protected: 
	// Override the virtual function
	FramedSource* createNewStreamSource(unsigned clientSessionId, unsigned& estBitrate);
};

#endif

h264LiveVideoServerMediaSubssion.cpp

#include "h264LiveVideoServerMediaSubssion.hh"
#include "h264LiveFramedSource.hh"
#include "H264VideoStreamFramer.hh"

H264LiveVideoServerMediaSubssion* H264LiveVideoServerMediaSubssion::createNew(UsageEnvironment& env, Boolean reuseFirstSource)
{
	return new H264LiveVideoServerMediaSubssion(env, reuseFirstSource);
}

H264LiveVideoServerMediaSubssion::H264LiveVideoServerMediaSubssion(UsageEnvironment& env, Boolean reuseFirstSource)
: H264VideoFileServerMediaSubsession(env, 0, reuseFirstSource)
{

}

H264LiveVideoServerMediaSubssion::~H264LiveVideoServerMediaSubssion()
{
}

FramedSource* H264LiveVideoServerMediaSubssion::createNewStreamSource(unsigned clientSessionId, unsigned& estBitrate)
{
	// Estimated bitrate in kbps; adjust to your needs
	estBitrate = 1000; // kbps
	// Create the live video source
	H264LiveFramedSource* liveSource = H264LiveFramedSource::createNew(envir());
	if (liveSource == NULL)
	{
		return NULL;
	}

	// Create a framer for the video stream
	return H264VideoStreamFramer::createNew(envir(), liveSource);
}
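A note on the design choice here: H264VideoStreamFramer expects an Annex-B byte stream with start codes included, which is exactly what H264LiveFramedSource delivers. If the source instead handed over bare NAL units without start codes, live555's H264VideoStreamDiscreteFramer would be the appropriate class.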

h264LiveMediaServer.cpp also needs matching changes:

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include "h264LiveVideoServerMediaSubssion.hh"

UsageEnvironment* env;

// True: a client that connects later starts playing from wherever the first client has already reached
// False: every client plays the video file from the beginning
Boolean reuseFirstSource = True;

// Prints information about the stream
static void announceStream(RTSPServer* rtspServer, ServerMediaSession* sms, char const* streamName); 

int main(int argc, char** argv) 
{
	// Create the task scheduler and initialize the usage environment
	TaskScheduler* scheduler = BasicTaskScheduler::createNew();
	env = BasicUsageEnvironment::createNew(*scheduler);
	UserAuthenticationDatabase* authDB = NULL;

	// Create the RTSP server and start listening for client connections.
	// Note that the port here is not the default 554, so it must be specified in the URL.
	RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554, authDB);
	if (rtspServer == NULL) 
	{
		*env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
		exit(1);
	}

	char const* descriptionString = "Session streamed by \"h264LiveMediaServer\"";

	// Stream name (media name)
	char const* streamName = "h264ESVideoTest";

	// When requesting playback, the client supplies the stream name streamName to tell the RTSP server which stream it wants.
	// Create the media session. The mapping from stream name to source is established by adding subsessions. The media
	// session manages session-level information such as the session description, session duration and stream name.
	// 2nd parameter: media name; 3rd: media info; 4th: media description
	ServerMediaSession* sms = ServerMediaSession::createNew(*env, streamName, streamName, descriptionString);

	// Use our own H264LiveVideoServerMediaSubssion instead
	sms->addSubsession(H264LiveVideoServerMediaSubssion::createNew(*env, reuseFirstSource));

	// Add the session to the RTSP server
	rtspServer->addServerMediaSession(sms);

	// Print the stream information to standard output
	announceStream(rtspServer, sms, streamName);
    
	// Enter the event loop. Socket read events and the delayed sending of media data are all handled inside this loop.
	env->taskScheduler().doEventLoop();

	return 0; 
}

static void announceStream(RTSPServer* rtspServer, ServerMediaSession* sms,char const* streamName) 
{
	char* url = rtspServer->rtspURL(sms);
	UsageEnvironment& env = rtspServer->envir();
	env << "\n\"" << streamName << "\" stream\"\n";
	env << "Play this stream using the URL \"" << url << "\"\n";
	delete[] url;
}

For spsdecode.h, see the article "H.264 (H264): decoding the SPS to obtain resolution and frame rate".
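Based on the call site in the H264LiveFramedSource constructor above, the expected prototype is presumably along the following lines; the exact return type in that article may differ, so treat this declaration as an inference rather than a quotation.

// Inferred from the call h264_decode_sps(pSps, nSpsLen, width, height, fps);
// parses an SPS NALU (without its start code) and fills in width/height/fps
bool h264_decode_sps(unsigned char* buf, unsigned int nLen, int& width, int& height, int& fps);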

3. Testing

[Figures: test results of playing the live stream]

Reference: firehood, "通过live555实现H264 RTSP直播" (CSDN blog)

Original article: https://blog.csdn.net/caoshangpa/article/details/53200527
