how to enable h264 codecs in spice server? - vdi

I'm new to the spice-server source code, and I want to test SPICE's streaming codecs. I have tried the streaming-video=all option to enable video streaming, but it only uses MJPEG. I know spice-server is able to stream in H.264, but I don't know how to enable it. Any suggestions, please?

// reds.cpp
// Change the default codec preference so 'gstreamer:h264;' comes first.
static const char default_video_codecs[] = "gstreamer:h264;" GSTREAMER_CODECS;
Note: you need to set up the GStreamer libraries first; spice-server checks for them. Your SPICE client must also support gstreamer:h264.
If it does not, spice-server falls back to the default MJPEG codec (for details, see the dcc_create_video_encoder function in video-stream.cpp).
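If recompiling is undesirable, recent spice-server releases also expose a spice_server_set_video_codecs() call in the public API (check your version's spice.h), which accepts the same preference string. That string is a semicolon-separated list of encoder:codec pairs; the hypothetical helper below is just an illustration of that format, not spice-server's actual parser:

```python
def parse_video_codecs(pref: str):
    """Split a spice-style codec preference string like
    'gstreamer:h264;spice:mjpeg;' into ordered (encoder, codec) pairs."""
    pairs = []
    for entry in pref.split(";"):
        entry = entry.strip()
        if not entry:
            continue  # skip the empty segment after a trailing ';'
        encoder, _, codec = entry.partition(":")
        pairs.append((encoder, codec))
    return pairs

# With 'gstreamer:h264;' prepended, h264 is the first preference tried:
print(parse_video_codecs("gstreamer:h264;spice:mjpeg;"))
# → [('gstreamer', 'h264'), ('spice', 'mjpeg')]
```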

Related

Access Haivision SRT statistics from ffmpeg command

I have been successfully using ffmpeg to stream with the Haivision SRT protocol enabled. Now I need to access the SRT statistics described here: https://github.com/Haivision/srt/blob/master/docs/API/statistics.md
Can anyone help me understand how to access these statistics from an ffmpeg command line streaming over SRT?
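ffmpeg's libsrt protocol does not print the full statistics set on its own; a common workaround is to relay the stream through a libsrt tool that can dump statistics to a CSV file and post-process that file. A hedged sketch, assuming column names taken from the linked statistics.md (msRTT, mbpsSendRate, pktSndLoss); the exact CSV layout depends on the tool and version:

```python
import csv
import io

def summarize_srt_stats(csv_text, fields=("msRTT", "mbpsSendRate", "pktSndLoss")):
    """Pull selected SRT statistics columns out of a stats CSV dump.
    Returns one dict per sampling interval; columns not present in the
    dump are silently skipped, since layouts vary between versions."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [{f: row[f] for f in fields if f in row} for row in reader]

# Synthetic sample in lieu of a real dump:
sample = "msRTT,mbpsSendRate,pktSndLoss\n12.5,4.2,0\n13.1,4.0,2\n"
print(summarize_srt_stats(sample))
```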

How to dump raw RTSP stream to FTP?

Is this even possible? I tried using FFmpeg, but it only lets me dump the stream to a file on the server host, where I don't have much free space. Is it somehow possible to dump it to some other place, such as an FTP server?
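FFmpeg does ship an ftp output protocol, so the dump can in principle be written straight to an FTP server without touching local disk. A small sketch that assembles such a command (the URLs and credentials are placeholders):

```python
def build_rtsp_to_ftp_cmd(rtsp_url, ftp_url):
    """Build an ffmpeg argv that copies an RTSP stream to an FTP target
    without re-encoding. -f mpegts is given explicitly because MPEG-TS
    is streamable and does not need a seekable output."""
    return ["ffmpeg", "-i", rtsp_url, "-c", "copy", "-f", "mpegts", ftp_url]

cmd = build_rtsp_to_ftp_cmd("rtsp://camera.local/stream",
                            "ftp://user:pass@ftp.example.com/dumps/out.ts")
print(" ".join(cmd))
```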

how to extract EPG from dvb-t (live tv stream - udp) with ffmpeg?

I want to extract EPG from dvb-t (live tv stream - udp) with ffmpeg.
I have a DVB-T device and I am receiving the streams over UDP.
I have a DVB-T UDP .ts stream generated with mumudvb, and I can extract the EPG guide info in XMLTV format with epgrab: https://github.com/hiroshiyui/epgrab
git clone https://github.com/hiroshiyui/epgrab.git
cd epgrab/
cmake .
make
# Point epgrab to your dvb adapter
./epgrab -i /dev/dvb/adapter0/demux0 > out.xml
Hope this helps!
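The out.xml produced above is standard XMLTV, so it can be post-processed with any XML parser. A minimal sketch using Python's standard library; the channel/start/title layout follows the common XMLTV structure, so adjust if epgrab's output differs:

```python
import xml.etree.ElementTree as ET

def list_programmes(xmltv_text):
    """Return (channel, start, title) tuples from an XMLTV document."""
    root = ET.fromstring(xmltv_text)
    out = []
    for prog in root.iter("programme"):
        title = prog.findtext("title", default="")
        out.append((prog.get("channel"), prog.get("start"), title))
    return out

# Synthetic XMLTV fragment standing in for epgrab's out.xml:
sample = """<tv>
  <programme start="20240101120000 +0000" channel="ch1.example">
    <title>News at Noon</title>
  </programme>
</tv>"""
print(list_programmes(sample))
```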
I wrote a utility called dvbtee that can be used as a c++ library, a cross-platform command line utility, or a node.js module.
The command line utility will parse your streams and output the EPG; depending on the arguments you specify, it can generate plain text or a JSON block of data.
dvbtee: a digital television streamer / parser / service information aggregator supporting various interfaces including telnet CLI & http control
The node.js module will emit events containing the PSIP table data (along with EPG info)
node-dvbtee: MPEG2 transport stream parser for Node.js with support for television broadcast PSIP tables

Using WebHDFS to play video over HTTP

I used ffmpeg + libx264 to convert a file to H.264, then uploaded the file to Hadoop. I used WebHDFS to access the file over HTTP, but it would not play while streaming. If I download the file over HTTP first, it plays fine as HTML5 video.
I found the reason: the converted file was not laid out for progressive playback. Fixing it with the qt-faststart tool made it play. Thanks.
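The fix works because progressive HTTP playback needs the MP4 moov atom at the front of the file; ffmpeg can also do the relocation itself at encode time with -movflags +faststart, avoiding the separate qt-faststart pass. A sketch that builds such a command (the file names are placeholders):

```python
def build_faststart_cmd(src, dst):
    """ffmpeg argv that encodes to H.264 and moves the moov atom to the
    start of the output so HTTP clients can begin playback before the
    whole file has been downloaded."""
    return ["ffmpeg", "-i", src, "-c:v", "libx264",
            "-movflags", "+faststart", dst]

print(" ".join(build_faststart_cmd("in.avi", "out.mp4")))
```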

Calling ffmpeg api from Oracle

I have installed ffmpeg and ffmpeg-devel packages on Linux.
Oracle 11g is installed and running.
The database stores media files, and for better streaming we need to convert them to AVI format.
For ease of integration, we would like to do this conversion in the database.
Now, the simplest option is to write a wrapper for the ffmpeg command line utility, and enable a PLSQL procedure to call this.
However, this would require the following steps:
1. Read the video BLOB.
2. Write it to an OS file.
3. Call the ffmpeg wrapper, giving it the file name from (2) and an output file name.
4. Load the output file from (3) into a BLOB in PL/SQL.
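The wrapper steps above can be sketched on the OS side with a small helper that receives the BLOB bytes, round-trips them through temporary files, and returns the converted bytes. The actual BLOB read/write would happen in PL/SQL or OCI, and the ffmpeg flags here are illustrative only:

```python
import os
import subprocess
import tempfile

def build_convert_cmd(src, dst):
    """The ffmpeg invocation for step 3; ffmpeg picks the AVI muxer
    from the .avi extension of the output path."""
    return ["ffmpeg", "-y", "-i", src, dst]

def convert_blob_to_avi(blob_bytes):
    """Steps 2-4: write BLOB bytes to a temp file, run ffmpeg, and read
    the converted file back so it can be loaded into a BLOB."""
    with tempfile.TemporaryDirectory() as tmp:
        src = os.path.join(tmp, "in.bin")
        dst = os.path.join(tmp, "out.avi")
        with open(src, "wb") as f:
            f.write(blob_bytes)                          # step 2
        subprocess.run(build_convert_cmd(src, dst), check=True)  # step 3
        with open(dst, "rb") as f:
            return f.read()                              # step 4
```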
I would like, if possible, to write a C routine (using the Oracle external library feature) which accepts the input as a BLOB (OCILobLocator), calls the appropriate libavformat functions on the LOB, and writes the result to another LOB (again an OCILobLocator), which the PL/SQL layer then uses as the AVI file.
The other advantage of this is that it avoids the undesirable impact of issuing an OS command from within Oracle.
The problem I have is that the examples given for ffmpeg show the processing of data from files, whereas I need the libraries to process the LOBs.
The alternative is to see if the OrdVideo data type in Oracle does this kind of conversion by using setformat and process.
Interesting challenge. So it sounds like you would prefer not to call the ffmpeg command line utility, but rather leverage the libavformat and libavcodec libraries through native calls within the database. Do I have that right?
I trust that these LOBs/OCILobLocator things expose a C API for reading and writing? If that's the case, then perhaps you can create a new URLProtocol based on that API. URLProtocols are how FFmpeg deals with I/O. Run 'ffmpeg -protocols' to see all the ones implemented. Examine the source of libavformat/file.c for a simple example of what a URLProtocol entails: open, read, write, seek, close, and a few other functions.
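The callback set such a custom I/O layer needs (in current FFmpeg, usually supplied through a custom AVIOContext rather than a new URLProtocol) is essentially open/read/write/seek/close. A shape-only sketch over an in-memory "LOB" using Python's standard library, just to illustrate the contract; the class and its names are hypothetical, not an FFmpeg or Oracle API:

```python
import io

class BlobProtocol:
    """Mimics the open/read/write/seek/close contract a custom I/O layer
    must provide so a demuxer can treat a LOB like a file."""
    def __init__(self, data=b""):
        self._buf = io.BytesIO(data)  # stands in for the opened LOB
    def read(self, n):
        return self._buf.read(n)
    def write(self, data):
        return self._buf.write(data)
    def seek(self, pos, whence=0):
        return self._buf.seek(pos, whence)
    def close(self):
        self._buf.close()

p = BlobProtocol(b"RIFF....AVI ")
print(p.read(4))  # → b'RIFF'
```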
