Sending live video to an HTTP server using ffmpeg - Windows

I want to stream a video to an HTTP server. I can do this easily with the ffmpeg exe using the command below:
ffmpeg -i abc.mp4 -f mpeg1video -vf "scale=640:480" -r 20 https://localhost/XXXX
But I want to achieve the same result using the ffmpeg libraries instead of the exe. I can already play/record video using the ffmpeg libraries.

Related

FFmpeg CLI - swap RTMP source using ZMQ (zmqsend)

My setup is as follows:
Nginx with the RTMP module
Multiple RTMP stream pairs, each one with a primary and backup RTMP endpoint (so streaming to rtmp://localhost/main/$STREAM_NAME and rtmp://localhost/backup/$STREAM_NAME)
Using the Nginx RTMP module exec_publish and exec_publish_done hooks, I push either main or backup to an FFmpeg CLI proc that restreams it to a remote RTMP endpoint (Wowza server in this case, though it's not very relevant to my question)
My problem is that currently, if the main stream is stopped, I have to stop the FFmpeg CLI process that restreams to Wowza and start another with a new input source (the backup stream). This often causes issues on the Wowza side so I'm looking for a way to avoid that.
After some research, I found that FFmpeg has ZMQ support, but the documentation seems very sparse. Is it possible to send a message to the running FFmpeg process to tell it that it must switch its source to a different RTMP stream?
Thanks a lot,
In case it's of interest to anyone, I solved my problem in a different way.
I now use named pipes, like so:
PIPE_FILE=/path/to/pipe/file
mkfifo $PIPE_FILE
exec 7<>$PIPE_FILE
ffmpeg -nostdin -i /path/to/source -acodec copy -vcodec copy -vbsf h264_mp4toannexb -f mpegts pipe:1 > $PIPE_FILE
/path/to/source can be a media file on the FS or an RTMP stream, for that matter.
I then re-stream from the pipe to the final RTMP endpoint:
ffmpeg -re -i $PIPE_FILE -c:v libx264 -preset veryfast -r 25 -g 50 -f flv $RTMP_ENDPOINT
When $PIPE_FILE stops receiving data (i.e. when streaming stops or, in the case of a local media file, when EOF is reached), I immediately launch a different FFmpeg CLI proc and feed the pipe data from the backup media file/stream.
That keeps the re-streaming FFmpeg CLI proc continuously up and running.
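For illustration, a rough sketch of the feeder side under these assumptions (stream URLs and paths taken from the question; the exact trigger for the switch-over is up to the setup, e.g. the exec_publish/exec_publish_done hooks rather than a plain loop):
# sketch only: keep the pipe fed, falling back to the backup stream when the primary feeder exits
PIPE_FILE=/path/to/pipe/file
STREAM_NAME=stream1
mkfifo "$PIPE_FILE"
exec 7<>"$PIPE_FILE"   # hold the pipe open so readers/writers don't see EOF between switches
while true; do
  ffmpeg -nostdin -i "rtmp://localhost/main/$STREAM_NAME" -acodec copy -vcodec copy -vbsf h264_mp4toannexb -f mpegts pipe:1 > "$PIPE_FILE"
  # primary feeder exited, immediately feed from the backup
  ffmpeg -nostdin -i "rtmp://localhost/backup/$STREAM_NAME" -acodec copy -vcodec copy -vbsf h264_mp4toannexb -f mpegts pipe:1 > "$PIPE_FILE"
done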
Interesting approach. I've got something similar; instead of the pipe, I'm using another local RTMP destination.
I've got an nginx rtmp setup with 3 apps. One is the main app, another the backup app, and another is the distribute app.
So I send the main stream to the main app from my streaming software.
I have an ffmpeg process running:
ffmpeg -i rtmp://127.0.0.1/main/stream1 -c copy rtmp://127.0.0.1/distribute/stream1
If this process breaks due to the input shutting down, I run a similar command to pull input from the backup:
ffmpeg -i rtmp://127.0.0.1/backup/stream1 -c copy rtmp://127.0.0.1/distribute/stream1
From my distribute app I stream to my external outputs.
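For reference, a minimal nginx-rtmp config sketch for the three applications described above (application names assumed to match the URLs in the commands):
rtmp {
    server {
        listen 1935;
        application main { live on; }
        application backup { live on; }
        application distribute { live on; }
    }
}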
The only issue here is that I get the non-monotonous DTS error after the switch, so I've had to add a few flags when streaming from distribute to my outputs. The command is:
ffmpeg -fflags +genpts+igndts+ignidx -avoid_negative_ts make_zero -use_wallclock_as_timestamps 1 -i rtmp://127.0.0.1/distribute/stream1 -c:v libx264 -preset veryfast -r 25 -g 50 -c:a aac -b:a 128k -f flv $RTMP_ENDPOINT
I've also noticed some warnings in the ffmpeg process when I switch if the main and backup streams come in with different H.264 profiles, say one on High and the other on Baseline or Main.

Is it possible to create a video stream (mpeg or mjpeg) from jpeg images that are continuously created?

I need to create a screen-grab video stream of the display of a low-powered embedded device. It does not have the capacity to run a desktop sharing service like VNC, but it can provide 2-3 screenshots every second through an API to a separate HTTP client running elsewhere.
Is there a way I can create a video stream from the images retrieved by calling the screenshot API continuously?
You can use ffmpeg to create a real-time video stream from continuously created JPEG files; it will produce MPEG-format video from the images.
FFmpeg is a software project for working with video, audio and other multimedia files. You can use the ffmpeg libraries or the command-line exe application.
If the images are stored locally on the computer, you can feed them directly at a frame rate of 2 or 3 fps to create the video. For example, you can use the following ffmpeg command to create a video file from multiple images:
ffmpeg -framerate 24 -i %d.jpg output.mp4
In the above command, -i specifies the input path pattern and the command produces the output.mp4 file. Similarly, you can use the following command to create a real-time MPEG-TS UDP stream:
ffmpeg -loop 1 -i %d.jpg -r 10 -vcodec mpeg4 -f mpegts udp://127.0.0.1:1234
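If the images are not on disk but come from an HTTP API, as in the question, one option is to pipe the fetched JPEGs straight into ffmpeg. A rough sketch, assuming a hypothetical screenshot endpoint on the device:
# fetch roughly 2 screenshots per second and feed them to ffmpeg as a concatenated JPEG stream
while true; do curl -s http://device-address/api/screenshot.jpg; sleep 0.5; done | ffmpeg -f image2pipe -c:v mjpeg -framerate 2 -i - -c:v mpeg1video -f mpegts udp://127.0.0.1:1234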

In libav "rtmp://url live=1" is not working as in FFmpeg

I am trying to play a live RTMP stream, streamed via a Red5 server.
The command used is ffmpeg -i "rtmp://IP/live/1234 live=1" -f flv rtmp://IP/live/1234_56
The above command works well for live streaming with ffmpeg on Windows. But I am unable to stream the live RTMP video when the command below is run with libav's avconv tool on Ubuntu:
avconv -i "rtmp://IP/live/1234 live=1" -f flv rtmp://IP/live/1234_56
I even tried the rtmp_live AVOption to play the live stream, in place of the live=1 parameter, as given below:
avconv -i "rtmp://IP/live/1234 rtmp_live" -f flv rtmp://IP/live/1234_56
But this command doesn't work either. Please tell me how to use rtmp_live or live=1 with the avconv tool.
Thanks in advance.
Have you tried
ffmpeg -i "rtmp://IP/live/1234?live=1" -f flv rtmp://IP/live/1234_56
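If your libav build uses the native RTMP implementation rather than librtmp, rtmp_live is a protocol option that goes before the input rather than inside the URL. A sketch (not verified against your setup):
avconv -rtmp_live live -i rtmp://IP/live/1234 -f flv rtmp://IP/live/1234_56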

Wowza error: Failed to play myStream; stream not found.

I am using ffmpeg to encode a video which will then be restreamed using Wowza. I am new to streaming. First I started Wowza using the command:
/etc/init.d/WowzaMediaServer start
After that I started streaming an MP4 file using the RTSP protocol. I used the command:
ffmpeg -re -i /usr/local/WowzaMediaServer/content/sample.mp4 -acodec libfdk_aac -vcodec libx264 -f rtsp -muxdelay 0.1 rtsp://myuser:mypassword@127.0.0.1:1935/live/myStream.sdp
The video starts streaming. Before all of this I changed admin.password and added the username myuser with the password mypassword. When I run the above command it streams, but after that the instructions say to go to
WowzaMediaServer/examples/LiveVideoStreaming/FlashRTMPPlayer/Player.html
and fill the Server field with rtmp://localhost:1935/live
and the Stream field with myStream.
When I click Connect, it gives me the status
"Failed to play myStream; stream not found."
I am following this article: http://www.wowza.com/forums/content.php?354-How-to-set-up-live-streaming-using-an-RTSP-RTP-based-encoder
I don't know where I am going wrong and I am unable to figure it out. I am not getting satisfactory answers from the Wowza support team, so if someone works with Wowza, please help me. Why am I not able to connect my video stream to Wowza? Please respond, I am badly stuck.
So it appears there are some basic issues with the RTSP from ffmpeg, and then the published stream name doesn't match the play request.
You have
ffmpeg -re -i /usr/local/WowzaMediaServer/content/sample.mp4 -acodec libfdk_aac -vcodec libx264 -f rtsp -muxdelay 0.1 rtsp://myuser:mypassword@127.0.0.1:1935/live/myStream.sdp
You need to make sure your ffmpeg build has the libx264 and libfdk_aac encoders available. You should be able to determine this with just
ffmpeg
and it should print out the libraries available.
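For example, on a reasonably recent build you can grep the encoder list directly (assuming -hide_banner and -encoders are available in your version):
ffmpeg -hide_banner -encoders | grep -E "libx264|libfdk_aac"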
If you have all the libraries then you are publishing a stream called
myStream.sdp
You then have instructions that say
and fill server with rtmp://localhost:1935/live
and Stream field with myStream
So you should either change your ffmpeg command to
ffmpeg -re -i /usr/local/WowzaMediaServer/content/sample.mp4 -acodec libfdk_aac -vcodec libx264 -f rtsp -muxdelay 0.1 rtsp://myuser:mypassword@127.0.0.1:1935/live/myStream
Note there is no longer a .sdp in the stream name. Alternatively, use a Stream field in the player of
myStream.sdp
When publishing a stream and then attempting to play it back, the names must match; otherwise you get Stream Not Found.
One way to successfully do this is to specify only the server and a port number (65000 in this example, making sure it isn't 1935) in your ffmpeg command, then create a mystream.stream file in the content directory of your Wowza server with ONLY the following line:
udp://0.0.0.0:65000
Then, in Wowza/conf/startupstreams.xml, add the following:
<!-- Native RTP example (SDP file is myStream.sdp) -->
<StartupStream>
    <Application>live/_definst_</Application>
    <MediaCasterType>rtp</MediaCasterType>
    <StreamName>mystream.stream</StreamName>
</StartupStream>
Restart Wowza and ffmpeg, and then retry your URL with the stream name mystream.stream.
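For reference, the ffmpeg side of that setup would then push MPEG-TS over UDP to the port named in the .stream file, along these lines (a sketch using the paths and codecs from the question, and assuming ffmpeg runs on the same host as Wowza):
ffmpeg -re -i /usr/local/WowzaMediaServer/content/sample.mp4 -acodec libfdk_aac -vcodec libx264 -f mpegts udp://127.0.0.1:65000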

Can ffmpeg process rtmp stream from FMS at all?

ffmpeg -i rtmp:/vid2/recordings -acodec copy -vcodec copy -y captured.flv
or
ffmpeg -i rtmp://localhost/vid2/recordings -acodec copy -vcodec copy -y captured.flv
The above commands only give me this error:
rtmp://localhost/vid2/recordings: no such file or directory
Isn't ffmpeg supposed to be able to handle rtmp streams?
Are you using the Xuggler version of ffmpeg? Here's a tutorial explaining how to obtain and encode rtmp streams with the Xuggler ffmpeg.
http://wiki.xuggle.com/Live_Encoding_Tutorial
No need to use Xuggler's build. Version 0.6 of ffmpeg does support RTMP. However, make sure you compile with
--enable-librtmp
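For example, a rebuild along these lines (a sketch; other configure options omitted, and the librtmp development headers must already be installed):
./configure --enable-librtmp
make && sudo make install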
ffmpeg can capture an RTMP stream. Try it with the port specified explicitly, e.g. 1935:
ffmpeg -i rtmp://localhost:1935/live/newStream
But before doing that, check whether newStream exists. If not, open a new cmd window, change to the ffmpeg/bin folder, and run:
ffmpeg -i sample.avi -f flv rtmp://localhost/live/newStream
Then try running the first command again.
It appears it can (-analyzeduration 0 gets rid of an initial delay):
$ ffplay -analyzeduration 0 -i "rtmp://localhost/live/stream_name live=1"
See http://betterlogic.com/roger/2012/08/ffmpeg-receiving-rtmp-stream-from-flash-media-server/ for some instructions on how to stream to it, as well.
I have the same problem with FFmpeg.
I publish video from FFmpeg to FMS correctly and I can see it in the FMS video player:
ffmpeg -re -i /home/videos/sample.mp4 -f flv rtmp://localhost/live/sample
Now I would like to create a live stream.
For this I use this command with FFmpeg on Linux:
ffmpeg -re -i rtmp://localhost:1935/live/sample -vcodec copy -acodec copy -f flv rtmp://localhost/livepkgr/sample_streamd?adbe-live-event=sample_event
With this syntax I get the same error:
Closing connection: NetStream.Play.StreamNotFound
rtmp://localhost:1935/live/sample: Operation not permitted
