ffmpeg restream rtsp to mjpeg

I have a few IP cameras that stream 720p x264 video over RTSP. The streams are really unreliable when viewed on Android, and they also fail if more than 2 connections are made.
I have an Ubuntu server that I can use to connect and restream as MJPEG or something else. There are tons of different commands out there, but they all seem to involve transcoding the video.
How can I simply restream the live RTSP feed as MJPEG without doing anything to the video itself? There's no audio, so no worries there.

I did something similar recently. I added the following section to /etc/ffserver.conf:
<Feed monitoring1.ffm>
    File /tmp/monitoring1.ffm
    FileMaxSize 50M
    ACL allow 127.0.0.1
</Feed>
<Stream monitoring1.mjpg>
    Feed monitoring1.ffm
    Format mpjpeg
    VideoCodec mjpeg
    VideoFrameRate 22
    VideoBufferSize 80
    VideoSize 720x264
    NoAudio
</Stream>
After that I started the server with:
ffserver
and began streaming with:
ffmpeg -i "rtsp://<ip_camera>:554/user=admin&password=&channel=1&stream=0.sdp" http://localhost:8090/monitoring1.ffm
Adjust the IP camera URL for your setup. You can then access the MJPEG stream at the following address in your browser/player:
http://localhost:8090/monitoring1.mjpg
It works fine for me; I hope it solves your problem.
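As a quick sanity check on the MJPEG output, here is a small sketch (not part of the answer above; the helper name is hypothetical) that counts the JPEG frames inside a captured chunk of the stream by scanning for the JPEG start/end markers, which is how the individual images in an MJPEG stream are delimited:

```python
def count_jpeg_frames(buf: bytes) -> int:
    """Count complete JPEG frames (SOI ... EOI marker pairs) in a byte buffer."""
    count = 0
    pos = 0
    while True:
        soi = buf.find(b"\xff\xd8", pos)   # Start Of Image marker
        if soi == -1:
            break
        eoi = buf.find(b"\xff\xd9", soi)   # End Of Image marker
        if eoi == -1:
            break                          # frame still incomplete
        count += 1
        pos = eoi + 2

    return count

# Example: a buffer holding two dummy JPEG frames
data = b"\xff\xd8payload1\xff\xd9" + b"\xff\xd8payload2\xff\xd9"
print(count_jpeg_frames(data))  # 2
```

Feeding it a few seconds of bytes captured from http://localhost:8090/monitoring1.mjpg should yield roughly VideoFrameRate frames per second if the restream is healthy.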

Related

H264/MP4 live stream from ffmpeg does not work in browser

I cannot play an H264/MP4 stream generated by ffmpeg in Chrome, IE, or Edge. It works only in Firefox.
My testing environment is Windows 10, all updates done, all browsers up to date.
I have a source MJPEG stream that I need to transcode to H264/MP4 and display in the browser in an HTML5 <video> element.
To provide a working example, I use this MJPEG stream: http://200.36.58.250/mjpg/video.mjpg?resolution=320x240. In my real case I have MJPEG input from various sources such as IP cameras.
I use the following command line:
ffmpeg.exe -use_wallclock_as_timestamps 1 -f mjpeg -i "http://200.36.58.250/mjpg/video.mjpg?resolution=320x240" -f mp4 -c:v libx264 -an -preset ultrafast -tune zerolatency -movflags frag_keyframe+empty_moov+faststart -reset_timestamps 1 -vsync 1 -flags global_header -r 15 "tcp://127.0.0.1:5000?listen"
If I try to visualize the output in VLC, I use this link: tcp://127.0.0.1:5000 and it works.
Then I try to visualize the stream in browser, so I put this into a html document:
<video autoplay controls>
<source src="http://127.0.0.1:5000" type="video/mp4">
</video>
If I open the document in Firefox it works just fine.
But it does not work in Chrome, IE, or Edge. The browser seems to connect to the TCP server exposed by ffmpeg, but something happens, because ffmpeg exits after a few seconds.
In ffmpeg console I can see this:
av_interleaved_write_frame(): Unknown error
Error writing trailer of tcp://127.0.0.1:5000?listen: Error number -10053 occurred
If I inspect the video element in Chrome, I can see this error:
Failed to load resource: net::ERR_INVALID_HTTP_RESPONSE
As far as I know, all these browsers should support H264-encoded streams in MP4 containers. If, in the <video> element, I replace the link http://127.0.0.1:5000 with a local link to an MP4/H264-encoded file, it plays just fine in every browser. The problem seems to be specific to live streaming.
Does anyone know why this happens and how it can be solved?
Thank you!
You're just outputting to a TCP socket. That's not HTTP. Browsers speak HTTP... you need to use an HTTP server in this case.
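To illustrate the answer: a browser expects an HTTP response (status line, headers, then a body), not raw fMP4 bytes on a bare TCP socket. The sketch below, with hypothetical helper names, shows the minimal preamble and chunked framing an HTTP wrapper around the ffmpeg output would have to emit; in practice you would put a real HTTP server (or a relay like nginx) in front of ffmpeg rather than hand-rolling this:

```python
def stream_headers() -> bytes:
    """The HTTP response preamble a browser needs before the fMP4 bytes."""
    return (b"HTTP/1.1 200 OK\r\n"
            b"Content-Type: video/mp4\r\n"
            b"Transfer-Encoding: chunked\r\n"   # length is unknown for live streams
            b"Connection: close\r\n"
            b"\r\n")

def chunk(data: bytes) -> bytes:
    """Wrap one piece of the stream as an HTTP/1.1 chunk (hex length + CRLF framing)."""
    return b"%x\r\n%s\r\n" % (len(data), data)

# Each fragment read from ffmpeg's TCP output would be forwarded as one chunk;
# the response ends with a zero-length chunk: b"0\r\n\r\n".
print(chunk(b"abc"))  # b'3\r\nabc\r\n'
```

This is exactly the framing Firefox tolerates being absent but Chrome/IE/Edge do not, hence the net::ERR_INVALID_HTTP_RESPONSE error.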

wowza + live + ffmpeg + hls player, how to create the playlist.m3u8?

I'm trying to set up a Wowza live test server so that I can play HLS from my mobile app. It works without any problem for VOD: I can play it in my app, and I can also see the .m3u8 file if I enter its URI in the browser.
I tried to do the same in live mode (my goal is to test some streaming parameters for live streaming). I used ffmpeg to create the live stream:
ffmpeg -re -i "myInputTestVideo.mp4" -vcodec libx264 -vb 150000 -g 60 -vprofile baseline -level 2.1 -acodec aac -ab 64000 -ar 48000 -ac 2 -vbsf h264_mp4toannexb -strict experimental -f mpegts udp://127.0.0.1:10000
I created a "source file" and connected it to the "Incoming Streams".
I can see in my application's Monitoring / Network tab that it is getting the data from ffmpeg.
My problem is how to get the playlist.m3u8 file so I can play it from inside my (HLS-based) app.
Again, for now I just need a way to experiment with the streaming settings; in production I'll have a real live streaming source.
If I understand your issue correctly, and since you said that it works for you with a VOD and its own m3u8 URI, you seem not to know how to construct an m3u8 URI for live sources referenced by a stream file (not a source file, as you incorrectly wrote).
Assuming you named your stream file, for example, udp.stream (that is, the file containing the udp://127.0.0.1:10000 address), simply point your HLS player application to http://{yourwowzaserver}/{yourliveapp}/udp.stream/playlist.m3u8
You could also push it to Wowza as RTSP (much better than UDP) and then stream it onward to wherever you want. To push it to Wowza you will probably need to set up a username and password (Server > Source authentication); the output stream from ffmpeg can then look something like this: rtsp://{user}:{pass}@{yourwowzaserver}/{yourliveapp}/mystream.
In Wowza you will see mystream under Incoming Streams. From there you can access it with the classic http(s)://wowzaip:wowzaport/{yourliveapp}/mystream/playlist.m3u8
Wowza supports both RTSP and UDP, so you could use either directly. If you want transcoding, ffmpeg will be kinder to server resources than Wowza.
What worked: changing the ffmpeg output to -f rtsp rtsp://127.0.0.1:1935/my_app/my.stream.stream and using it as the input in Wowza.
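The URL pattern from the answer can be captured in a tiny helper (a sketch; the function name and the server, application, and stream-file values are illustrative, not from Wowza's API):

```python
def hls_playlist_url(server: str, live_app: str, stream_file: str) -> str:
    """Build the Wowza HLS playlist URI for a live stream file."""
    return "http://{}/{}/{}/playlist.m3u8".format(server, live_app, stream_file)

# Example with placeholder server and application names:
print(hls_playlist_url("mywowza:1935", "live", "udp.stream"))
# http://mywowza:1935/live/udp.stream/playlist.m3u8
```

The key point is that the stream-file name (udp.stream) takes the place a stream name like mystream would occupy when pushing via RTSP.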

Using ffserver to do UDP multicast streaming

Here's the deal. I'm working with IPTV hardware and I need to output a bunch of demo streams. These are MPEG-2 transport streams that need to be straight-up UDP multicast streams. I have an ffmpeg command that works great:
ffmpeg -re -i /Volumes/Data/DemoVideos/GRAILrpsp.ts -acodec copy -vcodec copy -f mpegts udp://239.192.1.82:12000[ttl=1,buffer_size=2097157]
What I would like to do is convert this into an ffserver config file instead of having to start a whole bunch of ffmpeg processes and then figure out how to make them loop. I'm sure I could do it with the right scripting, but what a pain; isn't this what ffserver is for? I can't find any documentation on UDP streaming with ffserver, though. You can set a multicast address and port, but that outputs RTP, which this hardware isn't designed for. Any help would be greatly appreciated.
At the time of this post, according to the ffserver documentation, it doesn't support raw MPEG-TS over UDP:
ffserver receives prerecorded files or FFM streams from some ffmpeg instance as input, then streams them over RTP/RTSP/HTTP.
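Since ffserver can't produce raw MPEG-TS over UDP, one workaround is to script the looping ffmpeg instances directly. A sketch of building such a command per demo file, assuming an ffmpeg build recent enough to have the -stream_loop input option (the file path and multicast address below are the ones from the question, reused as placeholders):

```python
def loop_stream_cmd(src: str, dest: str) -> list:
    """ffmpeg command that loops a TS file forever to a UDP multicast address."""
    return ["ffmpeg",
            "-re",                  # read input at its native frame rate
            "-stream_loop", "-1",   # loop the input indefinitely
            "-i", src,
            "-acodec", "copy",      # no transcoding, as in the original command
            "-vcodec", "copy",
            "-f", "mpegts", dest]

cmd = loop_stream_cmd("/Volumes/Data/DemoVideos/GRAILrpsp.ts",
                      "udp://239.192.1.82:12000?ttl=1&buffer_size=2097157")
print(" ".join(cmd))
```

One such command per demo stream (launched via subprocess.Popen, for instance) replaces both the "whole bunch of ffmpeg streams" and the looping problem in one go.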

Setting up rtsp stream on Windows

I am trying to set up an RTSP stream that can be accessed from an application. I have been experimenting with ffmpeg to achieve that. I have succeeded insofar as I was able to stream from ffmpeg to ffplay, but I could not load the stream in VLC, for example. Here are the calls I made from two different shells on the same machine:
ffmpeg.exe -y -loop 1 -r 24 -i test_1.jpg -vcodec libx264 -tune stillimage -f rtsp rtsp://127.0.0.1:1234/stream.sdp
ffplay.exe -rtsp_flags listen rtsp://127.0.0.1:1234/stream.sdp
Can anybody explain what I would have to do to load the stream as a network stream in VLC? Any help is appreciated.
I have done this before, and I'm not sure what was wrong with ffmpeg's RTSP output. What I can say right now is: please consider using the Live555 library for any streaming scenario, because the ffmpeg code (for the RTP muxer) is not good and is buggy. ffmpeg's other solution for a streaming server, ffserver, prepares an ffmpeg pipe for VLC or another third-party application, and it is badly written and buggy too. The libav group (the other fork of the libav* libraries) never used the ffserver code, and I'm not sure they have any plan to adopt it; they have ffplay (avplay), ffmpeg (avconv), and ffprobe, but not ffserver.
If you want to use Live555, which is really easy, just go to their website (www.live555.com), download the source code, and build the MediaServer application (it is in the 'MediaServer' folder). If you read the code's documentation, I'm sure you won't have any problems. It's a basic RTSP server that streams any (supported) file on your HDD via an RTSP URL on your server.
If you have any problems with the code, just comment here and I can help you more with Live555.

Video streaming fails over rtp protocol

Video streaming between a Unix server (ffmpeg) and a Windows client (VLC) completes without errors.
Server side:
ffmpeg -f v4l2 -r 25 -i /dev/video0 http://192.168.1.114:27018/feed1.ffm
Client side:
vlc player: Media -> Open Network Stream: http://192.168.1.114:27018/test.swf
However, the video stream has a delay of approximately 10 seconds. For this reason, I tried using RTP instead of HTTP, but without success. Specifically, on the server side I ran:
ffmpeg -f v4l2 -r 25 -i /dev/video0 rtp://192.168.1.114:27018/feed1.ffm
After the stream began, on the client side I typed rtp://@:27018, but it doesn't respond.
What am I missing? Is there any other way I could avoid the delay?
Short (incomplete) solution for the problem with the RTP stream:
Setup FFMPEG with the command line:
ffmpeg -f v4l2 -r 25 -i /dev/video0 rtp://<client_ip>:<client_port>
where <client_ip> and <client_port> need to be replaced with the client's IP address and port number, respectively.
Description of the problem with the RTP stream and the solution:
Generally, when setting up an HTTP server (in this case, an HTTP multimedia server), the local port and the local IP address that the server needs to listen on are specified on the server's side. So when you set up FFMPEG to stream on http://192.168.1.114:27018/, it means that FFMPEG (the server) will listen on its interface with the IP 192.168.1.114 on port 27018. The client then connects to http://192.168.1.114:27018 to get the streams.
However, when setting up an RTP FFMPEG server, the client address(es) and port(s) are specified on the server's side, meaning (loosely) that the server sends the packets to the desired addresses, and the clients need to listen on their ports if they want the available streams. So the FFMPEG server needs to be set up with the URL rtp://<client_ip>:<client_port>, not rtp://<server_ip>:<server_port>, for the client to be able to access the stream on its local port <client_port>.
For more info on the FFMPEG's RTP URL format and a starting point for some intriguing concepts in RTP streaming (like multicasting), visit here.
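The direction of addressing described above can be demonstrated with plain UDP sockets (RTP rides on UDP): the receiver binds its local port, and the sender addresses the receiver, not the other way round. A minimal sketch, using the loopback address and an OS-assigned port rather than real camera traffic:

```python
import socket

# Client side: bind a local UDP port and wait for packets
# (this is what rtp://@:<port> does in VLC).
client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.bind(("127.0.0.1", 0))            # 0 = let the OS pick a free port
client.settimeout(2.0)
client_port = client.getsockname()[1]

# Server side: send to the *client's* address and port,
# as ffmpeg's rtp://<client_ip>:<client_port> output does.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.sendto(b"fake-rtp-packet", ("127.0.0.1", client_port))

data, _ = client.recvfrom(2048)
print(data)  # b'fake-rtp-packet'
client.close()
server.close()
```

If the server had instead sent to its own address (the mistake in the question), the client's socket would simply time out, which matches the "doesn't respond" symptom.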