I've got ffmpeg to read some RTSP stream and output image2 format to stdout like so:
ffmpeg -rtsp_transport tcp -i "rtsp:xxxxx" -f image2 -update 1 -
But stdout is not good enough for me: I need to "push" the output to another process that, due to some architecture constraints, cannot be piped to from ffmpeg directly. I am running on Linux, so I was hoping to simulate a TCP/UDP socket via the file system, e.g. /dev/something or similar. Alternatively, maybe it's possible to get ffmpeg to send the image directly to a given TCP/UDP address? This didn't work, though (ffmpeg expects a file output):
ffmpeg -rtsp_transport tcp -i "rtsp:xxxxx" -f image2 -update 1 "udp://localhost:3333"
Any ideas?
Thanks
The normal image2 muxer expects to write to one or more image files. Use the image2pipe muxer.
ffmpeg -rtsp_transport tcp -i "rtsp:xxxxx" -f image2pipe "udp://localhost:3333"
(-update has no relevance when piping).
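To verify that the image2pipe output is actually arriving on the socket, something as simple as netcat can listen on the receiving end. This is a minimal sketch assuming the OpenBSD variant of nc (other variants need `-l -p 3333` instead) and the port from the example above:

```shell
# Listen on UDP port 3333 and dump whatever arrives to a file.
# The JPEG frames arrive concatenated back to back, so a real consumer
# would have to split them on the JPEG SOI/EOI markers itself.
nc -u -l 3333 > frames.mjpeg
```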
Related
I am trying to stream a video called "bbb.mp4" with ffmpeg and visualize it with VLC.
On the OS X Terminal, I do the following:
ffmpeg -re -i bbb.mp4 -an -c:v copy -f mpegts udp://127.0.0.1:2222
I am streaming the video to the IP address 127.0.0.1 on port 2222. Once I run this, the Terminal output shows that it is streaming the video.
But when I try to view it in VLC via File > Open Network... > Network, nothing opens, even though the command is still running in the Terminal.
Am I doing anything wrong?
Try adding pkt_size at the end. For example:
ffmpeg -re -i bbb.mp4 -an -c:v copy -f mpegts udp://127.0.0.1:2222?pkt_size=1316
And of course, in VLC:
udp://127.0.0.1:2222?pkt_size=1316
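For a quick sanity check without VLC, ffplay (shipped with ffmpeg) can read the same stream; pkt_size only matters on the sending side, so the plain address should work here:

```shell
# Play the MPEG-TS stream that ffmpeg is sending to 127.0.0.1:2222.
ffplay udp://127.0.0.1:2222
```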
I want to take the video stream from network stream A, while taking the audio stream from network stream B.
I tried the command:
ffmpeg -i rtsp://192.168.1.1 -i http://192.168.1.2 -c copy -map 0:v:0 -map 1:a:0 -f mp4 out.mp4
Which continuously raises the following errors:
[rtsp @ 0x564b44779f60] max delay reached. need to consume packet
[rtsp @ 0x564b44779f60] RTP: missed 591 packets
While the commands
ffmpeg -i rtsp://192.168.1.1 -c copy -f mp4 out.mp4
and
ffmpeg -i http://192.168.1.2 -c copy -f mp3 out.mp3
work without problems.
The video stream is HEVC, the audio stream is MP3. What am I missing?
To answer my own question:
Looks like the packet loss increases when using two or more sources at once. If anyone knows why, an answer on this would still be appreciated.
However, the packet loss can be prevented by using TCP as the transport protocol for RTSP:
ffmpeg -rtsp_transport tcp -i rtsp://...
and I get even better results by additionally raising the thread_queue_size:
-thread_queue_size 1024
Both mentioned options are input options and have to go before the -i.
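Putting both input options together with the original two-input command (each input option applies to the `-i` that follows it), the full invocation would look something like this sketch:

```shell
ffmpeg -rtsp_transport tcp -thread_queue_size 1024 -i rtsp://192.168.1.1 \
       -thread_queue_size 1024 -i http://192.168.1.2 \
       -c copy -map 0:v:0 -map 1:a:0 -f mp4 out.mp4
```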
I have seen several other related questions, but they all seem to be about grabbing a still shot every X seconds. How can I grab a single image when the command is run?
I was trying
ffmpeg -y -i rtsp://admin:admin@192.168.10.113:554/live -f image2 -updatefirst 1 do.jpg
Try
ffmpeg -y -i rtsp://admin:admin@192.168.10.113:554/live -vframes 1 do.jpg
I've been using variations of this to have my Ubiquiti cameras give me a Weather Underground JPG.
The tcp transport addition fixed everything. The modified command follows.
$FFMPEG -y -loglevel fatal -rtsp_transport tcp -i $URL1 -frames:v 2 -r 1 -s 320x240 $TMPFILE
My take on this command. It's not perfect: about 20% of the time I get a corrupted (incomplete or glitchy) image over a bad link:
avconv -rtsp_transport tcp -y -i rtsp://user:pass@192.168.0.1:554/live -vframes 1 do.jpg
First, download the ffmpeg archive to your computer and unzip it.
Then open Windows Terminal, PowerShell, or CMD in the unzipped path, enter the bin directory, and run the following command:
.\ffmpeg -i rtsp://username:password@192.168.1.1:554/media/video0 -ss 1 -f image2 C:\Users\Desktop\1.jpg
You can also use a "proxy" app like https://github.com/gallofeliz/snapshot-proxy-cam that handles fallbacks and centralizes your cams.
So I have a live video stream on udp://10.5.5.100:8463 and I copy it to udp://localhost:1000.
ffmpeg -f mpegts -i "udp://10.5.5.100:8554?fifo_size=10000" -f mpegts -vcodec copy udp://localhost:1000/go
And it works fine in VLC, but Wirecast doesn't accept udp://..., only rtsp://...
I don't know much about ffmpeg, though, so I only changed udp to rtsp:
ffmpeg -f mpegts -i "udp://10.5.5.100:8554?fifo_size=10000" -f mpegts -vcodec copy rtsp://localhost:1000/go
But it doesn't work and outputs this:
rtsp://localhost:1000/go: Protocol not found
Thank you for answers!!
If you put '-f rtsp' instead of '-f mpegts', ffmpeg will try to establish a connection to that URL as a client.
A proper solution with the ffmpeg suite would be complex and would involve 'ffserver' as the RTSP server and 'ffmpeg' as the media stream source feeding it.
A much simpler solution is to try VLC:
cvlc -vvv udp://10.5.5.100:8554?fifo_size=10000 --sout '#rtp{sdp=rtsp://localhost:1000/go}'
It starts RTSP server on localhost:1000 and retransmits data from UDP to the clients connected to this RTSP server.
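Once the cvlc relay is running, the RTSP output can be checked with any RTSP-capable client before pointing Wirecast at it; for example, with ffplay:

```shell
# Connect to the RTSP server that cvlc started on localhost:1000.
ffplay rtsp://localhost:1000/go
```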
ffmpeg -i rtmp:/vid2/recordings -acodec copy -vcodec copy -y captured.flv
or
ffmpeg -i rtmp://localhost/vid2/recordings -acodec copy -vcodec copy -y captured.flv
The above command only gives me this error:
rtmp://localhost/vid2/recordings: no such file or directory
Isn't ffmpeg supposed to be able to handle rtmp streams?
Are you using the Xuggler version of ffmpeg? Here's a tutorial explaining how to obtain and encode rtmp streams with the Xuggler ffmpeg.
http://wiki.xuggle.com/Live_Encoding_Tutorial
No need to use Xuggler's build. Version 0.6 of ffmpeg does support rtmp. However, make sure you compile with
--enable-librtmp
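A rough build sketch, assuming the ffmpeg source tree is checked out and the librtmp development headers are already installed (package names and prefixes vary by distribution):

```shell
# From the ffmpeg source directory: configure with RTMP support,
# then build and install.
./configure --enable-librtmp
make
make install
```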
ffmpeg can catch an rtmp stream. Try it with the port specified explicitly, e.g. 1935:
ffmpeg -i rtmp://localhost:1935/live/newStream
But before doing that, check whether newStream exists. If it doesn't, open a new cmd window, go to the ffmpeg/bin folder, and publish one:
ffmpeg -i sample.avi -f flv rtmp://localhost/live/newStream
Then try running the first command again.
It appears it can (-analyzeduration 0 gets rid of an initial delay):
$ ffplay -analyzeduration 0 -i "rtmp://localhost/live/stream_name live=1"
See http://betterlogic.com/roger/2012/08/ffmpeg-receiving-rtmp-stream-from-flash-media-server/ for some instructions on how to stream to it, as well.
I have the same problem with ffmpeg.
I publish video from ffmpeg to FMS correctly, and I can see it in the FMS video player.
ffmpeg -re -i /home/videos/sample.mp4 -f flv rtmp://localhost/live/sample
Now I would like to create a live stream.
For this, I use the following command with ffmpeg on Linux:
ffmpeg -re -i rtmp://localhost:1935/live/sample -vcodec copy -acodec copy -f flv rtmp://localhost/livepkgr/sample_streamd?adbe-live-event=sample_event
Using this syntax, I get the same error:
Closing connection: NetStream.Play.StreamNotFound
rtmp://localhost:1935/live/sample: Operation not permitted