I receive UDP packets on a local port that contain only a raw RGB payload (a video stream).
How can I display this RGB data using ffmpeg or VLC?
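For illustration, this is the kind of command I imagine might do it, if ffmpeg's rawvideo demuxer can read straight from the UDP port (the pixel format, frame size, frame rate and port here are only placeholders, not necessarily what my stream actually uses):
ffplay -f rawvideo -pixel_format rgb24 -video_size 640x480 -framerate 25 "udp://127.0.0.1:5000"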
I hope these help, even though they don't directly answer your question. They give pointers to the command-line options needed to handle raw RGB data with ffmpeg and gstreamer.
I once needed to receive RGB frames from a webcam connected to another computer.
UDP didn't work: without framing, packets got dropped, frames became incomplete, and rows went out of sync. Hence TCP (switching back to UDP is just a matter of changing that word in the commands).
After fighting with ffmpeg, gstreamer, mencoder, mplayer, and vlc, I ended up with the following solution.
"Architecture":
[LINUX V4L2 WEBCAM CAPTURE] -> [MPEG-2 VIDEO COMPRESSION] -> [RTP STREAM @ PORT 1233] -> [RECEIVING HOST] -> [MPEG-2 VIDEO DECOMPRESSION] -> [RAW RGB/8BPP TO LOCALHOST @ PORT 1234]
Command lines:
# VIDEO STREAM SENDING SIDE
vlc -vvv -I dummy --udp-caching=0 --sout-mux-caching=0 --no-audio \
v4l2:///dev/video0:width=640:height=480:caching=200:fps=10/1 :v4l2-caching=0 \
':sout=#transcode{vcodec=mp2v,vb=3000,acodec=none}: \
rtp{mux=ts,dst=destinationhost,port=1233,caching=0}'
# VIDEO STREAM RECEIVING/FORWARDING TO TCP / ffmpeg
ffmpeg -i rtp://0.0.0.0?localport=1233 \
-f rawvideo -pix_fmt rgb24 -vcodec rawvideo tcp://127.0.0.1:1234
Alternatively, a single gstreamer command that accomplishes both steps:
# CAPTURE WEBCAM->FORWARD RGB GRAYSCALE TO TCP / gstreamer
gst-launch-0.10 v4l2src device=/dev/video0 ! ffmpegcolorspace \
! video/x-raw-gray, width=320, height=240, 'framerate=(fraction)30/1' \
! tcpclientsink host=localhost port=1234
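On the receiving host, something along these lines should display the raw frames arriving on port 1234 (a sketch: open a listening TCP socket and tell ffplay the exact format being sent, rgb24 at 640x480 for the ffmpeg route above, gray at 320x240 for the gstreamer one; start it before the sender connects):
# RAW FRAME VIEWER ON THE RECEIVING HOST
ffplay -f rawvideo -pixel_format rgb24 -video_size 640x480 "tcp://127.0.0.1:1234?listen"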
The situation is kind of complex. I was archiving several CCTV camera feeds (RTSP, H.264, no audio) through OpenCV, which worked, but the CPU utilization was too high and it started to drop frames from time to time.
To reduce the CPU utilization, I started to use FFmpeg to skip the decoding and encoding steps, which worked perfectly on my home machine. However, when I connected to my university VPN and tried to deploy it on our lab server, FFmpeg couldn't read any frames, and ffplay couldn't get anything either. OpenCV, VLC and IINA could still read and display the feed.
In summary:
1. FFmpeg/ffplay
1.1 can only read the feed from my home network (Wi-Fi, Optimum);
1.2 from the other two networks, the error message says: "Could not find codec parameters for stream 0 (Video: h264, none): unspecified size
Consider increasing the value for the 'analyzeduration' (0) and 'probesize' (5000000) options" (see the sketch after this list).
2. IINA/VLC, OpenCV
These tools can get the video all the time.
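The sketch referred to in 1.2: following the error message, I could try giving probing more data and time, along these lines (the values are arbitrary guesses):
ffplay -probesize 10M -analyzeduration 10M 'the rtsp address'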
I'm wondering whether it's related to some specific port access that ffmpeg requires but the others don't. I'd appreciate any suggestions.
For reference, the ffplay command I tested is simple:
ffplay 'the rtsp address'
Thanks
Update
More tests have been performed.
By specifying rtsp_transport as TCP, ffplay can play the video, but FFmpeg still can't access it. (In the beginning, when both FFmpeg and ffplay worked through my home network, the transport was UDP.)
The FFmpeg command is as follows:
ffmpeg -i rtsp://the_ip_address/axis-media/media.amp -hide_banner -c:v copy -s 1920x1080 -segment_time 00:30:00 -f segment -strftime 1 -reset_timestamps 1 -rtsp_transport tcp "%Y-%m-%d-%H-%M-%S_Test.mp4"
Please help...
Solved by forcing it to use "-rtsp_transport tcp" as an input option, i.e. placed right before -i:
ffmpeg -rtsp_transport tcp -i rtsp://the_ip_address/axis-media/media.amp -hide_banner -c:v copy -s 1920x1080 -segment_time 00:30:00 -f segment -strftime 1 -reset_timestamps 1 "%Y-%m-%d-%H-%M-%S_Test.mp4"
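A quick sanity check, before running the full segmenting command, is to probe the stream with the same transport option (same address as above):
ffprobe -rtsp_transport tcp rtsp://the_ip_address/axis-media/media.amp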
I have a C920 HD camera connected to a Raspberry Pi 4, and my goal is to be able to access a stream from that camera at any time from my phone or laptop, both connected to my network through a VPN.
Now, I managed to use ffmpeg like this:
ffmpeg -f v4l2 -input_format h264 \
-video_size 1920x1080 \
-i /dev/video4 \
-copyinkf -codec copy \
-f mpegts udp://192.168.1.10:5000?pkt_size=1316
On the computer 192.168.1.10 I can launch VLC, go to "Media → Open Network Stream", and type udp://@:5000 in order to watch the stream.
This is a single stream, and from what I understand my RPi is just "shooting" frames at that computer whether it is connected or not. How can I set up a proper stream (maybe RTMP?) that I can watch on multiple devices?
Please note: I'm using -copyinkf -codec copy in order to avoid transcoding and other operations that might result in very high CPU usage. Can I keep doing it this way?
Thank you.
Nginx can be configured to host an RTMP endpoint that relays the stream coming from ffmpeg to all my devices. For this we need to install libnginx-mod-rtmp and configure nginx for RTMP:
apt install libnginx-mod-rtmp
Then append the following to /etc/nginx/nginx.conf:
rtmp {
    server {
        listen 1935;
        chunk_size 4096;
        allow publish 127.0.0.1;
        deny publish all;

        application live {
            live on;
            record off;
        }
    }
}
Then restart nginx:
systemctl restart nginx
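A quick way to confirm the RTMP module is actually listening (assuming ss is available on the system):
ss -tlnp | grep 1935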
Point ffmpeg to the nginx server:
ffmpeg -f v4l2 -input_format h264 \
-video_size 1920x1080 \
-i /dev/video4 \
-copyinkf -codec copy \
-f flv rtmp://127.0.0.1/live/stream
I also changed the output format to flv in order to improve compatibility with players.
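On each device, the stream can then be opened with any player that speaks RTMP, for example (replace 192.168.1.x with the Pi's actual address; the path follows the application and stream name used above):
ffplay rtmp://192.168.1.x/live/stream
In VLC, the same URL goes into "Media → Open Network Stream".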
Enjoy.
I am trying to play some audio on my Linux server and stream it to multiple web browsers. I have a loopback device that I'm specifying as the input to ffmpeg. ffmpeg then streams it via RTP to a WebRTC server (Janus). It works, but the sound that comes out is horrible.
Here's the command I'm using to stream from ffmpeg to janus over rtp:
nice --20 sudo ffmpeg -re -f alsa -i hw:Loopback,1,0 -c:a libopus -ac 1 \
-b:a 64K -ar 8000 -vn -rtbufsize 250M -f rtp rtp://127.0.0.1:17666
The WebRTC server (Janus) requires the audio codec to be Opus. If I try to use 2-channel audio or increase the sampling rate, the stream slows down or sounds worse. The "nice" command is there to give the process higher priority.
Using gstreamer instead of ffmpeg works and sounds great!
Here's the command I'm using on CentOS 7:
sudo gst-launch-1.0 alsasrc device=hw:Loopback,1,0 ! rawaudioparse ! audioconvert ! audioresample ! opusenc ! rtpopuspay ! udpsink host=127.0.0.1 port=14365
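For completeness, the RTP/Opus stream can be sanity-checked locally with a matching receive pipeline before pointing Janus at it (a sketch; the caps assume rtpopuspay's default payload type 96):
gst-launch-1.0 udpsrc port=14365 caps="application/x-rtp,media=audio,encoding-name=OPUS,clock-rate=48000,payload=96" ! rtpjitterbuffer ! rtpopusdepay ! opusdec ! audioconvert ! audioresample ! autoaudiosink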
I've got ffmpeg to read some RTSP stream and output image2 format to stdout like so:
ffmpeg -rtsp_transport tcp -i "rtsp:xxxxx" -f image2 -update 1 -
But stdout is not good enough for me. I am trying to "push" it to another process that I cannot pipe to ffmpeg due to some architecture constraints. I am running on Linux, so I was hoping to simulate a TCP/UDP socket via the file system, e.g. /dev/something or similar. Alternatively, maybe it's possible to get ffmpeg to send the images directly to a given TCP/UDP address? This didn't work, though (ffmpeg expects a file output):
ffmpeg -rtsp_transport tcp -i "rtsp:xxxxx" -f image2 -update 1 "udp://localhost:3333"
Any ideas?
Thanks
The normal image2 muxer expects to write to one or more image files. Use the image2pipe muxer.
ffmpeg -rtsp_transport tcp -i "rtsp:xxxxx" -f image2pipe "udp://localhost:3333"
(-update has no relevance when piping).
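On the receiving end, a minimal sketch for viewing what arrives could look like this (it assumes the image2pipe default MJPEG codec; with UDP, lost packets may still corrupt individual frames):
ffplay -f mjpeg -i "udp://0.0.0.0:3333"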
I want to take the video stream from network stream A, while taking the audio stream from network stream B.
I tried the command:
ffmpeg -i rtsp://192.168.1.1 -i http://192.168.1.2 -c copy -map 0:v:0 -map 1:a:0 -f mp4 out.mp4
Which continuously raises the following errors:
[rtsp @ 0x564b44779f60] max delay reached. need to consume packet
[rtsp @ 0x564b44779f60] RTP: missed 591 packets
While the commands
ffmpeg -i rtsp://192.168.1.1 -c copy -f mp4 out.mp4
and
ffmpeg -i http://192.168.1.2 -c copy -f mp3 out.mp3
work without problems.
The video stream is HEVC, the audio stream is MP3. What am I missing?
To answer my own question:
Looks like the packet loss increases when using two or more sources at once. If anyone knows why, an answer on this would still be appreciated.
However, the packet loss can evidently be prevented by using TCP as the transport protocol for RTSP:
ffmpeg -rtsp_transport tcp -i rtsp://...
and I get even better results by additionally raising the thread_queue_size:
-thread_queue_size 1024
Both of these are input options and have to go before the corresponding -i.
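Putting both options together, the command that works looks roughly like this (a sketch, with the same addresses and mappings as the original command):
ffmpeg -rtsp_transport tcp -thread_queue_size 1024 -i rtsp://192.168.1.1 \
-thread_queue_size 1024 -i http://192.168.1.2 \
-c copy -map 0:v:0 -map 1:a:0 -f mp4 out.mp4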