I am new to this domain, but I have successfully set up an RTSP server in one of my containers, and when I run
./rtsp_media_server -l 8445 -s stream1 -p 8000
it works fine, where 8445 is the exposed port, stream1 is the stream name, and 8000 is the intake port.
URL : rtsp://<host>:8445/stream1
But I need RTP packets to be sent to port 8000 so that I can properly test my RTSP media server.
Does any online RTP server provide this?
I can see there are many sample RTSP media servers, such as rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mov, which I can test in VLC media player, but I am not sure how to do the same with RTP packets.
Please correct me if my understanding is wrong, and guide me.
You can send H.264 over RTP to port 8000 using GStreamer.
The following gst-launch-1.0 command would do it for you in a terminal:
gst-launch-1.0 -v videotestsrc pattern=smpte ! video/x-raw,framerate=30/1 ! videoconvert ! x264enc tune=zerolatency ! rtph264pay ! udpsink host=127.0.0.1 port=8000
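To sanity-check that the RTP packets are actually arriving on port 8000 before wiring them into your media server, a minimal GStreamer receive pipeline such as the following should display the test pattern (a sketch, assuming the default payload type 96 that rtph264pay uses):
gst-launch-1.0 udpsrc port=8000 caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96" ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink
Once the server is consuming the stream, rtsp://<host>:8445/stream1 should play in VLC just like the Wowza demo URL.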
I would like to convert the working FFmpeg command below to a GStreamer pipeline that extracts an image from an RTSP stream.
ffmpeg -hide_banner -v error -rtsp_transport tcp -stimeout 10000000 -i 'rtsp://{domain}/Streaming/tracks/101?starttime=20220831T103000Z&endtime=20220831T103010Z' -vframes 1 -y image.jpg
Here is the GStreamer pipeline I tried to convert:
gst-launch-1.0 rtspsrc location="rtsp://{domain}/Streaming/tracks/101?starttime=20220831T103000Z&endtime=20220831T103010Z" max-rtcp-rtp-time-diff=0 latency=0 is_live=true drop-on-latency=true ! decodebin3 ! videoconvert ! jpegenc snapshot=true ! filesink location="/mnt/c/images/frame3.jpg"
I couldn't manage to get it working: it produces an image with the wrong timestamp, and the pipeline never stops after extracting the image; it keeps running like an infinite loop.
The FFmpeg command, by contrast, works perfectly: it extracts the correct image and exits after doing so.
You may try adding imagefreeze with num-buffers=1:
gst-launch-1.0 rtspsrc protocols=tcp location="rtsp://{domain}/Streaming/tracks/101?starttime=20220831T103000Z&endtime=20220831T103010Z" max-rtcp-rtp-time-diff=0 latency=0 is-live=true drop-on-latency=true ! decodebin ! videoconvert ! imagefreeze num-buffers=1 ! jpegenc snapshot=true ! filesink location="/mnt/c/images/frame3.jpg"
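With num-buffers=1, imagefreeze pushes a single frame downstream and then emits EOS, so gst-launch-1.0 exits after writing the JPEG instead of looping forever. Note also that protocols=tcp on rtspsrc mirrors the -rtsp_transport tcp option in your FFmpeg command.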
I just made a Xamarin.Android app for live streaming using a Pi 4 camera.
I used GStreamer on the sender (Pi 4) and VLC on the receiver (Windows, Visual Studio).
I tried the following:
Sender - GStreamer in Raspberry :
raspivid -t 0 -w 800 -h 600 -fps 30 -hf -vf -b 50000000 -o - | gst-launch-1.0 -e -vvvv fdsrc ! h264parse ! rtph264pay pt=96 config-interval=1 ! udpsink host=192.168.43.201 port=5000
Receiver - VLC SDP file :
c=IN IP4 10.5.110.117
t=0 0
m=video 5000 RTP/AVP 96
a=rtpmap:96 H264/90000
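For comparison, RFC 4566 expects a session description to begin with v=, o=, and s= lines as well; a structurally complete version of this SDP (the values added here are placeholders) would look like:
v=0
o=- 0 0 IN IP4 10.5.110.117
s=Pi4 camera stream
c=IN IP4 10.5.110.117
t=0 0
m=video 5000 RTP/AVP 96
a=rtpmap:96 H264/90000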
When I receive it with GStreamer on Windows, it works very well with no latency, so I want to take this option.
But in VLC I can only get a still shot (it stays paused at time 0:00).
Is that SDP file wrong?
Please give me some advice on what I should try. :)
I am trying to play some audio on my Linux server and stream it to multiple internet browsers. I have a loopback device that I'm specifying as input to ffmpeg; ffmpeg then streams over RTP to a WebRTC server (Janus). It works, but the sound that comes out is horrible.
Here's the command I'm using to stream from ffmpeg to janus over rtp:
nice --20 sudo ffmpeg -re -f alsa -i hw:Loopback,1,0 -c:a libopus -ac 1 -b:a 64K -ar 8000 -vn -rtbufsize 250M -f rtp rtp://127.0.0.1:17666
The WebRTC server (Janus) requires that the audio codec be Opus. If I try to use 2-channel audio or increase the sampling rate, the stream slows down or sounds worse. The nice command gives the process higher priority.
Using gstreamer instead of ffmpeg works and sounds great!
Here's the command I'm using on CentOS 7:
sudo gst-launch-1.0 alsasrc device=hw:Loopback,1,0 ! rawaudioparse ! audioconvert ! audioresample ! opusenc ! rtpopuspay ! udpsink host=127.0.0.1 port=14365
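If you need to verify the Opus RTP stream locally before pointing it at Janus, a receive pipeline along these lines should play it back (a sketch, assuming rtpopuspay's default payload type 96):
gst-launch-1.0 udpsrc port=14365 caps="application/x-rtp,media=audio,clock-rate=48000,encoding-name=OPUS,payload=96" ! rtpopusdepay ! opusdec ! audioconvert ! audioresample ! autoaudiosink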
IIUC with HLS or DASH, I can create a manifest and serve the segments straight from my httpd, e.g. python -m http.server.
I have a UVC video feed coming in on /dev/video1 and I'm battling to create a simple m3u8 in either gstreamer or ffmpeg.
I got as far as:
gst-launch-1.0 -e v4l2src device=/dev/video1 ! videoconvert ! x264enc ! mpegtsmux ! hlssink max-files=5
Any ideas?
Video
To list video1 device capabilities:
ffmpeg -f v4l2 -list_formats all -i /dev/video1
Audio (ALSA example)
To list ALSA devices:
arecord -L
HLS
Use two inputs:
ffmpeg -f alsa -i <alsa_device> -f v4l2 -i /dev/video1 [...] /path/to/docroot/playlist.m3u8
You can find the various HLS parameters in the FFmpeg documentation.
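For illustration, one plausible way to fill in the [...] (the codecs, segment length, and list size here are assumptions, not recommendations) would be:
ffmpeg -f alsa -i <alsa_device> -f v4l2 -i /dev/video1 \
-c:v libx264 -preset veryfast -c:a aac -b:a 128k \
-f hls -hls_time 4 -hls_list_size 5 -hls_flags delete_segments \
/path/to/docroot/playlist.m3u8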
Further reading:
FFmpeg H.264 Encoding Guide
FFmpeg Webcam Capture
I found that the option tune=zerolatency was what I needed to keep it from stalling. I still need to figure out how to bring in the audio too.
gst-launch-1.0 -e v4l2src device=/dev/video1 ! videoconvert ! x264enc tune=zerolatency ! mpegtsmux ! hlssink max-files=5
Sadly my ThinkPad X220 is overheating at >96 °C.
Would be nice to get the ffmpeg version.
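A rough ffmpeg counterpart to the pipeline above might look like this (an untested sketch; the flag choices are assumptions):
ffmpeg -f v4l2 -i /dev/video1 -c:v libx264 -tune zerolatency -f hls -hls_list_size 5 -hls_flags delete_segments playlist.m3u8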
I am receiving UDP packets on a local port that contain only RGB payload (a video stream).
How can I display this RGB video using ffmpeg or VLC?
I hope these help, even though they don't directly answer your question. They give pointers to the command-line options needed to handle raw RGB data with ffmpeg and gstreamer.
I once needed to receive rgb frames from a webcam connected to another computer.
It couldn't work with UDP: without framing, packets got dropped, frames became incomplete, and rows went out of sync. Hence TCP (using UDP is just a matter of changing that word in the commands).
Fighting with ffmpeg, gstreamer, mencoder, mplayer, and vlc yielded the following solution.
"Architecture":
[LINUX V4L2 WEBCAM CAPTURE] -> [THEORA VIDEO COMPRESSION] -> [RTP STREAM # PORT 1233] -> [RECEIVING HOST] -> [THEORA VIDEO DECOMPRESSION] -> [RAW RGB/8BPP TO LOCALHOST # PORT 1234]
Command lines:
# VIDEO STREAM SENDING SIDE
vlc -vvv -I dummy --udp-caching=0 --sout-mux-caching=0 --no-audio \
v4l2:///dev/video0:width=640:height=480:caching=200:fps=10/1 :v4l2-caching=0 \
':sout=#transcode{vcodec=mp2v,vb=3000,acodec=none}: \
rtp{mux=ts,dst=destinationhost,port=1233,caching=0}'
# VIDEO STREAM RECEIVING/FORWARDING TO TCP / ffmpeg
ffmpeg -i rtp://0.0.0.0?localport=1233 \
-f rawvideo -pix_fmt rgb24 -vcodec rawvideo tcp://127.0.0.1:1234
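To actually display the raw RGB on the receiving end, something like ffplay should work (a sketch: start the listener before the forwarding command, and the pixel format and frame size must match the sender exactly, since raw video carries no framing information):
ffplay -f rawvideo -pixel_format rgb24 -video_size 640x480 'tcp://127.0.0.1:1234?listen'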
Alternatively, one command to accomplish them both (with gstreamer):
# CAPTURE WEBCAM->FORWARD RGB GRAYSCALE TO TCP / gstreamer
gst-launch-0.10 v4l2src device=/dev/video0 ! ffmpegcolorspace \
! video/x-raw-gray, width=320, height=240, 'framerate=(fraction)30/1' \
! tcpclientsink host=localhost port=1234
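Since gst-launch-0.10 is long obsolete, a GStreamer 1.0 equivalent would presumably be (untested sketch; in 1.0, ffmpegcolorspace became videoconvert and the x-raw-gray caps became video/x-raw,format=GRAY8):
gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert \
! video/x-raw,format=GRAY8,width=320,height=240,framerate=30/1 \
! tcpclientsink host=localhost port=1234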