I just made a Xamarin.Android app for live streaming using a Raspberry Pi 4 camera.
I used GStreamer on the sender (Pi 4) and VLC on the receiver (Windows, Visual Studio).
Here is what I tried:
Sender - GStreamer on the Raspberry Pi:
raspivid -t 0 -w 800 -h 600 -fps 30 -hf -vf -b 50000000 -o - | gst-launch-1.0 -e -vvvv fdsrc ! h264parse ! rtph264pay pt=96 config-interval=1 ! udpsink host=192.168.43.201 port=5000
Receiver - VLC SDP file:
c=IN IP4 10.5.110.117
t=0 0
m=video 5000 RTP/AVP 96
a=rtpmap:96 H264/90000
When I receive the stream with GStreamer on Windows it plays very well with no latency, so I want to keep this streaming setup.
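(The GStreamer receive pipeline on the Windows side is not shown here; a typical one for this stream would look roughly like the following sketch, where the caps string and decoder element are assumptions matching payload type 96 and port 5000 above.)
# hypothetical GStreamer receiver for the stream above
gst-launch-1.0 udpsrc port=5000 caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96" ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink sync=false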
But with VLC I could only get a still frame (playback stays paused at 0:00).
Is that SDP file wrong?
Please give me some advice on what I should try. :)
Related
I am building a service that needs to convert RTP streams into HLS streams. The requirements recently shifted, and I now have to create an RTMP stream from the RTP stream and then convert the RTMP to HLS using two separate FFmpeg processes. The problem is that the RTP to RTMP process doesn't actually output anything to the specified RTMP URL.
Going directly from RTP to HLS with the following command (some options removed for brevity) works as expected:
ffmpeg -f sdp \
-protocol_whitelist file,udp,rtp \
-i example.sdp \
-g 2 \
-hls_time 2.0 \
-hls_list_size 5 \
-vcodec libx264 \
-acodec aac \
-f hls chunks/test-master.m3u8
However, converting RTP to RTMP with the following command yields no output, nor does it seem to be receiving any input despite the use of an identical SDP file:
ffmpeg -f sdp \
-protocol_whitelist pipe,udp,rtp \
-i example.sdp \
-g 2 \
-vcodec libx264 \
-acodec aac \
-f flv rtmp://localhost/test-stream
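For context, the second of the two FFmpeg processes (RTMP in, HLS out) is not shown here; assuming an RTMP server such as nginx-rtmp is relaying rtmp://localhost/test-stream, that leg would look roughly like this sketch:
# sketch of the RTMP -> HLS leg; assumes an RTMP server is serving rtmp://localhost/test-stream
ffmpeg -i rtmp://localhost/test-stream \
-c:v copy -c:a copy \
-hls_time 2.0 \
-hls_list_size 5 \
-f hls chunks/test-master.m3u8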
This is an example of what the SDP file looks like:
v=0
o=- 0 0 IN IP4 127.0.0.1
s=Test
c=IN IP4 127.0.0.1
t=0 0
m=audio 37000 RTP/AVPF 97
a=rtpmap:97 opus/48000/2
a=fmtp:97 minptime=10;useinbandfec=1
m=video 37002 RTP/AVPF 96
a=rtpmap:96 H264/90000
a=fmtp:96 level-asymmetry-allowed=1;packetization-mode=1;profile-level-id=64001E
MediaSoup generates the RTP stream, and the ports and host match up. I've verified that there is actually a stream of data coming through the ports in question using nc. There are no error messages. Am I missing something obvious here?
MediaSoup is built as an SFU (Selective Forwarding Unit). I can see that you're using:
m=video 37002 RTP/AVPF 96
a=rtpmap:96 H264/90000
a=fmtp:96 level-asymmetry-allowed=1;packetization-mode=1;profile-level-id=64001E
You need to make sure the consumer for the video stream is using the same video codec; you can check this with console.log(consumer.rtpParameters.codecs);
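As a complementary command-line check, the codecs actually arriving on those ports can be inspected against the same SDP, for example:
# probe the RTP streams described by example.sdp and print their codec details
ffprobe -protocol_whitelist file,udp,rtp -show_streams example.sdp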
I am listening to audio from a source IP address, trying to encode it into Speex format, and then sending it on to a destination IP address using ffmpeg.
My ffmpeg command is:
ffmpeg -protocol_whitelist file,rtp,udp -i temp.sdp -c:a libspeex -f rtp rtp://<dest_ip>:<port>
SDP file content (temp.sdp):
v=0
c=IN IP4 <source_IP>
t=0 0
m=audio <port> RTP/AVP 98
a=rtpmap:98 L16/8000
Issue: whenever I run this command, I get a lot of background noise on the speaker.
I can hear music (not clearly), but not the human voice.
I have also tried high-pass and low-pass filters, as follows:
ffmpeg -protocol_whitelist file,rtp,udp -i temp.sdp -af "highpass=f=200, lowpass=f=3000" \
-c:a libspeex -f rtp rtp://<dest_ip>:<port>
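For comparison, a minimal sketch that pins the audio parameters explicitly before the Speex encoder (an assumption, not a confirmed fix): the L16/8000 payload in the SDP is 16-bit PCM at 8 kHz, mono when no channel count is given, so the rate and channel count can be forced like this:
# sketch: force 8 kHz mono before encoding to Speex; <dest_ip>/<port> as above
ffmpeg -protocol_whitelist file,rtp,udp -i temp.sdp -ar 8000 -ac 1 -c:a libspeex -f rtp rtp://<dest_ip>:<port>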
I am new to this domain, but I successfully set up an RTSP server in one of the containers, and when running
./rtsp_media_server -l 8445 -s stream1 -p 8000
it works fine, where 8445 is the exposed port, stream1 is the stream name, and 8000 is the intake port.
URL: rtsp://<host>:8445/stream1
But I need RTP packets to be sent to port 8000 so that I can test my RTSP media server properly.
Does any online RTP server provide this?
I can see there are many sample RTSP media servers, like rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mov, which I can test in VLC media player, but I am not sure how to do this with RTP packets.
Please correct me if my understanding is wrong and guide me.
You can send H264 over RTP to port 8000 using gstreamer.
The following gst-launch-1.0 command will do it for you in a terminal:
gst-launch-1.0 -v videotestsrc pattern=smpte ! video/x-raw,framerate=30/1 ! videoconvert ! x264enc tune=zerolatency ! rtph264pay ! udpsink host=127.0.0.1 port=8000
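To sanity-check the stream before pointing it at the RTSP server, a matching receive pipeline would be roughly as follows (a sketch; payload type 96 and the 90000 clock rate are rtph264pay's defaults, and the decoder element is an assumption):
# sketch: receive and display the H264 test stream sent by the command above
gst-launch-1.0 udpsrc port=8000 caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96" ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink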
I am trying to stream and receive my webcam feed in two terminals on the same laptop. For this purpose I am using the following commands:
foo.sdp:
v=0
o=- 0 0 IN IP4 127.0.0.1
s=No Name
c=IN IP4 127.0.0.1
t=0 0
a=tool:libavformat 55.2.100
m=video 1235 RTP/AVP 96
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=1
Transmitting:
ffmpeg -re -i /dev/video0 -r 24 -b 50k -s 858x500 -f mulaw -f rtp rtp://127.0.0.1:3000 > foo.sdp
Receiving:
ffplay -i foo.sdp
While transmission seems to be working fine, when I use the receiving command I get an error:
Protocol not on whitelist 'file,crypto'!/0
foo.sdp: Invalid data found when processing input
Try adding
-protocol_whitelist file,udp,rtp
https://www.ffmpeg.org/ffmpeg-protocols.html#Protocol-Options
https://lists.ffmpeg.org/pipermail/ffmpeg-user/2016-February/030853.html
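With that flag the receiving command becomes, for example:
# allow the file, udp and rtp protocols referenced by foo.sdp
ffplay -protocol_whitelist file,udp,rtp -i foo.sdp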
I receive, on a local port, UDP packets that contain only an RGB payload (a video stream).
How can I display the RGB data using ffmpeg or VLC?
I hope these help, even though they don't directly answer your question. They give pointers to command line options that are needed to handle raw rgb data with ffmpeg and gstreamer.
I once needed to receive rgb frames from a webcam connected to another computer.
Couldn't work with UDP, because without framing the packets got dropped, frames became incomplete, and rows went out of sync. Hence TCP (using UDP is a matter of changing that word in the commands).
Fighting with ffmpeg, gstreamer, mencoder, mplayer, and vlc yielded the following solution.
"Architecture":
[LINUX V4L2 WEBCAM CAPTURE] -> [THEORA VIDEO COMPRESSION] -> [RTP STREAM # PORT 1233] -> [RECEIVING HOST] -> [THEORA VIDEO DECOMPRESSION] -> [RAW RGB/8BPP TO LOCALHOST # PORT 1234]
Command lines:
# VIDEO STREAM SENDING SIDE
vlc -vvv -I dummy --udp-caching=0 --sout-mux-caching=0 --no-audio \
v4l2:///dev/video0:width=640:height=480:caching=200:fps=10/1 :v4l2-caching=0 \
':sout=#transcode{vcodec=mp2v,vb=3000,acodec=none}: \
rtp{mux=ts,dst=destinationhost,port=1233,caching=0}'
# VIDEO STREAM RECEIVING/FORWARDING TO TCP / ffmpeg
ffmpeg -i rtp://0.0.0.0?localport=1233 \
-f rawvideo -pix_fmt rgb24 -vcodec rawvideo tcp://127.0.0.1:1234
Alternatively, one command to accomplish them both (with gstreamer):
# CAPTURE WEBCAM->FORWARD RGB GRAYSCALE TO TCP / gstreamer
gst-launch-0.10 v4l2src device=/dev/video0 ! ffmpegcolorspace \
! video/x-raw-gray, width=320, height=240, 'framerate=(fraction)30/1' \
! tcpclientsink host=localhost port=1234
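For completeness, the raw frames arriving on TCP port 1234 can be displayed with ffplay by declaring the geometry and pixel format explicitly; the values below are assumptions matching the 640x480 rgb24 ffmpeg example above, and this listener must be started before the forwarding command since ffmpeg connects as a TCP client:
# sketch: listen on TCP 1234 and render the incoming raw rgb24 frames
ffplay -f rawvideo -pixel_format rgb24 -video_size 640x480 -framerate 10 -i 'tcp://127.0.0.1:1234?listen'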