GStreamer Flow for OS X Desktop

I am a new user of GStreamer on Mac OS X El Capitan 10.11.3.
I have installed GStreamer from http://gstreamer.freedesktop.org/data/pkg/osx/1.7.1/ via the .pkg files (both the devel and the standard version).
I am trying to capture the desktop and send it to another display.
I streamed my webcam successfully:
$ gst-launch wrappercamerabinsrc mode=2 ! video/x-raw, width=320, height=240 ! osxvideosink
But I cannot find the equivalent source element for my desktop anywhere.
Do you know the Mac OS X equivalent of Ubuntu's ximagesrc?
Could you help me?
Thanks a lot.

You should use the avfvideosrc source element, a videoscale element to downsample the resolution of your video stream, and then videoconvert (in case the video sink and your stream have no common pixel format).
gst-launch-1.0 avfvideosrc capture-screen=true ! videoscale ! videoconvert ! video/x-raw,width=640,height=480 ! osxvideosink
Put videoscale before videoconvert to avoid extra work (the fewer pixels to convert, the faster your pipeline).

avfvideosrc capture-screen=true should do the trick as the source element.
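If screen capture uses too much CPU, it may also help to cap the frame rate with a caps filter right after the source. A sketch, assuming avfvideosrc negotiates the requested rate on your machine:
gst-launch-1.0 avfvideosrc capture-screen=true ! video/x-raw,framerate=15/1 ! videoscale ! videoconvert ! video/x-raw,width=640,height=480 ! osxvideosink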

Related

QML MediaPlayer with GStreamer Backend on Windows

I am designing a video streaming application where the host (sender) is on Ubuntu and the client (receiver) is on Windows (msvc2015 compiler). My main focus is sending and showing the most recent frame on screen as fast as possible. Both computers have Qt 5.15.2.
I am using GStreamer with JPEG encoding to send the frames, with the pipeline below (a slightly modified version of the pipeline from the QMediaPlayer documentation, https://doc.qt.io/qt-5/qmediaplayer.html):
gst-pipeline: appsrc ! video/x-raw, format=BGRx, framerate=0/1 ! videoconvert ! video/x-raw, format=I420 ! jpegenc ! rtpjpegpay ! udpsink host=127.0.0.1 port=5000
I can decode and show this stream with QML MediaPlayer on Linux and the performance is quite good.
MediaPlayer {
    id: mediaPlayer
    source: "gst-pipeline: udpsrc port=5000 caps = \"application/x-rtp, media=video, clock-rate=90000, encoding-name=JPEG, payload=26\" ! rtpjpegdepay ! jpegdec ! videoconvert ! qtvideosink sync=false"
    autoPlay: true
}
However, my target is to show this stream on the Windows machine, and I am having difficulties with the GStreamer backend. The same URL results in the error below. I think it indicates that Qt is not linked to the GStreamer backend and passes the given argument to DirectShow.
DirectShowPlayerService::doSetUrlSource: Unresolved error code
0x800c000d (The specified protocol is unknown.)
The stream plays well with the gst-launch command given below, so it seems the only problem is that Qt is not able to use GStreamer. I could not find a way to link GStreamer with Qt on Windows.
.\gst-launch-1.0.exe udpsrc port=5000 caps = "application/x-rtp, media=video, clock-rate=90000, encoding-name=JPEG, payload=26" ! rtpjpegdepay ! jpegdec ! videoconvert ! autovideosink sync=false
I can also show the video with the K-Lite codec pack using QML MediaPlayer again, but then the video has a delay of around 1-2 seconds, which is not acceptable for my design. The code snippet is given below:
MediaPlayer {
    id: mediaPlayer
    source: "rtp://127.0.0.1:5000"
    autoPlay: true
}
My question is: can I integrate GStreamer with Qt somehow? If I cannot, how can I remove the delay that occurs with K-Lite? I would like to obtain the performance I see with the gst-launch command.

How to extract a single frame from a video stream using GStreamer / closing the stream

I need to take one frame from a web camera's video stream and write it to a file.
With ffmpeg I could do it this way:
ffmpeg -i rtsp://10.6.101.40:554/video.3gp -t 1 img.png
My GStreamer command:
gst-launch-1.0 rtspsrc location="rtsp://10.6.101.40:554/video.3gp" is_live=true ! decodebin ! jpegenc ! filesink location=img.jpg
The problem is that the gstreamer process keeps running and does not end. How can I take only one frame and force the stream to close after the file is written?
Is it possible to do this from the command line, or should I code this in C/Python etc.?
Thanks a lot.
I was able to do this with:
! jpegenc snapshot=TRUE
See the snapshot property of jpegenc.
My source is different, though, so your mileage may vary.
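Applied to the pipeline from the question, that would be something like the following (a sketch; untested against that particular RTSP source). snapshot=TRUE makes jpegenc send EOS after encoding one frame, so the pipeline shuts down on its own:
gst-launch-1.0 rtspsrc location="rtsp://10.6.101.40:554/video.3gp" ! decodebin ! jpegenc snapshot=TRUE ! filesink location=img.jpg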
Try using the num-buffers property on the source element and restricting it to 1. This will hopefully give you a single frame.
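Note that num-buffers is only available on elements derived from basesrc (rtspsrc is a bin, so it does not expose it). A sketch assuming a local Video4Linux webcam instead of the RTSP source:
gst-launch-1.0 v4l2src num-buffers=1 ! videoconvert ! jpegenc ! filesink location=img.jpg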

GStreamer udpsrc works with gst-launch but not in app (OSX)

I successfully streamed my webcam's image with GStreamer using gst-launch this way:
SERVER
./gst-launch-1.0 -v -m autovideosrc ! video/x-raw,format=BGRA ! videoconvert ! queue ! x264enc pass=qual quantizer=20 tune=zerolatency ! rtph264pay ! udpsink host=XXX.XXX.XXX.XXX port=7480
CLIENT
./gst-launch-1.0 udpsrc port=7480 ! "application/x-rtp, payload=127" ! rtph264depay ! decodebin ! glimagesink
Now I am trying to reproduce the client side in my app using this pipeline (I don't post the code, as I made an Objective-C wrapper around my pipeline and elements):
udpsrc with caps:"application/x-rtp,media=video,payload=127,encoding-name=H264"
rtph264depay
decodebin
glimagesink (for testing) or a custom appsink (in pull-mode) that converts image to CVPixelBufferRef (tested: it works with videotestsrc / uridecodebin / etc.)
It doesn't work, even though the pipeline's state messages look quite normal. I get messages in the console about SecTaskLoadEntitlements failed error=22, but I get those too when working with the command line.
I'm wondering what gst-launch does under the hood that I'm missing. I couldn't find any example of a udpsrc-based pipeline on the web.
My questions are:
Does anybody know what actually happens when we launch gst-launch, or a way to find out?
Are there any examples of working pipelines in code with udpsrc?
EDIT
Here is an image of my pipeline. As you can see, the GstDecodeBin element doesn't create a src pad, as it's not receiving (or processing) anything; the 'timeout' of 10 seconds that I set on the udpsrc element does fire. Could it be an OS X sandboxing problem?
Now my pipeline looks like this:
udpsrc
queue
rtph264depay
decodebin
videoconvert
capsfilter
appsink / glimagesink
Tested with the method in this question, the app does actually receive something on this port.
Found out why it wasn't receiving anything: the GstUdpSrc element must be in GST_STATE_NULL when you assign the port to listen on, otherwise it silently keeps listening on the default port (5004).
Everything works fine now.
For information, setting the environment variable GST_DEBUG to udpsrc:5 helped a lot.
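For anyone debugging something similar, two quick checks from a terminal (MyApp is a placeholder for your own binary): raise udpsrc's log level when launching the app, and confirm that packets actually arrive on the port with a bare receiver, since fakesink's dump property prints incoming buffer contents:
GST_DEBUG=udpsrc:5 ./MyApp
gst-launch-1.0 udpsrc port=7480 ! fakesink dump=true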

GStreamer unable to play audio through rtspsrc

I am having trouble playing the audio from an RTSP server. I have no problem with video playback, but an error occurs when I try to play audio.
The following is the command used to play video:
C:\gstreamer\1.0\x86_64\bin>gst-launch-1.0 rtspsrc location=rtsp://192.168.2.116/axis-media/media.amp latency=0 ! decodebin ! autovideosink
However, when I change the autovideosink to autoaudiosink, as follows:
C:\gstreamer\1.0\x86_64\bin>gst-launch-1.0 rtspsrc location=rtsp://192.168.2.116/axis-media/media.amp latency=0 ! decodebin ! autoaudiosink
I get the errors below:
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc1: Internal data flow error.
Additional debug info:
gstbasesrc.c(2933): gst_base_src_loop (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc1:
streaming task paused, reason not-linked (-1)
I am new to both Stack Overflow and GStreamer; any help would be much appreciated.
Thanks to thiagoss's reply, I had my first success playing both video and audio using the following pipeline:
gst-launch-1.0 rtspsrc location=rtsp://192.168.2.116/axis-media/media.amp latency=0 name=src src. ! decodebin ! videoconvert ! autovideosink src. ! decodebin ! audioconvert ! autoaudiosink
IIRC, rtspsrc will output one pad for each stream (video and audio may be separate), so you could be linking your video output to an audio sink.
You can run with -v to see the caps on each pad and verify this. Then you can properly link by using pad names in gst-launch-1.0:
Something like:
gst-launch-1.0 rtspsrc location=rtsp://192.168.2.116/axis-media/media.amp latency=0 name=src src.stream_0 ! decodebin ! autovideosink
Check the correct stream_%u number to use for each stream to have it linked correctly.
You could also just be missing a videoconvert before the video sink; I'd test that as well.
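For the audio branch specifically, the same named-pad trick applies. A sketch, assuming the audio comes out as stream_1 on this camera (check the -v output to confirm the actual number):
gst-launch-1.0 rtspsrc location=rtsp://192.168.2.116/axis-media/media.amp latency=0 name=src src.stream_1 ! decodebin ! audioconvert ! autoaudiosink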

How to stream video from a webcam using GStreamer?

How can I stream video (and, if possible, audio too) from a webcam using GStreamer? I already tried to stream video from a source, but I can't stream video from a webcam on Windows. How can I do this?
Client:
VIDEO_CAPS="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H263-1998"
DEST=localhost
VIDEO_DEC="rtph263pdepay ! avdec_h263"
VIDEO_SINK="videoconvert ! autovideosink"
LATENCY=100
gst-launch -v gstrtpbin name=rtpbin latency=$LATENCY \
udpsrc caps=$VIDEO_CAPS port=5000 ! rtpbin.recv_rtp_sink_0 \
rtpbin. ! $VIDEO_DEC ! $VIDEO_SINK \
udpsrc port=5001 ! rtpbin.recv_rtcp_sink_0 \
rtpbin.send_rtcp_src_0 ! udpsink host=$DEST port=5005 sync=false async=false
Server:
DEST=127.0.0.1
VOFFSET=0
AOFFSET=0
VELEM="ksvideosrc is-live=1"
VCAPS="video/x-raw,width=352,height=288,framerate=15/1"
VSOURCE="$VELEM ! $VCAPS"
VENC="avenc_h263p ! rtph263ppay"
VRTPSINK="udpsink port=5000 host=$DEST ts-offset=$VOFFSET name=vrtpsink"
VRTCPSINK="udpsink port=5001 host=$DEST sync=false async=false name=vrtcpsink"
VRTCPSRC="udpsrc port=5005 name=vrtpsrc"
gst-launch gstrtpbin name=rtpbin \
$VSOURCE ! $VENC ! rtpbin.send_rtp_sink_2 \
rtpbin.send_rtp_src_2 ! $VRTPSINK \
rtpbin.send_rtcp_src_2 ! $VRTCPSINK \
$VRTCPSRC ! rtpbin.recv_rtcp_sink_2
You will have to use GStreamer 1.3.90 or newer; the ksvideosrc element is only available since that version.
Then you can stream it just like any other input. The details depend on which codecs, container format, streaming protocol and network protocol you want to use. The same goes for audio, which works basically the same way as video.
http://cgit.freedesktop.org/gstreamer/gst-plugins-good/tree/tests/examples/rtp
Here you can find some examples that use RTP for streaming: server-side and client-side examples, audio-only, video-only or both, and also streaming from real audio/video capture sources (for Linux, but on Windows it works exactly the same, just with the Windows-specific elements).
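As a starting point, a minimal Windows sender matching the client script above, without the RTCP session handling (a sketch; host and port are placeholders):
gst-launch-1.0 ksvideosrc ! videoconvert ! videoscale ! video/x-raw,width=352,height=288 ! avenc_h263p ! rtph263ppay ! udpsink host=127.0.0.1 port=5000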
