I'm trying to get fmp4 HLS playing back on a new Chromecast (3rd gen I believe, not Ultra).
I've tried encoding the content with ffmpeg using both x264 and h264 libraries.
The main profile initially gives me a codec-not-supported error; removing the codec list from the HLS manifest fixes this issue.
Switching to baseline (which is not ideal) doesn't give the codec error.
Both then (after removing the codec definitions or using baseline) give the following error:
Uncaught Error: Unable to derive timescale
at Xl (www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js:344)
at Y.$e (www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js:337)
at Y.k.processSegment (www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js:340)
at Am.k.processSegment (www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js:384)
at Mj.$e (www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js:238)
at Wj (www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js:236)
at Oj (www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js:240)
at Mj.fd (www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js:239)
at Nc (www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js:39)
at wi.Mc.dispatchEvent (www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js:38)
Make sure you're not setting:
loadRequestData.media.hlsSegmentFormat
For TS I had to set:
loadRequestData.media.hlsSegmentFormat = cast.framework.messages.HlsSegmentFormat.TS;
But for fmp4 I commented this out.
I'm having trouble creating a GStreamer pipeline that can decode AAC streams. I'm using decodebin as the decoder element to keep things as simple as possible. When I push in an AAC stream I get the following errors:
0:00:00.243017793 27632 0x556cf6291de0 INFO GST_STATES gstelement.c:2675:gst_element_continue_state:<avdec_aac0> completed state change to PAUSED
0:00:00.243028893 27632 0x556cf6291de0 INFO GST_STATES gstelement.c:2575:_priv_gst_element_state_changed:<avdec_aac0> notifying about state-changed READY to PAUSED (VOID_PENDING pending)
0:00:00.243047154 27632 0x556cf6180830 INFO GST_BUS gstbus.c:588:gst_bus_timed_pop_filtered:<bus1> we got woken up, recheck for message
0:00:00.243865043 27632 0x556cf6291de0 INFO libav :0:: Audio object type 0
0:00:00.243878963 27632 0x556cf6291de0 INFO libav :0:: is not implemented. Update your FFmpeg version to the newest one from Git. If the problem still occurs, it means that your file has a feature which has not been implemented.
0:00:00.243901033 27632 0x556cf6291de0 WARN decodebin gstdecodebin2.c:2523:connect_pad:<audiodecoder> Couldn't set avdec_aac0 to PAUSED
...
0:00:00.246185892 27632 0x556cf6291de0 WARN decodebin gstdecodebin2.c:4678:gst_decode_bin_expose:<audiodecoder> error: no suitable plugins found:
Couldn't set avdec_aac0 to PAUSEDCouldn't set avdec_aac_fixed0 to PAUSEDCouldn't set faad0 to PAUSED
0:00:00.246190073 27632 0x556cf6180830 INFO GST_BUS gstbus.c:588:gst_bus_timed_pop_filtered:<bus1> we got woken up, recheck for message
0:00:00.246205032 27632 0x556cf6291de0 INFO GST_ERROR_SYSTEM gstelement.c:2140:gst_element_message_full_with_details:<audiodecoder> posting message: Your GStreamer installation is missing a plug-in.
0:00:00.246253922 27632 0x556cf6291de0 INFO GST_ERROR_SYSTEM gstelement.c:2167:gst_element_message_full_with_details:<audiodecoder> posted error message: Your GStreamer installation is missing a plug-in.
I confirmed the GStreamer AAC plugins are installed via:
gst-inspect-1.0.exe | grep aac
(gst-inspect-1.0:13248): GStreamer-WARNING **: 14:46:08.791: Failed to load plugin 'C:\gstreamer\1.0\msvc_x86_64\lib\gstreamer-1.0\gstwavpack.dll': The specified module could not be found.
This usually means Windows was unable to find a DLL dependency of the plugin. Please check that PATH is correct.
You can run 'dumpbin -dependents' (provided by the Visual Studio developer prompt) to list the DLL deps of any DLL.
There are also some third-party GUIs to list and debug DLL dependencies recursively.
audioparsers: aacparse: AAC audio stream parser
libav: avdec_aac: libav AAC (Advanced Audio Coding) decoder
libav: avdec_aac_fixed: libav AAC (Advanced Audio Coding) decoder
libav: avdec_aac_latm: libav AAC LATM (Advanced Audio Coding LATM syntax) decoder
libav: avenc_aac: libav AAC (Advanced Audio Coding) encoder
libav: avmux_adts: libav ADTS AAC (Advanced Audio Coding) muxer (not recommended, use aacparse instead)
mediafoundation: mfaacenc: Media Foundation Microsoft AAC Audio Encoder MFT
typefindfunctions: audio/aac: aac, adts, adif, loas
voaacenc: voaacenc: AAC audio encoder
I'm having this trouble on both Windows and Linux. On Windows I installed the complete runtime and development builds of GStreamer, and I'm using the latest 5.0 complete builds linked from FFmpeg's website.
On Ubuntu I've run:
sudo apt-get install libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly gstreamer1.0-libav libgstrtspserver-1.0-dev libges-1.0-dev libssl-dev ffmpeg faad
I'm still coming up empty.
What am I missing?
This turned out to be caused by my incorrect unwrapping of the FLV data from the incoming RTMP packets. While I unwrapped the audio FLV tag, I did not unwrap the AACAUDIODATA.AACPacketType byte. Thus my sequence header and audio packets were prepended with a 0x00 or 0x01 byte that could not be parsed. The hints were seeing libav :0:: Audio object type 0 in the logs, and comparing the codec data in the GStreamer caps in the log output against a straight GStreamer filesrc pipeline.
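The missing step can be sketched as follows (a minimal illustration; the helper name is mine, and it assumes the input is the body of an FLV audio tag). Per the FLV spec, the first byte of AUDIODATA packs SoundFormat/SoundRate/SoundSize/SoundType, and when SoundFormat is 10 (AAC) the next byte is AACPacketType, which must also be stripped before the payload reaches the decoder:

```python
def unwrap_flv_aac(tag_body: bytes):
    """Strip the FLV AUDIODATA header byte and the AACPacketType byte,
    returning (packet_type, raw_payload). Sketch only.

    packet_type 0 = AAC sequence header (AudioSpecificConfig),
    packet_type 1 = raw AAC frame.
    """
    sound_format = tag_body[0] >> 4       # upper 4 bits: SoundFormat
    if sound_format != 10:                # 10 == AAC in the FLV spec
        raise ValueError("not an AAC audio tag")
    packet_type = tag_body[1]             # the byte I had failed to strip
    return packet_type, tag_body[2:]      # raw payload starts at byte 2

# 0xAF = AAC, 44 kHz, 16-bit, stereo; then AACPacketType = 0,
# then a (hypothetical) two-byte AudioSpecificConfig.
ptype, payload = unwrap_flv_aac(bytes([0xAF, 0x00, 0x12, 0x10]))
```

Feeding `payload` (rather than the full tag body) into the pipeline is what makes the caps/codec data match a plain filesrc run.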
I need an RTSP server that can listen on a configured port (8554, for example), so that if I run FFmpeg with:
ffmpeg -f v4l2 -i /dev/video0 -c:v libx264 -intra -an -f rtsp -rtsp_transport tcp rtsp://192.168.1.10:8554/test
Then the RTSP server will RECORD the video, and to play it I just need to run:
ffplay -i rtsp://192.168.1.10:8554/test
I need the RTSP server to support TCP transport, the H264 video codec and the OPUS audio codec, and to stream from live video (not from a file); the program should also be unlicensed.
This server works great, but doesn't support OPUS.
Live555 supports H264 and OPUS, but only streams from files (VOD).
I've found some other servers that can stream directly from /dev/video0, but that's also not a good solution for me.
Wowza and Red5Pro do meet all the above requirements, except that they are licensed programs.
Any suggestions for an RTSP server that supports all the above requirements?
EDIT:
I've tried GStreamer and it looks promising, but I still haven't succeeded.
However, I'm fairly sure I'm on the right track (perhaps I just don't know how to use the pipelines yet).
I've built gst-rtsp-server, version 1.13.91.
Then, I ran ./test-record "( decodebin name=depay0 ! videoconvert ! rtspsink )"
I ran netstat -anp and I can clearly see the server is listening on TCP port 8554.
Now it's time to stream to the server. I tried it once with GStreamer and once with FFmpeg.
Gstreamer
gst-launch-1.0 videotestsrc ! x264enc ! rtspclientsink location=rtsp://127.0.0.1:8554/test
FFmpeg
ffmpeg -f v4l2 -video_size 640x480 -i /dev/video0 -c:v libx264 -qp 10 -an -f rtsp -rtsp_transport tcp rtsp://127.0.0.1:8554/test
In both cases I can see the RTP packets in Wireshark,
and by calling netstat -anp again, I see:
tcp 0 0 0.0.0.0:8554 0.0.0.0:* LISTEN 14386/test-record
tcp 0 0 127.0.0.1:8554 127.0.0.1:46754 ESTABLISHED 14386/test-record
tcp 0 0 127.0.0.1:46754 127.0.0.1:8554 ESTABLISHED 19479/ffmpeg
So I can be fairly sure that I'm streaming (or streaming something...). However, when I try to play the video, it fails (I've tried to play with GStreamer, FFplay and VLC - all fail...):
Gstreamer
gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test latency=300 ! decodebin ! autovideoconvert ! autovideosink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://127.0.0.1:8554/test
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0: Could not get/set settings from/on resource.
Additional debug info:
gstrtspsrc.c(7507): gst_rtspsrc_retrieve_sdp (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0:
Server can not provide an SDP.
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
FFplay
ffplay -i rtsp://127.0.0.1:8554/test
[rtsp # 0x7fb140000b80] method DESCRIBE failed: 405 Method Not Allowed
rtsp://127.0.0.1:8554/test: Server returned 4XX Client Error, but not one of 40{0,1,3,4}
VLC
vlc rtsp://127.0.0.1:8554/test
VLC media player 3.0.8 Vetinari (revision 3.0.8-0-gf350b6b)
[0000000000857f10] main libvlc: Running vlc with the default interface. Use 'cvlc' to use vlc without interface.
Qt: Session management error: None of the authentication protocols specified are supported
[00007f9fdc000ea0] live555 demux error: Failed to connect with rtsp://127.0.0.1:8554/test
[00007f9fdc001d10] satip stream error: Failed to setup RTSP session
Any ideas what I'm doing wrong?
Wowza SE works with H264, Opus, VP8 as it supports WebRTC.
This plugin provides a turnkey setup for broadcasting channels live with WebRTC, RTMP and RTSP through Wowza SE. It can also handle all stream types, including RTSP, with FFmpeg for on-demand adaptive transcoding (for example between WebRTC and HLS).
https://wordpress.org/plugins/videowhisper-live-streaming-integration/
Well, the closest RTSP server I've found so far that matches (almost) all my requirements is here: https://github.com/RSATom/RtspRestreamServer (credit for the RTSP server goes to RSATom).
Here is the checklist of all the features I was looking for:
Supports TCP transport.
Supports the H264 video codec (currently hard-coded to this codec only).
Supports the OPUS audio codec (not supported yet, but the server is based on the GStreamer library, so it has all the infrastructure to support every codec GStreamer supports - I just need to update the code and make it more generic).
Supports the RTSP RECORD option from a client with a live stream.
Supports the RTSP PLAY option from a client.
The URL and port should be configurable (currently hard-coded - I just need to update the code and make it more generic).
The server is Unlicensed.
I have a streaming solution that uses the MPEG-DASH protocol, and I would like to expose the same files over HLS for iOS devices.
I read that fMP4 is now compatible with HLS, so I thought this could be done.
When I generate my MPD file with this command:
MP4Box -dash 33000 -frag 33000 -out video.mpd -profile dashavc264:onDemand original.mp4#audio original.mp4#video
what I want is to not duplicate the files, but to use my generated DASH files with an HLS manifest file.
It seems that this fork of gpac has experimental support for this. Also see this ticket - it has a link to a compiled gpac version from this branch and notes how to use it.
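For reference, the target format itself is simple: an fMP4 HLS media playlist is an ordinary m3u8 whose EXT-X-MAP tag (defined in RFC 8216, and requiring playlist version 6) points at the same init segment the DASH manifest uses, so the media files are not duplicated. A sketch of such a playlist generator (the file names and durations below are placeholders, not actual MP4Box output):

```python
def fmp4_media_playlist(init_uri, segments, target_duration):
    """Build a minimal fMP4 HLS media playlist (VOD).

    `segments` is a list of (uri, duration_seconds) pairs. Sketch only;
    the segment and init file names are assumptions.
    """
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:6",  # EXT-X-MAP (without I-frames-only) needs version >= 6
        f"#EXT-X-TARGETDURATION:{target_duration}",
        "#EXT-X-MEDIA-SEQUENCE:0",
        f'#EXT-X-MAP:URI="{init_uri}"',  # init segment shared with the DASH side
    ]
    for uri, duration in segments:
        lines.append(f"#EXTINF:{duration:.3f},")
        lines.append(uri)
    lines.append("#EXT-X-ENDLIST")  # VOD: no more segments will be added
    return "\n".join(lines) + "\n"

print(fmp4_media_playlist(
    "video_init.mp4",
    [("video_seg1.m4s", 33.0), ("video_seg2.m4s", 33.0)],
    33,
))
```

If the DASH segmenter produces one init segment plus .m4s fragments, a playlist like this can reference them directly without re-muxing.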
I'm trying to convert a WMA file into MP4 in order to upload the file to YouTube.
VN550672.wma.zip
Although the conversion is successful (see below), I'm not able to upload the file to YouTube. I'm getting the following error:
The video has failed to process. Please make sure you are uploading a supported file type.
VN550672.mp4.zip
Any suggestions?
System configuration:
Python version: 3.6.3
Pydub version: 0.22.1
ffmpeg or avlib?: ffmpeg
ffmpeg/avlib version: 2.8.4
Could it be as simple as YouTube requiring that the media file include a video stream? The WMA file only has an audio stream.
You can try to transcode and add a dummy video stream using:
ffmpeg -i VN550672.WMA -f lavfi -i color=size=426x240 -shortest VN550672.mp4
(-shortest is needed so encoding stops when the audio ends, since the generated color source is infinite; 426x240 is YouTube's suggested minimum resolution)
I'm trying to receive a live H264 stream from a wireless camera using RTSP. The camera IP is 192.168.150.1 and it doesn't require authentication.
Since I'm developing under Windows, I installed GStreamer 1.0 (1.8.3), a complete installation with all the plugins and everything selected during the installation process.
When I try the pipeline
gst-launch-1.0 rtspsrc location="rtsp://192.168.150.1" latency=100 ! rtph264depay ! avdec_h264 ! autovideosink
I receive this output:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://192.168.150.1
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0: No supported stream was found. You might need to allow more transport protocols or may otherwise be missing the right GStreamer RTSP extension plugin.
Additional debug info:
gstrtspsrc.c(6421): gst_rtspsrc_setup_streams (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
I also tried the following command:
gst-play-1.0 rtsp://192.168.150.1
getting this output:
Interactive keyboard handling in terminal not available.
Now playing rtsp://192.168.150.1
Pipeline is live.
ERROR Your GStreamer installation is missing a plug-in. for rtsp://192.168.150.1
ERROR debug information: gsturidecodebin.c(1006): no_more_pads_full (): /GstPlayBin:playbin/GstURIDecodeBin:uridecodebin0:
no suitable plugins found:
gstrtspsrc.c(6421): gst_rtspsrc_setup_streams (): /GstPlayBin:playbin/GstURIDecodeBin:uridecodebin0/GstRTSPSrc:source
Reached end of play list.
When I try the same URL (rtsp://192.168.150.1) in VLC I can see the stream, so my guess is that I'm missing "the right GStreamer RTSP extension plugin".
The output of gst-inspect-1.0 | grep 264 is:
File STDIN:
x264: x264enc: x264enc
videoparsersbad: h264parse: H.264 parser
typefindfunctions: video/x-h264: h264, x264, 264
rtp: rtph264depay: RTP H264 depayloader
rtp: rtph264pay: RTP H264 payloader
openh264: openh264dec: OpenH264 video decoder
openh264: openh264enc: OpenH264 video encoder
libav: avdec_h264: libav H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10 decoder
libav: avmux_ipod: libav iPod H.264 MP4 (MPEG-4 Part 14) muxer
I also tried using FFmpeg and I can see the video, but I'd prefer GStreamer because I'm going to use the same configuration (camera, pipeline, GStreamer library...) on an Android device, and in my opinion GStreamer seems to be the best choice.
From FFmpeg I got this info about the stream
Stream #0:0: Video: h264 (Constrained Baseline), yuv420p, 640x352, 29.92 tbr, 90k tbn, 180k tbc
Does anyone have any advice to sort this out?
Which plugin am I missing, and how can I add it to my installation?
Edit:
The output of gst-launch-1.0.exe -v playbin uri=rtsp://192.168.150.1
Setting pipeline to PAUSED ...
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: ring-buffer-max-size = 0
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: buffer-size = -1
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: buffer-duration = -1
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: use-buffering = false
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: download = false
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: uri = rtsp://192.168.150.1
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: connection-speed = 0
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: source = "\(GstRTSPSrc\)\ source"
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://192.168.150.1
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
ERROR: from element /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: Your GStreamer installation is missing a plug-in.
Additional debug info:
gsturidecodebin.c(1006): no_more_pads_full (): /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0:
no suitable plugins found:
gstrtspsrc.c(6421): gst_rtspsrc_setup_streams (): /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstRTSPSrc:source
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
I had exactly the same problem with GStreamer 1.8.3.
The reason was that my camera did not provide the "Control URL" attribute in its session descriptions, while GStreamer is not smart enough to fall back to the base URL in this case (as other players do).
So, I ran the following command to get detailed logs from GStreamer:
gst-play-1.0 rtsp://camera_ip:port/ --gst-debug-level=9 --gst-debug-no-color &> GSTREAMER_LOGS.txt
In the logs I found this line:
DEBUG rtspsrc gstrtspsrc.c:6109:gst_rtspsrc_setup_streams:<source> skipping stream 0x7f01b402c140, no setup
Then, looking into the current gstrtspsrc.c code from Kurento's gst-plugins-good bundle, I found that the "skipping stream ..., no setup" error only happens when stream->conninfo.location == NULL. And that, as I said, happened because my camera didn't provide the "Control URL" attribute in the SDP. Adding the following line to my camera's SDP session descriptions solved the issue for me:
a=control:*
But, generally, this probably needs to be fixed in GStreamer code.
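The workaround above can be sketched as a small SDP rewrite (the helper is mine, not GStreamer code; per the RTSP spec, a control URL of "*" means "use the request URL"). If the camera's SDP cannot be changed directly, the same patch could in principle be applied by a proxy sitting between camera and client:

```python
def add_control_attributes(sdp: str) -> str:
    """If an SDP has no a=control attributes at all, insert `a=control:*`
    after each m= line so RTSP clients aggregate control on the request URL.
    Sketch of the workaround only; the proper fix is in the camera or in
    gstrtspsrc itself.
    """
    if "a=control:" in sdp:
        return sdp                       # camera already provides control URLs
    out = []
    for line in sdp.splitlines():
        out.append(line)
        if line.startswith("m="):        # start of a media description
            out.append("a=control:*")    # "*" = fall back to the base/request URL
    return "\n".join(out) + "\n"
```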
I believe this is a limitation of GStreamer that isn't shared by VLC and ffmpeg. I have a similar situation here, where I have three different RTSP cameras, two that work fine with GStreamer and one that doesn't. All three work fine with VLC and ffmpeg.
I used Wireshark to look at the raw RTSP protocol and found that the two cameras that work with GStreamer include an sprop-parameter-sets parameter field, while the one that doesn't work doesn't have this field.
The information encoded in sprop-parameter-sets (the SPS and PPS data) is usually present in the RTP stream that comes from the camera. Apparently VLC and ffmpeg are smart enough to pick this up, but GStreamer is not.
I tried to manually insert the sprop-parameter-sets data by using the caps command line parameter, but was unsuccessful.
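For context, the sprop-parameter-sets format is defined in RFC 6184: the value is the base64-encoded SPS and PPS NAL units, comma-separated, carried on the SDP fmtp line. A sketch of how the value is built (the NAL unit bytes below are placeholders; real ones would have to be extracted from the camera's stream):

```python
import base64

def sprop_parameter_sets(sps: bytes, pps: bytes) -> str:
    """Build the sprop-parameter-sets value for an H.264 fmtp line
    (RFC 6184): base64 of the raw SPS and PPS NAL units, comma-separated.
    """
    return ",".join(base64.b64encode(nal).decode("ascii") for nal in (sps, pps))

# Hypothetical NAL units (first byte 0x67 marks an SPS, 0x68 a PPS):
fmtp = ("a=fmtp:96 packetization-mode=1;sprop-parameter-sets="
        + sprop_parameter_sets(b"\x67\x42\x00\x1e", b"\x68\xce\x06\xe2"))
```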