Failing to upload an MP4 to YouTube - ffmpeg

I'm trying to convert a WMA file to MP4 in order to upload it to YouTube.
VN550672.wma.zip
Although the conversion is successful (see below), I'm not able to upload the file to YouTube. I'm getting the following error:
The video has failed to process. Please make sure you are uploading a supported file type.
VN550672.mp4.zip
Any suggestions?
System configuration:
Python version: 3.6.3
Pydub version: 0.22.1
ffmpeg or avlib?: ffmpeg
ffmpeg/avlib version: 2.8.4

Could it be as simple as YouTube requiring that the media file include a video stream? The WMA file only has an audio stream.
You can try to transcode and add a dummy video stream using:
ffmpeg -i VN550672.WMA -f lavfi -i color=size=426x240 -shortest VN550672.mp4
(426x240 is the minimum resolution YouTube suggests; -shortest stops encoding when the audio ends, since the lavfi color source is infinite.)
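To confirm that diagnosis, you can list which streams each file actually contains before uploading (ffprobe ships with ffmpeg; the filename is the one from the question):

```shell
# Print the type and codec of every stream in the converted file;
# a YouTube-ready MP4 should show both a video line and an audio line.
ffprobe -v error -show_entries stream=codec_type,codec_name -of csv=p=0 VN550672.mp4
```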

Related

Chromecast HLS: Unable to derive timescale

I'm trying to get fmp4 HLS playing back on a new Chromecast (3rd gen I believe, not Ultra).
I've tried encoding the content with ffmpeg using both x264 and h264 libraries.
The Main profile initially gives me a codec-not-supported error; removing the codec list from the HLS manifest fixes this issue.
Switching to the Baseline profile (which is not ideal) avoids the codec error.
Both then (after removing the codec definitions or using Baseline) give the following error:
Uncaught Error: Unable to derive timescale
at Xl (www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js:344)
at Y.$e (www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js:337)
at Y.k.processSegment (www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js:340)
at Am.k.processSegment (www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js:384)
at Mj.$e (www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js:238)
at Wj (www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js:236)
at Oj (www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js:240)
at Mj.fd (www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js:239)
at Nc (www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js:39)
at wi.Mc.dispatchEvent (www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js:38)
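For reference, a minimal fMP4 HLS encode with ffmpeg looks roughly like this (a sketch - the input name, segment length and profile here are assumptions, not taken from the question):

```shell
# Encode to H264/AAC and package as fMP4 HLS segments plus a playlist.
ffmpeg -i input.mp4 \
  -c:v libx264 -profile:v main -c:a aac \
  -f hls -hls_segment_type fmp4 -hls_time 6 \
  -hls_playlist_type vod out.m3u8
```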
Make sure you're not setting:
loadRequestData.media.hlsSegmentFormat
For TS I had to set:
loadRequestData.media.hlsSegmentFormat = cast.framework.messages.HlsSegmentFormat.TS;
But for fmp4 I commented this out.

Desperately looking for an RTSP server that can stream from a live source (not from a file)

I need an RTSP server that listens on a configured port (8554, for example), so that if I run FFmpeg with:
ffmpeg -f v4l2 -i /dev/video0 -c:v libx264 -intra -an -f rtsp -rtsp_transport tcp rtsp://192.168.1.10:8554/test
Then the RTSP server will RECORD the video, and to play it, I just need to run:
ffplay -i rtsp://192.168.1.10:8554/test
I need the RTSP server to support TCP transport, the H264 video codec and the OPUS audio codec, and to stream from live video (not from a file); the program should also be unlicensed.
This server works great, but doesn't support OPUS.
Live555 supports H264 and OPUS, but only streams from files (VOD).
I've found some other servers that can stream directly from /dev/video0, but that's also not a good solution for me.
Wowza and Red5Pro do meet all the above requirements, except that they are licensed programs.
Any suggestions for an RTSP server that supports all the above requirements?
EDIT:
I've tried GStreamer and it looks promising, but I still haven't succeeded.
However, I'm quite sure I'm on the right track (perhaps I just don't know how to use the pipelines yet).
I've built gst-rtsp-server, version 1.13.91.
Then, I ran ./test-record "( decodebin name=depay0 ! videoconvert ! rtspsink )"
I ran netstat -anp and I can clearly see that the server is listening on TCP port 8554.
Now it's time to stream to the server. I tried it once with GStreamer and once with FFmpeg.
Gstreamer
gst-launch-1.0 videotestsrc ! x264enc ! rtspclientsink location=rtsp://127.0.0.1:8554/test
FFmpeg
ffmpeg -f v4l2 -video_size 640x480 -i /dev/video0 -c:v libx264 -qp 10 -an -f rtsp -rtsp_transport tcp rtsp://127.0.0.1:8554/test
In both cases, I can see the RTP packets in Wireshark,
and by calling netstat -anp again, I see:
tcp 0 0 0.0.0.0:8554 0.0.0.0:* LISTEN 14386/test-record
tcp 0 0 127.0.0.1:8554 127.0.0.1:46754 ESTABLISHED 14386/test-record
tcp 0 0 127.0.0.1:46754 127.0.0.1:8554 ESTABLISHED 19479/ffmpeg
So I can be fairly sure that I'm streaming (or streaming something...). However, when I try to play the video, it fails (I've tried playing it with GStreamer, FFplay and VLC - all fail):
Gstreamer
gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test latency=300 ! decodebin ! autovideoconvert ! autovideosink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://127.0.0.1:8554/test
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0: Could not get/set settings from/on resource.
Additional debug info:
gstrtspsrc.c(7507): gst_rtspsrc_retrieve_sdp (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0:
Server can not provide an SDP.
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
FFplay
ffplay -i rtsp://127.0.0.1:8554/test
[rtsp # 0x7fb140000b80] method DESCRIBE failed: 405 Method Not Allowed
rtsp://127.0.0.1:8554/test: Server returned 4XX Client Error, but not one of 40{0,1,3,4}
VLC
vlc rtsp://127.0.0.1:8554/test
VLC media player 3.0.8 Vetinari (revision 3.0.8-0-gf350b6b)
[0000000000857f10] main libvlc: Running vlc with the default interface. Use 'cvlc' to use vlc without interface.
Qt: Session management error: None of the authentication protocols specified are supported
[00007f9fdc000ea0] live555 demux error: Failed to connect with rtsp://127.0.0.1:8554/test
[00007f9fdc001d10] satip stream error: Failed to setup RTSP session
Any ideas what I'm doing wrong?
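For comparison, the stock test-launch example that ships with gst-rtsp-server can serve a live V4L2 source directly (a sketch - the device path and encoder settings are assumptions):

```shell
# Serve /dev/video0 as H264 over RTSP at rtsp://<host>:8554/test;
# the pay0 name is required so the server can build the SDP.
./test-launch "( v4l2src device=/dev/video0 ! videoconvert \
  ! x264enc tune=zerolatency ! rtph264pay name=pay0 pt=96 )"
```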
Wowza SE works with H264, Opus and VP8, as it supports WebRTC.
This plugin provides a turnkey setup for broadcasting channels live with WebRTC, RTMP and RTSP through Wowza SE. It can also handle all stream types, including RTSP with FFmpeg, for on-demand adaptive transcoding (for example between WebRTC and HLS).
https://wordpress.org/plugins/videowhisper-live-streaming-integration/
Well, the closest RTSP server I've found so far that matches (almost) all my requirements is here: https://github.com/RSATom/RtspRestreamServer (credits for the RTSP server go to RSATom).
Here is the checklist of the features I was looking for:
Supports TCP transport.
Supports the H264 video codec (currently hard-coded to this codec only).
Supports the OPUS audio codec (not supported yet, but the server is based on the GStreamer library, so it has all the infrastructure to support every codec GStreamer supports - I just need to update the code and make it more generic).
Supports the RTSP RECORD option from a client with a live stream.
Supports the RTSP PLAY option from a client.
URL and port should be configurable (currently hard-coded - I just need to update the code and make it more generic).
The server is Unlicensed.

How do I generate an M3U8 file compatible with fMP4?

I have a streaming solution that uses the MPEG-DASH protocol, and I would like to expose the same files over HLS for iOS devices.
I read that fMP4 is now compatible with HLS, so I thought this could be done.
I generate my MPD file with this command:
MP4Box -dash 33000 -frag 33000 -out video.mpd -profile dashavc264:onDemand original.mp4#audio original.mp4#video
What I want is to not duplicate the files, and to use my generated DASH files with an HLS manifest file.
It seems that this fork of gpac has experimental support for this. Also see this ticket - it has a link to a compiled gpac version from this branch and notes how to use it.
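For reference, an fMP4 HLS media playlist differs from a TS one mainly in the EXT-X-MAP tag, which points at the initialization segment (segment names and durations here are illustrative):

```
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-TARGETDURATION:6
#EXT-X-MAP:URI="init.mp4"
#EXTINF:6.000,
seg_1.m4s
#EXTINF:6.000,
seg_2.m4s
#EXT-X-ENDLIST
```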

mpd not using lame or vorbis encoders

I am trying to run an httpd stream via mpd. The config I have is fairly straightforward:
# -------- AUDIO FOR STREAM ---------------------
audio_output {
    type        "httpd"
    name        "My HTTP Stream"
    encoder     "lame"          # optional, vorbis or lame
    port        "8000"
#   quality     "5.0"           # do not define if bitrate is defined
    bitrate     "128"           # do not define if quality is defined
#   format      "44100:16:1"
#   max_clients "0"             # optional 0=no limit
}
However when I run mpd I get the following error:
Mar 28 15:40 : fatal_error: line 337: No such encoder: lame
The same occurs when I try using vorbis. I checked my version of mpd and this is the output:
$ mpd --version
Music Player Daemon 0.19.8
...
Encoder plugins:
null wave
...
So as it stands, it doesn't seem to have the lame/vorbis encoder plugins installed. I'm currently using OS X, so I've installed mpd through Homebrew. Any ideas how to fix this?
For whatever reason, even though I had the lame and vorbis libraries installed as dependencies, they weren't built as encoder plugins when I installed mpd.
To fix this, you have to run the brew command with the encoders as options.
brew install mpd --with-lame

Cannot convert mp4 to gif using gifify

I am getting the following error when trying to convert an MP4 video to a GIF using gifify:
Unable to find application named 'Cloud'
Does anyone know how to debug or investigate this type of issue?
It appears that gifify "by default" attempts to upload the GIF to "CloudApp" ("Cloud" in earlier versions): https://github.com/jclem/gifify/blob/master/gifify.sh#L88
So run it like gifify -n movie.mp4 instead.
I suggest you open a new issue asking them to log a better message when CloudApp is not installed: https://github.com/jclem/gifify/issues
