Playing fragmented mp4 doesn't continue playing - ffmpeg

Using ffmpeg I create an mp4 using my video camera as the source.
ffmpeg -f dshow -i video="Integrated Webcam":audio="Microphone (Realtek High Definition Audio)"^
-g 52^
-vcodec libx264 -pix_fmt yuv420p -profile:v baseline -level 3^
-f mp4 -movflags empty_moov+default_base_moof+frag_keyframe^
%OUTPUT%\video.mp4
This works in IE11, Chrome, and Firefox.
And my HTML video tag:
<video controls autoplay style="width:640px;height:360px;">
<source src="http://localhost/video.mp4"
type='video/mp4;codecs="avc1.42E01E, mp4a.40.2"' />
</video>
The node web server version just has src="http://localhost/" for the <source>.
As the web server, I've tried nginx and a node version that I got from this site.
The nginx setup doesn't do anything special: I basically just point the root at the folder so it sees the mp4.
The problem is that it only plays as much as the web server saw when the page was loaded, even though the file is still continuously growing. If I refresh the page, I can see that the duration is now longer than the last time.
My question is: how can I make the video tag keep playing the fragmented video data as it grows, without stopping and refreshing the page (which starts playback from the beginning again)?
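For context, this is a minimal sketch of the kind of node server that keeps the HTTP response open and pushes new bytes as ffmpeg appends them; the file path, port, and polling interval are assumptions, not the actual server from the question:

const http = require('http');
const fs = require('fs');

const FILE = 'video.mp4'; // growing fragmented MP4 written by ffmpeg (path is an assumption)

http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'video/mp4' }); // no Content-Length, so the response stays open (chunked)
  let offset = 0;
  const pump = () => {
    fs.stat(FILE, (err, st) => {
      if (err) { res.end(); return; }
      if (st.size > offset) {
        // send only the bytes appended since the last read
        const chunk = fs.createReadStream(FILE, { start: offset, end: st.size - 1 });
        offset = st.size;
        chunk.pipe(res, { end: false });
        chunk.on('end', () => setTimeout(pump, 250));
      } else {
        setTimeout(pump, 250); // wait for ffmpeg to write the next fragment
      }
    });
  };
  pump();
}).listen(8080); // the <video> source would then point at http://localhost:8080/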

Related

ffmpeg feeding mp4 file to rtsp stream

I am trying to feed an mp4 file from ffmpeg to an RTSP stream using these commands on CentOS 7:
from console 1: ffmpeg -i space.mp4 -vcodec libx264 -tune zerolatency -crf 18 http://localhost:8050/feed1.ffm
from console 2: I have started ffserver and it is listening.
But when I open http://x.x.x.x:8050/feed1.ffm in a browser it shows the error:
File 'feed1.ffm' not found
My ffserver.conf file is attached
Browsers can't play RTSP directly, so you need it delivered in an HTML5-friendly format (HLS) that can load inside a VIDEO tag.
You can directly embed the mp4 inside a VIDEO tag if you wish.
FFmpeg needs to publish the video/stream to a streaming server that supports RTSP or whatever format you decide to publish in.
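For example, transcoding the file straight to HLS with ffmpeg could look like this (the segment length, playlist size, and output path here are arbitrary, not taken from the question):
ffmpeg -re -i space.mp4 -c:v libx264 -c:a aac -f hls -hls_time 4 -hls_list_size 6 /var/www/html/live/playlist.m3u8
The resulting playlist.m3u8 can then be loaded in a VIDEO tag (natively in Safari, or through hls.js in other browsers).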
You can schedule mp4 videos to play as a live stream with this tool: https://broadcastlivevideo.com/schedule-video-playlist-as-live-streaming-channel/ , available as a free open-source WP plugin.

H264/MP4 live stream from ffmpeg does not work in browser

I cannot play an H264/MP4 stream generated by ffmpeg in Chrome, IE, or Edge. It works only in Firefox.
My testing environment is Windows 10, all updates done, all browsers up to date.
I have a source MJPEG stream, which I need to transcode to H264/MP4 and show in the browser in an HTML5 <video> element.
In order to provide a working example, I use here this MJPEG stream: http://200.36.58.250/mjpg/video.mjpg?resolution=320x240. In my real case I have MJPEG input from different sources like IP cameras.
I use the following command line:
ffmpeg.exe -use_wallclock_as_timestamps 1 -f mjpeg -i "http://200.36.58.250/mjpg/video.mjpg?resolution=320x240" -f mp4 -c:v libx264 -an -preset ultrafast -tune zerolatency -movflags frag_keyframe+empty_moov+faststart -reset_timestamps 1 -vsync 1 -flags global_header -r 15 "tcp://127.0.0.1:5000?listen"
If I try to visualize the output in VLC, I use this link: tcp://127.0.0.1:5000 and it works.
Then I try to visualize the stream in browser, so I put this into a html document:
<video autoplay controls>
<source src="http://127.0.0.1:5000" type="video/mp4">
</video>
If I open the document in Firefox it works just fine.
But it does not work when trying to open it in Chrome, IE or Edge. It seems that the browser tries to connect to the TCP server exposed by ffmpeg, but something goes wrong, because ffmpeg exits after a few seconds.
In ffmpeg console I can see this:
av_interleaved_write_frame(): Unknown error
Error writing trailer of tcp://127.0.0.1:5000?listen: Error number -10053 occurred
If I inspect the video element in Chrome I can see this error:
Failed to load resource: net::ERR_INVALID_HTTP_RESPONSE
As far as I know all these browsers should support H264-encoded streams transported in MP4 containers. If in the <video> element I replace the link http://127.0.0.1:5000 with a local link to an mp4/H264-encoded file, it plays just fine in each browser. The problem seems to be related to live streaming.
Does anyone know why this happens and how it can be solved?
Thank you!
You're just outputting to a TCP socket. That's not HTTP. Browsers speak HTTP... you need to use an HTTP server in this case.
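For example, this is a minimal node sketch of such an HTTP wrapper; the source URL and encoding options are taken from the question, while the port and everything else are assumptions:

const http = require('http');
const { spawn } = require('child_process');

http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'video/mp4' });
  // re-run the transcode from the question, but write the fragmented MP4 to stdout
  const ffmpeg = spawn('ffmpeg', [
    '-use_wallclock_as_timestamps', '1',
    '-f', 'mjpeg', '-i', 'http://200.36.58.250/mjpg/video.mjpg?resolution=320x240',
    '-c:v', 'libx264', '-an', '-preset', 'ultrafast', '-tune', 'zerolatency',
    '-movflags', 'frag_keyframe+empty_moov',
    '-f', 'mp4', 'pipe:1'
  ]);
  ffmpeg.stdout.pipe(res);
  res.on('close', () => ffmpeg.kill('SIGKILL')); // stop transcoding when the viewer disconnects
}).listen(5000); // <source src="http://127.0.0.1:5000"> then points at a real HTTP response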

wowza + live + ffmpeg + hls player, how to create the playlist.m3u8?

I'm trying to set up a Wowza live test server so that I can play HLS from my mobile app. It works without any problem for VOD: I can play it in my app, and I can also see the .m3u8 file if I enter its URI in the browser.
I tried to do the same in live mode (my goal is to test some streaming parameters for live streaming). I tried to use ffmpeg to create the live stream:
ffmpeg -re -i "myInputTestVideo.mp4" -vcodec libx264 -vb 150000 -g 60 -vprofile baseline -level 2.1 -acodec aac -ab 64000 -ar 48000 -ac 2 -vbsf h264_mp4toannexb -strict experimental -f mpegts udp://127.0.0.1:10000
I created a "source file" and connected it to the "Incoming Streams".
I can see in my application's Monitoring / Network tab that it is getting the data from ffmpeg.
My problem is how to get the playlist.m3u8 file so I can play it from inside my app (HLS-based).
Again, for now I just need a way to experiment with the streaming settings; in real life I'll have a real live streaming source.
If I understand your issue correctly, and since you said that it works for you with a VoD and its own m3u8 URI, you seem not to know how to construct an m3u8 URI for live sources referenced by a stream file (not a source file, as you incorrectly wrote).
Considering you named your stream file for example udp.stream (that's the file including the udp://127.0.0.1:10000 address), simply point your hls player application to http://{yourwowzaserver}/{yourliveapp}/udp.stream/playlist.m3u8
You could push it to Wowza as RTSP (much better than UDP) and then stream it further on to wherever you want. To push it to Wowza you will probably need to set up a username and password (Server > Source authentication), and then the output stream from ffmpeg can look something like this: rtsp://{user}:{pass}@{yourwowzaserver}/{yourliveapp}/mystream.
In Wowza you will see mystream under Incoming Streams. From there you can access it with the classic http(s)://wowzaip:wowzaport/{yourliveapp}/mystream/playlist.m3u8
Anyway, Wowza supports both RTSP and UDP, so you could use either directly. If you want transcoding, ffmpeg will be kinder to server resources than Wowza.
What worked:
Changing the output of ffmpeg to -f rtsp rtsp://127.0.0.1:1935/my_app/my.stream.stream and using it as the input in Wowza.
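Putting the question's command and that fix together, the full push to Wowza would look roughly like this (the h264_mp4toannexb bitstream filter from the original command is dropped, since it is only needed for Annex-B outputs such as MPEG-TS):
ffmpeg -re -i "myInputTestVideo.mp4" -vcodec libx264 -vb 150000 -g 60 -vprofile baseline -level 2.1 -acodec aac -ab 64000 -ar 48000 -ac 2 -strict experimental -f rtsp rtsp://127.0.0.1:1935/my_app/my.stream.stream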

ffmpeg progress is freezing frames when the scene changes

I'm capturing data from IP camera with RTSP protocol with ffmpeg with command:
ffmpeg -rtsp_transport tcp -progress /media/kamip/stats.txt -i rtsp://192.168.1.220:554/live/h264/ch0 \
-c:v copy -c:a copy -strict 1 -map 0 -f segment -strftime 1 \
-segment_time 1800 /media/kamip/cam_%d_%m_%Y_%H_%M_%S.mkv
I'm using this for 5 cameras. One is different type and it is in different location.
Because ffmpeg does not support reconnecting, I'm writing status to the /media/kamip/stats.txt file. In another script I parse this output and every 30 seconds check whether the frame number has changed; if yes, everything is OK, if not, I restart the above command.
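For illustration only, such a watchdog might look roughly like this in node; the stats path matches the question, but the restart command is a hypothetical hook, not the asker's actual script:

const fs = require('fs');
const { exec } = require('child_process');

const STATS = '/media/kamip/stats.txt';
let lastFrame = -1;

setInterval(() => {
  let text;
  try { text = fs.readFileSync(STATS, 'utf8'); } catch (e) { return; }
  const matches = text.match(/frame=(\d+)/g); // -progress appends key=value blocks; take the newest frame count
  if (!matches) return;
  const frame = parseInt(matches[matches.length - 1].slice('frame='.length), 10);
  if (frame === lastFrame) {
    // frame count did not advance in 30 s: treat as a freeze and restart the capture
    exec('systemctl restart camera-capture'); // hypothetical restart hook, not from the question
  }
  lastFrame = frame;
}, 30000);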
The problem occurs only at night. When it is quite dark and lights suddenly come on, for example when a car is parking, /media/kamip/stats.txt shows the same frame number, so my script recognizes this as a lost connection (video freeze).
I tried the "-strict 1" option and I think it is better (one false alarm per day instead of 10), so I think this may be related to ffmpeg rather than the camera/video source, especially because the video is fine even though the frame number reported by ffmpeg stays the same. Also, VLC does not have this kind of problem (but I cannot currently use it for this camera).
I found that ffmpeg has a built-in scene change detector, but it should only apply when encoding video (I'm using the "copy" option for both audio and video)?
I'm thinking about a different way of monitoring the capture, but "-progress" in ffmpeg should work fine, and it has been working fine for the other cameras for a few years.
I also do not see any errors. When I encoded one cut file with "-loglevel debug" I only saw information like this:
[libx264 @ 0x25d77a0] scene cut at 174 Icost:2049115 Pcost:2006553
ratio:0.0208 bias:0.1387 gop:54 (imb:3186 pmb:168)
ffmpeg is the latest version:
ffmpeg version 3.3.3-1ubuntu1~16.04.york0 Copyright (c) 2000-2017 the FFmpeg developers
built with gcc 5.4.0 (Ubuntu 5.4.0-6ubuntu1~16.04.4) 20160609
Any help will be appreciated.

Live transcoding and streaming of MP4 works in Android but fails in Flash player with NetStream.Play.FileStructureInvalid error

Recently I had a task to use ffmpeg as a transcoding as well as a streaming tool. The task was to convert a file from a given format to MP4 and immediately stream it by capturing it from stdout. So far so good. The streaming works well with the native player on Android tablets as well as the VLC player. The issue is with the Flash player, which gives the following error:
NetStream.Play.FileStructureInvalid : Adobe Flash cannot import files that have invalid file structures.
The ffmpeg flags used are:
$ ffmpeg -loglevel quiet -i somefile.avi -vbsf h264_mp4toannexb -vcodec libx264 \
-acodec aac -f MP4 -movflags frag_keyframe+empty_moov -re - 2>&1
As noted in the docs for -movflags
The mov/mp4/ismv muxer supports fragmentation. Normally, a MOV/MP4 file has all the metadata about all packets stored in one location (written at the end of the file, it can be moved to the start for better playback using the qt-faststart tool). A fragmented file consists of a number of fragments, where packets and metadata about these packets are stored together. Writing a fragmented file has the advantage that the file is decodable even if the writing is interrupted (while a normal MOV/MP4 is undecodable if it is not properly finished), and it requires less memory when writing very long files (since writing normal MOV/MP4 files stores info about every single packet in memory until the file is closed). The downside is that it is less compatible with other applications.
Either switch to a flash player that can handle fragmented MP4 files, or use a different container format that supports streaming better.
Also, -re is an input-only option, so it would make more sense to specify it before the input, instead of before the output.
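With that change (and the muxer name in lowercase, since ffmpeg format names are lowercase; the h264_mp4toannexb filter is also dropped here, since MP4 output does not use Annex B), the command would look roughly like:
$ ffmpeg -loglevel quiet -re -i somefile.avi -vcodec libx264 -acodec aac \
-f mp4 -movflags frag_keyframe+empty_moov - 2>&1
If the different-container route is taken instead, something like -f flv (Flash's native streaming container) would replace the -f mp4 -movflags pair.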
