I am using FFmpeg to record the video stream (H.264/AAC in MP4) provided by an AXIS camera (P1346).
I need to make recordings at specific times. To achieve this, I use a PHP script, executed every minute by cron, which checks in the database whether a recording should be started.
To record the stream, I use proc_open with the following command line:
/usr/local/bin/ffmpeg -i 'rtsp://192.168.0.103:554/axis-media/media.amp?udp&buffer_size=65535' -vcodec copy -acodec copy -t 3600 -y '/path/to/video/folder/file.mp4'
The problem is, it works in most cases, but the recording fails every now and then. Has anyone ever tried to record AXIS streams from the command line, and run into bugs while doing it?
Thank you in advance.
Regards,
Mathieu
N.B.: I have encountered the same difficulties using VLC.
EDIT: Network errors would explain why the recordings are sometimes suddenly interrupted; however, I still don't understand why the whole video file is corrupted after that.
I have seen this issue before when the connection between the server and the camera is interrupted or degraded. We were pulling a feed from a camera over a wireless connection and the periodic latency was an issue. Check the health of the network and the server pulling the stream.
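If the network is the culprit, one mitigation worth trying (a sketch only; it assumes your ffmpeg build supports the `-rtsp_transport` option, and reuses the URL and paths from the question) is to carry the RTSP stream over TCP instead of the default UDP, so a degraded link stalls rather than silently dropping packets:

```shell
# Force RTSP over TCP instead of UDP; lossy or high-latency links then
# retransmit rather than drop packets mid-recording.
ffmpeg -rtsp_transport tcp \
       -i 'rtsp://192.168.0.103:554/axis-media/media.amp' \
       -vcodec copy -acodec copy -t 3600 -y '/path/to/video/folder/file.mp4'
```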
I created an app on Windows to live stream the contents of the entire screen, or only a portion of it (a window), to YouTube.
I use this app, but I still have a problem I'm not able to understand.
I use different internet connections: 30 Mbit ADSL at home, or a 2.5 Mbit ADSL router elsewhere.
In either case, after starting ffmpeg the live stream's fps grows from 300 to 2000 and the transmission is perfect for some minutes; then the fps slows down to a very low value, as does the bitrate of the YouTube stream. The image is no longer clear and then disappears, although the audio keeps working. The CPU stays under 35-40% usage.
ffmpeg must be restarted to get another 5-7 minutes of good transmission.
I tried changing the ffmpeg command line, but nothing seems to influence this behaviour, which is why I still don't understand where the problem is. Any suggestions?
A log of a single session (approx. 20 minutes) is available here: http://www.mbinet.it/public/ffmpeg-20180106-094446.txt
Another (approx. 5 min.) is available here: http://www.mbinet.it/public/ffmpeg-20180106-105529.txt
Thanks
We needed to create clips from a remote video by providing the time duration. This is the command we are using:
ffmpeg -i {{remote_video}} -ss {{start_time}} -flush_packets 1 -codec copy -t {{duration}} -y {{output_file}}
What we are unable to figure out is how FFmpeg actually does this: it does not download the entire video, yet it is still able to generate a clip from the remote video.
I looked into the documentation but found nothing.
I think it is a combination of the container format and the "protocol" used. The container needs to support some kind of seeking, and then the protocol used (file, http, etc.) needs to support seeking as well. For example, ffmpeg's HTTP protocol implementation can seek using the Range header if the remote server supports it.
Have a look at https://github.com/FFmpeg/FFmpeg/blob/master/libavformat/http.c if you want to see how it works for HTTP (search for "seek").
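To illustrate the idea (a sketch only, not FFmpeg's actual code): a seekable protocol lets the demuxer fetch just the byte range it needs, the way ffmpeg's HTTP client issues a Range request, instead of transferring the whole file.

```shell
# Stand-in for a remote file sitting on an HTTP server.
printf 'abcdefghijklmnopqrstuvwxyz' > /tmp/remote.bin

# Read only bytes 10-13, analogous to sending "Range: bytes=10-13";
# the rest of the "file" is never transferred.
dd if=/tmp/remote.bin bs=1 skip=10 count=4 2>/dev/null
# prints "klmn"
```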
I want to live stream the video captured on my SJ 4000 Camera.
The camera is connected to my RPi by Wi-Fi, and the stream is available at the following address: rtsp://192.168.1.254/sjcam.mov
Now, I want to watch this stream in my webpage by using a Streaming Engine on Raspberry Pi.
The rtsp:// camera address works when I connect the camera directly to my Windows PC and open it in VLC. But I want to do it using the RPi as the streaming engine.
I have attempted the following:
1) Using ffmpeg -i "rtsp://[IP_ADDR]" -vcodec -f http://[my_pc_IP_ADDR]
But I am getting the error message "Unable to find a suitable output format for 'http://192.168.55.39:5678'".
2) Installed OMX Player, but I could not find proper material on streaming with OMX Player.
3) I have come across GStreamer, but still the same problem: I did not find proper material.
Kindly provide your valuable inputs.
Thanks.
If you can launch thttpd or any other HTTP server on the Pi, then this and this might work for you: both answers describe how to use ffmpeg to transcode RTSP to an HLS playlist. If running an HTTP server is not an option, then you need to combine ffmpeg and ffserver; there are numerous examples (I'll update the answer if I find one that fits your case).
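A minimal sketch of the HLS approach, assuming the camera URL from the question and an illustrative output path (segment settings vary by ffmpeg version):

```shell
# Transcode the camera's RTSP stream into an HLS playlist that any
# HTTP server (e.g. thttpd) can serve to a web page.
ffmpeg -i "rtsp://192.168.1.254/sjcam.mov" \
       -c:v libx264 -preset veryfast -an \
       -f hls -hls_time 4 -hls_list_size 5 -hls_flags delete_segments \
       /var/www/html/stream.m3u8
```

A browser-side player then only needs to fetch http://[pi_addr]/stream.m3u8 over plain HTTP.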
Or does it have some other meaning? I have searched all over the internet, and the documentation is very thin on it... If someone could point me to something that explains exactly what it is, I would appreciate it.
I am talking about this:
ffmpeg "rtmp://...... live=1" .....
tia.
Short answer is yes.
RTMP has live streaming support and VOD support. 'live=1' means the RTMP session is a live stream: the media server is receiving the video feed from the source in real time, so rewinding to a previous time is not a supported action. Without 'live=1', RTMP runs in VOD mode, which means the entire video already exists on the media server, and the server is then capable of rewinding, or seeking to a random position in the video.
Technically, though, on the client side (preferably in software, not a web page), if you maintain the buffer yourself, you can rewind or pause one way or another. Since you are saving the data as you receive it from the media server and everything is under your control, you will be able to rewind and pause live streams. But you will have to implement the buffering and decoding mechanism yourself; the ffmpeg command will not help with this.
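For reference, a minimal sketch of the two modes (the server URL is a placeholder; the librtmp-style option goes inside the quoted URL, as in the question):

```shell
# Live mode: the server pushes frames in real time; seeking/rewind
# is not supported.
ffplay "rtmp://example.com/app/stream live=1"

# VOD mode (default, no live flag): the full file exists on the
# server, so seeking to an arbitrary position is possible.
ffplay "rtmp://example.com/app/stream"
```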
I am new to live555.
I want to stream my webcam from a Windows 7 (64-bit) machine behind a home LAN, using ffmpeg as the encoder, to a live555 server running on a Debian 64-bit Linux machine in a data center over the WAN. I want to send an H.264 RTP/UDP stream from ffmpeg, and the "testOnDemandRTSPServer" should serve RTSP streams to clients that connect to it.
I am using the following ffmpeg command which sends UDP data to port 1234, IP address AA.BB.CC.DD
.\ffmpeg.exe -f dshow -i video="Webcam C170":audio="Microphone (3- Webcam C170)" -an
-vcodec libx264 -f mpegts udp://AA.BB.CC.DD:1234
On the Linux server I am running testOnDemandRTSPServer on port 5555, which expects raw UDP data from AA.BB.CC.DD:1234. I try to open the RTSP stream in VLC using rtsp://AA.BB.CC.DD:5555/mpeg2TransportStreamFromUDPSourceTest
But I get nothing in VLC. What am I doing wrong? How can I fix it?
From what I remember, it was non-trivial to write a DeviceSource class. The problem you're describing is definitely something that's discussed quite frequently on the live555 mailing list; you need to get yourself approved for the list a.s.a.p. if you want to do anything related to RTSP development.
The problem you seem to be having is that some video formats are written with streaming in mind: the RTSP server can easily stream certain formats because they contain "sync bytes" and other markers that it can use to determine where frame boundaries lie. The simplest solution would be to get your hands on the SDK for the camera and use that to request data from it. There are many different libraries and toolkits that let you access data from the camera, one of which is the DirectX SDK. Once you have the camera data, you need to encode it into a streamable format: you might be able to get the raw camera frames using DirectX, then convert them to MP4/H.264 frame data using ffmpeg (libavcodec, libavformat).
Once you have your encoded frame data, you feed it into your DeviceSource class, and it will take care of streaming the data for you. I wish I had code on hand, but I was bound by an NDA not to remove code from the premises; the general algorithm is documented on the live555 website, though, so I am able to explain it here.
I hope you have a bit more luck with this. If you get stuck, then remember to add code to your question. Right now the only thing that's stopping your original plan from working (streaming the file to VLC) is the file format you chose to stream.
One thing you can try is to increase VLC's logging verbosity level to 2: VLC expects in-band parameter sets, in which case it will print a debug message in the messages window saying that it is waiting for parameter sets. Just having the parameter sets in the SDP of the RTSP DESCRIBE is not sufficient. IIRC you can configure x264 to output parameter sets periodically, or at least with every IDR frame.
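A hedged sketch of that x264 configuration, built on the ffmpeg command from the question (assumption: the repeat-headers key is accepted by your libx264 build via -x264opts):

```shell
# Ask libx264 to re-emit SPS/PPS parameter sets in-band with each
# keyframe, so late-joining clients like VLC can start decoding.
ffmpeg -f dshow -i video="Webcam C170" \
       -vcodec libx264 -x264opts repeat-headers=1 \
       -f mpegts udp://AA.BB.CC.DD:1234
```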
Other things you can try:
You can test the stream with openRTSP before using VLC. If you use openRTSP -d 5 -Q rtsp://xxx.xxx.xxx.xxx:5555/mpeg2TransportStreamFromUDPSourceTest, openRTSP will print quality statistics after streaming for 5 seconds. Then you will be able to verify that testOnDemandRTSPServer is indeed relaying the stream, and that there is no problem between the ffmpeg application and testOnDemandRTSPServer.
Have you tried a different stream? Also, I had a similar problem due to issues with my firewall, you might want to make sure you can actually stream data through those ports.
If you are missing a sync byte, it's probably a stream issue: try a different data source and see if that helps; try an .avi file or an .mp4 file, as .mp4 files are usually easy to stream. If the streaming works with the .mp4 file but not with your MPEG-TS file, then it's a problem in your file: ffmpeg is trying to figure out where each "frame" or "frame set" of data ends so that it can stream discrete chunks.
It's been over 2 years since I last worked with this stuff, so let me know if you get anywhere.