I am streaming video and audio from my webcam/microphone over UDP. When I view the stream (even on the same machine) there is a delay of about 4 seconds. I have tried setting the UDP cache setting to 0 or 1, but it doesn't seem to help. I have also tried reducing the video and audio bit-rates, using mono sound and lowering the sample rate, all to no avail.
Does anyone have any ideas how I could reduce the delay to something better suited to a video conference, i.e. < 1 second?
Is there a setting I can apply to the viewer/streamer that can help?
Thanks,
Marc
If you are using the RTSP protocol to stream the video/audio, you can adjust the delay at:
tools->preferences->all->input/codecs->demuxers->RTP/RTSP->caching value
tools->preferences->all->input/codecs->demuxers->RTP->RTP de-jitter buffer length
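The same caching can also be set from the command line on the viewing side; option names vary between VLC versions (recent builds use the generic --network-caching, older ones expose per-protocol options such as --rtsp-caching), and the value below is just an example in milliseconds:
# play the RTSP stream with a ~200 ms playout buffer; too low a value may cause stutter
vlc --network-caching=200 rtsp://server:8554/stream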
Try this.
#!/bin/sh
# Stream the webcam (v4l2) plus ALSA audio as a low-latency H.264/TS RTP multicast
# and watch it locally; the sout chain is quoted so the shell leaves the braces alone.
ETH=eth0
# Sender: x264 with zerolatency tuning, intra-refresh and a short GOP to minimise encoder delay.
cvlc --miface=$ETH v4l2:///dev/video0 :input-slave=alsa://hw:0,0 ':sout=#transcode{vcodec=h264,venc=x264{preset=ultrafast,tune=zerolatency,intra-refresh,lookahead=10,keyint=15},scale=auto,acodec=mpga,ab=128}:rtp{dst=224.10.0.1,port=5004,mux=ts}' :sout-keep >/dev/null 2>/dev/null &
vlc1=$!
# Receiver: join the multicast group on the same interface.
vlc --miface=$ETH rtp://224.10.0.1 >/dev/null 2>/dev/null &
vlc2=$!
# Stop the sender once the viewer exits.
wait $vlc2
kill -9 $vlc1
I get about a 2 second delay with a 720p webcam; it produces about 2.5 Mbit/s of traffic and loads one core at roughly 30%.
In my study of VLC streaming with a webcam, I got a 2-3 second delay for a UDP multicast stream transcoded with the WMV/ASF container + WMV2 codec from a Dell Creative Integrated Webcam at CIF video size.
Using the MP4/MOV container + H.264 codec, I got twice the delay of the former with the same bitrate, fps and scale settings.
I disabled audio in both streaming setups since I wasn't interested in it.
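For illustration, a WMV2/ASF multicast sout chain of the kind described above generally looks something like this; the dshow:// capture input, multicast address, port and bitrate are placeholders rather than the exact settings used in the study:
vlc dshow:// --sout "#transcode{vcodec=WMV2,vb=512,fps=25,scale=0.5}:std{access=udp,mux=asf,dst=239.0.0.1:1234}"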
I did the study with two VLC versions:
VLC 1.1.11 (latest Windows stable release)
VLC 2.1.0 (latest nightly build version)
With the first version, I could transcode and stream from the webcam, but it could not play back the stream properly (it just gave a black video stream).
With the second version, it worked well for transcoding, streaming and playback.
This study was done on:
Intel Core 2 Duo T7250
4GB DDR2-667 SDRAM
SATA 7200 RPM HDD
GeForce 8400M GS 128MB GDDR3 (+ 128MB shared memory = 256MB video memory)
Windows XP Pro SP3
In Ant Media Server, after recording a stream on Windows using the API, the VOD plays fine on Windows. But when playing the same VOD on macOS with QuickTime Player v10.5, the video freezes after a few seconds while the audio continues.
VOD playback with QuickTime Player is fine for recordings made on macOS.
How can I overcome this, and is it expected behaviour?
TL;DR:
Transcode the video with ffmpeg after recording or add at least one adaptive bitrate on the Ant Media Server side.
This is a known issue in QuickTime Player. The problem also exists for macOS/iOS and Safari. Let me explain the cause of the problem and offer a solution.
Problem:
The resolution may change during a WebRTC session according to network conditions, so the resolution of the recording switches to a lower or higher resolution partway through the file.
Most players and browsers can handle that. Safari and QuickTime Player, on the other hand, cannot handle resolution changes, and the problem you mention appears.
Solution:
Transcoding the stream into one fixed resolution with ffmpeg, or enabling adaptive bitrate on the server side, resolves this issue. A typical ffmpeg command is sufficient:
ffmpeg -i INPUT.mp4 OUTPUT.mp4
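If you want to pin the output to one explicit resolution, you can add a scale filter; the 1280x720 below is only an example value:
ffmpeg -i INPUT.mp4 -vf scale=1280:720 -c:a copy OUTPUT.mp4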
A. Oguz antmedia.io
I'm investigating an issue with an in-house developed app that uses WMF to capture UVC data from a Cypress FX3 device. The stream is generated by a test pattern generator in an FPGA feeding the FX3. For a frame size of 1920x1080 (@ 30 FPS), frame capture works fine. For a frame size of, say, 3264x2448 (at < 8 FPS to meet throughput restrictions) the app gets stuck during ReadSample(). Data does appear to be received, as the test image pattern can be seen in memory. Device enumeration looks OK, i.e. the reported descriptors appear correct, and SelectFormat() is set accordingly.
Are there any restrictions on frame size?
I see that the problem with your hardware is very specific and may be related to the hardware itself. A frame size of 3264x2448 is very large. I have experience with the Logitech HD Pro Webcam C920, and the maximum frame size I can get is 2304x1536 at 2 FPS in RGB24. If the device is connected via USB, then transmitting that much data over the USB bus can be a problem (especially for USB 2.0). You should know that Windows Media Foundation is the engine of the media part of Windows - for example, Windows Store apps can only work with Media Foundation. Moreover, Windows 10 includes an encoder and decoder for HEVC (H.265) video and supports 4K and 8K natively, but playing video files and working with live video over USB are somewhat different things.
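As a rough sanity check (assuming uncompressed RGB24 frames, which may not match your actual UVC format): 3264 x 2448 x 3 bytes is about 24 MB per frame, so even at 8 FPS that is roughly 190 MB/s, far beyond the ~35-40 MB/s that USB 2.0 sustains in practice. Frame sizes like this are only realistic over USB 3.0 or with on-device compression such as MJPEG.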
I need to live broadcast multiple RTSP streams out of the audio mixing software StudioOne. For this I am using the JACK Audio Connection Kit as the connector. I've already tried IceCast with Darkice, but the latency went up to 6+ seconds, which won't work for the project I'm working on. That's why I'm using the Wowza media server, which does RTSP streaming instead of HTTP.
That's where I'm stuck, as I need some way of getting the streams from JACK Audio to Wowza on a Mac OS machine. I've tried using FFmpeg, but FFmpeg doesn't support JACK Audio input in its OS X version. I could port my whole setup to Ubuntu, but StudioOne isn't available on Ubuntu. I could try running StudioOne under Wine on Linux, but I'm not sure that's a good idea for a real-time mixer, especially when latency is involved.
Is there some other way I can get input from JACK Audio to Wowza Media Server on my Mac?
JACK on OS X is now in FFmpeg as of this commit (67f8a0be545).
Once you have JACK installed, you can compile FFmpeg from source and support should be automatically compiled into FFmpeg.
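Once that is working, a pipeline roughly like the following can bridge JACK to Wowza; the JACK client name, the Wowza host/application/stream names and the port routing are all placeholders to adapt (use jack_lsp to see the real port names on your system):
# ffmpeg registers itself as a JACK client named "ffjack" and pushes AAC audio to Wowza over RTSP
ffmpeg -f jack -i ffjack -c:a aac -b:a 128k -f rtsp rtsp://wowza-host:1935/live/mix &
# then wire the mixer's output ports into ffmpeg's input ports
jack_connect "StudioOne:out_l" "ffjack:input_1"
jack_connect "StudioOne:out_r" "ffjack:input_2"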
I'm trying to stream my webcam with FFmpeg to my Red5 server using RTMP. I've done this successfully using FLV format with the following line:
ffmpeg -f video4linux2 -i /dev/video0 -f flv rtmp://localhost/live/livestream
I'm new to FFmpeg and live streaming, and I've tried to stream using H.264/MPEG-4, but my knowledge of the FFmpeg options is a bit limited (I found them here: http://man.cx/ffmpeg%281%29).
So, my questions would be:
How can I use H.264/MPEG-4 to stream to my Red5 server?
What are the options to stream audio as well?
And one final issue:
I'm getting a delay of about 5 seconds when I play the content with JWPlayer in Mozilla Firefox (on Ubuntu). Can you please help me solve this? Any suggestions why this might be?
Many thanks
There is no need to use ffmpeg for streaming H.264/MPEG-4 files, because Red5 has built-in support for this. Using ffmpeg will only put unnecessary load on your CPU. Red5 will recognize the file type automatically; you only have to specify the mp4 file in your JWPlayer.
About the delay: as far as I know, JWPlayer has a buffer of 3 seconds by default. You can try to lower this (the property is bufferlength or something like that). JWPlayer may also have a "live" property to stream with minimal delay, but I am not sure about that. Removing ffmpeg will probably speed things up as well.
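That said, if you do still want to push a live H.264 feed from the webcam with ffmpeg (rather than serve a recorded file), a command roughly like the following is commonly used; the ALSA audio device and the application/stream names are assumptions to adapt to your Red5 setup:
# capture the webcam and default ALSA audio, encode with low-latency x264, publish over RTMP
# (older ffmpeg builds may need -strict experimental for the built-in AAC encoder)
ffmpeg -f video4linux2 -i /dev/video0 -f alsa -i default -c:v libx264 -preset ultrafast -tune zerolatency -c:a aac -b:a 128k -f flv rtmp://localhost/live/livestream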
I'm using the ManagedMediaHelpers project as the base for a background audio streaming project.
The audio output is fine, but sometimes it takes 1 to 6 seconds to start playing. During that time the device produces some strange noises, similar to hiccups or scratching.
The MP3 I'm trying to stream has the following properties:
Bitrate: 320000
Sampling Rate: 44100
What are the possible causes of that kind of noise at the start of the stream, when the rest of the MP3 plays just fine?
More info
I noticed that during the hiccups the FPS count was below 20, so I tried to profile the application.
I got the following message during the hiccup period:
Warning: Very high CPU usage by system threads: System and other applications
are using 65.02% of the CPU. This CPU usage may be caused by other
tasks that are running on the system or they may be caused by system
operations that are triggered by a user application. Ensure that no
other tasks are running on the system during profiling.
It was a hardware limitation of the HTC Radar.
I just tried the same code on a Samsung OMNIA 7 and the stream is perfect. There's also no penalty on the FPS count, so I think there's no CPU hogging on this device.
Strangely, gsmarena says that both devices have the same CPU.