libsourcey & OpenCV stream example - C++11

I am trying to build a second, annotated, web-based video view for our security cams.
Basically, we have an OpenCV/C++11-based solution that draws boxes/overlays on the important objects in the video frames. We can view the result locally with imshow.
We want to stream these frames so they can be accessed from a web-browser-based app via IP addresses.
The question is:
What is the recommended third-party library for this simple(!) requirement (libsourcey, FFmpeg, libVLC, etc.)?
Is there any example of streaming from OpenCV to, e.g., MP4 that we can access from a browser?
At first sight the video streaming libraries look very complicated, and honestly we are confused.
Thanks a lot.
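For what it's worth, the lightest-weight approach here is usually to skip the heavy streaming libraries entirely and pipe the annotated frames into an FFmpeg process, letting FFmpeg do the encoding and packaging. Below is a minimal sketch; it is written in Python for brevity (the identical pipe approach works from C++ via popen), and the camera index, frame rate, and output path are assumptions to adapt:

```python
import subprocess
import cv2

cap = cv2.VideoCapture(0)  # assumed camera/source
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))

# FFmpeg reads raw BGR frames from stdin and writes HLS segments that any
# plain web server can serve; browsers play the playlist via hls.js
# (or natively on Safari).
ffmpeg = subprocess.Popen([
    "ffmpeg", "-y",
    "-f", "rawvideo", "-pix_fmt", "bgr24",
    "-s", f"{width}x{height}", "-r", "25",
    "-i", "-",                       # raw frames arrive on stdin
    "-c:v", "libx264", "-preset", "veryfast", "-g", "50",
    "-f", "hls", "-hls_time", "2", "-hls_list_size", "5",
    "-hls_flags", "delete_segments",
    "/var/www/html/stream.m3u8",     # hypothetical web root
], stdin=subprocess.PIPE)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # ...draw your boxes/overlays on `frame` here, as you do before imshow...
    ffmpeg.stdin.write(frame.tobytes())
```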

Related

Stream microphone from client browser to remote server and pass audio in real time to ffmpeg to combine with a second video source

As a beginner at working with these kinds of real-time streaming services, I've spent hours trying to work out how this is possible, but can't seem to work out precisely how I'd go about it.
I'm prototyping a personal basic web app that does the following:
In a web browser, the web application has a button that says 'Stream Microphone' - when pressed, it streams the audio from the user's microphone (the user obviously has to consent to give permission to send their microphone audio) to the server, which I was presuming would be running node.js (no specific reason at this point, just thought this is how I'd go about doing it).
The server receives the audio close enough to real-time somehow (not sure how I'd do this).
I can then run ffmpeg on the command line, take the audio as it arrives in real time, and add it as the soundtrack to a video file that I want to play (let's just say testmovie.mp4).
I've looked at various solutions - such as maybe using WebRTC, RTP/RTSP, piping audio into ffmpeg, GStreamer, Kurento, Flashphoner and/or Wowza - but somehow they look overly complicated and usually focus on video along with audio. I just need to work with audio.
As you've found, there are numerous options for receiving the audio from a WebRTC-enabled browser. From easiest to most difficult, they are probably:
Use a WebRTC-enabled server such as Janus, Kurento, Jitsi (not sure about Wowza), etc. These servers tend to have plugin systems, and one of them may already have the audio-mixing capability you need.
If you're comfortable with Node, you could use the werift library to receive the WebRTC audio stream and then forward it to FFmpeg (see the sketch after this list).
If you want to take full control over the WebRTC pipeline, and potentially do the audio mixing as well, you could use GStreamer. From what you've described, it should be capable of doing the complete task without involving a separate FFmpeg process.
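Whichever of these options you choose, the FFmpeg half of the pipeline looks the same: feed the decoded PCM into ffmpeg's stdin and mux it with the video file. A minimal sketch (the sample format/rate and the on_audio_chunk hook are assumptions; match whatever your WebRTC stack actually delivers):

```python
import subprocess

# ffmpeg muxes live PCM from stdin with the video track of testmovie.mp4.
# s16le / 48 kHz / stereo is an assumption; match your WebRTC stack.
ffmpeg = subprocess.Popen([
    "ffmpeg", "-y",
    "-f", "s16le", "-ar", "48000", "-ac", "2", "-i", "pipe:0",  # live audio
    "-i", "testmovie.mp4",
    "-map", "1:v", "-map", "0:a",   # video from the file, audio from stdin
    "-c:v", "copy", "-c:a", "aac",
    "-shortest",
    "output.mp4",
], stdin=subprocess.PIPE)

def on_audio_chunk(pcm_bytes):
    """Hypothetical hook: call with each PCM chunk your library delivers."""
    ffmpeg.stdin.write(pcm_bytes)
```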
The way we did this is by creating a Wowza module in Java that would take the audio from the incoming stream, take the video from wherever you want it, and mix them together.
There's no reason to introduce a third party like ffmpeg into the mix.
There's even a sample from Wowza for this: https://github.com/WowzaMediaSystems/wse-plugin-avmix

Live stream multi-bitrate video

Preface
I have read this two-part tutorial (Part 1 and Part 2) by Streamroot on MPEG-DASH, and below is my understanding (please correct me if I am wrong):
The video needs to be encoded into multiple bit-rates using FFmpeg.
The encoded videos need to be packaged ("dashified") using MP4Box.
The dashified videos can be served using a web server.
Problem
I intend to live-stream an event and I need help to understand the following:
Can I combine the FFmpeg and MP4Box commands into a single step, maybe through a wrapper program, so that I do not have to run them separately (see the sketch after these questions)? Is there any other or better solution?
How do I send the dashified content to the web server? FTP? Would any vanilla web server do?
Lastly, a friend hinted that I could also use GStreamer to achieve my objective, but I could not find any good resource on the internet about it. So, where (and how) does GStreamer fit into the above process?
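For reference, point 1 is very doable with a thin wrapper that simply chains the two commands. A minimal sketch (the bitrate ladder, filenames, and segment length are placeholders):

```python
import subprocess

SOURCE = "input.mp4"                           # placeholder source file
RENDITIONS = [("1500k", 720), ("800k", 480)]   # example bitrate ladder

dash_inputs = []
for bitrate, height in RENDITIONS:
    out = f"video_{height}p.mp4"
    # Step 1: encode each rendition with FFmpeg.
    subprocess.run([
        "ffmpeg", "-y", "-i", SOURCE,
        "-c:v", "libx264", "-b:v", bitrate,
        "-vf", f"scale=-2:{height}",
        "-c:a", "aac",
        out,
    ], check=True)
    dash_inputs.append(out)

# Step 2: package ("dashify") all renditions into a single manifest.
subprocess.run(
    ["MP4Box", "-dash", "4000", "-rap", "-out", "manifest.mpd"]
    + dash_inputs,
    check=True)
```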
What format will you be getting out of your camera for your live event? There are many solutions far better adapted to live streaming (the tutorial I wrote is for VOD streams only). You can check out simple solutions like Wowza Streaming Engine, Nimble Streamer (free), etc., which take an RTMP stream and transform it into other formats (HLS, DASH, etc.).
Most of the live-streaming platforms can even do that for you (livestream.com, YouTube, Twitch, or even Facebook now).
The dashified content will be requested as HTTP resources by the browser or other players. In the case of a VOD stream, indeed, you just need to make the DASH segments available through a web server. For live content, you need something smarter that encodes, packages the segments, and makes them available on the fly.
GStreamer can transcode and transmux the original content, and can do it on the fly. You will be able to get different formats as outputs, like RTMP, HLS, and probably even MPEG-DASH. Then you still need to make your content available via a web server.
In conclusion, if you just want to transmit an occasional live event, it's probably a lot easier to use a platform that will ingest your RTMP stream and do all the complicated steps for you.
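For completeness: FFmpeg on its own can also ingest the RTMP feed and package live DASH on the fly, which covers the "something smarter" part for simple setups. A sketch (the ingest URL and output path are placeholders):

```python
import subprocess

# Ingest a live RTMP feed and write a rolling MPEG-DASH manifest plus
# segments straight into a web-served directory.
subprocess.run([
    "ffmpeg",
    "-i", "rtmp://localhost/live/mystream",   # placeholder ingest URL
    "-c:v", "libx264", "-preset", "veryfast",
    "-c:a", "aac",
    "-f", "dash",
    "-window_size", "5",        # keep only the last few segments listed
    "-remove_at_exit", "1",
    "/var/www/html/live/manifest.mpd",
])
```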

Laravel project + media server for both live and VOD streaming, deployed on Docker

After many hours of research and nothing relevant coming up I decided to ask.
I am pretty new to the concept of video streaming, so please forgive me if my questions may seem elementary.
I am building a project that needs to include media streaming functionality. It should have the following options:
VOD - the user uploads a file to the server, which needs to be transcoded to a few MP4 files of different resolutions. For transcoding I am trying an approach using CloudTranscode (https://github.com/bfansports/CloudTranscode) deployed as a Docker image. The server should supply the stream to the player with a certain buffer size, so when playback is paused we buffer, for instance, +5 seconds and that's it. Adaptive bitrate would be nice; however, I'm not sure how this works with different players (I was thinking about using Video.js due to its high customization options, plus it's free).
Live video capturing - the user visits a certain page that captures video from the webcam and sends the stream to the server for further distribution to clients. For most browsers WebRTC could be a good option, but iOS devices probably won't work with it, so any suggestions here would be much appreciated.
Live video streaming - users visit a certain page where they can watch the stream captured from the user mentioned in point 2. The stream may be watched by one or many users (anywhere from 1 to 10,000).
Cutting to the chase, my questions follow:
What would be the best media server software for this purpose, keeping in mind high scalability (deployed as a Docker container on AWS EC2), a potentially huge load of both streaming and watching users, and multi-device/platform/browser support?
What would be the best media player for the web page that (again) is cross-browser/platform/device, keeping in mind good integration with the media server itself for adaptive-resolution streaming? It would also be nice if the player had broad customization options for its appearance (for instance, thumbnail display when hovering over the timeline).
Do you know any better solution for video transcoding than the mentioned CloudTranscode, keeping in mind the Docker setup and an easy-to-use API? Some on-the-fly transcoding would be nice, so the worker wouldn't need to wait for the whole file to be uploaded (see the sketch after these questions).
What happens if I use the autoscaling functionality on EC2 and more media server instances are started automatically? Let's say we have instance 1 (I1) and instance 2 (I2). A user starts broadcasting on I1, and 1,000 users are watching the stream, which is that instance's limit because it is running out of resources. Next, a couple more users try to view the stream, so the AWS load balancer connects them to I2 - how does that work with a live stream? Sorry, but I am a total newbie to the concept, so again - forgive me for the elementary questions.
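On question 3, the on-the-fly part is mostly a plumbing question: a transcoder such as FFmpeg can start reading the container from a pipe while the upload is still arriving, instead of waiting for a finished file on disk. A sketch (the on_upload_chunk hook is hypothetical, and plain MP4 uploads need a streamable layout for this to work):

```python
import subprocess

# FFmpeg reads the incoming container from stdin and starts transcoding
# immediately; fragmented MP4 output is used because it can be written
# without seeking.
ffmpeg = subprocess.Popen([
    "ffmpeg", "-i", "pipe:0",
    "-c:v", "libx264", "-vf", "scale=-2:480",
    "-c:a", "aac",
    "-movflags", "frag_keyframe+empty_moov",
    "-f", "mp4", "480p.mp4",
], stdin=subprocess.PIPE)

def on_upload_chunk(chunk):
    """Hypothetical hook: call with each chunk as the upload arrives."""
    ffmpeg.stdin.write(chunk)
```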
So far I was able to find a few media servers that may be relevant to my needs, including:
Wowza Media Server (paid)
Red5 media server (free)
Kurento Media Server (free)
My application is written in Laravel, ergo I need some PHP integration with the media server.
Obviously, free solutions are the most welcome; however, I do not mind paying as long as the paid solution covers my needs.
Any input here will be much appreciated - even partial solutions/suggestions. I'm kind of stuck here, so anything that can bring me closer to the solution is very welcome!
Best regards
If anyone needs this information: I ended up using the Nginx Plus media server functionality. It's capable of serving both live and VOD streams, it has an out-of-the-box load balancer to switch traffic over multiple container instances, and many more great features. Plus, they have images that deploy directly from the AWS Marketplace, and the license is paid hourly while the EC2 instance is running. Of course there is a free version as well, but I am really satisfied with the Nginx Plus support.
Capturing the live stream from the user I've done using getUserMedia() in JS. There are still minor glitches, but I will get it to work (the problems are related to the WebM chunks that the MediaRecorder API spits out, but I'm almost done, using a piece of Python code that modifies each chunk on the server side).
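For anyone hitting the same WebM-chunk issue: the usual cause is that only MediaRecorder's first chunk carries the EBML header, so later chunks are not standalone files. One common server-side fix (a sketch, not necessarily exactly what I used) is to append the chunks in arrival order and remux the whole stream through ffmpeg:

```python
import subprocess

def remux(chunks, out_path):
    """Concatenate ordered MediaRecorder chunks and remux to a clean file.

    `chunks` is assumed to be the list of received chunk bytes, in order.
    """
    ffmpeg = subprocess.Popen(
        ["ffmpeg", "-y",
         "-fflags", "+genpts",    # regenerate timestamps across chunks
         "-i", "pipe:0",
         "-c", "copy", out_path],
        stdin=subprocess.PIPE)
    for chunk in chunks:
        ffmpeg.stdin.write(chunk)
    ffmpeg.stdin.close()
    ffmpeg.wait()
```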
If anyone needs help I will be happy to help.

I want to upload a camera video stream to Amazon S3 and download it to an Android phone. I'm completely new to this. How can I do this?

I'm really dumb and new to RTP/SIP. Is there a recommended stack for uploading video to the cloud from a camera attached to a microprocessor? And what's the difference between all the things I'm seeing - MPEG-DASH, Live555, ffmpeg, and so on?
How does WhatsApp or Dropcam transmit live video?
Uploading a video on its own is fairly trivial if the video has already been captured: you will just need to give your app access to the local media store, list the files, and then upload using any standard technology, e.g. HTTP PUT, HTTP POST, FTP, S3, etc.
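For the S3 route specifically, the upload is a couple of lines with the AWS SDK. A sketch in Python with boto3 (on Android the AWS Mobile SDK exposes the same operations; the bucket and key names here are placeholders):

```python
import boto3

s3 = boto3.client("s3")

# Upload the captured file (placeholder paths/names).
s3.upload_file("/path/to/captured_video.mp4",
               "my-video-bucket", "videos/captured_video.mp4")

# A presigned GET URL lets the phone download without AWS credentials.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "my-video-bucket",
            "Key": "videos/captured_video.mp4"},
    ExpiresIn=3600)
print(url)
```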
If you want to process the video first, you would be wise to process it via a library such as FFmpeg, which has been compiled for Android, e.g. https://trac.ffmpeg.org/wiki/How%20to%20compile%20FFmpeg%20for%20Android
If you want to live-stream the video (hence your reference to RTP/SIP etc.), you will need to access the camera. You could start with something like the Kickflip SDK as a basis, which includes a bundled ffmpeg: https://github.com/Kickflip/kickflip-android-sdk or libstreaming: https://github.com/fyhertz/libstreaming

Capture raw video byte stream for real time transcoding

I would like to achieve the following:
Set up a proxy server to handle video requests from clients (for now, say all video requests from any Android video client) to a remote video server like YouTube, Vimeo, etc. I don't have access to the video files being requested, hence the need for a proxy server. I have settled on Squid. This proxy should process the video signal/stream being passed from the remote server before relaying it back to the requesting client.
To achieve the above, I would either
1. Need to figure out the precise location (URL) of the video resource being requested, download it really fast, and modify it as I want before HTTP-streaming it back to the client as the transcoding continues (simultaneously, with some latency), or
2. Access the raw byte stream, pipe it into a transcoder (I'm thinking ffmpeg), and proceed with the streaming to the client (also with some expected latency).
Option #2 seems trickier, but it lends more flexibility to the kind of transcoding I would like to perform. I would have to actually handle raw data/packets, but I don't know if ffmpeg takes such input.
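As it turns out, ffmpeg does take such input: it can read the compressed byte stream from stdin (pipe:0) and write the transcoded stream to stdout (pipe:1), so a byte-in/byte-out relay inside the proxy is feasible. A sketch (the codec and container choices are placeholders):

```python
import subprocess

# ffmpeg as a pass-through transcoder: bytes from the remote server go in
# on stdin, transcoded bytes for the client come out on stdout. Fragmented
# MP4 is used because it can be written to a non-seekable pipe.
ffmpeg = subprocess.Popen(
    ["ffmpeg",
     "-i", "pipe:0",
     "-c:v", "libx264", "-c:a", "aac",
     "-movflags", "frag_keyframe+empty_moov",
     "-f", "mp4", "pipe:1"],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE)

# In a real proxy, feed ffmpeg.stdin and drain ffmpeg.stdout from separate
# threads (or use non-blocking I/O) to avoid deadlocking on pipe buffers.
```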
In short, I'm looking for a solution for real-time transcoding of videos that I do not have direct access to, running on my proxy. Any suggestions on the tools or approaches I could use? I have also read about GStreamer (but could not tell whether it applies to my situation) and MPlayer/MEncoder.
And finally, a rather specific question: are there any tools out there that, given a YouTube video URL, can download the byte stream for further processing? That is, something similar to the Chrome YouTube downloader, but one that can be integrated with a server-side script?
Thanks for any pointers/suggestions!
You should ask single coding questions. What you asked is more like a general "how would I write my application". A few comments, though:
Squid is an HTTP proxy; video is usually streamed over e.g. RTSP.
Yes, there are tools that grab the RTSP URL from a YouTube URL; be sure you understand the terms of use of the video service before going that way, though.
GStreamer has a gst-rtsp-server module that contains an RTSP server, which can also be used as a proxy for a given RTSP stream.
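A minimal sketch of that last option, using gst-rtsp-server's Python bindings to re-serve an existing RTSP stream (the source URL is a placeholder, and the pipeline assumes an H.264 source):

```python
import gi
gi.require_version("Gst", "1.0")
gi.require_version("GstRtspServer", "1.0")
from gi.repository import Gst, GstRtspServer, GLib

Gst.init(None)

server = GstRtspServer.RTSPServer()
factory = GstRtspServer.RTSPMediaFactory()
# Pull the upstream RTSP stream, depayload/parse it, and re-payload it
# for our own clients (no re-encoding).
factory.set_launch(
    "( rtspsrc location=rtsp://example.com/source ! rtph264depay "
    "! h264parse ! rtph264pay name=pay0 pt=96 )")
factory.set_shared(True)  # one upstream connection shared by all clients

server.get_mount_points().add_factory("/proxy", factory)
server.attach(None)  # clients connect at rtsp://<host>:8554/proxy
GLib.MainLoop().run()
```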
