Can I use Kickflip with Parse to stream video TO server from Android device during recording - parse-platform

Parse is turning out to be convenient to use. However, the service does not allow streaming videos to the server. I need a service that allows my users to stream videos to the server while they are recording. I was able to find the Kickflip SDK. Does anyone know how I might be able to pair Kickflip with Parse for video streaming? Even if I were to use both services, as opposed to just the Kickflip SDK, how would I coordinate the two? Parse provides a rich social database, but storing a video as a ParseFile is limiting (no streaming and a 10 MB maximum).

This should work from a REST client with a normal POST (I would not do your scenario without adding something like Heroku/EC2 to the parse.com PaaS architecture, but it will work)... I don't know Kickflip, but you must know the low-level protocol details before you include it in a solution.
In your POST headers you would need to include the combination of header values consistent with a client posting a stream over HTTPS to Parse (more here), i.e. no Content-Length header, ask for chunked encoding, and use HTTP 1.1. On a file POST to Parse, IMO you don't need to worry about a timeout. You will have to append to the end of your posted byte stream the normal HTTP chunked-encoding byte sequence signifying END_OF_DATA.
Sample curl interaction with headers, etc.
Observe the relevant Parse file-size limitations (10 MB per file).
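As a rough sketch of that header combination (hypothetical credentials and a placeholder source command; Kickflip itself is not involved here), a chunked HTTP/1.1 POST to the Parse files REST endpoint could look like this:

    # some_recorder_command stands in for whatever writes the MP4 byte stream to stdout.
    some_recorder_command | curl -X POST "https://api.parse.com/1/files/clip.mp4" \
        -H "X-Parse-Application-Id: ${APP_ID}" \
        -H "X-Parse-REST-API-Key: ${REST_KEY}" \
        -H "Content-Type: video/mp4" \
        -H "Transfer-Encoding: chunked" \
        --data-binary @-
    # With Transfer-Encoding: chunked set, curl omits Content-Length and ends the body
    # with the zero-length chunk that signals END_OF_DATA under HTTP/1.1.

Note this only addresses the transfer mechanics; the 10 MB ParseFile cap mentioned in the question still applies.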

Related

WebSockets (STOMP) streaming audio from back-end Spring Boot to front-end Vuejs

I'm developing an audio streaming platform like Spotify for a school project. I used Vue.js for the front-end application and implemented an audio player. This is working.
Now I need a way to send audio files from the back-end to my front-end. I have a microservice called "streaming service" built with Spring Boot, and I used WebSockets (STOMP) to make a connection with the front-end. But I see that this is mostly used for chat or conference-call applications.
I have read that sending audio files via REST is heavy because connections are made repeatedly. What is the most efficient way to implement this?
Please be specific in your answer, since I'm not an advanced developer.
I have read that sending audio files via REST is heavy because connections are made repeatedly.
Not really. For something like Spotify, a normal progressive HTTP stream is sufficient. In that case only one TCP connection is typically made, over which a small handful of ranged HTTP requests are sent.
Web Sockets are only appropriate for cases where you need a bidirectional data flow. In this case, you just have requests and responses, which a normal HTTP request is suitable for.
Usage of regular HTTP also means you can utilize standard CDNs.
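As a sketch of what those ranged requests look like (the URL is made up for illustration):

    # Fetch the first 256 KB of the track; a cooperating server answers
    # "206 Partial Content" with a Content-Range header.
    curl -v -H "Range: bytes=0-262143" -o first_chunk.mp3 \
        "https://cdn.example.com/tracks/song.mp3"
    # The player requests further ranges as playback nears the end of its buffer.

Because these are plain HTTP responses, a CDN can cache them as-is, which is what makes the last point work.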

Live stream multi-bitrate video

Preface
I have read this two-part tutorial (Part-1 and Part-2) by Streamroot on MPEG-DASH, and below is my understanding (please correct me if I am wrong); a rough command sketch follows the list:
The video needs to be encoded into multiple bit-rates using FFmpeg.
The encoded videos need to be segmented and packaged ("dashified") using MP4Box.
The dashified videos can be served using a web server.
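A minimal sketch of the first two steps, with made-up bitrates and filenames (not taken from the tutorial):

    # 1. Encode the source into two bitrate renditions (example values only).
    ffmpeg -i source.mp4 -c:a aac -c:v libx264 -b:v 4500k -s 1920x1080 out_1080.mp4
    ffmpeg -i source.mp4 -c:a aac -c:v libx264 -b:v 2500k -s 1280x720 out_720.mp4
    # 2. "Dashify": segment the renditions and generate the MPD manifest.
    MP4Box -dash 4000 -rap -profile onDemand out_1080.mp4 out_720.mp4 -out manifest.mpd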
Problem
I intend to live-stream an event and I need help to understand the following:
Can I combine the FFmpeg and MP4Box commands into a single step? Maybe through a wrapper program, so that I do not have to run them separately? Is there any other or better solution?
How do I send the dashified content to the web server? FTP? Would any vanilla web server do?
Lastly, a friend hinted that I could also use GStreamer to achieve my objective, but I could not find any good resource on the internet for this. So where (and how) does GStreamer fit into the above process?
What format will you be getting out of your camera for your live event? There are many solutions much better suited to live streaming (the tutorial I wrote is for VOD streams only). You can check out simple solutions like Wowza Streaming Server, Nimble Streamer (free), etc., which take an RTMP stream and transform it into other formats (HLS, DASH, etc.).
Most of the live-streaming platforms can even do that for you (Livestream.com, YouTube, Twitch, or even Facebook now).
The dashified content will be requested as HTTP resources by the browser or other players. In the case of a VOD stream, indeed you just need to make the DASH segments available through a web server. For live content, you need something smarter that will encode, package the segments, and make them available on the fly.
GStreamer can transcode and transmux the original content, and can do it on the fly. You will be able to get different formats as outputs, like RTMP, HLS, and probably even MPEG-DASH. Then you still need to make your content available via a web server.
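FFmpeg (not mentioned in the answer) can do a similar on-the-fly job; a hedged sketch, assuming an RTMP ingest is already reachable (e.g. a local nginx-rtmp instance) and a web server exposes the output directory:

    # Pull the RTMP feed, transcode, and keep writing refreshed HLS segments
    # into a directory that is served over HTTP.
    ffmpeg -i rtmp://localhost/live/stream \
        -c:v libx264 -preset veryfast -c:a aac \
        -f hls -hls_time 4 -hls_list_size 6 -hls_flags delete_segments \
        /var/www/html/live/stream.m3u8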
In conclusion, if you just want to transmit an occasional live event, it's probably a lot easier to use a platform that will ingest your RTMP stream and do all the complicated steps for you.

Streaming from RxJava API Web Endpoint

I've just finished working through a book on RxJava, and whilst it answered a lot of questions I had, one still remains: how do I take a stream and allow a client to subscribe to it over HTTP (i.e. keep an open connection and stream events as they occur upstream)?
I'm content with the server-side details, but sending that to a client via HTTP seems to be a challenge. Whilst gRPC could be a great option, it's not supported in browsers. The Streams Spec looks like a good option once it is implemented. However, at this point in time, those are not viable options.
To achieve streaming functionality (e.g. from what I presume would be a REST/JSON API to a client website), am I looking at SockJS / WebSockets? Or is there some other protocol that can be harnessed?

Send compressed images from server and uncompress them in client

I'm developing a project using CakePHP 3 on the server side and Android on the client side. In this project I have to send a lot of product images from the server to the app. When the app requests the images of a product, the server looks up the image URLs in the database and sends them to the app in a JSON response. Then, in the app, I load the images using NetworkImageView from the Volley library.
This process works, but the images are heavy in size, so it consumes a lot of data on a mobile network like 4G. I can't lose image quality, so I can't process the images too much.
What I was thinking was to compress (somehow) the images on the server side, send the array of bytes through JSON, and decompress them in the app, so I could minimize data consumption.
I couldn't find any info on what I described above and I'm not sure if this is the right approach. Any help would be appreciated.
What I was thinking was to compress (somehow) the images on the server side, send the array of bytes through JSON, and decompress them in the app, so I could minimize data consumption.
JSON will increase, not lower, the amount of data that needs to be sent. It is just an envelope in your use case, and because JSON cannot carry raw binary, the image bytes would have to be base64-encoded, which makes them larger. Check the JSON spec.
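A quick way to see that inflation for yourself (photo.jpg is a placeholder filename):

    wc -c photo.jpg             # size of the raw JPEG in bytes
    base64 photo.jpg | wc -c    # roughly a third larger, before any JSON framing is added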
You want to enable gzip compression on your web server (Nginx here); check Google or superuser.com for the details.
But this won't make a dramatic difference for mobile use either when you send a 20-megapixel image. I would send small images and only send a larger version when needed, i.e. when the user zooms in. I guess that's doable.
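One way to produce such a small rendition on the server, assuming ImageMagick is available (filenames and dimensions are placeholders):

    # Create an 800px-bounded, quality-80 copy to serve in the product listing;
    # deliver product_full.jpg only when the user zooms in.
    convert product_full.jpg -resize 800x800 -quality 80 product_list.jpg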

Capture raw video byte stream for real time transcoding

I would like to achieve the following:
Set up a proxy server to handle video requests by clients (for now, say all video requests from any Android video client) from a remote video server like YouTube, Vimeo, etc. I don't have access to the video files being requested, hence the need for a proxy server. I have settled on Squid. This proxy should process the video signal/stream being passed from the remote server before relaying it back to the requesting client.
To achieve the above, I would either
1. Need to figure out the precise location (URL) of the video resource being requested, download it really fast, and modify it as I want before HTTP streaming it back to the client as the transcoding continues (simultaneously, with some latency)
2. Access the raw byte stream, pipe it into a transcoder (I'm thinking ffmpeg), and proceed with the streaming to the client (also with some expected latency).
Option #2 seems tricky to do but lends more flexibility to the kind of transcoding I would like to perform. I would have to actually handle raw data/packets, but I don't know if ffmpeg takes such input.
In short, I'm looking for a solution to implement real-time transcoding of videos that I do not have direct access to from my proxy. Any suggestions on the tools or approaches I could use? I have also read about GStreamer (but could not tell if it's applicable to my situation), and MPlayer/MEncoder.
And finally, a rather specific question: Are there any tools out there that, given a YouTube video URL, can download the byte stream for further processing? That is, something similar to the Chrome YouTube downloader but one that can be integrated with a server-side script?
Thanks for any pointers/suggestions!
You should ask single coding questions. What you asked is more like a general "how would I write my application". A few comments, though:
Squid is an HTTP proxy; video is usually streamed over e.g. RTSP.
Yes, there are tools that grab the RTSP URL from a YouTube URL; be sure to understand the terms of use of the video service before going that way, though (see the sketch below).
GStreamer has a gst-rtsp-server module that contains an RTSP server, which can also be used as a proxy for a given RTSP stream.
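Regarding the last, more specific question about grabbing the byte stream server-side: one commonly used tool (not named in the answer) is youtube-dl; a hedged sketch with a placeholder video ID:

    # Print the direct media URL without downloading anything.
    youtube-dl -g "https://www.youtube.com/watch?v=VIDEO_ID"
    # Or pipe the download straight into ffmpeg for on-the-fly processing.
    youtube-dl -o - "https://www.youtube.com/watch?v=VIDEO_ID" | \
        ffmpeg -i pipe:0 -c:v libx264 -c:a aac -f mpegts out.ts

As the answer notes, check the service's terms of use before relying on this.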
