Can I use other HLS players on a CAF receiver? (Chromecast)

I would like to continue using most of what the CAF receiver handles for me, like reporting playback state to the sender, etc. However, I would like to sometimes (and I really mean only sometimes) use hls.js instead of whatever HLS player the CAF receiver uses.
The reason I want to do this is that some videos I've tested don't work with the normal CAF receiver but play fine with hls.js on the Chromecast. I have even filed a bug but haven't heard anything back.
Does anyone know how to do this?

Related

How to integrate our own player with Cobalt

From the source code of Cobalt, it can be seen that it uses the FFmpeg-related libraries (e.g. libasound, libavcodec, libavresample, libavutil) to decode and render/play video and audio as its own player (pull mode/push mode). Since the playback code is tightly coupled, from Cobalt initialization down to video decoding, and there is no unified interface for integrating another player, is there any guideline document or sample code for integrating a player other than FFmpeg with Cobalt?
The porting interface for the player is centered around SbPlayer, defined in src/starboard/player.h -- everything under src/starboard/shared/ should be considered an example, or starter code for you to use to implement SbPlayer. You may use all or none of it, as is convenient for you. The key is that you implement SbPlayer and the ancillary media porting APIs like SbMedia and SbDrm, and meet their described contracts.
Starboard (as defined in src/starboard/*.h) is the Cobalt porting interface, so you should not have to modify anything outside of your Starboard implementation in order to fully port Cobalt to a new platform. This will make later rebasing much easier, as Starboard is a version-controlled API, while any other code is subject to change at any time, without warning. There are not, and never will be, any direct references from Cobalt into Starboard implementation code except through the Starboard API, so you can swap out any portion of it as needed for your platform.
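To give a feel for the shape of it, here is a minimal sketch of a custom implementation. Only the simple entry points are shown, since the parameter lists for complex ones like SbPlayerCreate and SbPlayerWriteSample vary between Starboard versions; the authoritative prototypes are in src/starboard/player.h. MyPlatformPlayer is a hypothetical wrapper around whatever player you are integrating.

    #include "starboard/player.h"

    // Hypothetical wrapper around your own player stack.
    class MyPlatformPlayer {
     public:
      void SetVolume(double v) { /* forward to your player */ }
      bool SetRate(double r) { /* forward to your player */ return true; }
    };

    // SbPlayer is an opaque handle; your port defines what it points to.
    struct SbPlayerPrivate {
      MyPlatformPlayer* impl;
    };

    void SbPlayerSetVolume(SbPlayer player, double volume) {
      player->impl->SetVolume(volume);
    }

    bool SbPlayerSetPlaybackRate(SbPlayer player, double playback_rate) {
      return player->impl->SetRate(playback_rate);
    }

    void SbPlayerDestroy(SbPlayer player) {
      delete player->impl;
      delete player;
    }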

AudioTrack.getNativeOutputSampleRate reports wrong native rate

I have a native audio app already running on a lot of Android hardware. I use JNI to call AudioTrack.getNativeOutputSampleRate to get the native sample rate for the audio pipeline (duh). This is used to initialize OpenSL for audio output.
On the Google Tango device, getNativeOutputSampleRate returns 48 kHz. But using that rate yields glitchy audio. However, if I initialize the OpenSL player with 44.1 kHz, I get proper audio.
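Roughly, the setup looks like this (a trimmed sketch, not the actual app code; rate_millihz carries whatever getNativeOutputSampleRate returned, converted to OpenSL's milliHz units, and forcing SL_SAMPLINGRATE_44_1 instead is the workaround that gives clean audio):

    #include <SLES/OpenSLES.h>
    #include <SLES/OpenSLES_Android.h>

    static SLObjectItf engine_obj, mix_obj, player_obj;

    // Build the OpenSL output path at the given rate (in milliHz).
    bool InitPlayer(SLmilliHertz rate_millihz) {
      SLEngineItf engine;
      if (slCreateEngine(&engine_obj, 0, NULL, 0, NULL, NULL) != SL_RESULT_SUCCESS)
        return false;
      (*engine_obj)->Realize(engine_obj, SL_BOOLEAN_FALSE);
      (*engine_obj)->GetInterface(engine_obj, SL_IID_ENGINE, &engine);
      (*engine)->CreateOutputMix(engine, &mix_obj, 0, NULL, NULL);
      (*mix_obj)->Realize(mix_obj, SL_BOOLEAN_FALSE);

      // 16-bit stereo PCM at the requested rate.
      SLDataLocator_AndroidSimpleBufferQueue loc_bq = {
          SL_DATALOCATOR_ANDROIDSIMPLEBUFFERQUEUE, 2};
      SLDataFormat_PCM fmt = {
          SL_DATAFORMAT_PCM, 2, rate_millihz,  // e.g. SL_SAMPLINGRATE_44_1
          SL_PCMSAMPLEFORMAT_FIXED_16, SL_PCMSAMPLEFORMAT_FIXED_16,
          SL_SPEAKER_FRONT_LEFT | SL_SPEAKER_FRONT_RIGHT,
          SL_BYTEORDER_LITTLEENDIAN};
      SLDataSource src = {&loc_bq, &fmt};
      SLDataLocator_OutputMix loc_mix = {SL_DATALOCATOR_OUTPUTMIX, mix_obj};
      SLDataSink sink = {&loc_mix, NULL};

      SLInterfaceID ids[] = {SL_IID_ANDROIDSIMPLEBUFFERQUEUE};
      SLboolean req[] = {SL_BOOLEAN_TRUE};
      return (*engine)->CreateAudioPlayer(engine, &player_obj, &src, &sink,
                                          1, ids, req) == SL_RESULT_SUCCESS &&
             (*player_obj)->Realize(player_obj, SL_BOOLEAN_FALSE) ==
                 SL_RESULT_SUCCESS;
    }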
The fact that the code works on a lot of other phone types (and older OSes) makes me think it is the device itself that reports the wrong thing. But, hey, it's probably something I'm doing anyway...
Anybody have an idea?

Why does flash video seem to skip less than HTML5 video when my computer's processor is busy?

Flash video seems more stable and less prone to playback hiccups when the computer is overloaded and busy. Why would this be? I would expect native browser video playback to be more stable and performant, if anything.
I'm on Google Chrome and Windows, FWIW.
Flash established itself for playing video long before HTML5, so if anything I'd expect the browser's native support to be less stable. After all, video in Flash had years to mature.
Also, Chrome apparently does not use hardware-accelerated video decoding by default.
This depends on how the video is being streamed. The Flash player you are using might also be compressing the file, or streaming it differently than native HTML5 video.

Playing background live-streaming audio

Can anybody give me a link to a working example of playing background live-streaming audio on Windows Phone 7 (or 7.1)? I have seen a lot of examples (on microsoft.com too) and none of them works correctly for playing background live-streaming audio.
FYI, here's a URL of a live audio stream: http://radiozetmp3-02.eurozet.pl:8400/
Background audio is not supported on 7.0, only 7.1 (and above).
If you want to play streaming audio in a format/codec which is not natively supported by the phone you must do it with an AudioStreamingAgent. If it is a supported codec, you can use an AudioPlayerAgent (see sample here).
Using an AudioStreamingAgent is a nontrivial task and requires a deep understanding of the codec you need to play so you can convert it to something the phone understands. I know of one person who did this, for an H.264 stream, and it took a long time and much hair pulling to get it working. And before anyone asks: no, they are not able to share code from that project.
If you really must go down this route, the ManagedMediaHelpers (previously here) are a good place to start. But they don't cover all codecs, and this is potentially very complicated and not something well documented on the web.

How to receive MPEG-TS multicast from Windows

We currently have a system with live video encoded to an MPEG-TS multicast stream, being received by televisions with STBs. In addition to televisions we'd like to embed the video in our Windows application.
I know that VLC will receive the stream, but I would prefer a solution that I can embed in an existing application without playing window-moving games, and one without licensing problems. I realize that likely means I'm not looking at a free solution; that's fine, within reason.
Anyone know of a good product for this? Either something easy to use, or a plug-in for WMP.
You'll need to develop a simple DirectShow filter that listens on a given port and just passes down every packet it receives.
I don't have a sample handy, but it's really simple: several hundred lines of code.
Then you just connect this filter to an MPEG-2 demultiplexer capable of handling a transport stream.
NVidia and Elecard come to mind first, though the former does not connect under a debugger.
Then you connect the demultiplexer to the decoder and finally to the renderer.
The demultiplexers and decoders handle the live-stream issues well; you just capture the UDP packets and send them down.
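To illustrate, graph construction looks roughly like this (an untested sketch with error handling and pin connection elided; CLSID_UdpSourceFilter is a made-up GUID standing in for your custom source filter):

    #include <dshow.h>
    #pragma comment(lib, "strmiids.lib")

    // Made-up GUID standing in for your custom UDP source filter;
    // generate a real one with uuidgen.
    static const GUID CLSID_UdpSourceFilter =
        {0x12345678, 0x1234, 0x1234,
         {0x12, 0x34, 0x56, 0x78, 0x9a, 0xbc, 0xde, 0xf0}};

    HRESULT BuildTsGraph() {
      CoInitialize(NULL);

      IGraphBuilder* graph = NULL;
      CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC_SERVER,
                       IID_IGraphBuilder, (void**)&graph);

      // Your filter: listens on the multicast port and pushes every
      // packet it receives downstream.
      IBaseFilter* source = NULL;
      CoCreateInstance(CLSID_UdpSourceFilter, NULL, CLSCTX_INPROC_SERVER,
                       IID_IBaseFilter, (void**)&source);
      graph->AddFilter(source, L"UDP Source");

      // The stock Microsoft demux; a third-party one plugs in the same way.
      IBaseFilter* demux = NULL;
      CoCreateInstance(CLSID_MPEG2Demultiplexer, NULL, CLSCTX_INPROC_SERVER,
                       IID_IBaseFilter, (void**)&demux);
      graph->AddFilter(demux, L"MPEG-2 Demux");

      // Connect source -> demux, map demux output pins to the PIDs you
      // want, then connect them to your purchased decoder and a renderer.
      // ... pin connection elided ...

      IMediaControl* control = NULL;
      graph->QueryInterface(IID_IMediaControl, (void**)&control);
      return control->Run();  // release interfaces on teardown
    }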
Due to licensing issues, MPEG-2 decoders cannot be free (ffmpeg and VLC violate the license), so you'll have to buy a decoder.
Visit http://elecard.com; they have a nice range of MPEG-2 products.
Expanding on Quassnoi's answer...
You might check out the Haali Media Splitter to act as an "MPEG-2 demultiplexer." It is a filter that just pulls the compressed video and audio out of the transport stream, so I'm guessing it doesn't have any licensing issues. Most PCs with a DVD player already have a licensed DirectShow MPEG-2 decoder installed, so you can probably just use the one that's already there (or purchase a license from a place like Elecard if you really want to be safe).
As you are developing your DirectShow application, you might find Monogram GraphStudio to be a helpful tool in designing the filter chains.
