Application gets an ANR when trying to change *.mpd tracks - Chromecast

I'm trying to implement Chromecast support in my app. I'm using the following libraries:
implementation 'com.google.android.exoplayer:exoplayer:2.9.6'
implementation 'com.google.android.exoplayer:extension-cast:2.7.0'
My videos are DRM-encrypted. I provide my custom receiver with a stream URL and a proxy URL. The video plays on the TV via Chromecast. The problem appears when I try to change the video or subtitle tracks. Using the ExpandedControllerActivity provided by the libraries above, I can open and close the native dialog that lists my video and subtitle tracks, but I get an ANR when I choose another track.

Related

Embed M3U8 in Joomla 3

I have some IPTV URLs in .m3u8 (playlist) and .ts format, like these:
http://123.345.543/live/abcd/123456.m3u8
http://123.345.543/abcd/123456.ts
I tried the AllVideos player component and many other player plugins without success.
Can someone recommend an extension or plugin that can play these types of files?
Thank you.
If you are saying that you can't play the above streams using the HTML5 video tag, then yes, this is expected behaviour at this time.
To play HLS and DASH video streams in a web browser you would typically use a JavaScript player such as dash.js, Bitmovin, or Shaka Player.
You can see example players which will allow you test the streams here:
http://dashif.org/reference/players/javascript/1.4.0/samples/dash-if-reference-player/
https://bitmovin.com/hls-fragmented-mp4/
https://shaka-player-demo.appspot.com
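Which player fits depends on the stream type, which you can usually infer from the URL. A minimal sketch of that check (the helper name and return values are my own, not from any player library):

```javascript
// Hypothetical helper: guess the streaming protocol from a URL's extension.
// .m3u8 -> HLS playlist, .mpd -> MPEG-DASH manifest, .ts -> raw MPEG-TS segment.
function guessStreamType(url) {
  const path = url.split('?')[0].toLowerCase();
  if (path.endsWith('.m3u8')) return 'hls';
  if (path.endsWith('.mpd')) return 'dash';
  if (path.endsWith('.ts')) return 'mpeg-ts';
  return 'unknown';
}
```

hls.js or Shaka Player can handle the 'hls' case and dash.js or Shaka Player the 'dash' case; a bare .ts segment is generally not playable in a browser without remuxing.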

Getting cast.receiver.MediaManager load metadata error with custom media player for Chromecast

This is my first time building Chromecast apps.
I started with CastVideos-android paired with a Styled Media Receiver using a custom skin URL. After some hurdles, I was able to get the custom skin to work, and the video clips from the sender app play nicely.
Now I'm attempting a custom media player using the sample CastHelloVideoPlayer receiver app from Google's sample list, paired with the CastVideos-android sender app. After creating a new application ID and recompiling CastVideos-android, I tried to cast some videos to the cast device.
1) The first thing I noticed is that the TV screen is completely blank. No default app name or anything, just a plain black screen. I didn't think much about it, since this is a custom media player and a lot of things may not be set, such as the logo/splash/watermark.
2) The main issue: when I try to play a video clip, the cast device remains blank. Looking at the Chrome debugging console I noticed this error message:
[ 32.941s] [cast.receiver.MediaManager] Load metadata error: [object Object]
    pd # cast_receiver.js:formatted:2249
    nd.Zc # cast_receiver.js:formatted:2234
    tb.log # cast_receiver.js:formatted:675
    G # cast_receiver.js:formatted:710
    W.Yb # cast_receiver.js:formatted:4855
    g.Yb # cast_receiver.js:formatted:3660
    Jc # cast_receiver.js:formatted:1500
    Gc # cast_receiver.js:formatted:1550
    (anonymous function) # cast_receiver.js:formatted:1447
[ 32.955s] [cast.receiver.MediaManager] Sending error message to b5d9d1e6-f6d6-a0bd-440c-fe7255ebfcbc.11:com.google.sample.cast.refplayer-172
I'm surprised to encounter this, because the same video clips played nicely when I was using the Styled Media Receiver. Why do they fail with the sample CastHelloVideoPlayer?
Just found out that the sample receiver app I took from the Chromecast samples page is limited to the default media containers defined in the Custom Receiver supported media formats, while the sample sender app was sending m3u8 containers. After changing the sender app to select the correct target media (mp4), everything started working.
To support HLS, you need to use the Google Cast Media Player Library (MPL), which is not part of any of the sample apps.
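The sender side has to describe the media correctly as well: the contentType in the load request should match the container being sent. A hedged sketch of that mapping (the helper name is mine; the MIME strings are the standard ones):

```javascript
// Hypothetical helper: map a media URL to the MIME contentType a sender
// would declare in its load request. A default receiver plays mp4 natively,
// while .m3u8 (HLS) requires a receiver built on the Media Player Library.
function contentTypeFor(url) {
  const path = url.split('?')[0].toLowerCase();
  if (path.endsWith('.m3u8')) return 'application/x-mpegurl';
  if (path.endsWith('.mpd')) return 'application/dash+xml';
  if (path.endsWith('.mp4')) return 'video/mp4';
  return 'application/octet-stream';
}
```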

FlowPlayer with Wowza

We're using Flowplayer with Wowza. We've managed to get the stream to start when the page opens; however, the stream is made up of several individual videos, and between videos the user has to click play to resume the stream. Is there any way to get Flowplayer to automatically play the next video? Thanks.
Alright.
Assuming you are using the Flash version of Flowplayer (not the HTML5 one): Flowplayer offers a complete API, with events and properties to modify every aspect of the original configuration. For instance, you could declare an 'onFinish' handler on the clip and then load another clip automatically.
Check the Flowplayer documentation:
http://flash.flowplayer.org/documentation/api/clip.html
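A minimal configuration sketch of that idea for the Flash Flowplayer JavaScript API. The clip names, element ID, and SWF path are placeholders, not from the original setup; per the Flowplayer docs, 'this' inside clip event handlers refers to the player API:

```javascript
// Sketch only: chain several clips by loading the next one in onFinish.
var clips = ["mp4:video1", "mp4:video2", "mp4:video3"]; // placeholder clip names
var index = 0;

flowplayer("player", "flowplayer-3.2.18.swf", {
  clip: {
    url: clips[0],
    // Called when the current clip finishes playing.
    onFinish: function () {
      index += 1;
      if (index < clips.length) {
        this.play(clips[index]); // 'this' is the player API in clip events
      }
    }
  }
});
```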

Retrieving Timed Metadata on Chromecast

We are developing a custom receiver for HLS video playback on Chromecast.
Our stream has Timed ID3 Metadata embedded in the MPEG-2 Transport Stream (TS files). I need the exact position in the stream that these ID3 tags are located for our app to function properly.
In my Custom Receiver, I am registering for the Host.processMetadata event, and am receiving the metadata tags as the fragments are processed, but I am unable to determine the position of the tags in the stream.
I am looking for the best way to determine the position in the stream that the Timed Metadata is located. Is there an API Call I am missing?
Notes:
We are able to stream our HLS video, using the proper CORS headers.
We are getting the accurate position of the Timed Metadata when playing this stream on iOS and Android players.
We are working with an Android Sender.
Working with:
Cast Receiver 2.0.0
Media Player 1.0.0
Chromecast Firmware Version 26653
Thank you!
We are working on adding a new feature to MPL to address this exact issue: making the media time corresponding to the ID3 data available in processMetadata. I'll try to update this post when that is implemented and released.
Google updated the Host API's processMetadata method a week or two after I posted this question. The callback now includes the time in the stream at which the metadata is located.
See the docs for more:
https://developers.google.com/cast/docs/reference/player/cast.player.api.Host#processMetadata
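processMetadata hands the receiver the raw tag bytes; once the updated callback also gives you the media time, you still need to decode the tag itself. A minimal, hypothetical sketch of reading the 10-byte ID3v2 header (the function name is mine; tag sizes in ID3v2 are 28-bit synchsafe integers, 7 bits per byte):

```javascript
// Hypothetical helper: parse the 10-byte ID3v2 header from raw tag bytes.
// Returns null if the buffer does not start with the "ID3" magic marker.
function parseId3Header(bytes) {
  if (bytes.length < 10 ||
      bytes[0] !== 0x49 || bytes[1] !== 0x44 || bytes[2] !== 0x33) { // "ID3"
    return null;
  }
  // The tag size is a synchsafe integer: 4 bytes, 7 meaningful bits each.
  const size = (bytes[6] << 21) | (bytes[7] << 14) | (bytes[8] << 7) | bytes[9];
  return {
    majorVersion: bytes[3], // e.g. 3 for ID3v2.3, 4 for ID3v2.4
    flags: bytes[5],
    tagSize: size           // size of the tag body, excluding this header
  };
}
```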

Music + Video integration when using BackgroundAudioPlayer.Track/Play vs MediaHistoryItem.NowPlaying

According to the MediaHistory.NowPlaying documentation:
For applications using BackgroundAudioPlayer, there is no need to set the NowPlaying information because it is handled by the system automatically
Since AudioTrack does not contain an equivalent to MediaHistoryItem.PlayerContext, how can the application supply additional data so that it can correctly handle navigation from the Music + Video Hub or the Universal Volume Control?
I ended up setting NowPlaying anyway and had it link to the "now playing" section of my app.
The BackgroundAudioPlayer API may not require you to set NowPlaying, but Music + Video Hub integration does, and it's the only way to provide additional information (like PlayerContext data).
