We are developing a custom receiver for HLS video playback on Chromecast.
Our stream has timed ID3 metadata embedded in the MPEG-2 Transport Stream (.ts) segments. I need the exact position in the stream at which these ID3 tags are located for our app to function properly.
In my custom receiver, I register for the Host.processMetadata event and receive the metadata tags as the fragments are processed, but I am unable to determine the position of the tags in the stream.
I am looking for the best way to determine the position in the stream that the Timed Metadata is located. Is there an API Call I am missing?
Notes:
We are able to stream our HLS video, using the proper CORS headers.
We are getting the accurate position of the Timed Metadata when playing this stream on iOS and Android players.
We are working with an Android Sender.
Working with:
Cast Receiver 2.0.0
Media Player 1.0.0
Chromecast Firmware Version 26653
Thank you!
We are working on adding a new feature to MPL to address exactly this issue: making the media time corresponding to the ID3 data available in processMetadata. I'll try to update this post when that is implemented and released.
Google updated the Host API's processMetadata method a week or two after I posted this question. The callback now includes the time in the stream at which the metadata is located.
See the docs for more:
https://developers.google.com/cast/docs/reference/player/cast.player.api.Host#processMetadata
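A minimal sketch of how the updated callback can be used. The parameter names (`type`, `data`, `timestamp`) and the exact shape of the Host object are assumptions based on the updated docs; verify against your MPL version. Wrapping the override in a helper that takes the host as an argument lets the wiring be exercised off-device:

```javascript
// Sketch: override processMetadata on an MPL Host to capture the media
// time now passed as an argument. On a real receiver, 'host' would be a
// cast.player.api.Host; here it is any object the player calls into.
function attachMetadataHandler(host, onTag) {
  host.processMetadata = function (type, data, timestamp) {
    // 'timestamp' is assumed to be the media time (seconds) of the ID3 tag.
    onTag({ type: type, data: data, time: timestamp });
  };
}

// Usage on a Chromecast (illustrative, not runnable off-device):
// var host = new cast.player.api.Host({ mediaElement: video, url: streamUrl });
// attachMetadataHandler(host, function (tag) { console.log(tag.time, tag.data); });
```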
This is my first time trying chromecast apps.
I started with CastVideos-android paired with a Styled Media Player using a custom skin URL. After some hurdles, I was able to get the custom skin to work, and the video clips from the sender app play nicely.
Now I'm attempting a custom media player using the sample CastHelloVideoPlayer receiver app from Google's sample list, paired with the CastVideos-android sender app. After creating a new application ID and recompiling CastVideos-android, I tried to cast some videos to the cast device.
1) The first thing I noticed is that the TV screen is completely blank. No default app name or anything, just a plain black screen. I didn't think much about it, since this is a custom media player and many things may not be set, such as the logo/splash/watermark.
2) The main issue: when I try to play a video clip, the cast device remains blank. Looking at the Chrome debugging console, I noticed this error message:
[ 32.941s] [cast.receiver.MediaManager] Load metadata error: [object Object]
pd                   # cast_receiver.js:formatted:2249
nd.Zc                # cast_receiver.js:formatted:2234
tb.log               # cast_receiver.js:formatted:675
G                    # cast_receiver.js:formatted:710
W.Yb                 # cast_receiver.js:formatted:4855
g.Yb                 # cast_receiver.js:formatted:3660
Jc                   # cast_receiver.js:formatted:1500
Gc                   # cast_receiver.js:formatted:1550
(anonymous function) # cast_receiver.js:formatted:1447
cast_receiver.js:formatted:2249 [ 32.955s] [cast.receiver.MediaManager] Sending error message to b5d9d1e6-f6d6-a0bd-440c-fe7255ebfcbc.11:com.google.sample.cast.refplayer-172
I'm surprised by this, because the same video clips played nicely with the Styled Media Player. Why do they fail with the sample CastHelloVideoPlayer?
I just found out that the sample receiver app I took from the Chromecast samples page is limited to the default media containers listed under the Custom Receiver supported media formats, while the sample sender app was sending m3u8 (HLS) content. After changing the sender app to select the correct target media (MP4), everything started working.
To support HLS, you need to use the Google Cast Media Player Library (MPL), which is not part of any of the sample apps.
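For reference, a sketch of the standard MPL bootstrap for an HLS URL. `Host`, `Player`, and `CreateHlsStreamingProtocol` are taken from the cast.player.api docs; the `api` parameter stands in for the `cast.player.api` namespace so the wiring can be exercised off-device:

```javascript
// Sketch: play an HLS (.m3u8) URL in a custom receiver via MPL.
// On a Chromecast you would pass cast.player.api as 'api'.
function startHlsPlayback(api, mediaElement, url) {
  var host = new api.Host({ mediaElement: mediaElement, url: url });
  var player = new api.Player(host);
  // CreateHlsStreamingProtocol tells MPL to parse the HLS manifest.
  player.load(api.CreateHlsStreamingProtocol(host));
  return player;
}
```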
I need to implement continuous playback in my Google Cast custom receiver. To do that, I handle the video element's 'ended' event after the first content finishes playing, and I make an API call there to get the next content's media URL. I'm now unsure how to restart playback with the new content.
Please advise.
Thanks in advance.
You can use MediaQueueItem (and its Builder) to create queue items before they are played (essentially a playlist). RemoteMediaPlayer.queueLoad is already used by the VideoCastManager to load and start a new queue of media items.
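If you want to stay on the receiver side as the question describes, the 'ended' handler can be sketched as below. `getNextUrl` is a placeholder for your own API call (not a Cast API), and setting `src` directly assumes a plain media element; with MPL you would tear down and recreate the Player/Host for the new URL instead:

```javascript
// Sketch of receiver-side continuous playback: when the current item
// ends, fetch the next media URL and start playing it.
function enableContinuousPlayback(mediaElement, getNextUrl) {
  mediaElement.addEventListener('ended', function () {
    getNextUrl(function (nextUrl) {
      if (!nextUrl) return;        // no more content: stop here
      mediaElement.src = nextUrl;  // load the next item
      mediaElement.play();
    });
  });
}
```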
We're using Flowplayer with Wowza. We've managed to get the stream to start on opening the page; however, the stream is made up of several individual videos, and between videos the user has to click play to resume the stream. Is there any way to get Flowplayer to play the next video automatically? Thanks
Good.
Assuming you are using the Flash version of Flowplayer (not the HTML5 one): Flowplayer offers a complete API, with events and properties to modify every aspect of the original configuration. For instance, you could declare the 'onFinish' method in the clip and then load another clip automatically.
Check the flowplayer documentation:
http://flash.flowplayer.org/documentation/api/clip.html
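A configuration sketch of the onFinish approach, assuming the Flash Flowplayer API. The player id, SWF path, and clip URLs are placeholders; `$f()` is Flowplayer's documented accessor function:

```javascript
// Sketch: chain clips in Flash Flowplayer so playback continues
// automatically when a clip finishes. All URLs below are placeholders.
flowplayer("player", "flowplayer-3.2.18.swf", {
  clip: {
    url: "stream/part1.mp4",
    onFinish: function () {
      // Start the next clip without requiring a user click.
      $f("player").play({ url: "stream/part2.mp4" });
    }
  }
});
```

For a longer sequence, you could keep the clip URLs in an array and advance an index inside onFinish.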
I'm developing a custom media receiver for chromecast. Is it possible to set the active tracks on media load? For example, I have two embedded audio tracks, and I want the second audio track to be what plays immediately on load. The sender does not know what embedded tracks are available, so using the LoadRequest is not an option.
Support for this was added in the 2.0.0 receiver release on Dec 1, 2014: https://developers.google.com/cast/docs/release-notes
Documentation is here: https://developers.google.com/cast/docs/reference/receiver/cast.receiver.MediaManager#loadTracksInfo
What steps will reproduce the problem?
1. On mediaManager loadData, the audio stream info is sent to the sender.
2. The selected audio track index is sent back to the receiver.
3. The receiver disables the existing audio stream, enables the newly selected audio stream, and reloads the player.
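Step 3 can be sketched with the MPL streaming-protocol API (`getStreamCount`, `getStreamInfo`, and `enableStream` are on cast.player.api protocol objects). Passing the protocol and player in as arguments is just a testing convenience; the stream-filtering by mimeType is an assumption about how audio streams are identified:

```javascript
// Sketch: enable exactly one audio stream on an MPL streaming protocol,
// disable the others, then reload the player so the change takes effect.
function selectAudioStream(protocol, player, targetIndex) {
  for (var i = 0; i < protocol.getStreamCount(); i++) {
    var info = protocol.getStreamInfo(i);
    if (info.mimeType.indexOf('audio') === 0) {
      // Enable only the requested audio stream.
      protocol.enableStream(i, i === targetIndex);
    }
  }
  player.reload();
}
```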
What is the expected output? What do you see instead?
Expected output: the audio stream should be enabled on the protocol, and no duplicate stream info should be present.
Current output: no new audio stream is enabled and duplicate stream info is present; if you change the audio stream again on the client and send it to the receiver, the stream info count is tripled.
What version of the product are you using? On what operating system?
Mac, MPL 0.7.0
Smooth Streaming, PlayReady DRM.
Please provide any additional information below.
Also something very strange: my asset has Swedish as its default audio language, but when fetching the audio stream info from the Smooth Streaming protocol, the first language is selected by default (in this case Finnish), no matter what. Even if one tries to modify the protocol manually, MPL throws an exception: "Uncaught exception: cannot call method Ra of null".
And it always remains the first language of the protocol's audio streams, no matter how many times the player is reloaded.
P.S. There is no API documentation for the Smooth Streaming protocol; the page listed in the API references returns a 404 error.
https://developers.google.com/cast/docs/reference/player/player.StreamingProtocol