I need to implement continuous playback in my Google Cast custom receiver. For that, I am handling the video 'ended' event after my first content finishes playing, and I have to make an API call there to get the next content's media URL. Now I am confused about how to restart playback with the new content.
Please advise.
Thanks in advance.
You can use MediaQueueItem (and its Builder) to create queue items before playback starts, essentially building a playlist. RemoteMediaPlayer.queueLoad is already used by the VideoCastManager to load and start a new queue of media items.
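If you prefer to keep everything inside the custom receiver, as described in the question, a minimal sketch of that flow could look like the following (plain HTML5 media element; fetchNextContentUrl is a hypothetical placeholder for your own API call that resolves to the next media URL):
var video = document.getElementById('media');

video.addEventListener('ended', function () {
    // Ask your backend for the next item (hypothetical helper returning a Promise).
    fetchNextContentUrl().then(function (nextUrl) {
        video.src = nextUrl;  // point the element at the new content
        video.load();
        video.play();         // restart playback with the new item
    });
});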
Sometimes we need to change the camera, and cycleVideo is not well suited for this because we can't control the deviceId or its video/audio track.
We need methods like the following (especially when different devices are involved):
publisher.changeVideoTrack(myVideoTrack);
publisher.changeAudioTrack(myAudioTrack);
This could be implemented quickly in the OpenTok SDK (based on cycleVideo).
At the moment, to change a stream's track we need to destroy the publisher and create a new one, which is a bad solution.
Adam here from the OpenTok team. Thank you for your feedback. You can currently change the audio track using the setAudioSource() method of the publisher. We also have a setVideoSource() on our roadmap to add in the future.
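For reference, a rough sketch of swapping the microphone on an existing OpenTok.js publisher using setAudioSource() (only a sketch; check the SDK reference for the exact signature in your version, and note that publisher is assumed to be an already-initialized OT.Publisher):
// Enumerate devices and pick a different microphone.
OT.getDevices(function (error, devices) {
    if (error) { return; }

    var mics = devices.filter(function (d) { return d.kind === 'audioInput'; });
    var nextMic = mics[1];  // e.g. switch to the second available microphone

    if (nextMic) {
        // Swap the audio track on the live publisher without republishing.
        publisher.setAudioSource(nextMic.deviceId)
            .then(function () { console.log('audio source changed'); })
            .catch(function (err) { console.error('could not change audio source', err); });
    }
});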
I am trying to use the Google Cast SDK to write a tiny app.
This app should be able to show which (Google casted) song my audio system is currently playing.
This should include info about the track's title, length, artist, album etc.
Using the iOS classes I have managed to connect to the correct GCKDevice via the GCKDiscoveryManager. But how do I get the data about the track which is currently playing?
I would prefer an answer for the iOS classes, but I would also greatly appreciate any hint towards achieving the same with the Chromecast browser extension.
Update:
On the Cast Developers Google+ group I was pointed to this link, thanks Leon! I see that I should register a GCKRemoteMediaClientListener, which I could do if I had a GCKRemoteMediaClient object. The only way to get one seems to be via a GCKSession. I can create such a session from scratch, but then the property remoteMediaClient is nil. Or I can ask the GCKCastContext.sharedInstance().sessionManager to start a session for me. But if I do that, the device stops playback immediately.
So I am still a bit lost about how to achieve my initial goal.
Any help please?
After some more help from Leon from the Cast Developer Group, I have figured this one out. The key piece of information is the ID of the app which is running on my audio device. I figured that out using the deprecated 2.0 API, which is why I will not go into that detail here.
So, assuming you have that app ID, you can initialize your cast context with:
let options = GCKCastOptions.init(receiverApplicationID: applicationID)
GCKCastContext.setSharedInstanceWith(options)
The next thing is to find your device. This was not too hard (the only thing I figured out by myself :o) and the GCKCastContext.sharedInstance().discoveryManager is your friend here.
If you now start a session via the GCKCastContext.sharedInstance().sessionManager, it will not stop the playback. Totally logical in hindsight; it makes me wonder which app I started before with that random app ID I used...
With the session started, you can cast it to a GCKCastSession and add a session listener:
if let session = sessionManager.currentSession as? GCKCastSession {
    session.add(self)
}
And when that session notifies you of its active input status via
func castSession(_ castSession: GCKCastSession, didReceive activeInputStatus: GCKActiveInputStatus)
you can finally get hold of the long-lost remoteMediaClient and register for its events:
if let client = castSession.remoteMediaClient {
    client.add(self)
}
And this will end the long journey by calling your delegate method
func remoteMediaClient(_ client: GCKRemoteMediaClient, didUpdate mediaMetadata: GCKMediaMetadata)
And if you are as lucky as I was, that media metadata will be quite incomplete.
Mine just contained the title and the release date and nothing else.
Thanks Deezer!
But maybe I can use some id or other obtained in the whole process to get the missing data from the Deezer server? Time will tell.
In the meantime, I hope this helps somebody with more luck.
I am writing a custom receiver for Chromecast and was wondering if there is a way to use our own custom XHR loader functionality rather than the built-in goog.net.XhrIo.
Basically I need to override the functionality of goog.net.XhrIo for all segments/fragments and media files.
I need this to send some beacons back to my servers for analytics.
Thanks!
If you are using the Media Player Library (MPL), then you can use skipRequest() and setResponse() to achieve what you want to do. Note that the Host class provides a number of overrides for updating Segments, Manifest, License and Captions request info.
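As a rough sketch of the beacon use case, assuming the usual MPL Host override names (double-check them against the Media Player Library reference; the beacon endpoint is a placeholder):
var mediaElement = document.getElementById('media');
var host = new cast.player.api.Host({
    mediaElement: mediaElement,
    url: manifestUrl  // your stream's manifest URL
});

// Called by MPL before every segment/fragment request.
host.updateSegmentRequestInfo = function (requestInfo) {
    // Fire an analytics beacon for each segment the player is about to fetch.
    new Image().src = 'https://analytics.example.com/beacon?segment=' +
        encodeURIComponent(requestInfo.url);

    // If you want to fetch the data yourself instead of letting MPL do it,
    // this is also where requestInfo.skipRequest() and requestInfo.setResponse(...)
    // come into play.
};

// The host is then passed to cast.player.api.Player and loaded as usual.
var player = new cast.player.api.Player(host);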
We're using Flowplayer with Wowza. We've managed to get the stream to start on opening the page; however, the stream is made up of several individual videos, and between videos the user has to click play to resume the stream. Is there any way to get Flowplayer to automatically play the next video? Thanks
Okay.
Assuming you are using the Flash version of Flowplayer (not the HTML5 one): Flowplayer offers a complete API, with events and properties for modifying every aspect of the original configuration. For instance, you could declare an 'onFinish' handler on the clip and then load another clip automatically.
Check the flowplayer documentation:
http://flash.flowplayer.org/documentation/api/clip.html
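For example, a minimal sketch along those lines for the Flash player (plugin versions, stream names, and the Wowza URL are placeholders for your own setup):
flowplayer("player", "flowplayer-3.2.18.swf", {
    clip: {
        url: "mp4:vod/first-video.mp4",
        provider: "rtmp",
        autoPlay: true,

        // Fires when the current clip finishes; load the next one automatically.
        onFinish: function () {
            this.play({ url: "mp4:vod/second-video.mp4", provider: "rtmp" });
        }
    },
    plugins: {
        rtmp: {
            url: "flowplayer.rtmp-3.2.13.swf",
            netConnectionUrl: "rtmp://your-wowza-server/vod"
        }
    }
});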
I just started using the Chromecast SDK today and got a bit confused by its APIs and the samples given on the web.
What I am trying to do is to send some messages to the Chromecast so it will display them on the big screen. I am going to use Chrome API with HTML5/JS/CSS.
Most examples on the web (https://github.com/pjjanak/chromecast-hello-world/blob/master/sender/index.html , http://nerdwin15.com/2013/10/chromecast-development-part-one-chrome-sender/) use new Cast.Api() in the sender and use an Activity in doing so. But I could not find a reference to Cast.Api in the Chrome API. Most Google references deal with Media, and I am not sure whether I have to use them. So to sum up, the following are my questions (sorry, I did read the API reference and developer guide, but I am still clueless):
Do I have to write a custom receiver to show text on the TV screen? Can't I get by with the default receiver, chrome.cast.media.DEFAULT_MEDIA_RECEIVER_APP_ID?
Is handling multimedia files different from displaying text on the Chromecast, or can I set the MIME type to text/html and send a text stream? (That doesn't work for me at the moment.)
Do those examples on the web use a deprecated way of sending data to the Chromecast?
Thanks in advance,
Ish
OK, I think I found the answer in the following documents:
https://developers.google.com/cast/docs/receiver_apps
https://github.com/googlecast/CastHelloText-chrome/blob/master/chromehellotext.html
Will try them and let you all know!
The following example page is very useful for anyone trying to write Chromecast apps:
https://github.com/googlecast
I'm not sure how you fared, but here are some quick responses to your questions.
Yes, you have to write a custom receiver if you want to do anything other than sending images, audio, or video to the Chromecast. You can see the list of media types supported by the Default Media Receiver here: https://developers.google.com/cast/docs/media
Yes (see above), it requires a custom receiver, which will also require your own appId and, I'm pretty sure, a custom namespace (see the sketch at the end of this answer).
To my knowledge, all of the examples up on https://github.com/googlecast should be relevant, but I am working on a few wrappers to try to simplify getting up and running with custom Chrome sender and receiver apps. You can check them out here: https://github.com/googlecast. Let me know if those help, and whether you have any feedback to share.
I hope you've already figured all this stuff out, but if not hopefully this is useful.
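To make the custom-namespace part concrete, here is a rough sketch of both sides, loosely modelled on the CastHelloText samples linked above (the namespace is a placeholder and must match on sender and receiver):
// --- Sender side (Chrome), once you have a chrome.cast.Session object ---
var NAMESPACE = 'urn:x-cast:com.example.text';

function sendText(session, text) {
    session.sendMessage(NAMESPACE, text,
        function () { console.log('message sent'); },
        function (error) { console.error('send failed', error); });
}

// --- Receiver side (the custom receiver page registered under your appId) ---
var castReceiverManager = cast.receiver.CastReceiverManager.getInstance();
var messageBus = castReceiverManager.getCastMessageBus(
    NAMESPACE, cast.receiver.CastMessageBus.MessageType.STRING);

messageBus.onMessage = function (event) {
    // Show the received text on the TV screen.
    document.getElementById('message').textContent = event.data;
};

castReceiverManager.start();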