I would like to know whether it is possible to build a Cast app for Google Home that casts content from an app to a Chromecast via Google Home. For example: "OK Google, play XXXAudio/videoContent from XXXApp on the TV."
In other words, I want to cast content from XXXApp to the Chromecast without using the phone at all, only Google Home.
The flow I have in mind is: Google Home retrieves the requested data from XXXApp and then casts the content, through the receiver, to the Chromecast, where it is displayed on the TV.
Any help or advice will be appreciated.
It depends on exactly what you want your Cast app to do.
If you want to provide media the way a service such as Spotify or a local streaming radio station does, you need to partner with Google. You'll then have access to a range of partner solution feed options.
If you have other Cast apps, or other things that you may want to cast to a TV (for example), these aren't currently available. However, Smart Displays using the Assistant are coming shortly, and these may start to introduce capabilities that will also become available for Cast devices.
One could probably do something that starts the Cast app from an existing supported device and then, once it is running, controls it via an Action. But this isn't something directly available right now.
I have a mobile iOS application that is basically group-chat oriented. I was wondering whether I can make it a beacon-enabled app; in other words, an application that can detect beacons, determine their IDs, receive short ads, etc. I am quite new to beacons and still reading a lot about them. I found this article, which makes me believe it may be possible to achieve my goal.
In addition to its original functionality, I want my app to be able to:
Detect a beacon (even if the app is in the background, without any prior pairing or the like)
Receive simple ads from the beacon (while using the app in a certain mode, so the ads can be shown on the phone's screen)
Read some real-time info (if requested by the user), such as speed (if it is a moving beacon), temperature, etc.
To achieve all that, I don't want to bother the end user with downloading additional things related to the beacon. I want them to have my app and nothing else.
You can monitor for beacons while your app is in the background, either with the Core Location framework or with a framework from your beacon supplier (e.g. Estimote).
Regarding the ads: the beacon only broadcasts its preconfigured IDs (UUID + major + minor). If you want to receive ads, you need a web service that takes your beacon IDs and returns the ad to display.
(If the ads never change, you can put them directly into your app without a web service.)
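To make the web service part concrete, here is a minimal sketch of such a lookup service, written in Java on the JDK's built-in HTTP server. The /ad path, the query parameters, and the hard-coded UUID/ad text are all made up for illustration; a real service would more likely look the IDs up in a database:

    import com.sun.net.httpserver.HttpServer;
    import java.net.InetSocketAddress;
    import java.nio.charset.StandardCharsets;
    import java.util.HashMap;
    import java.util.Map;

    public class BeaconAdService {
        public static void main(String[] args) throws Exception {
            // Hypothetical mapping from "UUID:major:minor" to the ad text the app should display.
            Map<String, String> ads = new HashMap<>();
            ads.put("B9407F30-F5F8-466E-AFF9-25556B57FE6D:1:2", "20% off coffee at the lobby cafe");

            HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
            // The app calls e.g. GET /ad?uuid=...&major=1&minor=2 with the ids it detected.
            server.createContext("/ad", exchange -> {
                Map<String, String> q = parseQuery(exchange.getRequestURI().getQuery());
                String key = q.get("uuid") + ":" + q.get("major") + ":" + q.get("minor");
                String ad = ads.get(key);
                if (ad == null) {
                    exchange.sendResponseHeaders(404, -1);   // no ad configured for these ids
                } else {
                    byte[] body = ad.getBytes(StandardCharsets.UTF_8);
                    exchange.sendResponseHeaders(200, body.length);
                    exchange.getResponseBody().write(body);
                }
                exchange.close();
            });
            server.start();
        }

        private static Map<String, String> parseQuery(String query) {
            Map<String, String> result = new HashMap<>();
            if (query == null) return result;
            for (String pair : query.split("&")) {
                String[] kv = pair.split("=", 2);
                result.put(kv[0], kv.length > 1 ? kv[1] : "");
            }
            return result;
        }
    }

The app would then send the UUID, major, and minor it detected (via Core Location or the vendor SDK) to this endpoint and display whatever comes back.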
How you will show the ads in your app is completely open to you.
I don't really understand what you mean by real-time info. There are beacons that also send TLM (telemetry) frames, so you can get the temperature and the battery level of the beacon. I've never seen moving beacons or beacons that transmit their speed.
Some useful links:
https://developer.apple.com/ibeacon/
https://xamoom.com/en/2016/07/ibeacon-for-developers/
I'd like to hook up a bttn so that when the button is pressed, a specific song is played through my speakers using the new Chromecast Audio. I couldn't find documentation for a REST API that would allow me to accomplish this.
Is there any direct hookup possible that would let me call some REST API to play audio through the Chromecast Audio?
There is no single REST API to do so; the process of casting media with the Cast SDK amounts to starting discovery, selecting (connecting to) a device, setting up the so-called RemoteMediaPlayer, and then loading a media item. There is plenty of documentation on our Cast documentation site that helps you follow and implement the above steps, along with a good number of sample apps.
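To make those steps concrete, here is a rough, heavily condensed sketch of that flow using the Cast v2 sender APIs on Android (GoogleApiClient-based, matching the RemoteMediaPlayer mentioned above). APP_ID and MEDIA_URL are placeholders, error handling is omitted, and route selection would normally come from a MediaRouteButton rather than being triggered programmatically, so treat this as an outline of the sequence rather than a drop-in implementation:

    import android.os.Bundle;
    import android.support.v7.app.AppCompatActivity;
    import android.support.v7.media.MediaRouteSelector;
    import android.support.v7.media.MediaRouter;
    import com.google.android.gms.cast.Cast;
    import com.google.android.gms.cast.CastDevice;
    import com.google.android.gms.cast.CastMediaControlIntent;
    import com.google.android.gms.cast.MediaInfo;
    import com.google.android.gms.cast.RemoteMediaPlayer;
    import com.google.android.gms.common.api.GoogleApiClient;

    public class SenderActivity extends AppCompatActivity {

        private static final String APP_ID = "YOUR_RECEIVER_APP_ID";            // placeholder
        private static final String MEDIA_URL = "https://example.com/song.mp3"; // placeholder

        private MediaRouter mediaRouter;
        private GoogleApiClient apiClient;
        private final RemoteMediaPlayer mediaPlayer = new RemoteMediaPlayer();

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            // 1. Start discovery: look for routes that support our receiver app.
            mediaRouter = MediaRouter.getInstance(this);
            MediaRouteSelector selector = new MediaRouteSelector.Builder()
                    .addControlCategory(CastMediaControlIntent.categoryForCast(APP_ID))
                    .build();
            mediaRouter.addCallback(selector, routeCallback,
                    MediaRouter.CALLBACK_FLAG_REQUEST_DISCOVERY);
        }

        private final MediaRouter.Callback routeCallback = new MediaRouter.Callback() {
            @Override
            public void onRouteSelected(MediaRouter router, MediaRouter.RouteInfo route) {
                // 2. Connect to the device the user picked (normally via a MediaRouteButton).
                CastDevice device = CastDevice.getFromBundle(route.getExtras());
                Cast.CastOptions options =
                        Cast.CastOptions.builder(device, new Cast.Listener() {}).build();
                apiClient = new GoogleApiClient.Builder(SenderActivity.this)
                        .addApi(Cast.API, options)
                        .addConnectionCallbacks(connectionCallbacks)
                        .build();
                apiClient.connect();
            }
        };

        private final GoogleApiClient.ConnectionCallbacks connectionCallbacks =
                new GoogleApiClient.ConnectionCallbacks() {
                    @Override
                    public void onConnected(Bundle connectionHint) {
                        // 3. Launch the receiver app and attach the RemoteMediaPlayer to its channel.
                        Cast.CastApi.launchApplication(apiClient, APP_ID)
                                .setResultCallback(result -> {
                                    if (!result.getStatus().isSuccess()) return;
                                    try {
                                        Cast.CastApi.setMessageReceivedCallbacks(apiClient,
                                                mediaPlayer.getNamespace(), mediaPlayer);
                                    } catch (java.io.IOException e) {
                                        return;
                                    }
                                    // 4. Load the media item.
                                    MediaInfo media = new MediaInfo.Builder(MEDIA_URL)
                                            .setContentType("audio/mp3")
                                            .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
                                            .build();
                                    mediaPlayer.load(apiClient, media, true /* autoplay */);
                                });
                    }

                    @Override
                    public void onConnectionSuspended(int cause) {}
                };
    }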
In my Android sender application, I would like to display the status of a newly discovered Chromecast device. For example, if another app, say YouTube, is currently casting to the device, I would like to show a status such as "casting YouTube" next to the device name in my receiver list.
For this, once I discover a media route, I connect to that device. On receiving the ConnectionCallbacks.onConnected() event, I try to retrieve the application metadata using Cast.CastApi.getApplicationMetadata(GoogleApiClient), but I'm getting a null value. When I run my sender app, I make sure I'm casting to the same Chromecast receiver from another app such as YouTube, so I expect the application metadata to reflect the YouTube app's details, such as its appId, name, etc. Is there a different way to achieve this?
YouTube is still using the old preview SDK, so that might be a factor in seeing null. Please try an app that is using the new SDK and see whether that returns more useful data. I know there was a bug that caused that method to return null all the time, but I believe that was fixed in the recent Play Services update.
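For reference, this is roughly where you would read the metadata once the connection is up, using the same v2 API the question mentions; the log tag is arbitrary and the surrounding GoogleApiClient setup is assumed to be the one from the question:

    // Inside the same GoogleApiClient.ConnectionCallbacks the question describes
    // (requires com.google.android.gms.cast.ApplicationMetadata, Cast, and android.util.Log).
    @Override
    public void onConnected(Bundle connectionHint) {
        ApplicationMetadata metadata = Cast.CastApi.getApplicationMetadata(apiClient);
        if (metadata != null) {
            // e.g. "casting YouTube"
            Log.d("CastStatus", "casting " + metadata.getName()
                    + " (appId=" + metadata.getApplicationId() + ")");
        } else {
            // Null: nothing is running, or the running app doesn't report metadata
            // (as with receivers built on the old preview SDK).
            Log.d("CastStatus", "no application metadata available");
        }
    }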
I'd like to build a "best of" application list. It is just a list with a title and an image.
I saw MarketplaceDetailTask, which takes the marketplace ID (marketplaceDetailTask.ContentIdentifier).
My question is: knowing this ID, is there a way to get the image URL of the corresponding app?
I've done some research but found nothing about an API that returns this URL.
There is no built-in API in Windows Phone to retrieve this kind of information. However, you can try querying the service used by the Zune client directly. This service isn't publicly documented, but there are a few blog posts explaining how to use it. For instance:
http://brandonwatson.sys-con.com/node/1767886
I am new to webOS development. I have one app in the app store, and in the next update I would like to be able to identify the age of users, their location, how long they use the app, and which features they use the most/least, and then store that data in a database. How do I do this? Many thanks in advance for your help.
Well, that's a pretty big question. Here's an outline of what to do, with some notes.
First, you're probably not going to be able to get age unless you ask the user directly and they tell you. Also, you're only going to get location if the application is location-aware and the user permits you to collect that data (when you install a location-aware application, it asks the user if they're okay with the fact that the application will be able to get their location).
As for how long they use the app and which features they use, that's easier. Depending on the granularity you want to capture, you can just record timestamps when a user starts and stops using a particular feature, such as when a scene's activate and deactivate methods fire. As long as you store the feature name and the timestamp, that should give you what you're looking for.
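Purely to illustrate that bookkeeping (the real webOS app would do this in its JavaScript scene assistants; the class and method names here are invented), the tracking amounts to something like:

    import java.util.ArrayList;
    import java.util.List;

    /** Records when a feature starts and stops being used. */
    public class UsageTracker {

        /** One "feature was used from start to stop" record. */
        public static class UsageEvent {
            public final String feature;
            public final long startMillis;
            public final long stopMillis;

            UsageEvent(String feature, long startMillis, long stopMillis) {
                this.feature = feature;
                this.startMillis = startMillis;
                this.stopMillis = stopMillis;
            }
        }

        private final List<UsageEvent> events = new ArrayList<>();
        private String currentFeature;
        private long currentStart;

        /** Call when a scene/feature is activated. */
        public void featureStarted(String feature) {
            currentFeature = feature;
            currentStart = System.currentTimeMillis();
        }

        /** Call when the scene/feature is deactivated. */
        public void featureStopped() {
            if (currentFeature != null) {
                events.add(new UsageEvent(currentFeature, currentStart, System.currentTimeMillis()));
                currentFeature = null;
            }
        }

        /** Everything collected so far, ready to be persisted or uploaded. */
        public List<UsageEvent> getEvents() {
            return new ArrayList<>(events);
        }
    }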
Then comes the question of collection. However you store the data in the app, you have a couple of choices for getting it out. Unless you can get your users to just email the data to you, probably the easiest thing to do would be to create a web app (possibly with no user-facing output, since you're just using it to collect data) using something like Google App Engine, which gives you a URL the app can send POST requests to. Depending on how you set it up, it could send a request every time you collect a timestamp (bad for battery use, though), just occasionally, or only when the app is doing cleanup (possibly a problem if you don't get the request off in time).
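As a sketch of what the receiving side could look like, assuming App Engine's Java runtime and a hypothetical /collect URL mapped in web.xml (the one-line-per-event body format is also made up; the webOS app would simply POST its stored records to this URL):

    import java.io.BufferedReader;
    import java.io.IOException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    /** Accepts POSTed usage records, e.g. one "feature,startMillis,stopMillis" line per event. */
    public class CollectServlet extends HttpServlet {

        @Override
        protected void doPost(HttpServletRequest req, HttpServletResponse resp) throws IOException {
            BufferedReader reader = req.getReader();
            String line;
            while ((line = reader.readLine()) != null) {
                String[] parts = line.split(",");
                if (parts.length == 3) {
                    // In a real app you'd write this to the datastore instead of the log.
                    log("feature=" + parts[0] + " start=" + parts[1] + " stop=" + parts[2]);
                }
            }
            resp.setStatus(HttpServletResponse.SC_NO_CONTENT);
        }
    }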
I'd recommend taking a look online at how people do this in iPhone apps to get a good sense of the approach. If you hit problems getting particular things to work, you can of course come back to Stack Overflow with specific coding questions.