Google Home - Streaming HLS to Chromecast

I've implemented the smart home service for a camera, including SYNC, QUERY, and EXECUTE. The EXECUTE response returns a valid HLS URL. However, casting that stream to an Android TV via Chromecast results in a black screen.
Do I need some configuration to be able to stream HLS?
I also have another question: is it possible to watch the stream inside the Google Home application on a phone?

Chromecast now supports the HLS protocol for video streaming. A possible reason for the black screen when casting the stream to an Android TV via Chromecast is the use of a custom driver instead of the generic driver. It would be worth checking whether the stream plays correctly with the generic driver. More information on this can be found here.
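For reference, a minimal sketch of what the EXECUTE response for the camera stream might look like, assuming the CameraStream trait; the device ID, token, and field set below are illustrative and should be checked against the current trait documentation. If the generic versus custom driver distinction above corresponds to the default versus a custom cast receiver, leaving the receiver app ID unset would presumably fall back to the default.

// Hypothetical fulfillment response to action.devices.EXECUTE for a camera stream.
// Field names follow the CameraStream trait as I recall it; verify against the docs.
const executeResponse = {
  requestId: '123456789',                  // echo the requestId from the incoming intent
  payload: {
    commands: [{
      ids: ['camera-1'],                   // placeholder device ID
      status: 'SUCCESS',
      states: {
        cameraStreamAccessUrl: 'https://example.com/live/index.m3u8',  // the HLS playlist URL
        cameraStreamProtocol: 'hls',
        cameraStreamAuthToken: 'opaque-token',      // only if your stream endpoint needs auth
        // cameraStreamReceiverAppId: 'ABCD1234',   // omit to use the default cast receiver
      },
    }],
  },
};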

Related

OpenTok: Is it possible to publish a stream from a broadcast TV channel?

We have a TV channel, and we are thinking of creating an Android app to display our channel live. I am looking for the right solution for that.
I just got in touch with OpenTok, and it seems to be mostly about streaming a phone's camera or a webcam.
So before going deep with my team, I would like to know whether OpenTok can also publish a streamed live video channel, continuously or at least for as long as there is an active subscription on the session.
What format is your TV channel in? If you can play it in a video tag, you could use Chrome/Chromium via puppeteer to play the video and use captureStream to pass the stream into the OpenTok JS SDK. You could of course also observe when users are added to or removed from the session to turn this publisher on or off.
It might also be better to look at HLS streaming, which OpenTok also supports. It might be best to skip the OpenTok step and go straight to an HLS stream.
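A rough sketch of the captureStream approach, assuming OpenTok.js is loaded on the page that puppeteer drives and that the SDK version in use accepts MediaStreamTracks as sources (worth verifying); the element ID and credentials are placeholders:

// Runs inside the page puppeteer has navigated to; OpenTok.js must already be loaded there.
const videoEl = document.querySelector('#channel-player');   // hypothetical <video> playing the channel
const mediaStream = videoEl.captureStream();                 // HTMLMediaElement.captureStream()

const publisher = OT.initPublisher('publisher-div', {
  videoSource: mediaStream.getVideoTracks()[0],   // recent OpenTok.js releases accept MediaStreamTracks here
  audioSource: mediaStream.getAudioTracks()[0],
});

const session = OT.initSession(API_KEY, SESSION_ID);          // credentials injected from your backend
session.connect(TOKEN, (err) => {
  if (!err) session.publish(publisher);
});

Turning this publisher on and off as viewers come and go could then hang off the session's connectionCreated and connectionDestroyed events.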

Does TokBox support a codec compatible with Google Speech API?

The Google Speech API claims to support a number of codecs (https://cloud.google.com/speech/docs/basics). I'm interested in processing an archive of a session produced by TokBox WebRTC.
Is there sample code that does something like this? Does the archive need to be converted to a compatible format?
The default audio codec for WebRTC is Opus, which is indeed supported by the Google Speech API. The trick is getting the audio out of an OpenTok stream and forwarding it along to the recognition service; unfortunately, this is no small effort.
Although some work has been done on this in an experimental capacity, there is no official support at this time. I recommend reaching out to TokBox support directly to discuss the specifics of what you're trying to build (email support at tokbox.com).
Disclosure: I work at TokBox.
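For the offline (archive) case, one workable sketch is to extract the archive's audio into a format the Speech API accepts, for example transcoding it to 16 kHz LINEAR16 WAV with ffmpeg (an assumption about your pipeline, not something OpenTok provides), and then submit it with the Google Cloud Speech client:

// Node sketch using the @google-cloud/speech client; the file path and audio parameters are placeholders.
const fs = require('fs');
const speech = require('@google-cloud/speech');

async function transcribeArchiveAudio(wavPath) {
  const client = new speech.SpeechClient();
  const [response] = await client.recognize({
    config: {
      encoding: 'LINEAR16',       // assumes the archive audio was transcoded to 16-bit PCM WAV
      sampleRateHertz: 16000,
      languageCode: 'en-US',
    },
    audio: { content: fs.readFileSync(wavPath).toString('base64') },
  });
  // Note: recognize() is limited to short clips; longer archives would need longRunningRecognize().
  return response.results.map(r => r.alternatives[0].transcript).join('\n');
}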

Can we implement live audio streaming functionality using TokBox?

I have a requirement to broadcast live audio on my website.
The scenario is:
One user will talk or sing in my application (audio only), and
his followers will be able to listen to that live audio instantly in the same application (followers can only listen).
After the live broadcast, the followers may replay it through chat.
Can we implement the above scenario using TokBox?
Note: I am developing my web application in ASP.NET MVC5 + Web API.
You should be able to do it with TokBox.
Use OT.initPublisher({ videoSource: null }) to publish audio only; see https://tokbox.com/developer/guides/audio-video/js/#voice. For singing you may want to tune the audio quality; see https://tokbox.com/developer/guides/audio-video/js/#audio-tuning
Subscribing to this stream will allow your users to listen to the live audio. A rough client-side sketch of both sides follows.
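A minimal client-side sketch of both roles, assuming credentials are generated by your ASP.NET backend with the OpenTok server SDK; the element IDs and the audioBitrate value are illustrative:

// Broadcaster page: publish microphone audio only.
const session = OT.initSession(API_KEY, SESSION_ID);
const publisher = OT.initPublisher('publisher-div', {
  videoSource: null,       // audio-only publisher
  audioBitrate: 64000,     // example tuning value for singing; adjust per the audio-tuning guide
});
session.connect(BROADCASTER_TOKEN, (err) => {
  if (!err) session.publish(publisher);
});

// Follower page: listen only (connect with a plain subscriber token and never publish).
const listenerSession = OT.initSession(API_KEY, SESSION_ID);
listenerSession.on('streamCreated', (event) => {
  listenerSession.subscribe(event.stream, 'subscriber-div', { subscribeToVideo: false });
});
listenerSession.connect(FOLLOWER_TOKEN);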
You will be able to access the audio afterwards by creating an audio-only archive. The TokBox API produces the audio-only archive for you, but you will have to integrate playback into your chat system yourself. See https://tokbox.com/developer/guides/archiving/
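Starting such an archive from the server side might look like the following Node sketch (an illustration only; an ASP.NET backend would use the equivalent call in the OpenTok .NET server SDK):

// Server-side sketch with the 'opentok' Node package; API key, secret, and session ID come from your account.
const OpenTok = require('opentok');
const opentok = new OpenTok(API_KEY, API_SECRET);

opentok.startArchive(sessionId, { name: 'live-audio-show', hasVideo: false }, (err, archive) => {
  if (err) return console.error('Could not start archive:', err);
  console.log('Archive started:', archive.id);   // store archive.id so chat can link to the recording later
});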

Migrate smooth streaming encoded video files to Azure Media Service

My company streams videos using IIS Media Services to Silverlight players; the streams are delivered as adaptive bitrate (Microsoft Smooth Streaming). Because support for the Silverlight plugin is being dropped by all major browsers, we are planning to migrate our streaming platform to Azure.
I have checked the documentation, samples, and articles and couldn't find anything on how to use existing Smooth Streaming encoded video without having to re-encode. We have quite a large set of assets to migrate, around 400 GB, so re-encoding is not an option; we also plan to dynamically encrypt our content using AES. Does anyone know how to go about this?
You need to perform the following steps:
1. Create an Azure Media Services asset.
2. Upload the files for that asset.
3. Run the "Windows Azure Media Encryptor" media processor to encrypt the asset.
4. Configure the delivery options.
See https://github.com/Azure/azure-sdk-for-media-services/blob/dev/test/net/Scenario/JobTests.cs. The method
private IAsset CreateSmoothAsset()
covers steps 1 and 2. There are various tests in that file covering encryption of an asset with the "Windows Azure Media Encryptor" processor (see the usage of
GetMediaProcessor(_mediaContext, WindowsAzureMediaServicesTestConfiguration.MpEncryptorName);
).
To configure delivery of protected content, see https://azure.microsoft.com/en-us/documentation/articles/media-services-protect-with-aes128/.
There is also a media processor called "Windows Azure Media Packager" which allows you to package your Smooth Streaming asset, for example to HLS.
You can onboard your existing Smooth Streaming assets to Azure Media Services without re-encoding them and apply AES dynamic encryption and dynamic packaging to different streaming formats such as HLS, MPEG-DASH, and Smooth Streaming. However, there are some limitations and constraints. If your content is already encrypted, for example Smooth Streaming + PlayReady, dynamically encrypting it with AES is not supported; your content needs to be in the clear to use dynamic encryption. Your Smooth Streaming assets also need to be compliant with the Smooth Streaming spec; some tools generate Smooth Streaming files that are not spec-compliant and are not supported by Azure Media Services.
You can use the article on creating assets from existing storage blobs to get started:
https://azure.microsoft.com/en-us/documentation/articles/media-services-copying-existing-blob/
I hope this answers your question.
Cenk

How do you properly use ChannelAPI to send a file from wear to mobile and vice versa?

I am trying to record audio on the Wear device and send it to the mobile device, and vice versa, using the Channel API. However, I can't find a working example of how to write this. Can anyone help? Thanks.
You can use the WearCompanionLibrary, which provides an API for that. The sample WclDemoSample uses that library, and one of the features shown there is exactly this: it sends the audio stream (captured from the microphone) from a Wear device to the connected phone in real time, using the ChannelApi. If you don't want to use the library, you can look inside it to see how it is done.
