Audio track switching impossible with MPL 0.7.0 and Smooth Streaming protocol - Chromecast

What steps will reproduce the problem?
1. On the mediaManager's loadData, the audio stream info is sent to the sender.
2. The selected audio track index is sent back to the receiver.
3. The receiver disables the existing audio stream, enables the newly selected audio stream, and reloads the player (see the sketch below).
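For reference, step 3 on the receiver looks roughly like this; a minimal sketch built on the documented cast.player.api surface (the protocol, player, and newIndex names are assumptions, not from the original report):

// Sketch of the receiver-side switch in step 3. Assumes `protocol` is the
// cast.player.api.StreamingProtocol created at load time, `player` is the
// cast.player.api.Player, and `newIndex` came from the sender message.
function switchAudioStream(protocol, player, newIndex) {
  for (var i = 0; i < protocol.getStreamCount(); i++) {
    var info = protocol.getStreamInfo(i);
    // Only touch audio streams; leave video/text stream selection alone.
    if (info.mimeType.indexOf('audio') === 0) {
      protocol.enableStream(i, i === newIndex);
    }
  }
  player.reload(); // apply the new selection without a full load
}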
What is the expected output? What do you see instead?
Expected output: The new audio stream should be enabled on the protocol, with no duplicate stream info present.
Current output: No new audio stream is enabled and the stream info is duplicated; if you change the audio stream again on the client and send it to the receiver, the stream info count is tripled.
What version of the product are you using? On what operating system?
Mac, MPL 0.7.0.
Smooth Streaming with PlayReady DRM.
Please provide any additional information below.
Also, something very strange: my asset has Swedish as the default audio language, but when the audio stream info is fetched from the Smooth Streaming protocol, the first language is always selected by default (in this case Finnish), no matter what. Even if one tries to modify the protocol manually, MPL throws an exception: "Uncaught exception: cannot call method Ra of null".
And it always remains the first language of the protocol's audio streams, no matter how many times the player is reloaded.
P.S. - There is no API documentation for the Smooth Streaming protocol; the one present in the API references slapped me with a 404 error:
https://developers.google.com/cast/docs/reference/player/player.StreamingProtocol

Related

FFMPEG - RTSP: where does sprop-parameter-sets come from?

I'm encoding .mp4 video files (H.265) with ffmpeg to create an RTSP stream, pushing it to rtsp-simple-server; after aler9's restreamer, the stream is available to watch. I can watch the result in VLC (adding the network source rtsp://my.server/stream), but I can't watch the stream in an Android application - no sprop-parameter-sets headers exist.
So the question is: where can I get these headers, and how can I add them to the stream so that everything works?
P.S. If I add the stream to Wowza, these headers (sprop-vps, sprop-sps, sprop-pps) are added by Wowza to the stream, but I need to use the server directly.
OK, found an answer: '-bsf:v', 'dump_extra'
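For completeness, a hedged sketch of how that filter might slot into the full pipeline; the file names and ingest URL are placeholders, and the args-array style matches the quoting in the answer (here via a Node.js spawn):

// Hypothetical Node.js wrapper; input path and ingest URL are placeholders.
// The key piece is '-bsf:v dump_extra', which re-inserts the codec extradata
// (VPS/SPS/PPS for H.265) into the bitstream so in-band-only clients can play it.
const { spawn } = require('child_process');

const ffmpeg = spawn('ffmpeg', [
  '-re', '-i', 'input.mp4',       // read the source at its native frame rate
  '-c:v', 'copy',                 // remux only, no re-encode
  '-bsf:v', 'dump_extra',         // copy parameter sets into the stream itself
  '-f', 'rtsp',                   // publish over RTSP
  'rtsp://my.server:8554/stream', // rtsp-simple-server ingest URL (placeholder)
]);

ffmpeg.stderr.pipe(process.stderr); // ffmpeg writes its log to stderr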

How do I test a WebM video stream using JMeter?

AFAIK there are 2 video streaming plugins available for JMeter:
BlazeMeter - HLS Plugin and
UbikLoadPack Video Streaming plugin
UbikLoadPack has a prohibitive price tag, and the HLS Plugin doesn't test the format I need. Also, I want a FOSS solution, not a paid one.
Does anyone know of another plugin or method I could use to test a WebM video stream?
Edit
@dmitri-t: when I try to do this it just hangs. I found a script that shows how to test a video, but when I changed the parameters to my video and the range to 0-100, it hung too.
Also, the example uses HTTP and my video uses HTTPS.
I tried including a timer; it still hangs.
Yet the video loads perfectly in Chrome with the same URL I used in JMeter.
I also tested the request with Postman: it ignores the Range header. So what's probably happening in JMeter is that it tries to load the whole continuous stream. How do I make it respect the Range header?
I tested with Postman against an image on the same server, to check whether the Range header was being honored or whether it was a server problem; the range was respected correctly.
The Content-Range header doesn't work either. Please check this related question I asked about the range problem with streams: Request to a webm stream ignores range header
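One quick way to verify range support outside both tools, assuming Node 18+ with its built-in fetch (the URL is a placeholder):

// Placeholder URL; expect 206 + Content-Range if the server honors ranges,
// a plain 200 (full body) if it ignores them.
const url = 'https://my.server/stream.webm';

(async () => {
  const res = await fetch(url, { headers: { Range: 'bytes=0-99' } });
  console.log(res.status);                           // 206 or 200
  console.log(res.headers.get('content-range'));     // e.g. "bytes 0-99/1234567"
  console.log((await res.arrayBuffer()).byteLength); // 100 if the range was honored
})();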
I don't think you need any plugin; you can simulate the browser playing the video with a normal HTTP Request sampler sending a plain HTTP GET request.
Here is the evidence that "playing" a WebM "stream" is nothing more than downloading it.
It would be a good idea to add Timers to simulate users watching the video to the end (or according to your test case).
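Outside JMeter, the same "download in chunks with pauses" idea can be prototyped directly; a sketch with a placeholder URL and an arbitrary 1 MiB chunk size:

// Simulates a viewer progressively downloading the stream in 1 MiB chunks,
// sleeping between requests the way a JMeter Timer would pace samplers.
const url = 'https://my.server/stream.webm';
const CHUNK = 1024 * 1024;
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

(async () => {
  for (let start = 0; ; start += CHUNK) {
    const res = await fetch(url, {
      headers: { Range: `bytes=${start}-${start + CHUNK - 1}` },
    });
    if (res.status !== 206) break; // server ignored or exhausted the range
    const buf = await res.arrayBuffer();
    console.log(`got ${buf.byteLength} bytes at offset ${start}`);
    const total = Number((res.headers.get('content-range') || '').split('/')[1]);
    if (start + buf.byteLength >= total) break; // end of file reached
    await sleep(1000); // "watching" pause between chunks
  }
})();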

Application gets ANR when trying to change *.mpd tracks

I'm trying to implement Chromecast support in my app. I'm using the following libs:
implementation 'com.google.android.exoplayer:exoplayer:2.9.6'
implementation 'com.google.android.exoplayer:extension-cast:2.7.0'
My videos are DRM-encrypted. I provide my custom receiver with a stream URL and a proxy URL, and the video plays on the TV through a Chromecast. The problem appears when I try to change the video/subtitle tracks: using the ExpandedControllerActivity provided by the libraries above, I can open and close the native dialog that lists my video and subtitle tracks, but I get an ANR when I choose another track.
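For context, picking a track in that dialog makes the sender send an EDIT_TRACKS_INFO media message that the custom receiver must handle. A rough receiver-side (Receiver API v2 / MPL) sketch, assuming mediaManager, protocol, and player objects already exist and that active track IDs map one-to-one onto stream indices (a simplification for illustration):

// Receiver-side sketch (Receiver API v2). Assumes `mediaManager`, `player`
// and `protocol` already exist, and that active track IDs line up with
// protocol stream indices (a simplification).
var defaultOnEditTracksInfo = mediaManager.onEditTracksInfo.bind(mediaManager);
mediaManager.onEditTracksInfo = function (event) {
  defaultOnEditTracksInfo(event); // keep the SDK's own bookkeeping
  var activeIds = event.data.activeTrackIds || [];
  for (var i = 0; i < protocol.getStreamCount(); i++) {
    var enable = activeIds.indexOf(i) !== -1;
    if (protocol.isStreamEnabled(i) !== enable) {
      protocol.enableStream(i, enable);
    }
  }
  player.reload(); // apply the new track selection
};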

Getting cast.receiver.MediaManager Load metadata error with custom media player for Chromecast

This is my first time trying Chromecast apps.
I started with CastVideos-android paired with a Styled Media Player using a custom skin URL. After some hurdles, I was able to get the custom skin to work, and the video clips from the sender app play nicely.
Now I'm attempting a custom media player, using the sample CastHelloVideoPlayer receiver app from Google's sample list paired with the CastVideos-android sender app. After creating a new application ID and recompiling CastVideos-android, I tried to cast some videos to the cast device.
1) The first thing I noticed is that the TV is completely blank. No default app name or anything, just a plain black screen. I didn't think much of it, since this is a custom media player and a lot of things may not be set, such as the logo/splash/watermark.
2) The main issue I encountered: when I tried to play a video clip, the cast device remained blank. Looking at the Chrome debugging console, I noticed this error message:
[ 32.941s] [cast.receiver.MediaManager] Load metadata error: [object Object]
pd # cast_receiver.js:formatted:2249
nd.Zc # cast_receiver.js:formatted:2234
tb.log # cast_receiver.js:formatted:675
G # cast_receiver.js:formatted:710
W.Yb # cast_receiver.js:formatted:4855
g.Yb # cast_receiver.js:formatted:3660
Jc # cast_receiver.js:formatted:1500
Gc # cast_receiver.js:formatted:1550
(anonymous function) # cast_receiver.js:formatted:1447
[ 32.955s] [cast.receiver.MediaManager] Sending error message to b5d9d1e6-f6d6-a0bd-440c-fe7255ebfcbc.11:com.google.sample.cast.refplayer-172 (cast_receiver.js:formatted:2249)
Now I'm surprised to encounter this, because the same video clips played nicely when I was using the Styled Media Player. Why do they fail when I use the sample CastHelloVideoPlayer?
Just found out that the sample receiver app I took from the Chromecast samples page is limited to the default media containers listed under the Custom Receiver supported media formats, and the sample sender app was sending m3u8 containers. So after changing the sender app to select the correct target media (mp4), everything started working.
To support HLS, you need to use the Google Cast Media Player Library (MPL), which is not part of any of the sample apps.
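For the record, a rough sketch of what routing HLS through MPL looks like in a custom receiver, based on the documented cast.player.api surface (the mediaElement variable and the .m3u8 URL test are assumptions):

// Sketch: route HLS through MPL, let the media element handle mp4 natively.
// Assumes `mediaElement` is the receiver page's <video> element.
var mediaManager = new cast.receiver.MediaManager(mediaElement);
var player = null;

mediaManager.onLoad = function (event) {
  var url = event.data.media.contentId;
  if (url.indexOf('.m3u8') !== -1) {
    // HLS needs MPL; the media element alone cannot demux it.
    var host = new cast.player.api.Host({ mediaElement: mediaElement, url: url });
    player = new cast.player.api.Player(host);
    player.load(cast.player.api.CreateHlsStreamingProtocol(host));
  } else {
    // mp4 and the other natively supported containers play directly.
    mediaElement.src = url;
    mediaElement.play();
  }
};

cast.receiver.CastReceiverManager.getInstance().start();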

Retrieving Timed Metadata on Chromecast

We are developing a custom receiver for HLS video playback on Chromecast.
Our stream has Timed ID3 Metadata embedded in the MPEG-2 Transport Stream (TS files). I need the exact position in the stream where these ID3 tags are located for our app to function properly.
In my Custom Receiver, I am registering for the Host.processMetadata event and am receiving the metadata tags as the fragments are processed, but I am unable to determine the position of the tags in the stream.
I am looking for the best way to determine the position in the stream where the Timed Metadata is located. Is there an API call I am missing?
Notes:
We are able to stream our HLS video, using the proper CORS headers.
We are getting the accurate position of the Timed Metadata when playing this stream on iOS and Android players.
We are working with an Android Sender.
Working with:
Cast Receiver 2.0.0
Media Player 1.0.0
Chromecast Firmware Version 26653
Thank you!
We are working on adding a new feature to MPL to address this exact issue: making the media time corresponding to the ID3 data available in processMetadata. I'll try to update this post when that is implemented and released.
Google updated the Host API's processMetadata method a week or two after I posted this question. The callback now includes the time in the stream where the metadata is located.
See the docs for more:
https://developers.google.com/cast/docs/reference/player/cast.player.api.Host#processMetadata
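Per the linked reference, the updated callback receives the media time alongside the tag payload; a minimal sketch, with parameter names that are assumptions on my part:

// Sketch: capture ID3 timing in an MPL custom receiver. Assumes `host` is
// the cast.player.api.Host instance the Player was built from.
host.processMetadata = function (metadataType, data, timestamp) {
  // `timestamp` is the media time (seconds) at which the metadata applies.
  console.log('ID3 at', timestamp, 's; type', metadataType, ';', data.length, 'bytes');
};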
