Is there any way to provide cookies for file-based media (mp4) in a Chromecast CAF Receiver application?

I have file-based media (an MP4) on a server (the server is not mine; I just use it to store the MP4 file). To stream this media from the server, I need to provide cookies which my Android sender app collects (I can already play this stream in my Android app).
I am looking for a way to provide these cookies to the server from a Chromecast CAF receiver app (for file-based media). I have checked the Google Cast guides and searched online, but I can only find ways to do this for other protocols (DASH, HLS, ...).
Source: https://developers.google.com/cast/v2/mpl_player#cors <- this is for the old (v2) receiver app
If your server requires CORS and cookie information in order to access the media, set the property withCredentials to true and set the header information as needed.
host.updateSegmentRequestInfo = function(requestInfo) {
  // example of setting CORS withCredentials
  requestInfo.withCredentials = true;
  // example of setting headers
  requestInfo.headers = {};
  requestInfo.headers['content-type'] = 'text/xml;charset=utf-8';
};
Source: Authentication in Chromecast CAF Receiver application <- this is for other protocols (DASH, HLS, ...), not file-based media
playbackConfig.manifestRequestHandler = requestInfo => {
  requestInfo.withCredentials = true;
};
playbackConfig.segmentRequestHandler = requestInfo => {
  requestInfo.withCredentials = true;
};
playbackConfig.licenseRequestHandler = requestInfo => {
  requestInfo.withCredentials = true;
};
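Since all three handlers do the same thing, they could be attached with a single helper. A minimal sketch (the name `withCookies` is my own; only the handler property names come from `cast.framework.PlaybackConfig`):

```javascript
// Sketch: enable credentialed (cookie-carrying) requests for every request
// type on a cast.framework.PlaybackConfig-like object. The helper name is
// hypothetical; the handler property names come from the CAF SDK.
function withCookies(playbackConfig) {
  const enableCredentials = requestInfo => {
    requestInfo.withCredentials = true;
    return requestInfo;
  };
  playbackConfig.manifestRequestHandler = enableCredentials;
  playbackConfig.segmentRequestHandler = enableCredentials;
  playbackConfig.licenseRequestHandler = enableCredentials;
  return playbackConfig;
}
```

Note that all of these handlers apply to adaptive-streaming requests; there is no equivalent handler documented for plain progressive MP4 loads, which is exactly the gap described above.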
I cannot find a way to provide cookies for file-based media.
So I would like to ask for your help, answers, and recommendations. Thanks.

After researching online documentation and inspecting the Chromecast device session, it looks like the Chromecast does not allow cookies to be provided for authentication (cookies are not enabled on the device), presumably for security reasons.

Related

Azure Media Service, VideoJS, Need to hide Azure endpoint url in src

In Azure Media Services with Video.js, we create a manifest file and show that video in the UI.
I want to hide the Azure URL, or use a different way to show the video without exposing the Azure cloud endpoint in the frontend.
The URL below comes from a Spring Boot backend.
I need to hide the Azure endpoint from the client, or find another way to feed the video from the Spring Boot response to Video.js.
Code:
const videoJsOptions = {
  techOrder: ['html5', 'youtube', 'flash', 'other supported tech'],
  autoplay: true,
  controls: true,
  usingNativeControls: true,
  sources: [
    {
      src: 'https://my-video.streaming.media.azure.net/tes-122/manifest',
      type: 'application/vnd.ms-sstr+xml'
    }
  ]
};
Not sure I understand the question. Do you want to hide the video streaming URL 'https://my-video.streaming.media.azure.net/tes-122/manifest' from the client?
Even if you hide it in JavaScript, the client can press F12 and get the URL, right? Unless you implement a proxy where all requests from the client go through the proxy, and the proxy connects to the actual Azure resource.
While a simple proxy server is sufficient, all the video data now has to be routed through the proxy, so it needs to be quite scalable depending on how many videos you have and how many clients are viewing them at any given time.
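The URL-hiding part of such a proxy boils down to rewriting the Azure URL into a path on your own origin. A sketch, where `proxyBase` and the `/stream` prefix are assumptions of mine, not part of Video.js or Azure:

```javascript
// Sketch: map an Azure Media Services URL onto the proxy's own origin so the
// frontend never sees the Azure hostname. '/stream' is an arbitrary prefix.
function toProxyUrl(azureUrl, proxyBase) {
  const u = new URL(azureUrl);
  return proxyBase.replace(/\/+$/, '') + '/stream' + u.pathname;
}
```

The proxy then reverses the mapping server-side and fetches from Azure itself, so the Azure hostname only ever appears in backend configuration.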

Does the Chromecast support casting videos from Reddit? (HLS and Dash videos)

Called proxy with URL http://192.168.xx.xx:8080/3hyw7hwoajn21/HLSPlaylist.m3u8
Called proxy with URL http://192.168.xx.xx:8080/3hyw7hwoajn21/HLS_540_v4.m3u8
Called proxy with URL http://192.168.xx.xx:8080/3hyw7hwoajn21/HLS_AUDIO_160_K_v4.m3u8
Called proxy with URL http://192.168.xx.xx:8080/3hyw7hwoajn21/HLS_224_v4.m3u8
Here's an example Reddit video: https://www.reddit.com/r/me_irl/comments/b3vrs4/me_irl
Looking through the JSON, it has a few options for video sources:
"reddit_video": {
  "dash_url": "https://v.redd.it/3hyw7hwoajn21/DASHPlaylist.mpd",
  "duration": 76,
  "fallback_url": "https://v.redd.it/3hyw7hwoajn21/DASH_720?source=fallback",
  "height": 720,
  "hls_url": "https://v.redd.it/3hyw7hwoajn21/HLSPlaylist.m3u8",
  "is_gif": false,
  "scrubber_media_url": "https://v.redd.it/3hyw7hwoajn21/DASH_240",
  "transcoding_status": "completed",
  "width": 1280
}
While I seemingly can get other HLS/m3u8 videos to work with the Chromecast SDK (for example Google's own example HLS video), I cannot seem to get any of these sources to work.
I've tried https://v.redd.it/3hyw7hwoajn21/HLSPlaylist.m3u8 with the stream type set to both "live" and "buffered", I've tried the content type "application/x-mpegURL", and I've tried the same for the DASH URL https://v.redd.it/3hyw7hwoajn21/DASHPlaylist.mpd with content type "application/dash+xml", all to no avail. I found this question that seems to indicate some possibility.
I've also noticed that the DASH file has separate video and audio streams (https://v.redd.it/3hyw7hwoajn21/DASH_720 and https://v.redd.it/3hyw7hwoajn21/audio); worst case scenario, is there a way to play the video stream on the Chromecast with the separate audio stream playing too?
Is it not possible for the Chromecast to play these video types?
UPDATE
Jesse and aergistal suggested that it has to do with missing CORS headers. I built a custom receiver app to get better debugging logs, and this was indeed (the first) issue: the Chromecast complains about CORS.
Using nginx, I built a local reverse proxy that adds all the CORS headers; when I give the Chromecast that proxy URL instead, the CORS error goes away.
However, using the HLS/m3u8 link it still wouldn't stream. Now it complains of the following:
[cast.player.hls.PackedAudioParser] Neither ID3 nor ADTS header was found at 0
and
[cast.player.api.Host] error: cast.player.api.ErrorCode.NETWORK/315
and
[cast.receiver.MediaManager] Load metadata error: Error
Which causes it to still not play. Any ideas?
Fixing the CORS issue allows the DASHPlaylist.mpd variant to load (it wouldn't before), which is great, but not so great at the same time: the reverse proxy has to download the entire response first, and since the DASH URL is just one entire MP4 (whereas the HLS uses byte ranges), the proxy has to download the whole DASH video before playback starts, which takes ages compared to the HLS.
So it'd still be optimal to get the HLS working due to speed, but is it just doomed not to work due to a playback issue on the Chromecast?
Solution for HLS with separate audio tracks
Based on the information from the latest log there is a mismatch between the chosen segment format and actual format used in the stream. The stream uses AAC in MPEG-TS whereas the Cast SDK tries to parse it as packed audio.
A reply on the Cast issue tracker shows that the HlsSegmentFormat defaults to MPEG2_TS if the stream is multiplexed and MPEG_AUDIO_ES otherwise.
The suggested solution for the CAF receiver is to intercept the load requests and override the segment format with loadRequestData.media.hlsSegmentFormat = cast.framework.messages.HlsSegmentFormat.TS. A slightly modified example:
<html>
  <head>
  </head>
  <body>
    <cast-media-player id="player"></cast-media-player>
    <script type="text/javascript" src="//www.gstatic.com/cast/sdk/libs/caf_receiver/v3/cast_receiver_framework.js">
    </script>
    <script>
      const context = cast.framework.CastReceiverContext.getInstance();
      const playerManager = context.getPlayerManager();
      // intercept the LOAD request
      playerManager.setMessageInterceptor(
          cast.framework.messages.MessageType.LOAD, loadRequestData => {
            loadRequestData.media.hlsSegmentFormat = cast.framework.messages.HlsSegmentFormat.TS;
            return loadRequestData;
          });
      context.start();
    </script>
  </body>
</html>
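If a receiver has to handle both muxed TS streams and packed-audio streams, the interceptor can choose the segment format per request. A hypothetical helper; the literal 'ts' and 'aac' strings mirror the values of `cast.framework.messages.HlsSegmentFormat.TS` / `.AAC` as I understand them, so check the enum in your SDK version:

```javascript
// Hypothetical: choose an HLS segment format from the sender's contentType.
// 'ts' and 'aac' are assumed to match the HlsSegmentFormat enum values in
// the CAF SDK; returned as plain strings to keep this sketch self-contained.
function pickSegmentFormat(contentType) {
  const hlsTypes = ['application/x-mpegURL', 'application/vnd.apple.mpegurl'];
  if (hlsTypes.includes(contentType)) {
    return 'ts'; // multiplexed MPEG-TS segments, as in the Reddit streams
  }
  return 'aac'; // packed audio elementary stream
}
```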
Solution for CORS
The Google Cast reference gives you the solution:
If you're having problems playing streams on a Cast device, it may be an issue with CORS. Use a publicly available CORS proxy server to test your streams
The problem with the publicly available proxies is that they enforce a size limit due to bandwidth concerns, so make your own or use an open-source one. If the app runs on a mobile device you can also make it a local server.
The current streams are not protected by DRM. This will get more complicated or outright impossible later if they add CDN authentication or protect the streams with DRM.
Regarding the CORS headers, you must make sure preflight requests are supported: the client might send an OPTIONS request first to check CORS support (including allowed methods and headers).
HTTP range requests must also be supported for your streams, meaning the appropriate headers must be allowed and exposed.
Example preflight request configuration from https://enable-cors.org:
Access-Control-Allow-Origin: *
Access-Control-Allow-Methods: GET,POST,OPTIONS
Access-Control-Allow-Headers: DNT,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Range
Access-Control-Expose-Headers: Content-Length,Content-Range
You will need to allow at least GET and OPTIONS, allow the Content-Type and Range headers, and expose Content-Length and Content-Range. Remove duplicate headers if the remote server already provides them.
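Expressed as code, the same policy is just a header map your own proxy could merge into every upstream response. The helper itself is illustrative; the values are copied from the enable-cors.org example above:

```javascript
// Illustrative: the CORS response headers from the enable-cors.org example,
// as a map a Node reverse proxy could attach to every response it forwards.
function corsHeaders() {
  return {
    'Access-Control-Allow-Origin': '*',
    'Access-Control-Allow-Methods': 'GET,POST,OPTIONS',
    'Access-Control-Allow-Headers':
        'DNT,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Range',
    'Access-Control-Expose-Headers': 'Content-Length,Content-Range'
  };
}
```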
Conclusion
The most ethical answer is to work with Reddit to ensure they set the proper CORS headers, which Google's documentation requires.
Simulating your issue
Using this tester:
https://developer.jwplayer.com/tools/stream-tester/
It reproduces some of the same behavior you were seeing in your code with the Chromecast SDK: the Google video played without the PlayReady DRM setting, but the Reddit videos did not (in most browsers).
MS EDGE and jwplayer
If you select PlayReady and put anything in the PlayReady URL field, or even leave it blank, the M3U8 works.
Internet Explorer and jwplayer
Error, 232011 A manifest request was made without proper crossdomain credentials. Cannot load M3U8: Crossdomain access denied. This video cannot be played because of a technical error.
This indicates that perhaps CORS is not enabled on the reddit servers. More on that below.
Firefox and jwplayer
Nothing there seems to work with jwplayer.
Chrome and jwplayer
Doesn't work with jwplayer.
Safari and jwplayer
You indicated it worked without needing to set any of the DRM settings.
iPhone/Apple TV
I tried it and the m3u8 videos are able to use AirPlay directly to cast from my phone to the Apple TV (4K).
Simulation Summary
All the M3U8 videos already stream fine from iPhone to Apple TV with AirPlay. It also seems to work in Edge and in Safari, so maybe it only works because Reddit supports Apple streaming via AirPlay as a service, but not Chromecast. Not quite sure there, but how else could it be explained? More clarification from someone would be great.
Root Cause Analysis
Notice the Google link you shared includes this header:
Access-Control-Allow-Origin
and it is set to * (i.e. all), meaning the server will share the requested resources with any domain on the Internet.
https://tools.geekflare.com/report/cors-test/https://commondatastorage.googleapis.com/gtv-videos-bucket/CastVideos/hls/DesigningForGoogleCast.m3u8
The Reddit links don't have that header, meaning CORS is not enabled to allow resource sharing, so it is not designed to work from other origins.
Description of the CORS headers
https://www.codecademy.com/articles/what-is-cors
The Access-Control-Allow-Origin header allows servers to specify how their resources are shared with external domains. When a GET request is made to access a resource on Server A, Server A will respond with a value for the Access-Control-Allow-Origin header. Many times, this value will be *, meaning that Server A will share the requested resources with any domain on the Internet. Other times, the value of this header may be set to a particular domain (or list of domains), meaning that Server A will share its resources with that specific domain (or list of domains). The Access-Control-Allow-Origin header is critical to resource security.
There are several resources indicating that CORS must be enabled from the server side:
https://stackoverflow.com/a/28360045/9105725
https://help.ooyala.com/video-platform/concepts/cors_enablement.html
Even Google says these headers need to be set:
https://developers.google.com/cast/docs/chrome_sender/advanced
CORS requirements
For adaptive media streaming, Google Cast requires the presence of CORS headers, but even simple mp4 media streams require CORS if they include Tracks. If you want to enable Tracks for any media, you must enable CORS for both your track streams and your media streams. So, if you do not have CORS headers available for your simple mp4 media on your server, and you then add a simple subtitle track, you will not be able to stream your media unless you update your server to include the appropriate CORS header. In addition, you need to allow at least the following headers: Content-Type, Accept-Encoding, and Range. Note that the last two headers are additional headers that you may not have needed previously.

Connecting to AWS IoT Websocket without MQTT client

I have a client application which runs in the browser, whose implementation I can't change to add an MQTT client such as mqtt on npm.
The code in the library is as follows and allows me to pass in a socketUrl
const ws = new WebSocket(socketUrl)
I have tried generating a presigned URL for IoT, which seems to work in terms of authenticating (i.e. no Unauthorized response) but I get a 426 Upgrade Required response.
I believe I'm correct in saying that if it were working it would reply with 101 Switching Protocols, but without knowing much about MQTT I'm unsure whether this isn't happening because I'm doing something wrong or because I'm not using MQTT.
I'm generating a signed URL using the code below (I'll switch to Cognito identities rather than a fixed key/secret if I get this working).
const v4 = require('aws-signature-v4')
const crypto = require('crypto')

const socketUrl = v4.createPresignedURL(
  'GET',
  'myioturl.iot.us-east-1.amazonaws.com',
  '/mqtt', // tried just /mytopic, too
  'iotdevicegateway',
  crypto.createHash('sha256').update('', 'utf8').digest('hex'),
  {
    'key': 'removed',
    'secret': 'removed',
    'protocol': 'wss',
    'region': 'us-east-1'
  }
)
The protocols page in the iot documentation seems to suggest that if I point at /mqtt I'm indicating I'll be using MQTT.
mqtt Specifies you will be sending MQTT messages over the WebSocket protocol.
What does it mean if I just specify /foobar? Should I be able to connect to the socket, just without using MQTT?
There are quite a few unknowns for me, so I'm struggling to work out whether it should work at all, and if so, which bit I'm doing wrong.

How to access content from secured website via proxy tunnel without authentication required

I would like to scrape / download content from a site which requires authentication. To do so, I need some tunnel between my CLI / Node.js app and the secured website that does not itself require authentication. Please see the schema:
Scraper / downloader app -> [no password] -> some proxy -> user (login) -> secured website with login / authentication
Any idea how to make it so?
Depending on how exactly the target site handles authentication, your problem could be solved by setting up a simple node-http-proxy app.
This might be the solution (copy-pasted from the documentation and modified):
var http = require('http'),
    httpProxy = require('http-proxy');

var proxy = httpProxy.createProxyServer({});

var server = http.createServer(function(req, res) {
  // Your mileage may vary here: inject the credentials the target expects.
  // (req is an http.IncomingMessage, so set the header on req.headers
  // directly; it has no setHeader method.)
  req.headers['authorization'] = 'Basic mysecrettoken=';
  proxy.web(req, res, { target: 'http://127.0.0.1:5060' });
});

server.listen(5050);
See https://github.com/nodejitsu/node-http-proxy for more information.

XMLHttpRequest to get a sound from a different domain

I am currently playing with the Web Audio API, looking at buffering and playing a sound source coming from a different domain.
I did quite a bit of research, including on Stack Overflow, and it seems there are solutions for cross-domain requests (JSONP, YQL...) to query HTML, JSON, XML... but nothing to capture an audio source. The standard way to fetch a sound source is to use an XMLHttpRequest and force the response type to arrayBuffer:
var request = new XMLHttpRequest();
request.open("GET", url);
request.responseType = 'arraybuffer';
request.onload = function() {
  // request.response is now an ArrayBuffer that can be decoded and played
};
request.send();
Afterwards, request.response is a buffer that can be decoded and played.
This seems to work with a "url" that points to an audio file on the same domain. Is there a way to get the response of an XMLHttpRequest requesting an audio source from an external domain?
I tried http://query.yahooapis.com/v1/public/yql? with a select query, but there is no way to pick up an audio source from the tables (according to https://developer.yahoo.com/yql/console/).
Any idea is welcome.
Many thanks!!
This will only work if the audio file you're requesting is on a server that supports CORS (and you use CORS in the request); you can't just arbitrarily grab sound files off other servers, as that would enable cross-origin data access. See http://enable-cors.org/.
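Schematically, the browser's decision comes down to comparing the response's Access-Control-Allow-Origin value against the requesting page's origin. A simplified model (real CORS also involves credentials mode and preflight rules):

```javascript
// Simplified model of the browser-side CORS origin check. Real CORS has more
// rules (credentials mode, preflights); this only covers the origin match.
function isOriginAllowed(allowOriginHeader, pageOrigin) {
  if (!allowOriginHeader) return false;       // header absent: blocked
  if (allowOriginHeader === '*') return true; // wildcard: any origin allowed
  return allowOriginHeader === pageOrigin;    // otherwise an exact match
}
```

This is why the fix always lives on the server hosting the audio: the page making the request cannot grant itself access.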
