I am currently playing with the Web Audio API and I am looking at buffering and playing a sound source coming from a different domain.
I did quite a bit of research, including on Stack Overflow, and it seems there are solutions for cross-domain requests (JSONP, YQL...) to query HTML, JSON, XML... but nothing to capture an audio source. The standard method for picking up a sound source is to use an XMLHttpRequest and force the response to be of type ArrayBuffer:
var request = new XMLHttpRequest();
request.open("GET", url);
request.responseType = 'arraybuffer'; // set after open() for broadest compatibility
request.onload = function () { /* request.response is now an ArrayBuffer */ };
request.send();
request.response can afterwards be decoded into a buffer that can be played.
This seems to work with a "url" that points to an audio file on the same domain. Is there a way to get the response of an XMLHttpRequest requesting an audio source from an external domain?
I tried http://query.yahooapis.com/v1/public/yql with a select query, but there is no way to pick up an audio source from the tables (according to https://developer.yahoo.com/yql/console/).
Any idea is welcome.
Many thanks!!
This would only work if the audio file you're requesting is on a server that supports CORS (and you use CORS in the request) - you can't just arbitrarily grab sound files off other servers, as that would enable cross-origin data access. See http://enable-cors.org/.
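For completeness, here is a minimal browser-only sketch (the function name and structure are mine) of how the XHR from the question fits together with Web Audio playback once the remote server does send the CORS headers:

```javascript
// Sketch only: assumes a browser environment and a `url` on a server
// that responds with Access-Control-Allow-Origin.
function playCrossOriginAudio(audioCtx, url) {
  var request = new XMLHttpRequest();
  request.open("GET", url);
  request.responseType = 'arraybuffer';
  request.onload = function () {
    // Only reaches here with a readable response if CORS headers were present
    audioCtx.decodeAudioData(request.response, function (buffer) {
      var source = audioCtx.createBufferSource();
      source.buffer = buffer;
      source.connect(audioCtx.destination);
      source.start(0);
    });
  };
  request.send();
}
```

Without the `Access-Control-Allow-Origin` header on the response, the browser blocks access to `request.response` regardless of what you do on the client side.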
Called proxy with URL http://192.168.xx.xx:8080/3hyw7hwoajn21/HLSPlaylist.m3u8
Called proxy with URL http://192.168.xx.xx:8080/3hyw7hwoajn21/HLS_540_v4.m3u8
Called proxy with URL http://192.168.xx.xx:8080/3hyw7hwoajn21/HLS_AUDIO_160_K_v4.m3u8
Called proxy with URL http://192.168.xx.xx:8080/3hyw7hwoajn21/HLS_224_v4.m3u8
Here's an example Reddit video: https://www.reddit.com/r/me_irl/comments/b3vrs4/me_irl
Looking through the JSON, it has a few options for video sources:
"reddit_video": {
    "dash_url": "https://v.redd.it/3hyw7hwoajn21/DASHPlaylist.mpd",
    "duration": 76,
    "fallback_url": "https://v.redd.it/3hyw7hwoajn21/DASH_720?source=fallback",
    "height": 720,
    "hls_url": "https://v.redd.it/3hyw7hwoajn21/HLSPlaylist.m3u8",
    "is_gif": false,
    "scrubber_media_url": "https://v.redd.it/3hyw7hwoajn21/DASH_240",
    "transcoding_status": "completed",
    "width": 1280
}
While I seemingly can get other HLS/m3u8 videos to work with the Chromecast SDK (for example Google's own example HLS video), I cannot seem to get any of these sources to work.
I've tried https://v.redd.it/3hyw7hwoajn21/HLSPlaylist.m3u8 with the stream type set to both "live" and "buffered", I've tried the content type "application/x-mpegURL", and I've tried the same for the DASH URL https://v.redd.it/3hyw7hwoajn21/DASHPlaylist.mpd with content type "application/dash+xml", all to no avail. I found this question that seems to indicate some possibility?
I've also noticed that the DASH file has separate video and audio streams (https://v.redd.it/3hyw7hwoajn21/DASH_720 and https://v.redd.it/3hyw7hwoajn21/audio). Worst case scenario, is there a way to play the video stream on the Chromecast with the separate audio stream playing alongside it?
Is it not possible for the Chromecast to play these video types?
UPDATE
Jesse and aergistal suggested that it has to do with the lack of CORS headers. I built a custom receiver app to be able to get better debugging logs, and this was indeed (the first) issue; Chromecast complains about CORS.
Using nginx, I built a local reverse proxy that adds all the CORS headers; when I give the Chromecast that proxy URL instead, the CORS error goes away.
However, using the HLS/m3u8 link it still wouldn't stream. Now it complains of the following:
[cast.player.hls.PackedAudioParser] Neither ID3 nor ADTS header was found at 0
and
[cast.player.api.Host] error: cast.player.api.ErrorCode.NETWORK/315
and
[cast.receiver.MediaManager] Load metadata error: Error
Full log:
Which causes it to still not play. Any ideas?
Fixing the CORS issue allows the DASHPlaylist.mpd variant to load (it wouldn't before), which is great, but not so great at the same time: the reverse proxy has to download the entire response first, and since the DASH URL is a single MP4 (whereas the HLS playlist works in byte ranges) the proxy must fetch the whole DASH video before it shows anything, which takes ages compared to the HLS.
So it'd still be optimal to get the HLS working due to speed, but is it just doomed not to work due to a playback issue on the Chromecast?
Solution for HLS with separate audio tracks
Based on the information from the latest log, there is a mismatch between the chosen segment format and the actual format used in the stream. The stream uses AAC in MPEG-TS, whereas the Cast SDK tries to parse it as packed audio.
A reply on the Cast issue tracker shows that the HlsSegmentFormat defaults to MPEG2_TS if the stream is multiplexed and MPEG_AUDIO_ES otherwise.
The suggested solution for the CAF receiver is to intercept the load requests and override the segment format with loadRequestData.media.hlsSegmentFormat = cast.framework.messages.HlsSegmentFormat.TS. A slightly modified example:
<html>
<head>
</head>
<body>
  <cast-media-player id="player"></cast-media-player>
  <script type="text/javascript" src="//www.gstatic.com/cast/sdk/libs/caf_receiver/v3/cast_receiver_framework.js"></script>
  <script>
    const context = cast.framework.CastReceiverContext.getInstance();
    const playerManager = context.getPlayerManager();

    // intercept the LOAD request
    playerManager.setMessageInterceptor(
        cast.framework.messages.MessageType.LOAD, loadRequestData => {
          loadRequestData.media.hlsSegmentFormat = cast.framework.messages.HlsSegmentFormat.TS;
          return loadRequestData;
        });

    context.start();
  </script>
</body>
</html>
Original source
Yet another example
Solution for CORS
The Google Cast reference gives you the solution:
If you're having problems playing streams on a Cast device, it may be an issue with CORS. Use a publicly available CORS proxy server to test your streams
The problem with the publicly available proxies is that they enforce a size limit due to bandwidth concerns, so make your own or use an open-source one. If the app runs on a mobile device, you can also run the proxy as a local server on the device.
The current streams are not protected by DRM. This will get more complicated or outright impossible later if they add CDN authentication or protect the streams with DRM.
Regarding the CORS headers, you must make sure preflight requests are supported: the client might send an OPTIONS request first to check CORS support (including the allowed methods and headers).
HTTP range requests must also be supported for your streams, meaning the appropriate headers must be authorized and exposed.
Example preflight request configuration from https://enable-cors.org:
Access-Control-Allow-Origin: *
Access-Control-Allow-Methods: GET,POST,OPTIONS
Access-Control-Allow-Headers: DNT,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Range
Access-Control-Expose-Headers: Content-Length,Content-Range
You will need to allow at least GET and OPTIONS, allow the Content-Type and Range headers, and expose Content-Length and Content-Range. Remove duplicate headers if they are already provided by the remote server.
Conclusion
The most ethical answer is to work with Reddit to ensure they set the proper CORS headers, which the Google documentation requires.
Simulating your issue
Using this tester:
https://developer.jwplayer.com/tools/stream-tester/
It simulates some of the same behavior you were seeing in your code with the Chromecast SDK. The Google video played without the PlayReady DRM setting, but the Reddit videos did not (in most browsers).
MS EDGE and jwplayer
If you select PlayReady and put anything in the PlayReady URL field (even leaving it blank), the M3U8 works.
Internet Explorer and jwplayer
Error, 232011 A manifest request was made without proper crossdomain credentials. Cannot load M3U8: Crossdomain access denied. This video cannot be played because of a technical error.
This indicates that perhaps CORS is not enabled on the reddit servers. More on that below.
Firefox and jwplayer
Nothing there seems to work with jwplayer.
Chrome and jwplayer
Doesn't work with jwplayer.
Safari and jwplayer
You indicated it worked without needing to set any of the DRM settings.
iPhone/Apple TV
I tried it and the m3u8 videos are able to use AirPlay directly to cast from my phone to the Apple TV (4K).
Simulation Summary
All the M3U8 videos can already stream from iPhone to Apple TV just fine with AirPlay. It also seems to work in Edge and in Safari, so maybe it only works because Reddit has accepted Apple streaming via AirPlay as a service, but not Chromecast. Not quite sure there, but how else could it be explained? More clarification from someone would be great.
Root Cause Analysis
Notice the google link you shared includes this header:
Access-Control-Allow-Origin
and it is set to * (i.e. all), meaning the server will share the requested resources with any domain on the Internet.
https://tools.geekflare.com/report/cors-test/https://commondatastorage.googleapis.com/gtv-videos-bucket/CastVideos/hls/DesigningForGoogleCast.m3u8
The Reddit links don't have that header, meaning CORS is not enabled to allow resource sharing, so this is not designed to work.
Description of the CORS headers
https://www.codecademy.com/articles/what-is-cors
The Access-Control-Allow-Origin header allows servers to specify how their resources are shared with external domains. When a GET request is made to access a resource on Server A, Server A will respond with a value for the Access-Control-Allow-Origin header. Many times, this value will be *, meaning that Server A will share the requested resources with any domain on the Internet. Other times, the value of this header may be set to a particular domain (or list of domains), meaning that Server A will share its resources with that specific domain (or list of domains). The Access-Control-Allow-Origin header is critical to resource security.
There are several resources indicating that CORS must be enabled from the server side:
https://stackoverflow.com/a/28360045/9105725
https://help.ooyala.com/video-platform/concepts/cors_enablement.html
Even Google says these headers need to be set:
https://developers.google.com/cast/docs/chrome_sender/advanced
CORS requirements
For adaptive media streaming, Google Cast requires the presence of CORS headers, but even simple mp4 media streams require CORS if they include Tracks. If you want to enable Tracks for any media, you must enable CORS for both your track streams and your media streams. So, if you do not have CORS headers available for your simple mp4 media on your server, and you then add a simple subtitle track, you will not be able to stream your media unless you update your server to include the appropriate CORS header. In addition, you need to allow at least the following headers: Content-Type, Accept-Encoding, and Range. Note that the last two headers are additional headers that you may not have needed previously.
Our single-page app embeds videos from YouTube for the end users' consumption. Everything works great if the user does have access to the YouTube domain and to the content of that domain's pages.
We however frequently run into users whose access to YouTube is blocked by a web filter box on their network, such as https://us.smoothwall.com/web-filtering/ . The challenge here is that the filter doesn't actually kill the request; it simply returns another page instead with an HTTP status 200. The page usually says something along the lines of "hey, sorry, this content is blocked".
One option is to try to fetch https://www.youtube.com/favicon.ico to prove that the domain is reachable. The issue is that these filters usually involve a custom SSL certificate so they can inspect the HTTP content (see: https://us.smoothwall.com/ssl-filtering-white-paper/), so I can't rely on TLS to catch the content being swapped out for me via an incorrect certificate; I will instead receive a perfectly valid favicon.ico file, except from a different site. There's also the whole CORS issue of issuing an XHR from our domain against youtube.com's domain, which means that if I want to get that favicon.ico I have to do it JSONP-style. However, even by using a plain old <img> I can't test the contents of the image because of CORS (see Get image data in JavaScript?), so I'm stuck there as well.
Are there any proven and reliable ways of dealing with this situation and testing browser-level reachability towards a specific domain?
Cheers.
In general, web proxies that want to play nicely typically annotate the HTTP conversation with additional response headers that can be detected.
So one approach to building a man-in-the-middle detector may be to inspect those response headers and compare the results from when behind the MITM, and when not.
Many public websites will display the headers for an arbitrary request; redbot is one.
So perhaps you could ask the party whose content is being modified to visit a url like: youtube favicon via redbot.
Once you gather enough samples, you could heuristically build a detector.
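To make that concrete, the detector could start as something like this sketch (the header list and function are purely illustrative, not a proven fingerprint):

```javascript
// Headers that filtering proxies commonly inject (illustrative, not exhaustive)
const PROXY_HINT_HEADERS = ['via', 'x-forwarded-for', 'x-cache', 'x-bluecoat-via'];

// Flag a response as possibly man-in-the-middled if it carries proxy-injected
// headers, or is missing headers the origin is known to always send.
function looksProxied(responseHeaders, baselineHeaders) {
  const names = Object.keys(responseHeaders).map(h => h.toLowerCase());
  const injected = PROXY_HINT_HEADERS.some(h => names.includes(h));
  const missing = Object.keys(baselineHeaders)
      .some(h => !names.includes(h.toLowerCase()));
  return injected || missing;
}
```

For example, `looksProxied({ via: '1.1 filterbox', server: 'gws' }, { server: 'gws' })` flags the response, while an unmodified response matching the baseline does not. The baseline would come from the samples gathered outside the MITM.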
Also, some CDNs (e.g., Akamai) will allow customers to visit a URL from remote proxy locations in their network. That might give better coverage, although those locations are unlikely to be behind a blocking firewall.
I'm working on a simple web browser project that uses QtWebEngine.
What I want is to log the HTTP request and HTTP response data that transit through my browser. I am only interested in HTTP POST transactions.
In Chrome's developer tools, I can do this by going to the Network tab and turning on Preserve log. What I need is:
Headers > General > Request URL
Form Data (the params of the POST)
Response (the raw response data)
Since QtWebEngine uses Chromium, I suppose that most things that chrome can do can also be done in QtWebEngine.
How can I get the above three things using QtWebEngine?
If there is no obvious way to do that, can I write an extension to log them and make QtWebEngine use this extension? I think in an extension I can log the HTTP request headers, but I have no idea how to log the response data.
Edit: I don't want to use an external debug tool (like porting the log to localhost:myport). I need to use those three pieces of data inside my browser application.
Edit2:
chrome.devtools.network.onRequestFinished.addListener
Yes, somehow I need this, but how can I call this or receive a similar event with Qt 5.7?
Many (probably the majority) of AJAX calls are made by a browser on a webpage, and that webpage has a URL. Is it possible for a webserver that's receiving the AJAX request to determine the URL of the webpage where the AJAX call was made? I assume there isn't a standard that requires this data in the headers, but perhaps some browsers include that info? Obviously this doesn't apply if the AJAX call was made from a phone app or another application without a URL.
Very generically (though unreliable), check incoming request headers for Referer. That should give you information about the source page.
Just keep in mind it can be spoofed, absent, etc. and shouldn't be considered bullet-proof (though it doesn't sound like you need it to be anyways).
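As a sketch (the handler shape is mine, not from the question), reading it server-side in Node.js would look like:

```javascript
// Return the page that issued the request, based on the (spoofable) Referer
// header. Note the header name is historically misspelled as "referer",
// and Node.js lower-cases incoming header names.
function getCallingPage(requestHeaders) {
  return requestHeaders['referer'] || null;
}

// In a plain Node.js server this would be used as:
//   http.createServer((req, res) => {
//     const page = getCallingPage(req.headers); // e.g. "https://example.com/article"
//     ...
//   });
```

The `|| null` default matters: privacy settings, Referrer-Policy, or cross-origin rules can strip the header entirely.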
Google Analytics uses Get Request for .gif image to server
http://www.google-analytics.com/__utm.gif?utmwv=4&utmn=769876874&utmhn=example.com&utmcs=ISO-8859-1&utmsr=1280x1024&utmsc=32-bit&utmul=en-us&utmje=...
We can observe that all parameters are sent in this GET request, and the requested image itself is nowhere found useful (it's just a 1px by 1px image).
Known information: if the query string is large, Google goes for a POST request instead.
Now the question is: why not always use a POST request, irrespective of whether the query string is large or not?
Sending data via a GET request leads to a security issue, since the parameters will be stored in the browser history or in web server logs.
Could someone give any supportive reasons why Google Analytics is depending on both the things?
Because GET requests are what you use for retrieving information that does not alter anything.
Please note that the use of POST has quite some downsides: the browser usually warns against reloading a resource requested via POST (to prevent double data entry), and POST requests are not cached (which is why some analytics tools misuse them), not proxied, etc.
If you want to retrieve a LOT of data using a URL (advice: rethink whether there might be a better option), then it's necessary to use POST. From Wikipedia:
There are times when HTTP GET is less suitable even for data retrieval. An example of this is when a great deal of data would need to be specified in the URL. Browsers and web servers can have limits on the length of the URL that they will handle without truncation or error. Percent-encoding of reserved characters in URLs and query strings can significantly increase their length, and while Apache HTTP Server can handle up to 4,000 characters in a URL, Microsoft Internet Explorer is limited to 2048 characters in any URL. Equally, HTTP GET should not be used where sensitive information, such as user names and passwords have to be submitted along with other data for the request to complete. In these cases, even if HTTPS is used to encrypt the message body, data in the URL will be passed in clear text and many servers, proxies, and browsers will log the full URL in a way where it might be visible to third parties. In these cases, HTTP POST should be used.
A POST request would require an AJAX call, and that wouldn't work because of the same-origin policy (http://en.wikipedia.org/wiki/Same-origin_policy). But images can easily be cross-site, so they just need to add an img tag to the DOM with the required URL and the browser will load it, sending the needed information to their servers for tracking.
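A minimal sketch of that image-beacon technique (the function name and parameters are mine, not Google's actual client code):

```javascript
// Serialize tracking parameters into the query string of a 1x1 GIF request.
// Image requests are exempt from the same-origin policy, so the beacon
// works cross-site without any CORS setup.
function buildBeaconUrl(base, params) {
  const query = Object.keys(params)
      .map(k => encodeURIComponent(k) + '=' + encodeURIComponent(params[k]))
      .join('&');
  return base + '?' + query;
}

// In the browser the beacon would then be fired with:
//   var img = new Image(1, 1);
//   img.src = buildBeaconUrl('http://www.google-analytics.com/__utm.gif',
//                            { utmwv: '4', utmhn: 'example.com' });
```

The downside is exactly the one raised in the question: everything rides in the URL, so it ends up in history and server logs.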