CORS in a Firefox OS app - AJAX

I have been developing a packaged web app (not a hosted app) for Firefox OS.
I want to access a website's XML/JSON data using XMLHttpRequest, but it gives an error saying CORS does not allow access to the data. I know I could add an 'Access-Control-Allow-Origin' header to the website, and that enabling CORS there may cause security issues.
But is there any alternative way to access the data feed via XMLHttpRequest?

First, change your manifest to have the following fields (the type field is the one people tend to forget):
"type": "privileged",
"permissions": {
    "systemXHR": {}
}
Second, move all your JavaScript code into a separate JS file, because inline scripts are not allowed in a privileged application.
Third, create the XMLHttpRequest with the mozSystem option, as raidendev said:
var xhr = new XMLHttpRequest({ mozSystem: true });
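For reference, here is a minimal sketch of a complete request using that constructor; the URL and the response handling are placeholders I have assumed, not part of the original answer:
var xhr = new XMLHttpRequest({ mozSystem: true });
xhr.open('GET', 'https://example.com/feed.json', true); // hypothetical endpoint
xhr.responseType = 'json';
xhr.onload = function () {
    if (xhr.status >= 200 && xhr.status < 300) {
        console.log('Received feed:', xhr.response);
    } else {
        console.log('Request failed with status ' + xhr.status);
    }
};
xhr.onerror = function () {
    // A network-level error here often means the systemXHR permission
    // or the privileged type is missing from the manifest.
    console.log('Network error');
};
xhr.send();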

To perform a cross-domain HTTP request from a Firefox OS app you need to set the systemXHR permission in the app's manifest:
"permissions": {
    "systemXHR": {}
}
and create the XMLHttpRequest with the mozSystem property set to true:
var xhr = new XMLHttpRequest({ mozSystem: true });
Also, for cases where XMLHttpRequest is not applicable, you can use the TCP Socket API.
var socket = navigator.mozTCPSocket.open('localhost', 80);
socket.ondata = function (event) {
    if (typeof event.data === 'string') {
        console.log('Got a string: ' + event.data);
    } else {
        console.log('Got a Uint8Array');
    }
};
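Note that the snippet above only listens for incoming data; with a raw TCP socket you also have to write the HTTP request yourself. A hedged sketch of that, with an assumed host and path that are not part of the original answer:
var socket = navigator.mozTCPSocket.open('example.com', 80); // assumed host
socket.onopen = function () {
    // With the default string binaryType, send() accepts a plain string.
    socket.send('GET /feed.xml HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n');
};
socket.ondata = function (event) {
    console.log('Chunk received: ' + event.data);
};
socket.onerror = function (event) {
    console.log('Socket error: ' + event.data);
};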

Related

Making credentialed requests with Bokeh AjaxDataSource

I have a plot set up to use an AjaxDataSource. This is working pretty well in my local development, and was working as deployed in my Kubernetes cluster. However, after I added HTTPS and Google IAP (Identity-Aware Proxy) to my plotting app, all of the requests to the data-url for my AjaxDataSource are rejected by the Google IAP service.
I have run into this issue in the past with other AJAX requests to Google IAP-protected services, and resolved it by setting {withCredentials: true} in my axios requests. However, I do not have this option while working with Bokeh's AjaxDataSource. How do I get BokehJS to pass the cookies to my service in the AjaxDataSource?
AjaxDataSource can pass headers:
ajax_source.headers = { 'x-my-custom-header': 'some value' }
There isn't any way to set cookies (those would be set on the viewer's browser... which does not seem relevant in this context). Doing that would require building a custom extension.
Thanks to bigreddot for pointing me in the right direction. I was able to build a custom extension that did what I needed. Here's the source code for that extension:
from bokeh.models import AjaxDataSource
from bokeh.util.compiler import TypeScript

TS_CODE = """
import {AjaxDataSource} from "models/sources";

export class CredentialedAjaxDataSource extends AjaxDataSource {
  prepare_request(): XMLHttpRequest {
    const xhr = new XMLHttpRequest();
    xhr.open(this.method, this.data_url, true);
    xhr.withCredentials = true;
    xhr.setRequestHeader("Content-Type", this.content_type);
    const http_headers = this.http_headers;
    for (const name in http_headers) {
      const value = http_headers[name];
      xhr.setRequestHeader(name, value);
    }
    return xhr;
  }
}
"""

class CredentialedAjaxDataSource(AjaxDataSource):
    __implementation__ = TypeScript(TS_CODE)
Bokeh extensions documentation: https://docs.bokeh.org/en/latest/docs/user_guide/extensions.html

Nuxt window is not defined on server-side rendering

I am trying to get the authorization headers from localStorage inside my middleware. Unfortunately this doesn't work on the first page load, because it is server-rendered.
How could I fix this?
const cookieName = 'feathers-jwt';
import { ApolloClient, createNetworkInterface } from 'apollo-client';
import 'isomorphic-fetch';

const API_ENDPOINT = 'http://localhost:3000/graphql';

const networkInterface = createNetworkInterface({ uri: API_ENDPOINT });

networkInterface.use([{
    applyMiddleware(req, next) {
        if (!req.options.headers) {
            req.options.headers = {}; // Create the header object if needed.
        }
        req.options.headers['authorization'] = window.localStorage.getItem(cookieName);
        next();
    }
}]);

const apolloClient = new ApolloClient({
    networkInterface,
    transportBatching: true
});

export default apolloClient;
source: http://dev.apollodata.com/core/network.html
As I understand it, when you're rendering on the server you don't have access to window and document. In apps that render both on the server and in the client, you need to build in a check to see where you are, and handle things accordingly.
You can use this snippet to detect where you are:
var canUseDOM = !!(
    typeof window !== 'undefined' &&
    window.document &&
    window.document.createElement
)
Use it to check whether you are running server-side or client-side. In your case I would do the following:
If you're server-side, you can check the cookies in the HTTP request itself;
If you're client-side, you can check your localStorage store instead.
Of course, you can always opt to server-side render your website as an anonymous, unauthorised user by default, but that would cause the front end to blink in and out of the authorised state, which would be annoying for the user.
In your case, I'd try to read the authorisation token from the cookies that are present in the HTTP request itself, as sketched below.
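A minimal sketch of that idea as a Nuxt middleware; the cookie parsing and the store mutation are illustrative assumptions, not part of the original answer:
export default function ({ req, store }) {
    let token = null;
    if (process.server) {
        // Server side: read the cookie from the incoming HTTP request.
        const cookies = (req && req.headers.cookie) || '';
        const match = cookies.match(/(?:^|;\s*)feathers-jwt=([^;]+)/);
        token = match ? decodeURIComponent(match[1]) : null;
    } else {
        // Client side: localStorage is available here.
        token = window.localStorage.getItem('feathers-jwt');
    }
    if (token) {
        store.commit('setAuthToken', token); // hypothetical mutation
    }
}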

How do I make a CORS request with fetch on my localhost?

I'm building a React/Redux app that integrates with GitHub's API. This app will require users to sign in using GitHub's OAuth. I'm trying to use the npm package isomorphic-fetch to make the request but cannot seem to get it to work.
Here is the Request:
require('isomorphic-fetch');
var types = require(__dirname + '/../constants/action_types');

module.exports.handleAuthClick = function() {
    return function(dispatch, getState) {
        var state = getState();
        return fetch('http://localhost:3000/auth')
            .then(function(res) {
                if (res.status >= 200 && res.status < 300) {
                    // set cookie
                    // return username and token
                    return {
                        type: types.HANDLE_AUTH_CLICK,
                        data: res.json()
                    };
                }
                throw 'request failed';
            })
            .then(function(jsonRes) {
                dispatch(receiveAssignments(jsonRes));
            })
            .catch(function(err) {
                console.log('unable to fetch assignments');
            });
    };
};
Here is my Router
authRouter.get('/', function(req, res) {
    res.redirect('https://github.com/login/oauth/authorize/?client_id=' + clientId);
});
And here is the Error I keep getting
Fetch API cannot load https://github.com/login/oauth/authorize/?client_id=?myclientID
No 'Access-Control-Allow-Origin' header is present on the requested resource.
Origin 'http://localhost:3000' is therefore not allowed access. If an opaque
response serves your needs, set the request's mode to 'no-cors' to fetch the
resource with CORS disabled.
Looks like this is a security restriction that prevents a web page from making AJAX requests to a different domain. I faced the same problem, and the steps below fixed it.
First, enable CORS in the Web API app from the Package Manager Console:
PM> Install-Package Microsoft.AspNet.WebApi.Cors
Inside the App_Start/WebApiConfig.cs file, in the Register(HttpConfiguration config) method, add:
config.EnableCors();
Finally, add the [EnableCors] attribute to the controller class:
namespace MyProject.Controllers
{
    [EnableCors(origins: "http://example.com", headers: "*", methods: "*")]
    public class MyController : ApiController
    {
        // some code
    }
}
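Once the server sends those CORS headers, the request from the localhost origin should go through. As a rough client-side sketch (the endpoint URL is an assumption, not from the original answer):
fetch('http://example.com/api/values', { mode: 'cors' }) // hypothetical CORS-enabled endpoint
    .then(function (res) {
        if (res.status >= 200 && res.status < 300) {
            return res.json();
        }
        throw new Error('request failed with status ' + res.status);
    })
    .then(function (data) {
        console.log(data);
    })
    .catch(function (err) {
        console.log('CORS request failed', err);
    });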

Receiving binary data using request module in Firefox Add-on SDK

I am using the Add-on builder and I need to receive binary data (image). I would like to do this using the request module but as you can see from the documentation:
https://addons.mozilla.org/en-US/developers/docs/sdk/latest/packages/addon-kit/docs/request.html
There are only text and json properties and raw is absent.
How should I receive binary data in the add-on script?
You cannot do this using the request module; you will have to use a regular XMLHttpRequest via the chrome authority. Something like this should work:
var {Cc, Ci} = require("chrome");

var request = Cc["@mozilla.org/xmlextras/xmlhttprequest;1"]
                .createInstance(Ci.nsIJSXMLHttpRequest);
request.open("GET", "...");
request.responseType = "arraybuffer"; // ask for an ArrayBuffer rather than text
request.onload = function()
{
    onUnload.unload();
    var arrayBuffer = request.response;
    if (arrayBuffer)
    {
        var byteArray = new Uint8Array(arrayBuffer);
        ...
    }
};
request.onerror = function()
{
    onUnload.unload();
};
request.send(null);

var onUnload = {
    unload: function()
    {
        // Make sure to abort the request if the extension is disabled
        try
        {
            request.abort();
        }
        catch (e) {}
    }
};
require("unload").ensure(onUnload);
The mechanism to ensure that the request is aborted if your extension is suddenly disabled is rather awkward, that's the main reason the request module exists rather than simply giving you XMLHttpRequest. Note that it is important to call onUnload.unload() once the request finishes, otherwise the Add-on SDK will keep it in the list of methods to be called on unload (a memory leak). See documentation of unload module.
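As a quick illustration of working with the resulting byteArray, you could, for example, check the magic bytes to see which image format came back; this is a hedged sketch, not part of the original answer:
// Inside the onload handler above, once byteArray has been created:
function looksLikePNG(bytes)
{
    // PNG files start with the 8-byte signature 89 50 4E 47 0D 0A 1A 0A.
    var signature = [0x89, 0x50, 0x4E, 0x47, 0x0D, 0x0A, 0x1A, 0x0A];
    for (var i = 0; i < signature.length; i++)
        if (bytes[i] !== signature[i])
            return false;
    return true;
}

console.log(looksLikePNG(byteArray) ? "Got a PNG image" : "Got some other binary data");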

XMPP and Same origin policy problem

I'm building a chat application using OpenFire server and JSJaC client library.
The page loads from http://staging.mysite.com and XMPP runs on http://xmpp.mysite.com. As you can see, they both share the same parent domain, so I use the following code on page load:
function OnPageLoad() {
    document.domain = "mysite.com";
    DoLogin();
}
Anyway, it throws an exception saying that I am violating the security policy. Why doesn't document.domain work? Should it work, or is it there just for show? And what can be done in this specific situation?
I don't have access to the XMLHttpRequest object inside the library and do not control it.
Anyway, I had to dig a little deeper into the JSJaC library and inject some code. But first I tried a workaround: I added the following headers to the response:
Access-Control-Allow-Methods: GET, POST, OPTIONS
Access-Control-Allow-Credentials: true
Access-Control-Allow-Origin: *
Access-Control-Allow-Headers: Content-Type, *
Generally this allowed cross-domain requests with a native XHR. However, it only worked in modern browsers; for instance, it didn't work in IE8, and every version of Opera simply rejected these headers.
Then I went with a Flash-based solution: I used flXHR and modified jsjac.uncompressed.js like this:
XmlHttp.create = function () {
    // try {
    //     if (window.XMLHttpRequest) {
    //         var req = new XMLHttpRequest();
    //
    //         // some versions of Moz do not support the readyState property
    //         // and the onreadystate event so we patch it!
    //         if (req.readyState == null) {
    //             req.readyState = 1;
    //             req.addEventListener("load", function () {
    //                 req.readyState = 4;
    //                 if (typeof req.onreadystatechange == "function")
    //                     req.onreadystatechange();
    //             }, false);
    //         }
    //
    //         return req;
    //     }
    //     if (window.ActiveXObject) {
    //         return new ActiveXObject(XmlHttp.getPrefix() + ".XmlHttp");
    //     }
    // }
    // catch (ex) {}
    // // fell through
    // throw new Error("Your browser does not support XmlHttp objects");

    var AsyncClient = new flensed.flXHR({
        "autoUpdatePlayer": true,
        "instanceId": "myproxy" + _xhrpf.toString(),
        // This is important because the library uses the response xml of the object to manipulate the data
        "xmlResponseText": true,
        "onreadystatechange": function () { }
    });
    // counter for giving a unique id for the flash xhr object.
    _xhrpf++;
    return AsyncClient;
};

var _xhrpf = 1;
Then I just added a crossdomain.xml at the root of the target domain. Now it works perfectly if the browser has the Flash plugin. Next I want to add a detection mechanism: if there is no Flash plugin, fall back to a native XHR and hope that the browser honours the cross-domain headers; a rough sketch of that is below.
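A hedged sketch of that Flash detection and fallback; the mimeTypes/ActiveX checks are my assumptions layered on top of the code above, not something the original post implemented:
function hasFlashPlugin() {
    // navigator.mimeTypes covers most browsers; ActiveXObject covers old IE.
    if (navigator.mimeTypes && navigator.mimeTypes["application/x-shockwave-flash"]) {
        return true;
    }
    try {
        return !!new ActiveXObject("ShockwaveFlash.ShockwaveFlash");
    } catch (e) {
        return false;
    }
}

XmlHttp.create = function () {
    if (!hasFlashPlugin()) {
        // Fall back to a native XHR and rely on the CORS headers set on the server.
        return new XMLHttpRequest();
    }
    var AsyncClient = new flensed.flXHR({
        "autoUpdatePlayer": true,
        "instanceId": "myproxy" + _xhrpf.toString(),
        "xmlResponseText": true,
        "onreadystatechange": function () { }
    });
    _xhrpf++;
    return AsyncClient;
};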
