WebSocket issue preventing Bokeh app deployed on Heroku from loading in a website

I have a Bokeh app deployed on Heroku. I want to embed it in a website, but am failing to do so.
The app is here:
https://ckgsb-final.herokuapp.com/cn_ckgsb
And this is the script generated for embedding in the website:
from bokeh.embed import server_document
script = server_document("https://ckgsb-final.herokuapp.com/cn_ckgsb")
print(script)
<script id="1014">
(function() {
const xhr = new XMLHttpRequest()
xhr.responseType = 'blob';
xhr.open('GET', "https://ckgsb-final.herokuapp.com/cn_ckgsb/autoload.js?bokeh-autoload-element=1014&bokeh-app-path=/cn_ckgsb&bokeh-absolute-url=https://ckgsb-final.herokuapp.com/cn_ckgsb", true);
xhr.onload = function (event) {
const script = document.createElement('script');
const src = URL.createObjectURL(event.target.response);
script.src = src;
document.body.appendChild(script);
};
xhr.send();
})();
</script>
The Procfile for the Heroku app is:
web: bokeh serve --port=$PORT --allow-websocket-origin=ckgsb-final.herokuapp.com --address=0.0.0.0 --use-xheaders cn_ckgsb.py
I know the problem is with the websocket. I've tried various combinations of the app URL in both the Procfile and the script code, but haven't managed to fix it.
Thanks.

You need to configure an allowed websocket origin for the URL of the embedding site as well. When users navigate to mysite.org and the page there tries to embed the Bokeh app, the HTTP origin received by the Bokeh server will be mysite.org. If the Bokeh server has not been configured to allow that origin, the request will be rejected.
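For example, if the embedding page lives at mysite.org (a placeholder here; substitute the real domain), the Procfile would pass the flag once per allowed origin:
web: bokeh serve --port=$PORT --allow-websocket-origin=ckgsb-final.herokuapp.com --allow-websocket-origin=mysite.org --address=0.0.0.0 --use-xheaders cn_ckgsb.py
bokeh serve accepts --allow-websocket-origin multiple times, so both the Heroku host and the embedding site can be listed.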

Related

Making credentialed requests with Bokeh AjaxDataSource

I have a plot set up to use an AjaxDataSource. This is working pretty well in my local development, and was working as deployed in my Kubernetes cluster. However, after I added HTTPS and Google IAP (Identity-Aware Proxy) to my plotting app, all of the requests to the data-url for my AjaxDataSource are rejected by the Google IAP service.
I have run into this issue in the past with other AJAX requests to Google IAP-protected services, and resolved it by setting {withCredentials: true} in my axios requests. However, I do not have this option while working with Bokeh's AjaxDataSource. How do I get BokehJS to pass the cookies to my service in the AjaxDataSource?
AjaxDataSource can pass headers:
ajax_source.headers = { 'x-my-custom-header': 'some value' }
There's no way to set cookies (those would be set on the viewer's browser, which does not seem relevant in this context). Doing that would require building a custom extension.
Thanks to bigreddot for pointing me in the right direction. I was able to build a custom extension that did what I needed. Here's the source code for that extension:
from bokeh.models import AjaxDataSource
from bokeh.util.compiler import TypeScript
TS_CODE = """
import {AjaxDataSource} from "models/sources";
export class CredentialedAjaxDataSource extends AjaxDataSource {
prepare_request(): XMLHttpRequest {
const xhr = new XMLHttpRequest();
xhr.open(this.method, this.data_url, true);
xhr.withCredentials = true;
xhr.setRequestHeader("Content-Type", this.content_type);
const http_headers = this.http_headers;
for (const name in http_headers) {
const value = http_headers[name];
xhr.setRequestHeader(name, value)
}
return xhr;
}
}
"""
class CredentialedAjaxDataSource(AjaxDataSource):
__implementation__ = TypeScript(TS_CODE)
Bokeh extensions documentation: https://docs.bokeh.org/en/latest/docs/user_guide/extensions.html
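For reference, a minimal usage sketch (the data_url and field names are illustrative placeholders, not from the original answer); the custom model drops in anywhere a plain AjaxDataSource would:
from bokeh.plotting import figure, show

# Hypothetical endpoint; cookies are sent along because prepare_request
# sets withCredentials = true in the extension above
source = CredentialedAjaxDataSource(data_url="https://example.com/data",
                                    polling_interval=1000, mode="replace")
fig = figure()
fig.line(x="x", y="y", source=source)
show(fig)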

Setting Proxy for Electron App

I am using some npm modules that make GET requests behind the scenes to pull data from websites. There is no option or setting to set a proxy for those requests, so I want to know how to set a proxy for the entire Electron app so that all requests go through it.
Using request:
Use environment variables:
process.env.HTTP_PROXY = 'http://192.168.0.36:3128'
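If the requests go to https URLs, set HTTPS_PROXY as well; request also honors NO_PROXY for exclusions (the address below is the same placeholder proxy):
process.env.HTTPS_PROXY = 'http://192.168.0.36:3128'
process.env.NO_PROXY = 'localhost,127.0.0.1'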
Using Axios:
Install this package:
npm install https-proxy-agent
Then:
const axios = require('axios');
const HttpsProxyAgent = require('https-proxy-agent');

let config = {};
config.httpsAgent = new HttpsProxyAgent('http://192.168.0.36:3128');
config.url = 'https://example.com';
config.method = 'GET';

axios(config).then(...).catch(...)
Electron app
For the whole app (like IMG SRC in HTML), you can use command-line switches supported by Electron:
const { app } = require('electron')

app.commandLine.appendSwitch('proxy-server', '172.17.0.2:3128')

app.on('ready', () => {
  // Your code here
})
See documentation
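Alternatively, a sketch using the session API (same placeholder proxy address; in recent Electron versions setProxy returns a Promise):
const { app, session } = require('electron')

app.on('ready', () => {
  // Route all requests from the default session through the proxy
  session.defaultSession.setProxy({ proxyRules: 'http://172.17.0.2:3128' })
    .then(() => {
      // Your code here
    })
})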

feathersjs -> socketio https request not working

I have an application made in feathersjs which I would like to run with https. I have gotten that working. I did that by changing the 'index.js' file to look like this:
const fs = require('fs');
const https = require('https');
const app = require('./app');
const port = app.get('port');
const host = app.get('host');
//const server = app.listen(port);
const server = https.createServer({
  key: fs.readFileSync('./certs/aex007.key'),
  cert: fs.readFileSync('./certs/aex007.crt')
}, app).listen(port, function(){
  console.log("Mfp Backend started: https://" + host + ":" + port);
});
As soon as I go to e.g. 'https://127.0.0.1/a_service_name' in Postman, I get a result after accepting the certificate. When I go to the address in a browser it also gives a result; the certificate indicator is red because it's self-signed.
So my problem is the following. When I go to 'http://127.0.0.1' in a browser, instead of the 'index.html' file I get nothing of my 'socket' information, only a blank page. I get the following error in the console:
info: (404) Route: /socket.io/?EIO=3&transport=polling&t=LwydYAw -
Page not found
The 'index.html' file I'm using currently contains this:
<script type="text/javascript" src="https://cdnjs.cloudflare.com/ajax/libs/socket.io/2.0.3/socket.io.js"></script>
<script type="text/javascript" src="//cdn.rawgit.com/feathersjs/feathers-client/v1.1.0/dist/feathers.js"></script>
<script type="text/javascript">
var socket = io('https://127.0.0.1:3001');
var client = feathers()
.configure(feathers.hooks())
.configure(feathers.socketio(socket));
var todoService = client.service('/some_service');
todoService.on('created', function(todo) {
alert('created');
console.log('Someone created a todo', todo);
});
</script>
Can someone explain to me what to do to get the alert message?
Edit 2017/09/27
I found on the internet that socket.io is configured like this:
var https = require('https'),
    fs = require('fs');

var options = {
  key: fs.readFileSync('ssl/server.key'),
  cert: fs.readFileSync('ssl/server.crt'),
  ca: fs.readFileSync('ssl/ca.crt')
};

var app = https.createServer(options);
io = require('socket.io').listen(app); // socket.io server listens to https connections
app.listen(8895, "0.0.0.0");
However, the require of feathers-socketio is in app.js, not index.js. I wonder if I can move that?
As daffl pointed out on the Feathers Slack channel, check out the documentation, which requires feathers-socketio explicitly before calling configure on the app, in addition to the https portion of the docs. Putting those two together, I would do something like this (untested):
const feathers = require('feathers');
const socketio = require('feathers-socketio');
const fs = require('fs');
const https = require('https');
const app = feathers();
app.configure(socketio());
const opts = {
  key: fs.readFileSync('privatekey.pem'),
  cert: fs.readFileSync('certificate.pem')
};
const server = https.createServer(opts, app).listen(443);
// magic sauce! Socket w/ ssl
app.setup(server);
The structure of your app.js and index.js is totally up to you. You can do all of the above in a single file as shown, or split out the https/fs requires into index.js and the app configuration into app.js. I would recommend this approach because it lets you change the (usually smaller) index.js file if you ever decide to use a reverse proxy like nginx to handle ssl instead of node.
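If you do go the nginx route, note that socket.io needs the websocket upgrade headers forwarded. A minimal sketch of such a server block (paths, the domain, and the backend port are placeholders):
server {
    listen 443 ssl;
    server_name example.com;

    ssl_certificate     /path/to/certificate.pem;
    ssl_certificate_key /path/to/privatekey.pem;

    location / {
        # Forward everything, including the websocket upgrade, to the node app
        proxy_pass http://127.0.0.1:3030;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
    }
}
With TLS terminated in nginx, the node side can go back to a plain app.listen(port).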

How to control a CasperJS automation script from a remote jquery client using socket.io

I have an automation script in CasperJS controlling a PhantomJS headless browser that logs into a site, enters data over multiple pages / form.
From the same physical server, I have PHP/MySQL serving up a CRM client website. On the CRM site, I want to have the ability to:
Trigger the remote CasperJS script to go browse a remote site and log in and fill out forms
Read the output stream (i.e. "Page 1 complete, page 2 complete", etc.)
Display the status updates to the client user as the CasperJS script is executing
I am thinking that socket.io is the ticket here. But am I going about this all wrong? I am trying to avoid having a Selenium server running. I checked this answer on SO, but I am not looking for screenshots; I'm looking for the console output from CasperJS to be displayed in the client website.
I had a similar task once and concocted a solution using a local Express.js server with Socket.io.
You would launch this server with node.js and then pass tasks to it from PHP by making POST requests to http://127.0.0.1:9000 (I used the excellent Requests library).
Here's a simplified version of my script:
var fs = require("fs");
var express = require("express");

var app = express();
var server = require("http").Server(app);
var io = require("socket.io")(server);
var iosocket;

// Express middleware to get variables from POST request
var bodyParser = require('body-parser');
app.use(bodyParser.urlencoded({ extended: true }));

// Create websocket connection
io.on("connection", function(socket){
  console.log('io.js connection');
  iosocket = socket;
});

// Receive task from external POST request
app.post("/scrape", function(req, res){
  res.send("Request accepted");

  // Url to parse
  var url = req.body.url;

  // Variable to collect data from scraper
  var data = [];

  // Launch scraping script
  var spawn = require('child_process').spawn,
      child = spawn('/path/to/casperjs', ['/path/to/scrape/script.js', url]);
  console.log("Spawned parser");

  // Receive data from script (the callback parameter is named "chunk"
  // so it does not shadow the outer "data" array)
  child.stdout.on('data', function (chunk) {
    var message = chunk.toString();
    data.push(message);
    // Send data to the web client
    iosocket.emit("message", message);
  });

  // On error
  child.stderr.on('data', function (chunk) {
    console.log('stderr: ' + chunk.toString());
  });

  // On scraper exit
  child.on('close', function (code) {
    console.log("Scraper exited with code: " + code);
    //
    // Put data into a file or a database, for example
    //
    fs.writeFileSync("path/to/file/results_" + (new Date()).getTime() + ".json", JSON.stringify(data));
  });
});

// Bind app to port # localhost
server.listen(9000, "127.0.0.1");
The solution with a CasperJS/PhantomJS server is interesting; however, people have pointed out that it leaks memory, which probably won't happen if you run short-lived CasperJS scripts.
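On the CRM page itself, displaying the status updates then only takes a small client-side listener (a sketch; the host/port must be wherever the Express server is reachable from the browser, and the element id is made up):
<script src="http://127.0.0.1:9000/socket.io/socket.io.js"></script>
<script>
  var socket = io("http://127.0.0.1:9000");
  // Each line of CasperJS console output arrives as a "message" event
  socket.on("message", function (msg) {
    document.getElementById("casper-status").innerHTML += msg + "<br>";
  });
</script>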

Cross-domain connections via Socket.io on Nodejitsu

I am having a cross domain problem connecting from localhost to a remote server at Nodejitsu via Socket.io. I get an error "...header contains multiple values 'http://evil.com/, *', but only one is allowed". More details below:
I have an Express/Mongoose/Socket.io app running at Nodejitsu serving as a REST API; it serves no HTML files.
Locally I have an AngularJS + RequireJS app (running at http://localhost:8000) trying to connect to the remote API, and I can't get access. While I can test the API methods with Postman and am able to read the socket.io frontend script from the Angular RequireJS app, the connection is not granted access and causes the server to crash in a loop.
In my NodeJS/Express app on Nodejitsu, I have set the following:
var express = require('express');
var app = express();
var bodyParser = require('body-parser');
var morgan = require('morgan');
var port = process.env.PORT || 80; // set our port the same as Nodejitsu

// ATTACHING SOCKET.IO
var server = require('http').createServer(app);
var io = require('socket.io')(server);
app.set('socketio', io); // socket instance of the app
app.set('server', server);

// CONFIGURE BODY PARSER
app.use(bodyParser.urlencoded({ extended: true }));
app.use(bodyParser.json());

// CORS SETTING
app.use(function(req, res, next) {
  res.header("Access-Control-Allow-Origin", "http://localhost:8000");
  res.header("Access-Control-Allow-Methods", "GET,PUT,POST,DELETE,OPTIONS");
  res.header("Access-Control-Allow-Headers", "Origin, X-Requested-With, Content-Type, Accept");
  res.header("Access-Control-Allow-Credentials", "false");
  next();
});

// START SERVER
app.get('server').listen(port);
---------
// package.json
"dependencies": {
  "express": "4.11.1",
  "morgan": "1.5.1",
  "mongoose": "3.8.21",
  "body-parser": "1.10.2",
  "grunt-release": "0.10.0",
  "socket.io": "1.3.2"
},
In the Angular app on localhost:8000, I checked that the header is not duplicated, as the attached png shows.
// main.js
"use strict";
require.config({
  paths: {
    ...
    'socketio': 'http://<MYAPP>.jit.su/socket.io/socket.io',
    ...

// SocketFactory.js
var socket = io.connect('http://<MYAPP>.jit.su:80/api/boards');
However I get this error message, even when I set the Origin to http://localhost:8000:
XMLHttpRequest cannot load http://<MYAPP>.jit.su/socket.io/?EIO=3&transport=polling&t=1423052553506-7.
The 'Access-Control-Allow-Origin' header contains multiple values
'http://evil.com/, *', but only one is allowed.
Origin 'http://localhost:8000' is therefore not allowed access.
I got the same error, but it wasn't even a CORS issue in the end. When using socket.io with Express, listen on the server, not the app, as stated in socket.io's docs. (app.listen() creates and returns its own http.Server, so the server that socket.io was attached to never starts listening.)
var app = require('express')();
var server = require('http').Server(app);
var io = require('socket.io')(server);
server.listen(80);
(...)
