How to Configure WebSocket Server to Accept Secure Connection Request - laravel-5

Before applying SSL (which I got from Cloudflare), my website was loaded over HTTP and my socket connection was made over ws, and it was working fine; the connection was made successfully.
conn = new WebSocket('ws://myDomain:8090');
But after applying SSL, when my website loads over HTTPS, I use wss (otherwise it gives an error):
conn = new WebSocket('wss://myDomain:8090');
Now it gives me the following error:
WebSocket connection to 'wss://myDomain:8090/' failed: Error in connection establishment: net::ERR_CONNECTION_TIMED_OUT
The WebSocket server is started on port 8090; I also changed the port to 9991, but to no avail.
Here is the code for the WebSocket server:
public function handle()
{
    $server = IoServer::factory(
        new HttpServer(
            new WsServer(
                new WebSocketController()
            )
        ),
        8090
    );

    $server->run();
}
I have not configured Apache to run a WebSocket server that accepts secure connection requests. Maybe that is why I am getting the error: it means I am sending a secure connection request to an insecure WebSocket server. If I am right, can you tell me how to configure my WebSocket server so that it can accept secure connection requests?
Again, I am using the SSL from Cloudflare: I tell them my domain and they provide nameservers to replace my existing nameservers.
Please give a clear solution to this. I am not using nginx; I am using Apache on LAMPP.

Someone solved my problem, so I am posting the solution here to help others. I was making two mistakes:
1. I was using the SSL from Cloudflare, which caused some issues, so I bought a paid SSL certificate.
2. I had not configured my WebSocket server for wss.
So here is the code to configure your WebSocket server for wss in Laravel with Ratchet:
public function handle()
{
    // Requires: React\EventLoop\Factory, React\Socket\Server, React\Socket\SecureServer,
    // Ratchet\Server\IoServer, Ratchet\Http\HttpServer, Ratchet\WebSocket\WsServer
    $loop = Factory::create();

    $webSock = new SecureServer(
        new Server('0.0.0.0:8090', $loop),
        $loop,
        array(
            'local_cert'        => '', // path to your cert
            'local_pk'          => '', // path to your server private key
            'allow_self_signed' => TRUE, // allow self-signed certs (should be FALSE in production)
            'verify_peer'       => FALSE
        )
    );

    // Ratchet magic
    $webServer = new IoServer(
        new HttpServer(
            new WsServer(
                new WebSocketController()
            )
        ),
        $webSock
    );

    $loop->run();
}

Note that Cloudflare doesn't proxy port 8090; here is the list of the ports that are supported by Cloudflare.
Also try http://sitemeer.com/#https://yourDomain:8090 to see whether your server and domain are serving SSL.
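On the client side, only the URL changes once the secure server is up. A small sketch that picks ws or wss based on how the page is loaded; 8443 is used here purely as an example of a port on Cloudflare's HTTPS list (an assumption, only relevant if the domain stays proxied through Cloudflare):

const scheme = window.location.protocol === 'https:' ? 'wss' : 'ws';
const port = scheme === 'wss' ? 8443 : 8090; // 8443 is proxied by Cloudflare for HTTPS; 8090 is not
const conn = new WebSocket(`${scheme}://myDomain:${port}`);
conn.onopen = () => console.log(`connected over ${scheme}`);
conn.onerror = (e) => console.error('websocket error', e);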

Related

HTTP client in Laravel only supports port 80?

I am making requests to a Python FastAPI service on port 8000 using Laravel's HTTP client, which is a wrapper around Guzzle's HTTP client. Any time I send a request, it fails because it is sent to port 80. I tried several times to add options to the request, but it still fails. Using the Guzzle HTTP client directly, everything works fine. I am just wondering why the HTTP client only works on port 80.
Below is the code we tried:
$response = Http::withOptions([
    'debug' => true,
    //'proxy' => [
    //    'https' => 'http://127.0.0.1:8088'
    //] // Use this proxy with "HTTP"
])->get('http://127.0.0.1:8088/getImage/1/200');

print($response->body());
return response()->download($response);
We set the proxy to force it to use the port, but that failed as well.

Modify server response using node-http-proxy

I am a beginner with Node.js and this is my first post here, so apologies if this is a stupid question.
The problem I need to solve is to change information in an HTTPS server response before it hits the client. The reason is that my client (I'm developing a Garmin sports watch application) cannot read the server response, as the body data format differs from the content type defined in the response header. I have no means of changing the server code, and on the client I am stuck as well; Garmin has acknowledged it should be fixed but has not made it a priority.
I guess that leaves me with having to implement my own HTTP proxy that I can connect in between the client and the server to change either the header or the data format in the server response.
I have browsed around a bit and also played around with node-http-proxy, but I am not sure what the best approach is. Basically, the SERVER expects HTTPS comms, and preferably I also want secured comms between the PROXY and the CLIENT (since they will exchange user credentials).
I guess my first question is whether this is a possible use case:
CLIENT -> (HTTPS_POST_req) -> PROXY -> (HTTPS_POST_req) -> SERVER
SERVER -> (HTTPS_resp) -> PROXY -> (modified_HTTPS_resp) -> CLIENT
I took a quick shot at it using the node-http-proxy sample code but got stuck. What I did was create an HTTP server on localhost and a proxy server that would proxy the request to the SERVER.
const http = require('http');
const httpProxy = require('http-proxy');

const proxy = httpProxy.createProxyServer({});

http.createServer(function (req, res) {
    console.log('incoming request');
    console.log('URL:' + req.url);
    console.log('Method:' + req.method);
    console.log('Headers:' + JSON.stringify(req.headers));
    proxy.web(req, res, { target: 'SERVER ADDRESS HERE' });
}).listen(8888, () => {
    console.log("Waiting for requests...");
});
From the console I can see the incoming request and can also read the URL, headers, and method, but then after a short while I get this:
Error: connect ECONNREFUSED 127.0.0.1:80
    at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1141:16) {
  errno: 'ECONNREFUSED',
  code: 'ECONNREFUSED',
  syscall: 'connect',
  address: '127.0.0.1',
  port: 80
}
I have no idea why something wants to connect to localhost on port 80.
Perhaps this approach will never work for HTTPS/SSL traffic? For instance, do I have to set up my own server certificate to do this, or can node-http-proxy handle it "under the hood"? I need a bit of help, please.
//Fredrik
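For what it's worth, a hedged sketch of the shape this usually takes with node-http-proxy: an HTTPS server terminates TLS towards the watch with your own certificate, the proxy forwards to the HTTPS upstream, and the proxyRes event rewrites the offending header before it reaches the client. The upstream URL, certificate paths, and the exact Content-Type below are placeholders, not values from the original post:

const fs = require('fs');
const https = require('https');
const httpProxy = require('http-proxy');

const proxy = httpProxy.createProxyServer({
    target: 'https://example-upstream.com', // hypothetical SERVER address
    changeOrigin: true,                     // send the upstream hostname in the Host header
    secure: false                           // set to true to verify the upstream certificate
});

// Mutate the upstream response headers before node-http-proxy copies them to the client.
proxy.on('proxyRes', (proxyRes) => {
    proxyRes.headers['content-type'] = 'application/json; charset=utf-8'; // assumed target format
});

// Terminate TLS towards the client with your own certificate (paths are placeholders).
https.createServer(
    {
        key: fs.readFileSync('/path/to/proxy-key.pem'),
        cert: fs.readFileSync('/path/to/proxy-cert.pem')
    },
    (req, res) => proxy.web(req, res)
).listen(8888, () => console.log('HTTPS proxy listening on 8888'));

If the body itself needs rewriting rather than just a header, node-http-proxy's selfHandleResponse option is the usual route, since by default the upstream body is piped through untouched.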

Connect Laravel echo from Angular project

I have a laravel-echo-server with Redis running locally.
I created a test API endpoint that emits a broadcastable event.
On http://localhost:8000/api/web-socket-test I see the response in the echo server CLI.
I set up the laravel-echo auth key, and I can get the status info from the server API at
http://localhost:6001/apps/APP_ID/status?auth_key=b73a61d0.
The problem is connecting to the echo server from Angular via the ws: protocol.
My connection code is:
import { webSocket, WebSocketSubject } from 'rxjs/webSocket';

export class MyComponent implements OnInit, OnDestroy {
    myWebSocket: WebSocketSubject<any> = webSocket('ws://127.0.0.1:6001');

    ngOnInit() {
        this.myWebSocket.subscribe(
            msg => console.log('message received: ' + msg),
            err => console.log(err),
            () => console.log('complete')
        );
    }
}
And finally I get this error: WebSocket connection to 'ws://127.0.0.1:6001/' failed: Connection closed before receiving a handshake response.
How can I establish the ws connection?
I believe you want to try connecting using the socket.io client libraries instead of using raw rxjs WebSockets.
Although it's not immediately clear from the laravel-echo-server docs, the project title states it's a 'Socket.io server for Laravel Echo', so I'm assuming you should use the socket.io client libraries for connections.
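Following up on that, a minimal sketch of connecting with laravel-echo plus the socket.io v2 client instead of raw rxjs WebSockets. The packages (laravel-echo, socket.io-client) and the channel/event names are assumptions; the host matches the echo server address from the question:

import Echo from 'laravel-echo';
import * as io from 'socket.io-client'; // v2.x client, which laravel-echo-server expects

(window as any).io = io; // laravel-echo's socket.io connector looks for a global io by default

const echo = new Echo({
    broadcaster: 'socket.io',
    host: 'http://127.0.0.1:6001', // same address as the status endpoint above
});

// Channel and event names here are hypothetical placeholders.
echo.channel('test-channel').listen('WebSocketTestEvent', (payload: any) => {
    console.log('message received:', payload);
});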

Error connecting to socket channels when running an echo server through a subdomain

A lot of companies or workplaces have security protocols in place where only the common ports are open, e.g. 80 (HTTP) and 443 (HTTPS). If we were to have our web app try connecting to port 6001 (the laravel-echo-server default), some of your users will definitely encounter problems, as the port is closed... Source
After creating a subdomain in nginx, I launched the sockets through it and it worked, except that now I can't connect to any channel: Client can not be authenticated, got HTTP status 419, "message": "CSRF token mismatch.".
As an experiment I disabled CSRF protection, and the error changed to: Client can not be authenticated, got HTTP status 403, "message": "".
app.js
/**
 * Port configured on proxy server (default: 6001).
 * The default port for the HTTPS protocol is 443.
 *
 * @type {string}
 */
const ECHO_DOMAIN = process.env.NODE_ENV === 'production'
    ? 'ws.site.com'
    : 'wsdev.site.com';

window.io = require('socket.io-client');

if (typeof io !== 'undefined') {
    window.Echo = new Echo({
        broadcaster: 'socket.io',
        host: `https://${ECHO_DOMAIN}`,
        reconnectionAttempts: 120
    });
    ...
[Screenshot: successful connection]
How to fix it?
The issue was resolved by adding the variable SESSION_DOMAIN=.site.com in .env, which changes the 'domain' value in config/session.php.
Then you need to clear the cookies in the browser and the cache on the server. You may also need to run composer dump-autoload. Primary source
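The fix above is the SESSION_DOMAIN change; separately, since the 419 came from a CSRF mismatch on the /broadcasting/auth call, here is a hedged sketch of explicitly sending the CSRF token through Echo's auth option. The csrf-token meta tag is the stock Laravel layout and an assumption here, not something shown in the original config:

window.Echo = new Echo({
    broadcaster: 'socket.io',
    host: `https://${ECHO_DOMAIN}`,
    reconnectionAttempts: 120,
    auth: {
        headers: {
            // Token from the standard <meta name="csrf-token" content="..."> tag
            'X-CSRF-TOKEN': document.querySelector('meta[name="csrf-token"]')?.getAttribute('content') ?? ''
        }
    }
});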

akka-http extractScheme returns 'http' on an HTTPS request

I run an akka-http server secured following the manual at http://doc.akka.io/docs/akka-http/current/scala/http/server-side-https-support.html.
Http().bindAndHandle(
    extractScheme(s => complete(s)),
    interface, port,
    connectionContext = https)
A browser shows that the HTTPS connection is fine.
The problem is that the extractScheme directive returns 'http', but 'https' is expected.
Any suggestions?
