According to the Socket.io documentation:
A standalone build of socket.io-client is exposed automatically by the socket.io server as /socket.io/socket.io.js. Alternatively you can serve the file socket.io-client.js found at the root of this repository.
<script src="/socket.io/socket.io.js"></script>
<script>
var socket = io('http://localhost');
socket.on('connect', function(){
socket.on('event', function(data){});
socket.on('disconnect', function(){});
});
</script>
However, I would like to serve the socket.io client from a separate CDN (it's cheaper, faster, and reduces load on my server).
How can I do this? Do I have to disable the socket.io default?
As long as the version of the client you are using is the same as what you use on your server, there should not be any problem serving it from a CDN.
That said, the client is tiny (24 kB), and if caching is set up properly, it should have very little impact on your server.
Update: as mentioned by @maxwell2022, socket.io has its own CDN starting with 1.0.0, so you can use:
<script src="https://cdn.socket.io/socket.io-1.0.0.js"></script>
Here are CDN links to the socket.io client script files on cdnjs:
0.9.16
//cdnjs.cloudflare.com/ajax/libs/socket.io/0.9.16/socket.io.min.js
0.9.6
//cdnjs.cloudflare.com/ajax/libs/socket.io/0.9.6/socket.io.min.js
...and so on.
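For example, to load the 0.9.16 build from cdnjs (the protocol-relative URL picks up whichever scheme your page uses):
<script src="//cdnjs.cloudflare.com/ajax/libs/socket.io/0.9.16/socket.io.min.js"></script>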
According to the wiki, if you choose to serve the client yourself, you can clone the socket.io-client repository and look at the dist/ subdirectory. There are 4 files to serve (this may change):
WebSocketMain.swf
WebSocketMainInsecure.swf
socket.io.js
socket.io.min.js
Just make sure you update these files whenever you update the server.
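If you go that route, here is a minimal sketch of serving those files yourself with Express; the mount path and clone location are assumptions, so adjust them to your setup:
var express = require('express');
var app = express();

// serve the four client files from the cloned repository's dist/ directory
app.use('/js/socket.io', express.static(__dirname + '/socket.io-client/dist'));

app.listen(3000);
Your page would then load /js/socket.io/socket.io.min.js instead of the default /socket.io/socket.io.js.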
Related
I am building a web app that uses the (mobile device's) camera, but this works only over https and on localhost.
The web app is served locally using WAMP 3.2.9.
I've managed to enable the secure protocol (https) in my WAMP configuration, but I'm having problems sharing the app on my local network so I can open it on my phone and test the camera functionality.
In older versions of Laravel (which used Webpack) this was very easy with BrowserSync, but now, with Vite, I don't know exactly how to do it.
My local domain is myapp.test and can be accessed using both http and https.
I tried npm run vite --host, which shows the local and network addresses (ex. 192.168..), but when I visit that address on my phone, I see only the Vite default page ("This is the Vite development server that provides Hot Module Replacement for your Laravel application."), not the app itself.
In my vite.config.js file I added that IP from the Vite network address:
server: {
    https: true,
    host: '192.168._._'
},
plugins: [
    laravel({
        input: [
            'resources/css/app.css',
            'resources/js/app.js',
        ],
        refresh: [
            ...refreshPaths,
            'app/Http/Livewire/**',
        ],
    }),
    mkcert()
],
Note that I also used the mkcert vite plugin to allow me to use https.
Now I'm confused about the Vite dev server, which runs on port 5173 by default, versus the app itself, which should run on port 443 to be served over https.
I've also tried using `php artisan serve --host 192.168..`, which works on my local network, but it doesn't work with https, so I had to focus on WAMP only.
So how can I share my app on my local network over https?
I'll explain how Vite works compared to Webpack, to hopefully help you understand it a little better.
Both Webpack and Vite create a bundle of files when you run their build commands to compile for production. With the dev command, which it seems you're using, they work a little differently: Webpack watches for file changes and recompiles the bundle, and BrowserSync then reloads your assets for you, while Vite starts a local server to serve the compiled files. This means you don't proxy your original domain like you do with BrowserSync.
Vite also creates a file in your public folder called "hot", which tells Laravel which URL it should use in the @vite() directive or the Vite::asset() method. Because of that, you can keep using your original domain myapp.test even for the hot reloading of the dev command.
As for --host, I don't think Laravel actually supports it; if it does, I haven't been able to find or figure it out.
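For reference, a minimal example of the @vite() directive mentioned above in a Blade layout, using the same entry points as the vite.config.js in the question:
@vite(['resources/css/app.css', 'resources/js/app.js'])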
I did find https://github.com/Applelo/vite-plugin-browser-sync, which may solve testing on other devices, but I couldn't get it to work with https. Otherwise I'm afraid you might have to look into something like ngrok and use the npm run build command instead of dev until better support is built into Laravel.
Update:
To configure the BrowserSync plugin you have to manually configure the proxy:
VitePluginBrowserSync({
    bs: {
        proxy: 'http://myapp.test/' // the usual access URL
    }
})
Since it doesn't seem like Laravel supports --host, I found a workaround: because Laravel reads the asset host URL from the hot file in the public directory, you can replace its contents with the external Vite URL, like http://192.168.1.37:5174, after running npm run dev --host. This makes Laravel use that URL when referencing any assets.
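For example, after running npm run dev --host, public/hot would contain nothing but the dev server URL on a single line (the address below is the example from above; use whatever network address Vite prints):
http://192.168.1.37:5174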
For reference: https://socket.io/get-started/chat/
The guide says that it's for localhost only:
Socket.IO is composed of two parts:
A server that integrates with (or mounts on) the Node.JS HTTP Server:
socket.io
A client library that loads on the browser side: socket.io-client
During development, socket.io serves the client
automatically for us, as we’ll see, so for now we only have to install
one module.
I've already completed the guide. It works in development. I now want to test this on Heroku (I already know how to deploy to Heroku). The guide seems to be telling me I need socket.io-client to do that, but I'm not sure how to implement it.
Turns out socket.io-client had nothing to do with it. The example wouldn't work on Heroku because process.env.PORT wasn't being used. In index.js replace this:
http.listen(3000, function(){
console.log('listening on *:3000');
});
with this:
var port = process.env.PORT || 3000;
http.listen(port, function(){
  console.log('listening on *:' + port);
});
I am using the Grails Resources plugin. On the client I am using require.js to fetch JS files.
My require.js config:
baseUrl: '/js/lib',
With the Resources plugin enabled:
the browser makes a request for /js/lib/abc.js, wasting ~300 ms;
on reaching the server, it is redirected to /static/2432yi4h32kh4232h4k2h34ll.js;
the browser then finds that file in its cache and serves it.
So I disabled the cached-resources plugin using:
grails.resources.mappers.hashandcache.excludes = ['**/*.js']
and my new require.js config:
baseUrl: '/static/js/lib',
urlArgs: "bust=" + application_version,
Removing cached-resources solved the redirect issue, but it also removed the Expires header that was being set for JS files, causing browsers not to cache them at all.
How can I disable only the name hashing in cached-resources while keeping the Expires headers it sets?
Otherwise, are there any Grails plugins I can use to set these headers that work well with the Resources plugin?
I am using Tomcat and HAProxy to serve content.
I think the best solution is to put the hashed JS file name in the require definition, not the original clear name.
You can echo the hashed name using the Resources plugin's external tag:
<r:external uri="js/custom.js"/>
<script type="text/javascript">
var urlOfCSSToLoadInJSCode = '${r.external(uri:"css/custom.css").encodeAsJavaScript()}';
</script>
<r:external uri="icons/favicon.ico"/>
I am trying to set up node-http-proxy. My goal is to put a proxy on my website. I could do this manually by performing the GETs on the server and then rewriting the links in the HTML, but I would like to use an existing solution if there is one. Maybe I don't fully understand what node-http-proxy is. Here is my test code:
require("http-proxy").createServer(function (req, res, proxy) {
proxy.proxyRequest(req, res, {
host: 'npr.org',
port: 80
});
}).listen(8000);
I go to localhost:8000 and it returns NPR. But the source that is returned still includes links directly to NPR such as:
<script type="text/javascript" src="http://s.npr.org/templates/javascript/generated/fingerprint/homepageMetrics-62631a6b672420dab3673f851b6a5de98512e21d.js">
So if I were using the proxy to gain access to a website that is blocked it would not work. Nor would it work if I were using the proxy to keep the end server from knowing the client downloaded something. Basically the only HTTP proxying that is happening is with the initial GET (I think).
Is node-http-proxy capable of proxying all HTTP requests or is that something I will have to do manually?
sudo npm install npr -g
does the job; you may want to take a look at it.
I have a website that uses XMLHttpRequest (jQuery, actually). I also have another site running on the same server, which serves a script file that makes XHR requests back to THAT site, i.e.
http://mysite:50000/index.html includes
<script src="http://mysite:9000/otherscript.js"></script>
and http://mysite:9000/otherscript.js includes
$.ajax({
  url: 'http://mysite:9000/ajax/stuff'
});
The problem is that this doesn't work: the AJAX requests from the loaded script simply fail with no error message. From what I've been able to find, this is the old same-origin policy. Given that I control both sites, is there anything I can do to make this work? The "document.domain" trick doesn't seem to do a thing for XMLHttpRequest.
Nope, you can't do this with XHR. The same-origin policy is very restrictive there: same host, same port, same protocol. Sorry! You'll have to resort to other tricks (iframes, title manipulation, etc.) to get it to work.
You can do this by adding the Access-Control-Allow-Origin header.
If you are using PHP
header("Access-Control-Allow-Origin: http://example.com");
or in Node.js
response.writeHead(200, {'Access-Control-Allow-Origin': 'http://example.com'});
This should do the trick for you. It always works for me.
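Applied to the setup in this question, a sketch of the server on port 9000 allowing XHR from the page served on port 50000 (the response body is a placeholder; the header is what matters):
var http = require('http');

http.createServer(function (req, res) {
  res.writeHead(200, {
    // allow XHR from the page served on the other port
    'Access-Control-Allow-Origin': 'http://mysite:50000'
  });
  res.end('stuff');
}).listen(9000);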
I just solved a similar issue with a PHP service I'm currently playing around with (not sure how directly relevant a PHP solution is here, but...) by making a single-line PHP proxy page, SimpleProxy.php:
<?php
echo file_get_contents('http://localhost:4567');
?>
And in my XMLHttpRequest I use 'SimpleProxy.php' in place of 'http://localhost:4567', which effectively puts the request on the same domain as my .js code.
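The client-side call then simply targets the proxy on the page's own origin, for example (a hypothetical jQuery call matching the setup above):
$.ajax({
  url: 'SimpleProxy.php', // same origin as the page; proxies to :4567
  success: function (data) {
    console.log(data);
  }
});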