How to use Vagrant with BrowserSync for gulp?

I want to know if anyone is using BrowserSync with Vagrant. Is it possible?
How do you configure it? I tried to find something about it on the web, but nothing was clear to me. Is there another service that livereloads all devices at the same time?

If your vagrant box is running, you should be able to access your website through the domain you chose. For this example, let's assume your site is loaded via mysite.local
In your gulpfile.js you'll need to set up a proxy, like so:
// gulpfile.js
var gulp = require('gulp');
var browserSync = require('browser-sync');

gulp.task('browser-sync', function() {
    browserSync.init(
        // Files BrowserSync should watch and inject/reload on change
        ["path/to/your/css/*.css", "path/to/your/js/*.js"],
        // The magic line. Point the proxy to your local domain
        { proxy: "mysite.local" }
    );
});
Restart gulp if it's already running and BrowserSync will print its access URLs, including an external IP address, in your command line.
Take note of the port number (in this case, :3000).
Now go to http://mysite.local:3000 -- Notice the port! You must add the port that is shown in your command line. BrowserSync will connect!
Note: If it does not work, a cache could be your problem. Try adding a random query string to the end of your URL to get around a caching issue. Example: http://mysite.local:3000/?t=1
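If you also want your preprocessors to rebuild automatically, a common pattern is to hang a watch task off this one. A minimal sketch, assuming you already have a separate 'sass' task of your own (the task name and glob below are placeholders):
// gulpfile.js (continued)
gulp.task('watch', ['browser-sync'], function() {
    // Rebuild CSS when sources change; BrowserSync then injects the
    // generated files matched by the globs passed to init() above
    gulp.watch('path/to/your/scss/**/*.scss', ['sass']);
});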

Related

Laravel 9 (Vite) shared on local network on https

I am building a web app that uses the (mobile device's) camera, but this only works on https and localhost.
The web app is served locally using WAMP 3.2.9.
I've managed to use the secure protocol (https) within my wamp configuration, but I'm having problems when I want to share my app to my local network so I can view the app on my phone and test the camera functionality.
In the older versions of Laravel (which used webpack) this was very easy using browsersync, but now, using Vite I don't know exactly how to do this.
My local domain is myapp.test and can be accessed using both http and https.
I tried to use npm run vite --host, which shows the local and network address as well (ex. 192.168..), but when I visit that address on my phone, I only see the Vite default page ("This is the Vite development server that provides Hot Module Replacement for your Laravel application."), not the app itself.
In my vite.config.js file I added that IP from the Vite network output:
// vite.config.js
import { defineConfig } from 'vite';
import laravel, { refreshPaths } from 'laravel-vite-plugin';
import mkcert from 'vite-plugin-mkcert';

export default defineConfig({
    server: {
        https: true,
        host: '192.168._._'
    },
    plugins: [
        laravel({
            input: [
                'resources/css/app.css',
                'resources/js/app.js',
            ],
            refresh: [
                ...refreshPaths,
                'app/Http/Livewire/**',
            ],
        }),
        mkcert()
    ],
});
Note that I also used the mkcert vite plugin to allow me to use https.
Now I'm confused about the vite service that runs on port 5173 by default and the app that should run on port 443 to be on https.
I've also tried using php artisan serve --host 192.168.., which works on my local network, but it doesn't work with https, so I had to focus on WAMP only.
So how can I have my app shared among my local network with https?
I'll explain how Vite works compared to Webpack, to hopefully help you understand a little better.
Both Webpack and Vite create a bundle of files when using the build commands to compile for production. With the dev command, which it seems you're using, they work a little differently. While Webpack watches for file changes to recompile the bundle and BrowserSync then reloads your assets for you, Vite starts a local server to serve the compiled files. This means that you don't proxy your original domain like with BrowserSync.
Vite also creates a file in your public folder called "hot", which tells Laravel which URL it should use with the @vite() directive or the Vite::asset() method. Because of that you can use your original domain myapp.test even for the hot reloading of the dev command. I don't think Laravel actually supports --host, and if it does, I haven't been able to find it or figure it out.
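For example, a Blade layout typically pulls the entry points in through that directive, and Laravel prefixes them with whatever URL the hot file points to while the dev server is running (a minimal sketch using the inputs from your config):
{{-- resources/views/layouts/app.blade.php --}}
@vite(['resources/css/app.css', 'resources/js/app.js'])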
I did find https://github.com/Applelo/vite-plugin-browser-sync, which will hopefully solve testing on other devices, but I couldn't get it to work with https. Otherwise I'm afraid you might have to look into something like ngrok and use the npm run build command instead of dev until better support is built into Laravel.
Update:
To configure the BrowserSync plugin you have to manually configure the proxy in vite.config.js:
// vite.config.js
import VitePluginBrowserSync from 'vite-plugin-browser-sync';

// ...added to the plugins array:
VitePluginBrowserSync({
    bs: {
        proxy: 'http://myapp.test/' // The usual access URL
    }
})
Since it doesn't seem like Laravel supports --host, I have found a workaround: because Laravel reads the asset host URL from the hot file in the public directory, you can replace its contents with the external Vite URL, like http://192.168.1.37:5174, after running npm run dev --host. This will make Laravel use that URL when referencing any assets.
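To illustrate, the hot file is just a one-line text file containing the dev server URL, so after npm run dev --host you would overwrite public/hot with the external address (using the example IP and port from above):
public/hot
http://192.168.1.37:5174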

Laravel forcing HTTP for assets

This is a little bit strange, because most of the questions here want to force https.
While learning AWS Elastic Beanstalk, I am hosting a Laravel site there. Everything is fine, except that none of my JavaScript and CSS files are being loaded.
I have referenced them in the Blade view as:
<script src="{{asset('assets/backend/plugins/jquery/jquery.min.js')}}"></script>
The first thing I tried was looking into the file/folder permissions in the root of my project by SSHing into the EC2 instance. That didn't work, even when I set the permissions on the public folder to 777.
Later I found out that the site's main page URL was http while all the asset URLs were https.
I don't want to get into SSL certificates just yet, if possible.
Is there any way I can force my asset URLs to be http only?
Please forgive my naivety. Any help would be appreciated.
This usually happens if your site is, for example, behind a reverse proxy: the URL facade trusts your local instance behind the proxy, which might not use SSL, and that can be misleading/wrong.
That is probably the case on an EC2 instance, as SSL termination happens at the load balancers/HA proxies.
I usually add the following to my AppServiceProvider.php:
// app/Providers/AppServiceProvider.php
use Illuminate\Support\Str;
use Illuminate\Support\Facades\URL;

public function boot()
{
    // Force the scheme that matches APP_URL so generated asset/route URLs stay consistent
    if (Str::startsWith(config('app.url'), 'https')) {
        URL::forceScheme('https');
    } else {
        URL::forceScheme('http');
    }
}
Of course this requires that you've set app.url / APP_URL. If you are not using that, you can just get rid of the if statement, but that is a little less elegant and prevents you from developing on non-https.
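For reference, the check above only looks at the scheme of the configured app URL, so in .env it would be something like this (the domain is a placeholder for your actual Elastic Beanstalk URL):
APP_URL=http://your-environment.elasticbeanstalk.com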

proxy.pac - exception for images

I'm a web developer and I use squid as a proxy, which I entered in firefox as the proxy server.
So when I enter http://www.example.com in firefox, I see the site on my local machine, by having configured squid accordingly.
Now the problem is that some of our customers have GBs of images, and it's a pain to load them all onto my machine. So basically I want to use my offline webpage but load the images from the live server, so I don't have a broken site without images.
In order to do this I've tried to create a proxy.pac and configured it this way:
function FindProxyForURL(url, host) {
    if (shExpMatch(url, "*.jpg")) {
        return "DIRECT";
    } else {
        return "PROXY 192.168.178.31:3128; DIRECT";
    }
}
Unfortunately it doesn't really work. What am I doing wrong, and how can I achieve my goal?
According to the Mozilla document on PAC files:
The path and query components of https:// URLs are stripped. In Chrome, you can disable this by setting PacHttpsUrlStrippingEnabled to false, in Firefox the preference is network.proxy.autoconfig_url.include_path.
What this means is when you enter a url such as https://www.example.com/image.jpg, what gets passed to the PAC script is the url https://www.example.com. As a result, you're never going to enter the first condition of your if statement.
In Firefox, you can change this by going to the about:config page and setting network.proxy.autoconfig_url.include_path to true.
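If you would rather not touch browser settings, an alternative is to match on the host instead of the full URL, since the host part is never stripped. A sketch; the hostnames below are placeholders for wherever the customer images actually live:
function FindProxyForURL(url, host) {
    // The host is always available to the PAC script, even for https,
    // so send the whole image/static host straight to the live server...
    if (shExpMatch(host, "images.example.com") || dnsDomainIs(host, ".cdn.example.com")) {
        return "DIRECT";
    }
    // ...and keep everything else on the local squid instance.
    return "PROXY 192.168.178.31:3128; DIRECT";
}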

Vue Cli Webpack Proxy

I've been developing with the vue-cli and the Webpack template. Everything works flawlessly but I'm having some issues using a custom host. Right now Webpack listens to localhost:8080 (or similar) and I want to be able to use a custom domain such as http://project.dev. Has anybody figured this out?
This might be where the problem resides:
https://github.com/chimurai/http-proxy-middleware
I also added this to the proxyTable:
proxyTable: { 'localhost:8080': 'http://host.dev' }
and it gives me a console response: [HPM] Proxy Created / -> http://host.dev
Any advice, direction or suggestion would be great!
Update
I successfully added a proxy to my Webpack project this way:
// build/dev-server.js -- `app` is the dev server's existing express instance
var proxyMiddleware = require('http-proxy-middleware')

var mProxy = proxyMiddleware('/', {
    target: 'http://something.dev',
    changeOrigin: true,
    logLevel: 'debug'
})
app.use(mProxy)
This seems to work, but still not on port 80.
Console Log:
[HPM] Proxy created: / -> http://something.dev
I can assume the proxy is working! But my assets are not loaded when I access the url.
It's important to note I'm used to working with MAMP, and it's using port 80. So the only way I can run this proxy is to shut down MAMP and set the port to 80. It seems to work, but when I reload the page with the proxy URL, there is a little delay trying to resolve, and then the console outputs this:
[HPM] GET / -> http://mmm-vue-ktest.dev
[HPM] PROXY ERROR: ECONNRESET. something.dev -> http://something.dev/
And this displays in the browser:
Error occured while trying to proxy to: mmm-vue-ktest.dev/
The proxy table is for forwarding requests to another server, like a development API server.
If you want the webpack dev server to run on this address, you have to add it to your OS's hosts file. Vue or webpack can't do this; it's the job of your OS.
Google will have simple guides for every OS.
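For example, to map project.dev to your own machine, the hosts entry would look something like this (on Windows the file lives at C:\Windows\System32\drivers\etc\hosts):
# /etc/hosts
127.0.0.1    project.dev
After that, http://project.dev:8080 resolves to your machine and reaches the webpack dev server; dropping the :8080 still requires something listening on port 80 in front of it.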

Laravel 5 Route Strange Behaviour

I have a Laravel site set up on a Homestead box, so I'm accessing it on sitename.app:8000. I have a route called "news" but when I try to go to sitename.app:8000/news I get oddly bounced out to sitename.app/news/.
If I change the routename to "news2" I can access the desired controller action as per normal at sitename.app:8000/news2. So somehow it's "news" itself that has become uncooperative - and I'm pretty sure that we aren't even getting as far as the NewsController, when I try to access that url.
Can anyone work out from these symptoms what might be going wrong? One "news"-related change I made at some point was to add $router->model('news', "App\News"); in the boot method of the RouteServiceProvider, but removing this doesn't seem to make the difference.
ETA: People keep asking for the routes.php file. I can literally remove everything from the file except
Route::get('news', function() {
    return "hello world";
});
Route::get('news2', function() {
    return "hello world";
});
and /news2 will work but /news will bounce me out. So I remain pretty convinced that the problem is somewhere deeper than routes.php...
I finally worked out what boneheaded action of mine had been causing this behaviour!
I had created a folder in /public named "news"... i.e. with the same name as an important route. Not sure exactly what havoc this was wreaking behind the scenes for Laravel every time a request for /news was being made, but one can assume it was nothing good.
Advice for anyone tearing their hair out over a route that "mysteriously doesn't work" - check your public folder for possible collisions!
This is a known issue: Laravel missing port.
The easiest way to solve this problem is to go to public/index.php and set the SERVER_PORT value.
$_SERVER['SERVER_PORT'] = 8000;
Don't forget to set the base URL in the config if you are using links on the website; the URL generator uses the base URL defined in the config.
The last option is to change the VM port forward in the Vagrantfile to point to port 80 and use port 80 for your app.
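For that last option, the forwarded port in the Vagrantfile would look roughly like this (a sketch; with Homestead you would normally adjust the ports mapping in Homestead.yaml instead):
config.vm.network "forwarded_port", guest: 80, host: 80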
