GulpJS - live reload with Laravel

I'm trying to get this working and I have spent a day now with not much success... I'm new to both Laravel and GulpJS, which makes it even better I guess... :)
What I'm trying to achieve is a (preferably cross-platform) solution for reloading the browser on file changes using GulpJS on a Laravel instance. I could get gulp-livereload working as per its documentation, but I cannot set it up properly to have an effect on the local domain http://laravel.dev set with MAMP PRO (I know, I know...).
I am happy to leave MAMP behind if it's a must as long as I can set a local development domain on a project-by-project basis. I tried to look around even on https://laracasts.com/ but I'm yet to find a solution which would take care of automatic reloading of the browser with Laravel - it seems this bit is missing from all Laravel-GulpJS solutions...
Anybody for the rescue? Any ideas? I'll keep looking in the meantime... :)

Of course, you can use gulp-livereload with Laravel. And you don’t have to use Elixir for livereload. Just do the following:
Install gulp modules as usual in your Laravel App folder:
npm install --save-dev gulp gulp-livereload
Add a script tag to your Laravel layout that loads the livereload client, with src="//localhost:35729/livereload.js?snipver=1"
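For reference, that is just a plain script tag in your Blade layout; the environment check below is optional and my own suggestion, so the livereload client only loads during local development:
@if (app()->environment('local'))
    <script src="//localhost:35729/livereload.js?snipver=1"></script>
@endif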
Install LiveReload browser extension
The simplest gulpfile.js will be like this:
var gulp = require('gulp'),
    livereload = require('gulp-livereload');

// Tell every connected LiveReload client to do a full page reload
gulp.task('reload', function() {
    livereload.reload();
});

// Start the LiveReload server and re-run 'reload' whenever a watched file changes
gulp.task('watch', function() {
    livereload.listen();
    gulp.watch(['./public/**/*.css', './app/**/*.php', './resources/**/*.php'],
        ['reload']);
});
Finally, run gulp watch.

Related

Laravel 9 (Vite) shared on local network on https

I am building a web app that uses the (mobile device's) camera, but this only works on https and on localhost.
The web app is served locally using WAMP 3.2.9.
I've managed to use the secure protocol (https) within my wamp configuration, but I'm having problems when I want to share my app to my local network so I can view the app on my phone and test the camera functionality.
In the older versions of Laravel (which used webpack) this was very easy using browsersync, but now, using Vite I don't know exactly how to do this.
My local domain is myapp.test and can be accessed using both http and https.
I tried to use npm run vite --host, which shows the local and network addresses as well (e.g. 192.168..), but when I visit that address on my phone, I only see the Vite default page ("This is the Vite development server that provides Hot Module Replacement for your Laravel application."), not the app itself.
In my vite.config.js file I added that ip from vite network:
server: {
    https: true,
    host: '192.168._._'
},
plugins: [
    laravel({
        input: [
            'resources/css/app.css',
            'resources/js/app.js',
        ],
        refresh: [
            ...refreshPaths,
            'app/Http/Livewire/**',
        ],
    }),
    mkcert()
],
Note that I also used the mkcert vite plugin to allow me to use https.
Now I'm confused about the vite service that runs on port 5173 by default and the app that should run on port 443 to be on https.
I've also tried using php artisan serve --host 192.168.., which works on my local network, but it doesn't work with https, so I had to focus on WAMP only.
So how can I have my app shared among my local network with https?
I'll explain how Vite works compared to Webpack to hopefully help you understand a little better.
Both Webpack and Vite create a bundle of files when using the build commands to compile for production. With the dev command, which it seems you're using, they work a little differently. While Webpack watches for file changes to recompile the bundle and BrowserSync then reloads your assets for you, Vite starts a local server to serve the compiled files. This means that you don't proxy your original domain like with BrowserSync. Vite also creates a file in your public folder called "hot", which tells Laravel which URL it should use with the @vite() directive or the Vite::asset() method. Because of that you can use your original domain myapp.test even for the hot reloading of the dev command.
I don't think Laravel's Vite integration actually supports --host; if it does, I haven't been able to find it or figure it out.
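As an aside, the hot file mentioned above is normally just a single line containing the dev server URL that Laravel should use, something like (the exact address depends on your setup):
http://127.0.0.1:5173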
I did find https://github.com/Applelo/vite-plugin-browser-sync, which will hopefully solve testing on other devices, but I couldn't get it to work with https. Otherwise, I'm afraid you might have to look into something like ngrok and use the npm run build command instead of dev until better support is built into Laravel.
Update:
To configure the BrowserSync plugin you have to manually configure the proxy:
VitePluginBrowserSync({
    bs: {
        proxy: 'http://myapp.test/' // The usual access URL
    }
})
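For context, here is a minimal sketch of where that snippet sits in vite.config.js; the import name follows the plugin's README, but treat the exact wiring as an assumption rather than a drop-in config:
// vite.config.js (sketch)
import { defineConfig } from 'vite';
import laravel from 'laravel-vite-plugin';
import VitePluginBrowserSync from 'vite-plugin-browser-sync';

export default defineConfig({
    plugins: [
        laravel({
            input: ['resources/css/app.css', 'resources/js/app.js'],
        }),
        VitePluginBrowserSync({
            bs: {
                proxy: 'http://myapp.test/', // the usual access URL
            },
        }),
    ],
});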
Even though it doesn't seem like Laravel supports --host, I have been able to find a workaround: because Laravel reads the asset host URL from the hot file in the public directory, you can replace its contents with the external Vite URL, like http://192.168.1.37:5174, after running npm run dev --host. This will make Laravel use that URL when referencing any assets.
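A rough sketch of that workaround on the command line, using the example address from above (the -- before --host may be needed so npm passes the flag through to Vite, and the hot file gets rewritten the next time the dev server starts):
npm run dev -- --host
# in another terminal, point Laravel's asset URLs at the external Vite address
echo "http://192.168.1.37:5174" > public/hot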

How can I upload my Laravel 8 and Tailwind CSS based application to cPanel

I am new to Laravel. Thanks for the help.
I have built an application using Laravel 8 (with Jetstream + Livewire) and Tailwind CSS.
It works perfectly on my local machine by running:
php artisan serve
But when I upload the site/code to the live server, it does not get the proper styles that I defined in the tailwind.config.js file.
What I get is only the CSS that could be generated from a CDN (I am not using the Tailwind CDN, I am sure of that).
To put the question more clearly: I only get the compiled CSS from the app.css file. I do not get the extended features that I enabled in the tailwind.config.js file. For example, on the live server I get no output for this (code from my tailwind.config.js):
variants: {
    extend: {
        translate: ['group-hover'],
        scale: ['group-hover'],
        transitionProperty: ['group-hover'],
        display: ['group-hover'],
    },
},
But on my local machine, everything is perfect.
I have uploaded it to 000webhostapp by following this guideline.
I did not include the node_modules folder in the upload
Thanks in advance for any help.
Make sure that you are serving the public folder and not the root of your Laravel project. You should not be able to see public in your URLs.
This will impact the path to the CSS file. You should verify that all resources are loading by looking in the developer tools' Network tab.

Laravel Mix + BrowserSync infinite loading out of the box

Something is making my out-of-the-box Laravel project, with browser-sync and the browser-sync-webpack-plugin installed, load infinitely on the BrowserSync page. It works fine on http://localhost, but the BrowserSync version (localhost:3000) never stops loading and displays no content, just a white page.
I found this question which was similar to mine but it doesn't have any answers.
This only recently started happening on my machine. At first I thought it was because of the antivirus or firewall but disabling them did no good. I can't even figure out what's causing the page to never load.
Here's what my webpack.mix.js file looks like:
mix.js('resources/assets/js/script.js', 'public/js')
    .sass('resources/assets/sass/main.scss', 'public/css')
    .disableNotifications()
    .browserSync();
Edit: Any tips on narrowing down the problem would also be appreciated.
For me, the problem was solved by adding whatever URL BrowserSync was proxying to the Windows hosts file and pointing it to 127.0.0.1.
The default proxy target for the mix-browsersync package used in Laravel is app.test.
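For example, assuming BrowserSync is proxying the default app.test, the hosts entry looks like this (the path shown is the usual Windows location):
# C:\Windows\System32\drivers\etc\hosts
127.0.0.1    app.test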
You can specify a different proxy target in the mix file:
mix
    .sass('resources/sass/app.scss', 'public/css')
    .js('resources/js/app.js', 'public/js')
    .browserSync('localhost');
And, if like me, you're using the artisan server for development, you can even target that (as long as you're running php artisan serve in your project):
mix
    .sass('resources/sass/app.scss', 'public/css')
    .js('resources/js/app.js', 'public/js')
    .browserSync('localhost:8000');
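Note that this assumes two processes running side by side, something like:
php artisan serve    # serves the app on localhost:8000
npm run watch        # compiles assets and starts BrowserSync (localhost:3000 by default)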
I have a possible solution for you that worked for me.
My issue was that I was fixated on using localhost:8000; instead, I tried 127.0.0.1:8000. For those of you who don't know, that IP address is your default localhost IP address.
Try adding it like this to your webpack.mix.js file:
mix.browserSync({
    proxy: 'http://127.0.0.1:8000'
});
Or alternatively you can use
mix.browserSync('127.0.0.1:8000');
Once done you can run npm run watch.

Laravel 5 CORS-issue vs. Ionic Angular

I am currently making an Ionic/Cordova application with Laravel 5 as a REST server (my first time coding a PHP server).
With Postman, all my GET/POST/UPDATE/DELETE functions work on Laravel. On the client side (Ionic/Cordova), I am able to send data to http://postcatcher.in using the Chrome Allow-Control-Allow-Origin plugin. Without the plugin, I get an error.
Since this problem apparently only happens in development mode (when testing the client side in the browser), I assume it's alright to develop with the plugin.
When I try to send data to Laravel through Ionic/Cordova, I get an error (even with the Allow-Control-Allow-Origin plugin).
I have tried multiple things, such as https://github.com/barryvdh/laravel-cors, which just doesn't seem to work for me. Neither do the suggestions in this forum about using CORS middleware: https://laracasts.com/discuss/channels/requests/laravel-5-cors-headers-with-filters
I assume that this is a Laravel-issue, but I am not 100% sure.
To publish the server, I use
php artisan serve, giving it localhost:8000.
For the application, I write: ionic serve, which gives it localhost:8100.
At last, this is the code I use to send data on Ionic-side:
.factory('userFactory', function($http, $q) {
    return {
        createuser: function(info) {
            var deferred = $q.defer();
            $http.post('localhost:8000/users', info)
                .success(function(response, status) {
                    deferred.resolve(response);
                })
                .error(function() {
                    console.log('SOMETHING WENT WRONG');
                });
            return deferred.promise;
        }
    };
})
Any help is really appreciated. Really stuck with this issue.
So it turns out that I'm an absolute idiot. I just needed to write "http://" in front of the URL. Rookie mistake.
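In other words, the fix was simply adding the protocol to the URL in the factory:
$http.post('http://localhost:8000/users', info)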
EDIT: I'll keep the question open for just half an hour, in case anyone has advice for me.

How to use Vagrant with BrowserSync for gulp?

I want to know if anyone is using BrowserSync with Vagrant. Is it possible?
How do you configure it? I tried to find something about it on the web, but nothing was clear to me. Is there another service that can livereload on all devices at the same time?
If your vagrant box is running, you should be able to access your website through the domain you chose. For this example, let's assume your site is loaded via mysite.local
In your gulpfile.js you'll need to set up a proxy, like so:
// gulpfile.js
var gulp = require('gulp'),
    browserSync = require('browser-sync');

gulp.task('browser-sync', function() {
    browserSync.init(["path/to/your/css/*.css", "path/to/your/js/*.js"],
        // The magic line. Point the proxy to your local domain
        { proxy: "mysite.local" }
    );
});
Restart gulp if it's already running and you should see an IP address in your command line output.
Take note of the port number (in this case, :3000).
Now go to http://mysite.local:3000 -- Notice the port! You must add the port that is shown in your command line. BrowserSync will connect!
Note: If it does not work, a cache could be your problem. Try adding a random query string to the end of your URL to get around a caching issue. Example: http://mysite.local:3000/?t=1
