Laravel routes aren't working for user-uploaded files

I'm trying to get a Laravel application called Acellemail to work with nginx, but it has trouble with uploaded content. There seems to be some kind of encoding in the public URL, and I can't figure out how to write an nginx rewrite rule for it.
The URL looks like this:
https://app.example.com/assets/YXBwL3RlbXBsYXRlcy82M2RmOWQ5ODAzZmEzL2ltZw/welcome-5.png
The actual path to the file on the server is:
/var/www/app.example.com/html/storage/app/templates/63df9d9803fa3/img/welcome-5.png
Each time I upload a new template, it creates a new folder inside /var/www/app.example.com/html/storage/app/templates, and the YXBwL3RlbXBsYXRlcy82M2RmOWQ5ODAzZmEzL2ltZw portion of the public URL changes accordingly. Nginx returns 404 errors for these URLs.
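The changing segment is not random noise: it decodes as plain base64 of the storage-relative directory. A quick check (the trailing `==` padding is re-added by hand here, since the URL omits it):

```shell
# Decode the encoded segment from the public URL
# (padding restored manually; the URL strips the trailing '==')
echo 'YXBwL3RlbXBsYXRlcy82M2RmOWQ5ODAzZmEzL2ltZw==' | base64 -d
# prints: app/templates/63df9d9803fa3/img
```

So /assets/<base64>/welcome-5.png appears to be a Laravel route that decodes the segment and streams the file out of storage/app, not a static path nginx can map to disk with a rewrite.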
Nginx config:
server {
    server_name app.example.com;
    root /var/www/app.example.com/html/public;
    index index.php;

    location / {
        try_files $uri $uri/ /index.php?$query_string;
    }

    location ~* \.(ico|css|js|gif|jpe?g|png)(\?[0-9]+)?$ {
        expires max;
        log_not_found off;
    }

    location ~ \.php$ {
        include snippets/fastcgi-php.conf;
        fastcgi_pass unix:/var/run/php/php8.0-fpm.sock;
        fastcgi_param SCRIPT_FILENAME $realpath_root$fastcgi_script_name;
        include fastcgi_params;
        fastcgi_read_timeout 300;
    }

    location ~ /\.ht {
        deny all;
    }

    listen 443 ssl; # managed by Certbot
    ssl_certificate /etc/letsencrypt/live/app.example.com/fullchain.pem; # managed by Certbot
    ssl_certificate_key /etc/letsencrypt/live/app.example.com/privkey.pem; # managed by Certbot
    include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
    ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem; # managed by Certbot
}
server {
    if ($host = app.example.com) {
        return 301 https://$host$request_uri;
    } # managed by Certbot

    listen 80;
    server_name app.example.com;
    return 404; # managed by Certbot
}
web.php
https://pastebin.com/9HJaJF5v
Thanks for any help in advance!
I tried routing with nginx rewrite rules, but it's not ideal.
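One likely culprit, assuming the /assets/... URLs are meant to reach index.php: the static-asset regex location matches any URI ending in .png, looks for the file under the document root, and returns 404 without ever consulting Laravel. (The trailing `(\?[0-9]+)?` in that regex is dead weight anyway, since nginx matches locations against the URI without the query string.) A sketch of a fallback, untested against Acellemail:

```nginx
# Serve real static files directly, but let Laravel handle misses
# such as /assets/<encoded>/welcome-5.png
location ~* \.(ico|css|js|gif|jpe?g|png)$ {
    try_files $uri /index.php?$query_string;
    expires max;
    log_not_found off;
}
```

Note that `expires max` applies only when the file is served directly from this location; misses fall through to the PHP handler via the internal redirect to /index.php.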

Related

Why are my static files not being cached properly?

I have a little problem caching my static files in a Laravel/nginx environment.
I know that the static files are not being cached properly because Google PageSpeed tells me so, and other web services for cache checking also find no caching for these files. They are mostly images and fonts.
My nginx config for the site includes expires and add_header and looks like this:
server {
    root /var/www/websi.te/public;
    index index.php;
    server_name websi.te www.websi.te;

    location ~ \.php$ {
        fastcgi_pass unix:/var/run/php/php8.1-fpm.sock;
        fastcgi_param SCRIPT_FILENAME $realpath_root$fastcgi_script_name;
        include fastcgi_params;
    }

    location ~* .(?:css|js|svg|ttf|png|jpg)$ {
        expires 1y;
        add_header Cache-Control "public";
    }

    location / {
        try_files $uri $uri/ /;
    }

    listen 443 ssl; # managed by Certbot
    ssl_certificate /etc/letsencrypt/live/websi.te/fullchain.pem; # managed by Certbot
    ssl_certificate_key /etc/letsencrypt/live/websi.te/privkey.pem; # managed by Certbot
    include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
    ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem; # managed by Certbot
}
server {
    if ($host = www.websi.te) {
        return 301 https://$host$request_uri;
    } # managed by Certbot

    if ($host = websi.te) {
        return 301 https://$host$request_uri;
    } # managed by Certbot

    server_name websi.te www.websi.te;
    listen 80;
    return 404; # managed by Certbot
}
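Two details in the config above are worth checking. The dot in `location ~* .(?:css|...)` is unescaped, so it matches any character, and the expires/add_header pair only takes effect when this location actually serves the response (not when the request is handled by the PHP or `/` locations). A hedged variant for comparison:

```nginx
# Escaped dot, explicit 404 for missing files, long-lived cache headers
location ~* \.(?:css|js|svg|ttf|png|jpg)$ {
    try_files $uri =404;
    expires 1y;
    add_header Cache-Control "public, max-age=31536000";
}
```

You can verify which headers actually reach the client with `curl -I https://websi.te/path/to/some.css` and look for the Expires and Cache-Control lines.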

Nginx CentOS 8: Map Multiple Lumen Projects to a Single Domain

I have 6 Lumen projects which were working fine on Apache on CentOS. I just reconfigured the server to use nginx. I was able to set up a single project in the nginx config but cannot figure out how to set up multiple directories; I tried several configs, but none of them work. Here is my nginx config.
PS: before marking this as a duplicate, please try to explain and help me fix this issue.
server {
    root /var/www/domain.com/html/api/gateway/public;
    index index.php index.html index.htm index.nginx-debian.html;
    server_name domain.com www.domain.com;

    location / {
        autoindex on;
        try_files $uri $uri/ /index.php?$query_string;
    }

    # location /search {
    #     autoindex on;
    #     root /var/www/domain.com/html/api/search/public;
    #     index index.php index.html index.htm index.nginx-debian.html;
    #     try_files $uri $uri/ /index.php?$query_string;
    # }

    location ~ \.php$ {
        autoindex on;
        autoindex_exact_size on;
        fastcgi_split_path_info ^(.+\.php)(/.+)$;
        fastcgi_pass unix:/var/run/php-fpm/php-fpm.sock;
        fastcgi_index index.php;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        include fastcgi_params;
    }

    listen [::]:443 ssl ipv6only=on; # managed by Certbot
    listen 443 ssl; # managed by Certbot
    ssl_certificate /etc/letsencrypt/live/domain.com/fullchain.pem; # managed by Certbot
    ssl_certificate_key /etc/letsencrypt/live/domain.com/privkey.pem; # managed by Certbot
    include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
    ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem; # managed by Certbot
}
server {
    if ($host = domain.com) {
        return 301 https://$host$request_uri;
    } # managed by Certbot

    listen 80;
    listen [::]:80;
    server_name domain.com www.domain.com;
    return 404; # managed by Certbot
}
This works fine, but when I uncomment the search config section it stops working; both projects then throw Forbidden or sometimes 404 errors.
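A common reason the commented `location /search` block fails is that `root` inside a prefix location appends the full request URI, so nginx ends up looking for /var/www/domain.com/html/api/search/public/search/... instead of the project's files. One usual approach (a sketch only, with paths taken from the question and untested) is `alias` plus a nested PHP handler:

```nginx
# Map the /search prefix onto the second Lumen project's public directory
location /search {
    alias /var/www/domain.com/html/api/search/public;
    try_files $uri $uri/ /search/index.php?$query_string;

    location ~ \.php$ {
        fastcgi_pass unix:/var/run/php-fpm/php-fpm.sock;
        # $request_filename resolves through the alias above
        fastcgi_param SCRIPT_FILENAME $request_filename;
        include fastcgi_params;
    }
}
```

`alias` replaces the matched prefix instead of appending it, which is why it fits the "project under a URL path" layout better than `root` here.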

All Laravel routes are not found (404 error) on https://proclubs.app/login

I have deployed my Laravel app to the URL proclubs.app - this is a domain from Google Domains that requires an SSL certificate (the SSL has been set up using Certbot).
I have set up the Laravel Breeze package for authentication (e.g. register/login functionality), and this all works fine when testing locally. Now that I have pushed this to the remote URL, none of the routes work, and I just get a 404 Not Found message. I have run php artisan route:list and can see all the expected routes are there. I am 99% certain I have made a mistake with the nginx server block - I have used the default one that DigitalOcean provide in /etc/nginx/sites-available and edited it accordingly, but I'm not sure what is incorrect for me to get these 404 errors. Can anyone suggest what I have done wrong?
server {
    listen 80 default_server;
    listen [::]:80 default_server ipv6only=on;

    root /var/www/proclubs/public;
    index index.php index.html index.htm;

    # Laravel related only
    add_header X-Frame-Options "SAMEORIGIN";
    add_header X-Content-Type-Options "nosniff";

    index index.php;
    charset utf-8;

    # Make site accessible from http://localhost/
    server_name proclubs.app www.proclubs.app;

    location / {
        try_files $uri $uri/ /index.php?$query_string;
    }

    location = /favicon.ico { access_log off; log_not_found off; }
    location = /robots.txt { access_log off; log_not_found off; }

    error_page 404 /index.php;
    error_page 500 502 503 504 /50x.html;
    location = /50x.html {
        root /usr/share/nginx/html;
    }

    location ~ \.php$ {
        fastcgi_pass unix:/var/run/php/php7.4-fpm.sock;
        fastcgi_param SCRIPT_FILENAME $realpath_root$fastcgi_script_name;
        include fastcgi_params;
    }

    # deny access to .htaccess files, if Apache's document root
    # concurs with nginx's one
    #
    location ~ /\.(?!well-known).* {
        deny all;
    }
}
# another virtual host using mix of IP-, name-, and port-based configuration
#
#server {
# listen 8000;
# listen somename:8080;
# server_name somename alias another.alias;
# root html;
# index index.html index.htm;
#
# location / {
# try_files $uri $uri/ =404;
# }
#}
# HTTPS server
#
#server {
# listen 443;
# server_name localhost;
#
# root html;
# index index.html index.htm;
#
# ssl on;
# ssl_certificate cert.pem;
# ssl_certificate_key cert.key;
#
# ssl_session_timeout 5m;
#
# ssl_protocols SSLv3 TLSv1 TLSv1.1 TLSv1.2;
# ssl_ciphers "HIGH:!aNULL:!MD5 or HIGH:!aNULL:!MD5:!3DES";
# ssl_prefer_server_ciphers on;
#
# location / {
# try_files $uri $uri/ =404;
# }
#}
server {
    root /var/www/proclubs/public;
    index index.php index.html index.htm;

    # Make site accessible from http://localhost/
    server_name proclubs.app www.proclubs.app; # managed by Certbot

    location / {
        # First attempt to serve request as file, then
        # as directory, then fall back to displaying a 404.
        try_files $uri $uri/ =404;

        # Uncomment to enable naxsi on this location
        # include /etc/nginx/naxsi.rules
    }

    error_page 404 /404.html;
    error_page 500 502 503 504 /50x.html;
    location = /50x.html {
        root /usr/share/nginx/html;
    }

    location ~ \.php$ {
        include snippets/fastcgi-php.conf;
        fastcgi_pass unix:/run/php/php7.4-fpm.sock;
    }

    # deny access to .htaccess files, if Apache's document root
    # concurs with nginx's one
    #
    #location ~ /\.ht {
    #    deny all;
    #}

    listen [::]:443 ssl ipv6only=on; # managed by Certbot
    listen 443 ssl; # managed by Certbot
    ssl_certificate /etc/letsencrypt/live/proclubs.app/fullchain.pem; # managed by Certbot
    ssl_certificate_key /etc/letsencrypt/live/proclubs.app/privkey.pem; # managed by Certbot
    include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
    ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem; # managed by Certbot
}
server {
    if ($host = www.proclubs.app) {
        return 301 https://$host$request_uri;
    } # managed by Certbot

    if ($host = proclubs.app) {
        return 301 https://$host$request_uri;
    } # managed by Certbot

    listen 80;
    listen [::]:80;
    server_name proclubs.app www.proclubs.app;
    return 404; # managed by Certbot
}
-- Expected behaviour --
When a user visits https://proclubs.app/login, I expect to see the Laravel Breeze default login page.
-- Actual Behaviour --
When I visit https://proclubs.app/login, I just see an nginx 404 Not Found error instead, and no routes are working.
P.S. I am confused about why I have 3 server blocks too...
Server - nginx/1.18.0 (Ubuntu 20.04) on a DigitalOcean LEMP droplet
If you get a 404, the requested path is probably being resolved incorrectly. I checked your nginx configuration, and I see you have two server blocks: one for HTTP requests (the first server) and one for HTTPS (the second server).
When you request the /login path, nginx treats it as a request for a login folder, but in Laravel it is a special request handled by a route.
So your mistake is in the second (HTTPS) server block: your request looks for a folder on disk instead of being handed to Laravel. You must replace its location block with the location block from the first server.
location / {
    # First attempt to serve request as file, then
    # as directory, then fall back to displaying a 404.
    try_files $uri $uri/ =404;

    # Uncomment to enable naxsi on this location
    # include /etc/nginx/naxsi.rules
}
Change it to:
location / {
    try_files $uri $uri/ /index.php?$query_string;
}

Cannot use multiple projects

I am trying to configure nginx to serve multiple projects by path.
I have a client side based on JavaScript (a Vue.js project) which sends requests to an API.
The routes of the API start with /api (a Laravel project).
I also have an admin panel, also based on Laravel. The URLs of the admin panel start with /admin.
Here is my nginx config file:
server {
    server_name cabinet.mydomain.org;

    # auth_basic "Restricted Content";
    # auth_basic_user_file /etc/nginx/.htpasswd;
    # access_log /var/www/cabinet/access.log;
    # error_log /var/www/cabinet/error.log;

    root /var/www/cabinet/api/html/public;

    # Add index.php to the list if you are using PHP
    index index.php index.html index.htm;

    #location / {
    #    # First attempt to serve request as file, then
    #    # as directory, then fall back to displaying a 404.
    #    try_files $uri $uri/ =404;
    #}

    location /api {
        access_log /var/log/cabinet.api.acc.log;
        error_log /var/www/cabinet.api.error.log debug;
        try_files $uri $uri /index.php$args;
    }

    location /admin {
        access_log /var/log/cabinet.admin.acc.log;
        error_log /var/www/cabinet.admin.error.log debug;
        root /var/www/cabinet-admin/public;
        try_files $uri $uri /index.php$args;
    }

    location / {
        try_files $uri $uri/ /index.html;
        root /var/www/cabinet/client/dist;
    }

    location = / {
        return 301 $scheme://$server_name/login/;
    }

    # location ~* \.(?:ico|css|js|gif|jpe?g|png)$ {
    #     return 500;
    #     # Some basic cache-control for static files to be sent to the browser
    #     expires max;
    #     add_header Pragma public;
    #     add_header Cache-Control "public, must-revalidate, proxy-revalidate";
    # }

    # pass the PHP scripts to FastCGI server listening on 127.0.0.1:9000
    #
    location ~ \.php$ {
        include snippets/fastcgi-php.conf;
        fastcgi_pass unix:/run/php/php7.2-fpm.sock;
    }

    # deny access to .htaccess files, if Apache's document root
    # concurs with nginx's one
    #
    #location ~ /\.ht {
    #    deny all;
    #}

    # managed by Certbot
    listen 443 ssl; # managed by Certbot
    ssl_certificate /etc/letsencrypt/live/cabinet.mydomain.org-0001/fullchain.pem; # managed by Certbot
    ssl_certificate_key /etc/letsencrypt/live/cabinet.mydomain.org-0001/privkey.pem; # managed by Certbot
    include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
    ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem; # managed by Certbot
}
server {
    if ($host = cabinet.mydomain.org) {
        return 301 https://$host$request_uri;
    } # managed by Certbot

    #if ($host = cabinet.mydomain.com) {
    #    return 301 https://cabinet.mydomain.org$request_uri;
    #} # managed by Certbot

    #if ($host = cabinet.mydomain.org) {
    #    return 301 https://$host$request_uri;
    #} # managed by Certbot

    server_name cabinet.mydomain.org cabinet.mydomain.com;
    listen 80;
    return 301 https://cabinet.mydomain.org$request_uri;
    return 404; # managed by Certbot
}
So when trying to visit /admin, the server redirects to /login.
Please help me solve this problem.
Try this:
server {
    server_name cabinet.mydomain.org;

    # auth_basic "Restricted Content";
    # auth_basic_user_file /etc/nginx/.htpasswd;
    # access_log /var/www/cabinet/access.log;
    # error_log /var/www/cabinet/error.log;

    root /var/www;

    # Add index.php to the list if you are using PHP
    index index.php index.html index.htm;

    #location / {
    #    # First attempt to serve request as file, then
    #    # as directory, then fall back to displaying a 404.
    #    try_files $uri $uri/ =404;
    #}

    location /api {
        access_log /var/log/cabinet.api.acc.log;
        error_log /var/www/cabinet.api.error.log debug;
        root PATH_TO_YOUR_PROJECT;
        try_files $uri $uri /PATH_TO_PROJECT/index.php$args;
    }

    location /admin {
        access_log /var/log/cabinet.admin.acc.log;
        error_log /var/www/cabinet.admin.error.log debug;
        root /var/www/cabinet-admin/public;
        try_files $uri $uri /cabinet-admin/public/index.php$args;
    }
Please follow the docs for the directives involved:
try_files
root
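Worth noting: with `root /var/www/cabinet-admin/public`, nginx appends the request URI, so a request for /admin is looked up at /var/www/cabinet-admin/public/admin, which usually doesn't exist and triggers the fallback redirect. A sketch using `alias` instead, with paths assumed from the question and untested:

```nginx
# alias strips the /admin prefix before hitting the filesystem
location /admin {
    alias /var/www/cabinet-admin/public;
    try_files $uri $uri/ /admin/index.php?$query_string;

    location ~ \.php$ {
        fastcgi_pass unix:/run/php/php7.2-fpm.sock;
        # $request_filename resolves through the alias above
        fastcgi_param SCRIPT_FILENAME $request_filename;
        include fastcgi_params;
    }
}
```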

Firefox redirecting Nginx rewrite

Firefox is the only browser I am having issues with. I have found similar issues, but no solutions seem to work.
When I visit http://example.com nginx rewrites it as http://www.example.com.
I did this because the site used to run SSL sitewide; now SSL remains only on the initial server under a subdomain, so it is https://subdomain.example.com. Search engines, old bookmarks, and other old links still attempt to take the user to https://example.com.
In all Browsers this works like a charm, except in firefox.
The Problem: Firefox takes the users request of http://example.com and forwards them to https://subdomain.example.com.
And then, from a search-engine link that reads https://example.com, an SSL error is raised because the browser is being served subdomain.example.com's certificate.
I'm getting confused, and now it's 4:30 in the morning. Does someone have any clues here?
Here's my nginx conf:
upstream thin_server {
    server 0.0.0.0:8080 fail_timeout=0;
}

server {
    listen 80 default;
    listen 443 ssl;
    ssl off;

    root /home/example/public;
    server_name example.com www.example.com;

    ssl_certificate /etc/nginx/ssl/www.example.com.chained.crt;
    ssl_certificate_key /etc/nginx/ssl/example.key;

    index index.htm index.html;

    if ($host = 'example.com') {
        rewrite ^/(.*)$ http://www.example.com/$1;
    }

    location / {
        try_files $uri/index.html $uri.html $uri @app;
    }

    location ~* ^.+\.(jpg|jpeg|gif|png|ico|css|zip|tgz|gz|rar|bz2|doc|xls|exe|pdf|ppt|txt|tar|mid|midi|wav|bmp|rtf|js|mp3|flv|mpeg|avi)$ {
        try_files $uri @app;
    }

    location @app {
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header Host $http_host;
        proxy_redirect off;
        proxy_pass http://thin_server;
    }

    error_page 500 502 503 504 /500.html;
    client_max_body_size 4G;
    keepalive_timeout 10;
}
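As an aside, the `if ($host = 'example.com')` rewrite inside the combined HTTP/HTTPS server block is the fragile part of this config; the pattern nginx's own documentation favours is a separate server block that does nothing but redirect:

```nginx
# Dedicated redirect block: bare domain -> www, preserving the request URI
server {
    listen 80;
    server_name example.com;
    return 301 http://www.example.com$request_uri;
}
```

This avoids `if` entirely and makes the redirect behaviour the same in every browser, since it no longer depends on how the combined block matches the request.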
UPDATE: It just started working randomly a couple of days later.
I had a similar issue: Chrome was working fine, but IE and Firefox did not work with the HTTP-to-HTTPS redirect.
I searched for a day and built various configurations, but nothing helped.
By accident I checked my firewall (ufw status) and realized that port 80 was not open, only 443.
After allowing port 80, it worked.
Here is my nginx config, which is working (I know it is not optimized):
# Redirect http to https
server {
    listen 80 default_server;
    listen [::]:80 default_server;
    server_name domain.tl www.domain.tl *.domain.tl;
    return 301 https://www.domain.tl$request_uri;
}
#HTTPS config for SSL with certificate
server {
    listen 443 ssl;
    listen [::]:443 ssl;
    server_name www.domain.tl www.domain.tl;

    # Limited ciphers to avoid MD5 etc. attacks
    ssl_ciphers 'EECDH+AESGCM:EDH+AESGCM:AES256+EECDH:AES256+EDH';
    ssl_prefer_server_ciphers on;
    ssl_session_cache shared:SSL:10m;

    # Limit to TLSv1.2 for security
    ssl_protocols TLSv1.2;

    # Chained certificate to make sure the intermediate is in
    ssl_certificate /etc/nginx/ssl/certificate.chain.crt;
    ssl_certificate_key /etc/nginx/ssl/certificat_key.key;

    # PHP, WordPress etc. config
    root /var/www/html;
    index index.php index.html index.htm;

    # unless the request is for a valid file, send to bootstrap
    location / {
        # First attempt to serve request as file, then
        # as directory, then fall back to displaying a 404.
        #try_files $uri $uri/ =404;

        # Rewrite rule for WordPress
        try_files $uri $uri/ /index.php?$args;
    }

    # PHP7 specific
    location ~ \.php$ {
        try_files $uri =404;
        #fastcgi_pass 127.0.0.1:9000;
        # With php5-fpm:
        #fastcgi_split_path_info ^(.+\.php)(/.+)$;
        fastcgi_pass unix:/var/run/php/php7.0-fpm.sock;
        fastcgi_index index.php;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        include fastcgi_params;
    }

    # OLD CONFIG for php5
    #location ~ \.php$ {
    #    try_files $uri =404;
    #    fastcgi_split_path_info ^(.+\.php)(/.+)$;
    #    fastcgi_pass unix:/var/run/php5-fpm.sock;
    #    fastcgi_index index.php;
    #    include fastcgi_params;
    #}
}
