I'm having a problem with my Laravel 5.1 site, which is deployed on an NGINX server with PHP 7 on Ubuntu. When I make a request for a route, the request is duplicated. I noticed it when I tried out a function I had written to prevent duplicate requests by generating a new token for each request, similar to a CSRF token.
EDIT
I got to the bottom of the problem, and it was the most unthinkable thing for me: an
img tag using asset() caused the double request on nginx/Ubuntu.
I found it after more debugging with the logger and googling the problem for Laravel, where someone suggested it could be a favicon issue, so I started removing HTML and found the culprit.
This image apparently didn't exist.
""
When I try Log::info("test") in a function in any controller I get the following:
[2017-08-03 19:46:39] local.INFO: test
[2017-08-03 19:46:39] local.INFO: test
I don't have this issue on my local WAMP (Apache) server, though.
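To confirm where the duplication comes from (the browser, nginx, or the app), it can help to compare laravel.log against a dedicated nginx access log. A small sketch, assuming the default Ubuntu layout; the log_format line belongs in the http block of /etc/nginx/nginx.conf, the access_log line in the server block below:
log_format dup_debug '$time_local "$request" referer="$http_referer" ua="$http_user_agent"';
access_log /var/log/nginx/dup_debug.log dup_debug;
Two access-log entries for the page URL point at the client (an empty or self-referencing src can cause the page itself to be requested twice), while one access-log entry with two laravel.log lines points at something inside the stack.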
My sites-available config (symlinked into sites-enabled):
server {
listen 80;
listen [::]:80;
server_name mysite.com www.mysite.com;
return 301 https://$server_name$request_uri;
}
server {
# SSL configuration
listen 443 ssl; # managed by Certbot
listen [::]:443 ssl;
ssl_certificate /etc/letsencrypt/live/mysite.com/fullchain.pem; # managed by Certbot
ssl_certificate_key /etc/letsencrypt/live/mysite.com/privkey.pem; # managed by Certbot
include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
error_page 401 403 404 /404.html;
#
# Note: You should disable gzip for SSL traffic.
# See: https://bugs.debian.org/773332
#
# Read up on ssl_ciphers to ensure a secure configuration.
# See: https://bugs.debian.org/765782
#
# Self signed certs generated by the ssl-cert package
# Don't use them in a production server!
#
# include snippets/snakeoil.conf;
root /var/www/mysite/site/public;
# Add index.php to the list if you are using PHP
index index.php index.html index.htm index.nginx-debian.html;
server_name mysite.com www.mysite.com;
location / {
# First attempt to serve request as file, then
# as directory, then fall back to displaying a 404.
try_files $uri $uri/ /index.php?$query_string;
}
# pass the PHP scripts to FastCGI server listening on 127.0.0.1:9000
#
location ~ \.php$ {
include snippets/fastcgi-php.conf;
#
# # With php7.0-cgi alone:
# fastcgi_pass 127.0.0.1:9000;
# # With php7.0-fpm:
fastcgi_pass unix:/run/php/php7.0-fpm.sock;
}
# deny access to .htaccess files, if Apache's document root
# concurs with nginx's one
#
location ~ /\.ht {
deny all;
}
location /pma {
auth_basic "Auth required";
auth_basic_user_file /etc/nginx/pma_pass;
root /var/www/html;
index index.php index.html index.htm;
location ~ ^/pma/(.+\.php)$ {
try_files $uri =404;
fastcgi_pass unix:/var/run/php/php7.0-fpm.sock;
include fastcgi_params;
fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
}
location ~* ^/pma/(.+\.(jpg|jpeg|gif|css|png|js|ico|html|xml|txt))$ {
root /var/www/html;
}
}
}
Related
I'm trying to get a Laravel application called Acellemail to work with nginx. It has issues with uploaded content: there seems to be some kind of encoding in the public URL, and I can't figure out how to write an nginx rewrite rule for it.
So the URL looks like this,
https://app.example.com/assets/YXBwL3RlbXBsYXRlcy82M2RmOWQ5ODAzZmEzL2ltZw/welcome-5.png
Actual path to the file on the server is this,
/var/www/app.example.com/html/storage/app/templates/63df9d9803fa3/img/welcome-5.png
Each time I upload a new template, it creates a new folder inside /var/www/app.example.com/html/storage/app/templates,
and the YXBwL3RlbXBsYXRlcy82M2RmOWQ5ODAzZmEzL2ltZw portion of the public URL changes. Nginx returns 404 errors on these URLs.
Nginx config:
server {
server_name app.example.com;
root /var/www/app.example.com/html/public;
index index.php;
location / {
try_files $uri $uri/ /index.php?$query_string;
}
location ~* \.(ico|css|js|gif|jpe?g|png)(\?[0-9]+)?$ {
expires max;
log_not_found off;
}
location ~ \.php$ {
include snippets/fastcgi-php.conf;
fastcgi_pass unix:/var/run/php/php8.0-fpm.sock;
fastcgi_param SCRIPT_FILENAME $realpath_root$fastcgi_script_name;
include fastcgi_params;
fastcgi_read_timeout 300;
}
location ~ /\.ht {
deny all;
}
listen 443 ssl; # managed by Certbot
ssl_certificate /etc/letsencrypt/live/app.example.com/fullchain.pem; # managed by Certbot
ssl_certificate_key /etc/letsencrypt/live/app.example.com/privkey.pem; # managed by Certbot
include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem; # managed by Certbot
}
server {
if ($host = app.example.com) {
return 301 https://$host$request_uri;
} # managed by Certbot
listen 80;
server_name app.example.com;
return 404; # managed by Certbot
}
web.php
https://pastebin.com/9HJaJF5v
Thanks for any help in advance!
I tried routing these with nginx rewrite rules, but that isn't ideal.
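A note on what is likely happening: the YXBwL3RlbXBsYXRlcy82M2RmOWQ5ODAzZmEzL2ltZw segment decodes from base64 to app/templates/63df9d9803fa3/img, which matches the storage path above, so these URLs are meant to be resolved by the application through index.php, not by nginx as files. In the config above, the static-file location (the ~* \.(ico|css|js|gif|jpe?g|png) block) matches welcome-5.png first, looks for it under public/, and returns 404 before Laravel ever sees the request; because of log_not_found off the miss never even shows up in the error log. A sketch of one way around that, assuming Acellemail really does serve /assets/... through its own route; untested against the app:
location ~* \.(ico|css|js|gif|jpe?g|png)$ {
# fall back to the front controller instead of returning 404,
# so application-served images like /assets/... still reach Laravel
try_files $uri /index.php?$query_string;
expires max;
log_not_found off;
}
Real files under public/ are still served and cached as before; only the misses are passed on to PHP.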
I have 6 Lumen projects which were working fine on Apache on CentOS. I just reconfigured the server to nginx. I was able to set up a single project in the nginx config, but I cannot figure out how to set up multiple directories; I've tried several configs and none of them work. Here is my nginx config.
PS: before marking this as a duplicate, please try to explain and help me fix this issue.
server {
root /var/www/domain.com/html/api/gateway/public;
index index.php index.html index.htm index.nginx-debian.html;
server_name domain.com www.domain.com;
location / {
autoindex on;
try_files $uri $uri/ /index.php?$query_string;
}
# location /search {
# autoindex on;
# root /var/www/domain.com/html/api/search/public;
# index index.php index.html index.htm index.nginx-debian.html;
# try_files $uri $uri/ /index.php?$query_string;
# }
location ~ \.php$ {
autoindex on;
autoindex_exact_size on;
fastcgi_split_path_info ^(.+\.php)(/.+)$;
fastcgi_pass unix:/var/run/php-fpm/php-fpm.sock;
fastcgi_index index.php;
fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
include fastcgi_params;
}
listen [::]:443 ssl ipv6only=on; # managed by Certbot
listen 443 ssl; # managed by Certbot
ssl_certificate /etc/letsencrypt/live/domain.com/fullchain.pem; # managed by Certbot
ssl_certificate_key /etc/letsencrypt/live/domain.com/privkey.pem; # managed by Certbot
include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem; # managed by Certbot
}
server {
if ($host = domain.com) {
return 301 https://$host$request_uri;
} # managed by Certbot
listen 80;
listen [::]:80;
server_name domain.com www.domain.com;
return 404; # managed by Certbot
}
This works fine, but when I uncomment the search config section it stops working; both projects then throw Forbidden or sometimes 404 errors.
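The commented-out /search block cannot work as written: root is always applied with the full URI appended, so /search/foo would be looked up at /var/www/domain.com/html/api/search/public/search/foo, and the PHP location further down still builds SCRIPT_FILENAME from the gateway root. A sketch of one commonly posted pattern for a second app under a path prefix, using alias plus its own PHP handler; the paths and the /search prefix are taken from the commented block, the rest is an assumption and untested against these projects:
location ^~ /search {
alias /var/www/domain.com/html/api/search/public;
try_files $uri $uri/ @search;
location ~ \.php$ {
include fastcgi_params;
# with alias, $request_filename already points at the right file,
# so don't build the path from $document_root here
fastcgi_param SCRIPT_FILENAME $request_filename;
fastcgi_pass unix:/var/run/php-fpm/php-fpm.sock;
}
}
location @search {
rewrite ^/search/(.*)$ /search/index.php?/$1 last;
}
Depending on how the Lumen apps compute their base path, the routes may also need the /search prefix. The more robust alternative is a separate server block per project (one sub-domain each), which avoids the alias and prefix juggling entirely.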
I have deployed my Laravel app to the following URL, proclubs.app - this is a domain from Google Domains that requires an SSL certificate (the SSL has been set up using Certbot).
I have set up the Laravel Breeze package for authentication (e.g. register/login functionality) and this all works fine when testing locally. Now that I have pushed it to the remote URL, none of the routes work and I just get a 404 Not Found message. I have run php artisan route:list and can see that all the expected routes are there. I am 99% certain I have made a mistake in the nginx server block - I used the default one that DigitalOcean provides in /etc/nginx/sites-available and edited it accordingly, but I'm not sure what is wrong that gives me these 404 errors. Can anyone suggest what I have done wrong?
server {
listen 80 default_server;
listen [::]:80 default_server ipv6only=on;
root /var/www/proclubs/public;
index index.php index.html index.htm;
# Laravel related only
add_header X-Frame-Options "SAMEORIGIN";
add_header X-Content-Type-Options "nosniff";
index index.php;
charset utf-8;
# Make site accessible from http://localhost/
server_name proclubs.app www.proclubs.app;
location / {
try_files $uri $uri/ /index.php?$query_string;
}
location = /favicon.ico { access_log off; log_not_found off; }
location = /robots.txt { access_log off; log_not_found off; }
error_page 404 /index.php;
error_page 500 502 503 504 /50x.html;
location = /50x.html {
root /usr/share/nginx/html;
}
location ~ \.php$ {
fastcgi_pass unix:/var/run/php/php7.4-fpm.sock;
fastcgi_param SCRIPT_FILENAME $realpath_root$fastcgi_script_name;
include fastcgi_params;
}
# deny access to .htaccess files, if Apache's document root
# concurs with nginx's one
#
location ~ /\.(?!well-known).* {
deny all;
}
}
# another virtual host using mix of IP-, name-, and port-based configuration
#
#server {
# listen 8000;
# listen somename:8080;
# server_name somename alias another.alias;
# root html;
# index index.html index.htm;
#
# location / {
# try_files $uri $uri/ =404;
# }
#}
# HTTPS server
#
#server {
# listen 443;
# server_name localhost;
#
# root html;
# index index.html index.htm;
#
# ssl on;
# ssl_certificate cert.pem;
# ssl_certificate_key cert.key;
#
# ssl_session_timeout 5m;
#
# ssl_protocols SSLv3 TLSv1 TLSv1.1 TLSv1.2;
# ssl_ciphers "HIGH:!aNULL:!MD5 or HIGH:!aNULL:!MD5:!3DES";
# ssl_prefer_server_ciphers on;
#
# location / {
# try_files $uri $uri/ =404;
# }
#}
server {
root /var/www/proclubs/public;
index index.php index.html index.htm;
# Make site accessible from http://localhost/
server_name proclubs.app www.proclubs.app; # managed by Certbot
location / {
# First attempt to serve request as file, then
# as directory, then fall back to displaying a 404.
try_files $uri $uri/ =404;
# Uncomment to enable naxsi on this location
# include /etc/nginx/naxsi.rules
}
error_page 404 /404.html;
error_page 500 502 503 504 /50x.html;
location = /50x.html {
root /usr/share/nginx/html;
}
location ~ \.php$ {
include snippets/fastcgi-php.conf;
fastcgi_pass unix:/run/php/php7.4-fpm.sock;
}
# deny access to .htaccess files, if Apache's document root
# concurs with nginx's one
#
#location ~ /\.ht {
# deny all;
#}
listen [::]:443 ssl ipv6only=on; # managed by Certbot
listen 443 ssl; # managed by Certbot
ssl_certificate /etc/letsencrypt/live/proclubs.app/fullchain.pem; # managed by Certbot
ssl_certificate_key /etc/letsencrypt/live/proclubs.app/privkey.pem; # managed by Certbot
include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem; # managed by Certbot
}
server {
if ($host = www.proclubs.app) {
return 301 https://$host$request_uri;
} # managed by Certbot
if ($host = proclubs.app) {
return 301 https://$host$request_uri;
} # managed by Certbot
listen 80 ;
listen [::]:80 ;
server_name proclubs.app www.proclubs.app;
return 404; # managed by Certbot
}
-- Expected behaviour --
When a user visits https://proclubs.app/login I expect to see the default Laravel Breeze login page.
-- Actual behaviour --
When I visit https://proclubs.app/login I just see an nginx 404 Not Found error instead, and no routes work.
P.S. I am confused why I have 3 server blocks, too...
Server - nginx/1.18.0 (Ubuntu 20.04) on a DigitalOcean LEMP droplet
If you get a 404, the requested path is probably being resolved the wrong way. Looking at your nginx configuration, you have two main servers: one for HTTP requests (the first server block) and one for HTTPS requests (the second).
When you request the /login path, nginx treats it as a request for a login folder, but in Laravel it is a route that has to be handled by the framework.
So the mistake is in the second (HTTPS) server: its location block only tries files and folders and never falls back to index.php. You should replace that location block with the one from the first server.
location / {
# First attempt to serve request as file, then
# as directory, then fall back to displaying a 404.
try_files $uri $uri/ =404;
# Uncomment to enable naxsi on this location
# include /etc/nginx/naxsi.rules
}
Change it to:
location / {
try_files $uri $uri/ /index.php?$query_string;
}
I am trying to create two instances on my nginx server.
The first would be accessed via
mydomain.com (it listens on port 80).
The second via
172.32.32.123:81 (it listens on port 81; this IP is the server's IP).
This is my default file:
server {
listen 80 default_server;
listen [::]:80 default_server;
root /var/www/html;
# Add index.php to the list if you are using PHP
index index.php index.html index.htm index.nginx-debian.html;
server_name mydomain.com;
location / {
try_files $uri $uri/ =404;
}
location ~ \.php$ {
include snippets/fastcgi-php.conf;
fastcgi_pass unix:/run/php/php7.1-fpm.sock;
}
location ~ /\.ht {
deny all;
}
}
server {
listen 81;
server_name 172.32.32.123:81;
root /var/www/html/root;
index index.html index.php;
set $MAGE_MODE developer; # or production or developer
set $MAGE_ROOT /var/www/html/root/;
location / {
try_files $uri $uri/ =404;
}
location ~ \.php$ {
include snippets/fastcgi-php.conf;
#
# # With php7.0-cgi alone:
# fastcgi_pass 127.0.0.1:9000;
# # With php7.0-fpm:
fastcgi_pass unix:/run/php/php7.1-fpm.sock;
}
include /var/www/html/root/nginx.conf.sample;
}
The server block using the domain name works fine, but for the IP-based one only the home page works; on inner pages we get 404 errors.
It's unclear what's supposed to happen — what is the correct path that's supposed to handle one server versus the other?!
If they're supposed to have the same underlying files, then your root directives are quite suspicious — one is simply /var/www/html/, the other one is /var/www/html/root/ — is that intentional?
Otherwise, in case of a 404 error, the underlying path names (that aren't found) should be mentioned within the file specified by http://nginx.org/r/error_log, which will likely reveal what is up — do those returning 404 actually exist on the disc?!
This would be better moved to ServerFault.
While I'm not sure of it, the inclusion of the port in the server_name directive looks suspicious to me.
This looks dangerous:
include /var/www/html/root/nginx.conf.sample
If /var/www/html/root, and the directories above it, are only writable by root, it might be OK. Otherwise it's likely to be a root exploit for whichever user can write to the file. Copy the file somewhere safe and include it from that location.
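A sketch of the layout that suggestion describes, assuming the file is Magento's nginx.conf.sample (the MAGE_MODE and MAGE_ROOT variables in the config point that way) and using an example target path:
# copy the vendor sample to a root-owned location outside the web root:
#   cp /var/www/html/root/nginx.conf.sample /etc/nginx/snippets/magento.sample.conf
# then include that copy from the server block instead of the file in the web root:
include /etc/nginx/snippets/magento.sample.conf;
That way a user who can write to the web root can no longer change a file that nginx parses as configuration at start-up.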
Solved the issue by using the following default config:
server {
listen 80 default_server;
listen [::]:80 default_server;
# SSL configuration
#
# listen 443 ssl default_server;
# listen [::]:443 ssl default_server;
#
# Note: You should disable gzip for SSL traffic.
# See: https://bugs.debian.org/773332
#
# Read up on ssl_ciphers to ensure a secure configuration.
# See: https://bugs.debian.org/765782
#
# Self signed certs generated by the ssl-cert package
# Don't use them in a production server!
#
# include snippets/snakeoil.conf;
root /var/www/html;
# Add index.php to the list if you are using PHP
index index.php index.html index.htm index.nginx-debian.html;
server_name magedev.com;
location / {
# First attempt to serve request as file, then
# as directory, then fall back to displaying a 404.
#try_files $uri $uri/ =404;
index index.html index.php;
#try_files $uri $uri/ #handler;
#try_files $uri $uri/ /index.php;
expires 30d;
}
# pass the PHP scripts to FastCGI server listening on 127.0.0.1:9000
#
location ~ \.php$ {
#include fastcgi_params;
include snippets/fastcgi-php.conf;
#
# # With php7.0-cgi alone:
# fastcgi_pass 127.0.0.1:9000;
# # With php7.0-fpm:
fastcgi_pass unix:/run/php/php7.1-fpm.sock;
#fastcgi_index index.php;
#fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
}
# deny access to .htaccess files, if Apache's document root
# concurs with nginx's one
#
location ~ /\.ht {
deny all;
}
#include /var/www/html/root/nginx.conf.sample;
}
server {
listen 81;
server_name 192.87.123.132;
root /var/www/html/root;
index index.html index.php;
set $MAGE_MODE developer; # or production or developer
set $MAGE_ROOT /var/www/html/root/;
# **Inclusion of try_files Solved the issue for me**
location / {
try_files $uri $uri/ /index.php?$args;
autoindex on;
autoindex_exact_size off;
}
location ~ \.php$ {
include snippets/fastcgi-php.conf;
#
# # With php7.0-cgi alone:
# fastcgi_pass 127.0.0.1:9000;
# # With php7.0-fpm:
fastcgi_pass unix:/run/php/php7.1-fpm.sock;
}
include /var/www/html/root/nginx.conf.sample;
}
Inclusion of try_files solved the issue for me:
location / {
try_files $uri $uri/ /index.php?$args;
autoindex on;
autoindex_exact_size off;
}
I am trying to solve a problem: I want to redirect all URLs starting with www to the non-www version of the site. In Chrome and Opera it works well.
But when I go to Firefox and open http://www.example.com, it starts downloading the page (MIME type application/octet-stream), and on https://www the connection is reported as not reliable. Other browsers set the MIME type to text/html.
server {
listen 80;
listen [::]:80;
server_name www.example.com;
return 301 https://example.com$request_uri;
}
server {
listen 80;
listen [::]:80;
root /var/www/domain/www;
index index.php;
server_name example.com;
return 301 https://$server_name$request_uri;
}
server {
root /var/www/domain/www;
# Add index.php to the list if you are using PHP
index index.php;
server_name example.com;
location / {
# First attempt to serve request as file, then
# as directory, then fall back to displaying a 404.
try_files $uri $uri/ /index.php?$args ;
}
# pass the PHP scripts to FastCGI server listening on 127.0.0.1:9000
#
location ~ \.php$ {
include snippets/fastcgi-php.conf;
fastcgi_pass unix:/var/run/php/php7.1-fpm.sock;
}
# deny access to .htaccess files, if Apache's document root
# concurs with nginx's one
#
location ~ /\.ht {
deny all;
}
listen 443 ssl; # managed by Certbot
ssl_certificate /etc/letsencrypt/live/example.com/fullchain.pem; # managed by Certbot
ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem; # managed by Certbot
include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem; # managed by Certbot
}
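One gap in the config above: nothing answers https://www.example.com by name. There is no 443 server with server_name www.example.com, so those requests land in the only TLS server block (the example.com one); if the certificate was issued for example.com alone, the browser warning follows. A sketch of the missing block, assuming the Let's Encrypt certificate also covers www.example.com (if it doesn't, re-issue it for both names first):
server {
listen 443 ssl;
listen [::]:443 ssl;
server_name www.example.com;
ssl_certificate /etc/letsencrypt/live/example.com/fullchain.pem; # assumed to include the www name
ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;
include /etc/letsencrypt/options-ssl-nginx.conf;
ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem;
# redirect the www name to the canonical host over HTTPS
return 301 https://example.com$request_uri;
}
The download on plain HTTP is a separate symptom; since browsers cache 301 redirects aggressively, it is worth re-testing in a clean profile or after clearing the cache once the server blocks are sorted out.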