Nginx CentOS 8: Map Multiple Lumen Projects to a Single Domain - laravel

I have 6 Lumen projects which were working fine on Apache on CentOS. I just reconfigured the server to Nginx. I was able to set up a single project in the Nginx config, but I cannot figure out how to set up multiple directories; I tried several configs, but none of them work. Here is my Nginx config:
PS: before marking this as a duplicate, please try to explain and help me fix this issue.
server {
root /var/www/domain.com/html/api/gateway/public;
index index.php index.html index.htm index.nginx-debian.html;
server_name domain.com www.domain.com;
location / {
autoindex on;
try_files $uri $uri/ /index.php?$query_string;
}
# location /search {
# autoindex on;
# root /var/www/domain.com/html/api/search/public;
# index index.php index.html index.htm index.nginx-debian.html;
# try_files $uri $uri/ /index.php?$query_string;
# }
location ~ \.php$ {
autoindex on;
autoindex_exact_size on;
fastcgi_split_path_info ^(.+\.php)(/.+)$;
fastcgi_pass unix:/var/run/php-fpm/php-fpm.sock;
fastcgi_index index.php;
fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
include fastcgi_params;
}
listen [::]:443 ssl ipv6only=on; # managed by Certbot
listen 443 ssl; # managed by Certbot
ssl_certificate /etc/letsencrypt/live/domain.com/fullchain.pem; # managed by Certbot
ssl_certificate_key /etc/letsencrypt/live/domain.com/privkey.pem; # managed by Certbot
include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem; # managed by Certbot
}
server {
if ($host = domain.com) {
return 301 https://$host$request_uri;
} # managed by Certbot
listen 80;
listen [::]:80;
server_name domain.com www.domain.com;
return 404; # managed by Certbot
}
This works fine, but when I uncomment the search config section it stops working; both locations throw a Forbidden or sometimes a 404 error.
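A likely cause, though not a guaranteed fix: `root` inside `location /search` appends the full URI, so a request for `/search/foo` is looked up at `.../api/search/public/search/foo`, which does not exist, hence the Forbidden/404. A sketch using `alias` (which strips the `/search` prefix) with a nested PHP handler, reusing the socket path from the config above:

```nginx
# a sketch, assuming the second project lives at .../api/search/public
location ^~ /search {
    alias /var/www/domain.com/html/api/search/public;
    # fall back to this project's front controller, not the gateway's
    try_files $uri $uri/ /search/index.php?$query_string;

    # PHP must be handled inside the aliased location so the script
    # path resolves against the alias, not the server-level root
    location ~ \.php$ {
        fastcgi_pass unix:/var/run/php-fpm/php-fpm.sock;
        fastcgi_param SCRIPT_FILENAME $request_filename;
        include fastcgi_params;
    }
}
```

Repeating this pattern per project (one `^~` location each) is one way to map several Lumen apps under a single domain.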

Related

Laravel routes aren't working for user-uploaded files

I'm trying to get a Laravel application called Acellemail to work with Nginx. It has issues with uploaded content. There seems to be some kind of encoding in the public URL, and I can't figure out how to write an Nginx rewrite rule for it.
So the URL looks like this,
https://app.example.com/assets/YXBwL3RlbXBsYXRlcy82M2RmOWQ5ODAzZmEzL2ltZw/welcome-5.png
The actual path to the file on the server is this:
/var/www/app.example.com/html/storage/app/templates/63df9d9803fa3/img/welcome-5.png
Each time I upload a new template, it creates a new folder inside /var/www/app.example.com/html/storage/app/templates
And part of the YXBwL3RlbXBsYXRlcy82M2RmOWQ5ODAzZmEzL2ltZw portion in the public URL changes. Nginx returns 404 errors on these URLs.
Nginx config,
server {
server_name app.example.com;
root /var/www/app.example.com/html/public;
index index.php;
location / {
try_files $uri $uri/ /index.php?$query_string;
}
location ~* \.(ico|css|js|gif|jpe?g|png)(\?[0-9]+)?$ {
expires max;
log_not_found off;
}
location ~ \.php$ {
include snippets/fastcgi-php.conf;
fastcgi_pass unix:/var/run/php/php8.0-fpm.sock;
fastcgi_param SCRIPT_FILENAME $realpath_root$fastcgi_script_name;
include fastcgi_params;
fastcgi_read_timeout 300;
}
location ~ /\.ht {
deny all;
}
listen 443 ssl; # managed by Certbot
ssl_certificate /etc/letsencrypt/live/app.example.com/fullchain.pem; # managed by Certbot
ssl_certificate_key /etc/letsencrypt/live/app.example.com/privkey.pem; # managed by Certbot
include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem; # managed by Certbot
}
server {
if ($host = app.example.com) {
return 301 https://$host$request_uri;
} # managed by Certbot
listen 80;
server_name app.example.com;
return 404; # managed by Certbot
}
web.php
https://pastebin.com/9HJaJF5v
Thanks for your help in advance!
I tried to route with Nginx rewrite rules, but it's not ideal.
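The changing segment looks like URL-safe Base64 of the storage path (an assumption, not verified against Acellemail's source), which suggests `/assets/...` is meant to be resolved by a Laravel route, not by an Nginx rewrite. If so, the static-file location in the config above intercepts `.png` URLs and returns 404 before PHP ever runs. A hedged tweak is to let missing files fall through to the front controller:

```nginx
# a sketch: cache real files, but if the requested image does not
# exist on disk, hand the URL to Laravel instead of returning 404
location ~* \.(ico|css|js|gif|jpe?g|png)$ {
    expires max;
    log_not_found off;
    try_files $uri /index.php?$query_string;
}
```

Real files under `public/` keep the long cache lifetime; the encoded asset URLs reach the application, which can decode the path and stream the file itself.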

Why are my static files not being cached properly?

I have a little problem caching my static files in a Laravel/Nginx environment.
I know that the static files are not being cached properly because Google PageSpeed tells me so, and other web services for cache checking also find no caching for these files. They are mostly images and fonts.
My Nginx config for the site includes expires and add_header and looks like this:
server {
root /var/www/websi.te/public;
index index.php;
server_name websi.te www.websi.te;
location ~ \.php$ {
fastcgi_pass unix:/var/run/php/php8.1-fpm.sock;
fastcgi_param SCRIPT_FILENAME $realpath_root$fastcgi_script_name;
include fastcgi_params;
}
location ~* .(?:css|js|svg|ttf|png|jpg)$ {
expires 1y;
add_header Cache-Control "public";
}
location / {
try_files $uri $uri/ /;
}
listen 443 ssl; # managed by Certbot
ssl_certificate /etc/letsencrypt/live/websi.te/fullchain.pem; # managed by Certbot
ssl_certificate_key /etc/letsencrypt/live/websi.te/privkey.pem; # managed by Certbot
include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem; # managed by Certbot
}
server {
if ($host = www.websi.te) {
return 301 https://$host$request_uri;
} # managed by Certbot
if ($host = websi.te) {
return 301 https://$host$request_uri;
} # managed by Certbot
server_name websi.te www.websi.te;
listen 80;
return 404; # managed by Certbot
}
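Two things stand out in the static-file block above (observations, not a guaranteed fix): the dot in `location ~* .(?:css|...)$` is unescaped, so it matches any character rather than a literal dot, and the extension list omits the woff/woff2 formats modern browsers actually request for fonts, so those responses never receive the Expires/Cache-Control headers. A hedged variant:

```nginx
# a sketch: escape the literal dot and include common font formats
location ~* \.(?:css|js|svg|ttf|otf|eot|woff2?|png|jpe?g|gif|ico)$ {
    expires 1y;
    add_header Cache-Control "public";
    access_log off;
}
```

After reloading Nginx, checking a font URL with `curl -I` should show the `Expires` and `Cache-Control` headers if the location now matches.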

Laradock - Remove port from the url not working

I've been searching a lot and found many answers, but unfortunately none of them works. My scenario is below:
In my project folder I have laradock and laravel folders. In the .env (inside laradock) I have:
NGINX_HOST_HTTP_PORT=8080
This is because my port 80 is occupied. Inside the nginx folder, in default.conf, I have (note that the code below is practically the default; I have tried many things here, but it didn't work):
server {
listen 80 default_server;
listen [::]:80 default_server ipv6only=on;
# For https
# listen 443 ssl default_server;
# listen [::]:443 ssl default_server ipv6only=on;
# ssl_certificate /etc/nginx/ssl/default.crt;
# ssl_certificate_key /etc/nginx/ssl/default.key;
server_name mam1.test;
root /var/www/laravelProject/public;
index index.php index.html index.htm;
location / {
try_files $uri $uri/ /index.php$is_args$args;
}
location ~ \.php$ {
try_files $uri /index.php =404;
fastcgi_pass php-upstream;
fastcgi_index index.php;
fastcgi_buffers 16 16k;
fastcgi_buffer_size 32k;
fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
#fixes timeouts
fastcgi_read_timeout 600;
include fastcgi_params;
}
location ~ /\.ht {
deny all;
}
location /.well-known/acme-challenge/ {
root /var/www/letsencrypt/;
log_not_found off;
}
}
Meanwhile, I have to edit the hosts file on my Mac and add: 127.0.0.1 laravelProject. If I access laravelProject in my browser it shows a message saying "it works", which is not what I want, but if I access laravelProject:8080 it works great. How do I remove the port? I have tried many things; the last one was this, but it still does not redirect to the correct URL.
How do I do this?
Regards
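Browsers only omit the port for 80 (HTTP) and 443 (HTTPS), so the container's published port, not the config inside it, decides whether `:8080` appears in the URL. If port 80 on the host is occupied by another web server, one option (a sketch, assuming that server is also Nginx) is to have it proxy the Laradock container:

```nginx
# a sketch for the host's existing Nginx on port 80:
# forward mam1.test to the Laradock container published on :8080
server {
    listen 80;
    server_name mam1.test;

    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

Otherwise, freeing port 80 and setting NGINX_HOST_HTTP_PORT=80 is the direct route.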

Laravel 403 Forbidden nginx/1.14.0 (Ubuntu) in Nginx Digital Ocean

I deployed my Laravel 5.8 project to DigitalOcean and it works fine at:
http://laravelproject.net
But I am using Azure AD and Socialite, and Azure AD does not allow http, only https.
/etc/nginx/sites-available/default
server {
listen 80 default_server;
listen [::]:80 default_server;
# SSL configuration
#
# listen 443 ssl default_server;
# listen [::]:443 ssl default_server;
#
# Note: You should disable gzip for SSL traffic.
# See: https://bugs.debian.org/773332
#
# Read up on ssl_ciphers to ensure a secure configuration.
# See: https://bugs.debian.org/765782
#
# Self signed certs generated by the ssl-cert package
# Don't use them in a production server!
#
# include snippets/snakeoil.conf;
root /var/www/html/laravelproject;
# Add index.php to the list if you are using PHP
# index index.php index.html index.htm;
# index index.php index.html index.htm index.nginx-debian.html;
server_name laravelproject.net;
location / {
# First attempt to serve request as file, then
# as directory, then fall back to displaying a 404.
try_files $uri $uri/ /index.php$is_args$args;
# try_files $uri $uri/ =404;
}
# pass PHP scripts to FastCGI server
#
location ~ \.php$ {
include snippets/fastcgi-php.conf;
#
# # With php-fpm (or other unix sockets):
fastcgi_pass unix:/var/run/php/php7.4-fpm.sock;
# # With php-cgi (or other tcp sockets):
# fastcgi_pass 127.0.0.1:9000;
}
# deny access to .htaccess files, if Apache's document root
# concurs with nginx's one
#
location ~ /\.ht {
deny all;
}
}
I also have:
/etc/nginx/sites-available/default
server {
listen 80;
listen [::]:80;
server_name laravelproject.net;
return 301 https://$server_name$request_uri;
}
server {
listen 443 ssl http2;
listen [::]:443 ssl http2;
server_name laravelproject.net;
root /var/www/html/peopleedge;
ssl_certificate /etc/letsencrypt/live/laravelproject.net/fullchain.pem;
ssl_certificate_key /etc/letsencrypt/live/laravelproject.net/privkey.pem;
ssl_protocols TLSv1.2;
ssl_ciphers ECDHE-RSA-AES256-GCM-SHA512:DHE-RSA-AES256-GCM-SHA512:ECDHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-SHA384;
ssl_prefer_server_ciphers on;
add_header X-Frame-Options "SAMEORIGIN";
add_header X-XSS-Protection "1; mode=block";
add_header X-Content-Type-Options "nosniff";
index index.php index.html index.htm index.nginx-debian.html;
charset utf-8;
location / {
try_files $uri $uri/ /index.php?$query_string;
}
location ~ \.php$ {
include snippets/fastcgi-php.conf;
fastcgi_pass unix:/var/run/php/php7.2-fpm.sock;
}
location ~ /\.ht {
deny all;
}
location ~ /.well-known {
allow all;
}
}
When I tried to run the project I got this error:
Laravel 403 Forbidden nginx/1.14.0 (Ubuntu)
How do I resolve it please?
Thank you.
I know it's late already, but for anyone else who may need it, the config below helped me resolve the issue:
server {
listen 80;
server_name yourip or domain;
root /var/www/html/public;
index index.php;
charset utf-8;
location / {
try_files $uri $uri/ /index.php?$query_string;
}
location = /favicon.ico { access_log off; log_not_found off; }
location = /robots.txt { access_log off; log_not_found off; }
error_page 404 /index.php;
location ~ \.php$ {
fastcgi_pass unix:/var/run/php/php7.4-fpm.sock;
fastcgi_param SCRIPT_FILENAME $realpath_root$fastcgi_script_name;
include fastcgi_params;
}
location ~ /\.(?!well-known).* {
deny all;
}
}
Also remember to reload Nginx with `sudo systemctl reload nginx`.
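The key difference in the working config above is that `root` points at the project's `public/` directory. In the question's server blocks, `root` points at the project root (`/var/www/html/laravelproject` and `/var/www/html/peopleedge`), and with no servable index and autoindex off, Nginx commonly answers 403. A minimal sketch of the fix, assuming the project lives at /var/www/html/laravelproject:

```nginx
# a sketch: Laravel must be served from its public/ directory
server {
    listen 443 ssl http2;
    server_name laravelproject.net;
    root /var/www/html/laravelproject/public;  # note the /public suffix
    index index.php;

    location / {
        try_files $uri $uri/ /index.php?$query_string;
    }

    location ~ \.php$ {
        include snippets/fastcgi-php.conf;
        fastcgi_pass unix:/var/run/php/php7.4-fpm.sock;
    }
}
```

Serving from `public/` also keeps `.env` and the rest of the application tree out of the web root.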

Firefox redirecting Nginx rewrite

Firefox is the only browser I am having issues with. I have found similar issues, but no solutions seem to work.
When I visit http://example.com, Nginx rewrites it to http://www.example.com.
I did this because the site used SSL sitewide; now SSL remains only on the initial server under a subdomain, https://subdomain.example.com. Search engines, old bookmarks, and other old links still attempt to take the user to https://example.com.
In all browsers this works like a charm, except Firefox.
The problem: Firefox takes the user's request for http://example.com and forwards them to https://subdomain.example.com.
And then, from the search engine link that reads https://example.com, an SSL error is raised because the browser is served subdomain.example.com's certificate.
I'm getting confused, and now it's 4:30 in the morning. Does someone have any clues?
Here's my nginx conf:
upstream thin_server {
server 0.0.0.0:8080 fail_timeout=0;
}
server {
listen 80 default;
listen 443 ssl;
ssl off;
root /home/example/public;
server_name example.com www.example.com;
ssl_certificate /etc/nginx/ssl/www.example.com.chained.crt;
ssl_certificate_key /etc/nginx/ssl/example.key;
index index.htm index.html;
if ($host = 'example.com') {
rewrite ^/(.*)$ http://www.example.com/$1;
}
location / {
try_files $uri/index.html $uri.html $uri #app;
}
location ~* ^.+\.(jpg|jpeg|gif|png|ico|css|zip|tgz|gz|rar|bz2|doc|xls|exe|pdf|ppt|txt|tar|mid|midi|wav|bmp|rtf|js|mp3|flv|mpeg|avi)$ {
try_files $uri #app;
}
location #app {
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header Host $http_host;
proxy_redirect off;
proxy_pass http://thin_server;
}
error_page 500 502 503 504 /500.html;
client_max_body_size 4G;
keepalive_timeout 10;
}
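One suspect in the config above (a hypothesis, not a confirmed diagnosis, given the behavior later resolved itself): mixing `listen 80` and `listen 443 ssl` in one server block together with `ssl off` and an `if`-based rewrite, which browsers that cache 301 redirects aggressively (Firefox among them) can turn into a sticky wrong redirect. A sketch that separates the listeners and uses `return` instead of `rewrite`:

```nginx
# a sketch: dedicated redirect block, then one HTTPS-capable block
server {
    listen 80;
    server_name example.com;
    return 301 http://www.example.com$request_uri;
}

server {
    listen 80;
    listen 443 ssl;
    server_name www.example.com;
    ssl_certificate     /etc/nginx/ssl/www.example.com.chained.crt;
    ssl_certificate_key /etc/nginx/ssl/example.key;
    root /home/example/public;
    # ... existing locations and proxy settings go here ...
}
```

Clearing the browser's cached redirects while testing helps rule out a stale 301.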
UPDATE: It just started working randomly a couple of days later.
I had a similar issue: Chrome was working fine, but IE and Firefox did not work with the http-to-https redirect.
I was searching for a day and built various configurations, but nothing helped.
By accident I checked my firewall (ufw status) and realized that port 80 was not open, only 443.
After allowing port 80, it worked.
Here is my Nginx config, which works (I know it is not optimized):
# Redirect http to https
server {
listen 80 default_server;
listen [::]:80 default_server;
server_name domain.tl www.domain.tl *.domain.tl;
return 301 https://www.domain.tl$request_uri;
}
#HTTPS config for SSL with certificate
server {
listen 443 ssl;
listen [::]:443 ssl;
server_name www.domain.tl www.domain.tl;
#Limited Cipers to avoid MD5 etc attacks
ssl_ciphers 'EECDH+AESGCM:EDH+AESGCM:AES256+EECDH:AES256+EDH';
ssl_prefer_server_ciphers on;
ssl_session_cache shared:SSL:10m;
#Limit to TLSv1.2 for security
ssl_protocols TLSv1.2;
#Chained certificate to make sure the intermediate is in
ssl_certificate /etc/nginx/ssl/certificate.chain.crt;
ssl_certificate_key /etc/nginx/ssl/certificat_key.key;
#PHP, Wordpress etc config
root /var/www/html;
index index.php index.html index.htm;
# unless the request is for a valid file, send to bootstrap
location / {
# First attempt to serve request as file, then
# as directory, then fall back to displaying a 404.
#try_files $uri $uri/ =404;
#Rewrite rule fuer Wordpress
try_files $uri $uri/ /index.php?$args;
}
# PHP7 specific
location ~ \.php$ {
try_files $uri =404;
#fastcgi_pass 127.0.0.1:9000;
# With php5-fpm:
#fastcgi_split_path_info ^(.+\.php)(/.+)$;
fastcgi_pass unix:/var/run/php/php7.0-fpm.sock;
fastcgi_index index.php;
fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
include fastcgi_params;
}
# OLD CONFIG for php5
# location ~ \.php$ {
# try_files $uri =404;
# fastcgi_split_path_info ^(.+\.php)(/.+)$;
# fastcgi_pass unix:/var/run/php5-fpm.sock;
# fastcgi_index index.php;
# include fastcgi_params;
#}
}
