Laravel 403 forbidden error in Ubuntu LEMP stack

I have installed Laravel in the "/var/www" folder, but I am getting a 403 Forbidden error and the folders under "/var/www" have a lock icon. How can I fix this problem?
My settings are like this:
# Don't use them in a production server
root /var/www/html;
# Add index.php to the list if you are using PHP
index index.php index.html index.htm;
server_name 192.168.1.6;
location / {
# First attempt to serve request as file, then
# as directory, then fall back to displaying a 404.
try_files $uri $uri/ /index.php?$query_string;
}
# pass PHP scripts to FastCGI server
#
location ~ \.php$ {
include snippets/fastcgi-php.conf;
#
# # With php-fpm (or other unix sockets):
fastcgi_pass unix:/var/run/php/php7.0-fpm.sock;
# # With php-cgi (or other tcp sockets):
# fastcgi_pass 127.0.0.1:9000;
}
# deny access to .htaccess files, if Apache's document root
# concurs with nginx's one
#
location ~ /\.ht {
deny all;
}
}

If your Laravel installation is under /var/www/html, then you will need to update your nginx conf file to root /var/www/html/public;. Your configuration file looks OK to me except for the public part.
Laravel's index.php is located in the public folder.
# Don't use them in a production server
root /var/www/html/public; # This line.
# Add index.php to the list if you are using PHP
index index.php index.html index.htm;
server_name 192.168.1.6;
location / {
# First attempt to serve request as file, then
# as directory, then fall back to displaying a 404.
try_files $uri $uri/ /index.php?$query_string;
}
# pass PHP scripts to FastCGI server
#
location ~ \.php$ {
include snippets/fastcgi-php.conf;
#
# # With php-fpm (or other unix sockets):
fastcgi_pass unix:/var/run/php/php7.0-fpm.sock;
# # With php-cgi (or other tcp sockets):
# fastcgi_pass 127.0.0.1:9000;
}
# deny access to .htaccess files, if Apache's document root
# concurs with nginx's one
#
location ~ /\.ht {
deny all;
}
}
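If the 403 persists after pointing root at public, the lock icons on the folders usually point to a permissions problem: the nginx/PHP-FPM user cannot read the files. A minimal sketch of the usual fix, assuming the app lives in /var/www/html and the web server runs as www-data:
# Test the updated configuration and reload nginx
sudo nginx -t && sudo systemctl reload nginx
# Let the web server own the files, and make Laravel's writable directories writable
sudo chown -R www-data:www-data /var/www/html
sudo chmod -R 755 /var/www/html
sudo chmod -R 775 /var/www/html/storage /var/www/html/bootstrap/cache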

Related

Multiple Laravel Applications Using Nginx - Windows

I have two different Laravel applications on my server machine.
They are located at:
D:/APPLICATION/application1
and
D:/APPLICATION/application2
Below is my nginx.conf content:
server {
listen 80;
server_name localhost;
location / {
root "D:/APPLICATION/application1/public";
try_files $uri $uri/ /index.php?$query_string;
index index.php index.html index.htm;
location ~ \.php$ {
try_files $uri /index.php =404;
fastcgi_pass 127.0.0.1:9000;
fastcgi_index index.php;
fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
include fastcgi_params;
}
}
location ^~ /application2 {
alias "D:/APPLICATION/application2/public";
try_files $uri $uri/ /index.php?$query_string;
index index.php index.html index.htm;
location ~ \.php$ {
try_files $uri /index.php =404;
fastcgi_pass 127.0.0.1:9000;
fastcgi_index index.php;
fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
include fastcgi_params;
}
}
}
If I browse http://x.x.x.x/, my first Laravel web application comes out perfectly.
But if I browse http://x.x.x.x/application2, I get "No input file specified."
Am I missing anything here?
For Windows, use fastcgi_pass 127.0.0.1:9000 instead of a unix socket.
Please make sure your php-cgi is running. If not, you can start it like this:
1. Open a command prompt.
2. Go to the path of the php-cgi file (e.g. C:\php-7.3.11, where you'll find php-cgi.exe).
3. Run php-cgi.exe -b 127.0.0.1:9000
Nginx configuration with the rewrite module:
# Nginx.conf
# App 1(Path: D:/APPLICATION/application1, Url: http://localhost)
# App 2(Path: D:/APPLICATION/application2, Url: http://localhost/application2)
server {
# Listening port and host address
# If using 443, make sure to include the ssl configuration as well.
listen 80;
listen [::]:80;
server_name localhost;
# Default index pages
index index.php;
# Root for / project
root "D:/APPLICATION/application1/public";
# Handle main root / project
location / {
#deny all;
try_files $uri $uri/ /index.php?$args;
}
# Handle application2 project
location /application2 {
# Root for this project
root "D:/APPLICATION/application2/public";
# Rewrite $uri=/application2/xyz back to just $uri=/xyz
rewrite ^/application2/(.*)$ /$1 break;
# Try to serve the static file at $uri or $uri/
# Else try /index.php (which will hit location ~\.php$ below)
try_files $uri $uri/ /index.php?$args;
}
# Handle all *.php locations (which will always be just /index.php)
# via FastCGI (PHP-FPM)
location ~ \.php$ {
# We don't want to pass /application2/xyz to PHP-FPM; we want just /xyz passed to the fastcgi REQUEST_URI below.
# So laravel route('/xyz') responds to /application2/xyz as you would expect.
set $newurl $request_uri;
if ($newurl ~ ^/application2(.*)$) {
set $newurl $1;
root "D:/APPLICATION/application2/public";
}
# Pass all PHP requests to FastCGI
fastcgi_split_path_info ^(.+\.php)(/.+)$;
# Use php-cgi over TCP rather than a PHP-FPM unix socket
fastcgi_pass 127.0.0.1:9000;
fastcgi_index index.php;
include fastcgi_params;
# Here we are telling PHP-FPM to use the updated route we've created so it properly
# responds to Laravel routes.
fastcgi_param REQUEST_URI $newurl;
fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
}
# Deny .ht* access
location ~ /\.ht {
deny all;
}
}
Note: When we're using a session-based Laravel setup, all the route generator functions (url(), route()) use the hostname localhost as the root URL, not localhost/application2. To resolve this issue, make the following changes in the Laravel app.
Define APP_URL in the .env file as APP_URL="localhost/application2"
Go to the RouteServiceProvider, which is located at app/Providers/RouteServiceProvider.php, and force Laravel to use APP_URL as the root URL for your app.
public function boot()
{
parent::boot();
// Add following lines to force laravel to use APP_URL as root url for the app.
$strBaseURL = $this->app['url'];
$strBaseURL->forceRootUrl(config('app.url'));
}
Update: Make sure to run php artisan config:clear or php artisan config:cache to load the updated value of APP_URL.
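For reference, both commands are run from the Laravel project root:
php artisan config:clear
# or rebuild the cached configuration instead:
php artisan config:cache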
For Linux systems: Nginx: Serve multiple Laravel apps with same url but two different sub locations in Linux

Tuning Laravel+Nginx in Debian. Routing

Even this simple route in the web.php file does not work; it returns 404:
Route::get('foo', function () {
return 'Hello World';
});
My URL is http://localhost/project/public/foo
Laravel 5.4
Here is my /etc/nginx/sites-available/default file:
# Default server configuration
#
server {
listen 80 default_server;
listen [::]:80 default_server;
root /srv/www;
# Add index.php to the list if you are using PHP
index index.php index.html index.htm index.nginx-debian.html;
server_name _;
location / {
# First attempt to serve request as file, then
# as directory, then fall back to displaying a 404.
try_files $uri $uri/ =404;
}
location ~ \.php$ {
include /etc/nginx/fastcgi.conf;
#
# # With php-fpm (or other unix sockets):
fastcgi_pass unix:/var/run/php/php7.0-fpm.sock;
# # With php-cgi (or other tcp sockets):
# fastcgi_pass 127.0.0.1;
fastcgi_index index.php;
}
}
Any help would be appreciated.
The root is pointing at the wrong directory for Laravel.
It should be /srv/www/project/public
You need root /srv/www/project/public/; and try_files $uri $uri/ /index.php$is_args$args;
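Applied to the default file above, the relevant part of the server block would look roughly like this (a sketch only, assuming the project really lives under /srv/www/project):
root /srv/www/project/public;
index index.php index.html index.htm;
location / {
# Fall back to Laravel's front controller instead of returning a bare 404
try_files $uri $uri/ /index.php$is_args$args;
}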

Nginx: two server blocks, one by domain and another one by IP

I am trying to create two instances on my Nginx server.
The first would be accessed by
mydomain.com (listening on port 80)
and the second by
172.32.32.123:81 (listening on port 81; this IP is the server IP).
This is my default file:
server {
listen 80 default_server;
listen [::]:80 default_server;
root /var/www/html;
# Add index.php to the list if you are using PHP
index index.php index.html index.htm index.nginx-debian.html;
server_name mydomain.com;
location / {
try_files $uri $uri/ =404;
}
location ~ \.php$ {
include snippets/fastcgi-php.conf;
fastcgi_pass unix:/run/php/php7.1-fpm.sock;
}
location ~ /\.ht {
deny all;
}
}
server {
listen 81;
server_name 172.32.32.123:81;
root /var/www/html/root;
index index.html index.php;
set $MAGE_MODE developer; # or production or developer
set $MAGE_ROOT /var/www/html/root/;
location / {
try_files $uri $uri/ =404;
}
location ~ \.php$ {
include snippets/fastcgi-php.conf;
#
# # With php7.0-cgi alone:
# fastcgi_pass 127.0.0.1:9000;
# # With php7.0-fpm:
fastcgi_pass unix:/run/php/php7.1-fpm.sock;
}
include /var/www/html/root/nginx.conf.sample;
}
}
The server block using the domain name is working fine, but for the IP-based one only the home page works; on inner pages we are getting a 404 error.
It's unclear what's supposed to happen — what is the correct path that's supposed to handle one server versus the other?!
If they're supposed to have the same underlying files, then your root directives are quite suspicious — one is simply /var/www/html/, the other one is /var/www/html/root/ — is that intentional?
Otherwise, in case of a 404 error, the underlying path names (that aren't found) should be mentioned within the file specified by http://nginx.org/r/error_log, which will likely reveal what is up — do those returning 404 actually exist on the disc?!
This would be better moved to Server Fault.
While I'm not sure about it, the inclusion of the port in the server_name directive looks suspicious to me.
This looks dangerous:
include /var/www/html/root/nginx.conf.sample
If /var/www/html/root, and the directories above it are only writeable by root, it might be OK. Otherwise it's likely to be a root exploit for whichever user can write to the file. Copy the file to somewhere safe, and include it at that location.
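A sketch of what that could look like, assuming /etc/nginx/snippets exists and is root-owned (the magento.conf name is just an arbitrary choice):
sudo cp /var/www/html/root/nginx.conf.sample /etc/nginx/snippets/magento.conf
# then reference the copy from the server block instead:
# include /etc/nginx/snippets/magento.conf;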
Solved the issue by using the following default config:
server {
listen 80 default_server;
listen [::]:80 default_server;
# SSL configuration
#
# listen 443 ssl default_server;
# listen [::]:443 ssl default_server;
#
# Note: You should disable gzip for SSL traffic.
# See: https://bugs.debian.org/773332
#
# Read up on ssl_ciphers to ensure a secure configuration.
# See: https://bugs.debian.org/765782
#
# Self signed certs generated by the ssl-cert package
# Don't use them in a production server!
#
# include snippets/snakeoil.conf;
root /var/www/html;
# Add index.php to the list if you are using PHP
index index.php index.html index.htm index.nginx-debian.html;
server_name magedev.com;
location / {
# First attempt to serve request as file, then
# as directory, then fall back to displaying a 404.
#try_files $uri $uri/ =404;
index index.html index.php;
#try_files $uri $uri/ #handler;
#try_files $uri $uri/ /index.php;
expires 30d;
}
# pass the PHP scripts to FastCGI server listening on 127.0.0.1:9000
#
location ~ \.php$ {
#include fastcgi_params;
include snippets/fastcgi-php.conf;
#
# # With php7.0-cgi alone:
# fastcgi_pass 127.0.0.1:9000;
# # With php7.0-fpm:
fastcgi_pass unix:/run/php/php7.1-fpm.sock;
#fastcgi_index index.php;
#fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
}
# deny access to .htaccess files, if Apache's document root
# concurs with nginx's one
#
location ~ /\.ht {
deny all;
}
#include /var/www/html/root/nginx.conf.sample;
}
server {
listen 81;
server_name 192.87.123.132;
root /var/www/html/root;
index index.html index.php;
set $MAGE_MODE developer; # or production or developer
set $MAGE_ROOT /var/www/html/root/;
# Inclusion of try_files solved the issue for me
location / {
try_files $uri $uri/ /index.php?$args;
autoindex on;
autoindex_exact_size off;
}
location ~ \.php$ {
include snippets/fastcgi-php.conf;
#
# # With php7.0-cgi alone:
# fastcgi_pass 127.0.0.1:9000;
# # With php7.0-fpm:
fastcgi_pass unix:/run/php/php7.1-fpm.sock;
}
include /var/www/html/root/nginx.conf.sample;
}
Inclusion of try_files solved the issue for me:
location / {
try_files $uri $uri/ /index.php?$args;
autoindex on;
autoindex_exact_size off;
}

Nginx Laravel Php double request per request

I'm having a problem with my Laravel 5.1 site deployed on an Nginx server with PHP 7 on Ubuntu. When I make a request for a route, the request is duplicated. I noticed it when I tried out a function to prevent duplicate requests by generating a new token (like a CSRF token) for each request.
EDIT
I got to the bottom of the problem, and it was the most unthinkable cause for me: an img tag using asset() caused the double request on nginx/Ubuntu.
I found it after more debugging with the logger and googling the problem for Laravel, where someone suggested it could be a favicon issue, so I started removing HTML and found the culprit.
The image apparently didn't exist.
When I call Log::info("test") in a function in any controller, I get the following:
[2017-08-03 19:46:39] local.INFO: test
[2017-08-03 19:46:39] local.INFO: test
I don't have this issue on my local WAMP apache server though.
Sites-available config (symlinked to sites-enabled):
server {
listen 80;
listen [::]:80;
server_name mysite.com www.mysite.com;
return 301 https://$server_name$request_uri;
}
server {
# SSL configuration
listen 443 ssl; # managed by Certbot
listen [::]:443 ssl;
ssl_certificate /etc/letsencrypt/live/mysite.com/fullchain.pem; # managed by Certbot
ssl_certificate_key /etc/letsencrypt/live/mysite.com/privkey.pem; # managed by Certbot
include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
error_page 401 403 404 /404.html;
#
# Note: You should disable gzip for SSL traffic.
# See: https://bugs.debian.org/773332
#
# Read up on ssl_ciphers to ensure a secure configuration.
# See: https://bugs.debian.org/765782
#
# Self signed certs generated by the ssl-cert package
# Don't use them in a production server!
#
# include snippets/snakeoil.conf;
root /var/www/mysite/site/public;
# Add index.php to the list if you are using PHP
index index.php index.html index.htm index.nginx-debian.html;
server_name mysite.com www.mysite.com;
location / {
# First attempt to serve request as file, then
# as directory, then fall back to displaying a 404.
try_files $uri $uri/ /index.php?$query_string;
}
# pass the PHP scripts to FastCGI server listening on 127.0.0.1:9000
#
location ~ \.php$ {
include snippets/fastcgi-php.conf;
#
# # With php7.0-cgi alone:
# fastcgi_pass 127.0.0.1:9000;
# # With php7.0-fpm:
fastcgi_pass unix:/run/php/php7.0-fpm.sock;
}
# deny access to .htaccess files, if Apache's document root
# concurs with nginx's one
#
location ~ /\.ht {
deny all;
}
location /pma {
auth_basic "Auth required";
auth_basic_user_file /etc/nginx/pma_pass;
root /var/www/html;
index index.php index.html index.htm;
location ~ ^/pma/(.+\.php)$ {
try_files $uri =404;
fastcgi_pass unix:/var/run/php/php7.0-fpm.sock;
include fastcgi_params;
fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
}
location ~* ^/pma/(.+\.(jpg|jpeg|gif|css|png|js|ico|html|xml|txt))$ {
root /var/www/html;
}
}
}

url\whatever.php nginx 404 error instead of laravel 404 error

I have installed Laravel with Nginx on Ubuntu. Everything is working fine so far except for one problem: when a user enters a URL like domain.com/user/whatever.php, nginx responds with its own 404 error page instead of showing the Laravel 404 page.
What am I missing in my nginx config?
My nginx config file:
server {
listen 80;
server_name ip domain.com www.domain.com;
return 301 https://domain.com$request_uri;
}
server {
listen 443 ssl http2;
server_name ip www.domain.com;
return 301 $scheme://domain.com$request_uri;
}
# Default server configuration
#
server {
#listen 80 default_server;
#server_name ip domain.com www.domain.com;
#listen [::]:80 default_server;
# SSL configuration
#
listen 443 ssl default_server;
listen [::]:443 ssl default_server;
root /var/www/domain/public;
# Add index.php to the list if you are using PHP
index index.php index.html index.htm index.nginx-debian.html;
server_name domain.com;
location / {
# First attempt to serve request as file, then
# as directory, then fall back to displaying a 404.
# try_files $uri $uri/ =404;
try_files $uri $uri/ /index.php?$query_string;
}
# pass the PHP scripts to FastCGI server listening on 127.0.0.1:9000
#
location ~ \.php$ {
include snippets/fastcgi-php.conf;
#
# # With php7.0-cgi alone:
# fastcgi_pass 127.0.0.1:9000;
# # With php7.0-fpm:
fastcgi_pass unix:/run/php/php7.0-fpm.sock;
}
# deny access to .htaccess files, if Apache's document root
# concurs with nginx's one
#
location ~ /\.ht {
deny all;
}
location ~ /.well-known {
allow all;
}
}
As I understand it, if a request ends with ".php", nginx tries to send it to the PHP engine. Then, if the file doesn't exist, the PHP engine throws a 404 at the nginx level. You should catch that and redirect it to the PHP engine again:
server {
listen 80;
listen [::]:80;
root /var/www/path/public;
# Add index.php to the list if you are using PHP
index index.php index.html index.htm index.nginx-debian.html;
server_name domain.com www.domain.com;
location / {
# First attempt to serve request as file, then
# as directory, then fall back to displaying a 404.
# try_files $uri $uri/ =404;
try_files $uri $uri/ /index.php?$query_string;
}
location ~ \.php$ {
try_files $uri @missing;
# regex to split $uri into $fastcgi_script_name and $fastcgi_path_info
fastcgi_split_path_info ^(.+\.php)(/.+)$;
fastcgi_pass unix:/run/php/php7.0-fpm.sock;
# Bypass the fact that try_files resets $fastcgi_path_info
# see: http://trac.nginx.org/nginx/ticket/321
set $path_info $fastcgi_path_info;
fastcgi_param PATH_INFO $path_info;
fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
fastcgi_index index.php;
include fastcgi.conf;
}
location @missing {
rewrite ^ /error/404 break;
try_files $uri $uri/ /index.php?$query_string;
}
}
Use the pretty URLs nginx config from the Laravel docs:
location / {
try_files $uri $uri/ /index.php?$query_string;
}
Source: https://laravel.com/docs/5.4/installation#pretty-urls
You also need to ensure you have a 404 error page set up under resources/views/errors/404.blade.php, as mentioned in the docs:
Laravel makes it easy to display custom error pages for various HTTP status codes. For example, if you wish to customize the error page for 404 HTTP status codes, create a resources/views/errors/404.blade.php. This file will be served on all 404 errors generated by your application.
Read further here: https://laravel.com/docs/5.4/errors#http-exceptions
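A minimal placeholder for that view is enough to confirm it is being served (the markup here is only an illustration):
<!-- resources/views/errors/404.blade.php -->
<h1>Sorry, the page you are looking for could not be found.</h1>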
Replace this:
try_files $uri $uri/ /index.php?$query_string;
with:
try_files $uri $uri/ /index.php$is_args$args;
After that, run:
sudo service nginx restart
