I've set up nginx and imported CodeIgniter into my project, and I'm unable to resolve the following error. My server is running Ubuntu 16 and PHP 7. I've tried giving the files permissions, but still no solution. Does anyone know where the problem could be? Thanks.
My default nginx config file for the server:
server {
listen 80 default_server;
listen [::]:80 default_server;
# SSL configuration
#
# listen 443 ssl default_server;
# listen [::]:443 ssl default_server;
#
# Note: You should disable gzip for SSL traffic.
# See: https://bugs.debian.org/773332
#
# Read up on ssl_ciphers to ensure a secure configuration.
# See: https://bugs.debian.org/765782
#
# Self signed certs generated by the ssl-cert package
# Don't use them in a production server!
#
# include snippets/snakeoil.conf;
root /var/www/observum/site;
# Add index.php to the list if you are using PHP
index index.php index.html index.htm;
server_name 45.79.4.55;
location / {
# First attempt to serve request as file, then
# as directory, then fall back to displaying a 404.
try_files $uri $uri/ /index.php?$query_string;
}
# pass the PHP scripts to FastCGI server listening on 127.0.0.1:9000
#
location ~ \.php$ {
include snippets/fastcgi-php.conf;
#
# # With php7.0-cgi alone:
# fastcgi_pass 127.0.0.1:9000;
# # With php7.0-fpm:
fastcgi_pass unix:/run/php/php7.0-fpm.sock;
}
# deny access to .htaccess files, if Apache's document root
# concurs with nginx's one
#
location ~ /\.ht {
deny all;
}
}
# Virtual Host configuration for example.com
#
(Screenshot of the error)
Your CSS/JS paths are incorrect; they are being included from the 'site' directory.
They are currently included in your document as
http://45.79.4.55/site/libreriasJS/boceto/css/modern-business.css
but they should be
http://45.79.4.55/libreriasJS/boceto/css/modern-business.css
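If it helps to confirm this, a quick check with curl (assuming it is available) should show the difference; if the paths above are right, the first request returns 200 and the second 404:
curl -I http://45.79.4.55/libreriasJS/boceto/css/modern-business.css
curl -I http://45.79.4.55/site/libreriasJS/boceto/css/modern-business.css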
I was using Laravel Forge but stopped using it due to a problem with my card, so I continued on my own with DigitalOcean.
I followed the tutorial below to apply SSL to the site.
https://www.digitalocean.com/community/tutorials/how-to-secure-nginx-with-let-s-encrypt-on-ubuntu-20-04-es
This procedure produced some errors, which I later fixed:
For example:
maquino@codigobyte:/etc/nginx/sites-enabled$ sudo nginx -t
nginx: [warn] conflicting server name "todocontenidoweb.com" on 0.0.0.0:80, ignored
nginx: [warn] conflicting server name "www.todocontenidoweb.com" on 0.0.0.0:80, ignored
nginx: [warn] conflicting server name "todocontenidoweb.com" on [::]:80, ignored
nginx: [warn] conflicting server name "www.todocontenidoweb.com" on [::]:80, ignored
nginx: the configuration file /etc/nginx/nginx.conf syntax is ok
nginx: configuration file /etc/nginx/nginx.conf test is successful
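Those warnings mean the same server_name is declared in more than one enabled vhost file; a quick way to see where (assuming the standard Debian/Ubuntu layout):
# List every server_name declaration across the enabled vhosts
grep -Rn "server_name" /etc/nginx/sites-enabled/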
The issue is that my site is down. I'm already on the third day, and DigitalOcean support tells me that they don't see a problem in the configuration, but the site is still down.
The following page shows me a problem with the secure connection:
https://www.ssllabs.com/ssltest/analyze.html?d=todocontenidoweb.com
At this point there are several things that make me believe I have a problem in the following configuration:
In /etc/nginx/sites-enabled I have 3 files, and one of them is my site, todocontenidoweb.com.
In todocontenidoweb.com:
server {
listen 80;
listen [::]:80;
server_name todocontenidoweb;
server_tokens off;
root /home/forge/todocontenidoweb.com/public;
The server_name line only says todocontenidoweb; I understand that I should put todocontenidoweb.com www.todocontenidoweb.com there.
But it doesn't allow me: if I do that, I get a certificate error.
I would really appreciate it if someone could help me to solve this situation.
EDIT
File: /etc/nginx/sites-available/mydomain.com
# FORGE CONFIG (DO NOT REMOVE!)
#include forge-conf/todocontenidoweb.com/before/*;
server {
listen 443 ssl;
listen [::]:443 ssl;
include snippets/self-signed.conf;
include snippets/ssl-params.conf;
server_name todocontenidoweb.com www.todocontenidoweb.com;
server_tokens off;
root /home/forge/todocontenidoweb.com/public;
# FORGE SSL (DO NOT REMOVE!)
# ssl_certificate;
# ssl_certificate_key;
# ssl_protocols TLSv1.2 TLSv1.3;
# ssl_ciphers ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-CHACHA20-POLY13>
# ssl_prefer_server_ciphers off;
# ssl_dhparam /etc/nginx/dhparams.pem;
add_header X-Frame-Options "SAMEORIGIN";
add_header X-XSS-Protection "1; mode=block";
add_header X-Content-Type-Options "nosniff";
index index.html index.htm index.php;
charset utf-8;
# FORGE CONFIG (DO NOT REMOVE!)
#include forge-conf/todocontenidoweb.com/server/*;
location / {
try_files $uri $uri/ /index.php?$query_string;
}
location = /favicon.ico { access_log off; log_not_found off; }
location = /robots.txt { access_log off; log_not_found off; }
access_log off;
error_log /var/log/nginx/todocontenidoweb.com-error.log error;
error_page 404 /index.php;
location ~ \.php$ {
fastcgi_split_path_info ^(.+\.php)(/.+)$;
fastcgi_pass unix:/var/run/php/php7.4-fpm.sock;
fastcgi_index index.php;
include fastcgi_params;
}
location ~ /\.(?!well-known).* {
deny all;
}
}
server {
listen 80;
listen [::]:80;
server_name todocontenidoweb.com www.todocontenidoweb.com;
return 302 https://$server_name$request_uri;
}
# FORGE CONFIG (DO NOT REMOVE!)
#include forge-conf/todocontenidoweb.com/after/*;
//Edit
As you can see below, the site is returning a 301 response and redirecting to the https site, but none of your nginx configs are listening on port 443.
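A quick way to see both halves of this from the command line (a sketch using curl; the exact output will vary):
# The plain-HTTP request should show the 301 redirect to https
curl -I http://todocontenidoweb.com/
# The HTTPS request will fail (e.g. connection refused) if nothing is listening on 443
curl -vk https://todocontenidoweb.com/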
//End of Edit
There seems to be an issue with your nginx config: it is only listening on port 80, not 443. Browsers expect HTTPS on port 443 (unless you specify otherwise), and you are redirecting users from http to https, but nothing is actually serving HTTPS on port 443.
Make sure the server block that has the SSL settings is also listening on port 443.
If you installed SSL with Let's Encrypt, you can try generating the certificate manually and then modify the nginx config to listen on port 443 with ssl.
You can follow this guide from DigitalOcean; just replace the certificate paths with the Let's Encrypt ones.
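If Let's Encrypt is the goal, one way to do that (a sketch assuming certbot and its nginx plugin can be installed on this Ubuntu droplet) is roughly:
# Obtain a certificate for both names and let certbot adjust the vhost for port 443
sudo apt install certbot python3-certbot-nginx
sudo certbot --nginx -d todocontenidoweb.com -d www.todocontenidoweb.com
# Verify the config and reload
sudo nginx -t && sudo systemctl reload nginx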
Edit 2:
I don't think this part should be commented out
# FORGE SSL (DO NOT REMOVE!)
# ssl_certificate;
# ssl_certificate_key;
# ssl_protocols TLSv1.2 TLSv1.3;
# ssl_ciphers ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-CHACHA20-POLY13>
# ssl_prefer_server_ciphers off;
# ssl_dhparam /etc/nginx/dhparams.pem;
As it stands, you're essentially not serving any SSL certificate on port 443.
Here is an example server block from DigitalOcean. As you can see, it references a certificate and its private key. Once you generate your certificate, you need to enter the paths to those files.
server {
listen 443 http2 ssl;
listen [::]:443 http2 ssl;
server_name your_server_ip;
ssl_certificate /etc/ssl/certs/nginx-selfsigned.crt;
ssl_certificate_key /etc/ssl/private/nginx-selfsigned.key;
ssl_dhparam /etc/ssl/certs/dhparam.pem;
}
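With Let's Encrypt instead of the self-signed pair, those two lines would typically point at the live directory that certbot creates (paths assumed; adjust to your domain):
ssl_certificate /etc/letsencrypt/live/todocontenidoweb.com/fullchain.pem;
ssl_certificate_key /etc/letsencrypt/live/todocontenidoweb.com/privkey.pem;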
End of Edit
https://www.greenhousemarketplace.com
After freshly installing certbot and forcing an HTTPS redirect, my CSS and JS no longer load, even though they are accessible via direct URL.
I'm not sure why; I've updated the links to the CSS and JS files and set my config.toml to include the https prefix.
sites-enabled/ghm-landing-page
server {
root /var/www/ghm-landing-page/public/;
index index.html index.htm index.nginx-debian.html;
server_name greenhousemarketplace.com;
location / {
# First attempt to serve request as file, then
# as directory, then fall back to displaying a 404.
try_files $uri $uri/ =404;
}
listen [::]:443 ssl ipv6only=on; # managed by Certbot
listen 443 ssl; # managed by Certbot
ssl_certificate /etc/letsencrypt/live/greenhousemarketplace.com/fullchain.pem; # managed by Certbot
ssl_certificate_key /etc/letsencrypt/live/greenhousemarketplace.com/privkey.pem; # managed by Certbot
include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem; # managed by Certbot
}
server {
if ($host = greenhousemarketplace.com) {
return 301 https://$host$request_uri;
} # managed by Certbot
listen 80 default_server;
listen [::]:80 default_server;
server_name greenhousemarketplace.com;
return 404; # managed by Certbot
}
config.toml
# Site settings
baseurl = "https://www.greenhousemarketplace.com/"
languageCode = "en-us"
title = "Greenhouse Marketplace"
theme = "hugo-highlights-theme"
The JavaScript is not loading because you are loading mixed content. The script tags at the bottom of the page should use the https:// scheme.
The CSS is not loading because of an SSL_ERROR_BAD_CERT_DOMAIN error. You have the content loading on www., which is a hostname not listed on your certificate. Through your certificate issuer, be sure to cover both the www. and non-www. versions of your domain.
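One way to cover both hostnames (a sketch, assuming the certbot nginx plugin that generated the "managed by Certbot" lines is still installed and DNS for www. points at this server):
# Reissue/expand the certificate so it also covers the www name
sudo certbot --nginx -d greenhousemarketplace.com -d www.greenhousemarketplace.com
# Then inspect which names the served certificate actually lists
echo | openssl s_client -connect greenhousemarketplace.com:443 -servername www.greenhousemarketplace.com 2>/dev/null | openssl x509 -noout -text | grep -A1 "Subject Alternative Name"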
I deployed my Laravel 5.8 application to DigitalOcean.
/etc/nginx/sites-available/default
server {
listen 80 default_server;
listen [::]:80 default_server;
# SSL configuration
#
# listen 443 ssl default_server;
# listen [::]:443 ssl default_server;
#
# Note: You should disable gzip for SSL traffic.
# See: https://bugs.debian.org/773332
#
# Read up on ssl_ciphers to ensure a secure configuration.
# See: https://bugs.debian.org/765782
#
# Self signed certs generated by the ssl-cert package
# Don't use them in a production server!
#
# include snippets/snakeoil.conf;
root /var/www/html/peopleedge;
# Add index.php to the list if you are using PHP
# index index.php index.html index.htm;
# index index.php index.html index.htm index.nginx-debian.html;
server_name 123.130.32.49;
location / {
# First attempt to serve request as file, then
# as directory, then fall back to displaying a 404.
try_files $uri $uri/ /index.php$is_args$args;
# try_files $uri $uri/ =404;
}
# pass PHP scripts to FastCGI server
#
location ~ \.php$ {
include snippets/fastcgi-php.conf;
#
# # With php-fpm (or other unix sockets):
fastcgi_pass unix:/var/run/php/php7.4-fpm.sock;
# # With php-cgi (or other tcp sockets):
# fastcgi_pass 127.0.0.1:9000;
}
# deny access to .htaccess files, if Apache's document root
# concurs with nginx's one
#
location ~ /\.ht {
deny all;
}
}
When I tried to open the site, I got this error:
Secondly, why am I seeing an htdocs path on my online server when I deployed to DigitalOcean?
There is no existing directory at "C:\xampp\htdocs\peopleedge\storage\logs" and its not buildable: Permission denied
I ran all these commands, but the problem still persists:
sudo chgrp -R www-data storage bootstrap/cache
sudo chmod -R ug+rwx storage bootstrap/cache
sudo chmod -R 755 /var/www/html/peopleedge
sudo chmod -R o+w /var/www/html/peopleedge/storage/
How do I resolve it?
Thank you
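One thing worth checking here, since the error still references a local C:\xampp path even though the app now lives on the server: a configuration cache built on the development machine may have been deployed with the code. Clearing and rebuilding it on the server (standard artisan commands; the path is taken from the nginx root above) often clears this up:
cd /var/www/html/peopleedge
php artisan config:clear   # removes bootstrap/cache/config.php, which can hold stale local paths
php artisan cache:clear
php artisan config:cache   # rebuilds the cache from the server's .env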
I want to create a website using the Laravel framework on my localhost Laragon server.
I uploaded the website files to a web host, and when I enter the URL http://www.mywebsite.com I just get a listing of the project files.
But when I enter the URL http://www.mywebsite.com/public it works perfectly.
Can anyone suggest what I'm doing wrong?
You need to point your domain to the public folder of Laravel, not the project root, since Laravel's front controller lives at project_folder/public/index.php. So you need to point the web root at project_folder/public/.
Let's suppose you uploaded your application to the /var/www/html directory, so your public folder path looks like /var/www/html/public.
Then your sites-available file should look like this:
server {
listen 80 default_server;
listen [::]:80 default_server;
root /var/www/html/public;
# Add index.php to the list if you are using PHP
index index.php index.html index.htm index.nginx-debian.html;
server_name www.mywebsite.com;
location / {
# First attempt to serve request as file, then
# as directory, then fall back to displaying a 404.
try_files $uri $uri/ =404;
}
# pass the PHP scripts to FastCGI server listening on 127.0.0.1:9000
location ~ \.php$ {
include snippets/fastcgi-php.conf;
#
# # With php7.0-cgi alone:
# fastcgi_pass 127.0.0.1:9000;
# # With php7.0-fpm:
fastcgi_pass unix:/run/php/php7.2-fpm.sock;
}
# deny access to .htaccess files, if Apache's document root
# concurs with nginx's one
#
#location ~ /\.ht {
# deny all;
#}
}
You just need to change the root path; then your site will work properly as expected.
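After editing the vhost, test the configuration and reload nginx so the new root takes effect (standard commands, assuming nginx runs under systemd):
sudo nginx -t
sudo systemctl reload nginx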
I moved all the files from the public folder (project_folder/public/) into the project folder.
In index.php I changed these lines:
require __DIR__.'/../vendor/autoload.php';
$app = require_once __DIR__.'/../bootstrap/app.php';
into:
require __DIR__.'/./vendor/autoload.php';
$app = require_once __DIR__.'/./bootstrap/app.php';
and it works fine...
Thank you to everyone :)
I installed an nginx server for my Laravel project, but the CSS and JavaScript files are not working. The CSS and JavaScript files are downloadable from the server; for example, http://myipaddress/css/bootstrap.css downloads the bootstrap.css file. I include the CSS like this: <link href="{{ URL::asset('css/bootstrap.css') }}" rel="stylesheet">.
My source code
My public folder permissions
/etc/nginx/sites-available/default looks like the following
# Default server configuration
#
server {
listen 80 default_server;
listen [::]:80 default_server ipv6only=on;
# SSL configuration
#
listen 443 ssl default_server;
listen [::]:443 ssl default_server;
#
# Note: You should disable gzip for SSL traffic.
# See: https://bugs.debian.org/773332
#
# Read up on ssl_ciphers to ensure a secure configuration.
# See: https://bugs.debian.org/765782
#
# Self signed certs generated by the ssl-cert package
# Don't use them in a production server!
#
# include snippets/snakeoil.conf;
root /var/www/english4-u.com/public;
# Add index.php to the list if you are using PHP
index index.php index.html index.htm;
server_name english4-u.com;
ssl_certificate /etc/nginx/ssl/nginx.crt;
ssl_certificate_key /etc/nginx/ssl/nginx.key;
location / {
# First attempt to serve request as file, then
# as directory, then fall back to displaying a 404.
try_files $uri $uri/ /index.php?$query_string;
}
# pass the PHP scripts to FastCGI server listening on 127.0.0.1:9000
#
location ~ \.php$ {
# include snippets/fastcgi-php.conf;
#
# # With php7.0-cgi alone:
# fastcgi_pass 127.0.0.1:9000;
# # With php7.0-fpm:
try_files $uri =404;
fastcgi_split_path_info ^(.+\.php)(/.+)$;
fastcgi_pass unix:/var/run/php/php7.0-fpm.sock;
fastcgi_index index.php;
fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
include fastcgi_params;
}
# deny access to .htaccess files, if Apache's document root
# concurs with nginx's one
#
location ~ /\.ht {
deny all;
}
access_log /var/log/nginx/english4-u.com-access.log;
error_log /var/log/nginx/english4-u.com-error.log error;
location = /favicon.ico { log_not_found off; access_log off; }
location = /robots.txt { access_log off; log_not_found off; }
charset utf-8;
error_page 404 /index.php;
}
Can you help me with this?
Thank You.
Have you seen the movie Hidden Figures? There's a scene where a main character can see censored information by holding up redacted papers to a lamp. The relevance, you may ask? Your censoring in the question wasn't very hard to circumvent.
When I visit your server and check the Console I get ...
The stylesheet http://###/css/bootstrap.css was not loaded because its MIME type, "application/octet-stream", is not "text/css".
The stylesheet http://###/css/font-awesome.min.css was not loaded because its MIME type, "application/octet-stream", is not "text/css".
The stylesheet http://###/css/style.css was not loaded because its MIME type, "application/octet-stream", is not "text/css".
Fix the mime-types of the stylesheets and they should load properly. I would start looking in /etc/nginx/mime.types and verify that you've mapped the css extension to text/css.
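For example, something along these lines (standard file locations assumed) shows whether the css extension is mapped and whether the types file is actually included:
grep -n "css" /etc/nginx/mime.types          # expect a line like: text/css  css;
grep -n "mime.types" /etc/nginx/nginx.conf   # expect: include /etc/nginx/mime.types;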
I don't know Homestead, and I would have posted this as a comment but I don't have enough rep for that, but maybe also changing the user ownership to www-data would work?
sudo chown www-data:www-data -R projectfolder
Then, if it works, you may also have to change ownership of the storage folder if you are allowing file uploads.
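If changing ownership of the whole project feels too broad, a narrower variant (paths assumed from the nginx root in the question) is to hand only the writable directories to www-data:
sudo chown -R www-data:www-data /var/www/english4-u.com/storage /var/www/english4-u.com/bootstrap/cache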