I have a DigitalOcean VPS running Ubuntu. I'm fairly new to deploying web apps, and I'd like some advice on the best way to set permissions on my Laravel app folder so I don't have to worry about permissions errors.
I know about the storage and cache folders and that I should give www-data read and write permissions, but is there an easy way to set all this up nicely so I don't have to change permissions every time a new folder or file is added?
What does your initial setup look like after, say, cloning a git repository into the /var/www folder of your LEMP stack?
I know there are lots of tutorials on this but I'd like to know the more future-proof, solid way to handle this.
Thanks in advance and I'm open to all advice.
I have recently deployed my instance of Vtiger 7 in a load balanced auto scaled configuration.
I have also created an NFS server and mounted it on my Vtiger server. The NFS share will also be auto-mounted on any additional servers in the auto-scaled scenario.
In order for this to all work properly I need to move the /storage and the /test directories to the NFS utilizing a symbolic link.
I have set this up perfectly and also established the proper symbolic links.
The problem I'm running into is that Vtiger will read the data from the symlinked folders without issue, but it is unable to write to them due to permissions issues. I've set permissions on the NFS folders to 775. I've also tried 777 just to rule it out, but I still get the same errors and Vtiger will not write to the directories. Any idea how I can solve this?
After burning my eyes out for many hours, I have solved my own question.
The issue was folder ownership. I essentially needed to change the symlink owner and the NFS directory owner to match the owner of the CRM web root.
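A minimal sketch of that ownership fix, demonstrated in a throwaway directory so it runs without root. On the real server you would use sudo, your actual NFS mount point, and the web server user (often www-data); all of the paths and names below are assumptions.

```shell
TMP=$(mktemp -d)
mkdir "$TMP/nfs_storage"                   # stand-in for the NFS export
ln -s "$TMP/nfs_storage" "$TMP/storage"    # symlink in the CRM web root

WEB_USER=$(id -un)    # on a real box: www-data (or whatever runs the CRM)
WEB_GROUP=$(id -gn)

chown -R "$WEB_USER:$WEB_GROUP" "$TMP/nfs_storage"   # the target directory
chown -h "$WEB_USER:$WEB_GROUP" "$TMP/storage"       # the symlink itself (-h)
```

The `-h` flag matters here: without it, chown follows the link and never changes the link's own ownership.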
I just uploaded my Laravel project to my shared hosting and was wondering what configuration changes I should make to get the project working.
My typical checklist:
Modify the document root to the /public folder
Make the /storage and bootstrap/cache folders writeable.
Set up the database
Modify the .env file to suit the live environment
Run php artisan migrate
This should get you up and running, or at least to a point where the errors are detailed enough to work things out.
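The "make writeable" step of the checklist can be sketched as shell commands. The scratch directory below stands in for your project root so the snippet runs anywhere; on a real host, replace it with your actual path and add a chown to your web server user if needed.

```shell
APP=$(mktemp -d)    # stand-in for your project root on the server
mkdir -p "$APP/storage/logs" "$APP/bootstrap/cache"

# ug+rwX gives user and group read/write everywhere, but execute (traverse)
# permission only on directories, so plain files don't become executable.
chmod -R ug+rwX "$APP/storage" "$APP/bootstrap/cache"
```

Using the symbolic `X` (capital) rather than `x` is a small but useful habit: it keeps log files and cached views non-executable.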
I have a functioning laravel app that I developed locally. I moved it onto a server via ftp (just to show someone for feedback).
I changed the APP_URL in .env to the subdomain pointing to the /public folder. Also changed the database information. Everything else was left exactly as is.
I can access the front page without any problem. Anything else (e.g. /login or an AJAX to any other controller) results in a Server Error 500 that leaves no trace in the server error logs.
When I assign different routes to /, those are also displayed. I can show pages that pull data from the database, so that is not the issue.
Both local development and server run apache on linux.
Any pointers?
Update: Thank you for the suggestions so far. I currently cannot access the server via ssh (not my server). I'm working on getting that set up and will try your solutions as soon as I can.
Thanks everyone.
With a little help from the hosting company we found the problem. All we had to do was add
RewriteBase /
to the .htaccess automatically created by Laravel.
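For reference, the line goes inside the mod_rewrite block of the public/.htaccess that Laravel ships with. A trimmed sketch (the stock rules vary slightly between Laravel versions, so treat this as an outline, not the exact file):

```apache
<IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteBase /

    # ...Laravel's stock rewrite rules follow here, unchanged...
</IfModule>
```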
Make sure that your web server has read and write permissions to the following folders:
public
bootstrap/cache
storage
If the web server does not have these permissions, it cannot compile views, store session data, write to log files, or store uploaded files.
Set the web server as owner (assuming www-data is your web server user):
sudo chown -R www-data:www-data /path/to/directory
CHOWN alone won't always work; in some cases I had to CHGRP to www-data on my Ubuntu VPS as well.
Here's how I check things:
The domain has to resolve, and if there's an SSL certificate, the padlock has to show in the browser. This doesn't matter on localhost; there, Laragon is the quickest way to get set up.
Then I write something into an index.html file in the webroot. If I can see it in the browser, the basics work; if I can't, permissions or ownership are set wrong.
There's plenty of info online about setting up CHOWN and CHGRP for Laravel, so whenever I clone a repo or unzip one, the first thing I do now is set those two. Once ownership is right, I can run npm install and composer install (on a VPS or localhost; shared hosting often won't allow it).
At that point you should be able to see Laravel's pages, and only public and storage might want different permissions than the rest.
The .env file should be created with the right permissions as well; if not, php artisan key:generate will fail later, for instance.
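The CHOWN/CHGRP routine after cloning can be sketched like this. It runs against a scratch directory here so it needs no root; on a server you would point it at your clone and use sudo with the www-data group (both of those are assumptions about your setup).

```shell
APP=$(mktemp -d)                       # stand-in for the cloned project
mkdir -p "$APP/storage" && touch "$APP/storage/laravel.log"

chgrp -R "$(id -gn)" "$APP"            # on a server: sudo chgrp -R www-data ...

# A common split: directories group-writable and traversable (775),
# plain files group-writable but not executable (664).
find "$APP" -type d -exec chmod 775 {} \;
find "$APP" -type f -exec chmod 664 {} \;
```

The find-based split avoids the classic mistake of `chmod -R 775`, which makes every file executable.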
I created a Micro instance and downloaded a key for access to it. However, I have a WordPress blog, and it is asking for a username and password so it can use FTP to upload some plugins.
How can I create a new user for the ec2 instance? Currently I just have ec2-user. Do I need to create this when setting up the server or can I do this after the initial set up?
When WordPress asks for an FTP username/password, that means the permissions on your webroot are too restrictive for WordPress to update itself, or to add new plugins, etc. I actually LIKE that, since to have the permissions more permissive means the site and server are a bit more vulnerable to uploads/injections, etc.
So to install your new plugin I would recommend you do one of two things:
One option is to SSH into your server instance, identify your webroot directory (often the webroot is in /var/www/html/), and then run chmod -R 777 /var/www/html && chown -R www-data:www-data /var/www/html. That will make your permissions wide open, and you can perform any updates or installations on your site using the WordPress dashboard. You will not be asked for an FTP username/password while these permissions are open. The problem is, you'll then want to go back and reset your permissions to something more restrictive afterward. For a good primer on that, see this article from Smashing Magazine, or the official Codex here.
The second method is a bit more indirect, a good bit more complicated, but safer. It involves using a source control system like Git on both the WordPress server and your own local workstation. If you create a private repository in something like GitHub, and load your site's code into that repository, that means you could also create a clone of it on your workstation, add the plugin(s) or new theme(s) there, and then commit and push the code back to GitHub. You would then hop into your server, cd to the webroot, and then git pull to retrieve your changes with no permission changes needed.
There are many reasons I favor the latter solution, but I know there is more setup involved. So if that feels like too much, you should learn how to do the first method comfortably. It would actually be a great chance to write a simple bash script that can OPEN up your permissions, and then another script to CLOSE them back down, automatically for you. (You can find just such a script here from a GitHub user.)
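The "open, then close" idea from the first option can be sketched as two small shell functions. The scratch webroot below is a stand-in for /var/www/html (where you would also need sudo), and the 755/644 close-down values are an assumption; the linked guides discuss what actually suits your site.

```shell
WEBROOT=$(mktemp -d)    # stand-in for /var/www/html
touch "$WEBROOT/index.php"

open_perms() {          # wide open, only for the duration of a dashboard update
    chmod -R 777 "$WEBROOT"
}

close_perms() {         # back to a restrictive baseline afterward
    find "$WEBROOT" -type d -exec chmod 755 {} \;
    find "$WEBROOT" -type f -exec chmod 644 {} \;
}

open_perms
# ...install the plugin via the WordPress dashboard...
close_perms
```

Keeping both steps in one script makes it much harder to forget the close-down half, which is the part that actually protects you.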
For some reason I have a 663.php file in every folder and subfolder of my httpdoc root on my web server's FTP. I don't know where this file came from, and my host does not know either.
I would very much appreciate any help.
Depending on its content, it could be a PHP shell backdoor.
An attacker would upload this file to gain access to your files, database, etc.
They usually exploit a flaw in your application, to upload files.
Be sure to update all the software you are running.
Someone might have gained access to your site, also change all passwords.
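To see how far the infection spread, find can list every copy before you delete anything. The httpdocs path on your host is an assumption; the demo below uses a scratch directory so the snippet is safe to run as-is.

```shell
ROOT=$(mktemp -d)       # stand-in for your httpdoc root
mkdir -p "$ROOT/blog/wp-content"
touch "$ROOT/663.php" "$ROOT/blog/wp-content/663.php"

find "$ROOT" -name '663.php' -print    # review this list first...
find "$ROOT" -name '663.php' -delete   # ...then remove every copy
```

Deleting the files only removes the symptom; the entry point (here, the Plesk vulnerability) still has to be patched, or the files will come back.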
It looks like this has occurred to other users before. Quoting user Year1Media on AolAnswers:
"Your site has been hacked. The 663.php file is sending out anonymous spam. If you host with GoDaddy, this is a common theme, as thousands of accounts share one IP, and one person with shell access can get in and place an htaccess file above the root folder on the server and autoload the files into every folder in your website and onto every site within that IP address."
Thank you all for your help. After a little searching I found out that it was a Plesk security vulnerability. The problem was solved by running a patch in Parallels Plesk. Apart from inserting unknown files, it also changed .htaccess to redirect to weird websites.
It is a Plesk problem; there are correction patches here:
http://kb.parallels.com/en/113321