I have a dedicated server running CentOS.
I have installed WHM/cPanel. On my server, I have a domain (example.com) and a user (example).
The domain example.com points to /home/example/public_html/. However, my project is Laravel, so the entry point is in /public. I need to change the document root from /home/example/public_html/ to /home/example/public_html/public.
I ran the following commands:
nano /var/cpanel/userdata/example/example.com
nano /var/cpanel/userdata/example/example.com_SSL
rm -vf /var/cpanel/userdata/example/example.com.cache
rm -vf /var/cpanel/userdata/example/example.com_SSL.cache
/scripts/updateuserdatacache
/scripts/rebuildhttpdconf
service httpd restart
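For reference, the only thing that normally needs to change in each of those userdata files (both example.com and example.com_SSL) is the documentroot line; a sketch, assuming the default cPanel layout:
documentroot: /home/example/public_html/public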
The problem:
When I run these commands, nothing changes and I still get a permission-denied page from Laravel's index.php (served from the project root, not /public).
When I run the same commands for an empty project the new document root takes effect, but as soon as I copy the Laravel project back in, I see the permission-denied page again.
What is going on?
Related
I have a Laravel project that I want to deploy to a server. Normally index.php and .htaccess live inside the public folder, but in my case I have moved these two files into the project root. What changes are needed on the server?
How can I upload this to the server?
Solution 1
Somehow you need to get SSH access as a shared user from your hosting provider, and then you can use git to clone your repository onto the server.
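A minimal sketch of that workflow once you have SSH access (the repository URL and paths are placeholders):
ssh example@your-server
git clone https://github.com/your-name/your-project.git
cd your-project
composer install --no-dev
cp .env.example .env
php artisan key:generate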
Solution 2
You can copy all of your project files onto the server using FTP from cPanel or the relevant control panel.
Solution 3
Use Amazon AWS as your hosting: the free tier gives one year of access and also gives you SSH access. Then follow Solution 1.
Put the files back into the public folder. You can then change the document root of your server to your project's public folder.
Follow the steps to do:
Go to /etc/apache2/sites-enabled/
Open the .conf file inside this folder
Change the DocumentRoot to /var/www/html/project_name/public
Restart the apache server using the following command:
sudo systemctl restart apache2
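As a sketch, the relevant part of that .conf file would look roughly like this (the domain and path are placeholders):
<VirtualHost *:80>
    ServerName example.com
    DocumentRoot /var/www/html/project_name/public
    <Directory /var/www/html/project_name/public>
        AllowOverride All
        Require all granted
    </Directory>
</VirtualHost>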
My site runs on PHP (Laravel) with nginx as the web server, and the document root is /public. Everything was OK, but when I wanted to secure my phpMyAdmin login URL, I created a symlink with this command in /public_html:
ln -s /usr/share/phpmyadmin/ phpmyadmin-YOUR-SECRET-CODE
The site went down with a 404, and nginx shows this error:
2019/04/07 07:09:02 [error] 4163#4163: *110 directory index of
"/home/admin/web/example.com/public_html/" is forbidden
The nginx configuration is already set to serve from /public. I don't know why it is trying to access public_html on requests.
The user that NGINX is running as must be able to access the directories it is attempting to read from and/or write to. In most cases, NGINX runs as either the nginx or the www-data user.
We can check for the user directive in /etc/nginx/nginx.conf by running:
grep user /etc/nginx/nginx.conf
If that command returns a user, we can change the ownership of the directories you've mentioned using chown.
chown -R user:group /insert/path/here
So if NGINX is running as www-data:
chown -R www-data:www-data /insert/path/here
If the grep command doesn't return a user, we can check who owns NGINX's directories and use that user.
Simply run:
ls -al /etc/nginx
and grab the user from the output.
As a side note, things are a bit different if you happen to be running PHP-FPM with NGINX. In that case, the directories and files should be owned by the user that the PHP-FPM process is running as.
If you are running PHP-FPM, you can cd into the pool configuration directory and check the pool file. If you haven't modified anything there for your setup, the default user for PHP-FPM is www-data, so that should be the user who owns all files and directories.
cd /etc/php/*/fpm/pool.d/
and then:
nano www.conf
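The relevant lines in www.conf look roughly like this (a sketch of the Debian/Ubuntu defaults; the socket path depends on your PHP version):
[www]
user = www-data
group = www-data
listen = /run/php/php7.2-fpm.sock
listen.owner = www-data
listen.group = www-data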
I've recently started using Laradock to build my Laravel projects, but I have a problem editing files such as Controllers and Models that are created by php artisan commands in the Laradock workspace. The reason is that the user in the workspace is root, while I'm editing the files in my editor as a regular user, so every time I have to run chmod -R 777 /newCreatedFile.php to change the permissions. Is there any solution to this problem?
By the way, my OS is Ubuntu 18.04.
In the Laradock Getting Started guide, it explains how to get Laradock running as a specified user:
Note: You can add --user=laradock to have files created as your host’s user. Example:
docker-compose exec --user=laradock workspace bash
I believe this should solve your issue, as you will no longer have the Docker user running these commands. Try it out!
Note: The core issue may just be that whatever user Laradock is running as is not creating files with group permissions that allow the host machine's user to write to them, hence why the --user flag can be used. It may not actually be running as the root user itself.
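For example, to create a controller as that user instead of root (the controller name here is just an example):
docker-compose exec --user=laradock workspace bash
php artisan make:controller ExampleController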
I'm starting to develop a site on Ubuntu 14.04. I'm using Apache2, so I placed my files under the /var/www/html/ folder. Everything was working fine, but I had to restart my current work, so I copied the entire project folder from another path, like this:
$ sudo cp -r ~/path/to/folder /var/www/html/
And now I can't see my images (a simple img tag); I just see a broken-image placeholder.
When I look for the file, I can see that I'm getting a 403 Forbidden error. I saw a suggestion to add the option Require all granted, but that option is already set in my Apache config. I suspect this is because the new folder is a copy of a folder without root permissions, so I tried chmod, but that also didn't work, so I'm completely lost now.
So, how can I see my images from localhost? And more importantly, why did this happen suddenly?
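Following the same reasoning as the chown advice in the nginx answer above, a typical fix after copying files out of a home directory is to hand ownership to the web server user and restore sane permissions (a sketch, assuming Apache runs as www-data and the copied folder is /var/www/html/folder):
sudo chown -R www-data:www-data /var/www/html/folder
sudo find /var/www/html/folder -type d -exec chmod 755 {} \;
sudo find /var/www/html/folder -type f -exec chmod 644 {} \;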
I am moving a magento store from mydomaintest.com to mydomain.com.
When I say move, in this instance, we simply used the Cpanel to Modify Account and changed the Domain Name from mydomaintest.com to mydomain.com.
Then, using advice found in forums, I used phpMyAdmin to update the Magento core config table with the new base URL for both the secure and unsecure URLs.
After doing this I deleted all files in /var/cache.
Trying to access the site by domain name or IP is providing the following error:
Fatal error: require_once() [function.require]: Failed opening required '/home/mydomain/public_html/errors/report.php' (include_path='/home/mydomain/public_html/app/code/local:/home/mydomain/public_html/app/code/community:/home/mydomain/public_html/app/code/core:/home/mydomain/public_html/lib:.:/usr/lib/php:/usr/local/lib/php') in /home/mydomain/public_html/app/Mage.php on line 847
Please help, we are trying to move live today and can't seem to figure this one out.
Thanks!
John
Go to System > Index Management and reindex the data, as the index also contains the URL rewrites. Also be sure to check System > Cache Management (some versions still have it) and flush all caches, since var/cache is not the only caching location; the Zend components save their cache in the tmp folder.
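If you prefer the shell, the file-based caches can also be cleared directly (a sketch, assuming a standard Magento 1 layout under public_html):
cd /home/mydomain/public_html
rm -rf var/cache/* var/full_page_cache/* var/session/*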
I had this issue with Magento running with Apache2 on Ubuntu 14.10
Make sure that the MySQL module for PHP is installed:
dpkg --list | grep php5-mysql
If it is not listed, you need to install it:
sudo apt-get install php5-mysql
Then restart Apache:
sudo service apache2 restart
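Once Apache is back up, you can confirm the module is loaded (the exact name may appear as mysql, mysqli, or pdo_mysql depending on your PHP version):
php -m | grep -i mysql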
In our case we got this message because someone had deleted the "errors" folder; the site works fine until an error happens.
Once we restored the folder (and made sure PHP could access it), we saw the normal Magento error page.
If you don't have the folder, you can download Magento and extract it from the archive.
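For example, you could extract the archive somewhere outside the web root and copy only the errors directory back into the site; a sketch, where the archive name and its internal layout are assumptions you should adjust to the version you actually downloaded:
tar -xzf magento-1.9.x.tar.gz
cp -r magento/errors /home/mydomain/public_html/errors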