No single directory is writable in Joomla - joomla

Something really strange happened to me while migrating my websites from a hosting provider to my new VPS with CentOS 6 and DirectAdmin (and the Jira Image V6, optimized for Magento and Joomla).
I migrated the first website successfully, without any problems. It really works like a charm!
All the other websites I tried to copy, with the same Joomla! version, had the same problem: not a single directory or file is writable. I checked all the settings everywhere, as far as my knowledge goes, but found nothing. The copy method was exactly the same as for the first one.
What I did and tried so far:
.htaccess check (what could be wrong?)
permissions check (755 and 644) (these are good)
ownership check and user / group check (as far as I know they are ok)
php.ini check (changed and tried a lot, I really don't know much about this)
configuration.php check (all good for sure)
I tried manually uploading, downloading and extracting using SSH, resetting owner via DA.
I also tried setting open_basedir = /tmp/ in php.ini, which resulted in a blank page. (Possibly a clue?)
I can see the website, I can log in to the backend, and I can use FTP, but I cannot modify anything in the settings and I cannot install anything. I checked the permissions overview and everything is very red: Unwritable, for really every file and directory. And that is not good.
Additional info:
Old server: PHP 5.4.16 > New one: PHP 5.4.15
Old server: MySQL 5.5.28 > New one: MySQL 5.5.31
Old server: cgi-fcgi > New one: apache2handler
Old server: CentOS 6 > New one: CentOS 6
Need to know anything else? Just ask.
I am getting kind of desperate, since re-uploading, reinstalling the VPS, etc., etc. doesn't work! Who can point me in the right direction?

I guess your site is running under a user you are not expecting (or you ran out of disk space). All commands below are meant to be run from the site webroot, i.e. where the index.php is:
cd /home/yourwebsite/html
or whatever is your server path.
A wrong user is the most frequent cause, as tar will by default maintain the original owner id.
Just make the images folder 777
chmod -R 777 images
and upload a file with media manager.
ls -la images/*
-rw-r--r-- 1 fasterjoomla fasterjoomla 31 Apr 26 13:12 index.html
-rw-r--r-- 1 fasterjoomla fasterjoomla 3746 Apr 26 13:12 joomla_black.gif
-rw-r--r-- 1 apache webserver 2301 Jul 16 11:57 test.png
Locate your freshly uploaded image: the beginning of the line will tell you the owner and group; for example, here test.png is owned by user apache and group webserver.
Now change the ownership of the whole Joomla installation to that user and group, except for configuration.php, administrator or any other files you may want to protect:
chown -R username:usergroup *
After this you can restore the permissions as per your standard 555/755 and your problem should be solved.
chmod -R 555 *
chmod -R 755 images logs tmp cache
rm -f images/test.png
or whatever is appropriate per your security policy.
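If you prefer, you can also check which user PHP actually runs as with a small throw-away script instead of uploading a test image. This is just a sketch (whoami.php is a name I made up; delete the file when you are done), and it assumes the posix extension is available:
<?php
// whoami.php - drop this in the webroot, open it in the browser, then delete it.
// It prints the user and group the PHP / web server process runs as.
if (function_exists('posix_geteuid')) {
    $user  = posix_getpwuid(posix_geteuid());
    $group = posix_getgrgid(posix_getegid());
    echo 'PHP runs as ' . $user['name'] . ':' . $group['name'];
} else {
    echo 'posix extension not available; use the upload-and-ls method above instead';
}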

What is the Linux distro you migrated from?
One potential source of problems when moving to CentOS is the fact that its default configuration is much more secure (SELinux, strict php.ini settings, etc.). For instance, functions such as exec() are disabled in php.ini, along with a few others.
Also, the apache user can't access anything outside the web-root directory.
There are many little things like that, and that is probably why most of your application won't run.
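If you want a quick look at what the new server locks down, a simple sketch like this prints the relevant settings (disable_functions and open_basedir are standard php.ini directives):
<?php
// Print the PHP settings that most often cause trouble after moving to a hardened server.
echo 'disable_functions: ' . ini_get('disable_functions') . "\n";
echo 'open_basedir: ' . ini_get('open_basedir') . "\n";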
hth

I know you said you checked it, but usually if you have to use the FTP layer (and this exact situation is the reason it was implemented), it means that there is a file ownership problem and that suPHP or something similar is not installed/operational.
The tricky thing with the FTP layer is that you need the credentials saved in order to use it, and if you are in this situation you can't save configuration.php with the credentials in it. That's why on a new installation in this situation you would have been prompted for the credentials, and they would then have been saved. If you can go to your file system and edit configuration.php to put that data in, it would provide an immediate solution.
However, the real solution is either to have an Apache extension like mod_suphp that will manage this, or to deal with the ownership problem. Joomla needs to be able to own the folders/files when it is doing things like installing extensions and so on.
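For reference, the FTP layer settings live in configuration.php. Here is a minimal sketch of the relevant fields, assuming a configuration.php in the usual JConfig form; the host, user, password and root path below are placeholders to replace with your own values:
public $ftp_enable = '1';                       // turn the FTP layer on
public $ftp_host = '127.0.0.1';                 // placeholder: your FTP host
public $ftp_port = '21';
public $ftp_user = 'yourftpuser';               // placeholder
public $ftp_pass = 'yourftppassword';           // placeholder
public $ftp_root = '/home/yourwebsite/html';    // placeholder: path to the Joomla root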

I was really desperate and hired a kind of expert (remotely). He was able to point me to the following fact (for free!):
I was moving the websites to my new server, and I wanted to make the website ready and working before making it live, and that was the mistake.
I kept working on the website with the URL http://xxx.xxx.xxx.xxx/~user/
I didn't want to make the site live, i.e. change the DNS, until the site worked. BUT...!!!
The site will never fully work in the above scenario; it only works with a real domain address, for example http://(subdomain).yourdomain.com.
The first thing for me was to immediately change the DNS, and guess what? It works... I really spent 36 hours or more on this, but I hope I can help others with this, because I never, NEVER saw this mentioned; it is written nowhere! Until now...

Related

localhost on a Mac, mySQL Root, write enabling folders, and migrating to a real server

I'm developing a site on an XAMPP localhost on a Mac. I manipulate my mySQL database via phpMyAdmin (not comfortable with the command line).
Everything works fine (I know, right!).
2 things have got me worried for when I eventually move my site to a real online live server.
First the background:
1) I am using a CMS/Framework type thing. When trying to install it (in the htdocs folder), I found that I needed to write-enable some folder or other (file system permissions in Finder). So I write-enabled all the folders contained in the mother folder. Macs have 3 default types of users (right-click a folder in Finder and choose Get Info): "Me", "admin" and "everyone". I right-clicked the mother folder (in Finder), selected "Read & Write" for all 3 types of users, and chose "Apply to enclosed items." And the installation worked out fine.
2) I am able to come and go as I please into phpMyAdmin to directly manipulate my database. I presume phpMyAdmin recognizes me as Root. I do not have a password for Root. I do have a separate user created with a password (let's call the user "specificdbuser") and I use "specificdbuser" to connect to the database from within my site's PHP code.
My concerns regarding 1 & 2 are:
1) I'm presuming that enabling Read & Write permissions for all 3 types of users, and in particular for all folders and items within the mother folder, is a security risk. Is there a better way? (a) How do I figure out which folders need to be writable, so that I only make those writable instead of making everything writable? And (b) instead of giving Read & Write permissions to the 3 default Mac user types, should I instead be creating some new type of user (Root? specificdbuser?) and only give that user Read & Write permissions? As this is a website, do I need to give "everyone" permission to Read & Write? What the heck does "everyone" mean anyway?
2) Let's say I eventually set up my database's Root account with a password. When I eventually migrate my localhost site to a real live online server, will this Root / password combination work on that site too?
I'm kind of confused: are you talking about file system permissions or MySQL database permissions? If it is a file system question, then please check which web service user runs your PHP scripts. If it's a database permission question, then please see point #2 below.
I would say, for security reasons, never use "root" when connecting to your database. I would suggest you set up the same user name/password/permissions locally and on the server. But if that doesn't make sense, you can have a config file that says: if "localhost" then db_user = blah_blah, else (on the server side) db_user = blah.
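A minimal sketch of that config-file idea, assuming you tell the environments apart by the host name (the credentials below are placeholders):
<?php
// config.php - choose database credentials based on where the site is running.
$isLocal = in_array($_SERVER['HTTP_HOST'], array('localhost', '127.0.0.1'));
if ($isLocal) {
    // local XAMPP development
    $db_user = 'specificdbuser';
    $db_pass = 'local_password';       // placeholder
} else {
    // live server
    $db_user = 'live_db_user';         // placeholder
    $db_pass = 'live_password';        // placeholder
}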

Laravel running on a remote host

I am looking at learning Laravel, it looks great but my one concern is how to get it running on a remote host where I have limited (non root) access.
Is it just a case of uploading the files via FTP, or are there other tricky config things that need to be done?
Probably your best bet is simply copying all the app files, but be aware it may take quite long (many files) if your only access is FTP, with a risk of an incomplete transfer. It may be better (but is not necessary) to transfer a single compressed archive file and extract it via the PHP zip extension, or via exec() and the tar command if available (you can find many tutorials on the web); a sketch of the zip approach follows.
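Just as a sketch of the zip-extension route, assuming you uploaded the app as site.zip (a placeholder name) next to this script and that the ZipArchive class is enabled:
<?php
// extract.php - unpack an uploaded archive on the remote host, then delete this script.
$zip = new ZipArchive();
if ($zip->open(__DIR__ . '/site.zip') === true) {
    $zip->extractTo(__DIR__);   // extract into the current directory
    $zip->close();
    echo 'Extraction finished';
} else {
    echo 'Could not open archive';
}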
Last but not least, you could try to run composer via PHP script - take a look here for example - but that could be much harder than expected (it didn't work for me some time ago because the hosting service had proc_open disabled).
Also, in your case you most likely have permission to access only your own web root directory and you can't change the document root configuration, therefore probably you won't be able to place "non-public" elements outside the document root as recommended, so at least remember to set file permissions properly.
Most important, remember to check the requirements first (note that starting from version 4.2 Laravel will require PHP 5.4).

How did my PHP session path change?

EDIT - HUGE ERROR ON MY PART
I found another site that had the issue that I knew was not on the same server. Then I realized that the original site with the issue was also on a different server and had not been moved over completely yet. The server in question was actually a Plesk Parallels' server and the issue was caused by a patch applied to the server over the weekend due to a security update. This server did have the file path and I just had to chmod it to 777 instead of 77x for it to work. I apologize for the confusion and thank everyone for trying to help. +'s for all. :)
Original Post
I have a website on a shared hosting server (also mine) that since yesterday started giving me this error:
Warning: session_start() [function.session-start]: open(/var/lib/php/session/sess_678cf69f0f17b87c52136ee0280d23cc, O_RDWR) failed: Permission denied (13) in /var/www/vhosts/domain.net/httpdocs/index.php on line 1
I've checked /usr/lib/php.ini and /usr/local/lib/php.ini to see where it is set, and both say it is set to the /tmp directory, which is where it should be set and always has been. The /var/lib/php/session directory never even existed. I did create it and give it 777 permissions, but that did not help. Though the bigger issue here is why it changed to begin with. There is no .htaccess file for this site, and I cannot find this being set anywhere on the site itself either.
This is the ONLY site on this server with this issue, telling me it's something local to the website. I just cannot figure out what. So my question is this: what should I look for to check the session save path settings for an individual site in a shared hosting environment, to find out why it suddenly changed for this one client?
FYI, I am running a WHM server.
Thanks
session_save_path(realpath(dirname($_SERVER['DOCUMENT_ROOT']) . '/../tmp'));
You need to add the above code before starting the session.
You don't appear to have write permission to the /var directory on your server. This is a bit weird, but you can work around it. Before the call to session_start() put in a call to session_save_path() and give it the name of a directory writable by the server. More details here

copying large files over

I have a dedicated server where I host a large website. We need to do an upgrade on the website and I want to create a development copy on a testurl (on a different cpanel account) but same server.
The files are around 1GB in total size and 70,000 in number.
I have tried WS FTP pro but it has only copied 10% in around 20 hours.
What's the easiest and quickest method to create a replica on my development URL?
I am a newbie so please give detailed instructions.
Thanks
I would think the easiest method would be this:
Create the new account in WHM
Login via SSH
Navigate to your existing account folder
Copy the files to the new account folder
This should be pretty easy for you, as long as you know how to access your server via SSH. It's pretty simple:
Login via SSH
Type su and enter your root password (this is only necessary if you SSH into your server using an account other than root - a good practice, in my opinion)
Find and navigate to your source account. I'm assuming you're probably setup to have your web accounts in the /home folder, so try typing something like cd /home/source_folder
Once you're in the correct source directory, type cp -R * /home/destination_folder
That's pretty much it. The -R option recursively copies all the files from your source to your destination, and if you're copying a HUGE number of files, you might consider adding --verbose after the -R option so you can see it working. I apologize in advance if I've gone a little more granular than needed.

Where can I find the .cache folder on a Linux hosting

I am trying to add an RSS feed to an HTML page. After some searching I found something called SimplePie.
On trying it I get a warning:
Warning: ./cache is not writeable. Make sure you've set the correct relative or absolute path, and that the location is server-writable. in xxx/inc/simplepie.inc on line 1780
On checking for the cache folder on the server, I couldn't locate it. I am on a Linux server. Would creating a cache folder be enough, or do I need to get the hosting company to look into it?
Thanks
In addition to creating the folder, you will need to make it writable by the user that the script runs as. You may need the hosting company's help with this. Otherwise, you can make it world-writable, though if you can restrict it to just the user the script runs as, that would be best.
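As a rough sketch, assuming SimplePie is already included the way the warning suggests and that a cache folder next to the script is acceptable (the feed URL is a placeholder):
<?php
// Assumes simplepie.inc has already been included, as in the question.
$cacheDir = dirname(__FILE__) . '/cache';

// Create the cache folder if it doesn't exist; if the folder is owned by the
// user the script runs as, 0755 is enough, otherwise you may need 0777
// or the hosting company's help.
if (!is_dir($cacheDir)) {
    mkdir($cacheDir, 0755);
}

$feed = new SimplePie();
$feed->set_feed_url('http://example.com/feed');   // placeholder feed URL
$feed->set_cache_location($cacheDir);             // point SimplePie at the writable folder
$feed->init();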

Resources