Magento install and chmod 777

OK, so the Magento install guide in the knowledge base says to change permissions to 777. Is that right? It doesn't sound right to me at all; in fact it sounds very insecure. What should I do?

It depends on what you need to do with the Magento install. If you want to use the Magento Connect Manager built into the administrator interface then you will need 777 permissions. If you don't want this (such as if you're happy to use the command line PEAR installer) then only the var/, media/ and app/etc/ directories need to be 777.
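For example, a hedged sketch of the more restrictive option, run from the Magento root (adjust paths to your install):
# only the directories named above; adjust to your install
chmod -R 777 var media app/etc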

This is a pretty common requirement, especially when running PHP as an Apache module, because PHP then runs as the same user as the web server (normally nobody or the like). Without chmoding the directory world readable/writable, the Apache daemon cannot write into it. It can be a security issue, especially on shared hosts/servers. There are ways of making PHP run as a specific user, but that would probably be best discussed on serverfault.com.
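As an illustration of one such way (not from the original answer), a PHP-FPM pool lets you pick the user PHP runs as; the file path and user name here are only examples:
; in /etc/php-fpm.d/www.conf (path varies by distribution; user name is an example)
user = magento
group = magento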

It is better to keep permissions at 755.

Related

Installing Magento on CentOS nightmare (AVC denial)

I'm trying to install Magento onto my CentOS box, which I'm running off Hyper-V, and it's driving me crazy. I set everything up as per the tutorial, but every time I reach the locale page of the setup it says "AVC denial... attempting to write to /var". I'm pretty new to Linux, but I've tried almost everything: I did what the error told me, set the label of the /var directory and all the directories below it to httpd_sys_content_t, and made sure it has write permission. After that didn't work, I gave up and decided to move the server to a custom folder in the /usr directory. I changed all the Apache config files so they don't mention the /var directory at all, but the Apache process is still attempting to write to it for some reason. Can anybody help me out with this?
Try
grep "denied" /var/log/audit/audit.log
What's there?
And for your information, Magento wants to write to files inside its own var folder (log, report, and cache).
Go to your Magento folder and you can change that with
sudo chmod -R 777 var
This will give it access to write and change files in there.
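Since the error is an AVC denial, the SELinux context on Magento's var directory may also need to allow writes, not just reads. A hedged sketch, assuming the Magento root is /var/www/html/magento (adjust to your path):
# the path below is an example Magento root, not from the original question
semanage fcontext -a -t httpd_sys_rw_content_t "/var/www/html/magento/var(/.*)?"
restorecon -Rv /var/www/html/magento/var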

Installing Dropbox (and using Kirby CMS) on OpenShift

I'm trying to find a way to integrate Kirby CMS with Dropbox running on Openshift using these tutorials:
http://getkirby.com/blog/kirby-meets-dropbox
http://getkirby.com/forum/how-to/topic:561
I already get stuck installing Dropbox, since I assume I don't really have the necessary permissions over SSH:
http://www.dropbox.com/install?os=lnx
So my question: Is there even any way of achieving all that greatness? If no, not even if we get reaaaally creative? If NO, why not? If yes, how?
Thanks a bunch!
I have no experience with Kirby, but here's how to get Dropbox working on Openshift.
The following is a combination of doing a Dropbox install on a server and doing it in a non-standard location. Everything gets done in $OPENSHIFT_DATA_DIR because that's where you have write privileges.
First, make sure you're in $OPENSHIFT_DATA_DIR
cd $OPENSHIFT_DATA_DIR
Next, download the appropriate version of Dropbox:
wget -O - "https://www.dropbox.com/download?plat=lnx.x86" | tar xzf -
This should give you the .dropbox-dist folder in $OPENSHIFT_DATA_DIR.
Next, tell Dropbox to start the installation process, but tell it that your home directory is actually the $OPENSHIFT_DATA_DIR:
HOME=$OPENSHIFT_DATA_DIR ./.dropbox-dist/dropboxd start -i
Follow the instructions to link your Dropbox account to the OpenShift server. After it's linked, it should start syncing everything in your Dropbox account to $OPENSHIFT_DATA_DIR/Dropbox. This could be a problem if you have too much data in your Dropbox account; if so, you should exclude folders.
You can do that with the CLI script that Dropbox provides. Still in $OPENSHIFT_DATA_DIR, download it:
wget -O dropbox.py "https://www.dropbox.com/download?dl=packages/dropbox.py"
Make sure it's executable:
chmod +x dropbox.py
You need to run it the same way you would Dropbox:
HOME=$OPENSHIFT_DATA_DIR $OPENSHIFT_DATA_DIR/dropbox.py -h
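For example, to stop syncing a large folder (the folder name here is only an illustration):
# LargeFolder is an example name
cd $OPENSHIFT_DATA_DIR/Dropbox
HOME=$OPENSHIFT_DATA_DIR $OPENSHIFT_DATA_DIR/dropbox.py exclude add LargeFolder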
Hope that helps.
You should be able to download/compile/install things into your OPENSHIFT_DATA_DIR (app-root/data) on your gear by using something like ./configure --prefix=~/app-root/data/dropbox. I tried that, but I ran into a missing nautilus-whatever package, which I assume you could download and install in the same fashion; I did not try past that point. As long as whatever you are running can be installed into app-root/data and does not require root permissions to run, you should be able to do it. If you get it going, you could also create a downloadable cartridge to install it more easily.

Joomla 3.1 setup on Ubuntu 12.04 running nginx

My objective is to have Joomla 3.1 running on an Ubuntu server using nginx.
I am testing the setup locally but I keep encountering problems. I think this is related to the permissions on my Joomla source files.
I tried to install a package (T3) manually using the "Install From Directory" option. However, I receive the following message:
Warning: JFTP::store: Bad response
JInstaller::install: Failed to copy file
/usr/share/nginx/immigrationinformation.com/components/com_installer/t3-1.4.1/source/plg_system_t3/t3.php
to
/usr/share/nginx/immigrationinformation.com/plugins/system/t3/t3.php
Package Install: There was an error installing an extension:
plg_system_t3
I know that this is the wrong way to set up the server, but currently I have all the source file permissions set to 777. When this T3 package tries to install, it creates a folder in plugins/system/ called t3. This has only permissions drwxr-xr-x, which is the reason for the above errors.
My question is: What is the correct method to set up my Joomla 3.1 package such that I can ensure a smooth operation of the site, in a secure manner.
Thanks in advance!
The permissions when T3 installs are correct; it sounds like the problem is with ownership rather than permissions.
In Ubuntu the web server user is www-data, so you would need to run the following over SSH:
chown -hR www-data:www-data /path/to/joomla/root
Then upload the plugin through the Joomla installer (and change folder permissions back to 755 and files to 644, as in the sketch below).
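A hedged sketch of resetting those permissions recursively (adjust the path to your Joomla root):
# /path/to/joomla/root is a placeholder
find /path/to/joomla/root -type d -exec chmod 755 {} \;
find /path/to/joomla/root -type f -exec chmod 644 {} \;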

Drupal install on localhost asks for FTP info

I'm running Drupal 7.4 on localhost, and I've downloaded some themes/modules but I'm unable to install them. I go to administration/modules, for example, select 'Upload a module or theme archive to install', choose the tar.gz from my file system, and before the install I'm asked for an FTP user and password and cannot advance.
I'm working locally, so I'm thinking maybe I made some mistake during the install. How can I correct this? I have to do a lot of testing locally before moving the site to a server.
I found the solution here. All I have to do is place the modules/themes inside drupal_folder/sites/default/modules or themes and that's it.
Thanks #nmc
This can happen when the sites/default folder is not owned by the user that executes the install script. Make sure the folder sites/default is owned by the Apache user (run from your Drupal root):
Ubuntu:
chown www-data sites/default
Fedora:
chown apache sites/default
If you're not able to install the module because of the lack of an FTP connection, it's possible to use the old-fashioned way.
The other answer describes it, but it's not 100% correct.
If you want to do it the Drupal way, you need to place the modules/themes in one of the following (see the example after this list):
drupalfolder/sites/all/modules
or
drupalfolder/sites/all/themes
If you have a multi-site installation of Drupal, then:
drupalfolder/sites/domain_name/modules
or
drupalfolder/sites/domain_name/themes
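As an example of the manual install, a hedged sketch assuming a module archive named module_name.tar.gz (the name is illustrative):
# module_name.tar.gz is a placeholder
cd drupalfolder/sites/all/modules
tar xzf /path/to/module_name.tar.gz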

Is it safe to run a ruby script using sudo?

I am running Redmine on Ubuntu, and I am running it using sudo.
If I try to run it as my redmine user, I get permission errors on the log file.
Is this safe? Should I be concerned?
You should be careful when running any sort of web application as root. Personally, I would not recommend it.
If permissions problems on the logfile are your only problems, the better solution would be to alter the permissions of the log files/folders. Make sure the log files belong to the user account that Redmine is running as (or have group write permissions and belong to the same group). You might have to use sudo to change those permissions, but it is much safer than running a web application as root.
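A hedged sketch of that, assuming Redmine runs as a redmine user and lives under /opt/redmine (both are assumptions, adjust to your setup):
# user and path are assumptions
sudo chown -R redmine: /opt/redmine/log
sudo chmod -R u+rwX /opt/redmine/log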
Any time you run a script using sudo you should be concerned, since in effect you are running the script as root. To give an extreme example, if the script executes a command similar to rm -rf /, you will wipe out the entire system. It's best not to use sudo to execute scripts unless you are completely aware of what the script is doing and of any potentially tainted data that it consumes!
