Write permissions with vsftpd and CentOS 6.2 - ftp

I have a server with CentOS 6.2 running httpd and vsftpd.
I have a few web sites in /var/www and I want to add an FTP user for each site.
My user1 has a home directory in /home/user1 and can read/write to its folder over FTP (it's the user I use for SSH and almost everything else).
I made user2 with its home set to /var/www/site2 and its shell set to /bin/nologin (because I want it to be just an FTP user).
I can log in over FTP as user2 and download files, but I can't upload files or mkdir...
The permissions are "drwxrwxrwx. 2 user2 user2 4096 Aug 21 14:35 ." (the 777 was just for testing...)
My vsftpd.conf is:
anonymous_enable=NO
local_enable=YES
write_enable=YES
local_umask=022
dirmessage_enable=YES
xferlog_enable=YES
connect_from_port_20=YES
xferlog_std_format=NO
log_ftp_protocol=YES
chroot_local_user=YES
listen=YES
pam_service_name=vsftpd
userlist_enable=YES
tcp_wrappers=YES
banner_file=/etc/vsftpd/banner
My iptables is currently stopped for testing, so it isn't my firewall either...
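One more thing worth ruling out on CentOS 6 is SELinux, since the trailing dot in "drwxrwxrwx." marks an SELinux security context. A quick check, as a sketch rather than a confirmed diagnosis (boolean names vary slightly between policy releases):
getenforce                  # "Enforcing" means SELinux is active
getsebool -a | grep ftp     # list the FTP-related booleans
# If ftp_home_dir / allow_ftpd_full_access are off, this lets vsftpd
# write to local files (-P makes the change persistent):
setsebool -P allow_ftpd_full_access on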
Thanks in advance for your help.

Sorry, wrong site, my bad...
I posted my question on Server Fault and got an answer, so here is the link:
https://serverfault.com/questions/532949/writing-permission-with-vsftpd-and-centos-6-2

Related

Host machine cannot access nginx virtual hosts on the guest machine

Question: how do I create a simple nginx config that treats the folder structure as domains (test.local, myblog.local) and serves the pages from those folders, including PHP?
Information:
Windows 10 x64 build
Vagrant 1.9.5
VirtualBox 5.0.22 (latest)
Guest OS: Ubuntu Xenial x64 latest
So, I want to create a simple nginx config that mirrors the folder structure. See my config file on pastebin.
Also, here is my Vagrantfile, which uses SMB to mount a folder.
The structure of the folders:
├───devhost.local
│   ├───log
│   └───public
│           index.html
│           index.php
│
└───test.local
    ├───log
    └───public
            index.html
The permissions for the devhost files and folders:
ubuntu@ubuntu-xenial:~$ ls -la /var/www/html/devhost.local/
total 4
drwxr-xr-x 2 ubuntu www-data 0 Jun 7 11:17 .
drwxr-xr-x 2 ubuntu www-data 4096 Jun 7 12:44 ..
drwxr-xr-x 2 ubuntu www-data 0 Jun 7 11:17 log
drwxr-xr-x 2 ubuntu www-data 0 Jun 6 14:13 public
My hosts file in Windows:
192.168.33.10 devhost.local
So, when I have the default config in my sites-enabled folder I can reach the guest machine at 192.168.33.10 and I see the nginx welcome page, but when I remove that default config and enable my wildcard config (see the link to my config file), I cannot access my domains. sudo nginx -t says that everything is OK. I also tried restarting the guest machine and reloading/restarting the nginx service, and I disabled the Windows 10 firewall (I don't know if it's fully disabled, but it says that it is). Also, the log files are empty; in fact they are not even created, neither the access log nor the error log.
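For reference, one way to test whether the vhost answers at all, independent of the Windows hosts file (assuming curl is available on the host), is to send the Host header by hand:
curl -H "Host: devhost.local" http://192.168.33.10/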
Where is my mistake? If you need more information, please ask and I will provide it.
Thanks a lot!
The following nginx setup should help:
server {
    listen 80 default_server;
    root /var/www/html/$host;
    index index.html index.php;
    location ~ \.php {
        # ... fastcgi details
    }
}
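For completeness, a sketch of what the elided fastcgi block might look like on Ubuntu Xenial with PHP-FPM (the snippet path and socket name are assumptions that depend on the installed PHP version):
location ~ \.php$ {
    include snippets/fastcgi-php.conf;           # helper shipped with Ubuntu's nginx package
    fastcgi_pass unix:/run/php/php7.0-fpm.sock;  # assumed PHP-FPM socket; check with: ls /run/php/
}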
I found the solution.
First of all, when I kept only the one config file, nginx was not listening on port 80; I checked with sudo netstat -ntlp | grep LISTEN and port 80 wasn't there. So I Googled and found another question on Stack Overflow (see the link at the end).
Solution: recreate the symlink to my config file. After that, when I ran sudo nginx -t, I saw a few errors. It seems the file in sites-enabled had been empty or something like that, but I didn't notice because I was editing the file directly in the sites-available folder.
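In other words, something like this (wildcard.conf is a hypothetical name; use the actual config file):
sudo rm /etc/nginx/sites-enabled/wildcard.conf
sudo ln -s /etc/nginx/sites-available/wildcard.conf /etc/nginx/sites-enabled/wildcard.conf
sudo nginx -t && sudo service nginx reload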
Thanks to everybody!
This question helped me solve the problem: nginx not listening to port 80

Correct steps to set up Ambari on a CentOS VM

I am using CentOS 7 with Ambari 2.1.1 to try to set up a single-node installation on a VM. I want to do this to install vanilla Hadoop etc. instead of a prepackaged VM with some modified version of Hadoop.
I am logged in as root. I have created an SSH key pair. I also ran:
"cat id_rsa.pub > authorized_keys"
"chmod 700 .ssh/"
"chmod 640 ./ssh/authorized_keys"
I have edited /etc/ssh/sshd_config to permit empty passwords, allow root login, and state where the authorized_keys file is.
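For reference, those edits correspond to the following sshd_config directives (a sketch based on the description above):
PermitRootLogin yes
PermitEmptyPasswords yes
AuthorizedKeysFile .ssh/authorized_keys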
Without a password I can run "ssh root@localhost" and log in fine.
I have run "ambari-server setup" successfully and logged in at localhost:8080 with user: admin, pass: admin.
In "Install Options" FQDN I typed "localhost.test" and have selected a copy of my private key for the Host Registration Information.
But not matter what I do I am unable to get the components install under the confirmed hosts part and thus can't get any further.
Can someone please point out what I am missing here?
Thanks to Yusaku on the Hortonworks forum for the help.
OK, I ran:
hostname -f
and got localhost. Then I ran:
python -c 'import socket; print socket.getfqdn()'
and got localhost.localdomain.
By entering localhost.localdomain into the FQDN field I was able to get the install working.
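As a follow-up sketch, one way to make the two names agree on CentOS 7 (assuming you want the machine to identify itself by that FQDN) is:
sudo hostnamectl set-hostname localhost.localdomain   # persists across reboots
hostname -f                                           # should now print localhost.localdomain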

Hadoop 2.6.0: sudo sbin/start-dfs.sh fails

I'm following the official Hadoop tutorial to run Hadoop on my machine in pseudo-distributed mode.
I can use ssh to log in to localhost without a password:
admin@mycomputer:/usr/local/hadoop/hadoop-2.6.0$ ssh localhost
Welcome to Ubuntu 14.04.1 LTS (GNU/Linux 3.13.0-45-generic x86_64)
* Documentation: https://help.ubuntu.com/
4 packages can be updated.
0 updates are security updates.
Last login: Mon Feb 9 12:31:17 2015 from localhost
admin@mycomputer:~$
And I can also format the namenode without error, but I cannot start Hadoop with start-dfs.sh:
admin@mycomputer:/usr/local/hadoop/hadoop-2.6.0$ sudo sbin/start-dfs.sh
Starting namenodes on [localhost]
root@localhost's password:
localhost: Permission denied, please try again.
Why am I still asked to provide the root password when I can ssh into localhost without one?
I also tried:
sudo passwd
to reset the password, but I later encountered the same permission-denied error. It seems to me that this password is not the password for root@localhost. How can I solve this problem?
I think you didn't change the permissions on the hadoop-2.6.0 folder. Give the admin user ownership of this folder and start the daemons without sudo. (Running start-dfs.sh under sudo makes its helper scripts ssh as root, and your passwordless key is set up for admin, not root, which is why you are prompted for root's password.)
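A minimal sketch of that fix, assuming the admin user and the paths from the question:
# Give admin ownership of the Hadoop tree (assumes a group named admin exists),
# then start HDFS without sudo so the scripts ssh as admin:
sudo chown -R admin:admin /usr/local/hadoop/hadoop-2.6.0
cd /usr/local/hadoop/hadoop-2.6.0
sbin/start-dfs.sh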
Follow my blog link below; I provide detailed installation steps for Ubuntu, enriched from another blog:
http://gubendran.blogspot.com/2015/01/install-hadoop-in-single-node-linux.html

Cannot log in to the FTP server with authenticated users

I have set up an FTP server locally and created authenticated users to access it.
I want to give the users privileges such as delete, write, and read.
I followed a lot of tutorials, but still couldn't log in to the FTP server.
Here are the changes I have made.
Uncommented the following lines in /etc/vsftpd.conf:
local_enable=YES
write_enable=YES
chroot_list_enable=YES
added users to the /etc/vsftp.chroot_list file, then:
sudo touch /etc/vsftp.chroot_list
Finally, I restarted the server, but I still can't log in to the FTP server.
First of all, check that port 21 is listening with the command below. If you have customized the port, replace 21 with that port number.
netstat -pan | grep 21
If it is listening, then check whether you can FTP locally:
ftp localhost
Then stop the firewall on your server for a while and try again.
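For example, on Ubuntu (assuming ufw manages the firewall), a temporary stop looks like this:
sudo ufw disable    # turn the firewall off for the test
# ...try the FTP login again, then re-enable it:
sudo ufw enable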

Ubuntu Linux in Rackspace cloud but www-data user does not show www-data@server1 for login shell

I have a server running Ubuntu 11.10 in the Rackspace cloud.
For root and other users, the login prompt is successfully shown in the format
root@server1:
However, this is not true for the www-data user.
How do I correct this?
Root and the other user accounts are set to /bin/sh.
You need to change the login shell for www-data.
Log in via ssh as root
and type
chsh -s /bin/bash www-data
This changes the login shell for www-data to /bin/bash, so its prompt will show www-data@server1 like the other accounts.
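To verify the change (the login shell is the last field of the user's /etc/passwd entry):
grep '^www-data:' /etc/passwd
# the line should now end in :/bin/bash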
