Restricting user access on SFTP server backed by S3 - amazon-ec2

I am trying to create an SFTP server backed by S3. I have already succeeded in installing vsftpd and s3fs, linking them together, and things are working just fine.
Requirements:
An FTP server will have more than one user, and each user will have a different S3 bucket linked to their FTP folder. [done]
Approach: Created two different users (say user1 & user2) and mounted the buckets to their home directories.
One user should not be able to view the folders and files of other users.
Approach: Since I created two different users, I thought access would be restricted. But it looks like when I mount a bucket using s3fs, it changes the folder's permissions to 777.
Now the issue is, I can't restrict my users from accessing each other's files.
My /etc/vsftpd.conf looks like this:
ftpd_banner=Welcome to Dave's FTP service.
# Now restrict users to their home directories:
chroot_local_user=YES
allow_writeable_chroot=YES
I have seen and tried questions like this, this, and this. They didn't help, so please think again before marking this as a duplicate.

You need to create the mount directory inside the user's home directory, e.g. /home/user1/mountToS3, and then restrict that directory using chmod and chown. This should do it. Let me know if you find any issue. :)
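
A minimal sketch of that setup, assuming user1 has uid/gid 1001 and the bucket is named user1-bucket (both are placeholders; the s3fs uid/gid/umask options keep the mount from defaulting to 777):

# Create and lock down the mount point (placeholders: user1-bucket, uid/gid 1001)
mkdir /home/user1/mountToS3
chown user1:user1 /home/user1/mountToS3
chmod 700 /home/user1/mountToS3
# uid/gid/umask make s3fs present the mount as owned by user1 with mode 700
s3fs user1-bucket /home/user1/mountToS3 -o uid=1001 -o gid=1001 -o umask=0077

With umask=0077 the mounted tree shows up as accessible only to its owner, so user2 cannot browse into it.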

Related

Forcing user created folders and files to inherit a specific parent folder's groups in macOS

I have multiple users accessing a file system, and I need to make sure that each time they create a folder it inherits the groups and permissions of the parent folder. Currently, each time they create a folder or upload a file, its group defaults to that user's default group.
Some of these parent folders have multiple groups assigned to them.
Without changing the default group assigned to the user I need to make sure that any folders/files created/uploaded to the system inherit the group(s) of the parent folder.
I have been playing with:
chown
chgrp
chmod -R +a "g:somegroup allow list,file_inherit,directory_inherit,...,..." /file/path
chmod g+s /file/path
But it has not solved the problem. When the users create folders and upload files, they are still associated with that user's default group.
The issue with this is that if bob creates a folder and then joe tries to use it, joe does not have permission to write in that folder, because the default group associated with bob does not grant group-write (775) permissions.
That means the folder is only accessible by the owner.
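
In case it helps, the ACL approach above is usually written with an explicit inheritance list; a sketch only, with somegroup and /file/path as the placeholders from the commands above:

# Grant somegroup full directory rights that new files/folders inherit
chmod -R +a "group:somegroup allow list,add_file,search,add_subdirectory,delete_child,readattr,writeattr,readextattr,writeextattr,file_inherit,directory_inherit" /file/path

The file_inherit and directory_inherit flags are what cause newly created items to pick up the ACL entry.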
I'm hoping to have lots of different groups built into the file system for siloing different clients, users, etc., with many people having access to everything, so I can't change every user's default group. I need to be able to just add users to the groups they are supposed to be able to write to.
Suggestions?
We are also having this issue on a brand-new iMac running High Sierra. It is killing us. I noticed no one answered, but I am hoping maybe you found a solution.
Debbie

Is there a way to list files in an FTP directory that only anonymous has access to?

Admittedly, this question is just to assist me in an argument I'm scheduled to have with a client.
Our devs, who reside in another country, have an FTP server with mostly full public access available to all anonymous users; this is to simplify the acquisition of new documents and updates for users of the application.
One directory in particular, let's call it updates, actually houses all the new updates but does not grant a directory listing to anonymous users due to access restrictions, so if you try to list the files in the directory using an FTP client, you're met with the generic response:
550 Access is denied.
Failed to retrieve directory listing
However, if you have the exact URL for a file available to the anonymous users in that directory e.g. ftp://ftp.company.com/updates/latest_update_1.zip you can very easily download that file without issue.
My question comes in because I have a client who is somehow monitoring that directory as an anonymous user, knows when a new file (which anonymous has access to) becomes available in that directory, and then immediately downloads it. This directly affects their application, as files are often dropped there by devs during QA and are not officially available, since we've not yet sent out notice of the change log and URL.
So my question is: how exactly is this client doing this? How is he able to list files that anonymous has access to, in a directory which does not list its files to anonymous users?
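
One plausible explanation (an assumption on my part, not something the question confirms) is that the client never lists the directory at all and simply probes predictable file names. A sketch using curl, based on the latest_update_1.zip pattern from the example URL:

# Probe guessed file names; -I asks for file info (SIZE) instead of downloading
for n in $(seq 1 100); do
  if curl -sf -I "ftp://ftp.company.com/updates/latest_update_${n}.zip" > /dev/null; then
    echo "available: latest_update_${n}.zip"
  fi
done

Since the files themselves are readable by anonymous, no directory listing is ever needed; a cron job running a loop like this would notice a new file within minutes of it appearing.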

localhost on a Mac, MySQL root, write-enabling folders, and migrating to a real server

I'm developing a site on an XAMPP localhost on a Mac. I manipulate my MySQL database via phpMyAdmin (I'm not comfortable with the command line).
Everything works fine (I know, right!).
2 things have got me worried for when I eventually move my site to a real online live server.
First the background:
1) I am using a CMS/framework type thing. When trying to install it (in the htdocs folder), I found that I needed to write-enable some folder or other (file-system permissions in Finder). So I write-enabled all the folders contained in the mother folder. Macs have 3 default types of users (right-click a folder in Finder and choose Get Info): "Me", "admin", and "everyone". I right-clicked the mother folder (in Finder), selected "Read & Write" for all 3 types of users, and chose "Apply to enclosed items." The installation worked out fine.
2) I am able to come and go as I please into phpMyAdmin to directly manipulate my database. I presume phpMyAdmin recognizes me as Root. I do not have a password for Root. I do have a separate user created with a password (let's call the user "specificdbuser") and I use "specificdbuser" to connect to the database from within my site's PHP code.
My concerns regarding 1 & 2 are:
1) I'm presuming that enabling Read & Write permissions for all 3 types of users, and in particular for all folders and items within the mother folder, is a security risk. Is there a better way? (a) How do I figure out which folders need to be writable, so that I only make those writable instead of making everything writable? And (b) instead of giving Read & Write permissions to the 3 default Mac user types, should I be creating some new type of user (root? specificdbuser?) and only give that user Read & Write permissions? As this is a website, do I need to give "everyone" permission to Read & Write? What the heck does "everyone" mean anyway?
2) Let's say I eventually set up my database's root account with a password. When I migrate my localhost site to a real live online server, will this root/password combination work there too?
I'm kind of confused: are you talking about file-system permissions or MySQL database permissions? If it is a file-system question, then please check which web-service user runs your PHP scripts. If it's a database-permission question, then please refer to point 2 below.
I would say, for security reasons, never use "root" when connecting to your database. I would suggest you set up the same user name/password/permissions locally and on the server. But if that doesn't make sense, you can have a config file that says: if "localhost" then db_user = blah_blah, else (on the server side) db_user = blah.
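
A sketch of creating that same non-root user on both machines, assuming a database named your_database (a placeholder; specificdbuser is the user from the question):

# Run this on both the local machine and the server (adjust the password)
mysql -u root -p <<'SQL'
CREATE USER 'specificdbuser'@'localhost' IDENTIFIED BY 'choose_a_password';
GRANT ALL PRIVILEGES ON your_database.* TO 'specificdbuser'@'localhost';
FLUSH PRIVILEGES;
SQL

With identical credentials in both places, the site's PHP code needs no changes when it is migrated.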

Apache 2 on Mac - localhost requires authentication

I'm using Apache 2 to run my localhost on a Mac (Mavericks), and every time I add a file or a folder in the default directory /Library/WebServer/Documents (and its subdirectories), the system asks me to authenticate.
This is a problem especially when using frameworks like Symfony or Zend Framework, because they can't get write access to folders. What can I do to solve this?
Mac/Linux grants access on different levels:
Per user
Per group
Per everybody
The folder /Library/WebServer/Documents lives outside of the logged-in user's (your) domain, so write access (and other access, like execute) is only granted to the administrator of the computer (so-called root), which isn't you. However, in most Mac environments the administrator's password is your password (different users with the same password).
You will want to modify the Apache 2 configuration and change the document root to a folder located within your user's directory, which is something like /Users/{whatever is your username} (you will probably need to create the folder).
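
A sketch of that change, assuming your username is yourname and the new folder is /Users/yourname/Sites (both placeholders), with the config at the usual /etc/apache2/httpd.conf location:

# Edit the Apache config (path is the OS X default; adjust if yours differs)
sudo nano /etc/apache2/httpd.conf
# In httpd.conf, replace both occurrences of the old path:
#   DocumentRoot "/Library/WebServer/Documents"
#   <Directory "/Library/WebServer/Documents">
# with the new one:
#   DocumentRoot "/Users/yourname/Sites"
#   <Directory "/Users/yourname/Sites">
sudo apachectl restart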
Once you have moved the content of /Library/WebServer/Documents into the new folder, make sure the permissions are set properly. Refer to the documentation at the chapter "How to Modify Permissions with the Info Window".
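
For example (a sketch; yourname is again a placeholder):

# Move the old document root's content and make your user its owner
sudo mv /Library/WebServer/Documents/* /Users/yourname/Sites/
sudo chown -R yourname:staff /Users/yourname/Sites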
The reason your Mac asks for permission to write files outside of your user's directory is security. Imagine you download a file, execute it, and grant it access (by giving the root password): the file could potentially be a virus and erase or do all kinds of things on your computer.

Copying large files over

I have a dedicated server where I host a large website. We need to do an upgrade on the website, and I want to create a development copy on a test URL (on a different cPanel account) but on the same server.
The files are around 1GB in total size and 70,000 in number.
I have tried WS_FTP Pro, but it has only copied 10% in around 20 hours.
What's the easiest and quickest method to create a replica on my development URL?
I am a newbie so please give detailed instructions.
Thanks
I would think the easiest method would be this:
Create the new account in WHM
Login via SSH
Navigate to your existing account folder
Copy the files to the new account folder
This should be pretty easy for you, as long as you know how to access your server via SSH. It's pretty simple:
Login via SSH
Type su and enter your root password (this is only necessary if you SSH into your server using an account other than root - a good practice, in my opinion)
Find and navigate to your source account. I'm assuming you're probably set up to have your web accounts in the /home folder, so try typing something like cd /home/source_folder
Once you're in the correct source directory, type cp -R * /home/destination_folder
That's pretty much it. The -R option recursively copies all the files from your source to your destination, and if you're copying a HUGE number of files, you might consider adding --verbose after the -R option so you can see it working. I apologize in advance if I've gone a little more granular than needed.
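
One alternative worth mentioning (not part of the steps above): rsync can resume an interrupted copy and, unlike cp -R *, also picks up hidden dotfiles:

# -a preserves permissions/ownership/timestamps; the trailing slash copies
# the directory's contents (including dotfiles) rather than the folder itself
rsync -av /home/source_folder/ /home/destination_folder/

If the transfer dies partway through, re-running the same command skips everything already copied.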
