Synology: remove admin account access to user folder

Good evening,
At home we just started using a Synology NAS DS1815+. The problem is that three of us have the admin password, making it impossible for any of us to have a truly private folder on the NAS.
My question is: is it possible to create a folder that only a specific user can access, so that nobody can see its contents even with the Synology NAS admin password?
Cheers, and thanks in advance.

You can open the file browser in Synology DSM and create a new shared folder with encryption enabled; just don't select automatic mounting after startup...
But having more than one admin is generally not a wise idea, and sooner or later some other trouble will come up...
I suggest you create one admin user and agree with the others not to use it - logins are visible in the logs...

Related

Can a user create a shared folder in Active Directory?

I'm using Active Directory and belong to a specific group (not an administrator).
I made a folder on my C: drive and then tried to share it with users in another group.
But I can't. I just get the warning message 'Access denied. You did not make shared resources.'
Is there a way to take care of this problem?
Thank you!
Sharing of resources might have been blocked by your administrator; either get yourself added to the Power Users group or ask them to share the folder for you.
If you have administrator access, log in with that and try. Otherwise, a less privileged user cannot share resources.

localhost on a Mac, MySQL root, write-enabling folders, and migrating to a real server

I'm developing a site on an XAMPP localhost on a Mac. I manipulate my MySQL database via phpMyAdmin (I'm not comfortable with the command line).
Everything works fine (I know, right!).
Two things have me worried for when I eventually move my site to a real live online server.
First the background:
1) I am using a CMS/framework type thing. When trying to install it (in the htdocs folder), I found that I needed to write-enable some folder or other (file system permissions in Finder). So I write-enabled all the folders contained in the mother folder. Macs have three default types of users (right-click a folder in Finder and choose Get Info): they are "Me", "admin", and "everyone". I right-clicked the mother folder (in Finder), selected "Read & Write" for all three types of users, and chose "Apply to enclosed items." And the installation worked out fine.
2) I am able to come and go as I please in phpMyAdmin to directly manipulate my database. I presume phpMyAdmin recognizes me as root. I do not have a password for root. I do have a separate user created with a password (let's call the user "specificdbuser"), and I use "specificdbuser" to connect to the database from within my site's PHP code.
My concerns regarding 1 & 2 are:
1) I'm presuming that enabling Read & Write permissions for all three types of users, and in particular for all folders and items within the mother folder, is a security risk. Is there a better way? (a) How do I figure out which folders need to be writable, so that I only make those writable instead of making everything writable? And (b) instead of giving Read & Write permissions to the three default Mac user types, should I be creating some new type of user (root? specificdbuser?) and giving only that user Read & Write permissions? As this is a website, do I need to give "everyone" permission to Read & Write? What the heck does "everyone" mean, anyway?
2) Let's say I set up my database's root account with a password. When I eventually migrate my localhost site to a real live online server, will this root/password combination work there too?
I'm kind of confused: are you talking about file system permissions or MySQL database permissions? If it is a file system question, then please check which web service user runs your PHP scripts. If it's a database permissions question, then please refer to the answer to #2.
For security reasons, I would say never use root when connecting to your database. I would suggest you set up the same username/password/permissions locally and on the server. But if that doesn't make sense, you can have a config file that says: if localhost, then db_user = blah_blah; else (on the server side) db_user = blah.
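For example, here's a rough sketch of that first suggestion from the command line, assuming the MySQL client tools are on your PATH and your site's database is called mysite (the database name and both passwords below are placeholders for whatever you actually use). You would run the same commands locally and again on the live server:

    # Set a root password if one isn't set yet (replace the placeholder).
    mysqladmin -u root password 'new-root-password'

    # Create the application user and limit it to the site's own database.
    mysql -u root -p <<'SQL'
    CREATE USER 'specificdbuser'@'localhost' IDENTIFIED BY 'choose-a-strong-password';
    GRANT SELECT, INSERT, UPDATE, DELETE ON mysite.* TO 'specificdbuser'@'localhost';
    FLUSH PRIVILEGES;
    SQL

That way root never appears in your PHP code, and migrating is just a matter of re-running the same setup on the server.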

Apache 2 on Mac - localhost requires authentication

I'm using Apache 2 to run my localhost on a Mac (Mavericks), and every time I add a file or a folder in the default directory /Library/WebServer/Documents (and its subdirectories) the system asks me to authenticate.
This is a problem especially when using frameworks like Symfony or Zend Framework, because they can't get write access to folders. What can I do to solve this?
Mac/Linux grants access on different levels:
Per user
Per group
Per everyone
The folder /Library/WebServer/Documents lies outside of the logged-in user's (your) area, so write access (and other access, like execute) is granted only to the administrator of the computer (the so-called root), which isn't you. However, in most Mac environments the root password is your password (different users with the same password).
You will want to modify the Apache 2 configuration and change the directory to a folder located within your user directory, which is something like /Users/{whatever your username is} (you will probably need to create the folder).
Once you have moved the content of /Library/WebServer/Documents into the new folder, make sure the permissions are set properly. Refer to the documentation chapter "How to Modify Permissions with the Info Window".
The reason your Mac asks for permission to write files outside of your user directory is security. Imagine you download a file, execute it, and grant it access (by giving the root password): the file could potentially be a virus and erase things or do all kinds of other damage on your computer.
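As a rough sketch of those steps in Terminal, assuming your username is alice and Apache's configuration file is at /etc/apache2/httpd.conf (its usual location on Mavericks; the Sites folder name is just an example):

    # Create a web root inside your home folder and copy the current content in.
    mkdir -p ~/Sites
    sudo cp -R /Library/WebServer/Documents/ ~/Sites/

    # Give your own user ownership so frameworks can write their cache and log folders.
    sudo chown -R alice:staff ~/Sites

    # Edit DocumentRoot (and the matching <Directory> block) in
    # /etc/apache2/httpd.conf to point at /Users/alice/Sites, then restart Apache.
    sudo apachectl restart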

How to make updates to a Windows 8 hosts file remotely

We have a simple problem: we are setting up our new server, and before we point our DNS at the server we want to check everything, so we are changing the hosts file on our Windows 8 machine to do this.
That all works fine. The problem is that we are migrating tons of sites over, the employees testing the sites for us are not very tech savvy, and because we are updating the hosts file a lot during the day, we figured we could temporarily share the /etc/ folder that contains the hosts file.
However, we can only view it from another network PC, not update it with the new sites.
It gives an "access denied" error.
We have tried everything: changed ownership, granted administrator permission, granted permissions to Everyone. Still we can't update it, and having my IT team go to each PC individually to update them is taking too long.
Any Suggestions?
And yes, we know this is bad security protocol, but again, it is temporary, so no lecture comments on that please.
This was actually a very simple problem: we had never set the share permissions, just the file permissions.

Copying large files over

I have a dedicated server where I host a large website. We need to do an upgrade on the website, and I want to create a development copy at a test URL (on a different cPanel account) but on the same server.
The files are around 1 GB in total size and 70,000 in number.
I have tried WS_FTP Pro, but it copied only 10% in around 20 hours.
What's the easiest and quickest method to create a replica on my development URL?
I am a newbie so please give detailed instructions.
Thanks
I would think the easiest method would be this:
Create the new account in WHM
Log in via SSH
Navigate to your existing account folder
Copy the files to the new account folder
This should be pretty easy for you, as long as you know how to access your server via SSH. It's pretty simple:
Log in via SSH
Type su and enter your root password (this is only necessary if you SSH into your server using an account other than root - a good practice, in my opinion)
Find and navigate to your source account. I'm assuming you're probably set up to have your web accounts in the /home folder, so try typing something like cd /home/source_folder
Once you're in the correct source directory, type cp -R * /home/destination_folder
That's pretty much it. The -R option recursively copies all the files from your source to your destination, and if you're copying a HUGE number of files, you might consider adding --verbose after the -R option so you can see it working. I apologize in advance if I've gone a little more granular than needed.
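If cp turns out to be slow or you want visible progress, rsync (usually available on cPanel servers) is worth a try as a sketch of an alternative: it preserves permissions and also copies hidden dotfiles, which the * glob above would skip. The folder names are the same placeholders as above, and newuser stands in for the new account's username:

    # Copy everything, preserving permissions and timestamps; --progress shows activity.
    rsync -a --progress /home/source_folder/ /home/destination_folder/

    # Files copied as root keep their old ownership; hand them over to the new
    # cPanel account (replace newuser with the new account's actual username).
    chown -R newuser:newuser /home/destination_folder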
