Prevent object deletion from Google Storage - Windows

I bought the CloudBerry Ultimate software (link for more information) to make backups. In that software I can control when it deletes the objects I backed up, but I want to be sure that it is impossible to delete files from my Google Storage Nearline bucket. I know that Amazon Web Services has Amazon Glacier with Vault Lock, which prevents objects from being deleted for a period of time and makes it impossible (even with full administrative privileges) to delete any object or modify the parameters.
Does anyone know how I can prevent any object from being deleted from my Google Nearline account?

It's not possible. There's nothing similar to Glacier Vault Lock. You can temporarily suspend a user, though.
You can temporarily block a user's access to your organization's
Google services by suspending their account. This doesn't delete their
email, documents, calendars, or other data. And their shared documents
remain accessible to collaborators. But the user can no longer sign in
to their account. Also, new email and calendar invitations are
blocked. After suspending a user, you can restore the account at any
time.
https://support.google.com/a/answer/33312
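If you want to script that suspension step, here is a minimal sketch using the Admin SDK Directory API with google-api-python-client; it is not from the answer above, the key file and email addresses are placeholders, and it assumes a Workspace/G Suite domain where the calling account has admin rights:

```python
# Minimal sketch, assuming a Workspace / G Suite domain and admin credentials.
# The key file path and the email addresses below are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/admin.directory.user"]

creds = service_account.Credentials.from_service_account_file(
    "admin-key.json", scopes=SCOPES
).with_subject("admin@example.com")   # domain-wide delegation to an admin user

directory = build("admin", "directory_v1", credentials=creds)

# Suspend (and later restore) a user without deleting any of their data.
directory.users().update(
    userKey="user@example.com", body={"suspended": True}
).execute()
```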

Related

Service account doesn't get the quota from Google One

A friend of mine subscribed to Google One and granted me permission to use the storage. The problem is that when I'm using a service account to upload a file to Google Drive, the storage shows as 15 GB and it doesn't let me upload more than that. Is there any way to grant the storage from my account to the service account?
I couldn't find anything related to Google One; any direction would help.
Edit: I used OAuth2. It isn't the best alternative, but I really don't want to go deeper into this.
Service accounts are dummy users; they have their own Drive account, but there is no way to extend the allotted space of that account.
Storage is tied to an account. If you want to use the storage of a standard Gmail user with a service account, then the service account needs to upload to that user's Drive account.
Have your friend share a directory on their Drive account with the service account. Then, when you upload files, make sure to set the parents ID to that of the folder you are uploading to.
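As a rough illustration of that last step (not from the original answer), an upload with google-api-python-client might look like this; the key file, folder ID and file name are placeholders:

```python
# Minimal sketch: upload a file as the service account into a folder that the
# Google One user has shared with it. Key file, folder ID and file name are
# placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/drive"],
)
drive = build("drive", "v3", credentials=creds)

SHARED_FOLDER_ID = "REPLACE_WITH_FOLDER_ID"  # folder your friend shared

metadata = {"name": "backup.zip", "parents": [SHARED_FOLDER_ID]}
media = MediaFileUpload("backup.zip", resumable=True)

created = drive.files().create(
    body=metadata, media_body=media, fields="id"
).execute()
print("Uploaded file id:", created["id"])
```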

How to programmatically access only one specific Google Drive without a service account

I am writing a server-side Python script with PyDrive which needs to store a file in a specific Google Drive. PyDrive and this post suggest using a service account.
However, this would mean that all drives are accessible with the credentials of this service account, which I would rather avoid.
Ideally, only one specific drive, or all drives that one specific user has access to, should be accessible.
Is it possible to programmatically grant access to only one specific drive?
[Edit]
As mentioned in the comments, I am apparently not looking for an OAuth flow.
I am looking for server-to-server communication that accesses one specific Google Drive following the principle of least-privilege access. Doing this with a service account + domain-wide delegation and the Google Drive r/w scope would mean that all Google Drives can be accessed with this service account, which is not what I want.
Unfortunately, there is a domain-wide policy in place which forbids sharing Google Drives with "other" domains. This means I cannot use a service account without domain-wide delegation and just share the drive with it.
I don't understand what you mean by "programmatically" when you have tagged the question as OAuth, asking for an OAuth2 flow, which is interactive. When there is nobody to press the buttons, that probably isn't the authentication flow you're looking for. Just share a directory with a service account; no domain-wide delegation is required (with that enabled, there would be no need to share it).
One could even abstract the Drive API access credentials away entirely by using a simple Cloud Function whose only task is to update one file, triggered through HTTP and using the Drive API.
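To make that Cloud Function idea concrete, here is a minimal sketch (not from the answer itself, so treat it as an assumption) using functions-framework and google-api-python-client; the target file ID is a placeholder, and the function's runtime service account is assumed to already have access to that one file, for example because the file or its folder was shared with it:

```python
# Minimal sketch of an HTTP-triggered Cloud Function that updates one specific
# Drive file. The file ID is a placeholder; the runtime service account must
# already have access to that file.
import functions_framework
import google.auth
from googleapiclient.discovery import build
from googleapiclient.http import MediaInMemoryUpload

# Hypothetical ID of the one file this function is allowed to touch.
TARGET_FILE_ID = "REPLACE_WITH_FILE_ID"


@functions_framework.http
def update_drive_file(request):
    # Uses the function's runtime service account credentials. Pick the
    # narrowest Drive scope that still works for your sharing setup.
    creds, _ = google.auth.default(
        scopes=["https://www.googleapis.com/auth/drive"])
    drive = build("drive", "v3", credentials=creds)

    # Overwrite the file's content with the body of the HTTP request.
    media = MediaInMemoryUpload(request.get_data(), mimetype="text/plain")
    drive.files().update(fileId=TARGET_FILE_ID, media_body=media).execute()
    return "updated", 200
```

Callers then only need permission to invoke the function over HTTP, never the Drive credentials themselves.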
Possible approach - dummy account
You could designate a new account to act as your "service account". In reality it won't be an actual service account; it will just be a dummy account that you can call something like "gdrivebot@yourdomain.com". Then you can share only what is absolutely necessary with it. I think this would be the only way to get the level of fine-grained control that you are looking for. This would require your admin to designate a new account just for this purpose, though.

Can I clear users out of my AD without harming any entries in a Dynamics CRM?

Apologies for the basic question; we're having a spring clean of the office Active Directory and plan to remove a large number of legacy users. Saying good-bye to their email is not a problem, but we have an on-premise Dynamics CRM we occasionally refer to. My question is, will there be any implications for that if I delete a user who might have entered a case?
There is no direct link between on-premise CRM 2011 and Active Directory that pulls in all users overnight and syncs them. When you create a new user in CRM by entering the domain name, CRM verifies it in AD and pulls the details to store in CRM. This happens on tab-out.
So when you delete/disable an AD user, the change won't flow down into Dynamics. You have to disable the user in CRM manually (there is no delete option). Before doing that, make sure to read these best practices.
Best Practices
Make sure to re-assign any associated records/activities to another User or Team before disabling a User. If you don't re-assign the records, they will still be available, but they will still be assigned to the disabled user.
It is very important to ensure that there are no Workflows owned by the User to be disabled. All published Workflows need to be owned by an administrative account, not an employee's account.
There are situations where a User's account only needs to be disabled for a short period of time, so records don't necessarily need to be re-assigned (for example, the User went on vacation for a month). Take into consideration the User's privileges for those records: if only the User can modify a record, then no one will be able to modify that record while its owner is disabled.
Read this community thread as well.

Google Drive Access - Service Account or OAuth - To read/write user files

I have a .NET console application that performs operations on files. I would like to allow clients to give us access to their Google Drive accounts so we can read and write files. Our console application runs as a service so there is no way for the user to interact with it and authorize our access to their Google Drive account.
I was looking at using a Google Service Account for application level authentication until I learned that a Service Account does not have access to the Google Drive folder of the user that sets up the Service Account. This sort of defeats the purpose because it is the client's Google Drive account I am looking to gain access to.
I saw a workaround posted by SO member pinoyyid in this SO answer, where the refresh token can be generated using Google's OAuth2 Playground, but I am concerned that the refresh token could expire and user intervention would be needed again to generate another one.
Another response mentioned the solution was to create the Service Account and then share the user's Google Drive account with the Service Account.
What is the recommended approach by Google? How best to gain access to a Google Drive account while only requiring the owner to authenticate on a one-time basis, yet allowing them the ability to revoke access at any time?
Both a Service Account and a stored OAuth refresh token are viable approaches. Each has its pros and cons.
A Service Account will work where your users only need to grant access to a specific folder, which they can share with the SA. Be aware that any files the SA creates are owned by, and consume the quota of, the SA. You can't "share the user's Drive account with the SA"; you can only share individual folders.
Storing a refresh token is the more permissive option. You wouldn't use the OAuth Playground as described in my answer that you referenced, as that's far too clunky to ask users to go through. Instead, you would need to write your own registration/authorisation service (you can use App Engine, Lambda, etc., so it's not difficult to write and host).
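As a rough Python illustration of the stored-refresh-token flow (the asker's app is .NET, so this only shows the shape of it; the client ID, secret and refresh token are placeholders that your own registration/authorisation service would have captured once):

```python
# Minimal sketch, assuming the refresh token was captured once by your own
# authorisation service and stored per client. All values are placeholders.
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

creds = Credentials(
    token=None,                          # access token is fetched on demand
    refresh_token="STORED_REFRESH_TOKEN",
    token_uri="https://oauth2.googleapis.com/token",
    client_id="YOUR_CLIENT_ID",
    client_secret="YOUR_CLIENT_SECRET",
    scopes=["https://www.googleapis.com/auth/drive"],
)

drive = build("drive", "v3", credentials=creds)

# List a few of the client's files to confirm access. The client can revoke
# this access at any time from their Google account permissions page.
result = drive.files().list(pageSize=10, fields="files(id, name)").execute()
for f in result.get("files", []):
    print(f["id"], f["name"])
```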

Secure folder contents and delete them after certain number of days

I would like to secure a folder so that no one can cut or copy any file, or the contents of any file, without a "secure" password (I am also happy to drop the password bit, so that no one can cut, copy or move any file or file contents from the folder). Also, it would be great if all files and folders inside my root folder could be deleted after a certain number of days. This is to stop people from copying and distributing my files to others without my permission, and to make the folder contents "expire" after a certain number of days (e.g. 7 days).
Currently, I manually copy the folder to other people's machines, so I do have physical access to their machines.
PS: I am happy to write a script as well, in case there is a way to execute a script every time I open the folder.
I understand I can't stop people from stealing file contents by manually typing them into another file or taking photos of them; however, I want to make it harder for them.
This is not a PowerShell issue, nor a solution provided by PowerShell. This is a data risk management issue as well as a reality check.
Don't get me wrong, you can write a script that encrypts data,
https://blogs.technet.microsoft.com/heyscriptingguy/2015/03/06/powertip-encrypt-files-with-powershell
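For example, a rough Python equivalent of that encrypt-a-file idea (the linked post uses PowerShell; the cryptography package and file names here are just illustrative assumptions):

```python
# Illustrative only: symmetric encryption of one file with the "cryptography"
# package. This protects the file at rest, but once a reader can decrypt and
# view it, nothing here stops copy/paste, printing, or photographing it.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # store this key somewhere safe
fernet = Fernet(key)

with open("secret.docx", "rb") as f:          # hypothetical file name
    ciphertext = fernet.encrypt(f.read())

with open("secret.docx.enc", "wb") as f:
    f.write(ciphertext)

# Decryption later, given the same key:
with open("secret.docx.enc", "rb") as f:
    plaintext = fernet.decrypt(f.read())
```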
You can even just use EFS, but each of those has several limitations.
https://technet.microsoft.com/en-us/library/bb457116.aspx
Then there are password-encrypted zip files. But...
None of the above stops cut/copy/paste/print, and there is no way to make them do so.
Here is the simple truth about data security, which I deliver at all my public speaking engagements and customer deployment engagements.
Nothing can defeat an ocular attack. Meaning...
'If I can see your data, I can take your data.'
It may take me longer than being able to just bulk-exfiltrate your data (copy it to a USB drive, CD, DVD, native print, etc.), but I can simply take a picture, photocopy it, screen-grab it from another device, or manually write it down.
Any of those methods allows me to walk away with it and give it to whomever I like.
You can only mitigate / slow down / prevent bulk exfiltration using DLP/RMS protection solutions.
Why are you putting this manually on their systems, versus hosting it in the cloud where they can access it? If you do this in MS Azure, you can leverage Azure Information Protection.
RMS for individuals and Azure Information Protection
RMS for individuals is a free self-service subscription for users in
an organization who need to open files that have been protected by the
Azure Rights Management service from Azure Information Protection. If
these users cannot be authenticated by Azure Active Directory and
their organization does not have Active Directory Rights Management
(AD RMS), this free sign-up service can create an account in Azure
Active Directory for a user. As a result, these users can now
authenticate by using their company email address and then read the
protected files on computers or mobile devices.
https://learn.microsoft.com/en-us/information-protection/understand-explore/rms-for-individuals
Why are you not heavily watermarking your data?
Putting passwords on files and folders does not prevent that ocular attack.
Neither does DLP/RMS. You can apply cut/copy/paste/print policies, remove access after a certain date, and restrict access as far as the feature set allows using policies.
Yet again, this is only prevention against the bulk dumping/sharing of your data, not the fine-grained, patient, write-it-down or capture-it-with-a-remote-camera approach. Even if you block cut/copy/paste on the host, I can bring that host up in a screen-sharing session (think remote desktop) and take screenshots of the RDP session using the tools on the machine I connect from. Heck, I could create a webcast and share it with a group, meaning I open it on my system and let people view it with me.
No DLP solution is 100% effective. Anyone telling you otherwise is lying.
As someone who has been doing Info/CyberSec for almost two decades and has evaluated, deployed and used several DLP solutions, what I state here is from experience. DLP is important, and businesses must look to it as another mitigation in their risk strategies, but they must do so with real vision and realistic expectations.
No matter who it is from, no technology can prevent this ocular avenue. If you don't want your data leaving your control, then don't share it. Yet, since you are in the education business, that is not an option.
I'll say it again, and again...
'If I can see your data, I can take your data.'
