MinIO UI: disable public listing of a bucket that has a read-only policy

In MinIO I have a bucket that has a read-only policy, but I do not want it to be viewable in the MinIO Browser without authentication.
Is it possible?

First, (optionally) reset the existing policy on the bucket recursively:
mc policy --recursive set none gm/data/ibb
After that you can change the policy as you like.
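For example, to allow anonymous downloads afterwards, a minimal sketch reusing the gm alias and bucket path from above (download is one of mc's built-in anonymous policy values, alongside none, upload, and public):
# grant anonymous read (download) access to the prefix
mc policy set download gm/data/ibb
# verify which policy is now in effect
mc policy get gm/data/ibb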

Related

Prevent users with Azure AD accounts from accessing my storage

I would like to prevent people with managed identities from accessing my storage.
Please help me on this matter.
To prevent such access, try disallowing all public access to the storage account. While public access is disabled, any future anonymous requests to that account will fail, and your data will never be available for public access until the setting is changed again by a user with the appropriate permissions.
To disallow public access for a storage account, configure the account's AllowBlobPublicAccess property:
Go to the Azure portal -> storage account -> Settings -> Configuration -> set Blob public access to Disabled
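The same property can be set from the command line as well; a minimal sketch with the Azure CLI, using placeholder account and resource group names:
# disallow all anonymous/public access to blobs in the account
az storage account update \
  --name mystorageacct \
  --resource-group myresourcegroup \
  --allow-blob-public-access false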
After you update the public access setting for the storage account, it may take up to 30 seconds before the change is fully propagated.
To check for anonymous requests against the storage account, I suggest the following steps:
In the Azure portal -> under Monitoring -> click Metrics -> set the scope -> Metric Namespace to Blob -> Metric field to Transactions -> Aggregation field to Sum.
Next, Add filter -> in the filter, set the Property to Authentication -> the equals sign (=) in the Operator field -> and finally the Values field to Anonymous
After configuring the metrics, the anonymous accesses will appear in the graph, and you can also configure an alert rule for notification and security purposes: Create, view, and manage metric alerts using Azure Monitor - Azure Monitor | Microsoft Docs
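The same check can be scripted; a sketch with the Azure CLI, where the resource ID and names are placeholders (note the CLI calls the portal's Sum aggregation Total, and the Blob namespace metric lives on the blobServices/default sub-resource):
# count anonymous transactions against the blob service
az monitor metrics list \
  --resource "/subscriptions/<sub-id>/resourceGroups/myresourcegroup/providers/Microsoft.Storage/storageAccounts/mystorageacct/blobServices/default" \
  --metric Transactions \
  --aggregation Total \
  --filter "Authentication eq 'Anonymous'"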
For more details, please see the link below:
https://learn.microsoft.com/en-us/azure/storage/blobs/anonymous-read-access-prevent

Restrict permissions when a client creates a SAS token for a blob storage

I have a blob storage account where I drop files for an external partner to list and read. I thought a SAS token would be a perfect way for the external partner to access the container and read the file(s).
So I created a SAS token and realized that if I don't want to create new SAS tokens every 10 minutes and send them to the partner, I need to set the token's expiry date far into the future, which is not good if the SAS token is leaked; and the day the token expires, the solution will stop working.
To fix that, I could let the client create a SAS token themselves by giving them an access key and account name via the StorageSharedKeyCredential class. That works great, maybe too great, since it is now the client that decides what permissions the SAS token should have. So the client might now upload files, create containers, etc.
So my question is: is there any way to restrict what permissions the SAS token has when the client creates it, so our external partner can only read/list files in a specific container that I have decided?
Best Regards
Magnus
Regarding the issue, I think you want to know how to create a service SAS token. If so, please refer to the following code.
// Shared key credential for the account; required for GenerateSasUri
StorageSharedKeyCredential credential = new StorageSharedKeyCredential("{account_name}", "{account_key}");
BlobContainerClient containerClient = new BlobContainerClient(
    new Uri("https://{account_name}.blob.core.windows.net/{container_name}"),
    credential);
BlobSasBuilder sasBuilder = new BlobSasBuilder()
{
    BlobContainerName = containerClient.Name,
    Resource = "c", // "c" means the SAS applies to the whole container
    ExpiresOn = DateTimeOffset.UtcNow.AddHours(1)
};
// Combine Read and List in a single call; a second SetPermissions call
// would replace the first, not add to it.
sasBuilder.SetPermissions(BlobContainerSasPermissions.Read | BlobContainerSasPermissions.List);
Uri sasUri = containerClient.GenerateSasUri(sasBuilder);
To give permissions on a specific container, you can do the following:
Find your container, select Access policy under the Settings blade, and click Add policy. Select the permissions you want to grant on this specific container. Also note that the public access level is set at the container level. You could refer to the thread which discussed a similar issue.
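The same can be done from the Azure CLI; a sketch with hypothetical names (a readlist policy on mycontainer) that ties issued SAS tokens to a revocable stored access policy:
# create a stored access policy granting read + list until the given expiry
az storage container policy create \
  --account-name mystorageacct \
  --container-name mycontainer \
  --name readlist \
  --permissions rl \
  --expiry 2025-01-01T00:00Z
# issue a SAS whose permissions and lifetime come from that policy
az storage container generate-sas \
  --account-name mystorageacct \
  --name mycontainer \
  --policy-name readlist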
Also look at how RBAC works on Azure Storage.
Only roles explicitly defined for data access permit a security principal to access blob or queue data. Roles such as Owner, Contributor, and Storage Account Contributor permit a security principal to manage a storage account, but do not provide access to the blob or queue data within that account.
You can grant the right to create a user delegation key separately from rights to the data.
Get User Delegation Key (https://learn.microsoft.com/en-us/rest/api/storageservices/get-user-delegation-key) is performed at the account level, so you must grant this permission with something like the Storage Blob Delegator built-in role at the scope of the storage account.
You can then grant just the data permissions the user should have, using one of these 3 built-in roles at the scope of the blob container:
Storage Blob Data Contributor
Storage Blob Data Owner
Storage Blob Data Reader
A user delegation SAS can then be generated to grant a subset of the user's permissions for a limited time, and it can be issued for an entire blob container or for individual blobs.
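Put together, the flow could look like this from the Azure CLI; all IDs and names below are placeholders, not values from the question:
# allow the partner's principal to request user delegation keys (account scope)
az role assignment create \
  --assignee <partner-object-id> \
  --role "Storage Blob Delegator" \
  --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>"
# grant read/list on a single container only (container scope)
az role assignment create \
  --assignee <partner-object-id> \
  --role "Storage Blob Data Reader" \
  --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>/blobServices/default/containers/<container>"
# the partner can then mint a short-lived user delegation SAS limited to read + list
az storage container generate-sas \
  --account-name <account> \
  --name <container> \
  --permissions rl \
  --expiry 2025-01-01T00:00Z \
  --as-user --auth-mode login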
For more details you may check this thread.
You also have to use VNet rules in the storage firewall, or trusted access to storage, to restrict access for clients in the same region.
You may check these links:
https://learn.microsoft.com/en-us/azure/storage/common/storage-sas-overview https://learn.microsoft.com/en-us/rest/api/storageservices/create-service-sas#permissions-for-a-blob

MinIO - remove a policy from a user

I have a user to which I applied a policy using the following command:
mc admin policy set myminio getonly user=newuser
Now, I've added newuser into a group, and I want to manage his policies using the group's policies. So adding the user to a group and applying policy on that group is quite straightforward.
Now that he has the group's policy, I want to remove the getonly policy that was applied to him personally - how can I do that?
It seems there is no direct way to remove a policy that has been assigned to a user.
The easiest way I can think of is to delete the current user and create a new user with the correct policy assigned.
There is another workaround that I have tried.
Assuming I have a user named test, this is what I did to remove the policy from the user:
mc admin policy set myminio '' user=test
mc policy set none minio/storage
Here minio is the alias and storage is the bucket.
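As the question notes, the group-based flow itself is straightforward; a sketch using the alias and user from the question (the group name mygroup is hypothetical):
# add the user to a group and attach the policy at group level
mc admin group add myminio mygroup newuser
mc admin policy set myminio getonly group=mygroup
# then blank out the user-level policy, as shown above
mc admin policy set myminio '' user=newuser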

Laravel accessing S3 bucket vs AWS Role

I have an EC2 instance that runs Laravel 5.1. I am using an S3 bucket through Laravel's API:
AMAZON_KEY=key
AMAZON_SECRET=secret
AMAZON_REGION=us-west-2
AMAZON_S3_BUCKET=my_app_bucket
But I have already set up a role that enables this box to use that particular bucket. Why do I also need a key and a secret? From an analysis of the code, it looks like Laravel always demands a key and a secret, so it would seem that I have to create an IAM user account with a key/secret and use that for S3 access instead of using roles, which is preferred. Is there a way around this, or is this just how Laravel S3 access works?
A fix to use IAM credentials for the filesystem, queue, and email services was merged a few days ago, so upgrading to Laravel v5.1.7 should do the trick.
https://github.com/laravel/framework/pull/9558
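Assuming v5.1.7 or later, the key and secret entries can presumably be dropped so the underlying AWS SDK falls back to the instance profile's role credentials; a sketch of the trimmed settings from the question:
# no AMAZON_KEY / AMAZON_SECRET needed once the EC2 role supplies credentials
AMAZON_REGION=us-west-2
AMAZON_S3_BUCKET=my_app_bucket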

Enable notifications / watch a Google Play bucket to programmatically download reports

There's a lot of new information on how to programmatically download Google Play reports using the gsutil tool. Google Play uses a bucket to store these reports, just like Google Cloud Storage does. I'm already able to download reports from the Google Play bucket without a problem. For example:
gsutil cp gs://pubsite_prod_rev_<my project id>/stats/installs/installs_<my app id>_201502_overview.csv .
On the other hand, gsutil offers a feature to watch Google Cloud Storage buckets, so you can receive notifications every time an object in the bucket changes (gsutil notification watchbucket). I am also able to enable notifications in buckets created in my own Google Cloud projects.
The problem is, I'm not able to enable notifications in my Google Play bucket. Is it even possible? I get an AccessDeniedException: 403 Forbidden error when calling:
gsutil notification watchbucket -i playnotif -t sometoken https://notif.mydomain.com gs://pubsite_prod_rev_<my project id>
I've followed all the steps here, being especially careful with those about identifying a domain to receive notifications.
As I mentioned above, I'm already able to do all the process I need, but with my own buckets in Google Cloud, not with the Google Play bucket.
The Google Play project has been linked to a Google Cloud project. It did so automatically when I enabled Google Play API access (Google Play Developer Console -> Configuration (left menu) -> API access).
The Google Play project owner and my own Google Cloud project owner is the same.
This owner has successfully registered and validated the domain used to receive the notifications (following the example, I validated both just in case: notif.mydomain.com and mydomain.com, using https in the Google Webmaster Tools)
These domains have also been whitelisted in the Google Developers Console (left sidebar -> APIs & Auth -> Push).
I've successfully enabled notifications in my own Google Cloud buckets using either the project owner account or a service account I created. I've already tried using both (owner and a corresponding service account) in the Google Play bucket, without success.
Any ideas will be greatly appreciated. Thanks!
EDIT:
I had already followed the steps here, but using different procedures (as explained in the comment below). Following Nikita's suggestion, I tried to follow the steps using the same procedure.
So I configured gsutil (through gcloud) to use the owner account:
gcloud config set account owner-of-play-store-project@gmail.com
and while trying to grant full access to the service account, I encountered this error:
$ gsutil acl ch -u my-play-store-service-account@developer.gserviceaccount.com:FC gs://pubsite_prod_rev_my-bucket-id
CommandException: Failed to set acl for gs://pubsite_prod_rev_my-bucket-id/. Please ensure you have OWNER-role access to this resource.
So, I tried to list the default ACL for this bucket, and found:
$ gsutil defacl get gs://pubsite_prod_rev_my-bucket-id
No default object ACL present for gs://pubsite_prod_rev_my-bucket-id. This could occur if the default object ACL is private, in which case objects created in this bucket will be readable only by their creators. It could also mean you do not have OWNER permission on gs://pubsite_prod_rev_my-bucket-id and therefore do not have permission to read the default object ACL.
[]
Conclusion:
It really makes me think that, even when using the project owner account, that account does not have the OWNER role on the Play Store bucket. This means the ACLs can't be modified, or even listed, and notifications can't be enabled since, sadly, we don't really own the bucket.
At the moment, you cannot. Google Play owns these buckets, and end users do not have the FULL_CONTROL access on the bucket necessary to subscribe to Object Change Notifications.
