Context
The Google Play Console monthly reports are available to download programmatically from Google Cloud Storage, but the only way I have found to get the Storage bucket name is by manually checking inside the Play Console[1][2]. This is a bit too involved to ask of our users.
My question
Is it possible to get the bucket name from the Play Console account number and/or package name?
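To make the goal concrete: the step we want to automate for our users is essentially the gsutil download described in [2], something like the following (the bucket ID and package name are placeholders in the style of that post):

    # Copy a month of install reports from the developer account's reporting bucket
    gsutil cp -r "gs://pubsite_prod_rev_<bucket_id>/stats/installs/installs_com.example.app_201501*" .

Everything is in place except a programmatic way to obtain the pubsite_prod_rev_<bucket_id> part.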
[1] https://support.google.com/googleplay/android-developer/answer/6135870?hl=en#export
[2] https://android-developers.googleblog.com/2015/04/integrate-play-data-into-your-workflow.html
Related
I have been using MinIO as STaaS for a few weeks now. I would like to know if there is a way, given a created user, to allow him/her to create buckets only up to a previously assigned size. Let's say I want Nana (a user of my server) to be able to create buckets of up to 50GB; she can't create buckets bigger than that.
I know access policies exist, as does multi-tenancy in a MinIO deployment, but that's not what I'm asking for.
This is not currently possible, but you can configure quotas on buckets after they have been created: https://docs.min.io/minio/baremetal/reference/minio-mc-admin/mc-admin-bucket-quota.html
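For example, a minimal sketch using the mc admin bucket quota command from the linked docs (the alias myminio and the bucket name are placeholders):

    # Set a 50GB hard quota on an existing bucket
    mc admin bucket quota myminio/nanas-bucket --hard 50gb

    # Verify the configured quota
    mc admin bucket quota myminio/nanas-bucket

Note that this caps the size of an existing bucket; it does not limit how many buckets a user may create.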
The MinIO team is available on their public Slack channel or by email to answer questions 24/7/365.
I have a machine on which I want to find where my password hash is stored.
The set command returns details about the account and shows that it is connected to a domain, yet the account doesn't show up in net user. Likewise, under Advanced System Settings -> User Profiles the account shows as Type: Local and Status: Local.
It seems to be a domain user; however, Windows doesn't think it's on a domain.
Because of this, searching for hashes has only turned up dead ends. They aren't in the SAM file and they aren't in SECURITY. I also tried password recovery software, and the account simply didn't show up.
I could see the correct hash through sekurlsa::logonpasswords full (specifically sekurlsa::msv) with mimikatz, but now I would like to know where the hashes are stored.
I know they are cached somewhere, as I can log in without internet, so I think I'm specifically looking for that file.
A brief search of the command suggests they are in the SAM database, but I know they aren't.
Any assistance would be appreciated.
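One avenue I haven't fully ruled out, for anyone following along: as far as I can tell, cached domain logons are stored as DCC2/MSCache2 entries under HKLM\SECURITY\Cache, which mimikatz reads with a different module than sekurlsa. A sketch of the commands (lsadump::cache needs SYSTEM rights, hence the elevation steps):

    privilege::debug
    token::elevate
    lsadump::cache

If that is right, my "they aren't in SECURITY" observation may just mean the tools I used didn't parse the Cache key.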
I am trying to find:
who created a dataset in BigQuery
and, if possible, whether it was done via the GUI, CLI, etc.
Currently, using the Google Cloud SDK, I am checking against every project my account is linked with.
With the following command inside a loop for every project, I get info from the userByEmail field.
Command: bq show --format=prettyjson ${dataset} | awk -v proj="${project}" -v dat="${dataset}" '/userByEmail/{gsub("\"", ""); print proj", "dat", "$2}'
This gives me info about who has access to these datasets, but it's not what I am looking for.
Any ideas on how to get the correct info in an automated fashion?
The only place creator information would be exposed is via the BigQuery audit logs.
Even with the audit information at your disposal, the second part of your question (what tool issued the request) is likely to remain ambiguous.
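A minimal sketch of pulling dataset-creation events out of the audit logs with the gcloud CLI (the filter assumes the legacy BigQuery audit log schema; adjust the method name to your log format):

    # Who created datasets in this project, and with what client, per the audit logs
    gcloud logging read \
      'resource.type="bigquery_resource" AND protoPayload.methodName="datasetservice.insert"' \
      --project="${project}" \
      --freshness=30d \
      --format='value(protoPayload.authenticationInfo.principalEmail, protoPayload.requestMetadata.callerSuppliedUserAgent)'

The callerSuppliedUserAgent field is the closest signal to GUI vs. CLI, but clients can set it to anything, which is why that part stays ambiguous.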
I am trying to rotate the user access keys & secret keys for all the users. Last time it was required I did it manually, but now I want to do it via a rule or automation.
I went through some links and found this one:
https://github.com/miztiik/serverless-iam-key-sentry
I tried to use it, but I was not able to perform the activity; it always gave me an error. Can anyone please help, or suggest a better way to do it?
As I am new to AWS Lambda, I am also not sure how my code can be tested.
There are different ways to implement a solution. One common way to automate this is to store the IAM user access keys in AWS Secrets Manager for safekeeping. Next, you could configure a monthly or 90-day check that rotates the keys using the AWS CLI and stores the new keys in AWS Secrets Manager. You could also use an SDK of your choice for this.
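A minimal sketch of the rotation step with the AWS CLI (the user name and secret name are placeholders; error handling and the Lambda wrapper are omitted):

    #!/usr/bin/env bash
    USER_NAME="nana"
    SECRET_ID="iam/${USER_NAME}/access-key"

    # 1. Note the user's current access key before rotating
    OLD_KEY_ID=$(aws iam list-access-keys --user-name "$USER_NAME" \
        --query 'AccessKeyMetadata[0].AccessKeyId' --output text)

    # 2. Create the replacement key (IAM allows at most two keys per user)
    NEW_KEY=$(aws iam create-access-key --user-name "$USER_NAME" \
        --query 'AccessKey.[AccessKeyId,SecretAccessKey]' --output text)

    # 3. Store the new key pair in Secrets Manager
    aws secretsmanager put-secret-value --secret-id "$SECRET_ID" \
        --secret-string "$NEW_KEY"

    # 4. Deactivate the old key; delete it once nothing depends on it
    aws iam update-access-key --user-name "$USER_NAME" \
        --access-key-id "$OLD_KEY_ID" --status Inactive

For scheduling, this is typically wrapped in a Lambda triggered by an EventBridge cron rule; for testing, you can invoke the Lambda with a test event in the console, or run the script locally against a sandbox IAM user first.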
I am attempting to use the Microsoft Azure Storage Explorer, attaching with a SAS URI. But I always get the error:
Inadequate resource type access. At least service-level ('s') access is required.
Here is my SAS URI with portions obfuscated:
https://ti<...>hare.blob.core.windows.net/?sv=2018-03-28&ss=b&srt=co&sp=rwdl&se=2027-07-01T00:00:00Z&st=2019-07-01T00:00:00Z&sip=52.<...>.235&spr=https&sig=yD%2FRUD<...>U0%3D
And here is my connection string with portions obfuscated:
BlobEndpoint=https://tidi<...>are.blob.core.windows.net/;QueueEndpoint=https://tidi<...>hare.queue.core.windows.net/;FileEndpoint=https://ti<...>are.file.core.windows.net/;TableEndpoint=https://tid<...>hare.table.core.windows.net/;SharedAccessSignature=sv=2018-03-28&ss=b&srt=co&sp=rwdl&se=2027-07-01T00:00:00Z&st=2019-07-01T00:00:00Z&sip=52.<...>.235&spr=https&sig=yD%2FRU<...>YU0%3D
It seems like the problem is with the construction of my URI/endpoints/connection string/etc., more than with the permissions granted to me on the server, because the error displays instantaneously when I click Next. I do not believe it even tried to reach the server.
What am I doing wrong? (As soon as I get this working, I'll be using the URI/etc to embed in my C# app for programmatic access.)
What you are missing is a service-level grant: the "srt" (signed resource types) part of the URI. Your SAS has srt=co (container and object), but it also needs "s" (service). You need to create a new SAS key; it can be generated in the portal, the Azure CLI, or PowerShell.
In the portal it is this part: you have to go into the storage account, open Shared access signature, and select what you need:
Allowed services (if you are looking for blob):
  Blob
Allowed resource types:
  Service (make sure this one is activated)
  Container
  Object
Allowed permissions (all of these, to do everything):
  Read
  Write
  Delete
  List
  Add
  Create
If you need more info, look here:
https://learn.microsoft.com/en-us/rest/api/storageservices/create-account-sas?redirectedfrom=MSDN
If you'd like to create the SAS key with the Azure CLI, use this:
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blob-user-delegation-sas-create-cli
If you'd like to create the SAS key with PowerShell, use this:
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blob-user-delegation-sas-create-powershell
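For completeness, a minimal Azure CLI sketch of generating an account-level SAS whose resource types include service (the account name and key are placeholders):

    # Account SAS for the Blob service with service, container, and object access
    az storage account generate-sas \
        --account-name <storage-account> \
        --account-key <account-key> \
        --services b \
        --resource-types sco \
        --permissions rwdl \
        --expiry 2027-07-01T00:00:00Z \
        --https-only

The srt part of the resulting token comes out as sco, which is exactly what the error message is asking for.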
I had a similar issue trying to connect to a blob container using a Shared Access Signature (SAS) URL, and this worked for me:
Instead of generating the SAS URL in Azure Portal, I used Azure Storage Explorer.
Right-click the container that you want to share -> "Get Shared Access Signature"
Select the expiry time and permissions and click Create
This URL should work when your client/user tries to connect to the container.
Cheers
I had the same problem and managed to get this to work by hand-editing the URL, changing "srt=co" to "srt=sco". It seems to need the "s".