I'm working with the Python Azure SDK and I ran into the following problem.
What I'm trying to do is generate a container SAS token to be used only on a given container. To do so I'm using the Azure SDK's generate_container_sas:
from datetime import datetime
from azure.storage.blob import generate_container_sas, ContainerSasPermissions

def get_temporary_access_token(self):
    sas_token = generate_container_sas(
        self.account_name,
        self.container_name,
        self.storage_token,
        permission=ContainerSasPermissions(read=True, write=True, delete=True, list=True),
        expiry=datetime.utcnow() + self.sas_token_expiry_time
    )
    return sas_token
This returns a string looking something like se=<end_datetime>&sp=<Permission>&sv=2019-07-07&sr=c&sig=<token>
Now, using this token I'm able to do all sorts of things, but what I'm having trouble with is using it to build a temporary download link for a specific blob.
I was trying to do it with this method:
from datetime import datetime, timedelta
from azure.storage.blob import generate_blob_sas, BlobSasPermissions

def get_temporary_download_link(self, blob_full_path, expires_time):
    base_temp_url = f'{self._get_base_resource_url()}/{self.container_name}/{blob_full_path}'
    token = generate_blob_sas(
        account_name=self.account_name,
        account_key=self.sas_token,
        container_name=self.container_name,
        blob_name=blob_full_path,
        permission=BlobSasPermissions(read=True),
        expiry=datetime.utcnow() + timedelta(seconds=expires_time)
    )
    return f'{base_temp_url}?{token}'
Now when I try to use the link built in the method above, it fails inside b64decode.
From that I understand that I'm not supposed to use a SAS token to build a temporary download link, and that I can only do this with the account key or a user delegation object?
I also tried to "fool" the method by base64-encoding the SAS token, but the resulting URL fails with a Signature did not match error.
I couldn't find any documentation on what I can or cannot do with a resource SAS token vs. a UserDelegationKey. Does anyone know if it's possible to use the resource SAS token for a temporary download link?
Thanks in advance
Basically the issue is that you're using a SAS token (in your case created for a blob container) to create a new SAS token. This is not allowed. You will need to use either the account key or user delegation key to generate a SAS token.
Also, you can use the SAS token generated for a blob container as a SAS token for blobs inside that container. If you create a SAS token for a blob container with at least read permission, you can use the same SAS token to download any blob in that blob container.
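The second point can be sketched in plain Python: a container SAS query string with read permission can simply be appended to any blob URL in that container. The account, container, blob names, and the SAS string below are made-up placeholders, not values from the question.

```python
# Sketch: reuse a container SAS token (query string from generate_container_sas)
# as the SAS for any blob inside that container.
def blob_download_url(account_name, container_name, blob_name, container_sas):
    base = f"https://{account_name}.blob.core.windows.net"
    return f"{base}/{container_name}/{blob_name}?{container_sas}"

url = blob_download_url("myaccount", "mycontainer", "folder/file.txt",
                        "se=2024-01-01&sp=r&sv=2019-07-07&sr=c&sig=XYZ")
print(url)
```

No new SAS is generated here, which is why no key is needed: the signature in the container SAS already covers every blob in the container.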
Related
I'm trying to create an app that uses drf-api-key for authorization, and I want to log which API key was used on every request that touches the database. Is there a way to do that?
When I try to get the value of headers.get("Authorization") I get None; I just want to retrieve the name of the API key used, or its prefix.
You can simply get the token using:
token = request.auth
I figured it out by using
apiKey = request.headers['X-Api-Key'].split('.')[0]
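A minimal sketch of the idea above: drf-api-key tokens have the form "<prefix>.<secret>", so the prefix can be recovered from the X-Api-Key header without exposing the secret part. Here `headers` stands in for request.headers inside a DRF view, and the key value is a made-up placeholder.

```python
# Extract the drf-api-key prefix from the X-Api-Key request header.
def api_key_prefix(headers):
    raw = headers.get("X-Api-Key", "")
    # Keys look like "<prefix>.<secret>"; the prefix is safe to log.
    return raw.split(".")[0] if raw else None

print(api_key_prefix({"X-Api-Key": "AbCd1234.s3cr3t-remainder"}))  # prints AbCd1234
```

Logging only the prefix lets you correlate requests with keys while keeping the secret half out of your logs.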
I want to use a signed encryption scope while generating a SAS token, and then get a blob that was uploaded with the same scope. Currently it's throwing a 403 Unauthorized error when I try to fetch a blob that was uploaded to the backend using the same customer-managed encryption key.
This error can occur if the key has been disabled or deleted, if access to the customer-managed keys has been revoked, or if your access token's activation time has expired.
Please check that you followed the workaround below correctly:
In Encryption scopes, make sure you selected a valid key vault and key (generating a new key vault and key if needed) and that infrastructure encryption is set to Enabled.
When you fetch a blob that was uploaded with the same customer-managed encryption key, check that your authentication type is Account key and that the existing encryption scope you chose is valid.
Once you upload a blob, make sure you generate a SAS token and URL, and use the blob SAS URL to access it.
References: Authorize with Shared Key; Forbidden (403); Unauthorized (401).
I have a client that uses the Orderhive service as an inventory management system, and I don't understand how its auth works with Laravel 8. What I understood from the Orderhive API docs is that I will get two tokens from the client (app_token and refresh_token), then I should send them somewhere/somehow to AWS to get a bunch of tokens, and then send the request.
I don't understand that process at all, and I hope there is a package or something built into Laravel to ease it, or that the AWS SDK itself could be easier to integrate. All the tutorials and lessons are about S3 images, so any info would be appreciated.
The process of dealing with such a scenario is as follows:
you will have all the app credentials
use the credentials to grant access (e.g. an authorization code grant returning a JWT)
after access is granted you will have an access_token
the access_token will be valid for a certain time (e.g. 8 hours)
use the access token to generate a refresh_token
store the access_token, the refresh_token, and the date generated somewhere
whenever you want to make a call to the API, just use the stored tokens
if the access_token has expired, use the refresh_token to generate a new access_token
when generating a new access_token via the refresh_token, the response should contain a new access_token and a new refresh_token
replace the old tokens with the new ones
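The steps above can be sketched in Python for illustration (the flow is language-agnostic; the TokenStore class and the refresh_call signature are assumptions for the sketch, not the Orderhive API):

```python
import time

class TokenStore:
    """Holds the stored access_token, refresh_token, and expiry timestamp."""
    def __init__(self, access_token, refresh_token, expires_in):
        self.access_token = access_token
        self.refresh_token = refresh_token
        self.expires_at = time.time() + expires_in

    def valid(self):
        return time.time() < self.expires_at

def get_access_token(store, refresh_call):
    # refresh_call(refresh_token) -> (new_access, new_refresh, expires_in)
    if not store.valid():
        new_access, new_refresh, expires_in = refresh_call(store.refresh_token)
        store.access_token = new_access
        store.refresh_token = new_refresh      # replace old tokens with new ones
        store.expires_at = time.time() + expires_in
    return store.access_token
```

Every API call goes through get_access_token, so refreshing happens transparently whenever the stored token has expired.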
feel free to ask if any point was not clear for you.
I need to call a RESTful web service from Informatica PowerCenter. It has a never-expiring token for authorization.
I tried calling the web service using an HTTP transformation and passing the access token in the header; it works fine and the web service returns the result. But everyone can see the token once the mapping is checked out.
How can we manage the token: store it in an encoded format, or is there a way to create an application connection to store the token?
Here is what we do in the Informatica world to protect sensitive data such as passwords:
Store the value in a file and put that file inside a folder.
Restrict the folder so that only the Informatica unix user can read it and all other users have no permission.
Once that is done, define a parameter in the file and use that file as a parameter file for the Informatica mapping. Whenever the mapping runs, it picks up the parameter from the file, which only the Informatica user can read.
This prevents:
1. Hard-coding sensitive values in the mapping
2. Exposing the sensitive data to any Unix user other than the Informatica unix user, thereby protecting the data
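The setup above can be sketched as shell commands (the path, file name, and parameter name are made-up placeholders, not Informatica defaults; adjust ownership to your site's Informatica unix user):

```shell
# Create a folder only the owning (Informatica) unix user can enter.
mkdir -p /opt/infa/secure_params
chmod 700 /opt/infa/secure_params

# Put the token into a parameter file inside it.
cat > /opt/infa/secure_params/ws_token.parm <<'EOF'
[Global]
$$ACCESS_TOKEN=put-the-token-value-here
EOF
chmod 600 /opt/infa/secure_params/ws_token.parm   # owner read/write only
```

The mapping then references $$ACCESS_TOKEN as a mapping parameter and the session points at ws_token.parm as its parameter file, so the token value never appears in the checked-out mapping.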
I'm writing code to generate and download a private key for a Google Cloud service account.
Using the IAM API, I was able to create a service account, and my call to generate a key seems to be working. I get back a Service Account Key as described on the IAM API create key page, like
{
  "privateKeyType": "TYPE_GOOGLE_CREDENTIALS_FILE",
  "privateKeyData": "random-key-stringkajdkjakjfke",
  ...
}
I downloaded this file as a JSON response and am trying to authenticate with it:
gcloud auth activate-service-account --key-file=service-account-key-file.json
Unfortunately, I get an error stating
The .json key file is not in a valid format.
When I go though the Google Cloud Console flow (IAM & Admin -> Service accounts -> ... -> Create Key -> Create) I get a downloaded JSON file that looks like
{
"type": "service_account",
"private_key": "----BEGIN-PRIVATE-KEY-----",
"auth_uri": "https://gaiastaging.corp.google.com/o/oauth2/auth",
}
This file looks completely different from the response from the IAM API, which explains my error! Unfortunately, this format doesn't seem to be described anywhere; it's only mentioned briefly in some docs. Is it a Google Credentials File?
I'd like to take the IAM response file/JSON and convert it to the second credentials file. I've tried writing some code to convert it, but there are some fields, like "auth_provider_x509_cert_url", that I don't understand.
Perhaps converting the file is the wrong approach as well? More generally:
How can I generate a file and then use it to authenticate with gcloud?
How should I describe/distinguish between both of the above files? Why is each type of file useful?
About the two files:
A Google Credentials file and a Service Account Credentials file are the same thing - they're both the second type of file I downloaded from the Google Cloud Console page. There are no great official docs pages on them, but they're referenced a lot; this is probably also what Application Default Credentials uses.
The JSON response from the IAM create call - this is just a response to an API call. It's not useful outside of parsing it with your application code.
To generate a Google Credentials file:
In the JSON response to the IAM create call, there's a field privateKeyData. This field actually contains the entire Google Credentials file - it's just encoded as a base64 string. I downloaded the file from HTML with:
<a href="data:attachment/json;charset=utf-8;base64,THAT-LONG-privateKeyData-base64-string-here" download="service-account-key.json">
Download key
</a>
Or, if you just want to confirm quickly that it contains all the information, copy-paste the base64 privateKeyData field into a file google-credentials and decode it (on Linux) with:
base64 -d google-credentials
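The same decode step can be sketched in Python. The resp dict here is a stand-in for the parsed JSON response of the IAM keys.create call; the real privateKeyData is a much longer base64 string.

```python
import base64
import json

# Stand-in for the IAM keys.create response: privateKeyData is the whole
# Google Credentials file, base64-encoded.
resp = {
    "privateKeyType": "TYPE_GOOGLE_CREDENTIALS_FILE",
    "privateKeyData": base64.b64encode(
        json.dumps({"type": "service_account"}).encode("utf-8")
    ).decode("ascii"),
}

# Decoding privateKeyData yields the credentials file itself.
key_json = base64.b64decode(resp["privateKeyData"]).decode("utf-8")
with open("google-credentials.json", "w") as f:
    f.write(key_json)

print(json.loads(key_json)["type"])  # prints service_account
```

The file written here is what gcloud auth activate-service-account expects via --key-file.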
I was then able to run
gcloud auth activate-service-account --key-file=google-credentials.json
and got
Activated service account credentials for: [service-account-id@project-id.iam.gserviceaccount.com]