Append to an Azure Blob using SAS URL - azure-blob-storage

I need to continuously append to a blob in a container for which I have been provided a SAS URL.
I am doing this:
var blobClient = new AppendBlobClient(mySASUri);
var blobContentInfo = blobClient.CreateIfNotExists();
but Create and CreateIfNotExists do not take a blob name parameter, which is strange for a create method.
And I get an authentication exception when using the following SAS URL:
mySASUri="https://[myaccount].blob.core.windows.net/[my container]?sp=racwl&st=2022-02-03T08:29:46Z&se=2022-02-03T16:29:46Z&spr=https&sv=2020-08-04&sr=c&sig=[the signature]"
I have been reading a lot about Azure SAS, but everything either talks about generating a SAS or stops at a very basic level.
Thanks to anyone who looks at this and can provide either a reading reference or guidance on which API combinations should work for this use case.
Thanks,
Tauqir

Considering your SAS URL is for the container (sr=c) rather than an individual blob, passing it straight to AppendBlobClient will not authenticate. It would be better to create an instance of BlobContainerClient first and then get an instance of AppendBlobClient using the GetAppendBlobClient extension method (from the Azure.Storage.Blobs.Specialized namespace).
Something like:
var blobContainerClient = new BlobContainerClient(new Uri(mySASUri));
var appendBlobClient = blobContainerClient.GetAppendBlobClient("append-blob-name");
...do the append blob operations here...
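
For completeness, a minimal sketch of the whole flow, assuming the container SAS carries the add (a), create (c) and write (w) permissions (yours does, per sp=racwl); the blob name "my-append-blob.log" is just a placeholder:

using System;
using System.IO;
using System.Text;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Specialized; // the GetAppendBlobClient extension method lives here

// mySASUri is the container-level SAS URL from the question
var blobContainerClient = new BlobContainerClient(new Uri(mySASUri));
var appendBlobClient = blobContainerClient.GetAppendBlobClient("my-append-blob.log"); // placeholder blob name

// Create the append blob once; this is a no-op if it already exists
appendBlobClient.CreateIfNotExists();

// Append a block of data; repeat this call for continuous appends
byte[] payload = Encoding.UTF8.GetBytes("log line" + Environment.NewLine);
using (var stream = new MemoryStream(payload))
{
    appendBlobClient.AppendBlock(stream);
}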

Related

How to create a shared access signature (SAS) URL for my blob file in Azure Data Lake Gen2 in C#

I'm using the Azure.Storage.Files.DataLake NuGet package to write and append files on my Azure Storage account that is enabled for Data Lake Gen2 (including hierarchical namespace).
However, I can't seem to find how to generate a SAS URL to access a specific blob without authenticating a user. Is it possible to do this with the package, or should I fall back to REST operations for this?
Thanks for any insights
It seems I just had to click through to the GitHub page; there's a good example of how to do this in the unit tests.
I copied the permalink to the specific part here:
https://github.com/Azure/azure-sdk-for-net/blob/89955a90641742a2cdb0acd924f90d02b1be34ec/sdk/storage/Azure.Storage.Files.DataLake/samples/Sample02_Auth.cs#L126
AccountSasBuilder sas = new AccountSasBuilder
{
    Protocol = SasProtocol.None,
    Services = AccountSasServices.Blobs,
    ResourceTypes = AccountSasResourceTypes.All,
    StartsOn = DateTimeOffset.UtcNow.AddHours(-1),
    ExpiresOn = DateTimeOffset.UtcNow.AddHours(1),
    IPRange = new SasIPRange(IPAddress.None, IPAddress.None)
};
// Allow list access
sas.SetPermissions(AccountSasPermissions.List);
// Create a StorageSharedKeyCredential that we can use to sign the SAS token
StorageSharedKeyCredential credential = new StorageSharedKeyCredential(StorageAccountName, StorageAccountKey);
// Build a SAS URI
UriBuilder sasUri = new UriBuilder(StorageAccountBlobUri);
sasUri.Query = sas.ToSasQueryParameters(credential).ToString();
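
If you need a SAS scoped to a single file rather than the whole account, newer versions of the Azure.Storage.Files.DataLake package can also generate a service SAS straight from the file client. A minimal sketch, reusing the StorageSharedKeyCredential from above; the file system and path names are placeholders:

using System;
using Azure.Storage;
using Azure.Storage.Files.DataLake;
using Azure.Storage.Files.DataLake.Sas;

var credential = new StorageSharedKeyCredential(StorageAccountName, StorageAccountKey);

// Point the client at the specific file you want to share (placeholder file system and path)
var fileClient = new DataLakeFileClient(
    new Uri($"https://{StorageAccountName}.dfs.core.windows.net/my-filesystem/folder/file.txt"),
    credential);

// Read-only SAS for just this file, valid for one hour
Uri fileSasUri = fileClient.GenerateSasUri(DataLakeSasPermissions.Read, DateTimeOffset.UtcNow.AddHours(1));

If your package version predates GenerateSasUri, building a DataLakeSasBuilder (with FileSystemName and Path set) and calling ToSasQueryParameters(credential) is the manual equivalent of the account SAS shown above.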

AWS S3 API - Objects don't contain Metadata

Trying to figure out the AWS S3 API and failing miserably...
I currently have a bucket which contains lots of videos.
I need to request all the videos as objects, each with the video metadata which I set when uploading, and the link to share the video.
The problem is I'm getting the objects without any of the above...
What I've got so far -
AWS.config.update({accessKeyId: 'id', secretAccessKey: 'key', region: 'eu-west-1'});
var s3 = new AWS.S3();
var params = {
    Bucket: 'Bucket-name',
    Delimiter: '/',
    Prefix: 'resource/folder-with-videos/'
};
s3.listObjects(params, function (err, data) {
    if (err) throw err;
    console.log(data);
});
Thanks for reading :)
UPDATE - I found that when using getObject and adding ExposeHeader to the CORS settings I can indeed get the metadata I set.
The problem is getObject only works on a specific object (a video in my case).
Any idea how I can get all the objects like listObjects does, but with the values for each object that getObject gives me?
The only solution I can think of is doing listObjects to get a list of all the objects, and then making a getObject ajax call for each object in the result... rip UX
thanks :)
A couple of things to sort out first.
As per the documentation at http://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#listObjects-property, the listObjects API returns exactly what is mentioned in the callback parameters. Using a delimiter causes S3 to list objects only one level below the given prefix; it won't traverse all objects recursively.
In order to get the URL, you can use http://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#getSignedUrl-property. I am not really sure what you meant by metadata.

Tasker for Android - not able to get a response from HTTP GET request

I am trying to get some data out of the Google Maps API.
The response comes through, as I can see it, but when I try to use a JavaScriptlet, I have no luck.
This is my current setup:
I get the data from Google Maps:
Server:port: http://maps.google.com/maps/api/geocode/json?latlng=%LOCN&sensor=false
I run the JavaScriptlet trying to get only the value I need out of the JSON object:
var response = global ('HTTPD');
var gmapslocationname = response.results[0].address_components[2].short_name
and then I try to flash it:
%gmapslocationname
but what I get in the flash is the literal text %gmapslocationname.
What am I doing wrong here?
thanks
Tasker is not capable of handling objects, so the HTTPD response is stored as a string.
To be able to use it, you need to parse it into an object.
Change your code to:
var response = global('HTTPD');
var gmobject = JSON.parse(response);
var gmapslocationname = gmobject.results[0].address_components[2].short_name;

Unable to query using a pointer with the Parse Android SDK

I am trying to run a query using version 1.13.0 of the Parse Android SDK (com.parse:parse-android:1.13.0). I am creating a query on the local datastore and using the ParseQuery#whereMatchesQuery() method to match a column storing a pointer to another class in my database. The code that I have is the following:
ParseQuery<PracticeSessionDetails> query = ParseQuery.getQuery(PracticeSessionDetails.class);
query.fromLocalDatastore();
query.ignoreACLs();
query.whereEqualTo("user", ParseUser.getCurrentUser());
ParseQuery courseQuery = new ParseQuery("Course");
courseQuery.whereEqualTo("objectId",courseId);
query.whereMatchesQuery("course", courseQuery);
When I run the query using query.getFirst(), I do not get anything retrieved from the local datastore. I have already checked running the courseQuery separately, and it fetches the Course object that I need. Is this a known issue? I arrived at this approach with help from this post.
I think you are mixing up the queries when matching the pointer to another class in your database. Use the following code to resolve your problem:
ParseQuery<PracticeSessionDetails> query = ParseQuery.getQuery(PracticeSessionDetails.class);
query.fromLocalDatastore();
query.ignoreACLs();
query.whereEqualTo("user", ParseUser.getCurrentUser());

// Inner query on the Course class, restricted to the course you want
ParseQuery<ParseObject> courseQuery = ParseQuery.getQuery("Course");
courseQuery.whereEqualTo("objectId", courseId);

// Match the "course" pointer column against the inner query
query.whereMatchesQuery("course", courseQuery);

SharpGS: how to download a file?

I am using SharpGS for Google Cloud Storage. I can upload a file using the
GetBucket("some-bucket").AddObject() method, but I cannot download the file using the following code:
GetBucket("some-bucket").GetObjectHead("some-file").Content
It gives me a null value for the returned bytes.
Any idea?
thanks
The GetObjectHead looks up the object using a HEAD request, so it doesn't retrieve the content.
If you take a look at the demo code, you can retrieve object contents by listing the bucket:
var bucket = GetBucket("some-bucket");
foreach (var o in bucket.Objects) {
    Console.WriteLine(Encoding.UTF8.GetString(o.Retrieve().Content));
}
There doesn't seem to be a way to get an IObject without listing the bucket. I would suggest adding a method to the IObjectContent class returned from GetObjectHead to fetch the IObject. The project is on GitHub.
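
If the goal is simply to save the downloaded bytes, the same listing loop can write each object's content to disk. A rough sketch that only builds on the calls shown above; the output file naming is made up:

using System.IO;

var bucket = GetBucket("some-bucket");
var index = 0;
foreach (var o in bucket.Objects) {
    // Retrieve() performs the actual GET request; Content holds the raw bytes
    byte[] bytes = o.Retrieve().Content;
    File.WriteAllBytes("object-" + index + ".bin", bytes);
    index++;
}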
