I'm setting up the React version of Fine Uploader to upload files to Azure by working through the docs, and I could use a bit more clarification on the blobUri sent to my API when requesting a SAS.
We're requesting the SAS before uploading the file, which is why I'm confused.
Say I'm uploading a file named my-image.jpg to my blob container on Azure, whose endpoint is https://myaccount.blob.core.windows.net/my-container.
I also want to rename the file during upload by calling a function; let's assume the function returns 89056c3d-7bb3-my-image.jpg as the file name.
Would https://myaccount.blob.core.windows.net/my-container/89056c3d-7bb3-my-image.jpg then be the blobUri I send to my API when requesting a SAS?
In other words, are we constructing the blobUri from the Azure Blob Storage container URI and the file name we'll end up using?
If I'm interpreting this correctly, what happens when the user uploads multiple files? What would be the blobUri I'd have to send to request a SAS?
UPDATE:
When my request hits my backend API to get a SAS, the blobUri comes in as /server/upload/some-guid-value.txt. I'm using the following options when instantiating an uploader. What am I doing wrong?
const uploader = new FineUploaderAzure({
    options: {
        signature: {
            endpoint: 'http://localhost:4879/api/getsas'
        },
        request: {
            containerUrl: 'https://myaccount.blob.core.windows.net/my-container'
        },
        uploadSuccess: {
            endpoint: 'http://localhost:4879/success'
        }
    }
})
"In other words, are we constructing the blobUri using the Azure Blob Storage container URI and the file name we'll end up using?"
Correct. Fine Uploader Azure constructs this for you. Be sure to verify permissions (taking into account the _method param that accompanies the blob URL) before returning a signature.
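To illustrate, here is a sketch of how that URI fits together and what a signature endpoint could check before signing. The helper names (buildBlobUri, isAllowedRequest) are made up for this example and are not part of Fine Uploader:

```javascript
// Illustrative sketch only: buildBlobUri and isAllowedRequest are
// hypothetical helpers, not Fine Uploader API.
const containerUrl = 'https://myaccount.blob.core.windows.net/my-container';

// The blob URI is the container URL plus the (possibly renamed) blob name.
function buildBlobUri(containerUrl, blobName) {
  return `${containerUrl}/${encodeURIComponent(blobName)}`;
}

// Before returning a SAS, verify the URI really targets your container and
// that the HTTP method (the _method query param) is one you allow.
function isAllowedRequest(blobUri, method) {
  const allowedMethods = ['PUT', 'DELETE'];
  return blobUri.startsWith(containerUrl + '/') && allowedMethods.includes(method);
}

console.log(buildBlobUri(containerUrl, '89056c3d-7bb3-my-image.jpg'));
```

For multiple files, the same construction happens once per file: each upload triggers its own signature request carrying that file's blob URI.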
Related
Each user has a number of private files (photos, videos, etc.) stored on the s3 disk.
From the mobile application side, I send a request to Laravel web service to get the list of files and show it to the user on the client side.
I use the resource collection in Laravel to send responses and send my list to the mobile application.
My question is, how can I access the file itself using the file path on the client side?
Do I need to send a request to Laravel for each file to request a download type response for each file?
Given that there is more than one file and I want to show a list view inside the mobile application, I don't want to send a request to the server for each photo just to download it.
I want accessible links to be returned along with the list from the Laravel app so that I can display them on the app side.
Laravel Side:
Route::get('api/user/{user}/files', function (User $user) {
    $files = $user->files; // use the relationship's collection, not the query builder
    return new FileCollection($files);
});
Route::get('api/download/{path}', function (string $path) {
    return Storage::disk('s3')->download($path);
});
Client Side:
What do I do here?
You can call Storage::disk('s3')->temporaryUrl($path, now()->addMinute()) to generate publicly accessible links for private files (links are going to expire in 1 minute in this example).
I'm using an S3 bucket for my application's user uploads. This bucket is private.
When I use the following code, the generated URL is not accessible from within the application:
return Storage::disk('s3')->url($this->path);
I can solve this by generating a temporary URL, which is accessible:
return Storage::disk('s3')->temporaryUrl($this->path, Carbon::now()->addMinutes(10));
Is this the only way to do this? Or are there other alternatives?
When objects are private in Amazon S3, they cannot be accessed by an "anonymous" URL. This is what makes them private.
An object can be accessed via an AWS API call from within your application if the IAM credentials associated with the application have permission to access it.
If you wish to make the object accessible via a URL in a web browser (e.g. as the page URL, or when referenced within a tag such as <img>), then you will need to create an Amazon S3 pre-signed URL, which provides time-limited access to a private object. The URL includes authorization information.
While I don't know Laravel, it would appear that your first code sample just provides a normal "anonymous" URL to the object in Amazon S3 and is therefore (correctly) failing. Your second code sample is apparently generating a pre-signed URL, which will work for the given time period. This is the correct way to make a URL that you can use in the browser.
I'm using a (working) SAS URL to retrieve a file from Azure Blob Storage using AJAX. I've enabled CORS on my Azure storage account, and now I'm getting a 400 error:
Authentication information is not given in the correct format. Check the value of Authorization header.
The url works if I just paste it into a browser's address bar.
I am using Fine Uploader to upload images to Azure Blob Storage; everything is working, and now I want to delete files. I used the code below:
deleteFile: {
    enabled: true,
    endpoint: 'api/documentManager',
    method: 'DELETE'
}
The file is deleted successfully from blob storage, but the callback for server-side handling never fires. Is there anything I'm missing?
There is no server callback when using the delete file feature in Fine Uploader Azure; the request is sent directly to Azure. If you would like to contact your server after the delete has completed, do so in the onDeleteComplete callback.
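For example, a callbacks entry along these lines could notify your server once Azure confirms the delete. The endpoint path and payload below are made up for illustration:

```javascript
// Sketch: notify the application server after Fine Uploader Azure finishes
// a delete. The '/api/documentManager/deleted' endpoint is hypothetical.
const callbacks = {
  onDeleteComplete: function (id, xhr, isError) {
    if (!isError) {
      // The blob is already gone from Azure at this point; tell the
      // backend so it can clean up its own records.
      fetch('/api/documentManager/deleted', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ fileId: id })
      });
    }
  }
};
```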
Background...
I'm exploring Parse.com as a back end for an iOS app that also has an html/web browser interface for some users (either via javascript client or asp.net client - to be determined). The web users are 'an audience' for the data/files the app users prepare in the app. They are not the same people.
I need to lock down access to objects in the database (no public access read or write) so I plan to set up an admin user belonging to an administrators role and create an app_users role applying class-level permissions to the various classes accordingly.
Then, for my iOS app, I'll use anonymous users, add them to the app_users role, set up a default ACL for object-level permissions, and interact with the data model accordingly.
The app creates PDF files and stores them as PFFile objects, and I want these to have no public read or write access either. These docs are what will be accessible via the web client.
So...
I don't think I want to use PFUsers for each potential user accessing via a web client; I don't want it over-engineered. So I figured I'd send params to Cloud Code (with useMasterKey()) to first return a list of file metadata to present to the user. This works well: I can return the PFFile URL or objectId, doc name, file type and size.
The challenge...
Next I'd need to build a Cloud Code function which, given an objectId or a URL, will fetch the PDF file and return it in a way my web page can display to the user.
I've seen a few examples in the Networking section of the docs that suggest it might be possible, but I can't seem to join the dots.
Hope that makes sense - any thoughts?
Edit: Added Code
The code I've been looking at works for text/html. Is it possible to return a PDF or binary response?
Parse.Cloud.httpRequest({
    url: 'example.com/file.pdf',
    success: function (httpResponse) {
        console.log(httpResponse.text);
    },
    error: function (httpResponse) {
        console.error('Request failed: ' + httpResponse.status);
    }
});
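For binary content, httpResponse.text will mangle the bytes. One approach is to base64-encode the raw bytes (Parse Cloud Code historically exposed them as httpResponse.buffer, a Node Buffer) and return a data URI the page can display. A sketch under those assumptions, with illustrative names:

```javascript
// Sketch: encode raw PDF bytes as a data URI the browser can render in an
// <embed>/<object> tag. Assumes `buffer` is a Node Buffer, as Parse Cloud
// Code exposed for binary responses via httpResponse.buffer.
function toPdfDataUri(buffer) {
  return 'data:application/pdf;base64,' + buffer.toString('base64');
}

// Hypothetical Cloud Code wiring (function name and params are made up):
// Parse.Cloud.define('fetchPdf', function (request, response) {
//   Parse.Cloud.httpRequest({ url: request.params.url })
//     .then(httpResponse => response.success(toPdfDataUri(httpResponse.buffer)))
//     .catch(err => response.error('Request failed: ' + err.status));
// });
```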