Fine uploader azure blob storage delete file option - fine-uploader

I am using Fine Uploader to upload images to Azure blob storage. Everything is working, and now I want to delete files. I used the code below:
deleteFile: {
    enabled: true,
    endpoint: 'api/documentManager',
    method: 'DELETE'
}
The file is being deleted successfully from blob storage, but the callback for server-side handling is not getting called. Is there anything I am missing?

There is no server callback when using the delete file feature in Fine Uploader Azure. The request is sent directly to Azure. If you would like to contact your server after the delete has completed, do so in the onDeleteComplete callback.
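For example, here is a minimal sketch of notifying a server from that callback, assuming the traditional qq.azure.FineUploader build and reusing the api/documentManager endpoint above (the query parameter name is an assumption):

var uploader = new qq.azure.FineUploader({
    element: document.getElementById('fine-uploader'),
    // ... request, signature, and deleteFile options as above ...
    callbacks: {
        onDeleteComplete: function (id, xhr, isError) {
            // Fires after the DELETE request to Azure has finished.
            if (!isError) {
                // Notify your own backend; the endpoint and parameter name are assumptions.
                fetch('api/documentManager?blobName=' + encodeURIComponent(uploader.getBlobName(id)), {
                    method: 'DELETE'
                });
            }
        }
    }
});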

Related

Laravel Uploading file Direct to API server without saving it on hosting server

I tried to search everywhere but I couldn't find my answer. Here is the case:
I have a form in Laravel.
Users can upload a video through this form.
The video is going to be saved/uploaded to Vimeo by an API call.
Now, what I want is to not save the video on my server, i.e. I don't want to save it on the hosting app server.
I want to send it directly to the Vimeo API, but I am not sure how to do that.
So far this is the code:
$video = $request->file('video');
dd($video);
Vimeo::connection('main')->upload($video);
Can somebody guide me on how to send this video, coming in through an HTTP POST request, directly to the API?
Thank You
Instead of passing an UploadedFile instance to upload(), try passing the temporary file path:
Vimeo::connection('main')->upload($request->file('video')->path());

Clarification for FineUploader blobUri property uploading to Azure

I'm setting up the React version of FineUploader to upload files to Azure. Going through the docs, I could use a bit more clarification on the blobUri sent to my API when requesting a SAS.
We're requesting the SAS before uploading the file, which is why I'm confused.
Say, I'm uploading a file named my-image.jpg into my blob container with the endpoint of https://myaccount.blob.core.windows.net/my-container on Azure.
I also want to rename the file during upload by calling a function; let's assume the function returns 89056c3d-7bb3-my-image.jpg as the file name.
Would https://myaccount.blob.core.windows.net/my-container/89056c3d-7bb3-my-image.jpg then be the blobUri sent to my API when requesting a SAS?
In other words, are we constructing the blobUri using the Azure blob storage container URI and the file name we'll end up using?
If I'm interpreting this correctly, what happens if the user is uploading multiple files? What would be the blobUri I'd have to send to request a SAS?
UPDATE:
When my request hits my backend API to get a SAS, the blobUri comes in as /server/upload/some-guid-value.txt. I'm using the following options when instantiating an uploader. What am I doing wrong?
const uploader = new FineUploaderAzure({
    options: {
        signature: {
            endpoint: 'http://localhost:4879/api/getsas'
        },
        request: {
            containerUrl: 'https://myaccount.blob.core.windows.net/my-container'
        },
        uploadSuccess: {
            endpoint: 'http://localhost:4879/success'
        }
    }
})
In other words, are we constructing the blobUri using the Azure blob storage container URI and the file name we'll end up using?
Correct. Fine Uploader Azure constructs this for you. Be sure to verify permissions (considering the _method param that accompanies the Blob url) before returning a signature.
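As a rough illustration of the signature side, here is a minimal Node/Express sketch, assuming Fine Uploader Azure sends the blob URI and the intended HTTP verb as query parameters (bloburi and _method) and expects the signed blob URI back in the response body; the @azure/storage-blob usage, account values, and permission mapping are assumptions to adapt to your own setup:

const express = require('express');
const {
    StorageSharedKeyCredential,
    generateBlobSASQueryParameters,
    BlobSASPermissions
} = require('@azure/storage-blob');

const app = express();
const credential = new StorageSharedKeyCredential('myaccount', process.env.AZURE_ACCOUNT_KEY);

app.get('/api/getsas', (req, res) => {
    // e.g. bloburi = https://myaccount.blob.core.windows.net/my-container/89056c3d-7bb3-my-image.jpg
    const blobUri = req.query.bloburi;
    const method = req.query._method; // PUT for uploads, DELETE for deletes

    // Verify the caller may actually perform `method` on this blob before signing anything.
    const [, containerName, ...rest] = new URL(blobUri).pathname.split('/');
    const blobName = decodeURIComponent(rest.join('/'));

    const sas = generateBlobSASQueryParameters({
        containerName,
        blobName,
        permissions: BlobSASPermissions.parse(method === 'DELETE' ? 'd' : 'cw'),
        expiresOn: new Date(Date.now() + 15 * 60 * 1000) // short-lived SAS
    }, credential).toString();

    // Return the signed blob URI as the response body.
    res.send(blobUri + '?' + sas);
});

app.listen(4879);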

parse.com: cloud code to return a PFFile

Background...
I'm exploring Parse.com as a back end for an iOS app that also has an html/web browser interface for some users (either via javascript client or asp.net client - to be determined). The web users are 'an audience' for the data/files the app users prepare in the app. They are not the same people.
I need to lock down access to objects in the database (no public access read or write) so I plan to set up an admin user belonging to an administrators role and create an app_users role applying class-level permissions to the various classes accordingly.
Then, for my iOS app, I'll use anonymous users, add them to the app_users role, set up a default ACL for object-level permissions, and interact with the data model accordingly.
The app creates PDF files and stores them as PFFile objects, and I want these to have no public read or write access either. These docs are what will be accessible via the web client.
So...
I don't think I want to use PFUsers for each potential user accessing via a web client; I don't want it to be over-engineered. So I figured I'd send params to Cloud Code (with useMasterKey()) to first return a list of file metadata to present to the user. This works well: I can return the PFFile URL or objectId, doc name, file type, and size...
The challenge...
Next I'd need to build a Cloud Code function which, given an objectId or a URL, will fetch the PDF file and return it in a way my web page can display to the user.
I've seen a few examples in the Networking section of the docs that look like it might be possible, but I can't seem to join the dots.
Hope that makes sense - any thoughts?
Edit: Added Code
The code I've been looking at works for text/html; is it possible to respond with a PDF or binary?
Parse.Cloud.httpRequest({
    url: 'example.com/file.pdf',
    success: function(httpResponse) {
        console.log(httpResponse.text);
    },
    error: function(httpResponse) {
        console.error('Request failed: ' + httpResponse.status);
    }
});

upload files directly to amazon s3 using fineuploader

I am trying to upload files directly to S3, but as per my research it needs server-side code or a dependency on Facebook, Google, etc. Is there any way to upload files directly to Amazon using Fine Uploader only?
There are three ways to upload files directly to S3 using Fine Uploader:
Allow Fine Uploader S3 to send a small request to your server before each API call it makes to S3. In this request, your server will respond with a signature that Fine Uploader needs to make the request. This signature ensures the integrity of the request, and requires you to use your secret key, which should not be exposed client-side; a minimal sketch of this option appears after this list. This is discussed here: http://blog.fineuploader.com/2013/08/16/fine-uploader-s3-upload-directly-to-amazon-s3-from-your-browser/.
Ask Fine Uploader to sign all requests client-side. This is a good option if you don't want Fine Uploader to make any requests to your server at all. However, it is critical that you don't simply hardcode your AWS secret key. Again, this key should be kept a secret. By utilizing an identity provider such as Facebook, Google, or Amazon, you can request very limited and temporary credentials which are fed to Fine Uploader. It then uses these credentials to submit requests to S3. You can read more about this here: http://blog.fineuploader.com/2014/01/15/uploads-without-any-server-code/.
The third way to upload files directly to S3 using Fine Uploader is to either generate temporary security credentials yourself when you create a Fine Uploader instance, or simply hard-code them in your client-side code. I would suggest you not hard-code security credentials.
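For reference, here is a minimal sketch of option 1 (server-signed requests) using the traditional qq.s3.FineUploader build; the bucket URL, access key, and endpoints below are placeholders for your own values:

var uploader = new qq.s3.FineUploader({
    element: document.getElementById('fine-uploader'),
    request: {
        endpoint: 'https://my-bucket.s3.amazonaws.com',  // your bucket's endpoint
        accessKey: 'YOUR_AWS_PUBLIC_ACCESS_KEY'          // never the secret key
    },
    signature: {
        endpoint: '/s3/signature'  // your server signs each request here using the secret key
    },
    uploadSuccess: {
        endpoint: '/s3/success'    // optional: your server is notified after each upload completes
    }
});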
Yes, you can do this with Fine Uploader. Here is a link that explains very well what you need to do: http://blog.fineuploader.com/2013/08/16/fine-uploader-s3-upload-directly-to-amazon-s3-from-your-browser/
Here is what you need. In this blog post the Fine Uploader team introduces serverless S3 uploads via JavaScript: http://blog.fineuploader.com/2014/01/15/uploads-without-any-server-code/

How can I make POST requests without making my API key public?

Using the ImageShack API I can upload images to ImageShack, but I have to use an API key to do that. I can create a POST form for the image upload, but the key has to be put in the form, which exposes the API key publicly. How can I upload images to ImageShack without exposing my API key?
I think the only way to do this properly is to have the user first POST the image to your OWN application.
Then in your app you internally redirect this POST to ImageShack, where you can use your API key safely without anyone ever seeing it.
You can use something easy like RestClient to run the POST request from your back end. You will need to store the image temporarily on your server, either in memory or on disk, for retransmission to ImageShack; a sketch of this flow follows the steps below.
So:
1. User sends image with POST to your server
2. Your server receives the image in the POST request from the user
3. Your server runs a POST with this image to ImageShack using your API key
4. The POST request from step 1 returns successfully to the user
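For illustration, here is a minimal Node/Express sketch of this proxy flow; the ImageShack endpoint and field names below are assumptions, so check them against the actual API documentation:

const express = require('express');
const multer = require('multer');          // parses the user's multipart POST
const FormData = require('form-data');
const fetch = require('node-fetch');

const app = express();
const upload = multer();                   // keeps the file in memory for retransmission

// Steps 1 & 2: the user POSTs the image to YOUR server and never sees the API key.
app.post('/upload', upload.single('image'), async (req, res) => {
    // Step 3: re-POST the image to ImageShack from the back end using the secret key.
    const form = new FormData();
    form.append('api_key', process.env.IMAGESHACK_API_KEY);        // stays server-side
    form.append('file', req.file.buffer, req.file.originalname);   // field name is an assumption

    const upstream = await fetch('https://api.imageshack.com/v2/images', {  // endpoint is an assumption
        method: 'POST',
        body: form
    });

    // Step 4: relay the result back to the user.
    res.status(upstream.status).send(await upstream.text());
});

app.listen(3000);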
