Move files to Azure Blob Storage using Power Automate

I want to move my email attachments to Azure Blob Storage using Power Automate. I am aware of the Blob Storage connection, but I can't use it as I don't have the access key.
After surfing Google, I managed to find the link below. I need help with how to set the x-ms header and how to choose the folder inside the blob container to upload the file into.
I lack any kind of experience with HTTP and Azure Blob Storage. :(
Please help.
Link: https://powerusers.microsoft.com/t5/Using-Flows/how-to-upload-to-blob-container-via-sas-url/m-p/125756#M3360

After reproducing this on our end, here is how we were able to save files to our blob from the HTTP connector.
We used “x-ms-blob-type” as a header with a value of “BlockBlob”. Make sure the URI contains the path to your storage account in the following format:
https://<STORAGE ACCOUNT NAME>.blob.core.windows.net/<CONTAINER NAME>/<FILE NAME><SAS>
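For reference, the raw REST call the HTTP connector performs can be sketched in Go as below; the storage account, container, blob name, and SAS token are hypothetical placeholders:

package main

import (
    "bytes"
    "fmt"
    "net/http"
    "os"
)

func main() {
    // Hypothetical local copy of the email attachment.
    data, err := os.ReadFile("attachment.pdf")
    if err != nil {
        panic(err)
    }

    // <STORAGE ACCOUNT NAME>/<CONTAINER NAME>/<FILE NAME><SAS>, exactly as in the format above.
    uri := "https://mystorageaccount.blob.core.windows.net/mycontainer/attachment.pdf?sv=...&sig=..."

    req, err := http.NewRequest(http.MethodPut, uri, bytes.NewReader(data))
    if err != nil {
        panic(err)
    }
    // The one required header: tells the service to create a block blob.
    req.Header.Set("x-ms-blob-type", "BlockBlob")

    resp, err := http.DefaultClient.Do(req)
    if err != nil {
        panic(err)
    }
    defer resp.Body.Close()
    fmt.Println(resp.Status) // expect "201 Created" on success
}

As for choosing a folder: blob storage "folders" are just prefixes in the blob name, so to upload into one, include it in the URI, e.g. .../mycontainer/emails/attachment.pdf?<SAS>, provided your SAS grants access at the container level.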

Some NFT images minted from Candy Machine V2 are not displayed

I created a Candy Machine and noticed that some NFT images are not displayed, either in the wallet or on Solscan. Metadata and images were uploaded and pinned on Pinata using the Candy Machine upload command, which finished successfully.
Here is an example of a broken NFT. The metadata URI is pointed to: and the image is pointing to: . I looked at the metadata several times and could not find what the issue is or why this is happening to some of the NFTs (here is an example of a valid NFT that does not have this problem).
Questions:
What is the problem with the metadata that is causing the image not to be displayed?
What is the best way to fix this? The metadata is mutable and I am planning to use Metaboss to update the URI of the metadata file. Is this the correct way of fixing this problem?
If you look at the URI metadata on solscan here
https://solscan.io/token/4ToXb3aD5YLpXqyZhcdp5ynpbXXFFjKjsaw1x94CTd7A#metadata
and swap to the URI version of the metadata, it returns an object key/value pair for every single character in the JSON for some reason, which is extremely weird. I'd highly recommend re-uploading this JSON metadata file and then updating the metadata URI in the NFT to apply the changes.
Metaboss is a great tool, as you have researched, that can do this for you:
Upload the new metadata JSON to IPFS or Arweave.
Use Metaboss to update the NFT's metadata URI.
Hopefully this fixes your issue.
While this is not a conclusive answer as to why this happened (I don't really know the why), of all the NFTs I've seen, this is the first time one has behaved this way upon upload when displaying its metadata.
If that doesn't fix your issue please stop by the Metaplex Discord server and chat with us there. :)
Edit: After further inspection it turns out your JSON file has an invalid structure for this particular NFT, so it will definitely need replacing.
Tony Boyle has a great point about updating your JSON.
Your problem becomes visible when you run the JSON through a JSON validator.
It will show that you have too many } in there. Therefore parsing the JSON fails in Solscan, Phantom, etc.
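If you want to check the file yourself before re-uploading, any JSON validator will do; here is a minimal sketch in Go, assuming a hypothetical local copy named metadata.json:

package main

import (
    "encoding/json"
    "fmt"
    "os"
)

func main() {
    // Hypothetical local copy of the NFT's metadata file.
    data, err := os.ReadFile("metadata.json")
    if err != nil {
        panic(err)
    }
    if json.Valid(data) {
        fmt.Println("metadata.json parses as valid JSON")
    } else {
        // A stray closing brace, as in this case, lands here.
        fmt.Println("metadata.json is NOT valid JSON")
    }
}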
What you have to do is:
Modify the metadata to be a valid JSON file.
Upload it again.
Update the NFT URI, e.g. with Metaboss (if you need to update multiple NFTs) or, if it's just one or a few, with https://sol-tools.tonyboyle.io/update-nft

Can I serve files stored in Google Cloud Storage via an http.FileServer in golang?

I have developed a small web application that runs a web server in golang.
Each user can log in, view the list of their (previously uploaded) docs, and click on an item to view an HTML page that shows some fields of the document plus an HTML tag with a src attribute.
The src attribute contains a URL like "mydocuments/download/123-456-789.pdf".
On the server side I handle the URL ("mydocuments/download/*") via an http handler:
mymux.HandleFunc(pat.Get("/mydocuments/download/:docname"), DocDownloadHandler)
where:
I check that the user has the rights to view the document in the URL,
then I create a file server that re-maps the URL to the real path of the folder where the files are stored on the server's filesystem:
fileServer := http.StripPrefix("/mydocuments/download/", http.FileServer(http.Dir("/the-real-path-to-documents-folder/user-specific-folder/")))
and of course I serve the files
fileServer.ServeHTTP(w, r)
IMPORTANT: the directory where the documents are stored is not the static-files directory I use for the website, but a directory where all files end up after being uploaded by users.
My QUESTION
As I am trying to convert the code so that it also works on Google Cloud, I am changing it so that files are stored in a bucket (or, better, in "sub-directories" of a bucket, even though these do not properly exist).
How can I modify the code to map the document URL to the file stored in the cloud storage bucket?
Can I still use the http.FileServer technique above (and if so, what should I use instead of http.Dir to map the bucket "sub-folder" path where the documents are stored)?
I hope I was clear enough in explaining my issue...
I apologise in advance for any unclear points...
Some options are:
Give the user direct access to the resource using a signed URL.
Write code to proxy the request to GCS.
Use http.FS with an fs.FS backed by GCS.
It's possible that an fs.FS for GCS already exists, but you may need to write one.
Alternatively, you can implement http.FileSystem yourself, since it is an interface and can be implemented however you like.
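As a minimal sketch of the proxy option, assuming the cloud.google.com/go/storage client and hypothetical bucket and folder names (the permission check from the original handler is omitted):

package main

import (
    "io"
    "net/http"
    "strings"

    "cloud.google.com/go/storage"
)

// DocDownloadHandler streams a GCS object to the client.
// In production you would create the client once at startup, not per request.
func DocDownloadHandler(w http.ResponseWriter, r *http.Request) {
    ctx := r.Context()

    client, err := storage.NewClient(ctx)
    if err != nil {
        http.Error(w, "storage unavailable", http.StatusInternalServerError)
        return
    }
    defer client.Close()

    // Map the URL to an object name; "sub-directories" are just
    // prefixes in the object name, e.g. "user-specific-folder/123-456-789.pdf".
    name := "user-specific-folder/" + strings.TrimPrefix(r.URL.Path, "/mydocuments/download/")

    rc, err := client.Bucket("my-documents-bucket").Object(name).NewReader(ctx)
    if err != nil {
        http.NotFound(w, r)
        return
    }
    defer rc.Close()

    w.Header().Set("Content-Type", rc.Attrs.ContentType)
    io.Copy(w, rc) // stream the object body to the response
}

func main() {
    http.HandleFunc("/mydocuments/download/", DocDownloadHandler)
    http.ListenAndServe(":8080", nil)
}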

Anonymous download request to Azure Storage blob service does not download PDF with the user-friendly name configured during upload

The requirement is to provide a friendly file name during PDF download to the customers with whom we have shared the Azure blob download URLs (blobs without a SAS token). I am working on this requirement using the Azure emulator in my local setup. I have set the Content-Disposition property during upload of the file and am able to see it in the blob properties (using Storage Explorer) after upload, but it isn't returned in the response during download. Is this the expected behaviour?
I have already tried the following suggestion:
Set the DefaultServiceVersion of the blob service before setting the container ACL. I have set it to 2017-11-09, but the x-ms-version returned in the download response header still shows 2009-09-19 and there is no Content-Disposition returned in the response. I have checked the property in PowerShell too, using Get-AzStorageServiceProperty -ServiceType Blob -Context $ctx; the default version is set to 2017-11-09.
Cases where Content-Disposition works:
1. When I send x-ms-version in the request header, I am able to download the PDF with the name set in the Content-Disposition property of the uploaded file.
2. While using a SAS token too, the Content-Disposition property is used and I am able to download the file with the desired name.
I need to get this working for anonymous requests.
This is what I have as of now (PHP):
// Create the blob client from the storage connection string.
$this->blobSvc = BlobRestProxy::createBlobService($this->connectionString);
// Read the current service properties, set the default service version,
// and write the modified properties back.
$serviceProperties = $this->blobSvc->getServiceProperties();
$serviceProperties->getValue()->setDefaultServiceVersion('2017-11-09');
$this->blobSvc->setServiceProperties($serviceProperties->getValue());
The DefaultServiceVersion gets set correctly, but the x-ms-version in the response is still incorrect and the Content-Disposition header isn't returned during download.
The Azure emulator seems to have the above issue. With an actual Azure account, Content-Disposition for anonymous requests works as expected. Thanks for all the help.
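A quick way to confirm what the service is actually sending back is to make the anonymous request yourself and print the two headers in question. A minimal sketch in Go, with a hypothetical public blob URL:

package main

import (
    "fmt"
    "net/http"
)

func main() {
    // Hypothetical public blob; note there is no SAS token on the URL.
    resp, err := http.Get("https://mystorageaccount.blob.core.windows.net/public-container/report.pdf")
    if err != nil {
        panic(err)
    }
    defer resp.Body.Close()

    // If the service version in effect is too old (e.g. 2009-09-19),
    // the Content-Disposition line will come back empty.
    fmt.Println("x-ms-version:       ", resp.Header.Get("x-ms-version"))
    fmt.Println("Content-Disposition:", resp.Header.Get("Content-Disposition"))
}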

Parse Server - Where does it store uploaded files

I am new to Parse Server (implementing it on Heroku and locally).
I have a basic question: when I upload a file using the ParseFile class, it provides me a URL and a file object. Where is this file being stored?
Is it being stored physically on a file system? Or in MongoDB?
Thank you!
I found a collection in MongoDB named fs.files; the files I uploaded were located there (fs.files is the metadata collection used by MongoDB's GridFS, which Parse Server uses for file storage by default). I assume the Parse URL is generated as a redirect.
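To verify this, you can list that collection directly; a minimal sketch with the official Go MongoDB driver (the connection string and database name are hypothetical):

package main

import (
    "context"
    "fmt"

    "go.mongodb.org/mongo-driver/bson"
    "go.mongodb.org/mongo-driver/mongo"
    "go.mongodb.org/mongo-driver/mongo/options"
)

func main() {
    ctx := context.Background()

    // Hypothetical connection string; use your Heroku/local Mongo URI.
    client, err := mongo.Connect(ctx, options.Client().ApplyURI("mongodb://localhost:27017"))
    if err != nil {
        panic(err)
    }
    defer client.Disconnect(ctx)

    // fs.files holds one metadata document per uploaded file;
    // the binary content itself is chunked into fs.chunks.
    cur, err := client.Database("parse").Collection("fs.files").Find(ctx, bson.D{})
    if err != nil {
        panic(err)
    }
    defer cur.Close(ctx)

    for cur.Next(ctx) {
        var doc bson.M
        if err := cur.Decode(&doc); err != nil {
            panic(err)
        }
        fmt.Println(doc["filename"], doc["length"]) // file name and size in bytes
    }
}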

Call Google Spreadsheet by bash

I would like to call my private Google spreadsheet from bash/shell. I don't want to edit it or anything, only read it and print the content to standard output...
Is it possible?
I found a way to call published projects from Google Drive, but that doesn't fit, as I have confidential data in that spreadsheet!
Also, I haven't found an easy-to-use API solution to do that one simple thing.
Thanks!
Yes, it's possible.
Your script will need to make three requests, each with curl:
Use a saved refresh token to request an access token.
Use the access token to fetch your spreadsheet's File resource (JSON), then parse that JSON locally to extract the exportLinks object and choose which format (e.g. CSV) you want to download.
GET that exportLink.
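Here is a sketch of the same flow in Go (the question asks for bash, but these HTTP calls translate directly to curl one-liners); the client ID, client secret, refresh token, and file ID come from hypothetical environment variables, and it uses the current Drive v3 export endpoint in place of the older exportLinks field:

package main

import (
    "encoding/json"
    "io"
    "net/http"
    "net/url"
    "os"
)

func main() {
    // Hypothetical placeholders supplied via the environment.
    clientID := os.Getenv("CLIENT_ID")
    clientSecret := os.Getenv("CLIENT_SECRET")
    refreshToken := os.Getenv("REFRESH_TOKEN")
    fileID := os.Getenv("FILE_ID") // the spreadsheet's Drive file ID

    // Step 1: exchange the saved refresh token for a short-lived access token.
    form := url.Values{
        "client_id":     {clientID},
        "client_secret": {clientSecret},
        "refresh_token": {refreshToken},
        "grant_type":    {"refresh_token"},
    }
    resp, err := http.PostForm("https://oauth2.googleapis.com/token", form)
    if err != nil {
        panic(err)
    }
    defer resp.Body.Close()

    var tok struct {
        AccessToken string `json:"access_token"`
    }
    if err := json.NewDecoder(resp.Body).Decode(&tok); err != nil {
        panic(err)
    }

    // Step 2: download the spreadsheet as CSV and print it to stdout.
    req, err := http.NewRequest("GET",
        "https://www.googleapis.com/drive/v3/files/"+fileID+"/export?mimeType=text/csv", nil)
    if err != nil {
        panic(err)
    }
    req.Header.Set("Authorization", "Bearer "+tok.AccessToken)

    out, err := http.DefaultClient.Do(req)
    if err != nil {
        panic(err)
    }
    defer out.Body.Close()

    io.Copy(os.Stdout, out.Body)
}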
