Is there a way, or a template, that can be used within Power Automate to remove a file from an Azure blob container?
I have a flow that creates a file in SharePoint once it has been added to a Blob Storage container, but I then want to remove the file from the container after it is created in SharePoint.
You can use the Delete blob (V2) action right after SharePoint's Create file step and pass it the path to the blob from the trigger action. Below is the flow of my Logic App.
I want to configure an Azure lifecycle management policy on a container so that blobs are moved to the cool tier after a specific time, but the blobs are inside dynamically created directories inside the container, e.g.
mycontainer/test-123/blob1.pdf;
mycontainer/test-98765/blob2.pdf;
mycontainer/test-qw9876/blob3.pdf
where the prefix "mycontainer/test-" is the same for all blobs but the file names are dynamic. We need to apply the policy to all blobs under the container matching "mycontainer/test-*".
Note: the * values are dynamically generated.
I tried adding blobs to a container and moving them to the cool tier, and it worked successfully without any dynamic directory creation:
In your storage account -> Storage browser -> Blob containers -> create a container -> upload files directly into the container.
After configuring the lifecycle policy that moves blobs to the cool tier, remove the wildcard character and update the rule: the prefix filter is prefix-based, so "mycontainer/test-" already matches every dynamically named directory.
With this, blobs are moved to the cool tier after the specified time without any further changes.
For reference:
https://learn.microsoft.com/en-us/azure/storage/blobs/lifecycle-management-policy-configure?tabs=azure-portal
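Following the approach described above, a minimal policy definition might look like this (the rule name and the 30-day threshold are illustrative; note that `prefixMatch` starts with the container name and needs no trailing wildcard):

```json
{
  "rules": [
    {
      "enabled": true,
      "name": "move-test-blobs-to-cool",
      "type": "Lifecycle",
      "definition": {
        "actions": {
          "baseBlob": {
            "tierToCool": { "daysAfterModificationGreaterThan": 30 }
          }
        },
        "filters": {
          "blobTypes": [ "blockBlob" ],
          "prefixMatch": [ "mycontainer/test-" ]
        }
      }
    }
  ]
}
```

Because the filter is a plain prefix match, it covers `mycontainer/test-123/blob1.pdf`, `mycontainer/test-qw9876/blob3.pdf`, and any future dynamically generated directory.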
I'd like to copy files incrementally from an FTP server into Snowflake.
Currently, I am using ADF pipeline which runs every 15 minutes and copies them into Azure Blob Storage. Then I have Snowpipes which ingest them into the target tables in Snowflake. However, I am looking for a more optimized solution.
Is there a way to load data directly from FTP to Snowflake instead of using ADF to copy files to Azure blob storage and read them from there?
Thanks
You have two alternative options:
Use Logic Apps instead of ADF. In Logic Apps, add the FTP connector with the "When a file is added or modified" trigger.
Use a Snowflake stream and task instead of Snowpipe, together with ADF or Logic Apps.
I uploaded files, mostly media files, to Azure File Storage, and I can see them in Azure Storage Explorer. But when I try to view a file as an anonymous user, I cannot access it. I tried checking the permission settings as well, but to no avail.
Any help would be welcomed :)
Azure Files uses Shared Access Signatures (SAS). A SAS is a token that you compute with the storage account key and that grants access to a particular URL. Here is an example (the storage account name is obfuscated here):
https://mystorageaccount.file.core.windows.net/sampleshare/2.png?sv=2015-04-05&sr=f&si=sampleread&sig=Zq%2BfflhhbAU4CkCuz9q%2BnUFEM%2Fsg2PbXe3L4MeCC9Bo%3D&sip=0.0.0.0-255.255.255.255
You have sample code on how to create a SAS for Azure Files at https://azure.microsoft.com/en-us/documentation/articles/storage-dotnet-how-to-use-files/, § "Generate a shared access signature for a file or file share".
You can also do it interactively with a number of tools. For instance, CloudXPlorer has this feature.
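To make the "key you compute with the storage account key" concrete, here is a hand-rolled sketch of the HMAC-SHA256 computation behind a service SAS, assuming the 13-field string-to-sign layout of service version 2015-04-05. The `make_file_sas` function and its defaults are illustrative; in a real application you should use the storage SDK's SAS helpers rather than signing by hand.

```python
import base64
import hashlib
import hmac
import urllib.parse

def make_file_sas(account, key_b64, share, path,
                  permissions="r", start="", expiry="2030-01-01T00:00:00Z",
                  version="2015-04-05"):
    """Compute a service SAS query string for an Azure file (sketch).

    key_b64 is the base64-encoded storage account key from the portal.
    """
    # Canonicalized resource identifies the signed file.
    canonical = f"/file/{account}/{share}/{path}"
    # 13 newline-separated fields: sp, st, se, resource, identifier,
    # sip, spr, sv, then the five response-header overrides.
    string_to_sign = "\n".join([
        permissions, start, expiry, canonical,
        "", "", "", version,
        "", "", "", "", "",
    ])
    digest = hmac.new(base64.b64decode(key_b64),
                      string_to_sign.encode("utf-8"),
                      hashlib.sha256).digest()
    params = {"sv": version, "sr": "f", "sp": permissions,
              "se": expiry, "sig": base64.b64encode(digest).decode()}
    return urllib.parse.urlencode(params)
```

Appending the returned query string to the file URL yields a link like the example above, usable by anonymous clients until the expiry time.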
I want to cache some cropped images and serve them without recalculating them in an Azure Web Site. When I used an Azure VM I just stored them on the D: drive (the temporary drive), but I don't know where to store them now.
I could use Path.GetTempPath, but I am not sure this is the best approach.
Can you suggest where I should store my temporary files when serving from an Azure Web Site?
Azure Websites also comes with a temp folder. The path is defined in the %TEMP% environment variable.
You can store your images in App_Data folder in the root of your application or you can use Azure CDN for caching.
You could store the processed content on Azure Blob Storage and serve the content from there.
If what you really want is a cache you can also look into using the Azure Redis Cache.
You can use the Path.GetTempPath() and Path.GetTempFileName() functions for the temp file name, but you are limited in terms of space; if you're doing a 10 KB save for every request and expect 100,000 requests at a time per server, blob storage may be better.
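The temp-folder caching idea above can be sketched as follows (shown in Python for brevity; the thread itself is about .NET, where `Path.GetTempPath()` plays the role of `tempfile.gettempdir()`). The `image-cache` subfolder and `.jpg` naming are illustrative, and on an Azure Web Site the cache is best treated as disposable:

```python
import os
import tempfile

# Cache processed images under the site's temp folder (%TEMP% on
# Azure Websites). The subfolder name is illustrative.
CACHE_DIR = os.path.join(tempfile.gettempdir(), "image-cache")
os.makedirs(CACHE_DIR, exist_ok=True)

def cached_path(image_id: str) -> str:
    """Map an image id to its on-disk cache location."""
    return os.path.join(CACHE_DIR, f"{image_id}.jpg")

def get_or_create(image_id: str, render) -> bytes:
    """Return cached bytes, rendering and storing them on a cache miss."""
    path = cached_path(image_id)
    if os.path.exists(path):
        with open(path, "rb") as f:
            return f.read()
    data = render()          # expensive crop/resize happens only once
    with open(path, "wb") as f:
        f.write(data)
    return data
```

If the space limit mentioned above becomes a problem, the same `get_or_create` shape works with blob storage or Redis as the backing store instead of the temp folder.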
The following sample demonstrates how to save temp files in Azure, using both a local path and a blob.
Doc is here: https://code.msdn.microsoft.com/How-to-store-temp-files-in-d33bbb10
Code is here: https://github.com/Azure-Samples/storage-blob-dotnet-store-temp-files/archive/master.zip
I am a bit stuck with this Windows Azure Blob storage.
I have a controller that receives a local file path.
So on the web page I do something like this:
http:...?filepath=C:/temp/myfile.txt
On the web service I want to get this file and put it in the blob service. When I run it locally there is no problem, but when I publish it there is no way to get the file. I always get:
Error encountered: Could not find a part of the path 'C:/temp/myfile.txt'.
Can someone help me? Is there a solution?
First, I would say that to get proper help you need to provide a better description of your problem. What do you mean by "on the web service"? A WCF web role seems to match your partial problem description; however, most web services use http://whatever.cloudapp.net/whatever.svc, as well as http://whatever.cloudapp.net/whatever.aspx?whatever if added. Have you done something like that in your application?
You also mention a controller in your code, which makes me think it is an MVC-based web role application.
I am writing the above to help you formulate your question better next time.
Finally, based on what you have provided, you are reading a file from the local file system (C:\temp\myfile.txt) and uploading it to an Azure blob. This works in the compute emulator but will surely fail in Windows Azure because:
In your web role code you will not have access to write to the C:\ drive, which is why the file is not there and you get the error. Your best bet is to use Azure local storage: write any content there, then read the file from local storage and upload it to the Azure blob. Azure local storage is designed for writing content from a web role (you will have write permission).
Finally, I am concerned about your application design, because Azure VMs are not persisted, so any solution that writes somewhere inside the VM is fragile; you may want to write directly to Azure Storage without staging on the VM, if that is possible.
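The underlying fix is that the browser must send the file's content, not a path on the client's own C: drive, and the server then stages it in its writable local storage before pushing to the blob. A minimal sketch of that flow (Python for brevity; `handle_upload`, the `LOCAL_STORAGE_ROOT` variable, and the `upload_to_blob` callable are illustrative stand-ins for the web-role local storage API and the storage SDK upload call):

```python
import os
import tempfile

def handle_upload(file_bytes: bytes, name: str, upload_to_blob) -> str:
    """Stage uploaded bytes in local storage, then push them to a blob.

    upload_to_blob stands in for an SDK call such as a blob upload;
    it receives the staged file's path.
    """
    # Writable local storage root; falls back to the temp dir here.
    local_root = os.environ.get("LOCAL_STORAGE_ROOT", tempfile.gettempdir())
    staged = os.path.join(local_root, name)
    with open(staged, "wb") as f:
        f.write(file_bytes)        # server-side copy of the *content*
    upload_to_blob(staged)         # hand the staged file to blob storage
    return staged
```

The key point is that only `file_bytes`, posted in the request body, crosses the client/server boundary; the original client path like C:/temp/myfile.txt never needs to exist on the server.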
Did you verify the file exists on the Azure server?