Is there an audit trail for Azure Blob Storage?

I'm trying to diagnose an issue where blobs uploaded to Azure Blob Storage may be getting deleted by some scheduled task.
Is there a way to see an audit trail for deleted items in the Azure portal?

There is, although it's not specific to Blob storage; monitoring is configured at the level of the whole storage account:
https://learn.microsoft.com/en-us/azure/storage/common/storage-monitor-storage-account
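Once diagnostic settings route blob logs to a destination you can read (a storage account, event hub, or Log Analytics), you can filter for delete operations. A minimal sketch, assuming the exported JSON-lines resource-log format; the field names (`operationName`, `time`, `callerIpAddress`, `uri`) follow the storage resource-log schema, and the sample entries below are made up for illustration:

```python
import json

# Sample resource-log entries (JSON lines), shaped like Azure Storage
# diagnostic logs; the values here are made up for illustration.
log_lines = [
    '{"time": "2023-05-01T02:00:03Z", "operationName": "DeleteBlob", '
    '"callerIpAddress": "10.0.0.4:49152", '
    '"uri": "https://acct.blob.core.windows.net/docs/a.pdf"}',
    '{"time": "2023-05-01T02:00:04Z", "operationName": "GetBlob", '
    '"callerIpAddress": "10.0.0.9:49153", '
    '"uri": "https://acct.blob.core.windows.net/docs/b.pdf"}',
]

def find_deletes(lines):
    """Return (time, caller, uri) for every DeleteBlob operation."""
    deletes = []
    for line in lines:
        entry = json.loads(line)
        if entry.get("operationName") == "DeleteBlob":
            deletes.append((entry["time"], entry["callerIpAddress"], entry["uri"]))
    return deletes

for when, caller, uri in find_deletes(log_lines):
    print(f"{when} {caller} deleted {uri}")
```

The caller IP and authentication details in each entry are usually enough to tell a scheduled task apart from interactive use.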

Related

Dataverse sync via Azure Synapse Link: option set text values are not exported to storage

We are using Dataverse to export data to Azure Storage via Azure Synapse Link.
We can see all the data land in Azure Data Lake without any issues.
Now, per the requirement, we need a transformation that resolves the option set values, which are loaded into a separate CSV in the storage account, but we could not do the transformation with model.json, because model.json only holds the schema details.
https://learn.microsoft.com/en-us/power-apps/maker/data-platform/export-to-data-lake-data-adf
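Since the option set labels land in a separate metadata CSV (as described above), one approach is to join the numeric option-set codes in the entity extract to their text labels during the transformation. A minimal sketch; the file layouts and column names (`OptionSetName`, `Option`, `LocalizedLabel`, `industrycode`) are assumptions for illustration, not the exact Synapse Link schema:

```python
import csv
import io

# Hypothetical extracts: the entity CSV holds the numeric option-set code,
# and a separate metadata CSV maps codes to labels. The column names and
# sample rows here are assumptions for illustration.
entity_csv = io.StringIO("accountid,industrycode\nA1,1\nA2,2\n")
optionset_csv = io.StringIO(
    "OptionSetName,Option,LocalizedLabel\n"
    "industrycode,1,Retail\n"
    "industrycode,2,Banking\n"
)

# Build a (option set name, code) -> label lookup from the metadata CSV.
labels = {
    (row["OptionSetName"], row["Option"]): row["LocalizedLabel"]
    for row in csv.DictReader(optionset_csv)
}

resolved = []
for row in csv.DictReader(entity_csv):
    # Replace the numeric code with its text label where a mapping exists.
    row["industrycode"] = labels.get(
        ("industrycode", row["industrycode"]), row["industrycode"]
    )
    resolved.append(row)

print(resolved)
```

The same join can be expressed in an ADF data flow or Synapse pipeline; the key point is treating the option-set metadata CSV as a lookup table rather than relying on model.json.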

Copying JSON data from CosmosDB to Snowflake through ADF

Hello,
I am trying to copy data from Cosmos DB to Snowflake through Azure Data Factory, but I get the error: "Direct copying data to Snowflake is only supported when source dataset is DelimitedText, Parquet, JSON with Azure Blob Storage or Amazon S3 linked service, for other dataset or linked service, please enable staging". Does that imply that I need to create a linked service with Blob storage? What URL and SAS token should I provide? Do I need to move everything to Blob first and then proceed with staging?
Any help is appreciated. Thank you very much.
Try it with a Data Flow activity instead of a Copy activity.
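Alternatively, the error itself points at staged copy: you keep the Copy activity but give it a Blob storage linked service to stage through. A sketch of the relevant `typeProperties` in the Copy activity JSON, assuming a Blob linked service named `StagingBlobStorage` and a container called `staging-container` (both names are placeholders):

```json
{
  "name": "CopyCosmosToSnowflake",
  "type": "Copy",
  "inputs": [ { "referenceName": "CosmosDbDataset", "type": "DatasetReference" } ],
  "outputs": [ { "referenceName": "SnowflakeDataset", "type": "DatasetReference" } ],
  "typeProperties": {
    "source": { "type": "CosmosDbSqlApiSource" },
    "sink": { "type": "SnowflakeSink" },
    "enableStaging": true,
    "stagingSettings": {
      "linkedServiceName": {
        "referenceName": "StagingBlobStorage",
        "type": "LinkedServiceReference"
      },
      "path": "staging-container"
    }
  }
}
```

With staging enabled, ADF writes the Cosmos DB data to the Blob container itself and loads Snowflake from there, so you do not have to move the data to Blob manually first.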

Incremental loads from blob storage to Azure Table Storage

I have the following scenario (a rather common one, but I am not entirely sure where to start).
I have data coming into a blob storage container (our raw zone). The files get dropped in the raw zone every day (by someone sitting somewhere). Each day, as new files come in, the old files are overwritten, but the number of records grows.
For example, a customer file from yesterday may have 100 records, while today's file might have 150 records (100 from yesterday and 50 from today).
Now, what is the best way to do an incremental load (other solutions welcome) that moves only the latest records into Azure Table Storage?
I have worked with watermarks when loading data from or into SQL, but I don't have much experience with Azure Table Storage. I would appreciate a lead.
Thanks in advance.
You can use ADF to do an incremental load into Azure Table Storage using watermarks. Refer to the links below; you might need to tweak the implementation a little based on your requirements.
Incrementally load data from Azure SQL Database to Azure Blob storage using the Azure portal
Copy data to and from Azure Table storage using Azure Data Factory or Synapse Analytics
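For files that are re-dropped in full each day but only ever grow (100 rows yesterday, 150 today), the watermark can simply be the number of rows already loaded. A minimal sketch of that idea, independent of ADF; the file contents below are made up for illustration:

```python
# Simulated daily drops: today's file is a superset of yesterday's.
yesterday_file = [f"record-{i}" for i in range(100)]   # day 1 drop
today_file = [f"record-{i}" for i in range(150)]       # day 2 drop

def incremental_rows(file_rows, watermark):
    """Return only the rows beyond the watermark, plus the new watermark.

    The watermark is the count of rows already pushed to Table Storage;
    in a pipeline you would persist it between runs (e.g. in a control
    table or a small blob) rather than keep it in memory.
    """
    new_rows = file_rows[watermark:]
    return new_rows, len(file_rows)

watermark = 0
loaded, watermark = incremental_rows(yesterday_file, watermark)  # loads 100 rows
loaded, watermark = incremental_rows(today_file, watermark)      # loads only the 50 new rows
print(len(loaded), watermark)
```

In ADF the same pattern becomes a lookup of the stored watermark, a copy of the rows past it, and a final activity that writes the new watermark back, as in the linked tutorials. Note this only works while old records are never edited or reordered in place.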

Auto generated block blobs

I am observing that whenever I create a new folder inside Azure Blob Storage, a block blob with the same name as the folder is auto-created. I don't know why, or what setting makes it behave this way. Any pointers on why this happens and how to disable it? Thank you.
In Azure Blob Storage (not Azure Data Lake Storage Gen2), you should know one important thing: you cannot create an empty folder. The reason is that blob storage has a two-level hierarchy, blob container and blob, so any folder/directory must be part of a blob's name.
If you want to create an empty folder, use Azure Data Lake Storage Gen2 instead. It's built on blob storage and supports the familiar filesystem operations.
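The two-level hierarchy means "folders" are just name prefixes over a flat list of blobs, which is why tools create a placeholder blob when you ask for an empty folder. A small sketch of how virtual folders fall out of flat blob names (the names below are made up):

```python
# Blob storage has no real directories: a container holds a flat list of
# blobs, and "folders" are just prefixes in the blob names.
blob_names = [
    "invoices/2023/jan.pdf",
    "invoices/2023/feb.pdf",
    "reports/summary.csv",
]

def virtual_folders(names, delimiter="/"):
    """Derive the set of top-level 'folders' from flat blob names."""
    return {name.split(delimiter, 1)[0] for name in names if delimiter in name}

print(sorted(virtual_folders(blob_names)))
```

Delete the last blob under a prefix and the "folder" disappears with it, because nothing else in the account represents it.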

Backup Azure blob storage in line with SQL Azure DB

It is recommended that we store document information in blob storage. In our case the blob storage is related to the SQL Azure data. Is there a facility to back up the blob storage in sync with the SQL Azure data? What I don't want is a point-in-time restore of the SQL Azure data only to find we don't have a matching snapshot of the blob data at that time :(
Does anyone know what is available?
Interesting issue you have to solve, but there is no automated way to keep blob and Azure SQL Database data in sync; you have to manage this yourself. And it is not just about blob snapshots: what if an updated DB record refers to a new blob, what happens to the old one? These are all business rules to apply at the application level, and you have to ask yourself to what degree you want that backup of blobs.
Here is an interesting blog post on Azure SQL and Storage backup. But again, there is no service that will keep the data between SQL DB and Azure Storage in sync for you.
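One application-level pattern is to snapshot every referenced blob at backup time and record the pairing, so a point-in-time DB restore can locate the matching blob snapshots. A minimal sketch; all names are hypothetical, and the snapshot call is simulated with a lambda instead of a real storage API call:

```python
from datetime import datetime, timezone

# Manifest rows pairing a DB backup timestamp with blob snapshot ids.
backup_manifest = []

def take_backup(referenced_blobs, snapshot_blob):
    """Snapshot every blob referenced by the DB and record the manifest.

    `snapshot_blob` stands in for whatever creates a blob snapshot or
    version in your storage account and returns its identifier.
    """
    backup_time = datetime.now(timezone.utc).isoformat()
    for blob in referenced_blobs:
        snapshot_id = snapshot_blob(blob)  # would call the storage API here
        backup_manifest.append(
            {"backup_time": backup_time, "blob": blob, "snapshot_id": snapshot_id}
        )
    return backup_time

# Simulated snapshot ids for illustration.
fake_ids = iter(["snap-001", "snap-002"])
when = take_backup(["docs/a.pdf", "docs/b.pdf"], lambda blob: next(fake_ids))

# Restoring the DB to `when` means restoring each blob from the snapshot
# recorded against that same backup_time.
matching = [m for m in backup_manifest if m["backup_time"] == when]
print(len(matching))
```

This still leaves the business rules the answer mentions (what to do with superseded blobs, how long to keep snapshots), but it gives each DB restore point a concrete set of blob versions to restore alongside it.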