I've finally been able to upload two videos as blobs into a storage account container. I can see the blobs from the portal when I drill down into Storage > storage account name > Container > container name.
I can also see them from the CLI with the command "storage blob list".
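For reference, the listing command (using the current az CLI; the account and container names below are placeholders) looks something like this:

```sh
# Requires credentials, e.g. --account-key or an authenticated az login session
az storage blob list \
    --account-name mystorageaccount \
    --container-name mycontainer \
    --output table
```

Both blobs show up in that output.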
However, when I attempt to upload the content into my Media Services account (I select upload content from storage, select the account, then the container...), I get the erroneous message that there are no blobs.
Clearly there are, but they are not showing up. Any clues?
(see attached screen shots)
Have you tried Azure Media Services Explorer? It's a very nice tool for working with Azure Media Services without writing a single line of code!
You can download it directly from GitHub: https://github.com/Azure/Azure-Media-Services-Explorer
EDIT: OK, I think I have found why the blob list is empty. I did not notice that your two files have no extensions. I have just reproduced your issue with a file that has no extension.
To work with Azure Media Services encoders, your files need to have valid extensions.
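Since blob storage has no rename operation, one way to fix your existing blobs is to copy each one to a new name that includes the proper extension and then delete the original. A minimal sketch with the classic .NET storage SDK (the connection string and names are placeholders):

```csharp
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class AddExtension
{
    static async Task Main()
    {
        var account = CloudStorageAccount.Parse("<storage-connection-string>");
        var container = account.CreateCloudBlobClient().GetContainerReference("mycontainer");

        CloudBlockBlob source = container.GetBlockBlobReference("video1");
        CloudBlockBlob target = container.GetBlockBlobReference("video1.mp4");

        // Server-side copy to the new name, then remove the extensionless original.
        await target.StartCopyAsync(source);
        await source.DeleteAsync();
    }
}
```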
Hope this helps,
Julien
I created a Next.js project that is deployed on Vercel and uses a MySQL database. I then deployed a Directus instance on Heroku that is tied to that same database. In my Next.js project I want to fetch and render images that I uploaded to Directus. At first this works, but after a while all the images disappear from the Directus media library. The folders and references to the images are still there, but I no longer see the pictures; I just see a generic JPG icon instead. When I try to fetch the images I get a 502 "Bad Gateway" error. I don't know what causes the images to disappear or how to fix it.
By default, Directus stores uploaded files locally on disk.
All Heroku applications run in a collection of lightweight Linux containers called dynos. Be aware that a dyno's filesystem is ephemeral.
That means if your Heroku app doesn't receive any traffic for 30 minutes, or whenever it is redeployed, the dyno it lives on is destroyed and its filesystem goes along with it. So this filesystem should not be used for any permanent storage (Directus file storage, in your case).
You can configure Directus to use S3, Google Cloud Storage, Azure, or Cloudinary.
For more details, check the File Storage page of the Directus docs.
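As a sketch, switching Directus 9 to S3 is done through environment variables along these lines (the variable names come from the Directus file-storage docs; the values are placeholders). On Heroku you would set them with heroku config:set:

```sh
STORAGE_LOCATIONS="s3"
STORAGE_S3_DRIVER="s3"
STORAGE_S3_KEY="<access-key>"
STORAGE_S3_SECRET="<secret-key>"
STORAGE_S3_BUCKET="<bucket-name>"
STORAGE_S3_REGION="<region>"
```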
I gave a few of my colleagues Contributor access on one of my Azure storage accounts.
The idea is to have them access (read and list) the data in the blob container, but I want to restrict them from downloading the data.
I tried the below:

- Using a SAS key with read and list permissions still allows them to download the blobs (using Storage Explorer).
- Giving them just Reader access and "Storage Blob Data Reader" access did not stop them from downloading the data.
- Changing the blob access tier to "Archive" is not a solution that suits us.
- I tried creating a custom role, but I am failing to find the exact allow and disallow permissions.
I have seen a similar question before, but it hasn't been answered yet: Restrict from downloading file on Azure Blob.
Can you please help?
If a user has read permission on a blob (either through a SAS token or an Azure AD role), they will be able to download the blob.
To prevent users from downloading a blob, remove read permissions on the blob for those users. For example, if you are using a SAS token, grant only the List permission there. The users will then be able to see the list of blobs but will not be able to download them.
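As a minimal sketch with the classic .NET storage SDK (the connection string and container name are placeholders), a list-only container SAS looks like this:

```csharp
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class ListOnlySas
{
    static void Main()
    {
        var account = CloudStorageAccount.Parse("<storage-connection-string>");
        var container = account.CreateCloudBlobClient().GetContainerReference("mycontainer");

        // List-only policy: holders can enumerate blobs but cannot read (download) them.
        var policy = new SharedAccessBlobPolicy
        {
            Permissions = SharedAccessBlobPermissions.List,
            SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(1)
        };

        Console.WriteLine(container.Uri + container.GetSharedAccessSignature(policy));
    }
}
```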
When I initiate an async copy of a block blob to another storage account using StartCopyAsync, is Azure doing any kind of integrity check for me, or if not, is there a way to have it do so?
I found that I can set the Properties.ContentMD5 property and have the integrity verified when uploading blobs. Is it also verified during a copy operation?
I searched through the docs and found no mention of an integrity check during an async copy specifically. I found a couple references to AzCopy making integrity checks, and it also has the /CheckMD5 option, which is essentially what I'd like Azure to do after the blob copy.
As far as I know, the Azure Blob SDK is a wrapper around the Azure Blob REST API.
So the SDK's StartCopyAsync method issues the Copy Blob operation (REST API) to the Azure service, and the copy itself runs on the server side.
In the Copy Blob article you will find: "When a blob is copied, the following system properties are copied to the destination blob with the same values".
That list includes the Content-MD5 property.
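So one way to double-check is to wait for the copy to complete and then compare the Content-MD5 on both sides yourself. A rough sketch with the classic SDK (connection strings and names are placeholders):

```csharp
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class VerifyCopy
{
    static async Task Main()
    {
        var src = CloudStorageAccount.Parse("<source-connection-string>")
            .CreateCloudBlobClient().GetContainerReference("src").GetBlockBlobReference("myblob");
        var dst = CloudStorageAccount.Parse("<destination-connection-string>")
            .CreateCloudBlobClient().GetContainerReference("dst").GetBlockBlobReference("myblob");

        // Note: if the source is private and in another account, pass a source SAS URI
        // to StartCopyAsync instead of the blob reference.
        await dst.StartCopyAsync(src);

        // The copy runs asynchronously on the service side; poll until it finishes.
        while (dst.CopyState.Status == CopyStatus.Pending)
        {
            await Task.Delay(500);
            await dst.FetchAttributesAsync();
        }

        // Content-MD5 is one of the system properties carried over, so these should match.
        await src.FetchAttributesAsync();
        bool md5Matches = src.Properties.ContentMD5 == dst.Properties.ContentMD5;
    }
}
```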
I uploaded files, mostly media files, to Azure File Storage, and I can see them in Azure Storage Explorer as well. But when I view a file as an anonymous user, I am not able to see it. I tried checking the permission settings as well, but to no avail.
Any help would be welcomed :)
Azure Files supports Shared Access Signatures (SAS). A SAS is a token computed with the storage account key that grants access to a particular URL. Here is an example (the storage account name is obfuscated here):
https://mystorageaccount.file.core.windows.net/sampleshare/2.png?sv=2015-04-05&sr=f&si=sampleread&sig=Zq%2BfflhhbAU4CkCuz9q%2BnUFEM%2Fsg2PbXe3L4MeCC9Bo%3D&sip=0.0.0.0-255.255.255.255
There is sample code showing how to create a SAS for Azure Files at https://azure.microsoft.com/en-us/documentation/articles/storage-dotnet-how-to-use-files/, in the section "Generate a shared access signature for a file or file share".
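Here is roughly what that sample boils down to with the classic .NET storage SDK (share and file names are placeholders):

```csharp
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.File;

class FileSas
{
    static void Main()
    {
        var account = CloudStorageAccount.Parse("<storage-connection-string>");
        var share = account.CreateCloudFileClient().GetShareReference("sampleshare");
        CloudFile file = share.GetRootDirectoryReference().GetFileReference("2.png");

        // Read-only SAS valid for 24 hours; append it to the file URL to grant access.
        var policy = new SharedAccessFilePolicy
        {
            Permissions = SharedAccessFilePermissions.Read,
            SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(24)
        };
        Console.WriteLine(file.Uri + file.GetSharedAccessSignature(policy));
    }
}
```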
You can also do it interactively with a number of tools. For instance, CloudXPlorer has this feature.
I am a bit stuck with this Windows Azure Blob storage.
I have a controller that receives a (local) file path.
So on the web page I do something like this:
http:...?filepath=C:/temp/myfile.txt
On the web service I want to get this file and put it into the blob service. When I run it locally there is no problem, but when I publish it there is no way to get the file. I always get:
Error encountered: Could not find a part of the path 'C:/temp/myfile.txt'.
Can someone help me? Is there a solution?
First, I would say that to get proper help you need to provide a better description of your problem. What do you mean by "on the web service"? A WCF web role seems to match part of your description, and most web services use a URL like http://whatever.cloudapp.net/whatever.svc (or http://whatever.cloudapp.net/whatever.aspx?whatever if added). Have you done something like that in your application?
You also mention a controller in your code, which makes me think it is an MVC-based web role application.
I am writing the above to help you formulate your question better next time.
Finally, based on what you have provided: you are reading a file from the local file system (C:\temp\myfile.txt) and uploading it to an Azure blob. This works in the compute emulator but will surely fail in Windows Azure, because:
In your web role code you will not have access to the C:\ drive, which is why the file is not there and you get that error. Your best bet is to use Azure Local Storage: write the content there, read the file back from local storage, and then upload it to the Azure blob. Azure Local Storage is designed for writing content from a web role (you will have write permission there).
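A rough sketch (the local resource name "TempUploads" must be declared as a LocalStorage resource in your service definition; the connection string and container name are placeholders):

```csharp
using System.IO;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.Storage;

class LocalStorageUpload
{
    static async Task UploadAsync(Stream incoming, string fileName)
    {
        // Write the incoming content into the role's local storage resource.
        LocalResource local = RoleEnvironment.GetLocalResource("TempUploads");
        string tempPath = Path.Combine(local.RootPath, fileName);
        using (var fs = File.Create(tempPath))
        {
            await incoming.CopyToAsync(fs);
        }

        // Then upload the temp file to blob storage.
        var account = CloudStorageAccount.Parse("<storage-connection-string>");
        var container = account.CreateCloudBlobClient().GetContainerReference("uploads");
        await container.GetBlockBlobReference(fileName).UploadFromFileAsync(tempPath);
    }
}
```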
Finally, I am also concerned about your application design, because Azure VMs are not persistent, so any solution that writes somewhere on the VM is fragile; if possible, you may want to write directly to Azure storage without going through the local disk at all.
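For instance, in an MVC controller you could stream the request body straight into the blob (a sketch; names are placeholders):

```csharp
using System.Threading.Tasks;
using System.Web.Mvc;
using Microsoft.WindowsAzure.Storage;

public class UploadController : Controller
{
    [HttpPost]
    public async Task<ActionResult> Upload(string fileName)
    {
        var account = CloudStorageAccount.Parse("<storage-connection-string>");
        var container = account.CreateCloudBlobClient().GetContainerReference("uploads");
        var blob = container.GetBlockBlobReference(fileName);

        // Nothing is written to the VM's disk; the request body goes straight to storage.
        await blob.UploadFromStreamAsync(Request.InputStream);
        return new HttpStatusCodeResult(200);
    }
}
```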
Did you verify the file exists on the Azure server?