I am a bit stuck with this Windows Azure Blob storage.
I have a controller that receives a file path (local).
So on the web page I do something like this:
http:...?filepath=C:/temp/myfile.txt
On the web service I want to get this file and put it in the blob service. When I run it locally there is no problem, but once I publish it there is no way to get the file. I always get:
Error encountered: Could not find a part of the path 'C:/temp/myfile.txt'.
Can someone help me? Is there a solution?
First, I would say that to get proper help you need to provide a better description of your problem. What do you mean by "on the web service"? A WCF web role seems to match your partial problem description. However, most web services use a URL like http://whatever.cloudapp.net/whatever.svc, or http://whatever.cloudapp.net/whatever.aspx?whatever if an ASPX page is added. Have you done something like that in your application?
You have also mentioned a controller in your code, which makes me think this is an MVC-based web role application.
I am writing the above to help you formulate your question better next time.
Finally, based on what you have provided, you are reading a file from the local file system (C:\temp\myfile.txt) and uploading it to Azure blob storage. This will work in the compute emulator but will certainly fail in Windows Azure, because:
In your web role code you do not have permission to write to the C:\ drive, which is why the file is not there and you get the error. Your best bet is to use Azure Local Storage: write the content there, then read the file back from local storage and upload it to the blob. Azure Local Storage is designed for writing content from a web role (you have write permission there).
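A minimal sketch of that flow, assuming the classic Microsoft.WindowsAzure.Storage SDK and a local storage resource named "TempStorage" declared in ServiceDefinition.csdef (the resource name, container name, and connection string below are placeholders):

using System.IO;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

// 1. Write the incoming content to Azure Local Storage (the role has write access here).
LocalResource local = RoleEnvironment.GetLocalResource("TempStorage"); // hypothetical resource name
string filePath = Path.Combine(local.RootPath, "myfile.txt");
File.WriteAllText(filePath, "content received by the service");

// 2. Read the local file back and upload it to blob storage.
CloudStorageAccount account = CloudStorageAccount.Parse("UseDevelopmentStorage=true"); // swap in your real connection string
CloudBlobClient blobClient = account.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("uploads"); // hypothetical container
container.CreateIfNotExists();
CloudBlockBlob blob = container.GetBlockBlobReference("myfile.txt");
blob.UploadFromFile(filePath);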
Finally, I am also concerned about your application design: Azure VMs are not persisted, so a solution that writes anywhere on the VM is fragile, and you may want to write directly to Azure storage without staging the file on the VM's disk, if that is possible at all.
Did you verify the file exists on the Azure server?
I am confused and have googled everything, but there's no answer:
I have an Excel file stored somewhere like this on Windows; it's a shared file under 'Network':
\\[serverName]\[folderName]\[folderName]\[folderName]\[folderName]\ZNAC.XLSX
It's a requirement that I can only read/download the file from there.
Everything works fine when I read it locally, whether I use SMB or open the file path directly as an input stream.
But when I deploy to SAP Cloud Foundry, it always ends in a FileNotFoundException, and I have tried many approaches with no change.
I am wondering if the cloud instance is looking for the file internally rather than externally.
I tried SMB as well, and it's not working.
I found there is something called a 'volume service' on Cloud Foundry, but it's not usable in SAP Cloud Foundry.
Is there any way to make my application read an external file from SAP Cloud Foundry?
To read a file from an external share, you must first create a volume service for the corresponding share (NFS or SMB) and start it.
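For example, assuming the SMB volume service is offered in your marketplace (the service instance name and share are placeholders):
cf create-service smb Existing my-smb-volume -c '{"share":"//serverName/folderName"}'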
Then you must bind the service instance to the CF app like this:
cf bind-service YOUR-APP SERVICE-NAME -c '{"uid":"UID","gid":"GID","mount":"OPTIONAL-MOUNT-PATH","readonly":true}'
The detailed guide is here: https://docs.cloudfoundry.org/devguide/services/using-vol-services.html#smb
SAP Cloud Platform / SAP BTP does not have a service that allows you to access SMB drives. One possibility would be to use an SMB/Samba Java client library and configure the firewall / SAP Cloud Connector accordingly. We once implemented something like that, but there are some challenges along the way.
Another, easier possibility would be to create an on-premises service (e.g. REST) that allows you to access the files. This service needs to be made available to SCP as well, for example through SAP API Management.
I have a computer that is used to get database information from a server in the same domain, and this computer is used by employees who don't have the server admin credentials.
When the computer restarts, I'd like it to automatically log in to Windows Server so that it can access the database files. Is it possible to write a script for this that runs on boot?
Thanks in advance
I solved this by adding the credentials to Credential Manager in Windows and disabling the Windows Server dashboard program. Windows then automatically logs in to the server with the stored credentials on boot.
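If you prefer to script it rather than use the Credential Manager UI, the built-in cmdkey tool stores the same kind of credential (the server and account names below are placeholders):
cmdkey /add:serverName /user:DOMAIN\serviceaccount /pass:SecretPassword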
Since your question really isn't specific, I'd like to suggest two ways of accomplishing your goal.
Since you'd like to access database information, why not use database management software (like SSMS if you're using MSSQL) and set up proper permissions for the user/computer that needs to obtain information from that particular server/database?
If you need access to raw files (which doesn't make much sense for MSSQL access), why not set proper permissions on the file or parent folder, giving the user logged in on the client PC the permissions needed to access the files of interest?
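If you go the file-permission route, the grant can also be scripted with the built-in icacls tool, for example to give a group read access to a folder and everything beneath it (the path and group are placeholders):
icacls "D:\DatabaseFiles" /grant "DOMAIN\Employees:(OI)(CI)R"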
I uploaded files, mostly media files, to Azure File Storage, and I can see them in Azure Storage Explorer. But when I try to view a file as an anonymous user, I am not able to. I tried the permissions settings as well, but to no avail.
Any help would be welcomed :)
Azure Files supports Shared Access Signatures (SAS). A SAS is a token, computed from the storage account key, that grants access to a particular URL. Here is an example (the storage account name is obfuscated here):
https://mystorageaccount.file.core.windows.net/sampleshare/2.png?sv=2015-04-05&sr=f&si=sampleread&sig=Zq%2BfflhhbAU4CkCuz9q%2BnUFEM%2Fsg2PbXe3L4MeCC9Bo%3D&sip=0.0.0.0-255.255.255.255
There is sample code showing how to create a SAS with Azure Files at https://azure.microsoft.com/en-us/documentation/articles/storage-dotnet-how-to-use-files/, § "Generate a shared access signature for a file or file share".
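As a rough sketch of what that sample does, using the classic Microsoft.WindowsAzure.Storage SDK (the share and file names match the example URL above; the connection string is a placeholder):

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.File;

CloudStorageAccount account = CloudStorageAccount.Parse("<your-storage-connection-string>");
CloudFileClient fileClient = account.CreateCloudFileClient();
CloudFileShare share = fileClient.GetShareReference("sampleshare");
CloudFile file = share.GetRootDirectoryReference().GetFileReference("2.png");

// Create a read-only SAS token valid for 24 hours.
string sasToken = file.GetSharedAccessSignature(new SharedAccessFilePolicy
{
    Permissions = SharedAccessFilePermissions.Read,
    SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(24)
});

string shareableUrl = file.Uri + sasToken; // hand this URL to the anonymous user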
You can also do it interactively with a number of tools. For instance, CloudXPlorer has this feature.
I want to cache some cropped images and serve them without recomputing them in an Azure Web Site. When I used an Azure VM I just stored them on the D: drive (the temporary drive), but I don't know where to store them now.
I could use Path.GetTempPath, but I am not sure that is the best approach.
Can you suggest where I should store my temporary files when serving from an Azure Web Site?
Azure Websites also come with a temp folder. The path is defined in the environment variable %TEMP%.
You can store your images in the App_Data folder in the root of your application, or you can use the Azure CDN for caching.
You could store the processed content on Azure Blob Storage and serve the content from there.
If what you really want is a cache, you can also look into Azure Redis Cache.
You can use the Path.GetTempPath() and Path.GetTempFileName() functions for the temp file name, but you are limited in terms of space, so if you're doing a 10 KB save for every request and expect 100,000 requests at a time per server, blob storage may be better.
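A minimal sketch of that approach (the folder name, file name, and CropImage helper are hypothetical):

using System.IO;

string cacheDir = Path.Combine(Path.GetTempPath(), "imagecache"); // resolves under %TEMP% on Azure Websites
Directory.CreateDirectory(cacheDir); // no-op if it already exists
string cachedPath = Path.Combine(cacheDir, "photo_200x200.jpg"); // hypothetical cache key

if (!File.Exists(cachedPath))
{
    byte[] croppedImageBytes = CropImage(); // hypothetical helper producing the resized image
    File.WriteAllBytes(cachedPath, croppedImageBytes);
}
byte[] result = File.ReadAllBytes(cachedPath); // serve this to the client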
The following sample demonstrates how to store temp files in Azure, using both the temp path and blob storage.
Doc is here: https://code.msdn.microsoft.com/How-to-store-temp-files-in-d33bbb10
Code is here: https://github.com/Azure-Samples/storage-blob-dotnet-store-temp-files/archive/master.zip
I want to try AppHarbor, but I have an application which stores uploaded files in a certain place on the filesystem. Is that compatible with AppHarbor? Can I store files in the file system and access them later?
(What kind of path can I expect, something like c:\blabla, or what?)
Thank you.
You can store files on the local filesystem, but the application directory is wiped on each new deployment, so relying on it for file storage is not recommended.
Instead we recommend that you use a cloud storage service such as Amazon S3, Google Cloud Storage, or similar. There are .NET libraries for both services.
We recently wrote a blog post about uploading files directly to S3 and GCS from the browser that you might want to read.
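For server-side uploads, a minimal sketch with the AWS SDK for .NET might look like this (the bucket name, key, and file path are placeholders; credentials and region are assumed to come from your app config, and on .NET Core you would use PutObjectAsync instead of the synchronous call):

using Amazon.S3;
using Amazon.S3.Model;

var s3 = new AmazonS3Client(); // credentials and region resolved from configuration
var put = new PutObjectRequest
{
    BucketName = "my-bucket",        // hypothetical bucket
    Key = "uploads/myfile.txt",      // object key inside the bucket
    FilePath = @"C:\temp\myfile.txt" // local path of the file to persist
};
s3.PutObject(put);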
If you are using a background worker, you need to enable 'File System Write Access' in the settings of your application.
Then you are permitted to write to Path.GetTempPath().
Sourced from this support question: http://support.appharbor.com/discussions/problems/5868-create-directory-in-background-worker
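As a quick sanity check once write access is enabled, something like this should succeed from the worker (the file name is a placeholder):

using System.IO;

string scratch = Path.Combine(Path.GetTempPath(), "worker-scratch.txt"); // hypothetical file name
File.WriteAllText(scratch, "written from the background worker");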