I am confused and have googled everywhere, but there's no answer:
I have an Excel file stored on Windows at a path like the one below; it's a shared file under 'Network':
\\[serverName]\[folderName]\[folderName]\[folderName]\[folderName]\ZNAC.XLSX
Reading/downloading the file from this location is the only access I'm allowed.
Everything works fine when I read it locally: both via SMB and by opening the file path directly as an InputStream.
But when I deploy to SAP Cloud Foundry, it always ends in a FileNotFoundException, and nothing I've tried has made a difference.
I am wondering whether the cloud instance is looking for the file internally rather than externally.
I tried SMB from the cloud as well, but it doesn't work.
I found there is something called a 'volume service' on Cloud Foundry, but it isn't usable on SAP Cloud Foundry.
Any help getting my application to read an external file from SAP Cloud Foundry?
To read a file from an external share, you must first create a volume service instance for the corresponding share (NFS or SMB).
Then you must bind the service instance to your CF app like this:
cf bind-service YOUR-APP SERVICE-NAME -c '{"uid":"UID","gid":"GID","mount":"OPTIONAL-MOUNT-PATH","readonly":true}'
The detailed guide is here:
https://docs.cloudfoundry.org/devguide/services/using-vol-services.html#smb
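For example, the full sequence for an SMB share might look like this (a sketch based on the guide above; the service offering and plan names depend on what your platform operator has enabled, and the share, credentials, and mount path are placeholders):

cf create-service smb Existing my-smb-volume -c '{"share":"//serverName/folderName"}'
cf bind-service YOUR-APP my-smb-volume -c '{"username":"user","password":"secret","mount":"/var/mnt/smb","readonly":true}'
cf restage YOUR-APP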
SAP Cloud Platform / SAP BTP does not have a service that allows you to access SMB drives. One possibility would be to use an SMB/Samba Java client library and configure the firewall / SAP Cloud Connector accordingly. We once implemented something like that, but there are some challenges along the way.
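As a sketch of that first option, here is roughly what reading the file with the open-source SMBJ library (com.hierynomus:smbj) could look like; the host, share, path, and credentials are placeholders, and port 445 to the file server must still be reachable from the app (e.g. via the Cloud Connector):

import com.hierynomus.msdtyp.AccessMask;
import com.hierynomus.mssmb2.SMB2CreateDisposition;
import com.hierynomus.mssmb2.SMB2ShareAccess;
import com.hierynomus.smbj.SMBClient;
import com.hierynomus.smbj.auth.AuthenticationContext;
import com.hierynomus.smbj.connection.Connection;
import com.hierynomus.smbj.session.Session;
import com.hierynomus.smbj.share.DiskShare;
import com.hierynomus.smbj.share.File;

import java.io.InputStream;
import java.util.EnumSet;

public class SmbRead {
    public static void main(String[] args) throws Exception {
        SMBClient client = new SMBClient();
        // Placeholder host and credentials -- replace with your own.
        try (Connection connection = client.connect("serverName")) {
            Session session = connection.authenticate(
                    new AuthenticationContext("user", "password".toCharArray(), "DOMAIN"));
            // The first segment of the UNC path is the share name...
            try (DiskShare share = (DiskShare) session.connectShare("folderName")) {
                // ...and the remaining segments are the path inside the share.
                try (File file = share.openFile("folderName/folderName/folderName/ZNAC.XLSX",
                        EnumSet.of(AccessMask.GENERIC_READ), null,
                        SMB2ShareAccess.ALL, SMB2CreateDisposition.FILE_OPEN, null);
                     InputStream in = file.getInputStream()) {
                    System.out.println("first byte: " + in.read()); // read-only access is enough
                }
            }
        }
    }
}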
Another, easier possibility would be to create an on-premise service (e.g. REST) that allows you to access the files; a sketch follows below. This service needs to be made available to SCP as well, for example through SAP API Management.
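A minimal sketch of that on-premise service, using Spring Boot purely for illustration (the endpoint name and path are made up; the on-premise host can read the share directly):

import org.springframework.core.io.FileSystemResource;
import org.springframework.core.io.Resource;
import org.springframework.http.MediaType;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class FileController {

    // Streams the workbook to the caller; SCP calls this via API Management.
    @GetMapping("/files/znac")
    public ResponseEntity<Resource> znac() {
        Resource file = new FileSystemResource("\\\\serverName\\folderName\\ZNAC.XLSX");
        return ResponseEntity.ok()
                .contentType(MediaType.APPLICATION_OCTET_STREAM)
                .body(file);
    }
}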
I have just started developing a Golang app and have deployed it on Google App Engine. But when I try to connect my local server to the Cloud SQL instance through the proxy, I can connect only over TCP.
However, when connecting to the same Cloud SQL instance from App Engine, I can connect only through a UNIX socket.
To cope with this, I have made changes to my local environment handler file so that it adapts to both the local and GCloud configs, but I'm not sure how to skip updating just this file when deploying to GCloud. To be clear, I don't want App Engine to delete the file; I just want the CLI to avoid uploading the new version of the handler file.
I use this command for deploying: gcloud app deploy
Currently I deploy directly to App Engine instead of pushing through VCS. Also, if there is a way to detect whether the app is running on App Engine, that would be really great.
TIA
Got it. In case anyone gets stuck in a similar situation: you can make use of the environment variables that GCloud App Engine sets. Although there is documentation listing these environment variables, I would still recommend verifying them in the Cloud Console.
Documentation link for the Go 1.12+ runtime environment:
https://cloud.google.com/appengine/docs/standard/go/runtime
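For example, GAE_ENV is set to "standard" inside the App Engine standard environment and is absent locally, so a check like the following works (the variable name is from the runtime docs above; Java is used here just for illustration, as the same variable is exposed to the Go runtime):

public class Env {
    public static void main(String[] args) {
        // GAE_ENV == "standard" on App Engine standard; unset when running locally.
        boolean onAppEngine = "standard".equals(System.getenv("GAE_ENV"));
        // Choose the Cloud SQL connection style accordingly:
        // UNIX socket on App Engine, TCP via the proxy locally.
        System.out.println(onAppEngine ? "use unix socket" : "use tcp proxy");
    }
}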
I am looking for a tool for dynamically changing Linux conf/YAML files via an API, such as Consul.
If you have experience with Consul, please give feedback about creating templates for conf/YAML files, and whether this can be done via Consul without using a service.
Consul Template or Gomplate can be used to template configuration files based on changes in a backend data source.
https://learn.hashicorp.com/tutorials/consul/consul-template provides a basic example of a template which regenerates file contents when keys are added to Consul's key-value store.
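As a small example of the idea (the key name and file paths below are made up; the template syntax is from the tutorial above):

# config.yml.tpl -- re-rendered whenever the key changes in Consul's KV store
max_connections: {{ key "app/config/max_connections" }}

# Render once and exit; drop -once to keep watching for changes
consul-template -template "config.yml.tpl:config.yml" -once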
I would like to set up Camunda BPM in a Tomcat 7 running on Jelastic. I followed the instructions.
The problem is that Jelastic does not allow adding the file bpm-platform.xml to the catalina-home/conf directory. So when I start Tomcat I get:
...
Caused by: org.camunda.bpm.engine.ProcessEngineException: /opt/tomcat/conf/bpm-platform.xml does not exist. This file is necessary for deploying the camunda BPM platform
Can someone please give me a hint as to where I can place bpm-platform.xml so that the BPM engine starts?
The directory you're looking for is labelled 'server' in the Jelastic dashboard, but sadly you cannot upload new files to this directory via the dashboard (you can only edit the existing ones).
However, you can write to this directory via FTP (http://docs.jelastic.com/ftp-ftps-support), so you should be able to add the file that way.
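For example, once you have the FTP credentials you could push the file with curl (the host, credentials, and target directory below are placeholders):

curl -T bpm-platform.xml ftp://your-node.your-jelastic-host.com/opt/tomcat/conf/ --user ftpuser:ftppassword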
If you are just using a trial account at the moment, you may need to ask your hosting provider to add the file manually from their side (trial accounts do not have a public IP, so FTP cannot be used).
I want to try AppHarbor, but I have an application that stores uploaded files in a certain place on the filesystem. Is this compatible with AppHarbor? Can I store files in the file system and access them later?
(What kind of path can I expect, something like c:\blabla, or what?)
Thank you.
You can store files on the local filesystem, but the application directory is wiped on each new deployment so it's not recommended to rely on for file storage.
Instead we recommend that you use a cloud storage service such as Amazon S3, Google Cloud Storage or similar. There are .NET libraries for both services.
We recently wrote a blog post about uploading files directly to S3 and GCS from the browser that you might want to read.
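The answer mentions .NET libraries; purely to illustrate the flow, here is the same idea with the AWS SDK for Java (v2). Bucket and key names are placeholders:

import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;

import java.nio.file.Paths;

public class S3Upload {
    public static void main(String[] args) {
        try (S3Client s3 = S3Client.builder().region(Region.US_EAST_1).build()) {
            // Persist the uploaded file in S3 instead of the app's (ephemeral) filesystem.
            s3.putObject(PutObjectRequest.builder()
                            .bucket("my-bucket")
                            .key("uploads/document.pdf")
                            .build(),
                    RequestBody.fromFile(Paths.get("document.pdf")));
        }
    }
}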
If you are using a background worker, you need to 'Enable File System Write Access' in the settings of your application.
Then you are permitted to write to: Path.GetTempPath()
Sourced from this support question: http://support.appharbor.com/discussions/problems/5868-create-directory-in-background-worker
I am a bit stuck with Windows Azure Blob storage.
I have a controller that receives a (local) file path.
So on the web page I do something like this:
http:...?filepath=C:/temp/myfile.txt
In the web service I want to get this file and put it in the blob service. When I run it locally there is no problem, but when I publish it there is no way to get the file. I always get:
Error encountered: Could not find a part of the path 'C:/temp/myfile.txt'.
Can someone help me? Is there a solution?
First, I would say that to get proper help you need to provide a better description of your problem. What do you mean by "on the web service"? A WCF web role would seem to match part of your problem description; however, most web services are reached at http://whatever.cloudapp.net/whatever.svc (or http://whatever.cloudapp.net/whatever.aspx?whatever if added). Have you done something like that in your application?
You also mention a controller in your code, which makes me think it is an MVC-based web role application.
I am writing the above to help you formulate your question better next time.
Finally, based on what you have provided, you are reading a file from the local file system (C:\temp\myfile.txt) and uploading it to an Azure blob. This works in the compute emulator but is sure to fail in Windows Azure, because:
In your web role code you will not have access to the C:\ drive; that is why the file is not there and you get the error. Your best bet is to use Azure Local Storage: write the content there, read the file back from Local Storage, and then upload it to the Azure blob. Azure Local Storage is designed for exactly this, writing content from a web role (you will have write permission).
Finally, I am also concerned about your application design: Azure VMs are not persistent, so a solution that writes anywhere on the VM is not good, and you may want to write directly to Azure storage without staging the file locally, if that is possible.
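To illustrate that last point, uploading a blob is only a few lines (sketched with the current Azure Storage SDK for Java; the original question predates it and would use the .NET library, and the container/blob names and local path are placeholders):

import com.azure.storage.blob.BlobClient;
import com.azure.storage.blob.BlobClientBuilder;

public class BlobUpload {
    public static void main(String[] args) {
        BlobClient blob = new BlobClientBuilder()
                .connectionString(System.getenv("AZURE_STORAGE_CONNECTION_STRING"))
                .containerName("uploads")
                .blobName("myfile.txt")
                .buildClient();
        // Upload from the role's local storage path rather than a hard-coded C:\ path.
        blob.uploadFromFile("/local/storage/path/myfile.txt", true);
    }
}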
Did you verify the file exists on the Azure server?