Connect to Azure Storage Blob via FTP

I'm trying to connect to an Azure Storage blob via FTP for comparison purposes. I tried following the guide for connecting to an Azure website via FTP, but I can't seem to find FTP credentials, as there is no publishing profile.

As in being able to upload/download blobs via an FTP client? If that is what you mean then, unfortunately, this isn't possible. This Stack Overflow post might help if that is the scenario you're looking for.
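To sketch the alternative that post describes: blobs are reachable over plain HTTPS rather than FTP, so even a stdlib-only Python script can fetch them. The account, container, and blob names below are placeholders, and a private container would additionally need a SAS token or a key-signed request; this only illustrates the URL scheme.

```python
from urllib.parse import quote
from urllib.request import urlopen

def blob_url(account: str, container: str, blob_name: str) -> str:
    """Build the HTTPS URL for a blob; slashes in the name are kept as path separators."""
    return f"https://{account}.blob.core.windows.net/{container}/{quote(blob_name)}"

def download_blob(account: str, container: str, blob_name: str) -> bytes:
    """Download a publicly readable blob over HTTPS (no FTP involved)."""
    with urlopen(blob_url(account, container, blob_name)) as resp:
        return resp.read()

# Placeholder names, just to show the URL shape:
print(blob_url("mystorageacct", "backups", "daily/2017-01-01.csv"))
```

For comparison-style workflows, downloading both blobs this way and diffing locally replaces what an FTP client would have done.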

Related

How to connect Google Data Studio to AWS Athena?

I need to connect Google Data Studio to AWS Athena. One way to do that is with the JDBC URL connection option. I used the following parameters in the Database Authentication form and I got the error shown below:
Params:
- URL: jdbc:awsathena://athena.us-east-2.amazonaws.com:443;UID=[MY_AWS_ACCESS_KEY];PWD=[MY_AWS_SECRET_KEY];S3OutputLocation=s3://[S3_OUTPUT_BUCKET];
- Username: [MY_AWS_ACCESS_KEY]
- Password: [MY_AWS_SECRET_KEY]
Error:
The server encountered an internal error and was unable to complete your request.
Any solution to connect Google Data Studio to AWS Athena, or even to AWS S3, would solve this problem. I tried this Google Apps Script to connect to S3, but it failed with an authentication error, as shown in this open issue.
An open-source connector for Google Data Studio now exists here.
To use it, you need to follow the instructions detailed here, i.e. copy/paste the code into a Google Apps Script Project and then deploy it to Google Data Studio.
Google Data Studio does not appear to support installing drivers (not that surprising, since it's a hosted, not self-hosted, app).
I think your best bet would be to create a connector -- I actually am surprised no open-source community connector exists yet. I think basically you'd need to wrap the Athena API, following the Google Data Studio connector guide. It's not simple, but it looks doable...
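The JDBC URL in the question packs the driver settings into semicolon-separated properties, which is easy to get subtly wrong by hand. As a sketch, a small helper (the function name is illustrative, not part of any SDK; the property names UID/PWD/S3OutputLocation match the format shown in the question) assembles the string consistently:

```python
def athena_jdbc_url(region: str, access_key: str, secret_key: str, s3_output: str) -> str:
    """Assemble an Athena JDBC URL from its parts, matching the
    jdbc:awsathena://...;UID=...;PWD=...;S3OutputLocation=...; format."""
    props = {
        "UID": access_key,
        "PWD": secret_key,
        "S3OutputLocation": s3_output,
    }
    prop_str = ";".join(f"{k}={v}" for k, v in props.items())
    return f"jdbc:awsathena://athena.{region}.amazonaws.com:443;{prop_str};"

# Placeholder credentials, just to show the resulting shape:
print(athena_jdbc_url("us-east-2", "[MY_AWS_ACCESS_KEY]", "[MY_AWS_SECRET_KEY]", "s3://[S3_OUTPUT_BUCKET]"))
```

Whether the driver accepts the URL still depends on the host application; as noted above, Data Studio does not let you install JDBC drivers at all, so this format mainly helps when testing from a tool that does.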

Copying data from on premises FTP to Azure storage with gateway/VPN and Logic apps/Functions

We have data on an on-premises FTP server that I need to copy on a schedule to an Azure database. The FTP servers are behind NAT and can be reached from our on-premises network. I have tried several things to copy the data:
1) Logic Apps have an FTP transfer option, but they don't support the Data Management Gateway. How could I integrate a VPN between on-prem and Azure with Logic Apps/Functions?
2) Azure Data Factory. This is not optimal, because I want to add FTP sources programmatically, but I tried this option to see if it works. I wasn't able to reach the local FTP servers with the Data Management Gateway, probably because of the NAT.
All suggestions how I should proceed with this would be greatly appreciated. Let me know if you need more info.
The scenario is:
-----------------------------------------
| Tosibox VPN | Azure VPN/Gateway |
-----------------------------------------
| ftp server | On prem | Azure |
-----------------------------------------
| 10.10.10.* | 192.168.75.* | 10.10.0.* |
-----------------------------------------
EDIT:
I still claim that the Logic Apps FTP connector doesn't work with the on-premises Data Management Gateway, so that isn't an answer.
I tried to mount the FTP server on my local Windows machine and share that via the file connector + Data Management Gateway. However, the FTP server mounts as a network drive and doesn't get a local drive letter that the file connector could use out of the box. There are some hacks to obtain a drive letter, and I managed to do that, but it didn't work beyond the root FTP folder.
In any case this feels like an unscalable hack, because I have several FTP sources and might have a lot more in the future. I don't think there is an easy way to get Logic Apps/Functions to work with on-premises FTP at the moment.
I think I am going to try to set up a VPN gateway to on-premises and copy the files using a virtual machine/WebJob.
I understand that FTP is boring and legacy, but it should still be better supported by Microsoft, IMO. One more thing: in case you wonder why FTP and not something else, these are building automation controllers that support only FTP as output. So this is sort of an IoT case...
Why not use the file connector, which is supported by the Data Management Gateway, and read from the underlying FTP folders instead of using the FTP connector?
Or are the FTP root folders not accessible from the machine on which you installed the gateway (via, for example, a file share)?
FTP is not supported by the Data Management Gateway, so if you want to connect to the FTP server, it will need to be publicly available for your application to be able to make the FTP connection.
With Azure Logic Apps, you can use the on-premises data gateway to connect to on-premises systems; it supports FTP, or even just a plain file share. Depending on which database you want to insert the data into, Logic Apps most likely also supports it - Cosmos DB, SQL, etc. - and all are possible via the designer without you having to write any code.
We used a Function and a VPN gateway in the end to fetch the FTP data. To use a Function with VNet integration, one needs to create the Function on an App Service Plan.
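The fetch side of that approach can be sketched with the standard library alone: once the Function (or VM/WebJob) can reach the FTP server over the VPN, it polls the directory and downloads only files it hasn't seen. This is a minimal sketch under those assumptions; the function names and the "seen set" bookkeeping are illustrative, and a real job would persist the seen set between runs.

```python
import ftplib
import io

def new_files(seen: set, listing: list) -> list:
    """Names in the FTP listing that have not been fetched yet, in a stable order."""
    return sorted(n for n in listing if n not in seen)

def fetch_new(host: str, user: str, password: str, seen: set) -> dict:
    """Connect (e.g. over the VPN), list the directory, and download unseen files.
    Returns {filename: bytes} and marks each downloaded file as seen."""
    fetched = {}
    with ftplib.FTP(host) as ftp:
        ftp.login(user, password)
        for name in new_files(seen, ftp.nlst()):
            buf = io.BytesIO()
            ftp.retrbinary(f"RETR {name}", buf.write)
            fetched[name] = buf.getvalue()
            seen.add(name)
    return fetched
```

Each fetched payload can then be inserted into the Azure database by whatever client that database uses; the FTP side stays identical per source, which helps with the "several FTP sources" scaling concern above.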

Is direct transfer from Dropbox to FTP possible?

Is there a way to set up direct file transfer from Dropbox/Google Drive/OneDrive... to an FTP server? Is it "physically" possible, and do any APIs and programming logic even allow that?
No, Dropbox does not offer an FTP interface or functionality like this. Dropbox offers an HTTP-based API:
https://www.dropbox.com/developers/core
You could use that API to programmatically download content from Dropbox, and then upload that to an FTP server, but that would require a client in the middle to manage the download and upload.
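That "client in the middle" can be sketched with the Python standard library. This assumes the current Dropbox API v2 `files/download` endpoint (the core API linked above is the older v1); the access token, Dropbox path, and FTP credentials are placeholders, and error handling is omitted.

```python
import ftplib
import io
import json
from urllib.request import Request, urlopen

def dropbox_api_arg(path: str) -> str:
    """Value for the Dropbox-API-Arg header used by API v2 files/download."""
    return json.dumps({"path": path})

def relay(token: str, dropbox_path: str, ftp_host: str, ftp_user: str, ftp_pass: str) -> None:
    """Download a file from Dropbox over HTTPS, then re-upload it to an FTP server.
    The 'middle' client is this script; there is no direct Dropbox-to-FTP path."""
    req = Request(
        "https://content.dropboxapi.com/2/files/download",
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",
            "Dropbox-API-Arg": dropbox_api_arg(dropbox_path),
        },
    )
    with urlopen(req) as resp:
        data = resp.read()
    with ftplib.FTP(ftp_host) as ftp:
        ftp.login(ftp_user, ftp_pass)
        # Store under the file's base name on the FTP server.
        ftp.storbinary(f"STOR {dropbox_path.rsplit('/', 1)[-1]}", io.BytesIO(data))
```

Because the bytes pass through the machine running the script, bandwidth and uptime of that middle host are the practical limits, which is the core reason a truly "direct" transfer isn't on offer.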

How to bring a server online and link it to parse.com?

I have a server of my own running locally on my Wi-Fi network, on 0.0.0.0:5000.
I have built an app with the parse.com backend, and I want to link this server to Cloud Code, so I can call functions on it.
I am completely lost and don't know where to start to bring my server online with only Parse being able to access it and use its API.
Or am I better off renting a VPS and connecting to that?

Connect salesforce to ftp using http request

I want to connect my Salesforce account to an FTP server so that I do not have to use the local system for storage of any files coming from FTP. I have tried connecting to the FTP server using the CLI, which worked: I can see the files coming from the FTP server.
Can somebody explain how I could connect the FTP server to my Salesforce account using an HTTP request, and also how I could transfer and use the .csv file to update/import the data into the Salesforce custom object?
There are many methods to import data into Salesforce, however, FTP to Salesforce isn't one of them. I would either use the Salesforce Data Loader, if you have access to this data at the console, or use the Salesforce API for whatever platform you program in, to create a program that uses your FTP and can parse your data and deliver it to Salesforce through the API.
You're not going to be able to do this directly. You might look at using an intermediary system like TIBCO or Informatica. SFDC specifically prohibits FTP access.
Using your staging system, have it kick off an Apex bulk API job to do the import. You could do it with a cheap cloud server running your FTP box and a script that kicks off when a file is created in the specified directory, and then when the API job is done, archive or delete the original file and log files.
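The "kick off an API job" step above can be sketched over plain HTTPS, assuming the Bulk API 2.0 ingest endpoint; the instance URL, access token, API version, and custom object name are all placeholders, and obtaining the OAuth token plus uploading the CSV batches are separate steps not shown here.

```python
import json
from urllib.request import Request, urlopen

def ingest_job_body(sobject: str, operation: str = "insert") -> dict:
    """Request body for creating a Bulk API 2.0 ingest job for CSV data."""
    return {"object": sobject, "operation": operation, "contentType": "CSV"}

def create_ingest_job(instance_url: str, access_token: str, sobject: str) -> dict:
    """POST to the Bulk API 2.0 ingest endpoint; returns the created job description.
    API version in the path is an assumption - use whatever your org supports."""
    req = Request(
        f"{instance_url}/services/data/v41.0/jobs/ingest",
        data=json.dumps(ingest_job_body(sobject)).encode(),
        method="POST",
        headers={
            "Authorization": f"Bearer {access_token}",
            "Content-Type": "application/json",
        },
    )
    with urlopen(req) as resp:
        return json.load(resp)
```

A watcher script on the cheap cloud server would call something like `create_ingest_job(...)` when a CSV lands in the FTP directory, upload the file's contents to the job, and archive the original once the job reports success.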
