I have an app running on Heroku, and I need to download a file from an FTP server, but I need to do it from a fixed IP. I'm using www.quotaguard.com to get static IPs.
But I can't get it working.
Does anyone have a Ruby example of downloading a file from an FTP server via a proxy server (QuotaGuard)?
Both the proxy server and the FTP server require a username and password.
I've tried everything in Ruby, and also shelling out to wget to start the download, but wget apparently doesn't go through the proxy. I've also checked many posts, but no success so far.
I'm using Ruby 2.4.5.
Thanks for any comments.
Thank you QuotaGuard. Socksify is old and unmaintained; we gave it a try but didn't want to spend much time on it.
We actually managed to get this working with curl. You can call it from within Heroku as well.
Here's the command in case anyone wonders:
curl -x socks5h://socksproxyurl 'ftp://theftp/some.pdf' --user "ftp_user:ftp_pass" -o some.pdf
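On Heroku the QuotaGuard add-on exposes the proxy URL in an environment variable, so you can build curl's -x argument from it instead of hard-coding credentials. A minimal sketch, assuming the variable holds user:pass@host:port (the value below is made up):

```shell
# Example value only; on Heroku the QuotaGuard add-on sets this for you.
QUOTAGUARDSTATIC_URL="http://qguser:qgpass@static.example.com:1080"

# Swap the scheme for socks5h:// so curl also resolves DNS through the proxy.
proxy="socks5h://${QUOTAGUARDSTATIC_URL#*://}"
echo "$proxy"   # socks5h://qguser:qgpass@static.example.com:1080

# Then the same call as above, without proxy credentials on the command line:
# curl -x "$proxy" 'ftp://theftp/some.pdf' --user "ftp_user:ftp_pass" -o some.pdf
```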
We've seen a few customers do this before with the socksify gem.
require 'socksify'
require 'net/ftp'
require 'uri'

proxy = URI(ENV['QUOTAGUARDSTATIC_URL'])
TCPSocket::socks_username = proxy.user
TCPSocket::socks_password = proxy.password

Socksify::proxy(proxy.hostname, 1080) do |soc|
  # do your FTP stuff in here, e.g. (hypothetical host, credentials and file):
  Net::FTP.open('theftp', 'ftp_user', 'ftp_pass') do |ftp|
    ftp.getbinaryfile('some.pdf', 'some.pdf')
  end
end
If that doesn't do it, post the errors you're seeing and we'll help get this running for you.
I am banging my head trying to find out where the proxy is being read from.
Here is the background: I am trying to run brew install wget in the Mac terminal.
However, I immediately get fatal: unable to access 'https://github.com/Homebrew/brew/': Failed to connect to xxxxproxy.proxy.proxy.com port 8080: Connection refused
I tried removing HTTP_PROXY, HTTPS_PROXY, ALL_PROXY, http_proxy, https_proxy, and all_proxy. When I run echo $xxxxx_proxy for each of these variables, I get blank values, indicating that the proxies are unset.
Where is this proxy being read from? Any help/guide would be really appreciated. Thank you!
Edit: I might have posted a very silly question, or maybe the question is a duplicate (which I am not able to find here). If so, I apologize in advance :)
It seems that the .gitconfig file in my home directory (/Users/<username>/.gitconfig) had proxy values set in it. As soon as I commented those out, everything started working smoothly.
#[http]
# proxy = xxx.xxx.xxx.com:80
#[https]
# proxy = xx.xxx.xxx.com:80
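For what it's worth, git can inspect and clear those same settings itself, so you don't have to edit ~/.gitconfig by hand; these are standard git config invocations:

```shell
# Show whatever proxy git has configured globally
# (prints nothing and exits non-zero when unset, hence the || true):
git config --global --get http.proxy || true
git config --global --get https.proxy || true

# Remove the settings -- the same effect as commenting the lines out:
git config --global --unset http.proxy || true
git config --global --unset https.proxy || true
```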
To use brew behind a proxy you should set the ALL_PROXY variable:
export ALL_PROXY=proxyIP:port
If you want to remove the proxy definition and use a direct connection, execute in the terminal:
unset ALL_PROXY
and then run the brew command again in the same terminal.
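The full set/unset cycle in one terminal looks like this (proxy host and port are placeholders):

```shell
# Set the proxy for this shell session only (placeholder host:port):
export ALL_PROXY=proxy.example.com:8080
echo "ALL_PROXY is now: $ALL_PROXY"
# ...run brew install wget here...

# Back to a direct connection:
unset ALL_PROXY
[ -z "${ALL_PROXY:-}" ] && echo "ALL_PROXY is unset again"
```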
I've installed the Snowsql CLI tool (v1.2.16) and tried connecting to Snowflake using a command similar to snowsql -a <account details> -u datamonk3y@domain.com --authenticator externalbrowser.
For myself and a few other colleagues, a pop-up window appears which lets us authenticate. Unfortunately this isn't the case for some of my other colleagues...
I've not found anything obvious, but the authentication browser window simply isn't popping up for some users (around half of us), so the connection aborts after the timeout.
We're all using AWS workspaces with the same version of windows, same version of chrome and the same version of Snowsql. There's nothing I can see in the chrome settings that could be causing this. I'm also able to change the default browser to Firefox and I still authenticate fine.
Logging into the UI works for everyone too...
The logs don't really give much away; the failed attempts get a Failed to check OCSP response cache file message, but I think this is because the authentication is never initiated with the server.
When I check my local machine (C:/Users/<datamonk3y>/AppData/Local/Snowflake/Caches/) I see an ocsp_response_cache.json file, but it isn't there for my colleagues who aren't able to log in.
As @SrinathMenon has mentioned in the comments below, adding -o insecure_mode=True to the login command will bypass this issue, but does anyone have any thoughts on what could be causing it?
Thanks
Try turning off OCSP:
snowsql -a ACCOUNT -u USER -o insecure_mode=True
The only root cause I can see for this issue is the request failing to reach the OCSP URL.
Adding the debug flag in snowsql will give more details. Use this to collect the debug logs:
snowsql -a <account details> -u datamonk3y@domain.com --authenticator externalbrowser -o log_level=debug -o log_file=<path>
In my case, what worked was including the region in account name. So instead of -a abc1234, you would do something like -a abc1234.us-east-1.
https://docs.snowflake.com/en/user-guide/admin-account-identifier.html#format-2-legacy-account-locator-in-a-region explains this a little, but basically you use the first part of the web console URL, e.g. https://abc1234.us-east-1.snowflakecomputing.com/ (this only works with the classic console).
I have installed the Knox server and done all the steps mentioned on the Hortonworks site.
When I run the command below on the sandbox, it gives me the proper output.
curl http://sandbox:50070/webhdfs/v1?op=GETHOMEDIRECTORY
Now I have another VM running Fedora. I am treating it as an external client and trying external access, but getting no output:
curl -k https://<sandbox-ip>:8443/gateway/sandbox/webhdfs/v1?op=GETHOMEDIRECTORY
Can someone point out what's wrong with my settings?
Not sure about your topology, but if you are using the default one (sandbox) you probably need to add basic auth, e.g.
curl -k -u guest:guest-password -X GET https://<sandbox-ip>:8443/gateway/sandbox/webhdfs/v1?op=GETHOMEDIRECTORY
Also check the logs at
<knox_install>/logs/gateway.log
They should tell you more about what went wrong.
Good luck!
I have a file on an FTP server. I'm trying to download it using PhantomJS. I've tried the following code:
var page = require('webpage').create();
page.open('ftp://USERNAME:PASSWORD@www.mywebsite.com/exempleFIle.xlsx');
phantom.exit();
It runs without throwing any errors, but the file is not downloaded. Is it possible to download it with PhantomJS?
My main goal is to synchronize the files on the FTP server with my computer, so I can put them in my Google Drive and use them in my reports from there. I use PhantomJS to access some web pages and pull some data for the same purpose. Since I'm already using PhantomJS, I thought I could do the same for the FTP server, but if there is a simpler solution that uses other methods, I'm open to trying it.
Thank You
PhantomJS is a headless web-browser, it's not an FTP client, so it won't be able to help you.
My main goal is synchronizing the files in the FTP with my computer
I'd suggest using lftp.
lftp -u user,password -e 'mirror /remote/server/files/ /local/computer/files/' ftp.myserver.com
This will get files from the remote server to the local computer.
How do I run a Perl script from localhost?
I have installed Apache 2.2 and ActivePerl 5.16.3.
I am able to run the Perl script from the command prompt.
But since I am dealing with a web application, I want to run it from localhost.
However, I am getting the following error in the browser:
Internal Server Error
The server encountered an internal error or misconfiguration and was unable to complete your request.
Please help me out!
Your problem is probably related to the configuration of Apache (it may be Apache that needs configuring for .cgi scripts). If this is the case then you can find good info on this here:
http://www.perlmonks.org/?node_id=44536
http://www.cgi101.com/book/connect/winxp.html
http://www.editrocket.com/articles/perl_apache_windows.html
There are usually a number of things that you need to do to get it working. Following a good HOWTO to make sure that everything is installed and configured correctly will usually get you executing scripts on your local Windows machine.
You can run a CGI script from the web browser. CGI means the script must send an HTTP header before sending any other output to the server (which then sends it back to the browser).
http://perldoc.perl.org/CGI.html#NAME
http://www.cs.tut.fi/~jkorpela/perl/cgi.html
Like this:
use strict;
use warnings;
use CGI;                              # load CGI routines

my $q = CGI->new;
print $q->header;                     # create the HTTP header
print $q->start_html('hello world');  # start the HTML
### your script logic goes here
print $q->end_html;                   # end the HTML
Of course, CGI is outdated; for new development you should use a more recent framework like Dancer, Mojolicious, ...
If someone is looking for an answer in 2017, then follow this on a Windows 10 machine.
Download Strawberry Perl from the Perl site.
Install it in the default directory, C:\Strawberry.
Download Apache from https://httpd.apache.org/download.cgi#apache24
Once you successfully get the Apache server running:
Place your Perl scripts in your Apache24/cgi-bin/ folder.
The first line of your Perl script should point to the path where Perl is installed; in my case it is #!C:/Strawberry/perl/bin/perl.exe
You should change the path according to your installation folder.
Filename - first.pl
#!C:/Strawberry/perl/bin/perl.exe
print "Content-type: text/html\n\n";
print "Hello, World.";
Now you can run your script from the browser: http://localhost/cgi-bin/first.pl