Access FTP with wget through a proxy

I am currently stuck on a problem: I think that when I use a proxy, wget tries to connect to the server via HTTP instead of FTP. Since I cannot post the full settings, I will post an example of the behaviour I see:
Accessing the FTP server without a proxy using the command
wget -r --user=username --password=mypassword ftp://ftp.myadress.com/
works as expected.
Accessing the FTP server with
wget -r --ftp-user=username --ftp-password=mypassword ftp://ftp.myadress.com/
works too.
Accessing the server through the proxy with
wget -r --ftp-user=username --ftp-password=mypassword ftp://ftp.myadress.com/
leads to a
401 Unauthorized
error.
Using
wget -r --user=username --password=mypassword ftp://ftp.myadress.com/
leads to an index.html being created. Unfortunately, the FTP server does not have any index.html files in its folders. Accessing a file by its full path through the proxy via
wget --user=username --password=mypassword ftp://ftp.myadress.com/test/test.txt
downloads the file as expected.
Accessing a different FTP server through the same proxy, one which has an index.html in every folder, with the command:
wget -r --user=username2 --password=mypassword2 ftp://ftp.myadress2.com/
works fine.
So how can I force wget to use the FTP protocol through the proxy?
Thanks in advance

Have you tried putting the proxy settings into a .wgetrc file instead?
It should contain something like this:
use_proxy=yes
http_proxy=127.0.0.1:8080
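If the goal is FTP through the proxy, as in the question above, it is probably worth setting ftp_proxy as well. A minimal sketch, assuming the same proxy at 127.0.0.1:8080 also handles the FTP requests:
use_proxy=yes
http_proxy=http://127.0.0.1:8080/
ftp_proxy=http://127.0.0.1:8080/
Note that when an ftp:// URL is fetched through an HTTP proxy, wget speaks HTTP to the proxy and the proxy speaks FTP to the server, which is why directory listings can come back as an index.html-style page.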

Related

Use proxy with wget on the fly

I know that if I want to use a proxy with wget, I have to edit the /etc/wgetrc or ~/.wgetrc file and set the proxy address and port there.
What I want to know is whether there is any option to use wget with a proxy WITHOUT editing any config file.
That's all. Thanks.
You can pass the proxy settings via the environment, e.g.:
https_proxy=http://proxy.example.com:3128 wget http://www.example.com
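Another config-free route is wget's -e switch, which executes a wgetrc-style command for just that invocation. A small sketch with a placeholder proxy address:
wget -e use_proxy=yes -e http_proxy=http://proxy.example.com:3128 http://www.example.com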

Programmatic ngrok tunnel url

I'm trying to get ngrok's dynamically generated tunnel URL programmatically, using bash to set globals and environment variables.
Below is what I have so far.
Run ngrok http {server url}
Inside your host root directory run:
curl http://localhost:4040/api/tunnels > ~/ngrok_tunnels.json;
Install jq
brew install [jq](https://stedolan.github.io/jq/) (lets you access JSON through bash)
Afterwards you just need to access this JSON following jq's docs.
Inside the project root that calls the dev URL (index [0] is the http tunnel, [1] is the https tunnel):
echo "NGROK_TUNNEL=$(jq .tunnels[1].public_url ~/ngrok_tunnels.json)" >> .env
Set all of your dev URLs to process.env.NGROK_TUNNEL.
So this works, but is it the "best way" to do this?
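For reference, the steps above can be collapsed into one short script. A minimal sketch, assuming ngrok's local inspection API on port 4040 and jq installed (port 3000 is a hypothetical dev server; jq's -r flag strips the surrounding quotes, which the one-liner above leaves in):
# Start the tunnel in the background.
ngrok http 3000 > /dev/null &
sleep 2
# Ask the local inspection API for the active tunnels.
curl -s http://localhost:4040/api/tunnels > ~/ngrok_tunnels.json
# Keep the https tunnel's public URL and append it to the project's .env.
NGROK_TUNNEL=$(jq -r '.tunnels[] | select(.proto == "https") | .public_url' ~/ngrok_tunnels.json)
echo "NGROK_TUNNEL=$NGROK_TUNNEL" >> .env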
For people who want to get a URL through ngrok using Python, there is the pyngrok library:
from pyngrok import ngrok
# Open an HTTP tunnel on the default port 80
# <NgrokTunnel: "http://<public_sub>.ngrok.io" -> "http://localhost:80">
http_tunnel = ngrok.connect()
# Open an SSH (TCP) tunnel
# <NgrokTunnel: "tcp://0.tcp.ngrok.io:12345" -> "localhost:22">
ssh_tunnel = ngrok.connect(22, "tcp")
It is also possible to do some things directly via the ngrok API. I didn't find an option to create a tunnel there, but once a tunnel exists you can restart or update it:
https://ngrok.com/docs/api#api-tunnel-sessions-list
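As a rough sketch of calling that API from the shell (the API key is a placeholder; per ngrok's docs the REST API expects Bearer auth plus an Ngrok-Version header):
curl -s \
  -H "Authorization: Bearer $NGROK_API_KEY" \
  -H "Ngrok-Version: 2" \
  https://api.ngrok.com/tunnel_sessions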
The short answer is yes.
You can upgrade to a paid plan and use the --subdomain argument to get the same ngrok URL every time. The next price level up includes white labeling, where you can use your own custom domain as well.

URL-forwarding to download a file: wget only downloads the index.html

From time to time I have to download a specific file from a website with wget. The URL is very long, so I created a free .tk domain that forwards to the file. If I use my new .tk URL in my browser, it downloads the file as I want, but on my Ubuntu VPS wget only downloads an index.html file. I have two forwarding options on Dot.TK:
Frame (Cloaking)
Redirect (HTTP 301 Forwarding)
Which option should I use and is there a way to get the file instead of the index.html?
If you use the 301 redirect, wget should be able to follow it and download the file. You can also use curl -LO <URL> with the 301.
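wget follows HTTP redirects by default, so with the 301 option something like the following should work; the .tk URL is a placeholder, and --content-disposition just asks wget to honour the server's Content-Disposition header when naming the downloaded file:
wget --content-disposition http://example-download.tk/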

BASH: Get HTTP link of file

Is there a way to get an HTTP link to a file from a script?
For example:
I have a file at:
/home/User/video.mp4
Next, I would like to get the HTTP link to that file. For example:
http://192.168.1.5/video.mp4
I currently have nginx installed onto the remote server with a specific directory as the root of the web server.
On the server I have, you can get the server link using this:
echo "http://$(whoami).$(hostname -f)/path/to/file"
I could get the file link using the command above, but it would be a problem for files with spaces in their names.
I'm doing this so that I can send the link to Internet Download Manager under Windows, so using wget to download the files will not work for me.
I'm currently using Cygwin to create the script.
To solve the spaces problem, you can replace them with %20:
path="http://$(whoami).$(hostname -f)/path/to/file"
path=${path// /%20}
echo "$path"
Regards.
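If the filenames can contain more than just spaces, a more general percent-encoder helps. A minimal bash sketch (the urlencode helper below is hypothetical, not part of the original answer; it leaves / alone so path separators survive):
urlencode() {
  local s="$1" out="" c i
  for (( i = 0; i < ${#s}; i++ )); do
    c=${s:i:1}
    case "$c" in
      [a-zA-Z0-9./_~-]) out+="$c" ;;                # safe characters pass through
      *) printf -v c '%%%02X' "'$c"; out+="$c" ;;   # everything else becomes %XX
    esac
  done
  printf '%s\n' "$out"
}
echo "http://$(whoami).$(hostname -f)/$(urlencode "path/to/my video.mp4")"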

Can wget download a file from an FTP site when the URL path and FTP path are different?

I have a website, say www.mysite.com.
When I log in to my site with an FTP client, my root folder is /home/domains.
My site files are located at /home/domains/mysite.com/public_html/.
There are many other web sites under /home/domains, such as /home/domains/myothersite.
But I am unable to download my file via wget.
If I use the command below, it downloads ALL the sites on my server:
wget.exe --mirror --ftp-user=XXX --ftp-password=XXX ftp://my.ip.add.ress
I also tried --directory-prefix=/home/domains/mysite.com/public_html/ but it didn't work.
Can I download a file from an FTP site when the URL path and the FTP path are different?
You can use wget like this:
wget -m ftp://username:password@www.mydomain.tld/public_html
:)
This is my own answer :)
wget --mirror -r --no-parent ftp://username:password@www.mydomain.tld/public_html/
It worked for me.
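If you also want to stop wget from recreating the www.mydomain.tld and public_html directories locally, -nH and --cut-dirs are the usual companions. A sketch with placeholder credentials:
wget -m -nH --cut-dirs=1 --no-parent ftp://username:password@www.mydomain.tld/public_html/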
