`wget -i files.txt` gives Scheme Missing error on VPS - bash

I have gathered a few .min.map file URLs and stored them in mapfiles.txt,
as shown below:
https://example.com/app.bundle.min.map
https://example.com/app.bundle.min.map
https://example.com/app.bundle.min.map
I read the wget man page and found that to download files from a list we need to use
wget -i filename.txt
So I tried the same thing and observed some weird behavior.
On my local system,
wget -i mapfiles.txt
This command started downloading the .map files.
On the VPS, the same command,
wget -i mapfiles.txt
gave this error:
https://example/app.bundle.min.map: Scheme missing.
No URLs found in mapfiles.txt.
Could you guys help me to figure out where I am going wrong?
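One thing worth ruling out (this is an assumption, not something established in the question) is that the copy of mapfiles.txt on the VPS contains hidden carriage returns or a byte-order mark; that is a common way to get a "Scheme missing" error from wget even though the file looks fine. A quick check:
# show non-printing characters; CRLF line endings appear as ^M at the end of each line
cat -A mapfiles.txt
# if carriage returns show up, strip them and retry
sed -i 's/\r$//' mapfiles.txt
wget -i mapfiles.txt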

Related

SCP AWS EC2 Fails

I'm aware that there are other questions concerning the command, but I believe my situation to be unique.
Running the command:
scp -i ~/Documents/awskeys.pem -r website ec2-user@ec2-XX-XX-XXX-XXX.compute-1.amazonaws.com:home/ec2-user
gives the following output:
Loaded plugins: extras_suggestions, langpacks, priorities, update-motd
and nothing else. No copying was done either. I have tried permutations of flags and many other possible solutions but come up empty-handed, and this was the recommended solution in many other threads. Even Google has only one instance, unrelated to SCP, of the message I received.
Any thoughts on what the problem could be?
There is an issue in your path; try:
scp -i ~/Documents/awskeys.pem -r website ec2-user@ec2-XX-XX-XXX-XXX.compute-1.amazonaws.com:~/
Then please check the security group and make sure that port 22 is allowed. If you are still stuck, try rsync as an alternative:
rsync -e "ssh -i ~/Documents/awskeys.pem" -r website ec2-user@ec2-XX-XX-XXX-XXX.compute-1.amazonaws.com:~/
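If the path fix alone doesn't help, a quick sanity check is to confirm that plain SSH works with the same key, then re-run the copy with verbose output to see where it stops (a sketch using the same placeholder host):
# confirm the key and port 22 work at all
ssh -i ~/Documents/awskeys.pem ec2-user@ec2-XX-XX-XXX-XXX.compute-1.amazonaws.com 'echo connected'
# then repeat the copy verbosely
scp -v -i ~/Documents/awskeys.pem -r website ec2-user@ec2-XX-XX-XXX-XXX.compute-1.amazonaws.com:~/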

Mirror multiple page site with lftp

I need to mirror data hosted on a web site on a regular basis, and I am trying to use lftp (version 4.0.9) as it usually does a great job for this task. However, the site I am downloading from has multiple pages (I intend to loop over the most recent n pages in a bash script which will run several times a day), and I can't work out how to get lftp to accept the page parameter. I've had no luck searching for a solution online, and what I have tried has failed so far.
This works perfectly:
lftp -c 'mirror -v -i "S1A" -P 4 https://qc.sentinel1.eo.esa.int/aux_resorb/'
This does not:
lftp -c 'mirror -v -i "S1A" -P 4 https://qc.sentinel1.eo.esa.int/aux_resorb/?page=2'
It gives error:
mirror: Access failed: 404 NOT FOUND (/aux_resorb/?page=2)
I also tried passing the new URL in as a variable but that didn't work either. I'd be grateful for suggestions to solve this issue.
Before it is suggested: I know wget is an option and the pagination works (I tested it), but I don't want to use it because it is less appropriate here; it wastes a lot of time fetching all the "index.html?param=value" files and then removing them, and given the number of pages this isn't feasible.
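For reference, the loop I have in mind looks roughly like this (the page range is just a placeholder); the ?page=N part is exactly what I can't get lftp to accept:
for page in 1 2 3
do
    lftp -c "mirror -v -i \"S1A\" -P 4 https://qc.sentinel1.eo.esa.int/aux_resorb/?page=$page"
done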
The problem with lftp's mirror command is that it adds a slash to the given URL when requesting the page (see below). So it boils down to how the remote end handles URLs and whether it gets upset by the trailing slash. In my tests, Drupal sites for example do not like the trailing slash and return a 404, but some other sites worked fine. Unfortunately I was not able to figure out a workaround if you insist on using lftp.
Tests
I tried the following requests against a web server:
1. lftp -c 'mirror -v http://example/path'
2. lftp -c 'mirror -v http://example/path/?page=2'
3. lftp -c 'mirror -v http://example/path/file'
4. lftp -c 'mirror -v http://example/path/file?page=2'
These commands resulted in the following HEAD requests as seen by the web server:
1. HEAD /path/
2. HEAD /path/%3Fpage=2/
3. HEAD /path/file/
4. HEAD /path/file%3Fpage=2/
Note that there's always a trailing slash in the request. %3F is just the URL encoded character ?.
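These requests can be reproduced against any throwaway web server; for example (a sketch, not the exact server used in the tests above), the request lines show up directly in the server's log:
# terminal 1: a minimal local server that prints each request line it receives
python3 -m http.server 8000
# terminal 2: point lftp at it and watch the log in terminal 1
lftp -c 'mirror -v http://127.0.0.1:8000/path/?page=2'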

DD-WRT wget returns a cached file

I'm developing an installer for my YAMon script for *WRT routers (see http://www.dd-wrt.com/phpBB2/viewtopic.php?t=289324).
I'm currently testing on a TP-Link TL-WR1043ND with DD-WRT v3.0-r28647 std (01/02/16). Like many others, this firmware variant does not include curl so I (gracefully) fall back to a wget call. But, it appears that DD-WRT includes a cut-down version of wget so the -C and --no-cache options are not recognized.
Long & short, my wget calls insist on downloading cached versions of the requested files.
BTW - I'm using: wget "$src" -qO "$dst"
where src is the source file on my remote server and dst is the destination on the local router
So far I've unsuccessfully tried to:
1. append a timestamp to the request URL (see the sketch after this question)
2. reboot the router
3. run stopservice dnsmasq & startservice dnsmasq
None have changed the fact that I'm still getting a cached version of the file.
I'm beating my head against the wall... any suggestions? Thx!
Al
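P.S. For reference, the timestamp attempt (item 1 above) looked roughly like this; the parameter name is arbitrary:
# cache-busting variant of the call I'm already using
wget "${src}?t=$(date +%s)" -qO "$dst"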
Not really an answer but a seemingly viable workaround...
After a lot of experimentation, I found that wget seems to always return the latest version of the file from the remote server if the extension on the requested file is '.html'; but if it is something else (e.g., '.txt' or '.sh'), it does not.
I have no clue why this happens or where they are cached.
But now that I do, all of the files required by my installer have an .html extension on the remote server, and the script saves them with the proper extension locally. (Sigh... several days of my life that I won't get back.)
Al
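In command form, the workaround described above looks roughly like this (server and file names are made up for illustration):
# the remote copy is published with an .html extension, but saved locally under its real name
wget "http://example.com/yamon/setup.sh.html" -qO "/tmp/setup.sh"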
I had the same problem: while getting images from a camera, the HTTP server on the camera always sent the same image.
wget --no-http-keep-alive ..
solved my problem,
and my full command line is
wget --no-check-certificate --no-cache --no-cookies --no-http-keep-alive "$URL" -O img.jpg -o wget_last.log

curl issue while uploading files to ftp

I have an issue with a script that needs to upload all the files stored in a directory. Every time I get this error:
curl: (9) Server denied you to change to the given directory
#!/bin/sh
for file in /export/test/*
do
curl -T ${file} ftp://192.168.10.10/${file} --user tester:psswd
done
I checked the vsftpd config; I have read/write permissions, and when I do it manually it works.
For example, when I run this command, everything is OK:
curl -T /export/test/testing.txt ftp://192.168.10.10/export/status/testing.txt --user tester:psswd
Has anyone else had this problem? I don't have any idea how to solve it; I have tried everything.
By the way: my FTP root folder is /var/www/stats and I need to rewrite files in a subfolder named /var/www/stats/export/test.
FIXED
My bad: the error was in that file variable; it already holds the full path, and I was putting one slash too many in the server URL.
So the final conclusion is this:
#!/bin/sh
for file in /export/test/*
do
curl -T "${file}" "ftp://192.168.10.10${file}" --user tester:psswd
done
It works. Done.
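To make the difference concrete, with file=/export/test/testing.txt the two versions of the loop expand to the following URLs (only the slash differs):
ftp://192.168.10.10/${file}  expands to  ftp://192.168.10.10//export/test/testing.txt  (double slash, denied)
ftp://192.168.10.10${file}   expands to  ftp://192.168.10.10/export/test/testing.txt   (works)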

What does this command do: 'wget -qO- 127.0.0.1'?

In the provisioning part of the Vagrant guide, there is a command wget -qO- 127.0.0.1 to check if Apache is installed properly.
Can anyone explain the command in more detail? I don't understand what the -qO- option does. Also, what is the meaning of pointing wget at 127.0.0.1?
Thanks!
The dash after the O sends the downloaded document to standard output.
The q option means the command should be "quiet" and not print wget's own progress and log messages.
The two options together mean the command can be used nicely in a pipeline: only the page content is written out.
As for pointing wget at 127.0.0.1, that is there to make sure you have a local web server running, since that is where Apache should be answering. Running wget on the command line is faster than bringing up a browser.
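For example, the output can be piped straight into other tools; the grep target here is just an assumption about what the default Apache test page contains:
# print the first few lines of whatever the local server returns
wget -qO- 127.0.0.1 | head -n 5
# or check for an expected string and use the exit status
wget -qO- 127.0.0.1 | grep -qi "it works" && echo "apache is answering"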
