Wget trying to write to original request? - shell

I use wget for simple things so don't scream if there is an obvious answer to my problem but here is an example of a wget for a simple image:
MBP:bin Mike$ wget http://www.mactricksandtips.com/wp-content/uploads/main_page_images/terminal-small.png
--2011-04-25 12:48:05-- http://www.mactricksandtips.com/wp-content/uploads/main_page_images/terminal-small.png
Resolving www.mactricksandtips.com... 209.20.76.249
Connecting to www.mactricksandtips.com|209.20.76.249|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 4432 (4.3K) [image/png]
terminal-small.png: Permission denied
Cannot write to `terminal-small.png' (Permission denied).
MBP:bin Mike$
Any suggestions on why it's not simply writing it to my computer? This happens for every single wget request I make...

If the file exists already on your machine, check its permissions. You may not have the rights to overwrite it. As well, check the permissions on the containing directory to see if you're even allowed to write in there.
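For example (a quick sketch using the OP's filename; adjust the path to wherever you ran wget), these two checks show whether the file or the directory is the problem:
ls -ld .                    # do you have write permission on the current directory?
ls -l terminal-small.png    # if the file already exists, who owns it and what are its permissions?
If your user doesn't have a w bit in either listing, that's the culprit.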

You don't seem to have write permissions to that folder. I'm not sure what distro you use, but prepending sudo might do the trick:
sudo wget http://www.mactricksandtips.com/wp-content/uploads/main_page_images/terminal-small.png
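If you'd rather not run wget as root, a simpler alternative (just a sketch, using ~/Downloads as an example of a directory you own) is to save the file somewhere you already have write access:
wget -P ~/Downloads http://www.mactricksandtips.com/wp-content/uploads/main_page_images/terminal-small.png
The -P option tells wget which directory to write the downloaded file into.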

Related

using lftp to upload to ftp site got 501 Insufficient disk space

I'm new to using ftp and recently I came across this really weird situation.
I was trying to upload a file to someone else's ftp site, and I tried to use this command:
lftp -e "set ftp:passive-mode true; put /dir/to/myfile -o dest_folder/`basename /dir/to/myfile`; bye" ftp://userName:passWord@ftp.site.com
but i got the error
put: Access failed: 501 Insufficient disk space : only 0 bytes available. (To dest_folder/myfile)
and when I log on to their site and check, a 0-byte file with myfile's name has been uploaded.
At first I thought the ftp site was out of disk space, but I then tried logging on to the site using
lftp userName:passWord@ftp.site.com
and then set passive mode
set ftp:passive-mode true
and then uploaded the file (using another name)
put /dir/to/myfile_1 -o dest_folder/`basename /dir/to/myfile_1`
this time the file was successfully uploaded without the 501 insufficient disk space error.
Does any one know why this happens? Thanks!
You might try using lftp -d, to enable the debug/verbose mode. Some FTP clients use the ALLO FTP command, to tell the FTP server to "allocate" some amount of bytes in advance; the FTP server can then accept/reject that. I suspect that lftp is sending ALLO to your FTP server, and it is the FTP server responding to that ALLO command with a 501 response code, causing your issue.
Per updates/comments, the OP confirmed that lftp's use of ALLO was indeed resulting in the initially reported behaviors. Subsequent errors happened because lftp was attempting to update the timestamp of the uploaded file; these attempts were also being rejected by the FTP server. lftp had tried using the MFMT and SITE UTIME FTP commands.
To disable those, and to get lftp to succeed for the OP, the following lftp settings were needed:
ftp:trust-feat no
ftp:use-allo no
ftp:use-feat no
ftp:use-site-utime no
ftp:use-site-utime2 no
With these settings, you should be able to have lftp upload a file without using the ALLO command beforehand, and without trying to modify the server-side timestamp of the uploaded file using MFMT or SITE UTIME.
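For example (a sketch built on the OP's original command, with the same placeholder credentials and paths), all of the settings can be passed inline alongside the existing set statement:
lftp -e "set ftp:passive-mode true; set ftp:trust-feat no; set ftp:use-allo no; set ftp:use-feat no; set ftp:use-site-utime no; set ftp:use-site-utime2 no; put /dir/to/myfile -o dest_folder/`basename /dir/to/myfile`; bye" ftp://userName:passWord@ftp.site.com
Alternatively, the same set lines can go into ~/.lftprc so they apply to every lftp session.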
Hope this helps!

DD-WRT wget returns a cached file

I'm developing an installer for my YAMon script for *WRT routers (see http://www.dd-wrt.com/phpBB2/viewtopic.php?t=289324).
I'm currently testing on a TP-Link TL-WR1043ND with DD-WRT v3.0-r28647 std (01/02/16). Like many others, this firmware variant does not include curl so I (gracefully) fall back to a wget call. But, it appears that DD-WRT includes a cut-down version of wget so the -C and --no-cache options are not recognized.
Long & short, my wget calls insist on downloading cached versions of the requested files.
BTW - I'm using: wget "$src" -qO "$dst"
where src is the source file on my remote server and dst is the destination on the local router
So far I've unsuccessfully tried to:
1. append a timestamp to the request URL
2. reboot the router
3. run stopservice dnsmasq & startservice dnsmasq
None have changed the fact that I'm still getting a cached version of the file.
I'm beating my head against the wall... any suggestions? Thx!
Al
Not really an answer but a seemingly viable workaround...
After a lot of experimentation, I found that wget seems to always return the latest version of the file from the remote server if the extension on the requested file is '.html'; but if it is something else (e.g., '.txt' or '.sh'), it does not.
I have no clue why this happens or where they are cached.
But now that I do, all of the files required by my installer have an .html extension on the remote server and the script saves them with the proper extension locally. (Sigh... several days of my life that I won't get back)
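In script form the trick looks roughly like this (a sketch; the URL and file names here are made up for illustration):
src="http://my.server.example/yamon/install.sh.html"   # stored on the remote server with an .html extension
dst="/tmp/install.sh"                                   # saved locally under its real name
wget "$src" -qO "$dst"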
Al
I had the same problem. While getting images from a camera, the HTTP server on the camera always sent the same image.
wget --no-http-keep-alive ..
solved my problem
and my full line is
wget --no-check-certificate --no-cache --no-cookies --no-http-keep-alive $URL -O img.jpg -o wget_last.log

wget does not log into ftp server without sudo

When I do:
sudo wget ${FTPpath}* -nc
It gets the files without a problem, but entering a password is awkward for automation. It also just seems like bad practice.
When I leave out the sudo, it does not connect to the ftp server:
--2015-12-09 11:02:17-- ftp://dmanalytics:*password*@10.10.23.32/dmanalytics/download/*
=> ‘.listing’
Connecting to 197.80.203.9:3128... connected.
Logging in as dmanalytics ...
And then it just sits there.
I am working via a proxy, but I don't think it's that. Above it says connected, but I also tried:
wget ${FTPpath}* -e use_proxy=yes -e http_proxy=197.80.203.9:3128 -nc
without any success. Any ideas what is causing my failure to log into the ftp server, and how I can fix my wget command so that I don't have to use sudo?
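One thing that might be worth trying (just a sketch, not a confirmed fix): since the URL is ftp://, wget looks at the ftp_proxy setting rather than http_proxy, and the sudo run may simply be picking up root's proxy configuration. Something like:
wget ${FTPpath}* -nc -e use_proxy=yes -e ftp_proxy=http://197.80.203.9:3128
would route the FTP request through the same proxy without needing sudo.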

Transferring a file to an amazon ec2 instance using scp always gives me permission denied (publickey,gssapi-with-mic)

I am trying to transfer a file to an ec2 instance. I followed the Amazon's documentation, this is what my command looked like:
scp -i [the key's location] Documents/[the file's location] ec2-user@[public dns]:[home/[destination]]
where I replaced all the variables with the proper values. I am sure it's the correct key and it has permissions 400. When I call the command, it tells me the RSA key fingerprint and asks me if I want to continue connecting. I type yes and it replies with:
Permission denied (publickey,gssapi-with-mic)
lost connection
I have looked at many of the other similar questions on stack overflow and can't find a correct way to do it.
Also ssh traffic is enabled on port 22.
The example Amazon provided is correct. It sounds like a folder permissions issue. If the folder you are trying to copy to was created while logged in as another user, or another user created it, chances are you don't have permission to copy to it or edit it.
If you have sudo abilities, you can try opening access for yourself. Though not recommended to be left this way, you could try this command:
sudo chmod 777 /folderlocation
That gives complete read/write/execute permissions to anyone (hence why you shouldn't leave it at 777), but it will give you the chance to test your scp command to rule out permissions.
Afterwards, if you aren't familiar with permissions, I suggest you read up on them; this is an example: http://www.tuxfiles.org/linuxhelp/filepermissions.html. It is generally suggested you lock down the folder as much as possible depending on the type of information held within.
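For example (a sketch; substitute whichever user you actually log in as, the question uses ec2-user), once the scp works you can tighten the folder back up:
sudo chown ec2-user:ec2-user /folderlocation   # give your login user ownership
sudo chmod 755 /folderlocation                 # owner can write, everyone else can only read/enter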
If that was not the cause some other things you might want to check:
are you in the directory of your key when executing the 'scp -i keyname' command?
do you have permissions to use the folder you are transferring from?
Best of luck.
The problem may be the user name. I copied a file to my Amazon instance and first tried to use the command:
scp -r -i ../.ssh/Amazon_server_key_pair.pem ./empty.test ec2-user@ec2-xx-yy-zz-tt.compute-1.amazonaws.com:~
and got the error: Permission denied (publickey).
I then realized that my instance is an Ubuntu environment, so the user is "ubuntu". The correct command that worked for me is:
scp -r -i ../.ssh/Amazon_server_key_pair.pem ./empty.test ubuntu@ec2-xx-yy-zz-tt.us-west-2.compute.amazonaws.com:~
The file "empty.test" is a text file containing the text "testing ...". Replace the address of your virtual server with the correct address to your instance's Public DNS. I have replaced the ip to my instance with xx.yy.zz.tt.
I had to use ubuntu@ instead of ec2-user@ because when I ssh'd in I was seeing ubuntu@ in my terminal; try changing to the name you see in your terminal.
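A quick way to confirm which login name the AMI expects (reusing the key and host placeholders from the command above) is to ssh in interactively first; the prompt you land at shows the right user:
ssh -i ../.ssh/Amazon_server_key_pair.pem ubuntu@ec2-xx-yy-zz-tt.us-west-2.compute.amazonaws.com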
You also have to set permissions for the .pem file on your computer:
chmod 400 /path/my-key-pair.pem
The command below will copy a file from your computer to the EC2 instance:
scp -i ~/location_of_your_ec2_key_pair.pem ~/location_of_transfer_file/sample.txt ubuntu@ec2_your_ec2_instance.compute.amazonaws.com:~/folder_to_which_it_needs_to_be_copied
The command below will copy a file from the EC2 instance to your computer:
scp -i ~/location_of_your_ec2_key_pair.pem ubuntu@ec2_your_ec2_instance.compute.amazonaws.com:~/location_of_transfer_file/sample.txt ~/folder_to_which_it_needs_to_be_copied
I was facing the same problem. Hope this will work for you.
scp -rp -i yourfile.pem ~/local_directory username@instance_url:directory
Permissions should also be correct to make this work.
It might be that one is using the wrong username. That happened to me; it was the same error message: Permission denied (publickey,gssapi-keyex,gssapi-with-mic).
lost connection

How would I construct a terminal command to download a folder with wget from a Media Temple (gs) server?

I'm trying to download a folder using wget on the Terminal (I'm using a Mac, if that matters) because my ftp client sucks and keeps timing out. It doesn't stay connected for long. So I was wondering if I could use wget to connect via the ftp protocol to the server to download the directory in question. I have searched around on the internet for this and have attempted to write the command, but it keeps failing. So assuming the following:
ftp username is: serveradmin@mydomain.ca
ftp host is: ftp.s12345.gridserver.com
ftp password is: somepassword
I have tried to write the command in the following ways:
wget -r ftp://serveradmin@mydomain.ca:somepassword@s12345.gridserver.com/path/to/desired/folder/
wget -r ftp://serveradmin:somepassword@s12345.gridserver.com/path/to/desired/folder/
When I try the first way I get this error:
Bad port number.
When I try the second way I get a little further but I get this error:
Resolving s12345.gridserver.com... 71.46.226.79
Connecting to s12345.gridserver.com|71.46.226.79|:21... connected.
Logging in as serveradmin ...
Login incorrect.
What could I be doing wrong?
Use scp on the Mac instead; it will probably work much better.
scp -r user@mediatemplehost.net:/folder/path /local/path
