I'm trying to download a lot of files using wget, sending the command directly to the OS from Python.
To simplify the problem, here's my code:
import os
address = ("wget -r -l1 --no-parent -e robots=off -A *c2_1024.jpg "
           "https://soho.nascom.nasa.gov//data//REPROCESSING//Completed//2022//c2//20221019//")
os.system(address)
If I open a command prompt and type the same wget command:
wget -r -l1 --no-parent -e robots=off -A *c2_1024.jpg https://soho.nascom.nasa.gov//data//REPROCESSING//Completed//2022//c2//20221019//
the files download perfectly.
However, when I run my code above, I get the following error.
SYSTEM_WGETRC = c:/progra~1/wget/etc/wgetrc
syswgetrc = C:\Program Files (x86)\GnuWin32/etc/wgetrc
--2022-10-20 15:50:52--  https://soho.nascom.nasa.gov//data//REPROCESSING//Completed//2022//c2//20221019//
Resolving soho.nascom.nasa.gov...
SYSTEM_WGETRC = c:/progra~1/wget/etc/wgetrc
syswgetrc = C:\Program Files (x86)\GnuWin32/etc/wgetrc
--2022-10-20 15:50:52--  https://soho.nascom.nasa.gov//data//REPROCESSING//Completed//2022//c2//20221019//
Resolving soho.nascom.nasa.gov... 198.118.248.108
Connecting to soho.nascom.nasa.gov|198.118.248.108|:443... connected.
OpenSSL: error:1407742E:SSL routines:SSL23_GET_SERVER_HELLO:tlsv1 alert protocol version
Unable to establish SSL connection.
How can I use Python to download my files using the variable address?
Thanks Peter
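For reference, the same command can be launched with subprocess.run and an explicit argument list, which avoids shell quoting issues and makes it obvious which wget binary is being run (a minimal sketch, assuming a TLS-capable wget is on PATH):

import subprocess

url = "https://soho.nascom.nasa.gov//data//REPROCESSING//Completed//2022//c2//20221019//"

# Passing the arguments as a list means no shell parsing is involved;
# check=True raises CalledProcessError if wget exits with a non-zero status.
subprocess.run(
    ["wget", "-r", "-l1", "--no-parent", "-e", "robots=off",
     "-A", "*c2_1024.jpg", url],
    check=True,
)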
I am trying to download some files from a directory on an FTP server. I would like to use the "wget" command, but I can't retrieve them.
ftp server URL: ftp://192.168.0.10
ftp user name: GL840
ftp password: no password
folder name in ftp: SD1/181004/
I am using the following command to download all files in the folder SD1/181004 on the FTP server:
wget -r -nd ftp://GL840:@192.168.0.10/SD1/181004/ -P /root/wang/powerdata/
However, an error message is displayed and the files are not downloaded.
Could you please tell me how to modify my command to download the files?
I have tried accessing files on an FTP server before. The following command usually works for me; give it a try:
wget --user='GL840' --password='' ftp://192.168.0.10/SD1/181004/
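If wget keeps failing, the same transfer can be scripted directly in Python with the standard library's ftplib (a minimal sketch; the host, user, empty password, and folder are taken from the question, and it assumes the folder contains only plain files):

from ftplib import FTP
from pathlib import Path

# Connect with the credentials from the question (empty password).
ftp = FTP("192.168.0.10")
ftp.login(user="GL840", passwd="")
ftp.cwd("SD1/181004/")

dest = Path("/root/wang/powerdata")
dest.mkdir(parents=True, exist_ok=True)

# nlst() lists the entry names in the current directory.
for name in ftp.nlst():
    with open(dest / name, "wb") as f:
        ftp.retrbinary(f"RETR {name}", f.write)

ftp.quit()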
I'm developing an installer for my YAMon script for *WRT routers (see http://www.dd-wrt.com/phpBB2/viewtopic.php?t=289324).
I'm currently testing on a TP-Link TL-WR1043ND with DD-WRT v3.0-r28647 std (01/02/16). Like many others, this firmware variant does not include curl so I (gracefully) fall back to a wget call. But, it appears that DD-WRT includes a cut-down version of wget so the -C and --no-cache options are not recognized.
Long & short, my wget calls insist on downloading cached versions of the requested files.
BTW - I'm using: wget "$src" -qO "$dst"
where src is the source file on my remote server and dst is the destination on the local router
So far I've unsuccessfully tried to:
1. append a timestamp to the request URL
2. reboot the router
3. run stopservice dnsmasq & startservice dnsmasq
None have changed the fact that I'm still getting a cached version of the file.
I'm beating my head against the wall... any suggestions? Thx!
Al
Not really an answer but a seemingly viable workaround...
After a lot of experimentation, I found that wget seems to always return the latest version of the file from the remote server if the extension on the requested file is '.html'; but if it is something else (e.g., '.txt' or '.sh'), it does not.
I have no clue why this happens or where they are cached.
But now that I do, all of the files required by my installer have an html extension on the remote server, and the script saves them with the proper extension locally. (Sigh... several days of my life that I won't get back.)
Al
I had the same problem. While getting images from a camera, the HTTP server on the camera always sent the same image.
wget --no-http-keep-alive ..
solved my problem, and my full line is:
wget --no-check-certificate --no-cache --no-cookies --no-http-keep-alive $URL -O img.jpg -o wget_last.log
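The same idea can be tested from Python on a full system (not on the stripped-down router itself); a hedged sketch using only the standard library, sending no-cache headers plus a throwaway timestamp query parameter (the URL and the parameter name nocache are made-up examples):

import time
import urllib.request

url = "http://example.com/files/installer.sh"  # placeholder URL

# A unique query string stops intermediate caches from serving a stored
# copy; "nocache" is an arbitrary, made-up parameter name.
busted = f"{url}?nocache={int(time.time())}"

req = urllib.request.Request(busted, headers={
    "Cache-Control": "no-cache",  # ask caches to revalidate
    "Pragma": "no-cache",         # HTTP/1.0 equivalent
})
with urllib.request.urlopen(req) as resp, open("installer.sh", "wb") as out:
    out.write(resp.read())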
Noob here. I'm trying to transfer a file using rsync from Windows to Linux.
I have this command, but I'm getting an error:
rsync -avz -e ssh C:\users\file.txt root@123.45.67.89:/var/dir
I get this error: "The source and destination cannot both be remote."
And if I try
rsync -avz -e ssh /c/users/file.txt root@123.45.67.89:/var/dir
I get: "No such file or directory".
So I think the problem is with the path of the file on the Windows side. I've heard about Cygwin but haven't really tried it.
What can I do to get the path to work?
If you're using Cygwin to rsync from the Windows box, the local file you want is almost certainly:
/cygdrive/c/users/file.txt
rather than:
/c/users/file.txt
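The drive-letter mapping is mechanical, so it can be scripted; a small hedged Python helper (the function name to_cygdrive is made up for illustration):

from pathlib import PureWindowsPath

def to_cygdrive(win_path: str) -> str:
    # Convert a Windows path like C:\users\file.txt to /cygdrive/c/users/file.txt.
    p = PureWindowsPath(win_path)
    drive = p.drive.rstrip(":").lower()  # "C:" -> "c"
    rest = "/".join(p.parts[1:])         # skip the "C:\" anchor
    return f"/cygdrive/{drive}/{rest}"

print(to_cygdrive(r"C:\users\file.txt"))  # -> /cygdrive/c/users/file.txt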
Writing the right rsyncd.conf is a little bit complicated.
I get the error message
@ERROR: chdir failed
rsync error: error starting client-server protocol (code 5) at main.c(1296) [receiver=2.6.8]
and
2012/06/10 14:59:43 [9252] rsync: chdir /cygdrive/d failed : No such file or directory (2)
This happened because I was missing the "use chroot = false" directive in my rsyncd.conf.
Here is an example of rsyncd.conf:
log file = c:/rsync.log
auth users = backup
secrets file = /cygdrive/c/Program Files/RSync/rsyncd.secrets
[backup]
comment = Backup this host
path = /cygdrive/d
use chroot = false
strict modes = false
read only = false
transfer logging = yes
hosts allow = 192.168.2.252
Be sure the rsync.log file is created/updated after restarting the service; if it isn't, you have some error in your rsyncd.conf file and rsync is not using it, but no error or info is shown at all!
The format of the rsyncd.secrets file is:
backup:<password>
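With this in place, a client should be able to pull from the module with something like the following (hedged; "server" is a placeholder for the daemon's address):
rsync -av backup@server::backup /some/local/dest
The double colon tells rsync to talk to the daemon directly rather than going over ssh.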
In a WSL2 environment, you can use the rsync command indirectly.
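For example, from a Windows prompt (hedged; this assumes rsync is installed inside WSL2, where the C: drive is mounted at /mnt/c):
wsl rsync -avz /mnt/c/users/file.txt root@123.45.67.89:/var/dir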
I'm trying to download a folder using wget on the Terminal (I'm using a Mac, if that matters) because my FTP client sucks and keeps timing out. It doesn't stay connected for long. So I was wondering if I could use wget to connect via the FTP protocol to the server to download the directory in question. I have searched around on the internet for this and have attempted to write the command, but it keeps failing. So assume the following:
ftp username is: serveradmin@mydomain.ca
ftp host is: ftp.s12345.gridserver.com
ftp password is: somepassword
I have tried to write the command in the following ways:
wget -r ftp://serveradmin@mydomain.ca:somepassword@s12345.gridserver.com/path/to/desired/folder/
wget -r ftp://serveradmin:somepassword@s12345.gridserver.com/path/to/desired/folder/
When I try the first way I get this error:
Bad port number.
When I try the second way I get a little further but I get this error:
Resolving s12345.gridserver.com... 71.46.226.79
Connecting to s12345.gridserver.com|71.46.226.79|:21... connected.
Logging in as serveradmin ...
Login incorrect.
What could I be doing wrong?
Use scp on the Mac instead; it will probably work much better.
scp -r user@mediatemplehost.net:/folder/path /local/path
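Incidentally, the errors in the question are consistent with the @ inside the username confusing wget's URL parsing: reserved characters in the user/password part of a URL must be percent-encoded (@ becomes %40). A hedged Python sketch of building such a URL with the standard library:

from urllib.parse import quote

user = "serveradmin@mydomain.ca"
password = "somepassword"
host = "s12345.gridserver.com"
path = "/path/to/desired/folder/"

# quote() with safe="" percent-encodes every reserved character,
# turning the embedded "@" into "%40".
url = f"ftp://{quote(user, safe='')}:{quote(password, safe='')}@{host}{path}"
print(url)
# ftp://serveradmin%40mydomain.ca:somepassword@s12345.gridserver.com/path/to/desired/folder/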
Can anyone point me in the direction of a simple step-by-step guide on how to connect to a Mercurial repo via SSH on Windows? I'm really struggling to get my head around it, and so far I just keep getting a string of errors. Any help would be appreciated.
Take a look at this: http://www.codza.com/mercurial-with-ssh-setup-on-windows
This assumes you have the PuTTY suite installed, a .ppk key, and are using TortoiseHg.
Here is my original c:\somerepo\.hg\hgrc file:
[paths]
default = ssh://hg#bitbucket.org/someuser/somerepo
So what's happening with ssh? Let's debug a pull with hg pull --debug on the command line. I noticed it runs C:\Program Files\TortoiseHg\lib\TortoisePlink.exe instead of ssh to make the call:
PS C:\somerepo> hg pull --debug
pulling from ssh://hg#bitbucket.org/someuser/somerepo
running "C:\Program Files\TortoiseHg\lib\TortoisePlink.exe" -ssh -2 hg#bitbucket.org "hg -R someuser/somerepo serve --stdio"
sending hello command
sending between command
abort: no suitable response from remote hg!
So let's just reuse that call, adding compression (yay!), non-interactive (batch) mode, and our key:
[paths]
default = ssh://hg#bitbucket.org/someuser/somerepo
[ui]
ssh = "C:\Program Files\TortoiseHg\lib\TortoisePlink.exe" -ssh -2 -C -batch -i "c:\keys\somekey.ppk"
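With this in place, running the same debug pull again is a quick sanity check (hedged; the key path is the example one above):
hg pull --debug
The output should now show TortoisePlink being invoked with -batch and -i, and the pull should complete without prompting.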