I need to download the files located in the root of my FTP server, but not the folders. Is there a way to do that?
I'm currently connecting with this command:
wget --timeout 20 -m -nH --user "user" --password "pass" ftp://123.123.com
Thanks.
wget has no built-in way to download only the regular files and skip the directories when mirroring an FTP site.
However, you can create a simple script that collects the file names first and then fetches them with wget:
echo "open $HOST" > ftp.txt
echo "user $USER $PASS" >> ftp.txt
echo "ls" >> ftp.txt
echo "bye" >> ftp.txt
# -n disables auto-login so the 'user' command in ftp.txt is honored;
# lines starting with '-' are regular files, field 9 is the file name
ftp -n < ftp.txt | awk -v host="$HOST" '/^-/ {print "ftp://" host "/" $9}' > files.txt
wget --timeout 20 -nH --user "$USER" --password "$PASS" -i files.txt
I want to delete a file from an FTP server using a bash script.
I use the code below:
file="xyz/ab/file.txt"
curl -v -u $user:$pass ftp://server.com/$file -Q "DELE $file"
but it gives this error:
* Entry path is '/'
> DELE xyz/ab/file.txt
* ftp_perform ends with SECONDARY: 0
< 550 Could not delete xyz/ab/file.txt: No such file or directory
* QUOT command failed with 550
How can I delete the file with a single-line bash command?
How to delete a file from an FTP server with curl:
user="foo"
pass="bar"
dir="xyz/ab/" # with trailing slash
file="file.txt"
curl -v -u "$user:$pass" "ftp://server.com/$dir" -Q "-DELE $file"
or
curl -v -u "$user:$pass" 'ftp://server.com' -Q "-DELE /$dir$file"
or without the leading /:
curl -v -u "$user:$pass" 'ftp://server.com' -Q "-DELE $dir$file"
The leading - in "-DELE ..." tells curl to send the quoted command after the transfer (here, after the directory listing the URL requests) instead of right after login.
You could use the ftp command instead if you want to run additional commands.
user=foo
pass=bar
ftp -n 127.0.0.1 <<EOF
quote USER $user
quote PASS $pass
delete xyz/ab/file.txt
exit
EOF
Deleting must be enabled on the FTP server, however. If I remember correctly, for vsftpd you must set anon_other_write_enable=YES in /etc/vsftpd.conf.
I am trying to find all the datanodes on a remote host, write them into a .txt file, and copy that file back to my local machine.
I have used the following commands:
# Port forwarding to remote host and scp back to local.
ssh -f user@remoteHost -L 22222:remoteHost-1:22 -N &&
ssh -t -p 22222 user@localhost "consul members | grep data | awk '{ print $2 > /tmp/data_nodes }'" &&
scp -t -p 22222 user@localhost:/tmp/data_nodes.txt .
This doesn't work somehow. The data_nodes file is not getting created inside /tmp directory.
Any help will be greatly appreciated. Thanks.
If I understand correctly, you want to run a remote command and capture its output locally; in that case just do
ssh <opts> "cmd1 | cmd2 | cmd3" > /path/to/local.file
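Applied to your commands, that would be something like the sketch below. Note the escaped \$2, so the local shell doesn't expand it inside the double quotes, and that the redirection happens on your local machine, so no scp step is needed:
ssh -t -p 22222 user@localhost "consul members | grep data | awk '{print \$2}'" > data_nodes.txt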
I use curl to log in to the router with a username and password:
curl http://192.168.1.1 --user admin:admin | grep -i "stats"
But when I run curl from a bash script, I have a problem reading the servers and passwords from files:
LINKS_FILE="server"
PASS="passwd"
for link in $(cat "$LINKS_FILE")
do
    for pass in $(cat "$PASS")
    do
        res=$(curl -m 1 "http://${link}:8080" --user "admin:${pass}")
        if echo "$res" | grep -i "stats"; then
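A while/read loop tends to be more robust here, since for $(cat ...) splits on any whitespace in the files. A minimal sketch of the same scan; the echo is a hypothetical placeholder for whatever should happen on a match:
while read -r link; do
    while read -r pass; do
        res=$(curl -m 1 "http://${link}:8080" --user "admin:${pass}")
        if echo "$res" | grep -qi "stats"; then
            echo "match: $link admin:$pass"  # hypothetical action on match
        fi
    done < "$PASS"
done < "$LINKS_FILE"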
I am trying to use wget in a bash script and display a custom download percentage per file, so that the user knows the process is running and can see what has been downloaded. The code below seems to download the files, but no percentage or file name is displayed. I am not sure why and cannot figure it out. Thank you :).
The list file:
xxxx://www.xxx.com/xxx/xxxx/xxx/FilterDuplicates.html
xxxx://www.xxx.com/xxx/xxxx/xxx/file1.bam
xxxx://www.xxx.com/xxx/xxxx/xxx/file2.bam
xxxx://www.xxx.com/xxx/xxxx/xxx/file1.vcf.gz
xxxx://www.xxx.com/xxx/xxxx/xxx/file2.vcf.gz
The bash script that uses list to download all files:
# download all from list
download() {
local url=$1
echo -n " "
wget --progress=dot $url 2>&1 | grep --line-buffered "%" | sed -u -e "s,\.,,g" | awk '{printf("\b\b\b\b%4s", $2)}'
echo -ne "\b\b\b\b"
echo " starting download"
}
cd "/home/user/Desktop/folder/subfolder"
wget -i /home/cmccabe/list --user=xxx --password=xxx --xxx \
xxxx://www.xxx.com/xxx/xxxx/xxx/ 2>&1 -o wget.log | download
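A per-file display is simpler if each URL gets its own wget call, letting wget draw its own progress bar; a minimal sketch, assuming the same list file and credentials, and wget 1.16+ for --show-progress:
while read -r url; do
    echo "downloading: ${url##*/}"
    # -q silences everything except the progress bar from --show-progress
    wget -q --show-progress --user=xxx --password=xxx "$url"
done < /home/cmccabe/list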
I'm trying to get a backup script working for Rackspace. The final part, where it sends the backup to a server, isn't working.
#!/bin/sh
#Set information specific to your site
webroot="/mnt/stor08-wc1-ord1/666666/www.mysite.com/"
db_host="mysql51-900.wc1.ord1.stabletransit.com"
db_user="666666_backup"
db_password="PassBackup2015"
db_name="666666_wealth"
#Set the date and name for the backup files
date=`date '+%F-%H-%M'`
backupname="backup.$date.tar.gz"
#Dump the mysql database
mysqldump -h $db_host -u $db_user --password="$db_password" $db_name > $webroot/db_backup.sql
#Backup Site
tar -czpvf $webroot/sitebackup.tar.gz $webroot/web/content/
#Compress DB and Site backup into one file
tar --exclude 'sitebackup' --remove-files -czpvf $webroot/$backupname $webroot/sitebackup.tar.gz $webroot/db_backup.sql
HOST='172.0.0.1'
USER='FILESERV\ftp'
PASSWD='PassBackup2015!'
ftp $HOST <<END_SCRIPT
user $USER $PASSWD
cd $webroot
put $backupname
quit
END_SCRIPT
exit 0
You may need to use ftp -n to prevent auto-login. Auto-login combined with an explicit user command in the script often causes problems.
In other words, something like:
ftp -n $HOST <<END_SCRIPT
user $USER $PASSWD
cd $webroot
put $backupname
quit
END_SCRIPT
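If ftp keeps misbehaving, curl can do the same upload in one line; a sketch reusing the script's variables, assuming the remote directory really is $webroot (curl uploads the file into the directory named in the URL):
curl -T "$webroot/$backupname" --user "$USER:$PASSWD" "ftp://$HOST$webroot/"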