Script to download a list with wget in bash

I have a list of download links (list.txt):
https://web.com/file1.zip
https://web.com/file2.zip
https://web.com/file3.zip
and so on up to 100...
I'm trying to build a script that downloads my files with wget, with one requirement: every time a file finishes downloading, the ls command should run.
This is what my long script looks like now. I want to shorten it and make it smarter: instead of writing out each command manually, the script should read list.txt line by line, download the files in order, and run ls after each download completes.
wget https://web.com/file1.zip && ls ; wget https://web.com/file2.zip && ls ; wget https://web.com/file3.zip && ls ;
And so on up to 100

Use a for loop to squash your similar wget commands:
for i in {1..100}
do
    wget "https://web.com/file${i}.zip" && ls
done
Be aware that this is bash syntax; other shells may need a different loop style.
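As a side note, if the files were actually zero-padded (file001.zip and so on — an assumption, not something in the question), bash 4 and later can zero-pad the brace expansion too:
for i in {001..100}
do
    wget "https://web.com/file${i}.zip" && ls
done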

#!/bin/bash
# Read list.txt line by line and run ls after each download completes;
# quoting "$url" stops the shell from globbing characters like ? and * in URLs
while IFS= read -r url; do
    wget "$url" && ls
done < list.txt
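For completeness: if the per-download ls were not a requirement, wget can read the URL list itself (-i is a standard wget flag), though then ls runs only once at the end:
# Downloads every URL listed in list.txt, in order
wget -i list.txt && ls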

How to make a bash script for mac?

I'm trying to make this bash script, but I get this error: "Error reading *.docx. The file doesn’t exist."
Here's the script:
#!/bin/bash
textutil -convert txt *.docx
cat *.txt | wc -w
I'm currently running it from the folder but I'd like to make it a global script I can just call from any current folder.
If you want to make it available across your whole system, move it to a directory on your PATH, like so:
chmod a+rx yourscript.sh && sudo mv yourscript.sh /usr/local/bin/yourscript
Then you can call it like a normal command from any folder.
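One caveat, inferred from the quoted error: if the current folder has no .docx files, the glob stays literal and textutil reports exactly that "Error reading *.docx" message. A minimal sketch of a guard that exits early instead:
#!/bin/bash
# Bail out cleanly when the folder has no .docx files,
# so textutil never receives the literal pattern "*.docx"
shopt -s nullglob
files=(*.docx)
if [ "${#files[@]}" -eq 0 ]; then
    echo "No .docx files in $PWD" >&2
    exit 1
fi
textutil -convert txt "${files[@]}"
cat ./*.txt | wc -w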

With Bash or ZSH is there a way to use a wildcard to execute the same command for each script?

I have a directory with script files, say:
scripts/
foo.sh
script1.sh
test.sh
... etc
and would like to execute each script like:
$ ./scripts/foo.sh start
$ ./scripts/script1.sh start
etc
without needing to know all the script filenames.
Is there a way to append start to them and execute? I've tried tab-completion as it's pretty good in ZSH, using ./scripts/*[TAB] start with no luck, but I would imagine there's another way to do so, so it outputs:
$ ./scripts/foo.sh start ./scripts/script1.sh start
Or perhaps some other way to make it easier? I'd like to do this in the Terminal without an alias or function if possible, as these scripts are on a box I SSH into, and I shouldn't be modifying its *profile or *rc files.
Use a simple loop:
for script in scripts/*.sh; do
    "$script" start
done
There's just one caveat: if there are no such *.sh files, you will get an error. A simple workaround for that is to check if $script is actually a file (and executable):
for script in scripts/*.sh; do
    [ -x "$script" ] && "$script" start
done
Note that this can be written on a single line, if that's what you're after:
for script in scripts/*.sh; do [ -x "$script" ] && "$script" start; done
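Another way to handle the no-match case in bash (an alternative to the -x test above, not what the answer used) is the nullglob option, which makes an unmatched glob expand to nothing:
shopt -s nullglob
for script in scripts/*.sh; do
    "$script" start
done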
Zsh has some shorthand loops that bash doesn't:
for f (scripts/*.sh) "$f" start
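zsh glob qualifiers can even fold the safety checks into the pattern itself — a zsh-only sketch, where N gives null-glob behavior, . restricts the match to plain files, and x requires the owner execute bit:
for f (scripts/*.sh(N.x)) "$f" start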

Pipe script and binary data to stdin via ssh

I want to execute a bash script remotely which consumes a tarball and performs some logic to it. The trick is that I want to use only one ssh command to do it (rather than scp for the tarball followed by ssh for the script).
The bash script looks like this:
cd /tmp
tar -zx
./archive/some_script.sh
rm -r archive
I realize that I can simply reformat this script into a one-liner and use
tar -cz ./archive | ssh $HOST bash -c '<commands>'
but my actual script is complicated enough that I must pipe it to bash via stdin. The challenge here is that ssh provides only one input pipe (stdin) which I want to use for both the bash script and the tarball.
I came up with two solutions, both of which include the bash script and the tarball in stdin.
1. Embed base64-encoded tarball in a heredoc
In this case the server receives a bash script with the tarball embedded inside a heredoc:
base64 -d <<'EOF_TAR' | tar -zx
<base64_tarball>
EOF_TAR
Here's the complete example:
ssh "$HOST" bash -s < <(
    # Feed the script header
    cat <<'EOF'
cd /tmp
base64 -d <<'EOF_TAR' | tar -zx
EOF
    # Create the local tarball and pipe its base64-encoded form
    tar -cz ./archive | base64
    # Feed the rest of the script
    cat <<'EOF'
EOF_TAR
./archive/some_script.sh
rm -r archive
EOF
)
In this approach, however, tar does not start extracting the tarball until it has been fully transferred over the network.
2. Feed tar binary data after the script
In this case the bash script is piped into stdin first, followed by the raw tarball data. bash passes control to tar, which processes the tar portion of stdin:
ssh "$HOST" bash -s < <(
    # Feed the script
    cat <<'EOF'
main() {
    cd /tmp
    tar -zx
    ./archive/some_script.sh
    rm -r archive
}
main
EOF
    # Create the local tarball and pipe it
    tar -cz ./archive
)
Unlike the first approach, this one allows tar to start extracting the tarball as it is being transferred over the network.
Side note
Why do we need the main function, you ask? Why feed the entire bash script first, followed by the binary tar data? Well, if the binary data were placed in the middle of the script, there would be an error: tar reads stdin in blocks and consumes past the end of the tar data, which in this case would eat up part of the bash script that follows. Wrapping everything in main forces bash to read the whole script off stdin before any of it executes, so the script text is fully consumed before the tar data begins.
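To see the mechanics without a remote host, here is a minimal local sketch of the same pattern (assuming an ./archive directory containing some_script.sh exists in the current directory); a plain pipe into bash -s behaves just like the ssh version:
{
    # Script text first: bash reads the entire function before executing it...
    cat <<'EOF'
main() {
    cd /tmp
    tar -zx
    ./archive/some_script.sh
    rm -r archive
}
main
EOF
    # ...then the raw tarball, which tar -zx inside main consumes from stdin
    tar -cz ./archive
} | bash -s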

Shell/Bash - pipe output into another script's input via a variable

Normally I would break things into separate actions and copy and paste the output into another input:
$ which git
/usr/local/bin/git
$ sudo mv git-credential-osxkeychain /usr/local/bin/git
Any quick hack to get output into input?
Something like:
$ echo which wget | sudo mv git-credential-osxkeychain
set -vx
myGit=$(which git)
# Strip the trailing /git, then the trailing /bin, and append /git to build the target path
gitDir=${myGit%/git} ; gitDir=${gitDir%/bin}/git
echo sudo mv git-credential-osxkeychain ${gitDir}
Remove the set -vx and the echo on the last line when you're sure this performs the action that you require.
It's probably possible to reduce the number of keystrokes required, but I think this version makes it easier to understand which techniques are being used and how they work.
IHTH
Use command substitution with $(command):
sudo mv git-credential-osxkeychain $(which git)
This replaces the command with its output. You can read all about it at http://tldp.org/LDP/abs/html/commandsub.html
The answer would be what Chirlo and shellter said.
Why echo which wget | sudo mv git-credential-osxkeychain wouldn't work: piping redirects the stdout of the previous command to the stdin of the next command, and in this case mv doesn't take its input from stdin.
A curious thing is that which git returns
/usr/local/bin/git
but you are moving git-credential-osxkeychain to
/usr/local/git/bin/
Those two don't match. Is there a typo or something?
If you want to use the pipe syntax, then you should look at xargs.
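A minimal illustration of that suggestion (it carries the same path caveat raised above, so an echo-style dry run is kept):
# xargs turns the piped path into a trailing argument for mv.
# The echo makes this a dry run; drop it once the target path looks right.
which git | xargs -I{} echo sudo mv git-credential-osxkeychain {}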

Dynamic command execution with lftp - multiple commands

I'm sure there is a simple way to do this, but I am not finding it. What I want to do is execute a series of commands using lftp, and I want to avoid repeatedly connecting to the server if possible.
Basically, I have a file listing ftp directories on the server. I want to connect to the server and then execute something like the following (assume at this point that I have already converted the text file into an array of lines using cat):
for f in "${myarray[@]}"
do
    cd $f;
    nlist >> $f.txt;
    cd ..;
done
Of course that doesn't work, but I have to imagine there is a simple solution to what I am trying to accomplish.
I am quite inexperienced when it comes to shell scripting. Any suggestions?
First build a string that contains the list of lftp commands. Then call lftp, passing the command on its standard input. Lftp itself can redirect the output of a command to a file, with a syntax that resembles the shell.
list_commands=""
for dir in "${myarray[@]}"; do
    list_commands="$list_commands
cd \"$dir\"
nlist >\"$dir.txt\"
cd .."
done
lftp <<EOF
open -u $username,$password $site
$list_commands
bye
EOF
Note that I assume that the directory names don't contain backslashes, single quotes or globbing characters. Add proper escaping if necessary.
By the way, to read lines from a file, see "Why is while IFS= read used so often, instead of IFS=; while read..?". You might prefer to combine reading the list of directories with building the commands:
list_commands=""
while IFS= read -r dir; do
    list_commands="$list_commands
cd \"$dir\"
nlist >\"$dir.txt\"
cd .."
done <directory_list.txt
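Putting it together, one alternative sketch (assuming your lftp supports the standard -f flag for reading commands from a file) is to write the command list out and hand the file to lftp in one shot:
{
    echo "open -u $username,$password $site"
    while IFS= read -r dir; do
        printf 'cd "%s"\nnlist >"%s.txt"\ncd ..\n' "$dir" "$dir"
    done <directory_list.txt
    echo "bye"
} >lftp_commands.txt
lftp -f lftp_commands.txt
Note that this writes the password into a file, so the heredoc approach above is preferable when that matters.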
