I got some strange behavior when running a script with arguments, where the script contains a command substitution. I would like to understand why this behavior is happening. The script is:
#!/bin/bash
# MAIL=$1
# USER=$2
PASSWORD=$(cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w ${1:-20} | head -n 1);
echo "$PASSWORD"
Then, when I run ./test.sh mail user, I get the error:
fold: invalid number of columns: ‘mail’
and the password is not generated.
If I don't pass an argument or I don't generate the password, it works fine.
Update (for understanding the behavior)
I think I've found out what is happening:
When running a script with two arguments, $1 and $2 hold the passed values. Example:
./test.sh arg1 arg2 has $1 -> arg1 and $2 -> arg2
When using a pipe inside the script, the original arguments are still passed, so with two input arguments the piped output ends up in the third position, $3.
$1 -> arg1
$2 -> arg2
$3 -> piped output
So a working solution would be:
PASSWORD=$(cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w ${3:-20} | head -n 1);
but if you vary the number of input arguments, it will not work. Therefore the best solution is what @KamilCuk suggested:
PASSWORD=$(< /dev/urandom tr -dc 'a-zA-Z0-9' | fold -w 20 | head -n 1);
If you do not wish to pass the first script argument to fold, then do not use $1 in it.
PASSWORD=$(< /dev/urandom tr -dc 'a-zA-Z0-9' | fold -w 20 | head -n 1);
# pass the number (20) directly, not $1
echo "$PASSWORD"
Related
I have the following bash script called countscript.sh
#!/bin/bash
echo "Running" $0
tr -cs A-Za-z '\n' | tr A-Z a-z | sort | uniq -c | sort -rn | sed $1 q
But I don't understand how to pass the argument correctly (the "3" should become the $1 argument to sed):
$ echo " one two two three three three" | ./countscript.sh 3
Running ./countscript.sh
sed: -e expression #1, char 1: missing command
This works fine:
$ echo "one two three four one one four" | tr -cs A-Za-z '\n' | tr A-Z a-z | sort | uniq -c | sort -rn | sed 3q
3 one
2 four
1 two
Thanks.
PS: Anybody else noticed the bug in this script on page 10 of https://www.cs.tufts.edu/~nr/cs257/archive/don-knuth/pearls-2.pdf ?
In the quoted paper, I think you are misreading
sed ${1}q
as
sed ${1} q
and sed does not consider 3 by itself a valid command; the separate argument q is treated as an input file name. If the value of $1 had been a valid sed script on its own, you would likely have gotten an error about the missing input file q instead.
Proper shell programming would dictate this be written as
sed "${1}q"
or
sed "${1} q"
instead; with the space as part of the script, sed correctly outputs the first $1 lines of input and exits.
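A quick way to see the difference with GNU sed (the error text matches the one reported in the question):
$ printf 'a\nb\nc\nd\n' | sed 3q
a
b
c
$ printf 'a\nb\nc\nd\n' | sed 3 q
sed: -e expression #1, char 1: missing command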
It's somewhat curious that the authors used sed instead of head -n "$1" to output the first few lines, as one of them (McIlroy) essentially invented the idea of the Unix pipeline as a series of special-purpose, narrowly focused tools. Not having read the full paper, I don't know what Knuth and McIlroy's contributions to the paper were; perhaps Bentley just likes sed. :)
When running the following command:
$ echo " one two two three three three" | ./countscript.sh 3
the special variable $1 will be replaced by 3, your first argument. Hence, the script runs:
tr -cs A-Za-z '\n' | tr A-Z a-z | sort | uniq -c | sort -rn | sed 3 q
Notice the space between the 3 and the q. sed does not know what to do, because you give it no command (3 is not a command).
Remove the space, and you should be fine.
tr -cs A-Za-z '\n' | tr A-Z a-z | sort | uniq -c | sort -rn | sed "${1}q"
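For completeness, a sketch of countscript.sh with only that quoting change applied:
#!/bin/bash
echo "Running" $0
tr -cs A-Za-z '\n' | tr A-Z a-z | sort | uniq -c | sort -rn | sed "${1}q"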
I'm trying to write a simple shell script in linux that creates directories with random names.
The names must be made from today's date followed by a random string,
like in this example:
2018-02-22y2Fdv9zzLVLupkl9El0dWalJAGTROLxE
This is the shell script
#!/bin/bash
# the date
DATAOGGI= echo -n $(date +"%Y-%m-%d")
# random string
RANDOM_STRING=$(cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w 32 | head -n 1)
# the dir
NEW_DIR=$(echo -n ${DATAOGGI}${RANDOM_STRING})
echo $NEW_DIR
mkdir $NEW_DIR
Unfortunately, even though the variable NEW_DIR looks correct
echo $NEW_DIR -> 2018-02-22y2Fdv9zzLVLupkl9El0dWalJAGTROLxE
the name of the directory is
y2Fdv9zzLVLupkl9El0dWalJAGTROLxE
try just:
#!/bin/bash
DATAOGGI=$(date +"%Y-%m-%d")
RANDOM_STRING=$(cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w 32 | head -n 1)
mkdir "${DATAOGGI}${RANDOM_STRING}"
Apart from the fact that it is not necessary in this example, echo -n AFAIK has very inconsistent behavior across implementations, and it is advised to use printf instead. Note also that DATAOGGI= echo -n $(date +"%Y-%m-%d") never assigns anything to DATAOGGI: it only sets an empty DATAOGGI in echo's environment and prints the date to stdout (without a newline), which is why the later echo output looks right while mkdir only sees the random string.
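If you do want to build the name in a variable first, one option is bash's printf -v, which assigns the formatted string to a variable instead of printing it; this is just a sketch along the lines of the script above:
#!/bin/bash
DATAOGGI=$(date +"%Y-%m-%d")
RANDOM_STRING=$(tr -dc 'a-zA-Z0-9' < /dev/urandom | fold -w 32 | head -n 1)
printf -v NEW_DIR '%s%s' "$DATAOGGI" "$RANDOM_STRING"   # assigns to NEW_DIR, prints nothing
echo "$NEW_DIR"
mkdir "$NEW_DIR"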
I had a job to perform that involved:
grep lines from a log
find a number in the line
perform basic arithmetic on the number (say, number + 1234)
The final result is a bunch of numbers separated by a newline.
If the input was:
1000
2000
3000
Then the required output was:
2234
3234
4234
I ended up with the following command:
cat log.txt | grep "word" | cut -d'|' -f7 | cut -d' ' -f5 | xargs -n 1 bash -c 'echo $(($1 + 1234))' args
I found the xargs -n 1 bash -c 'echo $(($1 + 1234))' args snippet in an answer to this question, but I don't understand the need for the final args argument that is passed in. I can change it to anything (args could just as well be blah), but if I omit it, the arithmetic fails and the output is the numbers unchanged:
1000
2000
3000
Could anyone shed some light on why args is a required argument to bash -c?
A simple awk command can do the same - in a clean way:
awk -F'|' '/word/{split($7,a," "); print a[5]+1234}' log.txt
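With a made-up log line shaped the way the cut chain expects (fields separated by |, the number in the 5th space-separated word of field 7), it behaves like this:
$ echo 'a|b|c|d|e|f|word x y z 1000' | awk -F'|' '/word/{split($7,a," "); print a[5]+1234}'
2234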
From man bash:
-c      If the -c option is present, then commands are read from the first non-option argument command_string. If there are arguments after the command_string, they are assigned to the positional parameters, starting with $0.
So, for your case, 'args' is a placeholder that goes in $0, making your actual input go in $1.
You should be able to alter your command to:
grep "word" log.txt | cut -d'|' -f7 | cut -d' ' -f5 | xargs -n 1 bash -c 'echo $(($0 + 1234))'
I have a command line tool which receives two arguments:
TOOL arg1 -o arg2
I would like to invoke it with the same value provided for both arg1 and arg2, and to make that easy for me, I thought I would do:
each <arg1_value> | TOOL $1 -o $1
but that doesn't work: $1 is not replaced, but is instead appended once to the end of the command line.
An explicit example, performing:
cp fileA fileA
returns an error fileA and fileA are identical (not copied)
While performing:
echo fileA | cp $1 $1
returns the following error:
usage: cp [-R [-H | -L | -P]] [-fi | -n] [-apvX] source_file target_file
cp [-R [-H | -L | -P]] [-fi | -n] [-apvX] source_file ... target_directory
any ideas?
If you want to use xargs, the [-I] option may help:
-I replace-str
        Replace occurrences of replace-str in the initial-arguments with names read from standard input. Also, unquoted blanks do not terminate input items; instead the separator is the newline character. Implies -x and -L 1.
Here is a simple example:
mkdir test && cd test && touch tmp
ls | xargs -I '{}' cp '{}' '{}'
This returns the error cp: tmp and tmp are the same file
The xargs utility will duplicate its input stream to replace all placeholders in its argument if you use the -I flag:
$ echo hello | xargs -I XXX echo XXX XXX XXX
hello hello hello
The placeholder XXX (may be any string) is replaced with the entire line of input from the input stream to xargs, so if we give it two lines:
$ printf "hello\nworld\n" | xargs -I XXX echo XXX XXX XXX
hello hello hello
world world world
You may use this with your tool:
$ generate_args | xargs -I XXX TOOL XXX -o XXX
Where generate_args is a script, command or shell function that generates arguments for your tool.
The reason
each <arg1_value> | TOOL $1 -o $1
did not work, apart from each not being a command that I recognise, is that $1 expands to the first positional parameter of the current shell or function.
The following would have worked:
set -- "arg1_value"
TOOL "$1" -o "$1"
because that sets the value of $1 before calling your tool.
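Using the cp example from the question, with echo in front so nothing is actually copied:
$ set -- fileA
$ echo cp "$1" "$1"
cp fileA fileA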
You can re-run a shell to perform variable expansion, with sh -c. The -c option takes an argument which is the command to run in a shell, performing expansion. The next arguments of sh will be interpreted as $0, $1, and so on, for use in the -c command. For example:
sh -c 'echo $1, i repeat: $1' foo bar baz will execute echo $1, i repeat: $1 with $1 set to bar ($0 is set to foo and $2 to baz), finally printing bar, i repeat: bar.
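Run literally, that looks like:
$ sh -c 'echo $1, i repeat: $1' foo bar baz
bar, i repeat: bar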
$1, $2 ... $N are only visible inside a shell script (or function), where they refer to the arguments passed to it, so they won't work the way you want them to here. Piping redirects stdout to stdin and is not what you are looking for either.
If you just want a one-liner, use something like
ARG1=hello && tool $ARG1 $ARG1
Using GNU parallel to use STDIN four times, to print a multiplication table:
seq 5 | parallel 'echo {} \* {} = $(( {} * {} ))'
Output:
1 * 1 = 1
2 * 2 = 4
3 * 3 = 9
4 * 4 = 16
5 * 5 = 25
One could encapsulate the tool using awk:
$ echo arg1 arg2 | awk '{ system("echo TOOL " $1 " -o " $2) }'
TOOL arg1 -o arg2
Remove the echo within the system() call and TOOL should be executed in accordance with requirements:
echo arg1 arg2 | awk '{ system("TOOL " $1 " -o " $2) }'
Double up the data from a pipe, and feed it to a command two at a time, using sed and xargs:
seq 5 | sed p | xargs -L 2 echo
Output:
1 1
2 2
3 3
4 4
5 5
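To feed those pairs to the TOOL from the question (TOOL being the hypothetical command, and echo keeping this a dry run), one could combine this with the bash -c idea from earlier:
seq 5 | sed p | xargs -L 2 bash -c 'echo TOOL "$0" -o "$1"'
which prints TOOL 1 -o 1 through TOOL 5 -o 5.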
Right now I have a working script that passes 2 arguments to a shell script. The script basically takes a ticket# and an SVN URL as arguments on the command line and outputs all the revisions associated with that ticket# (found in the svn log comments).
#!/bin/sh
jira_ticket=$1
src_url=$2
revs=(`svn log $2 --stop-on-copy | grep -B 2 $1 | grep "^r" | cut -d"r" -f2 | cut -d" " -f1| sort`)
for revisions in ${!revs[*]}
do
printf "%s %s\n" ${revs[$revisions]}
done
Output:
4738
4739
4743
4744
4745
I need some help passing an array of arguments, meaning more than one ticket#, and outputting the revisions associated with all the ticket numbers passed as args to the script.
POSIX shell doesn't have arrays, so don't use plain sh; use #!/bin/bash explicitly.
I would put the URL as the first arg, and all the rest are tickets:
#!/bin/bash
revs=()
src_url=$1
svn_log=$(svn log "$src_url" --stop-on-copy)
shift
for jira_ticket in "$@"; do
    revs+=( $(grep -B 2 "$jira_ticket" <<< "$svn_log" | grep "^r" | cut -d"r" -f2 | cut -d" " -f1) )
done
for revisions in $( printf "%s\n" "${!revs[@]}" | sort )
do
    printf "%s %s\n" ${revs[$revisions]}
done
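Assuming the script is saved as, say, revs.sh (the URL and ticket IDs below are just placeholders), it would then be invoked as:
./revs.sh https://svn.example.com/repo/trunk JIRA-101 JIRA-102 JIRA-103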