Passing multiple commands to script calling time - bash

I have a bash script, let's say foo.sh, that in this minimal example looks like this:
#!/bin/bash
function __measure_time {
time "$#"
}
__measure_time "$*"
What I want to do now is pass two commands to this script that are supposed to run one after another, and I want to measure the time. So basically I am doing something like:
./foo.sh bash -c "echo 'ss' && sleep 1"
But that doesn't work the way I want it to. I don't get the 'ss' from the echo and the sleep is basically being ignored. Is there a way to make this work?

If you want the arguments to pass through correctly, you need to call __measure_time with "$@", not "$*":
#!/bin/bash
__measure_time() { #why the `function` syntax when you can be POSIX?
time "$#"
}
__measure_time "$#"
"$*" joins all arguments on the first character of $IFS into a string.
"$#" is magic for "give me all the arguments, as if each was separately quoted."

Related

Command not found with first positional argument used more than once

I have a script in bash which basically creates a user and installs all the necessary applications.
It works by iterating through a couple of commands, with a variable (a positional argument) placed at the end of each command.
I've set it up this way:
function Install
{
COMMANDS=(
"Do_1st_thing $1"
"Do_2nd_thing $1"
"Do_3rd_thing $1"
)
for CMD in "${COMMANDS[#]}" ; do
$CMD
done
}
Then I run it
Install first_argument
The problem is that the first command is successful, but every subsequent command says "Command not found".
Does the first positional argument ($1) change after the execution of the first command?
Would I have to "eval" the "$CMD" in the "for loop" to get it working?
Feel free to ask any question, I will do my best to answer them.
Thank you for your help,
Kris
You are declaring an array with the first argument hard-coded in. If $1 is "foo" you are declaring
COMMANDS=(
"Do_1st_thing foo"
"Do_2nd_thing foo"
"Do_3rd_thing foo"
)
Storing these commands in an array seems like a weird thing to do anyway. Just
Install () {
Do_1st_thing "$#"
Do_2nd_thing "$#"
Do_3rd_thing "$#"
}
If your commands don't all accept the same arguments, you need to refactor the code, but that seems to be outside the scope of your concrete question here.
If they do, you might also consider refactoring into
commands=(Do_1st_thing Do_2nd_thing Do_3rd_thing)
for cmd in "${commands[#]}"; do
"$cmd" "$#"
done
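For example, with that loop as the body of Install and hypothetical stand-ins for the three commands, you can check that every argument reaches each command intact:
Do_1st_thing() { printf '1st:'; printf ' <%s>' "$@"; echo; }   # hypothetical stand-in
Do_2nd_thing() { printf '2nd:'; printf ' <%s>' "$@"; echo; }   # hypothetical stand-in
Do_3rd_thing() { printf '3rd:'; printf ' <%s>' "$@"; echo; }   # hypothetical stand-in
Install "first argument" "a b"
# 1st: <first argument> <a b>
# 2nd: <first argument> <a b>
# 3rd: <first argument> <a b>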
(Notice also Correct Bash and shell script variable capitalization)
Maybe see also http://mywiki.wooledge.org/BashFAQ/050
As this is a bash function, you don't need the keyword function to declare it. You would therefore write the code as below:
#!/bin/bash
Install()
{
COMMANDS=(
"ls $1"
"stat $1"
"file $1"
)
for CMD in "${COMMANDS[#]}" ; do
$CMD
done
}
Install testfile.txt

How to pass multiple commands to a single command in bash using &&?

Currently I have a script that does some extra processing, but ultimately calls the command the user passed (FYI, this is to run some commands in a docker container, but we'll call it foo.sh):
#!/usr/bin/env bash
# ...
runner "$#"
This works great (e.g. foo.sh echo HI), until the user wants to pass multiple commands to be run:
e.g.: foo.sh echo HI && echo BYE
&& is of course interpreted by the ambient shell before being passed into arguments.
Is there a workaround or means of escaping && that might work?
An idiom that often comes in handy for this kind of case:
cmds_q='true'
add_command() {
local new_cmd
printf -v new_cmd '%q ' "$@"
cmds_q+=" && $new_cmd"
}
add_command echo HI
add_command echo BYE
runner bash -c "$cmds_q"
The big advantage here is that add_command can be called with arbitrary arguments (after the first one defining the command to run, of course) with no risk of those arguments being parsed as syntax / used in injection attacks, so long as the caller never directly modifies cmds_q.
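For instance, an argument full of shell metacharacters stays a single word; a quick check (the exact %q escaping may vary cosmetically between bash versions):
add_command echo 'untrusted; rm -rf /'
printf '%s\n' "$cmds_q"
# true && echo HI  && echo BYE  && echo untrusted\;\ rm\ -rf\ /
The doubled spaces come from the trailing space in the '%q ' format; they're harmless when bash -c re-parses the string.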

All arguments into files with correct quoting using "$@"

I need my bash script to cat all of its parameters into a file. I tried to use cat for this because I need to add a lot of lines:
#!/bin/sh
cat > /tmp/output << EOF
I was called with the following parameters:
"$#"
or
$#
EOF
cat /tmp/output
Which leads to the following output
$./test.sh "dsggdssgd" "dsggdssgd dgdsdsg"
I was called with the following parameters:
"dsggdssgd dsggdssgd dgdsdsg"
or
dsggdssgd dsggdssgd dgdsdsg
I want neither of these two things: I need the exact quoting that was used on the command line. How can I achieve this? I always thought $@ does everything right with regard to quoting.
Well, you are right that "$@" has the args including the whitespace in each arg. However, since the shell performs quote removal before executing a command, you can never know how exactly the args were quoted (e.g. whether with single or double quotes, backslashes, or any combination thereof). But you shouldn't need to know, since all you should care about are the argument values.
Placing "$#" in a here-document is pointless because you lose the information about where each arg starts and ends (they're joined with a space inbetween). Here's a way to see just this:
$ cat test.sh
#!/bin/sh
printf 'I was called with the following parameters:\n'
printf '"%s"\n' "$#"
$ ./test.sh "dsggdssgd" "dsggdssgd dgdsdsg"
I was called with the following parameters:
"dsggdssgd"
"dsggdssgd dgdsdsg"
Try:
#!/bin/bash
for x in "$#"; do echo -ne "\"$x\" "; done; echo
To see what's interpreted by Bash, use:
bash -x ./script.sh
or add this to the beginning of your script:
set -x
You might want to add this to the parent script.
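For example, tracing the printf version of test.sh shown above (trace formatting can differ slightly between bash versions):
$ bash -x ./test.sh "dsggdssgd" "dsggdssgd dgdsdsg"
+ printf 'I was called with the following parameters:\n'
I was called with the following parameters:
+ printf '"%s"\n' dsggdssgd 'dsggdssgd dgdsdsg'
"dsggdssgd"
"dsggdssgd dgdsdsg"
The trace makes it obvious that the second printf receives two separate arguments, one of which contains a space.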

put awk or grep output to command line arguments in bash

I'm pretty new to shell programming and I'm trying to write a shell script that assigns grep or awk pattern-filtering output to a command line parameter in the bash shell.
a.sh
source ./b.sh
which calls a function like: a parameter1 parameter2 (a is the function name)
b.sh
function a {
$2=grep -ai "some string" a.txt(parameter 1)
echo "$2"
}
I want to do something like this, but it won't let me.
Is this even possible?
In bash, you cannot set positional parameters in a way that the caller can read that value. If you want to 'return' a string from a function, you must write it to stdout, like so:
function myfunc()
{
echo "test"
}
VAR=$(myfunc)
When the above code is run, VAR will contain the string 'test'.
For reference questions, look at the man pages; for example, man bash, man grep, etc. For shell internals like function, there's a bash built-in with similar functionality called help; for example, help function.
To set positional parameters, you can use the built-in set. For example, set -- "a b" "c d" sets $1 to a b and $2 to c d.
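As a sketch of how that could apply here (assuming a.txt exists; note the deliberately unquoted substitution, which word-splits the output on whitespace):
set -- $(grep -ai "some string" a.txt)
echo "$1"   # first whitespace-separated word of the grep output
echo "$2"   # second word, and so on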
For a pragmatic introduction to bash programming see the Bash wiki. It's simply the best Bash resource out there.
You can't assign to positional parameters, but you can do something like this:
function myf {
#do something with $1,$2, etc
}
FOO=$(awk command)
BAR=$(other command)
myf $FOO $BAR #the function will use $FOO and $BAR as $1 and $2 positional parameters
So you can pass the content of those commands to the function myf through the use of variables (FOO and BAR) in this case.
You could even do it without intermediate variables by calling myf "$(some command)", but the way I wrote it improves readability.
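One caveat with the call above: unquoted expansions undergo word splitting, so if the command output can contain spaces, quote the variables to keep each one a single positional parameter:
myf "$FOO" "$BAR"   # $1 is the entire awk output, $2 the entire output of the other command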
Before you try function, try a script first.
#!/bin/sh
arg1=${1?'Missing argument'}
grep -ai "some string" "$arg1"
Then put this script in your ~/bin folder (make sure you have changed your PATH to include ~/bin).
Then just execute the script.
If you really need a function, then do
#!/bin/sh
b() {
grep -ai "some string" "$1"
}
b filename

How can my Bash script see a loop variable inside a command line argument?

I can't seem to "access" the value of my loop variable when executing a command line argument in a Bash script. I'd like to be able to write something like
#!/bin/bash
for myvar in 1 2 3
do
$@
done
and run the script as ./myscript echo "${myvar}".
When I do this, the lines are echoed as empty. I probably don't have a firm grasp on exactly what's being evaluated where.
Is what I want even possible?
$myvar is evaluated before the child script is even run, so it can't be evaluated within.
That is, when you invoke your script as:
./myscript echo "${myvar}"
what is actually being called is:
./myscript echo ''
presuming that $myvar is empty in the enclosing environment.
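You can watch this happen with the calling shell's trace mode (assuming myvar is unset in your interactive shell):
$ set -x
$ ./myscript echo "${myvar}"
+ ./myscript echo ''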
If you wanted to be evil (and this is evil, and will create bugs, and you should not do it), you could use eval:
#!/bin/bash
for myvar in 1 2 3; do
eval "$1"
done
...and then call as:
./myscript 'echo "${myvar}"'
Something less awful would be to export a variable:
#!/bin/bash
export myvar
for myvar in 1 2 3; do
"$#"
done
...but even then, the way you call the script will need to avoid premature evaluation. For instance, this will work in conjunction with the above:
./myscript bash -c 'echo "$myvar"'
...but that basically has you back to eval. On the other hand:
./myscript ./otherscript
...will do what you want, presuming that otherscript refers to $myvar somewhere within. (This need not be a shell script, or even a script; it can be an executable or other command, and it will still be able to find myvar in the environment).
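A minimal otherscript, purely hypothetical, just to illustrate:
#!/bin/bash
# reads the loop variable exported by the calling script
echo "myvar is currently: $myvar"
Running ./myscript ./otherscript then prints myvar is currently: 1, then 2, then 3.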
What's the real goal (i.e. business purpose) you're trying to achieve? There's probably a way to accomplish it that's better aligned with best practices.
