Putting awk or grep output into command-line arguments in bash

I'm pretty new to shell programming, and I'm trying to write a shell script that assigns the output of a grep or awk pattern filter to a positional parameter in bash.
a.sh
source ./b.sh
a parameter1 parameter2   # call the function with two parameters
b.sh
function a {
$2=$(grep -ai "some string" "$1")   # trying to assign grep's output to positional parameter $2; $1 is a.txt
echo "$2"
}
That's roughly what I want to do, but bash won't let me.
Is this even possible?

In bash, you cannot set positional parameters in a way that the caller can read the value. If you want to 'return' a string from a function, you must write it to stdout, like so:
function myfunc()
{
echo "test"
}
VAR=$(myfunc)
When the above code is run, VAR will contain the string 'test'.
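Applied to the original question, a minimal sketch (assuming a.txt exists and is passed as the first argument):
a() {
grep -ai "some string" "$1"   # write the matches to stdout
}
result=$(a a.txt)   # capture them with command substitution
echo "$result"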

For reference questions, look at the man pages; for example, man bash, man grep, etc. For shell keywords and built-ins like function there is a bash built-in with similar functionality called help; for example, help function.
To set positional parameters, you can use the built-in set. For example, set -- "a b" "c d" sets $1 to a b and $2 to c d.
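Tying this back to the question, a hedged sketch (assuming a.txt exists):
set -- "$(grep -ai "some string" a.txt)"   # $1 now holds all of grep's output as a single word
echo "$1"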
For a pragmatic introduction to bash programming see the Bash wiki. It's simply the best Bash resource out there.

You can't assign to positional parameters, but you can do something like this:
function myf {
#do something with $1,$2, etc
}
FOO=$(awk command)
BAR=$(other command)
myf "$FOO" "$BAR" # the function sees $FOO and $BAR as its $1 and $2 (quoting prevents word splitting)
So you can pass the output of those commands to the function myf through variables (FOO and BAR in this case).
You could even skip the intermediate variables and call myf "$(some command)" directly, but the way it is written above improves readability.
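A concrete, runnable instance of that pattern (the file name and patterns here are hypothetical):
myf() {
printf 'first: %s\n' "$1"
printf 'second: %s\n' "$2"
}
FOO=$(awk 'NR==1 {print $1}' a.txt)   # first field of the first line
BAR=$(grep -c "some string" a.txt)    # number of matching lines
myf "$FOO" "$BAR"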

Before you reach for a function, try a plain script first.
#!/bin/sh
arg1=${1?'Missing argument'}
grep -ai "some string" "$arg1"
And then put this script in your ~/bin folder (make sure you have changed your PATH variable to include ~/bin).
Then just execute the script.
If you really need a function, then do
#!/bin/sh
b() {
grep -ai "some string" "$1"
}
b filename
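As with the first answer, you can then capture the function's output with command substitution, e.g. (assuming a.txt exists):
matches=$(b a.txt)
echo "$matches"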

Related

Appending command line arguments to a Bash array

I am trying to write a Bash script that appends a string to a Bash array, where the string contains the path to a Python script together with the arguments passed into the Bash script, enclosed in double quotes.
If I call the script using ./script.sh -o "a b", I would like a CMD_COUNT of 1, but I am getting 2 instead.
script.sh:
#!/bin/bash
declare -a COMMANDS=()
COMMANDS+=("/path/to/myscript.py \"${#}\"")
CMD_COUNT=${#COMMANDS[*]}
echo $CMD_COUNT
How can I ensure that the appended string is /path/to/myscript.py "-o" "a b"?
EDIT: The full script is actually like this:
script.sh:
#!/bin/bash
declare -a COMMANDS=()
COMMANDS+=("/path/to/myscript2.py")
COMMANDS+=("/path/to/myscript.py \"${#}\"")
CMD_COUNT=${#COMMANDS[*]}
echo $CMD_COUNT
for i in ${!COMMANDS[*]}
do
echo "${0} - command: ${COMMANDS[${i}]}"
${COMMANDS[${i}]}
done
It's a bad idea, but if it's what you really want, printf %q can be used to generate a string that, when parsed by the shell, will result in a given list of arguments. (The exact escaping might not be identical to what you'd write by hand, but the effect of evaluating it with eval will be the same.)
#!/bin/bash
declare -a COMMANDS=( )
printf -v command '%q ' "/path/to/myscript" "$@"
COMMANDS+=( "$command" )
CMD_COUNT=${#COMMANDS[@]}
echo "$CMD_COUNT"
...but, as I said, this is all a bad idea.
Best-practice ways to encapsulate code as data in bash involve using functions, or arrays with one element per argument.
eval results in code that's prone to security bugs.
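A minimal sketch of the one-element-per-argument approach mentioned above (the script path is hypothetical):
cmd=( /path/to/myscript.py "$@" )   # one array element per argument; no quoting gymnastics
"${cmd[@]}"                         # runs the command with every argument intact, and no eval needed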

Passing multiple commands to script calling time

I have a bash script let's say foo.sh that in this minimal example looks like this:
#!/bin/bash
function __measure_time {
time "$#"
}
__measure_time "$*"
What I want to do now is pass two commands to this script that are supposed to run one after the other, and I want to measure the time. So basically I am doing something like:
./foo.sh bash -c "echo 'ss' && sleep 1"
But that doesn't work the way I want it to. I don't get the 'ss' from the echo and the sleep is basically being ignored. Is there a way to make this work?
If you want the arguments to pass through correctly, you need to call __measure_time with "$@", not "$*":
#!/bin/bash
__measure_time() { #why the `function` syntax when you can be POSIX?
time "$#"
}
__measure_time "$#"
"$*" joins all arguments on the first character of $IFS into a string.
"$#" is magic for "give me all the arguments, as if each was separately quoted."

How to evaluate bash function arguments as command with possible environment overrides?

How do you write a function in bash (I can rely on it being v4+) that, given words constituting a command with possible environment overrides, executes this command in the current shell?
For example, given
f cd src
f CXX="ccache gcc" make -k XOPTIONS="--test1 --test2"
the function f would do approximately the same thing as simply having these lines in the shell script without the f up front?
A few unsuccessful attempts:
This tries to evaluate the environment override CXX="ccache gcc" as a command:
f() { "$@" ; }
This loses word-quoting on all arguments, breaking single-word arguments on spaces:
f() { eval "$@" ; }
This handles the environment overrides, but runs the command in a subshell, since env(1) is not a bash builtin:
f() { env -- "$@" ; }
This question came up multiple times on SO and Unix SE, but I have never seen it asked about supporting all three important parts, namely: environment overrides; execution in the current shell; and correct handling of arguments containing spaces (and other characters that are lexically special to bash).
One thing I could potentially use is that environment overrides are rarely used with builtins (but cf. IFS= read ...), so I could select between the "$@" ; and eval -- "$@" ; patterns based on whether $1 is syntactically a variable assignment. But that, again, is not as simple as spotting a = in it, as the equal sign may be a quoted part of a command, albeit that is not likely sane. Still, I usually prefer correct code to mostly correct code, and this approach takes two consecutive leaps of faith.
Addressing a possible question of why I need a function replicating the default behavior of the shell ("just drop the f"): in reality, f() is more complex than just running a command, implementing a pattern that repeats in a few dozen places in the script; this is only the part I cannot get right.
If you can make eval see your arguments properly quoted, it should work. To this end, you can use the %q format specification of printf, which works as follows:
$ printf '%q ' CXX="ccache gcc" make -k XOPTIONS="--test1 --test2"
CXX=ccache\ gcc make -k XOPTIONS=--test1\ --test2
This would result in a function like
f () {
eval "$(printf '%q ' "$#")"
}
Notice that this appends an extra space at the end of the command, but this shouldn't hurt.
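With that definition, both of the original examples should behave as intended:
f cd src                                              # changes the directory of the current shell
f CXX="ccache gcc" make -k XOPTIONS="--test1 --test2" # the overrides survive the round trip through %q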
Tricky. You could do this, but it's going to pollute the environment of the shell:
f() {
# process any leading "var=value" assignments
while [[ $1 == ?*=* ]]; do
declare -x "$1"
shift
done
"$#"
}
Just did a quick test: the env vars declared in the function are still local to the scope of the function and will not actually pollute the script's environment.
$ f() {
declare -x FOO=bar
sh -c 'echo subshell FOO=$FOO'
echo function FOO=$FOO
}
$ unset FOO
$ f
subshell FOO=bar
function FOO=bar
$ echo main shell FOO=$FOO
main shell FOO=

Passing several values between two shell scripts

I have two shell scripts, a master script and a function script; the master calls the function script and tries to pass values to it.
Please note that the function script is interactive; i.e., it waits for the user's answers and acts according to them.
So to pass one value I can write the following:
echo "string" | ./function-script
The problem is that I have to pass several values. Any advice?
Can the "function-script" operate on positional parameters? If so, you'd call it like:
./function-script arg1 "argument 2" arg3
And then "function-script" would use "$1", "$2" and "$3" as required.
If "function-script" only takes input on stdin, do something like this:
printf "%s\n" arg1 "argument 2" arg3 | ./function-script
And "function-script" would do:
IFS= read -r arg1
IFS= read -r arg2
IFS= read -r arg3
Simple solution:
Don't try to pass multiple variables.
Just export each variable within the master script using the export a=1 syntax.
Then call the child script from the master like a regular script.
All the exported variables will be available in the child script.
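A minimal sketch of that approach (the variable names are hypothetical):
# master script
export answer1="yes" answer2="42"
./function-script   # the child can read $answer1 and $answer2 from its environment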
Use command line arguments.
./function-script "string" "another string"
If you pre-empt standard input by piping data into the function script, you make interactive operation of the function script hard.
You could instead export the variables as environment variables, but just as global variables in regular programming are not a good idea because their use is hidden, so too with environment variables.

Using spaces in bash scripts

I have a script foo.sh
CMD='export FOO="BAR"'
$CMD
echo $FOO
It works as expected
$ ./foo.sh
"BAR"
Now I want to change FOO variable to BAR BAR. So I get script
CMD='export FOO="BAR BAR"'
$CMD
echo $FOO
When I run it I expect to get "BAR BAR", but I get
./foo.sh: line 2: export: `BAR"': not a valid identifier
"BAR
How can I deal with that?
You should not use a variable as a command by simply expanding it (like your $CMD). Instead, use eval to evaluate a command stored in a variable; only then is a true evaluation step, with all the shell's parsing logic, performed:
eval "$CMD"
(And use double quotes to pass the command to eval.)
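With the script from the question, that looks like:
CMD='export FOO="BAR BAR"'
eval "$CMD"   # the quotes inside $CMD are now honored
echo "$FOO"   # prints: BAR BAR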
Just don't do that.
And read Bash FAQ #50
I'm trying to save a command so I can run it later without having to repeat it each time
If you want to put a command in a container for later use, use a
function. Variables hold data, functions hold code.
pingMe() {
ping -q -c1 "$HOSTNAME"
}
[...]
if pingMe; then ..
The proper way to do that is to use an array instead:
CMD=(export FOO="BAR BAR")
"${CMD[#]}"
