Global variables not acting like local variables? - bash

I have two bash scripts. Script1 does the following (doesn't matter why I'm using two scripts; just assume it's for a good reason):
export RUN=1
And script2:
. script1
echo ${RUN}
sed -n ${RUN}p mytext.txt > mytextnew.txt
In script 2, echo returns "1" as I expect. However, the sed command (or any other command I try and use the RUN variable with) returns an error, as if RUN doesn't exist. If I simply run script2 with the following:
RUN=2p
sed -n ${RUN} mytext.txt > mytextnew.txt
then everything works fine. This only happens with global variables. If I do the exact same thing with local variables that I do with global variables, everything works. But the minute a global variable is thrown in, everything goes haywire.
Any insight into the problem?

The following:
$ cat script1
export RUN=1
$ cat script2
. script1
echo ${RUN}
sed -n ${RUN}p /etc/passwd
when run with
$ bash -x script2
prints
+ . script1
++ export RUN=1
++ RUN=1
+ echo 1
1
+ sed -n 1p /etc/passwd
If you add a space, as in
sed -n ${RUN} p /etc/passwd
prints
sed: -e expression #1, char 1: missing command
Check the content of $RUN with
echo "${RUN}" | od -bc   # note the double quotes
# or
echo "${RUN}" | xxd
to see what your $RUN variable really contains...
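A common culprit here (an assumption on my part, since the question doesn't show the od output) is that script1 was saved with Windows-style CRLF line endings, so RUN actually contains "1" followed by a carriage return: echo still looks fine, but sed is handed 1<CR>p and complains. A minimal sketch of detecting and stripping it:
echo "${RUN}" | od -c          # output ending in 1 \r \n would confirm the stray carriage return
RUN=${RUN%$'\r'}               # strip one trailing carriage return, if present
sed -n "${RUN}p" mytext.txt > mytextnew.txt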

Related

Is there a way to print interpolated shell commands while preserving redirections?

Suppose I have the following shell program.
#!/bin/sh
FOO="foo"
echo $FOO | cat
I want to generate another shell program that does the same thing as this one, except that all shell variables have been substituted. For example,
#!/bin/sh
echo "foo" | cat
I know that I can get close if I run the above program using #!/bin/sh -x, but that output does not preserve redirections. Instead, I get
+ FOO=foo
+ echo foo
+ cat
foo
Any ideas?
The following shell script:
$ cat eval.sh
echo "#!/bin/sh"
FOO="foo"
echo "echo $FOO | cat"
will write a shell script:
$ sh eval.sh
#!/bin/sh
echo foo | cat
which does what you need.
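If the variable values can be supplied through the environment and GNU gettext's envsubst is installed, another hedged option is to treat the script as a template (template.sh and expanded.sh are made-up names for this sketch): envsubst replaces only the variables you list and copies everything else, pipes and redirections included, verbatim.
# template.sh contains the script body with its $FOO references
FOO="foo" envsubst '$FOO' < template.sh > expanded.sh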

invoke xterm and run command with variable

I would like to invoke an xterm with two commands, where the first command echoes some header message, followed by some other command (for this sample code I use the sleep command for simplicity). The exec command with "echo $msg1" doesn't print out any message. Please help me fix it.
#!/bin/csh -f
set msg1 = ""
set msg1 = "$msg1#[INFO] xx"
set msg1 = "$msg1#[INFO] yy"
# not okay
exec /usr/bin/xterm -e sh -c 'echo "$msg1" | tr "#" "\n" ;sleep 5'
# okay
exec /usr/bin/xterm -e sh -c 'echo hello;sleep 5'
exec /usr/bin/xterm -e sh -c 'echo hello#world | tr "#" "\n" ;sleep 5'
Variables don't work inside single quotes ('), only double quotes ("):
% set x = 'asdf'
% echo '$x'
$x
% echo "$x"
asdf
Right now, the sh process inside the xterm will see echo "$msg1", but it doesn't know about the $msg1 variable since that's local to the script, which is a different process.
You can adjust that command to something like:
exec /usr/bin/xterm -e sh -c "echo '$msg1' | tr '#' '\n' ; sleep 5"
But this won't work well if msg1 can contain a single quote or ends with a \. Quoting is complex, especially since you're dealing with two different shells (your script and the sh inside the xterm), each with its own quoting rules, so it's probably better to use an environment variable:
setenv msg1 "$msg1"
And then you can use the same command as you had above, since the environment variables are inherited by the child process.
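Putting the pieces together, a sketch of the script using the environment-variable route (keeping the #-to-newline trick from the question):
#!/bin/csh -f
set msg1 = ""
set msg1 = "$msg1#[INFO] xx"
set msg1 = "$msg1#[INFO] yy"
# export it so the sh that xterm starts can see it
setenv msg1 "$msg1"
exec /usr/bin/xterm -e sh -c 'echo "$msg1" | tr "#" "\n" ; sleep 5'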

Bash: execute and print command as I'd write it in the terminal

Having executed:
cd ~ && mkdir mytmp && cd mytmp/
echo > somefile
and doing this in a bash script mytmp/myscript.sh:
#!/bin/bash
# version 1
cmd="find . -type f -printf %f\n"
$cmd
renders the wanted result, i.e.:
somefile
myscript.sh
Notice that, for some reason, I don't need to surround %f\n with quotes as I would if I were typing the command in the terminal. Doing so gives a bad result:
#!/bin/bash
# version 2
cmd="find . -type f -printf '%f\n'"
$cmd
results in:
'somefile
''myscript.sh
'
I need to execute $cmd and at the same time print it as I'd write it in the terminal.
Adding echo $cmd in version 1 executes the command properly but echoes the command without quotes.
Adding echo $cmd in version 2 echoes the command with quotes, like I want, but the result of command execution is bad.
How can I achieve this?
Use set -v.
Example Script
I used a somewhat over-complicated script to test the output for quoting and so on.
#! /bin/bash
set -v
myVariable='test'
# a comment
echo "$(echo "$myVariable") two" | cat -
Output When Running The Script
$ ./myscript
myVariable='test'
# a comment
echo "$(echo "$myVariable") two" | cat -
echo "$myVariable"
test two
As we can see, quotes, variable names, and comments are retained, but commands from subshells will appear twice. Since you don't use any subshells, that shouldn't be a problem.
I actually had the answer to this question, but since I "accidentally" got what I wanted and haven't found the answer online after searching for it, I thought I'd post it here anyway.
The solution is to include the quotes as in version 2 and to get rid of them when executing the command by using eval:
#!/bin/bash
cmd="find . -type f -printf '%f\n'"
eval $cmd
echo $cmd
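A small variant of the same idea, as a sketch: quoting the expansions keeps the stored string intact for both the echo and the eval, so what is printed is exactly what gets run:
#!/bin/bash
cmd="find . -type f -printf '%f\n'"
echo "$cmd"    # prints the command as you would type it, single quotes included
eval "$cmd"    # eval re-parses the string, so the single quotes are honored and %f\n stays one argument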

Replacing 'source file' with its content, and expanding variables, in bash

In a script.sh,
source a.sh
source b.sh
CMD1
CMD2
CMD3
how can I replace the source *.sh with their content (without executing the commands)?
I would like to see what the bash interpreter executes after sourcing the files and expanding all variables.
I know I can use set -n -v or run bash -n -v script.sh 2>output.sh, but that would not replace the source commands (and even less if a.sh or b.sh contain variables).
I thought of using a subshell, but that still doesn't expand the source lines. I tried a combination of set +n +v and set -n -v before and after the source lines, but that still does not work.
I'm going to send that output to a remote machine using ssh.
I could use <<output.sh to pipe the content into the ssh command, but I can't log in as root on the remote machine; I am, however, a sudoer.
Therefore, I thought I could create the script and send it as a base64-encoded string (using that clever trick):
base64 script | ssh remotehost 'base64 -d | sudo bash'
Is there a solution?
Or do you have a better idea?
You can do something like this:
inline.sh:
#!/usr/bin/env bash
while IFS= read -r line; do
    if [[ "$line" =~ ^(\.|source)[[:space:]]+.+ ]]; then
        file="$(echo "$line" | cut -d' ' -f2)"
        cat "$file"
    else
        echo "$line"
    fi
done < "$1"
Note this assumes the sourced files exist, and doesn't handle errors. You should also handle possible hashbangs. If the sourced files themselves contain source commands, you need to apply the script recursively, e.g. something like this (not tested; note the temporary file, since redirecting straight onto main.sh would truncate it before it is read):
while egrep -q '^(source|\.)' main.sh; do
    bash inline.sh main.sh > main.sh.tmp && mv main.sh.tmp main.sh
done
Let's test it
main.sh:
source a.sh
. b.sh
echo cc
echo "$var_a $var_b"
a.sh:
echo aa
var_a="stack"
b.sh:
echo bb
var_b="overflow"
Result:
bash inline.sh main.sh
echo aa
var_a="stack"
echo bb
var_b="overflow"
echo cc
echo "$var_a $var_b"
bash inline.sh main.sh | bash
aa
bb
cc
stack overflow
BTW, if you just want to see what bash executes, you can run
bash -x [script]
or remotely
ssh user@host -t "bash -x [script]"

Using xargs to assign stdin to a variable

All that I really want to do is make sure everything in a pipeline succeeded and assign the last stdin to a variable. Consider the following dumbed down scenario:
x=`exit 1|cat`
When I run declare -a, I see this:
declare -a PIPESTATUS='([0]="0")'
I need some way to notice the exit 1, so I converted it to this:
exit 1|cat|xargs -I {} x={}
And declare -a gave me:
declare -a PIPESTATUS='([0]="1" [1]="0" [2]="0")'
That is what I wanted, so I tried to see what would happen if the exit 1 didn't happen:
echo 1|cat|xargs -I {} x={}
But it fails with:
xargs: x={}: No such file or directory
Is there any way to have xargs assign {} to x? What about other methods of having PIPESTATUS work and assigning the stdin to a variable?
Note: these examples are dumbed down. I'm not really doing an exit 1, echo 1 or a cat, but used these commands to simplify so we can focus on my particular issue.
When you use backticks (or the preferred $()) you're running those commands in a subshell. The PIPESTATUS you're getting is for the assignment rather than the piped commands in the subshell.
When you use xargs, it knows nothing about the shell so it can't make variable assignments.
Try set -o pipefail; then you can get the overall status from $?.
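A quick sketch with the dumbed-down pipeline from the question:
set -o pipefail
x=$(exit 1 | cat)
echo "$?"              # 1: the failing exit 1 stage is no longer hidden by cat's success
echo "${x:-empty}"     # "empty": the pipeline produced no output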
xargs runs in a child process, as do all the commands you call, so they can't affect the environment of your shell.
You might be able to do something with named pipes (mkfifo), or possibly bash's read builtin?
EDIT:
Maybe just redirect the output to a file, then you can use PIPESTATUS:
command1 | command2 | command3 >/tmp/tmpfile
## Examine PIPESTATUS
X=$(cat /tmp/tmpfile)
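A sketch of what the "## Examine PIPESTATUS" step might look like (the copy is needed because PIPESTATUS is overwritten by the very next command):
command1 | command2 | command3 >/tmp/tmpfile
status=("${PIPESTATUS[@]}")    # save it immediately
for s in "${status[@]}"; do
    if [ "$s" -ne 0 ]; then
        echo "a pipeline stage failed with status $s" >&2
        exit 1
    fi
done
X=$(cat /tmp/tmpfile)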
How about ...
read x <<<"$(echo 1)"
read x < <(echo 1)
echo "$x"
Why not just populate a new array?
IFS=$'\n' read -r -d '' -a result < <(echo a | cat | cat; echo "PIPESTATUS='${PIPESTATUS[*]}'" )
IFS=$'\n' read -r -d '' -a result < <(echo a | exit 1 | cat; echo "PIPESTATUS='${PIPESTATUS[*]}'" )
echo "${#result[#]}"
echo "${result[#]}"
echo "${result[0]}"
echo "${result[1]}"
There are already a few helpful solutions. It turns out that I actually had an example that matches the question as framed above; close-enough anyway.
Consider this:
XX=$(ls -l *.cpp | wc -l | xargs -I{} echo {})
echo $XX
3
Meaning that I had 3 .cpp files in my working directory. Now $XX is 3 and I can make use of that result in my script. It is contrived, because I don't actually need the xargs in this example. It works, though.
In the example from the question ...
x=`exit 1|cat`
I don't think that will give you what was specified. exit will quit the sub-shell before the cat gets a mention. Also on that note,
I might start with something like
declare -a PIPESTATUS='([0]="0")'
x=$?
x now has the status from the last command.
Assign each line of input to an array, e.g. all Python files in a directory:
declare -a pyFiles=($(ls -l *.py | awk '{print $9}'))
where $9 is the ninth field in ls -l, corresponding to the filename
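A side note (my addition, not part of the original answer): parsing ls -l breaks on filenames that contain spaces; a glob assigns the names directly and avoids that:
declare -a pyFiles=( *.py )         # one array element per file, whitespace-safe
echo "${#pyFiles[@]} Python files"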
