I have a file, testFile, with two fields per line separated by a pipe.
vi testFile
1|A
2|B
3|C
4|D
5|E
I am building a map from it and iterating over it in a loop. The following works locally:
while IFS='|' read -r NUM CHAR
do
export MAP[$NUM]=$CHAR
done < testFile
for i in ${!MAP[@]}
do
echo "$i ${MAP[$i]}"
done
But when I ssh to another machine and run the loop, I get:
./test.sh[11]: syntax error at line 20: '!' unexpected
The following does not work:
ssh someUser@someHost << EOF
for i in ${!MAP[@]}
do
echo "$i ${MAP[$i]}"
done
EOF
How do I use MAP on the remote machine?
NOTE: testFile is not a fixed file; I generate it from an SQL query, so its contents vary on every run.
You can try this:
#!/bin/ksh
while IFS='|' read -r NUM CHAR
do
export MAP[$NUM]=$CHAR
done < testFile
for i in "${!MAP[#]}"
do
echo "$i "${MAP[$i]}""
done
ssh someUser@someHost <<EOF
eval `typeset -p MAP`
for i in "\${!MAP[#]}"
do
echo "\$i "\${MAP[\$i]}""
done
EOF
eval : evaluates the typeset -p output on the remote side, recreating MAP there.
typeset -p : prints the variable's attributes and current value as a declaration that can be re-executed.
\$ : the backslash prevents expansion on the local side, so the variable is expanded on the remote side instead.
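For illustration, typeset -p MAP prints a declaration that can be replayed remotely; on ksh93 the output looks roughly like this (the exact format may vary by version):
$ typeset -p MAP
typeset -a MAP=([1]=A [2]=B [3]=C [4]=D [5]=E)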
Test :
$ ksh test.ksh
1 A
2 B
3 C
4 D
5 E
user@localhost's password:
1 A
2 B
3 C
4 D
5 E
Related
I have the following lines in my script; the ${snap[@]} array contains my list of ssh servers.
while IFS= read -r con; do
ssh foo#"$con" /bin/bash <<- EOF
echo "Current server is $con"
EOF
done <<< "${snap[#]}"
I want to print current iteration value of the array as the ssh ran successfully, the $con should print current ssh server --> example#server. How would I do that ?
If the elements in snap are the hosts that you want to connect to, just use a for loop:
for con in "${snap[#]}"; do
# connect to "$con"
done
"${snap[#]}" expands to the safely-quoted list of elements in the array snap, suitable for use with for.
If you really want to use while, then you can do something like this:
i=0
while [ $i -lt ${#snap[@]} ]; do # while i is less than the length of the array
# connect to "${snap[i]}"
i=$(( i + 1 )) # increment i
done
But as you can see, it's more awkward than the for-based approach.
Like this :
while IFS= read -r con; do
ssh "foo#$con" /bin/bash <<EOF
echo "Current server is $con"
EOF
done < <(printf '%s\n' "${snap[@]}")
#    ^^^^^^^^
#    bash process substitution: < <( ... )
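The printf is what puts each element on its own line, so read gets one host per iteration; a herestring with "${snap[@]}" would join all elements into a single line. A small sketch with hypothetical hosts:
snap=(host1 host2)
cat <<< "${snap[@]}"         # one line:  host1 host2
printf '%s\n' "${snap[@]}"   # two lines: host1, then host2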
Or simply :
for server in "${snap[@]}"; do
ssh "foo@$server" /bin/bash <<EOF
echo "Current server is $server"
EOF
done
I would like to run a bash script with a watchdog function launched in a sub thread that will stop my program when a given variable reaches a given value. The variable is incremented in the main thread.
var=0
function watchdog()
{
if [[ $var -ge 3 ]]; then
echo "Error"
fi
}
{ watchdog;} &
# main program loop
((var++))
The problem with this code is that $var stays at 0. I also tried without the {} around the watchdog call, with the same result.
Is my code style good?
You cannot share variables between processes in bash, and it does not support multi-threading. So you need a form of Inter-Process Communication. One of the simplest is to use a named pipe, also known as a FIFO.
Here is an example:
pipe='/tmp/mypipe'
mkfifo "$pipe"
var=0
# Your definition is not strictly correct (although it will work)
watchdog()
{
# Note the loop
while read var
do
if (( var >= 3 )) # a better way to do numeric comparisons
then
echo "Error $var"
else
echo "$var"
fi
sleep 2 # to prevent CPU hogging
done
}
watchdog < "$pipe" & # No need for a group
# main program loop - ??? I see no loop
((var++))
echo "$var" > "$pipe"
((var++))
echo "$var" > "$pipe"
((var++))
echo "$var" > "$pipe"
echo "waiting"
wait
rm "$pipe"
Example run:
$ bash gash.sh
1
waiting
2
Error 3
However I really don't see the point in using a separate process. Why not just call a function to test the value after each change?
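For instance, a minimal sketch of that simpler approach, checking inline after each change instead of in a second process:
var=0
check_var() {
    if (( var >= 3 )); then
        echo "Error: var reached $var" >&2
        exit 1
    fi
}
((var++)); check_var
((var++)); check_var
((var++)); check_var   # the script stops here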
If you run your bash script with a . (dot) in front of it, it will run in the same environment and can change existing variables. Look at this:
$ cat test.sh
#!/usr/bin/env bash
a=12
echo $a
$ a=1
$ echo $a
1
$ ./test.sh
12
$ echo $a
1
$ . ./test.sh
12
$ echo $a
12
After I run . ./test.sh, the variable $a has been changed by the script.
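In bash, source is a synonym for the dot command, so the last call can equivalently be written as:
$ source ./test.sh
12
$ echo $a
12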
I know I can do something like
cat <(cat somefile)
But I want to build up a string of <().
So:
for file in *.file; do
mySubs="${mySubs} <(cat ${file})"
done
cat ${mySubs} #cat <(cat 1.file) <(cat 2.file) ... <(cat something.file)
Without having to use eval.
Use named pipes directly. Use mktemp to create temporary file names for each pipe so that you can remove them after you are done.
pipes=()
for f in file1 file2 file3; do
t=$(mktemp -u)   # -u: generate a name without creating the file, so mkfifo can create the pipe
mkfifo "$t"
pipes+=("$t")
someCommand "$f" > "$t" &
done
someOtherCommand "${pipes[@]}"
rm "${pipes[@]}"
I'm assuming cat is a stand-in for a more complicated command. Here, I'm explicitly wrapping it to show that:
#!/usr/bin/env bash
someCommand() { echo "Starting file $1"; cat "$1"; echo "Ending file $1"; }
wrap_all() {
## STAGE 1: Assemble the actual command we want to run
local fd cmd_len retval
local -a cmd fds fd_args
cmd_len=$1; shift
while (( cmd_len > 0 )); do
cmd+=( "$1" )
cmd_len=$((cmd_len - 1))
shift
done
## STAGE 2: Open an instance of someCommand for each remaining argument
fds=( )
for arg; do
exec {fd}< <(someCommand "$arg")
fds+=( "$fd" )
fd_args+=( "/dev/fd/$fd" )
done
## STAGE 3: Actually run the command
"${cmd[#]}" "${fd_args[#]}"; retval=$?
## STAGE 4: Close all the file descriptors
for fd in "${fds[#]}"; do
exec {fd}>&-
done
return "$retval"
}
Invocation as:
echo "one" >one.txt; echo "two" >two.txt
wrap_all 1 cat one.txt two.txt
...which outputs:
Starting file one.txt
one
Ending file one.txt
Starting file two.txt
two
Ending file two.txt
Note that this requires bash 4.1 for automatic FD allocation support (letting us avoid the need for named pipes).
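On older bash you could fall back to fixed descriptor numbers instead of automatic {fd} allocation; a rough sketch of the same pattern:
exec 3< <(someCommand one.txt)
exec 4< <(someCommand two.txt)
cat /dev/fd/3 /dev/fd/4
exec 3<&- 4<&-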
I want to create a function locally, echo_a in the example, and pass it to a remote shell through ssh, here with typeset -f. The problem is that the function does not have access to the local variables.
export a=1
echo_a() {
echo a: $a
}
bash <<EOF
$(typeset -f echo_a)
echo local heredoc:
echo_a
echo
echo local raw heredoc:
echo a: $a
echo
EOF
ssh localhost bash <<EOF
$(typeset -f echo_a)
echo remote heredoc:
echo_a
echo
echo remote raw heredoc:
echo a: $a
echo
EOF
Assuming the ssh connection is automatic, running the above script gives me this output:
local heredoc:
a: 1
local raw heredoc:
a: 1
remote heredoc:
a:
remote raw heredoc:
a: 1
See how the "remote heredoc" a is empty? What can I do to get 1 there?
I tested adding quotes and backslashes everywhere without success.
What am I missing? Would something else than typeset make this work?
Thanks to @Guy for the hint: it is indeed because ssh by default does not send environment variables. In my case, changing the server's settings was not an option.
Fortunately, we can hack around this using compgen, eval and declare.
First we identify the added variables generically. This works even if variables are created inside a called function. Using compgen is neat because we don't need to export variables explicitly.
The array diff code comes from https://stackoverflow.com/a/2315459/1013628 and the compgen trick from https://stackoverflow.com/a/16337687/1013628.
# Store in env_before all variables created at this point
IFS=$'\n' read -rd '' -a env_before <<<"$(compgen -v)"
a=1
# Store in env_after all variables created at this point
IFS=$'\n' read -rd '' -a env_after <<<"$(compgen -v)"
# Store in env_added the diff betwen env_after and env_before
env_added=()
for i in "${env_after[#]}"; do
skip=
for j in "${env_before[#]}"; do
[[ $i == $j ]] && { skip=1; break; }
done
if [[ $i == "env_before" || $i == "PIPESTATUS" ]]; then
skip=1
fi
[[ -n $skip ]] || env_added+=("$i")
done
echo_a() {
echo a: $a
}
env_added now holds an array of the names of all variables added between the two calls to compgen.
$ echo "${env_added[@]}"
a
I also filter out the variables env_before and PIPESTATUS, as they are added automatically by bash.
Then, inside the heredocs, we add eval $(declare -p "${env_added[@]}").
declare -p VAR [VAR ...] prints, for each VAR, the variable name followed by = followed by its value:
$ a=1
$ b=2
$ declare -p a b
declare -- a="1"
declare -- b="2"
And the eval is to actually evaluate the declare lines. The rest of the code looks like:
bash <<EOF
# Eval the variables computed earlier
eval $(declare -p "${env_added[#]}")
$(typeset -f echo_a)
echo local heredoc:
echo_a
echo
echo local raw heredoc:
echo a: $a
echo
EOF
ssh rpi_301 bash <<EOF
# Eval the variables computed earlier
eval $(declare -p "${env_added[#]}")
$(typeset -f echo_a)
echo remote heredoc:
echo_a
echo
echo remote raw heredoc:
echo a: $a
echo
EOF
Finally, running the modified script gives me the wanted behavior:
local heredoc:
a: 1
local raw heredoc:
a: 1
remote heredoc:
a: 1
remote raw heredoc:
a: 1
I would like to expand a little more on the "Bash - How to pass arguments to a script that is read via standard input" post.
I would like to create a script that takes standard input and runs it remotely while passing arguments to it.
Simplified contents of the script that I'm building:
ssh server_name bash <&0
How do I take the following method of accepting arguments and apply it to my script?
cat script.sh | bash /dev/stdin arguments
Maybe I am doing this incorrectly, please provide alternate solutions as well.
Try this:
cat script.sh | ssh some_server bash -s - <arguments>
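Here <arguments> is a placeholder for the actual arguments. For example, with a hypothetical script and host:
$ cat script.sh
echo "first: $1, second: $2"
$ cat script.sh | ssh some_server bash -s - foo bar
first: foo, second: bar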
ssh shouldn't make a difference:
$ cat do_x
#!/bin/sh
arg1=$1
arg2=$2
all_cmdline=$*
read arg2_from_stdin
echo "arg1: ${arg1}"
echo "arg2: ${arg2}"
echo "all_cmdline: ${all_cmdline}"
echo "arg2_from_stdin: ${arg2_from_stdin}"
$ echo 'a b c' > some_file
$ ./do_x 1 2 3 4 5 < some_file
arg1: 1
arg2: 2
all_cmdline: 1 2 3 4 5
arg2_from_stdin: a b c
$ ssh some-server do_x 1 2 3 4 5 < some_file
arg1: 1
arg2: 2
all_cmdline: 1 2 3 4 5
arg2_from_stdin: a b c
This variant on ccarton's answer also seems to work well:
ssh some_server bash -s - < script.sh <arguments>
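For example, with a hypothetical script and host:
$ echo 'echo "got: $1 $2"' > script.sh
$ ssh some_server bash -s - one two < script.sh
got: one two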