Bash script start docker container script & pass in arguments - bash

I have a bash script that runs command-line functions, and I need that script to run commands in a Docker container, pass arguments into the container, and eventually exit. However, I'm unable to have the script pass arguments into the Docker container. How can I do this?
This is what the docker commands look like without the bash script for reference.
$ docker exec -it rti_cmd
root@29c:/data# rti
187.0.0.1:9806> run_cmd
(integer) 0
187.0.0.1:9806> exit
root@29c:/data# exit
exit
Code snippet with two variations of attempts:
#!/bin/bash
docker exec -it rti_cmd bash<< eeee
rti
run_cmd
exit
exit
eeee
# also tried without the ";"
docker exec -it rti_cmd bash /bin/sh -c
"rti;
run_cmd;
exit;
exit"
Errors:
$ chmod +x test.sh
$ ./test.sh
the input device is not a TTY
/bin/sh: /bin/sh: cannot execute binary file
./test.sh: line 17: $'rti;\nrun_cmd;\nexit;\nexit': command not found

You don't need -i (interactive) or -t (tty) if you want to be non-interactive.
docker exec rti_cmd sh -c 'rti;run_cmd'
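So the whole thing can live in a small script. A minimal sketch, assuming the container name rti_cmd and the rti/run_cmd commands from the question:
#!/bin/bash
# Run the commands non-interactively inside the container and exit;
# no -i/-t needed since nothing is read from the terminal.
docker exec rti_cmd sh -c 'rti; run_cmd'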

Related

Bash script with -e not terminated when using piped script and docker

The following script.sh is executed:
#!/bin/bash
set -eu
# code ...
su buser
mkdir /does/not/work
echo $?
echo This should not be printed
Output:
1
This should not be printed
How I execute the script:
docker exec -i fancy_container bash < script.sh
Question: Why does the script not terminate after the failing command even though set -e was defined, and how can I get the script to exit on any failing command? I think the key point is the '<' operator, and I don't understand exactly how it makes the script execute.
Notes:
-e means: Abort script at first error, when a command exits with non-zero status (except in until or while loops, if-tests, list constructs)
Possible solution:
docker exec -i fancy_container bash -c "cat > tmp.sh; bash tmp.sh" < script.sh
How it works:
< script.sh - Pipe all the lines of this file from the host to the docker exec command.
cat > tmp.sh - Save the incoming piped content to a file inside the container.
bash tmp.sh - Execute the file as a whole inside the container, which means -e works as expected again!
But I still don't know why the initial approach isn't working.
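One plausible explanation (my assumption, based on the stdin behaviour described in the docker exec answer further down): when bash reads script.sh from stdin, su buser starts a shell that inherits that same stdin and reads the remaining lines itself, so they run in a shell where set -e was never enabled. A minimal sketch of that idea:
#!/bin/bash
set -eu
# Detach su's stdin so buser's shell sees EOF and exits immediately,
# leaving the remaining lines with the outer bash (note: they then run
# as the original user, not as buser).
su buser < /dev/null
mkdir /does/not/work   # fails, and set -e now aborts the script here
echo This should not be printed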

How do I pass multiple arguments to a shell script into `kubectl exec`?

Consider the following shell script, where POD is set to the name of a K8 pod.
kubectl exec -it $POD -c messenger -- bash -c "echo '$@'"
When I run this script with one argument, it works fine.
hq6:bot hqin$ ./Test.sh x
x
When I run it with two arguments, it blows up.
hq6:bot hqin$ ./Test.sh x y
y': -c: line 0: unexpected EOF while looking for matching `''
y': -c: line 1: syntax error: unexpected end of file
I suspect that something is wrong with how the arguments are passed.
How might I fix this so that arguments are expanded literally by my shell and then passed in as literals to the bash running in kubectl exec?
Note that removing the single quotes results in an output of x only.
Note also that I need the bash -c so I can eventually pass in file redirection: https://stackoverflow.com/a/49189635/391161.
I managed to work around this with the following solution:
kubectl exec -it $POD -c messenger -- bash -c "echo $*"
This appears to have the additional benefit that I can do internal redirects.
./Test.sh x y '> /tmp/X'
You're going to want something like this:
kubectl exec POD -c CONTAINER -- sh -c 'echo "$@"' -- "$@"
With this syntax, the command we're running inside the container is echo "$@". We then take the local value of "$@" and pass that as parameters to the remote shell, thus setting $@ in the remote shell.
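Put together, a sketch of Test.sh using that form (the pod name is a placeholder; the container name messenger comes from the question):
#!/bin/bash
# Placeholder pod name - set this to a real pod in your cluster.
POD=my-pod
# The host's "$@" after the trailing -- becomes the positional parameters
# of the remote sh, so echo "$@" prints exactly the arguments of Test.sh.
kubectl exec "$POD" -c messenger -- sh -c 'echo "$@"' -- "$@"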
On my local system:
bash-5.0$ ./Test.sh hello
hello
bash-5.0$ ./Test.sh hello world
hello world

Why is executing "docker exec" killing my SSH session?

Let's say I have two servers, A and B. I also have a bash script that is executed on server A that looks like this:
build_test.sh
#!/bin/bash
ssh user@B <<'ENDSSH'
echo "doing test"
bash -ex test.sh
echo "completed test"
ENDSSH
test.sh
#!/bin/bash
docker exec -i my_container /bin/bash -c "echo hi!"
The problem is that completed test does not get printed to the terminal.
Here's the output of running build_test.sh:
$ ./build_test
doing test
+ docker exec -i my_container /bin/bash -c "echo hi!"
hi!
I'm expecting completed test to be output after hi!, but it isn't. How do I fix this?
docker is consuming, though not using, its standard input, which it inherits from test.sh. test.sh inherits its standard input from bash, which inherits its standard input from ssh. This means that docker itself reads the rest of the here-document (including the echo "completed test" line) before the remote shell can.
To fix, just redirect docker's standard input from /dev/null.
docker exec -i my_container /bin/bash -c "echo hi!" < /dev/null
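Assuming nothing inside the container needs stdin at all, an equivalent fix is to drop the -i flag so docker never attaches stdin in the first place:
# Without -i, docker exec leaves stdin alone and cannot swallow the heredoc.
docker exec my_container /bin/bash -c "echo hi!"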

Executing 'bash -c' in 'docker exec' command

Context: I'm trying to write a shortcut for my daily use of the docker exec command. For some reason, I'm experiencing a problem where my output is sometimes broken when I'm using a bash console inside a container (history messed up, lines overwriting each other as I type, ...)
I read here that you could overcome this problem by adding some command before starting the bash console.
Here is a relevant excerpt of what my script does
#!/bin/bash
containerHash=$1
commandToRun='bash -c "stty cols $COLUMNS rows $LINES && bash -l"'
finalCommand="winpty docker exec -it $containerHash $commandToRun"
echo $finalCommand
$finalCommand
Here is the output I get:
winpty docker exec -it 0b63a bash -c "stty cols $COLUMNS rows $LINES && bash -l"
cols: -c: line 0: unexpected EOF while looking for matching `"'
cols: -c: line 1: syntax error: unexpected end of file
I read here that this had to do with parsing and expansion. However, I can't use a function or an eval command (or at least I didn't succeed in making it work).
If I execute the first output line directly in my terminal, it works without trouble.
How can I overcome this problem?
It's not Docker related, but Bash (in other words, the Docker part of the command works fine; it's just bash grumbling inside the container the same way it would grumble on your host):
Minimal reproducible error
cmd='bash -c "echo hello"'
$cmd
hello": -c: line 0: unexpected EOF while looking for matching `"'
hello": -c: line 1: syntax error: unexpected end of file
Fix
cmd='bash -c "echo hello"'
eval $cmd
hello
Answer
foo='docker exec -it XXX bash -c "echo hello"'
eval $foo
This will let you execute your command echo hello on your container. Now if you want to add dynamic variables to this command (like echo $string), you just have to swap the single quotes for double ones; to make this work you will have to escape the inner double quotes:
foo="docker exec -it $container bash -c \"echo $variable\""
A complete example
FOO="Hello"
container=$1
bar=$2
cmd="bash -c \"echo $FOO, $bar\""
final_cmd="docker exec -it $container $cmd"
echo "running command: \"$final_cmd\""
eval $final_cmd
Let's take a moment to dig in:
$FOO is a static variable; in our case it works exactly like a regular variable and is only there for illustration.
$bar is a dynamic variable that takes the second command-line argument as its value.
Because $cmd and $final_cmd use only double quotes, the variables are expanded.
Because we use eval $final_cmd, the command is interpreted correctly and bash is happy.
Finally, a usage example:
bash /tmp/dockerize.sh 5b02ab015730 world
Gives
running command: "docker exec -it 5b02ab015730 bash -c "echo Hello, world""
Hello, world
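If you prefer to avoid eval altogether, a minimal sketch (my assumption, not part of the original answer) is to build the command as a bash array, which keeps the quoted stty string as a single argument without a second round of parsing:
#!/bin/bash
containerHash=$1
# Each array element is one argument; $COLUMNS and $LINES are expanded
# here on the host, as in the eval version, and the stty string stays intact.
cmd=(winpty docker exec -it "$containerHash" bash -c "stty cols $COLUMNS rows $LINES && bash -l")
echo "running command: ${cmd[*]}"
"${cmd[@]}"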

Using exec -a in Script

Hi, I'm trying to run the following script, but I get an error. Any tips?
prog1 takes an argument, in this case 1000. I am using the exec command because I want to change the program name to "/bin/grade" when executing prog1.
This is the error I am getting:
/script.sh: 2: exec: -a: not found
#! /bin/sh
exec -a "/bin/grade" ./prog1 1000 &
sleep 0.001
kill -14 $!
Run the script with bash instead of sh - put #!/bin/bash at the top. The -a flag is specific to the bash shell.
Example A:
#!/bin/sh
exec -a "/bin/bash" pwd
Returns: ./test.sh: 3: exec: -a: not found
Example B:
#!/bin/bash
exec -a "/bin/sh" pwd
Returns: /home/dev
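Applying that to the original script, a sketch with only the shebang changed:
#!/bin/bash
# exec -a sets argv[0] of prog1 to "/bin/grade"; this needs bash, not sh.
exec -a "/bin/grade" ./prog1 1000 &
sleep 0.001
kill -14 $!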
