passing environment to command in bash for loop one liner - bash

bash for i in {1..1}; do POOL_SIZE=10 bundle exec sidekiq -e production -c 50 -C ./config/sidekiq.yml & done
# => bash: syntax error near unexpected token `do'
what have I missed?

If you don't want to use a script and indeed want to pass a list of commands as a string directly to the bash executable, use the -c option, and quote the entire string:
bash -c 'for i in {1..1}; do POOL_SIZE=10 bundle exec sidekiq -e production -c 50 -C ./config/sidekiq.yml & done'
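As a quick aside, not part of the answer above: the VAR=value prefix exports the variable only into the environment of the single command it precedes, which is exactly what the one-liner relies on. A minimal check, with printenv standing in for the real sidekiq invocation:
bash -c 'for i in {1..2}; do POOL_SIZE=10 printenv POOL_SIZE & done; wait'
# prints "10" once per iteration; POOL_SIZE is never set in the shell itself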

Related

Unable to execute ssh command containing parentheses with perl: "syntax error near unexpected token `('"

If I run this command from the command line, it works as expected on the remote server:
ssh admin@example.com "docker exec -it -d tasks_live_work_ec2_test_server /bin/sh -c \"/usr/bin/nvim -c 'silent! call SetupInstantServer()'\""
However, if I try to execute it from a perl script, I get errors:
my $cmd = qq|docker exec -it -d ${image_name}_server /bin/sh -c \"/usr/bin/nvim -c 'silent! call SetupInstantServer()'\"|;
`ssh admin\@example.com "$cmd"`;
bash: -c: line 1: syntax error near unexpected token `('
Escaping the parens with backslashes suppresses the error, but the SetupInstantServer function in vim never gets called.
What I would do, using two here-docs:
#!/usr/bin/perl
system<<PerlEOF;
ssh admin\@example.com<<ShellEOF
docker exec -it -d ${image_name}_server /bin/sh -c "
/usr/bin/nvim -c 'silent! call SetupInstantServer()'
"
ShellEOF
PerlEOF
You can add quotes around a here-doc delimiter to prevent shell expansion, or to avoid the need to escape the @, depending on your needs.
Check perldoc perlop#Quote-and-Quote-like-Operators
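To illustrate the delimiter-quoting point in plain shell (user@host.example and $HOSTNAME are placeholders, not taken from the answer): quoting the here-doc delimiter decides whether the body is expanded locally before ssh ever sees it:
# Unquoted delimiter: the local shell expands $HOSTNAME before ssh runs
ssh user@host.example <<ShellEOF
echo "expanded locally: $HOSTNAME"
ShellEOF
# Quoted delimiter: the body is sent verbatim, so the remote shell expands it
ssh user@host.example <<'ShellEOF'
echo "expanded remotely: $HOSTNAME"
ShellEOF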

How do I pass multiple arguments to a shell script into `kubectl exec`?

Consider the following shell script, where POD is set to the name of a Kubernetes pod.
kubectl exec -it $POD -c messenger -- bash -c "echo '$@'"
When I run this script with one argument, it works fine.
hq6:bot hqin$ ./Test.sh x
x
When I run it with two arguments, it blows up.
hq6:bot hqin$ ./Test.sh x y
y': -c: line 0: unexpected EOF while looking for matching `''
y': -c: line 1: syntax error: unexpected end of file
I suspect that something is wrong with how the arguments are passed.
How might I fix this so that arguments are expanded literally by my shell and then passed in as literals to the bash running in kubectl exec?
Note that removing the single quotes results in an output of x only.
Note also that I need the bash -c so I can eventually pass in file redirection: https://stackoverflow.com/a/49189635/391161.
I managed to work around this with the following solution:
kubectl exec -it $POD -c messenger -- bash -c "echo $*"
This appears to have the additional benefit that I can do internal redirects.
./Test.sh x y '> /tmp/X'
You're going to want something like this:
kubectl exec POD -c CONTAINER -- sh -c 'echo "$@"' -- "$@"
With this syntax, the command we're running inside the container is echo "$@". We then take the local value of "$@" and pass that as parameters to the remote shell, thus setting $@ in the remote shell.
On my local system:
bash-5.0$ ./Test.sh hello
hello
bash-5.0$ ./Test.sh hello world
hello world
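For reference, a minimal sketch of what the whole Test.sh might look like with this fix applied (the pod-name assignment is an assumption; the question only says POD holds the pod's name):
#!/bin/bash
# Test.sh -- forward this script's arguments to a command inside the pod
POD=my-pod-name   # placeholder; set to the real pod name
kubectl exec "$POD" -c messenger -- sh -c 'echo "$@"' -- "$@"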

Executing 'bash -c' in 'docker exec' command

Context: I'm trying to write a shortcut for my daily use of the docker exec command. For some reason, my output is sometimes broken when I'm using a bash console inside a container (history messed up, lines overwriting each other as I type, ...).
I read here that you could overcome this problem by running some command before starting the bash console.
Here is a relevant excerpt of what my script does:
#!/bin/bash
containerHash=$1
commandToRun='bash -c "stty cols $COLUMNS rows $LINES && bash -l"'
finalCommand="winpty docker exec -it $containerHash $commandToRun"
echo $finalCommand
$finalCommand
Here is the output I get:
winpty docker exec -it 0b63a bash -c "stty cols $COLUMNS rows $LINES && bash -l"
cols: -c: line 0: unexpected EOF while looking for matching `"'
cols: -c: line 1: syntax error: unexpected end of file
I read here that this had to do with parsing and expansion. However, I can't use a function or an eval command (or at least I didn't succeed in making it work).
If I execute the first output line directly in my terminal, it works without trouble.
How can I overcome this problem?
It's not Docker-related, but Bash (in other words, the Docker part of the command works fine; it's just bash complaining inside the container the same way it would complain on your host):
Minimal reproducible error
cmd='bash -c "echo hello"'
$cmd
hello": -c: line 0: unexpected EOF while looking for matching `"'
hello": -c: line 1: syntax error: unexpected end of file
Fix
cmd='bash -c "echo hello"'
eval $cmd
hello
Answer
foo='docker exec -it XXX bash -c "echo hello"'
eval $foo
This will let you execute your command echo hello on your container. Now, if you want to add dynamic variables to this command (like echo $string), you just have to swap the single quotes for double ones; to make this work you will have to escape the inner double quotes:
foo="docker exec -it $container bash -c \"echo $variable\""
A complete example
FOO="Hello"
container=$1
bar=$2
cmd="bash -c \"echo $FOO, $bar\""
final_cmd="docker exec -it $container $cmd"
echo "running command: \"$final_cmd\""
eval $final_cmd
Let's take a moment to dig in:
$FOO is a static variable; in our case it works exactly like a regular variable, it's just there for illustration.
$bar is a dynamic variable which takes the second command-line argument as its value.
Because $cmd and $final_cmd use only double quotes, the variables are interpolated.
Because we use eval $final_cmd, the command string is re-parsed correctly and bash is happy.
Finally, a usage example:
bash /tmp/dockerize.sh 5b02ab015730 world
Gives
running command: "docker exec -it 5b02ab015730 bash -c "echo Hello, world""
Hello, world
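A side note that is not part of the answer above: if you would rather avoid eval, the same wrapper can be built with a bash array, since array elements keep their word boundaries when expanded as "${cmd[@]}". A rough sketch under the same assumptions as the example:
#!/bin/bash
FOO="Hello"
container=$1
bar=$2
# Each array element stays one word, so no re-parsing (and no eval) is needed
cmd=(docker exec -it "$container" bash -c "echo $FOO, $bar")
echo "running command: ${cmd[*]}"
"${cmd[@]}"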

Pass all args to a command called in a new shell using bash -c

I've simplified my example to the following:
file1.sh:
#!/bin/bash
bash -c "./file2.sh $#"
file2.sh:
#!/bin/bash
echo "first $1"
echo "second $2"
I expect that if I call ./file1.sh a b to get:
first a
second b
but instead I get:
first a
second
In other words, the arguments after the first one are not getting passed through to the command that I'm executing inside a new bash shell. I've tried many variations of removing and moving around the quotation marks in file1.sh, but haven't gotten this to work.
Why is this happening, and how do I get the behavior I want?
(UPDATE: I realize it seems pointless to call bash -c in this example; my actual file1.sh is a proxy script for a command that gets called locally to run in a docker container, so it's actually docker exec -i mycontainer bash -c '')
Change file1.sh to this with different quoting:
#!/bin/bash
bash -c './file2.sh "$@"' - "$@"
The - "$@" passes a hyphen to populate $0, and "$@" is passed in to populate all the other positional parameters on the bash -c command line.
You can also make it:
bash -c './file2.sh "$@"' "$0" "$@"
However, there is no real need to use bash -c here; you can just use:
./file2.sh "$@"
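Tying this back to the docker exec case from the question's update, the same quoting pattern carries over unchanged (mycontainer and file2.sh are the names from the question; where file2.sh lives inside the container is an assumption):
#!/bin/bash
# file1.sh as a docker proxy: pass all local arguments to file2.sh in the container
docker exec -i mycontainer bash -c './file2.sh "$@"' - "$@"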

Using bash -c and Globbing

I'm running gnu-parallel on a command that works fine when run from a bash shell but returns an error when parallel executes it with bash using the -c flag. I assume this has to do with the special globbing expression I'm using.
ls !(*site*).mol2
This returns successfully.
With the -c flag, the command fails:
/bin/bash -c 'ls !(*site*).mol2'
/bin/bash: -c: line 0: syntax error near unexpected token `('
The manual only says that -c makes bash read commands from the string argument; am I missing something?
Edit:
I should add that I need this to run from a gnu-parallel string, so the resulting command must be runnable by /bin/bash -c "Some Command".
You should try the following code:
bash <<EOF
shopt -s extglob
ls !(*site*).mol2
EOF
Explanation:
When you run bash -c, you start a new shell, and shopt settings are not inherited.
EDIT
If you really need a one-liner:
bash -O extglob -c 'ls !(*site*).mol2'
See this thread
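Since the question needs this to run from a gnu-parallel string, here is a rough sketch of how the -O extglob form might be embedded in a job template (the directory arguments and the {} placeholder usage are illustrative assumptions, not part of the answer):
# One job per directory; {} is GNU parallel's placeholder for the current argument.
# Assumes simple directory names with no spaces or shell metacharacters.
parallel 'cd {} && bash -O extglob -c "ls !(*site*).mol2"' ::: dir1 dir2 dir3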
