Bash redirection help

I'm having a little issue getting quite a simple bash script running.
The part that's working:
qstat -f $queuenum | grep -Po '(?<=PBS_O_WORKDIR=).*(?=,)'
Outputs a directory to the screen (for example):
/short/h72/SiO2/defected/2-999/3-forces/FORCES_364
All I want to do is change directory to this folder. Appending a "| cd" to the end of the above command doesn't work, and I can't quite figure out how to use the $(()) tags either.
Any help would be appreciated.

cd `qstat -f $queuenum | grep -Po '(?<=PBS_O_WORKDIR=).*(?=,)' `

Invoking your script creates a new bash shell, so the cd happens in that child shell, which is destroyed when your script ends; your interactive shell never changes directory.
If you use exec <scriptname> to run your script, the new bash shell is substituted for the current one. So if you add the command bash at the end of your script, you will end up in the directory you want, but in a new bash shell rather than your original one.
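A minimal sketch of that approach, assuming the script is saved as cd-job.sh (a hypothetical name) and takes the queue number as its first argument:
#!/bin/bash
# cd-job.sh (hypothetical): look up the job's PBS_O_WORKDIR, cd there,
# then start a fresh interactive bash in that directory
cd "$(qstat -f "$1" | grep -Po '(?<=PBS_O_WORKDIR=).*(?=,)')" || exit 1
bash
Run it with exec ./cd-job.sh "$queuenum": exec replaces your current shell with the script, and the trailing bash leaves you interactive in the target directory.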

You should use:
cd "$(qstat -f $queuenum | grep -Po '(?<=PBS_O_WORKDIR=).*(?=,)' )"
You are trying to achieve command substitution, which can be written with either of these two syntaxes:
$(command)
or like this using backticks:
`command`
The first form is preferred because it allows nesting of command substitutions, for example:
foo=$(command1 | command2 $(command3))
You should also enclose the entire command substitution in double quotes to protect yourself in case the result of the command substitution contains spaces.
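A quick illustration of why the quotes matter (the path here is made up):
dir='/tmp/dir with spaces'   # hypothetical directory name containing spaces
mkdir -p "$dir"
cd "$(echo "$dir")"          # works: the substituted path stays one word
cd $(echo "$dir")            # fails: word splitting hands cd three arguments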

Related

How to not escape pipe redirection symbols in Cygwin

I’ve got build scripts working perfectly under Linux bash. Now I’d like to adapt the build for Cygwin to make it cross-platform. The problem is that the build script uses pipe and redirection symbols like | and >, which need to be escaped with ^ when used with Cygwin’s ‘bash -c’.
I’m wondering whether there is any way to avoid escaping every pipe and redirection symbol for Cygwin, maybe a bash command-line option or an environment variable.
The example:
ls -l | tee 1.log works perfectly on Linux, but needs to be escaped in the following way to be used with Cygwin’s bash -c:
ls -l ^| tee 1.log
I would assume bash -c "ls -l | tee 1.log" should do the trick for symbols like | and >. But if you want to use " inside the bash command, you might run into problems again (it depends on the actual use of quotes, and on how Cygwin implemented/compiled bash).
In general, it would be safer either to run bash interactively and enter the command there, or (if that is not possible) to use a script file instead, e.g. bash script.sh.
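A minimal sketch of the script-file approach (build.sh and its pipeline are placeholders):
#!/bin/bash
# build.sh (hypothetical): the pipeline lives inside the script,
# so no | or > needs escaping at the point where the script is invoked
ls -l | tee 1.log
From the Windows side this is then invoked simply as bash build.sh.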

Shell script command substitution with multiple commands

I'm developing a shell script where I have the variable RUN as a result of a command substitution.
Something like this:
RUN="$(kubectl logs ${POD_LISTENER} | grep ${FROM_DATE})"
OUTPUT=$(eval $RUN)
The problem is with the grep portion.
The pattern I'm searching for with grep is a date, so I need to add single quotes around the variable ${FROM_DATE} to match exactly what I need.
From the terminal, I run the command below and get the result I need
kubectl logs cortex-listener-prod-6b8884d45b-mlmzz | grep '2018-08-11'
And it works well, but I can't make it work from the script.
I don't see why you need to quote FROM_DATE: the string contains only digits and hyphens, and neither of them is treated specially by a POSIX shell. However, your outer double quotes don't make sense. Just use
RUN=$(kubectl logs $POD_LISTENER | grep $FROM_DATE)
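A quick sanity check of that claim (the sample log lines are invented):
FROM_DATE=2018-08-11
printf '%s\n' '2018-08-10 14:02 older entry' '2018-08-11 09:15 wanted entry' | grep $FROM_DATE
# prints only: 2018-08-11 09:15 wanted entry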

Replacement substring in shell input

I get a set of strings as input in the terminal. I need to replace the ".awk" substring with ".sh" in each string using the shell and then output the modified string.
I wrote such script for doing this:
#!/bin/bash
while read line
do
result=${line/.awk/.sh}
echo $result
done
But it gives me an error: script-ch.sh: 6: script-ch.sh: Bad substitution.
How should I change this simple script to fix error?
UPD: ".awk" may be inside the string. For example: "file.awk.old".
If you are using Bash, then there is nothing wrong with your substitution. There is no reason to spawn an additional subshell and use a separate utility when bash substring replacement was tailor-made for that job:
$ fn="myfile.awk.old"; echo "$fn --> ${fn/.awk/.sh}"
myfile.awk.old --> myfile.sh.old
Note: if you are substituting .sh for .awk, then the . is unnecessary. A simple ${fn/awk/sh} will suffice.
I suspect you have some stray DOS character in your original script.
Not sure why it works for you and not for me; it might be the input you're giving it. It could have a space in it.
This should work:
#!/bin/bash
while read line
do
result=$(echo "$line" | sed 's/\.awk/.sh/')
echo "$result"
done
If you run chmod +x script.sh and then run it with ./script.sh, or if you run it with bash script.sh, it should work fine.
Running it with sh script.sh will not work because the hashbang line will be ignored and the script will be interpreted by dash, which does not support that string substitution syntax.
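To see that difference directly (assuming /bin/sh on your system is dash; the exact error text may vary):
$ bash -c 'line=file.awk.old; echo "${line/.awk/.sh}"'
file.sh.old
$ dash -c 'line=file.awk.old; echo "${line/.awk/.sh}"'
dash: 1: Bad substitution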

Using grep inside shell script gives file not found error

I cannot believe I've spent 1.5 hours on something as trivial as this. I'm writing a very simple shell script which greps a file, stores the output in a variable, and echoes the variable to STDOUT.
I have checked the grep command with the regex on the command line, and it works fine. But for some reason, the grep command doesn't work inside the shell script.
Here is the shell script I wrote up:
#!/bin/bash
tt=grep 'test' $1
echo $tt
I ran this with the following command: ./myScript.sh testingFile. It just prints an empty line.
I have already used chmod and made the script executable.
I have checked that the PATH variable has /bin in it.
Verified that echo $SHELL gives /bin/bash
In my desperation, I have tried all combinations of:
tt=grep 'test' "$1"
echo ${tt}
Not using the command line argument at all, and hardcoding the name of the file tt=grep 'test' testingFile
I found this: grep fails inside bash script but works on command line, and even used dos2unix to remove any possible carriage returns.
Also, when I try to use any of the grep options, like: tt=grep -oE 'test' testingFile, I get an error saying: ./out.sh: line 3: -oE: command not found.
This is crazy.
The line tt=grep 'test' $1 doesn't run grep at all: the shell treats tt=grep as a temporary variable assignment and then runs 'test' $1 as the command (and with tt=grep -oE ..., it tries to run -oE as the command, which is where "-oE: command not found" comes from). To capture a command's output in a variable you need command substitution:
#!/usr/bin/env bash
test=$(grep 'foo' "$1")
echo "$test"
Command substitution allows the output of a command to replace the command itself. Command substitution occurs when a command is enclosed like this:
$(command)
or like this using backticks:
`command`
Bash performs the expansion by executing COMMAND and replacing the command substitution with the standard output of the command, with any trailing newlines deleted. Embedded newlines are not deleted, but they may be removed during word splitting.
The $() version is usually preferred because it allows nesting:
$(command $(command))
For more information read the command substitution section in man bash.
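A hedged usage sketch, assuming the corrected script above is saved as myScript.sh and made executable, with a throwaway input file:
$ printf 'foo bar\nno match here\n' > testingFile
$ ./myScript.sh testingFile
foo bar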

Bash call a function in a command

I have a question regarding the usage of functions in a command in bash. getRegex is my function; it is defined at the end of the file. The command that I want to use is the following:
COUNT=`grep -rnE 'getRegex' $HOME/new`
Now I have tried a lot of different variants but I cannot make it work, even if I split it in two. The function works correctly if I call it on its own: getRegex. Any idea what I am missing? TIA
The key words to search for are "bash command substitution", which you can find in man bash or on Google.
By the way, double quotes are really important here.
#!/bin/bash
function my_func () {
    echo "no"
}
string="no you don't
no you don't
no you don't
no you don't
no you don't"
COUNT="$( echo "${string}" | grep "$( my_func )" -c )"
echo "${COUNT}"
And
$> ./ok.sh
5
If you're trying to call a bash command within another bash command, the inner command (here getRegex) needs to be enclosed in backticks `` or else it will be interpreted as literal text. Since you would then have backticks inside backticks, you need to escape the inner ones, and the pattern must go in double quotes rather than single quotes (single quotes would suppress the substitution). Try this:
COUNT=`grep -rnE "\`getRegex\`" $HOME/new`
But, through the wonders of POSIX, we can use a different syntax. Anywhere you use backticks, you can also use $(). So to avoid backslash emesis, you could write:
COUNT=$(grep -rnE "$(getRegex)" $HOME/new)
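Putting it together, a minimal self-contained sketch (the regex printed by getRegex and the search directory are placeholders):
# getRegex is assumed to print an extended regex; 'ERROR [0-9]+' is made up
getRegex () {
    echo 'ERROR [0-9]+'
}
COUNT=$(grep -rnE "$(getRegex)" "$HOME/new")
echo "$COUNT"
Note that despite its name, COUNT here holds the matching lines themselves; pipe the grep output through wc -l if you actually want a number, as the first answer does with grep -c.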
