Curly braces in shell script while passing argument - shell

Hi, I am new to shell scripting. I want to pass an argument to a shell script, and I know how to do that. I have written a simple shell script:
#!/bin/bash
parameter=$1
whatispassed=${parameter:-"nothing"}
echo $whatispassed
If something is passed as the first argument, it is printed; otherwise "nothing" is printed. I have seen some people writing:
parameter=${1,,}
I tried replacing the first line with the one above, but I am getting a bad substitution error. Any help is appreciated.

First, you can reduce this to:
#!/bin/bash
whatispassed=${1:-"nothing"}
echo $whatispassed
or even:
#!/bin/bash
echo ${1:-"nothing"}
For ${parameter,,pattern}, look into Shell Parameter Expansion.
For more on bash or shell scripting, see info:bash and https://stackoverflow.com/q/6798269/1741542.
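As a side note on the bad substitution error: the ${parameter,,} case-conversion syntax was only added in bash 4.0, so running the script with an older bash (or with sh) produces exactly that error. A minimal sketch of the lowercasing form, assuming bash 4 or newer and a hypothetical script name lower.sh:
#!/bin/bash
# Sketch: lowercase the first argument, then fall back to "nothing".
# ${parameter,,} needs bash 4.0+; older shells report "bad substitution".
parameter=${1,,}
whatispassed=${parameter:-"nothing"}
echo "$whatispassed"

$ ./lower.sh HeLLo
hello
$ ./lower.sh
nothing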

Related

Bash variable usage

I am confused about how to use variables in Bash. Please see the following example. I am not able to figure out why Bash isn't able to recognize the variable within (). Can anybody please help me understand what is going on?
$ echo $SHELL
/bin/bash
$ export TestC=/Users
$ echo $TestC
/Users
$ export TestD=$TestC/ABCD
$ echo $TestD
/Users/ABCD
$ export TestD=$(TestC)/ABCD
-bash: TestC: command not found
Thanks for your help
When referencing a bash variable you either use $ followed by the name, as in $TestC, or you put braces around the name, as in ${TestC}.
$(...) is a different syntax called command substitution: it runs the command inside the parentheses in a subshell and then "returns" the stdout of that command.
Read all about parameter/variable expansion in the Shell Parameter Expansion section of the Bash manual, which also shows a lot of the extra things you can do with parameter expansion when using braces.
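A quick illustration of the difference, reusing TestC from the question (the basename call is just an arbitrary command used for demonstration):
$ export TestC=/Users
$ echo ${TestC}/ABCD              # braces: plain variable expansion
/Users/ABCD
$ echo $(basename $TestC)         # $(...) runs a command and substitutes its output
Users
$ export TestD=$(TestC)/ABCD      # $(...) tries to run "TestC" as a command
-bash: TestC: command not found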

Echo-ing an environment variable returns string literal rather than environment variable value

I have two bash scripts. The first listens to a pipe "myfifo" for input and executes the input as a command:
fifo_name="myfifo"
[ -p $fifo_name ] || mkfifo $fifo_name;
while true
do
if read line; then
$line
fi
done <"$fifo_name"
The second passes a command 'echo $SET_VAR' to the "myfifo" pipe:
command='echo $SET_VAR'
command_to_pass="echo $command"
$command_to_pass > myfifo
As you can see, I want to pass 'echo $SET_VAR' through the pipe. In the listener process, I've set a $SET_VAR environment variable. I expect the output of the command 'echo $SET_VAR' to be 'var_value,' which is the value of the environment variable SET_VAR.
Running the first (the listener) script in one bash process and then passing a command via the second in another process gives the following result:
$SET_VAR
I expected "var_value" to be printed. Instead, the string literal $SET_VAR is printed. Why is this the case?
Before I get to the problem you're reporting, I have to point out that your loop won't work. The while true part (without a break somewhere in the loop) will run forever. It'll read the first line from the file, loop, try to read a second line (which fails), loop again, try to read a third line (also fails), loop again, try to read a fourth line, etc... You want the loop to exit as soon as the read command fails, so use this:
while read line
do
# something I'll get to
done <"$fifo_name"
The other problem you're having is that the shell expands variables (i.e. replaces $var with the value of the variable var) partway through the process of parsing a command line, and when it's done that it doesn't go back and re-do the earlier parsing steps. In particular, if the variable's value included something like $SET_VAR it doesn't go back and expand that, since it's just finished the bit where it expands variables. In fact, the only thing it does with the expanded value is split it into "words" (based on whitespace), and expand any filename wildcards it finds -- no variable expansions happen, no quote or escape interpretation, etc.
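A tiny demonstration of that single pass, outside the fifo setup:
$ SET_VAR=var_value
$ line='echo $SET_VAR'
$ $line                 # the result of expanding $line is not expanded again
$SET_VAR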
One possible solution is to tell the shell to run the parsing process twice, with the eval command:
while read line
do
eval "$line"
done <"$fifo_name"
(Note that I used double-quotes around "$line" -- this prevents the word splitting and wildcard expansion I mentioned from happening before eval goes through the normal parsing process. If you think of your original code half-parsing the command in $line, without double-quotes it gets one and a half-parsed, which is weird. Double-quotes suppress that half-parsing stage, so the contents of the variable get parsed exactly once.)
However, this solution comes with a big warning, because eval has a well-deserved reputation as a bug magnet. eval makes it easy to do complex things without quite understanding what's going on, which means you tend to get scripts that work great in testing, then fail incomprehensibly later. And in my experience, when eval looks like the best solution, it probably means you're trying to solve the wrong problem.
So, what're you actually trying to do? If you're just trying to execute the lines coming from the fifo as shell commands, then you can use bash "$fifo_name" to run them in a subshell, or source "$fifo_name" to run them in the current shell.
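A minimal listener sketch along those lines, keeping the fifo name and the SET_VAR value from the question; note that SET_VAR has to be exported so the subshell started by bash can see it (source would not need that):
#!/bin/bash
# Listener sketch: each pass blocks until a writer opens the fifo, then
# runs whatever was written as a script in a subshell.
export SET_VAR="var_value"
fifo_name="myfifo"
[ -p "$fifo_name" ] || mkfifo "$fifo_name"
while true
do
    bash "$fifo_name"
done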
BTW, the script that feeds the fifo:
command='echo $SET_VAR'
command_to_pass="echo $command"
$command_to_pass > myfifo
Is also a disaster waiting to happen. Putting commands in variables doesn't work very well in the shell (I second chepner's recommendation of BashFAQ #50: I'm trying to put a command in a variable, but the complex cases always fail!), and putting a command to print another command in a variable is just begging for trouble.
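On the sending side, a sketch that avoids putting the command in a variable at all; the single quotes stop $SET_VAR from being expanded by the sender, so the listener gets the literal command text:
#!/bin/bash
# Sender sketch: write the command text straight into the fifo and let
# the listener expand $SET_VAR.
echo 'echo $SET_VAR' > myfifo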
bash, by its nature, reads commands from stdin. You can simply run:
bash < myfifo

Concatenate command string in a shell script

I am maintaining an existing shell script which assigns a command to a variable inside the script, like:
MY_COMMAND="/bin/command -dosomething"
and then later on it passes an "argument" to $MY_COMMAND by doing this:
MY_ARGUMENT="fubar"
$MY_COMMAND $MY_ARGUMENT
The idea being that $MY_COMMAND is supposed to execute with $MY_ARGUMENT appended.
Now, I am not an expert in shell scripts, but from what I can tell, $MY_COMMAND does not execute with $MY_ARGUMENT as an argument. However, if I do:
MY_ARGUMENT="itworks"
MY_COMMAND="/bin/command -dosomething $MY_ARGUMENT"
It works just fine.
Is it valid syntax to call $MY_COMMAND $MY_ARGUMENT so it executes a shell command inside a shell script with MY_ARGUMENT as the argument?
With Bash you could use arrays:
MY_COMMAND=("/bin/command" "-dosomething") ## Quoting is not necessary sometimes. Just a demo.
MY_ARGUMENTS=("fubar") ## You can add more.
"${MY_COMMAND[#]}" "${MY_ARGUMENTS[#]}" ## Execute.
It works just the way you expect it to work, but fubar is going to be the second argument ( $2 ) and not $1.
So if you echo arguments in your /bin/command you will get something like this:
echo "$1" # prints '-dosomething'
echo "$2" # prints 'fubar'

Bash command line arguments, replacing defaults for variables

I have a script which has several input files; generally these are defaults stored in a standard place and read by the script.
However, sometimes it is necessary to run it with changed inputs.
In the script I currently have, say, three variables: $A, $B, and $C. Today I want to run it with a non-default $B, and tomorrow I may want to run it with a non-default $A and $B.
I have had a look around at how to parse command line arguments:
How do I parse command line arguments in Bash?
How do I deal with having some set by command line arguments some of the time?
I don't have enough reputation points to answer my own question. However, I have a solution:
Override a variable in a Bash script from the command line
#!/bin/bash
a=input1
b=input2
c=input3
while getopts "a:b:c:" flag
do
case $flag in
a) a=$OPTARG;;
b) b=$OPTARG;;
c) c=$OPTARG;;
esac
done
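For illustration, suppose an echo line is appended after the getopts loop and the script is saved as myscript.sh (both the echo and the file name are additions of mine); the overrides then behave like this:
echo "a=$a b=$b c=$c"   # appended after the loop, for illustration only

$ ./myscript.sh
a=input1 b=input2 c=input3
$ ./myscript.sh -b newB
a=input1 b=newB c=input3
$ ./myscript.sh -a newA -b newB
a=newA b=newB c=input3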
You can do it the following way. See Shell Parameter Expansion on the Bash man page.
#! /bin/bash
value=${1:-the default value}
echo value=$value
On the command line:
$ ./myscript.sh
value=the default value
$ ./myscript.sh foobar
value=foobar
Instead of using command line arguments to overwrite default values, you can also set the variables outside of the script. For example, the following script can be invoked with foo=54 /tmp/foobar or bar=/var/tmp /tmp/foobar:
#! /bin/bash
: ${foo:=42}
: ${bar:=/tmp}
echo "foo=$foo bar=$bar"

Problem in running a script

I have a Unix shell script which needs to be run like below:
test_sh XYZ=KLMN
The content of the script is:
#!/bin/ksh
echo $XYZ
To use the value of XYZ, I have to do set -k before I run the script.
Is there a way I can do this without doing set -k before running the script? Or is there something I can do in the script so that I can use the value of the parameter given when running the script in the way below:
test_sh XYZ=KLMN
I am using ksh.
Any help is appreciated.
How about running this?
XYZ=KLMN ./test_sh    # running from the directory where test_sh is
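Assuming test_sh is the two-line ksh script shown in the question and is executable, that looks like:
$ XYZ=KLMN ./test_sh
KLMN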
If your script needs no other arguments, a quick and dirty way do to it is to put
eval "$#"
at the start of your script. This will evaluate the command line arguments as shell commands. If those commands are to assign a shell/environment variable, then that's what it will do.
It's quick-and-dirty since anything could be put on the command line, causing problems from a syntax error to a bad security hole (if the script is trusted).
I'm not sure if "$@" means the same in ksh as it does in bash - using just $* (without quotes) would work too, but is even dirtier.
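A minimal sketch of that trick, reusing the script from the question (and trusting the caller to pass only assignments like XYZ=KLMN):
#!/bin/ksh
eval "$@"    # turns XYZ=KLMN from the command line into a real assignment
echo $XYZ

$ ./test_sh XYZ=KLMN
KLMN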
It looks like you are trying to use the environment variable "INSTANCE" in your script.
For that, the environment variable must be set in advance of executing your script. Using the "set" command sets exportable environment variables. Incidentally, my version of ksh dates from 1993 and the "-k" option was obsolete back then.
To set an environment variable so that it is exported into spawned shells, simply use the "export" command like so:
export INSTANCE='whatever you want to put here'
If you want to use a positional parameter for your script -- that is, have the "KLMN" value accessible within your script, and assuming it is the first parameter -- then you do the following in your script:
#!/bin/ksh
echo $1
You can also assign the positional parameter to a local variable for later use in your script like so:
#!/bin/ksh
param_one=$1
echo $param_one
You can call this with:
test_sh KLMN
Note that the spacing in the assignment is important -- do not use spaces.
I am trying this option:
#!/bin/ksh
echo $1
awk '{FS="=";print $2}' $1
and on the command line:
test_sh INSTANCE=LSN_MUM
but awk is failing. Is there any problem here?
Probably #!/bin/ksh -k will work (untested).

Resources