Difference between local a and local a= - bash

From /lib/lsb/init-functions (maybe this file is Debian-specific, but that doesn't really matter for the question):
pidofproc () {
local pidfile line i pids= status specified pid
pidfile=
specified=
What's the difference between saying
local a
and
local a=
?

Both forms make the variable local, hiding any variable of the same name from the enclosing scope.
The = assigns a null value to the variable, whereas the bare form leaves the variable unset.
For example:
A=30
B=30
function foo()
{
    local A B=
    echo A - $A
    echo B - $B
    echo A :- ${A:-minusA}
    echo B :- ${B:-minusB}
    echo A :+ ${A:+plusA}
    echo B :+ ${B:+plusB}
    echo A hash ${#A}
    echo B hash ${#B}
    echo A - ${A-minusA}
    echo B - ${B-minusB}
    echo A + ${A+plusA}
    echo B + ${B+plusB}
    ## Modifies variable
    echo A := ${A:=eqA}
    echo B := ${B:=eqB}
    echo A - $A
    echo B - $B
}
foo
Output:
A -
B -
A :- minusA
B :- minusB
A :+
B :+
A hash 0
B hash 0
A - minusA
B -
A +
B + plusB
A := eqA
B := eqB
A - eqA
B - eqB
You can see the section:
echo A - ${A-minusA}
echo B - ${B-minusB}
echo A + ${A+plusA}
echo B + ${B+plusB}
behaves differently for A and B.
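A shorter way to see the same distinction (a hedged sketch, not part of the answer above; it assumes bash 4.2+ for [[ -v ]]) is to ask the shell directly whether each variable is set:
f() {
    local A B=
    declare -p A B                    # shows A with no value, B set to the empty string
    [[ -v A ]] && echo "A is set" || echo "A is unset"
    [[ -v B ]] && echo "B is set" || echo "B is unset"
}
f
This prints "A is unset" and "B is set": the bare local leaves A unset, while the trailing = gives B an empty value.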

bash: set -o errexit is not exiting when a command has an error. How do you change the code to make it exit when there is an error? [duplicate]

In the bash man page, it states:
Exit immediately if a pipeline (which may consist of a single simple command),
a subshell command enclosed in parentheses, or one of the commands executed as
part of a command list enclosed by braces...
So I assumed that a function should be considered a command list enclosed by braces. However, if you apply a conditional to the function call, errexit no longer persists inside the function body and it executes the entire command list before returning. Even if you explicitly create a subshell inside the function with errexit enabled for that subshell, all commands in the command list are executed. Here is a simple example that demonstrates the issue:
function a() { b ; c ; d ; e ; }
function ap() { { b ; c ; d ; e ; } ; }
function as() { ( set -e ; b ; c ; d ; e ) ; }
function b() { false ; }
function c() { false ; }
function d() { false ; }
function e() { false ; }
( set -Eex ; a )
+ a
+ b
+ false
( set -Eex ; ap )
+ ap
+ b
+ false
( set -Eex ; as )
+ as
+ set -e
+ b
+ false
Now if I apply a conditional to each of them...
( set -Eex ; a || false )
+ a
+ b
+ false
+ c
+ false
+ d
+ false
+ e
+ false
+ false
( set -Eex ; ap || false )
+ ap
+ b
+ false
+ c
+ false
+ d
+ false
+ e
+ false
+ false
( set -Eex ; as || false )
+ as
+ set -e
+ b
+ false
+ c
+ false
+ d
+ false
+ e
+ false
+ false
You started to quote the manual but then you cut the bit that explained this behaviour, which was in the very next sentence:
-e Exit immediately if a pipeline, which may consist of a single simple command, a subshell command enclosed in parentheses, or one of the commands executed as part of a command list enclosed by braces returns a non-zero status. The shell does not exit if the command that fails is part of the command list immediately following a while or until keyword, part of the test in an if statement, part of any command executed in a && or || list except the command following the final && or ||, any command in a pipeline but the last, or if the command’s return status is being inverted with !.
The bug-bash mailing list has an explanation by Eric Blake that is more explicit about functions:
Short answer: historical compatibility.
...
Indeed, the correct behavior mandated by POSIX (namely, that 'set -e' is
completely ignored for the duration of the entire body of f(), because f
was invoked in a context that ignores 'set -e') is not intuitive. But
it is standardized, so we have to live with it.
And some words about whether set -e can be re-enabled to achieve the wanted behaviour:
Because once you are in a context that ignores 'set -e', the historical
behavior is that there is no further way to turn it back on, for that
entire body of code in the ignored context. That's how it was done 30
years ago, before shell functions were really thought about, and we are
stuck with that poor design decision.
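A minimal sketch of what these quotes mean in practice (my illustration, not from the mailing list):
set -e
f() { false; echo "still running"; }
f || true    # f is on the left of ||, so errexit is ignored for its whole body:
             # "still running" is printed and the script carries on
f            # plain call: errexit applies inside f, the script exits at 'false'
echo "never reached"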
Not an answer to the original question, but a work-around for the underlying problem: set up a trap on errors:
function on_error() {
echo "error happened!"
}
trap on_error ERR
echo "OK so far"
false
echo "this line should not execute"
The reason for the behavior itself is properly explained in other answers (basically it's expected bash behavior as per the manual and POSIX): https://stackoverflow.com/a/19789651/1091436
Not an answer, but you might work around this counterintuitive behaviour by defining this helper function:
fixerrexit() { ( eval "expr '$-' : '.*e' >/dev/null && set -e; $*"; ); }
Then call your functions via fixerrexit.
Example:
f1()
{
    mv unimportant-0.txt important.txt
    rm unimportant-*.txt
}
set -e
if fixerrexit f1
then
    echo here is the important file: important.txt
    echo unimportant files are deleted
fi
If the outer context has errexit on, then fixerrexit turns errexit on inside f1() as well, so you don't need to worry about commands being executed after a failure occurs.
The only downside is that you cannot set variables, since it runs f1 inside a subshell.

cat * mixes up the order of files

Right now, I am working on a "Text Editor" made with Bash. Everything was going perfectly until I tested it. When I opened the file the script created, everything was jumbled up. I eventually figured out it had something to do with the cat BASHTE/* >> $file I had put in. I still have no idea why this happens. My crappy original code is below:
#!/bin/bash
# ripoff vim
clear
echo "###############################################################################"
echo "# BASHTE TEXT EDITOR - \\\ = interupt :q = quit :w = write #"
echo "# :wq = Write and quit :q! = quit and discard :dd = Delete Previous line #"
echo "###############################################################################"
echo ""
read -p "Enter file name: " file
touch .$file
mkdir BASHTE
clear
echo "###############################################################################"
echo "# BASHTE TEXT EDITOR - \\\ = interupt :q = quit :w = write #"
echo "# :wq = Write and quit :q! = quit and discard :dd = Delete Previous line #"
echo "###############################################################################"
while true
do
    read -p "$lines >" store
    if [ "$store" = "\\:q" ]
    then
        break
    elif [ "$store" = "\\:w" ]
    then
        cat BASHTE/* >> $file
    elif [ "$store" = "\\:wq" ]
    then
        cat BASHTE/* >> $file
        rm -rf .$file
        break
    elif [ "$store" = "\\:q!" ]
    then
        rm -rf BASHTE
        rm -rf $file
        break
    elif [ "$store" = "\\:dd" ]
    then
        LinesMinusOne=$(expr $lines - 1)
        rm -rf BASHTE/$LinesMinusOne.txt
    else
        echo $store >> BASHTE/$lines.txt
        # counts the number of times the while loop is run
        ((lines++))
    fi
done
This is what I got after I typed in the alphabet:
b
j
k
l
m
n
o
p
q
r
s
c
t
u
v
w
x
y
z
d
e
f
g
h
I
This is what I typed in:
a
v
c
d
e
f
g
h
I
j
k
l
m
n
o
p
q
r
s
t
u
v
w
x
y
z
\\:wq
Any help would be great, Thanks
BASHTE/* expands in lexical order, so every file name starting with 1 comes before every file name starting with 2, and so on. That means the order of your input lines is:
1 a
10 j
11 k
12 l
...and so on...
To make the lines sort correctly with the * glob, you'll need to name the files with leading zeros, for example:
# ...
echo $store >> BASHTE/$(printf %020d $lines).txt
# ...
I chose the %020d format because it should store any number of lines applicable for a 64-bit system, since 2 ** 64 = 18446744073709551616, which is 20 digits long.
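A quick way to see the difference (a hedged sketch using throwaway directory names):
mkdir -p demo demo_padded
for n in 1 2 10; do
    echo "line $n" > demo/$n.txt
    echo "line $n" > demo_padded/$(printf %020d $n).txt
done
cat demo/*          # line 1, line 10, line 2  -- lexical order
cat demo_padded/*   # line 1, line 2, line 10  -- padded names sort as expected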

How to preserve words containing spaces while expanding variables in a bash script?

My scheme is the following:
I have a shell script that executes a command which calls a C program:
name=$1
StringWithSpaces=$name
command="someprogram.out $StringWithSpaces $otherarguments"
$command
where name is a string with spaces, such as "String With Spaces", passed to the shell from another Python script.
My problem is that when I read that argument in C, it is passed as several arguments instead of just one. I have tried $@, $* and all that stuff. I have also tried to write a C function that joins the several argv[i] making up StringWithSpaces back into one, but I am a bit stuck. I wish I could read the variable in C as just a single argument, to keep the program as simple as I can.
This is the exact shell code (bash):
#!/bin/bash
#$1 Name of the database
#$2 $3 gps coordinates
#$4 geoDistance (radius in km)
#$5 imDownload (0: only GPS data is downloaded; 1: images and tags as well)
# Disabled: keywords (optional, comma-separated list of keywords)
# Build the base path
BASE_DIR=`pwd`
#BASE_DIR=${BASE_DIR%/*}
EXE_DIR=$BASE_DIR/FlickrAPI/bin
DB_PATH=$BASE_DIR/data
# Export the libraries; needed for the code to work
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:${BASE_DIR}/FlickrAPI/lib
cont=1;
#General DB
name=$1
gps="$2 $3"
geoDistance=$4
numImagesDB=3751
ncores=400;
imDownload=$5
dbDir=$DB_PATH/$name
mkdir "$dbDir";
rm -rf "$dbDir"/imagesDB
rm -rf "$dbDir"/tagsDB
rm -rf "$dbDir"/gps.txt
mkdir "$dbDir"/imagesDB
mkdir "$dbDir"/tagsDB
#tidx=`seq 7 $#`;
#keywords="";
#for ((i=7;i<=$#;i++))
#do
# keywords=$keywords" "${!i};
#done
keywords=$6;
export LD_LIBRARY_PATH=/home/user/anaconda2/lib/
command="$EXE_DIR/get_GPS_bimestral $dbDir $gps z $numImagesDB $ncores $geoDistance $imDownload $keywords"
echo $command
$command
Put the command in an array with:
Command=(someprogram.out "$StringWithSpaces" "$otherarguments")
When the array is expanded in quotes using @ to request all its members, as in "${Command[@]}", bash expands each array member to a single word, so the desired strings with spaces are kept as single strings.
Here is a sample script:
#!/bin/bash -e
function ShowArguments()
{
    for argument in "$@"; do
        echo "Argument: $argument"
    done
}
Argument1="abc def"
Argument2="pdq xyz"
Command=(ShowArguments "$Argument1" "$Argument2")
echo "${Command[#]}"
"${Command[#]}"
The output from the above is:
ShowArguments abc def pdq xyz
Argument: abc def
Argument: pdq xyz
You may want "$otherarguments" to be $otherarguments. Or, if it contains a string with spaces that should be kept as a single string, you should handle it the same way: as an array that is expanded with "${otherarguments[@]}" in quotes. Here is an example with quoting of one of the variables used to hold arguments:
#!/bin/bash -e
function ShowArguments()
{
    for argument in "$@"; do
        echo "Argument: $argument"
    done
}
Argument1="Single argument with multiple words"
Argument2=("Multiple argument" with various "numbers of words")
Command=(ShowArguments "$Argument1" "${Argument2[@]}")
echo "${Command[@]}"
"${Command[@]}"
It produces:
ShowArguments Single argument with multiple words Multiple argument with various numbers of words
Argument: Single argument with multiple words
Argument: Multiple argument
Argument: with
Argument: various
Argument: numbers of words
After that, you might want to replace echo with a fancier command that quotes its arguments, to show the proper quoting to the user. But that is an aesthetic or user-interface choice; it will not affect the command executed.
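For example (a hedged sketch, not part of the original answer), bash's printf %q prints each argument with shell quoting, which makes the word boundaries visible:
printf '%q ' "${Command[@]}"; echo
# ShowArguments Single\ argument\ with\ multiple\ words Multiple\ argument with various numbers\ of\ words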
My problem is that when I read that argument in C, it is passed as several arguments instead of just one.
In your script, replace
command="$EXE_DIR/get_GPS_bimestral $dbDir $gps z $numImagesDB $ncores $geoDistance $imDownload $keywords"
echo $command
$command
by
command="$EXE_DIR/get_GPS_bimestral \"$dbDir $gps z $numImagesDB $ncores $geoDistance $imDownload $keywords\""
echo $command
echo $command | bash
Explanation:
If the shell script b is:
./a.out "$* 3 4"
and c.c is
#include <stdio.h>
int main(int argc, char ** argv)
{
printf("argc = %d, argv[1] = '%s'\n", argc, argv[1]);
}
then:
pi@raspberrypi:/tmp $ ./b 1 2
argc = 2, argv[1] = '1 2 3 4'
the C program receives only one argument, which is what we want.
But if the script is modified to be, for instance:
command="./a.out \"$* 3 4\""
echo $command
$command
that doesn't work:
pi@raspberrypi:/tmp $ ./b 1 2
./a.out "1 2 3 4"
argc = 5, argv[1] = '"1'
So if you want to store the command in a variable in order to echo it and then execute it, you can do:
command="./a.out \"$* 3 4\""
echo $command
echo $command | bash
execution:
pi@raspberrypi:/tmp $ ./b 1 2
./a.out "1 2 3 4"
argc = 2, argv[1] = '1 2 3 4'
Of course, if you want 4 to be received as a separate argument, just put it outside the string:
command="./a.out \"$* 3\" 4"
echo $command
echo $command | bash
execution:
pi@raspberrypi:/tmp $ ./b 1 2
./a.out "1 2 3" 4
argc = 3, argv[1] = '1 2 3'
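A hedged aside, not from the answer above: the pipe works because a fresh shell re-parses the string, so the embedded quotes finally take effect; bash -c does the same without a pipe:
command="./a.out \"$* 3 4\""
echo $command
bash -c "$command"
Either way, be aware that re-parsing the string also re-expands globs and other metacharacters, which is why the array approach in the other answer is generally safer.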

How to get the names of variables that are available/set in shell scripts

I want the names of all variables that are set in a shell script. I have a file which contains key-value pairs; I read the content from that file and store/set it into variables. I want to do some processing if a variable is available/set; otherwise I don't need to do that processing. How can I achieve this?
For example, I run a loop in the shell script, and on each iteration it should give me one of the variables that was set before that point.
If the code is like this
a=test1
b=test2
c=test3
for i in ???
do
echo $i
done
then I want output like this
a
b
c
What command is used to achieve this?
You could use set before and after setting the variables,
e.g.:
$ set > aux1
$ c=345
$ set > aux2
$ diff aux1 aux2
57c57
< PIPESTATUS=([0]="141" [1]="0")
---
> PIPESTATUS=([0]="0")
112a113
> c=345
If you have a pre-defined list of such variables, then you can test them like this:
for i in $(echo "a b c"); do
echo $i
done
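A hedged sketch of two more direct tools (not from the answers above): compgen -v lists every variable name the shell currently knows, and declare -p tells you whether one particular name is set:
#!/bin/bash
a=test1
b=test2
c=test3
compgen -v                                   # prints every variable name, one per line
for name in a b c d; do
    if declare -p "$name" >/dev/null 2>&1; then
        echo "$name is set"
    else
        echo "$name is not set"
    fi
done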
Hoping this can help you:
#!/bin/sh
# Define list of tests
LIST_TESTS=`cat list_test.txt`
for TEST in ${LIST_TESTS}
do
    vartest=`echo ${TEST}`
    if [ "${vartest}" = "" ]
    # No Test
    then
        echo "*** WARNING *** Test not found"
    else
        echo "${vartest} is available"
    fi
done
# Second: define list of tests
tabTest=('test1' 'test2' 'test3')
i=0
while [ "${tabTest[$i]}" != "" ]
do
    echo "${tabTest[$i]} is available"
    i=$(($i+1))
done

Variable assignment from function not working with $()

I'm trying to define a function whose behaviour is, the first time it is called, to ask the user for some choices and then remember those choices so that it does not ask the user again.
Using a variable to store the state (initialized or not) works well when the function is called the normal way. However, if I want this function to return results, capturing those results with $() breaks the behaviour I'm trying to achieve.
Here is a simple reproducible example:
#!/bin/bash
initialized=false
function givevalue
{
    echo "init = $initialized" 1>&2
    if ! $initialized; then
        echo "a b c"
        initialized=true
    else
        echo "d e f"
    fi
}
for i in $(givevalue); do echo "run 1 : $i"; done
for i in $(givevalue); do echo "run 2 : $i"; done
the result I get is
init = false
run 1 : a
run 1 : b
run 1 : c
init = false
run 2 : a
run 2 : b
run 2 : c
while I was expecting
init = false
run 1 : a
run 1 : b
run 1 : c
init = true
run 2 : d
run 2 : e
run 2 : f
The contents of $(...) are run as a separate process and changes to variables are not propagated to the parent shell. This cannot be changed.
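A hedged workaround sketch (not part of the answer): have the function store its result in a global variable instead of printing it, so it can run in the current shell and keep its state:
#!/bin/bash
initialized=false
result=
function givevalue
{
    echo "init = $initialized" 1>&2
    if ! $initialized; then
        result="a b c"
        initialized=true
    else
        result="d e f"
    fi
}
givevalue; for i in $result; do echo "run 1 : $i"; done
givevalue; for i in $result; do echo "run 2 : $i"; done
This produces the output the question expects, because no subshell is involved.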
