Assign fields to variables - Bash [duplicate]

This question already has answers here:
How do I pipe a file line by line into multiple read variables?
(3 answers)
How to split one string into multiple variables in bash shell? [duplicate]
(5 answers)
Read tab-separated file line into array
(3 answers)
Closed 4 years ago.
Suppose I have a string with pipe separator:
str="1|2|3|4"
I want them to be assigned to specific variables.
var_a=1
var_b=2
var_c=3
var_d=4
I am doing it in this way:
var_a="`echo $str | cut -d'|' -f1`"
var_b="`echo $str | cut -d'|' -f2`"
var_c="`echo $str | cut -d'|' -f3`"
var_d="`echo $str | cut -d'|' -f4`"
Can this be done in an efficient way? Please suggest.

It is better to use an array to store individual delimited values:
str="1|2|3|4"
IFS='|' read -ra arr <<< "$str"
#examine array values
declare -p arr
declare -a arr='([0]="1" [1]="2" [2]="3" [3]="4")'
To loop through the array, use:
for i in "${arr[@]}"; do echo "$i"; done
1
2
3
4
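If you still want the specific named variables from the question, you can assign them from the array elements (indexing starts at 0):
var_a=${arr[0]}
var_b=${arr[1]}
var_c=${arr[2]}
var_d=${arr[3]}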

IFS='|' read -r var_a var_b var_c var_d rest <<<"$str"
rest is the variable that gets further columns after the first four, should any others exist. If you just want to discard them, the conventional name to use for a placeholder variable is _.
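For example, to keep the first four fields and silently discard anything after them:
IFS='|' read -r var_a var_b var_c var_d _ <<<"$str"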
This is covered in detail in BashFAQ #1: How can I read a file (data stream, variable) line-by-line (and/or field-by-field)?

Related

BASH - convert array string to array [duplicate]

This question already has answers here:
Convert a JSON array to a bash array of strings
(4 answers)
Closed 3 years ago.
I am getting the following string and I am trying to convert it into an array.
[ "aaa", "bbb" ]
Would someone let me know if there is a way to convert this into a Bash array?
You can use jq to extract the individual strings, and then read them line by line:
myJsonArray='[ "aaa", "bbb", "more \"complex\"\u0020value" ]'
mapfile -t myBashArray < <(jq -r '.[]' <<< "$myJsonArray")
declare -p myBashArray
This outputs declare -a myBashArray=([0]="aaa" [1]="bbb" [2]="more \"complex\" value")
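You can then use the resulting array like any other Bash array, for example:
for s in "${myBashArray[@]}"; do printf '%s\n' "$s"; done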
If you additionally want to support elements with linefeeds and such, you can have jq output NUL-separated values with a bit more work:
myJsonArray='[ "multi\nline\ndata", "hostile'\''\"\u0000\n$(rm foo)data" ]'
array=()
while IFS= read -r -d '' line
do
array+=("$line")
done < <(jq -j '(.[] | gsub("\u0000"; "")) + "\u0000"' <<< "$myJsonArray")
declare -p array
This outputs declare -a array=([0]=$'multi\nline\ndata' [1]=$'hostile\'"\n$(rm foo)data')
It makes sure NUL bytes in the data don't interfere, but they will be stripped from the output. This is unlikely to matter since Bash variables can't represent NUL bytes in the first place.

How to put each line of a file in an array [duplicate]

This question already has answers here:
Creating an array from a text file in Bash
(7 answers)
Closed 3 years ago.
I have a problem with one of my bash scripts.
I have a file that stores a list of email addresses, one per line, like so:
mail@address1
mail@address2
...
What I'd like to do is put each line of the file into an array, where each index corresponds to a line, in the right order.
If mapfile is not available (for example on older Bash versions), you can also do this:
set -f          # disable globbing so lines containing * or ? are not expanded
IFS=$'\n'       # split on newlines only
arr=($(<foo.txt))
set +f          # restore globbing (and consider restoring IFS afterwards)
To read the lines of a file into an array:
mapfile -t myArray < myFile
or
myArray=()
while IFS= read -r line || [[ "$line" ]] ; do
myArray+=( "$line" )
done < myFile
To read the fields of a line into an array, use read with -a and a here-string (<<<):
# suppose: line="foo,bar,baz"
IFS=, read -r -a fields <<< "$line"
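Checking the result (output shown for a recent bash):
declare -p fields
declare -a fields=([0]="foo" [1]="bar" [2]="baz")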

Associative array in bash not storing values inside loop [duplicate]

This question already has answers here:
A variable modified inside a while loop is not remembered
(8 answers)
Closed 4 years ago.
This is my $a output:
[root@node1 ~]# echo "${a}"
/dev/vdc1 /gfs1
/dev/vdd1 /elastic
mfsmount /usr/local/flytxt
I need to store these in an associative array fsmounts, with the first column as keys and the second column as values.
This is my code for that:
declare -A fsmounts
echo "$a" | while read i ; do key=$(echo "$i" | awk '{print $1}'); value=$(echo "$i" | awk '{print $2}');fsmounts[$key]=$value; done;
But when I try to print outside the loop with
[root@node1 ~]# echo ${fsmounts[/dev/vdb1]}
The output is blank. I think the associative array fsmounts is not actually storing values. Please help me.
But I can actually echo fsmounts[$key] inside the loop. See this:
echo "$a" | while read i ; do key=$(echo "$i" | awk '{print $1}'); value=$(echo "$i" | awk '{print $2}');fsmounts[$key]=$value; echo ${fsmounts[$key]}; done;
/gfs1
/elastic
/usr/local/flytxt
The immediate problem with your logic is the classic pitfall described in BashFAQ #24: I set variables in a loop that's in a pipeline. Why do they disappear after the loop terminates? Or, why can't I pipe data to read? Because the pipe runs the while loop in a subshell, the assignments to fsmounts are lost when that subshell exits.
You don't need awk or any third-party tool at all for this. Just loop with read and feed the string to the loop via a here-string instead of a pipe:
declare -A fsmounts
while read -r key value; do
fsmounts["$key"]="$value"
done <<<"$a"
Now loop over the array with keys and values:
for key in "${!fsmounts[@]}"; do
printf 'key=%s value=%s\n' "$key" "${fsmounts[$key]}"
done
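With the example data above, you can also look up a single mount point directly by its key:
echo "${fsmounts[/dev/vdc1]}"
/gfs1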
More on how to use associative arrays - How can I use variable variables (indirect variables, pointers, references) or associative arrays?

Bad Substitution when I try to print a specific position of array [duplicate]

This question already has answers here:
Difference between sh and Bash
(11 answers)
Closed 10 months ago.
I'm getting started with bash programming, and I want to print a specific position of array, but when I try I get this error: Bad substitution
#!/bin/sh
user=`cut -d ";" -f1 $ultimocsv | sort -d | uniq -c`
arr=$(echo $user | tr " " "\n")
a=5
echo "${arr[$a]}" #Error:bad substitution
why?
You are using sh, which does not support arrays. Even if you used bash, it would still not behave as you expect, because arr would not be an array: the command substitution assigns one plain string. I am also not sure whether the -c option to uniq is what you wanted, since it prefixes each line with a count.
I assume this is what you are looking for:
#!/bin/bash
mapfile -t arr < <( cut -d ";" -f1 $ultimocsv | sort -d | uniq )
a=5
echo "${arr[$a]}"
This will not give an error even if the output has fewer than six unique lines, because bash returns an empty string for an unset array element.
It even works with uniq -c, because mapfile puts complete lines into the array.
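As a minimal sketch of the difference (the data here is made up): a command substitution assigns one scalar string, while mapfile builds a real array:
#!/bin/bash
s=$(printf 'a\nb\nc')                    # one scalar string containing newlines
mapfile -t arr < <(printf 'a\nb\nc\n')   # a real array, one element per line
echo "${s[1]}"                           # empty: a scalar only has ${s[0]}/${s}
echo "${arr[1]}"                         # b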

How to split one string into multiple variables in bash shell? [duplicate]

This question already has answers here:
How do I split a string on a delimiter in Bash?
(37 answers)
Closed 7 years ago.
I've been looking for a solution and found similar questions, only they were attempting to split sentences with spaces between them, and the answers do not work for my situation.
Currently a variable is being set to a string like this:
ABCDE-123456
and I would like to split that into 2 variables, while eliminating the "-". i.e.:
var1=ABCDE
var2=123456
How is it possible to accomplish this?
This is the solution that worked for me:
var1=$(echo $STR | cut -f1 -d-)
var2=$(echo $STR | cut -f2 -d-)
Is it possible to use the cut command without a delimiter, so that each character gets assigned to its own variable?
var1=$(echo $STR | cut -f1 -d?)
var2=$(echo $STR | cut -f1 -d?)
var3=$(echo $STR | cut -f1 -d?)
etc.
To split a string separated by -, you can use read with IFS:
$ IFS=- read -r var1 var2 <<< ABCDE-123456
$ echo "$var1"
ABCDE
$ echo "$var2"
123456
Edit:
Here is how you can read each individual character into array elements:
$ read -ra foo <<<"$(echo "ABCDE-123456" | sed 's/./& /g')"
Dump the array:
$ declare -p foo
declare -a foo='([0]="A" [1]="B" [2]="C" [3]="D" [4]="E" [5]="-" [6]="1" [7]="2" [8]="3" [9]="4" [10]="5" [11]="6")'
If there are spaces in the string:
$ IFS=$'\v' read -ra foo <<<"$(echo "ABCDE 123456" | sed $'s/./&\v/g')"
$ declare -p foo
declare -a foo='([0]="A" [1]="B" [2]="C" [3]="D" [4]="E" [5]=" " [6]="1" [7]="2" [8]="3" [9]="4" [10]="5" [11]="6")'
If you know it's going to be just two fields, you can skip the extra subprocesses like this, using parameter expansion:
var1=${STR%-*}
var2=${STR#*-}
What does this do? ${STR%-*} deletes the shortest substring of $STR that matches the pattern -* starting from the end of the string. ${STR#*-} does the same, but with the *- pattern and starting from the beginning of the string. They each have counterparts %% and ## which find the longest anchored pattern match. If anyone has a helpful mnemonic to remember which does which, let me know! I always have to try both to remember.
See the bash documentation for more information.
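A quick illustration with a string containing more than one dash shows the difference between the shortest and longest matches:
STR=ABCDE-123-456
echo "${STR%-*}"     # ABCDE-123  (shortest suffix match removed)
echo "${STR%%-*}"    # ABCDE      (longest suffix match removed)
echo "${STR#*-}"     # 123-456    (shortest prefix match removed)
echo "${STR##*-}"    # 456        (longest prefix match removed)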
If your solution doesn't have to be general, i.e. only needs to work for strings like your example, you could do:
var1=$(echo $STR | cut -f1 -d-)
var2=$(echo $STR | cut -f2 -d-)
I chose cut here because you could simply extend the code for a few more variables...
Sounds like a job for set with a custom IFS.
IFS=-
set $STR
var1=$1
var2=$2
(You will want to do this in a function with a local IFS so you don't mess up other parts of your script where you require IFS to be what you expect.)
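A minimal sketch of that approach wrapped in a function (the function name is just an example):
split_dash() {
    local IFS=-        # affects word splitting only inside this function
    set -- $1          # unquoted on purpose: splits the argument on '-'
    var1=$1
    var2=$2
}
split_dash "ABCDE-123456"
echo "$var1"   # ABCDE
echo "$var2"   # 123456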
Using bash regex capabilities:
re="^([^-]+)-(.*)$"
[[ "ABCDE-123456" =~ $re ]] && var1="${BASH_REMATCH[1]}" && var2="${BASH_REMATCH[2]}"
echo $var1
echo $var2
OUTPUT
ABCDE
123456
