Use key/value data from a file in a shell script - bash

I have a file called Year.txt
Year2000= 1/2/3/4/
Year2001= 5/6/7/8/
Year2002= 9/10/11/12/
....
....
....
Year2020= 100/101/102/
and so on.
I need to use Year.txt as a reference in another script, sample.sh:
sample.sh
source /home/user/Year.txt
d=cp $filename $1
echo $d
sample.sh Year2000    # passing Year2000 as the first argument
If I pass Year2000 as my argument, I need to take the part after the = (1/2/3/4/) and use it in my copy statement.
If I pass Year2001 as my argument, I need to take the part after the = (5/6/7/8/) and use it in my copy statement.
etc.
I need output like this:
Input 1: sample.sh Year2000
Output: cp somefile.txt 1/2/3/4/
Input 2: sample.sh Year2001
Output: cp somefile.txt 5/6/7/8/
In short: I need to take the reference from another file and generate the copy statement.

Don't source files that aren't legal bash code. In this case, an associative array lets you store as many key/value pairs as you need inside a single variable.
#!/usr/bin/env bash
case $BASH_VERSION in ''|[123].*) echo "ERROR: Needs bash 4.0 or newer" >&2; exit 1;; esac
year_name=$1
file_name=$2
[[ $file_name ]] || { echo "Usage: $0 year-name file-name" >&2; exit 1; }
# Read Year.txt and generate a map
declare -A dirs_by_year=( )
while IFS='= ' read -r k v; do
dirs_by_year[$k]=$v
done <Year.txt
if ! [[ ${dirs_by_year[$year_name]} ]]; then
echo "ERROR: User specified year $1, but input file does not have a directory for it" >&2
echo " ...defined years follow:" >&2
declare -p dirs_by_year >&2 # print array definition to show what we read
exit 1
fi
# generate and write a cp command
printf '%q ' cp "$file_name" "${dirs_by_year[$year_name]}"
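For example, assuming Year.txt sits in the script's working directory with the lines shown above and the script is saved as sample.sh, a run might look like this (somefile.txt is just an illustrative file name):
./sample.sh Year2000 somefile.txt
cp somefile.txt 1/2/3/4/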

Related

Is there a way to access variables inside of a .xcconfig file from the terminal?

I have an .xcconfig file whose variables I want to read from the terminal. Is there a command or some other way to do this? For example, I have a variable called Build_Code = 1234. How would I access that?
Create a script to read the value of a variable.
Ex: .xcconfig
var1 = value1
var2 = value2
get_value.bash
#!/bin/bash
#
# get_value.bash <file> <variable>
#
usage()
{
echo "Usage: get_value.bash <file> <variable>"
exit 1
}
#################################################
# Arguments
if [[ $# -eq 2 ]]
then
file="$1"
var="$2"
else
usage
fi
# Check if the file exists
if [[ ! -f "$file" ]]
then
echo "ERROR: file $file does not exist."
exit 2
fi
# Get the variable's value
grep -w "$var" "$file" | cut -d'=' -f2 | tr -d ' '
This simple version assumes the format of the lines is VARIABLE\s*=\s*VALUE.
The tr is to remove spaces around the value.
The VALUE cannot contain spaces.
The <file> argument could be hard-coded if you will only ever check .xcconfig
Many other solutions could be conceived, depending on the exact requirements, but this does the basic need you put in your question.
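For example, with a file containing the Build_Code line from the question, a hypothetical run would be:
./get_value.bash .xcconfig Build_Code
1234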

Can't add a new element to an array in bash [duplicate]

In the following program, if I set the variable $foo to the value 1 inside the first if statement, it works in the sense that its value is remembered after the if statement. However, when I set the same variable to the value 2 inside an if which is inside a while statement, it's forgotten after the while loop. It's behaving like I'm using some sort of copy of the variable $foo inside the while loop and I am modifying only that particular copy. Here's a complete test program:
#!/bin/bash
set -e
set -u
foo=0
bar="hello"
if [[ "$bar" == "hello" ]]
then
foo=1
echo "Setting \$foo to 1: $foo"
fi
echo "Variable \$foo after if statement: $foo"
lines="first line\nsecond line\nthird line"
echo -e $lines | while read line
do
if [[ "$line" == "second line" ]]
then
foo=2
echo "Variable \$foo updated to $foo inside if inside while loop"
fi
echo "Value of \$foo in while loop body: $foo"
done
echo "Variable \$foo after while loop: $foo"
# Output:
# $ ./testbash.sh
# Setting $foo to 1: 1
# Variable $foo after if statement: 1
# Value of $foo in while loop body: 1
# Variable $foo updated to 2 inside if inside while loop
# Value of $foo in while loop body: 2
# Value of $foo in while loop body: 2
# Variable $foo after while loop: 1
# bash --version
# GNU bash, version 4.1.10(4)-release (i686-pc-cygwin)
echo -e $lines | while read line
...
done
The while loop is executed in a subshell. So any changes you do to the variable will not be available once the subshell exits.
Instead you can use a here string to re-write the while loop to be in the main shell process; only echo -e $lines will run in a subshell:
while read line
do
if [[ "$line" == "second line" ]]
then
foo=2
echo "Variable \$foo updated to $foo inside if inside while loop"
fi
echo "Value of \$foo in while loop body: $foo"
done <<< "$(echo -e "$lines")"
You can get rid of the rather ugly echo in the here-string above by expanding the backslash sequences immediately when assigning lines. The $'...' form of quoting can be used there:
lines=$'first line\nsecond line\nthird line'
while read line; do
...
done <<< "$lines"
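With that change, foo set inside the loop is visible after it. A minimal sketch of the trimmed-down test (variable names taken from the question):
lines=$'first line\nsecond line\nthird line'
foo=0
while read -r line; do
    if [[ "$line" == "second line" ]]; then
        foo=2      # runs in the current shell, not a subshell
    fi
done <<< "$lines"
echo "Variable \$foo after while loop: $foo"   # now prints 2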
Update #2
The explanation is in Blue Moon's answer.
Alternative solutions:
Eliminate echo
while read line; do
...
done <<EOT
first line
second line
third line
EOT
Add the echo inside the here-is-the-document
while read line; do
...
done <<EOT
$(echo -e $lines)
EOT
Run echo in background:
coproc echo -e $lines
while read -u ${COPROC[0]} line; do
...
done
Redirect to a file handle explicitly (Mind the space in < <!):
exec 3< <(echo -e $lines)
while read -u 3 line; do
...
done
Or just redirect to the stdin:
while read line; do
...
done < <(echo -e $lines)
And one for chepner (eliminating echo):
arr=("first line" "second line" "third line");
for((i=0;i<${#arr[*]};++i)) { line=${arr[i]};
...
}
Variable $lines can be converted to an array without starting a new sub-shell. The two characters \ and n have to be converted to a single character (e.g. a real newline character), and then the IFS (Internal Field Separator) variable is used to split the string into array elements. This can be done like this:
lines="first line\nsecond line\nthird line"
echo "$lines"
OIFS="$IFS"
IFS=$'\n' arr=(${lines//\\n/$'\n'}) # Conversion
IFS="$OIFS"
echo "${arr[#]}", Length: ${#arr[*]}
set|grep ^arr
Result is
first line\nsecond line\nthird line
first line second line third line, Length: 3
arr=([0]="first line" [1]="second line" [2]="third line")
You are asking this bash FAQ. The answer also describes the general case of variables set in subshells created by pipes:
E4) If I pipe the output of a command into read variable, why
doesn't the output show up in $variable when the read command finishes?
This has to do with the parent-child relationship between Unix
processes. It affects all commands run in pipelines, not just
simple calls to read. For example, piping a command's output
into a while loop that repeatedly calls read will result in
the same behavior.
Each element of a pipeline, even a builtin or shell function,
runs in a separate process, a child of the shell running the
pipeline. A subprocess cannot affect its parent's environment.
When the read command sets the variable to the input, that
variable is set only in the subshell, not the parent shell. When
the subshell exits, the value of the variable is lost.
Many pipelines that end with read variable can be converted
into command substitutions, which will capture the output of
a specified command. The output can then be assigned to a
variable:
grep ^gnu /usr/lib/news/active | wc -l | read ngroup
can be converted into
ngroup=$(grep ^gnu /usr/lib/news/active | wc -l)
This does not, unfortunately, work to split the text among
multiple variables, as read does when given multiple variable
arguments. If you need to do this, you can either use the
command substitution above to read the output into a variable
and chop up the variable using the bash pattern removal
expansion operators or use some variant of the following
approach.
Say /usr/local/bin/ipaddr is the following shell script:
#! /bin/sh
host `hostname` | awk '/address/ {print $NF}'
Instead of using
/usr/local/bin/ipaddr | read A B C D
to break the local machine's IP address into separate octets, use
OIFS="$IFS"
IFS=.
set -- $(/usr/local/bin/ipaddr)
IFS="$OIFS"
A="$1" B="$2" C="$3" D="$4"
Beware, however, that this will change the shell's positional
parameters. If you need them, you should save them before doing
this.
This is the general approach -- in most cases you will not need to
set $IFS to a different value.
Some other user-supplied alternatives include:
read A B C D << HERE
$(IFS=.; echo $(/usr/local/bin/ipaddr))
HERE
and, where process substitution is available,
read A B C D < <(IFS=.; echo $(/usr/local/bin/ipaddr))
Hmmm... I would almost swear that this worked for the original Bourne shell, but don't have access to a running copy just now to check.
There is, however, a very trivial workaround to the problem.
Change the first line of the script from:
#!/bin/bash
to
#!/bin/ksh
Et voila! A read at the end of a pipeline works just fine, assuming you have the Korn shell installed.
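For illustration, a minimal sketch of the behavior being relied on (not from the original post; it assumes an AT&T Korn shell, where the last element of a pipeline runs in the current shell):
#!/bin/ksh
foo=0
printf 'first line\nsecond line\nthird line\n' | while read -r line; do
    if [ "$line" = "second line" ]; then
        foo=2
    fi
done
echo "foo after the loop: $foo"   # prints 2 under ksh, 0 under bash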
This is an interesting question, and it touches on a very basic concept in the Bourne shell and subshells. Here I provide a solution that differs from the previous solutions by doing some kind of filtering. I will give an example that may be useful in real life: a fragment for checking that downloaded files conform to a known checksum. The checksum file looks like the following (showing just 3 lines):
49174 36326 dna_align_feature.txt.gz
54757 1 dna.txt.gz
55409 9971 exon_transcript.txt.gz
The shell script:
#!/bin/sh
.....
failcnt=0 # this variable is only valid in the parent shell
#variable xx captures all the outputs from the while loop
xx=$(cat ${checkfile} | while read -r line; do
num1=$(echo $line | awk '{print $1}')
num2=$(echo $line | awk '{print $2}')
fname=$(echo $line | awk '{print $3}')
if [ -f "$fname" ]; then
res=$(sum $fname)
filegood=$(sum $fname | awk -v na=$num1 -v nb=$num2 -v fn=$fname '{ if (na == $1 && nb == $2) { print "TRUE"; } else { print "FALSE"; }}')
if [ "$filegood" = "FALSE" ]; then
failcnt=$(expr $failcnt + 1) # only in subshell
echo "$fname BAD $failcnt"
fi
fi
done | tail -1) # I am only interested in the final result
# you can capture a whole bunch of texts and do further filtering
failcnt=${xx#* BAD } # I am only interested in the number
# this variable is in the parent shell
echo failcnt $failcnt
if [ $failcnt -gt 0 ]; then
echo $failcnt files failed
else
echo download successful
fi
The parent and subshell communicate through the echo command. You can pick some easy to parse text for the parent shell. This method does not break your normal way of thinking, just that you have to do some post processing. You can use grep, sed, awk, and more for doing so.
I store the value in a file from within the loop and read it back from that file outside. Note that > 2 redirects to a file literally named 2, not to stderr (that would be >&2).
Here var i is initialized to 1 and incremented for every line read inside the loop.
# reading lines of content from 2 files concatenated
# inside loop: write the value of var i to the file (each iteration)
# outside: read var i back from the file; it holds the last iterative value
f=/tmp/file1
g=/tmp/file2
i=1
cat $f $g | \
while read -r s;
do
echo $s > /dev/null; # some work
echo $i > 2
let i++
done;
read -r i < 2
echo $i
Or use the heredoc method to reduce the amount of code in a subshell.
Note the iterative i value can be read outside the while loop.
i=1
while read -r s;
do
echo $s > /dev/null
let i++
done <<EOT
$(cat $f $g)
EOT
let i--
echo $i
How about a very simple method:
+ call your while loop in a function
  - set your value inside (nonsense, but it shows the example)
  - return your value inside
+ capture your value outside
+ set it outside
+ display it outside
#!/bin/bash
# set -e
# set -u
# No idea why you need this, not using here
foo=0
bar="hello"
if [[ "$bar" == "hello" ]]
then
foo=1
echo "Setting \$foo to $foo"
fi
echo "Variable \$foo after if statement: $foo"
lines="first line\nsecond line\nthird line"
function my_while_loop
{
echo -e $lines | while read line
do
if [[ "$line" == "second line" ]]
then
foo=2;
echo "Variable \$foo updated to $foo inside if inside while loop"
return 2;
fi
# Code below won't be executed since we returned from function in 'if' statement
# We already reported the $foo var being set to 2 anyway
echo "Value of \$foo in while loop body: $foo"
done
}
my_while_loop; foo="$?"
echo "Variable \$foo after while loop: $foo"
Output:
Setting $foo to 1
Variable $foo after if statement: 1
Value of $foo in while loop body: 1
Variable $foo updated to 2 inside if inside while loop
Variable $foo after while loop: 2
bash --version
GNU bash, version 3.2.51(1)-release (x86_64-apple-darwin13)
Copyright (C) 2007 Free Software Foundation, Inc.
Though this is an old question that has been asked several times, here's what I ended up doing after hours of fiddling with here strings: the only option that worked for me was to store the value in a file inside the while-loop subshell and then retrieve it afterwards. Simple.
Use an echo statement to store the value and a cat statement (or command substitution) to retrieve it. The user running the script needs read/write access to the directory holding the file.
#write to file
echo "1" > foo.txt
while condition; do
if (condition); then
#write again to file
echo "2" > foo.txt
fi
done
#read from file
echo "Value of \$foo in while loop body: $(cat foo.txt)"

Shell-write a file into shell variable

I have a file like this format:
a;b;c
e;d;f
How can I use the shell to read the file's contents into variables?
I would have 6 variables to store the data.
In more detail, my situation is as follows.
I have written a script:
#!/bin/sh
unset ret
ret=0
if [ "$#" -ne 3 ]; then
logger -p err "Usage: $0 eth_name rule_file table_name"
ret=1
exit $ret
fi
OFS=$IFS # store field separator
IFS=";" # define field separator
eth_name=$1 # ethernet device name
rule_file=$2 # input file name
table_name=$3 # lookup table name
logger -p notice "$0 $eth_name $rule_file $table_name"
unset a # reference to line array
unset i j # index
unset m n # dimension
### read route configuration
i=0
while read line
do
a=A$i
unset $a
declare -a $a='($line)'
i=$((i+1))
done < $rule_file
# store number of lines
m=$i
# function for apply route
add_route()
{
if [ "source" = "$1" ]; then
src_address=$(ifconfig $eth_name | sed -n 's/.*inet addr:\([0-9.]\+\)\s.*/\1/p')
ip rule add from $src_address lookup $table_name
ret=$?
logger -p notice "ip rule add from $src_address lookup $table_name $ret"
elif [ "default" = "$1" ]; then
ip route add default via $2 table $table_name
ret=$?
logger -p notice "ip route add default via $2 table $table_name $ret"
else
ipaddress_range=$1
gateway_ipaddress=$2
ip route add $ipaddress_range via $gateway_ipaddress dev $eth_name table $table_name
ret=$?
logger -p notice "ip route add $ipaddress_range via $gateway_ipaddress dev $eth_name table $table_name $ret"
fi
}
### apply route configuration
for ((i=0; i < $m; i++))
do
a=A$i
# get line size
# double escape '\\' for sub shell '``' and 'echo'
p0=`eval echo \\${$a[0]}`
p1=`eval echo \\${$a[1]}`
add_route $p0 $p1
done
IFS=$OFS
the rule file's format is as the following shows:
source;
default_route;172.20.5.192/26
default_gateway;172.20.5.254
172.17.23.64/26;172.20.5.254
172.31.252.0/24;172.20.5.254
172.31.254.0/24;172.20.5.254
10.217.1.0/24;172.20.5.254
10.217.2.0/24;172.20.5.254
This script works normally under bash, but my Linux system no longer has bash, so the script no longer runs. How do I change the script to make it work?
What the script does is very simple: it writes every line into the Linux system's ip rule and ip route tables. It needs 3 arguments to run.
You can easily achieve it with read.
#!/bin/bash
while IFS=";" read -r var1 var2 var3; do
command
done <file
exit 0
where:
IFS is your delimiter
varN are your vars, you can use any name you want
N.B. If you need stdin you need to use a file descriptor:
#!/bin/bash
exec 3<file
while IFS=";" read -r var1 var2 var3 <&3; do
command
done
exec 3>&-
exit 0
Further readings here.
N.B. #2: The command read is not the fastest solution; beyond about 5k lines it typically loses around 100 ms compared to other tools (e.g. awk).
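For the two-line sample at the top of the question, a minimal sketch that reads all six fields into separate variables (the file name data.txt is just an illustration):
{
    IFS=";" read -r a b c
    IFS=";" read -r d e f
} < data.txt
echo "$a $b $c $d $e $f"   # prints: a b c e d f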
The following parses your original sample text and saves fields to sequentially numbered variables, stripping values out of lines using parameter expansion, and doesn't care how many fields you have per line.
#!/bin/sh
i=0
while read line; do
while [ -n "$line" ]; do
eval this_$((i=i+1))="\${line%%;*}"
last="$line"
line="${line#*;}"
[ "$last" = "$line" ] && break
done
done < input.txt
I've tested this successfully with both bash and FreeBSD's /bin/sh (which is based on ash). (Note that FreeBSD's /bin/sh doesn't seem to like arithmetic expressions like $((i++)), but both shells I tested are fine with the notation above.)
But from the look of the script you updated your question with, this isn't what you need. It seems that you have input data with:
one record per line, and
two fields, separated by semicolons.
But I wonder if you even need to store things in variables. It seems to me you'd be looking more for the following type of structure.
#!/bin/sh
...
while IFS=";" read range gateway; do
case "$range" in
source)
: Note that $gateway is blank
;;
default_route)
: do something
;;
default_gateway)
: do something else
;;
[0-9]*)
ip route add "$range" via "$gateway" dev "$eth_name" table "$table_name"
;;
*)
printf 'ERROR: unknown range in rule file: %s\n' "$range" >&2
;;
esac
done < $rule_file
Additional input validation wouldn't hurt.

How to use a text file for multiple variable in bash

I want to make a bash script for things I use a lot and for easy access to things, but I want to add a first-run setup that saves the typed paths to programs or commands in a txt file. How can I do that? And how can I load the lines of the text file into multiple variables?
After a lot of testing with the 2 answers given, what I need is to store a variable directly to a text file, not to ask a user for details and then store those.
So I want it to be like this:
if [[ -d "/home/$(whoami)/.minecraft" && ! -L "/home/$(whoami)/.minecraft" ]] ; then
echo "Minecraft found"
minecraft="/home/$(whoami)/Desktop/shortcuts/Minecraft.jar" > safetofile
# This ^ needs to be stored on a line in the textfile
else
echo "No Minecraft found"
fi
if [[ -d "/home/$(whoami)/.technic" && ! -L "/home/$(whoami)/.technic" ]]; then
echo "Technic found"
technic="/home/$(whoami)/Desktop/shortcuts/TechnicLauncher.jar" > safetofile
# This ^ also needs to be stored on an other line in the textfile
else
echo "No Technic found"
fi
I really want an answer to this because I want to keep scripting in bash. I already have experience in bash scripting.
Here's an example:
#!/bin/bash
if [[ -f ~/.myname ]]
then
name=$(< ~/.myname)
else
echo "First time setup. Please enter your name:"
read name
echo "$name" > ~/.myname
fi
echo "Hello $name!"
The first time this script is run, it will ask the user for their name and save it. The next time, it will load the name from the file instead of asking.
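A hypothetical run (assuming the script is saved as firstrun.sh and the name entered is Alice):
$ ./firstrun.sh
First time setup. Please enter your name:
Alice
Hello Alice!
$ ./firstrun.sh
Hello Alice!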
#!/bin/bash
# file to save the vars
init_file=~/.init_vars.txt
# save_to_file - subroutine to read var and save to file
# first arg is the var, assumes init_file already exists
save_to_file()
{
echo "Enter $1:"
read val
# check if val has any spaces in them, you will need to quote them if so
case "$val" in
*\ *)
# quote with double quotes before saving to init_file
echo "$1=\"$val\"" >> $init_file
;;
*)
# save var=val to file
echo "$1=$val" >> $init_file
;;
esac
}
if [[ ! -f $init_file ]]
then
# init_file doesn't exist; we only get here once
# create an empty init_file
touch $init_file
# vars to be read and saved in file, modify accordingly
for var in "name" "age" "country"
do
# call subroutine
save_to_file "$var"
done
fi
# init_file now has three entries,
# name=val1
# age=val2
# country=val3
# source the init_file which will read and execute commands from init_file,
# which set the three variables
. ${init_file}
# echo to make sure it is working
echo $name $age $country

Shell scripting to print list of elements

Is there any command in shell scripting similar to "list" in Tcl? I want to write a list of elements to a file (each on a separate line). But if an element matches a particular pattern, then that element and the element next to it should be printed on the same line. Is there any command in shell script for doing this?
Example: my string is "execute the command run abcd.v".
I want to write each word on a separate line of a file, but if the word is "run" then run and abcd.v must be printed on the same line. So the output should look like this:
execute
the
command
run abcd.v
How to do this in shell scripting?
line="execute the command run abcd.v"
for word in $line # the variable needs to be unquoted to get "word splitting"
do
case $word in
run|open|etc) sep=" " ;;
*) sep=$'\n' ;;
esac
printf "%s%s" $word "$sep"
done
See http://www.gnu.org/software/bash/manual/bashref.html#Word-Splitting
Here's how you can do it in bash:
Name the following script list
Make it executable
Copy it to your ~/bin/
list:
#!/bin/bash
# list
while [[ -n "$1" ]]
do
if [[ "$1" == "run" ]]; then
echo "$1 $2"
else
echo "$1"
fi
shift
done
And this is how you can use it on the command prompt:
list execute the command run abcd.v > outputfile.txt
And your outputfile.txt will be written as:
execute
the
command
run abcd.v
You could accomplish it with the script below. It is not a single command: a for loop contains an if statement that checks for the keyword run and, in that case, does not append a newline character (echo -n).
for i in `echo "execute the command run abcd.v"`
do
if [ $i = "run" ] ; then
echo -n "$i " >> fileOutput
else
echo $i >> fileOutput
fi
done
