I have a file with 2 lines and I want to read them into 2 variables respectively. How do I accomplish this in shellscript(bash)?
You can open file descriptors in a shell to read the variables:
#!/bin/bash
# open file
exec 6<tst.txt
read foo <&6
read bar <&6
# close file again
exec 6<&-
echo $foo $bar
EDIT:
As a quick explanation, this is using IO redirection. Normally the file descriptors are handled as follows:
0 stdin (input)
1 stdout (output)
2 stderr (error)
However, there's nothing preventing you from using other file descriptors (up to 9), so we're opening the "tst.txt" file on file descriptor 6 and reading from it using IO redirection.
So, exec 6<tst.txt opens file descriptor 6 and redirects tst.txt into it, whereas exec 6<&- closes it again.
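For completeness, a minimal sketch of the same idea without juggling a numbered descriptor (assuming the same tst.txt file): group the two reads so they share a single redirection.
#!/bin/bash
# Both reads consume from the same single open of tst.txt
{ read -r foo; read -r bar; } < tst.txt
echo "$foo $bar"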
I'm unfortunately not on linux right now to test, but this would be close.
#!/bin/bash
file="/path/to/file"
# Store the previous IFS so we don't break anything else in the script.
prevIFS=$IFS
# You need the line break to capture a newline.
IFS='
'
# -d '' makes read consume the whole file, so IFS (a newline) splits it into the two variables.
read -r -d '' var1 var2 < "$file"
echo "Var1: $var1"
echo "Var2: $var2"
# Set IFS back to normal
IFS=$prevIFS
The simplest answer would be using sed command. Assuming that your file name is file.txt
var1=($(sed '1q;d' file.txt))
var2=($(sed '2q;d' file.txt))
Here 1q and 2q select the line number.
All the values on line 1 will be assigned to var1, and similarly for var2.
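Note that because of the outer parentheses each variable is actually a bash array of the words on that line; a quick sketch (the sample file contents here are made up):
$ cat file.txt
hello world
second line here
$ var1=($(sed '1q;d' file.txt))
$ echo "${var1[@]}"    # every word of line 1
hello world
$ echo "${var1[0]}"    # just the first word
hello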
Try this:
#!/bin/bash
I=0
while read; do
VAR[$I]=$REPLY
((I++))
done < file
echo ${VAR[0]}
echo ${VAR[1]}
This will also work with a file that has more than 2 lines.
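On bash 4 and later, mapfile (also known as readarray) fills the same array without an explicit loop; a short sketch using the same file name:
#!/bin/bash
mapfile -t VAR < file   # -t strips the trailing newline from each line
echo "${VAR[0]}"
echo "${VAR[1]}"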
Can you reconfigure the input file (with variables) to work as shell code? i.e.
$ cat varFile
var1=xyz
var2=abc
$ cat myShellScript.sh
#/bin/whatever (bash)?
# source the variable file
. /path/to/varFile
echo $var1
echo $var2
This is a standard technique in shell scripting and makes it much easier to manage configuration where you need to control your (unix/linux) environment based on which physical hardware your system is running on. If this is part of your concern, please let me know and I'll update the sample code to extend this technique.
I hope this helps.
#!/bin/bash
if [ $# -ne 1 ]
then
echo "USAGE:vitest filename"
else
FILENAME=$1
exec vi $FILENAME <<EOF
i
Line 1.
Line 2.
^[
ZZ
EOF
fi
exit 0
I'm trying to input "Line 1." and "Line 2." with exec vi, using the here document and the commands shown.
When running the script it gives me the following:
Vim(?):Warning: Input is not from a terminal
Vim: Error reading input, exiting...
Press ENTER or type command to continueVim: Finished.
Vim: Error reading input, exiting...
Vim: Finished.
You want to start vi in ex mode, with a few minor changes to the script.
vi -e "$FILENAME" <<EOF
i
Line 1.
Line 2.
.
wq
EOF
exec is almost certainly unnecessary, especially since you have an exit command following vi. exec is used to replace the current script with the given command; it is not needed simply to execute a command.
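A minimal sketch of that point about exec: once the shell replaces itself, nothing after the exec line ever runs.
#!/bin/bash
exec echo "the shell process is replaced by this echo"
echo "this line is never reached"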
A brief history of UNIX text editors:
ed was the original editor, designed to work with a teletype rather than a video terminal.
ex was an extended version of ed, designed to take advantage of a video terminal.
vi was an editor that provided ex with a full-screen visual mode, in contrast with the line-oriented interface employed by ed and ex.
As suggested, ed
ed file << END
1i
line1
line2
.
wq
END
The "dot" line means "end of input".
It can be written less legibly as a one-liner
printf "%s\n" 1i "line1" "line2" . wq | ed file
Use cat.
$ cat file1.txt file2.txt | tee file3.txt
Line 1
Line 2
aaaa
bbbb
cccc
Using sed
If I understand correctly, you want to add two lines to the beginning of a file. In that case, as per Cyrus' suggestion, run:
#!/bin/bash
if [ $# -ne 1 ]
then
echo "USAGE:vitest filename"
exit 1
fi
sed -i.bak '1 s/^/line1\nline2\n/' "$1"
Notes:
When a shell variable is used, it should be in double quotes unless you want word splitting and pathname expansion to be performed. This is important for file names in particular, as it is now common for them to contain whitespace (a short illustration follows these notes).
It is best practice to use lower or mixed case names for shell variables. The system uses upper case names for its variables and you don't want to overwrite one of them accidentally.
In the check for the argument, the if statement should include an exit to prevent the rest of the script from being run in the case that no argument was provided. In the above, we added exit 1 which sets the exit code to 1 to signal an error.
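A quick illustration of the quoting note above (a sketch; exact error wording varies by platform):
$ touch "my file.txt"
$ f="my file.txt"
$ ls $f        # unquoted: word splitting produces two arguments
ls: cannot access 'my': No such file or directory
ls: cannot access 'file.txt': No such file or directory
$ ls "$f"      # quoted: a single argument
my file.txt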
Using vi
Let's start with this test file:
$ cat File
some line
Now, let's run vi and see what is in File afterward:
$ vi -s <(echo $'iline1\nline2\n\eZZ') File
$ cat File
line1
line2
some line
The above requires bash or similar.
I have this test script:
#!/bin/bash
echo "Read a variable"
#open file
exec 6<test.txt
read EXAMPLE <&6
#close file again
exec 6<&-
echo $EXAMPLE
The file test.txt has only one line:
EXAMPLE=1
The output is:
bash-3.2$ ./Read_Variables.sh
Read the variable
EXAMPLE=1
I just need to use the value of $EXAMPLE, in this case 1. So how can I avoid getting the EXAMPLE= part in the output?
Thanks
If the file containing your variables is using bash syntax throughout (e.g. X=Y), another option is to use source:
#!/bin/bash
echo "Read a variable"
source test.txt
echo $EXAMPLE
As an alternative to sourcing the entire file, you can try the following:
while read line; do
[[ $line =~ EXAMPLE= ]] && declare "$line" && break
done < test.txt
which will scan the file until it finds the first line that looks like an assignment to EXAMPLE, then use the declare builtin to perform the assignment. It's probably a little slower, but it's more selective about what is actually executed.
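If there is any chance of other lines containing the string EXAMPLE= (say OTHER_EXAMPLE=2), a slightly stricter sketch anchors the match to the start of the line:
while read -r line; do
    [[ $line =~ ^EXAMPLE= ]] && declare "$line" && break
done < test.txt
echo "$EXAMPLE"   # 1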
I think the most proper way to do this is by sourcing the file which contains the variable (if it has bash syntax), but if I were to do that, I'd source it in a subshell, so that if there are ever other variables declared there, they won't override any important variables in the current shell:
(. test.txt && echo $EXAMPLE)
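If the value is needed back in the parent shell, the subshell's output can be captured (a sketch):
EXAMPLE=$(. test.txt && echo "$EXAMPLE")
echo "$EXAMPLE"   # 1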
You could read the line in as an array (notice the -a option) which can then be indexed into:
# ...
IFS='=' read -a EXAMPLE <&6
echo ${EXAMPLE[0]} # EXAMPLE
echo ${EXAMPLE[1]} # 1
# ...
This call to read splits the input line on the IFS and puts the resulting parts into an indexed array.
See help read for more information about read options and behaviour.
You could also manipulate the EXAMPLE variable directly:
# ...
read EXAMPLE <&6
echo ${EXAMPLE##*=} # 1
# ...
If all you need is to "import" other Bash declarations from a file you should just use:
source file
Consider an ASCII text file (let's say it contains code in a non-shell scripting language):
Text_File.msh:
spool on to '$LOG_FILE_PATH/logfile.log';
login 'username' 'password';
....
Now if this were a shell script I could run it as $ sh Text_File.msh and the shell would automatically expand the variables.
What I want to do is have shell expand these variables and then create a new file as Text_File_expanded.msh as follows:
Text_File_expanded.msh:
spool on to '/expanded/path/of/the/log/file/../logfile.log';
login 'username' 'password';
....
Consider:
$ a=123
$ echo "$a"
123
So technically this should do the trick:
$ echo "`cat Text_File.msh`" > Text_File_expanded.msh
...but it doesn't work as expected and the output file is identical to the source.
So I am unsure how to achieve this. My goal is to make it easier to maintain the directory paths embedded within my non-shell scripts. These scripts cannot contain any UNIX code, as they are not run by the UNIX shell.
This question has been asked in another thread, and this is the best answer IMO:
export LOG_FILE_PATH=/expanded/path/of/the/log/file/../logfile.log
cat Text_File.msh | envsubst > Text_File_expanded.msh
if on Mac, install gettext first: brew install gettext
see:
Forcing bash to expand variables in a string loaded from a file
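If only some of the dollar signs in the file should be expanded, envsubst also accepts a list of the variables to substitute and leaves everything else untouched (a sketch):
export LOG_FILE_PATH=/expanded/path/of/the/log/file/../logfile.log
envsubst '$LOG_FILE_PATH' < Text_File.msh > Text_File_expanded.msh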
This solution is not elegant, but it works. Create a script called shell_expansion.sh:
echo 'cat <<END_OF_TEXT' > temp.sh
cat "$1" >> temp.sh
echo 'END_OF_TEXT' >> temp.sh
bash temp.sh >> "$2"
rm temp.sh
You can then invoke this script as follows:
bash shell_expansion.sh Text_File.msh Text_File_expanded.msh
If you want it in one line (I'm not a bash expert so there may be caveats to this but it works everywhere I've tried it):
when test.txt contains
line1 says ${line1}
line2 says ${line2}
then:
>line1=fark
>line2=fork
>value=$(eval "echo \"$(cat test.txt)\"")
>echo "$value"
line1 says fark
line2 says fork
Obviously if you just want to print it you can take out the extra value=$() and echo "$value".
If a Perl solution is ok for you:
Sample file:
$ cat file.sh
spool on to '$HOME/logfile.log';
login 'username' 'password';
Solution:
$ perl -pe 's/\$(\w+)/$ENV{$1}/g' file.sh
spool on to '/home/user/logfile.log';
login 'username' 'password';
One limitation of the above answers is that they both require the variables to be exported to the environment. Here's what I came up with that would allow the variables to be local to the current shell script:
#!/bin/sh
FOO=bar;
FILE=`mktemp`; # Let the shell create a temporary file
trap 'rm -f $FILE' 0 1 2 3 15; # Clean up the temporary file
(
echo 'cat <<END_OF_TEXT'
cat "$#"
echo 'END_OF_TEXT'
) > $FILE
. $FILE
The above example allows the variable $FOO to be substituted in the files named on the command line. I'm sure it can be improved, but this works for me so far.
Thanks to both previous answers for their ideas!
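A hypothetical usage sketch (the names expand.sh and template.msh are made up): save the script above as expand.sh and redirect its output to the expanded file.
$ cat template.msh
spool on to '$FOO/logfile.log';
$ sh expand.sh template.msh > expanded.msh
$ cat expanded.msh
spool on to 'bar/logfile.log';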
If the variables you want to translate are known and limited in number, you can always do the translation yourself:
sed "s/\$LOG_FILE_PATH/$LOG_FILE_PATH/g" input > output
This also assumes the variable itself is already known.
This solution allows you to keep the same formatting in the output file.
Copy and paste the following lines in your script
cat $1 | while read line
do
eval $line
echo $line
eval echo $line
done | uniq | grep -v '\$'
This will read the file passed as an argument line by line, then try to print each line twice:
- once without substitution
- once with substitution of the variables.
then remove the duplicate lines
then remove the lines containing visible variables ($)
Yes eval should be used carefully, but it provided me this simple oneliner for my problem. Below is an example using your filename:
eval "echo \"$(<Text_File.msh)\""
I use printf instead of echo for my own purposes, but that should do the trick. Thank you abyss.7 for providing the link that solved my problem. Hope it helps.
Create an ASCII file test.txt with the following content:
Try to replace this ${myTestVariable1}
bla bla
....
Now create a file "sub.sed" containing the variable names, e.g.
's,${myTestVariable1},'"${myTestVariable1}"',g;
s,${myTestVariable2},'"${myTestVariable2}"',g;
s,${myTestVariable3},'"${myTestVariable3}"',g;
s,${myTestVariable4},'"${myTestVariable4}"',g'
Open a terminal move to the folder containing test.txt and sub.sed.
Define the value of the variable to be replaced:
myTestVariable1=SomeNewText
Now call sed to replace that variable
sed "$(eval echo $(cat sub.sed))" test.txt > test2.txt
The output will be
$ cat test2.txt
Try to replace this SomeNewText
bla bla
....
#logfiles.list:
$EAMSROOT/var/log/LinuxOSAgent.log
$EAMSROOT/var/log/PanacesServer.log
$EAMSROOT/var/log/PanacesStrutsGUI.log
#My Program:
cat logfiles.list | while read line
do
eval Eline=$line
echo $Eline
done
How can I write data to a text file automatically by shell scripting in Linux?
I was able to open the file. However, I don't know how to write data to it.
The short answer:
echo "some data for the file" >> fileName
However, echo doesn't deal with newlines (end-of-line characters) in an ideal way, so if you're going to append more than one line, do it with printf:
printf "some data for the file\nAnd a new line" >> fileName
The >> and > operators are very useful for redirecting the output of commands; they work with many other bash commands, not just echo.
#!/bin/sh
FILE="/path/to/file"
/bin/cat <<EOM >$FILE
text1
text2 # This comment will be inside of the file.
The keyword EOM can be any text, but it must start the line and be alone.
 EOM # This will also be inside of the file, see the space in front of EOM.
EOM # No comments and spaces around here, or it will not work.
text4
EOM
You can redirect the output of a command to a file:
$ cat file > copy_file
or append to it
$ cat file >> copy_file
If you want to write text directly, the command is echo 'text':
$ echo 'Hello World' > file
#!/bin/bash
cat > FILE.txt <<EOF
info code info
info code info
info code info
EOF
I know this is a damn old question, but since the OP is about scripting, and since Google brought me here, opening file descriptors for reading and writing at the same time should also be mentioned.
#!/bin/bash
# Open file descriptor (fd) 3 for read/write on a text file.
exec 3<> poem.txt
# Let's print some text to fd 3
echo "Roses are red" >&3
echo "Violets are blue" >&3
echo "Poems are cute" >&3
echo "And so are you" >&3
# Close fd 3
exec 3>&-
Then cat the file on terminal
$ cat poem.txt
Roses are red
Violets are blue
Poems are cute
And so are you
This example causes the file poem.txt to be opened for reading and writing on file descriptor 3. It also shows that *nix boxes know more fds than just stdin, stdout and stderr (fd 0, 1, 2); they actually hold a lot. Usually the maximum number of file descriptors the kernel can allocate can be found in /proc/sys/file-max or /proc/sys/fs/file-max, but using any fd above 9 is risky as it could conflict with fds the shell uses internally. So stick to fds 0-9. If you need more than 9 file descriptors in a bash script you should probably use a different language anyway :)
Anyhow, fd's can be used in a lot of interesting ways.
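Because fd 3 above is open for both reading and writing, you can even mix the two on the same descriptor; a sketch, reusing poem.txt from the example:
#!/bin/bash
exec 3<> poem.txt
read -r first <&3                 # reads "Roses are red"; the offset now sits after that line
echo "overwriting line two" >&3   # writes in place at the current offset (it does not insert)
exec 3>&-
echo "$first"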
I like this answer:
cat > FILE.txt <<EOF
info code info
...
EOF
but would suggest cat >> FILE.txt << EOF if you just want to add something to the end of the file without wiping out what already exists.
Like this:
cat >> FILE.txt <<EOF
info code info
...
EOF
Moving my comment as an answer, as requested by #lycono
If you need to do this with root privileges, do it this way:
sudo sh -c 'echo "some data for the file" >> fileName'
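A common alternative worth sketching is to pipe into sudo tee -a, so that only the program writing the file runs as root; tee's copy to stdout is discarded:
echo "some data for the file" | sudo tee -a fileName > /dev/null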
For environments where here documents are unavailable (Makefile, Dockerfile, etc) you can often use printf for a reasonably legible and efficient solution.
printf '%s\n' '#!/bin/sh' '# Second line' \
'# Third line' \
'# Conveniently mix single and double quotes, too' \
"# Generated $(date)" \
'# ^ the date command executes when the file is generated' \
'for file in *; do' \
' echo "Found $file"' \
'done' >outputfile
I thought there were a few perfectly fine answers, but no concise summary of all possibilities; thus:
The core principle behind most answers here is redirection. Two redirection operators are important for writing to files:
Redirecting Output:
echo 'text to completely overwrite contents of myfile' > myfile
Appending Redirected Output
echo 'text to add to end of myfile' >> myfile
Here Documents
Others mentioned that, rather than writing from a fixed input source like echo 'text', you could also write to files interactively via a "Here Document", which is also detailed in the bash manual. Those answers, e.g.
cat > FILE.txt <<EOF or cat >> FILE.txt <<EOF
make use of the same redirection operators, but add another layer via "Here Documents". In the above syntax, you write to the FILE.txt via the output of cat. The writing only takes place after the interactive input is given some specific string, in this case 'EOF', but this could be any string, e.g.:
cat > FILE.txt <<'StopEverything' or cat >> FILE.txt <<'StopEverything'
would work just as well. Here Documents also look for various delimiters and other interesting parsing characters, so have a look at the docs for further info on that.
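One of those parsing details deserves a quick sketch (the file names here are made up): quoting the delimiter turns off variable expansion inside the document.
name=World
# Unquoted delimiter: variables inside the document are expanded
cat > greeting.txt <<EOF
Hello, $name
EOF
# greeting.txt now contains: Hello, World
# Quoted delimiter: the document is written out literally
cat > greeting_raw.txt <<'EOF'
Hello, $name
EOF
# greeting_raw.txt now contains: Hello, $name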
Here Strings
A bit convoluted, and more of an exercise in understanding both redirection and Here Document syntax, but you could combine Here Document style syntax with the standard redirect operators to get a Here String:
Redirecting Output of cat Input
cat > myfile <<<'text to completely overwrite contents of myfile'
Appending Redirected Output of cat Input
cat >> myfile <<<'text to add to end of myfile'
This approach works and is the best
cat > filename <<EOF
Text1...
Text2...
EOF
Basically, cat keeps reading input until it finds the keyword "EOF", at which point it stops writing to (or appending to) the file.
If you are using variables, you can use
first_var="Hello"
second_var="How are you"
If you want to concatenate both strings and write them to the file, then use the line below:
echo "${first_var} - ${second_var}" > ./file_name.txt
Your file_name.txt content will be "Hello - How are you"
You can also use a here document and vi; the script below generates a FILE.txt with 3 lines and variable interpolation:
VAR=Test
vi FILE.txt <<EOFXX
i
#This is my var in text file
var = $VAR
#Thats end of text file
^[
ZZ
EOFXX
Then the file will have the 3 lines below. "i" starts vi insert mode, and similarly ^[ (Esc) followed by ZZ saves and closes the file.
#This is my var in text file
var = Test
#Thats end of text file
I have a file file.txt with contents like
i love this world
I hate stupid managers
I love linux
I have MS
When I do the following:
for line in `cat file.txt`; do
echo $line
done
It gives output like
I
love
this
world
I
..
..
But I need the output as entire lines, like below. Any thoughts?
i love this world
I hate stupid managers
I love linux
I have MS
while read -r line; do echo "$line"; done < file.txt
As #Zac noted in the comments, the simplest solution to the question as posted is simply cat file.txt, so I must assume there is something more interesting going on; I have therefore put down the two options that solve the question as asked as well:
There are two things you can do here: either set IFS (the Internal Field Separator) to a newline and use your existing code, or use the read (or line) command in a while loop.
IFS="
"
or
(while read line ; do
//do something
done) < file.txt
I believe the question was how to read in an entire line at a time. The simple script below will do this. If you don't specify a variable name for "read" it will stuff the entire line into the variable $REPLY.
cat file.txt|while read; do echo $REPLY; done
You can do it by using read if the file is coming into stdin. If you need to do it in the middle of a script that already uses stdin for other purposes, you can temporarily reassign the stdin file descriptor.
#!/bin/bash
file=$1
# save stdin to usually unused file descriptor 3
exec 3<&0
# connect the file to stdin
exec 0<"$file"
# read from stdin
while read -r line
do
echo "[$line]"
done
# when done, restore stdin
exec 0<&3
Try
(while read l; do echo $l; done) < temp.txt
read: Read a line from the standard input and split it into fields.
Reads a single line from the standard input, or from file descriptor FD if the -u option is supplied. The line is split into fields as with word splitting, and the first word is assigned to the first NAME, the second word to the second NAME, and so on, with any leftover words assigned to the last NAME. Only the characters found in $IFS are recognized as word delimiters.
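A short sketch of that "leftover words" behaviour, reusing a line from the question:
$ read -r first rest <<< "i love this world"
$ echo "$first"
i
$ echo "$rest"
love this world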