I am trying, so far unsuccessfully, to read and print a tab character from a file in a Bourne shell script.
For example, here is my file, in.txt (Stack Overflow won't let me write a tab, so replace [tabcharacter] with a tab):
[tabcharacter]Hello World!
My script is as follows:
#!/bin/sh
while read line
do
echo -e "${line}" >> out.txt
/bin/echo -e "${line}" >> out.txt
done < "./in.txt"
The out.txt I get is:
-e Hello World!
Hello World!
I would expect the output from at least one of these to be the same as in.txt.
I think it's a problem with the way I use the read command, but I'm not sure how to get it to read tabs.
Any help much appreciated.
#!/bin/sh
export IFS=
while read line
do
echo -e "$line" >> out.txt
/bin/echo -e "$line" >> out.txt
done < "./in.txt"
I set the IFS variable to an empty string, and now it's working. Please test it!
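A slightly different sketch of the same idea (untested against your exact setup) keeps IFS empty only for the read itself and adds -r so backslashes survive too, then uses printf to sidestep the -e problem with the built-in echo:
#!/bin/sh
while IFS= read -r line
do
printf '%s\n' "$line" >> out.txt
done < "./in.txt"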
Related
How do I append the output of a command to the end of a text file?
Use >> instead of > when directing output to a file:
your_command >> file_to_append_to
If file_to_append_to does not exist, it will be created.
Example:
$ echo "hello" > file
$ echo "world" >> file
$ cat file
hello
world
To append to a file, use >>:
echo "hello world" >> read.txt
cat read.txt
echo "hello siva" >> read.txt
cat read.txt
Then the output should be:
hello world # from the 1st cat
hello world # from the 2nd cat
hello siva  # also from the 2nd cat
To overwrite a file, use >:
echo "hello tom" > read.txt
cat read.txt
Then the output is:
hello tom
You can use the >> operator. This will append data from a command to the end of a text file.
To test this try running:
echo "Hi this is a test" >> textfile.txt
Do this a couple of times and then run:
cat textfile.txt
You'll see your text has been appended several times to the textfile.txt file.
Use command >> file_to_append_to to append to a file.
For example: echo "Hello" >> testFile.txt
CAUTION: if you only use a single > you will overwrite the contents of the file. To ensure that doesn't ever happen, you can add set -o noclobber to your .bashrc.
This ensures that if you accidentally type command > file_to_append_to to an existing file, it will alert you that the file exists already. Sample error message: file exists: testFile.txt
Thus, when you use > it will only allow you to create a new file, not overwrite an existing file.
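For example, a quick sketch of noclobber in action (testFile.txt is just a placeholder name):
set -o noclobber
echo "first" > testFile.txt     # fine: the file did not exist yet
echo "second" > testFile.txt    # refused: the file already exists
echo "second" >| testFile.txt   # >| explicitly overrides noclobber
echo "third" >> testFile.txt    # appending is still allowed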
Using tee with the -a (--append) option allows you to append to multiple files at once and also to use sudo (very useful when appending to protected files). Besides that, it can be handy if you need to use other shells besides bash, as not every shell handles the > and >> operators the same way.
echo "hello world" | sudo tee -a output.txt
This thread has good answers about tee
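For instance, a small sketch (the file names are placeholders) that appends the same line to several files in one command:
echo "hello world" | sudo tee -a /var/log/one.log /var/log/two.log
echo "hello world" | tee -a file1.txt file2.txt > /dev/null    # > /dev/null hides the copy tee sends to stdout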
Use the >> operator to append text to a file.
I often confuse the two. Better to remember through their output:
> for Overwrite
$ touch someFile.txt
$ echo ">" > someFile.txt
$ cat someFile.txt
>
$ echo ">" > someFile.txt
$ cat someFile.txt
>
>> for Append
$ echo ">" > someFile.txt
$ cat someFile.txt
>
$ echo ">" >> someFile.txt
$ cat someFile.txt
>
>
For the whole question:
cmd >> o.txt && [[ $(wc -l <o.txt) -eq 720 ]] && mv o.txt $(date +%F).o.txt
This will append 720 lines (30*24) to o.txt and then rename the file based on the current date.
Run the above from cron every hour, or:
while :
do
cmd >> o.txt && [[ $(wc -l <o.txt) -eq 720 ]] && mv o.txt $(date +%F).o.txt
sleep 3600
done
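If you go the cron route rather than the sleep loop, the crontab entry could look roughly like this, assuming the one-liner above is saved in a hypothetical script at /path/to/collect.sh:
# run at minute 0 of every hour
0 * * * * /path/to/collect.sh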
I would use printf instead of echo because it's more reliable and handles formatting such as the newline \n properly.
This example produces an output similar to echo in previous examples:
printf "hello world" >> read.txt
cat read.txt
hello world
However, if you were to replace printf with echo in this example, echo would treat \n as a literal string, ignoring the intent:
printf "hello\nworld" >> read.txt
cat read.txt
hello
world
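When the text comes from a variable, it is also safer to pass it as an argument to a fixed format string rather than using it as the format itself; a small sketch:
line='50% done\n'
printf '%s\n' "$line" >> read.txt   # safe: the value is printed literally
printf "$line" >> read.txt          # risky: % and \n inside the value are treated as format directives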
I'd suggest you do two things:
Use >> in your shell script to append contents to a particular file. The filename can be fixed or follow some pattern (see the sketch below).
Set up an hourly cronjob to trigger the shell script.
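A rough sketch of the first point, using a date-based filename pattern (collect.sh, cmd and the paths are made-up names):
#!/bin/sh
# collect.sh - append this run's output to a per-day file
cmd >> "/var/log/mydata/$(date +%F).txt"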
For example, your file contains:
1. mangesh#001:~$ cat output.txt
1
2
EOF
If you want to append to the end of the file (remember the spaces around >> between the text and the filename):
2. mangesh#001:~$ echo something to append >> output.txt | cat output.txt
1
2
EOF
something to append
And to overwrite the contents of the file:
3. mangesh#001:~$ echo 'something new to write' > output.txt | cat output.txt
something new to write
In Linux, you can use the cat command to append the content of one file to another:
cat fileName_1.txt >> fileName_2.txt
The previous command appends the content of fileName_1.txt to fileName_2.txt.
On Windows, you can use the type command:
type fileName_1.txt >> fileName_2.txt
While all of these answers are technically correct that appending to a file with >> is generally the way to go, note that if you use this in a loop, for example when parsing/processing a file and appending each line to the resulting file, it may be much slower than you would expect.
A faster alternative might be this:
stringBuilder=""
while read -r line; do
# $'\n' prints a newline so we don't have to know what special chars the string contains
stringBuilder+="$line"$'\n'
done < "myFile.txt"
echo "$stringBuilder" > $file
WARNING: you are reading all lines into memory; memory is a limited resource, so don't go doing this for gigantic files.
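A middle ground (just a sketch; outFile.txt is a placeholder) is to redirect once for the whole loop instead of once per line, which avoids both the repeated open/close of the output file and the in-memory buffer:
while read -r line; do
# ... process "$line" here ...
printf '%s\n' "$line"
done < "myFile.txt" > "outFile.txt"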
I am trying to write a shell testing program which compares the output of my program with that of a sample program. I have stored a list of commands in a text file, which looks like this:
commands.txt:
echo line A > a
echo line B > b
./program a b
and the shell test looks like this:
cat $testname | while read LINE
do
echo -e "$LINE$"
$LINE
done
But rather than creating files a and b, the program produces the following:
echo line A > a
line A > a
echo line B > b
line B > b
How can I execute the commands exactly as they are written in the file and redirect the output to another file?
I think the only way to do that is to use eval:
cat "$testname" | while read -r; do
echo "$REPLY"
eval "$REPLY"
done
If you just run $LINE, it will perform word splitting, but not I/O redirection, so it'll just pass > as a normal argument to echo.
The shell processes redirections before word expansion, which means that the > inside the string is not interpreted by the shell in this context. You need to request explicitly that the string be interpreted as a full command, like this:
eval "$LINE"
If you would like to write the exact lines from the commands.txt file into another file, you can say:
echo "$line" >> WriteTheLines.txt
If you would like to execute the commands inside the commands.txt file and write their output into another file, you can say:
eval "$line" >> ExecuteTheCommands.txt
So, as an example:
#!/bin/bash
input="/home/commands.txt"
while read -r line
do
echo "$line" >> WriteTheCommands.txt
eval "$line" >> ExecuteTheCommands.txt
done<"$input"
Say I have a bash script as follows:
while
read $f;
do
cat $f >> output.txt;
echo "aaa" >> output.txt;
done
Yet the second statement, the echo, is never executed. What am I doing wrong?
I'm running this via
tail -f /var/log/somelog | ./script.sh
$f should not be empty. It's only supposed to output when tail notices a change in the file.
The variable $f is probably empty, and your script is hanging on a call to cat with no arguments. Did you want to say
while read f
instead of
while read $f
?
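In other words, a corrected sketch of the script (still reading from stdin, so it keeps working with tail -f; it also keeps the original cat "$f", which only makes sense if each line tail prints is itself a filename):
#!/bin/bash
while read -r f
do
cat "$f" >> output.txt
echo "aaa" >> output.txt
done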
My input file's contents are:
welcome
welcome1
welcome2
My script is:
for groupline in `cat file`
do
echo $groupline;
done
I got the following output:
welcome
welcome1
welcome2
Why doesn't it print the empty line?
You need to set IFS to a newline (\n):
IFS=$"\n"
for groupline in $(cat file)
do
echo "$groupline";
done
Or put double quotes around the command substitution:
for groupline in "$(cat file)"
do
echo "$groupline";
done
Without meddling with IFS, the "proper" way is to use a while read loop:
while read -r line
do
echo "$line"
done <"file"
Because you're doing it all wrong. You want while, not for, and you want read, not cat:
while read groupline
do
echo "$groupline"
done < file
The solution ghostdog74 provided is helpful, but has a flaw: the IFS assignment cannot use double quotes (at least on Mac OS X); it has to use single quotes, like this:
IFS=$'\n'
That's nice but not dash-compatible; maybe this is better:
IFS='
'
The blank line will be eaten in the following program:
IFS='
'
for line in $(cat file)
do
echo "$line"
done
But you cannot add double quotes around $(cat file); that would treat the whole file as one single string:
for line in "$(cat file)"
If you want blank lines to be processed as well, use the following:
while read line
do
echo "$line"
done < file
Using IFS=$"\n" and var=$(cat text.txt) removes all the "n" characters from the output echo $var