String not appended to the file in shell scripting - shell

I was trying a simple shell script, as below, to append data to the end of a file:
path="/root/dir"
secure="*(rw,..)"
echo "$path $secure" >> a.txt
but it is not appending the string to a.txt.

Just a guess, but your script may be in DOS format, so you're actually writing output to a.txt\r instead. Try running one of the following on your script and try again:
sed -i 's|\r||' file
dos2unix file
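A quick way to confirm the diagnosis first (a sketch, assuming your script is saved as myscript.sh, which is a placeholder name) is to look at the line endings directly:
# cat -A marks line ends with $; DOS-format files show ^M$ (a carriage return)
cat -A myscript.sh | head
# file also reports "with CRLF line terminators" for DOS-format text
file myscript.sh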

Related

Append to a file using vi/vim from a shell script

I'm trying to write a shell script that appends some data (stored in a variable) to a '.txt' file. I'm trying to do this using 'vi'. I know there are other tools to append to a file...but I need to use vi only.
I tried the command below, but unfortunately it does not insert the data at the end of the file:
echo $'i{$var}\E:x\n' |vi file.txt
Using vi/vim does not allow you to do a command-line edit of the file in place. Instead, you could use its command-line equivalent, ex, which should be available on any POSIX-compliant system.
cat file
foo
bar
Now use the ex utility on the command line:
var=dude
printf '%s\n' '$a' "$var" '.' x | ex file
This edits the file in place, adding the text dude as the last line of the file.
cat file
foo
bar
dude
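For reference, here is an annotated sketch of what those printf arguments mean to ex, applied to the file.txt from the question (the value of var is just an example):
var="some data"
printf '%s\n' '$a' "$var" '.' x | ex file.txt
# '$a'   - go to the last line ($) and append (a) after it
# "$var" - the line of text to append
# '.'    - a lone dot ends input mode
# x      - write the changes and exit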
I think this works too:
var="value"
printf "$(cat file.txt)\n$var" > newfile.txt

Process substitution, /dev/fd/63

I have a script that takes a file name as input in $1, processes it...and creates an output file named ${1}.output.log, and it works fine. E.g. if I run
./myscript filename.txt
It processes the file and generates an output file named filename.txt.output.log.
But when I tried to use process substitution to give input to this script, like
./myscript <(echo something)
it failed, as it can no longer create a file named ${1}.output.log; now $1 is not an actual file and doesn't exist in my working directory, where the script is supposed to create its output.
Any suggestions to work around this problem?
The problem is probably that when using process substitution you are trying to create a file in /dev, more specifically, /dev/fd/63.output.log
I recommend doing this:
output_file="$( sed 's|/dev/fd/|./process_substitution-|' <<< "${1}" ).output.log"
echo "my output" >> "$output_file"
We use sed to replace /dev/fd/ with ./process_substitution- so the file gets created in the current working directory (pwd) with a name like process_substitution-63.output.log.
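An alternative sketch, assuming you only care about getting a workable file name, is to build it from the basename of $1, which strips the directory part and so handles both regular paths and /dev/fd/NN:
# filename.txt  -> ./filename.txt.output.log
# /dev/fd/63    -> ./63.output.log
output_file="./$(basename "$1").output.log"
echo "my output" >> "$output_file"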

cat command: unexpected output

I tried the following command $ cat < text > text, where text is a non-empty file.
There was no output to stdout, and the file text became blank. I expected the cat command to read the file text and output its contents to the same file.
However, when I try $ cat < text > newtext, it works! newtext is a copy of text.
Another doubt: when I try $ cat < text >> text, where >> usually appends to a file, my terminal gets stuck in an infinite loop and the file text is repeatedly appended to itself. Why does this happen?
You cannot use the same file as stdin and stdout here. Why? Because the shell prepares the redirections before executing the command, so the output file is emptied before cat ever reads it.
Instead, you have to go through a temporary file.
A workaround could be your solution (writing to a different file), or also:
cat text > newtext && mv newtext text
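If the moreutils package happens to be installed, sponge offers another route: it reads all of its input before opening the output file, so the same name can safely appear on both sides:
# sponge buffers everything first, so text is not truncated before it is read
cat text | sponge text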
When you redirect your output to a file with
echo "something" > file.txt
the first thing that happens is that your file is (re)created. If it already exists, it is emptied, and that's exactly what you see.
You can show the contents of the file by simply invoking
cat file.txt
To achieve what you've tried, you could do
cat file.txt > temp.txt && mv temp.txt file.txt
but I don't see a reason why you would want to do that.
If you think about it, the shell has to do the redirections first. If the shell actually executed the command from left to right as you expect, then cat file would display the file's contents to the terminal, then the shell would see > file and have to backtrack, i.e. clear the output from the terminal and rerun the command with stdout redirected.
This obviously doesn't make sense, so when the shell parses your command the redirections are done first, and since > overwrites the contents, your file is clobbered before it is read, i.e. it's empty!
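A minimal demonstration (f is just a throwaway file name):
echo hello > f
cat < f > f     # the > f truncation happens before cat gets to read
wc -c f         # prints 0 f: the file is now empty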

Bash - run command using lines from text file

I am a bit new to bash scripting and have not been able to find an answer to what I am about to ask, that is, if it is even possible.
I have a text file which is created by searching a directory using grep for files containing "Name"; it outputs the below. Say the file is called PathOutput.txt:
/mnt/volnfs3rvdata/4007dc45-477a-45b2-9c28-43bc5bbb4f9f/master/vms/4c3483af-b41a-4979-98b7-6f6a4f147670/4c3483af-b41a-4979-98b7-6f6a4f147670.ovf
/mnt/volnfs3rvdata/4007dc45-477a-45b2-9c28-43bc5bbb4f9f/master/vms/5b5538a5-423f-4eaf-9678-d377a6706c58/5b5538a5-423f-4eaf-9678-d377a6706c58.ovf
/mnt/volnfs3rvdata/4007dc45-477a-45b2-9c28-43bc5bbb4f9f/master/vms/0e2d1451-45cc-456e-846d-d174515a60dd/0e2d1451-45cc-456e-846d-d174515a60dd.ovf
/mnt/volnfs3rvdata/4007dc45-477a-45b2-9c28-43bc5bbb4f9f/master/vms/daaf622e-e035-4c1b-a6d7-8ee209c4ded6/daaf622e-e035-4c1b-a6d7-8ee209c4ded6.ovf
/mnt/volnfs3rvdata/4007dc45-477a-45b2-9c28-43bc5bbb4f9f/master/vms/48f52ab9-64df-4b1e-9c35-c024ae2a64c4/48f52ab9-64df-4b1e-9c35-c024ae2a64c4.ovf
Now what I would like to do, if possible, is loop through the file with a command, using a variable to bring in each line of the text file. But I cannot work out a way to run the command against each line. With all my playing around I did get a result where it would run once against the first line, but that was when the output of grep was piped into another command.
At the moment, in a bash script, I just extract the paths to PathOutput.txt, cat the file to display the paths, then copy the path I want into a read -p prompt to create a variable to run the command against. It works fine, but I have to run the script once for each path. If I could get the command to loop through each line, I could output the results to a txt file.
Is it possible?
You could use xargs:
$ xargs -n1 echo "arg:" < file
arg: /mnt/volnfs3rvdata/4007dc45-477a-45b2-9c28-43bc5bbb4f9f/master/vms/4c3483af-b41a-4979-98b7-6f6a4f147670/4c3483af-b41a-4979-98b7-6f6a4f147670.ovf
arg: /mnt/volnfs3rvdata/4007dc45-477a-45b2-9c28-43bc5bbb4f9f/master/vms/5b5538a5-423f-4eaf-9678-d377a6706c58/5b5538a5-423f-4eaf-9678-d377a6706c58.ovf
arg: /mnt/volnfs3rvdata/4007dc45-477a-45b2-9c28-43bc5bbb4f9f/master/vms/0e2d1451-45cc-456e-846d-d174515a60dd/0e2d1451-45cc-456e-846d-d174515a60dd.ovf
arg: /mnt/volnfs3rvdata/4007dc45-477a-45b2-9c28-43bc5bbb4f9f/master/vms/daaf622e-e035-4c1b-a6d7-8ee209c4ded6/daaf622e-e035-4c1b-a6d7-8ee209c4ded6.ovf
arg: /mnt/volnfs3rvdata/4007dc45-477a-45b2-9c28-43bc5bbb4f9f/master/vms/48f52ab9-64df-4b1e-9c35-c024ae2a64c4/48f52ab9-64df-4b1e-9c35-c024ae2a64c4.ovf
Just replace echo "arg:" with the command you actually want to use. If you want to pass all the files at once, drop the -n1 option.
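If the path needs to land somewhere other than the end of the command, xargs -I can substitute it explicitly; cp and /backup/ below are only placeholders for whatever you actually want to run:
# {} is replaced by each line read from PathOutput.txt (one command per line)
xargs -I{} cp {} /backup/ < PathOutput.txt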
If I understand correctly, you may want something like this:
for L in `cat PathOutput.txt`; do
    echo "I read line $L from PathOutput.txt"
    # do something useful with $L
done
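A slightly more robust variant of the same loop (some_command is a placeholder) uses while read, which also copes with spaces in the paths:
while IFS= read -r line; do
    some_command "$line"    # runs once for every line of PathOutput.txt
done < PathOutput.txt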

Why does reading and writing the same file through I/O redirection result in an empty file in Unix?

If I redirect the output of a command to the same file it reads from, the file's contents are erased.
sed 's/abd/def/g' a.txt > a.txt
Can anyone explain why?
The first thing the redirection does is to open the file for writing, thus clearing any existing contents. sed then tries to read this empty file you have just created, and does nothing. The file is then closed, containing nothing.
The redirection operations <, >, etc. are handled by the shell. When you give a command to the shell that includes redirection, the shell will first open the file. In the case of > the file will be opened for writing, which means it gets truncated to zero size. After the redirection files have been opened, the shell starts a new process, binding its standard input, output, and error to any possible redirected files, and only then executes the command you gave. So when the sed command in your example begins execution, a.txt has already been truncated by the shell.
Incidentally, and somewhat tangentially, this is also the reason why you cannot use redirection directly with sudo because it is the shell that needs the permissions to open the redirection file, not the command being executed.
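The usual workaround in the sudo case is to let a program started by sudo open the file itself, for example tee; the target path below is only an illustration:
# tee runs as root and opens the file itself; the shell merely pipes into it
echo "127.0.0.1 example.local" | sudo tee -a /etc/hosts > /dev/null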
You need to use the -i option to edit the file in place (note that GNU sed requires the backup suffix to be attached to the option):
sed -i.bck 's/abd/def/g' a.txt
EDIT: as noted by neil, the redirection first opens the file for writing, and thus clears it.
EDIT2: this might be interesting for some readers:
On OSX, if you want to use -i with an empty extension to prevent the creation of a backup file, you need to use the -e switch as well, otherwise it fails to parse the arguments correctly:
sed -i '' -e 's/abc/def/g' a.txt
The redirections are prepared before the command executes, so a.txt is cleared when it is set up as stdout; by the time the command runs, there is no content left to read.
try
sed -i 's/abd/def/g' a.txt
or
sed 's/abd/def/g' a.txt 1<> a.txt
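If you cannot rely on -i (it is not part of POSIX sed), the portable route is an explicit temporary file; a.txt.tmp is an arbitrary name:
sed 's/abd/def/g' a.txt > a.txt.tmp && mv a.txt.tmp a.txt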
