Bash - run command using lines from text file - bash

I am a bit new to bash scripting and have not been able to find an answer to what I am about to ask, that is, if it is even possible.
I have a text file which is created by searching a directory with grep for files containing "Name"; the output is below. Say the file is called PathOutput.txt:
/mnt/volnfs3rvdata/4007dc45-477a-45b2-9c28-43bc5bbb4f9f/master/vms/4c3483af-b41a-4979-98b7-6f6a4f147670/4c3483af-b41a-4979-98b7-6f6a4f147670.ovf
/mnt/volnfs3rvdata/4007dc45-477a-45b2-9c28-43bc5bbb4f9f/master/vms/5b5538a5-423f-4eaf-9678-d377a6706c58/5b5538a5-423f-4eaf-9678-d377a6706c58.ovf
/mnt/volnfs3rvdata/4007dc45-477a-45b2-9c28-43bc5bbb4f9f/master/vms/0e2d1451-45cc-456e-846d-d174515a60dd/0e2d1451-45cc-456e-846d-d174515a60dd.ovf
/mnt/volnfs3rvdata/4007dc45-477a-45b2-9c28-43bc5bbb4f9f/master/vms/daaf622e-e035-4c1b-a6d7-8ee209c4ded6/daaf622e-e035-4c1b-a6d7-8ee209c4ded6.ovf
/mnt/volnfs3rvdata/4007dc45-477a-45b2-9c28-43bc5bbb4f9f/master/vms/48f52ab9-64df-4b1e-9c35-c024ae2a64c4/48f52ab9-64df-4b1e-9c35-c024ae2a64c4.ovf
Now what I would like to do, if possible, is loop through the file with a command, using a variable to bring in each line of the text file. But I cannot work out a way to run the command against each line. With all my playing around I did get a result where it would run once against the first line, but that was when the output of grep was piped into another command.
At the moment, in a bash script, I am just extracting the paths to PathOutput.txt, using cat to display the paths, then copying the path I want into a read -p prompt to create a variable that I run the command against. It works fine now; I just have to run the script once for each path. If I could get the command to loop through each line, I could output the results to a txt file.
Is it possible?

You could use xargs:
$ xargs -n1 echo "arg:" < file
arg: /mnt/volnfs3rvdata/4007dc45-477a-45b2-9c28-43bc5bbb4f9f/master/vms/4c3483af-b41a-4979-98b7-6f6a4f147670/4c3483af-b41a-4979-98b7-6f6a4f147670.ovf
arg: /mnt/volnfs3rvdata/4007dc45-477a-45b2-9c28-43bc5bbb4f9f/master/vms/5b5538a5-423f-4eaf-9678-d377a6706c58/5b5538a5-423f-4eaf-9678-d377a6706c58.ovf
arg: /mnt/volnfs3rvdata/4007dc45-477a-45b2-9c28-43bc5bbb4f9f/master/vms/0e2d1451-45cc-456e-846d-d174515a60dd/0e2d1451-45cc-456e-846d-d174515a60dd.ovf
arg: /mnt/volnfs3rvdata/4007dc45-477a-45b2-9c28-43bc5bbb4f9f/master/vms/daaf622e-e035-4c1b-a6d7-8ee209c4ded6/daaf622e-e035-4c1b-a6d7-8ee209c4ded6.ovf
arg: /mnt/volnfs3rvdata/4007dc45-477a-45b2-9c28-43bc5bbb4f9f/master/vms/48f52ab9-64df-4b1e-9c35-c024ae2a64c4/48f52ab9-64df-4b1e-9c35-c024ae2a64c4.ovf
Just replace echo "arg:" with the command you actually want to use. If you want all the files passed at once, drop the -n1 option.
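Since you mentioned wanting to collect the results in a text file, the whole xargs run can be redirected as well; a minimal sketch, where yourcommand is just a placeholder for whatever command you actually run:
xargs -n1 yourcommand < PathOutput.txt > results.txt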

If I understand correctly, you may want something like this:
for L in `cat PathOutput.txt`; do
    echo "I read line $L from PathOutput.txt"
    # do something useful with $L
done
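Note that a for loop over cat splits on any whitespace, so it can misbehave if a path ever contains spaces. A while read loop is a more robust sketch of the same idea:
while IFS= read -r L; do
    echo "I read line $L from PathOutput.txt"
    # do something useful with "$L"
done < PathOutput.txt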

Related

Read a file line-by-line on bash; each line containing the path to another unique file

Each line in a given file 'a.txt' contains the directory/path to another unique file. Suppose we want to parse 'a.txt' line-by-line, extract the path in string format, and then use a tool such as vim to process the file at this path, and so on.
After going through this thread - Read a file line by line assigning the value to a variable, I wrote the following script, say 'open-file.sh' on bash (I'm new to it)
#!/bin/bash
while IFS='' read -r line || [[ -n "$line" ]]; do
    vim -c ":q" -cq $line # Just open the file and close it using :q
done < "$1"
We would then run the above script as -
./open-file.sh a.txt
The problem is that although the path to a new file is correctly specified by $line, when vim opens the file, vim continues to receive the text contained in 'a.txt' as a command. How can I write a script where I can correctly obtain the path from 'a.txt', open it using vim, and then continue parsing the remaining lines in 'a.txt' ?
Replace:
vim -c ":q" -cq $line
With:
vim -c ":q" -cq "$line" </dev/tty
The redirection </dev/tty tells vim to take its standard input from the terminal. Without it, vim's standard input is the same as the loop's: the file "$1".
Also, it is good practice to put $line in double-quotes to protect it from word splitting, etc.
Lastly, while vim is excellent for interactive work, if your end-goal is fully automated processing of each file, you might want to consider tools such as sed or awk.
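For example, a minimal sketch of fully automated processing (here just printing the first line of each listed file with sed; the sed command itself is only an illustration):
while IFS= read -r line; do
    sed -n '1p' "$line"   # sed reads the listed file directly, so it does not consume the loop's stdin
done < a.txt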
Although I'm not sure of your ultimate goal, this shell command will execute vim once per line in a.txt:
xargs -o -n1 vim -c ':q' < a.txt
As explained in the comments to Read a file line by line assigning the value to a variable, the issue you're encountering is due to the fact that vim is an interactive program and thus keeps reading from standard input, which inside the loop is a.txt rather than the terminal.
The problem was already mentioned in a comment under the answer you based your script on.
vim is consuming stdin which is given to the loop by done < $1. We can observe the same behavior in the following example:
$ while read i; do cat; done < <(seq 3)
2
3
<(seq 3) simulates a file with the three lines 1, 2, and 3. Instead of three silent iterations we get only one iteration and the output 2 and 3.
stdin is passed not only to read in the head of the loop, but also to cat in the body of the loop. Therefore read reads one line, the loop is entered, cat reads all remaining lines, stdin is empty, read has nothing left to read, and the loop exits.
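One way to see the fix is to give the inner command its own stdin; redirecting cat from /dev/null in the same toy example restores the three silent iterations:
$ while read i; do cat </dev/null; done < <(seq 3)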
You could circumvent the problem by redirecting something to vim in the same way; however, there is an even better way. You don't need the loop at all:
< "$1" xargs -d\\n -n1 vim -c :q -cq
xargs will execute vim once for every line in the file given by $1.

Process substitution, /dev/fd/63

I have a script that takes a file name as input in $1, processes it, and creates an output file named ${1}.output.log. It works fine; e.g. if I run
./myscript filename.txt
It'll process the file and generate an output file with the name filename.txt.output.log.
But when I tried to use process substitution to give input to this script, like
./myscript <(echo something), it failed, because it can no longer create a file named ${1}.output.log; now $1 is not an actual file and doesn't exist in my working directory, where the script is supposed to create its output.
Any suggestions to work around this problem?
The problem is probably that when using process substitution you are trying to create a file in /dev, more specifically, /dev/fd/63.output.log
I recommend doing this:
output_file="$( sed 's|/dev/fd/|./process_substitution-|' <<< "${1}" ).output.log"
echo "my output" >> "$output_file"
We use sed to replace /dev/fd/ to ./process_substitution- so the file gets created in the current working directory (pwd) with the name process_substitution-63.output.log
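Putting it together, a minimal sketch of how the top of myscript might look with this workaround (the processing step is just a placeholder from the question):
#!/bin/bash
# map /dev/fd/NN inputs to a writable name in the current directory
output_file="$(sed 's|/dev/fd/|./process_substitution-|' <<< "$1").output.log"

# ... process "$1" as before ...
echo "my output" >> "$output_file"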

Bash: How to pass input file content as command argument

I have a command command that takes an input file as argument. Is there a way to call command without actually creating a file?
I would like to achieve the following behavior
$ echo "content" > tempfile
$ command tempfile
$ rm tempfile
if possible:
as a one-liner,
without creating a file,
using either a bash (or sh) feature or a "well-known" command (as standard as xargs)
It feels like there must be an easy way to do it but I can't find it.
Just use process substitution.
command <(echo "content")
Bash will create a FIFO or other type of temporary file in /dev for the standard output of whatever happens in the process. For example:
$ echo <(echo hi)
/dev/fd/63
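Process substitution is useful whenever a command insists on filename arguments; a common illustration (assuming file1.txt and file2.txt are files you actually have) is comparing the sorted output of two files:
diff <(sort file1.txt) <(sort file2.txt)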

strings not appended onto the file in shell scripting

I was trying a simple shell program as below to append data at the end of the file,
path="/root/dir"
secure="*(rw,..)"
echo "$path $secure" >> a.txt
but it is not appending the string to a.txt.
Just a guess, but your script may be in DOS format, so you're actually writing output to a.txt\r instead. Try running one of the following on your script and try again:
sed -i 's|\r||' file
dos2unix file
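To check whether the script really has DOS line endings, one quick test (script.sh is just a stand-in for your script's name) is to count the lines containing a carriage return:
grep -c $'\r' script.sh   # a non-zero count suggests DOS (CRLF) line endings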

UNIX: How to run a program with a file as an input

I'm writing a bash script called 'run' that tests programs with pre-defined inputs.
It takes in a file as the first parameter, then a program as a second parameter.
The call would look like
./run text.txt ./check
for example. The program 'run' would then run 'check' with text.txt as the input. This will save me lots of testing time with my programs.
Right now I have
$2 < text.txt > text.created
So it takes text.txt and redirects it as input into the program specified by the second argument, then dumps the result in text.created.
I have the input in text.txt and I know what the output should look like, but when I cat text.created, it's empty.
Does anybody know the proper way to run a program with a file as the input? This seems intuitive to me, but could there be something wrong with the 'check' program rather than what I'm doing in the 'run' script?
Thanks! Any help is always appreciated!
EDIT: the file text.txt contains multiple lines of files that each have an input for the program 'check'.
That is, text.txt could contain
asdf1.txt
asdf2.txt
asdf3.txt
I want to test check with each file asdf1.txt, asdf2.txt, asdf3.txt.
A simple test with
#!/bin/sh
# the whole loop reads $1 line by line
while read
do
    # run $2 with the contents of the file named on the line just read
    xargs < "$REPLY" "$2"
done < "$1"
works fine. Call that file "run" and run it with
./run text.txt ./check
I get the program ./check executed once for each file listed in text.txt, with that file's contents as the parameters. Don't forget to chmod +x run to make it executable.
This is the sample check program that I use:
#!/bin/sh
echo "This is check with parameters $1 and $2"
Which prints the given parameters.
My file text.txt is:
textfile1.txt
textfile2.txt
textfile3.txt
textfile4.txt
and the files textfile1.txt, ... each contain one line of arguments for an instance of "check", for example:
lets go
or
one two
The output:
$ ./run text.txt ./check
This is check with parameters lets and go
This is check with parameters one and two
This is check with parameters uno and dos
This is check with parameters eins and zwei
The < operator redirects the contents of the file to the standard input of the program. This is not the same as using the file's contents as the arguments of the program--which seems to be what you want. For that, do
./program $(cat file.txt)
in bash (or in plain old /bin/sh, use
./program `cat file.txt`
).
This won't manage multiple lines as separate invocations, which your edit indicates is desired. For that you are probably going to want some kind of scripting language (perl, awk, python, ...) that makes parsing a file line by line easy.
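That said, a plain while read loop in the shell can also manage one invocation per line; a minimal sketch, assuming ./program should receive each line as a single argument:
#!/bin/sh
# run ./program once for every line of file.txt, passing the line as one argument
while IFS= read -r line; do
    ./program "$line"
done < file.txt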
