Pipe data into shell command expecting a filename - bash

Suppose I have a shell command foo that expects a file such as bar.txt as an argument, and I want to pass a one-line file to foo and then erase it like so:
echo "This is a one line file" > bar.txt
foo bar.txt
rm bar.txt
Is there a way to do this all in a single line of shell script without ever creating the file bar.txt?

You can use Process Substitution:
foo <(echo "This is a one line file")

I'm assuming that foo expects a filename rather than reading stdin directly. In that case, as an alternative to the Process Substitution suggested by Cyrus, you can also do
echo "This is a one line file" | foo /dev/stdin
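Many utilities also follow the convention that a filename argument of - means "read standard input", in which case a plain pipe works too. A sketch using cat as a stand-in for foo (foo here is a hypothetical command):

```shell
# cat stands in for a command like foo that treats "-" as stdin
echo "This is a one line file" | cat -

# bash also supports here-strings, which avoid the echo entirely
cat - <<< "This is a one line file"
```

Either way the line is processed without ever creating a file on disk.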

Try command substitution like this:
cat $(ls)
where the result of 'ls' will be substituted as the arguments for 'cat' to execute.


to append the file using vi/vim using shell script

I'm trying to write a shell script that appends some data (stored in a variable) to a '.txt' file. I'm trying to do this using 'vi'. I know there are other tools for appending to a file...but I need to use vi only.
I tried the command below, but unfortunately it does not insert the data at the end of the file:
echo $'i{$var}\E:x\n' |vi file.txt
vi/vim does not let you edit a file in-place from the command line. Instead you can use its command-line equivalent, ex, which should be available on any POSIX-compliant system.
cat file
foo
bar
Now use the ex utility in the command line as
var=dude
printf '%s\n' '$a' "$var" '.' x | ex file
This edits the file in-place, appending the text dude as the last line of the file.
cat file
foo
bar
dude
I think this works too:
var="value"
printf "$(cat file.txt)\n$var" > newfile.txt
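One caveat with that approach: the file's contents become printf's format string, so a stray % in file.txt would be misinterpreted, and it also writes to a new file rather than appending in place. A safer sketch of the same idea:

```shell
var="value"
# Append $var as a new last line without treating the file
# contents as a printf format string.
{ cat file.txt; printf '%s\n' "$var"; } > newfile.txt
```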

bash: filenames as parameter, perform action in cycle

My current script goes like:
#!/bin/bash
victims=*asci*
for f in $victims ; do
awk /some blah blah here/ ;done
so basically it takes all files containing "asci" in their name and performs an action on them.
I wanted, however, the filenames to be entered as a parameter, like:
bash myscript.sh *log* for example.
When using
#!/bin/bash
victims="$1"
for f in $victims ; do
awk /some blah blah here/ ;done
it doesn't do what I expected: it performs the action only on the first file (as far as I remember).
May I ask for help? I want the script to perform a function over a bunch of files that contain the parameter in their filename. I'm not very experienced in bash, honestly. Thanks, cheers!
If you're just calling awk then you don't even need the for loop. Just pass it all of the file names at once.
awk '/whatever/' "$@"
If you do want to loop over all the command-line arguments, write:
for f in "$@"; do
...
done
Or, since in "$@" is implied:
for f; do
...
done
If you want to store them in an intermediate variable, you need to use an array:
victims=("$@")
for f in "${victims[@]}"; do
...
done
Also, you should avoid explicitly invoking bash. Run the script directly so it uses whatever shell is listed in its shebang line. Instead of:
bash myscript.sh *log*
write:
./myscript.sh *log*
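Putting this together, a minimal version of the whole script might look like the following (the awk program is a placeholder, since the original was elided):

```shell
#!/bin/bash
# myscript.sh -- run a command on every filename given as an argument
for f in "$@"; do
    printf 'processing %s\n' "$f"    # replace with: awk '...' "$f"
done
```

Invoked as ./myscript.sh *log*, the calling shell expands the glob and the loop sees each matching file as a separate, safely quoted argument.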
You need to watch out how you call your script. Suppose your script myscript.sh is simply
victims="$1"
echo "$victims"
and your cwd contains files a.log, another.log and logmore.txt.
Then, executing
myscript.sh *log*
will result in simply
a.log
because "*log*" is interpreted by the shell before calling myscript.sh. In fact, you're executing
myscript.sh a.log another.log logmore.txt
and your script only handles the first parameter. Also, amusingly, when your cwd contains no file with "log" in its name, your script will output:
*log*
So, your call should be:
myscript.sh "*log*"
and your script should handle the fact that its argument may be a glob pattern rather than an existing filename.
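If you take the quoted-pattern route, the script itself must expand the glob, e.g. by leaving $1 unquoted inside the for loop. A sketch (the unquoted expansion is deliberate here, since the argument is a pattern, not a filename):

```shell
#!/bin/bash
# called as: ./myscript.sh "*log*"
for f in $1; do     # unquoted on purpose: expand the glob inside the script
    echo "victim: $f"
done
```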

Why does cat exit a shell script, but only when it's fed by a pipe?

Why does cat exit a shell script, but only when it's fed by a pipe?
Case in point, take this shell script called "foobar.sh":
#! /bin/sh
echo $#
echo $@
cat $1
sed -e 's|foo|bar|g' $1
And a text file called "foo.txt" which contains only one line:
foo
Now if I type ./foobar.sh foo.txt on the command line, then I'll get this expected output:
1
foo.txt
foo
bar
However if I type cat foo.txt | ./foobar.sh then surprisingly I only get this output:
0
foo
I don't understand. If the number of arguments reported by $# is zero, then how can cat $1 still return foo? And, that being the case, why doesn't sed -e 's|foo|bar|g' $1 return anything since clearly $1 is foo?
This seems an awful lot like a bug, but I'm assuming it's magic instead. Please explain!
UPDATE
Based on the given answer, the following script gives the expected output, assuming a one-line foo.txt:
#! /bin/sh
if [ $# -gt 0 ]
then
yay=$(cat $1)
else
read yay
fi
echo $yay | cat
echo $yay | sed -e 's|foo|bar|g'
No, $1 is not "foo". $1 is nothing at all: undefined/empty.
Unlike a programming language, variables in the shell are quite dumbly and literally replaced, and the resulting commands textually executed (well, sorta kinda). In this case, "cat $1" becomes just "cat ", which will take input from stdin. That's terribly convenient to your execution since you've kindly provided "foo" on stdin via your pipe!
See what's happening?
sed likewise will read from stdin, but stdin is already at end of stream, so sed exits.
When you don't give an argument to cat, it reads from stdin. When $1 isn't given the cat $1 is the same as a simple cat, which reads the text you piped in (cat foo.txt).
Then the sed command runs, and same as cat, it reads from stdin because it has no filename argument. cat has already consumed all of stdin. There's nothing left to read, so sed quits without printing anything.
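You can watch this stdin hand-off happen in a single pipeline, using cat and sed as stand-ins for the two commands in the script:

```shell
# cat drains the pipe; sed then finds stdin at end-of-file
# and produces no output at all
printf 'foo\n' | { cat; sed 's|foo|bar|g'; }
```

Only "foo" is printed: cat consumed all of the piped input, so sed had nothing left to transform.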

Bash Expression Evaluation Order on Command Line

Background:
I'm working on quickly calling bash command-line expressions in parallel inside SGE's job submission program qsub. While doing so, I was attempting to submit an expression (as an argument) to be run inside another script like so:
./runArguments.sh grep foo bar.txt > output.txt
runArguments.sh looks like this:
#!/bin/bash
${1} ${2} ${3} ...and so on, up to ${12}
The idea is that I want "grep foo bar.txt > output.txt" to be evaluated in the script...NOT ON THE COMMAND LINE. In the example above, "grep foo bar.txt" will evaluate during runArguments.sh execution, but the output redirection would be evaluated on the command line. I eventually found a working solution using "eval" that is shown below, but I do not understand why it works.
Question(s)
1) Why does
./runArguments.sh eval "grep foo bar.txt > output.txt"
allow the eval and the expression to be taken as arguments, but
./runArguments.sh $(grep foo bar.txt > output.txt)
evaluates on the command line before the script is called? (the output of $(grep...) is taken as the arguments instead)
2) Is there a better way of doing this?
Thanks in advance!
Your first question is a bit hard to answer, because you've already answered it yourself. As you've seen, command substitution (the $(...) or `...` notation) substitutes the output of the command, and then processes the result. For example, this:
cat $(echo tmp.sh)
gets converted to this:
cat tmp.sh
So in your case, this:
./runArguments.sh $(grep foo bar.txt > output.txt)
runs grep foo bar.txt > output.txt, grabs its output — which will be nothing, since you've redirected any output to output.txt — and substitutes it, yielding:
./runArguments.sh
(so your script is run with no arguments).
By contrast, this:
./runArguments.sh eval "grep foo bar.txt > output.txt"
does not perform any command substitution, so your script is run with two arguments: eval, and grep foo bar.txt > output.txt. This command inside your script:
${1} ${2} ${3} ${4} ${5} ${6} ${7} ${8} ${9} ${10} ${11} ${12}
is therefore equivalent to this:
eval grep foo bar.txt '>' output.txt
which invokes the eval built-in with five arguments: grep, foo, bar.txt, >, and output.txt. The eval built-in assembles its arguments into a command, and runs them, and even translates the > and output.txt arguments into an output-redirection, so the above is equivalent to this:
grep foo bar.txt > output.txt
. . . and you already know what that does. :-)
As for your second question — no, there's not really a better way to do this. You need to pass the > in as an argument, and that means that you need to use eval ... or bash -c "..." or the like in order to "translate" it back into meaning output-redirection. If you're O.K. with modifying the script, then you might want to change this line:
${1} ${2} ${3} ${4} ${5} ${6} ${7} ${8} ${9} ${10} ${11} ${12}
to this:
eval ${1} ${2} ${3} ${4} ${5} ${6} ${7} ${8} ${9} ${10} ${11} ${12}
so that you don't need to handle this in the parameters. Or, actually, you might as well change it to this:
eval ${@}
which will let you pass in more than twelve parameters; or, better yet, this:
eval "${@}"
which will give you slightly more control over word-splitting and fileglobbing and whatnot.
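An alternative to eval, if you control the caller, is the bash -c route mentioned above: pass the whole command as one quoted string and hand it to a fresh shell. A sketch:

```shell
#!/bin/bash
# runArguments.sh -- run its single argument as a shell command,
# so redirections inside the string are honored
bash -c "$1"
```

Called as ./runArguments.sh 'grep foo bar.txt > output.txt', the redirection is parsed by the inner shell rather than by your interactive one.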

piping in linux

I have a file called test which contains the word "hello" in it.
shouldn't
echo test | cat
output hello? Since it's taking the output from echo test, which is test, as the input for cat, essentially I'm doing cat test.
But the actual output is test; I'm really confused.
Your pipe sends test to cat as the input, not as the argument. You could do:
cat `echo test`
to control the argument to cat with echo.
echo prints its arguments. cat prints a file, which by default is standard input. When you pipe, echo's standard output is connected to cat's standard input.
Correct is simply cat test.
From cat --help
If no FILE or when FILE is -, read standard input.
In your case, cat reads from stdin, which is test and outputs that.
In some cases you might want the argument to be passed through the pipe. This is how you would do that:
echo test | xargs cat
which will output the contents of the file named "test".
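A quick way to see both behaviors side by side (this creates a file named test in the current directory, matching the question's setup):

```shell
printf 'hello\n' > test     # the file from the question
echo test | cat             # cat reads the word "test" itself from stdin
echo test | xargs cat       # xargs turns "test" into an argument: cat test
```

The first pipeline prints test, the second prints hello.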
