How to wrap piped input to stdout in a bash script?

I want to write a bash script that will wrap piped input with some text.
Based on Googling and picking from examples, here is what I have so far, which does not work:
#!/bin/sh
if readlink /proc/$$/fd/0 | grep -q "^pipe:"; then
echo "{ "template":{"name":"contact sheet template","root":"root","parameters": ["pages"]},"pages":"
cat
echo "}"
fi
I am receiving a JSON list from another program as piped input, and I want to wrap it with the text above before piping the result on to the next program.
program_1 | wrapper.sh | program_2 > outputfile
But it doesn't output anything.
Can someone with more bash expertise point me in the right direction?

Personally, I'd do it this way:
myscript.sh
echo 'BEFORE' $(cat) 'AFTER'
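For example, assuming the one-liner above is saved as myscript.sh and made executable:
$ printf 'line1\nline2\n' | ./myscript.sh
BEFORE line1 line2 AFTER
Note that the unquoted $(cat) is word-split, so newlines in the input collapse to single spaces; write "$(cat)" instead if you need the input preserved verbatim.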

Do you mean your script is reading standard input from a pipe, such as
$ other-process | my-script
?
Then the commands in your script will simply inherit standard input from the pipe
#!/bin/sh
# Output preamble
cat <<EOF
{ "template":{"name":"contact sheet template","root":"root","parameters": ["pages"]},"pages":
EOF
cat # This reads from standard input inherited from your script
# Output the closing
cat <<EOF
}
EOF
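For instance, if that is saved as wrapper.sh and made executable, a quick check with a made-up list standing in for program_1's output looks like this:
$ echo '[1,2,3]' | ./wrapper.sh
{ "template":{"name":"contact sheet template","root":"root","parameters": ["pages"]},"pages":
[1,2,3]
}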

Related

Bash: How to direct output to both stderr and to stdout, to pipe into another command?

I know variations of this question have been asked and answered several times before, but I'm either misunderstanding the solutions, or am trying to do something eccentric. My instinct is that it shouldn't require tee but maybe I'm completely wrong...
Given a command like this:
echo "hello"
I want to send it to STDERR so that it can be logged/seen on the console, and so that it can be sent to another command. For example, if I run:
echo "hello" SOLUTION>&2 > myfile.txt
(SOLUTION> being whatever the answer to my problem is)
I want:
hello to be shown in the console like any other STDERR message
The file myfile.txt to contain hello
There's no need to redirect it to stderr. Just use tee to send it to the file while also sending to stdout, which will go to the terminal.
echo "hello" | tee myfile.txt
If you want to pipe the output to another command without writing it to a file, then you could use
echo "hello" | tee /dev/stderr | other_command
You could also write a shell function that does the equivalent of tee /dev/stderr:
$ tee_to_stderr() {
while read -r line; do
printf "%s\n" "$line";
printf "%s\n" "$line" >&2
done
}
$ echo "hello" | tee_to_stderr | wc
hello
1 1 6
This doesn't work well with binary output, but since you intend to use this to display text on the terminal that shouldn't be a concern.
tee copies stdin to the files on its command line, and also to stdout.
echo hello | tee myfile.txt >&2
This will save hello in myfile.txt and also print it to stderr.
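A quick check of that behaviour (the first hello arrives on stderr, which you can confirm by discarding stderr for the whole pipeline):
$ echo hello | tee myfile.txt >&2
hello
$ cat myfile.txt
hello
$ ( echo hello | tee myfile.txt >&2 ) 2>/dev/null   # prints nothing: the message really was on stderr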

Why does cat exit a shell script, but only when it's fed by a pipe?

Case in point, take this shell script called "foobar.sh":
#! /bin/sh
echo $#
echo $@
cat $1
sed -e 's|foo|bar|g' $1
And a text file called "foo.txt" which contains only one line:
foo
Now if I type ./foobar.sh foo.txt on the command line, then I'll get this expected output:
1
foo.txt
foo
bar
However if I type cat foo.txt | ./foobar.sh then surprisingly I only get this output:
0
foo
I don't understand. If the number of arguments reported by $# is zero, then how can cat $1 still return foo? And, that being the case, why doesn't sed -e 's|foo|bar|g' $1 return anything since clearly $1 is foo?
This seems an awful lot like a bug, but I'm assuming it's magic instead. Please explain!
UPDATE
Based on the given answer, the following script gives the expected output, assuming a one-line foo.txt:
#! /bin/sh
if [ "$#" -gt 0 ]
then
yay=$(cat $1)
else
read yay
fi
echo $yay | cat
echo $yay | sed -e 's|foo|bar|g'
No, $1 is not "foo". $1 is empty, ie, undefined/nothing.
Unlike a programming language, variables in the shell are quite dumbly and literally replaced, and the resulting commands textually executed (well, sorta kinda). In this case, "cat $1" becomes just "cat ", which will take input from stdin. That's terribly convenient to your execution since you've kindly provided "foo" on stdin via your pipe!
See what's happening?
sed will likewise read from stdin, but stdin is already at end of stream, so it exits without printing anything.
When you don't give an argument to cat, it reads from stdin. When $1 isn't given, cat $1 is the same as a plain cat, which reads the text you piped in (cat foo.txt).
Then the sed command runs, and same as cat, it reads from stdin because it has no filename argument. cat has already consumed all of stdin. There's nothing left to read, so sed quits without printing anything.
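You can watch the substitution happen with a throwaway one-liner (assuming the foo.txt from the question, containing just the line foo):
$ echo piped | sh -c 'cat $1'              # $1 is unset, so this is a bare cat reading the pipe
piped
$ echo piped | sh -c 'cat $1' sh foo.txt   # now $1 is foo.txt, so the pipe is ignored
foo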

bash script not getting piped data

I'm trying to write a bash script that will manipulate the data piped from xsel.
...
ary=()
while read data; do
echo $data
ary=( "${ary[#]}" "$data" )
done
The problem is there is nothing being read when I call:
xsel | myscript.sh
I have tried
echo "testing testing" | myscript.sh
and that works, and I also made sure there was something coming from xsel
xsel | festival --tts --pipe
# will read the clipboard string piped from xsel aloud
Any Suggestions? Thanks in advance
read fails if it can't read a full line, and xsel doesn't output a line feed.
Replace your loop with:
readarray ary # new in Bash 4
If you're only adding lines in an array as a proxy for sticking all the data in a variable, you can instead do:
input=$(cat)
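Putting that together, a minimal sketch of myscript.sh (the -t flag, which strips the line terminators, is my own choice, not something your original loop did):
#!/bin/bash
readarray -t ary                 # reads everything from stdin, even a final line with no newline
printf 'got %d element(s)\n' "${#ary[@]}"
printf '%s\n' "${ary[@]}"
Run it as before: xsel | ./myscript.sh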

automatically send stdin to interactive bash script

I have a compiled program which I run from the shell; when I run it, it asks me for an input file on stdin. I want to run that program in a bash loop, with a predefined input file, such as
for i in $(seq 100); do
input.txt | ./myscript
done
but of course this won't work. How can I achieve that? I cannot edit the source code.
Try
for i in $(seq 100); do
./myscript < input.txt
done
Pipes (|) are inter-process. That is, they stream between processes. What you're looking for is file redirection (e.g. <, > etc.)
Redirection simply means capturing output from a file, command,
program, script, or even code block within a script and sending it as
input to another file, command, program, or script.
You may see cat used for this e.g. cat file | mycommand. Given the above, this usage is redundant and often the winner of a 'Useless use of cat' award.
You can use:
./myscript < input.txt
to send the content of input.txt to the stdin of myscript.
Based on your comments, it looks like myscript prompts for a file name and you want to always respond with input.txt. Did you try this?
for i in $(seq 100); do
echo input.txt | ./myscript
done
You might want to just try this first:
echo input.txt | ./myscript
just in case.
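To keep the two suggestions straight (which one you need depends on whether myscript reads the data itself or asks you for a file name):
./myscript < input.txt          # stdin carries the contents of input.txt
echo input.txt | ./myscript     # stdin carries the file name itself, for a program that prompts for one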

Open and write data to text file using Bash?

How can I write data to a text file automatically by shell scripting in Linux?
I was able to open the file. However, I don't know how to write data to it.
The short answer:
echo "some data for the file" >> fileName
However, echo doesn't deal with newlines (or escape sequences in general) in a portable way. So, if you're going to append more than one line, do it with printf:
printf "some data for the file\nAnd a new line" >> fileName
The >> and > operators are very useful for redirecting output; they work with many other commands besides echo.
#!/bin/sh
FILE="/path/to/file"
/bin/cat <<EOM >$FILE
text1
text2 # This comment will be inside of the file.
The keyword EOM can be any text, but it must start the line and be alone.
 EOM # This will also be inside of the file; note the space in front of EOM.
EOM # No comments and spaces around here, or it will not work.
text4
EOM
You can redirect the output of a command to a file:
$ cat file > copy_file
or append to it
$ cat file >> copy_file
If you want to write directly the command is echo 'text'
$ echo 'Hello World' > file
#!/bin/bash
cat > FILE.txt <<EOF
info code info
info code info
info code info
EOF
I know this is an old question, but since the OP is about scripting and Google brought me here, opening file descriptors for reading and writing at the same time should also be mentioned.
#!/bin/bash
# Open file descriptor (fd) 3 for read/write on a text file.
exec 3<> poem.txt
# Let's print some text to fd 3
echo "Roses are red" >&3
echo "Violets are blue" >&3
echo "Poems are cute" >&3
echo "And so are you" >&3
# Close fd 3
exec 3>&-
Then cat the file on terminal
$ cat poem.txt
Roses are red
Violets are blue
Poems are cute
And so are you
This example causes file poem.txt to be opened for reading and writing on file descriptor 3. It also shows that *nix boxes know more fd's than just stdin, stdout and stderr (fd 0, 1, 2); in fact, they can hold a lot. Usually the max number of file descriptors the kernel can allocate can be found in /proc/sys/file-max or /proc/sys/fs/file-max, but using any fd above 9 is dangerous as it could conflict with fd's used by the shell internally. So don't bother and only use fd's 0-9. If you need more than 9 file descriptors in a bash script you should use a different language anyways :)
Anyhow, fd's can be used in a lot of interesting ways.
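One such use, sketched under the same setup (it assumes poem.txt already contains the lines written above): because <> opens the descriptor for reading as well as writing, you can read the file back through fd 3 too.
#!/bin/bash
exec 3<> poem.txt         # open for read/write; the offset starts at the beginning of the file
read -r first_line <&3    # reads the first line ("Roses are red")
echo "$first_line"
exec 3>&-                 # close fd 3 again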
I like this answer:
cat > FILE.txt <<EOF
info code info
...
EOF
but would suggest cat >> FILE.txt <<EOF if you just want to add something to the end of the file without wiping out what already exists
Like this:
cat >> FILE.txt <<EOF
info code info
...
EOF
Moving my comment as an answer, as requested by @lycono
If you need to do this with root privileges, do it this way:
sudo sh -c 'echo "some data for the file" >> fileName'
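A common alternative worth mentioning (not part of the comment above) is to let tee do the privileged write, so only the writing process runs as root:
echo "some data for the file" | sudo tee -a fileName > /dev/null
Drop -a to overwrite instead of append; the > /dev/null just hides tee's copy of the text on stdout.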
For environments where here documents are unavailable (Makefile, Dockerfile, etc) you can often use printf for a reasonably legible and efficient solution.
printf '%s\n' '#!/bin/sh' '# Second line' \
'# Third line' \
'# Conveniently mix single and double quotes, too' \
"# Generated $(date)" \
'# ^ the date command executes when the file is generated' \
'for file in *; do' \
' echo "Found $file"' \
'done' >outputfile
I thought there were a few perfectly fine answers, but no concise summary of all possibilities; thus:
The core principle behind most answers here is redirection. Two redirection operators are important for writing to files:
Redirecting Output:
echo 'text to completely overwrite contents of myfile' > myfile
Appending Redirected Output
echo 'text to add to end of myfile' >> myfile
Here Documents
Others mentioned that, rather than writing from a fixed input source like echo 'text', you could also write to files interactively via a "Here Document", which is also detailed in the bash manual. Those answers, e.g.
cat > FILE.txt <<EOF or cat >> FILE.txt <<EOF
make use of the same redirection operators, but add another layer via "Here Documents". In the above syntax, you write to the FILE.txt via the output of cat. The writing only takes place after the interactive input is given some specific string, in this case 'EOF', but this could be any string, e.g.:
cat > FILE.txt <<'StopEverything' or cat >> FILE.txt <<'StopEverything'
would work just as well. Here Documents also look for various delimiters and other interesting parsing characters, so have a look at the docs for further info on that.
Here Strings
A bit convoluted, and more of an exercise in understanding both redirection and Here Documents syntax, but you could combine Here Document style syntax with standard redirect operators to become a Here String:
Redirecting Output of cat Input
cat > myfile <<<'text to completely overwrite contents of myfile'
Appending Redirected Output of cat Input
cat >> myfile <<<'text to add to end of myfile'
This approach works well:
cat > filename <<EOF
Text1...
Text2...
EOF
Basically, cat reads input until it finds the keyword "EOF" on a line by itself, and writes (or, with >>, appends) everything before it to the file.
If you are using variables, you can use
first_var="Hello"
second_var="How are you"
If you want to concatenate both strings and write the result to a file, then use:
echo "${first_var} - ${second_var}" > ./file_name.txt
Your file_name.txt content will be "Hello - How are you"
You can also use a here document and vi; the script below generates FILE.txt with 3 lines and variable interpolation:
VAR=Test
vi FILE.txt <<EOFXX
i
#This is my var in text file
var = $VAR
#Thats end of text file
^[
ZZ
EOFXX
The file will then have the 3 lines below. The "i" starts vi's insert mode, and the file is saved and closed with Esc and ZZ.
#This is my var in text file
var = Test
#Thats end of text file
