Read after 'while read lines' not evaluated (SHELL) - bash

I am currently trying to debug some scripts I've made and I cannot find a way to get a 'read' instruction to execute.
To summarize, I've got two functions: one with a 'while read line' loop that is called after a pipe, and another function that reads user input after the while read loop has finished.
Let me explain this with code.
This is how I call my function ($lines contains multiple lines separated by '\n'):
echo "$lines" | saveLines
saveLines(){
    # ...
    while read line ; do
        # processing lines
    done
    myOtherFunction
}
myOtherFunction(){
    echo "I am here !" # <= This is printed in console
    read -p "Type in : " tmp # <= Input is never asked of the user, and the message is not printed
    echo "I now am here !" # <= This is printed in console
}
This code is simplified but the spirit is here.
I tried to insert a 'read' instruction before the 'read -p ...', but it did not seem to change anything.
So please, if you can point out my error or tell me why this behavior is expected, I would be very happy. Thanks for your time.

This question is very close to that other question, in a slightly different context. To be more precise, and as the OP explained, the command run was
echo "$lines" | saveLines
meaning that the standard input of the code executed by saveLines was no longer the terminal, but the read end of the pipe fed by the echo … command. (This also explains why the prompt never appeared: bash displays the read -p prompt only when input comes from a terminal.)
To solve this, it suffices to replace
…
read -p "Type in : " tmp
…
with
…
read -p "Type in : " tmp </dev/tty
…
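The failure mode can be reproduced without a terminal. A minimal self-contained sketch (function names taken from the question; the error handling is added here to make the symptom visible):

```bash
#!/bin/bash
# After the pipe, the function's stdin is the (now exhausted) pipe,
# so a later read gets EOF instead of keyboard input.
saveLines() {
    while read -r line; do
        echo "got: $line"
    done
    myOtherFunction
}

myOtherFunction() {
    # Without </dev/tty this read sees the exhausted pipe and fails
    # immediately; with </dev/tty it would read from the keyboard.
    if ! read -r -p "Type in : " tmp; then
        echo "read failed: stdin is the exhausted pipe"
    fi
}

printf 'one\ntwo\n' | saveLines
```

Note that the prompt string also stays invisible here, because bash shows the read -p prompt only when input comes from a terminal.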

Stop reading from STDIN if stream is empty in BASH

I am creating a script (myscript.sh) in BASH that reads from STDIN, typically a stream of data that comes from cat or from a file, and outputs that stream of data (amazing!), like this:
$ cat myfile.txt
hello world!
$ cat myfile.txt | myscript.sh
hello world!
$ myscript.sh myfile.txt
hello world!
But I would also like the following behaviour: if I call the script without arguments, I'd like it to output a brief help:
$ myscript.sh
I am the help: I just print what you say.
== THE PROBLEM ==
The problem is that I am capturing the stream of data like this:
if [[ $# -eq 0 ]]; then
    stream=$(cat <&0)
elif [[ -n "$stream" ]]; then
    echo "I am the help: I just print what you say."
else
    echo "Unknown error."
fi
And when I call the script with no arguments like this:
$ myscript.sh
it SHOULD print the "help" part, but it just keeps waiting for a stream of data on line 2 of the code above...
Is there any way to tell bash that if nothing comes from STDIN it should just break and continue executing?
Thanks in advance.
There's always a standard input stream; if no arguments are given and input isn't redirected, standard input is the terminal.
If you want to treat that specially, use test -t to test if standard input is connected to a terminal.
if [[ $# -eq 0 && -t 0 ]]; then
    echo "I am the help: I just print what you say."
else
    stream=$(cat -- "$@")
fi
There's no need to test $# again in the else branch. Just pass your arguments to cat; if it gets filenames it will read from them, otherwise it will read from standard input.
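A quick way to see what the -t 0 test reports (a hypothetical standalone demo, not part of the answer above):

```bash
#!/bin/bash
# -t 0 is true only when file descriptor 0 (stdin) is attached
# to a terminal; a pipe or a redirected file makes it false.
if [ -t 0 ]; then
    echo "stdin is a terminal"
else
    echo "stdin is a pipe or file"
fi
```

Run interactively it prints the first message; run as `echo x | ./demo.sh` it prints the second.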
I agree with @Barmar's solution.
However, it might be better to avoid entirely a situation where your program's behavior depends on whether the input file descriptor is a terminal (there are situations where a terminal is mimicked even though there is none; in such a situation, your script would just produce the help string).
You could instead introduce a special - argument to explicitly request reading from stdin. This results in simpler option handling and uniform behavior of your script, regardless of the environment.
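A sketch of that convention, reusing the help text from the question. Note that cat itself already treats a '-' operand as "read standard input", so the script needs no special handling for it:

```bash
#!/bin/bash
# myscript.sh: print help when called with no arguments; otherwise
# copy the named files. A '-' argument explicitly requests standard
# input (cat handles it natively), so no terminal detection is needed.
if [[ $# -eq 0 ]]; then
    echo "I am the help: I just print what you say."
    exit 0
fi
cat -- "$@"
```

With this, `echo hello | myscript.sh -` prints hello, while a bare `myscript.sh` prints the help even when stdin is redirected.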
The first answer is to help yourself: try running the script with bash -x myscript.sh. It will print a lot of information to help you.
In your specific case, the condition $# -eq 0 was flipped. Per the requirement, you want to print the help message when NO ARGUMENTS ARE PROVIDED:
if [[ $# -eq 0 ]] ; then
    echo "I am the help: I just print what you say."
    exit 0
fi
# Rest of your script: read data from file, etc.
cat -- "$@"
Assuming this approach is taken, if you want to process standard input instead of a file, simply pass '-' as the parameter: cat foobar.txt | myscript.sh -

Indenting "read" input when it contains multiple lines

I have a read command in a bash script whose input defines an array. The input will frequently be copy/pasted data that contains multiple lines. Each line in the multi-line input is correctly captured and added to the array as separate elements, but I'd like to indent each line with a > prefix in the terminal window when it is pasted in.
This is for bash v3 running on macOS. I've attempted various flavors of the read command, but couldn't find anything that worked.
Script:
#!/bin/bash
echo "Provide inputs:"
until [[ "$message" = "three" ]]; do
    read -p "> " message
    myArray+=($message) #Input added to array for later processing
done
Manually typed inputs look like this:
Provide inputs:
> one
> two
> three
But a copy/pasted multi-line input look like this:
Provide inputs:
> one
two
three
> >
The desired result is for the copy/pasted multi-line input to look identical to the manually entered inputs.
It sounds like the issue is with the way read works. read echoes back keystrokes, and I think that, perhaps because of stdout buffering, they are written before the echo output is flushed.
Using a combination of an echo command and the -e argument to read (readline-based input) fixes this in my testing.
#!/bin/bash
echo "Provide inputs:"
until [[ "$message" = "three" ]]; do
    echo -ne "> "
    read -e message
    myArray+=($message) #Input added to array for later processing
done
(Answer changed after the OP explained what they want.)
The screen will look identical whether the input is entered line by line or pasted as multiple lines, if you remove the > prompt (I added -r to handle special characters):
until [[ "$message" = "three" ]]; do
    read -r message
    myArray+=("$message")
done
When you want to see the >, you can use the ugly
printf "> "
until [[ "$message" = "three" ]]; do
    read -rs message
    printf "%s\n> " "${message}"
    myArray+=("$message")
done
In this case the input is only shown after an Enter, so this seems worse.
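Feeding three lines at once, as a paste would, shows that the printf/read -rs version keeps every line behind a prompt. A sketch that simulates the paste with a pipe (the array bookkeeping is dropped for brevity):

```bash
#!/bin/bash
# Simulate pasting three lines at once by piping them in; read -rs
# does not echo the input, and printf redraws each line behind "> ".
printf 'one\ntwo\nthree\n' | {
    printf "> "
    until [[ "$message" = "three" ]]; do
        read -rs message
        printf "%s\n> " "$message"
    done
}
```

The output matches the desired line-by-line look: every input line appears after a "> " prompt.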

Adding new lines to multiple files

I need to add new lines with specific information to one or multiple files at the same time.
I tried to automate this task using the following script:
for i in /apps/data/FILE*
do
    echo "nice weather 20190830 friday" >> "$i"
done
It does the job, yet I wish I could automate it more and have the script ask me to provide the file name and the line I want to add.
I expect the output to be like
enter file name : file01
enter line to add : IWISHIKNOW HOWTODOTHAT
Thank you everyone.
In order to read user input you can use
read user_input_file
read user_input_text
read user_input_line
You can print a prompt before each question with echo -n:
echo -n "enter file name : "
read user_input_file
echo -n "enter line to add : "
read user_input_text
echo -n "enter line position : "
read user_input_line
To add the line at the desired position you can "play" with head and tail:
head -n $((user_input_line - 1)) "$user_input_file" > "$new_file"
echo "$user_input_text" >> "$new_file"
tail -n +"$user_input_line" "$user_input_file" >> "$new_file"
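Putting the pieces together, a runnable sketch. The answer never defines new_file, so choosing "<file>.new" here is an assumption:

```bash
#!/bin/bash
# Insert a line at a given position using head and tail.
read -r -p "enter file name : " user_input_file
read -r -p "enter line to add : " user_input_text
read -r -p "enter line position : " user_input_line

new_file="${user_input_file}.new"   # assumed output name
# Lines before the insertion point, then the new line, then the rest.
head -n "$((user_input_line - 1))" "$user_input_file" > "$new_file"
echo "$user_input_text" >> "$new_file"
tail -n "+$user_input_line" "$user_input_file" >> "$new_file"
```

Since bash shows read -p prompts only on a terminal, the script can also be driven by piping the three answers in.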
Requiring interactive input is horrible for automation. Make a command which accepts a message and a list of files to append to as command-line arguments instead.
#!/bin/sh
msg="$1"
shift
echo "$msg" | tee -a "$@"
Usage:
scriptname "today is a nice day" file1 file2 file3
The benefits for interactive use are obvious: you get to use your shell's history mechanism and filename completion (usually bound to Tab), and it's also much easier to build more complicated scripts on top of this one later.
Putting the message in the first command-line argument may seem baffling to newcomers, but it allows for a very simple overall design where "the other arguments" (zero or more) are the files you want to manipulate. See how grep has this design, and sed, and many other standard Unix commands.
You can use the read statement to prompt for input.
read keeps your script interactive, but if you wish to automate it you will need an accompanying expect script to feed input to the read statements.
Instead, you can pass arguments to the script, which helps with automation: no prompting.
#!/usr/bin/env bash
[[ $# -ne 2 ]] && echo "print usage here" && exit 1
file=$1 && shift
con=$1
for i in $file   # intentionally unquoted so the glob passed by the caller expands here
do
    echo "$con" >> "$i"
done
To use:
./script.sh "<filename>" "<content>"
The quotes are important for the content so that the spaces in the content are considered to be part of it. For filenames use quotes so that the shell does not expand them before calling the script.
Example: ./script.sh "file*" "samdhaskdnf asdfjhasdf"

How to use 'coproc' to interact with another command driven program

Ok, obviously I am NOT a bash guru and am in need of one!
I have never used 'coproc' before, but it seems to be just what I need. But I have to admit that I cannot extrapolate from the various 'ping' examples out there! [I did try for a couple of hours...]
All I want to do is start a 'coproc' shell script that takes input from standard in and writes its results to standard out. I want the main script to send those commands and process their results.
Here is one of the simplest outlines of what I am trying to do:
EDITED WITH BETTER DETAIL
#! /bin/bash
coproc bkgndProc {
    /some/path/to/usefulScript.sh maybeSomeArgsHere
}
# send command #1 to bkgndProc here
result=$(echo 'command' <&${bkgndProc[0]}) ### Doesn't work for me
echo "Did it work? $result" ### this just prints back the 'command' I used
# here execute conditional logic based on result:
# if result1; then
#     send command #2 here, getting results
# else
#     send command #3 here, again getting results
# fi
Sorry about using pseudo-code above, but I'm not sure what those sending commands should be! If anyone could supply the details that would be greatly appreciated!
result = $(echo 'command' <&${bkgndProc[0]}) ### Doesn't work for me
wouldn't work, for a start, because of the spaces around the = sign; it should be
result=$(echo 'command' <&${bkgndProc[0]})
---- Update ----
A simple concept could be shown in a script like this:
#!/bin/bash
# create the co-process
coproc myproc {
    bash
}
# send a command to it (echo a)
echo 'echo a' >&"${myproc[1]}"
# read a line from its output
read line <&"${myproc[0]}"
# show the line
echo "$line"
Outputs:
a
Another which reads multiple lines using a timeout:
#!/bin/bash
coproc myproc {
    bash
}
# send a command that emits 4 random numbers on 4 lines
echo 'echo "$RANDOM"; echo "$RANDOM"; echo "$RANDOM"; echo "$RANDOM"' >&"${myproc[1]}"
# keep reading the line until it times out
while read -t 1 -u "${myproc[0]}" line; do
    echo "$line"
done
Output:
17393
1423
8368
1782
If we used cat instead, it would never quit, since the other end is still alive and connected and EOF is never reached. That's why we used timeouts:
cat <&"${myproc[0]}"
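An alternative to timeouts is to make EOF actually happen. A sketch: close our copy of the coprocess's input; the inner bash then exits, its output pipe reaches EOF, and cat terminates on its own. The read end is duplicated into a plain fd first, because bash may discard the myproc descriptors once the coprocess dies:

```bash
#!/bin/bash
coproc myproc {
    bash
}
echo 'echo a; echo b' >&"${myproc[1]}"

# Keep a private copy of the read end, then close the write end.
exec {out}<&"${myproc[0]}"
exec {myproc[1]}>&-

# The inner bash sees EOF on stdin, runs the queued commands, and
# exits; cat then reads everything and stops at EOF by itself.
cat <&"$out"
```

This requires bash 4.3 or later for the {varname} redirection that closes a descriptor stored in a variable.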

ksh ignores exactly two newlines when reading from /dev/tty

We have a ksh script doing a 'while read line' loop with input piped into it. At the same time we read user confirmation input with 'read < /dev/tty', similar to the following sketch:
cat interestingdata | while read line ; do
    x=$(dostuff $line)
    if [[ $x -ne 0 ]] ; then
        read y < /dev/tty
        domorestuff "$y"
    fi
    echo "done optional stuff"
done
All works fine for processing the lines of 'interestingdata', and for most of the reads from /dev/tty. However, on the first two iterations of the while loop, the first string plus newline is ignored.
By this, I mean the user types something and presses enter, and the script doesn't progress to echo "done optional stuff". Instead, the user has to type something else and press enter again, and only then does the script proceed.
This happens only for the first two iterations of the while loop, and then everything works perfectly. Any ideas how I can fix this? I have no idea what else I can do here!
Running linux kernel 2.6.9-55.9.vm2.ELsmp with ksh93 if that helps.
It sounds like either "dostuff" or "domorestuff" is sometimes reading from stdin.
Try replacing "dostuff" with "dostuff < /dev/null" and "domorestuff" with "domorestuff < /dev/null" and see if the behavior changes.
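The effect being diagnosed is easy to reproduce: a child command that reads stdin inside the loop silently consumes lines meant for read. A minimal sketch, using cat as a stand-in for dostuff (which is a placeholder from the question):

```bash
#!/bin/bash
# A worker that reads stdin will swallow the loop's remaining lines
# unless its input is redirected from /dev/null.
printf 'one\ntwo\nthree\n' | while read -r line; do
    cat </dev/null >/dev/null   # with </dev/null it consumes nothing
    echo "processed $line"
done
```

Drop the </dev/null redirection and cat eats "two" and "three", so the loop runs only once: the same symptom the question describes, just on every line instead of the first two.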