Capture /dev/tty in sub sub shell - bash

I have two bash files.
File Child:
echo some text | tee /dev/tty
File Parent:
my_variable=$(./Child)
Executing ./Child prints "some text", executing ./Parent prints "some text" twice. So far so good.
However, if I modify Child file to:
a=$(echo some text | tee /dev/tty)
Then Child still prints "some text", but Parent prints it only once and my_variable ends up empty.
Is there a way to capture the value from /dev/tty in the parent?
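A minimal sketch of one workaround, assuming the goal is for Parent to both show the text on the terminal and capture it: since the $( ) inside Child now swallows the pipeline's stdout, Child has to re-emit the captured value on its own stdout:
# Child (hypothetical rework)
a=$(echo some text | tee /dev/tty)   # /dev/tty shows the text on the terminal, $a holds the captured copy
echo "$a"                            # re-emit on stdout so Parent's command substitution can capture it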

Related

Making a script in Debian that creates a new file from a names file with the names in a different order

The existing names in the "names" file are in the form lastname1,firstname1 ; lastname2,firstname2.
In the new file they should look like the example below.
Create a script that outputs a list of existing users (from the "names" file) in the form:
firstname1.lastname1
firstname2.lastname2
etc.
and saves it to a file called "cat list".
A command line like this should do what you need:
awk -F ',' '{print $2"."$1}' source_file >> "cat list"
awk splits each line on ',' and prints the two fields in reverse order with a '.' between them; ">>" then appends the full output to a file called "cat list", as requested.
I don't think this is the most efficient solution, but it works and prints the intermediate stage of each translation to help illustrate the process:
#!/bin/sh
echo "lastname1,firstname1 ; lastname2,firstname2" >testfile
echo "original file:"
cat testfile
echo "\n"
# first replace semi-colon with newline
tr ';' '\n' <testfile >testfile_n
echo "after first translation:"
cat testfile_n
echo "\n"
# also remove extra spaces
tr -d '[:blank:]' <testfile_n >testfile_n_s
echo "after second translation:"
cat testfile_n_s
echo "\n"
# now swap name order using sed and use periods instead of commas
sed -E 's/([a-zA-Z0-9]*),([a-zA-Z0-9]*)/\2\.\1/g' testfile_n_s >"cat list"
echo "after third iteration:"
cat "cat list"
echo "\n"
The script above will save a file called 'cat list' and output something similar to:
original file:
lastname1,firstname1 ; lastname2,firstname2
after first translation:
lastname1,firstname1
lastname2,firstname2
after second translation:
lastname1,firstname1
lastname2,firstname2
after third iteration:
firstname1.lastname1
firstname2.lastname2

bash prepend text to every line printed by commands

I'm trying to find a way to do something like this:
# script.sh:
cmd0
set_prepend "some text"
cmd1
cmd2
cmd3
unset_prepend
cmd4
Then every line of stdout generated by cmd1, cmd2, and cmd3 will be prepended with "some text". There is no relationship between the commands, and the commands can be anything (ls, cat, awk, whatever):
$ script.sh
cmd0 line1
...
cmd0 lineN0
some text cmd1 line1
some text ...
some text cmd1 lineN1
some text cmd2 line1
some text ...
some text cmd2 lineN2
some text cmd3 line1
some text ...
some text cmd3 lineN3
cmd4 line1
...
cmd4 lineN4
The only way I can think of is far from elegant:
script.sh | prepender
and for each line received by prepender, it checks for the existence of a file; if the file exists, its contents are the text to prepend; set_prepend would create that file and unset_prepend would remove it. However, buffering would interfere with this, so it would have to be turned off, and I'm not sure how to guarantee that a line going to stdout is processed by prepender before the next script command executes (otherwise a race condition is possible).
Use exec to redirect stdout into a process substitution running sed. You'll need to save the old stdout on another FD so you can restore it later.
set_prepend() {
exec 3>&1 > >(sed "s/^/$1 /")
}
unset_prepend() {
exec >&3 3>&-
}
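A usage sketch, assuming the two functions above are defined in the script (note that the sed process substitution runs asynchronously, so its output can lag slightly behind unprefixed output printed after unset_prepend):
set_prepend "some text"
ls              # each line of the listing comes out as "some text <line>"
unset_prepend
echo done       # printed without the prefix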

How can I read from stdin even when called in a pipe

How can I define a function bar in a Bash script such that echo foo | bar in that script will read an input from the script's stdin and not the pipe? In other words, if bar is:
function bar(){
read ZOO
}
I want it to wait for my input rather than setting ZOO to "foo"
The idea of a pipe is to connect stdout of the process on the left side of the pipe with stdin of the process on the right side of the pipe. So foo is in this case piped into stdin of the function bar().
If you want to read explicitly from the current terminal, then pass the special device /dev/tty to stdin of read:
function bar() {
read ZOO < /dev/tty
}
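For example (assuming the script is attached to an interactive terminal):
echo foo | bar   # bar ignores the piped "foo" and waits for a line typed at the terminal
Keep in mind that bash runs each stage of a pipeline in a subshell, so ZOO set inside bar here will not be visible after the pipeline finishes.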

How to concatenate stdin and a string?

How do I concatenate stdin to a string, like this?
echo "input" | COMMAND "string"
and get
inputstring
A bit hacky, but this might be the shortest way to do what you asked in the question (use a pipe to feed the stdout of echo "input" as stdin to another process/command):
echo "input" | awk '{print $1"string"}'
Output:
inputstring
What task are you exactly trying to accomplish? More context can get you more direction on a better solution.
Update - responding to comment:
@NoamRoss
The more idiomatic way of doing what you want is then:
echo 'http://dx.doi.org/'"$(pbpaste)"
The $(...) syntax is called command substitution. In short, it executes the enclosed commands in a new subshell and substitutes their stdout output where the $(...) was invoked in the parent shell. So you would get, in effect:
echo 'http://dx.doi.org/'"rsif.2012.0125"
Use cat - to read from stdin, and put it inside $() to throw away the trailing newline:
echo input | COMMAND "$(cat -)string"
However, why not drop the pipe and grab the output of the left-hand side with command substitution instead:
COMMAND "$(echo input)string"
I'm often using pipes, so this tends to be an easy way to prefix and suffix stdin:
echo -n "my standard in" | cat <(echo -n "prefix... ") - <(echo " ...suffix")
prefix... my standard in ...suffix
There are several ways to accomplish this; personally I think the best is:
echo input | while read line; do echo $line string; done
Another is to substitute "$" (the end-of-line anchor) with "string" in a sed command:
echo input | sed "s/$/ string/g"
Why do I prefer the former? Because it concatenates the string to each line of stdin as it arrives; for example, with the following command:
(echo input_one ;sleep 5; echo input_two ) | while read line; do echo $line string; done
you immediately get the first output:
input_one string
and then after 5 seconds you get the other echo:
input_two string
On the other hand, with "sed" the content of the parentheses is typically buffered before sed's output appears, so the command
(echo input_one ;sleep 5; echo input_two ) | sed "s/$/ string/g"
may output both lines
input_one string
input_two string
only after the 5 seconds have elapsed (depending on sed's output buffering).
This can be very useful when you are calling functions that take a long time to complete and you want to be continuously updated about their output.
You can do it with sed:
seq 5 | sed '$a\6'
seq 5 | sed '$ s/.*/& 6/'
In your example:
echo input | sed 's/.*/&string/'
I know this is a few years late, but you can accomplish this with the xargs -J option (available in BSD xargs, e.g. on macOS):
echo "input" | xargs -J "%" echo "%" "string"
And since it is xargs, you can do this on multiple lines of a file at once. If the file 'names' has three lines, like:
Adam
Bob
Charlie
You could do:
cat names | xargs -n 1 -J "%" echo "I like" "%" "because he is nice"
Also works:
seq -w 0 100 | xargs -I {} echo "string "{}
Will generate strings like:
string 000
string 001
string 002
string 003
string 004
...
The command you posted would take the string "input" and use it as COMMAND's stdin stream, which would not produce the results you are looking for unless COMMAND first printed the contents of its stdin and then printed its command-line arguments.
It seems like what you want is closer to command substitution.
http://www.gnu.org/software/bash/manual/html_node/Command-Substitution.html#Command-Substitution
With command substitution you can have a commandline like this:
echo input `COMMAND "string"`
This will first run COMMAND with "string" as its argument, and then expand the result of that command's execution onto the line, replacing everything between the ` characters.
cat will be my choice: ls | cat - <(echo new line)
With perl
echo "input" | perl -ne 'print "prefix $_"'
Output:
prefix input
A solution using sd (basically a modern sed; much easier to use IMO):
# replace '$' (end of string marker) with 'Ipsum'
# the `e` flag disables multi-line matching (treats all lines as one)
$ echo "Lorem" | sd --flags e '$' 'Ipsum'
Lorem
Ipsum   # no trailing newline here
You might observe that Ipsum appears on a new line and that the output is missing a trailing \n. The reason is that echo's output ends in a \n and you didn't tell sd to add a new one; sd is technically correct because it's doing exactly what you asked it to do and nothing else.
However this may not be what you want, so instead you can do this:
# replace '\n$' (new line, immediately followed by end of string) by 'Ipsum\n'
# don't forget to re-add the `\n` that you removed (if you want it)
$ echo "Lorem" | sd --flags e '\n$' 'Ipsum\n'
LoremIpsum
If you have a multi-line string, but you want to append to the end of each individual line:
$ ls
foo bar baz
$ ls | sd '\n' '/file\n'
bar/file
baz/file
foo/file
I want to prepend a "set" statement to my SQL script before running it.
So I echo the "set" instruction and pipe it to cat. cat takes two parameters: stdin, marked as "-", and my SQL file; cat joins both of them into one output. Then I pass the result to the mysql command to run it as a script.
echo "set @ZERO_PRODUCTS_DISPLAY='$ZERO_PRODUCTS_DISPLAY';" | cat - sql/test_parameter.sql | mysql
P.S. The MySQL login and password are stored in the .my.cnf file.

Ruby: execute bash command, capture output AND dump to screen at the same time

So my problem is that I need to have the output of running the command dumped to the screen and also capture it in a variable in a ruby script. I know that I can do the second part like this:
some_variable = `./some_kickbutt`
But my problem is that I need it to still print to the console as Hudson captures that output and records it for posterity's sake.
thanks in advance for any ideas...
Just tee the stdout stream to stderr, like so:
ruby -e 'var = `ls | tee /dev/stderr`; puts "\nFROM RUBY\n\n"; puts var' | nl
ruby -e 'var = `ls | tee /dev/stderr`; puts "\nFROM RUBY\n\n"; puts var' 2>&1 | nl
The first variant keeps the live copy on stderr, so only the captured output is numbered by nl; the second merges stderr into stdout, so the live copy goes through nl as well.
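For comparison, a sketch of the same duplicate-the-stream idea in plain shell, using tee to /dev/stderr so the terminal sees the output while the command substitution captures it:
var=$(ls | tee /dev/stderr)    # live copy goes to the terminal via stderr
echo "captured: $var"          # captured copy is available in the variable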
