echo to stdout and append to file - bash

I have this:
echo "all done creating tables" >> ${SUMAN_DEBUG_LOG_PATH}
but that only appends to the file; it does not write to stdout.
How can I write to stdout and append to a file in the same bash line?

Something like this?
echo "all done creating tables" | tee -a "${SUMAN_DEBUG_LOG_PATH}"

Use the tee command
$ echo hi | tee -a foo.txt
hi
$ cat foo.txt
hi
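Note that without -a, tee truncates the file instead of appending:
$ echo first | tee foo.txt
first
$ echo second | tee foo.txt
second
$ cat foo.txt
second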

Normally tee is used; however, here is a version using just bash:
#!/bin/bash
mytee() {
    fn=$1    # file to append to
    shift
    IFS= read -r LINE                 # read a single line from stdin
    printf '%s\n' "$LINE"             # write it to stdout
    printf '%s\n' "$LINE" >> "$fn"    # and append it to the file
}
SUMAN_DEBUG_LOG_PATH=/tmp/abc
echo "all done creating tables" | mytee "${SUMAN_DEBUG_LOG_PATH}"

Related

Shell - Execute commands in external file between two patterns

I have got a question: how should I proceed to make this code print out and execute the curl examples that I have in my external file?
How I want it to work: match the pattern, get the text between the patterns (without the patterns), and then execute it.
Is there a way to do this?
Thanks for the help.
read -p "Enter a word: " instance
testfile=test.txt
case $instance in
loresipsum)
sed -n '/^loremipsum1/,${p;/^loremipsum2/q}' $testfile \
| while read -r line; do
makingcurlCall=$(eval "$line")
echo "makingcurlCall"
done < $testfile ;;
foobar)
sed -n '/^foobar1/,${p;/^foobar2/q}' $testfile \
| while read -r line; do
makingcurlCall=$(eval "$line")
echo "makingcurlCall"
done < $testfile ;;
*)
printf 'No match for "%s"\n' ":instance"
esac
Text file looks like this
loremipsum1
curl example1
curl example2
curl example3
loremipsum2
foobar1
curl foo
curl bar
curl foo
foobar2
You cannot have the while loop read from both the output of sed and directly from the file. Your current code is ignoring the output from sed and reading directly from the file. Perhaps refactor it like this:
#!/bin/sh
instance=${1-loresipsum}
testfile=test.txt
case $instance in
  loresipsum) sed -n '/^loremipsum1/,/^loremipsum2/p' "$testfile";;
  foobar)     sed -n '/^foobar1/,/^foobar2/p' "$testfile";;
  *)          echo "Error: no match" >&2;;
esac \
  | sed -e 1d -e '$d' -e '/^\s*$/d' | while read -r line; do
      # 1d drops the opening marker, $d the closing marker, and blank lines are removed
      # makingcurlCall=$(eval "$line")
      echo "makingcurlCall: $line"
    done
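For example, with the sample test.txt shown in the question above and the eval line left commented out, saving this as, say, run.sh (the filename is just for illustration) gives:
$ ./run.sh foobar
makingcurlCall: curl foo
makingcurlCall: curl bar
makingcurlCall: curl foo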

How to iterate two variables in bash script?

I have these kind of files:
file6543_015.bam
subreadset_15.xml
file6543_024.bam
subreadset_24.xml
file6543_027.bam
subreadset_27.xml
I would like to run something like this:
for i in *bam && l in *xml
do
my_script $i $l > output_file
done
Because in my command the first bam file goes with the first xml file; each bam/xml combination produces a specific output file.
Like this, using bash arrays:
bam=( *.bam )
xml=( *.xml )
for ((i=0; i<${#bam[@]}; i++)); do
    my_script "${bam[i]}" "${xml[i]}"
done
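If the two globs can ever get out of step, a cheap sanity check before the loop (a sketch, not part of the original answer):
(( ${#bam[@]} == ${#xml[@]} )) || { echo "bam/xml count mismatch" >&2; exit 1; }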
Assuming you have a way to uniquely name your output_file for each specific output,
here is one way:
#!/bin/bash
ls file*.bam | while read i
do
    CMD="my_script $i"
    CMD="$CMD $(echo $i | sed -e 's/file.*_0/subreadset_/' -e 's/.bam/.xml/')"
    $CMD >> output_file
done
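Alternatively, a sketch using only parameter expansion to derive each .xml name, assuming the numbers always carry one leading zero as in the samples (the output_${n}.txt naming is an assumption):
#!/bin/bash
for bam in file*.bam; do
    n=${bam##*_}    # strip up to the last underscore, e.g. 015.bam
    n=${n%.bam}     # strip the extension: 015
    n=${n#0}        # drop the leading zero: 15
    my_script "$bam" "subreadset_${n}.xml" > "output_${n}.txt"
done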

How to run commands off of a pipe

I would like to run commands such as "history" or "!23" off of a pipe.
How might I achieve this?
Why does the following command not work?
echo "history" | xargs eval $1
To answer (2) first:
history and eval are both bash builtins, so xargs cannot run either of them.
Also, xargs does not take $1 arguments; see man xargs for the correct syntax.
For (1), it doesn't really make much sense to do what you are attempting because shell history is not likely to be synchronised between invocations, but you could try something like:
{ echo 'history'; echo '!23'; } | bash -i
or:
{ echo 'history'; echo '!23'; } | while read -r cmd; do eval "$cmd"; done
Note that pipelines run inside subshells. Environment changes are not retained:
x=1; echo "x=2" | while read -r cmd; do eval "$cmd"; done; echo "$x"
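In bash 4.2 and newer, shopt -s lastpipe runs the last element of a pipeline in the current shell (it only takes effect when job control is off, as in non-interactive scripts), so the change is retained:
shopt -s lastpipe
x=1; echo "x=2" | while read -r cmd; do eval "$cmd"; done; echo "$x"    # prints 2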
You can try it like this.
First redirect the history commands to a file, cutting out the line numbers:
history | cut -c 8- > cmd.txt
Now create this script, hcmd.sh (based on: Read a file line by line assigning the value to a variable):
#!/bin/bash
while IFS='' read -r line || [[ -n "$line" ]]; do
    echo "Text read from file: $line"
    $line    # runs the line via word splitting; see the note below
done < "cmd.txt"
Run it like this:
./hcmd.sh
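Note that the bare $line in hcmd.sh runs each command via plain word splitting, which breaks on quotes, pipes and redirections. Substituting eval "$line" is more faithful to how the shell originally ran the history line (and equally dangerous, since it executes arbitrary commands):
#!/bin/bash
while IFS='' read -r line || [[ -n "$line" ]]; do
    echo "Text read from file: $line"
    eval "$line"    # preserves quoting, pipes and redirections
done < "cmd.txt"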

Ignoring all but the (multi-line) results of the last query sent to a program

I have an executable that accepts queries from stdin and responds to them, reading until EOF. Additionally I have an input file and a special command, let's call those EXEC, FILE and CMD respectively.
What I need to do is:
Pass FILE to EXEC as input.
Disregard all the output corresponding to commands read from FILE (redirect it to /dev/null).
Pass CMD as the last command.
Fetch output for the last command and save it in a variable.
EXEC's output can be multiline for each query.
I know how to pass FILE + CMD into the EXEC:
echo ${CMD} | cat ${FILE} - | ${EXEC}
but I have no idea how to fetch only output resulting from CMD.
Is there a magical one-liner that does this?
After looking around I've found the following partial solution:
mkfifo mypipe
(tail -f mypipe) | ${EXEC} &
cat ${FILE} | while read line; do
echo ${line} > mypipe
done
echo ${CMD} > mypipe
This allows me to redirect my input, but now the output gets printed to screen. I want to ignore all the output produced by EXEC in the while loop and get only what it prints for the last line.
I tried what first came into my mind, which is:
(tail -f mypipe) | ${EXEC} > somefile &
But it didn't work, the file was empty.
This is race-prone -- I'd suggest putting in a delay after the kill, or using an explicit sigil to determine when it's been received. That said:
#!/usr/bin/env bash
# route FD 4 to your output routine
exec 4> >(
    output=; trap 'output=1' USR1
    while IFS= read -r line; do
        [[ $output ]] && printf '%s\n' "$line"
    done
); out_pid=$!
# Capture the PID for the process substitution above; note that this requires a very
# new version of bash (4.4?)
[[ $out_pid ]] || { echo "ERROR: Your bash version is too old" >&2; exit 1; }
# Run your program in another process substitution, and close the parent's handle on FD 4
exec 3> >("$EXEC" >&4) 4>&-
# cat your file to FD 3...
cat "$file" >&3
# UGLY HACK: Wait to let your program finish flushing output from those commands
sleep 0.1
# notify the subshell writing output to disk that the ignored input is done...
kill -USR1 "$out_pid"
# UGLY HACK: Wait to let the subprocess actually receive the signal and set output=1
sleep 0.1
# ...and then write the command for which you actually want content logged.
echo "command" >&3
In validating this answer, I'm doing the following:
EXEC=stub_function
stub_function() {
    local count line
    count=0
    while IFS= read -r line; do
        (( ++count ))
        printf '%s: %s\n' "$count" "$line"
    done
}
cat >file <<EOF
do-not-log-my-output-1
do-not-log-my-output-2
do-not-log-my-output-3
EOF
file=file
export -f stub_function
export file EXEC
Output is only:
4: command
You could pipe it into sed:
var=$(YOUR COMMAND | sed '$!d')
This will put only the last line into the variable.
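Equivalently:
var=$(YOUR COMMAND | tail -n 1)
Note that if the last query itself produces multi-line output, as the question says it can, both forms keep only its final line.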
I think that your program EXEC does something special (opens a connection or remembers state). When that is not the case, you can use:
${EXEC} < ${FILE} > /dev/null
myvar=$(echo ${CMD} | ${EXEC})
Or with normal commands:
# Do not use (printf "==%s==\n" 1 2 3 ; printf "oo%soo\n" 4 5 6) | cat
printf "==%s==\n" 1 2 3 | cat > /dev/null
myvar=$(printf "oo%soo\n" 4 5 6 | cat)
When you need to give all input to one process, perhaps you can think of a marker that you can filter on:
(printf "==%s==\n" 1 2 3 ; printf "%s\n" "marker"; printf "oo%soo\n" 4 5 6) | cat | sed '1,/marker/ d'
You should examine your EXEC to see what could be used as a marker. When it is running SQL, you might use something like
(cat ${FILE}; echo 'select "DamonMarker" from dual;' ; echo ${CMD} ) |
${EXEC} | sed '1,/DamonMarker/ d'
and write this in a var with
myvar=$( (cat ${FILE}; echo 'select "DamonMarker" from dual;' ; echo ${CMD} ) |
${EXEC} | sed '1,/DamonMarker/ d' )

Concatenate strings in bash

I have in a bash script:
for i in `seq 1 10`
do
    read AA BB CC <<< $(cat file1 | grep DATA)
    echo ${i}
    echo ${CC}
    SORT=${CC}${i}
    echo ${SORT}
done
so "i" is a integer, and CC is a string like "TODAY"
I would like to get then in SORT, "TODAY1", etc
But I get "1ODAY", "2ODAY" and so
Where is the error?
Thanks
You should try
SORT="${CC}${i}"
Make sure your file does not contain a "\r" that ends up at the end of $CC.
This could well explain why you get "1ODAY": the carriage return moves the cursor back to the start of the line, so the digit overwrites the first letter of "TODAY".
Try including
| tr -d '\r'
after the cat command.
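i.e. something like this, keeping the structure of the original loop:
read AA BB CC <<< $(cat file1 | grep DATA | tr -d '\r')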
Try:
for i in {1..10}
do
    while read -r line
    do
        case "$line" in
            *DATA* )
                set -- $line
                CC=$3
                SORT=${CC}${i}
                echo ${SORT}
        esac
    done < "file1"
done
Otherwise, show an example of file1 and your desired output.
ghostdog is right about the -r option: it keeps read from mangling backslashes in the input. Using arrays makes the -r option more pleasant:
for i in `seq 1 10`
do
    read -ra line <<< $(cat file1 | grep DATA)
    CC="${line[2]}"    # arrays are zero-indexed, so the third field is index 2
    echo ${i}
    echo ${CC}
    SORT=${CC}${i}
    echo ${SORT}
done
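A quick check of the zero-based indexing:
$ read -ra w <<< "one two three"
$ echo "${w[2]}"
three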
