How to run commands off of a pipe - bash

I would like to run commands such as "history" or "!23" off of a pipe.
How might I achieve this?
Why does the following command not work?
echo "history" | xargs eval $1

To answer (2) first:
history and eval are both bash builtins, so xargs cannot run either of them.
xargs also does not take $1-style arguments; see man xargs for the correct syntax.
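If you just want xargs to hand the text to something that can run builtins, one workaround (a sketch, not part of the original answer) is to have xargs start a bash of its own and pass the text in as a positional argument:
echo 'history' | xargs -n 1 bash -c 'eval "$1"' _
Note that this prints nothing useful here: the freshly started, non-interactive bash has no history of its own, which is exactly the synchronisation problem described under (1) below.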
For (1), it doesn't really make much sense to do what you are attempting because shell history is not likely to be synchronised between invocations, but you could try something like:
{ echo 'history'; echo '!23'; } | bash -i
or:
{ echo 'history'; echo '!23'; } | while read -r cmd; do eval "$cmd"; done
Note that pipelines run inside subshells. Environment changes are not retained:
x=1; echo "x=2" | while read -r cmd; do eval "$cmd"; done; echo "$x"

You can try it like this.
First, redirect the history commands to a file, cutting off the line numbers:
history | cut -c 8- > cmd.txt
Now create this script, hcmd.sh (adapted from Read a file line by line assigning the value to a variable):
#!/bin/bash
# read cmd.txt line by line and run each saved command
while IFS='' read -r line || [[ -n "$line" ]]; do
    echo "Text read from file: $line"
    $line    # runs the command via word splitting; use eval "$line" if the
             # saved commands contain pipes, redirections or quoting
done < "cmd.txt"
Run it like this
./hcmd.sh

Related

Calling bash script from bash script

I have made two programs and I'm trying to call one from the other, but this is appearing on my screen:
cp: cannot stat ‘PerShip/.csv’: No such file or directory
cp: target ‘tmpship.csv’ is not a directory
I don't know what to do. Here are the programs. Could somebody help me, please?
#!/bin/bash
shipname=$1
imo=$(grep "$shipname" shipsNAME-IMO.txt | cut -d "," -f 2)
cp PerShip/$imo'.csv' tmpship.csv
dist=$(octave -q ShipDistance.m 2>/dev/null)
grep "$shipname" shipsNAME-IMO.txt | cut -d "," -f 2 > IMO.txt
idnumber=$(cut -b 4-10 IMO.txt)
echo $idnumber,$dist
#!/bin/bash
rm -f shipsdist.csv
for ship in $(cat shipsNAME-IMO.txt | cut -d "," -f 1)
do
./FindShipDistance "$ship" >> shipsdist.csv
done
cat shipsdist.csv | sort | head -n 1
The code and error messages presented suggest that the second script is calling the first with an empty command-line argument. That would certainly happen if input file shipsNAME-IMO.txt contained any empty lines or otherwise any lines with an empty first field. An empty line at the beginning or end would do it.
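For instance (a hedged illustration, not from the original post), if $1 arrives empty, the grep in the first script matches every line, so imo ends up holding every IMO in the file and the unquoted expansion hands cp several source arguments:
imo=$(grep "" shipsNAME-IMO.txt | cut -d "," -f 2)   # empty pattern matches all lines
cp PerShip/$imo'.csv' tmpship.csv                    # word-splits into many arguments:
                                                     # "target 'tmpship.csv' is not a directory"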
I suggest
using the read command to read the data, and manipulating IFS to parse out comma-delimited fields
validating your inputs and other data early and often
making your scripts behave more pleasantly in the event of predictable failures
More generally, using internal Bash features instead of external programs where the former are reasonably natural.
For example:
#!/bin/bash
# Validate one command-line argument
[[ -n "$1" ]] || { echo empty ship name 1>&2; exit 1; }
# Read and validate an IMO corresponding to the argument
IFS=, read -r dummy imo tail < <(grep -F -- "$1" shipsNAME-IMO.txt)
[[ -f PerShip/"${imo}.csv" ]] || { echo no data for "'$imo'" 1>&2; exit 1; }
# Perform the distance calculation and output the result
cp PerShip/"${imo}.csv" tmpship.csv
dist=$(octave -q ShipDistance.m 2>/dev/null) ||
{ echo "failed to compute ship distance for '${imo}'" 2>&1; exit 1; }
echo "${imo:3:7},${dist}"
and
#!/bin/bash
# Note: the original shipsdist.csv will be clobbered
while IFS=, read -r ship tail; do
    # Ignore any empty ship name, however it might arise
    [[ -n "$ship" ]] && ./FindShipDistance "$ship"
done < shipsNAME-IMO.txt |
tee shipsdist.csv |
sort |
head -n 1
Note that making the while loop in the second script part of a pipeline will cause it to run in a subshell. That is sometimes a gotcha, but it won't cause any problem in this case.
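If you ever do need a variable from that loop afterwards, you can keep the loop in the current shell and push only its output into the pipeline via process substitution (a sketch building on the script above, not something the original needs):
count=0
while IFS=, read -r ship tail; do
    [[ -n "$ship" ]] || continue
    ./FindShipDistance "$ship"
    count=$((count + 1))
done < shipsNAME-IMO.txt > >(tee shipsdist.csv | sort | head -n 1)
echo "computed distances for $count ships" >&2   # count survives; the process
                                                 # substitution may still be flushing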

How to extract code into a function when using xargs -P?

At first, I wrote this code, and it runs well:
# version1
all_num=10
thread_num=5
a=$(date +%H%M%S)
seq 1 ${all_num} | xargs -n 1 -I {} -P ${thread_num} sh -c 'echo abc{}'
b=$(date +%H%M%S)
echo -e "startTime:\t$a"
echo -e "endTime:\t$b"
Now I want to extract the code into a function, but it is wrong. How do I fix it?
get_file(i){
echo "abc"+i
}
all_num=10
thread_num=5
a=$(date +%H%M%S)
seq 1 ${all_num} | xargs -n 1 -I {} -P ${thread_num} sh -c "$(get_file {})"
b=$(date +%H%M%S)
echo -e "startTime:\t$a"
echo -e "endTime:\t$b"
Because /bin/sh isn't guaranteed to support either printing text that, when evaluated, defines your function, or exporting functions through the environment, we need to do this the hard way: duplicate the text of the function inside the copy of sh started by xargs.
Other questions on this site already describe how to accomplish this with bash, which is considerably easier. See, for example, How can I use xargs to run a function in a command substitution for each match?
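For comparison, a minimal bash-only sketch of that approach (assuming bash is acceptable as the child shell started by xargs) could look like:
#!/usr/bin/env bash
get_file() { echo "abc $1"; }
export -f get_file                      # bash can export functions through the environment
seq 1 10 | xargs -n 1 -P 5 bash -c 'get_file "$1"' _
The portable /bin/sh version, duplicating the function text as described above, follows: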
#!/bin/sh
all_num=10
thread_num=5
batch_size=1 # but with a larger all_num, turn this up to start fewer copies of sh
a=$(date +%H%M%S) # warning: this is really inefficient
seq 1 ${all_num} | xargs -n "${batch_size}" -P "${thread_num}" sh -c '
    get_file() { i=$1; echo "abc ${i}"; }
    for arg do
        get_file "$arg"
    done
' _
b=$(date +%H%M%S)
printf 'startTime:\t%s\n' "$a"
printf 'endTime:\t%s\n' "$b"
Note:
echo -e is not guaranteed to work with /bin/sh. Moreover, for a shell to be truly compliant, echo -e is required to write -e to its output. See Why is printf better than echo? on UNIX & Linux Stack Exchange, and the APPLICATION USAGE section of the POSIX echo specification.
Putting {} in a sh -c '...{}...' position is a Really Bad Idea. Consider the case where you're passed in a filename that contains $(rm -rf ~)'$(rm -rf ~)' -- it can't be safely inserted in an unquoted context, or a double-quoted context, or a single-quoted context, or a heredoc.
Note that seq is also nonstandard and not guaranteed to be present on all POSIX-compliant systems. i=0; while [ "$i" -lt "$all_num" ]; do echo "$i"; i=$((i + 1)); done is an alternative that will work on all POSIX systems.

How to iterate two variables in bash script?

I have these kind of files:
file6543_015.bam
subreadset_15.xml
file6543_024.bam
subreadset_24.xml
file6543_027.bam
subreadset_27.xml
I would like to run something like this:
for i in *bam && l in *xml
do
my_script $i $l > output_file
done
This is because in my command the first bam file goes with the first xml file. For each bam/xml combination, the command gives a specific output file.
Like this, using bash arrays:
bam=( *.bam )
xml=( *.xml )
for ((i=0; i<${#bam[@]}; i++)); do
    my_script "${bam[i]}" "${xml[i]}"
done
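One guard worth adding before the loop (my suggestion, not part of the original answer), since the pairing is purely positional:
# abort if the two globs matched different numbers of files
(( ${#bam[@]} == ${#xml[@]} )) || { echo "bam/xml file counts differ" >&2; exit 1; }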
Assuming you have a way to uniquely name your output_file for each specific output, here is one way:
#!/bin/bash
ls file*.bam | while read -r i
do
    # derive the matching xml name from the bam name
    xml=$(echo "$i" | sed -e 's/file.*_0/subreadset_/' -e 's/\.bam$/.xml/')
    my_script "$i" "$xml" >> output_file
done

Ignoring all but the (multi-line) results of the last query sent to a program

I have an executable that accepts queries from stdin and responds to them, reading until EOF. Additionally I have an input file and a special command, let's call those EXEC, FILE and CMD respectively.
What I need to do is:
Pass FILE to EXEC as input.
Disregard all the output corresponding to commands read from FILE (send it to /dev/null).
Pass CMD as the last command.
Fetch output for the last command and save it in a variable.
EXEC's output can be multiline for each query.
I know how to pass FILE + CMD into the EXEC:
echo ${CMD} | cat ${FILE} - | ${EXEC}
but I have no idea how to fetch only output resulting from CMD.
Is there a magical one-liner that does this?
After looking around I've found the following partial solution:
mkfifo mypipe
(tail -f mypipe) | ${EXEC} &
cat ${FILE} | while read line; do
echo ${line} > mypipe
done
echo ${CMD} > mypipe
This allows me to redirect my input, but now the output gets printed to screen. I want to ignore all the output produced by EXEC in the while loop and get only what it prints for the last line.
I tried what first came into my mind, which is:
(tail -f mypipe) | ${EXEC} > somefile &
But it didn't work, the file was empty.
This is race-prone -- I'd suggest putting in a delay after the kill, or using an explicit sigil to determine when it's been received. That said:
#!/usr/bin/env bash
# route FD 4 to your output routine
exec 4> >(
  output=; trap 'output=1' USR1
  while IFS= read -r line; do
    [[ $output ]] && printf '%s\n' "$line"
  done
); out_pid=$!
# Capture the PID for the process substitution above; note that this requires a very
# new version of bash (4.4?)
[[ $out_pid ]] || { echo "ERROR: Your bash version is too old" >&2; exit 1; }
# Run your program in another process substitution, and close the parent's handle on FD 4
exec 3> >("$EXEC" >&4) 4>&-
# cat your file to FD 3...
cat "$file" >&3
# UGLY HACK: Wait to let your program finish flushing output from those commands
sleep 0.1
# notify the subshell writing output to disk that the ignored input is done...
kill -USR1 "$out_pid"
# UGLY HACK: Wait to let the subprocess actually receive the signal and set output=1
sleep 0.1
# ...and then write the command for which you actually want content logged.
echo "command" >&3
In validating this answer, I'm doing the following:
EXEC=stub_function
stub_function() {
  local count line
  count=0
  while IFS= read -r line; do
    (( ++count ))
    printf '%s: %s\n' "$count" "$line"
  done
}
cat >file <<EOF
do-not-log-my-output-1
do-not-log-my-output-2
do-not-log-my-output-3
EOF
file=file
export -f stub_function
export file EXEC
Output is only:
4: command
You could pipe it into sed:
var=$(YOUR COMMAND | sed '$!d')
This will put only the last line of the output into the variable.
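For example (a quick demonstration, with printf standing in for your command):
var=$(printf '%s\n' first second last | sed '$!d')
echo "$var"    # prints: last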
I think that your program EXEC does something special (opens a connection or remembers state). When that is not the case, you can use:
${EXEC} < ${FILE} > /dev/null
myvar=$(echo ${CMD} | ${EXEC})
Or with normal commands:
# Do not use (printf "==%s==\n" 1 2 3 ; printf "oo%soo\n" 4 5 6) | cat
printf "==%s==\n" 1 2 3 | cat > /dev/null
myvar=$(printf "oo%soo\n" 4 5 6 | cat)
When you need to give all input to one process, perhaps you can think of a marker that you can filter on:
(printf "==%s==\n" 1 2 3 ; printf "%s\n" "marker"; printf "oo%soo\n" 4 5 6) | cat | sed '1,/marker/ d'
You should examine your EXEC to see what marker could be used. When it is running SQL, you might use something like:
(cat ${FILE}; echo 'select "DamonMarker" from dual;' ; echo ${CMD} ) |
${EXEC} | sed '1,/DamonMarker/ d'
and capture it in a variable with:
myvar=$( (cat ${FILE}; echo 'select "DamonMarker" from dual;' ; echo ${CMD} ) |
${EXEC} | sed '1,/DamonMarker/ d' )

Bash Script - using cmd instead of cat

I wrote a script, including this loop:
#!/bin/bash
cat "$1" | while read -r line; do
echo "$line"; sleep 2;
done
A shellcheck run put out the following message:
SC2002: Useless cat. Consider 'cmd < file | ..' or 'cmd file | ..' instead.
I changed the script to:
#!/bin/bash
cmd < "$1" | while read -r line; do
echo "$line"; sleep 2;
done
but now bash exits with:
cmd: command not found
What have I done wrong?
In the ShellCheck suggestion, cmd is just a placeholder for whatever consumes the file. Here that consumer is the whole while cond; do ... done compound statement, and for a compound command the redirection comes at the end:
while read -r line; do
    echo "$line"; sleep 0.2
done < "$1"
Remove the cmd < "$1" | part and make the final line:
done < "$1"
