My script v.sh:
select f in "$@" ; do
echo $f
done
Running ./v.sh 1 2 3 lets me select one of the options after the command runs.
echo 1 2 3 | v.sh shows nothing.
echo 1 2 3 | xargs v.sh shows the options, but I can't select them.
How can I select the options when the values come from stdin? Thanks in advance.
With bash:
#!/bin/bash
if [[ -z $1 ]]; then
# no arguments, read from stdin
mapfile -t input </dev/stdin
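# stdin is the pipe here, so reattach the terminal; select needs it to read your choice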
exec </dev/tty
else
# use arguments
input=("$#")
fi
select f in "${input[#]}" ; do
echo "$f"
done
Example with stdin:
cut -d : -f 1 /etc/passwd | ./script.sh
With arguments:
./script.sh 1 2 "3 3" 4 5
Related
I have a script that takes only one positional parameter, which is a list of values, and I'm trying to get that parameter from stdin with xargs.
By default, however, xargs passes every value to my script as a separate positional parameter, e.g. when doing:
echo 1 2 3 | xargs myScript, it essentially runs myScript 1 2 3, while what I'm looking for is myScript "1 2 3". What is the best way to achieve this?
Change the delimiter.
$ echo 1 2 3 | xargs -d '\n' printf '%s\n'
1 2 3
Not all xargs implementations have -d though.
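If -d is missing, the POSIX -I option is one possible workaround here, since with -I each whole input line becomes a single argument (a sketch, with printf standing in for your script):
$ echo 1 2 3 | xargs -I{} printf '%s\n' {}
1 2 3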
And I'm not sure there is an actual use case for this, but you can also resort to spawning another shell instance if you have to. Like
$ echo -e '1 2\n3' | xargs sh -c 'printf '\''%s\n'\'' "$*"' sh
1 2 3
Here the trailing sh fills $0 of the inner shell, so all the words collected by xargs land in "$*".
If the input can be altered, you can quote it before it reaches xargs. Not sure if this is what you wanted, though.
echo \"1 2 3\"|xargs ./myScript
Here is the example.
$ cat myScript
#!/bin/bash
echo $1; shift
echo $1; shift
echo $1;
$ echo \"1 2 3\"|xargs ./myScript
1 2 3
$ echo 1 2 3|xargs ./myScript
1
2
3
I have a program which transposes a matrix. It works properly when passed a file as a parameter, but it gives strange output when given input via stdin.
This works:
$ cat m1
1 2 3 4
5 6 7 8
$ ./matrix transpose m1
1 5
2 6
3 7
4 8
This doesn't:
$ cat m1 | ./matrix transpose
5
[newline]
[newline]
[newline]
This is the code I'm using to transpose the matrix:
function transpose {
# Set file to be argument 1 or stdin
FILE="${1:-/dev/stdin}"
if [[ $# -gt 1 ]]; then
print_stderr "Too many arguments. Exiting."
exit 1
elif ! [[ -r $FILE ]]; then
print_stderr "File not found. Exiting."
exit 1
else
col=1
read -r line < $FILE
for num in $line; do
cut -f$col $FILE | tr '\n' '\t'
((col++))
echo
done
exit 0
fi
}
And this code handles the argument passing:
# Main
COMMAND=$1
if func_exists $COMMAND; then
$COMMAND "${#:2}"
else
print_stderr "Command \"$COMMAND\" not found. Exiting."
exit 1
fi
I'm aware of this answer but I can't figure out where I've gone wrong. Any ideas?
for num in $line; do
cut -f$col $FILE | tr '\n' '\t'
((col++))
echo
done
This loop reads $FILE over and over, once for each column. That works fine for a file but isn't suitable for stdin, which is a stream of data that can only be read once.
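You can see the one-shot nature of a pipe with a quick test: the second cat below prints nothing, because the first one already drained the stream:
$ printf 'a\nb\n' | { cat /dev/stdin; cat /dev/stdin; }
a
b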
A quick fix would be to read the file into memory and use <<< to pass it to read and cut.
matrix=$(< "$FILE")
read -r line <<< "$matrix"
for num in $line; do
cut -f$col <<< "$matrix" | tr '\n' '\t'
((col++))
echo
done
See An efficient way to transpose a file in Bash for a variety of more efficient one-pass solutions.
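For reference, here is a minimal sketch of that kind of one-pass approach using awk; it buffers the whole input in memory, so it works for piped stdin too (the tab output separator is just an assumption to match your cut/tr version):
$ cat m1 | awk -v OFS='\t' '
  { for (i = 1; i <= NF; i++) cell[i, NR] = $i; if (NF > maxnf) maxnf = NF }
  END {
      for (i = 1; i <= maxnf; i++) {
          row = cell[i, 1]
          for (r = 2; r <= NR; r++) row = row OFS cell[i, r]
          print row
      }
  }'
1	5
2	6
3	7
4	8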
I have an executable that accepts queries from stdin and responds to them, reading until EOF. Additionally I have an input file and a special command, let's call those EXEC, FILE and CMD respectively.
What I need to do is:
Pass FILE to EXEC as input.
Disregard all the output corresponding to commands read from FILE (send it to /dev/null).
Pass CMD as the last command.
Fetch output for the last command and save it in a variable.
EXEC's output can be multiline for each query.
I know how to pass FILE + CMD into the EXEC:
echo ${CMD} | cat ${FILE} - | ${EXEC}
but I have no idea how to fetch only output resulting from CMD.
Is there a magical one-liner that does this?
After looking around I've found the following partial solution:
mkfifo mypipe
(tail -f mypipe) | ${EXEC} &
cat ${FILE} | while read line; do
echo ${line} > mypipe
done
echo ${CMD} > mypipe
This allows me to redirect my input, but now the output gets printed to screen. I want to ignore all the output produced by EXEC in the while loop and get only what it prints for the last line.
I tried what first came into my mind, which is:
(tail -f mypipe) | ${EXEC} > somefile &
But it didn't work, the file was empty.
This is race-prone -- I'd suggest putting in a delay after the kill, or using an explicit sigil to determine when it's been received. That said:
#!/usr/bin/env bash
# route FD 4 to your output routine
exec 4> >(
output=; trap 'output=1' USR1
while IFS= read -r line; do
[[ $output ]] && printf '%s\n' "$line"
done
); out_pid=$!
# Capture the PID for the process substitution above; note that this requires a very
# new version of bash (4.4?)
[[ $out_pid ]] || { echo "ERROR: Your bash version is too old" >&2; exit 1; }
# Run your program in another process substitution, and close the parent's handle on FD 4
exec 3> >("$EXEC" >&4) 4>&-
# cat your file to FD 3...
cat "$file" >&3
# UGLY HACK: Wait to let your program finish flushing output from those commands
sleep 0.1
# notify the subshell writing output to disk that the ignored input is done...
kill -USR1 "$out_pid"
# UGLY HACK: Wait to let the subprocess actually receive the signal and set output=1
sleep 0.1
# ...and then write the command for which you actually want content logged.
echo "command" >&3
In validating this answer, I'm doing the following:
EXEC=stub_function
stub_function() {
local count line
count=0
while IFS= read -r line; do
(( ++count ))
printf '%s: %s\n' "$count" "$line"
done
}
cat >file <<EOF
do-not-log-my-output-1
do-not-log-my-output-2
do-not-log-my-output-3
EOF
file=file
export -f stub_function
export file EXEC
Output is only:
4: command
You could pipe it into sed:
var=$(YOUR COMMAND | sed '$!d')
This puts only the last line of the output into the variable.
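For example, with a toy command standing in for yours (note this only helps if the answer you want is a single line):
$ var=$(printf '%s\n' first second last | sed '$!d')
$ echo "$var"
last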
I suspect that your program EXEC does something special (opens a connection or remembers state). When that is not the case, you can simply run it twice:
${EXEC} < ${FILE} > /dev/null
myvar=$(echo ${CMD} | ${EXEC})
Or with normal commands:
# do not combine them into one pipeline like (printf "==%s==\n" 1 2 3 ; printf "oo%soo\n" 4 5 6) | cat, because then you cannot separate the outputs
printf "==%s==\n" 1 2 3 | cat > /dev/null
myvar=$(printf "oo%soo\n" 4 5 6 | cat)
When you need to feed all the input to a single process, you can look for a marker that you can filter on:
(printf "==%s==\n" 1 2 3 ; printf "%s\n" "marker"; printf "oo%soo\n" 4 5 6) | cat | sed '1,/marker/ d'
Examine your EXEC to see what could serve as such a marker. When it is running SQL, you might use something like
(cat ${FILE}; echo 'select "DamonMarker" from dual;' ; echo ${CMD} ) |
${EXEC} | sed '1,/DamonMarker/ d'
and capture it in a variable with
myvar=$( (cat ${FILE}; echo 'select "DamonMarker" from dual;' ; echo ${CMD} ) |
${EXEC} | sed '1,/DamonMarker/ d' )
For a homework assignment I have to take the results from the grep command and write out up to the first 5 of them, numbering them from 1 to 5 (print the number, then a space, then the line from grep). If there are no lines, print a message saying so. So far I have managed to store the grep output in an array, but this is where I've gotten stuck. Can anyone provide guidance on how to proceed in printing this as stated above?
pattern="*.c"
fileList=$(grep -l "main" $pattern)
IFS=$"\n"
declare -a array
array=$fileList
for x in "${array[#]}"; do
echo "$x"
done
You can use grep's -c and -l options:
pattern="*.c"
searchPattern="main"
counter=1
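# with multiple files, grep -c (fed into the loop on its last line) prints one "file:count" line per file; each pass of this loop handles one of those lines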
while read -r line ; do
IFS=':' read -r -a lineInfo <<< "$line"
if [[ $counter -gt 5 ]]; then
break
fi
if [[ ${lineInfo[1]} -gt 0 ]]; then
numsOfLine=""
while read -r fileline ; do
IFS=':' read -r -a fileLineInfo <<< "$fileline"
numsOfLine="$numsOfLine ${fileLineInfo[0]} "
done < <(grep -n $searchPattern ${lineInfo[0]})
echo "$counter ${lineInfo[0]} match on lines: $numsOfLine"
let "counter += 1"
else
echo "${lineInfo[0]} no match lines"
fi
done < <(grep -c $searchPattern $pattern)
If you're only allowed to use grep and bash(?):
pattern="*.c"
fileList=($(grep -l "main" $pattern))
if test ${#fileList[@]} = 0 ; then
echo "No results"
else
n=0
while test $n -lt ${#fileList[@]} -a $n -lt 5 ; do
i=$n
n=$(( n + 1 ))
echo "$n ${fileList[$i]}"
done
fi
If you are allowed to use commands in addition to grep, you can pipe the results through nl to add line numbers, then head to limit the results to the first 5 lines, then a second grep to test if there were any lines. For example:
if ! grep -l "main" $pattern | \
nl -s ' ' | sed -e 's/^ *//' | \
head -n 5 | grep '' ; then
echo "No results"
fi
I am trying to write a script in BASH that will take between 1 and 5 command line arguments from the user and report them back in reverse numerical order to standard output. The only command I know that would work similarly to this is the sort command, but this only works for files. Is there a similar command for sorting command line arguments? Here is what I have so far.
#!/bin/bash
if [ $# -lt 1 ] || [ $# -gt 5 ];
then echo "Incorrect number of arguments!"
else
sorted=sort -rn $*
echo "SORTED: $sorted"
fi
Try:
sorted=$( printf '%s\n' "$@" | sort -rn )
printf '%s\n' "${sorted//$'\n'/ }"
You can give the sort command values from standard input. It expects every value on its own line, which you can achieve by combining echo and tr:
sorted=$(echo $* | tr ' ' '\n' | sort -rn - | tr '\n' ' ')
The last invocation of tr is only necessary if you want the result to be space-delimited again and not newline-delimited.
#!/bin/bash
if [ $# -lt 1 ] || [ $# -gt 5 ];
then echo "Incorrect number of arguments!"
else
sorted=$(echo $* | tr ' ' '\n' | sort -rn | tr '\n' ' ')
echo "SORTED: $sorted"
fi
echo $* | tr ' ' '\n' | sort -rn | tr '\n' ' '
You need to use command substitution $(...) to capture the output of a command like that (as written, sorted=sort -rn $* just runs -rn with the temporary assignment sorted=sort in its environment).
#!/bin/bash
if [ $# -lt 1 ] || [ $# -gt 5 ]; then
echo "Incorrect number of arguments!"
else
sorted=$(for var in "$@"; do echo "$var"; done | sort -rn | tr '\n' ' ')
echo "SORTED: $sorted"
fi
$ ./test 1 2 3 4 5
SORTED: 5 4 3 2 1
$ ./test 5 4 3 2 1
SORTED: 5 4 3 2 1