Bash Parse Variable Values

I have a command that returns 108 week/enumeration pairs:
Command:
impala-shell -B -f query.sql
Results:
20180203 1
20180127 2
20180120 3
...
I parsed the results and read the week and enumeration into two variables. However, I have to store the intermediate result in a variable wk first:
wk="$(impala-shell -B -f query.sql)"
echo "$wk" | while read -r a b; do echo $a--$b; done
I tried to avoid the additional variable wk:
"$(impala-shell -B -f query.sql)" | while read -r a b; do echo $a--$b; done
But it returned:
...
20160213 104
20160206 105
20160130 106
20160123 107
20160116 108: command not found
I understand you can use wk="$(impala-shell -B -f query.sql)" && echo "$wk" | while read -r a b; do echo $a--$b; done, but that still uses a variable in the middle. How can I compose a one-liner without the variable wk?

awk to the rescue!
$ impala-shell -B -f query.sql | awk '{print $1"--"$2}'

You can execute a command inline using backquotes ``, but note that backticks are just the legacy form of $(...) command substitution, so the following (untested, as I have neither your shell nor that script) substitutes the query output and tries to run it as a command, failing the same way as the attempt above:
`impala-shell -B -f query.sql` | while read -r a b; do echo $a--$b; done

Most elegant answer goes to choroba in the question comments! You just need to drop the command substitution and pipe directly:
impala-shell -B -f query.sql | while read -r a b ; do echo $a--$b; done
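If you also want the loop to run in your current shell (so variables set inside it survive), process substitution gives another one-liner without wk (a bash-specific sketch):
while read -r a b; do echo "$a--$b"; done < <(impala-shell -B -f query.sql)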


Set a command to a variable in bash script problem

Trying to run a command as a variable but I am getting strange results
Expected result "1":
grep -i nosuid /etc/fstab | grep -iq nfs
echo $?
1
Unexpected result as a variable command:
cmd="grep -i nosuid /etc/fstab | grep -iq nfs"
$cmd
echo $?
0
It seems to return 0 because the command string itself was valid, not because of the actual outcome. How can I do this better?
You can only execute exactly one command stored in a variable. The pipe is passed as an argument to the first grep.
Example
$ printArgs() { printf %s\\n "$@"; }
# Two commands. The 1st command has parameters "a" and "b".
# The 2nd command prints stdin from the first command.
$ printArgs a b | cat
a
b
$ cmd='printArgs a b | cat'
# Only one command with parameters "a", "b", "|", and "cat".
$ $cmd
a
b
|
cat
How to do this better?
Don't execute the command using variables.
Use a function.
$ cmd() { grep -i nosuid /etc/fstab | grep -iq nfs; }
$ cmd
$ echo $?
1
Solution to the actual problem
I see three options to your actual problem:
Use a DEBUG trap and the BASH_COMMAND variable inside the trap.
Enable bash's history feature for your script and use the history builtin.
Use a function which takes a command string and executes it using eval.
Regarding your comment on the last approach: You only need one function. Something like
execAndLog() {
    description="$1"
    shift
    if eval "$*"; then
        info="PASSED: $description: $*"
        passed+=("${FUNCNAME[1]}")
    else
        info="FAILED: $description: $*"
        failed+=("${FUNCNAME[1]}")
    fi
}
You can use this function as follows
execAndLog 'Scanned system' 'grep -i nfs /etc/fstab | grep -iq noexec'
The first argument is the description for the log, the remaining arguments are the command to be executed.
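For option 1, here is a minimal sketch of the DEBUG trap approach (the log file name run.log is made up):
#!/bin/bash
# A DEBUG trap fires before each simple command; BASH_COMMAND holds
# the command about to be executed.
trap 'printf "RUNNING: %s\n" "$BASH_COMMAND" >> run.log' DEBUG
grep -i nosuid /etc/fstab | grep -iq nfs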
Using bash -x or set -x will let you see what bash actually executes:
> cmd="grep -i nosuid /etc/fstab | grep -iq nfs"
> set -x
> $cmd
+ grep -i nosuid /etc/fstab '|' grep -iq nfs
As you can see, the pipe | is passed as an argument to the first grep command.
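If the command really must live in a string, eval re-parses it so the pipe is a real pipe again (with the usual caveats about eval and untrusted input):
cmd="grep -i nosuid /etc/fstab | grep -iq nfs"
eval "$cmd"    # eval re-parses the string, so | becomes a real pipe
echo $?        # now reflects the pipeline's actual outcome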

How to run commands off of a pipe

I would like to run commands such as "history" or "!23" off of a pipe.
How might I achieve this?
Why does the following command not work?
echo "history" | xargs eval $1
To answer (2) first:
history and eval are both bash builtins. So xargs cannot run either of them.
xargs does not take $1 arguments. See man xargs for the correct syntax.
For (1), it doesn't really make much sense to do what you are attempting because shell history is not likely to be synchronised between invocations, but you could try something like:
{ echo 'history'; echo '!23'; } | bash -i
or:
{ echo 'history'; echo '!23'; } | while read -r cmd; do eval "$cmd"; done
Note that pipelines run inside subshells. Environment changes are not retained:
x=1; echo "x=2" | while read -r cmd; do eval "$cmd"; done; echo "$x"
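If you do need the change to survive, feed the loop via process substitution so it runs in the current shell (a bash-specific sketch):
x=1; while read -r cmd; do eval "$cmd"; done < <(echo "x=2"); echo "$x"    # prints 2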
You can try it like this.
First, redirect the history commands to a file, cutting off the line numbers:
history | cut -c 8- > cmd.txt
Now create this script, hcmd.sh (based on Read a file line by line assigning the value to a variable):
#!/bin/bash
while IFS='' read -r line || [[ -n "$line" ]]; do
    echo "Text read from file: $line"
    eval "$line"    # eval, so pipes and quoting inside the history line work
done < "cmd.txt"
Run it like this
./hcmd.sh

Bash: use nth row as part of the command line

I am trying to use a tool called fastQTL, but that is probably less relevant here. I want to pass each row of loc_info.txt to the --region option. I wrote the following commands, but they bounced back with "Error parsing command line: unrecognised option '-n+1'".
Is there a way to make fastQTL read and use one line from loc_info.txt each time it runs?
Thanks for any suggestions!!
#!/bin/bash
tool="/path/FastQTL-2.165.linux/bin/"
vcf="/path/vcf/"
out="/path/perm_out"
for i in {1..1061}
do
${tool}fastQTL.1.165.linux --vcf ${vcf}GT.vcf.gz --bed pheno_bed.gz --region tail -n+"$i" loc_info.txt --permute 1000 --out "$i"_perm.txt
done
Read the file in a loop:
i=1
while read -r line; do
    ${tool}fastQTL.1.165.linux --vcf ${vcf}GT.vcf.gz --bed pheno_bed.gz --region "$line" --permute 1000 --out "$i"_perm.txt
    ((i++))
done < loc_info.txt
This reads loc_info.txt only once instead of spawning tail on every iteration.
You can use command substitution for this if you want to use the output of one command within another command, like so:
cmd1 -option $(cmd2)
Here the output of cmd2 is used as an argument to cmd1. The key is the $( ) command substitution syntax. So the solution might be:
#!/bin/bash
tool="/path/FastQTL-2.165.linux/bin/"
vcf="/path/vcf/"
out="/path/perm_out"
for i in {1..1061}
do
    ${tool}fastQTL.1.165.linux --vcf ${vcf}GT.vcf.gz --bed pheno_bed.gz --region $(tail -n+"$i" loc_info.txt) --permute 1000 --out "$i"_perm.txt
done
Try replacing tail -n+"$i" loc_info.txt with $(head -n $i loc_info.txt | tail -n 1): tail -n +$i prints every line from $i to the end of the file, whereas head | tail extracts exactly line $i.
Example
numOfLines=$(wc -l loc_info.txt | cut -d ' ' -f 1)
for i in $(seq 1 $numOfLines)
do
    ${tool}fastQTL.1.165.linux --vcf ${vcf}GT.vcf.gz --bed pheno_bed.gz --region $(head -n $i loc_info.txt | tail -n 1) --permute 1000 --out "$i"_perm.txt
done
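sed can also pull exactly one line per iteration (a sketch reusing the question's file and variables):
for i in $(seq 1 "$(wc -l < loc_info.txt)")
do
    region=$(sed -n "${i}p" loc_info.txt)    # print only line $i
    ${tool}fastQTL.1.165.linux --vcf ${vcf}GT.vcf.gz --bed pheno_bed.gz --region "$region" --permute 1000 --out "$i"_perm.txt
done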

load list into separate variables with varying names

Debian testing, bash...
I'm trying to create path variables for a list of existing programs.
Set Programs Variable
xPROGS="$(echo -e "exiftool\nrsync\nxsel")"
Attempt to create variables named x<program name>:
echo "$xPROGS" | while read z; do x$z="$(whereis -b "$z" | awk '{print $2}')" ; done
Errors:
bash: xexiftool=/usr/bin/rsync: No such file or directory
bash: xrsync=/usr/bin/rsync: No such file or directory
bash: xxsel=/usr/bin/rsync: No such file or directory
This works:
$ whereis -b rsync | awk '{print $2}'
but I can't get the varying variable names to work. Could someone please help?
$ cat t.sh
#!/bin/bash
progs=(exiftool rsync xsel)
for prog in "${progs[@]}"; do
    read -r _ "x${prog}" _ <<< "$(whereis -b "${prog}")"
done
echo "exiftool: [${xexiftool}]"
echo "rsync: [${xrsync}]"
echo "xsel: [${xxsel}]"
 
$ ./t.sh
exiftool: []
rsync: [/usr/bin/rsync]
xsel: []
Etan Reisner provided the link on which the following code is based:
echo "$xPROGS" | while read z; do IFS= read -r "x$z" <<< "$(whereis -b "$z" | awk '{print $2}')"; done
But since a program such as rsync is unlikely to change its location in the file system tree, you can also look the path up once and reuse it:
RSYNC=$(whereis -b rsync); RSYNC="${RSYNC#* }"; echo "$xPROGS" | while read z; do IFS= read -r "x$z" <<< "$RSYNC"; done
Note that a while loop fed by a pipe runs in a subshell, so the x$z variables are not visible after the loop ends.
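If you can rely on bash 4+, an associative array avoids dynamically named variables altogether (a sketch):
#!/bin/bash
declare -A path                   # one array instead of x$prog variables
for prog in exiftool rsync xsel; do
    path[$prog]=$(whereis -b "$prog" | awk '{print $2}')
done
echo "rsync: [${path[rsync]}]"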

AWK: execute CURL on each line and parse result

given an input stream with following lines:
123
456
789
098
...
I would like to call
curl -s http://foo.bar/some.php?id=xxx
with xxx being the number from each line, and every time let an awk script fetch some information from the curl output, which is written to the output stream. I am wondering whether this is possible without using awk's system() call in the following way:
cat lines | grep "^[0-9]*$" | awk '
{
system("curl -s " $0 \
" | awk \'{ #parsing; print }\'")
}'
You can use bash and avoid the awk system call:
grep "^[0-9]*$" lines | while read -r line; do
    curl -s "http://foo.bar/some.php?id=$line" | awk '... your parsing ...'
done
A shell loop would achieve a similar result, as follows:
#!/bin/bash
for f in $(grep "^[0-9]*$" lines); do
    curl -s "http://foo.bar/some.php?id=$f" | awk '{....}'
done
Alternative methods for doing similar tasks include using Perl or Python with an HTTP client.
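An xargs one-liner can also replace the explicit loop (a sketch; the URL comes from the question and the awk body is a placeholder for your parsing):
grep "^[0-9]*$" lines | xargs -I{} curl -s "http://foo.bar/some.php?id={}" | awk '{ print }'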
If the ids are appended to your file dynamically, you can daemonize a small while loop that keeps checking the file for more data, like this:
while IFS= read -d $'\n' -r a || sleep 1; do [[ -n "$a" ]] && curl -s "http://foo.bar/some.php?id=${a}"; done < lines.txt
Otherwise, if the file is static, change the sleep 1 to break and the loop will read the file and quit when no data is left, which is a handy pattern to know.
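With GNU tools, tail -f is the more common way to follow a growing file (a sketch; --line-buffered keeps grep from batching its output):
tail -f lines.txt | grep --line-buffered "^[0-9]*$" | while IFS= read -r a; do
    curl -s "http://foo.bar/some.php?id=${a}"
done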
