How to save the output of an sh step to a Groovy variable? - shell

I need to store the output of this command into a variable:
sh "curl -s 'http://nexus-cicd.stgcloud.xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx.xml' | grep '<version>.*</version>' | sort --version-sort | uniq | tail -n1 | sed -e 's#\\(.*\\)\\(<version>\\)\\(.*\\)\\(</version>\\)\\(.*\\)#\\3#g'"
I tried the following, but echo outputs null:
parentLast = sh ("curl -s 'http://nexus-cicd.stgcloud.xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx.xml' | grep '<version>.*</version>' | sort --version-sort | uniq | tail -n1 | sed -e 's#\\(.*\\)\\(<version>\\)\\(.*\\)\\(</version>\\)\\(.*\\)#\\3#g'")
echo "$parentLast"

Related

Convert a bash pipeline into a function with a parameter

I have a pipeline I use to preview csv files:
cat file_name.csv | sed -e 's/,,/, ,/g' | column -t -s ","| less -s
But I want to create an alias viewcsv that lets me just supply the filename.
I tried viewcsv="cat $1 | sed -e 's/,,/, ,/g' | column -t -s ","| less -s" but that didn't work. Googling suggested that I need to convert this pipeline into a function. How can I do that, so that viewcsv file_name.csv produces the same output as cat file_name.csv | sed -e 's/,,/, ,/g' | column -t -s ","| less -s does?
Function syntax looks like this:
viewcsv() {
sed -e 's/,,/, ,/g' "$1" | column -t -s ","| less -s
}
Notice that cat "$1" | sed has been replaced with sed ... "$1"; sed can read the file directly, so the cat is unnecessary.
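A minimal usage example, assuming the function has been added to ~/.bashrc (or pasted into the current shell):

source ~/.bashrc          # reload the function definition
viewcsv file_name.csv     # same output as the original cat | sed | column | less pipeline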
csvkit has a CSV previewer, by the way:
$ csvlook <<< $'a,b,c\n10,20,30'
| a | b | c |
| -- | -- | -- |
| 10 | 20 | 30 |

Bash Script: Filter large files for value

I have several config files with around 20k lines each and I need to get some values from them.
I know that each of the values I need starts with a specific word "CONFNET" so I tried to get the values with a while loop, which reads every line.
But unfortunately this is extremely inefficient and slow.
Is there a better solution to this?
for filename in ~/configs/*; do
    ip=$(cat "$filename" | strings | grep -i -A 7 "addnet_outside" | head -7 | grep "IP" | sed "s/IP//" | sed "s/=//" | sed -e 's/^[ \t]*//')
    hostname=$(cat "$filename" | strings | grep -a "Inst:" | head -1 | sed "s/Inst://" | sed -e 's/^[ \t]*//')
    while IFS= read -r line; do
        object_name=$(echo "$line" | strings | grep "CONFNET" | sed "s/CONFNET//" | awk '{print $1}')
        object_value=$(echo "$line" | strings | grep "CONFNET" | sed "s/CONFNET//" | awk '{print $3}' | sed -e 's/^[ \t]*//')
        if [ ! -z "$object_name" ] && [ ! -z "$object_value" ]; then
            echo "$hostname" "->" "$object_name" ":" "$object_value"
        fi
    done < "$filename"
done
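No answer was posted above, but since the per-line subshells are what make the loop so slow, one common alternative is to let a single awk pass scan each file. This is only a sketch: it assumes the relevant lines look like "CONFNET <name> = <value>" (the same layout the original $1/$3 extraction implies) and keeps the hostname extraction unchanged.

for filename in ~/configs/*; do
    hostname=$(strings "$filename" | grep -a "Inst:" | head -1 | sed "s/Inst://" | sed -e 's/^[ \t]*//')
    # one awk process per file instead of several subshells per line
    strings "$filename" | awk -v host="$hostname" '
        /CONFNET/ { sub(/CONFNET/, ""); print host, "->", $1, ":", $3 }'
done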

Unable to substitute redirection for redundant cat

cat joined.txt | xargs -t -a <(cut --fields=1 | sort -u | grep -E '\S') -I{} --max-args=1 --max-procs=4 echo "mkdir -p imdb/movies/{}; grep '^{}' joined.txt > imdb/movies/{}/movies.txt" | bash
The code above works, but substituting the redundant cat at the start with a redirection, as below, doesn't work and leads to a cut input/output error.
< joined.txt xargs -t -a <(cut --fields=1 | sort -u | grep -E '\S') -I{} --max-args=1 --max-procs=4 echo "mkdir -p imdb/movies/{}; grep '^{}' joined.txt > imdb/movies/{}/movies.txt" | bash
In either case, it is the cut command inside the process substitution (and not xargs) that should be reading from joined.txt, so to be completely safe, you should put either the pipe or the input redirection inside the process substitution. Actually, neither is necessary; cut can just take joined.txt as an argument.
xargs -t -a <( cat joined.txt | cut ... ) ... | bash
or
xargs -t -a <( cut -f1 joined.txt | ... ) ... | bash
However, it would be clearest to skip the process substitution altogether, and pipe the output of that pipeline to xargs:
cut -f1 joined.txt | sort -u | grep -E '\S' | xargs -t ...
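Putting that together with the rest of the original command, a full rewrite might look like this (a sketch only; it keeps the echo ... | bash structure from the question, and drops --max-args=1 because -I already implies one replacement per command):

cut -f1 joined.txt | sort -u | grep -E '\S' |
    xargs -t -I{} --max-procs=4 \
        echo "mkdir -p imdb/movies/{}; grep '^{}' joined.txt > imdb/movies/{}/movies.txt" |
    bash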

BASH command: How to save the output of a bash command into a variable and later pipe it into another command

I have a question about how to store the output of a command into a variable and then later pipe it into another command:
var=$(ps -auxc | grep -vE '^USER' )
#get top CPU
echo $var | sort -nr -k3 | head -1
#get top memory
echo $var | sort -nr -k4 | head -1
Make sure to use quotes in assignment and while accessing variable:
var="$(ps -auxc | grep -vE '^USER')"
#get top CPU
sort -nr -k3 <<< "$var" | head -1
#get top memory
sort -nr -k4 <<< "$var" | head -1
I'm not sure if this would always work:
IFS= read -rd '' var < <(ps -auxc | grep -vE '^USER') ## -d '' may be -d $'\0'
echo -n "$var" | sort -nr -k3 | head -1
However using readarray could:
readarray -t var < <(ps -auxc | grep -vE '^USER')
printf '%s\n' "${var[@]}" | sort -nr -k4 | head -1
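For context on why the quoting matters (an illustrative sketch, not part of the original answer): an unquoted expansion is word-split, so the newlines in the ps output collapse into spaces and sort only ever sees a single line.

var="$(printf 'b 2\na 1\n')"
echo $var            # prints: b 2 a 1   -- newlines lost, nothing left to sort
echo "$var" | sort   # prints the two lines, correctly sorted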

OpenStreetMap Bash / CGI Script

I am trying to build a Bash CGI script that takes coordinates as parameters from the URL and uses osmosis to extract the map, then splitter and mkgmap to build it so that it can be opened with Qlandkarte. My problem is that when I type wget localhost/cgi-bin/script.pl?top=42&left=10&bottom=39&right=9&file=map.osm, the Linux terminal reads the file with the coordinates. How can I make wget just invoke the script so that it takes the coordinates and executes the commands? And when the map is created at the end, how can I return the file that was created by the script?
Thanks
#!/bin/bash
TOP=`echo "$QUERY_STRING" | grep -oE "(^|[?&])top=[0.0-9.0]+" | cut -f 2 -d "=" | head -n1`
LEFT=`echo "$QUERY_STRING" | grep -oE "(^|[?&])left=[0.0-9.0]+" | cut -f 2 -d "=" | head -n1`
BOTTOM=`echo "$QUERY_STRING" | grep -oE "(^|[?&])bottom=[0.0-9.0]+" | cut -f 2 -d "=" | head -n1`
RIGHT=`echo "$QUERY_STRING" | grep -oE "(^|[?&])right=[0.0-9.0]+" | cut -f 2 -d "=" | head -n1`
FILE=`echo "$QUERY_STRING" | grep -oE "(^|[?&])file=[^&]+" | sed "s/%20/ /g" | cut -f 2 -d "="`
$(sudo osmosis --read-xml file=bulgaria.osm --bounding-box top=$TOP left=$LEFT bottom=$BOTTOM right=$RIGHT --write-xml file=$FILE)
$(sudo java -Xmx900m -jar splitter.jar --max-nodes=110000 $FILE)
$(sudo java -ea -Xmx900m -jar mkgmap.jar --tdbfile --route -c template.args)
echo "Content-type: text/html"
echo ""
