How to use unset with xargs in shell script? - bash

I want to write a bash script to remove some environment variables.
My problem is that when I run the command below
env | grep -i _proxy= | cut -d "=" -f1 | xargs -I {} echo {}
I see this result:
HTTPS_PROXY
HTTP_PROXY
ALL_PROXY
but when I replace echo with unset, like this:
env | grep -i _proxy= | cut -d "=" -f1 | xargs -I {} unset {}
I see this error:
xargs: unset: No such file or directory
What is my problem? Am I using xargs incorrectly?

You have xargs running in a pipeline, so it runs in a separate process and cannot alter the environment of the "parent" shell.
Also, xargs runs external commands, not shell builtins; unset is a builtin, so there is no unset executable for xargs to find.
You'll need to do this:
while read -r varname; do unset "$varname"; done < <(
env | grep -i _proxy= | cut -d "=" -f1
)
or
mapfile -t varnames < <(env | grep -i _proxy= | cut -d "=" -f1)
unset "${varnames[#]}"

I came across this:
$ env |grep AWS_|tr "=" " "|awk '{print $1}'
AWS_SESSION_TOKEN
AWS_SECRET_ACCESS_KEY
AWS_ACCESS_KEY_ID
$ env |grep AWS_|tr "=" " "|awk '{print $1}'|xargs -n1 unset
xargs: unset: No such file or directory
$
And the simple solution below worked for me:
unset `env |grep AWS_|tr "=" " "|awk '{print $1}'`
or:
unset `env |grep AWS_|cut -d= -f1`
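The same idea with $(...) instead of backticks, applied to the proxy variables from the original question (a sketch; it relies on the unquoted substitution being word-split and on the variable names containing no whitespace):
# unset is a shell builtin, so it runs in the current shell, which is why this works where xargs cannot
unset $(env | grep -i _proxy= | cut -d= -f1)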

Related

echo strings with environment variables from lines pulled from a file in bash

I have a file like so:
- ${VAR1}/blah/blah:/blah1
- ${VAR2}/blah/blah:/blah2
- $VAR3:/blah3
I ultimately need to create those three folders.
I am using sed to extract the folder part:
$ cat test.txt | grep -E '^ +- \$.*?:.*?$' | sed 's/.*- \(\$.*\):.*/\1/g'
${VAR1}/blah/blah
${VAR2}/blah/blah
$VAR3
I need to create those folders but I need those shell variables to expand. Right now they don't:
$ cat test.txt | grep -E '^ +- \$.*?:.*?$' | sed 's/.*- \(\$.*\):.*/\1/g' | while read line; do echo "$line"; done
${VAR1}/blah/blah
${VAR2}/blah/blah
$VAR3
Is there a way to get the expanded strings so I can run mkdir instead of echo to make the folders?
You may use this bash script with envsubst:
#!/usr/bin/env bash
export VAR1 VAR2 VAR3
while IFS=' -:' read -r _ d _; do
mkdir -p "$d"
done < <(envsubst < test.txt)
Alternatively use this envsubst + awk + xargs solution:
envsubst < test.txt |
  awk -F '[-:[:blank:]]+' -v ORS='\0' '{print $2}' |
  xargs -0 mkdir -p
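For reference, a quick way to see what envsubst produces on the sample input before any directories are created (a sketch with made-up values for VAR1, VAR2 and VAR3):
export VAR1=/tmp/one VAR2=/tmp/two VAR3=/tmp/three
envsubst < test.txt
#  - /tmp/one/blah/blah:/blah1
#  - /tmp/two/blah/blah:/blah2
#  - /tmp/three:/blah3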
First of all, those variables must be exported to be visible to your script. Then you can use a combination of the cut and tr commands to extract the directory name in a loop, like the following:
#!/bin/bash -eu
while read -r LINE; do
echo "$LINE" | cut -d ':' -f 1 | tr -d ' ' | tr -d '-'
done < test.txt

How to add a shell script to a Jenkins pipeline

I have the below shell script:
du -sh /bbhome/shared/data/repositories/* |sort -h |tail -20 |
while IFS= read -r line;do
  DIR=`echo $line | awk '{print$2}'`
  Rep=`cat $DIR/repository-config |grep 'project\|repo' | tr '\n' ' '`
  Size=`echo $line | awk '{print $1}' `
  echo $Size $Rep
done
How can I run it through Execute shell in Jenkins? I also need to add an ssh command to the environment (no password is needed).
Note: I don't want to connect to the environment and run the script there, but run it directly from the Execute shell box.
If I'm not wrong, you are using a Freestyle job and not a pipeline job.
Anyway, I think you should try the following:
ssh -t XXXXX@YYYYY << 'EOF'
du -sh /bbhome/shared/data/repositories/* |sort -h |tail -20 |
while IFS= read -r line;do\
DIR=`echo $line | awk '{print$2}'`\
Rep=`cat $DIR/repository-config |grep 'project\|repo' | tr '\n' ' '`\
Size=`echo $line | awk '{print $1}'`\
echo $Size $Rep\
done
EOF
I've escaped the code inside your while loop using \; if that doesn't work, you can use ; instead.
If you want help with using a pipeline job, let me know, but it might be a bit more complex.
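A hedged alternative, in case the trailing backslashes cause trouble: because the here-document delimiter is quoted ('EOF'), its body is passed to the remote shell verbatim, so the loop can usually be written exactly as in the original script, with no continuation characters at all (a sketch, untested in Jenkins):
ssh -t XXXXX@YYYYY << 'EOF'
du -sh /bbhome/shared/data/repositories/* | sort -h | tail -20 |
while IFS= read -r line; do
  DIR=`echo $line | awk '{print $2}'`
  Rep=`cat $DIR/repository-config | grep 'project\|repo' | tr '\n' ' '`
  Size=`echo $line | awk '{print $1}'`
  echo $Size $Rep
done
EOF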

grep search with filename as parameter

I'm working on a shell script.
OUT=$1
here, the OUT variable is my filename.
I'm using grep search as follows:
l=`grep "$pattern " -A 15 $OUT | grep -w $i | awk '{print $8}'|tail -1 | tr '\n' ','`
The issue is that the filename parameter I must pass is test.log. However, I have this file structure:
test.log
test.log.001
test.log.002
I would ideally like to pass the filename as test.log and have it search all of the log files. I know the usual way to do this is by using test.log.* on the command line, but I'm having difficulty replicating that in a shell script.
My efforts:
var=$'.*'
l=`grep "$pattern " -A 15 $OUT$var | grep -w $i | awk '{print $8}'|tail -1 | tr '\n' ','`
However, I did not get the desired result.
Hopefully this will get you closer:
#!/bin/bash
for f in "${1}"*; do
  grep "$pattern" -A15 "$f"
done | grep -w "$i" | awk 'END{print $8}'
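To see that the glob expands as intended, given the files from the question (a quick sketch):
# the * stays outside the quotes; prints test.log, test.log.001, test.log.002
for f in "test.log"*; do echo "$f"; done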

Unable to substitute redirection for redundant cat

cat joined.txt | xargs -t -a <(cut --fields=1 | sort -u | grep -E '\S') -I{} --max-args=1 --max-procs=4 echo "mkdir -p imdb/movies/{}; grep '^{}' joined.txt > imdb/movies/{}/movies.txt" | bash
The code above works, but substituting the redundant cat at the start with a redirection, as below, doesn't work and leads to a cut input/output error.
< joined.txt xargs -t -a <(cut --fields=1 | sort -u | grep -E '\S') -I{} --max-args=1 --max-procs=4 echo "mkdir -p imdb/movies/{}; grep '^{}' joined.txt > imdb/movies/{}/movies.txt" | bash
In either case, it is the cut command inside the process substitution (and not xargs) that should be reading from joined.txt, so to be completely safe, you should put either the pipe or the input redirection inside the process substitution. Actually, neither is necessary; cut can just take joined.txt as an argument.
xargs -t -a <( cat joined.txt | cut ... ) ... | bash
or
xargs -t -a <( cut -f1 joined.txt | ... ) ... | bash
However, it would be clearest to skip the process substitution altogether, and pipe the output of that pipeline to xargs:
cut -f1 joined.txt | sort -u | grep -E '\S' | xargs -t ...
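Putting it together, the original command can be written without the process substitution like this (a sketch; the same flags and behaviour as the original, just reorganised):
cut --fields=1 joined.txt | sort -u | grep -E '\S' |
  xargs -t -I{} --max-args=1 --max-procs=4 \
    echo "mkdir -p imdb/movies/{}; grep '^{}' joined.txt > imdb/movies/{}/movies.txt" |
  bash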

" echo" in bash script empty file

I have a problem with the echo command. I need to export data to a CSV file, but the file ends up empty.
#!/bin/bash
while read domain
do
ownname= whois $domain | grep -A 1 -i "Administrative Contact:" |cut -d ":" -f 2 | sed 's/ //' | sed -e :a -e '$!N;s/ \n/,/;ta'
echo -e "$ownname" >> test.csv
done < dom.txt
You need to use command substitution to store a command's output in a shell variable:
#!/bin/bash
while read domain; do
  ownname=$(whois $domain | grep -A 1 -i "Administrative Contact:" | cut -d ":" -f 2 | sed 's/ //' | sed -e :a -e '$!N;s/ \n/,/;ta')
  echo -e "$ownname" >> test.csv
done < dom.txt
PS: I haven't tested all the piped commands.
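To make the difference concrete (a sketch with a placeholder domain):
ownname= whois example.com    # assignment before a command: ownname is set to "" only in whois's environment, nothing is captured
ownname=$(whois example.com)  # command substitution: the output of whois is stored in ownname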
