Script as a variable value - bash

Is it possible to store the script as a variable? This is the script:
### SCRIPT1 ###
if [ "$4" = "test-1" ] ; then
existing_user1=$(ldapsearch -x -b "cn=users,cn=servers,ou=servers,dc=com" -H ldap://127.0.0.1 -D "cn=admin,dc=servers,dc=com" -w "pass" cn uid | grep "uid: $3" | grep -oE '[^ ]+$')
if [ "$existing_user1" = "$3" ] ; then
exit -1
elif [ "$existing_user1" = "" ] ; then
gid=523 ; cn="cn=users,cn=servers"
fi
elif [ "$4" = "servers2" ] ; then
existing_user2=$(ldapsearch -x -b "cn=users,cn=servers,ou=servers,dc=servers,dc=com" -H ldap://127.0.0.1 -D "cn=admin,dc=servers,dc=com" -w "pass" cn uid | grep "uid: $3" | grep -oE '[^ ]+$')
if [ "$existing_user2" = "$3" ] ; then
(echo "dn: cn=servers,ou=servers,dc=servers,dc=com"
echo "add: memberUid"
echo "memberUid: $3") | ldapmodify -D "cn=admin,dc=servers,dc=com" -w "pass"
exit -1
elif [ "$existing_user2" = "" ] ; then
gid=523 ; cn="cn=users,cn=servers"
(echo "dn: cn=servers,ou=servers,dc=servers,dc=com"
echo "add: memberUid"
echo "memberUid: $3") | ldapmodify -D "cn=admin,dc=servers,dc=com" -w "pass"
fi
fi
I tried like this:
my_new_variable=(
here I pasted the script from above
)
But it does not seem to work properly. I also changed the brackets to quotation marks, but that did not work either.
So far I have only stored simple values in variables, nothing complicated like another bash script.
I would like to have the above script stored in a variable that I can use later in the script, inside awk.

You can use a here-document:
script=$(cat <<"EOF"
### SCRIPT 1 ###
# commands
# last line
EOF
)
Everything on the lines between "EOF" and EOF is passed literally, as cat's standard input. This is nested inside a command substitution to put it in a variable. The final EOF string must be on its own line, so the closing ) must go on the next line. Note that any trailing newlines will be stripped, because it's a command substitution.
"EOF" can be any quoted word. If quotes are not used, variables, arithmetic and command subs are expanded.
This syntax is portable to sh, and not bash specific.
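Once stored, the variable is just an ordinary string. As a minimal sketch of the use case from the question (the variable name MYSCRIPT and the sample script are hypothetical, not from the original post), you could pass it into awk through the environment, which avoids the backslash processing that awk -v performs, or run it later with sh -c:
script=$(cat <<"EOF"
echo "hello from the stored script, first argument: $1"
EOF
)
# pass the stored text into awk via the environment (ENVIRON is standard awk)
MYSCRIPT="$script" awk 'BEGIN { print "awk sees:", ENVIRON["MYSCRIPT"] }'
# or execute the stored script later, supplying positional parameters
sh -c "$script" stored-script some-arg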

You can also save it as a file first, and then read the file into a variable.
e.g.
var=$(cat script.sh)
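As a side note (not part of the original answer), bash can read the file without cat by using a redirection inside the command substitution:
var=$(< script.sh)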

Related

Refactoring my shell scripts: calling a script and redirecting output to a new file inside a loop

I have a shell script that uses the sed command to transform an input file into a new output file. This script works when I call:
./sed_command.sh input_file > output_file
Here is my script:
#!/usr/bin/env sh
while read line
do
# Split each line based on ; separator
transposed=$(echo "$line" | sed -e "s/;/\\n/g")
res=$(echo "$transposed" | sed '/${GDATEF(\([^,]*\),ddMMyyyy)}/{
# Split the string in two part
s//\1/g
# # parse the input for GNU date input format
s/D/ day/g
# Handle shell quoting
'"s/'/\\''\\'/"'
# pass and execute date
s/.*/date -d "now "'\''&'\'' +%d%m%Y/e
}')
oneline=$(echo "$res" | sed -z "s/\n/;/g" | sed 's/.$//')
echo $oneline
done < $1
Now I need to call that script inside a double loop that browses all the directories in a folder:
cd /etc/newman/Newman
for team in *;do
if [ -d "$team" ]; then
echo "team=$team"
cd $team
for scope in *;do
if [ -d "$scope" ]; then
echo "scope=$scope"
CSV_FILE=$(ls /etc/newman/Newman/$team/$scope/*.csv -1 || true)
echo "Transforming : ${CSV_FILE}"
if [ ! -z "${CSV_FILE}" ] ; then sh sed_command.sh ${CSV_FILE} > ${CSV_FILE}_Newman; else echo "No CSV files to transform"; fi
fi
done
cd ..
fi
done
I have tested the second script and the looping works fine; the only issue is that when I call sed_command.sh inside the second script I get these two errors:
: not found.sh: line 2:
sed_command.sh: line 19: syntax error: unexpected "done" (expecting "do")
I guess redirection does not work the same way inside a loop? Thanks for the help.

bash: script to identify specific alias causing a bug

[Arch Linux v5.0.7 with GNU bash 5.0.3]
Some .bashrc aliases seem to conflict with the bash shell scripts provided by pyenv and pyenv-virtualenvwrapper. I tracked down the problem by running the script with set -x and all aliases enabled, and finally saw that the script exits gracefully with exit code 0 only when aliases are disabled with unalias -a. So this has to do with aliases... but which one?
To try to automate that, I wrote the shell-script below:
It un-aliases one alias at a time, reading iteratively from the complete list of aliases,
It tests the conflicting shell script test.sh against that leave-one-out alias configuration, and prints something in case an error is detected,
It undoes the previous un-aliasing,
It goes on to un-aliasing the next alias.
But the two built-ins alias and unalias do not fare well in the script cac.sh below:
#! /usr/bin/bash
[ -e aliases.txt ] && rm -f aliases.txt
alias | sed 's/alias //' | cut -d "=" -f1 > aliases.txt
printf "File aliases.txt created with %d lines.\n" \
"$(wc -l < <(\cat aliases.txt))"
IFS=" "
n=0
while read -r line || [ -n "$line" ]; do
n=$((n+1))
aliasedAs=$( alias "$line" | sed 's/alias //' )
printf "Line %2d: %s\n" "$n" "$aliasedAs"
unalias "$line"
[ -z $(eval "$*" 1> /dev/null) ] \ # check output to stderr only
&& printf "********** Look up: %s\n" "$line"
eval "${aliasedAs}"
done < <(tail aliases.txt) # use tail + proc substitution for testing only
Use the script like so: $ cac.sh test.sh [optional arguments to test.sh]. Any test.sh will do; it just needs to return some non-empty string to stderr.
The first anomaly is that the file aliases.txt is empty, as if the alias builtin were not accessible from within the script. If I start the script from its 3rd line, using an already populated aliases.txt file, the script fails at the second line within the while block, again as if alias could not be called from within the script. Any suggestions appreciated.
Note: The one liner below works in console:
$ n=0;while read -r line || [ -n "$line" ]; do n=$((n+1)); printf "alias %d : %s\n" "$n" "$(alias "$line" | sed 's/alias //')"; done < aliases.txt
I would generally advise against implementing this as an external script at all -- it makes much more sense as a function that can be evaluated directly in your interactive shell (which is, after all, where all the potentially-involved aliases are defined).
print_result() {
local prior_retval=$? label=$1
if (( prior_retval == 0 )); then
printf '%-30s - %s\n' "$label" WORKS >&2
else
printf '%-30s - %s\n' "$label" BROKEN >&2
fi
}
test_without_each_alias() {
[ "$#" = 1 ] || { echo "Usage: test_without_each_alias 'code here'" >&2; return 1; }
local alias
(eval "$1"); print_result "Unchanged aliases"
for alias in "${!BASH_ALIASES[@]}"; do
(unalias "$alias" && eval "$1"); print_result "Without $alias"
done
}
Consider the following:
rm_in_home_only() { [[ $1 = /home/* ]] || return 1; rm -- "$@"; }
alias rm=rm_in_home_only # alias actually causing our bug
alias red_herring=true # another alias that's harmless
test_without_each_alias 'touch /tmp/foobar; rm /tmp/foobar; [[ ! -e /tmp/foobar ]]'
...which emits something like:
Unchanged aliases - BROKEN
Without rm - WORKS
Without red_herring - BROKEN
Note that if the code you pass executes a function, you'll want to be sure that the function is defined inside the eval'd code; since aliases are parser behavior, they take place when functions are defined, not when functions are run.
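As a minimal illustration of that caveat (the function name do_cleanup and the test code are hypothetical, reusing the rm example above), define the function inside the string you pass, so any alias is expanded at definition time on each run:
test_without_each_alias '
do_cleanup() { rm /tmp/foobar; }  # the rm alias (if any) is expanded here, when the function is parsed
touch /tmp/foobar; do_cleanup; [[ ! -e /tmp/foobar ]]
'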
@Kamil_Cuk, @Benjamin_W and @cdarke all pointed to the fact that a noninteractive shell (such as the one spawned from a bash script) does not have access to aliases.
@CharlesDuffy pointed to probable word splitting and glob expansion resulting in something that could be invalid test syntax in the original [ -z $(eval "$*" 1> /dev/null) ] block above, or, worse, the possibility of $(eval "$*" 1> /dev/null) being parsed as a glob, resulting in unpredictable script behavior. The block was corrected to: [ -z "$(eval "$*" 1> /dev/null)" ].
Making the shell spawned by cac.sh interactive, with #! /usr/bin/bash -i, made the two built-ins alias and unalias return non-null results when invoked, and BASH_ALIASES[@] became accessible from within the script.
#! /usr/bin/bash -i
[ -e aliases.txt ] && rm -f aliases.txt
alias | sed 's/alias //' | cut -d "=" -f1 > aliases.txt
printf "File aliases.txt created with %d lines.\n" \
"$(wc -l < <(\cat aliases.txt))"
IFS=" "
while read -r line || [ -n "$line" ]; do
aliasedAs=$( alias "$line" | sed 's/alias //' )
unalias "$line"
[ -z "$(eval "$*" 2>&1 1>/dev/null)" ] \ # check output to stderr only
&& printf "********** Look up: %s\n" "$line"
eval "${aliasedAs}"
done < aliases.txt
Warning: testing test.sh resorts to the eval built-in. Arbitrary code can be executed on your system if test.sh and optional arguments do not come from a trusted source.

Bash Script - Will not completely execute

I am writing a script that will take in 3 arguments and then search all files within a predefined path. However, my grep command seems to be breaking the script with error code 123. I have been staring at it for a while and cannot really see the error, so I was hoping someone could point out my mistake. Here is the code:
#! /bin/bash -e
#Check if path exists
if [ -z $ARCHIVE ]; then
echo "ARCHIVE NOT SET, PLEASE SET TO PROCEED."
echo "EXITING...."
exit 1
elif [ $# -ne 3 ]; then
echo "Illegal number of arguments"
echo "Please enter the date in yyyy mm dd"
echo "EXITING..."
exit 1
fi
filename=output.txt
#Simple signal handler
signal_handler()
{
echo ""
echo "Process killed or interrupted"
echo "Cleaning up files..."
rm -f out
echo "Finsihed"
exit 1
}
trap 'signal_handler' KILL
trap 'signal_handler' TERM
trap 'signal_handler' INT
echo "line 32"
echo $1 $2 $3
#Search for the TimeStamp field and replace the / and : characters
find $ARCHIVE | xargs grep -l "TimeStamp: $2/$3/$1"
echo "line 35"
fileSize=`wc -c out.txt | cut -f 1 -d ' '`
echo $fileSize
if [ $fileSize -ge 1 ]; then
echo "no"
xargs -n1 basename < $filename
else
echo "NO FILES EXIST"
fi
I added the echoes to see where it was breaking. My program prints out line 32 and the args but never line 35. When I check the exit code I get 123.
Thanks!
Notes:
ARCHIVE is set to a test directory, i.e. /home/'uname'/testDir
$1 $2 $3 == yyyy mm dd (ie a date)
In testDir there are N directories. Inside these directories there are data files that contain data as well as a time tag. The time tag is of the following format: TimeStamp: 02/02/2004 at 20:38:01
The script's goal is to find all files that have the date tag you are searching for.
Here's a simpler test case that demonstrates your problem:
#!/bin/bash -e
echo "This prints"
true | xargs false
echo "This does not"
The snippet exits with code 123.
The problem is that xargs exits with code 123 if any invocation of the command exits with a nonzero status. When xargs exits with non-zero status, -e causes the script to exit.
The quickest fix is to use || true to effectively ignore xargs' status:
#!/bin/bash -e
echo "This prints"
true | xargs false || true
echo "This now prints too"
The better fix is to not rely on -e, since this option is misleading and unpredictable.
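For example, a minimal sketch (not from the original answer) that checks the pipeline status explicitly instead of letting -e abort the script; redirecting into $filename is an assumption based on how output.txt is used later in the script:
# an if-guard also prevents set -e from aborting on the nonzero xargs status
if ! find "$ARCHIVE" | xargs grep -l "TimeStamp: $2/$3/$1" > "$filename"; then
    echo "at least one grep invocation found no match (or a command failed)" >&2
fi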
xargs exits with error code 123 when grep returns a nonzero code even just once. Since you're using -e (#!/bin/bash -e), bash exits the script when one of its commands returns a nonzero exit code. Not using -e would allow your code to continue. Disabling it just around that part is a solution too:
set +e ## Disable
find "$ARCHIVE" | xargs grep -l "TimeStamp: $2/$1/$3" ## If one of the files doesn't match the pattern, `grep` would return a nonzero code.
set -e ## Enable again.
Consider placing quotes around your variables as well, like "$ARCHIVE", to prevent word splitting.
-d '\n' may also be required if any of your filenames contain spaces.
find "$ARCHIVE" | xargs -d '\n' grep -l "TimeStamp: $2/$1/$3"

How to access the bash PIPESTATUS array of an eval'd command?

I have this code:
error(){
time=$( date +"%T %F" )
echo "Start : ${time} : ${1}" 1>&2
result=$( eval "${1}" )
if [ `echo "${PIPESTATUS[@]}" | tr -s ' ' + | bc` -ne 0 ]; then
echo "command ${1} return ERROR" 1>&2
exit
else
if [ "${2}" != "silent" ]; then
echo "${result}"
fi
fi
}
I start testing command:
error "ifconfig | wc -l" "silent"
Start : 14:41:53 2014-02-19 : ifconfig | wc -l
error "ifconfiggg | wc -l" "silent"
Start : 14:43:13 2014-02-19 : ifconfiggg | wc -l
./install-streamserver.sh: line 42: ifconfiggg: command not found
But, I expect a different result. Example:
error "ifconfig" "silent"
Start : 14:44:52 2014-02-19 : ifconfig
Start : 14:45:40 2014-02-19 : ifconfiggg
./install-streamserver.sh: line 42: ifconfiggg: command not found
command ifconfiggg return ERROR (<<<<<<<<<<<< This message)
I don't get that message, because when bash runs a command with eval, as in
eval "ifconfiggg | wc -l"
the ${PIPESTATUS[@]} array just contains "0" instead of the expected "1 0".
How can I fix this?
The eval starts a new shell context which has a separate PIPESTATUS[] array. The lifetime of this context ends when the eval ends. You can communicate this array to the parent context through assigning to a variable, say, PIPE as follows:
$ eval 'ifconfiggg | wc -l; PIPE=${PIPESTATUS[@]}'
bash: ifconfiggg: command not found
0
$ echo $PIPE
127 0
Note the single quotes to prevent ${PIPESTATUS[@]} from expanding too early.
Wrapping this in yet another result=$(...) does not work, since this creates yet another shell context. I suggest instead something along the lines of
eval "${1}; "'PIPE=${PIPESTATUS[#]}' >result.out 2>result.err
# do something with $PIPE here
# do something with result.out or result.err here
Note the use of both double quotes followed by single quotes.
Thanks @Jens for posting this information. I noticed that for
eval "${1}; "'PIPE=${PIPESTATUS[#]}' >result.out 2>result.err
it's better to use parentheses around the PIPESTATUS expansion. Otherwise it is assigned as a single string and the complete result ends up in ${PIPE[0]} only. So
eval "${1}; "'PIPE=(${PIPESTATUS[#]})' >result.out 2>result.err
is working as expected.
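Putting the pieces together, here is a sketch of how the original error() function might use this pattern (result.out and result.err are temporary file names assumed here, as in the answer above):
error() {
    time=$( date +"%T %F" )
    echo "Start : ${time} : ${1}" 1>&2
    # run the command; PIPE is set in the current shell because eval is not wrapped in $(...)
    eval "${1}; "'PIPE=(${PIPESTATUS[@]})' >result.out 2>result.err
    for status in "${PIPE[@]}"; do
        if [ "$status" -ne 0 ]; then
            echo "command ${1} return ERROR" 1>&2
            exit
        fi
    done
    if [ "${2}" != "silent" ]; then
        cat result.out
    fi
}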

bash - how to pass arg values to a shell script within a cronjob

I have a cron entry that looks like this, which works (passing in 5 inputs):
30 10 * * 5 sh /home/test.sh hostnm101.abc /mypath/dir test foobar F008AR >> /logs/mytst.log 2>&1
I want to change it so that inputs 4 and 5 (foobar and F008AR) are stored in a separate file and read in by the script test.sh ($4, $5).
test.sh
#!/bin/bash
if [ $# != 5 ]; then
exit 1
fi
DIRDT=`date '+%Y%m%d'`
TSTDIR=$2/test/$DIRDATE
[ ! -d "$TSTDIR" ] && ( mkdir "$TSTDIR" || { echo 'mkdir command failed'; exit 1; } )
perl /home/dev/tstextr.pl -n $1 -b $2 -d $TSTDIR/ -s $3 -u $4 -p $5 -f $DIRDT
Is there any easy way to do this within the cron for those (2) input values? Thanks.
Alright try this way.
1) Create a separate file /mypath/dir/login.info with content like this (username/password in separate lines):
foobar
F008AR
2) Modify your test.sh like this:
#!/bin/bash
if [ $# != 4 ]; then
exit 1
fi
DIRDT=`date '+%Y%m%d'`
TSTDIR=$2/test/$DIRDATE
[ ! -d "$TSTDIR" ] && ( mkdir "$TSTDIR" || { echo 'mkdir command failed'; exit 1; } )
IFS="
"
arr=( $(<$2/$4) )
#echo "username=${arr[0]} password=${arr[1]}"
perl /home/dev/tstextr.pl -n $1 -b $2 -d $TSTDIR/ -s $3 -u ${arr[0]} -p ${arr[1]} -f $DIRDT
3) Have your cron command like this:
30 10 * * 5 sh /home/test.sh hostnm101.abc /mypath/dir test login.info >> /logs/mytst.log 2>&1
Summary
IFS stands for Internal Field Separator in bash.
I am using it like this:
IFS="
"
which makes the newline character the field separator (since the username and password are stored on 2 separate lines). Then this line reads the file /mypath/dir/login.info into an array:
arr=( $(<$2/$4) )
The first line (username) is read into ${arr[0]}
The second line (password) is read into ${arr[1]}
You can echo it to check read content:
echo "username=${arr[0]}"
echo "password=${arr[1]}"
cron executes the command with sh -c, so:
sh /home/test.sh hostnm101.abc /mypath/dir test `cat /mypath/dir/foobar_file` `cat /mypath/dir/F008AR_file` >> /logs/mytst.log 2>&1

Resources