How to break loop in shell script used to truncate table? - shell

I'm writing a script that truncates tables. The problem is that the script goes into an infinite loop.
./n_input contains environment variables that are used in the script.
#!/bin/ksh
INPUT_FILE=./n_input
if [ ! -e $INPUT_FILE ];
then
    echo "Error: Input file required"
    exit 1
fi
source $INPUT_FILE
while read Table
do
    TABLE="$DSTN_DATABASE.dbo.$Table"
    echo "Table : $TABLE"
    QUERY="truncate table $TABLE"
    echo $QUERY > ./tmp_file
    sqlcmd -m 1 -U $DSTN_USER -P $DSTN_PASSWORD -D -S $DSTN_SERVER -i ./tmp_file
    RET_VALUE=$?
    if [ $RET_VALUE -ne 0 ]
    then
        echo "Error $TABLE"
    fi
done < $TABLE_LIST
exit 0
How do I break the loop? I tried removing sqlcmd from the script and verified that the rest of the loop works as expected. I observed the same behavior with the sqlcmd -Q option.
The $TABLE_LIST file contains only one table name.

You don't need to write the query into a temporary file. Use the -q or -Q option instead:
q="truncate table ${TABLE};"
sqlcmd -m 1 -U "$DSTN_USER" -P "$DSTN_PASSWORD" -S "$DSTN_SERVER" -q "$q"
Note the ; at the end of the query. That is probably why the script "stalls", which may look like an infinitely running loop.
Also note the use of double quotes. You should wrap variables in double quotes to prevent reinterpretation of special characters.
By the way, you can locate the exact command that is causing the issue by adding set -x at the beginning of the script. set -x turns on debugging mode; with it on, you see each command as it is executed.
It's very unlikely that the content of the $TABLE_LIST file is causing this behavior, unless the file is enormous. The loop construct is correct, and the number of iterations should match the number of lines in the file.
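Putting those pieces together, the loop from the question could look like this (a sketch, assuming the same variables from ./n_input; the < /dev/null redirect, explained in the related answer below, keeps sqlcmd from consuming the table list):
while read Table
do
    TABLE="$DSTN_DATABASE.dbo.$Table"
    echo "Table : $TABLE"
    if ! sqlcmd -m 1 -U "$DSTN_USER" -P "$DSTN_PASSWORD" -S "$DSTN_SERVER" \
            -Q "truncate table $TABLE;" < /dev/null
    then
        echo "Error $TABLE"
    fi
done < "$TABLE_LIST"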

Related

loop variables stuck as constant when using sqlcmd

While writing a bash script that calls sqlcmd for different variables in a loop, I noticed that the loop variables, which should be updated on each iteration, stay constant once sqlcmd has been called the first time. (I narrowed this down by placing continue at later and later points in the loop body until I reached the sqlcmd section.) That is, if we loop through a list of L MSSQL Server table names with sqlcmd, the loop performs not L but infinitely many iterations, using only the first entry in the list.
A minimal example is below:
#!/bin/bash
tables_list=$1
while read -r line
do
    tablecols="$line"
    IFS=',' read -a arr_tablecols <<< "$tablecols"
    mssql_tablename=${arr_tablecols[0]}
    echo -e "\n\n\n##### Processing: $mssql_tablename #####\n"
    TO_SERVER_ODBCDSN="-D -S <ODBC DSN name for mssql host>"
    TO_SERVER_IP="-S <my mssql host IP>"
    DB="ClarityETL_test"
    TABLE="$mssql_tablename"
    USER=<my mssql username>
    PASSWORD=<my mssql login password>
    # uncomment to see that sqlcmd does in fact appear to be the problem
    #continue
    {
        echo -e "Counting destination table: $DB/$TABLE"
        sqlcmd -Q "PRINT '$mssql_tablename';" \
            $TO_SERVER_ODBCDSN \
            -U $USER -P $PASSWORD \
            -d $DB
    } || { echo -e "\nFailed to truncate MSSQL DB"; exit 255; }
done < "$tables_list"
where the file being used as $tables_list looks like
mymssqltable1
mymssqltable2
...
(Since the example just uses the PRINT command, the list could be almost anything you want to use.)
This behavior is really weird to me, and I could not find anything about it in the docs (https://learn.microsoft.com/en-us/sql/linux/quickstart-install-connect-red-hat?view=sql-server-2017#create-and-query-data) (I'm on CentOS 7). If anyone knows what is going on here or has any further debugging advice, please let me know.
I suspect sqlcmd is reading from standard input, so it's consuming the rest of the input file. I'm not sure why this causes the loop to repeat infinitely rather than ending after the first iteration, but if this is what's going on, the solution is to redirect the input of sqlcmd:
sqlcmd -Q "PRINT '$mssql_tablename';" \
    $TO_SERVER_ODBCDSN \
    -U $USER -P $PASSWORD \
    -d $DB < /dev/null
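You can reproduce the consumption effect without sqlcmd at all. In this minimal sketch, cat stands in for any command that reads standard input:
while read -r line; do
    echo "got: $line"
    cat > /dev/null   # stand-in for sqlcmd: swallows the rest of the loop's input
done <<'EOF'
one
two
three
EOF
This prints only "got: one", because cat consumes the remaining lines before read ever sees them.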
When you redirect the input of the while loop, all the commands inside the loop inherit that redirected input. So note that if you were calling sqlcmd from another script nested in the loop (rather than directly within the loop itself), you would do something like
while read -r line
do
    tablecols="$line"
    IFS=',' read -a arr_tablecols <<< "$tablecols"
    mssql_tablename=${arr_tablecols[0]}
    bash scriptThatUsesSqlcmd.sh "$mssql_tablename" < /dev/null
done < "$tables_list"
so that the script which uses sqlcmd does not inherit the loop's standard-input redirection.

printing output of command history 1 from shell script

Here's my problem: from the console, if I type the following,
var=`history 1`
echo $var
I get the desired output. But when I do the same inside a shell script, it shows no output. For other commands like pwd, ls, etc., the script shows the desired output without any issue.
As the value of the variable contains a space, add quotes around it.
E.g.:
var='history 1'
echo $var
I believe all you need is the following:
1- Ask the user for the number of history lines to print.
2- Run the script and take the input from the user to get the output:
cat get_history.ksh
if [[ $# -eq 0 ]]
then
    echo "Enter the line number of history which you want to get.."
    read number
else
    number=$1
fi
history "$number"
Added logic to check the arguments: if no line count is passed on the command line (usage: get_history.ksh number_of_lines), the script prompts the user for one instead.
By default history is turned off in a script, therefore you need to turn it on:
set -o history
var=$(history 1)
echo "$var"
Note the preferred use of $( ) rather than the deprecated backticks.
However, this will only look at the history of the current process, that is this shell script, so it is fairly useless.
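For completeness, a runnable sketch of that point (the ls is only there to give the script some history of its own):
#!/bin/bash
set -o history        # history is off in non-interactive shells
ls > /dev/null        # give the script a history entry
var=$(history 1)
echo "$var"           # shows the script's own most recent history entry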

shell command to skip file in sequence

I run a C++ program on several data files in sequence using:
for i in $(seq 0 100); do ./f.out c2.$(($i*40+495)).bin c2.avg.$(($i*40+495)).txt; done
Now if some input files are missing, say c2.575.bin is missing, then the command is not executed for the rest of the files. How could I modify the shell command to skip input files that are missing and move on to the next one?
Thanks.
In the loop, test whether the file exists before calling the program that operates on it:
for i in $(seq 0 100); do
    INPUT=c2.$(($i*40+495)).bin
    test -e "$INPUT" && ./f.out "$INPUT" c2.avg.$(($i*40+495)).txt
done
This way the ./f.out ... will be executed only for existing input files.
See man test for details.
BTW, the && notation is a shorthand for if. See help if or man sh.
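For illustration, the && line above behaves like this if form ($OUTPUT here is just a placeholder for the c2.avg...txt name):
if test -e "$INPUT"; then
    ./f.out "$INPUT" "$OUTPUT"
fi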
You can use {0..100} instead of $(seq 0 100) for better readability. You can put the following code in a script and execute the script, e.g. runCode.bash:
#!/bin/bash
for i in {0..100}
do
    # Assign a variable for the filenames
    ifn=c2.$(($i*40+495)).bin
    # The -s option checks that the file exists and has size greater than zero
    if [ -s "${ifn}" ]; then
        ./f.out "${ifn}" c2.avg.$(($i*40+495)).txt
    else
        echo "${ifn} No such file"
    fi
done
Change permissions and execute the script:
chmod u+x runCode.bash
./runCode.bash

Bash script comparing curl results with file

So I am writing a script that curls a site I've written, which returns a string, and compares that string with a value saved in a file. I am having trouble getting my conditional statement to return true for matching values.
Here is a snippet of my bash code
var1=$(curl -s -w %{http_code}\\n www.somesite.com)
var2=$(cat somefile)
if [[ "$var1" = "$var2" ]]; then
echo "TRUE"
else
echo "FALSE"
fi
I have manually looked at both strings and they seem identical. I've run wc on both with all applicable options. I've copied and pasted both values into Notepad++ and did a find for the expected string, and it matched both values.
Obviously, if I manually put the values in the condition it returns true, so I know its not my conditional statement.
My guess is there is some type of hidden character at the end of the curl output ('\r' or '\n' or something I'm unaware of), or maybe at the end of the file. Is this a known issue when comparing curl output with file content?
This may be a dumb question, but for the life of me I cannot get these strings to match when comparing them dynamically instead of hardcoding the values in the condition.
Any suggestions would be great. Thanks.
You can test that theory with diff -b, which ignores whitespace changes such as a trailing carriage return: the first pair below differs only by \r and compares equal, while the second pair has a real difference:
$ diff -q -b <(echo $'abc\ndef') <(echo $'abc\r\ndef') > /dev/null ; echo $?
0
$ diff -q -b <(echo $'abc\ndef') <(echo $'abc\r\nde') > /dev/null ; echo $?
1
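If a stray carriage return is indeed the culprit, one fix (a sketch using the variables from the question) is to strip CRs before comparing; command substitution already trims trailing newlines:
var1=$(curl -s -w %{http_code}\\n www.somesite.com | tr -d '\r')
var2=$(tr -d '\r' < somefile)
if [[ "$var1" = "$var2" ]]; then
    echo "TRUE"
else
    echo "FALSE"
fi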

BASH Variables with multiple commands and reentrant

I have a bash script that sources content from another file. The contents of the other file are commands I would like to execute and whose return values I want to compare. Some of the entries consist of multiple commands separated by either a semicolon (;) or by ampersands (&&), and I can't seem to make this work. To investigate, I created some test scripts, shown below.
test.conf is the file being sourced by test.sh.
Example-1 (this works); the two outputs are 2 seconds apart.
test.conf
CMD[1]="date"
test.sh
. test.conf
i=2
echo "$(${CMD[$i]})"
sleep 2
echo "$(${CMD[$i]})"
Example-2 (this does not work)
test.conf (same script as above)
CMD[1]="date;date"
Example-3 (tried this, it does not work either)
test.conf (same script as above)
CMD[1]="date && date"
I don't want my variable CMD to be inside backticks, because then the commands would be executed when the file is sourced, and I see no way of re-evaluating the variable.
This script essentially calls CMD on pass 1 to check something; if pass 1 gives a false reading, I do some work in the script to correct it and then re-execute and re-evaluate CMD on pass 2.
Here is an example. I'm checking whether SSHD is running. If it's not running when I evaluate CMD[1] on pass 1, I start it and re-evaluate CMD[1].
test.conf
CMD[1]=`pgrep -u root -d , sshd 1>/dev/null; echo $?`
So if I modify this for my test script, then test.conf becomes:
CMD[1]=`date;date`
or
CMD[1]=`date && date`
My script (test.sh) then looks like this:
. test.conf
i=2
echo "${CMD[$i]}"
sleep 2
echo "${CMD[$i]}"
I get the same date/time printed twice despite the 2-second delay. As such, CMD is not getting re-evaluated.
First of all, you should never use backticks unless you need to be compatible with an old shell that doesn't support $() - and only then.
Secondly, I don't understand why you're setting CMD[1] but then calling CMD[$i] with i set to 2.
Anyway, this is one way (and it's similar to part of Barry's answer):
CMD[1]='$(date;date)' # no backticks (remember - they carry Lyme disease)
eval echo "${CMD[1]}" # or $i instead of 1
From the couple of lines of your question, I would have expected some approach like this:
#!/bin/bash
while read -r line; do
    # munge $line
    if eval "$line"; then
        : # success
    else
        : # fail
    fi
done
Where you have backticks in the source, you'll have to escape them to avoid evaluating them too early. Also, backticks aren't the only way to evaluate code - there is eval, as shown above. Maybe it's eval that you were looking for?
For example, this line:
CMD[1]=`pgrep -u root -d , sshd 1>/dev/null; echo $?`
should probably look more like this:
CMD[1]='`pgrep -u root -d , sshd 1>/dev/null; echo $?`'
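Putting the two answers together, a minimal sketch of test.sh that re-evaluates a stored compound command on each pass (assuming test.conf contains CMD[1]='date && date', single-quoted, no backticks):
. test.conf
i=1
echo "$(eval "${CMD[$i]}")"
sleep 2
echo "$(eval "${CMD[$i]}")"   # re-evaluated: the timestamps now differ by 2 seconds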
