In DB2, after connecting to a database, the connection stays open until the TERMINATE command is explicitly issued, so several SQL statements can be executed over the same connection inside a shell script.
Is there any way to do the same in Oracle SQL*Plus from a shell script?
For example, the bash script might look like this:
1. start of bash
2. sqlplus connection created
3. some bash commands
4. execute query using the same sqlplus connection created in 2nd step
5. some bash commands
6. execute query using the same sqlplus connection created in 2nd step
Thanks for the help. I have found another solution that I think best suits my requirement, and it is working perfectly fine: accessing SQL*Plus through a Korn shell coprocess.
Below is an example I have run; it returned all the results correctly.
#!/bin/ksh
##
##
output=""
set -f     ## Disable filename globbing so unquoted query output is not expanded.
integer rc
typeset -r ERRFILE=$0.err
typeset -r EOF="DONE"
## Create the error file or zero it out if it already exists.
> $ERRFILE
## Start sqlplus in a coprocess. Replace "connection" with your
## user/password@tnsname connect string.
sqlplus -s connection |&
## Exit SQL*Plus if any of the following signals are received:
## 0=normal exit, 2=interrupt, 3=quit, 15=termination (SIGKILL cannot be trapped).
trap 'print -p "exit"' 0 2 3 15
## Send commands to SQL*Plus.
print -p "set heading off;"
print -p "set feedback off;"
print -p "set pagesize 0;"
print -p "set linesize 500;"
##
## Send a query to SQL*Plus. It is formatted so we can set a shell variable.
##
print -p "select 'COUNT1='||count(*) as count from dual;"
print -p "prompt $EOF"
while read -p output
do
    if [[ "$output" == "$EOF" ]]; then
        break
    else
        ## eval forces the shell to evaluate the line twice. First, replacing
        ## "$output" with "COUNT1=99999", then again which creates and sets
        ## a variable.
        eval $output
    fi
done
##
## Send another query to the same running SQL*Plus coprocess.
##
print -p "select 'COUNT1_DATE='|| sysdate as count_date from dual;"
print -p "prompt $EOF"
while read -p output
do
    if [[ "$output" == "$EOF" ]]; then
        break
    else
        eval $output
    fi
done
print -p "select 'COUNT2='||count(*)||';COUNT2_DATE='||sysdate from dual;"
print -p "prompt $EOF"
while read -p output
do
    if [[ "$output" == "$EOF" ]]; then
        break
    else
        eval $output
    fi
done
print -p "select count(*)||'|'||sysdate from dual;"
print -p "prompt $EOF"
while read -p output
do
    if [[ "$output" == "$EOF" ]]; then
        break
    else
        IFS="|"                       ## Set the Input Field Separator.
        set -A output_array $output   ## Create an array. It parses
                                      ## automatically on the IFS character.
    fi
done
print "COUNT1 count is $COUNT1"
print "COUNT1 date is $COUNT1_DATE\n"
print "COUNT2 count is $COUNT2"
print "COUNT2 date is $COUNT2_DATE\n"
print "Array count3: ${output_array[0]}"
print "Array date3: ${output_array[1]}"
I have the following loop in my script; the ${snap[@]} array contains my list of ssh servers.
while IFS= read -r con; do
    ssh foo@"$con" /bin/bash <<- EOF
echo "Current server is $con"
EOF
done <<< "${snap[@]}"
I want to print the current iteration value of the array once the ssh has run successfully; $con should print the current ssh server, e.g. example@server. How would I do that?
If the elements in snap are the hosts that you want to connect to, just use a for loop:
for con in "${snap[@]}"; do
    # connect to "$con"
done
"${snap[@]}" expands to the safely-quoted list of elements in the array snap, suitable for use with for.
If you really want to use while, then you can do something like this:
i=0
while [ $i -lt ${#snap[@]} ]; do   # while i is less than the length of the array
    # connect to "${snap[i]}"
    i=$(( i + 1 ))                 # increment i
done
But as you can see, it's more awkward than the for-based approach.
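If you also want to print the host only once ssh has succeeded, you can test the command's exit status inside the loop; a minimal sketch, assuming foo is the remote user from the question:
for con in "${snap[@]}"; do
    if ssh "foo@$con" /bin/bash <<EOF
echo "Current server is $con"
EOF
    then
        echo "Connected to $con successfully"
    else
        echo "Could not connect to $con" >&2
    fi
done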
Like this:
while IFS= read -r con; do
    ssh "foo@$con" /bin/bash <<EOF
echo "Current server is $con"
EOF
done < <(printf '%s\n' "${snap[@]}")
#    ^
#    |
#    bash process substitution < <( )
Or simply:
for con in "${snap[@]}"; do
    ssh "foo@$con" /bin/bash <<EOF
echo "Current server is $con"
EOF
done
I am trying to read a text file that has a few comment lines starting with '#'; my bash script should read only the lines of the text file that do not start with '#'.
I am also trying to capture the output of the echo statements in the log and show it in the console window for the user's understanding.
I have tried the line below for capturing the log output and printing to the console:
exec 2>&1 1>>$logfile
For reading each line of the file and calling the function, I have declared an array, and to eliminate lines that start with '#' I have used the loop below:
declare -a cmd_array
while read -r -a cmd_array | grep -vE '^(\s*$|#)'
do
    "${cmd_array[@]}"
done < "$text_file"
Note: I need to eliminate the lines that start with '#'; the remaining lines should be read and placed in the array as declared.
Bash script
***********
#! /bin/bash
Function_1()
{
    now=$( date '+%Y%m%d%H%M' )
    eval logfile="$1"_"$now".log
    exec 2>&1 1>>$logfile   ### Capture echo output in log and printing in console
    #exec 3>&1 1>>$logfile 2>&1
    echo " "
    echo "############################"
    echo "Function execution Begins"
    echo "############################"
    echo "Log file got created with file name as $1.log"
    eval number=$1
    eval path=$2
    echo "number= $number"
    ls -lR $path >> temp.txt
    if [ $? -eq 0 ]; then
        echo " Above query executed."
    else
        echo "Query execution failed"
    fi
    echo "############################"
    echo "Function execution Ends"
    echo "############################"
    echo " "
}
text_file=$1
echo $text_file
declare -a cmd_array   ### declaring an array
### Read each line of the file that doesn't start with '#' and keep it in the array
while read -r -a cmd_array | grep -vE '^(\s*$|#)'
do
    "${cmd_array[@]}"
done < "$text_file"
Text file
*********
####################################
#Test
#Line2
####################################
Function_1 '125' '' ''
Function_1 '123' '' ''
Consider piping the grep output into the read:
declare -a cmd_array   ### declaring an array
### Read each line of the file that doesn't start with '#' and keep it in the array
grep -vE '^(\s*$|#)' < "$text_file" | while read -r -a cmd_array
do
    "${cmd_array[@]}"
done
I'm not clear about the output/logging comment. If you need the output appended to a file in addition to stdout/console, consider using 'tee' (probably 'tee -a').
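For example, a minimal sketch, assuming $logfile is already set as in the script above:
# Append all stdout/stderr to the log file while still printing to the console.
exec > >(tee -a "$logfile") 2>&1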
I tested with the input file inputfile
echo a
Function_1 '125' '' ''
# skip me
Function_1 '123' '' ''
echo b
and wrote this script:
declare -a cmd_array   ### declaring an array
while read -r -a cmd_array
do
    echo "${cmd_array[@]}"
    "${cmd_array[@]}"
    echo
done < <(grep -vE '^(\s*$|#)' inputfile)
For showing output in log and console, see https://unix.stackexchange.com/a/145654/57293
As @GordonDavisson suggested in a comment, you get a similar result with
source inputfile
which ignores comments and empty lines and calls the functions, so I am not sure why you would want an array. This command can be included in your master script; you do not need to modify the inputfile.
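A minimal sketch of that master script, with a stand-in body for Function_1:
#!/bin/bash
Function_1() {
    echo "Function_1 called with: $1"   # stand-in for the real function body
}

text_file=$1
source "$text_file"   # lines starting with '#' and blank lines are simply ignored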
Another advantage of sourcing the input is the handling of multi-line input and # in strings:
Function_1 '123' 'this is the second parameter, the third will be on the next line' \
'third parameter for the Function_1 call'
echo "This echo continues
on the next line."
echo "Don't delete # comments in a string"
Function_1 '124' 'Parameter with #, interesting!' ''
I established a connection with a BLE device using gatttool. First I connected to the device with sudo gatttool -t random -b FF:3C:8F:22:C9:C8 -I followed by the connect command. After that I read the value of a specific characteristic with char-read-uuid 2d30c082-f39f-4ce6-923f-3484ea480596.
What I want to do is automate the whole process and put the latter command (querying for the value) in a loop, ideally saving (appending) each value to a text file. I tried something like
sudo gatttool -t random -b FF:3C:8F:22:C9:C8 -I <<EOF
connect
while[ 1 ]; do
char-read-uuid 2d30c082-f39f-4ce6-923f-3484ea480596 > output.txt
done
exit 1
EOF
but it does not help, since I am not even able to connect to the device (ideally there should be some delay between the first and the second command). Also after connecting, an interactive mode is enabled and the shell commands do not work there. I'd appreciate any clues on how to tackle this issue.
If gatttool writes prompts to stdout (and doesn't suppress them given non-TTY file descriptors), consider something like:
#!/usr/bin/env bash
case $BASH_VERSION in ''|[123].*|4.0.*) echo "ERROR: bash 4.1 or newer required" >&2; exit 1;; esac

exec {output_fd}>output.txt

prompt_re='[>] '
capture_re='^handle:.*value:.*$'

wait_for_prompt() {
    IFS= read -r line || return
    while ! [[ $line =~ $prompt_re ]]; do
        [[ $line =~ $capture_re ]] && printf '%s\n' "$line" >&$output_fd
        IFS= read -r line || return
    done
}

wait_for_prompt
echo connect
while wait_for_prompt; do
    echo "char-read-uuid 2d30c082-f39f-4ce6-923f-3484ea480596"
done
...saved as yourscript, and invoked using socat as:
socat 'SYSTEM:sudo gatttool -t random -b FF:3C:8F:22:C9:C8 -I 2>&1' 'EXEC:./yourscript'
(assuming that sudo is configured to work without a TTY; otherwise, you might move it to be sudo socat).
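For example, that alternative might look like this, with sudo moved onto socat as mentioned above:
sudo socat 'SYSTEM:gatttool -t random -b FF:3C:8F:22:C9:C8 -I 2>&1' 'EXEC:./yourscript'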
Indeed, pexpect works fine here. You can find my solution below. The code reads the value of the specific UUID, which contains IMU readings (floats).
import pexpect
import struct
import time
import sys

IMU_MAC_ADDRESS = "FF:3C:8F:22:C9:C8"
UUID_DATA = "2d30c082-f39f-4ce6-923f-3484ea480596"

if __name__ == '__main__':
    gatt = pexpect.spawn("gatttool -t random -b " + IMU_MAC_ADDRESS + " -I")
    gatt.sendline("connect")
    gatt.expect("Connection successful")
    while True:
        gatt.sendline("char-read-uuid " + UUID_DATA)
        gatt.expect("handle: 0x0011 value: ")
        gatt.expect(" \r\n")
        # gatt.before now holds the space-separated hex bytes of the value.
        data = bytes.fromhex(gatt.before.decode('utf-8').replace(" ", ""))
        print(struct.unpack('f', data)[0])
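If you also want to append each reading to a text file, as mentioned in the question, one option is simply to redirect the script's output when running it; read_imu.py and imu_values.txt are just hypothetical names here, and -u keeps Python's output unbuffered so the file grows as values arrive:
python3 -u read_imu.py >> imu_values.txt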
Running in a docker container, I am trying to determine if a table exists in the mssql database:
RESULT='/opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P $SA_PASSWORD -d master -i "SELECT name FROM master.sys.databases WHERE name = N'MyTable'"'
if [ "$RESULT" == "MyTable" ]; then
echo YES
fi
echo "$RESULT"
echo "$RESULT" always just outputs the entire command as a string, its not getting executed. So its just assigning it as a sting...
How can I execute it and assign the result?
Bash does not execute commands defined in single quotes; single quotes are used for string literals.
What you are trying to do is called command substitution, and it can be done in two different ways:
RESULT=`command`
RESULT=$(command)
So your example should look like this:
RESULT=`/opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P $SA_PASSWORD -d master -Q "SELECT name FROM master.sys.databases WHERE name = 'MyTable'"`
if [ "$RESULT" == "MyTable" ]; then
echo YES
fi
echo "$RESULT"
You can find more information regarding command substitution here:
https://www.gnu.org/software/bash/manual/html_node/Command-Substitution.html
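As a quick illustration of the difference, using date as a stand-in command:
RESULT='date'     # assigns the literal string date; nothing is executed
echo "$RESULT"    # prints: date
RESULT=$(date)    # runs the date command and captures its output
echo "$RESULT"    # prints the current date and time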
I have a heredoc that needs to call existing variables from the main script, and set its own variables to use later. Something like this:
count=0
ssh $other_host <<ENDSSH
if [[ "${count}" == "0" ]]; then
output="string1"
else
output="string2"
fi
echo output
ENDSSH
That doesn't work because 'output' doesn't get set to anything.
I tried using the solution from this question:
count=0
ssh $other_host << \ENDSSH
if [[ "${count}" == "0" ]]; then
output="string1"
else
output="string2"
fi
echo output
ENDSSH
It didn't work either. $output got set to "string2" because $count wasn't expanded.
How can I use a heredoc that expands variables from the parent script, and sets its own variables?
You can use:
count=0
ssh -t -t "$other_host" << ENDSSH
if [[ "${count}" == "0" ]]; then
output="string1"
else
output="string2"
fi
echo "\$output"
exit
ENDSSH
We use \$output so that it is expanded on the remote host, not locally.
It is better not to use stdin (such as by using here-docs) to pass commands to ssh.
If you use a command-line argument to pass your shell commands instead, you can better separate what is expanded locally and what will be executed remotely:
# Use a *literal* here-doc to read the script into a *variable*.
# Note how the script references parameter $1 instead of
# local variable $count.
read -d '' -r script <<'EOF'
[[ $1 == '0' ]] && output='zero' || output='nonzero'
echo "$output"
EOF
# The variable whose value to pass as a parameter.
# With value 0, the script will echo 'zero', otherwise 'nonzero'.
count=0
# Use `set -- '$<local-var>'...;` to pass the local variables as
# positional parameters, followed by the script code.
ssh localhost "set -- '$count'; $script"
You can escape the variables as @anubhava said, or, if you have too many variables to escape, you can do it in two steps:
# prepare the part which should not be expanded
# note the quoted 'EOF'
read -r -d '' commands <<'EOF'
if [[ "$count" == "0" ]]; then
echo "$count - $HOME"
else
echo "$count - $PATH"
fi
EOF
localcount=1
#use the unquoted ENDSSH
ssh me@nox.local <<ENDSSH
count=$localcount   # count=1
# the commands prepared above will be inserted here
$commands
ENDSSH
will print something like:
1 - /usr/bin:/bin:/usr/sbin:/sbin