Attempting to connect to an Oracle database from a shell script

I am trying to connect to an Oracle database from a shell script (I am a new user). The script should then run a query and store the result in a variable called canadacount. I have written the code below, but it does not work.
#this script will attempt to connect to a remote database CFQ143 with user ID 'userid' and password 'password'.
#After logging in it will read data from the PLATFORMSPECIFIC table.
#We can pass a query 'select count (platform) from platformspecific where platform='CANADA';
#The result from this query will be passed to a variable called canadacount which we can then echo back to the user.
canadacount=`$ORACLE_HOME/bin/sqlplus -s /nolog <<EOF
connect userid/passsword@CFQ143:1521:CFQ143
set pages 0 feed off
select count (platform) from platformspecific where platform='CANADA';
exit
EOF`
echo $canadacount

The answer is:
I changed the connect line to the following:
connect userid/passsword@CFQ143
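For reference, here is a minimal corrected sketch of the whole script, assuming CFQ143 resolves as a TNS alias and the real credentials are substituted in; the result is captured with command substitution:

#!/bin/sh
# Minimal sketch: connect via the TNS alias and capture the single count value
canadacount=$($ORACLE_HOME/bin/sqlplus -s /nolog <<EOF
connect userid/password@CFQ143
set pages 0 feed off
select count(platform) from platformspecific where platform='CANADA';
exit
EOF
)
echo "canadacount: $canadacount"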

Related

How to format oracle connection string and output query to file

I am trying to connect to an Oracle database, query it, and send the results to a txt file. When I run my statement, the .txt file shows unexpected output rather than the values from my SQL script.
Here is the command I am running:
sql_file1=Cb.sql
sqlplus -s "username/pwd#(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(Host=my_host)(Port=1521))(CONNECT_DATA=(SERVICE_NAME=my_ser_name))))" #sql/$sql_file1 > /home/path/to/my/files/'cb.txt'
Any reason why my cb.txt file shows that output instead of any data from the query inside my SQL file?
You have an extra ) in your connection string:
(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(Host=my_host)(Port=1521))(CONNECT_DATA=(SERVICE_NAME=my_ser_name))))
should be
sqlplus -s "username/pwd#(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(Host=my_host)(Port=1521))(CONNECT_DATA=(SERVICE_NAME=my_ser_name)))" #sql/$sql_file1 > /home/path/to/my/files/cb.txt
But it's even easier to use an EZConnect string:
sqlplus -s "username/pwd#//my_host:1521/my_ser_name" #sql/$sql_file1

Oracle sqlplus execute sql script from command line while passing parameters to the sql script

I have a script, script.sql, which I want to execute from the command line using Oracle, passing it two parameter values, as shown below:
sqlplus user/pass @script.sql my_parameter1_value my_parameter2_value
What should script.sql contain so that it can be run with the parameter values?
The solution can be put together by looking at the Oracle blogs:
https://blogs.oracle.com/opal/sqlplus-101-substitution-variables#2_7
For the question above, the solution would be to create a script.sql like this:
DEFINE START_VALUE = &1;
DEFINE STOP_VALUE = &2;
SELECT * FROM my_table
WHERE value BETWEEN &&START_VALUE AND &&STOP_VALUE;
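A call with example values would then look like this (1 and 100 are just placeholder range values); SQL*Plus binds the first argument to &1 and the second to &2 before running the script:

sqlplus user/pass @script.sql 1 100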
I wanted to run a script that would return all orders raised during the last seven days. Here's how...
the script
SELECT * FROM orders_detail WHERE order_date BETWEEN '&1' AND '&2';
EXIT;
the command
sqlplus ot/Orcl1234@xepdb1 @"/opt/oracle/oradata/Custom Scripts/orders_between_dates.sql" $(date +%d-%b-%Y -d '-7 days') $(date +%d-%b-%Y)
Hope that helps someone. Good luck.

SQLPLUS connection to different dbs

Hello, I want to connect to the following DBs in a loop and execute statements on each:
conn support/support@sp0666to
conn support/support@sp0667to
conn support/support@sp0668to
Is there any way to do this in sqlplus?
Thank you for your answers in advance!
Create one script (doWork.sql) that contains the majority of what you want to do:
conn &1/&2#&3
select EMPLOYEE, AUTHORIZED, TIME, DAT, WORKSTATION
from EMPLOYEE
where status = 25;
In a separate script (goToWork.sql):
set lines 1500 pages 10000
set colsep ';'
set sqlprompt ''
set heading on
set headsep off
set newpage none
column tm new_value file_time noprint
select to_char(sysdate, 'DDMMYYYY_HH24.MI') tm from dual;
accept user
accept pass
spool C:\Users\NANCHEV\Desktop\parked.csv
##doWork &user &pass sp0666to
##doWork &user &pass sp0667to
##doWork &user &pass sp0668to
spool off;
exit
If you want separate files, then move the two spool commands to the doWork.sql file.
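For example, doWork.sql with its own spool per database might look like this sketch (building the file name from the &3 alias is an assumption; the double dot is needed because a single dot would be consumed as the substitution terminator). The two spool lines would then be removed from goToWork.sql:

conn &1/&2@&3
spool C:\Users\NANCHEV\Desktop\parked_&3..csv
select EMPLOYEE, AUTHORIZED, TIME, DAT, WORKSTATION
from EMPLOYEE
where status = 25;
spool off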
Assuming you want to run the same set of queries for each database, I'd create a script file (e.g. main_statements.sql) containing those statements.
Then, if the list of databases was static, I'd create a second script file (e.g. run_me.sql) in the same directory, with contents along the lines of:
connect &&user/&&password#db1
##main_statements.sql
connect &&user/&&password#db2
##main_statements.sql
connect &&user/&&password#db3
##main_statements.sql
...
If, however, the databases are static but the list is contained in a database somewhere, then I'd write a script (e.g. run_me.sql) that generates a script, something like:
set echo off
set feedback off
set verify off
spool databases_to_run_through.sql
select 'connect '||username||'/'||password||'@'||database_name||chr(10)||
'@@main_statements.sql'
from list_of_databases_to_query;
spool off;
@databases_to_run_through.sql
N.B. untested. Also, I have assumed that your table contains the usernames and passwords for each DB that needs to be connected to; if that's not the case, you'll have to work out how to handle them. Maybe they're all the same, in which case you can hardcode them, or better yet, use substitution variables (e.g. &&username) to avoid having to store them in a plain file; you'd then enter them at runtime.
You'll also need to run the script from the same directory, otherwise you could end up with the generated script not being created in the same directory as your main_statements.sql equivalent script.
Yes, it's possible: you can use an Oracle database link (DBLink) to reach each of the DBs from a single session, as sketched below.
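A sketch of the database-link approach, assuming you have the CREATE DATABASE LINK privilege and a TNS alias for each target database (the link name is illustrative):

-- run once in the database you stay connected to
create database link sp0666to_lnk connect to support identified by support using 'sp0666to';
-- then query the remote objects through the link
select EMPLOYEE, AUTHORIZED, TIME, DAT, WORKSTATION from EMPLOYEE@sp0666to_lnk where status = 25;

You would still need one link per target database, so this does not entirely remove the loop.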

Execute PL/SQL dynamically via SSH on a remote server (using dynamic login info) and get the returned value back on the executing server

I am trying to execute a dynamic PL/SQL command on a remote machine (say 192.168.x.x) and get its return value back to the local machine from which I initiate the shell script. I am testing two approaches, but neither of them seems to work properly.
Approach - I
In this approach, if I supply the values for ssh and sqlplus (login user/IP and schema user/password) via shell variables, it won't work. Working code with hardcoded values is below.
#!/bin/sh
varfld="SYSDATE"
retVal=`echo "SELECT $varfld FROM dual;" | ssh utility#192.168.x.x 'sqlplus -S utility/pwd' | tail -2 | head -1`
echo "return value 1: "$retVal
Approach - II
In this approach I can pass everything I need in variables, but the PL/SQL command (i.e. the value of the variable $ssh_execute_command) is not recognized inside sqlplus; only a hardcoded command gets executed.
P.S. This works fine with vsql (with a few modifications) when connecting to Vertica.
v_server_user=utility
v_server_name=192.168.x.x
ssh_v_schema_name=utility
ssh_v_schema_pwd=pwd
varfld="SYSDATE"
execute_command="SELECT $varfld FROM dual;"
retVal=$(ssh $v_server_user@$v_server_name ssh_v_schema_name=$v_schema_name ssh_v_schema_pwd=$v_schema_pwd ssh_execute_command=\""$execute_command"\" 'bash -s' <<SSHSQLTEXT
sqlplus -S $ssh_v_schema_name/$ssh_v_schema_pwd
SET ECHO OFF
SET HEADING OFF
SET FEEDBACK OFF
SELECT SYSDATE FROM dual;
SSHSQLTEXT
)
echo "return value 2: "$retVal
Queries
1. How can we make an SQL command passed in a variable work inside sqlplus? Or,
2. How can we pass the required values for ssh and sqlplus (login user/IP and schema user/password) dynamically and make it work?
The SQL statement "SELECT SYSDATE FROM dual;" used here is for testing purposes only. I will be calling a package function instead to get the return value, and that is yet to be tried in any of these scenarios. If anyone could address that too with an example, that would be great!
Thanks in advance.
You need to assign the result of sqlplus to a variable and echo that variable within the ssh session, so that the result is passed back into the local variable retVal. Note the escaped backticks and \$sqlresult below, which stop the local shell from expanding them before the command reaches the remote host. Try the following code:
retVal=$(ssh $v_server_user@$v_server_name ssh_v_schema_name=$v_schema_name ssh_v_schema_pwd=$v_schema_pwd ssh_execute_command=\""$execute_command"\" 'bash -s' <<SSHSQLTEXT
sqlresult=\`sqlplus -S $ssh_v_schema_name/$ssh_v_schema_pwd <<EOF
SET ECHO OFF
SET HEADING OFF
SET FEEDBACK OFF
SELECT SYSDATE FROM dual;
exit
EOF\`
echo \$sqlresult
SSHSQLTEXT
)
echo "return value 2: "$retVal
If you execute the above block of code, you will get SYSDATE as the result in $retVal.
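The SELECT SYSDATE line is only a placeholder; to get a value back from a package function instead (as the question intends), swap that one line inside the inner here-document, for example with a hypothetical function my_pkg.get_status:

SELECT my_pkg.get_status() FROM dual;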

Change passwords of many schemas under different databases without DBA privileges?

We have many schemas under different databases. I do not have DBA privileges. I do have privileges to log on to the schemas and to change passwords. We do this password change once every six months. Currently it is a manual and time-consuming process, i.e. log on to every schema under each database and change the password using the "password" command. When I do password changes I have two files - the current passwords and the new passwords.
I log on to each schema@database and issue the following command:
alter user schema_name identified by new_password replace old_password;
Remember, I don't have DBA privileges; I can only log on to the schemas using a username and password.
I thought about creating a shell script and using "expect" from the shell script, though first I am trying to find out whether there is an easier approach. Is there a simple way of doing this either from SQL*Plus or from PL/SQL?
It seems impossible to do this with PL/SQL or SQL*Plus alone, because with either of them you can only execute from a single schema at a time, and since you have no DBA privilege you cannot do it for all schemas; you would end up running a script after logging into each schema anyway.
You can write a shell script that reads the two files and, for each line, writes the following into a third file (a sketch of such a generator script is shown after the example output):
sqlplus -s username_1/old_password_1@oracle_instance <<EOF
alter user username_1 identified by new_password1 replace old_password_1;
exit
EOF
sqlplus -s username_2/old_password_2@oracle_instance <<EOF
alter user username_2 identified by new_password_2 replace old_password_2;
exit
EOF
.
.
.
sqlplus -s username_n/old_password_n@oracle_instance <<EOF
alter user username_n identified by new_password_n replace old_password_n;
exit
EOF
and so on
Once the third file is created, execute it after setting its permissions to -rwxrwxrwx.
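A sketch of such a generator, assuming old.txt holds lines of the form "username old_password instance" and new.txt holds the matching new password on each corresponding line:

#!/bin/sh
# pair each account line with its new password and emit one sqlplus block per account
paste -d ' ' old.txt new.txt | while read -r user old_pw instance new_pw
do
  cat <<BLOCK >> change_passwords.sh
sqlplus -s $user/$old_pw@$instance <<EOF
alter user $user identified by $new_pw replace $old_pw;
exit
EOF
BLOCK
done
chmod +x change_passwords.sh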
You need to script this.
Input: a list of TNS aliases, and a list of (schema_name, old_password, new_password).
Here is the script I use when I alter my account on multiple databases.
$ cat alterpassword.py
"""Update oracle database passwords for user by typing the old and new password once.
"""
import cx_Oracle
import getpass
username = 'bjarte'
connect_strings = ['DB1.SUPERSITE.COM',
'DB2.SUPERSITE.COM',
'DB3.SUPERSITE.COM',
'DB4.SUPERSITE.COM',
'DB5.SUPERSITE.COM',
'DB6.SUPERSITE.COM']
def alter_password(username, old_password, new_password, tnsalias):
    # connect with the old password and alter your own account; no DBA privilege needed
    connect_string = "%s/%s@%s" % (username, old_password, tnsalias)
    try:
        connection = cx_Oracle.connect(connect_string)
    except:
        return False
    try:
        cursor = connection.cursor()
        statement = "alter user %s identified by %s" % (username, new_password)
        cursor.execute(statement)
        return True
    except:
        return False
    finally:
        connection.close()

if __name__ == '__main__':
    print "Type in old password"
    old_password = getpass.getpass()
    print "Type in new password"
    new_password = getpass.getpass()
    for tnsalias in connect_strings:
        success = alter_password(username, old_password, new_password, tnsalias)
        if success:
            print "password altered for user %s in database %s" % (username, tnsalias)
        else:
            print "password alteration failed for user %s in database %s" % (username, tnsalias)
You can adjust this script to read its input from a file, and rewrite it in your favorite scripting language - bash, PHP, Perl, Python, Ruby or PowerShell.
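For instance, a bash sketch of the same idea that prompts once and reads the TNS aliases from a text file (the file name and prompts are assumptions):

#!/bin/bash
# change one user's password across every alias listed in tnsaliases.txt
read -p "Username: " username
read -s -p "Old password: " old_pw; echo
read -s -p "New password: " new_pw; echo
while read -r tnsalias
do
  sqlplus -s "$username/$old_pw@$tnsalias" <<EOF
alter user $username identified by "$new_pw" replace "$old_pw";
exit
EOF
  echo "attempted password change on $tnsalias"
done < tnsaliases.txt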
Side note: schema accounts are not application login accounts.
Schema accounts should always be locked (then there is no need for a password change).
When you need DDL changes in a specific schema, the DBA can unlock the account and give you a password. When done, lock the schema account again.
Schema accounts are special: they own the objects and can run DDL like "drop objecttype objectname". You most likely don't want your applications to have these powerful privileges.
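For whoever does hold the privilege, locking and unlocking is a one-liner (app_schema is a placeholder name):

alter user app_schema account lock;
-- ...and when DDL work is needed:
alter user app_schema account unlock;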
