Dynamic schema name in SQL based on database name (Oracle)

I have a DML statement (a simple update) that I need to execute in different Oracle environments (DEV, SIT, QA, PROD). At the moment I hard-code the schema name in an ALTER SESSION command before running the DML, so I have to maintain four different scripts across the environments.
Is there a way to get the name of the database the script is running on, so I can use an IF/ELSE condition on that name to choose the schema and assign it to the ALTER SESSION command?

You can query the name from the v$database view. How you translate that to an alter session command depends on your client and how you're connecting.
For example, in SQL*Plus you could do:
column x_schema new_value y_schema noprint
set termout off
select case name
when 'DEVDB' then 'DEVSCHEMA'
when 'PRODDB' then 'PRODSCHEMA'
...
end as x_schema
from v$database;
alter session set current_schema = &y_schema;
set termout on
The first query uses the DB name to determine a schema name; the COLUMN command makes the column alias x_schema available later as the substitution variable &y_schema, which is then used in the ALTER SESSION. The SET TERMOUT is optional, but it hides the query output when this is run as part of a script (though in that case the NOPRINT is somewhat redundant).
You could base your case on global_name.global_name if you prefer, or even sys_context('USERENV', 'SERVICE_NAME').
Of course, you could also just pass the schema name in as a positional parameter when you run the DML script instead. It's unusual, I think, for the schema name to be different for the same application in different databases, but this should work if that is the situation you have.
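The positional-parameter alternative mentioned above could look something like this; the script name, schema name, and DML are all hypothetical:

```sql
-- run_update.sql: takes the target schema as the first argument.
-- Invoked as: @run_update.sql DEVSCHEMA
alter session set current_schema = &1;

update some_table            -- hypothetical DML
set    some_col = 'value'
where  some_id  = 1;

commit;
```

This keeps one script for all environments; the caller (or the scheduler) supplies the schema name per environment.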

To get the current schema name:
select sys_context('USERENV', 'CURRENT_SCHEMA') from dual;
(select user from dual returns the connected user, which can differ from the current schema after an ALTER SESSION SET CURRENT_SCHEMA.)
Is this what you're after?

Related

SQL*Plus: Providing values of bind variables at run time like in Toad/SQL-Developer

Dear Techies,
Our application triggers queries like below very frequently.
select name,emp_id,prod_id,prod_name, .... from appuser.table where emp_id=:1 and prod_id=:2;
We usually spend a fair amount of time finding the SQL_ID when we receive the problematic SQL_TEXT. SQL*Plus has an option for named bind variables like emp_id, prod_id, etc., as below:
VARIABLE emp_id NUMBER; EXEC :emp_id := 101;
However, our bind variables are named :1, :2, which can't be set before running the SQL because the names are mere numbers (although Oracle treats them as bind variables). We can't ask the application vendor to rebuild all the queries to remove these numbered bind variables.
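For reference, the supported named-bind pattern mentioned above, expanded into a small SQL*Plus snippet (the table and column names are hypothetical):

```sql
-- Named binds work because the name is an identifier;
-- SQL*Plus's VARIABLE command cannot declare a bind named :1 or :2.
VARIABLE emp_id NUMBER
VARIABLE prod_id NUMBER
EXEC :emp_id := 101;
EXEC :prod_id := 500;

SELECT name, emp_id, prod_id, prod_name
FROM   appuser.some_table   -- hypothetical
WHERE  emp_id = :emp_id
AND    prod_id = :prod_id;
```

Note that renaming :1 to :emp_id changes the SQL text, and therefore the SQL_ID, which is exactly the limitation the question is about.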
So, I was looking for any of the below options in SQL*Plus:
How to declare/define such bind variables (:1, :2, etc.) before running the SQL?
Can we bypass the bind values in any way and still send this SQL to the cursor cache in Oracle? This looks difficult, but I still wanted to ask.
Can we pass the values of these bind variables (:1, :2, etc.) at runtime, as we do in Toad and SQL Developer? That way we can track the correct SQL_ID from the cursor cache (v$sql).
I have been trying and searching through various options but didn't find anything specific to my case. Any help in this regard would be greatly appreciated; an answer for any version of Oracle Database would be fine.

How to find the query used for creation of a temporary table in Oracle SQL Developer

I created a temporary table in Oracle SQL Developer but forgot to save the script, and now I want to reuse the query but don't remember the code. Is there a way to get the query that was used to create the temp table?
You can use dbms_metadata.get_ddl()
select dbms_metadata.get_ddl('TABLE', 'YOUR_TABLE_NAME_HERE')
from dual;
The result is a CLOB with the complete DDL. You might need to adjust the display in SQL Developer to make the content of that value fully visible. (I don't use SQL Developer, so I don't know if that is necessary and, if so, what you would need to do.)
Edit:
It seems SQL Developer can't display the result of this query properly unless you use the "Run Script" option. And with that you need to use a SET LONG 60000 (or some other big number) before you run it, to see the complete source code:
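Putting those pieces together, the script-mode version might look like this (the table and schema names are placeholders):

```sql
-- Run with SQL Developer's "Run Script" option (F5) so the full CLOB is displayed
SET LONG 60000
SELECT dbms_metadata.get_ddl('TABLE', 'YOUR_TABLE_NAME_HERE', 'YOUR_SCHEMA_HERE')
FROM   dual;
```

The optional third argument to get_ddl names the owning schema; if you omit it, the call looks in the current schema.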

Use a database value in SQLPLUS as a parameter to a batch file

I need to run a batch file and pass in a database value. Depending on the day, a different value needs to be passed into the DOS command. This particular business rule is defined in a view in the database.
Our primary scheduling tool is DataStage, which runs scripts in SQL*Plus, so this is my preferred method for scheduled activities. I could also run a batch file directly that calls SQL*Plus and gets a value, but setting that up is much more hassle, so I'd prefer not to.
I'm a SQLPLUS amateur.
The SQL script I'm passing into SQL*Plus is below. The problem I'm having is that it passes the literal text :sTriggerName to the batch file, not its value.
set echo on
set feedback on
set termout on
VAR sTriggerName VARCHAR2(100)
SELECT TRIGGERNAME INTO :sTriggerName
FROM SCHEMA.VIEW
WHERE CALENDAR_DATE = TRUNC(SYSDATE) AND ROWNUM < 2;
/
HOST "E:\CallScheduleTrigger.CMD :sTriggerName."
quit
In the example above I am using a bind variable.
This link showed me how to load a substitution variable from the database and use that instead:
http://www.oracle.com/technetwork/testcontent/sub-var9-086145.html
To load a database value into a substitution variable called sTriggerName:
COLUMN TRIGGERNAME new_value sTriggerName
SELECT TRIGGERNAME FROM SCHEMA.VIEW WHERE CALENDAR_DATE = TRUNC(SYSDATE) AND ROWNUM < 2;
To use this substitution variable in a host command (i.e. as a parameter to a batch file):
HOST "E:\CallScheduleTrigger.CMD &sTriggerName"
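Putting the two pieces together, the whole SQL*Plus script might look like this (view and batch-file names are taken from the question; the SET TERMOUT lines are an optional addition to suppress the query output):

```sql
SET TERMOUT OFF
COLUMN TRIGGERNAME NEW_VALUE sTriggerName
SELECT TRIGGERNAME
FROM   SCHEMA.VIEW
WHERE  CALENDAR_DATE = TRUNC(SYSDATE) AND ROWNUM < 2;
SET TERMOUT ON
HOST "E:\CallScheduleTrigger.CMD &sTriggerName"
QUIT
```

The COLUMN ... NEW_VALUE line is what copies the query result into the substitution variable, so &sTriggerName expands to the value before the HOST command runs.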

Oracle build-automation?

I have created a number of table, function, view, and procedure scripts to support reporting. Due to the complicated environment, migrating scripts (development --> testing) can be a chore.
The DBA does not allow the developers to use the primary tablespace ('VENDOR'), nor either of the main schemas ('UTIL', 'REPORTING'). The UTIL schema is intended for functions and procedures; REPORTING is for tables and views.
Because the development server is often recommissioned for other purposes, development is done on the testing server, using a development tablespace ('DEVL') and a schema for each developer ('CRAIG', for example).
As a result, a table's script must be converted from:
DROP TABLE CRAIG.X_TABLE;
CREATE TABLE CRAIG.X_TABLE
...
TABLESPACE "DEVL";
to:
DROP TABLE REPORTING.X_TABLE;
CREATE TABLE REPORTING.X_TABLE
...
TABLESPACE "VENDOR";
A view's script must be changed from:
CREATE OR REPLACE VIEW CRAIG.X_VIEW
...
;
to:
CREATE OR REPLACE VIEW REPORTING.X_VIEW
...
;
A procedure's script must be changed from:
CREATE OR REPLACE PROCEDURE CRAIG.X_PROCEDURE
...
INSERT INTO CRAIG.X_TABLE
SELECT ...
-- reference a table in REPORTING schema
FROM REPORTING.ANOTHER_TABLE
;
to:
CREATE OR REPLACE PROCEDURE UTIL.X_PROCEDURE
...
INSERT INTO REPORTING.X_TABLE
SELECT ...
FROM REPORTING.ANOTHER_TABLE
;
The table and procedure scripts require the most intervention, as you can see.
If it makes a difference, I use SQL Developer, TextMate, and Sublime Text 2 for coding and Cornerstone to interact with our organization's Subversion (SVN) repository.
Is there a way to simplify (i.e. automate) the changes that I need to each type of script as I migrate the logic from the development environment to the testing one?
I would connect as the schema owner (not sure if you're implying you're connecting as one user and building objects in a different schema?), i.e. don't qualify the table names etc. at all, and give that user a suitable default tablespace. Then the scripts don't need to specify either. Maybe I'm missing something?
If you really want to specify them, you can prompt for and accept the values at the start of the script and use substitution variables:
accept schma char prompt 'Enter schema: '
accept tbspc char prompt 'Enter tablespace: '
create table &&schma..my_table (...) tablespace &&tbspc;
etc.
If there are a limited number of scenarios you could maybe set the values automatically based on the database name, assuming different environments are in different instances:
column q_schma new_value schma
column q_tbspc new_value tbspc
select case name when 'TEST_NAME' then 'CRAIG' else 'REPORTING' end as q_schma,
       case name when 'TEST_NAME' then 'DEVL' else 'VENDOR' end as q_tbspc
from v$database;
create table &&schma..my_table (...) tablespace &&tbspc;
You could also change your default schema to avoid the prefixes:
alter session set current_schema = &schma;
create table my_table (...) tablespace &&tbspc;
Another approach might be to use placeholders in the checked-in code, and run the code through sed or similar to put the real values in.
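The placeholder-and-sed idea might be sketched like this; the token names, file names, and substituted values are all just examples:

```shell
# Scripts are checked in with tokens like @SCHEMA@ and @TABLESPACE@;
# sed substitutes the real values for the target environment before
# the script is run against the database.
cat > x_table.sql <<'EOF'
DROP TABLE @SCHEMA@.X_TABLE;
CREATE TABLE @SCHEMA@.X_TABLE (id NUMBER)
TABLESPACE "@TABLESPACE@";
EOF

# Produce the testing-environment version of the script
sed -e 's/@SCHEMA@/REPORTING/g' -e 's/@TABLESPACE@/VENDOR/g' \
    x_table.sql > x_table.test.sql

cat x_table.test.sql
```

The same source file can then be run through sed with CRAIG/DEVL for a developer build, so only the checked-in copy is ever edited.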

Explain Plan for Query in a Stored Procedure

I have a stored procedure that consists of a single select query used to insert into another table based on some minor math that is done to the arguments in the procedure. Can I generate the plan used for this query by referencing the procedure somehow, or do I have to copy and paste the query and create bind variables for the input parameters?
Use SQL Trace and TKPROF. For example, open SQL*Plus and issue the following:
alter session set tracefile_identifier = 'something-unique';
alter session set sql_trace = true;
alter session set events '10046 trace name context forever, level 8';
select 'right-before-my-sp' from dual;
exec your_stored_procedure
alter session set sql_trace = false;
Once this has been done, look in your database's trace directory (UDUMP on older versions; on 11g and later, the ADR trace directory shown by v$diag_info) for a TRC file with "something-unique" in the filename. Format this TRC file with TKPROF, then open the formatted file and search for the string "right-before-my-sp". The SQL command issued by your stored procedure should appear shortly after that section, and immediately under that SQL statement will be its plan.
Edit: For the purposes of full disclosure, I should thank all those who gave me answers on this thread last week that helped me learn how to do this.
From what I understand, this was done on purpose. The idea is that individual queries within the procedure are considered separately by the optimizer, so EXPLAIN PLAN doesn't make sense against a stored proc, which could contain multiple queries/statements.
The current answer is NO, you can't run it against a proc, and you must run it against the individual statements themselves. Tricky when you have variables and calculations, but that's the way it is.
Many tools, such as Toad or SQL Developer, will prompt you for the bind variable values when you execute an explain plan. You would have to do so manually in SQL*Plus or other tools.
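In SQL*Plus, the manual version of that could look something like this; the table, column, and bind names are placeholders, and note that EXPLAIN PLAN treats bind values as unknown, so the plan it shows may differ from the one chosen at runtime with peeked binds:

```sql
-- Declare and set a bind variable standing in for the procedure's parameter
VARIABLE p_emp_id NUMBER
EXEC :p_emp_id := 101;

-- Explain the query copied out of the procedure body
EXPLAIN PLAN FOR
SELECT *
FROM   some_table
WHERE  emp_id = :p_emp_id;

-- Show the plan just generated
SELECT * FROM TABLE(dbms_xplan.display);
```
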
You could also turn on SQL tracing and execute the stored procedure, then retrieve the explain plan from the trace file.
Be careful that you do not just retrieve the explain plan for the SELECT statement. The presence of the INSERT clause can change the optimizer goal from first rows to all rows.
