Moving data between different servers in Oracle

I'm new to Oracle, and I am working on moving specific data from a DB on one server to the DB on another server.
The two DBs have the same schema, but I want to pull specific columns, referenced by their keys, and move the data to the other server. I'm trying to figure out the best plan of attack for this.
A command-line method is preferred, so I can just type in the key of the data I want moved. Is it perhaps possible to accomplish this with a PL/SQL script?
Thanks.

Assuming that you can create network connections between the two databases, the simplest option would be to create a database link between them, i.e.
CREATE DATABASE LINK to_b
  CONNECT TO username_on_b
  IDENTIFIED BY password
  USING 'tns_alias_for_b';
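A quick sanity check that the link works (assuming the link name above):
SELECT * FROM dual@to_b;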
You could then use that database link to query data from database B, i.e.
INSERT INTO table_name( list_of_columns )
SELECT list_of_columns
FROM table_name@to_b
WHERE primary_key_value = <<some value>>;
That can be either a straight SQL statement, part of a PL/SQL procedure, or part of a SQL*Plus script. A PL/SQL procedure would look like this:
CREATE OR REPLACE PROCEDURE move_row_from_b(
  p_key_value IN table_name.primary_key%type
)
AS
BEGIN
  INSERT INTO table_name( list_of_columns )
  SELECT list_of_columns
  FROM table_name@to_b
  WHERE primary_key_value = p_key_value;
END move_row_from_b;
which can be invoked either via EXEC from SQL*Plus or via an anonymous PL/SQL block
SQL> exec move_row_from_b( 23 );
BEGIN
  move_row_from_b( 23 );
END;
/
Or you could write a SQL*Plus script:
-- ACCEPT defines a SQL*Plus substitution variable, referenced as &key_value
accept key_value number prompt 'Enter key: '
INSERT INTO table_name( list_of_columns )
SELECT list_of_columns
FROM table_name@to_b
WHERE primary_key_value = &key_value;

Related

How can I use the COPY command in a stored procedure in Oracle

I have a problem with this stored procedure; I get an error when I use the COPY command:
create or replace PROCEDURE COPY_PROCEDURE AS
BEGIN
COPY FROM USER/PASS@DB1 TO USER/PASS@DB2 REPLACE TABLE1 USING select * from TABLE2;
END;
The COPY command is a niche facility supported by only a few Oracle client tools, such as SQL*Plus. It is not integrated into the Oracle backend (RDBMS), so it cannot be used inside a stored procedure.
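For reference, this is roughly how the command is meant to be used, directly at a SQL*Plus prompt rather than inside PL/SQL (the connect strings are placeholders, and the trailing - is the SQL*Plus line-continuation character):
COPY FROM user/pass@db1 TO user/pass@db2 -
REPLACE table1 -
USING select * from table2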
create table cloneA as select * from A@dblink;
is a much cleaner way to copy a table from one DB instance to the other.

Oracle - using a synonym as a variable

I need to use a synonym as a variable in a block. I have two different schemas with the same tables in them, and a job that switches between the schemas, making one of them active. Now I want to write a block that checks which schema is active via ALL_SYNONYMS and uses the result as part of a query.
Here is an example:
DECLARE
OWNER VARCHAR2(15);
BEGIN
SELECT TABLE_OWNER
INTO OWNER
FROM ALL_SYNONYMS
WHERE TABLE_NAME = 'MY_TABLE1';
SELECT *
FROM OWNER.MY_TABLE2 ;
END;
But I'm getting ORA-06550 (table or view does not exist), and when I run the query itself with the value taken from ALL_SYNONYMS, it returns a result.
Any idea how to fix this?
Thanks
You are attempting to use synonyms incorrectly. Synonyms are used precisely so you do not need to know which schema is active. According to the documentation:
Synonyms provide both data independence and location transparency. Synonyms permit applications to function without modification regardless of which user owns the table or view and regardless of which database holds the table or view.
You just use the synonym instead of the object itself.
create table evens( id integer generated always as identity
, val integer
) ;
create table odds( id integer generated always as identity
, val integer
) ;
insert all
when mod(val,2) = 0 then into evens(val)
when mod(val,2) = 1 then into odds(val)
select level val
from dual connect by level <= 10;
-- create the synonym then use it in Select;
create or replace synonym current_even_odd for evens;
select * from current_even_odd;
-- now change the synonym, then run the EXACT same query.
create or replace synonym current_even_odd for odds;
select * from current_even_odd;
In this case it is not quite "without modification": you still need to change the synonym. But it seems you are doing that already.
Note: You cannot create a synonym for a schema but must point it to a specific object.
I attempted a db<>fiddle for the above, but it appears it is having problems at the moment.
I agree with Belayer that the synonym should provide a layer of abstraction on your tables and your procedure shouldn't need to know what the schema is. But the "table or view does not exist" error is likely an issue related to privileges and definer's rights versus invoker's rights.
To directly reference an object in a procedure, the procedure's schema must have a direct grant on the table. However, an ad hoc query only needs a role with privileges on the object. This is why the SQL works in your IDE but not in the procedure. Ensure the code that modifies objects and switches synonyms grants privileges both to roles and directly to schemas.
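For example, the switching job would need to issue grants along these lines (all names here are hypothetical):
-- schema1 owns the table, proc_owner owns the procedure,
-- app_role is the role used by ad hoc IDE sessions
grant select on schema1.my_table1 to app_role;   -- enough for ad hoc queries
grant select on schema1.my_table1 to proc_owner; -- needed by the procedure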
If direct grants are not possible, you will need to modify the procedure to use AUTHID CURRENT_USER and change the SQL statements to use dynamic SQL - which can be a huge pain. For example:
create or replace procedure test_procedure authid current_user is
v_count number;
begin
execute immediate
q'[
select count(*)
from some_table
]'
into v_count;
end test_procedure;
/
If you really do need to manually switch between schemas, then you may want to consider using something like execute immediate 'alter session set current_schema=schema1'; in the procedure and using dynamic SQL for all of the querying.
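A rough sketch of that approach, assuming a private synonym MY_TABLE1 owned by the calling user (as in the question) and a table MY_TABLE2 in the active schema:
create or replace procedure query_active_schema authid current_user is
  v_owner all_synonyms.table_owner%type;
  v_count number;
begin
  -- find the schema the synonym currently points to
  select table_owner
    into v_owner
    from all_synonyms
   where synonym_name = 'MY_TABLE1'
     and owner = user;

  -- point the session at the active schema
  -- (dbms_assert.schema_name guards against injection)
  execute immediate 'alter session set current_schema = '
                    || dbms_assert.schema_name(v_owner);

  -- all subsequent queries must then be dynamic SQL
  execute immediate 'select count(*) from my_table2' into v_count;
  dbms_output.put_line(v_owner || ': ' || v_count);
end query_active_schema;
/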

How to call an Oracle stored procedure from Azure Data Factory v2

My requirement is to copy data from Oracle to SQL Server. Before copying from the Oracle database, I need to update the Oracle table using a procedure which has some logic.
How do I execute an Oracle stored procedure from Azure Data Factory?
I referred to this thread
If I use EXECUTE PROC_NAME (PARAM); in the pre-copy script, it fails with the following error:
Failure happened on 'Source' side.
ErrorCode=UserErrorOdbcOperationFailed,
Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException
Message=ERROR [42000] [Microsoft][ODBC Oracle Wire Protocol driver]
[Oracle]ORA-00900: invalid SQL statement
Source=Microsoft.DataTransfer.ClientLibrary.Odbc.OdbcConnector,
Type=System.Data.Odbc.OdbcException
Message=ERROR [42000] [Microsoft][ODBC Oracle Wire Protocol driver]
[Oracle]ORA-00900: invalid SQL statement,Source=msora28.dll
Could anyone help on this?
Note: I am using a self-hosted integration runtime for Data Factory
thanks!!
I used a Lookup activity with a SELECT statement against the DUAL table. Since stored procedures cannot be called from a SELECT statement, I created an Oracle function, and the function calls the stored procedure. The function returns a value, and this value is received by the Lookup activity.
When you define the function, you have to add PRAGMA AUTONOMOUS_TRANSACTION. This is because, by default, Oracle does not allow DML to be executed from within a SELECT statement, so you need to declare that the DML performed by the stored procedure runs as an autonomous transaction.
-- Table
CREATE TABLE empleados(
  emp_id NUMBER(9),
  nombre VARCHAR2(100),
  CONSTRAINT empleados_pk PRIMARY KEY(emp_id)
);
create or replace procedure insert_empleado (numero in NUMBER, nombre in VARCHAR2) is
begin
INSERT INTO empleados (emp_id, nombre)
Values(numero, nombre);
COMMIT;
end;
create or replace function funcinsert_empleado (numero in NUMBER, nombre in VARCHAR2)
return VARCHAR2
is
PRAGMA AUTONOMOUS_TRANSACTION;
begin
insert_empleado (numero, nombre);
return 'done';
end;
-- statement used in the query of the Lookup activity
SELECT funcinsert_empleado ('1', 'Roger Federer')
FROM DUAL;
This example is written up (in Spanish) at https://dev.to/maritzag/ejecutar-un-stored-procedure-de-oracle-desde-data-factory-2jcp
In Oracle, EXECUTE X(Y) is a SQL*Plus-specific command shortcut for the PL/SQL statement BEGIN X(Y); END;. Since you are not using SQL*Plus, try the BEGIN/END syntax.
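So, for the failing pre-copy script from the question, the equivalent should be (keeping the question's PROC_NAME and PARAM placeholders):
BEGIN
  PROC_NAME(PARAM);
END;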
In case you only want to execute the DML query using Azure Data Factory, without a procedure on the Oracle database:
I have another solution, where you can use the Copy activity with the sink's pre-copy script feature instead of a Lookup activity.
For this approach, just follow the steps below.
Keep both the source table and the sink table the same (say, table A), using the same linked service.
In the sink, use the pre-copy script feature and put in the DML (INSERT/UPDATE/DELETE) query that you want to perform on table B. (This table does not need to be the same as table A.)
If you want to avoid copying data into the same table, you can choose the query option in the source and provide a WHERE clause that will never be satisfied, so no data is actually copied (see the sketch after this list).
Or you can create a temp table with one column and one row.
I have tested both options and they work. The good part of this solution is that you avoid creating and maintaining a procedure or function.
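For instance, a minimal sketch of the source query plus pre-copy script pair (table and column names here are hypothetical):
-- Copy activity source (query option): a predicate that never matches,
-- so the dummy copy moves no rows
SELECT * FROM table_a WHERE 1 = 0

-- Sink pre-copy script: the DML you actually wanted to run
UPDATE table_b SET status = 'PROCESSED' WHERE status = 'NEW'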

How can we use Oracle private temporary tables in a PL/SQL block?

I see that the concept of a temporary table in Oracle is quite different from other databases like SQL Server. In Oracle, we have the concept of a global temporary table: we create it only once, and each session fills it with its own data, which is not how other databases work.
In 18c, Oracle has introduced the concept of private temporary tables, which can be dropped after use like in other databases. But how do we use one in a PL/SQL block?
I tried using dynamic SQL (EXECUTE IMMEDIATE), but it is giving me a "table must be declared" error. What do I do here?
But how do we use it in a PL/SQL block?
If what you mean is, how can we use private temporary tables in a PL/SQL program (procedure or function) the answer is simple: we can't. PL/SQL programs need to be compiled before we can call them. This means any table referenced in the program must exist at compilation time. Private temporary tables don't change that.
The private temporary table is intended for use in ad hoc SQL work. It allows us to create a data structure we can use in SQL statements for the duration of a session, to make life easier for ourselves.
For instance, suppose I have a massive table of sales data - low level transactions - and my task is to investigate monthly trends. So I only need the total sales by month. Unfortunately, there is no materialized view providing this summary. I don't want to include the aggregating query in my select statements. In previous versions I would have had to create a permanent table (and had to remember to drop it afterwards) but in 18c I can use a private temporary table to stage my summary just for the session.
create private temporary table ora$ptt_sales_summary (
sales_month date
, total_value number )
/
insert into ora$ptt_sales_summary
select trunc(sales_date, 'MM')
, sum (qty*price)
from massive_sales_table
group by trunc(sales_date, 'MM')
/
select *
from ora$ptt_sales_summary
order by sales_month
/
Obviously we can write anonymous PL/SQL blocks in our session but let's continue assuming that's not what you need. So what is the equivalent of a private temporary table in a permanent PL/SQL program? Same as it's been for several versions now: a PL/SQL collection or a SQL nested table type.
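As a rough sketch (reusing the hypothetical massive_sales_table from above), the same monthly summary could be staged in a collection inside a PL/SQL program:
declare
  type t_month_total is record (
    sales_month date,
    total_value number );
  type t_summary is table of t_month_total;
  l_summary t_summary;
begin
  -- stage the monthly totals in memory instead of a temporary table
  select trunc(sales_date, 'MM'), sum(qty * price)
    bulk collect into l_summary
    from massive_sales_table
   group by trunc(sales_date, 'MM');

  for i in 1 .. l_summary.count loop
    dbms_output.put_line(to_char(l_summary(i).sales_month, 'YYYY-MM')
                         || ': ' || l_summary(i).total_value);
  end loop;
end;
/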
Private temporary tables (available from Oracle 18c) are dropped at the end of the session or transaction, depending on the definition of the PTT.
The ON COMMIT DROP DEFINITION option creates a private temporary table that is transaction-specific. At the end of the transaction, Oracle drops both the table definition and its data.
The ON COMMIT PRESERVE DEFINITION option creates a private temporary table that is session-specific. Oracle removes all data and drops the table at the end of the session.
You do not need to drop it manually. Oracle will do it for you.
CREATE PRIVATE TEMPORARY TABLE ora$ptt_temp_table (
......
)
ON COMMIT DROP DEFINITION;
-- or
-- ON COMMIT PRESERVE DEFINITION;
Example of ON COMMIT DROP DEFINITION (table is dropped after COMMIT is executed)
Example of ON COMMIT PRESERVE DEFINITION (table is retained after COMMIT is executed but it will be dropped at the end of the session)
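A minimal sketch of what the first example shows (the second behaves the same way, except the table survives until the session ends):
CREATE PRIVATE TEMPORARY TABLE ora$ptt_demo (id INT)
ON COMMIT DROP DEFINITION;

INSERT INTO ora$ptt_demo VALUES (1);
COMMIT;

-- the COMMIT ended the transaction, so the table definition is gone:
SELECT * FROM ora$ptt_demo;
-- ORA-00942: table or view does not exist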
Cheers!!
It works with dynamic SQL:
declare
cnt int;
begin
execute immediate 'create private temporary table ora$ptt_tmp (id int)';
execute immediate 'insert into ora$ptt_tmp values (55)';
execute immediate 'insert into ora$ptt_tmp values (66)';
execute immediate 'insert into ora$ptt_tmp values (77)';
execute immediate 'select count(*) from ora$ptt_tmp' into cnt;
dbms_output.put_line(cnt);
execute immediate 'delete from ora$ptt_tmp where id = 66';
cnt := 0;
execute immediate 'select count(*) from ora$ptt_tmp' into cnt;
dbms_output.put_line(cnt);
end;
Example here:
https://livesql.oracle.com/apex/livesql/s/l7lrzxpulhtj3hfea0wml09yg

Moving XML over a DBLink

I am trying to move some data over a dblink and one of the columns is an XMLType column. The code looks like this:
begin
delete from some_schema.some_remote_tab@src_2_trg_dblink;
INSERT INTO some_schema.some_remote_tab@src_2_trg_dblink(id, code, gen_date, xml_data)
SELECT id, code, gen_date, xml_data
FROM local_table;
end;
Oracle returns these errors:
ORA-02055: distributed update operation failed; rollback required
ORA-22804: remote operations not permitted on object tables or user-defined type columns
Some research on ORA-22804 shows that I am probably getting this error because of the XMLType column, but I am not sure how to resolve this.
(Oracle 10g)
We get ORA-22804 because every instance of a Type in our Oracle database has an OID, which is unique within the database. We cannot transfer that OID to another database; this has caused me grief before when trying to import schemas which have User-Defined Types. I hadn't realised that it also affected XMLType, but it is an Object so it is not surprising.
The solution is icky: you will have to unload the XML into text on your local database and then convert it back into XML in the remote database.
I don't have a distributed DB set-up to test this right now, but if you're lucky it may work:
INSERT INTO some_schema.some_remote_tab@src_2_trg_dblink(id, code, gen_date, xml_data)
SELECT id, code, gen_date, xmltype ( xml_data.asClobVal() )
FROM local_table;
If the asClobVal() method doesn't work you may need to use the SQL function XMLSERIALIZE() instead.
XMLSerialize(DOCUMENT xml_data AS CLOB)
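i.e., the same insert would then look something like this (untested, same caveat as above):
INSERT INTO some_schema.some_remote_tab@src_2_trg_dblink(id, code, gen_date, xml_data)
SELECT id, code, gen_date,
       xmltype( xmlserialize(DOCUMENT xml_data AS CLOB) )
FROM local_table;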
If you're really unlucky you won't be able to do this in a single SQL statement, and you'll have to solve it using PL/SQL. To a certain extent this will depend on which version of the database you are using; the more recent the version, the more likely you'll be able to do it in SQL rather than PL/SQL.
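A rough sketch of the PL/SQL fallback, serializing each document to a CLOB locally before sending it across the link (again untested, and version-dependent):
begin
  for r in (select id, code, gen_date, xml_data from local_table) loop
    insert into some_schema.some_remote_tab@src_2_trg_dblink
      (id, code, gen_date, xml_data)
    values (r.id, r.code, r.gen_date,
            -- serialize locally, rebuild the XMLType remotely
            xmltype(r.xml_data.getclobval()));
  end loop;
  commit;
end;
/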
Try to do this the other way around. That is, log into the remote DB, create a dblink back to the local DB, and do an insert like this:
INSERT INTO remote_schema.some_remote_tab(id, code, gen_date, xml_data)
SELECT id, code, gen_date, xml_data
FROM local_table@dblink_to_local_db;
Instead, perform a data PULL.
Create the data-pull procedure at the remote database B.
Create synonyms and provide grants to the dblink user.
Call the remote procedure from database A (the source), then perform a commit at database A.
(Meanwhile, wait for Oracle to come up with some way to PUSH XML over a dblink in the future.)
Create a procedure at the remote site, database B:
CREATE OR REPLACE PROCEDURE PR_REMOTE(OP_TOTAL_COUNT OUT NUMBER) IS
BEGIN
INSERT /*+ DRIVING_SITE(src) */
INTO REMOTE_TABLE TGT_B
(XMLDATA_COL)
SELECT SRC.XMLDATA FROM LOCAL_TABLE@TGT2SRC_DBLINK SRC;
OP_TOTAL_COUNT := SQL%ROWCOUNT;
END;
Call the procedure from Database A
DECLARE
V_COUNT NUMBER := 0;
BEGIN
PR_REMOTE(V_COUNT);
COMMIT;
END;
I was facing the same issue with a heterogeneous DB link to SQL Server.
Ended up using xmltype.getStringVal() to insert into a VARCHAR column on the SQL Server side, as the data was under 4000 characters.
There is also xmltype.getClobVal() if over 4000 characters but I haven't tested it.
The "xml->text->xml" chain might be complicated, but it could help in some cases (for example, when inserting is not an option and only updating is).
You can try with "n" pieces of VARCHAR columns (in the destination table or in a different one, perhaps in a different schema on the remote DB), where "n" is:
ceil(max(dbms_lob.getlength(MyXmlColumn)) / 4000)
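For example, a quick way to compute that "n" (note that dbms_lob.getlength() needs a CLOB, so the XMLType is serialized first; the table alias is required to call the method):
select ceil(max(dbms_lob.getlength(t.MyXmlColumn.getclobval())) / 4000) as n
  from LocalSchema.MyTable t;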
Then you can transfer these fragments to remote temporary fields:
insert into RemoteSchema.MyTable(Id, XmlPart1, XmlPart2, ...)
select 1 /*some Id*/,
       dbms_lob.substr(t.MyXmlColumn.getclobval(), 4000, 1),
       dbms_lob.substr(t.MyXmlColumn.getclobval(), 4000, 4001),
       ...
from LocalSchema.MyTable t;
XmlType can be re-composed from fragments like this:
create or replace function concat_to_xml(p_id number)
return xmltype
is
xml_lob clob;
xml xmltype;
begin
dbms_lob.createtemporary(xml_lob, true);
for r in (select XmlPart1, XmlPart2, ... from RemoteSchema.MyTable where Id = p_id)
loop
if r.XmlPart1 is not null then
dbms_lob.writeappend(xml_lob, length(r.XmlPart1), r.XmlPart1);
end if;
if r.XmlPart2 is not null then
dbms_lob.writeappend(xml_lob, length(r.XmlPart2), r.XmlPart2);
end if;
...
end loop;
xml := xmltype(xml_lob);
dbms_lob.freetemporary(xml_lob);
return xml;
end;
Finally, use the result to update any other table in the remote schema, like:
update RemoteSchema.MyTable2 t2 set t2.MyXmlColumn = concat_to_xml(1 /*some Id*/);
