I have a problem: I want to use a temp table in a stored procedure in SQL Server, which will be executed from an SSIS package.
I read some tips on how to do it and tried this one (first answer): Using Temp tables in SSIS
but it didn't work.
I have MS Visual Studio 2010; could the problem be with this version?
Here is my code in the stored procedure:
CREATE PROCEDURE some_procedure
AS
SET NOCOUNT ON
IF 1 = 0
BEGIN
SELECT CAST(NULL AS int) as number
END
CREATE TABLE #some_table (number int)
INSERT INTO #some_table VALUES (250)
SELECT number FROM #some_table
Thanks for any advice or experience.
Here is the error message from Visual Studio:
Error at Data Flow Task [OLE DB Source [1]]: SSIS Error Code
DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code:
0x80004005. An OLE DB record is available. Source: "Microsoft SQL
Server Native Client 11.0" Hresult: 0x80004005 Description: "The
metadata could not be determined because statement 'INSERT INTO
#some_table VALUES (250)' in procedure 'some_procedure' uses a temp table.".
Error at Data Flow Task [OLE DB Source [1]]: Unable to retrieve column
information from the data source. Make sure your target table in the
database is available.
In SQL Server 2012, if you use temporary tables you must specify a result set.
This is an issue with the sp_describe_first_result_set procedure that SSIS uses to return the output metadata.
E.g.
EXEC dbo.RptResults_StoredProcedure
Becomes
EXEC dbo.RptResults_StoredProcedure
WITH RESULT SETS
((
Date NVARCHAR(10),
Location VARCHAR(12),
Department CHAR(1),
Shift CHAR(1),
ForecastSales DECIMAL(18,2),
ActualSales DECIMAL(18,2)
))
For more information, see
http://blog.concentra.co.uk/2014/08/22/column-metadata-determined-correctly-ssis-data-flow-task-stored-procedure-inputs/
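Applied to the procedure from the question, a minimal sketch of the SQL command for the OLE DB Source would be the following (assuming some_procedure really returns only the single int column shown above):
EXEC some_procedure
WITH RESULT SETS
((
    number INT
));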
Instead of a temp table you can use a table variable or a CTE; these don't have the same metadata issue as a temp table.
CREATE PROCEDURE some_procedure
AS
SET NOCOUNT ON
DECLARE @some_table TABLE (number int)
INSERT INTO @some_table VALUES (250)
SELECT number FROM @some_table
Related
My requirement is to copy data from Oracle to SQL Server. Before copying from the Oracle database, I need to update the Oracle table using a procedure which has some logic.
How do I execute an Oracle stored procedure from Azure Data Factory?
I referred to this thread.
If I use EXECUTE PROC_NAME (PARAM); in the pre-copy script, it fails with the following error:
Failure happened on 'Source' side.
ErrorCode=UserErrorOdbcOperationFailed,
Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException
Message=ERROR [42000] [Microsoft][ODBC Oracle Wire Protocol driver]
[Oracle]ORA-00900: invalid SQL statement
Source=Microsoft.DataTransfer.ClientLibrary.Odbc.OdbcConnector,
Type=System.Data.Odbc.OdbcException
Message=ERROR [42000] [Microsoft][ODBC Oracle Wire Protocol driver]
[Oracle]ORA-00900: invalid SQL statement,Source=msora28.dll
Could anyone help with this?
Note: I am using a self-hosted integration runtime for Data Factory.
Thanks!!
I used a Lookup activity and a SELECT statement against the DUAL table, because stored procedures cannot be called from a SELECT statement. I created an Oracle function which calls the stored procedure; the function returns a value, and this value is received by the Lookup activity.
When you define the function, you have to add the PRAGMA AUTONOMOUS_TRANSACTION statement. This is because, by default, Oracle does not allow DML to be executed from a SELECT statement, so you have to declare that the DML in the stored procedure runs as an autonomous transaction.
-- Table
CREATE TABLE empleados(
emp_id NUMBER(9),
nombre VARCHAR2(100),
CONSTRAINT empleados_pk PRIMARY KEY(emp_id)
);
create or replace procedure insert_empleado (numero in NUMBER, nombre in VARCHAR2) is
begin
INSERT INTO empleados (emp_id, nombre)
Values(numero, nombre);
COMMIT;
end;
create or replace function funcinsert_empleado (numero in NUMBER, nombre in VARCHAR2)
return VARCHAR2
is
PRAGMA AUTONOMOUS_TRANSACTION;
begin
insert_empleado (numero, nombre);
return 'done';
end;
-- Statement used in the query of the Lookup activity
SELECT funcinsert_empleado ('1', 'Roger Federer')
FROM DUAL;
Here is an example in Spanish: https://dev.to/maritzag/ejecutar-un-stored-procedure-de-oracle-desde-data-factory-2jcp
In Oracle, EXECUTE X(Y) is a SQL*Plus-specific command shortcut for the PL/SQL statement BEGIN X(Y); END;. Since you are not using SQL*Plus, try the BEGIN/END syntax.
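For example, the pre-copy script could be written as an anonymous block like this (PROC_NAME and PARAM are just the placeholders from the question):
BEGIN
  PROC_NAME(PARAM);
END;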
In case you only want to execute the DML query from Azure Data Factory, without a procedure on the Oracle database:
There is another solution: use the Copy activity with the sink's pre-copy script feature instead of a Lookup activity.
For this approach just follow the steps below (a rough sketch is shown after them):
Keep both the source table and sink table the same (say, table A), using the same linked service.
In the sink, use the pre-copy script feature and put in the DML (INSERT/UPDATE/DELETE) query that you want to run against table B. (This table does not need to be the same as table A.)
If you want to avoid copying data into the same table, you can select the query option in the source part and provide a WHERE clause that is never satisfied, so no data is actually copied.
Or you can create a temp table with one column and one row.
I have tested both options and they work. The good part of this solution is that you avoid creating and maintaining a procedure or function.
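A rough sketch of the "never satisfied" variant; table_a, table_b and the column are made-up names for illustration:
-- Source query: the WHERE clause is never true, so no rows are copied
SELECT * FROM table_a WHERE 1 = 0
-- Sink pre-copy script: the DML you actually want to run against table B
UPDATE table_b SET last_refreshed = SYSDATE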
I need to use the Stored Procedure stage.
Currently I'm just putting together an example to learn how to use it correctly.
CREATE OR REPLACE PROCEDURE "STG"."TRUNC_TEST"
AS
BEGIN
execute immediate 'truncate table TESTSP';
END;
That's my example of a simple stored procedure.
My job design looks like this:
Oracle Connector 1=>Transformer=>Oracle Connector 2=>Stored Procedure Stage
Oracle Connector 1 does a SELECT, Oracle Connector 2 does an INSERT into TESTSP
My settings in the stored procedure stage
General : I've already put in all the credentials, with Transaction ISO set to None
Syntax
Procedure Name : TRUNC_TEST
Procedure Type : Transform (I've also tried changing it to Target)
Database Procedure Type : Stored Procedure
Generate procedure call (checked)
Parameters
Empty
Error Codes
Empty
NLS Map
Project Default UTF-8
Advanced
Execution mode :Default(Sequential)
Combinability : Default
Configuration file : default
In the Input tabs
General
Execute Procedure for each row (checked)
Transaction size : 0
Partitioning
Collector type Auto
Columns
Just brought all the columns from Oracle connector 2
Advanced
Default
The job shows a green line and succeeds, but the SP isn't working. It should have truncated the TESTSP table, but when I do a SELECT *, the data is still there.
Maybe my stored procedure is wrong, since I'm still learning how to write them? Or maybe there is something wrong with my settings in the Stored Procedure stage?
Every time you run your job, you are just creating or replacing the SP definition.
You still have to call the SP in order to execute it.
That is why you are able to execute it manually outside the job.
I suggest you create a generic SP in your database which accepts a table name as a parameter (see the sketch below).
Once this is created, use the SP stage to invoke the SP with the table you want to truncate.
You can also use the After SQL option in the Oracle connector to avoid the extra SP stage.
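A minimal sketch of such a generic procedure, assuming it lives in the STG schema like the original; the name TRUNC_ANY_TABLE and the DBMS_ASSERT guard are illustrative choices, not something the SP stage requires:
CREATE OR REPLACE PROCEDURE STG.TRUNC_ANY_TABLE (p_table_name IN VARCHAR2)
AS
BEGIN
  -- DBMS_ASSERT.SIMPLE_SQL_NAME rejects anything that is not a plain identifier
  EXECUTE IMMEDIATE 'truncate table ' || DBMS_ASSERT.SIMPLE_SQL_NAME(p_table_name);
END;
/
The SP stage (or the After SQL option) would then pass TESTSP as the parameter instead of hard-coding the table name.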
Whenever I try to fetch data from a remote table over a DB link and put it into a collection variable, I get an ORA-22804 error.
I don't have access to the remote database.
I tried recreating the type using an OID, but I'm still facing the same issue.
create or replace type type_demo as table of varchar2(32767);
CREATE OR REPLACE PROCEDURE PROC_TEST(EMP_ID IN TYPE_DEMO,E_NAME OUT TYPE_DEMO)
AS
BEGIN
SELECT EMP_NAME BULK COLLECT INTO E_NAME FROM EMP_TABLE@CDM_LINK WHERE EMPLOYEE_ID MEMBER OF EMP_ID;
END;
/
Whenever I test the above procedure, I get the ORA-22804 error:
DECLARE
A TYPE_DEMO:=TYPE_DEMO('1001');
B TYPE_DEMO:=TYPE_DEMO();
BEGIN
PROC_TEST(A,B);
FOR I IN 1..B.COUNT LOOP
DBMS_OUTPUT.PUT_LINE(B(i));
END LOOP;
END;
The error message says it all:
ORA-22804: remote operations not permitted on object tables or user-defined type columns
You're trying to do exactly that: a remote operation involving a user-defined type (type_demo) over a database link (CDM_LINK).
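To make the connection to the code explicit, here is the offending statement again with the two halves annotated:
SELECT EMP_NAME
BULK COLLECT INTO E_NAME
FROM EMP_TABLE@CDM_LINK               -- the remote operation over the DB link ...
WHERE EMPLOYEE_ID MEMBER OF EMP_ID;   -- ... combined with the local user-defined type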
Cheers
I'm working on adding a database project in VS2010. I created a SQL 2008 DB Project, pointed at the development server, and it seems to have generated all of the appropriate schema objects. However, all of the CREATE TRIGGER scripts have the following error:
SQL03120: Cannot find element referenced by the supporting statement
Googling that error message doesn't return much, and the results seem to point to scripts using ALTER instead of CREATE, which isn't the case here. This is an example of one of the scripts:
CREATE TRIGGER [TR_t_TABLE_TRIGGERNAME] ON [content].[t_TABLE]
FOR INSERT
AS
BEGIN
IF ( SELECT COUNT(*) FROM inserted) > 0
BEGIN
DECLARE @columnBits VARBINARY(50)
SELECT @columnBits = COLUMNS_UPDATED() | CAST (0 AS BIGINT)
INSERT INTO [history].[t_TABLE]
(
....
)
SELECT
....
FROM inserted
END
END
GO
EXECUTE sp_settriggerorder @triggername = N'[Content].[TR_t_TABLE_TRIGGER]', @order = N'last', @stmttype = N'insert';
The line that Visual Studio attributes the error to is the last line, executing the system proc. What stands out to me is that none of the objects exist in the dbo schema. The table is in the Content schema and it has a matching table in the History schema. It would seem like the [Content] and [History] qualifiers should be resolvable, though. Can't figure this one out...
Since this post comes up when you search for the above error on Google: another reason for this error is if you've got a stored procedure in a database project which specifies ALTER PROCEDURE rather than CREATE PROCEDURE.
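In other words, a hypothetical script like the first one below raises SQL03120 until ALTER is changed to CREATE (usp_Example is a made-up name):
-- Raises SQL03120 in the database project:
ALTER PROCEDURE [dbo].[usp_Example] AS SELECT 1;
-- Database projects expect the full CREATE definition instead:
CREATE PROCEDURE [dbo].[usp_Example] AS SELECT 1;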
I am trying to move some data over a dblink and one of the columns is an XMLType column. The code looks like this:
begin
delete from some_schema.some_remote_tab@src_2_trg_dblink;
INSERT INTO some_schema.some_remote_tab@src_2_trg_dblink(id, code, gen_date, xml_data)
SELECT id, code, gen_date, xml_data
FROM local_table;
end;
Oracle returns these errors:
ORA-02055: distributed update operation failed; rollback required
ORA-22804: remote operations not permitted on object tables or user-defined type columns
Some research on ORA-22804 shows that I am probably getting this error because of the XMLType column, but I am not sure how to resolve this.
(Oracle 10g)
We get ORA-22804 because every instance of a Type in our Oracle database has an OID, which is unique within the database. We cannot transfer that OID to another database; this has caused me grief before when trying to import schemas which have User-Defined Types. I hadn't realised that it also affected XMLType, but it is an Object so it is not surprising.
The solution is icky: you will have to unload the XML into text on your local database and then convert it back into XML in the remote database.
I don't have a distributed DB set-up to test this right now, but if you're lucky it may work:
INSERT INTO some_schema.some_remote_tab@src_2_trg_dblink(id, code, gen_date, xml_data)
SELECT id, code, gen_date, xmltype ( xml_data.getClobVal() )
FROM local_table;
If the getClobVal() method doesn't work you may need to use the SQL function XMLSERIALIZE() instead.
XMLSerialize(DOCUMENT xml_data AS CLOB)
If you're really unlucky you won't be able to do this in a single SQL statement, and you'll have to solve it using PL/SQL. To a certain extent this will depend on which version of the database you are using; the more recent the version, the more likely you'll be able to do it in SQL rather than PL/SQL.
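Combining the two suggestions into a single statement would look something like this (untested, with the same "if you're lucky" caveat as above):
INSERT INTO some_schema.some_remote_tab@src_2_trg_dblink(id, code, gen_date, xml_data)
SELECT id, code, gen_date, xmltype ( XMLSERIALIZE ( DOCUMENT xml_data AS CLOB ) )
FROM local_table;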
Try to do this the other way around. That is, log into the remote DB, create a DB link back to the local DB, and do an insert like this:
INSERT INTO remote_schema.some_remote_tab(id, code, gen_date, xml_data)
SELECT id, code, gen_date, xml_data
FROM local_table@dblink_to_local_db;
Instead, perform a data PULL:
Create the data pull procedure at the remote database B.
Create synonyms and provide grants to the DB link user.
Call the remote procedure from database A (the source), then perform a commit at database A.
(Meanwhile, wait for Oracle to find some solution for pushing XML over a DB link in the future.)
Create a procedure at the remote site, database B:
CREATE OR REPLACE PROCEDURE PR_REMOTE(OP_TOTAL_COUNT OUT NUMBER) IS
BEGIN
INSERT /*+ DRIVING_SITE(src) */
INTO REMOTE_TABLE TGT_B
(XMLDATA_COL)
SELECT SRC.XMLDATA FROM LOCAL_TABLE@TGT2SRC_DBLINK SRC;
OP_TOTAL_COUNT := SQL%ROWCOUNT;
END;
Call the procedure from Database A
DECLARE
V_COUNT NUMBER := 0;
BEGIN
PR_REMOTE(V_COUNT);
COMMIT;
END;
I was facing the same issue with a heterogeneous DB link to SQL Server.
I ended up using xmltype.getStringVal() to insert into a VARCHAR column on the SQL Server side, as the data was under 4000 characters.
There is also xmltype.getClobVal() for data over 4000 characters, but I haven't tested it.
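A minimal sketch of that approach; the link name mssql_link and the target table and columns are assumptions:
-- XMLType member functions need a table alias to be called
INSERT INTO remote_sql_table@mssql_link (id, xml_text)
SELECT t.id, t.xml_data.getStringVal()
FROM local_table t;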
The "xml->text->xml" chain might be complicated, but could help in some cases (for example when inserting is not on option but updating only).
You can try with "n" peaces of varchar columns (in the destination table or in a differnet one, perheaps in different schema on the remote DB), where "n" is:
ceil(max(dbms_lob.getlength(MyXmlColumn)) / 4000)
Then you can transfer these fragments to remote temporary fields:
insert into RemoteSchema.MyTable(Id, XmlPart1, XmlPart2,...)
(select 1 /*some Id*/,
dbms_lob.substr(t.MyXmlColumn.getclobval(), 4000, 1),
dbms_lob.substr(t.MyXmlColumn.getclobval(), 4000, 4001),
...
from LocalSchema.MyTable t);
XmlType can be re-composed from fragments like this:
create or replace function concat_to_xml(p_id number)
return xmltype
is
xml_lob clob;
xml xmltype;
begin
dbms_lob.createtemporary(xml_lob, true);
for r in (select XmlPart1, XmlPart2, ... from RemoteSchema.MyTable where Id = p_id)
loop
if r.XmlPart1 is not null then
dbms_lob.writeappend(xml_lob, length(r.XmlPart1), r.XmlPart1);
end if;
if r.XmlPart2 is not null then
dbms_lob.writeappend(xml_lob, length(r.XmlPart2), r.XmlPart2);
end if;
...
end loop;
xml := xmltype(xml_lob);
dbms_lob.freetemporary(xml_lob);
return xml;
end;
Finally, use the result to update any other table in the remote schema, like:
update RemoteSchema.MyTable2 t2 set t2.MyXmlColumn = concat_to_xml(1 /*some Id*/);