I am trying to develop an SSIS package that truncates a table in an Oracle database. Unfortunately, I am getting an error.
When I do a SELECT from the truncated table, it works fine, so the connection manager is set up correctly. I've recreated the connection manager just in case, but that didn't help.
The truncated table is in the same schema as the user under which the ETL is run.
Despite the error message, the task does its job: the table gets truncated, but the error message still appears.
Any ideas what could be the reason?
Regards,
Lukas
The component I used is an Oracle Source inside a Data Flow Task. The Oracle Source uses the query "TRUNCATE TABLE schema_Name.TableName".
When I use "SELECT * FROM schema_Name.TableName" instead, it works fine.
I have the solution.
Instead of using "SQL command" inside the Oracle Source component, I used an "Execute SQL Task" with the same query. I connected to the Oracle database using OLE DB.
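For reference, a minimal sketch of the statement that goes into the Execute SQL Task, using the schema and table names from the question. TRUNCATE is a DDL statement that returns no rowset, which is presumably why a Data Flow source component, which expects columns, rejects it:

-- DDL statement: removes all rows and returns no result set,
-- so it belongs in an Execute SQL Task, not a Data Flow source
TRUNCATE TABLE schema_Name.TableName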
I'm currently importing data from a SQL Server database into an Oracle 11g one, and I'm encountering some strange behavior when importing into NVARCHAR2 columns. The charset on the SQL Server db is Unicode (UTF-8), while on Oracle's side it is not (WE8ISO8859P1).
Whenever I try to import the nvarchar values from SQL Server, Oracle inserts them as blank strings. Trying this in a different IDE, they show up as NULL. Whatever I do, the text is not imported.
I've had success using the DBMS_HS_PASSTHROUGH package, yet this seems like overkill for such a simple insert/select task.
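For anyone unfamiliar with it, a minimal sketch of the passthrough pattern, assuming a gateway DB link named mssql_link and hypothetical tables dbo.src (remote) and local_t (local); all names here are placeholders, not the actual objects:

-- Fetch rows on the SQL Server side and insert them locally,
-- bypassing the gateway's usual statement translation
DECLARE
  c     INTEGER;
  rc    INTEGER;
  v_id  NUMBER;
  v_txt VARCHAR2(4000);
BEGIN
  c := DBMS_HS_PASSTHROUGH.OPEN_CURSOR@mssql_link;
  DBMS_HS_PASSTHROUGH.PARSE@mssql_link(c, 'SELECT id, txt FROM dbo.src');
  LOOP
    rc := DBMS_HS_PASSTHROUGH.FETCH_ROW@mssql_link(c);
    EXIT WHEN rc = 0;
    DBMS_HS_PASSTHROUGH.GET_VALUE@mssql_link(c, 1, v_id);
    DBMS_HS_PASSTHROUGH.GET_VALUE@mssql_link(c, 2, v_txt);
    INSERT INTO local_t (id, txt) VALUES (v_id, v_txt);
  END LOOP;
  DBMS_HS_PASSTHROUGH.CLOSE_CURSOR@mssql_link(c);
  COMMIT;
END;
/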
What am I missing?
Thanks in advance.
EDIT: If I perform a SELECT on the remote table, I can see the text values just fine. CTAS also works as expected. I can only reproduce the erroneous behavior when I run the package.
EDIT 2: I've narrowed this problem down to a possible bug in Oracle's MERGE statement with data from SQL Server over a DB link. I solved it by performing separate INSERT and UPDATE DML statements.
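For illustration, a hedged sketch of that workaround, with hypothetical names (local_t, remote_t, and a DB link mssql_link) standing in for the real objects:

-- Update rows that already exist locally
UPDATE local_t l
   SET l.txt = (SELECT r.txt
                  FROM remote_t@mssql_link r
                 WHERE r.id = l.id)
 WHERE EXISTS (SELECT 1 FROM remote_t@mssql_link r WHERE r.id = l.id);

-- Insert rows that do not exist locally yet
INSERT INTO local_t (id, txt)
SELECT r.id, r.txt
  FROM remote_t@mssql_link r
 WHERE NOT EXISTS (SELECT 1 FROM local_t l WHERE l.id = r.id);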
Afternoon Folks,
We are using Visual Studio 2013 and have an SSIS package that we are creating. We have a simple Data Flow Task that essentially takes some data from SQL Server and pushes it through to an Access database. It has three data flow items:
an OLE DB Source (obtains the SQL Server data via a SELECT statement) --> Data Conversion (converts the SQL Server data types to Access ones) --> OLE DB Destination (the Access database)
The steps selecting the SQL Server data and converting it work fine.
The issue we have is with the SQL command that we are using to update the Access 2010 database.
We have tried to create and run a simple UPDATE statement to update a couple of fields with hardcoded data, but this doesn't update anything. We have also tried creating a stored procedure and then executing it from the SQL command line in the OLE DB Destination Editor.
We can see from posts on the net that we can create a procedure in Access 2010 and use this. We are also using the Native OLE DB\Microsoft Office 12.0 Access Database Engine OLE DB Provider. This connection tests successfully.
We can write a SELECT statement in the SQL command line and it does pull back data. We just seem to have a problem with the UPDATE and/or CREATE PROCEDURE. In turn, we are unable to populate the Mappings: the Mappings page displays the destination box, but no fields are displayed within it.
We have had a good look around on the internet, but we are struggling to find a solution.
Here is a sample of the UPDATE statement we are trying to get working:
UPDATE ReferenceFields
INNER JOIN Addresses
ON ReferenceFields.ID = Addresses.ID
SET ReferenceFields.Reference2 = CustomerName,
ReferenceFields.Reference3 = telephone
WHERE Addresses.UPRN = 12345678910
If I parse the query, it succeeds, but when I select Mappings the following warning is displayed:
Error at Data Flow Task [OLE DB Destination [136]]: No column
information was returned by the SQL command
https://msdn.microsoft.com/EN-US/library/office/ff845861.aspx
https://msdn.microsoft.com/en-us/library/ms141044.aspx
After some time, I have managed to answer this question myself.
I added a unique parameter for each field I need to update, using the # character to mark them:
UPDATE ReferenceFields
INNER JOIN Addresses
ON ReferenceFields.ID = Addresses.ID
SET ReferenceFields.Reference2 = #CustomerName,
ReferenceFields.Reference3 = #telephone
WHERE UPRN = #UPRN
This runs through the code and updates the MS Access database.
I am using
Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - Prod and
Toad for Oracle 10.6.1.3.
When I try to run an INSERT script containing around 84,000+ records, it shows an Out of Memory error. (The error screenshot is not reproduced here.)
Can any of you please suggest how I should execute this INSERT script in Toad?
P.S.: Since Toad is connecting to a remote machine, I'm not able to run it with SQL*Plus. If anyone knows a way to do that, please let me know.
If you need any more information, please ask in the comments and I will provide it.
I got the same error when I wanted to execute a SQL script with 70k rows.
But I solved it like this.
Firstly
You should run it with SQL*Plus commands. Log in to SQL*Plus and run this command:
@scriptName.sql
Secondly (this is an alternative)
You can use a DBLINK in Oracle, as sketched below.
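A minimal sketch of the DB link alternative, assuming placeholder names (remote_db, remote_user, a TNS alias REMOTE_TNS, and a target_table on the remote side):

-- Create a link from the local database to the remote one
CREATE DATABASE LINK remote_db
  CONNECT TO remote_user IDENTIFIED BY remote_password
  USING 'REMOTE_TNS';

-- Then run the inserts locally, pushing the rows across the link
INSERT INTO target_table@remote_db (id, txt)
VALUES (1, 'example row');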
As bpgergo suggested, copy the SQL file to the remote machine using FTP, then open SQL*Plus.
Follow these steps in SQL*Plus.
Step 1: Change your current session using the following statement:
alter session set current_schema = Schema_Name;
Here, Schema_Name is the schema that owns your insert query's table.
Step 2: Execute the SQL script file using the following command:
@{PATH}/FILE_NAME.SQL
E.g.: @D:/oracle/script/FILE_NAME.SQL
Here, D:/oracle/script/ is the location of the file and FILE_NAME is the name of your SQL script file.
Now it will work as expected.
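Putting the steps together, a minimal sketch of the whole session (the connection details are placeholders):

sqlplus your_user/your_password@remote_tns_alias

-- Step 1: point the session at the schema that owns the target table
ALTER SESSION SET CURRENT_SCHEMA = Schema_Name;

-- Step 2: run the script from disk
@D:/oracle/script/FILE_NAME.SQL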
When enabling Fast Load in the Attunity Oracle Destination components of several similar SSIS packages using Oracle 11g as the target, a few packages fail and return the error below, but the rest of them work fine.
The error message I get is:
Description: Fast Load error encountered during PreLoad or Setup
phase. Text: ORA-39826: Direct path load of view or synonym (
TABLE_NAME ) could not be resolved.
If I disable Fast Load, the packages that failed work fine too, of course.
More importantly, the failing packages work fine with Fast Load when using Oracle 10g as the target.
I don't understand why it doesn't work in those that failed.
What am I missing? What should I do to make the Fast Load work at all times and not sometimes?
Probably a driver issue.
The 'fast load' option internally uses a BULK INSERT statement to upload data into the destination table instead of a simple INSERT statement for each single row. Since BULK INSERT is a native SQL Server function, you should try to understand how it works for Oracle. It probably changed from 10g to 11g.
By pure chance, I discovered that the target component fails in an SSIS package if its 'TableName' property contains spaces before or after(!) the name of the table. Once they are deleted, it works fine.
This error didn't occur on 10g.
I am a newbie to Oracle and I would like to export a database from a remote machine and import it on my local machine. On both machines I have Oracle 10.2.
I need to know how to export/import the schema and data from Oracle 10.2 using SQL Developer 3.0.0.4.
To export from the remote database, I have used the export tool -> Database Export -> export wizard,
and at the end I got only a SQL file with DDL and DML statements, but somewhere in the file it is written:
"Cannot render Table DDL for object XXX.TABLE_NAME with DBMS_METADATA attempting internal generator error."
I ignored the previously mentioned message and tried to run those DDL and DML statements, but it all ended up with errors.
Is it possible that all this is tied to a read-only database user? Moreover, in SQL Developer I don't find any tables under my own Tables node, only tables under Other Users.
Thanks in advance
As a test, can you select one object in the tree and navigate to the script panel? SQL Developer also uses DBMS_METADATA to generate those scripts.
Also, as a workaround, try using Data Pump to export and import your data. It will be much more efficient for moving larger schemas around.
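A minimal command-line sketch, assuming the SCOTT schema mentioned below as a test case, a DATA_PUMP_DIR directory object on each server, and placeholder credentials. Export on the remote server:

expdp scott/tiger schemas=SCOTT directory=DATA_PUMP_DIR dumpfile=scott.dmp logfile=scott_exp.log

Then copy scott.dmp into the local server's DATA_PUMP_DIR directory and import it:

impdp scott/tiger schemas=SCOTT directory=DATA_PUMP_DIR dumpfile=scott.dmp logfile=scott_imp.log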
Your note about not seeing tables indicates your schema doesn't actually own any tables. You may be working with synonyms that allow you to query objects as if they were in your account. You could be running into a privilege issue, but your error message doesn't indicate that. Error messages often come in bunches, and the very first one is usually the most important.
If you could try using the EXPORT feature against a very simple schema, say SCOTT, as a test, this should indicate whether there is a problem with your account settings or with the software.
I'm not sure about SQL Developer 3.0, but with version 3.1 you can follow this:
SQL Developer 3.1 Data Pump Wizards (expdp, impdp)