How to use ODI 11g ETL error table as source?

I'm currently using ODI 11g to import into Oracle, via CSV files, records from Mainframe Adabas table views. This is being done successfully.
The point is that I'm now trying to send back to a mainframe application, via CSV, the records that, for one reason or another, could not be imported into Oracle and are stored in the ETL's error tables.
I'm trying to use the same process, in this case backwards, to export the data from the error tables to a CSV file, which is to be imported by the mainframe application into Adabas.
I successfully imported, via reverse engineering, the structure of the error table to serve as my source. I've set up new physical and logical models to be used by this process, and I've also created the interface.
My problem is that when I try to save the interface, it gives me a fatal error saying that I don't have an "LKM selected for this origin set".
When I try to set the LKM on the Flow tab, it doesn't give me any option in the LKM Selector.
I'm quite green on ODI and have no idea how to solve this problem, so any insights would be most appreciated.
Thanks all!

You need to change the location where transformations will occur. Currently the interface is trying to move all the data to the file technology and process it there. It's easier to work the other way around and let the database do the job. To do so, go to the Overview pane of your interface, select the "Staging Area Different From Target" checkbox, then select the logical schema of your Oracle source below.
On the Flow tab, click on your target and select the following IKM: "IKM SQL to File Append". This is a multi-technology IKM, which means you won't need an LKM anymore to move data from source to target.

Related

Oracle APEX application failing to read data present in database

I am working on an APEX application and am faced with an issue in which data present in the database is not read by the application. Rather, the application reads some of the fields from the database and not the rest, without throwing any error message.
This small application has a main page that contains tombstone information about work projects we are engaged in, represented by approximately 30 Oracle items of type 'Database column' that are the fields of the main project overview table. This main page also has several interactive reports that allow users to manipulate data from other related tables like project costs, project deliverables, project recommendations, etc. While the functionality to create/update/delete from these related tables works perfectly, I am still unable to make the same functionality work for the main table, and I am unable to figure out why, as Oracle is not giving me an error message. The changes I submit in the application clearly hit the database, as I can see the record updated in the Oracle Object Browser or by querying in SQL Commands. I have depicted the situation below and thank you in advance for any assistance.
Open project number '1982', populate it with dummy data, and save.
Entered data appears in the main table correctly.
Reopen project number 1982 and observe that only some of the fields are shown in the application, even though the fields are clearly populated accurately in the underlying table.

Power Center view result

I'm starting to work with Informatica PowerCenter; I'm new to this technology. In the past I worked with DataStage. I made a task that reads data from an Oracle table and writes it to a flat file. The job ran and finished correctly (as I saw in Workflow Manager).
Is there a way to view the records written to my flat file in PowerCenter?
Thanks
Luca
You need to access the output file itself. Informatica does not provide a data browser to inspect flat files or databases; you need to use a separate tool.
Try an FTP or SSH connection to wherever the output file was generated.
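As a sketch of that inspection step, standard shell tools are enough once you're on the server. The file path below is hypothetical; substitute the target file location from your session properties:

```shell
# Demo on a local sample file; on the real PowerCenter server you would
# run the same commands against the session's actual output file path.
printf 'id;city\n1;Rome\n2;Milan\n' > /tmp/pc_out_demo.csv

head -n 2 /tmp/pc_out_demo.csv   # preview the first records
wc -l < /tmp/pc_out_demo.csv     # count the lines written
```

After connecting via SSH to the node that ran the session, `head` and `wc -l` give a quick sanity check that the expected number of records was written.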

Use of ODBC and Relational connections in Informatica

I noticed that in the mapping level we are creating the ODBC connection and in the Workflow level we are creating the Relational connection. What are those 2 connections needed for?
Question is crystal clear.
When you create the mapping you are describing what you want the data to do and shouldn't be restricted by whether the data model exists yet or not.
In order to do this you need to know the structure of your source and target but you don't need to actually connect to them. Having a dummy csv to get you going in mapping designer and while the dba builds the tables in the database is enough.
In the Designer you may connect to an existing structure to create Source or Target definitions. But you just create the structure, the definition; once imported, it is no longer tied to that connection.
In the Workflow Designer you choose a connection that will process data structured in the way described by the Source or Target definition. That connection is what will actually be used to access the data.
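For the dummy-CSV approach mentioned above, a minimal placeholder file is all the Designer needs to import a flat-file source definition while the real tables are still being built. The column names here are hypothetical:

```shell
# Hypothetical layout; only a header row and one sample row are needed
# for the Designer to infer the structure of the future source.
printf 'CUST_ID,CUST_NAME,CREATED_DT\n1,ACME,2011-01-01\n' > dummy_customers.csv
cat dummy_customers.csv
```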

SSIS Data Flow Task doesn't load all rows from OLE DB

I'm trying to load a table from SQL Server using the Microsoft OLE DB Provider into an Oracle table (using the Oracle Provider for OLE DB). The package is a straightforward OLE DB Source (SQL Server) -> OLE DB Destination (Oracle).
I'm using SQL Server 2008 R2 and Oracle 11g.
Every time I run the package, I get a different number of rows in the destination table, and BIDS reports fewer rows read than there are in the source table. The number of rows returned is different each time I run it. I get no errors or kickouts, but the boxes for the source and destination remain yellow even after BIDS says "Package completed successfully".
Dumping the source table into a flat file instead of the Oracle destination works fine, and I get all the rows that I expect. I can use this flat file to pull the information into the Oracle destination table without problems as well.
Even though I have a work-around, I want to understand why this is happening, and what I can do to resolve this problem without having to use flat files.
Edit: It looks like even using the flat file to Oracle doesn't bring over all the rows. Was the first time just luck?
Edit/Update: Running the package out of Integration Services (not BIDS) seems to have eliminated the problem (tested three times). Still don't understand why this is happening though.

Import and Export Data plus schema using SQLDeveloper 3.0.04

I am a newbie to Oracle and I would like to export a database from a remote machine and import it on my local machine. On both machines I have Oracle 10.2.
I need to know how to export/import the schema and data from Oracle 10.2 using SQL Developer 3.0.0.4.
To export from the remote database, I used Tools -> Database Export -> Export Wizard,
and at the end I got only a SQL file with DDL and DML statements, but somewhere in the file it is written:
"Cannot render Table DDL for object XXX.TABLE_NAME with DBMS_METADATA attempting internal generator error."
I ignored the message above and tried to run those DDL and DML statements, but it all ended up in errors.
Is it possible that all this is tied to a read-only database user? Moreover, I don't find any tables under my Tables node, but I do see tables under other users in SQL Developer.
Thanks in advance
As a test, can you select one object in the tree and navigate to the Script panel? SQL Developer also uses DBMS_METADATA to generate those scripts.
Also, as a workaround, try using Data Pump to export and import your data. It will be much more efficient for moving around larger schemas.
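A minimal Data Pump round trip might look like the following sketch. The credentials, TNS aliases, schema name, and file names are placeholders, and DATA_PUMP_DIR must exist as an Oracle directory object on each server:

```shell
# On the remote server: export the schema (placeholder connect string).
expdp scott/tiger@remotedb schemas=SCOTT directory=DATA_PUMP_DIR \
      dumpfile=scott.dmp logfile=scott_exp.log

# Copy scott.dmp into the local server's DATA_PUMP_DIR, then import it.
impdp scott/tiger@localdb schemas=SCOTT directory=DATA_PUMP_DIR \
      dumpfile=scott.dmp logfile=scott_imp.log
```

Note that these are server-side command-line utilities, so this route also needs the export privileges your read-only account may lack.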
Your note about not seeing tables under Tables indicates your schema doesn't actually own any tables. You may be working with synonyms that allow you to query objects as if they were in your account. You could be running into a privilege issue, but your error message doesn't indicate that. Error messages often come in bunches, and the very first one is usually the most important.
If you could try using the EXPORT feature say against a very simple schema like SCOTT as a test, this should indicate whether there is a problem with your account settings or with the software.
I'm not sure about SQL Developer 3.0, but with version 3.1 you can follow this:
SQL Developer 3.1 Data Pump Wizards (expdp, impdp)
