Talend tFileInputDelimited to tDBOutput ORA-00904 - Oracle

I made a job design which consists of tFileInputDelimited -> tMap -> tDBOutput (Oracle).
The CSV I am using has columns which are not currently in the table, which I didn't think would be a problem, but when I run my job I get multiple ORA-00904 invalid identifier errors.
I checked my DB in Oracle SQL Developer and no rows have been updated.
Looking for some help on how to fix this. When I look up the error, everything refers me to SQL code, but I am not writing SQL; I am only uploading a CSV file.
Thank you!

You say that your CSV has columns that are not in your table. That is a problem if you map those columns to the tMap output. Only the columns that are present in your target table should be in the tMap output flow going to tDBOutput.
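To illustrate why this raises ORA-00904, here is roughly what happens in each case (table and column names below are hypothetical; the ? marks stand for the JDBC bind placeholders Talend uses in its prepared statements):

-- Output schema includes EXTRA_COL, which does not exist in the table:
INSERT INTO MY_TARGET (ID, NAME, EXTRA_COL) VALUES (?, ?, ?);
-- ORA-00904: "EXTRA_COL": invalid identifier

-- Output schema limited to columns that exist in MY_TARGET:
INSERT INTO MY_TARGET (ID, NAME) VALUES (?, ?);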

Related

Oracle Data Integrator (ODI 12.2.1) - load plan record count issue

I have come across a scenario in my project. I am loading data from a file to a table using ODI, and I am running my interfaces through a load plan. I have 1000 records in my source file, and I am also getting 1000 records in the target table, but when I check the ODI load plan execution log it shows the number of inserts as 2000. Can anyone please help, or is this an ODI bug?
The number of inserts does not only show the inserts in the target table but also all the inserts happening in temporary tables. Depending on the knowledge modules (KMs) used in an interface, ODI might load data into a C$_ table (LKM) or an I$_ table (IKM/CKM). The rows loaded into these tables are also counted.
You can look at the code generated in the Operator to check whether your KMs are using these temporary tables. You can also simulate an execution to see the generated code.
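As a rough sketch (schema, table, and column names below are hypothetical, following ODI's default prefixes), the generated code often contains two inserts for the same 1000 rows, and both are counted in the log:

-- LKM step: load the file into the C$_ staging table (~1000 inserts)
INSERT INTO ODI_WORK.C$_0CUSTOMERS (CUST_ID, CUST_NAME)
SELECT CUST_ID, CUST_NAME FROM ODI_WORK.EXT_CUSTOMERS_FILE;

-- IKM step: load from the staging table into the target (~1000 more inserts)
INSERT INTO DWH.CUSTOMERS (CUST_ID, CUST_NAME)
SELECT CUST_ID, CUST_NAME FROM ODI_WORK.C$_0CUSTOMERS;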

Talend ETL Oracle error: 0 rows inserted

I am a newbie to Talend ETL and am using Talend Open Studio for Big Data version 6.2.
I have developed a simple Talend ETL job that picks up data from a tFileInputExcel and a tOracleInput (date dimension) and inserts the data into my local Oracle database.
Below is how my package looks:
The job runs, but I get 0 rows inserted into my local Oracle database.
Your picture shows that no rows come out of your tMap component. Verify that the links inside the tMap are correct.
It seems there is no data that matches between fgf.LIBELLE_MOIS and row2.B.
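One way to check this outside Talend is to run the equivalent join in SQL Developer; a sketch, assuming hypothetical stand-in tables for the date dimension and the Excel data:

-- If this inner join returns 0, the tMap inner join will also emit 0 rows.
SELECT COUNT(*)
FROM DIM_DATE d
JOIN STG_EXCEL_DATA s ON d.LIBELLE_MOIS = s.B;

-- Case or padding differences are a common cause, so also try normalised values:
SELECT COUNT(*)
FROM DIM_DATE d
JOIN STG_EXCEL_DATA s ON UPPER(TRIM(d.LIBELLE_MOIS)) = UPPER(TRIM(s.B));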

Table import fails with error ORA-31693

I've been receiving backups from an Oracle database into my Oracle database for two years now. My company is running version 10.2.0.1.0 and we are receiving the exports from version 12.1.0.2.0. They are using expdp and I'm using impdp. I added a new column using this script:
ALTER TABLE CONTAINERS
ADD ("SHELL" NUMBER(14, 6) DEFAULT 0 );
After running the above on both databases, the table in question will no longer import when they send me an export. I receive the following error:
ORA-31693: Table data object "PAS"."CONTAINERS" failed to load/unload and is being skipped due to error:
ORA-02354: error in exporting/importing data
ORA-02373: Error parsing insert statement for table "PAS"."CONTAINERS".
ORA-00904: "SYS_NC00067$": invalid identifier
This error has been going on for about two weeks. I have tried to resolve the problem multiple ways; this is my last resort, as it were.
Any help is greatly appreciated.
Did you try to track down SYS_NC00067$? It looks like a system-assigned column name. This sometimes happens when you add a function-based index. Did you create a function-based index on SHELL?
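If it helps, these data dictionary queries (using the schema and table names from the error message) should list any hidden, system-generated columns on CONTAINERS and the index expressions behind them:

SELECT column_name, hidden_column, data_default
FROM all_tab_cols
WHERE owner = 'PAS' AND table_name = 'CONTAINERS' AND hidden_column = 'YES';

SELECT index_name, column_expression
FROM all_ind_expressions
WHERE table_owner = 'PAS' AND table_name = 'CONTAINERS';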

Populate indexed table in Oracle using Informatica

I'm new to both Oracle and Informatica.
I am currently working on a small task where I need to select all records from the source table, filter the results to get only records where field1 = 'Y', and finally insert new rows into the target table that contain only the src.field2 and src.field3 values.
These two fields are used for the PK and for the index of the target table.
I get an error in Informatica:
"ORA-26002: Table has index defined upon it"
I would rather not drop the index; is there a workaround?
I've tried altering the index to unusable, but I got the same error.
Please advise.
Thanks.
Try to use Normal load mode instead of Bulk. You can set this in the session properties for the target.
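If you need to keep Bulk (direct-path) load, an alternative sketch, with a hypothetical index name, is to make the index unusable, skip unusable indexes for the loading session (for example via the session's pre- and post-SQL, if available), and rebuild it afterwards. Note that unique or primary key indexes still block direct-path loads, which is why switching to Normal mode is usually the simpler fix:

ALTER INDEX TGT_TABLE_IDX UNUSABLE;
ALTER SESSION SET SKIP_UNUSABLE_INDEXES = TRUE;
-- run the Informatica session in Bulk mode here
ALTER INDEX TGT_TABLE_IDX REBUILD;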

Oracle: importing records from a tab-delimited text file into a database using PL/SQL

I have never worked with Oracle. This is the first time, and the job is quite tricky. I have a text file with records delimited by tabs. These records are to be imported into a database using PL/SQL. I have searched online, but the solutions suggest using the SQL*Loader utility; however, the requirement is to do this using SQL statements, with no command-line utility. Preferably, the stored procedure or function will take the file path and database name as input parameters and import the records when executed. Is this possible? Can someone provide sample SQL statements or a link that explains this process step by step? Also note that there can be blank records in the file. Thanks in advance.
External tables seem like the best approach.
http://docs.oracle.com/cd/B19306_01/server.102/b14215/et_concepts.htm
UTL_FILE is also possible, but you would have to write the code to parse the tab-delimited text yourself.
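A minimal sketch, assuming a directory object can be created and using hypothetical file, column, and table names (0X'09' is the tab character):

CREATE OR REPLACE DIRECTORY data_dir AS '/path/to/files';

CREATE TABLE records_ext (
  field1 VARCHAR2(100),
  field2 VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY 0X'09'
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('input.txt')
)
REJECT LIMIT UNLIMITED;

-- Blank records in the file come through as all-NULL rows and can be filtered out:
INSERT INTO target_table (col1, col2)
SELECT field1, field2
FROM records_ext
WHERE field1 IS NOT NULL OR field2 IS NOT NULL;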
http://www.allroundautomations.nl/download/NewFeatures7.pdf
Check that document; it describes an easy way to upload a CSV file into a table.
