SQL loader POSITION giving errors - oracle

I have a question: how can we specify POSITION if the column allows only one character? For example, I have a table EXAMPLE with a column SEC (CHAR(1)). I am using a control file to load data into it.
My control file reads as follows (note that I am trying to load SEC from position 2):
LOAD DATA
INFILE 'C:\Users\....\Data\h1.dat'
BADFILE 'C:\Users\...\Reject\h1.bad'
INSERT INTO TABLE EXAMPLE
(SEC POSITION (2))
My log reads:
Record 1: Rejected - Error on table SCHEDULE, column SEC.
ORA-01400: cannot insert NULL into ("SCOTT"."EXAMPLE"."SEC")
Record 2: Rejected - Error on table SCHEDULE, column SEC.
ORA-01400: cannot insert NULL into ("SCOTT"."EXAMPLE"."SEC")
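For reference, POSITION also accepts an explicit start:end range, so a one-character field can be pinned to exactly column 2. A minimal sketch of the same control file with that change (paths copied from the question, so treat them as placeholders):
LOAD DATA
INFILE 'C:\Users\....\Data\h1.dat'
BADFILE 'C:\Users\...\Reject\h1.bad'
INSERT INTO TABLE EXAMPLE
(SEC POSITION(2:2) CHAR(1))
Whether this clears the ORA-01400 depends on the data file itself: if column 2 is blank or missing in a record, the field is still loaded as NULL and the NOT NULL constraint still rejects that record.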

Related

Sqlldr log shows wrong column name when facing ORA-01722

I have a simple table with three columns.
Table name DUMMY20210527
desc DUMMY20210527
PID NUMBER(10)
VDATE DATE
SID NUMBER(10)
Control File
LOAD DATA CHARACTERSET JA16SJIS
APPEND INTO TABLE "DUMMY20210527" fields terminated by '\t' trailing nullcols
(
pid ,
vdate date "YYYY/MM/DD",
sid
)
Data File
a b c
1 2019/01/10 X
When trying to load this file, the log shows the error on column VDATE; however, the wrong value ('X') is provided in column SID.
Log
Record 1: Rejected - Error on table "DUMMY20210527", column VDATE.
ORA-01722: invalid number
Table "DUMMY20210527":
0 Rows successfully loaded.
1 Row not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
However, if the following file is loaded, then the proper error message is shown:
a b c
X 2019/01/10 1
Log
Record 1: Rejected - Error on table "DUMMY20210527", column PID.
ORA-01722: invalid number
Table "DUMMY20210527":
0 Rows successfully loaded.
1 Row not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Oracle version used: Oracle 19c

load NULL data into different table with the help of SQL*Loader

I have a CSV which contains 8 columns. It has NULL data in a few of the rows.
To load CSV data, I have created 2 tables with the same definition.
1) TABLE_NOT_NULL to load non-NULL data
2) TABLE_NULL to load NULL data
I am successfully able to load data into TABLE_NOT_NULL with the WHEN condition below:
insert into table '<TABLE_NAME>' when '<COLUMN_NAME>'!=' '.
Now I want to load the NULL data into TABLE_NULL, but I am not able to filter out only the NULL values with a WHEN condition.
I tried many things, but none of them worked, for example:
a) insert into table '<TABLE_NAME>' WHEN '<COLUMN_NAME>'=BLANKS
b) insert into table '<TABLE_NAME>' WHEN '<COLUMN_NAME>'=' '
Can anyone please suggest any workaround or solution for it?
Workaround?
1
Load everything into TABLE_NULL
insert into TABLE_NOT_NULL select * From TABLE_NULL where column is not null
delete from TABLE_NULL where column is not null
2
load everything into TABLE_NOT_NULL
rows that contain NULL values won't be loaded, but will end up in the BAD file
using another control file, load the BAD file into TABLE_NULL
3 (EDIT)
instead of SQL*Loader, create an external table - it acts as if it were an ordinary Oracle table, but is really just a pointer to the file
you'd then write 2 INSERT statements:
insert into table_not_null
select * From external_table where column is not null;
insert into table_null
select * From external_table where column is null;
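As a rough sketch of option 3, the DDL for external_table might look like the following; the directory object (csv_dir), the file name (data.csv), and the column names are placeholders, since the actual 8-column layout isn't shown in the question:
CREATE TABLE external_table (
  col1 VARCHAR2(100),
  col2 VARCHAR2(100)
  -- ... remaining 6 columns of the CSV
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY csv_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('data.csv')
)
REJECT LIMIT UNLIMITED;
Here csv_dir is a DIRECTORY object pointing at the folder that holds the CSV; the two INSERT ... SELECT statements above then read straight from external_table.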

partition the hive data complex data types while inserting data its shows error

I created a table using Hive, and I want to partition the data based on location:
create table student(
id bigint
,name string
,location string
, course array<string>)
ROW FORMAT DELIMITED fields terminated by '\t'
collection items terminated by ','
stored as textfile;
and data like
100 student1 ongole java,.net,hadoop
101 student2 hyderabad .net,hadoop
102 student3 vizag java,hadoop
103 student4 ongole .net,hadoop
104 student5 vizag java,.net
105 student6 ongole java,.net,hadoop
106 student7 neollre .net,hadoop
creating partition table:
create table student_partition(
id bigint
,name string
,course array<string>)
PARTITIONED BY (address string)
ROW FORMAT DELIMITED fields terminated by '\t'
collection items terminated by ','
stored as textfile;
INSERT OVERWRITE TABLE student_partition PARTITION(address) select *
from student;
I'm trying to partition the data based on location, but it shows the below error:
FAILED: SemanticException [Error 10044]: Line 1:23 Cannot insert into
target table because column number/types are different 'address':
Cannot convert column 2 from string to array.
Can anyone please help me?
The columns of the source and the target should match.
Option 1: adjust the source to the target. The partition column goes last
insert into student_partition partition (address)
select id,name,course,location
from student
;
Option 2: adjust the target to the source
insert into student_partition partition (address) (id,name,address,course)
select *
from student
;
P.s.
You might need this -
set hive.exec.dynamic.partition.mode=nonstrict
;
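Putting the P.s. together with Option 1, a full session might look like the following sketch (hive.exec.dynamic.partition usually defaults to true, but it is shown for completeness):
set hive.exec.dynamic.partition=true;
set hive.exec.dynamic.partition.mode=nonstrict;
insert into student_partition partition (address)
select id, name, course, location
from student;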

Hive: Insert into hive table with column using select 1

Let's say I have a hive table test_entry with column called entry_id.
hive> desc test_entry;
OK
entry_id int
Time taken: 0.4 seconds, Fetched: 1 row(s)
Suppose I need to insert one row into the above table using select 1 (which returns 1), i.e. with a syntax that looks like the below:
hive> insert into table test_entry select 1;
But I get the below error:
FAILED: NullPointerException null
So effectively, I would like to insert one row for entry_id whose value will be 1, with such a select statement (without referring to another table).
How can this be done?
Hive does not support what you're trying to do. Inserts into ORC-based tables were introduced in Hive 0.13.
Prior to that, you have to specify a FROM clause if you're doing INSERT .. SELECT.
A workaround might be to create an external table with one row and do the following:
INSERT .. SELECT 1 FROM table_with_one_row
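A sketch of that workaround; for brevity it loads a plain one-column table with LOAD DATA rather than defining an external table, and the table name and the single-line local file are made up for the example:
-- one_row is a helper table meant to hold exactly one row
CREATE TABLE one_row (dummy STRING);
-- /tmp/one_row.txt is assumed to contain a single line of text
LOAD DATA LOCAL INPATH '/tmp/one_row.txt' INTO TABLE one_row;
-- the FROM clause satisfies older Hive versions; the literal 1 is what gets inserted
INSERT INTO TABLE test_entry SELECT 1 FROM one_row;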

Problem with Oracle Sql Loader control file

I'm trying to load some data using SQL*Loader. Here is the top of my control/data file:
LOAD DATA
INFILE *
APPEND INTO TABLE economic_indicators
FIELDS TERMINATED BY ','
(ASOF_DATE DATE 'DD-MON-YY',
VALUE FLOAT EXTERNAL,
STATE,
SERIES_ID INTEGER EXTERNAL,
CREATE_DATE DATE 'DD-MON-YYYY')
BEGINDATA
01-Jan-79,AL,67.39940538,1,23-Jun-2009
... lots of other data lines.
The problem is that SQL*Loader won't recognize the data types I'm specifying. This is the log file:
Table ECONOMIC_INDICATORS, loaded from every logical record.
Insert option in effect for this table: APPEND
Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
ASOF_DATE FIRST * , DATE DD-MON-YY
VALUE NEXT * , CHARACTER
STATE NEXT * , CHARACTER
SERIES_ID NEXT * , CHARACTER
CREATE_DATE NEXT * , DATE DD-MON-YYYY
value used for ROWS parameter changed from 10000 to 198
Record 1: Rejected - Error on table ECONOMIC_INDICATORS, column VALUE.
ORA-01722: invalid number
... lots of similar errors, as expected when trying to insert character data into a numeric column.
I've tried no datatype spec, all other numeric specs, and always the same issue. Any ideas?
Also, any ideas on why it's changing the Rows parameter?
From your example, SQL*Loader will try to evaluate the string "AL" as a number, which results in the error message you got. The sample data has something that looks like it could be a decimal number in the third position, not the second as specified in the column list.
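In other words, swapping STATE and VALUE in the field list so it matches the actual order of the data (date, state, value, series id, create date) should line things up; a sketch under that assumption:
LOAD DATA
INFILE *
APPEND INTO TABLE economic_indicators
FIELDS TERMINATED BY ','
(ASOF_DATE DATE 'DD-MON-YY',
STATE,
VALUE FLOAT EXTERNAL,
SERIES_ID INTEGER EXTERNAL,
CREATE_DATE DATE 'DD-MON-YYYY')
BEGINDATA
01-Jan-79,AL,67.39940538,1,23-Jun-2009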
