SQL Loader error for opening log file - oracle

I have created an external table:
CREATE TABLE XX_Lookup_EXT
(
LOOKUP_TYPE varchar2(200),
LOOKUP_CODE varchar2(200),
MEANING varchar2(200),
ENABLED_FLAG varchar2(10)
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
DEFAULT DIRECTORY INTF_DIR1
ACCESS PARAMETERS
( RECORDS DELIMITED BY NEWLINE SKIP 1
NODISCARDFILE
FIELDS TERMINATED BY '|'
OPTIONALLY ENCLOSED BY '"'
MISSING FIELD VALUES ARE NULL
REJECT ROWS WITH ALL NULL FIELDS
)
LOCATION (INTF_DIR1:'LOOKUP_CODE.csv')
)
REJECT LIMIT UNLIMITED
NOPARALLEL
nomonitoring;
When I query this table, it gives me the following error:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
error opening file /orabin/tst/test/XX_LOOKUP_EXT_30723.log
29913. 00000 - "error in executing %s callout"
*Cause: The execution of the specified callout caused an error.
*Action: Examine the error messages and take appropriate action.
I have tried everything, but I am still getting this error.

@Alex Poole is right. The /orabin/tst/test/ directory must be local to the database server, and the account the database runs under (usually 'oracle') needs read and write permissions on it.
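If the OS-level permissions can't be changed, one workaround (a sketch, reusing the table from the question) is to tell the access driver not to write any log or bad file at all, so it never needs write access to the directory:

```sql
-- Sketch: suppress all output files so ORACLE_LOADER only needs
-- read access to INTF_DIR1. Note that with NOLOGFILE/NOBADFILE
-- you lose the diagnostics for rejected rows.
ALTER TABLE XX_Lookup_EXT
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE SKIP 1
    NOLOGFILE
    NOBADFILE
    NODISCARDFILE
    FIELDS TERMINATED BY '|'
    OPTIONALLY ENCLOSED BY '"'
    MISSING FIELD VALUES ARE NULL
    REJECT ROWS WITH ALL NULL FIELDS
  );
```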

Related

Oracle 18c external table importing csv file from UNC path

Preface
This script has worked previously when importing a csv file into an external table using a local path for the created directory (e.g. a local drive like F:\stats) instead of a UNC path (\\station-1\...).
Script
CREATE OR REPLACE DIRECTORY csvdir AS '\\Station-1\mainFolder\stats';
DROP TABLE EXT_index;
CREATE TABLE EXT_index(
SESSION_STATION NVARCHAR2(60),
"user" NVARCHAR2(20),
"type" NCHAR(4),
"date" DATE,
"hour" NVARCHAR2(8),
BATCH NUMBER,
STEP NUMBER,
VARIANTE NUMBER,
TIME_MS NUMBER,
NB_FOLDER NUMBER,
NB_DOC NUMBER,
NB_FIELDS NUMBER,
NB_FIELDS_SHOW NUMBER,
NB_FIELDS_CONFIRM NUMBER,
NB_FIELDS_EMPTY NUMBER,
NB_KEY NUMBER,
NB_CHAR NUMBER,
NB_USEFUL_CHAR NUMBER
)organization external (
type oracle_loader
default directory csvdir
access parameters (
records delimited by newline skip 1
fields terminated by ';' lrtrim
missing field values are null (
SESSION_STATION,
"user",
"type",
"date" date 'yyyy-mm-dd',
"hour",
BATCH,
STEP,
VARIANTE,
TIME_MS,
NB_FOLDER,
NB_DOC,
NB_FIELDS,
NB_FIELDS_SHOW,
NB_FIELDS_CONFIRM,
NB_FIELDS_EMPTY,
NB_KEY,
NB_CHAR,
NB_USEFUL_CHAR
)
)
location('INDEX.csv')
)
reject limit unlimited;
DROP TABLE TMP_index;
CREATE TABLE TMP_index AS(SELECT * FROM EXT_index);
Information
Database used : OracleXE 18c
The folder containing the csv we need to import is '\\Station-1\mainFolder\stats'; it is situated on a remote station, different from the local computer running the database.
Problem
As said previously, this script works fine when csvdir is a local path (F:\mainFolder\stats): the csv is imported into the external table, and the temp table can then be created from it just fine.
When csvdir is a UNC path, the error is the following: KUP-04027: file name check failed: INDEX.csv
When csvdir is a network drive mapped to the previous UNC path (J:\ mapped to \\Station-1\mainFolder, so csvdir = 'J:\stats'), the error is the following: KUP-04040: file INDEX.csv in CSVDIR not found
Both these errors occur when trying to create TMP_index from EXT_index :
CREATE TABLE TMP_index AS(SELECT * FROM EXT_index)
Error report -
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
--> Either KUP-04027 or KUP-04040 here, depending on how csvdir was created <--
When checking in SQL Developer, EXT_index is always empty; the table doesn't seem to be created properly (i.e. the Data tab of that table doesn't even show column names).
Steps taken for attempted resolution
The folder '\\Station-1\mainFolder' is shared with 2 users having full control over it : Everyone and User1 (my account)
On the computer that has the database (My local computer) this '\\Station-1\mainFolder' is mapped to a network drive : J:\
When using J:\, csvdir is created as 'J:\stats'
When using \\Station-1\mainFolder, csvdir is created as '\\Station-1\mainFolder\stats'
As per this link, both services (TNSListener and OracleServiceXE) are configured to log on with my account, User1
Question
How can I import a csv file situated on a remote server into a local external table?

HDFS - extra data after last expected column

We have a source and a target system. We are trying to import data from SQL Server 2012 into Pivotal Hadoop (PHD 3.0) using the Talend tool.
We are getting this error:
ERROR: extra data after last expected column (seg0 slice1 datanode.domain.com:40000 pid=15035)
Detail: External table pick_report_stg0, line 5472 of pxf://masternnode/path/to/hdfs?profile=HdfsTextSimple: "5472;2016-11-28 08:39:54.217;;2016-11-15 00:00:00.0;SAMPLES;0005525;MORGAN -EVENTS;254056;1;IHBL-NHO..."
What we tried
We identified the bad line as:
[hdfs#mdw ~]$ hdfs dfs -cat /path/to/hdfs|grep 3548
3548;2016-11-28 04:21:39.97;;2016-11-15 00:00:00.0;SAMPLES;0005525;MORGAN -EVENTS;254056;1;IHBL-NHO-13OZ-01;0;ROC NATION; NH;2016-11-15 00:00:00.0;2016-11-15 00:00:00.0;;2.0;11.99;SA;SC01;NH02;EA;1;F2;NEW PKG ONLY PLEASE!! BY NOON
Structure of External table and Format clause
CREATE EXTERNAL TABLE schemaname.tablename
(
"ID" bigint,
"time" timestamp without time zone,
"ShipAddress4" character(40),
"EntrySystemDate" timestamp without time zone,
"CorpAcctName" character(40),
"Customer" character(7),
"CustomerName" character(30),
"SalesOrder" character(6),
"OrderStatus" character(1),
"MStockCode" character(30),
"ShipPostalCode" character(9),
"CustomerPoNumber" character(30),
"OrderDate" timestamp without time zone,
"ReqShipDate" timestamp without time zone,
"DateValue" timestamp without time zone,
"MOrderQty" numeric(9,0),
"MPrice" numeric(9,0),
"CustomerClass" character(2),
"ProductClass" character(4),
"ProductGroup" character(10),
"StockUom" character(3),
"DispatchCount" integer,
"MWarehouse" character(2),
"AlphaValue" varchar(100)
)
LOCATION (
'pxf://path/to/hdfs?profile=HdfsTextSimple'
)
FORMAT 'csv' (delimiter ';' null '' quote ';')
ENCODING 'UTF8';
Finding: an extra semicolon appeared in the data, which causes the extra column. But I am still unable to supply the correct format clause. Please guide me: how do I get rid of the "extra data after last expected column" error?
What format clause should I use?
Any help would be much appreciated!
If you append the following to your external table definition, after the ENCODING clause, it should help to resolve the issue where a small number of rows fail due to this issue:
LOG ERRORS INTO my_err_table SEGMENT REJECT LIMIT 1 PERCENT;
Here is a reference on this syntax: http://gpdb.docs.pivotal.io/4320/ref_guide/sql_commands/CREATE_EXTERNAL_TABLE.html
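Separately, the FORMAT clause in the question declares quote ';', the same character as the delimiter, so the parser treats data semicolons as quote characters. If the source fields are never actually quoted (an assumption), switching to TEXT format sidesteps quote handling entirely; a sketch of the tail of the DDL:

```sql
-- Sketch. Assumption: fields in the HDFS file are never quoted,
-- so plain TEXT format with a ';' delimiter is sufficient, and the
-- error-logging clause from the answer above absorbs stray bad rows.
FORMAT 'TEXT' (delimiter ';' null '')
ENCODING 'UTF8'
LOG ERRORS INTO my_err_table SEGMENT REJECT LIMIT 1 PERCENT;
```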

ORA-29913: error in executing ODCIEXTTABLEOPEN callout when inserting csv into oracle

I'm trying to execute this code in PL/SQL:
create or replace directory ext_tab_dir as 'C:/mydir';
GRANT READ,WRITE ON DIRECTORY ext_tab_dir TO PUBLIC;
DROP TABLE emp_load;
CREATE TABLE emp_load (v1 VARCHAR2(4000),
v2 VARCHAR2(4000)
)
ORGANIZATION EXTERNAL (
TYPE ORACLE_LOADER DEFAULT DIRECTORY ext_tab_dir
ACCESS PARAMETERS (
RECORDS DELIMITED BY NEWLINE
BADFILE ext_tab_dir:'bad.bad'
LOGFILE ext_tab_dir:'log.log'
FIELDS TERMINATED BY ','
)
LOCATION ('testfile.csv')
);
-- INSERT INTO tablename(v1,v2)
SELECT * From emp_load
and then I get the following errors:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
error opening file C:/mydir/log.log
I do get that it has something to do with permissions, but I'm the one who created that directory, so how do I grant privileges to myself if it is set like this by default? Is there any way to perform that sort of operation from PL/SQL?
Try something like this:
GRANT SELECT, INSERT, UPDATE, DELETE ON emp_load TO NikitaBuriak;
Replace 'NikitaBuriak' with the ID you used when you created the table.
You should grant the necessary privileges on every directory and file in the path leading to the file you are trying to access.
E.g.: if the file you are trying to access is /home/dummy_folder/new_folder/file.txt, then you should grant the appropriate privileges on dummy_folder, new_folder, and file.txt as well.
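At the OS level that advice translates to chmod calls like the following (a sketch, demonstrated on a throw-away copy of the answer's layout; the real paths and the exact modes are assumptions, and 777 on the data directory is deliberately permissive for illustration only):

```shell
# Build a disposable copy of the layout from the answer:
# dummy_folder/new_folder/file.txt
base=$(mktemp -d)
mkdir -p "$base/dummy_folder/new_folder"
printf '001,some_value\n' > "$base/dummy_folder/new_folder/file.txt"

# Every directory on the path needs execute (search) permission for the
# database OS user; the data directory also needs write permission if
# the access driver creates LOGFILE/BADFILE there.
chmod 711 "$base" "$base/dummy_folder"            # parents: search only for others
chmod 777 "$base/dummy_folder/new_folder"          # data dir: read + write + search
chmod 644 "$base/dummy_folder/new_folder/file.txt" # data file: world-readable

ls -ld "$base/dummy_folder/new_folder"
```

In practice you would tighten the modes (e.g. group-based access with the oracle user in the group), but the search/read/write requirements shown in the comments are what the access driver needs.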

Oracle external table with dba's directory

I wanted to create an external table, but did not have the CREATE ANY DIRECTORY permission (and could not have it granted). Fair enough, I asked the DBAs to run the following:
CREATE OR REPLACE DIRECTORY ext_data_files AS '/data/ext_data_files';
GRANT ALL ON DIRECTORY ext_data_files TO MYAPPUSER;
They did, and the final object has the following script:
CREATE OR REPLACE DIRECTORY
EXT_DATA_FILES AS
'/data/ext_data_files';
GRANT READ, WRITE ON DIRECTORY SYS.EXT_DATA_FILES TO MYAPPUSER;
(I got that by running a describe on the object in Toad)
I was then hoping to use this directory to create my external table with the script as follows:
CREATE TABLE MYAPPUSER.MY_EXT_TABLE
(
ID VARCHAR2(100 BYTE),
LOGIN VARCHAR2(100 BYTE),
CODE VARCHAR2(100 BYTE),
CREATED_AT VARCHAR2(100 BYTE)
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
DEFAULT DIRECTORY SYS.EXT_DATA_FILES
ACCESS PARAMETERS
( RECORDS DELIMITED BY NEWLINE
NOBADFILE
NOLOGFILE
FIELDS TERMINATED BY ';'
MISSING FIELD VALUES ARE NULL
( ID, LOGIN, CODE, CREATED_AT) )
LOCATION (SYS.EXT_DATA_FILES:'the_external_file.txt')
)
REJECT LIMIT 0
PARALLEL ( DEGREE DEFAULT INSTANCES DEFAULT )
NOMONITORING;
but then when I SELECT * FROM MY_EXT_TABLE, the result is the infamous
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04040: file the_external_file.txt in EXT_DATA_FILES not found
ORA-06512: at "SYS.ORACLE_LOADER", line 19
(which has quite a few hits on google, but none seem related)
I'm confident of the syntax since this is the exact same script used in our DEV environment. Also, the permissions of all files and directories involved were checked and there is nothing lower than 775.
The only difference I have here from DEV (where it works) is that the directory EXT_DATA_FILES was not created by MYAPPUSER. I tried to create a synonym for it, but that had no effect.
Maybe worth mentioning, it is Oracle 10g we are talking about.
Am I missing something obvious? Is this allowed?
All directories are in fact owned by SYS. That's why there is no CREATE DIRECTORY privilege, only CREATE ANY DIRECTORY.
So try the command without prefixing the directory name with the SYS schema and see what happens.
The error message reads:
"file the_external_file.txt in EXT_DATA_FILES not found"
Are you sure it's there?
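Concretely, dropping the SYS prefix from the question's DDL can be applied with an ALTER (a sketch, reusing the names from the question):

```sql
-- Sketch: reference the directory object by its bare name in both
-- places the table definition uses it; directory objects are not
-- schema objects, so the SYS. prefix can confuse resolution.
ALTER TABLE MYAPPUSER.MY_EXT_TABLE
  DEFAULT DIRECTORY EXT_DATA_FILES
  LOCATION (EXT_DATA_FILES:'the_external_file.txt');
```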

Oracle. load data infile error

Table:
CREATE TABLE image_table (
image_id NUMBER(5),
file_name VARCHAR2(30),
image_data BLOB);
SQL:
load data infile * replace into table test_image_table
fields terminated by ','
(
image_id INTEGER(5),
file_name CHAR(30),
image_data LOBFILE (CONSTANT 'C:\img.txt') TERMINATED BY EOF
)
C:\img.txt: 001,C:\1.jpg
Error:
ORA-00928: missing SELECT keyword
00928. 00000 - "missing SELECT keyword"
*Cause:
*Action:
Error at Line: 4 Column: 1
What am I doing wrong?
You want to use SQL*Loader which is not SQL*Plus. You have to save what you call SQL as a file with the .ctl extension, and call sqlldr:
sqlldr login/password@database control=my_file.ctl
Note that infile * means that you must have some BEGINDATA inside your CTL file.
It seems like you are trying to use SQL*Plus to run your SQL*Loader control file. Use one of the sqlldr commands below from your UNIX command line. Don't forget to save the file you called SQL as a .ctl file.
sqlldr username@server/password control=loader.ctl
or
sqlldr username/password@server control=loader.ctl
Try this in SQL Developer: host sqlldr username/password control=my_file.ctl
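Putting those pieces together, the control file might look like this. This is only a sketch: it assumes C:\img.txt is the data file listing an id and a jpg path per line, and that the BLOB should be loaded from the file named in the file_name field. Note the question's INTEGER(5) would be read as binary data, so INTEGER EXTERNAL is used for the textual '001':

```
-- my_file.ctl (sketch; assumptions noted above)
LOAD DATA
INFILE 'C:\img.txt'
REPLACE INTO TABLE image_table
FIELDS TERMINATED BY ','
(
  image_id   INTEGER EXTERNAL(5),
  file_name  CHAR(30),
  image_data LOBFILE(file_name) TERMINATED BY EOF
)
```

Since the data now comes from a named INFILE rather than infile *, no BEGINDATA section is needed.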
