Oracle: LOAD DATA INFILE error - image

Table:
CREATE TABLE image_table (
image_id NUMBER(5),
file_name VARCHAR2(30),
image_data BLOB);
SQL:
load data infile * replace into table test_image_table
fields terminated by ','
(
image_id INTEGER(5),
file_name CHAR(30),
image_data LOBFILE (CONSTANT 'C:\img.txt') TERMINATED BY EOF
)
Contents of C:\img.txt: 001,C:\1.jpg
Error:
ORA-00928: missing SELECT keyword
00928. 00000 - "missing SELECT keyword"
*Cause:
*Action:
Error at Line: 4 Column: 1
What am I doing wrong?

You want to use SQL*Loader, which is not SQL*Plus. You have to save what you call SQL as a file with the .ctl extension and call sqlldr:
sqlldr login/password@database control=my_file.ctl
Note that INFILE * means that you must have a BEGINDATA section inside your CTL file.
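For illustration, a minimal my_file.ctl along those lines might look like the sketch below. It reuses the table and field list from the question, except that INTEGER(5) is not a valid SQL*Loader datatype for character data, so the sketch uses INTEGER EXTERNAL(5):
LOAD DATA
INFILE *
REPLACE
INTO TABLE image_table
FIELDS TERMINATED BY ','
(
image_id INTEGER EXTERNAL(5),
file_name CHAR(30),
image_data LOBFILE (CONSTANT 'C:\img.txt') TERMINATED BY EOF
)
BEGINDATA
001,C:\1.jpg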

It seems like you are trying to use SQL*Plus to run your SQL*Loader control file. Use one of the sqlldr commands below from your UNIX command line. Don't forget to save the control file you posted as a .ctl file.
sqlldr username@server/password control=loader.ctl
or
sqlldr username/password@server control=loader.ctl

Try this in SQL Developer: host sqlldr username/password control=my_file.ctl

Related

Multi-file insert from Hive table not working?

Hi, I have 200 GB of data in one of my Hive tables, backed by HBase.
I have to create 142 different files out of that table; currently I am trying with 3 files only.
I want all the queries to run in parallel at the same time.
I was trying a multi-file insert from the Hive table but I am getting a parse exception.
This is the query I was trying:
FROM hbase_table_FinancialLineItem
INSERT OVERWRITE LOCAL DIRECTORY '/hadoop/user/m6034690/FSDI/FinancialLineItem/Japan.txt'
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\t'
STORED AS TEXTFILE
select * from hbase_table_FinancialLineItem WHERE FilePartition='Japan'
INSERT OVERWRITE LOCAL DIRECTORY '/hadoop/user/m6034690/FSDI/FinancialLineItem/SelfSourcedPrivate.txt'
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\t'
STORED AS TEXTFILE
select * from hbase_table_FinancialLineItem WHERE FilePartition='SelfSourcedPrivate'
INSERT OVERWRITE LOCAL DIRECTORY '/hadoop/user/m6034690/FSDI/FinancialLineItem/ThirdPartyPrivate.txt'
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\t'
STORED AS TEXTFILE
select * from hbase_table_FinancialLineItem WHERE FilePartition='ThirdPartyPrivate';
And after running this I was getting the below error.
FAILED: ParseException line 7:9 missing EOF at 'from' near '*'
I think it can be solved by removing from hbase_table_FinancialLineItem from each SELECT: in a multi-insert statement, the single leading FROM clause supplies the source table, and the parser fails as soon as it hits a second FROM.
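For reference, a corrected multi-insert would look roughly like this, with one leading FROM and no FROM inside each SELECT (paths and partition values as in the question; the ThirdPartyPrivate branch follows the same pattern):
FROM hbase_table_FinancialLineItem
INSERT OVERWRITE LOCAL DIRECTORY '/hadoop/user/m6034690/FSDI/FinancialLineItem/Japan.txt'
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\t'
STORED AS TEXTFILE
SELECT * WHERE FilePartition='Japan'
INSERT OVERWRITE LOCAL DIRECTORY '/hadoop/user/m6034690/FSDI/FinancialLineItem/SelfSourcedPrivate.txt'
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\t'
STORED AS TEXTFILE
SELECT * WHERE FilePartition='SelfSourcedPrivate';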

How to make an INFILE in SQLLDR

Help, guys, I need to make a dynamic INFILE. The common INFILE is
INFILE 'yourcsvfile.csv'
LOAD DATA
INFILE 'yourcsvfile.csv'
append
INTO TABLE table_name
TRUNCATE
FIELDS TERMINATED BY ','
Is it possible to make it like this?
INFILE (sysdate , 'YYYYMMDD') || (_STRING.csv)
meaning I am searching for 20160816_STRING.csv?
You can create a Windows batch script which gets the current date in the format you want (the %date% substring offsets below depend on your locale's date format) and then calls the Oracle loader; APPEND and TRUNCATE are mutually exclusive load methods, so only APPEND is kept:
set "part1=%date:~10,4%%date:~6,2%%date:~4,2%"
set "part2=_STRING.csv"
set "yourfile=%part1%%part2%"
@echo LOAD DATA INFILE %yourfile% APPEND INTO TABLE table_name FIELDS TERMINATED BY ','; | sqlplus username/password@database
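Since LOAD DATA is SQL*Loader syntax rather than SQL*Plus (see the first thread above), a variant sketch of the same idea writes a small control file and calls sqlldr instead; table_name, col1 and col2 are placeholders for your actual table and columns:
echo LOAD DATA INFILE '%yourfile%' APPEND INTO TABLE table_name > my_file.ctl
echo FIELDS TERMINATED BY ',' >> my_file.ctl
echo (col1, col2) >> my_file.ctl
sqlldr username/password@database control=my_file.ctl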

ORA-29913: error in executing ODCIEXTTABLEOPEN callout when inserting csv into oracle

I'm trying to execute this code in PL/SQL:
create or replace directory ext_tab_dir as 'C:/mydir';
GRANT READ,WRITE ON DIRECTORY ext_tab_dir TO PUBLIC;
DROP TABLE emp_load;
CREATE TABLE emp_load (v1 VARCHAR2(4000),
v2 VARCHAR2(4000)
)
ORGANIZATION EXTERNAL (
TYPE ORACLE_LOADER DEFAULT DIRECTORY ext_tab_dir
ACCESS PARAMETERS (
RECORDS DELIMITED BY NEWLINE
BADFILE ext_tab_dir:'bad.bad'
LOGFILE ext_tab_dir:'log.log'
FIELDS TERMINATED BY ','
)
LOCATION ('testfile.csv')
);
-- INSERT INTO tablename(v1,v2)
SELECT * From emp_load
and then I get the following errors:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error error opening file C:/mydir/log.log
I do get that it has something to do with permissions, but I'm the one who created that directory, so how do I grant privileges to myself if it is set like this by default? Is there any way to perform that sort of operation from PL/SQL?
Try something like this.
GRANT SELECT, INSERT, UPDATE, DELETE ON emp_load TO NikitaBuriak;
Replace 'NikitaBuriak' with the ID you used when you created the table.
You should grant all the privileges on the directories and files in the path leading to the file you are trying to access.
E.g.: if the file you are trying to access is /home/dummy_folder/new_folder/file.txt,
then you should grant all the administrative privileges on dummy_folder, new_folder and file.txt as well.
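Note also that when the log or bad file itself cannot be opened, the usual culprit is operating-system permissions rather than database grants: the OS account that runs the Oracle database processes must be able to write to C:\mydir. A sketch on Windows, assuming the database runs under the virtual account NT SERVICE\OracleServiceORCL (hypothetical; check which account your Oracle service actually uses):
icacls C:\mydir /grant "NT SERVICE\OracleServiceORCL":(OI)(CI)M
Here (OI)(CI)M grants modify rights that are inherited by files and subfolders created in the directory.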

Hive error: ParseException missing EOF

I am not sure what I am doing wrong here:
hive> CREATE TABLE default.testtbl(int1 INT,string1 STRING)
stored as orc
tblproperties ("orc.compress"="NONE")
LOCATION "/user/hive/test_table";
FAILED: ParseException line 1:107 missing EOF at 'LOCATION' near ')'
while the following query works perfectly fine:
hive> CREATE TABLE default.testtbl(int1 INT,string1 STRING)
stored as orc
tblproperties ("orc.compress"="NONE");
OK
Time taken: 0.106 seconds
Am I missing something here? Any pointers will help. Thanks!
Try putting the LOCATION in front of tblproperties like below; it worked for me.
CREATE TABLE default.testtbl(int1 INT,string1 STRING)
stored as orc
LOCATION "/user/hive/test_table"
tblproperties ("orc.compress"="NONE");
It seems even the sample SQL from the book "Programming Hive" got the order wrong. Please refer to the official definition of the CREATE TABLE command:
https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL#LanguageManualDDL-CreateTable
@Haiying Wang pointed out that LOCATION is to be put in front of tblproperties.
But I think the error also occurs when LOCATION is specified above STORED AS.
It's better to stick to the correct order:
CREATE [TEMPORARY] [EXTERNAL] TABLE [IF NOT EXISTS] [db_name.]table_name -- (Note: TEMPORARY available in Hive 0.14.0 and later)
[(col_name data_type [COMMENT col_comment], ... [constraint_specification])]
[COMMENT table_comment]
[PARTITIONED BY (col_name data_type [COMMENT col_comment], ...)]
[CLUSTERED BY (col_name, col_name, ...) [SORTED BY (col_name [ASC|DESC], ...)] INTO num_buckets BUCKETS]
[SKEWED BY (col_name, col_name, ...) -- (Note: Available in Hive 0.10.0 and later)
ON ((col_value, col_value, ...), (col_value, col_value, ...), ...)
[STORED AS DIRECTORIES]]
[
[ROW FORMAT row_format]
[STORED AS file_format]
| STORED BY 'storage.handler.class.name' [WITH SERDEPROPERTIES (...)] -- (Note: Available in Hive 0.6.0 and later)
]
[LOCATION hdfs_path]
[TBLPROPERTIES (property_name=property_value, ...)] -- (Note: Available in Hive 0.6.0 and later)
[AS select_statement]; -- (Note: Available in Hive 0.5.0 and later; not supported for external tables)
Refer: Hive Create Table
Check this post:
Loading Data from a .txt file to Table Stored as ORC in Hive
And check the source files present at the specified directory /user/hive/test_table. In case the files are in .txt or some other non-ORC format, you can follow the steps in the above post to get past the error.
ParseException line lineNumber missing EOF at '.' near 'schemaName':
Got the above error while trying to execute the following command from a Linux script to truncate a Hive table:
dse -u username -p password hive -e "truncate table keyspace.tablename;"
Fix:
Separate the commands within the script line as follows:
dse -u username -p password hive -e "use keyspace; truncate table keyspace.tablename;"
Happy coding!
Got the same error while creating a table in Hive.
I used the DROP command to drop the table and then ran the CREATE TABLE command that I had again.
It worked for me.
If you see this error when running HiveQL from a file with the command "hive -f file.hql", and the error points at the first line of your query, it is most likely caused by a forgotten semicolon (;) after a previous query, since the parser looks for a semicolon (;) as the terminator of each query.
For example:
DROP TABLE IF EXISTS default.emp
create table default.emp (
field1 type,
field2 type)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
STORED AS TEXTFILE
LOCATION 's3://gts-promocube/source-data/Lowes/POS/';
If you save the above in a file and execute it with hive -f, then you'll get the error:
FAILED: ParseException line 2:0 missing EOF at 'CREATE' near emp.
Solution: put a semicolon (;) after the DROP TABLE command above.

Exporting Data from Hive table to Local Machine File System

Using following command:
insert overwrite local directory '/my/local/filesystem/directory/path'
select * from Emp;
overwrites all of the existing data in /my/local/filesystem/directory/path with the data of Emp.
What I want is to just add the data of Emp to /my/local/filesystem/directory/path without overwriting it; how do I do that?
Following are my failed trials:
hive> insert into local directory '/home/cloudera/Desktop/Sumit' select * from appdata;
FAILED: ParseException line 1:12 mismatched input 'local' expecting
TABLE near 'into' in insert clause
hive> insert local directory '/home/cloudera/Desktop/Sumit' select * from appdata;
FAILED: ParseException line 1:0 cannot recognize input near 'insert'
'local' 'directory' in insert clause
Can you please tell me how I can get this solved?
To append to a Hive table you need to use INSERT INTO:
INSERT INTO will append to the table or partition, keeping the existing
data intact. (Note: INSERT INTO syntax is only available starting in
version 0.8.)
https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DML#LanguageManualDML-InsertingdataintoHiveTablesfromqueries
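For a table target, an append looks like this (staging_emp is a hypothetical source table):
INSERT INTO TABLE Emp SELECT * FROM staging_emp;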
But you can't use this to append to an existing local file, so another option is to use a bash command.
If you have a file called 'export.hql' and in that file your code is:
select * from Emp;
Then your bash command can be:
hive -f 'export.hql' >> localfile.txt
The -f flag executes the Hive file, and the >> redirection appends the results to the text file.
EDIT:
The command:
hive -f 'export.hql' > localfile.txt
will save the query results to the file, overwriting it rather than appending.
https://cwiki.apache.org/confluence/display/Hive/GettingStarted#GettingStarted-SQLOperations
When using 'LOCAL', 'OVERWRITE' is also needed in your hql.
For example:
INSERT OVERWRITE LOCAL DIRECTORY '/tmp/out' SELECT * FROM test
