Oracle error: LRM-00116: syntax error at 'control' following '='

Oracle DB / Windows XP:
I am running a batch file that calls a ".ctl" file which in turn reads an ".xls" file; both are present in the same folder.
The idea is to load the data into an Oracle DB on a remote Oracle server (not the local machine).
I am getting this error, no matter what I do.
Oracle error:- LRM-00116: syntax error at 'control' following '='
The .bat file code is as below
rem SET SQLLOGIN=remod/P3w1d0ry#wsd
pause Ready to Load the remo.Temp_data Table
sqlldr userid=%SQLLOGIN% control=TempData.ctl errors=100
pause
The .ctl file is as follows:
LOAD DATA
INFILE "data.xls"
replace
into table remo.Temp_data
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
test_abbr "rtrim(:test_abbr)",
test_pk "rtrim(:test_pk)",
test_sk "rtrim(:test_sk)",
test_dt "rtrim(:test_dt)",
test_email "rtrim(:test_email)",
)

You've remarked out the SET of SQLLOGIN with rem, so %SQLLOGIN% expands to nothing and the command becomes sqlldr userid= control=TempData.ctl ..., which is why SQL*Loader reports a syntax error at 'control' following '='. Also you might want to put a call in front of the sqlldr statement. You'll also need some data to load; SQL*Loader reads plain-text files, so save the spreadsheet as a .csv first:
SET SQLLOGIN=remod/P3w1d0ry#wsd
pause Ready to Load the remo.Temp_data Table
call sqlldr userid=%SQLLOGIN% control=TempData.ctl data=mydata.csv errors=100

Related

Use UTL_TCP.connection in PL/SQL

I have to read a .csv file that is in a directory on my Oracle VirtualBox VM. From my local Windows system, using SQL Developer, I want to create a procedure that reads that file on the VirtualBox machine. I don't understand how to use UTL_TCP.connection or how to write that procedure.
Consider using External tables.
Note that before creating an external table you must have a database directory created; the location it points to is then used to read files from. You can specify separators, delimiters, and filenames in the DDL statement when creating the table. Once set up, it can be queried with a standard SELECT statement.
For example:
create directory << db directory >> as '/u01';
create table ext_filename_csv (
column01 varchar2(4000),
column02 varchar2(4000),
column03 varchar2(4000)
)
organization external (
type oracle_loader
default directory << db directory >>
access parameters (
records delimited by newline
fields terminated by ",")
location ( << db directory >>:'filename.txt')
);
Read more here:
http://psoug.org/reference/externaltab.html
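For the procedure part of the question, a minimal sketch of how the external table could then be read from PL/SQL (it assumes the ext_filename_csv table from the example above with the placeholders filled in; the procedure name is hypothetical):
-- loop over the external table and print each row
create or replace procedure read_csv_file is
begin
  for rec in (select column01, column02, column03 from ext_filename_csv) loop
    dbms_output.put_line(rec.column01 || ',' || rec.column02 || ',' || rec.column03);
  end loop;
end;
/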

What does "expected StringLiteral" mean in HIVE?

I cannot figure out why HIVE is throwing me an error in the following script:
use <output_db>
drop table if exists <new_tbl>;
create table <new_tbl> like <old_tbl>;
load data local inpath <directory> into table <new_tbl>;
The exception is:
FAILED: ParseException line 4:23 mismatched input '<directory>' expecting StringLiteral near 'inpath' in load statement
Sorry if this is an elementary question. But I've copied it from similar hql statements that work and I can't find a satisfactory answer.
It seems like this:
load data local inpath directory into table
should be:
load data local inpath 'directory' into table
That is, the path enclosed in single quotes.
Hope it helps!
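Applied to the full script from the question (placeholders kept as written there, quotes added around the path), the corrected version would read:
use <output_db>;
drop table if exists <new_tbl>;
create table <new_tbl> like <old_tbl>;
load data local inpath '<directory>' into table <new_tbl>;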

SQL*Loader-522: lfiopn failed for file

I am getting the below error in my script, which runs SQLLDR:
SQL*Loader-522: lfiopn failed for file (/home/abc/test_loader/load/badfiles/TBLLOAD20150520.bad)
As far as I know this error is related to permissions, but I am wondering why, because there is no "badfiles" folder present inside the "load" folder. I have already defined the badfiles folder outside the load folder, so why is the error pointing at this location?
Is it that my input file has some problem and SQLLDR is trying to create a bad file in the mentioned location?
Below is the SQLLDR command:
$SQLLDR $LOADER_USER/$USER_PWD#$LOADER_HOSTNAME control=$CTLFDIR/CTL_FILE.ctl BAD=$BADFDIR/$BADFILE$TABLE_NAME ERRORS=0 DIRECT=TRUE PARALLEL=TRUE LOG=$LOGDIR/$TABLE_NAME$LOGFILE &
Below is the control file template:
LOAD DATA
INFILE '/home/abc/test_loader/load/FILENAME_20150417_001.csv' "STR '\n'"
APPEND
INTO TABLE STAGING.TAB_NAME
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(
COBDATE,
--
--
--
FUTUSE30 TERMINATED BY WHITESPACE
)
Yes, your input file has a problem, so sqlldr wants to create a file containing the rejected rows (the BAD file). The BAD file creation fails due to insufficient privileges: the user who runs sqlldr does not have the rights to create a file in the folder you defined for BAD files.
Add write privileges on the BAD folder for the user who runs sqlldr, or place the BAD folder elsewhere.
This is likely some kind of permissions issue on writing the log file, maybe after moving services to a different server.
I ran into the same error. The problem was resolved by renaming the existing log file in the filesystem and rerunning the process. Upon rerunning, the SQLLDR process was able to recreate the log file, and subsequent executions were able to rewrite the log.

Error running hive script in pseudo distributed mode

I am trying to run a Hive script in pseudo-distributed mode. The commands in the script run absolutely fine when I run them in interactive mode. However, when I put all those commands in a script and run it, I get an error.
The script:
add jar /path/to/jar/file;
create table flights(year int, month int,code string) row format serde 'com.bizo.hive.serde.csv.CSVSerde';
load data inpath '/tmp/hive-user/On_Time_On_Time_Performance_2013_1.csv' overwrite into table flights;
The 'On_Time_On_Time_Performance_2013_1.csv' does exist in the HDFS. The error I get is:
FAILED: SemanticException Line 3:17 Invalid path ''/tmp/hive-user/On_Time_On_Time_Performance_2013_1.csv'': No files matching path hdfs://localhost:54310/tmp/hive-user/On_Time_On_Time_Performance_2013_1.csv
fs.default.name=hdfs://localhost:54310
My hadoop is running fine.
Can someone give any pointers?
Thanks.
This is not really an answer, but it is a more detailed and repeatable formulation of your question.
a) One needs to download the csv-serde from here: git clone https://github.com/ogrodnek/csv-serde
b) Build it using mvn package
c) Create a text file containing three comma-separated fields corresponding to the three columns of the given table.
d) If the path is, say, "/shared", then the following is the correct sequence to load:
add jar /shared/csv-serde/target/csv-serde-1.1.2-0.11.0-all.jar;
drop table if exists flights;
create table flights(year int, month int,code string) row format serde 'com.bizo.hive.serde.csv.CSVSerde' stored as textfile;
load data inpath '/tmp/hive-user/On_Time_On_Time_Performance_2013_1.csv' overwrite into table flights;
I do see the same error as in the OP: FAILED: SemanticException Line 2:17 Invalid path ''/tmp/hive-user/On_Time_On_Time_Performance_2013_1.csv'': No files matching path hdfs://localhost:9000/tmp/hive-user/On_Time_On_Time_Performance_2013_1.csv
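One possible explanation, offered as an assumption rather than something confirmed in the thread: LOAD DATA INPATH (without LOCAL) moves the source file into the table's warehouse directory instead of copying it, so after one successful load the file is gone from /tmp/hive-user, and re-running the same script fails with exactly this "No files matching path" error. A quick check from the Hive CLI before the load:
dfs -ls /tmp/hive-user/;
-- if the file is no longer there, put it back into HDFS first, e.g.
-- hdfs dfs -put On_Time_On_Time_Performance_2013_1.csv /tmp/hive-user/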

Use parameters with CTL

I am using a CTL file to load data stored in a file to a specific table in my Oracle database.
Currently, I launch the loader file using the following command line:
sqlldr user/pwd#db data=my_data_file control=my_loader.ctl
I would like to know if it is possible to specify parameters to be retrieved in the CTL file.
Also, is it possible to retrieve the name of the data file used by the CTL to fill the table? I would also like to insert it for each row; currently I have to call a procedure to update the previously inserted records.
Any help would be appreciated!
As far as I know there is no way to pass a parameter as a variable in the CTL file. But you can use a constant in the CTL file and modify the file to change that constant value for each load.
Edit: to be more specific.
my_loader.ctl:
--options
load data
infile 'c:\$datfilename$' --this is optional; you can specify it here or on the command line
into table mytable
fields....
(
datafilename constant '$datfilename$', -- will be replaced by the real data file name on each load
datacol1 char(1),
....
)
dataload.bat (assume that $datfilename$ is the placeholder text that will be replaced by the data file's name):
::sample copy
copy my_loader.ctl my_loader_temp.ctl
::replace the placeholder with the data file's name (the value that will be loaded into the table's datafilename column)
findandreplace my_loader_temp.ctl "$datfilename$" "%1"
::load
sqlldr user/pwd#db data=%1 control=my_loader_temp.ctl
::or, with data= omitted, if you specified infile in the control file
sqlldr user/pwd#db control=my_loader_temp.ctl
Usage: dataload.bat mydatafile_2010_10_10.txt
