I am having trouble importing the TPC-H benchmark data (generated with dbgen) into my MonetDB database.
I've already created all the tables and I'm trying to import using the following command:
COPY RECORDS INTO region FROM "PATH\region.tbl" DELIMITERS tuple_seperator '|' record_seperator '\r\n';
And I get the following error message:
syntax error, unexpected RECORDS, expecting BINARY or INTO in: "copy records"
I also found this one on the internet:
COPY INTO sys.region 'PATH/region.tbl' using delimiters '|','\n';
But I get the following error message:
syntax error, unexpected IDENT, expecting FROM in: "copy into sys.region "C:\ProgramData\MySQL\MySQL Server 5.7\Uploads\region."
Because I'm a new MonetDB user I can't figure out what I'm doing wrong.
Any help will be appreciated :)
The RECORDS construct expects a number, specifically how many records you are going to load. I usually do this:
COPY 5 RECORDS INTO region FROM '/path/to/region.tbl' USING DELIMITERS '|', '|\n' LOCKED;
Also, in your second attempt you are missing a FROM before the path to the file, like this:
COPY INTO sys.region FROM '/path/to/region.tbl' USING DELIMITERS '|', '\n';
See here for more information: https://www.monetdb.org/Documentation/Manuals/SQLreference/CopyInto
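Judging by the second error message, the file path was also wrapped in double quotes; MonetDB (like standard SQL) treats double-quoted tokens as identifiers, so the path has to be a single-quoted string. Putting both fixes together for a dbgen-generated region.tbl (dbgen ends every record with a trailing '|' before the newline, hence the '|\n' record delimiter; the path is a placeholder), something like this should work:
-- path is a placeholder; note the single quotes around it
COPY INTO sys.region FROM '/path/to/region.tbl' USING DELIMITERS '|', '|\n';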
I cannot figure out why Hive is throwing an error in the following script:
use <output_db>
drop table if exists <new_tbl>;
create table <new_tbl> like <old_tbl>;
load data local inpath <directory> into table <new_tbl>;
The exception is:
FAILED: ParseException line 4:23 mismatched input '<directory>' expecting StringLiteral near 'inpath' in load statement
Sorry if this is an elementary question, but I've copied it from similar HQL statements that work and I can't find a satisfactory answer.
Seems like this:
load data local inpath directory into table
should be:
load data local inpath 'directory' into table
The path should be enclosed within single quotes.
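For example, with a hypothetical path and table name standing in for the placeholders in your script:
load data local inpath '/home/user/data/new_tbl_data' into table new_tbl;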
Hope it helps...!!!
I am getting the below error in my script which runs SQLLDR:
SQL*Loader-522: lfiopn failed for file (/home/abc/test_loader/load/badfiles/TBLLOAD20150520.bad)
As far as I know this error is related to permissions, but I am wondering: there is no "badfiles" folder present in the "/load" folder. I have already defined the badfiles folder outside the load folder, so why is the error pointing to this location?
Is it that my input file has some problem and SQLLDR is trying to create a bad file in the mentioned location?
Below is the SQLLDR command:
$SQLLDR $LOADER_USER/$USER_PWD#$LOADER_HOSTNAME control=$CTLFDIR/CTL_FILE.ctl BAD=$BADFDIR/$BADFILE$TABLE_NAME ERRORS=0 DIRECT=TRUE PARALLEL=TRUE LOG=$LOGDIR/$TABLE_NAME$LOGFILE &
Below is the control file template:
LOAD DATA
INFILE '/home/abc/test_loader/load/FILENAME_20150417_001.csv' "STR '\n'"
APPEND
INTO TABLE STAGING.TAB_NAME
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(
COBDATE,
--
--
--
FUTUSE30 TERMINATED BY WHITESPACE
)
Yes, your input file has a problem, so sqlldr wants to create a file containing the rejected rows (the BAD file). The BAD file creation fails due to insufficient privileges: the user who runs sqlldr does not have the rights to create a file in the folder you defined to contain BAD files.
Add write privileges on the BAD folder for the user who runs sqlldr, or place the BAD folder elsewhere.
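A minimal sketch, assuming the directory shown in the error message is the one sqlldr actually resolves $BADFDIR to, and that the loading account should own it (adjust the path and permissions to your environment):
# create the BAD directory named in the error and make it writable
# by the account that runs sqlldr
mkdir -p /home/abc/test_loader/load/badfiles
chmod u+w /home/abc/test_loader/load/badfiles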
This is likely some kind of permissions issue on writing the log file, maybe after moving services to a different server.
I ran into the same error. The problem was resolved by renaming the existing log file in the filesystem and rerunning the process. Upon rerunning, the SQLLDR process was able to recreate the log file, and subsequent executions were able to rewrite the log.
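For instance, something along these lines (reusing the variables from the command in the question) moves the stale log aside so sqlldr can recreate it on the next run:
# move the old log out of the way; the next run recreates it
mv "$LOGDIR/$TABLE_NAME$LOGFILE" "$LOGDIR/$TABLE_NAME$LOGFILE.old"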
I am really new to SQLite.
I want to update BLOBs in the column "data" in my database and I got it working:
UPDATE genp SET data= X'MyHexData' WHERE rowid=510849
As I want to update multiple BLOBs in the column data, I decided to write a .sh script:
sqlite3 my.db 'UPDATE genp SET data= X'MyHexData' WHERE rowid=510849'
When I execute this script I get the error message:
SQL error: no such column: XMyHexData
Why does SQLite think that my hex data is supposed to be the column? Where is my mistake? It works if I run this in the Command Line Shell of SQLite.
EDIT:
I got it working, like this:
sqlite3 my.db "UPDATE genp SET data= X'MyHexData' WHERE rowid= '510849'"
Thanks for all your help
You've already used single quotes to quote the whole argument, so the inner single quotes around the hex literal end it early (which is why SQLite sees the column name XMyHexData). Escape them:
... '...'\''...'
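A sketch of both workarounds, using the statement from the question (MyHexData stands in for the real hex string):
# keep the outer single quotes and escape each inner quote with the '\'' sequence
sqlite3 my.db 'UPDATE genp SET data= X'\''MyHexData'\'' WHERE rowid=510849'
# or switch the outer quotes to double quotes, as in the edit above
sqlite3 my.db "UPDATE genp SET data= X'MyHexData' WHERE rowid=510849"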
I have just started using Laravel and cannot get my head around how it throws errors. It doesn't show the line where the error is so I don't know how to locate it. Can anyone help?
htmlentities() expects parameter 1 to be string, array given (View:
M:\webserver\www\app\views\products\admin\create.blade.php)
This file is incredibly long and I cannot see where this array is being sent.
It's obviously coming from a Form::text(), but I am passing null as the second param in all the ones I can see. Why doesn't Laravel simply tell me the line that is erroring? The error it puts out is of no use to me.
Check the error file:
app/storage/logs/laravel.log
You can watch changes to the file (on Mac and *NIX) from the command line:
tail -f app/storage/logs/laravel.log
Remember that the storage directory must be writable by the webserver/PHP process because it's used as scratch space (for Blade views, logs, etc.).
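A minimal sketch of doing that on a Laravel 4 style layout, assuming the web server runs as the www-data group (substitute your own user/group):
# give the web server group write access to the storage scratch space
chown -R :www-data app/storage
chmod -R 775 app/storage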
Oracle DB / Windows XP:
I am running a batch file that calls a ".ctl" file, which in turn calls an ".xls" file; both are present in the same folder.
The idea is to load the data into an Oracle DB on a remote Oracle server (not the local machine).
I am getting this error, no matter what I do.
Oracle error: LRM-00116: syntax error at 'control' following '='
The .bat file code is as below:
rem SET SQLLOGIN=remod/P3w1d0ry#wsd
pause Ready to Load the remo.Temp_data Table
sqlldr userid=%SQLLOGIN% control=TempData.ctl errors=100
pause
The .ctl file is as follows:
LOAD DATA
INFILE "data.xls"
replace
into table remo.Temp_data
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
test_abbr "rtrim(:test_abbr)",
test_pk "rtrim(:test_pk)",
test_sk "rtrim(:test_sk)",
test_dt "rtrim(:test_dt)",
test_email "rtrim(:test_email)",
)
You've remarked out the SET of SQLLOGIN. Also, you might want to put a call in front of the sqlldr statement. You'll also need some data to load...
SET SQLLOGIN=remod/P3w1d0ry#wsd
pause Ready to Load the remo.Temp_data Table
call sqlldr userid=%SQLLOGIN% control=TempData.ctl data=mydata.csv errors=100