Import from Windows to Hive: ParseException line 1:0 cannot recognize input near 'into' '<EOF>'

I am trying to import a text file from my Windows machine into Hive. I created a table in Hive and used this command:
load data local in path C:\Users\me\Desktop\test.txt into test_t sample;
But it doesn't work; the error is:
Error while compiling statement: FAILED: ParseException line 1:69 missing TABLE at 'test_t' near '' line 1:76 extraneous input 'sample' expecting EOF near '' [ERROR_STATUS]
Do you have any advice?

You have not mentioned the TABLE keyword in your query.
Assume your Hive table name is sample:
load data local inpath "C:\Users\me\Desktop\test.txt" into table sample;
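For reference, the general form of the statement, as given in the Hive DML manual, is:
LOAD DATA [LOCAL] INPATH 'filepath' [OVERWRITE] INTO TABLE tablename [PARTITION (partcol1=val1, partcol2=val2, ...)];
Note that with LOCAL, the path is resolved on the machine where the Hive client (or HiveServer2) runs, so a Windows path only works if that process can actually see it.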

Related

Hive Load Data: No files matching path file:/home/hive/sample.log

I am trying to load a sample.log file on the HDP sandbox.
My initial attempt:
LOAD DATA LOCAL INPATH 'sample.log' OVERWRITE INTO TABLE logs;
It seems that the path does not match:
Error: Error while compiling statement: FAILED: SemanticException Line 1:23 Invalid path ''sample.log'': No files matching path file:/home/hive/sample.log (state=42000,code=40000)
I logged out, moved to /root, then entered Hive:
0: jdbc:hive2://sandbox-hdp.hortonworks.com:2> LOAD DATA LOCAL INPATH '/root/Hadoop_Spark_Fundamentals_Code_Notes-V3.0/Lesson-6/Lesson-6.2_Hive/sample.log' OVERWRITE INTO TABLE logs;
The full path does not work either.
Error: Error while compiling statement: FAILED: SemanticException Line 1:23 Invalid path ''/root/Hadoop_Spark_Fundamentals_Code_Notes-V3.0/Lesson-6/Lesson-6.2_Hive/sample.log'': No files matching path file:/root/Hadoop_Spark_Fundamentals_Code_Notes-V3.0/Lesson-6/Lesson-6.2_Hive/sample.log (state=42000,code=40000)
It looks to me that it confuses /root and /home/hive.
How to set the proper path?
Your statement is being executed by the user 'hive'. Make sure the local file has permissions that allow 'hive' read access to it.
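Note that /root is typically not traversable by other users, so even a world-readable file there stays out of reach. A minimal workaround sketch (/tmp is just an example of a directory the 'hive' user can reach):
cp /root/Hadoop_Spark_Fundamentals_Code_Notes-V3.0/Lesson-6/Lesson-6.2_Hive/sample.log /tmp/
chmod 644 /tmp/sample.log
Then, from the Hive session:
LOAD DATA LOCAL INPATH '/tmp/sample.log' OVERWRITE INTO TABLE logs;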

How to load a CSV file from HDFS into an HBase table using ImportTsv

I am trying to load a CSV file into an HBase table using the ImportTsv tool.
The CSV files reside in a directory in my HDFS (/csvFiles).
The CSV file was generated from a MySQL table with the following fields:
+-------------+
| Field       |
+-------------+
| tweet_id    |
| user_id     |
| screen_name |
| description |
| created_at  |
+-------------+
I created a table in HBase with a single column family, as shown below:
create 'dummyTable', 'cf1'
The command I am using:
ImportTsv -Dimporttsv.separator=',' -Dimporttsv.columns=HBASE_ROW_KEY,cf1:user_id,cf1:tweet_id,cf1:screen_name,cf1:description,cf1:created_at dummyTable /csvFiles/all_users.csv
However, I am getting this syntax error:
SyntaxError: (hbase):8: syntax error, unexpected tSYMBEG
I've looked at the following posts and followed their recommendations, but to no avail. I would appreciate your help.
Import TSV file into hbase table
https://community.hortonworks.com/articles/4942/import-csv-data-into-hbase-using-importtsv.html
http://hbase.apache.org/book.html#importtsv
Exit the HBase shell and run ImportTsv from bash, adding single quotes around importtsv.columns:
bash$ hbase org.apache.hadoop.hbase.mapreduce.ImportTsv -Dimporttsv.separator=',' -Dimporttsv.columns='HBASE_ROW_KEY,cf1:user_id,cf1:tweet_id,cf1:screen_name,cf1:description,cf1:created_at' dummyTable hdfs://<your_name_node_addr>/csvFiles/all_users.csv
Note that ImportTsv is a MapReduce tool launched from the operating-system shell, not an HBase shell command. Typing it at the hbase(main):...> prompt is exactly what produces the unexpected tSYMBEG syntax error, because the shell tries to parse the command as Ruby.
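Once the job finishes, a quick scan from the HBase shell is an easy sanity check (standard shell syntax; the LIMIT value is arbitrary):
scan 'dummyTable', {LIMIT => 5}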

Receiving an error while using a parameter in a Hive script

I am trying to remove the hard-coding from a Hive script. For that I have created an hql file (src_sys_cd_param.hql).
I set the source-system value through a parameter; the param file is run like this:
hive -f /data/data01/dev/edl/md/sptfr/landing/src_sys_cd_param.hql;
The param file contains the command set src_sys_cd = 'M09';.
I then run the script below:
INSERT INTO TABLE SPTFR_CORE.M09_PRTY SELECT C.EDW_SK,A.PRTY_TYPE_HIER_ID,
A.PRTY_NUM,A.PRTY_DESC,A.PRTY_DESC,'N',${hiveconf:src_sys_cd},
A.DAI_UPDT_DTTM,A.DAI_CRT_DTTM
FROM SPTFR_STG.M09_PRTY_VIEW_STG A JOIN SPTFR_STG.BKEY_PRTY_STG C
ON ( CONCAT(A.PRTY_TYPE_LVL_1_CD,'|^',A.PRTY_NUM ,'|^',A.SRC_SYS_CD)= C.SRC_CMBN);
Receiving the error:
Error while compiling statement: FAILED: ParseException line 1:113 cannot recognize input near '$' '{' 'hiveconf' in selection target
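No accepted fix is shown here, but one likely cause is that the set command runs in its own hive -f session, so the variable no longer exists when the INSERT script runs. A hedged sketch of one way to keep both in the same session, using the Hive CLI's -i initialization-file option (insert_script.hql is a hypothetical name for the file holding the INSERT):
hive -i /data/data01/dev/edl/md/sptfr/landing/src_sys_cd_param.hql -f insert_script.hql
With the variable defined in the same session, ${hiveconf:src_sys_cd} is substituted before parsing, and the ParseException should go away.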

Hive error: ParseException missing EOF

I am not sure what I am doing wrong here:
hive> CREATE TABLE default.testtbl(int1 INT,string1 STRING)
stored as orc
tblproperties ("orc.compress"="NONE")
LOCATION "/user/hive/test_table";
FAILED: ParseException line 1:107 missing EOF at 'LOCATION' near ')'
while the following query works perfectly fine:
hive> CREATE TABLE default.testtbl(int1 INT,string1 STRING)
stored as orc
tblproperties ("orc.compress"="NONE");
OK
Time taken: 0.106 seconds
Am I missing something here? Any pointers will help. Thanks!
Try putting LOCATION in front of tblproperties, like below; this worked for me:
CREATE TABLE default.testtbl(int1 INT,string1 STRING)
stored as orc
LOCATION "/user/hive/test_table"
tblproperties ("orc.compress"="NONE");
It seems even the sample SQL from the book "Programming Hive" got the order wrong. Please refer to the official definition of the CREATE TABLE command:
https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL#LanguageManualDDL-CreateTable
@Haiying Wang pointed out that LOCATION is to be put in front of tblproperties.
But the error also occurs when LOCATION is specified above STORED AS.
It's better to stick to the correct order:
CREATE [TEMPORARY] [EXTERNAL] TABLE [IF NOT EXISTS] [db_name.]table_name -- (Note: TEMPORARY available in Hive 0.14.0 and later)
[(col_name data_type [COMMENT col_comment], ... [constraint_specification])]
[COMMENT table_comment]
[PARTITIONED BY (col_name data_type [COMMENT col_comment], ...)]
[CLUSTERED BY (col_name, col_name, ...) [SORTED BY (col_name [ASC|DESC], ...)] INTO num_buckets BUCKETS]
[SKEWED BY (col_name, col_name, ...) -- (Note: Available in Hive 0.10.0 and later)
   ON ((col_value, col_value, ...), (col_value, col_value, ...), ...)
   [STORED AS DIRECTORIES]]
[
[ROW FORMAT row_format]
[STORED AS file_format]
| STORED BY 'storage.handler.class.name' [WITH SERDEPROPERTIES (...)] -- (Note: Available in Hive 0.6.0 and later)
]
[LOCATION hdfs_path]
[TBLPROPERTIES (property_name=property_value, ...)] -- (Note: Available in Hive 0.6.0 and later)
[AS select_statement]; -- (Note: Available in Hive 0.5.0 and later; not supported for external tables)
Refer: Hive Create Table
Check this post:
Loading Data from a .txt file to Table Stored as ORC in Hive
And check that your source files are present in the specified directory /user/hive/test_table. In case the files are in .txt or some other non-ORC format, you can follow the steps in the above post to get past the error.
ParseException line lineNumber missing EOF at '.' near 'schemaName':
Got the above error while trying to execute the following command from a Linux script to truncate a Hive table:
dse -u username -p password hive -e "truncate table keyspace.tablename;"
Fix:
Separate the commands within the script line as follows:
dse -u username -p password hive -e "use keyspace; truncate table keyspace.tablename;"
Happy coding!
Got the same error while creating a table in Hive.
I used the DROP command to drop the table and then reran my CREATE TABLE command.
That worked for me.
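For the question's table, that would look something like this (table name taken from the question above):
DROP TABLE IF EXISTS default.testtbl;
followed by rerunning the CREATE TABLE statement with its clauses in the order shown earlier.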
If you see this error when running HiveQL from a file with the command hive -f file.hql, and the error points at the first line of your query, it is most likely caused by a forgotten semicolon (;) at the end of a previous query, since the parser expects a semicolon as the terminator of each query.
For example:
DROP TABLE IF EXISTS default.emp
create table default.emp (
field1 type,
field2 type)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
STORED AS TEXTFILE
LOCATION 's3://gts-promocube/source-data/Lowes/POS/';
If you save the above in a file and execute it with hive -f, then you'll get the error:
FAILED: ParseException line 2:0 missing EOF at 'CREATE' near emp.
Solution: put a semicolon (;) after the DROP TABLE command above.
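With the fix applied, the first statement becomes:
DROP TABLE IF EXISTS default.emp;
and the CREATE TABLE that follows then parses cleanly.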

Hive error when creating an external table (state=08S01,code=1)

I'm trying to create an external table in Hive, but keep getting the following error:
create external table foobar (a STRING, b STRING) row format delimited fields terminated by "\t" stored as textfile location "/tmp/hive_test_1375711405.45852.txt";
Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask (state=08S01,code=1)
Aborting command set because "force" is false and command failed: "create external table foobar (a STRING, b STRING) row format delimited fields terminated by "\t" stored as textfile location "/tmp/hive_test_1375711405.45852.txt";"
The contents of /tmp/hive_test_1375711405.45852.txt are:
abc\tdef
I'm connecting via the beeline command line interface, which uses Thrift HiveServer2.
System:
Hadoop 2.0.0-cdh4.3.0
Hive 0.10.0-cdh4.3.0
Beeline 0.10.0-cdh4.3.0
Client OS - Red Hat Enterprise Linux Server release 6.4 (Santiago)
The issue was that I was pointing the external table at a file in HDFS instead of a directory. The cryptic Hive error message really threw me off.
The solution is to create a directory and put the data file in it. To fix the above example, you'd create the directory /tmp/foobar and place hive_test_1375711405.45852.txt in it. Then create the table like so:
create external table foobar (a STRING, b STRING) row format delimited fields terminated by "\t" stored as textfile location "/tmp/foobar";
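A sketch of the corresponding HDFS commands (paths taken from the question; hdfs dfs is the standard client on this Hadoop version):
hdfs dfs -mkdir /tmp/foobar
hdfs dfs -mv /tmp/hive_test_1375711405.45852.txt /tmp/foobar/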
We faced a similar problem at our company (a Sentry, Hive, and Kerberos combination). We solved it by removing all privileges granted on non-fully-qualified HDFS URIs. For example, we changed GRANT ALL ON URI '/user/test' TO ROLE test; to GRANT ALL ON URI 'hdfs-ha-name:///user/test' TO ROLE test;.
You can find the privileges for a specific URI in the Hive database (MySQL in our case).
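If you'd rather not query the metastore backend directly, Hive's authorization DDL can list what a role holds, e.g. (role name from the example above):
SHOW GRANT ROLE test;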
