Hive: getting 'unable to alter partition' error - hadoop

I'm trying to insert data into my partitioned table using an INSERT statement, but I get an error saying "Unable to alter partition".
The query that I used:
hive> INSERT INTO TABLE sms_partitiontable PARTITION(userc,coding) SELECT headerguid,messageid,userid,tonumber,fromnumber,smssplits,udh,operator,circle,vf,msgp,smstext,userc,coding FROM chotadata16june;
I got the error:
ERROR: Failed with exception org.apache.hadoop.hive.ql.metadata.HiveException: Unable to alter partition. For direct MetaStore DB connections, we don't support retries at the client level.
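Since the PARTITION clause names the partition columns (userc, coding) without fixed values, Hive treats this as a dynamic-partition insert. A common first check, a sketch rather than a guaranteed fix for this metastore error, is to make sure dynamic partitioning is enabled and the partition limits are high enough before running the INSERT:

```sql
-- Allow dynamic partitions, and allow all partition columns to be dynamic.
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;

-- Raise the limits if the source data produces many distinct partitions
-- (the values below are illustrative, not recommendations).
SET hive.exec.max.dynamic.partitions=10000;
SET hive.exec.max.dynamic.partitions.pernode=1000;
```

If these settings are already in place, the error usually points at a metastore-side problem (permissions on the warehouse directory, or inconsistent partition metadata) rather than the query itself.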

Related

Can't delete or update a row in Oracle table. Fails with ORA-08103: object no longer exists

delete from table1 where ROWID = 'XXXXXXXXXXXXXXX';
SQL Error: ORA-29876: failed in the execution of the ODCIINDEXDELETE routine
ORA-20000: Oracle Text error:
DRG-10602: failed to queue DML change to column Col1 for primary key XXXXXXXXXXXXXXX
DRG-50857: oracle error in drekqkd(execute k_stmt)
ORA-08103: object no longer exists
Please don't send me to contact oracle support :)
So the solution was to drop and re-create the text index (INDEXTYPE IS ctxsys.context) on the column.
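A minimal sketch of that fix, assuming the CONTEXT index on Col1 is named col1_ctx_idx (a hypothetical name; substitute the real index name from USER_INDEXES):

```sql
-- Hypothetical index name; look up the actual CONTEXT index on Col1 first.
DROP INDEX col1_ctx_idx;

CREATE INDEX col1_ctx_idx ON table1 (Col1)
  INDEXTYPE IS CTXSYS.CONTEXT;
```

Rebuilding the index discards the stale Oracle Text DML queue entries that referenced the no-longer-existing object.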

alter table/add columns in non native table in hive

I created a Hive table with a storage handler, and now I want to add a column to that table, but it gives me the error below:
[Code: 10134, SQL State: 42000] Error while compiling statement: FAILED:
SemanticException [Error 10134]: ALTER TABLE can only be used for [ADDPROPS,
DROPPROPS] to a non-native table
As per the Hive documentation, any Hive table you create with a storage handler is a non-native table.
Here's a link: https://cwiki.apache.org/confluence/display/Hive/StorageHandlers
There is an open JIRA enhancement request with Apache for the same:
https://issues.apache.org/jira/browse/HIVE-1240
For example, I am using the Druid storage handler in my case.
I created a hive table using:
CREATE TABLE druid_table_1
(`__time` TIMESTAMP, `dimension1` STRING, `metric1` int)
STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler';
and then I am trying to add a column:
ALTER TABLE druid_table_1 ADD COLUMNS (`dimension2` STRING);
With the above approach I am getting the error.
Is there any other way to add a column to non-native tables in Hive without recreating the table?
A patch is available in HDP 2.5+ from Hortonworks; support for adding columns has been added to the ALTER TABLE statement.
A column can be added to a Druid-backed table using ALTER TABLE DDL in Hive:
ALTER TABLE table_name ADD COLUMNS (col_name data_type);
There is no need to specify a partition spec, as these are Druid-backed Hive tables and partitioning/storage is maintained by Druid.

Add Column to Hive External Table Error

Trying to add a column to an external table in Hive, but I get the error below. This table currently has a thousand partitions registered, and I want to avoid re-creating the table and then running MSCK REPAIR, which would take a very long time to complete. Also, the table uses the OpenCSVSerde format. How can I add a column?
hive> ALTER TABLE schema.Table123 ADD COLUMNS (Column1000 STRING);
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. Unable to alter table. java.lang.IllegalArgumentException: Error: type expected at the position 0 of '<derived from deserializer>' but '<' is found.
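One workaround often suggested for partitioned tables, a sketch rather than a confirmed fix for the OpenCSVSerde case, is to add the column with the CASCADE clause (available in Hive 1.1+), which propagates the schema change to the existing partition metadata so the thousand registered partitions do not have to be rebuilt:

```sql
-- Sketch: CASCADE pushes the new column into the metadata of all existing
-- partitions. It may not resolve the "<derived from deserializer>" error,
-- which comes from the serde reporting its own schema to the metastore.
ALTER TABLE schema.Table123 ADD COLUMNS (Column1000 STRING) CASCADE;
```

If the deserializer-derived schema error persists, it is specific to how OpenCSVSerde publishes column types, and the table definition itself may need to be corrected in the metastore.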

I'm getting the following errors when I try to drop or delete a database table

I'm getting the following errors when I try to delete a table. Does the table include the mismatched data type for Image?
Unable to execute command:
drop table "APP"."GADGET"
DDL is not permitted for a read-only connection, user or database.
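The "DDL is not permitted for a read-only connection, user or database" wording matches Apache Derby (the APP schema is also Derby's default). Assuming the database is Derby and the read-only state comes from a restrictive default connection mode rather than read-only files, one hedged approach is to clear that property from a connection that is still allowed to change database properties:

```sql
-- Sketch, assuming an Apache Derby database: restore full access for
-- connections that do not have an explicit per-user access setting.
CALL SYSCS_UTIL.SYSCS_SET_DATABASE_PROPERTY(
  'derby.database.defaultConnectionMode', 'fullAccess');
```

If instead the database files themselves are read-only (for example, opened from a jar or a directory without write permission), the files or the connection URL need to be fixed rather than the property.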

Hive exception java.lang.RuntimeException: java.lang.ClassCastException:

My Hive (version 0.14.0.2.2.6.4-1) table table_test is in ORC format with some data. I needed to re-structure that table, so I did the following:
1. Took a backup table (also ORC format) using:
   create table table_test_bk as select * from table_test;
2. Dropped the original table:
   drop table table_test;
3. Ran the modified DDL to re-create table_test, with a new column (column_new string) in the middle.
   The old table structure: (col1 string, col2 decimal(10,2), col3 timestamp)
   The new table structure: (col1 string, col2 decimal(10,2), column_new string, col3 timestamp)
4. Restored the data set from the backup table using:
   insert into table table_test select col1, col2, null as column_new, col3 from table_test_bk;
5. Once step 4 ran successfully, dropped the backup table.
All five steps ran successfully, but while doing a data-sanity check I see the exception below and cannot get any output, whereas select count(1) from table_test; returns the correct row count.
Failed with exception java.io.IOException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.ClassCastException: org.apache.hadoop.hive.serde2.io.HiveDecimalWritable cannot be cast to org.apache.hadoop.io.Text
Any help is appreciated.
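The cast failure (HiveDecimalWritable to Text) suggests a positional mismatch: in Hive 0.14, ORC data is mapped to table columns by position, so if any files written under the old layout are still read with the new layout, inserting column_new in the middle shifts every column after it. A hedged workaround is to append the new column at the end instead of the middle, so existing data still lines up:

```sql
-- Sketch: keep the original column order and append the new column,
-- so data written under the old schema still maps by position.
CREATE TABLE table_test (
  col1 string,
  col2 decimal(10,2),
  col3 timestamp,
  column_new string
) STORED AS ORC;

INSERT INTO TABLE table_test
SELECT col1, col2, col3, NULL AS column_new FROM table_test_bk;
```

If the column truly must sit in the middle, the data has to be fully rewritten through the new schema (as in step 4), and any leftover files from the old table directory must be cleared out first.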
