HBase tables are not getting saved. HBase tables disappear after shutdown - hadoop

I have installed HBase as a standalone application and ran into many problems getting the HBase shell to run. I eventually got past those problems, and it now works normally: creating, listing, and describing a table all work fine. But the tables are not getting saved. When I shut down my system, the tables vanish and I have to start fresh. I'm new to HBase and would appreciate guidance on getting rid of this problem.

HBase tables are not getting saved.
That was because, in my case, I had not set the 'hbase.rootdir' property in hbase -> conf -> hbase-site.xml. After setting it, my tables were saved normally.
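For reference, a minimal hbase-site.xml sketch for standalone mode; the path below is a placeholder. The property matters because without it HBase writes its data under /tmp, which many systems clear on reboot:

<!-- hbase-site.xml: minimal sketch for standalone mode.
     The path is a placeholder; pick any directory that survives reboots. -->
<configuration>
  <property>
    <name>hbase.rootdir</name>
    <value>file:///home/youruser/hbase-data</value>
  </property>
</configuration>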

Related

HiveServer2 trying to write log file under /var/log/hive/operation_logs/ in the different node

We upgraded CDH from 5.x to 6.3.3, and Hive from 1.x to 2.1.1. When we run a Hive map join, the result is unaffected, but we get the following error message:
Caused by: main ERROR RandomAccessFileManager [java.io.IOException: Could not create directory /var/log/hive/operation_logs/a2e15f81-0e7e-4d56-8b8b-7a4768ced8ae]
When I run the Hive map join, the table size is less than 10 KB.
Our HiveServer2 is installed on node1, and the /var/log/hive/operation_logs/ folder exists on node1, but it is empty; it seems no log file is ever written there. Since the error message mentions a directory-creation problem, I created the operation_logs folder manually on the edge node (node2). Now when I run the Hive map join there is no error message, and I can see the log created on node2.
So I wonder why HiveServer2 writes the log on the edge node instead of its own node. Does anyone know how we can fix it?
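One thing worth checking (an assumption, not something confirmed for this setup): the directory HiveServer2 uses for per-session operation logs is taken from hive.server2.logging.operation.log.location in hive-site.xml, so the HiveServer2 instance that actually serves the query must load a hive-site.xml that points this at a path existing on its own node. A minimal sketch:

<!-- hive-site.xml on the node running HiveServer2: a sketch.
     The property name is standard; the path mirrors the one in the error above. -->
<property>
  <name>hive.server2.logging.operation.log.location</name>
  <value>/var/log/hive/operation_logs</value>
</property>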

Hive View Not Opening

In the Ambari UI of the Hortonworks sandbox, I was trying to open Hive View through the maria_dev account. However, I got the following error:
Service Hive check failed:
Cannot open a hive connection with connect string
jdbc:hive2://sandbox-hdp.hortonworks.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;hive.server2.proxy.user=maria_dev
Can someone please help me sort out the error?
I was able to rectify the issue, although I could not understand why the problem cropped up in the first place. I was running the Hadoop ecosystem in Docker. I simply stopped, removed, and re-started sandbox-hdp, and after restarting, Hive View worked just fine. However, I am still getting the following error:
unable to load database password
If someone could clarify, it would be really great. :) :)
Hive reads some of its configuration from ZooKeeper, and I was able to resolve the issue simply by restarting ZooKeeper.

HBase components don't appear in Pentaho Kettle

I am trying to work with Pentaho in order to build some big data solutions, but the Hadoop HBase components aren't appearing in the dashboard. I don't understand why HBase doesn't appear, since HBase is up and running on my machine. I've been looking for a solution, but without success.
Please check that the property 'hbase.client.scanner.timeout.period' is set to 10 minutes in hbase-default.xml to get rid of the HBase exceptions.
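The value is given in milliseconds, so 10 minutes looks like this (a sketch; note that hbase-default.xml ships with the defaults, so overrides of this kind usually go in hbase-site.xml instead):

<!-- 10 minutes expressed in milliseconds -->
<property>
  <name>hbase.client.scanner.timeout.period</name>
  <value>600000</value>
</property>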
Check that you have added the ZooKeeper host in the HBase Output step's host field in the Pentaho Data Integration tool.
Have you read this wiki on loading HBase data into Pentaho?

Unable to retain HIVE tables

I have set up a single-node Hadoop cluster on Ubuntu. I have installed Hadoop version 2.6 on my machine.
Problem:
Every time I create Hive tables and load data into them, I can see the data by querying it, but once I shut down Hadoop, the tables get wiped out. Is there any way I can retain them, or is there any setting I am missing?
I tried some solutions provided online, but nothing worked. Kindly help me out with this.
Thanks
B
The Hive table data lives on Hadoop HDFS; Hive just adds metadata on top and gives users SQL-like commands so they don't have to write basic MapReduce jobs. So if you shut down the Hadoop cluster, Hive can't find the data in the table.
But if you are saying the data is lost when you restart the Hadoop cluster, that's a different problem.
It seems you are using the default Derby database as the metastore. Configure the Hive metastore properly; I am pointing you to a link, please follow it:
Hive is not showing tables
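For reference, switching the metastore off embedded Derby usually means pointing these hive-site.xml properties at an external database. A sketch, assuming a local MySQL database named 'metastore'; the host, database name, user, and password are all placeholders:

<!-- hive-site.xml: a sketch; host, database, user, and password are placeholders -->
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/metastore?createDatabaseIfNotExist=true</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hivepassword</value>
</property>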

PIG cannot understand hbase table data

I'm running HBase (0.94.13) on a single node for my academic project. After loading data into HBase tables, I'm trying to run Pig (0.11.1) scripts on the data using HBaseStorage. However, this throws an error saying:
IllegalArgumentException: Not a host:port pair: �\00\00\00
Here is the load command I'm using in Pig:
books = LOAD 'hbase://booksdb' USING
        org.apache.pig.backend.hadoop.hbase.HBaseStorage('details:title', '-loadKey true')
        AS (ID:chararray, title:chararray);
I thought this might be because the HBase version bundled with Pig differs from the one on my machine, but I can't seem to make it work without downgrading my HBase. Any help?
It seems you are trying to submit a Pig job remotely.
If so, you need to add a few settings to the pig.properties file (or use set property 'value'; in your script):
hbase.zookeeper.quorum=<node>
hadoop.job.ugi=username,groupname
fs.default.name=hdfs://<node>:port
mapred.job.tracker=<node>:port
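The in-script equivalent looks like this (a sketch; node1 and the ports 8020/8021 are placeholders for your cluster's actual host and ports):

-- a sketch: node1, 8020, and 8021 are placeholder host/ports for your cluster
set hbase.zookeeper.quorum 'node1';
set fs.default.name 'hdfs://node1:8020';
set mapred.job.tracker 'node1:8021';
books = LOAD 'hbase://booksdb' USING
        org.apache.pig.backend.hadoop.hbase.HBaseStorage('details:title', '-loadKey true')
        AS (ID:chararray, title:chararray);
DUMP books;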
