Squirrel access to Phoenix/HBase

I got Phoenix 4.0 running on HBase 0.98/Hadoop 2.3.0 and was impressed by the command-line tools.
As a second step I followed the description on the Phoenix website to connect to Phoenix using its bundled JDBC driver.
When I try to connect I get the following exception message (on the Squirrel side):
java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.sql.SQLException: ERROR 103 (08004): Unable to establish connection.
at java.util.concurrent.FutureTask.report(Unknown Source)
at java.util.concurrent.FutureTask.get(Unknown Source)
at net.sourceforge.squirrel_sql.client.mainframe.action.OpenConnectionCommand.awaitConnection(OpenConnectionCommand.java:132)
at net.sourceforge.squirrel_sql.client.mainframe.action.OpenConnectionCommand.access$100(OpenConnectionCommand.java:45)
at net.sourceforge.squirrel_sql.client.mainframe.action.OpenConnectionCommand$2.run(OpenConnectionCommand.java:115)
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Caused by: java.lang.RuntimeException: java.sql.SQLException: ERROR 103 (08004): Unable to establish connection.
at net.sourceforge.squirrel_sql.client.mainframe.action.OpenConnectionCommand.executeConnect(OpenConnectionCommand.java:171)
at net.sourceforge.squirrel_sql.client.mainframe.action.OpenConnectionCommand.access$000(OpenConnectionCommand.java:45)
at net.sourceforge.squirrel_sql.client.mainframe.action.OpenConnectionCommand$1.run(OpenConnectionCommand.java:104)
... 5 more
Caused by: java.sql.SQLException: ERROR 103 (08004): Unable to establish connection.
at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:309)
at org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:133)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:254)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1446)
at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
at net.sourceforge.squirrel_sql.fw.sql.SQLDriverManager.getConnection(SQLDriverManager.java:133)
at net.sourceforge.squirrel_sql.client.mainframe.action.OpenConnectionCommand.executeConnect(OpenConnectionCommand.java:167)
... 7 more
Caused by: java.io.IOException: java.lang.reflect.InvocationTargetException
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:416)
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:309)
at org.apache.phoenix.query.HConnectionFactory$HConnectionFactoryImpl.createConnection(HConnectionFactory.java:47)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:252)
... 12 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
at java.lang.reflect.Constructor.newInstance(Unknown Source)
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:414)
... 15 more
Caused by: java.lang.RuntimeException: Socket Factory class not found: java.lang.ClassNotFoundException: Class org.apache.hadoop.net.StandardSocketFactory not found
at org.apache.hadoop.net.NetUtils.getSocketFactoryFromProperty(NetUtils.java:142)
at org.apache.hadoop.net.NetUtils.getDefaultSocketFactory(NetUtils.java:122)
at org.apache.hadoop.hbase.ipc.RpcClient.<init>(RpcClient.java:1293)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:664)
... 20 more
I double-checked the jar files with ClassFinder to be sure that the class org.apache.hadoop.net.StandardSocketFactory IS on the classpath.
What can I do to get Squirrel connected to Phoenix?
Update:
The ZooKeeper log on the server side shows that the network communication did start:
2014-05-28 06:24:29,411 INFO [NIOServerCxn.Factory:0.0.0.0/0.0.0.0:2181] server.NIOServerCnxnFactory: Accepted socket connection from /192.168.1.106:58172
2014-05-28 06:24:29,412 INFO [NIOServerCxn.Factory:0.0.0.0/0.0.0.0:2181] server.ZooKeeperServer: Client attempting to establish new session at /192.168.1.106:58172
2014-05-28 06:24:29,518 INFO [SyncThread:0] server.ZooKeeperServer: Established session 0x146413f6c3a000c with negotiated timeout 90000 for client /192.168.1.106:58172

I solved the problem by replacing the downloaded binary version 4.0 of Phoenix with a 4.1 snapshot that I built myself from the source cloned via git from
http://git.apache.org/incubator-phoenix.git/
After the successful build I extracted the tarball from the assembly subdirectory and copied the following jars to HBase 0.98's lib dir:
phoenix-core-4.1.0-incubating-SNAPSHOT.jar
phoenix-flume-4.1.0-incubating-SNAPSHOT.jar
phoenix-pig-4.1.0-incubating-SNAPSHOT.jar
In Squirrel I used just phoenix-4.1.0-incubating-SNAPSHOT-client.jar as an extra classpath entry to get the driver running.
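To rule out Squirrel itself, a minimal plain-JDBC sketch like the one below can verify the client jar directly. It is only an illustration: it assumes phoenix-4.1.0-incubating-SNAPSHOT-client.jar is on the classpath and that "zkhost" stands in for your ZooKeeper quorum host; adjust the host and port to your setup.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class PhoenixConnectTest {
    public static void main(String[] args) throws Exception {
        // Older Phoenix versions may need the driver loaded explicitly.
        Class.forName("org.apache.phoenix.jdbc.PhoenixDriver");
        // "zkhost" is a placeholder for the ZooKeeper quorum host.
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:zkhost:2181");
             Statement stmt = conn.createStatement();
             // List a few entries from the Phoenix system catalog as a smoke test.
             ResultSet rs = stmt.executeQuery("SELECT TABLE_NAME FROM SYSTEM.CATALOG LIMIT 10")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}

If this works from the command line but Squirrel still fails, the issue is in the tool's driver configuration rather than in the Phoenix installation.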

Related

Error while connecting to Phoenix ERROR 103 (08004): Unable to establish connection

I am trying to establish a connection to Phoenix using DBVisualizer but I am getting the error below.
I followed the steps given here:
https://community.hortonworks.com/articles/19016/connect-to-phoenix-hbase-using-dbvisualizer.html
After that I get the following error:
ERROR 103 (08004): Unable to establish connection
When I checked the DBVisualizer error log I saw the following:
2019-07-22 12:53:18.890 INFO 903 [ExecutorRunner-pool-3-thread-1 - H.Ĵ] Exception while connecting Hbase
java.sql.SQLException: ERROR 103 (08004): Unable to establish connection.
at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:422)
at org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:145)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:392)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.access$300(ConnectionQueryServicesImpl.java:211)
at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2269)
at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2248)
at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:78)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2248)
at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:233)
at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:135)
at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:202)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at com.onseven.dbvis.h.B.D.ā(Z:1548)
at com.onseven.dbvis.h.B.F$A.call(Z:1369)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: java.io.IOException: java.lang.reflect.InvocationTargetException
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:240)
at org.apache.hadoop.hbase.client.ConnectionManager.createConnection(ConnectionManager.java:421)
at org.apache.hadoop.hbase.client.ConnectionManager.createConnectionInternal(ConnectionManager.java:330)
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:144)
at org.apache.phoenix.query.HConnectionFactory$HConnectionFactoryImpl.createConnection(HConnectionFactory.java:47)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:390)
... 18 more
Caused by: java.lang.reflect.InvocationTargetException
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238)
... 23 more
Caused by: java.lang.UnsupportedOperationException: Constructor threw an exception for org.apache.hadoop.hbase.ipc.RpcClientImpl
at org.apache.hadoop.hbase.util.ReflectionUtils.instantiate(ReflectionUtils.java:54)
at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:34)
at org.apache.hadoop.hbase.ipc.RpcClientFactory.createClient(RpcClientFactory.java:64)
at org.apache.hadoop.hbase.ipc.RpcClientFactory.createClient(RpcClientFactory.java:48)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:638)
... 28 more
Caused by: java.lang.reflect.InvocationTargetException
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
at org.apache.hadoop.hbase.util.ReflectionUtils.instantiate(ReflectionUtils.java:46)
... 32 more
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.hbase.util.ClassSize
at org.apache.hadoop.hbase.ipc.IPCUtil.<init>(IPCUtil.java:74)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.<init>(AbstractRpcClient.java:95)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.<init>(RpcClientImpl.java:1092)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.<init>(RpcClientImpl.java:1118)
I tried using IntelliJ and it works fine there.
Add all the jars that the Phoenix driver requires and restart the tool. These are the jars I added for IntelliJ:
hbase-annotations
hbase-common
hbase-client
phoenix-hbase-client
I added them to IntelliJ's driver configuration and restarted it.
Also make sure you are using the proper JDK version when connecting to Phoenix through the tool.
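To reproduce the failure outside DBVisualizer, a small diagnostic like the following can help. It is only a sketch; it assumes you compile and run it with the same hbase-common/hbase-client/Phoenix client jars the tool uses on its classpath, and it merely prints the JDK version and forces initialization of org.apache.hadoop.hbase.util.ClassSize, the class that fails in the stack trace.

public class PhoenixClasspathCheck {
    public static void main(String[] args) {
        System.out.println("java.version = " + System.getProperty("java.version"));
        try {
            // Force static initialization of the class named in the NoClassDefFoundError.
            Class.forName("org.apache.hadoop.hbase.util.ClassSize", true,
                    PhoenixClasspathCheck.class.getClassLoader());
            System.out.println("ClassSize initialized OK");
        } catch (Throwable t) {
            // An incompatible JDK/HBase-client combination reproduces the error seen in DBVisualizer.
            t.printStackTrace();
        }
    }
}

If this fails on a given JDK, switch the tool to a JDK/HBase-client combination where it succeeds.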

Error connecting Pentaho with Jena Fuseki

I'm trying to create a connection from Pentaho (7.0.0.0-25) to Jena (Fuseki 2.4.1), using "connection type: SPARKSQL" (with localhost, port, and DB name) in Pentaho, and I get this error:
Error connecting to database [test] :org.pentaho.di.core.exception.KettleDatabaseException:
Error occurred while trying to connect to the database
Error connecting to database: (using class org.apache.hive.jdbc.SparkSqlSimbaDriver)
No suitable driver found for jdbc:spark://localhost:3030/testSKMO;AuthMech=0;SocketTimeout=10
org.pentaho.di.core.exception.KettleDatabaseException:
Error occurred while trying to connect to the database
Error connecting to database: (using class org.apache.hive.jdbc.SparkSqlSimbaDriver)
No suitable driver found for jdbc:spark://localhost:3030/testSKMO;AuthMech=0;SocketTimeout=10
at org.pentaho.di.core.database.Database.normalConnect(Database.java:472)
at org.pentaho.di.core.database.Database.connect(Database.java:370)
at org.pentaho.di.core.database.Database.connect(Database.java:341)
at org.pentaho.di.core.database.Database.connect(Database.java:331)
at org.pentaho.di.core.database.DatabaseFactory.getConnectionTestReport(DatabaseFactory.java:80)
at org.pentaho.di.core.database.DatabaseMeta.testConnection(DatabaseMeta.java:2795)
at org.pentaho.ui.database.event.DataHandler.testDatabaseConnection(DataHandler.java:598)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.pentaho.ui.xul.impl.AbstractXulDomContainer.invoke(AbstractXulDomContainer.java:313)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:157)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:141)
at org.pentaho.ui.xul.swt.tags.SwtButton.access$500(SwtButton.java:43)
at org.pentaho.ui.xul.swt.tags.SwtButton$4.widgetSelected(SwtButton.java:137)
at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.eclipse.jface.window.Window.runEventLoop(Window.java:820)
at org.eclipse.jface.window.Window.open(Window.java:796)
at org.pentaho.di.ui.xul.KettleDialog.show(KettleDialog.java:80)
at org.pentaho.di.ui.xul.KettleDialog.show(KettleDialog.java:47)
at org.pentaho.di.ui.core.database.dialog.XulDatabaseDialog.open(XulDatabaseDialog.java:116)
at org.pentaho.di.ui.core.database.dialog.DatabaseDialog.open(DatabaseDialog.java:60)
at org.pentaho.di.ui.spoon.delegates.SpoonDBDelegate.newConnection(SpoonDBDelegate.java:475)
at org.pentaho.di.ui.spoon.delegates.SpoonDBDelegate.newConnection(SpoonDBDelegate.java:462)
at org.pentaho.di.ui.spoon.Spoon.doubleClickedInTree(Spoon.java:3066)
at org.pentaho.di.ui.spoon.Spoon.doubleClickedInTree(Spoon.java:3036)
at org.pentaho.di.ui.spoon.Spoon.access$2200(Spoon.java:361)
at org.pentaho.di.ui.spoon.Spoon$26.widgetDefaultSelected(Spoon.java:6169)
at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.pentaho.di.ui.spoon.Spoon.readAndDispatch(Spoon.java:1359)
at org.pentaho.di.ui.spoon.Spoon.waitForDispose(Spoon.java:7990)
at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:9290)
at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:685)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.pentaho.commons.launcher.Launcher.main(Launcher.java:92)
Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
Error connecting to database: (using class org.apache.hive.jdbc.SparkSqlSimbaDriver)
No suitable driver found for jdbc:spark://localhost:3030/testSKMO;AuthMech=0;SocketTimeout=10
at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:585)
at org.pentaho.di.core.database.Database.normalConnect(Database.java:456)
... 48 more
Caused by: java.sql.SQLException: No suitable driver found for jdbc:spark://localhost:3030/testSKMO;AuthMech=0;SocketTimeout=10
at java.sql.DriverManager.getConnection(DriverManager.java:689)
at java.sql.DriverManager.getConnection(DriverManager.java:270)
at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:571)
... 49 more
Hostname: localhost
Port: 3030
Database name: testSKMO
I have downloaded the .jar files "jena-jdbc-driver-bundle-3.1.0.jar" (which contains the driver for the Jena connection) and "pentaho-big-data-kettle-plugins-hive.jar" (which contains the "SparkSqlSimbaDriver" class that appears to cause the error), and I have tried a few things:
putting that .jar in Pentaho/data-integration/lib
creating a new folder Pentaho/data-integration/libext/JDBC and putting the .jar there.
In both cases I still get the same error.
Does anyone have an idea or hint on how to solve it?
In short, the question is how I can connect Pentaho with Jena.
Thanks!
To connect Pentaho with Jena you must:
put "jena-jdbc-driver-bundle-3.1.0.jar" (it contains the drivers to connect to Jena) in the folder Pentaho/data-integration/lib/
use the connection type "Generic" (in Pentaho, when you create a new connection); that is probably the key to the problem
define, in this connection, "jdbc:jena:remote:update=http://localhost:3030/name_of_dataset/update" (in my case) as the custom connection URL and "org.apache.jena.jdbc.remote.RemoteEndpointDriver" as the driver class (see the sketch after this list)
...and go!
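For reference, a minimal plain-JDBC sketch using the same driver class and a remote-endpoint URL might look like this. The dataset name name_of_dataset and the query/update endpoint paths are placeholders for your Fuseki setup, and jena-jdbc-driver-bundle-3.1.0.jar is assumed to be on the classpath.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class JenaJdbcTest {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.jena.jdbc.remote.RemoteEndpointDriver");
        // Placeholder dataset name; point these at your Fuseki query/update endpoints.
        String url = "jdbc:jena:remote:query=http://localhost:3030/name_of_dataset/query"
                   + "&update=http://localhost:3030/name_of_dataset/update";
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             // Jena JDBC statements take SPARQL, not SQL.
             ResultSet rs = stmt.executeQuery("SELECT * WHERE { ?s ?p ?o } LIMIT 5")) {
            while (rs.next()) {
                System.out.println(rs.getString("s") + " " + rs.getString("p") + " " + rs.getString("o"));
            }
        }
    }
}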

Failed to start application WAS 8.5.5.10

My application fails to start with the error below on WebSphere Application Server 8.5.5.10; it was working on 8.5.5.5.
[1/2/17 17:54:20:842 IST] 0000006f ecs W com.ibm.ws.ecs.internal.scan.context.impl.ScannerContextImpl scanJAR unable to open input stream for resource org/reflections/scanners/MemberUsageScanner$1.class in archive WEB-INF/lib/reflections-0.9.10.jar
java.lang.RuntimeException
at org.objectweb.asm.MethodVisitor.visitParameter(Unknown Source)
at org.objectweb.asm.ClassReader.b(Unknown Source)
at org.objectweb.asm.ClassReader.accept(Unknown Source)
at org.objectweb.asm.ClassReader.accept(Unknown Source)
The application fails to start after deployment.
The application fails to start when the JVM starts.
The same application starts fine when started manually from the console.
When I put reflections-0.9.10.jar in a shared library I get "Failed to load webapp":
Caused by: com.ibm.ws.webcontainer.exception.WebAppNotLoadedException: Failed to load webapp: null
at com.ibm.ws.webcontainer.VirtualHostImpl.addWebApplication(VirtualHostImpl.java:177)
at com.ibm.ws.webcontainer.WSWebContainer.addWebApp(WSWebContainer.java:901)
... 73 more
Caused by: java.lang.RuntimeException
at org.objectweb.asm.MethodVisitor.visitParameter(Unknown Source)
at org.objectweb.asm.ClassReader.b(Unknown Source)
at org.objectweb.asm.ClassReader.accept(Unknown Source)
at org.objectweb.asm.ClassReader.accept(Unknown Source)
The application starts when I move all the third-party jars to shared libraries.
Upgrade to 8.5.5.11; the issue is fixed in the latest fixpack:
http://www-01.ibm.com/support/docview.wss?uid=swg1PI60902
If you find any other solution on 8.5.5.10, please do let me know.
The SystemOut.log file for the application server will indicate which version of Java (the JVM) is in use. Verify whether the class-file version of the classes in reflections-0.9.10.jar is greater than the version the server JVM supports; the error you posted occurs whenever that is the case.
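If you want to check this, the class-file major version can be read directly from the jar. The sketch below is illustrative only (the jar path and entry name are taken from the warning in the log); major version 52 corresponds to Java 8, 51 to Java 7, 50 to Java 6.

import java.io.DataInputStream;
import java.io.InputStream;
import java.util.jar.JarFile;

public class ClassVersionCheck {
    public static void main(String[] args) throws Exception {
        String jarPath = args.length > 0 ? args[0] : "reflections-0.9.10.jar"; // adjust to the jar's location
        String entry = "org/reflections/scanners/MemberUsageScanner$1.class";  // entry named in the WAS warning
        try (JarFile jar = new JarFile(jarPath);
             InputStream in = jar.getInputStream(jar.getJarEntry(entry));
             DataInputStream data = new DataInputStream(in)) {
            int magic = data.readInt();              // should be 0xCAFEBABE
            int minor = data.readUnsignedShort();
            int major = data.readUnsignedShort();
            System.out.printf("magic=%08X major=%d minor=%d%n", magic, major, minor);
        }
    }
}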

JMeter HBase scan sampler fails with ClassNotFoundException

I'm trying to use the Hadoop/HBase sampler to connect to and scan the tables created in HBase, but my test fails with the following messages.
I checked the Hadoop/HBase configuration; all the services are running and listening on the appropriate ports (ZooKeeper is running and listening on 2181).
2016/04/02 15:44:04 ERROR - jmeter.threads.JMeterThread: Test failed! java.lang.NoClassDefFoundError: org/apache/commons/configuration/Configuration
at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<init>(DefaultMetricsSystem.java:37)
at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<clinit>(DefaultMetricsSystem.java:34)
at org.apache.hadoop.security.UgiInstrumentation.create(UgiInstrumentation.java:51)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:217)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:185)
at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:237)
at org.apache.hadoop.security.KerberosName.<clinit>(KerberosName.java:79)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:210)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:185)
at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:237)
at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:468)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.apache.hadoop.hbase.util.Methods.call(Methods.java:37)
at org.apache.hadoop.hbase.security.User.call(User.java:590)
at org.apache.hadoop.hbase.security.User.callStatic(User.java:580)
at org.apache.hadoop.hbase.security.User.access$400(User.java:51)
at org.apache.hadoop.hbase.security.User$SecureHadoopUser.<init>(User.java:397)
at org.apache.hadoop.hbase.security.User$SecureHadoopUser.<init>(User.java:392)
at org.apache.hadoop.hbase.security.User.getCurrent(User.java:140)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionKey.<init>(HConnectionManager.java:435)
at org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:180)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:155)
at org.apache.hadoop.hbase.client.HTableFactory.createHTableInterface(HTableFactory.java:36)
at org.apache.hadoop.hbase.client.HTablePool.createHTable(HTablePool.java:265)
at org.apache.hadoop.hbase.client.HTablePool.findOrCreateTable(HTablePool.java:195)
at org.apache.hadoop.hbase.client.HTablePool.getTable(HTablePool.java:174)
at com.atlantbh.jmeter.plugins.hbasecomponents.config.HBaseConnectionVariable.getTable(HBaseConnectionVariable.java:43)
at com.atlantbh.jmeter.plugins.hbasecomponents.samplers.HBaseScanSampler.sample(HBaseScanSampler.java:94)
at org.apache.jmeter.threads.JMeterThread.process_sampler(JMeterThread.java:434)
at org.apache.jmeter.threads.JMeterThread.run(JMeterThread.java:261)
at java.lang.Thread.run(Unknown Source)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.configuration.Configuration
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
... 35 more
The problem is on the JMeter side. As the error message states:
Caused by: java.lang.ClassNotFoundException: org.apache.commons.configuration.Configuration
which means that the HBase Scan Sampler requires the Apache Commons Configuration library, but it looks like the Hadoop plug-in does not provide it in its zip package. You could ask on their forum whether they are willing to fix it.
For now, as a workaround, you can:
Download the Commons Configuration binary from the site; note that the missing class lives in the org.apache.commons.configuration package, which is Commons Configuration 1.x (for example commons-configuration-1.10-bin.zip), whereas the 2.x jars use the org.apache.commons.configuration2 package
Unzip it
Copy the commons-configuration jar into .../apache-jmeter/lib
Restart JMeter
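To confirm the jar is actually visible before re-running the test plan, a quick check like the one below can be compiled and run with JMeter's lib directory on the classpath (the class name is just an illustrative helper):

public class CommonsConfigCheck {
    public static void main(String[] args) {
        try {
            Class<?> c = Class.forName("org.apache.commons.configuration.Configuration");
            // Prints the jar the class was loaded from.
            System.out.println("Found in: " + c.getProtectionDomain().getCodeSource().getLocation());
        } catch (ClassNotFoundException e) {
            System.out.println("Still missing from the classpath: " + e.getMessage());
        }
    }
}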

Hive Derby Metastore Configuration

I have configured Hive over Hadoop and HBase using the TutorialsPoint guide:
http://www.tutorialspoint.com/hive/hive_installation.htm
I am facing this issue with Hive. Please help.
Logging initialized using configuration in jar:file:/usr/local/hive/apache-hive-1.2.1-bin/lib/hive-common-1.2.1.jar!/hive-log4j.properties. Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:677)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1523)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)Caused by: javax.jdo.JDOFatalDataStoreException: Unable to open a test connection to the given database. JDBC url = jdbc:derby://localhost:1527/metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLNonTransientConnectionException: java.net.ConnectException : Error connecting to server localhost on port 1527 with message Connection refused.
at org.apache.derby.client.am.SQLExceptionFactory40.getSQLException(Unknown Source)
at org.apache.derby.client.am.SqlException.getSQLException(Unknown Source)
at org.apache.derby.jdbc.ClientDriver.connect(Unknown Source)
at java.sql.DriverManager.getConnection(DriverManager.java:571)
at java.sql.DriverManager.getConnection(DriverManager.java:187)
at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
It looks like you haven't started the Derby database. Take a look at this Hive doc: https://cwiki.apache.org/confluence/display/Hive/HiveDerbyServerMode. There is a short paragraph on how to start Derby.
The Derby server isn't running, so Hive throws a connection-refused exception.
Go to %DERBY_HOME%\bin and run startNetworkServer.cmd, then restart the Hive services.
Hope this helps.
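Before restarting the Hive services you can also verify that the Derby network server is reachable with a small JDBC check. This is only a sketch; it assumes derbyclient.jar is on the classpath and reuses the connection URL from the Hive error message.

import java.sql.Connection;
import java.sql.DriverManager;

public class DerbyPingTest {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.derby.jdbc.ClientDriver");
        // Same URL as in the metastore error; succeeds only if the network server is listening on 1527.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:derby://localhost:1527/metastore_db;create=true")) {
            System.out.println("Connected: " + conn.getMetaData().getDatabaseProductVersion());
        }
    }
}

If this throws a connection-refused error, start the network server first and then restart Hive.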
