This is my Java HBase createtable program:
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.*;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;

public class createtable {
    public static void main(String[] args) throws IOException {
        // Point the client at the sandbox ZooKeeper quorum
        Configuration conf = HBaseConfiguration.create();
        conf.set("hbase.zookeeper.quorum", "sandbox.hortonworks.com");
        conf.set("hbase.zookeeper.property.clientPort", "2181");
        conf.set("zookeeper.znode.parent", "/hbase-unsecure");

        // Create the "people" table with two column families
        HBaseAdmin admin = new HBaseAdmin(conf);
        HTableDescriptor tableDescriptor = new HTableDescriptor(TableName.valueOf("people"));
        tableDescriptor.addFamily(new HColumnDescriptor("personal"));
        tableDescriptor.addFamily(new HColumnDescriptor("contactinfo"));
        admin.createTable(tableDescriptor);

        // Row key for a record to insert (the Put is built but never written to a table here)
        Put put = new Put(Bytes.toBytes("doe-john-m-12345"));
        admin.close();
    }
}
After building the jar (table-0.0.1-SNAPSHOT.jar), when I run the command
hadoop jar table-0.0.1-SNAPSHOT.jar table.createtable
I get:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
        at table.createtable.main(createtable.java:17)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:234)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 7 more
How do I resolve this error?
Your code is not able to see the HBase jar. Try adding this dependency to your pom.xml:
<!-- https://mvnrepository.com/artifact/org.apache.hbase/hbase -->
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase</artifactId>
<version>0.90.2</version>
</dependency>
And then run it again.
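Note that the monolithic hbase artifact above only exists for very old releases; on newer HBase lines the client classes live in the hbase-client artifact (which pulls in hbase-common transitively). A sketch for the 1.2.6 line the asker eventually used, assuming the cluster runs HBase 1.x:

<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-client</artifactId>
    <version>1.2.6</version>
</dependency>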
Please read this blog: https://my-bigdata-blog.blogspot.com/2017/08/hbase-programming-on-map-reduce-with.html
Two points to make it work:
1) Your code should call TableMapReduceUtil.addDependencyJars(job); (see the sketch after this list)
2) On the command line, before you execute the command, do this:
export HADOOP_CLASSPATH=your-jar-path
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:`hbase classpath`
This adds the HBase libraries at execution time. The ones you add in Maven/NetBeans are only for compilation.
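For point 1, here is a minimal sketch of where that call fits when configuring a MapReduce job; the driver class and job name are placeholders, not part of the original code:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.mapreduce.Job;

public class MyJobDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        Job job = Job.getInstance(conf, "my-hbase-job");
        job.setJarByClass(MyJobDriver.class);
        // Ship the HBase client jars to the cluster's distributed cache
        // so the map/reduce tasks can load them at runtime
        TableMapReduceUtil.addDependencyJars(job);
        // ... set mapper/reducer and input/output here, then submit
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}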
I solved this issue. The problem was in how the fat jar was built: in newer HBase versions the monolithic jar is deprecated and split across many dependencies, so I also added hbase-client-1.2.6.jar and hbase-common-1.2.6.jar to Eclipse externally and then built the fat jar. This solved my issue.
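If you build with Maven rather than Eclipse, one way to produce such a fat jar is the Shade plugin; this is only a sketch, and the plugin version shown is an example:

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>2.4.3</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>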
Kindly forgive the poor presentation and any unconventional code.
I'm trying to create my first JDBC class using the embedded Derby driver. I created a Maven project and added the following dependency:
<dependency>
<groupId>org.apache.derby</groupId>
<artifactId>derby</artifactId>
<version>10.11.1.1</version>
</dependency>
I created a Derby database in the Eclipse database development perspective; the ping to the database was successful, and I created a table and inserted values into it.
When I run the Java program below,
import java.sql.*;

public class SiteDao implements SiteDaoInterface {

    public static void main(String[] args) {
        String url = "jdbc:derby:C://Users//shash//MyDB;create=true";
        String username = "xyz";
        String password = "xyz";
        String query = "select * from practise.abc";
        try {
            // Not needed on JDBC 4+: the embedded driver is auto-loaded from derby.jar
            // Class.forName("org.apache.derby.jdbc.EmbeddedDriver");
            // DriverManager.registerDriver(new org.apache.derby.jdbc.EmbeddedDriver());
            Connection con = DriverManager.getConnection(url, username, password);
            Statement st = con.createStatement();
            ResultSet rs = st.executeQuery(query);
            int i, j;
            while (rs.next()) {
                i = rs.getInt(1);
                j = rs.getInt(2);
                System.out.println("data is " + i + "," + j);
            }
            st.close();
            con.close();
        } catch (SQLException e) {
            e.printStackTrace();
        }
    }
}
I'm encountering the below error.
Exception in thread "main" java.lang.NoSuchMethodError: 'void org.apache.derby.iapi.services.i18n.MessageService.setFinder(org.apache.derby.iapi.services.i18n.BundleFinder)'
at org.apache.derby.impl.services.monitor.BaseMonitor.runWithState(Unknown Source)
at org.apache.derby.impl.services.monitor.FileMonitor.<init>(Unknown Source)
at org.apache.derby.iapi.services.monitor.Monitor.startMonitor(Unknown Source)
at org.apache.derby.iapi.jdbc.JDBCBoot.boot(Unknown Source)
at org.apache.derby.jdbc.EmbeddedDriver.boot(Unknown Source)
at org.apache.derby.jdbc.EmbeddedDriver.<clinit>(Unknown Source)
at java.base/java.lang.Class.forName0(Native Method)
at java.base/java.lang.Class.forName(Class.java:377)
at com.smedia.dao.SiteDao.main(SiteDao.java:17)
I tried to go through the class mentioned in the error but ran into a further issue there.
I tried many solutions offered on the internet, but none of them worked.
I'm using Java 15.0.1 and even tried derby-10.15.2.0 as suggested in some Stack Overflow answers, but that didn't work either (moreover, to my surprise, there is no EmbeddedDriver class in the 10.15.x.x version).
Please let me know where I'm going wrong in the code.
I am trying to insert an array into Postgres using Java code, but I always get this error:
SEVERE [http-nio-8080-exec-2] org.apache.catalina.core.StandardWrapperValve.invoke Servlet.service()
for servlet [] in context with path [/] threw exception
[Servlet execution threw an exception] with root cause
java.lang.AbstractMethodError:
com.mchange.v2.c3p0.impl.NewProxyConnection.createArrayOf(Ljava/lang/String;[Ljava/lang/Object;)Ljava/sql/Array;
Code used:

Connection conn = getConnection();
PreparedStatement pst = conn.prepareStatement(INSERT_QUERY, Statement.RETURN_GENERATED_KEYS);
pst.setString(1, t.getName());
pst.setString(2, t.getEmail());
// Build a SQL ARRAY of bigint from the item ids and bind it to the third parameter
Array itemIds = conn.createArrayOf("bigint", t.getItemIds());
pst.setArray(3, itemIds);
If I run the function through the main class it works fine, but after deploying to the Tomcat server, HTTP calls fail with the above error.
DB used: Postgres
JDBC driver: postgres-9.1-901-1.jdbc4
c3p0-0.9.5-pre10
tomcat-8.0.24
From the research I have done, createArrayOf() is supposed to work with JDBC 4 and c3p0-0.9.5.
Using the following works fine, but I don't see it as the right approach:
if (conn instanceof C3P0ProxyConnection) {
    // Bypass the c3p0 proxy and invoke createArrayOf on the raw underlying Connection
    C3P0ProxyConnection proxy = (C3P0ProxyConnection) conn;
    try {
        Method m = Connection.class.getMethod("createArrayOf", String.class, Object[].class);
        Object[] args = { "bigint", t.getItemIds() };
        itemIds = (Array) proxy.rawConnectionOperation(m, C3P0ProxyConnection.RAW_CONNECTION, args);
    } catch (Exception e) {
        // getMethod/rawConnectionOperation throw several checked exceptions; wrap them all
        throw new SQLException(e);
    }
} else {
    itemIds = conn.createArrayOf("bigint", t.getItemIds());
}
Need help. Thanks!
I strongly suspect that you have an older version of c3p0 somewhere in your application's effective CLASSPATH. I've downloaded c3p0-0.9.5-pre10.jar and c3p0-0.9.5.1.jar from Maven Central and verified that com.mchange.v2.c3p0.impl.NewProxyConnection does in fact contain the createArrayOf method.
% javap -sysinfo -cp ./c3p0-0.9.5.1.jar com.mchange.v2.c3p0.impl.NewProxyConnection
Classfile jar:file:/Users/swaldman/tmp/c3p0jars/c3p0-0.9.5.1.jar!/com/mchange/v2/c3p0/impl/NewProxyConnection.class
Last modified Jun 16, 2015; size 27098 bytes
MD5 checksum c1ff36b87219ddc84c92fb6c1445a2d1
Compiled from "NewProxyConnection.java"
public final class com.mchange.v2.c3p0.impl.NewProxyConnection implements java.sql.Connection,com.mchange.v2.c3p0.C3P0ProxyConnection {
//...
public synchronized java.sql.Array createArrayOf(java.lang.String, java.lang.Object[]) throws java.sql.SQLException;
//...
}
Although this is not likely the cause of your problem, I recommend that you upgrade to c3p0-0.9.5.1 rather than using the prerelease version.
To see what version of c3p0 your application is actually using, look at INFO in your log files for the banner printed when the c3p0 library is initialized. It should look something like this:
INFO: Initializing c3p0-0.9.5.1 [built 16-June-2015 00:06:36 -0700; debug? true; trace: 10]
I suspect that you will see an older version: somewhere, perhaps among the transitive dependencies, you are pulling in a 0.9.1.x or 0.9.2.x version of the library.
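If the log banner is ambiguous, a quick check from inside the application will print exactly which jar the proxy class was loaded from. This is a minimal diagnostic sketch using only plain JDK reflection; the class name WhichC3p0 is just a placeholder:

public class WhichC3p0 {
    public static void main(String[] args) throws ClassNotFoundException {
        Class<?> c = Class.forName("com.mchange.v2.c3p0.impl.NewProxyConnection");
        // getCodeSource() can be null for JDK bootstrap classes, but not for a library jar
        System.out.println(c.getProtectionDomain().getCodeSource().getLocation());
    }
}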
Good luck!
I can see you are using an old version of the c3p0 library.
If you are using Maven, update the library to a newer version in your pom file, from
<dependency>
<groupId>c3p0</groupId>
<artifactId>c3p0</artifactId>
<version>0.9.1.2</version>
</dependency>
to
<dependency>
<groupId>com.mchange</groupId>
<artifactId>c3p0</artifactId>
<version>0.9.5.2</version>
</dependency>
I hope it helps someone.
Updating the dependency to the following solved my problem:
<dependency>
<groupId>com.mchange</groupId>
<artifactId>c3p0</artifactId>
<version>0.9.5.2</version>
</dependency>
I'm facing a ClassNotFoundException for the class org.apache.hcatalog.rcfile.RCFileMapReduceOutputFormat when I run my job.
I tried to pass the additional jar files with -libjars, but I am still facing the same issue. Any suggestions would be greatly helpful. Thanks in advance.
Below is the command I am using and the exception I am facing:
hadoop jar MyJob.jar MyDriver -libjars hcatalog-core-0.5.0-cdh4.4.0.jar inputDir OutputDir
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hcatalog/rcfile/RCFileMapReduceOutputFormat
at com.cloudera.sa.omniture.mr.OmnitureToRCFileJob.run(OmnitureToRCFileJob.java:91)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at com.cloudera.sa.omniture.mr.OmnitureToRCFileJob.main(OmnitureToRCFileJob.java:131)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
Caused by: java.lang.ClassNotFoundException: org.apache.hcatalog.rcfile.RCFileMapReduceOutputFormat
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
... 8 more
I implemented ToolRunner as well; below is the code which confirms that:
public class OmnitureToRCFileJob extends Configured implements Tool {

    public static void main(String[] args) throws Exception {
        OmnitureToRCFileJob processor = new OmnitureToRCFileJob();
        String[] otherArgs = new GenericOptionsParser(processor.getConf(), args).getRemainingArgs();
        System.exit(ToolRunner.run(processor.getConf(), processor, otherArgs));
    }
}
Did you try running it with the full path of the hcatalog-core-0.5.0-cdh4.4.0.jar file in the line below?
hadoop jar MyJob.jar MyDriver -libjars <fullpath>/hcatalog-core-0.5.0-cdh4.4.0.jar inputDir OutputDir
Or the configuration below should also work for you:
$ export LIBJARS=<fullpath>/hcatalog-core-0.5.0-cdh4.4.0.jar
$ hadoop jar MyJob.jar MyDriver -libjars ${LIBJARS} inputDir OutputDir
If you look at the hadoop command documentation, you can see that -libjars is a generic option. For generic options to be parsed, your driver class must go through ToolRunner and implement Tool's run() method, as follows:

public class TestDriver extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        Configuration conf = getConf();
        // Job configuration details
        // Job submission
        return 0;
    }

    public static void main(String[] args) throws Exception {
        int exitCode = ToolRunner.run(new TestDriver(), args);
        System.exit(exitCode);
    }
}
I think you are getting this exception from your driver code itself. A jar set with the -libjars option may not be available in the client JVM (the JVM in which the driver code runs). Instead, set this jar in the HADOOP_CLASSPATH environment variable before executing the job with hadoop jar, as follows:
export HADOOP_CLASSPATH=${HADOOP_CLASSPATH}:<PATH-TO-HCAT-LIB>/hcatalog-core-0.5.0-cdh4.4.0.jar
hadoop jar MyJob.jar MyDriver -libjars hcatalog-core-0.5.0-cdh4.4.0.jar inputDir OutputDir
I had the same problem, but found out that the jar command doesn't apply the -libjars argument to the client classpath:
"Specify comma separated jar files to include in the classpath. Applies only to job." (Hadoop CLI generic options documentation)
Instead, you should use the environment variables to add additional jars or replace jars:
export HADOOP_USER_CLASSPATH_FIRST=true
export HADOOP_CLASSPATH="./lib/*"
How can I run the code below from the command line?
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.*;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.*;

public class MyHBase {

    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        HBaseAdmin admin = new HBaseAdmin(conf);
        try {
            // Write a single test cell to "test-table"
            HTable table = new HTable(conf, "test-table");
            Put put = new Put(Bytes.toBytes("test-key"));
            put.add(Bytes.toBytes("cf"), Bytes.toBytes("q"), Bytes.toBytes("value"));
            table.put(put);
            table.close();
        } finally {
            admin.close();
        }
    }
}
How do I set my HBase classpath? I am getting a huge string in my classpath.
UPDATE
root# vi MyHBase.java
hbase-0.92.2 root# java -classpath `hbase classpath`:./ /var/root/MyHBase
-sh: hbase: command not found
Exception in thread "main" java.lang.NoClassDefFoundError: /var/root/MyHBase
Caused by: java.lang.ClassNotFoundException: .var.root.MyHBase
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
I am able to do
hbase-0.92.2 root# bin/hbase shell
HBase Shell; enter 'help<RETURN>' for list of supported commands.
Type "exit<RETURN>" to leave the HBase Shell
Version 0.92.2, r1379292, Fri Aug 31 13:13:53 UTC 2012
hbase(main):001:0> exit
Am I doing anything wrong?
You can also do something like this:-
# export HADOOP_CLASSPATH=`./hbase classpath`
and then bundle this Java class in a jar to run it within the hadoop cluster like this:
#hadoop jar <jarfile> <mainclass>
Use this:
java -classpath `hbase classpath`:./ MyHBase
This will get the entire classpath string that HBase uses and add the current directory to it as well. Note that your shell reported "hbase: command not found", so run this from the HBase install directory (where bin/hbase works) or use the launcher's full path.
Cheers
Rags
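One detail the answers above assume: the class must also be compiled against the same classpath first. A sketch, assuming MyHBase.java sits in the current directory and bin/hbase is the launcher that worked in the transcript above:

javac -classpath `bin/hbase classpath` MyHBase.java
java -classpath `bin/hbase classpath`:./ MyHBase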
I have created a simple MapReduce driver that implements the Tool interface. But when I try to run the job in Eclipse, I get a NoClassDefFoundError before the run() method is invoked.
I am running Hadoop 0.20.2 on Ubuntu 10.04 LTS. The source code and stack trace are provided below. Any assistance will be greatly appreciated.
Source code
import org.apache.hadoop.conf.*;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapreduce.*;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.*;

public class MyTestDriver extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        if (args.length != 2) {
            System.err.printf("Usage: %s [generic options] <input> <output>\n",
                    getClass().getSimpleName());
            ToolRunner.printGenericCommandUsage(System.err);
            return -1;
        }
        // Code here to submit Hadoop Job ...
        return 0;
    }

    /**
     * @param args
     */
    public static void main(String[] args) throws Exception {
        int exitCode = ToolRunner.run(new MyTestDriver(), args);
        System.exit(exitCode);
    }
}
Stack trace

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/cli/ParseException
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:59)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
        at MaxTemperatureDriver.main(MaxTemperatureDriver.java:44)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.cli.ParseException
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
        ... 3 more
Are all of Hadoop's dependencies in your Eclipse build path? Make sure all the jars in the hadoop/lib directory are in your build path.
The error means that a particular class was not found. Classes are stored within .jar files, so you need to check whether all the required jar files are available. Jar files are searched based on the CLASSPATH variable. If you are using Eclipse as your development environment, check that the build dependencies are satisfied (right-click your project -> Configure Build Path -> Add External JARs).
Also, if you are not sure which class is missing, take the class name from the error (Hadoop is open source, so you can find the class name) and search for it on findjar.
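To double-check from code which entries the JVM is actually searching, here is a small sketch; the class name ClasspathCheck is just a placeholder:

public class ClasspathCheck {
    public static void main(String[] args) {
        // Print every classpath entry the JVM was started with, one per line
        for (String entry : System.getProperty("java.class.path").split(java.io.File.pathSeparator)) {
            System.out.println(entry);
        }
    }
}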