I am using Hadoop 2.7 with GeoServer 2.8.0, but while trying to configure GeoMesa 1.2.0 I get this error message:
$ geomesa
Using GEOMESA_HOME = /usr/local/geomesa/dist/tools/geomesa-tools-1.2.0
Warning: you have not set ACCUMULO_HOME and/or HADOOP_HOME as environment variables.
GeoMesa tools will not run without the appropriate Accumulo and Hadoop jars in the tools classpath.
Please ensure that those jars are present in the classpath by running 'geomesa classpath' .
To take corrective action, please place the necessary jar files in the lib directory of geomesa-tools.
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/accumulo/core/client/TableNotFoundException
at org.locationtech.geomesa.tools.commands.TableConfCommand.<init>(TableConfCommand.scala:32)
at org.locationtech.geomesa.tools.Runner$.createCommand(Runner.scala:50)
at org.locationtech.geomesa.tools.Runner$.main(Runner.scala:21)
at org.locationtech.geomesa.tools.Runner.main(Runner.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.accumulo.core.client.TableNotFoundException
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 4 more
How can I fix this?
The GeoMesa tools need Hadoop and Accumulo jars in order to connect to Accumulo.
One quick option is to run the GeoMesa tools from a tablet server or another machine already configured to be part of the Hadoop cluster. If you are using another machine, you can mirror the $HADOOP_HOME and $ACCUMULO_HOME directories from a cluster node locally.
Alternatively, you can run the install-hadoop-accumulo.sh script in the geomesa-tools/bin directory to download a set of Hadoop and Accumulo jars.
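For example, if Hadoop and Accumulo are already installed on the machine, a minimal sketch of the fix is to export the two home variables before rerunning the tools (the paths below are illustrative, not required locations):

# Point these at your actual installs; the geomesa script uses them to build its classpath
export HADOOP_HOME=/usr/local/hadoop
export ACCUMULO_HOME=/usr/local/accumulo
# Confirm the Accumulo and Hadoop jars now appear
geomesa classpath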
Verify that the corresponding jar file is present in the classpath; you can check this with the command geomesa classpath.
If the jar is not present, copy it into the GeoMesa lib directory. In my case that is the following path:
/*/geomesa-1.2.4/dist/tools/geomesa-tools-1.2.4/lib/common/
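For instance, assuming the missing class ships in the Accumulo core jar, a copy along these lines should work (the jar name pattern is illustrative):

# Copy the Accumulo client jar into the tools lib directory
cp $ACCUMULO_HOME/lib/accumulo-core-*.jar $GEOMESA_HOME/lib/common/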
Versions:
Apache Spark 2.2.0
Hadoop 2.7
I want to set up the Apache Spark history server.
The Spark event logs are located in Amazon S3.
I can save the log files to S3, but the history server cannot read them.
Apache Spark is installed at /usr/local/spark,
so $SPARK_HOME is /usr/local/spark.
$ cd /usr/local/spark/sbin
$ sh start-history-server.sh
I got the following error:
Exception in thread "main" java.lang.ClassNotFoundException: org.apache.hadoop.fs.s3a.S3AFileSystem
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.util.Utils$.classForName(Utils.scala:230)
....
My spark-defaults.conf is below:
spark.hadoop.fs.s3a.impl org.apache.hadoop.fs.s3a.S3AFileSystem
spark.history.provider org.apache.hadoop.fs.s3a.S3AFileSystem
spark.history.fs.logDirectory s3a://xxxxxxxxxxxxx
spark.eventLog.enabled true
spark.eventLog.dir s3a://xxxxxxxxxxxxxxx
I installed these two jar files in /usr/local/spark/jars/:
aws-java-sdk-1.7.4.jar
hadoop-aws-2.7.3.jar
but the error is the same.
What is wrong?
Please add the following to your spark-defaults.conf file and retry:
spark.driver.extraClassPath /usr/lib/hadoop/hadoop-aws.jar:/usr/share/aws/aws-java-sdk/*
spark.executor.extraClassPath /usr/lib/hadoop/hadoop-aws.jar:/usr/share/aws/aws-java-sdk/*
What should I do further?
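One detail worth double-checking beyond the classpath: spark.history.provider should name a history provider class, not a filesystem class. A minimal sketch of the relevant spark-defaults.conf entries, with a placeholder bucket name, might be:

spark.history.provider          org.apache.spark.deploy.history.FsHistoryProvider
spark.history.fs.logDirectory   s3a://your-bucket/spark-events
spark.eventLog.enabled          true
spark.eventLog.dir              s3a://your-bucket/spark-events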
I get an error message when running this jar file on a Hadoop system:
hadoop jar units.jar /input_dir/sample.txt /output_dir/result
Exception in thread "main" java.lang.ClassNotFoundException: /input_di /sample/txt
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:278)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:214)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
From the Apache Hadoop docs:
Usage: hadoop jar <jar> [mainClass] args...
Runs a jar file.
You are missing the fully qualified class name in your JAR command.
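For example, assuming the driver class in units.jar is com.example.UnitsCount (a hypothetical name; substitute the class actually packaged in the jar), the command would be:

hadoop jar units.jar com.example.UnitsCount /input_dir/sample.txt /output_dir/result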
I am new to Hadoop and learnt that with the 2.x versions I can try Hadoop on my local Windows 7 64-bit machine.
I installed Hadoop 2.6.0 and Cygwin.
I could execute bin/hadoop version, but I get the error below while executing the jar command:
Note: I have also placed the winutils.jar in the bin directory, taken from hadoop-common-2.2.0.jar. Please help; I am not able to get rid of this error. I have also entered the input and output parameters, and it still fails.
$ bin/hadoop jar /Hadoop/hadoop-2.6.0/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar wordcount
15/02/03 12:40:45 ERROR util.Shell: Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:355)
at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:370)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:363)
at org.apache.hadoop.util.GenericOptionsParser.preProcessForWindows
(GenericOptionsParser.java:438)
at org.apache.hadoop.util.GenericOptionsParser.parseGeneralOptions
(GenericOptionsParser.java:484)
at org.apache.hadoop.util.GenericOptionsParser.<init>
(GenericOptionsParser.java:170)
at org.apache.hadoop.util.GenericOptionsParser.<init>
(GenericOptionsParser.java:153)
at org.apache.hadoop.examples.WordCount.main(WordCount.java:70)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke
(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke
(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke
(ProgramDriver.java:71)
at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:144)
at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:74)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke
(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke
(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Usage: wordcount <in> [<in>...] <out>
I could run the below command as well:
$ bin/hadoop jar /Hadoop/hadoop-2.6.0/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar
This used to be a known issue. However, if you are able to run the program through the jar, something else may be at fault.
If the same thing works for you from Java code, you can edit the jar to remove the code where the exception is raised.
To be doubly sure, check whether the bin directory contains winutils.exe and hadoop.dll.
If they are not present, chances are that someone else has faced a similar issue and published the files. These files are created when Hadoop is built from source code on the OS.
It seems that you have installed Hadoop 2.6.0 with an older version of the Hadoop winutils. You must install the winutils matching your current Hadoop version. Try downloading winutils from this GitHub repo: https://github.com/steveloughran/winutils/tree/master/hadoop-2.6.0/bin
Finally, replace your bin directory with the winutils bin directory.
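As a sketch, assuming the matching winutils files have been downloaded into the Hadoop bin directory (the input and output paths below are placeholders), the setup under Cygwin would look something like this:

# HADOOP_HOME must point at the install whose bin now contains winutils.exe and hadoop.dll
export HADOOP_HOME=/Hadoop/hadoop-2.6.0
export PATH="$HADOOP_HOME/bin:$PATH"
# Rerun the example with explicit input and output paths
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar wordcount /input /output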
I am trying to use a MySQL database with Apache Oozie.
My $OOZIE_HOME is:
-bash: /opt/oozie_install/oozie-3.3.0-cdh4.2.2: Is a directory
But I have copied mysql-connector-java-5.1.29-bin.jar to almost every possible place, including:
/opt/oozie_install/oozie-3.3.0-cdh4.2.2
/opt/oozie_install/oozie-3.3.0-cdh4.2.2/libs
/opt/oozie_install/oozie-3.3.0-cdh4.2.2/libtools
/usr/lib/jvm/jdk/libs
/user/home/hadoop/
But I am still getting a ClassNotFoundException.
java.lang.Exception: Could not connect to the database: java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
at org.apache.oozie.tools.OozieDBCLI.validateConnection(OozieDBCLI.java:473)
at org.apache.oozie.tools.OozieDBCLI.createDB(OozieDBCLI.java:179)
at org.apache.oozie.tools.OozieDBCLI.run(OozieDBCLI.java:118)
at org.apache.oozie.tools.OozieDBCLI.main(OozieDBCLI.java:64)
Caused by: java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:190)
at org.apache.oozie.tools.OozieDBCLI.createConnection(OozieDBCLI.java:462)
at org.apache.oozie.tools.OozieDBCLI.validateConnection(OozieDBCLI.java:469)
Where exactly am I supposed to copy the MySQL connector?
I have verified my oozie-site.xml, and I followed the documented steps to use MySQL in Oozie.
You have to copy mysql-connector-java-5.1.29-bin.jar to the /opt/oozie_install/oozie-3.3.0-cdh4.2.2/libext directory and then restart the Oozie instance. Make sure that the MySQL user oozie has sufficient privileges on the oozie database; if not, grant the permissions with the GRANT command on the MySQL server.
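A sketch of both steps (the oozie password below is a placeholder):

# Copy the driver where Oozie will find it, then restart Oozie
cp mysql-connector-java-5.1.29-bin.jar /opt/oozie_install/oozie-3.3.0-cdh4.2.2/libext/
# Grant the oozie user full rights on the oozie database and reload privileges
mysql -u root -p -e "GRANT ALL PRIVILEGES ON oozie.* TO 'oozie'@'localhost' IDENTIFIED BY 'oozie_password'; FLUSH PRIVILEGES;"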
I met the same problem. I finally solved it by editing oozie-env.sh and appending JAVA_HOME at the end: export JAVA_HOME=/usr/local/jdk1.7 (substitute your own Java path).
I ran into this issue when I was converting my local Derby instance to MySQL. The difference between my issue and the others is that I did not install an RPM; my Oozie instance was pre-compiled in a tar.gz file. I had to copy mysql-connector-java-bin.jar to the oozie-server/lib directory, in addition to copying it to the lib, libext, and libtools directories. I am not sure whether all of those are needed, but I do know that oozie-server/lib is needed for Oozie to start. Hope this helps someone!
I get the following error while executing a MapReduce program.
I have placed all the jars in the hadoop/lib directory and have also passed the jars via -libjars.
This is the command I am executing:
$HADOOP_HOME/bin/hadoop --config $HADOOP_HOME/conf jar /home/shash/distinct.jar HwordCount -libjars $LIB_JARS WordCount HWordCount2
java.lang.RuntimeException: java.lang.ClassNotFoundException: org.apache.hcatalog.mapreduce.HCatOutputFormat
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:996)
    at org.apache.hadoop.mapreduce.JobContext.getOutputFormatClass(JobContext.java:248)
    at org.apache.hadoop.mapred.Task.initialize(Task.java:501)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:306)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
    at org.apache.hadoop.mapred.Child.main(Child.java:264)
Caused by: java.lang.ClassNotFoundException: org.apache.hcatalog.mapreduce.HCatOutputFormat
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:264)
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:943)
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:994)
    ... 8 more
Make sure LIB_JARS is a comma-separated list (as opposed to colon-separated like CLASSPATH).
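For example (the jar paths are illustrative):

# Comma-separated for -libjars, unlike the colon-separated CLASSPATH
export LIB_JARS=/usr/lib/hcatalog/share/hcatalog/hcatalog-core.jar,/usr/lib/hive/lib/hive-metastore.jar
$HADOOP_HOME/bin/hadoop --config $HADOOP_HOME/conf jar /home/shash/distinct.jar HwordCount -libjars $LIB_JARS WordCount HWordCount2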
Applies to: CDH 5.0.x, CDH 5.1.x, CDH 5.2.x, CDH 5.3.x, Sqoop
Cause: Sqoop cannot pick up the HCatalog libraries because Cloudera Manager does not set the HIVE_HOME environment variable. It needs to be set manually.
This problem is tracked in the JIRA below:
https://issues.apache.org/jira/browse/SQOOP-2145
The fix for this JIRA has been included in CDH since version 5.4.0.
Workaround (applicable to CDH versions lower than 5.4.0):
Execute the commands below in the shell before calling Sqoop, or add them to /etc/sqoop/conf/sqoop-env.sh (create the file if it does not already exist):
export HIVE_HOME=/opt/cloudera/parcels/CDH/lib/hive (for parcel installation)
export HIVE_HOME=/usr/lib/hive (for package installation)
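To make the setting persistent instead of exporting it in every shell, it can be appended to sqoop-env.sh (the parcel path is shown; use the package path if that matches your install):

echo 'export HIVE_HOME=/opt/cloudera/parcels/CDH/lib/hive' >> /etc/sqoop/conf/sqoop-env.sh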