What is the conflict with TIFFImageWriterSpi? - hadoop

I am trying to set up http://www.geomesa.org/documentation/tutorials/geomesa-raster.html
I have jai_imageio-1.1.jar in the GeoMesa libs, which contains TIFFImageWriterSpi.
When I try to ingest a raster using this command:
geomesa ingestraster -u root -p qwerty -t natearth -f "/home/gaurav/Downloads/CPSC-771/geoserver-2.8.3/data_dir/coverages/retile/1/NE2_HR_LC_SR_W_DR_01_01.tif" -F geotiff
Full log:
Using GEOMESA_HOME = /home/gaurav/Downloads/CPSC-771/geomesa-1.2.1/dist/tools/geomesa-tools-1.2.1/
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/gaurav/Downloads/CPSC-771/Installs/accumulo-1.6.5/lib/slf4j-log4j12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/gaurav/Downloads/CPSC-771/Installs/zookeeper-3.4.8/contrib/fatjar/zookeeper-3.4.8-fatjar.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Exception in thread "main" java.lang.NoClassDefFoundError: it/geosolutions/imageioimpl/plugins/tiff/TIFFImageWriterSpi
at org.geotools.gce.geotiff.GeoTiffFormat.<clinit>(GeoTiffFormat.java:106)
at org.geotools.gce.geotiff.GeoTiffFormatFactorySpi.createFormat(GeoTiffFormatFactorySpi.java:88)
at org.geotools.coverage.grid.io.GridFormatFinder.findFormats(GridFormatFinder.java:185)
at org.geotools.coverage.grid.io.GridFormatFinder.findFormat(GridFormatFinder.java:236)
at org.geotools.coverage.grid.io.GridFormatFinder.findFormat(GridFormatFinder.java:216)
at org.locationtech.geomesa.tools.ingest.RasterIngest$class.getReader(RasterIngest.scala:57)
at org.locationtech.geomesa.tools.ingest.LocalRasterIngest.getReader(LocalRasterIngest.scala:26)
at org.locationtech.geomesa.tools.ingest.LocalRasterIngest.ingestRasterFromFile(LocalRasterIngest.scala:52)
at org.locationtech.geomesa.tools.ingest.LocalRasterIngest$$anonfun$runIngestTask$1$$anonfun$apply$mcV$sp$1.apply(LocalRasterIngest.scala:48)
at org.locationtech.geomesa.tools.ingest.LocalRasterIngest$$anonfun$runIngestTask$1$$anonfun$apply$mcV$sp$1.apply(LocalRasterIngest.scala:48)
at scala.collection.parallel.mutable.ParArray$ParArrayIterator.foreach_quick(ParArray.scala:143)
at scala.collection.parallel.mutable.ParArray$ParArrayIterator.foreach(ParArray.scala:136)
at scala.collection.parallel.ParIterableLike$Foreach.leaf(ParIterableLike.scala:972)
at scala.collection.parallel.Task$$anonfun$tryLeaf$1.apply$mcV$sp(Tasks.scala:49)
at scala.collection.parallel.Task$$anonfun$tryLeaf$1.apply(Tasks.scala:48)
at scala.collection.parallel.Task$$anonfun$tryLeaf$1.apply(Tasks.scala:48)
at scala.collection.parallel.Task$class.tryLeaf(Tasks.scala:51)
at scala.collection.parallel.ParIterableLike$Foreach.tryLeaf(ParIterableLike.scala:969)
at scala.collection.parallel.AdaptiveWorkStealingTasks$WrappedTask$class.compute(Tasks.scala:152)
at scala.collection.parallel.AdaptiveWorkStealingForkJoinTasks$WrappedTask.compute(Tasks.scala:443)
at scala.concurrent.forkjoin.RecursiveAction.exec(RecursiveAction.java:160)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.ClassNotFoundException: it.geosolutions.imageioimpl.plugins.tiff.TIFFImageWriterSpi
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 25 more

Read your error message:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/gaurav/Downloads/CPSC-771/Installs/accumulo-1.6.5/lib/slf4j-log4j12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/gaurav/Downloads/CPSC-771/Installs/zookeeper-3.4.8/contrib/fatjar/zookeeper-3.4.8-fatjar.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
The link to the explanation is right there in the output.
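To pin down where the conflict comes from, it also helps to check which jar on the tools classpath actually provides the class the stack trace asks for: it wants it.geosolutions.imageioimpl.plugins.tiff.TIFFImageWriterSpi (the imageio-ext TIFF plugin), while jai_imageio-1.1.jar ships the com.sun.media variant of TIFFImageWriterSpi. A minimal sketch, assuming the jars sit under $GEOMESA_HOME/lib (adjust the path to your layout):
# Print every jar that bundles a TIFFImageWriterSpi, together with its package.
GEOMESA_HOME=/home/gaurav/Downloads/CPSC-771/geomesa-1.2.1/dist/tools/geomesa-tools-1.2.1
for jar in "$GEOMESA_HOME"/lib/*.jar; do
  unzip -l "$jar" | grep 'TIFFImageWriterSpi' | sed "s|^|$jar: |"
done
# The ingest needs a hit under it/geosolutions/imageioimpl/plugins/tiff/; if only the
# com/sun/media entries show up, the imageio-ext TIFF jar is missing from the classpath.
If no jar provides the it.geosolutions class, adding the imageio-ext TIFF jar to the tools lib directory (rather than jai_imageio) is the usual direction to look.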

Related

NoSuchMethodError while starting Hive shell

I have configured Hadoop on a single-node cluster; the cluster is configured properly and I have even run MapReduce jobs on it. I have now installed Hive and configured it, but when I start the Hive shell I get the error below.
[dsawale#localhost apache-hive-2.1.0-bin]$ bin/hive
which: no hbase in (/usr/local/bin:/usr/local/sbin:/usr/bin:/usr/sbin:/bin:/sbin:/usr/lib/jvm/jre-1.8.0-openjdk/bin:/home/dsawale/hadoop-3.2.1/bin:/home/dsawale/kafka_2.11-2.4.1/bin:/home/dsawale/sqoop-1.4.7.bin__hadoop-2.6.0/bin:/home/dsawale/apache-hive-2.1.0-bin/bin:/home/dsawale/.local/bin:/home/dsawale/bin:/home/dsawale/spark-2.4.5-bin-hadoop2.7/bin:/usr/lib/jvm/jre-1.8.0-openjdk/bin:/home/dsawale/hadoop-3.2.1/bin:/home/dsawale/kafka_2.11-2.4.1/bin:/home/dsawale/sqoop-1.4.7.bin__hadoop-2.6.0/bin:/home/dsawale/apache-hive-2.1.0-bin/bin)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/dsawale/apache-hive-2.1.0-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/dsawale/hadoop-3.2.1/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1357)
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1338)
at org.apache.hadoop.mapred.JobConf.setJar(JobConf.java:536)
at org.apache.hadoop.mapred.JobConf.setJarByClass(JobConf.java:554)
at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:448)
at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:3612)
at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:3570)
at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jCommon(LogUtils.java:76)
at org.apache.hadoop.hive.common.LogUtils.initHiveLog4j(LogUtils.java:60)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:657)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
at org.apache.hadoop.util.RunJar.main(RunJar.java:236)
How can I resolve this?
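This particular NoSuchMethodError on com.google.common.base.Preconditions.checkArgument almost always means two different Guava versions are in play: Hive 2.1.0 bundles an old Guava in its lib directory, while Hadoop 3.2.1 is built against a much newer one. A commonly used workaround (not an official fix) is to replace Hive's Guava jar with the one that ships with Hadoop; a sketch using the install paths from the output above, with the exact jar versions left for you to verify locally:
# See which Guava versions each side ships (the version numbers you find may differ).
ls /home/dsawale/apache-hive-2.1.0-bin/lib/guava-*.jar
ls /home/dsawale/hadoop-3.2.1/share/hadoop/common/lib/guava-*.jar
# Move Hive's old Guava aside and copy Hadoop's newer Guava into Hive's lib.
mv /home/dsawale/apache-hive-2.1.0-bin/lib/guava-*.jar /tmp/
cp /home/dsawale/hadoop-3.2.1/share/hadoop/common/lib/guava-*.jar /home/dsawale/apache-hive-2.1.0-bin/lib/
Then start bin/hive again.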

Failed to launch hive

I installed Hadoop and Hive on macOS. I am able to launch Hadoop and YARN without any problem, and I can run hadoop fs commands to operate on files in HDFS. But I fail to launch the Hive process and get the error below.
$ hive
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/Cellar/hive/2.1.0/libexec/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/Cellar/hadoop/2.8.0/libexec/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Logging initialized using configuration in jar:file:/usr/local/Cellar/hive/2.1.0/libexec/lib/hive-common-2.1.0.jar!/hive-log4j2.properties Async: true
Exception in thread "main" java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bhive.session.id%7D_resources
at org.apache.hadoop.fs.Path.initialize(Path.java:254)
at org.apache.hadoop.fs.Path.<init>(Path.java:212)
at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:634)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:550)
at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:518)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:705)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:234)
at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
Caused by: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bhive.session.id%7D_resources
at java.net.URI.checkPath(URI.java:1823)
at java.net.URI.<init>(URI.java:745)
at org.apache.hadoop.fs.Path.initialize(Path.java:251)
... 12 more
I am new to Hive and not sure where to look. How can I solve this issue?
You can try adding this at the top of your hive-site.xml:
<property>
  <name>system:java.io.tmpdir</name>
  <value>/tmp/hive/java</value>
</property>
Or change the directory to something like /tmp/mydir, as described in Configuring Hive.
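Whichever value you pick, it also helps to make sure the directory exists and is writable by the user who runs Hive, since Hive creates its session directories underneath it. A small sketch for the /tmp/hive/java value shown above:
# Create the scratch directory referenced from hive-site.xml and open it up
# for the Hive user (world-writable like /tmp; tighten the mode if you prefer).
mkdir -p /tmp/hive/java
chmod -R 777 /tmp/hive
Restart the hive shell afterwards so the new hive-site.xml values are picked up.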

How to avoid an IO error while using kite-dataset to import data?

I'm using the Hortonworks HDP distribution (2.4) on Ubuntu 14.
I downloaded kite-dataset and am running this command:
./kite-dataset -v csv-import --delimiter '|' ml-100k/u.item movies
I'm getting this error:
WARNING: Use "yarn jar" to launch YARN applications.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
IO error
org.kitesdk.data.DatasetIOException: Cannot add jar path to distributed cache: /usr/hdp/2.4.2.0-258/hive/lib
at org.kitesdk.tools.TaskUtil$ConfigBuilder.addJarPathForClass(TaskUtil.java:129)
at org.kitesdk.tools.TransformTask.run(TransformTask.java:165)
at org.kitesdk.cli.commands.CSVImportCommand.run(CSVImportCommand.java:186)
at org.kitesdk.cli.Main.run(Main.java:184)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.kitesdk.cli.Main.main(Main.java:266)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.io.IOException: Jar file: /usr/hdp/2.4.2.0-258/hive/lib/ojdbc6.jar does not exist.
at org.apache.crunch.util.DistCache.addJarToDistributedCache(DistCache.java:115)
at org.apache.crunch.util.DistCache.addJarDirToDistributedCache(DistCache.java:208)
at org.apache.crunch.util.DistCache.addJarDirToDistributedCache(DistCache.java:229)
at org.kitesdk.tools.TaskUtil$ConfigBuilder.addJarPathForClass(TaskUtil.java:127)
... 11 more
What can I do to overcome this issue?
This seems to be the relevant part of the error message:
Caused by: java.io.IOException: Jar file: /usr/hdp/2.4.2.0-258/hive/lib/ojdbc6.jar does not exist
The missing jar seems to be an Oracle JDBC driver.
You can download the JDBC driver from Oracle's download page.
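On HDP, that ojdbc6.jar path is often a dangling symlink: the package points at a driver that was never installed, and the Kite job fails when it tries to ship every jar in that directory to the distributed cache. A quick check, treating the symlink target below as an assumption to verify on your box:
# See whether ojdbc6.jar is a broken symlink and where it points.
ls -l /usr/hdp/2.4.2.0-258/hive/lib/ojdbc6.jar
# Option 1: put a real Oracle JDBC driver at the target location (example target shown),
# after downloading it from Oracle.
sudo cp ojdbc6.jar /usr/share/java/ojdbc6.jar
# Option 2: if you do not need Oracle connectivity, remove the broken link from hive/lib.
sudo rm /usr/hdp/2.4.2.0-258/hive/lib/ojdbc6.jar
Either way, re-run the csv-import afterwards.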

Multiple SLF4J binding error

When I start Hive with the hive command, this error appears.
How can I fix this error?
Logging initialized using configuration in jar:file:/usr/local/Cellar/hive/0.14.0/libexec/lib/hive-common-0.14.0.jar!/hive-log4j.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/Cellar/hadoop/2.6.0/libexec/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/Cellar/hive/0.14.0/libexec/lib/hive-jdbc-0.14.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:444)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:672)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:616)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
PS: sorry for my bad English.
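For what it is worth, the SLF4J multiple-bindings block is only a warning; the run actually fails on the SessionHiveMetaStoreClient exception below it. The duplicate binding itself can be silenced by leaving only one StaticLoggerBinder on the classpath; a sketch using the Homebrew paths from the output, and assuming the Hive CLI does not otherwise need the standalone JDBC jar:
# List every jar under Hive's lib that bundles an SLF4J binding.
for jar in /usr/local/Cellar/hive/0.14.0/libexec/lib/*.jar; do
  unzip -l "$jar" | grep -q 'org/slf4j/impl/StaticLoggerBinder.class' && echo "$jar"
done
# Move the duplicate (the standalone JDBC jar) out of the lib directory.
mv /usr/local/Cellar/hive/0.14.0/libexec/lib/hive-jdbc-0.14.0-standalone.jar ~/hive-jdbc-0.14.0-standalone.jar.bak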

slf4j multiple bindings with Mahout on Amazon EMR

I'm running a Mahout job on Amazon EMR and getting the following exception:
ArrayUtil.oversize(II)I
attempt_201311181700_0002_m_000000_0: SLF4J: Class path contains multiple SLF4J bindings.
attempt_201311181700_0002_m_000000_0: SLF4J: Found binding in [jar:file:/home/hadoop/lib/slf4j-log4j12-1.7.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
attempt_201311181700_0002_m_000000_0: SLF4J: Found binding in [jar:file:/mnt/var/lib/hadoop/mapred/taskTracker/hadoop/jobcache/job_201311181700_0002/jars/job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
attempt_201311181700_0002_m_000000_0: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
attempt_201311181700_0002_m_000000_0: SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Error: org.apache.lucene.util.ArrayUtil.oversize(II)I
attempt_201311181700_0002_m_000000_1: SLF4J: Class path contains multiple SLF4J bindings.
attempt_201311181700_0002_m_000000_1: SLF4J: Found binding in [jar:file:/home/hadoop/lib/slf4j-log4j12-1.7.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
attempt_201311181700_0002_m_000000_1: SLF4J: Found binding in [jar:file:/mnt/var/lib/hadoop/mapred/taskTracker/hadoop/jobcache/job_201311181700_0002/jars/job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
attempt_201311181700_0002_m_000000_1: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
attempt_201311181700_0002_m_000000_1: SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Error: org.apache.lucene.util.ArrayUtil.oversize(II)I
Exception in thread "main" java.lang.IllegalStateException: Job failed!
at org.apache.mahout.vectorizer.collocations.llr.CollocDriver.generateCollocations(CollocDriver.java:238)
at org.apache.mahout.vectorizer.collocations.llr.CollocDriver.generateAllGrams(CollocDriver.java:187)
at org.apache.mahout.vectorizer.DictionaryVectorizer.createTermFrequencyVectors(DictionaryVectorizer.java:184)
at clustering.AmazonClusteringDriver.main(AmazonClusteringDriver.java:122)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:187)
I excluded the slf4j dependency from the Mahout dependency; however, that does not solve the problem. So where is the problem?
This is the wrong place to ask this.
You should ask on the Mahout developer mailing list:
https://cwiki.apache.org/confluence/display/MAHOUT/Mailing+Lists,+IRC+and+Archives#MailingLists%2CIRCandArchives-MahoutUserList
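For what it's worth here too, the SLF4J lines in the task logs are only warnings; the mappers are actually failing on org.apache.lucene.util.ArrayUtil.oversize(II)I (typically a NoSuchMethodError), which usually points at a Lucene version mismatch between what the job jar was built against and what is on the EMR classpath. A quick way to compare the two, sketched with paths taken from the task log (the job jar name stands in for whatever fat jar you submit):
# Lucene classes bundled inside the submitted job jar ...
unzip -l job.jar | grep 'lucene'
# ... versus Lucene jars already on the cluster's Hadoop classpath.
ls /home/hadoop/lib | grep -i lucene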
