I added some jar files to the NiFi lib folder to let me connect to Hive, but unfortunately the NiFi app no longer comes up in the web UI. I then removed all the new jar files, but the issue was not fixed and I keep getting the following error (NiFi version 1.15.0):
ERROR [main] org.apache.nifi.NiFi Failure to launch NiFi due to java.lang.NoClassDefFoundError: org/apache/nifi/processor/DataUnit
I did find this answer, which seems to point to the need to have declared the file in the pom: Apache Nifi failure due to java.lang.NoClassDefFoundError in a local maven dependency jar.
Related
Firstly, after changing the packaging from jar to war for a Spring Boot application, I am not able to do a Maven build. I am getting the error below:
Caused by: org.springframework.beans.factory.BeanDefinitionStoreException: Failed to parse configuration class [com.goel.GameApplication]; nested exception is java.io.FileNotFoundException: Could not open ServletContext resource [/application.properties]
at org.springframework.context.annotation.ConfigurationClassParser.parse(ConfigurationClassParser.java:189)
at org.springframework.context.annotation.ConfigurationClassPostProcessor.processConfigBeanDefinitions(ConfigurationClassPostProcessor.java:331)
The above error comes when Maven tries to run the app tests, so I commented out the tests, ran the Maven build, and was able to get a war file.
But after deploying that same war file to Tomcat (on AWS), I am getting the same error.
I have followed the 3-step approach properly (ref: https://www.javadream.in/convert-jar-to-war-in-spring-boot/); a sketch of the usual setup is shown below.
I have checked that the build path does not exclude the resources folder or application.properties.
I have tried moving from the Spring IDE to the Eclipse IDE.
I have opened the war file and verified that application.properties is present under WEB-INF/classes/application.properties.
I have read and tried many other things, but nothing helped. Any pointers would be helpful. Thanks in advance.
Please let me know if more info is required.
Java version: 1.8
Maven version: 3.6.3
spring-boot-starter-parent: 2.5.4
(Screenshots attached: the exploded war file and the SSGApplication class.)
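For context, the commonly recommended war-deployment setup (I can't guarantee it matches the linked guide exactly) is to keep <packaging>war</packaging>, mark spring-boot-starter-tomcat as provided, and have the main class extend SpringBootServletInitializer so the external Tomcat can bootstrap the application and resolve application.properties from the war's classpath. A minimal sketch, using a placeholder class name rather than the project's actual one:

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.builder.SpringApplicationBuilder;
import org.springframework.boot.web.servlet.support.SpringBootServletInitializer;

@SpringBootApplication
public class GameApplication extends SpringBootServletInitializer {

    // Lets an external servlet container (Tomcat) bootstrap the application
    // instead of the embedded server used when running the jar directly.
    @Override
    protected SpringApplicationBuilder configure(SpringApplicationBuilder builder) {
        return builder.sources(GameApplication.class);
    }

    public static void main(String[] args) {
        SpringApplication.run(GameApplication.class, args);
    }
}

If the application class already looks like this, it may be worth checking whether anything (for example a @PropertySource or an imported XML context) refers to /application.properties as a ServletContext resource rather than a classpath resource, since that is what the exception complains about.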
I have an existing project I am working on. I got tired of downloading individual jar files and importing them into my project's library, so I decided to experiment with Services -> Maven.
I need to use Apache POI in my project. I can search for and find the Apache POI library, but how do I add it to my project so that all of its dependencies are also present? I am currently getting errors like the following:
java.io.IOException: org/apache/commons/collections4/ListValuedMap
at org.apache.poi.ss.usermodel.WorkbookFactory.createWorkbook(WorkbookFactory.java:353)
at org.apache.poi.ss.usermodel.WorkbookFactory.createXSSFWorkbook(WorkbookFactory.java:316)
at org.apache.poi.ss.usermodel.WorkbookFactory.create(WorkbookFactory.java:59)
at Tufin.InterfaceCommand.run(InterfaceMagicCLI.java:432)
at picocli.CommandLine.executeUserObject(CommandLine.java:1666)
at picocli.CommandLine.access$900(CommandLine.java:144)
at picocli.CommandLine$RunAll.handle(CommandLine.java:2094)
at picocli.CommandLine$RunAll.handle(CommandLine.java:2053)
at picocli.CommandLine$AbstractParseResultHandler.execute(CommandLine.java:1872)
at picocli.CommandLine.execute(CommandLine.java:1801)
at Tufin.InterfaceMagicCLI.main(InterfaceMagicCLI.java:81)
Caused by: java.lang.NoClassDefFoundError: org/apache/commons/collections4/ListValuedMap
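For context, when POI is pulled in through a Maven coordinate rather than as individual jars, commons-collections4 normally comes along as a transitive dependency. A minimal sketch of such a dependency (the version number is illustrative, not taken from the question):

<!-- poi-ooxml pulls in poi and, transitively, commons-collections4; version is illustrative -->
<dependency>
    <groupId>org.apache.poi</groupId>
    <artifactId>poi-ooxml</artifactId>
    <version>5.2.3</version>
</dependency>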
I have some map-reduce code working fine on my local machine.
When I run the job on a remote cluster, I get this error:
Error: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.avro.mapreduce.AvroKeyInputFormat not found
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2298)
at org.apache.hadoop.mapreduce.task.JobContextImpl.getInputFormatClass(JobContextImpl.java:175)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:751)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:175)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:169)
My local machine runs Hadoop 2.7.0 and my remote cluster runs Hadoop 2.8.1.
Where might this error come from?
It is less probable, but you might double-check your dependency:
<dependency>
    <groupId>org.apache.avro</groupId>
    <artifactId>avro-mapred</artifactId>
    <version>1.7.6</version>
</dependency>
The error clearly says ClassNotFoundException: Class org.apache.avro.mapreduce.AvroKeyInputFormat, which means that the required Apache Avro library is missing from the classpath.
Hadoop picks up jar files from the path emitted by the command below:
$ hadoop classpath
By default it lists all the Hadoop core jars.
You can add your own jar by executing the command below at the prompt, or by adding it to the shell script that launches the map-reduce job:
export HADOOP_CLASSPATH=/path/to/my/apache_avro_jar.jar
After executing this, check the classpath again with the hadoop classpath command; you should see your jar listed along with the Hadoop core jars.
Then try to run the program again.
Another option is to create a fat jar of the program that includes the Apache Avro jar (a sketch is shown below).
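A minimal sketch of a fat-jar setup with the maven-shade-plugin (the plugin version is only an example); it bundles all compile-scope dependencies, including avro-mapred, into the job jar:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.2.4</version>
    <executions>
        <execution>
            <!-- Bind shading to the package phase so "mvn package" produces the fat jar -->
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
        </execution>
    </executions>
</plugin>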
A third option is to add the Apache Avro jar to the map-reduce classpath via the DistributedCache:
DistributedCache.addFileToClassPath(avroJar, conf);
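In context, that call belongs in the driver's job setup, before the Job is created from the configuration. A rough sketch, assuming the avro-mapred jar has already been copied to a hypothetical HDFS path:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.filecache.DistributedCache;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;

public class AvroJobDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Hypothetical HDFS location of the avro-mapred jar uploaded beforehand.
        Path avroJar = new Path("/libs/avro-mapred-1.7.6.jar");
        DistributedCache.addFileToClassPath(avroJar, conf);

        Job job = Job.getInstance(conf, "avro-job");
        // ... configure mapper, reducer and AvroKeyInputFormat as in the original program
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

Note that DistributedCache is marked deprecated in Hadoop 2.x, where Job.addFileToClassPath(Path) serves the same purpose.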
I'm running Spark's example called JavaPageRank, but it's a copy that I compiled separately with Maven into a new jar. I keep getting this error:
ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[Executor task launch worker-0,5,main]
java.lang.NoClassDefFoundError: com/google/common/collect/Iterables
This is despite the fact that Guava is listed as one of Spark's dependencies. I'm running Spark 1.6, which I downloaded pre-compiled from the Apache website.
Thanks!
The error means that the jar containing the com.google.common.collect.Iterables class is not on the classpath, so your application cannot find the required class at runtime.
If you are using Maven or Gradle, try to clean, build, and refresh the project. Then check your classes folder and make sure the Guava jar is in the lib folder.
Hope this helps.
Good luck!
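As a side note not covered by the answer above: if the class is only missing on the executors, one common workaround is to ship a Guava jar explicitly with the application at submit time. A rough sketch (all paths and the Guava version are hypothetical):

# Ship a Guava jar alongside the application jar so executors can load
# com.google.common.collect.Iterables; paths and version are illustrative.
spark-submit \
  --class org.apache.spark.examples.JavaPageRank \
  --jars /path/to/guava-14.0.1.jar \
  /path/to/my-pagerank.jar /path/to/pagerank_data.txt 10

Alternatively, bundling Guava into the application's own fat jar achieves the same effect.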
I am using HBase 0.94.7, Hadoop 1.0.4 and Tomcat 7.
I wrote a small REST-based application which performs CRUD operations on HBase.
Earlier I used to run the app using the Maven Tomcat plugin.
Now I am trying to deploy the war in a standalone Tomcat server.
Since the Hadoop and HBase dependencies already pull in older versions of the org.mortbay.jetty, jsp-api and servlet-api jars,
I am getting AbstractMethodError exceptions.
Here's the exception log:
So I added an exclusion of org.mortbay.jetty to both the Hadoop and HBase dependencies in pom.xml (a sketch of such an exclusion is shown below), but it started showing more and more issues of the same kind, for example with Jasper.
So then I added the provided scope to the Hadoop and HBase dependencies.
Now Tomcat is unable to find the Hadoop and HBase jars.
Can someone help me fix these dependency issues?
Thanks.
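For reference, the kind of exclusion described in the question would look roughly like this in the pom (the HBase version is the one from the question; the excluded artifacts are illustrative, since the exact list depends on what mvn dependency:tree shows):

<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase</artifactId>
    <version>0.94.7</version>
    <exclusions>
        <!-- Keep the old Jetty jars pulled in by HBase out of the war;
             repeat for the other org.mortbay.jetty artifacts the dependency tree lists -->
        <exclusion>
            <groupId>org.mortbay.jetty</groupId>
            <artifactId>jetty</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.mortbay.jetty</groupId>
            <artifactId>jetty-util</artifactId>
        </exclusion>
    </exclusions>
</dependency>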
Do one thing:
- Right-click on the project
- Go to Properties
- Type "Java Build Path"
- Go to the third tab, Libraries
- Remove the lib and Maven dependencies
- Clean and build your project
This might solve your problem.