Apache Ignite NiFi Integration - apache-nifi

I am trying to create/use an Apache Ignite cache in NiFi. I am using NiFi 1.13.2 and I cannot find the PutIgniteCache and GetIgniteCache processors. Can someone assist me? Does this version support Ignite?

https://cwiki.apache.org/confluence/display/NIFI/Migration+Guidance
Migrating from 1.12.x to 1.13.x
Removed the following nar(s) from the convenience build. They are still built and made available in maven repositories so you can add them to your deployment lib folder and use them if you like. They include; nifi-livy-nar, nifi-livy-controller-service-api-nar, nifi-kafka-0-11-nar, nifi-beats-nar, nifi-ignite-nar
So the nar is removed from the NiFi convenience binary, but you can download the nar file from the Maven repository and put it into NiFi's lib directory to make the Ignite processors available again (restart NiFi afterwards):
https://search.maven.org/artifact/org.apache.nifi/nifi-ignite-nar/1.13.2/nar
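For reference, a minimal sketch of the same artifact declared as a Maven dependency, in case you manage the deployment with Maven rather than downloading by hand (coordinates taken from the link above; the nar packaging type comes from NiFi's nifi-nar-maven-plugin):

  <dependency>
    <groupId>org.apache.nifi</groupId>
    <artifactId>nifi-ignite-nar</artifactId>
    <version>1.13.2</version>
    <type>nar</type>
  </dependency>

Either way, the end state is the same: nifi-ignite-nar-1.13.2.nar sitting in the lib folder of every NiFi node. After a restart, the PutIgniteCache and GetIgniteCache processors should reappear in the processor list.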

Related

Apache JMeter 5.5 build code without runtime dependency com.sun.activation » javax.activation

I'm using the Apache JMeter 5.5 libraries to generate a .jmx file, integrated with my IDE.
Recently my company found a vulnerability in the com.sun.activation » javax.activation 1.2.0 jar file.
The enterprise support team deleted this jar file from our repository.
My Gradle build is now failing; please suggest alternative ways to get around this.
https://mvnrepository.com/artifact/org.apache.jmeter/ApacheJMeter_components/5.5
com.sun.activation » javax.activation
I tried excluding this jar file from the build. Also, this jar has no older or newer version available in the Maven repository.
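One common workaround, as a sketch rather than a verified fix for JMeter specifically: exclude the transitive com.sun.activation » javax.activation and add jakarta.activation » jakarta.activation-api 1.2.2 instead, which ships the same javax.activation package under active Jakarta maintenance. It is shown here in Maven coordinates since the question links mvnrepository; in Gradle the same idea maps to an exclude on the JMeter dependency plus one extra implementation dependency:

  <dependency>
    <groupId>org.apache.jmeter</groupId>
    <artifactId>ApacheJMeter_components</artifactId>
    <version>5.5</version>
    <exclusions>
      <!-- drop the vulnerable transitive jar -->
      <exclusion>
        <groupId>com.sun.activation</groupId>
        <artifactId>javax.activation</artifactId>
      </exclusion>
    </exclusions>
  </dependency>
  <!-- replacement: same javax.activation classes, newer maintenance line -->
  <dependency>
    <groupId>jakarta.activation</groupId>
    <artifactId>jakarta.activation-api</artifactId>
    <version>1.2.2</version>
  </dependency>

Whether the API-only jar is enough at runtime depends on what your test plan touches; com.sun.activation » jakarta.activation 1.2.2 is the corresponding implementation artifact if classes are still missing.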

Elasticsearch 7.17.4 hdfs repository plugin with Hadoop 2.10.2

We have a multi-node Elasticsearch 7.17.4 cluster with Hadoop 2.10.2 in our offline environment. We want to set up the snapshot repository using the "repository-hdfs-7.17.4.zip" plugin (https://www.elastic.co/guide/en/elasticsearch/plugins/7.17/repository-hdfs.html). But this plugin is packed with some hadoop-3.x.x jars. As our Hadoop version is 2.10.2, how should we modify the plugin to make it work?
Should we just pack the jars below into the zip?
hadoop-annotations-2.10.2.jar
hadoop-auth-2.10.2.jar
hadoop-client-2.10.2.jar
hadoop-hdfs-2.10.2.jar
hadoop-hdfs-client-2.10.2.jar
Do we need to remove the jars below from the zip?
hadoop-hdfs-3.3.1.jar
hadoop-client-runtime-3.3.1.jar
The bundled Guava also looks different from the previous plugin version (7.15.2); is any modification needed there?
(7.15.2) : guava-11.0.2.jar
(7.17.4) : guava-27.1-jre.jar
Many thanks.

Apache Zeppelin not working with HTTPS for the Maven repo

I'm running Apache Zeppelin 0.8.0 in Amazon EMR. Recently the Spark interpreter started failing to pull down library dependencies. This was because the zeppelin.interpreter.dep.mvnRepo configuration parameter was set to http://repo1.maven.org/maven2/ and the Maven repo has recently stopped supporting http, as outlined here: https://support.sonatype.com/hc/en-us/articles/360041287334
As per the Maven documentation I updated the value of this parameter to https://repo1.maven.org/maven2/, but this didn't resolve the issue. Instead, updating the value to http://insecure.repo1.maven.org/maven2/ fixed the problem.
It seems like Zeppelin does not work with https for the Maven repo. Can anyone confirm whether this is the case, or is some extra setup required to get this working?
I was having issues pulling down library dependencies on Zeppelin 0.8.2 with zeppelin.interpreter.dep.mvnRepo set to http://repo1.maven.org/maven2/, as you mention. For me it works with zeppelin.interpreter.dep.mvnRepo set to https://repo1.maven.org/maven2/, but only if I also set the spark.jars.packages property in the Zeppelin interpreter GUI to the groupId:artifactId:version of the library I am trying to pull.
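For reference, a minimal sketch of the corresponding setting in conf/zeppelin-site.xml (assuming a default Zeppelin layout; the property name is the one from the question, and the same value can also be edited in the interpreter GUI, followed by a Zeppelin restart):

  <property>
    <name>zeppelin.interpreter.dep.mvnRepo</name>
    <value>https://repo1.maven.org/maven2/</value>
  </property>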

How can apache-karaf download required dependencies from features.xml without my intervention when I modify it?

Please see the image first.
I have multiple instances of apache-karaf. When I change something in my Java project I deploy the jar file into the deploy folder of Karaf, and this is not good because I have to do that for every instance.
I don't know apache-karaf very well yet.
I saw that it's easy to use features, so I created a features.xml in the deploy folder.
Example:
mvn:org.apache.commons/com.springsource.org.apache.commons.logging/1.1.1
mvn:org.springframework/spring-core/3.1.1.RELEASE
What I want is this: when I deploy a new jar to my local Maven repository and change the version of org.springframework/spring-core to 4.1.1.RELEASE in features.xml, for example, Karaf should download this modification without my intervention (see the features.xml sketch below).
Is Karaf able to download the new dependencies and delete the old ones on its own?
If it's not clear, you can ask me questions.
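A minimal features.xml sketch of the version bump described above (the features and feature names are assumptions for illustration; the bundle URLs are the ones from the question):

  <features name="my-app-features" xmlns="http://karaf.apache.org/xmlns/features/v1.2.0">
    <feature name="my-app" version="1.1.0">
      <bundle>mvn:org.apache.commons/com.springsource.org.apache.commons.logging/1.1.1</bundle>
      <!-- bumped from 3.1.1.RELEASE; redeploying this file should make Karaf resolve the new bundle -->
      <bundle>mvn:org.springframework/spring-core/4.1.1.RELEASE</bundle>
    </feature>
  </features>

Dropping the updated file into the deploy folder refreshes the feature on that one instance; propagating the change to all instances automatically is what Cellar and Cave, described below, address.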
You can use Apache Karaf Cellar and Apache Karaf Cave for this scenario.
Apache Karaf Cellar brings "farming" to Karaf: you can configure multiple Karaf instances within one group.
Apache Karaf Cave is a central repository which can be used to provide all required bundles to the Karaf instances.

Dependency issues with app while deploying in tomcat-server

I am using HBase 0.94.7, Hadoop 1.0.4 and Tomcat 7.
I wrote a small REST-based application which performs CRUD operations on HBase.
Earlier I used to run the app using the Maven Tomcat plugin.
Now I am trying to deploy the war in a Tomcat server.
Since the Hadoop and HBase jars already contain older versions of the org.mortbay.jetty, jsp-api and servlet-api jars,
I am getting AbstractMethodError exceptions.
Here's the exception log:
So I excluded org.mortbay.jetty from both the hadoop and hbase dependencies in pom.xml, but it started showing more and more issues of the same kind, for example with Jasper (see the pom.xml sketch after this question).
So then I added the provided scope to the hadoop and hbase dependencies.
Now Tomcat is unable to find the hadoop and hbase jars.
Can someone help me fix these dependency issues?
Thanks.
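A hedged sketch of the exclusion approach described in the question (the artifactIds are illustrative; run mvn dependency:tree to get the exact list for your hadoop/hbase versions, and keep the default compile scope so the jars still end up in WEB-INF/lib):

  <dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase</artifactId>
    <version>0.94.7</version>
    <exclusions>
      <!-- container-provided servlet/jsp/jetty jars must not ship inside the war -->
      <exclusion>
        <groupId>org.mortbay.jetty</groupId>
        <artifactId>jetty</artifactId>
      </exclusion>
      <exclusion>
        <groupId>javax.servlet</groupId>
        <artifactId>servlet-api</artifactId>
      </exclusion>
      <exclusion>
        <groupId>javax.servlet.jsp</groupId>
        <artifactId>jsp-api</artifactId>
      </exclusion>
    </exclusions>
  </dependency>

The same exclusions would go on the hadoop dependency. The idea is to exclude only the container jars rather than marking the whole hadoop/hbase dependency as provided, which is what removed them from the war in the first place.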
Do one thing (in Eclipse):
- Right-click on the project
- Go to Properties
- Type "Java Build Path"
- Go to the Libraries tab
- Remove the duplicated lib and Maven dependencies
- Clean and rebuild your project
This might solve your problem.
