SuperCSV missing package in JAR - supercsv

This sounds silly, but I can't find the package org.supercsv.mock in any JAR file I've downloaded for Super CSV.
Does anyone know where this package is located?

That package is purely used for test/example classes (which are not in any distributed jars that you can download).
You can view the source of the classes in this package by:
checking them out of the Super CSV subversion repository, or
viewing the test source cross-reference.
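For the first option, the checkout itself is a one-liner; the URL below is a placeholder, so take the real one from the project's source-control page:

svn checkout <supercsv-repository-url>/trunk supercsv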

Related

Building a specific type of a package

I am new to Gradle. Here is my scenario. I have a Gradle project. This project doesn’t have any Java code. All it has is a ‘build.gradle’ file … to package other war/jar/libs/configs from certain source directories and create a TAR.GZ file as output. … This ‘build.gradle’ file is what we currently use, from before implementing DevOps.
Now, after implementing DevOps, we use an Artifactory repository to store all the war/jar/libs/configs files.
We want to update this ‘build.gradle’ file to fetch / download all the files from Artifactory … to build the TAR package ... rather than sourcing them from local directories.
I have a specific need:
• Produce 3 different package types – LIGHT / PARTIAL / FULL … meaning, the LIGHT package will have a pre-defined set of files, the PARTIAL package a custom selection, and the FULL package everything
• I want to pass the option, via the gradle.properties file
• Gradle should download the files from Artifactory, according to the package type mentioned in the properties file (LIGHT/PARTIAL/FULL)
• Is it possible to bring such dynamism into a build.gradle file?
Please guide. THANKS A LOT
You can use File Specs to download files from Artifactory for your different packages. For example, you can define a different file spec for each of the package types you mentioned and download the matching files using the REST API or the JFrog CLI. These file specs can then be used in your Gradle builds.
You can find a few sample specs on GitHub.
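To illustrate the dynamism you are asking about, here is a minimal build.gradle sketch; the packageType property name, the coordinates, and the Artifactory URL are all placeholders of mine, not anything prescribed by Gradle or Artifactory:

// sketch only: property name, coordinates and repository URL are hypothetical
def packageType = (findProperty('packageType') ?: 'FULL').toString().toUpperCase()

// map each package type to the artifacts it should pull from Artifactory
def contents = [
    LIGHT  : ['com.example:app:1.0@war'],
    PARTIAL: ['com.example:app:1.0@war', 'com.example:extras:1.0@jar'],
    FULL   : ['com.example:app:1.0@war', 'com.example:extras:1.0@jar', 'com.example:configs:1.0@zip']
]

repositories {
    maven { url 'https://artifactory.example.com/artifactory/libs-release' } // placeholder URL
}

configurations { bundle }

dependencies {
    // resolve only the artifacts belonging to the chosen package type
    contents[packageType].each { coord -> bundle coord }
}

tasks.register('packageTarGz', Tar) {
    archiveFileName = "mypackage-${packageType}.tar.gz"
    compression = Compression.GZIP
    destinationDirectory = layout.buildDirectory.dir('dist')
    from configurations.bundle
}

With packageType=LIGHT in gradle.properties (or -PpackageType=LIGHT on the command line), gradle packageTarGz would then produce the LIGHT archive.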

In IntelliJ IDEA Maven gets the external library according to .pom but trying to compile gives "package does not exist"

I'm simply trying to get one external library into use for a test project in IntelliJ IDEA. I have a pom file, and doing "Reimport" brings in the external library listed in the "dependencies" section of the pom file. But when I try to import that package, I just get a compiler error saying the package doesn't exist.
What gives? What step am I missing? If the project has a pom file that declares a specific external library, shouldn't that library be available to the project? Or does it matter what I have as "groupId" or "artifactId"?
Ah okay, I found it out. It wasn't as I described: the library had changed its package name, and that's why it couldn't be imported.

Kafka Connector - Packaging jars

QUESTION
I am not a Maven pro and I got stuck trying to package a Kafka Connector. There are two options for packaging it:
Either you produce a folder with a jar that contains the connector plus all the dependency jars, minus the Kafka-specific jars,
or you build a fat jar with all of the dependencies (again, I assume, without the Kafka-specific jars, though this is not explicit in the docs).
I am following the docs on the Confluent webpage, and the connector I am trying to package is this one on GitHub.
What I tried, after cloning the repo with git, is running mvn clean package. But this seems to create only a single jar of the original project, with the dependencies left in the local Maven cache (~/.m2/repository/).
Google also turns up instructions on how to create a fat jar, but I would need some way to specify which jars to exclude from it.
Thanks
UPDATE
Now I am running:
connect-standalone /etc/kafka/connect-standalone.properties /etc/kafka/connect-cdc-mssql-source.properties
Where /etc/kafka/connect-standalone.properties contains the following line:
plugin.path=/shared_win_files
And ls -al /shared_win_files contains the following:
kafka-connect-cdc-mssql-0.0.1-SNAPSHOT.jar
And jar tvf kafka-connect-cdc-mssql-0.0.1-SNAPSHOT.jar contains the following:
6996 Thu Sep 07 14:47:24 BST 2017 com/github/jcustenborder/kafka/connect/cdc/mssql/MsSqlSourceConnector.class
where MsSqlSourceConnector.class is basically this class here, which implements the Connector.
But when I try to run the connector with the command above, I get an error
Failed to find any class that implements Connector and which name matches com.github.jcustenborder.kafka.connect.cdc.mssql.MsSqlSourceConnector
It gives a massive list with all available plugins, but mine is not in there.
Currently, an easy way to package your connector with Maven is to use the maven-assembly-plugin. This basically entails two main steps:
Define one or more assembly descriptors and save them under src/assembly.
Doc: http://maven.apache.org/plugins/maven-assembly-plugin/descriptor-refs.html
Example: https://github.com/confluentinc/kafka-connect-elasticsearch/blob/master/src/assembly/package.xml
In the descriptor, among other things, you may choose the packaging format of your archive and the files and directories to include or exclude, as well as specific settings regarding your project's dependencies.
Include the plugin in your project's pom.xml
Example: https://github.com/confluentinc/kafka-connect-elasticsearch/blob/master/pom.xml
This mainly requires you to define the configuration and executions sections of the maven-assembly-plugin. Additionally, you can tie specific assembly descriptors to certain Maven profiles that you define.
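As a sketch, the relevant pom.xml section could look like the following; the version number and the descriptor path are assumptions to adapt to your project:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <version>3.1.0</version>
  <configuration>
    <descriptors>
      <!-- your custom descriptor, e.g. modelled on the Elasticsearch connector's package.xml -->
      <descriptor>src/assembly/package.xml</descriptor>
    </descriptors>
  </configuration>
  <executions>
    <execution>
      <id>make-assembly</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>

With this in place, mvn clean package also builds the archive described by the descriptor under target/.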
Finally, stay tuned because packaging your Kafka Connect plugins (connectors, transforms, converters) might be significantly simplified soon.
Following Konstantine's answer on how to package the jars.
The remaining problem was that you have to be careful when specifying plugin.path=/abc in the Kafka Connect config.
You can either put a fat jar like this:
/abc/fatjar.jar
Or you have to create another folder inside /abc and put all the related jars into that folder, like this:
/abc/my-connector-a/connector.jar
/abc/my-connector-a/connector-dependency.jar
...
In my case, it was treating the individual jars as separate plugins.

Importing Jyson Jar into nGrinder

I'm trying to import the Jyson module into nGrinder running on a remote machine. However, I see no clear guidance on how to accomplish this. Where does the jar file go? The Jyson zip I downloaded has lib and src folders as well. I read this link and understood what has to be added to the grinder.properties file, but where do the actual lib and src files go? If there is an existing link that explains this, please point me to it.
Thanks for all the help
Figured out how to do it. Created src and lib folders via the UI and referenced them via code. Thanks!
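For anyone else doing this: once the jar sits in the script's lib folder, calling it from a Jython test script is a one-liner. A sketch, assuming Jyson's documented JysonCodec API (the jar version below is illustrative):

# assumes lib/jyson-1.0.2.jar is on the script's classpath
from com.xhaus.jyson import JysonCodec as json

body = json.dumps({'user': 'test', 'id': 1})  # Jython dict -> JSON string
data = json.loads(body)                       # JSON string -> Jython objects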

Maven: Execute custom code during assembly

We are using the assembly plugin to build a zip package.
I would like to execute some custom Java code during the execution of the Maven Assembly Plugin. The Java app should have access to the structure of the assembly, but before the zip file is built, so that files which should go into the zip can still be modified/added/removed.
How would I configure that?
Cheers
Jonas
I do not think executing Java code is possible. Try to get by with exclusion patterns for file removal and Maven filters for file modification.
http://maven.apache.org/plugins/maven-assembly-plugin/advanced-descriptor-topics.html
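For example, a fileSet in the assembly descriptor can both drop files and rewrite ${...} placeholders while the archive is laid out. A sketch with hypothetical paths:

<fileSets>
  <fileSet>
    <directory>src/main/dist</directory>     <!-- hypothetical source directory -->
    <outputDirectory>/</outputDirectory>
    <excludes>
      <exclude>**/*.draft</exclude>          <!-- "removes" files by leaving them out -->
    </excludes>
    <filtered>true</filtered>                <!-- expands ${property} tokens on copy -->
  </fileSet>
</fileSets>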
