I'm having a few issues with a Spring Boot jar file that I have packaged in an RPM using the Maven RPM plugin. The issue is that it won't run if it has been compressed (which is what happens when the jar file gets packaged in an RPM).
I was wondering if there is any way to turn off/disable this RPM compression, in a similar way to what you can do with a zip file. I have already tried adding
%define _source_payload w0.gzdio
%define _binary_payload w0.gzdio
to my .spec file (through the rpm maven plugin) but the jar file is still being compressed.
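For reference, I am passing those defines through the plugin configuration roughly like this (assuming the MojoHaus rpm-maven-plugin and its defineStatements option; other configuration omitted):

<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>rpm-maven-plugin</artifactId>
  <configuration>
    <!-- each entry ends up as a %define in the generated spec file -->
    <defineStatements>
      <defineStatement>_source_payload w0.gzdio</defineStatement>
      <defineStatement>_binary_payload w0.gzdio</defineStatement>
    </defineStatements>
  </configuration>
</plugin>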
RPM building does some post-processing on JAR files. See /usr/lib/rpm/redhat/brp-java-repack-jars for specifics.
See also: Packaging:Java and this old mailing list post.
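If it is the repack step that breaks the Spring Boot jar, that script can usually be switched off with an rpm macro. A minimal sketch (which macro is honoured depends on the distribution's redhat-rpm-config; the defines can also be passed through the rpm maven plugin the same way as above):

# skip the brp-java-repack-jars post-processing step
%define __jar_repack %{nil}
# heavier hammer: disable all __os_install_post scripts
# %define __os_install_post %{nil}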
I have a Spring Boot 2.x project that uses Gradle 7.x.
I'm assembling a distribution of the artifact/service in a zip / tar file using the built-in Spring Boot task(s) provided. There is no meta-data associated with this asset, nor any need to add anything else to it.
I would like to copy (or publish) this zip / tar file into Artifactory (using Gradle), but so far everything I see around that subject includes (1) the file itself (usually a jar), (2) module meta-data and (3) the POM file.
Is there a way to accomplish what I'm looking for?
This doesn't exactly answer your question, but an easier approach would be to upload to Artifactory using the JFrog CLI:
$ curl -fL https://getcli.jfrog.io | sh
$ ./jfrog rt upload \
    --url="https://domain.tld" \
    --user="some_user" \
    --password="the_password" \
    file_to_upload.zip path/within/artifactory/to/place/the/file/
For more information see Uploading Files.
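Since the build is already Gradle based, the same CLI call can also be wired into the build with a plain Exec task. A sketch, assuming the jfrog binary is on the PATH, the archive comes from a distZip task, and the target path below is made up:

// build.gradle
tasks.register('uploadDistribution', Exec) {
    dependsOn 'distZip'
    doFirst {
        // resolve the archive produced by the distribution task at execution time
        def archive = tasks.named('distZip', Zip).get().archiveFile.get().asFile
        commandLine 'jfrog', 'rt', 'upload',
                '--url=https://domain.tld',
                '--user=some_user',
                '--password=the_password',
                archive.absolutePath,
                'path/within/artifactory/to/place/the/file/'
    }
}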
I have some .tar.gz archives and would like to decompress only the gzip layer so that I can create a SHA-256 checksum of the .tar file. (The reason for this is that the archive will be unpacked and repacked later on, as we are generating patch files.)
Now this seems like an easy task, but I'm stuck. There are only Gradle examples for:
getting the unpacked tarTree (with a Gradle Copy task and tarTree(resources.gzip('model.tar.gz')), from the documentation: Working with Files)
unzipping files (with zipTree), which does not work with gzipped files
Neither approach works, since I need to create a checksum of the .tar file itself. Unfortunately I can't use commandLine or gunzip, as the tasks have to run on both Windows and Linux.
The only solution I can think of right now is unpacking the tar.gz into a fileTree and repacking it into a tar file, but I'm not even sure the checksum would be the same.
Is there really no way to do this directly?
Finally I got it to work with help from a colleague.
Using resources.gzip(), which returns a ReadableResource, we can copy the resulting InputStream into a .tar file with IOUtils.copy:
file("test.tar").withOutputStream { outputFile ->
IOUtils.copy(resources.gzip(file("test.tgz")).read(), outputFile)
}
We also needed to add Apache Commons IO ("commons-io:commons-io:2.6") as a dependency.
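To then get the SHA-256 of the resulting .tar, a sketch using only the JDK's MessageDigest (file name as in the example above):

import java.security.MessageDigest

// stream the tar through a SHA-256 digest and print the hex value
def digest = MessageDigest.getInstance('SHA-256')
file('test.tar').eachByte(4096) { buffer, bytesRead ->
    digest.update(buffer, 0, bytesRead)
}
println "SHA-256: ${digest.digest().encodeHex()}"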
When creating an RPM package using a SPEC file, it is possible to list files in the %files section that will not be included in the RPM itself but will still be owned by it.
A typical use case is log files, which do not exist when the RPM is built but are created when the service/application runs.
Those files are normally marked by prefixing them with %ghost in the %files section. This way they are deleted when the RPM is uninstalled.
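For reference, in a hand-written SPEC file the declaration looks roughly like this (the path is just an example):

%files
%ghost /var/log/myservice/service.log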
How do I declare ghost files using the Maven plugin?
I've already read the documentation multiple times and have not found anything on SO or via my favorite internet search engine.
Is it even possible to do so using the plugin, or do I have to provide a custom SPEC file to the plugin somehow?
I have created a new Java project in eclipse-jee-kepler-SR2-win32-x86_64.
I have included the JARs in flink-0.8.1\lib.
I have created the standard WordCount and it works.
I have modified my WordCount to take input from text files and csv files and it works.
All the imports work perfectly.
Then I tried to import org.apache.flink.api.java.io.jdbc.JDBCInputFormat.
Eclipse doesn't find it.
Why does Eclipse not find the import?
Because inside the JAR flink-java-0.8.1.jar there is no io/jdbc directory.
I tried the same thing with flink-0.9.0-bin-hadoop27, and in the JAR flink-dist-0.9.0.jar there is no org/apache/flink/api/java/io/jdbc directory. I uncompressed the JAR and searched for the string "jdbcinputformat" with 0 results. I searched for the string "jdbc" and it is only mentioned in org/apache/log4j, org/eclipse/jetty, and in other places that are not org.apache.flink.api.java.io.
So my question is: Where do I find the class JDBCInputFormat?
What can I do to access SQL Server 2012 from Flink (apart from accessing it outside Flink, creating CSV files, and then reading them in Flink, which sounds horrible to me since there should be a class specifically for that)?
The corresponding module is not included in the binary distribution. In order to use it, you need to build Flink from source. Run the following commands:
git clone https://github.com/apache/flink.git
cd flink
mvn -DskipTests clean install
This builds the latest snapshot, flink-0.10-SNAPSHOT. If you want to use the stable version 0.9, run a different git clone command:
git clone -b release-0.9 https://github.com/apache/flink.git
In your current project, you need to change the Flink version used in your pom file accordingly, e.g., 0.10-SNAPSHOT or 0.9-SNAPSHOT.
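Assuming the JDBC input format lives in a module with the artifactId flink-jdbc (an assumption; check the module name in the source tree you built), the dependency would then look something like:

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-jdbc</artifactId>
    <!-- or 0.9-SNAPSHOT if you built the release-0.9 branch -->
    <version>0.10-SNAPSHOT</version>
</dependency>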
With the Maven assembly plugin I know I can set the permissions of the files contained within my tar, such as here. However, can I use the plugin to set the permissions of the tar itself?
Maybe I should just use the Ant plugin, but this is a little messy.
I haven't tested this, but you might be able to use "exec-maven-plugin" to do this.
See "How to change permission of jar packaged by maven? I am using maven assembly plugin", where the suggestion is: "Use maven:exec plugin to execute chmod".
So the idea is that you would add another plugin to the pom.xml file that sets the permission on the tar itself.
The only drawback that I see is that you have to have the name of the file in the plugin XML code in the pom file. That's fine, as I have that listed in the maven-assembly-plugin configuration. But the file extension is defined in assembly.xml (.zip or .tar.gz), so if you change the file extension in assembly.xml, you would have to remember to change it in the pom.xml file as well. Not a big hassle, but it might be easy to miss on a first review.
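Untested as well, but the sketch I have in mind looks roughly like this; the phase, the permission bits, and the hard-coded archive name are assumptions you would need to adjust to match what the assembly actually produces:

<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>chmod-distribution</id>
      <!-- run after the assembly has been built -->
      <phase>package</phase>
      <goals>
        <goal>exec</goal>
      </goals>
      <configuration>
        <executable>chmod</executable>
        <arguments>
          <argument>644</argument>
          <argument>${project.build.directory}/${project.build.finalName}.tar.gz</argument>
        </arguments>
      </configuration>
    </execution>
  </executions>
</plugin>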