Gradle version and Dockerfile

In our pipeline I ran into a surprising situation. With Gradle 6.8.x and higher, a COPY *.jar /opt/file.jar in an otherwise unchanged Dockerfile fails with:
Step 21/33 : COPY *.jar /opt/file.jar
When using COPY with more than one source file, the destination must be a directory and end with a /
With Gradle 6.5.x and earlier it works.
What does Gradle 6.8.x and higher do differently that conflicts with the Dockerfile, and how can I solve it?
Thx in advance

Not sure if this helps, but I had the same problem (updating from Gradle 6.5 to 7.3) while also updating from Spring Boot 2.4 to 2.6, which turned out to be the actual problem.
Since version 2.5.0, Spring Boot creates a plain jar by default next to your executable jar (see the documentation), so my build folder actually contained two *.jar files and the COPY error was correct.
How to disable creation of the plain jar is answered here.
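For reference, a minimal build.gradle sketch (Groovy DSL, assuming the Spring Boot Gradle plugin is applied) that turns the plain jar off looks like this:
jar {
    // skip the plain (non-executable) archive that Spring Boot 2.5+ produces by default
    enabled = false
}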
If this is not your case, please check what actually gets created in your build folder and whether there are multiple files.

Related

Wrong Entrypoint in image built with pack and paketobuildpacks/builder:base

I have a really simple java spring-boot gradle application.
When I build an image from source with:
pack build testapp:0.0.1 --builder paketobuildpacks/builder:base
and try to run it with docker I get the following error:
ERROR: failed to launch: determine start command: when there is no default process a command is required.
The generated Entrypoint in this image is "/cnb/lifecycle/launcher".
When I inspect the image with pack inspect-image there are no processes.
I tried this with different Java Spring Boot Gradle applications. When I use the "bootBuildImage" Gradle task, it does nearly the same thing but uses the pre-built .jar file, and the resulting image works. The generated Entrypoint in this image is "/cnb/process/web" and pack inspect-image shows three processes.
Any ideas?
I can't see your build output, but it sounds like you're hitting a known issue. If this is not your problem, please include the full output of running pack build.
Onto the issue. By default, Spring Boot Gradle projects will build both an executable and non-executable JAR. Because this produces two JAR files, it presently confuses the buildpacks.
There are a couple of solutions:
1. Tell Gradle not to build the non-executable JAR. The buildpack requires the executable JAR. You can do this by adding the following to your build.gradle file:
jar {
    enabled = false
}
This is the solution we have used in the Paketo buildpack samples.
2. If you don't want to make the change suggested in #1, then you can add the following argument to pack build: -e BP_GRADLE_BUILT_ARTIFACT=build/libs/<your-jar>.jar. For example: -e BP_GRADLE_BUILT_ARTIFACT=build/libs/demo-0.0.1-SNAPSHOT.jar. You can use glob-style pattern matching here, but you need to make sure that what you enter does not match *-plain.jar, which is the non-executable JAR that gets built by default.
This option simply tells the Gradle buildpack more specifically which JAR file to pass along to subsequent buildpacks.
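Putting that together, the full command might look roughly like this (image name and builder taken from the question, jar name from the example above):
pack build testapp:0.0.1 --builder paketobuildpacks/builder:base -e BP_GRADLE_BUILT_ARTIFACT=build/libs/demo-0.0.1-SNAPSHOT.jar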
We also have an open issue that should help to mitigate this problem. When the executable-jar buildpack gains support for multiple JARs, it'll be less likely that you'll need to set this. Essentially, this fix will add support so the executable-jar buildpack can inspect and detect an executable JAR, which would allow it to discard the -plain.jar file since it's not executable.

Build/Run Elasticsearch Locally with plugins

(On Elasticsearch version 6.5.1)
How can I build/run Elasticsearch from source with local plugins?
I've tried the following command to install the plugins:
./distribution/build/cluster/run\ node0/elasticsearch-6.5.1-SNAPSHOT/bin/elasticsearch-plugin install file:/<path_to_plugin_zip> and it reports that the plugin was installed successfully.
However, when I run elasticsearch via ./gradlew run --debug-jvm, it cleans out the contents of that directory before running ES.
The reason I installed the plugin into that particular directory is that I set a breakpoint in PluginsService.java and saw that the Path pluginsDirectory parameter in the constructor was set to /Users/jreback/Desktop/elasticsearch/distribution/build/cluster/run node0/elasticsearch-6.5.1-SNAPSHOT/plugins.
So, how can I get my plugin installed on my local ES version and run ES such that the plugin code doesn't get removed as the process starts up? Many thanks in advance!
FWIW, I got this working with some manual code changes (there is likely a more recommended way to do this, but this worked for me).
In my ES checkout, I made the following code change to server/src/main/java/org/elasticsearch/env/Environment.java:
replace this line: pluginsFile = homeFile.resolve("plugins"); with pluginsFile = Paths.get("<path to plugin directory>");
(Also, you must import java.nio.file.Paths at the top of that file).
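Put together, the edit is roughly this (the plugins path is just a placeholder; point it at wherever your built plugin lives):
// in server/src/main/java/org/elasticsearch/env/Environment.java
import java.nio.file.Paths;

// was: pluginsFile = homeFile.resolve("plugins");
pluginsFile = Paths.get("/path/to/your/plugins");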
The directory structure under the path you configured above should look like this:
- plugin parent directory (should be whatever path you put in the Environment.java file)
  - plugin directory (named after the plugin)
    - plugin-descriptor.properties file
    - plugin jar file (generated from building the plugin in some prior step)
Then, when you start up ES again, you should see in the logs that the plugin you just added was loaded.

Update gradle version if necessary on autodeployment

I updated gradle locally by changing the version in my build.gradle file here:
wrapper {
    gradleVersion = '5.6.1'
}
Next I was not able to build directly due to errors, but my IDE noticed that the Gradle version had changed and offered to install it through a popup. After that everything worked.
When I pushed my changes to my autodeployment tool it currently builds the project by executing:
call gradlew clean war
But I'm getting the same errors and this time there's no smart IDE to come to the rescue :D Therefore my question:
How can I make sure Gradle always updates to the version defined in build.gradle before trying to build?
The version of Gradle that is used by the wrapper script is the one defined in the file gradle/wrapper/gradle-wrapper.properties.
When you want to update Gradle, you could manually change that file, but that won't update the actual wrapper script and jar file. It is better practice to run ./gradlew wrapper, which updates gradle-wrapper.properties and, if needed, the other support files as well.
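For reference, the version is set by the distributionUrl line in that file; with the 5.6.1 from your build.gradle it looks roughly like this:
distributionUrl=https\://services.gradle.org/distributions/gradle-5.6.1-bin.zip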
To tell the wrapper task which version you want to upgrade to, you can either use a command line parameter or do what you are doing and keep the version in the build.gradle file (which is what I always do as well).
I usually run the wrapper task twice: first to update the version, and a second time so that the new version is downloaded and regenerates the wrapper scripts itself.
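Concretely, with the version kept in build.gradle as above, that amounts to roughly:
./gradlew wrapper   # picks up gradleVersion from build.gradle and updates gradle-wrapper.properties
./gradlew wrapper   # run again so the freshly downloaded version regenerates the wrapper scripts and jar
Alternatively, the command line parameter variant would be ./gradlew wrapper --gradle-version 5.6.1 (using the version from your question).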
Remember to commit all files changed by the wrapper task, which could be gradlew, gradlew.bat and the two files in the gradle/wrapper folder.

org.apache.flink.api.java.io.jdbc.JDBCInputFormat NOT INSIDE FLINK JARS

I have created a new Java project in eclipse-jee-kepler-SR2-win32-x86_64.
I have included the jars in flink-0.8.1\lib.
I have created the standard WordCount and it works.
I have modified my WordCount to take input from text files and csv files and it works.
All the imports work perfectly.
Then I tried to import org.apache.flink.api.java.io.jdbc.JDBCInputFormat, but Eclipse doesn't find it.
Why does Eclipse not find the import?
Because inside the jar flink-java-0.8.1.jar there is no directory io/jdbc.
I tried the same thing with flink-0.9.0-bin-hadoop27, and in the jar flink-dist-0.9.0.jar there is no org/apache/flink/api/java/io/jdbc directory either. I uncompressed the jar and searched for the string "jdbcinputformat" with 0 results. I searched for the string "jdbc" and it is only mentioned in org/apache/log4j, org/eclipse/jetty, and other places that are not org.apache.flink.api.java.io.
So my question is: Where do I find the class JDBCInputFormat?
What can I do to access SQL Server 2012 from Flink (apart from accessing it outside Flink, creating CSV files, and then reading them in Flink, which sounds horrible to me since there should be a class specifically for that)?
The corresponding module is not included. In order to use it, you need to build Flink from scratch. Run the following commands:
git clone https://github.com/apache/flink.git
cd flink
mvn -DskipTests clean install
This builds the latest snapshot, flink-0.10-SNAPSHOT. If you want to use the stable 0.9 version, use a different git clone command:
git clone -b release-0.9 https://github.com/apache/flink.git
In your current project, you need to change the Flink version used in your pom file accordingly, e.g., to 0.10-SNAPSHOT or 0.9-SNAPSHOT.
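For example, the flink-java dependency (the artifact the question looked into) would then be declared roughly like this in the pom:
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-java</artifactId>
    <version>0.9-SNAPSHOT</version>
</dependency>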

problems running state machine examples

Congratulations on Spring State Machine. I found it yesterday and have been trying it out, specifically the turnstile example running in STS. I found it a very easy and intuitive way to build an FSM.
Because Spring Shell doesn't work well in STS, I tracked down the instructions in the reference doc for running the examples from the command line:
java -jar spring-statemachine-samples-turnstile-1.0.0.BUILD-SNAPSHOT.jar
but running it gave an error:
no main manifest attribute, in spring-statemachine-samples-turnstile-1.0.0.BUILD-SNAPSHOT.jar
Although not even a novice with Gradle, I tried fixing this by adding this line to build.gradle in the jar section:
manifest.attributes['Main-Class'] = 'demo.turnstile.Application'
(which I know doesn't handle the various sub-projects), but got this error:
NoClassDefFoundError: org/springframework/shell/Bootstrap
If it is possible to run the samples from Gradle, could you include the instructions in the reference document? I tried running the samples using
gradle run
but there was no interaction with the shell.
The samples are designed to be run as executable jars with the shell, so that you can interact without needing to recompile after every change. Your error indicates that you didn't build the sample jar as mentioned in the docs:
./gradlew clean build -x test
This will automatically use the Spring Boot plugin, which adds the necessary manifest headers to the jar meta info to make it a true executable jar. Essentially, every sample is a Spring Boot app.
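For reference, the manifest of such a Boot executable jar ends up with entries roughly along these lines (the Start-Class shown is just the turnstile Application class from above; the exact value varies per sample):
Main-Class: org.springframework.boot.loader.JarLauncher
Start-Class: demo.turnstile.Application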
Building the SM sample projects in a Windows environment:
Open a command prompt (Windows key + R --> cmd --> Enter) and change directory to the project root folder spring-statemachine-master (inside the extracted folder).
Run gradlew install to get all Spring dependencies copied to the local machine.
Run gradlew clean build -x test to get the Spring Shell jars built. Courtesy of Janne.
These steps should get all the jars built; look in the build\libs folder of the respective sample project for the jar files.
Run it like any other Java jar file: java -jar [jar-file-name.jar] (make sure to change directory to the jar file's location first).
One more thing I was stuck on: how to send events to the SM.
It works like this: sm event EVENT_NAME_AS_DEFINED_IN_CLASS (Ref).
E.g.: sm event RINSE for the washer sample
