I am new to Gradle and am using version 5.4.1. When I try to build the application, I get the following error:
java.lang.IllegalStateException
Caused by: org.springframework.context.ApplicationContextException
Caused by: java.lang.OutOfMemoryError
I have added a gradle.properties file in my project root directory (where build.gradle is located) with the property below, but I am still facing the issue.
org.gradle.jvmargs=-Xmx512m
I am not sure where the gradle.properties file should be added. I have also referred to the https://gradle.org/docs/current/userguide/build_environment.html document, but I cannot tell why the gradle.properties file is not being picked up by the build.
I am trying to connect to a remote Kafka server from my application, and the error is:
Caused by: org.springframework.context.ApplicationContextException: Failed to start bean 'org.springframework.kafka.config.internalKafkaListenerEndpointRegistry'; nested exception is java.lang.OutOfMemoryError: Java heap space
According to the Gradle docs, the default maximum heap size for the JVM running the build is 512m. Can you try increasing it by setting something like:
org.gradle.jvmargs=-Xmx2g -XX:MaxMetaspaceSize=512m
Or you can set it globally via the GRADLE_OPTS environment variable.
From the docs:
The org.gradle.jvmargs Gradle property controls the VM running the build. It defaults to -Xmx512m "-XX:MaxMetaspaceSize=256m"
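If the file does not seem to be picked up, it is usually a location issue. A minimal sketch of the file (the values mirror the suggestion above and are only an example):
# gradle.properties — place it in the project root next to build.gradle,
# or in GRADLE_USER_HOME (~/.gradle/), which takes precedence if both exist
org.gradle.jvmargs=-Xmx2g -XX:MaxMetaspaceSize=512m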
Gradle 7.4.2 cannot use a local jar with implementation
The error is an NPE.
* Exception is:
org.gradle.api.ProjectConfigurationException: A problem occurred configuring project ':client:rest'.
at org.jetbrains.plugins.gradle.model.ProjectImportAction.execute(ProjectImportAction.java:116)
Caused by: java.lang.NullPointerException
I need to use a local jar in my project. With Gradle 5.4.2 it works, but with 7.4.2 it always fails.
I have tried many approaches to test it, but nothing works.
How should I handle the exception?
thanks~~~
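For reference, the usual way to declare a local jar in the Groovy DSL is a file dependency; the path below is a placeholder, not taken from the question:
// build.gradle — hypothetical path to the local jar
dependencies {
    // point at the jar file directly instead of a repository coordinate
    implementation files('libs/mylib.jar')
}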
I have configured Jenkins on a Kubernetes cluster as a pod with an NFS volume mount. When I try to build code using Maven and Gradle, I get this error:
Exception in thread "main" java.io.IOException: No locks available
at sun.nio.ch.FileDispatcherImpl.lock0(Native Method)
at sun.nio.ch.FileDispatcherImpl.lock(FileDispatcherImpl.java:94)
at sun.nio.ch.FileChannelImpl.tryLock(FileChannelImpl.java:1114)
at java.nio.channels.FileChannel.tryLock(FileChannel.java:1155)
at org.gradle.wrapper.ExclusiveFileAccessManager.access(ExclusiveFileAccessManager.java:55)
at org.gradle.wrapper.Install.createDist(Install.java:48)
at org.gradle.wrapper.WrapperExecutor.execute(WrapperExecutor.java:107)
at org.gradle.wrapper.GradleWrapperMain.main(GradleWrapperMain.java:61)
I fixed the issue. I had to update the volume claim YAML and change accessModes to ReadWriteOnce; previously it was ReadWriteMany.
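A minimal sketch of the relevant part of the PersistentVolumeClaim (names and sizes are placeholders; only the accessModes change comes from the answer above):
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: jenkins-home        # placeholder name
spec:
  accessModes:
    - ReadWriteOnce         # was ReadWriteMany
  resources:
    requests:
      storage: 10Gi         # placeholder size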
I am currently working on a Spring Boot 2.1 project configured with Gradle 5.2.1, but I get an out-of-memory error when building the project and cannot figure out the exact reason.
Please find the log below:
Caused by: org.gradle.cache.CacheOpenException: Could not open proj generic class cache for build file '/Users/mac/project/build.gradle' (/Users/mac/.gradle/caches/5.2.1/scripts/eajdx6l75dt1ypyljdsfupplm/proj/proj3ca90766b0adfce53d4b035e7e9dc5fe).
at org.gradle.cache.internal.DefaultPersistentDirectoryStore.open(DefaultPersistentDirectoryStore.java:80)
at org.gradle.cache.internal.DefaultPersistentDirectoryStore.open(DefaultPersistentDirectoryStore.java:42)
at org.gradle.cache.internal.DefaultCacheFactory.doOpen(DefaultCacheFactory.java:94)
at org.gradle.cache.internal.DefaultCacheFactory.open(DefaultCacheFactory.java:68)
at org.gradle.cache.internal.DefaultCacheRepository$PersistentCacheBuilder.open(DefaultCacheRepository.java:118)
at org.gradle.groovy.scripts.internal.FileCacheBackedScriptClassCompiler$RemapBuildScriptsAction.execute(FileCacheBackedScriptClassCompiler.java:421)
at org.gradle.groovy.scripts.internal.FileCacheBackedScriptClassCompiler$RemapBuildScriptsAction.execute(FileCacheBackedScriptClassCompiler.java:390)
at org.gradle.groovy.scripts.internal.FileCacheBackedScriptClassCompiler$ProgressReportingInitializer.execute(FileCacheBackedScriptClassCompiler.java:175)
at org.gradle.groovy.scripts.internal.FileCacheBackedScriptClassCompiler$ProgressReportingInitializer.execute(FileCacheBackedScriptClassCompiler.java:155)
at org.gradle.cache.internal.DefaultPersistentDirectoryCache$Initializer.initialize(DefaultPersistentDirectoryCache.java:98)
at org.gradle.cache.internal.FixedSharedModeCrossProcessCacheAccess$1.run(FixedSharedModeCrossProcessCacheAccess.java:85)
at org.gradle.cache.internal.DefaultFileLockManager$DefaultFileLock.doWriteAction(DefaultFileLockManager.java:207)
at org.gradle.cache.internal.DefaultFileLockManager$DefaultFileLock.writeFile(DefaultFileLockManager.java:197)
at org.gradle.cache.internal.FixedSharedModeCrossProcessCacheAccess.open(FixedSharedModeCrossProcessCacheAccess.java:83)
at org.gradle.cache.internal.DefaultCacheAccess.open(DefaultCacheAccess.java:142)
at org.gradle.cache.internal.DefaultPersistentDirectoryStore.open(DefaultPersistentDirectoryStore.java:78)
... 133 more
Caused by: java.lang.OutOfMemoryError: Metaspace
First kill the Java(TM) Platform SE binary process and then delete the whole /Users//.gradle/caches folder, which will allow Gradle commands to work once again.
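A shell equivalent of those steps might look like this (assuming the project uses the Gradle wrapper):
./gradlew --stop          # stop any running Gradle daemons (the Java process)
rm -rf ~/.gradle/caches   # delete the corrupted script/class caches
./gradlew build           # the caches are rebuilt on the next run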
This thread might help you.
I have a project module with 30K classes.
After migrating Sonar analysis from an Ant script to the Gradle plugin, I get an OOM error with output like this:
13:10:36 Out of memory
13:10:36 Total memory: 954M
13:10:36 free memory: 119M
13:10:52 Caused by: java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: GC overhead limit exceeded
13:10:52 at org.sonar.plugins.findbugs.FindbugsExecutor.execute(FindbugsExecutor.java:163)
13:10:52 ... 109 more
13:10:52 Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
13:10:52 at
We ran the Ant script with the following parameters: "-Xmx3800m -XX:ReservedCodeCacheSize=128m"
How can I set the same parameters for the SonarQube Gradle plugin?
I have tried setting the following environment variable before calling Gradle:
GRADLE_OPTS=-Xmx3800m -XX:ReservedCodeCacheSize=128m
It is applied correctly, but FindBugs still fails and prints "Total memory: 954M".
I have also tried adding the following properties to reduce memory consumption, but without any luck:
property 'sonar.skipPackageDesign', 'true'
property 'sonar.skipDesign', 'true'
Gradle version is 3.5
Sonarqube plugin version is 2.5
JDK version is 8u131
I realized what the problem was.
Setting GRADLE_OPTS was the correct solution; however, I was also using the Gradle daemon, so those options were ignored by the daemon.
I ended up disabling the daemon by adding -Dorg.gradle.daemon=false to GRADLE_OPTS, and it worked.
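Put together, the resulting environment might look like this (memory values taken from the question; the task name assumes the sonarqube plugin's default task):
export GRADLE_OPTS="-Xmx3800m -XX:ReservedCodeCacheSize=128m -Dorg.gradle.daemon=false"
gradle sonarqube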
Check the build.gradle file and set maxHeapSize as given below:
tasks.withType(FindBugs) {
    // raise the heap available to the FindBugs analysis tasks
    maxHeapSize = "1000m"
}
You can adjust the heap size as needed.
I have problems with one job on my Jenkins server.
It fails during POM parsing with this message:
Parsing POMs
Modules changed, recalculating dependency graph
[workspace] $ java -Xmx512m -XX:MaxPermSize=256m -Djava.awt.headless=true -cp /opt/edb/jenkins/plugins/maven-plugin/WEB-INF/lib/maven3-agent-1.2.jar:/opt/apache/maven3/boot/plexus-classworlds-2.4.jar org.jvnet.hudson.maven3.agent.Maven3Main /opt/apache/maven3 /var/lib/tomcat6/webapps/jenkins/WEB-INF/lib/remoting-2.17.jar /opt/edb/jenkins/plugins/maven-plugin/WEB-INF/lib/maven3-interceptor-1.2.jar 55430
<===[JENKINS REMOTING CAPACITY]===>channel started
ERROR: Failed to parse POMs
java.io.IOException: Remote call on Channel to Maven [java, -Xmx512m, -XX:MaxPermSize=256m, -Djava.awt.headless=true, -cp, /opt/edb/jenkins/plugins/maven-plugin/WEB-INF/lib/maven3-agent-1.2.jar:/opt/apache/maven3/boot/plexus-classworlds-2.4.jar, org.jvnet.hudson.maven3.agent.Maven3Main, /opt/apache/maven3, /var/lib/tomcat6/webapps/jenkins/WEB-INF/lib/remoting-2.17.jar, /opt/edb/jenkins/plugins/maven-plugin/WEB-INF/lib/maven3-interceptor-1.2.jar, 55430] failed
at hudson.remoting.Channel.call(Channel.java:673)
at hudson.maven.ProcessCache$MavenProcess.<init>(ProcessCache.java:112)
at hudson.maven.ProcessCache.get(ProcessCache.java:231)
at hudson.maven.MavenModuleSetBuild$MavenModuleSetBuildExecution.doRun(MavenModuleSetBuild.java:704)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:586)
at hudson.model.Run.execute(Run.java:1516)
at hudson.maven.MavenModuleSetBuild.run(MavenModuleSetBuild.java:477)
at hudson.model.ResourceController.execute(ResourceController.java:88)
at hudson.model.Executor.run(Executor.java:236)
Caused by: java.lang.Error: Failed to deserialize the Callable object.
Caused by: java.lang.IndexOutOfBoundsException: Index: 1997078527, Size: 0
I have tried creating a new build, with no luck.
Building locally works fine.
All other similar jobs work fine.
I'm running the latest Jenkins (1.489).
Any ideas?
The hard disk is full. Try restarting.
Maybe the workspace is corrupted and your pom.xml is not readable. Have you tried cleaning the workspace for this specific job?
This issue was fixed for me by setting the correct Java version on the PATH (as required by the pom.xml). See https://issues.jenkins-ci.org/browse/JENKINS-5519
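Illustrative only: the shape of that fix on a Linux agent, with placeholder paths (the point is that the JDK required by the pom.xml must come first on the PATH seen by the Jenkins/Maven process):
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk   # placeholder path
export PATH="$JAVA_HOME/bin:$PATH"
java -version                                  # verify the expected version is picked up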