I have a project module with 30K classes.
After migrating Sonar analysis from an Ant script to the Gradle plugin, I get an OOM error with output like this:
13:10:36 Out of memory
13:10:36 Total memory: 954M
13:10:36 free memory: 119M
13:10:52 Caused by: java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: GC overhead limit exceeded
13:10:52 at org.sonar.plugins.findbugs.FindbugsExecutor.execute(FindbugsExecutor.java:163)
13:10:52 ... 109 more
13:10:52 Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
13:10:52 at
We ran the Ant script with the following parameters: "-Xmx3800m -XX:ReservedCodeCacheSize=128m"
How can I set the same parameters for the SonarQube Gradle plugin?
I've tried setting the following environment variable before calling Gradle:
GRADLE_OPTS=-Xmx3800m -XX:ReservedCodeCacheSize=128m
It's applied correctly, but FindBugs still fails and prints "Total memory: 954M".
I've also tried adding the following properties to reduce memory consumption, but without any luck:
property 'sonar.skipPackageDesign', 'true'
property 'sonar.skipDesign', 'true'
Gradle version is 3.5
SonarQube plugin version is 2.5
JDK version is 8u131
I realized what the problem was.
Setting GRADLE_OPTS was the correct solution; however, I was also using the Gradle daemon, so those options were ignored by the daemon.
I ended up disabling the daemon by adding -Dorg.gradle.daemon=false to GRADLE_OPTS, and it worked.
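For reference, the combined setting looks like this (a sketch using the values quoted above, for a Unix shell):
export GRADLE_OPTS="-Xmx3800m -XX:ReservedCodeCacheSize=128m -Dorg.gradle.daemon=false"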
Check your build.gradle file and set maxHeapSize as given below; the FindBugs analysis runs in a forked worker process, so its heap is controlled by this task setting rather than by GRADLE_OPTS:
tasks.withType(FindBugs) {
maxHeapSize = "1000m"
}
You can change the heap size to suit your project.
I am new to Gradle and am using version 5.4.1. When I try to build the application, I face the issue below:
java.lang.IllegalStateException
Caused by: org.springframework.context.ApplicationContextException
Caused by: java.lang.OutOfMemoryError
I have added a gradle.properties file in my project root directory, where build.gradle is present, with the below property, but I am still facing the issue.
org.gradle.jvmargs=-Xmx512m
I am not sure where to add the gradle.properties file. I have also referred to the https://gradle.org/docs/current/userguide/build_environment.html document, but I am not sure why the gradle.properties file is not read by the application.
I am trying to connect to a remote Kafka server from my application, and the error is:
Caused by: org.springframework.context.ApplicationContextException: Failed to start bean 'org.springframework.kafka.config.internalKafkaListenerEndpointRegistry'; nested exception is java.lang.OutOfMemoryError: Java heap space
The default max heap size of the JVM is 512m according to the Gradle docs. Can you try increasing it, setting it like:
org.gradle.jvmargs=-Xmx2g -XX:MaxMetaspaceSize=512m
Or you can try setting it globally in the GRADLE_OPTS environment variable.
From the doc:
The org.gradle.jvmargs Gradle property controls the VM running the build. It defaults to -Xmx512m "-XX:MaxMetaspaceSize=256m"
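If you are unsure where the file belongs: Gradle reads gradle.properties from the project root directory or from the Gradle user home (typically ~/.gradle). A minimal sketch, using the values suggested above:
# gradle.properties, placed next to build.gradle in the project root
org.gradle.jvmargs=-Xmx2g -XX:MaxMetaspaceSize=512m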
I am working on creating a CI/CD pipeline for a Spring Boot application on GKE using Jenkins X. As soon as I push the code to the master branch, the build gets triggered, but it fails due to insufficient Java heap space.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-war-plugin:3.2.2:war (default-war) on project location-finder-api: Error assembling WAR: Problem creating war: Execution exception: Java heap space -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-war-plugin:3.2.2:war (default-war) on project location-finder-api: Error assembling WAR: Problem creating war: Execution exception
Caused by: java.lang.OutOfMemoryError: Java heap space
at org.codehaus.plexus.archiver.zip.ByteArrayOutputStream.needNewBuffer (ByteArrayOutputStream.java:153)
at org.codehaus.plexus.archiver.zip.ByteArrayOutputStream.write (ByteArrayOutputStream.java:192)
To resolve this, I tried to set the JVM argument in the Dockerfile as
CMD ["java", "-Xmx1024m","-jar", "app.jar"]
But it did not work. This is what I see when the build starts:
+ mvn -e clean deploy -Pprod
Picked up _JAVA_OPTIONS: -XX:+UnlockExperimentalVMOptions -XX:+UseCGroupMemoryLimitForHeap -Dsun.zip.disableMemoryMapping=true -XX:+UseParallelGC -XX:MinHeapFreeRatio=5 -XX:MaxHeapFreeRatio=10 -XX:GCTimeRatio=4 -XX:AdaptiveSizePolicyWeight=90 -Xms10m -Xmx192m
OpenJDK 64-Bit Server VM warning: If the number of processors is expected to increase from one, then you should configure the number of parallel GC threads appropriately using -XX:ParallelGCThreads=N
Is there any way that I can set this heap option on my own?
It looks like Maven is running out of memory, so you need more memory in your build pod (not in the Dockerfile for your app).
As a quick test you can edit the pod template inside the Jenkins UI: run jx console, then go to Manage Jenkins -> Configure System, find the jenkins-maven pod template in the UI, and edit the _JAVA_OPTIONS environment variable from this value: https://github.com/jenkins-x/jenkins-x-platform/blob/master/jenkins-x-platform/values.yaml#L907 - try changing -Xmx512m to something bigger, like -Xmx912m.
Once you've found a value that works for your projects, you can make the change permanent across restarts of Jenkins by adding something like this to your myvalues.yaml:
# myvalues.yaml
jenkins:
  Agent:
    PodTemplates:
      Maven:
        Name: maven
        Label: jenkins-maven
        EnvVars:
          _JAVA_OPTIONS: '-XX:+UnlockExperimentalVMOptions -Dsun.zip.disableMemoryMapping=true -XX:+UseParallelGC -XX:MinHeapFreeRatio=5 -XX:MaxHeapFreeRatio=10 -XX:GCTimeRatio=4 -XX:AdaptiveSizePolicyWeight=90 -Xms10m -Xmx912m'
See the docs on creating/configuring builders.
I am currently working on a Spring Boot 2.1 project configured with Gradle 5.2.1, but I get an out of memory error when building the project and cannot work out the exact reason.
Please find the log below:
Caused by: org.gradle.cache.CacheOpenException: Could not open proj generic class cache for build file '/Users/mac/project/build.gradle' (/Users/mac/.gradle/caches/5.2.1/scripts/eajdx6l75dt1ypyljdsfupplm/proj/proj3ca90766b0adfce53d4b035e7e9dc5fe).
at org.gradle.cache.internal.DefaultPersistentDirectoryStore.open(DefaultPersistentDirectoryStore.java:80)
at org.gradle.cache.internal.DefaultPersistentDirectoryStore.open(DefaultPersistentDirectoryStore.java:42)
at org.gradle.cache.internal.DefaultCacheFactory.doOpen(DefaultCacheFactory.java:94)
at org.gradle.cache.internal.DefaultCacheFactory.open(DefaultCacheFactory.java:68)
at org.gradle.cache.internal.DefaultCacheRepository$PersistentCacheBuilder.open(DefaultCacheRepository.java:118)
at org.gradle.groovy.scripts.internal.FileCacheBackedScriptClassCompiler$RemapBuildScriptsAction.execute(FileCacheBackedScriptClassCompiler.java:421)
at org.gradle.groovy.scripts.internal.FileCacheBackedScriptClassCompiler$RemapBuildScriptsAction.execute(FileCacheBackedScriptClassCompiler.java:390)
at org.gradle.groovy.scripts.internal.FileCacheBackedScriptClassCompiler$ProgressReportingInitializer.execute(FileCacheBackedScriptClassCompiler.java:175)
at org.gradle.groovy.scripts.internal.FileCacheBackedScriptClassCompiler$ProgressReportingInitializer.execute(FileCacheBackedScriptClassCompiler.java:155)
at org.gradle.cache.internal.DefaultPersistentDirectoryCache$Initializer.initialize(DefaultPersistentDirectoryCache.java:98)
at org.gradle.cache.internal.FixedSharedModeCrossProcessCacheAccess$1.run(FixedSharedModeCrossProcessCacheAccess.java:85)
at org.gradle.cache.internal.DefaultFileLockManager$DefaultFileLock.doWriteAction(DefaultFileLockManager.java:207)
at org.gradle.cache.internal.DefaultFileLockManager$DefaultFileLock.writeFile(DefaultFileLockManager.java:197)
at org.gradle.cache.internal.FixedSharedModeCrossProcessCacheAccess.open(FixedSharedModeCrossProcessCacheAccess.java:83)
at org.gradle.cache.internal.DefaultCacheAccess.open(DefaultCacheAccess.java:142)
at org.gradle.cache.internal.DefaultPersistentDirectoryStore.open(DefaultPersistentDirectoryStore.java:78)
... 133 more
Caused by: java.lang.OutOfMemoryError: Metaspace
First kill the Java(TM) Platform SE binary process, then delete the whole /Users//.gradle/caches folder; this will allow Gradle commands to work once again.
This thread might help you.
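Since the root cause in the log above is OutOfMemoryError: Metaspace, it may also help to raise the Gradle daemon's Metaspace limit in gradle.properties; a sketch (the exact sizes are assumptions to tune for your build):
# gradle.properties
org.gradle.jvmargs=-Xmx2g -XX:MaxMetaspaceSize=512m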
I am analysing a large project in Sonar and get the following error:
[sonar:sonar] 03:55:39.511 INFO p.PhasesTimeProfiler - Execute decorators...
BUILD FAILED
[...]
[...] java.lang.OutOfMemoryError: Java heap space
at org.sonar.batch.index.MeasurePersister.model(MeasurePersister.java:127)
at org.sonar.batch.index.MeasurePersister.getMeasuresToSave(MeasurePersister.java:117)
at org.sonar.batch.index.MeasurePersister.dump(MeasurePersister.java:70)
at org.sonar.batch.index.DefaultPersistenceManager.dump(DefaultPersistenceManager.java:63)
at org.sonar.batch.phases.Phases.execute(Phases.java:95)
at org.sonar.batch.bootstrap.ProjectModule.doStart(ProjectModule.java:139)
at org.sonar.batch.bootstrap.Module.start(Module.java:83)
at org.sonar.batch.bootstrap.BatchModule.analyze(BatchModule.java:131)
at org.sonar.batch.bootstrap.BatchModule.doStart(BatchModule.java:121)
at org.sonar.batch.bootstrap.Module.start(Module.java:83)
at org.sonar.batch.bootstrap.BootstrapModule.doStart(BootstrapModule.java:121)
at org.sonar.batch.bootstrap.Module.start(Module.java:83)
at org.sonar.batch.Batch.execute(Batch.java:104)
at org.sonar.ant.Launcher.execute(Launcher.java:78)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
at org.sonar.ant.SonarTask.delegateExecution(SonarTask.java:244)
at org.sonar.ant.SonarTask.execute(SonarTask.java:193)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:291)
at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
at org.apache.tools.ant.Task.perform(Task.java:348)
at org.apache.tools.ant.Target.execute(Target.java:392)
at org.apache.tools.ant.Target.performTasks(Target.java:413)
at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1399)
at org.apache.tools.ant.helper.SingleCheckExecutor.executeTargets(SingleCheckExecutor.java:38)
at org.apache.tools.ant.Project.executeTargets(Project.java:1251)
at org.apache.tools.ant.taskdefs.Ant.execute(Ant.java:442)
Should I increase the java heap space of the running Sonar server, or the Ant target executing the Sonar job?
As you can see from the stack trace, Ant starts the Sonar analysis, so you should increase the heap space for the JVM that Ant runs in.
This is a very similar question BTW: How to increase Sonar Java heap space
For command line Ant usage
Quote from the answer by Mark O'Connor on the other question:
The Sonar ANT task executes as part of ANT so you need to set the JVM heap using the standard ANT environment parameter. For example:
export ANT_OPTS=-Xmx256m
Remarks:
This is for Linux; on Windows, use the set command.
This is strictly for the heap space; for PermGen, use -XX:MaxPermSize=<desired amount>.
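For example, the Windows equivalent of the export above is:
set ANT_OPTS=-Xmx256m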
This is an even more similar question: Build Failed java.lang.OutOfMemoryError: Java heap space
For Eclipse IDE
Quote from the article http://soenkerohde.com/2008/06/change-eclipse-ant-settings-when-you-run-out-of-memory/
In Eclipse open menu: Run->External Tools->Open External Tools Dialog…
Select the build script you want to change on the left
Select the JRE tab on the right
Set the following as VM arguments: -Xms768m -Xmx1024m -XX:MaxPermSize=512m
For IntelliJ IDEA
This forum thread is useful: ANT build java heap space
Quote from the answers:
Please make sure that you increased heap in a correct place. You need to click 'Properties' button in IDEA's Ant tool window and edit 'Maximum heap size (Mb)' field there.
Also, from the IntelliJ IDEA page: Increasing Memory Heap
Quote from the article:
The memory heap of the build process is independent of IntelliJ IDEA memory heap, and is released after the build process is complete.
To increase a memory heap:
Open the Build File Properties dialog box.
In the Maximum heap size field, type the required amount of memory.
For Jenkins Continuous Integration and Ant build
This question is useful: How to use the Java Options in jenkins ant build tool to set ANT_OPTS
Quote from the answer:
Set the Java options as -Xmx512m -XX:MaxPermSize=256m only, without the ANT_OPTS= prefix.
For Maven builds, this article is of use: How to increase maven heapspace in hudson builds
Navigate to your Hudson job,
click Configure,
scroll down to the Build section, and
click the Advanced button.
Enter this into MAVEN_OPTS: -Xmx512m -XX:MaxPermSize=128m
Scenario
While using the Maven Ant Task artifact:deploy, I'm encountering the error java.lang.OutOfMemoryError: Java heap space.
I'm only getting the error if the size of the file being deployed is greater than 25 MB. My artifacts are not greater than 50 MB in size.
What could the reason be? And, what can I do to fix it?
Code snippet
<artifact:deploy file="#{app.name}.jar">
  <pom file="#{pom.file}"/>
  <remoteRepository url="http://xxx.com:xxx/xxx-webapp/content/repositories/xxx-releases/">
    <authentication username="xxx" password="xxx" />
  </remoteRepository>
</artifact:deploy>
Existing solutions
Most online results indicate that it is something to do with the JVM default heap size and that it can be fixed by setting the appropriate environment variables.
However, I want the Ant scripts to run on any computer and not to depend on environment variables.
Is there a way to configure these settings in the Ant scripts or the POM file?
EDIT
The install-provider task (http://maven.apache.org/ant-tasks/examples/install-deploy.html) seems to work for some people. I keep getting download errors when I use it.
Answer
It turns out that I'm not getting the Java heap error when I run my Maven Ant task on a different machine (which probably has more memory allocated to the JVM heap). Hence, I haven't attempted the solution mentioned by @Attila, though it seems to be going in the right direction.
Once Ant is running, you cannot change the heap size of the JVM running Ant. So your only option is to run the task that consumes a large amount of memory in a separate JVM, specifying enough heap space. Note that this relies on the task allowing you to fork a new JVM to execute it.
Update: I could not find a way to fork the Maven (deploy) task, but this page specifies how you can define a macro to run Maven using the java task (note that this relies on Maven being installed and properly configured on the machine) (see the "Using the Java Task" section).
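To illustrate the forking idea, a minimal sketch using Ant's java task (the jar name is a hypothetical placeholder; fork="true" runs a separate JVM and maxmemory sets its heap):
<java jar="deploy-tool.jar" fork="true" maxmemory="512m" failonerror="true"/>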
Please try to increase the VM memory, e.g.: -Xmx512m
If you are using Ant, you can add it to the ANT_OPTS environment variable: ANT_OPTS="-Xmx512m"