Triggering of GC on Metaspace memory in Java 8 - java-8

What condition triggers garbage collection of the Metaspace when the MaxMetaspaceSize property is not set on the JVM?
Say I close a few unused classloaders; there is then scope to free up memory in the Metaspace. My question is: does a full GC trigger the cleanup of Metaspace memory, or is it triggered in some other way?

Since by default the Metaspace in Java 8 is not limited, when does the JVM understand that it needs to clean up the unreferenced classes from its Metaspace?
Metaspace itself is not garbage-collected, but the Java heap is. When java.lang.Class objects get collected, the underlying metadata gets freed too. So in most circumstances regular GC cycles will also free up Metaspace if there are any classes eligible for unloading.
But I want to understand when the GC gets triggered to clean up the Metaspace memory of undeployed apps.
At the latest when the current capacity of the Metaspace is full; possibly earlier, when regular garbage collections unload classes.
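As a minimal sketch of the classloader case from the question (the jar path and class name below are made up for illustration): once every reference to a throwaway URLClassLoader, its classes and their instances has been dropped, those classes become eligible for unloading, and the Metaspace they occupy is returned the next time a collection that unloads classes runs.
import java.net.URL;
import java.net.URLClassLoader;

public class ClassUnloadSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical plugin jar and class name, used only for illustration.
        URLClassLoader loader = new URLClassLoader(
                new URL[] { new URL("file:/tmp/plugin.jar") }, null);
        Class<?> pluginClass = loader.loadClass("com.example.Plugin");

        // Drop every reference to the loader, its classes and their instances.
        pluginClass = null;
        loader.close();
        loader = null;

        // The metadata is now eligible for unloading; it is actually freed only
        // when a GC that unloads classes runs (e.g. a full GC, or when the
        // Metaspace high-water mark is reached).
        System.gc();
    }
}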

Yes, it's the full GC that cleans up the Metaspace. Specifically, you should see entries like Full GC (Metadata GC Threshold) after enabling verbose GC logging.

It's clearly documented; I'll just quote it here:
-XX:MetaspaceSize=size
Sets the size of the allocated class metadata space that will trigger a garbage collection the first time it is exceeded. This threshold for a garbage collection is increased or decreased depending on the amount of metadata used. The default size depends on the platform.
direct quote but the emphasis is mine ;)
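For example, a launch line along these lines should make the threshold-triggered collections visible in the GC log on Java 8 (the jar name and the 128m value are placeholders, not recommendations):
java -verbose:gc -XX:+PrintGCDetails -XX:MetaspaceSize=128m -jar myApp.jar
Whenever the current Metaspace capacity is exhausted, you should then see the Full GC (Metadata GC Threshold) entries mentioned above.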
Might be helpful to check the answer from @Holger.

Please use System.gc() or Runtime.getRuntime().gc().
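Both calls only request a collection; the JVM is free to ignore the hint. A minimal illustration:
public class GcHint {
    public static void main(String[] args) {
        // Equivalent requests for a full GC; if one actually runs, it can also
        // unload classes and thereby shrink the Metaspace.
        System.gc();
        Runtime.getRuntime().gc();
    }
}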

Related

Spring Native memory related issues

I want to know: if I use Spring Native, do I still need to configure which garbage collector to use, how much heap memory to set, and other JVM startup parameters, or do I not need to configure any of these at all? If so, is the memory occupied by the application just the memory it actually uses, with no space reserved up front because of a heap-size setting?

JVM Buffer pool only grows

I'm running ElasticSearch in production and using Prometheus to scrape the metrics. Looking at the graphs, I could see the jvm_buffer_pool metric just grow until it finally crashed.
As I understand it, the buffer pool is outside of the GC, but how do I clean it up?
The JVM has direct ByteBuffers, which are on-heap objects that proxy off-heap memory. The ByteBuffer itself is tens of bytes even if the off-heap memory behind it is 1 GB. When the GC cleans up this proxy object because it is no longer referenced, the off-heap memory is also released.
If the off-heap memory isn't being released, it's because:
its on-heap proxies are being retained, i.e. the memory is still needed, or
ElasticSearch is allocating off-heap memory directly and the library has a leak (unlikely).
I would try allowing more direct memory to see if this helps: -XX:MaxDirectMemorySize=64g, or whatever you can spare.
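A minimal sketch of the proxy relationship described above (the 1 GB size is just an example):
import java.nio.ByteBuffer;

public class DirectBufferSketch {
    public static void main(String[] args) {
        // A small on-heap proxy object backed by 1 GB of off-heap memory.
        ByteBuffer buffer = ByteBuffer.allocateDirect(1024 * 1024 * 1024);
        buffer.putLong(0, 42L);

        // Once the proxy becomes unreachable, a later GC of the heap object
        // also releases the off-heap allocation behind it.
        buffer = null;
        System.gc();
    }
}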

Garbage collection tuning for Java 8 Applications

1) Our application: Spring Boot, Java 8
2) Parameters we use: -Xms256m, -Xmx2g
We have been seeing that the used heap size of our Java 8 applications does not shrink back down when appropriate.
Any other parameters that we should be using along with #2 above, when launching our Spring Boot/Java 8 application, so that GC can do a better job?
Thanks for your help!
The following options have these effects (see the example command line below):
-Xms, -Xmx: Places boundaries on the heap size to increase the predictability of garbage collection. The heap size is limited in replica servers so that even Full GCs do not trigger SIP retransmissions. -Xms sets the starting size to prevent pauses caused by heap expansion.
-XX:+UseG1GC: Use the Garbage First (G1) Collector.
-XX:MaxGCPauseMillis: Sets a target for the maximum GC pause time. This is a soft goal, and the JVM will make its best effort to achieve it.
-XX:ParallelGCThreads: Sets the number of threads used during parallel phases of the garbage collectors. The default value varies with the platform on which the JVM is running.
-XX:ConcGCThreads: Number of threads concurrent garbage collectors will use. The default value varies with the platform on which the JVM is running.
-XX:InitiatingHeapOccupancyPercent: Percentage of the (entire) heap occupancy to start a concurrent GC cycle. GCs that trigger a concurrent GC cycle based on the occupancy of the entire heap and not just one of the generations, including G1, use this option. A value of 0 denotes 'do constant GC cycles'. The default value is 45.
Oracle JDK provides the built-in Java VisualVM tool to analyze and tune GC factors.
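Put together, these options end up on a launch line along the following lines (the jar name and the numeric values are placeholders to be tuned for your own workload and hardware):
java -Xms256m -Xmx2g -XX:+UseG1GC -XX:MaxGCPauseMillis=200 -XX:ParallelGCThreads=4 -XX:ConcGCThreads=2 -XX:InitiatingHeapOccupancyPercent=45 -jar my-spring-boot-app.jar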

Memory Management in Java - Metaspace/native memory

Does native memory (Metaspace) for a Java application get its space from heap memory, or is there a completely separate region of memory dedicated to it?
I think it uses memory that the OS uses to manage all its applications, but I am not clear on this.
Java Heap Space
Java heap space is used by the Java runtime to allocate memory to objects and JRE classes. Whenever we create an object, it is always created in the heap space. Garbage collection runs on the heap memory to free the memory used by objects that don't have any references. Any object created in the heap space has global access and can be referenced from anywhere in the application.
Java Stack Memory
Java stack memory is used for the execution of a thread. It contains method-specific values that are short-lived, and references to objects in the heap that are referred to from the method. Stack memory is always referenced in LIFO (last-in-first-out) order. Whenever a method is invoked, a new block is created in the stack memory for the method to hold local primitive values and references to other objects in the method. As soon as the method ends, the block becomes unused and becomes available for the next method. Stack memory size is very small compared to heap memory.
Difference between Java Heap Space and Stack Memory
Heap memory is used by all the parts of the application whereas stack memory is used only by one thread of execution.
Whenever an object is created, it’s always stored in the Heap space and stack memory contains the reference to it. Stack memory only contains local primitive variables and reference variables to objects in heap space.
Objects stored in the heap are globally accessible whereas stack memory can’t be accessed by other threads.
Memory management in stack is done in LIFO manner whereas it’s more complex in Heap memory because it’s used globally. Heap memory is divided into Young-Generation, Old-Generation etc, more details at Java Garbage Collection.
Stack memory is short-lived whereas heap memory lives from the start till the end of application execution.
We can use the -Xms and -Xmx JVM options to define the startup size and maximum size of heap memory, and -Xss to define the stack memory size.
When stack memory is full, the Java runtime throws java.lang.StackOverflowError, whereas when heap memory is full it throws java.lang.OutOfMemoryError: Java heap space.
Stack memory size is very small compared to heap memory. Because of the simplicity of its allocation (LIFO), stack memory is also very fast compared to heap memory.
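A minimal sketch that provokes both errors mentioned above (run it with a small heap, e.g. -Xmx32m, to hit the second one quickly):
import java.util.ArrayList;
import java.util.List;

public class StackVsHeapSketch {
    // Unbounded recursion: each call adds a frame to the thread's stack
    // until java.lang.StackOverflowError is thrown.
    static void recurse() {
        recurse();
    }

    public static void main(String[] args) {
        try {
            recurse();
        } catch (StackOverflowError e) {
            System.out.println("stack exhausted");
        }

        // Unbounded allocation: the arrays live on the heap and stay referenced,
        // so eventually java.lang.OutOfMemoryError: Java heap space is thrown.
        List<byte[]> sink = new ArrayList<>();
        try {
            while (true) {
                sink.add(new byte[1024 * 1024]); // 1 MB per iteration
            }
        } catch (OutOfMemoryError e) {
            sink.clear(); // free the hoarded arrays so the next line has room
            System.out.println("heap exhausted");
        }
    }
}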
Reference:
Java Heap Space vs Stack – Memory Allocation in Java
Java (JVM) Memory Model – Memory Management in Java
This is the basics of Java memory management, but going through the reference material should give you a comprehensive idea.
Edit
Thanks to @rajvineet for pointing out this great article on how the JVM uses native memory on Windows and Linux. Especially the section on how the Java runtime uses native memory, which describes everything clearly.

Why is the JVM using more memory than I am allocating

I am starting the Java process with the following command:
java -Xmx32m -jar winstone-lite.jar --warfile=myWarFile.war
Instead of using the amount of memory I specified, it is still allocating 144m.
EDIT:
When I say allocate, I mean when I look at the "top" process I am seeing 144m as the amount of memory being used.
I am using http://www.oracle.com/technetwork/java/embedded/documentation/index.html current version.
I would figure that if my application required more memory than I am allocating, the JVM would crash.
-Xmx just tells the JVM how much memory it may use for its internal heap.
The JVM needs memory for other purposes (permanent generation, temporary space etc.), plus like every binary it needs space for its own binary code, plus any libraries/DLLs/.so it loads.
The 144 MiB you quote probably contains at least some of these other memory uses.
How did you measure the memory usage? On modern OS using virtual memory, measuring memory usage of a process is not quite trivial, and cannot be expressed as a single value.
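To see where the gap comes from on the Java side, one option is the standard java.lang.management API; a small sketch:
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

public class MemoryReport {
    public static void main(String[] args) {
        MemoryMXBean bean = ManagementFactory.getMemoryMXBean();
        MemoryUsage heap = bean.getHeapMemoryUsage();        // bounded by -Xmx
        MemoryUsage nonHeap = bean.getNonHeapMemoryUsage();  // PermGen/Metaspace, code cache, ...

        System.out.println("heap used/committed/max (MiB): "
                + (heap.getUsed() >> 20) + "/"
                + (heap.getCommitted() >> 20) + "/"
                + (heap.getMax() >> 20));
        System.out.println("non-heap used/committed (MiB): "
                + (nonHeap.getUsed() >> 20) + "/"
                + (nonHeap.getCommitted() >> 20));
        // Thread stacks, GC bookkeeping, direct buffers and the JVM's own code
        // are not counted here, which is why the RSS reported by top is higher still.
    }
}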
