Tuesday, July 31, 2007

Using JConsole To Connect To Remote JVM

We have been having issues with a certain Java web application we inherited. Actually, we've had a few problems with it, but that's another story.

One issue was that, even though nothing is running on this Linux box except a single instance of Tomcat (5.5) with this single web application deployed to it, we frequently get paged for "high CPU utilization". Log in to the machine, and the Java process is taking 100% CPU. It might stay like that for a few hours, or even a few days. Occasionally it clears up on its own, but usually the server stops responding and we have to restart Tomcat.

We have had another issue where we would get out-of-memory errors (Java heap space), and the JVM would stop running. The two issues weren't necessarily related, but we couldn't rule out the possibility either.

We don't have much visibility into the application and what's going on, since it is a vendor-built "black box". We have some customized source code, but most of it is off limits to us. They rolled their own database connection pool, MVC framework, persistence framework, etc.

In comes JConsole. JConsole is an awesome tool that comes bundled with JDK 1.5 and above. It connects to the JVM and gives you all the info you could want on the various JVM memory pools, garbage collection, threads, and classloading, and it lets you manage anything exposed via JMX. It also has lots of pretty graphs, such as memory usage and garbage collections over time, for any or all of the memory pools (heap, non-heap, or individual pools like permgen and eden space). Same goes for threads and loaded classes -- current number, peak, total created.

The best thing about JConsole is the ability to connect to remote JVMs, so you don't add much overhead on the box being monitored. I have JConsole hooked up to my test and production servers, and it helped me prove that the two issues above were connected. There is some condition in the application (yet to be found -- the first step was proving what's really happening) that causes a substantial memory leak, and once memory usage hits its ceiling, the garbage collection thread basically runs constantly: it reclaims some trivial amount of memory, the application fills it back up, and GC runs again.

With an average of 100 active sessions at a time plus full garbage collection running non-stop, the CPU gets consumed quickly. I monitored the app for about a week and a half, and memory and CPU looked great -- there were a bit over 200 full GCs in that time period. Then this past weekend, we had to restart because of a DNS issue. That was Friday evening; by Monday morning over 2000 full GCs had been performed, and I was restarting a non-responsive server by lunch time.
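As an aside not in the original post, you can watch the same full-GC counter from a shell with jstat, which ships with the JDK; the FGC column is the cumulative full-GC count:

```shell
# Print GC utilization for the Tomcat JVM every 5 seconds.
# <pid> is a placeholder for the Java process id (find it with jps).
jstat -gcutil <pid> 5000
```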

Below are two screenshots. The first shows several days of "normal" memory usage: notice the gradual rise and then the sharp decrease at each full GC, all the while staying well below the JVM's allotted memory ceiling. The second shows this past weekend's issue, where memory hovers at the ceiling and a full GC doesn't accomplish much. Also notice that the old generation memory pool is quite full.


To set up JConsole to run on a local JVM, you only need to pass one extra argument to the JVM:
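On JDK 5 that argument is the com.sun.management.jmxremote system property; a minimal sketch, assuming it is passed through Tomcat's JAVA_OPTS:

```shell
# Enable the JMX agent so JConsole can attach to this JVM locally (JDK 5).
JAVA_OPTS="-Dcom.sun.management.jmxremote"
```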
To set up remote (with no security), it is a matter of adding a few more parameters to the remote JVM at startup:
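On JDK 5, an unsecured remote setup (no authentication, no SSL -- sensible only on a trusted network) looks roughly like this, using the port 8004 referenced below; treat it as a sketch rather than the post's exact line:

```shell
# Open remote JMX port 8004 with no authentication and no SSL (JDK 5).
JAVA_OPTS="-Dcom.sun.management.jmxremote \
 -Dcom.sun.management.jmxremote.port=8004 \
 -Dcom.sun.management.jmxremote.authenticate=false \
 -Dcom.sun.management.jmxremote.ssl=false"
```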
Then, when you start up JConsole, go to the Remote tab and enter the server name and the port you specified (in this case, 8004).
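Not in the original post, but worth noting: JConsole also accepts the host and port directly on the command line, skipping the connection dialog (the hostname here is a placeholder):

```shell
# Attach straight to the remote JMX agent.
jconsole myserver.example.com:8004
```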

Simple as that. It starts collecting stats immediately and the graphs appear. As you explore the JMX tree, you will notice you can click on some of the stats, and the simple integer display "opens up" into a full graph. I'm doing this to watch the active-session patterns through the day and week.


Stoner said...

If you can figure out how to connect JConsole to Tomcat 5.0 running on JDK 1.4, I'd be most appreciative to know the details.

Robb said...

For 1.4 you might be sh-- out of luck. It was JDK 5.0 (I'm 99% sure) when Sun started to expose much of the JVM via JMX. Before that, you had to explicitly write MBeans for anything you wanted to monitor or manage. Do you have the option of running it on JDK 5?

You might be able to bump up GC logging in 1.4. My JAVA_OPTS line looks like the one below. After the memory options, options 3, 4, and 5 spit info out to the logs about the sizes of the memory pools and when garbage collection runs. The last 4 options allow me to connect via JConsole (1.5).

JAVA_OPTS=" -Xms512m -Xmx1024m -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:-TraceClassUnloading"
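The 4 JConsole options Robb mentions don't appear in the line as shown; reconstructed from the rest of the post, they would presumably be the standard JDK 5 com.sun.management properties (the port number here is a guess):

```shell
# Presumed tail of the JAVA_OPTS line above: remote JMX access for JConsole.
JAVA_OPTS="$JAVA_OPTS -Dcom.sun.management.jmxremote \
 -Dcom.sun.management.jmxremote.port=8004 \
 -Dcom.sun.management.jmxremote.authenticate=false \
 -Dcom.sun.management.jmxremote.ssl=false"
```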

Mollusk said...

We have the same problem! What did you do to correct it?

yu said...

I added the following

in server properties for a remote OAS.

After making the changes my OC4J instance is down, and I am unable to start it. MBean not found at

Robb said...

I've never used OC4J, but if it's a decent JEE app server it should have its own JMX implementation. Read up on the OC4J docs. There's also a chance OC4J is trying to use the port you specified.

No stack trace and no working knowledge of OC4J - that's the best I can do for you.