Tomcat JVM Inaccessible - Logs Clean

I have been running into a situation every 1-2 days on my production site where my Tomcat 5.5.7 JVM becomes unresponsive. My Tomcat logs are clean. I have included a dump of my memory usage below (all values in bytes). I notice that Committed and Max are the same for the Eden Space, Survivor Space, and Old Gen, yet Used is considerably lower. Can someone please explain and offer suggestions?

MEMORY TYPE: Code Cache
  Used:        22937280
  Committed:   23035904
  Max:         50331648

MEMORY TYPE: PS Eden Space
  Used:        28793728
  Committed:   51183616
  Max:         51183616

MEMORY TYPE: PS Survivor Space
  Used:        1935936
  Committed:   4325376
  Max:         4325376

MEMORY TYPE: PS Old Gen
  Used:        179478224
  Committed:   477233152
  Max:         477233152

MEMORY TYPE: PS Perm Gen
  Used:        66484456
  Committed:   71172096
  Max:         268435456

Used Memory:  210207888
Free Memory:  322534256
Total Memory: 532742144
1 Solution
 
objects commented:
Sounds like you are running out of request threads.
Do a thread dump when it is unresponsive to check what's going on.
It may be that you just need to increase the thread pool size.

(On the memory numbers: Committed equalling Max for the heap spaces just means the heap has already been grown to its -Xmx ceiling, which is normal when -Xms and -Xmx are set the same. Used sitting well below Committed is healthy, so nothing in that dump points to memory exhaustion.)
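For reference, in Tomcat 5.5 the request thread pool is sized by the maxThreads attribute on the <Connector> element in conf/server.xml, e.g. <Connector port="8080" maxThreads="300" ... /> (the port and value here are illustrative; edit the connector your site actually uses).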

 
scott_m_ruby (Author) commented:
Can you give me the syntax for doing a thread dump?
 
objects commented:
Send the Tomcat process a QUIT signal.

What OS?

 
scott_m_ruby (Author) commented:
Linux
 
objects commented:
kill -QUIT <tomcat pid>    (equivalently: kill -3 <tomcat pid>)

The thread dump is written to stdout, which usually ends up going to catalina.out.
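If you are on JDK 5, jstack <pid> will print the same dump straight to your terminal. And if you'd rather trigger a dump from inside the webapp, Java 5's Thread.getAllStackTraces() exposes the same information; a minimal sketch (the class and method names are just for illustration):

import java.util.Map;

// Minimal sketch: print a stack trace for every live thread,
// similar to the output of kill -QUIT. Requires Java 5+.
public class ThreadDumper {
    public static void dump() {
        Map<Thread, StackTraceElement[]> all = Thread.getAllStackTraces();
        for (Map.Entry<Thread, StackTraceElement[]> entry : all.entrySet()) {
            Thread t = entry.getKey();
            System.out.println("\"" + t.getName() + "\" state=" + t.getState());
            for (StackTraceElement frame : entry.getValue()) {
                System.out.println("    at " + frame);
            }
            System.out.println();
        }
    }
}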

 
scott_m_ruby (Author) commented:
Tomcat became unresponsive again this afternoon and I executed the thread dump as you suggested. It looks like I am using all of my maxThreads of 150. I plan to increase that, but I was hoping you could look at my dump file to help diagnose what the "problem" may be. I have attached the portion of my catalina.out containing the thread dump. Also, as a note: when I executed kill -quit <tomcat pid>, my process did not get killed. Should it have, normally?

Thanks,
Scott
catalina.log
 
objects commented:
There appears to be a lot of waiting on your connection pool.
Check the size of your pool and the number of database connections available,
and make sure you close connections when you no longer need them.
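The usual leak is a connection that never makes it back to the pool when an exception is thrown partway through. On a JDK 5 era stack (no try-with-resources yet) the safe pattern is to close in a finally block; a minimal sketch, assuming the DataSource comes from your pool (names and query are illustrative):

import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;
import javax.sql.DataSource;

// Sketch: the connection is always returned to the pool, even if the
// query throws. Closing a pooled connection does not tear down the
// physical connection; it just hands it back to the pool.
public class QueryExample {
    public static void runQuery(DataSource ds) throws SQLException {
        Connection conn = ds.getConnection();
        try {
            Statement stmt = conn.createStatement();
            try {
                stmt.execute("SELECT 1"); // illustrative query
            } finally {
                stmt.close();
            }
        } finally {
            conn.close(); // returns the connection to the pool
        }
    }
}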

> Should it have, normally?

No, it doesn't kill the process; the JVM catches the QUIT signal and prints a thread dump instead of exiting.

 
scott_m_ruby (Author) commented:
I did double my maxThreads from 150 to 300 and increased the max connections for my database connection pool from 25 to 50. The problem did not occur on Tuesday or Wednesday. In my monitoring via the Tomcat Manager web app I saw the thread count at 25 all day both days. I did notice today that 37 database connections were gobbled up in about 10 seconds, and I don't even have that many active users. Any ideas on how to get a handle on this? Is there any Java profiling tool or Tomcat monitoring tool that can help?
 
objects commented:
Sounds better.

> I did notice today that I had 37 database connections gobbled up in about 10 seconds and I don't even have that many active users.

If you are checking at the database, that could be OK: the pool will keep connections open for a period in case they can be reused.
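One more thing worth trying if connections keep disappearing: if your pool is commons-dbcp (which Tomcat 5.5 bundles), its abandoned-connection tracking will log the stack trace of whatever code borrowed a connection and never returned it. A sketch of the relevant settings, shown programmatically with illustrative values (in Tomcat these are normally set as attributes on the pool's <Resource> element instead):

import org.apache.commons.dbcp.BasicDataSource;

// Sketch: DBCP 1.x abandoned-connection tracking. A connection held
// longer than removeAbandonedTimeout seconds is reclaimed, and with
// logAbandoned the pool prints the stack trace of the code that
// borrowed it -- which points straight at the leak.
public class PoolConfig {
    public static BasicDataSource createPool() {
        BasicDataSource ds = new BasicDataSource();
        ds.setDriverClassName("com.mysql.jdbc.Driver"); // illustrative
        ds.setUrl("jdbc:mysql://localhost/mydb");       // illustrative
        ds.setMaxActive(50);
        ds.setRemoveAbandoned(true);
        ds.setRemoveAbandonedTimeout(60); // seconds
        ds.setLogAbandoned(true);
        return ds;
    }
}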
 
scott_m_ruby (Author) commented:
I wanted to check back with you. I believe I need a Tomcat monitoring tool that will let me see what is going on with my database connection pool, as I continue to have periods throughout the day where the app tries to grab a large number of connections, and that seems to be causing our problems.
 
objects commented:
You can check the database to see how many connections are currently open.
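If you can get a reference to the pool itself (again assuming commons-dbcp), it also exposes live counters you can log on a schedule, which makes spikes like your 37-connections-in-10-seconds visible with timestamps; a minimal sketch:

import java.util.Date;
import org.apache.commons.dbcp.BasicDataSource;

// Sketch: log pool usage periodically so connection spikes show up
// with timestamps. How you schedule this (a background thread, a
// servlet, etc.) is up to you.
public class PoolMonitor {
    public static void logStats(BasicDataSource ds) {
        System.out.println(new Date()
                + " pool: active=" + ds.getNumActive()
                + " idle=" + ds.getNumIdle()
                + " maxActive=" + ds.getMaxActive());
    }
}

And on the database side, if you happen to be on MySQL, SHOW PROCESSLIST will list every open connection and what it is doing (other databases have equivalents).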

