Solved

Large JVM memory size.

Posted on 2004-10-14
Medium Priority
1,074 Views
Last Modified: 2008-02-01
The application that we are working on requires very fast response times, and we are thinking of implementing a caching mechanism using a singleton.

The scary thing is that the cache size may be quite large -- approximately 500 MB to 1 GB.

My question has two parts:

1) How would the JVM handle such a large amount of memory dedicated to it? Are there any limitations on memory size?
2) In terms of speed, should we expect any performance degradation when the JVM deals with such large objects?

Thank you for your help.
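A minimal sketch of the singleton cache idea described in the question (class and method names are hypothetical, and the generics assume a modern JDK; on a 1.4-era JVM you would drop them):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical singleton cache; UserCache is an illustrative name.
public class UserCache {
    private static final UserCache INSTANCE = new UserCache();

    // userID -> cached user data (a String here for brevity; a UserVO in practice)
    private final Map<Integer, String> cache = new HashMap<>();

    private UserCache() {}                        // no outside instantiation

    public static UserCache getInstance() { return INSTANCE; }

    public void put(int userId, String data) { cache.put(userId, data); }
    public String get(int userId)            { return cache.get(userId); }
    public int size()                        { return cache.size(); }
}
```

Every caller that goes through getInstance() sees the same map, which is what makes the cache application-wide.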
Question by:sasha_kv
7 Comments
 
LVL 8

Expert Comment

by:kiranhk
ID: 12312347

1) How would the JVM handle such a large amount of memory dedicated to it? Are there any limitations on memory size?
You can increase the memory available to the JVM using the heap-size flags, for example:
java -Xms256m -Xmx512m
(-Xms sets the initial heap size, -Xmx the maximum.)


2) In terms of speed, should we expect any performance degradation when the JVM deals with such large objects?
If you really have that much data to cache, performance depends mostly on how you actually use the cache. You might load all the data into memory, but how quickly a client can navigate to the required data depends on how you program the retrieval.
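A quick way to check what the JVM actually got from those flags (a sketch; the reported maximum is typically a little below the -Xmx value because of internal overhead):

```java
public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long maxMb   = rt.maxMemory()   / (1024 * 1024); // upper bound the heap may grow to (-Xmx)
        long totalMb = rt.totalMemory() / (1024 * 1024); // currently reserved from the OS
        long freeMb  = rt.freeMemory()  / (1024 * 1024); // unused within the reserved part
        System.out.println("max=" + maxMb + "MB total=" + totalMb + "MB free=" + freeMb + "MB");
    }
}
```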
 

Author Comment

by:sasha_kv
ID: 12312618
1) I am familiar with how to increase the memory size for the JVM. My question here is more conceptual, about the limitations on memory size.

I've found this quote on the internet and I'm not sure about the validity of this statement:

-- On 32-bit processor machines, the largest contiguous memory address space the operating system can allocate to a process is 1.8 GB. Because of this, the maximum heap size can only be set up to 1.8 GB. On 64-bit processor machines, the 1.8 GB limit does not apply, as 64-bit processor machines have a larger memory address space. --

2) The cache mechanism will be implemented using a HashMap, where the userID (an int) is the key and a UserVO containing all the necessary data about a user is the value. If a deserialized UserVO takes up, let's say, 10 KB, then caching 50,000 users would require about 500 MB of memory (if my math is correct :)
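The arithmetic above checks out, give or take binary-versus-decimal units; a quick sketch (the 10 KB per-user figure is the assumption from the comment):

```java
public class CacheSizeEstimate {
    // Total cache footprint in MB for a given user count and per-user size in KB.
    static long estimateMb(int users, int kbPerUser) {
        return (long) users * kbPerUser * 1024 / (1024 * 1024);
    }

    public static void main(String[] args) {
        // 50,000 users at ~10 KB each: about 488 MB, i.e. roughly the 500 MB figure above.
        System.out.println(estimateMb(50_000, 10) + " MB");
    }
}
```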

 
LVL 14

Assisted Solution

by:Tommy Braas
Tommy Braas earned 600 total points
ID: 12312746
Hi sasha_kv,

As far as limits on heap memory allocation go, it really depends (apart from the obvious 'physical' address-space constraints) largely on the JVM implementation. I don't see a problem with 500 MB of cache; we have caches here which exceed that amount. As far as performance goes, you might see a decrease if you have a lot of objects with the same hash code, which is to be expected with a container type such as a hash table. How are you guarding access to the cache?

Cheers!

\tt
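On the question of guarding access: one common sketch is a thread-safe map such as ConcurrentHashMap (added in J2SE 5.0; on the 1.4 JVMs of the time, Collections.synchronizedMap would be the rough equivalent). Class and method names here are illustrative:

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

public class GuardedCache {
    // Thread-safe map; avoids taking a global lock on every read.
    private final ConcurrentMap<Integer, String> cache = new ConcurrentHashMap<>();

    public String get(int userId)             { return cache.get(userId); }
    public void put(int userId, String value) { cache.put(userId, value); }

    // putIfAbsent keeps two threads from overwriting each other's freshly loaded entry.
    public String cacheIfAbsent(int userId, String value) {
        String prev = cache.putIfAbsent(userId, value);
        return prev != null ? prev : value;
    }
}
```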
LVL 24

Accepted Solution

by:sciuriware
sciuriware earned 600 total points
ID: 12316428
I ran a banking application up to a limit of 1540 MB; using the Sun JRE 1.4.2_03 there seemed to be a limit there.
I used very large quantities of small objects.
To my surprise, CPU usage was very low, so the garbage collector had no problems.
The application ran repeatedly for 90 hours.
So there isn't really a problem with the major JREs.
Only two pieces of advice:
1) monitor your memory usage (I put it in the about box so everybody could inspect it),
2) avoid calling .gc(); it does more harm than good.
;JOOP!
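A sketch of the kind of memory readout mentioned in advice 1), suitable for an about box (the class name and format string are illustrative):

```java
public class MemoryStatus {
    // One-line heap summary: bytes in use versus the configured maximum.
    public static String summary() {
        Runtime rt = Runtime.getRuntime();
        long usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
        long maxMb  = rt.maxMemory() / (1024 * 1024);
        return "Heap: " + usedMb + " MB used of " + maxMb + " MB max";
    }

    public static void main(String[] args) {
        System.out.println(summary());
    }
}
```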
 
LVL 21

Assisted Solution

by:MogalManic
MogalManic earned 600 total points
ID: 12317650
Look into using LinkedHashMap. It combines a linked list with the hash map, where the head of the linked list is the least recently used element. This lets you set a maximum size for your cache and remove the oldest item when the limit is reached.

You HAVE to set a maximum size for your cache (even if you never reach it). You will probably NEVER reach the 1.8 GB maximum imposed by Windows: when the allocated memory starts approaching the size of the REAL memory, the OS task scheduler will start thrashing (spending more time swapping memory than executing tasks), which degrades performance significantly.

I have seen this happen on both Windows and Unix environments (though Unix handled a larger memory allocation before thrashing).
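The LinkedHashMap approach described above can be sketched like this, using access order and the removeEldestEntry hook (the generics assume a modern JDK; the maximum size is a constructor parameter):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// LRU cache: entries are kept in access order, and the eldest (least
// recently used) entry is evicted once maxEntries is exceeded.
public class LruCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;

    public LruCache(int maxEntries) {
        super(16, 0.75f, true);   // true = access order, so get() refreshes an entry
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > maxEntries;   // LinkedHashMap evicts eldest when this is true
    }
}
```

Because the map is in access order, reading an entry with get() protects it from eviction longer than entries that were only written.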
 

Author Comment

by:sasha_kv
ID: 12375766
Thanks very much for everyone's input !!!
 
LVL 14

Expert Comment

by:Tommy Braas
ID: 12377606
=-)