srikanthradix asked:

Increasing the Java Virtual Memory

JVMDG217: Dump Handler is Processing OutOfMemory - Please Wait.
JVMDG315: JVM Requesting Heap dump file
JVMDG318: Heap dump file written to C:\Documents and Settings\ab33973\IBM\rationalsdp6.0\workspace\UploadSwing\heapdump.20080805.155918.2760.phd
JVMDG303: JVM Requesting Java core file
JVMDG304: Java core file written to C:\Documents and Settings\ab33973\IBM\rationalsdp6.0\workspace\UploadSwing\javacore.20080805.155918.2760.txt
JVMDG274: Dump Handler has Processed OutOfMemory.
java.lang.OutOfMemoryError.


I am getting an OutOfMemoryError. How do I increase the virtual memory (heap size) in WebSphere for running Java programs?
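For context, a standalone JVM's maximum heap is set with the -Xmx flag (e.g. `java -Xmx512m MyApp`); in WebSphere the same flag goes into the application server's JVM settings. A quick way to check what the running JVM actually got (a minimal sketch, not part of the original thread):

```java
// Prints the heap limits of the currently running JVM.
public class HeapInfo {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        // maxMemory() reflects the -Xmx setting (or the JVM default).
        System.out.println("Max heap:   " + rt.maxMemory() / (1024 * 1024) + " MB");
        System.out.println("Total heap: " + rt.totalMemory() / (1024 * 1024) + " MB");
        System.out.println("Free heap:  " + rt.freeMemory() / (1024 * 1024) + " MB");
    }
}
```

Launching this with `java -Xmx512m HeapInfo` should report a max heap close to 512 MB.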
SOLUTION by Mick Barry (objects): answer hidden behind membership.
srikanthradix (ASKER):

@objects,
This is a Java application with a Swing UI that uploads files of 255MB, 120MB, and 70MB from a network drive to a local drive. Does the suggested mechanism also apply to standalone Java programs, rather than web applications?
SOLUTION: answer hidden behind membership.
rikga:

The JRockit JVM will size the heap automatically, so you don't need to set the -Xmx flag at startup:
http://www.oracle.com/technology/software/products/jrockit/index.html
ASKER CERTIFIED SOLUTION: answer hidden behind membership.
srikanthradix (ASKER):

I am trying to read a 255MB file and get a byte array as follows.

Please tell me whether the following code is efficient; if not, please suggest a more efficient approach.

There is also a value object, FileItemVO, which is populated from the returned bytes.

class FileItemVO {
    byte[] fileitems;
    ....
}

FileItemVO fileItem = null;

fileItem = setFileItemVO(fileItem, file);

private FileItemVO setFileItemVO(FileItemVO fileItem, File file) {
    fileItem = new FileItemVO();
    fileItem.setFileName(file.getAbsolutePath());
    byte[] fileContents = getFileContents(file);
    fileItem.setFileContents(fileContents);
    fileItem.setValidationSuccess(true);
    return fileItem;
}

private byte[] getFileContents(File file) {
    StringBuffer contents = new StringBuffer();
    BufferedReader br = null;
    try {
        br = new BufferedReader(new FileReader(file));
        String line;
        while ((line = br.readLine()) != null) {
            contents.append(line).append("\n");
        }
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        // Close the reader so the file handle is released.
        if (br != null) {
            try {
                br.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
    return contents.toString().getBytes();
}
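As a side note: for binary files, going through a FileReader and a StringBuffer both inflates memory use (the char data, the StringBuffer, and the final byte[] all coexist) and can corrupt non-text content. A sketch of reading the bytes directly instead (readFileBytes is an illustrative name, not from the original code):

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;

public class FileBytes {
    // Reads the whole file into a byte[] with no char conversion,
    // so binary content survives intact and memory is used once.
    static byte[] readFileBytes(File file) throws IOException {
        byte[] data = new byte[(int) file.length()];
        FileInputStream in = new FileInputStream(file);
        try {
            int off = 0;
            // read() may return fewer bytes than requested, so loop until full.
            while (off < data.length) {
                int n = in.read(data, off, data.length - off);
                if (n < 0) throw new IOException("Unexpected end of file");
                off += n;
            }
        } finally {
            in.close();
        }
        return data;
    }
}
```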
There's nothing pointedly wrong or inefficient with the code you listed here. When looking at efficient use of memory, you need to look at the entire algorithm. If your application needs to read an entire 256MB file into memory, then you need to do it. But, if you can read parts at a time and process them sequentially, then you can limit the amount of concurrent memory needed during execution.

To help free up memory, if your program continues to execute after processing the file, be sure to null out your references to the FileItemVO objects once you're done with them, so the Java garbage collector can reclaim that memory.

-- Alexander.
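Alexander's point about reading parts at a time and processing them sequentially can be sketched like this (the 8 KB buffer size and the ChunkHandler callback are illustrative, not part of the original code):

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;

public class ChunkedReader {
    interface ChunkHandler {
        void processChunk(byte[] buf, int len) throws IOException;
    }

    // Streams the file through a fixed-size buffer, so concurrent memory
    // use stays at 8 KB regardless of file size (255 MB or otherwise).
    static long process(File file, ChunkHandler handler) throws IOException {
        byte[] buf = new byte[8192];
        long total = 0;
        FileInputStream in = new FileInputStream(file);
        try {
            int n;
            while ((n = in.read(buf)) > 0) {
                handler.processChunk(buf, n);
                total += n;
            }
        } finally {
            in.close();
        }
        return total;
    }
}
```

Each chunk can be written to the destination drive (or otherwise processed) inside processChunk, so the whole file never needs to be in memory at once.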