PrintWriter memory consumption

everton690 asked
Medium Priority
1,683 Views
Last Modified: 2012-05-04
Hi,
I have a Java program with 20 threads. Each thread uses a different PrintWriter, already open, to write to a separate text file. The problem is that I get a java.lang.OutOfMemoryError after a certain period of time. Could anyone tell me whether PrintWriter needs more memory as the size of the text file increases? If so, is there any way to perform these write operations without incurring an out-of-memory error, or does PrintWriter's strain on memory remain static regardless of the size of the text file?
         Cheers,
         everton690  

CERTIFIED EXPERT
Top Expert 2016

Commented:
Your problem is more likely to be caused by not closing Writers or other resources after use.
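The usual pattern looks like this (a minimal sketch; the class and method names are made up for illustration):

```java
import java.io.*;

// Minimal sketch (CloseDemo and writeAndClose are hypothetical names):
// the writer is always closed in a finally block, so the OS file handle
// and the stream's internal buffer are released even if a write fails.
public class CloseDemo {
    static void writeAndClose(File f, String line) throws IOException {
        PrintWriter out = new PrintWriter(new FileWriter(f));
        try {
            out.println(line);
        } finally {
            out.close(); // flushes any buffered output and frees the handle
        }
    }

    public static void main(String[] args) throws IOException {
        File f = File.createTempFile("close-demo", ".txt");
        writeAndClose(f, "hello");
        System.out.println("wrote " + f.length() + " bytes");
        f.delete();
    }
}
```

Even if a writer has to stay open for a while, making sure close() is eventually reached on every code path is what releases its resources.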

Author

Commented:
The thing is that the Writers must remain open to continually allow data to be written to the text files.
It seems like all your content is staying in memory and not actually going to the file, I suppose.

If it is in memory then, as the text grows, you would obviously get an out-of-memory exception.


Regards,
Muruga
Can you post your code?
CERTIFIED EXPERT
Top Expert 2016

Commented:
Tell us more about what's happening. So far I'm assuming 20 PrintWriters and one text file. How big is the text file?

Author

Commented:
There are 20 PrintWriters, 20 text files, and 20 threads. Each thread calls a function that uses a PrintWriter that is already open to write to its text file. The program then calls another function that reads from this text file.
CERTIFIED EXPERT
Top Expert 2016

Commented:
Sounds to me like you should be reading from and writing to buffers at this point - not files (at least until all processing is finished)
Let me tell you one thing: if all the content is actually going into the file, then there is very little chance of an out-of-memory exception.

Seeing that portion of the code would help us find the exact cause.
Mayank S, Principal Technologist
CERTIFIED EXPERT

Commented:
The JVM does not use all the available memory by default. You can specify how much memory you want it to use with the -Xmx option. Like:

-Xmx200m

which tells the JVM it may use up to 200 MB of heap.
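You can confirm what limit the JVM actually picked up by querying Runtime (a small sketch; HeapInfo is a hypothetical class name):

```java
public class HeapInfo {
    // Runtime.maxMemory() reports the heap ceiling the JVM is running
    // with, i.e. the value that -Xmx controls.
    static long maxHeapMb() {
        return Runtime.getRuntime().maxMemory() / (1024 * 1024);
    }

    public static void main(String[] args) {
        // e.g. run with: java -Xmx200m HeapInfo
        System.out.println("Max heap: " + maxHeapMb() + " MB");
    }
}
```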
Mayank S, Principal Technologist
CERTIFIED EXPERT

Commented:
I would still agree with CEHJ that you should use buffers instead of writing directly to files. You can try BufferedWriter.
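For example (a sketch with made-up names): wrapping the FileWriter in a BufferedWriter gives the stream a fixed-size in-memory buffer (8 KB by default), so its memory cost does not grow with the file:

```java
import java.io.*;

public class BufferedOut {
    // Sketch (BufferedOut/open are hypothetical names): println() now
    // appends to a fixed-size in-memory buffer and only touches the disk
    // when the buffer fills or flush() is called.
    static PrintWriter open(File f) throws IOException {
        return new PrintWriter(new BufferedWriter(new FileWriter(f)));
    }

    public static void main(String[] args) throws IOException {
        File f = File.createTempFile("buffered", ".txt");
        PrintWriter out = open(f);
        try {
            for (int i = 0; i < 1000; i++) {
                out.println("line " + i);
            }
            out.flush(); // push the buffer to disk before anyone reads the file
        } finally {
            out.close();
        }
        f.delete();
    }
}
```

Note the buffer stays the same size no matter how big the file gets; you just have to flush() before another part of the program reads the file.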
Mayank S, Principal Technologist
CERTIFIED EXPERT

Commented:
Ah, I thought CEHJ told you to use buffers, but I guess he meant that you already *are* using buffers. Well, which is it in your case?
CERTIFIED EXPERT
Top Expert 2016

Commented:
>>I would still agree with CEHJ ...

Difficult to tell exactly though without knowing more
Mayank S, Principal Technologist
CERTIFIED EXPERT

Commented:
Generally, I prefer using buffers, since disk I/O is slower. But well, if it gives an OutOfMemoryError, I would eliminate the buffers ;-)

Author

Commented:
Here is an example of a thread and the read and write functions.  Hopefully this will help.  


Thread thread = new Thread(new Runnable() {
    public void run() {

        writesomething(inputParam, nameOfPrintWriter);

        while (/* program not finished */) {

            writer.flush();

            readfromfile(inputParam, inputParam);
        }

        // do some computation

        writetofileagain(inputParam, inputParam);
    }
});
thread.start();


// function to write to file

void writesomething(inputParam, nameOfPrintWriter) {

    nameOfPrintWriter.println(something);
}

// function to read from file

String readfromfile(int inputParam, String nameOfFile) {

    FileReader fr = null;
    String s = null;
    String str = null;

    String patt = " " + inputParam + " :";

    String file =
        System.getProperty("user.dir") +
        System.getProperty("file.separator") + nameOfFile + ".txt";

    try {
        fr = new FileReader(file);
        LineNumberReader lr = new LineNumberReader(fr);

        while ((str = lr.readLine()) != null) {

            if (str.indexOf(patt) != -1) {
                s = str.substring(11);
                // lr.close();
            }
        }
        fr.close();
    }
    catch (ArrayIndexOutOfBoundsException e) {
        System.err.println("Caught ArrayIndexOutOfBoundsException: " + e.getMessage());
    }
    catch (IOException e) {
        System.err.println("Caught IOException in readfromfile: " + e.getMessage());
    }

    return s;
}


     
CERTIFIED EXPERT
Top Expert 2016

Commented:
What's the average file size?
Mayank S, Principal Technologist
CERTIFIED EXPERT

Commented:
Looks like you might have to try the -Xmx option.
Mayank S, Principal Technologist
CERTIFIED EXPERT

Commented:
Check the amount of free memory that you have. I suspect the JVM is not using all of it. Suppose it's 200 MB; then use:

java -Xmx200m

You can also raise the initial heap size with -Xms (note that the thread stack size is controlled by -Xss, not -Xms).
CERTIFIED EXPERT
Top Expert 2016

Commented:
Frankly, I'm surprised that it is working at all (apart from the OutOfMemory), since if the Writers are not being closed, you might get an old copy of the file instead of the data from the most recent write. *Was* it working?

Anyway, those (almost certainly unnecessarily) tight loops will cause a massive allocation of resources that will eventually produce your OOM.

Author

Commented:
I tried using the -Xmx option, but it only delays the java.lang.OutOfMemoryError. The file sizes will continually grow until each one contains roughly 700,000 lines of data. The reason the data is being sent to the text files and not stored in memory is to reduce the strain on memory. Do you think this method of using Writers and Readers increases the strain on memory as the file size increases, or is the strain independent of the file size?
My suspicion falls on the LineNumberReader.

I don't see any need for a LineNumberReader here. Replace it with a BufferedReader. That might help.
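A sketch of what that replacement might look like (PatternScan and lastMatch are hypothetical names, standing in for the readfromfile() posted above):

```java
import java.io.*;

public class PatternScan {
    // Hypothetical stand-in for readfromfile(): BufferedReader streams
    // the file one line at a time, so memory use stays constant no
    // matter how large the file grows.
    static String lastMatch(File f, String patt) throws IOException {
        BufferedReader in = new BufferedReader(new FileReader(f));
        String s = null;
        try {
            String str;
            while ((str = in.readLine()) != null) {
                if (str.indexOf(patt) != -1) {
                    s = str; // remember the most recent matching line
                }
            }
        } finally {
            in.close(); // always release the file handle
        }
        return s;
    }

    public static void main(String[] args) throws IOException {
        File f = File.createTempFile("scan", ".txt");
        PrintWriter out = new PrintWriter(new FileWriter(f));
        out.println("1 : alpha");
        out.println("2 : beta");
        out.close();
        System.out.println(lastMatch(f, "2 :"));
        f.delete();
    }
}
```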
Mayank S, Principal Technologist
CERTIFIED EXPERT

Commented:
If it delays the OutOfMemory error, then it means that it is working to some extent; you are simply still short of memory and need even more. I suggest that you check how much memory is available on your system, and then give the entire amount as the parameter to -Xmx. Try increasing the initial heap size as well.
CERTIFIED EXPERT
Top Expert 2016

Commented:
I think the OOM, given the code, would probably happen with much smaller files, but the fact that they're large, or at least becoming large, is certainly going to make things worse.

You should probably be doing this with your own buffers, but at the moment I can't think of a good way to implement it.

Commented:
>while (program not finished) {
So when does the program actually finish?

Maybe the infinite loop causes the OOM to occur. Try printing something in the loop and running only one thread to test it.

Author

Commented:

The cause of the memory leak had nothing to do with any of the IO. I gave the points to sciuriware, as he was right, but providing a full solution would have required a lot more information.

Cheers,
everton690  
Mayank S, Principal Technologist
CERTIFIED EXPERT

Commented:
Were you able to figure out what the actual cause was?